Why is decimal.Parse() slower than (decimal)double.Parse()?
Why is it faster to parse with double.Parse() and then cast to decimal than to call decimal.Parse() directly? Given the following:

```csharp
string stringNumber = "18.34";
double dResult = 0d;
decimal mResult = 0m;

for (int i = 0; i < 9999999; i++)
{
    mResult = (decimal)double.Parse(stringNumber);
    mResult = decimal.Parse(stringNumber);
}
```

the VS2017 profiler (.NET Framework v4.7) reports that double.Parse() plus the cast accounts for a cumulative 37.84% of CPU usage, versus 46.93% for decimal.Parse(). That is more of a difference than can easily be put down to the difference in datatype size. Can anyone explain it?

The app where this came up in the profiler takes 10+ days to run, so this small difference equates to hours of runtime; it would be good to understand why. I can see that decimal.Parse() calls out to oleaut32.dll, but... wth?
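For anyone wanting to reproduce this outside the profiler, here is a minimal standalone Stopwatch benchmark sketch of the same comparison (the class name and iteration count are mine; I've added CultureInfo.InvariantCulture so the result doesn't depend on the machine's locale, which the original snippet does). Build in Release mode and run without the debugger attached for meaningful timings:

```csharp
using System;
using System.Diagnostics;
using System.Globalization;

class ParseBenchmark
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        // Time double.Parse followed by a cast to decimal.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            mResult = (decimal)double.Parse(stringNumber, CultureInfo.InvariantCulture);
        sw.Stop();
        Console.WriteLine($"(decimal)double.Parse: {sw.ElapsedMilliseconds} ms");

        // Time decimal.Parse directly.
        sw.Restart();
        for (int i = 0; i < iterations; i++)
            mResult = decimal.Parse(stringNumber, CultureInfo.InvariantCulture);
        sw.Stop();
        Console.WriteLine($"decimal.Parse:         {sw.ElapsedMilliseconds} ms");

        // Print the last result so the JIT cannot eliminate the loops as dead code.
        Console.WriteLine(mResult.ToString(CultureInfo.InvariantCulture));
    }
}
```

Timing each path in its own loop also avoids a pitfall of the original snippet, where both calls share one loop body and the profiler's attribution of shared overhead (string handling, loop bookkeeping) can blur the comparison.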