
Why is decimal.Parse() slower than (decimal)double.Parse()?

DiskJunky, published 2018-01-12 19:36:19Z

Why is it faster to parse with double.Parse() and cast to decimal rather than calling decimal.Parse()?

Given the following:

string stringNumber = "18.34";

decimal mResult = 0m;

// Time each parse in its own loop; without braces, only the first
// statement after the for header would actually repeat.
for (int i = 0; i < 9999999; i++)
    mResult = (decimal)double.Parse(stringNumber);

for (int i = 0; i < 9999999; i++)
    mResult = decimal.Parse(stringNumber);

This produces the following metrics in the VS2017 profiler (.NET Framework v4.7):

The cumulative double.Parse() and cast comes to 37.84% of CPU usage, versus 46.93% for decimal.Parse(). That is a bigger difference than can easily be put down to the difference in datatype size. Can anyone explain?
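For anyone wanting to reproduce the comparison outside the profiler, a minimal wall-clock sketch with Stopwatch looks like this (the iteration count matches the loop above; the warm-up calls and timing scaffolding are my own additions, not from the original question):

```csharp
using System;
using System.Diagnostics;

class ParseBenchmark
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        // Warm up so JIT compilation doesn't penalize the first timed loop.
        mResult = (decimal)double.Parse(stringNumber);
        mResult = decimal.Parse(stringNumber);

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            mResult = (decimal)double.Parse(stringNumber);
        sw.Stop();
        Console.WriteLine($"double.Parse + cast: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
            mResult = decimal.Parse(stringNumber);
        sw.Stop();
        Console.WriteLine($"decimal.Parse:       {sw.ElapsedMilliseconds} ms");
    }
}
```

A dedicated harness such as BenchmarkDotNet would give more rigorous numbers, but this is enough to see the same relative gap the profiler reports.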

The app where this came up in the profiler takes 10+ days to run, so this small difference equates to hours of runtime. It'd be good to understand why. I can see that decimal.Parse() calls out to oleaut32.dll, but...wth?

Robert Harvey, replied 2018-01-12 21:21:39Z

Judging from the reference source for double's implementation and decimal's implementation, decimal.Parse() handles the decimal precision in situ, whereas double.Parse() is optimized to do as much of its work in integer arithmetic as possible.
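One caveat worth adding before adopting the faster route (my observation, not part of the original answer): a double only carries roughly 15-17 significant decimal digits, so (decimal)double.Parse(...) can silently lose precision that decimal.Parse() preserves. A small sketch with a 19-digit input:

```csharp
using System;

class PrecisionCheck
{
    static void Main()
    {
        // 19 significant digits -- more than a double can represent.
        const string s = "0.1000000000000000001";

        decimal direct    = decimal.Parse(s);          // keeps all 19 digits
        decimal viaDouble = (decimal)double.Parse(s);  // trailing digit lost in the double

        Console.WriteLine(direct);              // 0.1000000000000000001
        Console.WriteLine(viaDouble);           // 0.1
        Console.WriteLine(direct == viaDouble); // False
    }
}
```

For inputs like "18.34" the two routes agree, so whether the substitution is safe depends on how many significant digits the real data can contain.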

