
Why is decimal.Parse() slower than (decimal)double.Parse()?

user1025
#1 | Posted June 21, 2018, 8:42 am

Why is it faster to parse with double.Parse() and cast to decimal rather than calling decimal.Parse()?

Given the following:

string stringNumber = "18.34";

decimal mResult = 0m;
for (int i = 0; i < 9999999; i++)
{
    // Parse as double, then convert to decimal.
    mResult = (decimal)double.Parse(stringNumber);
    // Parse directly as decimal.
    mResult = decimal.Parse(stringNumber);
}

This produces the following metrics in the VS2017 profiler (.NET Framework 4.7):

The double.Parse() call plus the cast comes to a cumulative 37.84% of CPU usage, versus 46.93% for decimal.Parse(). That gap is larger than can easily be put down to the difference in datatype size. Can anyone explain?
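As a sanity check on the profiler percentages (which come from a single loop exercising both calls and can be skewed by sampling), here is a minimal standalone timing sketch; the class and variable names are illustrative, not from the original post, and the iteration count mirrors the loop above:

using System;
using System.Diagnostics;

class ParseBenchmark
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        // Time double.Parse() followed by a cast to decimal.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            mResult = (decimal)double.Parse(stringNumber);
        sw.Stop();
        Console.WriteLine($"(decimal)double.Parse: {sw.ElapsedMilliseconds} ms");

        // Time decimal.Parse() directly.
        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            mResult = decimal.Parse(stringNumber);
        sw.Stop();
        Console.WriteLine($"decimal.Parse:         {sw.ElapsedMilliseconds} ms");

        // Use the result so the loops can't be optimized away.
        Console.WriteLine(mResult);
    }
}

Timing each approach in isolation like this removes any doubt about how the profiler attributes shared loop overhead between the two calls.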

The app where this showed up in the profiler takes 10+ days to run, so this small difference equates to hours of runtime. It'd be good to understand why. I can see that decimal.Parse() calls out to oleaut32.dll, but... wth?
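One caveat worth noting before adopting the cast as an optimization: the two approaches are not always numerically equivalent. A decimal holds 28-29 significant digits, while a double round-trips at most 17, so inputs with many significant digits can parse to different values. A minimal sketch (illustrative names, and assuming a culture that uses '.' as the decimal separator, as the original code does):

using System;

class CastCaveat
{
    static void Main()
    {
        // 17 significant digits: more than a double can represent near 1.0,
        // where the spacing between adjacent doubles is about 2.2e-16.
        string s = "1.0000000000000001";

        decimal viaDouble = (decimal)double.Parse(s); // trailing 1 is lost in the double
        decimal direct = decimal.Parse(s);            // decimal keeps all 17 digits

        Console.WriteLine(viaDouble); // 1
        Console.WriteLine(direct);    // 1.0000000000000001
    }
}

For a fixed two-decimal input like "18.34" this does not matter, but it is worth checking against the app's real data before swapping the calls.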
