Prevent String::num_scientific from giving different precision levels depending on compiler #86951
TheSofox wants to merge 1 commit.

Conversation
Okay, the tests are failing, largely because the unit tests were made with […]
Unsure what compilers would be non-GNUC, but no tests are run on Android, iOS, or web, so it might be there. Note that this also breaks existing documentation, so all of that has to be updated with the new precision. I ran into these issues when I was trying to solve this myself a year or so ago. My suggestion for the documentation is to change the specifics of the code generation there, to avoid overly specific strings in the documentation.
Closing as superseded by PR #98750. Thank you for getting us started fixing this bug :)
Fixes #78204.
`String::num_scientific` has different codepaths depending on the compiler, and there was a fundamental difference in the precision of the float-to-String conversion depending on the compiler/platform. Essentially, one codepath used `%.16lg` while the other used `%lg`. This led to surprisingly low precision (not even able to print out a timestamp properly). I've tweaked it so that the low-precision codepath uses the same high precision as the other codepath.