c++ - Correct behavior for std::setprecision(0) in GCC vs VS2013 -


Depending on the compiler used, the following function produces different output when n = 0.

    std::string tostrwprec(double a_value, const int n) {
        std::ostringstream out;
        out << std::setprecision(n) << a_value;
        return out.str();
    }

GCC 4.8.3 20140911 (Red Hat 4.8.3-9) returns "1" for tostrwprec(1.2345678, 0), while VS2013 returns "1.2346" for the same code.

My questions are:

  1. What is the correct/standard behavior of setprecision?
  2. What is an alternative to using setprecision?

Here is the updated code, based on the comment below:

    std::string tostrwprec(double a_value, const int n) {
        std::ostringstream out;
        out << std::setprecision(n) << std::fixed << a_value;
        return out.str();
    }

According to 22.4.2.2.2 [facet.num.put.virtuals] paragraph 5, stage 1, this is said about precision:

For conversion from a floating-point type, if floatfield != (ios_base::fixed | ios_base::scientific), str.precision() is specified as precision in the conversion specification. Otherwise, no precision is specified.

The same paragraph specifies elsewhere that the format specifier defining the result is %g.

The default value of floatfield is "nothing set", according to 27.5.5.2 [basic.ios.cons] paragraph 3, Table 128:

flags() skipws | dec

Thus, it boils down to how the format string "%.0g" formats the value. The C standard, in 7.21.6.1 paragraph 8, states:

Let P equal the precision if nonzero, 6 if the precision is omitted, or 1 if the precision is zero.

It seems, then, that the correct result is 1.

