I am using the __int128 extension of g++. The problem with -std=c++17 is that the C++ library does not have full support for that extension (e.g. std::make_unsigned<> fails). When using -std=gnu++17 it works fine.
I've added a header file that allows <limits> to work with __int128 when using -std=c++17, and I'd like to keep it for now, but when using -std=gnu++17 it breaks (because it is already defined). So I was thinking to add a condition like so:
#if !(<something>)
...
#endif
if the compiler already supports <limits> with __int128.
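To give an idea, such a header might look roughly like this (a sketch with only a few members shown; a real header needs all of them, plus the unsigned specialization):

#include <limits>

namespace std {
template <>
struct numeric_limits<__int128> {
    static constexpr bool is_specialized = true;
    static constexpr bool is_signed = true;
    static constexpr bool is_integer = true;
    static constexpr int digits = 127;  // value bits, excluding the sign bit
    // Build max() via the unsigned type to avoid shifting into the sign bit.
    static constexpr __int128 max() noexcept {
        return static_cast<__int128>(
            (static_cast<unsigned __int128>(1) << 127) - 1);
    }
    static constexpr __int128 min() noexcept { return -max() - 1; }
    // ... remaining members, and a matching unsigned __int128 specialization ...
};
}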
My question is: what is that <something> I could check to distinguish between the standard and the GNU C++17 libraries?
CodePudding user response:
I compared the macros predefined by the two modes:
$ diff <(g++-11 -std=c++17 -E -dM -x c++ /dev/null|LC_ALL=C sort) \
       <(g++-11 -std=gnu++17 -E -dM -x c++ /dev/null|LC_ALL=C sort)
And the output was:
180a181,182
> #define __GLIBCXX_BITSIZE_INT_N_0 128
> #define __GLIBCXX_TYPE_INT_N_0 __int128
315d316
< #define __STRICT_ANSI__ 1
424a426,427
> #define linux 1
> #define unix 1
That's not definitive, of course, but it's maybe a start.
So you could check for __STRICT_ANSI__ (indicating that GNU extensions are disabled), but perhaps the undocumented __GLIBCXX_BITSIZE_INT_N_0 is more direct.
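Applied to the question's guard, that could look like this (a sketch; __GLIBCXX_BITSIZE_INT_N_0 is internal to libstdc++, so it could change in a future release and won't exist with other standard libraries):

#if !defined(__GLIBCXX_BITSIZE_INT_N_0)
// Strict mode: libstdc++ has not registered __int128 as an extended
// integer type, so supply the numeric_limits specializations ourselves.
namespace std {
template <> struct numeric_limits<__int128> { /* ... */ };
template <> struct numeric_limits<unsigned __int128> { /* ... */ };
}
#endif

Checking __STRICT_ANSI__ instead (#ifdef __STRICT_ANSI__) would work the same way, at the cost of keying the guard to the compiler mode rather than to what the library actually provides.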