LinuxQuestions.org
Forums > Non-*NIX Forums > Programming
Old 08-04-2019, 01:49 PM   #1
dogpatch
Member
 
Registered: Nov 2005
Location: Central America
Distribution: Mepis, Android
Posts: 373
Blog Entries: 4

Rep: Reputation: 169
C preprocessor define for 32 vs 64 bit long int


I want to compile some C code on two machines: one is 32-bit Linux, the other 64-bit FreeBSD.

How can I determine, via a preprocessor define, which architecture the code is being compiled on? Specifically, I want to know how a long int is defined without using sizeof(long) or other run-time logic.

On the 32-bit Linux machine, I can look at the definition of __BITS_PER_LONG, or __WORDSIZE, both of which are defined as 32. I can't find either of these definitions on the 64-bit FreeBSD server.

In my case I can use #ifdef __linux__, but that is hardly the proper method. I think there must be some universally available preprocessor define that distinguishes 32 from 64 bits, but what is it?
 
Old 08-04-2019, 02:04 PM   #2
NevemTeve
Senior Member
 
Registered: Oct 2011
Location: Budapest
Distribution: Debian/GNU/Linux, AIX
Posts: 3,803

Rep: Reputation: 1287
You could check 'ULONG_MAX' from 'limits.h'.
It can be 4294967295 or 18446744073709551615 (or something else; it is platform-dependent).
 
1 member found this post helpful.
Old 08-04-2019, 02:42 PM   #3
dogpatch
Member
 
Original Poster
Quote:
Originally Posted by NevemTeve View Post
You could check 'ULONG_MAX' from 'limits.h'
It can be 4294967295 or 18446744073709551615 (or something else, as it is platform dependent).
That sounds reasonable, but on the 64-bit FreeBSD machine 'ULONG_MAX' is still defined as 4294967295, ('LONG_MAX' is defined as 1 [??]). But sizeof(long) returns 4 on the 32-bit machine, and 8 on the 64-bit server.
 
Old 08-04-2019, 02:44 PM   #4
dogpatch
Member
 
Original Poster
It might help to know where or how GCC assigns the sizeof() value.
 
Old 08-04-2019, 03:04 PM   #5
ehartman
Member
 
Registered: Jul 2007
Location: Delft, The Netherlands
Distribution: Slackware
Posts: 806

Rep: Reputation: 404
Quote:
Originally Posted by dogpatch View Post
without using sizeof(long) or other run-time logic.
As far as I know, "sizeof" is evaluated at compile time, although NOT by the preprocessor but by the C compiler itself. Note that the default size of a long is, in gcc, controllable by a compile-time option:
Code:
       -m32
       -m64
       -mx32
           Generate code for a 32-bit or 64-bit environment.  The -m32 option sets "int",
           "long", and pointer types to 32 bits, and generates code that runs on any
           i386 system.

           The -m64 option sets "int" to 32 bits and "long" and pointer types to 64 bits,
           and generates code for the x86-64 architecture.  For Darwin only the -m64
           option also turns off the -fno-pic and -mdynamic-no-pic options.

           The -mx32 option sets "int", "long", and pointer types to 32 bits, and
           generates code for the x86-64 architecture.
(from "man gcc")

Last edited by ehartman; 08-04-2019 at 03:05 PM.
 
1 member found this post helpful.
Old 08-04-2019, 03:26 PM   #6
dogpatch
Member
 
Original Poster
Quote:
Originally Posted by ehartman View Post
As far as I know, "sizeof" is evaluated at compile time, although NOT by the preprocessor but by the C compiler itself. Note that the default size of a long is, in gcc, controllable by a compile-time option:
Code:
-m32
-m64
-mx32
You are correct, of course: sizeof() is determined at compile time. Pardon my inaccurate use of terms.

As for the -m64 option, it is not available on my 32-bit machine, and anyway, what I want is for the same C code to produce run-time binaries for either machine, as their native 32- or 64-bit architectures allow, while making some logic determinations based upon which architecture it is compiled on.

My (temporary?) workaround to follow, as a separate post.
 
Old 08-04-2019, 03:27 PM   #7
dogpatch
Member
 
Original Poster
I've found that limits.h on the Linux machine looks at whether '__x86_64__' is defined to determine the definition of '__WORDSIZE', and that '__x86_64__' is indeed defined on the 64-bit server. So I am doing this:
Code:
#ifndef __BITS_PER_LONG
#if defined __x86_64__
#define __BITS_PER_LONG 64
#else
#define __BITS_PER_LONG 32
#endif
#endif
then this code:
Code:
printf("size of int = %d\n", sizeof(int));
printf("size of long = %d\n", sizeof(long));
printf("long_max=%d\nulong_max=%u\n", LONG_MAX, ULONG_MAX);
printf("bits_per_long=%d\n", __BITS_PER_LONG);
produces this output on the 32 bit Linux machine:
Code:
size of int = 4
size of long = 4
long_max=2147483647
ulong_max=4294967295
bits_per_long=32
and this output on the 64 bit FreeBSD server:
Code:
size of int = 4
size of long = 8
long_max=1
ulong_max=4294967295
bits_per_long=64
You can see that both 'LONG_MAX' and 'ULONG_MAX' appear to be unreliable, and it SEEMS better to rely upon '__x86_64__'. But I will not yet mark this thread as solved, unless someone in LQ land can either confirm this or point me to a more universal and reliable method.
 
Old 08-04-2019, 03:45 PM   #8
wainamoinen
LQ Newbie
 
Registered: Sep 2009
Posts: 4

Rep: Reputation: Disabled
If you are using the Visual C++ compiler (or the Intel C++ compiler on Windows), you can use the '_M_X64' macro to detect the architecture.
There, the type 'long' is always 4 bytes, on both 64-bit and 32-bit architectures (https://docs.microsoft.com/en-us/cpp...p?view=vs-2019).
You can use the type 'long long' for 64-bit integers.

With Clang, GCC, and Intel (on Linux) you can use '__x86_64' to know whether you are on a 64-bit architecture.
You can use the macro '__LONG_MAX__' to determine the size of the type long.

The Intel C/C++ compiler is peculiar, because it behaves like the Visual C++ on Windows and like GNU C on Linux.

The macros to detect each compiler are:
Code:
_MSC_VER           Microsoft Visual C++
__clang__          LLVM C compiler
__INTEL_COMPILER   Intel C/C++ compiler
__GNUC__           GNU C compiler
You can use something similar to:

Code:
#if defined(_MSC_VER) || (defined(__INTEL_COMPILER) && defined(_WIN32))
   #if defined(_M_X64)
      #define BITNESS 64
      #define LONG_SIZE 4
   #else
      #define BITNESS 32
      #define LONG_SIZE 4
   #endif
#elif defined(__clang__) || defined(__INTEL_COMPILER) || defined(__GNUC__)
   #if defined(__x86_64)
      #define BITNESS 64
   #else
      #define BITNESS 32
   #endif
   #if __LONG_MAX__ == 2147483647L
      #define LONG_SIZE 4
   #else
      #define LONG_SIZE 8
   #endif
#endif
 
1 member found this post helpful.
Old 08-04-2019, 04:12 PM   #9
dogpatch
Member
 
Original Poster
Thank you for affirming my use of '__x86_64'. Can I be sure that this will be defined on all GCC implementations on 64-bit machines? (I should have specified that I am only interested in GCC.)

'__LONG_MAX__' is defined as -1 on the 64-bit FreeBSD server, still apparently as meaningless as 'LONG_MAX'.
 
Old 08-04-2019, 04:40 PM   #10
dogpatch
Member
 
Original Poster
Is there a way to mark a thread as 'completely confused'?

Upon starting to implement the above, I find that on the 64-bit FreeBSD server, the ULONG_MAX definition as 4294967295 indeed appears correct even though sizeof(long) is 8. When I perform a calculation that exceeds that max, the long integer becomes negative, or truncated. If a long integer is 8 bytes (64 bits) but its effective max is only 32 bits, what are the other 32 bits used for?

Is this a bug in GCC as implemented on the 64-bit server?
 
Old 08-04-2019, 09:33 PM   #11
ntubski
Senior Member
 
Registered: Nov 2005
Distribution: Debian, Arch
Posts: 3,494

Rep: Reputation: 1789
Quote:
Originally Posted by dogpatch View Post
Code:
printf("size of int = %d\n", sizeof(int));
printf("size of long = %d\n", sizeof(long));
printf("long_max=%d\nulong_max=%u\n", LONG_MAX, ULONG_MAX);
printf("bits_per_long=%d\n", __BITS_PER_LONG);
Quote:
Originally Posted by dogpatch View Post
Is there a way to mark a thread as 'completely confused'?

Upon starting to implement the above, i find that on the 64-bit FreeBSD server, the ULONG_MAX definition as 4294967295 is indeed correct even though sizeof(long) is 8. When i perform a calculation that exceeds the above max, the long integer becomes negative, or truncated. If a long integer is 8 bytes (64 bits), but its effective max is only 32 bits, what are the other 32 bits used for?

Is this a bug in GCC as implemented on the 64-bit server?
If you are talking about the above code, then the problem is that %d takes an int, so if you pass an 8-byte long, it gets truncated. If you compile with -Wall you should get a warning about it.
 
4 members found this post helpful.
Old 08-06-2019, 09:36 AM   #12
dogpatch
Member
 
Original Poster
Quote:
Originally Posted by ntubski View Post
If you are talking about the above code, then the problem is that %d takes an int, so if you pass an 8 byte long, it gets truncated. If you compile with -Wall you should get a warning about it.
What a noob am I! Thank you.

Likewise, thanks to NevemTeve for his original reply to my question.
 
  

