LinuxQuestions.org
Forums > Non-*NIX Forums > Programming
#1, 07-09-2008, 07:06 PM
RudraB (Member; Registered: Mar 2007; Distribution: Fedora; Posts: 262)
bitset variable


Dear friends,
I am in a mess with bitset again. I have managed to write some messy code that checks the machine's word size and then wants to do a bit flip. Here is the code:
Code:
int bit;
if (strncmp(sysptr, "i686", 4) == 0)
{
    bit = 32;
}

......
bitset<32> i (0);
....etc.
The bitset width here should equal bit, which is 32. But when I tried to write the bitset as *bitset<bit> i (0);* (i.e. to use bit as an int variable), it gives the error: error: 'bit' cannot appear in a constant-expression
Can you tell me a way to make that possible?
 
#2, 07-09-2008, 09:06 PM
ntubski (Senior Member; Registered: Nov 2005; Distribution: Debian; Posts: 2,396)
bitset<32> makes a class at compile time, so the number in there must be known at compile time.

I think you should be able to get away with
Code:
#include <limits.h> /* for CHAR_BIT */
...
bitset<CHAR_BIT * sizeof(void*)> ...
Pointer size is usually the register size of the machine (not guaranteed though...).
 
#3, 07-09-2008, 09:30 PM
RudraB (Member; Original Poster)
I don't think I got the point. What is CHAR_BIT? Do you mean I have to change int bit to char? But how would that work? I tried:
Code:
char cbit = bit;
......
bitset<cbit * sizeof(void*)> i (0);
but it did not work (quite sensibly, as cbit is still a variable). Please excuse this idiotic question, as I am a novice in the C++ world.
 
#4, 07-10-2008, 01:25 PM
ntubski (Senior Member; Registered: Nov 2005; Distribution: Debian; Posts: 2,396)
CHAR_BIT is the number of bits in a char, generally 8. sizeof returns the size of a type in chars, so it needs to be multiplied by CHAR_BIT to give the size in bits. If you #include limits.h it will define CHAR_BIT (as a compile-time constant) for you.

In short
Code:
bitset<8*sizeof(void*)>i (0);
would work too, I just like to avoid magic numbers when I can.
 
  

