I have a C program which uses bitwise operations to get the value of individual bits. This is the program:
/*
* Example-
* BITS1|BITS2|BITS3|BITS4
* 0001| 0010| 0100| 1000
*
* 0001
* 0010
* 0100
* 1000
* ------
* 1111
* ------------------------
* HOW TO GET VALUES WITH A LOOP?
*
* i = 1;
* 1111 << TOTAL-i
* =1000
*
* 1000 >> 3
* =0001 = VALUE OF FIRST BIT!
*
* i = 2;
* 1111 << TOTAL-i
* =1100
*
* 1100 >> 3
* =0001 = VALUE OF SECOND BIT!
*
* AND SO ON!!!
*/
#include <stdio.h>
typedef enum _bits_t {
    BITS1 = 1,
    BITS2 = 2,
    BITS3 = 4,
    BITS4 = 8
} bits_t;
int main() {
    bits_t bits = BITS4|BITS1;
    int i = 1, TOTAL = 4, current;
    while(i <= TOTAL) {
        current = ((bits<<(TOTAL-i))>>3)&1;
        printf("value of bit %d = %d\n", i, current);
        i++;
    }
}
I put the comment at the start of the code to help me understand what to do and how. What I thought was to do a left shift on the number (to get rid of all the bits on the left), and then do a right shift (to move the remaining bit to the right). I mean, I calculated my current
like this:
current = ((bits<<(TOTAL-i))>>3);
But to my surprise, that code didn't work. Then later on, just by luck, I put that &1
at the end of the line computing the value of current
, and then came a bigger surprise: it worked! It computed the value of each bit correctly!
But since it was just a blind guess, I don't currently understand why my program didn't work previously, and why it does now.
Please help me understand the program.
Thanks in advance.
CodePudding user response:
(bits<<(TOTAL-i))>>3 "adds some zeroes to the right", then you "remove some bits from the right". Since it doesn't overflow, it doesn't affect the bits on the left at all. (If it did overflow, it would be undefined behaviour, since it's a signed integer.) So what you have is just a complicated way to write bits >> ( i - 1 ).
You need to remove the bits from the left. This is what & 1 accomplishes.
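For instance, here is a minimal check (not part of the answer itself; it just reuses the question's bits = BITS4|BITS1, i.e. binary 1001, and TOTAL = 4) showing that the unmasked expression collapses to bits >> (i-1), and that the & 1 is what isolates the wanted bit:

#include <stdio.h>

int main( void ) {
    int bits = 9;       /* BITS4|BITS1 from the question, binary 1001 */
    int TOTAL = 4;
    for ( int i = 1; i <= TOTAL; i++ ) {
        int unmasked = ( bits << (TOTAL-i) ) >> 3;           /* same value as bits >> (i-1) */
        int masked   = ( ( bits << (TOTAL-i) ) >> 3 ) & 1;   /* the version that works */
        printf( "i=%d: unmasked=%d, bits>>(i-1)=%d, masked=%d\n",
                i, unmasked, bits >> ( i-1 ), masked );
    }
}

Running it prints unmasked = 9, 4, 2, 1 (still several bits set), while masked is 1, 0, 0, 1, i.e. the individual bits of 1001.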
Fixed:
#include <stdio.h>

typedef enum _bits_t {
    BITS0 = 1,  // Normally numbered as an offset (i.e. first is 0).
    BITS1 = 2,  // This means the mask for bit n is 2^n or 1<<n.
    BITS2 = 4,
    BITS3 = 8
} bits_t;

int main( void ) {
    bits_t bits = BITS3|BITS0;
    int TOTAL = 4;
    for ( int i=0; i<TOTAL; i++ ) {
        int bit = ( bits >> i ) & 1;
        printf( "value of bit %d = %d\n", i, bit );
    }
}
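With bits = BITS3|BITS0 (binary 1001), this should print:

value of bit 0 = 1
value of bit 1 = 0
value of bit 2 = 0
value of bit 3 = 1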
( bits >> i ) & 1 is super easy to visualize:
--- --- - ... - --- --- --- --- ---
| 0 | 0 | ... | 0 | d | c | b | a |
--- --- - ... - --- --- --- --- ---
>> 2
--- --- --- --- - ... - --- --- ---
| 0 | 0 | 0 | 0 | ... | 0 | d | c |
--- --- --- --- - ... - --- --- ---
& 1
--- --- --- --- - ... - --- --- ---
| 0 | 0 | 0 | 0 | ... | 0 | 0 | c |
--- --- --- --- - ... - --- --- ---
CodePudding user response:
This is the concept of a mask (& 0x01 for instance). If you want to get the value of a bit (1 or 0), it is necessary to do it in two steps: 1) bring the bit to the LSB position, and 2) apply the mask so you only get the bit of interest.
Here is an example. Imagine I have '1001', as in your case. If you want to get each bit, then you need to do the two steps for each bit.
1001 & 0x01 = 1          <- LSB
(1001>>1) & 0x01 = 0
(1001>>2) & 0x01 = 0
(1001>>3) & 0x01 = 1     <- MSB; as you can see, I bring the MSB to the LSB position, so I only get a 0 or a 1 after applying the mask
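Written out as a small C sketch (the value 0x9, i.e. binary 1001, and the loop bound of 4 are just this example, not anything taken from your program):

#include <stdio.h>

int main( void ) {
    unsigned value = 0x9;                       /* binary 1001 */
    for ( int i = 0; i < 4; i++ ) {
        unsigned bit = ( value >> i ) & 0x01;   /* step 1: shift to LSB, step 2: mask */
        printf( "bit %d = %u\n", i, bit );
    }
}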