I am trying to concatenate the bits of 3 characters a, b and c into a bitset of 16 bits. The constraints are the following:
- Concatenate the last 2 bits of a into newVal1
- Concatenate the 8 bits of b into newVal1
- Concatenate the first 2 bits of c into newVal1
On paper I am getting 1111111111110000, the same as the result. But I am not sure of the way I am concatenating the bits: first shift character a left by 14, then shift character b left by 6, and finally, since there is no space left for character c, shift it right by 2. Is there a better way to do it? It's already confusing for me.
#include <iostream>
#include <bitset>
int main() {
int a = 0b11111111 & 0b00000011;
int b = 0b11111111;
int c = 0b11111111 & 0b11000000;
uint16_t newVal1 = (a << 14) | (b << 6) | (c >> 2);
std::cout << std::bitset<16>(newVal1).to_string() << std::endl;
return 0;
}
CodePudding user response:
First of all, you need to consider the signed versus unsigned integer problem. With signed integers you can get unexpected sign extension, adding all ones at the top, and a possible overflow leads to undefined behavior.
So the first thing I would do is to use all unsigned integer values.
Then, to make it clear and simple, my suggestion is that you do all the shifting on newVal1 instead, and just do bitwise OR into it:
unsigned a = /* value of a */;
unsigned b = /* value of b */;
unsigned c = /* value of c */;
unsigned newVal1 = 0;
newVal1 |= a & 0x03;         // Get lowest two bits of a
newVal1 <<= 8; // Make space for the next eight bits
newVal1 |= b & 0xffu; // "Concatenate" eight bits from b
newVal1 <<= 2; // Make space for the next two bits
newVal1 |= (c >> 6) & 0x03;  // Get the two "top" bits of c
Now the lowest twelve bits of newVal1 should follow the three rules set up for your assignment. The top four bits will be all zero.