Bitwise storing of a date in an unsigned short int: Why is my result off by 1? [C]

I am writing a program that uses bitwise operations to store a date in a single unsigned short int (01010/1010/1010101) = (DDDDD/MMMM/YYYYYYY). My program works almost as intended, but there is one issue which I assume to be small: the output is always 1 less than what I expect.

For example, the date 01/07/33 should give back 2977, but it gives 2976. The date 31/07/33 should return 64417, but gives 64416.

Here is my code:

unsigned short int date_to_binary(int day, int month, int year) {
    unsigned short int result = 0;
    unsigned int short_size = sizeof(unsigned short int) * 8;
    int mask;
    int M_count = 0;
    int D_count = 0;
    for (unsigned int i = 0; i < short_size; i++) {
        if (i < 7) {
            //assigning bits from year
            printf("Working on assigning year... \n");
            if ((year >> i) & 1) {
                printf("Year has a 1 at index %u\n", i);
                mask = 1 << i;
                result = ((result & ~mask) | (1 << i));
            }
        }
        else {
            if (i < 11) {
                //assigning bits from month
                printf("Working on assigning month... \n");
                unsigned int j = M_count;
                if ((month >> j) & 1) {
                    printf("Month has a 1 at index %u\n", j);
                    mask = 1 << j;
                    result = ((result & ~mask) | (1 << i));
                }
                M_count++;
            }
            else {
                //assigning bits from day
                printf("Working on assigning day... \n");
                unsigned int k = D_count;
                if ((day >> k) & 1) {
                    printf("Day has a 1 at index %u\n", k);
                    mask = 1 << k;
                    result = ((result & ~mask) | (1 << i));
                }
                D_count++;
            }
        }
        printf("Current result: %u\n", result);
    }
    return result;
}

And my driving main():

int main() {
    int testD = 1;
    int testM = 7;
    int testY = 33;
    printf("Result of date_to_binary (1/7/33 should yield 2977): %u\n", date_to_binary(testD, testM, testY));
}

Any ideas as to why this happens?

CodePudding user response:

You could get away with:

#include <stdint.h>

uint16_t date2short (int day, int month, int year) {
    if (day > 31 || day < 1 || month > 12 || month < 1 || year > 127 || year < 0)
        return 0;
    uint16_t result = 0;
    result |= day << 11;
    result |= month << 7;
    result |= year;

    return result;
}
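Going the other way, the fields can be unpacked with shifts and masks (a sketch for the same 5/4/7 layout; `short2date` is a made-up name):

```c
#include <stdint.h>

/* Unpack the DDDDD/MMMM/YYYYYYY layout produced by date2short. */
static void short2date(uint16_t packed, int *day, int *month, int *year) {
    *day   = (packed >> 11) & 0x1F;  /* top 5 bits    */
    *month = (packed >> 7)  & 0x0F;  /* middle 4 bits */
    *year  =  packed        & 0x7F;  /* low 7 bits    */
}
```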

You're overwriting year bits when you do:

mask = 1 << k; // also for j (they start from 0)
result = ((result & ~mask) | (1 << i));

Since j and k start from 0, `result & ~mask` clears a bit inside the year field (positions 0-6) while `1 << i` sets the intended month/day bit at position i. For 01/07/33, the first month iteration (i = 7, j = 0) clears bit 0, which holds the lowest bit of the (odd) year 33 — hence the result is exactly 1 too small.
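If you want to keep the original loop structure, the minimal fix is to mask the destination bit i instead of the source bit (a sketch; since result starts at 0, the clearing step can be dropped entirely):

```c
#include <stdint.h>

/* Original loop with the mask bug fixed: every write targets the
   destination bit i, so previously set year bits are left intact. */
static uint16_t date_to_binary_fixed(int day, int month, int year) {
    uint16_t result = 0;
    for (unsigned int i = 0; i < 16; i++) {
        if (i < 7) {                       /* year: bits 0-6   */
            if ((year >> i) & 1)
                result |= (uint16_t)(1 << i);
        } else if (i < 11) {               /* month: bits 7-10 */
            if ((month >> (i - 7)) & 1)
                result |= (uint16_t)(1 << i);
        } else {                           /* day: bits 11-15  */
            if ((day >> (i - 11)) & 1)
                result |= (uint16_t)(1 << i);
        }
    }
    return result;
}
```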