Converting to decimal algorithm

Time:10-08

I was studying an algorithm that converts binary to decimal and stumbled across a problem that is strange to me. The algorithm should behave the same on every platform, yet it returns different results on another computer. It returns 7 for 10110, which should be 22. Code:

public class Main {
    
    public static int convertToDecimal(String binary) {
        int conversion = 1; 
        int result = 0;
        for (int i = 1; i <= binary.length(); i++) {
            if (binary.charAt(binary.length() - i) == '1') {
                result += conversion;
                conversion *= 2;
            }
        }
        System.out.println(result);
        return result;
    }
    
    public static void main(String[] args) {
        convertToDecimal("10110");
    }

}

CodePudding user response:

The doubling of the conversion variable should be moved outside the if block, so that it happens on every iteration rather than only when the current bit is '1':

for (int i = 1; i <= binary.length(); i++) {
    if (binary.charAt(binary.length() - i) == '1') {
        result += conversion;
    }
    conversion *= 2; // or conversion <<= 1;
}
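With the increment moved outside the if block, the method can be verified end to end. A self-contained sketch of the corrected loop (the class name BinaryConversionDemo is mine):

```java
public class BinaryConversionDemo {

    // Corrected version: conversion doubles on every iteration,
    // not only when the current bit is '1'.
    public static int convertToDecimal(String binary) {
        int conversion = 1;
        int result = 0;
        for (int i = 1; i <= binary.length(); i++) {
            if (binary.charAt(binary.length() - i) == '1') {
                result += conversion;
            }
            conversion *= 2;
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(convertToDecimal("10110")); // prints 22
        System.out.println(convertToDecimal("1111"));  // prints 15
    }
}
```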

That said, the built-in method Integer.parseInt should be preferred:

return Integer.parseInt(binary, 2);
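For completeness, a minimal usage sketch: the second argument is the radix, and inputs that are not valid in that radix throw a NumberFormatException.

```java
public class ParseIntDemo {
    public static void main(String[] args) {
        // Radix 2 tells parseInt to interpret the string as binary.
        System.out.println(Integer.parseInt("10110", 2)); // prints 22

        // A digit outside the radix throws NumberFormatException.
        try {
            Integer.parseInt("10120", 2);
        } catch (NumberFormatException e) {
            System.out.println("not a binary string");
        }
    }
}
```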

CodePudding user response:

public static int convertToDecimal(String binary) {
    int result = 0;
    for (int i = binary.length() - 1; i >= 0; i--) {
        if (binary.charAt(i) == '1') {
            result += Math.pow(2, binary.length() - i - 1);
        }
    }
    return result;
}
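A variant of this answer that stays in integer arithmetic: a left shift computes the same power of two without going through the double returned by Math.pow. A sketch (the class name ShiftConversionDemo is mine):

```java
public class ShiftConversionDemo {

    // Same right-to-left scan, but 1 << k replaces Math.pow(2, k),
    // avoiding floating-point arithmetic entirely.
    public static int convertToDecimal(String binary) {
        int result = 0;
        for (int i = binary.length() - 1; i >= 0; i--) {
            if (binary.charAt(i) == '1') {
                result += 1 << (binary.length() - i - 1);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(convertToDecimal("10110")); // prints 22
    }
}
```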

This should work.
