I'm new to programming. We've been given an assignment to convert binary to decimal and vice versa (no built-in functions). I can't seem to get the program to work; please let me know what I can fix.
binary_base = input()
length = len(binary_base)
digit = 0
power = 0
for i in range(length):
    if binary_base[i] == 1:
        digit = digit + (2 ** power)
    else:
        digit = digit + 0
power = power + 1
print(digit)
CodePudding user response:
def binaryToDecimal(binary):
    binary = int(binary)  # read the digit string as a base-10 number, e.g. '100' -> 100
    decimal = 0
    power = 0
    while binary != 0:
        decimal = decimal + (binary % 10) * 2 ** power  # weight the last digit by its place value
        binary = binary // 10
        power = power + 1
    return decimal

print(binaryToDecimal('100'))   # 4
print(binaryToDecimal('101'))   # 5
print(binaryToDecimal('1001'))  # 9
print(binaryToDecimal('011'))   # 3
print(binaryToDecimal('111'))   # 7
or
def binaryToDecimal(binary):
    print(int(binary, 2))

binaryToDecimal('100')
binaryToDecimal('101')
binaryToDecimal('1001')
binaryToDecimal('011')
binaryToDecimal('111')
CodePudding user response:
One problem with learning to program in Python is its implicit typing. When you write a function, it is good to be clear about what data types its parameters should have and what the return type is.
In this exercise, do you want to return a string or an integer? Your 3rd line initializes digit as an int, so why do you add a 0 in the else statement? It seems that you want to append a '0' to the output. In that case you are thinking of the variable as a string, and the solution would be to multiply by 10.
Another issue is the line power = power + 1 at the end: what do you want it to do? It has to be inside the cycle, so that it is incremented on each iteration.
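Putting those fixes together, the loop might look like this (a sketch: binary_base is hard-coded in place of input(), and the string is walked right-to-left so each power lines up with its place value):

```python
binary_base = "1011"  # stand-in for input(); 1011 in binary is 11

digit = 0
power = 0
for i in range(len(binary_base) - 1, -1, -1):  # right-to-left: least-significant digit first
    if binary_base[i] == '1':  # compare against the character '1', not the int 1
        digit = digit + 2 ** power
    # no else branch needed: adding 0 changes nothing
    power = power + 1  # inside the loop, so it advances once per digit
print(digit)  # 11
```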
CodePudding user response:
power = power + 1
should happen on every iteration, but as written it happens only once, after the cycle. You probably missed 4 leading spaces on that line.
You also probably should reverse the input (e.g. reversed(input()) or slicing) so your code treats the leftmost digits as more significant than the rightmost.
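For example, slicing with [::-1] reverses the string, so the loop index itself can serve as the power (a sketch with a hard-coded value in place of input()):

```python
binary_base = "1101"[::-1]  # reversed: the least-significant digit now comes first

digit = 0
for power in range(len(binary_base)):
    if binary_base[power] == '1':
        digit = digit + 2 ** power
print(digit)  # 1101 in binary is 13
```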
CodePudding user response:
Here you go :)
def DecimalToBinary(decimal):
    if decimal > 1:
        DecimalToBinary(decimal // 2)
    print(decimal % 2, end='')

DecimalToBinary(28)
print()
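The recursive version prints the bits but returns None, so its result can't be reused. An iterative variant that builds and returns the bit string instead might look like this (the name decimal_to_binary is my own):

```python
def decimal_to_binary(decimal):
    if decimal == 0:
        return '0'
    bits = ''
    while decimal > 0:
        bits = str(decimal % 2) + bits  # prepend the next bit
        decimal = decimal // 2
    return bits

print(decimal_to_binary(28))  # 11100
```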
def BinaryToDecimal(binary):
    decimal = 0
    for i in binary:
        decimal = decimal * 2 + int(i)
    return decimal

print(BinaryToDecimal('11100'))
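As a quick sanity check, the same accumulate-by-doubling function can be compared against Python's built-in int(s, 2) (the function is repeated here so the snippet runs on its own; the built-in is used only for verification, since the assignment forbids it in the solution itself):

```python
def BinaryToDecimal(binary):
    decimal = 0
    for i in binary:
        decimal = decimal * 2 + int(i)  # shift accumulated value left, add next bit
    return decimal

for value in ['0', '100', '011', '11100', '101101']:
    assert BinaryToDecimal(value) == int(value, 2)
print('all conversions match')
```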