I'm new to coding in Python and am struggling with a small piece of code that I can't seem to figure out. My code takes in a starting salary and prints a table of the salary increase over however many years the user enters. Everything in my code is correct except for one problem: the table should start with the starting salary the user enters, but instead it starts with what you would make in year 2.
starting = float(input("Enter the starting salary: "))
percent = float(input("Enter the annual percent increase: "))
years = int(input("Enter the number of years: "))
percent = percent / 100
print("%-7s%s" % ("Year", "Salary"))
for i in range(1, years):
    starting = starting + (starting * percent)
    print("%-7d%.2f" % (i, starting))
CodePudding user response:
Try swapping the order of the print and the salary update:
for i in range(1, years):
    print("%-7d%.2f" % (i, starting))
    starting = starting + (starting * percent)
CodePudding user response:
Try this:
starting = 1000  # sample starting salary
percent = float(input("Enter the annual percent increase: "))
years = int(input("Enter the number of years: "))
percent /= 100
print("%-7s%s" % ("Year", "Salary"))
for i in range(years):
    print("%-7d%.2f" % (i, starting))
    starting = starting + (starting * percent)
Will give (for an annual increase of 3):
Year   Salary
0      1000.00
1      1030.00
2      1060.90
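For what it's worth, a sketch of the same logic pulled into a function, so the table rows can be computed separately from the printing. The function name `salary_table` and the sample inputs are my own, not from the question:

```python
def salary_table(starting, percent, years):
    """Return (year, salary) rows, where salary grows by `percent` % each year.

    The current salary is recorded *before* the increase is applied,
    so row 0 holds the starting salary the user entered.
    """
    rows = []
    for year in range(years):
        rows.append((year, round(starting, 2)))
        starting = starting + (starting * percent / 100)
    return rows

# Example: a $1000 starting salary growing 3% per year for 3 years.
for year, salary in salary_table(1000, 3, 3):
    print(f"{year:<7}{salary:.2f}")
```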