I'm trying to draw a Christmas tree where the user inputs the number of stars that should be in the last row. The problem is, I don't understand how to derive the height from the number of stars in the last row.
So far, I've got the tree working with the user entering how high the tree should be:
eingabe = input("Bitte geben Sie die Höhe des Baumes ein: ")
eingabe = int(eingabe)
stern = "*"
for i in range(1, eingabe + 1):   # one iteration per row
    x = (2 * i - 1) * stern      # row i has 2*i - 1 stars
    print(x.center(100))
CodePudding user response:
The number of stars in the last row is:
stars = rows * 2 - 1
Therefore, the height of the tree, knowing the number of stars in the last row, is:
rows = (stars + 1) // 2
given that the number of stars must be odd because of the way you're drawing the tree. For example, a last row of 7 stars gives (7 + 1) // 2 = 4 rows. Integer division (//) keeps the result an int, which is what range() needs.
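Putting it together, a minimal sketch of your loop driven by the star count instead of the height (the German prompt mirrors your original; the names sterne and hoehe are just ones I picked, and a real program would also want to check that the input is odd):

stern = "*"

# Read the desired number of stars in the LAST row instead of the height.
sterne = int(input("Bitte geben Sie die Anzahl der Sterne in der letzten Reihe ein: "))

# Height from the inverse formula above; // keeps it an integer for range().
hoehe = (sterne + 1) // 2

for i in range(1, hoehe + 1):
    print(((2 * i - 1) * stern).center(100))

With an input of 7 this prints a 4-row tree whose rows have 1, 3, 5, and 7 stars.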