I'm trying to implement a function that takes an input string (for example: input = 1234@Hello) and returns only the letters (in the example: return Hello).
So I created a function that takes the input, goes over it character by character, and checks whether each character is a letter. If so, the function adds that letter to a final string.
When I ran the script and gave it input, I got the following error:

1234HI
Traceback (most recent call last):
  line 6, in select_only_chars
    char = chr()
TypeError: chr() takes exactly one argument (0 given)
key = input()

def select_only_chars(user_input):
    full_string = user_input
    char = chr()
    final_string = ''
    for char in full_string:
        if ('a' <= char <= 'z') or ('A' <= char <= 'Z'):
            final_string += char
    return final_string

print(select_only_chars(key))
CodePudding user response:
Alternative solution
import string

def select_only_chars(user_input):
    full_string = user_input
    all_letters = string.ascii_lowercase + string.ascii_uppercase
    final_string = ''
    for char in full_string:
        if char in all_letters:
            final_string += char
    return final_string

key = "1234@Hello"
print(select_only_chars(key))
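As a side note, the standard library already bundles both constants as string.ascii_letters, so the membership test and the accumulation loop can be collapsed with str.join; a minimal sketch of the same idea:

import string

def select_only_chars(user_input):
    # string.ascii_letters is exactly ascii_lowercase + ascii_uppercase
    return ''.join(char for char in user_input if char in string.ascii_letters)

print(select_only_chars("1234@Hello"))  # prints: Hello

str.isalpha() would also work here, but it additionally accepts non-ASCII letters such as é, which may or may not be what you want.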
CodePudding user response:
Credit to @Jhonny Mopp: deleting the char = chr() line fixed the whole code.
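For completeness: chr() converts an integer code point to a one-character string and requires exactly one argument, which is why the bare chr() call raised the TypeError. With that line deleted, the question's function works as intended:

def select_only_chars(user_input):
    full_string = user_input
    final_string = ''
    for char in full_string:
        # keep only ASCII letters
        if ('a' <= char <= 'z') or ('A' <= char <= 'Z'):
            final_string += char
    return final_string

key = input()
print(select_only_chars(key))  # e.g. input 1234@Hello -> Hello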