This is my function in objc:
+ (NSString *)hexForString:(NSString *)string {
    NSUInteger charIndex = 0;
    for (NSUInteger i = 0; i < string.length; i++) {
        charIndex += (NSUInteger)[string characterAtIndex:i];
    }
    NSArray *colors = @[
        @"1abc9c", @"2ecc71", @"3498db",
        @"9b59b6", @"34495e", @"16a085",
        @"27ae60", @"2980b9", @"8e44ad",
        @"2c3e50", @"f1c40f", @"e67e22",
        @"e74c3c", @"95a5a6", @"f39c12",
        @"d35400", @"c0392b", @"bdc3c7",
        @"7f8c8d"
    ];
    charIndex %= colors.count;
    NSLog(@">>po modulo %lu", (unsigned long)charIndex); // output is 13 for "Johny ivan"
    return colors[charIndex];
}
and now Swift:
func hex(for string: String) -> String {
    var index = 0
    string.forEach { character in
        index += character.wholeNumberValue ?? 0
    }
    let colors = [
        "1abc9c", "2ecc71", "3498db",
        "9b59b6", "34495e", "16a085",
        "27ae60", "2980b9", "8e44ad",
        "2c3e50", "f1c40f", "e67e22",
        "e74c3c", "95a5a6", "f39c12",
        "d35400", "c0392b", "bdc3c7",
        "7f8c8d",
    ]
    index %= colors.count
    print(" \(index)") // output is 0 for "Johny ivan"
    return colors[index]
}
CodePudding user response:
The Objective-C version basically adds up the ASCII value of each character (actually, its UTF-16 code unit value), while your Swift version checks whether each character represents a whole number and takes that value. A "more correct" approach would be to iterate over character.unicodeScalars and add up their .value. But String already has a shortcut for that:
for scalar in string.unicodeScalars {
    index += Int(scalar.value)
}
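Putting that together, a minimal sketch of the fixed function (same colors array as in the question; the sketch is mine, not the asker's) might look like this. Note that for characters outside the Basic Multilingual Plane (e.g. emoji), a scalar's value is not the same as its UTF-16 code units, so this can still diverge from the Objective-C version:

func hex(for string: String) -> String {
    let colors = [
        "1abc9c", "2ecc71", "3498db",
        "9b59b6", "34495e", "16a085",
        "27ae60", "2980b9", "8e44ad",
        "2c3e50", "f1c40f", "e67e22",
        "e74c3c", "95a5a6", "f39c12",
        "d35400", "c0392b", "bdc3c7",
        "7f8c8d",
    ]
    var index = 0
    for scalar in string.unicodeScalars {
        index += Int(scalar.value) // sum the scalar values, like the Objective-C loop sums code units
    }
    return colors[index % colors.count]
}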
CodePudding user response:
NSStrings are a sequence of UTF-16 code units:
An NSString object encodes a Unicode-compliant text string, represented as a sequence of UTF–16 code units. All lengths, character indexes, and ranges are expressed in terms of 16-bit platform-endian values, with index values starting at 0.
Therefore, your Objective-C for loop was looping through the UTF-16 code units, and adding up the numerical value of each of them.
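To make the distinction concrete, here is a small illustrative snippet (the sample string is my own, not from the question): a single Character can be backed by more than one UTF-16 code unit:

let sample = "é👍"
print(sample.count)        // 2 – two Characters
print(Array(sample.utf16)) // e.g. [233, 55357, 56397] – three code units; "👍" is a surrogate pair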
You can do the same thing in Swift:
for codeUnit in string.utf16 {
    index += Int(codeUnit)
}
You can also rewrite this as a reduce:
func hex(for string: String) -> String {
    let colors = [
        "1abc9c", "2ecc71", "3498db",
        "9b59b6", "34495e", "16a085",
        "27ae60", "2980b9", "8e44ad",
        "2c3e50", "f1c40f", "e67e22",
        "e74c3c", "95a5a6", "f39c12",
        "d35400", "c0392b", "bdc3c7",
        "7f8c8d",
    ]
    let index = string.utf16.reduce(0) { result, codeUnit in result + Int(codeUnit) } % colors.count
    print(" \(index)") // output is 13 for "Johny ivan"
    return colors[index]
}
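As a quick sanity check (my own arithmetic, not from the original answer): the UTF-16 code units of "Johny ivan" sum to 982, and 982 % 19 == 13, so this version lands on the same color as the Objective-C original:

print(hex(for: "Johny ivan")) // "95a5a6" – colors[13], matching the Objective-C output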
Your Swift code is looping through the Characters of a String (which are a different concept from UTF-16 code units), and adding up their wholeNumberValues, which is:
The numeric value this character represents, if it represents a whole number.
Clearly, none of the characters in "Johny ivan" represents a whole number.
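A minimal demonstration of that behavior (treat the snippet as a sketch):

let j: Character = "J"
let eight: Character = "8"
print(j.wholeNumberValue as Any)     // nil – "J" does not represent a number
print(eight.wholeNumberValue as Any) // Optional(8)

So even with += restored, the question's Swift loop only ever adds 0 for "Johny ivan", which is why it prints 0.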