Swift: what's the difference between "maximumLengthOfBytes(using:)" and "lengthOfBytes(using:)"?


The descriptions seem the same to me. "Required" vs. "needed": what does that difference mean?

// Returns the number of bytes required to store the String in a given encoding.
lengthOfBytes(using:)

// Returns the maximum number of bytes needed to store the receiver in a given encoding.
maximumLengthOfBytes(using:)

CodePudding user response:

lengthOfBytes(using:) returns the exact number of bytes, while maximumLengthOfBytes(using:) returns an estimate, which "may be considerably greater than the actual length" (in Apple's own words).
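
For instance, here is a small sketch of both calls on an ASCII string and on a string containing a multi-byte character (the exact upper bounds returned are implementation-defined, so the comments only note that they exceed the exact counts):

import Foundation

let ascii = "Hello"          // every character is 1 byte in UTF-8
let accented = "na\u{EF}ve"  // "ï" (U+00EF) takes 2 bytes in UTF-8

print(ascii.lengthOfBytes(using: .utf8))           // exact count: 5
print(ascii.maximumLengthOfBytes(using: .utf8))    // upper bound: at least 5, typically well above it
print(accented.lengthOfBytes(using: .utf8))        // exact count: 6
print(accented.maximumLengthOfBytes(using: .utf8)) // upper bound: at least 6, typically well above it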

CodePudding user response:

The main difference between these methods is given by the Discussion sections of their documentation.

lengthOfBytes(using:):

The result is exact and is returned in O(n) time.

maximumLengthOfBytes(using:):

The result is an estimate and is returned in O(1) time; the estimate may be considerably greater than the actual length needed.

An example where they may differ: the UTF-8 string encoding requires between 1 and 4 bytes to represent a code point, and the exact representation depends on which code point is being represented. lengthOfBytes(using:) walks the string and adds up the exact number of bytes for every character, while maximumLengthOfBytes(using:) is allowed to assume a worst case for every character without inspecting the string's actual contents. In this case, the maximum returned is 3× as much as is actually needed:

import Foundation

let str = "Hello, world!"
print(str.lengthOfBytes(using: .utf8)) // => 13
print(str.maximumLengthOfBytes(using: .utf8)) // => 39

maximumLengthOfBytes(using:) can give you an immediate answer with little to no computation, at the cost of overestimating, sometimes greatly. Which one to use depends on your specific use case.
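
One place this trade-off shows up is buffer sizing: when bridging to a C API you can allocate a buffer from the O(1) upper bound instead of paying an extra O(n) pass just to size the allocation. A minimal sketch, assuming UTF-8 and using getCString(_:maxLength:encoding:) from Foundation's String bridging:

import Foundation

let str = "Hello, world!"

// Size the C-string buffer with the O(1) upper bound; +1 for the trailing NUL.
let capacity = str.maximumLengthOfBytes(using: .utf8) + 1
var buffer = [CChar](repeating: 0, count: capacity)

// getCString returns false only if the buffer is too small or the string
// can't be represented in the encoding; the upper bound rules out "too small".
if str.getCString(&buffer, maxLength: capacity, encoding: .utf8) {
    print(String(cString: buffer)) // Hello, world!
}

The overestimate costs a few extra bytes of scratch space, whereas sizing with lengthOfBytes(using:) would cost an extra pass over the string; for short-lived buffers the upper bound is usually the better trade.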
