Does Int bits in Swift(Core Data) automatically cast to the same size as the device bits?


I'm trying to figure out whether an Int32 attribute in Core Data will be stored as 32 bits or 64 bits on a 64-bit device, but I could not find a clear answer on SO.

I checked the Swift documentation, which only says:

Int: In most cases, you don’t need to pick a specific size of integer to use in your code. Swift provides an additional integer type, Int, which has the same size as the current platform’s native word size:

On a 32-bit platform, Int is the same size as Int32.

On a 64-bit platform, Int is the same size as Int64.

Unless you need to work with a specific size of integer, always use Int for integer values in your code. This aids code consistency and interoperability. Even on 32-bit platforms, Int can store any value between -2,147,483,648 and 2,147,483,647, and is large enough for many integer ranges.
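The documented sizes can be checked at runtime with `MemoryLayout`; a minimal sketch:

```swift
// MemoryLayout<T>.size reports the size in bytes of a T instance.
let intSize = MemoryLayout<Int>.size      // 8 on a 64-bit platform, 4 on 32-bit
let int64Size = MemoryLayout<Int64>.size  // always 8

print("Int is \(intSize * 8) bits; Int64 is \(int64Size * 8) bits")

// On a 64-bit device, Int and Int64 occupy the same number of bytes.
assert(intSize == int64Size || intSize == 4)
```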

Could someone clarify whether I would save any memory by using Int32 instead of Int64 in Core Data on a 64-bit device?

CodePudding user response:

Integer 16, Integer 32, and Integer 64 in Core Data are equivalent to Swift's Int16, Int32, and Int64. So you will save memory and disk space by choosing an appropriately sized type.
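As an illustration of that mapping, here is a sketch of a managed-object subclass for a hypothetical entity named `Item` (the attribute types are the ones you would pick in the Core Data model editor; this fragment assumes a matching model and is not runnable on its own):

```swift
import CoreData

// Hypothetical entity "Item". An attribute declared as "Integer 32"
// in the model editor surfaces in Swift as Int32; "Integer 64" as Int64.
final class Item: NSManagedObject {
    @NSManaged var count: Int32  // "Integer 32" in the model
    @NSManaged var total: Int64  // "Integer 64" in the model
}
```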

Source: APress Media, LLC, part of Springer Nature 2022 A. Tsadok, Unleash Core Data

CodePudding user response:

If you’re using a SQLite persistent store (which is usually how people use Core Data), it won’t make any difference to file size. SQLite uses dynamic typing and doesn’t restrict values based on column type. SQLite’s documentation says that for integers,

The value is a signed integer, stored in 0, 1, 2, 3, 4, 6, or 8 bytes depending on the magnitude of the value.

So whatever integer value you store uses as many bytes as it needs, and it doesn’t know or care if your code uses 32 bits or 64 or some other value.

In memory it’s different: a 32-bit integer takes 4 bytes, a 64-bit integer takes 8 bytes, and so on. Use whatever integer type is large enough to hold all of the values you need to store in a variable.
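You can see the per-element in-memory cost directly; a small sketch (the array sizes here are arbitrary, chosen just to make the difference visible):

```swift
// In-memory sizes, independent of how SQLite stores the values on disk.
assert(MemoryLayout<Int32>.size == 4)
assert(MemoryLayout<Int64>.size == 8)

// For a large in-memory collection the difference adds up:
let n = 1_000_000
let smaller = [Int32](repeating: 0, count: n)  // ~4 MB of element storage
let larger  = [Int64](repeating: 0, count: n)  // ~8 MB of element storage
print(smaller.count, larger.count)
```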

At the same time, though, do you have so much data that this kind of optimization will have any effect? Unless you have very large data sets, using a larger integer type than necessary is unlikely to make a noticeable difference in your app.
