I have some Objective-C code that works on Intel Macs, but fails on Macs with the M1 chip.
dispatch_time_t now = dispatch_time(DISPATCH_TIME_NOW, 0);
dispatch_time_t delta = (uint64_t)(1 * NSEC_PER_SEC);
dispatch_time_t when = dispatch_time(now, delta);
NSLog(@"now: %llu", now);
NSLog(@"delta: %llu", delta);
NSLog(@"when: %llu", when);
NSLog(@"real delta: %llu", when-now);
It'll print something like this:
now: 1272914827933
delta: 1000000000
when: 1272938827933
real delta: 24000000
1272914827933 + 1000000000 = 1273914827933, NOT 1272938827933. Why is this failing, and only on some Macs?
Edit: this is also broken when running on an actual iPhone device. Just create a new Xcode project using Obj-C, paste the code into the AppDelegate, and run on your phone.
CodePudding user response:
Turns out dispatch_time returns Mach ticks, not nanoseconds, on Apple Silicon. And dispatch_after takes the same tick-based values, so it all works out in the end; but the short of it is that you can't compare the raw numbers the way I was trying to.
CodePudding user response:
dispatch_time()'s second argument (delta) is in nanoseconds, but the unit of the returned dispatch_time_t depends on the CPU architecture: on Intel the returned value is measured in nanoseconds, while on Apple Silicon it is in Mach ticks.
Since the two values are not in the same unit, you cannot do arithmetic on them this way.
Here is a new test. Notice mach_absolute_time() is always slightly ahead of the dispatch_time() "now" value.
// cc -framework Foundation -o dispatch_time dispatch_time.m
#include <mach/mach_time.h>
#import <Foundation/Foundation.h>

int main(void) {
    dispatch_time_t now = dispatch_time(DISPATCH_TIME_NOW, 0);
    dispatch_time_t delta = (uint64_t)(1 * NSEC_PER_SEC);
    dispatch_time_t when = dispatch_time(now, delta);
    NSLog(@"mach_absolute_time: %llu", mach_absolute_time());
    NSLog(@"now: %llu", now);
    NSLog(@"delta: %llu", delta);
    NSLog(@"when: %llu", when);
    NSLog(@"invalid (mixed units): when - now: %llu", when - now);
    return 0;
}