Alt text:
It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TGC) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]
Could you ELI5 what a frame and an epoch are? I don't get why unix timestamps aren't an adequate way to store time; they seem pretty easy and intuitive
Well, in three dimensions you can have a Cartesian coordinate frame (x, y, z) with its own unit of length, and you also have an origin (a (0,0,0)).
With time you can have an arbitrary unit of duration, say a certain number of ticks of a caesium atom's vibration, and a point you start ticking from, say a specific moment when all the stars were arranged one way. Let's call that our inertial time base.
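For a toy example of "epoch = zero point" (my numbers, not from the comic): Unix time counts seconds from 1970-01-01T00:00:00 UTC, while the J2000 epoch is 2000-01-01T12:00:00 TT. If you pretend both tick at the same rate, converting between them is just shifting the zero, e.g. in Python (the roughly 64-second TT-to-UTC offset is ignored here):

    # Same tick length assumed, different epochs (zero points).
    # J2000 (2000-01-01 12:00:00 TT) as a Unix timestamp is roughly this,
    # ignoring the ~64 s difference between TT and UTC at that date.
    J2000_AS_UNIX = 946_728_000  # seconds, approximate

    def unix_to_seconds_past_j2000(unix_seconds: float) -> float:
        # Same "frame" (tick length), only the epoch changes: a pure offset.
        return unix_seconds - J2000_AS_UNIX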
Now let's add a clock into the mix. The clock counts seconds in its own way from the moment it powers on, so it has its own time frame (length of a second) and epoch (zero time).
Now, if the clock's second is the same as the inertial frame's second, you're golden: t2 - t1, plus the offset between the two zeros. If the clock is fast or slow (let's assume linearly for now), you have to juggle the difference in slope between clock time and the inertial time frame while doing the subtraction, and even when applying the offset.
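As a sketch (the names and the drift number are made up): if the clock runs, say, 50 ppm fast relative to TAI and the drift is linear, even a plain t2 - t1 has to be rescaled before it means anything in the inertial frame:

    # Hypothetical clock running 50 parts per million fast relative to TAI.
    CLOCK_RATE_REL_TAI = 1.000050  # clock seconds elapsed per TAI second (assumed linear)

    def clock_interval_to_tai(t1_clock: float, t2_clock: float) -> float:
        # A raw (t2 - t1) is in "clock seconds"; divide by the relative rate
        # to get the interval in TAI seconds. The epoch offsets cancel here.
        return (t2_clock - t1_clock) / CLOCK_RATE_REL_TAI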
So you end up with stuff like clocktime_rel_J2000epoch_TAI = clocktime_clock * TAI_slope_rel_CLK + CLK_zero_offset_rel_TAI in production code, and other people remove it and wonder why you'd bother having a relative slope at all, and later find out nothing works when the clock gets hot or cold and the slope drifts…
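A minimal sketch of what that calibration line might look like, with hypothetical constants (in practice the slope and offset come from comparing the clock against TAI, and have to be re-estimated as temperature changes the oscillator):

    # Hypothetical calibration constants, fitted by comparing the clock to TAI.
    TAI_SLOPE_REL_CLK = 0.999950        # TAI seconds per clock second
    CLK_ZERO_OFFSET_REL_TAI = 1234.567  # clock's zero time, as TAI seconds past J2000

    def clocktime_rel_j2000_tai(clocktime_clock: float) -> float:
        # Linear model: scale by the relative rate, then shift by the epoch offset.
        # If the oscillator drifts with temperature, the slope has to be re-fitted.
        return clocktime_clock * TAI_SLOPE_REL_CLK + CLK_ZERO_OFFSET_REL_TAI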