Why the iPhone Timer app displays a fake time (lukashermann.dev)
223 points by _antix on Dec 28, 2020 | 74 comments



> Milliseconds are converted to seconds by dividing by 1000 and rounding down like so

It seems like the simpler explanation is that the iPhone actually rounds to the closest integer rather than rounding down. (Whether they do this invisibly by adding 0.5s and rounding down is an implementation detail.)
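A minimal sketch of that in JS (hypothetical function name, not Apple's actual code): rounding the remaining time to the nearest second covers both descriptions, since for non-negative values Math.round(t) behaves like Math.floor(t + 0.5).

    // remainingMs: hypothetical time left in milliseconds
    function displaySeconds(remainingMs) {
        return Math.round(remainingMs / 1000); // round to the nearest second
    }

    displaySeconds(4900); // 5 -- 4.9s left still shows "5"
    displaySeconds(4400); // 4
    displaySeconds(10);   // 0 -- only the last half second shows "0"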


Correctly rounding to the nearest second is exactly the behaviour that I expect from a timer.

It's not "fake".


Exactly. Seems like the author is suggesting a timer with 4.9 seconds left should tell you it has 4 seconds left instead of 5. Not sure why anyone would think this.

The tidbit about it displaying 1 even if there is 0.01 left is interesting though. Makes sense.


reddit works like this if you get downvoted and only get to post every 10 minutes. It tells you that you have to wait 1 more minute, and then a minute later it tells you that you have to wait 59 more seconds. Very counterintuitive. It gets me every time when I have a post typed out and it's putting me on timeout for going against the grain (e.g. suggesting that mob justice is bad).


> reddit works like this if you get downvoted and only get to post every 10 minutes.

Oh, so that's why it's telling me to wait... I always wondered that whenever I got into an argument with someone.


Unless you pause it... In which case it rounds up...


Only at 0:00. That's a separate feature, showing 0:01 at a minimum for a paused timer (presumably to make it very clear that you aren't looking at an expired timer, even though the UI disappears for an expired timer.)


Ah. I see that my Apple app does do that, but it's not my preference.

I would much prefer "0 seconds remaining" to mean 0 to 0.5 seconds remaining rather than any other possible range. I would find it much easier to parse.

(Alternatively, please show more accuracy for this case, where it's likely relevant)


The sentence quoted is for the first example in the post, not the iPhone timer.

That’s most likely what it does indeed, much more elegant than keeping track of the extra 500ms.


I realize that. It seems like the entire blog post could be summarized as: "I thought about doing something this way, and when it didn't give me what I and everyone else expect out of a timer, my mental model adjusted to conclude that Apple was doing some weird gymnastics involving adding 500ms in some of their timer code and using Math.floor(), but not adding it to the time the timer actually goes off, rather than adjusting my mental model to match the simpler way humans expect rounding to work in a timer: [the equivalent of] Math.round()."


Agreed, to my eyes it seems clear that the display shows both “5” and “0” for half a second, which means the total duration is still 5 seconds, not 5.5 seconds.


I've never seen something so confusing. I've been (and quite frankly still am) seriously considering that this is a big prank that's going over my head. The author even mentions that because time goes up and countdowns go down, rounding down is confusing. Wouldn't the logical conclusion be that you need to round up?

Wouldn't rounding up be the most logical option ANYWAY? If someone counts me down, I expect them to say "5, 4, 3, 2, 1, GO", and I expect a timer to do the exact same thing. This fixes seemingly every issue, starting on 5, not having to round to nearest integer (or add 500ms, which is an odd way to look at it, but sure), and not showing 0 (which I'd find quite weird). I just checked my own (Android) timer, and sure enough it rounds up, which is what I expected and what I expect from a countdown in general.

Maybe what seems to me like the most obvious behaviour for a timer is not as obvious as I think, but why did the author not even mention rounding up as a solution, or how it could conflict with his own expectations?

I'll just assume this to be a shower thought that didn't fully mature before making it into an article, but there is enough effort here that I'm still left quite confused. Oh, and don't mistake this for a euphemism for me critiquing the intelligence of the author; at worst it's just a funny blind spot (as we all experience), but I'm seeing very few mentions of this in the comments here, which only adds to my bafflement.


> If someone counts me down, I expect them to say "5, 4, 3, 2, 1, GO", and I expect a timer to do the exact same thing.

"Exact same thing" isn't really possible here because the timer needs to show each integer for a duration of 1 second, while the spoken words each have a duration of much less than 1 second. So much less that they're really identifying a moment (perhaps the initial consonant, or the peak amplitude) rather than a window of time. This is why "go" is able to be intuitively anticipated accurately.

If the timer app flashed each number briefly, with no display for the vast majority of each second, then it could do the "exact same thing" as spoken English. Considering the transition from displaying one number to displaying the next as analogous to a spoken word is about as good as we'll get, in which case "1" ought to be displayed for one second immediately prior to "go" (don't display "0" for the last second and certainly not for a half second).


Original author here.

I have thought about your comment and other people's comments. It could well be that it's just rounded to the nearest integer, making my +500ms assumption wrong. However, this would result in 59.51s being displayed as 0:60, though it should be 1:00. Rounding up has the same problem.

I went through the same assumptions coding my own timer. Let's just round everything. But this resulted in 51.0s being displayed as 1:51 on the timer, with rounding up it's even worse resulting in 1:01:51. So at least hours and minutes have to be rounded down. But that makes it even more confusing that seconds are NOT Math.floor().

So in the end I "gave up" and used the date-fns package (works great) but it also rounds everything down. That's why I believe that iOS adds 500ms so the minutes and hours work properly.


> this resulted in 51.0s being displayed as 1:51

What?

> with rounding up it's even worse resulting in 1:01:51

Huh?

If you are making some kind of timer, you first should convert the time to pure seconds, or in whatever the smallest unit is that you care to deal with (milliseconds?). It is only at that point that the data is ready for any kind of arithmetic (for example, to subtract 1 second). Then, when you wish to display it, you first convert it.

For example, if the user sets your timer to 2 minutes, you would convert "2:00" to 120 seconds. Then you subtract 1 second. Now the internal value is 119. Then you convert back to the display format: "1:59". But be sure to keep "119" in some internal variable, for the next change.

    <form name=f>
        <input type=number name=h min=0 max=99 value=0>
        <input type=number name=m min=0 max=59 value=00>
        <input type=number name=s min=0 max=59 value=00>
        <input type=submit value=Start>
        <output name=r></output>
    </form>

    <script>

    document.forms.f.addEventListener('submit', function (ev) {

        ev.preventDefault();

        // Convert the h/m/s inputs to a single count of seconds;
        // all arithmetic happens on this one integer.
        let f = ev.target,
            s = Number(f.h.value) * 60 * 60 +
                Number(f.m.value) * 60 +
                Number(f.s.value);

        // Only when displaying do we split the seconds back into h:mm:ss.
        function show(f, s) {

            let h = Math.floor(s / 60 / 60);
            s -= (h * 60 * 60);

            let m = Math.floor(s / 60);
            s -= (m * 60);

            m = String(m).padStart(2, '0');
            s = String(s).padStart(2, '0');

            f.elements.r.value = [h, m, s].join(':');
        }

        show(f, s);

        // Restart any running countdown, then tick once per second,
        // decrementing the internal seconds counter and re-rendering.
        if (f.timer) { clearInterval(f.timer); }
        f.timer = setInterval(function () {
            if (0 === s) { clearInterval(f.timer); return; }
            s -= 1;
            show(f, s);
        }, 1000);
    });

    </script>


> However, this would result in 59.51s being displayed as 0:60, though it should be 1:00. Rounding up has the same problem.

Surely one shouldn't round _after_ splitting the time into minutes and seconds.

You floor() or ceil() the time, depending on whether you're counting up or down, to the lowest unit your timer is showing; then you display it and in the process split it into hours, minutes, etc.

So ceil(59.51s) = 60s, which is then converted to 0h 1m 0s for display.
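
A quick sketch of that order of operations (plain JS, hypothetical helper name): ceil the total remaining seconds first, then split into minutes and seconds for display.

    // Round first, split second: avoids ever showing "0:60"
    function formatCountdown(remainingSeconds) {
        let total = Math.ceil(remainingSeconds);     // 59.51 -> 60
        let m = Math.floor(total / 60);              // 60 -> 1
        let s = total % 60;                          // 60 -> 0
        return m + ':' + String(s).padStart(2, '0'); // "1:00"
    }

    formatCountdown(59.51); // "1:00", not "0:60"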


> However, this would result in 59.51s being displayed as 0:60, though it should be 1:00.

They would round it up to 60, then format it as 01:00.


Round half up is what most people expect when rounding to the nearest whole value.

It isn't displaying a "fake" time, it's rounding.

https://en.wikipedia.org/wiki/Rounding#Round_half_up

Edit: Note, Round Half Up != Round half away from zero but that isn't important for this scenario :D


Round half up is a reasonable thing to do with many data types, but it's not something that's typically done with time. Any sensible time handling solution will truncate instead of rounding.

Which is not to call apple's timer nonsensical. I agree with the author that it's a nicer experience, but I think it's likely an intentional decision to make the display look better, not a consequence of naive rounding.


> Any sensible time handling solution will truncate instead of rounding.

That doesn't really work for a countdown timer though since the last second would then display as zero, which is nonsense for 'time remaining'. You have to round up, always.

The Android 9 countdown works intuitively: it takes a second to decrement from 5 to 4 and sounds the alarm exactly at 0. You can't pause it at zero...

> Which is not to call apple's timer nonsensical.

It pretty much is, since the last 'second' takes more than a second to elapse.


I'll rewrite parent's statement to apply to timers and stopwatches:

> Any sensible time handling solution will round off toward the starting value.

So a stopwatch won't display "1" until a full second has elapsed, and will look like this for five seconds: "0...1...2...3...4...5!"

A timer won't display "4" until a full second has elapsed, and will look like this: "5...4...3...2...1...0!"

In the stopwatch case, the display shows "0" for 0.999 s; in the timer case it'll show "1" for 0.999 s.
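
As a sketch (hypothetical function names), "round off toward the starting value" comes out as floor for a stopwatch and ceil for a timer:

    // Stopwatch counts up from 0: floor toward the start
    function stopwatchDisplay(elapsedSeconds) {
        return Math.floor(elapsedSeconds); // 0.999 -> 0, 1.0 -> 1
    }

    // Timer counts down from the full duration: ceil toward the start
    function timerDisplay(remainingSeconds) {
        return Math.ceil(remainingSeconds); // 0.999 -> 1, 0 -> 0
    }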


But this highlights the difference between the two devices, stopwatches and timers.

A stopwatch counts up. Therefore, it makes sense to use the floor (4.99 is still "4", not yet "5").

A timer counts down. Therefore, it makes sense to use the ceiling (3.01 is still "4", not yet "3").

In fact this was a turning point in the original article: "rounding down . . . makes a lot of sense when counting up. . . . But for a countdown timer, this is counterintuitive."


Yup, that was my motivation for rewriting the parent statement. Abstracting away the type of counter and just saying, basically, "show the number only once you've crossed it".


Sorry. First, I thought that you set out to contradict, rather than agree with, the person you replied to (since that is common). Then I was thrown off by your topic sentence, which seemed to lump together the two devices ("timers and stopwatches"). At this point, I think my reading accelerated to highway speeds and totally botched parsing the rest ;)


> Any sensible time handling solution will truncate instead of rounding.

It's not a "handling of time", it's just a display. And it's extremely sensible to round to the nearest displayable number instead of rounding up or down, because this minimizes the error in the displayed number. And that's what you are interested in: conveying as accurately as possible what the current time is.


Again, I'm not trying to suggest anybody made a wrong choice here, just that they made an intentional decision.

There's no way this is happening because somebody called Math.round() without thinking; you don't round time, and the developers who make Apple's clock app know that. They made a conscious decision to have the timer work this way. You can disagree with whether they made the right choice, but I'm pretty sure they thought about it.


Agreed. This is a desktop widget display we're talking about here, not a log file entry.


Conceptually you are attempting to display something that has more precision than you are willing to display.

You can

   - Round (Truncation is a type of rounding, Rounding towards zero)
   - Adjust the display to increase the precision it displays
   - Do both by dynamically increasing the precision of the display/decreasing rounding as the distance to zero closes in.
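
A minimal sketch of the third option (the threshold and formatting are made up for illustration): keep whole seconds normally, but switch to tenths as the countdown nears zero.

    // Hypothetical: whole seconds above 10s remaining, tenths below
    function formatRemaining(remainingSeconds) {
        if (remainingSeconds < 10) {
            return remainingSeconds.toFixed(1);     // e.g. "4.9"
        }
        return String(Math.ceil(remainingSeconds)); // e.g. "42"
    }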


I think I find the ceiling method most intuitive. I have noticed that video games with cooldowns tend to agree with me; some ability has a 4 second cooldown, you press it, and the UI says "4", even though 3.999 seconds remain until the ability is back. (This means that you never see "0"; the counter reads "1" for the last second before the cooldown is available again.)


The ceiling method matches with the expectation of how much time is left when the major digit changes, which is the same as how we do countdowns verbally in English.

We say, "5, 4, 3, 2, 1, 0", not "4, 3, 2, 1, 0, done".


... or they just round the time to the nearest second instead of taking the floor


The way I see it, the problem stated by TFA is simply a mismatch between what wants to be represented vs. what is represented.

Dividing the number of milliseconds left by 1000 is too naive. If you want the representation "0 seconds" to actually match the instant when there are 0 seconds left (i.e., no more time left), you should label these millisecond ranges as follows:

* 5000 to 4001: "5 seconds"

* 4000 to 3001: "4 seconds"

* 3000 to 2001: "3 seconds"

* 2000 to 1001: "2 seconds"

* 1000 to 0001: "1 second"

* 0: "0 seconds" (which usually the user doesn't read if the screen shows something else)

Simple enough?
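
Those ranges are just a ceiling on the remaining milliseconds; as a one-line sketch (hypothetical name):

    // 5000..4001 -> 5, ..., 1000..1 -> 1, 0 -> 0
    function secondsLabel(remainingMs) {
        return Math.ceil(remainingMs / 1000);
    }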

The thing is, yes the iPhone timer does something in between, in order to show both the initial (5) and the final (0) numbers for a non-instantaneous amount of time, but (IMHO) that's the most confusing choice.


This gives me a nostalgic reminder of when I used to watch Nickelodeon GUTS. Whenever a countdown timed challenge was about to end, I would hear the audience count down starting with "10" at maybe 10.7 seconds, "9" at 9.7 seconds, etc. Then after "1", there was a small period of awkward silence and murmurs from the audience before the siren blasted.

It's a good example of how measurement of time in terms of human perception differs from the pure mathematical sense, and rounding is a good compromise here (wherever it doesn't display fractional seconds).


I was thinking of the same phenomenon, but at school sporting events when the crowd would count down while watching the time on the scoreboard. The awkward pause between shouts of “zero!” and the actual buzzer.


Very weird to frame this as "lying" and "fake" when it's just using .round() instead of .floor()


To be fair, it would be an extremely boring thing to write about without all the misleading hyperbole.


From what I can see, the iPhone timer app is rounding to the nearest second. For positive numbers that can be accomplished with either Math.round(t) or Math.floor(t+0.5).
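
A quick check of that equivalence for non-negative values (JavaScript's Math.round rounds halves up, so the two agree):

    Math.round(4.4999);       // 4
    Math.floor(4.4999 + 0.5); // 4

    Math.round(4.5);          // 5
    Math.floor(4.5 + 0.5);    // 5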


On iOS 9 and below, IIRC, the stopwatch screen in the timer app would really flake out after 1000 hours. By 'flake out' I mean it would slow down the updating a lot - the screen got somewhat jerky and slow to respond.

From iOS 10 onwards, 1000+ hours doesn't seem to faze it at all - no slowdowns.

source: 1490:43:24 ago I started my timer, but I can't remember why now.


Not a "fake time" at all, as many others here have already pointed out; but just an example of https://en.wikipedia.org/wiki/Quantization_error .


Answer: it rounds to the nearest second.


Seems like a bit of a reverse Occam's razor to explain the fact that the timer shows 1 at 0.501 and 0 at 0.499 with a fake secondary timer running at a 500ms offset and rounded down, instead of just saying "the display is rounded to the nearest whole number".


> But for a countdown timer, this is counterintuitive.

I get the feeling this pun wasn’t intentional, but it’s a good one nonetheless.


I think "rounds to closest" is a simpler way to put it than "adds 500 ms before truncating".


Am I alone in thinking the new timer app UI sucks? The old app, where it used a spinner to set an alarm time and where I had N alarms that I could edit, was simple and easy to use.

The new app, where hours and minutes are separate text fields and where you can't edit an alarm (you can only delete old ones and create new ones), is utterly unintuitive to me and much slower and more error prone than the old one. Clicking in the hour or minute section is a much smaller target. Editing sucks. I want to set an alarm for 10:00 and it defaults to showing the current time, 09:33, so I click the 09 and type 1000, but it only edits the 09 to 10. The 33 stays and I have to click the other side and enter 00.

To be snarky it's like some UX designer from Google went to Apple. This is not a UX I'd expect from Apple.


I guess I'm with sibling commenter that I must use Siri to set my alarms for the last few iOS versions, because I just opened it to see what you're on about (I mean, c'mon, that UI is fine) and...OMG, you're right. What tiny-fingered designer thought that time field was acceptable? The only thing they seemed to have gotten right is allowing the direct typing of digits to set a time. The rest? Exactly the hot mess parent describes.


You can still edit them, just tap edit in the top left and then the alarm. Why we can’t just tap the alarm and save a step I don’t know. The small text input UI instead of the old scrolling thing (I’m aware it still scrolls, just harder to use) is terrible though.


YES! This is so irritating - I assumed I was just being blind to some amazing UX as I fat-finger the smallest fields known to humankind, trying to eke out a few more minutes before my first Zoom call.


I just ask Siri to "set an alarm for...", much less painful.


Asking Siri is noisy (there are people sleeping next to me).


You can type to Siri, under the accessibility settings.


Typing "Hey Siri, set an alarm for 10am" doesn't sound remotely like an improvement over the old alarm UX.


It's an improvement over the new one. IMHO anyway :)


Maybe the author should look up different modes of rounding. Round-to-nearest is the "intuitive" choice Apple made here. It's not "fake time". Srsly..

Besides, instead of truncating, it would even be sensible to round up here, so that you get "5, 4, 3, 2, 1, done!" Non-mathematicians and non-programmers don't want to stare at 0 for a second. They likely don't want to stare at a 0 at all. That wouldn't be "fake time" either.

It really gets to me that he writes such an article and his self-description is "I'm a Maker and Full Stack Developer who loves working with the Frontend, Vue.js and User Interface Design." Not a great ad for yourself..


Yo. Stop being mean. This isn’t a competition. You’re not in high school anymore. You don’t need to put others down to boost yourself.


Mature adults should be able to take criticism without getting butt-hurt about it. The real world can be mean and nasty and sometimes you need a wake-up call that you're being silly or screwing up.


Assuming productive criticism, yes. IMO the comment in question wasn’t that. E.g remarks about author’s bio seem random and unnecessary to me.


That's fair and I agree. I'm not defending GP's remarks, just adding general commentary.


I have seen a lot of comments on Hacker News that I thought were mean. This isn't one of them.

Rounding is something I learned in elementary school. If someone in elementary school was unfamiliar with rounding, it would be mean to take them to task about it. But this writer is an adult, a professional programmer, and presumes to be a teacher, by taking the time to publish an article about the subject.

In fact, the writer has even less excuse, because he knows about rounding. Twice he mentions "rounding down". So it is all the more baffling that he did not call what Apple was doing, simply, "rounding up". Instead he wrote a multi-paragraph blog post about adding 500 milliseconds of "fake time", then having to duct-tape over that by making the timer say "0" when the timer reached 500 ms.

Suppose a man bills himself as a car salesman, but every question you ask him about the car, he says the wrong thing. He says the car has 3 wheels when it has 4. He says it is red when it is black. He says the engine is in the back when it is in the front. Would it be mean for you to mention his title, by saying something like, "You say you are a car salesman, but you don't know about this car on your lot." It is right to call someone out for being less than what they say they are.

As for random blog posts on the internet, this isn't so surprising. What's surprising is that it wound up on Hacker News with over 100 votes.


You don’t need to put others down to boost yourself.

Isn't that exactly what you did right here?


Yes, it is.


I built a similar tool to the OP that implements the same rounding behavior if you’re in the iOS ecosystem — https://apps.apple.com/us/app/synctimer-by-practice/id152297...

One of the main benefits is that it doesn’t require the internet and can synchronize iPhones, iPads, and Apple TV’s using either peer-to-peer WiFi or Bluetooth with no manual pairing.

There's also an interval timer for exercises, group activities, etc...


Summary: If you start a timer for 60 seconds, it would practically never display 60 but would start at 59. This is not intuitive, so Apple added 500ms to the visualization.


I think my Garmin smartwatch shows distance rounded up, but plays alerts based on the actual value. So you can have a watch that says you've run 10.00 km on the display... but you have to go a bit further to trigger the 10 km vibration alarm. And crucially, if you stopped before the alarm, it wouldn't register as a 10 km run in your records!


I feel like the countdown timer should round up, rather than round to the nearest integer. This makes it consistent with an integer stopwatch, which will round down. In my experience, Apple's timer behavior makes me feel like there's a weird half-second delay when starting and stopping the timer.


Android shows rounded up. You never see zero. That also seems reasonable to me.


This blog could be summarized to fit into a tweet without losing any value.


Other than at the start and at zero, the time remaining is in reality an irrational number which cannot be represented at all. So every representation ever is truncated. :)


Is this the fencepost problem but with time :P?


With these issues I always remember the Unix "uptime" command and its rounded precision.


While building a tea timer I naturally stumbled into this exact solution.

Edit: sencha.app for those curious


That was the solution for my primetimer app as well; funny that I had the same findings.


Everyone is saying "rounding, duh" but that's a bit unfair IMO.

You should show each integer second for, ideally, one second (in the case where fractional seconds aren't displayed). So the question is: exactly what range of time should it cover? The poster expected that for second N, it would be:

[N,N+1)

But Apple seems to do:

[N-0.5, N+0.5)

I think the Apple behavior makes perfect sense when you are displaying the current time (e.g. on a watch). It may be a little less obviously correct for the timer case, but it still seems fine. I'd bet they implemented it for the time-of-day case and just re-used it for the timer case.
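
Written as display functions (hypothetical names, not Apple's code), the two conventions are just floor versus round:

    // Poster's expectation: show N for remaining time in [N, N+1)
    const floorDisplay = (remaining) => Math.floor(remaining);

    // Apparent Apple behavior: show N for remaining time in [N-0.5, N+0.5)
    const roundDisplay = (remaining) => Math.round(remaining);

    floorDisplay(4.9); // 4
    roundDisplay(4.9); // 5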


Well, you have 6 numbers and only 5 seconds to show them in. You could show 0 only when the timer has fully completed.

I think either rounding up or rounding to the closest integer is fine. The author's expectation of always rounding down is the only option I would never expect.


I think people have different expectations for:

"What time is it now?"

vs

"How much time is left?"



