
Why doesn’t the clock in the taskbar display seconds? (2003) - breadtk
https://blogs.msdn.microsoft.com/oldnewthing/20031010-00/?p=42203
======
phkahler
On a more practical note, why would you want to have seconds on the clock? I
find that most of the time I don't want to know the time to a precision of
better than 5 minutes. That may just be me, but when someone asks the time and
I read it off my phone as 3:56, it really bothers me, and sometimes I convert
that to 5-to-4 just to avoid what in most cases is arbitrary extra precision.
Sure, there are times when minutes count, and there are occasions when I want
to time something to the second, but from a UI perspective those are not
common use cases that warrant screen real estate for extra digits, especially
back then.

~~~
anonred
On macOS, it’s a useful indicator of when the system has frozen or
kernel-panicked, which unfortunately happens far too often.

~~~
throwaway2016a
You must have really bad luck with Macs.

As a MacOS user for seven years I can't say I've ever had that issue.
Certainly not enough to need the seconds to be an indicator. I've had my
system completely lock up maybe four times and those times my mouse refusing
to move was enough of an indicator.

Heck, right now it's been two and a half months since I've rebooted... and
probably a year since I've rebooted because it froze.

I'm not an Apple fan boy either. In fact my next computer will probably be
Linux. But I have to admit the last three Macs I have owned have been great
computers.

~~~
throwaway32617
You must not work in an organization that insists on installing security
software for you. If I have >2w uptime on my machine I generally have to
restart for some reason or other. It's like people have forgotten that
computers can and should have more uptime than that.

A faulty update from one of our security software vendors caused a once-per-
hour kernel panic and forced reboot for a day or two. It was really nice.

(Obviously none of this is the fault of OSX or Apple but I would kill for 2.5m
of uptime on my work laptop).

~~~
eropple
I feel you there. It got to the point at one gig where engineering folks--who
were given root on their preconfigured laptops--often spent their first day
figuring out how to kill and extricate Sophos from their machines. For
developers who weren't seasoned jerks, it could be a real challenge until
somebody took pity on them and helped them find single-user mode, yank out the
kexts, and manually delete everything. (The happy .pkg BOMs helped out, too!)

Ordinarily I wouldn't do this and just punch out on a gig (because seriously,
it's that annoying and that effectively-useless), but we were a recently-
acquired startup subsidiary, so we were also our own IT shop--the CTO actively
approved.

~~~
i336_
Is it still possible to do this kind of thing on the most recent macOS? I
recall the issue with the 'git' binary not being modifiable, even with sudo,
unless you mounted the macOS filesystem under Linux and made the changes from
there.

~~~
eropple
You can disable rootless (System Integrity Protection) via the recovery
partition, by running `csrutil disable` from its Terminal.

~~~
i336_
Ah, thanks. I'm assuming that quietly got added in after "we can't touch the
filesystem _at all_" caused too many complaints...?

~~~
eropple
It was literally there from day one (from day minus-a-bunch if you count the
beta releases) and the hyperbolic inanity that led normal people to the
misconception you had was never necessary in the first place.

~~~
i336_
Heh. Thanks for the info.

I just checked, I read about this at
[http://rachelbythebay.com/w/2016/04/17/unprotected/](http://rachelbythebay.com/w/2016/04/17/unprotected/).

I must admit that I _am_ very curious why "_git and 64 other files in
/usr/bin are all the same size (18176 bytes on my machine)_", why dtruss and
strings fail, and what behavior changes when you turn rootless off.

------
achivetta
It might not be the memory concern it once was, but today it might very well
be a power concern. Modern laptop display pipelines can save power if they
don't have to redraw any of the screen: constantly updating status bar
animations break that.[1]

1:
[https://developer.apple.com/videos/play/wwdc2013/712/](https://developer.apple.com/videos/play/wwdc2013/712/)
(I think the relatively short graphics section mentions this. Can't seem to
find the talk that goes into it in more detail...)
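
The general pattern is cheap to sketch: even when a timer fires, only dirty
the screen if the visible text actually changed, so nothing needs to be
redrawn between minutes. A minimal TypeScript sketch (the element id and
format are illustrative, not from the talk):

```typescript
// Only write to the DOM (and thus trigger a repaint) when the visible
// string changes; an idle clock then stays static between minutes.
const clockEl = document.getElementById("clock")!; // hypothetical element
let lastText = "";

setInterval(() => {
  const now = new Date();
  const text = `${now.getHours()}:${String(now.getMinutes()).padStart(2, "0")}`;
  if (text !== lastText) { // skip the write when nothing changed
    clockEl.textContent = text;
    lastText = text;
  }
}, 1000);
```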

~~~
Moter8
Can't watch on either Chrome or Firefox. Apple are real gits.

~~~
huxley
Neither Chrome nor Firefox supports HLS on the desktop, despite providing
support on mobile (since Android 3.0, I believe). They have categorically
refused to add desktop support; if you have any complaints, I would take it up
with them.

~~~
milesrout
Why would they support a closed, proprietary format when there exist better,
open ones?

~~~
clouddrover
What is a better one? A patent pool has formed around DASH and you need to buy
a license to use it:

[http://www.mpegla.com/main/programs/MPEG-DASH/Pages/Intro.aspx](http://www.mpegla.com/main/programs/MPEG-DASH/Pages/Intro.aspx)

~~~
icebraining
Use a plain file? It's not like that video is a real time feed anymore.

Otherwise, include a js player with HLS support for other browsers.
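
For the second option, a library like hls.js is the usual route; a minimal
TypeScript sketch (the stream URL is a placeholder):

```typescript
import Hls from "hls.js";

const video = document.querySelector<HTMLVideoElement>("video")!;
const src = "https://example.com/stream.m3u8"; // placeholder URL

if (Hls.isSupported()) {
  // Chrome/Firefox: play HLS through Media Source Extensions
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari: native HLS support
  video.src = src;
}
```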

------
krallja
Too bad the team which made that decision wasn't available to help the VS Code
team not use 13% CPU to render a blinking cursor
([https://news.ycombinator.com/item?id=13940014](https://news.ycombinator.com/item?id=13940014))

~~~
LeoNatan25
We are in a different era now, where desktop apps are written in JS.
Performance is implicitly thrown out of the window(s) when compared to
software written for control over every bit of CPU and memory usage.

~~~
mschuster91
But the JS ecosystem learns things! They implement tree shaking and other
stuff to improve performance!

Obviously real programmers learned all this decades ago - same for the "NoSQL"
followers who after years of struggling with piles of cr.p finally
(re)discover what a true ACID-capable database is...

I'm just waiting for the JS world to rediscover threads and proper
multitasking, but given the track record I don't expect that in the next
decade.

Seriously, the JS world is a hellhole of disasters, and many of them easily
preventable if the people acting as "evangelists" actually had a bit of a clue
about what they're talking about. And to top it off, a JS-only implementation
will never be nearly as fast as well-written C or even C++ code, no matter how
hard "evangelists" push it.

~~~
frozenport
But C++ GUI toolkits are a disaster. While any kid can put together a
reasonable-looking JS app, the default options in toolkits like Qt are junk.
Common features like centering a QComboBox aren't there. Quickly, you're going
to have to custom-build every widget. Then you need to hire engineers to
manually press all the buttons. These engineers won't be too good at art
stuff, so guess what, now you need to hire graphic designers - many of whom
aren't familiar with Qt. Your best bet is to pay a consulting company tons of
money to make a GUI, which you will struggle to keep updated. Even with
unlimited cash, the lead time for this is unjustifiable.

Do customers really want to trade performance for actual money? Are people
still browsing the fatweb?

~~~
mschuster91
> Common features like centering a QComboBox aren't there

Well, browsers didn't have proper vertical centering for UI elements for ages
either. Not to mention it's somewhere between a PITA and impossible to do
"simple" things like styling a file upload button (it only works via
pseudo-element CSS on Chrome) or styling a scrollbar cross-browser (people
usually hand-roll JS stuff, which is predictably slow and unintuitive).

> Your best bet is to pay a consulting company tons of money to make a GUI,
> which you will struggle to keep updated.

It's the same in the Web sphere, with the added difference that clients don't
simply accept "you cannot style a file upload button/input element in a
cross-browser way"; they will usually answer with something along the lines of
"it's the Web and HTML5 after all!!!". You will always need specialized
engineers, designers and UX designers for a well-working app, no matter
whether native or web.

(But yes, I agree with you that C++ GUI toolkits are a disaster, especially
when cross-platform! And especially the build tooling _coughs at autotools_ )

~~~
LeoNatan25
What saddens me is that instead of improving the web software development
process with tried and tested methodologies, the inverse happened, and the web
and its terrible software ecosystems and perpetrators are somehow seen as the
"cool kids" of software development. When observer pattern is seen as the holy
grail of modern web development, you know we are in trouble.

~~~
tracker1
The observer pattern isn't sold to everyone... The thing is, Web browsers have
been nearly as diverse as OSes both historically and recently... But, you get
a mostly compatible UI and runtime engine that's installed on pretty much
every personal computing device out there. There's a lot to be said for that.

As to the "cool kids", they exist in every corner of things that get done by
people. I think WebASM will open up to higher-level tools once it's widely
available, and signalling between the UI layer and WebASM code becomes easier
to deal with.

~~~
LeoNatan25
> But, you get a mostly compatible UI and runtime engine that's installed on
> pretty much every personal computing device out there. There's a lot to be
> said for that.

Only "lowest common denominator" comes to mind.

But to the point of this thread, what bothers me is the transcendence of web
tech outside of browsers, where a technology known historically for terrible
practices with a very low barrier to entry is now considered the "assembly" of
all these platforms, such as mobile and desktop, and the developers who,
until yesterday, could only make a web page that loads 20MB of dependencies to
show a few paragraphs of text, now call themselves "mobile developers" or
"full-stack developers".

~~~
tracker1
Well, I take a bit of offense to the last statement... if only because
JavaScript is just as valid a server-side language as any other scripting
language. The engines themselves tend to be more optimized than the
alternatives, and combined with NPM/CJS modules they offer a much nicer
experience than many of the alternatives.

As to being able to target mobile/desktop, why not? I mean with Cordova and
Electron you can target 4+ platforms with minimal code variance. And in most
cases the performance is good enough, until you need more. And React-Native
can go even further towards native/compiled language performance, with
slightly more variance.

Yes, there is more memory and cpu use than alternatives, but there's also a
real cost in developer time, and time to launch. In many cases it's the
difference between having platform X, or not... the alternative is nothing,
not something better in most cases.

------
mschaef
This is a good article, but keep in mind the timeframe it writes about. Not
only was it 22 years ago, it's describing systems at the bare-minimum system
requirements. (Personally speaking, I remember speccing out a previous-gen
Windows 3.1 machine with 8MB of RAM in '92... and this is covering 4MB systems
in 1995. By that time, 8-16MB would've been common.)

Another story from the time is that I went to CompUSA (as a 'spectator') on
the night of the Windows 95 release. (It was a midnight release and the stores
were open late.) There was a line out the door that snaked around through the
store past the Windows 95 boxes, the Plus Pack boxes, MS Office, and then the
memory upgrade desk...

~~~
bitwize
The Windows 95 hype train was truly something to behold. The Empire State
Building lit up in Windows colors, Friends cast members appearing in a (truly
lame) promotional video... Windows 95 was the release that put the phrase
"operating system" on the common man's lips -- for better or worse.

~~~
movedx
I'd never heard of this "Friends cast members" video, so I did a bit of
YouTubing... I have no words (nearly an hour long, and I think there's a part
two):
[https://www.youtube.com/watch?v=kGYcNcFhctc](https://www.youtube.com/watch?v=kGYcNcFhctc)

This led me to a few other videos:
[https://www.youtube.com/watch?v=2owaKucyU1Y](https://www.youtube.com/watch?v=2owaKucyU1Y)

Sigh. The 90s.

~~~
Arizhel
The 90s were really great in a lot of ways. But as you've shown, they were
really awful in a few others.

I wonder if there's a parallel universe out there where Windows 95 never
happened and Microsoft went out of business somehow. That would be the
universe where the 90s were a truly wondrous age.

------
valuearb
I worked on a visual programming language in the late 80s. It would compile
its programs, but the compiler was really slow; it took minutes for small
programs.

We worked on optimizing it but couldn't find much to optimize. Then the
founder took out the code that updated the progress bar as it compiled, and
compiles finished 10x faster.

This was on a Mac Plus.

~~~
Nition
Meanwhile in 2016:
[https://github.com/npm/npm/issues/11283](https://github.com/npm/npm/issues/11283)

~~~
digi_owl
Truly it seems like those that do not learn from history are forced to relive
it...

------
JoshTriplett
And on current systems, updating every second would decrease battery life by
preventing your system from sleeping. Modern systems can easily sleep for more
than a second at a time, even with the screen on (and not updating).

I also, personally, can't stand to have distractions like that on my screen
away from where I want to focus.
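
For what it's worth, a minutes-only clock can keep wakeups down by arming a
single timer aligned to the next minute boundary rather than ticking every
second; a minimal TypeScript sketch (names are illustrative):

```typescript
// Wake up once per minute, exactly at the minute boundary, instead of
// sixty times per minute.
function scheduleMinuteClock(render: (now: Date) => void): void {
  const now = new Date();
  render(now);
  const msUntilNextMinute =
    (60 - now.getSeconds()) * 1000 - now.getMilliseconds();
  setTimeout(() => scheduleMinuteClock(render), msUntilNextMinute);
}

scheduleMinuteClock((now) =>
  console.log(`${now.getHours()}:${String(now.getMinutes()).padStart(2, "0")}`)
);
```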

~~~
TazeTSchnitzel
That's why Apple got rid of the Time Machine icon animation, IIRC.

~~~
adviceadam
I had completely forgotten about the Time Machine icon animating before.
Interesting. An odd choice, perhaps, because I imagine most people keep the
backup drive at their desk, where wall power is easily available. There is a
setting that enables backing up on battery power, but I think it is disabled
by default.

------
firasd
Raymond once had an article about why the date/time control panel is not a
calendar
[https://blogs.msdn.microsoft.com/oldnewthing/20050621-04/?p=35253](https://blogs.msdn.microsoft.com/oldnewthing/20050621-04/?p=35253)

In Windows 7 they've finally given up and the UI lets you easily scroll up and
down through months after clicking on the clock. (Edit: sorry I meant Windows
10)

~~~
IshKebab
> In Windows 7 they've finally given up

You mean seen sense? How often do you change the date or time on your
computer? Probably never since it's automatic these days.

How often do you want to know the date, or what day of the week a given date
falls on? I'm guessing a lot more than never.

The old UI was stupid.

------
alkonaut
I would have thought it would be worse than useless: our eyes notice movement
in the periphery, and it distracts you into looking at it. I would be annoyed
by some animation going on that I don't care about.

------
devereaux
On Linux, there is a very practical reason: to reduce the number of wakeups
per second, and save power.

You may think it is far-fetched, but try a minimalist XFCE or LXDE desktop,
run powertop updating every 5 seconds (so that screen refreshes from powertop
itself do not distort your measurements) and see for yourself.

For that same reason, you may want to disable the blinking cursor, and conky,
and other gizmos that blink things on the screen.

------
lucb1e
... in Windows 95.

I mean, the article describes why that was the case in versions of Windows
from an era that was much more memory-constrained. Even in 2003, Windows 95
was as old as Android 2.2 is now. I configured my taskbar to display seconds
and can't say Cinnamon is hogging that much more memory or CPU because of it.

------
petercooper
My gut reaction was going to be that it fared poorly in user testing. I'd be
pretty distracted by such a thing and prone to idly watching it :-D

------
userbinator
One thing I've always wondered about the taskbar clock is why they didn't just
follow an existing time formatting standard...

[http://3.bp.blogspot.com/-9pDonc_uipU/UEoMOdhA-eI/AAAAAAAABw8/8gyhuFzTG3A/s1600/Edit_Digital_Clock_Configuration_Window.png](http://3.bp.blogspot.com/-9pDonc_uipU/UEoMOdhA-eI/AAAAAAAABw8/8gyhuFzTG3A/s1600/Edit_Digital_Clock_Configuration_Window.png)

...and invented their own:

[https://www.groovypost.com/wp-content/uploads/2010/01/image_714.png](https://www.groovypost.com/wp-content/uploads/2010/01/image_714.png)

~~~
toast0
From the screenshot, someone not familiar with strftime would be lost; but
most people would be able to figure out the invented format. Also, it looked
like users got to choose from a drop down of suggestions -- the "format
string" was probably only for users to read, not actually parsed.

~~~
userbinator
_Also, it looked like users got to choose from a drop down of suggestions --
the "format string" was probably only for users to read, not actually parsed._

Actually it is possible to write your own format string, since those are
comboboxes:

[https://www.bleepstatic.com/tutorials/windows/customize-windows-time/time-customize-format.jpg](https://www.bleepstatic.com/tutorials/windows/customize-windows-time/time-customize-format.jpg)

Also, I would agree that the Windows UI is better since it provides a short
description of the format letters; but they could've just as easily done that
for the commonly-used strftime() specifiers too, with perhaps a link that
opens a popup with all of them. The live preview at the top also helps.
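
For illustration, the Windows-style "picture" format is easy to emulate; below
is a hypothetical TypeScript mini-formatter for a handful of tokens (the real
shell goes through locale APIs such as GetTimeFormat, not strftime):

```typescript
// Hypothetical mini-formatter for a few Windows-style "picture" tokens:
// HH = 24-hour, hh = 12-hour, mm = minutes, ss = seconds, tt = AM/PM.
// strftime would spell the same thing "%H:%M:%S" / "%I:%M:%S %p".
function formatWindowsTime(picture: string, d: Date): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const tokens: Record<string, string> = {
    HH: pad(d.getHours()),
    hh: pad(d.getHours() % 12 || 12),
    mm: pad(d.getMinutes()),
    ss: pad(d.getSeconds()),
    tt: d.getHours() < 12 ? "AM" : "PM",
  };
  return picture.replace(/HH|hh|mm|ss|tt/g, (t) => tokens[t]);
}

console.log(formatWindowsTime("hh:mm:ss tt", new Date())); // e.g. "03:56:07 PM"
```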

------
neverminder
I always have the seconds on (Ubuntu) and it's a good feature. I leave work
every day at 17:28:15 sharp, otherwise I will miss my train (London DLR) and
yes, seconds matter in this case.

~~~
Someone
If that isn't satire, how are you using the on-screen clock to do that?

Do you start staring at it at around 17:20 (or even earlier?) and wait for it
to hit 17:28:15, or is your internal clock so good that you only have to start
staring at it at 17:28:00?

Either way, why spend time staring at an on-screen clock if you also could
spend that time walking to the train station, or even walking on the platform?

I would find some tool that allows me to set daily alarms at around 17:15 (as
an early warning to finish whatever I am doing) and 17:28:15 (as a sign to
leave now), so that I wouldn't have to waste time staring at that clock.

~~~
neverminder
It's not that hard once you're used to it. I just glance at the clock several
times towards the end of work to know exactly when to wind things down, and I
stand up from my chair at exactly 17:28:15. The optimal time was determined by
observing elevator busyness patterns, crowd thickness downstairs and the delay
margins of trains arriving at my stop.

------
frik
> Because that blinking colon and the constantly-updating time were killing
> our benchmark numbers.

Like Visual Studio Code using 12% CPU while idle due to cursor blinking?
[https://news.ycombinator.com/item?id=13940014](https://news.ycombinator.com/item?id=13940014)

In comparison, the Windows shell (up to Win7), written in C/C++, is a lot more
efficient.
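
A generic mitigation (not necessarily what VS Code actually shipped) is to
stop the blink timer entirely while the window is hidden, so an idle
background editor does no work at all; a TypeScript sketch with an
illustrative element id:

```typescript
const cursor = document.getElementById("cursor")!; // illustrative element
let blinkTimer: number | undefined;

function startBlinking(): void {
  if (blinkTimer !== undefined) return;
  blinkTimer = window.setInterval(() => {
    cursor.style.visibility =
      cursor.style.visibility === "hidden" ? "visible" : "hidden";
  }, 500);
}

function stopBlinking(): void {
  if (blinkTimer === undefined) return;
  clearInterval(blinkTimer);
  blinkTimer = undefined;
  cursor.style.visibility = "visible"; // leave the cursor solid while paused
}

// Pause all blink work while the page/window is hidden.
document.addEventListener("visibilitychange", () =>
  document.hidden ? stopBlinking() : startBlinking()
);
startBlinking();
```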

------
e2e8
I have been using T-Clock for many years to get seconds in the taskbar clock.
[https://github.com/White-Tiger/T-Clock](https://github.com/White-Tiger/T-Clock)

------
ams6110
MacOS can optionally show seconds. Anyone know if it has similar benchmark
impact?

------
eridius
If it's just for benchmarks, why not make it a preference (defaulted off), so
benchmarks can be run with the default settings (off) just fine, but people
can turn it on if they want to?

~~~
daok
1) People with it on would say "Windows is slow". 2) Because it adds code to
maintain that doesn't need to exist.

~~~
eridius
Displaying seconds is a valuable feature, so it's not a matter of having
unnecessary code. As for being slow, the ability to display seconds was
removed because of the minimum system requirements for Windows. Anyone with a
machine that has more than the bare minimum can probably spare the kilobytes
necessary to draw seconds.

------
ex3ndr
Funny thing is that Android 5.0 removed animations in the status bar to save
power, since redrawing frequently eats battery.

------
anc84
How could I benchmark this on XFCE nowadays?

------
shmerl
I always switch it on (KDE).

