Not all analogue clocks move the minute hand smoothly to show progress through the current minute. Many instead tick over, truncating the display to the minute just as digital clocks do.
In movies, when the villain has placed the hero in the mechanism of a clock tower, the minute hand always seems to tick over a full minute at a time. I don't recall ever seeing it in real life, but I don't look at clocks in clock towers that often.
I have a round analog clock with a particularly strange arrangement: it has a second hand (that ticks every second), and it has a minute hand that only moves every fifteen seconds.
(It's a radio-controlled clock: the second and minute hands are on separate motors, presumably because syncing to the actual time with only a single motor driving the second hand, as in a conventional analogue clock, would take too long (and probably make determining position more complicated). There is no independent motor for the hour hand, so the clock does have to roll the minute hand around to move that one.)
I see these clocks often in railway stations (I live in India). There is no seconds hand. The minute and hour hands move in clicks, not smoothly like most clocks.
The clocks in German railway stations have second hands which 'click'. It's particularly fun how the second hand runs slightly fast so that it can pause at the top of the minute, waiting momentarily for a synchronization pulse:
Swiss stations are similar but have continuously moving second hands. I could have sworn this clock characteristic was indeed called 'Swiss motion' but I can't find any such reference on the Web...
There are some very neat designs that only tick the minute hand once per minute, as it's significantly more power-efficient to do so: you power the hand once per minute, as opposed to continuously driving it in small increments.
Quartz crystals are inherently digital: the crystal itself oscillates at a stable rate, 32,768 Hz in most timepieces.
That's counted by a digital accumulator, and a one-second advance occurs every 32,768 cycles.
The display of a quartz timepiece may be digital (as in a liquid crystal display, or any other discrete display), or analogue, as with a watch or clock with hands. But the underlying timekeeping is digital at its heart.
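As a toy illustration of that digital accumulation - the 32,768 Hz constant is real, but the code is only a sketch, not how any actual movement is implemented:

```python
CRYSTAL_HZ = 32_768  # nominal frequency of a watch crystal

def second_ticks(cycles):
    """Tally crystal oscillations; emit one tick per 32,768 of them,
    i.e. one tick per second of elapsed time."""
    count = 0
    for _ in range(cycles):
        count += 1
        if count == CRYSTAL_HZ:
            count = 0
            yield 1

# Ten seconds' worth of oscillations produce exactly ten one-second advances.
ticks = sum(second_ticks(10 * CRYSTAL_HZ))
```

However the display renders those ticks, hands or digits, the timebase itself is this count of discrete events.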
(One might make a similar argument for a pendulum- or spring-driven clock mechanism, with discrete periodic movements, or for an hourglass (discrete sand particles flowing through an orifice) though the more variable physical process tends to argue against this.)
There is very little that is "digital" about the crystal itself. Tuned oscillation is analog.
The accumulator attached to it is digital, but it only has to be that way because it's so hard to make gears that tiny. If quartz was slower you could use the signal to directly drive a gear and have nothing digital in the entire timepiece.
Digital in the sense that there's a set of discrete countable events.
Contrast with several other timekeeping systems: water clocks, hourglasses (sand glasses), sundials or astronomical observations (ultimately the definitive reference), in which periodic processes or entropic gradients serve as analogues to the passage of time.
Modern atomic clocks are also to my mind inherently digital or quantum, where counts of discrete events are tallied.
The discussion of DAC/ADC conversion process has some merits, but to me, thinking of ideas as interfaces, and as models of reality, "digital" fits far better than "analogue" in this case. Particularly as you'll find that same DAC/ADC process in what we manifestly call digital computers or digital memory/storage systems, where some other-than-discretely-varying signal is nonetheless abstracted to 1s and 0s.
As occurs when tallying quartz crystal oscillations in a timepiece.
The counterargument is that whilst the escapement of a spring-driven or pendulum clock is in a sense digital (as, for that matter, was a Jacquard loom's card-based governor), the driving mechanism (gravity weights, spring) isn't. But then, quartz-crystal clocks utilise a battery....
Again: ideas are interfaces, ways we get a grasp and shared understanding of reality, and are models of that reality. To that extent, all definitions are both arbitrary and flexible, but ultimately are governed by their utility.
Again: quartz crystal movements involve a manifestly digital accumulator which drives a display (character-based or analogue hands) based on accumulated ticks. The ticks are digital, a sequence of accumulated 1s and 0s, and are interpreted by logic rather than mechanism.
(Yes, logic itself can be mechanically implemented, but it remains a mechanical implementation of logic rather than a purely mechanical process.)
Escapement mechanisms lack such an accumulator; instead they use gears and cogs to match the movement's oscillation to the desired motion of the hands on the display. That entire process is analogous to the passage of time, and hence an analogue movement.
You said they "might" be in the same category and sounded unconvinced. Now that you're insisting on an expansive definition of digital, I'm addressing the problems it causes more directly.
> To that extent, all definitions are both arbitrary and flexible, but ultimately are governed by their utility.
And the utility of the term depends on clockwork clocks not being digital.
> (Yes, logic itself can be mechanically implemented, but it remains a mechanical implementation of logic rather than a purely mechanical process.)
Gears are a mechanical implementation of logic. If you make a distinction here, it puts clockwork clocks on the wrong side.
The logic here is just a divider. The same thing the gears already do in a clock.
Also you didn't really address one of my points, but I think it was an important one. If quartz crystals had a slower frequency, you could easily use them to directly push gears like a pendulum does. Since you're so focused on the crystal itself as being digital, would you call that a digital clock?
First: I appreciate your pressing your points, as it's helpful for me to clarify my own reasoning. This hinges on a few points, and again the sense of ideas as interfaces / models is key for me. I'm less a believer in truth than of pragmatism of ideas, something I've been delving into through history of philosophy for a number of years now. And it's the utility of dividing systems into analogue vs. digital which seems key.
And again, this allows for differences and distinctions, one of which is your view, which again I find less useful and clear. That is, crystal-driven timepieces strike me as more usefully considered as digital rather than analogue.
On the distinction between mechanically-implemented and electronically-implemented logic: the difference is largely one of complexity, scale, and speed, but in general once you've ventured into the electronic domain, it becomes infeasible to provide comparable utility or function by mechanical means. We've tried mechanical digital calculators and computers. We've abandoned them on account of cost, slowness, unreliability, size, and power consumption: mechanical systems cost too much, ran too slowly, broke down too often, and simply could not scale the way semiconductor-based systems could.
To the extent that a mechanical function (say, a multiplier gear) represents a pure logical function, we can abstract that functioning from the gear itself, much as writing conveys meaning independent of its medium.
It's less possible to divorce the mechanical functioning of such a system from its inherent parts: their materials, mass, size, and the like. If we look at purely mechanical systems, they're very tightly linked to the inherent material constituents in a way that pure digital logic isn't. Let me give two examples.
The printing press (as a mechanical system; I'm not arguing its digital attributes, if any) saw profound development over the course of the 19th century. At the beginning of that century it was little evolved from Gutenberg's early adapted wine press, and with skilled operators might produce ~120 impressions an hour, a sheet every 30 seconds. Converting from a wooden to a cast-iron frame roughly doubled that. Further developments (electrical power, cylindrical plates, web-based paper feed) increased that by the end of the century to one million impressions per hour, four orders of magnitude faster. That is, the function and capability of the machine was intrinsically bounded by the materials (and power sources, paper characteristics, etc.) from which it was constructed. Taking this further, modern Web servers / application servers are capable of millions of requests per second, another three to four orders of magnitude faster.
(In general, even one or two orders of magnitude is a fundamental revolution in capabilities: walking (5 km/h), bicycle (32 km/h), automobile (130 km/h), and jet airliner (1,000 km/h) are each separated by a factor of roughly four to eight, somewhat under one order of magnitude per step.)
Of digital systems, the unrelatedness to fundamental materials is probably best exemplified by virtualisation. That is, we see tremendous adoption of entirely virtualised systems in which the basic logic functions occur entirely independently of the underlying hardware implementation. A digital watch can fundamentally be implemented entirely in software, in ways that I'll venture a hardware watch cannot be. Though here again we trip up on my ideas / language / models distinction: is modelling a hardware system in software the same as emulation? I'm ... going to stick with "no", Because Reasons, though I'll acknowledge the question; a large part of the answer is that a model is a simplification of the reference system, whilst emulation is complete functional equivalence.
Similarly, we can run identical software on entirely different CPU architectures, command sets, and semiconductor substrates with few if any practical considerations. Operation is divorced from hardware.
Keep in mind that the function of a transistor itself is equivalent to that of a gate or valve: a small controlling input governs a large controlled output, and indeed these are often called gates or valves in the field / historically. There have been mechanical and hydraulic computers of limited capabilities. But again, moving from mechanical to electronic components transforms the speed, scale, and reliability achievable by orders of magnitude.
I'm not arguing that gears aren't capable of logic. I am arguing that gear-based logics severely restrict the capabilities and increase the physicality and physical constraints in ways which fundamentally differ from those of purely electronic systems, and are best considered "analogue".
There's also the point that gears ... rotate continuously, rather than discretely. We can modify that (e.g., with cams or similar designs), but there's still that continuous rotational motion at core.
> I'm not arguing that gears aren't capable of logic. I am arguing that gear-based logics severely restrict the capabilities and increase the physicality and physical constraints in ways which fundamentally differ from those of purely electronic systems, and are best considered "analogue".
But the only electronic part you actually need is a toggle (well, a handful of them in series). That doesn't increase your capabilities beyond gears, except that you can pick up a faster signal (and then the only thing you can do with it is slow it back down).
> A digital watch can fundamentally be implemented entirely in software, in ways that I'll venture a hardware watch cannot be.
If you say hardware can't truly be represented, then neither can the quartz crystal, and I would argue that output hardware can't truly be represented either. So the two things you're doing entirely in software are the conversion from 32kHz down to 1Hz, and the conversion from 1Hz to a series of digits.
But the type of watch we're talking about doesn't have a digits display so discard that.
Now the only thing you're doing in software is dividing by 2 a few times, maybe also dividing by 60. That's not beyond the capabilities of a simple mechanical system. There's no need to have "software" involved either. "Software" is orders of magnitude more complex than dividing by 2.
Or, another way to look at things: A watch just like a quartz watch can be implemented completely in hardware, if you put in a """crystal""" that wiggles at 8Hz instead of 32768Hz.
I see your point about speed being a meaningful difference in many areas, but in this case the only thing done at high speed is slow it back down.
> There's also the point that gears ... rotate continuously, rather than discretely. We can modify that (e.g., with cams or similar designs), but there's still that continous rotational motion at core.
The gears right at the core of a mechanical clock are moving in pulses. It's not very far off from what the transistors or vacuum tubes would be doing.
I guess I would also appreciate the conversation if you had ever addressed my argument about attaching gears to a slower crystal, especially since that was in my very first very short comment. As is, I'm kind of annoyed.
Overall, I think the things you're saying about the expressiveness and power of digital logic are valid, but I don't think they really apply here when the logic is so minimal and could in theory be removed entirely.
I didn't touch that one for a few reasons, mostly addressed to exhaustion above, viz: mechanical timepieces tend to be based on a regulated entropic source (watchspring, weights). It's not clear to me that a slow resonance oscillator would effectively couple to gearing, and I'm very far from enough of a watch/clock nerd to think of how this might be done or whether there are any current or historical examples of same. Basically: if you had a pure resonator, then a mechanical coupling seems to me very likely to degrade its regularity beyond use in timekeeping. An escapement design is preferable, and again, that's an inherently analogue mechanism.
I can find no examples of an acoustically-based mechanical timekeeping mechanism. If you're aware of any I'd be interested in seeing them.
I can also remember when mechanical stopwatches were still A Thing, used in sport timing when I was a wee'un. I suspect these were the highest-precision timepieces reasonably mass-produced (and likely expensive nonetheless), and they could reach 1/10th-second accuracy. Far cheaper digital stopwatches became available shortly after, with 1/100th-second accuracy. They could easily have recorded to greater precision, but the limits of human perception and reaction would have made that redundant.
Current prices seem to range from ~$25 to ~$150 for mechanical stopwatches, versus ~$2 to ~$20 for digital electronic stopwatches, going off Amazon.
Even now, timed events are generally only measured and judged to 1/100th of a second, given that unavoidable variances (e.g., in track or lane length for track or swimming events, or course lengths for others) would introduce variability not strictly addressed by an athlete's capability.
The quartz crystal itself is an analog component which resonates at some specific frequency. The crystal is placed within a feedback circuit to create a stable, sinusoidal oscillation; the analog sinusoid is then converted into digital pulses to be counted.
It's the same principle as a "pendulum- or spring-driven clock mechanism, with discrete periodic movements"; just on a microscopic scale -- you're taking an analog physical system which naturally resonates at some specific frequency, and then converting the continuous motion of the system into discrete pulses.
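That conversion step is essentially a comparator: square up the sinusoid and count rising edges. A rough Python sketch (the sampling approach and parameters are illustrative, not how a real oscillator circuit works):

```python
import math

def count_rising_edges(freq_hz, duration_s, sample_rate=1_000_000):
    """Sample a sinusoid and count low-to-high zero crossings, as a
    comparator squaring up an oscillator's output would."""
    edges = 0
    prev_high = False
    for i in range(int(duration_s * sample_rate)):
        high = math.sin(2 * math.pi * freq_hz * i / sample_rate) > 0.0
        if high and not prev_high:
            edges += 1
        prev_high = high
    return edges

# One second of a 100 Hz sinusoid yields 100 countable pulses; the same
# principle applies at 32,768 Hz, which is what the divider logic consumes.
```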
While automatic quartz watches do exist[1], “generally, some form of digital logic counts the cycles of this signal”[2], to the point of becoming a synonym.
Almost every clock based on a mechanical escapement stops the hands on each beat; that is where the ticking noise of a classic mechanical movement comes from. For quartz clocks, smoothly sweeping hands are a premium feature, and I'm not sure whether even those are truly continuous motion or just higher-frequency stepping.
The output of the quartz oscillator is a high frequency electrical signal which is read by a digital frequency divider then fed back into a motor.
> The data line output from such a quartz resonator goes high and low 32768 times a second. This is fed into a flip-flop (which is essentially two transistors with a bit of cross-connection) which changes from low to high, or vice versa, whenever the line from the crystal goes from high to low. The output from that is fed into a second flip-flop, and so on through a chain of 15 flip-flops, each of which acts as an effective power of 2 frequency divider by dividing the frequency of the input signal by 2. The result is a 15-bit binary digital counter driven by the frequency that will overflow once per second, creating a digital pulse once per second.
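The ripple of falling edges through that chain can be modelled directly; a toy Python sketch of the 15-stage divider described above:

```python
def divider_chain(input_edges, stages=15):
    """Cascade of divide-by-two flip-flops: each stage toggles on a
    falling edge from the previous stage, so the final stage completes
    one full cycle (one output pulse) per 2**stages input edges."""
    state = [False] * stages
    pulses = 0
    for _ in range(input_edges):
        i = 0
        while i < stages:
            state[i] = not state[i]
            if state[i]:        # stage rose 0 -> 1: ripple stops here
                break
            i += 1              # stage fell 1 -> 0: clocks the next stage
        else:
            pulses += 1         # final stage fell: one second has elapsed
    return pulses

# With 15 stages, 32,768 input edges produce exactly one output pulse.
```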
> It would be weird if we rounded for years, months and days, that's for sure. I think most people think of those scales as intervals. In other words, July is a period of time, with a start and an end. So are years, centuries, seasons. We are inside of it or outside.
I feel like my sense of time is different from the author's. While it can be useful to round the current hour/minute on some occasions, the information about which exact segment of the day/hour you're in can also be very useful. I can certainly tell that I ask the question of "when exactly is it going to be 12:00?" far more often than "how many seconds have statistically likely elapsed in the current minute?"
The biggest issue for me is that the precise moment of when one minute/hour transitions into the next is important for people. Like, when coordinating an event or meeting, would you prefer it if your clock indicated the precise moment when 12:59:59 becomes 13:00:00 and told you to start the meeting, or would it be better if the clock instead told you that it was "13ish" and you'd have to wait out ~30 seconds by counting in your head?
This also causes a jarring discontinuity - now clocks with a ticking hour hand appear to run 30 seconds later than clocks without, turning on a digital clock's setting to show seconds offsets it, and so on. Some people would celebrate New Year's or other occasions that happen at a specific time 30 seconds early because they no longer have a strong reference point.
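The 30-second offset falls straight out of the arithmetic. A minimal sketch contrasting a truncating display with a rounding one (function names are mine, purely for illustration):

```python
def truncated(h, m, s):
    """Conventional display: drop the seconds entirely."""
    return f"{h:02d}:{m:02d}"

def rounded(h, m, s):
    """Round to the nearest minute, carrying into hours (and wrapping
    past midnight) as needed."""
    nearest_min = (h * 3600 + m * 60 + s + 30) // 60  # round half up
    return f"{(nearest_min // 60) % 24:02d}:{nearest_min % 60:02d}"

# From hh:mm:30 onward the two displays disagree by a whole minute:
# at 12:59:40, truncated shows "12:59" while rounded already shows "13:00".
```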
> It's controversial because of how he makes fun about the people using and promoting other languages, like Rust. I thought it was hilarious.
But doesn't that completely ruin the point of the post? I agree with you that something feeling 'fun' is personal, and that the criteria for what constitutes fun are up to the user. The author doesn't agree with that: either you adopt that view, or you promote the Right Way of having fun, and those snarky remarks made me put this blog into the second category. When you're that invested in your argument, even a fundamentally harmless post about having fun takes on the subtext of a language-wars hit piece.
If anything, they seem more like desperate cheap shots than arguments. Other people, the NSA, etc. dislike unsafe-by-default code? Well, they're just authoritarian anti-fun ideologues! Rust users bring up some of the same criticisms, as I recall, in the last paragraph? Well... uh... that borrow checker, am I right?
I think he was just making fun of people who inflate the importance of their language and their ways too much, and it should be taken in that spirit. There are good reasons why all these new languages like Rust and Zig popped up in the last 10 years or so and started getting traction: obviously a lot of people were unhappy having C/C++ as their only choice for performance-focused or system-level programming. At least in the games business, C++ still seems to be doing well.
But to get back to the point of the article, for fun solo projects, when the 'mood for coding' comes over, I may be biased, but I think C++ is still the best. It's like when building a prototype. You just want to test your idea and see how it looks and play with it and just worry about bugs and program correctness later. While coding it in Rust you'd have to spend extra time determining the correct memory ownership relations and that can break the flow.
After programming for 20 years, it doesn't come nearly as often as it used to, but I still get that feeling from time to time.
The reason this is a difficult problem is that physically emulating the flicker requires emulating the beam and phosphor decay, which necessitates a far higher refresh rate than the input refresh rate alone. You'd need cutting-edge, extremely high refresh rate monitors. The best monitor I found runs at 500 Hz, but pushing the limits like that usually means concessions in other departments. Maybe you could do it with that one.
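To put rough numbers on it: with a hypothetical 500 Hz panel and a 60 Hz source, you only get about 8 subframes per input frame in which to paint the phosphor's decay curve (every number here is illustrative, not a measured CRT characteristic):

```python
import math

def subframe_intensities(output_hz=500, input_hz=60, decay_ms=2.0):
    """Approximate phosphor brightness on each high-rate subframe
    after the simulated beam pass at the start of an input frame."""
    subframes = int(output_hz / input_hz)   # ~8 subframes at 500/60
    dt_ms = 1000.0 / output_hz              # 2 ms between subframes
    return [math.exp(-(i * dt_ms) / decay_ms) for i in range(subframes)]

levels = subframe_intensities()
# levels[0] is full brightness; each later subframe is dimmer,
# recreating the dark interval a real CRT has between refreshes.
```

With only ~8 brightness steps per frame, the decay curve is coarsely quantised, which is why even a 500 Hz panel is marginal for this.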
If Recall is all that needs to be fixed, it can be disabled pretty easily on systems that have it (I presume most don't, and it doesn't even exist on my current Windows PC).
LTSB/LTSC is very much viable for businesses that can obtain it legitimately through bulk licensing as a build of Windows Enterprise. The complicated process above is intended for individuals, who can't buy it directly. Even then, it's oriented more towards HN users and the like, not the average person.
Looking at all the articles about this issue, this seems to be more about a bug in the Windows cleanup tool that lets the user delete old update files. Maybe the tool isn't working properly, or it's flagging update files as deletable when they're not supposed to be. Admin users can still delete whatever they want manually, unless the system or something else is currently accessing the file. The OS sometimes protects its system files by having them be owned by the SYSTEM user, but an admin can take ownership of them to delete them. This hasn't changed and I can't see it changing.
I don't think it's that bad; it certainly doesn't take hours, at least on Windows. It depends on how high a standard you have, but since other people have long been aware of the issue, third-party scripts and software exist to automate a lot of the tedium and work pretty well; anything else can be smoothed out by hand.
Freeing up the disk space in a VM is definitely an hours-long struggle the first time - you have to count the research time as well. I did it last year. Don't forget the privacy tweaks and the research for those. Sure, you could script most of it, like anywhere.
Any such tools that are worth their salt have their code posted publicly, you can check it yourself. Most of it isn't some groundbreaking stuff, a lot can be stripped out by just changing registry values and other deeply-ingrained settings that would take a long time to find and edit by hand. I also don't see how "trust into Microsoft" factors in here - it's not about trust, we know there's telemetry that can't be disabled through standard settings, and I know there are features I'd like to uproot entirely (like deep OneDrive integration).
> Don't expect people to profoundly connect to music that is nothing more than a collection of regurgitated ideas.
Music is one of the worst examples to pick for claiming that people don't regurgitate in art. Everything in music builds on what came before, and a lot of music (especially music seen as lower quality) is described as just a collection of cliches. The reason "sad music" sounds sad isn't that there's something about the instrument choices, key, chords, melody, tempo, etc. that is measurably, intrinsically "sad" - it's that these are stereotypes the creator has combined to invoke a certain association in listeners. If you were extra cynical, you could describe the entire musical field as people largely conditioning themselves over generations to like certain qualities of sound and hate others.
And that applies to almost all art. Basically everything people make is based on what came before it - and it's frustrating to encounter hubris that assumes there's some magical creative process going on inside human brains that will never ever be even approximated by any other means.
> it's frustrating to encounter hubris that assumes there's some magical creative process going on inside human brains that will never ever be even approximated by any other means
Maybe we will, maybe we won't. In the same way, maybe we will be able to create life artificially, maybe not. The AI I am critiquing today is to human creativity what a Tamagotchi is to human life. Sure, you can get attached to it, think it's expressing real emotions, and wonder why other people are showing "hubris" by not realizing how wonderful it is.
I think a reasonable definition of counterculture is moving against the dominant culture, rather than just doing something uncommon. And by that definition, this absolutely can't be called counterculture.
Look at a country as religious as the US - even if it's seemingly getting slightly less religious than before, the dominant culture is absolutely run by religion. A political candidate, let alone a presidential candidate, who's openly non-religious would almost always be seen as an abhorrent non-starter. Public servants swear on Bibles, and schoolchildren recite a mantra to prove their belief in a "nation under God". Basically the entire ideological landscape is run by religion, from the issues people talk about to the stances they take on them. You don't even have to be religious to be influenced by all this. It doesn't really matter whether a believer is committed enough to go to a church; all of them are contributing at least a little bit.
And then, just as my own unrelated opinion... the thing we associate with counterculture was always a radical disregard for arbitrary standards, an ability to pinpoint and reject the things in the world that are unjustly enforced by society. In that way, doing anything religious can never be counterculture - how can you envision a rebel with a capacity to question everything wholeheartedly believing a system of preconceived dogma?
This misses the bigger point of my argument. The whole reason why abortion is seen as this religious issue is the system of religious thought that's ingrained in American society. It was created as a wedge issue for religious voters, and this system grew so much that nowadays it's seen as this intrinsically religion-dividing issue, as if it's just natural for it to be this way. The reason why you (and possibly the mayor) see opening a clinic as this big act of secularism is because the underlying religion is deeply integrated into their community.
And the second half of my argument still stands. Someday, religion may become a minority ideology. But it will never be a rebellious counterculture.
No, the overwhelming majority of the city loves aborting children. Being pro-life is not a majority opinion whatsoever. Maybe you live in the rural South; I live in a Democrat-led blue city.