Hacker News
Hacking the Timex m851 (cmpxchg8b.com)
313 points by taviso on Sept 3, 2023 | 82 comments



Part of the magic is how bonkers frugal the Epson SoC (PN S1C88349) is: running in its low-power mode at 32 kHz, it consumes an astoundingly low 9 microamps. [1] How much do you think you could get done with a cycle budget of 32k per second?

Alongside that, you get 48K ROM, 2K RAM, 3 timers, a UART, and an A/D converter. [1] https://global.epson.com/products_and_drivers/semicon/pdf/id...
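
For a sense of scale, here's a back-of-the-envelope sketch in plain C. The per-task cycle costs are made-up guesses for illustration, not figures from the Epson datasheet:

    #include <stdio.h>

    /* Rough cycle budget for a 32 kHz watch MCU. The task costs below
       are illustrative guesses, not S1C88349 measurements. */
    int main(void) {
        const long cycles_per_second = 32768;  /* 32.768 kHz crystal */
        const long timekeeping = 200;   /* bump seconds, handle rollover */
        const long lcd_update  = 1500;  /* redraw the changed LCD segments */
        const long button_poll = 100;   /* debounce and scan the buttons */
        long spent = timekeeping + lcd_update + button_poll;

        printf("cycles spent per second: %ld (%.1f%% of budget)\n",
               spent, 100.0 * spent / cycles_per_second);
        printf("cycles left for sleeping: %ld\n", cycles_per_second - spent);
        return 0;
    }

Even with generous guesses, most of each second is left over for sleeping, which is presumably how figures like 9 microamps become possible.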


>Part of the magic is how bonkers frugal the Epson SoC (PN S1C88349)

Would it blow your mind that such ultra-frugal 8-bit parts have been available for about 30 years now?

Those Seiko-Epson chips, along with the EM (Swatch) and OKI (Casio) chips, were used in all kinds of timekeeping devices, calculators, thermometers, and other cheap low-power widgets with segment LCD displays that need to run for years on a single button cell.

I have a calculator and a digital fever thermometer (the stick type, for the armpit or anus) that are over 15 years old, both using one of those Seiko-Epson chips, and they're still running on the same 1.5V button cells they came with, which is mind-blowing when you factor in the self-discharge of the cells over time.


Those cells should be more or less fully self-discharged right about now. :)

But yes, incredible. These are products made with a lot of care, unlike modern counterparts that are just so lax about these things.


> [...] unlike modern counterparts [...]

What makes you think so? You can still buy calculators that last a long time, and 'solar' powered calculators that work in the pretty dim light inside a classroom.


I didn't mean to compare it with calculators that are entirely powered by light. More with modern watches, smartwatches, and whatever modern devices we have.


You are right about them. But battery-hungry monsters ain't new. Have a look at the Sega Game Gear.

The Game Gear had much higher specs than Nintendo's Game Boy. It had a proper colour display, instead of a pea soup screen.

> The internal reaction to the Game Boy at Nintendo was initially very poor, earning it the derogatory nickname "DameGame" from Nintendo employees, in which dame (だめ) means "hopeless" or "useless".[19][20]

(An idiomatic translation might be, 'Lame Boy'.)

As you might be aware, the supposed Lame Boy won the market. Partially because the Game Gear ate through batteries like crazy, while the Game Boy subsisted on a more measured diet.


There are staggering, mind-blowing, astronomical levels of bloat and complexity in modern systems, be it SoCs, operating systems, or apps.

Many mobile apps nowadays are larger in binary size than whole operating systems plus applications that did extremely useful work.

It's staggering and scary. Where is this all going? Will someone or something put a damper on it? Probably not.


> Will someone or something put a damper on it? Probably not.

Oh yes.

Global warming will.

The fossil-fuels companies have been very effective at spreading propaganda, so some readers are probably laughing and thinking "ha! Idiot! It's not real!"

If so, I am sorry for you: you are flat-earthers. You have been eaten by the brain worms.

We are heading for 80-90% of the land on Earth being uninhabitable by humans within 2-3 decades, with a small remnant population at the poles. All silicon chips, RAM, storage, etc. are made in tropical and subtropical regions, and they will all be gone.

If we are lucky we'll be back to handmade computers with individually soldered components scavenged from dead consumer electronics.

The CollapseOS person is probably right: http://collapseos.org/

Here is a quick guide to the science for those with the brain worms: https://medium.com/@samyoureyes/the-busy-workers-handbook-to...

It is the length of a short book, but that's because there is a lot to get through for those who've been trained to bury their heads in the sand because they listen to billionaires' BS.


Only if there is an economic incentive for it. Currently, there is none. (The cynic in me would say the reverse is true, at least in the Apple ecosystem, as more resource-hungry apps push users to upgrade their devices more often.)

Moreover, I'm not sure how much you can gain, actually. You can probably get some reduction in app size as far as the code segment is concerned, but usually most of the space is taken up by assets, especially in games, and they're already well compressed. So basically we could optimize the last 20%, which is the hardest to get right.


If we really needed to, we could compress assets a lot, eg with procedural generation. (Including the neural network kind.) But that would be a lot of hassle, and would probably trade run time for space.

As for code: I think shaving off bytes isn't all that important, especially compared to assets; but simplifying logic can give various benefits. However, that also takes time and is probably not worth it for many applications.

A distinction I like to make is between bloat that worsens throughput and bloat that worsens latency.

Throughput still gets better almost for free over time by general hardware improvements. Latency still requires attention.

An example: an early word processor (I think it was part of GEOS for the Commodore 64) paid a lot of attention to keeping the latency between you pressing a key and a letter appearing on the screen low. By today's standards, the Commodore 64 was slow as molasses, so when you typed quickly, GEOS eg temporarily disabled wrapping words at the end of the line and just jumped to the next line in the middle of a word.

After you had finished your burst of typing, GEOS would go back and calculate proper line breaks.
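
A minimal sketch of that deferral idea (my reconstruction in C with hypothetical names, not actual GEOS code): echo the keystroke immediately, and only do the expensive re-wrap once the typing burst has ended.

    #include <stdbool.h>
    #include <stdio.h>

    /* Latency-first text entry: echo now, re-wrap later.
       Names and structure are hypothetical, not from GEOS. */
    static bool rewrap_pending = false;

    static void draw_char(char c)           { putchar(c); }           /* cheap */
    static void recompute_line_breaks(void) { puts("\n[re-wrapped]"); } /* slow */

    void on_keypress(char c) {
        draw_char(c);           /* letter appears immediately, even mid-word */
        rewrap_pending = true;  /* the expensive wrap is merely scheduled */
    }

    void on_idle(void) {        /* called once input has paused */
        if (rewrap_pending) {
            recompute_line_breaks();
            rewrap_pending = false;
        }
    }

    int main(void) {
        const char *burst = "hello world";
        for (const char *p = burst; *p; p++) on_keypress(*p);
        on_idle();  /* proper line breaks happen after the burst */
        return 0;
    }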

Compare that with textfields in modern websites, which sometimes have a noticeable delay before a letter shows up.

GEOS's code for such latency reduction was probably incredibly convoluted and messy (I am just guessing here) and made adding new features a nightmare, but it was worth it back then. The modern bloated website can probably sustain a higher throughput, even with no one taking any care to optimize that. But the latency is horrible.


I wrote the PC equivalent for a similar product in the 1980s and wrote some different GEOS code. It was less convoluted than you might imagine. But it was very unforgiving in terms of de-referencing null pointers and memory leaks. Instead of using a little too much memory or causing a process to crash, it would often cause the system to hang. Windows 3 (and to a lesser degree Windows 95) had similar problems.


In modern times, the argument is that the economic incentive is to optimize for the developer, not the hardware. Python, Node, etc. over C.


And most of the time that's the right trade-off.

Not all the time, though. Eg consider a video game: the image frames have to be pumped out on a tight time budget every couple of milliseconds, but the logic to decide whether you have finished your quest can afford to run for a few seconds in the background.

Even though the latter is much simpler and could probably be coded up to run on even the Timex m851 in fractions of a second.
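
A toy sketch of that split (C, entirely hypothetical structure): give rendering a hard per-frame deadline, and let the low-priority quest logic nibble at whatever slack is left.

    #include <stdio.h>
    #include <time.h>

    /* Toy frame loop: rendering is latency-critical; quest checks only
       run in leftover slack. Hypothetical names, not from any engine. */
    static double now_ms(void) {
        return (double)clock() * 1000.0 / CLOCKS_PER_SEC;
    }

    static void render_frame(void)     { /* must fit in the frame budget */ }
    static int  quest_check_step(void) { return 1; /* 1 = more work left */ }

    int main(void) {
        const double frame_budget_ms = 16.0;  /* roughly 60 fps */
        for (int frame = 0; frame < 3; frame++) {
            double start = now_ms();
            render_frame();  /* latency-critical work always goes first */
            /* spend at most ~80% of the budget on background logic */
            while (now_ms() - start < frame_budget_ms * 0.8) {
                if (!quest_check_step()) break;
            }
            printf("frame %d done in %.2f ms\n", frame, now_ms() - start);
        }
        return 0;
    }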


I totally agree: there's no economic incentive, and sadly the "industry" will make sure complexity/size/scale keeps on growing. How much more performance and capacity do we need in CPU/RAM/storage/storage bandwidth for regular apps, which I assume make up the vast majority of computing at home and in the office?

Sure, games are ever-demanding and will continue going that way probably for a while longer especially on the pure graphics side.

Many apps do contain truckloads of libraries and dependencies, which have their own dependencies, and all of that alone accounts for massive code size.


One of my hopes is that LLM technology can eventually be used to fight technical debt and help us clean up the software stack.


It'll probably just allow us to get away with even more technical debt.


That's also what I'm afraid of...


There's an ultra-low power mode in newer TMS430s, and you can use an external ultra-low-power RTC.

From memory, it's under 100nA on paper, including things like the power switch leakage. Crazy stuff.


Do you mean the TI MSP430?



Ugh yes! Don't know why I always get that wrong!


The [E0C6S46](https://download.epson-europe.com/pub/electronics-de/asmic/4...) still powers the 1st and 2nd gen Tamagotchi!

4-bit (!) 32KHz MCU with 6,144 words of 12-bit (‼) ROM, 640 words of internal 4-bit RAM, and a 160-word 4-bit frame buffer for the integrated LCD driver (enough for double-buffering)!

The thing is a beauty! I wrote a TypeScript emulator for it a year ago or so, though for whatever reason I haven't pushed it to GH yet (but I will if anyone's interested!). It can run unmodified Tamagotchi firmware.


Wow, I'd love to see the emulator


I found "ROM" to be confusing because I always thought that meant you couldn't change it, but they mention swapping modifiable data in and out of there. It's actually EEPROM and I guess that is a little similar to Flash memory.


Usually NOR flash. SSDs are usually NAND flash.


In the Arthur and DW ActiMates (https://news.ycombinator.com/item?id=36656190), my teammate, Craig Ranta, implemented the Real Time Clock using a Microchip 12C508 or 12C509. It ran at 32.768 kHz.

Using the microcontroller was advantageous because we could implement a custom interface (data and wake up alarm signaling) to the main microcontroller, the functionality was exactly what we wanted, super low power and very inexpensive.

He had to deal with complications like maintaining the time, running the interface, maintaining the alarm, all at the same time.


Very cool.

If you're interested in something with a bit more features, check out the Bangle.js[0]. The benefits are you have Bluetooth, GPS, accelerometer, vibrator, and a colour screen. The main downside is that the battery lasts considerably less than 3 years.

[0] https://banglejs.com/


For anyone wondering, the battery life on the Bangle is claimed to be 4 weeks.

> With a sunlight readable always-on screen, 4 week battery life, complete flexibility, and complete control of your data, Bangle.js 2 is a refreshing break from expensive smart watches.


Can confirm from personal experience: in standby mode with infrequent app usage it was clocking a good 4 weeks.


I had this watch about 15 years ago. Among other things I could track:

0. Where my next class was going to be

1. The bus schedule for the stops I used

2. The phase of the moon

Checking the bus schedule on your watch before smartphones looked like a James Bond move :)

I wish they still made watches like this; at this power consumption you could keep your watch charged from solar power indefinitely.


>I had this watch about 15 years ago.[...]Checking the bus schedule on your watch before smartphones looked like a James Bond move :)

To be fair, the first iPhone launched 16 years ago, and smartphones that could store a bus schedule were around even before the first iPhone. I would store the bus schedule as an SMS draft on my Siemens phone over 20 years ago.


The James-Bond-like part is doing it from your watch specifically. (Consider the various wrist-gadgets found in the movies over the years.)


Casio Databank watches had that functionality in the early 90s I think (without the USB/IR PC link). Not that James Bond for the year 2000.


My Treo could hold quite a few bus schedules (the Caltrain schedule, too!). And it had a very crappy web browser (by today's standards) that worked "okay" for mostly-text pages. Jeff Hawkins even used it to buy a book from Amazon as a test. I think my point is pre-iPhone phones weren't completely crappy. I still sort of miss the Danger Hiptops.


Wow, I can't believe the iPhone is 16 years old. I also had this watch and it came well before it. Maybe 2002?


Well, maybe a more modern approach to a hackable watch is

https://github.com/sharandac/My-TTGO-Watch

I stumbled across this while researching how to build my own DIY iPod Nano 7G with the LILYGO T5 E-Paper dev board

Product: https://www.aliexpress.com/item/1005002474854718.html

Code-Repo: https://github.com/Xinyuan-LilyGO/LilyGo-T5-Epaper-Series

This would be an amazing iPod Nano 7G replacement, if I had more time and more skill in Arduino stuff ;)


I have a couple of these things. Battery life is pretty crappy. The stock code barely worked and had about 12 hours of battery life (if you didn't do anything with it.) I mean... give credit where credit is due though... it's an amazing buy if you're hip to debugging other people's code. I eventually filled in some of the bits that weren't finished and eventually figured out settings so it's not completely sucking down power, so it would last about 6 hours with "moderate" use.

I would be surprised if the current code isn't MUCH better, but its power budget is going to be much closer to an Apple or Samsung smartwatch than to an ultra-low-power m851.

If you're hip to experimentation (like I am) it's a fun little project and worth the money. If you're looking for something to compete with an iWatch, you're going to be disappointed.


>a single battery can last 3 years! This is a big selling point for me, I don’t think they’re making consumer watches like this any more.

Well, Casio surely does. Search for "Casio MIP display".


More on the Timex Datalink series

https://en.m.wikipedia.org/wiki/Timex_Datalink


Ironically, I find their older models with optical receivers a lot more high-tech than the later USB models.


I guess it's one-way? But yes, very cool. I can imagine using my phone to sync useful stuff directly to the watch without a cable.


Why could it only sync with a CRT and not an LCD? Could this be replicated with modern IPS LCDs?


It's the scanning beam, which LCDs simply don't have. But fear not, a single LED will also work, and this was recently reverse engineered: https://lemmy.sdf.org/post/691827
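
The linked write-up has the real timing and framing; purely as a toy illustration of the principle (the slot scheme and stubbed functions are all invented, not the actual Datalink protocol), here's one byte turned into timed light pulses:

    #include <stdio.h>

    /* Toy on-off keying of one byte as light pulses, standing in for the
       CRT scan-line flashes. Invented slot scheme, not Datalink's. */
    static void pulse(void) { printf("LED flash\n"); }

    static void send_bit(int bit) {
        if (bit) pulse();              /* '1' = flash in this time slot */
        printf("  (wait one slot)\n"); /* '0' = a dark slot */
    }

    static void send_byte(unsigned char b) {
        for (int i = 7; i >= 0; i--)
            send_bit((b >> i) & 1);
    }

    int main(void) {
        send_byte('T');  /* the watch's sensor decodes slots back to bits */
        return 0;
    }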


Also, CRT monitors were ubiquitous. Consumer LCD monitors didn't start appearing until the late 1990s.

It was magical watching a Timex DataLink watch receive data (at a really slow data rate, though): a short moment in time before USB and WiFi appeared.


I really wish more watches/wearables were made with these 8-bit low-power chips and had exposed data pins to make them hackable. 3 years of battery life is where it's at for a watch. A watch I have to think about charging is a no-go for me.


I keep buying cheap mechanical (wind up) watches and Swatches off eBay. My current Swatch has been telling time consistently for about a year on the same battery. The problem with cheap mechanical watches is you can over-wind them if you're not careful. I keep killing them every couple of years. But hey, they're cheap.

The cool thing about both these watches is you don't have to recharge them every day (though you do have to wind up the wind-up watch.)


This is a reminder that we can, in fact, have Nice Things – if only there were a market for Nice Things.

I'd love to have a decently integrated watch with low battery usage (and generally low stakes for wearing/replacing) which I can also agree with aesthetically. Bums me out that Sensorwatch [1] is the best we can do.

[1] https://www.sensorwatch.net

edit: Sensorwatch is amazing, but not an ecosystem (yet).


That looks promising; what sorts of things can you do with the board? What would one like to do with the board?


This does seem nice and even simpler than TI's Ez430-chronos [0].

That said, 3 years on a single battery doesn't seem great; of course, I suppose it depends on what apps you use and load. And that it's programmable is cool.

[0] https://www.sparkfun.com/products/retired/10019


Battery life is probably relative. I lost one of my Casio F91s in 2018. Found it last week while packing for a canoe trip; the time and date were within seconds of the Casio F91 on my wrist.


Can you still buy the chronos? I bought one a while back and it disappeared in a move. Would love to monkey with it again.

[Edit: Oh. Okay. I see that you included a reference to the "RETIRED" product listing. I mean, it wouldn't be a TI product unless TI EoL'd it just as it was getting market traction.]


I got one of these for my birthday in 2003, and I'm still using it. The Data Link USB "only" has a battery life of about a year. The 3-year life Data Link would be the older model, which optically received data from a PC by watching light pulses from a CRT.

I don't know why he's complaining about the Windows 98/XP PIM software. It worked fine, and it's open source <https://sourceforge.net/p/datalinkusb/code/HEAD/tree/USBPIM/>. It does not work on Windows 10 and later, however. I looked into it years ago, and it seemed related to the GUI code (I think it used MFC?) and not anything involving watch access. Probably fairly easy for someone familiar with Windows dev to fix, especially with the source. Edit: Wait, I misread. He was complaining about the SDK wizard, not the PIM. Yeah, the wizard wasn't great. Also, it was slow since every file assembled had several megabytes of headers included.

Having C code running on the watch is nuts. The builtin ROM software was written in a high-level assembler (probably for size/CPU/power-consumption reasons), and the user SDK used a regular assembler (I got the impression that back then Epson wasn't giving away the better development software, so the best that could be included for free with the SDK was the regular assembler). The OS's overlay system and semi-ad-hoc calling convention aren't C friendly. Often, values were passed in whatever registers were convenient for the callee, with macros to help hide this when possible. EEPROM access functions took parameters from global RAM variables.
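
To illustrate that last point, a hedged sketch (hypothetical names, not the actual Timex/Epson API) of the "parameters in global RAM" convention and why it clashes with C's notion of a function call:

    #include <stdio.h>

    /* Hypothetical sketch: the callee takes its arguments from globals
       instead of a parameter list, as the builtin EEPROM routines did. */
    static unsigned int  g_ee_addr;
    static unsigned char g_ee_len;
    static unsigned char g_ee_buf[16];

    static void eeprom_read(void) {   /* no parameters, no return value */
        for (unsigned char i = 0; i < g_ee_len; i++)
            g_ee_buf[i] = (unsigned char)(g_ee_addr + i);  /* fake data */
    }

    int main(void) {
        g_ee_addr = 0x0100;   /* "pass" the arguments by assignment... */
        g_ee_len  = 4;
        eeprom_read();        /* ...then call with an empty signature */
        for (int i = 0; i < 4; i++) printf("%02x ", g_ee_buf[i]);
        printf("\n");
        return 0;
    }

A C compiler can't see the data flow through those globals, so it can't type-check the arguments, and nothing written this way is reentrant.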

The SDK documentation was great. 10/10. You could not reasonably ask for more. It had many examples, every single header file used by Timex, which documented every single function and variable (even ones internal to the builtin software), and PDFs describing every bit of data going in and out of the watch, from the communication protocol to the file formats used by every builtin app. The OS used by the watch is impressively clean and easy to use for a one-off 8-bit system.

Most development for the watch was shared on a Yahoo Group, which is down now, but there are backups somewhere. I manually saved a bunch, and I know that people there backed up the messages and uploaded them to... I don't remember exactly where, I'd have to look it up. I remember running across a post there from the EEVBlog guy when looking through old posts. He was asking about making a fitness tracking app.

It took me a long time to get around to writing anything for it, but I wrote several programs for it around 2013-2015ish. A timer, calendar, expense tracker, and a viewer for the user tracking data (stuff like how many times you've pressed a button, entered an app, or had an alarm go off). The SDK had a single DOS program as part of the build process (it was something to feed the linker some addresses), so I had to rewrite it as a 32-bit version to get the SDK to work on 64-bit Windows 7. I think I later found someone else already did that on the Yahoo Group.


Short of full-blown prototype boards (ESP32 ecosystem) and dev environments (Apple ecosystem), are there any modern day consumer-facing devices running on an obscure platform that also come with a hidden layer of an SDK?

I am thinking of the Cybiko, but that’s also contemporary to the Timex watch.


Garmin watches have an SDK with their own entire goofy "Monkey C" language and an entire bytecode VM to run it. It's not "hidden" I guess (there's an app store), but neither was this Timex one.

https://www.atredis.com/blog/2020/11/4/garmin-forerunner-235...


Somewhere, a lone programmer in a hardware engineering org is having the time of their life.


Ruputer/OnHand PC? Another smartwatch from the late '90s/early 2000s. The specs were on par with the original Mac: 128KB RAM, a 102x64 pixel display, and 2MB of flash, but a VERY sluggish OS and poor battery life (though still better than modern smartwatches).

Maybe Palm and Psion devices count? I would exclude WinCE...


Ah yes, http://pconhand.com - I drooled over it as a teen. But the price was too high for me.


I could never get used to the Psion 5mx keyboard. It was just a little too small. The Treo thumb-board wasn't ideal, but it worked well enough. FWIW, PalmOS was HEAVILY influenced by classic MacOS. I mean, at least they changed the variable names in the debug images.


Oh man, I remember being tempted by the Cybikos 20 years ago. But now you can't find a working one on eBay. Maybe I could make something similar with low-power Nordic or TI BTLE chips and an MSP430 w/ an LCD driver.


Not first-party, but calculators?


There's a new model that looks the same? https://www.timecentershop.gr/ents/timext53722.html


Hardly new; Amazon lists it as discontinued by the manufacturer, which probably explains why it's tough to find.


It was just odd that it was listed as in stock and also had a different model number than m851 - but maybe that refers to the processor?


I am most impressed by the documentation provided in the references. Looks like it was ahead of its time.


Credit where credit is due; I can imagine many people here right now would want to find the "NINO ALDRIN L. SARMIENTO" credited in the changelogs and shake their hand...


Fun fact: it's possible to password protect apps using the PC software with a two-character password (there's no encryption; it's just intended to slow down anyone from casually examining user data if they happen to find the watch unattended). When the DLU boots up, it initializes the password variable to "NS".

It doesn't really mean much, since all apps default to unprotected on boot, so you can never use it, but it's interesting.


Has anyone informed the original programmer, Nino Sarmiento?


Does the Timex m878 have the same core CPU?

Because the m878 has GPS and ANT+ support, which makes it far more powerful.


oh well


This comment reads like it was written by one of my high school teachers. "You are so capable, you should apply yourself more" is really just the worst thing you can say to a person who is actually enjoying what they do. What prompts people like you to think you have any measurable influence in telling someone to do something that you find more useful? Is it a lack of talent on your side, or just entitlement?


Cool. Any idea where to buy these? Thanks


> Now that I’ve told you how much I love this thing - the bad news.

> They’re long out of production, and getting hard to come by.

> I bought a set of two on eBay, they just needed new batteries and were as good as new.


Cool, but it's not really hacking if you just use an SDK to write code for a platform.


I think you misunderstand the origin of hacking. It's not about breaking into systems, that's cracking. Using an SDK to program a watch is definitely hacking on the watch. The media are generally to blame for this miscategorisation.


When it comes to (mechanical) watches in particular, hacking has another meaning: https://en.wikipedia.org/wiki/Hack_watch


Using an SDK to program a watch is... programming a watch? Otherwise every iOS developer is an iPhone hacker?

I'm not a native English speaker, but to me "hack" means something non-standard. Not necessarily cracking, of course, but just using an SDK to write some software doesn't sound very "hacky". These watches were explicitly designed to be extendable.


It's okay. I'm a native english speaker and the word "hack" sometimes confuses me too. There are at least three meanings I can think of:

1. From the MIT Tech Model Railroad Club. This is the "good" definition of hacking. A "hack" is a clever manipulation of an existing system to exceed its design specs or make it do something radically different from what it was originally intended to do. Example: "Did you see that radical hack Bob pulled, using the I/O port on a Commodore 64 to drive the model railroad track switcher? That was amazing!"

2. From the news media in the 80s and 90s. To break into a computer system. Example: "Wily hackers broke into government computers and stole your Social Security, Driver's License, and Credit Card numbers. But the good news is you get free credit monitoring services for the next year."

3. From ????. To jury-rig a system out of parts you wouldn't expect would work together, but it does. Example: "Did you see that radical hack Bob pulled, using a cardboard toilet paper roll to emulate a PDP-11? That was amazing!"


The words hack and hacker have been co-opted by so many people and for so many purposes that they no longer have any coherent definition, at least in the realm of technology. To add examples to your list: hacker seems to refer to people who do not code professionally, or who subscribe to a more results-oriented (rather than methodological) programming style; while a hack can refer to "amateur" (meaning low-quality) code, or to a writer who pumps out low-quality work (professionally).

These words seem to have more to do with social status within a particular group than anything else.


The "hacky" part is that he didn't use the official "Windows XP-era wizard" development environment.

Instead, he works under Linux with standard Makefiles, using wine and calling the compiler directly.


Yet here we are on a site called "Hacker News" with the current top story being about a consensus algorithm with a load of implementations, real-world usage, and libraries in multiple languages.

The original term comes, afaik, from journalists who would hack away at typewriters, hence "hacks"... it's another evolution, but I'm not great with etymology.


Well. It also purports to be a tech site when it's really a finance site.


One would think the author of the post would be very familiar with both uses.



