General purpose MCUs built in to LEDs emulate candle flicker (cpldcpu.wordpress.com)
237 points by pontifk8r 3 months ago | 182 comments



> Is it possible to improve on this candle-LED implementation? It seems so, but this may be for another project.

Hey, I got paid to do this a few years back! Here's the result: https://evietealight.com/

They're the sort of thing it's not really possible to show off in a photo or even video. What really gets you (or at least gets me) is seeing these things out of the corner of your eye, flickering, just like flames flicker....

This was a pretty uncertain research project. We didn't know if we'd be able to make really convincing flameless flames. The day I knew we'd got it was the day I brought a prototype home to tune and A/B test against a real candle. I went upstairs to put the kid to bed, then came downstairs and panicked because I thought I'd left the real candle burning. I hadn't! I got fooled by my own product. It was all downhill from there.

These guys are patented (somehow... the language is pretty impenetrable, and it's my own patent...) so I don't mind sharing a little of how it's done: there's an array of LEDs plus an optically active element that shapes the light in a way that makes it look good. Nothing about this is too complicated (the optical surfaces aren't all that special, but they're not trivial to prototype in your garage, at least if you've never done optics before), but there was a lot of R&D to settle on what looks "right", and the final product has a lot of "premium" touches that really drove up the manufacturing cost. Hence the high final price -- they really do cost a lot to make.

But they look awesome!

(And thanks Tim for your original article -- I definitely remember reading it during the early research phase!)


Still rather odd not to have a video on the page of a product where animation is the key selling point.

Maybe a video wouldn’t convey all the benefits of the design, but lack of video does not inspire confidence either.


I came here to ask for a video!!


https://www.facebook.com/glassybaby/videos/2723369747740357/

I'd prefer a side-by-side with a real candle, but oh well.


I own six sets of Evie lights and they are the best I have ever seen. Three light sets with an induction charger sell for $75.

Here is a pre-release writeup from Product Creation Studios in Seattle about the development of Evie lights.

https://www.productcreationstudio.com/product-creation-glass...

And, here is a short video clip of Evies “burning.”

https://www.facebook.com/glassybaby/videos/2723369747740357/


I went to a 'candle light concert' where they had a little asterisk at the bottom pointing out the candles were "flameless". They probably had several hundred LED candles lined up along each edge of the room as the main light source. It was pretty cool, and I'm guessing they used those patents?


I love it. I have noted this article for my talk about how CPUs are free. To appreciate that, you have to understand that when the first microcomputers came out, engineers were still in "compute" mode[1]: we were lectured that you wouldn't use a hard-coded loop to check for a switch closure; you had to use interrupts, because otherwise you were wasting all those CPU clocks. And computing at the time was often billed in weird units like "kilocoreseconds" (the number of seconds multiplied by the number of 1024-word pages of core (RAM)) consumed when your program ran.
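That old lecture can be made concrete with a toy simulation. All the numbers below (cycle counts, ISR overhead) are invented purely for illustration:

```python
# Toy model of "wasted clocks": all figures are invented for illustration.
SWITCH_CLOSES_AT = 1000      # cycle at which the switch closure happens

def busy_wait():
    """Hard-coded loop: every cycle is spent re-reading the pin."""
    cycles_burned = 0
    while cycles_burned < SWITCH_CLOSES_AT:   # pin still reads open
        cycles_burned += 1                    # one wasted clock per check
    return cycles_burned

def interrupt_driven():
    """Interrupt model: those cycles go to useful work instead;
    only the ISR entry/exit is overhead (figure invented)."""
    ISR_OVERHEAD = 10
    return ISR_OVERHEAD

print(busy_wait(), "cycles burned vs", interrupt_driven(), "cycles of ISR overhead")
```

When you were billed in kilocoreseconds, those thousand spins were money; on a modern MCU they're noise.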

The logical extreme end of Moore's law was that you could put a CPU into a very, very small amount of silicon and that meant they were essentially free. (Chips cost by die area & layers). Another article like this is Bunnie Huang's discussion of the ARM CPU in flash chips[2].

There have always been jokes that it is cheaper/easier to use an 8-pin microcontroller than a 555 timer, and the argument has often come down to the fact that the current and voltage ranges a 555 can work under are different. But at some point I expect to finally see the "blending" of analog/digital chips that allows for a wide range of voltages (an on-board switching PMIC) and analog pins with few if any compromises for being either digital or analog.

[1] The Chip Letter -- https://thechipletter.substack.com/p/tiny-computers-from-tex...

[2] On Hacking MicroSD Cards -- https://www.bunniestudios.com/blog/?p=3554


> at some point I expect to finally see the "blending" of analog/digital chips that allow for a wide range of voltages

The irony is that the developments which made dirt-cheap MCUs possible have at the same time basically ruled this out.

Digital logic is almost trivial to scale down. With Moore's Law the compute core itself is indeed becoming basically free. However, IO does not scale down: modern chips have far fewer analog pins, far lower current limits, lower voltages, and are increasingly sensitive to ESD & over-voltage events.

An ATmega32u4 from 2008 is designed to operate on 5V, can handle 40mA per pin, and has 13 analog pins. It's rather sturdy and can take quite a beating. On the other hand, the RP2040 from 2021 runs on 1.1V, although IO is 3.3V. It can only handle 12mA per pin, with a chip-wide total of 50mA. It has only 4 analog pins, which lack a lot of the protection present on the digital pins. Basically, you'll damage it if you look at it funny.

I think it's best summarized by a somewhat-recent change to the USB 2.0 specification: originally the data pins were supposed to handle a 24-hour short to 5V without any issues. This requirement was dropped because such a short is incredibly rare in practice, and dropping that single requirement led to a 16% reduction in silicon area for the transceiver and a 3x reduction in standby power.

In today's world of ever-shrinking transistors, dealing with (relatively) high voltages and analog voltages is getting more and more expensive.


I don't disagree at all; this analysis is spot on. The size of discretes on silicon has not shrunk nearly as much as transistors have. However, where I'm coming from is that, because the transistors have shrunk so dramatically, it becomes possible to put an entire CPU in the "left over" space after you've placed the discretes.

There was a talk at either Hot Chips or ISSCC around 2011 about a mixed-mode chip where the die was 2/3rds analog and 1/3rd digital. Xilinx, the FPGA maker, came out with the "RFSoC", which has a "huge" analog section with multiple high-speed ADCs and DACs and analog reference logic, plus an FPGA fabric, plus a quad-core AArch64 CPU. As I recall, Cypress had something similar, but the part family is escaping me at the moment.

But I am still looking for a chip that integrates an SMPS so that it can run on a very wide range of voltages like the CD4000 series did (and still does), combined with the ability to source tens of milliamps like the ATmega and PIC chips did (and still do).


Plenty of chips integrate a voltage regulator, so you can run digital logic from as much as 300 volts. They're used in the controllers for some dimmable AC LEDs, for example.

I believe the linear regulator is implemented as a large resistor followed by some zener diode or something. I assume that's so the high voltage doesn't need to touch the silicon, merely the other end of some not-very-good insulator put on top.


The Cypress part family you are likely thinking of is the PSoC line; it's an MCU mixed with a very configurable analog front end.


Yeah that was what I was looking for, apparently it has an Infineon[1] part number now : https://www.infineon.com/cms/en/product/microcontroller/32-b... which is kind of cool.

[1] Apparently Infineon closed their acquisition of Cypress in 2020.


Made me realize that IO is the fundamental bits-atoms interface, thank you for writing.


> I have noted this article for my talk about how CPUs are free.

If teeny computers are free, and if I want to re-program them for my own use cases and personal applications, then why do I have to still spend nearly a thousand dollars or two on embedded systems development equipment like microcontroller development boards, JTAGs, ICEs, ROM flashers, UART-based bootloading solutions, and other delicate programming interfaces for small microchips, microcontrollers, and tiny computers? And don't forget microscopes to do power analysis for reverse engineering some old toy that was made to emulate or fake a real life candle's intractable flame properties.[1]

If you can't write code, then what's the point? How would no code be an agent of freedom and expression?

Reprogramming a microcontroller unit with a USB cable connected between it and a laptop computer is convenient. But too bad that's not really the standard for old technology and resources laying around the planet, isn't it? You have to basically be uncanny like MacGyver or inhumanly intelligent like Tony Stark to reprogram the apparently free teeny computers laying around the world.

[1]: https://cpldcpu.wordpress.com/2024/01/14/revisiting-candle-f...


Perhaps this helps, perhaps not, but the Cortex-M architecture from ARM defines, as required, a built-in debug unit. I can build a standalone JTAG/development tool for it[1] on a $3 breakout board, and program/debug it for free using GCC.

It has been a pleasant side effect of competition in the embedded space that proprietary (and expensive) tooling has become a problem for getting a chip adopted and so there is more pressure to support open source solutions.

[1] Blackmagic Probe -- https://black-magic.org/


First: don't conflate NRE and tooling with the cost of something. Plastic spoons are close to free, but making a plastic spoon factory would be expensive.

Second: you don't need most of that stuff. Dev boards that are a few bucks and debug probes for under $20 are credible and usable; fairly good compilers are free.

> But too bad that's not really the standard for old technology and resources laying around the planet, isn't it? You have to basically be uncanny like MacGyver or inhumanly intelligent like Tony Stark to reprogram the apparently free teeny computers laying around the world.

USB DFU is pretty dang common. It's not the absolutely lowest end stuff, but still pretty dang close to free.

Compare to doing all of this ages ago, where you'd have an 8051 with an expensive, crummy compiler and need a lot more tooling to do anything.


> Compare to doing all of this ages ago, where you'd have an 8051 with an expensive, crummy compiler and need a lot more tooling to do anything.

That depends... back in the day, I could buy an (UV) EPROM programmer for several hundred $$. Or I could study the datasheets, build my own EPROM programmer for a fraction of that, and write some software. Guess which route I took.

With uCs it wasn't much different, and still isn't. Vendor-supplied programmers / debug probes etc. are just a quick & easy way to get started.

What is different these days, is that a lot of those 'vendor' tools are (more or less) generic tools, applicable to a whole class of devices (eg. JTAG), often come as cheap 3rd party clones, and with free software to use them.

So personally I don't understand parent's "1000s of dollars" complaint. That only applies when using niche products, outfitting a pro-level electronics lab, or plain doing it wrong / uninformed of the wealth of stuff out there.


> back in the day, I could buy an (UV) EPROM programmer for several hundred $$. Or I could study the datasheets, build my own EPROM programmer for a fraction of that, and write some software. Guess which route I took.

Even in '80s dollars, that's a big opportunity cost as a grownup. Now you can buy a $3 STLink and call it good. It's changed.

He said nearly a thousand dollars, which isn't that hard to get to-- but it means that you're doing a pretty wide variety of stuff.


Yeah. There's a small set of prepackaged micro- or teeny- computer programming interfaces. Or the plug-and-play if you will. In fact, that small set of convenience products only serves a market of kids that want to play with toys. They're literally toys. Ten or fifty dollar ARM microprocessors or microcontrollers coming in a box with integrated debugging features and integrated WiFi modules. And their complementary three dollar programming link handhelds. All from off the digital Amazon.com or AliExpress shelf. The "in-band" programming interface at accessible prices and stores.

And that's fine.

It's just that for me, on the other end of the spectrum, I prefer a little bit more adventure. Fewer constraints. So I need an "out-of-band" microchip programming solution for my aims.

Outside the kid world, you're required to be more knowledgeable about the way the world really works. You learn a whole lot more with out-of-band computer modifications than if you were to just plug and play some prepackaged handheld programming device into a little chip. You get more intimate with the microchip and its internals. You get concerned about its voltages and current needs, in order to achieve a proper relationship between your curiosity and the microchip's capabilities.

I want to dig into the raw power contained and hidden in unimposing millimeter (or centimeter) wide circuits. The re-programmability of microcontrollers or teeny-tiny computers, specifically.

There is no current documented solution for that. Beyond going your own way in a very long study and practice of electronics engineering and salvaging.


I'm going to translate your comment in how it sounds to me:

"I've spent a lot on embedded development. In large part, I've done this because I've sought to make things unnecessarily complicated and because I like playing with this stuff. I will deride the typical tools used today by most embedded developers as toys. I will use these views to try and support an assertion that computing isn't effectively 'free' in a monetary sense"

It's not like any of this is that complicated. I've spent plenty of time building my own programmers for things; I've bitbanged SWDIO, programmed EPROMs and micros with a parallel port and shift register or GPIOs on other micros; made pogo pin things, etc. If I were looking to get things done, odds are I can spend a few tens of dollars and just get going, and design in a part that costs a few tens of cents for a whole lot of computing in historical terms.

> I want to dig into the raw power contained and hidden in unimposing millimeter (or centimeter) wide circuits. The re-programmability of microcontrollers or teeny-tiny computers, specifically.

Very little of this is arcane on modern devices. Even a couple of decades ago the "hardest" thing in common use was the need for higher voltages for EEPROM erasure. IMO, where things get interesting is where you abuse peripherals to do things they weren't intended to do, but even that isn't usually equipment intensive-- a 4 channel oscilloscope and a debug probe will get you a long ways.


Yeah, you're right. Someone can spend at most five hundred or seven hundred dollars on a complete embedded systems development combination-set which maybe consists of something like one or five or twenty ARM microcontrollers and the convenient hardware application programming interfaces that are compatible with them, the small computers. Micro computers?

Anything else, anything outside this standard specification you've shared with me, is where some hardcore hacking goes on, in my opinion.


> Yeah, you're right. Someone can spend at most five hundred or seven hundred dollars on a complete embedded systems development combination-set which maybe consists of something like one or five or twenty ARM microcontrollers and the convenient hardware application programming interfaces that are compatible with them, the small computers.

You can literally become equipped to develop for microcontrollers and have a bunch of boards for less than $50 (excluding a laptop).

If you want to go a bit further, you can get a lab supply and a halfway decent DSO with logic analyzer capability for under $500.

> Anything else, anything outside this standard specification you've shared with me, is where some hardcore hacking goes on, in my opinion.

I've done plenty of hardcore hacking, and ... even then, not really. FPGAs? You can get ICE40 boards for <$20. A microscope is nice, but $40. Soldering iron? Pinecil is pretty great for $40.

I spent so much money on equipping myself to do EE stuff 25 years ago. Now you can do much more than I did back then for peanuts. Heck: I just built 20 little embedded ARM computers for students with LCD, cherry switches, and a debug monitor for $300, and the biggest expense was the blank keycaps. It was trivial to get going. That includes manufacturing. We are spoiled. https://github.com/mlyle/armtrainer

Where things get expensive is doing anything fancy: analog, RF, or very high speed (which is really also analog ;). Computing itself is cheep cheep cheep.


Impressive looking development board. It's beautiful even.

But a feature rich integrated development environment would be better. Especially if it, both a hardware accommodating and software accommodating development environment, operates on more than just something like the toy ARM Thumb instruction set and its ARM based microcontrollers.

After all, you don't need an architecture like ARM or even x86 to do some simple things that should be as accessible as alternating current mains electricity or sunlight from the Sun.

Computing is cheap, but only because it's easy to clear the low bar for having a Turing machine. Turing machines even occur naturally. Conway's Game of Life is Turing complete and subsequently you can build a computing machine with it. No ARM or x86 emulation or JTAG-ing necessary. Here, it's unnecessary to even summon a UART to USB adapter.

So, although computing is cheap, it's being locked behind some proprietary bars right now. I'm just looking for the keys to free some computers. Particularly I wanna free very teeny sized computers like microcontrollers.


> operates on more than just something like the toy ARM Thumb instruction set

This is about the most common instruction set in common embedded use in the world. IMO not too toyish.

> Especially if it, both a hardware accommodating and software accommodating development environment,

I'm a bit confused as to your point-- you seem to be simultaneously arguing for "more capabilities" but "smaller computer".

If you're saying "more integrated development" -- and mean specifically self-hosting-- the $1.75 microcontroller running in there is capable of being an 80's era DOS computer with all of those tools, etc, self-hosted. Playing with this is on my todo list for the luls. If you just want open-source development, GNU-arm-embedded is built on eclipse and gcc.

If you're saying smaller computers: STM8, 8051, etc, are easy, too. But there's really not a whole lot of reason to design below the $5 point unless you're mass producing something. The developer's time is worth something, too. Having a big pile of RAM to not even think about putting things on stack, etc, is nice.

If you're saying "free as in freedom" (you were responding to someone making a cost argument with the word "free") -- you can go ICE40 and put any number of open source hardware computing designs on it, and control every bit. Indeed, I had a high school class just build microprocessors from scratch on this approach.


Yeah. A 1980s era Microsoft DOS computer capability running on a $1.75 microcontroller is exactly what I want, on one hand. I'm not greedy or needy, after all. A microcontroller with WiFi connectivity built in (or easily attached, imported, or included) for building a networked system of these smart little computers and hardware part controllers too. Like, a washing machine that sends a "done washing" message to a headless server sitting in the home.

But I do kinda need them to be expendable. So designs that are priced at less than $5 is kinda a requirement for me. Because I sincerely believe I'm stepping into new and unexplored territory. A lot of experiments will be done with this information technology system. Which means there needs to be a massively productive facility for having a swarm of microcontrollers. Hence, the need for turning any microcontroller encountered in the wild into a controlled and compliant robot brain for my heterogeneity of devices and home appliances.

I don't mind thinking about how to not bust open a stack that can only fit three variables on it or something. In comparison to the simple architecture which includes parsimonious memory modules or only two registers total, for example, what's complex will be the total assembly and combinations of Turing machine based codes made possible by teeny microcontrollers/computers doing simple things. Like receiving temperature levels and then relaying or sending packets of temperature or heat data to a server. Acquiring x86 instruction sets is definitely unnecessary here. Or, rather, I only need x86 code execution for not re-inventing things like WiFi. ARM or x86, for example, then, should be seen as just an imported (think Python) or included (think C) module.


> Yeah. A 1980s era Microsoft DOS computer capability running on a $1.75 microcontroller is exactly what I want, on one hand. I'm not greedy or needy, after all. A microcontroller with WiFi connectivity built in (or easily attached, imported, or included) for building a networked system of these smart little computers and hardware part controllers too. Like, a washing machine that sends a "done washing" message to a headless server sitting in the home.

OK, that's an ESP8266, then. Here's a module for $2.11.

https://www.aliexpress.us/item/3256805440432225.html

They're far more capable than you're describing -- capable of emulating a PC-XT at close to 80% speed. For throwaway stuff you could use MicroPython.

They're cheaper than thinking about how to use random micros you find.


So basically we're shopping for fingernail sized motherboards?


That's the starting point. If you want to design boards, you can put down ESP8266 castellated modules (easy) or the ESP8266 chip itself (somewhat harder).

Because of issues with electronics supply chains, complete boards are often cheaper than buying the modules and chips at low quantities (things are really optimized for selling a thousand units or more). Even buying blank keycaps at low quantities was very expensive compared to finished, printed sets of keys.


> A microcontroller with WiFi connectivity built in

ESP32 modules are $2 on LCSC and come with a built-in wifi antenna

> So designs that are priced at less than $5 is kinda a requirement for me. Because I sincerely believe I'm stepping into new and unexplored territory

No, your requirements are the same ones that every cheap IoT device has. Open up a $5 smart switch and see how they manufactured it for $1

> simple architecture which includes parsimonious memory modules or only two registers total, for example

What are you on about? Using an unusual instruction set will increase NRE, cost per MCU, and power consumption. Low-power ISAs are a scam. Race to sleep if you wanna save resources.


So many products would be better if they just used a Wemos S2 mini or similar $5 microcontroller board.

I understand why everything uses custom designed stuff for cost, but I don't get why people think it's somehow the better or more "professional" approach to do everything yourself.

Modular stuff with common high level modules is just so much easier to repair, modify, and recycle.

We need an ISO standard for these little modules so we can get back to vacuum tube era level of repairability.

Nearly all modern gadgets (general-purpose computers and phones aside) could be made from a common set of 30 or so modules plus minimal custom stuff.


> I have to still spend nearly a thousand dollars or two on embedded systems development equipment

wat

> JTAGs, ICEs, ROM flashers, UART-based bootloading solutions

Dude, all the popular chips these days are ARM MCUs with SWD. They can be programmed with a $3 ST-Link V2. The most you'd spend on the stuff you listed is $75 for a Black Magic Probe, but of course you can build your own for 1/10th the price.

$0.03 MCUs are the exception to the rule since they use proprietary protocols and OTP memory, but their programmers are still in the $100 range


The future of everything is basically "the upfront and ongoing costs of the tooling are infinite, while the physical deployment is free".


From the modern MCU perspective, not doing a busy wait would be about not burning the battery. Instead, it should stay in low-power mode while waiting for an interrupt.

Very small controllers in a high-power device, like on a motor or an LED, don't have this limitation.


We're almost at (or are already at) the stage where the packaging of the chip costs more than the computer inside it.


we're two orders of magnitude past that stage. a ryzen 7 is ten billion cmos transistors for a hundred dollars, ten nanodollars per transistor. so how much does a minimal computer cost at ten nanodollars per transistor?

the intersil im6100 was a pdp-8 cpu in 4000 cmos transistors, and the 6502 was comparable, but that's with no memory other than the cpu registers. for a useful microcontroller you probably need about 8192 bits of instruction memory and a few bytes of ram, so let's round up to 16384 transistors for a whole computer. an 8051, with built-in instruction eprom and 128 bytes of ram, was 50k. the arm2 without ram was 27k. an avr, with built-in ram and flash, is 140k. https://en.wikipedia.org/wiki/Transistor_count

at that price, counts of 16384 to 131072 transistors work out to 0.016¢ to 0.13¢. but the cheapest computer you can buy today is a padauk pms150c https://www.lcsc.com/product-detail/Microcontroller-Units-MC... which is 4.2¢ in onesies (and 2.43¢ in quantity) with 64 bytes of ram and 1024 13-bit words of one-time-programmable prom for the program https://free-pdk.github.io/chips/PMS150C. that's 150× more; in the day of moore's law doublings every two years that would have been 14 years, but now it's probably longer. (incidentally this same blog looked at them a few years ago https://cpldcpu.wordpress.com/2019/08/12/the-terrible-3-cent...)

ergo we've been at the stage where the packaging of the chip costs more than the computer inside it since about 02009

(obviously the ryzen 7 cpu costs a great deal more than its packaging, though, because that's what you have to do when you're competing on computrons per dollar rather than gpios per dollar. in theory for 2.43¢ you should be able to get 2.4 million transistors, enough for about 300 kilobytes of rom or 50 kilobytes of sram, or half that together with a 486 or a quad-core arm3. presumably padauk is not doing this because they're using long-obsolete semiconductor process nodes, which is also why their chips are so power-hungry)
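for the skeptical, the arithmetic above re-run in python (the prices are the figures from this comment, not current market data):

```python
# re-running the per-transistor arithmetic from the comment above
ryzen_price_usd = 100.0
ryzen_transistors = 10e9
per_transistor = ryzen_price_usd / ryzen_transistors   # ten nanodollars each

for n in (16_384, 131_072):          # minimal-computer transistor budgets
    print(f"{n} transistors -> {n * per_transistor * 100:.3f} cents")

# padauk pms150c at 2.43 cents in quantity, vs 16384 leading-edge transistors:
ratio = (2.43 / 100) / (16_384 * per_transistor)
print(f"~{ratio:.0f}x the leading-edge per-transistor price")
```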


What a complicated way of saying 'yes' ;)


i thought it was an interesting enough question to merit deeper analysis; perhaps someone else who thinks so will read it


I agree with OP/OP/OP. Thanks for the initial point, breakdown and the entertainment :p

Can't wait for us to start seeing authentic 3d silicon tho. Gimme that 3mm^3 cube of magic. And beyond that, computronium.


I have had Microchip FEs say as much to me.

Will be interesting when chiplet technology gets to the point where multiple chips are installed in a plastic package (there are thermal expansion/contraction issues that make this a hard problem but still ...)


If and when the cooling problem is solved and we can just stack layers at will, computronium will truly have arrived. Adding another dimension will unlock hardware potential that we can only dream of today.


chiplets still lose area to scribing and dicing


You can tell the age of an embedded programmer by whether they consider sampling an input to be "polling" (which to me implies blocking, but that's another discussion) and then look for silver bullets for interrupt storms.


Polling has never implied blocking. It’s actually a way to avoid blocking. I think you’re thinking of “busy-wait loops”.

The difference between “polling” and “busy wait” is whether or not the CPU is doing other unrelated things between samples.

The difference between polling and interrupts is that with interrupts the CPU can halt entirely while waiting rather than having to take those samples in the first place.
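The distinction can be sketched in a toy scheduler (the pin behavior and "work units" here are invented purely for illustration):

```python
def make_pin(ready_after):
    """Simulated input that reads low `ready_after` times, then high."""
    state = {"reads": 0}
    def pin():
        state["reads"] += 1
        return state["reads"] > ready_after
    return pin

def busy_wait(pin):
    """Spin on the pin: no unrelated work happens between samples."""
    while not pin():
        pass
    return 0                    # units of other work accomplished: none

def poll(pin):
    """Sample the pin, doing unrelated work between samples."""
    other_work = 0
    while not pin():
        other_work += 1         # CPU does something else, then re-samples
    return other_work

# With interrupts the CPU would halt and take no samples at all:
# the pin's edge wakes it directly.

print(busy_wait(make_pin(5)), poll(make_pin(5)))
```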


You're not entirely wrong, and thanks for pointing this out, but polling is also frequently used to mean "busy-wait loops" - you can look for it if you doubt me. I didn't really want to get into that conversation.

The other thing that polling implies is that there will not always be data present when you poll (think of a UART driver), and you might block as long as there is, and you may or may not be polling at a deterministic rate. Sampling unambiguously implies that a sample is always present and handled. It does not preclude any use of interrupts - you can sample with a timer interrupt or handle a sample with an A/D interrupt.

Many embedded programmers of a certain age have a nearly fixed mental model of a microcontroller as a sort of headless VIC-20 and a certain horror at sampling techniques that derives from enduring crummy polling peripheral drivers/handlers from the early days of personal computing.


It might be a myth, but I seem to remember the ASICs used to flicker older LED designs were often repurposed from audio greeting cards. The light was actually just Happy Birthday playing through an LED or bulb rather than a speaker.


Not a myth. I clearly remember watching a YouTube video where this was discovered, only a few years ago, but of course the uselessness of search engines these days has made it impossible to find now.


Here is a May 2011 post of someone who discovered that: https://www.halloweenforum.com/threads/interesting-fact-abou...

And here is someone who recorded audio samples (no idea of the year): https://www.instructables.com/How-to-Listen-to-Light/

Edit: and here is maybe the video you were remembering: a video from June 2011 of someone hooking up a speaker to listen to the flickering LED: https://youtu.be/753-lkao8l0?si=-WqRRuBH644oKTXG But they don't realize it may be an audio chip, as the tune isn't really nice or recognizable.


Thanks! From your second link, there is this text that dates it to mid-2008:

Update 20 Nov, 2009: This additional step was added on 20 Nov, 2009, about a 1+1/2 years after the bulk of this instructable was published.


Yeah I think I saw that too, bigclive maybe?


Likely to have been him, but you have a sibling comment that shows others also discovered this fact, before him.


Another common, low-code method of getting random lights from your overused and under-resourced micro was to simply output a segment of ROM code to the IO ports the LEDs were on. We often used this trick for twinkling Xmas lights or front-panel lights on the custom controllers we made in the 1980s.
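A rough simulation of the trick (the "ROM" bytes below are arbitrary stand-ins for whatever machine code happened to be in the chip):

```python
# Dump a slice of "ROM" (machine-code bytes) straight to the LED port:
# code bytes are irregular enough to look random on a bank of lamps.
rom = bytes.fromhex("c3f01a00eddb21ff7f3e08d3fe")   # arbitrary stand-in bytes

def twinkle(rom, steps):
    """Each step writes the next ROM byte to an 8-bit LED port."""
    return [rom[i % len(rom)] for i in range(steps)]

for b in twinkle(rom, 4):
    print(f"{b:08b}")       # each bit drives one lamp
```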


IIRC, at least one design used a counter + ROM table of brightness values -- and a ROM table of bigger size (1k+ entries or so?) than one might expect, all because a pseudo-random generator wouldn't produce a nice enough effect. And a counter + ROM ASIC was still cheaper to manufacture in volume than a uC (probably not true anymore?).
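A sketch of that counter + ROM scheme (the table contents below are random placeholders, not the actual mask ROM, whose whole point was to look nicer than a plain PRNG):

```python
import random

# Placeholder brightness table; a real design would hand-tune these values.
random.seed(1)
TABLE_SIZE = 1024                # "bigger than one might expect"
rom = [random.randint(96, 255) for _ in range(TABLE_SIZE)]

def flicker(counter):
    """PWM duty cycle (0-255) indexed by a free-running, wrapping counter."""
    return rom[counter % TABLE_SIZE]

print([flicker(t) for t in range(4)])
```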

But I'd be interested to know what OTP tech the uC from the article uses. Mask-programmed? Fuse-based? Flash (and if so: erasable)? Something else?


Once upon a time, on USENET, someone had a .sig that predicted one day computers would be cheap enough they'd come in cereal boxes and we'd throw them away.

That day appears to have arrived.


The worst offenders right now might be disposable vapes. I saw some sold with effing lcd screens built-in. For a disposable piece of crap.


I'm waiting for the EU to ban these fucking things already. That and disposable power banks. Yes, you read that right.

Selling consumer devices with Li-Ion batteries that are designed to not be rechargeable, should be banned altogether.


An interesting thing about li-ion batteries is that they contain much less lithium than disposable lithium coin cells, yet hold much more charge. If we're outlawing disposable li-ions, we should outlaw those as well.


>If we're outlawing disposable li-ions we should outlaw those as well.

Except when your watch battery dies, you only dispose of the battery, not the whole watch, as is the case with those single-use vapes.


We're OK with disposable alkaline batteries, so what makes lithium worse? If anything, alkaline batteries might have a slightly worse environmental footprint due to the use of manganese.

The problem with lithium batteries is that they can catch on fire, but that's a problem only when charged (or charging). A fully discharged battery shouldn't do much.


> We're OK with disposable alkaline batteries, so what makes lithium worse?

No we're not. Disposable batteries should not be a thing anywhere, especially not in products where they cannot easily be removed by design. Alkaline batteries may not combust when damaged, but their internal juice leaking out is damn corrosive.

> A fully discharged battery shouldn't do much.

Even that is enough to set trash compactor trucks or the heaps at waste collection plants on fire. This shit is becoming a massive problem for the trash hauling and processing industry. One operator in Australia blames these batteries for 35 (!) fires a day in the country... no surprise if 1.8 million of them are sold a week [1]. This is frankly insane, and the rise in trash fires directly corresponds with disposable vapes.

Additionally, we need every bit of lithium we can get for electric vehicles and other stuff. Not for throwaway devices.

[1] https://www.abc.net.au/news/2023-12-02/qld-lithium-ion-batte...


should Apple be forced to produce batteries for 2001 clamshell ibooks?


No but they should be forced to open up the specs after the manufacturer support ends, so that the free market can decide for itself.


Probably not, but I don't really see what that has to do with anything.


Indeed these things are an abomination. I regularly find half consumed vapes at intersections where they have clearly been accidentally dropped and abandoned by cyclists. I have a nice collection of perfectly good lithium batteries.


Cyclists in your area are vaping while they ride?

Just spitballing here, but considering that at typical intersections, automobile traffic outnumbers cyclist traffic by at least 100:1, isn't it more likely that those vapes were throw out the window of a car when someone got frustrated they clogged or something? (I'm not a vaper, but I've heard of clogged vapes being a common occurrence).


> I regularly find half consumed vapes at intersections where they have clearly been accidentally dropped and abandoned by cyclists. I have a nice collection of perfectly good lithium batteries.

If true, then the hedonism and pleasure seeking (the homeostatic pleasure trap is a monkey trap) found within the big, global industrial complex is meant for a small set of secret hackers to take advantage of by collecting disposed "vape" or smoking pleasure devices for powering some cool nerd contraptions.

Don't be afraid to get your hands dirty when you pick up trash. Because capitalism produces treasure when it excretes its waste products.

You just have to be outside the capitalist world-system to do this cool trick.


I've collected a few of them and they are robust and rechargeable batteries. A common battery is the IP17350, 1100mAh/4.07Wh (the date of manufacture on the one I'm holding is 20210613). Anyone got any idea of a use for them besides small flashlights? They also pretty often have a pressure sensor and an LED, and a nice metal case.


Depending on the battery there are cheap AliExpress boards - or diy - to make tiny UPSes for a raspberry pi or other usb-powered device. I don't know for that specific battery; I have some that take 18650s, though.

One 18650 can get you several hours of runtime for a pi zero. The battery can cost more than the zero. The cheaper one cell UPS boards are about $2 (plus shipping).

An advantage of single cell UPSes is that you don't have to worry about balancing, which is a bit of a pain with scavenged cells.


I turn them in to my local battery recycler for further processing.


The rechargeable lithium batteries in those really don't fit the "lithium shortage" and "we don't have enough battery capacity to build an EV for everyone" narrative


It takes around 850g of lithium carbonate to produce one kWh of lithium batteries. The current market price for lithium carbonate is about $14/kg. The base spec Tesla Model 3 has a 57.5kWh battery pack, so the lithium in the pack represents a cost of ~$685, or just under 2% of the list price of the vehicle.

A typical disposable vape contains about five cents worth of lithium.

https://www.irena.org/-/media/Files/IRENA/Agency/Technical-P...
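
That arithmetic is easy to check. A quick sketch (the 850 g/kWh and $14/kg figures come from the comment above; the ~4.07 Wh vape cell is an assumption based on the IP17350 cell mentioned elsewhere in this thread):

```python
# Back-of-envelope check of the lithium cost figures quoted above.
LI2CO3_KG_PER_KWH = 0.850   # lithium carbonate per kWh of cells (IRENA figure)
LI2CO3_USD_PER_KG = 14.0    # approximate market price
PACK_KWH = 57.5             # base-spec Tesla Model 3 pack

pack_lithium_cost = LI2CO3_KG_PER_KWH * LI2CO3_USD_PER_KG * PACK_KWH
print(f"Lithium carbonate per EV pack: ${pack_lithium_cost:.0f}")  # ~$684

# Assumed disposable-vape cell: ~4.07 Wh (1100 mAh at 3.7 V).
VAPE_KWH = 4.07 / 1000
vape_lithium_cost = LI2CO3_KG_PER_KWH * LI2CO3_USD_PER_KG * VAPE_KWH
print(f"Lithium carbonate per vape: ${vape_lithium_cost:.3f}")     # ~5 cents
```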


Another post in this thread mentions 1.8 million disposable vapes being sold per week in Australia. So that corresponds to 131 EV batteries per week. Or roughly 48000 EV batteries per year. 87000 EVs were registered in Australia in 2023.

Make of that what you want, but disposable electronics to administer nicotine seem to be a major waste.


It actually fits the narrative pretty well

The materials for one EV battery can make ~30 e-bikes, and currently EVs are too expensive for most people.

The way to fix that and the way that industry is fixing it is to make batteries more efficient (higher-density, new anodes/cathodes) in parallel with making a bunch more of them (and mining more lithium).

If we succeed in making a $25k EV, the batteries used in those vapes will be _even cheaper_.

I don’t think it’s desirable and I find the waste appalling, but I do think that disposable batteries can only be expected to grow without intervention.


It does if the vape makers have an underpriced lithium source and don’t care about the pending lithium shortage.


they're just buying off-the-shelf cylindrical or flat-pack li-ion cells


A popular theory is that they use "QC reject" grade cells, as the batteries often have around 800 mAh capacity, about half that of most basic commercial-grade cells.


One car has 4,500 cells - bigger cells than most of the vapes. So it really is a matter of scale.


How many cars does one smoke in a year though?


Hopefully zero! Still, if you replace your car (or its battery) every 10 years (pretty long IMO) and smoke one vape a day (yikes), you'll use more cells on your car than your vapes.


I hope someone who replaces their car every 10 years isn’t just sending the old one to the compactor when they’re done with it

For one, 10 years is a perfectly good used car for somebody, and for two with large EV battery packs we’d expect some lithium recycling effort

Disposable vapes don’t have either of those going for them, batteries go straight in the trash after one charge cycle


I would hope that too. I imagine that as long as the pack still works, most cars will be sold forward on the used market. If the pack fails (either due to cell death or a crash or whatever), I bet many of them will not be properly recycled, especially from the early days. Once most cars are EVs, recycling will probably get better.

Either way, it seems pretty unfair to assume that EV packs will be 100% recycled, while vape packs will be 0% recycled. One could imagine a sort of "core charge" for disposable vapes. Bring the vape back for recycling when you get a new one and get $2 off. This could even be done by law like California CRV for cans and bottles.


The kind of people who can afford to buy EVs buy a new car what, every 3 years? So I guess about 1/3 of a car (or, using the numbers from another comment, about 4000 vapes' worth).


The car is recycled, while the vapes are sent to the landfill.


I thought a Tesla smoked them all..

I'll show myself out!


Do not give those guys ideas


The lithium in those cheap disposable things is less "lithium" and more "metallic powder/paste that theoretically contains elements of lithium." It's not something you'd want to actually use in anything important like a car battery


Not true. "Disposable" vapes use commodity li-ion cells, of the same basic type that you'd find in a cellphone, a laptop or an EV. They probably aren't the best quality, but there's nothing unusual about the chemistry or packaging. Li-ion cells are the preferred chemistry because of the very high discharge rate - alkaline or primary lithium cells just can't deliver enough current. The cell is perfectly capable of being recharged, but some people prefer the convenience of a disposable device and manufacturers are quite happy to respond to that demand.

It's wasteful, I don't particularly approve of it, and I expect to see a lot of jurisdictions ban disposable vapes, but I don't think it's particularly egregious or meaningfully impacting the commodity price of lithium carbonate.

https://hackaday.com/2022/05/05/2022-hackaday-prize-disposab...


> some people prefer the convenience of a disposable device

I don't understand this. Instead of plugging it into a charger, they'd rather go to the store to buy another, or more likely order online and wait for it to be delivered?


i don't have personal experience with this, but i imagine so, because if you plug your vape into a charger, you can smoke it in an hour or two, and if you buy a cigarette at the convenience store that's a block away, you can smoke it in two minutes

maybe you live somewhere without convenience stores


it's the convenience. If they had the executive function to get them online, they would save money and get a reusable device instead.


I introduced a battery charger, and the know-how to recharge disposable vapes, to a group of people who used them. It completely changed the way they interacted with the vapes; they even came out with funny high ideas like "we should patent this".


I searched Google for this recently and could not find it. I tried it again on Google Groups just now and found one reference to it:

https://groups.google.com/g/comp.arch/c/Y4C_Zjkb9VM/m/scDk_0...

> Killer micros of today are a lot like fluorescent lights -- cheap to operate, prevalent, and expensive to turn off. To see a machine standing idle, when you were raised as a child to "use cycles efficiently" is a gut-wrenching experience. Just remember Alan Kay's prediction: In the future, computers will come in cereal boxes and we will throw them away.

March 20, 1990. I haven't found a source for Alan Kay's prediction.


Various fortune files attribute the quote to Robert Lucky. I would guess that it was misattributed to Alan Kay since quotes often get attached to famous people.

"In the future, you're going to get computers as prizes in breakfast cereals. You'll throw them out because your house will be littered with them. -- Robert Lucky"

https://web.mit.edu/~mkgray/jik/sipbsrc/src/fortune/scene


That happened way before: RFID cards for disposable public transportation tickets, or those music greeting cards.


At a certain point it's more about the semantics of what a "computer" is. I don't know if I'd count an ASIC from a musical greeting card, though; and even within general purpose devices, microcontrollers vs microprocessors are typically delineated by the presence of an MMU.


If I can program it to execute a sequence of arithmetic and logical operations that approximate a Turing machine (with a finite band), and reprogram it at a later date to execute a different sequence of such operations, that's a computer to me. I wouldn't count ASICs, but the PIC12F508 or the 3-cent microcontroller referenced in the post definitely count.

Though by my definition of requiring reprogrammability and Turing completeness I am purposefully excluding many things that have historically been considered computers, like the many mechanical computers of the 19th and 20th century. From that standpoint I can see how some people might count ASICs as computers, even if I don't think that fits modern usage.


These RISC-V chips are on the order of $0.02 to $0.10 (qty 1)

https://www.wch-ic.com/products/CH32V003.html

The PIC12 has 25 bytes (!) of SRAM. The CH32V003 has 2 KB.

https://www.aliexpress.us/w/wholesale-CH32V003.html?spm=a2g0...


the 6502, 8080, z80, 8085, 8086, 65816, 68000, and 68010 were universally described as microprocessors, not microcontrollers, but did not have mmus built in (and of these only the 68010 could easily have one bolted on, as i understand it)

i think typically the thing that distinguished these from microcontrollers like the 8031, 8051, 8748, 8751, pic1650, etc., is that the microcontrollers had program memory built into them, either rom, eprom, or, starting in the 90s, flash. so they didn't need to be booted, they didn't need program ram, and in fact for a lot of applications they didn't need any external ram at all


Arguing about hard definitions differentiating microprocessors from microcontrollers based on a single feature is pointless. It's a vague product/marketing category for certain use cases. There will be a group of features that are more or less likely to be included, but for most of them there will be exceptions. And the set of features available in MCUs and microprocessors changes over time; as technology improves, both microcontrollers and processors are gaining new capabilities.

* MCUs usually have program memory built in. But then there are chips like the RP2040 or ESP32 which, while considered MCUs, are used with external flash memory chips for storing the firmware.

* MCUs usually have built-in RAM. But there are also some capable of directly using external RAM.

* Then there are things like Apple M1 chips: with a lot of stuff built in, you still don't call them MCUs.

* A bunch of ARM application processors/SoCs/microprocessors might have enough resources built in that they could be used as more or less standalone microcontrollers, without external RAM or flash memory.

* Some early microprocessors used external MMUs, and it took some time until processors settled on an architecture closer to how we have things now.

* Early personal-computer processors were in a weird category in terms of price and processing power: in a certain time period it wasn't impossible that the same microprocessor chip was used both as a main computer CPU and also in peripheral devices.

The "microprocessor" name is in my opinion slightly outdated at this point. It's not like anyone besides hobbyists is making non-micro processors out of individual relays, transistors or logic chips.


mostly i agree; it's mostly a marketing distinction rather than a technical one

actually i think non-micro (multi-integrated-circuit) processors are becoming popular again. the 'microprocessor' moniker wasn't coined to distinguish processors built out of discrete transistors from processors built out of integrated circuits; that was the 'second-generation computer' vs. 'third-generation computer' distinction back in the 01960s. what made a microprocessor 'micro' was that it was a chip instead of a circuit board


I believe the 68000 could use an MMU, but the catch was that it couldn't do demand paging, just memory protection and virtual/physical translation. I can't find the specific explanation right now, but it's something along the lines of the bus error exception (needed to actually stop the memory cycle) being special in a way that sometimes causes an incorrect PC value to be pushed to the stack. So you could terminate a process on an MMU exception, but resuming it was not reliable.


There was at least one company (Apollo, I think) that implemented demand paging on 68000 by using two 68000s. You had one, the leader, running as the "real" CPU, with the other, the follower, executing the same code on the same data but delayed by one instruction.

If the leader got a bus error they would generate an interrupt on the follower to stop it before it executed the bus erroring instruction.

The leader and follower would then switch roles, and the new leader could deal with the situation that had caused the bus error on the former leader.


That's so clever. What a hack. I'm imagining the slow smile on the face of the person that came up with it. "What if...".


I feel that the semantics are quickly becoming irrelevant. Many everyday items like sports watches, toys, kitchen appliances, alarm clocks or table radios already have more processing power, more memory and storage, higher resolution screen and better network connectivity than my first desktop. Running Doom on mundane items like key fobs and light bulbs isn't too far away from where we are in 2024.


> Running Doom on mundane items like key fobs and light bulbs isn't too far away from where we are in 2024.

About that... https://www.pcmag.com/news/you-can-run-doom-on-a-chip-from-a...



I'm pretty sure musical greeting cards are also a thing.


I got a server blade with a Xeon processor and 96 gigabytes of RAM for free last week... Granted, it is a ten year old device, but still.

Another friend will give me a 180 gigabyte RAM device next week, cause it would go to the container otherwise...


Yeah, I remember looking into getting a refurbished cloud server instead of a brand new desktop a decade ago or so. They weren't free, but you'd still get a machine with 40 cores and 120gb of RAM for only $400. Pre-Ryzen, it was a very attractive offer.

I never ended up buying one, because it'd consume way more power than made sense, and they're kinda shitty for gaming/workstation use due to low single-core performance and NUMA issues.


It's the power consumption and the noise that gets you with those.


There have been generic microcontrollers (usually single-shot programmable) available for less than 3 cents per MCU for half a decade already. Check out Padauk and similar MCUs.


At least a decade ago, a major magazine had a disposable chip inside for... I don't remember why.

Does anyone remember this?


I remember receiving this video ad for a Chevy pickup truck in a print issue of Popular Mechanics circa 2015 [0]. When disassembled, it consisted of a screen, battery, speaker, and circuit board. What the article doesn't say is that the circuit board had a micro-USB port which, when connected to a computer, mounted the internal storage as a drive, with the four-or-so videos it played accessible. I actually managed to find an existing home video with the correct format (I wasn't familiar with FFmpeg at the time) and when one of the internal videos was replaced with mine (with the same filename), it would play instead. I believe the micro-USB also charged the internal LiPo battery, as I don't recall being worried too much about battery life. I probably still have the thing somewhere! [0] https://www.tubefilter.com/2015/04/16/chevrolet-video-ads-in...


That's closer to what I remember, but I don't think the advertisement was about a Chevy Pickup.

Apparently the group that did this advertisement was: https://www.americhip.com/ourwork/?product=product-magazine-...

And they have a whole slew of magazine inserts that they've done. I'm convinced now that I saw one of these magazine inserts, but not the one on the Chevy. Thanks for helping me track it down! I'm really close...

EDIT: I found it!! https://www.americhip.com/ourwork/microsoft-wi-fi-hotspot.ht...

It was this advertisement, I remember the cover. Sorry I wasn't very specific earlier, but I didn't realize how many "throwaway computers inside of magazines" that there were. I guess I assumed this was a one-off event with Microsoft's 365 WiFi cloud (now that I remember the chip, lol).


Esquire October 2008 e-ink cover?


Someone hacked that too!

Hacking the Esquire e-ink cover

https://www.popsci.com/diy/article/2008-09/hacking-esquire-e...


A good guess but that's not what I was thinking of...

My example was all the above except the chip was centerfold in the middle of the magazine. Not on the cover...

I'm surprised this happened more than once however.


We've been there for about a decade now. Single-use paper transit tickets equipped with an RFID-capable microcontroller are quite common now.


I used to keep those disposable bus tickets back in 2007, hoping to find a use for them. After a year I hadn't found one.


To whomever:

I really, really, REALLY, did I say, "Really" miss the twinkle of old school incandescent blinking Christmas lights.

Each light has a metal spring that heats as the light glows. Eventually, the spring warms enough to move and break the circuit, thus turning the lamp off.

As the spring cools, eventually the circuit is made again, turning the lamp back on, and the cycle continues.

And there is more!

At first, the whole tree is illuminated. Then, one by one, the lights begin to blink.

Soon many are blinking.

Then all of them do it.

Because those springs are coarse and made as quickly as possible, there is considerable range in the cycle times of all the lamps.

Finally, the sustained, faster cycle time happens 5 to 10 minutes in. The lamp reaches a steady state, on, off, on, off, that is very regular.

Please, someone model this, drop it in a little MCU and sell us lights that twinkle, not just blink in some pattern.

If I were to guess, there are about 10 bits of variation needed to really capture what the old bulbs do. 8 bits may be enough, if one ignores outliers: those bulbs with either very short or very long cycle times. Those are rare, but they add to the magic of it all.

Thanks, I am waiting.
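
The mechanism described above lends itself to a simple model. Here is a minimal sketch, treating each bulb as a thermal relaxation oscillator with a widely spread warm-up time, period and duty cycle (all distributions and constants are invented for illustration, standing in for the "10 bits of variation"):

```python
import random

class ThermalBlinker:
    """One incandescent blinker bulb: a bimetal strip breaks the circuit
    after heating, re-makes it after cooling. Coarse manufacturing gives
    a wide spread of warm-up times, periods and duty cycles."""
    def __init__(self, rng):
        self.warmup = rng.uniform(30, 600)            # seconds solidly on at first
        self.on_time = rng.lognormvariate(1.0, 0.7)   # seconds lit per cycle
        self.off_time = rng.lognormvariate(0.3, 0.7)  # seconds dark per cycle

    def is_lit(self, t):
        if t < self.warmup:
            return True                               # whole tree starts lit
        period = self.on_time + self.off_time
        return ((t - self.warmup) % period) < self.on_time

rng = random.Random(42)
string = [ThermalBlinker(rng) for _ in range(50)]
for t in (0, 60, 300, 900):                           # seconds after power-on
    lit = sum(b.is_lit(t) for b in string)
    print(f"t={t:4d}s: {lit}/50 lit")
```

Because the period varies per bulb rather than just the phase, the pattern jumps around instead of rolling, which seems to be the key difference from the self-blinking LED panels mentioned below.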


This reminds me a lot of Big Clive's "supercomputer". See any of the videos a search like https://www.google.com/search?q=big+clive+supercomputer reveals.

It's just a panel of self-blinking LEDs. But they're cheap and low quality, so they start off all-on, sort of in sync, then after a while they're blinking "randomly".

So I think you want an LED string built out of lights like these?


Those look great! But they aren't the same, though I should say those are better than what seems to be out there today.

I left something out of my description above. It is complete regarding the blinking behavior, but there is also a pattern of tree-limb shadows on the ceiling! And light-colored walls really make for the best experience.

Notice how the panels Big Clive showed us have a steady rolling pattern? The incandescent lights are not that way.

Because each one works off an imprecise thermal spring, the blinking can range from a mostly on, going off for a short while, with short while being a couple seconds, to almost the opposite! That would be mostly off, illuminated for a short while.

Because the variation is so broad, the blinking pattern tends to jump around. One might see 5 lights change in close proximity, then half the tree, with parts coming back on in chunks, to another couple regions blinking off then on rapidly before the whole thing seems lit for a brief time.

The Big Clive units have all LEDs blinking at close to the same cycle.

His varies mostly in the phase. What I describe varies mostly in the period. Some bulbs blink fast while others are much slower.


One of the things that I thought about back in the days of poorly looped animated GIFs and Java applets was those annoying flame GIFs that didn't loop right. A project that I tinkered with (never finished) was creating a cellular automaton that generates a flame-like structure.

For a Java applet, it would have just continued on forever with random fuel being added. However, a different application of the 'fuel' for the flame would have had it loop over some period.

The other part is that with animated GIFs and a 256-color palette, one could index the flame color. Roughly, for RGB this steps through 0,0,0 -> 255,0,0 -> 255,255,0 -> 255,255,255 (not perfect, but it kind of works for a proof of concept).

And then, the value of a cell is a function of the cells below it in the previous time iteration.

With the fuel rows at some point looping (if written that way), so too would the overall frame loop at some point... and you could have a perfectly looping flame gif.

While that's all well and good, flame gifs are kind of a 90s thing... but my interest in reviving that old code coincides with me occasionally wondering "what could I do if I made a volumetric display?"
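
For reference, here is a sketch of the classic demoscene form of this effect (not the parent's actual code): a bottom "fuel" row of random heat, each cell above computed from the cells below it in the previous frame, and the heat value doubling as an index into a black -> red -> yellow -> white palette. Making the fuel rows themselves loop would, as described, make the whole animation loop perfectly.

```python
import random

W, H = 40, 16
heat = [[0] * W for _ in range(H)]  # heat[0] is the bottom (fuel) row

def palette(v):
    """Map heat 0..255 to RGB along black -> red -> yellow -> white."""
    if v < 85:                            # black to red
        return (v * 3, 0, 0)
    if v < 170:                           # red to yellow
        return (255, (v - 85) * 3, 0)
    return (255, 255, (v - 170) * 3)      # yellow to white

def step(rng):
    heat[0] = [rng.randint(0, 255) for _ in range(W)]  # inject fresh fuel
    # Walk top-down so each row reads the previous frame's values
    # from the row beneath it (row 1 sees the fresh fuel).
    for y in range(H - 1, 0, -1):
        below = heat[y - 1]
        for x in range(W):
            # average of the three cells below, minus a little cooling
            avg = (below[x] + below[(x - 1) % W] + below[(x + 1) % W]) // 3
            heat[y][x] = max(0, avg - 4)

rng = random.Random(0)
for _ in range(30):
    step(rng)
print(palette(0), palette(100), palette(255))
```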


Seems like a small step up from the ubiquitous WS2812, an LED with an onboard controller handling PWM and one-wire communication.

Years ago, before animated LED christmas lights were readily available, I hand-made such a string out of ATtiny85 controllers, soldered onto bicolor red/green LEDs, one controller per light. A little bit of C and an evolving animation algorithm recycled from a previous project yielded a pleasing quasi-rhythmic effect. It was absurdly non-economical, but felt like a glimpse of the future.


$.032 in quantity...


Which is that?


WS2812B


Some people say software is eating the world, but I prefer to say it's infesting it.


DNA is software. The world is already full of software. We're just catching up.


Technically speaking it is a very large configuration file. The config reader runs Ribosome 0.8, which escaped from the lab a couple of billion years ago and infested a whole planet to the point that it became inhospitable to life.


You left out the preprocessor (RNA transcriptase), which takes the config file and transcribes it, possibly modifying some bits in place, before it gets to the ribosome.


#defines all over the place as well as decidedly unhygienic macros. Serves them right for still using 'C'. The list of CVEs is endless for that stuff, entire encyclopedias full of them.


If you'd like to see this and more in software for flashlights, please visit ToyKeeper's projects!

https://github.com/ToyKeeper/anduril

https://toykeeper.net/

I loved this feature on my flashlight and used it exclusively on candle mode. There's a lot of thought that goes into it like PWM speed or direct drive for LEDs that may or may not cause flickering and a tradeoff between battery life and the electronics available.


I'd rather they didn't. I recently upgraded our household flashlights and was pleased to get 3-mode rather than the ridiculous 5- and 7-mode lights they were replacing: no more having to loop through 'SOS', several ridiculous flash patterns and a half-dozen brightness settings to get to the 'full brightness, no flashing' setting that was needed 100% of the time.


Maybe I misunderstood you, but Anduril doesn't include strobe modes on the main cycle, and it lets you configure how many brightness steps you want (or hold for continuous ramping). So it sounds to me like you should want people to check out Anduril, since it addresses the two issues you mentioned with your old flashlights.


Those are the bad ones. If you get this one, the only mode you get is a ramping effect: hold the button to make it go brighter or dimmer.


I hope there is a follow up where the LED is ground down and a PICKIT2 is used to read out the code :)


PicKit2 and MPLAB8 made MCU programming a pleasure. Everything was so damn snappy and responsive. Feels as though everything went downhill with the advent of PicKit3 and MPLABX.

Edit: The site only exists on the wayback machine now, but there was a hate page [1] for the PicKit3 posted on Dave Jones' twitter some time ago

[1] https://web.archive.org/web/20180423225612/http://www.fuckit...


I don't buy the PIC chip theory, since that means it's either programmed in-situ, which seems impossible, or they're ordered with premasked or preprogrammed ROM, which is hellishly expensive.


It could also be programmed during production. Clearly the LED manufacturer has the equipment to do wire bonding, so why not use the occasion to program it too?

It'd be a bit comparable to the test and assembly process of the WS2812B, see [0] @ 2:30 or 6:10.

[0]: https://www.youtube.com/watch?v=pMjhJ9kcaU4


Anything is possible, I suppose, but burning EEPROM is an order of magnitude slower than anything these robots are doing.


I'm not too worried about that. The slowest part of EEPROM & flash operation is erasing an already-used page - and that's not the case here.

Chips like these only have 1kB of flash - I'd be surprised if you couldn't program them in less than 100ms.


Compute is so God-damn cheap these days that it often makes sense not to fabricate a custom ASIC, just emulate what the ASIC would do with a microcontroller. For example, in the case of people who build new, retro style computers, often a microcontroller or two are handling I/O to and from the peripherals like the screen and keyboard. Like there might be an Arduino or Pi Pico generating a VGA signal instead of a dedicated CRTC (because no one makes old-school CRTCs anymore). I just think that's wild, especially when the I/O controllers are more powerful than the main CPU itself (as might happen with a Z80-based system).


I would have assumed that nowadays using a small neural net to approximate a CFD model of a flame would be the easiest.


If you want to see some candle flicker code based on a PIC12F508 or 9 see below.

The flicker effect is based on a linear feedback shift register function to provide a pseudo random effect.

https://github.com/linker3000/CandleFlameSim
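
A linear feedback shift register is only a few instructions on even the tiniest MCU. As a generic illustration (a standard 16-bit Galois LFSR sketched in Python; this is not the code from the linked repo, which targets the PIC):

```python
def lfsr16(state):
    """One step of a 16-bit Galois LFSR with taps 16,14,13,11 (0xB400),
    which gives the maximal period of 65535. Never pass state == 0:
    that's the lock-up state."""
    lsb = state & 1
    state >>= 1
    if lsb:
        state ^= 0xB400
    return state

def flicker_levels(seed=0xACE1, n=8):
    """Yield n pseudo-random 4-bit brightness levels for PWM,
    taken from the low bits of successive LFSR states."""
    state = seed
    out = []
    for _ in range(n):
        state = lfsr16(state)
        out.append(state & 0x0F)  # low 4 bits -> 16 brightness steps
    return out

print(flicker_levels())
```

In firmware the brightness value typically feeds a PWM duty cycle directly; smoothing between successive levels makes the flicker look less like noise and more like a flame.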


Going to drop this here:

“Reverse engineering” a real candle

https://cpldcpu.wordpress.com/2016/01/05/reverse-engineering...


This is great, I wish I could find off the shelf LED candles that more closely approximate the real thing. I'd also like them to be taper candles on a little holder with a loop finger hole so I can grab it to investigate noises in the night.


The best LED candle/flame simulations are the ones that use many LEDs, such as:

https://www.amazon.com/gp/product/B08789LJPS

https://www.amazon.com/gp/product/B07RBPLRPV

There used to be a company that made 192-LED rechargeable candles which looked like real flames (when behind frosted glass). They have apparently stopped selling them because of all the cheaper ripoffs with fewer LEDs and worse algorithms. The many-LED flames look more realistic because the light source actually moves, rather than just varying in intensity. They had a bulb similar to the first link above. It's odd that the cheap LED lights don't use a better algorithm; a Brownian-motion or fractal algorithm with a pseudorandom number generator designed for a long sequence before it repeats wouldn't raise the price, it just takes a little more time to implement.
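
The "better algorithm" being suggested is cheap to sketch. Here is an illustrative version (invented constants, not any product's actual code): random-walk a flame "centroid" across a small LED array so the light source moves, instead of drawing independent random brightness levels each frame:

```python
import random

def brownian_flame(frames, n_leds=8, rng=None):
    """Random-walk a flame 'centroid' and intensity across a small LED
    array, so the light source appears to move between LEDs rather than
    the whole array just varying in brightness."""
    rng = rng or random.Random()
    pos = n_leds / 2.0      # flame centre, in LED units
    bright = 0.8            # overall intensity, 0..1
    out = []
    for _ in range(frames):
        # small correlated steps, clipped to the physical range
        pos = min(n_leds - 1.0, max(0.0, pos + rng.gauss(0, 0.3)))
        bright = min(1.0, max(0.3, bright + rng.gauss(0, 0.05)))
        # per-LED PWM level falls off with distance from the centroid
        levels = [bright * max(0.0, 1.0 - abs(i - pos) / 2.0)
                  for i in range(n_leds)]
        out.append([round(255 * v) for v in levels])
    return out

frames = brownian_flame(100, rng=random.Random(1))
print(frames[0])
```

Because each frame depends on the last, successive frames are correlated, which is what makes the motion read as a flame drifting rather than random noise.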


There are a lot of MCUs around us, but they are locked down and not reprogrammable. For example, if your microwave oven or a smart bulb breaks, you cannot reuse its MCU for anything else. Don't you think MCUs should be reprogrammable, resolderable and reusable?


Next up: "Porting Doom to run on an LED candle".


FWIW Doom was ported to the Ikea GU10 lightbulb a couple of years ago.

Unfortunately the original post and code all got taken down due to copyright claims, but archive has a copy:

https://web.archive.org/web/20210615035229/https://next-hack...


Obviously porting Doom to run on smaller and smaller devices is impressive, but I wonder if anyone has completed Doom on one of these ports? That I’d love to see!


Semi-relatedly, the best "digital candle" app I ever found was one that just drew an orange rectangle in the center of an otherwise black screen, and animated its scale randomly. This convincingly modulates the brightness of the display.


It would be possible to use the LED itself for bidirectional communications (for example reconfiguring it), see https://www.merl.com/publications/TR2003-35 or "Very Low-Cost Sensing and Communication Using Bidirectional LEDs" by Mitsubishi. You could then reprogram the device.


So is the program of that chip burned into mask ROM (do people still do that?) or could that thing be taken out and repurposed? The datasheet mentions handy-dandy device programmers that plug into your PC and probably mandate the use of a horrible proprietary development environment, but maybe that's only for special evaluation parts?


According to another article on the authors site, linked from this one, most of these microcontrollers are “write once”: https://cpldcpu.wordpress.com/2019/08/12/the-terrible-3-cent...


that probably cost around a cent per die

I believe it's more like a tenth or hundredth of that, if you're talking about USD.


Cutting, wire bonding, and mounting probably cost the rest of the penny.


Is the actual make and model of the MCU just a guess?

Doesn’t seem to be anything to corroborate the PIC12 besides a pinout the author has seen before.

Just mentioning it because there are likely a zillion eight-pin MCUs with this pinout/ballout pattern.


The post makes no mention of the actual MCU being used, they just note that it's interesting that the pinout matches the often-cloned PIC12. It's one additional piece of evidence hinting that it's probably an MCU die in there, rather than some kind of ASIC.


Matches in shape, yes.

Is there any evidence it matches in function?


Read the article:

> There are rectangular regular areas that look like memory, there is a large area in the center with small random looking structure, looking like synthesized logic and some part that look like hand-crafted analog.

And the PIC12 is known to be a source of inspiration for dirt-cheap Chinese microcontrollers, see for example [0].

Hard evidence? No. But if it looks like a duck, walks like a duck, and quacks like a duck...

[0]: https://cpldcpu.wordpress.com/2019/08/12/the-terrible-3-cent...


I'll take the article's word about the PIC12 devices, but there are plenty of chips from Atmel, NXP, and TI that match that form factor. Plus the whole universe of Chinese clones.

Just seemed like a rather large leap to assert based on pad pattern alone. /shrug. Whatever. Not really that important.


OK ... As if there weren't things a thousand times worse


I really hate these fake LED candles with a passion. Either put an actual candle on the table, or just put a small cosy light that doesn't flicker.


I quite like them. With LED candles, there's no danger of fire if you forget about the candle, accidentally knock it over, or if something accidentally goes over the flame. There is no mess to clean up as the candle melts the wax. There is no worry about wind blowing out the candle.

I don't like those small LED candles using disposable watch/button batteries. I use ones using standard AA batteries, which means I can use NiMH rechargeables.


Yeah, so switch it out with a continuous light instead of a flickering one. Why do you need the aesthetic of a candle light that doesn't even flicker like a real candle anyway, and just turns out to be this flickering annoyance that I'd much rather throw out of the window?


To each their own. I rate the “run for 6 hours each day” technology as a wonder of the modern age along with sliced bread and the Apollo program.

We have about a dozen of these things throughout the house, running everyday. Some on tables, some on counters, some on the wall in sconces.

They turn on and off on their own. Don’t have to worry about a cat knocking one over. We replace the batteries I think every 3 months or so. (Pair of AAAs.)

Yea, easy, safe way to add atmosphere and ad hoc nightlights throughout the house. Modern marvel.


Most of the LED candles on eBay just have a one shot timer. If you're looking for the "6 hours every day" candles, here's an example:

https://www.ebay.com/itm/254879583371


My SO trialed a few different LED tealight ones to find some that "flicker right", ie not annoyingly (to her anyway).

She then drilled out parts of the core of some large candles using a hole saw, and put these LED candles in there.

So now she enjoys candles every evening without the fire risk, the poor indoor air quality during the winter season, and the expense of new candles.


I respect her endeavour. Except why go through all that trouble when you can just get a continuous light that doesn't annoy anyone?

It seems that just about no one is annoyed by continuous lights. And they can be cosy too. Moreover, with fancy new LED technology (sorry for the slight sarcasm) you can even have it in paper lamp shades, or whatever nicely done Art Nouveau creation you like.


Why? I've known several people who died from candles, including through other people's use of candles.


Not to mention that whenever I light a candle, my air filter spins up and my air quality monitor says pm2.5 peaks.


Yeah, my wife once fell asleep with a candle burning. It set fire to the decorative candle holder and some kids' clothes lying on the table. Luckily the IKEA table was fire-resistant, so the fire extinguished itself, but I came home to a house full of smoke with my wife asleep in the middle of it, and it was the scare of my life.

I also bought smoke alarms for every room of the house after that (the hard-wired fire alarm in the apartment was heat-activated only, I guess).


>> Ive known several people that died from candles

That can't possibly be true - unless they all died in the same accident?


Two incidents, two families. Happens a lot with Mexican candles.


I still don't believe it. Mexican candles? This is a technology from ancient times; why would a candle made in Mexico in 2024 be any different from an American or Chinese or wherever candle?


Mexican candles probably aren't any different except they were involved in two fires the previous commenter was familiar with. Some cultures/religions use candles as part of their holiday observances.

https://www.nfpa.org/education-and-research/home-fire-safety...


Mexican kids knock over the religious candles all the time. One tragic event happened when a friend's house was engulfed in flames after a candle was knocked into the curtains, and she suffocated protecting her children from the smoke. I think one child was left with really bad breathing problems, too.


Amazing that the tip of an LED electrode can now casually accommodate a "proper" 1 MHz microcontroller. And most of the hardware is unused!


Now it just needs wifi and firmware updates.


Wait till you have to wait for your candle to boot an off the shelf Linux and download security updates before it starts flickering.


Who produces these? And at what cost?



