Now this is what I come to HN for. A true hack in the classic sense of the term. An absurd attempt to do something with technology that its designers had absolutely no intention of supporting, and learning quite a bit about it in the process. Bravo!
There's probably not enough RAM (64 kB vs. 256 kB on the nRF52840), certainly not for the full resolution of the display in color. But I'm sure a more basic port would be possible.
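To put rough numbers on that (purely illustrative; the display modes below are made up, with Doom's classic 320x200 8-bit mode as a reference point):

    /* Back-of-the-envelope RAM math -- the display modes here are
     * hypothetical, just to show the order of magnitude involved. */
    #include <stdio.h>

    int main(void) {
        const unsigned ram_small = 64u * 1024u;       /*  64 kB                  */
        const unsigned ram_840   = 256u * 1024u;      /* 256 kB on the nRF52840  */

        const unsigned fb_8bpp  = 320u * 200u * 1u;   /* Doom-style 8-bit frame   */
        const unsigned fb_16bpp = 320u * 240u * 2u;   /* small colour LCD, RGB565 */

        printf("8bpp frame:  %6u B of %u / %u\n", fb_8bpp, ram_small, ram_840);
        printf("16bpp frame: %6u B of %u / %u\n", fb_16bpp, ram_small, ram_840);
        return 0;
    }

A single 8-bit frame already eats nearly all of 64 kB before any game state, so a colour framebuffer only really fits on the 256 kB part.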
We use additional compute power to enable frameworks and abstractions that allow code to be written faster. Sure, it's bloated, but at least there's a reason.
Yeah, this is my experience too.
I view it more as bureaucracy; in order to get my job done I now also have to understand a whole lot of arbitrary systems created by a group of programmers on the other side of the world.
If you break out of the desktop/mobile environment and into embedded, it's genuinely liberating.
One thing to consider is that the types of people writing code are very different. In the 80s it took geniuses to write a simple video game within RAM constraints. Now that's a feasible final project for an intro CS class.
Electrical engineers still write incredibly low-level code in embedded programming courses - we did some pretty interesting shit in MIPS assembler before we even touched C++ - and I doubt we were any smarter than the CompSci kids.
I think it's primarily a cultural difference - modern CompSci people love treating everything as an academic exercise and hand-waving away complexity, whereas the old-school "hackers" (before CompSci was really a huge thing) would love getting into the nitty-gritty of things and didn't really give a damn about whether or not they were following best practices.
If the average CompSci student (who, to be fair, is smart but not Genius-level) spent as much time reading the spec sheet for the Motorola 68000 as they did learning how and when a Turing Machine halts, they'd have no issue understanding 80's video game code.
Yes, 100%. Without question. I know it's trendy to hate modern languages and frameworks but they do reduce overall dev time even if they have more of an initial learning overhead.
Apple's USB-C charging bricks are quite capable on their own. The new 140 W brick uses a 32-bit Arm Cortex-M0+ STM32G0-series microcontroller with 36K RAM and 128K flash ROM, running at 64 MHz.
USB PD is a fairly complex protocol. I can safely plug my phone into my MacBook charger now because both ends can negotiate a suitable power level. We’re not just doing 5V 1A anymore. The CPU they chose is probably a little overkill, but it’s nicer than 8-bit microcontrollers to write programs for since it’s just Arm.
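For a sense of what that negotiation involves: the charger advertises a list of 32-bit "power data objects" (PDOs) and the device requests one of them. Below is a minimal sketch of decoding fixed-supply PDOs; the field layout (voltage in 50 mV units, max current in 10 mA units) follows the USB PD spec, but the example capability list is invented for illustration.

    /* Decode USB PD "Fixed Supply" source PDOs.  Field layout per the
     * USB PD spec; the example values are hypothetical. */
    #include <stdint.h>
    #include <stdio.h>

    static void decode_fixed_pdo(uint32_t pdo) {
        unsigned mv = ((pdo >> 10) & 0x3FF) * 50;  /* bits 19:10, 50 mV units */
        unsigned ma = (pdo & 0x3FF) * 10;          /* bits  9:0,  10 mA units */
        printf("%5u mV @ %4u mA (%3u W max)\n", mv, ma, mv * ma / 1000000);
    }

    int main(void) {
        /* Hypothetical capability list a multi-voltage charger might send. */
        uint32_t pdos[] = {
            (100u << 10) | 300u,   /*  5 V, 3 A */
            (180u << 10) | 300u,   /*  9 V, 3 A */
            (400u << 10) | 500u,   /* 20 V, 5 A */
        };
        for (unsigned i = 0; i < sizeof pdos / sizeof pdos[0]; i++)
            decode_fixed_pdo(pdos[i]);
        return 0;
    }

Parsing and answering messages like these, plus the timing-sensitive signalling on the CC line, is presumably most of what that little STM32 spends its time on.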
Who knows, maybe they have a bunch of internal sensors and can intelligently react to high temperatures or moisture.
I worked on the kernel parts of pre-smartphone phones.
When GSM was invented, there was real doubt about whether it would be possible to meet the very strict timing requirements on the hardware available at the time.
I remember when we had to integrate Bluetooth into our 16-bit, 8 MHz Infineon E-gold platform. We had ~20 people working on it for 2 years.
Reminds me of the article on HN some months ago where someone got a form of Linux running on the controller SoC within a hard drive. Incredible what power is being packed into other devices these days.
Satirical news/blog stories are my favorite part of the old internet. Really wish they would make a comeback. The long-form written humor is a lot more fun to me than something like Twitter.
That's not a standard Bluetooth dongle, that's the go-to chip that runs a lot of smartwatches, tags, and health devices. Has community Arduino support as well.
You're right, because the nRF52 series are Bluetooth Low Energy (BLE) devices, they can't stream music at all :-) Music streaming is done via Bluetooth Classic (as of now). BLE != Bluetooth Classic
I was under the assumption that BLE chips have regular Bluetooth support as well, as a fallback for when you actually want to use the thing for something productive. Is that not the case, or are most phone manufacturers quietly putting two chips in their devices?
Like if a chip supports Bluetooth 5.1 that should include both, no?
Have you tried a Bluetooth 5 LE headphone? It's a completely different radio and protocol from previous-generation Bluetooth headphones. I don't know if there are any on the market yet. There certainly aren't any based on the nRF5340 yet.
I don't know if the range will be better, but latency and pairing process should be much better.
Most Linux distributions dropped support for older architectures (Motorola 68K, SPARC, VAX, and the like) that NetBSD still somewhat actively supports [1].
Heck, many Linux distributions dropped even i386...
To make your life easier, just get the official nRF52840 Dongle - it's only 10€ but has all GPIOs conveniently exposed. No need to repurpose the LED or the user button for a GPIO.
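If you do go bare metal on those exposed GPIOs, toggling a pin only takes a couple of registers. The addresses below are the P0 GPIO registers from the nRF52840 datasheet; the pin number is just a placeholder, so check which pin your board actually routes to an LED or header.

    /* Bare-metal GPIO toggle on an nRF52840 (P0 registers per the
     * datasheet).  PIN is a hypothetical choice -- adjust for your board. */
    #include <stdint.h>

    #define P0_BASE    0x50000000UL
    #define P0_OUTSET  (*(volatile uint32_t *)(P0_BASE + 0x508))
    #define P0_OUTCLR  (*(volatile uint32_t *)(P0_BASE + 0x50C))
    #define P0_DIRSET  (*(volatile uint32_t *)(P0_BASE + 0x518))

    #define PIN 13u    /* hypothetical: pick a pin your board exposes */

    static void delay(volatile uint32_t n) { while (n--) ; }

    int main(void) {
        P0_DIRSET = 1u << PIN;        /* make the pin an output */
        for (;;) {
            P0_OUTSET = 1u << PIN;    /* drive high */
            delay(400000);
            P0_OUTCLR = 1u << PIN;    /* drive low  */
            delay(400000);
        }
    }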
It's sort of insane how ubiquitous outright computers are now. Like, I didn't know that USB dongles were full-on computers, but it doesn't surprise me either... Computers have gotten so cheap and small that they're borderline disposable.
Kind of cool that I get to be part of the first generation to experience that.
where "on a pregnancy test" means "they ripped out the original hardware from a pregnancy test and put in a new controller and new display, which then shows Doom"
It's even simpler than that. On the "pregnancy test," Doom is actually running on a PC, and the scaled video is sent via USB to the replaced microcontroller, which drives the replaced display. Source: https://mobile.twitter.com/Foone/status/1302834931421175809
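Roughly, then, the PC-side half of a hack like that boils down to downscaling each rendered frame and pushing the raw bytes over the USB serial link for the microcontroller to blit. The sketch below is only an approximation of that idea, not Foone's actual code; the device path, resolutions, and pixel format are all made up.

    /* Hypothetical PC-side frame pusher: nearest-neighbour downscale an
     * 8-bit frame and write it to a USB serial device.  Everything here
     * (sizes, device path, format) is illustrative, not the real pipeline. */
    #include <stdint.h>
    #include <stdio.h>

    #define SRC_W 320
    #define SRC_H 200
    #define DST_W 128   /* hypothetical tiny display */
    #define DST_H 32

    static void downscale(const uint8_t *src, uint8_t *dst) {
        for (int y = 0; y < DST_H; y++)
            for (int x = 0; x < DST_W; x++)
                dst[y * DST_W + x] =
                    src[(y * SRC_H / DST_H) * SRC_W + (x * SRC_W / DST_W)];
    }

    int main(void) {
        FILE *port = fopen("/dev/ttyACM0", "wb");   /* hypothetical device */
        if (!port) return 1;

        static uint8_t frame[SRC_W * SRC_H];        /* would come from the game */
        uint8_t small[DST_W * DST_H];

        downscale(frame, small);                    /* one frame...       */
        fwrite(small, 1, sizeof small, port);       /* ...out over USB    */
        fclose(port);
        return 0;
    }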
That's true, but I guess it will take some years until manufacturers leave behind the proven line-up of cheap microcontrollers like the TI MSP430. They're so ubiquitous in these applications that TI even provides an Electric Toothbrush Controller Reference Design if you want to have a look at it.