em3rgent0rdr's comments | Hacker News

HTML: the only markup language that we ever needed.

All it was missing was a self-contained WYSIWYG editor for it.


> given platforms like Windows thrive in spite of it, it is probably not as bad as many people may think.

There is a lot of extra work done behind the scenes to "thrive in spite of it". Windows Defender (built into Windows after XP) has to periodically download updated virus definitions and always scans programs for potential malware, and still can't catch them all.


Free95: "Free" as in "a free laugh".

If you can limit yourself to vector graphics, there are inexpensive open-source pen-plotters which use the same Arduino-based grbl program that is used by CNC drilling machines. I don't know how reliable they are, and I don't know of one with an automatic paper-feed, and they probably take a long time to draw, particularly if there are a lot of solid color areas. Looking on Amazon I see a cheap A4-sized self-assemble pen-plotter for $130.

We should make a dual-head pen-plotter / paintbrush combo, to fill in solid colour areas faster.
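
(For the curious, a minimal, hypothetical sketch of the kind of G-code a grbl-based plotter consumes, generated from a polyline in Python. The Z-axis pen lift is an assumption; some plotters map pen up/down to spindle commands like M3/M5 instead.)

    # Hypothetical sketch: emit grbl-style G-code for one polyline.
    # Assumes pen up/down is a Z move; some plotters use M3/M5 instead.
    def polyline_to_gcode(points, feed=1500):
        lines = ["G21", "G90", "G0 Z5"]        # mm units, absolute coords, pen up
        x0, y0 = points[0]
        lines.append(f"G0 X{x0} Y{y0}")        # rapid move to the start point
        lines.append("G1 Z0 F500")             # pen down
        lines += [f"G1 X{x} Y{y} F{feed}" for x, y in points[1:]]
        lines.append("G0 Z5")                  # pen up
        return "\n".join(lines)

    print(polyline_to_gcode([(0, 0), (50, 0), (50, 50), (0, 0)]))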

Do iOS users care about sideloading yet? No.


I'm an iOS user and I care.


Do more than 1% or so of non-hackernews iOS users care about sideloading?


Apple has made it against the rules for apps to notify users of the benefits of not going through the app store.

Many subscription services don't allow you to subscribe from your iPhone. Why? Because if they did they'd have to pay Apple 30% for the privilege. And Apple has made it against their rules to notify users of this.

And for the ones that eat it, they're not allowed to give users a 25% discount for going to their website.

This is what Epic did to prove user harm: they made it cheaper to buy currency through their own processor than through Apple's. Of course they were banned for it.


Patreon is letting me bump my prices up for people who subscribe to my work via the iOS app. I don't know if this is because they are simply ignoring the "you have to have the same price through Apple" rule or if they worked out some kind of deal.


If they knew every app/subscription they bought would be 30% cheaper I think they would care. But, conveniently for Apple's shareholders, the app store rules forbid developers from providing honest information about those costs to their users.


As if every developer that releases a sideloaded app will lower their prices by 30%. That's total poppycock. The whole point for these devs is to keep that 30% and not give it to Apple. That's some funny shit that you think devs will all lower the prices the consumer sees. Do you have a 2-drink minimum for your set?


> As if every developer that releases a side loaded app will lower their prices by 30%.

That is literally already the case for many apps. Many offer a better price outside of the iOS ecosystem.

And why are you drawing the goalposts at EVERY developer?


I have never seen this on Android, and it definitely seems like developers don’t care about the 30% cut on Steam, even though sideloading is fairly easy on Windows. How many Steam games can you buy direct, let alone get a discount for?


The majority of PC games aren't sold through Steam even though it has a large installed base and provides some level of promotion. Meanwhile many of the popular Steam games are made by Valve itself or are free games paying Valve 30% of nothing, implying that the large majority of third party paid game developers didn't think the 30% was worth it when given a choice.


It's not that everyone will lower their prices... it's that in the past they already raised them. In 2023, if you calculated that you need to make $3 per install to be profitable, you'd have to raise that number by 43%* to account for Apple's cut. And if you don't think users will pay 43% more, then the app will just never get made.

Who knows whether or not current apps will have their prices lowered, but it would be better if in the future, new apps are not priced 43% higher than they would have been, like all the current ones.

* 1/(1-0.30) = 1.43
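
(A quick sketch of that gross-up calculation, for playing with other fee rates:)

    # Price markup needed for a developer to net the same revenue
    # after the store takes `fee` off the top.
    def required_markup(fee: float) -> float:
        return 1 / (1 - fee) - 1

    print(f"{required_markup(0.30):.0%}")  # -> 43% for a 30% cut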


More like 18% higher because the fee is 15% on the first million per year.


Well, seeing that most mainstream subscription services don't allow you to purchase through the App Store, including Netflix, Spotify, YouTube, etc…


1% of iPhone users is a lot of people.


Just like Linux's 4.5% market share, it's a lot of people. What does it have to do with iOS? Nothing.


How many iOS users would want to play Fortnite again? Would people want a Netflix app that handles subscriptions inside the app? Who would care about a Kindle app where you can buy books and read them without leaving the app?

An awful lot of people do care about these issues; even if they've never heard of sideloading, they'll jump on it if offered the opportunity.


If you cared you wouldn’t have bought an iOS device.


This is a bit, okay, very unrelated to the main point, but I care.

But I want a new laptop.

Really eyeballing a MacBook. But I fear the locking down of osx is likely to continue. But it seems like great hardware.

The other side of the coin I see is Framework. The hardware is maybe not as wow whiz-bang, but it offers future freedoms in all the ways.

Am I correct to worry about osx lock-in, and 'sideloading' on osx being fully removed vs. a pain in the ass?

I can't seem to find the darn sweet spot.


Serious question, but is this an age thing? When I was young, I was all about building boxes, tweaking settings, pushing to get every last drop out of the gear. That's because I had a lot of free time living a much more carefree life while still living with parental units. Eventually, I was able to parlay those skills into gaining some advantage for small companies I worked for. Now, I have a full life outside of work, and just cannot be bothered with all of that. As long as the device does what I need it to do, that's all I care about. I don't even bother with the desktop or any other personalization if it doesn't actively make me more efficient. Once I'm done, I push back from the keyboard and continue on with other aspects of life.


My problem is that things don't do what I need them to do any more. I've spent years learning how to run a computer like the devil just for it to constantly amount to nothing because some asshole decided that everything needs to take the backseat to "It Just Works". More and more of the problems I run into that should be fixable by a config change, sloppy workaround, or (in the worst case) a quick patch are now just immutable facts of life. At some point the ideology has you spending more time dicking around, not less.


I hear you. I'm 43 with 2 school-age kids. And I work for myself. And there are 3 of us. So it's, uh, busy, always.

It's more of a resource question. I tend to keep my hardware around for a long while. I like to repurpose it for various little things. I have a netbook grabbing SDR data in an outbuilding and shuffling the data to a living-room server to be processed (for weather). The server is an old Dell workstation. Etc.

My worry is the 2 i5 iMacs I have upstairs. Old, yes. Upgraded to the extent they can be? Yes! Useful hardware? Honestly, yeah.

Except they can't get newer versions of osx, thus the Cricut machine will not work on them. Thus I have to use Parallels with Windows to make the 27" iMac use the Cricut because... osx can't be upgraded. (Brave browser has hit that point too.)

With the old stuff it's fine because I figure I'll eventually put some sort of Linux on it and have a useful thing, still.

The new m* looks so nice. And quick. And sleek. And I just fear 5 years down the road, when it's still nice, and fast, and sleek, but you can't even open a webpage because Safari will not upgrade, and the world slowly crumbles around the hardware, which becomes... useless.

So it's not so much personalization as being able to just continue using what was purchased. I've been buying Pixels for the same reason. Maybe I won't keep it forever, but I upgraded from a Pixel 3 last year, and that's only because I broke a BMS wire behind the battery and said "well, a new phone is cheaper than a new house when it burns down because I'm a cheap ass".

So, does one sacrifice some new shiny to have a device that you can still use if the Apple gods (or whichever gods; it all seems to be forced obsolescence) decree otherwise?

Huge blabber rant, but it's such a weird market now, it seems. No problem going ARM on Apple; I just wish the Linux efforts seemed to be heading in a more positive direction (Hector quit? Future? Unknown?).

ARM on 'Windows' seems like a not-great idea; I'd expect weird SDR software not to work, etc.

So that leaves paying decent $ for perhaps a lesser product... like a Framework, but one that may last a longer time. Or at least work as long as the hardware works.

What do people do when faced with such (first world) things?


> But I fear the locking down of osx is likely to continue.

People have been saying this for as long as I can remember, and it has still never happened.

> Am I correct to worry about osx lock in, and 'sideloading' on osx being fully removed

Probably not

> vs pain in the ass

How is it even a pain in the ass now? Because you have to click through one dialog the first time you run an executable downloaded from the web?


Yes, the same “fear” people have been voicing since the Mac App Store was released in 2011…


I bought an iPhone despite the lack of sideloading because my family and coworkers use iMessage, so I must be on iMessage.


Give them the option and let’s find out how many really care.


What a concept. Like back in the cartridge game system era, where what the developers burned onto the ROM was the final game.


When QR codes first came out I thought they were really cool. But then, re-entering meatspace after the pandemic, I was honestly saddened to see so many in-person venues start using QR codes. QR codes are machine-readable, but they sure aren't human-readable. Why can't we have both? For instance, plain text using a low-pixel font with a dotted line underneath for error correction and alignment.
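
(A toy sketch of what I'm imagining, using a simple CRC32 checksum rendered as a dotted line. A real design would want actual error correction rather than mere detection, and the encoding here is entirely made up:)

    # Toy illustration: human-readable text plus a machine-checkable
    # "dotted line" encoding a CRC32 of the text (made-up scheme).
    # A scanner could OCR the text and verify it against this line.
    import zlib

    def check_line(text: str) -> str:
        bits = format(zlib.crc32(text.encode()), "032b")
        return "".join("." if b == "1" else "-" for b in bits)

    label = "MENU://JOES-CHICKEN"
    print(label)
    print(check_line(label))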


Yes, I'd love to see menus printed in OCR-A[0].

[0]: https://en.wikipedia.org/wiki/OCR-A


OCR-A looks cool, but my above post isn't saying I want the menus in a machine-readable font, but rather I want the human-relevant parts of the lookup to be both human-readable and machine-readable.


>...I want the human-relevant parts of the lookup to be both human-readable and machine-readable.

There is OCR-B[1] for that. It is widely used as the human-readable part of EAN/UPC barcodes used for laser scanners in retail. The relevant thing here is that the human-readable part is never OCRed in practice, because the barcode can be read much more reliably. OCR-B seems to be recommended simply because it is a well-specified (ISO standard), easy-to-use (no weird licensing), high-legibility font. Which is interesting, because, as mentioned elsewhere on this thread, you don't really need a special font for OCR anymore. So it is a commonly used font that no longer serves the original purpose of its design. If you trained your OCR only on OCR-B you would likely get some amount of accuracy improvement, but that would work for any high-legibility font.

So the problem would be convincing app designers to read both things, where one of those things is much more reliable. I guess that might make sense for things that require high security, where the false negatives are worth the bother.

Getting back to the original article, OCR-B only has the good OCR characteristics for the upper case Latin characters, like with the QR code size thing, and for the same reason. If you are identifying something you generally want to use the larger, easier to read (both for people and machines), upper case characters. The lower case glyphs were added later to OCR-B as an afterthought.

[1] https://en.wikipedia.org/wiki/OCR-B


I'm not sure I see the distinction. If the human-readable part is machine-readable, there's no need for a separate machine-only readable region. I'm on iOS, and selecting text from images is assumed by this point. I'm not sure of the SOTA on Android for that, though.


"I would love to see menus printed in OCR-A" just takes a regular human readable paper menu, and increases the success rate if you were to painstakingly scan each page to make a PDF with selectable text. That doesn't solve an actual need when sitting with the paper menu.

Taking a QR code that takes you to a URL, and writing out the URL in text does not enable anything, as you still must have an internet-connected device to retrieve the content, in which case you would most likely be able to scan the QR code anyway.

The point is to have e.g. a QR code, while still having human-readable content for those who don't have, or don't wish to use, a smartphone in this interaction. E.g., just having the list of things on the menu printed in plain text (e.g., on a wall, over the counter, paper menu, ...), but also having a QR code with images and the ability to order directly to your table - stuff beyond what text would get you.


> Taking a QR code that takes you to a URL, and writing out the URL in text does not enable anything, as you still must have an internet-connected device to retrieve the content, in which case you would most likely be able to scan the QR code anyway

I often see things that I want to "look up later" while I'm passing by, but they only have QR code. I don't know if it is worth the time to stop and scan but if there's a URL I can just read it and remember.


To keep an actual URL for later you'd have to write it down or take a picture of it (which also works for the QR code, either by following it later or keeping the tab open). URLs aren't human-readable or memorizable, nor are they meant to be.

"Easy to remember URLs" are just domains that mostly match the human-friendly name of the place. As you are not memorizing a full URL, you can only really go to the frontpage, and so you can just memorize the name of the place instead.

It's a different story if there's just a random QR code with no clear purpose of ownership, but... maybe don't scan that.


I always find it strange here when I share an experience and people want to tell me why what I experienced was technically impossible. URLs don't have to be human readable but they often are, and the frontpage is probably fine indeed but to get there I need to know the name of the thing or some search term when more and more often--and this is the point I'm making--the only thing I'm given is a QR code.


You misinterpret - I am not saying that what you experienced was technically impossible (you saw a URL, later recreating a valid URL at least partially), but that your interpretation of what happened is neither practical nor possible in the general case, nor is it necessary. Let's try with an example:

You see a poster. "Samsung Galaxy, for real this time. Tune in on the 1st of April to watch the ceremony live where we subjugate the last planet in the Milky Way."

Scenario 1: The poster has a QR code, and a URL: https://events.samsung.com/press-room/world-domination. You memorize the hostname (events, samsung, com), and half a day later you manage to pull up a page for events, select the intended one, and get to the live feed.

Scenario 2: The poster has a QR code, no URL. You memorize "samsung galaxy", or "samsung event", or even just "samsung", and half a day later you type this into your browser's address bar, which gives you as the first result the live feed you were looking for, or at the very least Samsung's event page.

"memorizable URLs" is not human-readable information, but computer-readable information constructed with certain rules to mimic human-readable information - e.g., the company name mangled to fit URL syntax. The original, unmangled information is easier to remember.


Scenario 3: the poster is a cool-looking image with people doing something that interests me. There is a QR code and no other information.

Scenario 4: the poster says something is happening, like "Neighbourhood dinner, Sunday at 7 -- Want to help cooking? Scan this code" -- There is a QR code but no information about who is organising it, where it is, or how else to contact someone about helping to cook.

Yes, a name serves just as well as a URL; the point is that people have begun to believe that a QR code is more convenient than text. Give me a link, give me a name, give me a search term, just give me something more than a QR code.


Then we agree.


Yes, a URL is much handier than a QR code, because I can read and remember it.


... no.

If you need to add human readable information (which is your scenario), a URL is never the right answer. Write a name or a sentence. Computer-readable information is for computers to read.


Just because something is OCRable doesn't make it structured data that can be used immediately. A table at a restaurant might have a QR code that takes me to a menu with the table number already encoded and pre-entered into the order page ready to go. An OCRable table number does not give me that, and an OCRable URL like https://fragmede.restaurant/menu?table=42 might work for HNers, but most humans won't recognise and understand their table number when going up to the bar to order.


"Fragmede.menu" costs $35 a year, which is roundoff error cost for a restaurant, and is a short-enough domain for a customer who wants to view a menu and order. No need for the "https://" which is implied. Adding a "?table=42" could be optional but isn't necessary, as the website in addition to simply presenting the menu could provide a means to order and if so have a little html input box when ordering to put their table number or whether it is pickup.


Sure it can be done, but there's no denying that a simple scan of a QR code instead of manually typing a URL would make life easier, as would some kind of alternative encoding technology that is more pleasing to the eye.


"if the human readable part is machine readable, there's no need for a separate machine only readable region"

So why are physical venues in 2025 presenting me with QR codes? If there is some randomish number (like another commenter pointed out, QRs may have a UUID in them) or a checksum, then encode those randomish bits as a dotted rectangular outline or a line underneath, so a big QR doesn't ruin my human experience.


I have no idea which restaurants you frequent, but take it up with them, not me. Having a URL written out, http://restaurantname.com/menu, on a sign that you can take a picture of and click on is functionally the same as having a QR code that links to the exact same URL.


Having a short URL like restaurantname.menu is not simply functionally the same as a QR code. A QR code is almost meaningless for me to look at... all that it says is that there is a code hidden within these dots, and I can maybe infer that the code contains a URL to a menu (and maybe contains random tracking stuff). But restaurantname.menu in text communicates to me that these letters are almost certainly a URL for a menu for restaurantname, and it is something that I can verbally say to the other people at my table and remember in my head.


So they can change the menu online without having to re-print it.


They can do that with a URL.


Scanning a QR code is far faster than typing a URL, and you need some sort of a computer to access the URL anyway, so providing a human-readable URL doesn't achieve anything.


A relatively short URL like restaurantname.menu is something a human can say and remember, so it does achieve something more useful than a QR code. I might even be able to type or speak a short URL into my phone quicker than I can find the QR code reader feature on my phone, point, and hold it still.


That's fine but it's moving the goalposts from the specified "reason for using a QR code" that I was responding to.


If they get you to use that QR code, they will get you to visit a URL and maybe show you ads, or sell your information to trackers. I mean with your tracking cookie, you visited a physical location, that's gotta be worth something.


In theory, if every QR in the building was different and they had the right sensors, they could also try to pair your browser fingerprint to your Bluetooth MAC, WiFi MAC, and your face. That pairing would have value on its own.


How is that different than a text url?


A simple short human-readable text url that a human can say and remember is not going to have a bunch of ".php?q=trackingcode&..." nonsense (though of course other trackers like cookies may still exist).

I don't know why the parent is getting downvoted... they bring up a very good point that hidden inside these QR codes are a bunch of tracking bits, and the QR's ability to include tracking stuff may sadly be a significant reason why we see QRs way too much.


Both the human and the QR-encoded URL will be shortened and use a redirect to the URL with the tracking info.


This is not the point.

Scanning a multi-page whole menu with a camera does not make it machine-readable. And frankly, the menu is not in the qr-code; it is a link to the menu. What sounds appropriate here is to also write the content of the qr-code in plain letters alongside the qr-code, so one can type it, for those who do not have a camera or whatever.

Now, the problem of requiring a mobile device to see the menu is a problem on its own, and while this is facilitated by having a qr-code vs. having customers manually copy links on their phone (either typing them or with OCR), these are 2 separate issues for discussion. Moreover, OCR-A is not needed for any of that anywhere in the 2020s that we live in.

The problems with QR codes are that sometimes people have older smartphones or cameras that do not recognise them, or, frankly, that they are a means of obfuscating something from humans. I have been in many situations with queues of people struggling for indeterminate reasons to scan a qr code to go fill something out. If there was a simple link in plain text, people could have even shared it between each other. Blindly trusting technology that can easily fail and is unnecessary for what one does, and without redundancies, is unwarranted imo.


You missed mine. If there is the text http://restaurant.com/menu.pdf, I can scan that just as easily as I can a qr code, thanks to advancements in OCR technology.


It's 2025; regular English is machine-readable.


There is "machine readable", and there is "consistently machine readable, in a limited time, under non-ideal lighting conditions, with part of the code obscured, using only cheap cameras and processors". Barcodes didn't just stop having a purpose in 2025.


Another common hangup is that a legible font needs to unambiguously distinguish between lowercase el (l), capital i (I), the number one (1), and the pipe symbol (|) - or at least, for instance, only deal with capital letters and numbers.
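
(This is the problem that, e.g., Crockford's Base32 sidesteps by dropping the confusable letters from the alphabet entirely. A minimal sketch, assuming you get to choose the code's alphabet:)

    # Generate codes from Crockford's Base32 alphabet, which omits
    # I, L, O, and U so that 1/l/I and 0/O can't be misread; decoders
    # then normalize any I/L -> 1 and O -> 0 they do receive.
    import secrets

    CROCKFORD = "0123456789ABCDEFGHJKMNPQRSTVWXYZ"

    def human_safe_code(length: int = 8) -> str:
        return "".join(secrets.choice(CROCKFORD) for _ in range(length))

    print(human_safe_code())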


If a restaurant has no way to allow customers to not use their phone I normally just leave.


Generally speaking, I'm with you. However, there is one use case that's exceptional: when you're with a large group where every sub-party will be ordering & paying separately. It can be a godsend to have phone ordering when 25-50 people descend on a restaurant all at once (my typical use case being kids sports teams + family members). It's absolutely not ideal for experiential dining where you're going for ambience as much as the cuisine, but it definitely expedites the ordering process and the ability to keep a tab open is a huge benefit.


It's fine to have the options, there just needs to be a phone-free option as well.


I find that surprising because in my experience QR code ordering systems are almost always worse than paper menus and ordering at the bar.


The comment you replied to was in agreement with you. They're saying they don't want to be forced to order via a QR code.


Thank you for pointing this out! I think I missed a double negative.


Yes, if I cannot eat without pulling out my phone, then no thanks, I will leave my money elsewhere (which is what parent said).


Not a bad way to make a point locally, but wow are QR codes nice when you’re traveling and don’t speak the language. You get the menu, in a browser, with all of the translation and parsing tools on your phone.


(Offline) camera translation is the answer.


Having ended up in a situation where I attempted this:

no.

It is not the answer, it is a frustration where you wonder what "bean massacre pastry" is (chopped nut cookies, aka slivered almond cookies) or what they mean by "Surprise coriander special" described as "flavor of comatose with many spices in hot cow" as the translation. The accurate translation would be "mixed spice beef special" and "Beef with spice and vegetables."

Cameras are better than they were 10 years ago, but machines do better when they have real sources in front of them.


You know that you can scan/highlight the raw text and translate individual portions?

And sometimes, say in Chinese cuisine, the dishes are indeed using some flourished language. You get your translation and a peek into their culture. Win/win.


no, just looking at the properly translated menu on my phone is so much nicer


Waiting for Apple watch to have a camera.

And __not__ this one: https://wristcam.com


Seems unlikely. The day I can't wear my watch in the changing room is the day I stop wearing it to the gym, and therefore stop wearing it altogether, and therefore stop buying new ones every 5ish years.


Why can't you wear a watch with a camera in the changing room?


It's normally forbidden because most people don't want to end up naked on some weird website without their consent.


But lots of people use their phones in the changing room ...


it's usually not allowed to have recording devices in places where people are naked.


I think what used to be considered "usually not allowed" is no longer true, and it's a sign of one's age that you even remember not being able to use a camera in some places/times. Someone could be using their phone without using their camera, and they will look just like someone who is using their camera. People have become numb to it now.


> look just like

In my experience, there is usually an obvious difference between seeing a phone used as a camera versus not, based on whether it's aimed at an interesting subject or not. There are exceptions, like sitting with your elbow propped on the arm of a chair for a few minutes to avoid fatigue, which causes the phone to be at eye level and therefore perfectly vertical, but this is rare.


s/will/can/


Agreed. I like how most 1D barcodes have human-readable numbers/text printed under the barcode. For example, think of UPC barcodes on retail products. Not many 2D barcodes respect this convention.


This is directly caused by UPC codes being numerical and short, while 2D barcodes have significantly higher data density, often in the ASCII space, where human readability does not bring much of an advantage.


What does 1D barcode mean? I can think of no bar that can be represented in 1D.


A normal UPC barcode, as the parent said. The data is read along the horizontal dimension. The only reason the lines extend in the vertical direction is to make them easier to scan.


You're taking it too literally. 1D is the industry term for such code, to mean that data is encoded in one direction/dimension.

Getting hung up on the fact that the printed code needs some height to work isn't productive to the conversation.


I'm not hung up on it, nor trying to be unproductive. I asked a simple question and then got chastised for it. That's not productive at all. In fact, someone posted a much more productive response much earlier than you did, so you very much added nothing to the conversation.


Whenever people are suggesting adding a QR somewhere (ad, in-app, etc) I always advocate for showing the short URL too. But about half the time they insist that "everyone knows how to scan a QR code". They clearly haven't tried to ask a few people to scan a QR code to see how easily most people do it.


I've seen QR codes to join a discord that have text under them that looks like

   discord.gg/{... a few random characters ...}
which are just fine to scan or type in.

My own 'discovery' about QR codes a few years ago is that you can make them "module 2" sized, which ought to be easy to read with a low-spec system, and still have astronomical capacity if you use uppercase characters, a reasonably short domain, and identifiers similar to random UUIDs. These were part of the system of "three-sided cards"

https://mastodon.social/@UP8/111013706271196029

but new-style cards put the QR code in front because (1) I have a huge amount of glossy paper that I can't print on the back of, (2) you can't read the QR code on the back if the card is stuck to the wall with mounting putty, and (3) three-sided cards struggled with branding, in that people didn't really understand the affordances they offered, a problem that the new-style cards attack in various ways.

https://mastodon.social/@UP8/113541119391897096

(Note the QR codes on both of those cards do not point at safebooru but at a redirect that I control that fits my QRL specification)

Personally I don't think any QR code for the web should ever require more than a "module 2" QR code and that printing a QR code which requires extra alignment markers is a sign of failure. (e.g. sure you can make a QR-code with 2000 bytes of form data embedded in it, but should you? Random UUIDs are so numerous and redirects so cheap that every new-style card like that Yakumo Ran card has a unique id because with inkjet printing it doesn't cost anything more)
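
(If you want to sanity-check the uppercase trick, here is a quick sketch with the Python "qrcode" package. Uppercase-only strings qualify for QR's denser alphanumeric mode, so the same URL fits in a smaller symbol:)

    # Alphanumeric mode packs 2 chars into 11 bits vs. 8 bits per char
    # in byte mode, so an uppercase URL needs a smaller QR version.
    # Requires: pip install qrcode
    import qrcode

    def min_version(data: str) -> int:
        qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_L)
        qr.add_data(data)
        qr.make(fit=True)
        return qr.version

    url = "EXAMPLE.COM/MENU/ABCD-1234-EFGH-5678-XYZ"
    print(min_version(url))          # uppercase -> alphanumeric mode, smaller version
    print(min_version(url.lower()))  # lowercase -> byte mode, larger version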


That's basically converting a QR scanner into a text detector. It might work, but why does it need to be human-readable? Most of the encoded string would be a UUID that's useless to human eyes anyway. After scanning, the important info will usually show on the phone screen anyway.


I want it human-readable so I'm not presented with QR codes when I'm in meatspace. I'd rather see a pixel font that says MENU://JOES-CHICKEN when I want to look up the menu at my local chicken restaurant than look at a QR.


If part of the data is just a random number like a UUID that is meaningless to a human, well, that number could simply be placed after the pixel-font text or as a rectangle outline... if using a 3x5 pixel font, then that gives 4x5=20 bits per character cell, so a 128-bit UUID with 12 bits of overhead could fit within the space of 7 characters... fine... but at least put the parts of the info that the human can relate to (like that this is a MENU for RESTAURANT) as pixel text so both the human and machine are happy.
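
(The bit budget, spelled out under the same assumptions - 3x5 glyphs plus a 1-pixel gap, so each character cell is 4x5 pixels, one bit per pixel:)

    # 3x5 glyphs with a 1-px gap -> 4x5 = 20-bit character cells.
    import math

    CELL_BITS = 4 * 5            # 20 bits per character cell
    payload = 128 + 12           # 128-bit UUID + 12 bits of overhead
    print(math.ceil(payload / CELL_BITS))  # -> 7 cells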



There is no "after the pandemic" (yet). The pandemic is still ongoing. (Source: WHO)


No one in the real world actually acknowledges this as being fact. Further pushes by the WHO just lower trust even more.


So glad most people in the “real world” are experts in health and epidemiology.

Taking the same stance as most people has never been wrong.


100%, I remember this being a big pain during the period where places were open but you had to order from the table. If your phone didn't want to scan the code you were kind of stuck - and to make it worse some of them _deliberately_ degraded the code to add a cutesy logo or whatever.


And because turnout was only 63.9%, only 31.8% of the voting-eligible population voted for him. More of the voting-eligible population didn't vote at all than voted for him.


It's debatable to claim the 4004 as "the first microprocessor". It's safer to specify it as the first "commercially-available general purpose" microprocessor. See https://en.wikipedia.org/wiki/Microprocessor#First_projects for a few pre-4004 chips that are also debatably the first microprocessor:

- Four-Phase Systems AL1 chip (1969), which was later demonstrated in a courtroom hack to act as a microprocessor (though there is much debate on whether that hack was too hacky)

- The F-14 CADC's ALU chip (1970), which was classified at the time

- Texas Instruments TMS 1802NC (announced September 17, 1971, two months before the 4004), which is more specifically termed a microcontroller nowadays, but nevertheless the core was entirely inside a single chip


I do not consider the 4004 "general purpose".

It was designed for implementing a desktop calculator, not a general-purpose computer. With some effort it could be repurposed to implement a simple controller, but it was completely unsuitable for implementing the processor of a general-purpose programmable computer.

For implementing a general-purpose processor, it is likely that using MSI TTL integrated circuits would have been simpler than using the Intel 4004.

The Intel 8008 (which implemented the architecture of the Datapoint 2200) was the first commercially-available monolithic processor that could be used to make a general-purpose computer, and which was actually used for this.

Around the same time as the first monolithic processors, Intel invented the ultraviolet-erasable programmable read-only memory.

The EPROM invented by Intel was almost as important as the microprocessor for enabling the appearance of cheap personal computers, by avoiding the need for other kinds of non-volatile memories for storing programs (e.g. punched-tape readers or magnetic core memories), which would have been more expensive than the entire computer.


I get your point. I was speaking in relative terms about it being "general purpose" and probably should have instead said "that can run a program from an external ROM"... and in that aspect it is more "general purpose" relative to the TMS 1802NC, which could only run a fixed program burned into its internal ROM. Nevertheless, while it is unsuited for use as a general-purpose processor, it has indeed been proven to be capable of running Linux (albeit by emulating MIPS).


For what it is worth, Intel does refer to the 4004 as the "first general-purpose":

"That’s when the Intel® 4004 became the first general-purpose programmable processor on the market—a "building block" that engineers could purchase and then customize with software to perform different functions in a wide variety of electronic devices." https://www.intel.com/content/www/us/en/history/museum-story...

"The 4004 would replace that system with a general-purpose chip that could be mass produced and then programmed through its software to perform specific functions, such as those of a desktop calculator. That idea could make computing cheaper, more powerful and smaller in one fell swoop. It could, in other words, facilitate the modern information age. In 1969, the Nippon Calculating Machine Corporation approached Intel to design 12 custom chips for its new Busicom 141-PF printing calculator. Intel's engineers proposed a new design of just four chips, including one that could be programmed for use. That programmable chip, later known as the Intel 4004, became the first general-purpose microprocessor." https://www.intel.com/content/www/us/en/history/virtual-vaul...


What they say now is irrelevant, because it is retconned to increase the apparent importance of what they did in 1971 vs. what they did in 1972.

Moreover, you can be pretty certain that whoever wrote that text never read the datasheets of the Intel 4004, so as to be able to evaluate whether it was "general purpose" or not.

Even at its launch in 1971, when Intel began to offer the 4004 for sale because Busicom was unable to pay the price Intel desired for it, Intel's marketing attempted to present the 4004 as much more general-purpose than it really was, in order to find customers for it and ensure the profitability that Busicom could not provide.

During its design, the 4004 was never intended to be general-purpose, because the plan was to sell it to a single customer. Only after delivery to the intended customer did Intel's marketing make great efforts to find other possible applications for it.


Inexpensive personal computers weren't shipped with EPROMs; they were shipped with mask-programmable ROMs. EPROMs were used in development, but they were nowhere near as important as the microprocessor.


Mask-programmable ROM could be used only by companies whose production of computers had grown enough to make it worthwhile.

Moreover, in the beginning Intel was also the main producer of mask-programmable ROMs, which were launched at the same time as the corresponding EPROMs, with the 23xx mask-programmable ROMs corresponding to the 27xx EPROMs.

All these part numbers belonged to a system used by Intel, where the first digit was "1" for PMOS, "2" for NMOS and "3" for bipolar, with the second digit being the kind of IC, e.g. 21xx for RAM, 23xx for mask-programmable ROM, 27xx for UV-EPROMs and 28xx for electrically-erasable PROMs.

Other manufacturers started producing memories after a delay of a few years; most of them made memories compatible with those introduced by Intel and kept the 23xx and 27xx Intel part numbers.


Didn’t PROM come before EPROM? While I agree EPROM enabled easier testing, PROMs would fit the bill once their contents got stable.


Bipolar PROM was too small to contain the equivalent of the BIOS of a computer in the early seventies.

Bipolar PROMs were initially used mainly to store microprograms for CPUs with microprogrammed control and for implementing various kinds of programmable logic, in which role they were later replaced by PLAs, then by PALs.

I do not think that there has ever been any kind of computer that stored, in bipolar PROMs, programs that were usable during normal operation - except perhaps some embedded controllers designed before microprocessors became widespread, together with their associated EPROMs.

By the time the Intel EPROMs like the 1702 and 2708 appeared (with capacities of 256 bytes, then 1 kbyte), typical bipolar PROMs had capacities of either 32 bytes or 128 bytes.

In that space you could put at most some kind of initial loader that would load a real bootstrapping program from something like a punched-tape reader. This kind of solution was used in some minicomputers, replacing the operator's entry of such an initial loader via the console switches. Such minicomputers were still at least an order of magnitude more expensive than the first computers with microprocessors, mainly due to the expensive peripherals required for a working system.


The Apple I had a single 256-byte bipolar PROM for its WozMon debugger. Though the Apple II used ROMs (EPROMs were popular in clones), the expansion cards were still supposed to use tiny PROMs for their drivers (in the case of the disk controller there was even a second PROM for the state machine - Woz did love these devices).

The Altair 680 had sockets for four 256 byte PROMs. One was for the debugger, but a popular option for the other three was the VTL-2 (very tiny language) interpreter. Pretty amazing that they fit a Basic-like language in just 768 bytes, though implementing some language in the 510 byte boot sector has become a popular hobby.


That was just EFF's first recommendation. Their next recommendation "Install Privacy Badger to Block Meta’s Trackers" is much more effective.

