The capacitor that Apple soldered incorrectly at the factory (downtowndougbrown.com)
535 points by zdw 10 days ago | 259 comments





Well, today I learned to install one capacitor in reverse orientation on the PCB of a 34-year-old computer...

Definitely starting Wednesday off productively.


Well, until today I didn't even know a capacitor could have an orientation! So, a more productive Wednesday than yours. In the entry-level electronics class I took decades ago, it was always treated as a component that works the same way no matter which direction the current flows.

Ceramic capacitors don't have polarity. Electrolytic ones do. Thing is, electrolytic capacitors have far higher capacitance for their size -- though also higher resistance.

It's something to check, but the polar ones should be clearly marked as such.


Electrolytic capacitors are kinda like lead-acid batteries in that they are polarized by the manufacturing process. A voltage is applied in the factory to anodize the anode with a thin oxide layer. For fun, I think it would be possible to buy a quality low-voltage cap and reverse its polarity in-situ, which would remove the anodization from the new cathode and deposit a new layer on the new anode (the former cathode), hopefully without over-pressurizing it to the point of bursting, albeit with a much shorter anticipated lifespan.

PSA: Electrolytic capacitors have a rough lifespan of 10 years. Any that are much older than that need to be checked out-of-circuit for ESR and then capacitance. Also, tantalums (historically) suck(ed). [0] Quality audio equipment from the '80s like a/d/s/ car amps used only ceramic caps and other over-engineered passives, and has the potential (pun intended) to basically last forever.

0. https://www.eevblog.com/forum/projects/whenwhy-(not)-to-use-...


Or much shorter, around two years, if it was part of the Capacitor Plague.

https://en.wikipedia.org/wiki/Capacitor_plague#Premature_fai...

   The normal lifespan of a non-solid electrolytic capacitor of consumer quality, typically rated at 2000 h/85 °C and operating at 40 °C, is roughly 6 years. It can be more than 10 years for a 1000 h/105 °C capacitor operating at 40 °C. Electrolytic capacitors that operate at a lower temperature can have a considerably longer lifespan. ... The life of an electrolytic capacitor with defective electrolyte can be as little as two years.

This is also why so many LED bulbs are shit: lots of heat in a small space full of electrolytic caps.
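
Those figures line up with the usual rule of thumb that electrolytic lifetime roughly doubles for every 10 °C you drop below the rated temperature. A quick back-of-envelope sketch (my own numbers, assuming continuous 24/7 operation):

    # Rule of thumb: electrolytic life roughly doubles per 10 C below the rated temperature.
    def estimated_life_years(rated_hours, rated_temp_c, ambient_temp_c):
        hours = rated_hours * 2 ** ((rated_temp_c - ambient_temp_c) / 10)
        return hours / (24 * 365)

    print(estimated_life_years(2000, 85, 40))   # ~5.2 years: the "roughly 6 years" case
    print(estimated_life_years(1000, 105, 40))  # ~10.3 years: the "more than 10 years" case

Push the ambient up to 70-80 °C inside an enclosed fixture and the same math knocks it down to a year or two of on-time.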

I recently read that if you are going to use an LED bulb in an enclosed space, buy bulbs designed for the higher temperature; otherwise you WILL get premature failures in bulbs that would last for years in ordinary lamps.

https://duckduckgo.com/?t=lm&q=led+bulbs+enclosed+fixture+ra...


Alternatively, there are now much more efficient bulbs available. If they're rated A under the new EU energy label (from 2021), they'll barely be warm to the touch.

Intentional planned consumption/obsolescence by design. This class of problem is where under-regulation and a lack of standards benefit only sellers and cheat buyers. PS: Also, Amazon should be required to test all of the electronic, safety, and food products on its site such that they can prove safety and standards conformance.

I am assuming you are an EE (like myself)... I have never designed a product with a built-in expiration, nor have I ever seen any app notes or write-ups on the engineering of it - something engineers love to do.

What I have seen done is cheaping out on parts in order to get the price as low as possible, because customers shop primarily on price.

Not to lash out, but it kind of hits a nerve for me, because people think we design products to purposely fail. Hell no, we try really hard to do the opposite, but everyone just loves to buy the cheapest shit.

The $25 LED bulb that will last for eternity will rot on the shelf next to the $3 bulb that will probably be dead in 6 months. And one more "they build these things to fail" complaint will be posted online.


To be fair, this is hardly limited to EE; it's the issue with the race to the bottom in all product categories. Make long-lasting, high-quality $100 pants? People prefer spending $10 on Shein.

Additionally, the issue is that, as a consumer, it's not easy to differentiate between quality markup and greedy markup. I don't see the cap manufacturer on the box, so the $25 light bulb might last 10 years or it might last 6 months just like the $3 one. At least with the $3 one I can come back and buy another...


> as a consumer, it's not easy to differentiate between quality markup and greedy markup

This is so true and insightful. Even plenty of so-called “luxury goods” are trash. And outside of luxury, it seems like most markets are dominated by roughly 90-99.9% of products that are all identically garbage, with profit margins ranging anywhere from 1-2% up to 800%. So the person buying the high-end brand and the one buying from Aliexpress both have the same quality components (trash tier) and sometimes the same PCBs and designs. If you want a good quality refrigerator, you can buy a Viking or something for $9,000, or spend any amount of your choice from $800-5,000 for something that will die in 2 years and be outside of warranty. It makes my blood boil, especially in areas like appliances, where the amount of waste, by tonnage, is super offensive.


It's also not like the Viking refrigerator will last appreciably longer than a cheaper make... their warranties can be noticeably shorter than their competitors', even though they somehow charge 2x-10x the price.

> Make long-lasting, high-quality $100 pants? People prefer spending $10 on Shein.

This is just fashion, right? As something becomes commoditised, it starts to become subject to fashion, which means cheap and looks fashionable is more important than durability. So you can buy one every year and keep up, and throw away the old one.


I agree with what you said - engineers do the best they can with the budget but the budget is small because people won’t pay for things that last - but it’s worth saying that any boards with electrolytic capacitors have an inherent built in expiration. Any product with rubber has an expiration. Any product with permanent batteries, glued or sealed assemblies, or no spare parts. Much of that is with the customer’s budget, sure. But these days, even among expensive things, nearly nothing is built to last.

The problem is that consumers cannot tell whether the more expensive one is high quality or whether it is just the same as the cheap one, just priced higher.

There have been plenty of discussions on HN about brands that used to produce durable products no longer doing so. I mostly buy cheap stuff because I assume that everything will be built as cheaply as possible, so I will get something that will not last anyway.


Yes, so tragic really. The few people out there not just churning out disposable crap are not even getting the business of people like you and me, because we know so few can be trusted that we just buy the cheapest thing.

I seriously doubt it's ever a deliberate conspiracy in engineering apart from shenanigans like what happened at VW, but it's the net effect of product managers, accountants, and contract manufacturers who modify PCBs and BOMs after it's passed off to them to save money on retail products. And so it's likely unintentional negligence, but it benefits the company. Except for some Samsung appliances made ~ 2010-2014 which seemed to fail just after their warranties expired. I suspect highly-optimized designs for "consumables" like incandescent lightbulbs and parts for cars use data to tweak design life, more often than not, in their favor. And, with the pressures of multinational oligopolies and BlackRock/Vanguard/State Street.. there is little incentive to invest $100M into a moderately-superior incandescent lightbulb using yesterday's technology that lasts 100kh and 5k cycles and sells for $1 more than the next one. Maybe if we (perhaps a science/engineering nonprofit thinktank that spanned the world and gave away designs and manufacturing expertise) had quasi-communism for R&D, we could have very nice things.

It's not my fault if other people are too dumb to comprehend TCO because I would buy the $25 bulb if it had a 30 year warranty.


> It's not my fault if other people are too dumb to comprehend TCO because I would buy the $25 bulb if it had a 30 year warranty.

A 30 year warranty would certainly make a difference in the decision making. But more typically you see the $3 bulb, with a 1yr "warranty", next to the $25 bulb, and the $25 bulb either has an identical 1yr warranty, or has a warranty period not commensurate to the price difference, such as a 2yr warranty.


> Except for some Samsung appliances made ~ 2010-2014 which seemed to fail just after their warranties expired.

And? That just sounds like they have good engineers. If you are designing a machine, you have a target lifetime. You'd obviously want the product to last through the warranty period, because warranty claims are a cost to the company.

Every choice of a component affects lifetime. Designers of mass-market products can't just use premium components everywhere -- the mass market will not pay steep premiums for otherwise equivalent products.

Value engineering and planned obsolescence are not the same thing, but they are often confused.

That being said, Samsung appliances suck and I hate them. Mine failed within warranty several times.

> And, with the pressures of multinational oligopolies and BlackRock/Vanguard/State Street.. there is little incentive to invest $100M into a moderately-superior incandescent lightbulb using yesterday's technology that lasts 100kh and 5k cycles and sells for $1 more than the next one.

It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.

If you release a light bulb that is identical to the best selling one on the shelf, but claims 10x lifespan, your competitor will do something like gluing a weight in theirs, putting some marketing BS on the box, and will put you out of business. Consumers just don't pick products based on actual quality.


You're making a pretty awkward value judgement about what a "good" engineer is, but you're describing an unethical one with a bizword like "value engineering". I realize ethics are no longer understood by much of Western society because the culture teaches transactionality, worships trickle-down economics and greed, and hyperindividualism.

> It isn't that. It's pressure at the shelf that does it. Consumer behavior simply does not reward equivalent-feature products with premium components that claim (true or not) to have a longer lifespan. Unfortunately, they will buy based on their uninformed sense of quality first.

This is a failure of the sales channels' and manufacturers' marketing and buzz to educate properly, not a failure of the customer.


A good engineer is one that has a job, doesn't put their employer out of business, and produces work that fulfills the requirements they're given.

Many people think there's some unethical conspiracy going on, and consumers actually want a product that lasts a long time, but companies are refusing to give it to them. But this is projection of individual preferences on to the market as a whole. Consumers want cheap shit that is in fashion, and their buying preferences prove this time and again. Maybe you want a 50 year old toaster in your kitchen, other people are buying products based on other factors.

If consumers really wanted to pay a premium for high duty-cycle equipment with premium lifespans, they can already do that by buying commercial grade equipment. But they don't.

If you are familiar with the history of home appliances, you'd probably come to appreciate the phrase 'value engineering'. Even poor people can afford basic electric appliances now because of the ingenious ways that engineers have designed surprisingly usable appliances out of very minimal and efficient designs.

If you look at ads for electric toasters 100 years ago, you'd see they cost over $300 in today's money adjusted for inflation. Thank god for value engineering.


A good engineer provides value to society. If they fulfill requirements that are bad for others then they are not good engineers.

It seems to me that there is also a social dynamic to things. If consumer-grade products become a race to the bottom, then it is going to become more difficult for regular people to purchase products which aren't low quality. There's also a degree to which society (e.g. in the form of government policy, cost of living adjustments, etc.) factors in differences in prices.


The fact that poor people can now afford to own some household appliances isn't a huge value to society?

It completely changed the way our societies operate. I think it is a good thing that people have the option to buy crappy washing machines, rather than being forced to use the washboard and bucket my grandmother used. Yeah, they sometimes do develop a bad belt, or the timer mechanism might fail. But it beats being unwillingly forced into homemaking as a career.

The world only has so much wealth to go around, and that isn't the moral quandary of the engineer picking an item on a BOM on Tuesday morning to fix. If anything, squeezing a few more pennies out of that BOM is going to lift some people at the fringes out of poverty. At the opposite end of the product value equation, every unused and functional component in every product that is no longer in service, is wealth that is wasted that could have been spent elsewhere.


(Butting into your 2-person conversation to point out:)

> At the opposite end of the product value equation, every unused and functional component in every product that is no longer in service, is wealth that is wasted

This also is a counterpoint to your position though: of everything that goes into say, a fridge, it’s all wasted in 5 years now because Samsung chooses to put a PCB that is barely fit for purpose and will just fail with an error code, and because instead of putting such a failure-prone part behind a door and using an edge connector so it can be swapped in 5 minutes, they bury it God-knows-where in the chassis requiring an $800 labor charge, and charge $300 for the part. (As though that PCB is actually more complex than an iPad logic board, lol). So the whole 600 pounds of steel, refrigerant, insulation, glass, ice maker, and the compressor goes to the dump since who would invest $800 in a fridge that could have the same failure in a month and only the part is warrantied (you have to pay labor again). The poor people you’re worried about are buying these components over and over again because the appliance makers like this system. All this is done in bad faith. They’re morally bankrupt compared to their grandfathers who made appliances that lasted decades.


> If anything, squeezing a few more pennies out of that BOM is going to lift some people at the fringes out of poverty.

If it squeezes a small but solid chunk out of product lifetime too, then it's also likely to harm people on the fringes. If they can buy it with one less month of savings, but then it breaks a couple months earlier, they're probably worse off. (For actual pennies divide both of those numbers by some orders of magnitude.)


Yeah, walk up to someone in the hood and tell them that for 15% more they could have got [insert product] that lasts 2x as long. You're gonna get punched in the face, because they already know that. They're not dumb. What you're missing is the time-value of owning something now, which is greatly amplified when life is tough.

People don’t want to walk their clothes basket down to the laundromat for one more month while they save for the nicer washer that lasts a little bit longer. They want the cheap one now, because they just got off some shitty shift at work, and they’re sick and tired of lugging their laundry down the street. Having a quality washer [x] years from now is not a desired part of the equation. Immediacy is of higher value.


1. Immediacy doesn't help once it breaks and you can't buy a new one for years.

2. If everything lasts twice as long for 15% more, you can get a half-expired used one for even cheaper.

> they already know that. They're not dumb.

I think they're not dumb and they already know it's extremely difficult to figure out which brand fits that criteria, if any, so it's not worth it because it's such a gamble.


> it's not worth it because it's such a gamble.

That's also true. At the individual unit level, small differences in MTTF/MTBF are negligible because product failures are naturally distributed anyway. The mean time is just a mean, and nobody gives a shit about a good mean product failure rate when theirs happened to fail below the mean. That's true no matter how much you spent.


> If consumers really wanted to pay a premium for high duty-cycle equipment with premium lifespans, they can already do that by buying commercial grade equipment. But they don't.

That costs a ton. I just want a better lifespan, I don't want to 20x the duty cycle and also pay B2B prices.

It's too hard to figure out which consumer products have a better lifespan, so companies do a bad job of catering to that need. This makes companies try too hard to be cheapest, and they often fall below the sweet spot of longevity versus price. Then everyone is worse off. That's not the fault of the engineer but it still means the engineer is participating in making things worse.


> That costs a ton. I just want a better lifespan, I don't want to 20x the duty cycle and also pay B2B prices.

Therein lies the problem. A more durable product exists, and yet, even you don't want to pay more for it. And you are likely much more privileged than the rest of the world. What do you expect the rest of the world to be doing? Most of the world isn't picky about whether their hand mixer has plastic bushings or ball bearings. They're choosing between any appliance at all and mixing their food with a spoon.

> It's too hard to figure out which consumer products have a better lifespan, so companies do a bad job of catering to that need.

There are many companies that try to break this barrier over and over, with tons of marketing material proclaiming their superiority. Why do they all fail? Because their hypothesis is wrong. The majority of the mass market doesn't want appliances that last for tens of thousands of hours. Most people use their appliances very lightly and for short periods of time before replacing them.

I think a lot of people on this forum have points of view tainted by privilege. Poor people aren't dumb; they know that they are buying cheap stuff that doesn't last as long as more premium options. They're making these choices intentionally because a bird in the hand is worth more than two in the bush to them.


> A more durable product exists, and yet, even you don't want to pay more for it.

This is disingenuous as hell.

I want to buy a version that cost 15% more to make. I don't want to buy a version that cost 3x as much to make (or is priced as if it does).

When I can't find the former, that is part of problem. When I don't buy the latter, that is not part of the problem.

> They're making these choices intentionally because a bird in the hand is worth more than two in the bush to them.

The best way to have the most birds in hands on an ongoing basis is to optimize for both price and lifetime per dollar, not just price.


> I want to buy a version that cost 15% more to make. I don't want to buy a version that cost 3x as much to make (or is priced as if it does).

Well, it just doesn't work that way. Premium components that truly extend product life are multiplicatively more expensive than what you'll find in value engineered products, if not exponentially so. Furthermore, a product that is 15% more expensive than competitors won't sell 15% fewer units, it will sell significantly fewer units, and then your fixed costs will also be higher, on top of the higher BOM costs.

Quality products with measurably longer lifespans in pretty much any product category are significantly more expensive than lower quality equivalents. The entire global manufacturing industry isn't in on some conspiracy.


You are the one that came up with the 15% number. And you also said that the difference between a 5 year washing machine and a 30 year washing machine is probably "a matter of tens of dollars". Why is it suddenly multiplicatively or exponentially more expensive? There's a lot of low-hanging fruit, and I'm focused on consumer duty cycles.

It's not a conspiracy when a product that is somewhat more expensive but lasts much longer per dollar doesn't sell well, but it is a market failure.

I think it if was clear at a glance that such a product lasts much longer, there'd be enough buyers to avoid the low-volume costs. At least in many markets for many kinds of product.


The 15% number was in a different context in a different thread: "walking up to someone" in public, which presumably wouldn't be about an appliance because people keep those in their homes and don't walk around on the street with one. In that context I was thinking potentially a clothing accessory or something. Either way, the point in that thread didn't matter, because it wasn't about the number but about the socioeconomic impact of immediacy on quality of life. Doesn't matter if it was 10% or 1000%.

> And you also said that the difference between a 5 year washing machine and a 30 year washing machine is probably "a matter of tens of dollars". Why is it suddenly multiplicatively or exponentially more expensive?

You're conflating my statements about BOM costs and the final price of the product, which are two entirely different things. Demand is not a constant for your product at any price (because the market is likely elastic, and you have competitors). Demand will go down as price increases, often sharply. If you add tens of dollars in BOM cost, your product sells fewer units as a result, and now you have fewer units to spread the (potentially significant) fixed costs across. So, unfortunately the tens of dollars in BOM cost might mean hundreds in cost to the end consumer.
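
To put toy numbers on that (entirely made up; only the shape of the effect matters, not the specific figures):

    # Made-up figures: how a BOM bump plus lower volume compounds at retail.
    fixed_costs = 10_000_000          # tooling, certification, line setup, marketing...
    def retail_price(bom, units_sold, margin=1.4):
        return margin * (bom + fixed_costs / units_sold)

    print(retail_price(bom=150, units_sold=500_000))  # ~$238
    print(retail_price(bom=180, units_sold=150_000))  # ~$345: +$30 of BOM became ~+$107 at retail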


I think if there was a way to see the quality the sales would not drop like that.

But if you insist they would, then we can talk about a world where that level of quality is the minimum. Somehow. I don't really care how. It would be better, yeah?


If you really care, you can still get an excellently-engineered $300 toaster today https://www.youtube.com/watch?v=IxAbz9mfaj0

Engineers are to consider public safety first. This is not negotiable for real hardware engineering. Poor people could always purchase used appliances.

I agree that products shouldn't be unsafe. And value engineering does not mean making products unsafe.

> Poor people could always purchase used appliances.

The reality in mid 20th century US demonstrates this isn't the case. Most went without the modern appliances that are commonplace today.


> I realize ethics are no longer understood by much of Western society because the culture teaches transactionality, worships trickle-down economics and greed, and hyperindividualism.

You realise incorrectly, I would say. It's very defensible to claim that Western society has produced - by a giant margin - the most social, economic, and technological advances in history, and to boil it down to this is just a bit silly, in my opinion.


> And? That just sounds like they have good engineers. If you are designing a machine, you have a target lifetime. You'd obviously want the product to last through the warranty period, because warranty claims are a cost to the company.

> Every choice of a component affects lifetime. Designers of mass-market products can't just use premium components everywhere -- the mass market will not pay steep premiums for otherwise equivalent products.

Dying just out of warranty is only okay if the warranty covers the actual expected lifetime of the product. And for appliances, it doesn't.

The difference between a 5 year washing machine and a 30 year washing machine is not very big. Anyone pinching those specific pennies is doing a bad thing.


If you think people want 30 year old washing machines, you're kidding yourself. Do you remember what washing machines were like in the 1990s? They were noisy and tore up clothing. Not only would I not want to use one of these outdated machines, nor display it in my home, but I also wouldn't have wanted to move it to the dozen different addresses I have lived at since then.

At least in the US, people move frequently, and a washing machine that lasts for decades isn't even a benefit, because they'll likely have left it behind.

> The difference between a 5 year washing machine and a 30 year washing machine is not very big. Anyone pinching those specific pennies is doing a bad thing.

Absolutely right, it's only a matter of tens of dollars, probably. However, retail consumer appliances live and die at the margins. Nobody is opening up their washer to inspect the components to see if the $510 washer has better components than the $499 washer. All else equal, they're buying the $499 washer 90% of the time. Your fixed costs are going to eat you alive when spread across your fewer units, and retailers will stop carrying your product because it isn't moving.... All the while the $499 washer is going to be sitting in that home 5 years from now when the realtor puts a sign out front. And literally zero people are buying a house based on the bearings in the washing machine.


> If you think people want 30 year old washing machines, you're kidding yourself.

You say this in the same breath you talk about people being desperate for any cheapest appliance instead of having nothing?

> And literally zero people are buying a house based on the bearings in the washing machine.

Well that's them being dumb.


> You say this in the same breath you talk about people being desperate for any cheapest appliance instead of having nothing?

Yes? I think you’re suggesting that the existence of old machines would be good for the poor. That’s true. However, manufacturers don’t make used machines. They only make new ones. So the forces of supply and demand do not apply.

> Well that's them being dumb.

Wat. No I’d say choosing a home based on location, school district, or inherent qualities of the home itself is a less dumb idea.


> Yes? I think you’re suggesting that the existence of old machines would be good for the poor. That’s true. However, manufacturers don’t make used machines. They only make new ones. So the forces of supply and demand do not apply.

As far as I can tell you were discounting the value of old machines, and suggesting long durability wasn't useful. I'm glad you agree they're useful.

And I know manufacturers make new machines. I'm suggesting that if the cheapest machines were much more durable at not-much higher prices, the end result would be better for everyone including the poor people that would otherwise have bought the even cheaper model.

(If we switched cold turkey it would be worse for them for a couple years before it got better. So let's not switch cold turkey. But that's not a reason to act like the current situation is anywhere near optimal. It's great that appliances have gotten massively cheaper than they used to be, but we could do even better.)

> Wat. No I’d say choosing a home based on location, school district, or inherent qualities of the home itself is a less dumb idea.

Wat. Do you think that's an either-or choice?

It's reasonable to say people choose a home based on price, right? If there's a washer and dryer as part of the package, the expected lifetime is basically an offset to the price.


> Wat. Do you think that's an either-or choice?

Often worse -- in many markets a buyer will pick whichever available option has a plurality of their preferences. Most buyers are going to prioritize the location, size, and permanent qualities of the home, and that's going to narrow them down to a short-list of options. Major renovations tend to affect the price of a house somewhat, but the quality of individual appliances typically does not, because they are easily changed and account for maybe 1% of the value of a home. Even a home without appliances entirely will tend to sell just as fast and for prices similar to other homes.


> And? That just sounds like they have good engineers. If you are designing a machine, you have a target lifetime. You'd obviously want the product to last through the warranty period, because warranty claims are a cost to the company.

Products have an expected lifespan longer than the warranty period. This is malicious if given as a target. I'd like to see MTBF numbers on everything so people can band together and sue the shit out of manufacturers who do this. Would also make it easier to check the $25 light bulb.


> Products have an expected lifespan longer than the warranty period. This is malicious if given as a target.

Also, it is mathematically stupid, because products do not fail at consistent rates, nor are they used by customers at equal rates. If you want to minimize warranty costs, you do need to target some mean lifetime well beyond the warranty period.

MTBF (or MTTF) might be a useful number if you buy 100 light bulbs, but it is not really a useful number for you buying one appliance. Product failures don't follow a normal distribution. The stuff that ticks people off about shitty products is the infancy-failure part of the bathtub curve -- it's when you get 13 months out of a $200 blender that fails in infancy that you're pissed, not when you get 24 months out of a $20 blender that fails from end-of-life.


There was a documented conspiracy in the past to limit the life of incandescent bulbs. Humans haven't changed that much.

https://en.m.wikipedia.org/wiki/Phoebus_cartel


Not LED light bulbs specifically, but...

"The Phoebus cartel engineered a shorter-lived lightbulb and gave birth to planned obsolescence"

https://spectrum.ieee.org/the-great-lightbulb-conspiracy


It's true that the Phoebus cartel arranged to have light bulbs die after a certain number of hours, but bulb lifetime is a trade off between lumens, filament life, and energy consumption. The cartel-defined lifetime limit sits very close to the sweet spot for all of those metrics for incandescent bulbs.

Technology Connections explained this well in a video about a year ago: https://youtu.be/zb7Bs98KmnY


> bulb lifetime is a trade off between lumens, filament life, and energy consumption

At a specific temperature, using specific materials.

There is no reason to suspect that material science would not advance. Or other constraints would change. A specific company choosing that particular sweetspot for a particular product line is fine. But a collusion between companies dictating that specific constraint (in lieu of, e.g., wattage per lumen) is too clear a marker of anti-consumer intent.


> At a specific temperature, using specific materials.

The temperature of incandescent bulbs is directly related to the light that they give off, and tungsten is the obvious best material; there are no other materials on the horizon giving an improvement for the tech. It really is a pretty well understood tradeoff surface on a very mature technology (which was then displaced by completely different ones).


Advancements in materials led to LED bulbs, which operate under different principles and run far cooler. Yet the time-limited bulb lifetime remains.

> Intentional planned consumption/obsolescence

No it isn't. It is simply optimization of price and the features/form-factor that many buyers have demanded.

If anything, the lifespan of a ~$1.50 household LED bulb is quite incredible. I'm not sure exactly how anyone would be able to increase the lifespan at that price point and keep the traditional Edison form factor.

> Amazon should be required to test all [..] products on its site such that they can prove safety and standards conformance.

No, the manufacturers should be required to... the same way it works for literally every other product with safety regulations.


> If anything, the lifespan of a ~$1.50 household LED bulb is quite incredible. I'm not sure exactly how anyone would be able to increase the lifespan at that price point and keep the traditional Edison form factor.

I don't think I've had any last more than 5 years.

If you bought a cutting edge LED bulb back in 2002 or so, those had a life expectancy of over 60 years, and the build quality was such that you could reasonably expect to get that.

There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.


Yeah I would hope those bulbs were built pretty well, they were crazy expensive... expensive enough that they wouldn't be competitive in lifetime-per-dollar against today's crappiest bulbs even if they lasted a person's entire lifetime.

> I don't think I've had any last more than 5 years.

Do you shut them off every 3 hours? That's probably what the estimate on the box is based on. Run the same bulb half the day and you'll only get 2.5 years out of it.

> There are plenty of teardowns on YT showing how poorly even major brand name LED bulbs are put together.

I've seen them. And dissected my own. Still, at the price that modern LED bulbs are being made, I'm surprised they're built as well as they are. Brand-name Sylvania bulbs are $0.79/ea in a bulk pack on Amazon right now.


> I've seen them. And dissected my own. Still, at the price that modern LED bulbs are being made, I'm surprised they're built as well as they are. Brand-name Sylvania bulbs are $0.79/ea in a bulk pack on Amazon right now.

LED bulbs aren't lasting any longer than incandescent bulbs used to. My house has 2 bathrooms, one had incandescent bulbs when I moved in and I didn't bother to replace them. Those incandescent bulbs have outlived multiple sets of LED bulbs in the other bathroom.

I honestly worry about the increase in e-waste with LED bulbs vs the old incandescent bulbs.

> Do you shut them off every 3 hours? That's probably what the estimate on the box is based on. Run the same bulb half the day and you'll only get 2.5 years out of it.

Given that LEDs should damn well last 20-30 years of always being on, this is all a farce. I can't even pay 2x the price to buy a bulb with an honestly stated lifetime on it.

> Yeah I would hope those bulbs were built pretty well, they were crazy expensive... expensive enough that they wouldn't be competitive in lifetime-per-dollar against today's crappiest bulbs even if they lasted a person's entire lifetime.

I bet they would be. Given LED bulbs last less than 3 years now, with some not even lasting 2 years, a 20-30 year bulb could cost 4x as much and be competitive.

The real problem is that those long lifetime LED bulbs are not driven as hard, so the light output isn't nearly as high. AFAIK all research in the last 20 years has been into bright LEDs with meh lifetimes, so I wonder if it is even possible to mass produce long lifetime consumer LEDs anymore.

(Except the LEDs in all my consumer electronics have no problems staying on for 5 years non-stop! Tiny output, long lifespan...)


> LED bulbs aren't lasting any longer than incandescent bulbs used to

They tend to last a lot longer for me.


Either you are misremembering incandescent bulbs or you're buying some real hot garbage LED bulbs. It used to be a regular chore to replace bulbs, and people would keep a stock on hand like they'd keep soap or toilet paper. Incandescents usually lasted 1,200 hours; LEDs of today last around 11,000. Nobody's bulbs are lasting 250,000 hours, and if they did they wouldn't be 4x the price of the 11k-hour bulbs.
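
Rough arithmetic with those hour figures (and the 3 h/day usage that box estimates tend to assume):

    # Illustrative only; rated hours and daily usage are assumptions from this thread.
    for rated_hours, hours_per_day in [(1200, 3), (11000, 3), (11000, 12)]:
        years = rated_hours / hours_per_day / 365
        print(rated_hours, "h at", hours_per_day, "h/day ->", round(years, 1), "years")
    # 1200 h at 3 h/day   -> ~1.1 years (the old change-the-bulbs chore)
    # 11000 h at 3 h/day  -> ~10 years  (the claim on the box)
    # 11000 h at 12 h/day -> ~2.5 years (heavy use, as mentioned upthread)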

The reason indicator LEDs last a different amount of time is because they are very different applications. If you want to walk around your house lit by an indicator lamp, go right ahead, but most people want more light, and more light = more heat and degradation of the diode.


The problem is that the manufacturers lie and say that the LED bulbs will last for many years when they don't.

The claim they put on the box is typically true, but based on some damn modest usage. (e.g. 3 hours per day in ideal environmental conditions) And of course, a mean-time-to-failure figure to someone with one bulb built with minimal QA is just a dice-roll when faced with the bathtub curve of product failures.

That, and customers insisting on preexisting form factors. Fitting the electronics and LEDs into the space of a traditional lightbulb comes with compromises, such as not having proper heat dissipation on either.

Yeah, you would think they would be two separate devices by now...

Please think for a moment not only about whether it's feasible for AMZN to run a safety testing program for all possible consumer products of our modern technological civilization, but whether you really want them to be in charge of it. Maybe they should just require certifications of testing in the jurisdictions where those products are sold?

Isn't faking certs already a problem?

Probably. Is it a worse problem than Amazon inspecting themselves would be? Is it a worse problem than Amazon demonstrably already has with policing counterfeits? I'm just saying, you could hardly ask for a less-qualified authority for product testing. At least with independent certs it's vaguely possible to align the incentives correctly. With Amazon the incentives would be hosed from the start.

> Any that are much older than that need to be checked out-of-circuit for ESR and then capacitance

And that's a very time consuming and somewhat risky operation on an old machine you want to keep running. Some old PCBs are quite fragile.

I wish there was a way to test capacitors without removing them.


You can test ESR in-circuit, with caveats. Here's a good thread from EEVblog [1].

[1] https://www.eevblog.com/forum/beginners/is-there-any-way-to-...


Here's a question for EE nerds that happen to be reading this (maybe in the future).

What if I have a stash of big electrolytics that have been out of service for 10+ years? I know that I need to reform them over a few days, but can they even run at spec after so long out of operation?

We're talking BIG stuff, 400 V, 200+ J each


For hobby or laboratory purposes, it’s worth a try. Some probably will come back completely.

But if you wanted to use them in production, and be able to blame me when it didn’t work, I’d say no.


Bipolar electrolytic capacitors are a thing, I recently had to solder up a handful of them in some audio circuits.

Once you have experienced blowing up a reversed elcap you will never forget its orientation. I never understood, though, what makes it leak current and hence heat up.

There's an aluminum oxide layer coating both the anode and cathode inside the (electrolytic) capacitor. Under forward voltage it will gradually thicken, but under reverse voltage it dissolves and causes a short. This increases the temperature, which causes hydrogen gas to form and bubble through the electrolyte, increasing pressure within the capacitor package until it bursts.

The thickening under forward voltage explains the "recovering cap" phenomenon with some old caps. Didn't know that.

Modern electrolytic caps don't burn like they used to.

The last few times I made a mistake, there wasn't even an explosion, let alone a short circuit. The thing slowly boiled and bubbled, or unfolded.

Anyway, it blows up because the capacitor's insulation layer isn't some stable material, it's a tiny oxide layer built over the metal plate by anodization. If you put a high voltage on it with the wrong polarity, you reverse that anodization and short the liquid and the metal electrodes.


There are polarised and unpolarised capacitors. Stuff like basic decoupling capacitors tend to be unpolarised.

I actually have an LC III in storage, so I might actually be able to make use of this article.

I think this will allow me to classify today as productive.


Yeah, I have a Performa 450, which I believe is the exact same computer sold under a different name. So this is definitely important to know. I can go back to bed now, my job for today is done.

At least you made my Wednesday ;-)

[flagged]


It's the sort of mistake that one could make in a PCB design today.

Apparently people can't read.

I don't know which part of "to me" is not clear.

I don't design PCB boards (thus the "useless" word), and the comment was apparently a lighthearted joke in response to another lighthearted joke.


There are so many cases of this sort of stuff it's unreal. But it gets even stupider.

I found one a few years back when I repaired a linear power supply. This required me to reverse engineer it first because there was no service manual. I buzzed the whole thing out and found out that one of the electrolytic capacitors had both legs connected to ground. They must have shipped thousands of power supplies with that error in it and no one even noticed.


That seems like one of the least harmful mistakes you could make. Capacitors are sprinkled all over boards in excess because it's probably better than not having enough capacitance.

It would probably be an idea to point out here that those are decoupling capacitors used to reduce noise. Capacitance isn't the point with them, but that they pass AC while blocking DC. Not that the capacitance hurts, mind.

But that's what a capacitor does? The point is to provide a low-inductance path, not really to block DC.

I have a 3D printer where presumably a smoothing cap just fell off the X axis controller section of the mainboard. Didn't make a lick of difference in anything operationally. Still works great.

Checks out: most boards are made with very conservative amounts of decoupling capacitance because it's way easier than dealing with random failures due to not enough capacitance.

I've understood that capacitors can be used for timing, or smoothing a voltage after a power regulator (I think).

What does adding capacitance help with, and how?


Voltage spikes from line inductance, voltage drop-outs from line resistance. Basically you have little reservoirs of charge scattered all around the board (current flow isn't instantaneous in a real circuit).

It helps to always think of current draw in a complete loop, out the "top" of the capacitor, through your IC, and back into the ground side (this isn't necessarily what's happening physically). A shorter loop means less inductance; shorter traces, less resistance.


Thanks, that makes no sense to me :)

But I do recall having had inductance issues with high frequency transfers to an atmega.

I have no idea how a capacitor would help. Sounds like it would increase a signal delay (tiny one perhaps).

But definitely interesting, I'll need to learn about this if/when I dive further into electronics.


Decoupling can be used on signal lines, but we’re talking about power lines here. If you have something like an ATMega, it will change its power demands very frequently at high frequencies due to switching signals on and off. Without some additional capacitance on the board, the inductance will dominate at high frequencies. Inductance resists change in current, and you’ll get voltage drops.

Capacitance is in many ways the opposite of inductance. If you place capacitors close to the power sink (e.g. the ATMega), the traces will have lower inductance and be “decoupled” from the power supply.
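
A rough sketch of the arithmetic (component values are my own illustrative assumptions, not anything measured on a real board):

    # Why the local cap helps: inductive sag vs. a nearby charge reservoir.
    L_loop = 20e-9       # assume ~20 nH of trace/lead inductance back to the supply
    di, dt = 0.05, 2e-9  # the chip suddenly draws 50 mA more within ~2 ns

    print(L_loop * di / dt)           # ~0.5 V of sag on the rail with no local cap

    # A cap right at the pin supplies that charge from a much shorter, lower-
    # inductance loop. To hold the rail within 50 mV over a 100 ns burst:
    droop_budget, burst = 0.05, 100e-9
    print(di * burst / droop_budget)  # 1e-07 F, i.e. the classic 100 nF decoupling cap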


Smoothing is part of the story: but the important question is what is causing the roughness? Switch mode power supplies have inherent output ripple that can be filtered, but that’s distinct from transient variations in the load. Decoupling capacitors are used to provide a low impedance path at high frequencies i.e. fighting inductance.

It could be there to control emissions. You’d need to analyze the circuit to determine its purpose.

Very possible! I actually have a 100 MHz scope and SDRs that tune from 9 kHz to 2 GHz; could be an interesting distraction on the weekend to see if that axis is any noisier than the others.

Way back when, a co-worker was powering up a fire alarm control panel. Poof, a capacitor popped and damaged his eye.

Name and shame!

Voltcraft. Can't remember the model number.

Commodore had 3 capacitors mounted backwards on the A3640, the CPU board of the Amiga 4000 with 68040 processors: https://youtu.be/zhUpcBpJUzg?si=j6UFmIJzoC-UDS6u&t=945

Also mentioned here: https://amiga.resource.cx/exp/a3640


ZX Spectrum +2 shipped with transistors backwards: https://www.bitwrangler.uk/2022/07/23/zx-spectrum-2-video-fi... This even caused visible artifacts on the display, which was apparently not enough for the problem to be noticed at the factory.

I think Clive Sinclair was notorious for wanting products to be brought to market quickly, with pretty aggressive feature sets. They very well may have noticed it at the factory, but didn't want to do a fix because it was technically functional.

The +2 was an Amstrad product, not designed or built by Sinclair, though.

Didn't at least some engineers transfer after the acquisition?

Maybe. I don't think a lot did. Amstrad did not acquire the company: it bought some rights to use the Sinclair Research brand.

That's why the next computer Sir Clive launched was the Cambridge Computers Z88. But note, some of the later bicycles were Sinclair Research branded:

https://en.wikipedia.org/wiki/Sinclair_Zike

https://en.wikipedia.org/wiki/A-bike

Amstrad did not acquire or develop the Sinclair QL, for instance, but it did sell Sinclair-branded x86 PCs.

https://www.computinghistory.org.uk/det/3404/sinclair-pc200/

It sold on some stock of existing ZX Spectrum hardware, but mostly it sold models it developed:

* Spectrum +2 -- a Spectrum 128 with a mechanical keyboard and built-in cassette drive

* Spectrum +3 -- a redesigned Spectrum 128 with a DOS from Locomotive Software and a 3" (not 3.5") floppy drive. Dropped compatibility with Spectrum 48 peripherals such as the Interface 1 and Microdrives, and with Spectrum 128 peripherals such as the numeric keypad, serial ports, etc. Added the ability to page out the ROM and replace it with RAM, so it could run CP/M 3, also ported by Locomotive.

* Spectrum +2A, the black +2: a cut-down +3 with a cassette drive.

These were designed by Amstrad engineers and contractors, and manufactured by Amstrad. No Sinclair involvement I'm aware of at all.


Commodore just kept doing this. Just listing shoddy craftsmanship would take forever, and then we get to intentional bad decisions, like giving the A1200 a power supply that's both defective (capacitors ofc) and barely enough to support the basic configuration with no expansions, which is extra funny because PSUs used with weaker models (A500) had greater output...

The number of used A500 power supplies I sold to customers when I upgraded their A1200 with a GVP 030 board + RAM...

This was the hardware patch I had to install to use a CyberstormPPC: https://powerup.amigaworld.de/index.php?lang=en&page=29

Classic Commodore Quality :P

They also had backwards caps on the CD32 and A4000


The author seems to misunderstand PCB design flow. This is neither a "factory component placement issue" nor a silkscreen error. The error is in the schematic.

The layout CAD is often done by a different team that follows the schematic provided by design engineering. Automated workflows are common. The silk screen is predefined in a QA'd library. It is not their job to double check engineering's schematic.

The components are placed per the layout data.

Both those teams did their jobs correctly, to incorrect specifications. In fact, the factory performing assembly often is denied access to the schematic as it is sensitive IP.

If you're going to cast blame over a 30-year-old computer, at least direct it at the correct group. It wasn't soldered incorrectly at the factory. They soldered it exactly how they were told to - backwards.


>The layout CAD is often done by a different team that follows the schematic provided by design engineering.

Just as a note, this is a fairly archaic way of working nowadays. At my place schematic design and layout go hand-in-hand, and we rejected a candidate because he didn't do the latter. The main reason is layout is no longer an afterthought, it's a key part of the electrical design of the system, and there's little room for a tedious back and forth between the circuit designer and the person doing the layout about what traces are and aren't important to optimize for various attributes.


Indeed, and this is true in other engineering activities such as mechanical design as well. Possibly with the exception of very large shops, there are no draftsmen any more, and the design engineer also creates the production drawings. And the software lends itself to this. Schematic / layout, and design / drawing, are joined together in the design software. It would be very hard to make a mistake like the one in TFA today.

Even the free software that I use -- KiCad -- would ding me.

We make bigger mistakes instead. ;-)


And yet it is not at all unusual for a production engineer to spot these faults and pass them back to the design engineers for rework.

Also true! Most common when you accidentally screw up a footprint and it doesn't fit the part on the BOM. A backwards part is the kind of thing they're not likely to pick up on (if it's marked on the silkscreen incorrectly, at least), but some do.

In the mid 80's I was the head of the CS student chapter. We ran the computer rooms for the science faculty. We had a room with about 20 Mac 128k. I do not know where Apple sourced their capacitors from, but these were not A-tier. A Mac going up in a puff of white smoke was a weekly occurrence. We had a few in reserve just to cycle them in while they were out to Apple for repair.

P.S. still my favorite Mac of all time was the IIcx. That one coupled with the 'full page display' was a dream.


On the other side, we had an intern at our (very small) company who used his own Mac. One time he had to debug a mains-powered device. He decided that he would try connecting it to both mains AND the programming dongle without an isolation transformer. He fried the dongle (it literally exploded; the plastic lid banging on the desk in a suddenly silent office is the most memorable thing), the company-provided monitor, and the device, but somehow his private Mac mini survived all this while being in the middle.

That sounds fishy; even if the debugged device directly interfaced with mains, the Mac doesn't. And even if it did, how high would the probability be that both machines were on different circuits with phases so far out of sync that it would matter?

Unless I misunderstood your story


That device was a cheap wifi power plug with a cheap non-isolated power supply; it was never intended to have user-accessible electrical parts sticking out, so no need for isolation. In such cases the device has a common ground with the AC line. I don't know all the specifics, but NEVER connect any single terminal of a 220V plug to your computer's ground (USB ground in this case). When it's properly grounded, most devices will survive this. But somehow the monitor connected to that Mac didn't survive it. And several milliseconds of full 220V before the circuit breaker reacted made the very thin traces in the debugger pretty much vaporise and explode.

If I remember correctly, a lot of power supplies in cheap electronics have the low-voltage side AC-coupled to the mains side. There's no physical wire, just a capacitor. You can often feel the AC when touching the 'safe' side of the adaptor.

Forget “cheap”. As far as I can tell, many modern ungrounded power supplies, including Apple’s, have enough AC coupling from the line to the output that you can feel a bit of tingling when you touch a metallic object connected to the output.

How is this even allowed? My TV had it. My MacBooks since time immemorial have had it. They all feel “spicy”.

The Y capacitor is needed to allow the EMI to have a way to ground from the output rather than going out and getting radiated by the output lines.

I don’t believe for a second that this is actually necessary in a way that results in that spicy feeling. I do believe that it’s far cheaper to use a Y capacitor than to come up with a better filter network that works well, though.

Common mode noise filtering is either going to be purely inductive or need a Y-cap. No other way around it.

One can build lots of things out of inductors and capacitors. I bet it’s possible and even fairly straightforward to build a little network to allow high frequencies to pass from output to the two line inputs with low impedance but that has extremely high impedance at 50 and 60 Hz (and maybe even at the first few harmonics). It would add components, cost and volume.

I bet this could be done at the output side, too. And a company like Apple that values the customer experience could try to build a filter on their laptop DC inputs to reduce touch currents experienced by the user when connected to a leaky power supply. Of course, the modern design where the charging port is part of a metallic case might make this rather challenging…

(Seriously, IMO all the recent MacBook Air case designs are obnoxious. They have the touch current issue and they’re nasty feeling and sharp-edged.)


The capacitor has to see the common mode voltage. Where do you put the other end?

Off the top of my head? Make a little gadget that’s an inductor and capacitor, in parallel, tuned to 60 Hz (i.e. a band-stop filter) and, in series with that, a Y capacitor. Wire up this gadget in place of the Y capacitor, so you end up with two of them (line to output negative and other line to output negative, perhaps). Or maybe you just have one, and you connect it between the normal pair of Y caps and the output. It will have very high impedance at 60Hz, enough impedance from DC to a few kHz to avoid conducting problematic amounts of current at DC or various harmonics, and low enough impedance at high frequencies to help with EMI. It might need a couple types of capacitor in parallel in the band-stop part to avoid having the high-frequency impedance of the presumably large-ish capacitor in parallel with the inductor being a problem, and it might be an interesting project to tune it well enough to really remove the annoying touch current, especially if you believe in 50Hz and 60Hz operation. Maybe a higher order design would work better, but the size would start to get silly.
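
A quick sanity check of that idea with made-up values (ideal parts apart from an assumed winding resistance; real parasitics would shift the numbers):

    import math

    f_mains = 60.0
    C_trap = 1e-6                                    # assume 1 uF in the parallel LC trap
    L_trap = 1 / ((2 * math.pi * f_mains) ** 2 * C_trap)
    r_trap = 10.0                                    # assumed inductor winding resistance
    C_y = 4.7e-9                                     # a typical-ish Y-cap value (assumed)
    print(L_trap)                                    # ~7 H: already a big part

    def z_total(f):
        w = 2 * math.pi * f
        z_l = r_trap + 1j * w * L_trap
        z_c = 1 / (1j * w * C_trap)
        trap = z_l * z_c / (z_l + z_c)               # parallel LC: high impedance near 60 Hz
        return trap + 1 / (1j * w * C_y)             # in series with the Y-cap

    print(abs(z_total(60)))    # ~0.9 Mohm: negligible 50/60 Hz touch current
    print(abs(z_total(1e6)))   # ~34 ohm: still a low-impedance path for MHz-range EMI

So it does seem workable on paper, but the 60 Hz trap needs a roughly 7 H inductor, which is exactly the "size starts to get silly" problem.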

My Fold 5 has that feeling along the hinge when charging too, no matter the charger I use. I guess it's considered safe, but it's weird.

Totally believable if the debugging device was doing something with a serial port. I once hacked something together to interface a PC serial port to a Raspberry Pi. The PC serial is real-ish RS-232, with negative voltages. The Pi side was just 0/3.3V positive. I had a nice 18-volt power brick laying around, and just split its output down the middle--what was 0 volt ground was used as -9 volts, the middle voltage was now 0 volt ground, and the 18-v line was now +9 V.

At first everything seemed OK. but when I plugged a monitor into the PI I Was Made To Realize a) the nice 18-volt PS really was high quality, and although it was transformer-isolated its output ground was tied to the wall socket earth, b) monitors also tie HDMI cable ground to earth, and so c) my lash-up now had dueling grounds that were 9V apart.


With things like the Mac 128k, reliability issues may partly be down to Steve Jobs's dislike of cooling fans.

To be honest, cooling fans never get the attention they deserve and end up whiney or buzzy.

That said, Apple did a really good job with the Mac Pro cooling fans, where the shroud spun with the blades.

I think it did better than the best PC cooling fans like Noctua.


I always built PCs with the largest diameter fans possible - not sure why so many things come with tiny fans. Loads more airflow with less noise and even if they do spin up fast the noise they make is much more pleasant.

I was just thinking of the Apple III the other day.

If I remember, Jobs had them not include a cooling fan. As it heated up and cooled down, the chips on the motherboard would work their way out of their sockets. So one of the official solutions to try if you were having issues was to drop it a couple of inches to try and get the chips to re-seat.

Crazy.


IIcx was my first computer! I still have mine.

Brings back memories…

About 30 years ago I designed my first PCB with frequencies in the GHz range. It was full of challenging transmission line paths with frequencies in the hundreds of MHz and above.

I am still proud of the fact that all of the high speed signals worked as designed, with excellent signal and power integrity (the large FPGA was challenging). Emissions passed as well.

I did, however, screw up one thing: DC

I somehow managed to lay out the DC input connector backwards!

These boards were very expensive ($2K), so an immediate respin was not possible.

I had to design a set of contacts to be able to flip the connector upside-down and make the electrons go in the right way.

The joke from that point forward was that I was great at multi-GHz designs but should not be trusted with DC circuits.


Commodore struggled with the same mistake on the negative rail in the audio section, but also somehow on a high-end, expensive CPU board.

https://wiki.console5.com/wiki/Amiga_CD32 C408 C811 "original may be installed backwards! Verify orientation against cap map"

A4000 https://wordpress.hertell.nu/?p=1438 C443 C433 "notice that the 2 capacitors that originally on A4000 have the wrong polarity"

Much worse is the Commodore A3640 68040 CPU board aimed at the top-of-the-line A3000 and A4000 http://amiga.serveftp.net/A3640_capacitor.html https://forum.amiga.org/index.php?topic=73570.0 - C105 C106 C107 silkscreen wrong, early revisions built according to the bad silkscreen.


Typical Amiga fanboyism and Apple envy, if a Mac does something they have to prove the Amiga outdid it. “Only one model with a reverse polarity capacitor? With Commodore it was a systematic issue!”

> Typical Amiga fanboyism and Apple envy, if a Mac does something they have to prove the Amiga outdid it.

I think we're envious that Apple did a better job of engineering their systems


Sounds like the person who designed the board followed a very simple and wise rule: always connect the negative side to the ground. Can't go wrong with that...

until you have to deal with negative voltage (-5V). Another out of bounds bug.


From around 2011-2015, I sometimes talked to an ex-Navy electrical tech who said he was also an early Apple rework tech in the SF Bay Area. He had no shortage of work fixing manufacturing problems, adding rework improvements, and building custom test equipment until they laid him off, outsourced his job to some random country, and then he was homeless until around 2016.

It's a good thing that these machines don't even need -5 volts. With just the positive voltages provided, RS-422 still works, including LocalTalk.

I think the -5 volts is only there in case an expansion card needs it.


I did a bit of research on this because I was confused about why RS-422 needed -5V. Normally the driver is powered off +5 and 0 Volts for RS-422. By increasing the swing down to -5, the Mac port is compatible with RS-232. So a Mac of this era can connect to a modem with a specially wired cable. No level conversion is required and RS-422 receivers are required to work down to -6V so the Mac is producing completely valid RS-422 at the same time.

Most RS-232 receivers of that era had a fair amount of gain with a transition point close to 0V. So only a little minus voltage would be required in practice, not the entire -5V. So just as long as the reversed capacitor was not entirely shorting out the rail things would have worked.


I've wondered how my machines have worked without negative voltages, and this helps. Thanks!

These days, a charge pump device costs just a few dollars from AliExpress, so even though many things work fine this way, it wouldn't hurt to make them work for most if not all things.

When a replacement power supply does provide negative voltage, I try to use it, even if it does mean adapting -12 volts, which is what most ATX power supplies provide, to -5 volts, which is what, it seems, many 68030 Macs want.

Funny - I just did an ATX power supply for a Quadra 630, and it seems that most of the 68040 machines were using -12 volts instead of -5 volts (except the Quadra 605, which happened to use the same form factor and power supply as the LC / LC II / LC III).


Apple should be required to do a recall for these motherboards.

If they do a recall, it will say they should be discarded. Sony has a recall on all its Trinitron TVs made before the end of 1990, like this:

https://www.sony.jp/products/overseas/contents/support/infor...


This shouldn't be allowed at all: if the product was bad all along, they should be required to fix it, and shouldn't be able to say "well, it's old, so you should just trash it", which means they don't suffer any penalty whatsoever.

I don't think that's a reasonable expectation in general, and certainly not in this case. The affected TVs were all at least 20 years old - that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these. Nor is it clear what Sony could reasonably have done to repair them; even by 2010, a lot of the parts used in CRT TVs were out of production and unavailable.

Maybe you're too young to remember, but people used to keep TVs for much longer periods before HDTV and flat panels came out.

Also, these TVs are apparently fire hazards. It doesn't matter that they're 20 years old (at the point of the "recall" in 2010).

I doubt the parts necessary to fix them were out of production; you can get parts for truly ancient electronics still. Things like capacitors don't become obsolete. The recall doesn't specify exactly which component is problematic, but says it's age-related, which usually points to capacitors.


This. I’ve known a TV that was in more or less daily use for over 30 years. Not sure why we stopped expecting that from electronics.

>Not sure why we stopped expecting that from electronics.

For TVs specifically, the technology changed a lot. For a long time, everyone was stuck on the NTSC standard, which didn't change much. At first, everyone had B&W TVs, so once you had one, there was no reason to change. Then color TV came out, so suddenly people wanted those. After that, again no reason to change for a long time. Later, they got remote controls, so sometimes people would want one of those, or maybe a bigger screen, but generally a working color TV was good enough. Because TVs were glass CRTs, bigger screens cost a lot more than smaller ones, and there wasn't much change in cost here for a long time.

Then HDTV came out and now people wanted those, first in 720p, and later in 1080i/p. And flat screens came too, so people wanted those too. So in a relatively short amount of time, people went from old-style NTSC CRTs to seeing rapid improvements in resolution (480p->720p->1080->4k), screen size (going from ~20" to 3x", 4x", 5x", 6x", now up to 85"), and also display/color quality (LCD, plasma, QLED, OLED), so there were valid reasons to upgrade. The media quality (I hate the word "content") changed too, with programs being shot in HD, and lately 4k/HDR, so the difference was quite noticeable to viewers.

Before long, the improvements are going to slow or stop. They already have 8k screens, but no one buys them because there's no media for them and they can't really see the difference from 4k. Even 1080p media looks great on a 4k screen with upscaling, and not that much different from 4k. The human eye is only capable of so much, so we're seeing diminishing returns.

So I predict that this rapid upgrade cycle might be slowing, and probably stopping before long with the coming economic crash and Great Depression of 2025. The main driver of new TV sales will be people's old TVs dying from component failure.


Great points. The TV I have today is approaching my platonic ideal screen. It’s as big as it can get without having to continually look around to see the whole screen. Sit in the first row of a movie theater to understand how that can be a bad thing. The pixels are smaller than I can see, it has great dynamic range, and the colors can be as saturated as I’d ever want. There’s not much that can be improved on it as a traditional flatscreen video monitor.

> The human eye is only capable of so much, so we're seeing diminishing returns.

Or not seeing diminishing returns. Which is the point.


> At first, everyone had B&W TVs, so once you had one, there was no reason to change

Televisions improved over time:

- screens got flatter

- screens got larger

- image quality improved

- image contrast increased (people used to close their curtains to watch tv)

- televisions got preset channels


My experience of ancient CRT devices is that the display gets gradually dimmer. I once had a TV that was only really usable after dark -- but that's the only time I wanted to use it anyway -- and a huge Sun monitor that was only just about readable in total darkness. We kept the monitor because we also had a Sun server that we didn't know how to connect to any other monitor, and we were worried that one day we wouldn't be able to SSH to it; in fact, the server never once failed.

> daily use for over 30 years

However that doesn't imply TVs were that reliable.

Before the '90s, TV repairman was a regular job, and TVs often needed occasional, expensive servicing. I remember a local TV repair place in the '90s which serviced "old" TVs.


> Not sure why we stopped expecting that from electronics.

Last year's model only does 4k; my eyes need 8k


32K ought to be enough for anybody.

32K is going to look so lifeless and dull after you try 64k.

When will the pixels start to approach erythrocyte-level density like on the Vision Pro?

edit: Anywhere between 208K to 277K.


Because electronics got so much better so much faster, that the vast majority of customers did not want to use old hardware.

Especially if customers' acceptance of shorter lifetimes allowed companies to lower prices.


There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.

Moreover, we're talking about televisions and old Macs. TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price), and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?

Much older computers continue to be used because they run software that newer computers can't without emulation (which often introduces bugs) or have older physical interfaces compatible with other and often extremely expensive older hardware.

If people actually wanted to replace their hardware instead of fixing it then they'd not be complaining about the inability to fix it.


>There are many use cases for which a decade-old computer is still perfectly serviceable and even where they aren't, those computers can be repurposed for the ones that are.

It depends. Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems. You're probably better off getting a RasPi or something, depending on what exactly you're trying to do. Newer systems have gotten much better with energy efficiency, so they'll pay for themselves quickly through lower electricity bills.

>TVs with higher resolutions might come out, but lower resolution ones continue to be sold new (implying demand exists at some price)

We're already seeing a limit here. 8k TVs are here now, but not very popular. There's almost no media in that resolution, and people can't tell the difference from 4k.

For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.

>and then why should anybody want to replace a functioning old TV with a newer one of the same resolution?

They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen. It's possible they might want newer smart TV features too: older sets probably have support dropped and don't support the latest streaming services, though usually you can just get an add-on device that plugs into the HDMI port so this is probably less of a factor.


> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems.

Even many Pentium 4-based systems would idle around 30 watts and peak at a little over 100, which is on par with a lot of modern desktops, and there were lower and higher power systems both then and now. The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel. Midrange then and now was ~65W. Also, the Pentium 4 is twenty two years old.

And the Pentium 4 in particular was an atypically inefficient CPU. The contemporaneous Pentium M was so much better that Intel soon after dumped the P4 in favor of a desktop CPU based on that (Core 2 Duo).

Moreover, you're not going to be worried about electric bills for older phones or tablets with <5W CPUs, so why do those go out of support so fast? Plenty of people whose most demanding mobile workload is GPS navigation, which has been available since before the turn of the century and widely available for nearly two decades.

> For a while, this wasn't the case: people were upgrading from 480 to 720 to 1080 and now to 4k.

Some people. Plenty of others who don't even care about 4k, and then why would they want to needlessly replace their existing TV?

> They probably don't; if they're upgrading, they're getting a higher resolution (lots of 1080 screens still out there), or they're getting a bigger screen.

That's the point. 1080p TVs and even some 720p TVs are still sold new, so anyone buying one isn't upgrading and has no real reason to want to replace their existing device unless it e.g. has a design flaw that causes it to catch fire. In which case they should do a proper recall.


>The top end Pentium 4 had a TDP of 115W vs. 170W for the current top end Ryzen 9000 and even worse for current Intel.

You can't compare CPUs based on TDP; it's an almost entirely useless measurement. The only thing it's good for is making sure you have a sufficient heatsink and cooling system, because it tells you only the peak power consumption of the chip. No one runs their CPUs flat-out all the time unless it's some kind of data center or something; we're talking about PCs here.

What's important is idle CPU power consumption, and that's significantly better these days.

>older phones or tablets with <5W CPUs, so why do those go out of support so fast?

That's an entirely different situation because of the closed and vendor-controlled nature of those systems. They're not PCs; they're basically appliances. It's a shitty situation, but there's not much people can do about it, though many have tried (CyanogenMod, GrapheneOS, etc.).

>Plenty of others who don't even care about 4k

Not everyone cares about 4k, it's true (personally I like it but it's not that much better than 1080p). But if you can't tell the difference between 1080p and an NTSC TV, you're blind.

>1080p TVs and even some 720p TVs are still sold new

Yes, as I said before, we're seeing diminishing returns. (Or should I say "diminishing discernable improvements"?)

Also, the 720p stuff is only in very small (relatively) screens. You're not going to find a 75" TV with 720p or even 1080p; those are all 4k. The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.


> What's important is idle CPU power consumption, and that's significantly better these days.

It isn't. You can find both ancient and modern PCs that idle anywhere in the range from 10 to 30 watts, and pathological cases for both where the idle is >100W. Some of the newer ones can even get pretty close to zero, but the difference between zero and 30 watts for something you're leaving on eight hours a day at $0.25/kWh is ~$22/year. Which is less than the interest you'd get from sticking the $600 cost of a new PC in a 5% CD.

And many of the new ones are still 30 watts or more at idle.
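The ~$22/year figure above checks out; a quick sketch of the arithmetic, using the rate and hours assumed in that comment:

    # Yearly cost of an idle-power difference, using the figures assumed above.
    idle_delta_w  = 30      # extra idle draw in watts
    hours_per_day = 8
    usd_per_kwh   = 0.25

    kwh_per_year = idle_delta_w * hours_per_day * 365 / 1000
    print(f"{kwh_per_year:.1f} kWh/year -> ${kwh_per_year * usd_per_kwh:.2f}/year")
    # 87.6 kWh/year -> $21.90/year, i.e. the ~$22 quoted above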

> That's an entirely different situation because of the closed and vendor-controlled nature of those systems.

It's a worse situation, but if the complaint is that they abandon their customers long before the customer wants to stop using the device, they certainly match the criteria.

> But if you can't tell the difference between 1080p and an NTSC TV, you're blind.

Being able to discern a difference and caring about it are two different things. If your use for a TV is to watch the news and play 90s video games then the resolution of the talking heads doesn't matter and the classic games aren't in 1080p anyway.

> The low-res stuff is relegated to very small budget models where it's really pointless to have such high resolution.

Which is the point. If you have an old 30" TV and no space for a 72" TV, you do not need a new 30" TV.


For most videos, the difference between 1080p and 4k ain't that large.

But for certain video games on a large screen, I can definitely tell the difference between 1080p and 4k. Especially strategy games that present a lot of information.

Btw, as far as I can tell modern screens use significantly less power, especially per unit of area, than the CRTs of old; even if that CRT is still perfectly functional.


> Older computers usually guzzle power, especially if you look at the absolutely awful Pentium4 systems.

https://en.wikipedia.org/wiki/List_of_Intel_Pentium_4_proces...

The Northwood chips were 50 to 70 W. HT chips and later Prescott chips were more 80 to 90 W. Even the highest chips I see on the page are only 115 W.

But modern chips can use way more power than Pentium 4 chips:

https://en.wikipedia.org/wiki/Raptor_Lake

The i5-14600K has a base TDP of 125 W and turbo TDP of 181 W, and the high-end i9-14900KS is 150 W base/253 W turbo. For example, when encoding video, the mid-range 14600K pulls 146 W: https://www.tomshardware.com/news/intel-core-i9-14900k-cpu-r...

More recent processors can do more with the same power than older processors, but I think for the most part that doesn't matter. Most people don't keep their processor at 100% usage a lot anyway.


As I said in a sister comment here, you can't compare CPUs by TDP. No one runs their CPU flat-out all the time on a PC. Idle power is the important metric.

A decade old CPU would be a Haswell, not a Pentium 4.

Suppose they did recall all the old TVs with known faults: could those be fixed to conform to today's quality and safety standards, while being full of old components with characteristics beyond their original tolerances?

> that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these

A modern TV may have an expected lifespan of five years. TVs from several decades ago had lifespans of... several decades. Quality has plummeted in that market.


5 years? Is that really true? I’m currently using an LG from 2017 and cannot imagine needing to change it. I would be shocked if it stopped working.

I don't think it is true at all.

There's nothing inside today's monitors or TVs that can't run for at least 10 years. Our main TV, 42" 720p LCD, is from 2008, and I have monitors that are just as old.


Yep. My TV, a 42" Panasonic plasma, dates from 2009 and is still working perfectly. I haven't replaced it, because why would I?

I have an LG OLED from 2017. It started getting really bad screen burn/pixel degradation just after the 6-year mark (6-year warranty). I did a quick search on YouTube and, lo and behold, a whole bunch of other people with the same model started having the same screen burn-in issues at the same age!

It covers the middle third of the screen, top to bottom, and the entire bottom 1/4 of the screen with some odd spots as well, it's really distracting and essentially makes the TV useless (to me).


OLED screens are known for having burn-in problems like this. LCDs don't, though they probably have issues with backlights becoming dim with age.

Yeah, it's quite disappointing: I bought a mid-to-high-end OLED and, despite keeping the brightness down and being careful with it, it just suddenly developed huge patches of worn-out pixels one day.

I'll probably still stick to OLED though unless there's a new technology that can give me the same blacks and brights; can't keep up with all of the new marketing terms these days so not sure if there is something that can compete but last longer.


I have an LG of about that vintage and it's starting to black out when playing 4K content. All the components upstream of it have been swapped out and are up to date on firmware. Restarting works, sometimes for a whole day, sometimes for 1 minute.

My other TV about the same vintage is starting to have stuck pixels in the corner.

Modern failure modes aren’t nearly as graceful.


But when it does, it will probably be the capacitors in the power supply that have dried out.

Is that really the case? Because if so, it seems like simply replacing the capacitors would save a lot of waste and unnecessary purchases of new TVs...

This is a very common fault, yes. Power supply issues in general. It is also not uncommon for people to replace e.g. Wifi routers because the wall warts fail.

It comes down to few people knowing a lot about it - and I'm not blaming anyone for that; we all have our interests, and most people have more than enough to do already without worrying about what goes on inside their stuff.

Also, electronics are, to a lot of people in a lot of places, so cheap that they would rather just curse a little and buy a new thing, instead of bothering with taking the thing to a shop. And of course a few hours of skilled labour in a big city in the west might also be almost as expensive as making a whole new TV in a factory in Asia plus shipping, so it might not even make economic sense.


> And of course a few hours of skilled labour in a big city ...

In many/most places, these repair shops don't even exist any more, because the products have gotten too complicated/integrated/parts-unavailable, and the economics are nonsensical.


Electrolytic capacitors are not solid state and are likely the #1 failure mode for most electronics. There are options for better (e.g. Al polymer) capacitors, but they are rather expensive - overall, good capacitors are 'expensive', e.g. more than a dollar apiece in some cases.

The 2nd most common failure mode has got to be MLCC (multi-layer ceramic capacitor) cracks/shorts.


How can I even know which capacitor is faulty?

If your model was popular, there's likely a recap kit for its power supply. It usually makes sense to swap all the capacitors in the kit, unless the kit instructions say otherwise.

You can look for physical signs of degradation (bulgy, leaky, discolored), but to really test a capacitor for capacitance, you need to take it out of the circuit, at which point you may as well put a new, high-quality capacitor in.

The OEM capacitors likely have a just-right voltage rating; a new one with a higher voltage rating (and same capacitance, compatible type) may last longer in circuit as well.


> new one with a higher voltage rating (and same capacitance, compatible type) may last longer in circuit as well.

That's not necessarily true; a higher voltage rating equals higher ESR, which means more heat.


That would require some experience, yet the most common visual clue would be 'bulging'. There are some ways to measure ESR w/o desoldering but they won't be reliable at all times.

Measuring voltages, peak to peak, is a bit more work.


A TV used to cost a few weeks pay and now you can get a TV for the equivalent of a few hours pay. There just isn't much of a market for a $3000+ TV.

Few usually means 3-5 or so; a half-decent TV would be at least half a grand. That's a rather high hourly pay rate.

Explain to me why this TV for $100 [1] isn't perfectly suitable to replace a 2008 40" 1080p Samsung LCD with fluorescent backlight that was a deal at $1000. Yeah, you could get something bigger and better. Yes, price comparison on a sale week is a bit unfair.

[1] https://www.bestbuy.com/site/tcl-40-class-s3-s-class-1080p-f...


FYI: Best Buy (the site, I mean) is unavailable outside the US, or likely outside NA.

Only one metric of 'quality' has plummeted.

A rock lasts billions of years, but its quality as a TV is rather questionable.


"that's well beyond the expected useful lifespan of even a modern TV, let alone an older model like these"

People still run these Trinitron TVs to this day.


It is a legitimate business decision, to sell things that last less than 20 years. Fine, I think it is lame, but it is their choice.

But, we shouldn’t let companies get away with selling products that catch fire after working fine for 20 years.


> that's well beyond the expected useful lifespan of even a modern TV

What? That's nuts. Why bother buying a tv if you're immediately going to throw it in the trash


My radial arm saw ended up getting a product recall for simply being too difficult for the average consumer to use safely. The "recall" amounted to them sending you instructions to cut off a critical power cord and mail it in to them, and they'd send you a $50 check.

That is completely unreasonable. Companies can't be expected to take in and repair devices that old.

For 1993 hardware?

They don't do recalls even on modern hardware. But soldering hacks are no longer possible; all parts are serialized.

Louis Rossmann made many videos on this.


What are you talking about? Capacitor technology hasn't changed substantially in decades, and it's just as possible to change caps with a soldering iron now as it was 20 years ago. I have no idea what you mean by "serialized".

Not capacitors, but more advanced components, like the camera, have serial numbers embedded in them, and the serial number needs to match, otherwise the phone won't accept the component. Components off a stolen device are put on a list and won't work in another phone, so stolen phones aren't even worth anything for parts, driving down the market for stolen phones. It also makes the job of repair shops harder, which is collateral damage in Apple's eyes, but is very much material for anyone running a repair shop.

The only reason this is an issue for repair shops is that they can't sell you recycled stolen parts at bottom-of-market prices for a sky-high markup. On top of that, the "non-genuine parts", some of which really are utterly dire, show up in the OS as not being genuine parts. Buying genuine parts, which are available from Apple, eats into the margins. There is very little honour in the repair market, despite the makeup applied to it by a couple of prominent youtubers and organisations.

The amount of horror stories I've seen over the years from independent repairers is just terrible. Just last year a friend had a screen hot snotted back on their Galaxy.


> they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up

What represents a more efficient economy? The one where broken phones get reused for parts, or the one where you have to throw them away?


The economy that isn't backed with criminal activity and loss for customers.

If you think Apple's part pairing policy has anything to do with consumer benefit, I have a bridge in Arizona to sell you.

> The only reason this is an issue for repair shops is they can't sell you recycled stolen parts at bottom of market prices for a sky high mark up.

This is just incredibly dishonest framing and completely ignoring what the right to repair and third party repair shop issue is all about.

> Buying genuine parts, which are available from Apple,

It is not a margin problem; it is an availability problem. Apple does not allow third party repair shops to stock common parts, such as batteries or displays for popular iPhones. Ordering these is only possible when providing the device's serial number. This effectively prevents third party repair shops from competing with Apple or Apple authorized service providers because they have artificially inflated lead times.

Becoming Apple authorized isn't an option for actual repair shops because that would effectively disallow them from doing actual repairs when possible, rather than playing Dr. Part Swap. Everything Apple does in the repair space essentially boils down to doing everything they can to avoid having competition in that space.

> eats into the margins

Replacing a 45-cent voltage regulator on a mainboard is cheaper than replacing the entire mainboard with everything soldered on, but doesn't allow for very nice margins.

> There is very little honour in the repair market

There is very little honour in any market. Honour does not get rewarded nowadays; people are in <insert market> to make money, and if you're lucky they still take a little pride in their work. Whether a repair shop offers good service or not should be up to the consumer to determine, not up to Apple (or any electronics manufacturer that employs the same tactics).

> makeup applied to it by a couple of prominent youtubers and organisations.

That is called marketing, which Apple is also pretty good at. They're also lying when they say they are environmentally conscious while they have their Genius Bar employees recommend an entirely new screen assembly on a MacBook just because a backlight cable came loose.

> The amount of horror stories I've seen over the years from independent repairers is just terrible.

The amount of horror stories I have experienced with Apple is no joke either. Apple is always taking the sledgehammer approach with their repairs. I've had the pleasure myself to deal with Apple repairs once for my old 2019 MBP. It wouldn't take a charge anymore, went to the Genius Bar and received a quote for a new mainboard costing well over 1000 EUR. Being familiar with some of the more technical videos of Rossmann etc, I found one electronics repair store that actually does board level stuff and got it fixed for a fraction of the price (iirc it was ~200 EUR).


Even if Apple has room for improvement here, I think it’s still worth it to try to curb the market for stolen parts, because that’s going to exist even if Apple sold spare parts in bulk at-cost simply because there exist unscrupulous repair shops that have no qualms with charging you OEM part prices while using gray market parts that cost a fraction as much on eBay, Aliexpress, etc.

For instance, maybe Apple could supply parts in bulk to repair shops but require registration of those parts prior to usage. The repaired iPhone would function regardless but loudly alert the user that unregistered parts were used to repair it. Gray market parts naturally aren’t going to be able to be registered (either due to serial not existing in their system or having been parted out from stolen devices), and thus the user is given some level of assurance that they’re not paid for questionable repair services.


I see. Yes, that is a big problem for component swapping. I was just thinking of electronics with old/faulty caps; those will still be repairable.

Doesn’t Apple offer a way to re-pair components if they are genuine and not stolen (unregistered from the previous AppleId)?

and Apple will very happily charge you for that privilege

TBH, for such a critical piece of our modern lives, I would be more than fine paying extra to be 100% sure I am getting original parts, put in professionally and in a manner that keeps my personal data secure. I wish e.g. Samsung had such a service where I live.

We're talking about expensive premium phones to start with, so relatively expensive after-warranty service is not shocking.

This may actually eventually sway me into the Apple camp. This, and what seems like much better theft discouragement.


I don't. Such mechanisms also disqualify 3rd party replacements. It is just a wasteful solution. Not that any smartphone would qualify as decent here.

But as a customer it will overall be more expensive for you.


3rd party replacements should be allowed, but the firmware should allow me to check if any third-party replacements have been made. I’d like to know, when buying a second-hand phone, if it has genuine parts. Third-party replacements are usually of much worse quality (I tried that many times, e.g. with third-party toners for HP, and every single time they turned out to be a waste of money).

There are things in life where the amount paid is far from the top priority, and a phone is one of them these days. With the sums we're talking about, I just don't care anymore, and the Samsung I have now is even more expensive and more wasteful.

Re wastefulness - a decent laptop causes 10x more pollution to manufacture than a phone. A desktop PC, 10x that. TVs. Cars. Clothing. Phones are very far down a very long line of higher-priority targets for an eco-friendly approach.


I personally don't care much about the $, but many probably do. I care that the manufacturers don't work with the user's interests in mind and instead take advantage of them. I don't want to be a customer here. Replacement parts are possible, and we maybe need regulation on that front to reduce waste. It doesn't matter if other devices are worse.

Other examples have a longer shelf life or are at least repairable without being tied to a manufacturer. Notebooks have similar problems, and the critique transfers here in a similar way. I see synergies in possible rules here, of course.


It is not about stolen phones; it is about monetization of customer service. If stealing phones were legal, the job description for procurement/purchasing departments would look different as well.

Anyone else a veteran of the Great Capacitor Plague? Seen more than one fire in the server room due to bad capacitors. "Burning-in" your server became literal.

It is not just Apple that did this, for example here is an equivalent from Atari: https://www.exxosforum.co.uk/forum/viewtopic.php?f=17&t=1698

The best way to remove an aluminum SMD cap is to grab it with needle-nose pliers, press it down into the PCB, and begin twisting it under moderate pressure, alternating direction, until it breaks away from the PCB. Never pull up!

What’s the liquid in the old capacitors? PCBs? (as in polychlorinated biphenyls… that abbreviation collision always annoyed me.)

I think I know exactly enough about electronics to ask more annoying questions than someone who doesn’t know anything at all.


Electrolytics are usually nothing too fancy, but the exact formulation is proprietary. Water and electrolytes, hence the name. PCBs are in the big transformers and what used to be called bathtub caps, which looked like this https://i.ebayimg.com/images/g/VjwAAOSwfGJjYtHx/s-l400.jpg (think 1950s electronics stuff)

Thanks!

"Wet" capacitors contain any number of liquid electrolytes. Could be something tame like ethylene glycol, boric acid, sulfuric acid, or nastier stuff like organic solvents (DMF or DMA which are poisonous, or GBL which is less lethal).

Nothing as bad as PCBs as far as I'm aware.


Cool, thanks. I think I should learn how these components actually work. Individually they seem pretty simple.

PCBs were only used in oil capacitors and some transformers. Generally these were used in motor and grid power applications. The only consumer applications are some motor capacitors and fluorescent light ballasts.

Yeah I pretty much grew up (unknowingly) playing in an illegal unmarked chemical waste dump so it takes a lot to get my attention, but I opened up an old fluorescent desk lamp from the 60s I had that fried itself to see if it was fixable — and found a small piece of crumbly asbestos shielding about the size of a business card stuck to a big leaky ballast. Pretty solid toxic waste combo. City hazmat got a sweet vintage lamp that day, sadly.

I have a Quadra 700 of this vintage that hasn't been powered up in 25+ years. Kind of wanted to fire it up again to experience the glory of A/UX one more time, but sounds like I'd have to replace all the lytics :/

Do it sooner rather than later; the cap juice loves to eat PCB traces. Same with the clock batteries - get those things out of there.

I think the Quadra 700 (and one or two other models) has tantalums on the mainboard from the factory.

I have an original Mac that no longer turns on. I bet there is a capacitor to replace. This is giving me the energy to go look for it!

There’s a very good chance the battery has leaked and caused quite a mess. Well capacitors are a problem the battery is the biggest one.

I spent my mid-childhood on an LCIII. One summer my friend brought his Performa over and we tried to play 1v1 Warcraft 2 over the serial port. LocalTalk or something like that?

But it just never quite worked right. I remember how frustrated and confused my older brother was. The computers would sometimes see each other but would drop off so easily.

Was this that?!


Iirc Warcraft 2 came out well after the LCIII was obsolete (in fact, I’d be surprised if it was ever released for 68k Macs). Official system requirements list PowerPC and MacOS 7.6 as minimum required[1]. Maybe thinking of a different game?

[1] https://www.blizzplanet.com/blog/comments/warcraft_ii_tides_...


Those requirements have to be from the expansion or something, I played a lot of WarCraft II on my 25 MHz '040 Centris 660AV back in the day (we also had a family Performa 5200 so my house was the only place my friends could play multiplayer networked WC2)

You can just about make out "68040" in the requirements on the box art here https://images-worker.bonanzastatic.com/afu/images/7569/5203...

Edit: found a photo of the expansion system requirements, also lists 68040 as supported https://cdn.mobygames.com/covers/4056502-warcraft-ii-beyond-...


We tried both WC1 and 2. Played both to death.

WC2 worked mostly well, just at a somewhat slower game speed.

In fact I got Diablo running on the LC3. A level took like 10 mins to load and it ran at 1fps though. :)


I've found a ground lug in a Kilowatt Grounded Grid amplifier... that didn't ground the grid.

I found a bad solder joint that looked ok, but was intermittent, and had been that way, in a Television built in 1948 and used for decades.

Bad design and assembly goes back forever, as near as I can tell.


The first board I ever designed and had manufactured had a reversed tantalum capacitor on the power rails and exploded somewhat dramatically when powered up. Lesson learned!

Does the -5V rail do anything other than power old RS-232 ports?

Macs have RS-422 ports, not RS-232. But, no.

I wonder if there were any bootleg boards that copied the silkscreen mistake, but didn't use those 16V capacitors, and ended up catching fire.

"The capacitor might not have been doing its job properly if it was installed backwards, but it didn’t seem to really be hurting anything."

This is the buried lede! I am of the opinion that half of the capacitors in any modern circuit are useless; the trouble is we don't know which half.


This guy knew https://en.wikipedia.org/wiki/Madman_Muntz

> He often carried a pair of wire clippers, and when he thought that one of his employees was "overengineering" a circuit, he would begin snipping components out until the picture or sound stopped working. At that point, he would tell the engineer "Well, I guess you have to put that last part back in" and walk away.[14]

Techmoan recently did a video, "The story of the 4-Track Cart" https://www.youtube.com/watch?v=Uqoglkbe9sc, covering most of Muntz's life.

On a serious note, this is a terrible practice and nowadays won't pass any EMC certification.


You can still do it, you just need to do it in the EMC test chamber :P

Why include that capacitor at all if it doesn't matter whether it works?

If you look at the traces you can see the capacitor is right next to the power connector, on the -5V rail (which is not used for much, only for the RS-422 serial port). The capacitor will be there to smooth the power supply when the machine is just switched on, or when there's a sudden load which causes the voltage to "dip" above -5V. Basically it's like a tiny rechargeable battery which sits fully charged most of the time, but can supplement the power on demand.

So you can see why it probably didn't matter that this capacitor didn't work: It's only needed for rare occasions. RS-422 is a differential form of RS-232 (https://en.wikipedia.org/wiki/RS-422) so being differential it's fairly robust against changes in load if they affect both wires. And the worst that can happen is you lose a few characters from your external modem.

In addition, electrolytics can probably work when reversed like this, at least a little bit. It's not exactly optimal and they might catch fire(!).
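To put a rough number on the "tiny rechargeable battery" picture above: with I = C·dV/dt you can estimate how long a bulk cap can hold the rail up during a transient. The capacitance and load current below are illustrative assumptions, not values taken from the article.

    # Hold-up time of a small bulk capacitor during a load transient, via
    # I = C * dV/dt.  All values are illustrative assumptions.
    C_bulk = 47e-6    # assumed bulk capacitance on the -5 V rail, 47 uF
    I_load = 0.020    # assumed transient load, 20 mA
    dV_max = 1.0      # allowed droop in volts before the rail is "out of spec"

    t_holdup_s = C_bulk * dV_max / I_load
    print(f"hold-up before drooping {dV_max} V: {t_holdup_s * 1e3:.2f} ms")
    # ~2.35 ms with these numbers -- fine for brief transients, which is all
    # this capacitor really needs to cover.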


> It's only needed for rare occasions.

The two RS-422 ports are actually used quite often on these old Macs for printers, modems, and AppleTalk networking. They were the only communication ports, as there was no parallel port. They were backwards compatible with RS-232.

So it obviously worked well enough.

The backwards cap was measured to reduce the voltage to about -2.4v.

I suspect that all it did was reduce the maximum range, which started at a massive 1,200 meters for RS-422 (and a good 10 m for RS-232).
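A quick margin check is consistent with that. The -2.4 V figure is the measured droop mentioned above; the receiver switching point is an assumption based on the earlier observation that practical RS-232 receivers of the era switched close to 0 V (the RS-232 spec itself asks for ±3 V).

    # Noise-margin check for a serial output swinging +5 V (space) / -2.4 V (mark),
    # i.e. the droop caused by the reversed capacitor.  Receiver threshold and
    # hysteresis below are assumptions, not measured values.
    mark_v        = -2.4   # measured droop reported above
    space_v       = +5.0
    rx_threshold  = 0.0    # assumed switching point of a period RS-232 receiver
    rx_hysteresis = 1.0    # assumed

    mark_margin  = (rx_threshold - rx_hysteresis / 2) - mark_v
    space_margin = space_v - (rx_threshold + rx_hysteresis / 2)
    print(f"mark margin : {mark_margin:+.1f} V")    # +1.9 V with these assumptions
    print(f"space margin: {space_margin:+.1f} V")   # +4.5 V
    # Both margins positive, consistent with the ports still working -- though
    # the mark side falls well short of the nominal -5 V drive.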


Also known as the Madman Muntz theory of Engineering :-)

https://en.wikipedia.org/wiki/Muntzing


I never knew there was a name for this :)

When I was a demo coder my artist friend would just haphazardly go through all my assembler code and snip random lines out until it stopped working to improve performance.


Why is the pool of goo under C21 when it is C22 that is flipped?

It leaked when they removed it; that's a common issue with these capacitors and part of why people replace them.

Didn’t this also happen on some Asus motherboards a couple years ago?

That one was the Asus ROG Maximus Z690 Hero, ~2 years ago.

Sorry to hijack the thread, I couldn't directly reply to https://news.ycombinator.com/item?id=42092845 .

The reason not to (just) use optical flow is that it isn't absolute. If you pattern your surface correctly, you can ensure that every few-by-few-pixel region of a QR-code-like bitmap surface is unique, and thus can be decoded into an absolute position. Basically a 2D absolute optical encoder, fast enough to be part of a motor control loop.
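A minimal 1D sketch of the "every window is unique" idea, using a de Bruijn sequence; real encoder surfaces extend this to 2D patches, and this is of course not the commenter's actual encoding:

    # 1D illustration of "any local window decodes to an absolute position":
    # a binary de Bruijn sequence B(2, n) contains every n-bit window exactly
    # once (with wrap-around), so reading any n consecutive bits pins down
    # where you are.  2D encoder patterns apply the same idea to small patches.
    def de_bruijn(k, n):
        """Standard de Bruijn construction (alphabet size k, window length n)."""
        a = [0] * (k * n)
        seq = []

        def db(t, p):
            if t > n:
                if n % p == 0:
                    seq.extend(a[1:p + 1])
            else:
                a[t] = a[t - p]
                db(t + 1, p)
                for j in range(a[t - p] + 1, k):
                    a[t] = j
                    db(t + 1, t)

        db(1, 1)
        return seq

    n = 4
    seq = de_bruijn(2, n)                              # 16 bits long
    # Window contents -> absolute position (wrap-around windows).
    lookup = {tuple(seq[(i + j) % len(seq)] for j in range(n)): i
              for i in range(len(seq))}
    assert len(lookup) == len(seq)                     # every 4-bit window is unique

    window = tuple(seq[5:5 + n])                       # the "camera" sees 4 bits
    print("absolute position:", lookup[window])        # -> 5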


What Apple era are those machines from? Is this before or after Jobs shafted the engineering department on the sale and Woz had to give them a bonus to keep them at the factory?

This is a good five years after Jobs left...

Thanks. Apple has been several companies depending on the era; if you're not a fan, it's hard to keep track.

I have my childhood LC II in storage

I wonder if it has the same defect


If anything you should open it up to check for any leaking batteries/capacitors.

not the Flux Capacitor?!?!

They were probably expecting these to fail a few months after the warranty expired.


