Plus, people tend to act more morally when they think they might be watched, whether they actually are watched or not.
Whenever someone refuses to show source code I always think, "what are you hiding in there?" There's usually something.
There's a dilemma that developers face when deciding to release source code that's bigger than fear of software theft or the desire to hide something evil in the code. It's a fear of being scrutinized, ridiculed, or humiliated over the quality of their code.
Imagine 2 programs that do something useful and are functionally equivalent. Program A is closed source. Program B has source code available for inspection. Suppose on inspection, program B's code turns out to be bloated, ugly, poorly organized, and with many potential bugs or defects. B's reputation is screwed. However, for all you know, A's code is just as bad or worse. But you don't know for sure.
Bloggers and reviewers will write that no source is available for A.
Bloggers and reviewers will write that program B's code sucks.
The consumer reads that "program A doesn't give you source code" and that "program B's code is garbage", even though the programs are otherwise functionally equivalent. Which do you think will have greater influence on most consumers and their purchasing decision?
That's one major reason why more developers don't release source code. I wish I knew a way out of this dilemma.
In any case, there are good solutions for it.
Firstly, if you know from the start that you're going to open source the code, then you'll make more of an effort than (perhaps) usual to ensure that the code is well-organised, well-tested and elegant. At least within your current level of competence, but that's all anyone can hope to do in any case.
Some humility is also required in my opinion. I know many people subscribe to a fake-it-until-you-make-it philosophy, and there's some value in that, but when it comes to open sourcing your code, it's good to check your ego and to be open to suggestions and criticism. There will always be people who are more knowledgeable and better than you at certain things. Best IMO is to accept this (and their criticism if you're lucky enough to receive it) and to see how you can learn from them and improve.
BTW, I speak from some experience. I have had some nominally embarrassing experiences with OSS where other people highlighted relatively obvious security issues with code that I wrote and which I thought was of high quality. However, in turn I got free QA from knowledgeable people and in the process the code improved further as I fixed the problems.
Also, as some people have mentioned already, OSS that is popular gets improved all the time. I.e., if you're lucky, and your code doesn't languish in obscurity, then you'll get patches and pull requests to improve your code.
So the way out of the dilemma... if you're really just being held back because you fear criticism and ridicule, is to ignore the fear, be humble, and to still open-source (perhaps after doing some cleanup, but not to the extent that you use it as a crutch to avoid open-sourcing).
You'll probably realize that the fear was totally unfounded or in the least exaggerated.
This is assuming you even realize that your code is un-organized and inelegant.
There is no doubt tons of software out there written by journeyman developers who don't know best practices or good coding techniques, but just know how to "ship it".
And, honestly, there's nothing wrong with that.
However, my experience with released software is as follows:
- Open sourced code I've seen is pretty shitty. So is closed source code I've seen.
- Programmers working in open source don't care that much if otherwise good projects have shitty source code.
- Bloggers and reviewers writing for general audience don't mention source quality at all. Hell, half of them probably don't understand what source code is in the first place. And it's OK, because general audience doesn't care about internals either.
- Reviews of application source code are very rare.
I work on very large projects and frankly, some of the code is utter crap. Still, if someone came along and criticized it without offering patches, they'd be the one looking stupid.
The project with closed source code can remain terrible forever even if there are programmers willing to improve it.
Here's an example:
Looking for a WebExtension alternative to greasemonkey on Firefox. (I know a port is being worked on, but let's ignore that for now.)
Your options are tampermonkey (closed source), and violentmonkey (open source)
Tampermonkey has been around longer, and probably has more features. Which do I choose?
I choose violentmonkey, because for all I know tampermonkey's code is garbage, and full of spyware. If violentmonkey doesn't meet my requirements, I can make it meet my requirements by coding the feature myself, or paying someone to add the feature.
Fear of that criticism being public is just an ego thing. If you write something people use, you will be criticized, if not for the code then for everything else. Closed source doesn't help with that.
Some say the only way out is to kill your ego, but that's pretty difficult to do. I think a smaller step that's easier is to simply stop attaching your self-worth to your code. You wouldn't hate a learning programmer because they're bad at coding. Extend that same feeling to yourself--unless you think you know everything, you're still learning. People might not like what you make, but that's an opportunity to figure out why, so you can do things differently next time.
But there are really few examples in the world made by one developer. Where's the fear of being scrutinized when you have a team of developers who are "supposedly" doing "code review" all the time?
This was the first thing that came to my mind as well, after reading the post!
> people tend to act more morally when they think they might be watched,
> Whenever someone refuses to show source code I always think, "what are you hiding in there?" There's usually something.
This extends further, to all the bias-perpetuation engines that the players (size immaterial!) of our software industry are peddling as silver bullets. No one knows how the black boxes are built, what biases were built in (unknowingly, or worse, knowingly!), what tests are done, what data is used, etc.
This thread on Twitter https://twitter.com/random_walker/status/901851127624458240 , when read alongside Cory Doctorow's post, highlights the dangers looming just ahead. They might go totally unnoticed amid the noise in the system, shrouded by short-term gains, with bad effects that will only become visible in the long term.
Hell these days, you'd have to build the hardware yourself to make sure someone didn't put something malicious in it.
I think it's mainly just laziness; most devs don't want to be always under review, and can live easier lives if they allow themselves a bit of a mess in their own projects without consistently dealing with complaints. And many open source users are super obnoxious, bombarding devs with insane questions/requests all the time, then acting super hostile when they don't get what they want right away.
As if this were any different for users of closed-source apps.
We aren't enforcing the laws we have and our grandfathers and mothers had. (Three guesses why.) Not on monopolies, contracts, patent misuse... nothing.
Just this week I and Hearthstone came to a stop - Blizzard's new policy insists on a credit card and says I owe them for purchases made if they leak the card number! I can't sign in to play "my" cards till I agree this is totally cool. Sure, the old policy said they could revise it as they liked, but the law says otherwise and always has. They don't care - it'll be years before the law is enforced against them, as it was with Steam and refunds.
No cops - so to speak - on the beat, and Trump vowing to fire more regulators, that's what's changed. The number of potential demons is more of a constant.
This also famously happened with bread. You can read about it for example in Bill Bryson's "At Home: A Short History of Private Life"
“Because bread was so important, the laws governing its purity were strict and the punishment severe. A baker who cheated his customers could be fined £10 per loaf sold, or made to do a month's hard labor in prison. For a time, transportation to Australia was seriously considered for malfeasant bakers. This was a matter of real concern for bakers because every loaf of bread loses weight in baking through evaporation, so it is easy to blunder accidentally. For that reason, bakers sometimes provided a little extra: the famous baker's dozen.”
― Bill Bryson, At Home: A Short History of Private Life
I don't really like to defend Trump's sayings, but in this case I don't think government regulators are needed. The company acted shitty, and you took the consequences: you quit and now avoid this company. That's how the market works. No need for regulation here.
The "only" problem with this in general today is, that most people do not understand technology at all, that's why big corporation's can get away with the shit they are doing, as there are still people using their consumer unfriendly, but shiny new products.
I doubt more government regulation would help with that.
Smarter people are required. And I do believe this is happening, it just takes some time.
Historically, "the consequences" have often included "you die horribly".
Sure, we can all avoid Hooker Chemical Company. (Well no, we can't, because it doesn't sell to consumers and we can't find out who consumer companies are supplied by.) But if we could, it wouldn't have saved anyone living in Love Canal. They got leukemia no matter what the market did afterwards.
I'm not being hyperbolic here. Markets work because they're iterated, voluntary transactions which gradually reveal asymmetric information. If the interaction is not iterated (e.g. fly-by-night companies), markets don't protect you. If the interaction is not voluntary (they poison your air), markets don't protect you. If the first asymmetric exchange is a disaster (you buy tainted produce and die), you never benefit from the gradual reveal of hidden information.
I'm generally a pretty staunch libertarian. But it's simply not true that this problem can be solved with a simple "you chose to buy this"; non-government solutions are vastly more complicated than that.
If a digital company and I don't get along, I quit the contract, and it is not really anybody else's business or harm, only mine if I feel treated unfairly. And the company's reputation, as I will share my bad experience.
But if a company is poisoning the real world, then that is clearly a crime. And I did not say that no police is needed.
We need more, smarter regulations, not less.
EDIT: I don't think it's reasonable to expect everyone to educate themselves on every technical advance. That's not possible with the complexity of technology people encounter in their everyday lives. And that's without even talking about the invisible technology we never see directly, like the software controlling our voting machines, hospital equipment, power plants, etc.
Yeah well, so the problem to me is not really missing regulations, but missing the will to use open-source.
And that is what I meant, that most people have no idea about technology.
To them it does not matter if something is open-source or not as they do not know the difference - they understand neither, it is all dark magic to them.
And of course, not everybody needs to have studied IT like we did. But I also do not understand the Linux kernel - yet I trust it. Because it is open-source and I can get in touch with the people developing it and see how it is done.
So I trust them.
And ordinary people could at least understand the same: if something is developed in the open, then other people have the chance to check it. If it is closed - much harder.
Very simple. And I have no doubt, that this knowledge will get in the heads of the people. It just takes some time (and action of course) - Computers for everyone is a quite new thing ...
Koch market-based ethics (see their books) really means: ignore the law; if the market allows you to cut any corner, cut it hard. That's where you end up: with third-world economic rules, and in time third-world results. (Starting with vicious inequality.)
And regulations about brainwashing ... I am not sure how that could really work.
Now try to get some work done while you are caveat emptoring.
There are many other experts, hackers, organizations, newspapers, etc. I can google before making a purchase.
Just like it already is, only with more responsibility for everyone, and therefore hopefully better market positions for nice companies.
In any case, you can not expect individual people to just "take the consequences" of bad actors and move on.
This can’t hurt you if you use a credit and not a debit card. Any false charge on a credit card is not your fault, and it’s not your money. The CC company will pay for it.
By all means, please - don't use debit cards online - but it's still not a panacea.
Those in charge ultimately bear the responsibility.
Not criticising leaders is the road to ruin, and senseless jingoism.
National humility trumps national pride as a useful virtue.
"... farmers have worked on their own equipment "for decades, generations even." Brasch also pointed to the emerging DIY sources of information in the world as a way that farmers and others who want to make repairs can learn about their equipment: "You can go to a YouTube for something as simple as baking a cake to repairing or operating an item. I think that's the way the market is moving. We'd like this market to move with the rest of the world."
This is one of the IP/copyright issues being negotiated in the new version of NAFTA (US, Canada, Mexico), as many farmers are affected.
The underhandedness/cleverness of the printer companies is not to be underestimated.
Of course that means that 99.99% of cartridges will be discarded with remaining ink, some of them with a rather significant quantity of ink. And it gives the opportunity to tune this number so cartridges are bought earlier. Although it's up to the manufacturer just how much ink is included, so the usefulness of this tactic is somewhat limited.
I remember what it was like when you'd print a 50-page document only to discover around page 24 you'd start seeing white streaks... and more streaks, and by page 50 you'd have nearly blank pages. Having the printer stop and let you put in a new cartridge when ink is low is a helpful feature.
However, the lockout of third party or even refilled cartridges is just egregious.
Only if it also comes with a "I know it might be streaky, but print it anyway because I own this machine and I'm in charge here" feature.
Or something. :)
It's a good idea, but you do realize that realistically, people would just swap the two out, and in a few weeks or months would just have two near-empty cartridges?
I used to have a truck with two gas tanks, and a switch that would let me toggle which one I used. I would usually be pretty diligent at refilling every time I had to switch over, but there were several occasions when I found myself with both tanks dry.
Plus, increasing the actual volume of product you sell without relying on a commensurate increase in demand is basically printing (heh) money as far as corporate returns are concerned.
Related to the above, and possibly specifically because of the above, I've noticed that HP doesn't document ink quantity in cartridges, only approximate cartridge yield in pages.
Oh, and the other thing that drives me batty. Their XL high yield cartridges are sold with "Get up to 2x the pages", but from page count it's clear that it contains 1.5x the amount of ink. So ya, you could get 2x the pages, but only if your pages use less ink. There's a disclaimer, but the link is broken.
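The arithmetic behind that disclaimer is simple. With made-up numbers, 1.5x the ink only delivers "2x the pages" if each page somehow uses 25% less ink:

```python
# Hypothetical numbers: page yield = ink in cartridge / ink used per page.
standard_ink = 400               # assumed ink units in a standard cartridge
xl_ink = standard_ink * 3 // 2   # "XL" holds 1.5x the ink -> 600 units

ink_per_page = 1                 # assumed ink per page at normal coverage

standard_pages = standard_ink // ink_per_page   # 400 pages
xl_pages = xl_ink // ink_per_page               # 600 pages: 1.5x, not 2x

# "Up to 2x the pages" only holds if each page uses 25% less ink:
xl_pages_light = xl_ink / (ink_per_page * 0.75)  # 800.0 = 2x standard

print(standard_pages, xl_pages, xl_pages_light)
```

So the "up to 2x" claim quietly assumes you print lighter pages than whoever the standard yield was measured on.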
What's worse would be if peanut butter was widely acknowledged to be safe to eat for 12 months, but the jar wouldn't open after 6 months.
Peanut butter? No. I've gotten really close to 100% out of most containers of liquid though.
The exact location of the line between them might be hard to pin down, but when comparing things as dissimilar as that, it's a great big fat line so the overlap of where you put it and where it should be is probably pretty large, making it easy to place with some confidence.
If you pay for ink you should be able to use it.
That would at least give a better estimate, and you'd only need to update the batch counter a couple of minutes after a print job (just in case there's another one).
That would allow for a great reduction in programmable memory write cycles. Careful programming would also allow the use of a ring buffer that 'loops' with an erase the ROM page to vastly reduce the write count per cell.
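That ring-buffer idea can be sketched in a few lines (purely hypothetical; no claim that any real cartridge firmware works this way): append each new count to the next free cell in a flash page, and only erase once the page is full, so you pay one erase per N updates instead of one per update:

```python
# Sketch of a wear-reducing ring-buffer counter (hypothetical firmware logic).
# A flash page of N cells starts erased (modeled as None). Each update writes
# the new count into the next free cell; only when the page fills up do we
# erase it, so erases happen once per PAGE_CELLS writes instead of every write.

PAGE_CELLS = 16

class RingCounter:
    def __init__(self):
        self.page = [None] * PAGE_CELLS
        self.erase_count = 0   # track how often the expensive erase happens

    def read(self):
        # The most recently written (last non-empty) cell holds the count.
        vals = [v for v in self.page if v is not None]
        return vals[-1] if vals else 0

    def increment(self):
        new = self.read() + 1
        for i, v in enumerate(self.page):
            if v is None:
                self.page[i] = new      # cheap: write to the next free cell
                return
        # Page full: erase it (the wear-inducing operation) and start over.
        self.erase_count += 1
        self.page = [None] * PAGE_CELLS
        self.page[0] = new

c = RingCounter()
for _ in range(100):
    c.increment()
print(c.read(), c.erase_count)  # 100 counted prints, only 6 page erases
```

One hundred counter updates cost only six erases here; a naive erase-per-write scheme would cost one hundred.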
- Someone who doesn't treat their customer as a criminal (e.g. no DRM in cartridges games, please)
- Works on Linux/MacOS/Windows.
In particular, I'm looking for a printer for my home. I don't print very much, so an inkjet is probably out if the cartridges still have a habit of drying up in a few months.
- If the printer is cheap, chances are they make money off cartridges. Expect to pay at least a thousand, but try to get something that will last for that.
- "Pro" lines are less likely to have issues like this. Separately, ditto laser printers. Get a laser printer, color if you must, but greyscale means simpler mechanics, lower price and better reliability.
- Laser printers use toner powder, which won't dry out. Uh, just be careful when refilling.
- Laser printers suck for photo printing, and quality in general is meh. This is something you will have to live with, as an inkjet printer that goes mostly unused will break rather quickly.
- Some brands (at least used to) have refilling the cartridges as an explicit feature. I think that was Brother? No clue if that's still the case.
- Pretty much everything works on all OSs these days, but your experience will generally be better on Linux. Yes, this is quite a change. That said, you might like a network printer better, especially one that can connect to Google's cloud print system. This will be listed on the feature set.
In fact, your attitude will probably systematically get you ripped off if you are an ordinary consumer, since the price-per-page calculation in the review probably comes from an assumption that the cartridge gets fully used, and the test was probably done under a fairly high workload.
HP is clearly abusing their position, and printer ink is one of the most expensive liquids you can buy, costing more by volume than exotic $10k-a-bottle wine or perfume.
More and more I see gas pumps ask if you want a receipt BEFORE the gas is dispensed. This seems risky.
If you decline the receipt and then dispense gas, the pump could cheat on the amount of gas dispensed with less risk, as a papered record of the purchase amount and price is not produced.
If on the other hand, the pump waits to ask if you desire a receipt until after the gasoline is dispensed, the dispenser will not know if a written record will be requested, and cheating the customer is riskier.
Therefore, I always request a receipt if asked prior to dispensing my gasoline.
Regardless, this seems very easy to test. Just fill up a gas can and check that you're paying the right amount for the amount of gas in the can.
True, the receipt protects against 2) more than 1), because the car's tank does not have an accurate gauge.
I remember listening to a story once about fraud at the gas pump, in which the correct amount of gas was only dispensed when requested in an even increment of five gallons, which was how much the regulators requested when they came and tested a gas station's pumps. They were caught after a suspicious regulator requested an odd amount of gasoline and got the wrong amount.
A combination of receipts and camera/ML classification that detects when a customer is filling a gas can versus filling their car would allow a rigged pump to evade weights and measures regulators. This is a problem I foresee.
The places where this fraud is most likely to exist are niche markets and places where it's hard to verify. Walled gardens of software or hardware with few outputs to measure.
But they could collude... o_0
See link for a scam using an altered microchip and a scheme to protect itself from discovery by weights and measures.
That said, you do specify the amount of gas if you're paying cash here in the US. You go inside, say "here's $20", and the pump will dispense exactly that.
It's worth noting that it wasn't that way the entire time I was growing up. You would pump first then pay the cashier. It wasn't until around 2005ish that I noticed signs instructing cash customers to prepay.
They can't easily cheat on prices as usually there are large signs displaying the current price. They can't easily be cheating on amount dispensed, as the pumps are regularly checked by the local Weights and Measures Department (doesn't rule out a VW like cheat though).
So, your gas pump has to be accurate for the first five gallons, but then you can cheat as much as you want.
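The cheat from that story is depressingly simple to sketch (illustrative only; the 5-gallon threshold matches the anecdote above, and the 3% skim rate is made up):

```python
# Sketch of the pump fraud described above (hypothetical numbers).
# Inspectors historically tested in even 5-gallon increments, so the
# rigged firmware dispenses honestly for those and shorts everyone else.

SKIM = 0.97  # deliver only 97% of what the meter claims

def dispensed_gallons(requested):
    if requested % 5 == 0:         # round amount: assume it's an inspector
        return requested           # dispense honestly, pass the test
    return requested * SKIM        # ordinary customer: skim 3%

print(dispensed_gallons(5))   # honest: passes the standard 5-gallon test
print(dispensed_gallons(7))   # approximately 6.79 gallons for a 7-gallon sale
```

Which is exactly why the regulator requesting an odd amount blew the scheme open.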
The only saving grace is that the people who make the pumps and software wouldn't be the main benefiters from cheating at the pump. So probably we're actually okay for gas pumps. It's situations where the software creator also benefits from the sales using that software...
But maybe I'm wrong. Maybe independent gas station operators are buying bootleg software ROMs to flash their pumps... "Boost revenue by 10% with this one easy trick"
And, I see from another comment:
However, in the pumps he's talking about, the machine prompts you beforehand: "Click here for a receipt". The receipt decision (yes or no) is always before. There is no way to ask for a receipt afterwards.
He always asks for a receipt (even if he doesn't need the receipt) because he figures that the machine is less likely to cheat him in that case.
Form a company that explores new markets in legal liabilities. It could bring lawsuits with little risk where the payoff could be billions of dollars. Off the top of my head:
* Research whether channels were engineered into smartphones to allow water to leak in (since they have no moving parts and should self-evidently be watertight).
* Find the planned-obsolescence parts in things like car doors that were engineered too thin or out of plastic so that door and window handles fail after a certain number of uses.
* Find evidence that companies opted to use proprietary battery and charger form factors which drove up prices and prevented interoperability.
...the list is nearly endless. Most of these seem like they depend on research or whistleblowers. If the free market and regulations won't prevent this kind of widespread hacking then maybe lucrative opportunities could be found working within the courts!
Proprietary batteries are the reason thin electronics exist. You can't use 18650s to make a MacBook.
I think the problem here is that you vastly underestimate how hard manufacturing is. The things you're proposing are like me trying to sue Facebook because it crashes all the time. Is it annoying? Yes. Is it because they're actively trying to subvert me for nefarious reasons? No, it's just because they don't know how to do it better in a reasonable price range.
Speakers are trivially sealed, as are buttons. They are just a fancy electromagnet, or piezoelectric crystal.
Also, both can work through induction, potentially keeping them outside a "pressure chamber" and forgoing sealing altogether. Though I just want the steam of my shower not to mess up my phone, not to dive 100 m with it.
Not to mention sockets for SIM and SD cards. If you can take the back off the whole phone to change the battery, that makes it a lot harder.
Quite a few phones are waterproof or splashproof:
Exact meaning of ingress protection "IP" ratings: http://www.dsmt.com/resources/ip-rating-chart/
Or if you want something cheaper: https://www.gearbest.com/cell-phones/pp_602673.html
There’s no way in hell car companies are purposely under-engineering things to cause them to break. That damages the brand reputation, and will drive the next sale to a competitor.
Yes, they make engineering tradeoffs. No, they don’t design a part with the specific intention that it will fail earlier than a comparable (cost-wise) design would.
Either way, it wasn’t intentionally designed to do that. There was no engineer that said, “Hey, let’s juice our parts revenue in 5 years by making this hinge really weak.”
EDIT: Thanks for that article, though. I could see myself getting suckered into a good deal on a Benz. Now I know why the price is so low.
So maybe these things aren't as widespread as you'd think... at least not in a legal sense. I'd also like to think that maybe people aren't "designing" flaws into things rather than finding the maximum economic benefit, or just are "naive" to what they design.
Sort of like a patent troll, but for good? I do like it.
I think there's always a danger of having this type of litigation be abused (in this case I'm also thinking of drive-by litigation around the Disabilities Act etc.)
edit aww spitfire beat me to the patent troll punch.
The only thing I can think of is the flash drive is slowing down as it wears. Or, the CPU clock rate is programmed to progressively lower itself the longer it runs.
Has anyone done the performance analysis on used phones to prove this isn't just my brain moving the goalposts as hardware improves, or apps just slowing down as they bloat, but that the old devices really and truly are running the same software significantly slower than when they were new?
Even something as simple as GitHub and using pull requests for those who know how to do so (or manually adding patches for those that email and don't) would give a lot of introspection, and allow you to share commit bits so people could help (as well as making it easy for people to clone and run analysis on if they desire).
I'll probably be doing this for the next iPhone, so it'll be soon.
While this is not super accurate because OS and app updates could affect their startup time, it is still something measurable. It will be interesting to see if a year from now a set of same apps with basically same functionality will take more time to start on the same (or newer) hardware.
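As a rough sketch of the method (on Android proper you'd use `adb shell am start -W`, which reports activity launch time directly; this desktop version just illustrates the repeated-measurement idea with a stand-in process):

```python
# Rough sketch: estimate a process's cold-launch time by timing several
# runs and keeping the median, which resists one-off outliers (background
# tasks, cache effects). The launched command here is a trivial stand-in;
# swap in whatever app or command you actually want to profile.
import statistics
import subprocess
import sys
import time

def median_launch_seconds(cmd, runs=5):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)   # launch to completion
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

t = median_launch_seconds([sys.executable, "-c", "pass"])
print(f"median launch: {t * 1000:.1f} ms")
```

Recording numbers like this at purchase time gives you a baseline to compare against a year later, instead of relying on a vague feeling that things have slowed down.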
Smartphones, though? The only slowdown I've noticed is when the battery starts to go. And then you can usually tell; the device gets hot as the battery pushes up against all the ESR it's accumulated, the CPU steps back a bit...that's what I figure, anyways. For me, it's always coincided with the same inflection point when I start really noticing things like charge cycles taking much longer and a sharp drop-off in the life of a 100% charge.
The apps are also evolving. Compare Facebook today to what it was two or three years ago. The app does a ton more and all that extra functionality doesn't come for free. I suspect one of the primary reasons they broke messenger out into a separate app is that the Facebook app was simply too damn big.
There have been instances where an upgrade has made my experience better, but the trend has definitely been in the other direction (for me anyway).
I also have another concrete example: the original Nexus 7 had an SSD that didn't support trim. Over time it became nearly unusable as the SSD became fragmented. It was also severely bandwidth constrained. The device was decent until the first major Android revision came out. At that point, the new version of Android so significantly increased the bandwidth requirements on the SSD that it effectively strangled the device.
What was an awesome device the day I got it became absolutely unusable. Eventually I installed a version of CyanogenMod and formatted it with an SSD friendly file system which made it passable again but, still sucked compared to the day I bought it.
It used to be quite nice with the slim ROMs, but at this point nothing has helped to get it back to run like it used to. 4 years of light use (99% of the time sitting idle).
After the device has written its complete 32 GB or whatever, it drops into slow mode on the storage Flash.
There are ways to get the storage performance back, just search for Nexus 7 and trim.
This Android 7 Nougat build should have many of the ParrotMod optimizations out of the box, however. It works great on my other Nexus 7.
Now in this story I'm just an alchemist with no experimental controls, but then again, the experience is not isolated. I'm not saying it's purposeful software slowdowns - maybe it's general bloat + flash memory wearing out? I don't know. But seeing as others report similar cases too, something is going on.
I’m sure you can find a way to do profiling on a smartphone, anyway.
Obviously regular updating is important for security, but I sort of suspect users would be more open to updates if they weren't consistently harmful and occasionally catastrophic.
My phone's storage is not large by today's standards; I have 16 GB. I run fewer apps than I did when I got the phone, and have run the same general collection from day one. I have less music on my phone than I had in the past. Yet I have had to continually remove applications and music, as just about every week it warns me that my storage is almost full. This only started within the last year. I won't have added anything new, I keep my cache flushed, etc., and I still have to continually remove items.
Is there something practical I'm missing here, or... ?
Apps are getting larger due to more and more code accumulating as time goes on and apple's toolkit / compilers causing binaries to become larger.
Swift itself creates incredibly bloated binaries compared to objective-c. More people are using more 3rd party libraries than 5 years ago, where you just used apple's built in libraries for the most part. If you just used the shared system libraries, then that didn't count against your binary size.
I figured that aging of the tech vs package size would be an issue, including the growing size of the OS... but do you think this is a problem even in long-life applications? I mean, surely AAC and MP3 files haven't bloated so significantly in the past 5 - 10 years.
I haven't worked much yet in Swift and wasn't aware of those differences vs Objective-C, though. That's interesting.
The binary size of swift projects doesn't catch most people, only large projects with > 400k lines of code as they go over the cellular download limit. It's the new hotness, so many projects are adding swift even if the app is old.
Adding swift to a project adds ~20MB alone for the standard libraries that get included with each app, and each line of swift creates that much more binary bloat. The autogenerated interface between swift and objective C creates even more bloat.
So I guess the problem is pretty simple, if not immediately obvious for the details of it.
I sigh and guess that means I just have to upgrade soon...
* indexing will take forever
* xcode will beachball continuously
* the debugger will crash because the codebase is too much to handle
* build times will be 4-5x more than the equivalent obj-c codebase
* startup times will balloon significantly if you have many separate libraries, since using swift forces you to use dylibs. 60-100 libraries will mean 3-5 seconds being added to your app startup time
They are all solvable problems, but it will take several years before they are fixed. But I don't think swift will be better than C++ development, since they are both similar languages as far as compiler tradeoffs go.
I notice some lag even in small swift projects compared to objective-c, but it isn't major enough to worry about.
It's still running because I deleted all the apps except a few core ones, but I'd appreciate some tips about this too.
Ars Technica's Android 8.0 deep-dive has some interesting charts showing Android device performance deteriorating over time:
Unfortunately I'm now at end of life as far as iOS updates go. I don't think I will bother with updating the hardware for some time yet.
just ordinary use makes them that way. Calibration is necessary for all lithium batteries, but normally you charge when you think it's time rather than waiting for the battery to get empty first
Presumably this is because Apple keeps adding features and animations and so on, so iOS grows bigger and more resource-hungry, as do apps, and developers aren't optimizing by running their stuff on old iPhone 5 models anymore.
This seems to mirror Wirth's Law more generally; every piece of software on your phone gets more demanding every time it updates.
I've got a Galaxy S3, an S4, a couple of first-generation Moto Gs, a 2012 and a 2013 Nexus 7, a Note 2 (I think?), 2 Note 3s, and a One Plus One in various states of: loaned to cousins, used as house phones, backups in a drawer, backups in cars, or lying on my desk.
They were all either broken, bought at yard sales, given to me by clients / contacts that didn't want them, or <$20 on eBay.
In general four things kill these devices:
Touchscreen breakage. It is almost never worth trying to replace if the screen cracks.
Flash burnout. Shitty flash chips don't last forever. I've binned almost every phone older than this crop because the flash memory dies.
Charger port wear. Micro-USB sucks, and replacement parts vary wildly depending on the model - I can get an S-series charger port for <$5 most of the time, but replacing a Droid phone's charger port once proved impossible because the charger harness was soldered to the PCB.
Software. I generally outright ignore devices without a ROM scene and an unlocked bootloader, but even then it depends entirely on volunteers how long Cyanogen/Lineage/Paranoid/etc are willing to keep supporting these fossil kernels. The S3, Note 2-3, and original Nexus 7 are all on their deathbeds because of lagging community support. It is worth mentioning, however, that the Samsung devices have now been community-supported far longer than their official support periods lasted. Great job Samsung.
The software is the ultimate killer. What should be the easiest to maintain is the hardest, because corporate greed and hunger for control trumps customer respect. All my mobile devices are cheap, used, or broken when I get them because none of these exploitative abusers are worth giving a direct cent to.
Sadly mobile flash doesn't support SMART monitoring. There are three indicators, but your flash can fail at random without any of them being observable:
Sector reallocations. As sectors stop writing or reading, the package reallocates data. This process is intensive and usually lags out the phone. If the whole phone freezes when you move large amounts of data onto or off the flash, this can be why.
Stunted read / write speeds. As the flash degrades and more sectors go bad, read and write performance suffers, and fragmentation gets worse as working sectors dry up. If your phone benched ~80MB/s reads or writes the day you got it and is down to ~20MB/s five years later, it is likely nearing a failure point. This is usually gradual aging, but you do often see a steep, sudden performance crash right before the whole chip becomes unusable.
Crippled access times. The former was data rate; this is data latency. Latency should stay near-constant over the lifetime of the chip, so if it starts climbing for very small data sizes, the chip's controller can be dying. Which happens, because a lot of corners are cut in phones, and flash controllers are often really, really cheap.
It would be useful if we could get A. lifetime write averages for the flash chips in popular phones and B. a way to trace such a number throughout the lifetime of the device, but we don't have those, so you are almost always flying in the dark on when your phone's memory will die.
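Absent SMART data, tracking throughput yourself over time is about all you can do. A crude sketch of the kind of measurement you'd log periodically (sizes and any thresholds you compare against are illustrative, not from a real tool):

```python
# Sketch: measure sequential write throughput, the number you'd record over a
# device's lifetime to spot the gradual slowdown described above. Random data
# is used so compression can't inflate the result; fsync forces the bytes to
# actually hit storage before the clock stops.
import os
import tempfile
import time

def write_throughput_mb_s(size_mb: int = 16) -> float:
    """Write size_mb of random data to a temp file, fsync, and return MB/s."""
    chunk = os.urandom(1024 * 1024)  # 1 MiB of incompressible data
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(size_mb):
                f.write(chunk)
            f.flush()
            os.fsync(f.fileno())  # don't let the page cache fake the number
        elapsed = time.perf_counter() - start
    finally:
        os.unlink(path)
    return size_mb / elapsed

print(f"sequential write: {write_throughput_mb_s():.1f} MB/s")
```

Logging that one number once a month would give you exactly the ~80 → ~20 MB/s trend line the parent describes.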
This was the norm in the US until the late 1800s. Indeed, corporations had to act in the public interest. And if they didn't, they were dissolved.
But then, the railroad corporations got wealthy enough that they were able to buy favorable Supreme Court rulings. Basically, they got human rights. After former male slaves, but before women.
Source? Let's say a company owning a factory was thusly dissolved. What would you do? Parcel bricks out to shareholders?
We got rid of this for a good reason. It nukes long-term investment by paving the road for expropriation.
The American Revolution was just as much against Crown Corporations as against British rule.
You are correct in stating that it took political capital to form a corporation in Revolutionary times. But that didn't mean they existed at the pleasure of the state. Furthermore, the close nexus between corporate chartering and political proximity showed its seams in the 19th century. That's why states moved the chartering process to independent bureaucracies.
But I would like a nice readable article like that to appear in more mainstream publications. It should make a good story, being both true and sensationalist and important at the same time.
Doctorow has been sensationalist for as long as I've been reading him (15 years?), and I think even more so as time goes on. Too often it detracts from his point, but sometimes he hits the nail on the head.
Also recommended by him: "The Civil War over General Purpose Computing" (2012): https://boingboing.net/2012/08/23/civilwar.html
It's even truer in hindsight.
However, body and other simpler mechanical work can be done outside service centers (for Teslas too.)
Also, the conditional cheating reminded me of The Story of Mel, a Real Programmer:
This has actually been the case for some time. The car magazine wouldn't just go borrow a car, they would get one directly from the manufacturer. And the manufacturer would send a ringer, a vehicle with an EPA test-exemption that doesn't have to comply with any emissions regulations.
I suppose the era of Youtube car review channels is bringing that method to a close though.
I guess they just didn't foresee someone buying the domain.
I'm no expert, but I do recall reading in one of the VW write-ups that there is known malware that does this.
Anyone know what iTunes update he's talking about? I don't remember anything that fits this description.
I suspect the Kindle feature he's talking about is the text-to-speech feature, which the publishers hated because it threatened audiobook sales. Or maybe he's talking about Amazon deleting books from customers' Kindles?
They removed the ability to use Linux on the PS3. This is probably one reason they have been consistently targeted by hackers.
Besides, it must be fun to keep hacking the same company and seeing that they haven't changed anything in their IT.
> "Most people don't even know what a rootkit is, so why should they care about it?"
I'm pretty happy to have not bought a single Sony product since 2005. I think that people should boycott brands that spit in their soup, but Doctorow makes a better point: these companies should be killed.
I mean, it's their right to make it a purely gaming console, but they should also accept the blame for tanking their reputation with the open source community.
Supposedly to save local disk space.
I recall an account of a composer losing his work files that way: they got compressed behind his back. Thank goodness he had a backup.
This is a very strong statement. Asking for high margins puts you at war with your customers?
In a world populated by IOT devices full of software (as discussed previously https://news.ycombinator.com/item?id=15034955), we'll end up in a post-scientific world where the underlying rules that govern a device's behavior are so complex and arcane that we'll have little chance of reverse engineering how basic devices work anymore.
I think in practice it will mean that devices become bricks relatively quickly, and when people realize they have been cheated, there will be a strong backlash: imagine "paleo diet" but for devices.
Courts won't have it. They will stop this too. Digital property will be declared property, not licenses. No limits on resale, transfer, rental, and the like. Companies will howl like stuck pigs. And it will benefit them, as well as consumers, tremendously.
What? Is he being metaphorical or does he mean via the environmental impact? That's a stretch IMHO. Or alternately am I simply missing something?
However, thus far only the programmer at VW has been prosecuted and sentenced to jail. His "following orders" defense did not work.
All the high-level executives will probably walk away with VW paying a fine.