I think a better way to measure value is
invention's cost effectiveness = price / (time * improvement to well-being)
By this measure, smartphones are not bad but probably lose out to the Internet, washing machines, and air conditioners/heaters in hot/cold climates. Google is a pretty big improvement over older search engines and is worth quite a lot, although the price we pay in privacy is not obvious.
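The metric above can be sketched in code. All the inventions and scores below are made-up illustrative numbers, not measurements; lower scores mean more value per dollar:

```python
def cost_effectiveness(price, hours_used_per_day, improvement):
    """Price in dollars, usage in hours/day, improvement on an arbitrary 1-10 scale."""
    return price / (hours_used_per_day * improvement)

# Illustrative guesses only -- the point is the shape of the metric, not the values.
inventions = {
    "smartphone":      cost_effectiveness(1000, 3.0, 4),
    "washing machine": cost_effectiveness(500, 1.0, 9),
    "air conditioner": cost_effectiveness(400, 8.0, 8),
}

# Lower score = cheaper per unit of time-weighted well-being.
for name, score in sorted(inventions.items(), key=lambda kv: kv[1]):
    print(f"{name}: {score:.1f}")
```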
I think this is far more difficult to evaluate than your comment admits.
More transistors -> more software that can do more things, written by developers with a wider range of skills, in a society with increased tech permeation, and so on...
An 8-core Ryzen, on the other hand, is 4,800 million.
You're right, 10x the transistors is less than 10x as useful. 10x faster CPU in an iPad is not 10x better, because you won't be able to harness all that speed before you hit another bottleneck. And you can't buy a fraction of an iPad. And you definitely can't eat one.
but then you say...
> Google is a pretty big improvement over older search engines and is worth quite a lot although the price we pay as privacy is not obvious.
which is puzzling in light of your very own definition of incremental value.
is google's search engine so much better than bing or yandex or anything else that it deserves a much higher valuation? if you apply your own incremental definition of value to search itself, the billions of searches done on other search engines seems to contradict that conclusion. it seems the other search engines are just fine for many situations, so the incremental value of google's search is not high on either your "time" or "improvement to well-being" metric (price is hidden in ad revenue, but likely higher for google).
google's valuation broadly comes from (1) its ad business and (2) speculation that google will conquer another industry, not from its search technology per se, although search is certainly a core component of the ad business.
it seems to me the way that search contributes to value is that it exhibits winner-take-all characteristics (not considered in your model), so that even a slight (perceived) improvement leads people to flock to the better search engine en masse (which is how google won search in the first place, then they developed/borrowed the ad auction model and became a runaway financial success). by garnering more search queries, google has greater and better ad placement inventory (i.e., your eyeballs) to sell to advertisers.
True technological progress isn't going to be presented at a fancy keynote, it's going to be messier.
The picture is from the Computer History Museum. Well worth a visit.
But that brings up the question, what tech could we be subsidizing today so we can get it faster?
While I agree there should be zero subsidies for fossil fuels, the corporate taxes they've paid drastically outweigh the modest subsidies the fossil fuel corporations have received. The only way that isn't the case, is if one includes extremely fuzzy environmental damage estimates (in which case nearly every industry that has ever existed has been theoretically massively subsidized).
And yet they have still been doing more than fine for all these decades; they're still doing really well.
> modest subsidies the fossil fuel corporations have received
"Modest" is an odd word to use, considering that subsidies come in more forms than straight-up sending money to a company: companies can also be subsidized by taxing certain behavior less, or by weighting factors like long-term environmental impacts differently.
I've seen a way more detailed breakdown of this issue before in another study, but I just can't find it anymore.
As an American, I’d say internet deployment and infrastructure upgrades, but multiple government entities have tried that and the ISPs end up doing nothing.
It seems to be something that's easier to do at a local level.
Mobile wireless - we already have theoretical innovations that could solve it fully and make it practically free
Cargo transport. Hybrid airships come mind.
On-demand ride-sharing (multiple people in minibuses). It won't even take money, just a law demanding all transport requests be done through a single app, then sharing the data to create a market.
Nuclear for sure.
Microcontroller-based products - so much complexity is added to the field by the games MCU makers play to increase profit (limited RAM, flash, and peripherals, when they cost nothing).
Industries where planned obsolescence may be occurring. So we could get cars that last a million miles (Toyota had that but fixed that problem), more reliable appliances, etc.
That's one of Apple's brilliant strategies: instantly commoditize new technology, paying cash for new factories that produce next-gen tech components at such high volume that prices immediately drop to commodity levels.
This leaves "begs the question" as a weird, pretentious-sounding, overly emphatic expression that automatically annoys some percentage of the audience and will slightly confuse the rest (for at least the next decade or so until the purists give up as they did with "irregardless," "could care less," and "literally").
Of course, if your goal truly is to unpredictably annoy semi-arbitrary subsets of people with something as neutral as your speech, then by all means, beg away. Personally, I don't have a choice. I'm a parent, and as is the case with any parent, my kids internalize everything I do. So I can't have fun adding auto-troll expressions into my communication, because they'll do the same without realizing the consequences. I don't like that the world is full of people who both (1) have the power to decide my fate, and (2) set the bozo bit from small things like the quality of my communication, but I'm pragmatic enough to accept that those people do exist and will judge my kids' fate. That's why I speak as if it matters.
This site seems to have more prices of goods
A cheap Ford car was $1879, or 17 iPhones.
Kinda holds, as a cheap Ford today is around $14k.
1000USD today would be around 100USD in the 50s.
Or, you could assume an average rate of inflation and calculate this using the rule of 72. I like to keep that formula in my pocket for things like this.
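As a sketch, the rule of 72 can back out the average inflation rate implied by the parent's roughly-10x claim. The year span and the 10x ratio are taken as rough assumptions:

```python
import math

def doubling_time(rate_pct):
    """Rule of 72: years for a quantity to double at rate_pct % annual growth."""
    return 72 / rate_pct

doublings_for_10x = math.log2(10)                          # ~3.32 doublings in a 10x rise
years_elapsed = 2017 - 1955                                # ~62 years, an assumed span
implied_doubling_time = years_elapsed / doublings_for_10x  # ~18.7 years per doubling
implied_rate = 72 / implied_doubling_time                  # ~3.9% average annual inflation

print(f"{implied_rate:.1f}% average annual inflation")
```

So "$100 in the 50s is $1000 today" corresponds to a long-run inflation rate of roughly 4%, which is in the right ballpark.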
The spec improvements of the iPhone X do not line up with the accompanying price increase. Even by iPhone standards it's an expensive device. If you justify the price increases you're just telling Apple you're prepared to be short changed when the iPhone X's successor comes out.
The iPhone X is almost certainly higher margin than the iPhone. My argument was that if consumers tolerate the $1000 price this time, they shouldn't be surprised to see future high end phones at similar prices.
On the other hand, it's interesting they chose the semiconductor industry as their example, because progress there is slowing due to the laws of physics with respect to standard CMOS, and rescues through either architectural innovation or novel device physics are very unlikely given the industry's glacial pace and apparent allergy to innovation. A field that scorns young entrants is bound to die someday.
I've been in electronics and computing ~40Y, and worked with some moderately big iron on the way, eg in banking.
That iPhone has more oomph by orders of magnitude than an oil company's exploration division had not so many years ago. I worked there and looked after the machines.
Glacial pace? It followed an exponential, doubling the number of transistors roughly every two years for 40 years. You would be hard-pressed to come up with another field with the same pace of innovation and progress. The amount of innovation they had to come up with to sustain that is staggering.
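A quick sanity check on that exponential (the two-year doubling period is the commonly cited Moore's law cadence):

```python
doubling_period_years = 2   # commonly cited Moore's law cadence
span_years = 40
growth = 2 ** (span_years / doubling_period_years)
print(f"{growth:,.0f}x")    # about a million-fold increase in transistor count
```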
> A field that scorns young entrants is bound to die someday.
What nonsense. Who will produce your chips then?
Try to make a chip and you'll quickly realize that your challenges are primarily non-technical in nature. It wasn't always this way; the semi industry was once incredibly innovative and open to new ideas, and that helped drive exponential progress.
>Who will produce your chips then?
I meant "die" in the common tech sense of stagnating.
I studied microelectronics. I am aware of the technical challenges. Can you explain those challenges that are primarily non-technical?
> It wasn't always this way, the semi industry was once incredibly innovative and open to new ideas, and that helped drive exponential progress.
Things like Silicon-on-insulator, high-k dielectrics, finfets, extreme ultraviolet lithography are not innovative or new ideas?
We spent far more time buying EDA software, installing it, talking to foundries, getting the PDKs, signing NDAs, dealing with buggy EDA software, dealing with slow EDA response times, etc. than actually working on our chip.
>Things like Silicon-on-insulator, high-k dielectrics, finfets, extreme ultraviolet lithography are not innovative or new ideas?
I'm not saying they aren't, but I have noticed that the general level of openness, and with it innovation and open-mindedness, has dropped dramatically in the past decade or so. I should say that the core semi industry has stayed generally innovative, and much of my criticism is directed at the rest of the industry. That being said, the pace is still glacial.
Example of a real conversation I had with an engineer at one of the major (can't name the exact one) foundries about a device that's actually pretty close to reality:
Me: "Why don't you use this X device?"
Him: "Because it's still research"
Me: "Sure, but it's very promising, why aren't there at least any industrial research efforts to commercialize it?"
SOI is innovative, but it's been held back by cost and the self-heating effect, both things that really aren't that much of a problem.
FinFETs were launched by a DARPA initiative.
High-k dielectrics I will say are the single most interesting (if not innovative) innovation in the last decade in the semi industry, although I have some bias there.
EUV is a feat to engineering no doubt, but again, my grievances aren't really focused in that area.
Either way, the tool they used (Geekbench) apparently looks into chip-specific optimizations for tests, and the rest of their process is not disclosed, so it's possible all of their benchmarks are non-uniform (again, based on an optimization ticket here: http://support.primatelabs.com/discussions/geekbench/18305-o...)
Maybe not with a '57 vacuum-tube computer (transistors actually existed by then), but then there's this:
The massive increase in computing power has been accompanied by a corresponding increase in complexity. For better or worse.
multiple milliseconds or seconds? Ridiculous! Those of us old enough to remember 9600 baud know the disappointment (in our lifetimes) of waiting many minutes to download a JPG just to find it wasn't what we wanted. Or that MP3 track from some obscure FTP server...
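For a sense of scale, here's the back-of-envelope math; the 10-bits-per-byte overhead and the 200 KB image size are assumed but typical figures:

```python
bits_per_byte_on_wire = 10           # 8 data bits + start/stop overhead (assumed)
jpeg_size_bytes = 200 * 1024         # an assumed 200 KB image
bytes_per_second = 9600 / bits_per_byte_on_wire
seconds = jpeg_size_bytes / bytes_per_second
print(f"{seconds / 60:.1f} minutes") # several minutes for a single image
```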
But this problem applies equally to local software.
I believe there was a name for it but I cannot recall the term. The idea is that as hardware gets faster, software developers add bloat^W features so that the user always experiences the same speed. "Speed" means something like the time taken to perform some routine user task.
Microsoft has been usurping user resources like this since at least the 1990s. They always had the new hardware before it hit the market. New software was tested on new hardware that no consumer yet had.
By the time the user purchased a new computer, the company had a new software version that already uses up whatever resource gains the new hardware provided. Pre-installed. The end result is the user experiences the same speed.
If the user ran the old version on new hardware, they might see a speed increase.
But the company makes it very difficult to do this and/or wages a relentless marketing campaign to convince users to try the new version and ditch the old one.
The new software version did the same basic tasks the older version did, but because of the bloat^W features the speed was not any faster for the user. Not to mention how the new version would usurp gains in storage space and RAM as well.
Incidentally, regarding downloading images over dialup, didn't you abort unwanted images before they completed?
I recall images rendered slowly line by line across the screen.
And I remember bandwidth being too slow even on a corporate network to blindly batch download images and then delete the unwanted ones.
It was more efficient to view each one as it was downloading and abort if unwanted.
But some of us enjoy NOT indulging in that, and having a Web page served in milliseconds from low-powered hardware, or having MIPS-speed IoT hardware run on microwatts and do a decent job using modern compiler smarts to help get there...
This is true, but there's a couple of other factors involved. One is the recent gains in storage efficiency, and the other is familiarity over time with the new gains. When SSDs hit the scene, you could migrate your existing installation to solid state storage and see immediate, measurable gains. Then PC manufacturers started shipping SSDs as standard equipment, and folks got used to that level of performance. Essentially, it became the new normal. A while later NVMe hit the scene and suddenly even traditional SSDs began to feel "too slow". Again, manufacturers are beginning to ship NVMe-based units, pushing the tolerance levels even further out.
I've experienced this recently; my main workstation has a Skylake CPU, DDR4 RAM, and an NVMe OS drive, with a SSHD (solid state hybrid drive) for storage. I recently revived an older but still very fast workstation with a standard HDD, and it felt like I'd gone back to the "Windows Vista Capable" days of the late 2000s. Same OS (Windows 10 and Elementary OS), same software, technically faster CPU on the older workstation, but it was so "slow" I could barely stand to use it. I felt like I was waiting ages for applications to start or web pages to load, even though it was usually less than a few seconds difference. But oh, what a difference those precious seconds can make in human perception!
If by "installation" you mean the OS, I have not needed SSDs. I migrated my installation to RAM. I stopped using disk for the OS and data I am working with. I can fit everything I need in RAM. I still use disks sometimes for long term storage of infrequently accessed data, but it surprises me how infrequently I need them. For me the recognition of "immediate gains" in speed was not with the advent of SSDs it was with the availability of >= 500MB RAM. Diskless became too easy.
But I do understand what SSDs have done for other users with different requirements and I think that has been a great improvement.
VR put latency on the general radar for many people previously completely oblivious to the issue, imho another good thing to happen.
It was also pretty bad with a 56k modem as far as I remember -- definitely not minutes though! The wait becomes worse when the JPEG is of a... critical nature, if you know what I mean.
The greatest performance issues stem from the attitude of the implementers. If you assume you have more memory than you'll ever need, chances are you'll need more than you'll ever have. Also, somehow no matter how much better our collective and public knowledge of cache-aware programming gets, we manage to make the data so much larger that it doesn't matter how cache aware we are.
disregarding "apps that I can't shut off running in the background", there is no good reason that animations, even the currently selected ones, should account for nearly as much power consumption as they do.
I don't have this problem on my workstations (generally) except when I browse the web, though recently I've had to make the practical tradeoff of owning a smartphone, where this problem is rampant.
Worse than things not getting faster, the same product tends to get slower because the software designed for the new hardware is the only maintained branch, and the versions for older hardware are just backports from future hardware.
>taken up 100 billion square meters of floor space
that is (with a three-meter ceiling height per floor): a hundred-story square building 300 meters high, and 3 kilometers long and wide
"Oh and by the way this device has an edge to edge display that is so real you can hold it up seamlessly against a background while it invents a made-up image and draws it on, pretending it's part of reality. It knows its position and can adjust to your movements. It also includes a photography studio that makes billboard-size full-color photos. It fits in your pocket and needs to have its battery recharged once per day. Also it's an entire telephone including video which it has a camera facing the front for. And you can talk to it, it has a built-in assistant. Basically, magic."
I'm guessing these are historical prices, not inflation-adjusted.
Things that have improved exponentially since 1957:
How about data bitrate? Total number of available channels?
Even goosing those, you're not looking at exponential rates of increase.
For example, in 1982, the wildly popular low-end home entertainment computer, the Commodore 64, was released at $595, or about $1500 after inflation adjustment. On release in 1977, the Apple II was $1298, or over $5000 in today's terms. You could pay $400 per 4k of ram, so $6000 if you wanted one with 12k ram. If the iPhone X came out in 1985 for this price, it would have been $400 in 1982 dollars, or 40% of the C64 - with over 4 million times the storage capacity, to say nothing of the other capabilities.
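The inflation adjustments above can be sketched like this; the CPI multipliers are rough assumptions for illustration, not official figures:

```python
# Rough CPI multipliers from release year to today (assumptions, not official data)
prices_nominal = {"C64 (1982)": 595, "Apple II (1977)": 1298}
cpi_multiplier = {"C64 (1982)": 2.5, "Apple II (1977)": 4.0}

for item, price in prices_nominal.items():
    print(f"{item}: ${price * cpi_multiplier[item]:,.0f} in today's dollars")
```

This reproduces the ballpark figures in the comment: roughly $1,500 for the C64 and over $5,000 for the Apple II.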
That our modern mobile devices are popularly called 'phones' misses the point that they are used as general purpose computing devices, not primarily phones or even necessarily primarily for communication. They are also GPS navigators, cameras, calculators, recipe files, photo albums, alarm clocks, book readers, walkmans, home stereos, encyclopedias, wallets, and a lot more. People used to frequently pay decent sums to buy dedicated devices to perform many of those services.
His opinion was that a similar thing would literally be worth multiple billions to the Federal government in the 80s/90s.
That's the thing -- not only are phones technology marvels in their own right... they open the door to capability with almost incalculable value.
I think his point was less about the tool as is, but the simplicity and ease of access to information.
In the early 50s, much of Europe and Asia was still rebuilding from war, and the US held an effective monopoly on multiple industries.
From the data I could find for 1952 in the US, a house was 5 years of income and a new car was half a year's income.
For many people in the US today, a new car (post tax dollars) is either completely unattainable (multiple years of income) or a complete joke (one month's income).
The numbers are different for 1982, but still closer to the 1952 US situation than today's 2017 situation.
So, I guess what I am trying to say is that $1,000 for a phone is still fabulously expensive for the 99.3% of the world today who aren't millionaires.
Thank you for your insight :)
I am thinking that maybe I am just very different from most consumers, or that our definition of "afford" is very hazy.
I am writing to you either from my $50 cracked screen wifi only iphone 5c, or my $40 2009 lenovo thinkpad running lubuntu.
What is strange to me is that I'm not buying this because I am price sensitive, but rather because "it is enough for what I want to do".
I have a hard time believing that a large percentage of the US market will buy an iphone X. Or if they are buying it, then they don't fit my definition of "affording it". Here's why:
Just 36.8% of US adults have a networth of over 100k.
That means 63% of US adults would be spending more than 1% of their net worth on a mobile computer (phone) that is not very much different from a $300 phone. How many of these users will Apple capture?
Additionally, many people's net worth in the USA is tied up in real estate or retirement accounts. So these funds are not available to be spent, and it manifests itself in the following stat:
In 2016, 63% of Americans said they could not come up with $500 to cover an emergency purchase.
So, tons of americans can't even afford basic expenses, let alone luxury phones.
That is why I feel this is a millionaire's phone only.
It might be dumb for a minimum wage earner to spend 1.8 weeks of their income on a mobile phone, but I absolutely believe a certain subset of working-class Australians will, with some difficulty, manage to scrape together enough to buy one.
Or, more realistically, they'll finance one on a $100/month phone plan, which I also think is a bit ridiculous and unwise, but not so far fetched as to be impossible for many lower-income people who really want to do it.
Now, if you're middle class... let's just say that as someone who earns a salary that's within the same ballpark as the national median, I could go out and buy several iPhone Xs tomorrow if I had to...
Here's a chart of how a middle class American family currently spends their money.
Please note: the $74k income is pre-tax, and the $57k in expenses is post-tax.
So, the average American family is actually spending every dollar they earn.
I don't know where technology fits in to this current budget, but you can see it's pretty much pegged due to housing and healthcare.
$1,000 for one phone is out of the budget for all middle class Americans, period, unless they begin to cut back on cars or food.
Which, you can assume that some might, but on the aggregate, they won't.
I could go out and buy literally 100 motorcycles tomorrow, but I wouldn't even buy one, and I damn sure wouldn't buy one that was 5x more expensive than a similar motorcycle that isn't "luxury".
Thank you for writing :)
High end as in iphone 5c, iphone 6, or iphone 7?
My iphone 5c, cracked screen, was like $40 off of ebay or something. When it came out, the original price was like 200-400 depending on contracts and gb capacity.
I think you will see the same thing happening with iphone 6s too.
But if you told me that you see a large number of fast food employees and janitorial employees and etc riding the bus to work, with iphone 7s in their hand in the first year of launching, I'd definitely be surprised and interested.
I too rode the bus for 3 years and never really noticed that trend among those types of workers who are struggling the most. I just saw a ton of galaxy s3 phones, which are awesome phones from 2012 or whatever.
I don't doubt your experiences however! It is a really interesting thought to have. If they are affording it, I can only assume they aren't purchasing something else, like a car or high end laptop or house.
My total owed to everyone is <1mth income.
I grew up poor though so my attitude to money is different to most people my age.
If I don't have a minimum of 6mths income in savings I feel vulnerable.
FWIW I won't be buying the iPhone X, my Moto G5 Plus is an ideal phone for me and only a few months old.
I did drop 1400 quid on a Thinkpad T470P though recently - I will spend money where it makes sense.
I suppose I have two points to make from your comments:
1) You are similar to most Americans in that you already have a phone that is ideal for your needs, and that phone isn't an iPhone X
2) You are different, statistically, from hundreds of millions of Americans, in that you have any money saved at all, and don't have large financial responsibilities (house, debts)
I have considered an iPhone down the line though because Apple focuses more on privacy and that is something I do care about.
Not cheap, but an i7-7700HQ in this form factor is a decent amount of grunt, and with the NVMe it's faster than my old i5-2500K desktop at home by a fair margin. It doesn't feel noticeably slower in practice than my Ryzen desktop for most things (until I spin up 3-4 VMs, and then I do start to notice - but really, 4 VMs with 4GB of RAM each on a laptop is just crazy).
The way phones are financed makes them much more attainable, also. The way people will get that $1000 phone today is paying $200-400 down, and then $35 a month, folded into their phone bill, for 2 years. That's reasonable for anyone with decent enough credit to get standard wireless service. Not most of the US population, perhaps, but definitely the same demographic as the people already buying $800 Samsungs or previous iPhones, which is tens of millions of people.
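The financing math, with an illustrative down payment picked from the range mentioned above:

```python
down_payment = 300          # illustrative, within the typical down-payment range
monthly_installment = 35
months = 24
total_paid = down_payment + monthly_installment * months
print(f"${total_paid} total over two years, vs $1000 up front")
```

Spread out this way, the sticker price never hits the buyer all at once, which is part of why four-figure phones sell at all.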
So while technically the iPhone is a computer, with a microprocessor that executes instructions, primary and secondary storage, and so on, the device has more in common with the traditional telephone and television, because it can be used for communicating and consuming content, but not general purpose computing.
I'm not an iPhone developer, but from what i understand you can write apps to do about anything and run them yourself - just not distribute them on the app store.
It's no fun dropping a $1000 phone in the toilet or down the stairs.
A lot of people forget this.
It did seem like a big purchase initially. I used it for 6 years of college. I mainly used it for word processing. It never skipped a beat.
Once every two years, I would need to re-ink the ribbon myself. Just dipped the ribbon into the bottle and rewound it. I was too cheap to buy a new ribbon.
Besides the initial outlay, I spent $2.00 on ink in six years.
If I had foresight, I never would have thrown away a great computer. I miss the word processor program. I definitely miss that dot matrix printer. (I never told anyone at school about my computer. I felt like I was cheating. Not one professor noticed the dot matrix print type.)
What people miss about old hardware from those days is that everything was built like a tank.
The thought of buying a new computer every two years was unthinkable. In my world, it's still unthinkable.
I really believe we have been duped by the industry--on all levels.
Another approach would be to ask, if you were going back to the year x in a time machine, how much would you have to spend on trade goods to make your fortune? For example, if you were going back to 1970 it would certainly be less than $500, because the processing power you can get for $500 today would have been worth at least $150 million in 1970. The function goes asymptotic fairly quickly, because for times over a hundred years or so you could get all you needed in present-day trash. In 1800 an empty plastic drink bottle with a screw top would have seemed a miracle of workmanship.
Here's the thing: Deflation got people spending money on an iPhone.
I salute your high barrier of impossibility.
It's sort of an apples and oranges comparison. Obviously there are many things that couldn't be done with analog tech. But it's not as bad as you would expect with just naive comparisons based on the cost of vacuum tubes.
Did a search for computer history and 1957 and came up with this: http://www.computerhistory.org/timeline/1957/
...the creation of FORTRAN? Looking at the entry for 1956, the site lists MIT creating the TX-0, the "first general-purpose programmable computer built with transistors", and IBM's shipment of "RAMAC", the first computer based on "The new technology of the hard disk drive".
First off: yes, absolutely, the cost of provisioning and operating electronic memory data storage and processing has fallen phenomenally. DeLong makes that point abundantly clear:
in 1957, the transistors in an iPhoneX alone would have ... cost 150 trillion of today's dollars: one and a half times today's global annual product ... taken up a hundred-story square building 300 meters high, and 3 kilometers long and wide ... drawn 150 terawatts of power—30 times the world's current generating capacity
But let's look at those comparisons right there.
The iPhone X costs $1,000, and for easy math I'll assume all of that is the memory storage (this is wrong, but it's not horribly wrong, on an orders-of-magnitude basis). If the 1950 cost was $150 trillion, then the price has fallen by at least 150 billion fold. (And in fact it's fallen more, because there's more than just memory in the device, so my easy math understates the case.)
Global GDP in 1955, or more accurately, GWP, was $5.4 trillion (in today's dollars). As of 2016 it was about $80 trillion, or, just for round numbers, let's call that $5 trillion and $100 trillion.
The multiplier is a factor of 20, which, if I check my maths, is somewhat less than 150 billion. Which is to say that whatever's been strapping white lightning to our capacity to churn out memory circuits has not been strapped to the global economy as a whole.
Measures of the total built environment are difficult to come by, and even proxies for that seem at best obscure. Since DeLong specifies the idea of a 100-story-tall building, though, there is at least one interesting statistic that can be readily produced. Up until 1970, there was precisely one such building, and it was the Empire State Building, which held that record from 1931 until 1972 (at which time the newly completed World Trade Centers in New York City claimed the crown).
Naturally, there's been some contention for that prize since. A total of four additional tallest structures are listed: the Sears Tower (completed in 1974), the Petronas Towers, Taipei 101, and the Burj Khalifa. If we look at the list of the world's tallest buildings, and use the ESB's 381 meter height as a minimum qualification, there are by my count 37 such structures. Again, this seems slightly less than 150 billion.
Finally, energy consumption. In 1955, this was roughly 100 exajoules. In 2017 it is roughly 500 exajoules. The multiplier would then be ... 5. A number somewhat less than 150 billion.
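Pulling the quoted figures together in one place (memory cost, GWP, tall buildings, energy), the asymmetry is stark:

```python
# Multipliers since the mid-1950s, using the figures quoted in this thread.
multipliers = {
    "memory/compute cost fall": 150e12 / 1000,  # $150T then vs ~$1,000 now
    "gross world product":      100e12 / 5e12,  # ~$5T -> ~$100T (round numbers)
    "buildings over 381 m":     37 / 1,         # the ESB alone, then vs 37 today
    "world energy use":         500 / 100,      # ~100 EJ -> ~500 EJ
}
for quantity, factor in multipliers.items():
    print(f"{quantity}: {factor:,.0f}x")
```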
The question which arises out of this is what is it about information technology that allows for a 150-billion-plus increase in capabilities, whilst total GWP (20x), skyscrapers (37x), and energy (5x) have seen far, far, far less expansion?
There's another question which asks if we're actually including full costs, which I'll note but leave off the table for this discussion.
But the question I would like to ask is what additional service value is being provided for all that the iPhone offers?
Consider that it is, ultimately, an information delivery device. And that the information end-consumer, the human tethered to it, has an almost ludicrously low consumption capability. Sure, you can deliver gigabytes or terabytes of source data to a human, but the amount of that which is absorbed, over the course of a day, amounts to ... a few megabytes, at most. And we're talking single digit values here. What the iPhone can deliver is video, audio, images, and text, through a viewport roughly the size of a 3x5 index card. The equivalent 1955 technologies it replaces are a notebook, a telephone (and probably some form of answering service or secretarial pool), the not-yet-invented transistor radio, a deck of cards or pocket game, a newspaper and/or magazine, a paperback book, a letter. A pile of index cards itself.
And ... the iPhone X carries any number of unintended consequences: the loss of liberal democracy, undermining a century-old tradition of advertising + subscriber based print media, journalism, adtech, concentration, possibly an entire generation. Unintended consequences are a real bitch.
DeLong's calculus is exceptionally insufficient.
1. Wikipedia. Which, coincidentally, cites one Bradford DeLong as its source.
3. Gail Tverberg, after Vaclav Smil and BP: https://ourfiniteworld.com/2012/03/12/world-energy-consumpti...
4. Much of this revolves around the question of natural capital accounting. The good news is that this is entering mainstream economics, see the World Bank for example. The bad news is that it's still improperly founded. Steve Keen's work on energy in production functions is also of interest, though that admits yet another factor.
5. Consider audio. The human limit of perception is roughly 20 impulses per second, a/k/a 20 Hz, which is the threshold at which beats become a tone. Given 86,400 seconds/day at 20 impulses/second, we've got about 1.7 million bits of data, or 216 KB of audio-encoded pulses. For printed material, a 250 words/min reading pace is fairly typical, which works out to 2.16 MB/day, sustained for 24 hours.
6. a/k/a the Hipster PDA: http://www.43folders.com/2004/09/03/introducing-the-hipster-...
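A quick back-of-the-envelope check of the figures in footnote 5, sketched in Python (assumptions: 8 bits per byte, and roughly 6 bytes per English word including the trailing space):

```python
# Sanity-check the daily human "bandwidth" estimates from footnote 5.
SECONDS_PER_DAY = 24 * 60 * 60           # 86,400

# Audio: ~20 perceptible impulses/second (the ~20 Hz beat-vs-tone threshold).
impulses_per_day = 20 * SECONDS_PER_DAY  # 1,728,000 bits, i.e. ~1.7 million
audio_kb = impulses_per_day / 8 / 1000   # bits -> bytes -> KB

# Text: 250 words/min, sustained for 24 hours, at ~6 bytes/word.
words_per_day = 250 * 60 * 24            # 360,000 words
text_mb = words_per_day * 6 / 1_000_000  # bytes -> MB

print(audio_kb)  # 216.0 KB of audio-encoded pulses
print(text_mb)   # 2.16 MB of text per day
```

Both numbers match the footnote, which is the point: even at a superhuman sustained pace, daily human information intake tops out in the single-digit megabytes.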
Movie camera (not common in 1957)
Security camera monitor (needs external hardware, but still)
Map and navigation system
Magazine and newspaper stand
Store catalogues (concierge shopping, to some extent)
News and weather on TV
Restaurant and hotel finder
Flight booking and checkin tool
Business memo distribution system
Classroom toys and child entertainment
Textbook and trainer for older/adult students (limited, but hardly non-existent)
Quick notes for friends and family
Clearly there's quite a bit more value than just "information delivery."
Source data doesn't need to be absorbed. No one in 1957 seriously expected readers to memorise the written text of newspapers or magazines, and no one seriously expects Facebook or Twitter users to memorise entire feeds today.
So the actual volume of useful information consumed daily has increased by a huge amount, and it's presented in a far more accessible and interactive/participative form than it used to be.
The point is that mobile devices connected to an open public data network generate huge economic synergies. Processing speed and memory are far less relevant than automation of old applications and the development of whole classes of new applications. Both have literally been transformative.
As for liberal democracy and journalism - those are no more endangered now than they used to be. Technology is a social multiplier, and if the roots of a culture aren't sound, media of all kinds will reflect that - but that's a political problem, not one caused by technology.
You might consider how else such needs, or in many cases, wants, were previously satisfied. Or accommodated, or which activities worked around their lack. And how, often in initially subtle ways, the smartphone's presence has changed the structures and institutions it interacts with.
But as I've expanded on this elsewhere, the supposed economic analysis of DeLong is missing key insights.
Data and computation, much as work, expand to fill available time. It's less how much computation can you buy and far more how much are you willing to spend. A curious aspect of computers is that the price points have remained remarkably resilient. In nominal currency, even. The original IBM PC cost $1,565. A current-generation Lenovo (successor to IBM), say, the P310 SFF, is priced at about half that, $710, plus $160 for the monitor, or a total of $870. It seems that for typical end-use the question is more "how much computing can I buy for a given budget?" than "how much will X amount of computing power cost me?"
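As a one-line sanity check of the nominal comparison above (figures copied from the comment; no inflation adjustment applied, which would make the modern machine look cheaper still):

```python
# Nominal (not inflation-adjusted) sticker prices from the comment above.
ibm_pc_1981 = 1565            # original IBM PC, USD
lenovo_total = 710 + 160      # Lenovo P310 SFF plus monitor, USD

print(lenovo_total)                          # 870
print(round(ibm_pc_1981 / lenovo_total, 2))  # 1.8 -- roughly half the nominal price
```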
As for technology (and especially communications) not changing or disrupting democracy or society, I'd very much suggest you reassess that premise as it seems to me that every communications revolution, dating to speech itself, has had profound and often highly disruptive effects. Elizabeth Eisenstein captures some of that in The Printing Press as an Agent of Change. The role of cheap press, radio, audio tape, microphones and public address, photoreproduction, and cinema in the rise and spread of fascism is another hugely instructive episode.
DeLong expresses awareness of none of this.
Frankly, though, modern houses, automobiles, indoor plumbing, central heat, washing machines, vaccines, and 1950s-era medicine were a vastly larger jump in living standards than smartphones.
Also, "free market" is kind of a joke as a mantra. For example, "without interference" would kind of require a whole lot of marketing people to stop what they're doing. As it is now, many companies are doing their best to interfere with the decisions of consumers and workers, in some cases even getting in bed with each other and waging outright war on those they extract money from; so that's not a free market by a long shot.
HN, do you want to be a meeting point for minds, or a marketing tentacle connection node?
Another plus - the software stack would have been far smaller. You wouldn't need a 32 Gig phone just to install some apps. All processing would have been done in the cloud, on the mainframe. The apps would all have been dumb and only screen viewers for the mainframe.
Our quality of life in the first world has changed somewhat for the better, but do we really carry around devices worth the equivalent of 150 trillion dollars?
I also think one of the comments is interesting: that indoor plumbing is so cheap, yet provides such an insanely large improvement to QoL.
It is a little bit amusing and gives some perspective.