What you can see here is market maturity, and it's a very common thing. There was just so much to do in the computer industry until now.
Look at the history of other industries: there were hundreds of car companies before Ford took out most of them with the Model T.
Imagine you are one of those people working in the car industry before that. "Company X had a car that drove 50 miles per hour, a new record, insane! The last record was broken just a few months ago...".
What happens in other mature markets with good dynamics is that gains come much more slowly and things become "boring", so you don't think about them too much. You get an incredibly efficient and safe car for a few thousand dollars; profits come from people who buy for status. If you look at the truck industry or aviation, the margins are really low and buyers there mostly care about price.
Most people don't need more than a smartphone and already buy for status; others need a Mac or a cloud instance, the way truck drivers need a special kind of vehicle to do their work.
Just as there are only so many car companies, there will be only so many cloud hosters, which will compete for ever-shrinking margins to run your magically, infinitely scalable cloud-function app on their gigantic datacenters. If you want your own, get a Raspberry Pi rack.
This is amazing; it frees so much energy and so many resources to focus on other stuff that matters.
Look at software: over the next decades we can focus on creating ERP systems that are as perfect as WhatsApp.
Then SpaceX and Blue Origin will compete to bring into space the 3D-printed robot spaceship you just simulated in your hyperrealistic Kerbal Space Program. The enabler here is price per kilogram to orbit.
That's where my imagination stops, because it's "the final frontier" and the problems there are really infinitely hard.
Oh great, my general ledger is now free, but in return I have to route all my data through a semi-trusted 3rd party with unclear motives, who is financially incentivized to act against my interests, and who can force-deploy workflow-breaking updates whenever they choose, and maybe is secure but who can ever tell...
Rant over. I actually like WhatsApp. It’s pretty far from perfect though.
I love myself some pessimism, but tech is not slowing; it's HN being obsessed with things that don't matter.
- VR was a niche from the start, a lot of people were doubtful, and the people who weren't doubtful now know it's not going to work and are already moving on.
- javascript frontends are great for programmer fights, but thankfully most of the public-facing web doesn't use them; they're irrelevant to tech's progress
- Uber has nothing critical to offer to technology except an also-ran in self driving cars
- kubernetes is relevant to like 100 people on earth, and doomed to be obsolete in 1 year. not critical to tech progress
- AWS isn't critical to tech progress either; it's bubbled because of easy startup money and laziness. It will be a good thing to see it fade away. Meanwhile, check Google's TPU offerings for some progress.
- Who needs a 3rd browser engine when every website looks like bootstrap?
That said, crypto is entering the plateau of productivity. It's being held back by hostile regulation, governments and banks, and AI hasn't even started; it's going to be a wild ride. The web is slowly decentralizing, which will be huge too.
"AWS isn't critical either, its bubbled because of easy startup money and lazyness. will be a good thing to see it fade away"
Wait, what? Have you been on the web at all? Tens of thousands of websites are hosted, or partially hosted, on AWS.
Sure there is -- I use it for my personal projects (so no CFOs), for example for email sending and object storage.
Sure, your own server is fine when you have unlimited time; but once you want to save time (CFO: to pay less in salary; personal: to have more time in life for other things), AWS makes a great deal of sense.
Half the workforce in webdev has an average work experience of 5 years or less, which leads to self-sustaining generational churn in this area, and to a whole industry of bootcamps and fizzbuzz-style questions handed out to employers and webdevs alike. Being hellbent on delivering webapps regardless of fit is no progress, and it will hit a wall as customers and regulators demand power efficiency.
> and will hit a wall as customers and regulators demand power efficiency.
Sorry, but LOL. That's so low on any regulator's agenda that it wouldn't make it even as a campaign point if we were to elect a "president of IT", let alone a real-world senator or mayor or president.
OK, it might not be yet, but things are changing rapidly. In Germany, established parties are running scared right now over losing first-time voters, and are pushing for significant change.
Imagine a little icon or traffic-sign-style badge we put on websites, informing users about independently measured power efficiency. It could work for privacy and network footprint as well. Hell, maybe I should put up a competition for graphic artists to design such a thing, with a 1k USD/EUR prize.
I'm afraid I've not completely understood. I was talking about the power consumption for the users of the site/webapp rather than the power necessary for running web servers. But that would indeed be another important measure (and I believe power density is certainly on the radar of data center equipment buyers, but not framework users and most developers yet).
I wouldn't rule it out. Considering the occasional tech-related hysterias, all it would take is someone high-profile starting to complain about how much power and heat Slack generates.
> javascript frontends are great for programmer fights, but thankfully most of the web doesn't use them
By frequency of retrieval, JS frameworks are the majority of websites served.
> kubernetes is relevant to like 100 people on earth
Even if it's not k8s, most companies benefit from a container orchestration system.
> crypto is entering the plateau of productivity
Cryptocurrency will always be a niche product until it can provide the same services and assurances that banks do. Most people so strongly prefer the availability banks offer, and their protection from carelessness, petty fraud, and mismanagement, that they even pay for those services, over the not-entirely-complete protection from state actors that crypto provides. That's what's holding crypto back, not regulation.
Cryptocurrency is meant to replace currencies and central banks, not banks (retail). Banks are institutions built on top of currencies (there are a lot of companies doing that with cryptos btw).
I'll bite - how the hell could cryptocurrency replace central banks?
To me it seems like the same logical flaw as the Unabomber thinking that sending mail bombs could shut down all of industrial civilization. It not only doesn't make sense, it has no logical connection. It is less logical than thinking one could conquer the US with an army of bowmen on horseback: that makes no sense given the massive disadvantages, but at least an army can conquer.
The actual roles of currencies and central banks are "the medium needed to pay taxes" and pegging the currency ratio to optimize for the economy.
The closest replacement I could see being viable would be the central bank issuing its own as an option, perhaps allowing "instantiating" to and "de-instantiating" from physical currency.
AI/ML is actually everywhere already, to the point where most people don't even recognize it as AI/ML. The problem is that it's not the image of AI that has been sold to the public, and even to the tech generalist community, by books like The Singularity and Superintelligence.
We are only seeing some of the first applications that are market-ready. Most of the research in AI is in things that are not ready for market but mostly look promising, like AI diagnosis, image/video/sound/animation generation, signal analysis, various unsupervised learning methods, etc.
I still believe in VR, but driven by porn, not games or productivity. That niche seems to be doing very well and has been a technological driver more than once. Basically everybody that I've talked to that was skeptical and tried it was a believer after they've tried it. When that goes mainstream, you've got a very solid market with lots of consumers.
It won't be the $1500 + a high end PC version of VR though.
That's not real VR though, and a high-tech headset sounds like overkill for that. You can place 4 screens around your head for a similar and more comfortable effect :)
Absolutely 100% true, I have experienced this first hand. Unfortunately, many of the videos they try to promote on VR systems were shot with a single lens and are just "2D in surround"-- these come across as flat and uninteresting to me, especially after I saw videos with actual depth. In those cases I would be happier with a monitor. But the depth & immersion is completely worth putting a headset on for me.
Also, on the surface it looks like initial Oculus Quest sales are going well. Sold out at my local Best Buy and showing 1-3 week lead times everywhere else (including direct order). Could be artificial, but time will tell. I was already satisfied with the GearVR on an older Samsung Galaxy, and initial reports are that the Quest is a significantly noticeable improvement.
>The only thing I can think of is that all my laptops (thinkpads & Apple MBP) are now rocking USB-C PSUs.
Personally I think type-C might be the best example of moving the problem behind a layer of obfuscation rather than fixing it. If you look at a random device with a type-C port you have no idea what that port is capable of. Maybe it'll charge your Macbook but it won't charge your Asus. Maybe it'll support USB 3.1 or maybe it'll only be USB 2.0 like OnePlus. Counterintuitively, chances are if it's a phone and it does USB 3.1 then it won't support USB to HDMI or DisplayPort although some do, but if it's only USB 2.0 over type-C it'll probably support HDMI-out with a DisplayLink adapter. Maybe it'll break half the specifications just for the hell of it like the Nintendo Switch.
My reference was really to the fact that it has the same hidden properties as the port. What data throughput does it support? What charging capabilities does it have? You can't tell by looking at it.
You know, maybe it’s not a bad thing if everyone takes a break from growing and changing. There’s plenty of ways where we would benefit from stability and refinement right now.
I really don’t get articles like this. You can rent a fleet of computers for an hour for a few dollars. Your cellular phone is more powerful than the time-sharing systems of 25 years ago. There is pervasive wireless broadband internet available in much of the world. Single computers can be bought with dozens of cores and a lake of RAM. You can look up nearly any piece of human knowledge in a few seconds. The labor of millions of engineers can be accessed via APIs which you can use to assemble a stunningly capable application with a minimal team.
> You can rent a fleet of computers for an hour for a few dollars.
On which you deploy a web app that has 30 working components written in 10 different languages and is able to do a fraction of the work of a single vertically scaled server running three components written in C. And that has less than two-nines uptime. No one person can grasp it mentally. It is statically linked and must be updated piece by piece, unable to use the distribution repos and upstream work. Oh, and man pages? What man pages? Every one of the 30 components is "open core", with cool newly invented terminology that pushes you into paying for consulting or reading spaghetti "object oriented" code. It also has stack traces for logging.
How ungrateful of little sysadmin old me not to appreciate progress.
In truth, because the world of C (I was a C and then C++ programmer 30 years ago) did not enable the production of effective applications cheaply and quickly. Yes, you can write anything, but it took months and months and months.
We've traded efficiency for productivity. We have been able to do this because of Moore's law and all its friends.
Also, C code is hellish to read and understand (#define anyone?), and logging almost never happened.
As a sysadmin you will be able to look at the organisation you work in and count up the number of sysadmins; my guess is that there are fewer than half as many as there were 10 years ago. Maybe that's the problem?
"So, why are we doing this?"
I think there are a lot of reasons: market share, control of technology, not-invented-here syndrome, ego issues, fads, shifting focus onto specific sets of problems. I personally think that OO proliferation (C++), Java, and the enterprise mindset were the point of no return: the mental point where the majority left simplicity and maintainability out of scope. There was (and still is) an era where complexity (and the ability to comprehend it) is a synonym for mental machismo. Simplicity was left as something that lesser minds/developers had to worry about, whilst the 10x devs (another greatly proliferated myth) that everyone worth their salt should strive to be (so the myth went) must not only conquer complexity but also thrive on it and create more of it. Thus you got a multitude of tooling, languages, and stacks that do the same things differently, Godzilla OO hierarchies, and architecture diagrams that look like a dozen overlapped London metro maps. Now we have microservices and k8s, which is itself a testament to irony: it started as an attempt to simplify microservice/cluster management and has evolved into perhaps the most complicated ecosystem ever (one has serious trouble even comprehending what most of the tooling in k8s land is supposed to be doing, much less using it, and don't get me started on really understanding the code in there).
I personally think more people should read and embrace the Unix philosophy of simple and reusable but on the other hand I'm afraid that the ship is unstoppable now.
Tangential, but it's interesting how we can't seem to agree who the mythical 10x devs are.
> There was (and still is) an era where complexity (and the ability to comprehend it) is a synonym for mental machismo. Simplicity was left as something that lesser minds/developers had to worry about, whilst the 10x devs (another greatly proliferated myth) that everyone worth their salt should strive to be (so the myth went) must not only conquer complexity but also thrive on it and create more of it. Thus you got a multitude of tooling, languages, and stacks that do the same things differently, Godzilla OO hierarchies, and architecture diagrams that look like a dozen overlapped London metro maps.
I always believed that this kind of stuff is what mediocre developers relish. Meanwhile, the 10x developer looks at all this complex cloud-scale data pipeline, and does the same job 3x faster with a few Unix tools piped together on their mid-tier laptop.
As far as I know, the 10x developer comes from a throwaway comment in a small sample paper on comparing people's implementation of toy problem. There ends its actual basis in reality.
> Yes - you can write anything, but it took months and months and months.
The C & C++ of 30 years ago, where compile times took days: sure. Nowadays building a whole Yocto software stack, basically a full Linux distro (from GCC, to the kernel, to glibc, to X11, to Qt) takes 6 to 8 hours on a good laptop.
In addition, modern language features and the general shift from runtime to compile-time polymorphism, especially in C++, sharply decrease the potential for errors, and language simplifications (standard library improvements, terser syntax) also bring time-to-release down.
Finally, tooling has greatly improved: I barely ever need to run a build with my IDE to check for errors, because IDEs now embed clang, clang-tidy, clazy..., which check your code as soon as you type and highlight it in the editor. And ASan and UBSan handle the remaining cases which cannot be caught at compile time, leaving only logic errors to handle.
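To make the runtime-versus-compile-time polymorphism point above concrete, here is a minimal sketch of my own (C++20, not taken from the comment): the same "shape with an area" idea written once with virtual dispatch, where a bad pointer or a forgotten override only surfaces at runtime, and once with a concept, where passing a type without a suitable area() is rejected at compile time.

    #include <concepts>
    #include <iostream>
    #include <memory>

    // Runtime polymorphism: a missing override or a null pointer only shows up
    // when this code actually executes.
    struct Shape {
        virtual ~Shape() = default;
        virtual double area() const = 0;
    };

    struct Circle final : Shape {
        double r;
        explicit Circle(double r) : r(r) {}
        double area() const override { return 3.141592653589793 * r * r; }
    };

    double area_runtime(const std::unique_ptr<Shape>& s) { return s->area(); }

    // Compile-time polymorphism: any type without a suitable area() fails to
    // compile, so that whole class of errors never reaches the running program.
    template <typename T>
    concept HasArea = requires(const T t) {
        { t.area() } -> std::convertible_to<double>;
    };

    double area_compiletime(const HasArea auto& s) { return s.area(); }

    struct Square {
        double side;
        double area() const { return side * side; }
    };

    int main() {
        std::cout << area_runtime(std::make_unique<Circle>(2.0)) << '\n';  // ~12.57
        std::cout << area_compiletime(Square{3.0}) << '\n';                // 9
        // area_compiletime(42); // would not compile: int has no area()
    }

The trade-off is longer compile times for the template version, but the error appears while you type the call rather than when a user hits the code path.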
Pretty sure the comment you're replying to is talking about development time, not compile time. They're claiming it took months longer to develop a similar app in C, not that it took longer to compile. Even if compile times are negligible now for the reasons you state, they still have a point about it taking longer to develop an equivalent app.
This. The one problem is the lack of a solid networking library. Unix sockets/Winsock are ancient and terrible, with tons of corner cases; Boost ASIO is creaky and "boosted"; other useful things like 0mq are good for IPC but not the Internet; Qt eats the world and is a pain to integrate; libevent, libev, etc. are not C++ and are still a pain to integrate.
Maybe there is some library that is lesser known but good.
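For what it's worth, here is a rough sketch of my own of what a trivial synchronous fetch looks like with Boost.Asio (assuming its usual io_context/resolver API, error handling mostly elided): it works, but you resolve, connect, hand-roll the HTTP request and loop on reads yourself, which is roughly the "creaky" feel described above.

    #include <boost/asio.hpp>
    #include <iostream>
    #include <string>

    int main() {
        namespace asio = boost::asio;
        using asio::ip::tcp;
        try {
            asio::io_context io;

            // Resolve and connect: several objects and calls before a byte is sent.
            tcp::resolver resolver(io);
            auto endpoints = resolver.resolve("example.com", "80");
            tcp::socket socket(io);
            asio::connect(socket, endpoints);

            // Asio has no HTTP layer, so the request is written by hand.
            std::string request =
                "GET / HTTP/1.0\r\nHost: example.com\r\nConnection: close\r\n\r\n";
            asio::write(socket, asio::buffer(request));

            // Read until the peer closes the connection.
            boost::system::error_code ec;
            std::string response;
            char buf[4096];
            while (true) {
                std::size_t n = socket.read_some(asio::buffer(buf), ec);
                if (ec) break;  // asio::error::eof means a clean close
                response.append(buf, n);
            }
            std::cout << response << '\n';
        } catch (const std::exception& e) {
            std::cerr << "error: " << e.what() << '\n';
        }
    }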
As for time to develop, networking aside, C++ wins, even for hack projects, in comparison even with Python, with the JS/HTML/CSS-plus-framework stack being far slower and making for terrible UX. (I'm not writing about languages I do not use.)
Java and C# can be fast too, but they hit their limits when you actually have to calculate things, which gets real awkward real fast.
The main win is... tooling. Java tooling is the best now, since Maven and Ant are mostly legacy; then C++ (I'm counting Qt Creator and Glade; cmake is eh, cmake plus something is OK); then finally JS, tied with Python. JS fails terribly on debugging and on its creaky, rarely adaptable UX libs, while Python after all these years still has tooling issues (plus performance, where it matters).
Rust is not there yet; maybe soon. Scala is nice, though fewer people can use it than new C++.
Honorable mention to specialized tools like R, Faust, Lua, Matlab, and Fortran. Note that they integrate well with C and C++... Try anything else and you're in for a ride. Even the second-best supported, C# and Java, go through a C FFI layer.
Your comment mirrors the article. A bunch of projection, written out.
Lots of people do microservices for the wrong reasons, with the wrong model. They don't realize this model asks for a lot of initial overhead per process for visibility, and for early investment into figuring out how these services communicate.
For instance, a service that sets off 3-4 layers of HTTP requests is an indicator of a bad microservice architecture.
A service that uses a shared data bus (e.g. Kafka) but can survive without it however...
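A minimal sketch of that idea (hypothetical types of my own, not a real Kafka client): the service keeps accepting work and spools events locally whenever the bus is unreachable, then drains the spool once it comes back.

    #include <deque>
    #include <string>
    #include <utility>

    // Stand-in for a real bus client (e.g. one built on librdkafka); this type
    // and its publish() are assumptions for the sketch, not a real API.
    struct BusClient {
        bool connected = false;
        bool publish(const std::string& /*topic*/, const std::string& /*msg*/) {
            return connected;  // pretend publishing fails while disconnected
        }
    };

    class EventPublisher {
    public:
        explicit EventPublisher(BusClient& bus) : bus_(bus) {}

        // The service never blocks on the bus: failed publishes are spooled
        // locally, so the service keeps working even when the bus is down.
        void emit(const std::string& topic, const std::string& msg) {
            if (!bus_.publish(topic, msg)) {
                spool_.emplace_back(topic, msg);
            }
        }

        // Called periodically; drains the spool once the bus is reachable again.
        void flush() {
            while (!spool_.empty() &&
                   bus_.publish(spool_.front().first, spool_.front().second)) {
                spool_.pop_front();
            }
        }

    private:
        BusClient& bus_;
        std::deque<std::pair<std::string, std::string>> spool_;
    };

The point is only the shape of the dependency: the bus improves things when it is present, but it is not a hard runtime requirement for the service to do its job.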
That doesn't mean things are regressing. People have always done the wrong things for the wrong reasons.
That's because as a sysadmin you can now do work in a week of plugging components together that would have taken three months for a team of five to pull off with C.
Problem is no amount of plugging and configuration on my part can fix problems that come from the complexity of the software architecture.
I can imagine it's a lot cheaper. But what is the comparative value/quality? I reckon it's below that of the slow and steady approach. You are free to argue against that.
Don't work in such place then? Seriously, there are always good and bad places.
In fact, when you said "two nines" -- that is about 15 minutes of downtime per day -- I immediately thought about one of my jobs (a long time ago) which involved a website running on Windows Server, using a mix of ASP and DLL files. It would routinely go down for tens of minutes on each deployment because someone would update the DLLs in the wrong order.
There was also PHP with Apache. As a sysadmin, remember all those times when you had to hunt down all the Apache-spawned processes and kill them by hand so you could finally apply a new configuration? Or when an .so file update made an app server not start, but that darned start-stop-daemon was silently eating both the exit code and the error message?
It got much better in the sysadmin world -- it is now easy to do the sane things, and all that containerization means that if programmers do provide something insane, it is trivial to clean up the mess.
None of that directly describes business (or consumer) value. It is mostly a recitation of information technology statistics.
If you look at the period of say 1990-2010 there was an absolutely insane ROI on human capital invested into IT. Arguably this level of ROI only happens once or twice in a century. You had young guys in their basements, garages etc. inventing life-changing technology with no investment other than a PC, an Internet connection, and time. It basically all revolved around getting people a PC, a 'net connection, and a phone, and then delivering services over those things.
Everyone saw what was going on and there was an unprecedented global migration of labor into the IT industry. There was a perception that anyone could be a millionaire, and anyone could change the world, etc. and some degree of truth to it.
There's no denying that much of the low hanging fruit is now gone, and there are more people in the industry than ever before (many of whom had huge expectations based on what they had witnessed transpiring a decade or two prior). So of course you are going to have a changing industry and an expectations mismatch. The old days are over and they're not coming back.
Any new wave of innovation and disruption is at least as likely to come from a non-IT industry as it is from IT.
There are still trillions of dollars to be made in IT. Since the future is never evenly distributed, a lot of these dollars are in emerging markets, and/or in industries which are being transformed by the IT revolution's latest waves a bit late in the game.
But it is a different industry now, there are fewer opportunities for disruption and there are many many more fingers in the pie. In general the way this money is made will look a little bit more like how money's made in mature industries every year.
There's no fighting history and you're not going to be the next Steve Jobs. But the pie is so big that you can carve out whatever kind of little niche you would like, as long as you understand the rules of the game.
The biggest question for me is what we are gonna do as professionals in the coming "long winter."
We don't have a single project for a "tech" brand in the pipeline; we are only now learning how to do tech projects for complete industry outsiders.
Quite a few of them have some tech expertise, but they are all strangers to the ways of the industry. They all have their own ideas about where tech fits in their business model, and I can't say that all of them are "amateurish" or not thought through; I'd say some outsiders were better at running tech projects than some de jure "insiders".
I bet that, for us, it will be all about the ability to get more tech business from outside the tech industry.
It's not so much a long winter as a long stretch of average weather. There are still plenty of professionals in the "boring" industries, just fewer startup millionaires.
The point of the article is that hype of most of these technologies doesn't nearly compare to the hype of 7 years ago. We've had smart phones, AWS, deep learning, promises of AR, crypto, etc. for at least that long, and nearly all of those feel like they're in stagnation compared to how exciting they were then.
It's not that they're not getting better, but they're in the transition period where they move from new frontier to commonplace technology taken for granted.
For example, it seems crypto might never succeed as a replacement for direct consumer spending due to bandwidth issues. However, as a backend currency for a front-facing financial institution it has untapped potential. The growth in interest from here will be gradual, but it will probably keep rising.
Off-topic, but I came here to say that my long-standing fear just came true about the word "crypto"; namely, that I totally misread it in TFA as meaning "cryptography" - which is of course what the term always used to refer to - and honestly forgot that it might refer to "crypto-currency" until reading your comment.
I knew someday that would happen! I made up some story to myself that comments in TFA about "#cryptowinter" and "FUD in the community" might refer to communities losing confidence in cryptography techniques after too many companies getting hacked. Out of touch, I know, to not be aware of that hash-tag. O_o
I really hate that we have another meaning of "crypto" now so that misunderstandings like this can happen often. As programmers, we know how important naming is and how costly it is to create confusing collisions. Cryptography is too important of a concept in tech for us to not have a clean word for it. smh
It has hit the plateau. The ideas with value have already been invented. You have hammers more powerful than ever, but what problems can you hit with them? That is the hard part.
Until we find more meaningful problems to tackle, inventing more hammers doesn't make much sense; it is just a waste of money.
Maybe the future of IT will be to prove its usefulness by solving problems that were hard before the IT revolution and are still hard after it, like cheaper/greener energy, reliable/faster transportation, more accessible healthcare, etc.
I can follow up on this with my recent work experience.
Microelectronics has been on a declining trend for half a decade. A lot of people were saying that we are heading for a "long winter" in the industry.
Long-term investment in manufacturing is at an all-time low. Only the recent trade-war events have made some bigger OEMs reopen lines in third-world countries other than China.
It's important to make the distinction that it's not only consumer goods that have sagged; "enterprisey" stuff sees no growth either. That's why people don't invest more in high-end fab processes: they don't see server chips, router chips, or memory going up in sales any time soon.
Why is that bad? Because every time the industry moves up a node, it's paid for by the highest bidders first. If Intel/Broadcom/AMD and the other big boys hadn't paid for 7nm, there wouldn't have been 7nm for the rest of us.
We may not see 5nm, despite EUV being "just months away"
On the other hand, our engineering consultancy has seen an enormous influx of new clients, most of whom are complete newcomers to the industry. One of them is... a furniture company, and a big one.
EUV is in production today with Samsung and TSMC's "7nm" processes, and the latter has started "5nm" risk production which uses even more EUV. Here's a recent SemiWiki article on details of what Samsung's foundry business is doing: https://semiwiki.com/semiconductor/259664-samsung-foundry-up...
Intel's all-193nm-UV "10nm" node (roughly equivalent to the above 7nm) is a failure; we'll see if they can get their EUV-based "7nm" node to ship in quantity. In all these cases the demand for lower power consumption in battery-powered devices continues to drive demand, even if that demand isn't as healthy as it used to be.
The current crappiness of AR reminds me of the early days of MP3 players. Only a few geeks had them, and it took forever to load an album over the printer port if it didn't crash half way. Then 5 years later, everyone had an iPod.
I was in high school when MP3 players were a thing, just before and slightly after the iPod. Everyone had one.
I wasn't in a particularly affluent or tech-savvy area. It was cheap to buy and just required you to know how to move files from one drive to the other in a computer - which was a skill that was very common and was actually taught at schools in my area.
What specifically looks similar to you? Consider there are many things which at some point only a few geeks had and which 5 (or 50) years later still only a few geeks had.
When Apple announced ARKit, this culturally geeky (not a tech) coworker of mine scoffed, saying Nintendo had something like that for the DS ages ago. That's exactly how geeks responded to the iPod with the infamous CmdrTaco post ("Lame") as a prime example.
But this is my point. Geeky people have a strong opinion about how much this or that implementation sucks or doesn't suck. No one has a strong opinion about VR platforms, or quantum computing, or drones, or any other next big thing, so that makes AR look like it has half a chance to me.
The thing is I didn't have to go into an AR community online to find opinions. Some guy at my work who isn't even a tech was giving me unsolicited snarky AR opinions.
For my team, we abandoned k8s because of the insanity of just managing the cluster itself in AWS pre-EKS. Our team has a lot of low level OS and AWS knowledge, and it was just this massive undertaking and at every turn we were told, "Lol, just use Google Cloud."
My employer is all in on AWS, there's no way to change that. Furthermore, our accounts come with a lot of restrictions imposed by our central AWS team. VPCs, subnets, NAT gateways, etc. are all pre-provisioned and we are prevented from modifying them or creating new ones ourselves. The only real tool for k8s in AWS was kops at the time. It was so opinionated that it broke down very quickly when faced with all the IAM restrictions on our user accounts.
Now that EKS is a thing, we're looking to revisit it. But a lot of the pain might be coming from the fact that it seemed that "manage my k8s cluster for me" was under-developed on AWS at the time. Elastic Beanstalk's ECS platform was just the easier thing to start with for us.
How is that? You create an object with some metadata and that's it. Don't know how that can be done any easier. I honestly get the feeling only people who have never used K8s deem it complex.
Running a cluster yourself is another topic, although there are way more complex orchestrators on the market than K8s.
It seems to be a typical "if I don't understand it and reading up on it for 5 minutes doesn't help it must be too complex" case.
Turns out turning a bunch of machines and maybe a cloud API into a functional PaaS includes a lot of logic, a lot of which is dedicated to handling edge cases.
To quote David Mitchell (the comedian, not the author) it would be tempting to take this kind of thing seriously if it wasn't for a damnable sense of perspective.
Some thoughts:
1. Much like how the entire centrist media machine feeds off the insanity of the current US administration, somehow this article is at the top of HN. We put it there. (Well, 42 of you did, as of the time I wrote this.)
2. Perhaps the best-kept secret in the world is that it is getting better by almost every measurement. The late, great Hans Rosling was superb at breaking down these dimensions in video form (eg. https://www.youtube.com/watch?v=jbkSRLYSojo ) but if you want the real deal, I recommend Steven Pinker's excellent "Enlightenment Now".
3. I couldn't care less that Intel has "stalled" since you can now buy a Raspberry Pi that pwns my first computer by an order of magnitude or three and costs less than $25. Meanwhile, Apple is refocusing on Mac hardware again because so many "poor" people have phones now that they can see a sales ceiling... this means that a huge percentage of the planet has a significant computer on them all of the time.
4. Not only does age and wisdom allow you to observe tech over many cycles, but the older I get the more I realize that tech is nothing outside of its relationship to politics, culture, and ourselves. Look at how young children just assume everything is a screen, now. Tech is now an important aspect of the daily political conversation. In just 15 years we've gone from a society that rents VHS movies to one that feels entitled to comment, upvote and subscribe to everything they watch. It's fucking crazy how much tech has reprogrammed everything from the way we find love to the way we get from A to B.
Finally, to the author: sorry a lot of the comments here are negative. They aren't wrong, but we're still working on chilling out and defaulting to presuming that in any given moment, people are generally trying their best in this community. The good news: there's lots to be excited about in this "worst possible timeline" we've fallen into.
> Perhaps the best-kept secret in the world is that it is getting better by almost every measurement. [...] The late, great Hans Rosling was superb at breaking down these dimensions in video form [...] but if you want the real deal, I recommend Steven Pinker's excellent "Enlightenment Now"
Indeed. Thanks for plugging the inimitable Rosling (I'd also strongly recommend his excellent book, "Factfulness") and Pinker. Rosling has incredible anecdotes from the real world, and Pinker is methodical and rigorous. Each book is a companion to the other.
(Me brags: a few months ago I had the pleasure of meeting Steven Pinker in person and even had a small chat about his book, the late Rosling, and other topics.)
And I hope Hans Rosling's exhilarating enthusiasm rubs off on many folks. I was so enthralled with Rosling's work that when I was in Sweden earlier this year, I took a train to Uppsala (where Rosling lived and worked) and spent a day there just to breathe some inspiring air. Rosling was a force of nature.
PS: Yes, for all the "positivity bias" that Rosling and Pinker are accused of, they repeatedly acknowledge that there's still a long way to go, and never claim that everything is hunky-dory. As Rosling puts it: "Things can be both bad and better".
Re Rosling: I have a feeling that we are improving immensely at the base of Maslow's pyramid, but there is plenty of sabotage occurring further up. The march towards authoritarianism in much of the world is one thing that jumps out.
Remove China from the numbers and Pinker's and Rosling's points pretty much disappear. Neither is a secret; they are well known and criticized a lot for their use and abuse of numbers.
Impressive GDP growth aside, persistent malnutrition suggests that for huge chunks of the Indian population (a shocking _seventy_ percent of women are anemic), something's not working. Who cares what GDP growth is? What's it really measuring if people still aren't getting enough to eat?
The point is that the developed world has stalled. Which is probably true by some measures, but we do not know why. There should be slower but still incremental improvement, as there is room for that. Plenty of room, in fact.
Life expectancy is still gaining, though it is flattening. That is the main variable, encompassing almost everything. It is flattening even when normalized by income.
Income per capita has flattened somewhat, but that might be a natural sigmoid that China, Russia, and the like have yet to reach. So that's about it.
> 4. Not only does age and wisdom allow you to observe tech over many cycles, but the older I get the more I realize that tech is nothing outside of its relationship to politics, culture, and ourselves. Look at how young children just assume everything is a screen, now. Tech is now an important aspect of the daily political conversation. In just 15 years we've gone from a society that rents VHS movies to one that feels entitled to comment, upvote and subscribe to everything they watch. It's fucking crazy how much tech has reprogrammed everything from the way we find love to the way we get from A to B.
I think you caught the idea better than anybody else. The tech industry is no longer about the tech itself, and much more about everyday world around us. The tech ate the world — sure.
I can't remember the last project at our engineering company where we ever bothered much about the product itself, spec-wise, and we haven't done anything like a PC or a smartphone in years. We almost solely work on "gadgetising" existing everyday stuff now — household appliances, toys, all kinds of vending machines, ad displays and installations, EV stuff, aircraft and rapid-transit infotainment.
The essence and the shape of a thing are way more important these days than what's inside. Things are definitely changing that way. The vast majority of our biggest clients these days are companies whose business had zero to do with tech just a few years ago, but is all about it now.
What he lists are not technical problems, they’re political problems.
Tech progress has always been incremental in nature. It seems “nothing is going on”, but there is a constant iterative cycle that will eventually automate almost every repetitive job or task.
It will be cold for many who do repetitive jobs and tasks.
It is sad that open source has been hijacked by the likes of Amazon. However, open hardware is an exciting option, with desktop fabrication, electronics, and low-energy wireless creating new opportunities for small entrepreneurs.
I think the situation is a bit more complex: the proliferation of F/OSS itself has driven the market into the situation where integrating and running other people's commodity software is one of the only commercially viable options left. Another factor is that commodity hardware is so powerful, and has been for over ten years, that it just had to be put back into the hands of datacenters/clouds.
I've played quite a bit with the Oculus DK1, GearVR, Oculus CV1 and the Oculus Go, and after 2 weeks of usage I can say the Oculus Quest is much better than any of them. How low the barrier is to putting the headset on and immediately walking around in VR is hard to overstate.
I think it will be a very big hit. The virality levels are very high with people showing their family and friends, who then want to go out and get their own.
The downsides are that the games are a bit too expensive, and the headset itself costs $400, which is alright, but it does need to be cheaper if they want to achieve Nintendo Wii-type sales figures (which the Quest fully has the potential to do).
Exactly. I work in web/tech at a museum and I see AR/VR as the next big thing to hit the horizon. The potential is very big, and very cool.
It's not quite there yet, but it's coming soon, and once it gets going it will be as big as the iPhone's arrival or bigger. It just needs to hit that sweet spot of price, aesthetics, and functionality.
About Uber's departure: Grab is a terrible app compared to Uber when it comes to UI, UX, and overall driver attitude (personal opinion, of course).
However, for Indonesia, services like GoJek and Traveloka are picking up. I'm currently in Indonesia, and GoJek has grown from a ride-hailing service into a full marketplace with payments, vouchers, and rewards integrated into several consumer services such as package delivery, movie ticketing, massages, make-up services, etc. They continue to offer even more and better services and gained a lot of attention due to Uber's departure.
Georgia, for example, a relatively little-known country, has a huge startup boom with many ride-sharing apps and mobile money solutions, and the city of Batumi is full of online casino kiosks.
There may be a cold spell in the information age elsewhere, but certainly not in developing countries.
This isn't complaining about today vs. life before the industrial revolution; it's complaining about today vs. last year or two, within a particular job sector.
In other words, total amassed progress may be amazing, but according to the author, its derivative in web/consumer tech sucks (I'm inclined to agree, even if not for the same reasons).
We as engineers continue to make similar mistakes over and over again, with technological advancement blocking the knowledge transfer between generations. Everybody thinks they are smarter than the generations of engineers, programmers and mathematicians before them.
If “we” see an old Fortran program running some part of an airplane, our intuitive reaction isn't awe and admiration; we spontaneously decide we can do it better. Some of these old creations run without flaw, and yet the first intuition everybody gets is to replace them.
In structural engineering, lessons like these had to be learned the hard way: only after collapsing bridges and hundreds of deaths did engineers start to realize that technical progress in tooling (e.g. computers) didn't automatically translate into safer and more stable structures, and that the stuff they did back in the day is even more impressive looking back, considering that.
I am not saying we are doomed: we have an incredible pool of interesting and cool solutions that have worked reliably for ages. We just need to do a better job of not fooling ourselves with shiny new toys. We can't just assume historically hard problems become a piece of cake if we throw the latest technology at them, without even dealing with the lessons learned by the ones who came before us. This is hubris.
A good analysis of the problem gets you a long way towards the solution; that is why running around with hammers and picturing everything as a nail is already creating problems: machine/deep learning is good at solving some problems, but bad at solving others.
Ask any IT person, or the public: how do you keep up with the changes? Can you still use today what was being done 5 years ago?
Yes, it is better than 1990 vs 1995 or 1995 vs 2000, etc., when you really had to learn a whole new paradigm: Java, Unix, Windows, object-oriented programming, GUIs... Now the basics are done, but still, no. The Python you learned matters less than the Python doing AI, for example.
I don't know, #cryptowinter makes me feel warm inside, and for Uber I only wish it happened sooner :). What also warms my spirit is that it seems GDPR had a side-effect in that many other places are now considering implementing similar data protection regulation. So it's not all bad news.
That said, I sort of agree with the sentiment. Where's the technological innovation on the web these days? Where are the tools and technologies that allow me to work better than 10 years ago, and not just worse but with JavaScript, or worse but with a high-resolution, shiny (yet bloated and unergonomic) UI? What's there currently beyond WebAssembly (i.e. "all the old stuff, but in the web browser", which has a few benefits)?
I mean that I consider the cryptocurrency scene to be a net social negative (a nasty breed of fraud, gambling and MLM) and cryptocurrencies themselves a technology that's an environmental disaster and should never have left the research stage. As for Uber, their only innovation so far has been showing the sheer scale at which you can run an inherently unsustainable business built on anticompetitive subsidizing and illegal behaviour without people noticing.
> I mean that I consider cryptocurrency scene to be net social negative
This sounds like nonsense. Cryptocurrencies allow people to perform secure financial transactions without relying on central banking systems, and to use the blockchain for things like decentralized time-stamping, automated escrow, prediction markets, online voting, and other services. There’s a huge amount of social value to these innovations.
> an environmental disaster
What specifically are you referring to?
> should have never left the research stage
I’m glad it did. In addition to their utility as decentralized mediums of exchange, they have inspired a lot of research in the fields of game theory and cryptography, including work on voting protocols, mechanism design, and secure multiparty computation. I think the world is much better off with this newfound knowledge, though you seem to think otherwise.
> As for Uber, their only innovation so far
Their innovation was revolutionizing ridesharing and ride-hailing. In 2015, and in the United States alone, they created $6.8 billion of consumer surplus [1], a surplus which has only increased since then. Over 95 million people use it on a monthly basis [2], with 14 million trips completed each day [3]. It is popular with the public, especially the young [4]. There is huge demand for services like Uber and Lyft to become more widely available. See for example [5].
You clearly have an axe to grind and it’s clouding your ability to evaluate things objectively.
Except for having invented the microprocessor, the x64 and ARM ISAs, Unix, C, SQL, SGML, TCP/IP, digital audio and video, electronic music, GUIs, logic programming, and basically everything we're taking for granted today.
And Unix and C, on top of Apple, the PC, SQL, ... I guess only the mainframe and the smartphone are not of that era. And if there were no internet, it would be invented. How cold is that? 100 degrees C.
A Zen master maybe you are: step into hot water and feel that it is cold. It is not the flag that moves, I guess; it is all in your mind. I still think the flag moves a bit, sorry. Quite a bit.