Outgrowing Software (ben-evans.com)
204 points by nreece on March 21, 2021 | 128 comments



I am going to bang on about this again, but imo software is a new form of literacy. Literacy of employees and customers transform(s/ed) all industries but we don't talk about the literacy department or literacy investments in corporations.

(It's what annoys me about low-code solutions - it's like trying to write your novel by linking pre-drawn cartoon squares together in innovative ways.)

You cannot compensate for lack of literacy in a company or society. Same for software.

In short, some companies and some societies will be software literate and have time to focus on the next problems, as Ben talks about here, while others are stuck in the transition period. But they must go through the transition period - having good answers to "publisher questions" won't help if you don't have good answers to API questions.

Software skills will remain a wage distinction for some time


There's a corollary with illiteracy but for mathematics, called innumeracy. Perhaps there's another for software, which is essentially a language of logic. It could be called illogicity. You could make up your own term if you don't like that one.

Literature is not just spelling and grammar, and mathematics is not just numbers and equations. Likewise, software is not just code and computers. Each of these is a way of thinking, a method of communicating, a realm to explore, and a system for organizing the world.

I would not be surprised if there are other systems like these which we have not yet discovered or invented. At least, I hope that there are.


I think it is easy to see programming as the language of logic. But here is what I've observed as my wife has learned to use Python, JavaScript, and R for her work.

The huge majority of errors have nothing to do with an inability to think logically. Instead it's "I'm getting some incomprehensible fucking error message from conda," and it turns out the root cause is that her machine has multiple Python installations, the places where dependencies end up installed are all fucked up, and the solution has nothing to do with programming.

The second most common source of errors is "weird language things". Something can be expressed perfectly reasonably if you were to read it as pseudo-code, but weird edge cases cause problems. Consider "==" vs "===" in JS. All sorts of fun issues there, and it isn't really a failure to understand "the language of logic" that causes people to bash their heads into a wall until somebody says "oh, JS has multiple ways of doing equality and you just need to know that".
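(For a comparable Python-flavoured trap, purely as an illustration of the same "you just need to know that" category rather than anything from the JS example, consider mutable default arguments:)

    def append_item(item, items=[]):
        # The default list is created once, at function definition time,
        # so every call that omits `items` keeps appending to the same object.
        items.append(item)
        return items

    print(append_item(1))  # [1]
    print(append_item(2))  # [1, 2] -- surprising if you expected a fresh list each call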

The existing software ecosystems are nowhere near ready to support people who just want to think logically about problems and algorithms. It is just too filled with "wtf does PC_LOAD_LETTER mean".


I can see what you're getting at here, and perhaps "language of logic" is not the most accurate way to describe what software is. Maybe "language of complexity" or "language of systems" or something else. Someone could come up with better terms or metaphors here.

The difficulties you describe, however, assuming they aren't being caused by flaws or bugs, are part of the "language" we're talking about. I think one of your assumptions here is that logic is simple. That's true to begin with, but the software systems we build are usually towering arcologies of logic, with all manner of intricate, sometimes counter-intuitive details. A single detail, when examined by itself, is relatively easy to understand, but when taken together, they form a serious challenge for any human mind to grapple with. I think you could make comparisons to sprawling works of literature or advanced forms of mathematics, where the bits and pieces can be grasped, but the number of pieces, and the connections between them are often too difficult to see all at once.


  language of systems
That sounds about right to me... and unfortunately, in my experience, most people (and a lot of devs too) are not very good at systems thinking... but it's an important skill to have!


Yeah - I think there's lots of bad tooling, but a lot of it does come down to logic.

Once people realize they're not really doing something 'wrong' and that it's just the tooling that's bad, they can start to understand that troubleshooting and debugging are most of what we're doing. Being good at troubleshooting and debugging is largely logic (isolating variables, testing, thinking about what it could be, knowing what to ask/search). It's why the dev joke of 'it works on my laptop' is funny.

The narrow scope of solving some explicit programmatic problem is one area where logic is needed, but debugging things is the more common use. Lots of historical things people had to debug are past that tooling stage and now mostly 'just work' (like compilers). Lots of newer technology is not close to that yet.

The analogy to literacy I think is a good one.

I wrote a little about this here: https://zalberico.com/essay/2020/04/19/how-to-become-a-hacke...


It's not like English and algebraic notation exist in some Platonic, perfect state for their respective domains.


Sure, but I've never seen somebody try to read a book just to find that the book is written upside down and only reveals itself during the full moon.

How would somebody teach my wife to deal with a weird dependency error message? It took me several hours to figure out. Googling error messages accomplished nothing. There was no reasoning really behind it. Just unique problems with python dependency management. There aren't really transferrable skills here that would empower somebody to rapidly figure this sort of thing out if they just rearranged how their thought process worked. You either know the incantations or you don't.


Actually, there are general techniques for troubleshooting: increasing observability of the system by raising logging levels or using a debugger, figuring out the expected behavior and looking for unexpected things, etc.

Here you'd just enable some way to see the dependency resolution process in more detail. What files are touched, etc. It can be as simple as running strace with an 'open' syscall filter, and you'd see some unexpected paths being touched, perhaps. But there are many ways to achieve the same.
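As a concrete illustration of that "make it visible" idea for the multiple-installations case - a minimal sketch, my example rather than anything the parent ran - you can ask the interpreter itself what it is and where it looks for packages:

    # Run this with the same interpreter the failing script uses.
    import importlib.util
    import sys

    print("interpreter:", sys.executable)
    print("search path:")
    for p in sys.path:
        print("   ", p)

    # Show exactly where a given package would be imported from, e.g. to
    # spot a second installation shadowing the one you expected.
    spec = importlib.util.find_spec("requests")  # swap in the misbehaving dependency
    print("resolved to:", spec.origin if spec else "not found on this interpreter")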

How would she know about strace? It's just one of the tools you learn if you approach problems with a generic "how do I make what this program is doing visible to me?" mindset when it's not doing what you expect. There are a ton of tools you learn if you approach problems this way over time, instead of searching online for error messages first.


These general techniques work for tech professionals. My wife is a historian. She needed to take a bunch of giant PDFs and split them into separate one-page files. Easy enough to script with Python. 10 minutes of coding. Hours of dependency hell.
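For concreteness, the ten-minute version looks roughly like this - a sketch assuming the pypdf package and a made-up file name, and getting the package to import cleanly was exactly the hard part:

    from pypdf import PdfReader, PdfWriter  # assumes `pip install pypdf` actually worked

    reader = PdfReader("giant_archive.pdf")  # hypothetical input file
    for i, page in enumerate(reader.pages, start=1):
        writer = PdfWriter()
        writer.add_page(page)                # one page per output file
        with open(f"giant_archive-{i:04d}.pdf", "wb") as out:
            writer.write(out)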

So now in order to do some simple scripting I need to run strace and look for syscalls? Why would anybody just know this? Oh, and she has a windows machine. So this is precisely the sort of useless advice that is widely available online. "Don't worry, you can always just learn how your entire dependency management package works on the inside and interacts with your operating system if you run into problems scripting something for your job" is not helpful. It is the sort of thing that makes people flee screaming from the very idea of doing anything more general-purpose than excel.


So what? You said:

> There aren't really transferrable skills here that would empower somebody to rapidly figure this sort of thing out if they just rearranged how their thought process worked.

Which is just false. There are transferable skills to achieve that.

Also, troubleshooting is not the only way to achieve the goal. She could have just chosen a different tool that doesn't come with a ton of baggage like Python does, if troubleshooting is too much of a bother. There are a lot of tools out there that could be used to select a certain page range from a PDF and save it as another PDF.

On Linux, I run:

    pdfseparate InputFile.pdf InputFile-%d.pdf
and the job is done. I didn't even need to install anything. On Windows, all you need to do is find a way to install the pdfseparate tool or an equivalent.


I guess I was unclear. The original post, which I was responding to, compared programming capability to literacy and described it as "the language of logic". The implication being that as long as somebody can learn to think algorithmically, they can use programming to solve their problems. My claim is that tooling is so terrible that "being able to think algorithmically" is not sufficient. Instead, you need to know all of this troubleshooting garbage.

When I said "there aren't really transferrable skills here" I mean that learning to think algorithmically will not transfer to being able to resolve these sorts of configuration problems.

Your final suggestion is similar here. Nothing about "learning to think algorithmically" helps you know that there is some tool already installed on Linux that can do this. That's trivia, not logical reasoning. One can learn the trivia, but we should recognize this as a necessary step towards making programming a daily part of many people's lives.


Yes. Makes sense. Thanks.

I agree that "learning to think algorithmically" will help someone apply programming language skills about as much as learning just English language grammar and very little vocabulary, will help someone apply English. Not much.


There are also workarounds as an alternative to troubleshooting.

Just change the CLI command that's failing to a function which `docker run`s the right thing with the rest of the params passed and be done.

But this assumes you know enough of your shell language to do that, and there will likely be things to troubleshoot there.

I've managed to avoid learning enough about OS internals to get a good handle on 'deeper' ways to troubleshoot like what you talk about with strace. Know any good resources to learn?


I like this perspective.

It also hints at a potential resolution: pull the team away from the technology for a little while so they can focus on the abstract problem domain.

Most developers probably don't have the capacity to think about shiny new technology while simultaneously running traps on the fundamental business domain abstractions. Taking time away from the computer to think about the problem you are actually trying to solve is a big deal. Many just get caught up fighting their own tools and lose sight.

One simple trick: if you are not yet at a point where you can cleanly model your problem domain in terms of SQL tables & relations (even if you don't intend to use this technology), you have absolutely no business touching the rest of the effort. There are notions that you can actually have a provably-correct model of your domain at a certain level. 3NF/BCNF schemas have fundamental mathematical implications that are very powerful. Entire classes of accidental complexity can be obviated with a clean domain model.
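As a minimal sketch of what "model the domain as tables & relations" can look like in practice - the entities are invented for illustration, and SQLite via Python is used only so it runs anywhere; the schema is the point, not the tool:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    -- Roughly 3NF: each fact lives in exactly one place, so there are no
    -- repeated customer details on orders and no repeated prices on lines.
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE product (
        product_id  INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        unit_price  NUMERIC NOT NULL
    );
    CREATE TABLE customer_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        ordered_at  TEXT NOT NULL
    );
    CREATE TABLE order_line (
        order_id    INTEGER NOT NULL REFERENCES customer_order(order_id),
        product_id  INTEGER NOT NULL REFERENCES product(product_id),
        quantity    INTEGER NOT NULL,
        PRIMARY KEY (order_id, product_id)
    );
    """)

If a question the business will obviously ask can't be expressed cleanly against a schema like this, that is usually the domain model talking.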


Yes. I think it is a Brooks quote that is something like "don't show me your code - I won't understand it, show me your database and I will understand your application."

Simple, robust data structures with complicated code acting on them.


"Show me your flowcharts and conceal your tables, and I shall continue to be mystified. Show me your tables, and I won’t usually need your flowcharts; they’ll be obvious." -- Dr. Fred Brooks


Could that word be computacy?


> It's what annoys me about low-code solutions - it's like trying to write your novel by linking pre-drawn cartoon squares together in innovative ways.

There's a reason societies which switched from hieroglyphics to alphabets have higher literacy rates—the best way to promote widespread literacy is to make it easy and accessible.

Before the pandemic shut us down, I was a volunteer teacher at Girls Who Code. We used Scratch, which I was a bit skeptical of at first, but then I watched the ten-year-old students struggle to type a sentence at three words per minute. In Scratch that didn't matter—they could create all sorts of programs.

In a different context, while I'm not personally a fan of complex spreadsheets, they seem to be effective for a lot of people, who I suspect would otherwise struggle with Python.

To be sure, Scratch and Excel are still much slower and clunkier than "real", text-based languages. But I'm not convinced that typing words into a text editor is the ultimate way to create software. And even if it is, we've got to find a way to make this stuff easier, because I don't want to live in a society where a handful of elites are able to make computers do their bidding, and everyone else is left to consume the scraps offered from on high.


Agreed - many tasks can be done with more accessible tools.

It's why Excel is so popular.

It's why modal editors suck for the majority of humanity. For most people and most applications, WYSIWYG is fine and allows people to use the computer to do their work.

We'll still need the more sophisticated tooling for a long time, but even that becomes more niche once it gets really good. Most developers are not writing assembly or machine code. Layers of abstraction can and do work, but every time a new one comes it's often viewed skeptically.

No-code is a little different, and limiting in the way Excel is too, but that doesn't mean it can't still solve a lot of the problem space for a lot of people if done well. Introducing young kids to programming seems like a reasonable place for it.

I think AI-assisted programming will be interesting, both in the form of Karpathy's blog post: https://karpathy.medium.com/software-2-0-a64152b37c35 and also just assisting alongside the dev. GPT-3 style, "center this on the page" -> correct CSS. Fast feedback via NLP here will be really interesting and helpful. We used to have to look up stuff in books, then there was Google and Stack Overflow; it'd be nice to be able to ask random questions and get the answer instantly in code in front of you.

This is a great related blog post if you haven't read it: http://worrydream.com/LearnableProgramming/


No.

Excel is a local optimum - a hill on the way up the mountain.

Yes, Scratch and Excel are good solutions to the problem of "it takes years to become software literate and I have six-week evening classes to fit it in, so what shall I teach"

but that's the answer to the wrong question.


Not everyone who drives a car needs to be able to rebuild the engine.

Not everyone who writes a video game needs to write the physics engine.

Not everyone who needs to analyze a dataset needs to write SQL.

Abstractions and tools exist at levels that provide value to people solving problems. The more accessible the tools are, the wider the net of people that can leverage them.

A minority of voices can loudly yell in a corner about how everyone needs to use modal editors and write their own assembly while the rest of the world moves on and ignores them.


Eh... SQL is like the steering wheel in your first analogy. I'm not compiling or writing SQLite from scratch, simply operating it.


I think saying Excel & co. are just shortcuts to do the same thing worse is a bit uncharitable. I have very little experience with spreadsheets and a fair bit of experience with imperative programming (I guess declarative too, from SQL, which is actually quite transferable to gSheets formulas).

I was putting together a spreadsheet to keep track of income events and display information on taxes at different brackets, running totals for income & tax, and capital gains events. I spent about 3-4 hours working on it, even though the formulas were all new to me.

The same thing would have taken me far longer as a python program.

Spreadsheets are really just the perfect way to operate on tabular data and display tabular data that has been transformed through a cascading series of transformation functions. I don't think it's possible to do this as efficiently through imperative programming.


> There's a reason societies which switched from hieroglyphics to alphabets have higher literacy rates—the best way to promote widespread literacy is to make it easy and accessible.

Source? This seems to have a lot more to do with socio-economics and politics than linguistics.


I see them as all tied together. Rising literacy rates increase standards of living. Rising standards of living leave time to learn to read. More people reading creates pressure to make reading easier. And so the cycle continues.


>I am going to bang on about this again, but imo software is a new form of literacy.

The four Rs: Reading, writing, 'rithmetic, R.

( https://en.m.wikipedia.org/wiki/R_(programming_language) )


Did you make that up? I love it.


Yes, I made it up just now.


RAWR.

Reading

Arithmetic

Writing

R


> software is a new form of literacy

Do you mean basic skills in using a desktop computer, or do you mean software engineering skills?

> we don't talk about the literacy department

Software development is a skilled craft, and will remain so.


I use the motor vehicle metaphor:

Software literacy is equivalent to: being able to drive a car, fuel it, decode the various dashboard lights when they blink, maybe change a tire and check the oil.

Software engineering is: Being a car mechanic.

So it is very useful to society if the vast majority of adults are able to drive. It is not important or worth the effort to have a majority of adults as fully trained car mechanics. Although it is good to have "on ramps" for those with the aptitude and inclination to pick it up, as a useful and paying job.

It is _now_ useful to society if the vast majority of adults are able to use a maps app, a messaging app, a word processor, maybe a formula in spreadsheet, calendar a zoom call and unmute themselves, have email and can distinguish a phishing email from real bank communications, ignore fake news on social media, etc.

The difference between a formula in a spreadsheet and full-time coding is one of degree, not kind; but you could say the same about changing a tire vs. a full vehicle overhaul.


I used this analogy myself the other day, and it's how I often explain software development to people who have been led to believe that computers are some deep mystery that could never be understood by the common man.

However there is one important difference between software developers and car mechanics, and that is that our work is in cyberspace so it can be effortlessly duplicated and sold a million times over. Car mechanics operate in meatspace, so in order to get paid they need to keep on fixing cars. A software developer could fix just one virtual car and then keep earning money forever thanks to intellectual property laws. It doesn't seem like equal pay for equal work.

Thinking about it this way has led me to become more skeptical of intellectual property laws. Software development isn't especially more difficult than any other skill, but we are disproportionately rewarded for it due to arbitrary legal constructs. It seems like the industry has a vested interest in giving the impression that what we do is impossibly complicated. If more people realized the truth, there might be a stronger call to abolish or at least reduce the terms of copyright, patents and so on.


I used to be a mechanic and the analogy fits. Rich people drive into the shop and bark at you to "just fix it." They don't care about any of the minutiae but it had better work the way they envision it working. Then they throw you a few pennies and drive off.


Where were you a mechanic that it only cost “a few pennies”? I feel like I get bled dry every time I take my car in...


Mechanics work ungodly hours and deal with very painful situations and barely make a living wage that can support a family. You feel like you are getting bled dry because the service does not scale. A highly trained professional has to focus on your problem until it is fixed and the solution is often very time consuming. Insurance costs are high and so are labor and material costs. It doesn't mean the mechanics are making a killing. The owners are often well off though. Hiring more mechanics scales for them.


That makes sense. Thanks for sharing your perspective.


I don't know myself, but if the big expense is parts, not salary, both can be true. The vehicle manufacturer would be the one capturing the profits of the overpriced brand-name parts.


"We need to decide if software is a car to be driven or an essay to be written." - Alan Kay

He goes on, of course, to argue that the car analogy is the wrong one.


Oh wow - do you have a reference to the original?



Man it really sucks this is being down voted. Software engineering is not like reading. Reading conveys information about reality. Software code conveys solutions to problems.

The latter requires understanding of computers, problem solving, domains, etc. Just like I can't pick up a book about organic chemistry and understand it, people are not going to someday care about how distributed systems algorithms work.

It is indeed a skilled craft and this weird, cyberpunk-ian idea that everyone will just converge on technical literacy is very hand wavey and ignores simple facts like some people just not having ANY interest in computers or tech.


Literacy is actually a good analogy. People have different reading abilities, but they're all under the umbrella of literacy.

Some people are illiterate, some people can read simple documents but struggle with higher complexity texts and contracts, and some people can breeze through high complexity texts. In the same way, some people struggle with computers, some are able to understand Excel spreadsheets, some are able to understand business logic expressed in a domain specific language, and some are able to read complex programs.


Writing is a skilled craft, and will remain so, especially at professional levels. But we still teach everyone the basics.


> Software development is a skilled craft, and will remain so.

Yes, you're right, but I would take the meaning that understanding software development and deployment, and how that can be leveraged, is a literacy. There are tons of IT transformation disasters because the companies upping their literacy get stuck with knowing the words (SOA and Agile) but don't know how to make coherent use of them.

Most projects these days are IT projects.

Think about it like a dev: software on a computer is useful. Networked software is more useful. Cloud and IaaS are really useful, SaaS is really, really useful, and knowing when to use each to actually get stuff done is the most useful of all.


>"Software on a computer is useful. Networked software is more useful. Cloud and IaaS is really useful, SaaS is a really really useful"

I actually prefer the reverse progression: the closer it is to my computer / my server, the less I spend on feeding somebody else's insatiable appetites. Being connected is one thing. Giving somebody else more and more control over your business is totally different.


your periodic reminder that Excel is programming.


It’s also extremely productive.

Management always thinks a new web report / dashboard will solve all the problems.

Team then spends a week or month building said report into system.

Only to find that it doesn’t help anything. Six months of iterations later it’s correct and useful.

Or have someone build a one-off report in Excel. Iterate a few times that afternoon until it's correct. Then it's handed off to developers. Avoiding months of rework.


Respectfully, I don't think it's insightful to point out that Excel is Turing complete. That's a technical curiosity, it doesn't inform our discussion here.

The boundaries of what we call 'software development' are fuzzy, but that's true for all sorts of things. Assembling a computer doesn't make you an electronics engineer, even though you're building a complex electronic system. If a high school teacher comes up with a basic arithmetic question that, by coincidence, has never been asked before, the teacher still doesn't count as a research mathematician for solving it.


I think you misunderstood the point (or they edited the comment). Excel is user-programmed automation, and it's incredibly widespread.


"Excel as technical curiosity".

Peak SV.


Reading itself used to be considered a skilled craft.

Only 4% of people knew how to read during the middle ages.

Mostly clergy and civil servants.

The grandparent poster is right imo. If programming becomes just another thing everyone knows, and everyone takes a software class every year like they take an English class, the world will be way better off.


The important difference between programming and reading/writing is that written language is just another representation of natural language. All the complexity is in understanding and using natural language. Reading itself is trivial once you have some practice.

Programming on the other hand is creating new abstractions (i.e. modelling). It never becomes trivial, because it's not just a new encoding of existing abstractions.

I think programming is no different than maths. Yes, everyone will have to know a little bit of it, but very few will be capable of modelling complex problems or implementing large software systems.


Well lots of people can read and write, but few people are able to write a good novel. Everyone I know who isn't necessarily a software engineer but has basic programming skills has benefited from that knowledge. For example with everyday office work there's a big difference between someone who's able to use Word and someone who's able to use Access. The latter requires programming know-how. Also shell scripts and automation. It has a huge productivity amplifying impact.


>Well lots of people can read and write, but few people are able to write a good novel.

That's exactly my point. Coming up with a great story is very difficult. Writing it down is trivial in comparison.

>Everyone I know who isn't necessarily a software engineer but has basic programming skills has benefited from that knowledge.

I completely agree. It's very useful and should be taught in school just like maths is taught in school. Maybe it should be taught as part of maths.

My point is that the cognitive difficulty of formal modelling is on a completely different level than that of transcribing words. We should adjust our expectations accordingly.


I agree but consider that Homer, one of the greatest storytellers of all time, was most likely illiterate. I've also met people who are outstanding at computer science but struggle with coding. I just think they're different skills and the people who can do both know how smart and important they are. It's sometimes a source of discouragement when the aim is to teach coding as a form of literacy. I read a story once where this company a few decades ago tricked a group of secretaries into learning to code LISP because they didn't tell these women that what they were doing was programming. Similar to the airline ticket salespeople who used to do things like edit raw memory in their sabre terminals, which is something most programmers today would believe they're not smart enough to do. It's amazing how clever people are at figuring things out if we avoid thinking about technology labor with the classic division of labor assembly line model.


I seem to remember a similar story about using LISP - not tricking people, but they gave the sales team / admins Lisp tools because they did not have time to code the "proper ones" - and the employees (women) rebelled when people tried to take them away. I want to say it was ... wrote much of Netscape then ran a nightclub ... memory like a sieve I have.


Rather than JWZ, you may be thinking of the Mailman system for customer service at Amazon in the early days, as recounted by Steve Yegge [1]

[1] https://sites.google.com/site/steveyegge2/tour-de-babel


> I read a story once where this company a few decades ago tricked a group of secretaries into learning to code LISP because they didn't tell these women that what they were doing was programming.

I'd like to read this if anyone can find it. :)


That was written by Richard Stallman, and is right on the GNU website:

https://www.gnu.org/gnu/rms-lisp.html


By saying " ed few can do the hard stuff", is that not redefining use of maths to an arbitrary cutoff line.

Think of all the instances of maths used globally. Consider the millions of people who more or less get the idea of an average - it's used across politics and common discourse. Does that count as maths? I would say yes, just as a for loop is programming. No, we are not expecting a compiler, but just "arranging the world so it can be easily iterated over" is a fine common objective.


>Does that count as maths?

Yes, that's exactly what I meant when I said "a little bit of it".


Language is just a tool to express a collection of ideas.

The reason programming is not an extension of our natural language is because we haven't included those concepts in our natural language yet.

The world would be way better off if everyone had better logic and analytical thinking ability


Interesting idea. Will everyone spend the next twenty years complaining about how much they hated Mrs Jones the programming teacher and how Stroustrup ruined programming for them?

Put it another way. The reason we interview people the way that we do, is because it isn't realistic (unfortunately) to expect that a CS grad can fizz-buzz, or delete from a linked list, or whatever. Anyone who has spent a little time interviewing can confirm this. Why should we expect that secondary education would do a better job?


If they were taught this kind of thing in school from an extremely young age, every year, these interviews would be trivial, just like speaking your native language is trivial.


We don't quite reach a very high degree of practical math literacy even though everyone is taught from a young age. Calculus being considered "incredibly difficult" is an example.


I know a lot of students who would not understand the idea behind calculus concepts, even though they knew the formulae. They simply could not grasp a very simple thing like "area under the curve" or "Slope of a curve". This was at a large private school with a selective admissions process. I know for a fact that my college educated parents (50+) would not know how to solve a calculus problem. So yeah, Calculus is a pretty bad example because it IS incredibly difficult for the general population.


I don't know if this is like a smarts thing or just a poor math education thing.


As someone who does math for a living, I’m always frustrated because I truly believe it’s a poor math education thing. But then people just respond, “oh MYNAME, you just think that because you’re good at it/you’re a math person.” As if that made the point less valid rather than more valid.


Leonardo da Vinci was likely one of the most intelligent humans ever to have lived, and didn't understand calculus. It wasn't for lack of interest, either! He was fascinated with parachutes, timing falling objects and trying to understand the rates of change. But despite nibbling at the edges of calculus, he never got there.

In contrast, it's now completely ordinary for a teenager to not only get to the core ideas but solve complex calculations with them in timed tests!


This is an odd comparison.

I think DaVinci was way before Newton.

High school students aren't asked to invent calculus just to understand it.

I didn't invent Dijkstra's graph traversal algorithm but I can read and understand it.

I think there's a difference between original invention and understanding 'n usage.


We all stand on the shoulders of giants and have increasingly powerful complementary cognitive artifacts that let us easily do things that used to be very difficult.

In a more extreme version, consider that ancient Greek and Roman mathematicians struggled with long division problems that many of us can do in our heads now!

Why? Arabic numerals are far more effective than Roman numerals for the task and after having learned them, we have a permanently increased ability to do certain kinds of calculation.


That's very fascinating.

I look at quicksort and that was revolutionary at the time.

Do we have ways to better conceptualize the world around us? Or are our brains evolving outside of some normal evolutionary process?

Fascinating to me.


I think both are true but mostly the first. Human evolution has sped up dramatically in the past 15,000 years, but it's still very slow.

Arabic numerals, literacy, maps, Bayesian reasoning, etc. are unnatural but better ways to conceptualize the world around us, and these tools and others have steadily become more prevalent throughout the population.

Maybe quicksort hasn't permeated through much of society, but binary search has! People born 100 years ago became comfortable flipping through phone books and quickly finding a name, whereas that would have taken a conceptual leap for most born 200 years ago.
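(For the curious, the phone-book trick is just this, sketched in Python:)

    def binary_search(sorted_names, target):
        """Return the index of target in sorted_names, or -1 if absent."""
        lo, hi = 0, len(sorted_names) - 1
        while lo <= hi:
            mid = (lo + hi) // 2          # open the book near the middle
            if sorted_names[mid] == target:
                return mid
            elif sorted_names[mid] < target:
                lo = mid + 1              # the name sorts later: look in the back half
            else:
                hi = mid - 1              # the name sorts earlier: look in the front half
        return -1

    print(binary_search(["Adams", "Baker", "Chen", "Diaz", "Evans"], "Chen"))  # prints 2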


I completely agree with you.

I think it's a failure of the school system from a very early age because math is heavily dependent on foundations of previous classes.

And I think it's more than just sharing the information: we also have to make rational-thinking fields as fun and cool as the emotional-expression fields of study like English or Music, and not the domain of 'nerds'.

I think our whole culture would be way better off if people had better critical thinking, logic, and math skills.


> because math is heavily dependent on foundations of previous classes.

This is a point I hadn't thought enough about before. Great insight.

It makes me think that maybe we just suck at education in general, and it only shows up in the subjects where continuation is dependent on previous courses.

If you completely fail to understand american history, you can lie to yourself about being good at it, and continue to believe that when you get to european history. If you completely fail to understand the lessons from Fahrenheit 451, you can believe you're good at English (the subject, not the language) while you move on to Shakespeare.

The only (pre-college) subjects I can think of that really require you to actually understand the prerequisite material are math and language. And they share that people say "I'm just not a math person" or "I'm just no good at languages."

I'd never thought about it with that lens before, but now I'm feeling kinda pessimistic. Like, if this hypothesis is true, it's not about bringing math education up to the bar of other subjects, it's about overhauling the entire system.


You use your native language all day, all the time. It's nowhere near the same.


Agreed.

Widespread ability to read and write natural language seems to have, eventually, made written constitutions not so much "viable" as "necessary".

It's easy to forget how innovative the U.S. Constitution was for being written, content aside.

https://en.wikipedia.org/wiki/List_of_national_constitutions

Definitely excited about the social implications of widespread ability to read and write executable language in the long run. Note when France crossed 50% literacy, though:

https://en.wikipedia.org/wiki/Literacy#/media/File:Illiterac...


I think I am the grandparent - and yes, I agree, software is going to be table stakes for the next generation of corporate players (and, by implication, a huge gap will open up for those corporates who cannot).

It's not just "can my employees write a for loop" but it's "do we store data sensibly, when upper management wants a report is there a data history stretching back to the raw (ie no manual involvement). Do we all publish APIs and work through those defined interfaces - and many more.


It's not clear to me what you mean by software. Software is just a tool, and writing software is also just a tool; more often than not, it is not that interesting in itself. Solving problems using software is the goal, and it can be realized in various ways, like those low-code solutions you mention.

This is why I'm becoming more and more unhappy about calling myself a programmer. It just doesn't have that "problem-solving using computer technology" ring to it.


>>> It's not clear to me what you mean by software.

>>> This is why I'm becoming more and more unhappy about calling myself a programmer.

Patio11 wrote about this and it is a good piece of advice - essentially, it means sell your service (labour) for the value it can produce, not the time it takes. This is a common piece of business advice and is very sensible.

But the main point is, as a software coder / developer / programmer, for most business areas software provides excess leverage, and so the amount of value you can produce is greater than for the same time from a non-coder.

However to do value based charging involves other skills - negotiation, domain knowledge, lead generation and lots of meetings that go nowhere.

But even if you don't do the latter part, that software can still produce outsize leveraged returns for the effort input. You just won't get a cut :-(

That does not mean you did not help create the value


Agreed. Programmers code. Developers build software. Most software engineers work under a developer (business subject matter expert, slash, product owner) as programmers.

The person who envisions and specifies the features is the developer. The person who translates those into code is a software engineer or programmer. Sometimes they are the same person. In the corporate world, rarely ever.

In the corporate world if the software engineer can capture and document a business domain's processes then they are effectively taking the job role of developer from the product owner. Most people are comfortable leaving that in the hands of someone else and making it "their job." Engineers have the chance to take power but rarely do because they get caught up in the weeds of trying to convert a trickle of requirements into code because...agile. Developer Hegemony, when are we going to wake up?


I rarely reply to a comment here on HN. But yours resonated with me. Software as a new form of literacy is a good mental model. Also "You cannot compensate for lack of literacy in a company or society" is spot on. Thank you.


Thank you :-) I am actually writing a book about this and a few other ideas - watch out for "the software mind".


Extra ranting: It's worth noting that there are a lot of businesses that also "do not fit in one person's head". Anyone who has spent time in a large organisation will be constantly coming across processes that seem designed solely to prevent anything from happening, and knowing how to get around them is a hidden mystery. I am sure there are billion dollar departments in my company I have no idea exist. And it's dubious that any sufficiently large organisation is truly understood by upper management - almost every corporate failure can be attributed to something like "Yes of course I know how engines work, we can move them forwards quite easily".


Low code isn't inherently that bad as long as it offers a way to integrate with traditional code. Microsoft Excel is tremendously successful in this area.


Low code isn't bad until you try to use a lot of it just like OOP isn't bad until you use too much of it. There is always a temptation that leads one to use too much of these. We have had low-code visual software design for decades. SQL Server Integration Services comes to mind. It does not solve the core problem. Maintainability. The last thing software needs is an invitation to more people that aren't yet aware of the problems of maintenance.

There is not an extremely high barrier to coding. That is not a problem. There is a huge barrier to learning how to produce maintainable and adaptive systems and that barrier is only made worse with low-code tools and services.


I worked on adding some functionality to a Microsoft PowerApps Covid office tracker template. Having to navigate the GUI for everything was quite cumbersome.

Although it was only two function calls in two places, it seems that every time the template is updated, the calls will have to be added back. Instead of doing a merge in Git, you will now have to remember all the navigation steps to add the changes each time. I don't want to think about what happens if you have more than one person working on an app.


We have automatic version control for our platform; give us a try when we launch.


Successful "tech" companies need to have a (relatively) small number of people who understand the possibilities that software open up for their business model.

They have a number of people who are skilled at managing the software and product development process and a (potentially large) number of people who are skilled at software development.

Seems to me to be very different from the notion of literacy, which is a set of skills that essentially every employee needs to have.


The novel analogy is the right one, but it works against your point. Everyone needs to write. Few need to write novels. Just like increasingly everyone needs to write software, but few need to be engineers.


Everyone can write a novel. A lot of HNers have a word count at (short) novel length without worrying about it.

I think the issue is most software developers are paid full time to write. Pay me full time to write novels and you can have an endless supply of low quality Dan Brown.

I think the novel analogy is useful - there are an awful lot of novels being published (even more self published). Most are drivel ... unfortunately similar to the quality distribution of software.

Very few people and less organisations are capable of consistently great writing.

In my view we should be aiming to see what makes something like the Washington Post work. It is an agglomeration of skilled people who manage to write a coherent trilogy of novels every day.

Being literate is only the first challenge - building literate organisations is a much harder one.


Coders are becoming a dime a dozen. I would tell future generations to steer clear and specialize. There's a lot more to it than programming.


> Literacy of employees and customers transform(s/ed) all industries but we don't talk about the literacy department or literacy investments in corporations.

Well, we do, it's just called the "copywriting department". :-) Literacy is important for modern business, some people are better at it than others, corporations pay such people to focus on literacy. Same goes for software development: yes, it's rapidly getting commoditised, but I don't think software development as a profession is going anywhere any time soon.


Every business person with the money can hire a bunch of coders to do the actual work. So I don't think "software literacy" is or will be a thing.


Typical HN crowd, upset that people don't agree with their technomancer future and downvoting anyone who says software engineering is hard.

You're right, in the same way that math literacy still isn't a thing.


We always seem to look at tech as if it is a thing unto itself, instead of a catalyst and enabler. Every industry is transformed by technology, but it remains fundamentally about the same thing. There was still a transportation industry after the move from horses to cars, and there is still a music industry after the move from physical media to digital downloads. If you don't understand the industry you are in, it doesn't matter how good your technology is. That doesn't mean things need to be done in the same way. What is transformative about software eating the world is the ability to bring distribution costs to zero, enabling radically different models. Only a deep understanding of an industry allows for a successful technology transition.

For example, for me what is interesting about autonomous vehicles is not the convenience, but the transportation models enabled by a radically lower cost per travelled kilometer. Those who make the software for the autonomous vehicles will surely succeed, but so will those who understand what transportation models become possible using that software. I'm not sure whether being a car maker is an advantage or disadvantage, it depends on how well the car maker understands the transportation industry. For this reason I'm also not clear whether Tesla is relevant to the future of autonomous vehicles. It seems unlikely we're going to drive around in similar ways as today, just with a computer at the wheel. Tesla seems to be working towards improving today's model of car ownership and usage by making nicer cars. That's an old world model. Whatever the future of personal transportation is, it is not that.


This is why I think SpaceX and Starlink are bigger businesses than Tesla. I already have an electric car, and it's made by Hyundai, and they've been better at making cars since Elon was in diapers. Tesla is a catalyst, not the final state of the system.

Those who can charge a toll on the network, though, are in a more interesting position. It’s not clear that anyone other than SpaceX is going to be able to build rockets cheap enough to maintain an Internet constellation any time soon. It’s certainly likely that others will try, but it’s possible that SpaceX will have the final word on cheap space flight, at least for a generation.

To the point of the article, the real beneficiaries will be those who understand the consequences of humanity now having the option to spread out at low density, and due to autonomous driving and global connectivity, not seeing a drop in the quality of physical and intellectual life.


Once you have the AI experience and expertise, and have the battery technology and the experience building factories at gigascale, you can easily pivot away from the sedan/suv form factor and build delivery vehicles or whatever makes sense in the future.


"Software is eating the world" meant software will permeate and govern everything: every thing and every industry will be programmable, and the world will be full of objects which are linked at a distance and change behaviour regularly, perhaps even with intents of their own. The combination of ML and robotics will spread software even farther into our physical world.

The author compares software to consultants, disruptive but not producing meaningful change. I don't agree with this - industries are being changed forever or replaced, but the transformation isn't done yet. The old intermediaries who used to decide what we consume are not dead yet, but I think that power will be diffused across society as the cost of production falls and new mediums emerge.

Retail - we’re only partway through a transformation but big retailers simply have no reason to exist in a world where you can have a package delivered same day to your door from a warehouse.

TV - this will be replaced by other screens which encourage participation (live chat etc) and self-production (YouTube); that's not yet complete.

Publishing - the entire knowledge of humanity is available online for the first time, most of it free, and people can publish their thoughts at zero cost. This is a remarkable revolution with profound impacts still reverberating. Gatekeepers like academic publishers are increasingly untenable and out of touch but are not gone yet.

Music - democratised as the means of production are cheaper but the incumbents are not dead yet, however their position is becoming untenable.

IMO we’re not even halfway through this revolution which is comparable in scope and time to the industrial revolution.


>> but I think that power will be diffused across society as the cost of production falls

I feel like this is the kind of optimistic sentiment, expressed over the years, that this essay is countering.

Empirically, we've seen the opposite. Yes, "software ate the world." But no, it hasn't diffused anything. Once upon a time, it was expensive to press records or print books. Someone had to bankroll it, and so record labels and book publishers came to be. What actually happened when music and print digitized? Did they go to a flat, federated, diffused or disintermediated structure? No. They got more centralized.

Unaffiliated musicians and TV/film makers work for YouTube or Spotify. They have no control or influence over those companies, and have no power. In fact, they have less power than before because they, unlike the platforms, are decentralised.

>> Music - democratised as the means of production are cheaper but the incumbents are not dead yet, however their position is becoming untenable.

Untenable in theory. Sure, there is no need for someone to bankroll record pressing or book printing anymore. Hello zero-marginal-cost abundance. Yet, labels and publishers still exist. The industry structure is more centralised and more intermediated. Above labels/publishers are Amazon, Netflix, Spotify, Apple or Google... the new top of the pyramid.

On one hand, we can believe our own 2004 speculative reasoning about digital economics... and what is or isn't tenable. On the other, we can look at reality in 2021 and accept the prevailing trends. YouTube, Spotify, iTunes, Amazon, Netflix and future monopolies are what abundance looks like, not Wikimedia, Linux or the WWW.

We were wrong. Abundance has not impacted economic structures the way we expected. We have what we had before, but now with a tech monopoly at the top.


I disagree that music and publishing have become more centralised.

I have seen the death of the book publishing industry first hand - from the outside it might appear relatively unscathed, from the inside it has been hollowed out and is untenable. Publishers are clinging on but increasingly irrelevant and their margins are non-existent, which pushes their quality down, which undermines their business model further. They will survive another generation though on nostalgia and with steadily declining sales. They simply don’t have a viable business model any more.

Sure we’re not living in the utopian future some imagined but that does not mean the transformation was not profound and ongoing.

The transformation is far from over in those that were vulnerable to the internet and many industries have only just started being consumed.


It's true that content publishers (music, books, video, academic papers, news, software, etc) are all being eliminated by the current disruption. The problem is that they added real value to the chain, and that value is also being eliminated.

The current trend of dumping massive quantities of low-value, undifferentiated product in a heap at the feet of the consuming masses is wonderful if your metric is quantity. By eliminating discoverability, accessibility, quality, and curation you can massively increase the quantitative choice of consumers. In theory the cost difference is either saved by the consumer or passed on to the creator.

In addition, eliminating the idea of capitalizing investment in content creation means far less investment in content creation. The marginal cost of copying is virtually free, but the cost of creating the first instance to be copied has not decreased.

I suspect what we're seeing is the first wave of restructuring: tearing down the old to create a vast swamp of crap. I expect a second wave to come in which the vacuum created by tearing down the old will be filled by new publishers and their ilk to provide the services of discoverability, accessibility, quality, and curation, and the capitalization of content creation.


I agree broadly, and I think we both agree with the author's main point... life goes on.

But, I think we need to be careful with our expectations. Currently, the biggest factor determining what content gets created is the preferences or MO of a handful of big guys.

Netflix is like the old world. Executives. Deals. Do Netflix insiders still like sexposition or do they think it's corny now? It's a lot like selling a show to a cable channel. On Youtube, a minute of news equals a minute of fart jokes and subscribers are important. That's their ethic.

The future isn't really about what "content" will do.. it's about what youtube or amazon will do. These companies are not neutral conduits of consumer preferences, whether or not they make decisions algorithmically.


> The current trend towards heaping massive quantities of low-value undifferentiated product in a heap at the feet of the consuming masses is wonderful if your metric is quantity.

But filtering for quality should be the job of critics/reviewers, not gatekeepers to the industry.


I see the author is here, so I feel kind of weird defending the position for him. What the heck though (sorry benedict)...

The point isn't that technology hasn't had an impact, or that software didn't really "eat the world." Old businesses and industries do have to adjust, and sometimes fail. That's not the point either. The point is "what comes after?" or rather, "how do these industries look now?" The answer to that is closer to "same as it was in 2005" than it is to "what we expected would happen in 2005."

IIRC self-publishing is still at 5%-10%. How does book publishing work today, 13 and 30 years after the Kindle and the World Wide Web digitised publishing? A writer sits around wondering who they need to shag to get a publishing deal. If they're lucky, they get a publishing deal, like in 1988. Then Amazon sells it.

Maybe publishing is going through technology-induced lean years. Maybe a lot of them will go out of business. News publishing has gone through crisis moments, where they beg politicians to make Google and FB share some ad revenue back with them. What has not happened is diffusion, disintermediation or some such. Neither creators nor consumers are more empowered. From the outside, it's business as usual and the outside view is what counts.

I remember Seth Godin's take on the kindle. At first he was excited. New medium, new message. Then he was disappointed. Digital books would cost the same. That meant the goal was to keep everything the same. There would be no penny-per page business model. No nonfiction equivalent of a short story. Amazon would be chasing deals with publishers, not opening up to new people. Jeff was right. Seth was wrong.

I share a lot of your sensibilities. But when reality contradicts theory, we need to adapt the theory.

The same is true of most large industries touched by digitization. Some (like publishing) had tumultuous transitions. Some (like banking) had pretty cushy transitions. In almost no cases has the "economically untenable" logic proved out. Banks have not evolved into lean, mostly-software organisations. They've gotten bigger and hairier. Is this "tenable"? Maybe not, but it is the norm.


OK I see where we disagree I think.

You (and the author) think this transformation is over; I think it's just starting.

> IIRC self-publishing is still at 5%-10%. How does book publishing work today, 13 and 30 years after the Kindle and the World Wide Web digitised publishing? A writer sits around wondering who they need to shag to get a publishing deal. If they're lucky, they get a publishing deal, like in 1988. Then Amazon sells it.

This is not an accurate summary, no. Today an author doesn't need a publishing deal to get in front of millions of readers in the way they most definitely did in 1988. They can set up a web page and sell their book themselves (and many are doing so). Just to pick one example, this book is self-published[1] and has its own website; on Amazon the publisher is listed as CreateSpace Independent Publishing Platform. There are certainly problems - Amazon is a predatory monopoly which will suck the margins out of any successful seller on its platform - but the entire industry is being shaken and self-publishing is growing in importance. Many of the old publishers see this coming but have no idea what to do about it; they're either watching their business slowly die or moving online selling curated content and leaving books behind.

To pick another example, video: channels like YouTube and Twitch have led to the rise of internet celebrities not chosen by any producer or channel, but who set up their own stall and sell themselves. Gaming channels like Steam or mobile have led to a boom in indie game studios and small games. There are certainly excesses and mistakes involved in that, but it is an entirely new mode of production enabled by the internet, and again I think it's in its infancy. It's not a fad which is going away - broadcast TV itself is going away and being replaced by a more interactive mode; it just doesn't know it yet.

So I don't agree the information revolution is over; it's only just beginning, and will probably take at least 50 more years to play out fully. We only started carrying powerful internet-connected computers in our pockets about a decade ago; the implications of that alone are far-reaching.

[1] http://momtestbook.com


Maybe we're getting closer but I don't think "You (and the author) think this transformation is over" is accurate. There is no pre/post. Change is continuous. Business is tumultuous.

Speaking for myself, I think the over/not-over framing is part of the illusion to shake. It is part of the theory, and the theory has failed: a pre-digital world and a digital world, where our first-principles economics will finally play out. That day is not coming; marginal-cost economics is not going to play out to its "logical" conclusion.

There is a real reality out there. In it, digitisation of content has consistently led to monopoly. Google produces the world's most popular operating system and web browser just to support its search monopoly. It has no intention of being a neutral pipe.

To get grandiose... I feel like we are/were misguided by deterministic thinking. We see the possibilities of digitization, and assume they're inevitable. They aren't. They were opportunities, but they were never just going to manifest themselves.

The WWW was/is free because TBL made it that way. We believed that openness was an inevitability. Reality has schooled us.


I am also disappointed that Kindle and other eBook costs are not much lower than printed books. BTW, my wife and I were talking about how differently Amazon runs Audible and Kindle. While it is true that Amazon extracts $30/month from us for two Audible accounts, Audible gives away an amazing number of free Amazon Original branded audio books for members. These are often short books or experimental stuff, but often very good. We also get access to each other’s paid-for audio books.

Contrast this to Kindle where we can’t freely read each other’s book purchases. We try to use other platforms like Google Play Books that enable family sharing.

I have my own take on the publishing industry. I started by writing ten books for conventional publishers like Springer Verlag, McGraw Hill, etc. I ended up self-publishing my own eBooks, which people can read for free or pay for. I earn much more money doing technical work than writing, and I enjoy maximizing interaction with people reading my books, so I think I have reached a global maximum strategy for myself.


Typically the encouraged participation on Twitch and YouTube is a mechanism to strengthen the para-social relationships people form with "content creators", which I don't think is a good thing. If more and more people stop making friends and building communities around them in favour of their "friends" and "communities" online, they're going to continue being at the mercy of the established order (whether that be governments or corporations): there's no real unity in an online collective, and no meaningful action a disparate group of people can take to effect change in a single place.

On your publishing point, it seems to me we're just abandoning quality control and rigour under the guise of begrudging gatekeeping (sometimes the gate does need to be guarded). Whilst I agree that the subjectivity of art means more people being able to make and share music is probably a good thing (you can't definitively say there's "bad music"), the same can't be said for information.

Perhaps I'm too cynical, but a large aspect of the past 20 years of software proliferation I don't like is the idea that we should just build things, consequences be damned (Mark Zuckerberg especially holds this belief). We never take the time to properly reflect on whether there's a way to prevent harm up front, or whether, if the harm produced can't be mitigated, maybe we shouldn't build these things in the first place. A rhetorical but interesting question: how many relationships do we think were ruined by Facebook? How much dysmorphia and self-loathing was caused by Instagram?* How much hate spread by YouTube and Twitter?

*(These two I find particularly interesting: we weren't even part way through uncovering the ill effects of the unrealistic beauty standards set by Hollywood and the advertising industry over the last 50 years before along came these websites/apps, whose solution was to let you airbrush and manipulate your own life and set your own unrealistic standards. There's now a generation, with more to come, who have just grown up with this being the way the world is.)


There are certainly a lot of bad aspects to the popularity contest that is the current internet. Sorry if I sounded too Panglossian about recent changes, but I think people overlook just how profoundly our lives have been, and are being, transformed.


Thanks for reading. However, that’s not what Marc meant by ‘software eating the world’ and it’s not how I described consultants.


Thanks for the article, it was an interesting read.

Genuinely interested, what do you think he meant? What do you feel the phrase means today? I read it as software will transform and replace many industries and physical things but will go back and find that essay again now - perhaps it's the bit about software permeating the physical world you object to? It was a really interesting phrase that I think resonated with a lot of people, so perhaps I read into it lots that wasn’t there.

Re consultants, I was referring to this bit:

> There’s an old joke that consultants are like seagulls - they fly in, make lots of noise, mess everything up and then fly out. That’s pretty much what tech has done to media industries...

It seemed to me your thesis was that software is done transforming these industries, and the only interesting problems left are industry-specific rather than software problems? I think there is still a lot of change left to go.


One downside to lowering the barrier of entry in all of these areas is the lack of any kind of quality control or content curation. Sure, anyone can publish their thoughts online for free, which I agree is a good thing for openness and freedom, but not everyone is equally qualified or competent. It becomes increasingly difficult to sift through the mountains of sub-standard garbage to find things worth consuming (NB I dislike the word “consuming” in this context, but I don’t know of a better catch-all term).

Now sure, the old gatekeepers weren’t perfect at this either and they published a lot of garbage too. But I think this is something that needs to be discussed as content creation becomes increasingly democratized.


Completely agree, as popularity != quality, and curation without cornering the market does have a lot of value. When I said gatekeepers I was thinking of academic publishers vs Sci-Hub, for example, or non-fiction publishers who operate on selling big names rather than quality.


This was interesting but it reads like it's only an introduction. What does the title mean? How are we outgrowing software if every company is a software company? Where are the important questions if not in software?


Reminds me of Joe from Halt and Catch Fire observing: "Computers aren't the thing. They're the thing that gets us to the thing."


A variation of "the network is the computer" perhaps.


Whenever I read articles like this I love to replace words like "software" with "electricity" or something similar. All these industries are, and will continue to be, disrupted by whatever the latest innovation is. It's just that software is the latest big one to change things so dramatically. People say similar things about crypto and finance, or AI and writing. It seems silly to think this has ever not been the case, or ever will not be.


It's also getting easier to make software and harder to make physical goods. So software is probably still the place to get started on a problem. I wanted to make something out of carbon fiber, but I think I'll write my own CAD software that can simulate braided structures before I buy an autoclave and scale a production line.


> More fundamentally, though, for both music and books, most of the arguments and questions are music industry questions and book industry questions, not tech or software questions

I agree with this so much, but in practice people actually don't realize this enough.

When I work with business teams, they have a hard time distinguishing business rules from tech problems. So they'll say something like: oh, the payouts aren't deposited in time, we need to detect in advance that a payout is required and start the deposit process earlier. And then they'll say: that's a tech problem for the tech team to solve.

Except not really. Yes, it takes a software engineer to program a computer to enact these rules and this process, if you've chosen to have the process managed by a computer. But all the difficulties are in defining the business rules. When exactly should you start the depositing process? Based on what will you decide when it has to start? What if it's started and the customer removes their bank details? What if, etc. These all become hard business problems for the business to solve.
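
To make that concrete, here's a minimal sketch (hypothetical Python, with made-up names, lead times and conditions - not anyone's actual payout system) of what encoding such a rule might look like:

    from datetime import date, timedelta

    # Hypothetical constants - the real values are themselves business decisions.
    BANK_TRANSFER_LEAD_DAYS = 3   # assumed settlement time for a transfer
    SAFETY_MARGIN_DAYS = 1        # assumed buffer for weekends and retries

    def deposit_start_date(payout_due: date) -> date:
        """Latest date on which the deposit process should be kicked off."""
        return payout_due - timedelta(days=BANK_TRANSFER_LEAD_DAYS + SAFETY_MARGIN_DAYS)

    def should_start_deposit(payout_due: date, has_bank_details: bool, today: date) -> bool:
        """The 'rule': start depositing once the start date is reached,
        but only if the customer still has bank details on file."""
        if not has_bank_details:
            # What to do here (hold, notify, cancel?) is a business decision,
            # not a technical one.
            return False
        return today >= deposit_start_date(payout_due)

Whether three days is enough lead time, and what happens when the bank details are missing, are exactly the business questions that quietly get delegated to whoever writes this function.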

So what happens in practice is that business domain experts, over time, abstract more and more things to the computer, leaving the software to actually answer these business questions and define the rules for them. In turn it often falls to the developer to figure out the solution to the business problem and a strategy for solving it, and then implement it in the computer. Those things often end up happening together.

That's why I think software companies, in effect, slowly move to owning the business itself. And this is part of how software is eating the world.

As computers take over the control, enforcement and execution of more and more business processes, the management and administration authority is slowly inverted: software ends up at the root, and software engineers at the helm. It becomes increasingly difficult for business people without that software background to realize that most things are not software problems but business problems, which they could be focusing on themselves but are instead delegating more and more to software companies and software engineers.

It's pretty common to ask a business domain expert how something should be handled, and for them to ask back: well, how does the computer currently handle it?


This piece has an odd conception of technology as a thing that transforms an industry once and then screeches to a halt.

Technology is continuing to transform each industry it mentioned. Retail is rapidly becoming automated to a degree that would have looked like science fiction a decade ago, Tesla's battery technology continues to improve, video games continue to eclipse the movie industry.

Speaking of the movie industry, it's very likely that if that day ever does come when Tom Cruise actually looks his age, the on-film Tom Cruise will not. Young actors of future generations will face competition from AI-generated likenesses of the stars of the past generation.

Tech isn't close to done with any industry.


> Speaking of the movie industry, it's very likely that if that day ever does come when Tom Cruise actually looks his age, the on-film Tom Cruise will not. Young actors of future generations will face competition from AI-generated likenesses of the stars of the past generation.

I don't think people are drawn to star actors simply because they like their face. It's more about the kind of roles they choose and how enjoyable they are to watch based on their talent.

Unless the AI generated versions work the same way, showing up only in very few movies of a certain type/quality, I think people will get over them very quickly.


> tech will change everything, but once the dust has settled the questions that matter will mostly be retail questions, not tech questions

I'm not disagreeing but how will we know the dust has settled?

I might have thought the dust had settled with music when you could download mp3s. But now streaming services are the thing.


Software ate the world and "the important questions are somewhere else."

What do you think those are, and weren't they always somewhere else?


Software ate the world. And now Apple is eating software.


Sounds to me like Ben agrees that we're now in the Deployment Age (http://reactionwheel.net/2015/10/the-deployment-age.html).

I wonder if he also thinks that the VC model is a less good fit here.


The story displays a lack of historical perspective which I find common among technologists, particularly those focusing on IT.

The entire article seems to be based on the following premise:

    "In the past, IBM, Oracle or Microsoft sold technology to other companies, as a tool - they sold computers and software to GE, P&G and Citibank. Now there’s a generation of companies that both create software and use it themselves to enter another industry, and often to change it."
This claim of an unprecedented economic change currently occurring seems obviously indefensible to me or at the very least historically ignorant?

New companies exploiting (rather than selling) new technology (not just software) to displace incumbents is the oldest story in the world.

And IBM, Microsoft and Oracle still sell technology as far as I can see.



