I think about this a lot and the world seems more and more like this every day.
He mentioned that Bob Colwell, once the chief microprocessor architect at Intel, ran into some issues with chips from TI. He went to TI, and the people there told him they couldn't figure it out either, because the fundamentals had been laid down by first-generation magicians. And it's not just TI; Motorola and Fairchild have the same kind of problem.
I remember an old Intel fellow who understood fabs, designed CPUs, had OS experience, wrote a VM, and did web programming as a hobby. (He gave a talk somewhere, but my Google fu is failing me.)
Now everyone is so specialised in their domain that no one has a good overview of anything. And no one understands how all the puzzle pieces were fitted together in the first place.
And after one or two generations, the reason why it was crafted in the first place is lost. And we are left with people specialised in that domain to "maintain" it. Or reinventing the flat tire, as Alan Kay calls it.
Obviously it's a little more complex than that, but the importance and value of education and documentation is very much in that realm: drop the next generation in a desert, and we go back to the stone age instantly.
We do this all the time with technology. It's because we're addicted to change, and we have no judgment about how much change we need and where.
Whether we are addicted to change or not, might be another question though.
Indeed, the problem is that the users of the machine faced a choice between three possibilities: replacing the machine as expertise in the old tech dwindled; building and maintaining that expertise over the course of decades; or doing neither and letting the equipment lapse into inevitable disrepair. They chose the last.
Change is constant, which is good, because there is no progress without it. The only choice we have as individuals is how, and how well, we adapt to it.
Tell me more about how people hate writing documentation, and then tell me how they can't shut the fuck up about not having it when they need it.
This is a peeve of mine.
What do you expect?
A better approach is to learn just enough to do it yourself, but it's not easy to document things well.
I like to think about documentation like this: Without it your work will be almost wasted as nobody can make use of it. That helps a bit :)
Simple things like a linter that yells at you if a class has an undocumented method would probably at least be a step in the right direction. People may complain that it would lead to overdocumentation but I'd argue that it's probably better than underdocumentation.
Even something as simple as a warning when there appears to be a long code block without documentation would be a step in a decent direction. What seems trivial and obvious today likely won't be in a matter of weeks - and it generally just gets worse from there.
Of course that can lead to developers just adding a single line comment with the name of the type ... but it's still a nice feature.
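Real linters like pydocstyle already do this, but the core idea is small enough to sketch. A minimal, illustrative Python version using only the standard `ast` module (the `Cache` sample class and function names are made up for the demo):

```python
import ast

def undocumented_functions(source: str) -> list[str]:
    """Return the names of functions/methods that lack a docstring."""
    tree = ast.parse(source)
    missing = []
    for node in ast.walk(tree):
        # Flag both sync and async function definitions with no docstring.
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            if ast.get_docstring(node) is None:
                missing.append(node.name)
    return missing

sample = '''
class Cache:
    def get(self, key):
        return self.data.get(key)

    def put(self, key, value):
        """Store value under key."""
        self.data[key] = value
'''

print(undocumented_functions(sample))  # → ['get']
```

A real tool would also check classes and modules, and (as noted above) can't stop someone from pasting in a one-line comment that just restates the name - but even this much surfaces the gaps.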
IME, the only thing that works is to attack it from a Project level: make usable documentation (needs a review before being signed off) a deliverable or the project isn't done.
The thing is, I don't think this happens in isolation. At least not for me. I hate writing documentation because time is not allotted for it. So I have to rush it, and in the process I feel like I'm doing a poor job, which is demoralizing. I'm gonna dread it next time I have to do it.
The choice, really, was "hire some ludicrously expensive COBOL devs, or replace the entire system". Replacing the entire system failed on every sane business evaluation: high-risk, hugely expensive, no guarantee that the new system will even work. So they hired the ludicrously expensive COBOL devs (because the devs needed to understand 1970's-era COBOL, and they were rare in the OO-frenzied 1990's) and patched up the old system.
But as time wears on, those systems fall further and further behind, and the COBOL devs who actually know how to maintain them get more and more expensive (or simply unavailable). The costs and risks of replacing the system are still too high for any given marginal change, but the marginal changes are getting very, very expensive.
And then 2038 rolls around and they'll face the same choice. And the same risks will come up. It'll be interesting to see what they do, and what choices are available. Because patching the old system may well not be possible at this point, because there's nobody left who understands it. And migrating the complex business logic to a new system may well not be possible, because there's no-one left who understands the old code.
If anything, I think having a separate, archaic, mostly frozen system base for critical infrastructure is a good thing. The current ancient COBOL abandonware situation is probably not the simple and mostly standardized solution one may wish for, but it's a lot closer.
COBOL doesn't actually pay that well, no matter how badly the business depends on it. The law of supply and demand would suggest that it should, but it still really doesn't.
This is fine as long as the "in maintenance only" portion of the industry remains small. Which will only remain true while the industry is growing exponentially. However, as with industrial machinery, there will eventually be a physical limit to how much better computer networks, microchips, web search, cameras and the rest can get.
Once that happens, growth in the core parts of the industry will slow, and more software will be in maintenance. However, software components are both opaque and orders of magnitude more complex than physical parts. As an example, losing practical knowledge of the Linux kernel means losing knowledge of ~27.8 million lines of code.
There may come a time in a number of decades that getting a driver patched is an impractical activity due to the scarcity of knowledge ( or potentially even the ability to build the driver... )
e.g. why update the NIC drivers if the NIC hasn't changed in 20 years? This could easily turn into "how do we even modify and release our company's NIC driver?"
Could you please elaborate why?
I would agree that such a business has relatively low ROI. Still, for example, Linux distros and *BSD foundations are to an extent a maintenance business.
Build something new, affect bottom line directly vs maintenance.
Blame the tax structure for placing an emphasis on new.
COBOL still runs the world, despite what many on this board think.
The example that springs to mind is the social security check printers. I think they wound up reading the wire voltages as checks were printing to duplicate the behavior for the y2k fix. Was urgent as many people relied on that income.
There are rare events that you can't do much about up front; they're external. A pandemic might be a good example. There are other rare events you can simply avoid, but it's often tempting to just skip the maintenance and let someone else deal with it when it breaks. It's rare, right? Not like we're going to get blamed, or even be around to have to deal with it.
Now guess what happens if we forget how to make something intrinsic to modern farming.
Not being snarky; just want to point out that current soil erosion rates due to modern farming mean the world will run out of dirt by the end of the century.
"Imagine that the natural sciences were to suffer the effects of a catastrophe. A series of environmental disasters are blamed by the general public on the scientists. Widespread riots occur, laboratories are burnt down, physicists are lynched, books and instruments are destroyed. ...Later still there is a reaction against this destructive movement and enlightened people seek to revive science, although they have largely forgotten what it was. But all that they possess are fragments: a knowledge of experiments detached from any knowledge of the theoretical context which gave them significance; parts of theories unrelated either to the other bits and pieces of theory which they possess or to experiment; instruments whose use has been forgotten; half chapters from books, single pages from articles, not always fully legible because torn and charred. Nonetheless all these fragments are reembodied in a set of practices which go under the revived name of physics, chemistry and biology. Adults argue with each other about the respective merits of relativity theory, evolution theory and phlogiston theory, although they possess only a very partial knowledge of each...
In such a culture men would use expressions such as 'neutrino', 'mass', 'specific gravity', 'atomic weight' in systematic and often interrelated ways which would resemble in lesser or greater degrees the ways in which such expressions had been used in earlier times before scientific knowledge had been so largely lost. But many of the beliefs presupposed by the use of these expressions would have been lost and there would appear to be an element of arbitrariness and even of choice in their application...What would appear to be rival and competing premises for which no further
argumentation could be given would abound. Subjectivist theories of science would appear and would be criticised by those who held that the notion of truth embodied in what they took to be science was incompatible with subjectivism."
Suppose humanity were set back to the stone age. In the sense that all working technology is 100% destroyed into ash. We don't even have a proper hammer or knife. What we do have is written documentation about how the modern world works, and an army of the brightest engineers.
How long would it take to re-create modern tech? And with the documentation/academic papers we currently have, could we even do it? What I mean is, what are all the steps we need to take from building a forge out of mud and flint and steel, to producing an i9 intel chip, 5g networks and space ships?
Think of this as an ansible script that bootstraps cavemen to modernity.
One thing I found interesting is that apparently many metals used to be collected from ores just lying around. Those sites are all exhausted today. So a full 'reset' of the earth wouldn't be possible now. Of course metals could be scavenged from the machines we have (in a nuclear-disaster-style scenario), but we could never fully 'redo' the technical evolution of the last tens of thousands of years.
We would lose access to modern tech and ancient tech is lost completely already.
How many people can farm without access to machinery at all?
How many would be able to produce any hammer from scratch? How would you ensure that they are heard and followed?
How many draft animals are present in your area, for example oxen and horses?
Rebuild time just to restore the food supply is too long; we would collapse to a prehistoric stage, with some pockets of places where food stockpiles were not destroyed.
And maybe even worse as animals are not available for hunting!
We would survive as species, but with nearly total population, cultural and technological loss.
There is no conceivable scenario in which we lose access to modern technology. OK, in some far-fetched case we can't use complex electronics. Then what?
Well, no one needs draft animals, because there are still many gasoline and diesel engines around that run without electronics. Since we're not all driving cars around any more, our supplies of fuel will last at least until we can get pumpjacks and small refineries running again.
Believe it or not, the farmers can farm just fine without their $500k tractors, it just means they have to farm a lot smaller area with less output, but their skills will transfer just fine.
This is akin to "preppers" learning to make fire by rubbing sticks together: in what situation do they think they'll run out of matches and cigarette lighters?
I think that no one here was thinking that this scenario is likely?
> all working technology is 100% destroyed into ash. We don't even have a proper hammer or knife. What we do have is written documentation about how the modern world works, and an army of the brightest engineers.
Preparing for such a scenario is a waste of time, but thinking about it may be interesting. The actual complexity and importance of our various infrastructure is interesting.
> This is akin to "preppers" learning to make fire by rubbing sticks together: in what situation do they think they'll run out of matches and cigarette lighters?
This one could actually be directly useful; it's not an example of something useless.
One thing to mention, though, is that all the easy-to-dig carbon fuel has been dug, so "use coal to power X" just isn't quite as feasible. Biomass just doesn't work for a lot of it.
this reminds me of a recent article on the Tartarian empire conspiracy theory I posted
https://news.ycombinator.com/item?id=27013834 - no discussion - actual article https://www.bloomberg.com/news/features/2021-04-27/inside-ar...
In a few generations of nobody knowing how their technology works anymore, will there be conspiracy theories about how that technology came to be and then became impossible to replicate?
It's very easy for people to slip into self-indulgence and pleasant illusions but when something important breaks then nobody has a clue what to do. This happens more and more around me.
I never thought I'd live to see something like this. And I am only 41.
But really, can anyone really explain how computers work nowadays? Look at how optimisations in cache stuff on the CPU led to horrible security flaws, surely that should be obvious if anyone really knew what was happening under the hood?
Ubiquitous software delivery from questionable sources is the problem, that's what invalidated the design assumptions underlying the cache optimisations. You might argue that it is the hardware designer's fault to not see how the software world would change in the next 20 years, but you can similarly argue that it's the software designer's fault to not keep in mind the security limitations of the processor it's running on.
Timing attacks are not all that hard to understand - if the timing (or anything else) a user sees depends on information they shouldn't be able to see, then there's a (theoretical, at least) information leak and that's a security flaw.
I'm not a security expert, and this is something that's always in the back of my mind when considering security. It's ... one of the only things in the back of my mind - needing a constant-time string compare for passwords is the most obvious example (and one that literally any decent programmer should be somewhat aware of). I only know the basics of security, and this is one of the basic principles. It's up there with "escape strings".
People simply don't think about every layer of the stack. If they did, then it would be obvious (but there's so much in the stack that this is infeasible).
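The constant-time compare point is easy to make concrete. A minimal Python sketch contrasting a leaky early-exit comparison with the standard library's constant-time one (function names here are my own):

```python
import hmac

def leaky_equals(a: bytes, b: bytes) -> bool:
    """Early-exit compare: runtime depends on where the first mismatch
    occurs, which can leak how much of a secret prefix a caller guessed."""
    if len(a) != len(b):
        return False
    for x, y in zip(a, b):
        if x != y:
            return False  # returns sooner the earlier the mismatch
    return True

def safe_equals(a: bytes, b: bytes) -> bool:
    """Constant-time compare: hmac.compare_digest examines every byte
    regardless of where the first mismatch is, so timing leaks nothing."""
    return hmac.compare_digest(a, b)

secret = b"correct horse battery staple"
print(safe_equals(secret, b"correct horse battery staple"))  # → True
print(safe_equals(secret, b"incorrect horse battery stap"))  # → False
```

Both functions return the same booleans; only the timing behaviour differs, which is exactly the kind of under-the-hood detail the stack hides from you.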
then I thought... could this happen to knowledge repositories too? Could knowledge degrade under "needs source" tags and other administrative controversies?
What if we sort of devolve to https://simple.wikipedia.org over time?
Also originally a scathing commentary on Thatcherism and British society, similar to Judge Dredd, but that has been toned down a lot over the years, unfortunately.
(I do know there have been a Tau and a Necron Exterminatus, but I did say partially justified)
It's funny. In Warhammer 40k lore, any society that didn't practice witch burning was essentially wiped out by daemonic incursion. Therefore Warhammer 40k human world evolved to a racist, xenophobic, psyker-hating society.
This isn't even just a hobbyist phenomenon. Electronics design itself is becoming more and more "turn-key hardware", where you just slap chips on a board, string them together, then get chugging on the software.
This "industrial literacy" is dominated by Asia today.
The difference between tech and other machines like cellular life is that we built it from scratch in a giant writing civilization, so to absolutely lose the ability to reverse-engineer it, or reproduce it by copying, would require quite a dumbing-down of humans or a catastrophic loss of resources.
Something I saw somewhere is that in engineering you can copy without even looking at the thing. Once you know it's possible, every country seems to somehow get access to it within the next decade. Sometimes it's just a psychological block.
The thing that hasn't come about yet, though, is that sales, marketing, and accounting can also be outsourced. Dare I say automated. It is, at the end of the day, still waged work. Marketing is not an asset class. Accounting isn't capital ownership. Fabrication is a tool, tech is a tool, social media is a tool.
The moment that this sort of stuff gets automated there will be a reckoning on how businesses are built in western society. I don't mean a saleforce competitor with a slicker UI, I mean completely abstracting away payroll and the like.
Imagine for a moment:
You could say to anyone, "I like the work you do, want to work for me?" - in person, online, even here. They say yes. Now suddenly a pipeline process kicks off, like CI/CD for recruitment. You have your budget metrics all in place, the receiver has their negotiation asks, etc. An AI hashes out a negotiation, in browser or on your phones, comes up with a contract, and highlights the important bits or deal breakers that need a higher-level sign-off. "Click 'Yes' to work." Done. Hired. Here's your onboarding package and account credentials; your paycheck comes in at the end of the month.
All the while, the two people at the table or on twitter have just been continuing on with their lives. Nobody talked to HR. Nobody had to manually do filings with the IRS. The business owner never opened a dashboard or logged in to their negotiation.ai account. It is not this easy at the moment. But no rule of physics says it can't be in the future.
Entire classes of work could crumble while people could create in a much more competitive manner.
Our salaries and power won’t last forever, and if we don’t turn around our culture of worshipping techno-aristocrats quickly then we’re all going to have a rough time in the next few decades.
I want my taxes to be filed automatically by some computer, without having to deal with a human who's more prone to errors. I want self-driving cars so that I don't have to deal with another human who's more prone to accidents, etc, etc. The list goes on.
I think it's inevitable that we will see some kind of UBI, otherwise there would be big revolutions when most jobs are automated to some degree.
This is already in progress in New Zealand. The government simplified taxation rules and everything is fairly automated now, the payments and the end-of-year. I got an email this year from our IRD (≈IRS) telling me that my annual tax return was already completed by them (electronically) and I just needed to login to confirm/authorise it.
You only need to claim subsidies and select the NGO that gets your 1% of tax (or skip that).
The tax files itself automatically.
If you are in a weird situation, you can relatively easily modify it.
The tax code is still awful (I'm not sure whether I filed taxes correctly, due to my weird situation), but the frontend is one of the better sites I have seen. And the best government site that I've encountered.
Any functional government can set this up.
Skatteetaten (≈IRS) then sends us a letter in early spring telling us what we owe or what we will get back, we get a chance to add extra expenses etc and then a final calculation is done and you either get an invoice or a payment, most likely a small payment as taxes payments are typically slightly larger than they need to be.
Except if you run a business, then it is slightly more complicated depending on the size of your business, if you have employees etc. I have a small company so I need to report income from the company as well.
Is it though? I don't think it's impossible that we see some kind of cyberpunk dystopia, where the powerful live isolated lives, protected by the police/military, and the rest live in squalor.
What happens if you automate everything and realize you don't need to feed millions of mouths? And, hey, the police-military complex is already automated, so just send the drones and let them remove the excess humans.
Even in that scenario, if all useful work is truly automated, there's either still some sort of UBI (just not enough to live comfortably) or the people living in squalor are subsistence farmers eating whatever they can manage to grow in whatever land they can lay claim to.
Well... yes? The purpose of life isn't just to enhance shareholder value. Some stuff really is just inefficient of course but a lot of what looks like inefficiency in a pure economic sense is actually what people back in the day would have called living a good and satisfying and meaningful life.
Yes, there are things that have taken genuine inefficiencies out, I wouldn't go back to the days e.g. of standing in a queue at the bank rather than online banking. But an efficient world looks like 996 and I don't think that's actually a good way to live https://en.wikipedia.org/wiki/996_working_hour_system
Better get more tech stock when that happens!
And IMO the majority of 10+ YoE engineers are actually 1 YoE x 10, or more usually 2 YoE x 5.
You don't have to be from a top school, but it does help.
It’s a bit like saying that finance pays well in Manhattan. Sure. But most people in the USA who work in finance are making mid/high 5 figures. Wall Street is small.
During New Deal and Great Society, Capital shared that surplus with Labor. Since the mid 1970s, not so much.
Instead of automating, we could simply eliminate huge swathes of work. Squint a bit, and all the bureaucratic burden (bullshit jobs) just looks like very complicated rent-seeking.
People forget that it doesn't need to be 100% automated. If I can automate 25% of the work of a staff-level engineer, that's amazing. If I can do 50% of a junior engineer, superb.
Right now software engineering is still in baby steps as far as tools are concerned, but the next decade will see some remarkable stuff happen once we have a more solid foundation and start applying the previous few decades' progress in machine learning and statistics.
This has been the trend since ENIAC.
And yet, the demand for programming is higher than ever. It could be another century before those lines intersect. Maybe longer.
We keep automating ourselves up into higher levels, only to discover that there is a ton more work to be done up there. Which, imo, is a very good thing.
4GLs happened, but the work didn't really become less. (And 5GL spectacularly face-planted)
In the 90s, we got widespread adoption of the Internet and search engines, and the prognosis was we'd just be able to look up everything.
In the 2010s, did we ever build fancy linear regressions^W^W^W machine learning solutions.
We always talk about how that somehow would reduce the work, but instead we continue to build barely maintainable mountains of complexity at the outer edge of what we can handle.
I wish you success. I'm not holding my breath it'll significantly reduce software engineering work. It'll shift what we do, no doubt. I'd be surprised if this is the decade where we make a significant dent. We'll just heap on more complexity.
> 4GLs happened, but the work didn't really become less. (And 5GL spectacularly face-planted)
Hah. The pipe dream is older than that.
They said that in five years' time there would be no demand for professional programmers. You see, there's this new programming language that's so easy to use that anyone can write the software they need themselves.
That was in 1960.
The language was COBOL.
But take the States. In the 1980s, a 120,000 USD annual salary would be insane. You could buy a nice place in Santa Barbara for that money.
Due to inflation, that same salary buys much, much less today.
Give it another 20 years, 120k per year will buy even less than that.
But now consider that most people in the 80s would make 20-40k per year. That was enough to have a good life on the high end. A decent car, a house in a good area with a mortgage.
Can you do that now? No. Not unless you live in the sticks, and good luck finding a job there!
So to suggest that engineer salaries should be “more reasonable” is basically saying you want people to be poor in twenty years. Because that’s exactly what will happen.
Oh, and your future 40k-per-year job will make it even harder, if not nigh impossible, to make ends meet, because most jobs will not be in cheap areas to live.
Hell, even the cheaper areas to live are more expensive now.
Yep, just consider how much sales, marketing and accounting work the App and Play store has automated. Not to mention software delivery and installation work.
It'd be amusing to re-score the highest-risk jobs with the bullshit-jobs criteria and measure the overlap.
I worry that automating bullshit jobs will embed those tasks deeper into our economy, like technical debt. Better to just eliminate the bullshit jobs.
> flunkies, who serve to make their superiors feel important, e.g., receptionists, administrative assistants, door attendants
Impossible to automate. The whole point of the job is to be wasting a human's time to feed someone's ego.
> goons, who act to harm or deceive others on behalf of their employer, e.g., lobbyists, corporate lawyers, telemarketers, public relations specialists
Automation will make your goons more effective, but the arms race against the other side's goons who also have automation will prevent the jobs from going away.
> duct tapers, who temporarily fix problems that could be fixed permanently, e.g., programmers repairing shoddy code, airline desk staff who calm passengers whose bags do not arrive
If they aren't going to pay the upfront cost to fix the problem, they aren't going to pay the upfront cost to automate it either.
A lot of work isn't impossible to automate, it's just not worth the cost. Lots of people unfortunately earn very low salaries; they're cheaper than a machine. This is the case both in manufacturing and in administration.
I mean, if we're just down to the capital class, there simply will not be enough of them buying from each other to sustain an economy. Those payroll and HR people also happen to buy people's products, which AI never will. If there is no IRS and no tax filers, there goes a lot of purchasing power.
You can't create or compete if no one is there to buy your products.
But why can't AI buy products? What makes a purchase more legitimate by the click of a human finger than by the automated confirmation of a piece of software? The hierarchy of needs? Why can't automated systems that have energy, networking, AWS bills etc. also have needs?
And then there's the simple fact that there is a finite number of mouths to feed. In the billions, but finite. Why can't an industry of trillions of digital actors, which scales far faster than biological ones, become an even bigger economy in its own right?
Our current economy seems to rely on "the modern consumer", as per your point-- and we are told that our main role is to "consume". Seems to me this would be extremely easy to automate. Then we don't need people any more, the robots can consume themselves!
Anyone want a paperclip?
> issues confronting our defense industrial base [...] first has been the steady deindustrialization of the United States over the past five decades, including workforce and manufacturing innovation [...] While total manufacturing output has grown during this period, [...] the workforce on which a defense industrial renaissance would depend has become [...] an endangered species.
> Together, a U.S. business climate that has favored short-term shareholder earnings (versus long-term capital investment), deindustrialization, and an abstract, radical vision of “free trade,” without fair trade enforcement, have severely damaged America’s ability to arm itself today and in the future. Our national responses – off-shoring and out-sourcing – have been inadequate and ultimately self-defeating, especially with respect to the defense industrial base.
> These trends have had particular impact on the core element of a successful manufacturing economy: the machine tool industry. Of the world’s top twenty-one machine-tool makers, only two today are American: Gleason and Haas Automation. By contrast, eight are based in Japan, and six in Germany. [...]
Reader beware. Reports to Congress are highly politicized -- both the Department itself and various bureaucrats within are kissing the ring in hopes of protecting/increasing funding. You can safely expect these reports to mention climate change, green energy, diversity, etc. much more often for the next four years. And guess what? The best office politicians serving life terms in the DoD will make sure those reports direct funds in ways that benefit their priorities.
POTUS in 2020 was protectionist, so DoD tells a story about how they and their private sector allies need money and tell it in a way that aligns with that protectionism. The main thesis about why money is needed matches the biases of the person with the pulpit, and then the report helpfully suggests exactly how to achieve that goal that we all of course agree should be the goal. In 2020, onshoring manufacturing. In 2021, maybe still onshoring but maybe play up the whole climate thing a lot more.
"Caveat emptor" is merely a warning, not necessarily an indictment of the product. Reader beware.
This is pretty much an obvious problem: if you don't have your own arms facilities, then you've got a problem. If the facilities aren't across a land border but across an ocean, then you've got a bigger problem. And if you can't reasonably expect to be able to expand capacity by internal policymaking ("we need to open 5 more factories") then you've got an even bigger problem.
Right. Including in the United States over the past 40+ years. The United States does manufacture its weapons domestically, it does stockpile reserves of key components, and it does require domestic manufacturing for key inputs. The linked report doesn't even contest any of this; it's calling for a "renaissance" in arms manufacturing.
The question is less "is this a real problem in the abstract" and much more "is this a real problem that the US actually has?"
This is really a bit like saying "if a country can't grow its own food then its people will starve". The problem is not with the explicit argument, which is of course common sense. The problem is with the implicit suggestion ("give us more money for farm subsidies or you will starve"... well, wait, does the US have a food security problem? Do we already have funds in place to help mitigate that risk? Are those risk-reduction programs effective? Is more risk reduction necessary? And is the specific spending being requested to nominally reduce risk actually going to do so?).
Which is to say, all of this comes before asking the even more important question: even if we are outsourcing our arms manufacturing (we aren't, but even if we were...), will following the DoD's recommendations make us less reliant on other nations? Or are we just pissing money away into some DoD bureaucrat's useless jobs program?
I found the author's framework incomplete and not useful. For example, he didn't include any counterexamples. E.g. why is Intel with both in-house chip architecture design capability _and_ chip fabrication factories falling behind in innovation to competitors using the outsource model?
- NVIDIA gpu + outsourcer TSMC is ahead of Intel at hardware for machine learning
- Apple M1 chip + outsourcer TSMC beats x86 for laptop performance
- AMD EPYC chips + outsourcer TSMC best Intel for many server workloads (Neoverse is Arm's server design, also fabbed at TSMC)
But that doesn't mean those companies outsource everything. E.g. Apple doesn't outsource the programming of iOS and macOS to outside consultants at Accenture or Thoughtworks. They do that in house. But Apple programmers don't write their own financial back office software. Instead, they use Germany's SAP ERP enterprise system. Likewise, none of SAP employees design and make smartphones for staff to use; they let Apple and Samsung manufacture the phones.
Being strategic about outsourcing is a natural consequence of recognizing that other entities specializing in a competency can do it better/faster/cheaper. How did NASA "innovate" and send astronauts to the moon? They outsourced the work. E.g. the manufacture of space suits was contracted out to the ladies' bra manufacturer Playtex. The Apollo rockets were made by a combination of companies. NASA was the ultimate outsourcer.
There was once a time when IBM paid for new semiconductor manufacturing technology. Applied Materials and TEL and others would actually make the equipment, but IBM would pay for the development at each new node. Then, eventually, they decided they wouldn't pay, they would wait for someone else to do it, because cutting-edge semi's wasn't a core need for them any more.
So, Intel became the new source of funding for research on new production nodes. There were various semi industry consortiums that coordinated all of this, but still someone had to step up with the billions that were required for each new node, and for a while it was Intel that did this.
But then, eventually, the costs of R&D for each new node kept rising faster than the money you got back for it, especially given that TSMC and Samsung and the rest also got to buy the new manufacturing equipment from Applied Materials and TEL and so forth, even though they hadn't sunk in the R&D cost to fund it. So it ended up that the actual new R&D started to be done by the upstream companies that made the equipment which was bought for the fabs. This was already happening by the late '90s when I left the industry; actual semiconductor manufacturing R&D was done at the equipment manufacturers, not at Intel; Intel was just the one with the financing. It worked until it didn't.
TSMC execs have been quoted as saying they "outsourced their sales and marketing" to American companies.
"The pellicle production tools have been installed in Mitsui, which this year will ramp up EUV pellicles based on ASML’s technology. Mitsui is no stranger to pellicles, and already produces optical pellicles. ASML will continue to do R&D for future pellicles."
There is specialization (comparative advantage) outsourcing, with a living connection between the partners. ASML does the R&D and Mitsui scales it up, etc.
And there's stubbornness-driven outsourcing, where companies for some reason don't want to offer competitive compensation, but are willing to spend a lot on service contracts. (Because they can claim that the risks are managed by the service provider.) And they end up with subpar solutions supplied by whatever vendors had the patience to deliver to a completely incompetent client (and the necessary audacity to bill them as much as possible).
iPhones are built in China, but Apple keeps tight control over how they are built. They control and manage the supply chain, and they buy companies making tools used to make iPhones to keep control over this. They operate the cloud service stack around them. They made massive investments into doing more themselves: building a world-class CPU design group to gain independence from what other SoC makers offer them. They are now leveraging that to outsource less of the MacBook design: moving away from outsourcing CPU design and production to Intel, toward designing in-house.
They understand very well what the post warns about: if they stop being involved with these parts of the process, they will a) likely fall behind and b) have a terrible time trying to recover the ability if they need it, so they only outsource selected parts of their work. The breaking points are further down the curve, and they stay the hell away from them.
One could argue that Apple's attempts at making Macs in the US again are an example of how difficult it is to reclaim such ability, even if the company still has the know-how to oversee it. Especially since nearly everybody else in California also has stopped doing this kind of thing - Apple would need to train people a lot. Which Apple at least can afford, if they want to.
I didn't interpret his essay that way. His acknowledgements of some outsourcing being valid doesn't address my criticism.
>and your 3 examples are all companies that (as far as I can tell) are very deliberate about what they outsource and deliberate about keeping control of the things they want to keep doing: NVIDIA is not going to go out to someone else and say "we want a GPU chip", but rather they are designing them end-to-end to make full use of what their production partners can do.
And this is a great example that ties back to the author's point because he criticized Boeing. Boeing designs the planes and tells the outsourced partners what to make. Boeing then does final assembly in Boeing-owned factories in Washington and North Carolina.
So to use your wording, Boeing does not go to somebody else and say "we want a 787 plane". Boeing does more building than NVIDIA.
I think a fair reading of his essay is that he thinks a company that is more vertically integrated, via less (but not zero) outsourcing, will be more innovative. He was lamenting that outsourcing productivity software like MS Office 365 wasn't a good trend, so presumably companies that kept that work in-house would be "more innovative".
> They were even telling the manufacturers look, we only put up requirements, we don’t actually tell you what to do
From other sources:
> Starting with the 787 Dreamliner, launched in 2004, it sought to increase profits by instead providing high-level specifications and then asking suppliers to design more parts themselves. [...]
> Rabin, the former software engineer, recalled one manager saying at an all-hands meeting that Boeing didn’t need senior engineers because its products were mature. “I was shocked that in a room full of a couple hundred mostly senior engineers we were being told that we weren’t needed,”
That's the point where you lose your in-house grip on things, and run into trouble if your contractors are not up to it. Keep that up, and you lose the ability to fix it.
The chip-designing companies are betting that there always will be an external fab that's world-class, and likely better than what they can do themselves. AMD literally couldn't afford to keep up. (and when world-class was inside Intel, they somewhat suffered for it, but didn't really have an alternative)
What definitive conclusion are we supposed to get from that Bloomberg article you cited? That cheap outsourced $9/hr programmers from India caused the MCAS flaws which led to plane crashes? Therefore, if Boeing had used in-house American programmers, it wouldn't have been a problem? But if the specifications for the software were designed by Boeing management, it wouldn't have mattered where the programmers were located that actually coded it. From the investigative reports I read, it wasn't the programmers that decided 1 Angle-of-Attack sensor instead of checking 2 redundant ones -- it was the aerospace engineers and managers above the programmers.
The Bloomberg article uses a narrative technique to bias the reader a certain way (reader nods in agreement "outsourcing caused the problem") -- rather than present unbiased Root Cause Analysis.
As counterpoint, Tesla has programmers inhouse in California to code the self-driving software and yet it had serious flaws which contributed to fatalities. So is the correct conclusion to say that inhouse programming is "bad"? Of course not.
Your cite of that type of Bloomberg article gives me an idea of what you found compelling about this thread's blog post. In my case, I was more focused on the "more outsourcing means less innovation" aspect which wasn't convincing.
Lots of people know the story about killing the golden goose, but few understand the metaphor for the range of actions that actually correspond to slowly strangling the goose, instead of nurturing it.
The issue I've most often seen in working with offshored programmers vs. local engineers is that the former will happily code a spec that makes no sense, while the latter will almost immediately raise concerns if they see something that doesn't make sense.
Maybe it's a matter of education (formal engineering vs some bootcamp) or that an engineer's signature often carries legal weight.
In that case, I wonder if an actual engineer reviewed what management requested for the MCAS, or did it go straight to the $9/hr body shop?
I’m also very sure Boeing does not do more building than NVIDIA. These companies maintain absolute control over design, processes, materials and component sourcing, because it’s critical to their product, whereas Boeing has also outsourced all of that.
The flawed specification was done in-house by Boeing employees. That design wasn't outsourced.
>your position that outsourcing doesn’t matter.
That is not my position. You're misrepresenting my argument even though I've stated it clearly: the author's claim that outsourcing leads to less innovation is incomplete and flawed.
Outsourcing does matter and can ruin a company. But it can also enable innovation and the author doesn't cover that scenario. That's my specific criticism of his article.
>I’m also very sure Boeing does not do more building than NVIDIA. These companies maintain absolute control over design, processes, materials and component sourcing, because it’s critical to their product, whereas Boeing has also outsourced all of that.
You've got that backwards. NVIDIA doesn't focus on materials science of chip fabrication. They let TSMC worry about that. NVIDIA employees focus more on engineering the instruction set, the microarchitecture, the firmware, etc. (Somewhat analogous to ARM chip design.) There have been presentations where the CEOs of both NVIDIA and TSMC are on stage together talking about their different core competencies.
>, whereas Boeing has also outsourced all of that.
Why do you believe that? Take a look at the Boeing open jobs list showing ~478 engineering positions: https://jobs.boeing.com/category/engineering-jobs/185-18469/...
Why does Boeing hire all these engineering positions if they've outsourced everything?
Let's try to level-set the discussion. NVIDIA outsources the chip manufacturing to TSMC and then the whole card is outsourced to Foxconn. NVIDIA did not spend billions to build or acquire any factories. See the "NVIDIA does not manufacture anything directly, instead designs are sent to suppliers specializing in the technologies needed to create these devices, and the products are then tested for quality control." excerpt from
Boeing spent billions in building physical factories in Washington, North Carolina, etc. They do final assembly, stress testing of wing loads, systems integration, flight envelope testing, FAA certification, etc. Boeing outsources a lot of components but they also do a lot of in-house work.
And from all this, we still conclude NVIDIA does more actual building than Boeing?
>Do you think the same kind of mistakes would have been made if the people writing the specifications actually did some of the software work?
There will always be a specialization of skills so expecting non-programmers such as managers to code the software is unrealistic. And yes, Tesla conceivably had a more "closed loop" of in-house software programming instead of outsourcing and they did make the same kind of mistake analogous to MCAS with a self-driving car.
I think you may have glossed over this point of the article:
If you separate the thinking about things from the doing of things, then innovation will suffer.
The article is arguing that the specification might have been flawed because the company doesn't have in-house manufacturing expertise any more -- there was not enough knowledge left to validate the designs.
Similar to Boeing, Airbus also outsources their components. See example list of A350 subcontractors: https://www.aerospace-technology.com/projects/a350wxb#0
Even though Airbus also outsources the flight control systems, they did not make the same mistake of only using 1 Angle-of-Attack sensor in computations.
It doesn't mean Airbus is perfect. Some observers think Airbus' deliberate decision to decouple the pilot and co-pilot joysticks, so they do not show synchronized physical feedback, is a flaw which contributed to the 2009 Air France 447 crash. The co-pilot mistakenly pulled the joystick back the entire time and the senior pilot was unaware of it. Consequently, Gulfstream Aerospace (they also outsource many components, including flight controls, to Honeywell) decided not to copy Airbus' design for the new 500/600 business jets and instead coupled both joysticks together with force feedback, so that both pilots have a physical sensation of what the other pilot is doing.
So instead of thinking "outsourcing leads to bad outcomes", there's an alternative explanation of "good or bad outcomes regardless of outsourcing". E.g. Blaming the "outsourcing" can't explain the good & bad outcomes when you study all the case studies of Airbus, Gulfstream, Tesla, Nvidia, etc.
EDIT reply to: >The scale of outsourcing Boeing has done for the 787 is not comparable to AB (or anyone else). They outsourced core competencies, wing design, materials, software, basically everything,
I still don't understand why we begin the analysis with the amount of outsourcing and work backwards from that to conclude that the 787 is a worse airplane. Instead, why can't we consider that the 787 may be considered superior to the Airbus A330 by pilots and airlines? The more heavily outsourced 787 can also be superior to the 737 MAX's in-house wing design.
>, and it pretty much aligns with the point the author is making: you can’t innovate / design if you don’t know how your products are made.
Similar to Boeing 787 carbon composite wings being outsourced in Japan potentially being superior to Airbus in-house designs...
AMD outsources more than Intel (because AMD outsources to TSMC) and AMD's EPYC outperforms Intel's Xeon. AMD out-innovated Intel. Intel is so far behind that they've made public statements about possibly outsourcing their chip fabrication in the future. But the AMD innovation story does not align with the author's point. See the flaw in his analysis?
But you seem way more interested in proving right than having a productive discussion here. I suggest writing a blog if you’re interested in a monologue rather than wasting people’s time with inflammatory retorts.
I didn't write any inflammatory comments. Discussing the counterexamples that don't match the author's thesis is not being inflammatory.
>The scale of outsourcing Boeing has done for the 787 is not comparable to AB (or anyone else). They outsourced core competencies, wing design,
Fyi... I did some more reading and it turns out Boeing engineering did design the 787 wing even though Mitsubishi of Japan manufactured them.
An excerpt from https://www.seattletimes.com/business/boeing-shares-work-but...:
Jenks, who leads the wing team, said the crucial, conceptual stage of the 787’s wing design was “100 percent Boeing.”
To define the shape of the wing and the system of movable flight-control surfaces, Boeing aerodynamicists conducted detailed analysis of performance requirements, historical flight-test information and new wind-tunnel data.
Only after that defining phase of the 787 design did Boeing bring Mitsubishi engineers to Seattle to figure out the broad parameters of the internal structure of the wing.
“We gave them the shape,” Jenks said.
“That is the family jewels,” Noble said. “That part I could never see Boeing sharing in any way, shape or form. That is what our brilliant engineers are able to figure out.”
Boeing engineering legend Joe Sutter, lead designer of the iconic 747 jumbo jet in the late 1960s, agreed that this first design phase is the key.
“That’s the stuff that Boeing still pretty much keeps under its own belt,” said Sutter, who at 86 still talks at aviation gatherings about jet design.
> Boeing delegated to Mitsubishi of Japan a big slice of the design work.
How can that “100%” be true… these are all just empty words.
> If 10 or 15 years from now the world’s leading authority on this kind of structure is in Japan, then you can’t reallocate your resources to do that work,” Sorscher said. “You are dependent on them.”
This is part of the point the original article makes. Nothing here disproves it. The fact that Boeing succeeded (despite several production issues) doesn’t mean they won’t fall victim to the innovation issues that will result from offshoring engineering knowledge.
Seems like a reasonable interpretation is that the shape of the wing was 100% Boeing's aerodynamic engineers. So the simulations and computational fluid dynamics to design the external flight characteristics were Boeing's, but the internal spar structure and key reinforcements for the carbon fiber structure were Mitsubishi's.
>this kind of structure is in Japan, then you can’t reallocate your resources to do that work,” Sorscher said. “You are dependent on them.”
>This is part of the point the original article makes. Nothing here disproves it.
But Boeing recently built a new factory in Seattle to build new carbon fiber wings in-house: https://www.seattletimes.com/business/boeing-aerospace/at-bo...
>Boeing [...] fall victim to the innovation issues that will result from offshoring engineering knowledge.
But Boeing didn't have in-house knowledge to build a carbon fiber wing so there was no expertise to offshore. To get around that limitation, Boeing seemed to execute a very shrewd business playbook:
(1) 2003 : currently have knowledge on building metal wings in Seattle but no expertise on manufacturing new carbon fiber wings
(2) 2004 : outsource carbon fiber wing manufacturing to Mitsubishi in Japan. This also attracts support from the Japanese government and Japan Airlines as first key customers of the new plane. Mitsubishi also helps pay billions for development of the new carbon fiber wing.
(3) 2016 : Boeing builds its own carbon fiber plant in Seattle to switch from outsourcing to inhouse production of the carbon composite wings.
(4) Boeing is no longer dependent on an outsourced supplier for a carbon wing
This thread's article doesn't cover the above scenario either. So lessons learned are: (1) Outsource a key component you're not familiar with. (2) If it later proves to have strategic value, bring it in house.
Boeing used outsourcing to become more innovative -- which is the opposite of the situation the thread's article was complaining about. He writes paragraphs lamenting companies outsourcing MS Office 365 but doesn't really dig into business case studies that don't match his thesis.
I am aching to drop Apple. I will buy their used products, but try to avoid their pricey new stuff. I don't need the latest, lightest computer. My budget is not what it was ten years ago either.
I am waiting for a viable alternative, and it's pretty bleak.
I honestly fear what's going to happen to this once great country. I fear workers are becoming as disenfranchised as myself.
The counterexamples I was looking for were companies that didn't fit his thesis instead of a small part like a fuse being outsourced.
The author Bert Hubert keeps emphasizing "making" in addition to the thinking. So a design (thinking) company like NVIDIA doesn't seem to follow his ideal of how an "innovative" company is structured. And another counterexample: Apple in the 1970s used to assemble computers in-house and box them for shipping. That was all outsourced decades ago to China, and yet Apple got more innovative with the 2007 iPhone.
This is quite an exaggeration, if not actually an outright lie. Ericsson's main hub for radio software development is in Kista, and there are some 3000 developers in Croatia as well. Some of Ericsson's radios do have their software developed exclusively in China (to my knowledge at least), and there are also a decent amount of developers in Ottawa, but to claim that all of Ericsson's software is "built in countries far away" is highly misleading imo. The 13,000+ Ericsson employees here in Sweden aren't just sitting around doing nothing.
Or alternately, poor editing? Without the "So", the "software" sentence binds to the previous paragraph, becoming about apps and enterprise software. And the Ericsson sentence binds to the hardware sentence, becoming about telecom hardware manufacturing. Both unremarkable.
5G: The outsourced elephant in the room - https://berthub.eu/articles/posts/5g-elephant-in-the-room/
And also HN Discussion - https://news.ycombinator.com/item?id=26843068
The parts about vendors having a great deal of insight into the networks is certainly true from my experience.
For example: I -- an outsourced software developer working for Ericsson here in Europe -- have insight into what software/hardware is deployed in Verizon and AT&T's networks in North America, as well as which of their nodes/radio units are having issues due to software/hardware problems.
Now Ericsson doesn't run and operate Verizon and AT&T's infrastructure, but Ericsson does have a great deal of insight into it via proactive log collection and similar initiatives.
If one of Verizon's radio units in Texas is having a lot of problems, then a software developer in Croatia might end up analyzing some log/crash dumps from it to see what's wrong with it, and then tell someone at Ericsson in Canada to tell Verizon that the unit should probably be replaced.
I like the toaster example. It reminds me of this: http://www.solipsys.co.uk/new/TheParableOfTheToaster.html
He mentions the Dreamliner. That project is kind of a poster child for how not to do stuff, but I suspect that many of the problems came about as a result of cultural hysteresis. The engineers and managers were good, but inexperienced in development of such a loosely-coupled project.
I agree with the premise of the talk. In the US, we are facing the same issue with manufacturing. It’s actually impossible to do some types of manufacturing in the US. We’ve crossed the Rubicon. Alea jacta est.
Although there is the WarCry album with the “j”:
"It was not until the Middle Ages that the letter ⟨W⟩ (originally a ligature of two ⟨V⟩s) was added to the Latin alphabet, to represent sounds from the Germanic languages which did not exist in medieval Latin, and only after the Renaissance did the convention of treating ⟨I⟩ and ⟨U⟩ as vowels, and ⟨J⟩ and ⟨V⟩ as consonants, become established. Prior to that, the former had been merely allographs of the latter."
So in Julius Caesar's time, I and J were the same letter, which you could write either way.
Why not "Iulius Caesar", for consistency? :-)
In any case, English may not be the right language to bicker about spelling and pronunciation in. It sports one of the most idiosyncratic orthographic systems in existence.
English also has a long history of butchering names, words and sounds, including its own (e.g. the Great Vowel Shift ). Iulius vs Julius is the least of its worries.
There's a Planet Money episode where they talk about an airline that made more money selling oil futures than flying planes. Maybe any sufficiently outsourced company becomes indistinguishable from a finance company.
1. Fools enough people to think the government has their shit together and is protecting people.
3. Make work jobs program.
The whole shoe thing is because of Richard Reid, the shoe bomber. There was a later foiled "underwear bomber," I'm glad the government didn't have the same reaction to him as they did Richard Reid. Actually now that I think about it, they have full body scanners. I haven't been on a plane since probably 2006.
It’s an excellent read, albeit repetitive at times.
In that sense, finance is but one of the problems; it might have a central position in the current run of crises in technical terms, but only insofar as it is the most powerful tool the capitalist has to extend their reach. Even if the financial system were sanitized overnight, it would not imply the end of crises for capitalism.
For capitalism to work, we need to put all of our regulatory energies into creating fair and transparent markets incentivized toward productive activity and against rent-seeking (these are among the most important of those first-order instability guards for capitalism).
For instance, look at many of the markets of the '70s, just before trickle-down ushered in decades of greed-driven regulatory disembowelment. Markets work best when there are 7-9+, if not dozens of, mostly mid-sized competitors. Outsized profit conditions are meant to be fleeting, as a temporary reward to encourage constant exploration, risk-taking, and creation (and creative destruction).
In a competitive market, when you outsource, you get immediate costs savings. If your competitor outsourced more things than you have, you'll be at a financial disadvantage for some amount of time before the "innovation debt" catches up. That can be decades - the quality of the outsourced parts can remain equivalent or superior for quite a while (or even perpetually, in case of fuses).
A similar thing happens with companies that do actually want to innovate. All of them are spending all available resources competing with each other, such that the R&D for big tech projects simply cannot happen without external intervention or external funding. Historically, none of the well-staffed and well-funded research labs have been funded by companies whose products are a commodity.
That's a good point. In addition, there's the (mostly) inevitable tendency for economies of scale to push for outsourcing. Several companies I've worked for shut down their board shops while I was there; there's just no way to practically keep up with that. Fabs got bigger. Specialty sheet metal shops can pound out the work faster than you can.
One related thing I've noticed is that older companies (dunno about places that make exclusively software) are never well equipped to deal with perpetually cheaper products with smaller margins.
As a side note, I suspect that the real magic in making toasters, if all done in-house and using simple inputs, is to design the manufacturing facility. The toaster itself is relatively simple. The Rouge must really have been something.
Is a company intended to become better at toaster components? To push the component complexity down to a more basic level and then have more bespoke manufacturing done? Or to become a manufacturer/final assembler, and all that that really implies?
The real answer I guess is to command that your outsourced engineers design in some IoT toaster sorcery that allows you to sell toaster information to third parties.
Isn't that just a different phrasing of what the article means by "in the longer term, [shareholders] are not strategically interested in the company, because if the company doesn’t do well, they will invest in another company. And so a lot of this stuff is actually driven by shareholders and consultancies"? Or are you arguing that it is competition that drives outsourcing, not public ownership?
the R&D for big tech projects simply cannot happen without external intervention or external funding
I think you may have a point there: the KPN Neherlab mentioned in the article never was KPN's. It was part of PTT, the Dutch national post & telecommunications service. PTT was rebranded KPN when it was privatized, and KPN subsequently closed the Neherlab.
This is why it's being outsourced. This is why they won't pay engineers what they deserve. All ready to be swept away by any real tech or megacorp.
Regulations around the spectrum and the concept of auctions make it impossible for any new player to enter the market. Spectrum and radio equipment should be owned by the government and communication providers can just buy bandwidth/airtime on it at a fair price. That works successfully for things like the power grid, water/gas distribution infrastructure, etc.
Why don't people leave? Because they want their buy-out. They want the package that comes from grinding all of those years away at the zombie firm.
Is the suggestion that European business has been hollowed out to being nothing except a sales channel for imports really true? As far as I can see, manufacturing has been steadily increasing in the EU for basically at least 30 years, with the exception of two one-off events (the 2008 financial crisis, Covid). Likewise for exports.
And when it comes to telcos, why in the world would we want them writing their own tech? Very few of them have sufficient scale to make building their own basic infrastructure sensible. All they could plausibly be writing is value added services that nobody actually wants, rather than being dumb pipes.
(I do think telcos shouldn't outsource their network operations as a whole. Outsourcing individual commodity functions like DNS seems kind of reasonable though.)
Note that EU-heavyweight Germany has an atypical-for-Europe emphasis on manufacturing. Perhaps making those two thoughts compatible.
Also, people speak of US total manufacturing output increasing, while at the same time US DoD speaks of five decades of US deindustrialization being severely damaging.
Second, Ericsson is not a telco, they're a supplier. Exactly the kind of entity that can achieve economies of scale on building, say, a mobile gateway router or a billing system, by building one that can be used by a hundred telcos, not just one.
Vertical integration did make more sense for the AT&T of old. The market didn't yet exist, or wasn't standardized enough, for them to be able to depend on that.
Also, Europe may start to realize that it has to develop some tech, such as Cloud services, itself. As @jbverschoor mentions, the European attitude needs to change to consider tech as a core business, instead of a cost center, and pay engineers accordingly.
Also, why exactly does Europe have to do this? Is Cloud the new gunpowder? Should the EU Commission just run a big OpenStack cluster?
(So these big alliances are completely useless. Just like OpenStack itself. Instead of this 10B EUR for some pet projects of random big and old industrial companies, the Commission should work on making it easier to start a company, hire/employ folks anywhere, sell stuff, make contracts, develop standards, report problems [ie. corruption], and it should create X-prize like grants/competitions, it should give incentives to build and maintain direct to end-user technology. Anyway, this is just like the F35 in the USA, members states want to spend the money at home.)
The EU thinks in 7 year budget cycles.
I too recommend more spending on tech and science.
That said the direct EU budget is not particularly huge (just 1% of the GNI of all of the member states), but members should spend on tech more anyway. And the NGEU (next gen EU) COVID relief fund is "just" 750B EUR. (Though again, member states themselves will also spend a lot of money on boosting the recovery.)
Naturally there are obvious strategically important areas where the EU as whole should do direct investment into R&D. (Food, energy security and health for example.)
There's about 20B EUR / year earmarked for "single market, innovation and digital" ( https://en.wikipedia.org/wiki/Multiannual_Financial_Framewor... )
China does understand that the best way to build up a particular industrial capability is to do it. "Just do it". Especially using tech transfer agreements. Hence their amazing railway network. (But their 5-year plan is not particularly interesting.)
I'd argue that the EU should just do more (a lot more!) of what it usually does anyway. (Which is participating in various projects, from ITER, ESA (+ Galileo), to funding a lot of initiatives like NGI.) Also it'd be great to capitalize on GDPR and other differentiating directives. (So partially funding projects that offer GDPR-compliant alternatives.) ... and healthcare, the EU populace is aging, fast .. if that's not great "market" opportunity I don't know what is. ¯\_(ツ)_/¯
But, still, the core issue is that there's not enough direct capital sloshing around in the EU. If the Commission partially funded targeted VC funds, it could make very big waves in various tech sectors.
One of them was Le Coq Sportif.
It was challenging because they realized that manufacturing sport shoes is actually difficult and requires expertise that was completely lost in France, because it had been outsourced for 30 years. So they had to re-learn everything.
What a difference those three letters make. But do read the article, it's a worthwhile read.
> How technology loses out in companies, countries & continents
> How Tech Loses Out
None of these titles are understandable to me. How can it be so hard to give something a simple, intelligible, and coherent title? Or maybe I'm the idiot.
If you have a group of 500 people contributing to the making of a toaster, whether they all call themselves Company A (with many divisions) or are split into many companies, it's kind of the same when the rubber meets the road - it's 500 people working together in some fashion to produce a toaster.
How you draw the lines around them and what company/division names you write on top is sort of secondary.
- Information flows become adversarial. I don't want to tell my customer they're wasting money on the heating element, because what they waste ends up in my pocket.
- Incentives are different. Customer wants the best heating element for their use case, but I want the one that I can sell to as many customers as possible, so I'll nudge them towards this one that seems to fit that description.
- Marketing arm sees an opening and wants to innovate. Sadly they don't speak the same language as the techs, metaphorically and literally.
You could say this is the same problem that the whole world has. Why aren't we all a global village, able to make decisions like saving the planet? Incentives are cut up and fall different ways because we don't see each other as one, and can't communicate that well.
And they might happily proceed with building things they know won't be useful, or work from a broken spec.
All this sounds like major big bad things to me.
The EU should fund and maintain certain capabilities, like building, maintaining, and innovating communications networks, from fabs to finance.
Telecommunication is a pretty mature field. Phones aren't changing very much. I think it's obvious that a company in an immature field will innovate a lot; there's a lot of low hanging fruit. In a mature field it's a waste of $$$.
Everything the author said could still be true, I just think the cause is a lot more intentional and non-mysterious. Of course the company that sells tech that has been commoditized will be non-innovative. Why wouldn't they be?
In 2021, in the age of Amazon, Uber and cloud computing, where you can pay to get pretty much any product/service delivered on the same day, I couldn't get uninterrupted mobile internet from any of the carriers here no matter how much I was willing to pay.
I had to work around some stupid billing systems and artificial constraints, and sit through minutes of pre-recorded marketing bullshit just to be able to pay them money. They seemed to be more interested in marketing irrelevant crap to me than actually taking my money and giving me the one thing I wanted and was happy to pay big bucks for: internet access.
In the end I couldn't find anything better than topping up 30 bucks at a time and resetting the "bundle" every couple days when the data runs out. There was no way to just tell their system "here's 300 bucks, give me data until this runs out".
Telecommunications is fucked.
We're mostly SW folks here, I think, so we generally understand that interfacing components requires a well written contract and appropriate testing at the boundaries. Why is it not the same in manufacturing? If I'm outsourcing the manufacturing of my toaster to various parties, I still have a hand in specifying the pieces, assuring they meet my requirements, and verifying they are put together in a way that upholds my brand value and meets customer expectations. That very much requires engineering talent. It also benefits from comparative advantage and provides my company with more profit which could, theoretically(although seemingly rarely in practice) allow me to provide a superior product at a better price point.
Any decline in quality would be the result of (poor?) business decisions and not inherently a result of outsourcing.
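The "contract and testing at the boundaries" idea can be put in concrete software terms. Everything below (the HeatingElement interface, the wattage limits) is my own illustration, not something from this thread: the outsourcer publishes an interface plus the tests any supplier's part must pass.

```python
from abc import ABC, abstractmethod

class HeatingElement(ABC):
    """The interface the outsourcer specifies for any supplier."""
    @abstractmethod
    def watts_at(self, setting: int) -> float:
        """Power draw for toaster settings 1..5."""

def passes_contract(element: HeatingElement) -> bool:
    # Boundary tests: power is defined for every setting, stays
    # within a safe limit, and increases monotonically with setting.
    powers = [element.watts_at(s) for s in range(1, 6)]
    return all(0 < p <= 1200 for p in powers) and powers == sorted(powers)

# A (hypothetical) supplier's part is acceptable iff it passes.
class SupplierElement(HeatingElement):
    def watts_at(self, setting: int) -> float:
        return 200.0 * setting
```

The point is that the spec and its acceptance tests stay in-house even when the manufacturing doesn't; whether the in-house skill to write them survives is the question the thread raises.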
Except eventually you can't, other than in very vague, handwavy sense. Engineering specification is a skill, and it dies when manufacturing base moves overseas.
You get maybe a decade of transitional time when you both have an oversupply of now unemployed engineer force AND cost savings from doing the actual work elsewhere, but then people change careers and new college students know better.
It also means that we understand the overhead in doing so, compared to doing things inside of a team.
As long as your company "owns" the customers, there is nothing you cannot outsource.
The idea that people could dig deep and get our knowledge (and the motivation to do so) just went away.
We see this in cars - the check engine light is the most disgusting thing ever invented. You have so many displays, you could just show the real error directly and let me know what it is. But no - we have to go to the techno-shamans to do their job.
We move from not knowing things to the unknowable - the idea that the knowledge is unobtainable. And that is bad.
- At any point you can lift it up to see how done it is, and drop it back down.
- If you're toasting frozen bread, you push the lever down and press the "Frozen" button.
- If it pops up too soon for you, you push the lever down and press the "A little bit more" button.
- If you're toasting fruit loaf, you press that button.
- The outside stays touchably cool.
A combination of the bread-temperature-sensing (assuming that's not what my one does) and the more modern features might be even better.
Is writing an authoritative and recursive nameserver with flex and bison and licensing the software to some telecoms in Europe the same as "launching a telecommunications company"?
Would he say that the companies some European telecoms are outsourcing to have "launched telecommunications companies"?
Why not call it what it is. A software company. That now provides SaaS.
A company that is taking on outsourcing work from telecoms that are internet providers. It is itself an illustration of the problem he purports to identify in the presentation.
This is the root of the article. The bell companies got so bad at technology (in part due to outsourcing to every consultant on the planet) that they can’t hire smart engineers.
There are lessons to broader companies too. How many large non-tech corporates learned this the hard way? And in an ironic twist, it appears that IBM (the outsourcer) can’t hire engineers on their own either, so they have to buy the RedHats and Turbonomics of the world.
I help companies with cloud transformations and often I come across situations where businesses are in trouble because they completely lost their operational knowledge. They might know how their applications work internally, but they have no clue how the machinery around them keeps them ticking.
Once I was at a company in Northern Europe that desperately needed to modernize their stack running on legacy mainframes in a colocation datacenter.
Their core business was based on winning the five years government tender for a very crucial public service. They won for years and years, but now they were scared shitless because the new tender explicitly mentioned that the system was supposed to be hosted "in the cloud" and developed using a "microservices architecture".
The main issue was two-fold. On one hand, for decades they offloaded the deployment of their core systems to the technicians of the datacenter itself with which they had an amicable and personal relationship. On the other, said datacenter was being bought by a US megacorp who didn't give a crap about the existing personal relationship with this very specific, crucial use case and told the company to simply pay them a crazy amount of money to perform the migration.
They were also so blind to their lack of understanding of their own deployment process that they called us only two months away from the deadline. So, yeah... I advised my employer to stay away from this death march.
And for the Singularity, it better have a good DevOps neural net.
What incentive does a small, geographically limited service provider have to spend so much on R&D? So that they can try to make awkward licensing deals for their awesome tech if they strike gold and get to be the first to develop 5G? This makes very little sense.
Telecom equipment vendors are an actual thing. They're just not that well known to the public. For the optical backbone, you have the likes of Ciena, Juniper, Infinera, ADVA, etc. (More well-known Nokia and Cisco are also heavy players there.) These companies are all system integrators themselves, and have their own degree of vertical integration.
There are component vendors which specialize even more, although they are increasingly getting gobbled up through consolidation in the sector. This is partly because people figured that it makes little sense to develop the same transceivers or modems at 10 different companies when you can instead merge them and combine the R&D budgets, patents, and manufacturing know-how. This is the same reason we don't have 20 CPU manufacturers. This is advanced stuff with high R&D costs.
His argument seems to be that it'd be nice to just have more people work in this field at regional telecom companies, because then some areas like parts of Europe wouldn't be deprived of a tech sector. The reality is that we don't need a very large workforce to develop the most high-end stuff, and tech, especially the high-end non-consumer grade variety, really is a winner-takes-all market when it comes down to it.
There is nothing preventing a big telecom service provider from buying any of these system vendors. That would also be very expensive and subject to heavy regulatory scrutiny to ensure proper competition in the market (buying a system vendor and then blocking sales to, or price gouging, your competitors being a big no-no). But there again, people figured out in the 80s and 90s that conglomerates are actually less, not more, efficient. Everyone is better off specializing in their own thing. The tech-inclined people end up joining those companies instead, and that's really for the best.
Is he really arguing that innovation and tech don't exist in the field? Where does he think it comes from? Some unnamed office in Shenzhen? Besides Huawei or ZTE, not really.
But - according to Bert - it'd be great if the manpower at least would be here, in the EU, not scattered around the world. At least as long as we have these things like countries and geopolitics.
This is a scary thought, but one that is regulated by capitalism and economics. Sometimes it's cheaper to hire smart people when needed, rather than keep people around just for their knowledge...
Modern C++ for C Programmers
Introduction - https://berthub.eu/articles/posts/cpp-intro/
Part 1 - https://berthub.eu/articles/posts/c++-1/
Part 2 - https://berthub.eu/articles/posts/cpp-2/
Part 3 - https://berthub.eu/articles/posts/cpp-3/
Part 4 - https://berthub.eu/articles/posts/cpp-4/
Part 5 - https://berthub.eu/articles/posts/cpp-5/
Part 6 - https://berthub.eu/articles/posts/cpp-6/
"This is a transcript of my presentation over at the European Microwave Week 2020, actually held in 2021."
What's a European Microwave Week? Well, it's a conference put on by the European Microwave Association.
"The European Microwave Association (EuMA) is an international non-profit association with a scientific, educational and technical purpose. The aim of the Association is to develop in an interdisciplinary way, education, training and research activities."
"The European Microwave Association (EuMA) is an international non-profit association with a scientific, educational and technical purpose. The aim of the Association is to develop in an interdisciplinary way, education, training and research activities, including:
"Promoting European microwaves
"Networking and uniting microwave scientists and engineers in Europe
"Providing a single voice for European microwave scientists and engineers in Europe
"Promoting public awareness and appreciation of microwaves
"Attaining full recognition of microwaves by the European Union..."
So, uh, how far down this rabbit-hole do I have to go to find a meaningful term...
"EuCoM 2020 Events:
"GPR and Electromagnetics for Sensing Soil, Objects and Structures: Forward Modelling, Inversion Problems and Practical Aspects" - Lecce, Italy, January 29 - February 01, 2020 - Org.:R. Persico et al."
I wrote the comments above before I read the article. Now that I have read it, I came to an epiphany:
*It's exactly what he is talking about!*
EuMA doesn't do microwave things. It's an organization about microwave stuff, but what they do has nothing to do with microwaves. They schedule things, they write contracts for venues and catering, and they send press releases of various kinds.
Wouldn't it have been slightly refreshing if EuMA's web site was written by someone who actually knew something about microwaves? Someone who could spice things up with meaningful examples? Even a little?
Anyway, there are some issues with the article itself.
"And we fight for all technology, even the stuff that is not core because we are attached to it, we love what we do."
What is core, and what is not? And after you've eliminated everything that is clearly not core, what is clearly not core among the remaining things you have left? If you've outsourced the springs, knobs, cords, and cases then those start looking an awful lot like something else you should get from outside. Especially since your manufacturing facility is now just running one shift a day. Or a week.
At the end of the article, he mentions, "JPL at Caltech in the US", which is an interesting (and appropriate) phrase. If you follow the Mars rovers or any of NASA's other unmanned exploration missions, you'll see JPL mentioned a lot. NASA is very proud of JPL. Which is a little strange since JPL and NASA are only loosely related. "JPL is a research and development lab federally funded by NASA and managed by Caltech", as their web site says. The launch vehicle, by the way, was a commercial United Launch Alliance Delta II. (Not that I'm bitter in any way.)
Instead of focusing on abstractions of technology, why not focus on the fundamentals?
With fundamentals, I mean mathematics and physics.
This is perhaps all you need to understand tech in its fundamental nature. Everything is built out of mathematical and physical principles it seems.
The problem then may be to determine how much you need of it for your field. Do you need quantum mechanics to know how bicycles work? Maybe classical physics (mechanics, electricity, optics) will suffice here and the rest is a combination of physical principles aka engineering or creativity if you will. (Creativity is a fundamental human skill it seems.)
There's also abstract engineering as I would put it: computer science.
What are the principles of CS? I am not so sure, but I have a rough idea that perhaps it is also field dependent.
For computer graphics seemingly you don't need a lot of math. It's roughly at high school math level.
You can do projections and other transformations without linear algebra (vector and matrix math). All you need to do here is derive relations/equations from geometric rules (as you can for perspective projection, for example).
Then you have two fundamental ways of rendering things to the screen: rasterization (Pineda's method) and ray tracing.
- Rasterization: after projection, you need to bring the vertices projected onto your virtual screen to an actual "screen", or should I say a PPM image. Then you need a filling algorithm like Pineda's.
- Ray tracing: here, you don't need any projections. The projections are basically "a given". As you shoot rays from your virtual camera through your virtual screen, all you need to do is calculate intersections with geometrical objects and color them accordingly.
So for all this basic rasterization and ray tracing you at least need: high school-level math (perspective projection, intersections, transformations, ...), basic rendering algorithms (Pineda's method, ray tracing, Phong shading, ...).
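As a minimal sketch of the "high-school math" claim above (the function names and the pinhole-camera setup are my own illustration): perspective projection is just similar triangles (divide by depth), and Pineda's method reduces point-in-triangle testing to three signed-area checks.

```python
# Perspective projection by similar triangles, no matrices:
# a camera-space point (x, y, z) lands on a screen at focal
# distance d at (d*x/z, d*y/z).
def project(x, y, z, d=1.0):
    return (d * x / z, d * y / z)

# Pineda's edge function: a signed-area test saying which side of
# the edge (a -> b) the point p lies on.
def edge(a, b, p):
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

# A pixel is inside the triangle when all three edge functions
# agree in sign (either winding order).
def inside_triangle(p, v0, v1, v2):
    e0, e1, e2 = edge(v0, v1, p), edge(v1, v2, p), edge(v2, v0, p)
    return (e0 >= 0 and e1 >= 0 and e2 >= 0) or \
           (e0 <= 0 and e1 <= 0 and e2 <= 0)
```

Looping `inside_triangle` over every pixel of a PPM image is already a (slow but working) rasterizer, and nothing in it goes beyond high-school algebra.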
I suppose the other field's fundamentals or basic principles can be "filtered out" from these many layers of abstractions.
Matrix math or linear algebra is not needed for CG. (Although it is really comfortable to know it... It's like doing CG by Assembly instead of C or C++.)
I also suppose that back propagation - the backbone of neural networks - can be also done with basic high school math.
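A minimal sketch of that claim (the setup, target value, and names are my own illustration): for a single sigmoid neuron, backpropagation collapses to the chain rule, i.e. "multiply the slopes of each stage".

```python
import math

# One sigmoid "neuron" with a single weight: y = sigmoid(w * x).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_step(w, x, target, lr=0.5):
    y = sigmoid(w * x)
    # Chain rule on the squared error E = (y - target)^2:
    # dE/dw = dE/dy * dy/dz * dz/dw = 2(y - target) * y(1 - y) * x
    grad = 2.0 * (y - target) * y * (1.0 - y) * x
    return w - lr * grad

# Nudge w until sigmoid(w * 1.0) approaches the target 0.9.
w = 0.0
for _ in range(2000):
    w = train_step(w, x=1.0, target=0.9)
```

The full algorithm is the same slope-multiplication repeated layer by layer; whether the chain rule still counts as "basic high school math" depends on the curriculum, but there is nothing deeper in it.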
So my suggestion is: decide what are the basic principles of a given field and filter them out.
Also, my question to all of you: learning Assembly is pretty much an educational endeavor nowadays, as CPUs these days are doing lots of magic behind the scenes. So what are the fundamental principles here? Physics surely, but I see a problem with my argument above. Getting down to the common denominator (math and physics) is relatively easy, but it is hard to build a CPU out of them, albeit theoretically possible.
Charles Petzold's "Code" seems to suggest that those principles are actually not that far from the basics.
Dunno, what do you think of my comment here?
* parser combinators
* virtual machines