These spreadsheets are often put together by an intern or associate who "knows a little VBA" and automates or tracks a task that the business was doing manually. Nobody on the desk really knows how the spreadsheet works.
Changes to the spreadsheets are not controlled in any way and are not tracked in source control, but they are often (thankfully) backed up by virtue of being saved on a network drive.
The business prefers spreadsheets over "real software" because they have complete control of them, whereas their internal IT department is slow and expensive, and the resulting custom-coded products are often substandard (see zubairq's comment about paying market rate).
But the business people had a strategy to defeat this every time. They would simply make sure that "copy/paste" was on the requirements list every 2-year cycle. When the new software came down, they would satisfy the company-wide decree as follows: (1) open the new software; (2) copy the data; (3) paste it into Excel. And then they would continue as before. The only thing that changed every two years was which buggy, slow, and limited in-house application they would be copying data out of and pasting it back into.
There's this dilemma that the people buying the software and mandating its use aren't the ones who end up having to use it. So, often ridiculously complex workarounds happen in order to comply with company policy yet still be able to get any work done at all.
This quite likely is one of the largest productivity sinks in industrialised societies.
There are ways to amend these issues but they require a change both in company policies and in thinking: Allow departments and teams to choose their tools. Have a repository of internally developed tools (Excel spreadsheets, bespoke CRUD applications, shell scripts etc.) for others to share and build upon. Put an emphasis on software quality and user experience.
I would absolutely echo all your points, especially the rationale behind "real software": it would have been far more expensive, have taken far longer to be delivered, and that's if it were even signed off in the first place. When management sees "real software" as wholly unnecessary and all you have is Excel, you soon learn to be rather creative.
Once I was contacted by a friend who works in the QA department of a factory (belonging to a multinational company) to implement a system to keep track of the building process (what was soldered, who did it, what tolerances it must have, things like that). The factory's process was entirely modeled in Excel spreadsheets, with the caveat that they took almost 6 months to gather and deliver the data to the client after installing the product on the client site.
As it was a freelancing job, I quoted them a very high hourly rate, such that they could forget about it, as it was too much work for one person alone. But in hindsight, I also refused because it would almost certainly mean that a lot of the workers in that department would be made redundant (there were 30 workers there; I would say that at least half would be unnecessary after the system was up and running). If only management understood the huge savings that would be possible... But in this case, I'm glad they don't!
I blame the "MBAs can do anything" philosophy, but it's probably also a consequence of worker mobility.
To state it most simply: if your management (at any level) cannot call bullshit on the data their direct reports provide them (whatever level of abstraction that is), then your business is at existential risk.
I almost had a heart attack thinking about it while watching one of my bosses try to consolidate 13 different versions of a spreadsheet, none of which made sense. And then I realized he's one of the above par ones.
The vast majority of calculations are performed in Excel. Most engineers know at least a bit of programming from one mandatory class in first year, but for some it's Fortran, for others it's MATLAB, VBA, etc. Excel spreadsheets are the common denominator.
Thankfully, most sheets are rather simple, with one calculation per line, with inputs coming from previous intermediate calculations on the above lines.
Unfortunately, sometimes, the logic needs to be more complex, and that's where things go south. Despite all the best intentions, spreadsheets are very hard to test, audit and version-track, and errors will inevitably sneak in.
Other things get harder too, when all information is in spreadsheets. Recently I was tasked with updating and reporting on about 200 calculations that were contained in 200 almost-identical sheets. A single factor needed to be changed on those 200 sheets, and then I needed to compile the new results in a summary table: a relatively simple VBA macro, right? Except many of those sheets were just slightly different, due to multiple engineers modifying them for slightly different cases. The solution ended up being similar to scraping really screwed-up HTML with something like BeautifulSoup. A variety of regexes, string searches, and even heuristics based on cell formatting needed to be used. A lot of information was conveyed visually in those sheets (i.e., a new calculation block started with a title that was typically bold, blue, and 20pt, but sometimes a bit different).
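To make the formatting-heuristic idea concrete, here's a minimal sketch of how it might look. Real spreadsheet access would go through a library like openpyxl; here cells are represented as plain dicts so the heuristic itself is the focus, and the exact font thresholds are illustrative guesses, not the real ones from those sheets.

```python
# Heuristic detection of calculation-block titles, based on the visual
# convention described above: a block starts with a big, bold title cell.

def looks_like_block_title(cell):
    """Guess whether a cell is a calculation-block title."""
    return (
        isinstance(cell.get("value"), str)   # titles are text, not numbers
        and cell.get("bold", False)          # typically bold...
        and cell.get("size", 0) >= 16        # ...and large (~20pt, with slack)
    )

def find_blocks(cells):
    """Return (row, title) for every cell that looks like a block title."""
    return [(c["row"], c["value"]) for c in cells if looks_like_block_title(c)]
```

The point of the slack in the size check is exactly the "but sometimes a bit different" problem: every heuristic ends up with fudge factors to absorb the variations individual engineers introduced.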
The same thing could've been implemented in Python/Julia/Fortran/heck, even C, version-tracked and unit-tested. Verifying and understanding the calculations is way easier, as all the equations are there in front of you when you open the source file. The program could take input written in a simple text file (think INI, or even YAML if something more complex is needed), then spit out a report in another text file, or even fancy formatted HTML. Then, future "special" cases would not each need a different program (spreadsheet); in most cases the master program could be updated while still ensuring backwards compatibility, and all calculations could be rerun in batch from the input files.
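A tiny sketch of that text-in/text-out workflow, assuming an INI input file. The formula here (a simple stress check) is a made-up stand-in for the real calculations, and the section/key names are placeholders:

```python
# Read inputs from a plain-text INI file, run the (version-controlled,
# testable) calculation, and emit a plain-text report line.
import configparser

def run_case(ini_text):
    cfg = configparser.ConfigParser()
    cfg.read_string(ini_text)
    force = cfg.getfloat("inputs", "force_n")    # applied force, N
    area = cfg.getfloat("inputs", "area_mm2")    # cross-section, mm^2
    stress = force / area                        # N/mm^2 == MPa
    return f"stress = {stress:.1f} MPa\n"
```

Batch-rerunning 200 cases then becomes a loop over 200 input files, and a changed factor is one edit to the shared code rather than 200 edits to 200 sheets.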
Anyways, that's what I dream of sometimes while I'm either doing rote manual Excel-scraping or trying to think of a new convoluted heuristic to program it. Unfortunately, the single most important thing for documentation is that colleagues can understand it, and most MEs still see programming as something too complicated and time-consuming. I also work in a big corporation, and IT is utterly useless. Getting a custom tool made by IT would involve a hell of budget approvals, resource planning, and a bunch of three-letter acronyms for at least a year (seriously) before a programmer would strike a single key. Comparatively, a spreadsheet can be bashed out in an afternoon by basically anyone.
This is pretty common practice, in Australia at least.
I do note that you say "mostly" - but I've seen truly epic spreadsheets (including one that my employer at the time spent almost £1 million on) and I'd rather reverse-engineer an application with source code than a complex spreadsheet (even though the feature in Excel to show dependencies visually is rather cool).
Same thing with reliability. So many of these IT-supported applications that are supposed to survive a nuclear strike are down a day a week, while the basic Access database pretty much hasn't seen downtime during business hours since inception.
Tactical initiative: not just for Marines. The closer software is built to a person who actually possesses the domain knowledge, the less risk there is of misunderstood requirements/behavior.
Readability: a language that requires six months to learn to read but expresses programs in half the length is often not the right choice. If a good-enough language can be read by someone with a week of training, and 90% of the domain knowledge is possessed by people who don't know the better language, then use the good-enough language.
Reasons like these are why Excel / VBA is used. And moreover, why it (paradoxically to us) works.
Sometimes, something gets publicized. Usually, it will get shrugged off and nothing much will happen about it. This makes one wonder when the whole house of cards will finally collapse...
I find this very interesting because my company (which I applied to YC 2017 with) is based on exactly this same premise.
if something goes really wrong, they can fire the intern and claim they went rogue.
it is much harder to shift blame to a department where everything is fully tracked, approved, etc.
i think this is a symptom of a much bigger hidden secret. corporations are not organized to accomplish tasks. they are organized to avoid responsibility.
Of course some companies get really far by throwing armies of developers at the proof-of-concept software to keep it running and patched together. But that only works as long as the product is profitable enough to pay for all those developers plus some.
Anyway, point being, companies that "do not want to pay market rates" may have little grasp on the cost side of their profit equation. It's possible that because of that, they do not have a profitable business model, at least for a given product, and should really be rethinking their strategy.
This is one of the problems with keeping your "technology people" out of business planning. It's hard to impossible to avoid chasing organization-wide sunk costs if you don't have someone at the strategic level with a good concept of "technical" things: skill level of the engineering organization, technical debt levels, red flags in system quality, etc.
What are the rates in question in London? For permies and for contracting.
Not very familiar with the USA, but permies get better benefits. For one, massive stock/RSU grants at big internet companies (which the same companies don't give in London), and healthcare/benefits that are expensive as a contractor. I assume it's better to get a standard $300k package as a senior developer than $1500 a day as a contractor.
Part of the trouble is that nobody knows!
However, i think that as a pretty good mostly-Java agile devopsy tech-leadish developer with about ten years' experience, i would be looking at a 70 - 80 k£/yr salary, or a 600 £/day rate.
The salary could be a lot higher somewhere like a hedge fund or Google, but when i was looking recently, those are the numbers that were in play. I was on about 70; when i asked for 80, some employers would wince but keep talking, and others would end the conversation quickly.
The day rate might be way off; it's anchored in a couple of highly anecdotal personal data points. It doesn't match the published industry average of ~450, but i assume that's weighed down by legions of grunts and juniors.
Java AND DevOps AND leadership AND 10 years experience?
Seriously. It hurts to hear about companies that expect to get that sort of profile for less than six figures. 70k is borderline insane. What package do you have on top?
I already got that when I was less than 25... Please don't tell me I am the 0.1%
The basic standard day rate for DevOps is 500-600 from what I've seen. I don't get how the average could be 450, that's less than the absolute minimum, especially if you also have good development skills on top. Java might be slightly cheaper but not much.
Met a guy between two contracts recently who does that kind of stuff (and some others). He left his current gig to join government consulting for £1000 a day :D
> i would be looking at a 70 - 80 k£/yr salary, or a 600 £/day rate.
I think that's on point. Anyone who looks old enough, can say hello and talk about some past real world experiences will land a 500-600 consulting gig or a 70-80k job at a random company.
The big brands and the shops that pay more are more selective (including most finance). Permie = annoying interviews + skill. Consulting = network + delivering.
> The salary could be a lot higher somewhere like a hedge fund or Google
Then join them. Cheap companies are not worth applying to.
I actually think that consulting is very easy to do by anyone and it's more gentle on > 30 years old devs (than SMB/startups). That's why there are so many people doing it, it's not about the money (though it helps).
In all cases we are talking about specialists, think writing JIT compilers, sophisticated data analysis, program verification, or scaling to millions of users.
Salaries for excellent programmers in Europe are a joke vis-a-vis the US.
Last I checked, Europeans need an H-1B visa to go work there, and it's so hard to get that applying is almost bound to fail. Even if you get it, it's hard to switch jobs, so you can't negotiate and hop your way to the top.
P.S. No place in Europe is charging $4000 for a one-bedroom ;)
I had some stock options that would obviously never have been worth anything.
Can i ask what kind of companies were paying you 70k in London when you were less than 25? And to do what? And how much experience you had at that point? If you were doing generalist development on the back of a degree and four years of work experience, i'm going to cry.
> I don't get how the average could be 450, that's less than the absolute minimum
To be clear, that appears to be the average rate for Java developers, not devops. Have you ever worked in a big retail bank? Or some other huge company? You know how they're stuffed full of teams of people who couldn't code their way out of a paper bag? Most of them are contractors, and it's probably them who are billing 450.
> Then join them.
> I actually think that consulting is very easy to do by anyone and it's more gentle on > 30 years old devs
As a >30 year old dev, i got out of consulting because it was such a hard life. It's not technically difficult, but as a senior, you spend your whole time banging your head against the brick wall of client stupidity.
Don't compare yourself to the top and cry. There is always someone earning more. I had my time when I was always the least paid guy in the room. Arguably that's still what I feel every time I look around.
I've got a bad image of retail banks. I am fine with investment banks and other banks and non-banks (hedge funds, others) that treat their people well and can build a business out and around what I do. Retail and consumer banks are no places for programmers.
>>> It's not technically difficult, but as a senior, you spend your whole time banging your head against the brick wall of client stupidity.
Not much different from a permie. I find it a lot more tolerable when you can see daily stupidity along a daily rate. Who cares what happens. You'll be on a new contract in 3 months anyway. (That being said, I really hate contracting).
My anecdotal case:
I currently work for a very well-known French startup. I am paid a bit under 50k€/y (and no equity).
The company really struggles to attract skilled engineers, and to be honest there are not that many skilled engineers in a given field in the country.
I could get 70-80k€/y in some other less known French startups that do their best to attract top talent.
I got an offer from a US-based startup for a little bit under $200k/y with some benefits and a good amount of equity.
If I can get a US work visa, I am going to take that offer; otherwise I will follow other opportunities, but I am not staying with that first company in the long run.
Some are still in tech, at places that pay them, places that you may not know or have never heard of.
Little experiment: Add 50% to your top compensation, then you may soon start interviewing people who earn around that number and sometimes another 50% on bonus/stock (that your company doesn't have).
They are here and they are the people you wanted, they're not difficult to get, you just didn't know about them.
Now that you found out, you'll continue to ignore them and keep looking for a unicorn at half the price :D
If they insist on tech in SV or nothing, it's like people who fail the culture-fit criteria because they're a woman or a minority. Uber and sales and non-tech employers, unlike tech, won't discriminate as much against women or people of color. If a white male can't get a job because of a different issue, like skill vs pay, he can turn to Uber and sales and non-tech employers just as a black person or a woman already has to do.
I'd like to see salaries raised if only for the "a rising tide raises all boats".
2. The Israeli business AMDOCS with alleged Mossad links is an outsourced and hosted billing provider for telecommunication carriers, whose client list includes large portions of the developed world.
● There are “clusters of bugs”, i.e., finding a bug in a certain module increases a chance to find a subsequent bug in the same module. This is understood intuitively by most people yet very few act on it, discarding modules that have too many bugs (and rewriting them from scratch) instead of continuing to sink resources into maintenance.
● While professionals in other industries use professional tools, programmers use commodity hardware and software (the kind a homemaker would use to google a guacamole recipe). :(
● Managers and programmers think that personality traits and team members' individualities do not matter and there's no role “human factor” plays in development. Ditto for self-organization vs constantly “organizing”/policing employees.
● There were pretty cool systems back in the day (with orthogonal persistence, ability to inspect and modify any object on-the-fly, etc.). The modern ones are not “bad” either but some lessons could still be learned.
● Sometimes there are notions that bad software is created due to sales people/economic pressure but analysis of, say, build tools shows that it is not the case. ;)
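The "clusters of bugs" point above could be acted on with something as simple as counting historical bugs per module and flagging the outliers as rewrite candidates. This is only a sketch; the input format, the use of the median, and the threshold factor are all illustrative choices, not an established method.

```python
# Flag modules whose historical bug count is far above the median,
# i.e. the "bug clusters" that may be cheaper to rewrite than to patch.
from collections import Counter

def rewrite_candidates(bug_log, factor=3):
    """bug_log: iterable of (module, bug_id). Returns outlier modules."""
    counts = Counter(module for module, _ in bug_log)
    ordered = sorted(counts.values())
    median = ordered[len(ordered) // 2]
    return sorted(m for m, n in counts.items() if n > factor * median)
```

Even a crude report like this makes the intuition actionable: the module everyone "knows" is rotten shows up at the top of the list with a number attached.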
i have to disagree with you here. most of the tools we use as programmers are so precise, adaptable and refined for the job that they would make your average industry professional cry.
having those tools run on a guacamole-recipe machine is only a testimony to their power.
it's not just the software either; through services like AWS we have access to some of the most sophisticated and optimized computation and storage hardware on the planet.
I don't really want to get started about those slow laptops (that have “Pro” labels on them). I don't know about you, but for me their performance is disappointing. (By the way, I use a 2010 MacBook Air as my home machine, so no need to call me “picky”.) Yet those “Pro” machines seem to be the “default” ones in so many organizations I know. They are a good trade-off indeed between looks, portability, versatility and price. But they are nowhere near the “pay whatever it takes to scrape every last drop of performance and reliability from your tools” approach that to me marks the choice of tools by professionals.
Anyway, thank you for your answer and I am glad that you do focus on the positive side of things!
i'm a full stack web dev myself, whatever that means nowadays. i consider languages, the os, dbms, version control, virtualization, browsers, webservers, automation servers etc etc, to be my tools. not merely the IDE or text editor.
personally i hardly need anything beside emacs and zsh terminal on my 7y/o thinkpad t410. that said i do understand that different people need different tools. hope you find what you need.
Li-ion battery energy density grows in spurts. Realistically, there have been a few major improvements, with very small incremental improvements in between. For example, in 2001 we were at 180 Wh/kg. In just a handful of months we jumped to 260 Wh/kg, then to 280 Wh/kg as manufacturing processes improved. In the last 11 years, however, improvements have been maybe a few Wh/kg per year, and there have been literally zero improvements whatsoever in the past four years, by anyone.
Eleven years is a very long time, especially for something that everybody just assumes is constantly silently improving. And it's not like the battery industry stopped investing in growth: there are absolutely insane amounts of money invested in battery R&D, since whoever figures out how to make a cheap, energy-dense battery will become unfathomably rich and make Madoff-level growth look like US savings bonds.
Bonus fact: nobody really understands lithium-ion batteries. One production run might have substantially higher or lower capacity. Some batteries might explode. Some batteries are high discharge and some are low discharge. The "explanation" for all of these things is 'heat'.
Imagine that happening to any other industry (for example the auto industry.)
- "Why did my car just explode!?" "Heat"
- "Why does this car last a tenth the lifetime of this car?" "Heat"
- "Why does this car go a thousand times faster than any others?" "Heat"
- "Why is this entire production run of cars not working?" "Heat"
People say "heat" for two reasons: 1) because nobody understands li-ion batteries and 2) because yes, the real answer does have something to do with heat.
Batteries are one of the most important industries and critical to every company on HN. But the state of the art of batteries is putting together random metals with a bunch of random chemicals and watching which ones don't explode.
Imagine talking to the CEO of Boeing: "We just did our millionth test, engines made of cheese and wings made of coconut shells. It crashed during takeoff, of course. We haven't had any improvements in over a decade, but we're sure we'll get there eventually - we've spent billions on research so far."
I get that it's probably all 3, but I feel as if the main problem being 1) or 2) would lend itself well to startups with extremely knowledgeable battery industry veterans.
Another problem is that there are two categories of success: you find a simple way to make amazing batteries, or you find a massively complicated way. If you find a simple way, that's great, but it can be copied by anyone who wants money, and you'll end up like the Hoverboard - a very successful _idea_ but the company only captured a fraction of the market. If you find a complex way, it's very likely to require an expensive material or process. For example, say you need a rare metal or something that's not currently produced in large quantities. The capital you'd need to raise in order to extract or create this material could be absolutely massive. For example, let's say you got a 10,000 Wh/kg battery but it needed a substantial quantity of rhenium. Well, worldwide production of rhenium is about 40 tons/year. You'd have to create a worldwide industry for extracting rhenium, perhaps rivaling the oil industry in size. Imagine that conversation with VCs. "You want to raise $1.5 trillion? With a T?" "Yep, and we want to do it to create a battery that could make our world look like Star Trek. Oh yeah, and it might become obsolete in months if someone has a better idea."
Another thing is that there's not much to "disrupt". If someone gave me $XXX million for a battery startup, it'd basically be set up in a very similar way to the research arm of Panasonic or Samsung SDI. If some genius comes up with a way to improve the efficiency of the core research - sure, that's obviously worth funding. But the problem is that everybody is trying to get to the breakthrough directly, not invent tools to get to the breakthrough faster.
Imagine if you were in a room with 1000 people, and an authoritative source announced that there's a pot of gold buried in the hill behind the room, and it'll probably be found in a few hours. Do you spend two hours driving home, getting your shovel, and driving back? Or do you run out there and rabidly start digging? Sure, you might answer "get a shovel" now, but in the real world 999 of those 1000 people will be digging with their hands.
The last problem is that the risk, and the return, is absolutely ridiculous. With startups, say 1 in 1000 makes you a billion dollars, 600 return your money and 399 go bankrupt. Great, that's acceptable. Even worst-case scenario, you take a big loss but you don't lose _everything_.
With moonshot battery research startups, every single one will lose all your money except for one - and that one will make you fabulously wealthy beyond your wildest dreams. If the breakthrough is big enough, it would revolutionize literally every field from space exploration to aerospace to power generation to electronics to vehicles. There are so many variables you can't even start to calculate the odds.
China subsidizes up to 70% of postage fee, so Aliexpress sellers can sell cheap stuff for $2 with "free shipping" worldwide.
There is no such thing as "free shipping" in e-commerce; it's just included in the price. An astonishing number of people actually think it's free for the seller.
When you apply for a website license in China, you have to give root access to the Ministry of Information. Most of the time it's handled by your hoster "transparently" in the form of "license application tokens". (Yes, you can trick them if you colocate your own bare metal.)
25% is an average ingredient cost for restaurant meals.
Root: If you do business in mainland China the government reserves the right to strongly suggest you comply with everything they want, which is usually "if anti-government stuff appears on your site, you are done, and otherwise if we give you a phone call, jump high now". This is in some ways no different to other countries in model, but perhaps stronger in presumed responsibility and penalty than some. As for root-by-VPS, yeah maybe, but citation-needed, and there's arguably not a huge difference versus seizure as performed by other countries.
(Source: ~16 years ... not claiming total accuracy, the above is just my experience/impression)
Wow, I'd like to know more, this seems especially interesting in regard to Chinese Bitcoin exchanges. They can keep most of their bitcoins in cold storage, but still having root access to the web server means user passwords, TLs, and so on. Is it just a web server directly connected to the internet that they need to have root access to? Or do you e.g. also need to provide access to the db server used?
edit: Also what's with the license? It's possible for me to buy hosting in China without any licenses.
But in the case of China/AliExpress, it receives maybe 1 piece of mail for every 100 it sends out, which means receiving countries use proportionally much more manpower to distribute the mail, and this is via taxpayer funds.
So, interestingly, you yourself end up paying for China's free shipping, via your home-country taxes.
I feel that people have a kind of double-think about this; they suspect this is the case, but they still expect software to work properly.
I work at one of the most well-reputed software product companies in the world. We pride ourselves in hiring the very best, and try to do so through an interviewing program that simulates real life coding instead of an academic whiteboard situation.
Even amongst smart people, the bar for software quality is shockingly low. You will find ostensibly senior people again and again who, given half a chance, will espouse all day long the importance of high-quality code, well-designed abstraction, separation of concerns, and good testing.
You can then let these people operate for a few hours by themselves and examine the result. In all but a few situations, it'll be a pile of mud that's designed counter to every one of the values they talk about. Often they're not aware of this, but it's also very common that they know it's sloppy, and don't really care. People don't do what's right, they do what's easy.
In a corporate environment, there is never a structure to identify the developers who are most likely to write good and maintainable code (it's too subjective) so that they can be put into a review capacity, and even if there were, there just isn't the bandwidth to review new additions to the depth that's required. In an ideal situation, many patches would be returned to author essentially asking for a complete rewrite, but that's difficult to justify given that building strong relationships inside the team needs to be considered.
The result is that all software developed in industry trends towards bad over time, and it appears to be as absolute a universal constant as entropy. You can find localized exceptions, but they're often only temporarily good, as eventually more people contribute more features.
Electronic displays industry: LED TVs are also LCD TVs. When I explain this to people, some even push back! In both, the picture is formed by the LCD; it's just that the backlight is built out of LEDs in what is called an LED TV.
Specifications given for power output in music systems, contrast ratio in TVs, frequency response for earbuds, etc. are mostly fake.
To get at the truth you have to find some enthusiast website and find someone there who has done enough digging to find the real technical information and specs.
Speaking of LED TVs, the next one is QDLED TVs, where again it is still an LCD TV but the backlight is now improved with quantum dots. This confuses the heck out of the public, who are looking/waiting for a true quantum-dot display.
Unfortunately, they're aided by a compatibility moat built up of custom control support. Basically everything that doesn't support Microsoft UI Automation requires memory hijinks. And you'd be surprised by how much legacy software used weird third party libraries.
If you want more impressions or to chat, feel free to flip a mail at ethbro.co a g mail. Happy to talk, as god knows automation software could be improved.
Do you mean development for mobile as a platform?
Do you mean the ability to learn on the move?
Something else entirely?
You'd think "how hard can it be, you either signed a contract with them and have it on file, or not". Well...
When the insurer signs a contract with, say, a doctor, what is happening is not "anything this doctor does for someone on one of our plans is covered by this contract". Instead, whether something falls under the contract or not depends on a whole bunch of factors, including but not limited to:
* The doctor's NPI (provider identification number, issued by US Medicare/Medicaid).
* The particular medical specialties and credentials of the doctor.
* The location(s) where the doctor renders services and the type(s) of services rendered.
* The federal tax identification number and billing entity the doctor bills as.
* The address the doctor submits on the bill.
So suppose Dr. Jane Doe signs on to your insurer's network. She's contracted as a primary-care physician, rendering services in her office at 123 Main Street Suite B, under NPI 1111111111, and will bill as Jane Doe Medical, tax ID 222-22-2222, payment to be sent to her billing office at 234 Second Street.
And you go to Dr. Jane Doe and have no trouble, until one day you get a notice back from your insurance company that suddenly she's no longer in network. She swears up and down that she's still in network with them, so it must be the insurance company's fault, right?
Well, the thing is that she merged operations with Dr. John Roe in the next office suite over, and now is billing as Doe & Roe Medical, tax ID 333-33-3333, payment to be sent to 345 Elm Avenue. And there's no contract for that!
Or she outsourced all her medical billing to MedBillCo, which again changes information the contract is keyed to. Or she also holds credentials for other types of medical practice and was rendering service of that type, maybe at a local hospital. Or she kept her tax ID and billing the same but moved her office from 123 Main Street Suite B to 123 Main Street Suite C. Or the post office realigned the boundaries of her zip code, and she "moved" from zip 11111 to zip code 11112 as a result.
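A toy model of why "same doctor" doesn't imply "same contract": coverage lookup is keyed on a tuple of attributes, and changing any one of them makes the lookup miss. All names, NPIs, and tax IDs here are the illustrative ones from the example above, and the flat dict is of course a drastic simplification of a real claims system.

```python
# Coverage is keyed on (NPI, tax ID, specialty, service address), not on
# the doctor as a person. Change any field and the contract no longer matches.

def contract_key(claim):
    return (claim["npi"], claim["tax_id"],
            claim["specialty"], claim["service_address"])

contracts = {
    ("1111111111", "222-22-2222", "primary-care",
     "123 Main Street Suite B"): "in-network",
}

def network_status(claim):
    return contracts.get(contract_key(claim), "out-of-network")
```

So a visit billed under the old tax ID resolves to "in-network", while the identical visit billed as Doe & Roe Medical under the new tax ID resolves to "out-of-network" even though nothing visible to the patient changed.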
This sort of thing happens all the time.
I'm told that LexisNexis once came to an (I think optimistic) estimate that the half-life of medical provider data is 18 months. So gather up all the information on your network of doctors and hospitals, carefully vet and double-check it, make sure everything is full and correct and up-to-date... and 18 months later half of it will be wrong, just due to the background rate of changes to office locations, credentials, billing entities, etc.
So after working for a little over a year at a company that has to deal with this, I am not surprised at all when I hear someone complain that "it's the same doctor I went to last time, nothing changed, they're still in network, so why can't the insurance figure that out?" Nothing visible to the person complaining has changed, sure. But that tells you nothing. I am more often surprised that anyone is ever able to correctly determine in- or out-of-network status; being able to do it even a fraction of the time is frankly a minor miracle, and requires a whole lot of people toiling away behind the scenes.
For the record: I work for a company in the Medicare space, and we're required to revalidate all our provider information at least once every 90 days, precisely for this reason. Also, you don't want to know what the industry average is for correctness of printed provider-directory booklets. Even if it's sent off to the printers the day the up-to-date data has been validated, some not-insignificant proportion of it will already be wrong by the time it arrives in someone's mailbox, just because of how often and how quickly the information changes.
Funnily, the only people asserting the value of diplomas... are PhDs at Ivy League business schools... selling expensive diplomas...
Education in its current form is probably a large-scale scam: too long, too expensive, counter-productive, and overrated.
Cf. Henry Mintzberg's essay on why MBAs, for instance, are a poor choice for fostering innovation.
"M.B.A. programs train the wrong people in the wrong ways with the wrong consequences," said Henry Mintzberg, a management professor at McGill University in Montreal. "You can't create a manager in a classroom. If you give people who aren't managers the impression that you turned them into one, you've created hubris."
As a person from a region of the world with a high level of illiteracy, and having witnessed firsthand the ROI of an uneducated citizenry, I would question this assertion.
With that measured success, we got the idea that once you were a functioning member of society, you could become a functioning member of the workforce simply by continuing in even more years of schooling. But it's not clear if it has really worked out. In 1970, approximately 10% of the US population had a bachelors degree or higher. In 2017, that number is closer to 30%, yet incomes have remained stagnant throughout that entire period and job quality has declined. Not what you'd expect from the promised higher incomes and better jobs that people were willing to spend large amounts of money for.
If I weren't trolling, I would admit a soft spot for Finland's focus on early education and the amazing results they have achieved with it.
I've also grown up in a region where education was overlooked, and living with complete fucking morons is not fun. Especially when these morons themselves think that education is worthless, which is a snowball effect meaning many of their children will be stupid, as well.
On the other hand, it's easier to control them if you want to. Like they did in the Middle Ages and earlier. Yeah, let's go back to that.
The ROI is as indirect and long term as it gets, but it's real and it's massive.
Having an educational system is a sign that the society is wealthy and can afford it, but doesn't mean it's the best use of resources involved (thus it might have a bad ROI).
You said education was overlooked where you come from; was there an educational system there?
For one, if the labor market in a particular specialty becomes oversaturated, does that mean existing grads need to spend years retraining?
High school is and will always be needed. Also, for some careers university education is obviously necessary. Just not always.
However, this probably only applies to the USA, or other places (South America as well) where students get into high debt to get degrees.
Speaking of illiteracy, there's an urban legend (not sure if true, anyone with access to data please check it) that literacy rate has actually dropped in the US throughout the 20th century, coinciding with increase of education "supply".
Everything from 1% to 66% illiteracy has been pitched to the public at one time or another.
It does seem that illiteracy clusters with citizenship status, race and other demographics, the simple age distribution of a geographic location, and socioeconomic level, to the extent that if you successfully filter those effects out you end up with a boring flat line at 100% literacy, which ends up meaning nothing. The largest factor is that literacy graphs and maps are basically a restatement of immigration data.
From that point, it depends on the subject you decide to study. It seems to me that medicine students get a lot of real life practice during their training, which makes the school almost irreplaceable.
In other areas, like programming, you're better off staying at home, watching online lectures, and building your own pet projects to learn, as long as you have enough self-discipline to keep yourself on track. There are tons of resources you can learn from and you can practice as much as you want. I did it and it has worked well so far.
Overall, I wouldn't say we should ditch education at all, but we should seriously rethink the way it works. We can do much better than this.
They also learn to read, write and count. It's a rather useful skillset to have :D
While I agree that you can get pretty far with self-study on small projects, the most important parts of software engineering revolve around communication, especially with other developers and with expert tools. It's hard to get better at communication without, well, communicating.
Of course, one can get a lot of experience by contributing to open source projects or finding other ways to work on group projects.
There is no teamwork, source control, testing, or anything like that.
I seriously hope other universities do it differently and I just attended the wrong one.
That's not to say theoretical education is worth nothing; I'm saying the balance is tilted way too far toward theory, which costs less to teach and makes universities more money.
Let's call it corporatist behaviour.
You don't want a surgeon who was good at school; you want a surgeon who, when his hospital is poor, will spend his own earnings on medical journals to stay up to date, say fuck it to the hospital, and take unpaid vacations to attend conferences.
You want a surgeon who doesn't care about his diploma or titles but stays focused on his craft and keeps learning, regardless of what society legally requires him to do in order to practice.
The challenge the real world presents is that the resources needed to reach that level, such as access to specialized equipment, are not within reach of the average person. It's not like a software development career, where a widely available personal computer that can be purchased for hundreds – even tens – of dollars pretty much sums up everything you need to become among the best in the world.
What the schools have been able to provide is a place where people can pool their resources to gain shared access to infrastructure. Education comes as a natural consequence of effectively utilizing that infrastructure, of course.
Are any credible organisations offering even rudimentary rejuvenation therapies? If yes can you list them, with links and costs if possible?