Ask HN: What is the most important publicly unknown fact in your industry?
64 points by yotamoron on Mar 19, 2017 | 99 comments



Spreadsheets are widely used in financial institutions, and not just for little things but for critical tasks too. I've seen spreadsheets sophisticated enough to make database calls to book-of-record systems.

These spreadsheets are often put together by an intern or associate who "knows a little VBA" and automates or tracks a task that the business was doing manually. Nobody on the desk really knows how the spreadsheet works.

Changes to the spreadsheets are not controlled in any manner, are not tracked in source control, but are often (thankfully) backed up by virtue of being saved on a network drive.

The business prefers spreadsheets over "real software" because they have complete control of it whereas their internal IT department are slow and expensive, and the resulting custom-coded products are often substandard (see zubairq's comment about paying market rate).


I once met with a team of 'business support' people at a large company (not financial—it was in the oil and gas sector) who turned out to be a rogue technical group. Their job was to support the complex spreadsheets made by their bosses. They told me that every 2 years a new batch of software would come out of the IT division, and IT would get a company-wide decree mandating that everyone must use the new software without exception.

But the business people had a strategy to defeat this every time. They would simply make sure that "copy/paste" was on the requirements list every 2-year cycle. When the new software came down, they would satisfy the company-wide decree as follows: (1) open the new software; (2) copy the data; (3) paste it into Excel. And then they would continue as before. The only thing that changed every two years was which buggy, slow, and limited in-house application they would be copying data out of and pasting it back into.


Situations like this one are just sad yet unfortunately quite common in the enterprise segment.

There's this fundamental disconnect: the people buying the software and mandating its use aren't the ones who end up having to use it. So ridiculously complex workarounds often emerge in order to comply with company policy while still being able to get any work done at all.

This quite likely is one of the largest productivity sinks in industrialised societies.

There are ways to amend these issues but they require a change both in company policies and in thinking: Allow departments and teams to choose their tools. Have a repository of internally developed tools (Excel spreadsheets, bespoke CRUD applications, shell scripts etc.) for others to share and build upon. Put an emphasis on software quality and user experience.


Not just financial institutions either: I've worked for two global manufacturing businesses – in very different industries – where fundamental parts of their day-to-day operations were dependent on Excel spreadsheets and Access databases.

I would absolutely echo all your points, especially the rationale behind "real software": it would have been far more expensive, have taken far longer to be delivered, and that's if it were even signed off in the first place. When management sees "real software" as wholly unnecessary and all you have is Excel, you soon learn to be rather creative.


My experience as well!

Once I was contacted by a friend who works in the QA department of a factory (belonging to a multinational company) to implement a system to keep track of the build process (what was soldered, who did it, what tolerances it must meet, things like that). The factory's process was modelled entirely in Excel spreadsheets, with the caveat that it took them almost 6 months to gather and deliver the data to the client after installing the product at the client site.

As it was a freelancing job, I gave them a very high hourly rate, such that they could forget about it, as it was too much work for one person alone. But in hindsight, I also refused because it would almost certainly mean that a lot of the workers in that department would be made redundant (there were 30 workers there; I would say at least half would be unnecessary after the system was up and running). If only management understood the huge savings that would be possible... But in this case, I'm glad they don't!


The thing I find most amazing about this (I work for a bank) is seeing people make decisions based on version 812 of a workbook passed between 3 different departments, with multiple pages where the count of line items doesn't even add up. They're so obviously incorrect and of questionable value, but I've seen managers drop them like they're God's own Truth in meetings. It's that kind of thing that convinces me organizational dysfunction is the root cause of the mysterious "productivity crisis" that economists often cite.


Abstraction increases as you move up org charts. Unfortunately, actual process knowledge does not.

I blame the "MBAs can do anything" philosophy, but it's probably also a consequence of worker mobility.

To state it most simply: if your management (at any level) cannot call bullshit on the data their direct reports provide them (whatever level of abstraction that is), then your business is at existential risk.


Agreed. And then you realize these are the guys that handle the world's money. Not exactly a mystery why the financial industry has a culture problem.

I almost had a heart attack thinking about it while watching one of my bosses try to consolidate 13 different versions of a spreadsheet, none of which made sense. And then I realized he's one of the above par ones.


I work in mechanical engineering, mostly structural analysis at the moment (ie, figuring out if parts will hold up).

The vast majority of calculations are performed in Excel. Most engineers know at least a bit of programming from one mandatory class in first year, but for some it's Fortran, for others it's Matlab, VBA, etc. Excel spreadsheets are the common denominator.

Thankfully, most sheets are rather simple, with one calculation per line, with inputs coming from previous intermediate calculations on the above lines.

Unfortunately, sometimes, the logic needs to be more complex, and that's where things go south. Despite all the best intentions, spreadsheets are very hard to test, audit and version-track, and errors will inevitably sneak in.

Other things get harder too, when all information is in spreadsheets. Recently I was tasked with updating and reporting on about 200 calculations that were contained in 200 almost-identical sheets. A single factor needed to be changed in those 200 sheets, and then I needed to compile the new results in a summary table: a relatively simple VBA macro, right? Except many of those sheets were just slightly different, due to multiple engineers modifying them for slightly different cases. The solution ended up being similar to scraping really screwed-up HTML with something like BeautifulSoup. A variety of regexes, string searches, and even heuristics based on cell formatting had to be used. A lot of information was conveyed visually in those sheets (e.g., a new calculation block started with a title that was typically bold, blue, and 20pt, but sometimes a bit different).
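
To give a flavour of what that formatting-based scraping looks like, here is a minimal sketch in Python with openpyxl rather than VBA (the file name, size threshold and colour check are made-up placeholders, not the actual sheets):

    from openpyxl import load_workbook

    def find_calc_blocks(path):
        """Guess where calculation blocks start, going by cell formatting."""
        ws = load_workbook(path).active
        titles = []
        for row in ws.iter_rows():
            for cell in row:
                font = cell.font
                # heuristic: a block "title" is bold, large-ish, and not plain black
                if (cell.value is not None and font.bold
                        and font.size and font.size >= 16
                        and font.color is not None
                        and font.color.rgb not in (None, "FF000000")):
                    titles.append((cell.row, str(cell.value)))
        return titles

    print(find_calc_blocks("calc_sheet_042.xlsx"))   # made-up file name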

The same thing could've been implemented in Python/Julia/Fortran/heck, even C, version tracked and unit tested. Verifying and understanding the calculations is way easier, as all the equations are right there in front of you when you open the source file. The program could take input from a simple text file (think INI, or even YAML if something more complex is needed), then spit out a report in another text file, or even fancy formatted HTML. Then, future modifications to handle "special" cases would not need a different program (spreadsheet) for each new case; in most cases the master program could be updated while still ensuring backwards compatibility, and all calculations could be rerun in batch from the input files.
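
As a rough sketch of what I mean, here is a made-up bending-stress check that reads INI inputs and reruns every case in batch (the file layout, field names and formula are invented for illustration, not one of our real calculations):

    import configparser, glob

    def run_case(path):
        cfg = configparser.ConfigParser()
        cfg.read(path)
        # hypothetical inputs for a made-up bending-stress check
        load    = cfg.getfloat("inputs", "load_n")               # point load, N
        length  = cfg.getfloat("inputs", "length_mm")            # span, mm
        modulus = cfg.getfloat("inputs", "section_modulus_mm3")  # section modulus, mm^3
        # simply supported beam, point load at midspan: stress = P*L / (4*W), in MPa
        stress = load * length / (4.0 * modulus)
        return path, stress

    # batch-rerun every case and dump a summary table
    for path, stress in map(run_case, sorted(glob.glob("cases/*.ini"))):
        print(f"{path}\t{stress:.1f} MPa")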

Anyway, that's what I dream of sometimes while I'm either doing rote manual Excel-scraping or trying to think of a new convoluted heuristic to program for it. Unfortunately, the single most important thing for documentation is that colleagues can understand it, and most MEs still see programming as too complicated and time-consuming. I also work in a big corporation, and IT is utterly useless. Getting a custom tool made by IT would mean a hell of budget approvals, resource planning and a bunch of three-letter acronyms for at least a year (seriously) before a programmer struck a single key. By comparison, a spreadsheet can be bashed out in an afternoon by basically anyone.


I'm not an engineer anymore, but at one point I was doing a lot of structural analysis in Excel. When we found the existing spreadsheets (purchased) weren't good enough, I wrote an entire library of analysis and design spreadsheets, the most complex being for the design of reinforced concrete columns: hundreds of lines of VBA, and only lots of recursion could crack this stuff without support for symbolic calculus.

This is pretty common practice, in Australia at least.


I've felt for a long time that any application that is targeting businesses as customers would be much better off spending resources on good spreadsheet import/export workflows than on their gui. If you look at your gui and see a grid, basically of any kind, you probably are looking at a point where your customers would prefer to interact with a spreadsheet than your software.


A spreadsheet can mostly be maintained by the next person to come along. More complex systems often die when the person who knew them leaves. Plus all that stuff about crappy IT departments who don't act as if they exist to serve the actual purpose of the business.


"A spreadsheet can mostly be maintained by the next person to come along."

I do note that you say "mostly" - but I've seen truly epic spreadsheets (including one that my employer at the time spent almost £1 million on), and I'd rather reverse engineer an application with source code than a complex spreadsheet (even though the feature in Excel to show dependencies visually is rather cool).


I have seen enough legacy applications, applications so old that no one understands how they work or dares to touch them, or for which we need to pull developers out of retirement, software running only on old mainframes, not to take the key-man-risk argument against business-developed software too seriously.

Same thing with reliability. So many of these IT-supported applications that are supposed to survive a nuclear strike are down a day a week, while the basic Access database has pretty much not seen any downtime during business hours since inception.


I think, to this point, the organizations that will really succeed in the future are the ones who aggressively hire people who combine the ability to build simple, usable and understandable technical tools with an understanding of business logic and the ability to communicate. I think soon people will be able to get further with some basic R/Python and the ability to write a good email than many of the more technically skilled workers without those abilities.


To tease out a little more from yours and parent's comments, I think a few things are drastically undervalued by traditional programmers.

Tactical initiative: not just for Marines. The closer software is built to a person who actually possesses the domain knowledge, the less risk there is of misunderstood requirements / behavior.

Readability: a language that requires six months to learn to read but expresses programs in half the length is often not the right choice. If a good-enough language can be read by someone with a week of training, and 90% of the domain knowledge is possessed by people who don't know the better language, then use the good-enough language.

Reasons like these are why Excel / VBA is used. And moreover, why it (paradoxically to us) works.


Well, I'd say you can generalize that even further: spreadsheets are just one symptom. A very inconvenient truth is that wherever you look, you always seem to find far too much critical infrastructure and far too many workflows depending on hacks.

Sometimes, something gets publicized. Usually, it will get shrugged off and nothing much will happen about it. This makes one wonder when the whole house of cards will finally collapse...


> The business prefers spreadsheets over "real software" because they have complete control of it whereas their internal IT department are slow and expensive, and the resulting custom-coded products are often substandard (see zubairq's comment about paying market rate).

I find this very interesting because my company (which I applied to YC 2017 with) is based on exactly this same premise.


The insurance industry has a lot of this too. Not as much by the sounds of it but our internal reporting relies on it a great deal.


In my experience, a huge amount of insurance is (various front end systems run by various external sources) -> semi-standardized mainframe intake -> mainframe heavy duty processing -> kick out report of manual work items -> Excel sheet -> manual resolution.


A lot of reporting depends on it - a lot of high end financial reporting applications have Excel spreadsheets as their main output mechanism (e.g. Oracle Hyperion FM).


Calca (http://calca.io) is an interesting alternative approach to this problem that uses plain text files (supporting Markdown even) to represent calculations.


i think businesses actually prefer spreadsheets because they have No control, and therefore no responsibility.

if something goes really wrong, they can fire the intern and claim they went rogue.

it is much harder to shift blame to a department where everything is fully tracked, approved, etc.

i think this is a symptom of a much bigger hidden secret: corporations are not organized to accomplish tasks. they are organized to avoid responsibility.


In tech, most managers and recruiters think there is a skills shortage. This is untrue. The truth is that companies do not want to pay market rates for people with the right skills.


More than that, there exist certain technology products that cannot be made profitably. Underpaying people to make proof-of-concept quality software instead of paying market rates for suitable quality systems (backed up, versioned, monitored, scalable, secure, etc.) will get you investors and even many customers. But it may not be worth the cost in the long run.

Of course some companies get really far by throwing armies of developers at the proof-of-concept software to keep it running and patched together. But that only works as long as the product is profitable enough to pay for all those developers plus some.

Anyway, point being, companies that "do not want to pay market rates" may have little grasp on the cost side of their profit equation. It's possible that because of that, they do not have a profitable business model, at least for a given product, and should really be rethinking their strategy.

This is one of the problems with keeping your "technology people" out of business planning. It's hard to impossible to avoid chasing organization-wide sunk costs if you don't have someone at the strategic level with a good concept of "technical" things: skill level of the engineering organization, technical debt levels, red flags in system quality, etc.


What are those people (with the right skills) currently doing instead of working in tech?


I assume he thinks the prices should rise until demand drops off, and at that equilibrium price we shouldn't say there's a shortage. Lots of people would like a rolex and don't have one, but nobody would say there's a rolex shortage.


If rolexes could only be bought from other consumers who actually want to keep them unless they can sell well above their purchase price, people might say there's a shortage.


Well, for one, they're contracting. Though that isn't quite the same thing as being unavailable, it contributes to a perceived shortage of permanent developers. In London this is because rates for senior developer contractors are something like 50% higher than salaried, and that's only changing very slowly. (Is the situation similar in the US?)


>(Is the situation similar in the US?)

What are the rates in question in London? For permies and for contracting.

I'm not very familiar with the USA, but permies get better advantages. For one, massive stock/RSU grants at big internet companies (which the same companies don't give in London), and healthcare/benefits that are expensive as a contractor. I assume it's better to get a standard $300k package as a senior developer than $1500 a day as a contractor.


> What are the rates in question in London? For permies and for contracting.

Part of the trouble is that nobody knows!

However, i think that as a pretty good mostly-Java agile devopsy tech-leadish developer with about ten years' experience, i would be looking at a 70 - 80 k£/yr salary, or a 600 £/day rate.

The salary could be a lot higher somewhere like a hedge fund or Google, but when i was looking recently, those are the numbers that were in play. I was on about 70; when i asked for 80, some employers would wince but keep talking, and others would end the conversation quickly.

The day rate might be way off; it's anchored in a couple of highly anecdotal personal data points. It doesn't match the published industry average of ~450, but i assume that's weighed down by legions of grunts and juniors.


>>> I was on about 70; when i asked for 80, some employers would wince but keep talking, and others would end the conversation quickly.

Java AND DevOps AND leadership AND 10 years experience?

Seriously. That hurts to hear about companies who expect to get that sort of profile for less than 6 figures. 70k is borderline insane. What package do you have on top?

I already got that when I was less than 25... Please don't tell me I am the 0.1%

The basic standard day rate for DevOps is 500-600 from what I've seen. I don't get how the average could be 450, that's less than the absolute minimum, especially if you also have good development skills on top. Java might be slightly cheaper but not much.

Met a guy between two contracts recently who does that kind of stuff (and some others). He left his current gig to join government consulting for £1000 a day :D

> i would be looking at a 70 - 80 k£/yr salary, or a 600 £/day rate.

I think that's on point. Anyone who looks old enough, can say hello and talk about some past real world experiences will land a 500-600 consulting gig or a 70-80k job at a random company.

The big brands and the shops that pay more are more selective (including most finance). Permie = annoying interviews + skill. Consulting = network + delivering.

> The salary could be a lot higher somewhere like a hedge fund or Google

Then join them. Cheap companies are not worth applying to.

I actually think that consulting is very easy to do by anyone and it's more gentle on > 30 years old devs (than SMB/startups). That's why there are so many people doing it, it's not about the money (though it helps).


I just talked to a former student of mine, who said their startup can't hire senior programmers with relevant experience in the UK, despite offering £160k per year. A colleague of mine has a new startup on the side, and he says he can't hire reliable front-end devs despite offering £70k-£80k. I also just talked to two former PhDs from my group, and both went to the US where they got offers > $200k per year, while in the UK they were topping out at ~£70k per year max.

In all cases we are talking about specialists, think writing JIT compilers, sophisticated data analysis, program verification, or scaling to millions of users.

Salaries for excellent programmers in Europe are a joke vis-a-vis the US.


Are they British? Does the UK have an immigration agreement with the USA I am not aware of?

Last I checked, Europeans need an H-1B visa to go work there, and it's so hard to get that it's practically bound to fail. Even if you get it, it prevents you from switching jobs, so you can't negotiate and hop jobs to the top.

P.S. No place in Europe charges $4000 for a one-bedroom ;)


> 70k is borderline insane. What package do you have on top?

I had some stock options that would obviously never have been worth anything.

Can i ask what kind of companies were paying you 70k in London when you were less than 25? And to do what? And how much experience you had at that point? If you were doing generalist development on the back of a degree and four years of work experience, i'm going to cry.

> I don't get how the average could be 450, that's less than the absolute minimum

To be clear, that appears to be the average rate for Java developers, not devops. Have you ever worked in a big retail bank? Or some other huge company? You know how they're stuffed full of teams of people who couldn't code their way out of a paper bag? Most of them are contractors, and it's probably them who are billing 450.

> Then join them.

I did!

> I actually think that consulting is very easy to do by anyone and it's more gentle on > 30 years old devs

As a >30 year old dev, i got out of consulting because it was such a hard life. It's not technically difficult, but as a senior, you spend your whole time banging your head against the brick wall of client stupidity.


Whatever brings the money. I am not doing generalist development. If I write down all the specialisations I do or used to do, a tree is gonna die.

Don't compare yourself to the top and cry. There is always someone earning more. I had my time when I was always the least paid guy in the room. Arguably that's still what I feel every time I look around.

I've got a bad image of retail banks. I am fine with investment banks and other banks and non-banks (hedge funds, others) that treat their people well and can build a business out and around what I do. Retail and consumer banks are no places for programmers.

>>> It's not technically difficult, but as a senior, you spend your whole time banging your head against the brick wall of client stupidity.

Not much different from a permie. I find it a lot more tolerable when you can see the daily stupidity alongside a daily rate. Who cares what happens. You'll be on a new contract in 3 months anyway. (That being said, I really hate contracting).


Working for somebody else I would say.

My anecdotal case : I currently work for a very well known French startup. I am paid a bit under 50k€/y (and no equity).

The company really struggles to attract skilled engineers and, to be honest, there are not that many skilled engineers in a given field in the country.

I could get 70-80k€/y in some other less known French startups that do their best to attract top talent.

I got an offer from a US-based startup for a little bit under $200k/y with some perks and a good amount of equity.

If I can get a US work visa, I am going to take that offer; otherwise I will follow other opportunities, but I am not staying with this first company in the long run.


What makes you think they are not in tech?

Some are still in tech, at places that pay them, places that you may not know or have never heard of.

Little experiment: Add 50% to your top compensation, then you may soon start interviewing people who earn around that number and sometimes another 50% on bonus/stock (that your company doesn't have).

They are here and they are the people you wanted, they're not difficult to get, you just didn't know about them.

Now that you found out, you'll continue to ignore them and keep looking for a unicorn at half the price :D


The definition of "right skills" and "tech" is open to debate; basically they're working somewhere else, for someone else, perhaps with a better job title and a better standard of living. Possibly they're doing the same kind of job internally at a "non-tech" company.

If they insist on tech in SV or nothing, it's like people who fail the culture-fit criteria because they're a woman or a minority. Uber, sales, and non-tech employers, unlike tech, won't discriminate as much against women or people of color. If a white male can't get a job because of a different issue, like skill vs pay, he can turn to Uber, sales, and non-tech employers just as a black person or a woman already has to do.


They are working in tech. For someone else.


Working at underpaying, unfulfilling tech jobs. Probably ones which require a simpler skill set than they currently possess. Also, they are typically living outside the usual tech centers, and couldn't afford to move on the salaries being offered.

I'd like to see salaries raised, if only because "a rising tide lifts all boats".


I was quite amazed to see a lot of responses here. Just to give a bit of background, I founded NemCV and followed 15,000 people through their job hunt and have run Denmark's most successful "DATA" based job hunting service here for 5 years : https://www.meetup.com/get-your-dream-job/


1. SWIFT was founded as a 'Belgian Cooperative' in the 1970s by an ex-American Express executive and grew at a rate unprecedented for its era; its first 'international center of operations' was located in Virginia, co-opened by the then-governor, and nearly co-located with CIA HQ. EU authorities have confirmed that, despite political backlash and a 'SWIFT 2' nominally designed to resolve the problem, every single SWIFT transaction since at least 2001 (read: forever) has been provided in full to the US.

2. The Israeli business AMDOCS with alleged Mossad links is an outsourced and hosted billing provider for telecommunication carriers, whose client list includes large portions of the developed world.

Just sayin'.


● The two most successful and most widely used paradigms/environments for solving everyday problems, spreadsheets and shell scripting (with pipes, etc.), are the ones that receive the least attention from “industrial” and academic programmers. Lessons and strong points from those systems are being ignored.

● There are “clusters of bugs”, i.e., finding a bug in a certain module increases the chance of finding a subsequent bug in the same module. This is understood intuitively by most people, yet very few act on it by discarding modules that have too many bugs (and rewriting them from scratch) instead of continuing to sink resources into maintenance.

● While professionals in other industries use professional tools, programmers use commodity hardware and software (the kind a homemaker would use to google a guacamole recipe). :(

● Managers and programmers think that personality traits and team members' individualities do not matter, and that the “human factor” plays no role in development. Ditto for self-organization vs constantly “organizing”/policing employees.

● There were pretty cool systems back in the day (with orthogonal persistence, ability to inspect and modify any object on-the-fly, etc.). The modern ones are not “bad” either but some lessons could still be learned.

● Sometimes there are notions that bad software is created due to sales people/economic pressure but analysis of, say, build tools shows that it is not the case. ;)


>While professionals in other industries use professional tools, programmers use commodity hardware and software (the kind a homemaker would use to google a guacamole recipe). :(

i have to disagree with you here. most of the tools we use as programmers are so precise, adaptable and refined for the job that they would make your average industry professional cry.

having those tools run on a guacamole-recipe machine is only a testament to their power.

it's not just the software either; through services like AWS we have access to some of the most sophisticated and optimized computation and storage hardware on the planet.


So how many programmers do you know who use a professional keyboard, say? I don't want to provide free advertisement here so I won't mention any brands. But the ones that have the hands separated and the keys placed in recessions for a comfortable wrist position, and modifier keys under the thumbs? And those that are fully programmable, the ones that cost from 300 euro and up? I don't know about you, but I have (on my mechanical 100+ euro keyboard) 'Ctrl(Caps Lock)-Command(Alt)-Enter' remapped to 'Alt-Backspace' (in parentheses — the original key) using a piece of Lua-scripted software. If that's your idea of “refined” and “precise” then that's fine :) Moreover, most of the programmers I know (hundreds and hundreds) use a “minimalist slick” keyboard from a Californian company. It's a good keyboard for googling a guacamole recipe and a nice example of “good-looking“ industrial design. But it's not a professional tool by any means.

I don't really want to get started about those slow laptops (that have “Pro” labels on them). I don't know about you but for me their performance is disappointing. (By the way, I use a 2010 Macbook Air as my home machine so no need to call me “picky”). Yet those “Pro” machines seem to be the “default” ones in so many organizations I know. They are a good trade-off indeed between looks, portability, versatility and price. But they are nowhere near the “pay whatever it takes to scrape every last drop of performance and reliability from your tools” approach that, to me, marks how professionals choose their tools.

About software. There's a brilliant, inspirational and slightly sad article: https://www.fastcompany.com/28121/they-write-right-stuff . In the comments, I remember a person asking, “do you want to pay hundreds of dollars for a text processor written with the level of quality described?” And my reaction was, “Hell, yes! If it's a professional tool and I earn tens of thousands using it, I should be able to pay accordingly for my main equipment”. In the web development industry there's an apparent trend now to switch to “hipster” (I mostly like things hipster) text editors. I won't specify the names here, but they are usually written in Javascript and HTML and are “highly customizable”, yet I rarely see them made to perform the functions my IDE does out of the box. In the meantime, IDEs for web development are few (are there any besides the Prague-made ones?) and are not very customizable for my needs (compared even to the aforementioned lightweight editors).

Anyway, thank you for your answer, and I am glad that you focus on the positive side of things!


>In the meantime, IDEs for web development are few and are not very customizable for my needs

i'm a full stack web dev myself, whatever that means nowadays. i consider languages, the os, dbms, version control, virtualization, browsers, webservers, automation servers etc etc, to be my tools, not merely the IDE or text editor. personally i hardly need anything besides emacs and a zsh terminal on my 7y/o thinkpad t410. that said, i do understand that different people need different tools. hope you find what you need.


I work with (among other things) batteries. As I've gotten a bunch of emails about it, I suppose I am "the battery guy" here.

Li-ion battery energy density grows in spurts. Realistically, there have been a few major improvements, with very small incremental improvements in between. For example, in 2001 we were at 180 Wh/kg. In just a handful of months we jumped to 260 Wh/kg, then to 280 Wh/kg as manufacturing processes improved. In the last 11 years, however, improvements have been maybe a few Wh/kg per year, and there have been literally zero improvements whatsoever in the past four years, by anyone.

Eleven years is a very long time, especially for something that everybody just assumes is constantly, silently improving. And it's not like the battery industry stopped investing in growth: there are absolutely insane amounts of money invested in battery R&D, since whoever figures out how to make a cheap, energy-dense battery will become absolutely unfathomably rich and make Madoff-level growth look like US savings bonds.

Bonus fact: nobody really understands lithium-ion batteries. One production run might have substantially higher or lower capacity. Some batteries might explode. Some batteries are high discharge and some are low discharge. The "explanation" for all of these things is 'heat'.

Imagine that happening to any other industry (for example the auto industry.)

- "Why did my car just explode!?" "Heat"

- "Why does this car last a tenth the lifetime of this car?" "Heat"

- "Why does this car go a thousand times faster than any others?" "Heat"

- "Why is this entire production run of cars not working?" "Heat"

People say "heat" for two reasons: 1) because nobody understands li-ion batteries and 2) because yes, the real answer does have something to do with heat.

Batteries are one of the most important industries and critical to every company on HN. But the state of the art of batteries is putting together random metals with a bunch of random chemicals and watching which ones don't explode.

Imagine talking to the CEO of Boeing: "We just did our millionth test, engines made of cheese and wings made of coconut shells. It crashed during takeoff, of course. We haven't had any improvements in over a decade, but we're sure we'll get there eventually - we've spent billions on research so far."


Could you speak to the reasons why successful startups are not really being founded to develop more advanced batteries? Is the issue primarily 1) widespread lack of knowledge across the entire battery "stack", even within the industry, 2) widespread mismanagement and inefficiency of processes within manufacturing or R&D or 3) the fundamental R&D for battery improvements is just outrageously hard?

I get that it's probably all 3, but I feel as if the main problem being 1) or 2) would lend itself well to startups with extremely knowledgeable battery industry veterans.


A lot of it is because the battery world isn't very startup-y. You can hack together a prototype for an app or a website with a handful of developers, sleepless nights and some caffeine, and push that out to the real world, and show that to VCs and raise money. The battery world doesn't work like that. The scientists and engineers you hire will want to work stable 9-5 jobs with decent salaries, you won't have anything to show for years, and success is binary. 99.99% of hypothetical battery startups would have nothing to show for their research except a handful of useless patents for design processes that will never be practical. One would have the breakthrough that brings them to unlimited riches.

Another problem is that there are two categories of success: you find a simple way to make amazing batteries, or you find a massively complicated way. If you find a simple way, that's great, but it can be copied by anyone who wants money, and you'll end up like the Hoverboard - a very successful _idea_ but the company only captured a fraction of the market. If you find a complex way, it's very likely to require an expensive material or process. For example, say you need a rare metal or something that's not currently produced in large quantities. The capital you'd need to raise in order to extract or create this material could be absolutely massive. For example, let's say you got a 10,000 Wh/kg battery but it needed a substantial quantity of rhenium. Well, worldwide production of rhenium is about 40 tons/year. You'd have to create a worldwide industry for extracting rhenium, perhaps rivaling the oil industry in size. Imagine that conversation with VCs. "You want to raise $1.5 trillion? With a T?" "Yep, and we want to do it to create a battery that could make our world look like Star Trek. Oh yeah, and it might become obsolete in months if someone has a better idea."

Another thing is that there's not much to "disrupt". If someone gave me $XXX million for a battery startup, it'd basically be set up in a very similar way to the research arm of Panasonic or Samsung SDI. If some genius comes up with a way to improve the efficiency of the core research - sure, that's obviously worth funding. But the problem is that everybody is trying to get to the breakthrough directly, not invent tools to get to the breakthrough faster.

Imagine if you were in a room with 1000 people, and an authoritative source announced that there's a pot of gold buried in the hill behind the room, and it'll probably be found in a few hours. Do you spend two hours driving home, getting your shovel, and driving back? Or do you run out there and rabidly start digging? Sure, you might answer "get a shovel" now, but in the real world 999 of those 1000 people will be digging with their hands.

The last problem is that the risk, and the return, is absolutely ridiculous. With startups, say 1 in 1000 makes you a billion dollars, 600 return your money and 399 go bankrupt. Great, that's acceptable. Even worst-case scenario, you take a big loss but you don't lose _everything_.

With moonshot battery research startups, every single one will lose all your money except for one - and that one will make you fabulously wealthy beyond your wildest dreams. If the breakthrough is big enough, it would revolutionize literally every field from space exploration to aerospace to power generation to electronics to vehicles. There are so many variables you can't even start to calculate the odds.


I live in China and run startups in e-commerce and food.

China subsidizes up to 70% of the postage fee, so AliExpress sellers can sell cheap stuff for $2 with "free shipping" worldwide.

There is no such thing as "free shipping" in e-commerce; it's just included in the price. An astonishing number of people actually think it's free for the seller.

When you apply for a website license in China, you have to give root access to the Ministry of Information. Most of the time it's handled by your hoster "transparently" in the form of "license application tokens". (Yes, you can trick them if you colocate your own bare metal.)

25% is an average ingredient cost for restaurant meals.


Shipping: It's generally pricey as hell to send anything out of the US. It's cheap to send things from developing countries. This is merely economic reality (shipping costs include cost of packaging, labor salary/health/pension, rent at all trans-shipment points, distance-to-port, economies of scale, tax on private company earnings, legal and insurance overheads, etc.). Since half the world's products come from China, its government-run mail system pays no taxes, and it has huge ports and rail networks, shipping from there is very cheap. I doubt there is a gov.cn conspiracy to undermine foreign e-tailers (ready to be proven wrong!); it's just a highly efficient and centralized socialist bureaucratic mail system. (Note that it's currently being supplanted both domestically and internationally for many purposes by more trackable courier companies, of which there are many dozens already, the best of which is http://sf-express.com/ )

Root: If you do business in mainland China the government reserves the right to strongly suggest you comply with everything they want, which is usually "if anti-government stuff appears on your site, you are done, and otherwise if we give you a phone call, jump high now". This is in some ways no different to other countries in model, but perhaps stronger in presumed responsibility and penalty than some. As for root-by-VPS, yeah maybe, but citation-needed, and there's arguably not a huge difference versus seizure as performed by other countries.

(Source: ~16 years ... not claiming total accuracy, the above is just my experience/impression)


> When you apply for a website license in China, you have to give root access to the Ministry of Information.

Wow, I'd like to know more, this seems especially interesting in regard to Chinese Bitcoin exchanges. They can keep most of their bitcoins in cold storage, but still having root access to the web server means user passwords, TLs, and so on. Is it just a web server directly connected to the internet that they need to have root access to? Or do you e.g. also need to provide access to the db server used?

edit: Also what's with the license? It's possible for me to buy hosting in China without any licenses.


Countries have an obligation, via their publicly-funded postal agencies, to deliver any mail/courier item addressed to their citizens. This was fine pre-internet, as roughly the same amount of mail was sent as received on average, so countries invested roughly the same amount of manpower.

But in the case of China/AliExpress, China receives maybe 1 piece of mail for every 100 it sends out, which means receiving countries use proportionally much more manpower to distribute the mail, and this is funded by taxpayers.

So, interestingly, you yourself end up paying for China's free shipping via your home-country taxes.


The majority of commercial software writers are banging out poorly designed, fragile, inadequately tested and often borderline-unmaintainable code.

I feel that people have a kind of double-think about this; they suspect this is the case, but they still expect software to work properly.


Amen.

I work at one of the most well-reputed software product companies in the world. We pride ourselves on hiring the very best, and try to do so through an interviewing program that simulates real-life coding instead of an academic whiteboard situation.

Even amongst smart people, the bar for software quality is shockingly low. Again and again you will find ostensibly senior people who, given half a chance, will espouse all day long the importance of high-quality code, well-designed abstractions, separation of concerns, and good testing.

You can then let these people operate for a few hours by themselves and examine the result. In all but a few situations, it'll be a pile of mud that's designed counter to every one of the values they talk about. Often they're not aware of this, but it's also very common that they know it's sloppy, and don't really care. People don't do what's right, they do what's easy.

In a corporate environment, there is never a structure to identify the developers who are most likely to write good and maintainable code (it's too subjective) so that they can be put into a review capacity, and even if there were, there just isn't the bandwidth to review new additions to the depth that's required. In an ideal situation, many patches would be returned to author essentially asking for a complete rewrite, but that's difficult to justify given that building strong relationships inside the team needs to be considered.

The result is that all software developed in industry trends towards bad over time, and it appears to be as absolute of a universal constant as entropy. You can find localized exceptions, but they're often only temporarily good as eventually more people contribute more features.


The same applies to hardware design at many companies, including reputed ones. I witnessed a case where they went from an R&D prototype to a product with no quality control in the design process. Product failures in the market were followed by blame games.


Publicly known but not commonly known: (I get asked about these frequently.)

Electronic Displays Industry: LED TVs are also LCD TVs. When I explain this to people, some even push back! The picture is formed in both by the LCD alone; it's just that the backlight is built out of LEDs in what is called an LED TV.

Specifications given for power output in music systems, contrast ratio in TVs, frequency response for earbuds, etc. are mostly fake.


That's a result of having the marketing department as the ones who interact with the public. Pretty much every industry has this problem.

To get at the truth you have to find some enthusiast website and find someone there who has done enough digging to find the real technical information and specs.

Speaking of LED TVs, the next one is QDLED TVs where again it is still an LCD TV but the backlight is now improved with Quantum Dots. Confused the heck out of the public who are looking/waiting for a true Quantum Dot display.


I do marketing and AdWords for a company that specializes in local contractors. Modern online marketing is a race to the bottom that only Google wins. Fundamentally, regardless of industry, marketing is a simple logistics problem -- connect clients to service providers. But the distribution of attention is incredibly imbalanced -- the top 3 search results get nearly 50% of clicks, with the top ten results getting 90%. So what I am seeing happen for many local industries and problems is that even though there are probably dozens or hundreds of local service providers that can provide adequate service for a problem, the ones that show up on the first page get a disproportionate, overwhelming amount of business, and the rest of the providers end up fighting over scraps. Unless they too want to start shelling out 10 grand a month or more to show up at the very top for every search and get more business than they can handle. There's got to be a better way.


Isn't this the problem that Angies List/Amazon Home Services/etc is attempting to solve and monetize? For a goof I submitted a plumbing work order to Amazon and received at least three competitive bids, all from businesses I had never heard of before that are vouched for to some degree by Amazon. It seems to be either the above or Yelp/word of mouth are the only ways that contractors can be seen by prospective clients, at least until they grow large enough to have more marketing funds and return business.


Not my industry any more, but a PhD isn't about pushing the frontiers of knowledge so much as being a training program for working in academia.


I agree to a degree, but this is dependent on the field. In some fields it's about networking, in some it's about getting a foot in the door of a lab.


Every piece of commercial software ships with large numbers of known problems, limitations, and deficiencies. For old, broadly distributed software, the bug databases can easily have millions of issues, some of which have never even been investigated.


The mining industry relies on a single guess as to what an orebody looks like when planning multi-million/billion dollar mines. These guessed orebodies are usually built by inferring local values from neighbouring samples within a specified range, weighted in proportion to their distances. The method is still widely used even though it was devised to deal with the fact that, back in the day, these calculations were done by hand. We have more correct methods, but they require computers, and people don't like computers.
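
What's described here is essentially inverse distance weighting; a minimal sketch in Python, with made-up sample points, just to illustrate the idea:

    import math

    # (x, y, grade) drill-hole samples -- made-up numbers
    samples = [(0, 0, 1.2), (10, 0, 0.8), (0, 10, 2.1), (15, 15, 0.4)]

    def idw_estimate(x, y, samples, search_radius=12.0, power=2.0):
        """Estimate a block's grade from neighbours within a radius,
        weighted in inverse proportion to their distances."""
        num = den = 0.0
        for sx, sy, grade in samples:
            d = math.hypot(x - sx, y - sy)
            if d > search_radius:
                continue            # outside the specified range
            if d == 0.0:
                return grade        # estimating exactly at a sample point
            w = 1.0 / d ** power
            num += w * grade
            den += w
        return num / den if den else None   # no neighbours in range

    print(idw_estimate(5.0, 5.0, samples))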


The telecom side of media is really boring, but on the production side almost all mass market media comes from about five corporations plus or minus a couple.


Commercially available desktop automation systems are pretty terrible in terms of modern software design. Specifically lifecycle management, code versioning, third party tooling integration, and readability.

Unfortunately, they're aided by a compatibility moat built up of custom control support. Basically everything that doesn't support Microsoft UI Automation requires memory hijinks. And you'd be surprised by how much legacy software used weird third party libraries.

If you want more impressions or to chat, feel free to flip a mail at ethbro.co a g mail. Happy to talk, as god knows automation software could be improved.


In online learning for higher education: if you don't make mobile learning your first priority you will struggle to stay relevant in the market.


Can you elaborate on what you mean by "mobile learning"?

Do you mean development for mobile as a platform?

Do you mean the ability to learn on the move?

Something else entirely?


It's M-Learning. You can check it on Wikipedia for starters. It is the combination of processes and technologies that allow people to engage in formal learning programs using mobile devices. It is different from traditional desktop-based e-Learning.


In the US, how hard it is even for your insurance company to figure out if a doctor/hospital/etc. is in or out of their contracted network.

You'd think "how hard can it be, you either signed a contract with them and have it on file, or not". Well...

When the insurer signs a contract with, say, a doctor, what is happening is not "anything this doctor does for someone on one of our plans is covered by this contract". Instead, whether something falls under the contract or not depends on a whole bunch of factors, including but not limited to:

* The doctor's NPI (provider identification number, issued by US Medicare/Medicaid).

* The particular medical specialties and credentials of the doctor.

* The location(s) where the doctor renders services and the type(s) of services rendered.

* The federal tax identification number and billing entity the doctor bills as.

* The address the doctor submits on the bill.

So suppose Dr. Jane Doe signs on to your insurer's network. She's contracted as a primary-care physician, rendering services in her office at 123 Main Street Suite B, under NPI 1111111111, and will bill as Jane Doe Medical, tax ID 222-22-2222, payment to be sent to her billing office at 234 Second Street.

And you go to Dr. Jane Doe and have no trouble, until one day you get a notice back from your insurance company that suddenly she's no longer in network. She swears up and down that she's still in network with them, so it must be the insurance company's fault, right?

Well, the thing is that she merged operations with Dr. John Roe in the next office suite over, and now is billing as Doe & Roe Medical, tax ID 333-33-3333, payment to be sent to 345 Elm Avenue. And there's no contract for that!

Or she outsourced all her medical billing to MedBillCo, which again changes information the contract is keyed to. Or she also holds credentials for other types of medical practice and was rendering service of that type, maybe at a local hospital. Or she kept her tax ID and billing the same but moved her office from 123 Main Street Suite B to 123 Main Street Suite C. Or the post office realigned the boundaries of her zip code, and she "moved" from zip 11111 to zip code 11112 as a result.
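
In software terms, the contract isn't keyed to "Dr. Jane Doe"; it's keyed to the whole tuple of details above. A toy sketch (the field names and lookup structure are invented for illustration, not any real claims system):

    # contract on file: keyed by everything, not just the doctor
    contracts = {
        ("1111111111", "222-22-2222", "123 Main Street Suite B", "primary care"):
            "in-network",
    }

    def network_status(npi, tax_id, service_address, specialty):
        key = (npi, tax_id, service_address, specialty)
        return contracts.get(key, "out-of-network")

    # same doctor, same office -- but she now bills as Doe & Roe Medical
    print(network_status("1111111111", "333-33-3333",
                         "123 Main Street Suite B", "primary care"))
    # -> "out-of-network", even though nothing visible to the patient changed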

This sort of thing happens all the time.

I'm told that LexisNexis once came to an (I think optimistic) estimate that the half-life of medical provider data is 18 months. So gather up all the information on your network of doctors and hospitals, carefully vet and double-check it, make sure everything is full and correct and up-to-date... and 18 months later half of it will be wrong, just due to the background rate of changes to office locations, credentials, billing entities, etc.

So after working for a little over a year at a company that has to deal with this, I am not surprised at all when I hear someone complain that "it's the same doctor I went to last time, nothing changed, they're still in network, so why can't the insurance figure that out?" Nothing visible to the person complaining has changed, sure. But that tells you nothing. I am more often surprised that anyone is ever able to correctly determine in- or out-of-network status; being able to do it even a fraction of the time is frankly a minor miracle, and requires a whole lot of people toiling away behind the scenes.

For the record: I work for a company in the Medicare space, and we're required to revalidate all our provider information at least once every 90 days, precisely for this reason. Also, you don't want to know what the industry average is for correctness of printed provider-directory booklets. Even if it's sent off to the printers the day the up-to-date data has been validated, some not-insignificant proportion of it will already be wrong by the time it arrives in someone's mailbox, just because of how often and how quickly the information changes.


Education is an expensive burden on society that has not been proven to be worth its cost in terms of return on investment.

Funnily, the only people asserting the value of diplomas... are PhDs at Ivy League business schools... selling expensive diplomas...

Education is probably a large-scale scam in its current form: too long, too expensive, counter-productive and overrated.

Cf. Henry Mintzberg's essay on why MBAs, for instance, are a poor choice for fostering innovation.

"M.B.A. programs train the wrong people in the wrong ways with the wrong consequences," said Henry Mintzberg, a management professor at McGill University in Montreal. "You can't create a manager in a classroom. If you give people who aren't managers the impression that you turned them into one, you've created hubris."


> Education is an expensive burden on society that has not been proven to be worth its cost in terms of return on investment.

As a person from a region of the world with a high level of illiteracy, and having witnessed firsthand the ROI of an uneducated citizenry, I would question this assertion.


I believe he is referring to postsecondary education specifically. In North America, K-12 exists to make one a functioning member of society. For what flaws it may have, it has overall shown to be beneficial.

With that measured success, we got the idea that once you were a functioning member of society, you could become a functioning member of the workforce simply by continuing in even more years of schooling. But it's not clear if it has really worked out. In 1970, approximately 10% of the US population had a bachelors degree or higher. In 2017, that number is closer to 30%, yet incomes have remained stagnant throughout that entire period and job quality has declined. Not what you'd expect from the promised higher incomes and better jobs that people were willing to spend large amounts of money for.


I was indeed referring to post secondary education.

If I weren't trolling, I would admit a soft spot for Finland's focus on early education and the amazing results they have achieved doing so.

https://www.theguardian.com/education/2016/sep/20/grammar-sc...


I honestly can't understand how people spout some of this crap.

I've also grown up in a region where education was overlooked, and living with complete fucking morons is not fun. Especially when these morons themselves think that education is worthless, which is a snowball effect meaning many of their children will be stupid, as well.

On the other hand, it's easier to control them if you want to. Like they did in the Middle Ages and earlier. Yeah, let's go back to that.

The ROI is as indirect and long term as it gets, but it's real and it's massive.


You have to compare comparable situations: what could have been done differently with the same resources involved?

Having an educational system is a sign that the society is wealthy and can afford it, but doesn't mean it's the best use of resources involved (thus it might have a bad ROI).

You said education was overlooked where you come from; was there an educational system there?


It's worth distinguishing between basic literacy (including basic math skills) and graduate-level specialty degrees.

For one, if the labor market in a particular specialty becomes oversaturated, does that mean existing grads need to spend years retraining?


He is talking about College/University education, and companies sometimes requiring advanced degrees for basic jobs.

High school is and will always be needed. Also, for some careers university education is obviously necessary. Just not always.

However, this probably only applies to the USA, or other places (South America as well) where students get into high debt to get degrees.


I think it's a matter of value being shaped by under- or oversupply.

Speaking of illiteracy, there's an urban legend (not sure if true; anyone with access to data, please check it) that the literacy rate actually dropped in the US throughout the 20th century, coinciding with the increase in education "supply".


No data exists for that because the definition of illiteracy depends too strongly on the axe being ground.

Everything from 1% to 66% illiteracy has been pitched to people.

It does seem that illiteracy clusters with citizenship status, racial and other demographics, the simple age distribution of a geographic location, and socioeconomic level, to the extent that if you successfully filter those effects out you end up with a boring flat line at 100% literacy, which ends up meaning nothing. The largest factor is that literacy graphs and maps are basically a restatement of immigration data.


It depends a lot on which parts of education you're talking about. The first 9-12 years are required so that kids have something to do while their parents are at work. Learning stuff and interacting with other people is a nice way to spend their time.

From that point, it depends on the subject you decide to study. It seems to me that medicine students get a lot of real life practice during their training, which makes the school almost irreplaceable.

In other areas, like programming, you're better off staying at home and watching online lectures and making your own pet projects to learn, as long as you have enough self-discipline to keep yourself on track. There are tons of resources you can learn from and you can practice as much as you want. I did it and it has been working well so far.

Overall, I wouldn't say we should ditch education at all, but we should seriously rethink the way it works. We can do much better than this.


>>> The first 9-12 years are required so that kids have something to do while their parents are at work.

They also learn to read, write and count. It's a rather useful skillset to have :D


> ...you're better off staying at home and watching online lectures and making your own pet projects to learn

While I agree that you can get pretty far with self-study on small projects, the most important parts of software engineering revolve around communication, especially with other developers and with expert tools. It's hard to get better at communication without, well, communicating.

Of course, one can get a lot of experience by contributing to open source projects or finding other ways to work on group projects.


I completely agree with the need to communicate and program with other people. From my experience, computer science degrees severely lack this communication training. You do ~2 small group projects per semester. The group splits the project into parts and everyone goes and does their own part alone.

There is no teamwork, source control, tests nor anything like that.

I seriously hope other universities do it differently and I just attended the wrong one.


I had team projects, somewhat resembling what you describe. No, I had no source control or testing training at all. I chalk that up to more academic programs being oriented towards computer science and mathematics than craftsmanship, engineering, and project management concerns.


s/education/higher education in liberal arts and business/g. Next time you need surgery, you probably don't want your doctor to have been self taught.


Having a father who was a surgeon (and quite good), I have enough real-life feedback to say that apprenticeship/internship in medicine may matter more than the years of theoretical study, which are actually used mostly for selecting the «right persons».

Not to say theoretical education is worth nothing; I'm saying the balance is tilted way in favour of theory, which costs less and makes universities earn more.

Let's call it corporatist behaviour.

You don't want a surgeon who was good at school; you want a surgeon who, when his hospital is poor, will use his own earnings to buy medical journals to stay up to date, say fuck it to the hospital, and take unpaid vacation to go to conferences to stay up to date.

You want a surgeon who doesn't care about his diploma or titles but stays focused on his craft and will keep learning without caring about what society legally requires him to do in order to practice.


If resources were infinite, I wouldn't have any concern about a self-taught doctor. If they are smart enough to perform surgery, they're smart enough to figure out how to do it.

The challenge the real world presents is that the resources needed to reach that level, such as access to specialized equipment, is not in reach of the average person. It's not like a software development career where a widely available personal computer that can be purchased for hundreds – even tens – of dollars pretty much sums up everything you need to become among the best in the world.

What the schools have been able to provide is a place where people can pool their resources to gain shared access to infrastructure. Education comes as a natural consequence of effectively utilizing that infrastructure, of course.


Smart does not mean experienced. I don't say this often on HN, but you really have no clue what you are talking about here.


Where, exactly, do you think experience comes from? By actually utilizing the aforementioned infrastructure that is not accessible to most people. Nobody becomes a surgeon just by watching a lecture in a classroom. If it were that simple, surgeons really could just watch some Youtube videos. I get the impression you didn't actually stop to take in what I wrote.



That we are actually quite close to the development of a panoply of working rejuvenation therapies capable of significantly extending human life spans, were the right lines of research and development just more aggressively funded than is the case at present.


Head transplants are actually considered feasible today. http://www.bbc.co.uk/newsbeat/article/37420905/the-surgeon-w...


I've seen your posts before (I'm a lurking life-extensionist so it's difficult not to notice). My question to you:

Are any credible organisations offering even rudimentary rejuvenation therapies? If yes can you list them, with links and costs if possible?


The title question contains the word 'fact'.


Name and shame, with links to the promising methods?


ACH transactions (bank to bank) cost 1/10000th of a dollar. PayPal charges 3% plus 29 cents for branded ACH.
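
To put rough numbers on that for a $100 payment (illustrative arithmetic only, not exact PayPal pricing tiers):

    amount = 100.00                       # a $100 payment
    ach_cost = 0.0001                     # ~1/10000th of a dollar per ACH transaction
    paypal_fee = amount * 0.03 + 0.29     # 3% plus 29 cents
    print(round(paypal_fee, 2), round(paypal_fee / ach_cost))
    # -> 3.29 32900  (roughly a 30,000x markup over the underlying rail)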


I'm not sure if you're being sarcastic, but aren't ACH transactions just an SFTP upload?





