Hacker News new | comments | show | ask | jobs | submit | best comments login

When I was a CS prof, many, many years ago, our undergraduate lab had Macs with floppy disks. I asked the university to pay for installing 10 MB hard drives in the Macs. I was asked to present my case to the dean's council. At the meeting, I said that the students used the floppy to load their development environment. I said that, with a hard drive, it took 10 seconds to load and be ready. With the floppy, I said it took 30 seconds. Then I said, "That does not sound like much difference, but this is 10 seconds ...." I paused and looked at my watch for 10 seconds. Then I said "And this is 30 seconds" - again I looked at my watch. At the 20 second mark, the VP Academic (chair of the meeting) said "You have your money".

Some of the comments here feel weird.

Elementary OS has picked a specific niche, and it's arguably the best distribution in it. It has a really well-designed and consistent UI experience, and you can't break it. I think most of you just don't realize how difficult Linux is for people who barely understand how to use a Mac or Windows machine.

Furthermore, Linux is a world where mainstream distributions still release with horrible UI experiences: numerous typography mistakes, icons of different sizes, and grid-alignment imbalances everywhere.

Like check out this Mint (grey theme) screenshot: https://upload.wikimedia.org/wikipedia/commons/thumb/4/43/Li... and compare it with elementary OS here https://news-cdn.softpedia.com/images/news2/elementary-os-0-...

Mint is great, but seriously. Look at that logo render. Even worse, look at the start bar. Every single text label and logo has a different height. I mean, how do you even do something like that unintentionally?

I am using XFCE right now, and it's great because it's much faster than KDE or Gnome on this old laptop. But it sure isn't a pretty UI. I know an MSc designer, and she claims using my laptop makes her physically sick and dizzy. I don't care as much, but I can see the point. Everything is misaligned: in the start menu, the task bar, the apps. In the window bar, the buttons and the minimize arrow aren't the same size. I mean seriously, whoever did this just did not care about UI.

I don't think ElementaryOS is for everyone. If you have any interest in non-standard repos, recent kernels or doing stuff in commandline, you are just better off elsewhere. I understand their choices, but I don't use it because of how they do the app store, among other things.

But if you just want a computer that runs, looks good and doesn't break if you do X, then I think ElementaryOS is the #1 choice in the Linux world, and we should be thankful that it exists.

> Ms. Brown said a lot of students criticize Facebook and talk about how they would not work there, but ultimately join. “Everyone cares about ethics in tech before they get a contract,” she said.

This describes what I've observed perfectly. There are very few companies that make new-grad offers comparable to Facebook's, and career advancement opportunities and working conditions at Facebook tend to be better than at the other 'Big-N companies'. So it's hard for a new grad to say no to a Facebook offer, even if they're morally opposed to its products.

It's a "dead dove do not eat" feeling, but I kind of hoped I'd open the HN comments on this one and _not_ see a bunch of engineering solutions.

They all miss the point of the story, which is crushingly relevant to startups:

If you build a company with people you like and love, and circumstances outside of your control force it to scale in a hurry, those people you like and love are going to get hurt in the process.

That's not an engineering problem to fix. Stanich's didn't need an algorithmic order-filling solution; it needed to lose the people that made it Stanich's. It was doomed no matter what. It was unscalable; its unscalability is what imbued it with what Alexander liked about it.

The story begins and ends with Stanich's parents and why they started the business in the first place, for a _massive_ reason, one that seems to have gone over the heads of most HN commenters (and I'll bet also most people reading it via HN), and that's at least as depressing as the story itself.

I somehow decided I needed to cheat to pass a certain exam because I was basically crap at memorizing stuff. So I used an analogue wireless headphone, an induction loop around my neck and a mobile phone. Since I lacked an accomplice to dictate, I read aloud the hundreds of pages and recorded myself, careful to preserve and properly serialize things like complex formulas.

This was before the era of iPods and SDCard players, so I had my mobile phone in a setup where I would call back another phone connected to my Pentium MMX 233MHz at home, that ran a sort of audio directory that would playback a certain lecture recording I would select from the menu, using DTMF tones.

I had a small keyboard sewn into my sleeve that connected to the customized mobile phone via a DB9 plug and then to the numeric keypad, allowing DTMF codes to be sent by gently and invisibly moving my fingers. The whole setup was hideous and had its own dedicated jacket, with wires, phone, keyboard, audio amplifier, neckloop, the earphone... a complete cyborg for academic fraud.

Back on the PC side, I wrote a C++ application from scratch that would capture audio via the soundcard using Windows Wave API, decode the DTMF pulses using a couple of IIR filters then navigate the menu and playback the required file to the mobile phone connected to the soundcard. The C++ program and menu system was scripted using an .INI file that defined the structure with links to various ADPCM-compressed .wav files that represented the menu headings or the leaf content itself (a good structure was necessary to quickly access the correct lecture after receiving the exam subjects).
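(Aside, for the curious: decoding DTMF with "a couple of IIR filters" usually means something like the Goertzel algorithm, a single-bin IIR filter evaluated once per candidate tone. The original was C++ against the Windows Wave API; the Python below is only a rough illustrative sketch of the idea, with all names my own:)

```python
import math

# DTMF encodes each key as one low-group plus one high-group tone (Hz).
LOW = [697, 770, 852, 941]
HIGH = [1209, 1336, 1477, 1633]
KEYS = "123A456B789C*0#D"

def goertzel_power(samples, freq, rate):
    """Signal power at one frequency, via the Goertzel IIR recurrence."""
    coeff = 2.0 * math.cos(2.0 * math.pi * freq / rate)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

def detect_key(samples, rate=8000):
    """Pick the strongest low and high tones and map the pair to a key."""
    row = max(range(4), key=lambda i: goertzel_power(samples, LOW[i], rate))
    col = max(range(4), key=lambda i: goertzel_power(samples, HIGH[i], rate))
    return KEYS[row * 4 + col]
```

Feeding detect_key about 50 ms of a synthetic 770 Hz + 1336 Hz tone should come back as "5".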

Work-wise, it was a lot more difficult than putting the effort into memorizing the stuff, but I rejected memorisation on principle; that's not what a university should be about. The whole thing turned out to be a massive learning project, but I obviously could not talk about it at interviews. This is the first time I've mentioned it to anybody.

I used the setup for two exams, which I aced. I was never caught, but it was nerve-wracking to use in close proximity to a teacher.

Maybe I’ll sound like a bitter old techie, but it’s absolutely wrong to have someone with no direct experience managing something that is technical.

If you can’t code, you shouldn’t manage coders. If you’re not a lawyer, don’t run a law firm. If you haven’t been in charge of a class of kids, you shouldn’t run a school. And that’s despite the day-to-day of these management roles not touching code/clients/kids.

The experience gained in a few years on any ladder is enough to appreciate most of how people in those fields think.

Have you ever tried working for someone who didn't have relevant experience? You get people deciding that they don't need an issue tracker, let alone a code repo. You get people who think paying Google 5 bucks a month is not worth it; they'd rather have their own email server. And it's not that there's never a case for having your own server, it's that the case is never made in a technical way (e.g., we want security / uptime / whatever). The techies end up having to translate complex reasoning into something a layman could understand, or at least pretend to. A lot of time is wasted explaining things. And then when there's feedback - and most people cannot resist the temptation to act like they're contributing - it only makes sense to the non-technical staff, while the tech people are trying to implement whatever crazy modification it is they've been given.

What these people tend to do is make everything a management issue. So management, just like politics, ends up having its own ladder. Relevant experience for being a health minister is having been an MP. Relevant experience for managing a code department is having managed the interns.

It's massively corrosive to let this continue.

And before someone makes this argument, it's perfectly possible for techies to do the managing and politics.

You're right - imperial is far superior for precise building or manufacturing work.

Take drill bits, for example. Obviously it's much easier to figure out that 11/32" is less than 3/8". Or is it more? No, I'm pretty sure I was right the first time. The metric ones with their 5.5mm, 6mm, 6.5mm sequencing are just too complicated to work with, in comparison. And half a millimeter isn't very precise - it's much bigger than 1/64". Well, a bit bigger. Let's not get into tenths of millimeters.

And at larger scales, of course, base 12 is much easier when it comes to dividing distances. Taking a distance of 2'7" and dividing it by three in your head is much easier than dividing 79cm by three, because... well, 2' divided by three is 8", obviously. If you need to be sure, just tap it into a calculator. That supports base 12...

Anyway, you'll quickly determine that it's 10 1/3", which is much more precise than 26.3333cm. Now I just need to subtract the radii of these two 5/16" holes from that, which is easy - imagine trying to subtract 8mm from 26.3333cm! What folly.
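(If anyone actually wants to check the arithmetic being mocked above, Python's fractions module does it exactly; a throwaway sketch of the numbers from the comment:)

```python
from fractions import Fraction

# 2'7" is 31 inches; a third of that is exactly 10 1/3".
third = Fraction(2 * 12 + 7, 3)

# 11/32" really is less than 3/8" (= 12/32").
assert Fraction(11, 32) < Fraction(3, 8)

# Subtract the radii of the two 5/16" holes (5/32" each).
remainder = third - 2 * Fraction(5, 32)
print(third, remainder)  # 31/3 481/48
```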

99PI's coverage of forest fire policy and design was really provocative:


1. Our entire approach to fighting forest fires may just be making them worse.

2. Smart and simple design decisions can save houses.

A house can be thirty feet from an entire forest on fire and never burn down.

We learned this through hardcore experiments in Canada where they built homes to test and lit forests on fire nearby.

One of the immediate takeaways was to change roofing materials to resist embers, but there are other options for materials and landscaping that could make homes basically impervious.

The problem is probabilities. Fires are common, constant. They are exceedingly rare in any one location though. So fire resistant design is no one's urgent problem, fires are always something that seems like it will happen to somebody else.

All I want is to be able to give companies my money in exchange for not being spied on, manipulated, or locked into some walled garden. I can't for my OS - I have to use free gnu/Linux because everyone else treats me like crap.

Is my money not good enough for them? This is getting absurd

> 4. Smart people get bored easily. Being smart is not exactly the same as being curious, but if you have both these qualities you might find yourself becoming easily bored with executing the same behaviors over and over. Some types of success stem from creativity, but other types come from becoming an expert in a niche and performing a set of behaviors repeatedly. If you’re smart, curious, and have a love of learning, you might find you quickly lose interest in anything once you’ve figured it out. The execution side of performance might bore you, and you’d rather constantly be learning new things. This can end up being less lucrative than finding a niche and repeating the same formula, but that might seem too boring or unchallenging to you.

I've been wrestling with this for most of my career so far. I think it's all about striking the right balance. If you're not constantly learning, you will stagnate. At the same time, jumping continuously from learning one new thing to another (esp. if the things are not very inter-related) can spread you too thin: you need to "go deep" on some things to become an effective/valuable contributor.

I sometimes feel resentful of the amount of time I've spent as a software engineer dealing with what I often feel is boring or pure B.S. (e.g. almost everything other than designing/building some novel complex system from scratch). But in reality looking back I see that a lot of that sh*t-shoveling has actually made me much better and wiser at my profession, despite how mind-numbing and boring it often was. So I'm trying to keep that perspective to get me through those really dull days when I want to just rage-quit and move to a rural commune :)

The article tries quite hard to link the falling demand for oil to the emergence of electric vehicles. But looking at the halving of gasoline usage in Italy, I see that basically all of it happened prior to 2013, long before the Model 3 was a thing.

I found this interesting site [1] where you can see the gasoline consumption for every country between 1991 and 2012. And all big EU countries I looked at (Germany, France, UK, Italy) exhibited a steep decline between 1999 and 2012. This can't have anything to do with electric vehicles. Most likely it's the continuous improvements in car efficiency.

To check this, I went to edmunds.com [2] and compared the same car between its 2004 and 2018 model years. Here's what I got for the highway mileage for 3 random cars (in all cases I looked at the most basic model):

- Mazda 3: 2004: 29 mpg ; 2018: 37 mpg

- Toyota Camry: 2004: 29 mpg; 2018: 39 mpg

- Ford Explorer: 2004: 19 mpg; 2018: 23 mpg

That's an improvement between 21% and 34%. Not bad at all.

[1] https://www.indexmundi.com/energy/?country=de&product=gasoli...

[2] https://www.edmunds.com/mazda/3/2017/sedan/
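(The improvement figures are just (new - old) / old; a quick sanity check with the mpg numbers above:)

```python
# Highway mpg for the base model, 2004 vs 2018 (figures from edmunds.com).
cars = {"Mazda 3": (29, 37), "Toyota Camry": (29, 39), "Ford Explorer": (19, 23)}

# Percentage improvement over the 2004 figure, rounded to whole percent.
improvement = {name: round(100 * (new - old) / old) for name, (old, new) in cars.items()}
print(improvement)  # {'Mazda 3': 28, 'Toyota Camry': 34, 'Ford Explorer': 21}
```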

> Ultimately this:

> FILE * test_file = fopen("/tmp/test.txt", "w+");

> Should become something like this:

> create file /tmp/test.txt for input and output as test_file

> Syntax highlighting should work instead of syntax notation just fine.

I couldn't disagree more.

The first example is easily scannable visually -- a variable is being created from a function call. I can tell because my eyes immediately notice the middle "=" first (easy to scan without reading) and the "(...)" in the second half (also easy to scan).

The second example is all words without punctuation, which requires me to read the entire thing word-for-word to figure out what it's doing.

We certainly shouldn't aspire to regex-like syntax for programming... but we also shouldn't aspire to prose. A healthy balance of words, punctuation, and whitespace/indentation helps to let our eyes understand the structure of code at a glance.

I'd argue that C-style syntax actually achieves this quite well, although I'm also a fan of named (as opposed to ordered) function arguments.

It's still too confusing. Too much terminology, too many settings. 3 pages and over half a dozen screenshots to explain how to make a bucket private. Too complicated.

Google Drive folder permissions are easier. Phrases like "Anyone with the link can view" are understandable. Phrases like "Block public and cross-account access to buckets that have public policies" are not.

Hide the fine-grained control in an "Advanced" panel, for those who really need it.

I've built a small tool that I distributed as "free for personal use, contact me for a commercial license", without technical enforcement, not even a nag screen.

I made one (1) sale, despite receiving e-mails from several people thanking me and saying they're happily using it at work. That sale was to a company that wanted customization, and they only actually paid the invoice when they asked for another round of customization and I pointed out that they hadn't paid the last one.

Donationware does not work for companies, I think - the bureaucracy required to make money move from the company to you will keep people from doing it even if they think you deserve it. If it is labelled as voluntary instead of a legally required license fee, it will also be hard to make it happen.

If you're targeting companies, and want to do a shareware model, you should:

* Make it easy to buy (with credit card etc.), but also provide a contact for volume licensing. If you're lucky, this allows employees to pay you for your software without having to go through approvals.

* Make it hard to use permanently without buying (beyond just a nag screen, e.g. blocking the save feature once an expiration time is reached)

Your goal isn't to convince someone to pay for the software. Your goal is to convince the person sitting in front of the computer that dealing with the bureaucracy to pay you is easier than not dealing with it, and if given the choice between a nag screen and the bureaucracy, the nag screen is easier to deal with.

The actual blog post is much more informative: https://blog.verily.com/2018/11/update-on-our-smart-lens-pro...

Sounds like they couldn't, with today's technology, measure glucose levels to the accuracy required for medical devices, but they will continue working on measuring other things with those lenses.

I don't know why the article takes such a negative tone. It sounds like they tried something and demonstrated that their specific solution didn't work. Most research experiments don't.

It seems like one in every dozen threads or so over the last year has bizarrely cantankerous / dismissive replies, with a few lonely voices asking: when did you all get so unhappy?

Tulsa's initiative here is really, really good. It doesn't matter if you want to live in Tulsa, specifically (fine! don't move!), and it doesn't matter if the incentive is high enough to make it Worth It to You Personally.

What matters is that somebody has figured out that the dynamics of the Amazon HQ2 bidding war (roundly critiqued elsewhere, but I'll leave that for another thread) apply to individual remote workers as well --- in other words, potentially to basically everyone reading HN.

Quit whining, and start celebrating. Get this in front of your own city. Show them that there's an opportunity. There has never been a better time to be in software.

I would rather have some CSS to make the basic HTML Gmail look like the classic Gmail. Adding CSS to the new Gmail doesn't make it any faster, which is my biggest problem with it. The basic HTML version runs really fast, but it could use a bit of CSS to make it look a bit nicer and add a little more whitespace (at least on my 4k monitor it's somewhat cramped).

The actual text of the resolution, for those with low tolerance for dumbed-down PR releases:


What's worse is that international companies have been simply giving their technology to the Chinese for years and years.

In order to do business in China, non-Chinese companies must partner with a Chinese company. The international company shares its IP with its Chinese counterpart, and the Chinese counterpart in turn shares the IP with its partner, the Chinese government. The Chinese government takes the IP and shuffles it to the company or companies best suited to exploit it. This has been taking place for as long as China has been open to international business.

International companies, in a rush to get access to the largest single market in the world, have freely given away their IP, because they didn't think the Chinese could ever catch up. Companies are now moving partnerships away from China, and it's forcing the Chinese to steal the IP in order to keep their edge.

I try very, very hard to avoid buying products made in China. I'm OK with every other country in the world, except China.

I would really hate to see anything bad happen to GCP after the changes in leadership.

Based on my own experience as a CTO: after being an Azure shop for a year, we've migrated ~50-100 VMs to GCP, and I love the GCP products.

GCP is:

- Simpler to use

- More tailored to people with Linux environment

- Leader in K8S

- Has good support

- So much cheaper (in our case we saved ~60%)

- Has great UI and understandable primitives.

My only pet peeve is that exploring your spend is practically impossible unless you're a BQ person who can work directly with report exports.

PS: We're building our future infra on K8S to allow us to migrate more easily to a different cloud if something goes awry with GCP. I really hope there won't be a need to migrate back to Azure, with its arcane and high pricing, strange UI, and worse tooling...

If true, this suggests that previous articles about assange being 'practically free to go', and 'just staying in the embassy for attention' were misinformed.

Once, in the IE6 era, I created a vertically and horizontally centre-aligned login page that also worked in Mozilla.

This reminds me of a different "rule of 3": If you want to compare two things (e.g., "is my new code faster than my old code"), a very simple approach is to measure each three times. If all three measurements of X are smaller than all three measurements of Y, you have X < Y with 95% confidence.

This works because the probability of the ordering XXXYYY happening by random chance is 1/(6 choose 3) = 1/20 = 5%. It's quite a weak approach -- you can get more sensitivity if you know something about the measurements (e.g., that errors are normally distributed) -- but for a quick-and-dirty verification of "this should be a big win" I find that it's very convenient.
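A quick way to convince yourself of the 1/20: count the orderings exactly, then simulate the null hypothesis (both samples drawn from the same distribution). A purely illustrative sketch:

```python
import random
from math import comb

# Exact: of the C(6,3) = 20 equally likely interleavings of three X's
# and three Y's, only XXXYYY puts every X below every Y.
assert comb(6, 3) == 20

# Monte Carlo under the null: X and Y from the same distribution.
random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(3)]
    ys = [random.random() for _ in range(3)]
    if max(xs) < min(ys):  # all three X's below all three Y's
        hits += 1
print(hits / trials)  # ≈ 0.05
```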

I get that we don’t like Facebook, but isn’t this a bit much? Are they worse than google or amazon? What about people who work for Coca-Cola? What about the people who make mobile games and loot boxes for Blizzard?

Where is it ethical to work?

I mean, I've been in public service for decades, so I know a thing or two about choosing idealism over money, but that's not for everyone, and I frankly don't think Facebook is really that much more evil than around 90% of the hundreds of software companies we deal with.

Like we recently ordered a system for abused children's journals. A nationwide bidding process landed in a 120 million Danish kr deal, for a piece of software that 30 municipalities built an equivalent of on their own for 2 million Danish kr a few years back.

So some company is making 118 million because the world is rotten. That company is the most popular tech destination for newly educated CS grads in my country by the way.

See, the US wasn't being stubborn in not adopting metric, it was just waiting patiently for it to progress from beta to 1.0. :)

(Fun fact: all the US customary units have been officially defined in terms of their metric counterparts since 1893: https://www.nist.gov/sites/default/files/documents/pml/wmd/m...)

This is also a national security risk. A nation state interested in war will have an easier time shutting down a small handful of things simultaneously to quickly cripple our economy.

For example:

- Shutting down only Visa and Mastercard would disable the vast majority of credit cards.

- Disabling debit cards or withdrawing funds from JPMorgan, BofA, Wells Fargo, and Citi would disable the majority of banking clients in the US.

- About 40% of US air travel could be stopped by hacking American and Southwest.

We've already seen similar large scale attacks on large companies:

- In the 90s and early 2000s, Microsoft's weak security allowed worms to cripple Microsoft Office and Windows.

- 100 million JP Morgan accounts were compromised by 3 people in 2015.

Repo maintainer here.

...can someone explain how the repo keeps resurfacing? I haven’t promoted it in a long time. (Looking at the repo traffic, it recently spiked on the 6th, but nothing since then.)

Part of the problem is that hospitals don't disclose prices before performing services.

So patients must agree to a procedure -- yet only afterward find out how much it costs. That price can be shocking, and the patient never agreed to pay it. No wonder it becomes hard to collect from them.

In the rest of the economy, we agree on a price beforehand. That makes customers more likely to pay.

The medical system, however, has given up on disclosing prices to patients, because everything is paid by insurance. And now we have doctors who say:

> “It's harder to collect from the patient than it is from the insurance,” said Amy Derick, a doctor who heads a dermatology practice outside Chicago.

If you want people to pay, then make an agreement with them before you perform the service!

The development of online software over the last 10-15 years is a case study in feedback loops. It really has flipped the paradigm.

For Lotus Notes, success is when a user makes the software do what she wants, like sending an email. In the modern equivalent, the software (designer) succeeds when the user does what the software wants, like sending an email.

