
There's been a follow-up paper from three scientists whose independent G-measurements reach some of the lowest-claimed uncertainties.


This is a dangerous road to travel. Any dataset can be reordered to yield a monotonic result.

Certainly. But in this case it makes sense to reorder the dataset by descending physical volume, because the nearly constant multiplier between the steps becomes apparent.

Once you allow that, you can put anything in that graph. For example, that jump in the graph between the Cray and the AlphaServer calls for a product of around 3 cubic meters. I conclude that the Apple car will be more like a people mover than a car :-)
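
To make the danger concrete, here's a toy sketch (Python, with invented volume figures, not real product specs) of what the reorder-and-extrapolate reasoning amounts to:

    # Sort any set of product volumes descending and the plot is
    # monotonic by construction; the numbers here are invented.
    volumes = [1.4, 0.011, 0.13, 0.0009]   # hypothetical machines, any order
    ordered = sorted(volumes, reverse=True)
    ratios = [a / b for a, b in zip(ordered, ordered[1:])]
    print(ordered)   # always monotonic, whatever the data
    print(ratios)    # read a "nearly constant multiplier" into these at your peril

Any gap in the ratios can then be "filled" by a predicted product, which is exactly how you get an Apple car out of a scatter plot.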

But the interesting thing is that the iPad was created before the iPhone and shelved to make way for it, so I'd say the modified graph reflects, if not the release order, at least the order of innovation.

Well, there probably was a 3 m^3 product around 1989...? Maybe something like the Ardent Titan:


The Apple products do line up at tidy intervals, and there aren't any more of those.

A self-driving car will have to start a new graph. A few meagre teraflops won't be nearly enough for that application, I think!

When the article refers to 'graphics-arts cameras', what do they mean?

Is there a spectral notch filter applied to the CCD, or is the article a troll?

"Scanning in black-and-white makes it possible for the non-photo blue still to serve its original purpose, as notes and rough sketching lines can be placed throughout the image being scanned and remain undetected by the scan head."

Any black-and-white scanner should have a spectrally flat response, picking up blue just as black-and-white photographs do: they render the blue sky as grey, darker than white.

It's entirely possible that older lithographic film didn't have much response in the blue, but there's really no way that a modern imaging system won't pick it up.

What am I missing?

Edit: Experiment is the arbiter of truth: I took a picture of the screen with my digital SLR. As expected, every color swatch in the article is blue. Desaturated the RAW image. Looks grey.
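
For anyone who'd rather check the arithmetic than photograph a screen: taking #A4DDED as the swatch (an assumed value from the article's infobox; any similar cyan behaves the same) and the standard Rec. 709 luma weights for a flat desaturation:

    # Desaturate non-photo blue with Rec. 709 luma weights.
    # The #A4DDED swatch value is an assumption from the infobox.
    r, g, b = 0xA4, 0xDD, 0xED              # 164, 221, 237
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b
    print(round(luma))                      # ~210 of 255: light grey, not white

So a flat-response sensor renders the "invisible" blue at about 82% brightness, plainly visible, which matches the SLR result.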

The "graphics arts cameras" they are referring to are pre-digital. They looked like this: http://www.forgottenartsupplies.com/?what=artifacts&image_id...

The goal was to compose a layout into a single image.

You created a layout by literally cutting and pasting things onto a board. Then you placed that board in the area at the bottom and took a picture of it that was transferred to film loaded in the top.

You're right that the film was special, but it's the other way around from how you were thinking. The film was not sensitive to red light. To this film, red is "black" and cyan or blue is "white".

Why this was useful:

- You could open the box of film (it came in sheets) in a room that was darkened except for a red bulb, without exposing it.

- You could use overlays of transparent red material (rubylith) to mask things precisely. Even though you could see through to the layer below, the camera would see it as all black.

- And, as the article mentioned, you can add notes to the layout with blue pencil and they would be invisible in the transfer. We always called this "non-repro blue" though, as in, the camera wouldn't reproduce it.
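
If you're curious what the red-blind film "saw", you can emulate it digitally by keeping only the blue channel of a scan and hard-thresholding it. A minimal sketch with Pillow; "pasteup.png" is a hypothetical scan, and the threshold value is arbitrary:

    # Emulate red-insensitive, high-contrast litho film: red ink reads
    # as black, non-repro blue reads as white. File name is hypothetical.
    from PIL import Image

    scan = Image.open("pasteup.png").convert("RGB")
    _, _, blue = scan.split()                             # blue channel only
    litho = blue.point(lambda v: 255 if v > 200 else 0)   # litho-style contrast
    litho.save("litho_view.png")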

And just to add a couple things to your answer:

-- Litho film was also very high contrast, so everything pretty much came out black or white. (Photos weren't actually reproduced as greyscale but rather as a set of larger or smaller black dots using a halftone screen. This still applies when things are printed.)

-- Because litho film was sensitive to blue, the non-repro blue writing on the white paper would, like the white itself, be an exposed part of the image. This results in a black area of the negative where silver halide has been turned into metallic silver. This black area would then become white again when the negative was used to create a printing plate.

Kind of strange for the info box to specify a color used in the pre-digital era using a digital sRGB triplet.

Yeah, I expect that's just someone with a mania for Wiki-standardization. It's not a precise shade; any cyan-ish color would do. In practice non-repro pencils and markers varied from sky blue to a rich turquoise.

The article seems confused - it's implying that there is some magic shade of blue that cameras can't see (even today), which is totally wrong. I think that's why someone found it interesting to post here.

Graphic arts film wasn't at all fussy about the shade of blue (as you note), so while there were expensive non-repro blue markers and pencils, everyone I knew (at the very end of the era of graphic arts cameras) used blue highlighters; design studios were full of them.

I've stuck with blue as the only highlighter colour I'll ever use, more than 20 years after the original rationale went away.

Also, I used to freak people out by scribbling (non-repro) obscenities on a flat that was going to be sent to photo and turned into a newspaper the next day.

Especially since an sRGB triplet only specifies how to perceptually reproduce the color, and film has a different spectral response from the human eye. The dye in non-photo blue probably has to be genuinely blue spectrally, with no reflectance in the red or green bands, since anything outside the blue range would likely show up on film.

I believe they are referring to a technology of the ancients where they made thin films of photosensitive chemicals, exposed them to light, then processed them to make images. The chemicals varied in which wavelengths would activate them.

For instance, red light would not activate the paper commonly used for black-and-white prints, hence the red lights in darkrooms.

It is also possible the cameras illuminated the artwork with a light to which the non-photo blue ink was transparent.

The magic word here is "orthochromatic". Orthochromatic photo emulsions (the light-sensitive part of film or photo paper) are only sensitive to short wavelengths of light. The first photo emulsions were all orthochromatic, which makes skin look weird. Later we developed panchromatic film, which is equally sensitive to all colors. It replaced ortho in the camera, but ortho continued to be very useful in the darkroom and in compositing because it allows the red safelight and tricks like non-photo blue.

Not necessarily. Orthochromatic ("correct colour"), or ortho, materials were actually improved-spectrum materials that were sensitive well into the yellow-green. Prior to that, film and paper were really only significantly sensitive to blue/ultraviolet or "actinic" light. Getting to panchromatic ("all colours") was indeed significant, but ortho was advanced technology at the time. (And yes, being able to see what you were doing in the process room was a Good Thing™. Also, rubylith for masking.)

Whoa, didn't know that. Very interesting. Thanks!

It's a real thing: in ancient times when I worked on a yearbook staff, we used non-photo blue markers to mark up the physical pages we sent to the publisher.

I don't know how these pre-digital reproduction systems excluded the blue, nor do I know whether this system is still in use in the digital era.

The article does mention diddling with the contrast and brightness as well as desaturating the image. However, it doesn't give references to digital-based workflows that actually work like this. I associate non-repro blue grids and pens with doing physical paste-up on a light table. I wouldn't think they'd be part of a typical digital workflow, although someone in that business would know better than I.

[Edit: As someone wrote, the article just seems confused. Yeah, you can adjust a digital B&W image so that a light blue goes away. You can also adjust it so that a light yellow or a light anything goes away. Digital sensors do have different wavelength sensitivities, but the use of non-repro blue and rubylith was a function of the specific sensitivities, or lack thereof, of litho film.]
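
Concretely, the digital version is nothing more than a threshold placed between the ink and the pencil, and it works for any light colour, not just blue. A quick sketch with assumed swatch values:

    # Drop any light colour from a desaturated scan with a threshold.
    # Swatch RGB values are assumed for illustration.
    def luma(r, g, b):
        return 0.2126 * r + 0.7152 * g + 0.0722 * b

    swatches = {
        "black ink":      (20, 20, 20),     # luma ~20  -> stays black
        "non-photo blue": (164, 221, 237),  # luma ~210 -> drops to white
        "light yellow":   (250, 245, 180),  # luma ~241 -> drops to white
    }
    THRESHOLD = 128
    for name, rgb in swatches.items():
        print(name, "->", "white" if luma(*rgb) > THRESHOLD else "black")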

The article was first written in 2005. Back then, digital photography was not so popular. :)

Apparently you remember a different 2005 to me!

I mean that at that time non-digital photography hadn't completely died out. I recall watching a TV show comparing the quality of digital and film photography.

Submitted because I really wanted to comment on this sentence: "For a startup, work needs to be both faster and more rigorous than an academic lab."

In our academic lab, if we knew how to be faster, more rigorous, or both, we'd be doing it.

The notion that a startup can do rigorous research faster than academia is curious. The scientific journals are not full of the output of startups. Industry in general, and startups in particular, can't expend the resources on covering corner cases and tidying loose ends. The profit is generally in getting the gist of an idea, not in writing it down, vetting every detail, and sharing it broadly for free.

I chuckled when I read that. Leaving aside the uninformed way it was expressed, the author's actual interpretation of the statement comes right after:

"In academia, in order to publish a paper, often you just have to get it to work one time out of ten – so you think, OK, I’ll just keep doing the experiment until it works. We need it to work nine or ten times out of ten."

So the author confuses science with engineering. Then -- allow me to be a bit cynical -- the actual meaning of the article would be something like:

"If you want to be a scientist at a startup, you need to become an engineer."

> "In our academic lab, if we knew how to be faster, more rigorous, or both, we'd be doing it."

There are some cases where it's obvious how to be faster (and occasionally more rigorous): judiciously throw money at the problem. In academia, where grant funding is pretty limited and equipment is expensive, you probably want to only have enough infrastructure for, say, the 50th-90th percentile of load (microscopes, thermocyclers, analysis compute power, what have you), and accept that 10-50% of the time there will be a queue. If the difference between being successful in the market and having your lunch eaten by someone else truly is a matter of weeks, then it makes sense to you and your investors to put some extra money into a widget that will sit idle for most of the time but that helps when things are crunched.
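
As a toy illustration of that trade-off, here's a crude FIFO-queue simulation (all rates invented for illustration): at roughly 90% utilisation, one shared machine produces long waits, while one mostly-idle extra machine makes them nearly vanish:

    # Crude multi-server FIFO queue: jobs arrive ~1/hour, service ~0.9h.
    # All numbers are invented; the point is the shape of the trade-off.
    import heapq, random

    def mean_wait(n_machines, n_jobs=100_000, seed=0):
        rng = random.Random(seed)
        free_at = [0.0] * n_machines          # when each machine frees up
        heapq.heapify(free_at)
        t = total_wait = 0.0
        for _ in range(n_jobs):
            t += rng.expovariate(1.0)         # next arrival
            earliest = heapq.heappop(free_at)
            start = max(t, earliest)          # queue if all machines busy
            total_wait += start - t
            heapq.heappush(free_at, start + rng.expovariate(1 / 0.9))
        return total_wait / n_jobs

    print(mean_wait(1))   # hours of queueing at ~90% load
    print(mean_wait(2))   # the mostly-idle second machine kills the queue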

That said, money can't fix everything: 9 women can't produce a baby in one month!

> 9 women can't produce a baby in one month!

> Then you're just not thinking hard enough!

- My boss

You could have 9 women on a staggered, continuous baby-producing cycle so that one baby gets produced every month. Then, when you need a baby, you'll be able to get one in 15 days, +/- 15 days.

You would need 12, to be precise.
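
The count is just task length divided by start interval; a toy pipeline calculation, assuming nine-month gestation (twelve follows only if you assume a twelve-month cycle, e.g. with recovery time):

    # Pipeline arithmetic: one fixed-length task started every month.
    cycle_months = 9        # assumed: gestation only; 12 with recovery time
    start_interval = 1      # months between starts
    in_flight = cycle_months // start_interval
    print(in_flight)        # 9 in progress at steady state, one due monthly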

'Rigour' in the article means 'intensity'. 'Rigour' in science means 'validity'. A study with high rigour has nailed down more loose ends than a study with low rigour. The two concepts are largely orthogonal, and it's disturbing that someone telling scientists what to do has used the wrong sense of 'rigour' for that industry.

Ironically, by the scientific sense of 'rigour', startups require much less of it: they just need something that works well enough, not something that is as correct as can be done.

Academic rigor applied to industry would be a type of overfitting. Just as one example, customers don't care how accurate a fitness monitor is, as long as it is vaguely effective. Yet, few journals or funding agencies would buy your claim of building a fitness monitor unless you can rigorously demonstrate its performance and novelty.

Of course, the story is different in pharma and a few such fields. Theranos is an example of how lack of rigor and openness can damage healthcare startups.

> The scientific journals are not full of the output of startups.

Startups' goals are generally orthogonal to publishing.

Here is one answer in the form of an analogy (though I preface this by saying that comparing the set of all academic science with the set of all industry science is fraught at best): academic science is to cottage industry as industrial science is to factory production.

A related consideration is that training grad students and post-docs is a key component of most academic science. The requirements of training often limit the size of teams working on a single project, with the PI-trainee relationship dominating the organizational structure.

As the "PI" of a science startup R&D team, I can start and stop new projects at will with varying team sizes and mandates without the consideration that my folks need to produce a body of published work to further their careers.

The author might have meant aiming for actual results instead of publishability. Academia, in general, is fucked up. Research is being done in order to publish, which means a lack of real rigor and a lot of fake rigor. Studies are performed using bad methodologies, then massaged and/or repeated until the results cross the magic threshold, at which point they get published in 10 papers that say the same thing in slightly different words.

It's a terribly inefficient process that could be optimized by changing the goal from "publishing" to "making something that actually works". Thus you could achieve greater speed and more rigor at the same time.

(Basic research could probably be optimized too, although not through market incentives.)

While I generally agree with your comment, I think the one big advantage private companies have over academia is that they're typically much less resource-constrained. Academic labs sometimes take huge shortcuts on cost, with strange side effects. Often the way to be more rigorous or faster is "have fewer resource constraints", and even a mediocrely funded startup can have much more money than a well-funded academic lab. On top of that, the more experienced team members spend a smaller percentage of their time fundraising.

The same thought occurred to me; I think it's probably more along the lines of "faster and relying on more intuition".

I don't think this is a new concept; Jon Gertner in his excellent book "The Idea Factory" writes how the researchers at Bell Labs switched from basic research to development during WW2 and operated in a similar way. If anything, the urgency of development in the war effort resembled a startup in how it accomplished, within a few short years, projects that would have taken decades in peacetime.

Or, faster and relying on translational or post-translational stage research.

If you're building a company, odds are you aren't doing fundamental research (and you might not even be doing translational research).

>"For a startup, work needs to be both sloppier and done more arrogantly than an academic lab."

Fixed that for you.

That only works if someone else is buying the bitcoin at the current exchange rate. Coinbase gets its dollars from someone to give to the merchant.

Right, but I'm sure they can float a fair amount, holding onto Bitcoin until it's at a certain value again. So long as the scheme keeps growing...

To me, a drawl sounds like home and kindness.

Whitewater paddlers wind up spending an inordinate amount of time driving between the put-in and the take-out along "River Rd."

Doesn't matter where you are, that's often what it's called.

How much does an average convenience store make in annual profit?

If you think the link is cool, and you wish you had even one piece of it, you can build your own cloud chamber at home:


Or, apt-get install freeciv.

Just enjoyably spent an hour remembering exactly how quickly an hour goes when playing freeciv.

