
This comment is cynical even by HN standards.


If you believe HN is cynical you should spend some time on Reddit.


If an atom is mostly empty space, what is being represented by the lighter valued "blobs" in these images?


It's roughly the probability of an electron undergoing a state change in the vicinity. It took a lot of measurements to make those images. It also took a very sensitive instrument, very good control of high energy particles tuned to the target, and a very stable local environment (cold, dark, and quiet).

Edit: reviewing awgl's comment: I just want to clarify which electrons we're talking about. The target has electrons. The electron beam is also, by definition, electrons. So you're shooting electrons at A) electrons, and B) nuclei.

What are the odds of a negative, 100 keV (medium-high energy) electron in the beam interacting with a heavy, relatively stationary, positively charged nucleus? High.

What are the odds of a negative, high energy electron interacting with a low-energy electron that may, occasionally, be in the area? Low.

But we've been shooting high energy electrons at dense targets since the 1920s or 1930s. The joke in accelerators is that most of the particles you fire miss, implying you can't hit the broad side of a barn. (1)

What's far more impressive is the ability to focus the beam down to sub-angstrom scale (10^-10 m) and then scan at equal or higher resolution! And then detect at the same scale of resolution! How? Almost certainly the beam is steered electromagnetically. I'm interested in the detector. I'm guessing these are reconstructed using a combination of side-scatter and forward-scatter information. Not entirely sure how though.

(1) http://en.wikipedia.org/wiki/Barn_(unit)
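The barn joke can be made concrete with a thin-target back-of-the-envelope calculation: the chance that a beam electron interacts is roughly the areal density of scatterers times the cross-section. A minimal sketch, with all numbers (the foil thickness, the gold atom density, the 1-barn cross-section) purely illustrative and not from the article:

```python
import math  # not strictly needed; kept for clarity if you extend this

BARN_CM2 = 1e-24  # 1 barn = 1e-24 cm^2

def interaction_probability(thickness_cm, atoms_per_cm3, sigma_barns):
    """Thin-target approximation: fraction of beam particles that interact,
    P ~ n_areal * sigma, where n_areal is atoms per cm^2."""
    n_areal = atoms_per_cm3 * thickness_cm  # atoms per cm^2
    return n_areal * sigma_barns * BARN_CM2

# A hypothetical 100 nm gold foil (~5.9e22 atoms/cm^3) and a 1-barn process:
p = interaction_probability(100e-7, 5.9e22, 1.0)
print(f"P(interaction) ~ {p:.1e}")  # a tiny fraction -- most electrons miss
```

Even with a billion atoms stacked along the beam path, a 1-barn process fires mostly into empty space, which is the whole point of the joke.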


I'll readily admit that I'm not a STEM expert. And, honestly, I think I was conflating the STEM in this article with Scanning Tunneling Microscopy (http://en.wikipedia.org/wiki/Scanning_tunneling_microscope).

So, yeah, my comment is not entirely accurate about the electron densities of the atoms. If you feel it is too misguided, I'll remove it.

This is what happens when you ask a theoretical chemist to explain an experiment. ;)


> And, honestly, I think I was conflating the STEM in this article with Scanning Tunneling Microscopy

It's easy to do. TEM/STEM vs SEM vs STM. All completely different things. This is what happens when scientists name things :P

For those confused:

TEM/STEM: An electron beam is transmitted through your sample. Good for atomic-scale imaging.

SEM: An electron beam is scanned across your sample, but none are transmitted through. Good for topography/surface features (the interaction volume of the beam is too large for atomic resolution).

STM: No electron beam. Instead, think of a vinyl record player physically scanning a very sharp tip across the surface of your sample. Good for atomic-scale imaging of a surface.

A good STM image and a good STEM image can, at first glance, look quite similar (especially for a 2D material like graphene), but they're very different techniques.


> What are the odds of a negative, 100 keV (medium-high energy) electron in the beam interacting with a heavy, relatively stationary, positively charged nucleus? High.

> What are the odds of a negative, high energy electron interacting with a low-energy electron that may, occasionally, be in the area? Low.

Just to clarify this, so people don't get the wrong idea, electron-electron interactions are also incredibly important in an electron microscope.

> and then scan at equal or higher resolution!

Note that while this is what's happening here, you can get similar atomic-resolution images without scanning at all, by just illuminating a sample with a broad electron beam (see 'TEM' (or 'conventional TEM') vs 'STEM'). Strictly speaking, STEM and TEM are equivalent techniques (by something called the reciprocity theorem), but in practice you'd almost certainly prefer one over the other depending on your sample and one technique is not necessarily better than the other.

> I'm interested in the detector. I'm guessing these are reconstructed using a combination of side-scatter and forward-scatter information.

For some of the STEM-HAADF images in the article, a 'high angle annular' (the 'HAA' part) detector is used, which collects only those electrons scattered at high angles in the forward direction, because this results in images that are much easier to interpret (intensity is just proportional to mass and thickness). The detector itself is commonly a scintillator/photomultiplier tube combo.

There are tonnes of other detectors used, though. Until recently, it was common to just expose TEM images onto film (I have some of my own samples, actually). An operator would find the region of the specimen they wanted to image using a live view, in which the electron beam was projected onto a phosphor screen, and then, when ready, move the screen aside and expose the film. More recently, the vast majority of TEM micrographs are taken using normal CCD tech.


> just to clarify this, so people don't get the wrong idea, electron-electron interactions are also incredibly important

Thanks. Yeah, no doubt. For the graphene image, for example, the beam appears to be interacting with the pi bonds.

> There are tonnes of other detectors used though.

Yes, I recall my nuclear physics professor dedicating a whole week, after we understood the basics of scintillators and photomultipliers, to going through a multitude of detectors.

Overall, Osmium, thanks for your comments here!


A crash course in quantum mechanics is what you are asking about!

While the electrons and nucleus (i.e. protons + neutrons) of atoms are indeed 'particles', they exhibit wave-particle duality: http://en.wikipedia.org/wiki/Wave%E2%80%93particle_duality

In brief, quantum particles act like waves sometimes (think ripples of water) and act like particles (think tiny billiard balls) at other times. The consequence of that, and a major tenet of quantum mechanics, is the 'wavefunction' of a quantum particle. A wavefunction of a particle, instead of just being a single point in space, amounts to a probability density of position and momentum.

So now that we know the above, in these images, what they are actually measuring is the spatial probability density of the electrons. The lighter values correspond to high density of electrons. The darker values correspond to less electron density. The high density occurs around the nuclei of the atoms. Thus, atomic resolution. However, note that individual electrons are not resolved in these images.

Finally, I want to recommend against thinking of atoms as electron planets orbiting a nucleus sun full of empty space in between. That thinking ignores quantum mechanics. The truth is much more fascinating, which is that electrons are wave-particles that have probabilistic densities.

P.S. Protons and neutrons are themselves made up of more elementary quantum particles: quarks!
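To make "spatial probability density" concrete, here's the textbook hydrogen 1s example (a standard illustration, not what the STEM measures): the density |psi|^2 is largest right at the nucleus, yet the most probable radius, once you weight by the 4*pi*r^2 shell volume, is the Bohr radius a0. A sketch in atomic units:

```python
import math

A0 = 1.0  # Bohr radius, in atomic units

def radial_probability_1s(r):
    """4*pi*r^2 * |psi_1s(r)|^2 for hydrogen.
    The 1s wavefunction is psi = exp(-r/a0) / sqrt(pi * a0^3)."""
    psi_sq = math.exp(-2 * r / A0) / (math.pi * A0 ** 3)
    return 4 * math.pi * r ** 2 * psi_sq

# Scan a grid of radii for the most probable one -- it comes out at r = a0,
# even though |psi|^2 itself peaks at r = 0.
rs = [i * 0.001 for i in range(1, 5000)]
r_peak = max(rs, key=radial_probability_1s)
print(f"most probable radius ~ {r_peak:.2f} a0")
```

That distinction (density vs. radial probability) is exactly the kind of thing the planetary-orbit picture gets wrong.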


Does the microscope care about probabilistic densities? Aren't these images rendering interference/difference between electrons sent out and electrons received?

Those 'particles' supposedly were in the space and interacted with the electron beams. They were or weren't in a place at a time.

In wave-particle duality, aren't we just suggesting that particles are there because doing so gives an approximate model that helps predict the behaviors we observe?

In the faceted nano-diamond void, why do we see apparent 'ghosts' of the lattice in the void? Are there particles there or not? If there are, why are they dim?


Right. Well, given your comment and niels_olson's, I feel I have misspoken about this STEM experiment. Not that what I said about quantum mechanics was wrong, just that its relevance to the measurement in this experiment is misguided.

As niels_olson points out, the primary interaction here is between the electron beam and the atomic nuclei.


When you have enough samples, probabilities just become counts.

I would assume at the void it's dimly picking up the atoms at the bottom.
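"Probabilities just become counts" is the law of large numbers in action, and it's easy to see in a toy simulation (the Gaussian stand-in for beam positions is my own illustration, nothing from the thread): draw enough samples and the empirical count fraction in any bin converges to the true probability.

```python
import random

random.seed(0)

N = 100_000
# Stand-in for accumulated electron "hits": positions drawn from a known
# distribution (standard normal here, purely for illustration).
hits = [random.gauss(0.0, 1.0) for _ in range(N)]

# Count how many land within one standard deviation of the center.
in_bin = sum(1 for x in hits if -1.0 <= x <= 1.0)
empirical = in_bin / N
print(f"fraction within 1 sigma: {empirical:.3f}  (true value ~ 0.683)")
```

With 10^5 samples the count fraction already matches the true probability to a couple of decimal places, which is why long acquisitions turn a probability density into a picture.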


Consider, if atoms and thus matter are mostly empty space, why can't you put your hand through the table?

It's that same "stuff" that keeps your hand from passing through the table that the electrons "see" to create the STEM pictures.


>>atom is mostly empty space

What exactly is "empty space"?


I think your humor detector needs tuning.


Oh, yeah, you're probably right. I just thought they were bad at English.


It's been speculated that they are developing a consumer-oriented hardware bitcoin wallet/node.


This would indeed be very lucrative, as it is the missing component for adoption, IMO. I wonder why it's taken them 3 years though.


There's been Trezor for some time now: https://www.bitcointrezor.com/

Projects like these are great, but they won't drive adoption per se.


Hardware is hard; crypto is hard. Most people seem to think the Trezor's usability leaves a lot to be desired.


Making Brian Williams cover rap songs?


If I recall correctly it was written as part of his research for Cryptonomicon.


How do you research a market from the outside? Especially in digital/saas products where the barrier to entry is already low, it seems to me that the successful people don't want to give up their competitive advantage by telling the world how to reproduce their business model (for good reason).


Hi pault,

The way I would go about it is:

Start talking to potential customers. Find out how painful and valuable the problem is. Find out if they are currently using some other solution to the problem (your competitors).

Approach an industry organisation, if there is one, and talk to them about the problem, who has any existing or new solutions and find out if they have market research from their members.

Just a comment about the above two points: you don't have to pitch your specific solution to the problem, but I would still leave the door open when you are talking to people and say something like, "if I were to come up with a solution that would do (insert features and benefits), would you be interested in talking further?" If the problem just isn't perceived by your customers to be painful or valuable, then I would move on.

Now assuming the pain is sufficient and a solution is valuable then start looking at the economics of the business and start 'building the market.' What I mean by that is start with some assumptions about customer characteristics, for example demographics, and start putting some numbers together in terms of total market size. Try to do this from different angles and use different assumptions.

Then start playing around with what you think your cost and profit structure would look like with different business models. Think about what your cash burn rate will be and what sort of adoption you will need to hit a 'cash neutral' and break even scenario.

Then start applying this to your 'whole of market' model that you did before and start thinking if it makes sense. Do you need 1% or 0.0001% of the market to start adopting your solution? What might it cost to acquire those customers in terms of time and money? Start thinking about how that relates to hitting your 'cash neutral' and break even scenario.

Take a step back. Does this make sense? How much work is involved? How much capital are you going to need to achieve this? Then multiply that by say 4-5 times to give you a margin for error.

Now start thinking about post 'cash neutral' scenarios where you are capturing different rates of the market. How much do your costs ramp up? What is your marginal profitability? What does this equate to in terms of return on capital? If the returns are low then forget it. Investors want you to return their capital plus a big fat profit. If you don't think you can grow a big enough pie so you can get a slice of it that will satisfy you then forget it and find another opportunity.

If things look like they are stacking up well and you've got a good story to sell to investors, then good luck. Raising money is a bitch. Assuming you manage to raise enough, things get harder, because you will now have to execute within the cash 'runway' you have been given, and the rest at that point is up to you, your team and luck.
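The burn-rate and market-share arithmetic above can be sketched in a few lines. Every number here (market size, price, burn, margin) is made up for illustration; plug in your own assumptions:

```python
def breakeven_share(total_market, price_per_customer_year,
                    monthly_burn, gross_margin=0.8):
    """Fraction of the total market you need to win for annual gross profit
    to cover an annual cash burn (the 'cash neutral' point)."""
    annual_burn = monthly_burn * 12
    customers_needed = annual_burn / (price_per_customer_year * gross_margin)
    return customers_needed / total_market

# e.g. a hypothetical 500k-business market, a $1,200/yr product,
# and a $40k/month burn:
share = breakeven_share(500_000, 1_200, 40_000)
print(f"break-even at ~{share:.2%} of the market")
```

Running the "do you need 1% or 0.0001% of the market" question through numbers like these, from a few different angles, is exactly the sanity check described above.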


It might have something to do with the HN effect and 12MB of uncompressed png images. I know it's an article about design, but this seems excessive to me.


Medium usually works pretty well even under HN loads; at least I haven't seen it behave quite like this before. Maybe none of the ones I've seen have been quite this image heavy. shrug


No, it nearly gave me a seizure. It might just be the super bright IPS panel, though.


I just wanted to chime in and say that I had always avoided infosec as "too hard" (I'm primarily a UI guy), but the Snowden leaks have made me a lot more conscious of my digital footprint and security in general. I even spent several months on khan academy learning the requisite maths to understand the stanford crypto 101 class on coursera (awesome course by the way!). Of course I'm far from journeyman status but I believe there are many more like me whose awareness of the issue was significantly raised by the leaks. Even if no political changes come about as a result, that's a huge aggregate effect that will only increase over the next years/decades.


I think it's a bit odd that so many hackers view technology as even a possible saving grace in this struggle. Were hackers the ones largely supporting actionable development efforts on Tor? Do they continue to? How many people do you know that use Tor for casual internet browsing versus spending more time on Facebook?

Take a look at your paycheck, expenses, taxes, and then your contributions to technological solutions by-the-people, for-the-people. How many of you work at $job doing data analysis for marketing schemes versus something useful to society? After all of that, do some rough statistics on programmers who contribute to meaningful projects regarding cryptography (properly implemented, not talking about crap like 'secret') and freenet-type ideas. How many of us are going to quit our jobs to focus on this problem? How many of us could if we wanted to? Stop kidding yourselves: we're all literally paying for the work of evil-doers to subvert long-extinguished ideas of privacy. Either adapt to survive or resist in a meaningful way, and stop deluding yourselves with ideas of grandeur.

Tech people are, by and large, nothing more than glorified marketers. We handed the reins of technology to those with capital, and the results are sickening. RMS was and will always be correct. The rest of you live in a vacuum fortified by paychecks and social support.


> The rest of you live in a vacuum fortified by paychecks and social support.

It's all of us really. The enormous amount of money pumped into tech by the surveillance state reaches nearly every software company (and beyond). Your examples for worthwhile projects are Tor, freenet, and crypto. The surveillance state finds those projects useful too.

This problem is larger than what you're considering and blaming particular industries and spaces understates the amount of cash pumped into the world by governments.

And just because corn farmers have a subsidized crop that drives the processed food industry which gives us cancer/diabetes doesn't mean those farmers shouldn't follow best practices, unless we want another dust bowl. Likewise, I think it's great that UI programmers are more aware of security.

Also, we're not literally paying for the surveillance state. They're essentially stealing from the people by arbitrarily printing money which devalues our own. If taxes were raised commensurate with spending and debt, we'd certainly be making some more prudent decisions about how best to spend tax dollars.


I assure you I have no delusions of grandeur. I believe that you are underestimating the halo effect of the negative attention that sovereign and corporate spying have received in the last few years. And I don't pay taxes to the US government.


I was careful not to specify US taxes. The US may have started the awareness of this issue, but they're far from the only government involved. I've underestimated very little so far, judging by how my predictions have held up; however, I've argued with plenty who overestimated, as history now proves.

Again, my point was about the overall useful contributions the average hacker can make while focused on the struggle of maintaining 98.6F, versus the hiring, monetizing and actionable efforts coming from the adversary. Making progress in this day and age largely depends on accurately understanding the actual adversary. If you think that's limited to the 'US Government'... well then...


The capture of Silicon Valley by Madison Avenue never ceases to amaze me. That's the difference between now and then. Amazingly, it has nothing to do with the NSA. But the privacy landgrab really started with all of the VC-backed companies desperately trying to become profitable in a stock bubble.


