Hacker News
A record-breaking microscope (nature.com)
85 points by moh_maya on July 19, 2018 | 27 comments



> Remarkably, Jiang et al. gave themselves a huge handicap with regard to beating the resolution record. For any given microscope lens, the best resolution is achieved by using the shortest possible wavelength of the radiation or electron beams concerned. However, the authors used relatively low-energy electrons, which have twice the wavelength of those used in the highest-resolution lens-based microscopes9,10. Using low-energy electrons for microscopy is good because it greatly reduces the damage inflicted on the specimen by the electrons. But in this case, it also meant that the resolution of the lens used by Jiang and colleagues was reduced by a factor of two. To beat the resolution record, the authors had to process a particular subset of the ptychographic diffraction data (the high-angle data), thereby obtaining an image with a resolution 2.5 times better than would otherwise have been possible.
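The wavelength factor the article mentions is easy to check with the relativistic de Broglie formula. A quick sketch, assuming the round-number beam energies typical of this comparison (80 keV for the low-energy beam, ~300 keV for conventional high-resolution instruments; these specific values are illustrative, not taken from the article):

```python
import math

# SI constants
H = 6.62607015e-34      # Planck constant, J*s
M0 = 9.1093837015e-31   # electron rest mass, kg
E = 1.602176634e-19     # elementary charge, C
C = 2.99792458e8        # speed of light, m/s

def electron_wavelength(volts: float) -> float:
    """de Broglie wavelength (m) of an electron accelerated through
    `volts`, including the relativistic correction term."""
    ev = E * volts  # kinetic energy in joules
    return H / math.sqrt(2 * M0 * ev * (1 + ev / (2 * M0 * C**2)))

lam_80 = electron_wavelength(80e3)    # low-energy beam
lam_300 = electron_wavelength(300e3)  # typical high-resolution beam
print(f"80 keV: {lam_80 * 1e12:.2f} pm, 300 keV: {lam_300 * 1e12:.2f} pm, "
      f"ratio: {lam_80 / lam_300:.2f}")
```

The ratio comes out close to the "factor of two" in the quoted text: roughly 4.2 pm at 80 keV versus about 2.0 pm at 300 keV.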

Nice. It looks like this is a fresh gateway breakthrough with low-hanging fruit on the other side. It's always exciting when it's not just eking out small incremental gains blown up by a university press office.


What I want to know is: in the sample image of molybdenum disulfide (b), there is one atom on the right, near the vertical center, that is noticeably dimmer than the rest. What could account for that?


It's a chalcogen vacancy, a very common type of defect in these materials. These materials are called transition metal dichalcogenides, so they have the form MX2, where M is a metal (most commonly Mo or W) and X is a chalcogen (typically S or Se). In this case there is an S missing, but you still see a spot because there is another S below it (if you look at the crystal structure, it resembles graphene but has more complex structure in the third dimension: it's three atoms thick rather than flat, with the two chalcogens in the unit cell sitting directly on top of each other).


Maybe that one was less reflective than the others?


Pro tip: always polish the atoms when you make record breaking pictures of them.


When the tomography method is well developed, couldn’t this be used to scan the neural structure of a live brain? (i.e., to upload a brain)


It wouldn't work well at depth, because the beam interacts with the electron shells. After a nanometer or two of tissue you'd have no signal. That's assuming you'd even figured out a way to put a brain in the microscope: they tend to want samples in a vacuum, a few cm square and no more than a few mm thick including the sample mount.

You'd have to use something like neutrons, which don't interact with electrons and can be used to image internal structure. Those tend to have side-effects, though, like inducing radioactive decay in nuclei.


Don't fire electron beams at live brains, they don't like it.


  The basic principle of the technique was 
  proposed almost 50 years ago by the physicist 
  Walter Hoppe, who reasoned that there should 
  be enough information in the diffraction data 
  to work backwards to produce an image of the 
  diffracting object.
This kind of statement just absolutely cracks me up, because it's a clear reveal that, given this sort of awareness of diffraction principles and concepts like pilot wave theory, double-slit experiments and entanglement haven't been mysterious for decades.

It's all just media manipulation. There are very firmly understood concepts backing all the mechanics of quantum effects, and the journalists who push the ambiguities are simply trolling would-be amateurs to fan the flames of confusion, as a sort of outsider performance art.


There is enough information, should one be able to retain the phase; recovering it from intensity alone is much more challenging.

I'm familiar with the work of one of the authors; he is a world expert on diffraction inverse problems in physical context. From a quick skim of the paper, it would appear that they're simply being careful and clever.


I'm not sure I completely understand your comment. Are you suggesting this is not a significant advance in the field?

I am not an EM expert, but the fact that it's been through peer review & gotten published in Nature seems to suggest scientists in the field think it is a significant advance. And from what I do know of microscopy, it is still not trivial to image (sub?) atomic-size structures.

Extending the quoted paragraph:

"The basic principle of the technique was proposed almost 50 years ago by the physicist Walter Hoppe, who reasoned that there should be enough information in the diffraction data to work backwards to produce an image of the diffracting object.

However, it was many years before computer algorithms were developed that could do this reverse calculation easily and reliably. The pictures produced by ptychographic methods are generated using a computer from a vast amount of indirect scattering data."
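The "reverse calculation" here is a phase-retrieval problem: detectors record only intensities, so the phase has to be recovered iteratively. As a toy illustration (the classic Gerchberg-Saxton iteration, not the authors' ptychographic algorithm, and assuming amplitudes are measured in both the object and diffraction planes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "specimen": known unit amplitude, unknown phase to recover.
n = 64
amp_obj = np.ones(n)
field = amp_obj * np.exp(1j * rng.uniform(-np.pi, np.pi, n))

# Detectors only record intensity, so only amplitudes survive:
amp_far = np.abs(np.fft.fft(field))  # measured diffraction amplitudes

def far_err(g):
    """Relative mismatch between the guess's far field and the data."""
    return np.linalg.norm(np.abs(np.fft.fft(g)) - amp_far) / np.linalg.norm(amp_far)

# Gerchberg-Saxton: alternate between planes, enforcing the measured
# amplitude in each while keeping the current phase estimate.
guess = amp_obj * np.exp(1j * rng.uniform(-np.pi, np.pi, n))
err0 = far_err(guess)
for _ in range(500):
    far = amp_far * np.exp(1j * np.angle(np.fft.fft(guess)))    # far-field constraint
    guess = amp_obj * np.exp(1j * np.angle(np.fft.ifft(far)))   # object constraint
err = far_err(guess)
print(f"far-field amplitude error: {err0:.3f} -> {err:.3f}")
```

The error is non-increasing across iterations, which is why schemes like this (and the more sophisticated ptychographic ones, which use many overlapping diffraction patterns) can work at all.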

The News & Views article does not claim that this is a basic-science advance; they are claiming it's an engineering / methodology / procedural advance. And those are just as important, IMO.

Unless you are suggesting that, once the basic sciences are known, any engineering advance is trivial. If so, then you & I have very different impressions of how easy / difficult it is to build new "things" :)


None of the things you've mentioned are anything close to what I'm bringing up.

The point I'm making is that popular discussion of quantum effects is so wildly off-base that it has muddied the waters for anyone trying to understand what happens between photons and electrons by casually reading about it.

But you see something like this emerge, and it's really obvious that solutions to these problems were on the right track even as far back as the early 1900's, only to be derailed by academics emerging in the 1940's.

Principles such as: https://en.wikipedia.org/wiki/Huygens%E2%80%93Fresnel_princi... had it right very early.

So, was there an ulterior motive to all the complex obfuscation of math, and inaccurate scientific reporting throughout the later 20th century? Or has it all been one big, innocent misunderstanding, among aloof egg heads distracted by their gigantic precious particle colliders?

One wonders.


This is a massive conflation (and misunderstanding) of quantum mechanics. The processes being discussed in this paper are statistical and large, and thus pretty highly classical in nature.

The double-slit experiment, entanglement etc. are all concerned with what happens with individual particles to produce those statistics, and what that means.

For example, it's not remotely surprising that diffracted light can reconstruct an image of an object (this idea has been around for a while - e.g. evanescent wave fluorescence, or the quest for a negative-refractive-index material, which would beat diffraction limits). But that's not why it's not surprising - it's not surprising because you can also detect the existence of solid objects without touching them with so much as a photon, purely by letting the probability field of one potentially extend through them and then observing whether you see diffraction patterns along the path where photons do travel (basically, a highly biased double-slit experiment reveals whether it would've interfered, well before a particle is ever likely to have hit the object obstructing one of the beam paths).


The point being that computation is so cheap now, that it's more difficult to promote confusion and obscure facts.

It used to be expensive to compile massive data sets and reduce them to reliable statistical evidence, so it was easy to push concepts that had little supporting evidence. For example: "the particle passes through both slits", "the cat is alive and dead", "there are no hidden variables", "the source of an emission never ascribes state to its particles, and that state does not exist until inspected"

Now, such wild claims are in disagreement with rivers of data that are much more easily produced and reviewed computationally. Observations that were not previously possible now shed light on facts that were previously obscured. Without backing data, ideas prone to confusion could take root - particularly so with voices of academic authority shouting down concepts that threaten the ivory tower.

But now, technology to conduct measurements is cheaper, and data shouts louder. So something presented as fact in A Brief History of Time (the particle passes through both slits) can no longer be supported by fame alone, simply because the author is revered. It's easier to produce and publish data (make high-fidelity video recordings of the behavior of silicone oil droplets demonstrating pilot-wave phenomena, and post them on YouTube).

In this example, pulling together raw data from sensor streams and dumping it into a high-performance computing pipeline reveals that diffraction itself is a state-producing phenomenon, and that reliable variables are produced by the diffractor, only to be hidden later by subsequent polarizers that drive downstream state. If the hidden variables weren't reliable, there would be no possibility of composing an image from the statistical analysis of the diffraction. The diffraction would produce no reliable signal to reconstruct, because it would not exist, since local hidden variables are forbidden behind no-go boundaries.


> So, was there an ulterior motive to all the complex obfuscation of math, and inaccurate scientific reporting throughout the later 20th century?

No, of course not - why would you think that? It's a difficult subject, and simplifying it for the lay reader is a lossy process.


Ah; thank you for the reply; I understand what you were trying to say. I'll leave my original comment in place for context for other readers. :)


> It's all just media manipulation.

Don’t attribute to malice what can be explained by stupidity. [Hanlon]


Actually, I don't think it's stupidity... it's greed. Telling stories that sound good is unrelated to truth, but declaring them science or news (when you don't know) is deceptive.


The problem with this razor is that malicious people will happily incorporate it into their evil plans.


Thank you, for once Hanlon's razor is used properly.


I'm just reading "What is Real?" by Adam Becker (a history of QM published this year) and boy do the Copenhagenists look foolish. Even Bohr comes off as a saintly buffoon.

With experimental confirmation that the universe is non-local and "spooky action at a distance" is real, the pilot wave theory "wins" and there's no measurement problem.

(It isn't journalists though, it's the physicists themselves that muddied the waters by permitting herd mentality to overwhelm science. Also, von Neumann got a proof wrong! Folks can be forgiven for not suspecting that. But once it was noticed then the "orthodoxy" should have paid attention.)


“What Is Real” is an opinionated book, to put it mildly.


No doubt. But the facts of which the opinions are held are pretty staggering.

Von Neumann got a proof wrong in his textbook on QM. Grete Hermann found the error in 1935 and nobody noticed. De Broglie presented a "pilot wave" theory at the Fifth Solvay Conference in 1927.

Einstein kept pointing out the problem with non-locality and everybody thought he was getting old and foggy.

Physics is hella tribal. Physics.

I took the Copenhagen metaphysics pretty seriously. It's so neat and elegant to confound the mystery of quantum wave-function collapse with the mystery of subjective experience. The "observer-created Universe" and all that. It's very disturbing to realize that it's basically metaphysical bunk. It's just staggering.

But never mind all that!

The Universe is non-local!

*And yet-- Relativity!*

Nothing can go faster than the speed of light, but wave collapse does, so this is going to be some awesome physics!


Does anyone else here get slightly annoyed when Nature articles are shared here? It would be nice if there was a bot that tried to find a mirror/alternative source to the same paper. There's something incredibly disappointing and frustrating about paywalled scientific research.


I will tell you what you can't do in those situations

do not copy the DOI (https://doi.org/10.1038/s41586-018-0298-5)

do not go to sci-hub (https://sci-hub.tw)

and finally do not paste the DOI there to get the article


Here you go: https://arxiv.org/abs/1801.04630

They revised the title slightly between this and the real publication, but it's the same pub.


> There's something incredibly disappointing and frustrating about paywalled scientific research.

Yes, there's a lot that's incredibly disappointing and frustrating—but is the solution to that not even to point to it? If you're interested enough, you can do whatever searches a bot would do (or build the bot yourself to do the searches).



