- Usually the problem with room temperature is that the thermal mechanical excitations and thermal radiation surrounding the qubit interact with it, and the information stored in it "leaks" out; that is the main reason a lot of quantum hardware needs cryogenics (a back-of-the-envelope sketch of the energy scales follows after this list).
- There are already a couple of similar types of "defects" that are known to be a promising way to retain superposition states at room temperature. They work because the defect happens to be mechanically and/or optically isolated from the rest of the environment due to some idiosyncrasies in its physical composition.
- You might have heard of "trapped ion" or "trapped neutral atom" quantum hardware. In those the qubits are atoms (or the electrons of atoms) and they are trapped (with optical or RF tweezers) so they are kept accessible. The type of "defects" discussed here are "simply" atoms trapped in a crystal lattice instead of trapped in optical tweezers -- there are various tradeoffs for that choice.
- While discoveries like this are extremely exciting, there is a completely separate set of issues around scalability, reliability, longevity, controllability, and reproducible fabrication of devices like this. That is true for any quantum hardware, and while there has been truly exponential progress over the last 15 years, we are still below the threshold at which this technology becomes useful from an engineering and economic standpoint.
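To make the "thermal leakage" point in the first bullet concrete, here is a minimal back-of-the-envelope sketch. The qubit frequency is my own assumed illustrative value (a few GHz, typical of microwave-domain qubits), not a number from this thread; the constants are the standard SI values.

```python
# Back-of-the-envelope: thermal energy k_B*T vs. an assumed 5 GHz qubit splitting h*f.
k_B = 1.380649e-23   # Boltzmann constant, J/K
h = 6.62607015e-34   # Planck constant, J*s

f_qubit = 5e9        # Hz, assumed microwave-frequency qubit transition

for T in (300, 4, 0.02):  # room temperature, liquid helium, dilution refrigerator
    ratio = (k_B * T) / (h * f_qubit)
    print(f"T = {T:>6} K: k_B*T / (h*f) ~ {ratio:.2f}")
```

At 300 K the thermal bath carries roughly a thousand times the qubit's energy splitting, so random thermal kicks scramble the stored state; only around ~20 mK does k_B*T drop below h*f, which is why dilution refrigerators are the default for microwave-domain qubits. Defects of the kind discussed here sidestep this by being unusually well isolated rather than by being cold.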
Lastly, the usual reminder: there are a few problems in computing, communication, and sensing where quantum devices can do things that are impossible classically, but these are very restricted and niche problems. For most things a classical device is not only practically better today, but also better in principle, even if quantum computers become trivial to build.
>Usually the problem with room temperature is that the thermal mechanical excitations and thermal radiation surrounding the qubit interact with it, and the information stored in it "leaks" out; that is the main reason a lot of quantum hardware needs cryogenics.
This issue, as I understand it, is quantum decoherence: the loss of quantum properties as an object transitions into the behavior of classical physics.
The article describes a period of about a millisecond before quantum decoherence occurs. That is a (relatively) long time, and could perhaps be exploited as part of building a quantum computer.
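For a sense of scale, here is a rough, hedged estimate of how many gate operations would fit inside a ~1 ms coherence window. The gate durations below are my own assumed, order-of-magnitude values for different hardware families, not figures from the article.

```python
# How many gates fit in a ~1 ms coherence window, for assumed gate durations.
T_COHERENCE = 1e-3  # seconds, the ~millisecond figure discussed above

gate_times = {      # assumed order-of-magnitude gate times, not from the article
    "fast microwave gate (~20 ns)": 20e-9,
    "typical spin/defect gate (~1 us)": 1e-6,
    "slow optical/ion gate (~10 us)": 10e-6,
}

for label, t_gate in gate_times.items():
    print(f"{label}: ~{T_COHERENCE / t_gate:.0e} operations before decoherence")
```

Even the slowest assumed gate leaves room for on the order of a hundred operations within the window, which is why a millisecond counts as a long time in this context.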
> Quantum field theory emerged from the work of generations of theoretical physicists spanning much of the 20th century. Its development began in the 1920s with the description of interactions between light and electrons, culminating in the first quantum field theory—quantum electrodynamics. A major theoretical obstacle soon followed with the appearance and persistence of various infinities in perturbative calculations, a problem only resolved in the 1950s with the invention of the renormalization procedure. A second major barrier came with QFT's apparent inability to describe the weak and strong interactions, to the point where some theorists called for the abandonment of the field theoretic approach. The development of gauge theory and the completion of the Standard Model in the 1970s led to a renaissance of quantum field theory.
But the Standard Model Lagrangian doesn't describe n-body gravity, n-body quantum gravity, photons in Bose-Einstein condensates, "liquid light" in superfluids and superconductors, black hole thermodynamics and external or internal topology, irreversibility (or the lack of it), or even fluids with vortices or curl, which certainly affect particles interacting in multiple fields.
TIL again about Relaxation theory for solving quantum Hamiltonians.
OTOH, other things on this topic:
- "Coherent interaction of a-few-electron quantum dot with a terahertz optical resonator" (2023) https://arxiv.org/abs/2204.10522 :
> By illuminating the system with THz radiation [a wave function (Hamiltonian) is coherently transmitted over a small chip-scale distance]
>> "That's the key finding," she said of the material's switchable vacancy order. "The idea of using vacancy order to control topology is the important thing. That just hasn't really been explored. People have generally only been looking at materials from a fully stoichiometric perspective, meaning everything's occupied with a fixed set of symmetries that lead to one kind of electronic topology. Changes in vacancy order change the lattice symmetry. This work shows how that can change the electronic topology. And it seems likely that vacancy order could be used to induce topological changes in other materials as well."
I can understand "nuclei excitation states" in two different ways.
- If you mean excited states of the neutrons and protons in the nucleus, I really have no idea, mainly because these are generally unachievable in an engineered system. These excitations are in the gamma-ray spectrum and just way too high-energy and destructive to be useful, so no one has tried to harness them for quantum information processing.
- If you mean the "orientation" of the spin of the nucleus, which can be easily manipulated with reasonably weak low-frequency magnetic fields (like we already do classically for medical imaging purposes), then this is actually a pretty good storage medium. Frequently people think of the electron spin as the "networking card" because it can easily interface with photonic qubits, and think of the nuclear spin as the storage medium because nuclear spins can retain quantum information for many minutes (because they are so well isolated). One problem is that transferring the data from the electron spin to the nuclear spin is relatively slow and noisy. But "progress is being made", albeit less quickly than we would like.
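To illustrate the "reasonably weak low-frequency fields" point in the last bullet, here is a quick Larmor-frequency comparison. The field strength is an assumed illustrative value; the gyromagnetic ratios are standard textbook numbers.

```python
# Larmor precession frequency f = (gamma / 2*pi) * B for electron vs. nuclear spins.
GAMMA_OVER_2PI = {             # Hz per tesla (standard textbook values)
    "electron spin": 28.0e9,   # ~28 GHz/T
    "1H nucleus":    42.58e6,  # ~42.6 MHz/T (the spin used in MRI)
    "13C nucleus":   10.71e6,  # ~10.7 MHz/T (a common nuclear-spin memory)
}

B = 0.5  # tesla, an assumed illustrative field strength

for species, gamma in GAMMA_OVER_2PI.items():
    print(f"{species:>13}: f_Larmor ~ {gamma * B / 1e6:,.1f} MHz at {B} T")
```

The electron precesses at ~14 GHz in this field while the nuclei sit in the low-MHz range, which is why the electron spin is driven with microwaves and treated as the "networking card", while the nuclear spin is addressed with NMR-style RF pulses and treated as the long-lived memory.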
My claim to authority here is that I have designed various control protocols for such hardware during my postdoctoral work. But I had it easy: most of my work includes statements roughly equivalent to "assuming the progress of the last 20 years continues, we will have good qubits in 10 years if we use the protocol suggested here". To be clear, there are technologies that will be workable sooner (or already are); they just have different tradeoffs in cost and other requirements.
These kinds of atom-like systems can be used as registers for a quantum computer. They allow you to store a qubit and apply gates to that qubit. It doesn't make practical sense to use them to improve classical storage, although a qubit plus pre-shared entanglement can in fact carry more classical information than a classical bit, using "superdense coding" [0].
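Since superdense coding is referenced here, below is a minimal sketch of the protocol with plain NumPy state vectors (my own illustrative code, not taken from the linked reference [0]): two classical bits are encoded into the single qubit the sender transmits, using a pre-shared Bell pair.

```python
import numpy as np

# Single- and two-qubit gates as matrices.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],   # control = first qubit, target = second
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def superdense(b1, b0):
    """Send two classical bits (b1, b0) by transmitting one qubit of a Bell pair."""
    # Pre-shared Bell pair (|00> + |11>)/sqrt(2).
    state = CNOT @ np.kron(H, I) @ np.array([1, 0, 0, 0], dtype=complex)
    # Sender encodes the bits by acting on *their* qubit only (the first one).
    encode = np.linalg.matrix_power(Z, b1) @ np.linalg.matrix_power(X, b0)
    state = np.kron(encode, I) @ state
    # Receiver decodes with CNOT followed by H on the first qubit, then measures.
    state = np.kron(H, I) @ CNOT @ state
    return format(int(np.argmax(np.abs(state) ** 2)), "02b")

for b1, b0 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert superdense(b1, b0) == f"{b1}{b0}"  # two bits recovered per qubit sent
```

The point of the protocol is that one transmitted qubit plus one pre-shared entangled pair carries two classical bits; it does not let a lone qubit hold more than one classical bit, so it does not turn these defects into a denser classical storage medium.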
There are a couple of ways to answer your question:
- If we are talking about storing one qubit per defect: reproducibly/reliably/scalably creating such defects in large numbers is impossible for now, so the answer is either "they simply cannot be built" or "just as long as a single defect, if we imagine we could build them".
- If we are talking about storing one qubit in 100 defects: the same issues as above hold, but if we imagine that we can build 100 defects... and if we imagine we have control over such an ensemble of defects... and if we imagine we can perform logic gates between neighboring defects, then we can use error correcting codes, and a hundred atomic defects can store the equivalent of one logical qubit for (exponentially) longer than a single defect. All the things I am imagining here are impossible for the moment (but people are working on it).
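As a rough illustration of the "exponentially longer" claim in the second bullet, here is a toy scaling estimate using the commonly quoted surface-code heuristic. The thread itself does not specify a code; the prefactor, threshold, physical error rate, and qubit counts below are all assumed illustrative values.

```python
# Toy model: logical error rate of a distance-d code vs. the bare physical rate,
# using the heuristic p_L ~ A * (p / p_th)^((d + 1) / 2).  All numbers assumed.

def logical_error_rate(p_physical, d, A=0.1, p_threshold=1e-2):
    return A * (p_physical / p_threshold) ** ((d + 1) / 2)

p = 1e-3  # assumed physical error rate per round, comfortably below threshold
for d in (3, 5, 7):
    n_physical = 2 * d ** 2  # rough physical-qubit count for a distance-d surface code
    print(f"d={d} (~{n_physical:>3} physical qubits): "
          f"p_logical ~ {logical_error_rate(p, d):.0e} per round vs p_physical = {p:.0e}")
```

Each step up in code distance multiplies the logical error rate by another factor of p/p_threshold (a factor of 10 with these assumed numbers), so roughly a hundred physical defects (d = 7 here) buy an exponential-in-d improvement over a single defect, provided the physical error rate is below threshold and all the imagined control capabilities exist.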