Having taught low-temperature condensed matter labs, I can say that a big part of the grade is figuring out what went wrong, and either correcting for it or at least acknowledging that it went wrong. The student needed to give more information about the experimental setup (what instruments did they use? four-point or two-point resistance? resistivity vs resistance? what is R_0?) and why they think the experiment didn't work. It looks to me like they had something miswired, so they only measured noise.
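To illustrate the four-point vs two-point distinction mentioned above, here is a minimal sketch. The resistance values are made up for illustration; the point is that in a two-point configuration the lead and contact resistances add in series with the sample, which can swamp a small low-temperature signal, while a four-point (Kelvin) configuration ideally removes them.

```python
# Hypothetical numbers, chosen only to illustrate the effect.
R_sample = 0.05   # ohms: an assumed small sample resistance at low temperature
R_lead = 2.0      # ohms: assumed wiring + contact resistance per lead
I = 1e-3          # amps: excitation current

# Two-point: the voltmeter sees the leads in series with the sample.
V_two_point = I * (R_sample + 2 * R_lead)
R_two_point = V_two_point / I    # dominated by the leads

# Four-point: separate current and voltage leads; ideally no current flows
# through the voltage leads, so their resistance drops out of the measurement.
V_four_point = I * R_sample
R_four_point = V_four_point / I  # the true sample resistance

print(R_two_point, R_four_point)  # 4.05 vs 0.05 ohms
```

With a miswired or two-point setup, the measured value can bear little relation to the sample itself, which is one way a student ends up reporting noise.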
Indeed, a scientist should have some exposure to experimentation. Experiments don't always work on the first try, and often require some knowledge and skill.
The ones who can't solder go into electrical engineering and sit at a computer terminal all day. (joking of course)
I'm pretty sure the parent comment just dunked on you by demonstrating a deep well-read understanding of the underpinnings of both logic and statistics.
You're commenting on the wrong site (or just ignoring the rules & spirit of discussion) if you thought it necessary to tell someone they got "dunked on".
On the matter of probability and science, I like how Karl Popper put it, although I can't find the text now, so I must report it from memory. His point was that scientific hypothesis formation is an instance of inductive generalisation while probabilistic inference is a form of abductive reasoning, and so using probability to support an inductively derived hypothesis is basically supporting a guess with another guess.
Statistics of course is not the same as probability. Personally I think statistics is a bunch of hooey.
Superconductivity in TBG was originally sold as "unconventional". This article reaffirms that claim by showing how it cannot be a BCS superconductor, and more. Very interesting.
It's worth reiterating that while graphene can have some niche uses in "the real world", the main reason that it is so highly prized within academia is that it is a superb platform for studying fundamental physics, as in this work. Maybe in the future this will lead to room-temperature superconductors or something along those lines. Maybe not. Nobody jokes about how the Higgs boson has failed to leave the lab.
Offtopic, personal bugbear on a grumpy early morning:
I have a physics degree, which is true of only a minority of HN users, and I have no idea what TBG and BCS stand for here. Using abbreviations when communicating with audiences that can't be expected to know them wastes everyone else's time to save you seconds of typing.
Whenever I read about twisted bilayer graphene or topological insulators, I can't help thinking that these are going to be the basis for next-gen transistors. Of course I totally agree that understanding nature is its own goal; please raise my taxes for graphene research!
The other problem with this approach is that it is limited to 50% efficiency long-term. They claim "up to 90%", but it is only 90% efficient immediately after switching directions. The efficiency then drops to 0% before switching modes again.
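The arithmetic behind that 50% ceiling can be sketched as follows. The linear decay profile here is an assumption for illustration only, since the comment doesn't specify the decay shape; it just shows why a 90% peak right after switching, decaying to 0% before the next switch, averages out to roughly half the advertised figure.

```python
# Assume (for illustration) efficiency decays linearly from the 90% peak
# just after a direction switch down to 0% at the next switch.
N = 1000
peak = 0.90
efficiencies = [peak * (1 - i / N) for i in range(N)]  # 0.90 -> 0 over a cycle

average = sum(efficiencies) / N
print(average)  # ~0.45: the cycle average sits near half the peak
```

A slower decay would push the average somewhat higher, but any profile that reaches 0% before switching keeps the long-term figure well below the quoted 90%.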
I was measuring a delicate electronic device at MagLab in Tallahassee. They warned me that the best data would be had at night, because of reduced noise from a nearby radio station. Precisely at 8 PM every night my data became noticeably sharper.
> Creating a working device typically takes them dozens of tries. And even then, each device behaves differently, so specific experiments are almost impossible to repeat.
This is frustrating. You can make two twisted bilayer graphene samples at 1.10 degrees precisely (to within 0.01 degrees), and they will show completely different phase diagrams. One will superconduct, but the other will not. Things like that.
What I learned recently is that every transport paper's twist angle report is wrong. The two hypothetical samples are actually probably not both 1.10 degrees. The uncertainty in twist angle should be of order 10-20%, rather than <1%. I even made this same mistake in my own paper last year!
When creating these TBG samples, we used to literally tear the graphene in half, to get accurate relative alignment of the two halves. It was very clever, but it imparts a huge amount of strain to the two layers, generally of order 0.1-0.3%. This seems like a small amount, but moire patterns are extremely sensitive to this (roughly strain amount divided by twist angle, but the twist angle is very small), so the unit cell area gets modified by anywhere from 5-30%. In transport measurements, we can only measure moire unit cell area, but not twist angle. The number 1.10 ± 0.01 deg is calculated assuming no strain, and this is an incorrect assumption. An STM paper from 2019 first pointed this out, but it was just a couple sentences buried in the supplemental material, and I (and most others) completely missed it.
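A back-of-the-envelope version of that sensitivity argument can be sketched numerically. This is not a full treatment of anisotropic heterostrain, just the rough scaling stated above: for small twist angle theta the moire period is about a / theta, so a heterostrain epsilon shifts the moire length scale by of order epsilon / theta, and the area by roughly twice that.

```python
import math

a = 0.246e-9                 # graphene lattice constant, meters
theta = math.radians(1.10)   # nominal "magic" twist angle

# Moire period for small twist: lambda = a / (2 sin(theta/2)) ~ a / theta
lam = a / (2 * math.sin(theta / 2))   # comes out near 13 nm

for eps in (0.001, 0.002, 0.003):     # 0.1-0.3% heterostrain, as quoted above
    length_shift = eps / theta        # rough fractional change in moire period
    area_shift = 2 * length_shift     # area goes as length squared
    print(f"{eps:.1%} strain -> ~{length_shift:.0%} period, ~{area_shift:.0%} area")
```

The resulting ~5-30% shifts are far larger than the <1% implied by quoting 1.10 ± 0.01 degrees, which is the point: the unit cell area you measure in transport does not pin down the twist angle unless you also know the strain.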
Even four years after moire materials took over the condensed matter world, we still don't understand the basics of how the materials work. It's very exciting, hot stuff.
> When creating these TBG samples, we used to literally tear the graphene in half, to get accurate relative alignment of the two halves. It was very clever, but it imparts a huge amount of strain to the two layers, generally of order 0.1-0.3%.
Does "it" mean that the mechanical tearing of the crystal imparts the strain, or is it instead the newly introduced surface boundary (in 1D) that imparts the strain?
[I ask because long ago I was familiar with some of the crazy surface physics that would happen in IV-IV and III-V systems, and just wondering what effects the 1D termination of the 2D lattice might cause.]
When they mention tearing it in half, I'd imagine this more closely resembles what we would think of as slicing, and only has the tearing effect because of the size of the material.
You can probe different areas of the same device by adding many electrical probes, usually in a geometry called a Hall bar. In the old days of TBG, the different regions of the same device would do wildly different things. These days we are much better at stacking, and the different regions of the same device will be mostly the same.
I'm probably misreading this, but I see "In this study, age ≥65 years, immunosuppression, diabetes, and chronic kidney, cardiac, pulmonary, neurologic, and liver disease were associated with higher odds for severe COVID-19 outcomes;" listed as the eight risk factors. Where are you seeing the ones you listed?
Good news! The CDC does not identify every comorbidity as a risk factor. I’m glad they provide the data so you can make a more nuanced decision on your own.