I always say that important scientific discoveries arise from better tools. Once technology can make the tool, it doesn't take long for the discovery to be made. It may take a genius, but there are a lot of geniuses on Earth.
Without precision instruments, how could we know that light has a speed, that stars are not points of light on a crystal sphere surrounding the earth, that diseases are carried by microbes... General relativity was a way to solve anomalies in the orbit of Mercury, something only detectable with precise measurement. Without that measurement, general relativity would make no sense: why use all these complex formulas when Newtonian gravity works just as well? Send Einstein's work to the 16th century and it would be slashed by Occam's razor.
The "tenth of a second" problem is just that, technology able to measure tenth of seconds opened new fields.
I don't mean that technology is more important than scientific research, but they go hand in hand: no good science without good observation, no good observation without good tech, and no good tech without good science.
> General relativity was a way to solve anomalies in the orbit of Mercury, something only detectable with precise measurement. Without that measurement, general relativity would make no sense: why use all these complex formulas when Newtonian gravity works just as well? Send Einstein's work to the 16th century and it would be slashed by Occam's razor.
I don't think that's entirely true. As far as I know, the Mercury precession anomaly wasn't considered a serious problem for Newton's theory at the time.
Einstein was motivated by theoretical problems that arose from trying to reconcile Newton's theory with special relativity, which inspired him to take the principle of equivalence seriously: the notion that gravitational mass and inertial mass are fundamentally the same. This is what led to the famous field equations. Their derivation is not some sort of reverse engineering of the Mercury precession anomaly. Someone wrote a few words about it here [1].
The fact that GR could explain the Mercury anomaly was a bonus. The breakthrough, however, came with the first measurement of light being deflected by the sun during a solar eclipse.
>I always say that important scientific discoveries arise from better tools
I dunno, from Archimedes to Newton, and from Maxwell to Einstein, most important scientific discoveries arose from thinking breakthroughs (and led to better tools, as opposed to having been caused by them). The "better tools" at most helped verify it after it was expressed, not reach it.
It's the incremental, evolutionary stuff that's mostly helped by better tools...
Actually he is correct: the phenomenon of experimental physics seeming to lag theoretical physics is quite a recent one. It's also largely based on a small subset of fields that seem to get most of the attention. There are a lot of fields (such as nonlinear dynamic systems) where our theoretical understanding still significantly lags experimental observations.
You can think up "breakthroughs" all you want, it's easy. As any mathematician or software developer can confirm, it's easy for a human to propose an explanation and immediately convince themselves of its correctness; validating hypotheses is hard work that the brain desperately tries to avoid. And in physics, validating things on paper can only get you so far - you ultimately need experimental confirmation to make solid science.
Here is where technology becomes necessary: in order to favor one hypothesis over the other, you need to find where they give conflicting predictions, and then check what actually happens in reality. That's only possible when you can perform the necessary experiment and observe the results with sufficient precision to discriminate between competing explanations. Technology is what gives you this capacity, expanding the scope of experiments you can run.
As someone else mentioned, you could drop Einstein back a few hundred years and his work would be laughed off - because ultimately there would be no way to tell if he's right or not. The theory may look beautiful (if complex) on paper, but if you can't discriminate between it and alternatives through an experiment, it's meaningless and you may as well use the simplest model that fits all your current observations.
> The "better tools" at most helped verify it after it was expressed, not reach it.
And in doing so, they validate those scientific breakthroughs. Science is not only about coming up with models or hypotheses. Science is "theory and validation." Take out either of them, and you don't have science.
Humans have been theorizing and creating mental models for thousands of years. But only with better tools were they able to select verified models, discard those that didn't fit the observations, and, as a consequence, make more and more sophisticated and accurate predictions.
To take two recent examples, the first photograph of a black hole and the first detection of gravitational waves (both predicted by Einstein's GR, coincidentally) required a stupendous amount of collective brainpower from brilliant engineers and scientists. 100% of that brainpower went into figuring out how to build those better tools and analyzing the collected data. The scientific community itself has acknowledged the criticality of those better tools: the 2017 Nobel Prize in Physics was awarded for building a "better tool" to observe gravitational waves, LIGO [1].
This is not to say that theory isn't important, but to call out that observational tools don't "at most help verify theory"; they are an integral part of science.
There is, of course, a way to validate theory within its own framework, and that is what mathematicians do. I guess philosophy is a similar domain, but I have not read much about it, so I can't comment much.
On a related note, I highly recommend this panel discussion about the topic of GW, LIGO etc., from the horse's mouth. LIGO is absolutely mind-blowing, a testament to collective human ingenuity [2].
Hmmm, how about "no". It really boils down to how you define science. If you define science as "published in peer-reviewed journals" or "knee-deep in academia", the statement is correct. If, however, it's knowledge based on observation of phenomena (literally what science means), preferably with an understanding of why things happen (having a theory), it's so wrong it's not even funny - it's a very dangerous standpoint, because the alternative to the scientific method is magical thinking and cargo cults.
Engineering existed way before any kind of math or symbolic manipulation. It was a collection of heuristics.
Information theory came after people were already sending signals over the wire. Similarly, there were architectural feats before geometry or Newton’s laws were discovered.
Was it not based on knowledge from observation (science)? You don't need math and symbolic manipulation, not to mention information theory, to do science, even if it helps immensely when you do use these tools!
But we're not talking about just abstract principles. By themselves, these abstract principles are just that - abstract. They only matter when applied to the concrete, when you use them to learn something new about reality.
Technology is how we extend the set of things to which these principles can be applied, and thus the set of things we can learn about our world.
In a purely positivistic viewpoint you are probably right; however, you seem to be taking the deductive approach (using principles to learn) as opposed to the inductive one (using learnings to extract principles). I wouldn't actually dismiss either of them.
Edit: re-read your comments again, and I see your point now. You are absolutely right.
I would say that really it's both: unresolved questions (which can come from experiment or theory) often spur the development of new measurement technology. On the other hand, new measurement capabilities often spur new scientific discoveries.
The best tools are typically at the bleeding edge of technology, as the tool generally has to be of better quality than the thing being made/measured with it. So if you are going to make the best, you need a tool that is better than the best (which typically means it costs a lot). If you ever want a hard-core engineering job, go and work for a company that makes the things that make things, whether that be machine tools, electrical measurement equipment, production lines or whatever.
For example, ASML makes the tools that make chips. Keysight and Tektronix make the tools that measure production. Those are just random names; the list goes on. The upper echelons of such companies are at the frontiers of engineering.
I don't think this is generally true. There are lots of examples where you measure something with higher resolution than the tool's manufacturing precision (for instance, an STM), and similarly lots of examples where you use a lower-precision tool to build a higher-precision tool (for instance, a mill or lathe; optical focusing would also allow for a shrink if it weren't for the diffraction limit).
But moving a generation forward in precision is good; you only ever get close to perfect, with diminishing returns.
> I always say that important scientific discoveries arise from better tools.
This is something I hadn't thought about before, but it makes total sense.
When I read that, my first thought for current technological tooling that might lead to new discoveries was the massive transformer language models that OpenAI / EleutherAI / others are building.
Some basic extrapolation of how far, how fast, technology like GPT-X has come suggests that future versions could (should?) be revolutionary for knowledge discovery.
You might be interested in older work by philosopher Larry Laudan. He posited mutual positive feedback (or co-evolution) between: a) better data; b) better models or theories; and c) better instruments. Basically agreeing with you.
Likewise, he also noted that constructing better instruments is theory-dependent, as better theory allows a more complete understanding of sources of error (not to mention entirely new technologies).
Laudan called his a "tri-partite" model, as I recall.
I bought an air filter; it has a PM2.5 meter on it.
Every day I'm discovering and understanding what drives PM2.5 in my house.
It's not really an engineering breakthrough; you could loophole it to "PM2.5 meters are getting cheaper." But really, to me, it's the measurement: just better data.
I could have bought a PM2.5 meter years ago, but culture held me back. I personally don't believe in working from home, but if you did, once again the tech has existed for a while; there's no real breakthrough happening.
Get the tools to the people.
Why is there no free, easy-to-use OCR (Tesseract = PITA)? I don't think a breakthrough or discovery is needed there.
Cooking, even without burning things, can really spike it. No behaviour change yet.
With the fireplace I now behave in a way that creates fewer spikes; it's just more effort to be less smoky. And lighting a fire well is a cool skill you can nerd out on. I do want to get a better fireplace.
Some days it just spikes and I don't know why. Perhaps pollen; I'll work it out one day, perhaps.
I bought an ultrasonic incense diffuser, because I'm a nerd that's kinda techy and fun. Undecided if I like it.
And I always run the filter; its wattage is less than an LED light, and it costs a little (filters aren't real cheap) but way less than my 'longevity' drugs.
I fail to understand how this is related to the topic of scientific discoveries and the technology underlying them. Yes, your personal discoveries might be based on purchasing a PM2.5 meter, but personal discoveries and understanding are not generally related to scientific discoveries or understanding. If you study, e.g., quantum mechanics, it will not generally advance our (as a society) understanding of quantum mechanics. Maybe you will become a great quantum researcher and make discoveries, but it will be your discoveries then that advance understanding.
If you want to jump back to the main article around the philosophy of science, then yes, this isn't relevant. I haven't read the book, but I'm not talking about temporal consciousness, phenomenologists, philosophy and such things.
> it will be your discoveries then that advance understanding
This in practical reality doesn't really happen. The philosophy fails here.
I'm just talking analytically about what happens when people get a device that can measure 1/10th finer, or how Excel has pushed science; not about philosophy or theoretical ideas around the purer sciences.
More specifically, citizen science like this not-for-profit air quality map - https://waqi.info/. Tools to the people.
Scroll right to the numbered list; it's fascinating: questions of scientific objectivity, the idea of the psychology study, cinematography, statistics, continuity of reality, etc., all arising from the need to observe events at a precision of 0.1s.
The author also wrote an excellent perspective on management/worker relationships and other corporate shenanigans called "The Gervais Principle" [0], set against the series "The Office". It's definitely opinionated.
I find it a bit dated now: interesting at a high level, but he really zooms in on the details until it becomes a kind of nonsense. It was interesting to read and think about when it came out, but I don't really think it is correct, predicts anything in any meaningful way, or even provides a coherent strategy to manipulate others, if that's the path you want to go down in the first place.
Yeah, when I first read it I thought of it as an interesting and fun theory that was a good read, but I didn't think it was necessarily meant as truth so much as a tool to look at the series, examine it, and play with the ideas and themes presented.
For example, I remember the idea of the Losers, as explained there, was fairly eye-opening for me. Regardless of the name, the fact that they were a largely savvy group that just chose an entirely different trade-off and decided to treat their jobs purely as a means to an end so they could focus on things they cared about clicked with me in a way it hadn't really before. I mean, I'd worked jobs like that, with a slightly different aim, as a teenager and while going to college, but I was so focused on getting a job more aligned with my interests (which I was lucky enough to do partway through college) that it skewed my view of work and the people around me. The idea of "Losers" made a few things click for me, by linking the middle-class expansion of the mid-20th century, the massive manufacturing sector, and the people that were able to live great middle-class lives because of it. These "Losers" are people that have carved out their own little chunk of that dream, and are consciously putting in the requisite amount of effort to not be fired and/or not feel like they're absolute shit at their job, as nothing more and nothing less is really needed for them to meet their real goals, which are not work related. There's something to envy in that.
Does that mean everyone fits in the categories presented, or that all the categories are accurate in all or even most respects? No, but nevertheless, as a prism with which to compare and reassess my own experiences, I found it extremely useful.
I found this blog post [1] on twitter where people were looking for literature on "office politics". It has some good points:
> Instead, I think of them in terms of what the modern corporate structure has done to them: Broken the losers, tricked the clueless, and forced the sociopaths into ethical conundrums.
> To be specific, I propose that we name the losers, clueless and sociopaths to “pragmatists, idealists and opportunists.” Their roles, relationships and dynamics remain the same ...
I read the whole thing yesterday and it's more about culture and how the majority of people in groups manage interpersonal things.
The pontificating on what drives the founders and c-levels kind of goes into the weeds a bit. I was hoping for more concrete examples of the types of speech, as well.
But the culture parts of the essays were quite thought-provoking.
Eh, all the relationships and characters in The Office are far richer and more complex than Office Space, and experience real growth and change over the series. Honestly, I view it as one of the best comedic series ever produced. I only caught up and finished the second half of it a year or so ago, so it's still fresh in my mind.
Originally the Babylonian clock split day and night each into 12 equal segments -- thus the length of an hour was different at night and day, and varied each day.
The concepts of minutes (small part) and seconds (even smaller part) are quite modern -- about a thousand years old -- despite being broken into 60 (those Babylonians were quite influential!); once you do that you can continue into even smaller fractions, but it was all pretty abstract.
Seconds only became interesting about 500 years ago when the clock technology could finally represent them, but they've only been useful for the last 250 years or so.
So having .1 s (not .1s!) be interesting 130 years ago is itself interesting, and reflects the state of technology (not science) of the time.
The first (.1 s) is the written form of one-tenth of a second. The second is basic written algebra for one-tenth the value of s factorial (to be pedantic about the exclamation point).
Kind of like the pseudo-math joke around Xmas when people write Ho Ho Ho as Ho^3, when it should be (Ho)^3; otherwise it's just Hooo.
I recall many years ago playing Team Fortress. A 100ms ping time was decent. But whenever it got into the 30ms range my kill/death ratio would skyrocket. They later implemented code to stop low pingers from dominating.
Same thing with the mouse. A wireless mouse feels unnaturally sluggish to me now. It was years before I realized that it wasn't just in my mind and that most gamers use wired.
100ms ping is no small amount, especially in skill games where almost everyone is trying their best. Assuming you're talking about TF the Quake mod here.
Not that I advocate for wireless mice, but modern gaming wireless mice do not suffer from lag issues.
If 0.1s makes any sense (as in humans processing 10 events per second), which I do honestly doubt, then 100ms -> 30ms still means that the other party is almost one full event behind.
Most people haven't played Quake/UT/CS competitively.
Btw, with modern games 30ms isn't just one full event/server tick behind. Multiplayer shooters do have tick rates of 100 or higher now, so 30ms would be more like 3 cycles behind.
Still, it only goes to show it matters at so many levels.
Like, if you're using a 60Hz screen with 30ms latency, you're always almost 2 frames behind, so you always react 2 frames late compared to another person on a 60Hz screen at the same ping with, say, 2ms latency.
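To put rough numbers on that, here's a tiny back-of-the-envelope sketch (my own illustration, not from anyone upthread; the function name is made up):

    # How many frames (or server ticks) does a given latency amount to?
    # Illustrative only; assumes one frame/tick lasts 1000 / rate_hz milliseconds.
    def frames_behind(latency_ms: float, rate_hz: float) -> float:
        frame_time_ms = 1000.0 / rate_hz
        return latency_ms / frame_time_ms

    print(frames_behind(30, 60))    # ~1.8, i.e. "almost 2 frames behind" on a 60Hz screen
    print(frames_behind(2, 60))     # ~0.12, barely a tenth of a frame
    print(frames_behind(30, 100))   # ~3.0, roughly 3 ticks at a 100Hz server tick rate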
There's something about this guy's style of prose that I find deeply unreadable.
Edit: I think I've figured it out. If there's a complicated way to say something simple, he takes the complicated way. If there's a rarely seen word he can substitute where a common word will do, he uses the rarely used word.
His prose reads heavy, but the ideas in it are light. This is the exact opposite of Paul Graham: his prose is easy, but I find deep and actionable knowledge in many things he writes.
Slightly off-topic (as usual), but this reminds me of the book Einstein's Dreams. Different ways of thinking about time.
"The book consists of 30 chapters, each exploring one dream about time that Einstein had during this period."
The chapters are versions of the world where time has different rules. In one, the closer you get to the town square, the slower time moves. In another, people build higher and higher houses, since you age more slowly the higher you are from the surface of the Earth. And so on...
It seems inevitable and natural that it would be dramatic as we venture beyond our comfy zone of time between .1s and 10^9s - I'm not sure why the lower end is getting all the love here. Geology, anyone? Cosmology?
It's also funny that the OP mentions astronomy as the origin of the .1s problem, but doesn't mention, you know, machinery. Engines (internal combustion or otherwise) were cycling in the .1 - .001s range in the 19th century.
And in terms of the challenges of modernity, isn't the biggest modern challenge not just to time, psychology, or physics, but to the entire notion that knowledge can be organized and expanded upon by succeeding generations, when institutional integrity declines and the proliferation of baseless speculation and idle daydreams is allowed to masquerade as legitimate academic content?
We humans are stuck in our time and place, at all time scales. At the high end, there is no escaping the light cone. And I'm never going to experience nanosecond time. We exist at this level of zoom, and unaided humans are stuck here. So what? We range out with our imperfect instruments. No sane person is asserting that telescopes and microscopes are eyes. We get born, recapitulate what our ancestors learned, use their tools, build our own, and push back the veil of ignorance a little.
Interestingly, 0.1s/note = 10 notes per second is approaching the upper limit of how many notes per second a musician can competently/clearly play on a string instrument!
Adam Neely has explored this. 20 Hz is the lower bound of human hearing, a/k/a 20 beats/second. Any faster than that, and beats blur into a constant tone.
Different phenomena and physical senses have different temporal distinction rates. Smell is probably among the slowest. Touch (really a collection of senses from pressure, heat, cold, vibration, proprioception, pain, and possibly others), ranges from slow (heat/cold) to fast (vibration). Visual range gets close to the 1/10th of a second described in the article, though some changes may be slower, others (usually based on flicker or interference patterns) may give even higher resolution, though an upper bound of 1/100th, possibly 1/1000th of a second (strobe effects) is probably the extreme, and that's already special cases outside typical (and certainly evolutionary) experience.
I have not seen it mentioned either here or in the article [1], but in the 100 metres (and the other sprint disciplines) a reaction time faster than 1/10th of a second is considered a false start [2].
Not exactly on-topic (as usual), but I enjoy following Venkatesh on Twitter (https://twitter.com/vgr). There is a fair bit of self-reflexive/recursive terminology to sort through, but I guess the best way to put it is that he's just a promiscuous thinker--and the amount of in-the-clear thinking he does throws its fair share of sparks.
> the failure to come to terms with the 0.1s limit in human cognition gave birth to modernity
I wonder what exactly he means by that. Sure, eyes may start betraying you at faster speeds but human cognition is in no way limited to one tenth of a second. For example, speedrunners reliably hit 1/60 of one second actions, and I'm sure many other human endeavors require similarly fast thinking.
But that's for events that can be planned in advance, right? I wouldn't be surprised to find that while those people are very fast at responding in general, that for some things they've offloaded the timing to their nervous system through repetitious training and/or subconsciousness.
I doubt a major league baseball player has time to consciously decide exactly when to start the swing for many pitches, but instead relies on feel and training, and allows their mind to determine whether they should abort entirely. A speedrunner is just another form of highly trained individual, so I think whether they are good at hitting sub-100ms timing events (as long as they are advertised ahead of time or are regular) is sort of orthogonal to whether people can consciously measure periods that small in general terms.
I think this is exactly right. Speedrunners (and film editors) can perform actions with frame-perfect precision when they can plan those actions ahead of time. It might be similar to how musicians can perform notes with extraordinary temporal precision, as long as they know ahead of time what the timings will be (and in some cases as long as they have external references to resynchronize with).
I don't think there are any speedrunners who can be expected to give a frame-perfect response to an unexpected event.
The observational astronomy version could be that a human being could synchronize two clocks to less than 0.1 second difference (by starting one when the other reached a specified time, maybe), but still not measure an uncontrolled natural event to the same precision.
I'm not sure how this works with timing human athletic performance using a stopwatch, but I think I would actually be wary of trusting a stopwatch-measured time much below 0.1 second precision, even if the stopwatch displays more decimal places. I think athletic records in the timeframe this book is talking about were reported to the nearest 0.1 second and not below, and having more decimal places in our measurements has required video replay and, later, other electronic timing methods.
> I don't think there are any speedrunners who can be expected to give a frame-perfect response to an unexpected event.
This is the reason why tool-assisted speedruns are faster than speedruns without a tool. TASBot (a great thing to search for fun videos from Awesome Games Done Quick) certainly can react within a single frame, every time.
Humans generally rely on cues, usually called a "setup", to do these.
That's funny, I've watched many speedruns, both RTA and TAS, and thought about the reaction time issue a bit, but never fully articulated this important difference to myself. Thanks for putting it so clearly!
> But that's for events that can be planned in advance, right?
Not planned, but an unconscious reaction or muscle memory. There isn't enough time to think about something and then trigger your muscles to move. It's got to be second nature to hit that level of "reaction time".
Yeah, that's along the lines of what I was thinking (as noted later), but I think you can also get close or achieve it if you can prime yourself because you know the event is coming, and get the timing down. Sort of like the cyclone arcade game (if it wasn't a scam [1]). I think there's a huge difference between hitting a random 100ms time and hitting a 100ms time that's coming at a set moment that you can see and judge. That's what I meant by "planned", even if that word poorly encapsulates it.
I think in some cases it can be planned, in the sense that Walter Murch talked about frame-perfect cuts in video editing (which apparently is common and other expert film editors can do it too, not just Walter Murch). In that case the details of the video you're reacting to would literally be different for every cut, so the ability seems slightly more generic, because it's not aligned to exactly the same sensory or physiological cue each time it's used.
> "One of the most obvious reasons for this standard pattern is that while it is possible for any number of factors to extend the response time of a given trial, it is not physiologically possible to shorten RT on a given trial past the limits of human perception (typically considered to be somewhere between 100-200 ms), nor is it logically possible for the duration of a trial to be negative."
Because you're not reacting immediately. You're watching something move over a second or longer and hitting the button when they line up.
To prove it's not reaction time, you could even close your eyes for the last half second. And someone could do this on a trick they just learned and don't have muscle memory for.
> nor is it logically possible for the duration of a trial to be negative.
Yet hitting the button early for a trick with a small window of frames happens constantly. That by itself should show it's not a matter of reaction time.
The pitch length is 20.12m (22 yards), and typical fast balls can be around 120km/h = 33m/s. For a "yorker" that might mean only 20.12/33 = 0.6s for the ball to reach the batsman, but for a typical bounced delivery, it's probably more like 1 second.
However, the ball usually bounces about 5 m from the batsman and changes direction after the bounce, which gives only 5/33 = 0.15s (or maybe 0.2-0.25s) for it to reach the batsman after the bounce.
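The same arithmetic as a tiny sketch (my own, assuming a constant 120 km/h and ignoring the ball slowing in the air or after the bounce):

    # Rough cricket timing arithmetic (illustrative only; constant-speed assumption)
    pitch_length_m = 20.12            # 22 yards, stumps to stumps
    ball_speed_ms = 120 / 3.6         # 120 km/h ~= 33.3 m/s
    bounce_to_bat_m = 5               # typical distance from bounce to batsman

    total_time_s = pitch_length_m / ball_speed_ms      # ~0.60 s over the full pitch
    after_bounce_s = bounce_to_bat_m / ball_speed_ms   # ~0.15 s from bounce to bat

    print(f"{total_time_s:.2f} s full pitch, {after_bounce_s:.2f} s after the bounce")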
Which means that the batsman is unable to change their swing after the bounce. That I find hard to believe. I think they do.
Think of that as "on the order of" as opposed to "precisely".
Human cognition operates on timescales faster than 1 second but slower than 1/100th of a second. In most cases, assuming about a 1/10th-second response cycle is about right, and, in the context of the article and book, it's when nonhuman mechanical systems became capable of operating within that timeframe that they exceeded human capabilities.
The eye and memory will register only a vague blurred impression of an event taking less than 1/10th of a second. With very standard photographic equipment, it's quite possible to capture a single frame in clarity at 1/100th to 1/1000th of a second. The former is considered relatively slow, though good for freezing virtually all macroscopic motion. At 1/1000th of a second, virtually all human-scale activity (say, sports or activities) is frozen, though very high-speed events (high-speed machinery, explosions, bullets) may still show motion blur. A good film camera would typically have a minimum exposure of 1/1000th to 1/2000th of a second.
Moreover, mechanised sensing equipment can distinguish between events at these levels. Human perception will simply get causality and order all out of sequence. It wasn't until June of 1878 that the very common phenomenon of a galloping horse was conclusively documented by means of Eadweard Muybridge's photography, and the motion of the horse and its legs accurately understood.
Muybridge's sequence consisted of 24 photographs, at 1/2000th of a second each (see discussion above), with the total sequence taking about 1 second. (The cameras were placed 27 inches, or about 69 cm, apart; the total span was 16 meters; the horse was galloping at a 16m/s pace, or one mile every 1:40.)
Successive frames were then about 1/24th of a second apart, less than half the 1/10th-second limit discussed in the article.
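A quick sanity check on those figures (my own arithmetic, just re-deriving the numbers quoted above):

    # Re-deriving the Muybridge figures quoted above (illustrative check only)
    frames = 24
    spacing_m = 27 * 0.0254            # 27 inches ~= 0.69 m between cameras
    span_m = frames * spacing_m        # ~16.5 m total span
    horse_speed_ms = 16.0              # ~16 m/s, i.e. about one mile every 100 s

    duration_s = span_m / horse_speed_ms      # ~1.0 s for the whole sequence
    interval_s = duration_s / frames          # ~0.043 s between frames, roughly 1/24 s

    print(f"span {span_m:.1f} m, duration {duration_s:.2f} s, interval {interval_s:.3f} s")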
Not long ago I realized there's a whole bunch of stuff I just "make up" about the world.
The "me" part of my brain knows I blink. I know I'm blind, but I don't remember it at all. Turns out it's actually worse thanks to being blind while my eyes rotate in their sockets. "saccades".
There's some machinery grinding away all the time, making up little stories about the world to maintain a consistent view. I spent, and continue to spend, a lot of time wondering where else I'm blind.
The snake thing is cool. I know I have some hardwired reflexes around heat and pain. I can't think of a case where I've exercised the "hidden monster" reflex, but I believe you, and I probably have it, or something like it.
I play indoor football regularly, both as a goalkeeper and as a defender. It's similar to the full 11v11 game, but on a much smaller pitch, and thus much quicker.
I can't count the number of times I've performed (useful) actions, to then almost be stunned for a moment as I reconcile what happened.
The one I remember most vividly is a ball that was passed right at me from across the pitch. I knew my teammate had kicked it, but I thought it was a shot towards goal at first. I couldn't see it until it was about 2 metres away from me, as it came through a defensive line, but it was a driven, pacy pass towards me. I stuck out a foot and kicked it into the goal very tidily, before I had even registered that the pass was actually coming towards me.
Feels like time slowing down around you for half a second, pretty cool.
Oh yeah! I was almost in a car accident. The car in front hit the brakes, and my car was very close behind. I touched the brakes and passed on the shoulder. I definitely relate to the time slowdown. I think it was adrenaline; I remember shaking afterwards.
As someone who doesn't have any particular phobia or fear of or around snakes (a generous caution, yes; fear, no), I also had the experience of unexpectedly encountering one and discovering that my subconscious body had responded and fetched me away from it several seconds before I consciously recognised what had transpired.
It was and remains a powerful and fascinating experience.
Watching the Olympics and seeing some of the hundredths-of-a-second differences in finishes is really fascinating. For the losing athlete, it must be mind-numbing to think that the tiniest of tiny things done slightly differently would have changed the result.