I wish the article got into more of the technical details behind how this actually worked. I find it absolutely jaw-droppingly incredible that they were able to shoot gigabytes of essentially IMAX-quality negatives, develop them in flight (which requires temperatures of around 20ºC for all of the materials involved—Edit: they didn't do this because it would've been too difficult, and ended up using a mostly dry development process. See Edit link), scan them(!!!), and then radio them back to Earth. I would find this amazing today, and all of this happened in the 1960s!
> "I would find this amazing today, and all of this happened in the 1960s!"
If bowling alleys didn't already exist... and you pitched me the idea of fully automated systems to collect and re-rack pins and return 15-lb balls to the player after each and every throw... I would probably say that's either impossible or at least impractical at scale.
19th and 20th century engineering is amazing. Today, if I can't conceive of writing it as a mobile app, then I just shrug.
> 19th and 20th century engineering is amazing. Today, if I can't conceive of writing it as a mobile app, then I just shrug.
Skills thrive only if there is an incentive, monetary or otherwise, to practice them. Otherwise they atrophy.
In my past 6 years in Silicon Valley, I worked mostly alongside PhDs in physics, math, comp sci, EE, even quantum physics. That is a metric TON of talent. Do you know what you can do with that kind of talent? You can literally get a man to the moon. Every one of these guys (sadly including myself) was working on some throwaway nonsense: ETL jobs, frontend JS, distributed backend code that got rm -rfed and rewritten periodically, messaging systems replaced every year with messaging systems in the latest language fad, all kinds of ultimate software crap manned by the Jira monster with its never-ending tickets and feature requests.
So I just saved up the breadcrumbs and headed back to academia when I couldn't take it anymore. Better to work on serious stuff, even if it's poorly monetized. You only have one life.
This is one of the best comments I’ve read on HN... I agree fully. I’ve always wondered what could happen if we threw all these talented people onto really important stuff. btw: even though I’m not his biggest fan, I think Elon Musk is doing exactly this with Tesla and SpaceX...
I have a feeling that a lot of people today, confronted with the problem of designing a bowling alley, would go straight for a video camera, a robotic arm to pick up the pins and put them back in their spots, and machine learning to glue it together.
I wonder how the cost, reliability, and speed of that would compare to the mechanical solutions. There are a couple bowling alley rerack machine horror stories somewhere on r/talesfromtechsupport.
Today, the focus seems to be on making sure that a product/system is easy to modify, integrate, or tear down completely, instead of saying "we need to build a system that can do x, and only ever x, really fast and cheap for a long time". We don't build three-decade systems in the tech industry, we build six-month systems.
What's crazy about that is that some of those 6 month systems, which were designed at the time to be a throwaway, just to get by, are still in production 26 years later and causing all kinds of problems.
That's because we predominantly build software, which has the distinct benefit that it can be changed. If ten-pin bowling goes out of fashion in favour of eleven-pin bowling, your three-decade system goes in the landfill. Your six-month system gets an OTA update.
>I have a feeling that a lot of people today, confronted with the problem of designing a bowling alley, would go straight for a video camera, a robotic arm to pick up the pins and put them back in their spots, and machine learning to glue it together.
plus stuffing the pins with IoT sensors and WiFi chips, i.e. making "smart pins", "smart ball", ...
I wonder what stuff being done today (both kinds - known to the public as well as classified) would look amazing in 2060.
It's easy to be nostalgic for past grandeur, but remember that the vast majority of 19th and 20th century engineers spent all their time working on mediocre solutions to uninteresting problems, growing complacent, and then getting thoroughly bamboozled by Japan in the 80s.
I can't remember where it's from, but there's a story about a firm procuring a shipment of widgets from a factory in Japan, specifying an acceptable 5% defect rate. They received the shipment along with a separate box containing the 5% defective widgets and a confused letter of apology. That was also 20th century engineering.
There is amazing stuff going on, and there is a lot of bland, forgettable stuff going on, and that's the way it's always been.
Being in the UK, I can hop down the road and see what the Romans were up to; as you say, astounding, and over such a wide geography. Then again, 4,500 years ago in Egypt they built some majestic structures.
Agreed. The thing that fascinates me the most is how similar we are to the Romans, or Greeks for that matter. Roads, society, government, and water utilities. Even just sitting in a bar/restaurant that was a similar place back in Roman times.
Not to mention the networks of towers transmitting the latest gossip by electromagnetic radiation (i.e. light) or sound. Or the tourist agencies offering voyages to festivals. Or the high-speed train network ... wait a second, that doesn't sound right.
I was just thinking that the positive-negative developing system described in the linked document sounded not entirely unlike the one Polaroid developed for their cameras, in particular the Type 55 sheets.
Polaroid is also interesting, because almost all of their color processes are extremely fade-resistant and color-stable, which is more than can be said for any other wet film process, with the exception of Kodachrome and Cibachrome/Ilfochrome dye destruction prints.
It was all analog at the time, and much faster -- processing that amount of data digitally would have taken decades back then -- and for those who were involved, it was child's play to degrade the signals they'd spent so much energy configuring with proper timings and responses. Imagine the guys making circuits, turning dials while looking at their oscilloscopes.
In analog, you typically don't degrade with "noise" but with reduced "bandwidth" (e.g. filters) -- the details can be wiped out in "real time." What in digital you get by "increasing JPG compression" and waiting for the CPU to process everything can be achieved smoothly in analog with only a capacitor and a resistor.
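The resistor-capacitor idea above can be sketched numerically. This is a rough illustration only (the sample rate, time constant, and test signal are all made up), not a model of any actual hardware:

```python
def rc_lowpass(samples, dt, rc):
    """Discrete simulation of a single resistor-capacitor low-pass
    stage: each output moves toward the input by alpha = dt/(rc+dt)."""
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# A sharp edge (fine detail) gets smeared out by the filter,
# the analog equivalent of throwing away high frequencies.
edge = [0.0] * 10 + [1.0] * 10
smoothed = rc_lowpass(edge, dt=1.0, rc=3.0)
print(round(smoothed[10], 2), round(smoothed[-1], 2))  # 0.25 0.94
```

The output never quite reaches the input's sharp step, which is exactly the "details wiped out in real time" effect.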
Agreed, but if you look at the comparison between the earthrise photos (good/degraded) you see two things: there's a "scan pattern" where different vertical lines have different brightness/gains, and the highlights and dark areas are more contrasted than in the good version.
They might have added some filtering to the signal, but from the photo, that's not the most important artifact that was added.
> there's a "scan pattern" where different vertical lines have different brightness/gains. Also highlights and dark areas are more contrasted than the good version.
You get all this “automatically” by using much simpler circuitry doing the equivalent job of your “fine tuned” versions.
Yes, but only in one direction. If you low-pass filter a raster scan signal, then the image is blurred horizontally, but retains its vertical resolution.
Vertical filtering can be implemented in analogue circuitry using line delays (this was done for standards conversion as an alternative to pointing a camera at a screen), but it's a lot harder than taking a photo of a folded-up print-out.
What I meant is that to get the "stripes" to be less aligned and more jagged (or with different quality properties), it's enough to use worse circuitry or worse tuning than what they probably used to align them correctly. As an example, that's what old vacuum tube TV sets easily produced when the circuits weren't correctly tuned. The last phase, including producing the "worse suited" contrasts, was most probably done in the photo processing phase, I agree with that. I also agree that making the image more "blurry" is most easily done by making a bad photo. But it could also be that the "better" images were simply run in one pass through the "facsimile" machines which were used anyway by the publishers and the news agencies at that time.
This is when retouching involved a brush and an artist. They might have done it electronically, but...
For artificial scan effects like the Earthrise photo, make an acetate overlay, then do some good old-fashioned darkroom dodge and burn while making the print, with a slightly out-of-focus enlarger.
This is pretty funny in the context of the moon landing deniers. I can picture somebody telling them - you're right, the pictures are kind of faked. Not because we didn't go to the moon, but because we didn't want to let the Soviets know how good our cameras are.
If you want a really fascinating story, check out faxes from the far side. [1] tl;dr The US sent spy balloons over the USSR, many of which were captured. For their probe to the "dark side" of the moon, the Soviets used this film which was processed in the satellite and then essentially faxed back to earth.
But, hey, Silicon Valley is really good at ad tech. :-)
The most amazing point about landing on the moon in 1969 is that they had to program everything with... negative timestamps. Of course it’s a joke but it highlights how early it was in the history of technology.
It's strange to me that you talk about 1969 as early in the "history of technology". We had the wheel, bridges, steam engine, cars, refrigerators, air-conditioning, nuclear power, early robotic manufacturing, computers, and a nascent internet.
Unix time starts counting upwards from zero on January 1st 1970. It's a joke. If they had Unix computers in the lander they would have had to use negative time since the first moon landing happened in 1969.
In Vernor Vinge's far future novel _A Deepness in the Sky_, there is a mention of a "programmer-archaeologist" who digs deep into layers and layers of legacy software of spaceships; it's tens of thousands of years of stuff built on top of each other.
It's mentioned that the deepest layers count time in seconds from the moment that humanity first landed on the moon of its home planet.
In fact, there’s a comment to the effect that the actual start time is about a million and a half seconds after the moon landing — which means the original time system is Unix-based...
I just love the title "Programmer-at-Arms." Some of the seat-of-the-pants work we do in industrial automation, making changes to a machine's operating logic while it's actually running in production, feels just like that.
"Founded in 1982 by the major space agencies of the world, the CCSDS is a multi-national forum for the development of communications and data systems standards for spaceflight."
And indeed, the epoch:
"The CCSDS-Recommended epoch is that of 1958 January 1 (TAI) and the recommended time unit is the second, using TAI as reference time scale, for use as a level 1 time code. This time code is not UTC-based and leap-second corrections do not apply."
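As a rough illustration of how the two epochs relate, here is the naive offset between them; note this deliberately ignores the TAI-UTC distinction the quote warns about, which a real level 1 time code implementation must account for:

```python
from datetime import datetime, timezone

# Naive offset between the CCSDS epoch (1958-01-01) and the Unix
# epoch (1970-01-01). This counts calendar days only and ignores
# the TAI-UTC difference.
ccsds_epoch = datetime(1958, 1, 1, tzinfo=timezone.utc)
unix_epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
offset = unix_epoch - ccsds_epoch
print(offset.days, int(offset.total_seconds()))  # 4383 378691200
```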
> images were locked away from the public as they would have revealed the superior technology of the USA’s spy satellite cameras
Whenever I read about 50-year-old government secrets being revealed, I wonder about all the things happening today we're not being told about -- that we'll learn about 50 years from today.
Looking at the Earthrise image, as another commenter pointed out, they didn't simply degrade the image, but they deliberately added vertical lines with differing brightness and gain, so it would appear that the film scanners on the Lunar Orbiter could scan only at a very low resolution.
To use an analogy: Suppose you request a secret document from the government under the Freedom of Information Act. They decide to give it to you, but they redact 99% of the material with black markers. Oh, but they also retype the entire document, without the blacked-out portions, so you think you received the whole thing. They even add staples, 3-hole punches, coffee cup stains, and creases to the paper before photocopying it. You get 1% of material, but you are convinced that you got the complete unaltered original.
I recently visited the FBI Museum/Tour at the FBI HQ in DC. There's a section where they talk about tech the FBI had in the 80s, including a camera they used for an undercover operation where they recorded some illegal shit going on in a hotel room.
You walk into a mock hotel room and, even knowing the general vicinity of where the camera should be, struggle to find it. I flat out couldn't until a friend who works at the FBI told me - it was in the period of the artist's signature on a piece of artwork in the room. A literal pinhole camera recording everything in the room more than three decades ago, with pretty great quality.
I can't even fathom the technology they have today - there's no chance that'd be on the tour.
We've gone backwards from the 60's in some areas. For example, there are no planes as good as the SR-71 any more due to their incredible cost; it's cheaper to use satellites over the long run.
Commitment to manned space exploration is another.
Messenger pigeons were superseded by better ways of delivering messages, covering all the use cases and then some. With the SR-71 and the space program, we've lost capabilities we had before. On the latter front, it's slowly getting fixed thanks to Musk & co., but we're still behind the 1970s, capability-wise.
The dividends of surviving WWII and will to achieve have waned into near atrophy in the US. Mere construction of an apartment building has become a herculean task.
Everyone has known for decades about the test planes that can partially leave the atmosphere. They aren't economically viable and have few military applications that can't be done more cheaply already. i.e. stagnation.
Your link is a level-headed explanation of why there isn't clear video showing the plane hitting the Pentagon. But I'm not sure what you're trying to say.
If you're saying that some people believe in crazy grand conspiracies, well of course some people do.
If you're saying that the government does not withhold information and doesn't manipulate or massage the information given out, then the link above doesn't support that point at all. People had to fight to get those grainy Pentagon videos from outdoor cameras mounted in a public place. Although some bits were leaked, the government didn't formally release them for 5 years!
And there is an enormous mass of data and documents that they still haven't released. It doesn't mean that there is any kind of conspiracy -- it could be the usual incompetence/bureaucracy/laziness, but some of it could be due to covering up capabilities or technologies like with those cameras on the Lunar Orbiters.
> I wonder about all the things happening today we're not being told about
Considering the usefulness of quantum computing for cracking ciphers, I wonder how far along development of that is and when we'll see it.
> The Lunar Orbiters never returned to Earth with the imagery. Instead, the Orbiter developed the 70mm film (yes film) and then raster scanned the negatives with a 5 micron spot (200 lines/mm resolution) and beamed the data back to Earth using lossless analog compression, which was yet to actually be patented by anyone.
Absolutely incredible! The level of complexity involved in space flight, along with the tight tolerances, continually impresses me when I consider the relative success of these missions.
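A quick back-of-envelope check of the quoted figures: a 5 micron spot does work out to 200 samples per millimetre. The frame dimensions below are hypothetical, chosen only to show the order of magnitude, not the real Lunar Orbiter frame geometry:

```python
spot_mm = 0.005               # the quoted 5 micron scanning spot
samples_per_mm = 1 / spot_mm  # matches the quoted 200 lines/mm

# Hypothetical frame dimensions, purely illustrative.
frame_w_mm, frame_h_mm = 60, 220
pixels = (frame_w_mm * samples_per_mm) * (frame_h_mm * samples_per_mm)
print(samples_per_mm, pixels / 1e6)  # 200.0 528.0  (megapixels)
```

Hundreds of megapixels per frame, from a film scanner flying around the moon in the 1960s.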
Are there any details of this? Some of the documents imply it's simply FM modulated, no different from passing a tightly-focused B&W TV camera over the film.
> It used a method called vestigial sideband where a precise (1 Hz) pilot tone was used as a means to allow the stripping away of one half of the waveform, and then using that tone on the other end to reconstitute the missing sideband. It is still used in communications today.
Quoted from Dennis Wingo, co-founder of the Lunar Orbiter Image Recovery Project, who has been commenting on the linked article.
The description sounds more like "reduced-carrier single-sideband." There's a long history behind the variants of sideband modulation.
It means you combine things (because it's handy or whatever) in a manner that loses essentially no information. Normally this would just be some kind of modulation. The salient difference being losing information due to technical limitations (SNR, distortion, bandwidth limits) versus designing the system to remove information (lossy).
Contrast: lossy analog compression, e.g. colour TV.
Specifically, PAL-style color subsampling which is more or less YUV 422. They could have fixed it other ways, but they decided to incorporate a quartz glass prism and piezo transducers to realize a 1-scanline physical delay, which they then used to do the analog variant of XORing each line with the one before.
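A crude sketch of the one-scanline-delay idea: average each line with the one before it. This operates on plain pixel values purely for illustration, whereas the real PAL delay line works on the modulated chroma signal:

```python
def delay_line_average(lines):
    """Average each scanline with the previous one, mimicking a
    one-scanline delay followed by summing. (A rough sketch; the
    real PAL circuit averages the modulated chroma signal, not
    decoded pixel values.)"""
    prev, out = lines[0], []
    for line in lines:
        out.append([(a + b) / 2 for a, b in zip(prev, line)])
        prev = line
    return out

# An error that alternates sign from line to line cancels
# out in the average.
lines = [[10, 10], [20, 20], [10, 10], [20, 20]]
averaged = delay_line_average(lines)
```

This is why PAL's alternating phase plus a delay line cancels hue errors that NTSC left visible.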
I used to work at Ames in a facility a stone's throw from McMoon's (not Singularity University, hah).
I heard that the guys who worked on this stuff in there were super crotchety. I went over there once to check out some old aerospace equipment they had sitting in the back (I think it was some decrepit rocket engine), and they gave me a classic 'you kids get off my lawn' speech - until they found out that the friend I went with had a PhD in aeronautical engineering. They lightened up after that and we had a great conversation. What a wacky group.
I wanted to look at the actual files in question, they're purported to be at [1]. However, that just displays a 'file being migrated' message[2], and according to the internet archive it's been that way since 2016[3]...were the images lost after all?
That is what a good coding project is for! If you've got Python experience, or just want to have fun with it, BeautifulSoup can do exactly what you want. There might be Firefox plugins as well.
What? No, it works fine with http out of the box. From the man page:
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
I think the GP's uncertainty relates to download URL wildcards with HTTP. Unlike FTP, I don't believe that HTTP servers have to support your request to list all files in a directory.
Great find. It is unfortunate that the moonrise image most used as the example from this project is missing from that archive. It is FRAME_1101_H2, and it is not there. This is ironic, as it is the very image the grandparent used as an example of the images being moved.
You probably don't want to use these for casual viewing, as they are too big. Use the PNG files from the other link, especially the lowest-resolution ones, just to get an idea of what is there:
To understand the .img files format refer to the corresponding .lbl file e.g.
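As a sketch of what reading such a label can look like: PDS3-style .lbl files are mostly `KEYWORD = VALUE` lines terminated by `END`. The parser below is a toy (real labels also have continuation lines, nested OBJECT groups, and units), and the sample keywords are made up:

```python
def parse_pds_label(text):
    """Toy reader for a PDS3-style .lbl: KEYWORD = VALUE lines up
    to an END marker. Real labels also carry continuation lines,
    nested OBJECT groups, and units, which this ignores."""
    meta = {}
    for raw in text.splitlines():
        line = raw.strip()
        if line == "END":
            break
        if "=" in line:
            key, _, value = line.partition("=")
            meta[key.strip()] = value.strip()
    return meta

# Made-up sample keywords, just to show the shape of a label.
sample = "RECORD_TYPE = FIXED_LENGTH\nLINES = 44000\nLINE_SAMPLES = 12000\nEND\n"
meta = parse_pds_label(sample)
print(meta["LINES"], meta["LINE_SAMPLES"])  # 44000 12000
```

The values in the real .lbl tell you the raster geometry and sample size needed to interpret the raw .img bytes.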
That's how astoundingly amazing the actual resolution is. You can see the grain of the film and the sharpness of the reference patterns. Note that this is the technology used in spy satellites more than 50 years ago, all without microprocessors, which didn't exist yet!
Compared with that, the actual shot of the moon surface is, at least in this sample, much less detailed than what the film was able to capture, the scanner to scan, the transmission circuits to transmit, and the tape to store:
Are there any instances of people claiming the government purposely edited the original images to make them lower quality back when they were first released?
I’m curious if anyone realized the government may be hiding their true capabilities.
Back then, most people would have been supportive of that. If someone noticed, they probably wouldn't say anything for fear of blowing the government's cover.
I'd like to know more about that too. I can imagine that since the space race and cold war were in full swing at the time, they neither wanted to give the Russians intel on the moon, nor let them in on their camera quality.
It surprises me they could transmit an analog signal back while keeping its quality secret from the Soviets. I wonder if they did succeed at that, or just at keeping the rest of us in the dark.
It wasn't like the Soviets could just turn their TVs to the right channel. I doubt the Soviets knew what frequency they were using, and analog video is still "encoded" in the radio waves. These signals still required that specialized $300,000 equipment (which I'm sure was top secret at the time) to make sense of.
The height of the antenna would give away the frequency. But the soviets had good spy penetration into the US industrial base so it's likely they already knew anyways.
I've worked next to Moffett for five years, in one of the office buildings overlooking the runway (we have to pause meetings sometimes to let the noise of jets or helicopters pass), and I've gone in to play golf a couple times, yet this is the first I've heard of or seen that building. So weird! Must be in some other corner...
It's a relic of the days when the base was Naval Air Station Moffett Field, and had a sizable on-base military complement. The Navy moved out in 1994, taking with them most potential McDonalds customers.
> After their use, the images were locked away from the public, as at the time they would have revealed the superior technology of the USA’s spy satellite cameras, which the orbiters cameras were designed from. Instead the images from that time were grainy and low resolution, made to be so by NASA.
Brilliant!
It states they used 70mm film, which according to Google is used for motion pictures. I assume a custom camera?
I would guess a development of Corona, the earliest spy satellites. Those earliest satellites were essentially disposable cameras. When full, they dropped the film to be caught in mid-air by the Air Force.
(A fascinating wiki hole - you may be gone some time) :)
I'm not certain about the film used for the Lunar Orbiter probes, but the cameras that were used by the Apollo astronauts used 70mm film, rather than the 60mm 120 format that was standard for the Hasselblad cameras they used.
The size of the frame is the same, but 70mm film is wider because it has sprocket holes that 120 format film lacks.
I would imagine that for the Lunar Orbiter probes they did use a custom camera, and didn't just bolt on an off the shelf camera.
What I find amazing is a film camera in a hard vacuum. I could be wrong, but two things come to mind: designing mechanical things that work in a hard vacuum is difficult, and I'm assuming bad things would happen to film in a vacuum, so I assume they were sealed under argon or nitrogen.
They weren't sealed, but they were modified for space.
You can't use liquid lubricants in space, so they had to use solid lubrication or none at all.
Also, as film gets wound through the camera, it generates static electricity. Usually, this gets wicked away by the atmosphere. However, there is no atmosphere in space, so they had to have a special metal plate to dissipate the built up electrical charge.
The film itself was normal photographic film, although a special emulsion made specifically for NASA.
For the Moon landings, the film was not normal photographic film; it was a special thin-base film mounted in special backs that allowed for 160/200 frames per back, instead of the 12 exposures you get with normal 120 film on a regular Hasselblad back. Not sure what they used for the scouting missions.
Weirdly enough, they used color slide film for the Moon missions. I never understood why. Color slide film has much less dynamic range compared to color negative film (5-6 stops vs. 12-14 stops, depending on the specific films) and it is much more difficult to expose correctly (in fact, most pictures taken on the Moon are severely under/overexposed). One explanation I heard was that there was no color reference on the moon, so they didn't know how to print from negative film, but that doesn't make any sense to me. The Moon is very gray, and they could have taken a color chart with them...
What I meant was that it wasn't some super exotic photographic film, printed on an unobtanium base.
It was standard acetate film with a gelatin emulsion.
Aside from the special Ektachrome, they also took black and white film magazines with them. The colour slide magazines held 160 exposures and the black and white magazines held 200 exposures.
I believe the reason they used slide film was that the quality of colour negative film wasn't so good back in the 60's. A lot of the shots they took on the moon were bracketed for exposure, so they took 3 images of the same subject, but with different exposures.
70mm is also a "normal Hasselblad back"; it served much the same function as bulk film backs for 35mm cameras. There were, of course, issues that kept them from being the default backs for most photographers: the (relatively) small image size, which made for grainier images; the large capacity was actually a negative for most casual and professional uses (if anything was even close to time-sensitive, you'd wind up wasting a lot of film), and the size and weight of the back itself compared to the more typical A12/A24 backs.
I had a Mamiya 645 which had 120 and also 220 backs. The 220 roll had double the number of exposures, this was enabled by a thinner film and less backing paper from what I recall. I recently sold it all and was told the 220 backs were worthless now, I presume the film is no longer / not widely available.
Waste and the associated expense thereof, for one. Especially for professional use - there is an average waste of half of the last roll of film on every shoot, and that means 8-12 wasted frames on 220 versus 4-6 on 120. For the consumer side, a very large percentage of cameras used a window on the backing paper for frame counts, something that would be disastrous for 220. And for anything that used a pressure plate rather than just film tension to maintain film flatness, you'd need to have an adjustable plate (many cameras didn't have that).
Mechanical things largely work the same. What you do have to pay attention to is that you can't have stuff like flywheels that rely on air to slow them down.
Otherwise, a film camera would work the same in vacuum, even if it had some electric components.
For the film itself, I don't think you need anything special, there is nothing on the material that would boil off under vacuum to my knowledge.
The difficult part is really heat, because unlike in air, any heat you gain is difficult to get rid of again since there is no air to transport it away. So if you do have some electronics in your camera, they can easily overheat if not designed for vacuum operation.
> Vacuum cementing or vacuum welding is the natural process of solidifying small objects in a hard vacuum. ...
> This effect was reported to be a problem with the first American and Soviet satellites, as small moving parts would seize together. ...
> In 2009 the European Space Agency published a peer-reviewed paper detailing why cold welding is a significant issue that spacecraft designers need to carefully consider.
Regarding specifically plastic, which might be used for the film, see https://en.wikipedia.org/wiki/Corrosion_in_space : "Many plastics are considerably sensitive to atomic oxygen and ionizing radiation."
That link also points out the "material fatigue caused by cyclical heating and cooling and associated thermal expansion mechanical stresses."
As an example of cold welding: the Galileo spacecraft wasn't able to deploy its high-gain antenna because the pins that held it in place got stuck. There was graphite lubricant to prevent that, but it was shaken out while the spacecraft was being transported to the launch site.
Also, in the early 90s I saw a price list for 3M's two-part silicone products. One product was $25,000 a gallon. The application was spacecraft windows. It turns out regular silicone will outgas and fog windows under hard vacuum.
All the above has led me to have a lot of respect for the engineers who made this stuff work. Ditto because every time I've had a materials problem it's been a trail of tears.
"Lubricants had to be chosen with utmost care because of the risk that conventional lubricants could boil off in vacuum and condense all over the optical surfaces of the lens."
I do remember the hand-held cameras weren't sealed. So film and components needed to be redesigned to work. For instance, special anti-static film, and mechanical parts made of glass to prevent cold welding.
But with Apollo they could take the film back and develop it normally. The orbiters developed film in space and then scanned it. I tried looking and found little detail, but the orbiter cameras developed film using a semi-dry process with damp ribbons that seem similar to an old typewriter ribbon.
According to this article, Walt Schirra's Hasselblad was the first one in space and after that they were used exclusively for some years. I guess before that they were custom.
The first camera brought into space for the Mercury program was a remarkably unremarkable Ansco Autoset rangefinder, manufactured by Minolta [1], that was picked up at a drug store; it went up with John Glenn.
Of course, Hasselblad didn't put that in their marketing material. NASA then moved to using Nikon cameras around the Spacelab era, and still uses Nikons to this day.
So far NASA has used Nikon because Nikon didn't use fluorite elements, unlike Canon. However, now Nikon has started to use fluorite elements as well. I wonder what will happen in the future.
Fluorite glass is more sensitive to shock and vibration than regular optical glasses. IIRC some people actually managed to disintegrate lens elements here on earth, without even hurling them into space.
I guess that makes sense but is NASA actually shipping off the shelf DSLR lenses to space anyway? I guess for astronauts’ personal devices, whatever. But for actual scientific instruments, I’d expect NASA is bidding out custom contracts that can easily specify how much shock tolerance the cameras must withstand.
Reading about the equipment that they had to cobble together and keep running reminds me that we keep losing the surplus places in the Bay Area where it would have conceivably been possible to find vintage parts. Weird Stuff closed suddenly a few weeks ago, and I fear Halted is on its way out...
Those places should receive government support (as national heritage museums) or, like the Internet Archive, big corporate sponsors (e.g. Google and Facebook and co. can easily afford to keep those places around as a charity).
Great article. Digital is not always better; at the time, the technology was not there to outperform analogue tech.
I even wonder if a digital camera system nowadays could outperform these 2 GB images. I mean, how do you transfer that amount of data over such a long range without loss?! Is that even possible?
Maybe this analogue picture compression is something which is still usable and valuable in long distance space transmission?
“Lossless” analogue transmission isn’t lossless. It’s just less lossy than the lossy forms of analogue transmission. As a very simplified example, it’s really easy to modulate an analogue value in the frequency domain and maintain accuracy and dynamic range, which is why FM stereo usually still sounds pretty amazing. It’s not terribly sensitive to environmental factors. AM, conversely, sounds like crap.
Now we have digital protocols which are still sent on top of analogue signals (everything is analogue down at the bottom, even your CPU). We lose a tiny bit of dynamic range through compression in some circumstances but gain error correction, speed and the ability to recover signals from below the noise floor which means less power and more distance for the same power.
So no, digital is definitely the way.
As an amateur radio operator: some of us at least tend to play with very low power. You can have a two-way conversation over 3000 km+ with no more than a watt, but only if you use digital modes. That's one reason Morse/CW is still popular; it's a digital encoding.
> Recent advances in digital signal processing have allowed EME contacts, admittedly with low data rate, to take place with powers in the order of 100 Watts and a single Yagi antenna.
> I mean how do you transfer that amount of data at this long range without loss?
Millions of people are watching the football via satellite over an 80 Mbit/s DVB-S2 link from geostationary orbit with consumer hardware. The system uses forward error correction to cover loss.
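The principle can be sketched with a toy code. DVB-S2 actually uses LDPC and BCH codes, far stronger than this, but a Hamming(7,4) code (pure-Python bit lists, correcting one flipped bit per 7-bit codeword) shows how added redundancy lets the receiver repair errors instead of requesting retransmission:

```python
# Toy forward error correction: Hamming(7,4) corrects any single-bit
# error per 7-bit codeword. Codeword layout: p1 p2 d1 p3 d2 d3 d4.

def hamming74_encode(d):
    """4 data bits -> 7-bit codeword with three parity bits."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """7-bit codeword (possibly with one flipped bit) -> 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the bad bit
    if syndrome:
        c[syndrome - 1] ^= 1          # flip it back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
sent = hamming74_encode(data)
sent[3] ^= 1                          # simulate a transmission error
assert hamming74_decode(sent) == data
```

Real broadcast codes work on blocks of tens of thousands of bits and correct many errors at once, but the receiver-side repair idea is the same.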
The main limiting factor of digital cameras is producing really big sensors, but if you want to photograph a stationary object like the moon it can be done readily by stitching lots of small images together.
People are currently getting excited about the picture quality of 4K and beyond. Old cinema film is, as I understand it, roughly equivalent to about 12K digital cinema. The way the two work is quite different, so it's not directly comparable.
Similarly with photographic film: Zeiss continue to make a medium-format mono film called Gigabitfilm which requires special developing but, when scanned, can in theory give gigabit images. Unfortunately it's not only mono but also only ISO 40, so exposures are difficult.
If you were shooting a still subject, you could put a filter in front that excluded the other wavelengths, capture the R, G and B channels separately on mono film, and then merge them in post. I've been tempted to convert my digital sensor to mono-only by removing the color filter array to see if it made any difference.
Also, I don't quite understand why ISO 40 would make anything difficult. I regularly shoot at ISO 50 on digital.
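The three-shot merge described above is an old, real technique (Prokudin-Gorskii shot color this way in the 1900s). A minimal sketch of just the merge step, with tiny synthetic numpy arrays standing in for the three aligned scans:

```python
# Three mono exposures made through red, green and blue filters,
# merged into one RGB image. The 4x4 constant arrays are synthetic
# stand-ins for aligned film scans.
import numpy as np

h, w = 4, 4
r = np.full((h, w), 200, dtype=np.uint8)   # shot through a red filter
g = np.full((h, w), 120, dtype=np.uint8)   # green filter
b = np.full((h, w), 40, dtype=np.uint8)    # blue filter

color = np.dstack([r, g, b])               # shape (h, w, 3): an RGB image
```

In practice the hard part is registration: the three frames must be aligned to sub-pixel accuracy before stacking, or you get color fringes on every edge.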
> I've been tempted to convert my digital sensor to mono-only
Have you come across the Leica M-Monochrom? When I first heard of it I thought it was odd, but understanding how the sensor works, I can see what an advantage it could be for B+W photography.
Yes; one of the 3-letter agencies (I forget its name) has a satellite that can take gigapixel photos 20km (IIRC!) wide. Satellite sends 1EB (yes) per 24h of timelapse. Presumably said agency stores a few days/weeks of footage.
Put the input and the output of a Schmitt trigger on an oscilloscope some time and tune across an HF band for weak digital (RTTY) signals. Then try adding crap to the input.
They'll pull perfect (digital, lossless) copy so far down in the (analog) noise that it can barely be heard. I'm sure there's better stuff these days (been a while).
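A rough sketch of how digital copy comes up out of analog noise: correlate the received samples against a known ±1 chip sequence, so the signal adds coherently while the noise averages toward zero. The sequence length and noise level here are made up for illustration, not any real mode's parameters:

```python
# Recovering one transmitted bit buried well below the noise floor.
# The receiver knows the +/-1 "chip" sequence in advance.
import numpy as np

rng = np.random.default_rng(0)
chips = rng.choice([-1.0, 1.0], size=4096)   # known spreading sequence

bit = -1                                      # the bit being sent
# Noise std dev is 10x the signal amplitude: inaudible by ear.
received = bit * chips + rng.normal(0.0, 10.0, size=chips.size)

# Correlation: expected value is `bit`; the noise term shrinks
# as 1/sqrt(N), here to a std dev of about 0.16.
decision = float(np.dot(received, chips)) / chips.size
recovered = 1 if decision > 0 else -1
```

Doubling the sequence length buys another 3 dB of processing gain at the cost of halving the data rate, which is exactly the trade the weak-signal digital modes make.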
Some casual reading of modern mobile phone signalling can make one's head spin. Best I could tell, every phone talks on top of the others, and still the base station is able to pick out the individual data streams from the resulting soup.
The reason the base station and the phone need to know their distance is so the phone can transmit at the right time: its signal then arrives at the base during that phone's time slot, and vice versa. That way the base can listen to other phones outside that slot, and the phone can stop listening outside it, saving energy.
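A back-of-envelope version of that timing calculation, for a hypothetical phone 15 km from the base station (GSM, for instance, quantizes the correction into timing-advance steps of roughly one bit period, about 3.69 µs):

```python
# Round-trip delay for a phone 15 km from the base station; the phone
# must advance its transmission by this much so its burst lands in the
# right time slot at the base.
C = 299_792_458.0                  # speed of light, m/s

distance_m = 15_000.0              # assumed phone-to-base distance
round_trip_s = 2 * distance_m / C  # out and back
print(round_trip_s * 1e6)          # ~100 microseconds
```

A hundred microseconds is a large fraction of a typical time slot, which is why an uncorrected distant phone would smear into its neighbor's slot.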
I don’t get how that’s your takeaway from the article. The entire point was about the pains they went to in order to convert it from analog to digital because digital is absolutely the way to go here.
So in theory if someone built a robot and spacecraft, the original 70mm negatives could be retrieved from the moon and provide even higher resolution images?
One can only wonder how much progress is being kept secret from the world due to nationalistic bullshit. To be honest, I'd not be surprised if, in 2060, someone says "we had all the stuff in the Avengers movies as actual, real technology back then".
And one can only wonder if the hundreds of billions sunk into programs like the F35 or the European A400M program haven't simply been used to fund skunk works projects instead.
What most people don't realise is that we can make film that records at whatever resolution we need, and we have known how to do this for over a century. It's simply a tradeoff between grain size and sensitivity. For most consumer film, sensitivity is usually the more favourable trait, but for special applications we might value smaller grain size. Holographic film can be made to resolve better than 10,000 lines/mm, which is essential because we are recording diffraction patterns.
Makes me wonder how good the spy satellite pictures of the era were.
Wasn’t the technology behind the Corona satellites based on this tech? I wonder if the declassified pictures they release are the lower quality versions?
The "NASA technician" in the second to last image appears to be airbrushing a large scale version, does anyone know if there are more pictures of this somewhere?
Wow. Nobody had bothered to take images of the moon until 1966? All those cameras and telescopes. You would think someone somewhere would have taken a few pictures prior to sending the rocket. Perhaps the article meant that these were the first images from the moon.
By the title, I was expecting an article about the first photos of the moon.
A digital signal is inherently lossy, as it takes only a finite number of values because digital is discrete (0 and 1), whereas analog is continuous and in theory can encode an infinite number of values, subject to your codec apparatus's limits.
I was referring to the fact that an "exact value" doesn't even exist. There is noise on every analog signal. If there is no exact value then lossless becomes a meaningless term.
Strictly, that doesn't matter. Depending on the signal being sent, digital can be lossless as well. Specifically, for a bandlimited signal sampled above the Nyquist rate, the samples determine the original signal exactly.
Basically, the sampling theorem is a hella fun topic. There was a very good video from a few years ago that served as a fun primer on this; I can look it up if you want.
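A quick numerical sketch of the theorem: sample a tone well below the Nyquist frequency, then reconstruct its value at an instant between sample points using (truncated) sinc interpolation. The tiny residual error comes from truncating the sum to a finite window, not from sampling itself:

```python
# A 3 Hz tone sampled at 100 Hz (Nyquist limit: 50 Hz), reconstructed
# at an off-grid instant by sinc interpolation of the samples.
import numpy as np

fs = 100.0                           # sample rate, Hz
f = 3.0                              # tone frequency, well below Nyquist
n = np.arange(200)                   # 2 seconds of samples
samples = np.sin(2 * np.pi * f * n / fs)

t = 0.505                            # an instant between sample points
# Whittaker-Shannon interpolation: x(t) = sum_n x[n] * sinc(fs*t - n)
x_hat = float(np.sum(samples * np.sinc(fs * t - n)))
x_true = np.sin(2 * np.pi * f * t)   # what the tone actually is at t
```

`x_hat` agrees with `x_true` to well within a percent here; with an infinite window the match would be exact, which is the sense in which sampling a bandlimited signal loses nothing.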
Analog compression was a very important topic before the days of digitization; it was the subject of my dad's first job out of university in 1961 (television bandwidth compression in that case).
And in fact analog modulation can precisely encode the source signal; consider how music or voice was sent uncompressed over radio/through the phone.
Edit here's a little more information: http://www.moonviews.com/2012/06/lunar-orbiters-classified-h..., and the link from there to http://www.nro.gov/history/csnr/programs/docs/prog-hist-01.p... contains the technical details on the camera I was after. Amazing stuff.