
I wish the article got into more of the technical details behind how this actually worked. I find it absolutely jaw-droppingly incredible that they were able to shoot gigabytes of essentially IMAX-quality negatives, develop them in flight (which requires temperatures of around 20°C for all of the materials involved—Edit: they didn't do this because it would've been too difficult, and ended up using a mostly dry development process. See Edit link), scan them(!!!), and then radio them back to Earth. I would find this amazing today, and all of this happened in the 1960s!

Edit: here's a little more information: http://www.moonviews.com/2012/06/lunar-orbiters-classified-h..., and the link from there to http://www.nro.gov/history/csnr/programs/docs/prog-hist-01.p... contains the technical details on the camera I was after. Amazing stuff.




> "I would find this amazing today, and all of this happened in the 1960s!"

If bowling alleys didn't already exist... and you pitched me the idea of fully automated systems to collect and re-rack pins and return 15-lb balls to the player after each and every throw... I would probably say that's either impossible or at least impractical at scale.

19th and 20th century engineering is amazing. Today, if I can't conceive of writing it as a mobile app then I just shrug.


> 19th and 20th century engineering is amazing. Today, if I can't conceive of writing it as a mobile app then I just shrug.

Skills thrive only if there is an incentive, monetary or otherwise, to practice them. Otherwise they atrophy.

In my past 6 years in Silicon Valley, I worked mostly alongside PhDs in physics, math, comp sci, EE, quantum physics even. This is a metric TON of talent. Do you know what you can do with that kind of talent? You can literally get a man to the moon. Every one of these guys (sadly including myself) was working on some throwaway nonsense: ETL jobs, frontend JS, distributed backend code that got rm -rfed and rewritten periodically, messaging systems replaced every year with messaging systems in the latest language fad, all kinds of ultimate software crap manned by the Jira monster with never-ending tickets & feature requests.

So I just saved up the breadcrumbs & headed back to academia when I couldn’t take it anymore. Better to work on serious stuff, even if it's poorly monetized. You only have one life.


Truly, the greatest minds of our generation are hard at work coming up with new ways to trick people into clicking ads.


Or developing trading algorithms. AKA Financial Engineering.


This is sad, but very true. AI is driven by this now.


Getting man to the moon was more of a vanity project than at least half of all start-ups.


Never mind the enormous amounts of throwaway work that had no direct practical applications outside that single vanity project.


This is one of the best comments I’ve read on HN... I agree fully. I’ve always wondered what could happen if we threw all these talented people onto really important stuff. btw: even though I’m not his biggest fan, I think Elon Musk is doing exactly this with Tesla and SpaceX...


Same exact experience working in the valley. I also lasted 6 years.


I have a feeling that a lot of people today, confronted with the problem of designing a bowling alley, would go straight for a video camera, a robotic arm to pick up the pins and put them back in their spots, and machine learning to glue it together.

I wonder how the cost, reliability, and speed of that would compare to the mechanical solutions. There are a couple bowling alley rerack machine horror stories somewhere on r/talesfromtechsupport.


Today, the focus seems to be on making sure that a product/system is easy to modify, integrate, or tear down completely instead of saying "we need to build a system that can do x, and only ever x, really fast and cheap for a long time". We don't build three-decade systems in the tech industry, we build six-month systems.


What's crazy about that is that some of those 6-month systems, designed at the time as throwaways just to get by, are still in production 26 years later and causing all kinds of problems.


That's because we predominantly build software, which has the distinct benefit that it can be changed. If ten-pin bowling goes out of fashion in favour of eleven-pin bowling, your three-decade system goes in the landfill. Your six-month system gets an OTA update.


>I have a feeling that a lot of people today, confronted with the problem of designing a bowling alley, would go straight for a video camera, a robotic arm to pick up the pins and put them back in their spots, and machine learning to glue it together.

plus stuffing the pins with IoT sensors and WiFi chips, i.e. making "smart pins", "smart ball", ...

I wonder what stuff being done today (both kinds - known to the public as well as classified) will look amazing in 2060.


I never thought I'd have to worry about bowling alley equipment DDoSing my servers, or having them mine monero


The IoTIMCC: Internet of Things Illegitimately Mining Crypto Currency


It's easy to be nostalgic for past grandeur, but remember that the vast majority of 19th and 20th century engineers spent all their time working on mediocre solutions to uninteresting problems, being complacent, and then getting thoroughly bamboozled by Japan in the 80s.

I can't remember where it's from, but there's a story about a firm procuring a shipment of widgets from a factory in Japan and specifying an acceptable defect rate of 5%. They received the shipment along with a separate box containing the 5% defective widgets and a confused letter of apology. That was also 20th century engineering.

There is amazing stuff going on, and there is a lot of bland, forgettable stuff going on, and that's the way it's always been.


Why stop at 19th and 20th centuries? I had the opportunity to go to Rome and seeing what the Romans created ~2000 years ago was astounding.


Being in the UK, I can hop down the road and see what the Romans were up to; as you say, astounding, and over such a wide geography. Then again, 4,500 years ago in Egypt they built some majestic structures.


Agreed. The things that fascinate me the most are how similar we were to the Romans, or Greeks for that matter. Roads, society, government, and water utilities. Even just sitting in a bar/restaurant that was a similar place back in Roman times.


Not only did fast food exist in ancient times, so did takeout windows for faster food distribution. https://www.history.com/news/ancient-fast-food-window-discov...


Not to mention the networks of towers transmitting the latest gossip by electromagnetic radiation (i.e. light) or sound. Or the tourist agencies offering voyages to festivals. Or the high speed train network ... wait a second, that doesn't sound right.


In fairness: survivorship bias.


The auto points calculation is another thing that seems kinda incredible no matter what implementation they used.
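
A rough sketch of the standard ten-pin scoring rules, just to illustrate why even the points calculation is non-trivial: strikes and spares need lookahead into later rolls. This is only an illustrative Python version, obviously nothing like what the machines actually ran:

    def score_game(rolls):
        # rolls: flat list of pins knocked down, e.g. [10, 7, 3, 9, 0, ...]
        total, i = 0, 0
        for frame in range(10):
            if rolls[i] == 10:                    # strike: 10 + next two rolls
                total += 10 + rolls[i + 1] + rolls[i + 2]
                i += 1
            elif rolls[i] + rolls[i + 1] == 10:   # spare: 10 + next roll
                total += 10 + rolls[i + 2]
                i += 2
            else:                                 # open frame
                total += rolls[i] + rolls[i + 1]
                i += 2
        return total

    print(score_game([10] * 12))  # perfect game -> 300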


Wasn't the facsimile machine invented in the late 1800s? Lots of weird things have existed way before it seems to have been possible.

This: https://www.damninteresting.com/nugget/the-fax-machines-of-t...


I have one word for you - Polaroid.

Polaroid had really high resolution, a mostly dry development process, and was simple enough to be done by anyone.

20th Century Engineering is amazing.


I was just thinking that the positive-negative developing system described in the linked document sounded not entirely unlike the one Polaroid developed for their cameras, in particular the Type 55 sheets.


Polaroid is also interesting, because almost all of their color processes are extremely fade resistant and color stable, which is more than can be said for any other wet film process, with the exception of Kodachrome and Cibachrome/Ilfochrome dye destruction prints.


Now, I'm interested in how they did the degraded images at the time. Digitally? Or just shot the original photos with a bad camera and lighting?


It was all analog at the time, and much faster -- processing that amount of data digitally would have taken decades back then -- and for those who were involved it was child's play to degrade the signals they'd spent so much energy configuring with proper timings and responses. Imagine the guys making circuits, turning dials while looking at their oscilloscopes.

If you're old enough to remember VCRs, consider: VCRs were also fully analog. https://en.wikipedia.org/wiki/Videocassette_recorder


Yeah, I think you're right, just add some noise and non-linear effects and you have your degraded image.

The comparison with VCR makes sense.


In analog, you typically don't degrade with "noise" but with reduced "bandwidth" (e.g. filters) -- the details can be wiped out in "real time." What you get in digital by "increasing JPG compression" and waiting for the CPU to process everything can be achieved smoothly in analog with only a condenser and a resistor.
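
A minimal digital sketch of what that condenser-and-resistor does, assuming a first-order RC low-pass approximated as y[n] = y[n-1] + alpha*(x[n] - y[n-1]): the smaller alpha, the lower the cutoff and the more detail gets wiped out of the scanline:

    def rc_lowpass(scanline, alpha=0.3):
        # discrete first-order low-pass; alpha ~ dt / (RC + dt)
        out, y = [], scanline[0]
        for x in scanline:
            y = y + alpha * (x - y)
            out.append(y)
        return out

    # a sharp edge in the scanline gets smeared out:
    print(rc_lowpass([0, 0, 0, 1, 1, 1]))  # ~[0, 0, 0, 0.3, 0.51, 0.657]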


Agreed, but if you look at the picture comparison between the earthrise photos (good/degraded) you see two things: there's a "scan pattern" where different vertical lines have different brightness/gains. Also highlights and dark areas are more contrasted than the good version.

They might have added some filtering to the signal, but from the photo, that's not the most important artifact that was added.


> there's a "scan pattern" where different vertical lines have different brightness/gains. Also highlights and dark areas are more contrasted than the good version.

You get all this “automatically” by using much simpler circuitry doing the equivalent job of your “fine tuned” versions.


Yes, but only in one direction. If you low-pass filter a raster scan signal, then the image is blurred horizontally, but retains its vertical resolution.

Vertical filtering can be implemented in analogue circuitry using line delays (this was done for standards conversion as an alternative to pointing a camera at a screen), but it's a lot harder than taking a photo of a folded-up print-out.
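
A quick numpy sketch of that asymmetry, assuming a toy image with one vertical and one horizontal line: filtering along the scan direction only blurs the vertical line, while the "line delay" trick (averaging each line with the previous one) is what smooths vertically:

    import numpy as np

    img = np.zeros((6, 6))
    img[:, 3] = 1.0   # a vertical line
    img[3, :] = 1.0   # a horizontal line

    # low-pass along the scan (horizontal) direction only
    kernel = np.ones(3) / 3
    h_blur = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), 1, img)

    # crude one-line delay: average each line with the one above
    # (ignoring the wrap-around at the top edge)
    v_blur = (img + np.roll(img, 1, axis=0)) / 2

    print(h_blur)  # vertical line smeared, horizontal line mostly untouched
    print(v_blur)  # the other way around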


What I meant is that to get the “stripes” to be less aligned and more jagged (or with different quality properties), it's enough to use worse circuitry or worse tuning than what they probably used to align them correctly. As an example, that's what old vacuum tube TV sets easily produced when their circuits weren't correctly tuned. The last phase, including producing the “worse suited” contrasts, was most probably done in the photo processing phase, I agree with that. I also agree that making the image more “blurry” is most easily done by making a bad photo. But it could also be that the "better" images were simply run in one pass through the "facsimile" machines which the publishers and the news agencies were using anyway at that time.


This was back when retouching involved a brush and an artist. They might have done it electronically, but...

For artificial scan effects like the Earthrise photo: make an acetate overlay, then do some good old-fashioned darkroom dodge-and-burn while making the print, with a slightly out-of-focus enlarger.


This is pretty funny in the context of the moon landing deniers. I can picture somebody telling them - you're right, the pictures are kind of faked. Not because we didn't go to the moon, but because we didn't want to let the Soviets know how good our cameras are.


If you want a really fascinating story, check out faxes from the far side. [1] tl;dr The US sent spy balloons over the USSR, many of which were captured. For their probe to the "dark side" of the moon, the Soviets used this film which was processed in the satellite and then essentially faxed back to earth.

But, hey, Silicon Valley is really good at ad tech. :-)

[1] https://www.damninteresting.com/faxes-from-the-far-side/


The most amazing point about landing on the moon in 1969 is that they had to program everything with... negative timestamps. Of course it’s a joke but it highlights how early it was in the history of technology.


It's strange to me that you talk about 1969 as early in the "history of technology". We had the wheel, bridges, steam engine, cars, refrigerators, air-conditioning, nuclear power, early robotic manufacturing, computers, and a nascent internet.


Parent obviously meant "history of computer technology".


... how. I don't get it.


Unix time starts counting upwards from zero on January 1st 1970. It's a joke. If they had Unix computers in the lander they would have had to use negative time since the first moon landing happened in 1969.
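
For anyone curious about the actual number, a quick check in Python, taking the commonly quoted touchdown time of 20:17:40 UTC on 1969-07-20:

    from datetime import datetime, timezone

    landing = datetime(1969, 7, 20, 20, 17, 40, tzinfo=timezone.utc)
    print(landing.timestamp())  # -14182940.0, about 14.2 million seconds
                                # *before* the Unix epoch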


In Vernor Vinge's far future novel _A Deepness in the Sky_, there is a mention of a "programmer-archaeologist" who digs deep into layers and layers of legacy software of spaceships; it's tens of thousands of years of stuff built on top of each other.

It's mentioned that the deepest layers count time in seconds from the moment that humanity first landed on the moon of its home planet.


In fact, there’s a comment to the effect that the actual start time is about fifteen million seconds after the moon landing — which means the original time system is Unix-based...


I just love the title "Programmer-at-Arms." Some of the seat-of-the-pants work we do in industrial automation, making changes to a machine's operating logic while it's actually running in production, feels just like that.


I very highly recommend that book, by the way. My favorite science fiction book I've ever read. Anyone else love it?


I would also recommend Vernor Vinge's earlier work, A Fire Upon the Deep.

It features a galaxy wide group of alien civilizations that communicate by means of a system that greatly resembles Usenet newsgroups.

It either won or was nominated for most of the major science fiction awards.


Oh, I actually confused the two! A Fire Upon the Deep is my all time favorite, A Deepness in the Sky is close behind though.

I've been looking for books with similar ideas for a long time... would you (or anyone else) know any? The ideas in the two books are really amazing.


Here are some works I'm fond of from authors who deal in technology and whose works have also taken the top science fiction awards.

Neal Stephenson's Snow Crash and The Diamond Age.

Greg Bear's Moving Mars, Anvil of Stars, and Eon.

John Barnes' A Million Open Doors, and Mother of Storms.


I’m currently reading the book precisely based on a recommendation from HN. It’s a great book indeed.


If NASA are anything like us, their epoch is January 1, 1958:

https://public.ccsds.org/Pubs/301x0b4e1.pdf


Thanks for that info. "Us" there means:

"Founded in 1982 by the major space agencies of the world, the CCSDS is a multi-national forum for the development of communications and data systems standards for spaceflight."

And indeed, the epoch:

"The CCSDS-Recommended epoch is that of 1958 January 1 (TAI) and the recommended time unit is the second, using TAI as reference time scale, for use as a level 1 time code. This time code is not UTC-based and leap-second corrections do not apply."


I'm guessing it's a bad joke about Unix time (starts in 1970)


Except it's a really good joke.


The UNIX epoch is midnight, January 1st, 1970 while the landing on the moon was 1969-07-20.


NASA / ESA epoch is 1 January 1958:

https://public.ccsds.org/Pubs/301x0b4e1.pdf


> The UNIX epoch is midnight, January 1st, 1970

in UTC.


But after 10,000 years (or whatever) that distinction might be lost.


It’s about the unix epoch starting in 1970.

-_-


Negative timestamps are used in astronomy all the time.


yes, not credible.



