When he says the Scanimate had a scene in the first Star Wars, he most certainly means the Death Star plans animation shown during the briefing before the Trench Run. Nothing else in Star Wars looks like what this crazy machine outputs.
I knew the plans showed an equatorial "cannon" because the scene used an earlier design of the Death Star, but I didn't know an analog machine made them. I wrongly assumed it was made by a digital computer.
I don't think the Scanimate was used for this. This video shows a digital computer being used to create the animated trench sequence: https://www.youtube.com/watch?v=yMeSw00n3Ac
I'm curious what that dial system was; it looked like an oscillator-based vector display, given how it was controlled with dials and how the programmer used a separate computer to run the software and dot pen.
It wasn't used for the wire frames of the Death Star/trench run briefing. Those were digitally generated on a PDP-11.
I'm like 75ish percent sure the Scanimate was used to generate the orbits shown on the Death Star's targeting computer as the moon came out from behind the planet.
Yeah, you're right and I was wrong. Perhaps only the gas planet was animated using the Scanimate though. The firing solution circles might have been done on a computer.
If you're wondering why I shared it again: I got an email(!) from HN inviting me to repost it, which is a feature I didn't know existed.
For real. It'd be interesting to see a Reason-like software interface with plug-in cabling for motion graphics. Node-based is cool, but it's also neat to see how the physical hardware worked.
It's not the same thing, but I feel like you may get a kick out of the Ming Mecca [0]. It's a hardware-patchable video game creation system. It's a really inspiring concept to me and I hope to play with one someday.
Someone should show this guy Blender's (or similar software's) node-based shader and, now, geometry pipelines. I bet he'd love how you can similarly play with "dials" and sliders to view realtime changes in visuals and shapes.
Anyone else notice the Bob Dobbs / Church of the SubGenius logo on the monitor behind Dave Sieg's head in several of the shots? So rare; the logo's placement in this documentary piece is them being cheeky.
Yes, probably easily in these GPU days. There are many excellent hardware and software implementations of analog modular synthesizers; it's often easier to plan a patch virtually before doing it in hardware (where it sounds better but behaves less predictably).
Synthesizing color video might be a bit more work, but not that much more; I have analog synth patches that display animated logos and even allow you to play pong when hooked up to an oscilloscope (analog or digital). It's absolutely doable.
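The XY-oscilloscope trick described above can be sketched digitally: generate a two-channel audio signal where the left channel drives the scope's X deflection and the right channel drives Y. A minimal Python sketch (stdlib only; the filename and circle shape are just illustrative, not anything from the thread) that writes a stereo WAV tracing a circle:

```python
import math
import struct
import wave

SAMPLE_RATE = 44100
TRACE_FREQ = 100.0  # retrace the shape 100 times per second


def circle_samples(n_samples):
    """Left channel = cos (X axis), right channel = sin (Y axis).

    Fed to an oscilloscope in XY mode, this traces a circle.
    Returns a list of (x, y) float pairs in [-1.0, 1.0].
    """
    frames = []
    for i in range(n_samples):
        phase = 2 * math.pi * TRACE_FREQ * i / SAMPLE_RATE
        frames.append((math.cos(phase), math.sin(phase)))
    return frames


def write_wav(path, frames):
    """Write interleaved 16-bit stereo PCM to a WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(2)       # stereo: X on left, Y on right
        w.setsampwidth(2)       # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        data = bytearray()
        for x, y in frames:
            data += struct.pack("<hh", int(x * 32767), int(y * 32767))
        w.writeframes(bytes(data))


if __name__ == "__main__":
    write_wav("circle_xy.wav", circle_samples(int(SAMPLE_RATE * 2)))
```

Playing the resulting file through a sound card's left/right outputs into a scope's X and Y inputs should draw the figure; more complex shapes (logos, pong paddles) are just more elaborate parametric paths.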
I had hoped that it was about a mechanical optical printer that could composite and project multiple film images.
The closest thing I can think of was a multihead, aerial-image optical printer built by Ub Iwerks for Disney.
If anyone has knowledge of such devices, I'd love to hear about it. Apparently it could composite text, animation, and live-action images.
http://nzpetesmatteshot.blogspot.com/2015/10/optical-effects...