
The thing which seems to be overlooked -- both in the linked article and in the HN comments -- is that at one time ham radio was a hobby for folks wanting to play around with state-of-the-art technology. It was more than just a bunch of guys chatting with each other over a radio. At one time most people had to build their own ham rig from piece parts. One had to have a serious amount of engineering ability to do so ... or at least be able to read a schematic published in a magazine and solder together a radio, not a small feat. As a hobby it was all about becoming proficient in electrical engineering, then enjoying your proficiency by chatting with -- and showing off to -- like-minded guys around the world.

Ham magazines like QST and 73 published schematics and other how-to articles which amounted to an excellent education in practical electrical engineering. 73 magazine in particular was very technical -- it assumed you had good working knowledge of many common RF and audio circuits. Also, to get a ham license you needed to take a test demonstrating a good level of proficiency. For the lowest rung "novice" license you just had to demonstrate knowledge of Morse code at a slow but reasonable pace. For higher rungs of the achievement ladder like "advanced" or "extra" you needed to pass an exam about electronics as well as receive and send Morse code at a fast clip. The license presented a barrier to entry which didn't exist in CB radio -- which some hams looked down upon as a cesspool of unwashed, technically illiterate bozos.

My exposure to ham radio really started in the 1970s since my dad had been a ham playing with homebuilt radios since he was a kid in the 1930s. In the '70s the hobby was trifurcating. On one hand were the hard-core guys who built their own rigs, sometimes for the challenging high-frequency bands. Some also participated in designing and launching a ham-radio satellite via the ARRL. Early forms of digital encoding over radio were also big. Those guys were the real engineering types, and I admired them. On the other hand, commercial outfits were selling ready-to-use radios ... no engineering skills needed. In the middle were guys assembling and using stuff from Heathkit -- a great way to learn about electronics.

The ham hobby has been fading away for many years. I think the thrill of building a radio and then talking to somebody on a different continent with it has diminished in the face of cheap international calling, cell phones, Skype, etc. The only thing left for hams to do these days involves disaster relief -- and the remaining ham mag QST plays up that theme (or at least did the last time I looked). Meanwhile, the young nerds who used to take up ham radio now occupy themselves with various open-source software projects. I suppose that's natural.


Further details:

> For higher rungs of the achievement ladder like "advanced" or "extra" you needed to pass an exam about electronics as well as receive and send Morse code at a fast clip.

The General license (which I got) required 13 wpm. I think I read somewhere that they still broadcast Morse code practice every night.

The Extra was 20 wpm. I don't remember "Advanced."


By the time I got a ham license -- which I only ever used for digital transmissions -- several people taking the test with me were "old-timers" lamenting that you no longer needed to know CW to get a license.

There is apparently still a thriving CW community, and some of them use auto-keying now, so you don't need the manual dexterity any more.


I'll add one more piece of ham lore to this thread. The gigantic electronic component distributor DigiKey got its start when the founder designed and marketed a Morse code key to the ham radio audience. DigiKey was originally a mail-order business selling to hams.

The point is that the ham radio hobby is very intertwined with the electrical engineering profession. It's very technical ... not just a bunch of guys talking to each other over the radio (although it's that too).


The Advanced class had the same 13 wpm code requirement as the General but added a difficult technical element to the exam suite. When I took it, back at an FCC office, I found it to be the hardest test of all.


Do you still have an Advanced class license? I think that Advanced class is the only license that is currently around (and able to be renewed) that required a code test.

The Novice test was something else. 20 questions. 5 WPM code. I was thrilled when it came in the mail. 40 Meter CW, look out.


No, I have an Extra. I thought they did away with the Advanced when the privileges were realigned, a couple or ten years back. I haven't kept up with the class privileges.


They did away with advanced more than 10 years ago.


Time flies. It was actually April 2000, 24 years ago.


This evolution seems to be the same in most technical hobbies. And as the evolution occurs and the technical requirements for adoption drop, you need more and more regulation to control the bad actors. See UAS regulations after quadcopters became off-the-shelf products.


You mentioned the cost of tools a couple of times. What's wrong with using MOSIS, which is free? Just asking, since I don't know anything about this area except that MOSIS was mentioned multiple times when I was in grad school back in the 1980s.


MOSIS is a fabrication service. The tools are totally separate, and you'd still have to pay to use them. Universities generally get free licenses though.



I will probably get downvoted for pointing this out, but the reality is that the geometric algebra approach to E&M, while interesting for its own reasons, will not replace the formalism based on Gibbs's vector calculus. One reason is simply that vector calculus is pretty intuitive and easy to learn. The major reason, however, is that the vector calculus approach is totally entrenched in the worlds of engineering and physics. After 100 years, nobody actually practicing those disciplines will make the notation change just so they can replace the 4 Maxwell's equations with one geometric algebra equation.

Also, Gibbs's vector calculus is used in fluid dynamics and other engineering disciplines, and as far as I know, nobody is touting the advantages of geometric algebra to folks working in fluid dynamics. I can be pretty sure some HN reader will show me I am wrong about this by pointing out one lonely researcher who has found a way to express the Navier-Stokes equations using the geometric product ... but so what? My main point is that traditional vector calculus is a language everybody knows how to speak, and geometric algebra is just another way to say the same things, so why would anybody change?
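
For concreteness, here is roughly the comparison I mean, sketched from memory (signs, constants, and units vary by author and convention). The familiar vector-calculus form is

    \nabla \cdot \mathbf{E} = \rho/\varepsilon_0, \qquad
    \nabla \cdot \mathbf{B} = 0, \qquad
    \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
    \nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}

while the spacetime (geometric) algebra form collapses them into something like

    \nabla F = J

where F is the electromagnetic field bivector and J is the spacetime current. Same physics, one equation instead of four -- but everybody already reads the four.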


The metric system seems like a good analogy for geometric algebra vs. vector calculus. You are saying the same thing, but the language you are using is much more internally consistent.

Adoption has been bumpy given the US resistance but I think in the long run it (or something even more consistent) will win out. Similarly I think geometric algebra will be adopted. Maybe not in our lifetimes but eventually.


Here's an interesting paper to skim that seems to look into it:

https://vixra.org/pdf/1206.0021v1.pdf


Field theorists pretty much already have abandoned the vector calculus version of the equations, though.


Besides these test functions, I am curious whether anybody uses the CUTE/CUTEr/CUTEst framework for testing. It's not clear to me how much the CUTE* packages are used in industry or by developers of optimization software.
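
For anyone who hasn't seen it, here is a minimal sketch of what exercising a CUTEst problem looks like from Python via the pycutest interface (from memory, so treat the problem choice and details as illustrative and check the pycutest docs):

    import pycutest

    # Load a small unconstrained problem from the CUTEst collection
    # ("ROSENBR" is the classic 2-D Rosenbrock test function).
    p = pycutest.import_problem("ROSENBR")

    # Evaluate the objective and its gradient at the supplied starting
    # point -- the basic operation an optimizer under test performs.
    f, g = p.obj(p.x0, gradient=True)
    print(f, g)

The framework's value is that it hands you a large library of standardized test problems behind a consistent evaluation interface; I just don't know how many solver developers actually lean on it.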


I worked at the Arch Mach at MIT for a very short period in the very early 1980s. The Arch Mach was the predecessor to the MIT Media Lab. I made absolutely no impact there, but I do recall that one of the successes of the lab was a kind of prototype of today's Google Maps Street View, where you could drive around Aspen, Colo. on the computer and look at images taken from a car driving around the town. You could choose to drive forward, then turn left or right at an intersection as desired. You could thereby explore the entire town via the computer. It was regarded as extremely cool at the time. It's clearly the inspiration for today's Google Street View. Although I didn't work on it myself, I was able to play with it.

The images were stored on a laserdisk, and as you drove down the street the laserdisk player would pull the appropriate images off the disk and show them to you on the computer monitor. The images went on laserdisk because they were large files, and at the time a laserdisk -- designed as it was to hold video -- was the only practical way to store that many large files. For the 1981-1982 time frame the Aspen exploration system was very forward-looking, but I do recall a delay between hitting the button to move ahead and the next image appearing on the monitor. The delay had to do with first seeking, then reading the image off the disk.

I just looked around the web, and found this link describing the system:

https://en.wikipedia.org/wiki/Aspen_Movie_Map

I see a bunch of names I recognize in the Wikipedia article, so here's a shoutout to all the folks I worked with while I was an insignificant undergrad.


The reason Americans use cups and spoons (instead of e.g. grams) to measure ingredients is a matter of history. Tradition says Fannie Farmer wrote the "Boston Cooking-School Cook Book" in 1896, and for the first time in a cookbook used precise measurements (instead of "a jigger of this" or "a pinch of that") to specify how much of each ingredient to use. Nobody at that time would have had a weight scale in their kitchen, so Ms. Farmer measured her proportions using tools which were readily available in any kitchen -- cups and spoons.

https://en.wikipedia.org/wiki/Fannie_Farmer

Her cookbook achieved immediate success and she became a household name in America. Even I had heard of her, long before I became a feeble student of history. Her measurement system is totally entrenched in American cooking culture, so I expect it won't ever be displaced by metric measurements except perhaps amongst high-end commercial kitchens and with people who want to make the effort to convert to grams and liters when cooking.


Inflatable buildings used for disaster relief. They can be trucked or air-lifted into place, then quickly inflated.

https://federalfabrics.com/


Proudly made in Lowell, Massachusetts. <3


Yes, they still make textiles in Lowell!


That's not how the real world works. A lot of manufacturing equipment and test stands were developed using LabVIEW and used NI cards to control equipment and take data. The people who developed all that stuff are long gone, and the new people neither understand the old stuff nor have time to spend many months spelunking into the guts of the old software to maintain it. Their managers are also very reluctant to change anything since it means very expensive downtime for an entire assembly line. Downtime can cost tens or hundreds of thousands of dollars per month. Everybody wants to keep the old stuff running as long as possible -- generally several decades.

I have no experience in the banking industry, but I do hear that the same considerations I see in manufacturing exist in banking software. Any change is fraught with major risk (millions of dollars in loss), so change is avoided as much as possible. I see lots of people on HN talk about how change and churn are avoided in banking software ... the same applies to manufacturing.

The Raspberry Pi comment is laughable. I'm not talking about toys for hobbyists. I'm talking about computers running software that controls cards costing tens of thousands of dollars, which in turn drive physical equipment of equal or greater expense. Nobody will hire a hobby hacker to light a few LEDs with a Pi when doing factory automation.


Also, it's laughable to talk about power consumption when the PC is controlling an industrial system that's using many kW or even MW of power. At that level, 10W vs 100W is noise.


This is a relevant article to me since I also make 20-year-old (or older) computers run legacy stuff ... but not for fun.

Factories and labs frequently have machines or instruments which are controlled by a computer. They are run off control cards which are inserted into ISA or PCI slots in the computer and are commanded by the legacy software through old, proprietary drivers. Examples are cards from National Instruments or Galil. Such equipment can cost tens of thousands of dollars (or more) when new. Also, decades-old software written by long-gone engineers at the factory still runs on the equipment, and nobody understands how the stuff works nowadays. Therefore, there is plenty of incentive to keep the old systems running and running and running.

Unfortunately, old computers sometimes break. That's where I come in -- I do a side consulting business with a colleague where we refurbish the old computers -- replace parts as necessary, install old versions of the O/S, replace mechanical hard drives with SSDs, and do whatever else is needed to keep the computers running for the next few decades.

I know we're not alone out there since we're aware of other small businesses which provide a similar service. It's an important thing since -- as many point out here -- modern software companies don't make backwards compatibility a priority, but factories and labs have equipment which needs to run for decades, so the computers controlling that equipment also need to run for decades.


>Also, decades-old software written by long-gone engineers at the factory still runs on the equipment, and nobody understands how the stuff works nowadays. Therefore, there is plenty of incentive to keep the old systems running and running and running.

Vernor Vinge figured this out 25 years ago. A Deepness in the Sky depicts a human society thousands of years in the future, in which pretty much all software has already been written; it's just a matter of finding it. So programmer-archaeologists search archives and run code on emulators in emulators in emulators as far back as needed. <https://garethrees.org/2013/06/12/archaeology/>

(This week I migrated a VM to its third hypervisor. It has been a VM for 15 years, and began as a physical machine more than two decades ago.)


I'd say the approach for most current stuff is "get new guys, write it from scratch again."

That is how you get a new framework each year.

Maybe more people should just be software historians.

On the other hand, rewriting stuff in new ways is beneficial as new processors come to life.


> ... but factories and labs have equipment which need to run for decades, so the computers controlling them also need to run for decades.

For PCs it's obviously not a problem (there are just so many parts out there for just about everything and anything), but what about these proprietary, expensive ISA (or PCI) cards you mention? Are replacements available for decades too? Or are they easy to find second-hand at reasonable prices?


You take them to someone who can do component level repairs.

And the basic hardware is more involved than just the parts, and these nuances can make certain parts expensive and rare.

You have to know all kinds of things like IRQs, DOS commands, memory layout, and jumper settings -- just an insane amount of stuff compared to modern plug-and-play equipment.


That insane amount of stuff was just par for the course back in the day. I still have my old spreadsheets and technician's toolkit of 5.25" floppies with all the old-school utilities (check it out -- I'd forgotten about it until I came across my old suite of floppies while moving a few years back).


I'm curious if you do this alone or with another full time job. I've been searching for a gig company that allows one to turn old PC knowledge into a side gig.


It's a side gig for me. The biggest hurdle is finding a bunch of companies fielding the old equipment and then becoming known to them. In that sense it's like any other consulting gig -- it's mostly about who you know, not what you know.


Will it help that this increasingly affects less specialist stuff (household appliances, cars...) that the general public uses? It makes it more of a known issue.

With more things being network connected, how will we deal with the problem of keeping things not just working but secure? Is it generally practical to air gap equipment in factories and labs?


Older equipment is often air gapped just because it really had no reason to be networked, and even if you DO want it networked, it's relatively easy to air gap.

It's the new stuff that is all "cloud based" that will have real problems in 10 years.


Yes, this will be a major problem for factories. Any "cloud based" software is guaranteed to become obsolete and incompatible well within the lifetime of the equipment it controls. If you're dealing with real manufacturing or lab equipment you should avoid "cloud based" stuff like the plague.

Most of the equipment I deal with is not on the net. It doesn't need to be since it's just controlling some machine which is busy making something real. If you need to move any data on or off the computer you use USB sticks (or sometimes CDs or floppies).

Some newer test stands are interfaced to an internal LAN so they can provide process control data to a database. In that case one either needs to provide some sort of firewall between the equipment's computer and the database, or just bite the bullet and redesign the whole thing for compatibility with, say, Win 10 (at major expense in time and money).

I wish I could convince people to target their automation stuff to Linux so the backwards compatibility problem would be easier, but corporations still balk at that.


