Hacker News
Building a 10BASE5 “Thick Ethernet” Network (2012) (mattmillman.com)
66 points by RicardoLuis0 on Oct 17, 2020 | 23 comments



Not quite 10Base5, but my very first network used 10Base2 coax at home in 1993. I remember the time well because DOOM had just come out on PC and suddenly there was a compelling reason to have a home network! I believe this was the first well-known PC game to have a multiplayer mode over LAN.

We got the LAN cards, coax cables, and T-connectors for cheap (or free?!) as I guess many businesses were already upgrading to the simpler and more flexible 10Base-T around that time. Terminators were very difficult to get, but we could make them at home by taking a segment of coax and soldering a 50 Ω resistor (matching the cable's characteristic impedance) between the inner and outer conductors!

You loaded a DOS IPX driver for the LAN card before starting up DOOM. I think there was some way to share files and mount drives between PCs without a server, too, but I forget the details of that.

No switch or hub hardware was required - all the nodes on the network were just connected by one big string of coax with our home-made "terminators" at each end.

The whole thing was very fragile as any break on the cable (or disconnection of a terminator) at any point would take the whole network down. But for a home network it was fine. Good times!


Around that time (1993-ish) we had Macs and used something called PhoneNet. You plugged a dongle into the serial port and connected computers using ordinary telephone cable (you might call it “cat 1 cable” now).

https://en.wikipedia.org/wiki/PhoneNet

There were a few games we played over the network, like Super Maze Wars or Marathon.

http://macintoshgarden.org/games/super-maze-wars

http://macintoshgarden.org/games/marathon

It wasn’t long before we switched to 10BASE-T.


Do you remember ‘Spectre’?

http://macintoshgarden.org/games/spectre


The AppleTalk game I remember best was Bolo, which could run on just about any Mac.


Our PC lab was networked like this, with a row of about 10 computers.

I distinctly remember moving a desktop PC forward, which pulled the coax off its T-connector, and all the remaining PCs down the line lost connectivity.

Lots of angry DOOM players losing their network connection.


Doom's first networking implementation was very inconsiderate and had a habit of killing whole networks. https://doom.fandom.com/wiki/Doom_in_workplaces

>The first version of the Doom IPX network code transmitted its data as broadcast data. As a result of this, all machines on a network where a game of Doom was being played would receive the data, even if the machine was not involved in the game. The degrading effect on network performance forced the system administrators for many office networks to ban Doom.
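
IPX is long gone, but UDP broadcast gives a feel for the failure mode. A rough sketch below (the port number is invented, and modern switched networks blunt the effect that shared coax made so painful):

    import socket
    import sys

    GAME_PORT = 15029  # invented for illustration; not Doom's real IPX socket

    if sys.argv[1:] == ["send"]:
        # One player's machine pushing a game-state update to the whole segment.
        tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        tx.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        tx.sendto(b"tic update", ("255.255.255.255", GAME_PORT))
    else:
        # Any machine on the segment sees the broadcast frame on the wire;
        # binding the port here just makes that visible. On shared 10 Mbit
        # coax, that bandwidth was taken from everyone, player or not.
        rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        rx.bind(("", GAME_PORT))
        print(rx.recvfrom(1500))

Run the receiver on a machine that isn't "in the game" and it still gets every update, which is exactly why admins started banning Doom.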


Yeah, we got a bunch of 8-bit NE1000 cards for free and did the same thing around that time. I remember the three-step setup process of LSL / NE1000 / IPXODI. The file-sharing thing might have been NWLite?


A few years later (1995 or '96?) I built my first Ethernet network at home using 10Base2 over TV coaxial cable. It worked well enough to play Doom with my brother.


You may enjoy these few photos I snapped at the 1981 National Computer Conference (NCC), where to me the most notable announcements were Ethernet (DEC/Intel/Xerox) and the Xerox Star. I’d never seen such a UX, or a mouse.

http://ozz.ie/ncc-1981


That was interesting.

Thank you for sharing a bit of history.

The graphics on that Sony monitor with the forest scene... I did not realize that was possible at the time.


> the N connector. While still used extensively in lab/specialist environments, this large coaxial connector isn’t something the average ‘tech dude’ is likely to encounter these days.

Any ‘tech dude’ who’s also a ham, or anyone who’s worked with radios, will definitely disagree that N connectors are only used in labs. I even have quite a bit of “consumer grade” gear that uses that connector (cell signal boosters and Wi-Fi antennas, to name two).


There is a tendency in some places not to rip out old cables when putting in new ones. In some buildings at NASA Goddard I have seen a few runs of old 10Base5 cable with vampire taps still in place. I should grab one as a souvenir of the old days. I've also found various adaptors and old switches that could bridge a 10Base5 network to a 10Base2 one.

The 15-pin connectors at the ends of the MAU cables were designed to be used vertically, with a latching mechanism to hold them in place. Many Unix workstations mounted the socket horizontally on the motherboard, and in that orientation the latch didn't really work. This created all sorts of reliability problems.


I remember tapping into an ethernet cable many years ago.

I was up on a ladder in the basement of a building, moving the ladder along to find one of the marks so I could tap into the cable at the precise spot.

It was actually pretty fun for a software guy to be able to do this.

There was also the feeling of "engineers doing it right", not like that room full of normal employees on desktops that were always losing their network because someone plugged or unplugged something on their thinnet.



That copy doesn't have images. A wayback link: https://web.archive.org/web/20200710094652/http://tech.mattm...

I feel like I've seen more than a handful of personal sites crushed by HN traffic in recent months. I wonder at the server architecture that can't handle HN traffic. In this awesome, far-future science-fictional year of 2020, it should not be a feat of engineering to write something that can handle a few thousand pageloads. I've had posts visit the front page a couple times, and from the traffic numbers I saw, it's not like you have to handle ten thousand concurrent connections. Ten concurrent connections would do it.


The key is caching. I remember playing with Varnish 10 years ago, and it seriously blows my mind that even today sites get taken down by what is honestly fairly small traffic from HN. I sidestep the issue personally by having my blog (kn100.me) just be a static site generated with Hugo to keep page weight extremely low, and I stick Cloudflare in front of it. That way, whenever something I write finds its way to the HN homepage, the backing server (the smallest instance Linode offers) barely even blinks.
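
For a sense of scale, the stdlib can be the whole origin for a static site. A minimal sketch (the port and max-age are arbitrary choices, not anything Hugo or Cloudflare requires); the only non-default bit is a Cache-Control header so the CDN can serve repeat hits without touching the box:

    # Serve the current directory with a cache header a CDN will honor.
    from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

    class CachedHandler(SimpleHTTPRequestHandler):
        def end_headers(self):
            # An hour is arbitrary; tune to how often the site is rebuilt.
            self.send_header("Cache-Control", "public, max-age=3600")
            super().end_headers()

    ThreadingHTTPServer(("", 8000), CachedHandler).serve_forever()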


My personal site runs off an lxc container on my OpenWRT router, there's absolutely no way it could handle HN traffic. It's absolutely not a feat of engineering to 'fix' that, but it would cost money and there's absolutely no need almost all of the time.

For that tiny one-off Hacker News hit, your Wayback link seems to be working just fine.


> I wonder at the server architecture that can't handle HN traffic

Stock, out-of-the-box WordPress on a low-end VPS.

Any web server serving a static site, or WordPress with any of the big caching plugins, should be fine.


Yeah, I once ran an Apache + WordPress blog on a low-end AWS instance that hit not just Hacker News but also some big tech blogs at the same time, and it handled it fine with just the WP Super Cache plugin installed and all the static assets served through CloudFront, nothing else fancy.


I never worked with 10Base5 but found this fascinating. I heard stories of "vampire taps" from people in college back in the '90s.

My first home network was 10Base2, originally between a 486 and a 386, where we played Doom. I was surprised to find out about the grounding requirement for one of the terminators. I never did that. I guess the network was small enough that it didn't matter.



I don't miss coax or terminators. Networks used to be harder.


Ah yes. Another one of those 'why the heck is my blog running so slow today' days.

It's on Hacker News (again). Are people ever going to get over this page?



