>The MacWeb developers apparently took a look
at the HTTP 1.0 spec, decided, “Who would
ever need name-based virtual hosting?” and
left out the feature that 99% of the sites on
the modern web relied on.
Of course they did, it's not in HTTP/1.0, it's in HTTP/1.1.
However, I recently had to implement a minimal HTTP client. I assumed Host: was in HTTP/1.0, but it wasn't, so I had to implement the minimal portion of HTTP/1.1 required for compliance.
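For reference, the Host: header is the piece HTTP/1.1 adds for name-based virtual hosting, and a minimally compliant request really is just a few lines. A sketch (the function name is mine):

```python
# Build the smallest HTTP/1.1 request a compliant server will accept.
# Host: is mandatory in 1.1 (a 1.1 server must reject requests without
# it); it was absent from the HTTP/1.0 spec, as discussed above.
def build_request(host: str, path: str = "/") -> str:
    return (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"        # selects among names sharing one IP
        f"Connection: close\r\n"   # don't hold the connection open
        f"\r\n"                    # blank line ends the header block
    )
```

Send that over a TCP connection to port 80 and you have covered the "minimal portion of HTTP/1.1" the comment describes.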
Keep up the good work then :) At your age I spent my time reading TCP/IP Illustrated Volume 1: The Protocols and discovering unpublished operating system detection mechanisms in ARP and ICMP, putting them online and running home from school to see which governments' agencies had visited my website that day. They didn't bother hiding their tracks very much in those days.
This is just wonderful. About three and a half years ago I booted the first real computer from my childhood as well: a Commodore Amiga 500. I had one mission and that was to clone the hard drive so that I could get all the stuff off of it before time had its way with the bits (there are excellent Amiga emulators that can actually run old OS). Not having any experience soldering, I would have been crushed if I had experienced the failure of a capacitor. Fortunately, fate was kind to me.
In that experience I learned how incredibly difficult it is to get things on and off of computers this old. Even though I lived through all the advances, my mind has a much rosier picture of the capabilities of old hardware than is actually true. Similar to this author, I had to rely on serial connections and some utilities developed by the Amiga emulation community running in Windows on an old PC to actually get it to work. And boy was it slow to move data. It took something like 48 hours to move about 80 megabytes of data. I was worried that was too long for such a machine to be running, but it held out. One of the partitions did have a tiny bit of corruption, but I was overall surprised to find most of the data intact.
Sadly, my shoebox full of disks is probably never to be recovered. Both floppy drives seem to be failing and reading Amiga floppies on modern hardware is possible, but an expensive pain.
Because of the inability of most modern PC floppy drives to read Amiga disks (since Amiga floppies use sectors outside of the "readable" range for PC disks, IIRC), it can be hard to convert Amiga disks into disk images.
However, in ~2000 a mechanism was discovered whereby you can take two PC 3.5" floppy drives and get a disk image from an Amiga disk. The way it worked is you have one PC disk that will be destroyed, and you instruct the floppy controller to do a low-level floppy-to-floppy copy operation from the Amiga disk to the (burner) PC disk. The CPU listens in on the copy and gets the full Amiga disk image, while the PC disk gets overwritten with junk.
I used this mechanism and saved all my old Amiga files, and happily play many of my old games (awesome retro demo-scene crackers/trainers included! yea!) using the UAE emulator to this day.
The big difference for me was that my old Amiga 600 has a PCMCIA card slot in it. I bought a modern (and very cheap) SD card reader that fit it and copied the necessary device driver over to the Amiga via a floppy. Thankfully it was already set up to read PC disks.
After that copying files (and the entire HD image) to and from my PC was straightforward - the driver even supports VFAT long filenames. I was impressed with the whole thing.
I had a similar issue with the Amiga's floppy drives, and was surprised to discover that some software called "AMI Alignment System" seemed to get the internal drive working fine again. I'm not sure if it worked some magic or just loosened up the drive a bit; either way it let me read my old disks again. It might be worth a try.
I had one mission and that was to clone the hard drive so that I could get all the stuff off of it before time had its way with the bits
Pff, hard drive? In my day, everything we ever coded was stored on cassette tapes. We never needed the computer itself to get them back, just a tape player, special software [1], and the good fortune that solar flares hadn't eaten all the bits yet.
The only real fight I ever recall my parents having was when my dad came home with that $1,000 100 meg hard drive. My mother was livid that he had not discussed it with her first. Years later, as I came to learn what our financial situation was like in those days, I understood just how crazy it was for him to do.
One of my earliest childhood memories was going to the computer store with my mom to buy my dad an Imagewriter II as a gift. I LOVED that thing and it is probably responsible for much of my love of computing. (I remember the day my dad taught me what PR#6 did… oh boy the paper I wasted after that.)
It was only over a decade later that I realized how crazy it was that she bought that for him, given that it was (a) $600, and (b) she knew absolutely nothing about computers.
When I attempted a similar thing on a C64, the external 5.25" disk drive was similarly inoperable. It turned out the elastic band from the motor to the little ring that clamped the disk had just gotten too loose with age.
It was a pain in the ass finding one that was tight enough to stay on but not so tight it stopped anything from moving, but I got there eventually.
Might be worth popping the lid off and checking yours?
I'm not sure what they used for 286 era machines for hard drives but I suspect it was plain old IDE, which you could probably connect directly to a more modern machine. I was still in Amiga land until the 486.
Incidentally, the only 286 I ever owned was actually a board that I got for the Amiga that plugged into the CPU socket with the unholy alliance of a 286 + Motorola CPU. You could use them at the same time to run PC stuff and Amiga stuff. It was a hack of hacks. But it worked!
> I'm not sure what they used for 286 era machines for hard drives but I suspect it was plain old IDE,
It was not. It was the precursor to IDE, where the drive was just a disk and head actuator, and all the magnetic signal decoding electronics were onboard an ISA card plugged into an ISA slot.
What was initially offered as the first "IDE" was taking the analog electronics on that ISA card and moving them onboard the drive and just "extending" (essentially) the ISA bus out to the drive.
The Mac Plus has a CPU-limited line speed of ~19 kilobits/sec? That's pretty amazing.
My first was a Kaypro II with 64KB of RAM, a 191KB floppy drive, and a 2.5MHz Z80. I just looked up some videos of them running and the memories came flooding back. Good times.
Just imagine what computing power we'll have in another couple of decades and how silly what we have right now will seem.
I remember being absolutely devastated when the U.S. Robotics 28.8k modem I purchased was too fast for the 9.6kb/sec the 8250 UART on my 386sx could handle. I had to sell it to somebody else and find an expensive internal model with its own UART. I think later they started selling internal modems without UARTs that just used the CPU to do the work; I wouldn't be surprised if that was the case here.
He's saying get an _extra_ serial board (put it into an ISA slot) that has the 16550 on it. Back then, this was a common way to get around what you're talking about.
It seems like his particular setup is the limiting factor. He should have no problems with 56k RS-232 with proper cabling. The Mac had a dedicated serial controller used for RS-232 and RS-422.
LocalTalk wasn't speedy and we were pretty thrilled when we got macs that could support 10/t ethernet, but there was a distinct charm to LocalTalk. (like, the terminators for the cabling consisting of a resistor sticking out of an rj-11 jack!)
I still have a few Dayna Ethernet to LocalTalk bridges in the garage...
According to that article, the fastest widely-supported RS232 speed that chip can communicate at is 19.2 kbit/s - the onboard clock generator simply doesn't support any of the standard speeds above that.
In that case you get yourself a USB to serial dongle which includes a CP210x or a FT232 chipset. Those can be configured to support a huge range of non-standard baud-rates easily.
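As a sketch of why those chips can hit odd rates: per FTDI's baud rate app note, an FT232 divides a 3 MHz reference clock by a divisor with 1/8 fractional steps, so it can land exactly on (or very close to) almost any requested rate. This toy calculation (function name and simplifications are mine; the real chip has extra special cases at the top of the range) shows the idea:

```python
# FT232-style baud generation (simplified): rate = 3 MHz / divisor,
# where the divisor can be adjusted in steps of 1/8. Working in
# "eighths" keeps the arithmetic exact.
BASE_CLOCK = 3_000_000  # Hz, the chip's baud reference (per FTDI docs)

def nearest_ft232_rate(requested: int) -> float:
    """Return the closest baud rate an FT232-style divider can produce."""
    eighths = round(BASE_CLOCK * 8 / requested)  # divisor in 1/8 steps
    return BASE_CLOCK * 8 / eighths
```

Notably, MIDI's non-standard 31250 baud divides the 3 MHz clock exactly (divisor 96), which is why these dongles are popular for retro-MIDI work, while a standard rate like 115200 comes out slightly off (115384.6) but well within tolerance.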
This post brought back great memories of connecting my Amiga to the net ~1991. The greatest triumph at the time was FTPing synth patches for my Yamaha DX7 from a server in Texas to my unix account at UWA, downloading it to the Amiga via zmodem as the author did, then uploading via the second serial port to the DX7 with a serial to din (MIDI) cable made with 2 sockets, 2 wires and a resistor. I couldn't really believe it when it all worked.
Well, the server may be an 80286, but as its faq.txt says, that's not the machine you're talking to once the page is loaded. It's just a JavaScript console simulation:
Love it, nice work. Had a bit of a chuckle thinking of the site owners who see "Mac Plus, 1986" turn up in their user-agent logs. Still renders better than IE6, though.
I created a network card for my MSX computer a year ago; it has less than half the CPU speed of this Plus. I really like working on that machine; adding hardware, writing bits of software. I said it before, but it's like a bonsai tree. Because it's not really possible to do anything commercial on it anymore, unlike on anything modern, my brain stops thinking commercially, and that's a good feeling which I don't seem to have when touching any modern computer/board. Using the latter I always get business ideas, and then it suddenly goes from playing to work. Which is not bad, because I like that, but sometimes I want to just play.
>Did I mention it was slow? It was slow. Soooo sloooow. Slow slow slow. Like, minutes to read and render a page slow.
This. People wax nostalgic all the time about how fast running Star Writer or something was back in the day, and how "bloated" modern OSs have become, without understanding that those old PCs did 1/1000 of the things modern computers do, and even that simple stuff (running a word processor) was slow compared to today's standards...
It depends. For instance, Borland's Turbo Pascal with its one-pass compiler was pretty damn fast. Its edit/compile/run cycle was amazing (an "ECRL" as fast as a REPL). And this with 64KB RAM.
Whereas some of today's IDEs and language-layered-on-a-VM experiences aren't nearly so fast. Granted they're working much harder. But the end-user experience is not always faster these days.
Ahh! The good ol' days. I started programming with Turbo Pascal when I was around 7 years old, I still remember the green glow of the CRT and the clickety-clack of oh-so-heavy mechanical keyboards. Man, those were good times, interrupting the BIOS, corrupting sectors and playing paratroopers deep into the night :D
>It depends. For instance, Borland's Turbo Pascal with its one-pass compiler was pretty damn fast. Its edit/compile/run cycle was amazing (an "ECRL" as fast as a REPL). And this with 64KB RAM.
Still, not comparable. That means that TP had to work in memory, and not touch disk (else it wouldn't have been speedy at all).
Which probably means that the libraries available would be minuscule. A decent graphics (or even math) lib can be well over 1MB. Did you get much out of that-era TP besides the ability to use the core language structures?
Yep, particularly since touching the disk meant a 5-1/4" floppy diskette.
But no, the programs weren't toys. They were an order of magnitude more complex than most of today's smart phone apps. And in fact, PCs were the smart phones of their day -- a fun size version of what had been considered "real" computing, initially not taken seriously, but destined to evolve quickly.
Also you had direct access to the hardware -- there weren't ten layers of virtualization, protection, and APIs in between. You could do near-realtime things, very low latency. I think that's part of what contributed to the subjective experience being quite fast... provided you didn't need to hit the floppy disk!
> Which probably means that the libraries available would be minuscule. A decent graphics (or even math) lib can be well over 1MB. Did you get much out of that-era TP besides the ability to use the core language structures?
A decent graphics/math lib written in TP could be much smaller than 1MB. There's tons of 64k demos written with Turbo Pascal (it helps that it has inline asm). By which I mean "decent" for that era. Say, a functional SVG engine would be something different of course.
The math lib would probably also have to spend bytes on certain functionality that we'd today consider "core language".
But how much of that stuff is actually needed, and how much is processor sucking prettiness?
I think people justifiably feel a bit peeved that word processors ran acceptably fast on a 286, and run acceptably fast on a modern multi-core many GHz machine, and yet the added functionality isn't that useful for most people.
Sure, for a program like Word, most users just know a small subset of the features. But they use many more of them because they rely on documents from people who know a different subset.
A useful analogy would be library functions. A new Python programmer doesn't need to know its C language interface to use a library that relies upon it.
Word (and the other Office software) is amazing, very powerful, etc. And the OSes that this modern software runs on are full of features.
And so software does do very much more than it used to do, and that's what people talk about when they talk about bloat.
I could install a minimal Arch, with JWM, and Abiword, and run everything from RAM and get blazing fast operation. But it is odd that modern software, even though it's so feature rich, is also so slow.
It's hard to say that Word is slow "performing x" when the alternative in many cases is not "performing x" at all.
Put another way, even if I never embed spreadsheets in my documents, if Abiword cannot deal with the embedded Excel spreadsheet that my client sends me, then Word's ability to deal with it is not bloat.
And if my alternative to editing the client's input numbers directly in the Word document she sent is to deal with a PDF and manipulate dumb text manually, then Word isn't running slowly.
"and yet the added functionality isn't that useful for most people."
Well, if the ability to spell check as you type, see the formatting of the document as you work with the actual fonts et al, embed images and graphs etc. isn't "that useful" to them, then they can always run something like WordPad or Notepad++.
I remember reading something about "good web development" back in ~1997 and the author saying the Yahoo! page loaded in about 6 seconds and I was thinking "wow, that's pretty fast."
When I bought mine similarly decked out (with a GCC PLP printer) the computer sales guy asked me "What are you doing with that much RAM? CAD for NASA?"
We kept it for 9 years but it developed a quirk where the monitor wouldn't turn on. The only way to make it light up was to grab the Mac Plus and squeeze it, so everyone used to joke that my Apple computer was lonely and needed a hug.
Ah, I used to work at a place that used a load of old PCs for training. Often some would refuse to boot. The solution was a sharp slap on the side of the case. I guess these PCs knew they were naughty.... ;)
I had a IIe with the dual disk drive. The right side drive didn't work and the left one's door had to be held closed with a large eraser. Ah, the memories.
It actually has a lot more processing power than the minimum required to "connect to the Internet" - TCP/IP and web servers have been implemented on microcontrollers with KBs of RAM. Parsing and rendering HTML, however, does need more than that...
> In terms of technology, the achievement is not only in connecting a small computer on the Web, but also in the size of the network software that is running on the chip, according to Shri. The computer consists of an iPic TCP/IP stack running on 256 bytes of memory, using its own equally tiny operating system. Despite the small size, the TCP/IP stack is fully compliant with the requirements of the relevant standards. It is connected to the Internet through a serial port. Because the machine is a Web-server, it does not need a keyboard or display, but is operated from another computer using a Web connection.
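The quoted stack squeezed all of that into 256 bytes; on a modern machine the same core idea, accept a connection, ignore most of the request, send bytes back, fits in a few lines. A toy sketch (function names are mine, and the iPic did this in hand-tuned assembly, not Python):

```python
# A toy single-request "web server", illustrating how little logic
# serving a page strictly requires once a TCP/IP stack exists.
import socket

def open_server(port: int = 0) -> socket.socket:
    """Bind a listening socket on localhost; port 0 picks a free port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    return srv

def answer_one(srv: socket.socket) -> None:
    """Accept one connection, read the request, send a fixed reply."""
    conn, _addr = srv.accept()
    conn.recv(1024)  # the request itself is read and ignored
    conn.sendall(
        b"HTTP/1.0 200 OK\r\n"
        b"Content-Type: text/plain\r\n"
        b"\r\n"
        b"hello from a tiny server\r\n"
    )
    conn.close()
```

Everything else in a "real" server (parsing paths, headers, concurrency) is optional elaboration on this loop, which is why it fits on a microcontroller.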
I know everyone here will tell me to get off their lawn, but I just built a Pentium II computer about 2 months ago. I pulled most of it out of my uncle's house when he died about two-ish years ago, and decided that it's time to do something with this crap that's lying around. It's not quite the first computer in the house (a Compaq 486[1]), but it runs all the things it could.
[1] it was (what would now be called) an all-in-one, and it came with some Windows 3.1 thing called TabWorks. And before anyone asks, no, I didn't program on the thing. Not every programmer has programmed since he was in diapers. Also, Windows didn't come with a programming environment back then and search engines sucked ass.
A Compaq all-in-one 486 with TabWorks was our family computer when I was a kid. You've evoked some very fond memories. Windows didn't come with a programming environment, but DOS had QBASIC, which is one of the best beginners' programming environments ever made.
That machine had a long lifetime for us. We upgraded the RAM as far as it would go, replaced the HD... even added one of those AMD "586" coprocessors.
Cool! After we had it for a year or so, said uncle put Windows 95 on it. Lost the speakerphone app and Lode Runner Online in the process, but gained the late-90s web. I downloaded that Lode Runner from the creator's site around 2004[1], and had my first retro computing moment.
It wasn't upgraded (except possibly RAM), and we had it from about 1995/6 until 1999, when my dad got a Pentium 166 that was surplus from an office move.
I wish my Aunty hadn't thrown out our IBM Pentium PC @ 100MHz, with a whopping (at the time) 16MB of RAM. Seriously powerful machine for the time. I miss it...
I'm curious how long it would take that browser to infect the computer with a drive-by download. Or is it just so old and obsolete that no malware programmer in this world would bother exploiting its holes?
Right, no scripts running means virtually no attack vectors. Perhaps overflowing a buffer by downloading a large webpage? But that's about all I can muster.
There was a dedicated fanbase for using Apple's System 6 day-to-day as recently as a decade ago or so. There's still a bunch of information up on the web and many of the links to 3rd parties at the bottom still work. http://www.vintagemacworld.com/sys6net.html