I had absolutely no idea that the original SoundBlaster had embedded firmware running on an 8051. It never occurred to me that our PCs were loosely-coupled piles of microcomputers all the way back then. (Now I'm off to go learn what that firmware was doing...)
It comes full circle: computers used to be the size of a building, and now what we call data centers are again the size of buildings (and in many cases can be considered one large distributed computer). I've always enjoyed this article on the topic: Myer, Sutherland "On the Design of Display Processors" http://cva.stanford.edu/classes/cs99s/papers/myer-sutherland...
This might not be your scene at all, but someone made drawings and read from Gilles Deleuze's treatise on "stratoanalysis". It's long-winded and completely unlike that paper, but it generalizes the idea, or at least I think it does.
Poor analogy IMO. The old building-sized computer was only adequate for a single tenant's use. Nowadays the large data centers host hundreds of thousands of organizations' infrastructures.
Reminds me of a story my dad told about studying CS at Waterloo in the 70s. Time sharing == bringing your punch cards in at 3am because the queue was shorter then. :)
He said those nights were the best experiences of his time at University (aside from meeting my mom).
Are you referring to this part of the parent comment? "Old building sized computers often supported multiple organizations via time sharing etc."
I worked at Tymshare in the early 1970s, and I can assure you that we supported many simultaneous users and organizations who dialed into our Tymnet nodes with their Teletypes and other terminals to connect to our building sized computers. That was the whole point of the company, and the very thing it was named after.
Not new at all. Remember that Commodore disk drives had their own 6502s in them, for example (some of the big PET drives had two). A joke at CBM was that the 1541 was the best computer they ever made.
Ironically, in the early 80s, we did it mainly because of the shortage of computing power, so we needed lots of co-processors. And now we are still doing it, but mainly due to our excessive computing power. Want to blink an LED? Let's put Linux, Apache and Ethernet in the circuit. ;-)
Well, on the other hand, it's unfair to say it's an abuse of computing power, since if you do have the power, making autonomous devices does make a system much more elegant and manageable, and not even talking about the advantages of general processor over specialized chips.
Take another look: it also tells us something about early home computer designs. The actual CPUs were usually irrelevant; the CPUs themselves posed no technical difficulties, and you could easily put TWO of them in a disk drive. Unlike in modern computers, the CPU wasn't even the most expensive part, and the choice was limited: a cheap "trainer" computer with only a hexadecimal keypad and an expensive home computer with a color CRT often shared the same CPU. The vast majority of the cost and trouble was system design, that is, putting these parts into a usable computer with your own solution for input, storage, interfaces, graphics, software, and even the chassis. The standardized IBM PC hadn't come yet, and one needed to create a solution for every subsystem. Every computer was a unique design, often with custom-made ASIC chips for peripherals.
> Ironically, in the early 80s, we did it mainly because of the shortage of computing power, so we needed lots of co-processors.
I'm not sure I entirely believe that explanation.
Back in 1982, the Commodore 1541 [1] was basically another C=64 with no keyboard, and a built-in floppy disk. It was big and heavy and hot and not terribly reliable.
4 years earlier, the Apple's Disk II [2] was tiny, light, and had only 5 ICs [3] -- it was described as "a state machine made of a prom and a latch" [4]. The computer's CPU drove it -- it's not like the 6502 would otherwise be running other threads or processes during I/O!
Apple didn't run a chip fab manufacturing 80% of the chips inside the 1541, hence no incentive to sell you overpriced garbage (like the C128D with 3 CPUs, 2 of them doing nothing all the time).
Yet Commodore's "overpriced garbage" was sold at retail prices consistently lower than Apple's, and had graphics and sound capabilities that far-exceeded the comparable Apple machines.
Also, Apple made sure they got into schools and managed to push out the PETs with Apple IIs. Kind of too bad, because the Commodore (C64, post PET) and Atari computers of the day were cheaper and (arguably) better in many ways.
I have such fond memories of my kindergarten and first grade computer lab being full of PETs! The Apple IIs started coming in a year later, and I loved those too.
To be fair, something like an ESP32 runs circles around the PCW1512 we had at the school computer club, where we spent countless hours playing Defender of the Crown.
So we don't have to necessarily constrain ourselves to Assembly and C on those tiny devices.
> So we don't have to necessarily constrain ourselves to Assembly and C on those tiny devices.
Good point.
> something like an ESP32 runs circles around the PCW1512, which we had on the school computer club
I think as long as people are having fun with them, the hardware already served a good purpose. Even if the LED blinking is implemented by running Linux.
But in my opinion, people who are interested in these popular embedded boards should also learn about the basis of electronics, or just be aware of its existence, and understand the things one could do without buying a premade "board", and that it's possible to blink an LED with a few transistors. Well, I think most will find out by themselves, it's just a matter of time and guidance.
My problem with using things like the RPi for such simple LED blinking projects is that it contributes to the cult of overengineering that plagues the industry. People seem to grow up thinking they need innumerable layers of abstraction to do things, when often they have actually made their solution not only inefficient, but actually less stable and more difficult to develop.
+1. Another criticism is consumerism - the hype that some of these devices created was more about purchasing and plugging in boards and calling a library, without actually helping hobbyists learn something.
Or as John Carmack has said, low-level programming is good for the programmer's soul; it's hard to imagine that programmers have souls if they have never been exposed to low-level programming. Unfortunately, in the modern computing world, the chances are becoming fewer and fewer, and most systems are far beyond complete understanding for most people. I think I'll never completely understand the x86_64 instruction set (with SSE, SSE2, FMA, AVX), or the signal path of the PCI-E interface on my motherboard.
Recently, I think some popular microcontroller projects may be the solution for a bare-metal programming experience.
But we can also look at it from another perspective: the proliferation of mobile devices changed everything. Nowadays, even using a general-purpose computer is not an experience that many members of the new generation have (there was a Hacker News article about it). Some don't understand a hierarchical filesystem of files and directories anymore, and others have problems typing on a physical keyboard. In this sense, blinking an LED on a Raspberry Pi is already a big step forward, and that was exactly the motivation behind the Raspberry Pi.
Yes, it is unfortunate that in the pursuit of both compatibility and performance we now live in a world where even the CPU is so maddeningly complex that it is basically impossible to reason about how it will perform on any given piece of code. Even writing assembly leaves you a few layers of abstraction above where it would have 30 years ago.
I like to think that it is possible we'll one day develop a CPU architecture that is both simple enough to reason about and also highly performant.
Yes, and that's also my reason for mentioning microcontroller projects. For example, some find the AVR instruction set clear and powerful, optimized for both assembly programming and C compilers, and it can serve as a good introduction to both hardware and assembly.
> I like to think that it is possible we'll one day develop a CPU architecture that is both simple enough to reason about and also highly performant.
RISC-V is getting there, with open-hardware cores such as Rocket (in-order) and BOOM (out-of-order). Too bad that many and perhaps most of the peripheral components even on a general-purpose SiFive SoC are still closed hardware blocks. But people are working on opening these up as well.
I don’t see a problem with this because this only happens in individual projects. Who cares if some teenager wants to control her reading lamp with a RPi instead of an ESP8266? Or if some dad wants to make a noise detection circuit for his nursery using an old laptop instead of an Arduino? What’s likely to happen is that they will both learn a lot of new things, including about the inefficiency of this. Just because you know better doesn’t mean it is bad that they did this. Besides, when you go for controlling your lamp with an ESP32, someone can always point to a simpler and cheaper processor to use instead. Then we can run into circuits designed without any software at all. Eventually you’ll be left with a MOSFET and a proximity sensor and your inner purist will be happy.
I say the point is that you should try to do trivial things with complex control systems and vice versa. You’ll do it better and more efficiently the second time. And the third. And if you decide to scale it, you will quickly learn the cheapest way to blink 10,000 LEDs.
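On blinking 10,000 LEDs cheaply: charlieplexing is the classic trick here, since n GPIO pins can address n*(n-1) LEDs. A quick back-of-the-envelope in Python (just an illustration, not driver code):

```python
def charlieplex_pins(num_leds):
    """Smallest pin count n such that n*(n-1) >= num_leds,
    i.e. how many GPIOs a charlieplexed matrix needs."""
    n = 2
    while n * (n - 1) < num_leds:
        n += 1
    return n

print(charlieplex_pins(10_000))  # 101 pins are enough for 10,000 LEDs
```

(In practice you'd hit refresh-rate and per-pin current limits long before that, but it shows how far multiplexing stretches a cheap microcontroller.)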
Last year I saw a project that used two RPis for a somewhat simple industrial automation task. After I pointed out that the whole thing was huge overkill, the original author did some optimization and redesign, which resulted in a design that still had two RPis, one of them used as a one-bit 3.3V logic inverter and nothing else.
Needless to say, the whole thing was replaced by a single ~$20 Chinese FX1N clone (which also neatly solved the problem of how to drive 24V industrial loads from an RPi/Arduino/ESP...).
For this kind of problem you don't want a breadboard. On the other hand, a board with an ATtiny, a 12-bit PIC, or the smallest MSP430, plus a few relays and optocouplers, would solve the problem. And on the gripping hand, random Chinese PLCs exist and don't involve the few man-days of NRE needed to design such a thing.
"My problem with using things like the RPi for such simple LED blinking projects is that it contributes to the cult of overengineering that plagues the industry."
I disagree a bit with you (though I do agree that if all the RPi ends up doing is flashing an LED, then that is a waste). You have to start somewhere, and if it is flashing an LED with a Pi, then cool. The path to an ESP32 or ESP8266 is not far away, and there is a lot, lot more to discover and play with and learn and frankly have fun with, and maybe do something useful or change the world (or a tiny bit of it) with.
I had some of the best opportunities available to me in IT and electronics and electrics, schools, teachers etc etc etc. but I didn't have today's opportunities. If I was growing up today I think I would probably have started with a Pi and blinked some LEDs.
Gently, friend, the 555's less than 10 cents at Digikey in medium quantities [0]. One of my hardware co-workers happily placed one on the same board alongside a sub-dollar ST ARM MCU. Sometimes the 555's still the best option.
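For anyone reaching for the datasheet: the 555's astable output frequency is approximately 1.44 / ((R1 + 2*R2) * C). A tiny helper to sanity-check component values (the values below are just an example, not from the board mentioned above):

```python
def astable_freq_hz(r1_ohms, r2_ohms, c_farads):
    """Approximate 555 astable frequency per the classic datasheet formula."""
    return 1.44 / ((r1_ohms + 2 * r2_ohms) * c_farads)

# roughly 1 Hz blinker: R1 = 10k, R2 = 68k, C = 10 uF
print(astable_freq_hz(10e3, 68e3, 10e-6))
```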
Disclaimer: I'm not very knowledgeable about DIY electronics and pretty bad with a soldering iron (shaky hands). Regardless I think it's super interesting and cool and thus my question:
I checked that site and "medium quantities" is 5000 of them :) (for a total price of $493). Somewhere on the page it says minimum amount is 2500, but I tried and wasn't able to successfully add that to my cart (only 5000).
So I guess that's nice, but how are you going to put even 2500 of these chips to use? That's a LOT of soldering to do. Unless you have some kind of machine to do that for you? But even then what are you going to use 2500 of them for, if not for reselling?
I believe you're asking a genuine question. It's not DIY (well, after we've prototyped it's not); these components are elements of goods for sale. Specifically, the 555 was used as a watchdog. Runs of 5000 are barely enough to get the attention of stateside distributors, so long as they arrive with orders for everything else on the board. As to how they're used, someone sticks a reel into a pick-and-place robot that populates the board, and the assembly shuffles into a wave soldering machine for the dirty work. It's very conventional stuff, to the extent that $DAY_JOB's factory manages it from a third-world country.
An impressive trick at Commodore users group meetings was to run Fast Hack'em for the 1541. This was a low-level floppy copier that ran on one or more 1541s, without the need for a host C64 once the process started.
i.e. you'd start connected like this: (drives were daisy chained)
[C64]--[1541]--[1541]
Then remove the C64 and dupe away for the rest of the evening, swapping in disks as they completed.
And the original Apple LaserWriter was the company's most powerful computer at its launch!
"With its 12 MHz 68000 CPU, the LaserWriter was 50% more powerful than existing Macs, and with the 1.5 MB of memory necessary to download Postscript fonts and print a page at full 300 dpi resolution, it had three times as much memory as the top-end Mac."
Even your keyboard has a microcontroller in it. The subsystems in a PC used to be far more independent of the CPU: sound cards had features like wave table synthesis which could offload playback work from the CPU, SCSI disk controllers could do transfers between disks (after being initiated by the CPU), network adapters could do some basic packet inspection to see if your computer needed to bother processing an incoming packet etc. Moore's law, the drive for ever cheaper computers and some strategic moves on Intel's part eliminated most of that on the majority of PCs in the late 90's when we started seeing things like Winmodems, Winprinters (i.e. devices that couldn't function without a Windows driver on the PC doing most of the work turning the peripheral into a dumb I/O device) and USB. There are still many microcontrollers in a modern PC but they're buried at lower levels (instead of being in the hard drive controller they're now in the storage device itself, for example) and for different tasks.
How so? I would classify the ARM M0 (most likely what's in those devices) as a microcontroller-class device. Yes, it's a SOC... but so is the Atmel AVR and almost every other microcontroller these days. I tend to classify devices based on their use cases rather than specific hardware limits since it's a sliding scale over time. (i.e. today's 8-bit microcontrollers exceed what we had with 1980's-era 8-bit PCs on pretty much every front other than RAM... but they generally aren't considered anything more than microcontrollers today)
Yes, it started with Winmodems, which people used to mock a lot. There used to be a fancy type of hardware modems called voice modems which even had sound card capabilities, and in turn sound cards had game ports that came standard.
Yeah that's incredible. I'd no idea. I'd assumed that the chips on devices like this were essentially dumb "slaves" controlled by the CPU.
I had some vague idea that the chips had some autonomy (ie, when playing digital sound, they knew enough to request/fetch more data via interrupts, DMA, etc) but I'd never spent the time to really learn what was going on there.
Actually, having computer boards with their own CPUs was NOT a modern phenomenon born of excessive computing power. They have always been made, as early as 1980 (without the s), and for good reasons.
Some were even a bit unimaginable today: for example, we had the Z-80 SoftCard (https://en.wikipedia.org/wiki/Z-80_SoftCard), a Z80 expansion card made (by Microsoft!) for computers with 6502 CPUs (the Apple ][) so they could run existing business applications written for CP/M, like WordStar. It could also be paired with a modem to use its extra processing power to run a BBS server. There were also cards that packed multiple CPUs to implement a poor man's (home computer) version of parallel processing.
The simplicity of single-task operating systems and hardware architectures of that time allowed multiple independent computers to shift control of a physical machine back and forth. Today it's pretty much impossible, but nobody needs it either; you can just run everything simultaneously.
Indeed many 6502 systems had support for an optional Z80 (or indeed a built-in Z80, as on the Commodore 128) to run CP/M. Later there were one or two x86 boards for pre-Intel Macs. The Amiga platform's upgrade path made some later Amigas a tangled mess of different CPUs.
There was a Soviet PC for high schools, a semi-clone of the PDP-11, which had 2 CPUs clocked at 6 and 8 MHz respectively. It was called the UKNC. Look it up; it's an interesting device.
I used to have a SoundBlaster Advance (IIRC), so a bit newer one. At some point I was running a (Windows) tool/driver called "kX Project" so I could more easily route separate signals to the headphone and line out jack.
At some point while playing around with the program I discovered I could also load simple DSP effects onto the signal chain (EQ, pitch shifter, even echo delay I think). It took me a little while to realize that these DSP effects were in fact not running anywhere on my CPU, but instead were running on some kind of DSP chip on the soundcard itself!
kX Project is an alternate driver for Emu10k-based Soundcards (such as various members of the Audigy and X-Fi series and some Emu cards).
The Emu10k2 is a surprisingly beefy audio DSP that was included for, unfortunately, no good reason: Creative Labs sold these to gamers, and games just didn't use it beyond what could be exposed via DirectSound (pre-Windows-10 sound mixing and the EAX extensions, which were themselves often a source of game crashes and/or BSoDs) or the never-widely-adopted Creative variant of OpenAL.
This is what you ended up discovering. I, too, had one, but never really found a use for it, as the audio quality was pretty poor due to being just a generic DAC and a generic opamp on a generic PCI card. Better than onboard sound, but not as good as inexpensive external solutions.
I looked around and wasn't able to find whether this firmware has been reverse engineered (surely it has been) -- the snark barker just has a .hex file checked into their git.
After doing some work on computer forensics, I remember wondering whether the "memory contents will be wiped off after you leave" message from Dr. Sbaitso was actually true or not.
Speaking of AdLib, a couple of years ago an AdLib Gold 1000 ISA card sold for US $3,400 on eBay. The starting ask price was 99 cents and the first bid was US $50.
> Did any of you, back in the 90s, build a parallel port DAC to use with Linux's PC Speaker driver?
And someone else, dharma1, replied:
> My friend's dad built one for him, for DOS/windows though. Must have been around -89 when we were 12.
> I think Scream Tracker had the schematic bundled with it as an ASCII drawing. We were taught how to solder in primary school, but I remember the schematics being way too advanced for us. Looking at it now it's just a simple resistor ladder tree
> I think it was roughly the same as Covox, someone had written a sound blaster emulator for it, and it worked pretty well on games, scream tracker/fast tracker and demoscene demos - for the price of a parallel port connector and a few resistors, pretty cool.
If that person (dharma1) sees this thread, I have a question for them: Could you take a picture of that schematic and post it?
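In the meantime, for anyone wondering what the "simple resistor ladder tree" actually does: an R-2R ladder on the eight parallel-port data lines produces a binary-weighted sum of the bits. Ideal behaviour (perfect resistors, no load) in Python, purely as an illustration:

```python
def r2r_output_volts(code, v_high=5.0, bits=8):
    """Ideal output of an R-2R ladder DAC whose inputs swing 0..v_high."""
    return v_high * code / (2 ** bits)

for code in (0, 128, 255):
    print(code, round(r2r_output_volts(code), 3))
```

So a byte written to the port maps linearly onto a voltage between 0 V and just under the logic-high level, which is all a "sound card" for 8-bit samples really needs.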
The old DOS program ModPlay came with a file called HARDWARE.DOC which detailed several designs for DACs that attached to a parallel port. I was pleasantly surprised to find that ModPlay has its own web page: https://awe.com/mark/dev/modplay.html. The first download link on that page (https://awe.com/mark/bin/mp219b.zip) is a .zip file that contains HARDWARE.DOC. That file was designed to be displayed under DOS, so in order for the diagrams to show up correctly you'll need a text editor that can be configured to display text in the DOS codepage (CP437). Notepad++ can do this via Encoding | Character sets | Western European | OEM-US. If anybody has instructions for other programs / operating systems, please feel free to jump in below.
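Jumping in as invited: on any system with Python, the standard library's cp437 codec will convert the file, so you don't need a special editor at all. A small sketch (the box-drawing bytes below are just a demo input):

```python
def cp437_to_utf8(data: bytes) -> str:
    """Decode DOS-era bytes (code page 437) into a normal Python string."""
    return data.decode("cp437")

# 0xC9 0xCD 0xBB is the top edge of a double-line box in CP437
print(cp437_to_utf8(b"\xc9\xcd\xbb"))  # prints ╔═╗
# To convert HARDWARE.DOC itself:
#   open("HARDWARE.txt", "w", encoding="utf-8").write(
#       cp437_to_utf8(open("HARDWARE.DOC", "rb").read()))
```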
The HARDWARE.DOC file contains designs for 2 different mono devices that use DAC chips, one mono device that uses an R-2R ladder, and the "Stereo-on-1" which produces stereo using only a single parallel port.
I built a Stereo-on-1 back in the day. I had to send away to England to get the required chips from Maplin. It didn't work properly when I assembled it because I didn't understand how the pins on the chips were numbered. They're numbered counterclockwise, starting from the pin with the dot but for some reason I had thought they ran down the left side and then ran down the right side. Years later I revisited this, figured out what I had done wrong, and fixed it. It still worked the last time I checked a couple of years ago. It produces quite decent sound.
I'm pretty sure a GUS still in its box would turn up a rather good price from nostalgia enthusiasts.
I mostly remember it for the fact that a great many demoscene demos and even some games would only play sound (music) if you had a GUS, a Soundblaster wasn't enough.
I think it had something to do with it being easier to play MODs on a GUS, or something. I imagine this was because you could load sounds into its memory and trigger them? I'm really not sure, someone please clarify :)
Either way, make sure you get a good price for that GUS :) Or better, make sure it finds a home with a loving enthusiast :)
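On the GUS question above: as I understand it (happy to be corrected), the GF1 chip had onboard sample RAM and up to 32 hardware wavetable voices, so a MOD player could upload the instrument samples once and let the card do the per-voice pitch-stepping and mixing. On a plain Sound Blaster the CPU had to run that inner loop in software for every output sample. A toy Python sketch of that per-sample work (illustration only, not real player code):

```python
def mix_voices(voices, out_len):
    """Naive software mixer: step each voice through its sample data at
    its own rate (nearest-neighbour resampling) and sum into the output.
    This is the work a GUS did in hardware for up to 32 voices."""
    out = [0.0] * out_len
    for samples, step in voices:  # step > 1.0 raises the pitch
        pos = 0.0
        for i in range(out_len):
            if pos >= len(samples):
                break  # one-shot sample finished
            out[i] += samples[int(pos)]
            pos += step
    return out

# two voices: one at native pitch, one an octave up (step = 2.0)
print(mix_voices([([1.0, 1.0, 1.0, 1.0], 1.0),
                  ([0.5, 0.25, 0.5, 0.25], 2.0)], 4))
```

Doing that for 4-32 voices at 20+ kHz was a real CPU tax on a 386, which is presumably why demos targeted the GUS.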
I just noticed that I have two SoundBlaster 16 PCI cards in their original boxes, but apparently they don't approach the AdLib's level of scarcity. (Similar items are selling on eBay for about $40.)
It seems almost entirely pointless: the SB 1.0 has a big place in history but that's due more to market share than technical excellence or uniqueness compared to, say, the C64's SID chip, or even a true contemporary like the Gravis Ultrasound.
And unless I'm mistaken vintage Sound Blaster boards are not exactly in short supply... though I'm not entirely sure about that; it's tough to search for them thanks to the countless scads of products bearing the "Sound Blaster" moniker.
And yet, this is cool. This is a complex piece of work, a very nice piece of engineering by the look of it!
Preservation of computing and gaming history is important, and ensuring we can make more physical copies of things if needed/desired is crucial when it comes to keeping that history alive.
> the SB 1.0 has a big place in history but that's due more to market share than technical excellence or uniqueness compared to, say, the C64's SID chip, or even a true contemporary like the Gravis Ultrasound.
Cultural value is as important as technical advancement, at least in the long run. I think you could agree with me.
A notable cliché is VHS. In the 80s it was a far-from-optimal video system, but it has since become a symbol of the decade. Another example is the proliferation of home computers in the 80s: while many were groundbreaking, some were closer to a market bubble than a technical accomplishment and were mocked as "Trash-80s", yet now they too mean something different.
And thanks for your additional commentary on soundchips.
> resurrecting the old 360KB 5.25-inch floppies tech
Sure, practicality is important as well. The primary reason nobody is resurrecting 5.25-inch floppies is simply their impracticality. I think many people in the retrocomputing community would like to build a retrocomputer with these crunching floppies, but the supply chain has already vanished, and it's extremely difficult (if not impossible) to make a 5.25-inch floppy drive independently, as it involves custom mechanical moving parts. It's better to spend the time on, let's say, a homemade video game cartridge.
Many retrocomputing hobbyists make liberal use of modern peripherals such as CF cards.
it's also extremely difficult (if not impossible) to make a 5.25-inch floppy drive independently, as it involves custom mechanical moving parts
The tolerances are not that high and the physical specifications are available from ECMA; most of the mechanical parts would not be difficult for a machinist, it's the heads which are the most difficult. The intersection of machinists and retrocomputing hobbyists is probably not that large, and 5.25" drives are still not really rare enough to attempt remanufacturing.
The technical excellence of the Gravis Ultrasound makes it less interesting IMO. It did everything right, so it has no individual character. I prefer hardware that cut corners, e.g. the Amiga's Paula chip, with its hard panning and lack of built in anti-aliasing filter. Technically "bad" sound hardware can have a unique and distinctive sound that's still interesting today. The GUS just sounds like a lower specced version of modern hardware.
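To make the "hard panning" point concrete (my own toy sketch, nothing to do with actual Amiga code): Paula wires each of its four channels entirely to the left or right output, while a modern mixer typically applies a constant-power pan law:

```python
import math

def hard_pan(mono, side):
    """Amiga-style: the whole voice goes to exactly one output."""
    zeros = [0.0] * len(mono)
    return (mono, zeros) if side == "left" else (zeros, mono)

def equal_power_pan(mono, pan):
    """Modern-style: pan in [-1, 1], perceived loudness stays constant."""
    angle = (pan + 1) * math.pi / 4  # maps [-1, 1] to [0, pi/2]
    lg, rg = math.cos(angle), math.sin(angle)
    return ([s * lg for s in mono], [s * rg for s in mono])
```

That all-or-nothing routing is a big part of why four-channel MODs have their distinctive wide, slightly lopsided stereo image.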
>And unless I'm mistaken vintage Sound Blaster boards are not exactly in short supply... though I'm not entirely sure about that; it's tough to search for them thanks to the countless scads of products bearing the "Sound Blaster" moniker.
You are mistaken about that: Sound Blaster ISA cards are going for hundreds of dollars on eBay these days due to their rarity. A cheaper clone is certainly welcome.
And baffling, honestly. I'd have thought there were scads of them anywhere you find old computer parts.
But I suppose it's also true that old PCs were discarded and trashed/recycled at very high rates, due to rapid obsolescence driven by a couple decades of Moore's law and the fact that they were fairly ugly and finicky beasts in the first place... so I suppose it makes sense that even if Creative Labs sold a zillion of these back in the day, only a very small percentage survive.
(I tend to hold onto my electronic stuff for a long time, but even I was generally glad to ditch my outdated PC equipment ASAP back in those days.)
For real DIY audio on a PC, try the Covox. It's a resistor-based "DAC" for the parallel port that we used to build as kids before we could afford a proper sound card.
You could also buy a commercial version if you didn't want to solder [1]
I seem to recall that the USB ones don't work for this kind of thing. From my teenage memories, you could bit-twiddle by writing to a memory address. I presume the USB ones use a UART that doesn't allow direct memory access, or precise timing, or something.
The one linked in the parent is a PCI card, so this might not apply.
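For context on the bit-twiddling: on a classic LPT port you wrote unsigned 8-bit samples straight to the data register (0x378 on many machines) at a steady rate, which is exactly the per-sample timing a USB adapter can't give you. The byte stream itself is trivial; here's a sketch of generating the kind of sine table you'd clock out to a Covox-style resistor DAC (the actual port write, outb/ioperm on Linux, is deliberately omitted):

```python
import math

def sine_table_u8(num_samples, amplitude=127):
    """Unsigned 8-bit sine table centred on 128: the raw byte values a
    Covox-style parallel-port DAC expects."""
    return bytes(
        128 + round(amplitude * math.sin(2 * math.pi * i / num_samples))
        for i in range(num_samples)
    )

print(list(sine_table_u8(8)))
```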
That silkscreen had me giggling. The juxtaposition of the text in the requisite serious typeface and the fact that typically only "SERIOUS ELECTRONICS MESSAGES" are communicated in that way, I guess, was the source of The Funny.
I am reminded of the names of B-52's songs on Amiga motherboards. (I had an Intel server board that had the Marmaduke cartoon character screened on it, under the "Slot 2" retainer. Alas, I got rid of the machine and never photographed it.)
I'm most impressed that it uses ZERO surface mount devices. My surface mount rework skills have been getting worse over the past 10 years. (Fuck arthritis, man. Too young for this shit.)
2. This is a Free implementation. 100% of the schematic, PCB and mechanical design is released under CC By-SA 4.0.
So it's for retrocomputing hobbyists and history preservation. It's also pretty educational and can be used as a model for teaching (or self-teaching) electronic design to the new generation without the overwhelming complexity (this isn't limited to this project; it's a common aspect of retrocomputing in general).
i did it for fun, and also because real sound blaster 1.0 cards are going for a lot of money on ebay. i don't actually own an original card, this was reverse engineered entirely from photos.
Wasn't this card one of several that could damage the motherboards that they were installed into? I wonder whether they've reimplemented those details; given that, just like the original, this version has idiosyncratic power management, I wouldn't be surprised.
We burned our 486 motherboard with Sound Blaster 2.0. The computer wouldn't boot when the card was installed, so the next logical step was to first start the computer and then install the card. Didn't quite work out, and dad wasn't happy.
Some background I dug up quickly:
Snark Barker creator's blog: http://tubetime.us/index.php/2019/01/19/sound-blaster-1-0-pr...
http://www.os2museum.com/wp/sound-blaster-the-french-connect...