Symbolics Genera - The Best Software Environment Available (dyndns.org)
51 points by udzinari on Nov 7, 2010 | 45 comments



I have a Symbolics XL1200 that still works (at least, it worked the last time I turned it on, which was a few years ago). I'd be happy to demo it at YC sometime.

It was a fun environment to work in. Although the hardware was very slow by modern standards, the environment was designed to maximize hacker productivity. For example, you didn't have to run a program under the debugger to debug it -- the debugger was always there. Any program that hit an error would drop you into the debugger. One keystroke would then take you to the source for the current function. You could change that function, recompile it, and restart from the point where the function was called.
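
You can get a taste of that workflow in any modern Common Lisp, since the condition system kept the same shape. A minimal sketch (my illustration, not Genera code):

    ;; An unhandled error lands you in the resident debugger with
    ;; restarts on offer; RESTART-CASE makes one such reentry point
    ;; explicit.
    (defun parse-config-line (string)
      (restart-case
          (parse-integer string)
        (use-value (v)
          :report "Supply a value to use instead."
          v)))

    ;; (parse-config-line "oops") signals PARSE-ERROR; from the
    ;; debugger you can invoke USE-VALUE, or edit PARSE-CONFIG-LINE,
    ;; recompile it, and restart the frame.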

The environment was completely open; there was no access control of any kind. There was also only a single address space. Did that make it crash-prone? Remarkably, no. The hardware tagging and bounds checking kept programs from stepping on objects they didn't own. Of course such a design would never survive outside of a research lab, but it made it amazingly easy to hack almost anything in the system.
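
A software analogy for what the hardware did on every memory reference (a toy sketch; the real tag layouts were much richer than this):

    ;; Every word carries a type tag; the machine traps on a
    ;; mismatched access.  Emulated here with a checked accessor.
    (defstruct tagged-word tag value)

    (defun checked-ref (word expected-tag)
      (if (eq (tagged-word-tag word) expected-tag)
          (tagged-word-value word)
          (error "Tag trap: wanted ~S, got ~S"
                 expected-tag (tagged-word-tag word))))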


I don't know that I'd agree that it wouldn't survive outside of research. In recent years, there's been a huge push for pure-managed operating systems, where everything runs in one address space and depends on the compiler to ensure security and reliability. While no such system is in production use (to my knowledge), the potential is incredible. We'll see how it pans out in the next few years.

(Full disclosure: I started such an OS, Renraku, so I have a bit of a vested interest in the success of such ideas)



Indeed, OS/400 is just wonderfully advanced under the hood. Get one of the cheap used copies of this book for a lot more on it: http://www.amazon.com/Inside-AS-400-Frank-Soltis/dp/18824196...


Hi Scott, didn't see this comment before I posted mine (take a look at it if you have the time). Thank you for writing Zeta-C and proving the concept of reasonable C->Lisp compilation, and for making the code public domain.


You're welcome. I hope it makes for an interesting curiosity -- I'm not sure it can be much more at this point :-)


Wikipedia has more information for people (like me) who haven't heard of Genera before: http://en.wikipedia.org/wiki/Genera_(operating_system)


Lisp Machines are something that you think is really cool when you first learn about them, then you come to the realization that pining for them is a waste of time.

I've had a flash of inspiration recently and have been thinking about Lisp Machines a lot in the past three weeks.

But first, a digression. There's an important lesson to be learned about why Symbolics failed. I think Richard Gabriel came to the completely wrong conclusion with "Worse is Better" (http://www.dreamsongs.com/WorseIsBetter.html). There are two reasons why:

1. Out of all the LispM-era Lisp hackers, only RMS understood the value of what's now known as Free Software. (If you haven't read it yet, read Steven Levy's Hackers - it describes the MIT/LMI/Symbolics split and how RMS came to start FSF and GNU).

2. Portability is really important.

The key lesson to draw from Unix isn't that "Worse is Better," it's that survivable software is Free and portable. Free because getting software onto someone's hard drive is 80% of success, and portable because you don't know where people will want to use your software (there are some really weird places).

Symbolics was neither. If Genera had been Free Software, it would by definition still be around today. If Genera had been portable, it's likely Symbolics would never have gone out of business (the Alpha virtual machine would have been done sooner, with fewer resources, and for more systems).

Being released as Free Software today wouldn't help. Genera's predecessor, MIT CADR, was made available under an MIT-style license in 2004 (http://www.heeltoe.com/retro/mit/mit_cadr_lmss.html). There's a VM emulator which runs the code. The whole system is pretty useless.

Now on to the inspiration part:

It's possible to make a very high-performance, portable Lisp operating system on modern hardware. This has been a possibility ever since the Pentium came out. The main bottleneck to conventional Lisp runtime performance is the way operating systems manage memory allocation and virtual memory.

A type-safe runtime that has control over memory layout and virtual memory, and is aware of DMA, can provide extremely high throughput for allocation and GC (this has been shown by Azul's Linux patches for their JVM), true zero-copy I/O, almost optimal levels of fragmentation, and excellent locality properties. If you go single address space (and there's no reason not to) and move paging into software (object faulting and specialized array access), you've also eliminated TLB misses.

Throw in the fact that it now becomes trivial to do exokernel-type things, like caching pre-formatted IP packets, and it should be possible to build network servers with throughput many times that of anything kernel/user-space split OSes like Linux or FreeBSD can achieve for dynamic content (i.e., not just issuing DMA requests from one device to another).
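
To make the packet-caching idea concrete, here's a toy sketch in ordinary CL (field offsets per the IPv4 header layout; nothing OS-specific):

    ;; Keep a pre-formatted IPv4 header template around and patch
    ;; only the variable fields on each send.
    (defun make-packet-template (length)
      (let ((pkt (make-array length :element-type '(unsigned-byte 8)
                                    :initial-element 0)))
        (setf (aref pkt 0) #x45         ; version 4, 20-byte header
              (aref pkt 8) 64           ; TTL
              (aref pkt 9) 6)           ; protocol = TCP
        pkt))

    (defun patch-identification (pkt ident)
      ;; the IP identification field lives at bytes 4-5, big-endian
      (setf (aref pkt 4) (ldb (byte 8 8) ident)
            (aref pkt 5) (ldb (byte 8 0) ident))
      pkt)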

The only problem is device drivers. Lisp doesn't make writing device drivers any more fun, or reduce the number of devices you have to support.

What to do?

The reason I've been thinking about this is that I came across this: http://www.cliki.net/Zeta-C

I've heard of Zeta-C multiple times before, but for some reason this time I made the connection - "why not use Zeta-C to compile an OS kernel?"

I explored the idea further, and it seems to me that it wouldn't be an unreasonable amount of work to take the NetBSD device subsystem and have it running on top of a Lisp runtime with the necessary emulation of those parts of the NetBSD kernel that the drivers depend on. If you don't know, NetBSD's device drivers are modular - they're written on top of bus abstraction layers, which are written on top of other abstraction layers (for example, memory-mapped vs port I/O is abstracted). So the actual system twiddling bits can be neatly encapsulated (which isn't necessarily true for Linux drivers, for example).
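
To sketch what the Lisp side of such a shim might look like (names loosely modeled on NetBSD's bus_space(9); the %MEM-REF-32 and %PORT-READ-32 stubs stand in for hypothetical runtime primitives):

    ;; Each bus tag dispatches to memory-mapped or port I/O, so a
    ;; driver compiled via Zeta-C goes through a single entry point.
    (defstruct bus-space-tag kind)      ; :memory or :io

    (defun %mem-ref-32 (addr)           ; stub for a runtime primitive
      (declare (ignore addr))
      0)

    (defun %port-read-32 (port)         ; stub for a runtime primitive
      (declare (ignore port))
      0)

    (defun bus-space-read-4 (tag handle offset)
      (ecase (bus-space-tag-kind tag)
        (:memory (%mem-ref-32 (+ handle offset)))
        (:io     (%port-read-32 (+ handle offset)))))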

I'm aware of Movitz (http://common-lisp.net/project/movitz/) and LoperOS (http://www.loper-os.org/). Movitz makes the mistake of trying not to be portable, but there are useful things there. I haven't spoken to Slava about this yet, so I don't know what's going on with LoperOS. I am also aware of TUNES, and think it was an interesting waste of time.

The main thing is to get Zeta-C to work on Common Lisp. The next step is to build a new portable, bootstrappable runtime (I think the Portable Standard Lisp approach of a SYSLISP layered on top of VOPs is the right way to go), and either build a compiler targeting that runtime, or adapt the IR-generating parts of one of SBCL, CMUCL, or Clozure. Further bootstrapping can be done with SWANK and X11 once a basic networking stack is in place. I think such a system would be quite fun to hack on.

If you've gotten this far, let me know what you think about this idea. I also have some preliminary thoughts about how this could be worked into the base of a new high-performance/scalability transactional database startup; if you want to hear about that, email me: vsedach@gmail.com


Alas, porting Zeta-C to Common Lisp is not nearly as easy as one might expect. The problem is that a lot of C code makes assumptions about how the compiler is laying stuff out in memory -- for example, code might write a word through an int pointer, then read it back as bytes through a byte pointer. To support this kind of thing, Zeta-C used displaced arrays with different element types onto the same memory locations, which the Symbolics hardware supported but which can't be done in Common Lisp. (It didn't even quite work on the CADR and Lambda, as they were 32-bit machines; it requires that the tag bits be in addition to the low-order 32 bits of a word, as they were on the 36- and 40-bit Symbolics hardware, rather than borrowed from them.)

I suppose, though, that if there's only one chunk of C code you care about compiling this way, there is a chance you could get lucky, in that the code might not do anything that wouldn't work in CL.


Yeah, I think you mentioned that before on TUNES or somewhere else. If ints and bytes are the biggest problem, I don't see how that is a big deal - pull the int out of the array and shift/mask it to get the byte. Or am I missing something here? What I was thinking was going to be hard was structs (although most C code tends not to abuse those too much, because padding issues tend to be pretty visible disincentives - but then again there's tons of code that does stupid shit that's unknowingly targeted to GCC exclusively).
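
Something like this, I mean (a quick sketch, assuming a word-organized store of (unsigned-byte 32) elements and little-endian byte order within a word):

    ;; Read and write bytes out of word-organized memory by
    ;; shift/mask.
    (defun byte-ref (words byte-index)
      (ldb (byte 8 (* 8 (logand byte-index 3)))     ; bit position
           (aref words (ash byte-index -2))))       ; byte-index / 4

    (defun (setf byte-ref) (value words byte-index)
      (let ((widx (ash byte-index -2))
            (pos  (* 8 (logand byte-index 3))))
        (setf (aref words widx)
              (dpb value (byte 8 pos) (aref words widx)))
        value))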


pull the int out of the array and shift/mask it to get the byte

The problem is, you would have to be prepared to do this anytime you're dereferencing a byte pointer. You wouldn't know at compile time whether the pointer points into byte-organized or word-organized memory.

You're right about structs, too.


I still don't see what the problem is - you know the type of memory array you're referencing into (int, byte, whatever), and you know the type of pointer you're dereferencing (byte). So is the problem that it will be slow? In portable CL, yes, but there are always ways to get at the implementation's array-accessing internals and fake type-displaced arrays (esp. if you have control over the internals, as this hypothetical project would).
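
For example (an SBCL-specific sketch, not portable CL - SB-SYS is an internal package, which is exactly the kind of under-the-covers access I mean):

    ;; View a specialized word vector's storage as raw bytes via a
    ;; system-area pointer; pinning keeps the GC from moving it.
    (defun word-vector-byte (words index)
      (declare (type (simple-array (unsigned-byte 32) (*)) words))
      (sb-sys:with-pinned-objects (words)
        (sb-sys:sap-ref-8 (sb-sys:vector-sap words) index)))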


Well, that's a good point. In these days of high-quality open-source implementations, games can be played under the covers that would be impossible in portable CL.


Steven Levy basically repeats Stallman propaganda. Others have different memories about that time.

Also, Stallman wasn't the only one promoting free software at that time. For example, the Common Lisp community benefited enormously from what was done at CMU. CMUCL/Spice Lisp was released as public domain software and has been used widely in derived implementations, like SBCL, and even in commercial software. There were several other prominent Lisp hackers from that time who promoted free software.

Stallman did a lot, but please don't make him a saint - people like Scott Fahlman had much more impact on the Lisp community.


I'm not making him a saint. Stallman completely abandoned the Lisp community at that point - he won't even talk about Lisp now (at least he told me to go away last time I tried talking to him about it).

What I'm saying is Stallman is right about the value of Free Software. CMUCL and BSD basically got released the way they did because they were university research projects that failed to get commercialized. Stallman is really the person that came out and said Free Software should be a priority in and of itself and not an accident. That totally changed the way you can think about software, and you shouldn't discount that.

Another thing I respect about Stallman is that he tends to be right about longer-term trends. I don't bet against him anymore.


CMUCL was a research project, financed by DARPA. Commercializing it was not the main goal. CMUCL got funding for many years (until 1994), and it was already open source during that time. CMU promoted 'free' software in a huge way across many projects. They also provided the CMU AI Repository, which collected all available university research software in AI, including language implementations.

It was not the Stallman version of 'free' software with a viral license.

Most people fail to understand the role of DARPA funding. Stallman worked in a DARPA-funded (= military) lab - the MIT AI Lab. He worked basically on new software infrastructure for the military. They funded it to get intelligent software for guided missiles, logistics systems, missile defense systems, simulators, training systems, ... The largest batch of Lisp Machines went into the Star Wars project. Promotion of 'free software' was not a DARPA priority, but DARPA was interested in getting these tools widely used in universities. DARPA financed a batch of a hundred Lisp Machines (IIRC), for example, and gave them to universities. At the same time they promoted competition, to have more than one source for the technology - so LMI and TI also got money.

Stallman has his definition of 'free software', which shows in the GPL license. Others have their idea of free software which shows in the use of 'Public Domain' as the license. To me the Public Domain Lisps were always more important than the GPLed ones.


Doesn't that reinforce his point? CMUCL & SBCL were released as free software and are still around. Genera isn't.


I haven't said anything against his point. It is just that claiming that Stallman was the only one 'getting free software' is bullshit. He did a lot, but in the Lisp community there were many others who promote(d) free software.

Also, Stallman did promote Free Software, but his promotion of Lisp was mostly limited to Emacs Lisp. There was an attempt with Guile, but that was mostly it. He had worked with the Lisp Machine, but once he was out, that topic was finished.

Anyway, I find a few points are right and several others not. That Symbolics would not have gone out of business if Genera had been free software and portable is speculation. It is neither necessary nor sufficient. For example, Franz is still around, though Allegro CL is not free software. There are lots of free software / portable Lisp implementations which are as dead as Genera, or more than dead. I agree that it might help, but not more.

I agree with the following: portability and 'free' can help with survival.

I don't think adding C to Lisp is of much help. The mismatch is huge. There are two options: 1) repeating the design of the past (with the same problems, maybe slightly improved). There has been lots of talk and little action. It's difficult because we don't have the hackers with the knowledge and time for that. 2) doing a new design which gets rid of the limitations of past designs (single user, single image, no security, ...). That's even more difficult.


"That Symbolics would have not gone out of business if Genera would have been free software and portable is speculation."

Two different things. I was claiming that Genera would still be around if it were Free Software - that's true by definition. Whether Symbolics stayed in business has nothing to do with it.

A totally different thing is that I think Symbolics stood a very good chance of surviving if it had made its software more portable. It's a good thing you mention Franz, because they're almost as old as Symbolics and still around, largely because their system is easily portable. They don't waste immense amounts of engineering resources on supporting platforms, but dedicate them to improving features and performance and, more importantly, responding to customer needs.

"1) repeating the design of the past (with the same problems and maybe slight improved. There has been lots of talk and little action. It's difficult because we don't have the hackers with knowledge and time for that."

It's difficult to do because when you get down to it, Zetalisp sucked for systems programming. Portable Standard Lisp, Interlisp, CMUCL/Allegro/Clozure (anything with LAP and VOPs) provide a much better model for how to build a Lisp runtime.

It's also difficult to do because people like to waste their time imagining how awesome hardware type-checking on FPGAs would be and how the new Lisp OS will have super-duper database-backed storage and network transparency and etc. etc.

"2) doing a new design which gets rid of the limitations of past designs (single user, single image, no security, ...). That's even more difficult."

The key is the namespace. Symbol identity needs to be preserved across users/processes, symbol values don't. This is actually a much simpler problem than you assume.
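
Roughly, as a toy model: one shared intern table gives symbols their identity, while each process keeps its own value table.

    ;; Symbols are shared (one intern table); values are per-process.
    (defvar *names* (make-hash-table :test 'equal))  ; name -> symbol

    (defun intern-shared (name)
      (or (gethash name *names*)
          (setf (gethash name *names*) (make-symbol name))))

    (defstruct proc
      (bindings (make-hash-table :test 'eq)))        ; symbol -> value

    (defun value-in (proc symbol)
      (gethash symbol (proc-bindings proc)))

    (defun (setf value-in) (new proc symbol)
      (setf (gethash symbol (proc-bindings proc)) new))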


Zetalisp sucked for systems programming? That's a pretty strong word. Have you ever used it at all?

I doubt that Allegro CL is easier to port than Open Genera.


As a means to the end of getting a working system done sooner rather than later, and most especially supporting a zillion device drivers that will never be ported to a Lisp, C in Lisp may make sense.


Only if you can integrate it in a useful way and reuse C code from other places. Which I doubt is possible.


We shouldn't cherry pick only the examples which support our position. E.g. Allegro Common Lisp has traditionally been about as unfree as possible (e.g. $$$ required to distribute solutions to your customers (for the Lisp runtime)) and it's still around. LispWorks is three years older and is still around.

Also, would the MIT Lisp Machine system or Genera have gotten as good as they did as fast as they did without outside funding (government and VC respectively)?

Or look at the failure of GNU to produce a kernel.

My general point here is that all this is much more complicated than a simple "free survives, unfree doesn't" ... although I'd need a few good examples of unfree dying to round it out (good as in perhaps "was big, then totally failed").


"E.g. Allegro Common Lisp has traditionally been about as unfree as possible (e.g. $$$ required to distribute solutions to your customers (for the Lisp runtime)) and it's still around."

Franz started only a few years after Symbolics. I believe one of the key reasons they're still around is because their Lisp system is easily portable.

"Also, would the MIT Lisp Machine system or Genera have gotten as good as they did as fast as they did without outside funding (government and VC respectively)?"

Stallman was totally right about Symbolics though - they took all of their Lisp work and most of the Lisp community to their grave. There wasn't a lot preventing them from making their software more portable or open, but they made a business decision not to do that, which proved to be disastrous in the long term.

"Or look at the failure of GNU to produce a kernel."

That's because there's already other, better Free Software alternatives out there. Why work on HURD when you can work on Linux or one of the BSDs? And it's important to note that it's a continuing failure - because the project is Free Software, people can continue to work on it if they want.


Symbolics had also developed a portable Lisp, but shelved it. The developers then continued it externally, where it was developed and sold as Lucid CL. Lucid CL was highly portable, but that company did not survive either.

Symbolics' problem was not that their software was not portable. Open Genera is a VM and ran on the DEC Alpha; it would have been possible to port it to other 64-bit machines.

But the business went away. The thing was too costly. It was also not just a Lisp implementation, but a portable OS. As an OS it simply did too much: its own networking, its own window system, ...

The reason why Franz is still around and Symbolics is not, is not a question of portability - it is because Franz's Lisp is just a Lisp implementation and concentrates on that, whereas Genera / Open Genera is an OS, which simply does too much. Why run a TCP stack in Lisp on a machine which already has a TCP stack? Why run a window system on a machine which already has one? Plus, the rest of the machine does not benefit from it - it would be different if the other software could use the Lisp TCP stack, but it can't.


Saying that Genera was portable because Open Genera was there is like saying that MS DOS is portable because there's QEMU.


Or like any other language implementation with a language-oriented virtual machine: Java/JVM, CLISP, Squeak, ...


But Open Genera isn't a language-oriented VM, it's an Ivory emulator. There's a huge difference.


The Ivory is a language-oriented processor - not a computer. It's a CPU, a chip. Where is the difference? Open Genera does not emulate an Ivory-based computer, but mostly the processor: it emulates the instruction set.

http://pt.withy.org/publications/VLM.html


BTW I have a different theory about why Lisp Machines failed in the market. Symbolics and LMI were spoiled by the high prices the machines originally commanded (under the influence of the AI bubble and DoD money) and didn't realize how aggressive they needed to be about making them cheap. When the Sun 4/110 came out, it was about 1/5 the price of a comparable Lisp Machine, and faster. Had LispMs been price-competitive with workstations, they would at least have lasted a few more years. Of course, in the end even the workstation vendors succumbed to the marauding hordes of PCs.


I think the fundamental problem of Symbolics was their strategy as a systems company. They were a half-assed hardware company that could not keep up in price or performance and didn't even bother to let others develop system software for it, and they were a half-assed software company that didn't bother to port their software to other computers (a $10,000 plug-in board for a Macintosh does not count as a port).

They were also a pretty good computer graphics and computer algebra software company, and both those divisions got sold off and continued their work for a number of years after Symbolics itself went out of business.

It's interesting to note that by the time of Symbolics' demise (I'm placing it at 1993ish, although that's probably not accurate), even IBM realized that developing systems as integrated software/hardware architectures was not feasible.


One word: Apple

It's feasible.


Apple has never developed their own hardware architecture.


You must be using a very narrow definition of "hardware architecture". In the 68K era Apple used to custom-design pretty much everything interesting in the box except the processor and memory. The IIfx was perhaps the most extreme example (several custom ASICs and two 6502-based I/O coprocessors, IIRC). For the Newton we even designed parts of the processor (e.g., the MMU).


Let's take your example of the IIfx: an off-the-shelf 68k processor on NuBus. ASICs and I/O coprocessors are just devices. Granted, I didn't know about the custom processor work on the Newton, but if the IIfx counts as a new hardware architecture, then so does the x86 move from ISA and the 8259A to PCI-E and APIC. And I don't think it makes sense to argue that today's Macs have anything resembling a custom architecture.

Symbolics' systems and the first AS/400 models (the last new hardware/software architecture developed by IBM) are in a totally different class.


Fair enough: sounds like your definition of "hardware architecture" is limited to the processor itself.

If you're curious about the Newton processor: http://www.ot1.com/arm/armchap1.html (see the "ARM6" section).


Well, processor and memory (I certainly think NUMA machines are different), everything else is pretty much just I/O.

Thanks for the link.


Indeed; whatever mistakes these companies made (also include the TI Explorer), they were zapped first by 32-bit microprocessors (a sustaining technology, but not one adopted quickly enough by Symbolics - plus they picked a bad initial platform to put their microprocessor on) and then by PCs (a classic disruptive technology).


Symbolics had a DOS board for their Lisp Machines. The Ivory ran embedded in Macs and Suns. They had a PC-based offering: CLOE.

It did not help.

Genera's basic limitations - complex interwoven software with lots of legacy stuff, single user, single image, no security, ... - meant it just did not scale and was not on a growth path.


There is no such thing as the "best" software environment, any more than there is a best car. It all depends on what you're trying to do and what your priorities and constraints are.


That's the dogma, but is there a reason for it to actually be true? All we know for sure is that there isn't a clear winner yet.


Software environments, languages, and other such abstractions aren't there to be perfect. They are there to help human beings manage absurd numbers of machine instructions. Maybe when we fully understand the human mind, we'll figure out the perfect way to map machine instructions to concepts that humans can grok. Until then, we'll have to make somewhat subjective design decisions in the face of uncertain constraints and priorities, which is certainly not going to lead to a "best" solution.


Not dogma, but the conclusion of reason and experience. You don't build a bridge and a house with the same tools. Why should software be different?


If you know your priorities, constraints, and goals (what you're trying to do), then is there such a thing as a best software environment?


That page is giving a 403 error now. What am I missing?



