
Lisp had a different purpose. It wasn't developed as a systems programming language and wasn't used much as one, though it was used for some very specialized machines from the mid 70s to the early 90s; see the Lisp Machine Manual. If you read the manual, you'll see that it was initially developed for a very narrow group of people: high-end personal workstation users in research & development. This required very deep pockets: when this manual was published (1984), these machines cost from roughly $70k upwards. Most of them were bought with government money (i.e., DARPA funding).

C spread because it was a low-level systems programming language and various operating systems were written in it.

The Lisp Machine software OTOH was written in a high-level language with special requirements for the hardware: tagged memory, a stack-based instruction set, a large memory space, garbage collection support in hardware, ...

Thus the software was not stripped down to a minimum and not very portable - which Unix and C were.



The hardware that ran the software described in this manual didn't have tagged memory or garbage collection support in hardware and the real instruction set was similar to any RISC CPU.

Somebody could have produced a workstation on conventional hardware that just ran Lisp, but nobody did. Maybe Tektronix would have been a good candidate: their 4400 series wasn't sold to run UNIX; those machines ran either Smalltalk or Franz Lisp on top of a minimal OS.


> The hardware that ran the software described in this manual didn't have tagged memory or garbage collection support in hardware and the real instruction set was similar to any RISC CPU.

Then you should check the architecture of those machines some time: MIT CONS, MIT CADR, Symbolics LM-2 (a repackaged CADR), Symbolics 3600, LMI Lambda, ...

http://www.bitsavers.org/pdf/mit/cons/TheLispMachine_Nov74.p...

Page 5: 1 GC bit, 1 User bit, 2 cdr code bits, 5 bits data type, 23 bit pointer.

Looks to me like a tagged CPU architecture...

The MIT CADR Lisp Machine was a stack architecture with 24-bit data and 8-bit tags: six bits for data type encoding and two bits for compact lists. The CPU does type checks on operations, ...

It was nothing like a RISC machine; RISC designs for Lisp (Symbolics, Xerox, SPUR, SPARC, ...) were researched in the mid-to-late 80s, a full decade after the architecture of the MIT CONS Lisp Machine.


I am well aware of the architecture of the MIT CADR, LMI Lambda and TI Explorer, the Symbolics lispms less so, but they are not the subject of this thread. I have written CADR microcode recently.

None of the features you list are constrained by the architecture of the hardware; they are just conventions of the software VM running on it. Would you suggest that the X86 is a tagged CPU architecture just because SBCL or a JVM use tags?

>Page 5: 1 GC bit, 1 User bit, 2 cdr code bits, 5 bits data type, 23 bit pointer.

This is not true for the software that matches this version of the manual. System 99 used 25 bit pointers, there wasn't a GC or user bit. The change from the earlier word format was possible because this was not fixed in hardware.

The CADR microinstruction set is load/store with regular opcode fields, it is very much like an early RISC.


If the X86 provided SBCL with such instructions and data formats, it would be a tagged architecture, but it doesn't. The SBCL compiler outputs conventional X86 instructions.

The Lisp Machine compiler OTOH generates instructions for a mostly stack machine, which runs on the CPU in microcode.


The Lisp compiler described by this manual can output microcode directly.

Please don't assume that all Lisp Machines were the same as Symbolics ones.


Please don't assume that the Lisp compiler on some Symbolics machines could not output microcode. It could, IIRC.

But that was not what a Lisp developer would normally do; he/she would use the compiler in such a way that it outputs the usual machine code, not microcode.


Computer Architecture is not defined by what a Lisp developer normally would do.

What hardware features of the CADR do you feel provide support for GC and tagged words ?

I have built the CADR microcode from the same source to use both 24 and 25 bit pointers, it is just software.


Whether microcode is hardware or software is a blurred distinction. Remember, when microcode was introduced in the 1960s, it was used to implement in software the same things that other models of the same computer family did in hardware. With microcode, a vendor could offer different machines at different price/performance points. A sequential circuit can implement an algorithm; so can microcode.


The CADR is an example of a computer that implements a virtual machine in software, what makes it special ?


Computer architecture on the user level is defined by the data format and instruction set the CPU offers. How it is implemented is another level. I don't know how some Intel i7 is implemented, but it probably has writable microcode and some very different architecture inside.

That Intel hides the microcode and the CADR didn't is just another detail.


You still haven't identified what hardware features you think can be emulated in software on a CADR but couldn't be emulated in software on a 68020.



