Review of existing Languages
There are a lot of these, aren't there?
I'd love feedback from you, particularly if you know any interesting
language not (or poorly) considered here...
Also feel free (=bound :-) to add your comments and to correct broken
English and typing mistakes.
Language-related WWW pages
Some "paradigms" of programming
Programming with Proofs
Resource management is a central topic, and any language implementor should
know about Garbage Collection (GC) techniques.
Run-Time Code Generation
Combining several languages
The reference about [MIA]
CORBA is the industry standard
architecture for having programs written in different languages cooperate.
ILU, or Inter-Language Unification, is a project to allow interoperability between programs written
in different languages.
The Eli metacompiling system.
The poplog language system combines POP-11, prolog, LISP, SML, and more.
This classification is stupid and approximate, and should be completely redone,
perhaps with an array of "features" to serve as a guide... Of course,
the only way to know a language for sure is to see how it really
is... Also, many things here are out of date. Updates welcome...
Languages are no longer classified into bogus "families".
In the above list, here is the meaning of the parenthesized flags for the considered languages:
!<x> means that feature <x> is known not to be supported
-<x> means that feature <x> is known to be supported only in an awkward
and limited mode
(C) indicates that at least one free implementation of the language can
be seamlessly interfaced to low-level C calling conventions (which implies
lots of runtime safety limitations on the implementation). Such a language
is often called "embeddable", because you can embed it into almost any application,
a low-level interface being the current standard.
(!C) indicates that no free implementation of the language can be seamlessly
interfaced to low-level C calling conventions
(c) indicates that at least one free implementation of the language can
be portably extended with C runtime code. This can be safely assumed for
most languages, C being the one standard low-level language. Only the opposite,
(!c), will be specified.
(A) indicates that applicative lambda-calculus is fully supported as part
of the language
(!A) indicates that the language just can't express lambda-calculus as
a first-class citizen
(M) indicates a well-done module system
(!M) indicates no well-done module system
(!=) indicates that the language is not even Turing-equivalent
(K) indicates that structured data are first-class values
(!K) indicates that structured data are not even first-class values
(T) indicates that non-trivial strong typing is supported
(t) indicates that only trivial strong typing is supported
(!T) indicates that strong typing is faked
(!t) indicates that strong typing is not even supported
(R) indicates clean support for reflection
(r) indicates that some safe mechanisms for reflection are available
(!R) indicates that dirty stuff for reflection (e.g. backdoor evaluation)
only is available
(!r) indicates no support for reflection at all
When Tunes is ready, this page will be made a query-driven database (with
standard query forms) where language/implementation couples will be classified
according to the characteristics below. Meanwhile, please forgive the bad quality
of this page, and think about enhancing it with your contributions.
What are the aims and practice of the language/implementation?
What kind of language specifications are available? Are they formal specs?
Are they some ISO/ANS/IEEE/whatever standard?
How compatible is the language with anything? How fast does it evolve?
Is it free, or do some corporations/institutions tightly control it? Is
it burdened by a reference implementation and legacy code whose quirks
must be emulated?
What kind of language differences and extensions does the implementation
support or not support?
What libraries are available? Which are distributed with the implementation?
Which are standard?
What global control constructs are available? Procedures? Recursion? First-class
procedures (=functional programming)? First-class Continuations?
Does the language have lexical scoping? Does it support call-by-name? call-by-value?
How is evaluation order defined? Are there parallel constructs?
Is the language referentially transparent? Does it support uniqueness annotations?
Does the language support a pure programming style? Does it support side-effects?
What kind of typing does the language have? strong typing? recursive types?
type-dependent types? value-dependent types? existential types? resource-annotated
types? reflective types? formal verification methods?
What kind of pattern-matching does the language have? recursive patterns?
Does the language have backtracking? Reflective patterns?
How extensible is the syntax? Are there hygienic macros? Can the fixity of
operators be chosen?
Is the language fully reflective?
What support does the language offer for encapsulation? Is it
possible? Standard? Higher-order? Semantically simple?
Does it have some kind of ad-hoc polymorphism? Does it use single-dispatch?
multiple argument dispatch? Is it statically dispatched? Dynamically dispatched?
How many implementations does the language have?
Now consider each implementation of the language.
How is the implementation available? Does it run on available platforms?
Is it free? Free to education or non-commercial users? Are sources available?
Are modifications freely redistributable?
What execution model is used by the implementation? Interpreted text? Interpreted
syntax tree/graph? virtual-machine code? straightforward assembly code?
optimized assembly code?
Is the implementation multithreaded?
What kind of garbage collection and automatic resource management does it have,
if any? How does that interfere with features like real-time response, persistence,
transactions, finalization, etc?
How well does the implementation support recursion? tail recursion?
How upwardly and downwardly scalable is it?
What does the syntax generally look like? Is it prefix? postfix? infix?
How redundant is it to read? To type? Are there cleanly lexically nested
blocks? Are there available structured editors?
Does the implementation support separately-compiled modules? Dynamic linking?
Dynamic compilation? User Interaction?
How can the implementation be extended? Can it interface the whole system?
What external services is it already interfaced to?
Can the implementation be used to extend existing programs written in another
language/implementation? what are supported language/implementation interfaces?
What control does the programmer have over the speed/quality of the compiler?
Per-file command-line options? Hints in source code? Feedback from the compiler?
What are known bugs and limitations of the implementation? How more reliable
can it be expected to get?
How robust is the implementation? Has it been proven correct? How thoroughly
has it been tested? Is it using an open development model to speed up improvement?
What are the other implementations of the language? How different are they?
For each feature, how well is it supported? would you normally want to
use that feature, given the language/implementation? Would you use that
language/implementation, given the feature?
It should be possible for implementations to point to the language dialect
used, for language dialects to point to their main language family, for language
families to point to groups of languages, etc, with implicitly inherited
or explicitly modified properties.
ABCL is a family
of languages developed by
Yonezawa and his lab for "object-oriented concurrent programming".
These languages are based upon a LISP or Scheme core with primitives for
object-oriented concurrency. The language was further extended to use
reflection to manage distributed computing and other language extensions
in a way that is as seamless as possible to the user.
There are a lot of things to say about ADA. Anyone care to write a review?
I haven't done a comprehensive web search yet, but have been given this:
ADA was originally chosen as US DoD's language, and was meant to fulfill
Agora is a prototype-based OO language à la SELF,
that is used to explore reflective features (notably mixins).
Could anyone have a deep look and send insightful feedback?
Here's an FTP site,
and now a
Alma is a strongly-typed imperative language with constraints and quantifiers.
It is an extension of a subset of Modula-2, and supports declarative programming.
An Alma-0 compiler and syntax definition are freely available.
The reference implementation is the commercial
BETA system; educational institutions may get discounts.
There is now a free interpreter,
BETA is an OO language from
those who invented SIMULA. It is the latest advance in OO programming.
This entry should definitely be updated and split into C, C++, and Objective C.
"C" is a low-level, weakly typed, static, non-modular, non-generic, non-reflective language.
"C" was designed in the late '70s as a portable assembler for the PDP series
(remember? nominally compatible machines but with various word lengths),
and it is still the state of the art in that area, although, as a systems
programming language, it is more adapted to the PDP-11 than to modern architectures.
In the late '80s, the small and elegant `K&R'
C became the prey of an ANSI standards committee; it has now become
ANSI Standard C.
In the '80s, C has (in our opinion wrongly) become the language of choice
for general purpose programming. In an attempt to add the features of general-purpose
Object-Oriented languages to C, Bjarne Stroustrup created C++, a huge,
bloated, hack-ridden monster. The name C++ is a pun that means that one
huge bloated thing was added to C, but the value of the result is no more
than that of C.
Writing the formal semantics of C++ is left as an exercise to the reader.
The only positive point about it is that it is the one widespread standard
as a programming language.
There are also other, less popular, "object-oriented" versions of C, such as
Objective C (a kind of Smalltalk with low-level insecurity, but full
C syntactic and semantic interoperability; I don't really know, but I
assume that some reflective and dynamic aspects of Smalltalk have been
left out), or
Dynace. (I admit I haven't
seriously inspected these languages, but should.)
About C, get the C Bible by Kernighan & Ritchie (the original authors
of C). There's also an online tutorial by Dave Marshall.
About C++, just shoot(yourself.foot++); or if you must, learn to program
in C++ with the Globewide
Network Academy (GNA)'s Introduction
to Object Oriented Programming Using C++.
Look on prep.ai.mit.edu or any mirror site to find GCC, the GNU C Compiler,
a free C/C++ compiler, and a lot of utilities written in C. Also, experimental
features are being actively developed in egcs,
whose releases are separate from the allegedly more "stable" gcc 2.8.
A simpler free compiler is lcc.
There is a free embeddable interpreter, EiC.
Here are papers about why not to use C or C++ in general.
This one is particularly meaningful as to what a language should be (and what
C/C++ is not).
LC-Lint is the best existing
typechecker for C.
Ctools, part of issue one of
[MIA] ~twaddle (pronounced
"twiddle twaddle"), might also prove useful.
Everybody knows this language, and we wouldn't have to adapt to another
language's syntax, as it is the one (now ANSI) standard in system programming.
We may compile test programs immediately, without having to wait for the
kernel and compiler to be done.
Actually, C and C++ are not high-level languages, and are just worth
throwing into the nearest garbage can (poor garbage can), unless you're stuck
with them, which you are on existing systems (themselves worth as much, but
again, you're stuck with them until Tunes comes out).
Note that "C" offers absolutely no particular interest when cut off from all
its standard compilers and libraries, for you can no longer port existing
software. As TUNES won't support any standard "C" library (at least not as
support for native code), and requires a different compiler anyway, because
its semantics is so different, "C" offers no interest on top of TUNES.
Thus, "C" just has nothing to do with TUNES, except perhaps as a layer
between TUNES and a POSIX-compliant system, if TUNES is to be implemented
as an application under such a system (see OTOP).
C/C++ do not include functions as first-class objects, thus creating a
boundary between using and programming: it's a static language;
you can't both program and use at the same time. That's the opposite of
programmer/user friendliness.
C/C++ is not a structured language: procedures are all global; that's why
C/C++ will NEVER allow having independent lightweight threads, and why it's impossible
to have a lightweight multitasking system under C/C++. You may notice that
this point is related to the preceding remark: if procedures were objects,
you could include them individually inside thread objects, and each thread
would then have its own independent code.
C/C++ knows only early binding (i.e., it only compiles directly executable
code), and hardly knows about dynamic library linking (in the case of C, it's
not a language feature, only a linker feature); C/C++ considers a program
to be complete, finished, and run in the absolute, without interaction with
other programs except through the system; that's why all ROI in C must be
explicitly done through system calls!!!
The "we won't have to rewrite a compiler" argument doesn't stand: if the
system is to be OOed, we'll have to adapt the compiler so that it produces
OO code compliant with our system's requirements. Unless our system brings
nothing that can't be done easily by replacing the standard library, this is
impossible; so only front ends can be reused, and these are trivial to write
(although more difficult in the case of C than with other languages). All
in all, "C" would only get in the way if used.
As it's a low-level language, either we'll have to have low-level specs
for the OS (as with Unix), so that we can't adapt to an architecture different
from the one for which the OS was designed; or we'll have to rewrite a
great amount of the C/C++ code on each system adaptation. So we lose either
system functionality or language portability. Interoperability of code
is also impossible, unless we define a common target architecture that
other computers will have to emulate.
For the same reason (C/C++ being low-level), we can never achieve object-level
security, except by having a C/C++ program for each object, which either prevents
everything -however little- from being an object in the system, or requires many
a low-level system (as opposed to language) declaration in a C source,
and/or requires having tiny C/C++ programs, which contradicts the heavy
compilation cost and the class philosophy of the language.
It is remarkable that most security alerts in operating systems are related
to buffer overruns in C code, because the language doesn't help in any
way to automatically ensure safety -- unsafety is the default, and manually
ensuring it is a long, tedious, difficult task. A higher-level language
could fully automate safety, making it the default, so that only
unsafe operations would have to be explicitly coded.
Horrible in C and dreadful in C++ is the lack of really first-class structured
values. This makes C nothing more than a badly specified portable assembler
(with very limited macro facility, and restrictions in calling conventions),
and gives a completely obfuscated notion of object identity to C++.
The C++ philosophy contradicts the idea of late user-object binding. C++
knows only of early compile-time object binding, or at best/worst, binding
with compile-time defined virtual class realization through so-called virtual
tables. So to make the slightest addition or change, you must recompile the
whole program. Because the whole object system in C++ is based on such braindead
early binding, enhancements to a program may mean a complete rewrite of the type hierarchy.
The C/C++ preprocessor allows simple macro definitions, but neither macro
instructions nor recursive macro definitions. If
#define DEF #define
were possible, for example, it would be possible to Ziv-Lempel compress
a source program from C to C. But cpp just plain sucks.
is an OO language developed at the University of Washington with the participation of Craig Chambers,
who wrote the
Concurrent Clean is a general
purpose, higher order, pure and lazy functional programming language for
the development of sequential, parallel and distributed real world applications.
"Clean is a language in the spirit of other modern lazy functional languages
like Haskell and Miranda. People familiar with these languages will have
no difficulty to program in Clean. The Clean compiler has the nice property
that it runs on small platforms (Mac, PC, Sun), while it compiles very
quickly and produces code of state-of-the-art quality."
Common LISP is a dialect of the LISP family of languages.
See the corresponding information below.
Common LISP is the result of a standardization effort by Lisp vendors started
with the commercialization of LISP in the early 80s. Because it strived
toward backward compatibility with existing Lisp systems, the result is
a huge monolithic ANSI standard language, with hundreds of built-in constructs
for a megabyte worth of run-time system (not talking about add-on modules).
However, the advantages of programming in Common LISP cannot be
overstated: everything a programmer usually needs is in the library,
the object system is (quite) well integrated with the type system and the
condition system (which is unique on its own). Greenspun's Tenth Rule of
Programming states that "any sufficiently complicated C or Fortran program
contains an ad hoc informally-specified bug-ridden slow implementation
of half of Common Lisp".
Books about Common LISP
Free Implementations of CommonLISP
CMU CL 18b is a deadly fast free
compiler for CommonLISP.
Bruno Haible's CLISP is a portable
and compact implementation of CommonLISP written in C and using bytecode.
Does not implement the full CLOS MOP.
The following two are not free software, but have binaries usable
free of cost for non-commercial purpose:
Franz Inc. sells its commercial system
Allegro CommonLISP on most platforms. Version 5.0 of ACL for Linux is available
free of charge for download.
Harlequin sells its commercial system
LispWorks and Liquid
Common LISP (bought from the dead Lucid). The personal edition for Windows
can be downloaded free of charge.
Free software packages written in Common LISP
The Common Lisp Hypermedia Server is the most configurable web server ever.
Closure is the
dual of the above, a web client written in CL.
ACL2 ("A Computational Logic
for Applicative Common LISP") is the sequel to the Boyer-Moore nqthm theorem
prover; it formalizes a pure subset of Common LISP, and has been used
to prove lots of non-trivial things, from advanced mathematical theorems
to the correctness of CPUs and compiler output. Code and docs are freely available.
"Persistent Lisp Objects" implements persistent CLOS objects.
CM (Common Music), CLM
(Common LISP Music) and CMN
(Common Music Notation)
Free CLIM is an effort for
a free implementation of CLIM.
Thread Interfaces for Common Lisp are discussed here
a Common Lisp package and libraries that implement a symbolic math language
compatible with Mathematica(tm)
Much more that I haven't heard of...
See Generic critique for LISP languages
It is an ANSI standard.
Both interpreters and compilers of good quality are available, and new
ones are easy to design, making development easy and portability quite good.
It has a most advanced object system with generic functions and multiple
dispatch (CLOS), a unique feature among widely implemented languages.
The object system (CLOS) has the most advanced reflective features, through
a Meta-Object Protocol (MOP).
Functional programming allows good factorization of programs, so that added
features take very little space as compared to non-functional programs.
It has a powerful standardized macro system
It has got a package system.
It has got a powerful library of builtin features.
You can declare types to achieve very efficient compiled code.
The standard has got too many legacy features nobody uses.
Many essential features for a modern language are not standardized (notably
threads), and how they are implemented (or not) varies from implementation to implementation.
The object system's semantics does too many things dynamically at run-time.
The Object and Meta-Object system is too much based on side-effects
Though today's machines have more than enough memory to run CL (which was
not the case in the early days of CL), CL still takes too much space to
be used in embedded environments.
The macro system does not have as clean semantics as it should.
Its module system offers no security at all.
The too many builtin features are not orthogonal enough.
There is no standard way by which the system would statically check declared
types and infer new ones as a tool for enforcing static invariants.
COBOL is the COmmon Business-Oriented Language.
It's a very old and stupid language, the kind of thing a sensible man wouldn't
touch unless forced to. Because its syntax is so verbose, it makes
neophytes and non-programmers believe they actually understand programming
better. But of course, what makes a good language is its semantics, not
its syntax; for actual programming, COBOL's syntax gets in the way, while
its semantics just plain sucks.
If you're interested, check the [MIA]
Coq is a higher-order proof system based on the Curry-Howard isomorphism
between propositions and types, and between proofs and terms in a pure functional
language: a functional term is a proof of its type's realizability. By restricting
itself to intuitionistic logic (i.e. no built-in axiom of choice), it allows one
to extract actual programs from proofs.
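As a one-line illustration of the isomorphism (standard notation, not taken from the Coq distribution): the K combinator is at once a program and a proof of the proposition its type spells out:

```latex
\lambda a.\, \lambda b.\, a \;:\; A \to (B \to A)
```

Read as a type, it is a function taking an A and a B and returning the A; read as a proposition, it is the intuitionistic theorem that A implies (B implies A), and the term itself is the constructive proof from which a program can be extracted.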
The newer version of Coq does have programmable tactics (using CAML-light),
extensible grammars, etc.
The problem with Coq is that it is not reflective: there is one global
meta-context, and no way to meta-translate expressions: a proof done in
"Prop" will have to be done again in "Set" and in "Type"; there are no
"proof sketches" as first-class objects.
Coq has got a
which is no longer empty.
The Coq distribution
includes full docs.
Dylan is a new "object oriented" DYnamic LANguage that aims at being compiled
as efficiently as static languages, while providing an unequaled development
environment. It's a lot like a LISP with an infix syntax, freed from
backwards compatibility. The design group hoped that it might bring dynamic
languages to C++ programmers, but that was before Java seized the market
with its hype. (But don't worry: even some Real Language programmers find it interesting.)
Dylan was developed jointly by Apple, Carnegie Mellon University, and
Harlequin inc. But Apple abandoned the project half-way, and CMU stopped
its projects, while Harlequin finally published Dylanworks. Now, the difference
between Apple and CMU is that because Apple was closed-source, all its
code is now defunct, and no one will ever see it again, whereas CMU made
its code public domain, and it has been taken over by a bunch of joyful hackers.
Eiffel is a rather clean OO language originally designed by Bertrand
Meyer, that stresses the concept of programming by contract, to achieve
reliable industrial-strength programs.
Unhappily, the language as it is is not expressive enough to express all
the constraints one may like to verify. Moreover, the language suffers
from its choice of covariant typing for functions, which makes for quite an
ad hoc type system.
See Sather for a language that tries to fix some
of the flaws of Eiffel.
There is one and only one free implementation of Eiffel,
being actively developed under the GPL.
There are lots of commercial implementations, though I only have
these pointers handy:
Erlang is a mostly pure logic/functional
programming language developed by Ericsson, used in real-life telecommunication
applications and other industrial settings. It is now free
software, and comes with lots of goodies.
Projects to compile Erlang into something faster than its traditional bytecode
include HIPE and ETOS.
Escher is a declarative, general-purpose programming language which integrates
the best features of both functional and logic programming languages. It
has types and modules, and higher-order and meta-programming facilities.
Escher also has a collection of system modules, providing numerous operations
on standard data types such as integers, lists, characters, strings, sets,
and programs. The main design aim is to combine in a practical and comprehensive
way the best ideas of existing functional and logic languages, such as
Gödel, Haskell, and \lambda-Prolog. Indeed, Escher goes well beyond
Gödel in its ability to allow function definitions, its higher-order
facilities, its improved handling of sets, and its declarative input/output.
Escher also goes well beyond Haskell in its ability to run partly-instantiated
predicate calls, a familiar feature of logic programming languages which
provides a form of non-determinism, and its more flexible handling of equality.
The language also has a clean semantics, its underlying logic being (an
extension of) Church's simple theory of types.
You can obtain a
You may find the current EuLisp specification and sample implementations
The currently only active implementation is youtoo.
EuLISP is a dialect of the LISP family of languages.
See the corresponding information below.
EuLisp (European Lisp), is a modern object-oriented dialect of Lisp whose
design was less minimalist than that of Scheme but less constrained by
compatibility reasons than that of Common Lisp. In particular, it has a
single namespace for variables and functions (like Scheme, and unlike Common
Lisp) and its type system is perfectly integrated with the object system.
Furthermore, the language makes use of the module system, so that every
program may import only those parts of the language that it needs.
FORTH is a low-level functional language for a stack-based virtual machine.
It is a very interesting language to study, both for its achievements and its limitations.
The current Forth standard is ANSI's 1992 ANS FORTH. It replaces the FORTH'83
and FORTH'78 standards.
FORTH is also being used as the standard language for Open
Firmware, used to boot all modern computers in a portable way (see
The main FORTH site is the FIG's page (Forth Interest Group): forth.org
(hosted by Skip Carter at taygeta.com),
that also has a FTP repository with all
known free FORTH implementations and applications. There is a German
FTP mirror, but it looks like many files are missing.
Another important page here.
Of course, the place to ask questions about FORTH is the usenet newsgroup
comp.lang.forth and its well-done FAQ.
the company created by the original authors of FORTH.
Also, FORTH stuff on cera2.com
GNU Forth (gforth) is a portable 32-bit FORTH interpreter written in C,
with efficient assembly routines on common platforms.
PFE is a free,
portable, and super-standard implementation of FORTH that runs on unices
Wil Baden's [MIA]
Macro4th is a mostly standard FORTH with very interesting (non-standard)
features. It runs on unices, PCs, and Macs.
F-PC is a very
popular FORTH system for the old 16-bit PC. The page also contains lots
of useful FORTH pointers.
is a free implementation for OS/2 that comes with source.
MOPS is a FORTH
with builtin OO extensions, well integrated into the Macintosh environment.
Random pointers to sort out about FORTH
Gema is a general purpose macro processor.
Haskell is the reference among pure
lazy functional programming languages. It serves as a workbench for much research
and teaching in pure functional languages. It has an elaborate type system
including both ad hoc polymorphism (operator overloading) and uniform polymorphism
(universal type quantification, plus restricted existential quantification).
It might be called a functional object-oriented language.
Gofer, a lightweight version of Haskell, has implementations that can be
run on desktop computers.
The Glasgow Haskell group, headed
by Simon Peyton Jones, develops
the Glasgow Haskell Compiler, using GCC as a backend.
The Yale Haskell group, headed by Paul
Hudak, also does Haskell development (including music in Haskell).
See this FTP site.
Pointers to sort out:
Icon is a high-level, general-purpose programming language with many features
for processing data structures and character strings. Icon is an imperative
procedural language with backtracking. Its syntax is much like C and Pascal,
but its semantics are far higher level.
The Icon project (in Arizona) has a WWW page
There's an OO extension to Icon, Idol.
Java is some kind of revamped C++ dialect
meant to be used as a universal WWW language.
Java starts from the C++ syntax and object model, frees it from C compatibility
and ugly hacks; frees it from manual memory management, pointer arithbuggics,
and unsafe typecasting; then it tries to add just about every possible in-fad
feature, without any clean design principle (how could they, starting
from the C++ object model?). That is, they clean up C++, remove much of
its fat, but essentially follow the same general bloat-oriented design.
They nonetheless provide interesting features (interfaces), and overall,
the language is much cleaner (hence better) than C++ for OO programming
(see the C++
critique v3). Note that much better than C++ does not remotely mean
"good". And whatever "OOP" means, being good at it might not mean being
very useful, either...
Java is indeed two standards: firstly, the language (see above), but secondly,
and perhaps more importantly, the low-level bytecode-based JVM virtual machine.
Java people pretend that their JVM allows for secure, efficient, portable
bytecode. Of course, this is pure nonsense:
Security is a high-level concept; no low-level representation can
bring it; they can do no better, only worse, than what any typed, scoped
language brings; and their actual security claims look like they are based
on isolating programs in a black box with restricted access to the
environment, which is no better than what "protected" operating systems
already do, and doesn't solve the fact that a program can be useful only
insofar as it affects actual resources.
Efficiency and portability cannot be simultaneously expressed with such
a low level of abstraction. The JVM is just a lousy intermediate code representation,
that isn't well adapted to high-level optimizations.
Java is a hype-oriented language. Expect your usual computer hypists to
talk a lot about Java. The main principle behind their constantly adding
features to Java seems to be "poll the opinion to determine what features
will make it sell better", which of course leads to inconsistent design
that doesn't work right.
Because Java is the new fad, you'll see lots of Java implementations, development
systems, and application software getting written; lots of Java->X and
X->Java compilers (or more likely JVM->X, X->JVM), where X is just about any other language.
Let's put aside the hype, and forget about those pretensions of Java being
the mother of all programming languages. Then, Java is still a usable language,
though with currently buggy implementations. However, whatever you'd want
to do with it, you can do it much better with a cleaner language! Its only
interest may lie in its alleged WWW-portability provided by the
JVM, but many other languages now have compilers targeting the JVM (plus
the JVM sucks), so this is not a particularly good argument for Java.
Here I'll add a few Java-related pages...
is a language for parallel programming, by publishing tuples in a global tuple space.
The idea seems nice, but it's basically a very very particular case of
what pi-calculus allows.
Originally invented in 1956 by John McCarthy, LISP (which originally stood
for LISt Processing) is a family of languages that have always been at
the cutting edge, because they stressed better semantics rather than fancy syntax.
LISP is a self-extending higher-order dynamically typed strict (but
with lazy data structures often implemented) non-pure functional language.
It is the second oldest programming language, but also one with the
nicest semantics. It's a good experimental platform for just any kind of
computing; it has got very good support, including free portable development
environments, integrated structured editors, and efficient "optimizing"
compilers.
Myths about LISP
Many people think that "LISP is an interpreted language". This is either
plain false, or else completely nonsensical. It is false, because there
are a lot of LISP compilers, some being of very high quality, and some
producing code that runs faster than equivalent hand-written C code. Certainly,
there are lots of LISP interpreters, and even lots of "toy" interpreters,
too. This only shows that LISP is so well designed that it makes writing
a working interpreter doable and fun. It's up to you to choose the right
implementation. C is not an "interpreted language" just because C interpreters
exist. Finally, the statement is nonsensical, because being interpreted
is purely an implementation issue; just any language may be either string-interpreted
or compiled to native code, or anything in between.
Lisp source code is most often written in Cambridge Polish notation: from
a set of atomic symbols, you build Symbolic EXPressions (S-exp or SEXP
-- once was SEX), a SEXP being either atomic, or a list of smaller SEXP,
read and written as the parenthesized space-separated sequence of the representations
of its components.
The basic data structure, the SEXP, is hence a recursive list, and
provides a trivial syntax for writing programs: an atomic symbol denotes
a value, and a list is interpreted as a function call: the head element
of the list is computed as a function, the tail of the list is computed
as the parameters given to it, and the result is what applying the function
to the parameters returns. (Of course, SEXP need not be interpreted this
way, and indeed, good implementations will compile them instead for better
performance.)
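To make that evaluation rule concrete, here is a minimal reader/evaluator sketch (in Python, for illustration; the tiny environment with just + and * is an assumption of the example, not part of any LISP standard):

```python
# Minimal S-expression sketch: atoms and parenthesized lists are the
# whole syntax; a list is evaluated by applying its head to its
# evaluated tail.
import operator

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    try:
        return int(tok)
    except ValueError:
        return tok  # a symbol

def read(text):
    return parse(text.replace("(", " ( ").replace(")", " ) ").split())

ENV = {"+": operator.add, "*": operator.mul}  # toy environment

def eval_sexp(x):
    if isinstance(x, str):                # symbol: look up its value
        return ENV[x]
    if not isinstance(x, list):           # number: self-evaluating
        return x
    fn = eval_sexp(x[0])                  # head is the function...
    args = [eval_sexp(a) for a in x[1:]]  # ...tail the arguments
    return fn(*args)

print(eval_sexp(read("(+ 1 (* 2 3))")))  # -> 7
```

Note how the reader produces nothing but nested Python lists: the "abstract syntax tree" and the data structure are one and the same, which is the whole point of SEXP.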
Knowledgeable computists will see that SEXPs are actually only a trivial
syntax for abstract syntax trees. While programmers and language implementers
of other languages spend so much time and effort mastering or developing
weird fancy syntaxes, that they spend years debugging, parsing, etc, Lisp
programmers and implementers instead trivialize those tasks, and can focus
on semantics. At the same time, S-exp give a canonical embedding of source
code as data, which is foremost for the development of meta-programs and reflective
systems.
SEXP syntax is unobvious at first: not only is it unfamiliar to someone
who has received a standard education; it also lacks those kinds of redundancy
that make simple things in many other languages so easy to read linearly
at first sight, while imposing a systematic, repetitive constraint
that is tedious to follow, especially without good editing tools (such
as EMACS) and a treelike indentation of code (which is a different kind of
useful redundancy), of which most programmers are unaware. For these reasons,
SEXP have gained a bad reputation among illiterate programmers, who don't
see beyond the syntax, and LISP has been nicknamed such things as "Lots
of Insipid and Stubborn Parentheses".
On the other hand, SEXP also avoids the ambiguities and limitations
that make it so easy to write gross mistakes in many other languages, and
that make tedious (when even possible) the elaboration of complex programs. After
a short training, SEXP appear as the representation for abstract syntax
trees that they are, and they become as familiar to LISP programmers as
the infix syntaxes considered "normal" by other people (which after
all is a matter of convention and education); only SEXP have lots of additional
advantages, as already described. There actually used to be another, more
conventional syntax for LISP, the M-exp, but it fell out of use, as the
S-exp were so much more practical to Lispers, whereas the M-exp led to endless
arguing about which way to write things was fanciest, and about how to translate
between code and data representations.
The advantages brought by the trivial SEXP syntax are precisely the
strongest asset of the LISP family of languages, and all stem from the
fact that the programmer's brain was literally liberated from the yoke
of syntax. SEXP mean "drop the syntactic brainfuck, and get to the actual
semantic problem". Paradoxically, the uncommon character of LISP's syntax
has also been the biggest brake to its spreading widely, so that LISP's
strongest asset is also one of its major marketing weaknesses.
A way by which LISP can benefit from both the reflective features of
its abstract syntax, and from the ease of access of more "conventional"
syntaxes, is to have first-class syntaxes, as in the GUILE
system, where new fancy syntaxes can be seamlessly used, all of them canonicalized
to SEXP, while SEXP can be pretty-printed back to them.
Another system, Dylan, started with SEXP syntax
and concepts developed by Lispers freed of syntactic concerns, and then
stopped using the SEXP syntax, to provide a cutting-edge state-of-the-art
language, though a language that might not benefit anymore from the same
freedom of thought as LISP provided (another more modest such Scheme dialect
Studying all the dialects of LISP is enough work for a lifetime. However,
there seem to be two main standards for LISP: Common LISP
and Scheme. Other notable LISP dialects are EuLisp, ISLISP,
and elisp (the EMACS Lisp from the famous GNU Emacs editor). An
interesting derivative is Dylan.
Each of these languages has a section of its own in this Review page. Here
is a quick summary:
Common LISP is an ANSI standard defined for industrial
strength and it is just huge.
Scheme is a minimalist effort that tries to narrow
the essentials of programming languages; it suffers from lack of standardization
of features needed for programming in the large.
EuLisp is an intermediate between CL and Scheme,
but it never took off.
ISLISP is yet another attempt at a standardized LISP dialect.
elisp is a dialect that's not popular for a clean design or a good
implementation, but because it is the extension language of the Mother-of-all
text editors, EMACS.
[Is anyone willing to write a short critique of Lisp machines here?] See
Summary on Lisp Machines
There are lots of pointers and documentation about LISP and its dialects.
See the sections related to each dialect for dialect specific books.
Books and Articles Online
Books and Articles Offline
Free Software using LISP
GNU Emacs is a widely spread LISP-based advanced editor. Look on prep.ai.mit.edu
or any GNU mirror site to find EMACS. Binaries for your favorite platform
can be found on the usual binary repositories for this platform. See its archive.
See Common LISP and Scheme for more.
It's got a trivial syntax so you can focus on the essentials.
There are standardized languages in the LISP family.
LISP is widely taught in universities, and many advanced programmers know it.
LISP allows for higher-order functional programming style.
LISP also allows for imperative and OO programming style.
Both interpreters and compilers of good quality are available, and new
ones are easy to design, making development easy, and portability quite
good.
LISP is strongly-typed (type-safe)
LISP has fully reflective dialects.
The syntax isn't mainstream
There are just too many LISP standards, none being perfect.
LISP is not always so-well taught in universities, and many programmers
were unduly disgusted.
The pattern language for LISP lambda-expressions is limited because of legacy
non-orthogonality, contrary to modern functional programming languages.
The way LISP inherits all its many impure imperative constructs from the
dark ages makes them not orthogonal enough.
There are so many good things about LISP languages, but it seems that you
can never find an implementation that has all of those you want at once.
The static type system is trivial.
LISP gives no way to enforce invariants, which doesn't give good control over program correctness.
Because LISP is so powerful and versatile a language, several flavors of
LISP have been used in a wide range of applications, from real-time industrial
control, to space-borne engines, to financial applications, to extension
languages for text editors (Emacs), CAD engines (AutoCAD), arcade games,
and more.
All in all, it could be said about Lisp's semantics that while it offers
great expressivity for constructing new objects, it is inexpressive
when it comes to stating constraints on them.
About its syntax, we find that it is very all-purpose, and adapts
immediately to any use; however, it doesn't scale to specific
uses where local constraints make it too redundant.
It could be said that Lisp is a language based on syntax trees; then
Clean, which is based on graphs, could be a successor
(see also Prolog and Lambda-Prolog for the same in logic programming).
Lua is an embeddable language
library to extend your programs.
Also see this page.
m4 is a powerful (Turing-equivalent,
unlike CPP) macro-expansion program designed as a preprocessor to more
stubborn languages. It's ugly, but it's fairly standard, and available
on most platforms, including a fine GNU version.
The only fine implementation of m4 is GNU m4 1.4. Avoid the crippled and
broken commercial versions sold by UNIX vendors.
The semantics of m4 are bad: it uses back-door evaluation in a dynamically-bound
global environment as the only way to program, without any way to safely
quote text (not to speak about quasiquotation).
Thus, users who want to do complex things often find they must compile
manually to continuation passing style with manual fluid-bindings.
Avoid m4 but for quick hacks, and when the environment has already been
setup for it (e.g. FvwmM4, sendmail configuration, etc).
Quick critique of m4 as a programming language
The language semantics are brain-dead, and remindful of TCL (though simpler):
evaluation is based on unconstrained backdoor evaluation
there are no closures or contexts
you only have pass-by-value and dynamic scoping
quoting is syntax-based, not semantics-based
you can't escape characters in a quoted text
quoting is not safe across modification of superficial syntax
you don't have nested definitions of abstractions, except with horrible unsafe
contortions that give results inconsistent with deeper abstraction levels
Building new control structures is difficult and unsafe, yet the language
proposes few basic ones.
Syntax is modifiable, but priorly defined reflective objects won't be consistently
updated. Unless stubborn double-character quotes are used, quote characters must
be banned from processing.
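To see why this rescanning-based evaluation is so fragile, here is a toy model of it (hypothetical Python, not m4 itself): once a macro's expansion is emitted, any defined name inside it gets expanded again, so data can never be safely distinguished from code.

```python
# Toy model of m4-style expansion: definitions live in one global,
# dynamically scoped table, and output is rescanned for more macros;
# the rescanning is exactly what makes quoting so fragile.
defs = {}

def expand(text):
    changed = True
    while changed:
        changed = False
        for name, body in defs.items():
            if name in text:
                text = text.replace(name, body)
                changed = True
    return text

defs["greet"] = "hello, world"
defs["world"] = "planet"
# The user only asked for "greet", but the rescan also rewrites the
# "world" that appeared inside greet's own expansion:
print(expand("greet"))  # -> hello, planet
```

Real m4 mitigates this with its backtick/quote mechanism, but since quoting is purely textual, a quote character occurring in the data breaks it all over again.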
Also, the language support is poor:
few input/output primitives exist, and only output has an easy-to-use syntax
available implementations make everything but string handling very inefficient,
and propose no efficient data structure but a string LIFO.
there is no standard library (there can't be because a user change of quote
characters would break it)
there is no standard way to extend the language
To conclude, m4 offers reflection without abstraction. Use it only as a
quick and dirty preprocessor, when little programming is needed. I wonder
why I stuck with it so long...
Mathematica is a scientific calculation package developed by Wolfram Research.
Mathematica is an impressive piece of software because, unlike other
technical symbolic calculation software, it is based on a real, well-designed
programming language, hence can be extended in ways both clean and advanced,
allowing for incremental enhancements. Thus, even if a given version of
Mathematica does a few things not as well as a rival software, Mathematica
will win on its overall capabilities, and can easily be made to further
scale up on the long run, whereas the rival software just can't adapt.
Unhappily, it is expensive proprietary software only, thus cannot benefit
from open development in a free world. This is a very sad situation,
as the world really needs some reliable, free (for in the software world,
freedom is the way to optimize reliability), symbolic calculation software.
Mathematica is available for Linux as well as for lesser platforms.
See Wolfram Research's WWW page in USA,
On the other hand, I've been told that I was overly generous towards Mathematica
(partly because I didn't use it deeply enough to see its shortcomings,
whereas Maple disgusted me). See the harsh
criticism by Prof Fateman
SML, with its docs
(same as here, where
you could look for "Implementation work using ML"), Edinburgh ML,
and CAML: the impressive Objective CAML, and the less featureful but smaller
good old CAML-light (which has been ported to PalmPilot, DOS (ancient versions),
etc). See their pages.
SML/NJ is an efficient but hyper-heavy implementation: don't use it with
less than 32MB of memory. I don't know about Edinburgh ML. Moscow ML is
an SML implementation using the CAML-light run-time. CAML-light is a very
light-weight bytecode-based implementation that fits almost any 32 or 64
bit computer; as a measure of its usability, it has been used successfully
for a game under X11. The new Objective CAML compiles to either portable
bytecode or native code, and is meant to compete successfully with SML's
efficiency, at much lighter cost and resource requirements, and with a
much better and cleaner module system, and with unequaled objects.
SML has got a powerful but kludgy module system, only fully implemented
in SML/NJ. OCAML has got an equivalently powerful and much cleaner module
system, that allows separate compilation; plus it has objects which SML
hasn't; and it should by now have built-in multithread support.
Here are some interesting software projects I've found using ML:
[MIA] ML (originally
the Meta Language for a theorem prover) is a class of (strongly) typed
functional languages.
ML comes in two main flavors: SML and CAML.
The Fox project from
CMU to write an OS using (a hacked version of) SML/NJ running directly
The Coq project of a constructive proof/program environment
written in OCAML.
Other theorem provers like HOL, Isabelle...
is the successor of Modula-2+, itself an evolution of Niklaus Wirth's Modula-2,
developed independently from Wirth's own work (which instead gave birth to Oberon).
It is a language that stresses simple/safe semantics, with modularity,
OO, garbage collection, and multiprogramming. Special "unsafe" modules are
allowed to mess with implementation details.
for more information.
There are free as well as commercial Modula-3 compilers available.
is a persistent programming system from University of St Andrews
Oberon (now System 3, with
the older V4 still available) is the latest modular OO language by Niklaus
Wirth (the author of Pascal and Modula-2).
See The Oberon Reference
Orca is a language
for distributed computing, codevelopped with the Amoeba
Pascal was a toy written by Prof. Niklaus Wirth to teach how to compile
computer languages. Unhappily, it has been taken seriously by many people.
Well, let's admit that the cheap Turbo-Pascal once revolutionized programming
on personal computers.
See Modula-2, Modula-3,
and Oberon for more serious successors of Pascal.
Brian Kernighan (co-author of the C bible, as well as author of much Pascal software), once wrote
an article, "Why
Pascal is Not My Favorite Programming Language", which is quite obsolete now that
modern Pascals (e.g. Turbo-Pascal) are quite equivalent to C (which also
shows C sucks as much as this toy Pascal language).
Actually, there seems to be an "Extended Pascal" standard. I don't know
how it compares to Modula-3 or Oberon, but it came too late, with too little
support, and with the problem of being burdened by Pascal compatibility.
See gpc, the GNU Pascal Compiler.
Perl is a language for easily manipulating text, files, and processes.
It provides easy ways to do many jobs that were formerly accomplished (with
difficulty) by programming in C or one of the shells. It is THE language
of choice for hacking programs oriented toward interfacing UNIX/Internet services.
Perl4 lacked a lot of expressiveness required for an all purpose language,
but already was a great tool for what was clumsily done before with horrible
shell/awk/sed scripts and gross C hacks.
Perl5 has all the good features of Perl4 (a few misfeatures were
dropped and replaced), but it's a full-fledged all-purpose language with
a most powerful OO module system.
It is interfaced to lots of libraries (libc, Tk, ncurses, DB/gdbm/ndbm,
etc), has got lots of modules to manage various internet protocols, and
if by chance it isn't interfaced to your favorite software yet, this can
be done through a well-documented extension system.
And here's a Perl 5
page for people on the bleeding edge. It's a real language, with powerful features.
WWW Pages: Perl.org; perl.com;
As its name, "Practical Extraction and Report Language", suggests, Perl
aims at being effective in practice, not at developing a nice and simple
semantics (though whenever nice things are effective, Perl does them too).
Thus after a short learning time, it becomes a great tool for jobs of interfacing
software, converting file formats, making small network daemons, etc. Perl
is perfectly adapted to the Unix philosophy and practice. However, it is
a bad choice when long-term/wide-area consistency is required.
The principal advantage of Perl is that you can quickly hack up a working
program to do text/binary manipulations that would be quite cumbersome
using previous tools, which were either too generic, with a cumbersome interface
to Internet-standard human-readable files, or too specific, unable to express
more complex file manipulations. Perl allows you to do things in tons of
different ways, so everyone can find one that fits his taste and way of thinking.
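As an illustration of the kind of extraction-and-report job meant here (sketched in Python for neutrality; the equivalent Perl one-liner would be shorter), with made-up log data:

```python
# Typical "practical extraction and report" task: pull fields out of
# line-oriented text with a regex and tally a small report.
# The log format and its contents are invented for this example.
import re

log = """\
alice 200 /index.html
bob 404 /missing
alice 200 /about.html
"""

hits = {}
for line in log.splitlines():
    m = re.match(r"(\w+) (\d+) (\S+)", line)
    if m and m.group(2) == "200":            # count successful requests
        hits[m.group(1)] = hits.get(m.group(1), 0) + 1

print(hits)  # -> {'alice': 2}
```

Doing the same thing in C means hand-rolled buffer management and parsing; doing it in sh/awk/sed means fragile pipelines. That gap is exactly the niche Perl fills.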
The principal disadvantage of Perl is that, by allowing too many things as
valid programs, it makes it difficult to specify programs and verify bugs out; the
terse syntax that makes hacking quicker makes debugging slower. This is
typical of traditional non-meta-programming development, where a compromise
between writeability and readability is needed: only a meta-programming
environment could help transform a valid program into a clean valid program
of equivalent meaning.
Another current disadvantage of Perl is that the current interpreter may
be too slow to start and run to cope with the usage pattern of an intensive
server. The Perl compiler being developed might alleviate this problem.
Prolog is a
language for programming based on Horn-clause logic (not even full first-order
logic).
is "an interpreted, interactive, object-oriented programming language.
It incorporates modules, exceptions, dynamic typing, very high level dynamic
data types, and classes. Python combines remarkable power with very clear
syntax. It has interfaces to many system calls and libraries, as well as
to various window systems, and is extensible in C or C++. It is also usable
as an extension language for applications that need a programmable interface.
Finally, Python is portable: it runs on many brands of UNIX, on the Mac,
and on MS-DOS and Win32."
Carl Sassenrath's REBOL
The language is not finished, but what's already available looks promising.
The core language, which is available, looks like some LOGO, LISP with
parentheses removed in exchange for fixed arity functions. The semantics
also borrow a lot from LISP and Scheme, which is very good. The promised
features for future development look great, too. All in all, REBOL has
improved a lot since the original papers of 1997. There are still a lot
of things undefined yet, so let's wait and see if REBOL lives up to expectations.
If it does, it's time to write a free implementation.
On the flip side, the overloading of structures in a MUMPS/perl/Tcl way
looks like it will constrain implementations a lot, unless some standard
type inference/declaration/verification system (a la CommonLISP) is promoted.
RPL ("Reverse Polish LISP") is the language from HP 28/48 calculators.
Despite their lack of horsepower, these calculators are much more usable
and friendly than my unix workstation. Let us try to determine why...
Pros and Cons
The language is reflective: code IS an object, like anything else on the HP, and
it is easily manipulable.
Interaction is done through a visible stack, in exactly the same way as
the thing is programmed. The interface is thus very well integrated with
the programming language, and the interface concepts are simple, with trivial
automation. Much better than windows!
The system is orthogonally persistent.
The language was not extensible (well HP48's can be extended using the
assembler, but that seems kludgy).
The language was designed with a unique execution context in mind: single
user, single-threaded, global variables.
No support for modules and migration.
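The stack model described above is easy to picture: here is a toy RPN evaluator sketch (in Python, for illustration), in which a program manipulates the same visible stack the user would push values onto interactively:

```python
# Toy RPN evaluator in the spirit of RPL's visible stack: words are
# either numbers (pushed) or operators (which pop their arguments and
# push their result). Only + and * are modeled here.
def rpn(program):
    stack = []
    ops = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
    for word in program.split():
        if word in ops:
            b, a = stack.pop(), stack.pop()
            stack.append(ops[word](a, b))
        else:
            stack.append(float(word))
    return stack  # the final stack is what the user sees

print(rpn("2 3 + 4 *"))  # -> [20.0]
```

Because interactive use and programming share the one stack, there is no gap between "using" the calculator and "programming" it, which is precisely the integration praised above.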
SAC, or Single-Assignment C, is a functional language with C-like syntax, especially
designed for highly optimized parallel intensive numerical computations.
In that, it is a successor to SISAL.
Sather is a free Object-Oriented
programming language, a rival of Eiffel.
From their homepage: Sather is an object oriented language designed to
be simple, efficient, safe, flexible and non-proprietary. One way of placing
it in the "space of languages" is to say that it aims to be as efficient
as C, C++, or Fortran, as elegant as and safer than Eiffel, and support
higher-order functions and iteration abstraction as well as Common Lisp,
CLU or Scheme.
See also another implementation,
Scheme is a dialect of the LISP family of languages.
See the corresponding information above.
Scheme is a minimalistic dialect (some say a skimmed version) of the LISP
language, in which everything is built upon the simplest constructs. It
is the language that proved to the world the benefits of lexical scoping
over dynamic scoping.
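Lexical scoping, which Scheme championed, is what lets a returned procedure capture the variables of its defining context rather than whatever happens to be bound at call time. A sketch (in Python, which also scopes lexically):

```python
# Lexical scoping in action: the local variable `count` is captured
# by the closure make_counter returns, so each counter keeps its own
# private state. Under dynamic scoping there would be no such
# private, per-closure binding.
def make_counter():
    count = 0
    def bump():
        nonlocal count
        count += 1
        return count
    return bump

a = make_counter()
b = make_counter()
print(a(), a(), b())  # -> 1 2 1
```

This is the idiom behind much of Scheme's expressive power: objects, generators, and state machines can all be built from closures alone.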
Scheme is among the few languages that have a perfectly defined formal
semantics, and not too kludgy a one at that; Scheme programmers often
find it unpleasant to program in any other language.
However, Scheme suffers from minimalism, and programmers coming from Common Lisp
often wish Scheme had a unified library.
The R5RS (Revised^5 Report on the Algorithmic Language Scheme),
that includes the language definition, the formal semantics, etc, is available
in various formats from
Kelsey's ftp site alongside with some other scheme-related material;
it is also available in
Books and Articles
Both authors of this review agree that the best book
about programming they have read is Structure and Interpretation of Computer
Programs (SICP), by Abelson and Sussman, which uses Scheme as its language.
(Beware that the first edition used a slightly old-style subdialect of
Scheme.) The official page for SICP is at MIT-Press.
The canonical place to find people interested in Scheme is the Usenet newsgroup comp.lang.scheme.
The Scheme Underground
project strives to build a complete computing system on top of Scheme.
Here are notable free implementations of the Scheme dialect (and
its variations). For more, or for commercial implementations, see the FAQ
and above resources.
GNU has its own Scheme dialect, GUILE,
"GNU's Ubiquitous Intelligent Language for Extension"
GUILE comes as an embeddable interpreter library derived from SCM,
with lots of packages (including a Tk interface, OpenGL 3D graphics, Posix
and Unix system interfaces, multithreading, etc); it also integrates multiple
languages and syntaxes (like a C-like syntax for Scheme, Tcl, and more).
FSF page, red
bean (also FTP)
Docs here and there
[MIA] GUILE Mailing list and
MIT-Scheme (aka C Scheme) and its efficient LIAR compiler lie here.
It features: a very efficient native-code compiler (LIAR), an Emacs
clone integrated to the Scheme environment (Edwin), a package system, and first-class
environments.
Olin Shivers' SCheme SHell, scsh
Based on Scheme48
a portable implementation of Scheme, scsh is designed to provide a complete
Scheme-based system development tool for use on existing operating systems.
Scsh may someday become a serious alternative to Perl; however, it
still has to add these in its next release: fast startup, multithreading
support, scalability of executable output code, 64-bit support.
More on the newsgroup comp.lang.scheme.scsh.
Kali Scheme is another
Scheme system based on Scheme48,
that offers primitives for distributed computing.
The RScheme project, implemented by
Donovan Kolbly under the direction of Prof. Paul Wilson at the University of Texas.
It features: a byte-code compiler as well as a compiler through C,
a CLOS-like (but single dispatch) object system, a real-time GC, a persistent
store, interfaces to unix syscalls, Tk, http, and much more.
STk allows access to the popular
Tk toolkit for X programming via Scheme. Its main interest is TinyCLOS,
a port of CLOS to Scheme. The interpreter is based on SCM.
Rice University has built
lots of tools for Scheme teaching and development: MzScheme, an interpreter;
MrEd, the same interpreter bundled with a graphics toolbox and a built-in
editor; and DrScheme, a Scheme development environment built atop MrEd.
Jeffrey Mark Siskind
is actively developing Stalin,
an optimizing compiler for Scheme that goes through C, and produces impressively
good code [was it named thus because it is an expert at massive execution?].
Manuel Serrano's bigloo
also compiles through C; it's got its own object system, a module system
that allows separate compilation, many libraries to access the system,
and so on.
Marc Feeley's [MIA] Gambit
compiler for Scheme.
Will Clinger's Twobit
compiler and its companion compiler for the SPARC, Larceny.
Oliver Laumann's elk
(Extension Language Kit) is a very straightforward Scheme interpreter written
in portable C. It's not fast, but it's very robust, easy to understand,
comes with lots of cleanly designed straightforward extensions and system
access libraries. It's easy to extend and to link into your application.
It's sure not cutting-edge, but it's very stable.
Kawa is a Scheme
for the JVM.
Objective Scheme from INRIA is an interpreter that easily interfaces C,
and has its own object system; it is used in GWM, the Generic Window Manager
Also: T, SCM,
Hobbit (Scheme to
C compiler using the SCM runtime), Latest
Gambit system, Hobbit, VSCM, Scheme->C, siod, rabbit, wxWindows, oaklisp,
euscheme, pcscheme, meroon, fools, scoops, umb, xscheme, s88, softscheme,
similix, pixiescheme, minischeme, libscheme, and every year more...
Pico is a Scheme dialect
with more traditional syntax.
There are DAMN TOO MANY IMPLEMENTATIONS (well, half are not very active).
And that's only the free ones! Every single feature you'd expect from a
Scheme implementation is implemented in one of them. Just not all of them
at a time! Too many divergent efforts, not enough coordination on even the
most trivial language extensions, etc. OUCH!
Free software based on Scheme
DSSSL is a powerful document
formatting system based on SGML and Scheme.
Issue one of
Tom Lord's [MIA]
~twaddle (pronounced "twiddle twaddle"), includes a Scheme implementation,
(more or less an early Guile), together with Ctool, a tool for analyzing
C programs, that was used to analyze and debug the included Scheme interpreter
Functional PostScript (FPS) takes all the Postscript drawing primitives, and wraps them with
a real language, Scheme, instead of that Postscript hack. (source
to the tutorial)
An interesting example of how Scheme was embedded into a [MIA]
Aubrey Jaffer's SLIB standard library for Scheme.
Portable Object-Oriented packages for Scheme include:
(a Self-like language with behavioral reflection)
Ken Dickey's YASOS
(Yet Another Scheme Object System)
Of course, many implementations have their own object system, including
RScheme, STk, OScheme, etc.
See generic critique for LISP languages above
Scheme is an IEEE standard.
Scheme has got lots and lots of implementations
Scheme has got a clean, short, and expressive formal semantics.
Scheme has got the best macro systems ever found in a language.
Scheme is minimalistic, no unneeded constructs or bizarre rules.
Scheme makes lots of things completely orthogonal.
Just any program can be made a first-class object in Scheme: it has maximal reflective power.
Scheme is the basis for some of the best books to learn computer science
Scheme can express just any programming style in existence, including functional,
procedural, logic, constraint, OO, and whatever programming style you want,
for which you'll easily find lots of example source packages.
The standard focuses only on the core language, and completely ignores
lots of issues that are required for real world use.
All the implementations of Scheme are completely incompatible with each
other for anything but batch computation, because only the core language is standardized.
Notably, no standard bindings for non-trivial I/O primitives, threads, persistence,
etc, exist in standard Scheme.
It has no standard module system or any easy mechanism for deferred
Scheme hasn't got a large standard library, which makes every Scheme implementation
incompatible with the others as far as the system interface is concerned.
Actually, its very lack of a standard module system makes development
of such library difficult. This is the ONE BIG PROBLEM that prevents Scheme
from being used in large projects.
Again, this plain sucks: the theory is as bad as C's (only one global namespace),
and the practice is even worse (making a (define) definition local
is not a local transformation on a module, whereas in C, putting the static keyword on a definition is).
Despite its simple and clean semantics, Scheme is too low-level wrt mutability.
There is no standard way to declare read-only objects. More modern functional
languages can do this, and this really would allow much cleaner semantics,
hence easier optimization, etc.
The read-write cons cell concept is a very low-level one that dirties the
otherwise high abstraction level of the language.
More generally, Scheme does introduce both the concepts of values and of
locations, but does it in complex non-orthogonal ways, which plain sucks.
Even more generally, there are a lot of things doable in Scheme that the
Scheme standard offers no way to do except through clumsy, inefficient abstraction
inversions, which makes the language both powerful and frustrating.
Every single feature you want may be found as first-class in some Scheme
implementation, only it will not be standard, and you'll never find a Scheme
implementation with all the features you need.
SELF is a pure prototype-based Object-Oriented programming language, based
on very simple concepts which allow efficient implementations.
SELF uses trivial strong typing (much like LISP): there is no "typecasting"
backdoor to object representation, but there is no elaborate type hierarchy either,
just one unique static type.
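The prototype model can be sketched in a few lines (illustrative Python; the slot names are made up): objects hold slots directly and delegate failed lookups to a parent object, with no classes anywhere.

```python
# Toy prototype object in the SELF style: no classes, just objects
# holding slots and delegating unknown lookups to a parent object.
class Proto:
    def __init__(self, parent=None, **slots):
        self.slots = dict(slots)
        self.parent = parent

    def lookup(self, name):
        if name in self.slots:
            return self.slots[name]
        if self.parent is not None:
            return self.parent.lookup(name)  # delegate up the chain
        raise AttributeError(name)

point = Proto(x=0, y=0)
moved = Proto(parent=point, x=5)  # a "clone" overriding one slot
print(moved.lookup("x"), moved.lookup("y"))  # -> 5 0
```

Making a new kind of object is just making an object and overriding slots; this uniformity is what allows SELF implementations to optimize aggressively.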
Free software using SELF
OS project uses SELF.
SIMULA was one of the first "OO" languages (or is it the first?), back
in the sixties. C++ is to C what SIMULA is to Algol.
There are still people using it at
The authors of SIMULA have since made a much greater language,
Sisal is a pure functional language that beats FORTRAN at number crunching,
especially as it allows lots of automatic optimizations for parallel architectures.
Unhappily, while focusing on number crunching performance, Sisal forgot
many of those things that make functional programming so cool, and it seems
that they got multidimensional arrays wrong, so there's room for a better
language than SISAL.
People at LLNL have decided to stop developing SISAL so as to focus on
their main job: designing weapons of mass destruction, for which they'll
use FORTRAN and C++. Let's hope these lame languages will help confuse them
in their evil quest for a worse world. Meanwhile, former SISAL developers
are working on a sequel to SISAL,
Smalltalk is a hackish class-based Object-Oriented programming language
with reflective capabilities.
Smalltalk can be viewed as an operating system of sorts, all the more so if it
is not running on another OS. It is fine-grained, having 30,000-60,000
objects, rooted in some 200 classes; some big Smalltalk systems have over
1 million objects. No differentiation or separation is made between system
and application software: it's all one big sea of objects. It has no kernel,
but the VM serves a similar purpose. It has extensive, relentless bounds
checking. It is the most consistent OO language. Everything is an object,
without exception: every number, every letter. No object can directly affect the state
of another object, only indirectly, by messages. No pointer arithmetic.
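The "everything happens by message sends" claim can be sketched as follows (in Python, with illustrative names; Smalltalk itself would write this very differently): even control flow is a message sent to a Boolean object, and an object's state changes only through its own methods.

```python
class STrue:
    """Stand-in for Smalltalk's true object: picks the 'then' branch."""
    def if_true_if_false(self, then_block, else_block):
        return then_block()

class SFalse:
    """Stand-in for Smalltalk's false object: picks the 'else' branch."""
    def if_true_if_false(self, then_block, else_block):
        return else_block()

class Counter:
    def __init__(self):
        self._count = 0          # state reachable only via messages below

    def increment(self):         # the only way to change the state
        self._count += 1

    def count(self):
        return self._count

c = Counter()
c.increment()
flag = STrue() if c.count() > 0 else SFalse()
# No 'if' statement needed at the language level: the Boolean object
# decides which block to evaluate when it receives the message.
print(flag.if_true_if_false(lambda: "positive", lambda: "zero"))  # → positive
```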
Hum, better take a look at
and Smalltalk: A Social and Political History, by Benedict Dugan, (c) 1994
Smalltalk was inspired by Simula. Smalltalk was developed under Alan Kay's
team, at Xerox PARC in the 1970s and early 1980s. Xerox famously sat on
some amazing technologies, throwing away many great opportunities, later
developed instead by other companies: Ethernetworking (3com), Postscript
(Adobe), laser printers (Apple), windowing graphical user interfaces (Apple,
Microsoft), and Smalltalk (ParcPlace-Digitalk is now in much trouble due
to overpricing Smalltalk).
Then Apple Computer's Steve Jobs got Kay and many of the PARC team to move
to Apple. Some of the Smalltalk team moved to Digitalk. Then some Digitalkers
moved to Apple. Sometime during all this, Apple licensed Smalltalk-80.
As corporate America often did and does, Apple sat on advanced stuff it
had, and did nothing with it. In the meantime, IBM developed Visual Age,
and now has 20,000-30,000 Smalltalk programmers.
TCL is a logically challenged interpreted language, which is popular because
of Tk, a powerful toolkit for programming the X Window environment. Those who
have really managed to understand its semantics try to explain it with words
such as `binding-time', `runtime', `coffee-time' and pretend it's not that
bad after all. Tcl has had lots of success, since it is a strongly-hyped language.
Many real languages, including
and more, have access to Tk. And other languages have other interfaces
Deeper insight can be found at the
critique page or in those
of Tcl with other systems
A new HLL
Pros and Cons
being efficient as an interpreted language, it may serve as a shell language
as well as a programming language; being powerful, and easy to specialize
via standard libraries, it also replaces small utility languages (sed,
awk, perl, etc.); finally, being high-level and aware of relations between
objects, it is easily adapted into an AI language. So there is no more
need to learn a different language for every application; the same language
is used for (almost) everything; no more need to learn new syntaxes each time.
We can design the syntax to fit our needs and ideas, so that it's much
easier to use. Moreover, even C isn't our natural language, and whatever
language we use, there will be some time spent adapting to it.
We can correct the shortcomings of any existing language we would otherwise have used.
Portability: both the system and the language are equally easy to port. All
you need to do is port a LLL compiler back-end or interpreter, and the hardware-specific
lolos (low-level objects).
The language is perfectly well adapted to the system: no need for bizarre
and slow language -> system call translations.
we have to learn a new language syntax. But as we may choose whatever
syntax pleases us (and support multiple automatically translatable syntaxes),
this is no big deal, really.
No existing compiler can be used directly. This is no big deal either:
front ends are easy to write, and no existing back end could fit an interestingly
new OS's object format, calling conventions, and security requirements.
Moreover, since our system has a brand new design, even with a traditional
language we would have to learn new restrictions on our way of programming.
we have to debug the language specification as we use it. But this can
prove useful to refine the language and the system specs. Here is
an interesting point.
My TOP languages are:
To Do on this page
Create a subdirectory for the Review alone.
Have a subfile for each language family.
Find all existing HLLs.
Separate C from C++
Write a summary about each of these, with particularities, pros, cons,
pointers, (examples/history?), etc.
Add cross-references to the Glossary, OS page, VM page, etc.
Wait for feedback and criticism.
What should we do with proprietary languages?
A few pointers to register:
Languages/systems with reflective capabilities:
maude: reflective system based on
pliant: free reflective
TOM: language with metaprogramming
POP-2 and POP-11, etc (see Poplog
Logic programming and rewrite systems:
Other languages/systems to Review:
for Object-oriented Software Engineering of Systems. Typical OO propaganda
kind of thing.
JO's infamous article on Scripting
languages [link dead; any update?]
User-level Stuff to achieve persistent objects:
Foreign function interfaces:
Back to the TUNES
or to the HLL
$Id: Languages.phtml,v 1.26 1999/07/15 00:26:51 fare Exp $