Review of existing Languages

Preliminary Note

There are a lot of these, aren't there?
I'd love feedback from you, particularly if you know any interesting language not (or poorly) considered here...
Also feel free (=bound :-) to add your comments and to correct broken English and typing mistakes.


Language-related WWW pages


Some "paradigms" of programming

Implementational topics

Existing languages

This classification is stupid, approximate, and should be completely redone, perhaps with an array of "features" to serve as a guide... Of course, the only way to know a language for sure is to see how it really is... Also, many things here are out of date. Updates welcome...

Existing Languages



ABCL is a family of languages developed by Akinori Yonezawa and his lab for "object-oriented concurrent programming". These languages are based upon a LISP or Scheme core with primitives for object-oriented concurrency. The language was further extended to use computational reflection to manage distributed computing and other language extensions in a way that is as seamless as possible to the user.




There are a lot of things to say about ADA. Anyone care to write a review?


I haven't done a comprehensive web search yet, but have been given this URL.
ADA was originally chosen as US DoD's language, and was meant to fulfill the Steelman specifications.



Agora is a prototype-based OO language à la SELF, that is used to explore reflective features (notably mixins).
Could anyone have a deep look and send insightful feedback?


Here's an FTP site, and now a WWW site.



Alma is a strongly-typed imperative language with constraints and quantifiers. It is an extension of a subset of Modula-2. It supports declarative programming ideas.


An Alma-0 compiler and syntax definition is freely available.



BETA is an OO language from those who invented SIMULA. It is the latest advance in OO programming.


  • The reference implementation is the commercial Mjolner BETA system; educational institutions may get discounts.
  • There is now a free interpreter, GBETA

  • C/C++

    This entry should definitely be updated and split into C, C++, Objective C, etc.


    "C" is a low-level, weakly typed, static, non-modular, non-generic, non-reflective language.
    "C" was designed in the early '70s as a portable assembler for the PDP series (remember? nominally compatible machines but with various word lengths), and it is still the state of the art in that area, although, as a systems programming language, it is more adapted to the PDP-11 than to modern architectures.
    In the late '80s, the small and elegant `K&R' C became the prey of an ANSI standards committee; it has now become ANSI Standard C.
    In the '80s, C (in our opinion wrongly) became the language of choice for general-purpose programming. In an attempt to add the features of general-purpose Object-Oriented languages to C, Bjarne Stroustrup created C++, a huge, bloated, hack-ridden monster. The name C++ is a pun meaning that one huge bloated thing was added to C, but that the value of the result is no more than that of C.

    Writing the formal semantics of C++ is left as an exercise to the reader.

    The only positive point about it is that it is a standard.
    There are also other less popular "object-oriented" versions of C, such as


    About C, get the C Bible by Kernighan&Ritchie (the original authors of C). There's also an online tutorial by Dave Marshall, Programming in C.
    About C++, just shoot(yourself.foot++); or if you must, learn to program C++ with the Globewide Network Academy (GNA)'s Introduction to Object Oriented Programming Using C++.

    Look on or any mirror site to find GCC, the GNU C Compiler, a free C/C++ compiler, and a lot of utilities written in C. Also, experimental features are being actively developed in egcs, whose releases are separate from the allegedly more "stable" gcc 2.8.
    A simpler free compiler is lcc.
    There is a free embeddable interpreter, EiC.
    Here are papers about why not to use C or C++ in general. This one is particularly meaningful as to what a language should be (and that C/C++ is not).
    C++ critique v3
    LC-Lint is the best existing typechecker for C.
    Ctools, part of issue one of Tom Lord's [MIA] ~twaddle (pronounced "twiddle twaddle"), might also prove useful.




  • Pros
    1. Everybody knows this language, and we wouldn't have to adapt to another language's syntax, as it is the one (now ANSI) standard in system programming (yuck).
    2. We may compile test programs immediately, without having to wait for the kernel and compiler to be done.
  • Cons

  • Actually, C and C++ are not high-level languages, and are just worth throwing into the nearest garbage can (poor garbage can), unless you're stuck with them, which you are on existing systems (themselves worth as much, but again, you're stuck with them until Tunes comes out).

    1. C/C++ do not include functions as first-class objects, thus creating a boundary between using and programming: it's a static language; you can't both program and use at the same time. That's the opposite of programmer/user friendliness.
    2. C/C++ is not a structured language: procedures are all global; that's why C/C++ will NEVER allow having independent light threads, and why it's impossible to have a lightweight multitasking system under C/C++. You may notice that this point is related to the preceding remark: if procedures were objects, you could include them individually inside thread objects, and then each thread would have its own independent code.
    3. C/C++ knows only early binding (i.e., it only compiles directly executable code), and hardly knows about dynamic library linking (in the case of C, it's not a language feature, only a linker feature); C/C++ considers a program to be complete, finished, and run in the absolute, without interaction with other programs except through the system; that's why all ROI in C must be explicitly done through system calls!
    4. The "we won't have to rewrite a compiler" argument doesn't stand: if the system is to be OOed, we'll have to adapt the compiler so that it produces OO code compliant to our system's requirements. Unless our system brings nothing that can't be done easily by replacing the standard library, it's impossible; so only front ends can be reused, which are trivial to write (although more difficult in the case of C than with other languages). All in all, "C" would only get in the way if used.
    5. As it's a low-level language, either we'll have to have low-level specs for the OS (as with Unix), so that we can't adapt to an architecture different from that for which the OS was designed; or we'll have to rewrite a great amount of the C/C++ code on each system adaptation. So we lose either system functionality or language portability. Interoperability of code is also impossible, unless we define a common target architecture that other computers will have to emulate.
    6. For the same reason (C/C++ being low-level), we can never achieve object-level security except by having a C/C++ program for each object, which either disallows making everything, however little, an object in the system, or requires many a low-level system (as opposed to language) declaration in a C source, and/or requires having tiny C/C++ programs, which contradicts the heavy compilation cost and the class philosophy of the language.
    7. It is remarkable that most security alerts in operating systems are related to buffer overruns in C code, because the language doesn't help in any way to automatically ensure safety -- unsafety is the default, and manually ensuring it is a long, tedious, difficult task. A higher-level language could fully automatize safety, making it the default, whereas it would be unsafe operations that would have to be explicitly coded.
    8. Horrible in C and dreadful in C++ is the lack of really first-class structured values. This makes C nothing more than a badly specified portable assembler (with very limited macro facility, and restrictions in calling conventions), and gives a completely obfuscated notion of object identity to C++.
    9. The C++ philosophy contradicts the idea of late user-object binding. C++ knows only of early compile-time object binding, or at best/worst, binding with compile-time defined virtual class realization through so-called virtual tables. So to make the slightest addition or change, you must recompile the whole application.
    10. Because the whole object system in C++ is based on braindead inheritance, enhancements in a program may mean complete rewrite of the type hierarchy.
    11. The C/C++ preprocessor allows simple macro definitions, but neither macro instructions nor recursive macro definitions. If #define DEF #define were possible, for example, it would be possible to Lempel-Ziv compress a source program from C to C. But cpp just plain sucks.
    Note that "C" offers absolutely no particular interest when cut off from all its standard compilers and libraries, for you can no longer port existing software. As TUNES won't support any standard "C" library (at least as support for native code), and requires a different compiler anyway, because its semantics are so different, "C" offers no interest on top of TUNES. Thus, "C" just has nothing to do with TUNES, except perhaps as a layer between TUNES and a POSIX-compliant system, if TUNES is to be implemented as an application under such a system (see the OTOP subproject).
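    Point 7 above can be made concrete: in a higher-level, bounds-checked language, the classic C buffer overrun becomes a loud, catchable error instead of silent memory corruption. A minimal sketch in Python (chosen purely for illustration; any safe high-level language would do):

```python
# In C, writing past the end of a buffer is undefined behavior and may
# silently corrupt adjacent memory. In a bounds-checked language, the
# same mistake is detected automatically: unsafety is not the default.
buf = [0] * 8  # an 8-element "buffer"

def write_at(buffer, index, value):
    """Store value at index; out-of-range accesses fail loudly."""
    buffer[index] = value

write_at(buf, 3, 42)        # in range: fine
try:
    write_at(buf, 100, 42)  # out of range: caught, not silently corrupting
except IndexError:
    print("overrun detected")
```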


    Cecil is an OO language developed at the University of Washington, with the participation of Craig Chambers, who wrote the SELF compiler.


    Concurrent Clean is a general purpose, higher order, pure and lazy functional programming language for the development of sequential, parallel and distributed real world applications.
    "Clean is a language in the spirit of other modern lazy functional languages like Haskell and Miranda. People familiar with these languages will have no difficulty to program in Clean. The Clean compiler has the nice property that it runs on small platforms (Mac, PC, Sun), while it compiles very quickly and produces code of state-of-the-art quality."

    Common LISP


    Common LISP is a dialect of the LISP family of languages. See the corresponding information below.


    Common LISP is the result of a standardization effort by Lisp vendors started with the commercialization of LISP in the early 80s. Because it strived toward backward compatibility with existing Lisp systems, the result is a huge monolithic ANSI standard language, with hundreds of built-in constructs for a megabyte worth of run-time system (not talking about add-on modules).

    However, the advantages of programming in Common LISP cannot be overestimated: everything a programmer usually needs is in the library, the object system is (quite) well integrated with the type system and the condition system (which is unique on its own). Greenspun's Tenth Rule of Programming states that "any sufficiently complicated C or Fortran program contains an ad hoc informally-specified bug-ridden slow implementation of half of Common Lisp".

    Books about Common LISP

    Free Implementations of CommonLISP

    Free software packages written in Common LISP



    COBOL is the COmmon Business-Oriented Language.
    It's a very old and stupid language. The kind of thing a sensible man wouldn't touch unless forced to. Because its syntax is so verbose, it makes the neophyte and non-programmer believe they actually understand programming better. But of course, what makes a good language is its semantics, not its syntax, and for actual programming, COBOL syntax gets in the way, while its semantics plain suck.
    If you're interested, check the [MIA] COBOL FAQ.


    Coq is a higher-order proof system based on the Curry-Howard isomorphism between propositions and types, and between proofs and terms in a pure functional language: a functional term is a proof of its type's realizability. By restricting itself to intuitionistic logic (i.e. no built-in choice axiom), it makes it possible to extract actual programs from proofs.
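    The Curry-Howard correspondence can be illustrated with a tiny example (written here in Lean rather than Coq syntax, purely for illustration): a proof of a proposition is literally a program whose type is that proposition.

```lean
-- Under Curry-Howard, the proposition A ∧ B → B ∧ A is a type,
-- and this function is at once a program of that type and a proof of it:
-- it takes evidence h for A ∧ B and builds evidence for B ∧ A.
theorem and_swap (A B : Prop) : A ∧ B → B ∧ A :=
  fun h => ⟨h.right, h.left⟩
```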
    The newer version of Coq does have programmable tactics (using CAML-light), extensible grammars, etc.
    The problem with Coq is that it is not reflective: there is one global meta-context, and no way to meta-translate expressions: a proof done in "Prop" will have to be done again in "Set" and in "Type"; there are no "proof sketches" as first-class objects.
    Coq has got a WWW page, which is no longer empty.
    The Coq distribution includes full docs.


    Dylan is a new "object oriented" DYnamic LANguage that aims at being compiled as efficiently as static languages, while providing an unequaled development environment. It's a lot like a LISP with an infix syntax and freed from backwards compatibility. The design group hoped that it might bring dynamic languages to C++ programmers, but that was before Java seized the market with its hype (but don't worry: even some Real Language programmers find Dylan nice).
    Dylan was developed jointly by Apple, Carnegie Mellon University, and Harlequin Inc. But Apple abandoned the project half-way, and CMU stopped its projects, while Harlequin finally published Dylanworks. Now, the difference between Apple and CMU is that because Apple was closed-source, all its code is now defunct and no one will ever see it again, whereas CMU made its code public domain, and it has since been taken over by a bunch of joyful hackers.



    Eiffel is a rather clean OO language originally designed by Bertrand Meyer, that stresses the concept of programming by contract, to achieve reliable industrial-strength programs.
    Unhappily, the language as it stands is not expressive enough to express all the constraints one may like to verify. Moreover, the language suffers from the choice of covariant typing in functions, which makes for quite an ad hoc type system.
    See Sather for a language that tries to fix some of the flaws of Eiffel.


    There is one and only one free implementation of Eiffel, SmallEiffel, being actively developed under the GPL.
    There are lots of commercial implementations, though I have only these pointers handy:



    Erlang is a mostly pure logic/functional programming language developed by Ericsson, used in real-life telecommunication applications and other industrial settings. Now free software, with lots of goodies, including courses and eddieware.
    Projects to compile Erlang into something faster than its traditional bytecode include HIPE and ETOS.


    Escher is a declarative, general-purpose programming language which integrates the best features of both functional and logic programming languages. It has types and modules, higher-order and meta-programming facilities, and declarative input/output.
    Escher also has a collection of system modules, providing numerous operations on standard data types such as integers, lists, characters, strings, sets, and programs. The main design aim is to combine in a practical and comprehensive way the best ideas of existing functional and logic languages, such as Gödel, Haskell, and \lambda-Prolog. Indeed, Escher goes well beyond Gödel in its ability to allow function definitions, its higher-order facilities, its improved handling of sets, and its declarative input/output. Escher also goes well beyond Haskell in its ability to run partly-instantiated predicate calls, a familiar feature of logic programming languages which provides a form of non-determinism, and its more flexible handling of equality. The language also has a clean semantics, its underlying logic being (an extension of) Church's simple theory of types.
    You can obtain a report about it.



    EuLISP is a dialect of the LISP family of languages. See the corresponding information below.


    EuLisp (European Lisp), is a modern object-oriented dialect of Lisp whose design was less minimalist than that of Scheme but less constrained by compatibility reasons than that of Common Lisp. In particular, it has a single namespace for variables and functions (like Scheme, and unlike Common Lisp) and its type system is perfectly integrated with the object system. Furthermore, the language makes use of the module system, so that every program may import only those parts of the language that it needs.


  • You may find the current EuLisp specification and sample implementations here.
  • The only currently active implementation is youtoo.



    FORTH is a low-level functional language for a stack-based virtual machine model.
    It is a very interesting language to study, both for its achievements and its shortcomings.


    The current Forth standard is ANSI's 1994 ANS FORTH. It replaces the FORTH-83 and FORTH-79 standards.
    FORTH is also being used as the standard language for Open Firmware, used to boot all modern computers in a portable way (see Firmworks).


    Online Books

    Some implementations

    Random pointers to sort out about FORTH


    Gema is a general purpose macroprocessor.


    Haskell is the reference among pure lazy functional programming languages. It serves as a bench for much research and teaching in pure functional languages. It has an elaborate type system including both ad hoc polymorphism (operator overloading) and uniform polymorphism (universal type quantification, plus restricted existential quantification). It might be called a functional object-oriented language.
    Gofer, a lightweight version of Haskell, has implementations that can be run on desktop computers.
    The Glasgow Haskell group, headed by Simon Peyton Jones, develops GHC, the Glasgow Haskell Compiler, using GCC as a backend.
    The Yale Haskell group, headed by Paul Hudak, also does Haskell development (including music in Haskell).
    See this FTP repository.
    pointers to sort out:


    Icon is a high-level, general-purpose programming language with many features for processing data structures and character strings. Icon is an imperative procedural language with backtracking. Its syntax is much like C and Pascal, but its semantics are far higher level.
    The Icon project (in Arizona) has a WWW page here.
    There's an OO extension to Icon, Idol.
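    Icon's goal-directed evaluation can be roughly approximated with generators in other languages. The following Python sketch (names purely illustrative, and of course lacking Icon's pervasive backtracking semantics) conveys the flavor of generate-and-backtrack evaluation:

```python
# Icon expressions produce a sequence of results and backtrack on failure.
# Python generators give a rough analogue: each generator yields
# successive candidate results, and iteration resumes where it left off.
def upto(n):
    """Roughly like Icon's (1 to n): generate successive values."""
    yield from range(1, n + 1)

def pairs_summing_to(total, n):
    """Backtracking search: every pair (i, j) in 1..n with i + j == total."""
    for i in upto(n):
        for j in upto(n):
            if i + j == total:  # "failure" just moves on to the next candidate
                yield (i, j)

print(list(pairs_summing_to(4, 3)))  # → [(1, 3), (2, 2), (3, 1)]
```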


    Java is some kind of revamped C++ dialect meant to be used as a universal WWW language.
    Java starts from the C++ syntax and object model, frees it from C compatibility and ugly hacks; frees it from manual memory management, pointer arithbuggics, and unsafe typecasting; then it tries to add just about every possible in-fad feature, without any clean design principle (how could they, starting from the C++ object model?). That is, they clean up C++ and remove much of its fat, but essentially follow the same general bloat-oriented design.

    They nonetheless provide interesting features (Interfaces), and overall, the language is much cleaner (hence better) than C++ for OOP programming (see the C++ critique v3). Note that much better than C++ does not remotely mean "good". And whatever "OOP" means, being good at it might not mean being much useful, either...
    Java is indeed two standards: firstly, their language (see above), but secondly, and perhaps more importantly, their low-level bytecode-based JVM virtual machine.
    Java people claim that their JVM allows for secure, efficient, portable bytecode. Of course, this is pure nonsense.
    Java is a hype-oriented language. Expect your usual computer hypists to talk a lot about Java. The main principle behind their constantly adding features to Java seems to be "poll the opinion to determine what feature will make it sell better", which of course leads to inconsistent design that doesn't work right.
    Because Java is the new fad, you'll see lots of Java implementations, development systems, and application software getting written; lots of Java->X and X->Java compilers (or more likely JVM->X, X->JVM), where X is about any language.
    Let's put aside the hype, and forget about those pretensions of Java being the mother of all programming languages. Then, Java is still a usable language, though with currently buggy implementations. However, whatever you'd want to do with it, you can do it much better with a cleaner language! Its only interest may lie in the alleged WWW-portability provided by the JVM, but many other languages now have compilers targeting the JVM (plus the JVM sucks), so this is not a particularly good argument for Java.
    Related Pages
    Here I'll add a few Java-related pages...


    Linda is a language for parallel programming, by publishing tuples in a global space.
    The idea seems nice, but it's basically a very particular case of what pi-calculus allows.
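    The tuple-space idea is simple enough to sketch in a few lines. This toy, single-process Python version (not the actual Linda API, and ignoring the blocking and concurrency that make Linda interesting) shows the publish-and-match model:

```python
# A toy tuple space in the spirit of Linda: processes coordinate by
# putting tuples into a shared bag ("out") and withdrawing tuples that
# match a pattern ("in"). None plays the role of a wildcard here.
class TupleSpace:
    def __init__(self):
        self.tuples = []

    def out(self, tup):
        """Publish a tuple into the space."""
        self.tuples.append(tup)

    def inp(self, pattern):
        """Withdraw the first tuple matching the pattern, or None if absent."""
        for i, tup in enumerate(self.tuples):
            if len(tup) == len(pattern) and all(
                p is None or p == v for p, v in zip(pattern, tup)
            ):
                return self.tuples.pop(i)
        return None

space = TupleSpace()
space.out(("result", 1, 42))
space.out(("result", 2, 43))
print(space.inp(("result", 2, None)))  # → ('result', 2, 43)
```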



    Originally invented in 1958 by John McCarthy, LISP (which originally stood for LISt Processing) is a family of languages that have always been at the cutting edge, because they stressed better semantics rather than fancy syntax.

    LISP is a self-extending higher-order dynamically typed strict (but with lazy data structures often implemented) non-pure functional language.

    It is the second oldest programming language, but also one with the nicest semantics. It's a good experimental platform for just any kind of computing; it has got very good support, including free portable development environments, integrated structured editors, and efficient "optimizing" compilers.

    Myths about LISP

    Many people think that "LISP is an interpreted language". This is either plain false, or else completely nonsensical. It is false, because there are a lot of LISP compilers, some of very high quality, and some producing code that runs faster than equivalent hand-written C code. Certainly, there are lots of LISP interpreters too, even lots of "toy" interpreters. This only shows that LISP is so well designed that it makes writing a working interpreter doable and fun. It's up to you to choose the right implementation. C is not an "interpreted language" just because interpreters exist for it. Finally, the statement is nonsensical, because being interpreted is purely an implementation issue: just about any language may be either string-interpreted or compiled to native code, or anything in between.


    Lisp source code is most often written in Cambridge Polish notation: from a set of atomic symbols, you build Symbolic EXPressions (S-exp or SEXP -- once was SEX), a SEXP being either atomic, or a list of smaller SEXP, read and written as the parenthesized space-separated sequence of the representations of its components.

    The basic data structure, the SEXP, is hence a recursive list, and provides a trivial syntax for writing programs: atomic symbols each denote a value, and a list is interpreted as a function call; the head element of the list is computed as a function, the tail of the list is computed as parameters given to it, and the result is what applying the function to the parameters returns. (Of course, SEXPs need not be interpreted this way; indeed, good implementations will compile them instead, for better performance).
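    The evaluation rule just described is short enough to sketch directly. This toy Python evaluator (with nested Python lists standing in for SEXPs, and without special forms, quoting, or macros) follows it literally:

```python
# A minimal sketch of the SEXP evaluation rule: an atom denotes a value
# (symbols are looked up in an environment); a list applies its evaluated
# head, as a function, to its evaluated tail.
def evaluate(sexp, env):
    if not isinstance(sexp, list):
        # atomic: a symbol names a value in the environment;
        # other atoms (numbers, etc.) denote themselves
        return env[sexp] if isinstance(sexp, str) else sexp
    head, *tail = sexp
    fn = evaluate(head, env)                  # head computes to a function...
    args = [evaluate(e, env) for e in tail]   # ...tail to its parameters
    return fn(*args)

env = {"+": lambda *xs: sum(xs), "*": lambda a, b: a * b}
# (+ 1 (* 2 3)) written as a nested Python list standing in for a SEXP
print(evaluate(["+", 1, ["*", 2, 3]], env))  # → 7
```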

    Knowledgeable computists will see that SEXPs are actually only a trivial syntax for abstract syntax trees. While programmers and language implementers of other languages spend so much time and effort mastering or developing weird fancy syntaxes that they spend years debugging, parsing, etc., Lisp programmers and implementers instead trivialize those tasks and can focus on semantics. At the same time, S-exps give a canonical embedding of source code as data, which is foremost for the development of meta-programs and reflective features.

    SEXP syntax is unobvious at first; not only is it unfamiliar to someone who has received a standard education, it also lacks the kinds of redundancy that make simple things in many other languages easy to read linearly at first sight, while imposing a systematic, repetitive constraint that is tedious to follow, especially without good editing tools (such as EMACS) and a treelike indentation of code (which is a different, useful kind of redundancy) of which most programmers are unaware. For these reasons, SEXPs have gained a bad reputation among illiterate programmers who don't see beyond the syntax, and LISP has been nicknamed such things as "Lots of Insipid and Stubborn Parentheses".

    On the other hand, SEXPs also avoid the ambiguities and limitations that make it so easy to write gross mistakes in many other languages, and that make tedious (when even possible) the elaboration of complex programs. After a short training, SEXPs appear as the representation for abstract syntax trees that they are, and they become as familiar to LISP programmers as the infix syntaxes considered "normal" by other people (which after all is a matter of convention and education); only SEXPs have lots of additional advantages, as already described. There actually used to be another, more conventional syntax for LISP, the M-exp, but it fell out of use because S-exps were so much more practical to Lispers, whereas M-exps led to endless arguing about which way to write things was fanciest, and about how to translate between code and data representations.

    The advantages brought by the trivial SEXP syntax are precisely the strongest asset of the LISP family of languages, and all stem from the fact that the programmer's brain is literally liberated from the yoke of syntax. SEXPs mean "drop the syntactic brainfuck, and get to the actual semantic problem". Paradoxically, the uncommon character of LISP's syntax has also been the biggest brake on its spreading widely, so that LISP's strongest asset is also one of its major marketing weaknesses.

    A way by which LISP can benefit from both the reflective features of its abstract syntax, and from the ease of access of more "conventional" syntaxes, is to have first-class syntaxes, as in the GUILE system, where new fancy syntaxes can be seamlessly used, that are all canonicalized to SEXP, while SEXP can be pretty-printed to them.

    Another system, Dylan, started with the SEXP syntax and concepts developed by Lispers freed of syntactic concerns, and then dropped the SEXP syntax, to provide a cutting-edge state-of-the-art language, though one that might no longer benefit from the same freedom of thought as LISP provided (another, more modest such Scheme dialect is Pico).


    Studying all the dialects of LISP is enough work for a lifetime. However, there seem to be two main standards as for LISP: Common LISP and Scheme. Other notable LISP dialects are EuLisp, and elisp (the EMACS Lisp from the famous GNU Emacs editor). An interesting derivative is Dylan.
    Each of these languages has a section of its own in this Review page. Here is a quick summary:

    Lisp Machines

    [Is anyone willing to write a short critique of Lisp machines here?] See this Technical Summary on Lisp Machines

    Lisp Resources




    Short description

    m4 is a powerful (Turing-equivalent, unlike CPP) macro-expansion program designed as a preprocessor to more stubborn languages. It's ugly, but it's fairly standard, and available on most platforms, including a fine GNU version.


    The only fine implementation of m4 is GNU m4 1.4. Avoid the crippled and broken commercial versions sold by UNIX vendors.


    The semantics of m4 are bad: it uses back-door evaluation in a dynamically-bound global environment as the only way to program, without any way to safely quote text (not to speak of quasiquotation).
    Thus, users who want to do complex things often find they must compile manually to continuation passing style with manual fluid-bindings.
    Avoid m4 but for quick hacks, and when the environment has already been setup for it (e.g. FvwmM4, sendmail configuration, etc).

    Quick critique of m4 as a programming language




    [MIA] ML (originally the Meta Language for a theorem prover) is a class of (strongly) typed functional languages.
    ML comes in two main flavors:
  • SML, with its docs and implementations SML/NJ (same as here, where you could look for "Implementation work using ML"), Edinburgh ML, Moscow ML, and 
  • CAML: the impressive CAML, with its implementation Objective CAML (and the less featureful but smaller good old CAML-Light, which has been ported to the PalmPilot, DOS (ancient versions), etc). See their FTP site.
  • SML/NJ is an efficient but hyper-heavy implementation: don't use it with less than 32MB of memory. I don't know about Edinburgh ML. Moscow ML is an SML implementation using the CAML-light run-time. CAML-light is a very lightweight bytecode-based implementation that fits almost any 32- or 64-bit computer; as a measure of its usability, it has been used successfully for a game under X11. The new Objective CAML compiles to either portable bytecode or native code, and is meant to compete successfully with SML's efficiency, at a much lighter cost and resource requirements, with a much better and cleaner module system, and with unequaled objects.
    SML has got a powerful but kludgy module system, only fully implemented in SML/NJ. OCAML has got an equivalently powerful and much cleaner module system that allows separate compilation; plus it has objects, which SML hasn't; and it now has built-in multithread support.
    Here are some interesting software projects I've found using ML:


    Modula-3 is the successor of Modula-2+, itself an evolution of Niklaus Wirth's Modula-2, developed independently of Wirth's own works (which instead gave birth to Oberon).
    It is a language that stresses simple/safe semantics, with modularity, OO, garbage collection, and multiprogramming. Special "unsafe" modules are allowed to mess with implementation details.
    Click here for more information.
    There are free as well as commercial Modula-3 compilers available.


    Napier88 is a persistent programming system from the University of St Andrews.


    Oberon (now system 3, with the older V4 still available) is the latest modular OO language by Niklaus Wirth (the author of Pascal and Modula-2).
    See The Oberon Reference Site.


    Orca is a language for distributed computing, co-developed with the Amoeba distributed OS.


    Pascal was a toy written by Prof. Niklaus Wirth to teach how to compile computer languages. Unhappily it has been taken seriously by many people. Well, let's admit that the cheap Turbo-Pascal once revolutionized programming on personal computers.
    See Modula-2, Modula-3, and Oberon for more serious successors of Pascal.
    Brian Kernighan (author of C as well as of much Pascal software) once wrote an article, "Why Pascal is Not My Favorite Programming Language", which is quite obsolete now that modern Pascals (e.g. Turbo-Pascal) are quite equivalent to C (which also shows that C sucks as much as this toy Pascal language).
    Actually, there seems to be an "Extended Pascal" standard. I don't know how it compares to Modula-3 or Oberon, but it came too late, with too little support, and with the problem of being burdened by Pascal compatibility. See gpc, the GNU Pascal Compiler.


    Perl is a language for easily manipulating text, files, and processes. It provides easy ways to do many jobs that were formerly accomplished (with difficulty) by programming in C or one of the shells. It is THE language of choice for hacking programs oriented toward interfacing UNIX/Internet protocols.
    Perl4 lacked a lot of expressiveness required for an all purpose language, but already was a great tool for what was clumsily done before with horrible shell/awk/sed scripts and gross C hacks.

    Perl5 has all the good features of Perl4 (a few misfeatures were dropped and replaced), but it's a full-fledged all-purpose language with a most powerful OO module system.

    It is interfaced to lots of libraries (libc, Tk, ncurses, DB/gdbm/ndbm, etc), has got lots of modules to manage various internet protocols, and if by chance it isn't interfaced to your favorite software yet, this can be done through a well-documented extension system.
    And here's a Perl 5 page for people on the bleeding edge. It's a real language, with powerful OO extensions.
    WWW Pages: perlWWW; Perl
    As the name "Practical Extraction and Report Language" suggests, Perl aims at being effective in practice, not at developing a nice and simple semantics (though whenever nice things are effective, Perl does them too). Thus after a short learning time, it becomes a great tool for jobs of interfacing software, converting file formats, making small network daemons, etc. Perl is perfectly adapted to the Unix philosophy and practice. However, it is a bad choice when long-term/wide-area consistency is required.
    The principal advantage of Perl is that you can quickly hack up a working program to do text/binary manipulations that would be quite cumbersome using previous tools, which were either too generic, with a cumbersome interface to Internet-standard human-readable files, or too specific, unable to express more complex file manipulations. Perl allows you to do things in tons of different ways, so everyone can find one that fits his taste and thinking pattern.
    The principal disadvantage of Perl is that, by allowing too many things as valid programs, it makes it difficult to specify programs and to verify the bugs out of them; the terse syntax that makes hacking quicker makes debugging slower. This is typical of traditional non-meta-programming development, where a compromise between writability and readability is needed: only a meta-programming environment could help transform a valid program into a clean valid program of equivalent meaning.
    Another current disadvantage of Perl is that the current interpreter may start and run too slowly to cope with the usage pattern of an intensive server. The Perl compiler being developed might alleviate this problem.
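    To give a flavor of the extract-and-report jobs described above, here is a hedged sketch in Python (Perl itself would be terser); the log format, host names, and field layout are invented purely for illustration.

```python
import re

# Hypothetical log lines: "<host> <status> <bytes>" -- this format is
# invented for the example, not taken from any real tool.
log = """\
alpha 200 512
beta 404 0
alpha 200 1024
gamma 500 0
"""

# Tally bytes served per host for successful (200) requests --
# the kind of one-off extraction task Perl was built for.
totals = {}
for line in log.splitlines():
    m = re.match(r"(\S+)\s+(\d+)\s+(\d+)", line)
    if m and m.group(2) == "200":
        host, _, nbytes = m.groups()
        totals[host] = totals.get(host, 0) + int(nbytes)

print(totals)  # {'alpha': 1536}
```

    In Perl this whole loop collapses into a few lines of pattern-match-and-hash code, which is precisely its appeal.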


    Prolog is a language for programming based on Horn-Clause logic (not even full first-order logic).
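    A Horn clause is an implication with at most one positive literal (a head); Prolog programs are sets of such clauses. As a minimal illustration, here is a propositional forward-chaining sketch in Python; the proposition names are invented for the example, and real Prolog resolution works backward over first-order terms with unification, which this sketch does not attempt.

```python
# Minimal forward chaining over propositional Horn clauses.
# Each rule is (body, head): body is a set of premises, head the conclusion.
rules = [
    (set(), "parent_tom_bob"),           # a fact: empty body
    (set(), "parent_bob_ann"),
    ({"parent_tom_bob", "parent_bob_ann"}, "grandparent_tom_ann"),
]

known = set()
changed = True
while changed:                           # iterate to a fixed point
    changed = False
    for body, head in rules:
        if body <= known and head not in known:
            known.add(head)
            changed = True

print("grandparent_tom_ann" in known)    # True
```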


    Python is "an interpreted, interactive, object-oriented programming language. It incorporates modules, exceptions, dynamic typing, very high level dynamic data types, and classes. Python combines remarkable power with very clear syntax. It has interfaces to many system calls and libraries, as well as to various window systems, and is extensible in C or C++. It is also usable as an extension language for applications that need a programmable interface. Finally, Python is portable: it runs on many brands of UNIX, on the Mac, and on MS-DOS and Win32."
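    A small taste of the features the quotation lists (classes, exceptions, dynamic typing, high-level data types); the `Stack` class is made up for the example.

```python
# Classes, dynamic typing and exceptions in a few lines.
class Stack:
    def __init__(self):
        self.items = []              # a dynamic, heterogeneous list
    def push(self, x):
        self.items.append(x)
    def pop(self):
        if not self.items:
            raise IndexError("pop from empty stack")
        return self.items.pop()

s = Stack()
s.push(42)
s.push("hello")                      # dynamic typing: any object fits
print(s.pop(), s.pop())              # hello 42
try:
    s.pop()                          # popping when empty raises
except IndexError as e:
    print("caught:", e)
```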


    Carl Sassenrath's REBOL
    The language is not finished, but what's already available looks promising. The core language, which is available, looks like LOGO: a LISP with the parentheses removed in exchange for fixed-arity functions. The semantics also borrow a lot from LISP and Scheme, which is very good. The promised features for future development look great, too. All in all, REBOL has improved a lot since the original papers of 1997. A lot of things are still undefined, so let's wait and see if REBOL lives up to expectations. If it does, it will be time to write a free implementation.
    On the flip side, the overloading of structures in a MUMPS/Perl/Tcl way looks like it will constrain implementations a lot, unless some standard type inference/declaration/verification system (à la Common Lisp) is promoted.


    RPL ("Reverse Polish LISP") is the language from HP 28/48 calculators. Despite their lack of horsepower, these calculators are much more usable and friendly than my unix workstation. Let us try to determine why...
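    The "Reverse Polish" part of RPL means postfix notation over a stack: operands are pushed, operators consume the topmost entries. A toy evaluator in Python sketches the idea (the operator set is deliberately tiny; real RPL has a far richer object and stack model):

```python
# Toy evaluator for RPL-style postfix ("Reverse Polish") expressions.
def rpn(tokens):
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b}
    stack = []
    for t in tokens:
        if t in ops:
            b, a = stack.pop(), stack.pop()   # top of stack is 2nd operand
            stack.append(ops[t](a, b))
        else:
            stack.append(float(t))
    return stack[-1]

print(rpn("3 4 + 2 *".split()))  # 14.0
```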

    Pros and Cons


    SAC, or Single-Assignment C, is a functional language with C-like syntax, especially designed for highly optimized parallel intensive numerical computations. In that it is a successor to SISAL.


    Sather is a free Object-Oriented programming language rival of Eiffel.
    From their homepage: Sather is an object oriented language designed to be simple, efficient, safe, flexible and non-proprietary. One way of placing it in the "space of languages" is to say that it aims to be as efficient as C, C++, or Fortran, as elegant as and safer than Eiffel, and support higher-order functions and iteration abstraction as well as Common Lisp, CLU or Scheme.
    See also another implementation, Sather-K



    Scheme is a dialect of the LISP family of languages. See the corresponding information above.


    Scheme is a minimalistic dialect (some say a skimmed version) of the LISP language, in which everything is built upon the simplest constructs. It is the language that proved to the world the benefits of lexical scoping over dynamic scoping.
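    Lexical scoping, which Scheme championed, means a closure captures the binding visible where it was written, not whatever happens to be bound where it is called. Python inherited this discipline, so the point can be sketched in it (the names are invented for the example):

```python
# Lexical scoping: the inner function's n refers to make_adder's
# parameter, fixed at definition site, not to any n at the call site.
def make_adder(n):
    def add(x):
        return x + n
    return add

add5 = make_adder(5)
n = 100                   # a different, global n; the closure ignores it
print(add5(3))            # 8, not 103
```

    Under dynamic scoping, `add5(3)` would have looked up whatever `n` was bound at call time and returned 103; that unpredictability is exactly what lexical scoping eliminates.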


    Scheme is among the few languages that have a perfectly defined formal semantics, and not too kludgy a one at that. Scheme programmers often find it unpleasant to program in any other language.
    However, Scheme suffers from minimalism, and Common Lisp programmers often wish they had a unified library in Scheme.
    The R5RS (Revised^5 Report on the Algorithmic Language Scheme), which includes the language definition, the formal semantics, etc., is available in various formats from Richard Kelsey's ftp site along with some other Scheme-related material; it is also available in HTML format from

    Scheme Resources

    Free Implementations

    Here are notable free implementations of the Scheme dialect (and its variations). For more, or for commercial implementations, see the FAQ and above resources.

    Free software based on Scheme




    SELF is a pure prototype-based Object-Oriented programming language, based on very simple concepts which allow efficient implementations.
    SELF uses trivial strong typing (much like LISP): there is no "typecasting" backdoor to object representation, but there is no elaborate type hierarchy either, just one unique static type.
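    The prototype-based model can be sketched in a few lines of Python: there are no classes, only objects that clone other objects and delegate missing slots to their parent. This is a hedged illustration of the idea, not of SELF's actual (much more sophisticated) implementation; the `Proto` helper and slot names are invented.

```python
# Prototype objects with delegation, in the spirit of SELF:
# an object is a bag of slots plus an optional parent to delegate to.
class Proto:
    def __init__(self, parent=None, **slots):
        self.parent = parent
        self.slots = slots
    def get(self, name):
        if name in self.slots:
            return self.slots[name]
        if self.parent is not None:
            return self.parent.get(name)   # delegate lookup upward
        raise AttributeError(name)
    def clone(self, **overrides):
        return Proto(parent=self, **overrides)

point = Proto(x=0, y=0)           # a prototype, not a class
p = point.clone(x=3)              # p overrides x, delegates y
print(p.get("x"), p.get("y"))     # 3 0
```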

    SELF Resources

    Free software using SELF


    SIMULA was one of the first "OO" languages (or is it the first?), back in the sixties. C++ is to C what SIMULA is to Algol.
    There are still people using it at DIRO in Montreal.
    The authors of SIMULA have since made a much greater language, BETA.


    Sisal is a pure functional language that beats FORTRAN at number crunching, especially as it allows lots of automatic optimizations for parallel architectures.
    Unhappily, while focusing on number crunching performance, Sisal forgot many of those things that make functional programming so cool, and it seems that they got multidimensional arrays wrong, so there's room for a better language than SISAL.
    People at LLNL have decided to stop developing SISAL so as to focus on their main job: designing weapons of mass destruction, for which they'll use FORTRAN and C++. Let's hope these lame languages will help confuse them in their evil quest for a worse world. Meanwhile, former SISAL developers are working on a sequel to SISAL, SAC (single-assignment C).




    Smalltalk is a hackish class-based Object-Oriented programming language with reflective capabilities.
    Smalltalk can be viewed as an operating system of sorts, more so if it is not running on another OS. It is fine grained, having 30,000-60,000 objects, rooted in some 200 classes; some big Smalltalk systems have over 1 million objects. No differentiation-separation is made between System and Application software: it's all one big sea of objects. It has no kernel, but the VM serves a similar purpose. It has extensive, relentless bounds checking. It is the most consistent OO language. Everything is an object, no exception: every number, every letter. No object can directly affect the state of another object, only indirectly, by messages. No pointer arithmetic can occur.
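    The "only by messages" discipline mentioned above can be sketched in Python: state is private to the object, and the only way in is to send it a message that it dispatches by selector name. This is a hedged toy, not Smalltalk's real method-dictionary machinery; the `Counter` object and its selectors are invented.

```python
# State is touched only by the object's own methods; outsiders must
# send a message (a selector name), never poke at the state directly.
class Counter:
    def __init__(self):
        self._count = 0                      # private state
    def send(self, selector, *args):
        return getattr(self, "_" + selector)(*args)
    def _increment(self):
        self._count += 1
    def _value(self):
        return self._count

c = Counter()
c.send("increment")
c.send("increment")
print(c.send("value"))                       # 2
```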


    Hum, better take a look at Simula and Smalltalk: A Social and Political History, Benedict Dugan, (c) 1994.
    Smalltalk was inspired by Simula. Smalltalk was developed by Alan Kay's team at Xerox PARC in the 1970s and early 1980s. Xerox famously sat on some amazing technologies, throwing away many great opportunities that were later developed instead by other companies: Ethernet networking (3Com), PostScript (Adobe), laser printers (Apple), windowed graphical user interfaces (Apple, Microsoft), and Smalltalk (ParcPlace-Digitalk is now in much trouble due to overpricing Smalltalk).
    Then Apple Computer's Steve Jobs got Kay and many of the PARC team to move to Apple. Some of the Smalltalk team moved to Digitalk. Then some Digitalkers moved to Apple. Sometime during all this, Apple licensed Smalltalk-80.
    As corporate America often did and does, Apple sat on advanced stuff it had, and did nothing with it. In the meantime, IBM developed Visual Age, and now has 20,000-30,000 Smalltalk programmers.

    Smalltalk Resources

    Free Implementations

    Free software using Smalltalk


    TCL is a logically challenged interpreted language, which is popular because of Tk, a powerful toolkit for programming the X-Window environment. Those who really manage to understand its semantics try to explain it with words such as `binding-time', `runtime', `coffee-time', and pretend it's not that bad after all. Tcl has had lots of success since it is a strongly-hyped language.
    Many real languages, including OCaml, Perl, Python, Scheme, and more, have access to Tk. And other languages have other interfaces to GUIs.
    Deeper insight can be found at the language critique page or in those Comparisons of Tcl with other systems

    A new HLL

    Pros and Cons


    1. We can design the syntax to fit our needs and ideas, so that it's much easier to use. Moreover, even C isn't our natural language, and whatever language we use, there will be an adaptation period before we are fluent in it.
    2. We can correct the shortcomings of any existing language we might have used.
    3. Portability: both the system and the language may be equally easy to port. All you need to do is port an LLL compiler back-end or interpreter, and the hardware-specific lolos (low-level objects).
    4. The language is perfectly well adapted to the system. No need for bizarre and slow language-to-system-call translation.
    5. Being efficient as an interpreted language, it may serve as a shell language as well as a programming language; being powerful, and easy to specialize via standard libraries, it also replaces small utility languages (sed, awk, perl, etc); finally, being high-level and aware of the relations between objects, it is easily adapted into an AI language. So there is no more need to learn a different language for every application; the same language is used for (almost) everything; no more need to learn new syntaxes each time.


    1. We have to relearn a new language syntax. But as we may choose whatever syntax pleases us (and support multiple automatically translatable syntaxes), this is no big deal, really.
    2. No existing compiler can be used directly. This is no big deal either: front ends are easy to write, and no existing back end can fit an interestingly new OS's object format, calling conventions, and security requirements. Moreover, since our system has a brand-new design, we would have to learn restrictions on our way of programming even with a traditional language.
    3. We have to debug the language specification as we use it. But this can prove useful for refining the language and the system specs. This is an interesting point.


    To Do on this page