Lisp HUG Maillist Archive

Understanding Lisp History...

I just ran across a relatively recent dissertation that goes a long way toward explaining the reasons for the present state of Common Lisp.

https://web.wpi.edu/Pubs/ETD/Available/etd-090110-124904/unrestricted/jshutt.pdf

I wasn’t around in the earliest days of the evolution of Lisp, and so I was never directly aware of Funarg problems, had never heard of Fexprs, and was only aware of macro hygiene issues by way of an introduction by Paul Graham in “On Lisp” and the surrounding discussions among Schemers.

Reading through this dissertation helps a lot to illuminate the issues of the time, and how they were resolved in the definition of Common Lisp. But I fear we might also have lost some expressive power along the way.

For me, the reading is a bit slow going and technically difficult, but seems worthwhile to expend the effort to understand.

This dissertation has given rise to a new language in the Scheme family known as Kernel. It doesn’t sound like a practical utilitarian language in the same sense as our Common Lisp today. But it is interesting nonetheless. I’m still slogging my way through, and looking for a compelling case to include some semblance of Fexprs. Offhand, so far, even though Macros are intentionally limited forms of Fexprs, they might have all the qualities we actually need in our day-to-day practical programming.

I’d love to hear comments from anyone more knowledgeable.

- DM

Re: Understanding Lisp History...

Kent Pitman explained why we have macros and not fexprs. The paper is from 1980.

http://www.nhplace.com/kent/Papers/Special-Forms.html


There has recently been some research on fexprs (especially what you found, the Kernel language), but the main problem is to make them compilable in a useful way. Expressive power also means that you have a lot of things possibly happening at runtime, which makes it very difficult to understand what your program is doing.

On the Lisp Machine one had fexprs, too. But they were not much used, and most of the software used macros, especially because fexprs could not be usefully compiled at that time. Macros could be expanded at compile time, thus all the developers were using them...
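The macro/fexpr trade-off can be sketched with a toy model (Python here, purely for illustration; `evaluate`, `expand_when`, and `fexpr_when` are invented names, not any real Lisp API): a macro is a source-to-source rewrite applied once before execution, while a fexpr is a run-time function handed its operand forms unevaluated.

```python
# Toy model of the difference.  Forms are nested tuples such as
# ("when", ("oddp", 3), 42).

def evaluate(form, env):
    # Minimal evaluator for the toy forms used below.
    if isinstance(form, tuple):
        op, *args = form
        if op == "if":
            test, then, alt = args
            return evaluate(then if evaluate(test, env) else alt, env)
        if op == "oddp":
            return evaluate(args[0], env) % 2 == 1
    if form == "nil":
        return None
    if isinstance(form, str):
        return env[form]      # variable reference
    return form               # numbers etc. are self-evaluating

def expand_when(form):
    # "Macro": rewrite (when test body) into (if test body nil) ONCE,
    # before the program runs -- a compiler can do this ahead of time.
    _, test, body = form
    return ("if", test, body, "nil")

def fexpr_when(env, test_form, body_form):
    # "Fexpr": an ordinary function called at run time with the
    # UNEVALUATED operand forms.  A compiler cannot see through this,
    # which is the compilability problem mentioned above.
    if evaluate(test_form, env):
        return evaluate(body_form, env)
    return None
```

The macro version leaves nothing special behind at run time, which is exactly why the Lisp Machine developers reached for macros.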


On 23.04.2017 at 14:58, David McClain <dbm@refined-audiometrics.com> wrote:

I just ran across a relatively recent dissertation that goes a long way toward explaining the reasons for the present state of Common Lisp.

https://web.wpi.edu/Pubs/ETD/Available/etd-090110-124904/unrestricted/jshutt.pdf

I wasn’t around in the earliest days of the evolution of Lisp, and so I was never directly aware of Funarg problems, had never heard of Fexprs, and was only aware of macro hygiene issues by way of an introduction by Paul Graham in “On Lisp” and the surrounding discussions among Schemers.

Reading through this dissertation helps a lot to illuminate the issues of the time, and how they were resolved in the definition of Common Lisp. But I fear we might also have lost some expressive power along the way.

For me, the reading is a bit slow going and technically difficult, but seems worthwhile to expend the effort to understand.

This dissertation has given rise to a new language in the Scheme family known as Kernel. It doesn’t sound like a practical utilitarian language in the same sense as our Common Lisp today. But it is interesting nonetheless. I’m still slogging my way through, and looking for a compelling case to include some semblance of Fexprs. Offhand, so far, even though Macros are intentionally limited forms of Fexprs, they might have all the qualities we actually need in our day-to-day practical programming.

I’d love to hear comments from anyone more knowledgeable.

- DM

Re: Understanding Lisp History...

Although my Lisp days started before Common Lisp, when fexprs ruled the Earth, I’d not seen Kernel before. Some interesting issues with it are described here.

On Apr 23, 2017, at 7:58 AM, David McClain <dbm@refined-audiometrics.com> wrote:

I just ran across a relatively recent dissertation that goes a long way toward explaining the reasons for the present state of Common Lisp.

https://web.wpi.edu/Pubs/ETD/Available/etd-090110-124904/unrestricted/jshutt.pdf

I wasn’t around in the earliest days of the evolution of Lisp, and so I was never directly aware of Funarg problems, had never heard of Fexprs, and was only aware of macro hygiene issues by way of an introduction by Paul Graham in “On Lisp” and the surrounding discussions among Schemers.

Reading through this dissertation helps a lot to illuminate the issues of the time, and how they were resolved in the definition of Common Lisp. But I fear we might also have lost some expressive power along the way.

For me, the reading is a bit slow going and technically difficult, but seems worthwhile to expend the effort to understand.

This dissertation has given rise to a new language in the Scheme family known as Kernel. It doesn’t sound like a practical utilitarian language in the same sense as our Common Lisp today. But it is interesting nonetheless. I’m still slogging my way through, and looking for a compelling case to include some semblance of Fexprs. Offhand, so far, even though Macros are intentionally limited forms of Fexprs, they might have all the qualities we actually need in our day-to-day practical programming.

I’d love to hear comments from anyone more knowledgeable.

- DM

------------------
Christopher Riesbeck
Home page: http://www.cs.northwestern.edu/~riesbeck
Calendar: http://www.cs.northwestern.edu/~riesbeck/calendar.html


Re: Understanding Lisp History...

For anyone interested, here's a Lisp 1.5 interpreter, with FEXPRs, dynamically scoped variables, and GC, written in MC6800 assembler (very readable, IMO). It was only 4K (not M, not G) long.

https://github.com/guitarvydas/frits-van-der-wateren-lisp

pt

_______________________________________________
Lisp Hug - the mailing list for LispWorks users
lisp-hug@lispworks.com
http://www.lispworks.com/support/lisp-hug.html

Re: Understanding Lisp History...

On 2017-04-23 10:36 AM, Rainer Joswig wrote:
> ... but the main problem is to make them compilable in a useful way.

I have to wonder if this assumption should be questioned.

The idea that compilation was necessary was all the rage in the '80s.

YACC, and all of the accompanying trade-offs (e.g. LR, LALR, etc.), were researched and invented because it was simply unimaginable to use backtracking parsers back then.

In the late '90s we used backtracking parsers (the TXL language) to slurp in millions of lines of COBOL and highlight Y2K issues.

Today, I use Prolog for parsing.  Prolog allows me to easily parse 2D 
diagrams and convert them to running code.  I have a book on my shelf 
which expends incredible energy trying to show the theories of how to 
fit "above" and "below" relationships into the YACC/CFG paradigm, to 
little avail, since Prolog (and probably miniKanren) does it so easily.
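For concreteness, here is what a minimal backtracking recursive-descent parser looks like (a Python sketch of the general technique, not TXL or Prolog; the grammar is a made-up example):

```python
# Grammar:  S -> 'a' S 'b' | 'ab'
# A predictive LL/LALR-style parser needs lookahead machinery here; a
# backtracking parser just tries the first alternative and, on failure,
# rewinds to the saved position and tries the next.

def parse_s(text, pos):
    # Alternative 1: 'a' S 'b'
    if pos < len(text) and text[pos] == 'a':
        inner = parse_s(text, pos + 1)
        if inner is not None and inner < len(text) and text[inner] == 'b':
            return inner + 1
    # Backtrack from `pos` -- alternative 2: literal 'ab'
    if text[pos:pos + 2] == 'ab':
        return pos + 2
    return None  # no alternative matched

def accepts(text):
    return parse_s(text, 0) == len(text)
```

The cost is re-parsing on failed alternatives, which is exactly the cost that was considered prohibitive back then and is cheap now.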

And, neural nets were almost discarded back then.  Now, given increased 
processing power, deep learning drives cars.

Uncompiled JS is now the most ubiquitous language, found in every browser.

pt


Re: Understanding Lisp History...

Neural networks are compiled (-> OpenCL), even to specialized hardware. JavaScript is compiled (see for example https://en.wikipedia.org/wiki/Chrome_V8) - and it has no macros and no fexprs. Prolog has compilers.

If you want fully dynamic runtime transformations of Lisp code, you might want to think about: why and at what cost? A language/implementation is a tool and the features are selected knowing the tradeoffs. Using runtime transformations may give you more flexibility, but then you get more possibilities for runtime errors in complicated dynamic code...


On 23.04.2017 at 18:38, Paul Tarvydas <paultarvydas@gmail.com> wrote:

On 2017-04-23 10:36 AM, Rainer Joswig wrote:
... but the main problem is to make them compilable in a useful way.

I have to wonder if this assumption should be questioned.

The idea that compilation was necessary was all the rage in the '80s.

YACC, and all of the accompanying trade-offs (e.g. LR, LALR, etc.), were researched and invented because it was simply unimaginable to use backtracking parsers back then.

In the late '90s we used backtracking parsers (the TXL language) to slurp in millions of lines of COBOL and highlight Y2K issues.

Today, I use Prolog for parsing.  Prolog allows me to easily parse 2D diagrams and convert them to running code.  I have a book on my shelf which expends incredible energy trying to show the theories of how to fit "above" and "below" relationships into the YACC/CFG paradigm, to little avail, since Prolog (and probably miniKanren) does it so easily.

And, neural nets were almost discarded back then.  Now, given increased processing power, deep learning drives cars.

Uncompiled JS is now the most ubiquitous language, found in every browser.

pt


Re: Understanding Lisp History...

On 23 Apr 2017, at 15:36, Rainer Joswig <joswig@lisp.de> wrote:
> 
> There has been recently some research on fexprs (especially what you found, the Kernel language), but the main problem is to make them compilable in a useful way. Expressive power also means that you have a lot of things possibly happening at runtime which make it very difficult to understand what your program is doing.

It seems to me that something like this:

(defun f (p a b c)
  (funcall (if p a b) c))

is a pretty good example of the kind of horrible nightmare you get into: what is it meant to do if a is a fexpr and b is an expr?

I first learnt Lisp in dialects (Standard Lisp & Cambridge Lisp) which had fexprs, and I'm fairly sure that I have no idea what they would have done with code like that except I think either they'd have treated both like exprs or something completely undefined would have happened.

If you want something actually sensible to happen, then I think you end up with just huge and pervasive changes to everything: every function which takes a functional argument has to be prepared for it to be a fexpr, and thus has to evaluate its other arguments only in the case that the thing isn't a fexpr (and it therefore needs the lexical environment those arguments came from, so it can evaluate them).
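The ambiguity can be made concrete with a toy model (Python, not any real Lisp; all the names here are invented). A conventional evaluator reduces the argument to a value before the call, so if the operator chosen at run time turns out to want the source form, that form is already gone.

```python
def ordinary_fn(value):
    return value                 # an "expr": wants the evaluated argument

def fexpr_like(form):
    return ("got-form", form)    # a "fexpr": wants the unevaluated form

def call(p, a, b, arg_form, env):
    # Models (funcall (if p a b) c): pick the operator at run time.
    op = a if p else b
    # The evaluator commits to evaluating arg_form up front...
    arg_value = env[arg_form] if isinstance(arg_form, str) else arg_form
    # ...which is wrong whenever op is the fexpr: it receives the value
    # where it wanted the form.  Doing better forces every call site to
    # delay evaluation until the operator's identity is known.
    return op(arg_value)
```

Running it shows the fexpr being handed the value 7 rather than the variable name it wanted, which is the "huge and pervasive changes" problem in miniature.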

Also, macros can get you quite a lot of what FEXPRs could (perhaps all of it?) albeit with different names for things (so, not EVAL to get the value of something) and you can't poke inside forms and expect to be able to get at the lexical environments of bits of them:

https://gist.github.com/tfeb/865bc9bf3f24d9c65574ad42df2c8f6b

(This is just a hack because I was bored waiting for an apparently infinitely slow supercomputer to actually get around to running my code: it's not meant as a serious answer to the problem!)

--tim



Re: Understanding Lisp History...

On 25 Apr 2017, at 22:18, Christopher Riesbeck <c-riesbeck@northwestern.edu> wrote:
> 
> FUNCALL didn't exist in Lisp 1.5 (or 1.6). APPLY was limited to EXPRs and LAMBDAs, according to the Lisp in Lisp definition in the Stanford 1.6 manual.

I think that Lisp 1.5 / 1.6 were Lisp-2s (that sounds very odd to say)?  Cambridge Lisp (which I started with) is a Lisp-1, so it doesn't need funcall (it has apply).   I don't remember what Standard Lisp was (but since the implementations I used were built on Cambridge Lisp they were probably Lisp-1s too).   From the manual it looks like apply in Cambridge Lisp probably doesn't work with fexprs, but it's not clear.

I think the answer is that the semantics of old lisps were just incoherent (certainly Cambridge Lisp had the normal interpreter-is-dynamically-scoped / compiler isn't thing).   It's easy (well, easy for me) to forget how much of this mess CL & Scheme cleaned up.



Re: Understanding Lisp History...

> On 26.04.2017 at 14:15, tfb@tfeb.org wrote:
> 
> On 25 Apr 2017, at 22:18, Christopher Riesbeck <c-riesbeck@northwestern.edu> wrote:
>> 
>> FUNCALL didn't exist in Lisp 1.5 (or 1.6). APPLY was limited to EXPRs and LAMBDAs, according to the Lisp in Lisp definition in the Stanford 1.6 manual.
> 
> I think that Lisp 1.5 / 1.6 were Lisp-2s (that sounds very odd to say)?  Cambridge Lisp (which I started with) is a Lisp-1, so it doesn't need funcall (it has apply).   I don't remember what Standard Lisp was (but since the implementations I used were built on Cambridge Lisp they were probably Lisp-1s too).   From the manual it looks like apply in Cambridge Lisp probably doesn't work with fexprs, but it's not clear.
> 
> I think the answer is that the semantics of old lisps were just incoherent (certainly Cambridge Lisp had the normal interpreter-is-dynamically-scoped / compiler isn't thing).   It's easy (well, easy for me) to forget how much of this mess CL & Scheme cleaned up.

Absolutely true.

The compiler and runtime of a robust implementation like LispWorks are really light-years ahead of what we had with Cambridge Lisp. Just remembering it is painful. The Cambridge Lisp implementation I used had not much runtime safety and zero safety when using the FFI. It looked interesting and it had quite a bit of functionality on those small personal computers of that time. But in practice it was painful to use. It was a great relief for me to have access to usually much more robust Common Lisp implementations, where I would reduce safety only in selected portions of my code.

For me, the robustness of LispWorks is one of its main features. ;-)





from diagrams to Lisp

We have a project in which we use a free UML tool to create high level pictures of the system.
Of course the evolving system escapes the pictures and someone has to go redraw them....

We would like to move to the point where the diagram is a representation of the system, at least at the same level of abstraction as the pictures.
There are a couple of graphical tools out there focused on other languages but I have been unable to discover any that emit Lisp.

Any recommendations would be appreciated.

Best Regards,
Tom Thurman






Re: from diagrams to Lisp

On 4/26/17 Apr 26 -12:23 PM, Thomas Thurman wrote:
> We have a project in which we use a free UML tool to create high level pictures of the system.
> Of course the evolving system escapes the pictures and someone has to go redraw them....
> 
> We would like to move to the point where the diagram was a representation of the system at least at the same level of abstraction as the pictures.
> There are a couple of graphical tools out there focused on other languages but I have been unable to discover any that emit Lisp.
> 

It would require you to write or find a code-walker, but if you did
that, you could use CL-DOT + graphviz to render your diagrams.
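The rendering half of that is straightforward to sketch (plain Python emitting Graphviz DOT text here; CL-DOT produces the same kind of output from Lisp objects, and the hand-written edge list stands in for what a code-walker would extract):

```python
# Emit Graphviz DOT text from a list of caller/callee pairs; feeding
# the result to `dot -Tpng` renders the diagram.

def to_dot(edges, graph_name="calls"):
    lines = [f"digraph {graph_name} {{"]
    for caller, callee in edges:
        lines.append(f'  "{caller}" -> "{callee}";')
    lines.append("}")
    return "\n".join(lines)

print(to_dot([("main", "parse"), ("main", "emit"), ("parse", "read-form")]))
```

The hard part is entirely on the code-walker side; once you have the edges, the diagram is mechanical.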

Cheers,
r


Re: from diagrams to Lisp

On 2017-04-26 01:23 PM, Thomas Thurman wrote:
> We have a project in which we use a free UML tool to create high level pictures of the system.
> Of course the evolving system escapes the pictures and someone has to go redraw them....
>
> We would like to move to the point where the diagram is a representation of the system, at least at the same level of abstraction as the pictures.
> There are a couple of graphical tools out there focused on other languages but I have been unable to discover any that emit Lisp.
>
> Any recommendations would be appreciated.

1. I have been compiling diagrams to code for several decades, but most 
of my work was OEM and not open-sourced.

I do have an example diagram compiler on my github 
https://github.com/guitarvydas/vsh .  It uses yEd (which is not a good 
"diagrams are code" editor (nothing is), but serviceable).  It 
self-compiles, to an assembly language (8 instructions).  Currently, it 
uses the assembler to create pipes and forks, acting as a visual shell 
for linux.  This could be ported to emit / interpret lisp. The diagram 
compiler uses Lisp loops instead of Prolog.

2. I have found that there are only 2 box kinds in UML that can be 
compiled (I'd have to find my UML book to remember which boxes). UML 
also uses StateCharts, which can be compiled.  [Unfortunately, 
StateCharts allow for hidden concurrency - a bad idea IMO].

3. I have found that I use 2 kinds of diagrams.  (A) A top level 
architecture (hierarchical) diagram which is composed of concurrent 
boxes and lines (patterned after CAD/CAM for digital electronics). (B) 
Leaf nodes represented as state machines (StateCharts with all 
concurrency removed).  Snippets of code (expressions) on the transitions 
and entry/exit points specify the actual textual code at those points 
(such annotations should remain as textual code, not diagrams).

4. Look to see where Full Metal Jacket is at these days.  It created 
functional lisp from diagrams. http://web.onetel.com/~hibou/fmj/FMJ.html

5. Look at Drakon (real rocket science! :-).  It's written in Tcl/Tk, 
emits C, Erlang, etc. and it looked easy enough to port to lisp when I 
last looked at it a couple of years ago.  I think it needed one extra 
hack added to it to emit a closing paren for lisp. 
http://drakon-editor.sourceforge.net/language.html

6. FBP, flow-based programming http://www.jpaulmorrison.com/fbp/ (C++ 
and Java).  noFlo https://noflojs.org/ (JS).  FBP is very similar to the 
top level I use (I use "events" instead of "flows"). I don't know how 
hard it would be to port their emitters.  OTOH, diagram compilers tend 
to emit very regular code, so maybe an AWK script would be enough...

7. node-red?  JS.  (Allows for only 0 or 1 input ports, an odd notion IMO).

8. There are two papers on my github that describe one of the tools we 
built (and used to program a smart meter system).

I'd be glad to discuss any of the above, via email.

pt



Re: from diagrams to Lisp

Thanks!
We will take a look at these and review our pictures to see what makes sense.
(I have looked at Drakon before but the graphics in the version we looked at were dodgy, as connections wandered a bit on the diagram).
I will take you up on your kind offer when I have more questions.

Best Regards,
Thomas Thurman

> On Apr 26, 2017, at 1:36 PM, Paul Tarvydas <paultarvydas@gmail.com> wrote:
> 
> On 2017-04-26 01:23 PM, Thomas Thurman wrote:
> > We have a project in which we use a free UML tool to create high level pictures of the system.
> > Of course the evolving system escapes the pictures and someone has to go redraw them....
> >
> > We would like to move to the point where the diagram was a representation of the system at least at the same level of abstraction as the pictures.
> > There are a couple of graphical tools out there focused on other languages but I have been unable to discover any that emit Lisp.
> >
> > Any recommendations would be appreciated.
> 
> 1. I have been compiling diagrams to code for several decades, but most of my work was OEM and not open-sourced.
> 
> I do have an example diagram compiler on my github https://github.com/guitarvydas/vsh .  It uses yEd (which is not a good "diagrams are code" editor (nothing is), but serviceable).  It self-compiles, to an assembly language (8 instructions).  Currently, it uses the assembler to create pipes and forks, acting as a visual shell for linux.  This could be ported to emit / interpret lisp. The diagram compiler uses Lisp loops instead of Prolog.
> 
> 2. I have found that there are only 2 box kinds in UML that can be compiled (I'd have to find my UML book to remember which boxes). UML also uses StateCharts, which can be compiled.  [Unfortunately, StateCharts allow for hidden concurrency - a bad idea IMO].
> 
> 3. I have found that I use 2 kinds of diagrams.  (A) A top level architecture (hierarchical) diagram which is composed of concurrent boxes and lines (patterned after CAD/CAM for digital electronics). (B) Leaf nodes represented as state machines (StateCharts with all concurrency removed).  Snippets of code (expressions) on the transitions and entry/exit points specify the actual textual code at those points (such annotations should remain as textual code, not diagrams).
> 
> 4. Look to see where Full Metal Jacket is at these days.  It created functional lisp from diagrams. http://web.onetel.com/~hibou/fmj/FMJ.html
> 
> 5. Look at Drakon (real rocket science! :-).  It's written in Tcl/Tk, emits C, Erlang, etc. and it looked easy enough to port to lisp when I last looked at it a couple of years ago.  I think it needed one extra hack added to it to emit a closing paren for lisp. http://drakon-editor.sourceforge.net/language.html
> 
> 6. FBP, flow-based programming http://www.jpaulmorrison.com/fbp/ (C++ and Java).  noFlo https://noflojs.org/ (JS).  FBP is very similar to the top level I use (I use "events" instead of "flows"). I don't know how hard it would be to port their emitters.  OTOH, diagram compilers tend to emit very regular code, so maybe an AWK script would be enough...
> 
> 7. node-red?  JS.  (Allows for only 0 or 1 input ports, an odd notion IMO).
> 
> 8. There are two papers on my github that describe one of the tools we built (and used to program a smart meter system).
> 
> I'd be glad to discuss any of the above, via email.
> 
> pt
> 



Re: from diagrams to Lisp


On 26 Apr 2017, at 19:23, Thomas Thurman <thurman.tom@imonmail.com> wrote:

We have a project in which we use a free UML tool to create high level pictures of the system.
Of course the evolving system escapes the pictures and someone has to go redraw them....

We would like to move to the point where the diagram was a representation of the system at least at the same level of abstraction as the pictures.
There are a couple of graphical tools out there focused on other languages but I have been unable to discover any that emit Lisp.

Any recommendations would be appreciated.

I can’t recommend it (it probably is not even available anymore), but in the 90s there was a tool named OMTool (pre-UML) that was apparently written in Smalltalk, and that saved the model and diagrams as sexps.

Nowadays, most UML tools are able to save the model in XML, using the standard XMI interchange format.  You could therefore implement a tool that reads these XML files and produces Lisp sources.

Modelio (which is the free evolution of the commercial CASE tool Objecteering) is able to generate code for various languages (through plug-ins, some of them still commercial).  It should be possible to write a Modelio plug-in to generate Lisp code; it’s written in Java, and I guess one could write such plug-ins in Common Lisp using ABCL.


Now, I would suggest the reverse: write the UML model in the form of sexps, and use them both to generate the Lisp code (i.e., implement the macros to interpret those sexps as Lisp code) and to generate UML XML files that can then be loaded into a UML OO CASE tool to draw the diagrams.
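A minimal sketch of that direction (Python standing in for the Lisp macrology; the model shape, function names, and XML below are all invented for illustration, and the XML only gestures at the idea rather than being valid XMI): one sexp-style model is the single source from which both a code skeleton and an interchange fragment are generated.

```python
# One declarative model, two generated artifacts: a class definition
# for the program, and an XML fragment for a CASE tool.

model = ("class", "account", ("slots", "owner", "balance"))

def to_defclass(cls):
    # Generate a Lisp DEFCLASS skeleton from the model.
    _, name, (_, *slots) = cls
    slot_forms = " ".join(f"({s})" for s in slots)
    return f"(defclass {name} () ({slot_forms}))"

def to_xml(cls):
    # Generate a schematic XML fragment from the same model.
    _, name, (_, *slots) = cls
    attrs = "".join(f'<attribute name="{s}"/>' for s in slots)
    return f'<class name="{name}">{attrs}</class>'
```

Because both artifacts come from the same data, the diagrams can no longer drift away from the code, which was the original complaint.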


-- 
__Pascal J. Bourguignon__



Updated at: 2020-12-10 08:31 UTC