December 04, 2009

Filtered functions

I am very excited that I can finally announce a public release of filtered functions, an extension of generic functions that Charlotte Herzeel, Jorge Vallejos, and I developed some time ago. We are very excited about it because it appears to be quite powerful in a number of very different scenarios. It took a while to release filtered functions because they are a non-trivial extension of generic functions and require a CLOS MOP implementation that complies with the AMOP specification to quite a deep level. This required some serious preparation in the form of a much improved Closer to MOP library, which I released today as well.

You can find filtered functions and Closer to MOP at the Closer project website. Below you will find a general overview of the concept.

Filtered functions extend generic functions with a filtering step in which the arguments received by a generic function are mapped to other values based on user-defined mapping functions. The filtered values are then used to select and execute the applicable methods. Nevertheless, the methods that are eventually executed see the original objects as received by the generic function, not the filtered ones.

Here are some examples to illustrate the expressive power of filtered functions.


In order to use filtered functions, we need filter functions that map received arguments to the values we actually want to base our dispatch on. For the factorial function, we want to distinguish between negative numbers, positive numbers, and zero. For that we can simply use the Common Lisp function SIGNUM, which returns +1 for positive numbers, -1 for negative numbers, and 0 for zero. The filtered function FAC can thus be defined as follows.

(define-filtered-function fac (n)
  (:filters (:sign #'signum)))

DEFINE-FILTERED-FUNCTION is exactly like DEFGENERIC, except that it can also define one or more filters. Here, it defines a filter named :SIGN which specifies that the function SIGNUM is to be used for filtering.

We can now define methods for FAC:

(defmethod fac :filter :sign ((n (eql +1)))
  (* n (fac (- n 1))))

(defmethod fac :filter :sign ((n (eql 0)))
  1)

(defmethod fac :filter :sign ((n (eql -1)))
  (error "Fac not defined for negative numbers."))

Here, we use the qualifiers :FILTER :SIGN in the method definitions to indicate that we indeed want to use the :SIGN filter for method selection. We then use EQL specializers to ensure that the method definitions are applicable for the three different cases that SIGNUM yields. Remember that the method bodies always see the original arguments, not the filtered ones, and this is why the FAC methods can do the correct computations.
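Assuming the definitions above (with the zero case returning 1), FAC computes the ordinary factorial; a quick session might look like this:

```lisp
;; Calls pass the original arguments; SIGNUM is only used
;; behind the scenes to select the applicable method.
(fac 5)    ; => 120
(fac 0)    ; => 1
(fac -3)   ; signals an error: "Fac not defined for negative numbers."
```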

State pattern

Filtered functions can be used to dispatch methods based on the state of an argument passed to a filtered function, which enables expressing State-like idioms. Assume the following simple CLOS class is defined for implementing a stack.

(defconstant +stack-size+ 10)

(defclass stack ()
  ((contents :initform (make-array +stack-size+)
             :reader stack-contents)
   (index :initform 0
          :accessor stack-index)))

Instances of this class have three different states: such a stack can be empty, full, or anywhere in between (in a 'normal' state). We can express this as a function that recognizes the state of a stack.

(defun stack-state (stack)
  (cond ((<= (stack-index stack) 0) 'empty)
        ((>= (stack-index stack) +stack-size+) 'full)
        (t 'normal)))

It is now straightforward to use STACK-STATE in a filter named :STATE for the typical stack operations.

(define-filtered-function stack-push (stack value)
  (:filters (:state #'stack-state)))

(define-filtered-function stack-pop (stack)
  (:filters (:state #'stack-state)))

(define-filtered-function stack-emptyp (stack)
  (:filters (:state #'stack-state)))

We can now group the behavior of a stack according to its different states. Note that for 'normal' state, we do not need to mention the use of any filter here, because the methods are not specialized on anything specific anyway. (Filtered functions always allow for 'regular' methods alongside the filtered methods.)

;;; Normal state

(defmethod stack-push (stack value)
  (setf (aref (stack-contents stack)
              (stack-index stack))
        value)
  (incf (stack-index stack)))

(defmethod stack-pop (stack)
  (decf (stack-index stack))
  (aref (stack-contents stack)
        (stack-index stack)))

(defmethod stack-emptyp (stack)
  nil)

;;; Empty state

(defmethod stack-pop :filter :state ((stack (eql 'empty)))
  (error "Stack is empty."))

(defmethod stack-emptyp :filter :state ((stack (eql 'empty)))
  t)

;;; Full state

(defmethod stack-push :filter :state ((stack (eql 'full)) value)
  (error "Stack is full."))
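With these methods in place, a short session sketches how the stack protocol is intended to behave:

```lisp
(let ((s (make-instance 'stack)))
  (stack-emptyp s)   ; => T   (index is 0, so the :STATE filter yields EMPTY)
  (stack-push s 42)  ; regular method: stores 42, then increments the index
  (stack-pop s))     ; => 42
;; After ten pushes the filter yields FULL, and a further
;; STACK-PUSH signals "Stack is full."
```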

Note that we used a derived state function here, which determines the state of the stack from some of its other properties. Since filter functions can be arbitrary functions, we could also use a slot reader as a filter function, and thus have the behavior of a filtered function depend on the explicit state of an object.
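For example, a slot reader could serve directly as the filter function. The following is a hypothetical sketch (the TRAFFIC-LIGHT class and its names are not part of the stack example above):

```lisp
;; A traffic light that dispatches on the value of its COLOR slot,
;; using the slot reader LIGHT-COLOR as the filter function.
(defclass traffic-light ()
  ((color :initform 'red :accessor light-color)))

(define-filtered-function advance (light)
  (:filters (:color #'light-color)))

(defmethod advance :filter :color ((light (eql 'red)))
  (setf (light-color light) 'green))

(defmethod advance :filter :color ((light (eql 'green)))
  (setf (light-color light) 'red))
```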

Filtered functions can do a lot more: as already mentioned, they can use more than one filter; filter functions can see all the arguments a generic function receives; and filters can be guarded, which means that methods using a particular filter may be ignored entirely if the arguments don't fulfil a certain predicate. For a more thorough introduction, background information, and semantics, you can read the paper Filtered Dispatch, which I co-authored with Charlotte Herzeel, Jorge Vallejos, and Theo D'Hondt, and which also includes an extensible metacircular Lisp interpreter as an example, based on implementing EVAL as a filtered function.

April 25, 2009

European Lisp Symposium 2009 - programme details!

On May 27-29, the 2nd European Lisp Symposium will take place in Milan, Italy. The programme looks very exciting, and I will definitely be there (well, for obvious reasons, see below ;).

For example, there are two keynote talks. One is by Kent Pitman who is an award-winning author of technical papers about Lisp, editor of the ANSI Common Lisp specification, and designer of the HyperSpec, the de-facto standard manual for Common Lisp. He will discuss how the Lisp community should move forward from his perspective. Kent's ideas are always well thought-out, although at times controversial, so this is certainly going to be a provocative talk.

The other is by João Pavão Martins and Ernesto Morgado, the two owners of SISCOG - a Portuguese company that develops large-scale industrial planning and scheduling software which has been in use for over twenty years. It's always good to hear what practitioners have to say about their experiences with Lisp, so this should turn out quite interesting.

There will be a couple of presentations in the main track of the symposium about papers that have been reviewed by a program committee chaired by António Leitão.

Jim Newton is going to present a type inferencing approach for the Skill dialect of Lisp that is actually being used in his group at Cadence Design Systems, one of the world's largest providers of Electronic Design Automation. (If you use an electronic device, it's very likely that the chips inside were designed using one of their tools!)

Thomas Burdick is going to present an approach for compiling FEXPRs (think: first-class macros that you can pass around like regular functions). I'm a bit skeptical here that this will work, because it is actually known that FEXPRs cannot be compiled, but maybe he has found an interesting twist to the problem.

Some colleagues of mine from the Artificial Intelligence Lab of the Vrije Universiteit Brussel in Belgium are going to present a debugging technique they have developed for their own large-scale agent system, which they use to investigate the possible development of natural languages by reconstructing how they could have evolved from simple first principles. The debugging approach consists of monitoring the activities of the agents and presenting the process as an interactive webpage that can present arbitrarily detailed (or abstract) views on what is happening. Very cool stuff, and since they use web technology, it is actually portable...

Charlotte Herzeel is going to present an architecture for Software Transactional Memory (STM). STM has become quite a hot topic in the last few years because it seems a very promising approach for dealing with concurrency, and since multicore processors are the buzz of the moment, there is a lot of interest in such approaches. However, little attention has been paid to the design of STM frameworks where you can selectively plug in different STM algorithms. Charlotte has developed a reflective approach (think: metaobject protocol) for STM, which seems very promising (but I'm one of the co-authors of the paper, so I'm naturally biased, of course ;).

Another presentation will be about a Linda-style distribution layer on top of Kenzo, an apparently very powerful system for symbolic computation developed in Common Lisp. The Linda Model (also used in JavaSpaces and TSpaces, for example) has a number of very interesting properties for distributed computing, and the paper presents an implementation based on AllegroCache, a robust and high-performance object-oriented database for Allegro Common Lisp. I'm wondering what the concrete benefits for a symbolic algebra system are, so this is another presentation to look forward to.

Finally, I am going to present a paper myself - that's the main reason why I will definitely be there ;). My presentation will be about a macro system on top of Common Lisp's macros that allows writing hygienic macros. The system provides facilities similar to Clinger's renaming construct. The interesting part is that my system is implemented in fully portable Common Lisp and integrated in such a way that 'regular' Common Lisp macros and the macros developed with this new macro system can be used together. This works because of the use of symbol macros in the implementation of the new constructs.

Other items on the programme for the symposium are:

A debate on expanding and updating the Common Lisp standard. I am known to be quite conservative when it comes to this topic, in that I don't think ANSI Common Lisp needs a serious revision, but that it can be developed in a piecemeal fashion (and this actually already happens due to the efforts of the currently very vibrant Common Lisp community). I believe (obviously) that something like CDR is much more promising than a "big bang" revision of the core language. The recent developments in the Scheme community, where the highly controversial R6RS specification was almost not ratified, seem to indicate that the danger is too high that a lot of time and resources could be wasted that could otherwise be used in a much more productive way. Well, maybe there will be some new ideas and visions coming out of the debate...

On the Saturday after the symposium, there will be a visit to the Futurism exhibit in Milan. Futurism was an art movement in Italy in the early 20th century whose founders announced that everything "old" (that is, artistic and political tradition) should be replaced by the new ideas of a then young generation. This is probably one of the strangest choices for the social programme of a Lisp-related event: Lisp is the second-oldest programming language still in use today, and has always been in competition with whatever other programming language came along that was considered 'newer' (as if that automatically meant 'better'). However, judging for example from this blog posting, this could actually be a very inspiring exhibition.

So all in all, a highly interesting programme, with some more items being added in the coming days. (There are rumours that Christophe Rhodes is going to give a tutorial about non-portable features of SBCL, for example!) Although the early registration deadline is ending very soon now, there is no reason to despair: With €80 for students and €160 for regular participants, the registration fees will remain very low.

So, hope to see you in Milan!

March 30, 2009


It has been suggested that certain programming language constructs, in particular the GO TO, lend themselves to obscure coding practices. Some language designers have even gone so far as to design languages which purposely omit such familiar constructs as GO TO in an attempt to constrain the programmer to refrain from particular styles of programming thought by the language designer to be "bad" in some sense. But any language with function calls, functional values, conditionals, correct handling of tail-recursions, and lexical scoping can simulate such "non-structured" constructs as GO TO statements, call-by-name, and fluid variables in a straightforward manner. If the language also has a macro processor or preprocessor, these simulations will even be convenient to use.

No amount of language design can force a programmer to write clear programs. If the programmer's conception of the problem is badly organized, then his program will also be badly organized. The extent to which a programming language can help a programmer to organize his problem is precisely the extent to which it provides features appropriate to his problem domain. The emphasis should not be on eliminating "bad" language constructs, but on discovering or inventing helpful ones.

Guy L. Steele Jr. and Gerald J. Sussman
Lambda - The Ultimate Imperative

March 12, 2009

Lisp: Research and Experience

In the last couple of years, we have seen a growing interest in the Lisp programming language and its various dialects, including classic ones, like Common Lisp and Scheme, and also brand new ones, like Clojure and Qi. Several user group meetings, workshops and conferences have been organized with great success in recent years, especially in Europe, but also elsewhere.

With the European Lisp Symposium, we aim to start a series of annual events that is especially suitable for novel research results, but also for insights and lessons learned from practical applications and education perspectives, all involving Lisp dialects. The first symposium was organized in Bordeaux, France, on May 22 and 23, 2008.

For this symposium, we received 15 submissions, and after a careful review process, the program committee selected seven of them for presentation in the main track of the symposium. The program committee considered six of these papers worthy of being invited for journal publication. Their authors submitted extended versions of these papers, and after another thorough review process with additional reviewers, these papers have indeed reached the necessary level of quality and maturity.

These papers are now finally published in a special issue Lisp: Research and Experience of the Journal of Universal Computer Science.

Preparations for the 2nd European Lisp Symposium to be held in Milan, Italy, May 27-29, 2009 are already under way...

January 26, 2009

European Lisp Symposium 2009

There will be another instance of the European Lisp Symposium this year: in Milan, Italy, from May 27-29, 2009. It's very good that this event takes place so soon after the successful European Lisp Symposium 2008 about one year ago. Special kudos go to Marco Antoniotti for the local organization and António Leitão for having assembled a great program committee.

The rates for attending the symposium are again very reasonable: students can get in for as low as €60, and other participants for a reasonable €100, when taking advantage of the early registration rates. See the registration page at the symposium website for more details.

More importantly, though: you can still submit papers to the symposium. The deadline for paper submissions is February 4, 2009, and the program committee accepts both original contributions, such as research papers and experience reports, and descriptions of work in progress. Again, see the symposium website for more details.