Recording Venue:
Guest(s): Anneke Kleppe
Host(s): Ron
In this episode we’re talking to Anneke Kleppe about model-driven software development and language engineering. We start with her involvement in the creation of the Object Constraint Language (OCL) and discuss the initial expectations, the actual experiences, and the place of OCL today. Anneke then talks us through her take on the formative years of UML and MDA. From there, we expand into the realm of Domain-Specific Languages: Anneke discusses their place in software engineering in general and why we should expect DSLs in significant numbers to become a common sight.
Great talk.
I would like to add that, in my opinion, it will be hard for big general-purpose UML-based tools to compete with quick-to-implement internal DSLs written in languages like Ruby.
It would be interesting to hear from other listeners what their experience is.
My personal experience is that putting together a Ruby-based DSL is fast and easy. It took less time to implement the DSLs we use in our daily work than it did to read the UML 2 book(!). Using DSLs is still new to us, but our latest product was 60% generated code, produced from a Ruby-based internal DSL. That percentage will increase in our future products.
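For readers who haven’t tried this style, here is a minimal sketch of what an internal Ruby DSL driving code generation can look like. All names here (`entity`, `field`, `to_cpp`) are hypothetical illustrations, not the actual tooling described in this comment:

```ruby
# Minimal sketch of an internal Ruby DSL plus a code generator.
# Hypothetical example; not the commenter's actual tooling.

class EntityBuilder
  attr_reader :name, :fields

  def initialize(name)
    @name = name
    @fields = []
  end

  # DSL keyword: declare a typed field on the entity.
  def field(fname, type)
    @fields << [fname, type]
  end
end

# DSL entry point: the block is evaluated with the builder as `self`,
# so `field ...` inside the block calls EntityBuilder#field.
def entity(name, &block)
  builder = EntityBuilder.new(name)
  builder.instance_eval(&block)
  builder
end

# Describe a domain object in the DSL...
player = entity "Player" do
  field :health, "int"
  field :speed,  "float"
end

# ...and generate a C++ struct from the description.
def to_cpp(e)
  lines = e.fields.map { |n, t| "  #{t} #{n};" }
  "struct #{e.name} {\n#{lines.join("\n")}\n};"
end

puts to_cpp(player)
# prints:
#   struct Player {
#     int health;
#     float speed;
#   };
```

The key trick is `instance_eval`: it runs the block with the builder object as `self`, which is what makes the declarations read like a small language rather than ordinary method calls.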
Cheers
Morten
Morten,
Quite frankly, a 60% generation rate doesn’t impress me much. I have seen generation rates higher than 80% with editor macros in a COBOL environment. A higher generation percentage is just an indication of a bigger gap between the domain described by the DSL and the implementation domain. With a weak implementation technology, it is quite easy to produce a high generation rate.
The interesting question in judging DSL approaches is how productive you are with a given toolset in a given problem domain. In the previous podcast, Juha-Pekka Tolvanen claimed some impressive productivity gains from using meta-model-based DSLs. Since his company has been around for such a long time and is still alive, his claims have at least some credibility.
One last point about the time it takes to read UML 2.0 books: as Anneke Kleppe has pointed out in her podcast, it is not about using the full UML, but useful pieces of it (such as class or state diagrams). And I assume that any professional developer is familiar with at least the intermediate-level concepts of UML, so there is not much value in arguing that it took less time to develop an internal DSL than to read a UML book. Using a DSL over the lifecycle of an application also means that issues such as learning curves and documentation have to be taken into account. Using well-known concepts such as UML could help. From that angle, it might also be helpful to popularize OCL.
Kilian Reich
Kilian,
60% is impressive considering the domain (real-time, multi-threaded, multi-core, networked 3D console games written in C++) and the limited time spent on the DSL tools so far. I am aware that it is possible to generate close to 100% of CRUD (Create, Read, Update, Delete) style applications using DSLs and more general UML-based tools. But I am not in the simple CRUD application business. I am not surprised that you can hit 80% in a typical COBOL environment without much effort.
As for using UML, it is interesting that you and I both chose not to use a UML tool in our daily work. You chose editor macros and I chose Ruby. Now why is that? Why didn’t we decide to use a UML-based model-driven tool to get the job done? Cost is not an issue, because there are good open source solutions out there (as Markus has correctly pointed out more than once). So why is it that you and I haven’t used one of those tools? My argument is that the time cost of using a UML-based tool is higher than that of using what is arguably a more powerful tool: text-based DSLs. Writing internal DSLs in Ruby (or using editor macros) is EASY.
As for OCL: even Anneke Kleppe admits that OCL is pretty much dead. It is not a solution to the problems I see in my daily work. Writing internal DSLs in Ruby is EASY. That is my point.
Morten Brodersen