Haskell and Clojure

August 30, 2012 at 8:55 pm
filed under Coding

As I mentioned in a previous post, I’ve spent a great deal of my time lately in Haskell. (It hadn’t occurred to me until now that it’s been since April!)

As a “starting” exercise, I did the first unit of Design of Computer Programs on Udacity, except in Haskell. It was challenging: you have to think very differently than in imperative-land. With Haskell especially, there’s a whole new universe of terminology and constructs.

I’m not sure whether this is improving my aptitude as a programmer. Concepts like functors and monads are not common outside of functional programming languages, and even if you like FP, it’s just not that widespread. Contrast this with the Gang of Four patterns: whatever value you place on them, they’re widely known.

My hope is that learning Haskell, and FP in general, is increasing the size of my mental toolbox. The interesting thing about concepts like monads is that, as esoteric as they are, they are not technically complex. The Maybe monad is very, very few lines of code, and yet it is semantically powerful. And I’ve seen people argue that programmers end up implementing monads in practice without realizing it.
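
To make that concrete, here is a sketch of just how few lines it takes. This defines a Maybe-like type from scratch, named `Perhaps` here only so it doesn’t collide with the Prelude’s real `Maybe`:

```haskell
-- A Maybe-like type defined from scratch. It is called Perhaps only
-- to avoid clashing with the Prelude's built-in Maybe.
data Perhaps a = Nope | Have a deriving (Show, Eq)

instance Functor Perhaps where
  fmap _ Nope     = Nope
  fmap f (Have x) = Have (f x)

instance Applicative Perhaps where
  pure = Have
  Nope   <*> _ = Nope
  Have f <*> x = fmap f x

instance Monad Perhaps where
  -- Nope short-circuits the rest of the computation; Have feeds
  -- its value to the next step. That's the entire trick.
  Nope   >>= _ = Nope
  Have x >>= f = f x
```

That really is the whole thing: `Have 3 >>= (\x -> Have (x + 1))` evaluates to `Have 4`, while anything chained after a `Nope` stays `Nope`.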

Still, having listened to a great deal of Rich Hickey over the past few weeks, I eventually broke down and bought a copy of The Joy of Clojure. A word of background: I used Lisp back in college and thought it was pretty cool, but I didn’t take it particularly seriously. Since Clojure runs on the JVM, though, it seems like there’s a decent chance it could get some adoption. Given that, and the fact that I find Hickey himself very convincing, I thought I’d give Clojure a try.

I haven’t had a chance to write much code, and I haven’t gotten very far with Clojure yet, so what follows is mostly first impressions: an interesting compare and contrast with Haskell, the other functional language I have some amount of experience with.


Most of the time, semantics matter way more than syntax. At the limit, yeah, you can find languages such as C++, where the syntax leads to ambiguity or a lot of noise. But when you look at the lifetime of your experience with a language, how much time do you actually spend mastering syntax relative to time using the language? Not long, right?

I don’t want to throw out the idea of syntax altogether, because it can matter. Haskell’s syntax for function composition is easy enough:

foo . bar . baz $ map frotz xs

It looks alien if you don’t know Haskell, but it’s not dissimilar to something like this:

find . -name "*.foo" | xargs grep foo | awk '{print $1}'

We’ve all composed these kinds of pipelines before. The first time that really clicked for you, didn’t it feel magical? When you can express complex operations very simply and very precisely, you gain the ability to speak in a new, more powerful way. The world of possibilities opens up when the ostensibly complex reveals itself as simple.

So the whole notion that Lisp is ugly and unlearnable because of parentheses is shortsighted. I’m not saying it won’t have its share of problems; I’m still not used to it, and I still need to count parens now and then. But in exchange, you have a very simple, very lightweight, very uniform syntax.


Compared to Haskell, Clojure feels positively loosey-goosey. Clojure is strongly typed in the sense that values have definite types at runtime, but there is no static type system that I can discern. I’m not entirely clear on what this means in practice.

Collection types might be the best example. My meager experience with Lisp suggests that most of the time, the list data structure is enough. Clojure has more collection primitives: maps, sets, vectors, and lists. They are not interchangeable. Vectors and lists in particular have different semantics and performance characteristics.

The whole idea of no static types freaks me out a little, to be honest. That’s part of why I’m intrigued. It seems almost perverse not to care about types again. Maybe Clojure makes it work for me in a way that Python or Ruby never did; the one thing I can say about dynamic languages is that although they can be very difficult to debug or refactor (in part because of types), you can be very productive in them.

Haskell’s expressive and powerful type system, as with anything, comes with a trade-off: you have to spend a lot more time thinking about how your pieces (types, really) fit together. If you have a pretty good idea of what you want to do, sometimes you just want to write it down, and an exacting type system like Haskell’s can be an impediment. Perhaps you become more facile with this as you get better with the language, but I suspect there’s a limit.

I worry, though, about nil. Remember, Tony Hoare, who introduced the null reference, calls it his “billion dollar mistake”. Maybe helps you deal with failure in a semantically rich way; nil is semantically poor. It might matter less in practice, given that we’re talking about values instead of objects.
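
To sketch what “semantically rich” buys you (the lookup tables here are made-up example data): each step in a chain of Maybe computations can fail, and `>>=` threads that possibility through without a single nil check.

```haskell
import qualified Data.Map as Map

-- Hypothetical example data: user names to ids, and ids to emails.
userIds :: Map.Map String Int
userIds = Map.fromList [("alice", 1), ("bob", 2)]

emails :: Map.Map Int String
emails = Map.fromList [(1, "alice@example.com")]

-- Either lookup can fail. Maybe makes that explicit in the type,
-- and >>= short-circuits as soon as any step comes up empty.
userEmail :: String -> Maybe String
userEmail name = Map.lookup name userIds >>= \uid -> Map.lookup uid emails
```

`userEmail "alice"` is `Just "alice@example.com"`, while `userEmail "bob"` is `Nothing` because the second lookup fails. No nil, no exception, and the type signature itself tells you failure is possible.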

Closing thoughts

As much as I love Haskell, I suspect a lot more has been written about Lisp. It’s quite old and is relatively well established, even if it’s not popular as such.

And one piece that Lisp people often bring up is macros. In Haskell you do end up defining your own DSLs, given how lightweight Haskell’s syntax generally is; it’s relatively easy to define new types and define operations on them. But a Haskell DSL is a very strict one, which again bears on how rapidly you can prototype and iterate on an idea.
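
As a sketch of the kind of embedded DSL I mean (the `Expr` type and `eval` are made up for illustration): a few lines of Haskell give you a small arithmetic language, and a `Num` instance even lets ordinary arithmetic syntax build its values.

```haskell
-- A made-up miniature expression DSL: a new type plus operations on it.
data Expr = Lit Int
          | Add Expr Expr
          | Mul Expr Expr
          deriving (Show, Eq)

-- Overloading Num lets ordinary numeric syntax construct Exprs.
instance Num Expr where
  fromInteger = Lit . fromInteger
  (+) = Add
  (*) = Mul
  negate e = Mul (Lit (-1)) e
  abs    = error "abs: not part of this sketch"
  signum = error "signum: not part of this sketch"

eval :: Expr -> Int
eval (Lit n)   = n
eval (Add a b) = eval a + eval b
eval (Mul a b) = eval a * eval b
```

With that instance, `2 + 3 * 4` at type `Expr` builds `Add (Lit 2) (Mul (Lit 3) (Lit 4))`, and `eval` reduces it to `14`. But everything here is rigidly typed; there is nothing like the free-form syntactic rewriting Lisp macros allow.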

The question is what level of trade-off you have to make in order to get that kind of rapid productivity. I don’t think the software community has a great answer about how to scale software development from the birth of a project through success and maintenance. Some of the most successful projects you can imagine are plagued with technical debt. You might have shipped quickly, but as your project ages it becomes increasingly complex: people with varying mental models join and leave; you find corner cases and fix bugs; and decisions you made earlier in the project’s lifetime, especially in order to ship sooner, come back to haunt you.

There’s some question as to whether your choice of technology can help; depending on whom you ask, we are only beginning to figure that out. Testing helps, but it is in some sense a post-hoc way of dealing with the fact that our code is so difficult to write correctly. And tests aren’t free, either: I am intimately familiar with the costs and trade-offs of both manual testing and integration testing. It’s not ideal.

You could argue that type systems can make this easier. Haskell does. It provides strict safety, purity, and immutability. The type system is intended to be a huge impediment to introducing implicit mutability or state into your program; the compiler in essence proves your program is free of a certain large class of errors. People say that once your program compiles in Haskell, it works. That’s my experience, too.

While Clojure may share this attitude about purity and immutability, it does not appear to subscribe to the idea of a heavyweight type system as a way out of the tar pit. Rather, the counter-argument Rich Hickey has given in talks is that the primary source of bugs is complexity. A type system might help with complexity but, at least in imperative-land, it has a tendency to make your program more complex by layering behavior and abstractions on top of data. Conversely, when you are working chiefly with functions and values, you are working with relatively lightweight datatypes with simple, well-known interfaces. Values are more easily composed, they aggregate better, and in general it’s far simpler to reason about them.

I see this, in some sense, as a point/counterpoint. The idea of a language that is, well, easier is quite tempting. The siren song of “bottom-up programming” from Paul Graham’s On Lisp is also tempting. And, as always, this is as much about learning something new and bending my brain as it is about doing something useful.
