Archive

Archive for the ‘Clojure’ Category

Clojure Tradeoffs (design implications and why you should care)

June 26, 2013 3 comments

EDIT: HN thread: https://news.ycombinator.com/item?id=5943982

Clojure as a language and community is very sensitive to the definition and design of tradeoffs. This post is an attempt to elucidate the tradeoffs chosen by the language, what they mean to interested parties, and an attempt to predict the future based on these choices.

Motivation

Rich Hickey has said a few things about design and the role of tradeoffs; in a recent talk he described design as consciously making choices about tradeoffs. He has another important design tenet: design by decoupling concepts from each other. So Clojure is in the interesting position of being an extra layer of abstraction that claims to actually simplify the task of programming in the long run. It does this by pulling apart concepts programmers take for granted in order to reassemble them more effectively. Below are some tradeoffs I noticed through working with Clojure for the past year and a half. Some of them I had never thought about in my previous languages, but I can see that by accepting a language, I also accepted a set of tradeoffs that guided how I work. Because design tradeoffs (manifested as abstractions) determine what is easy and what is difficult, I think it’s valuable to see what tradeoffs are made. It’s valuable to know what they encourage, what they discourage, and how they interact, so that we can have more control over our tools and environments.

LISPiness

Clojure is a lisp. That fact alone means it builds on 50 years of infrastructure and thought, some of which has been absorbed into mainstream languages, but it also presents a foreign and scary interface to users who are accustomed to other syntaxes. You can perform ‘syntactic abstraction’, sometimes at the expense of visual clarity, but it becomes easier to express ideas and give them names. The programmer who works on a team is forced to become more judicious about these design choices, but the greater ease of expression means you’re never bound by what your language provides. It is the ultimate tool for identifying and removing repetition.

Concurrency

Clojure has a concurrency focus. This tradeoff interacts with the decoupling desire to diminish the number and effect of problematic sites in programs. With regard to state change over time, this is similar to Python’s mantra, ‘explicit is better than implicit.’ Even though Clojure has all the nifty concurrency features you could ever want (thread pools, async methods, etc.), the most important simplifying feature to consider is in fact immutability by default, which is a decoupling of state-change and value that most languages freely intermix. This, as a concept, is difficult to make compelling without an already captive audience, but there are great treatments of it from multiple sources; a small sketch of the split follows the link below. The pitch usually starts off by emphasizing the pain of building shared-state concurrent systems, but it is applicable and useful in other contexts.

The Value of Values: http://www.infoq.com/presentations/Value-Values
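
To make the value/state-change split concrete, here is a minimal sketch (my own, not from the talk): the map is an immutable value, the atom is the one explicit place where state changes, and the change itself is just the application of a pure function.

(def account (atom {:balance 100}))     ; state lives behind one explicit reference

(defn deposit                           ; a pure function from value to value
  [acct amount]
  (update-in acct [:balance] + amount))

(swap! account deposit 25)              ; state change = apply a pure fn to the current value
;; @account => {:balance 125}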

In practice, it means you don’t lose any capabilities of java, but since you accept that what’s easy, idiomatic, and low-friction in Clojure is the right thing to do, you will by default write safe and moderately performant code. Reliable concurrency semantics fall out of the intentional convenience of a suite of core functions built around this ‘feature’. It’s a conversation similar to telling a C programmer that you’re taking away their malloc, except that here the data structures needed to support it are written in normal java. The brilliance of Clojure’s core library is that it makes using these safe/fast immutable data structures (relatively) more convenient/effective than any other option, without artificially making java-like things less (absolutely) convenient/effective.
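
As a small illustration of that convenience (a sketch using nothing beyond core functions): ‘changing’ a persistent map returns a new value and leaves the original untouched, so it is always safe to share across threads.

(def config {:host "localhost" :port 8080})

(def tweaked (assoc config :port 9090))   ; returns a new map

;; config  => {:host "localhost", :port 8080}   (unchanged)
;; tweaked => {:host "localhost", :port 9090}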

Shared memory over other computing paradigms, e.g. message-passing

Like C, C++, Java, Ruby, Python, etc., Clojure maps closely to the actual semantics of the machines that run those languages, which in our case is the von Neumann shared-state bit-banging model. We generally don’t even think about this tradeoff, but there are examples of systems that hold some other construct as fundamental, such as Erlang’s actor model. In practice, this means there’s no wall of abstraction preventing you from using lower-level constructs that map efficiently to the hardware. You can write low-level code just as well as or better than in java, while writing high-level code very easily. It’s simple to switch modes of thought and mix and match levels of abstraction, thanks to the ‘Composition’ tradeoff below. The resulting abstraction soup is a little unnerving at first, but you come to appreciate it after a few months of use. In my opinion, dealing with it is a worthwhile meta-skill :-).
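
Here is a rough sketch of what mixing those levels looks like (hypothetical functions, not benchmarked): the high-level version works on any seq, while the low-level version uses a primitive array and an explicit loop that maps closely to what the JVM executes.

(defn sum-highlevel
  "Idiomatic, works on any seq of numbers."
  [xs]
  (reduce + xs))

(defn sum-lowlevel
  "Java-flavored: primitive int array, explicit loop."
  [^ints xs]
  (let [n (alength xs)]
    (loop [i 0, acc 0]
      (if (< i n)
        (recur (inc i) (+ acc (aget xs i)))
        acc))))

;; (sum-highlevel [1 2 3])              => 6
;; (sum-lowlevel (int-array [1 2 3]))   => 6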

Dynamic over Static

It’s the same in every dynamic vs. static debate. Static languages have the advantage of stronger compiler support for domain-level assertions encoded directly into a type system. Dynamic languages trade that for increased flexibility, which is helpful when existing code is repurposed or used unpredictably. It becomes harder to reason about the contracts of programmatic interfaces when the compiler and IDE aren’t helping you, so more documentation becomes necessary. Automated tests can fill the role of compile-time assertions at run-time. As an added benefit, 90% of my time is spent in interactive development, building things in a live, running environment. Usually, static languages have a speed advantage, but in Clojure this is not the case…

Speed over Convenience

Clojurists love their neat dynamic tools; however, they are also speed junkies. There is no pervasive run-time system in Clojure to slow everything down, but idiomatic use promotes heavy use of the immutable data structures, which are optimized to perform competitively with the alternatives. More relevantly, Clojure lets you apply the 80/20 rule. Given the ease of interop with java, it’s possible to pick and choose your own performance tradeoffs when implementing components, without losing the benefits of dynamic languages. For instance, Clojure Records provide fast java field access for known-ahead-of-time fields, but they are also backed by a standard immutable hashmap for additional properties. They can be treated with the conveniences of standard hash-maps, yet they are also efficient. Clojure’s deftype is equivalent to raw java if you need to go a step further. At the core, care is taken to provide fast implementations for common operations. Nothing prevents a user from using their own abstractions and data structures, and extending Clojure’s abstractions over them.
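
A minimal sketch (names made up) of that Record behavior: keyword access to declared fields is fast, and assoc’ing an undeclared key still works because the record behaves like a map.

(defrecord Point [x y])

(def p (->Point 1 2))

(:x p)                 ;=> 1, backed by a plain java field
(assoc p :color :red)  ;=> #user.Point{:x 1, :y 2, :color :red}
                       ;   extra keys spill into the backing map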

Composition over Inversion-of-control

The emphasis on concurrency via shared, immutable data promotes standard, dynamic methods for libraries and functions to interact.  Since a user can trust that data is immutable and reliable, there is no need for things like defensive copying as is standard practice (or should be) in multi-threaded object systems like java.  It’s simply not easy or expedient to go out of your way to destroy someone else’s data.  This trust in data integrity coupled with the syntactic abstraction afforded via macros and higher-order functions means you can write concise code that composes in intuitive ways, without the need to hook yourself into someone else’s sandbox (eg Spring or Rails).

The most troublesome thing about such frameworks is the use of polymorphism and inversion of control as a sledgehammer to get around the inherent problems of OO. Namely, OO couples state-change to objects (binds the effects of time to a specific bucket of memory), and functions to classes. In Clojure, on the other hand, it feels like nothing you write is actually ‘doing’ anything at all. Functions generally just transform data, and occasionally you might fire off a side-effect or perform some coordinated state change. You can trust that there is usually a simple relationship of inputs to outputs. When you want polymorphism, you can get it in spades, but you’ll end up sprinkling it in occasionally instead of being bound to a particular style throughout the construction of your application.
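
A tiny sketch of that style (a made-up domain): plain functions transform plain maps, and polymorphism is sprinkled in with a multimethod only where dispatch is actually needed.

(defmulti fee :account-type)             ; dispatch on a field of the data itself
(defmethod fee :basic   [_] 5)
(defmethod fee :premium [_] 0)

(defn bill                               ; pure: map in, map out
  [account]
  (update-in account [:balance] - (fee account)))

(map bill [{:account-type :basic   :balance 100}
           {:account-type :premium :balance 40}])
;;=> ({:account-type :basic, :balance 95} {:account-type :premium, :balance 40})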

When was the last time you tried to switch some Spring beans or Rails controllers over to another framework, or use two such frameworks in one application? This is problematic primarily because inversion of control binds all your code to the framework’s assumptions. In Clojure, you compose functions yourself, making more choices along the way, but the benefits of doing so, coupled with the ease of dealing in data, overshadow the need to trust in someone else’s choices. Code becomes actually reusable, and it usually even reads more like a tree-expansion than a graph traversal. The language features themselves are mostly orthogonal, and are similarly composable.

Community over Individualism

Lisp has a history of promoting an individualist spirit. There’s such a thing as the ‘Lisp Curse’: http://www.winestockwebdesign.com/Essays/Lisp_Curse.html . I personally believe Clojure is positioned to beat the curse, due to this generation’s emphasis on open source, social media, friendly and productive chat rooms and newsgroups, and Clojure’s intentional design decisions to promote interop between libraries. One example is Clojure’s standardized Lisp reader, which is more restricted than Common Lisp’s but enables source code to be shared more easily. There are excellent conferences with highly interesting talks, and the bootstrapping by java’s pre-existing momentum meant Clojure was uniquely positioned to be useful at an early stage. At this point, I feel there is enough momentum to keep Clojure moving forward for the foreseeable future.

Long-term benefits over short-term approachability

Clojure optimizes for long-term use and long-term simplicity over familiarity and initial ease. However, there is a compelling rationale behind each of those decisions, and things are made easy wherever that doesn’t come at the expense of the primary design concerns.

Tradeoffs for individuals

Certainly, by pursuing Clojure you are not pursuing other things, but the design elements are excellent to study, and the language itself is small. Additionally, you are not leaving the JVM, which will be a relevant platform for many years to come. ClojureScript has also recently come into existence, offering similar design tradeoffs while targeting JavaScript engines, which will also stay relevant for the foreseeable future. An investment in Clojure positions the user to take advantage of many platforms in perhaps more convenient ways. It’s not a wall of abstraction, and it doesn’t protect you from learning the host environment. For these reasons, it’s less of an isolating ‘language’ commitment than other languages might be, while conferring substantial benefits. Sure, languages come and go, but working in a lisp makes it easy not to get distracted by syntax and to instead focus on semantics. Time spent dealing with abstractions this way makes it easy to disregard superficial differences in other languages, and the experience makes it easy to learn them quickly should the need arise.

Tradeoffs for companies

Companies have to worry about a number of things with regard to technology choices; chief among them is the ability to hire good developers to work in a language. Clojure is not yet mainstream, and developers are few and far between. However, if you manage to find one, odds are good they will be someone who cares about optimizing their workflow, productivity, and relevance. Interest in Clojure is a good indicator of respect for the above tradeoffs and of good design sensibility. The language and toolchain are increasing in popularity. The community is very much engaged and invested in its success, and it continues to grow. General hardware trends and competitive trends are going to push more interest in Clojure’s direction, and the language itself will keep pace with innovation. The bottom line is that it takes a bit of effort to learn, but it stays out of the way and presents safety and simplicity as the convenient things to do. It integrates well with any JVM solution, and many companies such as Twitter leverage a JVM polyglot infrastructure that includes Clojure. I think the most relevant analysis was the recent ThoughtWorks Radar: http://www.thoughtworks.com/radar , which both placed Clojure in the ‘adopt’ category and promoted small composable libraries, a hallmark of Clojure’s approach.

Tradeoffs for me

Personally, through my experiences at work, at multiple conferences, and on IRC and the newsgroups, I’m convinced that the Clojure community is a melting pot of innovation drawing developers from many backgrounds. I have confidence that there won’t be a more relevant language for me for at least five years. Given trends in hardware, concurrency will become more of a driving force in language decisions, and I want to be on the cusp. For the larger core counts and distributed systems of the future, Clojure is making its way into message-passing, and will have a solid offering there. I can stop worrying about languages for a while and instead focus on the JVM platform itself and on general systems problems, until we hit a point where the tradeoffs are no longer a match for the systems that need to be built.

What they say about lisp is true: I really feel like I’m learning the truths of computing without getting bogged down by the act of expression. Now that I’m over the initial hump, I spend 99% of my time thinking about the problems I’m trying to solve instead of fussing with the tools. When I have to learn something new about the language by doing a deep dive, it always feels like a worthwhile exercise thanks to the readability of concise, idiomatic, well-composed code.

The right tool for the job

The ‘right tool for the job’ might be a more relevant thing to say in the material world, where you have to go somewhere and pay money for tools, and repurposing them would be too costly. In open-source tech, we are able to trade our time and speculation to improve our own tools. Taking advantage of, and contributing back to, someone else’s tools is freely encouraged. A 50-year-old hammer has an opportunity to influence the design of a future toolmaker’s Swiss Army knife, in combination with other tools built by experts from other disciplines. I hope I’ve been persuasive that it’s more interesting to talk about actual design tradeoffs and their implications.

In conclusion, Clojure’s a more right tool for more jobs than one might think.  By making opinionated yet cautious choices, Clojure allows the user enough breathing room to compose and extend constructs in whichever way is appropriate, while maintaining a set of standards that promote safe, performant, and beautiful code.

Categories: Clojure, Observations

Clojure environment state of the union, install speedrun screencast

December 19, 2012 6 comments

I’ve been speaking with a colleague about simplifying my workstation setup and minimizing the number of OS instance configurations floating around in my head. I have multiple computers at home and at work, and I need to be able to minimize the time it takes to be productive from point zero. I thought this would be a great time to make a speedrun of getting up and running with a Clojure environment, starting with a freshly downloaded Ubuntu ISO image and using it as a VM.

All in all, it takes 24 minutes, about 8 of which are spent waiting for stuff to download. I could practice and get it down to ~15, but this should be a good order-of-magnitude estimate of what it takes. Minutes 16-24 are spent hassling with emacs, elpa, auto-complete, and order-of-installation issues.

This also serves as a demonstration of what can go wrong when even a full-time Clojure developer has to recreate a working environment, and what kind of knowledge of workarounds may be needed. I had some problems with emacs that I can’t explain; I just found a workaround through trial and error and my minimal experience :-). Remembering what it was like not to know these secrets, I know this hassle can expand into days of digging.

As a community, we should aim to minimize this sort of hassle for beginners. I know there have been ill-fated efforts to do so, with out-of-date documentation and half-usable but well-intentioned tools, and I don’t have an answer for that, but here’s a data point.

Categories: Clojure, Uncategorized Tags:

Separation of concerns, or let-map, birth of a clojure macro

June 11, 2012 8 comments

Motivation

I’ve been doing substantial work in Clojure recently, and it’s created opportunities for me to think more deeply about how I program. Of course, in itself that would be a useless diversion, but in fact the nature of lisp’s abstraction tools affects my workflow significantly. There is truth to the notion of growing your own language to do the job. I really get the sense that I am learning deeper secrets of expressivity and programming by having so much power over my code, and I spend very little time actually repeating myself. There are of course adjustments to be made coming from an imperative, statically typed background, and even though I thought I understood the language after about a year of study, books, and toy programs, I’m finding there’s a lot more to think about when you use it full-time. Here, I provide one such real-world example and a walkthrough of one of my first proper user-defined syntactic abstractions (a macro).

Consider the case of a ‘main’ method, the initial entry-point to a command line program. Here’s the simplest example:

(defn -main
  [& args]
  (do-things args))

Clojure’s runtime will pack the java String array of command-line arguments into a Clojure seq named args via the variadic arglist. This simple main method just passes the args to a function that can handle them. However, this is suboptimal: it takes the burden of argument parsing and pushes it deeper into your application code. Really, if I saw this in the wild, my first impulse would be to separate the concerns as much as possible. The business code should be concerned with data on its own terms, and any adapters and bridges from other domains (shell-land, the internet, etc.) should live outside as meager servants. Core application/library code should be general and pure. Consider the case where do-things is of this form:

(defn do-things
  "Does the things"
  [{:keys [things-to-do when-to-do-them how-to-do-them]}]
  (actually-do-things things-to-do when-to-do-them how-to-do-them))

If you’re not familiar with the {:keys [..]} syntax, it’s a shortcut for pulling values out of associative maps, called destructuring. It’s a pervasive pattern in Clojure binding forms, with roots in Common Lisp. It hides the map lookup for each key behind syntactic sugar, using a macro (a rough sketch of the equivalence follows the example map below). We will take advantage of this ability later, but for now, think of ‘do-things’ as a function that takes a map of this form:

{:things-to-do ...
 :when-to-do-them ...
 :how-to-do-them ...}
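
As promised, a rough sketch of the destructuring equivalence (m here is just some hypothetical map):

(let [{:keys [things-to-do when-to-do-them]} m]
  [things-to-do when-to-do-them])

behaves the same as the explicit lookups

(let [things-to-do    (:things-to-do m)
      when-to-do-them (:when-to-do-them m)]
  [things-to-do when-to-do-them])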

The keys are Clojure keywords, and the values are whatever they need to be. In my feeble mind, I think of this structure as a sort of anonymous type, a specification of a contract for the input of the function. Since Clojure is so incredibly powerful at dealing with data, we can avoid writing code that would normally show up as API by taking advantage of rich declarative data. And code is data. Back to our -main function:

(defn -main
  [& args]
  (do-things (parse args)))


We’ve added an extra step: now there is a ‘parse’ function. By separating out the parsing of the args, we can test the parsing separately; however, the parse function’s output is now coupled to the contract of the do-things function. In most cases, this is not a substantial improvement, but it’s nice to have the option and freedom. The -main function is getting closer to being simple glue binding data transformations together, where each piece can exist, vary, be reasoned about, and be tested independently.

To fully decouple the parsing, -main has to have a transformation step. Here’s a simple example of what our parse function could look like in this scenario. It is now concerned only with the structure of the arguments and defines its own input/output contract. ‘some-magic-parsing-fn’ stands in for some library function that provides a data-centric interface to the arguments; for instance, getopt-style command-line input can be parsed into a Clojure map, with options and specs specifying how the data is transformed. We take that raw transformation and apply some defaults and such in our custom parse function.

(defn parse 
   [args]
   (let [parsed (some-magic-parsing-fn args)
         {:keys [in out munge]} parsed
         in (or in "default-in")
         out (or out (get-default-out in))]
     {:in in
      :out out
      :munge munge}))

Here we see a function that applies some defaults to the parsed arguments, even making the default for ‘out’ a function of ‘in’. So, we have our separately-concerned argument transformation; it’s independent from both our glue logic and our library code. All it’s really doing is munging: ‘destructuring’ and reassembling things into a new map form. But something feels odd and a little repetitive, no? What the heck is this about?

{:in in       
 :out out
 :munge munge}

It’s just ugly! Why did I do that? It turns out that having the ability to destructure in binding forms is hella convenient, and you can’t do anything similar inside a map literal. Neither can you refer to key/value pairs that you’ve already defined inline. In my case I had to do this same sort of thing a few times, so I thought it would be worth my while to make it pretty. Wouldn’t it be great to glue together a destructuring binding form that outputs a map? That seemed like a totally useful thing, so I made it. It’s kind of a ‘restructuring’ operation.

Implementation

I asked around on IRC for some advice on how to do this and got some help. I didn’t come up with it, but I can explain it :-).

(defmacro let-map
  [pairs]
  (let [names (map first (partition 2 pairs))]
    `(let [~@pairs]
       (zipmap
        [~@(map (comp keyword name) names)]
        [~@names]))))

Defmacro is itself a macro in Clojure. The Clojure compiler provides a hook to run code as a macro, that is, at compile time. The common use of a macro is to manipulate forms and spit out other code, i.e. syntactic abstraction, though it can be used for lots of things. The structure of ‘pairs’ is expected to be a name (symbol) followed by a value, just like a regular let. The first let extracts all the names from the bindings vector for later use. The backtick is the syntax-quote operator: it tells the reader to interpret the following s-expression as data, so calls are not evaluated. The major difference from the normal ‘ quote operator is that it namespace-qualifies any symbols, for instance:

user> '(let [a b c])
(let [a b c])
user> `(let [a b c])
(clojure.core/let [user/a user/b user/c])

Since a, b, and c don’t resolve to anything, they are qualified with the current namespace.

So, now we spit out a proper let form. Within the bindings of the let we see this nonsense: ~@pairs. What is that?! The ~ (unquote) switches the next form back to value mode, where symbols actually get evaluated into values; this is the mode we’re normally in when we write code. But pairs is a seq of values, so we have a problem: [[a b c d]] is not what we’re looking for. How do we un-nest the values from the inner vector? The @ is the key; combined with ~ it forms unquote-splicing, which splices (expands) the values of a seq directly into the surrounding form.

So, we have our let bindings; how do we output a map? We have to build up the keys and values. Zipmap is suitable for assembling a map from a seq of keys and a seq of values. We create the seq of keys by applying ‘keyword’ to the name of each symbol, splicing them into a vector. We create the seq of values by simply splicing in the names in value mode; the compiler will substitute values in place of the symbols, as in regular code. So, that totally works, but we don’t yet have destructuring. Wouldn’t it be nice if we could make that happen? It’s not so difficult actually, with some caveats.
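
Before moving on, a quick sanity check of this first version (a hypothetical REPL session; the printed key order of small maps may differ):

user> (let-map [a 1
                b (+ a 1)])
{:a 1, :b 2}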

Let’s see what ‘destructure’ gives us:

user> (destructure '[[a b] [1 2]])
[vec__2314 [1 2] a (clojure.core/nth vec__2314 0 nil) b (clojure.core/nth vec__2314 1 nil)]

Hmm…. interesante. It’s generating symbols and function calls to pull out the values that we need.

What if we just run our bindings through destructure? Here’s the new let-map:

(defmacro let-map
  "Avoids having to build maps from let bindings manually"
  [pairs]
  (let [pairs (destructure pairs)
        names (map first (partition 2 pairs))]
    `(let [~@pairs]
       (zipmap [~@(map (comp keyword name) names)]
               [~@names]))))

user> (let-map [[a b] [1 2]
                {c :c d :d} {:c 3 :d 4}])
{:d 4, :c 3, :map__2495 {:c 3, :d 4}, :b 2, :a 1, :vec__2494 [1 2]}

There you have it: if you’re willing to tolerate some extra gensyms for the destructured values, this will suffice, but I think we can do better. How about we walk through our input and pick out only the original symbols? Then we can assemble just the parts we want.

My first thought was to flatten everything, but ‘flatten’ only works on sequential things (seqs and vectors), not on the maps that can appear in destructuring forms. Still, if we look at the code for flatten, it gives us a clue about what to do.

(defn flatten
  "Takes any nested combination of sequential things (lists, vectors,
  etc.) and returns their contents as a single, flat sequence.
  (flatten nil) returns nil."
  {:added "1.2"
   :static true}
  [x]
  (filter (complement sequential?)
          (rest (tree-seq sequential? seq x))))

So, we want to walk the forms and identify any symbols. Destructuring works on vectors or maps, so it would make sense to expand only those. We look at the definition of tree-seq, which is a clever higher-order function that builds a sequence from a tree walk, branching when a predicate is true and pulling out a seq of children by calling a function on each node. It’s the right tool for the job!

user> (tree-seq #(or (vector? %) (map? %)) identity [[5 6] [1 2] {7 :c 9 :d}])
([[5 6] [1 2] {7 :c, 9 :d}] [5 6] 5 6 [1 2] 1 2 {7 :c, 9 :d} [7 :c] 7 :c [9 :d] 9 :d)

So, we create a seq, then we simply filter it for symbols. Since I always wish smart guys would show their thought process so I can learn how to be like them, I will show you all my mistakes along the way :-).

(defmacro get-symbols
  [form]
  (->> (tree-seq #(or (vector? %) (map? %)) identity form)
       (filter symbol?)))
user> (get-symbols '[[a b] [1 2] {c :c d :d}])
()

Oops…

(defmacro get-symbols
  [form]
  `(->> (tree-seq #(or (vector? %) (map? %)) identity form)  ;hmmm, I might be forgetting something here
        (filter symbol?)))

user> (get-symbols '[[a b] [1 2] {c :c d :d}])

No such var: user/form
  [Thrown class java.lang.RuntimeException]

Oops.

(defmacro get-symbols
  [form]
  `(->> (tree-seq #(or (vector? %) (map? %)) identity ~form)
        (filter symbol?)))

user> (get-symbols '[[a b] [1 2] {c :c d :d}])
(a b c d)

Success! Now, let’s hook it up to let-map.

(defmacro let-map
  "Avoids having to build maps from let bindings manually"
  [bindings]
  (let [bindings (destructure bindings)
        names (get-symbols bindings)]  ; as long as it's in there, right?
    `(let [~@bindings]
       (zipmap [~@(map (comp keyword name) names)]
               [~@names]))))

user> (let-map [[a b] [1 2]
                {c :c d :d} {:c 3 :d 4}])
{:d 4, :c 3, :map__3140 {:c 3, :d 4}, :b 2, :a 1, :vec__3139 [1 2]}

…try again…

(defmacro let-map
  "Avoids having to build maps from let bindings manually"
  [bindings]
  (let [names (get-symbols bindings)   ; immutability decomplects time and value
        bindings (destructure bindings)]
    `(let [~@bindings]
       (zipmap [~@(map (comp keyword name) names)]
               [~@names]))))

user> (let-map [[a b] [1 2]
                {c :c d :d} {:c 3 :d 4}])
{:d 4, :c 3, :b 2, :a 1}

Shazam! And if you want to inline get-symbols into let-map (as in the final version below), notice that nothing needs to be quoted, since inside the macro the bindings are already plain data.

EDIT: we also want to zip only unique names, and calling ‘name’ may be unnecessary.
EDIT: only traverse the odd-numbered elements of the bindings (the binding forms) for symbols, in case the values contain quoted symbols.

Final Code: let-map

(defmacro let-map
  "Avoids having to build maps from let bindings manually"
  [bindings]
  (let [names (->> (take-nth 2 bindings)
                   (tree-seq #(or (sequential? %) (map? %)) identity)
                   (filter symbol?)
                   (into #{}))  ; dumps it all into a set
        bindings (destructure bindings)]
    `(let [~@bindings]
       (zipmap [~@(map keyword names)]
               [~@names]))))

Here we can see the code the macro is producing:

user> (pprint (macroexpand-1 '(let-map [[a b] [1 2]
                                        {c :c d :d} {:c 3 :d 4}])))
(clojure.core/let
 [vec__3309
  [1 2]
  a
  (clojure.core/nth vec__3309 0 nil)
  b
  (clojure.core/nth vec__3309 1 nil)
  map__3310
  {:c 3, :d 4}
  map__3310
  (if
   (clojure.core/seq? map__3310)
   (clojure.core/apply clojure.core/hash-map map__3310)
   map__3310)
  c
  (clojure.core/get map__3310 :c)
  d
  (clojure.core/get map__3310 :d)]
 (clojure.core/zipmap [:a :b :c :d] [a b c d]))

TL;DR

What did we just do? We’ve put somewhat of a ‘design pattern’ into a single word. We can trade a little bit of effort for arbitrary expressivity. But we lose a little in having to grok an ever-more nested conceptual tree of abstractions. Personally, I think the tradeoff is worth it. Behold the power of lisp!

Here’s the final result; it would be much nastier if we had to explode out all the maps:

(defn parse 
   [args]
   (let-map [parsed (some-magic-parsing-fn args)
             {:keys [in out munge]} parsed
             in (or in "default-in")
             out (or out (get-default-out in))]))

(defn do-things
  "Does the things"
  [{:keys [things-to-do when-to-do-them how-to-do-them]}]
  (actually-do-things things-to-do when-to-do-them how-to-do-them))

(defn -main
  [& args]
  (let [{:keys [in out munge]} (parse args)]
    (do-things 
      (let-map [things-to-do (make-things in)
                when-to-do-them :now
                how-to-do-them (munging munge out)]))))
Categories: Clojure