Rename symbol in Emacs (refactor like a baws)

Burt Gummer

Emacs is the editor for real folk, and CIDER is one sweet piece of software for programming in Clojure, but some would argue that it lacks "big IDE" refactoring tools. This is partly true, but not entirely. Today I'm going to show you an easy and cool way to rename all references inside a single function. This can be useful if you want to rename a local variable, or a binding introduced by let.

There are certain tools that allow you to do project-wide replacement too, but they are outside the scope of this post. You should absolutely check out @bbatsov's excellent Projectile package for some leads. But let's get back to our original task.

Multiple cursors

Emacs users are not shy about borrowing useful features from other editors. Popularized by Sublime Text (probably), multiple cursors found their way to Emacs in Magnar Sveen's multiple-cursors.el package. You can install it by issuing M-x package-install multiple-cursors. This package provides many useful tricks, but for now we only need mc/mark-all-like-this-dwim. Bind it to some key and you are good to go.
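
For example, a minimal setup in your init.el might look like this (the key choice is only an illustration; pick whatever is free for you):

(require 'multiple-cursors)
;; example binding -- any free key will do
(global-set-key (kbd "C-c m") #'mc/mark-all-like-this-dwim)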

Renaming symbols

The workflow is the following: you are editing a function and spot a symbol you want to rename (likely a local variable); you move the cursor to that symbol, execute mc/mark-all-like-this-dwim, type your changes, and hit Enter. Bam-bam-tapatapatapa-BOOM!

[Animated demo: the renaming workflow]

Like a baws! In fact this is not specific to Clojure and can be used with any Lisp (or non-Lisp if combined with the next feature).

Narrowing

You'll notice how MC selected only the symbols in the same defn form, and not the ones outside it. That's because this command is intelligent, but there are also cruder tools that operate on the whole buffer. To tame those you'll need another useful mechanic called narrowing.

Narrowing in Emacs lets you focus on a particular region of the buffer as if there were nothing else in it. Sounds confusing? Well, it works confusingly too, which is why this functionality is disabled by default: if you trigger it accidentally, Emacs will ask you for confirmation. Now that you know what narrowing is, you can confidently enable it for future sessions.
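
If you'd rather skip the confirmation prompt entirely, you can enable it up front in your init.el (narrow-to-region is the command that ships disabled):

(put 'narrow-to-region 'disabled nil)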

You can read more about narrowing here, but for our task we'll only have to define a single function which I shamelessly stole from endlessparentheses.com. Put this into your init.el:

(defun narrow-or-widen-dwim (p)
  "If the buffer is narrowed, it widens. Otherwise, it narrows intelligently.
Intelligently means: region, subtree, or defun, whichever applies
first.

With prefix P, don't widen, just narrow even if buffer is already
narrowed."
  (interactive "P")
  (declare (interactive-only))
  (cond ((and (buffer-narrowed-p) (not p)) (widen))
        ((region-active-p)
         (narrow-to-region (region-beginning) (region-end)))
        ((derived-mode-p 'org-mode) (org-narrow-to-subtree))
        (t (narrow-to-defun))))

It is a Do-What-I-Mean command, so it will perform the correct action every time you call it. No need to memorize loads of functions; just bind this one to some key combination and you are set. This single function will narrow to a top-level sexp by default, narrow to the region if one is active, and widen if the buffer is already narrowed. Just splendid.
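
For example (again, the key is just an illustration):

(global-set-key (kbd "C-c n") #'narrow-or-widen-dwim)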

More?

If you know any other Emacs refactoring tricks, please share them in the comments. I know there is a whole project called clj-refactor.el which I've yet to try. But I wanted to tell you about this specific feature because it is simple, seamless and composable. Enjoy your hacking!

Permalink

Clojure Weekly, Nov 26th, 2014

Welcome to another issue of Clojure Weekly! Here I collect a few links, normally 4/5 urls, pointing at articles, docs, screencasts, podcasts and anything else that attracts my attention in the clojure-sphere. I add a small comment so you can decide if you want to look at the whole thing or not. That’s it, enjoy!

Read-Eval-Print-λove - Michael Fogus A real treat for Lisp lovers, this bi-weekly newsletter by Fogus looks more like an entire magazine considering the length and depth of each issue. The focus is on the Lisp family of languages, with plenty of pointers, bits of history, diagrams and snippets. In the first two issues Fogus illustrates the primordial functional concepts, showing a list of interesting Lisp dialects and McCarthy's description of what a Lisp language is. The second issue shows a Lisp implementation and its main constituents. Totally fine read.

as-> - Clojure Documentation I missed this threading macro variation for some reason. Like a let form, it creates a local binding in the context of a threading pipeline. It's very useful for all those situations where the target object is not necessarily always the first argument (->) or the last argument (->>) but a mix of the two, or even sits in the middle of a function call.
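
A quick sketch of my own to illustrate (not from the linked docs): as-> names the threaded value, so it can sit in whatever position each form needs.

(as-> {:a 1} m
  (assoc m :b 2)   ;; first argument
  (vals m)         ;; only argument
  (reduce + 0 m))  ;; last argument
;; => 3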

Rich Hickey - Inside Transducers - YouTube My first impression of transducers, when I first heard of them, was that they were closely related to the Reducers work, and I was wrong: there is much more to them. A lot more has been done with stateful transducers (for example take or partition-by), something that isn't possible in a parallel context. The design of transducers is more coherent and extends to many more areas than Reducers; one of them is certainly core.async channels. If you consider that there is still a lot of work to come, like promise-channels, endpoints and pipelines, the addition of transducers is creating a powerful set of concurrency abstractions. This talk by Rich gives a very good overview of all of the above, including useful code snippets. It is quite a dense presentation though, so it might require additional investigation to digest completely.

pixie-lang/pixie · GitHub That guy is on fire! Timothy Baldridge is already the author of so many things, screencasts, core libraries like core.async, and now, why not put an innovative language on the side? Pixie is written in RPython, the same language behind Python's fast cousin, PyPy. The RPython tooling allows for fast development of your own VM, including low-level concerns like garbage collection, and implements aggressive optimizations for you, such as a tracing JIT. It is heavily inspired by Clojure for its standard library, and things like pattern matching and easy native interaction are coming on top.

Writing Clojure in Vim Some good hints here for starting out with the right tooling for writing Clojure with Vim. Specifically interesting is the comparison between vim-paredit and vim-sexp, the latter not changing the usual mappings and hence more appealing to people who want to keep their finger memory intact. I approached vim-paredit once but couldn't be productive right away and now I keep procrastinating on it. I'm also of the impression that paredit's value in Vim is somewhat less than in Emacs, considering that all the motions to handle wrappings (quotes, symbols, parentheses, paragraphs and so on) are already built in.

rkneufeld/lein-try I was convinced I had already bookmarked this plugin here, but I was wrong. lein-try is simply amazing for trying out libs without having to set up a project or jump into a "test" one and change its dependencies. It is like creating a Leiningen-based project on the spot and starting the REPL. One example could be a client for some external service that you want to quickly try connecting to, without having to create a full project just to have a REPL.
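
For instance, assuming the plugin is installed in your user profile, a single command drops you into a REPL with the library on the classpath:

lein try clj-time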

Permalink

My Way into Clojure: Building a Card Game with Om - Part 1

In order to gain hands-on experience with functional programming, I wrote an HTML5 card game with Om, a "JavaScript MVC" written in ClojureScript. This first post starts our journey down the Clojure rabbit hole. I'll share my experiences getting started with Clojure, introduce the language's features and explain why its LISP syntax is a logical consequence of its deep infatuation with simplicity.

Permalink

Moving to Cryogen

The blog has officially been moved over to Cryogen. While all the content has been migrated over, the links for the posts have changed, and the original comments are no longer available since I'm now using Disqus.

Yuggoth was a fun experiment and it held up to most traffic storms over the years, but at the end of the day it's hard to beat the simplicity of a static site.

Porting the content from my old blog turned out to be a simple affair. I used Postgres to store all the blog content in Yuggoth. The database contains tables for the posts, the comments, the tags, and the files. All I had to do was extract the data and write it back out using the Cryogen format. First, I extracted the binary data for the files as seen below.

(defn write-file [{:keys [data name]}]
   (with-open [w (clojure.java.io/output-stream
                   (str "resources/templates/files/" name))]
     (.write w data)))

(defn extract-files []
  (doseq [file (sql/query db ["select * from file"])]
    (write-file file)))

The posts table contains the content, while the tags are stored in a separate table. The tags can be aggregated by post using the handy array_agg function. This function will produce a Jdbc4Array as the result, and its contents can then be extracted to a vector.

(defn extract-tags [post]
  (update-in post [:tags] #(vec (.getArray %))))

(defn get-posts []
  (map extract-tags
       (sql/query db
         ["select array_agg(t.tag) as tags,
                  b.id, b.time, b.title, b.content from blog b, tag_map t
           where t.blogid = b.id
           group by b.id, b.time, b.title, b.content"])))

Now, all that's left to do is to generate the post metadata and the file name. Since each post contains a publication date and a title, these can be used to produce a filename in the format expected by Cryogen.

(defn format-post-date [date]
  (let [fmt (java.text.SimpleDateFormat. "dd-MM-yyyy")]
    (.format fmt date)))

(defn format-post-filename [time title]
  (str
    (->> (re-seq #"[a-zA-Z0-9]+" title)
         (clojure.string/join "-")
         (str "resources/templates/md/posts/" (format-post-date time) "-"))
   ".md"))

With that in place we can simply run through all the posts and extract them into appropriate files.

(defn write-post [{:keys [id time tags content title]}]
  (with-open [wrt (clojure.java.io/writer (format-post-filename time title))]
    (.write wrt
            (with-out-str
              (clojure.pprint/pprint
                {:title title
                 :layout :post
                 :tags (vec tags)})))  ;; tags is already a vector thanks to extract-tags
    (.write wrt "\n")
    (.write wrt content)))

(defn extract-posts []
  (doseq [post (get-posts)]
    (write-post post)))

And that's all there is to it. The moral of the story is that we should always keep the data separate from its representation, as you never know when you will need it down the road.

Permalink

Updates and Architecture

It's been about two weeks since I released Cryogen to the public and the feedback has been great, so I'd like to go over some of the changes that have been made thus far and how everything fits together behind Cryogen.

Updates

Someone pointed out to me that posts or pages that were deleted from resources/templates would have to be manually deleted from resources/public in order for the sitemap and RSS feed to be updated correctly since these two files are generated based on the contents of the public folder.

To remedy this, I updated the compiler to wipe out the public folder and regenerate all of its contents with each compile. I was actually worried that this would increase compile time, but Yogthos has since moved his blog to Cryogen and found no performance problems even with his 60+ existing posts.

Previously, asset folders like css and javascript were stored under public, but they have now been moved to templates. Everything in the public folder is now generated by the compiler. I've added the :resources key to the config that lets you specify which asset folders you would like to copy over from templates to public.

:resources ["css" "js" "img"]

The copying is done by this simple function in src/cryogen/io.clj using fs.

(defn copy-resources [{:keys [blog-prefix resources]}]
  (doseq [resource resources]
    (fs/copy-dir
      (str "resources/templates/" resource)
      (str public blog-prefix "/" resource))))

Cryogen also has Disqus support now. Simply register your blog on Disqus, add your disqus-shortname to the config, set disqus? to true, and Selmer will inject the info into the following script in the post template.

{% if disqus-shortname %}
<div id="disqus_thread"></div>
<script type="text/javascript">
    (function() {
        var dsq = document.createElement('script'); dsq.type = 'text/javascript'; dsq.async = true;
        dsq.src = '//{{disqus-shortname}}.disqus.com/embed.js';
        (document.getElementsByTagName('head')[0] || document.getElementsByTagName('body')[0]).appendChild(dsq);
    })();
</script>
{% endif %}
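
The corresponding config entries might look something like this (a sketch showing only the keys mentioned in this post):

{:disqus?          true
 :disqus-shortname "myblog"
 :resources        ["css" "js" "img"]}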

For long posts or pages with numerous headings, you can now generate a table of contents. Just add :toc true to the metadata of the post or page where you want to generate a toc. For example:

{:title "Updates and Architecture"
 :layout :post
 :tags [:cryogen :clojure]
 :toc true}

Some other updates:

  • Can now customize the number of recent posts shown
  • Sass support by turbopape
  • Archive sorting fixed for users with a locale that isn't "en"

Architecture

Sometimes things are obvious to you when you design them but not necessarily to others. I've received a few questions regarding templating on the Cryogen repo so I thought I'd go over the basic architecture here.

The reason I created Cryogen was because I wanted a site generator (in Clojure!) that had a clear separation between content and layout. The idea is that you create HTML templates under resources/templates/html/layouts with whatever layout/theme you want and the content of your posts and pages from resources/templates/md gets injected by the compiler through Selmer.

Post/Page Layouts

Each markdown file representing a post or page must contain metadata about the layout in the :layout key that corresponds to an HTML file under html/layouts. For example, this post's metadata is as follows:

{:title "Updates and Architecture"
 :layout :post
 :tags [:cryogen :clojure]}

When the site gets built, this is what puts the post together:

(spit (str public (:uri post))
      (render-file (str "templates/html/layouts/" (:layout post))
                   (merge default-params
                          {:servlet-context  "../"
                           :post             post
                           :disqus-shortname disqus-shortname})))

Selmer will render the post and the compiler will spit it out into the public folder.

The second argument passed into render-file is a map of items that Selmer will inject into the specified layout. In this case, that will be post.html.

<div id="post">
    ...
    <h2>{{post.title}}</h2>
    ...
    <div>
        {{post.content|safe}}
    </div>
    ...
</div>

Here, post is another map of items so {{post.title}} and {{post.content}} will render the values of :title and :content from post. Pages and tags are compiled in a similar fashion.

Templates

This segues into how inheritance works in the templates. In the layouts folder, base.html contains the header, sidebar and footer of the site. These are the things that remain constant no matter what page or post you are viewing. The page/post content gets injected into the main reading pane with {% block content %}:

<body>
    <!--header-->
    <!--sidebar-->
    ...
    <div id="content">
    {% block content %}
    {% endblock %}
    </div>
    ...
    <!--footer-->
</body>

To inherit the base template, all the other HTML files in the layouts folder start with a line that specifies the path of the base layout they should inherit, followed by the layout for that page or post wrapped in {% block content %}.

A short example is the layout for pages.

{% extends "templates/html/layouts/base.html" %}
{% block content %}
<div id="custom-page">
    <div id="page-header">
        <h2>{{page.title}}</h2>
    </div>
    {% if page.toc %}{{page.toc|safe}}{% endif %}
    {{page.content|safe}}
</div>
{% endblock %}

And that's it! If you want to change the header, sidebar or footer - change it in base.html. If you want to add more post or page layouts, create a new html file under resources/templates/html/layouts and follow the above structure. If you want to add different filters to your content, please check out the Selmer docs.

Permalink

Lost in Transduction - Heresy and ingratitude edition

Were you at Clojure/conj in Washington last week? If so, hello again. Wasn't that a great conference? If not, head to Clojure TV, where all the talks are ready for streaming. Assuming some moderate level of Clojure obsession on your part, I couldn't recommend skipping any of them, so the full catch-up might take you a while, but there are two in particular that I strongly recommend.

Avoiding altercations

The first is actually the very last talk of the conference. Brian Goetz, whom you may have encountered previously as the author of Java Concurrency in Practice or heard of as the Java Language Architect at Oracle, spoke about Stewardship: The Sobering Parts. To talk about Java before an audience known to derive a certain amount of enjoyment from deriding the language takes mettle, nuance and wit, all of which he displayed in abundance. Of course, he wasn't trying to convert an audience of Lispers to worship at the altar of curly braces, but to convey some sense of the responsibility you have when 9 million programmers use your language and a good number of the world's production systems are running in it.

Clojure isn't (yet) in that position, but one that was, at least in proportion to the total amount of code at the time, is COBOL. I'm not sure you can find anyone to defend it from an aesthetic or theoretical standpoint, but it was a very good fit for the computers of the day and for the purposes to which they were put.

If you feel faint at the sight of GO TO statements, it might be time for a stiff drink, because we're going to talk about something even worse. That might be hard to imagine from the vantage point of our enlightened age, but it's true. Sometime in the 1980s, the ANSI standards committee for COBOL introduced the ALTER statement:

[Image: the COBOL ALTER statement]

The thinking apparently was that if you have a good working memory and like puzzles, ordinary spaghetti code isn't going to be challenging enough, so you need self-modifying spaghetti code. What ALTER did was modify the destination of a specified GO TO statement, so that, from now on, it would go somewhere else.

Brian thinks that this was the precise moment when COBOL jumped the shark.1 It had had a pretty good run and a decent remaining cadre of developers, but the ALTER statement pretty much guaranteed that any codebase under active development would eventually become unmaintainable.

Which brought him to this immortal line:

Java's ALTER statement is only one bad decision away.

Although it's hard to type with meat-axes instead of hands, I'm going to attempt a minor refactoring:

${LANG}'s ALTER statement is only one bad decision away.

Always be Composing

Does this section title seem too sensible and eloquent for me to have thought of myself? It is! Actually, it's the title of another great talk, one by Zach Tellman, and the title isn't even the best part. There's this:

Composability isn't whether people can put two things together, but whether they are willing to.

I'm tempted to do that thing people do in blogs where they say something, and then say they're going to repeat it because it's so great, and then repeat it. But shucks, I've never had the panache to pull off something like that, and, anyway, I haven't explained yet what it means.

Composition, broadly speaking, is what emerges when you can combine simple pieces in different ways to produce complex and interesting results. Zach illustrated this with Sierpinski gaskets,2

exploring the tradeoffs as you model them as macros, code or data. Other things you can compose are Lego, phonemes and cold-cuts. For example, with Lego and a bit of discipline, you can make really complicated things.

Lego is much more popular than this other thing called Soma, which Soma programmers say is unfair, because they can make a dog too:

And, if they could get hold of enough pieces, plus some glue, an instruction manual in English, and a dependency management system, they could probably make great big, complicated dogs. Unfortunately, some people think of Soma as an academic block system and complain that making things with it is like solving a puzzle.

One of the most well known FP abstractions is of course the chaining of sequence operations,

(->> [1 2 3 4] (map inc) (filter even?) (mapcat #(range % (+ 5 %))) sort)
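;; => (2 3 4 4 5 5 6 6 7 8)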

which clicks at such a deeply intuitive level that you can easily understand the same algorithm in languages you ostensibly don't know.

Which brings us to transducers. It would be extremely unfair to leave Zach open to charges of unsubtlety, so I'm going to quote the relevant section in full:

Transducers are a very specific sort of composition. They're not a higher order of abstraction. They're actually very narrowly targeted at a certain kind of operation we're trying to do. You can't compose a transducer with any function. And you can't even transduce with every kind of sequence operation. Certain operations such as sort, which require us to realize the sequence are not represented as transducers, so if we're looking at our code and we have some sort of great big chain of operations, one of which is one of these non-transducible things, we now need to separate that out, have certain things applied as a transducer and apply them, then apply these other operations separately. This is not to say that transducers are bad. I think they're a really interesting and very, very useful tool, but I do think it's interesting to look at how these are going to shape the tutorials for Clojure, because there's a nice sort of simplicity and immediacy to be able to say, map a function over my sequence. I think that it would be a little bit difficult for someone who's entirely new to Clojure to have their first operation over a sequence to be defined as a transducer.

Chronologically, the first quote comes after this one. There was some space in between them, but, for me, they resonated together.

History, Nomenclature and Ambition

Transducers are an ambitious concept with an interesting history. Well, it's history in the sense of 2 1/2 exciting years since the original reducers post, but we live in interesting times. That post used transducers but called them by other names, including "reducer fn" and "reducing fn", while the usual binary function passed to reduce was (and, in reducers.clj, still is) referred to as a "combining function." In any case, transducers are not yet the main event.

The next installment arrived roughly a year later, with Transducers are Coming. I think the word "important" can legitimately be applied to that post, but perhaps not the word "clear." Among other things, it introduced the word "transducer" as a synonym for one of the few terms for them that hadn't actually been used, viz. "reducing function transformer." At some point, I attempted a glossary, which seems to have been consulted quite a few times without critical comments, so there's a possibility that might have been accurate. I use the past perfect, because today, at the very least, it is no longer complete, as it doesn't mention the latest innovation, educers.

Clarity and hospitality

It is true that I enjoy making fun of other people's prose, but, in my defense (1) I always introduce a solecism or two of my own, to keep credibility down to unintimidating levels and (2) it doesn't really matter what I say. As an Unimportant Person, I have the luxury of expressing myself in a manner that amuses me, and if the barrage of flippancy causes someone to stop reading, the world will continue to turn. When we're talking about core features of an important computer language, the stakes are higher. It is a problem that many experienced Clojure programmers are demonstrably confused by transducers; it would be a bigger problem if we didn't care; it would be truly tragic if it got to the point where newcomers to Clojure were welcomed -- as they are to certain other languages -- with the belittling advice to come back after training their brains.

The expressivity-ink ratio

The compositional power of a system is inversely proportional to the complexity of its description. Kernighan and Ritchie in its first edition comprised a mere 228 pages, and they were actually all you needed to start coding. The successors of C may have had more expressive power, but that increased expressivity came at a huge cost in confusion and verbiage. Clojure, of course, cheats by being a lisp variant, but even among lisps it stands out for its crystalline internal coherence. It simply makes so much sense that the necessary textual documentation can be contained in mouseovers (or C-c C-d d or what have you). As with C, a small amount of information lets you start coding; better than C, the resulting code has a good chance of being correct.

Whatever you think of transducers, it's hard to argue that they don't require a lot of explanation. At the very least, they spawn tutorials at a pretty good clip.3 We're not quite at monadic levels, but the level of ambient perplexity represents a shift for Clojure. Perhaps some of the confusion will dissipate once the professional scribes pump out their next editions, but it seems possible to me that some is in the nature of the transductional beast. For example, there are quite a few functions involved in the transducer --

  1. The transducer is itself a function. (Not really a complaint.)
  2. It's a function of a function. (Not a complaint either. This is a functional language after all.)
  3. The function returns a function.
  4. map-like transducers are functions of functions returning a function of a function that returns a function.

-- of which the last is required to treat its result argument in a somewhat ritualistic fashion, passing it about in certain ways while never looking at it.
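
To make those layers concrete, here is a minimal hand-rolled sketch of a map-like transducer (my own illustration of the shape, not the clojure.core implementation):

(defn mapping [f]            ;; a function of a function (the transform f)...
  (fn [rf]                   ;; ...returning a function of a function (the reducing fn rf)...
    (fn                      ;; ...that returns a function: the new reducing fn
      ([] (rf))              ;; init
      ([result] (rf result)) ;; completion
      ([result input]        ;; step: `result` is passed along, never inspected
       (rf result (f input))))))

(transduce (mapping inc) conj [] [1 2 3])
;; => [2 3 4]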

The sides of our intent

Transducers address two weaknesses of the standard approach to chaining operations over collections. First, they allow composition of sequence operations in a manner independent of the collection or delivery mechanism holding the sequence. Second, they are more efficient than the usual methods, because execution does not require reifying multiple (possibly lazy) collections to feed each other, and the transformations are in sufficient proximity that an optimizer might be able to do something with them.

There are also at least two features that could be seen as advantageous but may not be as fundamental. First, the composition can be accomplished, literally, with comp, because transducers are functions of arity 1. Second, transducers are optimally suited for use in reduction operations, since what that function does is transform a reducing (or combining) function.
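
A tiny sketch of both points (my own example, not from the post):

;; each transducer is a function of arity 1 (the next reducing fn),
;; so plain comp composes them; transformations apply left-to-right here
(def xf (comp (map inc) (filter even?)))

(into [] xf [1 2 3 4])
;; => [2 4]

;; and the same xf slots directly into a reduction
(transduce xf + 0 [1 2 3 4])
;; => 6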

Is it important to do all these things at the same time? Is there, at some mathematical level, profundity in that we could do so? Perhaps, to both questions. But I don't think the answer is obvious.

Modest proposal

Actually two proposals, the more specific of which is indeed modest, offered only as an example of the sort of thing one might think about. The broader proposal, that we should take a step back and think about things, takes a bit of gall. To make up for that, I promise not to talk about type systems.

Rewind a couple of years, and ask yourself what, to a functional programmer, is the most obvious way to transform a sequence of values into another sequence, containing zero or more elements for each original value. Wait, I know. It's this:

(defn one-to-many [x] (if (something x) [(foo x) (bar x) "whiffle"] []))
(mapcat one-to-many my-seq)
;; composition
(->> my-seq
    (mapcat one-to-many)
    (mapcat another-to-many))

The pattern is pretty well enshrined; it is not controversial. Now, suppose you had the additional requirement that the transformations might potentially depend on previous values. One obvious possibility is a function that takes a state as well as an input and returns both the output values and the new state, e.g. [ [out1 out2 ...] new-state]:

(defn one-to-many-with-state [x s]
  (if (something x s)
    [[(foo x)] (inc s)]
    [[(bar x) (bar x)] (dec s)]))

So far, this is all in the category of the obvious, and, happily, everything that follows is in the category of internal implementation details that need not be obvious.

mapcatreduce

At this point, we're going to need something other than mapcat to apply our function. This something is going to be a hybrid of mapcat and reduce, where the former gets applied iteratively to each input, while the latter accumulates state. This will do for now:

(defn translate-seqable [tl state inputs]
  (let [[input & inputs] inputs
        [outputs state] (tl input state)]
    (if inputs
      (concat outputs (lazy-seq (translate-seqable tl state inputs)))
      (concat outputs (first (tl nil state))))))

The last line introduces a small twist: we're going to invoke tl one last time with a nil value, so it can clear out any residual state.

We can write a similar mapcat/reduce to translate the contents of an async channel:

(defn- translate-channel [tl s0 c-in & [buf-or-n]]
  (let [c-out (chan buf-or-n)]
    (async/go-loop [state s0]
      (let [v (<! c-in)]
        (if-not (nil? v)
          (let [[vs state] (tl v state)]
            (async/onto-chan c-out vs false)
            (recur state))
          (let [[vs _] (tl nil state)]
            (async/onto-chan c-out vs true)))))
    c-out))

Both of these could probably be written more efficiently, perhaps directly in Java, but let's not bother with that yet.

Example translators

The finishing logic necessitates checking for a nil input, which under most circumstances just means wrapping the body in an (if-not (nil? v) ...).

The standard mapping and predicate filtering ignore state,

(defn tmap [f]
  (fn [v _] (if-not (nil? v) [[(f v)]  nil])))

(defn tfilter [p]
  (fn [v _]
    (if-not (nil? v) 
      [(if (p v) [v] []) nil])))

while the canonical deduplicator uses it to hold a scalar:

(defn dedup [v state]
  (if-not (nil? v)
    [(if (not= v state) [v] []) v]))

Duplication does not use state, but does make use of the ability to return multiple values:

(defn dup [v _]
  (if-not (nil? v) [[v v] nil]))

For fun, we can implement a sorting translator that just swallows input until finishing time, when it releases the sorted input in one go:

(defn tsort [v s]
  (if-not (nil? v) [[] (cons v s)] [(sort s) nil]))

(To be fair, one can do this with transducers too.)

composition

Suppose, next, that we would like to be able to compose multiple one-to-many-with-state-like functions into a single function to pass to one of these translate functions. This single function will consume a state that is actually a vector of the states of the functions that compose it. The only comments I'll make about the following proof-of-concept implementation are that (1) it can undoubtedly be done more efficiently, and (2) its complexity is not something the user needs to think about:

(defn tcomp
  "Compose stateful translators; state in the returned translator is a vector of composed states."
  [& tls]
  (fn [input states]
    (loop [[tl & tls]   tls       ;; loop over translators
           [s & ss]     states    ;;           and states
           input        [input]   ;; accrue flattened inputs
           s-acc        []]       ;; new states
      (if-not tl
         [input s-acc ] ;; done
         (let [[input s] (translate-seqable tl s input)]
           (recur tls ss input (conj s-acc s)))))))

protocols

Finally, we wrap our translate-whatevers in a protocol, so we can just call translate, irrespective of the container:

(defprotocol ITranslatable
  (translate* [this s0 tl]))
(extend-protocol ITranslatable
  clojure.lang.Seqable
  (translate* [this s0 tl] (translate-seqable tl s0 this))
  clojure.core.async.impl.protocols.Channel
  (translate* [this s0 tl] (translate-channel tl s0 this)))
(defn translate [translator input] (translate* input nil translator))

rah

For completeness,

(def blort (tcomp (tmap inc) (tfilter even?) dedup (tmap inc) tsort))
(translate blort [3 1 1 2 5 1])
;;(3 3 5 7)

(def c (chan))
(def ct (translate blort c))
(async/onto-chan c [3 1 1 2 5 1])
(<!! (async/into [] ct))
;; [3 3 5 7]

As bonuses, we never have to deal with mutability, and we don't have to take any special care to do exactly the right thing with those result values we're not supposed to look at.

Question and Answer

Do you hate transducers?

No, I think they're cool.

So, what are you complaining about?

Maybe they're too complicated. Yes, I can wrap my head around that complexity, but that's the sort of thing I do for fun. Most aspects of Clojure not only do not require a love of complexity but appeal to people who loathe it. Transducers stand out as an exception.

What's so complicated?

I answered this above, but, to summarize, the reduce heritage seems to stick out more than necessary.

Didn't Christophe Grand already make these points?

He made some of them. As I recall (his site is down at the moment), he too objected to the visibility of reduce semantics in contexts that were really about transformation. However, he reached somewhat different conclusions from mine, in the end proposing that every transducer-like thing be intrinsically stateful and mutative, in the interests of simplicity and efficiency. The arguments for doing this could be applied nearly anywhere we use a functional style, which implies strong arguments against it that are familiar enough not to require rehearsal here.

Your solution is inefficient.

True, but it's not supposed to be a solution. It's somewhere between a rumination and a proof of concept. Something to think about. If we were to go in a similar direction, there are many possible optimizations. For example, I don't see anything wrong with violating functional purity within core language facilities; rather, what bothers me is when users are regularly required to do so.


  1. Does anyone have a clip of the scene in Battlestar Galactica where someone actually shouts, "Jump the shark!"? Google is failing me here. 

  2. He didn't use any of these illustrations, particularly not the third one. 

  3. Guilty as charged. 

Permalink

Clojure Software Development Engineers, Amazon.com

Amazon.com is the leading online retailer in the United States, with over $75bn in global revenue. At Amazon, we are passionate about using technology to solve business problems that have big customer impact.

CORTEX is our next generation platform that handles real-time financial data flows and notifications. Our stateless event-driven compute engine for dynamic data transforms is built entirely in Clojure and is crucial to our ability to provide a highly agile response to financial events. We leverage AWS to operate on a massive scale and meet high-availability, low-latency SLAs.

Combining a startup atmosphere with the ambition to build and utilize cutting-edge reactive technology, the Cortex team at Amazon is looking for a passionate, results-oriented, innovative Sr. Software Engineer who wants to move fast, have fun and be deeply involved in solving business integration problems across various organizations within Amazon.

If this describes you – our team is a great fit:

  • You obsess over software performance and challenge yourself and others to deliver highly scalable, low latency, reliable and fast computation platforms.
  • You’ve got great ideas and you know how to solve problems, but you also follow through with a clean and maintainable implementation.
  • You have a high bar for coding excellence and a passion for design and architecture.

Our technology stack: Clojure, JVM, AWS tools, Sable

Basic Qualifications 

  • 3+ years of experience designing, building, deploying, operating, scaling, and evolving distributed systems and high-volume transaction applications in a 24/7 environment
  • 3+ years of industry experience in software development in Java or C++
  • Exceptional customer relationship skills including the ability to discover the true requirements underlying feature requests, recommend alternative technical and business approaches, and lead engineering efforts to meet aggressive timelines with optimal solutions
  • Bachelor’s Degree in Computer Science or related field or equivalent work experience

Preferred Qualifications 

  • Background or strong interest in Clojure
  • Experience with cloud technologies from AWS
  • Proficiency in a Unix/Linux environment
  • Graduate degree (MS/PhD) a plus
  • Experience mentoring and developing junior SDEs

If you are interested, contact Janney Jaxen, Technical Recruiter, Retail Systems  janneyj@amazon.com


Permalink

My first Conj

Thanks to the good people at Cognitect and the sponsors of the Opportunity Grant, I had the opportunity to speak at Clojure/conj 2014. It was the second time I've given a talk, and the fourth tech conference I've attended. And it's been amazing!

The venue, Warner Theatre, is the most beautiful conference venue I’ve seen. Just look at this photo! It might have been a little cold, but nothing that hot coffee or tea wouldn’t fix.

Warner Theatre chandelier (image by Warner Theatre)

The talks were awesome, the speakers well prepared and approachable, and the organisers friendly and helpful. I'm very glad my talk was second on the first day, as I could then just sit back and take it all in. And there was a lot to take in. An eye-opening talk about data-driven systems by Paul deGrandis, a brain-melting introduction to persistent data structures by Michał Marczyk, a very educational talk about variants by Jeanine Adkisson, a hilarious walk through game development by Zach Oakes, two amazing keynotes and lots of other talks left me with a long TODO list. There's so much I want and need to learn, and I'm very grateful to all the speakers and attendees for sharing their knowledge and experiences. You can watch all the talks on the Clojure TV YouTube channel.

But as we all know, people are what make a great conference. We can watch the talks online, but we can't talk to the speakers and other attendees, be challenged, go through "a-ha!" moments or have a small hackathon in the hotel lobby. We can't make new friends and new interesting contacts. Being at Clojure/conj provided me with all of the above. I've said it before, and I'm going to say it again: the Clojure community is the friendliest community I've experienced. I felt very welcomed and supported both as a speaker and an attendee. And I felt safe and treated like an equal. I truly hope other communities will follow in our footsteps, because nothing inspires more than a room full of smart people who don't try to prove their superiority but share their knowledge instead. We all have a common goal after all: improve technology, improve the industry, improve our lives and ourselves.

Dinner with the Lambda Ladies

Dinner with the Lambda Ladies, sponsored by the kind people at Living Social, was a blast too! Lots of good advice and laughter – I hope it won't be long before I see you again!

I’ve returned to London energised, with some great memories that I play on repeat in my head. I hope this state of mind will last a little longer or at least till the next event.

Thank you everyone who made Conj so memorable – I hope we meet again!  (Anyone going to Clojure eXchange 2014?)

Permalink

Cassaforte 2.0.0-rc1 is released

TL;DR

Cassaforte is a Clojure client for Apache Cassandra. It is built around CQL 3 and focuses on ease of use. You will likely find that using Cassandra from Clojure has never been so easy.

2.0.0-rc1 is a release candidate for 2.0.

Changes between 1.3.x and 2.0.0-rc1

Compared to 1.3.x, 2.0.0-rc1 has one major breaking API change.

Client (Session) is Explicit Argument

All Cassaforte public API functions that issue requests to Cassandra now require a client (session) to be passed as an explicit argument:

(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]
            [clojurewerkz.cassaforte.cql    :as cql]))

(let [conn (cc/connect ["127.0.0.1"])]
  (cql/use-keyspace conn "cassaforte_keyspace"))
(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]
            [clojurewerkz.cassaforte.cql    :as cql]
            [clojurewerkz.cassaforte.query :refer :all]))

(let [conn (cc/connect ["127.0.0.1"])]
  (cql/create-table conn "user_posts"
                (column-definitions {:username :varchar
                                     :post_id  :varchar
                                     :body     :text
                                     :primary-key [:username :post_id]})))
(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]
            [clojurewerkz.cassaforte.cql    :as cql]))

(let [conn (cc/connect ["127.0.0.1"])]
  (cql/insert conn "users" {:name "Alex" :age (int 19)}))

Hayt Upgraded to 2.0

Hayt was upgraded to 2.0.

clojurewerkz.cassaforte.cql/iterate-table Now Terminates

clojurewerkz.cassaforte.cql/iterate-table no longer produces an infinite sequence.

URI Connections

It is now possible to connect to a node and switch to a namespace using a URI string:

(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]))

;; connects to node 127.0.0.1:9042 and uses "new_cql_keyspace" as keyspace
(cc/connect-with-uri "cql://127.0.0.1:9042/new_cql_keyspace")

Cassandra 2.1 Compatibility

Cassaforte 2.0 is compatible with Cassandra 2.1.

Prepared Statement Cache Removed

Prepared statement cache was affecting client correctness in some cases and was removed.

Policy Namespace

Policy-related functions from clojurewerkz.cassaforte.client were extracted into clojurewerkz.cassaforte.policies:

(require '[clojurewerkz.cassaforte.policies :as cp])

(cp/exponential-reconnection-policy 100 1000)
(require '[clojurewerkz.cassaforte.policies :as cp])

(let [p (cp/round-robin-policy)]
  (cp/token-aware-policy p))

DataStax Java Driver Update

DataStax Java driver has been updated to 2.1.x.

Cassandra Sessions Compatible with with-open

Session#shutdown was renamed to Session#close in cassandra-driver-core, and Cassaforte has been adapted to that change, so sessions can now be used with with-open.

Contributed by Jarkko Mönkkönen.
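
A minimal sketch of what that enables, reusing the cc/cql aliases from the examples above:

(with-open [conn (cc/connect ["127.0.0.1"])]
  (cql/use-keyspace conn "cassaforte_keyspace"))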

TLS and Kerberos Support

Cassaforte now supports TLS connections and Kerberos authentication via DataStax CQL extensions.

The :ssl connection option now can be a map with two keys:

  • :keystore-path
  • :keystore-password

which provide a path and password to a JDK KeyStore on disk, created with keytool.

Optionally, an instance of SSLOptions can be provided via the :ssl-options connection option.

Contributed by Max Barnash.

Clojure 1.4 and 1.5 Support Dropped

Cassaforte now requires Clojure 1.6.0.

Collections Converted to Clojure Data Structures

Cassandra maps, sets and lists are now automatically converted to their immutable Clojure counterparts.

Atomic Batches Support

Atomic batches are now easier to use with Cassaforte:

(require '[clojurewerkz.cassaforte.client :as client])
(require '[clojurewerkz.cassaforte.cql :as cql :refer :all])
(require '[clojurewerkz.cassaforte.query :refer :all])
(require '[qbits.hayt.dsl.statement :as hs])

(let [s (client/connect ["127.0.0.1"])]
  (cql/atomic-batch s (queries
                         (hs/insert :users (values {:name "Alex" :city "Munich" :age (int 19)}))
                         (hs/insert :users (values {:name "Fritz" :city "Hamburg" :age (int 28)})))))

Query DSL Taken From Hayt 2.0

Cassaforte no longer tries to support query condition DSLs for both Hayt 1.x and Hayt 2.0. Hayt 2.0 is the only supported flavour now and is the future.

Some examples of the changes:

;; before
(where :name "Alex")

;; after
(where [[= :name "Alex"]])
(where {:name "Alex"})


;; before
(where :name "Alex" :city "Munich")

;; after
(where [[= :name "Alex"]
        [= :city "Munich"]])
(where {:name "Alex" :city "Munich"})


;; before
(where :name "Alex" :age [> 25])

;; after
(where [[= :name "Alex"]
        [> :age  25]])

;; before
(where :name "Alex" :city [:in ["Munich" "Frankfurt"]])

;; after
(where [[= :name "Alex"]
        [:in :city ["Munich" "Frankfurt"]]])

As it’s easy to see, the new condition style closer resembles Clojure itself and thus was a reasonable decision on behalf of Hayt developers.

Keyspace as Option

It is now possible to choose keyspace via an option:

(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]))

(let [conn (cc/connect {:hosts ["127.0.0.1"] :keyspace "a-keyspace"})]
  )

Contributed by Max Barnash (DataStax).

Clojure 1.7.0-alpha2+ Compatibility

Cassaforte is now compatible with Clojure 1.7.0-alpha2 and later versions.

GH issue: #60.

Support for overriding default SSL cipher suites

Providing a :cipher-suites key in the :ssl connection option allows you to specify which cipher suites are enabled when connecting to a cluster with SSL. The value of this key is a Seq of Strings (e.g. a vector) where each item specifies a cipher suite:

(ns cassaforte.docs
  (:require [clojurewerkz.cassaforte.client :as cc]))

(cc/build-cluster {:ssl {:keystore-path     "path/to/keystore"
                         :keystore-password "password"
                         :cipher-suites     ["TLS_RSA_WITH_AES_128_CBC_SHA"]}})

The :cipher-suites key is optional and may be omitted, in which case Datastax Java driver’s default cipher suites (com.datastax.driver.core.SSLOptions/DEFAULT_SSL_CIPHER_SUITES) are enabled.

This can be used to work around the need to install the Java Cryptography Extension (JCE) Unlimited Strength Jurisdiction Policy Files required by the default set of cipher suites. TLS_RSA_WITH_AES_128_CBC_SHA is a suite in the default set that works with the standard JCE, so specifying just that one, as in the code example above, makes the standard JCE sufficient.

Contributed by Juhani Hietikko.

GH issue: #61.

News and Updates

New releases and updates are announced on Twitter. Cassaforte also has a mailing list, feel free to ask questions and report issues there.

Cassaforte is a ClojureWerkz Project

Cassaforte is part of the group of libraries known as ClojureWerkz, together with

  • Langohr, a Clojure client for RabbitMQ that embraces the AMQP 0.9.1 model
  • Monger, a Clojure MongoDB client for a more civilized age
  • Elastisch, a minimalistic Clojure client for ElasticSearch
  • EEP, a Clojure library for stream (event) processing
  • Neocons, a Clojure client for the Neo4J REST API
  • Quartzite, a powerful scheduling library

and several others. If you like Cassaforte, you may also like our other projects.

Let us know what you think on Twitter or on the Clojure mailing list.

About the Author

Michael on behalf of the ClojureWerkz Team.

Permalink

Senior Clojure Developer - Chicago - will help relocate - great salary

Our client's technology teams pride themselves on technical excellence, and on building a culture that they are proud to be a part of. They want people who think deeply about how to build software, and who are also open to new ideas and approaches in a finance and technology world that is constantly changing. They believe in agile with a small "a," and in hiring the best and brightest and then keeping obstacles out of their way so they can do what they do best.

Responsibilities:

Working in multiple languages, including Clojure, JavaScript, and JRuby

Designing, programming, and testing of a variety of applications from servers to Excel plug-ins to web UIs

Testing at the unit, functional, and integration level

Working on green field development

Learning and promoting new technologies

Desirable Experience:
We are looking for strong developers of any stripe, but familiarity with the below will give candidates a leg up, as will a passion for Functional Programming:

• Working with the JVM

• Clojure, Ruby, C#, Java, JavaScript

• Rails, HTML, CSS, JavaScript, jQuery

• SQL

• Rabbit, AMQP

Candidates should have strong initiative and have proven experience independently driving projects to completion. We work from high-level requirements and programmers are expected to gain an intimate understanding of the business.

Please email your CV to: hello@functionalworks.co.uk

Permalink

Copyright © 2009, Planet Clojure. No rights reserved.
Planet Clojure is maintained by Baishampayan Ghose.
Clojure and the Clojure logo are Copyright © 2008-2009, Rich Hickey.
Theme by Brajeshwar.