Clojure.spec as a Runtime Transformation Engine

This is the second post in the blog series about clojure.spec for web development and introduces spec-tools - a new Clojure(Script) library that adds some batteries to clojure.spec: extendable spec records, dynamic conforming, spec transformations and more.

Clojure.spec

Clojure.spec is a new modeling and validation library for Clojure(Script) developers. Its main target is design and development time, but it can also be used for runtime validation. Runtime value transformations are not in its scope, and this is something we need for web apps, as described in the previous post.

We would like to use clojure.spec both for application core models and runtime boundary validation & transformations. Let's try to solve this.

Goals

  1. Dynamic runtime validation & transformation
  2. Spec transformations, to JSON Schema & OpenAPI
  3. (Simple data-driven syntax for specs)

Solution

Spec-tools is a small new library aiming to achieve the goals. It takes ideas from Plumatic Schema, including the clean separation of specs from conformers. Spec-tools targets both Clojure & ClojureScript and the plan is to make it compatible with Self-hosted ClojureScript as well. The README covers most of the features and options, while this post walks through the core concepts and the reasoning behind them.

Spec Records

As of today (alpha-16), specs in clojure.spec are implemented using reified protocols, and because of that they are non-trivial to extend. To allow extensions, spec-tools introduces Spec Records. They wrap the vanilla specs and can act both as specs and as predicate functions. They also enable features like dynamic runtime conforming and extra spec documentation. Spec Records have a set of special keys, which include :spec, :form, :type, :name, :description, :gen, :keys and :reason. Any qualified keys can be added for your own purposes.

The simplest way to create Spec Records is to use the spec-tools.core/spec macro.

(require '[spec-tools.core :as st])
(require '[clojure.spec :as s])

(def x-int? (st/spec int?))

x-int?
; #Spec{:type :long
;       :form clojure.core/int?}

(x-int? 10)
; true

(s/valid? x-int? 10)
; true

(assoc x-int? :description "It's an int")
; #Spec{:type :long,
;       :form clojure.core/int?,
;       :description "It's an int"}

Optionally there is a map-syntax:

;; simple predicate
(s/def ::name (st/spec string?))

;; map-syntax with extra info
(s/def ::age
  (st/spec
    {:spec integer?
     :description "Age on a person"
     :json-schema/default 20}))

(s/def ::person
  (st/spec
    {:spec (s/keys :req-un [::name ::age])
     :description "a Person"}))

(s/valid? ::person {:name "Tommi", :age 42})
; true

Instead of using the spec macro, one can also use the underlying create-spec function. With it, you also need to pass the :form for the spec. For most clojure.core predicates, spec-tools can infer the form using the resolve-form multimethod.

(let [data {:name "bool"}]
  (st/create-spec
    (assoc data :spec boolean?)))
; #Spec{:type :boolean,
;       :form clojure.core/boolean?
;       :name "bool"}

To save on typing, spec-tools.spec contains most clojure.core predicates wrapped as Spec Record instances:

(require '[spec-tools.spec :as spec])

spec/boolean?
; #Spec{:type :boolean
;       :form clojure.core/boolean?}

(spec/boolean? true)
; true

(s/valid? spec/boolean? false)
; true

(assoc spec/boolean? :name "truth")
; #Spec{:type :boolean
;       :form clojure.core/boolean?
;       :name "truth"}

Dynamic conforming

The primary goal is to support dynamic runtime validation & value transformations. The same data read from JSON, Transit or string-based formats (:query-params etc.) should conform differently. For example, Transit supports keywords, while with JSON we have to conform keywords from strings. In clojure.spec, conformers are attached to spec instances, so we would have to rewrite differently conforming specs for each format.

Spec-tools separates specs from conforming. spec-tools.core has its own versions of explain, explain-data, conform and conform!, which take an extra argument, a conforming callback. It is a function of type spec => spec value => (or conformed-value ::s/invalid) and is passed to the Spec Record's s/conform* via a private dynamic Var. If the conforming is bound, it is called with the Spec Record, enabling arbitrary transformations based on the Spec Record's data. There is CLJ-2116 to allow the same without the dynamic binding. If you like the idea, go vote it up.

Example of conforming that increments all int? values:

(defn inc-ints [_]
  (fn [_ value]
    (if (int? value)
      (inc value)
      value)))

(st/conform spec/int? 1 nil)
; 1

(st/conform spec/int? 1 inc-ints)
; 2
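Following the same shape, here is a sketch of a conforming function that conforms keywords from strings (illustrative only; spec-tools ships its own string-conforming for this, covered below):

```clojure
;; a conforming fn that conforms keywords from strings; same shape
;; as inc-ints above: spec => (spec, value) => value
(defn keywordize [_]
  (fn [_ value]
    (if (string? value)
      (keyword value)
      value)))

((keywordize nil) nil "kikka")
; :kikka
```

With the spec-tools wrappers this would be used as (st/conform spec/keyword? "kikka" keywordize).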

Type-conforming

Spec-tools ships with a type-based conforming implementation, which selects the conform function based on the :type of the spec. Just like :form, :type is mostly auto-resolved with the help of the spec-tools.type/resolve-type multimethod.

The following predefined type-conforming instances are found in spec-tools.core:

  • string-conforming - Conforms specs from strings.
  • json-conforming - JSON Conforming (numbers and booleans not conformed).
  • strip-extra-keys-conforming - Strips out extra keys of s/keys Specs.
  • fail-on-extra-keys-conforming - Fails if s/keys Specs have extra keys.

Example:

(s/def ::age (s/and spec/int? #(> % 18)))

;; no conforming
(s/conform ::age "20")
(st/conform ::age "20")
(st/conform ::age "20" nil)
; ::s/invalid

;; json-conforming
(st/conform ::age "20" st/json-conforming)
; ::s/invalid

;; string-conforming
(st/conform ::age "20" st/string-conforming)
; 20

Type-conforming mappings are just data, so it's easy to extend and combine them:

(require '[spec-tools.conform :as conform])

(def strip-extra-keys-json-conforming
  (st/type-conforming
    (merge
      conform/json-type-conforming
      conform/strip-extra-keys-type-conforming)))

Map-conforming

s/keys specs are open by design: maps can contain extra keys, and all keys are validated. This is not good for runtime boundaries: JSON clients might send extra data we don't want to let into the system, and writing extra keys to a database might cause a runtime exception. We don't want to manually pre-validate the data before validating it with spec.

When a Spec Record is created, the wrapped spec is analyzed via the spec-tools.core/collect-info multimethod. For s/keys specs, the keyset is extracted as :keys and is thus available to the :map type-conformer, which can strip the extra keys efficiently.

(s/def ::street string?)
(s/def ::address (st/spec (s/keys :req-un [::street])))
(s/def ::user (st/spec (s/keys :req-un [::name ::address])))

(def inkeri
  {:name "Inkeri"
   :age 102
   :address {:street "Satamakatu"
             :city "Tampere"}})

(st/conform
  ::user
  inkeri
  st/strip-extra-keys-conforming)
; {:name "Inkeri"
;  :address {:street "Satamakatu"}}

Inspired by select-schema in Schema-tools, there is also a select-spec to achieve the same:

(st/select-spec ::user inkeri)
; {:name "Inkeri"
;  :address {:street "Satamakatu"}}

The actual underlying conform function is dead simple:

(defn strip-extra-keys [{:keys [keys]} x]
  (if (map? x)
    (select-keys x keys)
    x))

Data macros

One use case for conforming is to expand intermediate (and potentially invalid) data so that it conforms to specs, kind of like data macros. Let's walk through an example.

A spec describing entities in an imaginary database:

(s/def :db/ident qualified-keyword?)
(s/def :db/valueType #{:uuid :string})
(s/def :db/unique #{:identity :value})
(s/def :db/cardinality #{:one :many})
(s/def :db/doc string?)

(s/def :db/field
  (st/spec
    {:spec (s/keys
             :req [:db/ident
                   :db/valueType
                   :db/cardinality]
             :opt [:db/unique
                   :db/doc])
     ;; custom key for conforming
     ::type :db/field}))

(s/def :db/entity (s/+ :db/field))

It accepts values like this:

(def entity
  [{:db/ident :product/id
    :db/valueType :uuid
    :db/cardinality :one
    :db/unique :identity
    :db/doc "id"}
   {:db/ident :product/name
    :db/valueType :string
    :db/cardinality :one
    :db/doc "name"}])

(s/valid? :db/entity entity)
; true

We would like to have an alternative, simpler syntax for common case. Like this:

(def simple-entity
  [[:product/id :uuid :one :identity "id"]
   [:product/name :string "name"]])

A spec for the new format:

(s/def :simple/field
  (s/cat
    :db/ident :db/ident
    :db/valueType :db/valueType
    :db/cardinality (s/? :db/cardinality)
    :db/unique (s/? :db/unique)
    :db/doc (s/? :db/doc)))

(s/def :simple/entity
  (s/+ (s/spec :simple/field)))

All good:

(s/valid? :simple/entity simple-entity)
; true

But the database doesn't understand the new syntax. We need to transform values to conform to the database spec. Let's write a custom conforming for it:

(defn db-conforming [{:keys [::type]}]
  (fn [_ value]
    (or
      ;; only handle specs with ::type :db/field
      (when (= type :db/field)
        ;; conform from the :simple/field format
        (let [conformed (s/conform :simple/field value)]
          (when-not (= conformed ::s/invalid)
            ;; custom transformations
            (merge {:db/cardinality :one} conformed))))
      ;; defaulting to no-op
      value)))

(defn db-conform [x]
  (st/conform! :db/entity x db-conforming))

That's it. We now have a function that accepts data in both formats and conforms it to the database spec:

(db-conform entity)
; [#:db{:ident :product/id
;       :valueType :uuid
;       :cardinality :one
;       :unique :identity
;       :doc "id"}
;  #:db{:ident :product/name
;       :valueType :string
;       :cardinality :one
;       :doc "name"}]

(db-conform simple-entity)
; [#:db{:ident :product/id
;       :valueType :uuid
;       :cardinality :one
;       :unique :identity
;       :doc "id"}
;  #:db{:ident :product/name
;       :valueType :string
;       :cardinality :one
;       :doc "name"}]

(= (db-conform entity)
   (db-conform simple-entity))
; true

Dynamic conforming is a powerful tool, and generic implementations like type-conforming give extra leverage at runtime, especially for web development. Conforming should be a first-class citizen, so we have to ensure that conformings are easy to extend and compose. Many things have already been solved in Schema and Schema-tools, so we'll pull in more stuff as we go, for example a way to conform default values.

Domain-specific data macros are cool, but they add some complexity due to the inversion of control. For exactly this reason, we have pulled Schema-based domain coercions out of some of our client projects. Use them wisely.

Spec transformations

The second goal is to be able to transform the specs themselves, especially to convert specs into JSON Schema & Swagger/OpenAPI formats. First we need to be able to parse the specs, and there is s/form just for that. Spec Records also produce valid forms, so all the extra data is persisted too.

(s/def ::age
  (st/spec
    {:spec integer?
     :description "Age on a person"
     :json-schema/default 20}))

(s/form ::age)
; (spec-tools.core/spec
;  clojure.core/integer?
;  {:type :long
;   :description "Age on a person"
;   :json-schema/default 20})

(eval (s/form ::age))
; #Spec{:type :long
;       :form clojure.core/integer?
;       :description "Age on a person"
;       :json-schema/default 20}

Visitors

spec-tools has an implementation of the visitor pattern for recursively walking over spec forms. The multimethod spec-tools.visitor/visit takes a spec and a 3-arity function, which gets called for all the nested specs with a dispatch key, the spec and a vector of visited children as arguments.

Here's a simple visitor that collects all registered spec forms linked to a spec:

(require '[spec-tools.visitor :as visitor])

(let [specs (atom {})]
  (visitor/visit
    :db/entity
    (fn [_ spec _]
      (when-let [s (s/get-spec spec)]
        (swap! specs assoc spec (s/form s)))))
  @specs)
; #:db{:ident clojure.core/qualified-keyword?
;      :valueType #{:string :uuid}
;      :cardinality #{:one :many}
;      :unique #{:identity :value}
;      :doc clojure.core/string?
;      :field (spec-tools.core/spec
;              (clojure.spec/keys
;                :req [:db/ident :db/valueType :db/cardinality]
;                :opt [:db/unique :db/doc])
;              {:type :map
;               :user/type :db/field
;               :keys #{:db/unique :db/valueType :db/cardinality :db/doc :db/ident}})
;      :entity (clojure.spec/+ :db/field)}

Currently, s/& and s/keys* specs can't be visited due to a bug in their spec forms.

JSON Schema

Specs can be transformed into JSON Schema using spec-tools.json-schema/transform. Internally it uses the visitor and the spec-tools.json-schema/accept-spec multimethod to do the transformations.

(require '[spec-tools.json-schema :as json-schema])

(json-schema/transform :db/entity)
; {:type "array",
;  :items {:type "object",
;          :properties {"db/ident" {:type "string"},
;                       "db/valueType" {:enum [:string :uuid]},
;                       "db/cardinality" {:enum [:one :many]},
;                       "db/unique" {:enum [:identity :value]},
;                       "db/doc" {:type "string"}},
;          :required ["db/ident" "db/valueType" "db/cardinality"]},
;  :minItems 1}

With Spec Records, :name gets translated into :title, :description is copied as-is, and all qualified keys with the json-schema namespace are added, unqualified, into the generated JSON Schemas.

(json-schema/transform
  (st/spec
    {:spec integer?
     :name "integer"
     :description "it's an int"
     :json-schema/default 42
     :json-schema/readOnly true}))
; {:type "integer"
;  :title "integer"
;  :description "it's an int"
;  :default 42
;  :readOnly true}

Swagger/OpenAPI

There is an almost year-old issue for supporting clojure.spec alongside Plumatic Schema in ring-swagger. Now that the dynamic conforming and the JSON Schema transformations work, it should be easy to finalize. The plan is to have a separate spec-swagger library just for clojure.spec, with an identical contract for the actual web/routing libs. This would allow an easy transition between Schema & Spec while keeping the dependencies on the spec side to a minimum. Some ideas from spec-tools will also flow back to schema-tools; some thoughts in a gist.

Future

As clojure.spec is more powerful than the OpenAPI spec, we lose some data in the transformation. For end-to-end Clojure(Script) systems, we could build a totally new API and spec documentation system with "spec-ui" ClojureScript components on top. We have been building CQRS-based apps for years, and having a generic embeddable UI for actions is a valuable feature. Also, if specs could be read back from their s/form without eval, we could build dynamic systems where specs are loaded over the wire, from the database all the way to the browser. For development time, there should be a Graphviz-based visualization for specs, just like there is Schema-viz.

Conclusion

By separating specs (what) and conforming (how), we can make clojure.spec a real runtime transformation engine. Spec-tools is a library on top of clojure.spec, adding Spec Records, runtime value transformations via dynamic conforming, and spec transformations (including to JSON Schema) with the spec visitor. It's designed to be extendable via data and multimethods. There are more features, like data-specs; more on those in upcoming posts. As a final note: as clojure.spec is still in alpha, so is spec-tools.

Give it a spin and tell us what you think.


Spec Transformers

clojure.spec is a data specification library for Clojure, introduced almost two years ago. It's still in alpha.

I hope you find spec useful and powerful. - Rich Hickey

This is my fourth post about spec, describing a new way to transform data based on spec definitions. Earlier posts were about differences to Schema, spec coercion and using specs with ring & swagger.

Revisiting conforming

clojure.spec allows us to conform a value to a Spec:

(require '[clojure.spec.alpha :as s])

(s/conform int? 1)
; => 1

(s/conform int? "1")
; => ::s/invalid

If the world were built only with Clojure/Script, EDN and Transit, this would be all the coercion we would ever need. But in real life, data is represented in many different formats, such as JSON and XML, and we need to transform values between the formats. For example, JSON doesn't support keywords.

Runtime transformations are out of the scope of clojure.spec, and the current best practice is to first use normal Clojure functions to transform the data from external formats into the correct shape, and then validate it using the spec. This is a really bad idea, as the structure of the data needs to be copied from the specs into custom transformation functions.

Spec-tools addresses this issue by separating values, specs and conforming. If the data is in string format, one can transform it using string-conforming, which uses the spec's :type definitions to do the required transformations. To support this, specs need to be wrapped:

(require '[spec-tools.core :as st])

;; :type is automatically resolved
(st/conform (st/spec int?) "1")
; => ::s/invalid

(st/conform (st/spec int?) "1" st/string-conforming)
; => 1

It's still just a one-way transformation (decoding), and adding a new transformation requires a new :type to be registered into all conforming instances. The whole :type vocabulary has even been thought to be a bad idea.

Bijections

I chatted with some smart people like Gary Fredericks and Miikka Koskinen about two-way transformations using the Spec. Gary has a Schema Bijections library, which already allows two-way transformations for Plumatic Schema. As usual, I had no clue, so I had to google what Bijection actually meant.

In short, it's a symmetric 1-to-1 value transformation between domains:

X => Y => X

Mapping numbers between the string domain (e.g. properties files) and Clojure:

"1" => 1 => "1"
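As a minimal Clojure sketch of such a bijection (the function names here are illustrative, not part of any library):

```clojure
;; a bijection between the string domain and Clojure longs:
;; decoding and encoding compose back to the original value
(def string->long #(Long/parseLong %))
(def long->string str)

(-> "1" string->long long->string)
; "1"

(-> 1 long->string string->long)
; 1
```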

In some cases there are multiple valid representations for a single value in a domain, so the properties of bijections do not hold. For example, dates in JSON have multiple valid formats:

"2014-02-18T18:25:37Z"
"2014-02-18T18:25:37.000+0000"

Bijections are cool, but two-way transformations are what we want.

Welcome Spec Transformers!

Transformers

In the just-released [metosin/spec-tools "0.7.0"], the transformations have been rewritten, now supporting both two-way and spec-driven transformations.

The higher-order conforming function is replaced with a Transformer protocol, supporting both value encoding and decoding:

(defprotocol Transformer
  (-name [this])
  (-encoder [this spec value])
  (-decoder [this spec value]))

There are two new functions in spec-tools.core: encode and decode. They flow the value through both conform and unform, effectively doing a value transformation. In addition, all conforming functions now automatically coerce the spec argument into a Spec using the IntoSpec protocol, so plain specs can be used too:

(st/decode int? "1")
; => ::s/invalid

(st/decode int? "1" st/string-transformer)
; => 1

IntoSpec is not recursive, so nested specs still need to be wrapped. CLJ-2116 and CLJ-2251 would offer solutions to fix this.

Round-tripping ISO-8601 date-times from the JSON domain (X => Y => X => Y):

(as-> "2014-02-18T18:25:37Z" $
      (doto $ prn)
      (st/decode inst? $ st/json-transformer)
      (doto $ prn)
      (st/encode inst? $ st/json-transformer)
      (doto $ prn)
      (st/decode inst? $ st/json-transformer)
      (prn $))
; "2014-02-18T18:25:37Z"
; #inst "2014-02-18T18:25:37.000-00:00"
; "2014-02-18T18:25:37.000+0000"
; #inst "2014-02-18T18:25:37.000-00:00"

There are encoder and decoder functions for many basic types in spec-tools.transform, for both Clojure & ClojureScript.

Spec-driven transformations

Another new feature is that we can use spec meta-data to define how values should be transformed in different domains. For this, there are two new key namespaces: encode and decode. Keys in these namespaces, with the domain as the name, should contain a 2-arity transformer function of type spec value => value.

(require '[clojure.string :as str])

(defn lower-case? [x]
  (-> x name str/lower-case keyword (= x)))

(s/def ::spec
  (st/spec
    {:spec #(and (simple-keyword? %) (lower-case? %))
     :description "a lowercase keyword, encoded upper in string"
     ;; decoder for the string domain
     :decode/string #(-> %2 name str/lower-case keyword)
     ;; encoder for the string domain
     :encode/string #(-> %2 name str/upper-case)}))

(st/decode ::spec :kikka)
; :kikka

(as-> "KiKka" $
      (st/decode ::spec $))
; :clojure.spec.alpha/invalid

(as-> "KiKka" $
      (st/decode ::spec $ st/string-transformer))
; :kikka

(as-> "KiKka" $
      (st/decode ::spec $ st/string-transformer)
      (st/encode ::spec $ st/string-transformer))
; "KIKKA"

The default Transformer implementation first looks for a spec-level transformer, then a :type-level transformer.
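That lookup order can be sketched as plain data-driven selection (an illustrative sketch, not the actual spec-tools implementation; all names here are made up):

```clojure
(require '[clojure.string :as str])

;; sketch: choose a decoder for the string domain, preferring a
;; spec-level :decode/string transformer over the :type-based one
(defn select-string-decoder [spec type->decoder]
  (or (:decode/string spec)
      (type->decoder (:type spec))))

(def type-decoders
  {:keyword (fn [_ v] (keyword v))})

;; the spec-level transformer wins when present:
((select-string-decoder
   {:type :keyword
    :decode/string (fn [_ v] (keyword (str/lower-case v)))}
   type-decoders)
 nil "KiKka")
; :kikka

;; otherwise we fall back to the :type-level one:
((select-string-decoder {:type :keyword} type-decoders) nil "kikka")
; :kikka
```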

When creating new specs, one can also use :type to get encoders, decoders and documentation for free, like with Data.Unjson (thanks to Fabrizio for the pointer).

(s/def ::kw
  (st/spec
    {:spec #(keyword? %) ;; anonymous function
     :type :keyword}))   ;; encode & decode like a keyword

(st/decode ::kw "kikka" st/string-transformer)
;; :kikka

(st/decode ::kw "kikka" st/json-transformer)
;; :kikka

Final Words

Transforming specced values is not in the scope of clojure.spec, but it should be.

Meanwhile, spec-tools now provides both spec- and type-driven transformations and two-way transformations as a proof of concept. Like clojure.spec, it's alpha and partially built on undocumented APIs, which will most likely break later. My 2 cents: spec-tools will break, it will be easy to fix, and we'll fix it. The best case would be if spec-tools was not needed for this at all and clojure.spec could do these things on its own.

Comments and feedback welcome in Clojureverse.

Tommi


Clojure Deref (Sept 6, 2024)

Welcome to the Clojure Deref! This is a weekly link/news roundup for the Clojure ecosystem (feed: RSS). Thanks to Anton Fonarev for link aggregation.

From the core

BIG NEWS: Clojure 1.12 is now available! Please do take a look at the release notes - we are excited to bring you all of these improvements and looking forward to what’s next! Two things we have already started on are integrating Java virtual threads into core.async and updating the baseline JVM version for the next version of Clojure. More to come in the future I’m sure.

If you want to get the latest updates on Clojure and Datomic, you should definitely check out two upcoming events - Heart of Clojure Sept 18-19 in Leuven, Belgium and Clojure/conj Oct 23-25 in Alexandria, Virginia. These will both be great events and a special opportunity to connect and learn from other Clojurists in Europe and the US.

Blogs, articles, and projects

Libraries and Tools

New releases and tools this week:

  • bb-ops - A collection of babashka recipes for your TechOps needs

  • tools.macro 0.2.1 - Utilities for macro writers

  • malli 0.16.3 - High-performance data-driven data specification library for Clojure/Script

  • hierarchy 0.0.2 - An opinionated Clojure library primarily designed to enhance the built-in hierarchy functions

  • cursive 1.14.0-dev13 - The IDE for beautiful Clojure code

  • clofidence 0.4.0 - Bolster your Clojure test suite confidence

  • nbb 1.2.192 - Scripting in Clojure on Node.js using SCI

  • ClojureStorm 1.12.0-rc2 - A fork of the official Clojure compiler, with some extra code added to make it a dev compiler

  • calva 2.0.468 - Clojure & ClojureScript Interactive Programming for VS Code

  • usermanager-first-principles - A "from first principles" variant of "usermanager-example", the tutorial Clojure web application by Sean Corfield


Qualified methods – for ClojureCLR

Clojure has introduced a new qualified methods feature that allows Java methods to be passed to higher-order functions. This feature also provides alternative ways to invoke methods and constructors, and new ways to specify type hints. We need to enhance this mechanism for ClojureCLR in the same ways we enhanced ‘classic’ interop.

I’ll start with a quick review of ‘classic’ interop, including the additions made for interop with the CLR. I’ll (quickly) survey the new qualified methods mechanism. I’ll conclude with a look at how the CLR extras will be incorporated into the new mechanism.

‘Classic’ interop

I’m going to assume basic familiarity with interoperability with the JVM/CLR (prior to the introduction of qualified methods). If you need a refresher, you can look at the Java interop section of the Clojure reference.

There are several pages on the ClojureCLR wiki that talk about the additional features of CLR interop.

Class access

Symbols that represent the names of types are resolved to Type objects.

System.String     ; => System.String
String            ; => System.String
(class String)    ; => System.RuntimeType

Classes can be imported into a Clojure namespace so that the namespace of the type can be omitted, as with String above. (There is a default set of imports that are always available. See the note at the end for how that set is computed.)

Interlude: Specifying type names

There are types in the CLR that cannot be named by symbols, or at least not by symbols that the Lisp reader can parse. (I guess Java does not yet have this problem.) See the note at the end for a few comments about this. The specifying types page explains how we get around this. It is just ugly. The mechanism ties directly into the CLR’s type naming and resolution machinery, thus:

|System.Collections.Generic.IList`1[System.Int32]|

You have to use fully-qualified type names, generics include a backtick and the number of type arguments, and the type arguments are enclosed in square brackets.
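Following those rules, a generic type with two type arguments would be written as (an illustrative example constructed from the rules above):

```
|System.Collections.Generic.IDictionary`2[System.String,System.Int32]|
```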

I plan to introduce a new syntax for this in ClojureCLR.Next that would take advantage of imports and be otherwise nice. When I designed the |...| syntax (stolen from CommonLisp), Clojure did not yet have tagged literals. Now we might be able to do something like

#type "IList<int>"

(If you are interested in helping to design this, please let me know.)

Member access

The classic list of ways to access members of a class are:

(.instanceMember instance args*)
(.instanceMember Classname args*)
(.-instanceField instance)
(Classname/staticMethod args*)
Classname/staticField

The Lisp reader is tasked with translating these into the dot special form; read all about it here. Generally, you should use the forms above rather than using the dot special form directly.
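For concreteness, on JVM Clojure you can see this translation into the dot special form with macroexpand (ClojureCLR uses the same mechanism):

```clojure
;; the sugared interop forms expand to the dot special form
(macroexpand '(.toUpperCase s))  ; => (. s toUpperCase)
(macroexpand '(Math/abs -1))     ; => (. Math abs -1)
(macroexpand '(.-x p))           ; => (. p -x)
```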

CLR augmentations

For CLR interop, we had to add some additional functionality, primarily for calling generic methods and for working with ByRef and params arguments.

If you are familiar with C#, you have seen ref, in, and out used in method signatures. There is no distinction between these at the CLR level; C# adds in and out for additional compile-time analysis. Given that we don’t have uninitialized variables in Clojure and that the CLR doesn’t distinguish, ClojureCLR only provides a by-ref mechanism. The example given on the wiki page looks at a class defined by:

public class AlmostEverything
{
    public int Out(int x) { return x + 10; }
    public int Out(ref int x) { x += 1; return x + 20; }

    public string Ambig(string x, ref int y) { y += 10; return x + y.ToString(); }
    public int Ambig(int x, ref int y) { y += 10; return x + y; }
}

To call Out with a ref argument, you would use:

(let [m (int n)]
  (.Out c (by-ref m)))

The type hint provided by (int n) is required; otherwise it will try to match a ref object parameter.

The by-ref is a syntactic form that can only be used at the top level of interop calls, as shown here. It can only wrap a local variable. (by-ref can also be used in definterface, deftype, and the like.) And yes, the value of the local variable m is updated by the call; that binding is not immutable. You do not want to know how this is done.

For params, consider the class:

public class ParamsTest
{
    public static int StaticParams(int x, params object[] ys)
    {
        return x + ys.Length;
    }

    public static int StaticParams(int x, params string[] ys)
    {
        int count = x;
        foreach (string y in ys)
            count += y.Length;
        return count;
    }
}

Method StaticParams is overloaded on the params argument. You can access the first overload with either of these:

(ParamsTest/StaticParams 12 #^objects (into-array Object [1 2 3]))
(ParamsTest/StaticParams 12 #^"System.Object[]" (into-array Object [1 2 3]))

The second overload is accessed with:

(ParamsTest/StaticParams 12 #^"System.String[]" (into-array String ["abc" "de" "f"]))
(ParamsTest/StaticParams 12 #^"System.String[]" (into-array ["abc" "de" "f"]))

Generic methods

We are talking here about methods with type parameters, not methods that are part of a generic type. Often you don’t need to do anything special:

(import 'System.Linq.Enumerable)
(seq (Enumerable/Where [1 2 3 4 5] even?))   ; => (2 4)

There are actually two overloads of Where in Enumerable.

public static IEnumerable<TSource> Where<TSource>(
        this IEnumerable<TSource> source,
        Func<TSource, bool> predicate
)

public static IEnumerable<TSource> Where<TSource>(
        this IEnumerable<TSource> source,
        Func<TSource, int, bool> predicate
)

The type inferencing mechanism built into ClojureCLR can figure out that even? supports one argument but not two, so it can be coerced to a Func<Object,bool> but not a Func<Object,int,bool>. Thus, it can select the first overload.

The wiki page for this discusses how to deal with situations where the function might not be so accommodating. Some of these techniques involve macros to define System.Func and System.Action delegates from Clojure functions. This is a bit of a pain, but it is not too bad.

But there are still situations where more is required.

(def r3 (Enumerable/Repeat 2 5))  ;; fails with

Execution error (InvalidOperationException) at System.Diagnostics.StackFrame/ThrowNoInvokeException (NO_FILE:0).
Late bound operations cannot be performed on types or methods for which ContainsGenericParameters is true.

We need to know the type parameters for the generic method Enumerable/Repeat at compile (load) time. The type-args macro can be used to supply type arguments for the method:

(seq (Enumerable/Repeat (type-args Int32) 2 5))  ;=> (2 2 2 2 2)

The new qualified methods feature

The new qualified methods feature is used to provide methods to higher-order functions, for example, passing a Java method to map. The new feature is described here.

Qualified methods are specified as follows:

  • Classname/method – refers to a static method
  • Classname/.method – refers to an instance method
  • Classname/new – refers to a constructor

These can be used as values (e.g., passed to higher-order functions) or as invocations.

By invocation we mean appearing as the first element in a (func args) form. Thus

(String/Format format-str arg1)   ;; static method invocation
(String/.ToUpper s)               ;; instance method invocation
(String/new init-char count)      ;; constructor invocation

For static methods, this is our original syntax. For instance methods, this compares to (.ToUpper ^String s). The third example is equivalent to (String. init-char count). We gain the ability to directly specify the type via the namespace of the symbol.

Using qualified methods as values is the more interesting case. Rather than needing to wrap a method in a function, as in

(map #(.ToUpper ^String %) ["abc" "def" "ghi"])

you can just use the qualified method directly:

(map String/.ToUpper ["abc" "def" "ghi"])

Note that we no longer need the type hint on the parameter to avoid reflection.

:param-tags

Using qualified methods gives us the benefit of a type hint on the instance variable for instance method invocation. In fact, the type that the qualified method gives (String in the case of String/.ToUpper) overrides any type hint on the instance argument.

For invocations, further disambiguation of method signature can be made by providing type hints on the other arguments.

Consider trying to map Math/Abs over a sequence. In the old IFn-wrapper style

 (map #(Math/Abs %) [1.23 -3.14])

This will get a reflection warning. You can fix this with a type hint.

 (map #(Math/Abs ^double %) [1.23 -3.14])

Now consider using a qualified method:

 (map Math/Abs [1.23 -3.14])

You still will get a reflection warning. And there is no place to put a traditional type hint.

Enter :param-tags. You can add :param-tags metadata to the qualified method to provide type hints. The easiest way is to use the new ^[...] metadata reader syntax.

(map ^[double] Math/Abs [1.23 -3.14])

You need to put as many types as there are arguments in the method you wish to select. If you don’t need to type a specific argument, you can use an underscore, as in

(^[_ _] clojure.lang.Tuple/create 1 2)  ; => [1 2]

Here, we just need to indicate that the two-argument version of the Tuple/create method is required.

By the way, :param-tags can be used in invocations also as an alternative way to specify argument typing. Compare

(Math/Abs ^double x)

to

(^[double] Math/Abs x)

For one argument, not a big deal. But with, say, 3 arguments, it might help to pull all the type info in one place

(.Method ^Typename (...) ^T1 (...) ^T2 (...) ^T3 (...) )
(^[T1 T2 T3] Typename/.Method  (....) (....) (....) (....))

At least you have a choice.

The CLR add-ons

We need to allow adding (type-args ...) to our :param-tags. We simply put them in position:

^[(type-args T1 T2) T3 T4 |T5&|] T/.m

would specify the instance method

class T 
{
   Object m<T1,T2>(T3 arg1, T4 arg2, ref T5 arg3) {...}
}

Note also the use of the reference type T5& for the by-ref parameter.

I don’t know if there is any utility for by-ref in things like map calls. You would have no way to get any changed value in a by-ref parameter. A fallback to the classic interop using a function wrapper seems best for this situation.

Availability

The new qualified methods feature is available in ClojureCLR 1.12-alpha10 and later.

Notes

Note: Default imports

At system startup, we go through all loaded assemblies and create a default import list of all types that satisfy the following conditions:

  • the type’s namespace is System
  • the type is a class, interface, or value type
  • the type is public
  • the type is not a generic type definition (meaning an uninstantiated generic type)
  • the type’s name does not start with “_” or “<”.

In addition, strictly for my own convenience for dealing with ‘core.clj’ and other startup files, I add System.Text.StringBuilder, clojure.lang.BigInteger and clojure.lang.BigDecimal. (I’ll change clojure.lang.BigInteger to System.Numerics.BigInteger in ClojureCLR.Next.)

Permalink

Transpiling, Simplicity, and the Value of (Programming) Languages

Okay, another post in the tour of noteworthy things I’ve been a part of but skipped writing about. Previously, I wrote about writing programs in other human languages (among other things). Here, I’m talking about the talk Tim and I gave last year at Clojure/conj 2023, which is about transpiling a program from one programming language to another (among other things). Ha! No, it’s not the same at all. Our talk actually distills the important lessons that we learned from our work, and we crafted our presentation content & style to be accessible. Hopefully, you will come away with at least one useful insight, even if you don’t know programming; or scroll further down this post for my written “director’s cut commentary”:

Tim really insisted on being deliberate about the presentation style and pushing us (especially me) out of our comfort zones to do so. It really paid off, IMO. We basically followed How to Speak, and it really applies to real life (very little was specific to academic research talks).

I mean, just like Patrick Winston said in “How to Speak”, did you notice that many Democratic National Convention 2024 speeches ended with a salute to the audience instead of saying “thank you” ?! No? The only thing you remember from the DNC 2024 was Lil Jon during the Roll Call? Fair enough. Don’t overlook North Carolina, NC came right after New York and made NY look weak and tired. Yeah, you heard me! (Awesome that NY had Spike & DA Tish James in the front. But Gov. Hochul can kick rocks for freezing Manhattan congestion pricing right before the finish line.)

I think the overall effect worked out well. When I hear people give their strong advice about presentations or rules of thumb about number of slides for a given time length, I love seeing their reaction when I tell them that Tim & I presented 158 slides in a 40 minute time slot… with 5 minutes to spare. Almost 5 slides / minute. Having spare time for Q&A was desired but not a top priority, and despite all of our practicing leading us to believe that we would have 2.5 minutes to spare, things really do go faster when you’re in front of the audience.

I was actually the one who was quite hesitant about bringing Thamil into our presentation. Tim rightly persisted and insisted we do so, as it helps illustrate how human languages and programming languages share similar phenomena, even if they are very different things on the surface. And that’s why the streak continues of me giving presentations that mention Thamil somehow, even though it’s not my intention. He also wanted to give time to small stories that can engage emotionally and convey important takeaway lessons. It was satisfying to tell my story of thriving in Scala at Nest because only my teammate and unofficial mentor Dave recognized and understood what I was doing… Everyone else, if they even noticed, just thought I was “lucky” somehow that my speed at coding never slowed down. It’s amazing how such smart people can be blinded by their own prejudices, in this case, that Clojure “looks weird”, so they don’t understand my productivity was sustained by a subtle, profound clarity that they refuse to see because they never allow themselves to get close.

One other personal highlight for me was creating a visual explanation of what it takes to choose the appropriate programming language for a given situation (2 mins long starting at time 11:51). I know that I’ve talked about the topic before, but being able to use Tim’s diagram tool Hummi really helped drive the point home. Tim challenged the precision of the thoughts and words on this topic (as he did elsewhere), and it really helped strengthen the final result. Why is this important? Just look at the number of new programmers who only do systems programming and get stuck thinking that Rust is the only programming language that anyone should ever need, that we should rewrite everything in Rust, etc. Ugh. Despite my best efforts to craft this argument for our presentation, the teammates of mine who are guilty of this Rust zealotry didn’t bother to watch my presentation one bit even when I said it might be useful for them. They feel they already know everything that they need to know. And that’s how smart people limit themselves — their focus on a topic leads to success but also creates a ceiling on how much they will learn of things in the wider world. If this argument can only reach the Clojure community, then at least we have done our part to instill mental clarity in that corner of the wider world.

Permalink

Teaching Programming, Logo, Clojure, and World Languages

I’ve accidentally skipped writing here about things that I’ve done here and there. It’s easy not to follow up after someone puts up a post or a video that is easy to point to, in the thought that the content speaks for itself. But in reality, there’s usually more to say. Often, you just have to dig under the surface, ask questions, and look for connections. Anyhoo, here’s one such set of things that are related to each other (and related by design!).

Writing Programs in Other Human Languages

After writing about the topic in 2014 and mentioning before the event that I would present on it at Clojure/West 2015, I never actually specifically pointed to the presentation recording itself. So here it is:

I also didn’t mention the fun little pre-event interview I did.

Teaching Programming in Clojure

As I mentioned from the beginning, I didn’t think of programming in other languages as a practical reality for professional programmers. People might write source code variable & function names in their own languages like French or Russian if they’re working primarily with other speakers of their language, but that’s not common; it’s usually in English. Rather, I always thought that if there was a use case, it would be in teaching programming.

In 2016, I created a project to help make it easy to learn “serious” programming in the way that I learned to love programming (not the ways that I learned to hate it or find it cryptic!). More specifically, I implemented basic commands of the Logo programming language within the Clojure programming language. My long-term interest was to see how these ideas connected.

To promote the new project repository, I wrote a post on the Google Open Source blog, called “Teaching kids to program in their native language”. The post, in effect, was talking about not just the project repo, but also the potential connection of the ideas from this project and the previous project. It’s cool when people mix different ideas together to come up with something uniquely insightful. I felt this was one such instance. And to date, I still need to push on this further to make it truly useful, not just “interesting”. One day…. One thing at a time, until then.

Guess what the reaction was, though? Mind you, this was March 2016. And as an online post, the reaction is an online reaction. Well, I was dismayed to see the reaction on Hacker News, basically amounting to “This is pointless” and “Why can’t everyone in the world just learn English?” and “I’m sick of all this political correctness”, just not in exactly those words. Since the 1990s (?), the phrase “political correctness” became widely used by people who were reactionary against speech that was made to be inclusive of people who are underrepresented or discriminated against due to aspects of their body / identity. The negative reactionary sentiment portrayed such speech as skewed, contrived, or politically calculated (lacking authenticity). Sometime in the mid-2010s, the term “woke” arose as a complimentary adjective to describe someone who is thoughtfully inclusive. Not long after, the same cynical reactionaries who used to say “politically correct” with a sneer appropriated “woke” to have a similar cynical connotation. Since 2021, the sneering term du jour became “CRT” (Critical Race Theory), and since 2023, it has morphed into “DEI” (Diversity, Equity, and Inclusion). So much for Hacker News being special and a better version of Reddit than Reddit… The Clojure community is small but wonderful, as they were back then, too. I was shocked by the HN reaction. Needless to say, I was blindsided by the UK Brexit vote in July 2016 and the US elections in November 2016.

Putting it together

Tim and I teamed up to put it all together. Tim created power-turtle.com, which combines a ClojureScript REPL and a webpage with clojure-turtle and inspiration from clj-thamil. The result is a self-contained webpage that lets a user issue commands on one side of the page and see the result on the other side. It lets you type commands in English or in your own language, if you contribute the translations for your language to cban, which we created together to extract only the functionality of clj-thamil that we needed and to make translation a compile-time step by packaging it as a plugin. And Tim took the whole power-turtle project many steps further by adding different types of interactive visual output.

We talked about this at the Clojure/conj in 2017. You can see us demonstrate the result of what we created, how the origins of Logo related to theories of knowledge acquisition, and the underlying reasons why our project is a useful way to teach people how to program in current times:

You don’t always get feedback from people. If you help them in a profound way that takes time to materialize or isn’t obvious (ex: teaching, preparing them for a job interview), they don’t always let you know the good news of that impact. Rarely do people admit their mistakes and apologize. So it was really great to hear from Chris, the Unicode intern this summer, that he really enjoyed watching the above talks as well as Simple Made Easy. He is a rising junior in CS in undergrad, and my hope was just to expand his horizons while he was doing good work for us, particularly on CLDR. He said that his other part-time job during the summer was teaching kids to program, so he especially found the talk on Power Turtle intriguing. His boss at the school where he teaches created the curriculum around Scratch, but is too self-assured to accept feedback. Even so, Chris thinks that there is more to programming than just Scratch, and that’s why the talk opened his eyes. There’s more room to make Power Turtle beneficial for others.

Permalink

Clojure 1.12.0

Clojure 1.12.0 is now available! Find download and usage information on the Downloads page.

1 Compatibility

1.1 Java 8 - Compatibility EOL notice

Clojure 1.12 produces Java 8 bytecode (same as Clojure 1.10 and 1.11), but this is expected to be the last release using a Java 8 baseline. Future releases will move the bytecode and minimum Java compatibility to a newer Java LTS release.

1.2 Java 21 - Virtual thread pinning from user code under synchronized

Clojure users want to use virtual threads on JDK 21. Prior to 1.12, Clojure lazy-seqs and delays, in order to enforce run-once behavior, ran user code under synchronized blocks, which as of JDK 21 don’t yet participate in cooperative blocking. Thus if that code did e.g. blocking I/O it would pin a real thread. JDK 21 may emit warnings for this when using -Djdk.tracePinnedThreads=full.

To avoid this pinning, in 1.12 lazy-seq and delay use locks instead of synchronized blocks.

1.3 Security

1.4 Serialization

CLJ-1327 explicitly sets the Java serialization identifier for the classes in Clojure that implement Java serialization. In Clojure 1.11.0 this changed for two classes unnecessarily and we reverted those changes in Clojure 1.11.1 - this completes that work for the rest of the classes.

Clojure data types have implemented the Java serialization interfaces since Clojure 1.0. Java serialization is designed to save graphs of Java instances into a byte stream. Every class has an identifier (the serialVersionUID) that is automatically generated based on the class name, its type hierarchy, and the serialized fields. At deserialization time, deserialization can only occur when the available class has an identifier that matches the class id recorded in the serialized bytes.

Clojure has never provided a guarantee of serialization consistency across Clojure versions, but we do not wish to break compatibility any more than necessary and these changes will give us more control over that in the future.

1.5 Dependencies

Updated dependencies:

  • spec.alpha dependency to 0.5.238 - changes

  • core.specs.alpha dependency to 0.4.74 - changes

2 Features

2.1 Add libraries for interactive use

There are many development-time cases where it would be useful to add a library interactively without restarting the JVM - speculative evaluation, adding a known dependency to your project, or adding a library to accomplish a specific task.

Clojure now provides new functions to add libraries interactively, without restarting the JVM or losing the state of your work:

  • add-lib takes a lib that is not available on the classpath, and makes it available by downloading (if necessary) and adding to the classloader. Libs already on the classpath are not updated. If the coordinate is not provided, the newest Maven or git (if the library has an inferred git repo name) version or tag are used.

  • add-libs is like add-lib, but resolves a set of new libraries and versions together.

  • sync-deps calls add-libs with any libs present in deps.edn, but not yet present on the classpath.

These new functions are intended only for development-time interactive use at the repl - using a deps.edn is still the proper way to build and maintain production code. To this end, these functions all check that *repl* is bound to true (that flag is bound automatically by clojure.main/repl). In a clojure.main REPL, these new functions are automatically referred in the user namespace. In other repls, you may need to (require '[clojure.repl.deps :refer :all]) before use.
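A hedged sketch of interactive use (the library and its version here are just illustrative; any Maven coordinate would do):

```clojure
;; at a clojure.main REPL, where *repl* is bound to true
(add-lib 'org.clojure/data.json {:mvn/version "2.5.0"})
(require '[clojure.data.json :as json])
(json/write-str {:a 1})
;; => "{\"a\":1}"
```

Calling (add-lib 'org.clojure/data.json) with no coordinate would instead pick the newest available version, as described above.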

Library resolution and download are provided by tools.deps. However, you do not want to add tools.deps and its many dependencies to your project classpath during development, and thus we have also added a new api for invoking functions out of process via the Clojure CLI.

2.2 Invoke tool functions out of process

There are many useful tools you can use at development time, but which are not part of your project’s actual dependencies. The Clojure CLI provides explicit support for tools with their own classpath, but there was not previously a way to invoke these interactively.

Clojure now includes clojure.tools.deps.interop/invoke-tool to invoke a tool function out of process. The classpath for the tool is defined in deps.edn and you do not need to add the tool’s dependencies to your project classpath.

add-lib functionality is built using invoke-tool but you can also use it to build or invoke your own tools for interactive use. Find more about the function execution protocol on the CLI reference.

2.3 Start and control external processes

For a long time, we’ve had the clojure.java.shell namespace, but over time Java has provided new APIs for process info, process control, and I/O redirection. This release adds a new namespace clojure.java.process that takes advantage of these APIs and is easier to use. See:

  • start - full control over streams with access to the underlying Java objects for advanced usage

  • exec - covers the common case of executing an external process and returning its stdout on completion

2.4 Method values

Clojure programmers often want to use Java methods in higher-order functions (e.g. passing a Java method to map). Until now, programmers have had to manually wrap methods in functions. This is verbose, and might require manual hinting for overload disambiguation, or incur incidental reflection or boxing.

Programmers can now use qualified methods as ordinary functions in value contexts - the compiler will automatically generate the wrapping function. The compiler will generate a reflective call when a qualified method does not resolve due to overloading. Developers can supply :param-tags metadata on qualified methods to specify the signature of a single desired method, 'resolving' it.

2.5 Qualified methods - Class/method, Class/.method, and Class/new

Java members inherently exist in a class. For method values we need a way to explicitly specify the class of an instance method because there is no possibility for inference.

Qualified methods have value semantics when used in non-invocation positions:

  • Classname/method - value is a Clojure function that invokes a static method

  • Classname/.method - value is a Clojure function that invokes an instance method

  • Classname/new - value is a Clojure function that invokes a constructor

Note: developers must use Classname/method and Classname/.method syntax to differentiate between static and instance methods.

Qualified method invocations with :param-tags use only the tags to resolve the method. Without param-tags they behave like the equivalent dot syntax, except the qualifying class takes precedence over hints of the target object, and over its runtime type when invoked via reflection.

Note: Static fields are values and should be referenced without parens unless they are intended as function calls, e.g. (System/out) should be System/out. Future Clojure releases will treat the field’s value as something invokable and invoke it.

2.6 :param-tags metadata

When used as values, qualified methods supply only the class and method name, and thus cannot resolve overloaded methods.

Developers can supply :param-tags metadata on qualified methods to specify the signature of a single desired method, 'resolving' it. The :param-tags metadata is a vector of zero or more tags: [tag …​]. A tag is any existing valid :tag metadata value. Each tag corresponds to a parameter in the desired signature (arity should match the number of tags). Parameters with non-overloaded types can use the placeholder _ in lieu of the tag. When you supply :param-tags metadata on a qualified method, the metadata must allow the compiler to resolve it to a single method at compile time.

A new metadata reader syntax ^[tag …​] attaches :param-tags metadata to member symbols, just as ^tag attaches :tag metadata to a symbol.

2.7 Array class syntax

Clojure supports symbols naming classes both as a value (for class object) and as a type hint, but has not provided syntax for array classes other than strings.

Developers can now refer to an array class using a symbol of the form ComponentClass/#dimensions, e.g. String/2 refers to the class of a 2-dimensional array of Strings. Component classes can be fully-qualified classes, imported classes, or primitives. Array class syntax can be used as both type hints and values.

Examples: String/1, java.lang.String/1, long/2.

2.8 Functional interfaces

Java programs emulate functions with Java functional interfaces (marked with the @FunctionalInterface annotation), which have a single method.

Clojure developers can now invoke Java methods taking functional interfaces by passing functions with matching arity. The Clojure compiler implicitly converts Clojure functions to the required functional interface by constructing a lambda adapter. You can explicitly coerce a Clojure function to a functional interface by hinting the binding name in a let binding, for example (let [^java.util.function.Predicate p even?] …​), to avoid repeated adapter construction in a loop.
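As a minimal sketch (assuming the method call resolves, here via a type hint on the local), a Clojure predicate can be passed straight to a Java method that expects a java.util.function.Predicate:

```clojure
(let [^java.util.ArrayList l (java.util.ArrayList. [1 2 3 4])]
  ;; even? is implicitly adapted to java.util.function.Predicate
  (.removeIf l even?)
  (vec l))
;; => [1 3]
```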

2.9 Java Supplier interop

Calling methods that take a Supplier (a functional interface that supplies a value) had required writing an adapter with reify. Clojure has a "value supplier" interface with semantic support already - IDeref. All IDeref impls (delay, future, atom, etc) now implement the Supplier interface directly.
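A minimal sketch: since delay now implements java.util.function.Supplier, it can be handed directly to a method such as CompletableFuture/supplyAsync, with no reify adapter:

```clojure
(import 'java.util.concurrent.CompletableFuture)

;; the delay is used as the Supplier; .get blocks on the future's result
(.get (CompletableFuture/supplyAsync (delay (+ 1 2))))
;; => 3
```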

2.10 Streams with seq, into, reduce, and transduce support

Java APIs increasingly return Streams and are hard to consume because they do not implement interfaces that Clojure already supports, and hard to interop with because Clojure doesn’t directly implement Java functional interfaces.

In addition to functional interface support, Clojure now provides these functions to interoperate with streams in an idiomatic manner, all functions behave analogously to their Clojure counterparts:

  • (stream-seq! stream) ⇒ seq

  • (stream-reduce! f [init-val] stream) ⇒ val

  • (stream-transduce! xf f [init-val] stream) ⇒ val

  • (stream-into! to-coll [xf] stream) ⇒ to-coll

All of these operations are terminal stream operations (they consume the stream).
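A quick sketch of the stream functions in action (the stream source here, a JDK collection, is just illustrative; note each stream is single-use):

```clojure
(stream-into! [] (filter odd?) (.stream (java.util.List/of 1 2 3 4)))
;; => [1 3]

(stream-seq! (.stream (java.util.List/of "a" "b" "c")))
;; => ("a" "b" "c")
```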

2.11 PersistentVector implements Spliterable

Java collections implement streams via "spliterators", iterators that can be split for faster parallel traversal. PersistentVector now provides a custom spliterator that supports parallelism, with greatly improved performance.

2.12 Efficient drop and partition for persistent or algorithmic collections

Partitioning of a collection uses a series of takes (to build a partition) and drops (to skip past that partition). CLJ-2713 adds a new internal interface (IDrop) indicating that a collection can drop more efficiently than sequential traversal, and implements that for persistent collections and algorithmic collections like range and repeat. These optimizations are used in drop, nthrest, and nthnext.

Additionally, there are new functions partitionv, partitionv-all, and splitv-at that are more efficient than their existing counterparts and produce vector partitions instead of realized seq partitions.
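For example, the vector-producing variants mirror their seq counterparts but yield vectors:

```clojure
(partitionv 3 (range 10))
;; => ([0 1 2] [3 4 5] [6 7 8])

(partitionv-all 3 (range 10))
;; => ([0 1 2] [3 4 5] [6 7 8] [9])

(splitv-at 3 (range 10))
;; => [[0 1 2] [3 4 5 6 7 8 9]]
```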

2.13 Var interning policy

Interning a var in a namespace (vs aliasing) must create a stable reference that is never displaced, so that all references to an interned var get the same object. There were some cases where interned vars could get displaced and those have been tightened up in 1.12.0-alpha1. If you encounter this situation, you’ll see a warning like "REJECTED: attempt to replace interned var #'some-ns/foo with #'other-ns/foo in some-ns, you must ns-unmap first".

This addresses the root cause of an issue encountered with Clojure 1.11.0, which added new functions to clojure.core (particularly abs). Compiled code from an earlier version of Clojure with var names that matched the newly added functions in clojure.core would be unbound when loaded in a 1.11.0 runtime. In addition to CLJ-2711, we rolled back a previous fix in this area (CLJ-1604).

Detailed changelog

See the official changelog for a complete list of all changes in 1.12.0.

Contributors

Thanks to all the community members who contributed patches to Clojure 1.12:

  • Ambrose Bonnaire-Sergeant

  • Christophe Grand

  • Frank Yin

  • Nicola Mometto

  • Ray McDermott

  • Steve Miner

Permalink

Clojure macros continue to surprise me

Clojure macros have two modes: avoid them at all costs/do very basic stuff, or go absolutely crazy.

Here’s the problem: I’m working on Humble UI’s component library, and I wanted to document it. While at it, I figured it could serve as an integration test as well—since I showcase every possible option, why not test it at the same time?

This is what I came up with: I write component code, and in the application, I show a table with the running code on the left and the source on the right:

It was important that code that I show is exactly the same code that I run (otherwise it wouldn’t be a very good test). Like a quine: hey program! Show us your source code!

Simple with Clojure macros, right? Indeed:

(defmacro table [& examples]
  (list 'ui/grid {:cols 2}
    (for [[_ code] (partition 2 examples)]
      (list 'list
        code (pr-str code)))))

This macro accepts code as an AST and, for each example, emits a pair: the AST itself (basically a no-op) and the string that the AST serializes to.

This is what I consider to be a “normal” macro usage. Nothing fancy, just another day at the office.

Unfortunately, this approach reformats code: while in the macro, all we have is an already parsed AST (data structures only, no whitespaces) and we have to pretty-print it from scratch, adding indents and newlines.

I tried a couple of existing formatters (clojure.pprint, zprint, cljfmt) but wasn’t happy with any of them. The problem is tricky—sometimes a vector is just a vector, but sometimes it’s a UI component and shows the structure of the UI.

And then I realized that I was thinking inside the box all the time. We already have the perfect formatting—it’s in the source file!

So what if... No, no, it’s too brittle. We shouldn’t even think about it... But what if...

What if our macro read the source file?

Like, actually went to the file system, opened a file, and read its content? We already have the file name conveniently stored in *file*, and luckily Clojure keeps sources around.

So this is what I ended up with:

(defn slurp-source [file key]
  (let [content      (slurp (io/resource file))
        key-str      (pr-str key)
        idx          (str/index-of content key)
        content-tail (subs content (+ idx (count key-str)))
        reader       (clojure.lang.LineNumberingPushbackReader.
                       (java.io.StringReader.
                         content-tail))
        indent       (re-find #"\s+" content-tail)
        [_ form-str] (read+string reader)]
    (->> form-str
      str/split-lines
      (map #(if (str/starts-with? % indent)
              (subs % (count indent))
              %)))))

Go to a file. Find the string we are interested in. Read the first form after it as a string. Remove common indentation. Render. As a string.

Voilà!

I know it’s bad. I know you shouldn’t do it. I know. I know.

But still. Clojure is the most fun I have ever had with any language. It lets you play with code like never before. Do the craziest, stupidest things. Read the source file of the code you are evaluating? Fetch code from the internet and splice it into the currently running program?

In any other language, this would’ve been a project. You’d need a parser, a build step... Here—just ten lines of code, in the vanilla language, no tooling or setup required.

Sometimes, a crazy thing is exactly what you need.

Permalink

How do you fix broken links after changing your domain?

A while back, I changed the domain of this blog to rebrand. This broke all of the backlinks I had accumulated. I had been marketing my work online, and I didn't want that work to go to waste. At that point, I was using a smaller hosting service to ma...

Permalink

Passkeys for React Native with Clojure, Part IV

Welcome to this, the fourth and final installment of our series on implementing passkeys in Clojure and React Native. Part one covered the background on how passkeys work and the data model, part two covered user registration, and part three covered user authentication. Today, we’re going to build the client using react-native-passkey.

Passkeys on the Client

The react-native-passkey documentation covers a number of things you’ll need to do to enable passkeys in a React Native application in iOS or Android, including setting up responses on your server that prove that you own the domain your passkeys will claim to represent. Rather than rehashing those steps, I’m going to move straight to handling the client-side code. As I noted earlier, the WebAuthn standard that passkeys use is standard-ish. Most of what we’ll do here is slightly massaging the shape of the data on the way in or out to match what the server or client library expects.

Base64

In particular, java-webauthn-server and react-native-passkey have different expectations for which flavor of base64 encoding to use. Here are two very simple translation functions:

const base64ToUrl = string => string.replace(/\+/g, "-").replace(/\//g, "_");
const urlToBase64 = string => string
    .replace(/-/g, "+")
    .replace(/_/g, "/")
    // pad length taken modulo 4, so a string whose length is already a
    // multiple of 4 gains no "=" characters
    .padEnd(string.length + ((4 - (string.length % 4)) % 4), "=");
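As a quick sanity check of the round trip (a hedged sketch; the helpers are repeated here for self-containment, with the padding length computed modulo 4 so already-aligned input gains no "="):

```javascript
const base64ToUrl = string => string.replace(/\+/g, "-").replace(/\//g, "_");
const urlToBase64 = string => string
    .replace(/-/g, "+")
    .replace(/_/g, "/")
    .padEnd(string.length + ((4 - (string.length % 4)) % 4), "=");

// "+" and "/" become the URL-safe "-" and "_"
console.log(base64ToUrl("ab+/cd"));   // "ab-_cd"
// converting back restores them and re-adds "=" padding to a multiple of 4
console.log(urlToBase64("abc-_A"));   // "abc+/A=="
console.log(urlToBase64("abcd"));     // "abcd" (no padding added)
```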

Registration

As covered in our implementation of the user registration backend, the registration process has two steps, first to set up the challenge, then to fulfill it.

const beginRegistration = async (userId) => {
  const result = await fetch("https://example.com/registration/begin", {
    method: "POST",
    headers: { accept: "application/json", "Content-Type": "application/edn" },
    body: `{:user-id "${userId}"}`
  });
  const json = await result.json();
  const requestId = result.headers.get("request-id");
  return { json, requestId };
};

const finalizeRegistration = async (json, requestId) => {
  delete json.rawId;
  json.id = base64ToUrl(json.id);
  json.response.attestationObject = base64ToUrl(json.response.attestationObject);
  json.response.clientDataJSON = base64ToUrl(json.response.clientDataJSON);
  json.clientExtensionResults = {};
  json.type = "public-key";
  return await fetch(`https://example.com/registration/${requestId}/finalize`, {
    method: "POST",
    headers: { accept: "application/json", "Content-Type": "application/json" },
    body: JSON.stringify(json)
  });
};

In beginRegistration, we’re simply passing the current user’s ID. In an actual production application, you’ll want to have a more secure way of identifying users than just allowing anyone to post an ID to an endpoint, but that will depend on the authentication system you are already using.

In finalizeRegistration, we are translating the object that react-native-passkey returns into the one java-webauthn-server expects, through removing some keys, base64 converting others, and adding still others.
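The same reshaping can also be expressed as a pure function instead of mutating json in place. This is a sketch using a simplified, hypothetical credential shape (the real object returned by react-native-passkey carries more fields), not the library's own API:

```javascript
const base64ToUrl = s => s.replace(/\+/g, "-").replace(/\//g, "_");

// Illustration only: translate a mock registration result into the shape
// java-webauthn-server expects — drop rawId, base64url-encode the binary
// fields, and add the keys the server requires.
const toServerShape = credential => {
  const { rawId, ...rest } = credential; // rawId is not wanted server-side
  return {
    ...rest,
    id: base64ToUrl(credential.id),
    response: {
      attestationObject: base64ToUrl(credential.response.attestationObject),
      clientDataJSON: base64ToUrl(credential.response.clientDataJSON),
    },
    clientExtensionResults: {}, // required key, even when empty
    type: "public-key",
  };
};
```

Keeping the translation pure makes it easy to unit-test the shape conversion without touching the network.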

With that done, we’re ready to tie the two together:

import { Passkey } from "react-native-passkey";

const registerPasskey = async (userId, callback) => {
  try {
    const { json, requestId } = await beginRegistration(userId);
    json.publicKey.challenge = urlToBase64(json.publicKey.challenge);
    const result = await Passkey.register(json.publicKey);
    const finalizeResult = await finalizeRegistration(result, requestId);
    if (finalizeResult.status === 200) {
      callback();
    };
  } catch(error) {
    alert(error);
  }
};

At Passkey.register(), the client OS will present the biometric sheet to the user, and if it’s accepted, the passkey will be stored in the device’s secure enclave, ready to use for authentication.

Authentication

As with registration, authentication is a two-step process:

const beginAuthentication = async () => {
  const result = await fetch("https://example.com/auth/begin", {
    method: "POST",
    headers: { accept: "application/json" }
  });
  const json = await result.json();
  const requestId = result.headers.get("request-id");
  return { json, requestId };
};

const finalizeAuthentication = async (json, requestId) => {
  delete json.rawId;
  json.id = base64ToUrl(json.id);
  json.response.authenticatorData = base64ToUrl(json.response.authenticatorData);
  json.response.signature = base64ToUrl(json.response.signature);
  json.response.userHandle = base64ToUrl(json.response.userHandle);
  json.response.clientDataJSON = base64ToUrl(json.response.clientDataJSON);
  json.clientExtensionResults = {};
  json.type = "public-key";
  const result = await fetch(`https://example.com/auth/${requestId}/finalize`, {
    method: "POST",
    headers: { accept: "application/edn", "Content-Type": "application/json" },
    body: JSON.stringify(json)
  });
  return await result.text();
}

Most of this works just like the translation steps for registration. In the end, we’re returning result.text() as it’s EDN-encoded for our ClojureScript application.

Now to tie it together:

import { Passkey } from "react-native-passkey";

const signInWithPasskey = async (callback) => {
  try {
    const { json, requestId } = await beginAuthentication();
    json.publicKey.challenge = urlToBase64(json.publicKey.challenge);
    const result = await Passkey.authenticate(json.publicKey);
    const finalizeResult = await finalizeAuthentication(result, requestId);
    callback(finalizeResult);
  } catch (error) {
    alert(error);
  }
};

After Passkey.authenticate() prompts the user for facial or thumbprint recognition or similar and everything succeeds, your callback function will be provided the user information in EDN format that we built in Part III. And that is it: an end-to-end passkey service where users can register and authenticate without passwords or phishing attempts.

Wrapping Up

Building passkey support into Sportscaster involved a fair bit of trial-and-error getting the encodings and data shapes to fit together but was otherwise straightforward and rewarding. I thought briefly about turning the implementation into a library, but there are so many places where your own database needs to be injected and so little actual work (again, around 300 LoC) that it seemed better expressed as a blog post. Hopefully this serves as documentation for the grunt work of converting between representations of the standard and saves times for anyone else thinking of building this in Clojure. If you have any questions or are interested in having me consult on a project like this, please feel free to email me or reach out on Twitter or Farcaster or on Clojurians Slack.

Permalink

Debugging Custom Kondo Hooks and Clojure LSP Jump-to-Definition Functionality

At Metabase I wrote a macro called defendpoint for defining (you guessed it!) REST API endpoints with a typical usage that looks something like

(api/defendpoint POST "/:id/copy"
  "Copy a `Card`, with the new name 'Copy of _name_'"
  [id]
  ;; Malli schemas defined after the argslist are used for validation
  {id [:maybe ms/PositiveInt]}
  (let [orig-card (api/read-check Card id)
        new-name  (str (trs "Copy of ") (:name orig-card))
        new-card  (assoc orig-card :name new-name)]
    (-> (card/create-card! new-card @api/*current-user*)
        (assoc :last-edit-info (last-edit/edit-information-for-user @api/*current-user*)))))

I was spending a lot of time scratching my head wondering why Clojure LSP jump-to-definition functionality wasn’t working as expected inside defendpoint. I wrote a custom clj-kondo hook for it, but moving my cursor to something like api/read-check and doing something M-x lsp-find-definition would just take me to the start of the defendpoint form rather than what I was expecting.

Debugging this issue was pretty hard – there was clearly something wrong with my custom Kondo hook, but I couldn’t figure out what it was.

In the end I finally managed to figure out how to debug these sorts of things so I’m writing down my steps in case anyone else runs into the same problem.

Add Clojure LSP as a dependency

I added this to my deps.edn:

{:aliases
 {:lsp
  {:extra-deps {com.github.clojure-lsp/clojure-lsp {:mvn/version "2024.08.05-18.16.00"}
                dev.weavejester/cljfmt             {:git/url "https://github.com/weavejester/cljfmt"
                                                    :git/sha "434408f6909924f524c8027b37422d32bb49622d"}}}}}

Note that at the time of this writing, because of clojure-lsp/clojure-lsp#1867, I had to add a fake dependency on cljfmt as well to make this work – I just used the latest Git SHA.

Launch REPL with the :lsp alias

In Emacs I used

C-u M-x cider-jack-in

to let me edit the command it used to launch the nREPL to add the :lsp alias. Note that I launched the REPL from the same project I was debugging.

Debug from the REPL

This is the code I came up with while poking thru Kondo and LSP internals to debug things:

(require 'clj-kondo.hooks-api
         'clojure-lsp.kondo
         'clojure-lsp.queries
         'clojure.java.io)

(defn find-element [path line col]
  (binding [clj-kondo.hooks-api/*reload* true]
    (letfn [(analysis [path]
              (let [db*              (atom {})
                    config           nil
                    file-analyzed-fn identity]
                (clojure-lsp.kondo/run-kondo-on-paths! [path] db* config file-analyzed-fn)))
            (db-with-analysis [path]
              (clojure-lsp.kondo/db-with-analysis {} (analysis path)))]
      (let [db  (db-with-analysis path)
            uri (str "file://" (.getAbsolutePath (clojure.java.io/file path)))]
        (clojure-lsp.queries/find-element-under-cursor db uri line col)))))

Binding clj-kondo.hooks-api/*reload* is important so Clojure LSP will reload your Kondo config (e.g., custom hooks) after you make changes to it.

The basic idea here is to run Kondo on a specific file path and then pass that analysis along to LSP which can find the element at a given position in a file based on that analysis.

All you need is a file name to debug and a line number and column number that positions the cursor over the symbol you want to debug.

Example usage:

(let [path "src/metabase/api/card_2.clj"
      line 17
      col  21]
  (some-> (find-element path line col)
          (select-keys [:name :alias :from :from-var :to])))
;;=> {:name read-check, :alias api, :from metabase.api.card-2, :from-var POST_:id_copy, :to metabase.api.common}

If your Kondo hook is broken then this will either return nothing or return the wrong information. If the information looks something like the example result above then things should work as expected.

What was broken?

My original version of the hook returned this when I used the find-element code above:

{:name POST_:id_copy}

My hook basically looked like this (simplified a bit):

(ns hooks.metabase.api.common
  (:require
   [clj-kondo.hooks-api :as api]
   [clojure.string :as str]))

(defn route-fn-name
  "route fn hook"
  [method route]
  (let [route (if (vector? route) (first route) route)]
    (-> (str (name method) route)
        (str/replace #"/" "_")
        symbol)))

;;; Original custom Kondo hook -- BROKEN!!
(defn defendpoint
  [arg]
  (letfn [(update-defendpoint [node]
            (let [[_defendpoint method route & body] (:children node)]
              (api/list-node
               (list
                (api/token-node 'do)
                (api/token-node (symbol "compojure.core" (str (api/sexpr method))))
                (-> (api/list-node
                     (list*
                      (api/token-node 'clojure.core/defn)
                      (api/token-node (route-fn-name (api/sexpr method) (api/sexpr route)))
                      body))
                    (with-meta (meta node)))))))]
    (update arg :node update-defendpoint)))

which means the example at the beginning would be transformed into a node representing a form like this:

;;; Output of custom Kondo hook
'(do
   compojure.core/POST
   (clojure.core/defn POST_:id_copy
     "Copy a `Card`, with the new name 'Copy of _name_'"
     [id]
     {id [:maybe ms/PositiveInt]}
     (let [orig-card (api/read-check Card id)
           new-name (str (trs "Copy of ") (:name orig-card))
           new-card (assoc orig-card :name new-name)]
       (-> (card/create-card! new-card @api/*current-user*)
           (assoc :last-edit-info (last-edit/edit-information-for-user @api/*current-user*))))))

We’re generating a do form here so we can capture the usage of compojure.core/POST so Kondo doesn’t think it’s unused if it’s in our :require form in the ns declaration.

Side note: this is mentioned in the Kondo docs, but you can verify that your Kondo hook output looks correct at a surface level by adding Kondo itself as a dep and using clj-kondo.hooks-api/parse-string and clj-kondo.hooks-api/sexpr:

;;; debugging custom Kondo hooks
(def form
  '(api/defendpoint POST "/:id/copy"
     "Copy a `Card`, with the new name 'Copy of _name_'"
     [id]
     {id [:maybe ms/PositiveInt]}
     (let [orig-card (api/read-check Card id)
           new-name  (str (trs "Copy of ") (:name orig-card))
           new-card  (assoc orig-card :name new-name)]
       (-> (card/create-card! new-card @api/*current-user*)
           (assoc :last-edit-info (last-edit/edit-information-for-user @api/*current-user*))))))

(-> {:node (-> form
               pr-str
               api/parse-string)}
    defendpoint
    :node
    api/sexpr)

I’ve actually started putting stuff like this in tests to make sure I don’t break my dozens of custom Kondo hooks.

Anyways “looks correct at a surface level” is key here. Because while this output looks correct, what you do with the metadata (specifically stuff like start and end line and column numbers) from the original nodes is actually super important to Clojure LSP. (They’re important to Kondo too to make sure warnings pop up in the right place.)

If you want to see what this metadata looks like in your REPL you can

(set! *print-meta* true)

or just bind *print-meta* and use something like clojure.pprint/pprint e.g.

(binding [*print-meta* true]
  (-> {:node (-> form
                 pr-str
                 api/parse-string)}
      hooks.metabase.api.common/defendpoint
      :node
      api/sexpr
      clojure.pprint/pprint))

The output is actually super noisy so I’ll leave it out here.

The fixed version of the hook actually has the exact same output aside from some changes to metadata.

Here’s the fixed version of the hook above:

;;; custom Kondo hook -- FIXED
(defn defendpoint
  [arg]
  (letfn [(update-defendpoint [node]
            (let [[_defendpoint method route & body] (:children node)]
              (-> (api/list-node
                   (list
                    (api/token-node 'do)
                    (-> (api/token-node (symbol "compojure.core" (str (api/sexpr method))))
                        (with-meta (meta method)))
                    (api/list-node
                     (list*
                      (api/token-node 'clojure.core/defn)
                      (-> (api/token-node (route-fn-name (api/sexpr method) (api/sexpr route)))
                          (with-meta (meta route)))
                      body))))
                  (with-meta (meta node)))))]
    (update arg :node update-defendpoint)))

The important change here was to give the generated function name symbol POST_:id_copy the metadata (line and column start and end numbers) from the original "/:id/copy" string. It seems that without knowing where a function name is defined in the text file (or rather, where it is restricted to), LSP was assuming the entire defendpoint form served as the function name, which meant any attempt to find the definition under the cursor anywhere in that form would assume you were trying to jump to the function definition itself.

Other changes I made which didn’t seem to fix anything but seem more correct are:

  • compojure.core/POST now gets the metadata (line and column start and end numbers) from the original POST symbol (the method parameter) from the original node

  • The overall new top-level list node gets the metadata from the original top-level node rather than putting it on the generated defn node. Not sure this makes a huge difference TBH.

Conclusion

Custom Kondo hooks are powerful and worth the time it takes to wrap your head around rewrite-clj. Debugging them is tricky, and debugging them when used by LSP is even trickier. I haven’t seen it documented anywhere before, but it is possible. I hope you find this useful!

👇 Like and subscribe below! 👇

Permalink

Middle Strong Golang Developer

We are looking for Middle Strong Golang Developer to join product development for our client from the USA. The product is an AWS hosted multi-module payment and analytical platform for the healthcare services, written in Clojure/Go language stack. The product encompasses a few applications for customer journeys (web, mobile) and a complex micro-service architecture. 

Requirements: 

  • 3+ years of development experience
  • 2+ years of working experience with Golang
  • Experience with RDBMS (PostgreSQL) and NoSQL (MongoDB, DynamoDB)
  • Experience with Messaging systems (SNS/SQS, Kafka, etc.)
  • Experience with cloud services (preferably AWS)
  • At least Upper-intermediate level of English
  • Ability to write clean and easily maintained code

Nice to have: 

  • Experience with Kafka 

Responsibilities: 

  • Participate in solution design and development, deliver high-quality code 
  • Regularly communicate with the team members at the client’s side, participate in status meetings, design sessions, and brainstorming 
  • Provide estimation and reporting of assigned tasks 

We offer friendly working conditions with competitive compensation and benefits including: 

  • Comfortable working environment 
  • Friendly team and management 
  • Competitive salary 
  • Free English classes 
  • Regular performance-based compensation review 
  • Flexible working hours 
  • 100% paid vacation, 4 weeks per year 
  • 100% paid sick-leaves 
  • Medical insurance (50% is paid) 
  • Corporate and team building events 

Apply Now

The post Middle Strong Golang Developer first appeared on Agiliway.

Permalink

Senior Golang Developer

We are looking for Senior Golang Developer to join product development for our client from the USA. The product is an AWS hosted multi-module payment and analytical platform for the healthcare services, written in Clojure/Go language stack. The product encompasses a few applications for customer journeys (web, mobile) and a complex micro-service architecture. 

Requirements: 

  • 4+ years of development experience
  • 2+ years of working experience with Golang
  • Experience with RDBMS (PostgreSQL) and NoSQL (MongoDB, DynamoDB)
  • Experience with Messaging systems (SNS/SQS, Kafka, etc.)
  • Experience with cloud services (preferably AWS)
  • At least Upper-intermediate level of English
  • Ability to write clean and easily maintained code

Nice to have: 

  • Experience with Kafka 
  • Experience with Java 

Responsibilities: 

  • Participate in solution design and development, deliver high-quality code 
  • Regularly communicate with the team members at the client’s side, participate in status meetings, design sessions, and brainstorming 
  • Provide estimation and reporting of assigned tasks 

We offer friendly working conditions with competitive compensation and benefits including: 

  • Comfortable working environment 
  • Friendly team and management 
  • Competitive salary 
  • Free English classes 
  • Regular performance-based compensation review 
  • Flexible working hours 
  • 100% paid vacation, 4 weeks per year 
  • 100% paid sick-leaves 
  • Medical insurance (50% is paid) 
  • Corporate and team building events 

Apply Now

The post Senior Golang Developer first appeared on Agiliway.

Permalink

Passkeys for React Native with Clojure, Part III

Welcome back to this series on implementing passkeys in Clojure. For background on how passkeys work and how we’re building this backend, please refer back to part one of the series which covers the data model behind our implementation, and part two, which covers user registration.

User Authentication

Today, we’ll complete the backend of our passkey service by building the endpoints needed to authenticate users who have already registered. This flow is similar to user registration, in that the server will send a challenge to the client and wait to validate the response, but it will be simpler to build as it uses a number of pieces already built for registration.

(ns example.passkey
  (:import [com.yubico.webauthn StartAssertionOptions]))

(defn start-authentication
  [db _req]
  (let [options    (-> (StartAssertionOptions/builder)
                       (.build))
        rp         (relying-party db)
        request    (.startAssertion rp options)
        json       (.toCredentialsGetJson request)
        request-id (store-request db request)]
    {:status 200
     :headers {:content-type "application/json"
               :request-id request-id}
     :body json}))

As in the registration flow, we are temporarily storing the request in our database, and the server will retrieve it on the response in order to validate the challenge.

Authentication Outcomes

We’ll need three simple functions to back up our eventual authentication completion:

(defn auth-error
  []
  {:status 401
   :headers {:content-type "application/edn"}
   :body (pr-str {:error "Authorization Failed"})})

(defn auth-success-for
  [user-data]
  {:status 200
   :headers {:content-type "application/edn"}
   :body (pr-str user-data)})

(defn update-user
  [db credential-id signature-count backed-up?]
  (let [user (db/find-user-by-credential-id db credential-id)]
    (db/update-user db user {:credential/signature-count signature-count
                             :credential/backed-up? backed-up?})))

The error function simply returns an HTTP 401 error code. The success function will return user-data, which is anything your client application needs to know to function as a logged-in user, like you would with session information or a JWT.

The user update function serves to increment the signature count and backup status; java-webauthn-server uses this to continue authenticating the passkey.

Completing Authentication

With those built, we are ready to process an authentication response:

(ns example.passkey
  (:require [byte-streams :as bs]
            [example.db :as db])
  (:import  [com.yubico.webauthn FinishAssertionOptions]
            [com.yubico.webauthn.data PublicKeyCredential]
            [com.yubico.webauthn.exception AssertionFailedException]))

(defn complete-authentication
  [db req]
  (try
    (let [json-body  (-> req :body bs/to-string)
          request-id (-> req :path-params :request)
          pkc        (PublicKeyCredential/parseAssertionResponseJson json-body)
          rp         (relying-party db)
          request-m  (assertion-request-for db request-id)
          request    (:webauthn/request request-m)
          options    (-> (FinishAssertionOptions/builder)
                         (.request request)
                         (.response pkc)
                         (.build))
          result     (.finishAssertion rp options)]
      (if (.isSuccess result)
        (let [credential-id   (-> result .getCredential .getCredentialId .getHex)
              signature-count (.getSignatureCount result)
              backed-up?      (.isBackedUp result)
              user            (db/find-user-by-username db (.getUsername result))]
          (update-user db credential-id signature-count backed-up?)
          (remove-request db request-id)
          (auth-success-for user))
        (auth-error)))
    (catch AssertionFailedException e
      (println "Error Authenticating Passkey" e)
      (auth-error))))

Much like the registration flow, we are invoking the webauthn library to process the incoming signed JSON authentication response. If it succeeds, we look up the user in our database and return it to the client. If not, we’ll use our error function to return an HTTP error code.

That’s all there is; the entire passkey service is now completed end-to-end.

Up Next

Now that our backend is complete, the fourth and final installment of this series will look at building a client in React Native that can register and authenticate using passkeys on client devices.

Permalink

Using Swift SDKs from ClojureDart

One of Dart’s strengths has always been host interoperability. Early on (beginning of 2017), Platform channels were used (and still are) to consume native SDKs. Although they are very useful, Platform channels come with costs tied to their message-handling design (a close comparison would be JavaScript’s Web Worker API).

More recently, Dart’s team has invested significantly in the Foreign Function Interface (FFI). They built a set of libraries to help you generate native bindings targeting Java, Objective-C, C, Kotlin, and more. Once your bindings are generated, you only need to call them using ClojureDart. The team also plans to continue working in this direction:

We’ll continue to invest in further interoperability—both in terms of completing the above-mentioned libraries and in terms of supporting Swift—over the coming releases. See the roadmap section below for details.
Source

I personally think it’s a game-changer. At Tensegritics, we generate lots of bindings for our projects. The process is as simple as:
1. Identifying a native SDK.
2. Writing some configuration for ffigen or jnigen.
3. Dumping the bindings.

You might have noticed that I did not mention Swift among the available languages. That’s because it’s not currently directly supported, although swiftgen is a work in progress. It is still possible, and that’s the goal of this article.

We are going to generate bindings for the CryptoKit SDK, which is a Swift-only SDK, using ffigen and some Xcode magic (or nightmares).

The operation can be summarized as follows:
1. Create a Swift file containing the functions you want to use (using @objc annotations).
2. Compile the file and generate Objective-C headers.
3. Add the dynamic library to the project.
4. Generate Dart bindings using ffigen.

Cryptobridge: A macOS Desktop Application Using CryptoKit SDK

The application’s sole goal is to compare SHA-256 hashes generated from Swift and from Dart. They must be equal!

Requirements:

Having Clojure and Flutter installed. (ClojureDart comes via deps.edn.)

Initialization

First, clone the example repository:

git clone git@github.com:Tensegritics/cryptobridge.git

Then initialize it:

cd cryptobridge
fvm use # only if you are an fvm user 
clj -M:cljd init

Download the official Dart crypto package:

flutter pub add crypto

And run the application:

clj -M:cljd flutter -d macos

You should see the application like this:

You can stop the app for now -- adding a native dependency requires restarting the app.

Writing Your Swift Bindings

Swift can be consumed in Objective-C by using the @objc annotation. Once you’ve identified the Swift functions you need to use, it’s simply a matter of reading the documentation and figuring out how to call them. Most of the time, you don’t need to be a Swift expert.

Open the project with Xcode:

open macos/Runner.xcworkspace

Create a new Swift file as shown in the images below:


Name the file CryptoBridge.

Once created, copy and paste the code below. The code imports CryptoKit and creates a public class CryptoBridge with a static method hash that takes a String as input and returns an NSData. SHA256.hash is from CryptoKit, and we wrap the result in an NSData wrapper. The official objective_c.dart package makes it simple to consume NSData.

import Foundation
import CryptoKit

@available(macOS 10.15, *)
@objc public class CryptoBridge: NSObject {
    @objc public static func hash(s: String) -> NSData {
        let data = s.data(using: .utf8)!
        let hash = SHA256.hash(data: data)
        return Data(hash) as NSData
  }
}

Generating the Objective-C Wrapper Header and Dynamic Library

Once we’re done writing our Swift code, we need to compile and generate the swift_api.h headers. Note that I am using an Apple Silicon chip here, so change the -target option to fit your architecture.

swiftc -c macos/Runner/CryptoBridge.swift -target arm64-apple-macos10.15 -module-name crypto_module -sdk $(xcrun --sdk macosx --show-sdk-path) -emit-objc-header-path third_party/swift_api.h -emit-library -o macos/Runner/libcryptobridge.dylib

swiftc generated two files:
- macos/Runner/libcryptobridge.dylib that we need to embed in our app.
- third_party/swift_api.h that we’ll use to generate Dart bindings using ffigen.

In Xcode, add the libcryptobridge.dylib library to your project. Under Frameworks, right-click, select “Add files,” and look for the dylib file.


Once it’s done, when you click on Runner and then Build Phases, you should see the libcryptobridge.dylib file under Link Binary With Libraries like this:

Generating Dart Bindings and Using Them

Our Xcode job is done here. The last phase is generating Dart bindings using ffigen and using them.
If your app is running, quit it, and then run pub get for the ffigen and objective_c packages.

flutter pub add ffigen --dev
flutter pub add objective_c

Open your pubspec.yaml file and add this ffigen configuration at the very end:

ffigen:
  name: CryptoKit
  description: Bindings for CryptoKit
  language: objc
  output: 'lib/ns_cryptokit_bindings.dart'
  exclude-all-by-default: true
  objc-interfaces:
    exclude:
      - 'NS*'
    include:
      - 'CryptoBridge'
    module:
      'CryptoBridge': 'crypto_module'
  preamble: |
    // ignore_for_file: camel_case_types, non_constant_identifier_names, unused_element, unused_field, void_checks, annotate_overrides, no_leading_underscores_for_local_identifiers, library_private_types_in_public_api
  headers:
    entry-points:
      - 'third_party/swift_api.h'
    include-directives:
      - '**swift_api.h'

You can now generate your Dart bindings like this:

dart run ffigen

This should create the lib/ns_cryptokit_bindings.dart file:

If you open the generated file (lib/ns_cryptokit_bindings.dart), you’ll find a static method hashWithS_, which is the Dart wrapper for the hash(s: String) method we created above.

static objc.NSData hashWithS_(objc.NSString s) {
    final _ret =
        _objc_msgSend_1(_class_CryptoBridge, _sel_hashWithS_, s.pointer);
    return objc.NSData.castFromPointer(_ret, retain: true, release: true);
  }

Now, it's time to start our application anew (if you haven't killed it yet, stop it and relaunch it).

clj -M:cljd flutter -d macos

The application should show up as before, except that this time it's linked to CryptoKit; there's no visible change yet.

Open src/cryptobridge/main.cljd and require the necessary packages:

(ns cryptobridge.main
  (:require
   ["package:flutter/material.dart" :as m]
   ;; our generated bindings
   ["ns_cryptokit_bindings.dart" :as ns_cryptokit_bindings]
   ;; objective_c utility 
   ["package:objective_c/objective_c.dart" :as objective_c]
   ["package:crypto/crypto.dart" :as crypto]
   ["dart:convert" :as dart:convert]
   [cljd.flutter :as f]))

In the body of the main function, look for:

(f/widget
  :padding {:top 8}
  :watch [v controller :> .-text]
  (m/Text "TO IMPLEMENT"))

Replace "TO IMPLEMENT" with the call to our wrapper:

(f/widget
  :padding {:top 8}
  :watch [v controller :> .-text] ;1️⃣
  (m/Text (str (-> v
                 objective_c/NSString ;2️⃣
                 ns_cryptokit_bindings/CryptoBridge.hashWithS_ ;3️⃣
                 objective_c/NSDataExtensions ;4️⃣
                 .toList)))) ;5️⃣

Save and wait for hot reload to pick your changes up.

While we wait for the reload to happen, let me explain two bits of the above snippet:

  • the :watch (1️⃣) triggers on changes to the text property (a string) of the controller.
  • let's break down the big -> form: take v, convert it to an NSString (2️⃣), call the wrapper (3️⃣), then call the .toList (5️⃣) method extension to NSData defined in NSDataExtensions (4️⃣; it looks like we are wrapping just to call a method, but we are not: it's just some weird low-level static sugar, and 4️⃣ and 5️⃣ work as a single expression).

The :watch option above listens for changes in the TextEditingController and gets any new TextEditingValue. We then get the text property, which is a string. Using the objective_c package, we create an NSString from our text and pass it to our hashWithS_ bindings. Finally, we use the toList method extension to dump NSData bytes.

Conclusion

Using Swift APIs is a bit more tedious than calling Objective-C APIs as it requires an extra step (creating @objc wrappers), but it's still a straightforward process.

Native interop is making constant progress, to the point that we’ve stopped looking for ready-made wrappers.

Nowadays, you rarely have to write native languages thanks to code generation (except for the situation described above, but swiftgen is coming). The few cases where we still had to write native code in the last year (to run on a specific thread or implement a callback) have been recently addressed!

Permalink

Copyright © 2009, Planet Clojure. No rights reserved.
Planet Clojure is maintained by Baishampayan Ghose.
Clojure and the Clojure logo are Copyright © 2008-2009, Rich Hickey.
Theme by Brajeshwar.