thingsinjars

  • 30 Dec 2012

    GhostStory

    During all the research I did for my CSS testing talk, I couldn't help but spot another gap where a testing tool could be useful.

    Cucumber

    Cucumber is a technology used widely in automated testing setups, mostly for acceptance testing - ensuring that the thing everybody agreed on at the beginning was the thing delivered at the end.

    This is accomplished by having a set of plain text files containing descriptions of different scenarios or aspects of the application, usually with a description of the actions performed by an imaginary user. You describe the situation (known as a 'Given' step), describe the user's action ('When') and describe the expected outcome ('Then').

The language (properly known as Gherkin) used in these files is deliberately simple and jargon-free so that all the key stakeholders in the project - designers, developers, product owners - can understand it. The files are also written in a predictable, structured style so that they can, behind the scenes, be turned into testable code.

    What occurred to me when looking into this area was that there wasn't an agreed terminology for specifying the layout/colour/look and feel of a project in plain text. Surely this would be the perfect place to drop in some cucumber salad.

What we've got now is a project based on SpookyJS - a way of controlling CasperJS (and, therefore, PhantomJS) from NodeJS - which contains the GhostStory testing steps and their corresponding 'behind the scenes' test code. There are only two steps at the moment but they are the most fundamental, and future steps can be built on top of them.

    By the way, the name 'GhostStory' only just won out over the original 'CucumberNodeSpookyCasperPhantomJS'. It kinda makes sense because it's built using a lot of ghost-named technologies (Spooky, Casper, Phantom) and cucumber test files are sometimes called "Story" files because they describe a user story.

    Implemented Steps

    Here, "Element descriptor" is a non-dev-readable description of the element you want to test - "Main title", "Left-hand navigation", "Hero area call-to-action". In the project, you keep a mapping file, selectors.json, which translates between these descriptions and the CSS selector used to identify the element in tests.

    Then the "Element descriptor" should have "property" of "value"
    

This uses the computed styles on an element and checks whether they are what you expect them to be. I talked about something similar in an earlier post. It is related to the 'Frozen DOM' approach that my first attempt at a CSS testing tool, cssert, uses, but this way does not actually involve a DOM snapshot.

    Then the "Element descriptor" should look the same as before
    

    This uses the 'Image Diff' approach. You specify an element and render the browser output of that element to an image. The next time you run the test, you do the same and check to see if the two images differ. As mentioned many times before, this technique is 'content-fragile' but can be useful for a specific subset of tests or when you have mocked content. It can also be particularly useful if you have a 'living styleguide' as described by Nico Hagenburger. I've got some ideas about CSS testing on living styleguides that I'll need to write up in a later post.

    Future Steps

    Off the top of my head, there are a couple of other generic steps that I think would be useful in this project.

    Then the "Element descriptor" should have a "property" of "value1", "value2", ..., or "valueN"
    

This variation on the computed style measurement allows an arbitrary-length list of values. As long as the element being tested matches at least one of the listed values, the step counts as a pass. This could be used to ensure that all text on a site uses one of a fixed set of font sizes or that all link colours come from the predefined palette.

    Then the "Element descriptor" should look the same across different browsers.
    

This would build on the existing image diff step but include multiple browser runners. Just now, the image diffs are performed using PhantomCSS, which is built on top of the WebKit-based PhantomJS. This would ideally integrate a Gecko or Trident renderer process so that the images generated by one engine could be checked against another. I still feel that image diff testing is extremely fragile and doesn't cover the majority of what CSS testing needs to do, but it can be a useful additional check.

    The aim

I'm hoping this can sit alongside the other testing tools gathering on csste.st where it can help people get a head-start on their CSS testing practices. What I'm particularly keen on with the GhostStory project is that it can pull in other tools and abstract them into testing steps. That way, we can take advantage of the best tools out there and stuff them into easily digested Cucumber sandwiches.

    Try it

    The GhostStory project is, naturally, available on GitHub. More usefully, however, I've been working on a fork of SpookyJS that integrates GhostStory into an immediately usable tool.

    Please check out this project and let me know what you think. I might rename it to distinguish it from the original SpookyJS if I can figure out exactly how to do that and maintain upstream relationships on GitHub.

    Geek, Development, CSS

  • 29 Oct 2012

    Some App.net recipes

    This is a collection of code snippets for various common tasks you might need to accomplish with the App.net API. Most of these are focused on creating or reading geo-tagged posts. They require a developer account on app.net and at least one of an App ID, App Code, App Access Token or User Access Token. The calls here are implemented using jQuery but that's just to make it easier to copy-paste into the console to test them out (so long as you fill in the blanks).

An important thing to bear in mind is the potential for confusion between a 'stream' and 'streams'. By default, a 'stream' is a discrete chunk of the 20 latest posts served at a number of endpoints. This is the open, public, global stream:

    https://alpha-api.app.net/stream/0/posts/stream/global
    

    On the other hand, 'streams' are long-poll connections that serve up any matching posts as soon as they are created. The connection stays open while there is something there to receive the response. Streams are available under:

    https://alpha-api.app.net/stream/0/streams
    

    Totally not confusing. Not at all.


    Creating a user access token

    Required for any user-specific data retrieval. The only tricky thing you'll need to think about here is the scope you require.

    scope=stream email write_post follow messages export
    

    should cover most requirements.

    Requires

    • client_id

    Visit this URL:

    https://alpha.app.net/oauth/authenticate
        ?client_id=[your client ID]
        &response_type=token
        &redirect_uri=http://localhost/
        &scope=stream email write_post follow messages export
    

    Using a user access token to create a post (with annotations)

    Requires

    • User Access Token
    • text to post

Text is required unless you mark the post as 'machine_only'. The annotations here are optional. Annotations don't appear in the global stream unless the requesting client asks for them.

    $.ajax({
      contentType: 'application/json',
      data: JSON.stringify({
        "annotations": [{
          "type": "net.app.core.geolocation",
          "value": {
            "latitude": 52.5,
            "longitude": 13.3,
            "altitude": 0,
            "horizontal_accuracy": 100,
            "vertical_accuracy": 100
          }
        }],
        "text": "Don't mind me, just checking something out."
      }),
      dataType: 'json',
      success: function(data) {
        console.log("Text+annotation message posted");
      },
      error: function() {
        console.log("Text+annotation message failed");
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha-api.app.net/stream/0/posts?access_token={USER_ACCESS_TOKEN}'
    });
    

    Using a user access token to post a machine_only post (with annotations)

    Requires

    • User Access Token

In this example, we're creating a post that won't show up in users' timelines and adding the 'well-known annotation' for geolocation.

    $.ajax({
      contentType: 'application/json',
      data: JSON.stringify({
        "annotations": [{
          "type": "net.app.core.geolocation",
          "value": {
            "latitude": 52.5,
            "longitude": 13.3,
            "altitude": 0,
            "horizontal_accuracy": 100,
            "vertical_accuracy": 100
          }
        }],
        machine_only: true
      }),
      dataType: 'json',
      success: function(data) {
        console.log("Non-text message posted");
      },
      error: function() {
        console.log("Non-text message failed");
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha-api.app.net/stream/0/posts?access_token={USER_ACCESS_TOKEN}'
    });
    

    Retrieve the global stream, including geo-annotated posts if there are any

    Requires

    • User Access Token

This is a very basic call to retrieve the global stream, but it also instructs the endpoint to return all annotations and include machine-only posts.

    var data = {
      "include_machine": 1,
      "include_annotations": 1,
      "access_token": "{USER_ACCESS_TOKEN}"
    };
    
    $.ajax({
        contentType: 'application/json',
        dataType: 'json',
        success: function(data) {
          console.log(data);
        },
        error: function(error, data) {
          console.log(error, data);
        },
        type: 'GET',
        url: 'https://alpha-api.app.net/stream/0/posts/stream/global',
        data: data
    });
    

    Creating an App Access Token

    This is necessary for many of the streams operations. It is not used for individual user actions, only for application-wide actions.

    • App.net API wiki on App Access Tokens

    Requires

    • client_id
    • client_secret

    client_credentials is one of the four types of grant_type specified in the OAuth 2.0 specification. I had difficulty getting this to work when using a data object:

    var data = {
        "client_id": "{CLIENT_ID}",
        "client_secret":"{CLIENT_SECRET}",
        "grant_type": "client_credentials"
    };
    

    The client_credentials kept throwing an error. Instead, sending this as a string worked fine:

    $.ajax({
      contentType: 'application/json',
      data: 'client_id={CLIENT_ID}&client_secret={CLIENT_SECRET}&grant_type=client_credentials',
      dataType: 'json',
      success: function(data) {
        console.log(data);
      },
      error: function(error, data) {
        console.log(error, data);
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha.app.net/oauth/access_token'
    });
    

    One other thing to note is that this bit should be done server-side. This will throw a bunch of "…not allowed by Access-Control-Allow-Origin…" errors if you do it via jQuery.
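    For reference, a minimal server-side sketch in Node (using the request module here, which is just an assumption, any HTTP client will do):

    var request = require('request');

    request.post({
      url: 'https://alpha.app.net/oauth/access_token',
      // `form` sends application/x-www-form-urlencoded, matching the string above
      form: {
        client_id: '{CLIENT_ID}',
        client_secret: '{CLIENT_SECRET}',
        grant_type: 'client_credentials'
      }
    }, function(err, response, body) {
      console.log(JSON.parse(body).access_token);
    });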

    Returns

{
    "access_token": "{APP_ACCESS_TOKEN}"
}
    

    Creating a streams format

Now you have your app access token, you can use it to tell the service what kind of data you want back. The streams offered in the API have two quite powerful aspects. Firstly, filters allow you to run many kinds of queries on the data before it is streamed to you, so you don't need to receive and process it all. Secondly, the decoupling of filters from streams means you can specify the data structure and requirements you want once, then just access that custom endpoint to get the data back any time.

    Requires

    • App access token

This first example just creates an unfiltered stream endpoint:

    $.ajax({
      contentType: 'application/json',
      data: JSON.stringify({"object_types": ["post"], "type": "long_poll", "id": "1"}),
      dataType: 'json',
      success: function(data) {
        console.log(data);
      },
      error: function(error, responseText, response) {
        console.log(error, responseText, response);
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha-api.app.net/stream/0/streams?access_token={APP_ACCESS_TOKEN}'
    });
    

    Returns

    {
        "data": {
            "endpoint": "https://stream-channel.app.net/channel/1/{LONG_RANDOM_ENDPOINT_URL}",
            "id": "77",
            "object_types": [
                "post"
            ],
            "type": "long_poll"
        },
        "meta": {
            "code": 200
        }
    }
    

    Using Filters to create a stream of geotagged posts

We'll specify some requirements for our filter now so that it only returns a subset of posts. The rules we're specifying here are:

    At least one item in the "/data/annotations/*/type" field
    must "match"
    the value "net.app.core.geolocation"
    

    Requires

    • User access token

The field is specified in 'JSON Pointer' format. Within the response, there is a 'data' object and a 'meta' object. The 'data' object contains an 'annotations' array, each item of which has a type. This is represented as /data/annotations/*/type.

    $.ajax({
      contentType: 'application/json',
      data: JSON.stringify({"match_policy": "include_any", "clauses": [{"object_type": "post", "operator": "matches", "value": "net.app.core.geolocation", "field": "/data/annotations/*/type"}], "name": "Geotagged posts"}),
      dataType: 'json',
      success: function(data) {
        console.log(data);
      },
      error: function(error, responseText, response) {
        console.log(error, responseText, response);
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha-api.app.net/stream/0/filters?access_token={USER_ACCESS_TOKEN}'
    });
    

    Returns

The filter rules you just specified, the id of the filter (remember that for later) and the details of the account that owns it.

{
    "clauses": [
        {
            "field": "/data/annotations/*/type",
            "object_type": "post",
            "operator": "matches",
            "value": "net.app.core.geolocation"
        }
    ],
    "id": "527",
    "match_policy": "include_any",
    "name": "Geotagged posts",
    "owner": {
        "avatar_image": {
            "height": 200,
            "url": "https://d2rfichhc2fb9n.cloudfront.net/image/4/Pr63PjEwJ1fr5Q4KeL3392BMgSnIAYlHxv8OkWwzx75V8quNfpaFp4VPpKnDRxdXtYYPtIutrDVdU9NbJn7hKApQL84T5sfB1D9bWTgtizMWInignv0WyPPfM2DpqSThQgvkB68vbPzjZ8VeKM02M2GySZ4",
            "width": 200
        },
        "canonical_url": "https://alpha.app.net/thingsinjars",
        "counts": {
            "followers": 30,
            "following": 65,
            "posts": 96,
            "stars": 0
        },
        "cover_image": {
            "height": 230,
            "url": "https://d2rfichhc2fb9n.cloudfront.net/image/4/UWZ6k9xD8_8LzEVUi_Uz6C-Vn-I8uPGEBtKb9jSVoFNijTwyEm1mJYpWq6JvnA6Jd4gzW76vFnbSWvM3jadhc1QxUl9qS4NTKiv3gJmr1zY_UpFWvX3qhOIyKrBPZckf2MrinqWay3H0h9rfqY0Gp9-liEg",
            "width": 960
        },
        "created_at": "2012-08-12T17:23:44Z",
        "description": {
            "entities": {
                "hashtags": [],
                "links": [],
                "mentions": []
            },
            "html": "<span itemscope=\"https://app.net/schemas/Post\">Nokia Maps Technologies Evangelist; CreativeJS team member; the tech side of museum140; builder of The Elementals; misuser of semi-colons;\r\n</span>",
            "text": "Nokia Maps Technologies Evangelist; CreativeJS team member; the tech side of museum140; builder of The Elementals; misuser of semi-colons;\r\n"
        },
        "id": "3191",
        "locale": "en_GB",
        "name": "Simon Madine",
        "timezone": "Europe/Berlin",
        "type": "human",
        "username": "thingsinjars"
    }
}
    

    Listening to the geotagged post stream

This will return a link to a long-lasting connection to the app.net stream that will only return posts with the geolocation annotation.

    Requires

    • filter_id from the previous call

    Note: the filter_id was returned as id in the previous response.

    $.ajax({
      contentType: 'application/json',
      data: JSON.stringify({"object_types": ["post"], "type": "long_poll", "filter_id": "527"}),
      dataType: 'json',
      success: function(data) {
        console.log(data);
      },
      error: function(error, responseText, response) {
        console.log(error, responseText, response);
      },
      processData: false,
      type: 'POST',
      url: 'https://alpha-api.app.net/stream/0/streams?access_token={APP_ACCESS_TOKEN}'
    });
    

    Returns

    The same kind of response as the 'Creating a streams format' example except the data coming down on the stream is filtered.

    https://stream-channel.app.net/channel/1/{LONG_RANDOM_ENDPOINT_URL}
    

    Open that URL up in your browser (seeing as we're testing) and, in a different tab, create a geo-tagged machine-only post (see above). Your post will appear almost instantly after you've submitted it.
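    If you'd rather consume the stream programmatically, here's a minimal Node sketch. It assumes the long-poll response delivers one JSON envelope per \r\n-separated line, so treat it as a starting point rather than gospel:

    var https = require('https');

    https.get('https://stream-channel.app.net/channel/1/{LONG_RANDOM_ENDPOINT_URL}', function(res) {
      var buffer = '';
      res.on('data', function(chunk) {
        buffer += chunk;
        var lines = buffer.split('\r\n');
        buffer = lines.pop(); // keep any partial line for the next chunk
        lines.forEach(function(line) {
          if (line.trim()) {
            // envelope shape assumed: { "meta": ..., "data": { "text": ... } }
            console.log(JSON.parse(line).data.text);
          }
        });
      });
    });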

    Geek, Development, Javascript, Guides

  • 15 Oct 2012

    Location-based time

    Inspired by the simplicity of implementing a proximity search using MongoDB, I found myself keen to try out another technology.

    It just so happened that I was presented with a fun little problem the other day. Given a latitude and longitude, how do I quickly determine what the time is? Continuing the recent trend, I wanted to solve this problem with Node.JS.

Unsurprisingly, there's a lot of information out there about timezones. Whenever I've worked with timezones in the past, I've always gotten a little bit lost so this time, I decided to actually read a bit and find out what was supposed to happen. In essence, if you're doing this sort of task, you do not want to have to figure out the actual time yourself. Nope. It's quite similar to one of my top web dev rules:

    Never host your own video.

(Really, never deal with video yourself. Pay someone else to host it, transcode it and serve it up. It'll always work out cheaper.)

    What you want to do when working with timezones is tie into someone else's database. There are just too many rules around international boundaries, summer time, leap years, leap seconds, countries that have jumped over the international date line (more than once!), islands whose timezone is 30 minutes off the adjacent ones...

To solve this problem, it needs to be split into two: the first part is to determine which timezone the coordinate is in, the second is the harder problem of figuring out what time it is in that timezone. Fortunately, there are other people who are already doing this. Buried near the back of the bottom drawer in every operating system is some version of the tz database. You can spend hours reading up about it, its controversies and history on Wikipedia if you like. More relevant, however, is what it can do for us in this case. Given an IANA timezone name – "America/New_York", "Asia/Tokyo" – you can retrieve the current time from the system's tz database. I don't know how it works. I don't need to know. It works.

    Node

Even better, there's a node module that abstracts the problem of loading and querying the database. If you use the zoneinfo module, you can create a new timezone-aware Date object, pass the timezone name to it and it will do the hard work. awsm. The module wasn't perfect, however. It loaded the system database synchronously using fs.readFileSync, which is I/O blocking and therefore a Bad Thing. Boo.

10 minutes later and Max had wrangled it into using the asynchronous, non-blocking fs.readFile. Hooray!
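    For reference, using the module looks roughly like this (a sketch; check the zoneinfo docs for the exact API):

    var TZDate = require('zoneinfo').TZDate;

    var d = new TZDate();         // now
    d.setTimezone('Asia/Tokyo');  // re-interpret in the given zone
    console.log(d.format('Y-m-d H:i:s')); // format tokens per the module's docs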

    Now all I needed to do was figure out how to do the first half of the problem: map a coordinate to a timezone name.

    Nearest-Neighbour vs Point-in-Polygon

    There are probably more ways to solve this problem but these were the two routes that jumped to mind. The tricky thing is that the latitude and longitude provided could be arbitrarily accurate. A simple lookup table just wouldn't work. Of course, the core of the problem was that we needed to figure out the answer fast.

    Nearest Neighbour

    1. Create a data file containing a spread of points across the globe, determine (using any slow solution) the timezone at that point.
    2. Load the data into an easily searchable in-memory data-structure (such as a k-d tree)
    3. Given a coordinate, find the nearest existing data point and return its value.

    Point in Polygon

    1. Create a data file specifying the geometry of all timezones.
    2. Given a coordinate, loop over each polygon and determine whether this coordinate is positioned inside or outside the polygon.
3. Return the first containing polygon.

    This second algorithm could be improved by using a coarse binary search to quickly reduce the number of possible polygons that contain this point before step 2.

Despite some kind of qualification in mathematic-y computer-y stuff, algorithm analysis isn't my strong point. To be fair, I spent the first three years of my degree trying to get a record deal and the fourth trying to be a stand-up comedian so we may have covered complexity analysis at some point and I just didn't notice. What I do know, however, is that k-d trees are fast for searching. Super fast. They can be a bit slower to build initially but the point to bear in mind is that you only build the tree once while you search it many times. On the other hand, while it's a quick task to load the geometry of a small number of polygons into memory, determining which polygon a given point is in can be slow, particularly if the polygons are complex.

    Given this vague intuition, I settled on the first option.

    If I wanted to create a spread of coordinates and their known timezones from scratch, it might have been an annoyingly slow process but, the Internet being what it is, someone already did the hard work. This gist contains the latitude and longitude for every city in the world and what IANA timezone it is in. Score! A quick regex later and it looks like this:

    module.exports = [
      {"latitude": 42.50729, "longitude": 1.53414, "timezone": "Europe/Andorra"},
      {"latitude": 42.50779, "longitude": 1.52109, "timezone": "Europe/Andorra"},
      {"latitude": 25.56473, "longitude": 55.55517, "timezone": "Asia/Dubai"},
      {"latitude": 25.78953, "longitude": 55.9432, "timezone": "Asia/Dubai"},
      {"latitude": 25.33132, "longitude": 56.34199, "timezone": "Asia/Dubai"},
      etc…
    

    (See the original on GitHub)

    All that's left is to load that into a k-d tree and we've got a fully-searchable, fast nearest neighbour lookup.
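    As a sketch of that lookup, here's roughly what it boils down to. I'm using the kd-tree-javascript module for illustration (an assumption; coordinate-tz wires up its own tree) and a made-up ./cities file containing the data shown above:

    var kdTree = require('kd-tree-javascript').kdTree;
    var cities = require('./cities'); // the regex'd data file from the gist

    // Squared euclidean distance on raw lat/lng: crude, but good enough
    // to pick the nearest known city.
    function distance(a, b) {
      var dLat = a.latitude - b.latitude;
      var dLng = a.longitude - b.longitude;
      return dLat * dLat + dLng * dLng;
    }

    var tree = new kdTree(cities, distance, ['latitude', 'longitude']);

    function timezoneAt(latitude, longitude) {
      // nearest() returns [point, distance] pairs; we want one point's timezone
      var nearest = tree.nearest({latitude: latitude, longitude: longitude}, 1);
      return nearest[0][0].timezone;
    }

    console.log(timezoneAt(52.5, 13.4)); // "Europe/Berlin"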

    Source

    The source for this node module is, of course, available on GitHub and the module itself is available for install via npm using:

    npm install coordinate-tz
    

    When combined with the zoneinfo module (or, even better, this async fork of the module), you can get a fast, accurate current time lookup for any latitude and longitude.

    Not a bad little challenge for a Monday evening.

    Development, Geek, Javascript

  • 8 Oct 2012

    Building a Proximity Search

    This is the detailed post to go with yesterday's quick discussion about proximity search. All the code is available on GitHub.

This assumes a bit of NodeJS knowledge and a working copy of homebrew or something similar.

    Install

    • MongoDB - brew install mongodb
    • NodeJS
    • NPM (included in NodeJS installer these days)

    These are included in the package.json but it can't hurt to mention them here:

    • npm install twitter (node twitter streaming API library)
    • npm install mongodb (native mongodb driver for node)
    • npm install express (for convenience with API later)

Start mongod in the background. We don't quite need it yet but it needs done at some point, so we may as well do it now.

    Create a Twitter App

Fill out the form, then press the button to get the single-user access token and key. I love that Twitter does this now, rather than having to create a full authentication flow for single-user applications.

    ingest.js

    (open the ingest.js file and read along with this bit)

    Using the basic native MongoDB driver, everything must be done in the database.open callback. This might lead to a bit of Nested Callback Fury but if it bothers you or becomes a bit too furious for your particular implementation, there are a couple of alternative Node-MongoDB modules that abstract this out a bit.

    // Open the proximity database
    db.open(function() {
        // Open the post collection
        db.collection('posts', function(err, collection) {
            // Start listening to the global stream
            twit.stream('statuses/sample', function(stream) {
                // For each post
                stream.on('data', function(data) {
                    if ( !! data.geo) {
                        collection.insert(data);
                    }
                });
            });
        });
    });
    

    Index the data

    The hard work has all been done for us: Geospatial Indexing in MongoDB. That's a good thing.

    Ensure the system has a Geospatial index on the tweets.

    db.posts.ensureIndex({"geo.coordinates" : "2d"})
    

    Standard Geospatial search query:

    db.posts.find({"geo.coordinates": {$near: [50, 13]}}).pretty()
    (find the closest points to (50,13) and return them sorted by distance)
    

    By this point, we've got a database full of geo-searchable posts and a way to do a proximity search on them. To be fair, it's more down to mongodb than anything we've done.

Next, we extend the search on those posts to allow filtering by query:


    db.posts.find({"geo.coordinates": {$near: [50, 13]}, text: /.*searchterm.*/}).pretty()
    

    API

Super simple API: we only have two main query types:

    • /proximity?latitude=55&longitude=13
    • /proximity?latitude=55&longitude=13&q=searchterm

    Each of these can take an optional 'callback' parameter to enable jsonp. We're using express so the callback parameter and content type for returning JSON are both handled automatically.
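    One thing the excerpt below doesn't show is the writeResponse helper it calls. With express it can be as small as this (a sketch, assuming a version of express with res.jsonp):

    function writeResponse(items, res) {
      // Sets the JSON content type and wraps in the callback param if present
      res.jsonp(items);
    }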

    api.js

    (open the api.js file and read along with this bit)

    This next chunk of code contains everything so don't panic.

    db.open(function() {
      db.collection('posts', function(err, collection) {
        app.get('/proximity', function(req, res) {
          var latitude, longitude, q;
          latitude = parseFloat(req.query["latitude"]);
          longitude = parseFloat(req.query["longitude"]);
          q = req.query["q"];
    
      if (/^(-?\d+(\.\d+)?)$/.test(latitude) && /^(-?\d+(\.\d+)?)$/.test(longitude)) {
            if (typeof q === 'undefined') {
              collection.find({
                "geo.coordinates": {
                  $near: [latitude, longitude]
                }
              }, function(err, cursor) {
                cursor.toArray(function(err, items) {
                  writeResponse(items, res);
                });
              });
            } else {
              var regexQuery = new RegExp(".*" + q + ".*");
              collection.find({
                "geo.coordinates": {
                  $near: [latitude, longitude]
                },
                'text': regexQuery
              }, function(err, cursor) {
                cursor.toArray(function(err, items) {
                  writeResponse(items, res);
                });
              });
            }
          } else {
            res.send('malformed lat/lng');
          }
    
        });
      });
    });
    

    If you've already implemented the ingest.js bit, the majority of this api.js will be fairly obvious. The biggest change is that instead of loading the data stream then acting upon each individual post that comes in, we're acting on URL requests.

    app.get('/proximity', function(req, res) {
    

    For every request on this path, we try and parse the query string to pull out a latitude, longitude and optional query parameter.

if (/^(-?\d+(\.\d+)?)$/.test(latitude) && /^(-?\d+(\.\d+)?)$/.test(longitude)) {
    

If we do have valid coordinates, pass through to Mongo to do the actual search:

    collection.find({
      "geo.coordinates": {
        $near: [latitude, longitude]
      }
    }, function(err, cursor) {
      cursor.toArray(function(err, items) {
        writeResponse(items, res);
      });
    });
    

    To add a text search into this, we just need to add one more parameter to the collection.find call:

    var regexQuery = new RegExp(".*" + q + ".*");
    collection.find({
      "geo.coordinates": {
        $near: [latitude, longitude]
      },
      'text': regexQuery
});
    

This is so simple it kind of feels like cheating. Somebody else did all the hard work first.

    App.net Proximity

    This works quite well on the App.net Global Timeline but it'll really become useful once the streaming API is switched on.

    Of course, the code is all there. If you want to have a go yourself, feel free.

    Development, Geek, Guides
