thingsinjars

  • 31 Dec 2021

    RFID Timesheet

    I've done a lot of projects over the holidays. This is a quick collection of notes to remind myself later.

    I used an RC522 RFID scanner (originally part of a Tonuino project) and wired it to a Particle Photon. Whenever an RFID tag was held to it, it would publish an event containing the ID of the card to the Particle Cloud. When the card was removed, it would publish a blank event. This is the code from the Photon:

    // Photon   RFID-RC522
    //   A2         SDA
    //   A3         SCK
    //   A4         MISO
    //   A5         MOSI
    //   D2         RST
    //   GND        GND
    //   3V3        3.3V
    
    #include <SPI.h>
    #include <MFRC522.h>
    
    #define LED_PIN     D7
    
    constexpr uint8_t RST_PIN = D2;          // Configurable, see typical pin layout above
    constexpr uint8_t SS_PIN =  A2;           // Configurable, see typical pin layout above
    
    MFRC522 mfrc522(SS_PIN, RST_PIN);  // Create MFRC522 instance
    
    uint32_t lastMillis = 0;
    bool wasPresent = false;
    
    void setup() {
        pinMode(LED_PIN, OUTPUT);
        Serial.begin(9600);                 // Initialize serial communications with the PC
        while (!Serial);                    // Do nothing until a serial port is opened
        SPI.begin();                        // Init SPI bus
        mfrc522.PCD_Init();                 // Init MFRC522
        mfrc522.PCD_DumpVersionToSerial();  // Show details of the MFRC522 card reader
        Serial.println(F("Scan PICC to see UID, SAK, type, and data blocks..."));
    }
    
    void loop() {
        // Look for new cards
        if (!mfrc522.PICC_IsNewCardPresent()) {
            // PICC_IsNewCardPresent() can return false intermittently while a card
            // is still on the reader, so check twice before reporting a removal
            if (wasPresent && !mfrc522.PICC_IsNewCardPresent()) {
                Serial.println("No card");
                Particle.publish("rfid_scan", "", 60, PRIVATE);
                wasPresent = false;
            }
            return;
        }

        // Select one of the cards
        if (!mfrc522.PICC_ReadCardSerial()) {
            return;
        }

        // Build a hex string from the card's UID
        char cardID[32] = "";
        for (byte i = 0; i < mfrc522.uid.size; i++) {
            char hex[4];
            snprintf(hex, sizeof(hex), "%02x", mfrc522.uid.uidByte[i]);
            // Append without overflowing the buffer: leave room for the terminator
            strncat(cardID, hex, sizeof(cardID) - strlen(cardID) - 1);
        }

        // Rate-limit to one event per second
        if (millis() - lastMillis < 1000) {
            return;
        }
        lastMillis = millis();

        if (!wasPresent) {
            wasPresent = true;
            Particle.publish("rfid_scan", cardID, 60, PRIVATE);
            Serial.printlnf("Card: %s", cardID);

            // Turn on the LED
            digitalWrite(LED_PIN, HIGH);

            // Leave it on for one second
            delay(1s);

            // Turn it off
            digitalWrite(LED_PIN, LOW);

            // Wait one more second
            delay(1s);
        }

        // Dump debug info about the card; PICC_HaltA() is automatically called
        //mfrc522.PICC_DumpToSerial(&(mfrc522.uid));
    }

    I then used IFTTT to read these events and write them to a Google Spreadsheet. This is the formatted row the IFTTT Google Sheets action adds for each event:

    {{CreatedAt}} |||{{EventContents}}||| =IF(ISODD(ROW()), "Started", "Stopped") ||| =IF(ISEVEN(ROW()),ROUND(((DATEVALUE(REGEXEXTRACT(INDIRECT(ADDRESS(ROW(),COLUMN()-3,4)), "\w+ \d{2}, \d{4}")) + TIMEVALUE(REGEXEXTRACT(INDIRECT(ADDRESS(ROW(),COLUMN()-3,4)), "\d{2}:\d{2}[A|P]M$"))) - ( DATEVALUE(REGEXEXTRACT(INDIRECT(ADDRESS(ROW()-1,COLUMN()-3,4)), "\w+ \d{2}, \d{4}")) + TIMEVALUE(REGEXEXTRACT(INDIRECT(ADDRESS(ROW()-1,COLUMN()-3,4)), "\d{2}:\d{2}[A|P]M$")))) * 24, 2),"")|||
    =IFERROR(VLOOKUP(INDIRECT(ADDRESS(ROW(), COLUMN()-3, 4)), I$3:J$10, 2, FALSE), "")
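    Each "Stopped" row pairs with the "Started" row above it, and the formula works out the elapsed hours. For clarity, here's the same calculation as a plain JavaScript sketch (just an illustration, not part of the IFTTT setup):

    // Elapsed hours between a Started/Stopped pair of timestamps,
    // rounded to 2 decimal places - the same maths as the sheet formula
    function elapsedHours(startedAt, stoppedAt) {
      const ms = new Date(stoppedAt) - new Date(startedAt);
      return Math.round((ms / 3600000) * 100) / 100;
    }

    // e.g. elapsedHours('2021-12-31T09:15:00', '2021-12-31T11:45:00') === 2.5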

    We now have a physical time tracker that can be used to log time spent on individual projects. Super-handy for Jenni's freelance work.

    Geek

  • 31 Dec 2021

    IoT Button

    I've done a lot of projects over the holidays. This is a quick collection of notes to remind myself later.

    I took the insides out of an old IKEA Spøka nightlight and squeezed in a Particle Photon, a battery shield and a battery, then soldered the nightlight's on/off switch onto some jumper cables and wired that in. I now have an internet-connected button that looks cute.

    Still no idea what to do with it but it’s fun.

    Here's the code that's running on the Photon:

    int led = D7;        // Built-in LED
    int pushButton = D6; // Old Spøka momentary switch

    bool wasUp = true;

    void setup() {
        pinMode(led, OUTPUT);
        pinMode(pushButton, INPUT_PULLUP);
    }

    void loop() {
        int pushButtonState = digitalRead(pushButton);

        if (pushButtonState == LOW) { // The button is pushed down
            digitalWrite(led, HIGH);  // Turn ON the LED
            if (wasUp) {
                Particle.publish("Spooky pressed");
                wasUp = false;
            }
        } else {
            digitalWrite(led, LOW);   // Turn OFF the LED
            wasUp = true;
        }
    }

    When you press the button, you get a message published to the Particle Cloud.
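    If you want to react to that event outside the Particle console, here's a minimal sketch using the particle-api-js library from Node (the token handling here is an assumption; use your own Particle access token):

    // Listen for the button's event via the Particle Cloud
    const Particle = require('particle-api-js');
    const particle = new Particle();
    const token = process.env.PARTICLE_TOKEN; // assumed: your Particle access token

    particle.getEventStream({ deviceId: 'mine', name: 'Spooky pressed', auth: token })
      .then((stream) => {
        stream.on('event', (event) => {
          console.log('Button pressed at', event.published_at);
        });
      });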

    Geek

  • 15 Mar 2021

    Map Web Component

    Geek

  • 4 Jan 2021

    Using Web APIs to generate music videos

    Plan

    A very simple web app that generates a music visualisation from an audio file and renders it to a downloadable movie file. Most importantly, it does this all on the client side.

    To do this, we'll combine the Web Audio API, Canvas API and MediaStream Recording API. I won't be providing the complete source code as we go through, just enough to point you in the right direction.

    For more detail, I recommend the following:

    • Visualizations with Web Audio API (MDN)
    • Record Audio and Video with MediaRecorder

    I've used this as the basis to create all the music videos for my next album. You can see examples in my previous post.

    TL;DR

    Here's the complete architecture of what we're building:

    [Diagram: audio file → <audio> element → Web Audio AnalyserNode → canvas render loop → MediaRecorder → downloadable webm]

    And here's the finished build: All-In-One Music Visualiser

    Basic HTML

    First, let's just set up some basic HTML to show the visualisation, hold the audio and take in some parameters.

    <canvas></canvas>
    <audio controls></audio>
    
    <label>Track title:
      <input type="text" id="track" placeholder="Track title">
    </label>
    <label>Artist name: 
      <input type="text" id="artist" placeholder="Artist name">
    </label>
    <label>Audio File:
      <input type="file" id="file" accept="audio/*" />
    </label>

    Read in audio file

    const audio = document.querySelector('audio');
    audio.src = URL.createObjectURL(document.querySelector('[type=file]').files[0]);
    audio.load();

    Web Audio API

    Wire up Web Audio

    Create an AudioContext, source the audio from the <audio> element, prepare a streaming destination for later and connect the in to the out.

    const context = new AudioContext();
    const src = context.createMediaElementSource(audio);
    const dest = context.createMediaStreamDestination();
    src.connect(dest);

    Attach an analyser node

    We want our visualisation to react to the music so we need to wire in an AnalyserNode. Thankfully, this handles all the complicated audio processing so we don't need to worry about it too much. We also attach the analyser to the destination node of the AudioContext so that we can hear it through the computer speakers. Not strictly necessary but it's nice to be able to hear what we're doing.

    const analyser = context.createAnalyser();
    src.connect(analyser);
    analyser.connect(context.destination);

    Prepare to sample frequency

    The fftSize is essentially "How detailed do we want the analysis to be?" It's more complicated than that, but this is all the detail we need for now. Here, we're using 64, which is very low; 512, 1024 and 2048 are all good. It depends on the actual visualisations you want to produce at the other end.

    The smoothingTimeConstant is approximately "How much do we want each sample frame to be like the previous one?". Too low and the visualisation is very jerky, too high and it barely changes.

    analyser.fftSize = 64;
    analyser.smoothingTimeConstant = 0.8;
    
    const bufferLength = analyser.frequencyBinCount;
    
    // Prepare the array to hold the analysed frequencies
    const frequency = new Uint8Array(bufferLength);

    Finally grab the values from the <input> elements.

    const titleText = document.querySelector("#track").value.toUpperCase();
    const artistText = document.querySelector("#artist").value.toUpperCase();

    Canvas API

    Now we do the standard canvas setup – grab a reference to the canvas, set our render size (doesn't need to be the same as the visible size of the canvas) and prepare a 2d context.

    Prepare Canvas

    const canvas = document.querySelector("canvas");
    canvas.width = 1280;
    canvas.height = 720;
    const ctx = canvas.getContext("2d");

    Render loop

    Execute this on every render frame.

    function renderFrame() {
      // schedule the next render
      requestAnimationFrame(renderFrame);
    
      // Grab the frequency analysis of the current frame
      analyser.getByteFrequencyData(frequency);
    
      // Draw the various elements of the visualisation
      // This bit is easy to modify into a plugin structure.
      drawBackground(ctx);
      drawBars(ctx, frequency);
      drawText(ctx, titleText, artistText);
    }

    The various visualisation elements:

    function drawBackground(ctx) {
      ctx.fillStyle = 'white';
      ctx.fillRect(0, 0, canvas.width, canvas.height);
    }
    function drawBars(ctx, frequency) {
      const widthOfEachBar = (canvas.width / frequency.length);
      let x = 0; // left edge of the current bar
    
      // Loop over data array
      for (let i = 0; i < frequency.length; i++) {
        const heightOfThisBar = frequency[i];
    
        // Base the colour of the bar on its index
        const h = 360 * (i / frequency.length);
        // Base the saturation on its height
        const s = 100 * (heightOfThisBar / 256);
        const l = 50;
        const color = `hsl(${h}, ${s}%, ${l}%)`;
    
        ctx.fillStyle = color;
    
        // Add a little shadow/glow around each bar
        ctx.shadowBlur = 20;
        ctx.shadowColor = color;
    
        // Draw the individual bar
        ctx.fillRect(x, canvas.height - heightOfThisBar, widthOfEachBar - 1, heightOfThisBar);
    
        x += widthOfEachBar;
      }
    }
    function drawText(ctx, titleText, artistText) {
        ctx.fillStyle = 'white';
        ctx.textAlign = 'center';
        ctx.font = '4em sans-serif';
        ctx.fillText(titleText, canvas.width/2, canvas.height/2 - 25);
        ctx.fillText(artistText, canvas.width/2, canvas.height/2 + 25);
    }

    By this point, we can load a supported audio file and see some pretty pictures reacting to the music.
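    The snippets above still need a little glue to kick everything off. One way (the event wiring here is an assumption, not from the original build) is to start playback and the render loop once a file is chosen:

    // Start everything once the user picks a file
    document.querySelector('#file').addEventListener('change', async (e) => {
      audio.src = URL.createObjectURL(e.target.files[0]);
      audio.load();
      await context.resume(); // AudioContexts start suspended until a user gesture
      await audio.play();
      renderFrame();          // kick off the visualisation loop
    });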

    MediaRecorder

    This section is copied almost word-for-word from StackOverflow

    First, we'll create a combined MediaStream object from the audio data and the canvas data.

    const chunks = []; // here we will store our recorded media chunks (Blobs)
    const stream = canvas.captureStream(); // grab our canvas MediaStream
    
    const combined = new MediaStream([
      ...stream.getTracks(),
      ...dest.stream.getTracks(),
    ]);

    Next, we start recording that data chunk-by-chunk. We'll save it as webm.

    const rec = new MediaRecorder(combined, {
      audioBitsPerSecond : 128000,
      videoBitsPerSecond : 2500000,
      mimeType : 'video/webm'
    });
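    Browsers differ in which container/codec combinations they can record, so before constructing the recorder it can be worth a quick support check (an optional guard, not part of the original):

    // Optional: confirm this browser can actually record webm,
    // since MediaRecorder throws if the mimeType is unsupported
    if (!MediaRecorder.isTypeSupported('video/webm')) {
      console.warn('video/webm recording is not supported in this browser');
    }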

    When the recorder receives a chunk of data, store it in memory.

    rec.ondataavailable = (e) => chunks.push(e.data);

    When we finish recording, combine the chunks and send them to an export method.

    rec.onstop = (e) => exportVid(new Blob(chunks, { type: "video/webm" }));
    rec.start();

    This is sometimes necessary to avoid async timing issues when loading the audio data; it doesn't hurt, at least. Once the duration is known, we schedule the recorder to stop when the track ends.

    audio.addEventListener("durationchange", () => {
      setTimeout(() => rec.stop(), Math.floor(audio.duration * 1000));
    });

    The final video export. We convert the combined chunks (the Blob) into an object URL and create an anchor that lets us download it from the browser.

    function exportVid(blob) {
      const a = document.createElement("a");
      a.download = "myvid.webm";
      a.href = URL.createObjectURL(blob);
      a.textContent = "download";
      document.body.appendChild(a);
    }

    This export call will be triggered after the audio has finished. You load an audio file, watch the visualisation play through and, when it's finished, click the download link to get a webm.


    So now, we have the basis for a complete music video generator – audio in, video file out.

    Try the basic version out – All-In-One Music Visualiser.

    Or watch a video generated using this basic version (as a bonus, this music was also automatically generated in the browser, but that's a topic for another day):

    [Embedded video]

    Don't forget, you can also find me on Spotify.

    Javascript, Geek, Design, Development

