Every last Friday of the month, people all over the world come together to ride their bikes in what is called a Critical Mass. The idea is that by riding in a convoy, you gain certain special rights (at least in Germany). The most important one is that the group gets to behave essentially like one large vehicle, meaning that if the head of the group crosses a traffic light while it is green, everyone gets to follow.

Since this isn’t an official demonstration, there are no leaders and no predetermined route. The group just goes wherever the people at the head go, which makes it difficult to find the group if you are late. Recognizing this, some people built a very simple mobile app a while ago: you start it when you are riding in a Critical Mass, share your location, and you get to see everyone else’s location.

A couple of years ago, I talked to Dirk about using the data that is created in the process for some visualisation. Last year I finally implemented a simple system for building visualisations, with results like this:

I tried to build this with technology that is as boring as possible. There are three parts to this: storing, processing and visualising the data.

Data retrieval

I wrote a very small Python script which talks to the Critical-Maps API and stores all of the current data in a newly created sqlite file for each day that the script is run. The script is pretty trivial: we create the table schema (which is just a timestamp and a text field to store the JSON returned by the API), and start an infinite loop which writes the newest data every 30 seconds:

import urllib.request
import sqlite3
from time import sleep
from datetime import date

FILENAME = f"/home/knut/critical-tracks/criticaltracks-{date.today().isoformat()}.sqlite"

def get_data():
    # finding the right URL is left as an exercise to the reader
    with urllib.request.urlopen(URL) as url:
        return url.read().decode()

def add_row(cur, con, data: str):
    cur.execute('insert into tracks (data) values(?);', (data,))
    con.commit()

def main():
    con = sqlite3.connect(FILENAME)
    cur = con.cursor()
    cur.execute('CREATE TABLE IF NOT EXISTS tracks (timestamp datetime default current_timestamp, data text);')

    while True:
        try:
            data = get_data()
            add_row(cur, con, data)
        except Exception:
            print('failed to get or add')
        sleep(30)

if __name__ == "__main__":
    main()
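As a quick aside, here is a small sketch of how that schema behaves: sqlite fills in the timestamp column by itself, so the collector only ever inserts the raw JSON payload (the payload and the in-memory database here are placeholders, not the real API response or file):

```python
import sqlite3

# In-memory stand-in for the daily sqlite file.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute('CREATE TABLE IF NOT EXISTS tracks (timestamp datetime default current_timestamp, data text);')

# Only the data column is supplied; current_timestamp fills the rest.
cur.execute('insert into tracks (data) values(?);', ('{"example": true}',))
con.commit()

row = cur.execute('select timestamp, data from tracks').fetchone()
print(row)
```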
I wrote a separate blogpost about using systemd timers to automatically start this script on the right day of the month.
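For completeness, such a timer pair might look roughly like this (unit names, paths, and the "check the date in the script" approach are illustrative, not the exact setup from that post):

```ini
# criticaltracks.service
[Unit]
Description=Collect Critical Maps locations

[Service]
Type=simple
ExecStart=/usr/bin/python3 /home/knut/critical-tracks/collect.py

# criticaltracks.timer
[Unit]
Description=Start the collector on Friday evenings

[Timer]
# Fire every Friday; the script itself can check whether this
# Friday is the last one of the month before doing anything.
OnCalendar=Fri *-*-* 18:00:00

[Install]
WantedBy=timers.target
```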

Data processing

On Saturday morning, I log onto the server and stop the script. I then download the data locally and run a processing script on it. The goal here is to protect riders’ privacy a bit. By default, all entries are exposed in the API. In the visualisation, though, I don’t want people to be able to see where others ride on their own once they go home. I decided to only show those points which have at least two neighbours within 100 meters. This means that a dot is visible as long as a rider is part of the group, but gets removed once they separate.

I had originally written this as a little Python script but could not bear that it ran for over a minute once a month. So I did the only reasonable thing and rewrote it in Rust, which increased performance by a factor of 75 if I remember correctly. The script reads the sqlite file, filters the data and prints the filtered data as JSON to stdout. The data format is still pretty simple: a list of entries that each have a timestamp and a list of GeoJSON point features. I run the script and redirect the output to a data.json file at the correct path. The important part of the script currently looks like this:

    fn may_show(point: &Location, points: &[Location]) -> bool {
        const NEIGHBORS: u8 = 3;
        const DISTANCE: f32 = 100.0;
        let mut found = 0;
        for candidate in points {
            if get_distance(candidate, point) < DISTANCE {
                // The point itself always matches (distance 0), so this
                // effectively requires two *other* riders within 100 m.
                found += 1;
                if found == NEIGHBORS {
                    return true;
                }
            }
        }
        false
    }

    let mut results: Vec<ResultRow> = Vec::new();
    for row in result_iter {
        let res = row.unwrap();
        let entry: Entry = serde_json::from_str(res.data.as_str()).unwrap();
        // The API reports coordinates as integer microdegrees.
        let points: Vec<Location> = entry
            .locations
            .iter()
            .map(|l| Location {
                latitude: l.latitude as f32 / 1_000_000.,
                longitude: l.longitude as f32 / 1_000_000.,
            })
            .collect();
        let mut filtered_points: Vec<Feature> = Vec::new();
        for point in &points {
            if may_show(point, &points) {
                filtered_points.push(Feature {
                    r#type: "Feature".into(),
                    geometry: Point {
                        r#type: "Point".into(),
                        coordinates: [point.longitude, point.latitude],
                    },
                });
            }
        }
        eprintln!("{}", res.timestamp);
        if !filtered_points.is_empty() {
            results.push(ResultRow {
                timestamp: res.timestamp,
                data: filtered_points,
            });
        }
    }
    let json = serde_json::to_string(&results).unwrap();
    println!("{}", json);

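To make the filter concrete, here is the same logic as a small Python sketch. The post does not show the `get_distance` helper, so the haversine implementation here is an assumption about what it computes (great-circle distance in metres):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_m(a, b):
    """Great-circle distance in metres between two (lat, lon) tuples."""
    lat1, lon1 = map(radians, a)
    lat2, lon2 = map(radians, b)
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(h))

def may_show(point, points, neighbors=3, distance_m=100.0):
    # A point is always within 0 m of itself, so requiring three hits
    # means "at least two *other* riders within distance_m".
    found = 0
    for candidate in points:
        if haversine_m(candidate, point) < distance_m:
            found += 1
            if found == neighbors:
                return True
    return False

# Three riders bunched together plus one straggler far away.
riders = [(52.5, 13.4), (52.5001, 13.4), (52.5, 13.4001), (53.0, 13.4)]
print([may_show(p, riders) for p in riders])  # → [True, True, True, False]
```

The straggler drops out because the only point within 100 m of it is itself.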

Finally, the data is visualised using mapbox-gl. I think I initially chose mapbox-gl over plain Leaflet thinking that there would be too many points to just render as plain DOM nodes. I deliberately chose not to use any other frameworks since I just wanted to keep this as simple as possible. The JavaScript code can also be found on GitHub. The code looks essentially like this (omitting some code for manual time-scrubbing here):

    fetch('data.json')
        .then(response => response.json())
        .then(json => {
            map.on('load', () => {
                map.addSource('bikes', {
                    'type': 'geojson',
                    data: {
                        "type": "FeatureCollection",
                        features: []
                    }
                });
                map.addLayer({
                    'id': 'bikes',
                    'source': 'bikes',
                    'type': 'circle',
                    'paint': {
                        'circle-radius': 5,
                        'circle-color': 'rgba(241,202,17,0.69)'
                    }
                });

                const input = document.querySelector('input')
                input.max = json.length;

                const interval = setInterval(() => {
                    if (input.value >= json.length) {
                        clearInterval(interval);
                        return;
                    }
                    map.getSource('bikes').setData({
                        "type": "FeatureCollection",
                        features: json[input.value].data
                    });
                    input.value = Number(input.value) + 1;
                }, 100);
            });
        });


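For reference, the data.json produced by the Rust script and consumed here has roughly this shape (field names follow the `ResultRow` struct above; the timestamp and coordinates are made-up example values):

```json
[
  {
    "timestamp": "2024-05-31 18:00:00",
    "data": [
      {
        "type": "Feature",
        "geometry": { "type": "Point", "coordinates": [13.4, 52.5] }
      }
    ]
  }
]
```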
A version with full interactivity (and also data for all cities - just zoom out) can be found at criticaltracks.k-nut.eu.