2016 featured · music · releases · album · generative · web

microscale

A web-based generative album built in real time from random Wikipedia articles

microscale could be framed as a question: how do you translate one medium into another directly, so that the original meaning becomes unrecognizable but a new one appears? In truth, though, it started with a dumb random idea.

The project is a web-based, generative album created in real time from random Wikipedia articles. On the page, you see six articles at once. They are unrelated in content, but in the system they become tightly connected: each article functions as a step sequencer where the letters are the steps.

The track titles are regular expressions (for example: [module], [random]). Those expressions decide which letters are “active”. When the sequencer moves through the text and lands on an active letter, it triggers a sound event. The result is a piece that feels strangely coherent, even though its source material is literally random.
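The mechanism can be sketched in a few lines of Python. This is a hedged illustration, not the project's actual code: it assumes a title like [module] is an ordinary regex character class (matching any of the letters m, o, d, u, l, e), and it stands in for a real audio trigger with a simple event list.

```python
import re

def sequence(text, pattern):
    """Treat each character of `text` as one sequencer step.

    A step is 'active' when its character matches `pattern`; an active
    step would trigger a sound event. Here we just collect the events
    as (step index, letter) pairs instead of playing audio.
    """
    rx = re.compile(pattern)
    return [(i, ch) for i, ch in enumerate(text) if rx.fullmatch(ch)]

# "[module]" is a character class: any of m, o, d, u, l, e is active.
print(sequence("random wikipedia", "[module]"))
```

Running the sequencer over different source texts with the same pattern yields different rhythms from the same rule, which is roughly where the album's coherence-from-randomness comes from.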

The web version is also hackable: you can change the expressions, tempo, and sound sources and use it as an instrument rather than an album.

Almost all sounds on microscale were produced with Eera, a synthesizer I wrote myself.

Alongside the living web version, there is a rendered release published by Preserved Sound: a limited CDR edition, with each copy rendered from its own generative run (so every copy is different), and a fixed digital download.

Released in June 2017. Cover photo by Roman Palchenkov.