In playing around with Samplr for iOS, it struck me that it could behave like a mellotron of sorts, too. Sure, Samplr (and other similar apps, like Curtis, csSpectral, and Sector) is great for mashing stuff up in an extreme way, but I wondered whether I could play it a bit more like a piano.
Since Samplr only slices samples into increments of four, I output a little more than two octaves of a synth sound in G# minor. Then it was as simple as loading that sample in, selecting 16 slices, and then playing Samplr as a keyboard.
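To see why 16 slices line up so neatly with a rendered scale, here is a minimal sketch: 16 consecutive degrees of G# natural minor span just over two octaves, so evenly spaced slices of a two-octave-plus scale recording each land on one note. The MIDI numbers and starting register are illustrative assumptions, not taken from the app.

```python
# Map 16 sample slices onto consecutive degrees of G# natural minor.
# Starting register (MIDI 56 = G#3) is an assumption for illustration.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MINOR_STEPS = [2, 1, 2, 2, 1, 2, 2]  # whole/half-step pattern of natural minor

def scale_notes(root_midi, count):
    """Return `count` consecutive scale-degree MIDI numbers from the root."""
    notes, n, i = [], root_midi, 0
    while len(notes) < count:
        notes.append(n)
        n += MINOR_STEPS[i % 7]
        i += 1
    return notes

slices = scale_notes(56, 16)                  # one note per Samplr slice
names = [NOTE_NAMES[n % 12] for n in slices]  # e.g. G#, A#, B, C#, ...
```

Sixteen degrees cover two full octaves plus two extra notes, which matches the "little more than two octaves" rendered above.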
The results were odd, glitchy, loose, and interesting. I liked that I could hold chords while also dialing in reverb, delay, even playback loop length. The loop points were obvious, but at the right lengths and tempos they became rhythmic or simply textural. When playing single melodies, dragging my finger between slices simulated piano runs. Then, inspired by Mr. Cortini’s solo albums, I decided to make a track with samples from a single synth, played back from Samplr.
Today’s audio clip features a live recording in multiple passes. All of the sounds are from the TAL Bassline101, an amazing emulator of the Roland SH101. I output only two audio files, each with a unique patch but in the same pitch range and scale, and the rest of the variations are from Samplr.
Today’s track comprises several ominous improvisations with analog synths: Korg Volca Beats, Korg Volca Bass, Waldorf Pulse +, and Grendel Drone Commander. A couple of virtual analog software synths were also used to round things out in the edit.
Nice to get back into the pure subtractive synthesis world for a while, in terms of sound design. I need to grow more hands if I’m to tweak this many knobs at once…
I love handmade soundmaking devices, but outside of my beloved Grendel Drone Commander, a lot of the weird noise boxes and effects I have are, well, noisy. They tend to be aggressive, loud, and blippy. Some accept MIDI, some accept CV, some accept no sync signal at all.
One evening I wondered if I could coax them into some semblance of ambient drones, to loosen myself up and not record to a fixed tempo, and to not get too “precious” with editing in post. Somehow the angry nature of these devices just seems to bleed through anyway. Or is that my angry nature?
So, the result of this cathartic experiment was “angry ambient.” Or, angrient.
This track features the following:
All takes recorded live into Logic Pro X: No sync to anything, no MIDI, no CV.
One track of a Bleep Labs Nebulophone, with its alligator clip clamped onto a key for a sustained drone, recorded through a Red Panda Particle pedal set to Reverse, both tweaked live. The dry and effected tracks were recorded simultaneously.
Another droned Nebulophone track went through the Particle set to Delay, and then through a Seppuku Memory Loss pedal, with its clean microchip inserted, all three tweaked live. The dry and effected tracks were recorded simultaneously.
One track of the RareWaves Grendel Drone Commander, recorded 100% dry. That thing needs no love, especially when its bandpass filter gets overdriven at low frequencies. Yummy.
Sometimes sound design requires thinking inside of multiple boxes.
I’ve developed a small collection of handmade and boutique electronic effects and instruments over the years, like the Grendel Drone Commander, Lite2 Sound PX, and many more (perhaps the subject of another post). Longtime readers may recall that I just love supporting independent makers and small cottage industries: That’s where all the weird, truly innovative stuff happens, and I (like many of you, dear readers) am more interested in cool sound design possibilities than straight-up distorted guitarrrrrrrr sounds.
Hand-built one at a time by Eric Archer, the Grendel Drone Commander is a two-oscillator synth built inside of a metal surplus ammo box. Its apparent simplicity belies its sonic complexity. I’m still feeling my way around the thing, but I wanted to post an example of what it makes possible. (Next step: Play with CV control!)
This heavy, drone-y, smeary track was created using only the Grendel Drone Commander, recorded live three times, each on a different track, in Logic Pro (with a few plug-ins as well).
TouchTones was inspired by the work of Toshio Iwai and originally conceived (and entirely developed) by the insanely talented Josh Santangelo; I led the creative direction and interaction design, and I also created all the sounds for the piece. Our goal in making TouchTones was to ensure that anyone could use it with only a few seconds of exploration, and create beautiful music without any musical training. It was all about immediacy and richness, and the sound needed to support this.
TouchTones is a grid-based music sequencer: the user sets a sprite in motion that, when passing over a grid node, makes a specific sound. Each sprite is a different instrument moving at its own speed, but all are locked to a master tempo. There are four sprites (voices) and 32 nodes (pitches/notes).
The main challenge was placing notes on the grid. I started by composing short pieces of music that featured a lot of arpeggios of varying note durations, which mimicked how the nodes on the grid would get triggered. This helped me figure out the best note durations for certain sounds, and to establish a key to work in. Since the user is the one who creates the final melody, the only way to really stress-test the sounds and key was to prototype and have real people play with it.
The sound palette itself went through several iterations. The first featured somewhat realistic sounds with a pretty complex scale, so the likelihood of atonality was too high. The second iteration featured purely electronic sounds in a more harmonious scale, but the sounds were too aggressive (probably owing to my own past attraction towards angry music). The third and final iteration finally hit the mark: Cleaner, primarily acoustic sounds, a key that’s pleasant and even a bit wistful, and a note distribution that isn’t always linear, preventing unnatural shifts into inappropriate pitch registers. Internally, we jokingly call the final result the “indie film about autumn in Central Park” palette.
All the sounds were created in Logic Pro, primarily using the EXS24 sampler. A lot of tonal and envelope tweaking ensued. Rather than provide sound clips like I usually do, I encourage you to watch the embedded video above to get a sense of how the application feels and sounds.
All my posts to date have featured what’s newest to me: sound gathering in the field and only slight manipulations to said sounds. But synthesis is a longtime love of mine. In my studio, hundreds of small snippets of synthesized sounds exist scattered across terabytes of hard disk space. I usually have no clue what the source material was, or how I created them.
Luckily, I (re)discovered several unusually well-documented synthesized sounds for this post: a collection of samples that were oriented towards making impactful, short sci-fi sounds, but created using virtual synthesizers in software rather than real recordings. These sounds all wound up evoking lasers, blasters, and other sci-fi energy weapons, or discrete layer elements for the same.
Ben Burtt defined this sound for generations with his struck-guy-wire laser blasts in Star Wars, and I (like most) tend to agree that these real sound sources make a big difference in the complexity and character of the final sound. But synthesizing these sounds from scratch is a fun exercise, as well: deconstructing what works about that classic sound (amplitudes of high and low frequencies offset in time), figuring out how to execute it, and then modifying the sound for different emotional effects.
(I’ve found other real-world objects that also make Burtt-style blaster sounds, which will be featured in an upcoming post!)
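The deconstruction above, high and low frequency components offset in time, can be sketched with a bit of NumPy. Every specific here (frequencies, decay rates, the 30 ms offset) is an illustrative guess at the general recipe, not Burtt's method or the patches used on these samples.

```python
import numpy as np

SR = 44100                                   # sample rate in Hz
DUR = 0.4                                    # total length in seconds
t = np.linspace(0, DUR, int(SR * DUR), endpoint=False)

# High component: a bright, fast-decaying downward sweep -- the "crack."
high = np.sin(2 * np.pi * (2200 - 3000 * t) * t) * np.exp(-t * 18)

# Low component: delayed ~30 ms and decaying more slowly -- the body
# of the blast arriving just after the initial transient.
delay = int(SR * 0.03)
low_t = np.clip(t - 0.03, 0, None)
low = np.sin(2 * np.pi * 140 * low_t) * np.exp(-low_t * 8)
low[:delay] = 0.0

blast = high + 0.8 * low
blast /= np.max(np.abs(blast))               # normalize to -1..1
```

Shifting the delay, swapping the decay rates, or detuning the low oscillator changes the emotional read of the hit, which is the "modifying the sound for different effects" step described above.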