Organic Optic Mix with Christmas Flower



Exploring a semi-generative soundscape with the Christmas flower, Poinsettia, through the playful interfacing gadget @PlayTronica Biotron and a carefully designed @Ableton Live project that translates the incoming MIDI notes into meaningful messages for the @AshunSoundMachines Hydrasynth Desktop to react to. All sounds were generated by a single factory patch on the Hydrasynth, without any external effects. All devices were connected via USB.


How the Live project manipulated the incoming MIDI notes:

Biotron provided two separate channels of monophonic note streams: one generated by changes in the connected plant's natural electrical properties, the other by value changes detected by its light sensor. Each note carried a varying velocity. Hand gestures and my distance from the plant affected the plant stream, while the extra light source that I waved around set the chords.


1st track - The plant stream was sent to a Pitch device that offset the most frequently occurring notes to the root note of the scale, then a Scale device filtered the stream to the major pentatonic scale. This provided the melodic content.
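The pitch-offset-then-quantize step can be sketched in plain Python. This is a minimal illustration of what Live's Pitch and Scale devices do to each note, not the project's exact settings; the root note (C4 = 60) and offset value are assumptions.

```python
# Major pentatonic degrees in semitones above the root.
PENTATONIC = [0, 2, 4, 7, 9]

def snap_to_pentatonic(note: int, root: int = 60, offset: int = 0) -> int:
    """Apply a Pitch-style semitone offset, then quantize the note to the
    nearest major-pentatonic degree relative to the root (like a Scale device)."""
    note += offset
    degree = (note - root) % 12
    octave = (note - root) // 12
    nearest = min(PENTATONIC, key=lambda d: abs(d - degree))
    return root + octave * 12 + nearest

# A stray F#4 (66) lands on the nearest scale tone, G4 (67).
print(snap_to_pentatonic(66))  # 67
```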

2nd track - The light stream was sent to a Pitch device that offset the note emitted when the extra light source was hidden, ensuring the stream always returned to the root note. A Chord device then turned the monophonic notes into rich chords: four extra notes were generated with offset velocities (+7 st, +16 st, +19 st, +24 st), and a per-note delay was applied as a strum to bring out a more humanized feel. Lastly, a Scale device made sure everything snapped to the major scale, so that returning to the root note always emitted the tonic chord.
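The chord-building step above can be sketched as a function that fans one note out into five, with a velocity offset and a strum delay per voice. The semitone offsets come from the text; the strum time and velocity step are illustrative placeholders, not the project's exact values.

```python
# Semitone offsets from the text: the played note plus +7, +16, +19, +24.
CHORD_OFFSETS = [0, 7, 16, 19, 24]

def build_chord(note: int, velocity: int, strum_ms: int = 15, vel_step: int = -8):
    """Return (note, velocity, delay_ms) tuples: each chord voice gets a
    progressively offset velocity and a small strum delay, like the project's
    Chord device with per-note delay."""
    chord = []
    for i, off in enumerate(CHORD_OFFSETS):
        vel = max(1, min(127, velocity + i * vel_step))
        chord.append((note + off, vel, i * strum_ms))
    return chord

# Root note C4 at velocity 100 fans out into a strummed five-voice chord.
print(build_chord(60, 100))
```

A Scale-style quantizer (as in the previous sketch) would then snap each generated voice to the major scale.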

3rd track - The plant stream's velocity was retrieved per note trigger by an Expression Control device with modified, inverted Rise/Fall settings and curve. This generated a slowly fading, varying value, which was mapped to a CC Control device sending a MIDI CC to control Macro 6 (PadVoice) of the Hydrasynth patch (CS80 Darkness RA).

4th track - The light stream's velocity was retrieved per note trigger by an Expression Control device with modified, inverted Rise/Fall settings and curve. This generated a slowly fading, varying value, which was mapped to a CC Control device sending a MIDI CC to control Macro 1 (Wavescan) of the Hydrasynth patch (CS80 Darkness RA).
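The velocity-to-CC behaviour of tracks 3 and 4 can be sketched as a small fall envelope: each note's velocity sets the starting value, which then decays to zero along a curve. The step count and curve exponent here are assumptions, not the project's actual Rise/Fall settings.

```python
def fall_envelope(velocity: int, fall_steps: int = 8, curve: float = 2.0):
    """Start at the note's velocity and decay to 0 over fall_steps values,
    shaped by an exponential-like curve (an Expression Control-style Fall).
    Each returned value would be sent out as a MIDI CC message."""
    values = []
    for i in range(fall_steps + 1):
        frac = 1.0 - i / fall_steps
        values.append(round(velocity * frac ** curve))
    return values

# A velocity-100 trigger with a linear curve over 4 steps.
print(fall_envelope(100, fall_steps=4, curve=1.0))  # [100, 75, 50, 25, 0]
```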

Remark: Tracks 3 & 4 both used a MIDI Effect Rack containing a Chain whose Key range was set outside the incoming notes' range, filtering the notes out so that no duplicate notes were sent to the Hydrasynth.


You can see me exploring how distances, gestures, light angles, etc. affect the chords and melodies, and then interacting with them throughout.

