Emute Lab @ Sussex

Experimental Music Technologies Lab


Artist talk: Timothy Didymus, Kosmiches Glass

:::: Wednesday, October 12th, 1pm, @ Jane Attenborough Studio, Attenborough Centre ::::

Brighton musician and maker Timothy Didymus will present his glass harmonica project, Kosmiches Glass.

Twelve tuned (brandy) glasses are mounted on MIDI-controllable turntables, creating a playable/scriptable mechanical acoustic instrument with a beautiful polyphonic voice.
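For readers curious about the MIDI side of such an instrument: the standard conversion between a glass's resonant frequency and the nearest MIDI note number can be sketched as below. The frequencies used are purely illustrative — the actual tunings of the twelve glasses are not documented here.

```python
import math

def freq_to_midi(freq_hz: float) -> int:
    """Nearest MIDI note number for a frequency (A4 = 440 Hz = note 69)."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_name(note: int) -> str:
    """Human-readable note name for a MIDI note number."""
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return f"{names[note % 12]}{note // 12 - 1}"

# Hypothetical tunings for three of the glasses:
for f in (261.6, 329.6, 392.0):
    n = freq_to_midi(f)
    print(f"{f} Hz -> MIDI {n} ({midi_to_name(n)})")
```

Each glass could then be addressed as a MIDI note, with note-on/note-off messages starting and stopping its turntable.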

Following a demonstration of the glasses in action, Timothy will talk about the inspiration behind the project and its development, and will be happy to take questions on any aspect – aesthetic, technical, logistical etc.

This will be of interest to music, music tech and sonic media students – or anyone with an interest in new musical instruments.

“Its design and engineering is of great elegance. (…) its music of transparency and transience, the sonorous resonances of heavenly voices” — Sound Artist Max Eastley, 2015



Free entry.


Research Fellow in Digital Humanities/Digital Performance

The Sussex Humanities Lab, in collaboration with the School of Media, Film and Music at the University of Sussex, wishes to appoint a Research Fellow to a fixed-term (4-year) fellowship in Digital Humanities/Digital Performance. While based in the School of Media, Film and Music, the appointee will work across the Sussex Humanities Lab in collaboration with colleagues in History, Art History & Philosophy, Informatics, and Education and Social Work.

The ideal candidate will have a demonstrable track record of work in performance technologies as a theorist and/or creative practitioner, with clear evidence of technical expertise in all cases. Candidates with knowledge of one or more of the following are particularly encouraged to apply: creative software (e.g. SuperCollider, Max/MSP or Pure Data), app development, graphics and games programming (e.g. OpenGL, Unity), and physical computing.

Further information on Sussex Jobs

A PDF with the job description can be downloaded here: Research Fellow in Digital Humanities/Digital Performance

Phantom Terrains with Daniel Jones

:::: Wednesday, April 15th, 2pm @ Recital Room, Falmer House 120 ::::

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world, producing large-scale sculptural sound installations and systems that translate patterns and processes into musical forms. Daniel will discuss two recent works — Living Symphonies, a touring outdoor piece that grows in the same way as a forest ecosystem, and Phantom Terrains, a platform that enables its hearing-impaired wearer to hear the surrounding landscape of wifi networks — and argue that these two seemingly disparate outputs inhabit the same creative spectrum.

Daniel Jones

Short biography:

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world. This manifests itself in both scientific and artistic output: he has published work on process composition, creativity theory, systems ecology and artificial life, and exhibits his sound work internationally, harnessing algorithmic processes to create self-generating artworks.

Recent works include Phantom Terrains (with Frank Swain, 2014), a platform for ubiquitous sonification of wireless network landscapes; Living Symphonies (with James Bulley, 2014–), a landscape sound work that grows in the same way as a forest ecosystem; Global Breakfast Radio (with Seb Emina, 2014–), an autonomous radio station that broadcasts live radio from wherever the sun is rising; The Listening Machine (with Peter Gregson, 2012), a 6-month-long online composition which translates social network dynamics into a piece of orchestral music, recorded with Britten Sinfonia and commissioned by the BBC/Arts Council’s The Space; Variable 4 (with James Bulley, 2011), an outdoor sound installation which transforms live weather conditions into musical patterns; Maelstrom (with James Bulley, 2012), which uses audio material from media-publishing websites as a distributed, virtual orchestra; Horizontal Transmission (2011), a digital simulation of bacterial communication mechanisms; and AtomSwarm (2006–2009), a musical performance system based upon swarm dynamics.

Daniel’s engineering work includes Chirp, a platform and iOS app for sharing information over sound, shortlisted for the Design Museum’s Designs of the Year, and the 3D audio engine for the mobile games Papa Sangre and The Nightjar, nominated for two BAFTAs, including “Audio Achievement”. He co-ordinated the technical infrastructure for The Fragmented Orchestra, winner of the prestigious PRSF New Music Award 2008, and was more recently a fellow on the Mozilla Webmaker programme.

Tom Betts: The Sublime in Computer Games: Procedurally Generated Audiovisual Worlds

:::: Friday, March 20th, 2-4pm, @ Recital Room, Falmer House 120 ::::

In this seminar, musician and game designer Tom Betts will present his research and talk about the role of the sublime in the production of audiovisual digital work. He will briefly describe the history of the sublime and how it can be located in contemporary digital art and audio. He will describe how software can be programmed to generate these experiences, presenting states where the digital world appears boundless and autonomous. The talk will develop the concept of the digital sublime through the writing of Kant, Deleuze and Wark, music by artists such as Jem Finer, La Monte Young and Autechre, and video games such as Proteus, Minecraft and Love. Tom will describe how he employs computer code as an artistic medium to generate the sublime, creating audiovisual environments that explore permutational complexity and present experiences that walk the margins between confusion and control.



Tom Betts, aka Nullpointer, is an artist, academic and coder. He’s been a lecturer, designer, published musician, professional artist, warlock… Tom has just completed his PhD investigating the digital sublime in videogames. As part of this research he developed three games exploring the ideas of permutation and generativity in audiovisual game spaces. He is the lead programmer at game development studio Big Robot, who successfully Kickstarted the game “Sir, You Are Being Hunted” and developed the project through early access and Steam digital distribution. For a while he was a published musician and worked on a range of projects including setting up an automated generative radio station and touring with his band Weevil. He has performed with his software and delivered talks at international venues such as DIGRA, FreePlay, UNITE, Sonar, Rezzed and Indievelopment. He has also designed commercial interactive work for the Tate, V&A and Southbank Centre. He really likes generative systems, drinking coffee, watching films where nothing much happens and quite likes ironing.


The event will be followed by performances from the Icelandic S.L.A.T.U.R collective of experimental contemporary composers.


Music, Software, Function – A seminar with Robert Thomas

:::: Wednesday, Feb 25th, 1pm, @ Recital Room, Falmer House 120 ::::

This talk will present the work Robert Thomas has been exploring with various companies over the last 5 years. He will discuss how, while music-making technology has evolved radically over the last 20 years, the process of music distribution and the form of music itself have remained conceptually the same for many decades, stretching back to the beginning of recorded music and beyond. Through a range of projects he will show how music that is distributed as software and uses sensor technology can behave in exciting new ways and relate very directly to user experience. The discussion will range from apps on smartphones to wearables, VR and AR. Finally, Robert will discuss how music as software can perform a quantifiable function for listeners. No longer ‘just’ entertainment – music can become a tool to achieve practical goals and create value or reduce risk.
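To give a flavour of the sensor-driven adaptive music described above — this is a minimal illustrative sketch, not Robert's actual implementation — a sensor reading such as heart rate can be mapped onto a musical parameter like tempo:

```python
def clamp(x: float, lo: float, hi: float) -> float:
    """Constrain x to the range [lo, hi]."""
    return max(lo, min(hi, x))

def heart_rate_to_tempo(hr: float, hr_rest: float = 60.0, hr_max: float = 180.0,
                        tempo_min: float = 70.0, tempo_max: float = 140.0) -> float:
    """Linearly map a heart-rate reading (BPM) onto a tempo range (BPM).

    All parameter values here are illustrative defaults, not taken
    from any of the projects discussed in the talk.
    """
    t = clamp((hr - hr_rest) / (hr_max - hr_rest), 0.0, 1.0)
    return tempo_min + t * (tempo_max - tempo_min)

print(heart_rate_to_tempo(60))   # resting -> slowest tempo: 70.0
print(heart_rate_to_tempo(120))  # moderate exertion -> mid tempo: 105.0
```

In a real adaptive music app the same pattern — clamp, normalise, map — would typically drive many parameters at once (tempo, filter cutoff, layer volume) from a stream of sensor readings.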

Following the talk Robert will host a workshop giving an under-the-hood view of how some of his projects work technically.

Robert Thomas in the Studio

Robert Thomas, Adaptive Music Composer and Sound Designer

Over the last 10 years Robert Thomas has worked as a composer, sound designer, audio programmer and experience designer across a wide range of innovative projects. He specialises in technically innovative projects that often explore ways music and sound can adapt to listeners’ behaviour.

Robert is currently working on music apps which change based on heart rate and movement, using devices including the Apple Watch, and meditation apps which generate music from the user’s brainwaves (via EEG) to help them meditate more effectively. He has just completed an interactive virtual reality exhibit for a major TV brand using Oculus Rift. He has also worked on sound and music for the first holographic portrait of Queen Elizabeth II, video games (published by Nintendo) and concerts/ambisonic 3D sound installations (with Imogen Heap for Reverb Festival 2014).

Previously, Robert was CCO at RjDj, where he worked on the viral hit app Inception The App, which reached No. 1 in the App Store and has had over 6 million downloads. He has collaborated with Hans Zimmer, Imogen Heap, Air, Carl Craig, Little Boots, Booka Shade, Jimmy Edgar, Chiddy Bang, Console/Acid Pauli, Sophie Barker (Zero 7) and Kirsty Hawkshaw (Opus III, Orbital, Tiesto).


© 2017 Emute Lab @ Sussex
