Emute Lab @ Sussex

Experimental Music Technologies Lab

Category: Experimental Music

Artist talk: Timothy Didymus, Kosmiches Glass

:::: Wednesday, October 12th, 1pm, @ Jane Attenborough Studio, Attenborough Centre ::::

Brighton musician and maker Timothy Didymus will present his glass harmonica project, Kosmiches Glass.

Twelve tuned (brandy) glasses are mounted on MIDI-controllable turntables, creating a playable/scriptable mechanical-acoustic instrument with a beautiful polyphonic voice.
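The control chain is simple to picture: each turntable's rotation speed is set over MIDI. The snippet below is a speculative sketch, not a detail of Didymus's actual rig; the channel-per-glass addressing and the controller number are assumptions for illustration only.

```python
# Hypothetical sketch: addressing twelve glass turntables over MIDI.
# Assumptions (not from the original project): one MIDI channel per
# glass, and rotation speed sent as Control Change controller 1.

def turntable_speed_message(glass_index, speed):
    """Build a 3-byte MIDI Control Change message setting one
    turntable's rotation speed (0.0 = stopped, 1.0 = maximum)."""
    if not 0 <= glass_index < 12:
        raise ValueError("twelve glasses: index must be 0-11")
    status = 0xB0 | glass_index          # CC status byte; channel = glass index
    controller = 1                       # assumed speed controller number
    value = round(max(0.0, min(1.0, speed)) * 127)
    return bytes([status, controller, value])

# Example: spin glass 3 at half speed.
msg = turntable_speed_message(3, 0.5)
```

Any MIDI output library (or a serial link to a microcontroller driving the motors) could then transmit these bytes once per control update.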

Following a demonstration of the glasses in action, Timothy will talk about the inspiration behind the project and its development, and will be happy to take questions on any aspect of the project – aesthetic, technical, logistical etc.

This will be of interest to music, music tech and sonic media students – or anyone with an interest in new musical instruments.

“Its design and engineering is of great elegance (…) its music of transparency and transience, the sonorous resonances of heavenly voices” — sound artist Max Eastley, 2015



Free entry.


Ryan Ross Smith: Animated Music Notation – Speculative Histories and Practical Demonstrations

:::: Wednesday, April 13th, 1.00pm, @ Recital Room, Falmer House 120 ::::


The term Animated Music Notation [AMN] describes any notation that contains real-time dynamic functionalities that are necessary to its representation. Over the last 10 years, the field of contemporary animated scoring practices has emerged as a niche, yet significant thread in the new music world. This talk will introduce a speculative meta-history of the animated score, discuss the current state of contemporary animated scoring practices, and provide a practical demonstration of the author’s own notational and compositional practice. If time allows, the practical demonstration may be extended to include the audience, so please feel free to bring your instruments.
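The real-time behaviour at the heart of an animated score can be reduced to a very small kernel: a playhead position computed from clock time, plus the set of events whose onsets it has crossed. The sketch below is a hypothetical illustration of that kernel, not Smith's own code, and the event format is invented.

```python
# Minimal sketch of an animated score's real-time core. An event is
# an (onset_seconds, label) pair; the score format is invented here
# purely for illustration.

def playhead(events, duration, now):
    """Return (x, active): x is the playhead's horizontal position
    in 0..1 across the score, and active lists the labels of events
    whose onset the playhead has already passed."""
    x = max(0.0, min(1.0, now / duration))
    active = [label for onset, label in events if onset <= now]
    return x, active

# A toy 8-second score with three events.
score = [(0.0, "drone"), (2.0, "bow glass"), (4.0, "silence")]
```

A rendering loop would call `playhead` on every animation frame and draw the moving cursor at `x`, which is what makes the notation's dynamic behaviour "necessary to its representation".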

Ryan Ross Smith is a composer and performer currently based in Fremont Center, NY. Smith has performed throughout the US, Europe and UK, including performances at MoMA and PS1 [NYC] and Le Centre Pompidou [Paris, FR], has had his music performed throughout North America, Iceland, Australia and the UK, has presented his work and research at conferences including NIME, ISEA, ICLI, the Deep Listening Conference and Tenor2015, and has lectured at various colleges and universities. Smith earned his MFA in Electronic Music from Mills College in 2012, and is currently a PhD candidate in Electronic Arts at the Rensselaer Polytechnic Institute in Troy, NY.

See website with animated scores here


ICLC 2015

The 1st International Conference on Live Coding took place in July 2015 in Leeds. The conference was co-organised by the EMuTe Lab at Sussex and ICSRiM in Leeds, facilitated by the AHRC-funded Live Coding Research Network.

The EMuTe Lab had a strong presence at the conference: Sally Jane Norman gave a keynote on liveness in performance entitled “Live Coding and Embodied Action in Performance Contexts”, Chris Kiefer presented a paper and performed at the Algorave, and Thor Magnusson performed with his new CMN language as well as at the Algorave. The event was well documented thanks to MFM research funds, which enabled Paul McConnell to attend armed with recording equipment. Honorary members of the lab, Nick Collins and Matt Yee-King, also performed and gave papers.

Next year's ICLC will take place in Hamilton, Canada, organised by LCRN collaborator David Ogborn.

The Conference Programme can be downloaded here: ICLC Programme.

ICLC participants




Research Fellow in Digital Humanities/Digital Performance

The Sussex Humanities Lab, in collaboration with the School of Media, Film and Music at the University of Sussex, wishes to appoint a Research Fellow to a fixed-term (4-year) fellowship in Digital Humanities/Digital Performance. While based in the School of Media, Film and Music, the appointee will work across the Sussex Humanities Lab in collaboration with colleagues in History, Art History & Philosophy, Informatics, and Education and Social Work.

The ideal candidate will have a demonstrable track record of work in performance technologies as a theorist and/or creative practitioner, with clear evidence of technical expertise in all cases. Candidates with knowledge of one or more of the following are particularly encouraged to apply: creative software (e.g. SuperCollider, Max/MSP or Pure Data), app development, graphics and games programming (e.g. OpenGL, Unity), and physical computing.

Further information on Sussex Jobs

A PDF with the job description can be downloaded here: Research Fellow in Digital Humanities/Digital Performance 226

The International Conference on Live Coding

:::: 13th–15th July @ University of Leeds, UK ::::

The EMuTe Lab is organising the first International Conference on Live Coding in collaboration with ICSRiM in the School of Music, University of Leeds. This is part of our two-year AHRC-funded Live Coding Research Network.

The ICLC conference will take place over three days and include paper presentations, demos, concerts and algoraves.


Phantom Terrains with Daniel Jones

:::: Wednesday, April 15th, 2pm @ Recital Room, Falmer House 120 ::::

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world, producing large-scale sculptural sound installations and systems that translate patterns and processes into musical forms. Daniel will discuss two recent works — Living Symphonies, a touring outdoor piece that grows in the same way as a forest ecosystem, and Phantom Terrains, a platform that enables its hearing-impaired wearer to hear the surrounding landscape of wifi networks — and argue that these two seemingly disparate outputs inhabit the same creative spectrum.
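To give a flavour of the Phantom Terrains idea, a wifi network's signal strength can be mapped straight to pitch. This is an illustrative sketch only: the mapping range and its linearity are assumptions, not the piece's actual sonification design.

```python
# Hedged sketch of wifi sonification: map received signal strength
# (RSSI, in dBm) to an audible frequency, so stronger networks sound
# higher. The -90..-30 dBm range and the linear mapping are
# assumptions for illustration.

def rssi_to_frequency(rssi_dbm, low=-90, high=-30, f_min=110.0, f_max=880.0):
    """Linearly map RSSI in [low, high] dBm to a frequency in Hz,
    clamping values outside the range."""
    t = (rssi_dbm - low) / (high - low)
    t = max(0.0, min(1.0, t))
    return f_min + t * (f_max - f_min)
```

A wearable system would scan nearby networks periodically, call this mapping per network, and synthesise the resulting tones into the wearer's hearing aids.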

Daniel Jones

Short biography:

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world. This manifests itself in both scientific and artistic output: he has published work on process composition, creativity theory, systems ecology and artificial life, and exhibits his sound work internationally, harnessing algorithmic processes to create self-generating artworks.

Recent works include Phantom Terrains (with Frank Swain, 2014), a platform for ubiquitous sonification of wireless network landscapes; Living Symphonies (with James Bulley, 2014-), a landscape sound work that grows in the same way as a forest ecosystem; Global Breakfast Radio (with Seb Emina, 2014-), an autonomous radio station that broadcasts live radio from wherever the sun is rising; The Listening Machine (with Peter Gregson, 2012), a 6-month-long online composition which translates social network dynamics into a piece of orchestral music, recorded with Britten Sinfonia and commissioned by the BBC/Arts Council’s The Space; Variable 4 (with James Bulley, 2011), an outdoor sound installation which transforms live weather conditions into musical patterns; Maelstrom (with James Bulley, 2012), which uses audio material from media-publishing websites as a distributed, virtual orchestra; Horizontal Transmission (2011), a digital simulation of bacterial communication mechanisms; and AtomSwarm (2006–2009), a musical performance system based upon swarm dynamics.

Daniel’s engineering work includes Chirp, a platform and iOS app for sharing information over sound, shortlisted for the Design Museum’s Designs Of The Year; the 3D audio engine for mobile games Papa Sangre and The Nightjar, nominated for two BAFTAs including “Audio Achievement”. He co-ordinated the technical infrastructure for The Fragmented Orchestra, winner of the prestigious PRSF New Music Award 2008, and was more recently a fellow on the Mozilla Webmaker programme.

Fields – Tim Shaw and Sébastien Piquemal

:::: Thursday, April 9th, 1.20pm @ Meeting House, University of Sussex ::::

NOTE: Please bring a networked device – a computer, phone, tablet, etc.

In this piece, Sébastien and Tim explore mobile technology as a medium for sound diffusion. Audience members can join in by simply connecting to a specific website with their mobile phone, laptop or tablet. The connected devices become an array of speakers that the performers can control live, resulting in an omni-directional sonic experience. This project provides an alternative method for sound spatialisation as well as offering new ways in which audiences can engage in sonic works. Fields has been performed in Helsinki, Berlin, Athens, Lisbon and Newcastle.
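The underlying routing idea can be sketched in a few lines: each device that connects is given a slot in a virtual speaker array, and the performers address devices by position. The class below is an invented illustration, not code from Fields itself, which runs in the web browser.

```python
# Illustrative sketch of Fields-style routing: connected devices form
# a virtual speaker array addressed by position. The data structures
# are invented for this example.

class SpeakerArray:
    def __init__(self):
        self.devices = []              # connection order defines slot order

    def join(self, device_id):
        """Register a newly connected device; return its slot index."""
        self.devices.append(device_id)
        return len(self.devices) - 1

    def pan(self, position):
        """Map a position in 0..1 to the nearest device slot, so a
        single gesture can sweep sound across the audience."""
        if not self.devices:
            raise ValueError("no devices connected")
        slot = round(position * (len(self.devices) - 1))
        return self.devices[slot]
```

In the real system this assignment would live on a server, with each browser receiving its audio instructions over a network connection.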

Further information: tim-shaw.net/fields_/

In a workshop afterwards, Tim and Sebastien will introduce participants to using Fields as a performance or installation tool. We will cover technical aspects such as setting up, configuration and customisation of the system. We will also share experience gathered from a year of composing for tiny mobile phone speakers, trying different audience configurations and dealing with latency and other technical limitations. The workshop will allow for participants to create their own work using Fields and will culminate in a listening session and an open discussion.


Tim Shaw

Tim Shaw has worked internationally as a professional composer, performer, sound designer and researcher. His practice incorporates diverse approaches to sound capture and processing, and includes creating immersive and site-responsive sonic installations. His compositional methods include field recordings, synthesised sounds and live electronics, providing a wide scope for creative diversity. At the heart of his work lies a concern with the auditory reflection and mirroring of real-world environments through sound and technology. He is currently studying for a PhD in Digital Media at Culture Lab alongside managing the Newcastle-based record label Triptik. Tim has created commissions for Warp Records, The British Council, The British Science Association, Pacitti Company, Tender Buttons and Transform Festival.

Sébastien Piquemal

Sébastien Piquemal is a computer engineer, obsessively exploring the artistic capabilities of machines. After working several years as a full-stack web developer in Helsinki, Finland, he decided to dedicate himself fully to making music. Since then, he has been an active contributor to the open-source software community, leading various projects such as WebPd (Pure Data patches running in the web browser). As a lover of jazz and improvised music, Sébastien is seeking new ways to place human interaction at the core of live music. He is presently doing an MA degree in sound in new media at Media Lab Helsinki.

Composing and Performing with the Magnetic Resonator Piano

:::: Thursday, March 26th, 1.20pm @ Recital Room, Falmer House 120 ::::

The magnetic resonator piano (MRP) is an electromagnetically augmented acoustic grand piano which uses electromagnets to induce vibrations in the strings. The MRP is played from the piano keyboard using an optical scanner that measures the continuous position of every key. The instrument is capable of infinite sustain, crescendos from silence, harmonics, pitch bends and new timbres, all produced acoustically by the piano strings and soundboard without any external speakers.
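The control principle described above, continuous key position driving electromagnet amplitude, can be sketched as follows. The threshold and response curve here are illustrative assumptions, not McPherson's published calibration.

```python
# Sketch of the MRP control idea: the optical scanner reports a
# continuous key position (0.0 = at rest, 1.0 = fully depressed),
# and the electromagnet drive amplitude follows it, enabling
# crescendos from silence. Threshold and curve values are assumed.

def magnet_amplitude(key_position, threshold=0.05, curve=2.0):
    """Map continuous key position to electromagnet drive (0..1).
    Below the threshold the magnet stays off; above it, a power
    curve gives fine control at low amplitudes."""
    if key_position <= threshold:
        return 0.0
    t = (key_position - threshold) / (1.0 - threshold)
    return min(1.0, t) ** curve
```

Running such a mapping per key, at the scanner's frame rate, is what turns an ordinary keyboard gesture into a continuously shaped acoustic sound.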

This event will begin with a performance of music composed for, or adapted to, the magnetic resonator piano. A talk and workshop will follow the performance. The talk will present the design of the instrument, including how its evolution has been shaped by working with composers and performers. The MRP was first created in 2009, and underwent a significant design revision in 2011 in response to feedback from musicians. The talk will also discuss how other digital musical instrument creators can build a community of musicians around their instruments.

The workshop will explore techniques for composing and performing with the magnetic resonator piano, working directly with the instrument. Examples will be taken from recent pieces composed for MRP, and there will be an opportunity to try new ideas on the resonator system.



Andrew McPherson is a Senior Lecturer in the Centre for Digital Music at Queen Mary University of London. With a background in electrical engineering and music, his research focuses on augmented acoustic instruments, new performance interfaces, and study of performer-instrument interaction. He did his undergraduate and Master’s work at MIT, completing his M.Eng. thesis in Barry Vercoe’s group at the MIT Media Lab. He completed his PhD in music composition in 2009 at the University of Pennsylvania. Before joining Queen Mary in 2011, he spent two years as a post-doctoral researcher in the Music Entertainment Technology Laboratory (MET-lab) at Drexel University.

In addition to the magnetic resonator piano, he is the creator of the TouchKeys multi-touch keyboard which launched in a successful Kickstarter campaign in 2013. More recently he has studied how musicians appropriate and misuse technology for creative purposes, creating a new digital musical instrument, the D-Box, specifically designed to be hacked and subverted by the performer using circuit bending techniques.

SLÁTUR – Experimental Composer Collective

After Tom Betts’ talk on the Sublime in Computer Games, we are lucky to have visitors from Iceland in the form of the experimental composer collective SLÁTUR. They will perform a concert of compositions by SLÁTUR members. The performance includes electronic compositions with live electronic processing as well as music using animated notation, found objects as musical instruments and video projections.

Short description:

SLÁTUR is an artistically obtrusive composer collective centred in Reykjavík, Iceland. Since 2005 its members have been working on various types of experiments, including animated notation using computer graphics, interactivity, experiments with sounds and tunings, performance art, and the development of limited and isolated musical universes. The members share ideas and methods freely, while the final products are usually independent efforts.



© 2017 Emute Lab @ Sussex
