Emute Lab @ Sussex

Experimental Music Technologies Lab


Emute Lab Meeting: Feedback Cellos – Alice Eldridge and Chris Kiefer

:::: Thursday, March 23rd, 1pm, @ Recital Room, Falmer House 120 ::::

We have an Emute Lab research lunch meeting on March 23rd, where Alice Eldridge and Chris Kiefer will present the results of their recent work on the feedback cellos, conducted at the Iceland Academy of the Arts with Halldór Úlfarsson. The collaboration began at the Live Interfaces conference (www.liveinterfaces.org), where Halldór ran a workshop with composers and performers.

Chris and Alice write:

We will report on a recent trip to the Iceland Academy of the Arts to visit instrument designer Halldór Úlfarsson, who we have been collaborating with on the design of two Feedback Cellos. These unique hybrid instruments have been evolving quickly since we first made them last summer. We will present the latest developments, and discuss some issues around design and performance that we have been engaging with in this project.


Enrique Tomás – Towards Non-linguistic Writing for Music: A Performative Approach

:::: Wednesday, October 26th, 1pm @ Sussex Humanities Lab, Silverstone Building ::::

This term we are pleased to have a visit from instrument designer, musician and researcher Enrique Tomás. Enrique will be working with us and presenting his work in one of the Music Department’s lunchtime seminars (in the SHL). See the abstract for the talk and his bio below:

Instruments as Scores: Musical Interfaces Beyond Representation

New approaches to instrumentality developed during the 1960s contributed to a dual perception of instruments as scores. For many performers, the instrument became the score of what they played. This artistic hybridization raises substantial questions about the nature of our scores and about the relationships among instruments, performers and musical works. In my talk, I will contextualize the historical origins of this instrumental development within the digital humanities, in particular Drucker’s theory of performative materiality, Barad’s posthumanist performativity and Ingold’s processual and relational description of material properties. Following this approach, I will argue that a ‘hybrid’ and ‘performative’ understanding of writing music, which shifts the focus from linguistic and visual representations to discursive practices, suggests new practices of notation and interface design.



Enrique Tomás is a sound artist and researcher who dedicates his time to finding new ways of expression and play with sound, art and technology. His work explores the intersection between sound art, computer music, locative media and human-machine interaction. He has exhibited and performed throughout Europe and America at venues including ZKM, Ars Electronica, Sónar, SMAK and STEIM. Tomás is also an active researcher in the field of new interfaces for musical expression. He is affiliated with the Interface Cultures department of the University of Art and Design Linz, and his research has been presented at international peer-reviewed conferences including NIME, ICMC, SMC, TEI and TENOR. His artistic work has been supported by scholarships and awards from Telefónica Vida, the Phonos Foundation, the Academy of Fine Arts of Vienna and the Art Council of Madrid.

Artist talk: Timothy Didymus, Kosmiches Glass

:::: Wednesday, October 12th, 1pm, @ Jane Attenborough Studio, Attenborough Centre ::::

Brighton musician and maker Timothy Didymus will present his glass harmonica project, Kosmiches Glass.

Twelve tuned (brandy) glasses are mounted on MIDI-controllable turntables, creating a playable, scriptable mechanical-acoustic instrument with a beautiful polyphonic voice.

Following a demonstration of the glasses in action, Timothy will talk about the inspiration behind and development of the project, and will be happy to take questions on any aspects of the project – aesthetic, technical, logistical etc.

This will be of interest to music, music tech and sonic media students – or anyone with an interest in new musical instruments.

“Its design and engineering is of great elegance. (…) its music of transparency and transience, the sonorous resonances of heavenly voices” — sound artist Max Eastley, 2015



Free entry.


Visiting lecture: Mick Grierson

:::: Wednesday, November 25th, 1pm, @ Recital Room, Falmer House 120 ::::

Mick will be showing demos from a selection of research and development projects currently underway at the EAVI lab at Goldsmiths, including the H2020-funded music wearables project RAPID-MIX, the AHRC/NESTA/ACE-funded project SOUNDLAB, and the new SPIDERSONICS app, an ongoing project with Chris Kiefer (Sussex), Simon Katan (Goldsmiths), Rebecca Fiebrink (Goldsmiths), and a selection of schools and arts organisations.

Mick Grierson


Mick Grierson is Director of Creative Computing at Goldsmiths College, where he is a Reader in the Department of Computing. He specialises in developing new technologies for the creative sector, across arts, industry and education, including in SEN (Special Educational Needs) contexts. He currently runs the European Commission-funded project “RAPID-MIX”, in partnership with IRCAM, Paris, and the Music Technology Group at Pompeu Fabra University, Barcelona.

He has been centrally involved in some of the most noteworthy creative technology installations since 2010, including Christian Marclay’s internationally acclaimed “The Clock”, Heart n Soul’s “Dean Rodney Singers” (part of the Paralympics Unlimited Festival), and the Science Museum’s “From Oramics to Electronica”. In addition, he is the founder of Goldsmiths Digital and co-founder of the Goldsmiths Embodied Audiovisual Interaction Group (EAVI).

Research Fellow in Digital Humanities/Digital Performance

The Sussex Humanities Lab, in collaboration with the School of Media, Film and Music at the University of Sussex, wishes to appoint a Research Fellow to a fixed-term (4-year) fellowship in Digital Humanities/Digital Performance. While based in the School of Media, Film and Music, the appointee will work across the Sussex Humanities Lab in collaboration with colleagues in History, Art History & Philosophy, Informatics, and Education and Social Work.

The ideal candidate will have a demonstrable track record of work in performance technologies as a theorist and/or creative practitioner, with clear evidence of technical expertise in all cases. Candidates with knowledge of one or more of the following are particularly encouraged to apply: creative software (e.g. SuperCollider, Max/MSP or Pure Data), app development, graphics and games programming (e.g. OpenGL, Unity), and physical computing.

Further information on Sussex Jobs

A PDF with the job description can be downloaded here: Research Fellow in Digital Humanities/Digital Performance

Phantom Terrains with Daniel Jones

:::: Wednesday, April 15th, 2pm @ Recital Room, Falmer House 120 ::::

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world, producing large-scale sculptural sound installations and systems that translate patterns and processes into musical forms. Daniel will discuss two recent works — Living Symphonies, a touring outdoor piece that grows in the same way as a forest ecosystem, and Phantom Terrains, a platform that enables its hearing-impaired wearer to hear the surrounding landscape of wifi networks — and argue that these two seemingly disparate outputs inhabit the same creative spectrum.
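To give a flavour of the kind of mapping wifi sonification involves, here is a toy sketch. It is purely illustrative and not the Phantom Terrains code: the function name, ranges and the RSSI-to-pitch mapping are all assumptions. Wifi scans typically report signal strength (RSSI) in dBm, roughly -90 (weak) to -30 (strong), and one simple approach maps that onto an audible frequency range.

```python
# Hypothetical sketch (not the actual Phantom Terrains implementation):
# map wifi signal strength in dBm onto an audible pitch.

def rssi_to_freq(rssi_dbm, lo=-90.0, hi=-30.0, f_min=110.0, f_max=880.0):
    """Clamp RSSI to [lo, hi], normalise to [0, 1], then map
    exponentially onto a frequency range so equal RSSI steps
    give equal pitch intervals."""
    t = max(0.0, min(1.0, (rssi_dbm - lo) / (hi - lo)))
    return f_min * (f_max / f_min) ** t  # stronger network -> higher pitch

print(rssi_to_freq(-30))  # strongest signal maps to the top of the range
print(rssi_to_freq(-90))  # weakest signal maps to the bottom
```

An exponential (rather than linear) frequency mapping is the usual choice in sonification, since pitch perception is roughly logarithmic in frequency.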


Short biography:

Daniel Jones is an artist and software engineer whose work explores new ways in which sound and technology can illuminate our understanding of the world. This manifests itself in both scientific and artistic output: he has published work on process composition, creativity theory, systems ecology and artificial life, and exhibits his sound work internationally, harnessing algorithmic processes to create self-generating artworks.

Recent works include Phantom Terrains (with Frank Swain, 2014), a platform for ubiquitous sonification of wireless network landscapes; Living Symphonies (with James Bulley, 2014–), a landscape sound work that grows in the same way as a forest ecosystem; Global Breakfast Radio (with Seb Emina, 2014–), an autonomous radio station that broadcasts live radio from wherever the sun is rising; The Listening Machine (with Peter Gregson, 2012), a six-month-long online composition which translates social network dynamics into a piece of orchestral music, recorded with Britten Sinfonia and commissioned by the BBC/Arts Council’s The Space; Variable 4 (with James Bulley, 2011), an outdoor sound installation which transforms live weather conditions into musical patterns; Maelstrom (with James Bulley, 2012), which uses audio material from media-publishing websites as a distributed, virtual orchestra; Horizontal Transmission (2011), a digital simulation of bacterial communication mechanisms; and AtomSwarm (2006–2009), a musical performance system based upon swarm dynamics.

Daniel’s engineering work includes Chirp, a platform and iOS app for sharing information over sound, shortlisted for the Design Museum’s Designs Of The Year; and the 3D audio engine for the mobile games Papa Sangre and The Nightjar, nominated for two BAFTAs including “Audio Achievement”. He co-ordinated the technical infrastructure for The Fragmented Orchestra, winner of the prestigious PRSF New Music Award 2008, and was more recently a fellow on the Mozilla Webmaker programme.

Fields – Tim Shaw and Sébastien Piquemal

:::: Thursday, April 9th, 1.20pm @ Meeting House, University of Sussex ::::

NOTE: Please bring a networked device – a computer, phone, tablet, etc.

In this piece, Sébastien and Tim explore mobile technology as a medium for sound diffusion. Audience members can join in by simply connecting to a specific website with their mobile phone, laptop or tablet. The connected devices become an array of speakers that the performers can control live, resulting in an omni-directional sonic experience. This project provides an alternative method for sound spatialisation as well as offering new ways in which audiences can engage in sonic works. Fields has been performed in Helsinki, Berlin, Athens, Lisbon and Newcastle.

Further information: tim-shaw.net/fields_/

In a workshop afterwards, Tim and Sébastien will introduce participants to using Fields as a performance or installation tool. We will cover technical aspects such as setting up, configuration and customisation of the system. We will also share experience gathered from a year of composing for tiny mobile phone speakers, trying different audience configurations and dealing with latency and other technical limitations. The workshop will allow participants to create their own work using Fields and will culminate in a listening session and an open discussion.
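One small piece of logic such a system needs is deciding where each newly connected device sits in the sound field. The sketch below is purely illustrative and is not the Fields codebase (which runs in the browser); it simply shows one plausible policy: spread n connected devices evenly across a stereo pan range from -1.0 (hard left) to 1.0 (hard right).

```python
# Toy sketch (not the actual Fields implementation): assign each
# connected audience device an evenly spaced stereo pan position.

def pan_positions(n_devices):
    """Return a list of pan values in [-1.0, 1.0], one per device.
    A single device sits in the centre; more devices spread evenly
    from hard left to hard right."""
    if n_devices < 1:
        return []
    if n_devices == 1:
        return [0.0]
    step = 2.0 / (n_devices - 1)
    return [-1.0 + i * step for i in range(n_devices)]

print(pan_positions(5))  # five phones spread from hard left to hard right
```

In practice a performer-controlled system might reassign these positions live, or use physical seating positions instead, but even this naive policy turns a room of phones into a rough speaker array.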


Tim Shaw

Tim Shaw has worked internationally as a professional composer, performer, sound designer and researcher. His practice incorporates diverse approaches to sound capture and processing, and includes creating immersive and site-responsive sonic installations. His compositional methods include field recordings, synthesized sounds and live electronics, providing a wide scope for creative diversity. At the heart of his work lies a concern with the auditory reflection and mirroring of real-world environments through sound and technology. He is currently studying for a PhD in Digital Media at Culture Lab alongside managing the Newcastle-based record label Triptik. Tim has created commissions for Warp Records, The British Council, The British Science Association, Pacitti Company, Tender Buttons and Transform Festival.

Sébastien Piquemal

Sébastien Piquemal is a computer engineer, obsessively exploring the artistic capabilities of machines. After working several years as a full-stack web developer in Helsinki, Finland, he decided to dedicate himself fully to making music. Since then, he has been an active contributor to the open-source software community, leading various projects such as WebPd (Pure Data patches running in the web browser). As a lover of jazz and improvised music, Sébastien is seeking new ways to place human interaction at the core of live music. He is presently doing an MA degree in sound in new media at Media Lab Helsinki.

Interactive Music Theatre / Hertzian Field #1

:::: Monday, April 6th, 11am, @ Creativity Zone, Pevensey III ::::

Thanos Polymeneas-Liontiris / Stephanie Pan

(Research in progress) A study in interactive music theatre: in this first experiment of Thanos’ research on interactive music theatre, he embarks on an exploration of the voice. Aided by improvisation as well as by a fully written score, the study considers the instrument’s range, colours and expressive possibilities. Moreover, it engages with the subject of audience interaction beyond the traditional operatic setup.

Stelios Manousakis / M.Eugenia Demeglio

Hertzian Field #1

Hertzian Field #1 is an augmented reality surround sound environment that sonifies the dispersal of radio waves in a space caused by bodies and movement. The first of a series of pieces to come, this environment features an innovative interaction system inspired by an obscure surveillance technique. “We make wide daily use of electromagnetic radiation and depend on it increasingly to wirelessly transmit and receive information of all sorts, for all sorts of uses. Apart from carrying and distributing our data, wireless communication has a side effect: it conveys physical information about space and our own bodies as we move unawares through turbulent rivers of radio waves.”

Hertzian Field


Stelios Manousakis is a composer, performer, sound artist, and researcher, currently based in The Hague. He operates across the convergence zones of art, science, and engineering / composition, performance, and installation / the rich tradition of western sonic art and ‘digital folk’ idioms.

M.Eugenia Demeglio (IT/UK) is a maker and educator. She is an Associate Lecturer in Dance at Falmouth University and a freelance maker based in Brighton. Her practice includes movement and improvisation performances, installations, participatory events, videos, community projects and (body) sculptures.

Stephanie Pan (USA/NL) is a singer and multi-instrumentalist, performance artist, and maker based in The Hague, specializing in experimental music, new music, and experimental theater. Her work is rooted in the search for pure communication; finding contact with the audience stripped of expectations and distractions, which speaks beyond the conventional limitations and constructs of language. Her work is visceral, passionate and intense, often exploring the limits of the voice and body.

Thanos Polymeneas-Liontiris (GR/UK) is a composer, performer and sound artist. His practice comprises computer-aided compositions, interactive audiovisual installations and interactive music for dance, theatre and multimedia performances. He is currently a PhD candidate at the University of Sussex, researching the development of immersive and interactive music theatre.

Composing and Performing with the Magnetic Resonator Piano

:::: Thursday, March 26th, 1.20pm @ Recital Room, Falmer House 120 ::::

The magnetic resonator piano (MRP) is an electromagnetically augmented acoustic grand piano which uses electromagnets to induce vibrations in the strings. The MRP is played from the piano keyboard using an optical scanner that measures the continuous position of every key. The instrument is capable of infinite sustain, crescendos from silence, harmonics, pitch bends and new timbres, all produced acoustically by the piano strings and soundboard without any external speakers.
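As a purely hypothetical sketch of the kind of mapping such a design invites (this is not Andrew McPherson's actual MRP software; the function, thresholds and scaling are all invented for illustration): with a continuous key-position sensor, partial key presses can shade the electromagnet's drive level smoothly, rather than acting as an on/off trigger.

```python
# Hypothetical illustration only, not the MRP's real control code:
# map a continuously sensed key depth to an electromagnet drive level.

def drive_level(key_pos, threshold=0.1, full=0.9):
    """Map normalised key depth (0.0 = key at rest, 1.0 = fully down)
    to a magnet drive level in [0.0, 1.0]. A small threshold rejects
    sensor noise at rest; beyond 'full' the drive saturates."""
    if key_pos <= threshold:
        return 0.0          # key at rest: no actuation
    if key_pos >= full:
        return 1.0          # fully depressed: full sustain drive
    return (key_pos - threshold) / (full - threshold)

print(drive_level(0.5))  # a half-depressed key gives a partial drive level
```

Continuous mappings like this are what make gestures such as crescendos from silence possible: the drive level can rise from zero as slowly as the player's finger does.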

This event will begin with a performance of music composed for, or adapted to, the magnetic resonator piano. A talk and workshop will follow the performance. The talk will present the design of the instrument, including how its evolution has been shaped by working with composers and performers. The MRP was first created in 2009, and underwent a significant design revision in 2011 in response to feedback from musicians. The talk will also discuss how other digital musical instrument creators can build a community of musicians around their instruments.

The workshop will explore techniques for composing and performing with the magnetic resonator piano, working directly with the instrument. Examples will be taken from recent pieces composed for MRP, and there will be an opportunity to try new ideas on the resonator system.



Andrew McPherson is a Senior Lecturer in the Centre for Digital Music at Queen Mary University of London. With a background in electrical engineering and music, his research focuses on augmented acoustic instruments, new performance interfaces, and study of performer-instrument interaction. He did his undergraduate and Master’s work at MIT, completing his M.Eng. thesis in Barry Vercoe’s group at the MIT Media Lab. He completed his PhD in music composition in 2009 at the University of Pennsylvania. Before joining Queen Mary in 2011, he spent two years as a post-doctoral researcher in the Music Entertainment Technology Laboratory (MET-lab) at Drexel University.

In addition to the magnetic resonator piano, he is the creator of the TouchKeys multi-touch keyboard, which launched in a successful Kickstarter campaign in 2013. More recently he has studied how musicians appropriate and misuse technology for creative purposes, creating a new digital musical instrument, the D-Box, specifically designed to be hacked and subverted by the performer using circuit bending techniques.

SLÁTUR – Experimental Composer Collective

After Tom Betts’ talk on the Sublime in Computer Games, we are lucky to have visitors from Iceland: the experimental composer collective SLÁTUR. They will perform a concert of compositions by SLÁTUR members. The performance includes electronic compositions with live electronic processing, as well as music using animated notation, found objects as musical instruments, and video projections.

Short description:

SLÁTUR is an artistically obtrusive composer collective centered in Reykjavík, Iceland. Since 2005 its members have been working on various types of experiments, including animated notation using computer graphics, interactivity, various experiments with sounds and tunings, performance art, and the development of limited and isolated musical universes. The members share ideas and methods freely, while the final products are usually independent efforts.




© 2017 Emute Lab @ Sussex
