Emute Lab @ Sussex

Experimental Music Technologies Lab

A Magnificent Crossbreeding of Protein and Tinplate

:::: Monday, February 13th, 5-9pm @ Attenborough Centre for the Creative Arts ::::

A Magnificent Crossbreeding of Protein and Tinplate is a generative music theatre performance with audience-interactive and participatory aspects. It is inspired by and devised from Heiner Müller’s Despoiled Shore/Medeamaterial/Landscape with Argonauts. The audience is invited to explore this performance ecosystem, a deconstructed landscape, a fragmented environment, where individuals can be found playing music, reading, singing, being.

A Magnificent Crossbreeding of Protein and Tinplate is a posthuman vivarium. The “I” in this space is collective, and visitors are invited to transit, inhabit, indulge and experience this world, to interact with the performers, the objects and the space. The piece is a living performance organism, controlled through a network of computers. It evolves as time passes, and as the performers and the cybernetic system react to the audience’s presence and movement in the space, generating a succession of unique and unexpected situations.

The performance is designed to run for 4 hours and the audience can enter and leave the space as they please. Due to the nature of the performance and the limitations of the space, the audience may be asked to explore the space carefully, and to be open to interaction with other human and non-human beings.

This piece of generative music theatre is the final in a series of experiments that form part of Thanos Polymeneas-Liontiris’ practice-based PhD research on interactive music theatre, taking place at the University of Sussex, supervised by Thor Magnusson and Nicholas Till and funded by the AHRC.

Further location info: ACCA

Concept/Direction/Music: Thanos Polymeneas Liontiris
Production/Assistant Director: M. Eugenia Demeglio
Text by: H. Müller, Aeschylus, Heraclitus, W. Shakespeare, T. S. Eliot, G. Seferis, F. Hölderlin
Text adaptation: Nikos Ioakeim, Thanos Polymeneas-Liontiris
Devised with and performed by: Gonçalo Almeida (PT/NL), M. Eugenia Demeglio (IT/UK), Theresa Elflein (DE/UK), Nikos Ioakeim (GR/NL), Katerina Kostantourou (GR/NL), Arthur Artorius Leadbetter (UK), Stephanie Pan (US/NL), Friso van Wijck (NL).

Emute Lab in Reykjavik

The fruitful collaboration with Halldor Ulfarsson on the feedback cellos, which started as part of the Arts Council-funded workshop programme at the Live Interfaces conference, continues with Alice Eldridge and Chris Kiefer travelling to Reykjavik to work at the Icelandic Academy of the Arts.

They will also be performing at Mengi on February 12th:

Feedback Cell is the duo formed by cellist Alice Eldridge (Collectress, En Bas Quartet) and computer-musician Chris Kiefer (Luuma) to explore their ever-evolving feedback cello project. Two butchered cellos, electromagnetic pickups, code, bows and lots of soldering. Emits dulcet drones and brutal yelps.
Alice Eldridge is a cellist and researcher. Her background in music, psychology, evolutionary and adaptive systems and computer science inspires and informs systemic sound-based research across ecology, technology and music. Current projects include ecoacoustics for biodiversity assessment, networked notation for ensemble music-making and hybrid instrument building for improvisation. As a cellist she has shared stages, studios and other acoustic spaces with some of the UK’s most inventive musicians at the intersections of contemporary classical, folk, free jazz, minimal pop and algorithmic musics.

Chris Kiefer is a computer-musician and musical instrument designer, specialising in musician-computer interaction, physical computing, and machine learning. He performs with custom-made instruments including malleable foam interfaces, touch screen software, interactive sculptures and a modified self-resonating cello. Chris’ research often focuses on participatory design and development of interactive music systems in everyday settings, including digital instruments for children with disabilities, and development of the NETEM networked score system for musical ensembles.
Concert starts at 9pm. Tickets: 2000 ISK.

 

Artist talk: Timothy Didymus, Kosmisches Glass

:::: Wednesday, October 12th, 1pm, @ Jane Attenborough Studio, Attenborough Centre ::::

Brighton musician and maker Timothy Didymus will present his glass harmonica project, Kosmisches Glass.

Twelve tuned (brandy) glasses are mounted on MIDI-controllable turntables, creating a playable, scriptable mechanical acoustic instrument with a beautiful polyphonic voice.
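For readers curious what “scriptable” means here: a MIDI-controllable instrument of this kind could, in principle, be driven from any MIDI-capable environment. The sketch below, written in SuperCollider, is purely hypothetical; the device and port names and the idea that a continuous controller sets turntable speed are assumptions for illustration, not details of Timothy’s actual setup.

    // Hypothetical sketch only: the device/port names and the CC-to-speed
    // mapping are invented for illustration, not Timothy's actual setup.
    MIDIClient.init;
    m = MIDIOut.newByName("Kosmisches Glass", "MIDI 1");

    // Ramp one turntable (assumed to respond to CC 1 on channel 0)
    // from stopped to full speed over roughly ten seconds.
    Routine({
        (0..127).do { |val|
            m.control(0, 1, val);   // assumed mapping: CC 1 = platter speed
            0.08.wait;              // 128 steps of 0.08 s each
        };
    }).play;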

Following a demonstration of the glasses in action, Timothy will talk about the inspiration behind the project and its development, and will be happy to take questions on any aspect of the project – aesthetic, technical, logistical, etc.

This will be of interest to music, music tech and sonic media students – or anyone with an interest in new musical instruments.

“Its design and engineering is of great elegance. (…) its music of transparency and transience, the sonorous resonances of heavenly voices” — sound artist Max Eastley, 2015

 

Free entry.

http://timothydidymus.com/kosmisches-glass-2

Machine Creativity Roundtable at IASPM in Brighton

:::: Saturday, September 9th, 2.00pm @ Clarendon Centre, Brighton ::::

We present a roundtable on Machine Creativity as part of the IASPM conference, co-organised by the Sussex Music Department and BIMM. The idea is to set up a dialogue between researchers of machine creativity and researchers of popular music, to further understanding between the respective fields. The members of the roundtable are:

 

  • Elaine Chew - http://www.eecs.qmul.ac.uk/~eniale/ Chew is a musician (pianist) and mathematician (operations researcher) who designs mathematical and computational tools to model, analyse, and visualise music structures: structures constructed in the process of listening, performing, composing, and improvising.
  • Rebecca Fiebrink - http://www.doc.gold.ac.uk/~mas01rf/Rebecca_Fiebrink_Goldsmiths
    Fiebrink is an expert in machine learning and has created systems using neural networks that can quickly learn gestural and compositional styles of performers and composers.
  • Andrew McPherson - http://www.eecs.qmul.ac.uk/people/view/20095/dr-andrew-mcpherson
    How can computer algorithms add to and shape musical performance? Can instruments learn about the performers playing them? Can they contribute to the playing? Andrew will bring NIME (New Interfaces for Musical Expression) expertise to this round-table.
  • Bob Sturm - http://www.eecs.qmul.ac.uk/~sturm/
    Sturm’s research seeks to interrogate the “creative” and/or “intelligent” machine. Though a machine may appear creative and/or intelligent, what has it actually learned to do? How can it be changed to do what we want it to do? Sturm will discuss various approaches to answering these questions, and how such “learned machines” can provide useful assistance and inspiration to the human creator.
  • Thor Magnusson (chair) - http://www.sussex.ac.uk/profiles/164902
    Thor Magnusson is a musician with a PhD in Computer Science and Artificial Intelligence, focusing on music software and the role of computers in musical creativity.

 

Images: The Tree of Porphyry; IASPM 2016 conference poster

The abstract for our roundtable is the following:

Recent academic fields such as software studies, digital humanities, and the philosophy of technology have directed attention to big data, machine agency, and the role of algorithms in contemporary culture. We are slowly gaining a clearer understanding of the power, as well as the pitfalls, of machinic involvement in daily human tasks. With machines that learn and communicate, we will see their powers used in domains previously seen as exclusively human, such as artistic creativity. However, the use of computational creativity in the domain of music is nothing new: one of the earliest computer music compositions – the Illiac Suite by Hiller and Isaacson – was composed using Artificial Intelligence techniques.

Machine creativity in the creative process can range from minor “suggestions” in filtering or mixing, or intelligent mapping between a controller and a sound generator, to stronger involvement where the computer aids in writing in the style of a certain genre or composer, or even composes a whole musical piece without any human involvement. This raises both technical and philosophical questions about the nature of composition, performance, and creativity in general.

This proposed round-table addresses recent developments in machine learning, musical corpus analysis, and related computational creativity techniques. The session includes specialists in machine creativity, who will present and discuss diverse aspects of how computers will become more involved in the creative process of future musical works. The panel will explore the potential of computational creativity in the musical domain, ranging from the use of AI in musical instruments and musical corpus analysis for deeper understanding and regeneration of music, to the use of AI for non-human music. The potential of semantic audio techniques will also be considered for film music, game sound and other fields where music has a specific context-aware function.

The round-table will also discuss the role of the listener when musical works become open for interaction. How do new computational playback devices, in the form of mobile media, enable the listener to become part of the creative process of generating the music? What do these new technologies afford to composers, and how will they use AI as part of the new compositional context?

We will be discussing questions of computational creativity and machine learning, and hoping to open up a dialogue with specialists in popular music studies. Some of the topics discussed include:

 

  • How is machine creativity being best applied in the field of music?
  • Does machine creativity belong equally in performance and in composition/studio work?
  • What is creativity? How does this concept operate in your field of research?
  • Under which conditions can we declare a machine to be creative?
  • What can a machine do that humans can’t?
  • Is there a difference in how computational algorithms would engage with established genres vs. experimental music?
  • Wherein lies the art of music making? Will it perhaps always escape the machine?

 

 

Our session will be at 1pm, on Saturday 10th of September. See full programme here.

The header image of this post is Al-Jazari’s diagram of a hydropowered perpetual flute, from The Book of Knowledge of Ingenious Mechanical Devices (1206). From Wikimedia Commons.

 

DiDIY workshop at ACCA

:::: Thursday, October 6th, 7.00pm @ ACCA ::::

  • Does making matter?
  • Can making promote change?
  • Can it foster creativity and entrepreneurship?
  • Does it just make you feel good?

This workshop is for anyone who uses digital tools to make or be creative: learning techniques from YouTube, sharing code on GitHub, using software to compose or perform music, making instruments with microcontrollers, sharing music on SoundCloud, or selling media on Bandcamp.

This is an active making workshop where you will use Lego and other simple materials to make some things, discuss your making practice, and share ideas and experiences in the company of other makers.

Led by David Gauntlett, author of Making is Connecting, the workshop is designed to be an enjoyable way to take time out and reflect on your own practice and the meanings of making. It will give you the chance to do some creative exercises, share with other makers and contribute your perspective to our research.

This workshop is FREE, but spaces are limited, so please sign up here, and if you can’t make it, let us know so we can release your ticket to someone else.

Sign-up here

Resonating Instruments Workshops

How do we compose for new instruments? Can we redesign old instruments for new music? What happens when familiar instruments gain new expressive scope?

These are questions we will be investigating in two workshops running as part of the ICLI conference (International Conference on Live Interfaces – http://www.liveinterfaces.org), to be held at the University of Sussex in Brighton on June 29th–30th.

Registration is now open for composers and performers to join the workshops in order to create pieces for two new musical instruments – the halldorophone and the magnetic resonator piano – and perform at the opening of the ICLI conference in the Attenborough Centre for the Creative Arts.

The workshops will engage with questions of legacy, tradition, notation, performance and improvisation. They will deal with how instruments frame our compositional thoughts, provoking questions as to what happens when familiar instruments gain new sonic and performance capabilities.

Further information on the workshops can be found on the ICLI website: http://www.liveinterfaces.org/#workshops

If you are interested in joining one of these two-day workshops, please fill in the registration form here:

Halldorophone – http://goo.gl/forms/iszthAUWb2
Magnetic Resonator Piano – http://goo.gl/forms/rGZfVmWesn

We will confirm places by June 17th. We are using this application process because places are limited and we wish to ensure a good balance of participants. Each workshop costs £50, which includes lunch on both days.

In order to facilitate participation by creative musicians outside academia, these workshops are open to people who are not attending the rest of the ICLI conference.

The ICLI team
www.liveinterfaces.org

ICLI registration open

3rd International Conference on Live Interfaces
School of Media, Film and Music – University of Sussex, Brighton.

Dates: June 28th – July 2nd, 2016.

Website: www.liveinterfaces.org

Registration is now open for the International Conference on Live Interfaces in Brighton this summer. This biennial conference brings together people working with live interfaces in the performing arts, including music, the visual arts, dance, puppetry, robotics or games. The conference scope is highly interdisciplinary but with a focus on interface technologies of expression in the area of performance. Topics of liveness, immediacy, presence (and tele-presence), mediation, collaboration and timing or flow are engaged with and questioned in order to gain a deeper understanding of the role contemporary media technologies play in human expression.

The conference consists of paper presentations, performances, interactive installations, poster demonstrations, a doctoral colloquium and workshops. Works engaging with the principles and assumptions governing interaction design, including perspectives from art, philosophy, product design and engineering, are especially invited.

Keynotes:
- Stuart Nolan (magician)
- Kristina Andersen (instrument maker)
- Roman Paska (puppeteer)

Workshops:
- Magnetic Resonator Piano (Andrew McPherson)
- The halldorophone (Halldor Ulfarsson)
- Sound and Space: Performing Music for Organ and Electronics (Lauren Redhead and Alistair Zaldua)
- A Practical and Theoretical Introduction to Chaotic Musical Systems (Tristan Clutterbuck, Tom Mudd and Dario Sanfilippo)
- Making High-Performance Embedded Instruments with Bela and Pure Data (Giulio Moro, Astrid Bin, Robert Jack, Christian Heinrichs and Andrew McPherson)
- Distributed Agency in Performance (Paul Stapleton, Simon Waters, Owen Green and Nicholas Ward)
- Interfacing the Txalaparta Workshop (Enrike Hurtado)

The conference, including performances and installations, will take place at the newly renovated Attenborough Centre for the Creative Arts. We will publish the final performance, installation and paper programme closer to the time.

There is a doctoral colloquium on Wednesday 29th of June and a Brighton Modular Meet on Sunday, 3rd of July.

We look forward to welcoming you to the University of Sussex for the ICLI conference this summer.

 

Sussex SuperCollider Users Group is back (February – March 2016)

 

 

Hi everyone, we are very excited to announce that Sussex SCUG is back!!!

 

When is it?

The Sussex SCUG meetings will take place again on Wednesdays, between 6pm and 9pm, on the following dates:

17th February
2nd March
16th March

 

Where is it?

Digital Humanities Lab
Silverstone building
University of Sussex
Falmer

For further information about the Sussex SuperCollider Users Group (Sussex SCUG), please click here.

 

Sussex SuperCollider Users Group

Who is SCUG for?

Are you a musician interested in making digital music? Are you a computer programmer who would like to make music using concepts you may already be familiar with? Have you ever started using SuperCollider but found it difficult due to lack of support? Are you an experienced SuperCollider user looking for constructive criticism of your work? If any of these describe you, or if you are just curious about what SuperCollider is and you live in the Brighton & Hove area, the Sussex SuperCollider Users Group (SCUG) is the place to be. It is, in short, for anybody who is, or thinks they might be, interested in SuperCollider.

 

What is SuperCollider?

SuperCollider is one of the most efficient and best-sounding audio programming environments available today. It is used by musicians, artists and scientists for all kinds of work with sound, from generative music, instrument building, sound installations and sound analysis to the sonification of big data. SuperCollider is open source and free.
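To give a flavour of what working in SuperCollider looks like, here is a minimal sketch; the particular synth and its parameters are just an illustrative example, not SCUG material. It boots the audio server and plays a slowly drifting stereo sine drone.

    // Minimal SuperCollider sketch: boot the audio server, then play a
    // slowly drifting stereo sine drone. Evaluate the block in the IDE;
    // stop all sound with Cmd-. (macOS) or Ctrl-. (Windows/Linux).
    (
    s.waitForBoot {
        {
            // LFNoise1 wanders smoothly; exprange maps it to 200-800 Hz
            Pan2.ar(SinOsc.ar(LFNoise1.kr(0.2).exprange(200, 800)), 0, 0.1)
        }.play;
    };
    )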

 

What is the SCUG?

The SuperCollider Users Group is a community of SuperCollider users of all levels who meet up to share ideas, code and music. The events are organised by the Music Informatics & Performance Technology Lab and hosted by the Sussex Humanities Lab. The meetings are ideal for presenting work, discussing projects, collaborating and learning from each other. Most importantly, these sessions have a social aspect: coding, debugging and music-making happen while socialising with other SuperCollider users, perhaps over some drinks and snacks.

 

When is it?

This term’s SCUG meetings will take place on Wednesdays, between 6pm and 9pm, on the following dates:

21st October
4th November
18th November
2nd December

 

Where is it?

At the Sussex Humanities Lab
Silverstone building
University of Sussex
Falmer

 

What do I need to bring?

Your laptop
Headphones

 

Do I need to sign up for it?

No. At the moment SCUG has an open-door policy; anybody is welcome.

 

Contact

Thanos Polymeneas-Liontiris
email: A.Polymeneas-Liontiris@Sussex.ac.uk

 

Research Fellow in Digital Humanities/Digital Performance

The Sussex Humanities Lab, in collaboration with the School of Media, Film and Music at the University of Sussex, wishes to appoint a Research Fellow to a fixed-term (4-year) fellowship in Digital Technologies/Digital Performance. While based in the School of Media, Film and Music, the appointee will work across the Sussex Humanities Lab in collaboration with colleagues in History, Art History & Philosophy, Informatics, and Education and Social Work.

The ideal candidate will have a demonstrable track record of work in performance technologies as a theorist and/or creative practitioner, with clear evidence of technical expertise in all cases. Candidates with knowledge of one or more of the following are particularly encouraged to apply: creative software (e.g. SuperCollider, Max/MSP or Pure Data), app development, graphics and games programming (e.g. OpenGL, Unity), and physical computing.

Further information on Sussex Jobs

A PDF with the job description can be downloaded here: Research Fellow in Digital Humanities/Digital Performance 226

© 2017 Emute Lab @ Sussex
