Evostar 2016

The Leading European Event on Bio-Inspired Computation.

Porto, Portugal, 30 March - 1 April 2016

5th International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design

April 2016, Porto, Portugal
Part of evo* 2016
evo*: http://www.evostar.org

Following the success of previous events, and given the importance of the field of evolutionary and biologically inspired (artificial neural network, swarm, alife) music, sound, art and design, evomusart became an evo* conference with its own independent proceedings in 2012. Thus, evomusart 2016 is the fifth International Conference on Evolutionary and Biologically Inspired Music, Sound, Art and Design.

The use of biologically inspired techniques for the development of artistic systems is a recent, exciting and significant area of research. There is a growing interest in the application of these techniques in fields such as: visual art and music generation, analysis, and interpretation; sound synthesis; architecture; video; poetry; design; and other creative tasks.

The main goal of evomusart 2016 is to bring together researchers who are using biologically inspired computer techniques for artistic tasks, providing the opportunity to promote, present and discuss ongoing work in the area.

The event will be held in April 2016 in Porto, Portugal, as part of the evo* event.


Publication Details

Submissions will be rigorously reviewed for scientific and artistic merit. Accepted papers will be presented orally or as posters at the event and included in the evomusart proceedings, published by Springer Verlag in a dedicated volume of the Lecture Notes in Computer Science series. The acceptance rate at evomusart 2015 was 27.9% for papers accepted for oral presentation, and 25.6% for poster presentation.

Submitters are strongly encouraged to provide in all papers a link for download of media demonstrating their results, whether music, images, video, or other media types. Links should be anonymised for double-blind review, e.g. using a URL shortening service.

Topics of interest

Submissions should concern the use of biologically inspired computer techniques -- e.g. Evolutionary Computation, Artificial Life, Artificial Neural Networks, Swarm Intelligence, other artificial intelligence techniques -- in the generation, analysis and interpretation of art, music, design, architecture and other artistic fields. Topics of interest include, but are not limited to:

Generation

  • Biologically Inspired Design and Art -- Systems that create drawings, images, animations, sculptures, poetry, text, designs, webpages, buildings, etc.;
  • Biologically Inspired Sound and Music -- Systems that create musical pieces, sounds, instruments, voices, sound effects, sound analysis, etc.;
  • Robotic-Based Evolutionary Art and Music;
  • Other related artificial intelligence or generative techniques in the fields of Computer Music, Computer Art, etc.;

Theory

  • Computational Aesthetics, Experimental Aesthetics; Emotional Response, Surprise, Novelty;
  • Representation techniques;
  • Surveys of the current state-of-the-art in the area; identification of weaknesses and strengths; comparative analysis and classification;
  • Validation methodologies;
  • Studies on the applicability of these techniques to related areas;
  • New models designed to promote the creative potential of biologically inspired computation;

Computer-Aided Creativity and Computational Creativity

  • Systems in which biologically inspired computation is used to promote the creativity of a human user;
  • New ways of integrating the user in the evolutionary cycle;
  • Analysis and evaluation of: the artistic potential of biologically inspired art and music; the artistic processes inherent to these approaches; the resulting artefacts;
  • Collaborative distributed artificial art environments;

Automation

  • Techniques for automatic fitness assignment;
  • Systems in which an analysis or interpretation of the artworks is used in conjunction with biologically inspired techniques to produce novel objects;
  • Systems that resort to biologically inspired computation to perform the analysis of image, music, sound, sculpture, or some other types of artistic object.

Additional information and submission details

Submit your manuscript, at most 16 A4 pages long, in Springer LNCS format (instructions downloadable from http://www.springer.com/computer/lncs?SGWID=0-164-6-793341-0).

Submission link: http://myreview.csregistry.org/evomusart16/
page limit: 16 pages

The reviewing process will be double-blind; please omit information about the authors in the submitted paper.

Programme committee

  • Pedro Abreu, University of Coimbra, Portugal
  • Dan Ashlock, University of Guelph, Canada
  • Peter Bentley, University College London, UK
  • Eleonora Bilotta, University of Calabria, Italy
  • Tim Blackwell, Goldsmiths College, University of London, UK
  • Andrew Brown, Griffith University, Australia
  • Adrian Carballal, University of A Coruna, Spain
  • Amilcar Cardoso, University of Coimbra, Portugal
  • Vic Ciesielski, RMIT, Australia
  • Pedro Cruz, University of Coimbra, Portugal
  • Palle Dahlstedt, Göteborg University, Sweden
  • Eelco den Heijer, Vrije Universiteit Amsterdam, Netherlands
  • Alan Dorin, Monash University, Australia
  • Jonathan E. Rowe, University of Birmingham, UK
  • Arne Eigenfeldt, Simon Fraser University, Canada
  • Jonathan Eisenmann, Ohio State University, USA
  • José Fornari, NICS/Unicamp, Brazil
  • Marcelo Freitas Caetano, INESC TEC, Portugal
  • Philip Galanter, Texas A&M College of Architecture, USA
  • Pablo Gervás, Universidad Complutense de Madrid, Spain
  • Andrew Gildfind, Google, Inc., Australia
  • Gary Greenfield, University of Richmond, USA
  • Carlos Grilo, Instituto Politécnico de Leiria, Portugal
  • Andrew Horner, Hong Kong University of Science & Technology, Hong Kong
  • Takashi Ikegami, University of Tokyo, Japan
  • Christian Jacob, University of Calgary, Canada
  • Patrick Janssen, National University of Singapore, Singapore
  • Colin Johnson, University of Kent, UK
  • Daniel Jones, Goldsmiths College, University of London, UK
  • Amy K. Hoover, University of Central Florida, USA
  • Maximos Kaliakatsos-Papakostas, University of Patras, Greece
  • Hernán Kerlleñevich, National University of Quilmes, Argentina
  • Matthew Lewis, Ohio State University, USA
  • Yang Li, University of Science and Technology Beijing, China
  • Antonios Liapis, IT University of Copenhagen, Denmark
  • Alain Lioret, Paris 8 University, France
  • Roisin Loughran, University College Dublin, Ireland
  • Penousal Machado, University of Coimbra, Portugal
  • Roger Malina, International Society for the Arts, Sciences and Technology, USA
  • Bill Manaris, College of Charleston, USA
  • Jon McCormack, Monash University, Australia
  • Marcos Nadal, University of Vienna, Austria
  • Gary Nelson, Oberlin College, USA
  • Michael O’Neill, University College Dublin, Ireland
  • Somnuk Phon-Amnuaisuk, Brunei Institute of Technology, Brunei
  • Jane Prophet, City University, Hong Kong, China
  • Kate Reed, Imperial College, UK
  • Douglas Repetto, Columbia University, USA
  • Juan Romero, University of A Coruna, Spain
  • Brian Ross, Brock University, Canada
  • Antonino Santos, University of A Coruna, Spain
  • Daniel Silva, University of Porto, Portugal
  • Benjamin Smith, Indiana University-Purdue University Indianapolis, USA
  • Stephen Todd, IBM, UK
  • Paulo Urbano, Universidade de Lisboa, Portugal
  • Anna Ursyn, University of Northern Colorado, USA
  • Dan Ventura, Brigham Young University, USA
  • Anna Jordanous, University of Kent, UK

EvoMUSART Conference chairs

Colin Johnson
University of Kent, UK
c.g.johnson(at)kent.ac.uk

Vic Ciesielski
RMIT University, Australia
vic.ciesielski(at)rmit.edu.au

EvoMUSART Publication chair

João Correia
University of Coimbra, Portugal
jncor(at)dei.uc.pt


EvoMUSART Abstracts

Computer-Aided Musical Orchestration Using an Artificial Immune System
José Abreu, Marcelo Caetano and Rui Penha
The aim of computer-aided musical orchestration is to find a combination of musical instrument sounds that approximates a target sound. The difficulty arises from the complexity of timbre perception and the combinatorial explosion of all possible instrument mixtures. The estimation of perceptual similarities between sounds requires a model capable of capturing the multidimensional perception of timbre, among other perceptual qualities of sounds. In this work, we use an artificial immune system (AIS) called opt-aiNet to search for combinations of musical instrument sounds that minimize the distance to a target sound encoded in a fitness function. Opt-aiNet is capable of finding multiple solutions in parallel while preserving diversity, proposing alternative orchestrations for the same target sound that are different among themselves. We performed a listening test to evaluate the subjective similarity and diversity of the orchestrations.
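
As a hedged illustration of the underlying search problem (not the authors' opt-aiNet implementation), the sketch below scores a candidate mixture of instrument sounds by the distance between its combined spectrum and a target spectrum; the instrument spectra, target and mixture weights are placeholder data.

    import numpy as np

    def mixture_spectrum(instrument_spectra, weights):
        """Weighted sum of individual instrument magnitude spectra."""
        return np.dot(weights, instrument_spectra)

    def fitness(weights, instrument_spectra, target_spectrum):
        """Higher is better: negative Euclidean distance between mixture and target."""
        return -np.linalg.norm(mixture_spectrum(instrument_spectra, weights) - target_spectrum)

    # Toy data: 5 instruments, 64 spectral bins (placeholder values only).
    rng = np.random.default_rng(0)
    spectra = rng.random((5, 64))
    target = rng.random(64)
    candidate = rng.random(5)      # mixture weights for one candidate orchestration
    print(fitness(candidate, spectra, target))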

TURBA Concert in 15 movements for 64 neural oscillators
Patxi Araujo
TURBA is a hybrid environment of artistic speculation that combines an electromechanical robotic device and its sonification with a network of 64 neural oscillators and the social context of collective behaviors. None of the actions generated by TURBA is arranged in advance: no sound, no movement, no pattern is deliberately produced. Quite the opposite: these elements come alive through their own processes within their own network structure.

Evolving Atomic Aesthetics and Dynamics
Edward Davies, Phillip Tew, David Glowacki, Jim Smith and Thomas Mitchell
The depiction of atoms and molecules in scientific literature owes as much to the creative imagination of scientists as it does to scientific theory and experimentation. danceroom Spectroscopy (dS) is an interactive art/science project that explores this aesthetic dimension of scientific imagery, presenting a rigorous atomic simulation as an immersive and interactive installation. This paper introduces new methods based on interactive evolutionary computation which allow users - both individually and collaboratively - to explore the design space of dS and construct aesthetically engaging visual states. Pilot studies are presented in which the feasibility of this evolutionary approach is discussed and compared with the standard interface to the dS system. Still images of the resulting visual states are also included.

Augmenting Live Coding with Evolved Patterns
Simon Hickinbotham and Susan Stepney
We present a new system for integrating evolutionary processes with live coding. The system is built upon an existing platform called Extramuros, which facilitates network-based collaboration on live coding performances. Our evolutionary approach uses the Tidal live coding language within this platform. The system uses a grammar to parse code patterns and create random mutations that conform to the grammar, thus guaranteeing that the resulting pattern has the correct syntax. With these mutations available, we provide a facility to integrate them during a live performance. To achieve this, we added controls to the Extramuros web client that allow coders to select patterns for submission to the Tidal interpreter. The fitness of a pattern is updated implicitly by the way the coder uses the patterns. In this way, appropriate patterns are continuously generated and selected for throughout a performance. We present examples of performances and discuss the utility of this approach in live coding music.
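
For readers unfamiliar with grammar-constrained mutation, here is a minimal sketch of the general idea, using a toy pattern grammar rather than Tidal's actual grammar: a derivation tree is grown from the grammar and mutated by regrowing a random subtree from its own nonterminal, so every mutant remains syntactically valid.

    import random

    # A toy pattern grammar (not Tidal's real grammar): each nonterminal maps to
    # a list of possible expansions; lowercase strings are terminals.
    GRAMMAR = {
        "PATTERN": [["EVENT"], ["EVENT", "PATTERN"]],
        "EVENT":   [["bd"], ["sn"], ["hh"], ["~"], ["[", "PATTERN", "]"]],
    }

    def generate(symbol, depth=0):
        """Expand a symbol into a derivation tree, bounded to avoid runaway recursion."""
        if symbol not in GRAMMAR:
            return symbol                          # terminal
        options = GRAMMAR[symbol]
        if depth > 4:                              # force the shortest expansion when deep
            options = [min(options, key=len)]
        expansion = random.choice(options)
        return (symbol, [generate(s, depth + 1) for s in expansion])

    def flatten(tree):
        """Turn a derivation tree back into a space-separated pattern string."""
        if isinstance(tree, str):
            return tree
        return " ".join(flatten(child) for child in tree[1])

    def mutate(tree):
        """Regrow one random nonterminal subtree from its own symbol, so the mutant
        is guaranteed to conform to the grammar."""
        if isinstance(tree, str):
            return tree
        symbol, children = tree
        if random.random() < 0.3:
            return generate(symbol)
        children = list(children)
        i = random.randrange(len(children))
        children[i] = mutate(children[i])
        return (symbol, children)

    original = generate("PATTERN")
    print("original:", flatten(original))
    print("mutant:  ", flatten(mutate(original)))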

Towards Adaptive Evolutionary Architecture
Sebastian Hölt Bak, Nina Rask, Sebastian Risi
This paper presents first results from an interdisciplinary project in which the fields of architecture, philosophy and artificial life are combined to explore possible futures of architecture. Through an interactive evolutionary installation called EvoCurtain, we investigate aspects of how living in the future could occur if built spaces could evolve and adapt alongside their inhabitants. As such, the present study explores the interdisciplinary possibilities of utilizing computational power to co-create with users and generate designs based on human input. We argue that this could lead to the development of designs tailored to the individual preferences of inhabitants, changing the roles of architects and designers entirely. 'Architecture-as-it-could-be' is a philosophical approach conducted through artistic methods to anticipate the technological futures of human-centered development within architecture.

Plecto: A Low-level Interactive Genetic Algorithm for the Evolution of Audio
Steffan Ianigro and Oliver Bown
The creative potential of Genetic Algorithms (GAs) has been explored by many musicians who attempt to harness the unbound possibilities for creative search evident in nature. Within this paper, we investigate the possibility of using Continuous Time Recurrent Neural Networks (CTRNNs) as an evolvable low-level audio synthesis structure, affording users access to a vast creative search space of audio possibilities. Specifically, we explore some initial GA designs through the development of Plecto (see www.plecto.io), a creative tool that evolves CTRNNs for the discovery of audio. We have found that the evolution of CTRNNs offers some interesting prospects for audio exploration and present some design considerations for the implementation of such a system.
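
The CTRNN itself follows the standard continuous-time recurrent neural network equation; the sketch below uses assumed parameter ranges rather than Plecto's actual encoding, Euler-integrates a small random network, and reads one neuron as an audio signal.

    import numpy as np

    def ctrnn_step(y, W, tau, theta, I, dt):
        """One Euler step of the standard CTRNN equation:
        tau_i * dy_i/dt = -y_i + sum_j W_ij * sigmoid(y_j + theta_j) + I_i"""
        activation = 1.0 / (1.0 + np.exp(-(y + theta)))
        dydt = (-y + W @ activation + I) / tau
        return y + dt * dydt

    # Toy 3-neuron network with placeholder (random) parameters; an evolved network
    # would instead have W, tau and theta set by the genetic algorithm.
    rng = np.random.default_rng(1)
    n = 3
    W = rng.uniform(-5, 5, (n, n))
    tau = rng.uniform(0.01, 1.0, n)
    theta = rng.uniform(-1, 1, n)
    y = np.zeros(n)

    samples = []
    for _ in range(44100):                     # one second at 44.1 kHz
        y = ctrnn_step(y, W, tau, theta, I=np.zeros(n), dt=1.0 / 44100)
        samples.append(np.tanh(y[0]))          # read one neuron as the audio signal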

Correlation Between Human Aesthetic Judgement and Spatial Complexity Measure
Mohammad Ali Javaheri Javid, Tim Blackwell, Robert Zimmer, Mohammad Majid al-Rifaie
The quantitative evaluation of order and complexity conforming with human intuitive perception has been at the core of computational notions of aesthetics. Informational theories of aesthetics have taken advantage of entropy in measuring the order and complexity of stimuli in relation to their aesthetic value. However, entropy fails to discriminate structurally different patterns in a 2D plane. This paper investigates a computational measure of complexity, which is then compared to results from a previous experimental study on human aesthetic perception in the visual domain. The model is based on the information gain from specifying the spatial distribution of pixels and their uniformity and non-uniformity in an image. The results of the experiments demonstrate the presence of correlations between the spatial complexity measure and the way in which humans are believed to aesthetically appreciate asymmetry. However, the experiments failed to provide a significant correlation between the measure and aesthetic judgements of symmetrical images.
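
A simplified stand-in for such a measure (not the paper's exact definition) is the mutual information between horizontally adjacent pixels, which is near zero for unstructured noise and high for strongly structured images:

    import numpy as np

    def information_gain(image, levels=16):
        """Illustrative spatial-complexity score: how much a pixel's value tells us
        about its right-hand neighbour, i.e. H(neighbour) - H(neighbour | pixel)."""
        q = np.clip(np.floor(image.astype(float) / (256 / levels)).astype(int), 0, levels - 1)
        left, right = q[:, :-1].ravel(), q[:, 1:].ravel()
        joint = np.zeros((levels, levels))
        np.add.at(joint, (left, right), 1)
        joint /= joint.sum()
        p_left, p_right = joint.sum(axis=1), joint.sum(axis=0)

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        h_cond = entropy(joint.ravel()) - entropy(p_left)   # H(right | left)
        return entropy(p_right) - h_cond                    # mutual information

    rng = np.random.default_rng(2)
    noise = rng.integers(0, 256, (64, 64))          # unstructured image: gain near 0
    stripes = np.tile(np.arange(64) * 4, (64, 1))   # structured image: much higher gain
    print(information_gain(noise), information_gain(stripes))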

Exploring the Visual Styles of Arcade Game Assets
Antonios Liapis
This paper describes a method for evolving assets for video games based on their visual properties. Focusing on assets for a space shooter game, a genotype consisting of turtle commands is transformed into a spaceship image composed of human-authored sprite components. Due to constraints on the final spaceships' plausibility, the paper investigates two-population constrained optimization and constrained novelty search methods. A sample of visual styles is tested, each a combination of visual metrics which primarily evaluate balance and shape complexity. Experiments with constrained optimization of a visual style demonstrate that a visually consistent set of spaceships can be generated, while experiments with constrained novelty search demonstrate that several distinct visual styles can be discovered by exploring along select, or all, visual dimensions.
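
To make the genotype idea concrete, here is a hedged sketch of a toy turtle-command interpreter; the command set, grid size and mirroring rule are assumptions for illustration, not the paper's actual encoding. It stamps numbered sprite components onto a bilaterally symmetric grid.

    def interpret(commands, size=16):
        """Walk a cursor over a grid, stamping component ids mirrored left/right."""
        grid = [[0] * size for _ in range(size)]
        x, y, heading = size // 2, size // 2, 0        # start at the centre, facing "up"
        moves = [(0, -1), (1, 0), (0, 1), (-1, 0)]     # up, right, down, left
        for cmd in commands:
            if cmd == "L":
                heading = (heading - 1) % 4
            elif cmd == "R":
                heading = (heading + 1) % 4
            elif cmd.isdigit():                        # stamp component number cmd here
                grid[y][x] = int(cmd)
                grid[y][size - 1 - x] = int(cmd)       # mirror for bilateral symmetry
            else:                                      # "F": move forward one cell
                dx, dy = moves[heading]
                x = max(0, min(size - 1, x + dx))
                y = max(0, min(size - 1, y + dy))
        return grid

    sprite = interpret("1FF2RFF3RFF2")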

Grammatical Music Composition with Dissimilarity Driven Hill Climbing
Róisín Loughran, James McDermott, Michael O'Neill
An algorithmic compositional system that uses hill climbing to create short melodies is presented. A context-free grammar maps each section of the resultant individual to a musical segment, resulting in a series of MIDI notes described by pitch and duration. The dissimilarity between each pair of segments is measured using a metric based on the pitch contour of the segments. Using a GUI, the user decides how many segments to include and how they are to be distanced from each other. The system performs a hill-climbing search using several mutation operators to create a population of segments at the desired distances from each other. A number of melodies composed by the system are presented that demonstrate the algorithm's ability to match the desired targets and the versatility created by the inclusion of the designed grammar.
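
A minimal sketch of this kind of search, with an assumed contour metric and mutation operator rather than the paper's exact ones, is a hill climber that accepts mutations only when the pairwise segment distances move closer to a user-chosen target:

    import random

    def contour_distance(a, b):
        """Dissimilarity of two equal-length segments from their pitch contours
        (the sign of successive pitch intervals)."""
        ca = [(y > x) - (y < x) for x, y in zip(a, a[1:])]
        cb = [(y > x) - (y < x) for x, y in zip(b, b[1:])]
        return sum(abs(u - v) for u, v in zip(ca, cb))

    def cost(segments, target):
        """How far the pairwise segment distances are from the desired target distance."""
        return sum(abs(contour_distance(segments[i], segments[j]) - target)
                   for i in range(len(segments)) for j in range(i + 1, len(segments)))

    def mutate(segment):
        """Nudge one pitch by a small interval, clamped to a MIDI range."""
        s = list(segment)
        i = random.randrange(len(s))
        s[i] = max(48, min(84, s[i] + random.choice([-2, -1, 1, 2])))
        return s

    # Hill climbing: accept a mutation of a random segment only if the cost does not rise.
    random.seed(3)
    segments = [[random.randint(60, 72) for _ in range(8)] for _ in range(4)]
    target = 6          # desired contour distance between every pair (assumed value)
    for _ in range(5000):
        i = random.randrange(len(segments))
        candidate = segments[:i] + [mutate(segments[i])] + segments[i + 1:]
        if cost(candidate, target) <= cost(segments, target):
            segments = candidate
    print(cost(segments, target))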

Animating Typescript Using Aesthetically Evolved Images
Ashley Mills
The genotypic functions from a priori aesthetically evolved images are mutated progressively and their phenotypes sequenced temporally to produce animated versions. The animated versions are mapped onto typeface and combined spatially to produce animated typescript. The output is then discussed with reference to computer-aided design and machine learning.

An Evolutionary Composer for Real-Time Background Music
Roberto De Prisco, Delfina Malandrino, Gianluca Zaccagnino, Rocco Zaccagnino
Systems for real-time composition of background music respond to changes in the environment by generating music that matches the current state of the environment and/or of the user. In this paper we propose one such system, called EvoBackMusic. EvoBackMusic is a multi-agent system that exploits a feed-forward neural network and a multi-objective genetic algorithm to produce background music. The neural network is trained to learn the preferences of the user, and such preferences are exploited by the genetic algorithm to compose the music. The composition process takes into account a set of controllers that describe several aspects of the environment, such as the dynamism of both the user and the context, other physical characteristics, and the emotional state of the user. Previous systems mainly focus on the emotional aspect. EvoBackMusic has been implemented in Java using Encog and JFugue, and it can be integrated into real and virtual environments. We have performed several tests to evaluate the system and we report their results; the tests aimed at analyzing the users' perception of the quality of the produced music compositions.

Iterative Brush Path Extraction Algorithm for Aiding Flock Brush Simulation of Stroke-based Painterly Rendering
Tieta Putri and Ramakrishnan Mukundan
Painterly algorithms form an important part of non-photorealistic rendering (NPR) techniques where the primary aim is to incorporate expressive and stylistic qualities in the output. Extraction, representation and analysis of brush stroke parameters are essential for mapping artistic styles in stroke based rendering (SBR) applications. In this paper, we present a novel iterative method for extracting brush stroke regions and paths for aiding a particle swarm based SBR process. The algorithm and its implementation aspects are discussed in detail. Experimental results are presented showing the painterly rendering of input images and the extracted brush paths.

A Comparison Between Representations for Evolving Images
Alessandro Re, Mauro Castelli and Leonardo Vanneschi
Evolving images using genetic programming is a complex task, and the representation of the solutions has an important impact on the performance of the system. In this paper, we present two novel representations for evolving images with genetic programming. Both representations are based on the idea of recursively partitioning the space of an image, which distinguishes them from the representations currently most used in the literature. The first representation partitions the space using rectangles, while the second partitions it using triangles. These two representations are compared to one of the most well-known and frequently used expression-based representations on five different test cases. The presented results clearly indicate the appropriateness of the proposed representations for evolving images. We also give experimental evidence that the proposed representations have higher locality than the expression-based representation used for comparison.
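
As an illustration of the recursive-partition idea (rectangles only, with an assumed genotype layout rather than the paper's), a partition tree can be rendered by repeatedly splitting a canvas and filling the leaves:

    from dataclasses import dataclass
    from typing import Optional
    import numpy as np

    @dataclass
    class Node:
        """Either a leaf with a grey value, or a vertical/horizontal split into two children."""
        value: float = 0.5                    # leaf colour in [0, 1]
        split: Optional[str] = None           # None, "v" or "h"
        at: float = 0.5                       # relative split position
        left: Optional["Node"] = None
        right: Optional["Node"] = None

    def render(node, canvas):
        """Paint a node onto a (h, w) float canvas slice."""
        if node.split is None:
            canvas[:, :] = node.value
            return
        h, w = canvas.shape
        if node.split == "v":
            cut = max(1, min(w - 1, int(node.at * w)))
            render(node.left, canvas[:, :cut])
            render(node.right, canvas[:, cut:])
        else:
            cut = max(1, min(h - 1, int(node.at * h)))
            render(node.left, canvas[:cut, :])
            render(node.right, canvas[cut:, :])

    # A tiny hand-built genotype: split vertically, then split the right half horizontally.
    genotype = Node(split="v", at=0.4,
                    left=Node(value=0.2),
                    right=Node(split="h", at=0.6,
                               left=Node(value=0.9),
                               right=Node(value=0.5)))
    image = np.zeros((128, 128))
    render(genotype, image)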

Evolving L-systems with Musical Notes
Ana Rodrigues, Ernesto Costa, Amílcar Cardoso, Penousal Machado and Tiago Cruz
Over the years researchers have been interested in devising computational approaches for music and image generation. Some of these approaches rely on generative rewriting systems such as L-systems. More recently, some authors have questioned the interplay of music and images, that is, how we can use one to drive the other. In this paper we present a new method for the algorithmic generation of images that are the result of a visual interpretation of an L-system. The main novelty of our approach is that the L-system itself is the result of an evolutionary process guided by musical elements. Musical notes are decomposed into elements -- pitch, duration and volume in the current implementation -- and each of them is mapped into corresponding parameters of the L-system -- currently line length, width, color and turning angle. We describe the architecture of our system, based on a multi-agent simulation environment, and show the results of some experiments that provide support for our approach.
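
A hedged sketch of the two ingredients, with illustrative mappings and rules rather than the paper's, might look like this: notes are mapped to drawing parameters, and the L-system string is expanded by parallel rewriting.

    def note_to_params(pitch, duration, volume):
        """Map a MIDI note to turtle-drawing parameters of the L-system interpreter
        (the ranges and formulas here are assumptions for illustration)."""
        return {
            "length": 2 + duration * 20,           # longer notes -> longer segments
            "width": 1 + volume / 32,              # louder notes -> thicker lines
            "angle": 15 + (pitch % 12) * 5,        # pitch class -> turning angle in degrees
        }

    def expand(axiom, rules, iterations):
        """Rewrite every symbol of the string in parallel, the usual L-system step."""
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(c, c) for c in s)
        return s

    rules = {"F": "F[+F]F[-F]F"}                   # a classic plant-like rule
    print(expand("F", rules, 3)[:60], "...")
    print(note_to_params(pitch=64, duration=0.5, volume=96))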

MetaCompose: A Compositional Evolutionary Music Composer
Marco Scirea, Julian Togelius, Peter Eklund and Sebastian Risi
This paper describes a compositional, extensible framework for music composition and a user study to systematically evaluate its core components. These components include a graph traversal-based chord sequence generator, a search-based melody generator and a pattern-based accompaniment generator. An important contribution of this paper is the melody generator which uses a novel evolutionary technique combining FI-2POP and multi-objective optimization. A participant-based evaluation overwhelmingly confirms that all current components of the framework combine effectively to create harmonious, pleasant and interesting compositions.
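
To illustrate just the first component, a chord sequence can be produced by a random traversal of a chord-transition graph; the graph below is an illustrative example, not MetaCompose's actual one.

    import random

    # Each chord lists the chords it may move to; a sequence is a random traversal.
    CHORD_GRAPH = {
        "C":  ["F", "G", "Am"],
        "F":  ["G", "Dm", "C"],
        "G":  ["C", "Am", "Em"],
        "Am": ["F", "Dm", "G"],
        "Dm": ["G", "Em"],
        "Em": ["Am", "F"],
    }

    def chord_sequence(start="C", length=8, seed=None):
        rng = random.Random(seed)
        sequence = [start]
        while len(sequence) < length:
            sequence.append(rng.choice(CHORD_GRAPH[sequence[-1]]))
        return sequence

    print(chord_sequence(seed=42))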

Fitness and Novelty in Evolutionary Art
Adriano Vinhas, Filipe Assunção, João Correia, Penousal Machado, Aniko Ekárt
In this paper the effects of introducing novelty search in evolutionary art are explored. Our algorithm combines fitness and novelty metrics to frame image evolution as a multi-objective optimisation problem, promoting the creation of images that are both suitable and diverse. The method is illustrated by using two evolutionary art engines for the evolution of figurative objects and context free design grammars. The results demonstrate the ability of the algorithm to obtain a larger set of fit images compared to traditional fitness-based evolution, regardless of the engine used.
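
A minimal sketch of combining the two objectives, with placeholder behaviour descriptors and an assumed fitness, computes each candidate's novelty as the mean distance to its nearest archived neighbours and keeps the non-dominated set over (fitness, novelty):

    import numpy as np

    def novelty(candidate, archive, k=5):
        """Novelty score: mean distance to the k nearest behaviours seen so far."""
        if len(archive) == 0:
            return float("inf")
        d = np.sort([np.linalg.norm(candidate - a) for a in archive])
        return float(np.mean(d[:k]))

    def non_dominated(points):
        """Indices of candidates not dominated on (fitness, novelty), both maximised."""
        keep = []
        for i, (f1, n1) in enumerate(points):
            dominated = any(f2 >= f1 and n2 >= n1 and (f2 > f1 or n2 > n1)
                            for j, (f2, n2) in enumerate(points) if j != i)
            if not dominated:
                keep.append(i)
        return keep

    # Toy population: each individual is a 2-D "behaviour descriptor" (placeholder data).
    rng = np.random.default_rng(4)
    population = rng.random((20, 2))
    archive = list(rng.random((50, 2)))
    fitness = -np.linalg.norm(population - 0.5, axis=1)   # assumed fitness: closeness to centre
    scores = [(fitness[i], novelty(population[i], archive)) for i in range(len(population))]
    print(non_dominated(scores))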

"Turingalila" Visual Music on the theme of Morphogenesis
Terry Trickett
Alan Turing’s paper ‘The Chemical Basis of Morphogenesis’, written in 1952, is a masterpiece of mathematical modelling which defines how self-regulated pattern formation occurs in the developing animal embryo. Its most revolutionary feature is the concept of ‘morphogens’, which are responsible for producing an almost limitless array of animal and fish markings. Turingalila, a piece of Visual Music, takes morphogenesis as its theme. The diversity of forms evident in my projected images is based on just two Turing patterns, which are ‘perturbed’ to reveal processes of self-organisation reminiscent of those found in nature. A live performance of Turingalila forms the focal point of my oral presentation. It is prefaced by an examination of how artistic potential has been unleashed by Turing’s biological insights and concludes with comments on how Turing’s ideas are exerting an ever-increasing impact in today’s world.
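
For readers curious about the underlying model, a standard way to produce Turing-like patterns is Gray-Scott reaction-diffusion; the sketch below uses common textbook parameters and is not the implementation behind the piece.

    import numpy as np

    def laplacian(Z):
        """Discrete Laplacian with periodic boundaries."""
        return (np.roll(Z, 1, 0) + np.roll(Z, -1, 0) +
                np.roll(Z, 1, 1) + np.roll(Z, -1, 1) - 4 * Z)

    # Gray-Scott reaction-diffusion with common textbook parameter values.
    Du, Dv, F, k, dt = 0.16, 0.08, 0.035, 0.065, 1.0
    n = 128
    U = np.ones((n, n))
    V = np.zeros((n, n))
    U[60:68, 60:68], V[60:68, 60:68] = 0.5, 0.25      # small perturbation to seed the pattern

    for _ in range(5000):
        uvv = U * V * V
        U += dt * (Du * laplacian(U) - uvv + F * (1 - U))
        V += dt * (Dv * laplacian(V) + uvv - (F + k) * V)
    # U now contains a spotted/striped Turing-like pattern.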

Important dates:

Submission Deadline: 1 November 2015
EXTENDED DEADLINE: 11 November 2015
(site remains open for final changes until 15 Nov)
Notification: 4 January 2016
Camera-ready: 18 January 2016
Mandatory registration per paper: 15 February
Student bursary deadline: 20 February
Early registration discount: 25 February 2016
Registration deadline: 24 March
EvoStar dates: 30 March - 1 April 2016
