Visual Programming | Computer-Aided Music Composition
OM7 is the latest generation of OpenMusic (OM), a visual programming language based on Common Lisp, dedicated to computer-assisted music composition.
OpenMusic (OM) is a visual programming language based on Common Lisp. Visual programs are created by assembling and connecting icons representing Lisp functions and data structures, built-in control structures (e.g. loops), and other program constructs. OM may be used as a general-purpose visual programming language, and can reuse any existing Common Lisp code. At a more specialized level, a set of built-in tools and external libraries make it a powerful environment for music composition. Various classes implementing musical structures are provided, associated with graphical editors: common music notation, MIDI, OSC, 2D/3D curves, audio buffers, etc.
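Because OM boxes are ordinary Lisp functions, any plain Common Lisp definition can be used in a visual program. A minimal illustrative sketch (the function name and example values are hypothetical, not part of OM's API; midicents, where 6000 = middle C, are OM's usual pitch unit):

```lisp
;; A plain Common Lisp function: transpose a list of pitches
;; (expressed in midicents) by a given interval.
;; Once defined in the Lisp environment, a function like this
;; can be referenced as a box and connected in an OM patch.
(defun transpose-chord (midics interval)
  (mapcar (lambda (m) (+ m interval)) midics))

;; (transpose-chord '(6000 6400 6700) 200) => (6200 6600 6900)
```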
Download OM7 here:
What’s new in OM7?
OM7 brings a brand-new generation of tools and features in your favorite computer-assisted composition environment:
- New patching interfaces and environment with easier box inspection / display control / automatic alignment / connections / etc.
- No workspace to set up: open your documents and simply organize them in your usual file system.
- Interactive visualization of Lisp code corresponding to visual programs.
- A native implementation of the reactive mode for visual program execution.
- New loops. Embed iterative processes in standard patches. Use a collection of new collectors and memory utilities.
- A new set of interface components: list-selection, switch, button, slider, … / Lock your patch for faster interaction.
- A redesigned maquette / sequencer interface including dual program/tracks-based visualization, meta-programming tools, and reactive execution modes.
- Human-readable, easily editable text format for patches and other documents. Possibility to read and edit patches as text.
- New score editors, BPF/BPC editors, etc., with nicer display and easier editing.
- Collection: a versatile container handling the storage, visualization, and editing of collections of objects.
- A time-based model for “executable” objects, including dynamic function execution and data send/transfer capabilities.
- Dynamic-memory allocated audio buffers (no need to store all your sounds in external files anymore).
- New MIDI-Track object / editor.
- A framework for handling OSC data and bundles.
OM7 can load most OM6-generated patches and programs. See how to import OM6 patches in the documentation pages.
Most OM6 external libraries are easily portable (or already ported to OM7). See how to create or adapt a library for OM7.
Do not hesitate to report any problems porting or converting libraries or patches on the OM7 user forum.
Note: OM7 documentation is still in its very early stages. The OM6 User Documentation can be useful for learning the basics of the OM visual programming workflow.
— See also this ICMC’17 paper for a quick overview.
Help / Bug reports / Community
A discussion group is hosted on Ircam Forumnet: https://discussion.forum.ircam.fr/c/om7
=> Create an account in order to post questions and replies.
Subscribe to group notifications using Watching / Tracking and other options.
Sources and Licensing
Source repository: https://github.com/openmusic-project/om7/
OpenMusic is free software distributed under the GPLv3 license. As a Common Lisp program, the environment can be considered an extension of Lisp that includes the specific built-in features of the application. It is also possible to compile, load, and run the OpenMusic sources in a Lisp environment, using an adequate compiler.
While the sources of OM7 are available under the GPL license, the application is developed with LispWorks 7.1.1, a commercial Lisp environment providing multiplatform support and graphical/GUI toolkits. A free (limited) edition of LispWorks 6 is available on the LispWorks website, but unfortunately no free version of LispWorks 7 exists at the moment.
In order to contribute to the code without a LispWorks license, one must therefore work with both the source package and an up-to-date released version of OM7 (which includes a Lisp interpreter).
– Unzip the libraries into a folder and specify this folder in the om7 Preferences/Libraries.
OM7-compatible external libraries:
Connection with external/DSP tools:
OM7 has been used to support research and production in a number of recent projects. See the related papers below:
- OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic. Anders Vinjar, Jean Bresson. Sound and Music Computing conference (SMC’19), Málaga, Spain, 2019.
- Musical Gesture Recognition Using Machine Learning and Audio Descriptors. Paul Best, Jean Bresson, Diemo Schwarz. International Conference on Content-Based Multimedia Indexing (CBMI’18), La Rochelle, France, 2018.
- From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition. Jean Bresson, Paul Best, Diemo Schwarz, Alireza Farhang. Workshop on Musical Metacreation (MUME2018), International Conference on Computational Creativity (ICCC’18), Salamanca, Spain, 2018.
- Symbolist: An Open Authoring Environment for End-user Symbolic Notation. Rama Gottfried, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’18), Montreal, Canada, 2018.
- Next-generation Computer-aided Composition Environment: A New Implementation of OpenMusic. Jean Bresson, Dimitri Bouche, Thibaut Carpentier, Diemo Schwarz, Jérémie Garcia. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Landschaften – Visualization, Control and Processing of Sounds in 3D Spaces. Savannah Agger, Jean Bresson, Thibaut Carpentier. International Computer Music Conference (ICMC’17), Shanghai, China, 2017.
- Timed Sequences: A Framework for Computer-Aided Composition with Temporal Structures. Jérémie Garcia, Dimitri Bouche, Jean Bresson. International Conference on Technologies for Music Notation and Representation (TENOR’17), A Coruña, Spain, 2017.
- Computer-aided Composition of Musical Processes. Dimitri Bouche, Jérôme Nika, Alex Chechile, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- Interactive-Compositional Authoring of Sound Spatialization. Jérémie Garcia, Thibaut Carpentier, Jean Bresson. Journal of New Music Research, 46(1), 2017.
- o.OM: Structured-Functional Communication between Computer Music Systems using OSC and Odot. Jean Bresson, John MacCallum, Adrian Freed. ACM SIGPLAN Workshop on Functional Art, Music, Modeling & Design (FARM’16), Nara, Japan, 2016.
- Towards Interactive Authoring Tools for Composing Spatialization. Jérémie Garcia, Jean Bresson, Thibaut Carpentier. IEEE 10th Symposium on 3D User Interfaces (3DUI), Arles, France, 2015.
See also some previous papers and resources on OM6.
Highlights / contributors / timeline of the project
The OM7 project was initiated by @j-bresson in 2013. Most of the code is written from scratch, but a significant part is inspired by or borrowed from the original OM sources and features, including the direct or indirect contributions of its authors (@CarlosAgon, @assayag) and contributors.
The initial objective of OM7 was to experiment with new visual Lisp programming features. Important developments were carried out during the EFFICACe research project conducted at IRCAM (2013-2017), which explored the relationships between computation, time, and interaction in computer-assisted music composition processes, focusing on topics such as dynamic temporal structures and the control, visualization, and interactive execution of sound synthesis and spatialization processes.
A new generation of tools and editors for the representation and manipulation of musical objects (scores, sounds, temporal data streams, controllers, etc.) is included, covering most operational areas of OpenMusic/computer-assisted composition processes.
The reactive model recently introduced in OpenMusic has been integrated as a native feature of OM7 and works seamlessly in the visual programming environment. @jeremie-garcia created a framework for timeline-based control of musical objects, and new tools for the representation of and interaction with spatial audio scenes (om-spat). @dimitribouche developed a dynamic scheduling architecture that was implemented and integrated as the main core for musical rendering and computation in OM7, as well as new interfaces for the temporal representation and organization of compositional processes (a new design of the OpenMusic maquette).
OM7 is compatible with the main OM6 features, libraries and objects, and embeds an automatic translation system for loading/converting OM6 patches.
@andersvi is maintaining the Linux distribution.