The New Atlantis project
New Atlantis is a shared (multi-user) online virtual world dedicated to audio experimentation and practice. Unlike most online worlds where image is the primary concern, in New Atlantis sound comes first.
New Atlantis provides a context for new-media students to showcase research projects that explore the relationship between sound, virtual 3D image and interactivity. It offers a pedagogical platform for audiographic animation, real-time sound synthesis, object sonification and acoustic simulation. It is a place to organize virtual sound installations, online concerts, soundwalks and other audio visual art experiences.
The name New Atlantis comes from the title of a utopian novel by the philosopher Francis Bacon, published in 1627 (ref: add link or extract), which describes a legendary island somewhere in the Atlantic Ocean, dotted with extraordinary audio phenomena that might be considered premonitory of today's electronic and digital audio techniques. We have adopted some of Bacon's ideas and nomenclature to create classes for the virtual world, such as “Sound Houses”, “Sound Pipes”, “Trunks” and “Helps”.
In New Atlantis all elements have (by default) audio qualities: spaces resonate, surfaces reflect and collisions activate the multiple sounds of the objects involved. In addition, we have created custom objects such as “Sound Trunks”, where visitors can leave a recording, and a voice object, whereby a visitor can detach his or her voice from navigation and use it to make a distant space resound.
A collection of purpose-built scripts implements low-level sound synthesis and multiple-parameter interactivity, enabling the creation of complex sound sources and environments linked to animation or navigation in the visual scene.
Francis Bacon’s New Atlantis is a model for the role of science and art in society which places education at the heart of culture. Our project emphasizes discovery, cultural exchange, experimentation, learning and furthering knowledge in an educational and creative environment.
New Atlantis can be accessed via a web viewer or as a standalone application. It is organized as scenes that can be accessed independently but that share the same basic principles of navigation and the same specific scripts. Being multi-user, it can be shared by several players at the same time, making it suitable for group playing in both the gaming and the musical sense of the word.
Every registered user can create and host individual or shared scenes which he or she can decide to make persistent or not.
At the time of writing, we are working on a limited number of public scenes that can contain multiple “Sound Houses” (architectural elements with specific acoustics). These can be visited by navigating through the scene; their audio, however, is not necessarily limited to the immediate proximity of the building. This means that sounds can be heard from a distance and mix together, so that the placing of, and navigation between, sound objects can be considered a musical experience. In these public scenes users can interact with one another and with shared objects.
Another special feature inspired by Bacon's text, “Sound Pipes”, allows the transmission of audio to and from a remote location.
The New Atlantis project is not only about creating a multi-user virtual universe, but also about making it together while learning. It is experimental, creative and educative. Each new opportunity to develop New Atlantis happens within workshops, courses or events gathering art school students from different places, working together at a distance. It is a way to initiate ubiquitous working groups of students for immaterial, non-local and shared art creation.
The origins of the New Atlantis project go back to 2005, when Locus Sonus (ESA Aix) and SAIC (School of the Art Institute of Chicago) were awarded FACE funding for an academic and research exchange program. ENSCI, through Roland Cahen, has been associated with the project since the early stages of development.
The first experiments in 3D audiography took place in Second Life, using Pure Data as an audio engine. The process involved sending HTML commands from Second Life to an external server, which did the audio synthesis and streamed the result back to Second Life. This system worked well enough to convince us that it was worthwhile pursuing the development of sophisticated audio features for virtual environments; however, it was difficult to implement and the delay due to streaming was problematic. The decision was made to build our own multi-user world using Panda3D with Pure Data bundled as an audio engine. This first version of New Atlantis was developed during multiple workshops that took place in Chicago and Aix-en-Provence between 2007 and 2011. The project involved a relatively complex path-finding system to calculate acoustics, and custom-built client/server software. The development process took a considerable amount of time and, although a working version was tested successfully during a workshop at ENSAB (2011), it was decided to abandon the system in favor of Unity3D, a more recent and efficient platform offering more scope for audio programming.
Peter Sinclair (Locus Sonus ESA-Aix)
Peter Gena (SAIC)
Roland Cahen (ENSCI)
Jonathan Tanant (JonLab): lead developer and software architect
Alexandre Amiel (ESA-Aix)
Marc Anderson (3D Graphics, SAIC)
Robb Drinkwater (Audio programming SAIC)
Michael Fox (3D Graphics, SAIC)
Jerome Joy (Audio, Locus Sonus)
Ben Chang (Programming, 3D Graphics)
Daan de Lange (ESA-Aix)
Théo Paolo (ESA-Aix)
Anne Roquigny (Administration/Coordination, Locus Sonus)
Julie Karsenty (Administration/Coordination, ESA-Aix)
Gonzague Defos de Rau
Technical details (how it works)
New Atlantis Network architecture specifications
New Atlantis uses Unity as the main platform (client/server) and a standard LAMP server for the backoffice and database.
A Space is an independent New Atlantis world.
It is not necessarily related to Bacon's Sound Houses concept: several Sound Houses could exist in one Space, or one Sound House could be split across several Spaces.
The absolute rule is that one Space is totally independent: nothing communicates with, exits to or enters from the outside.
An Object is a composite audio-graphic interactive 3D object created by a User (designer/artist/developer) in Unity and uploaded to a Space in the form of an Asset Bundle. Objects have qualities (especially audio capabilities in New Atlantis) and have information attached, so an object consists of:
● data: the actual asset bundle.
● state: the state of the object and its sub-objects in the current simulated Space (position, orientation, audio volume?).
A User is an account that belongs to a real human user.
● pseudo / login.
● owns Spaces or has write rights on Spaces.
Viewer app / Server - Main architecture overview
Data vs state
There is a distinction between an object’s data+initial state and its current state within a simulation.
If we take a simple ball as an example, the data will be the Asset Bundle and the initial position in the Space; the state will be the current position, orientation and speed within the current Space and the current simulation session.
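The split can be sketched as follows — a minimal illustration in Python with hypothetical class and field names (the actual implementation lives in Unity and PHP):

```python
# Hypothetical sketch: separating an Object's immutable data from its
# mutable simulation state, following the ball example above.
from dataclasses import dataclass

@dataclass(frozen=True)
class ObjectData:
    """What the web server stores: the asset reference and initial state."""
    asset_bundle_url: str   # where to download the Asset Bundle
    initial_position: tuple  # (x, y, z) starting position in the Space

@dataclass
class ObjectState:
    """What the Unity server simulates and syncs to clients."""
    position: tuple
    orientation: tuple = (0.0, 0.0, 0.0, 1.0)  # quaternion
    velocity: tuple = (0.0, 0.0, 0.0)

def spawn(data: ObjectData) -> ObjectState:
    """When a server starts a simulation, state is derived from data."""
    return ObjectState(position=data.initial_position)

ball_data = ObjectData("http://example.org/bundles/ball.unity3d", (0.0, 10.0, 0.0))
ball_state = spawn(ball_data)
# As the simulation runs, only the state changes; the data never does.
ball_state.position = (0.0, 9.5, 0.0)
ball_state.velocity = (0.0, -1.0, 0.0)
```

The frozen/mutable distinction mirrors the rule above: data is write-once on the web server, state lives and evolves only inside a running simulation session.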
The web server (LAMP: Linux, Apache, MySQL, PHP) is in charge of the Spaces, the user accounts and the objects (with their initial state) but does not run the simulation. Its role is only to give the initial state of a Space to a server and to provide all the needed data (Asset Bundles to download).
We will need the following features on this server:
● User account management.
● Space management (owned by users with invitations).
● Object import.
● Space description storage and retrieval (object list with state).
This should be done with a backoffice AND with a webservice.
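As a sketch, the webservice side might expose routes along these lines. This is a Python stand-in with purely illustrative endpoint names and JSON shapes — the real backoffice is PHP/MySQL, and a dict stands in for the database:

```python
# Hypothetical sketch of the webservice the viewer and Unity server call.
import json

# In the real system this would be MySQL; here a dict stands in.
DB = {
    "spaces": {"demo": {"owner": "alice", "objects": ["ball"]}},
    "objects": {"ball": {"bundle": "http://example.org/ball.unity3d",
                         "initial_state": {"position": [0, 10, 0]}}},
    "users": {"alice": {"spaces": ["demo"]}},
}

def handle(method, path):
    """Route a request to the features listed above (illustrative only)."""
    if method == "GET" and path.startswith("/space/"):
        # Space description retrieval: object list with initial state,
        # exactly what a starting Unity server needs.
        name = path.split("/")[2]
        space = DB["spaces"][name]
        objects = [{"id": oid, **DB["objects"][oid]} for oid in space["objects"]]
        return json.dumps({"space": name, "objects": objects})
    if method == "POST" and path == "/object":
        # Object import: register a new Asset Bundle (body omitted here).
        return json.dumps({"status": "created"})
    return json.dumps({"error": "not found"})
```

Note that the webservice only ever serves descriptions and download URLs; per the rule above, it never touches the running simulation.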
Unity server (green cube)
The Unity server is in charge of running the chosen Space simulation and synchronizing it with connected clients.
The server runs the same “New Atlantis” app as the clients but in “server mode” instead of “client mode”.
!!! At least one Unity server must be running for each Space.
This means that a server must be running in order for a Space to be persistent, and working in a Space requires at least one server to be running.
authoritative server scheme
We will be using the authoritative server scheme, meaning that the server runs the entire simulation and sends the objects' state to the clients, which run the scene without the logic (rendering only).
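The division of labor can be illustrated with a toy simulation (a Python sketch with invented names; in the actual project both roles are the same Unity app in different modes):

```python
# Minimal sketch of the authoritative server scheme: the server owns the
# simulation; clients only receive state and render it.
class Server:
    def __init__(self, objects):
        # object id -> (height, vertical velocity); a 1-D "falling cube" world
        self.objects = objects

    def step(self, dt=0.1, gravity=-9.8):
        """Run the logic: only the server ever mutates object state."""
        for oid, (y, vy) in self.objects.items():
            vy += gravity * dt
            self.objects[oid] = (max(0.0, y + vy * dt), vy)

    def snapshot(self):
        """The state update broadcast over the network to every client."""
        return dict(self.objects)

class Client:
    def __init__(self):
        self.objects = {}

    def receive(self, snapshot):
        # No logic on the client: just overwrite and render.
        self.objects = snapshot

server = Server({"cube": (10.0, 0.0)})
client = Client()
for _ in range(5):          # a few simulation ticks
    server.step()
    client.receive(server.snapshot())
```

After the ticks, the client's view matches the server's without the client having run any physics itself — which is why clients can never diverge from, or cheat, the simulation.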
Viewer / client
Features: interactions with the world, with a focus on audio.
To be defined ….
● Throw objects.
● Play a test sound (click).
● Hit objects.
Master server, Facilitator and NAT punchthrough
Anybody should be able to start a New Atlantis server.
The data server holds the actual data and the initial state. Once a server is started, it begins to simulate the behavior of objects and their state starts to evolve: a cube may fall or a ball might roll down a hill…
Thanks to the Unity “Master Server” and the “Facilitator” server, we should not have to worry about NAT addresses, firewalls, port forwarding and server publicity. More info here: http://docs.unity3d.com/Manual/net-MasterServer.html
Clients can ask the Master Server for the currently running servers (in the world), and servers can register themselves with this Master Server.
To start with, we can use the default Unity Master Server, located at 22.214.171.124, but the source code is available, so we will be able to set up our own in the future and put it where we want.
Of course, in the case of an event we could also bypass this and connect to a hard-coded IP, if we know that the server we are interested in is there.
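The registration/lookup round trip amounts to something like this (an illustrative Python model; in Unity the real calls are MasterServer.RegisterHost and MasterServer.RequestHostList):

```python
# Toy model of the Master Server round trip: servers register, clients query.
# All names here are ours, chosen to mirror the steps described above.
class MasterServer:
    def __init__(self):
        self.hosts = []

    def register(self, game_type, space, ip, port):
        """Called by a New Atlantis server when it starts."""
        self.hosts.append({"game_type": game_type, "space": space,
                           "ip": ip, "port": port})

    def poll_hosts(self, game_type):
        """Called by a client to list running servers for a game type."""
        return [h for h in self.hosts if h["game_type"] == game_type]

master = MasterServer()
master.register("NewAtlantis", "demo-space", "192.0.2.10", 25001)
hosts = master.poll_hosts("NewAtlantis")
```

In the real setup the Facilitator additionally brokers the NAT punchthrough between the two endpoints once a client has picked a host from this list.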
● SERVER SIDE :
● A computer is started in “server mode” on a given “Space” (the “Server”).
● The “Server” registers itself with the Master Server (as a “NewAtlantis” server running the chosen Space).
● The “Server” connects to the web server to get the list of objects (Asset Bundles) to load in this Space and to get their initial state.
● CLIENT SIDE :
● A computer is started in “client mode”. The client gets the list of running servers by connecting to the Master Server. The user is able to connect to any of these worlds.
● As soon as the user connects to a server, the app automatically downloads all the data and syncs with the server.
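Putting the two sides together, the boot order can be traced in a compressed sketch (function names and log messages are purely illustrative):

```python
# Hypothetical trace of the startup sequence above, showing the order of
# the network interactions between server, client and the two back ends.
log = []

def start_server(space):
    log.append(f"server: register '{space}' at Master Server")
    log.append("server: fetch object list and initial state from web server")
    return space

def start_client():
    log.append("client: poll Master Server for running servers")
    return ["demo"]  # servers found

def connect(choice):
    log.append(f"client: connect to '{choice}', download Asset Bundles")
    log.append("client: synced with server")

space = start_server("demo")
servers = start_client()
connect(servers[0])
```

The point of the ordering is that the web server is only contacted once, at server start-up, for data and initial state; everything afterwards is state synchronization between the Unity server and its clients.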