We spoke to the creators behind ØLali - the first Artificial Artist - who are innovating the way we connect with sound and art. ØLali takes live music and transforms it into evolving, audio-reactive visualisations, changing how we see, feel, and experience art. The living sculptures are mesmeric, creating a unique, multi-sensory, interactive musical experience. The very concept of the Artificial Artist could be replicated in any field – however, it is music where ØLali has started.
What is your vision for ØLali?
The vision of ØLali is to go beyond the sound. Paul and I are huge music and art lovers. We want to extend the reach of one work of art by creating another version in another medium, creating a symbiosis between what we hear and what we see. Everything is about our senses because you have the vibrations of people dancing around you and this visual that is directly responding to the music that you're hearing.
We want to create an experience with ØLali artworks that is accessible to both hearing and deaf people. I am a huge music lover, but I'm not a really good composer or musician, and for me, it was also a way to share my love for music and to show the beauty I see in it in a different way.
How do you see ØLali and the experiences you are creating being enhanced as technology evolves?
There are so many different technologies evolving that it is always interesting to wonder which ones we could implement. Currently we are working on a VR project, so people can move around the artwork and feel it, be close, be far, turn around it. It is all about the visualisation, because the 2D screen is limiting. What we love is being able to have something that people can interact with directly.
With ØLali we want to take one piece of music and bring a complex representation which is evolving and also unique, because of the specificity of generative technologies and randomness. If we could improve the way we transmit the image and the video, it would be really interesting.
All your outputs are really beautiful and abstract. Would you ever create a generative system which gave music more of a figurative face or something less abstract?
Yes, we are currently working on the second version of ØLali. In this version we are planning to integrate a lot of new technologies. Our idea is to take the lyrics of songs, or what people are saying about the song, and create something more figurative by bringing in images that match the mood or the story of the music. However, I think that our general aesthetic, these huge particle fields, will remain. Not always, I cannot say always, but for now it remains our signature and the main direction of the aesthetic side.
Who are your biggest artistic influences?
Musically, we listen to so many different genres of music. We used to be huge hip hop fans and only listen to hip hop. Then we discovered jazz. I also listen to a lot of underground sounds, like techno, of course. Also Amapiano, UK garage, TNB. So I would say that musically, the influences come from everywhere. But that is also the point of ØLali - being able to take any music and make it a visual.
What is your favourite genre of music or type of instrument to visualise?
When we create artworks, we have hundreds of parameters and variables that can be tweaked and rearranged to modify the way the particles react. So there is always a way to create something interesting, reactive, and beautiful. When you put drums in, the visual response to the audio activity is really obvious, which is sometimes really satisfying and mesmerising.
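To give a rough sense of what "parameters that react to the music" could mean in practice, here is a minimal, hypothetical sketch - not ØLali's actual software - that maps per-frame audio energy onto a few made-up particle controls. All the names (emission_rate, velocity, turbulence) and numbers are assumptions for illustration only.

```python
import numpy as np

# Toy, hypothetical mapping from an audio signal to particle parameters.
# None of these names come from ØLali's software; they only illustrate the
# general idea of driving a particle field with per-frame audio energy.

SR = 44_100          # sample rate (Hz)
FRAME = 1024         # samples per analysis frame

def rms_envelope(signal: np.ndarray, frame: int = FRAME) -> np.ndarray:
    """Per-frame RMS energy, a crude stand-in for 'how hard the drums hit'."""
    n_frames = len(signal) // frame
    frames = signal[: n_frames * frame].reshape(n_frames, frame)
    return np.sqrt((frames ** 2).mean(axis=1))

def particle_params(energy: float, low: float, high: float) -> dict:
    """Map normalised energy (0..1) onto a few invented particle controls."""
    e = np.clip((energy - low) / (high - low + 1e-9), 0.0, 1.0)
    return {
        "emission_rate": 100 + 900 * e,    # more particles on loud frames
        "velocity": 0.2 + 1.8 * e,         # faster motion on transients
        "turbulence": 0.1 + 0.5 * e ** 2,  # non-linear response to drum hits
    }

if __name__ == "__main__":
    # Synthetic test signal: a quiet tone with periodic noisy 'drum' bursts.
    t = np.linspace(0, 2.0, 2 * SR, endpoint=False)
    tone = 0.05 * np.sin(2 * np.pi * 220 * t)
    bursts = (np.sin(2 * np.pi * 2 * t) > 0.95) * np.random.randn(len(t)) * 0.5
    env = rms_envelope(tone + bursts)

    lo, hi = env.min(), env.max()
    for i, e in enumerate(env[:5]):
        print(f"frame {i}: {particle_params(e, lo, hi)}")
```

In a real system each of those controls would likely be one of the hundreds of parameters the artists mention, with the mapping curves hand-tuned per artwork so that drums, bass, or vocals each push the visuals in a different way.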
What is the big picture for ØLali?
We are always working and trying to enhance existing and new features. We want to be able to encourage all kinds of art. We have integrated dance through body tracking, and also poetry and our words. It is all about being able to connect things, to connect people. For example, my mother doesn't really like hip hop, you know, but maybe she would see one of the artworks and be able to, by loving the visual, understand what hip hop music could mean to someone. Or maybe it could resonate with her. So in terms of technology, I cannot answer. We want to constantly improve and create more interaction with the people, because we don't want AI to take the lead.
It is also about developing the technology. It's software, so one day we might want to make it available for other people to use. It is something we are working on. It takes a lot of time, but it would definitely be interesting to build a community of users who would try the technology and create their own interpretation of music and their own way to use this technology that we built.