ImmerSphere aims to empower creation, to enable the overlay of “place” and “performance” in an immersive audio-visual experience that advances meaning. ImmerSphere taps into the exhilaration of augmented reality, transcending the boundaries of space and time. Creators use simple workflows to capture high-quality “performance” and “place” assets, then choose from an array of delivery formats to design stories from the building blocks. From a performer-hologram appearing in your own living room, to an AR Sphere-hopping journey, to Spheres unveiled across a real-world landscape, the permutations are manifold. Upload the elements, trace a throughline, instigate a scavenger hunt, invite us to fly.
Oni Buchanan is a poet, concert pianist, and the founder and director of the Ariel Artists music management company, as well as the founder and CEO of ImmerSphere. Having spent a lifetime pursuing interdisciplinary creation and forging collaborative networks of visionaries across disparate disciplines, Oni is uniquely positioned to run a software company that weaves together artistry, technology, interactivity, and multiple realities, and to fashion an infrastructure that uplifts creators.
Tina James is a science geek turned tech enthusiast. She manages Revyrie’s offshore facilities and its team of over 60, dedicated to building tech solutions for startups globally. Before Revyrie, she held leadership roles at a number of organizations, where she was instrumental in building virtual teams, setting up Technology Centers of Excellence, and improving process maturity. She has worked with several Fortune 500 companies, including Nike, Discover, and Adobe.
Matthew Neutra has been at the bleeding edge of immersive media, experience design, and digital signage. Serving as the Audio Experience Innovation Lead during his 20-year career at Bose Corporation, Matt created unique prototype experiences, bridging the gap between consumer electronics, entertainment, and technology. Matt holds a B.S. in Geology and an M.S. in Geo-Information Science/Remote Sensing, and worked on the precursor to Google Earth in the late 1990s.
Myna Joseph’s films MAN and Fit Model have screened at Cannes Directors’ Fortnight, Sundance, the New York Film Festival, SXSW, and Lincoln Center, with Fit Model in current release on The Criterion Channel. She was nominated for an Independent Spirit Award for Best First Screenplay for The Mend, which premiered at SXSW and was released to acclaim from The New York Times, The Guardian, The Village Voice, The Wall Street Journal, TIME magazine, and New York magazine, which named it one of the best American independent films of the year.
Marta Gospodarek is a PhD Candidate in the Music and Audio Research Lab (MARL) at New York University. Her research interests lie at the intersection of spatial audio, sound design, and psychoacoustics applied to Virtual and Mixed Reality. Currently, she is researching sound perception in AR environments at IRCAM (Paris). As a lead sound designer, Marta has worked on several large VR/AR projects, including Mary and the Monster (Cannes XR 2020).
Kristin Larson has over 25 years of experience, starting at Ernst & Young and holding the senior finance position at companies for the past 10+ years. She has worked primarily in small, venture-backed companies, including Blue Dolphin and Sensable Technologies. Her focus is to provide value to shareholders and the community. She graduated from the University of Richmond with a Bachelor’s degree in Accounting and is a licensed CPA.
Ted Werth is a seasoned executive who specializes in creating and executing high-growth venture strategies. He founded and grew an innovative national technology/services data platform, PlumChoice, from $0 to $50M. He mentors deep tech and high tech ventures through various programs, including MIT VMS and Mass Ventures, and has worked with 40+ companies using technology/data solutions in IoT, Digital Health, and service industries to redefine how markets operate.
Vickie Nauman is Founder and CEO of the LA-based boutique consulting and advisory firm CrossBorderWorks, where she’s worked with a portfolio of streaming platforms, games, apps, device manufacturers, start-ups, and industry consortiums since 2014. Nauman has expertise across all aspects of digital music – licensing, products, business development, music tech. Her experience spans the earliest days of disruption at MusicNet (RealNetworks) to KEXP Seattle, global platform 7digital, and connected device pioneer Sonos.
With over 25 years of experience building start-ups through to successful acquisition, Lee Jones helps rapid-growth companies scale by building relationships, developing technology and products, engaging new markets, and assembling effective teams. Previously, he served on the Austin Technology Incubator Success Committee and the Clean Energy Incubator Success Committee, and as an advisor to several portfolio companies. He currently serves as a board member or advisor for several early-stage ventures.
Ken Perlin directs the NYU Future Reality Lab, with research interests that include future reality, computer graphics, and user interfaces. He is chief scientist at Parallux and Tactonic Technologies, an advisor for High Fidelity and Croquet, and a Fellow of the National Academy of Inventors. Perlin received an Academy Award for Technical Achievement for his noise and turbulence procedural texturing algorithms, and originated the technique of shader languages, widely used in feature films, computer games, and simulations. He worked on the original TRON.
Agnieszka Roginska is a Professor of Music Technology at New York University. She conducts research in the simulation and applications of immersive and 3D audio, including the capture, analysis, and synthesis of auditory environments, auditory displays, and applications in augmented acoustic sensing. Agnieszka is a Fellow and Past President of the Audio Engineering Society (AES). She is the faculty sponsor of the Society for Women in TeCHnology (SWiTCH) at NYU.
The latest startup exploring whether augmented reality (AR) technology can do fun things for music is ImmerSphere. It’s a platform for all kinds of performance, music included, based around tappable ‘spheres’ through which people can watch those performances. The company will pay artists whenever their performances are watched within ImmerSphere’s app, at a fee that it claims is comparable to a physical live performance, rather than typical streaming or livestreaming payouts.
“I had already broadcast tiny holograms of performers onto my desk or kitchen table; I had walked through shimmering AR portals that allowed me to enter a 360 space; and I had experienced the magic of geospatially pinning a virtual object to a real-world set of coordinates,” is the pitch from founder and CEO Oni Buchanan.
“To create something specific to performance, I wanted to merge all three, placing a hologram of a performer inside a 360-degree space, and have that combination pinned to coordinates in the actual world.”
ImmerSphere Democratizes AR Access, Restores Creator Compensation, and Revolutionizes Virtual Performance
ImmerSphere is a new virtual realm where curators and performers can create novel combinations of musical virtuosity and inspiring settings.
Made up of Spheres (3D audio and video environments), ImmerSphere lets you experience a piano duet on the surface of the moon or witness an Irish folk jam under the sea. It might be a sunny day in Tuscany with music to match, a drumming session from a little-known corner of Accra, or an indie rock set from the back of a historic bar, with just you and one of the finest bands on the planet.
Each Sphere within ImmerSphere blends 3D sights and ambient sounds with a high-caliber performance. Visitors scroll through a selection of Spheres on their smartphones or tablets, and when a Sphere catches their eye, they tap on it. As their chosen world envelops them, their device becomes the window into this new environment, and the performers appear directly in front of them.
Focusing first on the performing arts community, ImmerSphere aims to empower creators and curators in the metaverse. Presenters can buy turnkey products, such as Walking Concerts, Encore Series, and AR Collectibles, to test the waters. Once they establish fluency, they can license the ImmerSphere platform itself and build out their own virtual performance venues and content collections, creating new ways for their communities to engage with artists, new formats for performance experiences, and new revenue streams.
The idea for ImmerSphere emerged two years ago, when VR/AR tech experienced significant breakthroughs, just as performers and arts centers were scrambling to find a path forward amid COVID. At the time, founder and CEO Oni Buchanan — published poet, concert pianist, and founder/director of the artist management company Ariel Artists — stared into the abyss of the pandemic, with its total collapse of in-person performances. And a new idea stared back.
“When concerts were canceled and all of us were isolated across the globe, I began to imagine a new combination of AR experiences that could reconnect artists with presenters and audiences,” Buchanan recounts. “I had already broadcast tiny holograms of performers onto my desk or kitchen table; I had walked through shimmering AR portals that allowed me to enter a 360 space; and I had experienced the magic of geospatially pinning a virtual object to a real-world set of coordinates. To create something specific to performance, I wanted to merge all three, placing a hologram of a performer inside a 360-degree space, and have that combination pinned to coordinates in the actual world. Listeners could undertake a journey corresponding to the performance program in some illuminating way.”
The artists on the Ariel Artists roster were up for the challenge. From their individual quarantines, they began to transform apartments into green screen studios and record some live takes of repertoire. Concert presenters, with their venues shuttered, were also up for experimenting with something new. They provided 360-degree photos of the inside of their concert halls, eager to try a new way of gathering to hear a performance. They nominated their communities’ most beloved performers to the platform, broadening the types of performance and connecting their artists to new creative opportunities and revenue. Performers themselves also began to harness the capacity of the 360 environment to contextualize their music, tell stories, and take their audiences on a journey through the Spheres.
Getting visitors and arts lovers on this journey is remarkably simple. They can jump into a Sphere using the most ubiquitous AR-capable devices on the planet, smartphones and tablets, a key democratizing feature for the platform. “Accessibility is huge for presenters, huge for fans, and huge for artists,” explains Buchanan. “The app is intuitive. You don’t need any complex equipment or hardware or lengthy tutorial to get started. Billions of people already have a metaverse portal in their pocket.”
The vision behind the Spheres reflects Buchanan’s deep understanding of what feeds creativity among artists, creators, and curators. From the outset, paying creators fairly for their content has been an established, non-negotiable component of ImmerSphere. Artists receive a fee for their performance each time it is broadcast in a Sphere. They are compensated at a level similar to what they might receive for an in-person performance, not what they might earn in livestream tips or streaming revenue. “Creator compensation should not be an afterthought,” states Buchanan. “We’ve built it into our model.”
ImmerSphere has also differentiated itself from other virtual platforms by including performing arts presenters as key advisors and beta testers. “We’ve built our approach thanks to partnerships with high-profile presenters,” Buchanan notes, “tapping into their curatorial expertise and long-cultivated communities of arts lovers who trust their vision.”
Thanks to this guidance, ImmerSphere is designed to function as a metaverse pied-à-terre for the performing arts. Presenters access a palette of presentation formats that enhance their core live offerings, enabling them to establish new series, present more diverse content, reach new audiences, and initiate new revenue streams. “The good thing about the metaverse is that it’s inspired arts organizations and presenters to get more creative in their digital programming,” Buchanan says. “We’ve laid the groundwork in the metaverse early, to support artists and presenters in a very deliberate way.”
Try out ImmerSphere’s prototype app for a glimpse of what’s to come. We recommend wearing headphones for the best audio experience.
Join our mailing list to learn how to become an ImmerSphere artist, how to partner with ImmerSphere at your venue, or to receive news about app updates and feature releases as they become available.