Does immersion yield empathy? This report addresses whether news media—in this instance, short-form journalistic stories—presented in a 360-degree video format affect a user’s empathetic response to the material. If so, what might the advantage of such a response be, and could it include improvement in a viewer’s ability to recall the content over time or a resulting behavioral change? Naturally, part of the study will deconstruct what we mean by such a nebulous term as “empathetic” and how exactly it can be measured.
The study both investigates whether particular audiences are likely to respond empathetically to certain narratives and analyzes the component parts of immersive experiences—comfort level, interactivity, and perceived amount of user agency—that contribute to producing an empathetic response. It also asks whether the virtual reality (VR) format is better suited to particular stories or audiences, and whether this embryonic medium can produce unintended, antithetical effects, such as a user’s perception of personal space invasion or the feeling that they are not looking in the right direction.1
Our study of 180 people viewing five-minute treatments, composed of either 360-degree video or text articles, monitored user reactions and their sense of immersion on the day of their first exposure to the narrative treatment, as well as two and five weeks later.
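The design described above is a repeated-measures comparison: two treatment groups, each measured at three timepoints. A minimal sketch of how such data might be tabulated is shown below; the group sizes, score scale, and every recall value here are illustrative placeholders, not the study’s actual data.

```python
# Hypothetical sketch of the study's repeated-measures design:
# 180 participants split between two treatments (360-degree video
# vs. text article), with recall measured at exposure, week 2, and
# week 5. All scores are invented placeholders for illustration.
from statistics import mean

TIMEPOINTS = ["day_0", "week_2", "week_5"]

# One record per participant: treatment group plus recall scores (0-10).
# An even 90/90 split is assumed purely for this sketch.
participants = (
    [{"group": "360_video", "recall": {"day_0": 8, "week_2": 7, "week_5": 6}}] * 90
    + [{"group": "text", "recall": {"day_0": 8, "week_2": 6, "week_5": 4}}] * 90
)

def mean_recall_by_group(records):
    """Average recall per treatment group at each timepoint."""
    scores = {}
    for rec in records:
        scores.setdefault(rec["group"], {t: [] for t in TIMEPOINTS})
        for t in TIMEPOINTS:
            scores[rec["group"]][t].append(rec["recall"][t])
    return {g: {t: mean(v) for t, v in ts.items()} for g, ts in scores.items()}

summary = mean_recall_by_group(participants)
```

With real data, the per-group means at week 2 and week 5 would show whether recall decays more slowly for one format than the other, which is the comparison the study’s follow-up sessions are designed to make.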
One year after launching Europe’s first and biggest VR space, MK2 is set to start commercializing VR plug-and-play equipment for global exhibitors who are looking to launch VR venues in theaters, museums, and institutions, among other places.
Called the MK2 VR Pod, the equipment, which MK2 will distribute worldwide, is an “end-to-end premium and white-labeled product line,” the company said.
I recently got a private tour of a NASA space shuttle’s cockpit, a quirky mosaic-covered LA home, and a peaceful chapel with light streaming through ornate stained-glass windows—all without leaving my chair.
That chair was in an office at Google’s Silicon Valley headquarters, and I was wearing an HTC Vive virtual-reality headset on my face. But because these places were filmed with a high-resolution prototype camera that reproduces some of the key cues we use to understand depth in the real world, it felt more like actually being there than anything I’ve experienced with any other live-action VR. Which is to say it was pretty damn cool.
I could peer around the seats in the space shuttle Discovery, revealing buttons and switches on the walls of the cockpit that were previously obscured. As I looked closely at mirrored bits of tile on the outside of the mosaic house, I glimpsed reflections of other tiles in the background and saw a dizzying display of shapes and patterns. In the chapel, I gazed at the floor, and the colorful sunbeams moved as I did.
Travis Cloyd started making virtual reality films three years ago, and he has made four of them so far. That record was enough to earn him the award for VR Visionary at the Cinequest film festival in San Jose, California. VR is such a young medium that Hollywood storytellers like Cloyd are still experimenting with how to make these films.
At Cinequest, Cloyd showed off his VR film that promotes the new Nicolas Cage movie, The Humanity Bureau. It’s a dystopian science fiction thriller set in the year 2030, when a massive recession and global warming catastrophe force society to get rid of its unproductive members. The 2D film debuts on April 6.
Cloyd has also created VR promo films for John Travolta’s upcoming Speed Kills and the Wesley Snipes film The Recall. At Cinequest, Cloyd won Best VR Feature Film for Speed Kills and Best Sci-Fi VR Film for The Humanity Bureau.
With those VR films, Cloyd had to work around the schedule and production for the 2D films, and he also had to figure out how to place the cameras so they captured 360-degree action — without making viewers seasick.
To make in-game characters more lifelike, virtual reality (VR) developers use motion capture technology from the likes of IKinema or Vicon. Today, independent motion capture studio Animatrik has announced a new partnership with Lifelike & Believable to enhance its digital character pipeline and create immersive real-time VR experiences.
Having delivered high-end motion capture for both feature films and video games, such as Spider-Man: Homecoming, Gears of War, Justice League, and the Oscar-winning immersive VR experience Carne y Arena, Animatrik will bolster its existing virtual production pipeline with the VR processes usually outsourced to videogame development studios.
David Attenborough is no stranger to innovation in the media. “I remember when I was only one of three people who could see colour TV,” the 91-year-old says. So when he was placed inside a 106-camera rig at Microsoft’s Seattle headquarters, it was nothing out of the ordinary. The purpose? To create a hologram version of Attenborough.
Using volumetric scanning – where multiple video feeds are compressed to create a 3D object – the presenter has been ported into virtual reality. Before Attenborough was flown out to the US, members of the production team collected some of his familiar blue shirts and wore them in front of the cameras to test the light setup. “It’s quite weird,” he says.
The Lawnmower Man was the first feature film to depict a new type of technology that enabled characters to explore synthetic, simulated worlds through an emerging new medium called virtual reality (VR). That was 1992.
Using head-mounted displays (HMD) the size of crash helmets and gloves with sensors, VR users were experiencing computer-generated environments and stories in new ways. Through the idea of “presence” – the feeling of actually being part of an artificially created place – any adventure was now possible, at least if the buzz was to be believed.
The film was a hit; now Hollywood studios were watching, waiting for the technology to mature to a point where they could actually create VR experiences for audiences. This idea of “convergence” between old and new media was a hot topic, and extremely attractive as a potential new revenue stream for studios.
In a year of slow sales for traditional movies, VR came out on top. So what happens now?
While everyone was busy complaining about slow sales at the 2018 Sundance Film Festival, something remarkable happened: The festival saw its first major VR acquisition. For a reported low-to-mid seven figures, CityLights bought the three-part VR series “Spheres,” directed by science-storytelling whiz Eliza McNitt, narrated by Jessica Chastain, and executive produced by Darren Aronofsky.
A few days later, in the first sale of a VR documentary at Sundance, Dogwoof acquired “Zikr: A Sufi Revival,” directed by Gabo Arora. “Zikr” is a 15-minute interactive VR experience that uses song and dance to transport four participants at a time into ecstatic Sufi dance rituals; in addition to location-based installations, the deal includes funding for an online version of the VR experience that allows multiple players to be networked at once.
The new Los Angeles office marks the first expansion for the Montreal-based company. The L.A.-based operation aims to further the company’s mission of providing VR producers, developers, and filmmakers with on-set and post-production audio solutions of the highest technical and creative standards, along with an end-to-end pipeline for 3D positional audio production, post-production, and delivery.
Pimax, a Chinese startup developing an 8K virtual reality headset, came to CES this year to show off its latest prototype, the fifth in the year since the first version was unveiled at last year’s show. After a successful Kickstarter campaign last year, in which Pimax raised more than $4.2 million and beat out even Oculus VR’s initial crowdfunding campaign, it’s working toward mass production of the consumer version, slated to come out later this year after an initial shipment to backers.