Walking in another’s virtual shoes: Do 360-degree video news stories generate empathy in viewers? by Dan Archer and Katharina Finger

Does immersion yield empathy? The aim of this report is to address the question of whether news media—in this instance, short-form journalistic stories—presented in a 360-degree video format affects a user’s empathetic response to the material. If so, what might the advantage of such a response be, and could it include improvement in a viewer’s ability to recall the content over time or a resulting behavioral change? Naturally, part of the study will deconstruct what we mean by such a nebulous term as “empathetic” and how exactly it can be measured.

The study both investigates whether particular audiences are likely to respond empathetically to certain narratives and analyzes the component parts of immersive experiences—comfort level, interactivity, and perceived amount of user agency—that contribute to producing an empathetic response. It also aims to determine whether the virtual reality (VR) format is better suited to particular stories or audiences, as well as the potential for unintended, antithetical effects in this embryonic medium, such as a user's sense that their personal space is being invaded or the feeling that they are not looking in the right direction.1

Our study of 180 people viewing five-minute treatments, composed of either 360-degree video or text articles, monitored user reactions and their sense of immersion on the day of their first exposure to the narrative treatment, as well as two and five weeks later.

Read More:

How Travis Cloyd became a VR film visionary, by Dean Takahashi

Travis Cloyd started making virtual reality films three years ago, and he has made four of them so far. That record was enough to earn him the award for VR Visionary at the Cinequest Film Festival in San Jose, California. That's because VR is such a young medium, and Hollywood storytellers like Cloyd are still experimenting with how to make these films.

At Cinequest, Cloyd showed off his VR film that promotes the new Nicolas Cage movie, The Humanity Bureau. It's a dystopian science fiction thriller set in the year 2030, when a massive recession and a global warming catastrophe force society to get rid of its unproductive members. The 2D film debuts on April 6.

Cloyd has also created VR promo films for John Travolta’s upcoming Speed Kills and the Wesley Snipes film The Recall. At Cinequest, Cloyd won Best VR Feature Film for Speed Kills and Best Sci-Fi VR Film for The Humanity Bureau.

With those VR films, Cloyd had to work around the schedule and production for the 2D films, and he also had to figure out how to place the cameras so they captured 360-degree action — without making viewers seasick.

Read More:

Animatrik Partners With Lifelike & Believable To Create Immersive Real-Time VR Experiences, by Peter Graham

To make in-game characters more lifelike, virtual reality (VR) developers use motion capture technology from the likes of IKinema or Vicon. Today, independent motion capture studio Animatrik has announced a new partnership with Lifelike & Believable to enhance its digital character pipeline and create immersive real-time VR experiences.

Having delivered high-end motion capture for feature films and video games such as Spider-Man: Homecoming, Gears of War, and Justice League, as well as the Oscar-winning immersive VR experience Carne y Arena, Animatrik will bolster its existing virtual production pipeline with the VR processes usually outsourced to videogame development studios.

Read More:

Virtual Reality Finally Sold Big at Sundance: Here’s What It Means for the Future of the Marketplace, by Kim Voynar

In a year of slow sales for traditional movies, VR came out on top. So what happens now?

While everyone was busy complaining about slow sales at the 2018 Sundance Film Festival, something remarkable happened: the festival saw its first major VR acquisition. For a reported low-to-mid seven figures, CityLights bought the three-part VR series “Spheres,” directed by science-storytelling whiz Eliza McNitt, narrated by Jessica Chastain, and executive produced by Darren Aronofsky.

A few days later, in the first sale of a VR documentary at Sundance, Dogwoof acquired “Zikr: A Sufi Revival,” directed by Gabo Arora. “Zikr” is a 15-minute interactive VR experience that uses song and dance to transport four participants at a time into ecstatic Sufi dance rituals; in addition to location-based installations, the deal includes funding for an online version of the VR experience that allows multiple players to be networked at once.

Read More:

Headspace Studio Expands VR Sound Operations With New L.A. Office, by Peter Graham

Creative and technical sound services company Headspace Studio (Inside the Box of Kurios; Jurassic World: Apatosaurus; The People’s House: Inside the White House with Barack and Michelle Obama) has today announced the expansion of its virtual reality (VR) sound operations with the opening of a new studio in Los Angeles, California.

The new Los Angeles office marks the first expansion for the Montreal-based company. The L.A. operation aims to further the company's mission of providing VR producers, developers, and filmmakers with on-set and post-production audio solutions of the highest technical and creative standards, along with an end-to-end pipeline for 3D positional audio production, post-production, and delivery.

Read More:

Apple embraces VR video with 360-degree and 8K support for Final Cut Pro X, by Jon Martindale

Apple has firmly embraced 360-degree video with its latest update for Final Cut Pro X that lets users edit video especially for virtual reality. The update also introduces a number of additional features for modern video editing, including support for High Dynamic Range (HDR), 8K resolutions, and advanced color grading.

As much as we love fully immersive virtual reality games and experiences, it’s likely that 360-degree video will be the first point of contact many people have with VR. As a standard, VR video may also be the future of immersive media, in much the way that color imagery supplanted black and white. Apple is keeping itself at the forefront of modern content creation with its latest update for its premier video editing suite.

Read More:

Lytro Reveals Immerge 2.0 Light-field Camera with Improved Quality, Faster Captures, by Ben Lang

Lytro's Immerge light-field camera is meant for professional high-end VR productions. It may be a beast of a rig, but it's capable of capturing some of the best-looking volumetric video I've laid eyes on yet. The company has revealed a major update to the camera, the Immerge 2.0, which, through a few smart tweaks, makes for much more efficient production and higher-quality output.

Light-field specialist Lytro, which picked up a $60 million Series D investment earlier this year, is making impressive strides in its light-field capture and playback technology. The company is approaching light fields from both the live-action and synthetic ends; last month Lytro announced Volume Tracer, software that generates light fields from pre-rendered CG content, enabling ultra-high-fidelity VR imagery that retains immersive 6DOF viewing.

Read More:

Lytro Announces VR Light Field Rendering Software ‘Volume Tracer’, by Scott Hayden

Lytro, the once consumer-facing light field camera company that has recently pivoted to creating high-end production tools, has announced a light field rendering software for VR that essentially aims to free developers from the current limitations of real-time rendering. The company calls it ‘Volume Tracer’.

Light field cameras are typically hailed as the next necessary technology for bridging the gap between real-time rendered experiences and 360 video—two VR content types that for now act as bookends on the spectrum from immersive to less-than-immersive virtual media. Professional-level light field cameras, like Lytro's Immerge prototype, aren't yet in common use, but light fields can also be generated without expensive, bulky physical cameras.

Read More:

Adobe’s Project SonicScape Visualizes Audio For Easier 360 Editing, by Ian Hamilton

Adobe previewed a concept it is working on that would make it easier for creators working on VR videos to place and align sound.

Producing high-quality 360-degree video content has traditionally been a difficult affair at every stage of production, from capture to delivery. However, a constant stream of new cameras, editing tools, and streaming techniques is on the way to make the process easier.

Read More:

Samsung’s 360 Round camera livestreams 3D VR, by John Fingas


Samsung already has a virtual reality camera in the form of the Gear 360, but it's not really for pros — it's for everyday users who want to record a 360-degree video on the street. What if you're a pro, or a well-heeled enthusiast? Samsung has you covered: it's launching the previously hinted-at 360 Round. The disc-shaped device carries a whopping seventeen 2-megapixel cameras and six microphones (plus two mic ports) to create 3D (that is, stereoscopic) VR video. It's powerful enough to livestream 4K VR at a smooth 30 frames per second, helped in part by software that promises to stitch together immersive video with virtually no lag.

Read More: