News

From Immersion to Acquisition: An Overview of Virtual Reality for Time-Based Media Conservators

By sasha arden posted 07-14-2019 21:06

Title slide from Mark Hellar and Savannah Campbell's Annual Meeting presentation on May 17, 2019.

"From Immersion to Acquisition: An Overview of Virtual Reality for Time-Based Media Conservators" was an informative presentation given at the 2019 Annual Meeting by Mark Hellar, a technology consultant for cultural institutions and owner of Hellar Studios LLC, and Savannah Campbell, Media Preservation Specialist at the Whitney Museum of American Art. Mark and Savannah have both done extensive applied research on virtual reality (VR). Mark has focused on web-based VR platforms and their potential application in a museum context, while Savannah's master's thesis for NYU's Moving Image Archiving and Preservation program examined the challenges of archiving and accessing VR. The resources that they shared will be a touchstone for time-based media conservators, collection managers, and technical staff as artists increasingly use VR in their practices.

Starting with an overview of the hardware, software, and content types used in VR artworks, Savannah talked about the different ways that VR can be displayed. There are many types of the familiar VR headset, or head-mounted display (HMD). These were broken down into four broad categories based on dependencies: Mobile VR requires a cell phone; Standalone VR is a self-contained device; VR Systems require a computer for high-resolution, immersive experiences; and Console VR works with video game consoles such as PlayStation and Nintendo Switch. HMDs can be further categorized according to their passive or interactive and immersive features, which depend on "degrees of freedom." Three degrees of freedom allow a viewer to look around, for example at 360-degree video. Six degrees of freedom use external sensors to gauge a viewer's position in space, allowing them to navigate the virtual space by moving around in the real world.

Table of various head-mounted displays in categories of Mobile VR, Standalone VR, VR Systems, and Console VR. These are further categorized as 3 degrees of freedom or 6 degrees of freedom.
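The 3DoF/6DoF distinction comes down to what tracking data the headset can report: orientation only, or orientation plus position. A minimal sketch in Python (illustrative names, not any real VR API):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class HeadPose:
    """Tracking data reported by a head-mounted display (illustrative only)."""
    # 3DoF: orientation only -- yaw, pitch, roll in degrees
    orientation: Tuple[float, float, float]
    # 6DoF adds position in space (x, y, z in meters); None on 3DoF devices
    position: Optional[Tuple[float, float, float]] = None

    @property
    def degrees_of_freedom(self) -> int:
        return 6 if self.position is not None else 3

mobile = HeadPose(orientation=(90.0, 0.0, 0.0))           # can look around only
roomscale = HeadPose(orientation=(90.0, 0.0, 0.0),
                     position=(1.2, 1.6, -0.5))           # can also walk around

print(mobile.degrees_of_freedom)     # 3
print(roomscale.degrees_of_freedom)  # 6
```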

Artworks can also use different types of projection for visuals that are interactively manipulated with VR controllers, so a head-mounted display is not always required for VR content.

Illustration of variations on VR display: Oculus Rift head-mounted display, video wall, cave, corner wall, and full dome
 
Interactive VR content that is rendered in real time (i.e., not predetermined content played back in an immersive way) is typically run from executable files stored on a device, whether it is a Standalone VR HMD or a VR System connected to a computer. When considering this type of VR for exhibition or acquisition, one should be aware that it probably includes packaged assets such as 3D models, audio, and video files, and might require external dependencies like a particular software game engine, graphics libraries, a computer with minimum CPU and GPU capabilities, and peripherals like controllers or tracking sensors that are compatible with the VR system.
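When surveying a delivered VR build, a quick first pass is simply to count what kinds of files it contains. A rough sketch of such an inventory, with a hypothetical extension-to-type mapping (real projects vary widely):

```python
from pathlib import Path
from collections import Counter

# Common file extensions one might find packaged with a real-time VR build
# (hypothetical mapping for illustration only).
ASSET_TYPES = {
    ".fbx": "3D model", ".obj": "3D model", ".gltf": "3D model",
    ".wav": "audio", ".ogg": "audio", ".mp4": "video",
    ".exe": "executable", ".apk": "executable", ".dll": "library",
}

def inventory(folder: str) -> Counter:
    """Count recognizable asset types under a VR build folder."""
    counts = Counter()
    for path in Path(folder).rglob("*"):
        kind = ASSET_TYPES.get(path.suffix.lower())
        if kind:
            counts[kind] += 1
    return counts
```

A report like this is only a starting point; it will not reveal engine versions or runtime dependencies, which still need to be documented with the artist or studio.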

Expanding on the packaged assets and game engine, Mark and Savannah pointed out that acquiring VR content should include any proprietary software in the version with which the VR project was made, along with project files containing uncompiled source code. Unity is a very popular proprietary program for VR creation, and Blender is an open-source 3D modeling software with a VR plug-in. It's also important to know what programming language the project was written in: Unity uses C#, Unreal Engine uses C++, and Blender uses Python. The compiled executable files that are run on the Standalone VR device or host computer are typically an EXE (.exe) or APK (.apk) file.

A further consideration is interoperability between VR hardware and content. A recent development is OpenXR, a free cross-platform standard that detects a headset's features and conforms the VR content to that device. However, earlier VR projects may not use OpenXR and it is important to know which hardware is compatible.

A diagram on the left illustrates limited interoperability between VR content and hardware Before OpenXR. A second diagram on the right shows greater compatibility between hardware and content types After OpenXR.

Another standard for VR content is WebXR, which supports development of VR and AR experiences on the web, rather than on hardware systems. The programming is done in web-native technologies like JavaScript, HTML Canvas, and WebGL, and the web browser executes the code. Libraries such as Babylon.js, A-Frame, and three.js are easily incorporated into HTML code for quick and easy development, aided by optimized file formats such as glTF (GL Transmission Format), described as the "JPEG of 3D". Mark shared a documentation project at SFMOMA that used the source files from a 3D-printed architectural model to create a web-based VR scene. Such a resource can provide access to objects in the museum's collection for examination by curatorial and conservation staff, especially when the physical iteration is replaceable or meant to degrade over time.
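One reason glTF is friendly to preservation workflows is that the .gltf variant is plain JSON, so its structure can be inspected with everyday tools. A minimal hand-written example (real files also carry binary buffers with the actual geometry):

```python
import json

# A minimal glTF 2.0 asset as JSON text, written here for illustration;
# the mesh and material names are invented.
gltf_text = """{
  "asset": {"version": "2.0"},
  "meshes": [{"name": "gallery_model", "primitives": []}],
  "materials": [{"name": "plaster"}]
}"""

doc = json.loads(gltf_text)
print(doc["asset"]["version"])               # "2.0"
print([m["name"] for m in doc["meshes"]])    # ["gallery_model"]
```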

360-degree video uses codecs and containers similar to regular video files, but there are some key differences that are good to know for assessing and displaying them. 360 video is natively spherical, but is stored flat in a video file. Just like a map of the Earth, translating a spherical form to a plane requires some method of projection mapping that creates distortion when it is viewed flat. One method of projection mapping is equirectangular, which is like unrolling the sphere and flattening it out. For video, this can also create unequal pixel distribution with areas of higher and lower image quality. Another common method is a cube map, where the sphere is transformed into a cube and the square surfaces are rearranged into a rectangle. The corners of the squares are somewhat distorted, but overall there is less impact on image quality. A media player that can handle 360 video decodes these flat mapping techniques back into a sphere for playback, so the display appears undistorted to the viewer.

An image illustrates how the surface of a sphere is unwrapped and flattened into a rectangle, and a second image is a still from a 360 video file showing the distortion of a flattened image.
An image illustrates how the surface of a sphere is transformed into a cube then unfolded. The bottom of the screen includes a still from a 360 video, the unfolded squares of video from projection mapping, and the squares rearranged to form a solid rectangle.
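The unequal pixel distribution of an equirectangular frame can be quantified: every row of the flat image is the same width, but the circle of latitude it represents shrinks toward the poles by the cosine of the latitude, so polar pixels each cover much less of the sphere. A quick sketch of that oversampling factor:

```python
import math

def horizontal_stretch(latitude_deg: float) -> float:
    """Oversampling factor of an equirectangular row at a given latitude.

    Each row of the flat frame has the same pixel width, but the circle of
    latitude it represents shrinks by cos(latitude) toward the poles, so
    pixels there carry redundant detail.
    """
    return 1.0 / math.cos(math.radians(latitude_deg))

print(round(horizontal_stretch(0), 2))    # 1.0  -- equator: no stretch
print(round(horizontal_stretch(60), 2))   # 2.0  -- pixels doubled up
print(round(horizontal_stretch(80), 2))   # 5.76 -- heavy oversampling
```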

If a 360 video is also 3D, it carries some further formatting. Video files displayed in stereoscopic 3D are formatted as either top/bottom or side-by-side. A media player that can properly decode the formatting and output it to hardware is required for exhibition, but a regular media player will play the video without converting it and will show how the video is formatted. Savannah shared a case study artwork, Ben Coonley's Trading Futures (2016), a 3D 360-degree video that is part of the Whitney's permanent collection. It can be exhibited on a VR headset or projected inside a geodesic dome with 3D glasses.

A still from Ben Coonley's 3D 360 video Trading Futures (2016).
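The two stereoscopic packings are simple to reason about: both eyes' views share one frame, split either vertically or horizontally. A sketch of the resulting per-eye dimensions (example resolutions are invented):

```python
def split_stereo_frame(width: int, height: int, layout: str):
    """Return the (width, height) of each eye's view in a packed 3D frame."""
    if layout == "top/bottom":        # left-eye view above right-eye view
        return (width, height // 2)
    if layout == "side-by-side":      # left-eye view beside right-eye view
        return (width // 2, height)
    raise ValueError(f"unknown stereo layout: {layout}")

# A 4096x4096 top/bottom file holds two 4096x2048 equirectangular views
print(split_stereo_frame(4096, 4096, "top/bottom"))    # (4096, 2048)
print(split_stereo_frame(3840, 1920, "side-by-side"))  # (1920, 1920)
```

This is also why an unconverted playback in a regular media player is diagnostic: the doubled image makes the packing immediately visible.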

Always last, but never least, we must remember that 360 video and immersive VR content can also have audio. The audio can be traditional stereo, or it can be ambisonic, which changes to reflect spatiality when the viewer moves their head around the scene. Immersive VR can even use object-based ambisonic audio to support the illusion that an object is near or far. Ambisonic audio also relies on a method of projection mapping that must be decoded on playback.
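To give a feel for how the decoding works: first-order ambisonics stores four channels (W, X, Y, Z), and when the listener turns their head, the renderer counter-rotates the X/Y components so sound sources stay put in the virtual scene. A simplified sketch for yaw only (the sign convention here is illustrative):

```python
import math

def rotate_yaw(w, x, y, z, yaw_deg):
    """Counter-rotate a first-order ambisonic (B-format) sample for head yaw.

    W (omnidirectional) and Z (up/down) are unaffected by a yaw turn; the
    X/Y components rotate so sources remain fixed in the scene while the
    listener's head turns.
    """
    a = math.radians(yaw_deg)
    x2 = x * math.cos(a) + y * math.sin(a)
    y2 = -x * math.sin(a) + y * math.cos(a)
    return w, x2, y2, z

# A source directly ahead (all directional energy in X): after the listener
# turns 90 degrees, that energy shifts entirely into the left/right channel.
print(rotate_yaw(1.0, 1.0, 0.0, 0.0, 90.0))
```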

With the AIC Annual Meeting audience in mind, Mark and Savannah shared a VR Acquisition Template, based on the VR Preservation Working Group summit held at Tate in March 2019. The form's categories can help to organize the many pieces of information pertinent to documentation of VR artworks. A link to view the template can be found at http://bit.ly/VRacquisitiontemplate. Big thanks to Mark and Savannah for compiling and presenting all of this information for the benefit of VR artworks in collections and the sanity of time-based media conservators!

A screenshot of the first page of a VR Artwork Acquisition Template authored by Mark Hellar and Savannah Campbell in 2019
All images are courtesy of Mark Hellar and Savannah Campbell.

#Featured
#47thAnnualMeeting(NewEngland)
#ElectronicMediaGroup
#AICmtg19






Comments

07-16-2019 17:36

This is fantastic, thank you for distilling this information!