The tough questions of ethical content creation in VR

A panel at the Silicon Valley Virtual Reality Conference discussed ethical questions virtual reality content creators should bear in mind.

Image: Erin Carson/TechRepublic

One of the qualities of virtual reality that excites creators and consumers the most is that it's wide open and packed with potential.

Optimistically, that means people will create and experience things they never have before, and push creativity to its farthest reaches. However, as Newton said, for every action there is an equal and opposite reaction: where there is potential for work that uplifts humanity, there is also potential for harmful and destructive content.

This was one of the concepts explored at a panel called "Ethical Content Creation for VR" at the Silicon Valley Virtual Reality Conference in San Jose, California, Thursday.

The panel was moderated by Microsoft AR and VR developer and evangelist Liv Erikson, and included: Jodi Schiller, founder of digital media and marketing company New Reality Arts; Jazmin Cano, marketing and events coordinator for open source social platform High Fidelity; and Jacob Ervin, technical lead for spatial computing company Occipital.

Erikson started the session by asking why it's worth talking about ethical content creation now.

"A lot of the patterns get established in the first year or two," Ervin said. After that, they're hard to change, and once the foundation is laid, everything else gets built on top of it. So whether it's talking about diversity in VR or developing thoughtful practices regarding the types of VR experiences creators want to unleash into the world, the time to talk is now.

During the discussion, the panelists touched on a variety of topics, like how to proceed with VR when there are no longitudinal studies on the effects of the medium on the adult brain, let alone children's brains. Or, what's the balance between keeping platforms open and protecting users?

Cano brought up the idea of trust. If creators violate viewers' trust with unpleasant or traumatic VR experiences, those viewers may not come back to that content, or maybe to VR at all.

At a few points, the idea came up of introducing something like biometric plug-ins that would let users set thresholds for measures like heart rate, so the system could automatically pull them out of the experience if their heart rate spiked past a certain level. VR horror experiences, for example, are developing their own niche; a user could decide the point at which they needed to be separated from the experience.

Schiller said the situation called for "creative ways of both allowing our imaginations to go wild, but also looking after people at high distress levels."

Along those lines, Ervin brought up the idea that, while it's possible to treat traumas with VR, it could also be possible to cause them. VR is immersive, and what people often don't understand until they've tried it is the extent to which it can dupe the brain into thinking it's on the ledge of a building, being shot at, or standing too close to a figure. Creators have to consider the consequences of what they create.

The talk also covered identity issues, from the potential impact avatars can have on real-world behaviors to the potential for identity theft. Imagine someone created an avatar of you without your permission and used it in unseemly ways. As Schiller pointed out, people already face the problem of controlling their image online: fake social media profiles get created all the time using stolen pictures of unsuspecting people.

The panel talked about the idea of having layers of identity. Perhaps only close friends would see a user's true avatar, and others would see something different.

Again on the optimistic side, Ervin said, "VR is a great medium for taking people and putting them in someone else's shoes." The hope is that VR can encourage and normalize positive social behavior, rather than serving as a place where some of society's ills can continue to fester. One audience member brought up the example of a female avatar going into a social VR experience and getting attacked, as happens online all the time, whether on Twitter or Reddit.

Cano said big platforms need to make tools to help users deal with such challenges, like creating private rooms for social interaction.

"We need to own our responsibility. We have to be always thinking, 'what world are we creating?'" Schiller said. "We are the founders of something and let's make it good."

About Erin Carson

Erin Carson is a Staff Reporter for CNET and a former Multimedia Editor for TechRepublic.
