You might be too surrounded by people who are into current VR.
> Focus... Again, nobody cares.
That’s just not true. Having a single fixed focal distance was super jarring when I tried VR. But normal people won’t complain about that. They’ll just say “looks weird.”
Personally, the problem it needs to solve is figuring out how to let me virtually move without physically moving, while not making it awkward and also not getting me motion sick. So far, that problem still feels fundamental.
I'm the only VR developer at the company where I work, and I don't work for a software company. I make a foreign language training tool for people who are largely A) government employees, B) not gamers.
We get a small amount of skepticism from maybe 25% of the students before they try it for the first time. After putting hundreds of our own students through the system, only 1 has given it a universally negative review, and he generally gave the entire instruction program a negative review anyway. Can't please everyone all the time.
Most of our students end up using the system multiple times. They always have the option to not use it at all, or to use it from a 2D PC display. Nobody has opted out completely, and only 1 person has opted out of VR.
About 10% have mentioned a slight feeling of dizziness after their first session, with only about 10% of those people saying the feeling returned in subsequent sessions on later dates. We've had only 1 person who flat out couldn't use the system because of simsickness. That comes out to about 0.3% of our students, which, it turns out, is less than the proportion of people who experience moderate to severe discomfort from watching FPS video games or action movies in a theater. I've tried to encourage the instructors to limit the first session to 20 minutes to get that 10% number down even further, but most of the students haven't minded the dizziness enough to stop and keep going for 1hr+ sessions.
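A quick sanity check on those numbers (the student count of 330 is an assumption on my part, consistent with "hundreds of students" and a 1-in-N rate of roughly 0.3%):

```python
# Back-of-envelope check of the reported rates; the 330 figure is assumed.
students = 330
first_session_dizzy = round(students * 0.10)        # ~10% report dizziness once
recurring_dizzy = round(first_session_dizzy * 0.10) # ~10% of those see it return
unable_to_use = 1                                   # one hard dropout

print(f"dizzy once: {first_session_dizzy}")              # 33
print(f"recurring: {recurring_dizzy}")                   # 3
print(f"dropout rate: {unable_to_use / students:.1%}")   # 0.3%
```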
When people provide feedback, they tell us they want more to do, higher display resolution, easier setup, and more intuitive controls. No one has ever complained about focus, latency, black levels, headset weight, etc.
I find it disappointing that your comment, one based on your personal experience and speaking directly about your time in the industry with the tooling in question, was downvoted.
Just wanted you to know that I found your insight valuable, and I must admit I'm surprised so few people had any issues with motion sickness/sim sickness. Have you done anything specific to tackle that within the software you've developed?
It started with establishing various guidelines. Core values: why we are doing this, what we want to get out of it (extremely useful to have when you get into disagreements about feature design; you can point back to the core values that everyone agreed were core). Design: what should be done, what should NEVER be done. Testing: not just what needed to be tested, how to test it, and what kinds of tests needed to be run (technical, load, users, etc), but also who needed to do the testing (adaptation is a problem! New-to-VR users are a non-renewable resource!) and what they needed to be informed of before the test (equal parts informed consent and psychological priming). We found that if we primed users with the expectation that they would feel dizzy the first time, but typically not the second, they would report higher levels of satisfaction and be more willing to perform additional sessions.
As I said, I'm the only VR developer here. I have one other software developer working for me, but she works on some of the database management and only joined relatively recently in the project's life. And that's it. The company is not a software company. Establishing that testing guideline also made it possible for a lot more people in the company to engage in the project.
From a software development perspective, it involves being really honest with yourself about the strengths and weaknesses of your content. Our app incorporates a "guided tour" metaphor for instructors to take students around Google Street View imagery and practice their language skills. Google Street View imagery consists of monoscopic photospheres with no depth information at all. It's not the best format for immersion, but it hits a lot of other goals for us, such as being culturally relevant, as well as being a vast source of content that we can easily combine and use without spending lots of money on 3D modelling.
Photosphere imagery causes a LOT of simsickness. If I had to list the top 3 causes of simsickness, it'd be low frame-rate, mismatch between motion cues in the visual and vestibular systems, and lack of depth cues. Photospheres hurt on two of those levels: you can't move inside of them and there are no stereo depth cues. But we get around it in multiple ways. First, we encourage the students to stay in their seats for the whole experience. Second, we focus on outdoor areas, so that most visual detail is beyond the range of stereo depth perception. We have also intentionally left all of our user interface elements visible at all times, so that you have 3D foreground elements to focus on. There's a wireframe floor grid that keeps most people from feeling like they're about to fall.
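To give a rough sense of why outdoor scenes sidestep the missing-stereo problem, here's a back-of-envelope estimate of where binocular depth cues fade out. The IPD and stereoacuity values are assumed, typical textbook figures, not anything from our app:

```python
import math

# Binocular disparity for an object at distance d is roughly ipd / d
# (small-angle approximation). Stereo stops contributing once that
# angle drops below the eye's stereoacuity threshold.
ipd_m = 0.063        # assumed typical interpupillary distance, ~63 mm
acuity_arcsec = 20   # assumed rough stereoacuity threshold

acuity_rad = math.radians(acuity_arcsec / 3600)
max_stereo_range_m = ipd_m / acuity_rad
print(f"stereo cues fade beyond roughly {max_stereo_range_m:.0f} m")
```

In practice the strong stereo effect is gone well before that theoretical ceiling, so distant street scenery loses little from being monoscopic.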
Incidentally, I think the photospheres help us in a demographic that typically fares poorly in VR: women. Or I should say, people with a higher proportion of estrogen compared to testosterone (there are high-T women and high-E men, the balance of each changes as we age, and it can change if, for example, you're undergoing hormone therapy). Apparently, the relative balance of hormones in the body can have a big impact on the visual cues our brains use to recognize depth. High-T people judge depth based on motion. High-E people judge depth based on shadowing and relative object size. Because we are using photos instead of 3D models, our "lighting" is 100% accurate. So where many projects show a marked difference in simsickness between men and women, ours has not.
So despite starting in a tough spot with our source data, we actually end up doing a really good job in the simsickness department. Like I said before, only one person noped-out completely. I think it's pretty safe to say she would have become violently ill if she stayed in. I interviewed her afterward and she admitted she was extremely prone to motion sickness in cars and boats and had difficulty watching action movies in theaters. We had one other person complain about persistent feelings of dizziness, but it turned out we had had a significant performance regression, which we fixed and she was all better after that.
And I put a lot of effort into performance optimization. I had to be honest with myself and admit that the super-accurate audio spatialization system I was using was costing way too much compute for too little impact. The much simpler, though inferior, built-in spatialization was good enough and meant we could hit a much higher performance cap. Someone might look at my code and say I suffer from Not Invented Here syndrome. I say most other people's code doesn't hit the performance targets we have. I know, because I tried, and had to replace it.
It also takes saying "NO" to some feature requests. I had to put my foot down on a request to let the instructor teleport students to new locations. I pointed to design guidelines: "Users in VR headsets SHALL NOT be virtually moved without their own direct, affirmative interaction".
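That kind of rule is easiest to keep when it's enforced in code rather than by convention. Here's a hypothetical sketch of one way to do that; all the names are made up for illustration and nothing here is the app's actual API:

```python
# Hypothetical guard for the rule "users SHALL NOT be virtually moved
# without their own direct, affirmative interaction": anyone may
# *propose* a move, but only the user's own input executes it.
class Locomotion:
    def __init__(self):
        self._pending_move = None

    def request_teleport(self, destination):
        # An instructor (or the app itself) can only queue a proposal.
        self._pending_move = destination

    def confirm_teleport(self, user_initiated: bool):
        # The camera rig moves only on the user's affirmative input.
        if not user_initiated or self._pending_move is None:
            return None
        destination, self._pending_move = self._pending_move, None
        return destination

loco = Locomotion()
loco.request_teleport((10, 0, 5))
print(loco.confirm_teleport(user_initiated=False))  # None
print(loco.confirm_teleport(user_initiated=True))   # (10, 0, 5)
```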
Sometimes that "NO" is to avoid doing something that will harm users. Sometimes it's to avoid wasting time on tasks that don't fit with our core values, time that could be spent in better ways. For example, we were able to quickly discard using AI as part of the student interaction because of one of our core values: to support our existing, best-in-class, instructor-led training, not replace it. Spending time on AI chatter bots to make the same shitty, canned, call-and-response convo app that every other company in the space is making would distract from... literally anything. Literally anything would be a better use of my time :)
How we've incorporated VR into our operations is a big part of why I don't really believe in this whole "but VR doesn't have a killer app" argument, especially when it comes from people who also talk about not liking the idea of being in VR for several hours at a time. We are not out to replace our instructor-led training with a VR/AI/ML/WTF/BBQ chatter bot system. We use VR as an enhancement of our existing offerings.
Incidentally, that makes it a lot easier from a business perspective, too. People already buy our services and we're not trying to get them to buy a new one. Our VR tools make us more competitive in our market. We get to establish VR as a norm in in-person language training, and we get to set the tone for what that means.
Anyway, this is getting a lot longer than I expected it to. I could talk at length about why VR is so important to language learning. But I should probably get back to working on it, instead.
Agreed, and even when you can get the interpupillary distance correct, you can still get peripheral blurring, and sometimes you have to adjust the headset constantly to make sure you're looking straight on.
FOV is also a big deal that a lot of people tend to overlook: most of the commercial offerings only have around 100°, whereas humans have about a 200° FOV. It's like I'm constantly playing a submarine periscope simulator.
The FOV for regular FPS games is often less than 100°, so in a headset you actually get more camera view than on a typical monitor; it's just that you can't see anything outside of it, so it feels boxed in. I would liken it to playing a game in the dark.
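The comparison is easy to make concrete with the angle a monitor actually subtends at your eye. The screen size and viewing distance below are assumed, illustrative values:

```python
import math

# Angle a flat screen of a given width subtends at the viewer's eye.
def subtended_fov_deg(screen_width_m: float, viewing_distance_m: float) -> float:
    return math.degrees(2 * math.atan(screen_width_m / (2 * viewing_distance_m)))

# Assumed setup: a 27-inch 16:9 monitor (~0.60 m wide) at ~0.6 m.
print(f"monitor fills ~{subtended_fov_deg(0.60, 0.6):.0f} deg of your view")
```

So a game rendering a 90-100° camera gets squeezed into roughly half that visual angle on a desk monitor, while a ~100° headset shows it at closer to natural scale.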
I agree about the movement. I have a lot of fun playing on my Quest 2. Fortunately I don’t have any problem with motion sickness, so I can turn on joystick movement, which feels better to me than teleporting but is still kind of weird.
One cool thing I’m looking forward to is house-scale games. I’m perfectly fine with the constraints those impose, since you can just walk around.