AR/VR applications are growing in popularity, but they come with unique design challenges that impact user experience. Learn how IMUs can help address these challenges in our post.
Tuesday, June 25, 2019
The popularity of AR/VR devices and gaming is growing, leading to huge market demand. Revenue from enterprise AR/VR software is expected to rise to $5.5 billion by 2023, a whopping 587% increase. And consumers are clearly invested in AR/VR games, too. The Pokémon GO craze of 2016 is still going strong; at the time of writing, it has surpassed 1 billion downloads.
As the popularity of AR/VR (augmented reality/virtual reality) applications grows, manufacturers need to be prepared to take advantage of demand, and understand the components needed to create an authentic experience that consumers will love.
What is AR/VR?
AR and VR are very similar technologies that are used to simulate real-life experiences. In augmented reality, computer-generated enhancements or simulations are overlaid on top of real-life images. AR applications give the user the ability to interact with simulated images and experiences within the context of their real-world surroundings. For example, Pokémon GO uses a smartphone camera to pull in images of the user’s real-world environment and places characters from the game on top of that image in real time, so it appears like the Pokémon are really in front of the user.
VR technology takes this concept a step further to create a total simulation for the user. It doesn’t just overlay simulated images on top of real-world images; it gives the impression of complete immersion inside a separate world – the simulation. VR devices usually require some type of headset for the user to wear (the less common alternative is a specialized environment – think industrial-grade flight or driving simulator). Looking through the headset, the user sees the simulated world; a robust VR system shows a complete 360-degree view. When the user tilts their head or takes a step backwards, the simulation shifts to reflect that movement. There are even rides, such as Birdly, that use VR to turn the amusement into a fully immersive experience.
In AR technology, a camera is an important component to capture real-world images. In VR, the headset often creates the feeling of immersion. Both require motion sensors, either to track movement of the camera to augment reality, or to track the user’s head movements in the virtual space. Handheld controllers may also be used to navigate and interact within an AR or VR simulation. Motion sensors are critical in these devices, accurately tracking the user’s hand movements and translating them into the virtual environment.
Common Design Challenges in AR/VR
The most common user complaints for AR/VR devices are motion sickness and disorientation. In order to provide the most authentic experience, AR/VR devices need to very precisely match real-world movement from the user with the computer-generated simulation that is being presented. When the user’s head swivels to the right, the simulation needs to match that movement exactly. The time difference between the simulated images and the real world is referred to as lag.
High latency in data transfer causes this lag. Data needs to be transferred as quickly as possible so there is no interruption between what the user is doing in the real world and what the simulation is showing. A high sampling rate is also important for ensuring a smooth AR/VR experience. If you are sampling too slowly, movements will not feel as fluid as they do in real life, pulling you out of the experience.
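To get a feel for why latency matters so much, consider how far the rendered view falls behind a turning head. The numbers below are illustrative, not figures from this post; a motion-to-photon budget of roughly 20 ms is a commonly cited target for comfortable VR.

```python
# Rough illustration: angular error introduced by lag when the head
# turns at a constant rate. Illustrative numbers only.

def angular_error_deg(head_rate_deg_per_s: float, latency_s: float) -> float:
    """Angle by which the rendered view trails the real head pose."""
    return head_rate_deg_per_s * latency_s

# A moderate head turn of 100 deg/s with 20 ms of total lag:
error = angular_error_deg(100.0, 0.020)  # 2.0 degrees of error
```

Even a 2-degree mismatch between real and rendered motion is enough for many users to notice, which is why both transfer latency and sampling rate matter.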
At best, these factors are annoyances that lead to an inauthentic experience; at worst, they can cause user discomfort and even motion sickness. In such a fast-paced, quickly developing market, this type of negative experience can seriously damage a brand’s reputation.
The Solution: Multi-Axis IMUs
The key to eliminating these challenges is low-latency, high-accuracy motion sensors with fast sampling rates. We recommend a multi-axis inertial measurement unit (IMU). An IMU consists of a combination of an accelerometer and a gyroscope (and sometimes a magnetometer). IMUs are used in headsets to track head orientation and in handheld controllers to capture hand movement. Adding advanced sensor fusion software transforms an IMU into an Attitude and Heading Reference System (AHRS; colloquially, IMU and AHRS are often used interchangeably), allowing smooth and realistic motion output from the 6- or 9-axis sensor used; this helps create a truly authentic immersive experience.
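To make the sensor-fusion idea concrete, here is a minimal sketch of one classic approach for a 6-axis (gyro + accelerometer) IMU: a complementary filter that integrates the responsive but drift-prone gyroscope and leans on the accelerometer’s gravity reference to pull pitch and roll back toward truth. This is a simplified illustration, not Hillcrest Labs’ actual fusion algorithm; all names and constants are assumptions for the example.

```python
import math

def complementary_pitch(prev_pitch: float, gyro_rate: float,
                        accel_x: float, accel_y: float, accel_z: float,
                        dt: float, alpha: float = 0.98) -> float:
    """One update step of a complementary filter for pitch (radians).

    gyro_rate: pitch angular rate from the gyroscope (rad/s)
    accel_*:   accelerometer reading (any consistent units)
    alpha:     trust in the gyro; (1 - alpha) pulls toward gravity
    """
    # Short-term estimate: integrate the gyro (responsive, but drifts).
    gyro_pitch = prev_pitch + gyro_rate * dt
    # Long-term estimate: pitch implied by gravity (noisy, but drift-free).
    accel_pitch = math.atan2(-accel_x, math.hypot(accel_y, accel_z))
    # Blend the two estimates.
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Stationary, level device: gyro reports no rotation, gravity along +z.
pitch = 0.1  # start with a small erroneous pitch estimate
for _ in range(200):
    pitch = complementary_pitch(pitch, gyro_rate=0.0,
                                accel_x=0.0, accel_y=0.0, accel_z=9.81,
                                dt=0.01)
# The accelerometer term steadily pulls the estimate back toward 0.
```

Production AHRS software uses far more sophisticated filtering (and calibration), but the principle is the same: combine sensors so each one’s weakness is covered by another’s strength.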
One important note: we do not recommend relying on a magnetometer as the method for determining the absolute orientation of the headset or controllers. Magnetic interference can be accounted for with sensor fusion, but without ground truth to compare against, it is hard to ensure accuracy. Relying on the magnetometer requires a stable magnetic field with as little interference as possible. We instead recommend 6-axis sensors, accepting some heading drift over time.
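The heading-drift trade-off of a 6-axis sensor is easy to quantify: with no magnetometer to anchor heading, any residual gyroscope bias integrates into a growing yaw error. The bias value below is purely illustrative.

```python
# Why 6-axis heading drifts: a small constant gyro bias, integrated
# over time, accumulates into a growing yaw error.

def yaw_drift_deg(bias_deg_per_s: float, seconds: float) -> float:
    """Heading error accumulated by integrating a constant gyro bias."""
    return bias_deg_per_s * seconds

# A 0.05 deg/s residual bias over ten minutes of play:
drift = yaw_drift_deg(0.05, 600)  # about 30 degrees of heading drift
```

In practice, AR/VR systems mitigate this with periodic recentering or with external references (for example, camera-based tracking) rather than with the magnetometer alone.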
Hillcrest Labs’ expertise in multi-axis sensors and sensor fusion software will help you stand out in this competitive market and take advantage of future developments. Our low-latency IMU is the ideal component for precisely tracking head movement and controller orientation to create immersive, authentic user experiences. Check out the datasheet on our AR/VR solutions, or contact us to discuss your application.