MotionEngine™ is Hillcrest Labs’ core sensor processing software and the product of over 15 years of experience developing sensor-based technology and products. MotionEngine is packaged into the application-specific software products described below and powers the BNO, FSP, and FSM hardware product lines. The software combines high-accuracy 6-axis and 9-axis sensor fusion, dynamic sensor calibration, and application-specific features such as cursor control, gesture recognition, activity tracking, context awareness, and AR/VR stabilization.
MotionEngine software is compatible with the leading embedded processing architectures and operating systems and can be delivered as either a library or a full chip binary with host drivers that dramatically simplify system integration.
The application-specific features are grouped into MotionEngine software products that, combined with a variety of off-the-shelf inertial and environmental sensors, provide sensor processing solutions for applications that demand the highest accuracy and quality. These products serve the TV and set-top-box remote, robotics, health & fitness, and PC & mobile segments (including stylus pens) and can be customized for large customers.
To better accommodate our customers, we have developed specialized software packages for the markets that we serve:
In-ear and over-ear electronics like TWS earbuds, audio headsets, hearing aids, and AR glasses all have one thing in common: they can benefit from a gesture interface. But existing products often frustrate users with missed or unintended triggers. MotionEngine Hear’s tap gestures and in-ear detection make taking control of audio easier and more fluid. On top of that, built-in activity classifiers and voice activity detection (VAD) feed a growing list of features for intelligent, automated decision making, while 3D head tracking enables immersive spatial audio.
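To give a feel for the latency compensation that 3D head tracking for spatial audio involves, the sketch below extrapolates an orientation quaternion forward by the current angular velocity, so audio can be rendered for where the head will be rather than where it was last measured. The function name and the prediction horizon are our own illustrative assumptions, not MotionEngine’s API.

```python
import math

def predict_orientation(q, omega, dt):
    """Extrapolate orientation quaternion q = (w, x, y, z) forward by dt
    seconds, assuming the body-frame angular velocity omega (rad/s) stays
    constant over the horizon. This is the classic form of predictive
    tracking used to hide sensor-to-output latency."""
    wx, wy, wz = omega
    angle = math.sqrt(wx * wx + wy * wy + wz * wz) * dt
    if angle < 1e-9:
        return q  # effectively no rotation over this horizon
    ax, ay, az = wx * dt / angle, wy * dt / angle, wz * dt / angle
    half = angle / 2.0
    s = math.sin(half)
    dq = (math.cos(half), ax * s, ay * s, az * s)
    # Hamilton product q * dq applies the incremental body-frame rotation.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)
```

A real head tracker would predict over the measured render latency (often tens of milliseconds) and bound the horizon to avoid overshoot on fast head turns.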
Have you ever wished your customers could interact with your displays more efficiently? The SmartTV package enables this by capturing natural hand motion and translating it intuitively on screen. Movement feels natural with features like cursor control, orientation compensation, button motion suppression, and virtual controls.
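In broad strokes, relative in-air pointing maps the remote’s gyroscope rates to cursor deltas on screen. The minimal sketch below uses a deadband to swallow hand tremor and a gain to set cursor speed; the function name and tuning values are illustrative assumptions, not MotionEngine’s implementation.

```python
def rates_to_cursor(yaw_rate, pitch_rate, gain=400.0, deadband=0.005):
    """Map gyroscope angular rates (rad/s) from a handheld remote to
    per-frame cursor deltas in pixels. Rates below the deadband are
    treated as tremor and ignored; the gain sets pixels travelled per
    radian/second of wrist rotation. Values here are illustrative."""
    def shape(rate):
        if abs(rate) < deadband:
            return 0.0
        return rate * gain
    # Yaw (left/right wrist rotation) drives x. Screen y grows downward,
    # so pitching the remote up should move the cursor up: negate pitch.
    return shape(yaw_rate), -shape(pitch_rate)
```

A production pointer adds nonlinear gain curves, orientation compensation, and suppression of the motion caused by pressing buttons, as the feature list above suggests.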
Automated robots need to move intelligently through their spaces, and our algorithms ensure they can. After all, a robot’s convenience rests on its autonomy. Our algorithms achieve precise heading with minimal drift, and our interactive and dynamic calibration maintains that performance out of the box and across time and temperature.
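Heading drift comes largely from gyroscope bias: integrating a raw rate signal turns a constant bias into an error that grows linearly with time. The sketch below shows the idea behind at-rest bias calibration; the function names, the stillness threshold, and the z-axis-only scope are our own simplifying assumptions, not MotionEngine’s calibration algorithm.

```python
def estimate_gyro_bias(samples, still_threshold=0.02):
    """Estimate z-axis gyro bias (rad/s) from samples taken while the
    robot is believed stationary. The threshold is a crude stillness
    check: if any sample exceeds it, the robot may be moving and the
    average would be contaminated."""
    if max(abs(s) for s in samples) > still_threshold:
        raise ValueError("motion detected; cannot calibrate at rest")
    return sum(samples) / len(samples)

def integrate_heading(rates, dt, bias=0.0):
    """Dead-reckon heading (rad) from bias-corrected z-axis rates.
    Subtracting the estimated bias before integrating removes the
    linearly growing drift term."""
    heading = 0.0
    for r in rates:
        heading += (r - bias) * dt
    return heading
```

Dynamic calibration goes further, re-estimating bias continuously as temperature shifts it, which is why performance holds "over time and temperature" rather than only after a factory procedure.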
Handheld controllers deserve more than antiquated button interfaces. Today, cursor and gesture controls can be added easily to enhance interactivity. MotionEngine™ Air provides the same cursor capability as SmartTV, and adds unique gestures (like twist, flip, and pick-up) to streamline workflows whether you’re presenting, creating, or controlling.
As our appetite for “smart” devices has grown, so has their power consumption. MotionEngine Mobile delivers high-performance, low-power, sensor-independent motion processing for mobile devices. It can power motion apps and provide context awareness, activity tracking, and even pedestrian navigation. The software is versatile enough for smartphones, tablets, wearables, and IoT devices.
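As a taste of what activity tracking involves, the sketch below counts steps by detecting upward crossings of an accelerometer-magnitude threshold, with a refractory gap so one stride is not counted twice. Production step counters are far more elaborate (adaptive thresholds, frequency analysis, false-step rejection); the function name and tuning values are illustrative assumptions.

```python
import math

def count_steps(accel, threshold=11.5, min_gap=10):
    """Count steps in a stream of 3-axis accelerometer samples (m/s^2).
    A step registers when the acceleration magnitude crosses upward
    through `threshold`, at least `min_gap` samples after the previous
    step. Around 1 g (9.8 m/s^2) at rest, walking impacts push the
    magnitude well above the threshold once per stride."""
    steps = 0
    last = -min_gap
    prev_mag = 0.0
    for i, (x, y, z) in enumerate(accel):
        mag = math.sqrt(x * x + y * y + z * z)
        if prev_mag < threshold <= mag and i - last >= min_gap:
            steps += 1
            last = i
        prev_mag = mag
    return steps
```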
Built over 15 years, Hillcrest Labs’ MotionEngine software is a robust and highly customizable sensor fusion solution for a broad range of motion applications.
- 6-Axis and 9-Axis Sensor Fusion
- Dynamic Sensor Calibration adjusts for accelerometer and gyroscope bias changes over factors like time and temperature
- Magnetic Interference Rejection algorithms designed to ignore sudden changes in magnetic field
- Specialized algorithms for various applications:
- Cursor Control with Single-Pixel Accuracy - software designed for in-air pointing applications for use with motion remote controls
- Gesture Recognition - in-air symbol recognition, flick, twist, flip, pick-up, shake, virtual controls, tap, double tap, and in-ear detection
- AR/VR Stabilization and Predictive Head Tracking - algorithms designed specifically to enhance AR/VR and 3D Audio user experiences
- Device operation mode identification
- Local geofencing
- Personal Activity Tracking – step counter and context detection, including walking, running, standing, in-vehicle, and on-bike
- Robust motion outputs, including orientation, heading and tilt
- Orientation Compensation algorithms designed to ignore changes in orientation with respect to the movement of a cursor on a display
- Sensor, Operating System and Processor Independence
- Drivers and sensor management written for a wide variety of inertial and environmental sensors, for integration with:
  - Android™, Linux®, Windows®, macOS®, WebOS™
  - RISC-V, Arm, CEVA DSP, and other RISC-based processor architectures
- Packages can be customized to fit each application
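To illustrate the core idea behind 6-axis fusion, the sketch below runs one update of a first-order complementary filter for pitch: the gyroscope term is smooth but drifts, the accelerometer tilt estimate is noisy but drift-free, so the filter blends them. This is a deliberately minimal stand-in; MotionEngine performs full quaternion-based 6/9-axis fusion, and the blending factor here is just a typical illustrative value.

```python
import math

def complementary_pitch(pitch, gyro_rate, accel, dt, alpha=0.98):
    """One update of a first-order complementary filter for pitch (rad).
    `gyro_rate` is the pitch-axis angular rate (rad/s); `accel` is the
    3-axis accelerometer sample (m/s^2). Mostly trust the integrated
    gyro (smooth, short-term accurate) and nudge it toward the
    gravity-derived tilt (noisy, but it never drifts)."""
    ax, ay, az = accel
    # Gravity direction gives an absolute tilt reference when the device
    # is not accelerating.
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    gyro_pitch = pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

Run once per sample, the filter holds the gyro’s responsiveness while the accelerometer term slowly cancels accumulated drift, which is the same trade-off the full 6-axis fusion resolves in three dimensions.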
- Product notes:
Using IMUs and Sensor Fusion to Effectively Navigate Consumer Robotics
In this webinar, engineers will learn about the challenges of working with IMUs, how IMUs are applied in different robotics applications, and what is necessary to test IMU-based robots to achieve great performance.