"Studies like ours help us to understand how these deficiencies might be acquired, and how to recalibrate time perception in the brain.". The article, "Movement-Contingent Time Flow in Virtual Reality Causes Temporal Recalibration" was written by Ambika Bansal, Séamas Weech and Michael Barnett-Cowan, and published in Scientific Reports.
This tradeoff is a problem not only for self-driving cars, but also for any system that requires real-time perception of a dynamic world, such as autonomous drones and augmented reality systems. Yet until now, there's been no systematic measure that balances accuracy and latency -- the delay between when an event occurs and when the perception system recognizes that event. This lack of an appropriate metric has made it difficult to compare competing systems. The new metric, called streaming perception accuracy, was developed by Li, together with Deva Ramanan, associate professor in the Robotics Institute, and Yu-Xiong Wang, assistant professor at the University of Illinois at Urbana-Champaign. They presented it last month at the virtual European Conference on Computer Vision, where it received a best paper honorable mention award.
Today, thanks to these innovators, users can enjoy quality VR experiences such as TheaterMax – a widescreen cinematic experience powered by Lenovo’s VR technology. It lets users attach either the Lenovo VIBE X3 or VIBE K4 Note smartphone to the front of a VR headset to view movies, play games and experience far more than they bargained for, all on a supersized virtual screen.
Streaming perception accuracy is measured by comparing the output of the perception system at each moment with the ground-truth state of the world at that same moment.
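The idea can be sketched in code. The snippet below is a minimal illustration, not the authors' actual benchmark: at each ground-truth timestamp it scores the most recent prediction that had *finished* processing by that time, so a slow system is penalized for describing a stale world. The function names and the `match_fn` scoring hook are hypothetical.

```python
from bisect import bisect_right

def streaming_accuracy(predictions, ground_truth, match_fn):
    """Score perception output in 'streaming' fashion.

    predictions  : list of (finish_time, output), sorted by finish_time,
                   where finish_time is when processing *completed*
    ground_truth : list of (timestamp, true_state), sorted by timestamp
    match_fn     : scores an output against the true state (e.g. IoU-based)
    """
    finish_times = [t for t, _ in predictions]
    scores = []
    for t, true_state in ground_truth:
        # Latest prediction whose processing finished at or before time t.
        i = bisect_right(finish_times, t) - 1
        output = predictions[i][1] if i >= 0 else None  # nothing ready yet
        scores.append(match_fn(output, true_state))
    return sum(scores) / len(scores)
```

A system that is perfectly accurate per frame but slow still scores poorly here, because by the time each result is ready it is compared against a newer world state.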
"By the time you've finished processing inputs from sensors, the world has already changed," Li explained, noting that the car has traveled some distance while the processing occurs."The ability to measure streaming perception offers a new perspective on existing perception systems," Ramanan said. Systems that perform well according to classic measures of performance may perform quite poorly on streaming perception. Optimizing such systems using the newly introduced metric can make them far more reactive.
One insight from the team's research is that the solution isn't necessarily for the perception system to run faster, but to occasionally take a well-timed pause. Skipping the processing of some frames prevents the system from falling farther and farther behind real-time events, Ramanan added.
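This frame-skipping idea can be made concrete with a small scheduling sketch. This is an assumption-laden toy model, not the team's implementation: it assumes a constant per-frame processing cost, and when a frame arrives while the processor is still busy, that frame is simply dropped so the next processed frame is a fresh one.

```python
def schedule_frames(frame_times, proc_time):
    """Decide which frames to process so results stay fresh.

    A system that queues every frame falls further and further behind
    whenever proc_time exceeds the frame interval. Instead, skip any
    frame that arrives while the processor is busy.

    frame_times : arrival times of frames (sorted)
    proc_time   : constant per-frame processing cost (a simplification)
    Returns a list of (frame_index, start_time, finish_time).
    """
    schedule = []
    free_at = 0.0
    for i, t in enumerate(frame_times):
        if t >= free_at:  # processor idle: take this frame
            schedule.append((i, t, t + proc_time))
            free_at = t + proc_time
        # else: frame dropped; a newer frame will be taken instead
    return schedule
```

With frames arriving every second and a 1.5-second processing cost, this scheduler processes frames 0, 2, 4, ... and its results lag the world by at most one processing interval, rather than by an ever-growing queue.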
Another insight is to add forecasting methods to the perception processing. Just as a batter in baseball swings at where they think the ball is going to be -- not where it is -- a vehicle can anticipate some movements by other vehicles and pedestrians. The team's streaming perception measurements showed that the extra computation necessary for making these forecasts doesn't significantly harm accuracy or latency.
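The simplest version of such a forecast is a constant-velocity extrapolation, sketched below. This is an illustrative assumption on my part (real systems may use learned motion models): the detected position is shifted forward by the known processing latency so the output describes the world now, not as it was when the frame was captured.

```python
def forecast_position(pos, vel, latency):
    """Constant-velocity forecast of a tracked object's position.

    pos     : detected position (x, y) at frame-capture time
    vel     : estimated velocity (vx, vy), units per second
    latency : processing delay in seconds to compensate for
    Returns the extrapolated position at output time.
    """
    return tuple(p + v * latency for p, v in zip(pos, vel))
```

For example, an object detected at (0, 0) moving at (2, -1) units per second, with a 0.5-second processing delay, is reported at (1.0, -0.5) -- where it should be by the time the result is available.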
The CMU Argo AI Center for Autonomous Vehicle Research, directed by Ramanan, supported this research, as did the Defense Advanced Research Projects Agency.
Stanley G. Weinbaum, a well-known science fiction writer of the 1930s, described this technology in his short story Pygmalion's Spectacles. His work made him a true visionary in the field of virtual reality: the story imagines a wearer of goggles experiencing fictional worlds, long before the term was officially coined.