Soli and Motion Sense: Another Sci-Fi Tech Becoming a Reality

Back in the late 1980s and early 1990s, when “Star Trek: The Next Generation” was on the air, it introduced not only a new class of alien threats but also a new generation of sci-fi tech into our lives. The crew aboard the Enterprise used “PADDs”, 24th-century tablets, to read and write reports, carry data around, and learn; they used wearable communicator badges to keep in touch and interacted with the ship’s computer through a voice interface. And then there were the touchscreen controls all over the place: consoles that doubled as keyboards and displays at the same time.

This generation of fictional devices and interfaces seems to have inspired the development of similar devices in real life. Today, we interact with technology in new ways: we have smartphones and tablets with touchscreens, smart speakers we can talk to much like Captain Jean-Luc Picard spoke to the Enterprise’s computer, wearables in the form of fitness trackers and smartwatches, and we have even built our own version of Star Trek’s famous “Holodeck” in the form of virtual reality headsets. We have pretty much checked off every innovative technology seen in the series, with the possible exceptions of interactive holograms and the warp drive. So it’s time to take the next step in user interfaces, and that next step is none other than gesture control.

Minority Report and the story of the real 2054

Back in 2002, long before the first iPhone was released, director Steven Spielberg made “Minority Report”, a movie about a future law enforcement agency called “PreCrime” that catches criminals before they commit a crime, relying on the predictions of three “precogs”: psychics permanently wired into the system who can see events before they happen. The movie was not only a blockbuster but also offered a sometimes unsettling, other times fascinating view of the future – the story is set in 2054 – that looked and felt surprisingly realistic. That was no accident, nor the work of a single genius: the environments and technologies seen in the film were worked out by a team of fifteen experts – futurists, biometrics researchers, computer scientists, and the like – to create a world as close to the “real” 2054 as possible.

Among other things, the movie predicted self-driving cars, multi-touch interfaces, retina scanners, personalized advertising, and electronic paper – and, most alluring of all, gesture control. Some of these are in widespread use as we speak; others are still in development. And gesture control is on the brink of becoming a reality thanks to Google’s “Soli”, showcased this fall on the Pixel 4 smartphone.

A lot less touching

Smartphones may not be the first devices that come to mind when you think of gesture control, but they are ideal for showcasing what the new interface is capable of. And Google did just that when it demonstrated the new “Motion Sense” features of its Pixel 4 smartphone this fall. The Pixel 4 is the first smartphone with radar-based control, thanks to Project Soli, developed by Google’s own Advanced Technology and Projects (ATAP) group. Soli is a dedicated gesture-recognition chip built around radar, a novel approach that breaks with every camera- and light-based gesture recognition attempt made before.
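To make the idea concrete, here is a deliberately simplified sketch of how a radar-based gesture pipeline can work in principle: the sensor reports the range and radial velocity of a nearby hand, and short bursts of those readings are classified into coarse gestures. Everything here – the frame format, the thresholds, the gesture names – is an illustrative assumption, not Google’s actual Soli implementation, which relies on far more sophisticated signal processing and machine-learning models.

```kotlin
// Hypothetical, heavily simplified radar gesture classification.
// RadarFrame, the thresholds, and the gesture set are assumptions for
// illustration only; they do not reflect the real Soli pipeline.

/** One radar frame reduced to two hand-crafted features. */
data class RadarFrame(
    val rangeMeters: Double,     // estimated distance of the hand from the sensor
    val radialVelocity: Double   // positive = moving toward the sensor (m/s)
)

enum class Gesture { REACH, SWIPE_AWAY, PRESENCE, NONE }

/** Classify a short burst of frames into a coarse gesture. */
fun classify(frames: List<RadarFrame>): Gesture {
    if (frames.isEmpty()) return Gesture.NONE
    val avgVelocity = frames.map { it.radialVelocity }.average()
    val avgRange = frames.map { it.rangeMeters }.average()
    return when {
        avgVelocity > 0.3  -> Gesture.REACH       // hand approaching the device
        avgVelocity < -0.3 -> Gesture.SWIPE_AWAY  // hand retreating
        avgRange < 0.5     -> Gesture.PRESENCE    // someone is near the phone
        else               -> Gesture.NONE
    }
}

fun main() {
    val reaching = listOf(
        RadarFrame(rangeMeters = 0.6, radialVelocity = 0.5),
        RadarFrame(rangeMeters = 0.4, radialVelocity = 0.6)
    )
    println(classify(reaching)) // prints REACH
}
```

The appeal of radar over cameras is visible even in this toy version: the sensor only needs coarse motion and distance information, not an image of you or your surroundings.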

With “Motion Sense”, you can skip tracks, silence incoming calls, interact with games, and do many other things without ever touching your phone. But it can do more than that: the phone can sense when you reach for it, silencing the alarm or turning on the screen to prepare for unlocking – and the possibilities don’t end there. The feature set in “Motion Sense” is somewhat limited right now, but the team behind Project Soli is working on new features covering everything from gaming to personal wellness. Best of all, these interactions are handled locally, on the device itself, so you won’t have to worry about privacy when using them.
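As a rough illustration of what that local handling might look like from an app’s point of view, the sketch below maps coarse gestures to media and screen actions. The listener shape, gesture names, and MediaController interface are stand-ins invented for this example; they are not the actual Android or Motion Sense API.

```kotlin
// Hypothetical gesture-to-action mapping; MotionGesture and MediaController
// are illustrative stand-ins, not the real Motion Sense API.

enum class MotionGesture { SWIPE_LEFT, SWIPE_RIGHT, REACH, PRESENCE }

interface MediaController {
    fun skipToNext()
    fun skipToPrevious()
}

class GestureHandler(private val media: MediaController) {

    // All decisions are made here, on the device: radar readings never need
    // to leave the phone for these interactions to work.
    fun onGesture(gesture: MotionGesture) {
        when (gesture) {
            MotionGesture.SWIPE_RIGHT -> media.skipToNext()      // skip to the next track
            MotionGesture.SWIPE_LEFT  -> media.skipToPrevious()  // go back a track
            MotionGesture.REACH       -> wakeScreen()            // prepare for unlocking
            MotionGesture.PRESENCE    -> { /* keep the ambient display awake */ }
        }
    }

    private fun wakeScreen() {
        println("Waking screen and priming unlock")
    }
}
```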

A gesture-controlled smartphone is just the first step – in the coming years, we will probably see a whole series of devices, from smart TVs to computers, gaming consoles, and VR headsets, with gesture control built in. And then it will truly start to feel like 2054.
