
ML code can open iron doors: Gesture recognition


Electronic access cards are nothing new – they're last century's news. But what if you could open a door just by twisting your phone? A group of very smart people sat down and decided to find a way to take the experience of opening a door to the next level – and the perfect solution was to turn the phone into a key, relying on gesture recognition.

Johan Eliasson, Director of Analytics, and Gábor Stikkel, Senior Data Scientist, at HID Global (a subsidiary of Assa Abloy), presented a revolutionary approach to access control with gesture recognition enabled by machine learning at the Data Innovation Summit 2019.

Johan Eliasson presenting at Data Innovation Summit 2019
Photo by Hyperight AB® / All rights reserved.

The future of door opening

As for why they chose the phone to act as a key, Johan says, "You can forget your keys at home, you can forget your access card, you can forget your kids, but probably you won't forget your phone".

Apart from turning a phone into an access card, they wanted to take the science of opening a door even further into the future. “What if the door opened before you reached it?” asks Johan.

The solution was a motion Johan calls "Twist and Go": a gesture you perform with your phone while approaching the door, so that by the time you get to it, it's already open.

In the future, they also plan to work on doors that will open automatically. However, this innovative access brings in many security aspects that need to be taken into consideration.

The access must be secure and intentional – the door must open only when a specific person intends to come in. And that is the role of the machine learning algorithm running in the background of the security solution.

The person tasked with this challenge was Gábor. He had to design a solution that grants access based on the gesture “Twist and Go”.

Photo by Hyperight AB® / All rights reserved.

ML-based gesture recognition for access

The initial implementation of gesture recognition was a simple rule-based one using the gyroscope: if the reading went above a certain threshold, it triggered the opening of the door. However, it proved to be flawed.
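
For illustration, here is a minimal sketch of what such a threshold rule might look like; the axis and threshold value are invented for the example, not HID Global's actual parameters:

```python
# Minimal sketch of a rule-based "twist" check (threshold and axis are
# arbitrary example values, not HID Global's actual implementation).

GYRO_THRESHOLD = 4.0  # angular velocity in rad/s; illustrative only

def is_twist(gyro_sample):
    """Trigger when rotation around one axis of the phone exceeds a fixed threshold."""
    x, y, z = gyro_sample
    return abs(y) > GYRO_THRESHOLD

print(is_twist((0.1, 4.5, 0.2)))   # True: a deliberate twist
print(is_twist((0.1, 4.8, 3.0)))   # True as well: vigorous shaking also trips the rule (false positive)
```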

While running tests, they discovered that besides the "Twist and Go", other movements with the phone also triggered opening. Gábor had to work out how to make the door open on a "Twist and Go" and no other gesture.


Cracking the “Twist and Go” gesture

The simple rule-based measurement they had implemented for gesture recognition wasn't cutting it because it was giving a lot of false positives. Additionally, they had to take into account the user experience of opening a door with a twist gesture. Research into user behaviour led them to the field of human activity recognition, where many big players had already made significant progress, as Gábor relates. Some examples are:

  • Microsoft’s Seeing AI for impaired people
  • Google’s TensorFlow Lite for mobile phone
  • Amazon’s image recognition on Raspberry Pi

There was already a foundation laid in the field by some of the tech giants, but when Gábor started working on his project, he ran into a lack of data. Data is to a data scientist what ingredients are to a cook, so Gábor decided to grow his own ingredients, or rather start collecting data from scratch.

“It was really hard to capture ground truth,” he emphasises. “From the pure logs in the system, we couldn’t be sure if somebody did the real twist or it triggered a fake opening,” clarifies Gábor. This was a challenge they had to figure out.

In the experiments, they utilised several sensors to capture the right gesture, such as the accelerometer, gyroscope, and linear acceleration and rotation vector sensors. All this data, together with the user markers, was fed into a Twist and Go ML model. The next step was integrating the model into both Android and iOS smartphones, which require very different approaches.
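
The talk doesn't describe the data format, but as a rough sketch of the general idea, labelled training data could be assembled by joining the raw sensor log with the user-confirmed markers; all column names and values below are invented for illustration:

```python
import pandas as pd

# Hypothetical sensor log and ground-truth markers (columns and values are
# invented; the actual data format was not described in the talk).
sensors = pd.DataFrame({
    "timestamp": pd.date_range("2019-01-18 09:00:00", periods=12, freq="100ms"),
    "gyro_y": [0.1, 0.2, 3.9, 4.2, 0.3, 0.1, 0.0, 0.1, 0.2, 0.1, 0.0, 0.1],
})
markers = pd.DataFrame({
    "timestamp": [pd.Timestamp("2019-01-18 09:00:00.300")],
    "event": ["real_twist"],          # user-confirmed "Twist and Go"
})

# Attach the ground truth: a sample counts as a twist if a confirmed marker
# falls within half a second of it (tolerance chosen arbitrarily).
labelled = pd.merge_asof(
    sensors.sort_values("timestamp"),
    markers.sort_values("timestamp"),
    on="timestamp",
    direction="nearest",
    tolerance=pd.Timedelta("500ms"),
)
labelled["is_twist"] = labelled["event"].notna()
print(labelled[["timestamp", "gyro_y", "is_twist"]])
```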


Integrating the “Twist and Go” with mobile phones

They used a time series classification approach with a sliding window: sensor data was analysed, features were calculated on it, and the data was labelled for predictive modelling. Integrating the resulting model into smartphones was another challenge, luckily with some open-source tool support.
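
As a rough sketch of that general technique (sliding-window feature extraction followed by a classifier), using placeholder data, and with window size, features, and model parameters chosen purely for illustration rather than taken from HID Global's pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(stream, window=50, step=10):
    """Slide a fixed-size window over the multi-channel sensor stream and
    compute simple summary statistics per window (illustrative features only)."""
    feats = []
    for start in range(0, len(stream) - window + 1, step):
        w = stream[start:start + window]                  # shape: (window, n_channels)
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     w.min(axis=0), w.max(axis=0)]))
    return np.array(feats)

# Placeholder sensor stream and labels standing in for the real recordings:
# 6 channels (e.g. 3-axis accelerometer + 3-axis gyroscope), one label per window.
rng = np.random.default_rng(0)
stream = rng.normal(size=(2000, 6))
X = window_features(stream)
y = rng.integers(0, 2, size=len(X))                       # 1 = "Twist and Go", 0 = anything else

clf = RandomForestClassifier(n_estimators=100, max_depth=8)
clf.fit(X, y)
```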

Initially, Gábor tried both neural networks and decision trees, but decision trees proved more appropriate for the smartphone environment.

They succeeded in reducing false positives by more than 85%. Additionally, they were able to predict a twist 200 milliseconds earlier, which enabled the door to open before the person reached it.

Gábor admits they had some doubts about fitting a total of 235 decision trees into a mobile phone. But to their delight, they had the first door opening based on the machine learning model on 18 January 2019.

Putting gesture recognition into production

A successful model was just the beginning, because what followed were productisation challenges. The biggest one was integrating a Python model into an Android phone.

The solution came in the form of a model compiler that converts a Python model into a C function that can be included in an Android app.
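
The talk doesn't name the compiler they used, but the open-source m2cgen library illustrates the general idea: it translates a fitted scikit-learn model into dependency-free C source that can be compiled into an app. The model below is a small placeholder, not the real gesture classifier:

```python
import m2cgen as m2c
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Train a small placeholder model (stands in for the real "Twist and Go" classifier).
X, y = make_classification(n_samples=500, n_features=24, random_state=0)
model = RandomForestClassifier(n_estimators=10, max_depth=6).fit(X, y)

# Translate the fitted model into plain C source with no runtime dependencies,
# which can then be compiled into the Android app's native code.
c_source = m2c.export_to_c(model)
with open("twist_model.c", "w") as f:
    f.write(c_source)
```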

As for iOS, they developed an ML model (named after Gábor himself) which was compiled with Core ML.
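
The iOS conversion path isn't detailed beyond Core ML, but a common route for scikit-learn models is the coremltools converter, which produces an .mlmodel file that Xcode compiles into the app bundle. The model and file name below are again placeholders:

```python
import coremltools as ct
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import make_classification

# Placeholder classifier standing in for the real "Twist and Go" model.
X, y = make_classification(n_samples=500, n_features=24, random_state=0)
model = RandomForestClassifier(n_estimators=10, max_depth=6).fit(X, y)

# Convert the scikit-learn model to Core ML format; Xcode then compiles
# the resulting .mlmodel into the iOS app.
mlmodel = ct.converters.sklearn.convert(model)
mlmodel.save("TwistAndGo.mlmodel")
```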

Photo by Hyperight AB® / All rights reserved.

The future of opening doors

The “Twist and Go” gesture recognition for access is just the beginning of the future of door-opening, states Gábor.

The next peaks HID Global wants to conquer are a knock motion and even seamless access, where people don't have to do anything at all and the system recognises them and lets them in.

The ultimate use case Gábor dreams of is building these doors for spaceships colonising Mars.

