This Dutch AI start-up is changing healthcare
Dutch computer scientists are building software that can read your body language through CCTV. Already functioning in care homes around the country, the technology is the product of a University of Amsterdam spin-off and uses machine learning to monitor and respond to human behaviour in real time.
In theory, the technology could recognize scenes of aggression or fear, harassment or intimacy, automatically alert police or ambulances and report suspicious or indecent behaviour.
While the idea may seem Orwellian, the startup company behind it – Kepler Vision Technologies – has a single intention: to revolutionise the global healthcare system. Its flagship product, the Kepler Night Nurse, allows fisheye camera lenses, installed in care home ceilings, to send an emergency signal – a ‘Man Down’ detection – to care workers whenever an elderly resident has had a fall. It will recognise when a patient is sitting up, has left the room, is standing up or lying in bed.
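As a rough illustration of the event types the article describes, the Python sketch below models those states and a ‘Man Down’ alert. All class and function names here are hypothetical assumptions for illustration, not Kepler’s actual software.

```python
# Hypothetical sketch of the events described above; names are illustrative
# assumptions, not Kepler Vision Technologies' API.
from dataclasses import dataclass
from enum import Enum, auto


class ResidentState(Enum):
    """Body-language states the article says the Night Nurse can recognise."""
    LYING_IN_BED = auto()
    SITTING_UP = auto()
    STANDING_UP = auto()
    LEFT_ROOM = auto()
    MAN_DOWN = auto()  # a detected fall


@dataclass
class Detection:
    room_id: str
    state: ResidentState


def dispatch_alert(detection: Detection) -> None:
    """Send an emergency signal to care workers on a 'Man Down' detection."""
    if detection.state is ResidentState.MAN_DOWN:
        print(f"ALERT: possible fall in room {detection.room_id}")


dispatch_alert(Detection(room_id="2B", state=ResidentState.MAN_DOWN))
```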
According to Ergotherapie Nederland, every five minutes a person over the age of 65 ends up in the emergency room due to falling. In 2018, nearly 75,000 elderly people suffered serious injuries from falling and it is estimated that with the country’s increasingly elderly population, the number of falls could rise by 70% before 2050.
The Kepler Night Nurse
Kepler Vision Technologies’ CEO Harro Stokman believes the invention’s capacity to intervene in this issue and save lives could make it a global healthcare ‘unicorn’.
‘There is no real alternative to this kind of technology,’ he says. ‘One answer to the lack of care staff is to pay them a lot more, but then we would have to take that from the defence or education budget, and we just cannot do that. The other alternative is to hire foreign staff – another political impossibility. Our solution is to make existing care staff more productive through technology like the Kepler Night Nurse.’
The Kepler Night Nurse can detect a fall within 60 seconds and ‘a carer will come running,’ says Stokman. The technology can also recognise if someone has not re-emerged from the bathroom. If they’ve disappeared for 20 minutes, the detector will send a signal that something isn’t right.
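A minimal sketch of those two alerting rules, under the assumption that the system simply signals care staff as soon as a fall is seen and when a resident has been out of view in the bathroom for more than 20 minutes. The function names and event format are illustrative, not Kepler’s code; only the thresholds come from the article.

```python
# Hypothetical sketch: a detected fall is signalled straight away (within the
# 60-second window the article quotes), and a resident who has not re-emerged
# from the bathroom after 20 minutes also triggers a signal.
from datetime import datetime, timedelta
from typing import Optional

BATHROOM_TIMEOUT = timedelta(minutes=20)


def needs_carer(fall_detected: bool,
                entered_bathroom_at: Optional[datetime],
                now: datetime) -> bool:
    """Return True if care staff should be signalled for this room."""
    if fall_detected:
        return True  # 'a carer will come running'
    if entered_bathroom_at is not None and now - entered_bathroom_at > BATHROOM_TIMEOUT:
        return True  # resident has been out of sight for too long
    return False


now = datetime.now()
print(needs_carer(False, now - timedelta(minutes=25), now))  # True -> something isn't right
```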
‘Computer vision allows you to articulate what is going on in the room. There are a number of other technologies, motion sensing for example, but this isn’t accurate enough. It could set off an alarm because the curtains are blowing in the wind. It’s computer vision that will prevail in the end.’
How it works
Once a care home adopts the Kepler Night Nurse, the software is integrated into its cameras and spends several weeks observing the environment and residents’ behaviour.
Images of the room are sent back to annotators at Kepler’s office, who mark detected falls and other relevant behaviour as true or false positives, helping the machine ‘learn’ and become more accurate. After two months, the machine is fully acquainted with rooms and their occupants and functions automatically. So far, the Kepler Night Nurse has only failed twice to recognise emergencies, both due to objects obscuring the camera’s vision.
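The passage above describes a human-in-the-loop labelling step: annotators review detections and feed the corrections back into training. The sketch below shows what such a loop can look like in outline; the data structures and names are assumptions for illustration, not Kepler’s pipeline.

```python
# Hypothetical sketch of the annotate-and-retrain loop described above.
from dataclasses import dataclass, field
from typing import List, Set


@dataclass
class Detection:
    image_id: str
    predicted_event: str        # e.g. "fall", "left_room"
    confirmed: bool = False     # set by a human annotator


@dataclass
class TrainingSet:
    examples: List[Detection] = field(default_factory=list)

    def add_reviewed(self, detections: List[Detection]) -> None:
        """Fold annotator-reviewed detections back in for the next retraining run."""
        self.examples.extend(detections)


def review(detections: List[Detection], true_ids: Set[str]) -> List[Detection]:
    """Annotators mark each detection as a true or false positive."""
    for d in detections:
        d.confirmed = d.image_id in true_ids
    return detections


batch = [Detection("img-001", "fall"), Detection("img-002", "fall")]
training = TrainingSet()
training.add_reviewed(review(batch, true_ids={"img-001"}))
print(sum(d.confirmed for d in training.examples), "true positives added")
```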
Into the future
So far, Kepler Vision Technologies has signed its first two contracts with major Dutch nursing homes, one with over a thousand patients and the other with more than 300. The company currently holds seven patents, with eight more pending – twice the number of all its competitors put together.
‘We are now looking to raise €3m in late seed round funding,’ says Stokman, who has set his eyes on the lucrative US market. Roughly 3 million people need treatment for fall injuries each year in the US alone.
Deep faking
Despite public concerns over this type of AI invading personal privacy and acting as a weapon for state surveillance, Stokman says the start-up is taking stringent measures to protect all those living under the Kepler Night Nurse’s gaze.
Any images running through the software are destroyed after use, and residents’ faces are automatically altered with deep fake technology, which transforms their appearance and makes them unrecognisable. ‘If there was a hack, no one’s personal information would ever be threatened,’ Stokman says.
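To make the privacy step concrete: the article describes deep-fake face replacement followed by destruction of the source images. The sketch below uses a much simpler stand-in for the deep-fake step – detecting faces with OpenCV and blurring them – before deleting the raw frame. The file paths and overall flow are illustrative assumptions, not Kepler’s pipeline.

```python
# Hedged illustration: blur detected faces (a simple stand-in for deep-fake
# face replacement), save an anonymised copy, then destroy the original frame.
import os
import cv2

_FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def anonymise_and_destroy(frame_path: str, out_path: str) -> None:
    """Blur every detected face, write the anonymised copy, delete the original."""
    frame = cv2.imread(frame_path)
    if frame is None:
        raise FileNotFoundError(frame_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in _FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4):
        face = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(face, (51, 51), 0)
    cv2.imwrite(out_path, frame)
    os.remove(frame_path)  # the raw image is destroyed after use


# Example with hypothetical paths:
# anonymise_and_destroy("frames/room_2b.jpg", "frames/room_2b_anon.jpg")
```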
Read more at DutchNews.nl