White Paper: Privacy, Data Protection and Care

 

Privacy?

A frequently asked question about the Kepler Night Nurse is: what about privacy? We understand the question, because the Kepler Night Nurse uses the very latest computer vision technology.

What about the privacy of a client? And what happens to the data?

Artificial intelligence, of which computer vision is a part, has become very important. Investors, including large tech companies such as Google and Facebook, conglomerates such as the Japanese SoftBank, and renowned companies such as Ahold and TomTom, have invested large amounts of resources in artificial intelligence. Financial backing for this new technology has also come from governments: France, Germany and the United Kingdom have each invested more than €1 billion, and the EU is not far behind.

Much is written in the media about artificial intelligence. For example, a recent news report stated that thousands of Amazon employees secretly listened in on recordings made by smart home devices that can order things for you when you talk to them.

Image: Aphex34, own work, CC BY-SA 4.0.

 

Kepler Vision Technologies also uses artificial intelligence. We use smart software to watch over people in their private environment who can no longer take care of themselves: the elderly and the disabled. Is that okay? We think it is only logical to answer some questions about that. In this white paper, we explain how Kepler's data collection works and how we handle privacy and data protection.

What exactly does Kepler’s software do?

Our software is called “the Kepler Night Nurse”. The Kepler Night Nurse works by using computer vision, a technology with which the Kepler team has a lot of experience: as a spin-off from the University of Amsterdam, we have been studying it for more than twenty years. The purpose of Kepler's computer vision is to determine whether someone needs help, for example when someone cannot get out of bed. We thereby help healthcare providers take better care of people.

Until the introduction of the Kepler Night Nurse, caregivers guaranteed a client's safety by entering the room regularly. We often hear that caregivers must enter a room twice a night: every client, every night, every room. This is a reasonable requirement, but it is intensive, invasive and requires a lot of effort. Unnecessary checks that require turning on the light can be very disruptive to the client's sleep. Kepler wants to do something for both the client and the caregiver: we develop software that, with the help of computer vision, tries to determine whether someone needs assistance, for example if the client has fallen out of bed. The software then warns the caregiver. We strive to ensure that the Kepler software only passes on information when something is wrong. As long as the technology is not perfect, occasional false alarms do occur. Caregivers then check unnecessarily, but for good reason. No one considers it a problem that the caregiver comes to check on a single false alarm: neither the caregiver nor the client.

Why does Kepler collect data?

Our experience is that computer vision never works immediately in practice. It is, however, possible to make existing algorithms work a hundred times better: they will then produce a hundred times fewer false alarms. Furthermore, better algorithms can reduce the chance that a real accident is missed.

To achieve this, examples must be shown to the software. In those examples, a person manually indicates where something happens in the image. This is called “annotation”. It concerns photos or short video fragments in which something relevant happens, such as falling or getting out of bed. After learning from the annotated examples, the computer vision is able to analyze the images better and thus becomes a better tool for caregivers. The Kepler Night Nurse can then better advise the caregiver when something is wrong. This is why Kepler collects data.

In practice?

Computer vision learns slowly. For the Kepler Night Nurse, thousands of hours of video must be viewed in order to learn. That makes sense, because the vast majority of what happens at night is fortunately not interesting: everyone sleeps peacefully, turns over once in a while, and even those who have a wild dream should not trigger an alarm. Below we give some examples of difficult situations that we encounter:

The Kepler Night Nurse software sometimes has difficulty recognizing a person's posture. This happens, for example, when a person rises from a sitting position: the person is no longer sitting, but is not yet standing either, and the software cannot properly classify this. By training the software, it learns this way of getting up.

If there is a statue of Mary in a room, the statue can be mistaken for a person. That is understandable, because a statue of Mary looks like a standing person. The Kepler Night Nurse must have seen a few examples to make the distinction. Another example: a hat was hung on a coat rack above a jacket, with boots underneath, and this was seen as a person. Also understandable, because even a person could be startled by it. The software must first learn that this is a mistake.

 

Example of a possible misclassification: the statue of Mary looks like a human, but it is not. Adding this example to the learning examples will further improve the algorithm.

 

People who appear in a video are not always immediately recognized as such by the software. This happens, for example, when someone lying in bed pulls the blankets completely over themselves. Just like people, the Kepler Night Nurse is not always sure, and it may need more experience to recognize such a situation.

With the current state of the art, artificial intelligence has to be retrained from scratch whenever a new situation occurs that is not recognized. In practice, this means that when the Kepler Night Nurse is retrained, all the original images are required: the new images that led to incorrect analyses are added to the old ones. Otherwise, old errors would reappear. In order to improve the software and prevent errors, it is therefore important that Kepler saves a small part (in practice less than 0.01%) of the images.
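The retraining step described above can be sketched as follows. This is an illustrative toy, not Kepler's actual pipeline, and the function and label names are assumptions. The point is simply that the new failure cases are appended to the original training set rather than replacing it, so the retrained model does not lose what it already learned.

```python
# Illustrative toy of the retraining step (not Kepler's actual pipeline):
# the newly annotated failure cases are ADDED to the original examples,
# because training on the new images alone would make old errors reappear.

def retrain(original_examples, new_failure_cases, fit):
    """Combine old and new annotated examples, then fit a fresh model."""
    training_set = list(original_examples) + list(new_failure_cases)
    return fit(training_set)

# Toy usage: 'fit' just counts examples here, standing in for real training.
old = [("frame_001.jpg", "person_in_bed"), ("frame_002.jpg", "person_standing")]
new = [("frame_107.jpg", "statue_is_not_a_person")]  # a corrected false alarm
model = retrain(old, new, fit=lambda data: {"n_examples": len(data)})
```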

Who looks at the images?

The images are viewed by annotators at the Kepler office. These are people who work at Kepler for four hours a day, and no longer than that, because the work is mind-numbing. For example, one of our annotators is a visual artist for whom annotation is a side job. Another is writing a book and is looking for extra income from work he does not have to think about. The annotators have a Dutch employment contract with Kepler that includes an obligation of confidentiality. The work is therefore not outsourced to a low-wage country; the images remain with Kepler.

What happens to the videos?

A video is viewed and paused when an interesting situation occurs, so that a box can be drawn on the screen around a person. That box is given a label, such as “person is sitting”. The video is then fast-forwarded to the next interesting situation, and so on, until all images from which something can be learned have been annotated. Annotating a frozen frame takes a few seconds. Sometimes another annotator checks a colleague's work, but otherwise a video is usually not watched a second time.
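A single annotation as described above amounts to little more than a frame position, a box, and a label. The record below is a hypothetical sketch of such data; the field names are illustrative and not Kepler's actual format.

```python
from dataclasses import dataclass

# Hypothetical annotation record (field names are illustrative, not
# Kepler's actual format). One record = one paused frame, one box, one label.
@dataclass
class Annotation:
    video_id: str       # which recording the frame comes from
    timestamp_s: float  # where in the video the annotator paused
    box: tuple          # (x, y, width, height) of the drawn box, in pixels
    label: str          # the name given to the box, e.g. "person is sitting"

ann = Annotation(video_id="room_12_night_3", timestamp_s=5120.4,
                 box=(310, 220, 140, 260), label="person is sitting")
```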

Annotators work on secure laptops that are not connected to the public Internet and that are suitable only for annotation work. The images and videos are not stored on these laptops, and the USB ports on these laptops are disabled. To prevent changes to their settings, the laptops go into a safe at the end of every working day.

For how long is the data stored?

The images collected by Kepler are retained for as long as is laid down in the processing agreement with the care institution. Usually, this means that the images are deleted after termination of the Kepler Night Nurse software license agreement.

Why is privacy important?

We think that privacy is important for the following reasons:

The first reason is that what you do is simply no one else's business. An often-heard argument is that privacy is not important as long as you do not do anything illegal, and that, for this reason, government espionage units should be given the keys to read encrypted messages. But that is of course not the case. When you go to the toilet, you close the door. You are not doing anything illegal, but that does not mean that your housemates, the institution where you live, or government services should be allowed to watch you. The same applies to what you write in your diary, the posture in which you prefer to sleep, which books you read, which TV programmes you watch, or whether you enjoy jam sandwiches: it is nobody's business if you do not want it to be. It may all be innocent, but it is still private. Everyone has something that he or she prefers to keep private, for whatever reason.

Privacy is important because it does not concern other people what you are doing.

 

The second reason is that today you do not know what will happen to your data in ten years. An example of this is the registration of beliefs: not only in the past, but also today, people are discriminated against and persecuted for their beliefs. That is why the government does not consider it acceptable to simply store all information. Personal data may only be stored for a good reason.

How does the Kepler Night Nurse contribute to privacy?

First of all, what a resident of a healthcare institution, with or without a disability, does in his or her private environment does not concern others. A room in a healthcare institution should be just as private as a room at home.

However, healthcare institutions also have a duty of care, which requires that the well-being of a client is guaranteed. To do this, a caregiver must enter the room, which breaches the resident's privacy. To really see how the client is doing, the light must actually be turned on, and that is difficult to sleep through.

The Kepler Night Nurse ensures that this is no longer necessary. The software monitors only what is important. We have taught the Kepler Night Nurse only care-relevant activities such as “person out of bed”, “person in the bathroom” or “person fallen”. The Kepler Night Nurse does not recognize other activities, whether something innocent like tying shoelaces or something personal like shouting for joy. The software only reports to the care staff when necessary: if someone has fallen, if someone has been in the bathroom for too long, or if someone cannot get out of bed. The Kepler Night Nurse thus works just like a smoke alarm: it only goes off when things go wrong. Only in that case is the privacy of a client infringed and the staff informed of how the client is doing. But that is exactly when it is needed.
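The “smoke alarm” principle above can be sketched in a few lines. The label names below are assumptions for illustration, not Kepler's actual output: the software knows only a handful of care-relevant activities, and it stays silent for everything else.

```python
# Illustrative sketch of the "smoke alarm" principle: only a handful of
# care-relevant labels exist, and only some of those trigger an alert.
# Label names are assumptions, not Kepler's actual output.

KNOWN_ACTIVITIES = {"person_in_bed", "person_out_of_bed",
                    "person_in_bathroom", "person_fallen"}
ALERT_ACTIVITIES = {"person_fallen"}  # e.g. "in bathroom too long" would
                                      # additionally need a duration check

def report(detected_label):
    """Return a message for care staff only when something may be wrong."""
    if detected_label not in KNOWN_ACTIVITIES:
        return None  # tying shoelaces, shouting for joy: simply not recognized
    if detected_label in ALERT_ACTIVITIES:
        return f"ALERT: {detected_label}"
    return None      # recognized, but nothing is wrong: stay silent
```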

It is true that Kepler's annotators and researchers infringe client privacy. However, this happens as little as possible and only to make the software better, so that clients will be bothered less by an improved system. This also improves their privacy.

Clear agreements on what happens to client data

As mentioned, privacy is important because you never know for certain what will happen to the data later. Kepler solves this by concluding a data processing agreement with each healthcare institution. The data processing agreement states which data is stored and when this data is deleted. In addition, clients must give explicit permission for their data to be used and stored. The client therefore knows who gets access to the data (Kepler) and for how long (as long as is necessary to further improve the detection).

Preventing “the Americans” from getting rich by selling our data

A concern that we at Kepler sometimes hear from healthcare institutions is that “the Chinese” or “the Americans” will later get rich from that healthcare institution's data in the event of an acquisition. The underlying reason is that 99% of successful high-tech start-ups are at some point taken over by much larger multinationals. With such an acquisition, the data collected by Kepler would end up in Chinese or American hands.

Our answer to this is:

Yes, foreign multinationals can indeed get rich from technology developed in the Netherlands. But if Kepler were to be acquired, it would be because we are developing completely new artificial intelligence algorithms, not because we are selling data from Dutch healthcare institutions. The processing agreements with our customers set requirements on how data must be handled, and in the event of a takeover a foreign party would also have to adhere to them. In addition, European legal rules apply, which are enforced in the Netherlands by the Dutch Data Protection Authority.

What about data protection?

Data security is important: unauthorized people (such as hackers) must be prevented from gaining access to the data. Any breach of the system is not only detrimental to Kepler, but can also damage the reputation of the healthcare institution. Moreover, it can lead to extortion of the healthcare institution or of the person to whom the images belong.

The data that Kepler collects from customers is always stored encrypted. The data is only accessible via so-called “two-factor authentication”, and the system automatically records who accesses the data. The security measures that Kepler has taken are validated through a certification procedure: Kepler Vision Technologies is ISO 27001 and NEN 7510 certified as of April 2020. Kepler was assisted in the certification process by Vos Orbedo; the assessment was carried out by TÜV Netherlands.
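As a sketch of what “two-factor authentication” plus automatic access recording can look like: the snippet below is illustrative only and is not Kepler's actual system. The one-time code follows the standard TOTP scheme of RFC 6238 (the scheme used by common authenticator apps); the in-memory audit log and all names here are assumptions.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """RFC 6238 time-based one-time password: the 'second factor'."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

AUDIT_LOG = []  # stands in for the automatic record of who reads the data

def read_data(user, otp, secret_b32, at=None):
    """Grant access only on a valid one-time code; always log who asked."""
    granted = hmac.compare_digest(otp, totp(secret_b32, at=at))
    AUDIT_LOG.append({"user": user, "granted": granted, "time": time.time()})
    if not granted:
        return None
    return b"<decrypted images would be returned here>"
```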

Your cares are our care. Do you have any further questions?