The Care Home Environment, Infomednews, and Attoday published an article about face anonymization.
Kepler Vision Technologies today announced an upgrade to the Kepler Night Nurse that automatically blurs the faces of anyone detected by its smart sensor system. This upgrade provides an additional level of guaranteed privacy for both patients and care staff.
Through the use of deep learning and computer vision, the Kepler Night Nurse can detect when elderly patients fall, when they are in physical distress, and when patients suffering from dementia wander into areas they are not supposed to enter — automatically alerting staff when these patients need assistance. Replacing older sensor systems such as bed mats, motion sensors, and wearables like necklaces and bracelets, the software allows staff to respond to patients immediately and eliminates 99% of false alarms.
Unlike traditional camera systems, which require staff attention to monitor, the Kepler Night Nurse analyzes live video feeds in software, so footage is never seen by a human being until staff need to confirm the cause of a fall or physical discomfort. To better support fall prevention practices, care staff can view a still image captured 30 seconds before a patient falls to determine the cause of the fall. With this new addition, all staff and patients in the image have their faces blurred to guarantee privacy, while still allowing staff to interpret the care situation in the room.
The addition of automatic face blurring to the Kepler Night Nurse adds an extra level of privacy for anyone in these environments, similar to the automatic anonymization solutions used by Google Maps or Amazon Rekognition.
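As an illustration of the general technique — not Kepler's actual implementation — automatic face anonymization typically pairs a face detector with a blur applied to each detected bounding box. The sketch below assumes the bounding boxes have already been produced by some detector (a hypothetical upstream step) and shows only the blurring stage, using a simple repeated box blur in NumPy:

```python
import numpy as np

def blur_region(image: np.ndarray, box: tuple, passes: int = 5) -> np.ndarray:
    """Anonymize one detected face by box-blurring the pixels inside `box`.

    `image` is an H x W grayscale array; `box` is (top, left, bottom, right),
    as a face detector might report it. Pixels outside the box are untouched.
    """
    out = image.astype(float).copy()
    t, l, b, r = box
    region = out[t:b, l:r]
    for _ in range(passes):
        # Average each pixel with its four neighbours (edge-padded box blur).
        padded = np.pad(region, 1, mode="edge")
        region = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                  padded[1:-1, :-2] + padded[1:-1, 2:] +
                  padded[1:-1, 1:-1]) / 5.0
    out[t:b, l:r] = region
    return out.astype(image.dtype)
```

In a production system the blur kernel would be much stronger (e.g., a large Gaussian) and applied per detected face in each frame, but the principle is the same: destroy identifying detail inside the box while leaving the surrounding scene, such as a patient's posture on the floor, readable.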
Dr. Harro Stokman, CEO of Kepler Vision Technologies, said: “As machine learning tools continue to proliferate throughout the healthcare sector, ensuring these systems provide benefits without compromising patients’ privacy and dignity is of paramount importance. While our Kepler Night Nurse system already provides privacy because the video feeds it monitors are only ever ‘seen’ by the algorithm, automatically blurring the faces in this closed system will provide an extra level of privacy for both patients and staff who are concerned that the Kepler Night Nurse could be used to spy on them. We look forward to announcing further upgrades to the Kepler Night Nurse that will improve its monitoring capabilities and ease of use, and ensure continued transparency.”
To further cement its position as a leader in the ethical application of AI, Kepler Vision Technologies recently worked with AI bias tester Code4Thought to ensure its machine learning solution stood up to the highest level of independent scrutiny. Based on the ISO-29119-11 standard — guidelines designed to test black-box AI-based systems for accuracy and precision — Code4Thought’s analysis showed that the algorithms of Kepler Vision Technologies make correct predictions regardless of a patient’s gender. It also set out guidelines to protect against AI bias in all future updates.