Kepler Vision Technologies announces the results of an independent AI bias audit by AI advisory service Code4Thought

 

  • Algorithmic Bias Testing company Code4Thought confirms Kepler’s computer vision solution is free of gender bias
  • Kepler Vision’s “Night Nurse” solution is used to look after the wellbeing of elderly residents in care homes

Global leader in elderly care facility monitoring software Kepler Vision Technologies today announced the results of an independent AI bias audit by AI advisory service Code4Thought. The results confirmed that Kepler’s “Night Nurse” algorithm takes an unbiased approach to looking after the wellbeing of patients in a professional care environment, regardless of a patient’s gender. 

Built on machine learning-powered human activity recognition technology, the Kepler Night Nurse (KNN) solution immediately alerts staff whenever it detects that patients in care facilities have fallen or are experiencing physical distress, meaning they receive the attention they need the moment they need it. This reduces the need to constantly check on patients and removes the wasted time associated with the false alarms generated by other monitoring solutions.

As the world’s first computer vision-based fall detector to be registered as a medical device, Kepler Night Nurse’s deep learning algorithm requires close scrutiny to ensure that it produces fair and accurate results for every patient it monitors. Code4Thought’s bias testing analysis is based on the ISO-29119-11 standard, guidelines for testing black-box AI-based systems to ensure accuracy and precision. The outcome of the analysis confirmed that KNN’s algorithm makes correct predictions regardless of a patient’s gender. It also provided guidance on how to maintain this by including diverse images in KNN’s training datasets, protecting against bias and healthcare disparities in already vulnerable demographics as the use of KNN expands.

Dr. Harro Stokman, CEO of Kepler Vision Technologies, said: “As the gravity of decisions made by AI systems increases, so does our need to ensure they operate fairly and transparently. Nowhere is this needed more than in the medical device space, where the judgments of AI-powered tools can literally be a matter of life and death. The EU Commission’s proposed regulation on AI systems makes it clear that more can be done by companies using highly complex and opaque deep learning algorithms to build confidence in AI systems. By working with Code4Thought, Kepler Vision is confirming its dedication to improving the lives of all patients its technology is applied to, regardless of individual differences.”

Yiannis Kannellopoulos, founder of Code4Thought, said: “Poorly trained algorithms in the medical space result in inaccuracies and biases that can have potentially devastating consequences. For this reason, making sure that its algorithms stand up to the highest level of independent scrutiny should be a priority for any company using AI systems. It is very rewarding to work with teams and organizations such as Kepler that value building AI technology that can be trusted by businesses and individuals. It is encouraging to find that behind a high-quality AI system, there is a high-quality team of engineers dedicated to data science professionalism.”

This article was published on the 2nd of November on Startupper.gr.