GenAI & Biometrics: Minimising Risk, Maximising Trust - The Case of Syntonym

Published 01 Oct 2024
Reference 6930
Region Europe
Length 7 page(s)
Language English
Summary

Biometrics measure and analyse verifiable, unique physical or behavioural characteristics that allow identification and authentication. Over the last few years, biometrics leveraging AI and GenAI technologies have significantly improved public safety, fraud and crime prevention, and the user experience. However, they raise legal and ethical concerns related to human rights, such as data protection and privacy, mass surveillance, bias and discrimination. As companies develop and/or use biometric AI technologies, they need to consider both the opportunities and the risks and responsibilities these create. What challenges do executives face when handling biometric data, and how can they implement GenAI solutions that serve their needs while respecting laws and regulations? What are the risks of using these technologies? Can GenAI be harnessed to improve security amidst the threat to privacy posed by multimodal AI systems? Can the risks be managed to ensure value creation while limiting the downside? And if so, how?

Teaching objectives

  • Explore biometrics and facial recognition technologies
  • Assess how to position a GenAI/AI startup competitively, including the differences between potential go-to-market approaches
  • Identify key opportunities, risks and responsibilities in using biometrics in specific business cases
  • Examine organizational dilemmas in implementing GenAI biometrics: build or buy; balancing trust with value creation
  • Evaluate the tradeoffs and red lines to be drawn when deciding to invest in biometric AI solutions
  • Analyze the complexities of complying with rapidly evolving legal and ethical frameworks surrounding biometrics and AI

Keywords
  • Artificial Intelligence
  • Generative AI
  • Biometrics
  • Facial recognition technologies
  • Privacy & security
  • SDG3 Good Health & Well-Being
  • SDG16 Peace, Justice and Strong Institutions
  • Q32024