Artificial Intelligence (AI) 

The FDA maintains a database of approved AI technologies, “Artificial Intelligence and Machine Learning in Software as a Medical Device” (https://www.fda.gov/medical-devices/software-medical-device-samd/artificial-intelligence-and-machine-learning-software-medical-device). The EU and other countries do not maintain an equivalent database. 

At the time of writing, the EU institutions are also negotiating a proposal for a new EU Artificial Intelligence Regulation (the so-called ‘AI Act’), which, if adopted, would add a further layer of requirements for AI technologies that comprise (or are incorporated in) Medical Devices.

The EU Commission’s draft proposal for the AI Act contains the following:

  • High-risk AI systems, including those used to manipulate human behaviour, conduct social scoring or carry out indiscriminate surveillance, will be banned in the EU, although exemptions for use by the state or state contractors could apply [1].
  • Special authorisation from authorities will be required for “remote biometric identification systems” such as facial recognition in public spaces.
  • “High-risk” AI applications will require inspection before deployment to ensure that systems are trained on unbiased data sets and operate with human oversight. These include applications that pose a safety threat, such as self-driving cars, and those that could affect someone’s livelihood, such as hiring algorithms.
  • People need to know when they’re interacting with an AI system, unless this is “obvious from the circumstances and the context of use”.
  • A publicly accessible database will record data on “high-risk” AI systems. Public sector systems would be exempt.
  • A post-market monitoring plan will evaluate AI systems’ continued compliance with the requirements of the regulation.
  • These rules will apply to EU companies and to those that operate in the EU or affect EU citizens.
  • Some companies will be permitted to carry out self-assessments, but others will be subject to third-party checks.
  • A “European Artificial Intelligence Board”, comprising representatives from each Member State, will be created to help the Commission define “high-risk” AI systems.
  • AI systems used for military purposes are exempt from the regulation.

If the draft AI Act is approved in its initial form, regulators will be able to fine non-compliant companies up to €20m, or 4% of their worldwide turnover.

References

[1] Clarke, L. “The EU’s leaked AI regulation is ambitious, but disappointingly vague.” Tech Monitor. https://techmonitor.ai/policy/eu-ai-regulation-machine-learning-european-union/