Artificial intelligence is a largely unregulated industry, a Wild West in which applications are increasingly used to make life-changing decisions without oversight or accountability. The consequences can be disastrous, and people of color, women, and other marginalized groups often bear the brunt.
The European Union thinks it has an answer: a sweeping new law, the first of its kind worldwide, that attempts to regulate the entire AI industry.
EU law is complex and opaque, so this article provides a quick overview of what the EU's AI Act entails.
What’s the big deal?
The AI Act is hugely ambitious. It requires extra checks for high-risk uses of AI with greater potential to harm people, such as systems used for grading exams or hiring employees. The first draft of the bill also bans uses of AI deemed unacceptable, such as scoring people on their perceived trustworthiness.
One flashpoint is facial recognition software, which law enforcement agencies use to identify people in public places and catch criminals based on their faces. Many people oppose the practice because it violates privacy rights, yet it is widely deployed by companies and governments around the world — and the new regulation could create problems for them.
How will it affect citizens?
In theory, it should protect people from the worst effects of AI by ensuring that applications face at least some level of oversight and accountability, shielding them from the technology's most dangerous forms.
People would have to be informed when AI is being used on them, and a complaint mechanism should be created for those who feel wronged by an AI system.
Predictive policing systems, for example, are used by law enforcement agencies across Europe; critics argue that these systems are often racially biased and lack transparent processes.