A new set of bills passed this week as lawmakers in the state of California aim to regulate the artificial intelligence (AI) industry, combating deepfakes while protecting workers from exploitation in the quickly evolving landscape.
The Democrat-controlled California Legislature will vote on hundreds of bills this week during its final session, sending them to Gov. Gavin Newsom, who will ultimately decide whether they become law.
Gavin Newsom Will Decide on the Laws
Newsom will have until September 30 to sign the proposals, veto them, or let them become law without a signature.
In July, the Democratic governor signaled that he would sign a new proposal to crack down on election-based deepfakes. However, he has not weighed in on any other legislation.
AI Is a Big Industry in California
Earlier this summer, Newsom warned that imposing too many regulations or restrictions on the huge tech industry could cut into the revenue it brings to California.
In recent years, he has suggested that the state's budget problems could be eased by letting big tech companies advance their projects without added legislation.
Stopping Election Interference
One of the biggest issues addressed by the bills is the possibility that artificial intelligence can be used to deceive voters by generating deepfake images or videos of politicians.
This practice has already tricked and confused voters in current and past elections.
Worries of Deepfake Pornography
Another major issue with AI technology is its ability to generate deepfake pornography of minors or nonconsenting adults.
In the past, California lawmakers have approved several bills to stop the harmful practice.
Harmful Content
A few proposals outlined would make it illegal for anyone to use AI tools to create images and videos of child sexual abuse.
However, current law does not allow state prosecutors to go after people who possess AI-generated child sexual abuse images unless they can prove the materials depict a real person, a troubling gap.
New Legislation Approved
Lawmakers in California previously approved legislation to ban deepfake images and videos related to elections. Social media platforms must also remove deceptive material beginning 120 days before Election Day and continuing through 60 days afterward.
Campaigns are also required to publicly disclose if they are running ads with materials altered by AI in any way.
Setting Safety Standards
The state would be the first in the nation to set up sweeping safety measures on large AI models.
The legislation introduced by lawmakers in the state would also force developers to start disclosing which data they use to train their models. The efforts are aimed at shedding more light on how AI models work and how to prevent possible catastrophic disasters.
Protecting Workers
The new laws are partly inspired by last year's Hollywood actors' and writers' strikes, which blocked big production companies from using AI-produced scripts and actors in TV and film.
Last year, lawmakers approved a proposal to protect workers, including voice actors and audiobook performers, from being replaced with AI-generated clones.
Other Work Protections
State agencies also banned AI from replacing workers at call centers under one of the proposals.
In recent months, California has also created penalties for digitally cloning dead people without the consent of their estates.
Companies Have Steadily Been Replacing Workers with AI
Despite the laws and legislation designed to protect worker rights, more people have been losing their jobs due to AI-related technology updates.
So far, roles such as data entry, customer service, assembly-line work, retail checkout, analysis, and translation have all been affected as companies replace workers with cheaper AI tools.
Tech Companies Will Face More Legislation
In the future, tech companies will likely face harsher restrictions on what AI can and cannot be used to make.
Until then, lawmakers will need to decide how to rein in the largely unregulated industry.