Computer & Communications Industry Association
Published October 3, 2024

California Capitol Closeout: Key Highlights from the 2024 Legislative Session

From AI to Online Safety: Key California Legislation and Its Impact on the Tech Landscape


As the 2024 California legislative session concludes, several noteworthy bills focused on regulating various aspects of the tech sector are worth exploring in greater depth. Popular themes emerged, including the role of artificial intelligence (AI) and its potential impacts on healthcare, employment decisions, discrimination, deepfakes, privacy, and election integrity. AI was not the only topic in the spotlight; online safety also received significant attention. This session saw the introduction and passage of numerous bills aimed at regulating social media. These include new reporting requirements for cyberbullying and mandates to remove harmful content, such as illegal digital replicas and unauthorized political deepfakes. 

While industry stakeholders generally support the premise and goal of many of these bills, there was a lack of consensus on the specific language and the most effective methods for mitigating the associated risks and harms. As Governor Gavin Newsom reviewed the multitude of tech-focused bills that reached his desk, many of which contained conflicting language, he faced challenging veto decisions. He aimed to balance protecting against potential online harms with supporting tech and innovation in California, where the digital economy constitutes 10% of the state’s total GDP. Amid these discussions and government actions, it is critical to consider how these new regulations could reshape the landscape of technology and innovation in California and beyond.

Artificial Intelligence (AI)

As highlighted above, AI was a prominent topic this session, with legislation introduced on issues such as deepfakes — particularly their use in child sexual abuse material (CSAM) and politics — and digital replicas, bias in employment decisions, and strategies for mitigating harm from generative AI.

Without a doubt, protecting election integrity is a laudable and important goal — a goal shared by lawmakers, constituents, and industry alike. Three bills aimed at combating political deepfakes were introduced this session: AB 2655, AB 2355, and AB 2839. On September 17, Gov. Newsom signed all three pieces of legislation. Of note, AB 2655 requires that if a user reports “materially deceptive” content, online platforms either label it or remove it from their site within 72 hours. Throughout the session, various stakeholders, including industry and civil society organizations, expressed concerns regarding feasibility limitations and highlighted how the law might infringe upon constitutionally protected political speech. A complaint has already been filed asking the courts to enjoin enforcement of AB 2655 and AB 2839 and to declare that both laws violate the First and Fourteenth Amendments of the U.S. Constitution.

Another bill that gained considerable attention this session was AB 1836, introduced by Assemblymember Rebecca Bauer-Kahan. The bill aims to impose liability for unauthorized use of “digital replicas” of deceased personalities. While California lawmakers and residents have legitimate concerns about potential violations of intellectual property rights, the bill departs significantly from California’s long-established right of publicity statute in ways that would likely infringe upon First Amendment-protected expressive uses. Though business stakeholders, including those outside of the tech industry, expressed concerns, the bill passed out of the Legislature and was signed by the Governor.

Along similar lines, legislation requiring transparency in the use of AI for digital content creation was introduced and passed this session. SB 942, authored by Senator Josh Becker and signed by the Governor on September 19, mandates that AI providers offer free detection tools and include clear disclosures in AI-generated content. Assemblymember Buffy Wicks introduced AB 3211, a more controversial bill that faced stronger opposition from stakeholders. The bill sought to establish standards for labeling and managing AI-generated synthetic content, requiring provenance data to mitigate potential societal harms. While AB 3211, like many bills this session, had commendable goals, its broad scope and complexity made large-scale implementation challenging. The bill was ultimately halted in the Senate.

Lastly, two significant bills that gained attention this session were AB 2930, sponsored by Assemblymember Bauer-Kahan, and SB 1047, introduced by Senator Scott Wiener. Both bills were initially designed to provide comprehensive regulations across various sectors of artificial intelligence, with SB 1047 placing a particular emphasis on generative AI. Assemb. Bauer-Kahan introduced a similar version of AB 2930 the previous year, but it faced considerable pushback from several sectors due to its broad scope. After encountering similar pushback this session, the bill was amended in the Senate Appropriations Committee, without the sponsor’s knowledge, to focus solely on regulating employment decisions. Due to these abrupt changes, the sponsor chose to pull the bill from the floor vote. Given that Assemb. Bauer-Kahan has been advocating for this bill for two years, she will likely introduce it again and try to pass it in 2025.

Despite similarly strong opposition from industry stakeholders, including those in healthcare, innovation, and financial services, SB 1047 passed the Legislature and reached the Governor’s desk. However, Gov. Newsom vetoed the legislation, stating that the proposed law “does not take into account whether an AI system is deployed in high-risk environments, involves critical decision-making or the use of sensitive data” but instead “applies stringent standards to even the most basic functions.” While Gov. Newsom reiterated his commitment to ensuring that safety protocols are established, he also stressed the importance of taking the right approach.

Online Safety

Another prominent topic of discussion this session was online safety, particularly concerning children. While the goal of ensuring children’s safety online is commendable, many proposals unfortunately missed the mark and strayed from federal law, at times conflicting with the First Amendment and Section 230. Despite numerous warnings that many of these proposals would not effectively benefit children, legislators passed the bills anyway, leaving these concerns unresolved throughout the session.

For instance, Senator Nancy Skinner successfully advanced her bill, SB 976, which had stalled in 2023. While the bill aims to safeguard children from online dangers, such as exposure to harmful content, its language would render compliance nearly impossible for most social media companies and would conflict with both Section 230 and the First Amendment. Despite these concerns being raised with Governor Newsom and Attorney General Rob Bonta, the Governor ultimately chose to sign the bill.

Throughout several legislative sessions, lawmakers have also been keenly focused on bills requiring digital services to implement content reporting mechanisms and prescribing the steps services must follow once content has been flagged. For example, in the 2023 legislative session the Legislature passed AB 1394, which requires a social media platform to provide a mechanism for a California resident to report material to the platform that the user reasonably believes is CSAM. AB 1394 also requires a social media platform to permanently block the instance of reported material and make reasonable efforts to remove and block other instances of the same reported material from being viewable on the platform. During the 2024 session, Governor Newsom signed AB 1831, which expands existing CSAM laws to include AI-generated content.

In 2024, lawmakers advanced two different measures that were ultimately signed into law by Gov. Newsom. AB 2481 requires a large social media platform to create a process to verify certain individuals as “verified reporters,” as well as a process by which a verified reporter can report a social media-related threat, or a violation of the platform’s terms of service, that in the verified reporter’s opinion poses a risk or a severe risk to the health and safety of a minor. SB 1504 builds on the state’s existing Cyberbullying Prevention Act by requiring that a reporting mechanism for cyberbullying content also provide, within 36 hours of receipt of a report, written confirmation to the reporting individual that the social media platform received that individual’s report. SB 1504 also provides a private right of action to a parent, legal guardian, or administrator who submits a report of cyberbullying to the social media platform.

While well-intentioned, AB 2481 and SB 1504 raise questions about how digital services could appropriately implement such mechanisms and respond to such reports at scale. It is also unclear how digital services would verify a user’s status as a “verified reporter” and ensure that the user maintains that status. During legislative hearings, stakeholders also flagged that identifying and removing CSAM is comparatively objective, particularly given the ability to search for hash values of identified and reported CSAM, whereas the nature of the reportable content covered under California’s latest laws is far more subjective. Given the challenges of evaluating and responding to these individual reports, services may be incentivized to remove more content than they otherwise would, lest they face steep penalties for failing to respond appropriately.

Consumer Data Privacy

While California was the first U.S. state to establish a comprehensive consumer data privacy framework, the state continues to introduce legislation that would amend existing law, all amid the California Privacy Protection Agency’s ongoing rulemaking process. These potential changes to existing privacy law, layered on top of the growing patchwork of state data privacy laws, could lead to an increasingly challenging and divergent set of compliance requirements for businesses.

For example, AB 3048 would effectively require universal opt-out mechanisms under the California Consumer Privacy Act (CCPA) to transmit consumers’ opt-out preferences to businesses that they interact with online. The bill undermines user choice and goes beyond the CCPA by mandating such a signal; current law gives businesses the option to implement opt-out preference signals, allowing for flexibility. It also creates uncertainty and challenging compliance questions for businesses operating in multiple jurisdictions, as well as confusion about what a business is expected to do should consumers send conflicting signals. Gov. Newsom vetoed the bill, citing concerns that major mobile operating systems are unable to incorporate such an opt-out signal option, which could jeopardize the usability of mobile devices.

Similarly, AB 1949 would amend existing state law to generally prohibit the collection, sharing, sale, use, or disclosure of data for consumers under 18 years of age, absent affirmative consent. Many digital services recognize the importance of implementing special safeguards for younger consumers; however, AB 1949 would not only diverge from California’s existing law, but also create more stringent restrictions that go far beyond other jurisdictions while ignoring how consumers engage with businesses. California is not alone in seeking to establish stronger privacy protections for younger users — Connecticut and Virginia, in particular, have recently passed measured approaches to address this important issue. Ultimately, Governor Newsom vetoed the bill on September 28.

AB 1008, signed by Governor Newsom, seeks to alter the definition of “publicly available information” in a way that creates uncertainty about which types of data formats the CCPA applies to. This purported clarification paradoxically creates confusion where none previously existed and calls into question what existing law does and does not apply to.

Conclusion

Sacramento will undoubtedly continue to be a key driver of developments in technology policy. As the central hub of the tech sector and a burgeoning center for innovations associated with advances in AI technologies, California has the opportunity to foster further economic growth and the development of products and services with the potential for vast benefits to consumers and society. Unfortunately, heavy-handed regulation that risks stifling such progress or infringing upon constitutional protections could prove harmful, not only to businesses located in the state, but well beyond its borders.

Khara Boender

State Policy Director, CCIA
Khara Boender is the State Policy Director at CCIA, where she oversees engagement with policymakers at the state and local level.

Jordan Rodell

State Policy Manager, CCIA
Jordan Rodell is the State Policy Manager for the Computer & Communications Industry Association, where she works alongside the State Policy Director on CCIA’s engagement with policymakers and legislative tracking at the state and local level.