Facing regulatory uncertainty, companies greatly ramped up their lobbying on AI issues at the U.S. federal level last year compared to 2023.
According to data compiled by OpenSecrets, 648 companies spent on AI lobbying in 2024, compared with 458 in 2023, a 141% year-over-year increase.
Companies like Microsoft and OpenAI have thrown their weight behind legislation such as the CREATE AI Act, which would support the benchmarking of AI systems developed in the U.S. and establish a dedicated government resource for AI research.
Most AI labs, meaning companies dedicated almost exclusively to commercializing various kinds of AI technology, spent more in 2024 than in 2023 supporting their legislative agendas.
OpenAI increased its lobbying expenditures to $1.76 million last year, up from $260,000 in 2023. Anthropic, OpenAI's close rival, grew its spend to $720,000 in 2024 from $280,000 in 2023, and the enterprise-focused startup Cohere boosted its spending to $230,000 in 2024 from $70,000 two years ago.
Both OpenAI and Anthropic made hires last year to coordinate their policy outreach. Anthropic brought on its first in-house lobbyist, Department of Justice alum Rachel Appleton, and OpenAI installed a new VP of policy.
Together, OpenAI, Anthropic, and Cohere paid $2.71 million for their 2024 federal lobbying initiatives. That is a small figure compared with what the broader tech industry put toward lobbying in the same timeframe ($61.5 million), but more than four times the total the three AI labs spent in 2023 ($610,000).
TechCrunch reached out to OpenAI, Anthropic, and Cohere for comment but did not hear back as of press time.
Last year was a turbulent one in domestic AI policymaking. According to the Brennan Center, congressional lawmakers considered more than 90 AI-related bills in the first half alone. At the state level, more than 700 bills were proposed.
Congress made little headway, prompting state lawmakers to forge ahead. Tennessee became the first state to protect voice artists from unauthorized AI cloning. Colorado adopted a tiered, risk-based approach to AI policy. And California Governor Gavin Newsom signed dozens of AI-related safety bills, a few of which require AI companies to disclose details about their training.
However, no state officials succeeded in enacting AI regulation as comprehensive as international frameworks like the EU's AI Act.
After a protracted battle with special interests, Governor Newsom vetoed bill SB 1047. The Texas Responsible AI Governance Act (TRAIGA) is broader in scope, and it may suffer the same fate if it passes through the statehouse.
It is unclear whether the federal government can make more progress on AI legislation this year than it did last, or whether there is much appetite for codification at all. President Donald Trump has signaled his intention to broadly deregulate the industry and to clear what he perceives as roadblocks to U.S. dominance in AI.
On his first day in office, Trump revoked former President Joe Biden's executive order on AI, which sought to reduce the risks AI poses to consumers, workers, and national security. On Thursday, Trump signed an EO instructing federal agencies to halt certain Biden-era AI policies and programs, potentially including rules on AI model exports.
In November, Anthropic called for federal AI regulation within 18 months, warning that "the window for proactive risk prevention is closing fast." For its part, OpenAI, in a recent policy document, called on the U.S. government to take more substantive action on AI and on the infrastructure needed to support the technology's development.