A long-running working group in the Senate has issued its policy recommendation for federal funding for AI: $32 billion yearly, covering everything from infrastructure to grand challenges to national security risk assessments.
This “roadmap” is not a bill or a detailed policy proposal, but it gives a sense of the scale lawmakers and “stakeholders” are contemplating whenever they get around to the real thing, though the likelihood of that happening during an election year is vanishingly small.
In a final report published by the office of Sen. Chuck Schumer (D-NY), the bipartisan working group identifies the most important areas of investment to keep the U.S. competitive with its rivals abroad.
Here are a few top-line items on the roadmap:
- “A cross-government AI R&D effort, including relevant infrastructure,” meaning getting the DOE, NSF, NIST, NASA, Commerce and half a dozen other agencies and departments to format and share data in an AI-friendly way. In a way, this relatively simple-sounding task is the most daunting of all and will likely take years to accomplish.
- Fund American AI hardware and software work at the semiconductor and architecture level, both through the CHIPS Act and elsewhere.
- Further fund and expand the National AI Research Resource, still in its infancy.
- “AI grand challenges” to spur innovation through competition in “applications of AI that would fundamentally transform the process of science, engineering, or medicine, and in foundational topics in secure and efficient software and hardware design.”
- “Support AI readiness and cybersecurity” in elections, particularly to “mitigate AI-generated content that is objectively false, while still protecting First Amendment rights.” Probably harder than it sounds!
- “Modernize the federal government and improve delivery of government services” by “updating IT infrastructure to utilize modern data science and AI technologies and deploying new technologies to find inefficiencies in the U.S. code, federal rules, and procurement programs.” I get what they’re saying here, but that’s a lot to bite off for an AI program.
- A lot of vague but large defense-related things like “assessment and mitigation of Chemical, Biological, Radiological, and Nuclear (CBRN) AI-enhanced threats by DOD, Department of Homeland Security (DHS), DOE, and other relevant agencies.”
- Look into the “regulatory gap” in finance and housing, where AI-driven processes can be used to further marginalize vulnerable groups.
- “Review whether other potential uses for AI should be either extremely limited or banned.” This item follows a section on potentially harmful applications like AI-driven social scoring.
- Legislation prohibiting AI-generated child sexual abuse material and other nonconsensual imagery and media.
- Ensure the NIH, HHS, and FDA have the tools necessary to evaluate AI tools in healthcare and medical applications.
- “Establish a coherent approach to public-facing transparency requirements for AI systems,” private and public.
- Improve the general availability of “content provenance information” — that is, training data. What was used to make a model? Is your use of the model being used to train it further? And so on. AI makers will fight this tooth and nail until they can sufficiently sanitize the ill-gotten hoards of data they used to create today’s AIs.
- Look at the risks and benefits of using private versus open source AI (should the latter ever exist in a form that can scale).
You can read the full report here; there are plenty more bullet points where the above (a longer list than I anticipated writing) came from. No line-item budget numbers are suggested.
Given that the next six months will be mostly given over to election-related rigmarole, this document serves more to stake out a lot of general ideas than to spur actual legislation. Much of what is proposed would require months if not years of research and iteration before a law or rule is arrived at.
The AI industry moves faster than the rest of the technology sector, which means it outpaces the federal government by several orders of magnitude. Though the priorities listed above are mostly prudent, one wonders how many of them will remain relevant by the time Congress or the White House actually takes action.