AI Tools Need Transparency Regarding Bias and Serious Threats

WASHINGTON, D.C.—The America First Policy Institute (AFPI) has released a new expert insight from Matthew Burtell, senior policy analyst for AI and Emerging Technology, and Yusuf Mahmood, director.

The report highlights the need for greater transparency into the design choices that shape AI behavior, particularly the ideological, racial, and political bias that AI platforms have repeatedly demonstrated.

A notable example of this bias was Google's Gemini debacle in 2024, when the company pulled the platform's image-generation feature after the software refused to generate images of white individuals and produced ahistorical depictions of figures such as the Founding Fathers and Nazi-era soldiers.

These platforms also often display ideological slants when answering questions about politics, policy, and culture, prompting calls for greater clarity into how they are programmed.

“AI products are powerful and increasingly utilized by the American public, and Americans deserve to know how these platforms are designed, the values embedded into them, and the risks they may pose,” said Mahmood. “While AI is a critical tool, we have an obligation, especially to our children, to ensure it does not threaten safety or health. Investigators have documented AI systems engaging in grooming behavior, encouraging drug use, and coaching children to lie to their parents. The American people have the right to know about the hidden design choices that perpetuate these dangers.”

As AI platforms become more widely used in daily life, AFPI will continue to advocate for transparency requirements on AI companies so that these platforms serve as tools to educate, not indoctrinate.
