UTSA Researchers Explore AI Threats in Software Development

UTSA researchers delve into how errors in AI models could impact software development, focusing on hallucinated packages.

Researchers from the University of Texas at San Antonio (UTSA) have launched an investigation into the threats posed by the use of artificial intelligence in software development. Their study focuses on the implications of errors in AI language models, particularly hallucinations, which can mislead developers.

The research highlights how these hallucinations arise when AI language models recommend non-existent or incorrect software packages that developers might inadvertently rely upon. Such mistakes are particularly associated with Large Language Models (LLMs), which often fabricate information that appears plausible but is ultimately false or unverified.

In their research paper, the UTSA team analyzed various language models to understand the frequency and impact of these hallucinated packages on software projects. Their findings point to the need for vigilant verification processes and the development of mechanisms to identify and mitigate hallucinated outputs, thereby improving the reliability of Artificial Intelligence-assisted coding environments.
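One simple mitigation along the lines the researchers describe is to check any package an AI assistant suggests against a vetted allowlist before installing it. The sketch below is a minimal illustration, not the UTSA team's method: the `is_vetted` function and the example allowlist are hypothetical, and a production pipeline would more likely query the package registry itself (for example, PyPI's JSON API) or an internal mirror.

```python
import re

def normalize(name: str) -> str:
    # PEP 503 name normalization: lowercase, and collapse runs of
    # '-', '_', and '.' into a single hyphen, so "Scikit_Learn"
    # and "scikit-learn" compare equal.
    return re.sub(r"[-_.]+", "-", name).lower()

def is_vetted(package: str, allowlist: set[str]) -> bool:
    # Flag any AI-suggested dependency that is not on a vetted
    # allowlist; hallucinated names will fail this membership test.
    return normalize(package) in {normalize(p) for p in allowlist}

# Hypothetical allowlist of approved dependencies for a project.
vetted = {"requests", "numpy", "scikit-learn"}

print(is_vetted("Requests", vetted))   # known package, case-insensitive match
print(is_vetted("reqeusts", vetted))   # plausible-looking typo is rejected
```

An allowlist only catches names outside the approved set; it cannot tell a genuinely new legitimate package from a hallucinated one, which is why the study argues for verification against authoritative registries as well.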
