DOGE Developed Error-Prone AI Tool to “Munch” Veterans Affairs Contracts
The Department of Government Efficiency (DOGE) developed an artificial intelligence (AI) tool to identify unnecessary contracts at the Department of Veterans Affairs (VA), intending to cut costs. However, the AI system showed significant technical flaws and inaccuracies, resulting in potentially harmful contract cancellations affecting veterans’ services.
Development and Purpose of the AI Tool
In early 2025, the Trump administration sought to reduce spending at the VA. Officials engaged Sahil Lavingia, a software engineer with limited background in health care or government procurement, to rapidly build an AI solution. Working under extreme time constraints, Lavingia created a contract evaluation tool within days.
This AI software labeled contracts considered non-essential as “MUNCHABLE,” aiming to streamline spending by recommending which contracts could be eliminated. Lavingia’s tool was designed to parse contract texts and extract details such as contract number and total contract value.
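Lavingia later published the script on GitHub (discussed below), but the general shape of such a pipeline is easy to sketch. The following is a minimal illustration only, assuming the OpenAI chat API and invented prompt wording; it is not the released “munchable” code:

```python
# Minimal sketch of LLM-based contract triage. Assumptions: the OpenAI chat
# API and hypothetical prompt wording; this is NOT the released script.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

PROMPT = (
    "You review federal contracts. From the text below, extract the contract "
    "number and the total contract value, then answer MUNCHABLE if the "
    "contract looks non-essential, otherwise NOT MUNCHABLE.\n\n"
    "Contract text:\n{text}"
)

def triage_contract(contract_text: str) -> str:
    """Ask the model to extract key fields and label the contract."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # stand-in; reporting says older, cheaper models were used
        messages=[{"role": "user", "content": PROMPT.format(text=contract_text)}],
    )
    return response.choices[0].message.content
```

Everything downstream, including which contracts get cut, inherits whatever this single model call gets wrong.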
Technical Flaws and Inaccuracies
Use of Outdated AI Models
The AI system relied on outdated, inexpensive large language models, and errors quickly emerged. Most notably, it misread contract values, often inflating small contracts’ worth: the tool reported more than a thousand contracts as worth $34 million each when some were actually valued as low as $35,000. This inflated data distorted decisions about which contracts to cut.
Limited Scope of Data Analysis
Lavingia’s script assessed only the first 2,500 words of each contract, roughly the summary pages, which rarely capture the full terms; the result was oversimplified judgments. The AI also confused the multiple monetary values that appear within a contract, selecting inaccurate figures even though correct data existed in public databases.
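To see why this is fragile, consider a deliberately naive reconstruction of the failure mode; this is illustrative code, not the actual script:

```python
import re

def naive_value_extract(contract_text: str, word_limit: int = 2500) -> str | None:
    """Reproduce the failure mode: truncate to the first `word_limit` words,
    then grab the first dollar figure found, right or wrong."""
    truncated = " ".join(contract_text.split()[:word_limit])
    match = re.search(r"\$[\d,]+(?:\.\d{2})?", truncated)
    return match.group(0) if match else None

# Summary pages often list several figures; the first one encountered may be
# a ceiling or an unrelated amount rather than the total contract value.
sample = "Option-year ceiling: $34,000,000. Total obligated value: $35,000."
print(naive_value_extract(sample))  # -> "$34,000,000" (the wrong figure)
```

An LLM prompted over the same truncated text can fail in exactly the same way, just less predictably.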
Lack of VA Context and Expertise
The AI tool ignored critical contextual knowledge of VA operations. It did not account for legal requirements or understand which services were essential to veteran care. As a result, core components of the VA’s contract procurement system, which support vital functions, were misclassified as “munchable.”
Impact on VA Contracts and Veteran Services
Contracts Flagged for Cancellation
DOGE’s AI flagged over 2,000 contracts for potential cancellation. Transparency about follow-up actions remains limited. However, at least 24 contracts identified by the system have been canceled. These included contracts supporting cancer treatment development via gene sequencing, blood sample analyses for VA research, and nurse care improvement tools.
Concerns Over Effects on Veterans
The VA stated it aimed to avoid cutting contracts directly affecting veteran care because of the possible harm. Despite this, investigative reports found that even modest cuts risk reducing the quality of services for veterans. Some VA employees reported being given only hours to justify keeping a contract, with responses confined to 255 characters, roughly the length of a social media post.
Background of Sahil Lavingia and DOGE’s Approach
Lavingia’s Experience and Time Constraints
Sahil Lavingia brought nearly 15 years of software engineering experience but no formal AI or healthcare expertise. After joining DOGE on March 17, 2025, he developed the AI system the next day using AI-assisted coding. During his roughly two-month tenure, he spent time downloading and analyzing VA contracts but had little opportunity to learn VA procedures.
Acknowledgment of Tool’s Limitations
Lavingia admitted to ProPublica that the tool made mistakes and cautioned against using its output as a final decision-maker. He released the “munchable” script publicly on GitHub so the community could improve it, in line with DOGE’s stated transparency goals under Elon Musk’s oversight at the time.
Criticisms and Expert Opinions
Concerns About AI Use in Budget Cuts
Experts question the appropriateness of AI for complex budgetary decisions, especially with veterans’ services at stake. University of Pennsylvania law professor Cary Coglianese criticized reliance on off-the-shelf large language models (LLMs), emphasizing their unreliability in such a nuanced context.
Alternative Approaches Recommended
Waldo Jaquith, former Treasury Department IT contracting lead, labeled AI the wrong tool for this task. He argued that AI generates plausible but incorrect outputs. He highlighted the need for human oversight by specialized professionals trained to evaluate government contracts thoroughly.
VA’s Response and Future Directions
VA’s Contract Review Process
VA press secretary Pete Kasperowicz praised DOGE’s efforts, calling the review a “commonsense precedent.” He noted that all 76,000 VA contracts undergo multiple internal reviews before any reduction or cancellation, involving contracting experts and senior staff members.
Potential AI Expansion at VA
VA officials have not confirmed continued use of the “munchable” AI tool. Documents reveal DOGE considered using AI further to restructure benefit claims departments, potentially replacing staff. This suggests ongoing interest in AI solutions despite earlier challenges.
Key Takeaways
- DOGE created an AI-based tool to identify and cut non-essential VA contracts but faced significant errors due to outdated models and lack of contextual knowledge.
- The AI misvalued contracts and overlooked critical legal and operational nuances, risking the cancellation of vital veteran services.
- Contract cancellations influenced by the tool have already affected important research and care programs.
- Experts advise human expertise over general AI for such complex decisions, emphasizing the risk of AI hallucinations and inaccuracies.
- The VA acknowledges the AI tool’s role but stresses comprehensive internal review before contract decisions.
- Future AI deployment at the VA remains uncertain, with potential plans to expand AI use cautiously.
DOGE Developed Error-Prone AI Tool to “Munch” Veterans Affairs Contracts
Can an AI tool with outdated models and minimal context really decide which Veterans Affairs contracts deserve the chop? The short answer: no. But that didn’t stop DOGE from building just such a tool aimed at “munching” contracts, with consequences that raise serious questions about AI’s role in government decision-making.
So, what exactly happened? Let’s break down this fascinating and cautionary tale of AI ambition, slipshod development, and high stakes for veterans.
The Birth of an AI Contract-Cruncher
Early in 2025, with the Trump administration pushing to cancel contracts at the Department of Veterans Affairs (VA), the task fell to Sahil Lavingia, a software engineer with no experience in healthcare or government policy. Working under the aegis of the Department of Government Efficiency (DOGE), Lavingia was charged with building a tool to identify “non-essential” contracts that could be axed.
The result? A hastily cobbled-together AI tool: the “muncher.” It scanned contracts, labeled some as “munchable,” and presumably saved taxpayer dollars. Sounds great on paper, right?
Outdated AI and Hallucinations Galore
Well, not so fast. The AI system was built on cheap, outdated models—kind of like buying a 15-year-old GPS for a cross-country trip. It made glaring mistakes, including wildly inflating contract values. For example, the tool flagged over a thousand contracts as if each was worth $34 million, while some were legitimately worth only $35,000.
How did this happen? Lavingia’s system analyzed just the first 2,500 words—the summary pages of each contract. The summaries were sparse, lacking the rich details needed to make sound judgments. And worse, the AI would grab the wrong dollar amounts when multiple figures were present in one document. Experts emphasize that the correct data was easily accessible in public databases but wasn’t properly leveraged.
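One obvious safeguard would have been to reconcile each extracted figure against an authoritative record before acting on it. Here is a hedged sketch, assuming a hypothetical CSV export from a public source such as USAspending.gov with "contract_number" and "total_value" columns:

```python
import csv

def load_reference_values(path: str) -> dict[str, float]:
    """Load authoritative contract values keyed by contract number from a
    hypothetical CSV export of a public database."""
    with open(path, newline="") as f:
        return {row["contract_number"]: float(row["total_value"])
                for row in csv.DictReader(f)}

def needs_human_review(contract_number: str, llm_value: float,
                       reference: dict[str, float], tolerance: float = 0.05) -> bool:
    """Flag the contract when the model's extracted value strays from the
    public record by more than `tolerance` (or when no record exists)."""
    true_value = reference.get(contract_number)
    if true_value is None:
        return True  # no public record: escalate rather than trust the model
    return abs(llm_value - true_value) / true_value > tolerance
```

A check this simple would have caught a $34 million hallucination on a $35,000 contract before anyone acted on it.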
Context? What Context?
AI’s Achilles heel in this scenario: context. Lavingia’s prompts didn’t contain crucial background on the VA’s operations—like which contracts are mandated by law or vital for patient care. This led to bizarre outputs where the AI tagged an essential part of the VA’s own procurement system as “munchable.”
Worse yet, the system didn’t understand the nuance behind contract cancellations. Not all cuts lead to savings without consequences; some erode services that veterans rely on daily.
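If an LLM is used at all for this kind of triage, the missing context can at least be pushed into the prompt. The rule text below is invented for illustration and is not DOGE’s actual prompt:

```python
# Hypothetical contrast between a context-free prompt and one that carries
# VA domain rules; the rule text is invented for illustration.
BARE_PROMPT = (
    "Is the following contract non-essential? Answer MUNCHABLE or KEEP.\n{text}"
)

CONTEXT_RULES = """Before answering, apply these rules:
- Contracts mandated by statute or regulation are never MUNCHABLE.
- Contracts supporting direct patient care, clinical research, or the VA's
  own procurement infrastructure are never MUNCHABLE.
- When a contract's purpose is unclear, answer NEEDS HUMAN REVIEW."""

GROUNDED_PROMPT = CONTEXT_RULES + "\n\nContract text:\n{text}\nAnswer:"
```

Even with rules like these, a model can still misapply them, which is why the experts quoted below insist on human review.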
The Fallout: Contracts Flagged and Veterans Impacted
DOGE’s muncher flagged over 2,000 contracts for potential cancellation. Exactly which contracts were subsequently axed remains murky, but at least two dozen cancellations have been confirmed. These include contracts for maintaining gene sequencing devices (used in advancing cancer treatments), analyzing blood samples for VA research, and providing tools that help nurses deliver better care.
The stakes? Real. While the VA claims it avoids cutting contracts directly linked to veteran care to prevent harm, reports tell a different story. Even relatively small cutbacks have already affected veterans’ care quality.
Staff members reveal they had mere hours—or in some cases, just 255 characters—to justify why a contract should not be cut. That’s basically writing a tweet defending critical services. This rushed process doesn’t inspire confidence in the tool or the decision-making that followed.
Lavingia’s Race Against Time and Admission of Flaws
Sahil Lavingia brought nearly 15 years of software experience to the table but lacked formal AI training. Even more challenging: he had only a day to build the tool from scratch after joining DOGE in mid-March. Using AI-assisted coding, he whipped up the “munchable” script quickly and spent the next week analyzing contracts.
He later acknowledged the flaws: “Mistakes were made. I would never recommend someone run my code and do what it says.” Honest feedback from the creator, but sadly too late for some VA contracts already affected.
Still, in an unusual move, Lavingia open-sourced the tool on GitHub, in line with Musk-approved transparency goals at DOGE, inviting the tech community to improve the system. Unfortunately, releasing the code did little to undo the contract cancellations already in motion.
Experts Weigh In: AI Isn’t the Answer Here
Many experts voiced sharp criticism. Cary Coglianese, a law and political science professor at UPenn specializing in AI governance, found the use of “off-the-shelf” large language models (LLMs) for complex decisions deeply problematic. These generic AI models lack reliability for nuanced budgetary decisions.
Waldo Jaquith, former Treasury Department IT contracting lead, slammed AI use in this context: “AI gives convincing looking answers that are frequently wrong.” His recommendation? Employ humans to evaluate contracts. AI should assist, not replace, human judgment—and definitely not determine the fate of veterans’ support services on its own.
The VA’s Stand and Future Prospects
Despite the uproar, VA spokesperson Pete Kasperowicz praised DOGE’s efforts, calling it “a commonsense precedent” in contract review. He emphasized that all 76,000 VA contracts undergo thorough examination by contracting experts and senior staff before any cancellations.
However, the VA declined to confirm whether it will keep using the “munchable” tool. Documents from earlier in the year indicate DOGE proposed expanding AI use—for example, automating benefits claims processing by replacing employees.
That prospect worries many. Deploying AI beyond contract evaluation to potentially replace human benefits workers could directly affect veterans seeking services.
Lessons Learned: What This Story Teaches Us
The saga of DOGE’s AI muncher for VA contracts offers essential insights:
- Speed vs. accuracy: Rushing AI solutions without adequate domain expertise leads to errors that impact real lives.
- Context is king: AI tools must understand the specific operational and legal environments to make valid assessments.
- Human oversight remains essential: AI can support but should not replace human judgment in critical policy decisions.
- Transparency matters: Open-sourcing code is good, but decision-making processes affecting veterans require accountability beyond GitHub repositories.
Could a Better AI Have Helped?
Imagine if the VA had given Lavingia the time and resources to fully understand health care contracts. What if the tool had used up-to-date models and prompts informed by legal mandates and medical priorities? Perhaps then it would have highlighted inefficient contracts without threatening vital research and services.
Instead, the tool became more of a techno-quick fix than a reliable partner in government efficiency.
What Can Veterans and Citizens Do?
Stay informed about AI use in public policy, especially when it touches services you rely on. Ask:
- Are AI’s recommendations independently verified?
- Is there transparency about how decisions are made?
- Are humans actively involved in critical decisions, or are machines calling all the shots?
Demanding accountable AI deployment protects everyone, but especially veterans who deserve robust, thoughtful support systems.
Final Thoughts
DOGE’s AI tool to munch VA contracts represents a cautionary tale in governmental AI use. While automation promises savings and efficiency, it must never come at the expense of care quality or informed decision-making.
Can AI truly decide what’s essential for veteran well-being without human wisdom? Not yet. For now, let’s leave the munching to snacks and keep veterans’ services in trustworthy hands.
What was the main purpose of the AI tool developed by DOGE for Veterans Affairs contracts?
The tool aimed to identify non-essential contracts held by the VA. It labeled selected contracts as “MUNCHABLE” to suggest they should be canceled or reviewed for elimination.
Why did the AI tool produce many errors in analyzing VA contracts?
The tool used outdated AI models and limited contract data, often misreading contract values and details. It lacked context about VA operations, leading to incorrect assessments.
How did the AI tool impact veterans’ services and VA contract cancellations?
The tool flagged over 2,000 contracts for cancellation. Among these were contracts critical to veterans’ care, like gene sequencing maintenance and blood sample analysis, some of which were canceled.
What were the main criticisms experts had about using AI for VA contract reviews?
Experts said off-the-shelf AI models are unreliable for complex decisions like budget cuts. They emphasized the need for human judgment, as AI can provide plausible but incorrect answers.
How did Sahil Lavingia’s experience and time constraints affect the AI tool’s development?
Lavingia had no health or government experience and developed the tool within days under pressure. He admitted mistakes were made and recommended not solely relying on the AI’s conclusions.