Friday, June 06, 2025

GIGO

When an AI script written by a Department of Government Efficiency employee came across a contract for internet service, it flagged the contract as cancelable. Not because it was waste, fraud, or abuse (the Department of Veterans Affairs needs internet connectivity, after all) but because the model was given unclear and conflicting instructions.

Sahil Lavingia, who wrote the code, told it to cancel, or in his words “munch,” anything that wasn’t “directly supporting patient care.” Unfortunately, neither Lavingia nor the model had the knowledge required to make such determinations.
Because AI doesn’t think, and apparently neither do AI programmers.

The program was “told” to read only the first 10,000 words of each contract and to decide, from that alone, whether the contract was “cancelable.” But neither the programmer nor the program had any clue what it was doing.
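To see why that matters, here is a minimal sketch of the truncate-and-classify pattern being described, in Python. Everything in it is a hypothetical stand-in: the ask_model stub, the prompt wording, and the names are invented for illustration and are not taken from the actual DOGE script. The one faithful detail is the cutoff: text past the word limit simply does not exist as far as the model is concerned.

# Hypothetical sketch of the truncate-and-classify pattern described above.
# The model call is a stub; prompt wording and names are invented for
# illustration and are not from the actual DOGE script.

TRUNCATE_WORDS = 10_000  # the script reportedly read only this many words

def truncate(text: str, limit: int = TRUNCATE_WORDS) -> str:
    """Keep the first `limit` words; everything after is invisible to the model."""
    return " ".join(text.split()[:limit])

def ask_model(prompt: str) -> str:
    """Stand-in for a real LLM API call; a real version would return the
    model's completion. Given vague criteria, a model answers anyway."""
    return "MUNCHABLE"

def is_munchable(contract_text: str) -> bool:
    prompt = (
        "Label this contract MUNCHABLE (cancelable) or NOT MUNCHABLE. "
        "Cancel anything not directly supporting patient care.\n\n"
        + truncate(contract_text)
    )
    return ask_model(prompt).strip().upper() == "MUNCHABLE"

# An internet-service contract: a clause on page 50 may explain that it keeps
# hospital systems online, but the model never sees past the cutoff.
print(is_munchable("Agreement for internet connectivity services ..."))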
Cary Coglianese, a University of Pennsylvania professor who studies the governmental use of artificial intelligence, said that knowing which jobs could be done in-house “calls for a very sophisticated understanding of medical care, of institutional management, of availability of human resources” that the model does not have.
- Contracts related to "diversity, equity, and inclusion" (DEI) initiatives - MUNCHABLE
The prompt above tries to implement a fundamental policy of the Trump administration: killing all DEI programs. But the prompt fails to include a definition of what DEI is, leaving the model to decide.
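For illustration, here is how such a rule might sit in a hypothetical prompt. The wording below is invented, not quoted from the actual script; the point is what is absent from it.

# Hypothetical prompt rules, invented for illustration. Note what is missing:
# the prompt never says what counts as "DEI" or "directly supporting patient
# care," so whatever the model's training data associates with those phrases
# becomes, in effect, the policy.
MUNCHING_RULES = """
Flag a contract as MUNCHABLE if any of the following apply:
- It does not directly support patient care.
- It relates to "diversity, equity, and inclusion" (DEI) initiatives.
"""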

Despite the instruction to cancel DEI-related contracts, very few were flagged for this reason. Procurement experts noted that this kind of information is very unlikely to appear in the first few pages of a contract, which is all the script ever read.
AI was used so that all VA contracts could be reviewed within 30 days. The programmer insists his program was only meant to flag contracts, which would then be reviewed by VA employees. That makes the process terribly inefficient, since every contract “munched” by the AI has to be reviewed by a human (or several) who have actual VA work to do. (The premise that the VA is full of invalid contracts is the problem; the notion that they can all be reviewed and eliminated in 30 days is techno-magical thinking.)

This was all a complete joke. Which is impossible because, you know, Elmo is a “genius.” Like “leaving the model to decide”? What the fuck does AI “know”? It is, at best, a “black mirror” that reflects what the user wants to see. A human then takes that as “knowledge.”
There’s a sentiment I can get behind there, but it’s just sugar glass. It shatters at a touch. It’s sentences strung together by a program, not by thought. It may look like thought to programmers, but that’s because with their hammer the whole world looks like a nail. Or, more likely, programmers know AI is a tool (and not much of one, based on the VA/DOGE example) that needs human intervention, the way a power drill needs a hand. It’s the public, or the Trump Administration, that makes the mistake of believing AI is “thinking.”

The AI program written to review VA contracts was wildly bad. Its instructions would have misled a team of lawyers. The programmer makes excuses, but the fact is he didn’t have a clue what he was doing, and AI was supposed to be “intelligent” enough to know for him. That’s the magical thinking at the heart of the Administration’s efforts: that AI “thinking” would be pure and rational and would, hey presto, validate Vought and Miller & Co. The problem, as ever, was the users. It still is.
The VA is standing behind its use of AI to examine contracts, calling it “a commonsense precedent.” And documents obtained by ProPublica suggest the VA is looking at additional ways AI can be deployed. A March email from a top VA official to DOGE stated:
Today, VA receives over 2 million disability claims per year, and the average time for a decision is 130 days. We believe that key technical improvements (including AI and other automation), combined with Veteran-first process/culture changes pushed from our Secretary’s office could dramatically improve this. A small existing pilot in this space has resulted in 3% of recent claims being processed in less than 30 days. Our mission is to figure out how to grow from 3% to 30% and then upwards such that only the most complex claims take more than a few days.
Once again we have met the enemy, and he is us. 
