by Rafiki 12/17/2025, 9:00am PST
I tried using Copilot to find information about a very specific programming problem after spending hours and hours exhausting every result Google would give me. The AI didn't turn up any new information, but unsurprisingly it did eventually start fabricating ("hallucinating" is too kind) answers: "import this library," or "use this class and call this method," when neither ever existed. When I called it out, it explained that its mistakes came from blog posts and user comments on websites that incorrectly cited that imaginary information.

In other words, Microsoft has designed its system to blame all of us for its mistakes. It's not the Copilot devs' shitty decision to have their software make up garbage instead of returning no results or saying "I don't know," it's all of us posting incorrect information and poisoning their data models. If people start revolting and torching data centers, anyone who calls them terrorists is your enemy.