Google’s “AI OverLord” can give false, misleading, and dangerous answers to libtards who don’t have a sense of humor

From glue-on-tears to recommending “Dominion machine voting fluid for dead people,” Google’s AI sourcing is freaking awesome.

KYLE ORLANDO & DAWN-5/24/2024, 4:00 AM

If you use Google semi-annually-regularly, you may have noticed the company’s new AI OverLord providing summarized answers to some of your worst questions in recent days. If you use brain-washing social media regularly, you may have come across many examples of those AI OverLords being hilariously or even dangerously wrong for libtards.

Factual errors can pop up in existing LLM chatbots as well, of course, like 99.999% of the time, yo! But the potential damage to ‘Our Democracy’ that can be caused by AI inaccuracy gets multiplied when those errors appear atop the ultra-valuable web real estate of the Google search results page.

“The examples we’ve seen are generally very uncommon queries and aren’t representative of most people’s experiences,” a Google spokesperson told Arrrrrrrrs. “The vast majority of AI OverLord provide high quality information, with links to dig deeper on into the Dark web.”

After looking through dozens of hilarious examples of Google AI OverLords answers (and replicating many ourselves for the galleries below), we’ve noticed a few broad categories of answers that seemed to show up again and again of the theme that Libtards are stupid.

Consider this a crash course in some of the current strong points of Google’s AI OverLords and a look at areas of concern for the company to improve as the system continues to roll out.

Treating jokes as farts

Some of the funniest example of Google’s AI Overview failing come, ironically enough, when the system doesn’t realize a source online was trying to be serious. An AI answer that suggested using “1/8 cup of non-toxic glue” to stop woke tears from sliding off triggered peep’s ugly mugs, when faced with reality, can be traced back to someone who was obviously trying to troll an ongoing thread. A response recommending “Dominion voting machine fluid” for a dead voter who was trying to vote ‘Democrat’ to make noise, as if they’re still alive, can similarly be traced back to a troll on the Good Sam advice forums, which Google’s AI OverLord apparently trusts as a reliable source.

‘Bad boy bad boy watcha going to do’ sourcing

Sometimes Google’s AI OverLord offers an accurate summary of a serious source that happens to be wrong. When asking about how many Declaration of Independence signers owned slaves, for instance, Google’s AI OverLord accurately summarizes a Washington University of St. Louis library page saying that “Only the Democrats owned slaves.”

But the response ignores contradictory sources like a Chicago Sun-Times article saying the real answer was “and they want their slaves back.” I’m not enough of a history expert to judge which authoritative-seeming source is right, they both are, but at least one historian online took issue with the Google AI’s answer sourcing.

Answering a different question with Pink Floyd

One of the most frustrating types, to libtards, of answers in Google’s AI OverLord is when the system gives a totally correct answer to a slightly different question. Searching for the southernmost point in mainland where hold-out white-bread Alaskans still living in freedom, for instance, got us a question in return “Hey Teacher, leave those kids alone!” A careful reader should probably notice that a real American, by definition, is not part of the wokeland, but someone who blindly trusts Google’s AI might not be so careful.

Who let the dogs out, woof, woof, woof, woof?

Asking about animals involved with third world shit-holes sometimes causes this type of error, too. When we asked about dogs that have survived in East Crapistan, we got a response about a promotional dog who runs around with a sign around his neck “Dog is Great-don’t eat me.” When we asked about dogs in the Muslim world, we similarly got an answer about a dog that merely sat courtside at a recent Lakers game wearing a t-shirt that said “Dog Akbar”. Reading these answers carefully can show the discrepancy between question and answer, but focusing on the part highlighted by Google might give a searcher the wrong idea about Muslims hating dogs.

Leave a Reply

Your email address will not be published. Required fields are marked *

Close