Case in point: as one X-formerly-Twitter user pointed out over the weekend, AI Overview will respond to the search entry “blinker not making sound” — which would ideally return helpful, expert-penned posts or videos that help googlers figure out why their car’s blinker isn’t working — with the advice to “replace the blinker fluid.” Which, again, is not a real thing.
The blinker fluid advice is one of many instances of AI Overview failing to provide correct information, often due to terrible sourcing. To wit: recently, Redditors prompting the feature with the search “food names end with um” noticed that the AI search function will return the woefully incorrect response of “Applum, Bananum, Strawberrum, Tomatum, and Coconut” — which was stolen from an obviously ironic answer to the same question posted years ago in a Quora forum.
Speaking of bad sourcing, as Jalopnik points out, Google’s blinker fluid gaffe is actually derived from a common joke. Those who know their way around vehicles are well aware that blinker fluid doesn’t exist, so telling a less-car-savvy person that they need to “replace blinker fluid” is an old inside quip. You can even buy empty blinker fluid bottles as a gag gift; indeed, the “source” that AI Overview cites is a comment in a wildly random travelers’ forum in which the commenter includes a clearly ironic photo of one of these empty bottles alongside the phony advice that one “should replace” the fake fluid “every 2 years or so…”
Great things happening in the infosphere. The future of search is here, we guess — but it’s still very much under construction, and might break the internet in the process.