These AI SEO spam operations have used lists of common searches to make sure their pages come up first in the "long fat tail" of search: the kind of search where it used to be about 50/50 whether you'd find a page addressing your needs. But it used to be that *if* you found something like "The top 15 smallest ants in the world" it wouldn't be nonsense. It'd either exist and be the work of another person who cared OR you'd find nothing. Not so now! I can't possibly over-stress how bad this is! 1/
The funny results, like the ones telling you to cook with glue, hide the fact that some portion of these attempts at impersonating information are not easy to detect. For every obviously bad result there are others that go unnoticed because they were plausible enough to pass.
And those flawed results are being regurgitated and reprocessed by further AIs, spreading the rot and half-truths deeper and deeper into the body of human knowledge. Like scratching an infected wound.
2/2
Once you could count on some things posted online probably being true because, well, why would anyone bother to put out misinformation about a topic so obscure or uncontroversial? Now the simple fact that someone might want to know a bit of information makes it worth faking, if it can get their eyeballs on an ad or improve the search ranking for some company. The harmless act of *being curious* about the world causes misinformation to spring to life. We have made wanting to learn destructive.
Some day soon a child will ask "what is the smallest ant in the world?" and discover that, unless they want to become an expert, they simply can't know.
This is the death of polymaths, a hurdle for interdisciplinary learning, and a return to a kind of human gatekeeping for real information: you'd best ask someone qualified if you are not expert enough to tell on your own. (This was already true for contentious topics, but now it will be true for everything.)
@futurebird this is absolutely something I'm feeling, for myself and my very bright and curious 7yo
@seawall It's good to teach young people to pay attention to sources and to question information presented as factual, but it's also a hurdle. "what was the first shark?"
“do birds eat meat?”
“what is the smartest insect?”
These used to be the kind of questions anyone could innocently explore and stand a chance of finding their way to better sources and better questions… now I have to give all of these extra warnings: "some pages that look like they are about science are just traps."
Here's something I have noticed: front-page search results all look like they were created specifically for your search question. Like they were written by middle schoolers who took a test question and turned it into the first sentence/topic of their low-Lexile answer.
You no longer land on a resource written last year or 10 years ago.
And the ads! Oh, the ads! Every four-sentence paragraph begins and ends with an ad and, most likely, a floating video box.
@MyWoolyMastadon @futurebird @seawall
I feel like at some point I need to do a longer post about pitfalls like this, but IMO searching for "questions" has always been bad for reasons besides SEO spam.
To put it bluntly, people who know their stuff almost never present information in that manner, so whenever you "ask" a search engine a question you're almost always leaning on whatever semantic tricks that search engine uses to dissect the question.
@jeruyyap @MyWoolyMastadon @seawall
If you write nice keyword searches, most engines aggressively try to convert your words into a "natural language question" anyway.
And don't get me started on the way operators like "AND", "NOT", and "OR" are ignored.
But really, there should be nothing wrong with searching for a question: that's the kind of parsing it's reasonable to expect to work, but right now it probably makes results worse.
@futurebird I think focusing on authoritative individuals has been, and is going to be, what matters. If I wanted to figure out the smallest ant at this point, I'd go to Tom Scott to see if he has anything. He also talked about "why would anyone make up something harmless and random" in my favorite talk of his, "There's No Algorithm for Truth." I feel like that talk is more relevant than ever, though his conclusion is becoming less and less viable (it's becoming really hard to judge things).
@norbipeti @futurebird it’s as if ranking content by the relative authority of others who endorse it by linking might work. We could call this PageRank and write a whitepaper about how it only works if the algorithm is deployed by a company that resists the allure of advertising money.
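For anyone who hasn't seen the idea spelled out: the joke above refers to ranking a page by how much "authority" flows into it from the pages that link to it. Here is a minimal toy sketch of that link-endorsement scoring in Python, using simple power iteration. The graph, page names, damping factor, and iteration count are all made up for illustration; this is not the production algorithm any search engine actually runs.

```python
# Toy link-endorsement ranking (PageRank-style power iteration).
# A page's score is the sum of the scores of pages linking to it,
# with each linker splitting its score across its outbound links.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph: the "myrmecology-journal" page is linked by
# two other pages, so it ends up with the highest score.
graph = {
    "seo-spam-blog": ["ad-network"],
    "ad-network": ["myrmecology-journal"],
    "hobbyist-ant-page": ["myrmecology-journal"],
    "myrmecology-journal": ["hobbyist-ant-page"],
}
print(pagerank(graph))
```

The snark in the post above is the real point: the scoring itself is simple, and it only helps if whoever runs it doesn't let advertising incentives override the link signal.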