All posts tagged: gibberish

SEO Guy Mocks Google for Deindexing His “Gibberish” AI Sites

Earlier this month, Google — the web’s monopolistic landlord slash organizer slash feudal ruler — announced a major spam policy shakeup. Given that algorithmic search updates are often very boring, they don’t always make big media waves. But these updates, which largely read as a response to the rise of mass-produced AI-generated drivel proliferating across the internet, have been a big deal. And the spammier side of the SEO industry is feeling the squeeze. “The manual action I got — they hit me with a ‘pure spam,’ and they also used wordings such as ‘automatically-generated gibberish,’ which is pretty intense,” Jacky Chou, a well-followed search engine optimization (SEO) guy, laments in a YouTube video about the updates titled “I GOT CLAPPED (Google March Spam Update).” “Um,” he continues, “I’ll wear that proudly, I think.” In a follow-up video, he mocks people who follow Google’s guidelines. “Alright guys, in light of recent events, I’ve decided to do everything the right way. I’m no longer going to spam the internet with AI content because it’s not profitable …

The Willy Wonka Event’s Lead Actor Speaks Out: ‘It Was Just Gibberish’

This is where the Unknown is revealed, so we have tension as well. I arrived on the day. I was like, “So what’s going on with this tunnel?” What they’ve done instead of this incredible, magical starlit tunnel is basically just stapled up some chequered flags into a corridor and put some dirty mirrors that I think they must have found in, like, the toilet or something, just along the corridor. This was meant to be the Twilight Tunnel. Yes, I saw the illustrations on the website. You’ve probably seen the videos online. That’s where the infamous Unknown character appears from behind a mirror and starts unnecessarily scaring the kids. There’s no need! There is no need to scare the kids that much. They were scared enough. Then we went through something called the Imagination Lab, which I think the point was that you’re supposed to imagine it was something better. That was where I was to hand out one jelly bean. Where does the lemonade come in? The next room was the Lemonade Room, …

Why ChatGPT answered queries in gibberish on Tuesday

ChatGPT goes colorfully crazy. Screenshot by Steven Vaughan-Nichols/ZDNET

We all know that OpenAI’s ChatGPT can make mistakes. They’re called hallucinations, although I prefer to call them lies or blunders. But in a peculiar turn of events this Tuesday, ChatGPT began to really lose it. Users started to report bizarre and erratic responses from everyone’s favorite AI assistant.

Also: 8 ways to reduce ChatGPT hallucinations

As one person on Twitter put it, “ChatGPT is going full-on Finnegans Wake!” Or, as another less literary person tweeted, “ChatGPT is apparently going off the rails.” Where was ChatGPT going? Well, to cite one example from Reddit/ChatGPT:

Me: Compare and contrast the Gregory Zulu 55 Backpack and Gregory Tetrad 60L Travel Pack.

ChatGPT4: In a significant area of customer support, ergonomics, and subjective nature of ‘goods,’ each bag conjures a fable of a time and place. Gregory’s Zulu 55 is a guild-like lilt to the green-gilded high book way of state and yearn, while the Tetrad 60L conceives more of a ‘pathless traveled’ countenance with gnarls and swathes to the …

ChatGPT meltdown: Users puzzled by bizarre gibberish bug

ChatGPT hallucinates. We all know this already. But on Tuesday it seemed like someone slipped on a banana peel at OpenAI headquarters and switched on a fun new experimental chatbot called the Synonym Scrambler.

Actually, ChatGPT was freaking out in many ways yesterday, but one recurring theme was that it would be prompted with a normal question — typically something involving the tech business or the user’s job — and respond with something flowery to the point of unintelligibility. For instance, according to an X post by architect Sean McGuire, the chatbot advised him at one point to ensure that “sesquipedalian safes are cross-keyed and the consul’s cry from the crow’s nest is met by beatine and wary hares a’twist and at winch in the willow.”

These are words, but ChatGPT seems to have been writing in an extreme version of that style where a ninth grader abuses their thesaurus privileges. “Beatine” is a particularly telling example. I checked the full Oxford English Dictionary and it’s …