The backlash was immediate, but it didn’t stop the BBC from using text generated by LLMs—and purportedly checked and copy-edited by a human before approval—in two marketing emails and mobile push notifications to advertise Doctor Who. Now, though, the corporation will stop the experimentation entirely after a wave of official complaints pushed it to respond to concerned audiences.
The best way to handle LLMs is to treat them like an intern. They’re useful and can get a lot of work done, but you need to double check their work.
AI just helps people with their heads up their asses make bigger mistakes faster
https://www.reddit.com/r/comics/comments/d1sm26/behold_the_ultimate_life_form/
I know, reddit. I couldn’t find a better source for this image for some reason.
Well said.
BBC from using text generated by LLMs—and purportedly checked and copy-edited by a human before approval—in two marketing emails and mobile push notifications to advertise Doctor Who.
If they’re telling the truth then I don’t really get what’s wrong about that particular use
They weren’t just being cheap, they wanted to take human-written ads, auto-generate a million variations to send to individual people, then feed the ones that got clicks back in to train an AI clickbait generator. It also means the variations would be functionally watermarked so if anybody posted part of their text on Reddit or something the BBC could track who they sent that variation to.
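To make the “functionally watermarked” point concrete, here’s a minimal sketch of the idea: if every recipient gets a slightly different wording, any leaked snippet can be matched back to its variant, and therefore to the person it was sent to. All names and data below are hypothetical, not anything the BBC has confirmed.

```python
# Hypothetical illustration of variant-as-watermark tracking.
# Each unique wording variant is logged against the subscriber it went to.

variants = {
    "v001": "Doctor Who returns this weekend. Don't miss the new Doctor!",
    "v002": "The new Doctor arrives this weekend. Don't miss Doctor Who!",
}

# Which variant was sent to which subscriber (illustrative addresses).
sent_log = {
    "v001": "subscriber_1138@example.com",
    "v002": "subscriber_2187@example.com",
}

def trace_leak(leaked_snippet):
    """Return the recipient whose variant contains the leaked text, if any."""
    for variant_id, text in variants.items():
        if leaked_snippet in text:
            return sent_log.get(variant_id)
    return None

print(trace_leak("The new Doctor arrives this weekend"))
# -> subscriber_2187@example.com
```

In other words, no visible watermark is needed; the wording itself is the identifier.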
Catgacating! Cartchy tuns!
A pasadise of sweet teats!
Given how bad the show’s writing has been for years and the decline in viewership during the Chibnall era, I’m actually surprised the BBC reversed course for once.
it’s crazy to me how bad the writing got. I just did a full rewatch to prep for Ncuti and it falls off a fucking cliff
I think that might make for interesting TV… one or more of the characters have dialogue and action written by AI, and they have to deduce which are machines.
How the mighty have fallen.
That sums up the current AI situation pretty well. It’s especially sad because so many (former) flagships of creativity like Wacom, LEGO, Disney or WotC are being caught using it, effectively burning down what was left of their legacy.
Ironically, I thought AI would be used by smaller teams or even single users like me to brainstorm or get new ideas, but it is being shamelessly used by corporations that could afford to pay a full team of artists and still make plenty of money, while independent artists and creators largely refuse to use it to that extent.
Without meaning to be rude, you’ve not been paying attention. Throughout the history of capitalism, rich arseholes and then corporations have done as much as they can to avoid paying people for labour. It shouldn’t be surprising that this latest tool is used the same way
1960s:
these new “computers” will make everything so efficient, everyone will be able to cut their work day in half without any negative impacts!
Today:
Oh yeah man, I guess you haven’t heard but companies behind AI are investing like, billions of dollars into AI. They’re not doing that so the little guys get some novel use out of it.
There is no mention of the quality of the marketing.
If it was made by an LLM, checked, and at the quality of a human, then what’s the issue?
Backlash over marketing…emails and texts? Y’all are trolling me, right?
People are so dumb they make me actually side with a fucking company. Luddites used to be fun to make fun of until people started actually listening to them.
The luddites were a labor movement. They fought for the rights of skilled workers to make a living.
Somehow you have fallen for the myth that machines make art.
Nah, they were lunatics who thought machines would replace them. Guess what: machines are everywhere and people still have jobs.
Large language models and computer tools in general are different to traditional machines, to be fair. For every spinning jenny that you make, you need people to make it, people to service it, and people to operate it. These people all have jobs now. For every piece of software you make, you only need the one team that originally develops it. From then on, it can be replicated endlessly with no extra human input. You also only need one set of people to “service” it (bug fixes, updates etc.) for the entire world, rather than one per factory or workplace.
(Also, I disagree with your premise and your assumption that jobs = good, but you probably don’t want to hear about it and I don’t want to type an essay :) )
I can actually agree that jobs = good is not a good metric, it’s just what luddites thought and I didn’t want to be inaccurate.
The rest of your comment I disagree with. I work in software development, and it’s simply not true: AI transforms the work people do, it doesn’t replace it. Software in general doesn’t replace work, only transforms it.