In reply to the discussion: Irish version...
highplainsdem
He did admit - in the part of the video description most people never look at - that he used AI.
You don't see ANY of the video description if you're viewing the video when it's embedded on another site, like this one or Reddit or any of the other websites where YouTube videos appear. All you see is the thumbnail, that image you see in the OP, where you click on the arrow to play it. If the video's title is long, you won't even see all of the title.
YouTubers are supposed to label what's created by AI if it can seem realistic:
https://support.google.com/youtube/answer/14328491
That process, which happens when the video is uploaded, will put an ALTERED OR SYNTHETIC CONTENT label on the thumbnail, on both YouTube and every website where it appears, so anyone seeing it will know right away that it's not real. Both the photorealistic AI artwork and the use of AI for the music and voice would have required that label.
The vast majority of AI users on YouTube apparently completely - and unethically - ignore that requirement. My guess is they do that because so many people don't want to waste time on AI slop.
With one AI video that was posted to DU fairly recently, the label was added AFTER there had been discussion here that it should have had that label.
On YouTube, you'll find the text description below the video. But to see it on a YouTube page, you have to click Watch on YouTube and then look at the text starting two lines below the video title.
In this case, that line says
You have to click "more" to see PART of the full video description.
If the full description is at all long, you'll have to click "more" again to see the rest. And that's what you have to do before you see the admission, buried in that description, that AI was used.
This YouTuber included it - but buried it several clicks away from the video itself - so he can try to claim later that he wasn't trying to deceive people. But he was. He should have labeled the thumbnail. He could also have added "AI" to either the start of the video title, so everyone would always see it, or to the start of the video description on YouTube, where at least people looking at it on YouTube would see it, if only in small print.
But he buried it in the fine print.
People familiar with AI art would have already noticed that the image used for the video looked like AI photorealistic art, which typically has a look I describe as plastic. Various filters can sometimes do that to real photos and video, and in fact YouTube got caught last summer doing that to short videos uploaded by some creators, including producer Rick Beato and a friend of his, musician Rhett Shull:
https://www.democraticunderground.com/100220580766
https://www.theatlantic.com/technology/archive/2025/08/youtube-shorts-ai-upscaling/683946/
...In his video, Shull says he believes that AI upscaling is being used - a process that increases an image's resolution and detail - and is concerned about what it could signal to his audience. "I think it's gonna lead people to think that I am using AI to create my videos. Or that it's been deepfaked. Or that I'm cutting corners somehow," he said. "It will inevitably erode viewers' trust in my content."
-snip-
...YouTube didn't tell me what motivated its experiment, but some people suspect that it has to do with creating a more uniform aesthetic across the platform. As one YouTube commenter wrote: "They're training us, the audience, to get used to the AI look and eventually view it as normal."
IMO that plastic look is creepy. AI peddlers hope to find some way to get rid of it, because despite their claims of being willing to disclose AI use, what they really want is to make their AI slop indistinguishable from real images and video.
And now that you know more than you might've wanted to know about that...
The number of videos posted in a short time is also often a clue. Some YouTube channels using AI might post multiple videos a day, just as AI-using "writers" might upload multiple books a day to a self-publishing platform like Kindle.
The YouTuber in the OP wasn't posting quite that many, and for years he had occasionally posted videos that weren't AI, but on October 12 he started posting AI-generated videos ripping off hits by well-known artists, beginning with Alex Warren's song Carry You Home.
Since then he's posted about 90 AI-generated videos ripping off those artists, all in less than 4 months. Not quite 1 new video a day, but too rapid a pace to be anything but AI.
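Just to sanity-check that pace, here's a rough back-of-the-envelope calculation in Python. The end date is only my guess at what "less than 4 months" after October 12 works out to, and the 90-video count is approximate:

from datetime import date

start = date(2025, 10, 12)   # first AI-generated rip-off, per the timeline above (year assumed)
end = date(2026, 2, 1)       # assumed cutoff, a bit under 4 months later
videos = 90                  # approximate count mentioned above

days = (end - start).days
print(f"{videos} videos in {days} days = about {videos / days:.2f} per day")
# -> about 0.8 videos per day: not quite one a day, but a pace that's hard
#    to hit without AI doing the work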
The videos use different voices, too. With AI slop channels about what's supposedly history, you'll often hear different narrators. Even when an AI user tries hard to generate what are supposedly multiple songs by one (nonexistent) artist, they often can't keep the AI singer's voice from changing.
Some AI users have a thumbnail on the video supposedly showing the singer or narrator, and that imaginary person's appearance typically changes slightly between videos.
And sometimes you get glaringly obvious errors, like a video supposedly about life in the US in the '70s that had an image of people sitting around a table, their arms on the table... but there was a spare arm, not attached to any body, hanging out with them.
