If you’re a creator on YouTube, you need to know this: you're now required to disclose when your video was made with AI to look convincingly real.
Which, in all fairness, is a good new feature. It's worth knowing whether something that looks very real is AI-generated or actual footage of real life.
If you're creating animated content, or AI content that obviously looks like it was made with AI tools, you don't have to make the disclosure, because it's already clear it was AI-generated.
The proliferation of AI tools like Adobe Express, Midjourney, Google Gemini, DALL-E 3 and Stable Diffusion has quickly brought us into a present where there are many tools to choose from for generating AI images, and soon, AI video.
See our Spacelab How To Be A Creator guide to get started on being a creator.
“We’re not requiring creators to disclose content that is clearly unrealistic, animated, includes special effects, or has used generative AI for production assistance,” said YouTube in a blog post announcing the feature.
Creators will make these AI acknowledgments in Creator Studio on the YouTube website and its mobile apps, and the labels will show up in the video description or directly on the video player.
Check out the Spacelab guide for Why You Need A Content Strategy.
Creators must then confirm whether their video portrays a real individual doing or saying something they didn't actually do, modifies footage of a real event or place, or fabricates a scene that never happened. For videos about particularly sensitive topics, the label may be placed on the video player itself.