Is it a conspiracy? For months, YouTubers have been quietly griping that something looked off in their recent video uploads. Following a deeper analysis by a popular music channel, Google has now confirmed that it has been testing a feature that uses AI to artificially enhance videos. The company claims this is part of its effort to “provide the best video quality,” but it’s odd that it began doing so without notifying creators or offering any way to opt out of the experiment.
It actually makes a big difference. “AI” is an almost meaningless term unless you specify what kind of AI it is. ChatGPT is AI, Sora is AI, the “magic eraser” in your photos app is AI, and the old AOL chatbot “SmarterChild” was AI too. Right now, “AI” can mean almost anything even remotely adjacent to “machine learning.” Simply calling a tool “AI” says literally nothing about what it is or what it does. This sort of reductive, dismissive attitude in tech articles toward anything the author doesn’t understand has been getting really worrying lately.
Real “AI” doesn’t exist anyway. We may as well call it Algorithmic Idiocy.
Like when all the music “radio stations” effectively changed nothing but rebranded their algorithms as “AI” to ride the hype.