AI detection is failing creators. Platforms like Instagram and YouTube routinely skip labeling obvious AI-generated content, leaving human artists, writers, and photographers with no way to prove their work is human-made.
The proposed fix flips the model: instead of flagging AI output, certify human output. Think of a Fair Trade-style logo for human-made text, images, audio, and video. The machines have no incentive to label themselves; the humans being displaced do.
The Verge piece goes further than the logo concept, examining who would govern such a standard, whether it could be gamed, and which existing efforts are already moving in this direction. That process argument is what makes this worth reading in full.
[READ ORIGINAL →]