The case for content authenticity in an age of disinformation, deepfakes and NFTs

Did you hear about the award-winning documentary photographer from an esteemed photo agency whose faked images of North Macedonia’s industrial cityscapes tricked even the skilled eyes of the experts at a recent French photojournalism festival?

Despite the random computer-generated bears that Jonas Bendiksen added to his pictures, nobody noticed his photos had been intentionally altered with computer software. That is, until Bendiksen himself pointed out the fraud using a Twitter account he created under a pseudonym. He wanted to see how far he could get “before the guards woke up.”

His point was clear: If fake images can dupe the pros, imagine how hard it is for the rest of us to know what’s authentic.

Our inability to distinguish fantasy from reality in digital images is a wakeup call. In an increasingly fragmented media landscape, we are witnessing extraordinary challenges to trust in media. Now, the same powerful and easy-to-use tools used to make and share legitimate content are also deployed to create and spread disinformation or misinformation.

There could be 100 times more visual content by 2027, according to one study. One expert quoted in Nina Schick’s book Deepfakes: The Coming Infocalypse estimates that synthetic video may account for as much as 90% of online video in just three to five years – meaning it will be generated partially or entirely by artificial intelligence (AI), not humans.

A study this year in the journal Nature about the rise of misinformation online found that people are more focused on sharing what they think will boost their social status than on sharing what is true. Already, deepfake videos like deeptomcruise, created with the help of AI and machine learning algorithms, are incredibly convincing (hint: it’s not actually Tom Cruise). Governments and bad actors already know this and use it to spread misinformation (unintentional deception) or sow disinformation (deceit with intent).

Getting at the truth isn’t the only thing at stake. Content creators often go uncredited and unpaid when their images are repurposed into an endless variety of re-edited viral memes that stray from their original intent or purpose. Understanding the authenticity of content is also a big deal for the authors of the content – the creators, the creative professionals and the communicators.

From photos to deepfake videos to NFTs and other digital file types, true transparency into how a piece of digital content is created and/or changed is critical to deciding whether we can trust the source. In the red-hot market for non-fungible tokens (NFTs), protecting the creators of these original pieces of “cryptoart” should be just as important as protecting the collectors who buy them.

What’s real, what’s fake, what’s valuable, who created it and with which editing tools – as well as what changes were made to an original piece – are the questions we should all be asking. These questions arise as many of us wrestle with new terms to help us understand what is real and what is fake. Deepfakes, cheap/shallowfakes, NFTs, blockchains, synthetic media, content authenticity and provenance – knowing these buzzwords in the context of our increasingly digital lives is imperative.

(Need help? Scroll for our content authenticity primer.)