Opinion: When unregulated AI re-creates the past, we can’t trust that the ‘historical’ is real

A furious political leader shouting a message of hate to an adoring audience. A child crying over the massacre of her family. Emaciated men in prison uniforms, starved to the edge of death because of their identities. As you read each sentence, specific imagery likely appears in your mind, seared in your memory and our collective consciousness through documentaries and textbooks, news media and museum visits.

We understand the significance of important historical images like these — images that we must learn from in order to move forward — in large part because they captured something true about the world when we weren’t around to see it with our own eyes.

As archival producers for documentary films and co-directors of the Archival Producers Alliance, we are deeply concerned about what could happen when we can no longer trust that such images reflect reality. And we’re not the only ones: In advance of this year’s Oscars, Variety reported that the Motion Picture Academy is considering requiring contenders to disclose the use of generative AI.

While such disclosure may be important for feature films, it is clearly crucial for documentaries. In the spring of 2023, we began to see synthetic images and audio used in the historical documentaries we were working on. With no standards in place for transparency, we fear this commingling of real and unreal could compromise the nonfiction genre and the indispensable role it plays in our shared history.

In February 2024, OpenAI previewed its new text-to-video platform, Sora, with a clip called “Historical footage of California during the Gold Rush.” The video was convincing: A flowing stream filled with the promise of riches. A blue sky and rolling hills. A thriving town. Men on horseback. It looked like a western where the good guy wins and rides off into the sunset. It looked authentic, but it was fake.

OpenAI presented “Historical Footage of California During the Gold Rush” to demonstrate how Sora, officially released in December 2024, creates videos based on user prompts using AI that “understands and simulates reality.” But that clip is not reality. It is a haphazard blend of imagery both real and imagined by Hollywood, along with the industry’s and archives’ historical biases. Sora, like other generative AI programs such as Runway and Luma Dream Machine, scrapes content from the internet and other digital material. As a result, these platforms are simply recycling the limitations of online media, and no doubt amplifying the biases. Yet watching it, we understand how an audience might be fooled. Cinema is powerful that way.

Some in the film world have met the arrival of generative AI tools with open arms. We and others see something deeply troubling on the horizon. If our faith in the veracity of visuals is shattered, powerful and important films could lose their claim on the truth, even if they don’t use AI-generated material.

Transparency, something akin to the food labeling that informs consumers about what goes into the things they eat, could be a small step forward. But no regulation of AI disclosure appears to be over the next hill, coming to rescue us.

Generative AI companies promise a world where anyone can create audio-visual material. This is deeply concerning when it’s applied to representations of history. The proliferation of synthetic images makes the job of documentarians and researchers — safeguarding the integrity of primary source material, digging through archives, presenting history accurately — even more urgent. It’s human work that cannot be replicated or replaced. One only needs to look to this year’s Oscar-nominated documentary “Sugarcane” to see the power of careful research, accurate archival imagery and well-reported personal narrative to expose hidden histories, in this case about the abuse of First Nations children in Canadian residential schools.

The speed with which new AI models are being released and new content is being produced makes the technology impossible to ignore. While it can be fun to use these tools to imagine and test, what results is not a true work of documentation — of humans bearing witness. It’s only a remix.

In response, we need robust AI media literacy for our industry and the general public. At the Archival Producers Alliance, we’ve published a set of guidelines — endorsed by more than 50 industry organizations — for the responsible use of generative AI in documentary film, practices that our colleagues are beginning to integrate into their work. We’ve also put out a call for case studies of AI use in documentary film. Our aim is to help the film industry ensure that documentaries will deserve that title and that the collective memory they inform will be protected.

We are not living in a classic western; no one is coming to save us from the threat of unregulated generative AI. We must work individually and together to preserve the integrity and diverse perspectives of our real history. Accurate visual records not only document what happened in the past, they help us understand it, learn its details and — maybe most importantly in this historical moment — believe it.

When we can no longer accurately witness the highs and lows of what came before, the future we share may turn out to be little more than a haphazard remix, too.

Rachel Antell, Stephanie Jenkins and Jennifer Petrucelli are co-directors of the Archival Producers Alliance.
