The New York Times reported that Sora’s latest mobile app gives anyone a “make movie” button that spits out minute-long scenes matching whatever text prompt, pose reference, or cameo selfie the creator feeds in. That jump in realism is collapsing an old rule of thumb, that video equals proof, and the synthetic clips are turning up everywhere from TikTok trend farms to neighborhood Facebook groups.
Fact-checkers say they now audit footage the same way they scrutinize quotes: step through it frame by frame, look for glitchy physics, and trace shadows to see whether they line up with the weather data NASA’s POWER API reports for that place and date. Local stations in Phoenix and Charlotte said they won’t air “viral” submissions unless a producer can call the person who shot the clip or pull the raw file with intact EXIF data.
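The shadow check could look something like the sketch below, assuming a producer has exported a still frame with its EXIF intact: read the capture date from the frame, then ask NASA’s POWER daily endpoint for cloud cover at the presumed location. The file path, the Phoenix coordinates, and the 70% cloud threshold are illustrative assumptions, and the CLOUD_AMT parameter and response shape should be double-checked against the POWER documentation before relying on them.

```python
# Illustrative verification sketch: EXIF capture date + NASA POWER cloud cover.
# Path, coordinates, and threshold are assumptions, not values from the article.
import requests
from PIL import Image

FRAME_PATH = "raw_frame.jpg"          # hypothetical still exported from the raw file
LATITUDE, LONGITUDE = 33.45, -112.07  # assumed location (downtown Phoenix)

# EXIF tag 306 is "DateTime" ("YYYY:MM:DD HH:MM:SS").
exif = Image.open(FRAME_PATH).getexif()
captured = exif.get(306)
if captured is None:
    raise SystemExit("No EXIF timestamp: treat the clip as unverified.")
capture_date = captured.split(" ")[0].replace(":", "")  # -> "YYYYMMDD"

# NASA POWER daily point endpoint; CLOUD_AMT is daily average cloud amount (%).
resp = requests.get(
    "https://power.larc.nasa.gov/api/temporal/daily/point",
    params={
        "parameters": "CLOUD_AMT",
        "community": "RE",
        "latitude": LATITUDE,
        "longitude": LONGITUDE,
        "start": capture_date,
        "end": capture_date,
        "format": "JSON",
    },
    timeout=30,
)
resp.raise_for_status()
cloud_pct = resp.json()["properties"]["parameter"]["CLOUD_AMT"][capture_date]

# Crisp, high-contrast shadows on a heavily overcast day are a red flag to escalate.
print(f"Cloud amount on {capture_date}: {cloud_pct}%")
if cloud_pct > 70:
    print("Overcast day: sharp shadows in the clip deserve a closer look.")
```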
Teachers are running into the same problem. High-school media labs reported students submitting Sora-made “news packages” as assignments, complete with fake source interviews. Some districts now require students to film on school-issued cameras and hand in SD cards at the end of class so instructors can confirm the footage existed before the edit.
Lawyers and public defenders are also nervous. Judges are demanding provenance logs when prosecutors submit video evidence, and expert witnesses are starting to testify about watermark tampering, compression artifacts, and GPU fingerprints. Courts in Los Angeles and Mumbai are piloting blockchain registries for bodycam uploads so juries can see who touched the footage before trial.
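The article does not describe how those registries are built, but the core idea behind any tamper-evident custody log is simple enough to sketch: each handoff records who touched the file and a SHA-256 link back to the previous entry, so a silent edit breaks every hash that follows. The version below is a plain hash chain rather than a distributed blockchain, and every field name is hypothetical.

```python
# Minimal hash-chained custody log (illustrative; field names are hypothetical).
import hashlib
import json
import time


def file_digest(path: str) -> str:
    """SHA-256 of the footage file itself."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def append_entry(log: list, handler: str, action: str, footage_path: str) -> None:
    """Add a custody entry chained to the previous one."""
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    entry = {
        "handler": handler,               # who touched the footage
        "action": action,                 # e.g. "upload", "export", "redact"
        "timestamp": time.time(),
        "footage_sha256": file_digest(footage_path),
        "prev_entry_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)


def chain_is_intact(log: list) -> bool:
    """Recompute every link; editing any entry invalidates all later hashes."""
    prev_hash = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        if body["prev_entry_hash"] != prev_hash:
            return False
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if expected != entry["entry_hash"]:
            return False
        prev_hash = entry["entry_hash"]
    return True
```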
OpenAI says it’s working on invisible signatures and a provenance API that will let platforms query whether a clip ever passed through Sora’s servers. But until those tools arrive, newsrooms are warning readers to trust workflows, not pixels. If a video shows up with no credit, editors assume it’s entertainment—no matter how cinematic it looks.