Identity & Authenticity

Pandora's Box Has Been Open(AI)ed

Why We Should be Concerned About Sora 2 Cameos

Even though we've seen this train coming, seeing it in action still gives me pause: the moment something becomes "easy," it also becomes dangerous. Over the weekend I started playing around with Sora 2 from OpenAI, and let's just say this thing is equal parts miracle and nightmare fuel.

The "cameo" feature alone is so good it's unsettling. What used to take teams of visual-effects artists and GPU clusters now happens on a laptop, or worse, on your teenager's iPhone. The tech is undeniably impressive, but look at it with a malicious bent and the lesson is clear: when the technical barrier drops, the attack surface explodes. And Sora 2 just dropped that barrier to the floor.

Friction Makes Smoke, and Where There's Smoke...

What seems like eons ago now, back in March 2025, I wrote a probably-too-long paper about the nature of establishing relationships between humans over digital channels. Digital Identity Verification was an attempt at a framework to categorize how trust is formed between people, and then to use that framework to systematically explain how bad actors manipulate that trust to perpetrate identity fraud.

I still think the paper is accurate, though I missed a critical point. I had positioned my argument around people establishing trust with other people via technology. What I overlooked is that the majority of technical approaches today focus on establishing trust between devices and/or software and, by extension, the people using them. These approaches often assume that the identity of the person becomes inherent to the device.

It is this critical gap — between human and technology — that we are looking to bridge with Vero. Building tools to facilitate real-time peer-to-peer authentication inherently creates friction in the process. But our friction is intentional: it’s a demonstration of trust, a clear signal of intent. And that signal is the smoke. In our case, if you see the smoke, you can be confident there is no fire.

Enhancing Digital Identity Verification

A Strategic Framework Against AI-Driven Identity Fraud

Abstract

The proliferation of AI-generated deepfakes has escalated threats of identity fraud in digital communications. This paper examines existing identity verification methods, introduces a strategic framework employing layered defenses to significantly increase attacker complexity, and proposes integrating cryptographic visual signatures alongside traditional verification methods. By analyzing attacker-defender dynamics using game theory, and referencing contemporary adversarial economics literature, we demonstrate the practical effectiveness of combining multiple verification modalities to deter identity fraud.
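To make the layered-defense argument concrete, here is a small illustrative calculation; the probabilities are hypothetical and are not taken from the paper. If an attacker must defeat each verification layer independently, their end-to-end success probability is the product of the per-layer probabilities, so adding even a moderately effective layer cuts the attacker's odds sharply.

    # Hypothetical per-layer probabilities that an attacker defeats each check;
    # none of these numbers come from the paper.
    bypass_probability = {
        "deepfake fools the human on the call": 0.80,
        "answers a shared-context challenge":   0.30,
        "forges the cryptographic visual sig":  0.05,
    }

    combined = 1.0
    for layer, p in bypass_probability.items():
        combined *= p
        print(f"after '{layer}': attacker success = {combined:.4f}")

    # With these illustrative numbers the end-to-end success rate falls to 1.2%.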

Blinky Light Thing: An Approach for Verifiable Real-Time Video

Zeroth Technology, Inc

  • James Canterbury
  • Patrick Macom
  • Ankur Garg

https://zeroth.technology

August 1st, 2024

Executive Summary

This white paper introduces the Blinky Light Thing (BLT), a novel approach for real-time video authentication in live-streaming environments. The BLT embeds cryptographic messages into the physical environment being recorded, making it difficult to modify the video feed in real time and avoiding reliance on specialized trusted hardware. This approach addresses the growing challenge of distinguishing between authentic and generative content, particularly in live video streams, where it is becoming more commonplace for someone to "wear" the filter of another person and perpetrate fraud in what is typically considered a trusted environment. By utilizing Ethereum Attestation Services, a "do-it-yourself" hardware device, and open-source encoding/decoding software, BLT provides a robust method for verifying both the identity of the presenter and the contemporaneousness (real-time nature) of the content.

The Blinky Light Thing

The objective of the Blinky Light Thing (that's the technical term) is to embed proof that audio and video content is both authentic and contemporaneous with the time at which it is being recorded, by introducing elements into the physical environment that are very difficult to replicate and are captured by the original recording devices (i.e., camera + microphone). The intention of BLT is to supply a low-tech solution that is not dependent on specific hardware/software and is easily (and freely) verifiable.
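As a rough illustration of the encoding idea, the sketch below derives a short, time-bound blink pattern that a device could display in-scene and a verifier could recompute from the recorded video. This is a sketch, not the actual BLT protocol: the shared secret, time window, and pattern length are assumptions, and the Ethereum attestations and open-source decoder mentioned above are not shown.

    import hashlib
    import hmac
    import time

    def blink_pattern(secret: bytes, window_seconds: int = 30, n_bits: int = 32) -> list:
        """Derive n_bits of on/off LED states from an HMAC over the current time window."""
        window = int(time.time()) // window_seconds          # coarse, shared notion of "now"
        digest = hmac.new(secret, str(window).encode(), hashlib.sha256).digest()
        bits = []
        for byte in digest:
            for i in range(8):
                bits.append((byte >> (7 - i)) & 1)
                if len(bits) == n_bits:
                    return bits
        return bits

    def verify_pattern(secret: bytes, observed: list, window_seconds: int = 30) -> bool:
        """Compare the pattern decoded from the recorded video with the expected one."""
        return observed == blink_pattern(secret, window_seconds, len(observed))

    if __name__ == "__main__":
        shared_secret = b"demo-secret"          # placeholder; a real deployment would attest a key
        pattern = blink_pattern(shared_secret)  # drive an LED with this on/off sequence
        print("LED frames:", pattern)
        print("verified:  ", verify_pattern(shared_secret, pattern))

Because the pattern is bound to the current time window, an attacker replaying old footage or synthesizing a scene without the secret cannot produce the expected blink sequence for "now."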

Attested Sensors & Source Data Synthesis Under Zero Knowledge

TL;DR: attested sensors can be used to digitally sign source data at the moment it is collected by an IoT device. A blockchain can be used to notarize that signature and apply a tamper-proof timestamp. To protect sensitive data from leaving the source, a zero-knowledge circuit can be used to produce a synthesis of the raw data and generate a proof that the synthesis was done correctly.
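A minimal sketch of that flow, assuming Python's cryptography package for Ed25519 signatures; the sensor ID and field names are illustrative, and the zero-knowledge synthesis step is only noted in a comment.

    import hashlib
    import json
    import time

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # In practice this key would be provisioned in secure hardware and attested to the device.
    device_key = Ed25519PrivateKey.generate()

    def sign_reading(value: float) -> dict:
        """Sign a sensor reading at the moment of collection."""
        record = {"sensor_id": "demo-sensor-01", "value": value, "ts": int(time.time())}
        payload = json.dumps(record, sort_keys=True).encode()
        signature = device_key.sign(payload)
        return {
            "record": record,
            "signature": signature.hex(),
            # This hash is what a blockchain transaction would notarize,
            # giving the signed reading a tamper-proof timestamp.
            "notarization_hash": hashlib.sha256(payload + signature).hexdigest(),
        }

    signed = sign_reading(21.7)
    print(signed["notarization_hash"])

    # The zero-knowledge step is out of scope here: a ZK circuit would take the raw
    # signed readings as private inputs and output only the synthesized result plus
    # a proof that the synthesis was computed correctly.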