It started with strange lights in the sky over New Jersey. Concerned citizens reported mysterious drones, sparking fears of surveillance, security breaches, and even extraterrestrial visitors. Reports near sensitive military installations only heightened tensions.
In response, the White House National Security Council issued a statement: most of the reported drones were lawfully operated aircraft, hobbyist drones, or even stars misidentified at night. Yet the sightings—and public distrust—continued. The drone phenomenon revealed something deeper about human perception: in moments of ambiguity, we fill the void with stories that blur the line between the real, the imagined, and the misunderstood.
While the New Jersey skies may seem far removed from the world of artificial intelligence, they reflect something profound happening in digital systems: hallucinations—not in people, but in machines.
Collective Illusions, AI Hallucinations, and the Fragility of Intersubjectivity
The New Jersey drone sightings echo a long history of mass hysteria, where groups share beliefs fueled by uncertainty and fear. Much like UFO sightings of the 20th century, these drones became symbols of something larger—surveillance, conspiracy, or the unknown. Ambiguity thrives in crisis, driving skepticism and speculation.
This phenomenon highlights a concept called intersubjectivity—the shared understanding of truth that allows society to function cohesively. Intersubjectivity is fragile. When uncertainty takes hold and trust erodes, we lose that shared reality. Whether it’s distrust of the government’s drone explanations or disagreements about AI outputs, this breakdown leaves room for competing narratives to flourish.
AI systems reveal a similar challenge. When artificial intelligence “hallucinates,” it generates outputs that sound plausible but are fundamentally untrue. Large language models like ChatGPT or Microsoft Bing’s “Sydney” don’t “understand” reality—they predict the most statistically likely next words, papering over uncertainty with fluent text, just as humans paper over it with stories.
For example:
• Microsoft Bing’s Sydney: Early iterations of Bing’s chatbot became infamous for unsettling, fabricated responses—like adopting an alternate persona and confidently delivering false claims.
• Google Bard: In its high-profile launch demo, Bard incorrectly claimed that the James Webb Space Telescope took the very first image of a planet outside our solar system, a confidently delivered error that astronomers flagged almost immediately.
• Fabricated Citations: AI systems routinely invent realistic-looking but non-existent studies and sources, misleading users who take their polished appearance at face value.
Humans and machines alike respond to ambiguity by creating stories. Humans, depending on temperament and experience, tolerate different degrees of ambiguity and fill it with fear, imagination, or shared narratives. AI fills it with dispassionate probabilities, every time. Both expose the same truth: our collective understanding of reality is easily fractured.
Bridging the Gap: A Blueprint for Businesses and Leaders
The breakdown of shared truth has profound implications for leaders, businesses, and society. Companies must act as architects of clarity, building trust in an era where ambiguity thrives.
Here’s how to succeed:
Communicate Clearly During Uncertainty
- Ambiguity fuels distrust. Leaders must address uncertainty with transparency.
- Tip: Share both what you know and what you don’t. Admitting uncertainty while committing to find answers builds credibility.
Strengthen Digital Resilience
- Combat misinformation with fact-checking mechanisms and regular truth audits of your AI tools.
- Tip: Validate AI outputs before they reach your audience and hold communication channels to explicit accuracy standards; a minimal sketch of one such check follows below. Businesses that get ahead of regulatory demands will earn public trust.
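To make a truth audit concrete, here is a minimal sketch of one automated check: scanning an AI-generated answer for DOIs and confirming that each one actually resolves via Crossref’s public REST API (https://api.crossref.org/works/{doi}), which returns HTTP 404 for DOIs it doesn’t recognize. The regular expression and helper names are illustrative simplifications, not a production pipeline.

```python
# Minimal "truth audit" sketch: flag AI-cited DOIs that don't resolve.
# The regex and helper names are illustrative; Crossref's endpoint
# https://api.crossref.org/works/{doi} is public and returns 404
# for DOIs it does not recognize.
import re
import urllib.error
import urllib.request

# Simplified DOI matcher; real DOI syntax is messier than this.
DOI_PATTERN = re.compile(r"10\.\d{4,9}/[A-Za-z0-9._-]+")

def doi_resolves(doi: str) -> bool:
    """Return True if Crossref recognizes this DOI."""
    url = f"https://api.crossref.org/works/{doi}"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False  # 404: likely a fabricated citation
    except urllib.error.URLError:
        return False  # network failure: treat as unverified

def audit_citations(ai_answer: str) -> dict[str, bool]:
    """Map each DOI found in an AI answer to whether it resolves."""
    return {doi: doi_resolves(doi) for doi in DOI_PATTERN.findall(ai_answer)}

if __name__ == "__main__":
    answer = "Profits rose 40%, per Smith et al. (doi:10.1000/fake.study.2024)."
    print(audit_citations(answer))  # {'10.1000/fake.study.2024': False}
```

A real audit would go further, for example comparing titles and authors against the resolved record, since a hallucinated paper can borrow a legitimate DOI.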
Humanize AI Systems
- AI hallucinations stem from gaps in training data and from models’ tendency to produce fluent answers even when they lack grounding. Human oversight keeps outputs reliable.
- Tip: Position AI as an assistive tool, combining its insights with human judgment for validation; a simple escalation pattern is sketched below.
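One common way to operationalize this pairing is a human-in-the-loop gate: outputs the system is not sufficiently confident in are escalated to a reviewer instead of being published automatically. Everything in this sketch, from the Draft type to the CONFIDENCE_FLOOR threshold, is a hypothetical placeholder rather than any particular vendor’s API.

```python
# Human-in-the-loop sketch: low-confidence AI drafts go to a reviewer.
# Draft, CONFIDENCE_FLOOR, and the routing logic are hypothetical
# placeholders, not a specific product's API.
from dataclasses import dataclass

CONFIDENCE_FLOOR = 0.85  # assumed threshold; tune to your risk tolerance

@dataclass
class Draft:
    text: str
    confidence: float  # e.g., from a verifier model or retrieval-overlap score

def publish_or_escalate(draft: Draft) -> str:
    """Auto-publish confident drafts; route the rest to a human reviewer."""
    if draft.confidence >= CONFIDENCE_FLOOR:
        return f"PUBLISHED: {draft.text}"
    return f"QUEUED FOR HUMAN REVIEW: {draft.text}"

if __name__ == "__main__":
    print(publish_or_escalate(Draft("Q3 revenue grew 12%.", 0.93)))
    print(publish_or_escalate(Draft("Doe (2023) proves our claim.", 0.41)))
```

The design choice that matters is the default: when confidence is unknown or low, the system asks a person rather than guessing.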
Align Brand Values with Shared Truth
- Consumers trust brands that align with their values and prioritize ethical AI use.
- Tip: Develop value-driven messaging supported by transparent, verified data to connect authentically with audiences.
Foster Digital and Information Literacy
- Leaders must educate their teams and customers on AI’s strengths and limitations.
- Tip: Launch internal initiatives or external campaigns to improve public awareness of AI biases and ethical use.
Rebuilding Shared Reality in an Age of Ambiguity
The drones in New Jersey remind us of something deeply human: our need to explain what we don’t understand. Whether it’s lights in the sky or AI hallucinations, ambiguity blurs the line between perception and reality.
For all its promise, artificial intelligence mirrors this same tendency. Its hallucinations force us to confront an uncomfortable truth: reality is no longer fixed—it’s negotiated. In this moment, businesses and leaders must rise to the challenge.
This is where intersubjectivity becomes essential. By embracing transparency, ethical innovation, and digital resilience, businesses and leaders can help restore shared truth. Companies that do won’t just survive ambiguity—they’ll lead through it. Success will depend not on the technologies we build, but on the shared realities we create.
Truth, like trust, isn’t something we find—it’s something we build together.