EventOps

Digital Twins for Events: The Real Value Is in Schedules, Not the Renders


Key takeaway: The real digital twin for events is not a 3D venue render. It is a structured model of your coordination data: schedules, dependencies, task flows, and team handoffs connected across the full event lifecycle. Industries like healthcare and aviation have already made this shift. Events are next.


If you search “digital twin” in the context of events, you’ll find companies selling 3D venue walkthroughs. These look impressive, and they have their place.

But after 20 years of producing multi-track, multi-stakeholder events and now building event-coordination software, I’ve come to believe the real twin of an event isn’t something you’ll ever see in a render.

The real opportunity lies in what other logistics-heavy industries are already doing:

GE Healthcare runs process twins across nearly 500 hospitals; instead of modeling the buildings, it models patient flow, staff allocation, and scheduling dependencies. Airports model turnaround workflows to reduce delays.

In these industries, the modeling has shifted from “what does the building look like” to “how does the operation actually run.”

Events haven’t made that shift yet. From what I’ve seen, there’s a significant opportunity here, and most event teams may be sitting on valuable coordination data without realizing its potential.

Keep reading to learn what is missing before your coordination data can serve as your event’s digital coordination twin.

The 3D Illusion

In events, when we hear “digital twin,” we immediately picture 3D renders of spaces and arenas. There are some genuinely impressive solutions on the market for exactly that: if you want 3D walkthroughs or a way to visualize how people flow through your arena, they deliver.

But here is what none of them model: what happens when 200 people need to coordinate across 3 days, 12 venues, and 40 parallel activities.

Think about the last complex event you ran. The hardest part was never “what does the room look like?” It was “who needs to be where, at what time, with what equipment, and who else needs to know if that changes.” It was the schedule shifting at 2 PM on day two and the ripple effect hitting catering, AV, transport, and talent management simultaneously.

A 3D render of the venue captures none of that. It captures the container. The operation that fills the container, the sequencing, the dependencies, the handoffs between teams: all of that lives in spreadsheets, WhatsApp groups, and the head of the one person who holds it all together.

The industries that figured this out decades ago have moved beyond modeling buildings and now focus on modeling the actual processes.

What Other Industries Already Know

GE Healthcare built something interesting. Their Command Center is a digital twin deployed across nearly 500 hospitals and over 55,000 beds globally. But the twin is not a 3D model of the hospital building. It models patient flow, staff allocation, scheduling dependencies, and bed capacity. Children’s Mercy Kansas City uses it to predict patient surges and pre-position staff before demand actually hits.

Airports did the same thing. Aberdeen Airport implemented Assaia’s Apron AI Turnaround Control Solution to improve the management of flight turnarounds, including tasks such as baggage sequencing, gate allocation, and crew handoffs, according to a report from Assaia International AG.

The pattern across these industries is consistent. Every time an industry matures, the definition of “twin” shifts from the physical space to the operation running inside it.

Gartner formalized this with the “Digital Twin of an Organization,” a dynamic software model that uses operational and contextual data to understand how an organization operationalizes its business model. The key phrase there is “operationalizes.” The twin models both human and nonhuman behavior: how people coordinate, how decisions flow, how resources get allocated.

I think events are a textbook candidate for this transition. We run some of the most coordination-heavy operations in any industry. We have tight timelines, many stakeholders, and high consequences for misalignment. But we are still at the “what does the building look like” stage of the digital twin conversation.

The Missing Layer for Events

So why have events not made this jump? I think the answer is in the data.

Events generate enormous amounts of coordination data. Every schedule change, every task assignment, every decision about which speaker goes in which room at which time, every conversation between the production manager and the AV team about a last-minute stage change. That is operational data. And in theory, it is the raw material for a process twin.

As I understand it, most event data is currently unstructured. Think about how much of your schedules, processes, and to-dos sit in spreadsheets and meeting notes, or float through messaging apps. None of that qualifies as structured data.

Digital Thread

There is a concept in manufacturing called the “digital thread,” a connected flow of data that spans the entire product lifecycle. Events have lifecycle stages too: planning, preparation, setup, execution, tear-down, and post-mortem. But there is no thread connecting them. The planning spreadsheet does not feed directly into the live runsheet. The live runsheet does not feed into the post-event debrief. Each cycle starts from near-zero because coordination data was never captured in a structured, retrievable form.

The missing layer is a structured data layer that captures how your team actually coordinates, and makes that data available across the full event lifecycle.
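To make the idea concrete, here is a minimal sketch of what a structured coordination record could look like. All names, fields, and data here are hypothetical illustrations, not a description of any specific product:

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """One scheduled activity, tied to a space, a time, a team, and its dependencies."""
    id: str
    name: str
    space: str                    # where it happens
    start_min: int                # start time, minutes from day start
    duration_min: int
    team: str                     # owning team (ops, production, catering, ...)
    depends_on: list[str] = field(default_factory=list)  # upstream activity ids

# A tiny slice of an event, expressed as structured records instead of
# free-form spreadsheet cells and chat messages:
schedule = [
    Activity("checkin", "Speaker check-in", "Lobby", 480, 30, "ops"),
    Activity("keynote", "Opening keynote", "Main Hall", 540, 60, "production",
             depends_on=["checkin"]),
    Activity("lunch", "Lunch service", "Terrace", 720, 90, "catering",
             depends_on=["keynote"]),
]

# Once the data is structured, simple queries become trivial,
# e.g. grouping every activity by the team that owns it:
by_team: dict[str, list[str]] = {}
for a in schedule:
    by_team.setdefault(a.team, []).append(a.name)
```

The point is not the particular schema; it is that each record carries its time, space, owner, and dependencies explicitly, so the same data can feed planning, the live runsheet, and the post-mortem.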

What This Means in Practice

Imagine your schedule, task completions, decisions, and team conversations were all structured and machine-readable. Connected across planning, execution, and post-mortem.

You could compare execution patterns across editions. You would see patterns that are invisible when each edition starts from a blank spreadsheet or even a copy of last year’s edition.

You could identify root causes instead of symptoms. If the keynote stage is always 15 minutes behind by the afternoon, is it because of the changeover time between sessions, or because the upstream speaker check-in process creates cascading delays? With structured lifecycle data, you could trace the actual dependency chain.
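As a sketch of that dependency tracing, assuming activities record their upstream dependencies as in the schema above, a few lines of Python can flag every downstream activity affected when one item slips. The activity names and data are illustrative:

```python
from collections import deque

# Illustrative dependency data: activity id -> list of upstream activity ids.
depends_on = {
    "keynote": ["speaker_checkin"],
    "panel_1": ["keynote"],
    "lunch": ["keynote"],
    "panel_2": ["panel_1"],
}

def downstream_of(activity: str) -> set[str]:
    """Return every activity that directly or transitively depends on `activity`."""
    # Invert the edges: upstream id -> activities that depend on it.
    dependents: dict[str, list[str]] = {}
    for act, ups in depends_on.items():
        for up in ups:
            dependents.setdefault(up, []).append(act)
    # Breadth-first walk over the inverted edges.
    affected, queue = set(), deque([activity])
    while queue:
        for nxt in dependents.get(queue.popleft(), []):
            if nxt not in affected:
                affected.add(nxt)
                queue.append(nxt)
    return affected

# If speaker check-in slips, everything downstream gets flagged:
ripple = downstream_of("speaker_checkin")
```

With unstructured spreadsheets this ripple lives only in someone’s head; with structured dependencies it is one query.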

You could onboard new team members with full operational context. Imagine how fast someone could get up to speed if they could simply run through the coordination history of the previous event. I’ve even played with the idea of gamifying this moment by letting your data drive a “Dungeons and Dragons”-style game where you are placed in scenarios similar to those the team faced last year and asked what you would do. Let me know if you would like to try this out.

As I understand it, time-to-independent-decision-making is a crucial metric to improve if you want to solve your organization’s “people problem.” More on that in the next article.

The twin that matters for events is a living model of how your team coordinates: schedules, dependencies, task flows, decisions, and communication, structured and connected across the lifecycle.

This is what we are building at MergeLabs. A coordination infrastructure where the schedule is the backbone, where every task, conversation, and decision is tied to a specific activity in a specific space at a specific time, and where that data persists and compounds from one edition to the next.

But regardless of what software you use, I think the shift in thinking is what matters most. The next time someone pitches you a “digital twin” for your event, ask yourself: do you really need 3D renders or a logistics model?

If this framing resonates with how you think about your events, I suspect you will find the next article in this series interesting, too. It explores why institutional knowledge keeps getting lost between editions and why that is an architectural problem, not a people problem.


Let’s connect so we can keep this conversation going.

I think there is a lot to unpack in this space. If you have thoughts, questions, or a good counterargument, drop a comment or send me a DM. I am always up for a digital coffee on this topic.

Subscribe and Repost if you found this valuable.

Ali Taghavi — Co-founder and CEO of MergeLabs

Frequently Asked Questions

What is a digital twin for events?

A digital twin for events is a structured, machine-readable model of how an event actually operates. Unlike 3D venue walkthroughs that model the physical space, a process-level digital twin captures schedules, task dependencies, team handoffs, and coordination data across the full event lifecycle: planning, execution, and post-mortem.

How is a process twin different from a 3D venue model?

A 3D venue model captures the container: what the building looks like, sightlines from specific seats, floor plans. A process twin captures the operation inside the container: who needs to be where, at what time, with what equipment, and how changes ripple across teams. Industries like healthcare and aviation have already made this shift.

Why can’t events build digital twins today?

The raw material exists, but it is scattered. Schedules live in spreadsheets, task assignments in project management tools, decisions in email threads, and real-time coordination in messaging apps. There is no structured data layer connecting these across the event lifecycle, which means each edition starts from near-zero instead of building on the last.


Want to see this in action?

Book a demo and see how MergeLabs handles coordination for events like yours.
