How to Evaluate Airport Data Platforms
Every airport technology vendor will tell you their platform is different. Better. Smarter. More integrated.
I've been on the buying side of that conversation for 15 years. And here's what I've learned: most of them are saying the same thing in slightly different fonts.
The real differences only show up when you know the right questions to ask. Not the questions vendors want you to ask — the ones that actually protect your ops team from inheriting a system that looked great in the demo and fell apart in the terminal.
So I built a buyer's guide from the operations side: six questions. Use them to cut through the slide decks and figure out what's real.
Question 1: Does It Connect to What I Already Have — or Replace It?
This is the first and most important question.
Most airports run 5-15 disconnected systems: an AODB (airport operational database), RMS (resource management), FIDS (flight information displays), workforce management, baggage handling, energy management, and more. You've invested years and millions in these systems. They work. The problem isn't the systems — it's that they don't talk to each other.
What to ask the vendor: "Do I need to replace any existing systems to use your platform?"
If the answer involves ripping out current technology, migrating databases, or replacing workflows your team already knows — pause. That's a multi-year, multi-million-dollar commitment with massive change management risk.
What to look for: A platform that sits as a layer above your existing systems. Read-only connections that pull data without writing back to source systems. Integration via open APIs, ODBC/JDBC connections, data lakes, or direct database connections. Zero disruption to the tools your team already uses.
The red flag: Any vendor who can't demonstrate integration with your specific legacy systems during the evaluation. If they need 6 months of custom development before they can show you a working connection, that's not a product — it's a project.
Question 2: How Long Until I See a Working Dashboard?
Airport procurement cycles are long enough already. The last thing you need is a platform that takes 18 months to implement before you see any value.
What to ask: "From contract signature, how many days until my team is looking at live data in a real dashboard?"
What to look for: A phased deployment model. First dashboards within 30 days. Core functionality within 60. Full operational deployment within 90. Any vendor confident in their platform should be able to commit to a concrete timeline with milestones.
At SFO's International Terminal, the first live dashboards were operational within 30 days of kickoff. Full deployment — 3+ legacy systems connected, automated reporting, real-time visibility — was complete in 90 days with zero downtime.
The red flag: Vague timelines like "it depends on your environment" without concrete benchmarks. Every environment is different, sure. But a vendor who has done this before should be able to give you a range, not a shrug.
Question 3: What Happens to My Data?
This one matters more than most airports realize during evaluation — and then becomes a crisis later.
What to ask: "Where does my data go? Who owns it? What access controls exist? How do you handle SSI?"
What to look for: Clear data ownership language — your airport's data remains your property. Role-based access controls (RBAC) so different users see only what they're authorized to see. SSI (Sensitive Security Information) compliance for any security-related data per TSA regulations. Logical data isolation between airport customers if the vendor serves multiple airports. And a clear decommissioning plan — when the contract ends, your data is deleted and access is fully revoked.
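The RBAC part of that list is easy to sanity-check in a demo. Here is a deliberately simplified sketch of the concept — the roles, field names, and the SSI-like field are all invented for illustration, not taken from any real product: each role sees only the fields it is authorized for, and an unknown role sees nothing.

```python
# Illustrative record from an ops dashboard (all values made up).
RECORD = {
    "flight": "UA 512",
    "gate": "G7",
    "pax_count": 143,
    "screening_lane_status": "lane 3 closed",  # SSI-like field
}

# Each role maps to the set of fields it may see.
ROLE_FIELDS = {
    "ops_duty_manager": {"flight", "gate", "pax_count", "screening_lane_status"},
    "concessions_analyst": {"flight", "gate", "pax_count"},
    "public_display": {"flight", "gate"},
}

def view(record, role):
    # Unknown roles get an empty set, i.e. deny by default.
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

print(view(RECORD, "public_display"))
# {'flight': 'UA 512', 'gate': 'G7'}
```

In the evaluation, ask the vendor to show this behavior live: log in as two different roles and confirm the security-sensitive fields disappear for the one not cleared to see them.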
The red flag: A vendor who is vague about data ownership, stores your data in a shared environment without clear isolation, or can't articulate their SSI compliance approach. If they haven't thought about airport-specific security requirements, they haven't built for airports.
Question 4: Is the AI Actually Useful — or Just a Feature Checkbox?
Every platform in 2026 claims to be "AI-powered." Very few of them are doing anything meaningful with it.
What to ask: "Show me what the AI does with real airport data. What has it caught or predicted that a dashboard wouldn't?"
What to look for: AI that monitors data continuously — not just when someone opens a report. Anomaly detection that surfaces deviations from normal patterns without being manually configured for every scenario. Predictive capabilities that give you lead time measured in minutes, not days. And plain-language interfaces where your ops team can query data conversationally instead of building reports.
The red flag: AI that's really just rule-based alerts dressed up with new vocabulary. If the system only fires when a manually set threshold is exceeded, that's automation, not intelligence. Ask the vendor to show you something the AI discovered that wasn't explicitly programmed.
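The distinction between a hard-coded threshold and pattern-based detection can be shown in a few lines. This is a toy sketch, not any vendor's method: the numbers are invented (say, minutes of security wait), the fixed limit of 30 is arbitrary, and the "detector" is just a rolling z-score — but it illustrates why a value can be far outside the normal pattern while never tripping a manually set rule.

```python
from statistics import mean, stdev

# Invented series, e.g. minutes of security wait; the last value drifts up.
waits = [12, 11, 13, 12, 14, 12, 13, 11, 12, 24]

FIXED_THRESHOLD = 30  # rule-based alert: fires only past a hard limit

def threshold_alert(value, limit=FIXED_THRESHOLD):
    return value > limit

def zscore_alert(history, value, k=3.0):
    # Flags values far outside recent observations, with no
    # per-scenario threshold to hand-tune.
    mu, sigma = mean(history), stdev(history)
    return abs(value - mu) > k * sigma

history, latest = waits[:-1], waits[-1]
print(threshold_alert(latest))        # False: 24 min never trips the 30-min rule
print(zscore_alert(history, latest))  # True: far outside the normal pattern
```

A 24-minute wait is roughly double the recent norm — clearly worth a heads-up — yet the fixed rule stays silent. If everything a vendor's "AI" flags could be reproduced by the first function, it's automation wearing a new label.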
Question 5: Can I Prove It Before I Commit?
This is the question that separates confident vendors from confident salespeople.
What to ask: "Can we do a 90-day pilot in a single terminal with defined success metrics before signing a multi-year contract?"
What to look for: A structured pilot program with a clear scope (one terminal, specific systems, agreed-upon KPIs). Fixed pricing for the pilot period. Success metrics that you define together — not metrics the vendor picks because they know they'll look good. And a transparent conversion path from pilot to full deployment, with pricing that credits the pilot investment.
The red flag: A vendor who insists on a full airport commitment from day one. Or one who agrees to a "pilot" but defines success so loosely that any outcome counts as a win. The pilot should be a real test with real criteria — not a formality before the contract you've already signed kicks in.
Question 6: Who Built This — and Have They Been in My Shoes?
Aviation technology isn't generic enterprise software. Airports have unique constraints: regulatory requirements, shift-based operations, multi-stakeholder environments, legacy systems from different decades, and procurement processes that make other industries look nimble.
What to ask: "Who designed this product? Have they worked in airport operations?"
What to look for: A team with direct airport operations experience. Not consultants who studied airports for a project — people who've lived inside the ops center, who understand why manual reporting is soul-crushing, who know what a duty manager's shift actually looks like. The product decisions should reflect firsthand knowledge, not research.
The red flag: A platform originally built for a different industry (logistics, manufacturing, smart buildings) that's been "adapted" for airports. The workflows, terminology, compliance requirements, and stakeholder dynamics of airport operations are specific enough that a generic platform with an airport skin will always feel like it was built by someone who's never walked a terminal at 5 AM.
The Summary Checklist
Before you sign anything, make sure you can answer "yes" to each of these:
Integration: Connects to my existing systems without replacing them. Read-only. Zero disruption.
Speed: First dashboards within 30 days. Full deployment within 90.
Data security: Airport owns the data. RBAC. SSI compliance. Logical isolation. Clear decommissioning.
AI that works: Continuous monitoring. Anomaly detection. Predictive lead time. Conversational queries.
Pilot model: 90-day terminal pilot. Success metrics I define. Fixed scope and pricing.
Airport DNA: Built by people who've actually worked in airport operations.
If a vendor can't check every box, keep looking. Your airport — and your ops team — deserve a platform built for the way airports actually work.