In the high-velocity worlds of Formula 1 and SailGP, technology, analytics, and human ingenuity combine to deliver competitive edges measured in fractions of a second. These teams thrive on a culture of continuous improvement, integrated data ecosystems, and well-defined processes. The results are clear: relentless performance gains that translate directly to success on track or on water.
For businesses, the allure of AI and advanced analytics is equally strong. According to research from MIT and McKinsey, organisations that leverage data-driven decision-making outperform their peers. But here’s the twist: do you start by establishing data infrastructure and organisational readiness, or by identifying and pursuing the most promising AI use cases first? Which comes first, the chicken (organisational foundations) or the egg (specific AI-driven applications)?
The Tension Between Readiness and Use Cases
Over the last couple of years this is possibly the most common issue leaders wrestle with, and it’s made more complex by the competing agendas and needs of different stakeholder groups. Some argue that without well-structured processes, integrated data, and a supportive culture, even the best use cases fail to deliver value; look at many ERP programmes and it’s a hard point to argue down. Others counter that a quantifiable business problem, or a high-potential use case with a strong supporting business case, is what justifies the investment in the first place. The truth, as so often in business, lies between the extremes.
Use Cases as Catalysts: Identify a compelling use case, a high-impact challenge that can be addressed with AI, and it can create the impetus for organisational alignment. As research from Bain suggests, a clear “North Star” problem often galvanises leaders to break down silos, free up budgets, and invest in data quality. It provides a tangible reason for teams to collaborate and processes to be streamlined.
Foundations as Enablers: On the other hand, McKinsey’s work on advanced analytics maturity models shows that without foundational readiness (clean, integrated data; clear processes; skilled talent; and cultural buy-in), AI pilots may never reach scale. Even if you spark initial excitement with a use case, that energy fizzles if the underlying infrastructure isn’t ready to support it long-term.
Looking at F1 and SailGP for Clues
Racing teams are a masterclass in balancing these priorities. They never invest in data or technology for its own sake. Instead, they focus on performance questions: Can we shave off a tenth of a second in a corner? Can we optimise sail trim given wind conditions? These are their “use cases.” But at the same time, they know that answering these questions effectively depends on having a robust data foundation, sensor arrays that provide reliable real-time telemetry, analytics platforms that run simulations at lightning speed, and a team culture that trusts data-driven decisions.
In other words, they treat organisational readiness and use cases as inseparable. Each new performance question (the use case) informs what data and capabilities are needed. Each investment in better data and stronger processes unlocks new, more ambitious use cases.
Finding the Right Sequence for Your Organisation
So how do you solve the chicken-or-egg puzzle in your own setting?
Start with a Clear Strategic Priority: Look at how INSEAD’s “Strategic Agility” research frames it: identify a priority that matters, a problem that, if solved, moves the needle for your business. This is your “egg”: a defined use case or strategic question that gives your data and AI efforts direction.
Assess Your Readiness Gap: Once you know what problem you’re solving, assess whether you have the “chicken”, the foundational readiness, to act on those insights. If your data is scattered, your processes clunky, and your teams mistrustful of analytics, address these gaps first. Publications like Harvard Business Review and Gartner emphasise that even the best algorithms fail if fed poor data or embedded in misaligned processes.
Iterate in a Loop, Not a Line: Rather than viewing readiness and use cases as a strict sequence, treat them as an iterative loop. A promising use case helps you prioritise which data and capabilities to fix first. As you improve data quality and streamline processes, you can tackle more complex and higher-value use cases. Think of it as a virtuous cycle: each informs and strengthens the other.
Use Pilots to Build Foundations: Sometimes you must pick a single, high-impact pilot use case that forces improvements in data handling, process rigour, and cultural openness. When that pilot pays off with tangible benefits, it wins hearts and minds, encouraging further investment in foundational capabilities. This approach aligns with the agile methodologies highlighted in MIT studies on experimentation: start small, prove value, then scale.
Practical Steps to Balance Both Sides
Pick a Use Case That Matters: Don’t choose an AI project because it’s trendy. Identify an area, like reducing churn or optimising inventory, that aligns with strategic goals. As research from PwC and Deloitte indicates, showing early wins builds momentum.
Set Clear Metrics and Targets: Just as F1 teams measure every performance tweak with lap times, define KPIs that show progress. This makes the link between improved readiness (better data and processes) and improved outcomes (faster decisions, cost savings) crystal clear.
Invest Incrementally in Infrastructure: You don’t have to fix all your data problems at once. Start with the datasets critical to your chosen use case. As you demonstrate ROI, expand your data integration efforts, guided by Forrester and Gartner’s best practices for data management.
Get Cultural Buy-In Early: In SailGP, trust in real-time data is non-negotiable. Foster a culture that understands what you’re trying to achieve with the use case and why data and AI matter. This aligns with commonly used change management frameworks: people must believe in the mission.
Iterate and Scale: Once you show that tackling a use case improved performance, use that success story to champion broader data governance initiatives, more robust analytics platforms, or targeted upskilling. Each success provides permission to go deeper and broader.
Harmony, Not Hierarchy
The chicken-or-egg dilemma of AI adoption (do you start with the “perfect” organisational conditions or with a compelling use case?) often stems from viewing the two as mutually exclusive. High-performance racing teams and leading companies treat them as interdependent. They use ambitious, value-driven use cases to guide foundational improvements, and then leverage those improved foundations to solve even more complex problems.
In other words, neither the chicken nor the egg strictly comes first; they evolve together. Start by identifying a high-impact use case that can ignite organisational willpower. Then use the demands of that initiative to drive improvements in data quality, process integration, and cultural readiness. Over time, this interplay becomes a cycle of continuous improvement, pushing you closer to that podium finish: on the track, on the water, or in the marketplace.
Let’s take your transformation efforts to the next level, reach out directly at shaun.taylor@rckpm.es for a more in-depth conversation.
About Shaun Taylor
Shaun is a seasoned C-level transformation executive with a proven track record in strategic growth, operational optimisation, and value creation. He specialises in helping C-suite leaders navigate complex transitions. His expertise lies in large-scale and private equity-backed businesses, where he has secured complex transformation and operational successes that deliver measurable outcomes.
Through the RCK Programme Methods, he brings a structured approach, blending agile principles with deep operational insight, to align technology, operations, and strategy for sustainable success. Whether it’s Cost Transformation, Value Creation, ERP-enabled change, or building coalitions that foster cultural alignment, Shaun and the RCK team ensure your transformation efforts are not just implemented but deliver the results you have committed to.