We’ve been here before, right?
Dotcom, cloud and mobile. Social media and the internet of things. Big data. Blockchain. The metaverse (lol). The past three decades are littered with hype cycles, and picking a path through the propaganda is just another part of the brief for today’s business leader.
But AI is different. It’s not just another tech upgrade – it’s a paradigm shift. It touches everything from customer experience and operations to product development and workforce strategy. It promises to fundamentally alter the way we interact, both as businesses and as humans. The stakes are high, and so is the pressure – both to move quickly and to avoid costly mistakes.
And as a result, organizations are increasingly finding themselves caught between two states of mind:
- FOMO, or the fear of missing out – fueled by headlines of competitors gaining an edge, vendors promising revolutionary capabilities, and startups disrupting entire industries.
- And FOMU, the fear of messing up, which arises from legitimate concerns around data privacy, ethics, regulatory compliance and the sheer complexity of the AI landscape.
Confusion reigns in the AI space
Both fears are valid. Leaders worry that if they don’t act now, they’ll get left behind. But also that if they invest in AI and it backfires, then they’ll leave their organizations exposed. And when either fear dominates, it can distort decision-making.
So how do you strike the right balance between the two? It was a topic we explored in depth on a recent GDS Group roundtable discussion, hosted by Meet the Boss, looking at how to secure AI to better enable innovation.
“There is still a whole lot of confusion around what exactly AI is, how can it be used, and what it’s going to enable from a company perspective,” explained Gerry Plaza, Field CTO in the Chief Strategy Office at cybersecurity firm Netskope. “Before you start going down the path of creating your AI strategy, you need to define what AI means to you. Define the key objectives of the business. What are you trying to solve with AI?”
Because despite all the hype around AI, Plaza believes a lot of companies still don’t understand what AI means to them. “They feel like they have to get on the AI bandwagon, or they’ll get left behind.”
Companies playing catch-up
That fear is understandable, of course – not least because there are real risks in NOT integrating AI into your workflows. There are the obvious opportunity costs of moving slowly and being late to market, but there are also major security concerns to factor in.
“The risk is your users are going to figure out ways to use AI anyway, because it helps make us more efficient, more innovative, it’s a tool that helps us become more effective and productive,” Plaza explains. “And so if you don’t have visibility into how AI is being used, if you don’t have the proper security controls and the proper guardrails in place, you’re going to start to see a tremendous amount of data being leaked out.”
Strong governance is an important part of that process. But Plaza also believes that putting the right controls in place from a tooling perspective is critical.
“People are going to test the limits of everything you put out there,” he says. “So you’ve got to put guardrails around what you can access, but also have internal security controls that sit as a gateway between the prompt and the response. Because you don’t want to have a scenario where somebody comes up with a prompt, and the model ends up giving them a tremendous amount of your company’s intellectual property.”
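The gateway Plaza describes – internal security controls sitting between the prompt and the response – can be sketched in a few lines. This is a minimal illustration, not a real product: the pattern list, the `guarded_query` function, and the stubbed model call are all hypothetical, and a production deployment would use proper DLP classifiers rather than simple regexes.

```python
import re

# Hypothetical guardrail patterns – a real system would use trained
# classifiers and data-loss-prevention tooling, not regexes.
SENSITIVE_PATTERNS = [
    re.compile(r"\b(?:\d[ -]?){13,16}\b"),                  # card-number-like digit runs
    re.compile(r"(?i)\bconfidential\b|\binternal only\b"),  # document markings
]

def is_sensitive(text: str) -> bool:
    """Return True if any guardrail pattern matches the text."""
    return any(p.search(text) for p in SENSITIVE_PATTERNS)

def guarded_query(prompt: str, model_call) -> str:
    """Screen the prompt on the way in and the model's response on the way out."""
    if is_sensitive(prompt):
        return "[blocked: prompt contains restricted content]"
    response = model_call(prompt)
    if is_sensitive(response):
        return "[blocked: response withheld by policy]"
    return response

# Stubbed-out model standing in for a real LLM endpoint:
fake_model = lambda p: "Our internal only roadmap says..."
print(guarded_query("Summarize our Q3 plan", fake_model))
```

The key design point is that the gateway inspects traffic in both directions: a prompt can try to exfiltrate data, but so can a model response that surfaces intellectual property the user was never meant to see.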
Observability and bi-directional visibility
And in the same way that you need to stop intellectual property or company information from being sent out externally, it’s just as important to be careful about what is coming into your company.
“You’re hiring new employees that are probably bringing stuff in that they’ve been working on in other companies,” says Plaza. “They’re bringing in code or snippets of data or tooling from other environments. And you don’t want things coming into your company that could have a negative impact on your brand or operations. That can be just as damaging as data leaking out. So you’ve got to be able to have visibility bi-directionally.”
Understanding these dynamics is critical. While FOMO can drive innovation and early adoption, left unchecked it can also lead to rushed decisions and wasted investment. Too much FOMU, on the other hand, and you run the risk of stagnation and missed opportunity. Striking the right balance between these two fears is essential for sustainable, strategic AI adoption.
So what are the best ways to balance FOMO against FOMU? Here are our top tips:
- Start with strategy, not technology
AI should serve your business goals, not the other way around. Begin by identifying high-impact areas where AI can solve real problems or unlock new opportunities. Whether it’s improving customer service, optimizing supply chains, or enhancing decision-making, clarity of purpose is key.
- Build cross-functional governance
AI adoption isn’t just an IT initiative – it requires input from legal, compliance, HR, operations, and more. Establish a governance framework that includes clear roles and responsibilities; ethical guidelines and risk assessments; and data stewardship and privacy protocols. This helps mitigate FOMU by ensuring risks are proactively managed, and tempers FOMO by slowing down impulsive decisions.
- Pilot with purpose
Rather than launching broad, unfocused AI programs, start with targeted pilots. Choose projects with measurable outcomes, manageable scope, and strong stakeholder support. Use these pilots to learn, iterate, and build internal confidence. Success in small pilots can reduce FOMU and create momentum, while failure in a controlled environment minimizes risk.
- Invest in people and culture
AI adoption is as much about mindset as it is about machines. Upskill your workforce, foster a culture of experimentation, and encourage cross-disciplinary collaboration. Employees who understand AI are more likely to use it responsibly and creatively. This reduces both FOMO (by demystifying AI) and FOMU (by building internal capability and resilience).
- Stay informed, and challenge the hype
The AI landscape is noisy. Vendors, analysts and media often hype capabilities beyond what’s realistic. Stay informed, but apply critical thinking. Ask tough questions about scalability, transparency and long-term value.