Laying foundations before taking off: Build the data infrastructure for AI deployment

Updated: Jan 28, 2021

The term “big data” was first coined in the mid-noughties, yet 15 years later enterprises are still struggling with what it means and how to use it to their competitive advantage, says George Frangou, founder and group CEO at Massive Analytic.

The boom of big data technologies was like something out of the mid-19th-century gold rush, with many companies swept up in the scramble to adopt, productionise tools and make sense of their biggest asset: their data.

But as we start this new decade, like many of those frontier towns during the gold rush, companies are finding that as big data moves from infancy to adolescence, the foundations aren’t there to support growth and take advantage of newer more advanced technologies.

Consolidating the foundation

Many companies I’ve spoken to this year have told me that they’re simply not ready to adopt and embed AI into their business at this time. It’s a familiar story: they understand the benefits, but the infrastructure isn’t there yet. One such company, in the banking sector, told me how for years different departments had been creating, analysing and building datasets on the same accounts independently.

The result? Hundreds of versions of the same data, duplicate accounts, and the same customers recorded many times over. How do you take advantage of AI and machine learning if you don’t have a truthful representation of your own business? The answer is simple: you can’t. The task for this company now is to arrive at a ‘single version of the truth’, delivering the right data to decision-makers so they can clearly understand business performance.
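To make the consolidation idea concrete, here is a minimal sketch of collapsing duplicate account records into one record per account by keeping the most recently updated entry. The record fields, account numbers and "latest wins" rule are all illustrative assumptions, not the bank's actual merge logic, which would involve far richer matching and reconciliation.

```python
from datetime import date

# Hypothetical duplicate records for the same accounts, as might be
# produced by different departments working independently.
records = [
    {"account": "A-1001", "balance": 250.0, "updated": date(2020, 3, 1)},
    {"account": "A-1001", "balance": 310.0, "updated": date(2020, 9, 15)},
    {"account": "A-2002", "balance": 90.0,  "updated": date(2020, 5, 7)},
]

def consolidate(records):
    """Keep only the most recently updated record per account."""
    latest = {}
    for rec in records:
        key = rec["account"]
        if key not in latest or rec["updated"] > latest[key]["updated"]:
            latest[key] = rec
    return latest

# One record per account: a 'single version of the truth'.
golden = consolidate(records)
```

In practice the merge rule is the hard part: deciding which department's value wins, and when conflicting records need human reconciliation rather than an automatic rule.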

This story isn’t uncommon; I’ve lost count of the people I’ve spoken to who are running multiple programmes to consolidate hundreds of different data warehouses and silos. Now more than ever, companies are realising that there needs to be a strong foundation before you can start driving data insights throughout the business and enabling data-driven decisions.

Unlocking the implementation

To use a real-world example, a major player in the aerospace industry knew it wanted to build a platform and architecture to seamlessly share and exploit the data it collected, in order to implement AI and machine learning. The data itself comes from aircraft sensors, but there were myriad issues to solve during the implementation, including restricted data flow, differing data standards and languages, no support for AI applications, and limited access.

Data was flowing in large volumes (around 60MB per flight hour per aircraft in the fleet), and the combination of those issues meant it was not actionable, and accessing all of the data generated was impossible. A blend of machine-learning-driven business rules for handling the different kinds of data, together with a big data architecture, was the key to solving these issues.
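Business rules for handling different kinds of incoming data often amount to a routing policy. The sketch below is a hypothetical illustration of that pattern: the stream names, schema versions and `export_controlled` flag are assumptions for the example, not the company's actual rules or data model.

```python
# Illustrative routing rules for incoming sensor messages.
def route(message):
    """Decide which downstream store a sensor message belongs to."""
    if message.get("schema") != "v2":
        return "quarantine"        # non-standard format: hold for conversion
    if message.get("export_controlled"):
        return "restricted-store"  # keep behind strict access controls
    return "analytics-lake"        # generally shareable data

msgs = [
    {"schema": "v2", "export_controlled": False},
    {"schema": "v1"},
    {"schema": "v2", "export_controlled": True},
]
routes = [route(m) for m in msgs]
```

In a production architecture, rules like these would sit in the ingestion layer, and the interesting step is learning or tuning them from the data itself rather than hand-coding every case.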

The heart of this company’s problem is a common one. It needed to share and action data across the business, including with operators and planners, national stakeholders, industry partners, and data scientists, all whilst maintaining the security of the data, its IP and export controls. This also needed to happen in as close to real time as possible. It was no small feat, but it was a crucial one.

Scaling the deployment

Building on top of a reliable AI and machine learning platform established a firm foundation that not only lets this business access and share data securely, but also provides a base for data insights and promotes the use of AI and analytics. Since implementing the new architecture, the company has already identified several key influences on the longevity and performance of its aircraft.

While the initial deployment covered a single use case, the intention is to scale the architecture across the business. It has other benefits too, such as alignment with government MoD standards and providing a base for third-party applications. Once this is in place, you have the freedom to seek out third-party vendors and to make use of AI, machine learning and automation, and all the benefits therein.

If you’re a company that has already consolidated its data infrastructure and has that foundation in place, fantastic. If not, it’s not too late to take the critical steps towards maximising the performance of your business. Rest assured: even if you’re not taking them, your neighbour soon will be.
