2 October 2024

The Devil is in the Data: The importance of data in today’s world

Ian Burgess
Chief Technology Officer

In the modern world, data reigns supreme. From shaping business strategies to guiding public policies, data plays a critical role in almost every aspect of our lives. Yet the true power of data lies not just in its collection, but in its interpretation and application.

As the saying goes, “the devil is in the detail,” and, given the detail is very much in the data, I coined the phrase “the devil is in the data” on a recent tour of Procode with visitors from the DCC. This article explores the significance of data, the challenges in handling it, and how it’s ultimately at the foundation of everything that we do within Procode.

The Backbone of Decision-Making

Data is the backbone of informed decision-making. In businesses, data-driven decisions can lead to more efficient operations, improved customer satisfaction, and increased profitability.

By analysing customer behaviour, market trends, and internal processes, companies can make strategic decisions that are grounded in reality rather than intuition.

For instance, retail giants like Amazon use data to optimise their supply chains, predict consumer demand, and personalise the shopping experience.

Through sophisticated data analytics, they can anticipate what products will be in demand, manage inventory more effectively, and tailor recommendations to individual customers, thereby enhancing sales and customer loyalty.

Another example, close to my own heart, comes from Formula 1.

With 1.1 million data points captured from a typical F1 car every second during a race, and even more data computed during the design, manufacture and testing processes, data really is the key to unlocking the potential of the car, the driver and the entire team.

The Mercedes F1 team have the slogan “Chasing every millisecond” on the side of their trucks, and it has become a mantra within the team, as a millisecond can be the difference between winning and losing a race.

This was borne out at the Canadian Grand Prix back in June, when George Russell (Mercedes) and Max Verstappen (Red Bull) set exactly the same qualifying time of 1:12.000.

This has only happened once previously, back in 1997 during the European Grand Prix at Jerez when Jacques Villeneuve set a time of 1:21.072 and then – to the millisecond – rival Michael Schumacher matched it. Unbelievably, Heinz-Harald Frentzen then matched it as well and the three drivers all set identical lap times.

But back to the Mercedes slogan: you can’t “chase every millisecond” without incredibly rich and reliable sources of data. I’m sure my time spent in F1 is partly to blame for my data fanaticism!

Transforming Industries

The impact of data extends beyond the business realm, transforming entire industries. In healthcare, data analytics is revolutionising patient care and medical research.

By examining large datasets, researchers can identify patterns and correlations that lead to new treatments and preventive measures. Indeed, pharmaceutical companies are increasingly exploring quantum computing to crunch huge datasets in the pursuit of new medicines.

Similarly, in the finance sector, data is used to detect fraudulent activities, assess credit risks, and guide investment strategies. Financial institutions rely on complex algorithms and real-time data analysis to make split-second decisions that can significantly impact the markets.

But the undeniable ‘king of data’, as you’d expect, is the tech sector.

I was fortunate to spend some time recently with folks from the data analytics side of Google.

Google is a phenomenal example of a company that has been built on data, from its conception back in 1996 through to today. Mapping is a great example of this. Starting with the audacious goal of mapping every road across the globe, the data megalomaniac that is Google raised its game further with ambitions to map the entire globe, using tools such as Street View, satellite imaging, crowd-sourced imagery and much more besides.

It’s estimated that Google has now mapped 98% of the world’s inhabited areas. Fun fact: did you know that, armed with your own 360-degree camera, you can upload your own Street View imagery to Google? But they’re not doing this for the sheer hell of it: just think for a minute about the volume of services that Google monetises from its mapping data.

Examples include local advertising, traffic and routing services, business footfall from phone GPS data, local weather forecasts and so much more.

Not content with mapping every street, Google is now intent on creating a high-resolution, 3D view of every building on the planet: not only does this enrich its mapping service, it creates further revenue streams. A great example of this is the Google Solar API, which takes high-resolution rooftop imagery in conjunction with weather and mapping data and very accurately predicts the suitability of a property for solar panels, along with the likely return on investment.
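To make that concrete, here’s a minimal sketch of what querying such a service might look like. The endpoint and response field names follow Google’s public Solar API documentation as I understand it, but treat them as illustrative assumptions rather than a verified integration:

```python
import requests

# Illustrative only: endpoint and field names follow Google's published
# Solar API docs at the time of writing; treat both as assumptions.
SOLAR_API = "https://solar.googleapis.com/v1/buildingInsights:findClosest"

def solar_potential(lat: float, lng: float, api_key: str) -> dict:
    """Fetch rooftop solar suitability for the building nearest (lat, lng)."""
    params = {
        "location.latitude": lat,
        "location.longitude": lng,
        "key": api_key,
    }
    response = requests.get(SOLAR_API, params=params, timeout=30)
    response.raise_for_status()
    solar = response.json().get("solarPotential", {})
    return {
        "max_panels": solar.get("maxArrayPanelsCount"),
        "roof_area_m2": solar.get("maxArrayAreaMeters2"),
        "yearly_sunshine_hours": solar.get("maxSunshineHoursPerYear"),
    }
```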

The Challenge of Big Data

With the explosion of digital technologies, the volume of data generated every day is staggering. This phenomenon, known as Big Data, presents both opportunities and challenges. While having access to vast amounts of data can lead to deeper insights and more accurate predictions, it also requires advanced tools and skills to process and analyse effectively.

One of the biggest challenges is ensuring data quality. Poor-quality data can lead to incorrect conclusions and misguided decisions. It is therefore crucial to have robust data management practices in place, including data cleaning, validation and regular audits. We see this all too often in our own data, where a single error can have huge, unintended consequences across the business.
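To illustrate the kind of validation gate I mean, here’s a deliberately simple sketch. The field names (meter_id, kwh, timestamp) and the sanity thresholds are hypothetical, not taken from any of our actual schemas:

```python
from datetime import datetime, timezone

def validate_reading(reading: dict) -> list[str]:
    """Return a list of data-quality problems found in one meter reading."""
    problems = []
    if not reading.get("meter_id"):
        problems.append("missing meter_id")
    kwh = reading.get("kwh")
    if kwh is None:
        problems.append("missing kwh value")
    elif kwh < 0:
        problems.append("negative consumption")
    elif kwh > 10:  # illustrative sanity ceiling for one short interval
        problems.append("implausibly large consumption")
    ts = reading.get("timestamp")
    if ts is None or ts > datetime.now(timezone.utc):
        problems.append("missing or future timestamp")
    return problems

readings = [
    {"meter_id": "MTR-001", "kwh": 0.004,
     "timestamp": datetime(2024, 9, 30, tzinfo=timezone.utc)},
    {"meter_id": "", "kwh": -3.2, "timestamp": None},
]
# Bad readings are quarantined for audit rather than silently dropped,
# so a single upstream fault can't skew every downstream report.
clean = [r for r in readings if not validate_reading(r)]
```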

Moreover, the sheer volume of data demands powerful computational resources and sophisticated analytical techniques. Traditional data processing methods often fall short, necessitating the use of advanced technologies such as artificial intelligence (AI) and machine learning to handle and interpret Big Data.

Back to the Google example: having exabytes (1 exabyte = 1 billion gigabytes!) of data in its arsenal, ranging from your holiday snaps, browsing history and emails through to mapping, weather and website data, creates its own problem.

How do you access and move that data quickly and efficiently? The answer is simple: through petabit networks connecting these huge data stores. For those of us lucky enough to have a one-gigabit internet connection, a one-petabit connection has one million times more capacity!
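A quick back-of-the-envelope calculation shows why that matters. Moving a single exabyte over a domestic one-gigabit line would take centuries; over a petabit link it’s an afternoon:

```python
EXABYTE_BITS = 8 * 10**18  # 1 EB = 10^18 bytes = 8 x 10^18 bits
GIGABIT = 10**9            # bits per second
PETABIT = 10**15           # bits per second

def transfer_seconds(size_bits: int, rate_bps: int) -> float:
    """Idealised transfer time, ignoring protocol overhead."""
    return size_bits / rate_bps

years = transfer_seconds(EXABYTE_BITS, GIGABIT) / (3600 * 24 * 365)
hours = transfer_seconds(EXABYTE_BITS, PETABIT) / 3600
print(f"1 EB over 1 Gbit/s: ~{years:.0f} years")  # ~254 years
print(f"1 EB over 1 Pbit/s: ~{hours:.1f} hours")  # ~2.2 hours
```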

Ethical Considerations

As data becomes more integral to our lives, ethical considerations come to the forefront. Issues of privacy, security, and consent are paramount. Organisations must ensure that data is collected, stored, and used in a manner that respects individual privacy and complies with legal regulations.

The misuse of data can have serious consequences, from data breaches that compromise personal information to biased algorithms that reinforce social inequalities.

Therefore, ethical guidelines and transparent practices are essential to maintain trust and integrity in data usage. We have a number of e-learning modules on data privacy for this precise reason.

Data and Procode

So how does Procode fit into this?

Increasingly, the new products and services that we’re developing within Procode have data at their core.

Yes, all technology platforms consume, generate and manipulate data, but often this is a by-product of the platform’s intended purpose.

Take our metering installation and commissioning tool. Clearly its primary purpose is to install meters as efficiently as possible but, as a result, we generate and collate lots of data. By contrast, products such as Smart Datastream have been built to be data-centric from the outset.

Put simply, we have huge opportunities to monetise the combined insights and intelligence from electricity consumption, smart home sensors, Canary Care, heating systems, weather data and many, MANY more besides.

Let’s just think about what this means in terms of electricity data.

You’re probably all familiar with the ability to gain very detailed insights about individual electrical appliances from their unique signatures on the power lines. For example, we can understand energy efficiency, monitor device utilisation, and even detect when a device is failing or about to fail.
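For the curious, a toy version of this idea fits in a few lines: watch for step changes between consecutive readings that are large enough to suggest an appliance switching on or off. The 40 W threshold and the fridge figures below are illustrative assumptions only; real load disaggregation is far more sophisticated:

```python
def detect_events(power_watts: list[float],
                  threshold: float = 40.0) -> list[tuple[int, float]]:
    """Flag step changes between consecutive readings that look like an
    appliance switching on (positive step) or off (negative step)."""
    events = []
    for i in range(1, len(power_watts)):
        step = power_watts[i] - power_watts[i - 1]
        if abs(step) >= threshold:
            events.append((i, step))
    return events

# A fridge compressor (~120 W) cycling on, then off, over a 60 W base load.
samples = [60, 61, 180, 182, 181, 179, 62, 60]
print(detect_events(samples))  # [(2, 119), (6, -117)]
```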

The richest source of this information comes from analysing the electrical data every ten seconds (yes, for those involved in the industry’s half-hourly settlement programme, I did say ten seconds!).

So for every household that’s 8,640 blocks of consumption data every day. Or, for Utilita, with approximately 840,000 customers, that’s over 2.6 TRILLION records per year!
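If you want to sanity-check those numbers, the arithmetic is simple enough to script (using the approximate customer count quoted above):

```python
SECONDS_PER_DAY = 24 * 60 * 60            # 86,400
blocks_per_day = SECONDS_PER_DAY // 10    # one block every ten seconds
households = 840_000                      # approximate customer base
per_year = blocks_per_day * 365 * households
print(f"{blocks_per_day:,} blocks per household per day")  # 8,640
print(f"{per_year:,} records per year")                    # 2,649,024,000,000
```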

Now overlay the other sources of data mentioned previously and the number explodes into our own Big Data opportunity.

But it’s not all about the commercial opportunities that exist in data: data has been at the heart of our observability and monitoring practice for several years now, and it drives all of the screens that seem to be a firm favourite on external tours.

By understanding what normal looks like, we can better spot anomalies caused by our systems, third-party providers and customer activity.
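As a flavour of what “understanding normal” means in practice, here’s a deliberately simple sketch: flag anything more than three standard deviations from the recent baseline. Our real monitoring is considerably richer than this, and the response-time figures below are made up purely for illustration:

```python
import statistics

def is_anomalous(history: list[float], latest: float,
                 z_limit: float = 3.0) -> bool:
    """Compare the latest value against the historical baseline: anything
    more than z_limit standard deviations from the mean is flagged."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_limit

# e.g. recent API response times (ms) versus the newest sample
baseline = [110, 120, 115, 118, 112, 121, 117, 119, 114, 116]
print(is_anomalous(baseline, 260))  # True: well outside "normal"
print(is_anomalous(baseline, 118))  # False
```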

The Future of Data

Looking ahead, the importance of data will only continue to grow. Emerging technologies like the Internet of Things (IoT), blockchain, and advanced AI will generate even more data, offering new possibilities for innovation and efficiency. However, with these advancements comes the responsibility to manage and utilise data ethically, effectively and efficiently.

In conclusion, while the devil may be in the data, so is the potential for immense value and transformative change. By embracing the power of data, addressing its challenges, and adhering to ethical standards, we can unlock a future where data-driven insights lead to better decisions, improved outcomes, and a more informed society. The key lies in understanding that the details matter, and in the world of data, those details are everything.