Data centres today are more than facilities to house and secure data. Organisations look to data centres to be strategic business enablers that power innovation and deliver competitive advantage. To address these evolving needs, they must optimise performance, and build agility and scalability into their operations.
But to optimise data centre operations, we must first fully understand them. Only with complete and accurate information and clear visibility can improvement opportunities and potential problems be identified. Therefore, data-led insights play a key role.
Data Centres Generate ‘Big Data’
The availability of massive amounts of machine-generated data in real time – referred to as ‘big data’ – has fundamentally changed the way businesses approach decision-making. Big data is defined by three characteristics, known as the 3 Vs: massive volumes of data in a variety of non-standard formats, processed at high velocity.
The 3 Vs of big data:
- Volume refers to the large amounts of data being generated every second. In a data centre, this comprises data generated by equipment, systems, sensors, smart products, and the 24-hour continuous monitoring of sophisticated equipment. Together, this equipment generates a staggering number of data points – to the tune of terabytes and even zettabytes.
- Velocity refers to the unprecedented speed at which data is received today – and acting on it in a timely manner is essential. This has driven a shift from batch processing to real-time analysis, helping uncover complex dynamic relationships.
- Variety refers to the many data formats coming from many different sources. Data centre systems have a complex data structure with varied types and formats. For instance, equipment like the uninterruptible power supply (UPS) and the computer room air conditioning (CRAC) units have thousands of operational parts and components, and these generate data in different formats. Therefore, the ability to work with all sorts of data – from structured data that fits in traditional databases to unstructured and semi-structured data – is critical.
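To make the variety challenge concrete, here is a minimal sketch of normalising heterogeneous readings into one common record. The device names, metrics, and payload formats (a JSON sensor message, a delimited UPS string, a database-style tuple) are purely illustrative assumptions, not taken from any real product:

```python
import json
from dataclasses import dataclass

@dataclass
class Reading:
    """A common record that all source formats are mapped into."""
    source: str
    metric: str
    value: float

def normalise(raw):
    """Map illustrative heterogeneous payloads to a common Reading."""
    if isinstance(raw, str) and raw.startswith("{"):
        # semi-structured JSON, e.g. from a smart sensor
        d = json.loads(raw)
        return Reading(d["device"], d["metric"], float(d["value"]))
    if isinstance(raw, str):
        # delimited text, e.g. "UPS-01|load_pct|72.5"
        device, metric, value = raw.split("|")
        return Reading(device, metric, float(value))
    # already-structured tuple, e.g. a database row
    return Reading(*raw)

readings = [
    '{"device": "CRAC-03", "metric": "supply_temp_c", "value": 18.4}',
    "UPS-01|load_pct|72.5",
    ("PDU-07", "current_a", 31.2),
]
for r in map(normalise, readings):
    print(r.source, r.metric, r.value)
```

A real pipeline would of course handle far more formats and do this at streaming scale, but the principle is the same: every source, whatever its shape, is reduced to a record that downstream analytics can consume uniformly.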
GOOD TO KNOW!
BIG DATA: A LARGE VOLUME OF STRUCTURED, SEMI-STRUCTURED AND UNSTRUCTURED DATA THAT HAS THE POTENTIAL TO BE MINED FOR INFORMATION AND USED IN MACHINE LEARNING AND OTHER ADVANCED ANALYTICS APPLICATIONS.
DATA ANALYTICS: THE COMPLEX PROCESS OF EXAMINING LARGE AND VARIED DATA SETS TO UNCOVER HIDDEN PATTERNS, UNKNOWN CORRELATIONS, TRENDS, ETC. TO HELP ORGANISATIONS MAKE INFORMED DECISIONS.
MACHINE LEARNING: AN APPLICATION OF ARTIFICIAL INTELLIGENCE THAT GIVES SYSTEMS THE ABILITY TO AUTOMATICALLY LEARN AND IMPROVE FROM EXPERIENCE WITHOUT BEING EXPLICITLY PROGRAMMED.
Big Insight from Big Data
Your data centre equipment is a treasure trove of big data, waiting to be mined. Each connected system or piece of equipment generates millions of data points every second, 24 hours a day. While data holds the key to unlocking operational efficiency, the ground reality is that 90% of equipment performance data is never analysed.
Additionally, islands of disparate data are scattered across the facility and its operations, making consolidation and a single view difficult to attain. Often, even when performance data is available, data centres simply don’t possess the expertise or tools to organise, digest, and draw valuable insights from it.
These challenges arise because the volume, variety, and velocity of big data generated in a modern data centre make it virtually impossible to process with manual methods and legacy technologies.
More importantly, data by itself is worthless – its potential value is unlocked only when it is analysed to provide operational insights, eliminating guesswork and supporting effective operational decisions – from maximising reliability and uptime to increasing energy efficiency and extending equipment lifespan.
Big data thus requires new and advanced analytical processing technologies, such as Machine Learning (ML), to uncover the hidden relationships, patterns, and correlations that offer those insights. Done right, data and analytics work together to deliver valuable insights that can be used to answer complex data centre operational questions.
Time to Move to Smart Data Centre Operations
Machine learning uses data mining techniques and statistical algorithms to identify problems and predict future outcomes, making data centre management iterative, smart, and agile. For example, monitoring sensors in the CRAC can provide data on its performance and health, helping identify potential trouble spots before they cause breakdowns.
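The sensor-based trouble-spotting described above can be illustrated with a very simple statistical sketch: flagging readings that deviate sharply from recent history. The readings, window size, and threshold below are hypothetical, and real condition-monitoring systems use far richer models, but the underlying idea of comparing new data against a learned baseline is the same:

```python
from statistics import mean, stdev

def anomalies(temps, window=10, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations from the rolling mean of the previous `window` readings."""
    flagged = []
    for i in range(window, len(temps)):
        hist = temps[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(temps[i] - mu) > threshold * sigma:
            flagged.append(i)
    return flagged

# Illustrative CRAC supply temperatures: steady around 18 °C
# with one sudden spike at index 11.
temps = [18.0, 18.1, 17.9, 18.2, 18.0, 18.1, 17.8, 18.0, 18.2, 18.1,
         18.0, 25.0, 18.1]
print(anomalies(temps))  # → [11]
```

An alert raised at that spike, before the unit actually fails, is exactly the kind of early warning that lets maintenance be scheduled on the operator's terms rather than forced by a breakdown.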
Cutting-edge analytics using ML help establish links between data and outcomes, helping data centre managers move from intuition-based to fact-based decision-making. Such solutions not only help maximise efficiency, productivity, and availability but also avoid failures with targeted predictive and preventive maintenance; for instance, by indicating changes in the condition of a piece of equipment and accurately determining when a component is likely to fail, allowing you to replace it before it breaks down and disrupts service.
ENGIE’s Avril Digital Support Services is a good example of how ML works to optimise data centre operations. Avril Digital helps you fully understand your entire data centre operations and delivers actionable insights, enabling you to make informed decisions. Our services combine advanced digital technologies, machine learning, and accredited data centre domain expertise to continuously ingest, filter, integrate, and analyse real-time data sets, comparing them with historical data sets to understand the likely causes of issues, identify performance shortfalls, and predict operational threats.
Is your data centre ready for a smart start? Find out more about how ENGIE can help you better orchestrate your data centre performance.