By Srinath Sridhar, Industry Leader for Energy and Utilities & Rhubesh Goomiah, Regional Leader in Victoria at Altis Consulting
Over the last decade, Altis have had the opportunity to work with multiple energy and water utilities, as well as other asset-intensive organisations that operate large facilities-based and/or network assets. Many of these organisations use Operational Technology (OT) solutions such as SCADA to monitor critical assets, often combined with proprietary time-series historians for operational intelligence.
These OT solutions have typically been in place for decades and are deeply embedded within the business units that oversee asset operations. However, we’ve seen an interesting trend with several of our customers over the last couple of years as they’ve been thinking about how they could modernise these legacy time-series historians.
Key factors that are driving this thinking:
- With the rise of cheap new-generation IoT devices, the technology mix in the OT landscape is becoming more diverse for many organisations and is no longer SCADA-centric. For example, we’ve seen our electricity and water clients roll out digital meters and IoT sensors on their networks and then realise that the cost to integrate all that data into their legacy historians can be prohibitive.
- In addition, most of these new-generation IoT solutions are built on public cloud services such as Azure and AWS, using proven patterns for integrating data from field devices into the cloud.
- Organisations are also leveraging the same cloud services to integrate data from many of their other IT/OT data sources to take advantage of the analytics capabilities available in the cloud. This strengthens the case for integrating data from the legacy time-series solutions into the cloud platform where all other datasets have been ingested, rather than the other way around.
- With the rise of AI and generative AI, cloud ecosystems offer many features for running AI-specific workloads on time-series data, helping organisations move from purely descriptive analytics towards predictive and prescriptive insights.
However, before organisations can consider migrating the capabilities offered by their legacy historians into their cloud platform, it is important to perform proper due diligence to ensure that the chosen cloud solution can deliver the required capabilities.
Below are some key things to consider.
Optimised ingestion and storage for large volumes of time-series data:
Traditionally, one of the key benefits of proprietary historians has been their ability to compress and optimise time-series data storage using proprietary formats. Cloud object storage and data lake services can now address this requirement cost-effectively using open, compressed columnar formats and tiered storage. With respect to data ingestion, the same patterns used for modern IoT solutions and cloud data platforms can be leveraged; the key challenge is identifying and selecting suitable middleware that can connect the legacy solution to the cloud ingestion layer.
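To make the storage side concrete, below is a minimal sketch (in Python, using pandas and pyarrow) of landing historian-style readings as compressed, partitioned Parquet files. The tag names, folder path and partitioning scheme are illustrative assumptions only; the same layout can target an object-store URI (for example s3:// or abfss://) given the relevant filesystem library.

```python
import numpy as np
import pandas as pd

# Hypothetical example: a day's worth of 1-minute readings for a few historian tags.
stamps = pd.date_range("2024-01-01", periods=24 * 60, freq="1min")
readings = pd.DataFrame({
    "timestamp": np.tile(stamps, 3),
    "tag": np.repeat(["pump_01.flow", "pump_01.pressure", "dam_01.level"], len(stamps)),
    "value": np.random.default_rng(0).normal(size=3 * len(stamps)),
})
readings["date"] = readings["timestamp"].dt.date.astype(str)

# Write as compressed, partitioned Parquet -- a common data lake layout for
# time-series data. A local folder is used here; swap in an object-store URI in practice.
readings.to_parquet(
    "historian_landing",
    engine="pyarrow",
    compression="snappy",
    partition_cols=["tag", "date"],
)
```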
Time-series intelligence capabilities:
In addition to optimised storage, historians typically offer mature time-series intelligence capabilities for querying and analysing time-series data. These include features such as time-series aggregations and joins, complex time-aware calculations, and the ability to set thresholds and ranges for key metrics. While such capabilities may previously have been lacking in cloud offerings, vendors have been catching up quickly over the last couple of years. Several native services are now available in AWS and Azure that offer these capabilities, and other best-of-breed vendors such as Snowflake and Databricks have continued to refine the time-intelligence capabilities of their platforms.
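As a simple illustration of the kinds of operations involved, the sketch below reproduces a few historian-style calculations (periodic aggregation, a rolling time-window average, and a threshold check) using plain pandas. The tag names and the threshold value are hypothetical; equivalent logic can be expressed in the SQL or query layers of the cloud platforms mentioned above.

```python
import numpy as np
import pandas as pd

# Illustrative minute-level data for two assumed tags.
idx = pd.date_range("2024-01-01", periods=24 * 60, freq="1min")
rng = np.random.default_rng(0)
wide = pd.DataFrame(
    {
        "dam_01.level": 1.2 + 0.3 * rng.standard_normal(len(idx)),
        "pump_01.flow": 50 + 5 * rng.standard_normal(len(idx)),
    },
    index=idx,
)

# Time-series aggregation: hourly mean / min / max per tag.
hourly = wide.resample("1h").agg(["mean", "min", "max"])

# Time-aware calculation: 30-minute rolling average of dam level.
dam_trend = wide["dam_01.level"].rolling("30min").mean()

# Threshold / range check: flag hours where the mean dam level breaches a limit.
LEVEL_LIMIT = 1.5  # assumed threshold, for illustration only
breaches = hourly[("dam_01.level", "mean")] > LEVEL_LIMIT
print(f"Hours above limit: {int(breaches.sum())}, peak 30-min trend: {dam_trend.max():.2f}")
```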
Real-time accessibility of data:
One of the key capabilities a historian provides is real-time access to data. Very often, the use cases that rely on time-series data require real-time insights, for example a water utility that needs real-time dam levels and spillages to manage flood operations. Therefore, moving to a cloud solution requires effective real-time pipelines to ingest, transform and visualise data in near real time.
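The sketch below shows the shape of such a near-real-time pipeline in a deliberately library-agnostic way: readings arrive on a queue, and a consumer transforms and evaluates each one as it lands. In practice the in-memory queue would be a managed streaming service (for example Kinesis, Event Hubs or Kafka), and the sensor, tag name and flood threshold here are purely hypothetical.

```python
import json
import queue
import threading
import time
from collections import deque
from datetime import datetime, timezone

messages: queue.Queue = queue.Queue()  # stand-in for a managed streaming service

def field_device_simulator() -> None:
    """Simulates telemetry arriving from a hypothetical dam-level sensor."""
    for level in [1.1, 1.3, 1.6, 1.8]:
        messages.put(json.dumps({
            "tag": "dam_01.level",
            "ts": datetime.now(timezone.utc).isoformat(),
            "value": level,
        }))
        time.sleep(0.1)

def streaming_consumer(flood_limit: float = 1.5) -> None:
    """Ingests, transforms and evaluates each reading as it arrives."""
    window: deque = deque(maxlen=10)  # small rolling window of recent values
    while True:
        try:
            reading = json.loads(messages.get(timeout=1))
        except queue.Empty:
            break
        window.append(reading["value"])
        rolling_avg = sum(window) / len(window)
        if reading["value"] > flood_limit:  # near-real-time threshold alert
            print(f"ALERT {reading['ts']}: {reading['tag']} at {reading['value']} "
                  f"(rolling avg {rolling_avg:.2f})")

producer = threading.Thread(target=field_device_simulator)
producer.start()
streaming_consumer()
producer.join()
```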
Contextualisation of time-series data:
Finally, when dealing with time-series datasets, it is critical to provide the necessary context which enables proper interpretation of the data. Such data often relates to core assets such as equipment within a facility or field devices placed in a particular part of a network, so without the right context, meaningful analysis is not possible. In our opinion, replicating the contextualisation capability from the legacy historians into the cloud solutions is probably one of the harder challenges that needs to be overcome when seeking to modernise legacy time-series solutions.
In a cloud ecosystem, contextualisation is typically achieved by building asset models driven by specific use cases, such as asset maintenance, operations, smart-meter-enabled customer intelligence and regulatory compliance. Building domain-specific data models in the cloud then provides a holistic view of each use case.
Some of this context could be ingested from other sources such as asset management systems, but organisations should be prepared to implement small custom-built applications to complement their cloud time-series solution with this capability (for example, web apps that enable asset or device templates to be defined and maintained) if they are unable to find an out-of-the-box solution that meets their needs.
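As a rough illustration of what such a companion capability might manage, the sketch below models an asset template, a concrete asset instance, and the mapping from business-friendly measure names to raw historian tag IDs. All class names, identifiers and tag names are hypothetical; the point is only to show how context can be attached to otherwise opaque time-series tags.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class AssetTemplate:
    """Defines the measures expected for a class of asset (e.g. a pump station)."""
    name: str
    expected_measures: List[str]

@dataclass
class Asset:
    """A concrete asset instance, mapping measure names to historian tag IDs."""
    asset_id: str
    template: AssetTemplate
    site: str
    tag_map: Dict[str, str] = field(default_factory=dict)

pump_template = AssetTemplate("PumpStation", ["flow", "pressure"])
pump_01 = Asset(
    asset_id="PUMP-01",
    template=pump_template,
    site="Northern Treatment Plant",
    tag_map={"flow": "pump_01.flow", "pressure": "pump_01.pressure"},
)

# Contextualising raw readings: attach asset and site metadata to each historian tag,
# so downstream queries can group by asset or site rather than by opaque tag names.
context = {
    historian_tag: {"asset_id": pump_01.asset_id, "site": pump_01.site, "measure": measure}
    for measure, historian_tag in pump_01.tag_map.items()
}
print(context["pump_01.flow"])
```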
In summary, as we transition into the AI age, there are now credible options available to organisations that are seeking to modernise their time-series solutions and unlock new analytics capabilities. At Altis, we have experience in helping clients to navigate the challenges associated with such initiatives, so please reach out to us for a more detailed conversation.