The term ‘big data’ was coined to describe data sets too large or complex for traditional tools to handle. Only in the last few years has the ‘big data’ revolution truly taken hold, driven by massive increases in data collection and storage along with greater processing power. From an asset management perspective, we face the same challenge as every other industry – it’s how we manage this data, and what we do with it, that makes the difference. There’s a great opportunity for this data to help us think and plan long term.
We’ve found over the years that the majority of our client organisations hold some form of untapped value in their existing asset data; until recently, however, this data has been difficult to access. Even with easy access, it takes a disciplined mindset shift, along with organisational smarts and technological capability, to actually turn that data into valuable insights.
The traditional approach to asset management has been to assess the condition of assets so that lifecycle forecasts can be generated, predicting the level of investment needed to maintain them. Our rule of thumb has been that 90% of the effort goes into collecting in-field condition information, while 90% of the value comes from the remaining 10% of the effort: its analysis and reporting. We are now turning this around – spend 10% of the effort upfront and potentially gain 80% of the value by finding insights within the existing data. That data by itself may have been overlooked in the past. Imagine, though, combining it with a variety of overlays at the building, structure, or site level to find hidden risks, uncover opportunities to save costs, or even increase asset values.
We have been thinking about these possibilities for a few years now. We call it ‘building asset intelligence’: simplifying asset management so that more effective decisions can be made, leading to a more cost-effective approach to managing assets. Easier said than done? Well, we are now starting to gain this information, and it is actually quite easy – or maybe we just make it easy. We start every new software implementation by collating as much existing electronically formatted data as possible. We bring this data together with our reference libraries, templates, and benchmarks so it can be analysed in a way that identifies outliers, highlights possible hot spots, and may even reveal opportunities worth leveraging.
It’s our version of ‘big data’. Where in the past we created asset data through property assessments in collaboration with our software users, changing technology and changing mindsets are now creating opportunities to extract enormous value from existing asset data.
Our thinking is that any organisation can tap into this type of information and insights surprisingly quickly. The ‘bigger’ our databases become, the more value our users will gain, and the more information we can share with the various asset intensive industries.
If you would like to know how this could work for you, send us your existing data in a flat-file format. We will provide an overview of your data quality and whether it can readily be applied to our analytics templates. We will then share the resulting insights at no cost – they may even reveal immediate areas of risk and potential cost savings.
For more details, visit www.spmassets.com/services/asset-intelligence-service