
Interviews

The value of data

How much is alarm and event data worth? asks Matt Dillon, pictured, of the security engineering consultancy QCIC.

We all know data is a valuable commodity, but why? Traditionally, the data collected by physical security systems was segregated from the core business and often required bespoke reporting tools to access. Yet the asset, meta, alarm and log data those systems collect can inform not only security management but also other key business areas, such as occupancy monitoring and building utilisation. Ultimately, these reviews and the decisions that follow are only as good as the data available.

The data within physical security systems should meet quality data requirements: data that is secure, structured, consistent, accurate, standardised and, most importantly, accessible. It should live within the end user’s core environment and be aligned to core standards, avoiding lengthy and costly manipulation before it can be used to future-proof an organisation against potential risks.

Starting with a standard

The naming of physical security (PS) assets within an application is often free text in nature. This offers the end user flexibility, but it immediately increases risk and decreases value if assets are named incorrectly, inconsistently or out of step with the approved nomenclature of the end user’s business. Any inconsistency here ultimately feeds through to the information presented as an alarm or event. To understand why PS assets can fall out of alignment when typical IT infrastructure, such as a network switch, does not, we need to consider the technological development of the industry: how it has evolved from typically offline, analogue systems to the connected, digital systems we see today. That development can be broken up, roughly, into the three paradigm shifts below. With each generation comes increased functionality and exposure.
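Free-text naming can be constrained in practice by validating every asset name against an agreed convention before it enters the system. As a minimal sketch (the naming pattern, site codes and device type abbreviations below are illustrative assumptions, not a standard from the article):

```python
import re

# Hypothetical convention: SITE-FLOOR-TYPE-SEQ, e.g. "LON01-03-RDR-012".
# Device types assumed here: RDR (reader), CAM (camera), DC (door contact), PIR.
NAME_PATTERN = re.compile(r"^[A-Z]{3}\d{2}-\d{2}-(RDR|CAM|DC|PIR)-\d{3}$")

def validate_asset_name(name: str) -> bool:
    """Return True if a physical security asset name follows the convention."""
    return bool(NAME_PATTERN.match(name))

names = ["LON01-03-RDR-012", "Front door reader (temp)", "NYC02-10-CAM-007"]
for n in names:
    print(n, "OK" if validate_asset_name(n) else "NON-CONFORMANT")
```

A check of this kind, run at commissioning time or as a periodic audit, catches the free-text drift described above before it reaches alarm and event output.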

First generation: Typical trends include offline, air gapped, analogue, security by obfuscation and locally managed.

Second generation: Trends include online, siloed, hybrid analogue and digital, security by design and locally managed yet centrally monitored.

Third generation: Trends include online cloud and on-premises software, logically segregated, digital, edge devices, cyber hardened, centrally managed, plugs into the wider ecosystem, increased interaction with the end users’ employees and support teams, and increased governance.

Core IT and business

Now that we see the security system is part of the end user’s core IT and business systems, it is imperative the data configured and captured meets the quality data requirements and is aligned with key business and asset registry systems. This allows confidence in the reporting of the asset, meta, alarm and log data collected, and results in the security system being a value enabler, sharing information on day-to-day operations rather than simply being a cost centre.
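Alignment with a key business system, such as an asset registry, can be verified mechanically once both sides are exported. A minimal sketch, assuming illustrative asset lists (in a real estate these would come from the access control platform and the corporate CMDB respectively):

```python
# Hypothetical cross-check of PS asset names against a corporate asset registry.
security_assets = {"LON01-01-RDR-001", "LON01-03-RDR-012", "Front door cam"}
asset_registry = {"LON01-01-RDR-001", "LON01-03-RDR-012", "LON01-02-CAM-004"}

# Assets present in the security system but absent from the registry (or vice
# versa) are exactly the misalignments that undermine confidence in reporting.
unregistered = security_assets - asset_registry
unmonitored = asset_registry - security_assets

print("Not in registry:", sorted(unregistered))
print("Not in security system:", sorted(unmonitored))
```

Either discrepancy set being non-empty signals that the data no longer meets the quality requirements and should be reconciled before reports are trusted.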

Reporting and accessibility

If correctly configured and aligned with approved nomenclature, the transactional and alarm data available in a typical enterprise-level security system can provide valuable insights when analysed by the correct suite of applications. Historically, this data was siloed away, unavailable to the core business and used for security investigations only.

Patterns

For the security system to be a value enabler, this data needs to be accessible and subject to the end user’s data privacy regulations. Bespoke and native reporting applications should be replaced with the consumption of the transactional and alarm data into a data lake or similar. This allows business intelligence (BI), machine learning (ML), analytics, data science and reporting to be conducted on the data, and facilitates trend analysis to detect patterns that would normally go unnoticed with standard reporting. Understanding trends around occupancy monitoring and building utilisation, for example, can only be completed with speed and confidence if the data being analysed meets the quality data requirements.
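Once access control transactions land in a data lake, even a simple aggregation yields an occupancy trend. A minimal sketch, assuming an illustrative event layout of (timestamp, cardholder, door, direction); real exports would of course be far larger and schema-specific:

```python
from collections import Counter
from datetime import datetime

# Illustrative access-control transactions (assumed layout, not a real export).
events = [
    ("2024-06-03T08:55", "emp001", "LON01-01-RDR-001", "IN"),
    ("2024-06-03T09:02", "emp002", "LON01-01-RDR-001", "IN"),
    ("2024-06-03T12:30", "emp001", "LON01-01-RDR-002", "OUT"),
    ("2024-06-03T17:45", "emp002", "LON01-01-RDR-002", "OUT"),
]

def hourly_entries(events):
    """Count IN transactions per hour: a crude building-utilisation signal."""
    counts = Counter()
    for ts, _, _, direction in events:
        if direction == "IN":
            hour = datetime.fromisoformat(ts).strftime("%H:00")
            counts[hour] += 1
    return dict(counts)

print(hourly_entries(events))  # → {'08:00': 1, '09:00': 1}
```

The same pattern scales up in a BI or analytics platform; the point is that the grouping key only works because direction and timestamp fields are consistently populated.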

SIEM

Analytics engines in combination with security information and event management (SIEM) applications can analyse daily transactions and alarms in context with data from other core business systems, and present them as useful dashboards to inform leadership. Analytics engines and SIEM platforms often exist within the end user’s domain already; here we are simply adding an additional data source. It is worth reiterating at this point that, for this to work effectively, the data being ingested needs to meet the quality data requirements. Any analysis, filtering and presentation is only possible by using specific fields or characters within the ingested data. Ultimately, an event or alarm at this stage is simply a string of text.
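Because the alarm arrives as a string of text, a SIEM can only extract fields from it if the content is predictable. A minimal sketch of that extraction, assuming a hypothetical pipe-delimited message format and the naming convention sketched earlier (neither is specified in the article):

```python
import json
import re

# Hypothetical alarm string as it might arrive at a SIEM ingestion point.
raw = "2024-06-03 02:14:09 | LON01-03-RDR-012 | DOOR FORCED | PRIORITY 1"

FIELD_PATTERN = re.compile(
    r"^(?P<timestamp>\S+ \S+) \| (?P<asset>\S+) \| "
    r"(?P<event>.+?) \| PRIORITY (?P<priority>\d+)$"
)

def parse_event(line: str) -> dict:
    """Turn a pipe-delimited alarm string into a structured record."""
    m = FIELD_PATTERN.match(line)
    if m is None:
        raise ValueError(f"unparseable event: {line!r}")
    record = m.groupdict()
    # Deriving site and floor is only possible because the naming standard
    # encodes them at fixed positions in the asset name.
    site, floor, _, _ = record["asset"].split("-")
    record.update(site=site, floor=floor, priority=int(record["priority"]))
    return record

print(json.dumps(parse_event(raw), indent=2))
```

Note that the site and floor fields used for dashboard filtering fall straight out of the asset name; a free-text name like "Front door reader (temp)" would make the event unparseable, which is exactly the quality-data point being made above.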

About the author

With over ten years’ experience in the security industry, Matt Dillon is an Associate at QCIC. He manages a team within QCIC’s Build department, which provides specialised management and advisory services to its clients. His team executes QCIC’s automated Design, Build, Run methodology, which allows mandated physical security technology solutions to be delivered globally, on time and to an approved standard. Matt has held positions in the integrated security sector, which allows him to work with a client’s internal IT and project teams throughout the delivery process.

The full article is in the June print edition of Professional Security magazine.

