

Data - Process - Output

Data

Actuarial science is how society manages risk of all types: shipping risk, facilities risk, weather risk, investment risk and countless others. Having relevant data is the foundation, the starting point, of managing those risks. But what type of data is most valuable in understanding and managing risk?

Losses, Industry Patterns, Financials, Regulatory Standards, Risk Relationships, Threats, Vulnerabilities, and Capabilities. Together, this data allows patterns to be discovered, parameters to be established, and trends to be determined.

But how much data is enough? In simple terms, there is never “enough” data. Arx Nimbus employs the Thrivaca™ GPF, or Generalized Probability Function: as more data becomes available, the GPF refines its results further. The GPF is architected from the ground up to ingest and balance hundreds of data sources on vulnerabilities, threats, and other indicators of loss and of the probability of loss. This allows Thrivaca technology to provide actuarially correct loss forecasts that stand up to audit and regulatory evaluation, while delivering insurance-grade results in use today by cyber insurers spanning more than 400 industries.
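To make the underlying idea concrete, here is a minimal sketch in Python of how a probability estimate can be refined as more data arrives. It uses a standard Gamma-Poisson Bayesian update with made-up numbers; it illustrates the general actuarial principle only, and is not the proprietary Thrivaca GPF.

# Illustration only: refining an incident-rate estimate as new data arrives.
# All parameter values are hypothetical; this is not the Thrivaca GPF.

# Prior belief about the annual incident rate, e.g. from industry-wide data:
prior_shape, prior_rate = 4.0, 2.0        # Gamma(shape, rate) -> mean 2.0 events/year

# Newly observed, organization-specific data: 7 incidents over 3 years.
observed_incidents, observed_years = 7, 3

# Conjugate Gamma-Poisson update: the posterior is a sharper Gamma distribution.
posterior_shape = prior_shape + observed_incidents
posterior_rate = prior_rate + observed_years

print(f"Prior mean rate:     {prior_shape / prior_rate:.2f} incidents/year")
print(f"Posterior mean rate: {posterior_shape / posterior_rate:.2f} incidents/year")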

Process

Once the data is acquired, a process is still needed to organize it and make sense of it, so that it informs a proper understanding of risk. In the field of accounting, generally accepted accounting principles (GAAP) have been developed since the 1930s to bring consistent results to company financials for investors, management and regulators. In the field of cybersecurity, we have standardized frameworks from NIST, ISO and others that, along with actuarial methods, allow for consistent, auditable and predictable results. This allows cybersecurity efforts to be directed according to accepted principles, rather than the professional opinion and expert judgment of the past.

Loss Classification, Economic Loss Models, Probability Density Functions, Actuarial Processes, Probable Maximum Loss, and Negative Binomial Distributions are all central to these processes. External, independent review of these processes is essential to demonstrate conformance with generally accepted principles.
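As a hedged illustration of two of the terms above, the sketch below simulates annual losses with a negative binomial frequency model and a lognormal severity model, then reads a Probable Maximum Loss off the simulated distribution. The parameter values are placeholders chosen for illustration, not Thrivaca outputs.

# Illustration only: negative binomial frequency + lognormal severity -> PML.
# Parameter values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=7)

# Negative binomial incident counts allow over-dispersion (variance > mean),
# which is common in cyber loss data.
annual_counts = rng.negative_binomial(n=2.0, p=0.4, size=50_000)

# Lognormal economic loss per incident (USD), summed per simulated year.
annual_losses = np.array([
    rng.lognormal(mean=11.5, sigma=1.8, size=k).sum() if k else 0.0
    for k in annual_counts
])

# Probable Maximum Loss read off the simulated distribution at a chosen
# confidence level, here the 99th percentile (a 1-in-100-year loss).
pml_99 = np.percentile(annual_losses, 99)
print(f"Probable Maximum Loss (99th percentile): ${pml_99:,.0f}")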

Output

Data and Process are essential to understanding cyber risk. But we also need the means to express the results in ways that can be put to use by cyber professionals, auditors, board directors, senior management and others. Adherence to accepted frameworks, combined with proven and easy-to-use visualizations, makes that possible.

Maximal Loss, Probable Loss, Remediation Ratios, T-Score, Solutions Navigator, Remediation Selection, and Interactive Risk Factors are all key components. Beyond these, we have found across thousands of organizations that additional analyses expand practical use and support real-world results. These include real-time benchmarking against industry peers, trendline analysis over multi-year periods, pattern analysis of multiple loss types, aggregation of diverse business units to determine correct overall risk, and a variety of “what-if” scenario analyses that show how risk dynamics shift with digital transformation, new product introduction, organizational change, cloud migration and supply chain changes.
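For instance, a “what-if” scenario comparison can be sketched as a simple adjustment of risk factors followed by a re-run of the loss calculation. The factor values, multipliers and additive model below are assumptions for illustration only, not the Thrivaca scenario engine.

# Illustration only: a simple "what-if" scenario comparison.
# Factor values and multipliers are hypothetical assumptions.

baseline_factors = {                       # annualized expected losses (USD)
    "ransomware": 1_200_000,
    "data_breach": 850_000,
    "business_interruption": 600_000,
}

# Assumed scenario: a cloud migration cuts business-interruption exposure by
# 30% but raises data-breach exposure by 10% during the transition.
scenario_multipliers = {"business_interruption": 0.70, "data_breach": 1.10}

def expected_loss(factors, multipliers=None):
    # Sum each loss factor, scaled by its scenario multiplier (default 1.0).
    multipliers = multipliers or {}
    return sum(value * multipliers.get(name, 1.0) for name, value in factors.items())

baseline = expected_loss(baseline_factors)
scenario = expected_loss(baseline_factors, scenario_multipliers)
print(f"Baseline probable loss:   ${baseline:,.0f}")
print(f"Cloud-migration scenario: ${scenario:,.0f} (change: {scenario - baseline:+,.0f})")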