The Economics of Information: A Strategic Framework for Data Valuation and Asset Management
The global economy has undergone a structural transformation where the centre of gravity of value has shifted from physical capital to intangible assets, specifically electronic data and information. During this shift, existing accounting frameworks and national statistical systems haven’t been able to capture the true economic significance of information, treating it as an intermediate expense rather than a durable asset. [1, 2]
This measurement gap has led to a substantial value gap in corporate balance sheets and a systematic underestimation of national wealth in the System of National Accounts (SNA) [Author note: the next blog post will look more closely at large valuation frameworks, including the SNA].[1, 3] The discipline of Infonomics, pioneered by Douglas Laney, provides a rigorous economic framework for managing, accounting for and valuing information with the same formality as traditional assets.[4, 5]
For Chief Financial Officers (CFOs) and Chief Data Officers (CDOs), applying the Infonomics metrics helps optimise capital allocation, improve operational performance and realise the latent potential of the modern organisation's most valuable resource.[1, 6] For CFOs, a large share of the firm's productive capital is invisible to GAAP/IFRS but very visible to markets and rating agencies. For CDOs, data is still funded as IT spend rather than as capital that must earn a return.
This post concludes with a diagnostic framework for identifying six common Data Value Traps that destroy ROI, including the 'Quality Mirage' and the 'Tech Debt Spiral'.
The Infonomics Metrics Framework
The Infonomics methodology categorises data valuation into two distinct and complementary spheres: foundational metrics and financial metrics. Foundational metrics assess the qualitative attributes and operational utility of data, providing a relative score that informs data governance and prioritisation. [1, 6] Financial metrics apply traditional economic valuation principles (cost, market and income) to estimate the monetary value of information assets for balance sheet considerations, mergers and acquisitions and insurance purposes. [6, 7]
Foundational Models: Assessing Quality and Utility
Foundational metrics provide the internal benchmarks necessary for a CDO to manage the data lifecycle effectively. They help identify which datasets are of sufficient quality to support advanced analytics and which require further investment in data engineering. [6, 8]
Intrinsic Value of Information (IVI)
IVI measures the innate characteristics of a data asset, independent of its specific business application. It focuses on the fundamental quality of the data, which serves as potential energy for future value creation.[1, 6] IVI is particularly useful for CDOs in prioritizing data cleaning and master data management efforts.[6]
The mathematical formula for IVI is:
IVI = Validity * Completeness * (1 - Scarcity) * Lifecycle
- Validity: Percentage of records deemed correct.
- Completeness: Percentage of records populated vs. the total potential universe.
- Scarcity: Percentage of the market or competitors who also have this data (lower is better).
- Lifecycle: The usable lifespan of the data, expressed as a fraction of a reference horizon (e.g., remaining months over a 24-month horizon) so that the ideal score remains bounded at 1.0.
The lifecycle variable represents the usable lifespan of the information, acknowledging that the utility of a data record decays over time. [6] An ideal IVI score of 1.0 indicates a dataset that is perfectly accurate, complete and unique, and that retains its utility over the full reference horizon. [6] In the age of LLMs, generic data is no longer scarce at all: everyone can obtain it, so its IVI collapses. Proprietary human-generated context (e.g., internal decision logs), by contrast, has become far scarcer and far more valuable. AI makes generic data cheaper and unique data more valuable.
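As a concrete illustration, here is a minimal Python sketch of the IVI calculation defined above; the input values, and the choice to normalise Lifecycle against a reference horizon so the score stays within 0–1, are assumptions for illustration rather than figures from Laney's work.

```python
def intrinsic_value(validity: float, completeness: float,
                    scarcity: float, lifecycle: float) -> float:
    """Intrinsic Value of Information (IVI) per the formula above.

    All inputs are fractions in [0, 1]:
      validity     - share of records deemed correct
      completeness - share of the potential universe that is populated
      scarcity     - share of the market/competitors who also hold this data
      lifecycle    - remaining useful life as a share of a reference horizon
    """
    return validity * completeness * (1.0 - scarcity) * lifecycle


# Accurate, fairly complete dataset; half the market also holds it; 18 of 24 months left.
print(round(intrinsic_value(0.98, 0.85, scarcity=0.5, lifecycle=18 / 24), 2))  # 0.31
```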
Business Value of Information (BVI)
While IVI measures the purity of the data, BVI evaluates its fitness for specific business processes. A dataset may have high IVI but low BVI if it is not relevant to current strategic objectives or if it is not delivered in a timely manner to decision-makers. [1, 6]
The mathematical formula for BVI is:
BVI = Sum over processes p of (Relevance_p) * Validity * Completeness * Timeliness
- Relevance_p: The usefulness of the data to a specific business process p.
- Timeliness: The probability that the data is current enough to be actionable.
For a CFO, BVI is a critical metric for evaluating the operational efficiency of data-related investments, as it highlights the realised utility of data within the enterprise. [7]
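A short sketch of the BVI calculation under the formula above, assuming the quality terms are shared across processes; the process names and relevance scores are invented for illustration.

```python
def business_value(relevance_by_process: dict[str, float], validity: float,
                   completeness: float, timeliness: float) -> float:
    """Business Value of Information (BVI): summed process-level relevance,
    scaled by validity, completeness and timeliness."""
    return sum(relevance_by_process.values()) * validity * completeness * timeliness


bvi = business_value(
    relevance_by_process={"pricing": 0.9, "churn_model": 0.6, "board_reporting": 0.3},
    validity=0.95,
    completeness=0.80,
    timeliness=0.70,
)
print(round(bvi, 2))  # 0.96
```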
Performance Value of Information (PVI)
PVI measures the actual impact of data on business performance. It is typically derived through controlled experiments where the performance of a business process using the data (the informed group) is compared against a control group that does not use the data. [1, 6]
The mathematical formula for PVI is:
PVI = ((KPI_i / KPI_c) - 1) * (T / t)
- KPI_i: Performance of the process using the data (the informed group).
- KPI_c: Performance of the process without the data (the control group).
- T: Average usable lifespan of the data.
- t: Duration of the experiment.

PVI allows the CDO to demonstrate the lift provided by data and analytics, providing the evidence needed to justify the scaling of successful data initiatives. [6]
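The sketch below shows how PVI might be computed from a simple informed-versus-control trial; the conversion rates, trial length and data lifespan are hypothetical.

```python
def performance_value(kpi_informed: float, kpi_control: float,
                      data_lifespan_days: float, trial_days: float) -> float:
    """Performance Value of Information (PVI): the relative KPI lift of the
    informed group, extrapolated over the data's usable lifespan."""
    lift = (kpi_informed / kpi_control) - 1.0
    return lift * (data_lifespan_days / trial_days)


# 90-day trial: the informed group converts at 5.5% vs 5.0% for the control,
# and the data remains usable for roughly a year.
print(round(performance_value(0.055, 0.050, data_lifespan_days=365, trial_days=90), 2))
# 0.41, i.e. roughly a 41% cumulative lift over the data's lifespan
```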
Financial Models: Quantifying Monetary Value
Financial metrics are designed to provide the CFO with a dollar-denominated value for information assets. These models are derived from established valuation methods used for other intangible assets but are tailored to the non-rivalrous and unique nature of data. [7, 10]
Cost Value of Information (CVI)
CVI adopts a cost-based approach, measuring the financial expenditure required to acquire, process and manage data. [1, 6] This can be considered the floor value for a data asset and is essential for determining insurance coverage or the necessary budget for data recovery systems. [6, 11]
The mathematical formula for CVI is:
CVI = ((Expense * Attribution * T) / t) + LostRevenue
- Expense: Annualised cost of the process used to capture the data.
- Attribution: Percentage of that expense specifically attributable to data capture.
- T / t: The average usable lifespan of the data (T) relative to the period over which the expense accrues (t); for an annualised expense, t is one year.
- LostRevenue: The financial impact if the data were lost or damaged.
While CVI is objective and relatively easy to calculate, it doesn’t capture the potential revenue-generating capacity of data, leading to undervaluation. [12, 13]
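A minimal sketch of the CVI calculation under the definitions above; the expense, attribution rate and loss exposure are assumed figures.

```python
def cost_value(annual_expense: float, attribution: float,
               data_lifespan_years: float, expense_period_years: float,
               lost_revenue: float) -> float:
    """Cost Value of Information (CVI): the attributable cost of capturing the
    data, scaled over its usable lifespan, plus the revenue at risk if lost."""
    capture_cost = (annual_expense * attribution * data_lifespan_years) / expense_period_years
    return capture_cost + lost_revenue


# $2m/year process, 40% attributable to data capture, 3-year lifespan,
# expense measured over 1 year, $500k of revenue at risk if the data were lost.
print(cost_value(2_000_000, 0.40, data_lifespan_years=3,
                 expense_period_years=1, lost_revenue=500_000))  # 2900000.0
```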
Market Value of Information (MVI)
MVI estimates the price the data could command if it were bartered, licensed or sold in the external marketplace. [1] Because data is rarely sold (transferred in ownership) but rather licensed (granted a right of use), the MVI must account for the diminishing marketability of information as it becomes more ubiquitous. [6]
The mathematical formula for MVI is:
MVI = (ExclusivePrice * Licensees) / PremiumFactor
- ExclusivePrice: The price to sell ownership of the data to one buyer.
- Licensees: The number of probable parties who would license the data.
- PremiumFactor: A divisor greater than one that discounts the value as the data becomes more ubiquitous (inverse scarcity). [6]

MVI is a vital metric for organisations pursuing external data monetisation strategies. [1]
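A sketch of the MVI calculation; the exclusive price, number of probable licensees and premium factor below are illustrative assumptions.

```python
def market_value(exclusive_price: float, n_licensees: int,
                 premium_factor: float) -> float:
    """Market Value of Information (MVI): the exclusive-sale price spread across
    probable licensees, divided by a factor reflecting lost exclusivity."""
    return (exclusive_price * n_licensees) / premium_factor


# Data worth $5m to a single exclusive buyer, 20 probable licensees,
# with a 4x discount applied because licensed data is non-exclusive.
print(market_value(5_000_000, n_licensees=20, premium_factor=4))  # 25000000.0
```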
Economic Value of Information (EVI)
EVI is the most effective financial metric for CFOs because it measures the direct contribution of a data asset to the bottom line. [1, 7] It compares the revenue or cost savings generated with the data against the performance without it. [6, 10]
The EVI formula is:
EVI = (Revenue_w - Revenue_wo - LifecycleCost) * (T / t)
- Revenue_w: Revenue generated using the data.
- Revenue_wo: Revenue generated without the data.
- LifecycleCost: The cost to acquire, administer and apply the data.
- T / t: As with PVI, the average usable lifespan of the data relative to the duration of the trial period.
By factoring in the cost to acquire and maintain the asset, EVI describes the net return on information, enabling the CFO to prioritize high-value data initiatives over lower-performing ones. [1, 7]
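A sketch of the EVI calculation, extrapolating a trial-period result over the data's usable lifespan; all figures are hypothetical.

```python
def economic_value(revenue_with: float, revenue_without: float,
                   lifecycle_cost: float, data_lifespan_days: float,
                   trial_days: float) -> float:
    """Economic Value of Information (EVI): net revenue contribution of the
    data during the trial, extrapolated across its usable lifespan."""
    net_contribution = revenue_with - revenue_without - lifecycle_cost
    return net_contribution * (data_lifespan_days / trial_days)


# 90-day trial: revenue of $4.6m with the data vs $4.0m without,
# at a lifecycle cost of $150k, with the data usable for a year.
print(round(economic_value(4_600_000, 4_000_000, 150_000,
                           data_lifespan_days=365, trial_days=90)))  # 1825000
```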
Summary of Infonomics Metrics and Applications
The six metrics and their primary applications can be summarised as follows:
- IVI (foundational): the innate quality of the data (validity, completeness, scarcity, lifecycle); used to prioritise data quality and master data management work.
- BVI (foundational): the fitness of the data for specific business processes; used to evaluate the operational efficiency of data-related investments.
- PVI (foundational): the measured KPI lift from using the data in controlled experiments; used to justify scaling successful data initiatives.
- CVI (financial): the cost to acquire, process and manage the data, plus the revenue at risk if it were lost; used for insurance cover, recovery budgets and floor valuations.
- MVI (financial): the price the data could command if bartered, licensed or sold externally; used to shape external data monetisation strategies.
- EVI (financial): the net revenue or cost-saving contribution of the data versus operating without it; used by the CFO to prioritise and allocate capital.
Approaches to Data Valuation and the Role of Infonomics
The Infonomics framework serves as a translation methodology to understand and apply various approaches to data valuation. Current economic theory generally classifies valuation into cost-based, market-based and income-based (or use-based) categories, each with distinct advantages and limitations in the context of digital assets. [10, 12, 13]
Cost-Based Approaches
Cost-based methods, such as CVI, assume that the value of an asset is at least equal to the cost of its production or replacement. [12, 13] In the public sector and national accounting, the sum-of-costs approach is the standard methodology when market transactions are unavailable. [14] However, cost-based methods are inherently conservative; they provide a lower-bound estimate and can ignore the exponential value creation potential of data. [12, 13] For a CDO, while CVI is useful for operational budgeting, it is insufficient for demonstrating the strategic importance of data to the Board. [1, 7]
Market-Based Approaches
Market-based approaches rely on the existence of a competitive marketplace where similar assets are traded. [10, 13] MVI applies this logic by surveying potential licensees and observing competitor pricing. [6] While market-based methods are conceptually sound, they face significant hurdles in the data economy: markets for data are often opaque, datasets are highly heterogeneous and the price of data is often obscured by bartering or bundled service agreements. [2, 10]
Income-Based and Use-Based Approaches
Income-based approaches, such as EVI, seek to measure the discounted present value of the future cash flows an asset will generate. [10, 12] Use-based methods are useful for capturing the non-rivalrous nature of data—the ability for the same dataset to be used simultaneously across multiple departments to drive multiple value streams. [12] Models such as the Modified Historical Cost Method (MHCM) attempt to refine these valuations by adjusting costs for data-specific characteristics like usage rates, accuracy and purpose-specific depreciation. [12]
Theoretical Comparisons of Valuation Frameworks
- Cost-based (CVI, sum-of-costs): objective and straightforward to calculate, but conservative; it sets a floor value and ignores the revenue-generating potential of the data.
- Market-based (MVI): grounded in observable prices, but data markets are opaque and heterogeneous, and prices are often hidden within barter or bundled service agreements.
- Income- and use-based (EVI, MHCM): captures future cash flows and the non-rivalrous, multi-use nature of data, but depends on attributing outcomes to the data and on forecasting assumptions.
Private Sector Organisational Case Studies in Infonomics Implementation
The practical application of Infonomics has enabled diverse organisations to transition from treating data as a cost of doing business to managing it as a strategic source of revenue and competitive advantage. [15, 16]
PASSUR Aerospace: Monetizing Accuracy (IVI to MVI)
PASSUR Aerospace identified a critical failure in the aviation industry’s reliance on estimated arrival times (ETAs) provided by pilots, which were often inaccurate by several minutes. [15] By installing a global network of passive radar sensors, PASSUR generated a new dataset with near-perfect IVI—accuracy within 30 seconds. [15] This high-quality data was then licensed to airlines, creating an MVI-driven revenue stream. The value proposition was based on the economic value of perfect information, as the increased accuracy allowed airlines to optimise high-cost variables such as ground crew scheduling and fuel loads. [7, 15]
United Airlines: Realizing Economic Value (EVI)
United Airlines provides a clear example of the EVI approach. By licensing PASSUR's high-precision data, United was able to improve its operational performance significantly. At Chicago O'Hare, the more accurate arrival data enabled United to save approximately $1 million per year in costs related to gate management and fuel. [15] Furthermore, the airline reported avoiding two to three flight diversions per week, which carries both direct cost savings and significant reputational benefits. [15] This case demonstrates how a strategic investment in high-IVI external data can yield a substantial EVI.
PEMEX: Predictive Maintenance and Process Optimization
The Mexican state-owned oil company, PEMEX, applied the principles of Infonomics to refinery maintenance. [15] Historically, component failure detection was sensory-based, with engineers listening for unusual noise. [15] By implementing vibration sensors and establishing baseline readings, PEMEX created a high-relevance dataset (high BVI). This transition from reactive to predictive maintenance directly improved KPIs related to refinery uptime, saving the organisation millions in lost production costs. [15]
Babolat: Data-Enabled Product Transformation
Tennis racket manufacturer Babolat utilised sensors to capture data on swing mechanics and ball impact. [15] This shifted Babolat's business model from hardware manufacturing towards data-enabled services. By valuing the data generated by the racket, Babolat could monetise the MVI through a performance-analysis platform, creating a recurring revenue stream decoupled from the physical sale of rackets. [15]
Bell Helicopter: Goal-Oriented Valuation
At Bell Helicopter, a data scientist was tasked with a goal-oriented valuation project: identifying and analysing the data required to sell four additional helicopters. [15] This approach utilises the logic of BVI by focusing solely on the data assets that contribute to a high-value sales outcome. By isolating the data-driven factors in the sales funnel, the organisation could quantify the marginal contribution of information to multi-million-dollar transactions. [15]
Infonomics in the Public Sector: Treasuries and National Statistics Offices
In the public sector, the valuation of data is shifting from a theoretical exercise to a core component of national economic strategy. Treasuries and National Statistics Offices (NSOs) are increasingly tasked with quantifying the value of public sector data to inform policy, justify digital infrastructure spending and accurately reflect national wealth in macroeconomic accounts. [13, 17, 18]
The SNA 2025 Revolution: Data as a Capital Asset
The most significant development in public sector data valuation is the update to the United Nations System of National Accounts in 2025 (SNA 2025). This update marks the first time that data is recognized as a standalone asset category within the production boundary. [3, 18, 19] Historically, data was considered a non-produced asset or an intermediate expense, but the new standard recommends its capitalisation as a produced fixed asset under Intellectual Property Products. [14, 18]
This shift has significant implications for Treasuries:
- GDP Impact: Capitalising public sector data production increases the reported Gross Fixed Capital Formation (GFCF), directly increasing GDP. [14, 18]
- Wealth Recognition: Government balance sheets will now reflect the multi-billion-dollar value of accumulated data assets, such as health records, geospatial data and historical census information. [13, 19, 20]
- Fiscal Policy: By recognising data as a durable asset, Treasuries can more easily justify the long-term debt financing of large-scale digital transformation and AI projects. [3, 18]
The Sum-of-Cost Methodology in the Public Sector
Because public sector data is rarely sold on the open market, NSOs must rely on a cost-based approach to valuation, specifically the sum-of-costs methodology. [14] This method calculates value by identifying the labour and non-labour inputs required to produce the data. [14, 19]
Labour Costs and Involvement Rates
The valuation begins by identifying occupations that contribute to data production as a primary or integral part of their role. The SNA 2025 provides standardised involvement rates to determine the percentage of a worker’s compensation that should be capitalised as data production. [14, 19]
Non-labour Costs and Mark-ups
To determine the full production cost, NSOs apply a mark-up to the identified labour costs. This mark-up accounts for:
- Intermediate Consumption: Software licenses, cloud storage fees and electricity. [14, 21]
- Consumption of Fixed Capital: Depreciation of computers, servers and network infrastructure used in data creation. [14]
- Taxes less Subsidies: Any production-related taxes paid by the producing unit. [14]
The SNA 2025 suggests using a single mark-up based on the ratio of gross output to employee remuneration in the computer programming and information service activities sectors (ISIC divisions 62 and 63). [14] Advanced implementations may apply multiple, industry-specific mark-ups to improve precision. [14]
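As a rough illustration of the sum-of-costs mechanics, the sketch below capitalises the data-related share of labour costs and applies a single mark-up for non-labour costs; the occupations, involvement rates and mark-up ratio are placeholders rather than the SNA's published values.

```python
def sum_of_costs_value(compensation_by_occupation: dict[str, float],
                       involvement_rates: dict[str, float],
                       markup: float) -> float:
    """Sum-of-costs estimate of own-account data production: capitalise the
    data-related share of each occupation's compensation, then apply a mark-up
    covering intermediate consumption, consumption of fixed capital and
    taxes less subsidies."""
    data_labour = sum(compensation * involvement_rates.get(occupation, 0.0)
                      for occupation, compensation in compensation_by_occupation.items())
    return data_labour * (1.0 + markup)


estimate = sum_of_costs_value(
    compensation_by_occupation={"database_administrators": 8_000_000,
                                "statisticians": 12_000_000,
                                "survey_interviewers": 5_000_000},
    involvement_rates={"database_administrators": 0.5,
                       "statisticians": 0.5,
                       "survey_interviewers": 0.2},
    markup=0.45,  # assumed ratio-based mark-up for non-labour costs
)
print(f"{estimate:,.0f}")  # 15,950,000
```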
Case Studies: UK, Canada, and Australia
National governments are already implementing these frameworks to provide experimental estimates of their public sector data wealth. [12, 20]
The United Kingdom: Estimating the Data Dividend
The UK Department for Science, Innovation and Technology (DSIT) and the Office for National Statistics (ONS) have developed a framework for quantifying the value of public sector data. [13, 17] Their initial conservative estimates indicate that the UK public sector invests approximately £30 billion annually in data assets.[13] Using a use-based approach, they estimate that providing external access to this data could generate an additional £15 billion to £200 billion in potential value to the economy by enabling new products and services. [13] This research informs initiatives like the National Data Library, which aims to responsibly increase data access to support the UK's AI Opportunities Action Plan. [13]
Statistics Canada: Measuring the Information Chain
Statistics Canada has been at the forefront of this movement, releasing research that maps the information chain from everyday observations to structured databases and actionable science. [20, 21] In their 2018 experimental estimates, they found that Canadian investment in data products reached $40 billion annually, exceeding investments in industrial machinery and transportation equipment. [20] Their methodology relies heavily on Census of Population and Labour Force Survey data to estimate the cost of own-account data production across the Canadian economy. [21, 22]
The Australian Bureau of Statistics (ABS): Data for Competition
The ABS plays a central role in the Australian Data Strategy, which treats non-sensitive publicly held data as a public good. [23, 24] The Australian Government finance statistics (GFS) framework is used to measure the financial activities of government, including the stock of assets. [25] The ABS is currently undertaking a major project, Big Data, Timely Insights (BDTI), which rebuilds core statistical tools in a cloud-based environment to deliver more granular and timely insights, such as a monthly Consumer Price Index (CPI). [23] For the Australian Treasury, the valuation of data is tied to its role in driving competition and job growth in the digital economy. [24]
Strategic Integration for CFOs and CDOs
To apply Infonomics effectively, CFOs and CDOs need to address the tensions between innovative data valuation and established financial standards. [2, 26, 27]
Accounting Challenges: GAAP vs. IFRS
A significant barrier to the formal recognition of data on balance sheets is the disparity between accounting regimes. US GAAP is historically more conservative, often disallowing the capitalisation of internally generated intangibles. [26] IFRS, conversely, is moving toward fair value accounting, which is more aligned with the economic reality of data. [27, 28]
For the CFO, Infonomics serves as a shadow accounting system. Even if data cannot be officially capitalised under current ASC 820 or IFRS 13 standards, the Infonomics metrics provide the internal visibility needed to manage data as a financial asset. [1, 2] This is particularly relevant in cross-border M&A activity, where being financially bilingual in data valuation can prevent significant deal-value surprises. [27]
The Data Hierarchy of Needs
The application of Infonomics follows a data science hierarchy of needs, where foundational needs must be met before advanced value can be realised. [6]
- Infrastructure (IVI): Pure data and systems (data lakes, pipelines, quality).
- Analysis (BVI): Derived datasets that are fit for business purposes.
- Informed Decision Making (PVI/EVI): Applied insights that drive specific ROI outcomes. [6]
Policy, Compliance, and Ethics
Valuing data also requires a robust framework for compliance and ethics. The economic value of a dataset is drastically reduced if it lacks the necessary consent or is subject to severe regulatory risks like GDPR. [8, 10] Infonomics mandates a systematic approach to:
- Consent: Ensuring explicit permission for secondary uses of data. [8]
- Policy: Delineating data management methodologies and security protocols.[8]
- Risk: Factoring in potential loss and reputational penalties into CVI and EVI calculations.[6, 10]
Conclusion: The Path to Information Maturity
The transition to a data-informed economy requires more than just technological investment; it demands a specific shift in how organisations and governments perceive, measure and manage information. [4, 8, 16] By adopting the Infonomics framework, CFOs and CDOs can move beyond qualitative assertions of data’s importance to a rigorous, quantitative assessment of its economic impact. [1]
For the private sector, this means using EVI to drive capital allocation and MVI to unlock new revenue streams. [1, 7] For the public sector, the adoption of the SNA 2025 provides a once-in-a-generation opportunity for Treasuries and NSOs to accurately reflect the digital infrastructure of the modern state as a cornerstone of national wealth. [14, 18, 19] In an era where data-informed firms dominate the global markets, the ability to value information is a strategic imperative for long-term competitiveness and sustainable economic growth. [4, 12]
Appendix: Category Interactions
The Infonomics metrics interact in predictable ways, producing either project-defeating traps (data value destroyed despite its potential) or project-boosting flywheels (virtuous, compounding data value). These patterns emerge because the six metrics (IVI, BVI, PVI, CVI, MVI, EVI) form a value chain in which early-stage gaps cascade and degrade later-stage metrics.
Project-Defeating Traps (Value Destroyed)
Decision rule: Any red flags (high CVI + low PVI/EVI, or foundational gaps) → project cessation or remediation.
Project-Boosting Flywheels (Value Compounded)
Decision rule: Green lights (foundational metrics translating into financial lift) → scale aggressively and reinvest 20–30% of the realised gains into the compounding metrics.
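The decision rules above can be expressed as a simple triage function; the thresholds below are illustrative assumptions, not part of the Infonomics framework.

```python
def triage_data_project(ivi: float, bvi: float, pvi: float,
                        evi: float, cvi: float) -> str:
    """Rough triage against the decision rules above.
    IVI/BVI/PVI are dimensionless scores; EVI and CVI are currency amounts.
    Thresholds are illustrative assumptions."""
    foundational_gap = ivi < 0.3 or bvi < 0.3
    high_cost_low_return = cvi > 0 and evi < cvi and pvi < 0.1
    if foundational_gap or high_cost_low_return:
        return "cease or remediate"           # project-defeating trap
    if pvi >= 0.1 and evi > cvi:
        return "scale and reinvest 20-30%"    # project-boosting flywheel
    return "monitor"


print(triage_data_project(ivi=0.6, bvi=0.7, pvi=0.25,
                          evi=3_000_000, cvi=1_200_000))
# scale and reinvest 20-30%
```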
Sources
1. Infonomics and You [video] - IRI, https://www.iri.com/blog/iri/business/infonomics-and-you/
2. Accounting for Digital Assets: Recognition Challenges and Valuation Models under U.S. GAAP vs. IFRS - ResearchGate, https://www.researchgate.net/publication/399614428_Accounting_for_Digital_Assets_Recognition_Challenges_and_Valuation_Models_under_US_GAAP_vs_IFRS
3. Cartography Between Worlds – The 2025 update to the system of national accounts re-opens debates on key economic measures - LSE Blogs, https://blogs.lse.ac.uk/impactofsocialsciences/2025/08/26/cartography-between-worlds-the-2025-update-to-the-system-of-national-accounts-re-opens-debates-on-key-economic-measures/
4. Infonomics and the Value of Information in the Digital Economy - ResearchGate, https://www.researchgate.net/publication/282555832_Infonomics_and_the_Value_of_Information_in_the_Digital_Economy
5. Data: The new differentiator in manufacturing analytics - West Monroe, https://www.westmonroe.com/insights/the-new-differentiator-in-manufacturing-analytics
6. The Art and Science of Measuring Data Teams Value - Airbyte, https://airbyte.com/blog/measuring-data-teams-value
7. Data: Our most valuable resource, without accounting monetary value - BIP Group, https://www.bip-group.com/en-uk/insights/data-our-most-valuable-resource-without-accounting-monetary-value/
8. Harnessing the potential of AI and data analytics for infonomics in developing countries: an architectural model and framework - Emerald Publishing, https://www.emerald.com/idd/article/doi/10.1108/IDD-10-2024-0155/1270326/Harnessing-the-potential-of-AI-and-data-analytics
9. Infonomics of Autonomous Digital Twins - Dr. Istvan David, https://istvandavid.com/files/infonomics-of-autonomous-dt-CAiSE2024.pdf
10. Data valuation - Wikipedia, https://en.wikipedia.org/wiki/Data_valuation
11. Chapter 3: Cost and the Value of Data - National Academies of Sciences, Engineering, and Medicine, https://www.nationalacademies.org/read/25639/chapter/5
12. What is the Value of Data? - Bennett Institute for Public Policy, https://bennettschool.cam.ac.uk/wp-content/uploads/2022/07/policy-brief_what-is-the-value-of-data.pdf
13. Value of Public Sector Data Estimate - GOV.UK, https://www.gov.uk/government/publications/value-of-public-sector-data-estimate/value-of-public-sector-data-estimate
14. Handbook on measuring data in the System of National Accounts - UN Statistics Division, https://unstats.un.org/UNSDWebsite/statcom/session_56/documents/BG-3a-ISWGNA-BG-Data-Handbook-E.pdf
15. Private-Sector Applications of Data Science - CIA, https://www.cia.gov/resources/csi/static/private-sector-applications.pdf
16. Infonomics: How to Monetize, Manage, and Measure Information - Datentreiber, https://www.datentreiber.com/insights/blog/infonomics-how-to-monetize-manage-and-measure-information-as-an-asset-for-competitive-advantage/
17. Value of Public Sector Data Estimate - GOV.UK, https://www.gov.uk/government/publications/value-of-public-sector-data-estimate
18. System of National Accounts 2025 - UN Statistics Division, https://unstats.un.org/unsd/nationalaccount/snaupdate/2025/2025_SNA_Combined.pdf
19. Recording Data as an Asset in the National Accounts - IARIW 2024, https://iariw.org/wp-content/uploads/2024/08/Recording_Data_as_an_Asset_in_the_National_Accounts__Practical_Guide_.pdf
20. The Daily — Study: The value of data in Canada: Experimental estimates - Statistics Canada, https://www150.statcan.gc.ca/n1/daily-quotidien/190710/dq190710a-eng.htm
21. The value of data in Canada: Experimental estimates - Statistics Canada, https://www150.statcan.gc.ca/n1/pub/13-605-x/2019001/article/00009-eng.htm
22. The value of data in Canada: Experimental estimates (PDF) - Statistics Canada, https://www150.statcan.gc.ca/n1/pub/13-605-x/2019001/article/00009-eng.pdf
23. 2024–25 Australian Bureau of Statistics Corporate Plan - ABS, https://previewapi.transparency.gov.au/delivery/assets/80a82ed1-3e33-027b-b7e0-6493f97f18f8/2e9625a2-20e8-4231-900c-b6ac62c6d96c/2024-25%20Australian%20Bureau%20of%20Statistics%20Corporate%20Plan.pdf
24. Australian Data Strategy - Department of Finance, https://www.finance.gov.au/sites/default/files/2022-10/australian-data-strategy.pdf
25. Government Finance Statistics, Annual methodology, 2023-24 financial year - ABS, https://www.abs.gov.au/methodologies/government-finance-statistics-annual-methodology/2023-24
26. Lower of Cost or Market Inventory Valuation: IFRS Versus US GAAP - ScholarWorks@CWU, https://digitalcommons.cwu.edu/cgi/viewcontent.cgi?article=1022&context=cobfac
27. IFRS and US GAAP: similarities and differences (2022) - PwC Viewpoint, https://viewpoint.pwc.com/dt/us/en/pwc/accounting_guides/ifrs_and_us_gaap_sim/assets/pwcifrsusgaap0222.pdf
28. The Effect of the Mandatory Application of IFRS on the Value Relevance of Accounting Data: Some Evidence from Greece - ERSJ, https://ersj.eu/journal/211/download/The+Effect+of+the+Mandatory+Application+of+IFRS+on+the+Value+Relevance+of