The Economic Architecture of Data Assets in the Age of Artificial Intelligence

The transition of the global economy from an industrial paradigm rooted in physical capital to a digital era defined by intangible assets represents a structural shift of unprecedented magnitude. Within this transformation, data has emerged as more than a byproduct of activity: it is a primary factor of production, altering the mechanisms of value creation and competitive advantage.

Modern data assets, particularly those leveraged by artificial intelligence (AI), possess unique economic characteristics that distinguish them from traditional capital. These assets are non-rivalrous, capable of generating increasing returns to scale and characterized by near-zero marginal costs for replication. However, they remain largely obscured within the traditional accounting frameworks used to evaluate corporate health and national wealth.

The defining economic impact of AI and data assets can be understood through the lens of total factor productivity (TFP). While previous technologies such as electricity and computing followed a multi-decade arc of invention, adoption and eventual saturation, AI is expected to compress this timeline significantly. Some estimates suggest that generative AI could increase global GDP by as much as 7%, or roughly $7 trillion, over the next decade, while more conservative models project a permanent increase in the level of economic activity of 1.5% by 2035 and up to 3.7% by 2075.

However, the realization of these gains is contingent upon overcoming the productivity paradox (the observed lag between the deployment of a new technology and any measurable increase in output), which is currently visible in the massive capital expenditure on data centres and AI infrastructure that has yet to show up in broader productivity statistics.

The Ontology of Data: Non-Rivalry and the Economics of Information

To define the economic characteristics of data, we must first address its status as a non-rivalrous good. Most goods are rival: a machine used in one factory cannot simultaneously be used in another, and an hour of an accountant's time sold to one client is unavailable to another. Data, however, is infinitely usable. A single dataset, whether comprising customer preferences, medical records or sensor logs from autonomous vehicles, can be utilized by an unlimited number of firms, researchers or machine learning algorithms simultaneously without being diminished or depleted.

This non-rivalry leads to increasing returns to scale, a property that underpins the dominance of modern technology firms. When data is used as an input to improve the quality of ideas or algorithms, the resulting productivity gains can be applied across the entire firm's output. In a production function where output (Y) is a result of ideas (I) and labour (L), and where ideas are improved by data (D), the non-rival nature of data allows for a geometric multiplier effect on economic growth.
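A minimal way to formalise this intuition, loosely in the spirit of the Jones–Tonetti work on data non-rivalry cited in the sources (the functional forms below are illustrative assumptions, not a reproduction of any specific model):

```latex
% Illustrative (assumed) production structure: output Y depends on labour L
% and an idea stock I, which is improved by the data stock D.
\begin{align*}
  Y &= I^{\sigma} L, \qquad \sigma > 0,\\
  I &= g(D), \qquad g'(D) > 0.
\end{align*}
% Because D is non-rival, the same dataset enters g(.) for every product
% line, plant or firm that uses it; output scales with the number of users
% of the data rather than being divided among them, which is the source of
% the increasing returns described above.
```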

Table 1. Theoretical Comparisons of Capital Characteristics

| Asset Characteristic | Physical Capital (Rival) | Data Assets (Non-Rival) |
| --- | --- | --- |
| Marginal Cost of Use | Positive and significant | Near-zero |
| Opportunity Cost | High (exclusive use) | Low (simultaneous use) |
| Depletion | Physical wear and tear | Does not degrade; may gain value |
| Replication | Costly and time-consuming | Instantaneous and low-cost |
| Scaling Dynamics | Diminishing returns | Increasing returns |
| Ownership Model | Direct control and possession | Access, licensing & orchestration |

The distinction between data and ideas is important for modelling purposes. Ideas are the blueprints or instructions for production, whereas data is the factor used to produce or refine those ideas. While the underlying idea of a neural network may be public knowledge, the proprietary data used to train it remains highly excludable through encryption and legal protections.

This excludability allows firms to capture private rents from a non-rival asset, even as the social optimum would suggest that data should be used as broadly as possible to maximize scale effects. The current equilibrium often results in data hoarding, where firms protect their data to prevent creative destruction by competitors, leading to a sub-optimal level of innovation from a societal perspective.

Macroeconomic Projections and the Role of Artificial Intelligence

The macroeconomic impact of data assets is currently being reshaped by the rapid proliferation of generative AI (although the long-term value-add of generative AI is very much up for debate). Research indicates that approximately 40% of current GDP in advanced economies is significantly exposed to AI-driven transformation. This exposure is not uniform; occupations at the 80th percentile of earnings are the most exposed, with roughly half of their tasks susceptible to automation or significant augmentation. This suggests that AI serves as a productivity multiplier for high-skill labour, potentially widening the gap between technologically advanced sectors and the rest of the economy.

Table 2. Projected GDP and Productivity Gains from AI Adoption

| Impact Category | Peak Contribution | Long-Term Effect (2075) | Primary Mechanism |
| --- | --- | --- | --- |
| Annual Productivity Growth | +0.2 percentage points (2032) | +0.04 percentage points | Task automation and efficiency |
| Total Factor Productivity | ~1.5% level increase (2035) | ~3.7% level increase | Knowledge compounding |
| Global GDP Boost | $7 trillion (over 10 years) | Permanent level shift | Optimized capital allocation |
| Fiscal Impact | $400 billion deficit reduction | Long-term budget stability | Increased tax base from growth |

While the estimates from international institutions such as the IMF and OECD suggest substantial gains, there is a notable divergence of opinion regarding the timeline. Some models, such as the Acemoglu framework, offer more conservative estimates, suggesting that if only 5% of tasks are profitably automated over the next decade, the GDP boost may be closer to 1.1% rather than the more aggressive forecasts of 7%.
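The order of magnitude of that conservative figure can be rationalised with a Hulten-style aggregation, where the economy-wide effect is roughly the share of tasks automated multiplied by the average cost saving on those tasks; the 22% average cost saving used below is an assumed illustrative value, not a number reported in the sources above.

```latex
% Back-of-envelope aggregation (illustrative figures only):
\[
  \frac{\Delta \mathrm{GDP}}{\mathrm{GDP}}
  \;\approx\;
  s_{\text{automated}} \times \bar{c}
  \;\approx\; 0.05 \times 0.22 \;\approx\; 1.1\%,
\]
where $s_{\text{automated}}$ is the share of tasks profitably automated and
$\bar{c}$ is the assumed average cost saving on those tasks.
```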

This caution is rooted in the persistence of hard tasks, such as complex medical diagnosis or nuanced social interaction, where AI productivity gains may be more limited in the near term. Furthermore, the historical performance of the US economy shows that even during periods of rapid general-purpose technology adoption, such as the introduction of the internet or widespread computerization, real per capita GDP has maintained a steady growth rate of roughly 2% per annum, suggesting that AI may prove a sustaining rather than a purely disruptive force in the long run.

Microeconomic Dynamics: Data-Enabled Learning and Network Effects

At the firm level, the economic impact of data assets is manifested through data-enabled learning. Unlike traditional learning-by-doing, which reduces marginal production costs over time, data-enabled learning increases a customer's willingness to pay by enabling hyper-personalization and rapid product iteration. This creates the self-reinforcing cycle of a data network effect: as a firm attracts more users, it gathers more data; this data is used to train AI models that improve the product, which in turn attracts more users.
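As a toy illustration of how that loop compounds (the parameters and functional forms below are invented for the sketch and are not drawn from the cited research), a few lines of Python are enough to show the flywheel: users generate data, data improves model quality with diminishing returns, and quality in turn attracts users.

```python
import math

# Toy data-flywheel: users -> data -> model quality -> user growth.
# All parameters are illustrative assumptions, not empirical estimates.
def simulate_flywheel(initial_users=1_000, periods=12,
                      data_per_user=5.0,       # observations contributed per user per period
                      learning_rate=0.08,      # how strongly data volume lifts model quality
                      growth_sensitivity=0.5): # how strongly quality lifts user growth
    users, data, history = initial_users, 0.0, []
    for t in range(periods):
        data += users * data_per_user                       # more users -> more data
        quality = learning_rate * math.log1p(data)          # diminishing returns to data
        users = users * (1 + growth_sensitivity * quality)  # better product -> more users
        history.append((t, int(users), round(quality, 3)))
    return history

if __name__ == "__main__":
    for period, users, quality in simulate_flywheel():
        print(f"period={period:2d}  users={users:>9,}  model_quality={quality}")
```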

Notable examples of this dynamic include search engines and navigation apps, where traffic predictions and results improve as more drivers and searchers provide real-time data. This cycle can create significant barriers to entry for new competitors, who lack the historical data corpus necessary to match the performance of the incumbent's AI models. However, recent research suggests that the value in the AI era is shifting from the ownership of generic models to the orchestration of data interactions. As open-source models become commoditized, the economic centre of gravity moves toward how these models are embedded in unique enterprise workflows and connected to real-time, proprietary data streams.

Table 3. The Evolution of Competitive Moats in the Data Economy

| Moat Component | Industrial Era | Digital/Data Era | AI/Interaction Era |
| --- | --- | --- | --- |
| Primary Asset | Physical Infrastructure | Software/Database | Integrated AI Networks |
| Scaling Mechanism | Economies of Scale | Data Network Effects | Interaction Compounding |
| Source of Advantage | Low Marginal Cost | High Switching Costs | Institutional Memory |
| Value Driver | Standardized Output | Personalized Content | Adaptive Orchestration |

The concept of network intelligence compounding suggests that the value of an AI asset grows exponentially as it interacts with more systems, agents and human workflows. This shift implies that while individual AI models may depreciate in value as technology advances, the underlying network of data interactions and the context it encodes for an enterprise will likely appreciate over time. This transition from cost-centre thinking to strategic multiplier thinking is a key differentiator for firms successfully navigating the AI revolution.

The Accounting Gap: Why Data is Missing from the Balance Sheet

One of the most significant challenges in defining the economic impact of data is its omission from corporate balance sheets. Under current U.S. GAAP and IFRS standards, data is largely excluded from financial reporting, contributing to the value gap where market valuations for data-rich companies can exceed their book values by a factor of ten or more. This absence is a result of several fundamental accounting hurdles.

First, the issue of costing remains paramount. For an asset to be recognized, it must have an identifiable fair market value. While companies recognize data as a monetizable asset, they struggle to quantify the costs associated with the entire data lifecycle, from origination and cleansing to storage and consumption, making it difficult for independent auditors to agree on a valuation. Furthermore, GAAP and IFRS generally prohibit the capitalization of internally developed intangible assets. A logo or a customer list developed by a firm does not appear on its balance sheet, whereas the same assets acquired through a merger or acquisition are recognized on the acquirer's balance sheet, either as identifiable intangibles or as part of goodwill.

Second, the traditional concept of depreciation is ill-suited to data assets. While physical assets lose value predictably as they age, the value trajectory of data is far less certain. Master data, such as records of business entities and products, often gains value as it is shared and integrated across an organization. Conversely, transactional data, such as a single invoice, may lose its relevance and value within days or weeks. The lack of specific, regulated depreciation methods for data assets prevents their formal inclusion in financial statements.
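To make the contrast concrete, here is a small sketch of the two value trajectories; the appreciation and decay rates are purely hypothetical, chosen only to illustrate why a single depreciation schedule fits data poorly.

```python
# Hypothetical value paths for two classes of data asset.
# Rates are illustrative assumptions, not accounting guidance.
def master_data_value(initial_value, months, integration_gain=0.03):
    """Master data: appreciates as it is integrated across more systems."""
    return [initial_value * (1 + integration_gain) ** m for m in range(months)]

def transactional_data_value(initial_value, months, monthly_decay=0.40):
    """Transactional data: relevance decays quickly after the event it records."""
    return [initial_value * (1 - monthly_decay) ** m for m in range(months)]

if __name__ == "__main__":
    horizon = 12
    master = master_data_value(100.0, horizon)
    transactional = transactional_data_value(100.0, horizon)
    for m in range(horizon):
        print(f"month {m:2d}: master={master[m]:7.1f}  transactional={transactional[m]:7.1f}")
```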

Table 4. Challenges to Data Asset Recognition on Balance Sheets

| Challenge | Impact on Reporting | Root Cause |
| --- | --- | --- |
| Valuation Subjectivity | Inconsistent asset figures | Data value is highly contextual to the user. |
| Internally Developed Assets | Assets omitted from books | Conservatism in GAAP/IFRS against self-valuation. |
| Depreciation Ambiguity | Unclear asset life | Data can both gain and lose value over time. |
| Liability Risk | High conversion rate | Data can become a liability instantly (e.g., breach). |
| Contextual Dependence | Non-transferable value | Value often disappears during a change in ownership. |

Third, the conversion rate from asset to liability is significantly higher for data than for tangible assets. A security breach or a compliance failure under regulations like GDPR can instantly turn a valuable database into a legal and financial liability, as evidenced by the rapid insolvency of companies following major data-related scandals. Because businesses dislike the unpredictability and vagueness that data introduces to asset valuation, they have traditionally preferred to expense data-related costs rather than capitalize them.

Valuation Frameworks: Income, Cost, and Market Approaches

In the absence of formal accounting recognition, several valuation frameworks have been developed to help organizations and investors quantify the value of data. These are typically adapted from established practices for other intangible assets.

The Income Approach is the most relevant for data that directly impacts cash flow. This includes methods like the relief-from-royalty method, which estimates the royalties a company avoids by owning the data rather than licensing it, and the with-and-without method, which compares the projected performance of a company with the data against its performance without it. This approach is particularly effective for AI models and data platforms that provide a clear uplift in revenue or a reduction in operational costs.
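A minimal sketch of the with-and-without method under the income approach (the cash-flow figures, discount rate and horizon are hypothetical; a real engagement would also model a ramp-down period and tax effects):

```python
# With-and-without method (income approach), illustrative only.
def npv(cash_flows, discount_rate):
    """Discount a list of annual cash flows (year 1..n) to present value."""
    return sum(cf / (1 + discount_rate) ** (year + 1)
               for year, cf in enumerate(cash_flows))

# Hypothetical five-year projections (in $ millions).
cash_flows_with_data = [12.0, 14.0, 16.5, 18.0, 19.0]     # firm keeps its data asset
cash_flows_without_data = [10.0, 10.8, 11.5, 12.0, 12.4]  # same firm, no data asset
discount_rate = 0.12                                      # assumed cost of capital

data_asset_value = (npv(cash_flows_with_data, discount_rate)
                    - npv(cash_flows_without_data, discount_rate))
print(f"Indicated data asset value: ${data_asset_value:.1f}M")
```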

The Cost Approach values an asset based on what it would cost to recreate or replace it today. This distinguishes between reproduction (an identical copy) and replacement (a functionally equivalent substitute). This approach must account for the developer's profit and entrepreneurial incentive—the margin a third party would require to build the asset or the motivation for an owner to develop it internally. However, the cost approach is criticized for failing to capture the potential value the data generates, and it must be adjusted for technological obsolescence, where newer methods can produce the same insights for a fraction of the original cost.
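The cost approach can be sketched in the same style; the cost build-up, profit margins and obsolescence haircut below are assumptions for illustration rather than prescribed inputs.

```python
# Replacement cost new, less obsolescence -- illustrative figures only.
def cost_approach_value(direct_costs, indirect_costs,
                        developer_profit=0.15,           # margin a third-party builder would require
                        entrepreneurial_incentive=0.10,  # return for taking on the build
                        obsolescence=0.25):              # cheaper modern methods reduce replacement cost
    replacement_cost_new = (direct_costs + indirect_costs) \
        * (1 + developer_profit) * (1 + entrepreneurial_incentive)
    return replacement_cost_new * (1 - obsolescence)

# Hypothetical build-out of a functionally equivalent dataset ($ millions).
print(f"Indicated value: ${cost_approach_value(direct_costs=6.0, indirect_costs=2.0):.2f}M")
```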

The Market Approach relies on comparing the data asset to similar assets sold in an active market. While theoretically sound, this is rarely used in isolation because data markets are still maturing and lack the transparency of traditional asset markets. It is most useful as a reasonableness check alongside other analyses.

The Infonomics Framework: Six Models for Information Valuation

A more granular approach, known as Infonomics, identifies six specific models for measuring the value of information assets, divided into foundational (non-financial) and financial measures.

Foundational Measures

  1. Intrinsic Value of Information (IVI): Measures data quality metrics such as correctness, completeness and scarcity. This model looks at how good the data is inherently.
  2. Business Value of Information (BVI): Evaluates the data against specific business processes and initiatives. It considers timeliness—even accurate data is valueless if it arrives too late for a decision.
  3. Performance Value of Information (PVI): An empirical measure of how the data impacts key business drivers and KPIs, often using control group studies to isolate the data's effect.

Financial Measures

  1. Cost Value of Information (CVI): Calculates the cost to acquire, administer and store the data, or the financial impact if the data were lost.
  2. Market Value of Information (MVI): The revenue potential from selling, trading or bartering the information asset in a marketplace.
  3. Economic Value of Information (EVI): This measures the data's contribution to the bottom line by comparing revenue generated with and without the asset, minus its operational costs over its usable life.

The implementation of these models allows firms to identify valuation gaps—the difference between the potential and realized value of their data—and prioritize IT and business initiatives accordingly. By treating data with the same rigor as traditional assets, organizations can move from lip service regarding their data resources to actual strategic deployment.
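Taking the Economic Value of Information (EVI) model as an example, here is a simple sketch of how such a valuation gap might be quantified; the figures and the exact formula are illustrative assumptions, since Infonomics practitioners tailor these measures to their own cost and revenue data.

```python
# Illustrative EVI-style calculation: contribution of an information asset
# over its usable life, net of the cost to acquire, administer and store it.
def economic_value_of_information(revenue_with, revenue_without,
                                  lifecycle_costs, usable_life_years):
    annual_uplift = revenue_with - revenue_without
    return annual_uplift * usable_life_years - lifecycle_costs

# Hypothetical inputs ($ millions).
realized = economic_value_of_information(
    revenue_with=25.0, revenue_without=22.0,
    lifecycle_costs=4.5, usable_life_years=3)

potential = economic_value_of_information(
    revenue_with=28.0, revenue_without=22.0,  # assumes better data quality / broader use
    lifecycle_costs=5.0, usable_life_years=3)

print(f"Realized EVI:  ${realized:.1f}M")
print(f"Potential EVI: ${potential:.1f}M")
print(f"Valuation gap: ${potential - realized:.1f}M")
```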

Regulatory Evolution: ASU 2025-06 and the Future of Software Accounting

The accounting landscape is beginning to shift in response to the demands of the digital economy. In September 2025, the Financial Accounting Standards Board (FASB) released ASU 2025-06, which provides targeted improvements to the accounting for internal-use software costs. This update is specifically designed to address the move from linear, waterfall software development to the agile and iterative methods common in modern AI and data projects.

The new standard replaces the old three-stage framework (preliminary project, application development and post-implementation) with a more flexible approach. Capitalization of costs may now begin once management has authorized the project, committed to funding, and it is 'probable' that the project will be completed and used for its intended function. For novel or untested technologies such as emerging AI models, capitalization is deferred while significant development uncertainty remains. This requirement ensures that purely experimental research is expensed, while mature development that is likely to generate economic benefit can be recognized as an asset.
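A rough decision sketch of that capitalization test follows; this is a simplified reading of the criteria described above, not authoritative guidance, and real assessments involve judgment well beyond a boolean check.

```python
from dataclasses import dataclass

@dataclass
class SoftwareProject:
    management_authorized: bool        # project formally authorized by management
    funding_committed: bool            # resources committed to the work
    completion_probable: bool          # probable it will be completed and used as intended
    significant_dev_uncertainty: bool  # e.g. an unproven AI technique still being researched

def should_capitalize(project: SoftwareProject) -> bool:
    """Simplified sketch of an ASU 2025-06-style capitalization test."""
    threshold_met = (project.management_authorized
                     and project.funding_committed
                     and project.completion_probable)
    # Costs stay expensed while significant development uncertainty remains.
    return threshold_met and not project.significant_dev_uncertainty

# Example: an experimental AI prototype vs. a funded production feature.
prototype = SoftwareProject(True, True, False, True)
feature = SoftwareProject(True, True, True, False)
print(should_capitalize(prototype))  # False -> expense as incurred
print(should_capitalize(feature))    # True  -> begin capitalizing eligible costs
```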

Table 5. Key Modifications in ASU 2025-06 vs. Previous GAAP

| Feature | Previous Guidance (350-40) | New Guidance (ASU 2025-06) |
| --- | --- | --- |
| Project Structure | Rigid, linear three-stage model | Principles-based, flexible framework |
| Capitalization Trigger | Successful completion of preliminary stage | Authorization + Funding + Probable Completion |
| Agile Applicability | Difficult to apply to iterative work | Neutral to development methodology |
| Novelty Clause | No explicit uncertainty threshold | Prohibits capitalization of "unproven" tech |
| Disclosures | Focused on general intangibles | Aligned with Property, Plant & Equipment (PP&E) |

This regulatory modernisation aims to provide more decision-useful information to investors while reducing the complexity for reporting entities. By aligning software disclosures with those of property, plant and equipment, the FASB is signalling that internal software and the data systems that support it should be viewed with the same permanence and strategic importance as a physical factory.

Externalities and the Social Value of Data: GDPR and Cybersecurity

A comprehensive analysis of data assets must also account for the externalities they generate. An externality arises when the costs or benefits of a firm's decisions fall on parties outside the transaction and are therefore not reflected in the firm's investment calculus. In the realm of data, these externalities are primarily related to cybersecurity and privacy.

The implementation of the General Data Protection Regulation (GDPR) in the EU has provided a laboratory for studying these effects. While often discussed in terms of its compliance costs, GDPR has generated significant societal benefits by reducing the opacity of data security practices. By mandating that firms notify authorities and individuals of data breaches, the regulation has lowered the information asymmetry that previously allowed firms to under-invest in security without repercussions. Economists have found that these notification requirements have led to a 2.5% to 6.1% decrease in identity theft incidents, representing avoided losses of between €585 million and €1.4 billion annually.
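As a rough consistency check on those figures (the implied baseline below is a back-of-envelope inference, not a number reported by the cited study), the two ranges point to an underlying pool of identity-theft losses of roughly EUR 23 billion per year:

```latex
% Implied baseline losses B, assuming avoided losses = reduction rate x B:
\[
  B \;\approx\; \frac{\text{EUR } 585\text{M}}{0.025} \;\approx\; \text{EUR } 23.4\text{B},
  \qquad
  B \;\approx\; \frac{\text{EUR } 1.4\text{B}}{0.061} \;\approx\; \text{EUR } 23.0\text{B}.
\]
```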

Furthermore, the collective investment in data security creates a virtuous circle. When companies within a sector adhere to high security standards, they reduce the overall profitability of cybercrime, as successful attacks become rarer. This forces cybercriminals to adjust their strategies, potentially reducing the frequency of high-ransom demands that plague data-intensive industries. However, if consumers lose trust in digital services due to frequent breaches or invasive data practices, the potential for economic development and inclusion, particularly in emerging markets, can be severely impeded.

Data as a Strategic Multiplier

The economic impact of modern data assets is not a singular event but an ongoing transformation that redefines the relationship between capital, labour and productivity. As AI matures from an experimental novelty into a core infrastructure layer, the firms that will excel are those that move beyond the ownership of raw data to the orchestration of data interactions.

From a macroeconomic perspective, the surge in AI-driven productivity is likely to be a decade-long process, requiring large-scale investment in complementary infrastructure and a reconfiguration of work practices. While the Solow paradox may temporarily persist, the underlying characteristics of data—its non-rivalry and increasing returns—suggest that the long-term economic gains will be substantial, permanent and potentially redistributive across global regions.

On the balance sheet, the value gap will continue to challenge auditors and investors until a broader revaluation of intangible assets is adopted. However, the introduction of ASU 2025-06 marks a significant step toward recognizing the capital nature of digital development. Organisations that adopt rigorous valuation models, such as those found in Infonomics, will be better positioned to communicate their true value to the market and to allocate capital more efficiently.

Data is the new oil only in the sense that it must be refined to be useful. In its raw form, it is an abundant, non-rival resource. Refined through AI, it becomes a strategic multiplier capable of driving unprecedented growth. The challenge for modern enterprises and policymakers alike is to build the architectural, legal, and accounting frameworks necessary to harness this multiplier while mitigating the risks of market concentration and social externality.

Concluding Frameworks for Strategic Implementation

To successfully operationalize the value of data and AI assets, leaders across finance, strategy and IT should align on a unified approach to measurement and reporting.

  1. Strategic Allocation: CFOs should shift their focus from expensing IT costs to identifying and capitalizing probable-to-complete software projects that add unique functionality, as permitted under the new ASU 2025-06 guidance. This requires rigorous documentation of management authorization and funding commitments.
  2. Value Identification: Organizations should utilise foundational valuation models (IVI, BVI, PVI) to assess the health of their data assets. This allows for the disposal of data debt—information that costs more to store than its potential economic benefit—while prioritising high-quality, high-relevance datasets for AI training.
  3. Competitive Moats: Strategy teams should recognize that AI models are depreciating assets, while the networks they create are appreciating ones. Advantage is built through interaction compounding, where the integration of AI into complex, multi-step workflows creates high switching costs and unique institutional memory.
  4. External Risk Management: Data security should be viewed as a value-preserving investment rather than a cost centre. By aligning with frameworks like GDPR, firms mitigate legal risk and build the consumer trust necessary to maintain their data network effects.

The mastery of data economics will be the defining characteristic of high-performance organizations. The shift from tangible factories to digital networks is not just a change in asset type, but a fundamental change in the rules of growth and valuation.

Sources

  1. Intangible capital, non-rivalry, and growth - LSE, https://www.lse.ac.uk/economics/Assets/Documents/Jan-Eberly-May-2024.pdf
  2. Nonrivalry and the Economics of Data - Christopher Tonetti, https://christophertonetti.com/files/papers/JonesTonetti_DataNonrivalry.pdf
  3. Data, Intangible Capital, and Productivity - International Monetary Fund, https://www.imf.org/-/media/files/news/seminars/2022/10th-stats-forum/session-iv-10th-statforum-carol-corrado-et-al-criw-data-and-productivity-8mar22-for-imf.pdf
  4. The macroeconomic effects of artificial intelligence - Banco Santander, https://www.santander.com/en/press-room/the-year-ahead-2025/the-macroeconomic-effects-of-artificial-intelligence
  5. The Projected Impact of Generative AI on Future Productivity Growth ..., https://budgetmodel.wharton.upenn.edu/issues/2025/9/8/projected-impact-of-generative-ai-on-future-productivity-growth
  6. A new look at the economics of AI | MIT Sloan, https://mitsloan.mit.edu/ideas-made-to-matter/a-new-look-economics-ai
  7. The Macroeconomic Consequences of AI - United Nations Development Programme, https://www.undp.org/sites/g/files/zskgke326/files/2025-12/the-macroeconomic-consequences-of-ai.pdf
  8. Nonrivalry and the Economics of Data - ResearchGate, https://www.researchgate.net/publication/344035173_Nonrivalry_and_the_Economics_of_Data
  9. The diverse economic impacts of artificial intelligence - Amundi Research Center, https://research-center.amundi.com/article/diverse-economic-impacts-artificial-intelligence
  10. Data-enabled learning, network effects and competitive advantage 1 Introduction, https://econ.ntu.edu.tw/wp-content/uploads/2023/12/HKBU_1091029.pdf
  11. Artificial Intelligence, Data Network Effects and the Transformation of Competitive Markets - ijprems, https://www.ijprems.com/ijprems-paper/artificial-intelligence-data-network-effects-and-the-transformation-of-competitive-markets
  12. AI-enabled business models for competitive advantage - Elsevier, https://www.elsevier.es/en-revista-journal-innovation-knowledge-376-pdf-download-S2444569X24000714
  13. In the AI era, is proprietary data still a sustainable competitive ..., https://www.bowmark.com/insights/in-the-ai-era-is-proprietary-data-still-a-sustainable-competitive-advantage
  14. The Economics of AI Networks: Why Value Shifts From Models to Interactions - Medium, https://medium.com/@prajnaaiwisdom/the-economics-of-ai-networks-why-value-shifts-from-models-to-interactions-8f43b59932ba
  15. Network dynamics in the age of AI | Databricks Blog, https://www.databricks.com/blog/network-dynamics-age-ai
  16. Infonomics and You [video] - IRI, https://www.iri.com/blog/iri/business/infonomics-and-you/
  17. Why is Data Missing from the Balance Sheet? | CFO.University, https://cfo.university/library/article/why-is-data-missing-from-the-balance-sheet-southekal
  18. When tangible assets are missing from the balance sheet - Flossbach von Storch RI, https://www.flossbachvonstorch-researchinstitute.com/en/studies/detail/when-tangible-assets-are-missing-from-the-balance-sheet
  19. Valuing data assets | Deloitte Insights, https://www.deloitte.com/us/en/insights/topics/digital-transformation/valuing-data-assets.html
  20. Big Data Valuation: Cost, Market and Income Approaches | by Kevin Garwood - Medium, https://kevin-garwood.medium.com/big-data-valuation-cost-market-and-income-approaches-8b25fa7945a9
  21. Three Methods of Business Valuation and Which One to Use, https://www.successionresource.com/three-different-valuation-methods/
  22. Infonomics: The New Economics of Information, https://www.energy.gov/sites/prod/files/2020/08/f78/07-beto-leveraging-bioenergy-data-july2020-laney.pdf
  23. What is Infonomics and Why Should You Care? - Everteam Software, https://www.everteam.com/en/what-is-infonomics-and-why-should-you-care/
  24. Data: Our most valuable resource, without accounting monetary value - BIP Consulting, https://www.bip-group.com/en-uk/insights/data-our-most-valuable-resource-without-accounting-monetary-value/
  25. How Do You Value Information? - BigDATAwire - Data Science • AI • Advanced Analytics, https://www.hpcwire.com/bigdatawire/2016/09/15/how-do-you-value-information/
  26. FASB Modernizes Internal-Use Software Accounting with ASU 2025-06 - Johnson Lambert, https://www.johnsonlambert.com/insights/articles/fasb-modernizes-internal-use-software-accounting-with-asu-2025-06/
  27. FASB Issues Standard That Makes Targeted Improvements to Internal-Use Software Guidance, https://www.fasb.org/news-and-meetings/in-the-news/fasb-issues-standard-that-makes-targeted-improvements-to-internal-use-software-guidance-423046
  28. FASB's Improvements to Accounting for Internal-Use Software | Forvis Mazars US, https://www.forvismazars.us/forsights/2025/12/fasb-s-improvements-to-accounting-for-internal-use-software
  29. Updated Capitalization Requirements For Internal Software Development Projects Under ASU 2025-06 - Windham Brannon, https://windhambrannon.com/blog/updated-capitalization-requirements-for-internal-software-development-projects-under-asu-2025-06/
  30. FASB Unveils New Rules Streamlining Internal Software Accounting for Modern Development Methods - Thomson Reuters, https://tax.thomsonreuters.com/news/fasb-unveils-new-rules-streamlining-internal-software-accounting-for-modern-development-methods/
  31. Cybersecurity: The Economic Benefits of GDPR - CNIL, https://www.cnil.fr/en/cybersecurity-economic-benefits-gdpr
  32. The role of data protection in the digital economy - UNCDF Policy Accelerator, https://policyaccelerator.uncdf.org/all/brief-data-protection-digital-economy
  33. The economic value of personal data In digital economy - ResearchGate, https://www.researchgate.net/publication/390293406_The_economic_value_of_personal_data_In_digital_economy
  34. ESG Reporting: Key Insights for Accurate and Effective Disclosure - Bentleys, https://www.bentleys.com.au/resources/esg-reporting-key-insights-for-accurate-and-effective-disclosure/
  35. ESG reporting examples: Top Insights in 2025, https://www.keyesg.com/article/esg-reporting-examples-from-leading-companies
  36. Firms and innovation in the new industrial paradigm of the digital transformation, https://www.tandfonline.com/doi/full/10.1080/13662716.2022.2161875
  37. Why are data hard to value? Data’s unique attributes, https://internetofwater.org/blog/valuing-data/why-are-data-hard-to-value-datas-unique-attributes
  38. The Strategic Value of Data Sharing in Interdependent Markets, https://pubsonline.informs.org/doi/10.1287/mnsc.2024.04938
  39. AI may not need massive training data after all, https://www.sciencedaily.com/releases/2025/12/251228074457.htm
  40. The Network is the Product: Data Network Flywheel, Compound Through Connection, https://medium.com/@community_md101/the-network-is-the-product-data-network-flywheel-compound-through-connection-9bf3f94d1d6c
  41. Tangible and intangible assets, https://www.bdc.ca/en/articles-tools/entrepreneur-toolkit/templates-business-guides/glossary/tangible-and-intangible-assets
  42. Intellectual Property Valuation Basics for Technology Transfer Professionals, https://www.wipo.int/web-publications/intellectual-property-valuation-basics-for-technology-transfer-professionals/en/3-ip-valuation-methods.html
  43. Valuation Techniques, https://dart.deloitte.com/USDART/home/codification/broad-transactions/asc820-10/roadmap-fair-value-measurements-disclosures/chapter-10-subsequent-measurement/10-3-valuation-techniques
  44. Life-Cycle Decisions for Biomedical Data: The Challenge of Forecasting Costs, https://www.ncbi.nlm.nih.gov/books/NBK562702/
  45. Value your data, it is an asset, https://www.useready.com/blog/value-your-data-it-is-an-asset-principle-of-data-management-practice
  46. Data value, https://joanribas.net/data-value/
  47. Nonrivalry and the Economics of Data, https://www.nber.org/papers/w26260
  48. Security is a Revenue Booster not a Cost Centre, https://www.darkreading.com/cyber-risk/security-is-a-revenue-booster-not-a-cost-center
  49. Building Trust in Fintech: An Analysis of Ethical and Privacy Considerations in the Intersection of Big Data, AI, and Customer Trust, https://www.mdpi.com/2227-7072/11/3/90