Whitepaper-01 – Summary


Executive Summary – Data Value Accounting
Re-Thinking Data Value in the Networked Civilization

Globally open-sourced using Creative Commons licensing

In the Cloud/IoT/Blockchain era,
We have changed how we gather data.
It is time to reflect this in our taxonomy of data,
And how we assign convertible value in our accounting.


This is a revised examination of the relationship between data, finance, energy cost, volunteering, work, employment, ROI, opportunity cost, supply chains, operations, and ecological services. It flows from re-thinking the fundamental (intrinsic) nature, (monetizable) value, and (social) worth of all underlying data entered into the ledger – the data ecosystem and the financeable costs, features, and benefits that we assign to its constituents.

This flows from reorganizing newly-generated (wild/raw) data into six (6) phase states [A,B,C,D,E,F] for accounting and tax credit purposes.

There are three parts:

1a) [Read First] Restructuring data into a ‘Six Phase State’ Taxonomy of monetizable new information (“data”).

1b) Using the Taxonomy in Accounting: Use the taxonomy to restructure the ‘Six Phases of Data Value’ into an accounting formula that sums using basic arithmetic [ (w + x) – y = z ] (see the sketch after this list).

1c) Specify “Definition Equivalency” for Specific Applications: A particularly useful property of equations is the ability to reverse-engineer the definition of each variable’s value, enabling us to assign equivalent definitions for the problem at hand. For example, if “Converted Data = Asset”, what could “Asset” equivalently be redefined as?
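
To make the arithmetic in 1b) concrete, here is a minimal sketch in Python. The phase label, the meanings assigned to w, x, and y (base value, conversion value added, conversion cost), and every dollar figure are illustrative assumptions, not definitions taken from the DVA framework; only the formula (w + x) – y = z comes from the text above.

    # A minimal sketch of the 'Six Phases of Data Value' accounting formula.
    # Assumptions (not from the whitepaper): what w, x, and y stand for,
    # the phase label, and all dollar figures below are illustrative only.

    from dataclasses import dataclass

    @dataclass
    class DataValueEntry:
        """One ledger entry for a batch of newly-generated (wild/raw) data."""
        phase: str   # assumed label: one of the six phase states A..F
        w: float     # assumed: base value of the raw data (e.g. its energy-cost basis)
        x: float     # assumed: value added by converting/processing the data
        y: float     # assumed: costs incurred to perform the conversion

        @property
        def z(self) -> float:
            """Net convertible value, per the formula in 1b): (w + x) - y = z."""
            return (self.w + self.x) - self.y

    # Hypothetical example: a sensor-data batch classified into phase state 'C'.
    entry = DataValueEntry(phase="C", w=120.0, x=45.0, y=30.0)
    print(f"Phase {entry.phase}: z = (w + x) - y = {entry.z}")  # 135.0

    # Definition equivalency (1c): if "Converted Data = Asset", the computed z
    # could be booked as an asset line rather than treated as an expense.
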

Applications include:

2a) Tax Credits concept: The framework flows from the nature of data, not the laws of any one country, so it ought to be usable by any jurisdiction. Here, too, the framework culminates in a simple calculation.

2b) Private Applications: The framework also appears to be usable by private parties without needing tax authority approval.

2c) Other Applications: See “Implications“.


How do we measure and account for data now?

When you get right down to it, we don’t. Our “accounting data” and “double-entry accounting” system came into being over 500 years ago (1494).

This now-traditional system has served us well, but it has not fundamentally changed since. It has not evolved to reflect the changing nature of data itself, or the increasing speed, types, and volume of data collection in the modern age.

That’s the problem.

In the Cloud/IoT/Blockchain era, we have changed everything. We harvest data 24/7. It is time to reflect this in our taxonomy: to assign precise, monetizable values to data as it moves through the ‘accounting value chain’.

The innovation here changes the nature of the data that is brought into being: it can be assigned a fiat or equivalent currency value (dollar, franc, token, crypto) and entered into the ledger – the taxonomy of accounting data.

The innovation produces the means to finance the costs of volunteers and of volunteered (donated) data, and to finance new micro-businesses that produce “a social good” [benefits to society] and “desired outcomes” [private & public-private ROI and opportunity-cost choices].

And because it takes electronics to generate computer data, by definition we can use the energy cost to assign a base economic value to the data we generate. That gives us the means to create data-value metrics, to profit from those metrics, and to drive inefficiency out of energy generation, processing, grid delivery, and consumption systems.
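
One way to read that claim in code – a minimal sketch, assuming the metered energy cost of producing a batch of data is simply spread across the records it yields. The function name, the tariff, and the device figures are hypothetical; none of them come from the whitepaper.

    # A minimal sketch: deriving a base economic value for generated data
    # from the energy consumed to produce it. All figures are assumptions.

    def base_data_value(kwh_consumed: float, tariff_per_kwh: float,
                        records_generated: int) -> float:
        """Assumed base value per record: energy cost spread over the records produced."""
        energy_cost = kwh_consumed * tariff_per_kwh
        return energy_cost / records_generated

    # Hypothetical IoT gateway: 2.4 kWh in a day at $0.12/kWh, producing 96,000 readings.
    per_record = base_data_value(kwh_consumed=2.4, tariff_per_kwh=0.12,
                                 records_generated=96_000)
    print(f"Assumed base value per record: ${per_record:.7f}")  # $0.0000030
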

The method is universally useful because it flows from re-thinking the intrinsic nature of data entered in a ledger, not the laws of a specific country.

The surprise is that using newly-generated (wild/raw) data lets us convert Labour Expense to Real Assets. This sparks opportunity for millions (billions?) of new ways to earn income.


Why This? Why Now?

It has been extraordinarily impractical to gather natural, economic, and social information in a way that would be useful to society. Every country’s national census is collected periodically, and the intervening years are used to tabulate and analyze the data. Astronauts say that engineering for space is hard. The census has been worse. We have computers now, but imagine just a half-century ago, when it was all paper.

Now, we are stuck. With an Egg & Chicken Problem.

Our data-gathering has changed: IoT/Cloud + mobile phones + sensor mesh networks make near-real-time collection practical. Collection volumes will explode, but we have not changed how we account for the data we gather.

Here’s what’s coming.

300 billion passwords in use globally by 2020; 96 zettabytes of digital content generated per year by 2020 (we generated 4 zettabytes in 2017); 200 billion IoT devices and 45 trillion networked mesh sensors by 2040. And the concurrent cybersecurity threat, already forecast to reach US$6 trillion by 2021.

Can we change our accounting methods to account for all this data? Can we drill down into the fundamental nature of the information we gather to effectively assign intrinsic, social, economic, and monetized value?

The DVA method proposes a way to do this.

Our Egg & Chicken Problem can be our Egg & Chicken Solution. 

And, as noted above, because it takes electronics to generate computer data, we can use the energy cost to assign a base economic value to the data we generate, creating the means to build data-value metrics and to profit from them. In this age of threatening runaway climate change, that is a benefit energy generators and consumers can all create together.


Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Disclaimer: The strategies and other information provided here are for information purposes only. They are not intended to be investment advice. We make no representations or warranties whatsoever regarding the accuracy or completeness of any such information. Seek a duly licensed professional for investment, legal, tax, and other advice.