The Time Value of Data

I am doing more work than ever with the Internet of Things these days, and I've wanted to write on this topic for some time. A larger article is in the works for publication, but I'll give the high level here. Over the last few years my work with Smart Grid in particular, and Big Data in general, has made me acutely aware of a concept I have started calling the Time Value of Data. The name draws on my interest in economics, specifically the Time Value of Money, a concept that dates back nearly 500 years to a city in Spain I have always enjoyed visiting.

The theory behind the time value of money is quite straightforward: money today has a value in the future that differs from its value now. That is, capital has a value that changes over time; in a "normal" environment, some amount of money today is worth that amount plus some more in the future. This is actually a rather complex topic, but plenty has been written about it.

What I want to focus on here is the value of data over time. Data generally has a value curve that is unlike that of most other commodities – and yes, data is a commodity (or at least is becoming one). When we think about the Internet of Things in particular – devices, appliances, sensors, and telemetry – it becomes quite apparent that some of this data will have high immediate value. A fire alarm is a great example: knowing about a fire as it starts is extremely valuable. It may allow for safe evacuation or even containment. As time passes, the value of that information drops. Do I really care that my building had a fire several hours or days ago? Many of the sensors in use today focus on this immediate-value area.

There is also a secondary data story: historical, or collective, data. This is where you save data in raw form long enough to gain value from it. Good examples are climate data, defect rates, and energy usage. As more of this data is collected over longer periods of time, its value increases dramatically. The individual data points may not be especially valuable on their own, but collectively the data set becomes more and more valuable. This is depicted in the chart below (I said this was a rough draft).
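To make the two stories concrete, here is a toy model of the idea: an immediate component that decays quickly (the fire-alarm case) plus a collective component that grows as observations accumulate (the climate-data case). This is purely illustrative – every function name, parameter, and value below is my own assumption, not something from a real data-valuation method.

```python
import math

def data_value(t, immediate=10.0, decay=1.5, collective=8.0, growth=0.1):
    """Toy model of the Time Value of Data (arbitrary units).

    Total value at time t is the sum of:
      - an immediate component that decays exponentially, and
      - a collective component that rises as data accumulates.
    All parameters here are illustrative assumptions.
    """
    immediate_value = immediate * math.exp(-decay * t)
    collective_value = collective * (1 - math.exp(-growth * t))
    return immediate_value + collective_value

# High at t=0, dips, then rises again toward the collective ceiling:
samples = [round(data_value(t), 2) for t in (0, 1, 5, 20, 50)]
print(samples)  # [10.0, 2.99, 3.15, 6.92, 7.95]
```

Note the shape this produces: value starts high, bottoms out, and climbs back as the data set grows – exactly the rough V (or U) discussed below.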

[Figure: Time Value of Data curve]

As I mentioned, this is an idea I am still formalizing and will have an article about soon – so I invite any comments or contributions. Perhaps this is more of a U-shaped than a V-shaped curve, or maybe the right side doesn't rise as high, but the concept holds up fairly well when examined against use cases.

More details on this and the implications will follow.


About danrosanova
I am a Senior Program Manager for Messaging at Microsoft covering Service Bus: Messaging, Relay, and Event Hubs. I have a long history in distributed computing on a variety of platforms and have focused on large-scale messaging and middleware projects from inception to implementation. I was a five-time Microsoft MVP before joining Microsoft and am the author of the book Microsoft BizTalk Server 2010 Patterns.
