As we enter the age of data valuation, we're starting to witness the rise of data marketplaces.
One of the concepts I've been experimenting with is the routing of IoT data from a sensor to a data marketplace. The diagram below highlights the general idea. How does one route IoT data and eventually advertise it for sale in one of the many emerging data marketplaces?
There are two hypotheses I've been working on in relation to the figure above:
- IoT data with a higher degree of (measurable) trust will sell for a higher price
- Data profits will decrease when IoT data is immediately moved to a public cloud
What is meant by measurable trust? It means that a potential buyer can inspect the provenance of a data set for sale and see the level of care the data received as it moved from sensor to market. The mechanism for recording that provenance is called a data confidence fabric (DCF). A DCF description, and a set of measurable provenance attributes, is depicted below.
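As a concrete sketch, a DCF record might carry a list of trust insertions (annotations added at each handling step) that roll up into a confidence score a buyer can inspect. The field names, annotation labels, and additive scoring scheme below are illustrative assumptions, not a defined DCF schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrustInsertion:
    """One provenance annotation added as data moves from sensor to market."""
    name: str     # e.g. "device-signature-verified" (hypothetical label)
    score: float  # contribution to the overall confidence score

@dataclass
class DCFRecord:
    """A data set's provenance trail: content hash plus trust insertions."""
    data_hash: str
    insertions: List[TrustInsertion] = field(default_factory=list)

    def annotate(self, name: str, score: float) -> None:
        self.insertions.append(TrustInsertion(name, score))

    def confidence(self) -> float:
        # Simple additive roll-up; a real DCF could weight or cap scores.
        return sum(i.score for i in self.insertions)

record = DCFRecord(data_hash="sha256:ab12")
record.annotate("device-signature-verified", 1.0)
record.annotate("tls-in-transit", 1.0)
record.annotate("immutable-ledger-entry", 1.0)
print(record.confidence())  # 3.0
```

A buyer comparing two otherwise identical data sets could then prefer the one with the higher confidence score, which is the first hypothesis in miniature.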
I've written before about decreasing data profits due to egress fees. Moving IoT data into the cloud costs very little, but moving it out every time it sells amounts to a continual tax.
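A back-of-the-envelope comparison makes the point. Every number below (dataset size, egress rate, sale price, local storage cost) is a hypothetical figure for illustration only:

```python
def cloud_profit(sale_price, dataset_gb, egress_per_gb, sales):
    """Profit when every sale pays a cloud egress fee to move the data out."""
    return sales * (sale_price - dataset_gb * egress_per_gb)

def edge_profit(sale_price, sales, local_storage_cost):
    """Profit when data is served from storage close to the sensors;
    the storage cost is paid once, not per sale."""
    return sales * sale_price - local_storage_cost

# Illustrative numbers: 50 GB dataset, $0.09/GB egress, $10 per sale,
# 100 sales, $200 total local storage cost.
print(cloud_profit(10.0, 50, 0.09, 100))
print(edge_profit(10.0, 100, 200.0))
```

With these assumed figures the egress fee consumes almost half of each cloud-hosted sale, while the local storage cost is a one-time expense, which is the intuition behind the second hypothesis.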
When you combine the two hypotheses above you end up with an architectural statement that in theory should bring the highest level of profit during the sale of IoT data:
"Build a data confidence fabric, using open-source technologies, that initially stores the data close to the sensors".
This architecture can most certainly be built, but there are two main problems to solve:
- What are the stages of IoT data creation, local storage, and eventual movement to multiple marketplaces? This is a multi-cloud problem (VMware's bread and butter), so to solve it we turned to VMware.
- What are the set of open technologies we can use to store data close to the sensors and prepare it for sale?
After considering these two questions exhaustively, we've written a white paper that answers both in significant depth. The paper, called Getting Started with IoT Data Monetization, starts with sensor ingestion and ends with the technology that nearly all data marketplaces expect: blockchain. The diagram below expresses the general idea.
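The white paper covers the full pipeline, but a minimal sketch of that final step, anchoring a data set's hash in a blockchain-style ledger so a marketplace can verify integrity, can be shown in a few lines. The ledger structure and field names here are illustrative assumptions, not any specific marketplace's API:

```python
import hashlib
import json

def anchor_dataset(ledger, dataset_bytes, metadata):
    """Append a content hash plus metadata to a hash-chained ledger (sketch).
    Each entry links to the previous one, so tampering with earlier
    entries invalidates every later entry_hash."""
    prev_hash = ledger[-1]["entry_hash"] if ledger else "0" * 64
    entry = {
        "data_hash": hashlib.sha256(dataset_bytes).hexdigest(),
        "metadata": metadata,
        "prev_hash": prev_hash,
    }
    # Canonical serialization so the entry hash is reproducible.
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry

ledger = []
anchor_dataset(ledger, b"temperature,21.5\n", {"sensor": "edge-01"})
anchor_dataset(ledger, b"temperature,21.7\n", {"sensor": "edge-01"})
print(len(ledger), ledger[1]["prev_hash"] == ledger[0]["entry_hash"])  # 2 True
```

Note that only the hash travels to the ledger; the data itself can stay in local storage near the sensors until a sale actually requires moving it.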
We've started to build this framework as a way to test these hypotheses. Open technologies provide a cost-efficient way to capture and manage IoT data close to the sensors, and VMware's six-sevens process pattern considers all the complexities that arise when moving data across heterogeneous environments.
The benefits of initially storing data close to the sensors extend beyond data monetization. A scalable and robust local IoT storage layer brings latency and compliance/governance benefits as well. This architecture can therefore not only make money (data sales) but also save money (e.g., lower latency improves operational efficiency) and avoid fines (e.g., when data movement would violate local or federal policies).
We're looking for opportunities to share this approach with like-minded companies and individuals; feel free to contact me on Twitter or LinkedIn (see the menu bar above) for more information.
Steve
Dell Technologies Fellow