Cloud Silos – with IoT, “open” may not be so open…

by Zededa
January 24, 2019

There is a great deal of discussion about open source these days. But for Edge Computing it means one extremely important thing – standardization. “Cloud Native” and its ability to accelerate innovation by abstracting infrastructure complexity out of app development, deployment, security, and scalability would not be possible without open source. In the case of the Cloud, The Linux Foundation has been instrumental in creating and governing a system where common functions in the cloud (networking, storage, compute, security, software lifecycle management, etc.) can be exactly that – common. There is no need to reinvent the wheel for every app, and when improvements or best practices emerge, they become integrated into the common function. Over the last couple of decades, The Linux Foundation has become a master at governing open source efforts into de facto standards used by more cloud companies than the average person even knows. It isn’t hyperbole to say that The Linux Foundation has done for the cloud and cloud computing standards what the IETF and IEEE did in setting standards for networking and mobile phones. The TRILLIONS of dollars of value that the FAANG companies have created would not have been possible in the pre-Linux Foundation days.

Today, Edge Computing is rapidly developing into “islands of computing”, to put it in pre-TCP/IP networking terms. In the rush to build systems that unlock the power of connected operations’ asset data, proprietary and closed systems have evolved. A cloud application and backend tightly tied to proprietary hardware may be marketed as an “open system”. It may even let you add 3rd party applications through APIs. And the applications deployed do provide business impact, which can be immediate. But in this rush to deploy edge applications we are not actually creating “open systems”; we are once again sacrificing sovereignty of our data to Cloud Silos. An application sucks your assets’ data up into its cloud, provides various ways to slice and dice it, and may even give you super fancy AI to interpret things. But now the vendor controls your data. If you want a new app, you’ll have to develop it on their cloud – you’re locked in. It’s not a copy of your data that you’ve given them while maintaining control over its use, or even access to it. You’ve pumped your data into their Cloud Silo. And another app might just mean another Cloud Silo.

If you own an asset, and it’s generating data, that is YOUR DATA. How that asset is performing is YOUR BUSINESS. Tying the extraction of that data to proprietary hardware, and forcing any 3rd party app to work with that hardware and/or its backend, locks your data into a Cloud Silo that now effectively controls your business by controlling what you can do with your data. In many cases, the vendor can provide services to OTHER businesses built on YOUR data, while you see none of that revenue.

Think of it as a series of smartwatches on your arm. The same heartbeat, respiration, blood pressure, etc. data goes into each watch, and each performs a different function with it. But because you can’t simply point each app at your data, you wear each vendor’s proprietary edge device (a smartwatch) that sends the data up to its own separate cloud for processing. Do more apps need that data? More smartwatches collecting the same information. It’s possible to connect “cloud to cloud” to share the data, but then those businesses generate revenue from your health data. You could keep adding edge devices to transmit your data – one for each service. But is that really the right approach?

This is where The Linux Foundation has once again stepped in to set a vision that lets your organization control its data. The LF Edge project, and in particular Project EVE, is meant in part to provide a standard application environment that abstracts the complexity of edge infrastructure – legacy connectivity (Modbus, RS-232, etc.), new connectivity (wireless, Ethernet, etc.), compute, storage, security, and so on – out of the process of building, deploying, and managing applications and, more importantly, lets you decide what happens with your data and your business applications. It begins to bring the elegance of virtualized datacenters to small edge devices that can be deployed anywhere.
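To make the idea concrete, here is a minimal sketch in Go of what a hardware-agnostic edge application manifest might look like. Everything here – the EdgeApp type, the field names, the routing model – is a hypothetical illustration, not Project EVE’s actual API. The point is simply that the application declares what it needs (a Modbus feed, Ethernet), while the asset owner declares which consumers may receive copies of which data streams, with neither tied to a particular vendor’s hardware or cloud:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical types for illustration only — not Project EVE's real API.
// The idea: the app declares WHAT it needs, and the platform maps that
// onto whatever hardware is actually present (Modbus gateway, Ethernet, etc.).

// DataRoute records which consumer may receive a copy of a data stream.
// The asset owner writes these; a copy is granted, ownership is not.
type DataRoute struct {
	Stream   string `json:"stream"`   // e.g. "vibration", "temperature"
	Consumer string `json:"consumer"` // e.g. "acme-analytics"
	Copy     bool   `json:"copy"`     // always a copy — never the source of truth
}

// EdgeApp is a hardware-agnostic description of an edge workload.
type EdgeApp struct {
	Name   string      `json:"name"`
	Image  string      `json:"image"`  // container/VM image to deploy
	Needs  []string    `json:"needs"`  // abstract I/O: "modbus", "ethernet"
	Routes []DataRoute `json:"routes"` // owner-controlled data flow
}

func main() {
	app := EdgeApp{
		Name:  "pump-monitor",
		Image: "registry.example.com/pump-monitor:1.2",
		Needs: []string{"modbus", "ethernet"},
		Routes: []DataRoute{
			{Stream: "vibration", Consumer: "acme-analytics", Copy: true},
			{Stream: "temperature", Consumer: "local-historian", Copy: true},
		},
	}
	// Error ignored for brevity in this sketch.
	manifest, _ := json.MarshalIndent(app, "", "  ")
	fmt.Println(string(manifest))
}
```

Contrast this with the silo model, where the equivalent of the Routes list lives inside the vendor’s cloud and the asset owner cannot see or edit it.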

Back to our analogy, this looks much more like the right situation for your health data – you decide where the data goes, how it’s handled, and how much of it goes there.

The Linux Foundation’s formal leadership in Edge Computing, as a sister effort to its formal leadership in Cloud Computing, is official recognition of a shift in IIoT – effectively, IIoT 1.0 moving to IIoT 2.0. A shift away from the requirement that the edge and its data are forever tied to a cloud and that you must give up data sovereignty. That your privacy and intellectual property (operational data) are the province of someone other than yourself. That your regulatory compliance depends on a different organization. The IIoT shift means that the edge and its data are STILL YOURS, and that you can add applications without giving away control. The standards The Linux Foundation lays out will eliminate the question of “what hardware do I need to deploy” to extract the data, and will accelerate innovation in unimaginable ways. This effort is just getting started, and it’s going to reshape everything you know about cloud applications today.