Pushing Computing Over the Edge: A High-Stakes Game of Winners and Losers

Oct 26, 2019

Recently, Siemens announced its intention to acquire edge computing technology from Pixeom, a company that offers edge middleware to deploy and manage containerized cloud applications at the edge. The move is the latest in a series of activities by major players in enterprise IT and OT to advance their edge strategies and reflects the growing importance of this shift in computing. Customers today are seeking the same agility that the cloud delivered to the enterprise and the mobile phone delivered to the consumer. With companies like Microsoft and AWS creating their own homegrown IoT solutions, the move by Siemens to acquire Pixeom’s technology to deliver better agility at the edge is not surprising.

Why all the fuss? According to Gartner’s 2019 Cool Vendors in Edge Computing report, by 2022 more than 50% of data will be generated outside the data center or cloud, and by 2023 more than 50% of large enterprises will have at least six edge computing use cases (compared with less than 1% in 2019). This explosion of data at the edge, coupled with bandwidth and network constraints, is creating the need for edge computing solutions closer to the source of data generation. As a result, the race to remain relevant and competitive at the edge has begun, and the jury is still out on who will win!

We have seen technological shifts like this before in other industries. When the rush to cloud computing and mobile began, for instance, no one could predict who would win those races. Today, Amazon and Microsoft lead in cloud computing, while Apple and Google lead in mobile, an outcome that would have been impossible to call at the outset.

Part of the reason we’re seeing so many companies enter the edge computing space is that the edge is complex from a technological standpoint. Each company is solving a different piece of the puzzle because, much like any other type of computing, edge requires a multi-layer stack in order to function. Some people design the actual silicon and hardware, others create the applications that run on that hardware, and still others focus on how to connect the edge back to the cloud. Each company in this space is tackling a different layer of the stack, but this approach can lead to disparate, non-integrated solutions. That, in turn, can result in vendor lock-in, a problem most customers want to avoid.

This is one of the reasons why ZEDEDA is a founding member of LF Edge, and why we donated our Edge Virtualization Engine to the group as an open-source project. We believe interoperable, open standards must be the foundation of this work if edge computing is to keep growing; otherwise, companies risk seeing their edge deployments split into different vendors’ silos and facing vendor lock-in. By making our core virtualization technology available independently of our own company, we’re doing our part to help ensure that everyone has access to a standard base layer for their edge deployments in a secure, scalable, and vendor-agnostic way. The virtualization technology is designed specifically for the edge and makes hardware invisible, so that any application can run on any edge device and connect to any cloud at scale without compromising an organization’s security posture.
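
To make that base layer idea concrete, here is a minimal, hypothetical sketch of what a hardware-agnostic edge deployment could look like once a virtualization layer abstracts the device underneath. The manifest fields and the deploy_to_edge() helper are illustrative assumptions for this post, not EVE’s or LF Edge’s actual API.

```python
# Hypothetical sketch only: the dataclass fields and deploy_to_edge() helper
# are illustrative assumptions, not the actual EVE or LF Edge API.
from dataclasses import dataclass


@dataclass
class EdgeApp:
    """A containerized workload described independently of the hardware it runs on."""
    name: str
    image: str           # same OCI container image used in the cloud
    cpu_cores: int
    memory_mb: int
    cloud_endpoint: str  # any cloud the app should report back to


def deploy_to_edge(app: EdgeApp, device_id: str) -> None:
    """Stand-in for a controller call: the virtualization layer maps the app
    onto whatever silicon the target device actually has."""
    print(f"Scheduling {app.name} ({app.image}) on {device_id}, "
          f"reporting to {app.cloud_endpoint}")


if __name__ == "__main__":
    vision_app = EdgeApp(
        name="defect-detector",
        image="registry.example.com/vision/defect-detector:1.4",
        cpu_cores=2,
        memory_mb=1024,
        cloud_endpoint="https://iot.example-cloud.com/ingest",
    )
    # The same manifest targets very different hardware without any changes.
    for device in ("factory-gateway-arm64", "retail-box-x86"):
        deploy_to_edge(vision_app, device)
```

The point of the sketch is that one application description can target very different devices without modification; the virtualization layer, not the developer, worries about the silicon underneath.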

We’re excited to continue watching and contributing to the developments in the edge computing industry. If you have questions about how all the pieces of the edge stack fit together or want to know more about how to base your edge strategy on open source components, reach out to us at [email protected].
