Ensuring a Secure and Resilient Edge Computing Future

November 28, 2023

AI EdgeLabs Chief Customer Officer Steve Poeppe recently sat down with ZEDEDA’s VP of Business Development Michael Maxey to discuss the evolving definition and applications of edge computing, why the edge introduces new security challenges and what to do about them, why legacy system integrations aren’t the impediment to change they were once assumed to be, and more.


Here are 5 key takeaways from the discussion: 

  1. ZEDEDA defines the edge as any gateways or servers running outside a data center, also known as “compute edge” or “gateway edge.”
    ZEDEDA’s work at the edge typically takes place in the back offices of retail stores, in factories, or near oil wells, for instance. However, the edge also includes the device edge, a massive market where most of today’s devices operate, collecting, analyzing, and processing data between the real world and the data center. Understanding the value and complexity of this data, ZEDEDA is working to move processing, applications, and capabilities closer to where data is created, enabling faster responses and decision-making, major cost savings, and better outcomes.
  2. Computer vision is one of the fastest-growing applications of edge computing, followed by data streaming. Edge computing has seen its most widespread adoption in heavy-machinery industries, including manufacturing, retail, automotive, energy, and telecommunications.
    Camera use, along with the analytics and transfer of visual data, has grown rapidly of late, whether cameras are mounted for surveillance inside and outside of businesses or attached to helmets for safety. Many industries, including oil and gas, are also pairing cameras with sensors and combining computer vision, data streaming, and automation to optimize oil pump pressure, both for environmental safety and increased profit. The automotive, retail, manufacturing, telecommunications, and energy sectors have been early adopters of edge computing simply because transforming their businesses has long required moving outside data centers and closer to the heavy equipment and towers that collect and disseminate data and keep their businesses operating.
  3. Moving devices out of the data center introduces several new security challenges, including the risk of theft, hacking, and malware, along with the need for a more diverse security profile spanning network- and application-level security.
    The data center acts as an armed guard, keeping devices, data, and networks safe, but the edge leaves hardware and software vulnerable to a variety of attacks. Organizations would do well to partner with an edge orchestration and management vendor or a cybersecurity and anti-fraud provider that can implement preventative tools and measures to protect software, encrypt data, and secure networks across thousands of locations and beyond. At present, this is the best way for organizations to regain the safety and security they’ve come to expect in a data center environment.
  4. To safeguard against future challenges and get ahead of trends emerging in edge computing, keep an eye on advancements in AI.
    AI, large language models, machine learning, and other cognitive capabilities are huge, inevitable areas of growth in edge computing, and ignoring the question of AI will only lead to headaches down the road. Organizations are advised to start collecting, tagging, and labeling data, and to add and preserve metadata, so they are prepared to satisfy current and future AI needs.
  5. Having legacy machinery, systems, and processes impedes neither edge computing readiness nor the ability to pave the road to modernization.
    Challenges with device, system, hardware, and software diversity at the edge are as common as the novel solutions now available to circumvent them. For the most part, no matter what equipment and technologies an organization brings to the table, there is a tool or technology that can either transform it or connect its data to the edge. Organizations concerned about legacy system integration are advised to define a data use case, which may involve implementing industry-standard predictive maintenance to address problems before they occur, and to create a clear plan to future-proof existing investments. This approach also drives additional value, such as improved safety and uptime, while laying a path for future growth and innovation. The path to modernization may include several steps deployed over time; still, there are edge orchestration and management solution providers who can assist with legacy system integration today while organizations work to modernize.

To learn more about the evolution of edge computing and the important role of security and planning, watch the full video.


