
Kathleen Martin

Guest
Recently, a growing amount of hope has been attached to edge computing. The industry is buzzing with bold ideas such as “the edge will eat the cloud” and predictions that real-time automation will spread across healthcare, retail, and manufacturing.
Experts agree that edge computing will play a key role in the digital transformation of almost every business. But progress has been slow. Legacy perceptions have held companies back from fully leveraging the edge for real-time decision-making and resource allocation. To understand how and why, let’s look back at the first wave of edge computing and what has transpired since then.
The first wave of edge computing: Internet of Things (IoT)
For most industries, the idea of the edge has been tightly associated with the first wave of the Internet of Things (IoT). At the time, much of the focus centered on collecting data from small sensors affixed to everything and then transporting that data to a central location, such as the cloud or a main data center.
Those data flows then had to be correlated through a process commonly referred to as sensor fusion. At the time, sensor economics, battery lifetimes, and limited pervasiveness often resulted in data streams that were too sparse and too low in fidelity. In addition, retrofitting existing equipment with sensors was often cost prohibitive: while the sensors themselves were inexpensive, installation was time consuming and required trained personnel. Finally, the expertise needed to analyze data through sensor fusion was rarely embedded in the workforce’s knowledge base across organizations. Together, these factors slowed IoT adoption rates.
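For readers unfamiliar with the term, sensor fusion simply means combining readings from several sensors into a single, more reliable estimate. As a minimal sketch (not from the article; the sensor values and variances below are hypothetical), an inverse-variance weighted average in Python looks like this:

def fuse_readings(readings):
    # Fuse noisy readings of the same quantity from several sensors.
    # Each reading is a (value, variance) pair; the fused estimate is the
    # inverse-variance weighted average, a standard sensor-fusion baseline.
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    fused_variance = 1.0 / total
    return fused_value, fused_variance

# Hypothetical example: two temperature sensors on the same machine.
# The noisier second sensor contributes less to the estimate.
readings = [(21.8, 0.04), (22.5, 0.25)]
value, variance = fuse_readings(readings)
print(f"fused temperature: {value:.2f} C (variance {variance:.3f})")

Assuming independent sensor noise, the fused variance is always smaller than that of the best individual sensor, which is why sparse or low-fidelity data streams undermined the whole approach.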
Additionally, security concerns cooled wholesale adoption of IoT. The math is simple: thousands of connected devices across multiple locations add up to a large and often unknown exposure. As the potential risk outweighed the unproven benefits, many felt it was prudent to take a wait-and-see attitude.
Moving beyond IoT 1.0
It is now becoming clear that the edge is less about IoT and more about making real-time decisions across operations spanning distributed sites and geographies. In IT, and increasingly in industrial settings, we refer to these distributed data sources as the edge, and to decision-making at all those locations outside the data center or cloud as edge computing.
The edge is everywhere we are: everywhere we live, everywhere we work, everywhere human activity takes place. Sparse sensor coverage has been solved with newer and more flexible sensors. New assets and technology come with a wide array of integrated sensors, and sensors are now often augmented with high-resolution, high-fidelity imaging (x-ray equipment, lidar).
Continue reading: https://www.cio.com/article/308159/edge-computing-is-thriving-in-the-cloud-era.html