
Brianna White

Administrator
Staff member
Jul 30, 2019
Edge computing is a decentralized, distributed computing infrastructure that has evolved with the growth of the Internet of Things.
Because the names sound similar, and because advanced computing concepts are not widely understood, some people assume that decentralized computing and edge computing are the same thing.
In fact, the two are distinct and complementary: combined as decentralized edge computing, they can handle tasks that neither can accomplish on its own.
What's Edge Computing?
Edge computing is the deployment of computing and storage resources at the location where data is produced. According to Gartner, edge computing is part of a distributed computing topology in which information processing is located close to the edge—where things and people produce or consume that information. Edge computing is transforming the way data is being handled, processed, and delivered from millions of devices around the world. 
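To make the idea concrete, here is a minimal sketch (my own illustration, not from the linked article) of the edge pattern: process raw readings near where they are produced and forward only a compact summary upstream. The sensor source and the upstream endpoint are hypothetical placeholders.

```python
# Sketch of edge-side aggregation: keep raw data local, ship only summaries.
from statistics import mean

def summarize_readings(readings: list[float]) -> dict:
    """Reduce a batch of raw readings to a small summary at the edge."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": mean(readings),
    }

def edge_cycle(read_sensor, send_upstream, batch_size: int = 100) -> None:
    """Collect a batch near the data source, then send only the summary."""
    batch = [read_sensor() for _ in range(batch_size)]
    send_upstream(summarize_readings(batch))  # far less traffic than raw data

if __name__ == "__main__":
    import random
    # Stand-in callables: a fake temperature sensor and print() as the "upload".
    edge_cycle(read_sensor=lambda: random.uniform(18.0, 26.0),
               send_upstream=print)
```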
What's Decentralized Computing?
Decentralized computing is the allocation of resources, both hardware and software, to each individual workstation or office location, rather than to a single central server.
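By contrast, a rough sketch of the decentralized pattern (again my own illustration, with made-up site names and data): each location runs its own self-contained node with local storage and local request handling, and no central coordinator is involved.

```python
# Sketch of decentralization: every site owns its resources and its data.
class OfficeNode:
    """A self-contained node: its own data store and its own processing."""

    def __init__(self, location: str):
        self.location = location
        self.store: dict[str, str] = {}  # storage held locally at this site

    def handle_request(self, key: str, value: str | None = None) -> str | None:
        """Reads and writes are served entirely by this location."""
        if value is not None:
            self.store[key] = value
        return self.store.get(key)

# Each site operates independently; there is no shared central server.
chicago = OfficeNode("Chicago")
berlin = OfficeNode("Berlin")
chicago.handle_request("invoice-42", "paid")
print(chicago.handle_request("invoice-42"))  # "paid" (served locally)
print(berlin.handle_request("invoice-42"))   # None (Berlin keeps its own data)
```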
Continue reading: https://www.bbntimes.com/technology/what-s-the-difference-between-edge-computing-and-decentralized-computing
 
