Edge computing is a distributed computing model that brings computation and data storage closer to the sources of data. More broadly, it refers to any design that places computation physically closer to the user in order to reduce latency relative to running the application in a centralized data centre.[1]
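The latency argument can be made concrete with a rough propagation-delay estimate. The following Python sketch is illustrative only and not part of the article; the distances and the assumed signal speed in optical fibre are hypothetical round numbers, and real-world latency also includes queueing and processing delays.

```python
# Back-of-envelope estimate of how physical distance alone affects
# network round-trip time. Assumes signals travel through optical fibre
# at roughly two-thirds the speed of light (~200,000 km/s).

FIBRE_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay, in milliseconds, over one path."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

# Hypothetical distances: an edge node in the same metro area versus a
# centralized data centre on another continent.
for label, km in [("edge node (50 km)", 50), ("central data centre (6,000 km)", 6000)]:
    print(f"{label}: ~{propagation_rtt_ms(km):.1f} ms round trip")
```

Under these assumptions the nearby edge node implies about 0.5 ms of round-trip propagation delay, versus about 60 ms for the distant data centre, which is the gap edge architectures aim to close.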
The term began being used in the 1990s to describe content delivery networks, which delivered website and video content from servers located near users.[2] In the early 2000s, these networks broadened their scope to host other applications,[3] giving rise to early edge computing services.[4] These services offered capabilities such as dealer locators, shopping carts, real-time data aggregation, and ad insertion.
The Internet of Things (IoT), in which devices are connected to the internet, is often associated with edge computing; however, the two are not synonymous.[5]
^ Davis, A.; Parikh, J.; Weihl, W. (2004). "Edgecomputing: Extending enterprise applications to the edge of the internet". Proceedings of the 13th International World Wide Web Conference on Alternate Track Papers & Posters - WWW Alt. '04. p. 180. doi:10.1145/1013367.1013397. ISBN 1581139128. S2CID 578337.