Brianna White

Staff member
Jul 30, 2019

The concept of smart devices was first introduced in 1982 at Carnegie Mellon University with a modified vending machine to report on inventory. Ten years later, a toaster oven was connected and controlled over the Internet. However, the term Internet of Things, or IoT, wasn’t officially coined until 1999 by Kevin Ashton of Procter & Gamble, and it took over ten years for IoT to really take off.

Those early IoT applications only needed to connect for a moment in time, to measure inventory or turn on a toaster. These devices, and the many IoT applications that came after, connected to the Internet using the same cellular technologies originally designed for mobile phones, simply because that technology met their connectivity needs.

When IoT started to take off in 2010, connected devices accounted for only 9% of global connections, yet they are expected to reach 75% of all global connections by 2025. Through that growth, connected devices have continued to rely on infrastructure built for mobile phones, and the devices themselves have evolved. But has the infrastructure evolved enough to support them?

From the sunset of 2G and 3G networks to the introduction of 5G and region-specific regulations, the connectivity landscape is changing quickly. Pair that with the emergence of increasingly sophisticated IoT applications, and traditional cellular connectivity is no longer enough.

New Innovations In IoT

Today, patients conduct telehealth calls with doctors, leveraging data from remote patient monitoring devices. Cameras are installed to support real-time security monitoring. Delivery trucks are tracked in real time. Industrial robots are an integral part of some manufacturing plants. And the list goes on.
