
What Is Edge Computing?


Photo: Edge computing is a major player in the Fourth Industrial Revolution, or Industry 4.0, placing computation at or near the data source to reduce latency and provide real-time data processing and insights to enterprises.


Edge Computing Scenario

Imagine that a driver of a military vehicle needs to travel to a nearby outpost to respond to an unexpected attack.

The driver doesn’t know exactly where the outpost is located. He just knows it’s about 15 miles from the military base where he's currently stationed.

Thankfully, the driver’s vehicle is equipped with an Internet of Things (IoT) virtual assistant that provides him with real-time navigation, geographical information and even weather-related updates.

This nifty assistant transmits the driver's requests for information to a distant server at a centralized cloud data center located thousands of miles away.

In turn, the cloud server uses this device-generated data to compute the information the driver requested.


Photo: A data center with multiple rows of fully operational server racks

Normally, this data is relayed back to the virtual assistant almost instantaneously. The assistant then uses it to guide the driver to the nearby outpost, all the while highlighting any obstacles or conditions that may impede his travel or jeopardize his safety.

But there’s a problem this time.

The assistant is silent. Buffering.

Retrieving the requested information is taking longer than usual.

Why?

Because hundreds of other IoT devices at the base – wearables, security cameras, drones, weapons systems, smart speakers and smart appliances – are also transmitting data to the cloud using the same connection, resulting in a network slowdown of nightmarish proportions.

As a result, the driver is experiencing a delay, or latency, in his device’s response time.

And in turn, the driver cannot receive geographical information or directions to the besieged outpost.

At least, not in a timely manner.

So, what’s the solution?

Enter edge computing.


Infographic: An illustration of an edge computing architecture

What is edge computing?

Edge computing is a type of network architecture in which device-generated data is processed at or near the data source.

Edge computing is actualized through integrated technology installed on Internet of Things (IoT) devices or through localized edge servers, which may be located inside smaller cloud-based data centers, known as cloudlets or micro data centers.

You can think of edge computing as an expansion or complement of the cloud computing architecture, in which data is processed or stored at data centers located hundreds of miles, or even continents, away from a given network.

Essentially, edge computing distributes part of a cloud server’s data processing workload to an integrated or localized computer that is proximal to the data-generating device, a process that mitigates latency issues caused by a substantial amount of data transfer to the cloud.
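
To make that division of labor concrete, here is a minimal Python sketch of how an edge node might handle a stream of sensor readings: time-critical checks run locally, and only a compact summary is forwarded to the cloud. The threshold, readings, and function names are invented for illustration and aren't tied to any particular platform.

```python
import statistics
import time

def process_on_edge(readings):
    """React to time-critical conditions locally, at the data source."""
    # Hypothetical rule: anything above 90.0 needs an immediate local response.
    return [r for r in readings if r > 90.0]

def summarize_for_cloud(readings):
    """Shrink the raw stream to a small summary before it crosses the network."""
    return {
        "timestamp": time.time(),
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
    }

# Simulated readings from an IoT sensor
raw_readings = [72.1, 73.4, 95.2, 71.9, 70.8]

alerts = process_on_edge(raw_readings)             # handled immediately at the edge
cloud_payload = summarize_for_cloud(raw_readings)  # only this small record goes upstream
print(alerts, cloud_payload)
```

The point is the split itself: the raw stream never leaves the edge, while the cloud still receives enough to support long-term storage and analysis.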

One example of these localized computers is an edge server.


Photo: Edge servers crunch data from IoT devices to give enterprises the insights they need to achieve their goals faster and more efficiently.

What is an edge server?

An edge server is a computer that’s located near data-generating devices in an edge computing architecture.

It utilizes the edge computing architecture to reduce latency in data transmission and filter out unimportant or irrelevant data before it is ultimately sent to the cloud for storage.

Edge servers act as intermediaries between a network and a cloud data center, absorbing a portion of an IoT device's data processing activities and delivering results in real time.
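
A rough sketch of that intermediary role, again with made-up names and thresholds, might look like the following: the edge server accepts readings from devices, drops what is irrelevant, and forwards the remainder to the cloud in batches.

```python
class EdgeServer:
    """Toy intermediary that filters device data and forwards it to the cloud in batches."""

    def __init__(self, relevance_threshold=80.0, batch_size=3):
        self.relevance_threshold = relevance_threshold
        self.batch_size = batch_size
        self.pending = []

    def ingest(self, device_id, reading):
        # Drop unimportant readings right here, before they use any upstream bandwidth.
        if reading < self.relevance_threshold:
            return
        self.pending.append((device_id, reading))
        if len(self.pending) >= self.batch_size:
            self.forward_to_cloud()

    def forward_to_cloud(self):
        # A real edge server would push this batch over HTTPS or MQTT to a cloud endpoint.
        print(f"Forwarding {len(self.pending)} readings to the cloud")
        self.pending.clear()

server = EdgeServer()
for device_id, value in [("sensor-1", 75.0), ("sensor-2", 88.5), ("sensor-1", 91.2), ("sensor-3", 84.0)]:
    server.ingest(device_id, value)
```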

Examples of edge servers include rugged servers, which are used in the military, manufacturing, and other industries.

The computational offload achieved by the edge computing architecture, in conjunction with the resilience and processing power of a high-performance rugged server, can make for quite a powerful combination at the edge.


Photo: Soldier wearables are just one of many edge computing examples.

What are some edge computing examples?

There are several instances of edge computing architectures in military, commercial, and industrial applications today.

Examples of edge computing include the IoT sensors attached to soldier wearables and battlefield systems, the systems on offshore oil rigs, modern cars and self-driving vehicles, security and assembly line cameras, and virtual assistants, as well as the edge servers that take the data and measurements from these devices and crunch them to provide insights to their users.

We detail each of these examples below:

  • Soldier wearables and battlefield systems: The power of edge computing allows for soldier and field data to be collected and processed in real time in what has been termed the “Internet of Military Things” (IoMT) or “Internet of Battlefield Things” (IoBT). This massive family of interconnected devices, ranging from helmets to suits to weapons systems and more, produces a gargantuan amount of contextualized data, including information about a soldier’s physical health and identification data about potential enemy combatants. When lives are on the line, the act of computation needs to take place sooner rather than later. Thanks to edge computing, soldiers can do their jobs in the field safely and more efficiently, resulting in improved national security down the line.
  • Offshore oil rigs: Edge computing is a likely solution for remote locations with limited or no connectivity to a centralized data center. Offshore oil rigs are a perfect case in point. With more and more data being collected by Industrial Internet of Things (IIoT) devices, places like offshore oil rigs are generating more data than ever, oftentimes a lot more than their networks can handle. By extending computation closer to the data source using edge devices and servers, an offshore oil rig doesn’t have to worry about latency issues in real-time applications, or transmitting mountains of data across an already-spotty network. By utilizing localized edge computing solutions, an oil rig’s reliance on the computational abilities of a faraway data center is significantly reduced.
  • Self-driving vehicles: Autonomous vehicles are a popular example of edge computing due to their need to process data with as little latency as possible. A self-driving vehicle is unable to wait for a distant cloud data center to decide whether it should change lanes or brake for a civilian. These decisions need to happen almost instantaneously to ensure the safety and well-being of persons both on and off the road. An edge computing architecture allows for this sort of real-time decision-making, since latency is being reduced as a result of real-time data processing occurring on the vehicle itself. This sort of integrated computation is also known as point-of-origin processing.
  • Security and assembly line cameras: Surveillance cameras in security systems and at production and manufacturing facilities often record a continuous stream of footage, which is then sent to the cloud for processing or storage. To cut down on the amount of stored data, the cloud server may, for example, run an application that deletes the useless footage and keeps only the footage that meets certain criteria, such as a person in motion or a defect in a product. Either way, this approach consumes a lot of bandwidth, because every byte of that footage is transferred to the cloud. But if the cameras can perform video analytics themselves and send only the important bits of footage to the cloud, less data travels across the network, resulting in less bandwidth use and less network traffic (see the frame-filtering sketch after this list).
  • Virtual assistants: Virtual assistants that use the cloud for data processing can improve response times by processing requests locally on an edge gateway or server, rather than relying on a faraway cloud data center for all of the computation. Our military driver would have received the information he requested a lot faster had his base's network taken advantage of edge computing's real-time processing capabilities.
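
To illustrate the camera example above, here is a deliberately simplified Python sketch of frame filtering at the edge. The frames, threshold, and helper names are hypothetical, and a real system would use proper video analytics rather than a plain pixel difference, but the idea is the same: only frames that show meaningful change ever leave the camera.

```python
def frame_delta(frame_a, frame_b):
    """Mean absolute difference between two equal-length grayscale frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def frames_worth_uploading(frames, motion_threshold=10.0):
    """Keep only the frames that differ noticeably from the one before them."""
    kept = []
    previous = frames[0]
    for frame in frames[1:]:
        if frame_delta(previous, frame) > motion_threshold:
            kept.append(frame)      # only these would be sent to the cloud
        previous = frame
    return kept

# Tiny fake "frames": flat lists of grayscale pixel values
frames = [
    [10, 10, 10, 10],   # static scene
    [10, 11, 10, 10],   # negligible change -> dropped at the edge
    [10, 90, 95, 10],   # large change (motion) -> uploaded
]
print(len(frames_worth_uploading(frames)), "of", len(frames) - 1, "frames uploaded")
```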

What are the benefits of edge computing?

There are four main benefits of establishing an edge computing architecture: reduced bandwidth use, latency reduction, cost savings, and improved security and reliability, especially when supporting next-gen technology like 5G.

We detail each of these four benefits below:

  • Reduced bandwidth use: Installing an edge device on or near the data source would allow most of the data to be stored and processed there. This cuts down on the amount of data being transferred to the cloud, which, in turn, reduces the amount of network bandwidth being used by IoT devices.
  • Latency reduction: In an edge computing architecture, data from IoT devices doesn’t have to travel great distances to a cloud data center. This reduction in distance reduces processing delays and improves response times, which is ideal for any military, industrial or commercial application in which security, safety or manufacturing efficiency is critical.
  • Cost savings: Higher bandwidth usage costs more money. In an edge computing architecture, less bandwidth is used, which translates directly into dollars saved (a rough back-of-envelope calculation follows this list).
  • Improved security and reliability: With edge computing, data is stored and processed at multiple locations close to the source, instead of at a cloud data center, where a single cybersecurity attack or maintenance operation could disrupt the network entirely or result in a widespread release of sensitive data. If data storage and processing are decentralized to edge devices, so, too, are the ramifications of security breaches and maintenance delays. Furthermore, cyberattacks and regular maintenance would be isolated to one or two edge devices, instead of an entire data center.
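
For a sense of scale on the bandwidth and cost points above, here is a back-of-envelope calculation. Every number in it (camera count, bit rate, keep ratio, price per gigabyte) is an assumption chosen purely for illustration, not a quote from any provider.

```python
# Purely illustrative numbers: 200 cameras streaming 2 Mbit/s each, all month,
# versus an edge filter that keeps roughly 5% of the footage.
cameras = 200
mbit_per_sec_per_camera = 2
seconds_per_month = 30 * 24 * 3600
keep_ratio = 0.05
cost_per_gb = 0.05  # assumed $/GB transferred

raw_gb = cameras * mbit_per_sec_per_camera * seconds_per_month / 8 / 1000  # Mbit -> MB -> GB
edge_gb = raw_gb * keep_ratio

print(f"Raw upload to the cloud: {raw_gb:,.0f} GB/month")
print(f"With edge filtering:     {edge_gb:,.0f} GB/month")
print(f"Bandwidth cost avoided:  ${(raw_gb - edge_gb) * cost_per_gb:,.0f}/month")
```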

Consider the scenario from earlier, involving the military vehicle driver and the IoT assistant.

Because there were so many additional IoT devices transmitting data to the cloud, the base's network was temporarily overloaded, causing the driver to experience a delay in response time.

Not to mention, the data required to fulfill all those requests was being processed thousands of miles away at a cloud data center, instead of on an integrated sensor or chip, or on a server at the edge of the network.

These latency issues could have been mitigated if the data was processed using an edge computing architecture.

Instead of running a hefty stream of data to the centralized data center for storage and computation, an edge computing architecture would have allowed the other IoT devices on the network to store and process a portion of the data locally.

In addition, the base would cut down on its operational costs by using less bandwidth, as well as reduce the impact of potential security or maintenance-related issues.

Edge vs. cloud computing: Is the edge replacing the cloud?

Edge computing is unlikely to replace cloud computing entirely.

Applications and devices that don’t require real-time data processing or analysis are likely to still use the cloud for storage and processing.

As the number of IoT devices increases, so, too, will the amount of data that needs to be stored and processed.

If businesses and organizations don’t switch to an edge computing architecture, their chances of experiencing latency in applications requiring real-time computation will increase as the number of IoT devices using their networks increases. In addition, they’ll spend more money on the bandwidth necessary to transfer such data.

Edge computing is an extension, rather than a replacement, of the cloud. And as more and more devices begin to use cloud data centers as a processing resource, it’s clear that edge computing is the future, at least if you want your program or application to function seamlessly, efficiently and affordably.

For more information about acquiring edge computing solutions for your program or application, reach out to Trenton Systems. Our engineers are on standby.

Speak With Our Team Today
