Trends Toward Machine Learning and Time-Sensitive Networking
I see two major trends that will affect the wired and wireless network over the next three to five years. The first is machine learning: the ability of computers to learn without being explicitly programmed. It is a broad topic that is often linked with artificial intelligence and other advanced concepts.
Within the wired and wireless network there are many components, such as access points and switches, that not only move traffic from the edge of the network to other devices or to data-center-based or cloud-based applications but also capture information about the network equipment, the endpoints and the applications.
Machine learning applications capture all of this network information and look for trends. These trends support preventive maintenance: the application monitors everything from packet loss across the network to equipment temperature, compares current behavior against historical models of failure, and flags equipment that is likely to fail. For campus administrators, machine learning means their technology investment lasts longer between necessary upgrades. It also means greater reliability of the technology that supports facilities operations and classroom learning. Additionally, machine-learning applications monitor thousands of pieces of information and dynamically change equipment parameters to maintain better performance for students accessing resources across the network in the library or in the classroom.
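
To make the preventive-maintenance idea concrete, here is a minimal sketch in Python of the kind of check such an application might run. The metric names, baseline values and threshold are hypothetical illustrations, not taken from any particular product.

    from statistics import mean, stdev

    # Hypothetical historical telemetry for one class of network equipment.
    HISTORY = {
        "temperature_c": [41.0, 42.5, 40.8, 43.1, 42.0],
        "packet_loss_pct": [0.2, 0.4, 0.3, 0.5, 0.3],
    }

    def flag_equipment(current, history, z_threshold=3.0):
        """Return the metrics whose current reading deviates sharply from
        the historical baseline (a very simple anomaly check)."""
        flagged = []
        for metric, samples in history.items():
            mu, sigma = mean(samples), stdev(samples)
            if sigma == 0:
                continue
            z = abs(current[metric] - mu) / sigma
            if z > z_threshold:
                flagged.append(metric)
        return flagged

    # A switch running hot with rising packet loss gets flagged for
    # preventive maintenance before it fails outright.
    print(flag_equipment({"temperature_c": 55.0, "packet_loss_pct": 2.5}, HISTORY))

A production system would learn these baselines from historical failure data rather than hard-coding them, but the flow is the same: collect telemetry, compare it with the past, and flag the outliers.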
And while the example above focuses on networking, similar machine learning principles can be applied to other technologies. For example, sensors in campus buildings can determine that there are no occupants and turn off the lights on a specific floor or adjust the temperature. These changes may not be time-specific but situation-specific, based on multiple sensor inputs.
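
A situation-specific rule of that kind is easy to picture in code. The sketch below, again in Python, uses hypothetical sensor inputs and is only meant to show the idea of combining several signals instead of following a fixed schedule.

    def should_turn_off_lights(motion_events_last_30min,
                               badge_ins_on_floor,
                               scheduled_class_in_session):
        """Power down only when every available signal says the floor is
        empty, rather than at a fixed time of day."""
        return (motion_events_last_30min == 0
                and badge_ins_on_floor == 0
                and not scheduled_class_in_session)

    # No motion, no badge-ins, no class in session: turn the lights off.
    print(should_turn_off_lights(0, 0, False))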
The second trend is time-sensitive networking. Today, enterprise wired and wireless networks use protocols that are “best effort,” meaning that there is no specific guarantee of when data will move from the device that sent it to its destination. This latency shows up as the jitter you might see on the computer monitor when streaming a lesson, or the delay you might hear in a phone conversation on your smartphone. Historically, if we wanted a highly reliable, low-latency network, it required special, expensive and proprietary equipment. New advances in time-sensitive networking are enabling commercial, off-the-shelf equipment to address the deterministic requirements of applications. While, commercially, this means a better audio/video experience, it also has big quality-of-service implications for applications that need to be prioritized, such as university research.
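
To show what “best effort” looks like in practice, here is a short Python sketch that estimates jitter from packet arrival times. The timestamps are invented for the example; real measurement tools report comparable figures, and a deterministic, time-sensitive network would drive this number toward zero.

    def mean_jitter_ms(arrival_times_ms):
        """Average variation between successive inter-arrival gaps.
        Constant gaps (a deterministic network) yield a value near zero."""
        gaps = [b - a for a, b in zip(arrival_times_ms, arrival_times_ms[1:])]
        variation = [abs(b - a) for a, b in zip(gaps, gaps[1:])]
        return sum(variation) / len(variation)

    # Packets sent every 20 ms but arriving unevenly over a best-effort network.
    arrivals = [0.0, 21.5, 39.0, 62.0, 80.5, 104.0]
    print(f"{mean_jitter_ms(arrivals):.1f} ms of jitter")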
About the Author
Tim Zimmerman is research vice president at Gartner, Inc. in Stamford, CT.