Technology (Innovation For Education)

The Age of Universal Access

Today, we live in a world that is largely defined by how well “connected” we are, because nearly everything we do involves “being online.” This is a relatively recent development. Just a few years ago, we thought about access to online resources in very specific ways. For example, “getting online” was an intentional activity — something we had to make a special effort to do. It required thinking about where to get online, when it might be possible and how to go about it. “Being online” was very much the exception to our normal situation, because nearly all of our time was spent not being connected to anything except an occasional cell-phone call.

But technology advances rapidly, and soon we began to talk in terms of anywhere/anytime access. The Internet was coming within easier reach.

Today, nearly all of us are connected to one another in real time. We have moved into the era of universal access. And as a result, much of how we think about the way we live and work has also changed.

What was the journey that brought us here? What shifts have occurred in the challenges we face as a result of this transformation? In what ways have our society, our culture and even our daily lives been changed as a result?

From the Start

The World Wide Web (WWW) turned the Internet into a gateway to a wealth of digital information that was previously available only through printed materials and analog media. Before the WWW, you had to go find the information you sought and access it physically (picture the traditional library). That model quickly gave way to a new one: here is the information — get on the Internet and you can access it wherever it is, from wherever you are.

Finding the information you sought was made possible by browsers that accessed search engines. The first search engines appeared in the early ’90s, and names like Infoseek, AltaVista, and Yahoo! soon became well known. Mosaic, and then Netscape, were the first widely recognized browsers; today’s browsers are too numerous to name. Google has not only become an enormous technology company, but also a verb embedded in our lexicon.

Internet access began in earnest with subscription services like America Online (AOL), but soon shifted to Internet service providers (ISPs) reached via telephone dial-up and, later, cable. These connected people to the WWW directly, without the need for online services such as AOL. Faster access came to be known as broadband, but it was initially limited in availability and quite expensive.

More recently, key developments have substantially altered our daily realities. First, the long-sought convergent device became a reality. Many of us remember well needing to carry a cellphone, camera, GPS unit, music player, PDA and other individual devices. Today’s smartphones incorporate substantial computing capabilities, along with nearly any other function we might want or need. Think of a need, and soon “there’s an app for that!” Additionally, smartphones are now connected to powerful cloud services that let us stay connected to our social networks and data sources.

Another key development was commodity broadband networks. Essentially, broadband can be thought of as network speed fast enough to download large files and to stream digital video comfortably. Broadband to our homes was great when it arrived. But today 4G LTE (short for Long-Term Evolution) is increasingly the standard for high-speed wireless communication on smartphones and tablets.
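To put “fast enough” in perspective, here is a minimal illustrative sketch of how connection speed translates into download time for a movie-sized file. The speed figures are rough, representative assumptions, not values from the article:

```python
# Illustrative only: rough, assumed speeds for three eras of access.
SPEEDS_MBPS = {
    "56k dial-up modem": 0.056,    # late-1990s telephone access
    "early cable broadband": 5.0,  # early consumer broadband
    "4G LTE": 50.0,                # a typical modern wireless speed
}

def download_minutes(file_megabytes: float, speed_mbps: float) -> float:
    """Minutes to download a file at a given speed (megabits per second)."""
    file_megabits = file_megabytes * 8  # 1 byte = 8 bits
    return file_megabits / speed_mbps / 60

# A 700 MB file, roughly a full-length movie at standard quality:
for name, mbps in SPEEDS_MBPS.items():
    print(f"{name:>22}: {download_minutes(700, mbps):8.1f} minutes")
```

At dial-up speeds that file takes over a day to arrive; over early cable, under 20 minutes; over LTE, about two minutes. That gap is the practical meaning of broadband.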

In the 1997 movie Contact, Palmer Joss (played by Matthew McConaughey) pondered whether humans were actually better off with all the technology we had developed. We might wonder the same today, in a world that enables us to be nearly universally connected 24 hours a day. The answer likely depends upon individuals, the way they utilize these technologies and whether they control the technologies — or allow the technologies to control them. Ultimately, we determine the relationship we have with these technologies, and how intrusive they become in our lives. We also determine how they can be used effectively for important purposes, including education.

The Connected Campus

In higher education, visionary faculty members are revolutionizing the way students learn by incorporating students’ use of powerful technologies, including network-enabled smartphones and tablets. This leap forward is based on an emerging understanding of education as a shared, collaborative and active process that is best served when driven by the learner. Empowered by these technologies, students can access course materials, digital media and other resources; collaborate with dynamic learning communities of their peers; and benefit from the many other capabilities these tools enable.

Rather than fearing these technologies and viewing them as intrusions into the lecturer-pupil relationship, most faculty focus on the fundamental goal of successful learning outcomes instead of ideologically preserving millennia-old processes. As a result, they have adapted, integrating these technologies to help students learn more effectively.

Of course, numerous faculty still subscribe to the alternate view; witness the “teaching naked” movement that lives on in some institutions. But students today are powerful forces in two ways. First, they are known as “digital natives” because they were born into a world of universally available technologies and have used them their entire lives. They are perfectly comfortable using technology, to the point that its absence is the anomaly. Second, today’s students and their parents are consumers whose tuition dollars enable institutions to operate (and survive). The growing integration of technology into teaching and learning is inevitable, irrevocable and highly beneficial. That is particularly the case when judged by the central question of benefit to students themselves.

Throughout our evolution, humans have adapted to our environment. We have shaped it, and in turn have been shaped by it. Various technologies have been integral to this process, and have evolved along with us. While some might prefer to think of the “pure” state of education through an image of Plato lecturing at the Academy in Athens, that is a simplistic, ideological and arbitrary choice. Humans have been engaged in learning, teaching and passing knowledge across generations since the days of our earliest ancestors.

Fundamentally, it was the human ability to adapt that enabled us to survive, and the technologies we developed have been integral to that survival. In many ways, technology is one of our greatest achievements. Technology itself is neutral: it has equal capacity for good and for evil, and the difference lies entirely in how humans use it. One point we can perhaps all agree on is this: education is among the most important of all human and societal activities. As H.G. Wells wrote, “Human history becomes more and more a race between education and catastrophe.”

Technologies such as those that provide universal access are incredibly powerful tools that can help us meet the needs of our nation and our world — if we embrace them and utilize them effectively. Two central considerations must guide how we assess the success of our institutions: student learning outcomes, and the kind of citizens our graduates become. In a rapidly evolving world in which learning and continuous relearning are vitally important, as Pulitzer Prize-winning author Tom Friedman has noted, universal-access technologies and tools are becoming far more than an enabler of learning. They are becoming a fundamental determinant of whether we can sustain an educated society.
