Impact of Technology on Communication Essay

Contents: Introduction; Advancement of Technology in Communication; Media Technology and Online Communication; The Impacts of the Mobile Phone on Communication; Reference List.

The realm of technology is ever-changing. New advances in applied science have forever transformed the way people interact. Exploring the impact of technology on communication and debating whether people connect with others differently seems to be the topic of the day.

Technology has allowed people to keep in touch no matter the distance. One can now communicate at an interpersonal level 24 hours a day, seven days a week, 365 days a year.

What are the real impacts of technology on communication? How do electronics mediate and change the ways in which humans interact? How has the emergence of the Internet, mobile phones, and social networks affected society and businesses?

In order to reveal the importance of technology in communication, the essay tries to find answers to these questions. It explores how everything has changed over the years and discusses the connection between technology and communication.

To begin this examination and find answers to these questions, we start by defining media and communication and outlining the stages of technological advancement in the field of communication, from earlier eras to the present day. The paper will highlight the use of the Internet, newspapers, radio, and other media, but it dwells mostly on mobile telephony.

Communication is “the imparting or exchange of information by speaking, writing or using some other medium” (Daniel & Rod, 2011). On the other hand, media is defined as “the main means of mass communication (television, radio, and newspapers) regarded collectively.”

Technology has changed everything in modern society. The way we communicate has been revolutionized by new innovations in the telecommunication sector. Connecting with other people is more feasible in today's world than ever before, largely because of the speed these innovations afford.

Several centuries ago, books and newspapers reigned as the only media of mass communication. Later, innovators brought the radio and television, before innovation was taken a notch higher with the coming of the personal computer (Johnson, 1997, p.3).

With every new innovation, reliance on books and newspapers as the mass medium of communication continued to decline. With time, human culture has come to understand the power and the mechanisms involved in technology and invention. In today's world, information has permeated the cycles of change and development.

The world, past and present, can now be studied with ease thanks to the growing reach of information technology. Technology has advanced with such velocity that different media now shape our thinking and habits. People born during the television era thought it was the climax of innovation, but they suddenly found themselves acclimating to a new medium, the World Wide Web.

Every time a new medium rolls out, perceptions of the previous media change (Johnson, 1997, p.5). Technology has proved so powerful that no human being can predict with certainty what will change and what will not.

The irony is that the influence of technology extends to generations to come. There is no doubt that technology has changed the lives of human beings; information and entertainment are now received in far more convenient ways.

The innovation of holding a conversation through a device called the telephone changed communication entirely. At the time it seemed magical, and few could believe such an innovation existed (Tofts, 1997, p.40).

With the emergence of new media technologies, consumers have been empowered to ‘filter’ the information they want to receive. This allows them to have a choice of which news to watch or what information to listen to (Palmer, 2003, p.161).

Media consumption has become an engaging experience, with marketers studying consumer preferences in order to reflect broader social changes in society. In today's world, the computer is seen as a multi-purpose machine with both work and leisure functions, thereby creating more value.

The rise of the Internet has also made it possible to have virtual offices where the user can work from home or any convenient location. The flow of information from different media has greatly changed the social structures of society at different levels (Barry, 1999).

Digital media has enabled news and events to be channeled in real time. The combination of the Internet and commerce has given birth to e-commerce sites, providing huge potential for marketers to reach out to virtual communities.

In the world today, there are numerous media screens in our surroundings. These range from the television sets in our houses and the computer monitors at the office to the mobile phones and MP3 players in our pockets and handbags.

Even when shopping or waiting to board a plane, you’re most probably staring at screens with entertainment media (Soukup, 2008, p.5). Heavy marketing has been adopted by producers of mobile technologies targeting consumers who possess mobile phones with picture and video capacity (Goggin, 2006, p.170).

Producers of media texts have termed mobile media a "third screen," a device that consumers carry around with ease. Unlike television screens, personal computers and mobile phones have been integrated into broader communication networks (Goggin, 2006, p.9).

Trains, buses, and airplanes have been dominated by mobile screens providing passengers with entertainment as well as other media content, especially advertisements (Caron & Caronia, 2007, p.17). With so much commercial media content, people's preferences change in their everyday lives.

The world of popular media has become chaotic, with hundreds of television channels to choose from, thousands of songs ready for download, and millions of web pages to surf.

The emergence of social media like Facebook and Twitter has enabled people to manage interactions and relationships with many friends. Technologies have impacted interpersonal communication, enabling people to interact more often than before.

In addition to reducing the distance between people, online communication with tools like Facebook and Twitter enables people to keep track of their contacts and stay aware of when they last interacted with them. Online communication now incorporates more than one mode of contact, including text, voice, and, through video, even body language.

A mobile phone is a device that has always been seen as connecting people who are far apart, overcoming the geographical distance between them. The number of mobile phone users has continued to increase substantially. The mobile phone has become part of people's lives in the sense that it is available and easy to use, keeping us connected to our families, friends, and business contacts (Ling, 2004, pp.21-24).

How and when we use our mobile phones affects our communication not only with those we are talking to but also with the people in our proximity. At this point, it is paramount to note the changes that have taken place and that have allowed the adoption of mobile phones. The tremendous proliferation of this device has drastically changed the traditional communication model.

Who are the users of mobile phones, and for what purposes do they use them? Has there been any change in the way the mobile phone facilitates communication? How has face-to-face interaction been affected by mobile calls? Has mobile communication enhanced relationships?

These are some of the questions that arise when we try to fathom how mobile communication has affected our personal and professional lives. There are sentiments that mobile phones have reduced humans to emotionless beings.

There is no doubt that the mobile phone has brought about a revolution in the way we communicate. There have been different perceptions among individuals and social levels in society with regard to mobile usage.

When fixed telephone lines were housed in booths, telephones were seen as business tools only, placed in fixed, quiet environments, and teenagers' use of them was restricted (Agar, 2003). The "birth" of mobile phones brought changes, and phone calls became a habit for many, irrespective of age or location.

Today, people can use mobile phones wherever they are, in private or in public. People have become more attached to their mobile phones than to any other gadget, with the device remaining on around the clock. Its portability enables people to carry it wherever they go (Castells, 1996).

A personal virtual network has been created whereby users can be available at all times to communicate with friends, family, and colleagues. The geographical barrier has been broken down, making people feel close to one another, and face-to-face communication has been rendered rather less important by this mediated communication (Ling, 2004, p.22).

Some meetings and briefings have become obsolete, with communication being mediated by a computer or a phone. SMS (short message service) and the Internet have become the preferred communication channels for most teenagers and young people all over the world (Plant, 2000, p.23).

There are places where mobile phones have become taboo devices, such as churches and crucial corporate meetings, where a ringing phone is seen as a nuisance. In other scenarios, the device is seen as disruptive by acting as a third party, especially for dating couples who want to have a private conversation.

Any phone ring is seen as an "intruder," and this harms the relationship between the partners (Plant, 2000, p.29). In his research, Plant observes that there are people who use the mobile as "a means of managing privacy where calls are carefully selected." He categorizes this group of people as "hedgehogs."

The other category comprises people who make the mobile phone the central part of their life. They become so attached to the device that they cannot do without it; Plant referred to this group as "foxes." They are regular users who need to feel connected with their families and friends, and life without the device would feel dreadful to them (2000, p.32).

Mobile phones have promoted text messaging, allowing people to communicate both verbally and in writing in a more convenient and efficient way. SMS has made communication more immediate, and users can easily customize messages with the various applications installed on their mobiles (Ling, 2004, p.100).

Advanced phones support email as well as multimedia messages, making chatting a lifestyle for many who conduct business and those initiating intimate communication. It has emerged that SMS has made people more united.

Users have developed abbreviated messages, which are now widely accepted as a language of their own. The phone's initial purpose of making calls has even lost its appeal for many people, especially the younger generation.

According to Reid and Reid (2004, p.1), more than 85% of teenagers prefer texting to talking in their mobile usage. Texting offers an ease of communication in the sense that some formalities are eliminated, making communication more personal.

Texting has helped introverts who may lack the skills for phone conversations, allowing them to express their true selves to other people, leading to greater understanding and stronger relationships (Reid & Reid, 2004, p.8).

The use of mobile technology has affected the personalities of people to a great extent. Today, more people are hiding their feelings and whereabouts behind mobile phones, and this has raised suspicions among families, friends, and couples.

People go through the text messages of others just to find out more about an individual who might have no clue about what is happening. Contrary to this, many people believe that the mobile phone is crucial in enhancing relationships regardless of distance and that it bonds us together more than it separates us (Plant, 2000, p.58).

The usage of mobile phones by children and teenagers has changed the way parents bring up their kids. Parenting has really changed as parents try to increase their surveillance and monitor their children’s mobile usage.

Their concern is to know who communicates with their kids and the kind of conversations they normally have. They worry about the kind of social network their children create in their contact lists.

With the emergence of virtual communities, the influence of mobile phones has spilled over and affects parenting in general. Nonetheless, the primary purpose of mobile phones to facilitate communication has not changed.

There is no doubt that technology has changed the way humans communicate. Great impacts can be seen in the way communication has changed the social structures of our society at all levels. Even in years to come, technology will remain the driving force behind the way people interact.

The advancement of technology ensures that communication is quicker and that more people remain connected. There has been an evolution in interpersonal skills with the advancement of technology, and users should always be keen on adapting to new ways of communication.

Technology has continually brought new methods of communication, leading to the expansion of mediated communication. The reality of having one message shared across a huge audience (mass communication) is now with us, a situation where neither time nor geography limits the accessibility of information.

We have seen newspapers and books merge with computer technology so that the frequency and ease of reporting information and advertisements can be increased. The exposure of both individuals and society to mediated communication has therefore affected our daily lives, particularly our culture and the way we communicate.

Agar, J., 2003. Constant Touch: A Global History of the Mobile Phone. Cambridge: Icon Books.

Barry, W., 1999. Networks in the Global Village. Boulder, Colo.: Westview Press.

Caron, A. & Caronia, L., 2007. Moving Cultures: Mobile Communication in Everyday Life. Montreal: McGill-Queen's University Press.

Castells, M., 1996. The Information Age: Economy, Society and Culture, Volume 1: The Rise of the Network Society. Oxford: Blackwell.

Daniel, C. & Rod, M., 2011. The Dictionary of Media and Communications. Oxford: Oxford University Press.

Goggin, G., 2006. Cell Phone Culture: Mobile Technology in Everyday Life. New York: Routledge.

Johnson, S., 1997. Interface Culture: How New Technology Transforms the Way We Create and Communicate. San Francisco: Basic Books.

Ling, R., 2004. The Mobile Connection: The Cell Phone's Impact on Society. San Francisco: Morgan Kaufmann.

Palmer, D., 2003. 'The Paradox of User Control'. 5th Annual Digital Arts and Culture Conference (Proceedings), pp.160-164.

Plant, S., 2000. On the Mobile: The Effects of Mobile Telephones on Social and Individual Life. Web.

Postman, N., 1992. Technopoly: The Surrender of Culture to Technology. New York: Vintage Books.

Reid, D. J. & Reid, F. J. M., 2004. Insights into the Social and Psychological Effects of SMS Text Messaging. Web.

Soukup, C., 2008. 'Magic Screens: Everyday Life in an Era of Ubiquitous and Mobile Media Screens', presented at the 94th Annual Convention, San Diego.

Tofts, D., 1997. 'The Technology Within', in Memory Trade: A Prehistory of Cyberculture. North Ryde: 21C Books.



The Digital Revolution: How Technology is Changing the Way We Communicate and Interact

This article examines the impact of technology on human interaction and explores the ever-evolving landscape of communication. With the rapid advancement of technology, the methods and modes of communication have undergone a significant transformation. This article investigates both the positive and negative implications of this digitalization. Technological innovations, such as smartphones, social media, and instant messaging apps, have provided unprecedented accessibility and convenience, allowing people to connect effortlessly across distances. However, concerns have arisen regarding the quality and authenticity of these interactions. The article explores the benefits of technology, including improved connectivity, enhanced information sharing, and expanded opportunities for collaboration. It also discusses potential negative effects including a decline in in-person interactions, a loss of empathy, and an increase in online anxiety. This article tries to expand our comprehension of the changing nature of communication in the digital age by exposing the many ways that technology has an impact on interpersonal interactions. It emphasizes the necessity of intentional and thoughtful communication techniques to preserve meaningful connections in a society that is becoming more and more reliant on technology.

Introduction:

Technology has significantly transformed our modes of communication and interaction, revolutionizing the way we connect with one another over the past few decades. The COVID-19 pandemic acted as a catalyst, expediting this transformative process and forcing a near-exclusive reliance on digital tools for socializing, working, and learning. Platforms like social media and video conferencing have emerged in recent years, expanding our options for virtual communication. The impact of these changes on our lives cannot be ignored. In this article, we will delve into the ways in which technology has altered our communication and interaction patterns and explore the consequences of these changes for our relationships, mental well-being, and society.

To gain a deeper understanding of this topic, I have conducted interviews and surveys, allowing us to gather firsthand insights from individuals of various backgrounds. Additionally, we will compare this firsthand information with the perspectives shared by experts in the field. By drawing on both personal experiences and expert opinions, we seek to provide a comprehensive analysis of how technology influences our interpersonal connections. Through this research, we hope to get a deeper comprehension of the complex interactions between technology and people, enabling us to move mindfully and purposefully through the rapidly changing digital environment.

The Evolution of Communication: From Face-to-Face to Digital Connections:

In the realm of communication, we have various mediums at our disposal, such as face-to-face interactions, telephone conversations, and internet-based communication. According to Nancy Baym, an expert in the field of technology and human connections, face-to-face communication is often regarded as the most personal and intimate, while the phone provides a more personal touch than the internet. She explains this in her book Personal Connections in the Digital Age by stating, “Face-to-face is much more personal; phone is personal as well, but not as intimate as face-to-face… Internet would definitely be the least personal, followed by the phone (which at least has the vocal satisfaction) and the most personal would be face-to-face” (Baym 2015).  These distinctions suggest that different communication mediums are perceived to have varying levels of effectiveness in conveying emotion and building relationships. This distinction raises thought-provoking questions about the impact of technology on our ability to forge meaningful connections. While the internet offers unparalleled convenience and connectivity, it is essential to recognize its limitations in reproducing the depth of personal interaction found in face-to-face encounters. These limitations may be attributed to the absence of nonverbal cues, such as facial expressions, body language, and tone of voice, which are vital elements in understanding and interpreting emotions accurately.

Traditionally, face-to-face interactions held a prominent role as the primary means of communication, facilitating personal and intimate connections. However, the rise of technology has brought about significant changes, making communication more convenient but potentially less personal. The rise of phones, instant messaging, and social media platforms has revolutionized how we connect with others. While these digital tools offer instant connectivity and enable us to bridge geographical distances, they introduce a layer of blockage that may impact the depth and quality of our interactions. It is worth noting that different communication mediums have their strengths and limitations. Phone conversations, for instance, retain a certain level of personal connection through vocal interactions, allowing for the conveyance of emotions and tones that text-based communication may lack. However, even with this advantage, phone conversations still fall short of the depth and richness found in face-to-face interactions, as they lack visual cues and physical presence.

Internet-based communication, on the other hand, is considered the least personal medium. Online interactions often rely on text-based exchanges, which may not fully capture the nuances of expression, tone, and body language. While the internet offers the ability to connect with a vast network of individuals and share information on a global scale, it may not facilitate the same depth and authenticity that in-person or phone conversations can provide. As a result, establishing meaningful connections and building genuine relationships in an online setting can be challenging. Research and observations support these ideas. Figure 1, titled "Social Interaction vs. Electronic Media Use," shows the potential impact of electronic media on social interaction (source: ResearchGate). This research highlights the need to carefully consider the effects of technology on our interpersonal connections. While technology offers convenience and connectivity, it is essential to strike a balance, ensuring that we do not sacrifice the benefits of face-to-face interactions for the sake of digital convenience.

Figure 1: Hours per day of face-to-face social interaction decline as use of electronic media increases; increased reliance on electronic media has led to a noticeable decrease in social interaction (Karunaratne et al., 2011).

The Limitations and Effects of Digital Communication

In today’s digital age, the limitations and effects of digital communication are becoming increasingly evident. While the phone and internet offer undeniable benefits such as convenience and the ability to connect with people regardless of geographical distance, they fall short in capturing the depth and richness of a face-to-face conversation. The ability to be in the same physical space as the person we’re communicating with, observing their facial expressions, body language, and truly feeling their presence, is something unique and irreplaceable.

Ulrike Schultze, in her thought-provoking TED Talk titled “How Social Media Shapes Identity,” delves further into the impact of digital communication on our lives by stating, “we construct the technology, but the technology also constructs us. We become what technology allows us to become” (Schultze 2015). This concept highlights how our reliance on digital media for interaction has led to a transformation in how we express ourselves and relate to others.

The influence of social media has been profound in shaping our communication patterns and interpersonal dynamics. Research conducted by Kalpathy Subramanian (2017) examined the influence of social media on interpersonal communication, highlighting the changes it brings to the way we interact and express ourselves (Subramanian 2017). The study found that online communication often involves the use of abbreviations, emoticons, and hashtags, which have become embedded in our online discourse. These digital communication shortcuts prioritize speed and efficiency, but they also contribute to a shift away from the physical action of face-to-face conversation, where nonverbal cues and deeper emotional connections can be fostered.

Additionally, the study emphasizes the impact of social media on self-presentation and identity construction. With the rise of platforms like Facebook, Instagram, and Twitter, individuals have a platform to curate and present themselves to the world. This online self-presentation can influence how we perceive ourselves and how others perceive us, potentially shaping our identities in the process. The study further suggests that the emphasis on self-presentation and the pressure to maintain a certain image on social media can lead to increased stress and anxiety among users.

Interviews:

I conducted interviews with individuals from different age groups to gain diverse perspectives on how technology and social media have transformed the way we connect with others. By exploring the experiences of a 21-year-old student and an individual in their 40s, we can better understand the evolving dynamics of interpersonal communication in the digital age. These interviews shed light on the prevalence of digital communication among younger generations, their preference for convenience, and the concerns raised by individuals from older age groups regarding the potential loss of deeper emotional connections.

When I asked the 21-year-old classmate about how technology has changed the way they interact with people in person, they expressed, “To be honest, I spend more time texting, messaging, or posting on social media than actually talking face-to-face with others. It’s just so much more convenient.” This response highlights the prevalence of digital communication among younger generations and their preference for convenience over traditional face-to-face interactions. It suggests that technology has significantly transformed the way young people engage with others, with a greater reliance on virtual interactions rather than in-person conversations. Additionally, the mention of convenience as a driving factor raises questions about the potential trade-offs in terms of depth and quality of interpersonal connections.

To gain insight from an individual in their 40s, I conducted another interview. When asked about their experiences with technology and social media, they shared valuable perspectives. They mentioned that while they appreciate the convenience and accessibility offered by technology, they also expressed concerns about its impact on interpersonal connections. They emphasized the importance of face-to-face interactions in building genuine relationships and expressed reservations about the potential loss of deeper emotional connections in digital communication. Additionally, they discussed the challenges of adapting to rapid technological advancements and the potential generational divide in communication preferences.

Comparing the responses from both interviews, it is evident that there are generational differences in the perception and use of technology for communication. While the 21-year-old classmate emphasized convenience as a primary factor in favor of digital communication, the individual in their 40s highlighted the importance of face-to-face interactions and expressed concerns about the potential loss of meaningful connections in the digital realm. This comparison raises questions about the potential impact of technology on the depth and quality of interpersonal relationships across different age groups. It also invites further exploration into how societal norms and technological advancements shape individuals’ preferences and experiences.

Overall, the interviews revealed a shift towards digital communication among both younger and older individuals, with varying perspectives. While convenience and connectivity are valued, concerns were raised regarding the potential drawbacks, including the pressure to maintain an idealized online presence and the potential loss of genuine connections. It is evident that technology and social media have transformed the way we communicate and interact with others, but the interviews also highlighted the importance of maintaining a balance and recognizing the value of face-to-face interactions in fostering meaningful relationships.

I have recently conducted a survey with my classmates to gather insights on how technology and social media have influenced communication and interaction among students in their daily lives. Although the number of responses is relatively small, the collected data allows us to gain a glimpse into individual experiences and perspectives on this matter.

One of the questions asked in the survey was how often students rely on digital communication methods, such as texting, messaging, or social media, in comparison to engaging in face-to-face conversations. The responses indicated a clear trend towards increased reliance on digital communication, with 85% of participants stating that they frequently use digital platforms as their primary means of communication. This suggests a significant shift away from traditional face-to-face interactions, highlighting the pervasive influence of technology in shaping our communication habits.

Furthermore, the survey explored changes in the quality of interactions and relationships due to the increased use of technology and social media. Interestingly, 63% of respondents reported that they had noticed a decrease in the depth and intimacy of their connections since incorporating more digital communication into their lives. Many participants expressed concerns about the difficulty of conveying emotions effectively through digital channels and the lack of non-verbal cues that are present in face-to-face interactions. It is important to note that while the survey results provide valuable insights into individual experiences, they are not representative of the entire student population. The small sample size limits the generalizability of the findings. However, the data collected does shed light on the potential impact of technology and social media on communication and interaction patterns among students.

Expanding on the topic, I found an insightful figure from Business Insider that sheds light on how people utilize their smartphones. Figure 2 illustrates the average smartphone owner's daily time spent on various activities. Notably, communication activities such as texting, talking, and social networking account for a significant portion of phone usage, comprising 59%. This data reinforces the impact of digital communication on our daily lives, indicating the substantial role it plays in shaping our interactions with others. Comparing this research with the data I have gathered, a clear trend emerges: an increasing number of individuals primarily use their smartphones for communication and interaction.

Figure 2: The breakdown of daily smartphone usage among average users clearly demonstrates that the phone is primarily used for interactions.

The Digital Makeover:

In today’s digital age, the impact of technology on communication and interaction is evident, particularly in educational settings. As a college student, I have witnessed the transformation firsthand, especially with the onset of the COVID-19 pandemic. The convenience of online submissions for assignments has led to a growing trend of students opting to skip physical classes, relying on the ability to submit their work remotely. Unfortunately, this shift has resulted in a decline in face-to-face interactions and communication among classmates and instructors.

The decrease in physical attendance raises concerns about the potential consequences for both learning and social connections within the academic community. Classroom discussions, collaborative projects, and networking opportunities are often fostered through in-person interactions. By limiting these experiences, students may miss out on valuable learning moments, diverse perspectives, and the chance to establish meaningful connections with their peers and instructors.

Simon Lindgren, in his thought-provoking TED Talk, "Media Are Not Social, but People Are," delves deeper into the effects of technology and social media on our interactions. Lindgren highlights a significant point by suggesting that while technology may have the potential to make us better individuals, we must also recognize its potential pitfalls. Social media, for instance, can create filter bubbles that limit our exposure to diverse viewpoints, making us less in touch with reality and more narrow-minded. This cautionary reminder emphasizes the need to approach social media thoughtfully, seeking out diverse perspectives and avoiding the pitfalls of echo chambers. Furthermore, it is crucial to strike a balance between utilizing technology for educational purposes and embracing the benefits of in-person interactions. While technology undoubtedly facilitates certain aspects of education, such as online learning platforms and digital resources, we must not overlook the importance of face-to-face communication. In-person interactions allow for nuanced non-verbal cues, deeper emotional connections, and real-time engagement that contribute to a more comprehensive learning experience.

A study conducted by Times Higher Education delved into this topic, providing valuable insights. Figure 3, from the study, illustrates a significant drop in attendance levels after the pandemic's onset. Undeniably, technology played a crucial role in facilitating the transition to online learning. However, it is important to acknowledge that this shift has also led to a decline in face-to-face interactions, which have long been regarded as essential for effective communication and relationship-building. While technology continues to evolve and reshape the educational landscape, it is imperative that we remain mindful of its impact on communication and interaction. Striking a balance between digital tools and in-person engagement can help ensure that we leverage the benefits of technology while preserving the richness of face-to-face interactions. By doing so, we can foster a holistic educational experience that encompasses the best of both worlds and cultivates meaningful connections among students, instructors, and the academic community.


Figure 3:  This graph offers convincing proof that the COVID-19 pandemic and the extensive use of online submission techniques are to blame for the sharp reduction in in-person student attendance.

When asked about the impact of online assignment submission on physical class attendance, the survey revealed mixed responses. While 73% of participants admitted that the convenience of online submission has led them to skip classes occasionally, 27% emphasized the importance of in-person attendance for better learning outcomes and social interaction. This finding suggests that while technology offers convenience, it also poses challenges to maintaining regular face-to-face interaction, potentially hindering educational and social development and, in particular, changing the way we communicate and interact with one another. Students adopt these habits from a young age, and the effects become fully apparent once they try to enter the workforce and must interact with others. Examining the survey data alongside the findings from Times Higher Education reveals striking similarities in how students approach attending classes in person: both point to a massive decrease in attendance, which reduces the chances for real-life interaction and communication. Moreover, the convenience and instant gratification provided by technology can create a sense of detachment and impatience in interpersonal interactions. Online platforms allow for quick and immediate responses, and individuals can easily disconnect or switch between conversations. This can result in a lack of attentiveness and reduced focus on the person with whom one is communicating, leading to a superficial engagement that may hinder the establishment of genuine connections.

Conclusion:

Ultimately, the digital revolution has profoundly transformed the way we communicate and interact with one another. The COVID-19 pandemic has accelerated this transformation, leading to increased reliance on digital tools for socializing, working, and learning. While technology offers convenience and connectivity, it also introduces limitations and potential drawbacks. The shift towards digital communication raises concerns about the depth and quality of our connections, as well as the potential loss of face-to-face interactions. However, it is essential to strike a balance between digital and in-person engagement, recognizing the unique value of physical presence, non-verbal cues, and deeper emotional connections that face-to-face interactions provide. By navigating the digital landscape with mindfulness and intentionality, we can harness the transformative power of technology while preserving and nurturing the essential elements of human connection.

Moving forward, it is crucial to consider the impact of technology on our relationships, mental well-being, and society. As technology continues to evolve, we must be cautious of its potential pitfalls, such as the emphasis on self-presentation, the potential for increased stress and anxiety, and the risk of forgetting how to interact in person. Striking a balance between digital and face-to-face interactions can help ensure that technology enhances, rather than replaces, genuine human connections. By prioritizing meaningful engagement, valuing personal interactions, and leveraging the benefits of technology without compromising the depth and quality of our relationships, we can navigate the digital revolution in a way that enriches our lives and fosters authentic connections.

References:

Ballve, M. (2013, June 5). How much time do we really spend on our smartphones every day? Business Insider. Retrieved April 27, 2023. https://www.businessinsider.com/how-much-time-do-we-spend-on-smartphones-2013-6

Baym, N. (2015). Personal Connections in the Digital Age (2nd ed.). Polity.

Karunaratne, Indika & Atukorale, Ajantha & Perera, Hemamali. (2011). Surveillance of human-computer interactions: A way forward to detection of users' Psychological Distress. 2011 IEEE Colloquium on Humanities, Science and Engineering, CHUSER 2011. 10.1109/CHUSER.2011.6163779. https://www.researchgate.net/figure/Social-interaction-vs-electronic-media-use-Hours-per-day-of-face-to-face-social_fig1_254056654

Lindgren, S. (2015, May 20). Media are not social, but people are | Simon Lindgren | TEDxUmeå . YouTube. Retrieved April 27, 2023, from https://www.youtube.com/watch?v=nQ5S7VIWE6k

Ross, J., McKie, A., Havergal, C., Lem, P., & Basken, P. (2022, October 24). Class attendance plummets post-Covid . Times Higher Education (THE). Retrieved April 27, 2023, from https://www.timeshighereducation.com/news/class-attendance-plummets-post-covid

Schultze, U. (2015, April 23). How social media shapes identity | Ulrike Schultze | TEDxSMU . YouTube. Retrieved April 27, 2023, from https://www.youtube.com/watch?v=CSpyZor-Byk

Subramanian, K. R. (2017). Influence of Social Media in Interpersonal Communication. ResearchGate. www.researchgate.net/profile/Kalpathy-Subramanian/publication/319422885_Influence_of_Social_Media_in_Interpersonal_Communication/links/59a96d950f7e9b2790120fea/Influence-of-Social-Media-in-Interpersonal-Communication.pdf. Accessed 12 May 2023.



The Role of Technology in the Evolution of Communication


For as long as humans have been on this planet, we’ve invented forms of communication—from smoke signals and messenger pigeons to the telephone and email—that have constantly evolved how we interact with each other. 

One of the biggest developments in communication came in the 1830s, when the electric telegraph was invented. While post existed as a form of communication long before, it was electrical engineering in the 19th century that had a revolutionary impact.

Now, digital methods have superseded almost all other forms of communication, especially in business. I can't remember the last time I hand-wrote a letter rather than an email at work; even my signature is digital these days. Picking up the phone is a rare occurrence too—instead, I FaceTime, Zoom, or join a Google Hangout.

When I look back at how communication has advanced over the years, it really is quite incredible…

The Telephone 

Early work on the telephone began as far back as 1849, and within 50 years of Bell's 1876 patent it was an essential item for homes and offices, though being tethered to a wall limited the device's flexibility and privacy. Then came the mobile phone. In 1973, Motorola created the first handheld mobile phone, kick-starting a chain of developments that transformed communication forever.

Early smartphones were primarily aimed towards the enterprise market, bridging the gap between telephones and personal digital assistants (PDAs), but they were bulky and had short battery lives. By 1996, Nokia was releasing phones with QWERTY keyboards and by 2010, the majority of Android phones were touchscreen-only. 


In 2007, Steve Jobs revealed the first iPhone to the world, and Apple paved the way for the aesthetics of modern smartphones. Before the iPhone, "flip phones" and phones with a split keyboard and screen were the norm. A year later, a central application store with an initial 500 downloadable "apps" was launched. Currently, there are over two million apps available in the Apple App Store.

The Internet 

Since the mid-1990s, the Internet has had a revolutionary impact on communication, including the rise of near-instant communication by electronic mail, instant messaging, voice over Internet Protocol (VoIP) telephone calls, two-way interactive video calls, discussion forums, blogs, and social networking. 

The internet has made communication easier and faster; it's allowed us to stay in contact with people regardless of time and location. It's accelerated the pace of business and widened the possibilities within the enterprise space. It's allowed people to find their voice and express themselves through social media, YouTube and memes. The internet has connected and divided us like nothing before.

Email was introduced to the wider world in 1991 alongside the early World Wide Web (although it had been operating for years before), and it has vastly changed our lives—whether for better or worse depends on your viewpoint. The first users of the messaging platform were educational systems and the military, who used email to exchange information. In 2018, there were more than 3.8 billion email users—that's more than half the planet. By 2022, it was expected that we would be sending 333 billion personal and business emails each day.

While email is invaluable and we can't imagine a world without it, tools are springing up that are giving email a run for its money. Take Slack (an acronym for "Searchable Log of All Communication and Knowledge"), for example: the company, which launched in 2014, has often been described as an email killer. However, while Slack has become the most popular chat and productivity tool in the world, used by 10 million people every day, email is still going strong. In recognition of this, Slack's upgrades have ensured that people who still rely heavily on email are not excluded from collaborative work.


Wearable Technology 

The first instance of wearable technology was a handsfree mobile headset launched in 1999, which became a piece of tech synonymous with city workers. It gave businesspeople the ability to answer calls on the go and, most importantly, while driving.

Ten years ago, the idea that you could make a video call from an item other than a phone would have been a sci-fi dream. Now, with smartwatches, audio sunglasses, and other emerging wearable technology, these capabilities are a part of our daily lives. 


Virtual Reality (VR) 

The next generation of VR has only been around since 2016, but it's already shaking up communications. The beauty of VR—presence—means you can connect to someone in the same space at the same time, without the time sink and cost of travel, even if participants are on different continents.

VR also helps to facilitate better communication. In a typical discussion, a lot of information is conveyed non-verbally, and much of this can be captured in VR. Voice tone, hesitations, and head and hand movements greatly improve the understanding of participants' emotions and intents. Plus, in VR, distractions are removed and people can be fully focused on what is happening around them. In fact, MeetinVR claims that there is a 25% increase in attention span when meeting in virtual reality compared to video conferencing.

In addition, research suggests we retain more information and can better apply what we have learned after participating in virtual reality. 3D is a natural communication language overcoming linguistic barriers as well as technical jargon. 

5G

5G, the 5th generation of mobile network, promises much faster data download and upload speeds, wider coverage, and more stable connections. These benefits will bring about significant improvements in communication. Instantaneous communication will be possible, and those patchy, frustrating video calls will be a thing of the past.

The average 4G transmission speed currently available for our smartphones is around 21 Mbps; 5G will be 100 to 1000 times faster. The Consumer Technology Association notes that at that speed, you could download a two-hour movie in just 3.6 seconds, versus 6 minutes on 4G or 26 hours on 3G. The impact of 5G will go far beyond our smartphones, as it will allow millions of devices to be connected simultaneously.
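These quoted times are simple bandwidth arithmetic: transfer time equals file size divided by link speed. The minimal sketch below reproduces the CTA's figures; note that the movie size (about 4.5 GB) and the per-generation link speeds are my own assumptions, back-calculated from the quoted times rather than taken from published measurements.

```python
# Back-of-the-envelope check of the movie-download figures quoted above.
# All numbers are illustrative assumptions, chosen so the computed times
# match the quoted 3.6 seconds / 6 minutes / 26 hours.

MOVIE_SIZE_GB = 4.5  # assumed size of a two-hour HD movie

SPEEDS_MBPS = {
    "3G": 0.385,     # implies ~26 hours for the download
    "4G": 100.0,     # implies ~6 minutes
    "5G": 10_000.0,  # 10 Gbps, implies ~3.6 seconds
}

def download_time_seconds(size_gb: float, speed_mbps: float) -> float:
    """Transfer time for size_gb gigabytes over a speed_mbps link."""
    megabits = size_gb * 1000 * 8  # 1 GB = 8,000 megabits
    return megabits / speed_mbps

for generation, mbps in SPEEDS_MBPS.items():
    seconds = download_time_seconds(MOVIE_SIZE_GB, mbps)
    if seconds < 60:
        print(f"{generation}: {seconds:.1f} seconds")
    elif seconds < 3600:
        print(f"{generation}: {seconds / 60:.0f} minutes")
    else:
        print(f"{generation}: {seconds / 3600:.0f} hours")
```

Worth noting: the 6-minute 4G figure implies a link of roughly 100 Mbps, well above the 21 Mbps average quoted above, so the comparison evidently assumes peak rather than average speeds.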

Looking ahead, there is already buzz about 6G . Although it’s still in basic research and around 15-20 years away, it’s interesting from an innovation point of view. 6G will form the framework of the connected utopia we aspire towards, and with it will come untold improvements in the speed and consistency of our communication. 

Sol Rogers


How has technology changed - and changed us - in the past 20 years?

Madeleine Hillyer

  • Since the dotcom bubble burst back in 2000, technology has radically transformed our societies and our daily lives.
  • From smartphones to social media and healthcare, here's a brief history of the 21st century's technological revolution.

Just over 20 years ago, the dotcom bubble burst, causing the stocks of many tech firms to tumble. Some companies, like Amazon, quickly recovered their value – but many others were left in ruins. In the two decades since this crash, technology has advanced in many ways.

Many more people are online today than they were at the start of the millennium. In 2000, just half of Americans had broadband access at home; today, that number sits at more than 90%.

More than half the world's population has internet access today

This broadband expansion was certainly not just an American phenomenon. Similar growth can be seen on a global scale; while less than 7% of the world was online in 2000, today over half the global population has access to the internet.

Similar trends can be seen in cellphone use. At the start of the 2000s, there were 740 million cell phone subscriptions worldwide. Two decades later, that number has surpassed 8 billion, meaning there are now more cellphones in the world than people.


At the same time, technology was also becoming more personal and portable. Apple sold its first iPod in 2001, and six years later it introduced the iPhone, which ushered in a new era of personal technology. These changes led to a world in which technology touches nearly everything we do.

Technology has changed major sectors over the past 20 years, including media, climate action and healthcare. The World Economic Forum's Technology Pioneers, which just celebrated its 20th anniversary, gives us insight into how emerging tech leaders have influenced and responded to these changes.

Media and media consumption

The past 20 years have greatly shaped how and where we consume media. In the early 2000s, many tech firms were still focused on expanding communication for work through advanced bandwidth for video streaming and other media consumption that is common today.

Others followed the path of expanding media options beyond traditional outlets. Early Tech Pioneers such as PlanetOut did this by providing an outlet and alternative media source for LGBTQIA communities as more people got online.

Following these first new media options, new communities, and alternative media came the massive growth of social media. In 2004, fewer than 1 million people were on Myspace; Facebook had not even launched. By 2018, Facebook had more than 2.26 billion users, with other sites also growing to hundreds of millions of users.

The precipitous rise of social media over the past 15 years

While these new online communities and communication channels have offered great spaces for alternative voices, their increased use has also brought issues of increased disinformation and polarization.

Today, many tech start-ups are focused on preserving these online media spaces while also mitigating the disinformation which can come with them. Recently, some Tech Pioneers have also approached this issue, including TruePic, which focuses on photo identification, and Two Hat, which is developing AI-powered content moderation for social media.

Climate change and green tech

Many scientists today are looking to technology to lead us towards a carbon-neutral world. Though renewed attention is being given to climate change today, this effort to find solutions through technology is not new. In 2001, green tech offered a new investment opportunity for tech investors after the crash, leading to a boom in renewable energy start-ups, including Bloom Energy, a Technology Pioneer in 2010.

In the past two decades, tech start-ups have only expanded their climate focus. Many today are focused on initiatives far beyond clean energy to slow the impact of climate change.

Different start-ups, including Carbon Engineering and Climeworks from this year’s Technology Pioneers, have started to roll out carbon capture technology. These technologies remove CO2 from the air directly, enabling scientists to alleviate some of the damage from fossil fuels which have already been burned.

Another expanding area for young tech firms today is food systems innovation. Many firms, like Aleph Farms and Air Protein, are creating innovative meat and dairy alternatives that are much greener than their traditional counterparts.

Biotech and healthcare

The early 2000s also saw the culmination of a biotech boom that had started in the mid-1990s. Many firms focused on advancing biotechnologies through enhanced tech research.

An early Technology Pioneer, Actelion Pharmaceuticals was one of these companies. Actelion’s tech researched the single layer of cells separating every blood vessel from the blood stream. Like many other biotech firms at the time, their focus was on precise disease and treatment research.

While many tech firms today still focus on disease and treatment research, many others have been focusing on healthcare delivery. Telehealth has been on the rise in recent years, with many young tech firms expanding virtual healthcare options. New technologies such as virtual visits and chatbots are being used to deliver healthcare to individuals, especially during Covid-19.

Many companies are also focusing their healthcare tech on patients, rather than doctors. For example, Ada, a symptom checker app, used to be designed for doctors' use but has now shifted its language and interface to give patients information on their symptoms directly. Other companies, like 7 Cups, focus on offering mental healthcare support directly to users through their app instead of going through existing offices.

The past two decades have seen healthcare tech get much more personal and use tech for care delivery, not just advancing medical research.

The World Economic Forum was the first to draw the world’s attention to the Fourth Industrial Revolution, the current period of unprecedented change driven by rapid technological advances. Policies, norms and regulations have not been able to keep up with the pace of innovation, creating a growing need to fill this gap.

The Forum established the Centre for the Fourth Industrial Revolution Network in 2017 to ensure that new and emerging technologies will help—not harm—humanity in the future. Headquartered in San Francisco, the network launched centres in China, India and Japan in 2018 and is rapidly establishing locally-run Affiliate Centres in many countries around the world.

The global network is working closely with partners from government, business, academia and civil society to co-design and pilot agile frameworks for governing new and emerging technologies, including artificial intelligence (AI), autonomous vehicles, blockchain, data policy, digital trade, drones, internet of things (IoT), precision medicine and environmental innovations.


In the early 2000s, many companies were at the start of their recovery from the bursting of the dotcom bubble. Since then, we've seen a large expansion in the way tech innovators approach areas such as new media, climate change, healthcare delivery and more.

At the same time, we have also seen tech companies rise to the occasion of combating issues that arose from that first group, from internet content moderation to expanding climate change solutions.

The Technology Pioneers' 2020 cohort marks the 20th anniversary of this community - and looking at the latest awardees can give us a snapshot of where the next two decades of tech may be heading.


Technology and Evolution of Communication

Technology has transformed people’s lives to a considerable extent in all spheres, including communication. Two hundred years ago, people had to write letters or meet personally to communicate. One hundred years ago, people could communicate with little attention to distance as they had telephones. Nowadays, people are constantly in contact with each other due to smartphones, the Internet, and wearable devices (Rogers, 2019). As with any other advancement, new trends in communication have both positive and negative effects.

The biggest positive outcome of the use of technology in communication is the wealth of opportunities to stay in touch and reach almost any person within almost no time. People share information that is important for their professional and personal lives. Comfort is another positive change, as individuals can communicate with the help of user-friendly devices, making their lives more pleasant. A person can reach more people irrespective of time and distance.

However, technology use is also associated with some negative effects. An increasing share of digital-based communication makes people less comfortable with face-to-face communication. Some individuals prefer communicating online and feel uneasy or even anxious when they need to interact with others in the real world. The abundance of information coming from different sources distracts people from important tasks, which may lead to negative consequences in their professional or personal lives. For instance, an employee working in an office has to react to corporate email messages, phone calls, and instant messages.

In conclusion, technology has made communication more effective as people may share information in no time and pay no attention to distance. However, the current proportion of face-to-face communication is decreasing in many people’s lives, which may have a negative impact on them in personal and social domains. Therefore, it is important to be a responsible user of technology and make sure that social ties are properly maintained, and face-to-face communication prevails.

Rogers, S. (2019). The role of technology in the evolution of communication. Forbes. Web.


Technology over the long run: zoom out to see how dramatically the world can change within a lifetime

It is easy to underestimate how much the world can change within a lifetime. Considering how dramatically the world has changed can help us see how different the world could be in a few years or decades.

Technology can change the world in ways that are unimaginable until they happen. Switching on an electric light would have been unimaginable for our medieval ancestors. In their childhood, our grandparents would have struggled to imagine a world connected by smartphones and the Internet.

Similarly, it is hard for us to imagine the arrival of all those technologies that will fundamentally change the world we are used to.

We can remind ourselves that our own future might look very different from the world today by looking back at how rapidly technology has changed our world in the past. That’s what this article is about.

One insight I take away from this long-term perspective is how unusual our time is. Technological change was extremely slow in the past – the technologies that our ancestors got used to in their childhood were still central to their lives in their old age. In stark contrast to those days, we live in a time of extraordinarily fast technological change. For recent generations, it was common for technologies that were unimaginable in their youth to become common later in life.

The long-run perspective on technological change

The big visualization offers a long-term perspective on the history of technology. 1

The timeline begins at the center of the spiral. The first use of stone tools, 3.4 million years ago, marks the beginning of this history of technology. 2 Each turn of the spiral represents 200,000 years of history. It took 2.4 million years – 12 turns of the spiral – for our ancestors to control fire and use it for cooking. 3

To be able to visualize the inventions in the more recent past – the last 12,000 years – I had to unroll the spiral. I needed more space to be able to show when agriculture, writing, and the wheel were invented. During this period, technological change was faster, but it was still relatively slow: several thousand years passed between each of these three inventions.

From 1800 onwards, I stretched out the timeline even further to show the many major inventions that rapidly followed one after the other.

The long-term perspective that this chart provides makes it clear just how unusually fast technological change is in our time.
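To make the construction concrete, here is a minimal sketch in Python of how a timeline can be wound into such a spiral. The parameters (one turn per 200,000 years, the stone-tool starting point) follow the description above; the plotting itself and the styling of the published chart are omitted.

    import math

    YEARS_PER_TURN = 200_000   # one full turn of the spiral, as described above
    START = 3_400_000          # first stone tools, years before present

    def spiral_position(years_ago, spacing=1.0):
        # Map a date (years before present) onto an Archimedean spiral:
        # the oldest event sits at the center, and the radius grows by
        # `spacing` with every full turn.
        turns = (START - years_ago) / YEARS_PER_TURN
        theta = 2 * math.pi * turns
        return spacing * turns * math.cos(theta), spacing * turns * math.sin(theta)

    for label, age in [("stone tools", 3_400_000),
                       ("control of fire", 1_000_000),
                       ("writing", 5_200)]:
        x, y = spiral_position(age)
        print(f"{label:>16}: ({x:6.2f}, {y:6.2f})")   # fire lands 12 turns out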

You can use this visualization to see how technology developed in particular domains. Follow, for example, the history of communication: from writing to paper, to the printing press, to the telegraph, the telephone, the radio, all the way to the Internet and smartphones.

Or follow the rapid development of human flight. In 1903, the Wright brothers took the first flight in human history (they were in the air for less than a minute), and just 66 years later, we landed on the moon. Many people saw both within their lifetimes: the first plane and the moon landing.

This large visualization also highlights the wide range of technology’s impact on our lives. It includes extraordinarily beneficial innovations, such as the vaccine that allowed humanity to eradicate smallpox, and it includes terrible innovations, like the nuclear bombs that endanger the lives of all of us.

What will the next decades bring?

The red timeline reaches up to the present and then continues in green into the future. Many children born today, even without further increases in life expectancy, will live well into the 22nd century.

New vaccines, progress in clean, low-carbon energy, better cancer treatments – a range of future innovations could very much improve our living conditions and the environment around us. But, as I argue in a series of articles, there is one technology that could even more profoundly change our world: artificial intelligence (AI).

One reason why artificial intelligence is such an important innovation is that intelligence is the main driver of innovation itself. This fast-paced technological change could speed up even more if it’s driven not only by humanity’s intelligence but also by artificial intelligence. If this happens, the change currently stretched out over decades might happen within a very brief time span of just a year. Possibly even faster. 4

I think AI technology could have a fundamentally transformative impact on our world. In many ways, it is already changing our world, as I documented in this companion article. As this technology becomes more capable in the years and decades to come, it can give immense power to those who control it (and it poses the risk that it could escape our control entirely).

Such systems might seem hard to imagine today, but AI technology is advancing quickly. Many AI experts believe there is a real chance that human-level artificial intelligence will be developed within the next decades, as I documented in this article.


Technology will continue to change the world – we should all make sure that it changes it for the better

What is familiar to us today – photography, the radio, antibiotics, the Internet, or the International Space Station circling our planet – was unimaginable to our ancestors just a few generations ago. If your great-great-great grandparents could spend a week with you, they would be blown away by your everyday life.

What I take away from this history is that I will likely see technologies in my lifetime that appear unimaginable to me today.

In addition to this trend towards increasingly rapid innovation, there is a second long-run trend. Technology has become increasingly powerful. While our ancestors wielded stone tools, we are building globe-spanning AI systems and technologies that can edit our genes.

Because of the immense power that technology gives those who control it, there is little that is as important as the question of which technologies get developed during our lifetimes. Therefore, I think it is a mistake to leave the future of technology to the technologists alone. Which technologies are controlled by whom is one of the most important political questions of our time.

We all should strive to gain the knowledge we need to contribute to an intelligent debate about the world we want to live in. To a large part, this means gaining knowledge and wisdom on the question of which technologies we want.

Acknowledgments: I would like to thank my colleagues Hannah Ritchie, Bastian Herre, Natasha Ahuja, Edouard Mathieu, Daniel Bachler, Charlie Giattino, and Pablo Rosado for their helpful comments on drafts of this essay and the visualization. Thanks also to Lizka Vaintrob and Ben Clifford for the conversation that initiated this visualization.

Appendix: About the choice of visualization in this article

The recent speed of technological change makes it difficult to picture the history of technology in one visualization. When you visualize this development on a linear timeline, then most of the timeline is almost empty, while all the action is crammed into the right corner:

Linear version of the spiral chart

In my large visualization here, I tried to avoid this problem and instead show the long history of technology in a way that lets you see when each technological breakthrough happened and how, within the last millennia, there was a continuous acceleration of technological change.

The recent speed of technological change makes it difficult to picture the history of technology in one visualization. In the appendix, I show how this would look if it were linear.

It is, of course, difficult to assess when exactly the first stone tools were used.

The research by McPherron et al. (2010) suggested that it was at least 3.39 million years ago. This is based on two fossilized bones found in Dikika in Ethiopia, which showed “stone-tool cut marks for flesh removal and percussion marks for marrow access”. These marks were interpreted as being caused by meat consumption and provide the first evidence that one of our ancestors, Australopithecus afarensis, used stone tools.

The research by Harmand et al. (2015) provided evidence for stone tool use in today’s Kenya 3.3 million years ago.

References:

McPherron et al. (2010) – Evidence for stone-tool-assisted consumption of animal tissues before 3.39 million years ago at Dikika, Ethiopia . Published in Nature.

Harmand et al. (2015) – 3.3-million-year-old stone tools from Lomekwi 3, West Turkana, Kenya . Published in Nature.

Evidence for controlled fire use approximately 1 million years ago is provided by Berna et al. (2012) Microstratigraphic evidence of in situ fire in the Acheulean strata of Wonderwerk Cave, Northern Cape province, South Africa , published in PNAS.

The authors write: “The ability to control fire was a crucial turning point in human evolution, but the question of when hominins first developed this ability still remains. Here we show that micromorphological and Fourier transform infrared microspectroscopy (mFTIR) analyses of intact sediments at the site of Wonderwerk Cave, Northern Cape province, South Africa, provide unambiguous evidence—in the form of burned bone and ashed plant remains—that burning took place in the cave during the early Acheulean occupation, approximately 1.0 Ma. To the best of our knowledge, this is the earliest secure evidence for burning in an archaeological context.”

This is what authors like Holden Karnofsky called ‘Process for Automating Scientific and Technological Advancement’ or PASTA. Some recent developments go in this direction: DeepMind’s AlphaFold helped to make progress on one of the large problems in biology, and they have also developed an AI system that finds new algorithms that are relevant to building a more powerful AI.



The Evolution of Untethered Communications (1997)

Chapter 1: Past, Present, and Future

Humans have long dreamed of possessing the capability to communicate with each other anytime, anywhere. Kings, nation-states, military forces, and business cartels have sought more and better ways to acquire timely information of strategic or economic value from across the globe. Travelers have often been willing to pay premiums to communicate with family and friends back home. As the twenty-first century approaches, technical capabilities have become so sophisticated that stationary telephones, facsimile (fax) machines, computers, and other communications devices—connected by wires to power sources and telecommunications networks—are almost ubiquitous in many industrialized countries. The dream is close to becoming reality. The last major challenge is to develop affordable, reliable, widespread capabilities for "untethered" communications, a term coined by the U.S. military and referring to the union of wireless and mobile technologies. Because "untethered" is not a widely used term, this report concentrates on "wireless" communications systems that use the radio frequency (RF) part of the electromagnetic spectrum. These systems and their component technologies are widely deployed to serve mobile users.

Mobile wireless communications is a shared goal of both the U.S. military and civilian sectors, which traditionally have enjoyed a synergistic relationship in the development and deployment of communications technology. The balance of that long-standing interdependence is changing now as a result of trends in the marketplace and defense operations and budgets. These trends suggest that market forces will propel advances in technology to meet rising consumer expectations. However, the military may need to take special measures to field cost-effective, state-of-the-art untethered communications systems that meet defense requirements.

This chapter lays the foundation for an analysis of military needs in this area by chronicling the evolution of military and civilian applications of communications technology, from ancient times leading up to the horizon of 2010. Section 1.1 is an overview of the challenge facing the U.S. military. Section 1.2 provides an historical perspective on the development of communications infrastructures. Section 1.3 outlines the wireless systems currently used by the U.S. military and the related research and development (R&D) activities. Sections 1.4 through 1.7 recount the evolution and current status of commercial wireless systems. Section 1.8 compares the development paths for wireless technologies in the United States, Europe, and Japan.

1.1 Overview

In the final years of the twentieth century, all aspects of wireless communications are subject to rapid change throughout the world. Dimensions of change include the following:

These changes are fueled by opportunities for profit and public benefit as perceived by executives, investors, and governments. Although the patterns are global, the details differ significantly from country to country. Each dimension of change is complex and all of them interact. Overall, the dynamic nature of wireless communications creates a mixture of confusion and opportunity for stakeholders throughout the world.

A principal attraction of wireless communications is its capability to serve mobile users. Because mobility is an important feature of military operations, the U.S. armed forces have always played a leading role in the development and deployment of wireless communications technology.

In the coming years, however, it appears that the commercial sector will have sufficient incentives and momentum to push the technical envelope on its own. At the same time, flat or declining defense budgets are motivating the military to adopt commercial products and services to an increasing extent. Yet there are significant differences between military and commercial requirements. Thus, it is important to examine carefully the opportunities for, and limitations to, military use of commercial wireless communications products and services.

In contrast to other areas of information technology, wireless communications has yet to converge toward a single technical standard or even a very small number of them. Instead it appears that diversity will endure for the foreseeable future. In this environment, the management and coordination of complex, diverse systems will be an ongoing challenge, particularly for the U.S. military, which coincidentally has to adapt to new threats and responsibilities after more than half a century of following the paradigm set by World War II and the Cold War. Information is now assuming greater strategic importance than ever before in warfare and other military operations, and so the wide deployment of cost-effective, state-of-the-art wireless communications systems has become particularly critical.

The present situation recalls previous epochs in which breakthroughs in hardware—aircraft carriers, jet aircraft, tactical missiles, nuclear weapons—have led to radical revisions of military doctrine. The next great revolution in military affairs could be shaped by information technology: global communications, ubiquitous sensors, precision location, and pervasive information processing. Advanced command, control, communications, computing, and intelligence (C4I) systems could make it possible to monitor an adversary, target specific threats, and neutralize them with the best available weapon. Admiral William Owens, former vice chairman of the Joint Chiefs of Staff, has called such an integrated capability a "system of systems." Using such a system, a commander could observe the battle from a computer screen, select the most threatening targets, and destroy them with the press of a button. Battles would be won by the side with the best information, not necessarily the one with the largest battalions.

But unlike the military hardware of the past, information technology is advancing at a breakneck pace in a worldwide marketplace, driven not by military requirements but by the industrial and consumer sectors. Increasingly these technologies are available worldwide, and the best technology is no longer limited to U.S. manufacture and control. Highly accurate position data transmitted by satellite are now available to any yachtsman. High-resolution satellite photographs are for sale around the world. Any nation can purchase the latest communications gadgets from the electronics stores of Tokyo.

Therein lies the challenge for the U.S. military: how to exploit the advances in affordable technology fueled by worldwide consumer demand while also maintaining technical capabilities that significantly exceed those of any potential adversary.

1.2 Historical Perspective

Throughout most of history, the evolution of communications technologies has been intimately intertwined with military needs and applications. Some of the earliest government-sponsored R&D projects focused on communications technologies that enabled command and control. A synergistic relationship then evolved between the military and commercial sectors that accelerated the technology development process. Now large corporations develop the latest communications technologies for international industrial and consumer markets shaped by government regulation and international agreements. World trade in telecommunications equipment and services was valued at $115 billion in 1996 (The Economist, 1997).

Modern wireless communication systems are rooted in telephony and radio technologies dating back to the end of the nineteenth century and the older telegraphy systems dating back to the eighteenth century. Wireless systems are also influenced by and increasingly linked to much newer communications capabilities, such as the Internet, which originated in the 1960s. All wireless systems transmit signals over the air using different frequency transmission bands designated by government regulation. Table 1-1 provides an overview of wireless RF communications systems and services and the frequency bands they use. Each frequency band has both advantages and disadvantages. At low frequencies the signal propagates along the ground; attenuation is low but atmospheric noise levels are high. Low frequencies cannot carry enough information for video services. At higher frequencies there is less atmospheric noise but more attenuation, and a clear line of sight is needed between the transmitter and receiver because the signals cannot penetrate objects. These frequencies offer greater bandwidth, or channel capacity.
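The frequency dependence of the trade-off described here can be quantified with the standard free-space path-loss formula, 20·log10(4πdf/c), which grows by 20 dB for every tenfold increase in frequency. A minimal sketch in Python (the 10 km distance and the sample frequencies are illustrative choices, not values from Table 1-1):

    import math

    def free_space_path_loss_db(distance_m, freq_hz):
        # Free-space path loss between isotropic antennas, in dB.
        # This captures only the spreading loss; ground-wave propagation
        # and atmospheric noise at low frequencies are not modeled.
        c = 299_792_458.0
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    for freq in (100e3, 3e6, 300e6, 30e9):   # LF, HF, UHF, millimeter wave
        loss = free_space_path_loss_db(10_000, freq)
        print(f"{freq/1e6:10.1f} MHz over 10 km: {loss:5.1f} dB")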

1.2.1 Communications Before the Industrial Age

The annals of antiquity offer examples of muscle-powered communications: human runners, homing pigeons, and horse relays. Perhaps the earliest communications infrastructure was the road network of Rome, which carried not only the legions needed to enforce the emperor's will but also messengers to direct forces far from the capital. Ancient societies also developed systems that obviated the need for physical delivery of information. These systems operated within line-of-sight distances (later extended by telescope): smoke signals, torch signaling, flashing mirrors, signal flares, and semaphore flags (Holzman and Pehrson, 1995). Observation stations were established along hilltops or roads to relay messages across great distances.

1.2.2 Telegraphy

The first comprehensive infrastructure for transmitting messages faster than the fastest form of transportation was the optical telegraph, developed in 1793. Napoleon considered this his secret weapon because it brought him news in Paris and allowed him to control his armies beyond the borders of France. The optical telegraph consisted of a set of articulated arms that encoded hundreds of symbols in defined positions. Under a military contract, the signaling stations were deployed on strategic hilltops throughout France, linking Paris to its frontiers. By the mid-1800s, 556 stations enabled transmissions across more than 5,000 kilometers (km).

The optical telegraph was superseded by the electrical telegraph in 1838, when Samuel Morse developed his dot-and-dash code. Now information could be transmitted beyond visible distances without significant delay. In an 1844 demonstration on a government-funded research testbed, Morse sent the message "What Hath God Wrought?" from Baltimore to the U.S. Capitol (Bray, 1995).
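As a toy illustration of dot-and-dash encoding, the sketch below spells out the 1844 demonstration message using a partial International Morse table (the modern standard, which differs in detail from Morse's original American code; only the letters needed here are included):

    # partial International Morse table -- just the letters used below
    MORSE = {
        "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
        "R": ".-.", "T": "-", "U": "..-", "W": ".--",
    }

    def encode(text):
        # Letters are separated by spaces, words by ' / '.
        return " / ".join(
            " ".join(MORSE[ch] for ch in word) for word in text.upper().split()
        )

    print(encode("What hath God wrought"))
    # .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -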

The rapid deployment of telegraphic lines around the world was driven by the need of nineteenth-century European powers to communicate with their colonial possessions. High-risk technology investments were required. After the use of rubber coating was demonstrated on cables deployed across the Rhine River, the first transatlantic cable was laid in 1858, but it failed within months. A new cable designed by Lord Kelvin was laid in 1866 and operated successfully on a continuous basis.

The result was a rapidly expanding telegraphic network that reached every corner of the globe. By 1870, Great Britain communicated directly with North America, Europe, the Middle East, and India. Other nations scrambled to duplicate that system's global reach, for no nation could trust its critical command messages to the telegraphic lines of a foreign power.

1.2.3 Early Wireless

Within a few decades of its widespread deployment, telegraphy began to lose customers to a new technology—radio. In 1895 Guglielmo Marconi demonstrated that electromagnetic radiation could be detected at a distance. Great Britain's Royal Navy was an early and enthusiastic customer of the company that Marconi created to develop radio communications. In 1901 Marconi bridged the Atlantic Ocean by radio, and regular commercial service was initiated in 1907 (Masini, 1996).

The importance of this new technology became evident with the onset of World War I. Soon after hostilities began, the British cut Germany's overseas telegraphic cables and destroyed its radio stations. Then Germany cut Britain's overland cables to India and those crossing the Baltic to Russia. Britain enlisted Marconi to put together a string of radio stations quickly to reestablish communications with its overseas possessions.

The original Marconi radios were soon replaced by more advanced equipment that exploited the vacuum tube's capability to amplify signals and operate at higher frequencies than did older systems. In 1915 the first wireless voice transmission between New York and San Francisco signaled the beginning of the convergence of radio and telephony. The first commercial radio broadcast followed in 1920 (Lewis, 1993). The use of higher frequencies (called shortwaves) exploited the ionosphere as a reflector, greatly increasing the range of communications. By World War II, shortwave radio had developed to the point where small radio sets could be installed in trucks or jeeps or carried by a single soldier. The first portable two-way radio, the Handie-Talkie, appeared in 1940. Two-way mobile communications on a large scale revolutionized warfare, allowing for mobile operations coordinated over large areas.

1.2.4 Telephony

The telephone was first demonstrated in 1876. A telephone network based on mechanical switches and copper wires then grew rapidly. The high cost of the cables limited the number of conversations possible at any one time; as demand increased, multiplexing techniques, such as time division and frequency division, were developed.
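Time-division multiplexing, one of the techniques mentioned above, can be sketched in a few lines: samples from several conversations are interleaved into fixed, recurring time slots on one shared line and separated again at the far end. A toy version in Python:

    def tdm_mux(channels):
        # Interleave equal-length sample streams round-robin onto one line.
        return [sample for frame in zip(*channels) for sample in frame]

    def tdm_demux(line, n_channels):
        # Recover each stream by taking every n-th slot.
        return [line[i::n_channels] for i in range(n_channels)]

    calls = [["a0", "a1", "a2"], ["b0", "b1", "b2"], ["c0", "c1", "c2"]]
    line = tdm_mux(calls)
    print(line)                         # ['a0', 'b0', 'c0', 'a1', 'b1', 'c1', ...]
    print(tdm_demux(line, 3) == calls)  # True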

A mix of independent operators ran telephone services in the early days. Subscribers to different services could not call each other even when in the same town. In 1913 the U.S. government allowed American Telephone and Telegraph (AT&T) to assume control of the national telephone network in return for becoming a regulated monopoly delivering "universal" service. Yet it was not until the 1950s that unified network signaling was offered to subscribers, allowing them to make direct-dial long-distance telephone calls (Calhoun, 1992). Since then, the rapid extension of the long-distance telephone network has been made possible by advances in photonic communications and network control technologies.

1.2.5 Communications Satellites

The concept of using geosynchronous satellites for communications purposes was first suggested in 1945 by the science fiction writer Arthur C. Clarke, then employed at Britain's Royal Aircraft Establishment, part of the Ministry of Defence. Satellites of this type are positioned above the equator and move in synch with Earth's rotation. In 1954 J.R. Pierce at AT&T's Bell Telephone Laboratories developed the concept of orbital radio relays and identified the key design issues for satellites: passive versus active transmission, station keeping, attitude control, and remote vehicle control (Bray, 1995). Pierce advocated an approach of reaching geostationary orbit in successive stages of technology development, starting with nonsynchronous, low-orbit satellites. Hughes Aircraft Company advocated a geostationary concept based on the company's patented station-keeping techniques.

In 1957 the Soviet Union launched Sputnik, the first satellite to be placed in orbit. Amateur radio operators were able to pick up its low-power transmissions all over the world. In 1960 the National Aeronautics and Space Administration (NASA) and Bell Laboratories launched the first U.S. communications satellite, Echo-1, in a low Earth orbit. The first satellite-based voice message was sent by President Dwight Eisenhower using passive transmission techniques. The next advance in satellite technology was the successful launch of the TELSTAR system by NASA and Bell Laboratories. Using active transmission technology TELSTAR delivered the first television transmission across the Atlantic in 1962. Because it was placed in an elliptical orbit that varied from low to medium altitudes, the satellite was visible contemporaneously to Earth stations on both sides of the Atlantic for only about 30 minutes at a time. Clearly geostationary orbits were desirable if satellites were to be used for continuous telephone and television communications across long distances.

In 1963 Hughes Aircraft and NASA achieved geosynchronous orbit (known as GEO today) with the successful launch of the SYNCOM satellite. The satellite was placed in an orbit of approximately 36,210 km, a distance that allowed it to remain stationary over a given point on Earth's surface. SYNCOM led the way for the next several decades of satellite systems by demonstrating that synchronous orbit was achievable, and that station keeping and attitude control were feasible. Today most satellites, both military and commercial, are of the GEO variety.
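The altitude of a geosynchronous orbit follows directly from Kepler's third law: the orbital radius r satisfies r³ = μT²/4π², where μ is Earth's gravitational parameter and T is one sidereal day. A quick check in Python lands close to the approximate figure quoted above:

    import math

    MU = 3.986004418e14      # Earth's gravitational parameter GM, m^3/s^2
    T = 86_164.1             # sidereal day, s
    R_EARTH = 6_378_137.0    # equatorial radius, m

    r = (MU * T**2 / (4 * math.pi**2)) ** (1 / 3)
    print(f"orbital radius: {r/1e3:,.0f} km")              # ~42,164 km
    print(f"altitude:       {(r - R_EARTH)/1e3:,.0f} km")  # ~35,786 km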

COMSAT was formed by an act of Congress in 1962 and represented U.S. commercial interests in satellite technology development at Intelsat, established in 1964 as an international, government-chartered organization to coordinate worldwide satellite communications issues. INTELSAT-I (Early Bird) was launched into a geosynchronous orbit in 1965 and supported 240 telephone links or one television channel. Channel capacities are now measured in the tens of thousands of voice channels (the INTELSAT-VI, launched in 1987, supports 80,000 voice channels).

The first military satellites, the DSCS-I group, were launched by the U.S. Air Force in 1966. Three launches placed 26 lightweight (100-pound) satellites in near-geosynchronous orbit. These systems supported digital voice and data communications using spread-spectrum technology (an important signal-processing approach discussed extensively in Chapter 2). The satellites were replaced in the 1970s by the DSCS-II group, which increased channel capacity by using spot-beam antennas with high gain to boost the received power. The first cross-linked military satellites, the LES 8/9, were launched in 1976. This demonstration fostered a vision of space-based architectures—without vulnerable ground relays—for communication, navigation, surveillance, and reconnaissance.

Satellites offer several advantages over land-based communications systems. Rapid, two-way communications can be established over wide areas with only a single relay in space, and global coverage with only a few relay hops. Earth stations can now be set up and moved quickly. Furthermore, satellite systems are virtually immune to impairments such as multipath fading (channel impairments are discussed in Chapter 2). But with the rapid deployment of undersea fiber-optic links, the use of satellite channels for telephony has been on the decline. The high capacity of fiber provides for competitive costs, which, combined with low latency, have attracted consumers. The future of the satellite industry depends on the emergence of applications other than fixed telephony channels. A new generation of satellite systems is being deployed to provide mobile telephone services (see Section 1.5).
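The latency disadvantage mentioned here is easy to estimate: a one-way hop through a geostationary relay covers at least twice the roughly 35,786 km orbital altitude at the speed of light, while a transatlantic fiber route is much shorter even though light travels more slowly in glass. A back-of-the-envelope comparison (the 6,000 km route length and the refractive index are illustrative assumptions):

    C = 299_792_458.0            # speed of light in vacuum, m/s

    geo_hop = 2 * 35_786e3 / C   # ground -> satellite -> ground
    fiber = 6_000e3 * 1.47 / C   # ~6,000 km cable, assumed index n = 1.47

    print(f"GEO one-way hop: {geo_hop*1e3:5.0f} ms")   # ~239 ms
    print(f"fiber one-way:   {fiber*1e3:5.0f} ms")     # ~29 ms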

1.2.6 Mobile Radio and the Origins of Cellular Telephony

The early development of mobile radio was driven by public safety needs. In 1921 Detroit became the first city to experiment with radio-dispatched police cars. However, transmission from vehicles was limited by the difficulty of producing small, low-power transmitters suitable for use in automobiles. Two-way systems were first deployed in Bayonne, New Jersey, in the 1930s. The system operated in "push-to-talk" (i.e., half-duplex) mode; simultaneous transmission and reception, or full-duplex mode, was not possible at the time (Calhoun, 1988).

Frequency modulation (FM), invented in 1935, virtually eliminated background static while reducing the need for high transmission power, thus enabling the development of low-power transmitters and receivers for use in vehicles. World War II stimulated commercial FM manufacturing capacity and the rapid development of mobile radio technology. The need for thousands of portable communicators accelerated advances in system packaging and reliability and reduced costs. In 1946 public mobile telephone service was introduced in 25 cities across the United States. The initial systems used a central transmitter to cover a metropolitan area. The inefficient use of spectrum and the coarseness of the electronic filters severely limited capacity: Thirty years after the introduction of mobile telephone service, the New York system could support only 543 users.

A solution to this problem emerged in the 1970s when researchers at Bell Laboratories developed the concept of the cellular telephone system, in which a geographical area is divided into adjacent, non-overlapping, hexagonal-shaped "cells." Each cell has its own transmitter and receiver (called a base station) to communicate with the mobile units in that cell; a mobile switching station coordinates the handoff of mobile units crossing cell boundaries. Throughout the geographical area, portions of the radio spectrum are reused, greatly expanding system capacity but also increasing infrastructure complexity and cost.
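The capacity arithmetic behind frequency reuse is straightforward: if a cluster of N adjacent cells shares the full channel set, each cell gets total/N channels, and for hexagonal geometry the cells that reuse a given channel are separated by D = R√(3N), where R is the cell radius. A sketch (the 416-channel total is roughly the AMPS figure; treat all numbers as illustrative):

    import math

    def channels_per_cell(total_channels, cluster_size):
        # Channels available per cell when the full set is split over a cluster.
        return total_channels // cluster_size

    def cochannel_distance(cell_radius_km, cluster_size):
        # Distance between cells reusing the same frequencies (hexagonal layout).
        return cell_radius_km * math.sqrt(3 * cluster_size)

    for n in (3, 7, 12):
        print(f"N={n:2d}: {channels_per_cell(416, n):3d} channels/cell, "
              f"reuse distance {cochannel_distance(2.0, n):4.1f} km")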

In the years following the establishment of the mobile telephone service, AT&T submitted numerous proposals to the Federal Communications Commission (FCC) for a dedicated block of spectrum for mobile communications. Other than allowing experimental systems in Chicago and Washington, D.C., the FCC made no allocations for mobile systems until 1983, when the first commercial cellular system—the advanced mobile phone system (AMPS)—was established in Chicago. Cellular technology became highly successful commercially with the miniaturization of subscriber handsets.

1.2.7 The Internet and Packet Radio

The original concepts underlying the Internet were developed in the mid-1960s at what is now the Defense Advanced Research Projects Agency (DARPA), then known as ARPA. The original application was the ARPANET, which was established in 1969 to provide survivable computer communications networks. The ARPANET relied heavily on packet switching concepts developed in the 1960s at the Massachusetts Institute of Technology, the RAND Corporation, and Great Britain's National Physical Laboratory (Kahn et al., 1978; Hafner and Lyon, 1996; Leiner et al., 1997). This approach was a departure from the circuit-switching systems used in telephone networks (see Box 1-1).

The first ARPANET node was located at the University of California at Los Angeles. Additional nodes were soon established at Stanford Research Institute (now SRI International), the University of California at Santa Barbara, and the University of Utah. The development of a host-to-host protocol, the network control protocol (NCP), followed in 1970, enabling network users to develop applications. At the same time, the ALOHA Project at the University of Hawaii was investigating packet-switched networks over fixed-site radio links. The ALOHANET began operating in 1970, providing the first demonstration of packet radio access in a data network (Abramson, 1985). The contention protocols used in ALOHANET served as the basis for the "carrier-sense multiple access with collision detection" (CSMA/CD) protocols used in the Ethernet local area network (LAN) developed at Xerox Palo Alto Research Center in 1973. The widespread use of Ethernet LANs to connect personal computers (PCs) and workstations allowed broad access to the Internet, a term that emerged in the late 1970s with the design of the Internet protocol (IP). The need to link wired, packet radio, and satellite networks led to the specifications for the transmission control protocol (TCP), which replaced NCP and shifted the responsibility for transmission from the network to the end hosts, thereby enabling the protocol to operate no matter how unreliable the underlying links.
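The contention idea pioneered in ALOHANET can be illustrated by simulating the slotted variant of ALOHA (the original protocol was unslotted, but the slotted form is simpler to state): each station transmits in a time slot with some probability, and a slot delivers a packet only if exactly one station transmits. Throughput peaks near 1/e ≈ 0.37 packets per slot when the offered load is one packet per slot:

    import random

    def slotted_aloha_throughput(n_stations, p_transmit, n_slots=100_000, seed=42):
        # Fraction of slots with exactly one transmitter (i.e., a success).
        rng = random.Random(seed)
        successes = 0
        for _ in range(n_slots):
            transmitters = sum(rng.random() < p_transmit for _ in range(n_stations))
            successes += transmitters == 1
        return successes / n_slots

    for p in (0.005, 0.02, 0.05, 0.1):   # offered load G = 50 * p packets/slot
        print(f"G={50*p:4.2f}: throughput {slotted_aloha_throughput(50, p):.3f}")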

The development of microprocessors, surface acoustic wave filters, and communications protocols for intelligent management of the shared radio channel contributed to the advancement of packet radio technology in the 1970s. In 1972 ARPA launched the Packet Radio Program, aimed at developing techniques for the mobile battlefield, and SATNet, an experimental satellite network. In 1983 ARPA launched a second-generation packet radio program, Survivable Adaptive Networks, to demonstrate how packet radio networks could be scaled up to encompass much larger numbers of nodes and operate in the harsh environment likely to be encountered on the mobile battlefield.


FIGURE 1-1 Military radios are designed for different uses. Combat net radios, for example, are designed for communications within a battle group.

1.3 Military Wireless Systems and Research

1.3.1 Terrestrial Systems

Radio communications technology is widely used by U.S. military units at all levels. The many different types of military radios and applications cause a variety of communication problems. The military environment magnifies common difficulties such as the failure of one radio type to communicate with another type (interoperability), failure of one user to communicate with another (connectivity), incompatibility of new radios with old radios (legacy systems), and one radio at a location interfering with another radio at the same location (co-site interference).

In general, U.S. military radio systems can be categorized by the location of users and the information they broadcast and receive (see Figure 1-1). Multiple radios are often gathered together in an aircraft, shipboard radio room, or communications van to form tactical radio complexes and command-and-control centers. The radios operate simultaneously using many different waveforms across several frequency bands (e.g., high frequency [HF], very high frequency [VHF], and ultrahigh frequency [UHF]).

Combat net radios take the form of either a single radio in a vehicle (much like a car radio) or a device like a "walkie-talkie" carried around by a soldier. Most of the information broadcast on combat net radios consists of voice communications, often to share position information. Many of today's combat net radios have been enhanced to carry data in addition to voice. In general, combat net radios have fewer capabilities and cost less than do tactical radios (see Table 1-2). Military radios generally cost much more than commercial systems supporting similar applications.

Deployed military radios have various shortcomings. For example, the mobile subscriber equipment (MSE), the U.S. Army's mobile telephone system for the battlefield, was designed to be like a cellular telephone but is outdated compared to current technology. The single-channel ground and airborne radio system (SINCGARS) has been updated with recent technology, including programmable microprocessors, application-specific integrated circuits (ASICs), and surface-mount technology, but it implements a series of outdated waveform standards for single-channel digital voice. Furthermore, SINCGARS has experienced severe co-site interference problems because it hops transmission frequencies within the VHF band, a design capability that helps prevent jamming by adversaries but results in hops onto channels already in use for other communications traffic. The mobile subscriber radio terminal (MSRT) costs $70,000 and is about the size of a microwave oven; an updated version, introduced in 1994, is no less expensive and no smaller. Numerous HF radios have been built by the Army, but most are in storage because these radios are not simple push-to-talk designs and user training for the difficult HF channel has not been widespread.
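The co-site problem can be made concrete with a small simulation: several co-located radios hop pseudo-randomly and independently over the same band, and any hop interval in which two of them land on the same channel is a potential interference event. The channel count and hop rate below are invented for illustration, not SINCGARS parameters:

    import random

    def cosite_collision_rate(n_channels, n_radios, n_hops=100_000, seed=7):
        # Fraction of hop intervals in which two or more radios share a channel.
        rng = random.Random(seed)
        collisions = 0
        for _ in range(n_hops):
            picks = [rng.randrange(n_channels) for _ in range(n_radios)]
            collisions += len(set(picks)) < n_radios
        return collisions / n_hops

    rate = cosite_collision_rate(n_channels=2_320, n_radios=6)
    print(f"collision probability per hop: {rate:.4f}")
    print(f"expected collisions per second at 100 hops/s: {rate * 100:.1f}")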

The problems posed by individual radios are exacerbated by the difficulties encountered in linking communications systems of varying sophistication together (see Box 1-2). Special interfaces can be designed; SINCGARS, for example, can be interfaced into the MSRT. Inherent interoperability is among the features sought in sophisticated future systems. But in the near term, front-line troops will continue to use both existing and evolving radios, such as SINCGARS, mobile tactical satellite (TACSAT) terminals, MSE, MSRT, and packet radios. The Army is struggling with how to upgrade the MSE, a proprietary system. The SINCGARS is expected to be replaced and upgraded with a tri-service joint tactical radio in 1999.

The U.S. Department of Defense established IP as the underlying "building code" for the Army, making a commitment to migrate all communications networks to the same basic structure as the Internet to position the military to integrate and leverage the advances in commercial information technologies. The Army's Task Force XXI "Tactical Internet" (Booz-Allen & Hamilton, 1995) was the first major experimental fielding of this new architecture (Sass and Eldridge, 1994; Sass, 1996).

1.3.2 Satellite Systems

Satellite systems play a major role in military communications. They are attractive alternatives to land-based systems because they provide mobile and tactical communications to a large number of users over a wide geographical area. In addition, communication links can be added or deleted quickly, and satellites are less vulnerable to destruction or enemy exploitation than are land-based systems.

The DOD uses both military and commercial satellites to meet its communications needs. Fleet communications are supported by the government-owned FLTSAT and contractor-owned LEASAT systems, both of which are geosynchronous. The U.S. Air Force uses FLTSAT, the elliptical-orbit Satellite Data System, and the DSCS-III satellites to support the AFSATCOM satellite system. The DSCS, a vital component of the global defense communications system, is the DOD's primary system for long-haul, high-volume trunk traffic. The operational DSCS space segment consists of a mix of DSCS-II and DSCS-III satellites.

In 1982 the military began developing new satellite and terminal technology for MILSTAR, a millimeter-wave system operating in the 30–60 gigahertz (GHz) range. This new system consists of both geosynchronous and inclined-orbit satellites. The system provides enhanced antijam (AJ) capabilities as well as hardening against nuclear attack. Only a few of the planned eight MILSTAR satellites have been deployed so far. The complete system would provide two satellites per coverage area over the continental United States and the Atlantic, Pacific, and Indian oceans.

In general, existing tactical-satellite ground terminals incorporate new technology (e.g., microprocessors, ASICs, surface-mount technology) but are still forced to implement legacy waveforms. As a result, they have generally not kept pace with innovations in commercial communications waveforms and standards. In the case of MILSTAR, the military uses a noncommercial frequency band and is therefore unable to use—or take advantage of the price reductions in—commercial hardware. The new Joint Tactical Terminal (one of the systems listed in Table 1-2) is designed using modern radio technology, perhaps even including software-defined radios (see Section 1.3.3.2). High data rates sufficient for multimedia transmissions can be achieved only with the most advanced technology. For example, the global broadcast system (GBS), part of the U.S. Navy's UHF Follow-On satellites 8, 9, and 10, has bandwidth exceeding 100 megabits per second (Mbps) and worldwide coverage.

The most widely used military satellite system is the global positioning system (GPS), which encompasses 18 to 24 satellites in inclined orbits transmitting spread-spectrum signals. The GPS receivers extract precise time and frequency information from these signals to determine with great accuracy the receiver location, velocity, and acceleration. The system can be used by anyone with a receiver. Commercial GPS receivers are used for numerous applications, including surveying, aircraft and ship navigation, and even recreational activities on land. Although launching and upkeep of the entire fleet of satellites are paid for by the United States, commercial GPS receivers were used by both sides in the Gulf War.
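The position fix itself can be sketched as a nonlinear least-squares problem: given satellite positions and measured ranges, a few Gauss-Newton iterations recover the receiver location. The sketch below assumes perfect clocks and noiseless ranges; a real receiver also solves for its own clock bias as a fourth unknown, which is why four satellites are the practical minimum. All coordinates are made-up values:

    import numpy as np

    # made-up satellite positions (m) and a hidden true receiver position
    sats = np.array([[15_600e3,  7_540e3, 20_140e3],
                     [18_760e3,  2_750e3, 18_610e3],
                     [17_610e3, 14_630e3, 13_480e3],
                     [19_170e3,    610e3, 18_390e3]])
    truth = np.array([1_111e3, 2_222e3, 3_333e3])
    ranges = np.linalg.norm(sats - truth, axis=1)   # noiseless pseudoranges

    x = np.zeros(3)                  # initial guess: center of the Earth
    for _ in range(10):              # Gauss-Newton iterations
        diffs = x - sats
        dists = np.linalg.norm(diffs, axis=1)
        residuals = dists - ranges
        J = diffs / dists[:, None]   # Jacobian of each range w.r.t. position
        step, *_ = np.linalg.lstsq(J, -residuals, rcond=None)
        x = x + step

    print(np.round(x - truth))       # ~[0. 0. 0.] -- position recovered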

1.3.3 Research Initiatives in Untethered Communications

The DOD's vision for future communications systems is typically expressed in general terms, such as "multimedia to the foxhole" (see Box 1-3). For example, the Army's architecture for the digitized battlefield of the twenty-first century consists of fixed high-bandwidth infrastructure at the Army, theater, and corps levels, integrated with the DOD's global grid (a concept for spanning the world with high-bandwidth computing and communications systems) and based on asynchronous transfer mode (ATM) wide-area networking technology (Sass and Gorr, 1995). Bandwidth is allocated not only up and down the command hierarchy but also horizontally to cooperating formations. At the division level and below, wireless extensions provided by mobile radio access points (RAPs) will link the front-line combat communications systems to the infrastructure in the rear areas. The RAP is a wheeled or tracked vehicle with an on-the-move antenna system. The RAPs carry extensive communications systems and are interconnected by high-capacity trunk radios capable of communicating at up to 45 Mbps over a range of 30 km. Satellites or other systems may provide back-up communications.

To the committee's knowledge, the operational requirements for future untethered communications have not been translated into technical specifications. In the future, technical specifications will need to be formulated in a way that will make it possible to determine which commercial technologies are capable of meeting military needs. As an alternative, some general DOD requirements can be inferred from military plans and the known technical capabilities of existing and emerging communications technologies. For example, future military wireless systems will require high data rates—the long-range goal is at least 10 Mbps—and the capability to transmit over broad and variable frequency bands (some experimental radios are designed to span frequencies from 2 MHz to 2 GHz). The systems will need to be rapidly deployable and the infrastructure will need to be mobile. Multilevel communications security that encompasses the most secure levels possible will be needed. Furthermore, to enable worldwide strategic communications, the new equipment will need to be interoperable with older military systems as well as those used by foreign allies and international forces. There are more than 17 different U.S. defense communications networks, and none are readily interoperable at present. New concepts and technologies will clearly be needed to meet all these requirements.

To meet its future communications requirements, the DOD is funding a number of research and demonstration projects, typically pursuing high-risk ventures with potentially high payoff. The most comprehensive DOD-funded initiative dealing with untethered communications is the Global Mobile Information Systems (GloMo) program initiated by DARPA in 1994. Other relevant research initiatives deal with software-defined radios, communications systems, and radio technology (Leiner et al., 1996).

1.3.3.1 Global Mobile Information Systems Program

The overarching goal of GloMo is to develop technology for robust end-to-end information systems in a global mobile environment by exploiting commercial products and generating new technologies with applications in both commercial and military domains. The program supports a wide range of research projects, which are identified based on the priorities of GloMo managers rather than on a systems approach to the development of top-down solutions. Notably missing from the program, for example, is a comprehensive assessment of the suitability of various network architectures, even though all other component needs are dictated by the system design. (Network architecture issues are discussed in detail in Chapters 2 and 3.) The GloMo program currently focuses on developing innovative technologies that span the following research thrusts.

Design Infrastructure. This effort spans tools, languages, and environments for designing and deploying wireless systems. Research areas include computer-aided design tools for estimating power and designing low-power systems, design libraries and models for mixed-signal integrated circuits (ICs) suitable for implementing highly integrated RF chip sets, and simulation tools for modeling the propagation of radio waves and higher-level protocols.

Untethered Nodes. This effort focuses on high-performance, modular, low-cost, and low-power wireless nodes. Research activities are aimed at developing the next generation of agile, highly integrated radio technology. Radio control points are exposed to higher software layers to make radios and applications more adaptable to changing needs and conditions. Complementary metal oxide semiconductor (CMOS) technology (an inexpensive, low-power technology) is being pushed to its limits to achieve high-speed RF circuitry coupled to high levels of integration. Several activities are combining custom signal processing for audio and video with the radio circuitry. In these efforts radios are viewed as modular building blocks that can be combined to yield systems with different cost-performance-function attributes. Some projects are investigating the architectures of software radios, in which many of the radio functions are performed by software combined with very-high-performance processing architectures.

Network Protocols and Algorithms. This effort deals with the development of robust network architectures and techniques for rapid deployment of wireless networks. Research efforts include the development of new packet-radio routing schemes such as dynamic routing protocols for ad hoc networking (see the sketch after this list). The concepts being studied are not limited to end-node mobility: Other possibilities include base-station mobility and network reconfiguration as base stations are repositioned in a battlefield scenario.

End-to-End Networking. This effort addresses how best to operate across a heterogeneous mix of underlying networks, both wireless and wired. Research areas include extensions to TCP/IP that will enable mobile users to access the Internet, satellite extensions to the Internet, and overlay wireless networking that supports mobility across diverse wireless subnetworks inside buildings and in the wider area.

Mobile Applications Support. This effort deals with the development of distributed computing techniques that will enable applications to adapt to varying network connectivity and quality of service (QoS) needs. The techniques being studied include software agents (sometimes called mediators or proxies) that adapt data representations to the capabilities of bandwidth-constrained wireless links; methods of performing computations in the wireline infrastructure on behalf of power- and display-limited portable devices such as personal digital assistants (PDAs); capabilities to move code between wired and portable nodes to provide location-dependent or new functionality when the node is poorly connected; file system structures that operate whether well connected, disconnected, or poorly connected to a wired infrastructure; event-notification protocols that enable applications to learn of changes to the underlying network connectivity and QoS; and techniques for structuring applications to exploit information about their current location.
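As a minimal illustration of the ad hoc routing idea flagged in the list above, the sketch below builds a hop-count routing table by breadth-first search and rebuilds it after a link disappears, as happens when a node moves out of radio range. The topology and node names are invented:

    import collections

    def routing_table(links, source):
        # Hop counts and next hops from `source`, via breadth-first search.
        graph = collections.defaultdict(set)
        for a, b in links:
            graph[a].add(b)
            graph[b].add(a)
        dist, next_hop = {source: 0}, {}
        queue = collections.deque([source])
        while queue:
            node = queue.popleft()
            for neigh in graph[node]:
                if neigh not in dist:
                    dist[neigh] = dist[node] + 1
                    next_hop[neigh] = neigh if node == source else next_hop[node]
                    queue.append(neigh)
        return dist, next_hop

    links = {("A", "B"), ("B", "C"), ("C", "D"), ("A", "E"), ("E", "D")}
    print(routing_table(links, "A"))   # D is two hops away via E
    links.discard(("A", "E"))          # node E drifts out of A's radio range
    print(routing_table(links, "A"))   # routes to D and E now go via B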

1.3.3.2 Software-Defined Radio Research

The DOD is devoting considerable attention to designing and demonstrating software-defined radios, none of which is in production as yet. The most prominent of these initiatives is the SpeakEASY program sponsored by DARPA, the Air Force Rome Laboratory, and the Army Communication Electronics Command. The key objective of SpeakEASY is to change the paradigm for military radios. In the past, radios were based on "point designs" with negligible capabilities for functional upgrades or waveform changes—capabilities that define SpeakEASY. In phase 1 of the program, analog-to-digital (A/D) converters were used to complete the radio signal path and high-speed digital signal processors (DSPs) were used for filtering and demodulation. The key technologies demonstrated in phase 1 include digital frequency conversion and wideband signal processing.

In SpeakEASY phase 2, modular radio elements (separate modules for the analog elements, A/D converter, and DSPs) will be integrated on an open-architecture bus. The key objective of phase 2 is to demonstrate a software-defined networking radio with support for legacy and future waveform evolution using a single architecture. This approach increases production volume, reduces costs, and enhances logistical support. The open-architecture design implies that competitive bids would be sought for commercial boards, modules, and software. Other goals include the use of commercial modules in the radio and the commercialization of any functions developed specifically for the radio.

The Naval Research Laboratory has an ongoing research program focusing on a software-defined radio known as the Joint C4I Terminal (JCIT). The JCIT grew out of an Army requirement for an advanced, helicopter-based command-and-control system. The JCIT will incorporate multiple software-defined radios for combat net, intelligence communications, and military data links on a single platform.

Also under development is the advanced communications engine (ACE), which evolved from a project sponsored by DARPA. The ACE is a software-defined digital radio with capabilities for multiple simultaneous band and channel transmissions (it has six receiving and transmitting channels). The initial prototypes demonstrate "dual-use" (i.e., both military and commercial) capabilities including those of combat net radios SINCGARS and Have Quick (a UHF system designed to provide secure air-to-air and air-to-ground communications with antijam (AJ) capabilities) and commercial avionics radios such as GPS, VHF air to ground, and the aircraft communications addressing and reporting system.

A very ambitious program, Millennium, was initiated to design an ultra-wideband radio. One objective was to demonstrate extremely high speed (approximately 1 billion samples per second) A/D data converters for both military and commercial communications. After the data conversion process, all tuning, filtering, demodulation, and decoding functions are performed by software (these processes and the associated technologies are discussed in Chapter 2).
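The division of labor in such a radio can be illustrated in a few lines of signal-processing code. The following Python/NumPy sketch synthesizes an FM channel and then performs tuning, filtering, and demodulation entirely in software; every parameter (sample rate, carrier offset, deviation) is an illustrative assumption, not a Millennium specification.

```python
# Sketch of the software-radio idea: after A/D conversion, tuning,
# filtering, and demodulation are ordinary array operations.
import numpy as np

fs = 1_000_000                       # A/D sample rate (Hz); illustrative
f_carrier = 200_000                  # channel offset from DC (Hz)
k_f = 5_000                          # FM frequency deviation (Hz)
t = np.arange(20_000) / fs

# Synthesize a test input: an FM signal at f_carrier carrying a 1-kHz tone.
message = np.sin(2 * np.pi * 1_000 * t)
phase = 2 * np.pi * f_carrier * t + 2 * np.pi * k_f * np.cumsum(message) / fs
rf_samples = np.cos(phase)           # what the A/D converter hands to software

# "Tuning": mix the channel down to baseband with a complex oscillator.
baseband = rf_samples * np.exp(-2j * np.pi * f_carrier * t)

# "Filtering": a crude low-pass (moving average) removes the
# double-frequency mixing product.
kernel = np.ones(64) / 64
filtered = np.convolve(baseband, kernel, mode="same")

# "Demodulation": FM recovers the message from the phase derivative.
inst_phase = np.unwrap(np.angle(filtered))
recovered = np.diff(inst_phase) * fs / (2 * np.pi * k_f)
# 'recovered' now approximates the original 1-kHz tone.
```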

1.3.3.3 Communications Systems Research

Several important research programs focus on complete communication systems. The DARPA Battlefield Awareness and Data Dissemination (BADD) program combines radios, ATM routers, and various communications networks and airborne relays from the Army's digital battlefield technology development effort for the deployment of high-speed data and large-file image transfer to the forward area. The Bosnia Command and Control Augmentation program, which is phase 1 of the Global Broadcast Service (GBS) and focuses on satellite communications, grew out of BADD testing. Phase 2 of the GBS involves the incorporation of DirecTV transponders into Navy UHF satellites. Phase 3 will provide the means for stand-alone satellite transfer of high-speed data and large-file images.

1.3.3.4 Radio Component Research

The DOD's Extremely Lightweight Antenna program produced a compact, lightweight (under 2 pounds), and wideband (85 MHz to 2.2 GHz) antenna. The antenna incorporates a directional wideband satellite beam as well as low-gain omnidirectional radiation patterns. The DARPA Advanced Digital Receiver Technology program was initiated to demonstrate technology elements for software-defined receivers in communications, radar, and electronic warfare. Several of these functions might be merged into one digital receiver unit.

1.3.3.5 Small Unit Operations

The Small Unit Operations Situational Awareness System includes a significant wireless communications component. One goal of the research is to create a radio system for exchanging information among groups of up to 12 foot soldiers operating in an area of approximately 4 km².

1.3.3.6 Modeling and Simulation

The Scalable Self-Organizing Simulations (S3) Program, supported by DARPA and the National Science Foundation, uses parallel computers to simulate communications networks. This program includes projects that create models and a library of computer programs for simulating mobility, radio propagation, and teletraffic patterns in large-scale wireless networks.

1.4 Commercial Terrestrial Mobile Telephone Systems And Services

Commercial wireless communications systems have exhibited remarkable growth over the past decade (see Figure 1-2). There are currently more than 50 million U.S. cellular subscribers (Hill, 1997) and more than 34 million U.S. paging subscribers (Mooney, 1997). An estimated 17 percent of the U.S. population now has cellular service, compared to 95 percent with wireline telephone service (Hill, 1997). There are also 50 million subscribers to systems based on the global system for mobile communications (GSM) standard, the European cellular technology. Worldwide, the total number of subscribers to cellular systems is projected at just under 200 million (Hill, 1997). It should be noted that these figures, as market research estimates, are fundamentally imprecise and, moreover, tend to be volatile because of the dynamic nature of the wireless industry.

Throughout the world, wireless communication systems are enabling developing countries to provide instant telephone service to new subscribers who otherwise would have to wait years for wireline access. Although wireless users are still far outnumbered by the approximately 700 million wireline telephone users worldwide, the number of new wireless subscribers is growing 15 times faster than the wireline subscriber base, and this pace is expected to accelerate in the coming years. Analysts predict that, by the year 2010, there will be equal numbers of wireless and wireline connections throughout the world.

Wireless mobile telephone systems can be divided into three generations.


FIGURE 1-2 The number of U.S. cellular subscribers and cell sites soared between 1984 and 1996. Note that 1984 figures are for January 1985. Source: Reproduced with the Cellular Telephone Industry Association's permission from the CTIA's Semi-Annual Data Survey.

The first generation, introduced in the 1980s and early 1990s, uses analog cellular and cordless telephone technology. Second-generation systems transmit speech in digital format. They provide advanced calling features and some nonvoice services. There are two categories of second-generation systems. High-tier systems feature high-power transmitters, base stations with coverage ranges on the order of kilometers, and subscribers moving at vehicular speeds. Low-tier systems, serving subscribers moving at pedestrian speeds, have low-power transmitters with a range on the order of 100 meters (m). Some of these systems are designed primarily for indoor use. Third-generation systems, planned for introduction after 2002, are expected to integrate disparate services, including broadband information services that cannot be delivered with second-generation technology. Many users are looking forward to the increased convenience promised by the integration or compatibility of systems (see Box 1-4). In addition to terrestrial mobile telephone systems, other commercial wireless systems include satellite communications, mobile data systems, and wireless local area networks (LANs).

1.4.1 First-Generation Systems

Of the original wireless communications systems deployed in the 1980s, the most popular was the analog cordless telephone, which uses radio to connect a portable handset to a unit that is wired to the public switched telephone network. Hundreds of millions of such devices have been produced, and the technology has been standardized in Europe under the cordless telephone first-generation (CT0, CT1, and CT1+) standards. There is no single U.S. standard. Analog cordless telephones have ranges limited to tens of meters and require a dedicated telephone line. Cellular systems have enabled much greater mobility.

In establishing cellular service in 1983 the FCC divided the United States into 734 cellular markets (called metropolitan statistical areas and rural service areas), each with an "A-side" and "B-side" cellular service provider. Historically, the designation of A or B indicated the origins of the cellular provider: An A-side provider did not originate in the traditional telephone business and was called a nonwireline carrier, whereas a B-side provider had roots in traditional services and was called a wireline carrier. Each cellular carrier is licensed to use 25 MHz of radio spectrum in the 800-MHz band to provide two-way telephone and data communications for its particular market. Because the U.S. analog cellular system is standardized on the advanced mobile phone service (AMPS), any cellular telephone is capable of working in any part of the country.

The AMPS cellular standard uses analog FM and full-duplex radio channels. The frequency division multiple access (FDMA) technique enables multiple users to share the same region of spectrum. This standard supports clear communication and inexpensive mobile telephones, but the transmissions are easy to intercept on a standard radio receiver and therefore are susceptible to eavesdropping. As of late 1996, 88 percent of all cellular telephones in the United States used the AMPS standard (digital cellular standards have only recently become available). Outside of the United States and Canada, a wide variety of incompatible analog cellular systems have been deployed (see Table 1-3). The European cellular service, which predated the AMPS system, used the Nordic mobile telephone (NMT) standard beginning in 1982. Other European nations and Japan also developed analog standards.
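A quick calculation shows how FDMA divides such an allocation. The sketch below assumes the actual AMPS figures of 12.5 MHz per direction (the 25-MHz license covers both directions of a duplex link) and 30-kHz channel spacing; the text above states only the 25-MHz total.

```python
# Back-of-the-envelope FDMA arithmetic for an AMPS-style allocation.
# The 30-kHz channel spacing is the actual AMPS value; the text above
# gives only the 25-MHz total per carrier.

total_spectrum_hz = 25e6                   # per carrier, both directions
per_direction_hz = total_spectrum_hz / 2   # full duplex: uplink + downlink
channel_spacing_hz = 30e3                  # AMPS analog FM channel width

duplex_channels = int(per_direction_hz // channel_spacing_hz)
print(duplex_channels)                     # -> 416 duplex channels
```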

1.4.2 Second-Generation Systems

Spurred by growing consumer demand for wireless services, standards organizations in North America, Europe, and Japan have specified new technologies to meet consumer expectations and make efficient use of allocated spectrum bands. These second-generation systems use advanced digital signal processing, compression, coding, and network-control techniques to conserve radio bandwidth, prevent eavesdropping and unauthorized use of networks, and also support additional services (e.g., voice mail, three-way calling, and text message transmission and retrieval).

In the United States, second-generation technologies have been deployed in the original 800-MHz cellular bands and in personal communications bands around 1900 MHz that were allocated by the FCC between 1995 and 1997. In Europe and most other parts of the world, second-generation technologies are deployed in the 900-MHz cellular bands and in 1800-MHz personal communications bands. Japan operates digital cellular systems in various bands between 800 MHz and 1500 MHz as well as a personal communications band near 1900 MHz.

The most widespread second-generation techniques include three high-tier standards: the European standard, GSM; and two North American standards, IS-136, a time division multiple access (TDMA) technique, and IS-95, a code division multiple access (CDMA) technique. 5 The GSM standard, which has been adopted in more than 100 countries, specifies a complete wide-area communications system. The other two standards specify only the communications between mobile telephones and base stations. A separate standard, IS-41, governs communications between mobile switching centers and other infrastructure elements in the United States. Table 1-4 summarizes the properties of the principal high-tier second-generation systems.

Among low-tier standards, the personal handyphone system (PHS) provides mobile telephone services to several million Japanese subscribers. Two other standards, digital European cordless telecommunications (DECT) and cordless telephone second generation (CT2), form the basis of several wireless business telephone (i.e., private branch exchange, or PBX) products. A fourth low-tier system is the personal access communications system (PACS), a U.S. standard. Although PACS has attracted considerable industry interest, it has not been widely deployed to date. Table 1-5 summarizes the properties of low-tier systems.

In addition to the 1900-MHz licensed personal communications bands (see Table 1-5, the fifth column), the FCC has allocated the 1910–1930 MHz band for unlicensed low-tier systems. Commercial products based on DECT, PHS, and a modified version of PACS (designated PACS-UB, for unlicensed band) are under consideration for deployment in the 1910–1930 MHz band.

Each of the second-generation systems has distinct features and limitations, but none was designed specifically with the problems of large, complex organizations such as the military in mind. Nevertheless, it is possible to combine disparate approaches in a customized network built to meet the unique voice and data communications needs of an organization with national reach (see Box 1-5).

The commercial success of second-generation wireless telephone systems has stimulated widespread interest in enhancing their capabilities to meet public expectations for advanced information services. For example, new speech-coding techniques offering improved voice quality have been introduced to all three high-tier systems. Efforts are also under way to make these systems more attractive for data services. Accordingly, standards for fax-signal transmission have been established, and standards for circuit-switched data transmission at rates of up to 64 kilobits per second (kbps) are under development for GSM and CDMA. In addition, technology for packet-switched data transmission, suitable for providing wireless Internet access, is being developed for all second-generation systems. The technology base will continue to grow as R&D organizations worldwide design innovations for a third generation of wireless communications systems. 6

1.4.3 Third-Generation Systems

The original concept for third-generation wireless systems emerged from an International Telecommunications Union (ITU) initiative known as the future public land mobile telecommunication system (FPLMTS). 7 Over the past decade the ITU advanced the concept of a wireless system that would encompass technical capabilities a clear step above those of second-generation cellular systems. The current name for the third-generation system is International Mobile Telecommunications-2000 (IMT-2000). The number refers to an early target date for implementing the new technology and also the frequency band (around 2000 MHz) in which it would be deployed.

As envisioned in the IMT-2000 project, the third-generation wireless system would have a worldwide common radio interface and network. It would support higher data rates than do second-generation systems yet be less expensive. It would also advance other aspects of wireless communications by reducing equipment size, extending battery life, and improving ease of operation. In addition, the system would support the services required in developing as well as developed nations. Box 1-6 lists the complete set of goals established in 1990 for FPLMTS.

Since 1990 IMT-2000 recommendations have been approved that elaborate on the initial goals, establish security principles, prescribe a network architecture, present a plan for developing nations, establish radio interface requirements, and specify a framework for a satellite component. The ITU anticipated an international competition leading to a radio interface that could be developed and deployed by the year 2000. The competing radio interfaces would provide a minimum outdoor data rate of 384 kbps and an indoor rate of 2 Mbps. Other than providing a forum for discussion of standards proposals, the ITU has not adopted clear plans for how to proceed beyond the point of reviewing the proposals.

The 1995 World Radio Conference set aside spectrum for nations to consider for the deployment of IMT-2000. The bands are 1920-1980 MHz and 2110-2170 MHz for terrestrial communications and 1980-2010 MHz and 2170-2200 MHz for satellites. As noted in Table 1-4 and Table 1-5, the United States has already allocated spectrum bands to personal communications that include part of the lower IMT-2000 band, making it unlikely that U.S. service providers could deploy IMT-2000 at all. Early on, attention to the ITU work was limited in both Europe and the United States, where growth in second-generation digital cellular and personal communications markets has been strong. It was the Japanese, virtually alone among all nations, who insisted that the ITU program proceed as fast as possible because they were running out of spectrum for their cellular and personal communications systems. 8 The Japanese were able to keep the IMT-2000 program on schedule, resulting in an ITU call for radio-interface proposals, now due in mid-1998. In support of this effort, the Japanese radio standards group is developing one or more Japanese standards for use in the IMT-2000 spectrum. Presumably the standard(s) will be submitted to the ITU for possible worldwide use.

Meanwhile, the European telecommunications industry established a framework for developing third-generation mobile wireless technology. The universal mobile telephone system (UMTS) is intended to replicate the commercial success achieved a decade earlier with GSM. The UMTS schedule calls for establishing the technology base by December 1997, deploying a minimum system in 2002, and achieving a full system in 2005. The technical goals of UMTS closely resemble many of the IMT-2000 goals. The Europeans plan to propose the technologies adopted for UMTS as candidates for IMT-2000.

In the United States, action on this issue did not take place until mid-1997, when the four U.S. CDMA cellular infrastructure manufacturers—Lucent Technologies, Motorola, Nortel, and QUALCOMM, Inc.—announced a third-generation program called Wideband cdmaOne. Like many candidate systems under consideration in Europe and Japan, the U.S. system uses a 5-MHz CDMA signal, although the operating parameters and design features differ from those of foreign counterparts. Additional U.S. proposals for IMT-2000 could emerge from other communities of companies supporting other digital radio interface standards. 9

Among related developments, interest in "nomadicity" is growing within the Internet community in the United States. As originally conceived, the national information infrastructure (NII) placed little emphasis on the wireless delivery of information to mobile users (Computer Science and Telecommunications Board, 1994). But with the growth in demand for Internet services, reflected by the transition to private suppliers, providers are seeking to leverage Internet technology either directly or as part of heterogeneous networks. Plans are being made to accommodate nomads (i.e., mobile users) who draw on a variety of communications, computing, and information systems simultaneously, a concept that will require attention by multiple industries to issues such as security, interoperability, and synchronization within and between systems (Cross-Industry Working Team, 1995).

Other ITU activities are addressing network aspects of IMT-2000. 10 Here again the Japanese have made major contributions toward the establishment of a single worldwide network to support wireless systems. Only in mid-1997 did the U.S. and European delegations begin to make significant contributions, concerned about their current investments in cellular and personal communications networks and the possible effects of establishing a worldwide network that was incompatible with their systems. The latest U.S. and European proposals emphasize the idea of a family of networks supporting a family of radio interfaces through the use of appropriate gateways to achieve worldwide roaming and interoperability.

Although it is clear that many new wireless communications technologies will emerge in the 2002-2005 time frame, it is not clear when and how they will be commercialized. The robust evolution of second-generation systems will limit commercial incentives to introduce a new generation of systems. It is possible that advances in second-generation systems will meet future demand for mobile telephone services and that a demonstrated demand for high-bit-rate data services will be necessary to stimulate the commercial deployment of third-generation technology.

1.5 Commercial Satellite Systems

Satellite systems can be classified by frequency and orbit. Above 1 GHz a satellite signal easily penetrates the ionosphere. Transmission at higher frequencies is desirable because additional bandwidth is available there, but then expensive components are needed to overcome signal attenuation, absorption, and path loss (see Chapter 2 for a discussion of channel impairments). Most satellite systems are of the geostationary Earth orbit (GEO) variety, offering configuration simplicity, a wide footprint (i.e., one satellite covers an entire geographical region), and fixed satellite-to-ground-terminal characteristics. But GEO systems also have a number of disadvantages, including long propagation delays (a round trip takes approximately half a second), high transmitter-power requirements, and poor coverage at the far northern and southern latitudes. Moreover, GEO satellites are expensive to launch, and, because only a handful of satellites are typically used to achieve global coverage, they are vulnerable to single points of failure.

The International Maritime Satellite (INMARSAT) Organization, formed in 1979, is now backed by the governments of 75 member countries. Its first satellites (INMARSAT-A) became operational in 1982, supporting voice and low-rate data applications with analog FM technology. By the end of 1993, 30,000 ground terminals were in operation. The next generation of INMARSAT satellites (INMARSAT-B and C) used digital technology, but data rates remained low (600 bps). With the introduction of INMARSAT-M in 1996 it is now possible to use laptop-computer-sized satellite terminals for voice and low-rate (2.4 kbps) data transmission. However, the voice quality of this system remains poor due to propagation delay, and data transmission rates are 10 times slower than those of a standard modem.

In the late 1980s QUALCOMM deployed the OMNITracs vehicle-tracking and communications system for both North America (using GSTAR satellites) and Europe (using EUTELSAT satellites). The service provides two-way messaging and automatic position reporting. By 1997 more than 200,000 trucks, most of them in the United States, were equipped with the system. The use of such systems in Europe has been restricted by high equipment costs and expectations for less-costly alternatives with the next generation of systems.

Recently introduced GEO systems for data communications include Mobilesat in Australia and MSAT in North America (see Table 1-6). Innovations in GEO systems include spot beams for custom broadcast coverage and improved on-board processing. Although GEO satellite communications systems are not fully mobile (i.e., the terminals are not handheld), innovations in terminal design have enabled the development of private networks and rapidly reconfigurable systems. Very small aperture terminals (VSATs) use small Earth-station antennas to form private networks through links to GEO satellites. The VSAT is the result of more than 20 years of advances in digital Earth-station technology. The applications have evolved from point-to-point transmission links to networking terminals that leverage the broadcasting capability of satellites.

The VSAT terminals offer various types of access. Fast-response protocols are used for time-sensitive transactions such as credit card purchases and hotel or airline reservations, throughput-efficient access is used for file transfers, and circuit-switched access is used for speech and digital video. (Throughput is the fraction of time during which a channel can be used.) An important feature of VSAT technology is ease of deployment: Installation takes approximately 2 hours. Companies are now installing VSATs at the rate of more than 1,500 per month. There are more than 200,000 VSATs worldwide, operating in nearly every country; individual networks range in size from as few as 20 nodes operating in a shared-hub environment to nearly 10,000 in the General Motors Corporation network.

In 1994 direct-broadcast satellites (DBSs) became operational, some two decades after the first experiments were performed with this technology. These systems broadcast a signal from a GEO satellite with sufficient power to allow direct reception in a home, office, or vehicle with an inexpensive receiver. The two primary applications for DBS systems are television and radio; emerging applications include DirecPC and GBS. Systems for direct-broadcast television are operational in Europe, Japan, and the United States. By the end of 1996 these systems had more than 2.5 million U.S. subscribers. Digital audio broadcasting (DAB) has the potential to provide every radio within a service area with continuous transmissions of a sound quality comparable to that of a compact disc. Systems are being tested around the world that deliver DAB from satellites as well as from terrestrial antennas.

Communications systems using non-GEO satellites are emerging as major players in commercial wireless applications. These satellites are characterized as either medium Earth orbit (MEO) or low Earth orbit (LEO). The LEOs, deployed in either circular or elliptical orbits of 500 to 2,000 km, offer several advantages including reduced propagation delay and low transmit-power requirements, allowing the use of handheld terminals. But at these altitudes a system requires many satellites to achieve global coverage. Furthermore, satellite movement relative to the ground terminal introduces Doppler shift in the received signal, and each satellite is visible from a ground terminal for only a few minutes at a time so that handoffs between satellites are frequent. The MEO satellites offer features that represent a compromise between LEOs and GEOs. The MEOs are deployed in circular orbits at an altitude of about 10,000 km. Approximately 10 to 15 satellites (more than GEOs but fewer than LEOs) are required for global coverage, and average visibility is one to two hours per satellite (less than for GEOs but more than for LEOs). The Doppler shift in MEOs is also considerably less than that in LEOs, but higher transmit power is required.
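The delay figures above follow directly from altitude and the speed of light. As a worked check (assuming straight vertical paths and ignoring ground-station geometry), the following sketch computes one-hop and two-way delays for representative LEO, MEO, and GEO altitudes; it reproduces the roughly half-second GEO round trip cited earlier.

```python
# Propagation delay versus orbit altitude, assuming straight vertical
# paths at the speed of light (ground-station geometry ignored).

C = 3.0e8  # speed of light, m/s

orbits_km = {"LEO": 1_000, "MEO": 10_000, "GEO": 35_786}

for name, altitude_km in orbits_km.items():
    one_hop = altitude_km * 1_000 / C          # ground -> satellite
    # A two-way exchange through one satellite crosses the link four
    # times: up and down for the question, up and down for the reply.
    round_trip = 4 * one_hop
    print(f"{name}: one hop {one_hop*1e3:.0f} ms, "
          f"round trip {round_trip*1e3:.0f} ms")

# GEO comes out near 0.5 s round trip, matching the half-second
# figure cited earlier in this section.
```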

The majority of new satellite systems that will become operational by the year 2000 are LEO or MEO systems. These satellites can be categorized further by size. Big LEO/MEOs (see Table 1-7) support voice and data communications with large satellites (weighing 400–2,000 kilograms [kg]) and operate at frequencies above 1 GHz. Little LEOs use much smaller satellites (weighing 40–100 kg) and operate in the UHF and VHF bands, thereby enabling the use of inexpensive transmission hardware for both the satellite and ground terminal. The 36-satellite Orbcomm system is an example.

Most of these systems provide voice and low-rate data to mobile users with handheld terminals. The link rates for little LEOs are asymmetric, with lower rates on the uplink (ground to satellite) than on the downlink (satellite to ground) because of power limitations in the handheld unit. Teledesic is unusual because it is intended primarily for broadband wireless data communications with stationary terminals at integrated services digital network (ISDN) rates. Teledesic and Iridium have direct intersatellite communication links independent of the ground segment, enabling the provision of services to countries lacking a communications infrastructure. Iridium is designed to consist of 66 satellites arranged in six planes, all in a nearly polar orbit. Each satellite is expected to serve as a "switchboard in the sky," routing each channel of voice traffic through various other satellites in the system; communications are eventually delivered to an appropriate ground-based gateway to terrestrial telecommunications.

Globalstar is a LEO digital telecommunications system that will offer wireless telephone, data, paging, fax, and position location services worldwide beginning in 1998. The 48-satellite constellation, operating 1,410 km above Earth's surface, serves as a "bent-pipe" relay to local ground-based infrastructure.

1.6 Mobile Data Services

Commercial packet-switched mobile data services emerged after the success of short-message, alphanumeric one-way paging systems. Mobile data networks provide two-way, low-speed, packet-switched data communication links; early systems restricted message sizes to roughly 10 to 20 kilobytes. The principal networks and services are described below.

The first commercial mobile data network was Ardis, a private network developed in 1983 by IBM Corporation and Motorola to enable IBM to provide computing facilities in the field. By 1990 Ardis was deployed in more than 400 metropolitan areas and 10,700 cities and towns using 1,300 base stations. By 1994 Ardis (since then owned by Motorola) provided nationwide roaming for approximately 35,000 users, at a rate of 45 million messages per month, and a data rate of 19.2 kbps.

In 1986, Swedish Telecomm and Ericsson Radio Systems AB introduced Mobitex and deployed it in Sweden. This system is available in the United States, Norway, Finland, Great Britain, the Netherlands, and France. The system supports a data rate of 8 kbps and nationwide roaming (international roaming is planned). This service is distributed by RAM Mobile Data in the United States, where by 1994 it had 12,000 subscribers. A total of 840 base stations are connected to 40 switching centers to cover 100 metropolitan areas and 6,300 cities and towns.

Cellular digital packet data (CDPD) technology was developed by IBM, which together with nine operating companies formed the CDPD Forum to develop an open standard and multivendor environment for a packet-switched network using the physical infrastructure and frequency bands of the AMPS systems. The CDPD specification was completed in 1993 with key contributions from IBM, McCaw Cellular Communications, Inc., and Pacific Communications Sciences, Inc. Deployment of the 19.2-kbps CDPD infrastructure, designed to make use of idle channels in analog cellular systems, commenced in 1995.

In the 1990s Metricom, Inc., developed a metropolitan-area network that was deployed first in the San Francisco Bay area and then in Washington, D.C. The signaling rate of this system is advertised at 100 kbps, but the actual data rate is substantially slower. The Metricom system uses "frequency hopping" spread-spectrum (FHSS) technology in the lower frequencies (around 900 MHz) of the unlicensed industrial, scientific, and medical (ISM) bands. 11
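The frequency-hopping idea itself is simple: transmitter and receiver derive the same pseudorandom channel sequence from a shared seed, so a narrowband interferer corrupts only an occasional dwell. The channel count, width, and seed in the sketch below are illustrative assumptions, not Metricom's actual parameters.

```python
# Sketch of frequency-hopping spread spectrum (FHSS) in the 902-928 MHz
# ISM band. Channel count, width, and seed are illustrative only.
import random

BAND_START_MHZ = 902.0
CHANNEL_WIDTH_MHZ = 0.16          # illustrative channel width
NUM_CHANNELS = 160                # fills most of the 26-MHz band

def hop_sequence(seed: int, hops: int) -> list[float]:
    """Pseudorandom hop sequence; both ends derive it from a shared seed."""
    rng = random.Random(seed)
    channels = list(range(NUM_CHANNELS))
    return [BAND_START_MHZ + CHANNEL_WIDTH_MHZ * rng.choice(channels)
            for _ in range(hops)]

# Transmitter and receiver agree on the seed, so they hop in lockstep.
print(hop_sequence(seed=42, hops=5))
```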

In 1996 the European Telecommunications Standards Institute (ETSI) standard for mobile data services, trans-European trunked radio (TETRA), was completed. It is currently being used primarily for public safety purposes. Work is in progress to enhance the digital cellular and personal communications technologies. More recently, the digital cellular standards (GSM, IS-95, PHS, PACS, and IS-136) have been updated to support packet-switched mobile data services at a variety of data rates. Key features of existing mobile data services are shown in Table 1-8. Although many services are available, the mobile data market has grown more slowly than have voice services.

1.7 Wireless Local Area Networks

Wireless LANs provide data rates exceeding 1 Mbps in coverage areas with dimensions on the order of tens of meters. They are used for a wide variety of applications.

In 1990 the Institute of Electrical and Electronics Engineers (IEEE) formed a committee to develop a standard for wireless LANs operating at 1 and 2 Mbps. In 1992 the ETSI chartered a committee to develop a standard for high-performance radio LANs (HIPERLAN) operating at 20 Mbps.

Table 1-9 indicates the technical features of various LAN products (including some that use the infrared portion of the spectrum and are therefore not examined in detail in this report). The market for wireless LAN products is growing rapidly but not nearly as fast as the market for wireless voice applications. The $200 million market for wireless LANs is tiny compared to the cellular industry, which is worth billions (Wickelgren, 1996).

1.8 Comparison Of International Research, Development, And Deployment Strategies

Commercial wireless technologies have followed divergent evolutionary paths in different parts of the world. For example, strong contrasts are evident in the transition from first-generation cellular systems to second-generation systems in the United States and Europe. At first a single U.S. system was used for analog cellular communications, AMPS, and every cellular telephone in the United States and Canada could communicate with every base station. By contrast, European users were faced with a complex mixture of incompatible analog systems. To maintain mobile telephone service, an international traveler in Europe needed up to five different telephones. The situation was reversed by second-generation systems. Now there is a single digital technology, GSM, deployed throughout Europe (and in more than 100 countries worldwide), whereas the United States has become a technology battleground for three competitors: GSM (DSC-1900), TDMA (IS-136), and CDMA (IS-95).

The differences in technology evolution are due in large measure to different government policies in Europe, the United States, and Japan, the world's principal sources of wireless technologies. Three types of government policies influence developments in wireless systems: policies on radio spectrum regulation, approaches to R&D, and telecommunications industry structure. The reasons for the shifts in the above example can be found primarily in changes in spectrum regulation policies adopted in the 1980s. In establishing first-generation systems in the United States in the late 1970s, the FCC regulated four properties of a radio system: noninterference, quality, efficiency, and interoperability. In the 1980s, deregulation was in vogue and the scope of the FCC's authority was restricted to noninterference; the other properties were deemed commercial issues to be settled in the marketplace. Although this policy stimulated innovation in the U.S. manufacturing industry, it also meant that operating companies had to choose among various competing technologies.

In Europe, the main trend in government regulation in the 1980s was a move from national authority to multinational regulation under the aegis of the European Community (EC; now the European Union [EU]). The EC had a strong interest in establishing continental standards for common products and services, including electric plugs and telephone dialing conventions. In this context the notion of a telephone that could be used throughout Europe had a strong appeal. To advance this notion, the EC offered new spectrum for cellular service on the condition that the operating industries of participating countries agree on a single standard. Attracted by the availability of free spectrum, operating companies (many of them government-owned) in 15 countries put aside national rivalries and adopted the GSM standard.

Thus, a new pattern of technical cooperation was established in Europe. This cooperation was reinforced by the European Commission (the administrative unit of the EU), which funded cooperative precompetitive research focusing on advanced communications systems, first in the Research for Advanced Communications in Europe (RACE) program and then in the Advanced Communications Technologies and Services (ACTS) program. In both programs a consortium of companies and universities performs the research. Spectrum management rules continue to prescribe a single standard for each service, meaning that an industry consensus is required before a standard is introduced. Once a technology is established, companies enter the competitive phase of product development and marketing. This process promotes a thorough investigation of technologies prior to standardization and assures economies of scale when commercial service begins. In preparation for UMTS, scheduled for initial deployment in 2002, extensive R&D and evaluation of competing prototypes have been under way since 1994. All of this activity will provide European industry with a strong technical base for realizing the goals for mobile communications in the first decade of the next century.

The U.S. approach to communications technology R&D is much more competitive. Individual companies perform much of this research in the context of their product marketing plans. Coordination takes place within diverse standards organizations such as the Telecommunications Industry Association, IEEE, and American National Standards Institute. Some interaction also takes place in the GloMo program, which brings together universities and industry to fill specific technology gaps identified by DARPA program managers. But for the most part standards setting is a competitive rather than cooperative process, with each company or group of companies striving to protect commercial interests. The FCC rules for spectrum management allow license holders to transmit any signals, subject only to constraints on interference with the signals of other license holders. Similar flexibility is extended to unlicensed transmissions. As a consequence, there are multiple competing standards (seven in the case of wideband personal communications) for wireless service in the United States.

Government policies on industry structure also strongly influence technology development. After the FCC issued cellular operating licenses, most of the companies that began offering cellular service had limited technical resources and relied almost entirely on vendors and consultants for technical expertise. Even the cellular subsidiaries of the regional Bell operating companies had to build a new base of expertise: Under the terms of the consent decree that broke up AT&T in 1984, these cellular companies had no access to the abundant technical resources of Bellcore, the research unit of the regional Bell companies. In this environment, much of the new wireless communications technology in the United States has come from the manufacturing industry, with the result that proprietary rather than open network-interface standards have proliferated. The published technical standards for wireless communications were at first confined to the air interface between terminals and base stations. Eventually the industry adopted a standard for intersystem operation to facilitate roaming. Many other interfaces, especially those between switching centers and base stations, remain proprietary, but the situation is changing to allow fully open systems.

By contrast, the European cellular operating industry has been dominated by national telephone monopolies. These companies have strong research laboratories that participate fully in technology creation and standards setting. To gain the advantage of flexibility in equipment procurement, operating companies favor mandatory open interfaces, a preference reflected in the GSM standard.

Little has been published concerning the factors that influence the evolution of wireless communications technology in Japan. In recent years NTT, the dominant telecommunications operating company, has provided a strong coordinating mechanism for creating and standardizing new technology. The biggest success has been PHS, which entered commercial service in 1995 and attracted 4 million subscribers in its first year of operation. The initial R&D for PHS was conducted by NTT, but it licenses many manufacturers to offer PHS equipment. Now many Japanese companies are cooperating in a study of wideband CDMA technology for third-generation systems. A joint experimental trial of one system is scheduled for the end of 1997. In addition to corporate R&D, a government organization, Research and Development Center for Radio Systems, is a significant source of wireless communications technology in Japan.

Worldwide efforts to guide the evolution of wireless communications technology come together in the IMT-2000 project. National delegations to IMT-2000 reflect their country's policies: The U.S. delegation pushes for diversity, 12 the Europeans advocate a structure favorable to UMTS and its descendants, and the Japanese delegation favors convergence to a small number of worldwide standards. Other countries assert their own service needs, which in some cases can be met by mobile communications satellites and in other cases by wireless local loops.

1.9 Summary And Report Organization

The history of wireless communications suggests a number of key points to be considered in evaluating potential future strategies for the DOD and DARPA. Wireless technology has now evolved to a point where the goal of "anytime, anywhere" communications is within reach. Since 1980 consumer demand for cordless and cellular telephones has driven rapid growth in wireless services, especially for voice communications. Wireless data services have not taken off as yet, although expectations are high given the growth of Internet applications. Extensive research is under way to develop third-generation commercial wireless systems, which are expected to be in place before 2010. These trends suggest that the DOD will continue to have an ample selection of advanced commercial wireless technologies from which to choose.

The DOD, which currently uses a variety of wireless systems based on 1970s and 1980s technology, is relying increasingly on commercial wireless products to cope with reductions in defense budgets and the growing need for flexible systems that can be deployed rapidly. In the Gulf War, the DOD used commercial equipment such as GPS receivers and INMARSAT links and found that performance was comparable to that of technologies designed explicitly to meet military needs. However, the DOD will continue to have unique needs for security, interoperability, and other features that might not be met by commercial products. The gaps between commercial technologies and military needs are difficult to identify precisely because, although the DOD has defined its vision for future untethered systems in general terms, projected operational needs have apparently not been translated into technical specifications that conform to the capabilities of commercial products.

The GloMo program and other military R&D efforts are attempting to meet DOD's future communications needs and have produced some useful results. However, none of these programs has adopted a systems approach to the problem, most notably with respect to the design of a network architecture. There may be other unmet needs as well; however, the committee based its work on first principles rather than an assessment of GloMo. A new strategy may be needed to identify the needs more specifically as a basis for determining where to focus DARPA's R&D efforts and where commercial products will suffice.

The effort to evaluate commercial technologies in light of defense needs will be complicated by the characteristics of the U.S. marketplace. In Europe there is a single standard (GSM) for digital wireless communications, and precompetitive research on new wireless technologies is carried out in cooperative, government-funded programs. The U.S. wireless market features a mixture of competing standards, and most technology R&D is conducted by individual companies. This environment forces operators to choose from an assortment of competing technologies.

The remainder of this report is an attempt to help the DOD devise strategies for making those choices. Chapter 2 provides technical background on the many issues that need to be addressed in designing wireless communications systems, which are extremely complex. The highly technical discussion may not interest all readers but is fundamental to any informed analysis of wireless systems. Chapter 3 explores the opportunities for and barriers to synergy between the military and commercial sectors in the development of wireless technologies. Chapter 4 integrates all the information presented in this report to provide a set of recommendations for the DOD and DARPA.

1. This report does not address unguided optical communications systems, which use the 10³–10⁷ gigahertz frequency band (infrared, visible, and ultraviolet light), because the commercial products that operate in these bands are designed for indoor applications and therefore would not be of great use in military applications.

2. A protocol is a set of rules, encoded in software, for performing specific functions.

3. The developments since the mid-1970s, when the use of computer networks moved beyond the ARPA research community, paved the way for commercial services. The CSNet project, funded by the National Science Foundation (NSF) for the computer science community, eventually led to the NSFNET and a dramatic increase in the number of interconnected nodes. The commercialization of Internet service was symbolized by the decommissioning of the ARPANET in 1990 and privatization of the NSFNET in 1995.

4. Two types of codes are used to spread the signal. A long code is reserved for use by the military to obtain location information within a few meters of accuracy and timing information within 100 nanoseconds. A shorter code is used by commercial systems to obtain location information accurate to within 100 meters.

5. A fourth digital modulation technique, based on Motorola's iDEN technology, is used by some specialized U.S. mobile radio services in the lower 800-MHz band to provide cellular-like voice, trunked radio, paging, and messaging services.

6. One integrated solution not addressed in detail in this report is the new generation of public safety radio networks. These systems are used in both the military and commercial sectors for applications such as law enforcement and fire fighting. Until recently these systems were characterized simply as 25-kilohertz FM voice radios and 9.6-kbps modems. In the past a municipal law enforcement radio system typically was deployed as a redundant overlay of towers and repeaters separate from the radio systems operated by fire, health, highway, and other municipal departments. Today's tight budgets often force municipalities to pool departmental funds to upgrade public safety radios and establish a single system with enough capacity to meet every user's needs. To assist in this process the Association of Public Safety Communication Officers (APCO), which includes law enforcement, highway, forestry, health, and many other municipal and federal users, recently initiated an ambitious program called Project-25 to reduce the cost of next-generation radios. APCO Project-25 seeks to reduce user dependence on proprietary radios from a single manufacturer (generally the system installer) and introduce cost competition in the upgrading and replacement market at the municipality level. The strategy is to standardize a digital-modulation radio, which would be described as APCO Project-25 compliant, thus opening up public radio purchasing to a variety of competing manufacturers. Some radios that are APCO Project-25 compliant are now available and are being adopted by the Federal Law Enforcement Radio Users Group (representing radio users in the Federal Bureau of Investigation, Drug Enforcement Agency, Secret Service, Department of the Treasury, and other civilian agencies). The APCO Project-25 process has encouraged an unprecedented level of cooperation among municipal radio users.

7. These activities are carried out by the ITU Radiocommunication Sector (ITU-R) Working Party 8/13, later renamed ITU-R Task Group 8/1.

8. The implementation of standards based on IMT-2000 in Japan clearly would give Japanese companies early experience with the technology and perhaps position them to dominate future world markets for IMT-2000 products.

9. Although optical communications systems are not addressed in detail in this report, in large part because the commercial research focuses on indoor applications, the advantages of laser systems need to be mentioned. A laser produces optical radiation by stimulating emissions from an electronic or chemical material. Unlike light produced by incandescent or fluorescent sources, the resultant beam is coherent and exhibits extremely low angular divergence, properties that enable transmissions spanning great distances (i.e., thousands of miles). The data, voice, images, or other signals are modulated on a beam of light, which is detected by an optical receiver and decoded. The transmitter and receiver need to be in direct visual contact, and so the laser beam is steered in the appropriate direction using mirrors or other optical elements. Laser communications systems offer several advantages over RF systems. The main advantage is high capacity: Systems now under development will support transmissions in the range of hundreds of megabits per second, with systems under consideration attaining the gigabits-per-second range. Another advantage is the low power requirement for point-to-point communications (orders of magnitude lower than RF systems). All the energy is focused into a very narrow beam because the physical dispersion of a laser beam in space is minimal. Furthermore, laser communications systems offer security benefits because almost no energy is diffused outside the laser beam, which is therefore not easily detected by an adversary. This combination of features makes laser communications systems attractive for secure transmissions between hub points in mobile, dynamically changing environments (e.g., between base stations on vehicle-mounted switching facilities). However, laser systems are sensitive to interference from other light sources, such as the sun, and any obstructions of the visual link by dust, rain, or fog. There is also a risk of damage to the eyes of unprotected observers. Finally, components for laser-based systems are much more expensive than those for RF systems and therefore are unlikely to penetrate the commercial market for some time.

10. These activities are carried out by the ITU Telecommunications Sector, Study Group 11.

11. The ISM bands (at 902–928 MHz, 2400–2483 MHz, and 5700–5850 MHz) are available for any wireless device that uses less than 1 watt of transmit power.

12. The United States participates in the IMT-2000 process in Task Group 8/1 through a delegation led by the FCC.


11.2 The Evolution of the Internet

Learning Objectives

  • Define protocol and decentralization as they relate to the early Internet.
  • Identify technologies that made the Internet accessible.
  • Explain the causes and effects of the dot-com boom and crash.

From its early days as a military-only network to its current status as one of the developed world’s primary sources of information and communication, the Internet has come a long way in a short period of time. Yet there are a few elements that have stayed constant and that provide a coherent thread for examining the origins of the now-pervasive medium. The first is the persistence of the Internet—its Cold War beginnings necessarily influencing its design as a decentralized, indestructible communication network.

The second element is the development of rules of communication for computers that enable the machines to turn raw data into useful information. These rules, or protocols, have been developed through consensus by computer scientists to facilitate and control online communication and have shaped the way the Internet works. Facebook is a simple example of a protocol: Users can easily communicate with one another, but only through acceptance of protocols that include wall posts, comments, and messages. Facebook’s protocols make communication possible and control that communication.

These two elements connect the Internet’s origins to its present-day incarnation. Keeping them in mind as you read will help you comprehend the history of the Internet, from the Cold War to the Facebook era.

The History of the Internet

The near indestructibility of information on the Internet derives from a military principle used in secure voice transmission: decentralization. In the early 1960s, the RAND Corporation developed a technology (later called “packet switching”) that allowed users to send secure voice messages. In contrast to a system known as the hub-and-spoke model, where the telephone operator (the “hub”) would patch two people (the “spokes”) through directly, this new system allowed for a voice message to be sent through an entire network, or web, of carrier lines, without the need to travel through a central hub, allowing for many different possible paths to the destination.

During the Cold War, the U.S. military was concerned about a nuclear attack destroying the hub in its hub-and-spoke model; with this new web-like model, a secure voice transmission would be more likely to endure a large-scale attack. A web of data pathways would still be able to transmit secure voice “packets,” even if a few of the nodes—places where the web of connections intersected—were destroyed. Only through the destruction of all the nodes in the web could the data traveling along it be completely wiped out—an unlikely event in the case of a highly decentralized network.
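A toy model makes the argument concrete: represent the web of carrier lines as a graph, remove nodes, and test whether a path survives. The topology in the Python sketch below is invented for illustration.

```python
# A toy mesh network illustrating decentralization: destroy nodes and
# check whether a path still exists. The topology is invented.
from collections import deque

mesh = {
    "A": {"B", "C"}, "B": {"A", "D", "E"}, "C": {"A", "D"},
    "D": {"B", "C", "F"}, "E": {"B", "F"}, "F": {"D", "E"},
}

def path_exists(net, src, dst, destroyed=frozenset()):
    """Breadth-first search that skips destroyed nodes."""
    if src in destroyed or dst in destroyed:
        return False
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nbr in net[node] - destroyed - seen:
            seen.add(nbr)
            queue.append(nbr)
    return False

print(path_exists(mesh, "A", "F"))                        # True
print(path_exists(mesh, "A", "F", destroyed={"D"}))       # True: via B-E
print(path_exists(mesh, "A", "F", destroyed={"D", "E"}))  # False: web cut
```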

This decentralized network could only function through common communication protocols. Just as we use certain protocols when communicating over a telephone—“hello,” “goodbye,” and “hold on for a minute” are three examples—any sort of machine-to-machine communication must also use protocols. These protocols constitute a shared language enabling computers to understand each other clearly and easily.

The Building Blocks of the Internet

In 1973, the U.S. Defense Advanced Research Projects Agency (DARPA) began research on protocols to allow computers to communicate over a distributed network. This work paralleled work done by the RAND Corporation, particularly in the realm of a web-based network model of communication. Instead of using electronic signals to send an unending stream of ones and zeros over a line (the equivalent of a direct voice connection), DARPA used this new packet-switching technology to send small bundles of data. This way, a message that would have been an unbroken stream of binary data—extremely vulnerable to errors and corruption—could be packaged as only a few hundred numbers.

Figure 11.2: Centralized versus distributed communication networks

Imagine a telephone conversation in which any static in the signal would make the message incomprehensible. Whereas humans can infer meaning from “Meet me [static] the restaurant at 8:30” (we replace the static with the word at), computers do not necessarily have that logical linguistic capability. To a computer, this constant stream of data is incomplete—or “corrupted,” in technological terminology—and confusing. Considering the susceptibility of electronic communication to noise or other forms of disruption, it would seem that computer-to-computer transmission would be nearly impossible.

However, the packets in this packet-switching technology have something that allows the receiving computer to make sure the packet has arrived uncorrupted. Because of this new technology and the shared protocols that made computer-to-computer transmission possible, a single large message could be broken into many pieces and sent through an entire web of connections, speeding up transmission and making that transmission more secure.
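
That “something” is, in essence, a checksum: a small value computed from the packet’s contents and sent along with it. The toy version below uses a simple modular sum purely for illustration (the function names are invented); real protocols use stronger checks, but the verify-on-arrival logic is the same.

```python
def checksum(data: bytes) -> int:
    # Toy check: sum of byte values modulo 256. Real protocols use
    # stronger functions, but the receiving side's logic is identical.
    return sum(data) % 256

def make_packet(data: bytes):
    return (data, checksum(data))       # payload plus its checksum

def receive(packet):
    data, declared = packet
    if checksum(data) != declared:
        raise ValueError("packet corrupted in transit; request retransmission")
    return data

good = make_packet(b"at the restaurant")
print(receive(good))                    # b'at the restaurant'

try:
    receive((b"at the restaurXnt", good[1]))  # one flipped character
except ValueError as err:
    print(err)                          # packet corrupted in transit; ...
```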

One of the necessary parts of a network is a host. A host is a physical node that is directly connected to the Internet and “directs traffic” by routing packets of data to and from other computers connected to it. In a normal network, a specific computer is usually not directly connected to the Internet; it is connected through a host. A host in this case is identified by an Internet protocol, or IP, address (a concept that is explained in greater detail later). Each unique IP address refers to a single location on the global Internet, but that IP address can serve as a gateway for many different computers. For example, a college campus may have one global IP address for all of its students’ computers, and each student’s computer might then have its own local IP address on the school’s network. This nested structure allows billions of different global hosts, each with any number of computers connected within their internal networks. Think of a campus postal system: All students share the same global address (1000 College Drive, Anywhere, VT 08759, for example), but they each have an internal mailbox within that system.
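
A minimal sketch of that nested structure follows; all addresses and machine names below are invented for illustration. One global IP address identifies the campus gateway, and the gateway keeps its own table of local addresses for the machines behind it.

```python
# Invented addresses for illustration only.
GLOBAL_IP = "203.0.113.7"    # the single address the outside world sees

# The gateway's internal table: local address -> machine behind the gateway.
local_hosts = {
    "10.0.0.11": "dorm room 114",
    "10.0.0.12": "library terminal 3",
    "10.0.0.13": "physics lab PC",
}

def deliver(global_ip, local_ip, payload):
    """Route a packet first to the campus gateway, then to a machine inside it."""
    if global_ip != GLOBAL_IP:
        return "not addressed to this network; forward elsewhere"
    machine = local_hosts.get(local_ip)
    if machine is None:
        return "no such internal host"
    return f"delivered to {machine}: {payload!r}"

print(deliver("203.0.113.7", "10.0.0.12", "course schedule"))
print(deliver("203.0.113.7", "10.0.0.99", "misaddressed mail"))
```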

The early Internet was called ARPANET, after the U.S. Advanced Research Projects Agency (which added “Defense” to its name and became DARPA in 1973), and consisted of just four hosts: UCLA, Stanford, UC Santa Barbara, and the University of Utah. Now there are over half a million hosts, and each of those hosts likely serves thousands of people (Central Intelligence Agency). Each host uses protocols to connect to an ever-growing network of computers. Because of this, the Internet does not exist in any one place in particular; rather, it is the name we give to the huge network of interconnected computers that collectively form the entity that we think of as the Internet. The Internet is not a physical structure; it is the protocols that make this communication possible.

Figure 11.3: A TCP gateway is like a post office because of the way that it directs information to the correct location.

One of the other core components of the Internet is the Transmission Control Protocol (TCP) gateway. Proposed in a 1974 paper, the TCP gateway acts “like a postal service” (Cerf et al., 1974). Without knowing a specific physical address, any computer on the network can ask for the owner of any IP address, and the TCP gateway will consult its directory of IP address listings to determine exactly which computer the requester is trying to contact. The development of this technology was an essential building block in the interlinking of networks, as computers could now communicate with each other without knowing the specific address of a recipient; the TCP gateway would figure it all out. In addition, the TCP gateway checks for errors and ensures that data reaches its destination uncorrupted. Today, this combination of TCP gateways and IP addresses is called TCP/IP and is essentially a worldwide phone book for every host on the Internet.
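
In the same postal spirit, the gateway’s “phone book” can be modeled as a simple directory lookup. The sketch below is purely illustrative: the host names and addresses are made up, and modern name resolution is far more elaborate, but consulting the directory before sending is the essential idea.

```python
# A toy "phone book" mapping host names to IP addresses (all invented).
directory = {
    "ucla.example":     "192.0.2.1",
    "stanford.example": "192.0.2.2",
    "utah.example":     "192.0.2.3",
}

def resolve(name):
    """Consult the directory, like a gateway checking its IP address listings."""
    try:
        return directory[name]
    except KeyError:
        raise LookupError(f"no listing for {name!r}") from None

def send_message(name, payload):
    address = resolve(name)             # look up the destination before sending
    return f"packet addressed to {address}: {payload!r}"

print(send_message("stanford.example", "hello from UCLA"))
```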

You’ve Got Mail: The Beginnings of the Electronic Mailbox

E-mail has, in one sense or another, been around for quite a while. Originally, electronic messages were recorded within a single mainframe computer system. Each person working on the computer would have a personal folder, so sending that person a message required nothing more than creating a new document in that person’s folder. It was just like leaving a note on someone’s desk (Peter, 2004), so that the person would see it when he or she logged onto the computer.

However, once networks began to develop, things became slightly more complicated. Computer programmer Ray Tomlinson is credited with inventing the naming system we have today, using the @ symbol to denote the server (or host, from the previous section). In other words, name@gmail.com tells the host “gmail.com” (Google’s e-mail server) to drop the message into the folder belonging to “name.” Tomlinson is credited with writing the first network e-mail using his program SNDMSG in 1971. The invention of this simple standard is often cited as one of the most important factors in the rapid spread of the Internet, and e-mail remains one of the most widely used Internet services.
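
Tomlinson’s convention is simple enough to express in a few lines. A sketch (the address is the text’s own example, not a real mailbox):

```python
def route_email(address):
    """Split an address at the @ sign into (mailbox, host), per Tomlinson's rule."""
    mailbox, at, host = address.partition("@")
    if at != "@" or not mailbox or not host:
        raise ValueError(f"not a valid address: {address!r}")
    return mailbox, host

mailbox, host = route_email("name@gmail.com")
print(f"hand the message to host {host!r}, folder {mailbox!r}")
# hand the message to host 'gmail.com', folder 'name'
```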

The use of e-mail grew in large part because of later commercial developments, especially America Online, that made connecting to e-mail much easier than it had been at its inception. Internet service providers (ISPs) packaged e-mail accounts with Internet access, and almost all web browsers (such as Netscape, discussed later in the section) included a form of e-mail service. In addition to the ISPs, e-mail services like Hotmail and Yahoo! Mail provided free e-mail addresses paid for by small text ads at the bottom of every e-mail message sent. These free “webmail” services soon expanded to comprise a large part of the e-mail services that are available today. Far from the original maximum inbox sizes of a few megabytes, today’s e-mail services, like Google’s Gmail service, generally provide gigabytes of free storage space.

E-mail has revolutionized written communication. The speed and relatively inexpensive nature of e-mail make it a prime competitor of postal and delivery services—including FedEx and UPS—that pride themselves on speed. Communicating via e-mail with someone on the other side of the world is just as quick and inexpensive as communicating with a next-door neighbor. However, the growth of Internet shopping and online companies such as Amazon.com has in many ways made postal and shipping companies more prominent—not necessarily for communication, but for delivery and remote business operations.

Hypertext: Web 1.0

In 1989, Tim Berners-Lee, a graduate of Oxford University and a software engineer at CERN (the European particle physics laboratory), had the idea of using a new kind of protocol to share documents and information throughout the local CERN network. Instead of transferring regular text-based documents, he created a new language called hypertext markup language (HTML). Hypertext was a new word for text that goes beyond the boundaries of a single document. Hypertext can include links to other documents (hyperlinks), text-style formatting, images, and a wide variety of other components. The basic idea is that documents can be constructed out of a variety of links and can be viewed just as if they were on the user’s computer.

This new language required a new communication protocol so that computers could interpret it, and Berners-Lee decided on the name hypertext transfer protocol (HTTP). Through HTTP, hypertext documents can be sent from computer to computer and can then be interpreted by a browser, which turns the HTML files into readable web pages. The browser that Berners-Lee created, called World Wide Web, was a combination browser-editor, allowing users to view other HTML documents and create their own (Berners-Lee, 2009).
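
Underneath, HTTP is just lines of text exchanged over a connection, which is why any browser and any server can interoperate. A minimal sketch using Python’s standard library fetches the reserved test domain example.com; note that a modern server expects HTTP/1.1 and a Host header, details that postdate Berners-Lee’s original protocol, and the snippet assumes network access.

```python
import socket

# Open a TCP connection to a web server and speak HTTP to it directly.
with socket.create_connection(("example.com", 80)) as conn:
    request = (
        "GET / HTTP/1.1\r\n"      # ask for the root document
        "Host: example.com\r\n"   # which site we want on this server
        "Connection: close\r\n"   # close the connection when done
        "\r\n"                    # blank line ends the request
    )
    conn.sendall(request.encode("ascii"))

    response = b""
    while chunk := conn.recv(4096):
        response += chunk

# Status line and headers first, then the HTML a browser would render.
print(response.decode("utf-8", errors="replace")[:300])
```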

Figure 11.4: Tim Berners-Lee’s first web browser was also a web page editor.

Modern browsers, like Microsoft Internet Explorer and Mozilla Firefox, only allow for the viewing of web pages; other increasingly complicated tools are now marketed for creating web pages, although even the most complicated page can be written entirely from a program like Windows Notepad. The reason web pages can be created with the simplest tools is the adoption of certain protocols by the most common browsers. Because Internet Explorer, Firefox, Apple Safari, Google Chrome, and other browsers all interpret the same code in more or less the same way, creating web pages is as simple as learning how to speak the language of these browsers.

In 1991, the same year that Berners-Lee created his web browser, the Internet connection service Q-Link was renamed America Online, or AOL for short. This service would eventually grow to employ over 20,000 people, on the basis of making Internet access available (and, critically, simple) for anyone with a telephone line. Although the web in 1991 was not what it is today, AOL’s software allowed its users to create communities based on just about any subject, and it only required a dial-up modem—a device that connects any computer to the Internet via a telephone line—and the telephone line itself.

In addition, AOL incorporated two technologies—chat rooms and Instant Messenger—into a single program (along with a web browser). Chat rooms allowed many users to type live messages to a “room” full of people, while Instant Messenger allowed two users to communicate privately via text-based messages. The most important aspect of AOL was its encapsulation of all these once-disparate programs into a single user-friendly bundle. Although AOL was later disparaged for customer service issues like its users’ inability to deactivate their service, its role in bringing the Internet to mainstream users was instrumental (Zeller Jr., 2005).

In contrast to AOL’s proprietary services, the World Wide Web had to be viewed through a standalone web browser. The first of these browsers to make its mark was the program Mosaic, released by the National Center for Supercomputing Applications at the University of Illinois. Mosaic was offered for free and grew very quickly in popularity due to features that now seem integral to the web. Things like bookmarks, which allow users to save the location of particular pages without having to remember them, and images, now an integral part of the web, were inventions that made the web more usable for many people (National Center for Supercomputing Applications).

Although the web browser Mosaic has not been updated since 1997, developers who worked on it went on to create Netscape Navigator, an extremely popular browser during the 1990s. AOL later bought the Netscape company, and the Navigator browser was discontinued in 2008, largely because Netscape Navigator had lost the market to Microsoft’s Internet Explorer web browser, which came preloaded on Microsoft’s ubiquitous Windows operating system. However, Netscape had long been converting its Navigator software into an open-source program called Mozilla Firefox, which is now the second-most-used web browser on the Internet (detailed in Table 11.1, “Browser Market Share (as of February 2010)”) (NetMarketShare). Firefox represents about a quarter of the market—not bad, considering its lack of advertising and Microsoft’s natural advantage of packaging Internet Explorer with the majority of personal computers.

Table 11.1 Browser Market Share (as of February 2010)

For Sale: The Web

As web browsers became more available as a less-moderated alternative to AOL’s proprietary service, the web became something like a free-for-all of startup companies. The web of this period, often referred to as Web 1.0, featured many specialty sites that used the Internet’s capacity for global, instantaneous communication to create new types of business. Another name for this free-for-all of the 1990s is the “dot-com boom.” During the boom, it seemed as if almost anyone could build a website and sell it for millions of dollars. However, the “dot-com crash” that occurred later that decade seemed to say otherwise. Quite a few of these Internet startups went bankrupt, taking their shareholders down with them. Alan Greenspan, then the chairman of the U.S. Federal Reserve, called this phenomenon “irrational exuberance” (Greenspan, 1996), in large part because investors did not necessarily know how to analyze these particular business plans, and companies that had never turned a profit could be sold for millions. The new business models of the Internet may have done well in the stock market, but they were not necessarily sustainable. Once investors realized their collective mistake and the companies went bankrupt, much of the recent market growth evaporated. The invention of new technologies can bring with it the belief that old business tenets no longer apply, but this dangerous belief—the “irrational exuberance” Greenspan spoke of—is not conducive to long-term growth.

Some lucky dot-com businesses formed during the boom survived the crash and are still around today. For example, eBay, with its online auctions, turned what seemed like a dangerous practice (sending money to a stranger you met over the Internet) into a daily occurrence. A less fortunate company, eToys.com, got off to a promising start—its stock quadrupled on the day it went public in 1999—but then filed for Chapter 11 bankruptcy in 2001 (Barnes, 2001).

One of these startups, theGlobe.com, provided one of the earliest social networking services to explode in popularity. When theGlobe.com went public, its stock shot from a target price of $9 to a close of $63.50 a share (Kawamoto, 1998). The site itself was started in 1995, building its business on advertising. As skepticism about the dot-com boom grew and advertisers became increasingly skittish about the value of online ads, theGlobe.com ceased to be profitable and shut its doors as a social networking site (The Globe, 2009). Although advertising is pervasive on the Internet today, the current model—largely based on the highly targeted Google AdSense service—did not come around until much later. In the earlier dot-com years, the same ad might be shown on thousands of different web pages, whereas now advertising is often specifically targeted to the content of an individual page.

However, that did not spell the end of social networking on the Internet. Social networking had been going on since at least the invention of Usenet in 1979 (detailed later in the chapter), but the recurring problem was always the same: profitability. This model of free access to user-generated content departed from almost anything previously seen in media, and revenue streams would have to be just as radical.

The Early Days of Social Media

The shared, generalized protocols of the Internet have allowed it to be easily adapted and extended into many different facets of our lives. The Internet shapes everything, from our day-to-day routine—the ability to read newspapers from around the world, for example—to the way research and collaboration are conducted. There are three important aspects of communication that the Internet has changed, and these have instigated profound changes in the way we connect with one another socially: the speed of information, the volume of information, and the “democratization” of publishing, or the ability of anyone to publish ideas on the web.

One of the Internet’s largest and most revolutionary changes has come about through social networking. Because of Twitter, we can now see what all our friends are doing in real time; because of blogs, we can consider the opinions of complete strangers who may never write in traditional print; and because of Facebook, we can find people we haven’t talked to for decades, all without making a single awkward telephone call.

Recent years have seen an explosion of new content and services; although the phrase “social media” now seems to be synonymous with websites like Facebook and Twitter, it is worthwhile to consider all the ways a social media platform affects the Internet experience.

How Did We Get Here? The Late 1970s, Early 1980s, and Usenet

Almost as soon as TCP stitched the various networks together, a former DARPA scientist named Larry Roberts founded Telenet, the first commercial packet-switching company. Two years later, in 1977, the invention of the dial-up modem (in combination with the wider availability of personal computers like the Apple II) made it possible for anyone around the world to access the Internet. With availability extended beyond purely academic and military circles, the Internet quickly became a staple for computer hobbyists.

One of the consequences of the spread of the Internet to hobbyists was the founding of Usenet. In 1979, Duke University graduate students Tom Truscott and Jim Ellis connected three computers in a small network and used a series of programming scripts to post and receive messages. In a very short span of time, this system spread all over the burgeoning Internet. Much like an electronic version of community bulletin boards, anyone with a computer could post a topic or reply on Usenet.

The group was fundamentally and explicitly anarchic, as outlined by the posting “What is Usenet?” This document says, “Usenet is not a democracy…there is no person or group in charge of Usenet…Usenet cannot be a democracy, autocracy, or any other kind of ‘-acy’” (Moraes et al., 1999). Usenet was not used only for socializing, however, but also for collaboration. In some ways, the service allowed a new kind of collaboration that seemed like the start of a revolution: “I was able to join rec.kites and collectively people in Australia and New Zealand helped me solve a problem and get a circular two-line kite to fly,” one user told the United Kingdom’s Guardian (Jeffery et al., 2009).

GeoCities: Yahoo! Pioneers

Fast-forward to 1995: The president and founder of Beverly Hills Internet, David Bohnett, announces that the name of his company is now “GeoCities.” GeoCities built its business by allowing users (“homesteaders”) to create web pages in “communities” for free, with the stipulation that the company placed a small advertising banner at the top of each page. Anyone could register a GeoCities site and subsequently build a web page about a topic. Almost all of the community names, like Broadway (live theater) and Athens (philosophy and education), were centered on specific topics (Archive, 1996).

This idea of centering communities on specific topics may have come from Usenet. In Usenet, the domain alt.rec.kites refers to a specific topic (kites) within a category (recreation) within a larger community (alternative topics). This hierarchical model allowed users to organize themselves across the vastness of the Internet, even on a large site like GeoCities. The difference with GeoCities was that it allowed users to do much more than post only text (the limitation of Usenet), while constraining them to a relatively small pool of resources. Although each GeoCities user had only a few megabytes of web space, standardized pictures—like mailbox icons and back buttons—were hosted on GeoCities’s main server. GeoCities was such a large part of the Internet, and these standard icons were so ubiquitous, that they have now become a veritable part of the Internet’s cultural history. The Web Elements category of the site Internet Archaeology is a good example of how pervasive GeoCities graphics became (Internet Archaeology, 2010).
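
Treating each dot as one level of nesting makes the hierarchy explicit. The short sketch below folds a handful of group names (including the alt.rec.kites example above; the others are invented) into a tree, which is essentially how readers navigated Usenet’s topic space:

```python
groups = ["alt.rec.kites", "alt.rec.sailing", "rec.kites", "comp.lang.c"]

tree = {}
for group in groups:
    node = tree
    for level in group.split("."):   # community -> category -> topic
        node = node.setdefault(level, {})

print(tree)
# {'alt': {'rec': {'kites': {}, 'sailing': {}}},
#  'rec': {'kites': {}},
#  'comp': {'lang': {'c': {}}}}
```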

GeoCities built its business on a freemium model, where basic services are free but subscribers pay extra for things like commercial pages or shopping carts. Other Internet businesses, like Skype and Flickr, use the same model to keep a vast user base while still profiting from frequent users. Since loss of online advertising revenue was seen as one of the main causes of the dot-com crash, many current web startups are turning toward this freemium model to diversify their income streams (Miller, 2009).

GeoCities’s model was so successful that the company Yahoo! bought it for $3.6 billion at its peak in 1999. At the time, GeoCities was the third-most-visited site on the web (behind Yahoo! and AOL), so it seemed like a sure bet. A decade later, on October 26, 2009, Yahoo! closed GeoCities for good in every country except Japan.

Diversification of revenue has become one of the most crucial elements of Internet businesses; from The Wall Street Journal online to YouTube, almost every website is now looking for multiple income streams to support its services.

Key Takeaways

  • The two primary characteristics of the original Internet were decentralization and free, open protocols that anyone could use. As a result of its decentralized “web” model of organization, the Internet can store data in many different places at once. This makes it very useful for backing up data and very difficult to destroy data that might be unwanted. Protocols play an important role in this, because they allow some degree of control to exist without a central command structure.
  • Two of the most important technological developments were the personal computer (such as the Apple II) and the dial-up modem, which allowed anyone with a phone line to access the developing Internet. America Online also played an important role, making it very easy for practically anyone with a computer to use the Internet. Another development, the web browser, allowed for access to and creation of web pages all over the Internet.
  • With the advent of the web browser, it seemed as if anyone could make a website that people wanted to use. The problem was that these sites were driven largely by venture capital and grossly inflated initial public offerings of their stock. After failing to secure any real revenue stream, their stock plummeted, the market crashed, and many of these companies went out of business. In later years, companies tried to diversify their investments, particularly by using a “freemium” model of revenue, in which a company would both sell premium services and advertise, while offering a free pared-down service to casual users.

Websites have many different ways of paying for themselves, and this can say a lot about both the site and its audience. The business models of today’s websites may also directly reflect the lessons learned during the early days of the Internet. Start this exercise by reviewing a list of common ways that websites pay for themselves, how they arrived at these methods, and what it might say about them:

  • Advertising: The site probably has many casual viewers and may not necessarily be well established. If there are targeted ads (such as ads directed toward stay-at-home parents with children), then it is possible the site is successful with a small audience.
  • Subscription option: The site may be a news site that prides itself on accuracy of information or lack of bias, whose regular readers are willing to pay a premium for the guarantee of quality material. Alternately, the site may cater to a small demographic of Internet users by providing them with exclusive, subscription-only content.
  • Selling services: Online services, such as file hosting, or offline services and products are probably the clearest way to determine a site’s revenue stream. However, these commercial sites often are not prized for their unbiased information, and their bias can greatly affect the content on the site.

Choose a website that you visit often, and list which of these revenue streams the site might have. How might this affect the content on the site? Is there a visible effect, or does the site try to hide it? Consider how events during the early history of the Internet may have affected the way the site operates now. Write down a revenue stream that the site does not currently have and how the site designers might implement such a revenue stream.

Archive. While GeoCities is no longer in business, the Internet Archive maintains the site at http://www.archive.org/web/geocities.php. Information taken from December 21, 1996.

Barnes, Cecily. “eToys files for Chapter 11,” CNET, March 7, 2001, http://news.cnet.com/2100-1017-253706.html.

Berners-Lee, Tim. “The WorldWideWeb Browser,” 2009, https://www.w3.org/People/Berners-Lee/WorldWideWeb.

Central Intelligence Agency, “Country Comparison: Internet Hosts,” World Factbook, https://www.cia.gov/library/publications/the-world-factbook/rankorder/2184rank.html.

Cerf, Vinton, Yogen Dalal, and Carl Sunshine, “Specification of Internet Transmission Control Program,” December 1974, http://tools.ietf.org/html/rfc675.

Greenspan, Alan. “The Challenge of Central Banking in a Democratic Society” (lecture, American Enterprise Institute for Public Policy Research, Washington, DC, December 5, 1996), http://www.federalreserve.gov/boarddocs/speeches/1996/19961205.htm.

Internet Archaeology, 2010, http://www.internetarchaeology.org/swebelements.htm.

Jeffery, Simon, and others, “A People’s History of the Internet: From Arpanet in 1969 to Today,” Guardian (London), October 23, 2009, http://www.guardian.co.uk/technology/interactive/2009/oct/23/internet-arpanet.

Kawamoto, Dawn. “TheGlobe.com’s IPO one for the books,” CNET, November 13, 1998, http://news.cnet.com/2100-1023-217913.html.

Miller, Claire Cain. “Ad Revenue on the Web? No Sure Bet,” New York Times, May 24, 2009, http://www.nytimes.com/2009/05/25/technology/start-ups/25startup.html.

Moraes, Mark, Chip Salzenberg, and Gene Spafford, “What is Usenet?” December 28, 1999, http://www.faqs.org/faqs/usenet/what-is/part1/.

National Center for Supercomputing Applications, “About NCSA Mosaic,” 2010, http://www.ncsa.illinois.edu/Projects/mosaic.html.

NetMarketShare, “Browser Market Share,” http://marketshare.hitslink.com/browser-market-share.aspx?qprid=0&qpcal=1&qptimeframe=M&qpsp=132.

Peter, Ian. “The History of Email,” The Internet History Project, 2004, http://www.nethistory.info/History%20of%20the%20Internet/email.html.

The Globe, theglobe.com, “About Us,” 2009, http://www.theglobe.com/.

Zeller Jr., Tom. “Canceling AOL? Just Offer Your Firstborn,” New York Times, August 29, 2005, http://www.nytimes.com/2005/08/29/technology/29link.html.

Understanding Media and Culture Copyright © 2016 by University of Minnesota is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.


The Past, Present, and Future of Human Communication and Technology Research: An Introduction

Scott C. D’Urso, The Past, Present, and Future of Human Communication and Technology Research: An Introduction, Journal of Computer-Mediated Communication, Volume 14, Issue 3, 1 April 2009, Pages 708–713, https://doi.org/10.1111/j.1083-6101.2009.01459.x

The study of computer-mediated communication (CMC) and new communication technologies (NCTs) is an established and growing field, not only with respect to the new technologies becoming available but also in the many ways we are adopting them for use. Historically, I have contended that this area of communication research deserves recognition as a primary area of communication studies alongside interpersonal, organizational, health, and rhetorical studies, among others. While the CMC area is still in its infancy, its impact on a variety of areas of human existence cannot be ignored. That said, when I began to work on this special section of the Journal of Computer-Mediated Communication (JCMC), it led me to consider more systematically the question of its place within the larger discipline of communication. This line of research has been gathering strength for more than 25 years and is now a strong and healthy subdiscipline in communication. This special section of JCMC seeks to tie together its rich past, diverse present, and an exciting future of possibilities and challenges, through a series of essays by some of the key contributors in the field today.

Most of the established areas of research in communication are centered on a solid base of theories. The CMC field is no different. From the work on social presence (Short, Williams, & Christie, 1976), information (media) richness (Daft & Lengel, 1984, 1986), critical mass (Markus, 1987), social influence (Fulk, Schmitz, & Steinfield, 1990), social information processing (SIP) (Walther, 1992), social identity and deindividuation (SIDE) (Spears & Lea, 1992), adaptive structuration (DeSanctis & Poole, 1994), hyperpersonal interaction (Walther, 1996), and channel expansion (Carlson & Zmud, 1999) to the mindfulness/mindlessness work of Timmerman (2002), theory development is central to CMC research. While it can be argued that some CMC theories are not exclusive to the study of CMC, the same can be said of some of the core theories of other primary areas such as interpersonal and organizational communication. What is more important is that scholars in this field of research are using these theories as the basis for research today.

CMC research continues to find its way into many top journals today (see, for example, Gong & Nass, 2007; Katz, 2007; Ramirez & Wang, 2008; Stephens, 2007) within our discipline, as well as in sociology, social psychology, and business management (see, for example, D’Urso & Rains, 2008; Katz, Rice, & Aspden, 2001; Walther, Loh, & Granka, 2005). Key contributions to this field date back over 25 years (see, for example, Barnes & Greller, 1994; Baym, 1999; Chesebro, 1985; Hunter & Allen, 1992; Jones, 1995; Korzenny, 1978; Parks & Floyd, 1996; Reese, 1988; Rice, 1980; Rice, 1984; Sproull & Kiesler, 1986; Steinfield, 1992). This diversity of publication outlets and the longevity of this research line are but a few examples of the breadth and depth of CMC research. One key trait of most established fields is the existence of a flagship journal that is the home for that genre of research. In the case of CMC research, JCMC is considered by many to fulfill that role. Published in an online format since 1995, JCMC is now an official publication of the International Communication Association (ICA). Beyond journal publications, it is rather difficult these days to peruse the bookshelves in communication research and not notice the plethora of volumes dedicated to the study of CMC. The importance of the Internet in today's society has undoubtedly played a role in this publication trend; however, many of the books are scholarly and present some of today's best research in this area.

As has been seen with the number of articles and books published on this topic, the number of scholars who study CMC is also increasing. Though a number of the key scholars in this field are housed in other areas such as organizational and interpersonal communication, their work routinely looks at how CMC impacts communication (see Contractor & Eisenberg, 1990; Fulk, Flanagin, Kalman, Monge, & Ryan, 1996; Rice, 1993). One key factor in determining whether CMC research should be a distinct subset of communication research can be seen at annual conferences such as those of the National Communication Association (NCA) and ICA. Here, graduate students who are preparing to enter the job market are seeing more and more openings for faculty positions with CMC as a potential area of specialization. This trend does not appear to be going away anytime soon.

Both NCA and ICA have prominent divisions concerned with understanding CMC. In ICA, the Communication and Technology Division is now the largest in the entire association. In NCA, the Human Communication and Technology Division has a sustained membership of over 500. Looking back at the past several NCA conference programs, one cannot help but notice the presence of this division through its sponsorship of numerous panels and papers. As the recent Cochair of this division, I felt it was time that we made our presence more prominent within NCA. In 2007, we invited a number of prominent scholars to participate in a unique double-length panel discussion. Each of the 10 panelists, featured in this special section, presented and discussed their thoughts on the past, present, and future of CMC research with the audience. The success of the panel, and the interest it generated, led to this special section.

Having reconsidered my original thoughts on identifying CMC research as a primary area of communication research, I have come to the conclusion that the question may have become moot. CMC scholars are uniquely positioned to study the vast impact that communication technologies have had, and are having, on our society. Looking back at past volumes of JCMC, the diversity of topics covered includes interpersonal, medical, psychological, organizational, political, behavioral, and management studies. This diversity of research across disciplines places the CMC field in a unique position to be at the heart of many disciplinary endeavors in communication. Is it, then, a distinct and separate field of communication research? Yes, but without its cross-disciplinary approach, its overall impact on communication research would be hard to imagine.

To highlight the varied aspects of CMC research, this special section presents the thoughts of some of the prominent scholars in today's field of CMC. Rice (this issue) begins with what is likely a common experience for many of us as we struggle with our day-to-day interactions with technology. The particular story that Rice relates focuses on the embeddedness of CMC in our lives today and the challenges we face in understanding these technologies in a larger context. These experiences, and our understanding of their importance to our research, are of particular interest to Baym (this issue), who sees attention to such everyday interactions with technology as a welcome trend. However, we must remain cautious as to what and how we research CMC, both now and in the future. Parks (this issue) offers that a microlevel approach to studying CMC may be problematic as compared to a broader approach to the technologies and their usage over time. To illustrate this point, Jackson's (this issue) discussion of the blending of technologies and concepts through “mashups” drives home the need for a broader approach to how we not only use, but research, CMC.

One of the fastest growing areas of CMC research, social networking, represents what Barnes (this issue) considers another aspect of the convergence of CMC and human interaction. This falls in line with Contractor's (this issue) call for understanding the motivations behind why we seek these networked connections through mediated means. The development of future theory and research in this area has the potential for far-reaching implications across the CMC discipline.

From a theory standpoint, Walther (this issue) wonders whether our field's development suffers from efforts at theoretical consolidation, rather than diversification of explanations and the boundary conditions that are critical in CMC research. Scott (this issue) provides potential directions for research and theory development, but does so with caution because, as he explains, “we can't keep up” with the technological innovations, and it may not be in our best interest to try. Poole (this issue) sees consolidation of our efforts as a potential route forward, through a combined process of data collection and sharing similar to how other disciplines operate. However we choose to proceed, it is clear, as Fulk and Gould (this issue) note, that we face many challenges ahead, but the potential to really enhance the field of CMC research lies in our ability to meet those challenges.

I hope you enjoy what we have assembled here in this special section. There are many areas of research, theory development, and new communication technologies for us to ponder now and in the future. We find ourselves in an exciting period in CMC research history and the future looks very promising.

Barnes, S., & Greller, L. M. (1994). Computer-mediated communication in the organization. Communication Education, 43, 129–142.

Baym, N. K. (1999). Tune in, log on: Soaps, fandom, and online community. Thousand Oaks, CA: Sage.

Carlson, J. R., & Zmud, R. W. (1999). Channel expansion theory and the experimental nature of media richness perceptions. Academy of Management Journal, 42, 153–170.

Chesebro, J. W. (1985). Computer-mediated interpersonal communication. In B. D. Ruben (Ed.), Information and behavior (Vol. 1, pp. 202–222). New Brunswick, NJ: Transaction Books.

Contractor, N. S., & Eisenberg, E. M. (1990). Communication networks and new media in organizations. In J. Fulk & C. W. Steinfield (Eds.), Organizations and communication technology (pp. 145–174). Newbury Park, CA: Sage.

Daft, R. L., & Lengel, R. H. (1984). Information richness: A new approach to managerial behavior and organization design. In B. M. Staw & L. L. Cummings (Eds.), Research in organizational behavior (Vol. 6, pp. 191–233). Greenwich, CT: JAI Press.

Daft, R. L., & Lengel, R. H. (1986). Organizational information requirements, media richness, and structural determinants. Management Science, 32, 554–571.

DeSanctis, G., & Poole, M. S. (1994). Capturing the complexity in advanced technology use: Adaptive structuration theory. Organization Science, 5, 121–147.

D’Urso, S. C., & Rains, S. A. (2008). Examining the scope of channel expansion: A test of channel expansion theory with new and traditional communication media. Management Communication Quarterly, 21, 486–507.

Fulk, J., Flanagin, A. J., Kalman, M. E., Monge, P. R., & Ryan, T. (1996). Connective and communal public goods in interactive communication systems. Communication Theory, 6, 60–87.

Fulk, J., Schmitz, J., & Steinfield, C. W. (1990). A social influence model of technology use. In J. Fulk & C. Steinfield (Eds.), Organizations and communication technology (pp. 117–140). Newbury Park, CA: Sage.

Gong, L., & Nass, C. (2007). When a talking-face computer agent is half-human and half-humanoid: Human identity and consistency preference. Human Communication Research, 33, 163–193.

Hunter, J., & Allen, M. (1992). Adaptation to electronic mail. Journal of Applied Communication Research, 20, 254–274.

Jones, S. G. (1995). Understanding community in the information age. In S. G. Jones (Ed.), Cybersociety: Computer-mediated communication and community (pp. 10–35). Thousand Oaks, CA: Sage.

Katz, J. E. (2007). Mobile media and communication: Some important questions. Communication Monographs, 74, 389–394.

Katz, J. E., Rice, R. E., & Aspden, P. (2001). The Internet, 1995–2000: Access, civic involvement, and social interaction. American Behavioral Scientist, 45, 405–419.

Korzenny, F. (1978). A theory of electronic propinquity: Mediated communication in organizations. Communication Research, 5, 3–23.

Markus, M. L. (1987). Toward a “critical mass” theory of interactive media: Universal access, interdependence and diffusion. Communication Research, 14, 491–511.

Parks, M. R., & Floyd, K. (1996). Making friends in cyberspace. Journal of Communication, 46, 80–97.

Ramirez, A., & Wang, Z. (2008). When online meets offline: An expectancy violations theory perspective on modality switching. Journal of Communication, 58, 20–39.

Reese, S. D. (1988). New communication technologies and the information worker: The influence of occupation. Journal of Communication, 38, 59–70.

Rice, R. E. (1980). The impacts of computer-mediated organizational and interpersonal communication. In M. Williams (Ed.), Annual review of information science and technology, 15 (pp. 221–249). White Plains, NY: Knowledge Industry Publications.

Rice, R. E., & Associates. (1984). The new media: Communication, research and technology. Beverly Hills, CA: Sage.

Rice, R. E. (1993). Media appropriateness: Using social presence theory to compare traditional and new organizational media. Human Communication Research, 19, 451–484.

Short, J., Williams, E., & Christie, B. (1976). The social psychology of telecommunications. London: John Wiley.

Spears, R., & Lea, M. (1992). Social influence and the influence of the “social” in computer-mediated communication. In M. Lea (Ed.), Contexts of computer-mediated communication (pp. 30–65). London: Harvester-Wheatsheaf.

Sproull, L., & Kiesler, S. (1986). Reducing social context cues: Electronic mail in organizational communication. Management Science, 32, 1492–1512.

Steinfield, C. (1992). Computer-mediated communications in organizational settings: Emerging conceptual frameworks and directions for research. Management Communication Quarterly, 5, 348–365.

Stephens, K. K. (2007). The successive use of information and communication technologies at work. Communication Theory, 17, 486–507.

Timmerman, C. E. (2002). The moderating effect of mindlessness/mindfulness upon media richness and social influence explanations of organizational media use. Communication Monographs, 69, 111–131.

Walther, J. B. (1992). Interpersonal effects in computer-mediated interaction: A relational perspective. Communication Research, 19, 52–90.

Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal interaction. Communication Research, 23, 1–43.

Walther, J. B., Loh, T., & Granka, L. (2005). Let me count the ways: The interchange of verbal and nonverbal cues in computer-mediated and face-to-face affinity. Journal of Language and Social Psychology, 24, 36–65.

Scott C. D’Urso (Ph.D., 2004, University of Texas at Austin) is an Assistant Professor of Communication Studies at Marquette University, where he teaches courses focused on organizational and corporate communication and new communication technology. Scott's primary research interests include organizational use of communication technologies such as e-mail, instant messaging and chat. He has published manuscripts on privacy and surveillance in the workplace, communication channel selection, crisis communication and stakeholder issues. He is currently working on several projects including digital divides in organizations, virtual team decision-making, and the role of online identity creation and privacy concerns with social networking websites. Prior to a career in academia, Scott worked for several years as a multimedia specialist/manager of a multimedia production department for a government defense contractor in the Southwest.

The author wishes to thank Yun Xia, and all of the officers of the Human Communication and Technology Division of NCA (past and present) as well as all of the authors who contributed to this special section, and finally, Aimee R. Hardinger, who served as editorial assistant for this special section.


Promises and Pitfalls of Technology

Politics and privacy, private-sector influence and big tech, state competition and conflict, author biography, how is technology changing the world, and how should the world change technology.


Josephine Wolff, “How Is Technology Changing the World, and How Should the World Change Technology?” Global Perspectives, 1 February 2021; 2 (1): 27353. https://doi.org/10.1525/gp.2021.27353


Technologies are becoming increasingly complicated and increasingly interconnected. Cars, airplanes, medical devices, financial transactions, and electricity systems all rely on more computer software than they ever have before, making them seem both harder to understand and, in some cases, harder to control. Government and corporate surveillance of individuals and information processing relies largely on digital technologies and artificial intelligence, and therefore involves less human-to-human contact than ever before and more opportunities for biases to be embedded and codified in our technological systems in ways we may not even be able to identify or recognize. Bioengineering advances are opening up new terrain for challenging philosophical, political, and economic questions regarding human-natural relations. Additionally, the management of these large and small devices and systems is increasingly done through the cloud, so that control over them is both very remote and removed from direct human or social control. The study of how to make technologies like artificial intelligence or the Internet of Things “explainable” has become its own area of research because it is so difficult to understand how they work or what is at fault when something goes wrong (Gunning and Aha 2019).

This growing complexity makes it more difficult than ever—and more imperative than ever—for scholars to probe how technological advancements are altering life around the world in both positive and negative ways and what social, political, and legal tools are needed to help shape the development and design of technology in beneficial directions. This can seem like an impossible task in light of the rapid pace of technological change and the sense that its continued advancement is inevitable, but many countries around the world are only just beginning to take significant steps toward regulating computer technologies and are still in the process of radically rethinking the rules governing global data flows and exchange of technology across borders.

These are exciting times not just for technological development but also for technology policy—our technologies may be more advanced and complicated than ever, but so, too, are our understandings of how they can best be leveraged, protected, and even constrained. The structures of technological systems are determined largely by government and institutional policies, and those structures have tremendous implications for social organization and agency, ranging from open-source, open systems that are highly distributed and decentralized to those that are tightly controlled and closed, structured according to stricter and more hierarchical models. And just as our understanding of the governance of technology is developing in new and interesting ways, so, too, is our understanding of the social, cultural, environmental, and political dimensions of emerging technologies. We are realizing both the challenges and the importance of mapping out the full range of ways that technology is changing our society, what we want those changes to look like, and what tools we have to try to influence and guide those shifts.

Technology can be a source of tremendous optimism. It can help overcome some of the greatest challenges our society faces, including climate change, famine, and disease. For those who believe in the power of innovation and the promise of creative destruction to advance economic development and lead to better quality of life, technology is a vital economic driver (Schumpeter 1942). But it can also be a tool of tremendous fear and oppression, embedding biases in automated decision-making processes and information-processing algorithms, exacerbating economic and social inequalities within and between countries to a staggering degree, or creating new weapons and avenues for attack unlike any we have had to face in the past. Scholars have even contended that the emergence of the term technology in the nineteenth and twentieth centuries marked a shift from viewing individual pieces of machinery as a means to achieving political and social progress to the more dangerous, or hazardous, view that larger-scale, more complex technological systems were a semiautonomous form of progress in and of themselves (Marx 2010). More recently, technologists have sharply criticized what they view as a wave of new Luddites, people intent on slowing the development of technology and turning back the clock on innovation as a means of mitigating the societal impacts of technological change (Marlowe 1970).

At the heart of fights over new technologies and their resulting global changes are often two conflicting visions of technology: a fundamentally optimistic one that believes humans use it as a tool to achieve greater goals, and a fundamentally pessimistic one that holds that technological systems have reached a point beyond our control. Technology philosophers have argued that neither of these views is wholly accurate and that a purely optimistic or pessimistic view of technology is insufficient to capture the nuances and complexity of our relationship to technology (Oberdiek and Tiles 1995). Understanding technology and how we can make better decisions about designing, deploying, and refining it requires capturing that nuance and complexity through in-depth analysis of the impacts of different technological advancements and the ways they have played out in all their complicated and controversial messiness across the world.

These impacts are often unpredictable as technologies are adopted in new contexts and come to be used in ways that sometimes diverge significantly from the use cases envisioned by their designers. The internet, designed to help transmit information between computer networks, became a crucial vehicle for commerce, introducing unexpected avenues for crime and financial fraud. Social media platforms like Facebook and Twitter, designed to connect friends and families through sharing photographs and life updates, became focal points of election controversies and political influence. Cryptocurrencies, originally intended as a means of decentralized digital cash, have become a significant environmental hazard as more and more computing resources are devoted to mining these forms of virtual money. One of the crucial challenges in this area is therefore recognizing, documenting, and even anticipating some of these unexpected consequences and providing mechanisms to technologists for how to think through the impacts of their work, as well as possible other paths to different outcomes (Verbeek 2006). And just as technological innovations can cause unexpected harm, they can also bring about extraordinary benefits—new vaccines and medicines to address global pandemics and save thousands of lives, new sources of energy that can drastically reduce emissions and help combat climate change, new modes of education that can reach people who would otherwise have no access to schooling. Regulating technology therefore requires a careful balance of mitigating risks without overly restricting potentially beneficial innovations.

Nations around the world have taken very different approaches to governing emerging technologies and have adopted a range of different technologies themselves in pursuit of more modern governance structures and processes (Braman 2009). In Europe, the precautionary principle has guided much more anticipatory regulation aimed at addressing the risks presented by technologies even before they are fully realized. For instance, the European Union’s General Data Protection Regulation focuses on the responsibilities of data controllers and processors to provide individuals with access to their data and information about how that data is being used not just as a means of addressing existing security and privacy threats, such as data breaches, but also to protect against future developments and uses of that data for artificial intelligence and automated decision-making purposes. In Germany, Technische Überwachungsvereine, or TÜVs, perform regular tests and inspections of technological systems to assess and minimize risks over time, as the tech landscape evolves. In the United States, by contrast, there is much greater reliance on litigation and liability regimes to address safety and security failings after the fact. These different approaches reflect not just the different legal and regulatory mechanisms and philosophies of different nations but also the different ways those nations prioritize rapid development of the technology industry versus safety, security, and individual control. Typically, governance innovations move much more slowly than technological innovations, and regulations can lag years, or even decades, behind the technologies they aim to govern.

In addition to this varied set of national regulatory approaches, a variety of international and nongovernmental organizations also contribute to the process of developing standards, rules, and norms for new technologies, including the International Organization for Standardization and the International Telecommunication Union. These multilateral and NGO actors play an especially important role in trying to define appropriate boundaries for the use of new technologies by governments as instruments of control for the state.

At the same time that policymakers are under scrutiny both for their decisions about how to regulate technology and for their decisions about how and when to adopt technologies like facial recognition themselves, technology firms and designers have also come under increasing criticism. Growing recognition that the design of technologies can have far-reaching social and political implications means that there is more pressure on technologists to take into consideration the consequences of their decisions early on in the design process (Vincenti 1993; Winner 1980). The question of how technologists should incorporate these social dimensions into their design and development processes is an old one, and debate on these issues dates back to the 1970s, but it remains an urgent and often overlooked part of the puzzle because so many of the supposedly systematic mechanisms for assessing the impacts of new technologies in both the private and public sectors are primarily bureaucratic, symbolic processes rather than carrying any real weight or influence.

Technologists are often ill-equipped or unwilling to respond to the sorts of social problems that their creations have—often unwittingly—exacerbated, and instead point to governments and lawmakers to address those problems (Zuckerberg 2019). But governments often have few incentives to engage in this area. This is because setting clear standards and rules for an ever-evolving technological landscape can be extremely challenging, because enforcement of those rules can be a significant undertaking requiring considerable expertise, and because the tech sector is a major source of jobs and revenue for many countries that may fear losing those benefits if they constrain companies too much. This indicates not just a need for clearer incentives and better policies for both private- and public-sector entities but also a need for new mechanisms whereby the technology development and design process can be influenced and assessed by people with a wider range of experiences and expertise. If we want technologies to be designed with an eye to their impacts, who is responsible for predicting, measuring, and mitigating those impacts throughout the design process? Involving policymakers in that process in a more meaningful way will also require training them to have the analytic and technical capacity to more fully engage with technologists and understand more fully the implications of their decisions.

At the same time that tech companies seem unwilling or unable to rein in their creations, many observers fear those companies wield too much power, in some cases all but replacing governments and international organizations in their ability to make decisions that affect millions of people worldwide and to control access to information, platforms, and audiences (Kilovaty 2020). Regulators around the world have begun considering whether some of these companies have become so powerful that they violate the tenets of antitrust law, but it can be difficult for governments to identify exactly what those violations are, especially in an industry where the largest players often provide their customers with free services. And the platforms and services developed by tech companies are often wielded most powerfully and dangerously not by their private-sector creators and operators but by states themselves, for widespread misinformation campaigns that serve political purposes (Nye 2018).

Since the largest private entities in the tech sector operate in many countries, they are often better positioned than individual states or regulatory bodies to implement global changes to the technological ecosystem, which creates new challenges for existing governance structures and hierarchies. Just as it can be challenging to provide oversight of government use of technologies, so, too, can oversight of the biggest tech companies, which have more resources, reach, and power than many nations, prove a daunting task. The rise of network forms of organization and the growing gig economy have added to these challenges, making it even harder for regulators to fully address the breadth of these companies' operations (Powell 1990). The public-private partnerships that have emerged around energy, transportation, medical, and cyber technologies further complicate this picture, blurring the line between the public and private sectors and raising critical questions about the role of each in providing critical infrastructure, health care, and security. How can and should private tech companies operating in these different sectors be governed, and what types of influence do they exert over regulators? How feasible are different policy proposals aimed at technological innovation, and what unintended consequences might they have?

Conflict between countries has also spilled over significantly into the private sector in recent years, most notably in the tensions between the United States and China over which technologies developed in each country will be permitted by the other and which will be purchased by customers outside those two countries. Competition among countries to develop the best technology is not new, but the current conflicts have major international ramifications and will influence the infrastructure installed and used around the world for years to come. Untangling the factors that feed into these tussles, as well as whom they benefit and whom they disadvantage, is crucial for understanding how governments can most effectively foster technological innovation and invention domestically, as well as the global consequences of those efforts. As much of the world is forced to choose between buying technology from the United States or from China, how should we understand the long-term impacts of those choices and the options available to people in countries without robust domestic tech industries? Does the global spread of technologies help fuel further innovation in countries with smaller tech markets, or does it reinforce the dominance of the states already most prominent in this sector? How can research universities maintain global collaborations and research communities in light of these national competitions, and what role does government research and development spending play in fostering innovation both within a country's borders and worldwide? How should intellectual property protections evolve to meet the demands of the technology industry, and how can those protections be enforced globally?

These conflicts between countries sometimes appear to challenge the feasibility of truly global technologies and networks that operate across all countries through standardized protocols and design features. Organizations like the International Organization for Standardization, the World Intellectual Property Organization, the United Nations Industrial Development Organization, and many others have tried for years to harmonize these policies and protocols across countries, but they have met with limited success on the issues of greatest tension and disagreement among nations. For technology to operate in a global environment, countries need a much greater degree of coordination and common standards and norms, yet governments continue to struggle to agree not just on those norms themselves but even on the appropriate venues and processes for developing them. Without greater global cooperation, is it possible to maintain a global network like the internet, or to promote the spread of new technologies around the world to address challenges of sustainability? What might help incentivize that cooperation moving forward, and what could new structures and processes for governing global technologies look like? Why has the tech industry's culture of self-regulation persisted? Do the traditional drivers of public policy, such as the politics of harmonization and path dependency in policy-making, still sufficiently explain policy outcomes in this space? As new technologies and their applications spread across the globe in uneven ways, how and when do they create forces of change from unexpected places?

These are some of the questions we hope to address in the Technology and Global Change section through articles that tackle new dimensions of the global landscape of designing, developing, deploying, and assessing new technologies to address major challenges the world faces. Understanding these processes requires synthesizing knowledge from a range of fields, including sociology, political science, economics, and history, as well as technical fields such as engineering, climate science, and computer science. A crucial part of understanding how technology has created global change, and how global changes have in turn influenced the development of new technologies, is understanding the technologies themselves in all their richness and complexity: how they work, the limits of what they can do, what they were designed to do, and how they are actually used. Just as technologies themselves are becoming more complicated, so are their relationships to the larger social, political, and legal contexts in which they exist. Scholars across all disciplines are encouraged to join us in untangling those complexities.

Josephine Wolff is an associate professor of cybersecurity policy at the Fletcher School of Law and Diplomacy at Tufts University. Her book You’ll See This Message When It Is Too Late: The Legal and Economic Aftermath of Cybersecurity Breaches was published by MIT Press in 2018.
