These 5 Technologies Dramatically Changed the 20th Century

Dariusz Stusowski - March 14, 2017

The 20th Century was a time of great advancement across a variety of technological fields. Just about every aspect of life was transformed by stunning new inventions and breakthroughs. From the way people travel, to life-saving medical advances, to devastating weapons of war, a person living at the beginning of the 20th Century would hardly recognize what life looked like by the time it ended.

Nuclear Technology

Nuclear technology profoundly shaped 20th Century history in both negative and positive ways. Whether good or bad, many aspects of life changed with its introduction. It deeply influenced warfare, energy production, medicine, and even household technology.

Though nuclear technology is most intimately linked with warfare, its first uses were medical. As early as the First World War, Marie Curie, who had famously isolated radium in 1902, began sterilizing the wounds of soldiers with radon gas. She was also responsible for creating the mobile X-ray machines used in the French war effort. Only later did it become known that radiation had harmful and even deadly side effects. While early medical experiments with radiation were limited, by the 1940s the world had been thrown into another world war, and many governments turned their focus to the destructive potential of nuclear technology.

Knowing that scientists in Nazi Germany had already split the atom in 1939, the United States resolved to develop the technology first, and with it a way to end WWII as quickly as possible. With this in mind, the U.S. began the Manhattan Project in 1941. By the next year, the world's first self-sustaining nuclear chain reaction had been achieved in Chicago. By 1945, one nuclear bomb had been successfully tested and two more were ready to be used on Japan, Germany having already surrendered earlier that year.

With the Allied victory in WWII, new efforts were made to put nuclear technology to peaceful use. Perhaps the most useful came in the form of nuclear power plants producing inexpensive and reliable electricity. Despite periodic accidents such as Three Mile Island (1979), Chernobyl (1986), and Fukushima (2011), over 10% of the world's electricity is still generated by nuclear power plants, and some countries, such as France, rely on nuclear energy for over 75% of their electrical power.

Energy production is not the only area where nuclear technology is used for peaceful purposes, however. The medical field renewed its interest during the latter half of the 20th Century, developing a variety of new scanning and diagnostic technologies, as well as cancer treatments. There are even small amounts of radioactive material in many smoke detectors, which save thousands of lives each year.

Transistors

Without transistors, modern electronic devices would not exist. From cell phones and computers to televisions and cars, transistors are a crucial part of modern life. Without them, the most advanced piece of technology would resemble a lightbulb more than an IBM supercomputer. But what is a transistor? Basically, it is a miniaturized switch, capable of turning an electrical current on or off. Various primitive switching devices were used earlier in the 20th century, but the modern transistor is usually credited to a group of American physicists working at Bell Laboratories in 1947.
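
To make the switch idea concrete, here is a minimal Python sketch (an illustration of mine, not from the article) that treats a transistor as a simple on/off switch and composes those switches into logic gates, the basic building blocks of digital computation. The function names (transistor, nand, and so on) are invented for the example.

```python
# A toy model of a transistor as an on/off switch (illustrative only).
# Real transistors are analog semiconductor devices; this sketch just
# shows how switch-like behavior composes into digital logic.

def transistor(gate: bool) -> bool:
    """Conducts (returns True) only when the gate input is 'high'."""
    return gate

def nand(a: bool, b: bool) -> bool:
    """Two switches in series pull the output low only when both are on."""
    return not (transistor(a) and transistor(b))

# NAND is 'universal': every other logic gate can be built from it.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(f"a={a!s:5} b={b!s:5}  NAND={nand(a, b)!s:5}  AND={and_(a, b)}")
```

Chain enough of these gates together and you can add numbers, store memory, and ultimately run programs, which is why a device this simple could remake the century.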

By 1954, transistor technology had advanced enough to be sold in a variety of commercial products, the most significant of which was the portable radio. Small and durable enough to go just about anywhere a person wanted, the transistor radio sparked a technological revolution that spread to all areas of life in the coming decades.

By the 1960s, transistor technology made another giant leap forward with the development of the "integrated circuit," known commonly as the microchip, which packed many transistors onto a single chip. This made electronics lighter, more powerful, and durable enough that NASA used the technology in the Apollo guidance computer that helped land men on the moon in 1969. By the early 1970s, a variety of companies were developing increasingly powerful microchips, allowing for a wide range of uses. Even so, the resulting machines tended to be large and very expensive; they were used mostly by businesses and were usually out of reach for individual buyers.

However, computing power doubled roughly every 18 months, making computers smaller and more powerful, until personal computers became possible by the late 1970s. Companies like Apple, Tandy, and Atari started producing computers for home use. By the 1980s, personal computers were becoming commonplace, along with the first cell phones and dedicated gaming consoles. Today, our modern world would cease to function in an instant without transistors.
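
To get a feel for what doubling every 18 months implies, here is a quick back-of-the-envelope calculation in Python; the 18-month figure comes from the paragraph above, while the start and end years are illustrative assumptions of mine.

```python
# Back-of-the-envelope: how much faster do computers get if speed
# doubles every 18 months? The years below are illustrative, not
# taken from the article.

start_year, end_year = 1965, 1980        # assumed 15-year window
months = (end_year - start_year) * 12
doublings = months / 18                  # one doubling per 18 months
speedup = 2 ** doublings

print(f"{doublings:.0f} doublings -> roughly {speedup:,.0f}x faster")
# 10 doublings -> roughly 1,024x faster
```

A thousandfold gain in fifteen years is what turned room-sized business machines into something that could sit on a desk at home.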

Digital Codes

When was the last time you found a phone number in a telephone book, or used a paper map to find your destination? Younger readers may never have used such things. A big reason for these dramatic changes in daily life is the development of digital coding, which has completely revolutionized the world in which we live.

Fundamentally, digital code relies on a system of 1s and 0s representing circuits that are either on or off. This simple system underlies all of the digital technology that exists today. The first modern attempts to create programming languages on top of this code occurred around 1949, and FORTRAN, the first widely used high-level programming language, followed in the 1950s. FORTRAN let programmers write instructions without entering actual 1s and 0s. Digital coding languages continued to evolve through the 1970s, when they began to have a noticeable effect on the lives of ordinary people with the advent of personal computing. For the first time, personal computers and the digital code they ran on were available to average people. Soon, progressively larger amounts of information were stored digitally, while an increasing number of tasks could be accomplished with digital technology.
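
As a concrete illustration of those 1s and 0s, the short Python sketch below (an example of mine, not anything from the original article) encodes a two-letter word into its underlying bits and then recovers it; it assumes plain ASCII text.

```python
# Everything a computer stores ultimately reduces to 1s and 0s.
# This sketch encodes a string to its underlying bits and back.

text = "Hi"
raw = text.encode("ascii")                     # bytes: the on/off circuits
bits = "".join(f"{byte:08b}" for byte in raw)  # each byte as eight 1s/0s
print(bits)                                    # 0100100001101001

# Reverse the process: group the bits into bytes and decode.
chunks = [bits[i:i + 8] for i in range(0, len(bits), 8)]
restored = bytes(int(chunk, 2) for chunk in chunks).decode("ascii")
print(restored)                                # Hi
```

Every photo, song, and web page is stored the same way, just with vastly more bits.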

But the digital revolution was only beginning in the late 1970s, and it would not explode until the 1990s. Though university and military researchers had experimented with sending digital information over telephone lines since the 1970s, it was not until the 1990s that these methods became easy enough for non-technical people to use and home consumers could send digital information over their own phone lines. It was during this time that the Internet as most people know it was born.

The digital world began to explode. At the beginning of 1994 there were 700 websites; by the end of the year there were 10,000. The synergistic melding of computer microchip technology, digital coding languages, and mobile technologies gave birth to a world that cannot function without digital codes. From social media to online purchasing, 1s and 0s radically changed the world.

Jet Engines

From warfare and space exploration to commercial travel, jet technology opened up a myriad of possibilities for humanity. While many ancient examples of basic rocket technology exist, practical and world-changing jet and rocket engines were not developed until the 20th Century. Work on the first practical jet engine suitable for aviation began in Britain in the late 1920s and continued a few years later in Germany. In 1939, just a few days before the beginning of WWII, a German-built jet airplane became the first to take flight, though mass production was still years away. That milestone came with the Messerschmitt Me 262, which entered production in mid-1944. Though this plane was vastly superior to anything other countries were producing, it was manufactured in very small numbers, was plagued by engine reliability problems and fuel shortages, and was introduced too late to make a difference for the German military.

Similarly, the rocket, which is also a type of jet engine, was not developed in its modern, useful form until the 20th century. While rocket research took place in many countries, it was not until the American Robert Goddard launched the first liquid-fueled rocket in 1926 that modern rocketry became a practical reality. Just as with jet airplane engines, German experimenters made amazing leaps forward in the following years, spurred on first by amateur enthusiasts and then by wartime research spending.

Eventually, a gifted engineer by the name of Wernher von Braun developed the V-2, the world's first guided long-range rocket. While far superior to any other rocket of its time, like the Me 262 it was developed too late to make any significant difference for the Nazi war effort. More significant was von Braun's surrender to the United States at the end of WWII. Von Braun and other German scientists were sent to the United States to begin work on an American space program in response to a quickly escalating Cold War and "space race" with the Soviet Union, which culminated in the moon landing of 1969.

Though the bloodiest war in world history played a major role in the development of jet technology, today just about everyone can afford to travel great distances quickly and safely, revolutionizing society in many ways. Likewise, rocket technology allowed people to enter outer space, and may one day make human colonization of space possible.

Antibiotics

Imagine dying from a small cut sustained while working in your garden, or not being able to get a routine type of surgery because the risk of infection was so high. That was the world everyone lived in before modern antibiotics.

Though some ancient peoples knew of molds and other substances with antibacterial properties, an effective medicine was not developed until the 20th Century. Controlled attempts to find a cure for bacterial infections began as far back as the 1600s, but even mildly successful medicines were toxic and caused serious side effects. A far more effective antibiotic, known today as penicillin, was discovered in 1928 by Sir Alexander Fleming, a moment considered by many to be the beginning of the modern era of antibiotics.

As with many of our most important discoveries, the long road to developing penicillin began with a simple mistake. After returning from a summer vacation, Dr. Fleming found odd patterns on some petri dishes he had absent-mindedly set near a window before he left. Upon closer examination, he realized that the growth of potentially deadly bacteria had stopped in the areas where a type of mold had accidentally been introduced. Excited by this discovery, Dr. Fleming and others experimented with the mold over the following 12 years.

In the early 1940s, a pair of researchers heard of a man who had nicked his face while working in his rose garden and, from such a simple injury, had developed a serious infection. After receiving permission to administer their medicine, the researchers watched the man begin to recover, only for him to relapse when their supply of penicillin, which at the time was very hard to produce in useful quantities, ran out. However, new production methods soon yielded enough penicillin to cure a variety of infections, and the drug became critical to the Allied war effort during WWII, saving the lives of thousands of American soldiers.

Today we have dozens of powerful and effective antibiotics, which not only save millions of people from serious infections every year but also make possible modern medical procedures such as skin grafts, organ transplants, and many others. Because of the advent of antibiotics, we all lead much longer and more secure lives.
