8 technological advances that changed our world
Curious about how new technology has changed our world over time? Here, we explore landmark discoveries, many of them from biotech, that have shaped our world.
From our earliest days as hunter-gatherers, technological advances have gone hand in hand with changes in human society. From early inventions to recent breakthroughs, we delve into just a few of the many important advances in science and technology that have changed life as we know it. If you’d like to learn about some of the most significant technologies societies have developed over time, keep reading.
1. Electricity and the battery
Technology advanced rapidly following the discovery of electricity. The earliest recorded observation of electricity dates to around 600 BCE, when the Greek mathematician and philosopher Thales of Miletus found that rubbing amber with fur created static electricity, allowing the amber to pick up other light objects. The term ‘electricity’ itself comes from the Greek word ‘elektron,’ meaning amber.
Electricity is a natural phenomenon that works the same way it always has, but it was not harnessed for energy until the 18th century. Benjamin Franklin named the two types of electricity ‘positive’ and ‘negative,’ as we still do today, and notably undertook his famous (and dangerous) attempt to catch electricity with a key tied to a kite during a thunderstorm. The first true battery was invented in 1800 by Alessandro Volta, who referred to it as “the artificial electric organ”. Since then, a series of discoveries and breakthroughs led to the widespread use of electricity and batteries.
The discovery of batteries and electricity forged a path for innumerable other technical innovations, like electric motors, power generators, solar cells, lightbulbs, refrigerators, televisions, phones, computers, renewable energy...the list goes on.
2. Pasteurization
In pasteurization, food is heated to a fairly high temperature for a short time, then rapidly cooled, in order to kill pathogens that would otherwise pose health risks to humans without significantly affecting the food’s flavor or nutritional content. Pasteurization is especially associated with the dairy industry, and it can also be applied to extend the shelf life of other food and drink products.
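The trade-off pasteurization exploits can be sketched numerically. In food microbiology, a “D-value” is the time at a given temperature needed to kill 90% (one log10) of a target organism. The function and the numbers below are illustrative assumptions for the sake of the sketch, not parameters of any real process:

```python
def surviving_fraction(hold_time_s: float, d_value_s: float) -> float:
    """Fraction of a pathogen population surviving a heat treatment.

    Each D-value's worth of hold time cuts the population tenfold,
    so survival falls off exponentially with time at temperature.
    """
    return 10 ** (-hold_time_s / d_value_s)

# Hypothetical example: a 15-second hold against an organism with an
# assumed 3-second D-value gives a 5-log (99.999%) reduction.
print(surviving_fraction(15, 3))  # ~1e-05
```

This exponential die-off is why a brief hold at high temperature can make milk dramatically safer without the prolonged cooking that would ruin its flavor.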
In 1864, Louis Pasteur discovered a heating and cooling method that prevented wine from spoiling during a series of experiments at his university. Hence the technique became known as pasteurization, though some evidence indicates that the same principles may have existed in China and Japan centuries earlier. In the decades that followed Pasteur’s revelations about harmful microbes leading to spoilage, pasteurization grew more popular as public health officials began to realize the enormous benefits that came with the technique.
Some estimates show that unpasteurized dairy products have an 840-fold higher risk of carrying foodborne diseases compared to pasteurized dairy products. Alongside other innovations like refrigeration, pasteurization revolutionized food safety, food accessibility, and has saved countless lives.
3. Vaccines
As early as perhaps 200 BCE, humans were trying to prevent infection by deliberately exposing healthy people to small amounts of disease material. In the 18th century, the British adopted techniques from enslaved West African people and began experimenting with rudimentary smallpox vaccines. A range of talented scientists, both men and women, were instrumental in developing vaccines for other diseases thereafter.
Smallpox, declared eradicated in 1980, remains the only human disease to have been officially eradicated worldwide. Other diseases are nearly eradicated, or under control, thanks to widespread vaccination. Vaccines have made previously fatal diseases preventable, making them one of the biggest achievements science has ever produced. Now, new vaccine delivery systems and a deepening of our molecular biology knowledge hold promise for advancing disease prevention even further.
4. Antibiotics
In 1928, a professor of bacteriology called Alexander Fleming left his agar plates of bacteria uncovered in the lab when he went on holiday. Contrary to popular myth, it appears this was no accident – Fleming was investigating how his colonies changed when exposed to the air for longer periods than normal, following up on previous authors’ findings concerning irregular bacterial growth. When Fleming returned, he noticed something incredible: the bacteria had been killed off only in patches where a Penicillium mold was also growing.
By this point it was known that fungi could have hostile effects on other species (we now know that many microorganisms, particularly those in the soil, produce chemical ‘weapons’ to gain an upper hand against competitor microbes), but this discovery was an unprecedented breakthrough in the treatment of bacterial infections. Molds themselves had long been used in folk medicine: records show that ancient Egyptians applied moldy bread to infected wounds, and Aboriginal Australian peoples used molds grown on the shaded side of eucalyptus trees. Quite astoundingly, multiple cultures across the globe thus appeared to understand, or at least observe, the antibiotic principles behind penicillin.
After human clinical trials, studies on penicillin by scientists like Dorothy Hodgkin, and the scaling up of penicillin production, the introduction of penicillin into the medical world heralded the beginning of the “golden era” of antibiotics, in which many more antibiotics were developed for different conditions. By the end of the Second World War, penicillin was widely used and dubbed “the wonder drug” for the many lives it saved. The development of antibiotics led to a significant decrease in deaths caused by bacterial infection – a simple cut or wound was no longer a possible death sentence, and modern surgical procedures became possible.
Antibiotics such as penicillin have changed the world and extended human lifespan by a whopping average of 23 years. However, as predicted by Fleming, antibiotic resistance is the next big challenge. Nonetheless, research into further antibiotic discovery holds promise for the years to come.
5. The contraceptive pill
1960 was an important year: it was the year that the first birth control pill was introduced. This convenient, easy-to-use pill meant that women had greater control over their fertility and their lives than ever before.
Birth control activist Margaret Sanger was instrumental in urging scientists to develop the pill, eventually achieved by Gregory Pincus, Min-Chueh Chang and John Rock, and sponsored by an heiress called Katharine McCormick. The synthetic progesterone and estrogen hormones contained in this first pill, Enovid, were able to suppress ovulation. Over time, the formula was refined, more options became available, and accessibility continues to improve for women of different backgrounds and regions. It became so important to the world that it needs no introduction: “the pill” has become a standalone term.
Though controversy existed (and still exists) surrounding the pill and how it was developed, there is no doubt that the pill changed women’s lives. Unwanted pregnancies could keep women and their families stuck in a cycle of poverty. Contraception directly led to benefits for women like better career and education opportunities, higher incomes, and greater empowerment in life and relationships, as well as lower mortality and better health outcomes for children. Women could enter previously male-dominated fields, as shown by rising enrollments in degrees like law, medicine, and business. By separating sex from procreation, young women could finally pursue their aspirations and shape their lives in a way that hadn’t been possible before.
6. The polymerase chain reaction (PCR)
The polymerase chain reaction (PCR) was a major breakthrough in the scientific world. PCR essentially makes as many copies of a tiny section of a particular DNA sequence as a researcher needs, until there are enough copies to detect and analyze. The technique uses repeated heating and cooling cycles along with specific reagents that amplify the DNA.
Because organisms have new cells growing and old cells dying at all times, DNA must constantly be copied so that every new cell contains the same genetic blueprint. However, DNA was once an absolute mystery. When Watson and Crick published their Nobel Prize-winning work on the DNA double helix in 1953 (nowadays Rosalind Franklin is widely recognized for her contribution, as her data and notes made the discovery possible), they suggested a potential copying mechanism for DNA. The ability to copy DNA at will was highly desirable: at the time, even if you could find the specific sequence of DNA you were looking for (say, to diagnose a virus, or to learn more about a genetic trait), the amount of material was so tiny that experiments could not easily be done with it.
In the following decades, other researchers confirmed how DNA copies itself and which enzymes help it do so. Kary Mullis, a biochemist at the Cetus Corporation, realized that by modifying a DNA sequencing method (devised by Frederick Sanger in 1977), he could trigger a chain reaction using DNA polymerase – the vital enzyme that builds new copies of DNA – allowing a chosen part of the DNA to be copied over and over. In other words, in 1983, he had invented PCR.
Back in 1969, Thomas Brock had isolated a bacterium called Thermus aquaticus from the hot springs of Yellowstone National Park and found it had a DNA polymerase that could withstand much higher temperatures than other polymerases known at the time. This so-called Taq polymerase was perfect for PCR because it could survive the high temperatures the technique requires. PCR was further tweaked and refined, and within a short time it was already widely used to screen for and quantify HIV in blood samples, to detect sickle cell anemia, and in forensic science, where it could help solve crimes using DNA from as little as a single drop of dried blood or a single fallen hair.
Now, a PCR machine is a cornerstone of many labs around the world, capable of copying DNA billions of times within a couple of hours. Next-generation variants, such as digital PCR, promise even greater sensitivity. As an incredibly versatile technique, the ability to replicate and amplify DNA with PCR is now integral to fields like environmental studies, drug discovery, food technology and forensic science. PCR has enabled the Human Genome Project, revealed the story of our evolutionary past, underpinned assisted reproduction technology, diagnosed disease, and much more – all made possible by the knowledge and collaboration of many great scientists.
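The “billions of copies within a couple of hours” figure follows directly from the doubling arithmetic: each thermal cycle can at most double the number of target molecules. A minimal sketch of that arithmetic (the function name and `efficiency` parameter are illustrative, not taken from any PCR software):

```python
def pcr_copies(initial_copies: int, cycles: int, efficiency: float = 1.0) -> int:
    """Estimate target-DNA copy number after a run of PCR cycles.

    efficiency is the fraction of molecules successfully duplicated
    per cycle: 1.0 models ideal doubling; real reactions fall short.
    """
    return round(initial_copies * (1 + efficiency) ** cycles)

# One template molecule after 30 ideal cycles:
print(pcr_copies(1, 30))  # 1073741824 -- over a billion copies
```

At a minute or two per cycle, 30 cycles fits comfortably within the couple of hours a modern thermal cycler needs.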
7. Artificial intelligence
While the innocuous letters A and I may cause you to picture robots coming to life and taking over the world, the truth is that AI is simply a methodology: in essence, a family of learning algorithms. AI streamlines processes and enhances decision-making to make working life easier across a variety of applications and disciplines. Today, it’s leading the way for the 21st century’s so-called Fourth Industrial Revolution. Yet AI has been trending for quite some time in the biotechnology space.
John McCarthy is credited with coining the term ‘artificial intelligence’ back in 1956, but philosophers and mathematicians well before the 20th century discussed the general concept: the creation of machines that could learn, store, and apply information using a system mimicking the human brain. Some of McCarthy’s work was later funded by the US government, in the hope that AI could be useful in the Cold War (which – it turned out – it wasn’t). Early examples of AI technology included a mobile robot called Shakey and, later in 1997, IBM’s chess-playing supercomputer Deep Blue. It didn’t take long for science fiction movies to latch onto the public’s fear of humanoid robots, a fear which has remained ever since. However, that humanlike intelligence never materialized, and the AI industry rode a rollercoaster over the following decades as support came and went, and setbacks and successes arrived in equal measure. While different kinds of AI exist, as yet there are no self-aware machines capable of sentience.
In practice, AI has proven most useful for repetitive, tedious tasks, where it works to reduce human error. In recent times, AI has notably been used in smart homes, customer support services, fraud detection, big data storage, and for scientific discovery and development in biotech companies as well as in medicine (for example, surveying for new vaccines). Advances continue in 2022 and beyond – driverless cars, robotic surgery, and AI-generated writing among them – creating plenty of new job opportunities in data science and engineering.
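For a concrete sense of what “a family of learning algorithms” means, here is one of the oldest and simplest: the perceptron (Rosenblatt, 1958), which nudges a set of weights toward fewer mistakes on each pass over the data. This toy example, which learns the logical AND function, is a minimal illustration of the learning principle, not a modern AI system:

```python
def train_perceptron(data, epochs=20, lr=0.1):
    """Learn weights for a linear rule from (inputs, label) pairs.

    On each labeled example, the prediction error nudges the weights
    and bias in the direction that reduces future mistakes.
    """
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred  # -1, 0, or +1
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Learn logical AND from its four input/output examples.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in and_data]
print(preds)  # [0, 0, 0, 1]
```

Modern systems replace these two weights with millions or billions of parameters, but the loop – predict, measure the error, adjust – is the same basic idea.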
With humans training AI for beneficial purposes, it’s quickly becoming a useful tool with a world of possibility. Other advanced biotech innovations – nano-robots, stem cell technology, gene editing, and 3D printing among them – are also changing science forever.
8. Our mitochondria-targeting antioxidant
We don’t mean to brag, but we fully believe that MitoQ is a breakthrough in cell health technology. It’s changed the game for athletes: research has shown that MitoQ can help athletes complete time trials faster and supports athletic recovery. It supports people around the world with their cardiovascular health, and health experts recommend it for its scientifically supported ability to fight cell stress – something that affects everyone’s health at a foundational level. With 700+ papers published on its potential health benefits, we believe this molecule could transform the health industry and spark a cell health revolution!
If you don’t know the story of MitoQ’s origins, here it is: in the 1990s, two scientists worked together in a lab at New Zealand’s University of Otago. They knew that antioxidants naturally formed within the mitochondria are hugely important to human health – they help to keep free radicals at bay so that we can function at our best. They also knew that the antioxidants naturally produced within our cells typically decline as we age. The problem was that standard antioxidant supplements – like CoQ10 – weren’t getting into human cells in meaningful doses. So they developed a solution: a mitochondria-targeted antioxidant that could fight free radicals at their source, helping human cells do all of their amazing jobs at full capacity – jobs like helping our brains function, helping our bodies move, supporting our immune system and allowing us to see. This discovery was named MitoQ – and now, it’s available to everyone.