Good and Evil
Science has come to be synonymous with acquiring knowledge—something man has been doing from the beginning. In the Garden of Eden, Adam and Eve used observation, experimentation and human reason to determine that the forbidden fruit was “good” to eat. They “saw” that the forbidden fruit was good for food. To experiment, they ate the fruit. They reasoned that it would make them wise (see Genesis 3:1-6).
Today, researchers use that same scientific method. And we do seem to live in a “wise” world as a result—a world of awesome progress and advancing technology.
But it’s also a world where there is appalling evil. Let’s examine this scientific paradox.
Rejecting God
Modern science originated about 200 years ago, emerging from the 18th-century Age of Enlightenment, when people came to assume that human reason made anything possible. As one encyclopedia puts it, “A great premium was placed on the discovery of truth through the observation of nature, rather than through the study of authoritative sources, such as…the Bible.”
Charles Darwin’s theory of evolution widened the gap between science and religion. Though the scientific establishment bitterly opposed evolution at first, by the mid-20th century it had accepted the theory en masse.
German philosopher Friedrich Nietzsche’s faith in scientific progress led to his famous 1882 declaration: “God is dead.”
Sigmund Freud, founder of psychoanalysis, asserted in 1932 that “there is no other source of knowledge of the universe, but the intellectual manipulation of carefully verified observations, in fact, what is called research, and that no knowledge can be obtained from revelation, intuition or inspiration.”
Like Adam and Eve, modern science has rejected God to partake of the tree of the knowledge of good and evil. Science has relied solely on observation, experimentation and human reason. The “good” knowledge has led to some stunning discoveries.
Modern Innovations
Consider travel. Though separated by 1,700 years, the Apostle Paul and Benjamin Franklin both used the same basic methods of travel—foot, animal or boat. People traveled that way for millennia. For that reason, most people, unless rich or adventurous, never ventured more than 50 to 100 miles from their homes.
Then came modern travel—steamboats in 1807, trains in the mid-1800s, electric streetcars in the 1880s, bicycles in the 1880s, subways in the 1890s, automobiles and submarines at the turn of the century, airplanes in the early 1900s, helicopters and turbojet airplanes in the 1930s, spaceships in the 1960s.
Today, the Concorde can fly from New York to London in about three and a half hours—roughly the time it would take most readers to finish this magazine. When Ben Franklin embarked upon his first journey to London in 1724, the voyage took six weeks.
Economical, comfortable and timesaving methods of modern travel are all hallmarks of 20th-century achievement.
Consider agriculture. After Adam ate the forbidden fruit, God told him how difficult and laborious farming would be (Gen. 3:17-19). Throughout the 5,800-year Agricultural Age, that’s the way it was.
Not that farming is easy today, but as with so many other laborious tasks, machines have eliminated a tremendous amount of hard labor. These labor-saving devices enabled fewer people to produce larger crops. In 1900, for example, agriculture made up 20 percent of the American economy; today it’s only 1.7 percent.
In addition to labor-saving devices, artificial fertilizers and pesticides helped produce bountiful harvests. Chemical fertilizers were first added to soil in 1880, and DDT, the first modern synthetic insecticide, was introduced in 1939.
Farmers were not the only ones to benefit from the knowledge explosion. Consumers did too. When Louis Pasteur discovered how to kill bacteria in food during the 1860s, suddenly products like beer, wine and milk could be “pasteurized” to last longer. The first mechanical refrigerator (1865) helped preserve foodstuffs at cold temperatures. In 1917, Clarence Birdseye developed a method for quick-freezing food in small packages. In 1945, plastic packaging films were introduced to preserve “convenience” foods—even at warm temperatures. Add to these advances the artificial preservatives put into foods and you have a 20th-century phenomenon—shelf life!
These advancements have increased food supplies dramatically, helping production keep pace with the world’s population, which nearly quadrupled this century. They have made more food available at cheaper prices. In many ways, science has silenced the doomsayers who years ago predicted a global food shortage.
Consider work. The labor-saving devices that eliminated jobs in agriculture created them in industry. By the mid-19th century, the Western world had entered a new age—one of mass production of goods by mechanical means.
In 1873, the first use of electricity to drive machines added a whole new dimension to the Industrial Revolution. This gave impetus to the development of numerous household appliances—electric fans, razors, sewing machines and washers. These items, however, did not become affordable to the average consumer until the mid-20th century. By that time, America had perfected assembly-line production, driving retail prices down, and the Sears, Roebuck and Co. catalog spread the word that nearly everyone could now afford these mechanical conveniences.
Consider leisure. The technology that enabled man to mass-produce food and goods also made it possible to entertain mass audiences. Prior to 1900, print was the world’s only mass medium. During the 20th century, however, four new mass media sprang to life—movies, radio, television and the Internet.
The first motion picture was screened in Paris in 1895. In 1922, color was added to movies; in 1927, sound. Since then, more sophisticated and mobile cameras, better lighting and high-tech computer editing have dazzled viewers with special effects. Technology has also made it easier and cheaper to watch movies: VCRs, videotapes, pay-per-view, DVDs—all were introduced in the last quarter of the 20th century.
Radio’s reach was even more widespread than that of movies. Marconi had worked out the basics of radio technology by 1900. Westinghouse started broadcasting programs over the air in 1920. “Even more than movies,” Christopher Porterfield wrote in Time magazine, “radio gave audiences an intensely communal feeling, a sense of being part of something national, as well as a special intimacy with its stars” (June 8, 1998).
It was television, however, that combined the far-reaching impact of radio with the mesmerizing visual quality of movies. Scotsman John Logie Baird first demonstrated television in 1926. But it didn’t catch on in the U.S. and Britain until after World War II. In the U.S., only 172,000 homes owned a TV in 1949. By 1952, however, more than 15 million did. Two years later—32 million. For millions of first-time viewers, it was like discovering a whole new world, neatly packaged in a little black box. As the phenomenon spread like cancer, forecasters predicted the demise of radio and movies. Yet, far from wiping out the competition, television seemed to complement the other two media perfectly. Color was added to television in the 1960s, and in the 1980s a cable and satellite TV boom offered viewers an unheard-of number of channels to surf.
In the 1990s, surfing epitomized another 20th-century marvel—the Internet. Newt Gingrich compares it to a library that never closes and always has the book you need. Lisa Jardine, writing in The Spectator, said the Internet “will one day be bigger than all the other industries on earth.” The World Wide Web has only been around for ten years, and already there are an estimated 110 million users worldwide. In the United States, the number of online households is expected to jump by more than 40 percent this year.
Other 20th-century leisure activities, like sports, music and video games, have piggybacked on these mass media to push their products. In no other century has leisure so closely interacted with technology.
Consider computers. Just as the Industrial Age wiped out massive numbers of jobs in agriculture, the Information Age has done the same in industry. But it has created just as many jobs, if not more, in the fields of information, technology and computers. In many ways, the information revolution has buoyed America’s economic boom during the 1990s.
In 1995, Newt Gingrich wrote, “The power of computer chips will multiply another million-fold over the next ten years—as big an increase as the productivity improvement of the last forty years. This translates into a one-trillion-fold increase in productivity between 1950 and 2000” (To Renew America, p. 58). One industry analyst says that at the current rate of advancement, by 2019 a $1,000 computer will be able to process as many instructions per second as the human brain.
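The arithmetic behind that trillion-fold figure is simple compounding: a million-fold gain over the earlier stretch, multiplied by another million-fold gain over the following decade, works out to 1,000,000 × 1,000,000, or one trillion.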
Computer technology, unquestionably, is one of mankind’s most innovative 20th-century creations.
Consider medicine. Most consider the invention of the microscope by a Dutch lens maker in the late 17th century the spark that set off medicine’s explosion of knowledge. With that one invention, as if through a window, scientists could peer into a previously invisible realm—the world of microorganisms. Using that technology, Louis Pasteur was able to link germs to infectious disease in 1864. By the late 1800s, doctors had identified the bacteria and other microbes responsible for diseases like cholera, diphtheria, leprosy, malaria, tetanus and tuberculosis.
By the 20th century, scientists were experimenting with a host of medicines, and eventually antibiotics, to fight such diseases. Once Louis Pasteur had discovered how to kill bacteria in food, it was only a matter of time before scientists applied that knowledge to humans. In 1928, Britain’s Alexander Fleming accidentally discovered the world’s first antibiotic—penicillin. That discovery led to the development of hundreds of antibiotics.
With these advances, researchers found ways to fight disease and, in some cases, to completely eradicate certain strains. Throughout this century, there has been a decline in cases of polio, hepatitis and measles. In 1980, scientists declared smallpox completely eradicated. Meanwhile, life expectancy has climbed and death rates have dropped, especially in advanced nations, where people can afford vaccines and antibiotics.
There have been many other 20th-century breakthroughs in medicine. The structure of DNA was decoded in 1953. In 1954, doctors transplanted the first human organ, a kidney. CAT scans were developed in 1973, the first test-tube baby was born in 1978, a permanent artificial heart was implanted in 1982, and the entire DNA sequence of a living organism was deciphered in 1995. In 1997, scientists even cloned a sheep.
As we enter the 21st century, researchers tell us we are only years away from cloning human beings.
Consider communication. In 1860, when the Pony Express guaranteed it would deliver a package from Missouri to California in only ten days, people considered that state-of-the-art! Though now a legendary fixture of the American West, the Pony Express quickly fell victim to technological advance. In less than two years, the transcontinental telegraph made it all but unnecessary.
A decade and a half later, Alexander Graham Bell invented the telephone. By 1915, a coast-to-coast phone system stretched across the United States. By the mid-20th century, the telephone was an “essential” in every home and office. Today, a third of American adults carry mobile phones.
Satellites, fiber optics, mobile phones, electronic mail—all have made communication fast, easy and affordable. Just last year, Internet users sent more than 5 trillion e-mail messages.
You can see why Nietzsche and the 1960s radicals said God is dead. Darwin, Nietzsche, Freud and other modern intellectuals have branded their religion on the 20th century, and it is here to stay. It’s a religion that props up science as the new messiah—the one great hope for mankind.
Who needs God when you have science and technology? We have seen the awesome progress—the “good” man has produced by taking from the tree of the knowledge of good and evil.
So what about the evil?
This is the great paradox of the 20th century.
Advances in Sickness
Advances in medicine and food production, contrary to popular opinion, have not improved the overall health of earth’s 6 billion inhabitants. The statistical rise in life expectancy among Western nations is deceptive. While death rates have declined in the West during this century, worldwide the picture is much less encouraging. And even the West’s steady gains in life expectancy from 1910 to 1980 have eroded in places over the last 20 years, thanks in large part to the AIDS virus. There are now more than 20 million cases of AIDS worldwide—2.3 million people died from it just last year.
AIDS is not the only culprit. Researchers have identified at least 30 new untreatable diseases since 1980. And other diseases, like pneumonia, influenza and tuberculosis, are candidates for “comeback disease of the century.” Once in decline, new strains of these old killers are back like bad habits. Tuberculosis kills more than 3 million people a year. In 1995, more than 12 million children under the age of 5 died of pneumonia. Between 300 million and 500 million cases of malaria are reported each year, causing at least 1 million deaths annually. Admittedly, most of those deaths occur in countries where antibiotics are either unavailable or too expensive.
But antibiotics have not stamped out disease in rich countries either. In many cases, overuse of antibiotics now poses a serious threat to Westerners. When antibiotics are overused, only drug-resistant strains survive, allowing them to flourish unabated. “Somehow we lost that message,” says Merle Sande of the University of Utah School of Medicine, “and as a result we are in the process—and it’s starting to accelerate—of losing these incredibly valuable drugs. Somehow we blew it. This is just the beginning of the end. We’re going to see infections that used to be curable in past centuries raise their ugly heads once again.” An estimated 70,000 people die every year from hospital infections caused by drug-resistant “superbugs.” Talk about paradox!
From 1980 to 1992, deaths from communicable diseases increased by 58 percent. That said, infectious disease is responsible for only 4.2 percent of today’s debilitating health problems in Western society. The majority of diseases today, quite unlike in past centuries, are degenerative and man-made—like cancer, heart disease and malnutrition. These account for 81 percent of today’s debilitating health problems.
UNICEF reports that 12 million children under the age of 5 die every year from malnutrition. As already noted, there is more than enough food in the world to feed everyone. And we certainly have the advanced technology to preserve and transport it. Yet, every year, 12 million children die because of an unbalanced diet.
A different kind of malnutrition is responsible for many new diseases in developed countries. It’s caused by over-indulgence, eating the wrong foods and ingesting excessive amounts of man-made chemicals that have no nutritive value.
Approximately 1 billion people on earth smoke cigarettes. Of that number, more than 3 million die each year from smoking-related diseases, like lung cancer. Two thirds of those deaths are in developed countries where every available medicine or surgery is just down the street.
In America, the second leading cause of preventable death, behind smoking-related diseases, is obesity. Many doctors now call it an American epidemic. Sixty percent of American adults weigh more than they should—17 percent are considered obese. Heart disease, almost unheard of in nations without an obesity problem, kills more than 1 million Americans per year.
Then there are mental and emotional disorders, which, paradoxically, have skyrocketed ever since Freud invented psychoanalysis. According to the National Institute of Mental Health, more than 40 million Americans suffer from mental and emotional conditions that adversely affect their quality of life. Another 50 million suffer intermittent symptoms.
Modern medicine, despite the bold claims made earlier in the century, did not eradicate sickness and disease. Underdeveloped countries enter the 21st century facing a frightening array of killer diseases of epidemic proportions—AIDS, malaria, hepatitis B, malnutrition, starvation.
In developed countries, perhaps the most damning statistic against the “success” of medicine is the amount of money spent on health care in the United States—more than $1 trillion per year. Cancer, liver disease, cardiovascular disease, diabetes, low blood sugar, obesity, chemical dependency, depression, attention deficit disorder—these are the unhealthy fruits of prosperity and high technology. And almost all of them are unique to modern life in the 20th century. Who could have imagined 200 years ago that one in seven Americans would suffer from some sort of mental or emotional “disease”?
Advances in War
This century’s most visible paradox might be the scale of devastation man has brought upon himself through genocide and war.
Technology’s advance has had a strong hand in man’s brutality, and not just because of modern weapons. To supply the 19th-century explosion in industry, developed nations needed raw materials. For many of them, especially in Europe, that meant going outside their borders. (In 1870, only 10 percent of Africa had been colonized by Western nations. By 1900, that figure had jumped to 80 percent.) Leading up to the 20th century, Western nations were scrambling to exploit resources in underdeveloped regions, not just in Africa but also in Southeast Asia and the South Pacific.
Accompanying that mad scramble for resources were territorial disputes—which, inevitably, led to war. And when war came, as man soon found out, the same knowledge explosion that spawned the Industrial Revolution had also changed the nature of warfare. At first, many politicians thought advanced new weapons would shorten conflicts and lessen destruction. They were wrong.
At the dawn of the 20th century, Western nations started arming themselves to the teeth and forming strategic alliances. Those alliances, established to prevent war, actually precipitated it—and on a much larger scale than anyone could have imagined. When a Serb nationalist assassinated the heir to the Austro-Hungarian throne in June 1914, it sparked the first-ever world war.
Airplanes, submarines, tanks and machine guns were all deployed on a mass scale for the first time in the Great War, though all were in the early stages of their development. Most of the 10 million deaths were caused by “old-fashioned” artillery fire in trench warfare. Still, technology, even in the second decade of the century, had left its mark. As Roy Willis wrote in Western Civilization, “Governments had failed to grasp the nature of modern warfare.”
When the next world war hit two decades later, those weapons were far more developed. With the addition of newly developed rockets and missiles, World War II was man’s deadliest war ever—50 to 60 million were killed. It ended with the use of atomic bombs. The one America dropped on Hiroshima killed 70,000 Japanese outright—the one dropped on Nagasaki, another 36,000. The world had entered the atomic age, creating the threat of infinitely greater catastrophes in the future. In modern warfare, military personnel were no longer the only people at risk. Now everyone was. Nearly two-thirds of those killed in World War II were civilians.
It is worth noting that during the first half of this century, physics was the dominant science and travel the dominant technology. Hence atomic warfare delivered by the machinery of modern travel. During the last half of this century, biology was the dominant science and computers the dominant technology. Not surprisingly, biological weapons and computer warfare have now been added to this century’s earlier innovations.
“The 20th century saw the creation of great weapons based on the principles of nuclear physics,” Richard Preston wrote in Time. “The 21st century will see great weapons based on the knowledge of DNA and the genetic code” (Nov. 8, 1999).
As the Trumpet has reported before, approximately 150 million souls have been killed in this century’s wars, massacres, slaughters and oppressions. That’s more than half the population of the United States. Far from making wars less destructive, 20th-century technology has steadily heightened the devastation.
What a price we have paid for “progress.”
Advances in Immorality
Yet there is another threat to humanity’s existence—one that is rapidly becoming much greater than the hydrogen bomb. In the 1960s, it became known as a “moral revolution.” Once intellectuals sided with the revolutionaries, they sugarcoated the devastating effects of this revolution by labeling it the “new morality.”
Two generations later, however, the fruits are much harder to disguise. America spearheaded the revolt and it spread around the world with the help of technology, as historian Paul Johnson notes in his book A History of the American People: “If the United States was still a conformist and traditional society in the 1950s, the portents of change were present too. In 1948…Dr. Alfred Kinsey brought out an 804-page volume, based on 18,000 interviews over many years, called Sexual Behavior in the Human Male. He followed it in 1954 with Sexual Behavior in the Human Female…. They revealed that 68-90 percent of American males, and almost 50 percent of females, engaged in premarital sexual intercourse, that 92 percent of males and 62 percent of females had masturbated, and that 37 percent of males and 13 percent of females had experienced homosexual intercourse. The findings also suggested that 50 percent of men and 26 percent of women had committed adultery before the age of 40. Kinsey’s findings caused surprise and in some cases rage, but they confirmed much other evidence that, even in the 1950s, the Norman Rockwell images no longer told the full story. Hollywood was still trying to hold the lines laid down by the old Hays Code, but it was cracking….
“As the TV habit spread and took deep root, and as the medium made itself indispensable to all the purveyors of mass-consumption goods and services, those who ran the networks and the stations began to flex their cultural muscles and contemplate a society in which all standard measurements of behavior would be up for redefinition, and moral relativism, based on ratings, would rule. Thus the way to the 1960s was prepared, and what began as a sexual revolution was to bring about revolutions in many other areas too.”
Half a century later, surveying the social landscape is like sifting through a bomb-ravaged bunker in Kosovo. All the traditional building blocks of society toppled during the moral war—respect for government and law, love of one’s country, traditional roles for men and women, and above all, the family. In their places, modernists have erected radical individualism, popular culture and sexual freedom.
The new morality’s chief adherents have exploited every available technology to drug the zombie-like masses. In America, the world’s leader in entertainment, it is especially bad. Music is a $40 billion industry. Cable television rakes in $30 billion. Hollywood garners $15 billion in ticket sales worldwide—another $20 billion selling and renting videos.
And when Allan Alcorn created the two-dimensional video game Pong in 1972, who would have thought that the video game market would, by century’s end, amass more revenue than Hollywood’s domestic box-office receipts? Half the households in Japan own a games console, one in three in America, and one in five in Britain. And while the 1960s space program may have been most responsible for getting the computer industry off the ground, video games have jet-propelled the industry’s push for faster chips and CD-ROM drives. Even the Internet owes much of its phenomenal success to video games. Only in the last few years have some commentators harshly criticized the video-game industry for its violent influence on children.
As for the Internet, it also owes much of its “success” to the popularity of pornography. Virtually unregulated, the Internet gains 20,000 new pornographic web pages every day. Nearly 70 percent of the $1.5 billion worth of online content—services that can be downloaded—is so-called “adult material.”
Even worse than the Internet is the rank perversion spewing from the darker side of Hollywood. This year, the “other” Hollywood is on course to release 10,000 new pornographic movies, shattering 1998’s record by more than a thousand titles. Do the math: that’s 27 titles per day. Like so many modern perversions, pornography owes much of its success to technology, which delivers the filth into the privacy of homes. Without VCRs, consumers could not rent the 700 million hard-core titles checked out of video stores annually. The $11 billion porn industry outearns the $7.2 billion that “mainstream” Hollywood will collect in domestic ticket sales this year.
Other articles could be written about the soft porn hurled at us from every medium—raunchy television programs, MTV, provocative music lyrics, R-rated movies. Television’s Baywatch, known for its bad acting and well-proportioned lifeguards, amasses a whopping $100 million per year. Two-thirds of that comes from overseas, where the program airs in more than 140 countries.
That leads right into America’s largest single export. It’s not cars or even computers. It’s entertainment. The U.S. entertainment industry will gross more than $450 billion this year. To put that in perspective, if American entertainment were a nation, it would be the 11th richest of the 210 countries ranked by the World Bank—right behind Canada and Spain.
The Future of Science and Technology
On the one hand, 20th-century science and technology have delivered an almost magical, push-button society that offers every imaginable human comfort and convenience. None of these innovations is, by itself, bad.
But all of them have been made to serve bad purposes as well as good. Advancements in travel—automobiles, airplanes, trains—also brought us tanks, submarines, F-16s and aircraft carriers. Man-made chemicals have produced bumper crops, processed food and bacteria-killing antibiotics, but they have also depleted the soil and food of all-important nutrients, contaminated the environment and caused many new diseases. Labor-saving devices have created new jobs and added hours to the day, yet there is increasing idleness, aversion to work and depression. The awesome power of the mass media—movies, radio, television, the Internet—has turned into a mind-numbing addiction for the masses. The computer-dominated Information Age is also responsible for electronic fraud, rampant pornography and a new era in modern warfare—stealth fighters, laser-guided missiles and subminiaturization. And while there are dozens of ways to communicate in this modern age, we are struck by the increasing inability of nations, neighbors and families to communicate civilly with each other.
The paradox of the 20th century is nowhere more evident than in the schizophrenic personality of science. Man has produced an unbelievable, push-button dream world. Yet he has proven equally imaginative and inventive when it comes to producing evil.
You need not be an expert in DNA science to foresee what lies ahead. Given enough time, man would undoubtedly be able to clone himself and send newlyweds to honeymoon in space. The question is, at the present rate of technological advance, how much longer before society collapses under the heavy weight of science’s darker side?