Take a second and think about your great-grandparents. The world they were born into is almost an alien planet compared to ours. The 20th century witnessed an unprecedented explosion of social and technological change.
In 1900, the average American life expectancy was a mere 47.3 years. By 2010, it had soared to 78.7 years. That incredible leap wasn’t just about living longer; it was about living in a completely different reality.
Philosopher Eric Hoffer described it as a time of such “drastic rapid change… when the future is in our midst devouring the present before our eyes.” Many of the foundational technologies we rely on today were born from innovations developed before World War I, creating what one historian called a “world of inexplicable wonders” for those living through it. Here are 10 things our great-grandparents did that, from our 21st-century vantage point, are entirely and utterly unthinkable.
Smoked Cigarettes… Based on a Doctor’s Recommendation

It’s almost impossible to picture it now, but there was a time when smoking was as common as checking your phone. In 1954, Gallup News reported that an incredible 45% of all American adults smoked cigarettes. This wasn’t some rebellious counter-culture habit; it was mainstream. Historian Allan Brandt went so far as to call the 20th century the “Cigarette Century.” The practice was so deeply woven into the social fabric that it was permitted almost everywhere. People lit up in their homes, in restaurants, on airplanes, and, most shockingly, in hospitals and even at medical conferences.
But here’s the part that truly boggles the mind: the tobacco industry brilliantly used the medical profession itself to sell its deadly product. Long before the public understood the link between smoking and lung cancer, a majority of physicians were smokers themselves. Tobacco companies seized on this, creating ad campaigns that are horrifying to look at today. In 1930, an advertisement for Lucky Strike cigarettes boldly claimed, “20,679 Physicians say ‘LUCKIES are less irritating’.” Their “research” method? They mailed free cartons of cigarettes to doctors and asked a cleverly biased, leading question. A few years later, Philip Morris ads claimed that sponsored studies showed that when smokers switched to their brand, “every case of irritation cleared completely.”
The most famous example came in 1946 from R.J. Reynolds, the maker of Camels. Their ubiquitous slogan was, “More doctors smoke Camels than any other cigarette.” They obtained this “finding” by a simple trick: giving doctors free cartons of Camels and then asking them which brand they smoked. It was a masterclass in deceptive marketing. These companies weren’t just selling a product; they were selling an identity. Ads were carefully crafted so that the brand a smoker chose “represented the type of person he or she wanted to become.” The rugged, independent “Marlboro Man” campaign, for instance, became one of the most successful of all time by offering an “escapist fantasy” to men working mundane office jobs.
Scientific studies linking smoking to cancer began emerging in the 1950s. The U.S. Surgeon General’s landmark 1964 report officially declared cigarette smoking a “health hazard of sufficient importance… to warrant appropriate remedial action.” Yet, it took decades of relentless public health campaigns, warning labels, and public smoking bans to undo the damage.
Let Their Kids Ride Shotgun Without a Seatbelt (or Car Seat)

If you’ve ever wrestled with installing a modern, five-point harness car seat, take a deep breath and be grateful. For your great-grandparents, the very concept of a “safety seat” for a child was nonexistent. The first so-called child seats of the 1930s and ’40s had a completely different purpose. Models like the “Bunny Bear” booster seat were designed simply to “elevate children so they could see their surroundings” during a car ride. They were more concerned with entertainment than safety. Some later versions even came with a built-in toy steering wheel to keep the child occupied. The goal was to keep the kid contained and happy, not to protect them in a collision.
The idea of a car seat designed for safety didn’t even emerge until 1962, when a British mother and journalist named Jean Ames invented the “Jeenay” seat. It was the first to be intended for the back seat and to use a harness system. Even for adults, safety restraints were an afterthought. Cars sold in the U.S. were federally required to be equipped with seatbelts starting in 1968, but for years, there was no law actually compelling anyone to wear them. As a result, almost no one did. In 1980, 30 years after the first car with seatbelts was sold, the national usage rate was a pitiful 11%.
The first state law making seatbelt use mandatory was passed in New York in 1984 and took effect on December 1 of that year. And even then, the idea was deeply controversial. People wrote letters to The New York Times arguing that such laws were an unconstitutional infringement on their freedom. This cultural battle between individual liberty and public safety is a recurring theme in American history. However, the eventual, overwhelming acceptance of seat belt laws—with usage rates now exceeding 91%—marked a significant shift in the social contract.
The National Highway Traffic Safety Administration (NHTSA) estimates that modern child safety seats reduce the risk of fatal injury by 71% for infants and 54% for toddlers. For older children and adults, seatbelts reduce the risk of death in a crash by about half. The journey from a booster seat with a toy steering wheel to today’s crash-tested safety systems shows just how far our understanding of risk has come.
Painted the Nursery with Lead-Based Paint

Of all the hidden dangers in a great-grandparent’s home, perhaps none was more insidious than the paint on the walls. It’s a shocking figure, but the U.S. Environmental Protection Agency (EPA) estimates that 87% of all homes built before 1940 contain lead-based paint. That number remains a startling 69% for homes constructed between 1940 and 1960. Lead was a prized ingredient in the paint industry. It was added to make paint dry faster and wear longer, and to create more vibrant, durable pigments. It was used on everything from interior and exterior walls to furniture, as well as on children’s toys and cribs. This widespread use persisted for decades, despite the health risks being known in medical circles as early as the early 1900s. More than a dozen other countries banned lead paint long before the U.S. federal government finally prohibited its use in consumer housing in 1978.
The consequences of this delay were catastrophic, creating what public health experts refer to as a “silent epidemic.” The World Health Organization (WHO) states unequivocally that “There is no level of exposure to lead that is known to be without harmful effects.” The Centers for Disease Control and Prevention (CDC) confirms that even very low levels of lead in a child’s blood are associated with “developmental delays, difficulty learning, and behavioral issues.” Unlike a dramatic accident, lead poisoning is a slow, invisible crisis. Damage to a child’s developing brain is subtle but permanent. A groundbreaking 2022 study published in the scientific journal PNAS sought to quantify this “legacy lead exposure.” The findings are heartbreaking. The researchers estimated that more than 170 million Americans alive in 2015—over half the country’s population—had been exposed to dangerously high levels of lead in their early childhood.
The study went even further, calculating the cognitive cost of this mass poisoning: an estimated collective loss of 824 million IQ points across the U.S. population, an average of 2.6 IQ points per person. The generations born during the peak of leaded gasoline use in the 1960s and ’70s suffered the greatest harm.
Sent Their 10-Year-Olds to Work in Factories

Today, the idea of a child working a full-time job seems like a relic of a Dickens novel. For our great-grandparents, however, it was a harsh economic reality. Child labor wasn’t an anomaly; it was a core component of the American economy, from farm fields to factory floors. The numbers are jarring. The 1890 U.S. census found that more than 18% of all children between the ages of 10 and 15 were employed—nearly one in five kids in that age group. In the burgeoning textile mills of the industrial Northeast, children were everywhere, sometimes making up half of the entire workforce. They were hired for a fraction of adult wages, with some textile mills paying them as little as $2 a week.
The fight to end this practice was a long and arduous struggle that spanned decades. Early state-level reforms, such as an 1842 Massachusetts law limiting children to a 10-hour workday, were important first steps but were inconsistently enforced. The reform movement was fueled by tireless activists like Mother Jones, who in 1903 organized the “Children’s Crusade,” a march of child workers from Pennsylvania to New York. The children carried banners with heartbreakingly simple demands: “we want time to play” and “we want to go to school.” Even with growing public pressure, national efforts to ban child labor were repeatedly defeated. The Keating-Owen Act of 1916, a federal law aimed at prohibiting the interstate sale of goods made by children, was struck down as unconstitutional by the Supreme Court just two years after its passage.
It wasn’t until 1938, with the passage of President Franklin D. Roosevelt’s Fair Labor Standards Act (FLSA), that child labor was finally and effectively prohibited in most factories, mines, and other hazardous occupations nationwide. The primary argument used to defend child labor was one of economic necessity—that poor families simply couldn’t survive without the income their children brought in. And it’s true that in some households, children’s earnings could account for as much as 23% of the total family income. However, this created a cruel paradox. The vast pool of cheap child labor actively suppressed the wages of adult workers, making it even more difficult for a family to get by on a single income. The system, in effect, perpetuated the very poverty it claimed to alleviate. By removing children from the workforce, the FLSA helped break this cycle, forcing industries to invest in adult labor.
Sprayed the Neighborhood with DDT to Kill Bugs

In the years following World War II, a chemical with a clunky name—dichloro-diphenyl-trichloroethane, or DDT—was hailed as a modern miracle. It had been used with incredible success during the war to protect Allied troops from insect-borne diseases, such as malaria and typhus. When the war ended, this “miracle chemical” was brought home and unleashed upon the American landscape with near-religious fervor.
DDT was used for everything. It became the pesticide of choice for agriculture, with the cotton industry alone accounting for 80% of all domestic DDT use. The U.S. Department of Agriculture sprayed it from airplanes over an estimated 30 million acres of forests to combat pests, such as the gypsy moth. In suburbs and cities, trucks and planes would “broadcast spray” entire neighborhoods with clouds of DDT to kill mosquitoes. It was seen as a symbol of scientific progress and a clean, pest-free future. But the miracle had a dark side. As early as the late 1940s, people began to notice troubling signs. In areas heavily sprayed with DDT, there were reports of mass die-offs of robins and other birds, as well as rivers filled with dead fish. The chemical was accumulating in the environment with devastating consequences.
The cultural tipping point arrived in 1962 with the publication of Silent Spring, a groundbreaking book by the marine biologist and writer Rachel Carson. With powerful, poetic prose, Carson documented the widespread environmental damage caused by the indiscriminate use of pesticides. The book was an absolute sensation. It is widely credited with having “catalyzed the modern environmental movement.” It landed on the desk of President John F. Kennedy, who ordered a federal investigation into the dangers of DDT. The powerful chemical industry fought back fiercely, attempting to discredit Carson by labeling her “an alarmist, mystic and hysterical woman.” But it was too late. Public consciousness had been awakened. On June 14, 1972, the newly created U.S. Environmental Protection Agency (EPA) announced a nationwide ban on virtually all remaining uses of DDT, citing its adverse effects on wildlife and potential risks to human health.
Got a Lobotomy for “Anxiety”

In the modern era of therapy, psychopharmacology, and a growing understanding of mental health, it is almost impossible to fathom that for a time, the preferred treatment for a range of psychiatric conditions was a crude, brutal brain surgery: the lobotomy. Before the development of effective psychiatric medications in the mid-1950s, mental health institutions were often overcrowded, and treatments were limited. In this desperate environment, the lobotomy emerged as a shocking, so-called “last-resort” remedy. The procedure’s popularity surged in the 1940s and early 1950s. According to Britannica, more than 50,000 lobotomies were performed in the United States in all, with the vast majority taking place between 1949 and 1952.
The most notorious figure in this dark chapter of medical history was an American neurologist named Dr. Walter Jackson Freeman II. He championed and popularized a version of the procedure called the “transorbital lobotomy.” His method was shockingly crude: he would use a surgical instrument that was essentially a reinforced ice pick, insert it into the corner of the patient’s eye socket, and use a mallet to hammer it through the thin bone into the brain. He would then manipulate the instrument to sever the connections in the frontal lobes. Freeman transformed this horrifying operation into a swift, assembly-line procedure, sometimes completing it in under 10 minutes. He performed or supervised over 3,500 lobotomies during his career.
The lobotomy craze finally came to an end in the mid-1950s, not because of a sudden ethical awakening, but because of a scientific breakthrough: the development of the first effective antipsychotic medications, such as chlorpromazine (Thorazine). These drugs offered a far safer and more humane way to treat the symptoms of severe mental illness, rendering the ice pick obsolete. The story of the lobotomy stands as a chilling monument to medical hubris and a reminder of the horrific things that can be done in the name of a “cure” when desperation outpaces understanding.
Used Radioactive Cosmetics for a Healthy Glow

Today, we associate radiation with danger, but in the early 20th century, it was seen as a miracle of modern science. Following the discovery of radium by Marie and Pierre Curie, the element’s mysterious properties and faint glow captivated the public imagination. This fascination quickly morphed into a full-blown “radium fad,” a bizarre and deadly period of radioactive quackery. Radium was marketed as a revolutionary health and beauty ingredient, a source of vitality and energy. Companies began incorporating it into a wide range of consumer products. You could buy radium-laced toothpaste, hair creams, and even food and water. One of the most popular applications was in cosmetics, with brands like “Tho-Radia” promising a vibrant, healthy glow.
The most tragic chapter of this story, however, took place in factories across the country. Companies like the United States Radium Corporation (USRC) began producing luminous paint by mixing radium with other materials. This paint was used to make the dials of watches and military instruments glow in the dark. To perform this delicate work, the factories hired an estimated 4,000 workers, almost all of whom were young women. The job was seen as glamorous and was relatively well-paid for the time. To create a fine point on their camel hair brushes, the women were instructed by their managers to use the “lip-pointing” technique, which involved shaping the brush tip with their lips and tongue. With every lick, they ingested a small amount of deadly radium.
They were repeatedly told the paint was harmless. Some of the women, known to history as the “Radium Girls,” would even paint their nails and teeth with the glowing substance for fun before a night out. All the while, the company’s owners and scientists were fully aware of the dangers of radium. They took precautions to protect themselves, using lead screens and tongs while handling the material. Soon, the women began to fall ill with horrific, inexplicable symptoms. They suffered from severe anemia, their bones became brittle, and their teeth fell out. Most disturbingly, many developed a condition that became known as “radium jaw,” in which their jawbones would crumble and disintegrate. By 1927, more than 50 of the dial painters had died agonizing deaths.
When the surviving women attempted to sue the USRC, the company fought back ruthlessly. It engaged in a disinformation campaign, claiming the women were suffering from syphilis in an attempt to discredit their characters and ruin their reputations. But the women persisted, and their court case became a national media sensation. Their eventual settlement in 1928 was a landmark moment in legal history. The Radium Girls’ courageous fight established the right of individual workers to sue their employers for damages caused by occupational diseases. Their tragedy directly led to the establishment of new industrial safety standards and labor laws, fundamentally changing the relationship between workers and corporations in the United States and laying the groundwork for future protections.
Slathered on Baby Oil to Get a “Healthy” Sunburn

For centuries, pale skin was the beauty ideal, and women went to extreme lengths to maintain their pallor, using parasols, heavy powders, and even dangerous substances like arsenic and lead to whiten their skin. That centuries-old standard was upended in a single moment in 1923. The catalyst was the iconic fashion designer Coco Chanel. After returning from a yachting holiday on the French Riviera, she was photographed with a deep, sun-kissed tan. In an instant, the look that had signified outdoor labor and poverty became the absolute pinnacle of glamour and chic. A tan was no longer a mark of a laborer; it was a sign that you were wealthy and worldly enough to vacation in exotic, sunny locales. This fashion revolution kicked off a decades-long obsession with tanning. By the 1950s and ’60s, as beach culture exploded in America, people went to incredible lengths to achieve the darkest tan possible. Their tanning arsenal included some truly shocking tools.
The lotion of choice was often baby oil, sometimes mixed with iodine, applied liberally to the skin to maximize sun absorption. To intensify the effect, people would lie out with large, foil-lined reflectors propped under their chins to concentrate the sun’s rays on their bodies. The goal wasn’t to protect the skin; it was, essentially, to cook it. The famous tagline for Coppertone, one of the first major tanning brands, was “Tan, Don’t Burn!”—a slogan that perfectly captured the era’s attitude.
Today, dermatologists are united in their horror at this once-common practice. The verdict is clear and unanimous: using baby oil to tan is, as Dr. Ross Perry of Cosmedics puts it, “a big no-no.” Dr. Michele Farber, a dermatologist, explains that baby oil works by attracting and absorbing UV rays, allowing them to penetrate the skin much more deeply. It provides absolutely zero SPF protection. Tanning with it dramatically accelerates skin damage, leading to premature aging effects like wrinkles and sunspots, and significantly increases the risk of developing life-threatening skin cancers, including melanoma.
Sent Unwed Mothers Away and Forced Adoptions

In the decades following World War II, a quiet, hidden tragedy was unfolding across America. It was a period that historians now refer to as the “Baby Scoop Era.” During this time, an estimated 1.5 million to 4 million infants, the vast majority born to unmarried mothers, were surrendered for adoption. For our great-grandparents, having a child outside of marriage was not just a personal matter; it was a source of profound public shame and social disgrace. Unmarried mothers were often labeled “fallen women” or “ruined.” Their families frequently disowned them, they were denied public assistance, and society viewed them as moral deviants and a public health concern.
To deal with this “problem,” a nationwide system of maternity homes emerged. These institutions, often operated by religious charities, were designed to hide pregnant young women away from their communities. Most of the residents were white, middle-class teenagers and young women whose families were desperate to conceal the pregnancy and preserve their social standing. These homes were not the supportive sanctuaries they might sound like. They were often isolating and punitive environments built on a foundation of secrecy and shame. Residents were frequently forced to use aliases, were cut off from friends and family, and were given little to no information about childbirth or their legal rights as parents.
The entire system was structured around coercion. With no financial resources, no family support, and no social acceptance for single motherhood, these young women were systematically pressured into believing that giving their child up for adoption was their only viable choice. The statistics reveal the power of this systemic pressure. In 1970, at the height of the Baby Scoop Era, an estimated 80% of all infants born to single mothers were placed for adoption. By 1983, after the social stigma had begun to recede and the legalization of abortion provided another option, that number had plummeted to just 4%. This dramatic drop demonstrates that the “choice” to surrender a child was, for many, an illusion created by a society that offered no other path.
The human cost of this era was immeasurable. Mothers reported being treated in a “frankly inhumane” manner during childbirth, with contact with their babies deliberately minimized to prevent bonding. Many were not allowed to hold their newborns at all, or were given only a few moments before the baby was taken away for good.
Lived in a World Without Instant Connection

It can be hard to grasp how recently our hyper-connected digital world fully emerged. For our great-grandparents, and even for many of our parents, the reality of daily life was profoundly different. Consider this: in 1984, only 8.2% of households in the United States owned a personal computer. Five years later, in 1989, that figure had only inched up to 15%. The “Information Age,” the era defined by a rapid shift from an industrial economy to one centered on information technology, was technically born in the mid-20th century, but its presence in the average home was nonexistent for decades.
Our great-grandparents navigated a world without the internet, email, smartphones, and social media. News and information traveled at the speed of a printing press or a broadcast signal. Keeping in touch with loved ones who lived far away meant sitting down to write a letter or paying for an expensive long-distance phone call. This wasn’t just a technological gap; it fundamentally shaped the rhythm of life, the nature of community, and the way people interacted with the world and with one another. The sheer scale and speed of the technological transformation that has occurred since their time is staggering.
The scientist and historian of technology Vaclav Smil made a powerful point when he argued that a well-informed scientist from the late 18th century, if transported to the year 1910, would have found himself in a “world of inexplicable wonders.” The same could be said of our great-grandparents if they were transported to our world today. The idea of carrying the entirety of human knowledge in a small device in your pocket would have been the stuff of the wildest science fiction.
But the most profound difference between their world and ours may not be any single technology, but the rate of change itself. The 20th century was undoubtedly a period of “rapid expansion” of innovation, but the digital revolution of the late 20th and early 21st centuries put that acceleration into overdrive. Moore’s Law famously captures it: the observation that the number of transistors on a microchip—and, roughly, computing power—doubles approximately every two years. Our great-grandparents experienced revolutionary new technologies—the automobile, the radio, the television—emerging throughout their lifetimes. We, on the other hand, live in an era of constant, disruptive technological churn, where entire industries can be born and made obsolete in a matter of years.
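To get a feel for what that doubling rule implies, here is a back-of-the-envelope sketch (our own illustrative arithmetic, assuming a clean two-year doubling, which real chips only approximate):

\[
N(t) = N_0 \cdot 2^{t/2}, \qquad \frac{N(20)}{N_0} = 2^{10} = 1024, \qquad \frac{N(40)}{N_0} = 2^{20} \approx 10^{6}
\]

In other words, a steady two-year doubling compounds into roughly a thousandfold gain in twenty years and a millionfold gain in forty—well within a single lifetime.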
Disclaimer – This list is solely the author’s opinion based on research and publicly available information. It is not intended to be professional advice.