Every billion-dollar brand has a product, but the truly dominant ones have something far more powerful—a story. Not just any story, but a narrative so persuasive, so deeply embedded in our culture, that it reshapes our desires and rewires our behavior. These stories don’t simply sell goods; they sell ideas, rituals, and symbols that outlive their creators.
In many cases, the products themselves are ordinary, even unnecessary. Yet through masterful marketing, they become indispensable, loaded with emotional weight and social expectation. Here are five brands that didn’t just sell what they made—they sold myths we still believe today.
Diamonds Are Forever – The Greatest Marketing Illusion
For centuries, marriage proposals bore little resemblance to the glossy jewelry ads we see today. In many cultures, there was no tradition of giving an engagement ring at all. Where rings did exist, they were more often set with colored gemstones—ruby for passion, sapphire for loyalty, emerald for fertility—each carrying symbolic weight far older than the diamond myth. Diamonds held no sacred place in the rituals of love, and after the vast South African discoveries of the 1870s, they were not even particularly rare. They were, in truth, just another stone—plentiful in the earth, and nowhere more so than in southern Africa.
That abundance was precisely De Beers’ problem. Founded in 1888 by Cecil Rhodes, the company quickly consolidated control over the diamond mines of South Africa, swallowing competitors until it had near-total dominance. By the early 20th century, under the leadership of German-born financier Ernest Oppenheimer, De Beers became not merely a mining company but an orchestrator of scarcity. They operated as a cartel, regulating production to drip-feed diamonds into the market. The goal was never to meet demand but to manufacture it. By hoarding vast stockpiles and carefully rationing supply, they could command high prices for a commodity that was—geologically speaking—common.
Then came the Great Depression. Luxury spending collapsed. Diamonds sat unsold in vaults. The concept of spending a significant sum on a stone for a marriage proposal was laughable to the average person. De Beers faced a grim reality: unless they could make diamonds indispensable, their empire would wither.
In 1947, salvation came in the form of a late-night scribble by Frances Gerety, a young copywriter at N.W. Ayer & Son. Four words—A Diamond Is Forever—would later be named the advertising slogan of the 20th century by Advertising Age. It was a line not about the physical properties of the stone (though diamonds are indeed hard) but about the permanence of love and commitment. The genius was in the subtext: if love is eternal, the token of that love must be eternal too. A diamond wasn’t just a purchase; it was a moral and emotional obligation.
From there, De Beers executed one of the most far-reaching cultural engineering campaigns in history. They embedded diamonds into the very architecture of romance. Hollywood films began featuring glittering proposals as climactic moments. Celebrity engagements were fed to gossip columns, their rings described in loving detail. Magazines published “rules” about how much a man should spend—often two months’ salary—as if it were an age-old tradition rather than a marketing invention. High society’s embrace of the diamond engagement ring trickled down to the middle class, where it became not just a luxury, but the standard.
The transformation was staggering. In 1939, diamond sales in the United States totaled roughly $23 million. Within a generation, that figure multiplied many times over, helping create a $90 billion global industry. By the late 1950s, eight out of ten engagement rings sold in America featured diamonds, a statistic that had no precedent in human history.
And yet, beneath the shimmer, the economics remain brutally one-sided. A diamond purchased at retail often loses 50% or more of its value the instant it leaves the jewelry store. Most jewelers refuse to buy secondhand stones, and resale prices at auction typically hover well below the original purchase price. Even the notion of “natural” rarity is hollow—today, lab-grown diamonds, chemically and visually identical to mined stones, can be produced for a fraction of the cost, often 80% cheaper, without the environmental devastation of mining.
Still, the illusion endures. People know, intellectually, that a diamond is neither rare nor an appreciating asset. But the story—the one Frances Gerety distilled into four words—runs deeper than logic. It has woven itself into the collective imagination, where love and sacrifice are measured in facets and carats. The result is a triumph of narrative alchemy: a plentiful stone transfigured into a universal symbol of eternal love, simply because we were told it always had been.
Breakfast – The Moral Crusade of John Harvey Kellogg
The idea that breakfast is “the most important meal of the day” is so ingrained that challenging it feels almost sacrilegious. But this widely accepted “fact” didn’t spring from nutritional science—it came from the deeply moralistic worldview of one man: John Harvey Kellogg.
Born in 1852 into a large family in Michigan, Kellogg was raised in the strict doctrines of the Seventh-day Adventist Church. This Christian denomination placed enormous emphasis on health, temperance, and a vegetarian diet, viewing physical discipline as intertwined with spiritual purity. From childhood, Kellogg absorbed the belief that the body was a vessel for moral virtue—and that what one ate could influence not only physical health but also character and behavior.
Kellogg trained as a physician but built his fame running the Battle Creek Sanitarium, an upscale health resort that attracted wealthy and influential patients from across America. There, guests were immersed in a regimen of vegetarian meals, exercise, and hydrotherapy. Central to Kellogg’s philosophy was his conviction that bland, unseasoned food could curb lust, laziness, and impure thoughts. Rich, spicy dishes, he warned, inflamed the senses and weakened moral resolve.
This belief was not a fringe eccentricity for Kellogg—it was the bedrock of his life’s work. He saw digestion as a moral process, and controlling diet as a way to control desire. In his mind, plain grains and cereals were not just healthful; they were tools of moral reform.
In 1894, Kellogg and his younger brother Will accidentally discovered a new way to process wheat, creating thin, crisp flakes that were easy to chew and digest. Originally intended for sanitarium patients, these wheat flakes were followed by a similar invention using corn. Kellogg saw them as a practical extension of his ideals—healthy, plant-based, and conveniently bland.
Will, however, saw something more: a mass-market product. When the brothers’ partnership soured, Will founded the Kellogg Company and took the corn flakes recipe to the public. To win over the average consumer, he added sugar, an instant improvement in taste appeal and a betrayal of his brother’s original health goals.
The marketing that followed was revolutionary. Skipping breakfast was portrayed as a dangerous health mistake—one that left you sluggish, unfocused, and nutritionally deficient. Cereal, on the other hand, was positioned as quick, clean, and nourishing, perfect for the modern household. Parents were told it was the responsible choice for their children. Packaging became a playground for branding—bright colors, cartoon mascots, and collectible prizes turned the morning meal into a small daily celebration.
The strategy didn’t just sell food—it cemented a ritual. In the United States, breakfast shifted from a variable meal (sometimes skipped, sometimes heavy) to a fixed point in the day anchored by cold cereal and milk. And while the earliest Kellogg’s cereals had a measure of nutritional value, industrial-scale processing, the addition of sugar, and the need for long shelf lives gradually eroded their health profile.
Yet the perception of cereal as wholesome persisted, thanks to decades of advertising. Even today, in an era when intermittent fasting challenges the need for early-morning eating and when sugar-laden cereals are recognized as less than ideal, the ritual survives. Globally, breakfast cereals generate over $40 billion in annual revenue.
What began as one man’s attempt to tame human desire through diet became one of the most enduring—and profitable—manufactured habits in modern history. John Harvey Kellogg didn’t just shape a meal; he reshaped the cultural definition of a “good start to the day.”
Got Milk? – Selling Health, Guilt, and Humor
By the early 1990s, America’s relationship with milk was in trouble. The beverage that had been a fixture on breakfast tables for generations was losing ground to a wave of new competitors. Soft drinks, fruit juices, sports drinks, and bottled water were claiming shelf space and consumer loyalty. In California, milk sales were dropping so sharply that the state’s dairy farmers pooled resources to fund a rescue mission.
Enter the California Milk Processor Board and the ad agency Goodby Silverstein & Partners. Their task: reverse the decline, not with another generic “milk is healthy” pitch, but with something memorable enough to spark a cultural moment. The agency began with a simple insight—people only notice milk when it’s missing. They realized that the craving for milk is most acute when you’re about to eat something that “needs” it—like a dry cookie or a bowl of cereal.
From that observation, the campaign Got Milk? was born. But it didn’t stop at food pairings. The creative team tied the product to three emotional levers:
1. Health – They anchored milk’s value in its calcium content, promoting it as essential for strong bones and teeth. This was especially aimed at parents, who were told milk could help their children avoid osteoporosis and grow “big and strong.” The irony was that the scientific record was mixed—some of the countries with the highest per-capita milk consumption also had some of the highest osteoporosis rates. But by the time these studies emerged, the marketing narrative had already won.
2. Guilt – The ads played on the idea that a well-run household always had milk in the fridge. Running out was framed as a small but telling domestic failure. This wasn’t overt finger-pointing—it was subtle, tapping into the caretaker identity of parents, especially mothers, who were made to feel responsible for keeping the family’s nutritional bases covered.
3. Humor – Rather than scare tactics, the campaign used lighthearted frustration. Iconic TV spots showed hapless people stuck in situations where milk was desperately needed—biting into a thick peanut butter sandwich, eating chocolate cake, or staring down a dry cookie—only to find the carton empty. The scenarios were universally relatable, turning milk into an everyday hero.
The visual signature of the campaign was equally powerful: black-and-white celebrity portraits, each with a thick white milk mustache. From supermodels to sports icons, the faces were instantly recognizable, and the tagline underneath—Got Milk?—was a masterstroke of brevity. It wasn’t a statement; it was a challenge, prompting viewers to check their own fridge.
The cultural penetration was staggering. The milk mustache became a pop-culture phenomenon, spoofed on late-night television, printed on posters in school cafeterias, and adopted by local campaigns around the world. Sales rose, and milk regained its position as a household staple.
What made the campaign so effective was that it didn’t just sell a beverage—it sold an identity. Drinking milk became shorthand for being healthy, wholesome, and prepared. Even as plant-based alternatives gained popularity and lactose intolerance became widely discussed—affecting roughly 65% of the global population—the nostalgic grip of Got Milk? kept dairy in the conversation.
The campaign’s genius lay in turning a simple, perishable product into a cultural touchstone. It framed milk not just as a drink, but as a small but essential pillar of a “complete” home, and it did so with a mix of wit, emotional resonance, and flawless visual branding.
Hallmark – Manufacturing Occasions
Before Hallmark, special moments were marked in simpler ways. People wrote letters by hand, visited loved ones, or gave small, practical gifts. The notion of buying a pre-printed card with a ready-made message was not just uncommon—it had to be invented. That invention came from Joyce Clyde Hall, who in 1910 started selling postcards and greeting cards from his small shop in Kansas City.
Hall was a keen observer of both cultural shifts and postal trends. At the time, postcards—brief, one-sided notes—were falling out of fashion, but the postal service was expanding rapidly. He saw an opportunity in a new format: a folded card in an envelope, offering more space for words and a greater sense of privacy. These cards were more personal, more elegant, and, crucially, could be designed for specific occasions.
The genius of Hallmark was not simply in making cards—it was in creating the occasions for which they were “needed.” Valentine’s Day, for example, had existed for centuries as a romantic tradition, but in the early 20th century it was modestly celebrated. Hallmark transformed it into a mass-market holiday by producing a wide variety of Valentine’s cards—romantic, friendly, humorous—broadening the scope so that everyone, not just couples, could participate. Soon, schoolchildren were exchanging cards, workplaces encouraged friendly notes, and the day expanded far beyond its original meaning.
From there, Hallmark began manufacturing entire cultural touchpoints. Sweetest Day, Boss’s Day, National Grandparents Day—these were either invented or popularized through marketing campaigns that positioned them as “real” events worth celebrating. The formula was consistent and effective:
- Spot an emotional gap – Identify moments in life where people want to express gratitude, affection, or sympathy but may lack the words or time.
- Define the script – Decide how such an occasion “should” be acknowledged—a card becomes the expected gesture.
- Make it effortless – Offer a beautifully designed product with the words already written, so even those uncomfortable with expressing emotion can still “say the right thing.”
By the 1980s and ’90s, this strategy had turned into an economic powerhouse. The U.S. greeting card market peaked with over 7 billion cards sold annually. A single card, often costing $5 or more, could be mass-produced for pennies. The margins were enormous, and the cultural reinforcement was self-perpetuating—if everyone else sent a card, not sending one began to feel like a breach of etiquette.
Hallmark also mastered seasonal domination. Holidays became multi-tiered events, each with its own card category: birthdays, anniversaries, graduations, Mother’s Day, Father’s Day, religious holidays, and an expanding array of “just because” occasions. They even tied cards to pop culture by licensing characters from Disney, Peanuts, and other franchises, ensuring their appeal across generations.
The company’s influence extended into other media as well. Hallmark-branded TV movies carried the same emotional themes as their cards—love, reconciliation, nostalgia—subtly reinforcing the idea that heartfelt gestures should be accompanied by a tangible token. The films weren’t just entertainment; they were two-hour commercials for the worldview Hallmark had built.
What Hallmark sold was more than stationery. It was a standardized language of care and connection, where emotional expression was prepackaged, polished, and purchasable. In doing so, they shifted social norms—turning spontaneous, personal communication into a commoditized ritual, and creating a multibillion-dollar fortune on the back of a simple piece of folded paper.
Listerine – Inventing a Medical Problem
Listerine’s story is a masterclass in turning a minor, largely ignored inconvenience into a daily, billion-dollar habit. When it first appeared in the late 19th century, Listerine was not a mouthwash at all—it was a surgical antiseptic. Named after Joseph Lister, the pioneer of antiseptic surgery, it was used to clean wounds, disinfect surgical instruments, and even serve as a household cleaner. In its early decades, it was marketed for an astonishing range of purposes, from curing dandruff to treating gonorrhea.
By the 1920s, sales had stagnated. There were only so many hospitals and households that needed an antiseptic strong enough to scrub floors and surgical tables. Lambert Pharmaceutical, Listerine’s parent company, needed a way to take their product from a niche medical aid to something the average person would feel compelled to use every day.
Their solution was as audacious as it was effective: they popularized an obscure, medical-sounding term for bad breath—halitosis. The word blends Latin (halitus, meaning breath) with the Greek suffix -osis (indicating a disease or condition); it had appeared in medical literature decades earlier but was virtually unknown to the public. It sounded clinical, serious, and vaguely threatening, far more alarming than “your breath smells.” Crucially, halitosis was not a recognized disease; it was a label for an invented problem. But once defined and named, it could be diagnosed, feared, and treated.
Lambert’s advertising campaign was relentless. Full-page newspaper ads depicted young women unable to marry, businessmen failing in negotiations, and friends turning away—all due to “chronic halitosis.” The message was clear: bad breath wasn’t just unpleasant; it was a social catastrophe that could cost you love, success, and respect. The imagery tapped into deep insecurities about personal reputation and social acceptance.
Once the fear was firmly planted, Listerine was offered as the easy, definitive solution. In reality, the product did nothing to address the root causes of bad breath, which can be linked to diet, oral hygiene, or medical conditions—it simply masked odors with a strong antiseptic burn. But this didn’t matter. By the time consumers understood that, the emotional association between halitosis and social failure had taken root.
The numbers told the story. In 1921, Listerine’s sales hovered around $100,000 annually. By 1927, that figure had exploded to over $4 million. The campaign had not just boosted sales—it had redefined social etiquette. Where bad breath was once tolerated as a natural, occasional occurrence, it now became an unpardonable flaw. People began incorporating mouthwash into their daily hygiene routines, not as a luxury, but as a necessity.
This shift in perception was revolutionary. Listerine had managed to transform a product designed for operating rooms into a bathroom staple. They had reframed a mild, often unnoticed condition into a pressing personal hygiene crisis—and in doing so, created a habit that would be passed down through generations.
Even today, the mouthwash market is projected to exceed $16 billion globally by 2032. Countless brands compete, but the playbook remains the same: present bad breath as a silent saboteur of relationships and careers, then offer a bottled remedy. And because bad breath is notoriously hard to self-diagnose, the “better safe than sorry” mindset keeps consumers coming back, often for life.
Listerine’s success is proof that in marketing, defining the problem can be even more powerful than selling the solution. By naming and dramatizing an issue most people didn’t think they had, they didn’t just sell more product—they rewrote the rules of personal hygiene and turned an antiseptic into a cultural necessity.
Conclusion
These five campaigns share a common blueprint: identify something ordinary, attach it to an emotion people can’t ignore, and make that emotion feel inseparable from the product. Once the story takes root, facts become irrelevant—because the decision to buy is no longer logical, it’s cultural.
That’s the genius (and danger) of great marketing. It can turn carbon into a symbol of eternal love, a bowl of sugary flakes into a moral obligation, and a simple antiseptic into a daily ritual. Understanding how these narratives are built doesn’t just make us savvier consumers—it shows us how influence, once embedded in culture, can be almost impossible to undo.
