Why Facebook Won’t Remove Miracle Cure Ads
The Business Model Is Working Exactly as Intended
Facebook recently showed me an ad for a device that claims to reverse Type 2 diabetes using a copper patch applied to the wrist.
Then one for a supplement that dissolves kidney stones while you sleep.
Then one for a shoulder massager endorsed by a doctor who doesn’t appear to exist, and a spinal stenosis cure that’s been on sale “tonight only” for the past several months.
I know about these products because I’ve been writing about them. I know about them because Facebook keeps showing them to me. I know about them because Facebook’s advertising system approved every single one, targeted them at people most likely to be suffering from the relevant condition, and served them millions of times to people who were frightened, in pain, and looking for help.
This is not an accident.
This is the product.
The Mission
“Give people the power to build community and bring the world closer together.”
This is Facebook’s stated mission. It appears in annual reports, investor presentations, and the kind of carefully worded public statements that get read out at parliamentary hearings.
It does not appear in the ad approval algorithm.
The ad approval algorithm has a different mission. It asks one question: will this ad generate clicks? If the answer is yes, the ad runs.
The mission statement is aspirational. The algorithm is operational. These are not the same document, and only one of them runs the platform.
The Technology
Facebook possesses sophisticated content moderation tools: artificial intelligence, human reviewers, automated flagging systems, entire departments dedicated to Trust and Safety.
These tools are real. They are also selectively applied.
Hate speech is removed, eventually. Misinformation is labelled, sometimes. An ad claiming that a herbal patch dissolves kidney stones while you sleep is approved in minutes.
The reason is not technical limitation. It is financial incentive.
Hate speech generates outrage. Outrage generates bad press. Bad press affects the share price. The kidney stone patch generates revenue. Revenue affects the share price in the other direction.
The algorithm understands this distinction perfectly. It has been designed to understand it.
The Claims
“97% of hate speech is removed before user reports.”
The statistic is precise. The methodology is opaque. What counts as hate speech is defined loosely. What counts as removal is defined loosely. The number is real. Its meaning is negotiable.
“We are committed to fighting misinformation.”
The commitment is stated quarterly. The enforcement is inconsistent daily. Political misinformation is contested and visible. Health misinformation is profitable and targetable. The distinction between the two is not ethical. It is financial.
“We protect our community.”
The community is protected from some things. It is not protected from an ad that has been served 340,000 times this week claiming that dissolving a tablet under your tongue each morning will eliminate visceral fat without diet or exercise.
That ad is not a failure of the system. That ad is the system working.
The Enforcement Loop
An ad is submitted. It claims that a weight loss patch worn on the wrist for eight hours dissolves up to 60 kilograms of excess body weight in three weeks.
This claim violates the Advertising Policies. The Advertising Policies prohibit claims that are false, misleading, or medically unsubstantiated.
The ad is approved.
A user reports it. Facebook reviews it. Facebook finds no violation. The user reports it again. Facebook finds no violation again. After fourteen reports over a fortnight, the ad is removed.
The advertiser creates a new ad. New brand name. New logo. New video with different background music. Same patch. Same claim. The new ad is approved in minutes.
This is not a bug. It is the feature. The reporting system exists to give users the impression of recourse. The approval system exists to give advertisers a reliable pipeline. Both are functioning exactly as designed.
I have reported ads on Facebook. Most people who use the platform regularly have reported ads on Facebook. The experience is consistent: you report, Facebook reviews, Facebook finds no violation, the ad continues running, and eventually you stop reporting because nothing changes and you have other things to do.
That attrition is also part of the design.
The Economics
Facebook’s revenue in 2025 was just over $200 billion. The majority came from advertising. A significant and carefully unquantified portion came from categories that depend on misleading claims for their conversion rates.
Weight loss supplements. Miracle cures. Financial opportunity schemes. Crypto investments. Affiliate marketing funnels. Wellness devices that claim to decompress the vagus nerve, reset your core muscles, or restore blood flow to suffocating shoulder tissues.
These are not edge cases. They are core revenue streams.
Removing them would require defining misleading claims in a way that actually excluded misleading claims. That definition would cost money. The current definition does not cost money. It is the product of careful legal and financial calculation, and it is working.
The Meta-Irony
Facebook claims to build community. The community is the product.
Facebook claims to fight misinformation. Misinformation generates engagement. Engagement generates revenue.
Facebook claims to protect users. The users are sold to the people they need protecting from.
This is not a contradiction. It is the business model.
The mission statement exists to be quoted at hearings. The algorithm exists to generate revenue. The 27,000-word Community Standards document exists so that someone has something to point to when asked whether standards exist.
An ad promising 60 kilograms of weight loss in three weeks was approved in minutes. Both of these things are true simultaneously.
What This Site Is About
None of the ads I’ve been writing about (the spinal stenosis cure, the fictional Italian orthopaedic surgeon, the copper patch, the vagus nerve collar, the face-lifting tape, and the rest) would reach you without a platform willing to approve them, target them, and serve them to the people most likely to be vulnerable to them.
Facebook is not the only such platform. But it is the largest, the most sophisticated, and the most practised at presenting this arrangement as something other than what it is.
The products I write about here are symptoms. Facebook is the condition.
The patch is not the grift. The pill is not the grift. The platform that approves the ad, targets it at people who recently searched for weight loss solutions, serves it 340,000 times, and takes the money is the grift.
Understanding that distinction is the starting point for everything else on this site.
The Alternative
Report the ads by all means. Reporting may remove a specific ad. It will not change the system, because the system is not broken.
The exit, as always, is free. It has always been free. Fewer than 3% of users find it.
Paul is 71, writes from the Italian Alps, and has been scrolling Facebook for material since before it had a News Feed. He is not a medical professional, a tech journalist, or a platform policy expert. He is, however, persistent. The copper patch was not purchased for review. The money was spent on espresso and sunflower hearts. The birds, as always, delivered.


