ADHD Meds Research: Designed to Deceive
And why seven weeks of data tells us nothing about years of risk
Quick note:
Tickets for the next Drug Free ADHD in-person retreat are on sale.
It’s in the beautiful Peak District National Park on 1st June — and includes meditation, breathwork, wild swimming, and connection with other ADHD-powered folk just like you.
For tickets, click here.
===
Last week, The Guardian published a puff piece on ADHD medications, parroting claims from a questionable study — but without naming the study or linking to it, leaving readers in the dark.
I first stumbled across the actual study through a neuroscience news site, which at least provided a few more details (again, mysteriously not linking to the actual research).
But as I dug deeper, what I uncovered was shocking: a tangled web of vague timelines, industry funding, and media spin that misleads millions of people like you and me. I will now dismantle this puff piece and explain exactly how Big Pharma, the media, and (some) doctors attempt to mislead us all.
By the end of this newsletter you’ll understand:
How Big Pharma manipulates the scientific method.
What you can do to never be fooled again.
What a well-designed study really looks like.
This study first came to my attention when long-time subscriber and client of Drug Free ADHD, Michelle, sent me this article on a neuroscience website.
At first glance, it looks promising:
“A large new analysis of 102 clinical trials finds that ADHD medications have overall small effects on blood pressure and heart rate in both children and adults.”
Okay, if that’s true then perhaps ADHD meds don’t, as we’ve suspected for years, lead to cardiovascular disease.
Scrolling a little further down I see the study was conducted by the University of Southampton. Not a top UK university, but they are known for biomedical research.
All looking good so far.
But then the wheels start falling off:
A new study led by the University of Southampton has found that medications for ADHD have overall small effects on blood pressure and heart rate after weeks or a few months of use.
A few weeks or months of use? That’s both vague and a very short time window.
Michelle has been taking ADHD meds for years. She's also suffering chronic side effects, including severe chest pain, and her GP's solution is, you're not gonna believe this, to up the dose!
Remarkable, and deeply concerning.
So the article is vague. And then the next line smashes me right in the jaw, because it is an outright manipulation (and one most people would miss).
The study, published in The Lancet Psychiatry, conducted the largest and most comprehensive analysis of the cardiovascular effects of ADHD medications based on the results of randomised controlled trials – the most rigorous type of clinical study to assess medication effects.
Notice the sleight of hand here: they claim it’s the largest RCT analysis, which is technically true, but they conveniently ignore larger, far more meaningful studies — like long-term research — that offer a much clearer picture of real-world cardiovascular risks.
For example, the Swedish state funded a study into the cardiovascular effects of ADHD medications that followed around 278,000 patients for up to 14 years (roughly 60,000 of whom ended up in the final analysis). We'll dig into those results shortly.
Neither The Guardian nor the neuroscience blog bothered to link to the actual study. So I had to go hunting, and it took me a good 30 minutes to track it down.
Somebody doesn’t want this research easily accessed.
Which is odd, don’t you think?
Considering scientific research should be open and accessible to all who are affected by it. Particularly those taking potentially harmful medications.
When I finally find the study, I learn that around 22,000 people took part in those 102 clinical trials. This is wonderful. A decent data set.
However, the clinical trials ran for a median of just 7 weeks.
Seven. Weeks.
Here’s one major reason this study is BS:
Short-term blood pressure changes provide only a snapshot, whereas cardiovascular disease develops cumulatively over years. A trial lasting mere weeks cannot possibly capture this progression.
Who reading this has taken ADHD meds for just 7 weeks, unless the drugs didn't work or the side effects were horrific?
And here’s where things get even murkier.
Because while the Southampton meta-analysis itself was publicly funded, the vast majority of the original studies it relied on were paid for by the pharmaceutical companies that manufacture these drugs.
And now the short time scale and common distortions are starting to make sense.
This points to a big problem: the way the scientific method is abused by Big Pharma, and by the researchers deep in its pockets.
It’s called the evidence gap.
The Evidence Gap
Short-term trials are extremely common in pharmaceutical research, especially in psychiatry and neurology, but they don’t tell us much about what happens over years of use — and ADHD medications are often prescribed for years, especially in children who continue into adulthood. Yet, most of the evidence base is built on studies lasting just a few weeks.
Regulators and doctors rely on these short-term studies because they’re what’s available, but they don’t reflect the long-term reality. And pharmaceutical companies tend to invest in studies that get the drug approved, not necessarily in long-term safety studies unless they're forced to do so.
As Big Pharma’s primary motive is profit, why would they spend £10 million on a long-term study that will most likely prove their drugs to be ineffective or unsafe? Better to manufacture a study that gives them exactly what they want. Smart for business. Horrendous for the rest of us.
So when a large analysis like this one is performed, the underlying data covers such a short period that it doesn't really tell us anything, unless you're planning to take ADHD meds for just 7 weeks and then stop forever.
But, of course, not all studies are shyt.
Here’s what a well-designed study looks like
Compare this to a Swedish state-funded study (Attention-Deficit/Hyperactivity Disorder Medications and Long-Term Risk of Cardiovascular Diseases, Le Zhang et al.):
Monitoring 278,000 people (60,000 included in final results).
For up to 14 years.
State-funded.
Published in JAMA Psychiatry.
And it found clear risks:
Increased risk of cardiovascular disease
Increased risk of arterial disease
Increased risk of hypertension
Plus, a 4% increase in CVD risk for every year of medication use.
Compound that 4% year after year and, after a decade on the meds, your chance of developing cardiovascular disease is up by almost 50%. This is why constant monitoring of every single person taking ADHD meds is absolutely essential.
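If you want to sanity-check that "almost 50%" figure, here is a quick back-of-the-envelope sketch. To be clear, this is my own compounding of the study's 4%-per-year figure, not a calculation lifted from the paper itself:

```python
# Back-of-the-envelope compounding of the reported ~4% increase in CVD risk
# per additional year of ADHD medication use.
# Illustrative only: this is an extrapolation, not a figure quoted by the study.

annual_increase = 0.04  # 4% higher relative risk per extra year of use

for years in (1, 5, 10):
    cumulative = (1 + annual_increase) ** years - 1
    print(f"{years:>2} years on medication -> roughly {cumulative:.0%} higher relative risk")

# 10 years: (1.04 ** 10) - 1 is about 0.48, i.e. close to a 50% increase,
# which is where the figure above comes from.
```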
This Swedish state-funded study is so reliable because of its:
Huge sample size: 278,000 people, with 60,000 in the final analysis. The results are statistically strong.
Long follow-up time: Up to 14 years. This means they could track long-term effects, which is often missing in shorter studies.
Real-world data: They used national healthcare data, so it reflects everyday life, not the controlled environment of an industry funded trial.
Careful adjustments: They adjusted for many factors (like age, sex, other mental health conditions) to try and isolate the effect of ADHD medication itself.
The Southampton Study: A Convenient Design
Let’s now compare that to the University of Southampton study:
Meta-analysis of industry-funded randomised controlled trials (RCTs): RCTs are widely regarded as the gold standard of short-term research.
Large pooled sample size: Data from 102 trials with 22,702 participants.
Clear short-term measurements: Focused on blood pressure and heart rate.
On paper, it appears stronger than most studies. However, two flaws are fatal to its reliability: the time frame was far too short to assess long-term risks, and most of the source trials were funded by companies with a profit motive.
The Hidden Problem: Industry-Funded Source Trials
The Southampton study is a meta-analysis, which means it didn’t generate new data — it pooled results from 102 pre-existing RCTs. And guess who funded the vast majority of these trials?
Pharmaceutical companies.
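A quick aside on what "pooled" actually means here. A meta-analysis takes the effect estimate from each source trial and combines them into a single number, usually weighting each trial by how precise it is. The sketch below uses generic inverse-variance (fixed-effect) pooling with made-up numbers; it illustrates the general technique, not the Southampton authors' exact method or data.

```python
# A minimal, hypothetical illustration of fixed-effect (inverse-variance) pooling,
# the general idea behind combining many trials into one estimate in a meta-analysis.
# The numbers are invented; this is NOT the Southampton study's method or data.

trials = [
    # (mean change in systolic blood pressure in mmHg, standard error), one per trial
    (1.8, 0.6),
    (2.4, 0.9),
    (0.9, 0.5),
    (3.1, 1.2),
]

weights = [1 / se ** 2 for _, se in trials]  # more precise trials carry more weight
pooled = sum(w * effect for (effect, _), w in zip(trials, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5

print(f"Pooled effect: {pooled:.2f} mmHg (standard error {pooled_se:.2f})")

# However clever the pooling maths, the result inherits the limits of its inputs:
# pool 102 seven-week trials and you get a very precise seven-week answer.
```

The point of the sketch: a pooled estimate can only ever be as good as the trials feeding it. Which brings us back to who paid for those 102 source trials.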
Specifically, many of the source trials were funded by the companies selling ADHD medications:
Shire Pharmaceuticals (now part of Takeda), maker of Elvanse/Vyvanse.
Takeda Pharmaceuticals (now parent company of Shire).
Novartis, maker of Ritalin.
Janssen Pharmaceuticals, which manufactures Concerta.
Medice, another major player in the ADHD space.
This is standard practice in ADHD drug trials.
The majority of short-term RCTs in this field are industry-sponsored, primarily designed to secure drug approvals rather than investigate long-term safety.
These trials were never intended to ask tough questions about the risks of ADHD medications. Instead, they focus on short-term efficacy and immediate physiological markers like blood pressure and heart rate — just enough to clear regulatory hurdles and get the drug on the market.
As a result, the Southampton study is built on an evidence base almost entirely shaped by the pharmaceutical industry. It’s like asking a car manufacturer to certify its own vehicle safety tests.
This isn’t speculation — Cochrane reviews and independent analyses of ADHD trials have confirmed that industry sponsorship dominates this space, and the Southampton authors did not specify any exclusion of industry-funded trials.
Compare this to the Swedish study
Now let’s look at the Swedish study.
This research was completely different. It used real-world, observational data from national health registries, tracking people prescribed ADHD medication for up to 14 years.
And crucially, this study was funded by public bodies — including the Swedish Research Council and the European Union’s Horizon 2020 programme. The authors explicitly state:
“The funders had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.”
There were no pharmaceutical companies funding this work. No hidden hands. No quiet phone calls or behind-the-scenes nudges. This is proper, independent, public-interest research.
Even better, they used national registry data — meaning they weren’t selecting participants in a lab. They were analysing the actual health outcomes of hundreds of thousands of people living everyday lives, not a carefully selected, tightly monitored clinical trial population.
The result?
Long-term risk identified.
Real cardiovascular events measured — not just changes in blood pressure or heart rate.
Independence from pharmaceutical influence.
Transparent methods and open access publication.
It’s everything the Southampton study isn’t.
Why this matters
This matters because it isn’t just about funding sources. It’s about the independence of the minds interpreting the data. The Swedish researchers had no skin in the game. The Southampton team? Multiple direct financial relationships with the very industry whose products they are analysing — and they relied heavily on trials bankrolled by those same companies.
So when you see media outlets like The Guardian quoting this meta-analysis as proof that ADHD medications are safe, you now know exactly why that claim doesn’t hold up. And also why they decided not to link to the study or mention its name.
You know the difference between research designed to answer uncomfortable questions — and research designed to avoid them.
The stakes could not be higher. Millions of children and adults worldwide are prescribed ADHD medications, often for years on end, based on evidence that barely scratches the surface of long-term safety. We must demand better: rigorous, independent research that prioritises public health over pharmaceutical profits.