The MAHA report on ADHD is misleading — a researcher explains why
On this bonus episode of Hyperfocus, we dive into the controversial new “Make America Healthy Again” (MAHA) report, released by a commission led by Robert F. Kennedy Jr. The 73-page document claims to explain what it calls a rise in “childhood chronic diseases” like ADHD, by pointing fingers at antibiotics, food dyes, and even a lack of outdoor play. But there’s a major catch: some of the report’s sources don’t actually exist.
In the first of a two-part series, Dr. KJ Wynne — a Harvard-trained population health researcher — joins the conversation to unpack how the report was assembled and where it falls short, particularly regarding ADHD. From debunking flawed claims to clarifying what real research actually says, this episode sets the record straight.
We love hearing from our listeners! Email us at hyperfocus@understood.org.
Timestamps
(02:40) Questionable methodology and AI hallucinations
(06:26) Are more people developing ADHD, or are we getting better at diagnosing?
(10:04) Do antibiotics really increase the risk of ADHD?
(17:02) Parsing through big claims on stimulant medications
(26:00) Do stimulants cause height loss?
Episode transcript
Rae: The new MAHA report has a lot to say about ADHD and childhood chronic disease. I spoke to a researcher who could help me sort out what's real, what's not, and what was completely made up by AI.
On May 22, the Make America Healthy Again Commission, led by Robert F. Kennedy Jr., released its promised “Make America Healthy Again” report on, quote, “the childhood chronic disease crisis,” unquote. We first talked about the MAHA commission back in February, when the president signed the executive order that created it. The first task of that commission was to report back within 100 days with an explanation of what's causing all these reported increases in rates of conditions like diabetes, autism, and yes, ADHD.
It's a long report, 73 pages in all, but we read it. Every word. And then we read it again, after news surfaced last week that AI was used to create the report and that multiple citations were incorrect or just straight-up didn't exist. After multiple reviews, though, we still felt it was important to talk about this report and about what it has to say about ADHD, because ADHD is mentioned about 30 times. And it's often talked about in contradictory ways, from questioning whether ADHD exists at all to suggesting that antibiotics can cause it. So, we asked two of our colleagues to talk us through what the report has to say about ADHD and to help us understand what the research actually tells us.
There's a lot to talk about, so we're breaking our report on the report into two episodes. On this first one, you'll hear from Dr. KJ Wynne, Understood Senior Manager of Interdisciplinary Research. KJ has a PhD from Harvard, where she studied population health sciences and focused on understanding how social factors can impact mental health expression. KJ's here today to help us understand, from a research perspective, the methods used to create the report and whether they were sound or not. I'm Rae Jacobson, and today on "Hyperfocus," Dr. KJ Wynne joins us to help us understand the MAHA report, how it was made, and what it has to say about ADHD.
So, KJ, let's dive in. But first, I want to say that if you're watching this on YouTube, you might notice that KJ is using her phone as a mic. That's because she's talking to us from Boston, where she's visiting her alma mater, and we couldn't get her a mic quickly enough. So, needs must, as they say. So, KJ, my first question is, you have a PhD in research, and from a research perspective, can you tell me a little bit about your thoughts on the methods used to create the report? Were they sound?
(02:40) Questionable methodology and AI hallucinations
KJ: From my professional experience, I'm going to say no, the methods used were not sound. There were numerous problems with this report, ranging from the use of AI to the studies that were included and what can be said about the studies they chose. And all of that combined makes the report very difficult to take seriously. So, this story actually came out of the Washington Post, where they interviewed some experts in AI. And those experts agreed that it appeared this report was generated by AI, at least in some parts. And that can mean two things. The first is that AI wrote some parts of this report. And the second is that citations were created by AI to support some of the claims within the report.
And I'm particularly concerned about using AI to justify claims, because we know that it can do what's called hallucinating, where it makes up information in order to satisfy the question the user is asking. In addition to that, whenever they were pulling from studies that were published in the peer-reviewed literature, those studies were not the best studies for the claims they were making.
Rae: So, another thing that was concerning to me, or that I'm interested in, is how the studies that did exist were chosen. Was that process something that made sense to you as a researcher?
KJ: No, it wasn't a process that made sense to me as a researcher, for a few different reasons. The first is that they were not consistently using high-quality studies, meaning studies that are very rigorously and carefully designed to make sure that whatever question a researcher sets out to answer, we can actually answer it. And that concerned me deeply. The next is that these studies were not always generalizable. Throughout the MAHA report, there was consistent use of community samples to talk about population-level outcomes. And what that means is that a study sampled a few people, like 200 or 300 people, as opposed to looking at thousands of people spread across the United States to make sure that its findings are true for the broader United States.
And while community samples are wonderful and do push research forward, if we're trying to make a report about the broader United States population, then we should be looking at data that reflects the broader United States population. And the final problem is putting that research in context. There's never only one study on any topic in research; there are tons of different researchers studying the same thing and asking similar questions. And so, we contextualize our findings to be in conversation with other researchers who are studying the same thing. That helps us get closer to the truth, because we're all asking and answering these questions from different angles. And so, when we don't contextualize our findings, then we're not getting closer to the truth.
Rae: Got it. So, a lot of the research that was used, it was either not generalizable, it was out of context, or it maybe wasn't a study that was designed in a rigorous way. A lot of things that were present in the report just didn't answer the questions that were being asked.
KJ: Yes, and there's a term for this. It's called cherry picking.
Rae: Yep. That one I've heard. All right, well, then I want to ask some more questions about this because there was a lot in the report, and some of the claims were pretty striking, especially as a person with ADHD.
(06:26) Are more people developing ADHD, or are we getting better at diagnosing?
And one of the major claims that started it out was that rates of ADHD are increasing exponentially. And I guess my question to you is, are they? And if they are, why? Are more people actually developing ADHD or other neurodevelopmental disorders, which is what it sort of implied, or are we just getting better at diagnosing them?
KJ: When I reviewed the literature, my takeaway is that, yes, the rates of ADHD diagnosis are increasing, but I believe that's from a greater willingness to seek help. We see a decrease in stigma and improved diagnostic tools. That means we're able to find people better and understand what ADHD looks like, and all of that creates more awareness. In particular, I'm really passionate about the increased understanding of what ADHD looks like in women and people of color. And there's actually a study that looked at this and said, “One of the reasons we're seeing this increase in ADHD diagnosis is because more women and people of color are getting diagnosed with ADHD.”
A way I like to think about increases in awareness and increases in diagnosis is this: there have always been people who have ADHD. They just never got counted. And now, when we expand our inclusivity and we better understand what this looks like in different people, and we do count them, of course we expect the number to go up. And that is a really good thing, because we're being more inclusive, and we're understanding how ADHD looks different in different people.
Rae: And that makes a lot of sense to me. It's not that there is a sudden burst of new people developing ADHD, it's that people who have had ADHD for a long time and gone unseen and uncounted and unhelped are finally getting diagnosed and hopefully getting the treatment that they need.
KJ: Hopefully they are.
Rae: Whenever ADHD comes up, in my experience, medication for ADHD also comes up. And that definitely happens in the report. It says, quote, “Stimulant prescriptions for ADHD in the U.S. increased 250% from 2006 to 2016 despite evidence they did not improve outcomes long term,” unquote. I've never heard that before. Is it true that stimulants don't improve long-term outcomes?
KJ: It's certainly not the full story that stimulants don't improve long-term outcomes. It's actually a very tricky question to ask and answer, and that's for a few different reasons. The first is that longitudinal studies are very, very difficult to undertake. They require a lot of time, a lot of resources, and following people over years. When we think about longitudinal studies in the context of ADHD, it becomes even more complicated. This is because people may not take their medication as prescribed, or people who were not taking medication, who are in your control group, start taking the medication. And so, it makes it hard to have true control and comparison groups. And it's unethical to deny people medication.
But I will say what we absolutely do know from the short-term studies, which we have been able to replicate over and over again, is that the evidence is overwhelmingly positive that ADHD medication improves short-term outcomes.
Rae: That's fascinating, because as someone who takes ADHD medication, I don't take it consistently. And a lot of other people I know as well do the same. So, I could see how that would be a really hard thing to study.
(10:04) Do antibiotics really increase the risk of ADHD?
So, on the same topic of medication, but a very different medication, another thing that stood out to me in the report was a section where they mentioned that antibiotics cause children to be more likely to develop ADHD. Specifically, they say that kids who take antibiotics in the first two years of life are more likely to develop ADHD. I have a seven-year-old, she took antibiotics for ear infections, every kid I know took antibiotics for something when they were in their babyhood. Is this true? Do antibiotics really increase the risk of ADHD?
KJ: So, based on the current global evidence, I would say no, there's no conclusive evidence that antibiotics increase the risk of ADHD. And I would say this because, first, I investigated the study that the MAHA assessment cited to support their claim that they do. And when I looked at that study, while it did find an association between antibiotic use in the first two years of life and ADHD, the study was conducted in a population of people born only in Minnesota. And so, we cannot generalize that to the broader United States. That's to say, things in Minnesota like the weather, or the food people eat, or the schools they go through can all be very different from those of people born in California, New York, Texas, or Georgia.
And so again, we would need generalizable data, information from people who live in those different states, to make a claim about the broader United States. The next step was to look out into the broader literature to understand what has been said about antibiotics and ADHD. I looked at some studies from around the globe that asked the exact same question, and that led me to believe that the results are not compelling. First, when I looked at a study of over two million children in Taiwan that asked the exact question of whether antibiotic exposure in the first two years of life is associated with ADHD, the authors concluded that it was very marginally associated, but not nearly significant enough to be a reason to question an ADHD diagnosis or treatment.
And then another study, out of Sweden, looked at twins to understand how the environment and shared genetics may play a role. It concluded that it was probably genetics and the environment that were confounding, or blurring, the relationship between antibiotics and ADHD, leading us to believe that there might be a relationship when really there's not. One of the studies that I looked at was the one conducted in Taiwan just recently, in 2024.
That study started with the population of Taiwanese individuals, so over two million children were included in the original study. From there, they looked to see which ones met the study criteria of having received an antibiotic within the first two years of life and having a later ADHD diagnosis or not, and they set their time frame from 2023 to 2025. They looked at a large population of people, and they also looked over a long period of time. And what they found, when they asked the question of whether antibiotic exposure in the first two years of life is associated with ADHD, was that yes, it's associated, but the association is very marginal. It was a very small increased risk. And that small increased risk was nothing compared to the risk of not taking antibiotics.
And so, I agree with the authors of this study that the associations that have been found in the past between antibiotics and ADHD are marginal and should never be used as a reason to discount an ADHD diagnosis or treatment plan.
Rae: Or to tell someone not to use antibiotics.
KJ: Or to tell someone not to use antibiotics.
Rae: So, speaking of parenting, though, there was another thing in here that I have questions about and wasn't really sure what to make of. The report says that single-parent households are associated with worse mental health outcomes in teens and lead to triple the rates of externalizing disorders, which would be things like ADHD or conduct disorder. Is that what the research says?
KJ: Yeah, this was a deeply misleading statistic that really alarmed me when I read it.
Rae: Yeah.
KJ: There were two parts to this claim that needed to be verified. The first is that single-parent families are associated with worse mental health outcomes. When I went to the source that the MAHA assessment cited as evidence for this claim, again, it was that same problem of a lack of generalizability. The sample in that study was 154 poor people in a clinical setting in Illinois. There's no way 154 people can give you enough information, or be diverse enough, to represent the broader United States.
Rae: It says a clinical setting. What does that mean?
KJ: So, a clinical setting means people who have gone in to receive treatment. A clinical setting can be a hospital, a clinic, anything like that. Usually, clinical populations look very different from the broader community, because these are people who have self-identified, or have previously been identified by a doctor, as needing some type of mental health resources. And so, of course, there's going to be a higher prevalence of a mental health condition in a mental health clinic than in the broader community. That's just how it works. And we're happy it's a higher prevalence; it means people are getting help. And so, that citation does not provide the evidence for that claim.
Rae: I know many, many kids who are raised in single-parent households with so much love and support, and I was a little surprised to see that called out as a structural challenge.
KJ: Yeah. And so, when I did look into part two of what was cited, the part that said “worse mental health outcomes” and “twice the rate of internalizing disorders, triple the rate of externalizing disorders,” and went to cross-reference it, it became an issue of checking your citations, because the MAHA report cited a paper, and that paper cited another paper. So, when I went to see the original source, I could not verify that statistic at all within that source.
(17:02) Parsing through big claims on stimulant medications
Rae: All right, so this next bit, I'm going to split my questions about this quote into two, and you'll see why when I read it. Quote, "Stimulant prescription drugs used to treat ADHD in the U.S. doubled from 2006 to 2016. By 2022, 11% of children had an ADHD diagnosis with boys having a rate of nearly one in four by age 17." And that is a confusing quote to read and to read aloud, but it brought me to two separate questions. So, for the first part, is it true that stimulant prescriptions have increased?
KJ: This increase cannot be attributed to treating ADHD alone. In fact, when I looked at the study that was cited by the MAHA assessment, the authors of that study talk about this themselves in the discussion section. First, they list several off-label uses for prescription stimulants, such as treating obesity or narcolepsy, and doctors might prescribe these medications for those reasons. Next, in situations where access to other interventions, like parent training or behavioral support in the classroom, isn't available to families, stimulants may not only be the first line of treatment, they may be the only line of treatment. And so, saying that something increased without contextualizing it doesn't give you a lot of information.
Rae: So, it's back to what you were saying at the beginning, about how something can give you one piece of information, but without context, you really can't fully understand what it's trying to say.
KJ: You can't, and that reminds me, it also ties back to what we said at the beginning about the rise in ADHD diagnoses, which reflects a more inclusive environment and greater recognition in women and people of color. If we see that ADHD diagnoses are rising, we should also expect and want to see that the medication used to manage ADHD symptoms is rising as well. That is a reflection of people getting the help that they need to live full lives that bring them joy and happiness. And so, again, just saying it one more time, just because we see an increase or a change does not mean that harm is happening.
Rae: I really appreciate that context, a word that I'm getting to feel is perhaps the most important word that we can use in this situation.
KJ: I think so too. Context and generalizable.
Rae: Context and generalizable. I'm going to get, like, two knuckle tattoos: context, generalizable. I don't have enough fingers for either one of those, I just realized. All right, so let's talk about the second part of this somewhat confusing quote: boys. The second part specifically mentions that one in four boys receives an ADHD diagnosis before the age of 17. That feels like a lot.
KJ: Yeah, it is a lot. And so, first, let's just take it head-on before even thinking about how it relates to the rest of this report. So no, it's not true that one in four boys has an ADHD diagnosis. And again, when I went into the MAHA assessment and went to the study that they themselves cited, that study found that about 14.7% of boys have an ADHD diagnosis, and that translates to about one in seven boys.
Rae: And that, from what I know about ADHD diagnosis rates, is pretty consistent with what it's been for quite a while, right?
KJ: Yeah, I would say it's pretty consistent. I would also say that it's like in the range that we expect to see for ADHD diagnoses.
Rae: OK. So, I'm probably going to have to have you explain this to me very slowly and very thoroughly, because I have dyscalculia and this does involve math. But the report also suggests that ADHD medication is prescribed 2.5 times more often to kids in the U.S. compared to children in England, and 19 times more than to kids in Japan. Is this true? Is it reasonable to compare American kids to British or Japanese kids?
KJ: Yeah, this is a really, really great question. And when I went to investigate this, the first thing I did was go into the MAHA report, which has been my first step the entire time, and look to see what studies they cite. So, let's start with the UK study. I read through it, and it was a study about ADHD prescribing in the UK. But the claim in the MAHA report is that the medication is prescribed 2.5 times more in the U.S. compared to in the UK.
Rae: Got it.
KJ: Are you following me?
Rae: Yes.
KJ: OK, great. So, when I went into the UK study, what I wanted to see was that they calculated this: they looked at how much it is being prescribed in the U.S., how much it is being prescribed in the UK, and then they conducted a statistical test whose output was 2.5 times. I wanted to see the math.
Rae: Yes.
KJ: I wanted the math, I wanted to see the work. And that's what we expect whenever any scientist reports that something is 2.0 times greater or 2.4 times lower; it is always “show me your math.” And the math was not there. In fact, when I reviewed it, the authors of the UK paper never even mentioned the United States when they were conducting any of their statistical analyses or mathematical tests. Instead, they got to all of their answers, which were solely about the UK, and then, when they wanted to contextualize their findings, they said, “OK, now that we know all of this about the UK, let's step back and see how this fits into the broader global context.” And when they stepped back, what they found, which was their finding, is that prescribing is increasing in the UK.
That's what they set out to study, and that's what they found. When they contextualized it, what they said is, you know what? That makes a lot of sense, because prescribing is increasing in the U.S., and prescribing is increasing in the UK. The figure of 2.5 times was not reported in this study, which leads me to believe it was a back-of-the-envelope calculation or an AI calculation that was not verified statistically or peer reviewed. And none of us can see the math for that. There is no existing math.
Rae: Got it. So, it's something that the authors of the report are reporting, not something that was present in the study.
KJ: Yes.
Rae: See, if you had taught my statistics class in grad school, I might have done better.
KJ: Ha ha!
Rae: So, I want to take the other part, which is that they mention Japan. And I'm very interested, since you talk about generalizability: is it reasonable to compare kids in Japan and kids in the U.S. in terms of prescriptions, or even the way that things are prescribed in two very different countries?
KJ: Yeah, I think the Japan example is an excellent example of what it's like to compare across a global context, and of how researchers do it very carefully, as the authors of the Japanese study did. So, again, the report claimed that these medications were prescribed 19 times more in the U.S. compared to in Japan. And again, the authors of the Japanese study did not run any tests. They do report on the fact that there are more prescriptions in the U.S., but they're not testing for it. And so, that's not a claim that I would say the scientists who wrote that paper are comfortable making.
But more than that, when you read their discussion, where they talk about, “Well, why are we prescribing fewer medications in Japan compared to the United States?,” the authors are very clear that, in the Japanese context, a stimulant medication and a non-stimulant medication are both listed as the first line of defense. And so, Japanese physicians may prefer the non-stimulant medications because there are no restrictions around them.
Rae: I see, so you don't have to get approval to prescribe the non-stimulant medication, but you do to prescribe a stimulant, so it's just a lot easier to prescribe non-stimulants in Japan.
KJ: Yes, that's my understanding of their research: that it would be a lot easier to prescribe the non-stimulant in the Japanese context.
(26:00) Do stimulants cause height loss?
Rae: I do want to ask about this one thing, though, which is that the report says stimulants cause long-term height loss, specifically one whole inch on average. And I've heard before that stimulants can restrict growth; I've even heard that about coffee. But an inch seems pretty significant. Could you tell me what's going on there?
KJ: Yes, this was perhaps one of the most interesting claims to dig into. When I went into the wider research ecosystem, there's been a flurry of research being published about ADHD and height. When you look at over 100 studies and take the association between ADHD and height at face value, you will see that there are reports of shorter height for people who are on ADHD stimulants. But if you put that in the context of the benefits those same individuals are reaping from using the medications, it's really not even a comparison. And I have to agree on that. I would rather take the benefits of fewer mood disorders, fewer car crashes, and less suicide any day than a potential risk of up to one inch in height loss.
Rae: Well, then I also want to know: is an inch really likely?
KJ: So, again, let's use the other form of context and actually look at the studies that have examined the association between ADHD and height. When we dig into those studies, the findings are very conflicting. First, there are some studies that report an association between ADHD and a shorter height. Then there are other studies that report that people with ADHD, if they stop taking their medication, will grow an extra few centimeters and make up that potential inch. And I'm saying a potential inch because there's no real evidence for what that exact amount is; it differs based on the studies you examine.
And then finally, there are studies that go in a very different direction and think about it genetically, saying, “Well, maybe people who have ADHD have a predisposition for a shorter height.” And when those studies look, and there are some really, really rigorous studies that have thought about it this way, what they find is that people who have ADHD were shorter before they started stimulants, and they were shorter after they took stimulants. All of that to say: the evidence is very conflicting. And so, I don't feel like the science is there to conclusively say that stimulants are the cause of shorter height in people with ADHD, even if there is an association of losing up to a potential inch of growth.
Rae: Thank you, and I feel like I understand that a lot more now. So, I have a final question for you as a researcher, as someone who works in this field and thinks very thoroughly and very frequently about ADHD and about the research surrounding it. This is probably one of the biggest and most impactful pieces of information from the government about children's mental health and learning that we're going to see for a while. And as we've discussed in great depth now, it's pretty flawed. But from your perspective, if you had written this report, if you had the chance to put out the KJ report, shall we say, on children's mental health and learning, how would you have liked it to be conducted? What would you like to see?
KJ: I think first, I would ground myself in what a report like this should do. And at its heart, a report like this should ground us in our prevention strategy, our assessment strategy, how we're going to monitor, how we're going to evaluate whether we're successful, and where we're missing research. So, I would have grounded myself in that first: What am I doing this report for? And then after that, I would have collaborated and consulted with some of the best experts on ADHD and other childhood conditions, if we think about the broader report, and had their input, had them weigh in and collaborate, and had them write the sections.
We see this all the time, where multiple researchers come together for reports and write different sections. And then, after having all of this expert input, I would have stepped back and said, “OK, what can we say? Where did we find a common cause? Where can we intervene? How can we improve our monitoring? And how can we get there?” But it would never have been designed in a way where we started with AI, didn't consult and collaborate with experts, and didn't use the most high-quality, generalizable data available. That, to me, is a no-go.
Rae: Thank you so much, Dr. KJ Wynne, for coming on the show today, for giving us all of your vast and significant expertise, for doing it through your phone, and for being so game to explain to us this sometimes very confusing report.
KJ: Yeah, thank you, Rae, so much for having me. It's been a real pleasure.
Rae: "Hyperfocus" is made by me, Rae Jacobson, and Cody Nelson. Video is produced by Calvin Knie and edited by Christophe Manuel. Fact-checking by Mary Mathis. Our music comes from Blue Dot Sessions, Samiah Adams is our supervising producer, Briana Berry is our production director, and Neil Drumming is our editorial director. If you have any questions for us or ideas for future episodes, write me an email or send a voice memo to hyperfocus@understood.org.
This show is brought to you by Understood.org. Our executive directors are Laura Key, Scott Cocchiere, and Jordan Davidson. Understood is a nonprofit organization dedicated to empowering people with learning and thinking differences like ADHD and dyslexia. If you want to help us continue this work, you can donate at Understood.org/give.
Host

Rae Jacobson, MS
is the lead of insight at Understood and host of the podcast “Hyperfocus with Rae Jacobson.”