The inside story of how an Ivy League food scientist turned shoddy data into viral studies

But for years, Wansink’s inbox has been filled with chatter that, according to independent statisticians, amounts to blatant p-hacking.

“Pattern doesn’t look good,” Payne of New Mexico State wrote to Wansink and David Just, another Cornell professor, in April 2009, after what Payne called a “marathon” data-crunching session for an experiment about eating and TV-watching.

“I also ran — i am not kidding — 400 strategic mediation analyses to no avail…” Payne wrote. In other words, he tested 400 variables to find one that might explain the relationship between the experiment and the outcomes. “The last thing to try — but I shutter [sic] to think of it — is trying to mess around with the mood variables. Ideas…suggestions?”

Two days later, Payne was back with promising news: By focusing on the relationship between two variables in particular, he wrote, “we get exactly what we need.” (The study does not appear to have been published.)

“That’s p-hacking on steroids,” said Kristin Sainani, an associate professor of health research and policy at Stanford University. “They’re running every possible combination of variables, essentially, to see if anything will come up significant.”
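Why testing hundreds of variables guarantees spurious "findings" is easy to see with a simulation. The sketch below (a hypothetical illustration, not a reconstruction of the actual analyses) generates an outcome that is pure noise, then correlates it against 400 equally meaningless candidate variables. At the conventional p < 0.05 threshold, roughly 5% of them — about 20 — will look "significant" by chance alone:

```python
import math
import random

random.seed(42)

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

N = 30        # participants in a hypothetical study
TRIALS = 400  # candidate variables tested, as in Payne's email
# Critical |r| for two-sided p < 0.05 with df = N - 2,
# from the standard t critical value t(0.975, 28) ≈ 2.048.
T_CRIT = 2.048
R_CRIT = T_CRIT / math.sqrt(N - 2 + T_CRIT**2)

outcome = [random.gauss(0, 1) for _ in range(N)]  # pure noise
hits = 0
for _ in range(TRIALS):
    candidate = [random.gauss(0, 1) for _ in range(N)]  # also pure noise
    if abs(pearson_r(outcome, candidate)) > R_CRIT:
        hits += 1

print(f"{hits} of {TRIALS} noise variables look 'significant' at p < 0.05")
```

Run enough tests on random data and a handful will always cross the significance line; reporting only those hits, without disclosing the hundreds of misses, is what makes the practice unsound.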

I liked this quote:

“He’s so brazen about it, I can’t tell if he’s just bad at statistical thinking, or he knows that what he’s doing is scientifically unsound but he goes ahead anyway.”