Some months ago, I participated in a two-week experiment that involved using a smartphone app to track every morsel of food I ate, every beverage I drank and every medication I took, as well as how much I slept and exercised. I wore a sensor that monitored my blood-glucose levels, and I sent in a sample of my stool for an assessment of my gut microbiome. All of my data, amassed with similar input from more than a thousand other people, was analyzed by artificial intelligence to create a personalized diet algorithm. The point was to find out what kind of food I should be eating to live a longer and healthier life.
The results? In the sweets category: Cheesecake was given an A grade, but whole-wheat fig bars were a C-. In fruits: Strawberries were an A+ for me, but grapefruit a C. In legumes: Mixed nuts were an A+, but veggie burgers a C. Needless to say, it didn’t match what I thought I knew about healthy eating.
It turns out, despite decades of diet fads and government-issued food pyramids, we know surprisingly little about the science of nutrition. It is very hard to do high-quality randomized trials: They require people to adhere to a diet for years before there can be any assessment of significant health outcomes. The largest ever — which found that the “Mediterranean diet” lowered the risk for heart attacks and strokes — had to be retracted and republished with softened conclusions. Most studies are observational, relying on food diaries or the shaky memories of participants. There are many such studies, some assessing over a hundred thousand people for their consumption of carbohydrates, fiber, salt or artificial sweeteners, and the best we can say is that there might be an association, not anything about cause and effect. Perhaps not surprisingly, these studies have serially contradicted one another. Meanwhile, the field has been undermined by the food industry, which tries to exert influence over the research it funds.
Now the central flaw in the whole premise is becoming clear: the idea that there is one optimal diet for all people.
Only recently, with the ability to analyze large data sets using artificial intelligence, have we learned how simplistic and naïve the assumption of a universal diet is. It is both biologically and physiologically implausible: It contradicts the remarkable heterogeneity of human metabolism, microbiome and environment, to name just a few of the dimensions that make each of us unique. A good diet, it turns out, has to be individualized.
We’re still a long way from knowing what this means in practice, however. A number of companies have been marketing “nutrigenomics,” or the idea that a DNA test can provide guidance for what foods you should eat. For a fee, they’ll sample your saliva and provide a rudimentary panel of some of the letters of your genome, but they don’t have the data to back their theory up.
Coming up with a truly personalized diet would require crunching billions of pieces of data about each person. In addition to analyzing the 40 trillion bacteria from about 1,000 species that reside in our guts, as the project I participated in did, it would need to take into account all of the aspects of that person’s health, including lifestyle, family history, medical conditions, immune system, anatomy, physiology, medications and environment. This would require developing an artificial intelligence more sophisticated than anything yet on the market.
The first major development in this field occurred a few years ago when Eran Segal, Eran Elinav and their colleagues at the Weizmann Institute of Science in Israel published in the journal Cell a landmark paper titled “Personalized Nutrition by Prediction of Glycemic Responses.”
Spikes in blood-glucose levels in response to eating are thought to be an indicator of diabetes risk, although we don’t know yet if avoiding them changes that risk. These spikes are only one signature for our individualized response to food. But they represent the first objective proof that we do indeed respond quite differently to eating the same foods in the same amounts.
The study included 800 people without diabetes. The data for each person included the time of each meal, food and beverage amount and content, physical activity, height, weight and sleep. The participants had their blood sampled, their gut microbiomes assessed and their blood glucose monitored for a week. They ate more than 5,000 standardized meals provided by the researchers, which contained popular items like chocolate and ice cream, as well as nearly 47,000 meals that consisted of their usual food intake. In total, there were more than 1.5 million glucose measurements made. That’s a big data set.
Using machine learning, a subtype of artificial intelligence, the billions of data points were analyzed to see what drove the glucose response to specific foods for each individual. In that way, an algorithm was built without the biases of the scientists.
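The core idea, learning each person’s food response from his or her own logged data, can be sketched in miniature. Everything below is invented for illustration; the study’s actual algorithm weighed more than a hundred factors with far richer models, not a single feature with a straight-line fit.

```python
# Toy sketch (all numbers hypothetical): fit a per-person model that
# predicts post-meal glucose rise from one meal feature, carbohydrate
# content, using ordinary least squares. The point is only to show
# "learn each individual's response from his or her own measurements."

def fit_line(xs, ys):
    """Ordinary least squares for y ≈ intercept + slope * x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

# One participant's logged meals: grams of carbs -> measured glucose rise (mg/dL).
carbs = [20, 50, 80, 35, 65]
rise = [18.0, 42.0, 66.0, 30.0, 54.0]  # synthetic data: rise = 0.8 * carbs + 2

intercept, slope = fit_line(carbs, rise)
# Predict this person's response to a hypothetical 70-gram-carb meal.
print(round(intercept + slope * 70, 1))  # -> 58.0
```

A different participant, with different gut bacteria, would yield different fitted coefficients from the same meals, which is the heterogeneity the study quantified.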
More than a hundred factors were found to be involved in glycemic response, but, notably, food wasn’t the key determinant. Instead, it was the gut bacteria. Here were two simultaneous firsts in nutritional science: one, the discovery that our gut microbiome plays such a big role in our unique response to food intake, and the other that this discovery was made possible by A.I. The journal ran an accompanying editorial titled “Siri, What Should I Eat?”
Several subsequent studies by these researchers and others have confirmed not only our microbiome’s importance but also that a substantial proportion of healthy people have high glucose levels after eating. My curiosity about this led me to approach Dr. Segal and Dr. Elinav to ask if they would test me.
A few weeks later, my data had been ingested by their machine-learning algorithm. It turned out that my gut microbiome was densely populated by one particular bugger — Bacteroides stercoris, accounting for 27 percent of my co-inhabitants (compared with its average of less than 2 percent in the general population). I had several glucose spikes as high as 160 milligrams per deciliter of blood (normal fasting glucose levels are less than 100, but we don’t yet know what level is normal after eating).
I was then provided with a set of specific food recommendations in order to avoid glucose spikes, including that information on cheesecake and mixed nuts, and a searchable database of glucose predictions for 100,000 foods and beverages.
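A purely hypothetical sketch of how such a food database might assign grades: map each food’s predicted glucose rise for one individual onto a letter scale. The thresholds and per-food predictions below are invented, not taken from the actual test.

```python
# Hypothetical grading scheme: convert a person-specific predicted
# post-meal glucose rise (mg/dL) into a letter grade. All cutoffs and
# food predictions are made up for illustration.

def grade(predicted_rise_mg_dl):
    """Smaller predicted spikes earn better grades."""
    if predicted_rise_mg_dl < 15:
        return "A+"
    if predicted_rise_mg_dl < 25:
        return "A"
    if predicted_rise_mg_dl < 40:
        return "B"
    return "C"

# Invented predictions for one person, echoing the article's surprises:
predictions = {"cheesecake": 22, "mixed nuts": 9, "oatmeal": 48}

for food, rise in predictions.items():
    print(food, grade(rise))  # e.g. cheesecake A, mixed nuts A+, oatmeal C
```

The same lookup run on another person’s model would return different grades for the same foods, which is the whole premise of individualized diets.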
That sounds great, but I realized I had a big problem. For the most part the highly recommended foods, like cheese danishes, were ones I really disliked, while those rated C-, like oatmeal, melon and baked squash, were typically among my favorites. Bratwurst (the worst and potentially most lethal kind of food in my perception) was rated an A+! If I wanted to avoid glucose spikes, I’d have to make some pretty big sacrifices in my diet.
Nevertheless, it was an interesting first step on the path to a personalized diet. There is now a commercial version of this test, based on the research of Dr. Segal and Dr. Elinav, though it is much more limited: It only analyzes a gut microbiome sample, without monitoring glucose or what you eat.
There are other efforts underway in the field as well. In some continuing nutrition studies, smartphone photos of participants’ plates of food are being processed by deep learning, another subtype of A.I., to accurately determine what they are eating. This avoids the hassle of manually logging the data and the use of unreliable food diaries (as long as participants remember to take the picture).
But that is a single type of data. What we really need to do is pull in multiple types of data — activity, sleep, level of stress, medications, genome, microbiome and glucose — from multiple devices, like skin patches and smartwatches. With advanced algorithms, this is eminently doable. In the next few years, you could have a virtual health coach that uses deep learning on your relevant health metrics and provides you with customized dietary recommendations.
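What “pulling in multiple types of data” could look like is sketched below. Every stream name and rule here is speculative; a real coach would feed the merged profile into a trained model rather than hand-written checks.

```python
# Speculative sketch: merge feature streams from several devices into one
# profile for a hypothetical virtual health coach. All fields are invented.

def merge_streams(*streams):
    """Combine feature dicts from different devices into one profile."""
    profile = {}
    for stream in streams:
        profile.update(stream)
    return profile

watch = {"sleep_hours": 5.5, "steps": 3000}     # from a smartwatch
patch = {"avg_glucose_mg_dl": 112}              # from a skin patch sensor
survey = {"stress_level": "high"}               # self-reported

profile = merge_streams(watch, patch, survey)

# Stand-in for a model's output: sleep was one of the inputs tracked in
# the study described above, so flag an unusually short night.
if profile["sleep_hours"] < 6:
    print("flag: short sleep logged; glucose predictions may shift")
```

The merge step is trivial; the hard, unsolved part the article points to is the model that turns such a profile into trustworthy dietary advice.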
The benefits of such a coach will, of course, have to be validated by randomized trials, unlike the myriad diets that are being hawked without any proof that they are effective or even safe.
We don’t often think of a diet as being unsafe, but the wrong foods can be dangerous for people with certain risks or conditions. I’ve had two bouts of kidney stones. To avoid a third, I need to stay away from foods high in oxalate, a naturally occurring molecule abundant in plants. But many of the recommendations in my personalized diet — like nuts and strawberries — are high in oxalate. That’s a big miscue, because my pre-existing medical conditions were not among the test’s inputs. And as we undergo significant changes through our lives, like pregnancy or aging, we’ll need reassessments of what our optimal diet should be.
For now, it’s striking that it took big data and A.I. to reboot our perceptions about something as fundamental as what we eat. We’re still a ways away from “You Paleo, me Keto,” but at least we’re finally making progress, learning that there is no such thing as a universal diet.