Researchers at the University of Washington have trained an AI system that can estimate obesity prevalence from space. The system uses a convolutional neural network (CNN) to analyze roughly 150,000 satellite images, looking for correlations between the physical makeup of an area and the prevalence of obesity there.

The research team presented its results in JAMA Network Open. The features of a given area could explain almost 65% of the variance in obesity prevalence. The researchers concluded that analyzing satellite images could help uncover links between obesity prevalence and the built environment.

How the AI was trained to spot obesity

Before the AI could analyze 150,000 satellite images of Los Angeles, Seattle, Bellevue, Tacoma, San Antonio, and Memphis, the researchers had to train it. The CNN was pre-trained on 1.2 million images from the ImageNet database. The features it extracted from the satellite images were then combined with data on adult obesity prevalence obtained from the 500 Cities Project. Regression models were used to quantify the association between these features and obesity prevalence across census tracts.
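The paper's exact pipeline is not reproduced here, but the general approach described above can be sketched as follows. All names, dimensions, and data in this sketch are illustrative assumptions: random numbers stand in for the features a pretrained CNN would extract from satellite tiles, and plain least squares stands in for the regression model.

```python
# Sketch of the general approach: CNN-derived features per census tract,
# a regression fit against obesity prevalence, and "variance explained"
# (R^2) as the evaluation metric. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(0)

n_tracts, n_features = 500, 64  # hypothetical counts, not from the paper

# Stand-in for features a pretrained CNN (e.g. trained on ImageNet)
# would extract from satellite images of each census tract.
cnn_features = rng.normal(size=(n_tracts, n_features))

# Simulate an obesity rate partly driven by the features, plus noise.
true_weights = rng.normal(size=n_features)
obesity_rate = cnn_features @ true_weights + rng.normal(scale=4.0, size=n_tracts)

# Fit an ordinary least-squares regression with an intercept term.
X = np.column_stack([np.ones(n_tracts), cnn_features])
coef, *_ = np.linalg.lstsq(X, obesity_rate, rcond=None)
predicted = X @ coef

# "Variance explained" is the R^2 of the fit - the kind of statistic
# the study reports per city.
ss_res = np.sum((obesity_rate - predicted) ** 2)
ss_tot = np.sum((obesity_rate - obesity_rate.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"variance explained (R^2): {r_squared:.3f}")
```

A more faithful reproduction would swap the synthetic features for activations from an actual pretrained network and the least-squares fit for a regularized regression, but the structure of the analysis stays the same.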


The AI was able to identify whether an area contained features associated with a higher chance of obesity. Among the features it consistently picked out were crowded housing, homes close to roadways, and areas lacking greenery.

Accuracy of the AI

The accuracy of the AI varied from city to city: the lowest score was in Seattle, at 55.8%, and the highest was in Memphis, at 73.3%.

The University of Washington team stressed that physical surroundings are not the only factor at play; socioeconomic factors also contribute heavily to obesity prevalence in a given area. Still, the study shows that man-made features of an area clearly correlate with obesity prevalence.