Deep Learning to Spot Obesity from Satellite Images

Does your environment say anything about your physical health? What does the built environment around your house look like? Fast-food outlets, shopping malls, movie theaters, and the like say a lot about obesity prevalence in your locality. The availability of resources near your home, such as activity centers, jogging tracks, gyms, parks, swimming pools, and recreational spaces, can greatly influence your health.

Yes, you heard that right. Researchers from the University of Washington have used Artificial Intelligence (AI) to estimate obesity prevalence in U.S. cities.


Obesity is a serious health issue. Obese people are at higher risk of medical conditions such as high blood pressure, heart attack, and diabetes. People's surroundings strongly affect their health, and some familiar observations form the basis of the built environment's impact on obesity: open areas and green spaces contribute to good health by motivating people to be more physically active, whereas overcrowded areas that lack greenery and are dominated by buildings and roads leave people dull, inactive, and lazy.
The research was motivated by existing studies showing an association between obesity and the built environment. These studies have shown that certain features of the surroundings, including walkability, land use, area of residence, and access to resources, play a major role in people's health. An obvious contributor to these associations is socio-economic indicators such as the income and spending capacity of people in the area.


The solution involved using a pre-trained network to extract features of the built environment from a data set of 150,000 high-resolution satellite images. The researchers used a VGG-CNN-F convolutional neural network (CNN), pre-trained on approximately 1.2 million images to recognize objects from 1,000 categories. The goal was to identify features such as roads, buildings, trees, water, and land in six selected cities: Los Angeles, California; Memphis, Tennessee; San Antonio, Texas; and Seattle, Tacoma, and Bellevue (considered as one), Washington. The selected cities came from states with both high (Tennessee and Texas) and low (Washington and California) obesity prevalence.
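To give a feel for what "feature extraction" means here, below is a minimal NumPy sketch of the basic operation inside a CNN layer: sliding a small kernel over an image to produce a feature map. This is only an illustration of the principle, not the actual VGG-CNN-F pipeline; the toy image, kernel, and `conv2d` helper are all invented for this example, and in practice the researchers used a network with millions of pre-trained weights, not a single hand-picked kernel.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation, as used in CNN layers:
    slide the kernel over the image and take a weighted sum at each position."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "satellite image": a dark region (think greenery) next to a bright
# region (think pavement), separated by a vertical boundary at column 4.
image = np.zeros((8, 8))
image[:, 4:] = 1.0

# A vertical-edge kernel -- the kind of pattern early CNN layers learn,
# useful for picking out boundaries such as roads and building outlines.
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0]])

feature_map = conv2d(image, kernel)
print(feature_map.shape)   # (6, 6)
print(feature_map.max())   # 3.0 -- strongest response at the boundary
```

The feature map responds only where the land-cover boundary sits, which is exactly the kind of spatial signal that, stacked over many learned kernels and layers, lets a pre-trained CNN summarize roads, trees, and buildings in a satellite tile as a feature vector.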
The researchers also sought to assess, via elastic net regression, the association between these features, points of interest such as gas stations, shopping malls, and parks, and obesity, and to estimate obesity prevalence in the cities. This is not the first attempt in this direction, but the researchers claim their model is the most comprehensive one. CNNs have previously been used in separate experiments to identify skin cancer and to estimate poverty from satellite images.
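The regression step described above can be sketched with scikit-learn's `ElasticNet`, which mixes L1 and L2 penalties and so suits settings where only a few of many candidate features matter. The data below is entirely synthetic: the "neighborhood" features, coefficients, and noise levels are invented stand-ins for the CNN-derived built-environment features in the study, not the study's actual data or hyperparameters.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Synthetic stand-ins for built-environment features (e.g. greenery,
# road density) across 200 hypothetical neighborhoods.
n_hoods, n_features = 200, 10
X = rng.normal(size=(n_hoods, n_features))

# Assume prevalence depends on only two features, plus noise -- the
# sparse setting the elastic net's L1 component is designed for.
true_coef = np.zeros(n_features)
true_coef[0] = -2.0   # more greenery -> lower prevalence
true_coef[1] = 1.5    # more road density -> higher prevalence
y = 25.0 + X @ true_coef + rng.normal(scale=0.5, size=n_hoods)

# l1_ratio balances the L1 (sparsity) and L2 (shrinkage) penalties.
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X, y)

print(np.round(model.coef_, 2))  # recovers the signs of the two true effects
```

The fitted coefficients recover the direction of each effect while shrinking the irrelevant features toward zero, which is why elastic net is a natural choice when many extracted image features compete to explain a single health outcome.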
[Figure: Obesity prevalence — green cover is associated with low obesity prevalence, whereas sparse greenery indicates high obesity (Source: Maharana et al.)]



The researchers found a strong association between obesity prevalence and built-environment indicators that generalized across cities and neighborhoods. They showed that a CNN can automate extraction of built-environment features from satellite images for studying population health indicators. It follows that structural changes to the environment could encourage people to be more physically active and decrease obesity prevalence in an area.
Machine learning has a remarkable capability to explore and tackle complex problems in the health domain. Such research, drawing its analysis from an eye in the sky, does not necessarily give accurate results, but it certainly opens new doors for further work, possibly in designing methodologies for estimating and controlling health risks.
