IM2GPS: estimating geographic information from a single image



Estimating geographic information from an image is an excellent, difficult high-level computer vision problem whose time has come. The emergence of vast amounts of geographically-calibrated image data is a great reason for computer vision to start looking globally — on the scale of the entire planet! In this paper, we propose a simple algorithm for estimating a distribution over geographic locations from a single image using a purely data-driven scene matching approach. For this task, we will leverage a dataset of over 6 million GPS-tagged images from the Internet. We represent the estimated image location as a probability distribution over the Earth's surface. We quantitatively evaluate our approach in several geolocation tasks and demonstrate encouraging performance (up to 30 times better than chance). We show that geolocation estimates can provide the basis for numerous other image understanding tasks such as population density estimation, land cover estimation or urban/rural classification.
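The data-driven approach described above can be sketched as nearest-neighbor scene matching: find the database images whose features most resemble the query, and let their GPS tags vote for a distribution over locations. The sketch below is illustrative only, with toy feature vectors standing in for the scene descriptors (e.g. gist, color histograms) used in the paper, and a tiny list standing in for the 6-million-image collection; it is not the authors' implementation.

```python
import math
from collections import Counter

def geolocate(query_feat, database, k=3):
    """Estimate a distribution over locations for one query image.

    database: list of (feature_vector, (lat, lon)) pairs, a stand-in
    for the GPS-tagged image collection. Returns a dict mapping each
    matched coordinate to its probability mass.
    """
    def dist(a, b):
        # Euclidean distance in feature space (toy choice).
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Rank the database by feature-space distance to the query and
    # keep the k nearest scene matches.
    neighbors = sorted(database, key=lambda entry: dist(query_feat, entry[0]))[:k]

    # Vote: each nearest neighbor contributes equal probability mass
    # at its GPS coordinate.
    votes = Counter(coord for _, coord in neighbors)
    total = sum(votes.values())
    return {coord: n / total for coord, n in votes.items()}
```

In the paper the votes are further smoothed into a density over the Earth's surface; here the distribution is left as point masses at the matched coordinates for brevity.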



im2gps.pdf, 11MB


James Hays, Alexei A. Efros. IM2GPS: estimating geographic information from a single image. Proceedings of the IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), 2008.


Google Tech Talk.
CVPR 2008 poster (30MB).

Test sets

im2gps test set, 237 images, 39MB.
2k random test set, 2000 images, 308MB.
geographically uniform test set, 955 images, 140MB.
human geolocation test set, 64 images, 11MB.
The GPS coordinates, along with other image info, are saved in the JPEG comment fields. Use Matlab's imfinfo() to read them. See the code for downloading Flickr images below for more info.

All Geolocation Results

Gallery of geolocation results for the entire test set.

Download Scripts

Code for downloading Flickr images.

Comparison to Human Geolocation Performance

VSS 2009 Poster (10MB) comparing im2gps performance to that of twenty human participants under different photo viewing conditions.


We thank Steve Schlosser, Julio Lopez, and Intel Research Pittsburgh for helping us overcome the logistical and computational challenges of this project. We thank Yahoo! and Flickr for sharing their computing and image resources with the research community. All visualizations and geographic data sources are derived from NASA data. Funding was provided by an NSF fellowship to James Hays and NSF grants CAREER IIS-0546547 and CCF-0541230.

We had some difficulty deciding on a subtitle for this paper so we solicited our fellow researchers for advice. Their suggestions were very useful and we regret not using their excellent titles. Thanks to Adam Bargteil, Derek Hoiem, Ronit Slyper, and the CMU Graphics Lab.