Accelerated climate warming may increase the risk of infectious disease outbreaks in many species adapted to mild or cool climates, while species from warmer climates could see reductions in disease risk, reports a new study. Alongside a myriad of climate-related impacts on ecological communities, disease outbreaks among wildlife populations have become more frequent and widespread in recent decades. These observations suggest that climate change and infectious disease risk in wildlife are linked; however, the ties between host-parasite biology and the environment are inherently complex and difficult to untangle.

Researchers have proposed the "thermal mismatch" hypothesis to help explain these patterns in amphibians. It suggests that smaller-bodied, disease-causing pathogens tend to tolerate abnormal temperatures better than the larger-bodied species they infect. Thus, species adapted to warm climates are at greatest risk of disease under abnormally cool conditions, while species adapted to cool climates face the greatest risk when temperatures are abnormally warm.

To determine whether this hypothesis can be expanded to explain how climate change-related temperature fluctuations influence disease risk across species and geographic regions, Jeremy Cohen and colleagues built a dataset describing pathogen prevalence in 2,021 host-pathogen pairs from 7,346 wildlife populations worldwide, paired with data on local weather and climate for each location. Modeling the data revealed findings that support the thermal mismatch hypothesis: wildlife from cool climates experienced increased disease risk during abnormally warm periods. The risk to warm-adapted hosts similarly increased during abnormally cool periods but mildly decreased during abnormally warm ones. While these effects varied with parasite and host identity and traits, they were strongest among cold-blooded species and were similar across terrestrial and freshwater ecosystems.