Models that aim to predict human-induced global average temperature rise may have been underestimating important contributions from clouds, causing projections to be lower than what might actually occur, at least in some simulations, a new study suggests.

Global climate models that attempt to predict mean temperature rise need to know how strongly atmospheric carbon dioxide affects temperature, a relationship captured by the "equilibrium climate sensitivity" (ECS): the warming that results from a doubling of atmospheric carbon dioxide. Larger values of ECS mean that the warming caused by carbon dioxide is greater. Clouds and aerosol particles strongly influence the amount of radiation, and thus heating, in the atmosphere, and they are currently the leading source of uncertainty in climate projections. How clouds affect Earth's energy balance depends in part on the numbers and sizes of the ice crystals and supercooled liquid droplets they contain, as well as on overall cloud coverage.

Recent evidence shows that an ice formation process previously believed to be very common is in fact observed only infrequently during actual cloud formation. This means that some types of clouds contain fewer ice crystals and more liquid droplets than previously believed, and therefore reflect more radiation than was thought, leading to estimates of ECS that are too low. As for overall cloud coverage, spatial and temporal measurements have been sparse in the past.

Here, Ivy Tan and colleagues used comprehensive cloud monitoring data from NASA's Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) instrument, incorporating recent in situ measurements of ice crystals and supercooled liquid droplets. Plugging these new parameters into climate models substantially changed the ECS. Whereas older models estimated the ECS to be between 2.0°C and 4.6°C, the results of Tan et al. now put it between 5.0°C and 5.3°C, although these estimates can vary depending on which model is used.
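To make the ECS figures concrete: under the standard logarithmic approximation (a general textbook relation, not a calculation from the study itself), each doubling of atmospheric CO2 produces roughly ECS degrees of equilibrium warming. A minimal sketch, with illustrative concentration values:

```python
from math import log2

def warming_from_co2(ecs_c, co2_ppm, baseline_ppm=280.0):
    """Approximate equilibrium warming in degrees C for a given CO2
    concentration, assuming each doubling of CO2 over the preindustrial
    baseline (~280 ppm, an illustrative value) yields ECS degrees."""
    return ecs_c * log2(co2_ppm / baseline_ppm)

# At exactly doubled CO2 (560 ppm), the warming equals the ECS itself:
print(warming_from_co2(2.0, 560))  # old low-end estimate -> 2.0
print(warming_from_co2(5.3, 560))  # Tan et al. upper value -> 5.3
```

The spread between the old range (2.0 to 4.6°C) and the revised one (5.0 to 5.3°C) therefore translates directly into how much warming is projected for any given future CO2 level.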