"Strong winds, dry heat and freezing temps are all part of winter, but the temperature changes and dry, dehydrating air can wreak havoc on your skin. “During the winter, you lose water through your skin and then your skin is less of a barrier to bacteria in the environment,” says dermatologist Margaret “Miggs” Muldrow, MD"

Read more on Sharecare.com