The phenomenon of climate change is well established within the scientific community. Unfortunately, the science is not what you see in the popular media. At one extreme, you have people denying that climate change is real. At the other extreme, you have movies like The Day After Tomorrow showing widespread planetary destruction.
You wouldn't trust a politician, a pundit, a corporation, or a Hollywood director with your health; you'd ask someone trained in the study of human anatomy, like a doctor. So why listen to those same people when it comes to climate change?
By the way, I use the term "climate change" rather than "global warming" because, although the globe is warming, that terminology leads people to think a snowfall somehow disproves the science. In fact, large snowstorms are entirely consistent with global warming: a warmer atmosphere holds more moisture. Some regions of the globe will get much less water (drought), but that water has to go somewhere, and in wintertime it shows up as harsher snowstorms.