While Western medicine isn’t simply evil, the pharmaceutical side of that system can definitely be dangerous. And it can ruin lives.
How the Food and Drug Companies Ensure that We Get Sick and They Make Money
As always, we recommend you educate yourself before you take a pill. Do your research and know what the side effects are. You simply cannot trust that the pharmaceutical industry has your best interests at heart, because it doesn’t. It wants your money.
Please watch and share.