You Don’t Need to Tell Your Doctor Which Vitamins You Take | Myth Buster

Because vitamins and herbal products are natural and can be found in foods, you may think they are not important to mention to your pharmacist or doctor. This myth is simply not true.

Some medications, vitamins and supplements can affect the way your body absorbs, breaks down or removes other medicines.

Before starting any new medicine, it is important to discuss your diet, your lifestyle, and all over-the-counter and prescription medications you are taking, including vitamins and supplements. Even if you are not buying a herbal medicine from the pharmacy, feel free to come in and discuss its risks and benefits with a health care professional before starting it, to ensure it is suitable for you.

All information is correct at the time of writing.