Western medicine

Definition / meaning of Western medicine

A system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.

Source(s):

National Cancer Institute website (http://www.cancer.gov/)
