Western medicine

Definition / meaning of Western medicine

A system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.

Find more about 'Western medicine'

The Web site of the National Cancer Institute (http://www.cancer.gov/)
