Western medicine

(WES-tern MEH-dih-sin)
A system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.

Source: NCI Dictionary of Cancer Terms