
Definition of Western medicine

Western medicine: Conventional medicine, as distinct from alternative forms of medicine such as Ayurvedic or traditional Chinese medicine.

Source: MedTerms™ Medical Dictionary
http://www.medterms.com/script/main/art.asp?articlekey=33616
Last Editorial Review: 9/20/2012
