Definition of Western medicine
Western medicine: Conventional medicine, as distinct from an alternative form of medicine such as Ayurvedic or traditional Chinese medicine.
Source: MedTerms™ Medical Dictionary
Last Editorial Review: 9/20/2012