Definition of Western medicine
Western medicine: Conventional medicine, as distinct from alternative forms of medicine such as Ayurvedic or traditional Chinese medicine.