Definition of Mainstream medicine
Mainstream medicine: Medicine as practiced by holders of M.D. (Doctor of Medicine) or D.O. (Doctor of Osteopathic Medicine) degrees and by their allied health professionals, such as physical therapists, psychologists, and registered nurses. The term "mainstream medicine" implies that other forms of medicine lie outside the mainstream.
Last Editorial Review: 9/20/2012