US Health Care

Health care in the United States is the maintenance or improvement of health via the diagnosis, treatment, and prevention of disease, illness, injury, and other physical and mental impairments in human beings. Health care is delivered by health professionals (providers or practitioners) in allied health professions, chiropractic, dentistry, midwifery, nursing, medicine, optometry, pharmacy, psychology, and other health professions. It includes the work done in providing primary care, secondary care, and tertiary care, as well as in public health.