The Nursing Profession
In The Reader's Companion to U.S. Women's History
Modern nursing, a predominantly although not exclusively women's field, began during the Civil War, when women who volunteered to nurse sick and wounded soldiers proved that careful attention to sanitation, nourishing diets, cleanliness, and comfort dramatically reduced shockingly high morbidity and mortality rates.