During the 19th century, the medical profession was male-dominated, as formal medical training was largely closed to women. Nevertheless, pioneering women managed to obtain formal training and become practicing doctors, and their efforts paved the way for others to enter the field. Women's role in healthcare has continued to evolve to the present day, with women working as physicians, nurses, medical specialists, and researchers.
Nursing was one of the first ways women took an active role in the medical profession. Their valuable services during the Civil War helped save lives and improve sanitation, and the achievements of these nurses encouraged other women to enter medicine after the war. Today, nurses continue to safeguard public health by caring for the sick, assisting doctors, and aiding in health research and education.