Women’s colleges

Women’s colleges can be supportive, empowering environments that allow female students to access opportunities and fulfill their potential unhindered by discrimination. Alumnae of women’s colleges include national and global leaders in politics, the sciences and the arts.

A women’s college in the USA is usually a liberal arts college whose student body is entirely or predominantly female. The best-known women’s colleges were founded in the 1800s, when higher education was almost exclusively male.

Many women’s colleges have cross-registration agreements with nearby coeducational universities, so students can take classes at both institutions.