Women's fiction
Women's fiction is an umbrella term for women-centered books that focus on women's life experiences and are marketed to female readers; it includes many mainstream novels and women's rights books. It is distinct from women's writing, which refers to literature written by women.