Black Women's Health Imperative
The Black Women's Health Imperative is a nationally recognized non-profit organization that is dedicated to achieving health equity for Black women in America. Founded in 1983, the organization leads health policy, education, research, and leadership development initiatives to improve the overall health outcomes of Black women. They partner with Black women on their life-health journey, advocating for their concerns and addressing the inequities that impact their well-being.
With a vision of ensuring optimal health for all Black women in a society that promotes health equity and reproductive justice, the Black Women's Health Imperative offers various programs and initiatives focused on lifestyle changes, rare disease diversity, workplace equity, and more. They have successfully collaborated with organizations, introduced legislation, and launched national campaigns to raise awareness, drive policy changes, and provide life-saving screenings and treatments to underserved women. Through their efforts, they aim to increase the number of healthy Black women in the United States and empower them to lead healthier lives.