
Avoiding Data and Algorithmic Bias In Disaster Research: The Ten States Project

Pioneering Research for a Better Future

In this piece, we delve into the Ten States Project, a pioneering initiative addressing the climate and disaster risks confronting African Americans in the ten states where, excluding New York and California, 60% of that population lives. Our goal is to create a comprehensive risk profile for this population, one that examines multiple indicators holistically rather than comparatively or at fixed points in time. The profile will also incorporate factors such as connectivity, socioeconomics, and political instability. The final product will be a hub where the community itself, along with emergency management, disaster, and climate resilience professionals, can access the research. More importantly, it will offer a glimpse of the lived reality of divested groups vis-à-vis the multifaceted challenges of climate-accelerated disasters.

When conducting research that relies heavily on data and a variety of data-oriented outputs related to a divested group, it is crucial to design a process aimed at avoiding data management bias and later addressing algorithmic bias. It’s important to note that the methodology we’re detailing is a pilot. The entire process will undergo review at the project’s conclusion, coinciding with a comprehensive quality control and knowledge management check on the data and any products built using that data.

The Ten States Project is committed to delivering impartial insights into disaster and climate risks, particularly for communities that have historically lacked access to reliable, accessible information. Part of this work is recognizing the significance of data bias and its ability to warp outcomes and skew program design. The issue is especially concerning when working with divested communities, who are underrepresented in data-centered research yet overrepresented among those harmed by its outcomes. The project proactively works to ensure unbiased training data, aligning its practices with the ethical principles emphasized by the Greenlining Institute.

Embracing Diversity in Team and Tech

Our team mirrors the demographics of the studied communities, incorporating members with lived experiences and geographical ties to the focal states. This strategy not only combats data management bias but also enriches our insights with valuable, on-the-ground perspectives. Our commitment to ethical data management involves public data collection spearheaded by experts who deeply understand the communities they’re serving.

We’re implementing best practices for bias mitigation, as recommended by the Brookings Institution and scholars such as Orphanou et al. Inclusivity is key to our data governance approach: by involving community members in the process, we aim to rebuild trust and ensure responsible data use. Diversity doesn’t stop at team composition; it extends to our algorithm design. By incorporating diverse perspectives in development and user engagement, we aim to identify and mitigate discriminatory biases. The Ten States Project is more than a study; it’s a commitment to transparency, inclusivity, and the equitable application of data and technology. We actively seek input from the communities we study, ensuring our approach remains grounded and effective.

Join Us on This Journey

We’re not just conducting research; we’re inviting a dialogue. We welcome stakeholders and interested parties to engage with us, both online and at the upcoming FEMA Higher Education Symposium in June. Together, let’s pave the way for a future where every community is equipped to face the challenges of a changing climate.

We have a draft white paper on this aspect of the Ten States Project that will be updated to reflect the insights gained through this process. It will also include an addendum with the results of the end-of-project review.