The toolkit that protects citizens against bias

The toolkit aims to ensure that automated decisions are fair and that any unintentional harm to constituents is minimised

Algorithms are increasingly used in local government for decision-making

The US Center for Government Excellence (GovEx), DataSF, the Civic Analytics Network and Data Community DC have introduced an algorithm toolkit that aims to help municipal leaders ensure that decisions based on algorithms are unbiased and deliver the best outcomes for residents.

 

Complex algorithms learn from data, identify patterns and make predictions, sometimes with minimal human intervention, and are used across industries to provide services to residents.

 

Algorithms are used in the criminal justice system, higher education processes, and social media networks. Yet, according to the consortium, there are unintended consequences that arise when algorithms have significant bias and decisions are made without careful review or human input.

 

Ensuring decisions are fair

 

The main goal of the toolkit is to ensure automated decisions are fair and the unintentional harm to constituents is minimised. The toolkit provides a risk management approach and helps users better understand the risks and benefits associated with algorithm-based decision making in local government.

 

As cities throughout the country work to address inequities arising from algorithms and the negative impact they have on residents, the toolkit helps local leaders proactively ask specific questions to quantify risks, and provides recommendations on ways to handle those risks.
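The toolkit's own question set is not reproduced here, but one widely used way of quantifying the kind of bias it targets is the disparate impact ratio: the favourable-outcome rate of the worst-served group divided by that of the best-served group. The sketch below is a hypothetical illustration with made-up data; neither the numbers nor the function comes from the GovEx toolkit itself.

```python
# Hypothetical illustration of one common bias metric, the disparate
# impact ratio. The data and the 0.8 rule of thumb are illustrative,
# not taken from the GovEx toolkit.

def disparate_impact(decisions, groups):
    """Ratio of favourable-outcome rates between groups.

    decisions: list of 1 (favourable) / 0 (unfavourable) outcomes
    groups:    list of group labels, one per decision
    Returns a value in (0, 1]; 1.0 means parity, and values below
    roughly 0.8 are often treated as a flag for further review.
    """
    rate = {}
    for g in set(groups):
        outcomes = [d for d, gr in zip(decisions, groups) if gr == g]
        rate[g] = sum(outcomes) / len(outcomes)
    return min(rate.values()) / max(rate.values())

# Eight made-up benefit decisions across two groups:
decisions = [1, 1, 0, 1, 0, 1, 0, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(round(disparate_impact(decisions, groups), 2))  # prints 0.33
```

Here group A is approved 75 per cent of the time and group B only 25 per cent, giving a ratio of 0.33, the sort of result that a risk-assessment process would ask practitioners to investigate before deploying the algorithm.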

 

“Governments want to ensure fairness and transparency for their residents, but with more and more algorithms being used to make determinations that impact the lives of those residents, trying to mitigate bias has been difficult,” said Andrew Nicklin, director of data practices, who led the project for GovEx.

“Government employees do not have a process or tool to evaluate how risky their algorithms are, nor how to manage those risks. That is, until now.”

 

“Instead of wringing our hands about ethics and AI, our toolkit puts an approachable and feasible solution in the hands of government practitioners – something they can use immediately, without complicated policy or overhead,” added Joy Bonaguro, chief data officer for the city and county of San Francisco.

 

GovEx is part of Johns Hopkins University, DataSF is part of the city and county of San Francisco, and the Civic Analytics Network is based at Harvard University.

 

GovEx provides technical assistance and training to cities in Bloomberg Philanthropies’ What Works Cities initiative, which seeks to help local governments use data and evidence effectively to tackle their most pressing challenges and improve residents’ lives.

 

The new toolkit is the latest in GovEx’s compendium of resources.

 
