Algorithms can influence our lives in ways large and small: where our kids go to school, whether police patrol our neighborhoods and even how long we wait at red lights.
A report released by the Pittsburgh Task Force on Public Algorithms, an initiative of the University of Pittsburgh's Institute for Cyber Law, Policy, and Security (Pitt Cyber), details the potential pitfalls of algorithmic systems and addresses how policymakers can ensure their accountability to the public.
"Agencies across the country, including in our region, are using algorithmic systems to help make government decisions. We need to make sure they are doing so with transparency, accountability and the understanding of the public," said David Hickton, founding director of Pitt Cyber.
The report is the result of a two-year effort by the task force, whose members were drawn from local and national experts and community leaders, including a government advisory panel with designees from Allegheny County and the City of Pittsburgh.
Put simply, algorithms are computational tools that use data to predict outcomes. Proponents of public algorithmic systems cite significant benefits including more efficient data processing, fewer errors in decision-making relative to humans and the ability to consider vast troves of factors and data at greater speeds.
While the benefits of algorithms are notable, the task force's report details how algorithms can reflect existing biases in data and our society, and how they can accelerate and exacerbate harm, especially along racial and gender lines.
"We cannot miss this opportunity to do better for our region and make sure that we do not lock in a digital Jim Crow in this front in the fight for civil rights," said Hickton.
One example of a government's use of an algorithm gone awry is the case of the Broward County, Florida, court system relying on an algorithm called COMPAS to perform risk assessments of people facing criminal sentencing. The algorithm analyzed an arrestee's history and demographics and rated how likely they were to offend again. The algorithm was found to falsely flag Black defendants as future criminals, wrongly labeling them this way at almost twice the rate of white defendants.
"We do not have to accept the false choice between technological advancement and civil and constitutional rights. People of goodwill can find ways to balance liberty and security in the digital age, leveraging tech innovation fairly and with transparency," said Hickton.
[Read about the 2021 launch of the Pitt Disinformation Lab.]
Against this complex backdrop, the task force endeavored to learn from local governments' experiences with algorithmic systems. They found a range of approaches with profoundly different commitments to being transparent, engaging the public and obtaining outside reviews of their systems.
"The work of the task force is grounded in a simple truth: algorithmic tools are set up to fail if the public doesn't know what they are or trust how they are being used," said LaTrenda Sherrill, a task force member and founder of Common Cause Consultants who led the task force's community engagement efforts.
Based on research, lessons learned from case studies and public feedback, the task force outlined recommendations to help local and regional governments manage algorithmic systems and ensure their accountable use across agencies. The recommendations include:
- Encouraging meaningful public participation commensurate with the risk level of the potential algorithmic system.
- Involving the public in algorithmic system development plans from the earliest stages through any later substantive changes to the system.
- Utilizing external reviews when the system might be at higher risk of error.
- Assessing whether any planned procurement might include an algorithmic system.
- Publishing information about algorithmic systems on a public registry website.
- Avoiding facial recognition and related systems.
- Evaluating the effectiveness of the recommendations.
The University of Pittsburgh Institute for Cyber Law, Policy, and Security and the Pittsburgh Task Force on Public Algorithms gratefully acknowledge the generous support of The Heinz Endowments and the Hillman Foundation.
– Nichole Faina