Some of society’s most important decisions are being made automatically, with no guarantee they’re being made fairly. The shift to algorithmic decision-making has opened a new front in the fight for civil rights, one algorithm at a time. Algorithms now decide who can rent an apartment and whether an accused criminal gets bail, with little accountability for the fairness of those decisions. As police and border agents embrace facial recognition, a small shift in error rates can produce massive racial disparities in who gets searched. And in every instance, the agencies responsible point to automation as proof that no bias could be present. The push for algorithmic accountability is one of the most important fights in technology today, with implications for nearly every industry and sector of society. This is where we’ll track those fights and shine a light on this chilling new threat to civil rights.
Police body cameras don’t tell the whole story. This experiment shows it.
Intent is in the eye of the beholder