Scholars create program to test software for bias and discrimination
One day, your future might depend on a series of carefully calculated zeros and ones.
As technology improves, humans become less involved in decisions that affect our lives — and that isn’t exactly a good thing.
As artificial intelligence gains ground, researchers at the University of Massachusetts Amherst have developed a program to test software for bias and discrimination.
Yes, racial discrimination, but more than that: healthcare decisions, loan decisions, even how Amazon sets package-shipping rates.
“Today, software determines who gets a loan or gets hired, computes risk-assessment scores that help decide who goes to jail and who is set free, and aids in diagnosing and treating medical patients,” according to the program’s developers.
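One common idea behind this kind of bias testing can be sketched in rough outline: feed a piece of decision-making software two inputs that differ only in a protected attribute, and flag the software if its answer changes. Everything below, including the `loan_decision` function, the `check_discrimination` helper, and the attribute names, is an invented illustration under that assumption, not the researchers’ actual tool.

```python
def loan_decision(income, credit_score, race):
    """A deliberately biased toy model standing in for software under test.

    Race should never matter here, but a hidden rule makes it matter.
    """
    approved = income > 40_000 and credit_score > 650
    if race == "B":
        approved = approved and credit_score > 700  # hidden bias
    return approved


def check_discrimination(decision_fn, applicant, attr, values):
    """Return True if changing only `attr` flips the decision.

    Holds every other field of `applicant` fixed and probes the
    decision function once per candidate value of the attribute.
    """
    outcomes = set()
    for value in values:
        probe = dict(applicant, **{attr: value})
        outcomes.add(decision_fn(**probe))
    return len(outcomes) > 1


applicant = {"income": 50_000, "credit_score": 680, "race": "A"}
biased = check_discrimination(loan_decision, applicant, "race", ["A", "B"])
print(biased)  # True: the decision flips when only race changes
```

Because the test treats the software as a black box, it needs no access to the source code, only the ability to run it on chosen inputs.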