We urged the Federal Trade Commission (FTC) on Monday to investigate how companies use Americans’ data, including how the automated computer processes and algorithms they build can introduce age, gender and other demographic biases into the way they target and serve consumers.
Federal law prohibits companies from discriminating — whether that’s through hiring decisions or serving consumers — based on certain characteristics, like age, gender and race or ethnicity. But companies are increasingly relying on computer programs and algorithms to manage consumer data. And those systems have the potential for bias, even if they weren’t intentionally built that way.
“For example, technology firms are less likely to employ African Americans, women and older adults,” we wrote in comments to the FTC. “Thus, the lack of diversity among those who initially develop the algorithms may contribute to algorithmic bias.”
Our filing called these problems “insidious” because they’re not always apparent to the victim or to the company using the algorithm, opening the door to “systematic but unrecognized discrimination.” We urged the FTC to push companies that develop algorithms to make those algorithms transparent and to set up a system of checks to weed out bias. We also urged the agency to work with Congress if it lacks the authority needed to take action.
The FTC is considering whether to introduce new rules or guidelines around how companies collect, use and store consumer data. The public comment window closed on Monday, Nov. 21.
Read our filing to the FTC.