LawTalkers  

01-06-2015, 01:40 PM   #1076
sebastian_dangerfield
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,231
Re: It was HAL 9000!

Quote:
Sebby and I originally argued about whether redlining (and certain hiring practices and other things) were the product of outright racial bias. Based on my understanding of what he was saying, he was arguing that the practices were built around very shallow data. It seemed to me he was choosing to ignore the fact that those practices were not data-driven--that they were the specific product of racial discrimination and not a by-product of business practices that were "blind" and just happened to end up being racially discriminatory for whatever reason (e.g., this neighborhood vs. that).
A correct assessment of most of our initial disagreement. However, you forgot one point I made: that the use of big data can lead to discriminatory results without any intent to discriminate.

Quote:
Presumably Sebby moved past this argument after reading the Times article and, even though he didn't acknowledge my original point, he decided to discuss how we will continue to be biased based on his mainly correct, heuristic view of where big data is taking us all.
I agree with the Times article's conclusion that unconscious racism persists despite efforts to combat conscious racism. My view of where big data is taking us is not merely intuitive. I'm employing the same simple logic as Wonk: that old prejudices persist and infect new technology.

Quote:
He and Adder now seem to be arguing over whether there is inherent value in discriminating based on race.
No. Adder is trying to force the point that my prediction of how big data will affect discriminatory practices necessarily assumes the discriminatory criteria in the algorithms will have profitable predictive value. In some instances they will; in others they won't. If you discriminate based on geography, chances are you will minimize some consumer credit risk. If your algorithm discriminates based on types of names, it will all but assuredly leave profit on the table.
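To make the "without intent" point concrete, here's a minimal sketch (all numbers hypothetical, with an assumed correlation between group and neighborhood standing in for historical segregation): a lending rule that never sees race, only geography, can still produce sharply skewed approval rates by group.

```python
# Toy model of proxy discrimination: the decision rule uses only
# neighborhood, never race, yet approval rates diverge by group
# because neighborhood correlates with group (assumed here to mimic
# historical segregation; all probabilities are made up).
import random

random.seed(0)

applicants = []
for _ in range(10_000):
    group = random.choice(["A", "B"])
    # Assumption: group B applicants live in "south" 80% of the time,
    # group A applicants only 20% of the time.
    lives_south = random.random() < (0.8 if group == "B" else 0.2)
    neighborhood = "south" if lives_south else "north"
    applicants.append((group, neighborhood))

def approve(neighborhood):
    # "Blind" rule: approve anyone from the north neighborhood.
    return neighborhood == "north"

rates = {}
for g in ["A", "B"]:
    pool = [n for grp, n in applicants if grp == g]
    rates[g] = sum(approve(n) for n in pool) / len(pool)
    print(f"group {g}: approval rate {rates[g]:.2f}")
```

Group A ends up approved roughly four times as often as group B, even though race never appears in the rule. That is the redlining mechanism in miniature: the geographic criterion may genuinely carry some credit signal, but it also silently imports the discrimination baked into where people live.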

Quote:
Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods.
Right now, that means tackling the algorithms. Because that's what's being used, and that automation is only going to increase.
__________________
All is for the best in the best of all possible worlds.