|
01-06-2015, 12:01 PM
|
#1066
|
|
[intentionally omitted]
Join Date: Mar 2003
Location: NYC
Posts: 18,597
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
Adder, I'm not accusing you of doing anything but buying into a myth that has been bought into by millions. The myth is that you can divide people or properties into different cohorts without taking race or socioeconomic status into account.
As an example, you build an algorithm that predicts risk of foreclosure. How are you going to divide up the properties? By zip code? By average home value? By number of foreclosures in a 2-mile radius? I defy you to pick a criterion that isn't reflective of the differences in our society.
Any argument that the algorithm is going to protect against downside and not perpetuate the race-wealth-education gaps is going to fall to the fact that an algorithm by definition has to categorize. If race, money, and education are all concentrated in a given area, then the distinguishing characteristic is reflected in the algorithm.
If you argue the algorithm serves a valuable purpose, then what is that purpose if not to exclude the riskier actors? Show me an algorithm that gets around this problem and I will concede. Any algorithm I can think of is going to perpetuate the discrimination. Anyone who uses that algorithm knows it will perpetuate the discrimination.
In short, it doesn't matter how much lipstick you apply. A pig is still a pig. I'm not getting personal with you. I'm suggesting that your premise is fatally flawed. If that gets me the angry fist of God, I will try to find a way to live with it. I tend to fall into that risk pool anyway.
|
While I don't disagree with you, I think, as a group, we are all over the place on this.
Sebby and I originally argued about whether redlining (and certain hiring practices and other things) were the product of outright racial bias. Based on my understanding of what he was saying, he was arguing that the practices were built around very shallow data. It seemed to me he was choosing to ignore the fact that those practices were not data-driven--that they were the specific product of racial discrimination and not a by-product of business practices that were "blind" and just happened to end up being racially discriminatory for whatever reason (e.g., this neighborhood vs. that).
Presumably Sebby moved past this argument after reading the Times article and, even though he didn't acknowledge my original point, he decided to discuss how we will continue to be biased based on his mainly correct, heuristic view of where big data is taking us all. He and Adder now seem to be arguing over whether there is inherent value in discriminating based on race.
Your point is that if you are setting up a model to take into account anything that can be linked back to the result of our collective institutional racism (e.g., this neighborhood vs. that), then such a model will be inherently discriminatory. While I agree with you, that's two steps down the line.
Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods.
In short, you and Adder aren't really disagreeing on anything, I think.
TM
|
|
|
01-06-2015, 12:19 PM
|
#1067
|
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by sebastian_dangerfield
I'm not excusing it. I'm noting that big data is going to: (1) enable a lot more of it, both intentionally and unintentionally; and (2) provide an alibi for it.
Corporations are the greatest tools ever invented for avoidance of personal responsibility. We are handing them enormous data pools with which to commoditize human beings. When they judge the books by the covers because that's the easiest and most efficient way to maximize profit and minimize risk, and we call them on it, this will be what you hear in the Congressional hearings:
"I did not design the algorithm... That was by committee, and involved many tech people no longer with us. A number of outside consultants, as well. There's no way to know who exactly designed the code at issue. And the algorithm we're discussing-- And keep in mind, I'm not a coder or anything-- actually, quite a Luddite in that regard... I believe that code actually teaches itself. So the prohibited criteria it used, I think, if I understand the tech people correctly, was selected by the algorithm itself. With no human involvement, or foresight that might occur.
But we have enacted best practices to avoid this in the future. Our new coders assure us this exact use of this exact prohibited criteria can be avoided. Some others may be used by these learning algorithms in the future, as it's impossible to preclude them all. But this one? This exact discriminatory basis? We have that one eliminated. And we are committed to vigilant removal of others as they appear."
Of course, no purchased Senator or Congressman will mention that the categories of criteria that may be used to effect discriminatory ends are innumerable.
Only the silliest tech evangelist would believe big data is going to remove or reduce discrimination. It's a delusion as preposterous as the belief that the 2008 Crash would result in a true upending of the social order, as it should have, rather than a stronger retrenchment in which the rich and powerful before the Crash became even more rich and powerful afterward.
We are Engineered. The question isn't whether the status quo persists, but how fast it accelerates our splintering into ever more deeply class-divided mini-societies. To be a bear, to hope for some form of justice, or revolution, or a true and deserved free market correction, is to be insane.
|
"Of course we knew it was a tax. But there was no way it would have passed if we called it that."
And yeah, I know I'm insane. Doesn't mean I plan to stop trying.
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 12:21 PM
|
#1068
|
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 17,175
|
Re: It was HAL 9000!
Quote:
Originally Posted by ThurgreedMarshall
In short, you and Adder aren't really disagreeing on anything, I think.
|
I don't think so either.
|
|
|
01-06-2015, 12:28 PM
|
#1069
|
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Adder
I take your point that there will be defenses raised in future discrimination cases that can be summed up as "we were just following the data." That might often be true, and sometimes it may not.
|
We were just following orders.
Quote:
There's wonk's point that all data is permeated with our history of injustice, but to believe that big data is going to be just as likely as the human subconscious to unknowingly reject black applicants because they are black is irrationally cynical. Why do that when the data offers you all kinds of ways to get to non-discriminatory results?
Yeah, I know, you've argued it's going to happen either inadvertently or because it's the cheapest and easiest way to do it, and they're all greedy bastards. But both of those conclusions require entertaining the possibility that race really is strongly predictive, to a degree that it will outweigh other factors.
|
That wasn't Wonk's point. Wonk's point was that the humans writing the algorithms are going to set up parameters that reflect racial and economic disparity. Big Data isn't likely to do anything. Big data will do exactly what it's told to do, which is going to be the same racial and economic profiling the underwriters, loan officers, and IT people use now. What's the old expression, garbage in, garbage out?
Data may offer a way out of discrimination, but that would involve the same kind of personal, painful analysis and second-guessing that machines aren't capable of and humans aren't paid to do. Besides, they aren't in the business of ending discrimination. That would fly in the face of every argument in favor of big data that's been uttered here for the last two days.
Lenders, insurers, marketers, etc. are going to program their algorithms to make money, or, as it's so politely put, "minimize risk." That's what corporations do, right? They maximize profits.
Unless we want them to buy politicians. Then they're people. White, Christian people.
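To make the garbage-in, garbage-out point concrete, here's a minimal, purely illustrative sketch (invented numbers, invented feature names; nothing here comes from any real lender's system). A model that is never given race still reproduces the old redlining pattern, because the history it learns from already encodes it:

# Hypothetical illustration of "garbage in, garbage out": an algorithm
# trained on historically biased approval decisions reproduces the bias
# through a proxy feature (zip code group), even though race is never
# given to it. All data and names below are made up.
import random

random.seed(0)

def historical_decision(zip_group, income):
    # Biased history: applicants from the redlined zip group were
    # routinely denied regardless of income.
    if zip_group == "redlined":
        return 0 if random.random() < 0.8 else 1
    return 1 if income > 40_000 else 0

# Build a "training set" from that biased history.
rows = []
for _ in range(10_000):
    zip_group = random.choice(["redlined", "favored"])
    income = random.gauss(55_000, 15_000)
    rows.append((zip_group, income, historical_decision(zip_group, income)))

# The "algorithm" just learns the approval rate per zip group --
# the simplest possible model, but enough to show the effect.
def approval_rate(group):
    outcomes = [y for g, _, y in rows if g == group]
    return sum(outcomes) / len(outcomes)

for group in ("redlined", "favored"):
    print(group, round(approval_rate(group), 2))
# The learned rates mirror the historical discrimination (roughly 0.2
# vs. 0.8), even though the model was never told anyone's race.

The point isn't the toy model; it's that a "neutral" algorithm faithfully hands back whatever the past handed it.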
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 12:34 PM
|
#1070
|
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by ThurgreedMarshall
In short, you and Adder aren't really disagreeing on anything, I think.
TM
|
I don't think we are, either, except that I am saying that big data shouldn't be used, at least not without consequences (i.e., a person can sue and recover for discriminatory effect), unless and until those corrections are made. Allowing big data to be built on a racially biased platform just further institutionalizes the bias and makes it harder to identify and correct.
I'm pretty sure I am alone on that point. And I don't really expect anyone to give up their shiny new toy.
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 12:47 PM
|
#1071
|
|
I am beyond a rank!
Join Date: Mar 2003
Posts: 11,873
|
Re: It was HAL 9000!
Quote:
Originally Posted by ThurgreedMarshall
While I don't disagree with you, I think, as a group, we are all over the place on this.
Sebby and I originally argued about whether redlining (and certain hiring practices and other things) were the product of outright racial bias. Based on my understanding of what he was saying, he was arguing that the practices were built around very shallow data. It seemed to me he was choosing to ignore the fact that those practices were not data-driven--that they were the specific product of racial discrimination and not a by-product of business practices that were "blind" and just happened to end up being racially discriminatory for whatever reason (e.g., this neighborhood vs. that).
Presumably Sebby moved past this argument after reading the Times article and, even though he didn't acknowledge my original point, he decided to discuss how we will continue to be biased based on his mainly correct, heuristic view of where big data is taking us all. He and Adder now seem to be arguing over whether there is inherent value in discriminating based on race.
Your point is that if you are setting up a model to take into account anything that can be linked back to the result of our collective institutional racism (e.g., this neighborhood vs. that), then such a model will be inherently discriminatory. While I agree with you, that's two steps down the line.
Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods.
In short, you and Adder aren't really disagreeing on anything, I think.
TM
|
Thesis, antithesis.... meet synthesis, bitches. Well said.
__________________
Where are my elephants?!?!
|
|
|
01-06-2015, 12:51 PM
|
#1072
|
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by sidd finch
thesis, antithesis.... Meet synthesis, bitches. Well said.
|
potw
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 12:56 PM
|
#1073
|
|
Moderasaurus Rex
Join Date: May 2004
Posts: 33,080
|
Re: Is Ted Cruz Satan? Discuss.
Dunno if anyone else has been following the Ched Evans case. He's a Welsh soccer player, formerly of Sheffield United, who was convicted of rape, served his sentence, and now wants to resume his career. Protests seem to be closing the door at every club that might take him.
Here's one good take:
http://www.theguardian.com/football/...aw-marina-hyde
Here's another:
http://www.theguardian.com/football/...s-money-morals
Not sure what to think about this.
__________________
“It was fortunate that so few men acted according to moral principle, because it was so easy to get principles wrong, and a determined person acting on mistaken principles could really do some damage." - Larissa MacFarquhar
|
|
|
01-06-2015, 12:58 PM
|
#1074
|
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
True, but having read the article, we now know that people will discriminate even without intent. If we accept that as a truth, then don't we have an obligation to test everything against that knowledge and reject any process that continues the discrimination?
|
I think the article generally said people denied having intent, not that they didn't actually have intent.
Lots of people say there is not a discriminatory bone in their body, but then intentionally give preference to alumni of their own lily white school, to people who their own lily white friends recommend, to people who grew up in their icky white suburb or to other people who fit their own tribal characteristics. Hiring nothing but white people while denying discriminatory intent is a time-honored American tradition.
But interviewing people who are shocked at their own discriminatory actions is like asking the guy walking out of your house holding a jewel box if he is stealing it. "Stealing what? Oh, this? Just have to bring it to the shop and fix a blown gasket. I don't know how it found its way into my hands. Is this yours? It looks like mine."
__________________
A wee dram a day!
|
|
|
01-06-2015, 01:31 PM
|
#1075
|
|
Proud Holder-Post 200,000
Join Date: Sep 2003
Location: Corner Office
Posts: 86,149
|
Re: It was HAL 9000!
Quote:
Originally Posted by ThurgreedMarshall
Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods.
TM
|
after my great blue state banned AA in public schools based upon RACE, data mining gave U of M a fallback to get around it: the school makes sure it draws from ALL NEIGHBORHOODS.
http://www.newrepublic.com/article/1...n-in-education
while I am glad they keep going for diversity, I am a bit troubled the state school intentionally sidestepped the (racist, but still) will of the people this way.
__________________
I will not suffer a fool- but I do seem to read a lot of their posts
|
|
|
01-06-2015, 01:40 PM
|
#1076
|
|
Moderator
Join Date: Mar 2003
Location: Monty Capuletti's gazebo
Posts: 26,231
|
Re: It was HAL 9000!
Quote:
|
Sebby and I originally argued about whether redlining (and certain hiring practices and other things) were the product of outright racial bias. Based on my understanding of what he was saying, he was arguing that the practices were built around very shallow data. It seemed to me he was choosing to ignore the fact that those practices were not data-driven--that they were the specific product of racial discrimination and not a by-product of business practices that were "blind" and just happened to end up being racially discriminatory for whatever reason (eg., this neighborhood vs. that).
|
Correct assessment of most of our initial disagreement. However, you forgot one point I made: That big data use can lead to discriminatory results without intent.
Quote:
|
Presumably Sebby moved past this argument after reading the Times article and, even though he didn't acknowledge my original point, he decided to discuss how we will continue to be biased based on his mainly correct, heuristic view of where big data is taking us all.
|
I agree with the Times article's conclusion that unconscious racism persists despite efforts to combat conscious racism. My view of where big data is taking us is not intuitive. I'm employing the same simple logic as Wonk: that old prejudices persist and infect new technology.
Quote:
|
He and Adder now seem to be arguing over whether there is inherent value in discriminating based on race.
|
No. Adder is trying to force the point that my prediction of how big data will impact discriminatory practices necessarily assumes discriminatory criteria in the algorithms will have profitable predictive value. In some instances, it will. In some, not. If you discriminate based on geography, chances are, you will minimize some consumer credit risk. If your algorithm discriminates based on types of names, it will all but assuredly leave profit on the table.
Quote:
|
Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods.
|
Right now, that means tackling the algorithms. Because that's what's being used, and that automation is only going to increase.
__________________
All is for the best in the best of all possible worlds.
|
|
|
01-06-2015, 01:41 PM
|
#1077
|
|
Wild Rumpus Facilitator
Join Date: Mar 2003
Location: In a teeny, tiny, little office
Posts: 14,167
|
Re: It was HAL 9000!
Quote:
Originally Posted by Greedy,Greedy,Greedy
I think the article generally said people denied having intent, not that they didn't actually have intent.
Lots of people say there is not a discriminatory bone in their body, but then intentionally give preference to alumni of their own lily white school, to people who their own lily white friends recommend, to people who grew up in their icky white suburb or to other people who fit their own tribal characteristics. Hiring nothing but white people while denying discriminatory intent is a time-honored American tradition.
But interviewing people who are shocked at their own discriminatory actions is like asking the guy walking out of your house holding a jewel box if he is stealing it. "Stealing what? Oh, this? Just have to bring it to the shop and fix a blown gasket. I don't know how it found its way into my hands. Is this yours? It looks like mine."
|
I didn't have intent. I was actually shocked and ashamed at the level of built-in racism I displayed when I actually started paying attention. Stuff like "I can't be racist; I dated the black girl (that's right, there was one) in my class, I was friends with three of the four black guys in my class and two of the three in the classes above and below."
Doesn't make me any less racist in action, but I do maintain there was no overt intent.
__________________
Send in the evil clowns.
|
|
|
01-06-2015, 01:46 PM
|
#1078
|
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: It was HAL 9000!
Quote:
Originally Posted by taxwonk
I didn't have intent. I was actually shocked and ashamed at the level of built-in racism I displayed when I actually started paying attention. Stuff like "I can't be racist; I dated the black girl (that's right, there was one) in my class, I was friends with three of the four black guys in my class and two of the three in the classes above and below."
Doesn't make me any less racist in action, but I do maintain there was no overt intent.
|
Wasn't thinking of you and the hospital choice here, but the article on employment issues. I think there are lots of people in positions where they hire who would say they have no intention of discriminating the way they do, very carefully, every day.
__________________
A wee dram a day!
|
|
|
01-06-2015, 01:49 PM
|
#1079
|
|
Registered User
Join Date: Mar 2003
Location: Government Yard in Trenchtown
Posts: 20,182
|
Re: It was HAL 9000!
Quote:
Originally Posted by Hank Chinaski
after my great blue state banned AA in public schools based upon RACE, data mining gave U of M a fallback to get around it: the school makes sure it draws from ALL NEIGHBORHOODS.
http://www.newrepublic.com/article/1...n-in-education
while I am glad they keep going for diversity, I am a bit troubled the state school intentionally sidestepped the (racist, but still) will of the people this way.
|
There's an easy solution here. Stop discriminating in housing.
By the way, when do we start a pool for the last state discriminating on marriage equality? Your state is a contendah!
__________________
A wee dram a day!
|
|
|
01-06-2015, 01:54 PM
|
#1080
|
|
Proud Holder-Post 200,000
Join Date: Sep 2003
Location: Corner Office
Posts: 86,149
|
Re: It was HAL 9000!
Quote:
Originally Posted by Greedy,Greedy,Greedy
There's an easy solution here. Stop discriminating in housing.
|
so the poor wrecked families left in Detroit need to be allowed to buy homes in real nice neighborhoods? that will solve it?
Quote:
|
By the way, when do we start a pool for the last state discriminating on marriage equality? Your state is a contendah!
|
hoping the court knocks it down now that it has picked the case up.
__________________
I will not suffer a fool- but I do seem to read a lot of their posts
|
|
|