LawTalkers

LawTalkers (http://www.lawtalkers.com/forums/index.php)
-   Politics (http://www.lawtalkers.com/forums/forumdisplay.php?f=16)
-   -   Is Ted Cruz Satan? Discuss. (http://www.lawtalkers.com/forums/showthread.php?t=875)

Sidd Finch 01-05-2015 03:30 PM

Re: For Sebby
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492534)
I don't see this improving a whole hell of a lot anytime soon. The shift toward hiring/renting/giving credit based on surface information is only increasing. Big data is going to bifurcate us unfairly and arbitrarily more than we were ever able to on our own in the past.

I think lawsuits over discrimination-by-algorithm are going to become more frequent and much larger in the next few decades.

Jumping off a bit, management and governance from the "helicopter view" are going to make a shit show of society, and I don't see any hope of us stopping it. There's a delusion being bought by almost all people with power that huge organizations, markets, and even countries can be fully understood and managed by viewing mere data about how they operate in aggregate.*

This thinking isn't irrational, of course, but we don't have to look far to see its Achilles' heel - the 2008 Crash. The aggregate data never provide a complete picture of what's going on at the street level. God only knows how much damage and unknown risk we'll cause by applying such know-it-all-ism and total reliance on necessarily blunt data in realms beyond finance and insurance.

______
*Wall Street would tell us otherwise, but their broad analyses only appear accurate because it's all the same self-reinforcing data passed back and forth between and among the same limited actors, upon which those actors engage in herd investing, the timing of which creates profits. You don't have to have accurate data to win at musical chairs.



Huh? The whole point of the studies discussed in the article was that there were no differences attributable to "big data" or "algorithm," but only differences in race (based on skin color or names).

taxwonk 01-05-2015 03:42 PM

Re: For Sebby
 
Quote:

Originally Posted by ThurgreedMarshall (Post 492523)

Very interesting piece. It's also what I was talking about a couple weeks back when I copped to being racist. It's almost impossible not to be. We are programmed from birth. Every single day, I am faced with a situation where I have to ask myself, "why am I making this choice?" And those are just the times I catch myself.

taxwonk 01-05-2015 03:44 PM

Re: For Sebby
 
Quote:

Originally Posted by Sidd Finch (Post 492532)
That is really disturbing and upsetting.


Of course, most of those studies happened before we had a black president. It's all better now, right?

"We like to talk about being a post-racial society. That shit don't fly in Mississippi."

- Tom Edge, director, Southern Foodways Alliance

Greedy,Greedy,Greedy 01-05-2015 03:44 PM

Re: For Sebby
 
Quote:

Originally Posted by Sidd Finch (Post 492535)
Huh? The whole point of the studies discussed in the article was that there were no differences attributable to "big data" or "algorithm," but only differences in race (based on skin color or names).

For college admissions patterns, it's been found that interviewers generally prefer people like them - not just by race but also by economic background. To some extent big data is a solution to that problem, because you should be able to program to eliminate that personal bias.

But, the downside is, programmed admissions tend to be less diverse and not as prone to pick out the intellectual. Your quota of white suburbanites doesn't tend to be the ones who are interesting and quirky, but instead the ones who know how to game the system to deliver numbers without having real intrinsic interest. They all play a varsity sport, but one without as much competition from the real jocks (e.g., sailing team or fencing club), they have great grades in easy classes, they have enough extracurriculars but not too many...

I think the only real solution is to carefully hire weirdos as admissions officers, but I'm not anticipating that any time soon. The Dangerfields don't want to have their little snookums go back to their alma mater and discover that it's being represented by Zeebo and Wiploc.

Adder 01-05-2015 03:46 PM

Re: For Sebby
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492534)
I don't see this improving a whole hell of a lot anytime soon. The shift toward hiring/renting/giving credit based on surface information is only increasing. Big data is going to bifurcate us unfairly and arbitrarily more than we were ever able to on our own in the past.

Hm. Theoretically big data will actually be able to replace the prejudices of the past with things that are actually predictive.

Certainly some things that are predictive will correlate with race (or other historically disadvantaged classes) but others will not, leaving the black people who do not have those characteristics at least arguably better off.

"We won't lend to you because we have a lot of data showing you're a bad credit risk" is a different problem than "we won't lend to you because you're black" and, if the data's really there, substantially less unfair and arbitrary.

taxwonk 01-05-2015 03:47 PM

Re: For Sebby
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492534)
I don't see this improving a whole hell of a lot anytime soon. The shift toward hiring/renting/giving credit based on surface information is only increasing. Big data is going to bifurcate us unfairly and arbitrarily more than we were ever able to on our own in the past.

I think lawsuits over discrimination-by-algorithm are going to become more frequent and much larger in the next few decades.

Jumping off a bit, management and governance from the "helicopter view" are going to make a shit show of society, and I don't see any hope of us stopping it. There's a delusion being bought by almost all people with power that huge organizations, markets, and even countries can be fully understood and managed by viewing mere data about how they operate in aggregate.*

This thinking isn't irrational, of course, but we don't have to look far to see its Achilles' heel - the 2008 Crash. The aggregate data never provide a complete picture of what's going on at the street level. God only knows how much damage and unknown risk we'll cause by applying such know-it-all-ism and total reliance on necessarily blunt data in realms beyond finance and insurance.

______
*Wall Street would tell us otherwise, but their broad analyses only appear accurate because it's all the same self-reinforcing data passed back and forth between and among the same limited actors, upon which those actors engage in herd investing, the timing of which creates profits. You don't have to have accurate data to win at musical chairs.

Of course that thinking is irrational. Pick a stereotype, any stereotype. Now, list the number of people you know who fit it. Compare to the number of people you know who don't.

taxwonk 01-05-2015 03:50 PM

Re: For Sebby
 
Quote:

Originally Posted by Adder (Post 492539)
Hm. Theoretically big data will actually be able to replace the prejudices of the past with things that are actually predictive.

Take away the "theoretically" and this may be the dumbest thing ever said. Leave the word in and you're only in the top 10. For what it's worth, I don't really think you believe either one.

Greedy,Greedy,Greedy 01-05-2015 03:54 PM

Re: For Sebby
 
Quote:

Originally Posted by Adder (Post 492539)
Hm. Theoretically big data will actually be able to replace the prejudices of the past with things that are actually predictive.

Certainly some things that are predictive will correlate with race (or other historically disadvantaged classes) but others will not, leaving the black people who do not have those characteristics at least arguably better off.

"We won't lend to you because we have a lot of data showing you're a bad credit risk" is a different problem than "we won't lend to you because you're black" and, if the data's really there, substantially less unfair and arbitrary.

Predictive of what?

[Note: rhetorical question, no answer needed. Really.]

sebastian_dangerfield 01-05-2015 03:59 PM

It was HAL 9000!
 
Quote:

Originally Posted by Sidd Finch (Post 492535)
Huh? The whole point of the studies discussed in the article was that there were no differences attributable to "big data" or "algorithm," but only differences in race (based on skin color or names).

Hence the caveat before the last section, "Jumping off..."

I see a future in which algorithms are used to sort people based on criteria discrimination law was designed to eliminate. And then used as defenses when they're found to be doing so. "It wasn't me. It was the computer system." I've seen that defense myself in a discrimination case. It was not successful, but it was a legitimate defense, and can be easily employed given the increasing automation of everything.

taxwonk 01-05-2015 04:06 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492545)
Hence the caveat before the last section, "Jumping off..."

I see a future in which algorithms are used to sort people based on criteria discrimination law was designed to eliminate. And then used as defenses when they're found to be doing so. "It wasn't me. It was the computer system." I've seen that defense myself in a discrimination case. It was not successful, but it was a legitimate defense, and can be easily employed given the increasing automation of everything.

No. It isn't. We all know good and well that any statistical study is only going to reflect the prejudices of the designer. Anything other than a study that is purely quantitative is going to reflect bias that will be both racial and socioeconomic. Unless you're prepared to give a three-bedroom house on the Main Line exactly the same weighting as a three-bedroom house in the shittiest neighborhood in Philly, your algorithm reflects bias.

sebastian_dangerfield 01-05-2015 04:09 PM

Re: For Sebby
 
Quote:

Originally Posted by taxwonk (Post 492540)
Of course that thinking is irrational. Pick a stereotype, any stereotype. Now, list the number of people you know who fit it. Compare to the number of people you know that don't.

I'm not saying stereotyping is rational. I'm saying the assumption one can deduce entirely what is taking place or will take place in any market, organization, or state based solely on aggregate blunt data is rational. It's a seductive proposition, playing to man's idiot belief he can conquer risk.

sebastian_dangerfield 01-05-2015 04:11 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492547)
No. It isn't. We all know good and well that any statistical study is only going to reflect the prejudices of the designer. Anything other than a study that is purely quantitative is going to reflect bias that will be both racial and socioeconomic. Unless you're prepared to give a three-bedroom house on the Main Line exactly the same weighting as a three-bedroom house in the shittiest neighborhood in Philly, your algorithm reflects bias.

It's a legitimate defense in that a court allowed it to be raised. "Sustainable" is perhaps the better legal term.

ETA: You'd be surprised how many people would disagree with the argument, "there are lies, damned lies, and statistics."

ThurgreedMarshall 01-05-2015 04:21 PM

Re: For Sebby
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492534)
I don't see this improving a whole hell of a lot anytime soon. The shift toward hiring/renting/giving credit based on surface information is only increasing. Big data is going to bifurcate us unfairly and arbitrarily more than we were ever able to on our own in the past.

I think lawsuits over discrimination-by-algorithm are going to become more frequent and much larger in the next few decades.

Jumping off a bit, management and governance from the "helicopter view" are going to make a shit show of society, and I don't see any hope of us stopping it. There's a delusion being bought by almost all people with power that huge organizations, markets, and even countries can be fully understood and managed by viewing mere data about how they operate in aggregate.*

This thinking isn't irrational, of course, but we don't have to look far to see its Achilles' heel - the 2008 Crash. The aggregate data never provide a complete picture of what's going on at the street level. God only knows how much damage and unknown risk we'll cause by applying such know-it-all-ism and total reliance on necessarily blunt data in realms beyond finance and insurance.

______
*Wall Street would tell us otherwise, but their broad analyses only appear accurate because it's all the same self-reinforcing data passed back and forth between and among the same limited actors, upon which those actors engage in herd investing, the timing of which creates profits. You don't have to have accurate data to win at musical chairs.

I am having trouble reconciling the fact that you have written English words in order such that they are actual sentences without having the ability to read.

TM

Adder 01-05-2015 04:22 PM

Re: For Sebby
 
Quote:

Originally Posted by taxwonk (Post 492541)
Take away the "theoretically" and this may be the dumbest thing ever said. Leave the word in and you're only in the top 10. For what it's worth, I don't really think you believe either one.

Believe what you like, but I'm a whole lot more comfortable with lending decisions, for example, being made on the basis of a whole lot of data analysis than because the loan officer doesn't like the sound of the name on the form.

sebastian_dangerfield 01-05-2015 04:25 PM

Re: For Sebby
 
Quote:

Originally Posted by ThurgreedMarshall (Post 492550)
I am having trouble reconciling the fact that you have written English words in order such that they are actual sentences without having the ability to read.

TM

Do I have to say I agree with the article? Seems a waste of effort. How couldn't I? I read it twice by the way, having also scanned it in the Times yesterday.

I decided to offer a thought I had after reading it. This seemed the more interesting thing to do.

Adder 01-05-2015 04:25 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492545)
I see a future in which algorithms are used to sort people based on criteria discrimination law was designed to eliminate.

Why? Because the data crunchers and their bosses just really want to discriminate?

Nah. They want to make money. They will end up using criteria that correlate with race but that they can show actually have meaning for the decision they are making.

Leaving the different and more difficult problem of how to level the playing field.

sebastian_dangerfield 01-05-2015 04:35 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492553)
Why? Because the data crunchers and their bosses just really want to discriminate?

Nah. They want to make money. They will end up using criteria that correlate with race but that they can show actually have meaning for the decision they are making.

Leaving the different and more difficult problem of how to level the playing field.

Data crunchers can't think fast enough (to borrow the article's term). All is sacrificed to risk minimization and maximum efficiency. If it's found that an algorithm gets better results using prohibited bases for credit denial/renting/hiring, there will be pressure to nevertheless use it. And given many algorithms are self-tweaking, the algorithm itself might engage in the prohibited discrimination without any human programming toward doing so, providing the desired unlawful result and alibi all in one.
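Sebby's self-tweaking point can be made concrete with a toy sketch (entirely hypothetical, not any real lender's system): give an automated feature selector only "neutral" inputs plus a history of biased decisions, and it will settle on whichever input best reproduces that history, even when that input is nothing but a proxy for a protected class the selector never sees.

```python
import random

random.seed(0)

# Toy world: 'protected' is never shown to the model, but 'zip_ok' matches
# protected status about 90% of the time. 'high_income' is pure noise here.
# The historical approvals are deliberately extreme in their bias so the
# effect stays legible in a few lines.
history = []
for _ in range(4000):
    protected = random.random() < 0.5
    zip_ok = int(protected == (random.random() < 0.1))   # ~90% race proxy
    high_income = int(random.random() < 0.5)             # uninformative
    approved = random.random() < (0.15 if protected else 0.85)
    history.append({"zip_ok": zip_ok, "high_income": high_income,
                    "approved": approved})

def fit(feature):
    """Accuracy of the rule 'approve iff feature == 1' on past decisions."""
    return sum((a[feature] == 1) == a["approved"] for a in history) / len(history)

# The 'self-tweaking' step: keep whichever feature best reproduces history.
chosen = max(["zip_ok", "high_income"], key=fit)
print(chosen, round(fit(chosen), 2))
```

On this biased history the selector picks `zip_ok` - the race proxy - with no human ever programming it to discriminate, which is the alibi problem in miniature.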

taxwonk 01-05-2015 04:37 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492549)
It's a legitimate defense in that a court allowed it to be raised. "Sustainable" is perhaps the better legal term.

ETA: You'd be surprised how many people would disagree with the argument, "there are lies, damned lies, and statistics."

Shit, I'm not even a litigator by habit, but give me 5 minutes with some programmer or expert on the stand and I could shred that argument.

"How did you place a value on Property A?"
We looked at comparables.
"Comparables?"
Yeah, you know, what properties with similar characteristics in similar neighborhoods sold for.
"So the value was dependent on neighborhood. Is that what you're saying?"
Well, yes.
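Wonk's cross-examination reduces to a few lines of arithmetic (the sale prices are invented for illustration): a comparables-based appraisal is just an average over neighborhood sales, so two physically identical houses inherit whatever history priced their respective neighborhoods.

```python
# A comparables ('comps') appraisal in miniature: value = average of recent
# sales of similar homes in the same neighborhood. Hypothetical numbers.
recent_sales = {
    "main_line": [510_000, 495_000, 530_000],
    "philly_block": [68_000, 74_000, 61_000],
}

def appraise(neighborhood, sales=recent_sales):
    """Value a property as the mean of its neighborhood comparables."""
    comps = sales[neighborhood]
    return sum(comps) / len(comps)

# Two identical three-bedroom houses; only the neighborhood differs.
print(appraise("main_line"))     # dominated by Main Line comps
print(appraise("philly_block"))  # dominated by Philly comps
```

The method is facially quantitative, yet the output is entirely a function of the neighborhood history fed into it, which is wonk's point.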

taxwonk 01-05-2015 04:38 PM

Re: For Sebby
 
Quote:

Originally Posted by Adder (Post 492551)
Believe what you like, but I'm a whole lot more comfortable with lending decisions, for example, being made on the basis of a whole lot of data analysis than because the loan officer doesn't like the sound of the name on the form.

What data?

taxwonk 01-05-2015 04:39 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492553)
Nah. They want to make money. They will end up using criteria that correlate with race but that they can show actually have meaning for the decision they are making.

What criteria?

ThurgreedMarshall 01-05-2015 04:45 PM

Re: For Sebby
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492552)
Do I have to say I agree with the article? Seems a waste of effort. How couldn't I? I read it twice by the way, having also scanned it in the Times yesterday.

I decided to offer a thought I had after reading it. This seemed the more interesting thing to do.

Here's the problem with your post: "The shift toward hiring/renting/giving credit based on surface information is only increasing."

That sentence basically sums up the issue with your previous posts (remember, the "jumping off point" in your last post was your retort to why redlining wasn't really discrimination) that I was trying to address by citing this article.

I don't want you to say, "I agree with the article," because what's in the article just *is.*

TM

Adder 01-05-2015 04:51 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492554)
Data crunchers can't think fast enough (to borrow the article's term).

You're misusing Kahneman's term. "Fast" thinking is heuristic and intuitive. I'm not exactly sure how you can crunch data with that kind of thinking.

Quote:

If it's found that an algorithm gets better results using prohibited bases for credit denial/renting/hiring, there will be pressure to nevertheless use it.
That's where you lose me. It won't be better. It's not true that, for example, Thurgreed is a worse credit risk than Ty's hypothetical deadbeat cousin, and a model that assumes so will perform worse and cost the organization money.

It may be true that someone from the wrong neighborhood, the wrong school, the wrong type of job or with the wrong history with the legal system is a worse credit risk, and all of those things may correlate closely with race and lead to results that we think are unfair, but that's a different issue.

Anyway, you don't need big data to do what you're talking about. You're now saying that they will ignore the data and just use race. I'm skeptical that they will bother to do the analysis if that's where they intend to come out.

Adder 01-05-2015 04:56 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492557)
What criteria?

Yes, wonk, the data will bear the stamp of our unequal history and thus reliance on it will not bring an even playing field.

As I've said repeatedly, that's a different issue from Sebby's assertion that people will use it with the specific intent to discriminate.

They don't need data for that.

sebastian_dangerfield 01-05-2015 05:08 PM

Re: It was HAL 9000!
 
Quote:

You're misusing Kahneman's term. "Fast" thinking is heuristic and intuitive. I'm not exactly sure how you can crunch data with that kind of thinking.
The holy grail of algorithms is one that is heuristic and intuitive, something that can replace or at least approximate human thinking, with the added benefit of being able to do math and make decisions at 100X the speed. Again, cost minimization (no salary or benefits to pay) and efficiency maximization (it does what humans do, but better).

Quote:

Anyway, you don't need big data to do what you're talking about.
I didn't say you did. But you can engage in a whole lot more intentional and unintentional discrimination with big data than without it.

Quote:

You're now saying that they will ignore the data and just use race.
No. I'm saying some will do it if they find a correlation that enhances their ability to predict future events and minimize risk. How many is some? I don't know. And some others will merely include prohibited criteria among non-prohibited criteria if they find that such blending enhances their predictive capacities.

Quote:

I'm skeptical that they will bother to do the analysis if that's where they intend to come out.
They don't intend to come out anywhere. They merely seek to avoid risk and predict the future for profit. And the actors we're talking about, at least in insurance and finance, will not lose sleep over using prohibited criteria. Particularly where they can blame it on a learning algorithm.

Not Bob 01-05-2015 05:12 PM

This is not my beautiful house!
 
Quote:

Originally Posted by Adder (Post 492553)
They want to make money. They will end up using criteria that correlate with race but that they can show actually have meaning for the decision they are making.

Pardon my skepticism, but I have long heard arguments about how the Invisible Hand of The Market prevents discrimination because companies that "irrationally" discriminate (i.e., on the basis of race, sex, religion, etc.) would be forced out of business by losing out to companies that "rationally" discriminate (i.e., on the basis of whatever the market values - skills, credit-risk, speed of fastball, etc.).

With the notable exception of professional sports, I don't think that this has occurred. So the idea that the lenders won't discriminate on race now that they have all of these wonderful (albeit imperfect) tools and databases because they only care about making money is Not Credible.

sebastian_dangerfield 01-05-2015 05:13 PM

Re: It was HAL 9000!
 
Quote:

As I've said repeatedly, that's a different issue from Sebby's assertion that people will use it with the specific intent to discriminate.
No. I am not saying they all intend to discriminate. The overwhelming majority will not. They will intend nothing more than to make money. The discrimination will be largely unintentional. Think of it as algorithmic laziness. Company X finds that, rather than spending time assessing more complex traditional data on hiring, it can shortcut using the number of certain letters in a first name (which just happen to indicate minority background). If the Company finds this is effective and saves time, it will be pressured to use this discriminatory technological advantage.
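The shortcut Sebby describes is also easy to audit: compare selection rates across groups under the facially neutral rule, using something like the four-fifths test from the US Uniform Guidelines on Employee Selection Procedures. A toy audit, with invented names and group labels:

```python
# Toy disparate-impact audit of a 'neutral' screening shortcut.
# Names and group labels are made up for illustration only.
applicants = [
    ("Anne", "A"), ("Bob", "A"), ("Sue", "A"), ("Tom", "A"),
    ("DeShawn", "B"), ("Lakisha", "B"), ("Jamal", "B"), ("Keisha", "B"),
]

# The shortcut: screen in anyone whose first name is 5 letters or fewer.
selected = [(name, grp) for name, grp in applicants if len(name) <= 5]

def rate(group):
    """Selection rate for a group under the shortcut."""
    picked = sum(1 for _, g in selected if g == group)
    total = sum(1 for _, g in applicants if g == group)
    return picked / total

# Four-fifths test: a ratio below 0.8 flags adverse impact.
impact_ratio = rate("B") / rate("A")
print(rate("A"), rate("B"), impact_ratio)
```

The rule never mentions group membership, yet the audit flags it immediately - which cuts both ways in Sebby's scenario: the shortcut is cheap to adopt, but also cheap to catch.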

Adder 01-05-2015 05:15 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492561)
They don't intend to come out anywhere. They merely seek to avoid risk and predict the future for profit. And the actors we're talking about, at least in insurance and finance, will not lose sleep over using prohibited criteria. Particularly where they can blame it on a learning algorithm.

You're still assuming that stereotypes are profit-maximizing. I think that's highly unlikely to be true, especially in a big data world.

sebastian_dangerfield 01-05-2015 05:20 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492564)
You're still assuming that stereotypes are profit-maximizing. I think that's highly unlikely to be true, especially in a big data world.

No, I'm saying if they are found to be profit-maximizing they will be used. Some will be. Some will not be. I'm talking about the former.

taxwonk 01-05-2015 05:22 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492559)
It may be true that someone from the wrong neighborhood, the wrong school, the wrong type of job or with the wrong history with the legal system is a worse credit risk, and all of those things may correlate closely with race and lead to results that we think are unfair, but that's a different issue.

That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).

That is the exact issue.

taxwonk 01-05-2015 05:24 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492560)
Yes, wonk, the data will bear the stamp of our unequal history and thus reliance on it will not bring an even playing field.

As I've said repeatedly, that's a different issue from Sebby's assertion that people will use it with the specific intent to discriminate.

They don't need data for that.

They don't need data to have an intent to discriminate. But here you are, yourself, using it as an excuse. That excuse is just that. The intent is to discriminate.

Adder 01-05-2015 05:25 PM

Re: This is not my beautiful house!
 
Quote:

Originally Posted by Not Bob (Post 492562)
Pardon my skepticism, but I have long heard arguments about how the Invisible Hand of The Market prevents discrimination because companies that "irrationally" discriminate (i.e., on the basis of race, sex, religion, etc.) would be forced out of business by losing out to companies that "rationally" discriminate (i.e., on the basis of whatever the market is - skills, credit-risk, speed of fastball, etc.).

With the notable exception of professional sports, I don't think that this has occurred.

Really? It's the exceedingly rare company that's okay with the world thinking it discriminates, and that's not entirely about anti-discrimination laws.

Moreover, those "shocked" hiring managers in the article, who believe they value diversity and could not believe that they were unconsciously discriminating, would not have the opportunity to do so if decisions were guided by actual data.

Quote:

So the idea that the lenders won't discriminate on race now that they have all of these wonderful (albeit imperfect) tools and databases because they only care about making money is Not Credible.
We probably need to deal with the uncomfortable realization that redlining may well have been profit maximizing for a banking system operating in our racist society. Or at least not money-losing when all of the other banks are doing it too.

But no, market incentives will not fix everything and regulation is needed too.

taxwonk 01-05-2015 05:27 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492563)
No. I am not saying they all intend to discriminate. The overwhelming majority will not. They will intend nothing more than to make money. The discrimination will be largely unintentional. Think of it as algorithmic laziness. Company X finds that, rather than spending time assessing more complex traditional data on hiring, it can shortcut using the number of certain letters in a first name (which just happen to indicate minority background). If the Company finds this is effective and saves time, it will be pressured to use this discriminatory technological advantage.

I call bullshit. They intend nothing more than to make money. Fine, the way you make money is by not lending it to schwarzes, hiring schwarzes, or dating schwarzes.

But they aren't being racist. It's just business, Mikey. Nothing personal.

Adder 01-05-2015 05:29 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492566)
That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).

That is the exact issue.

It's not the issue we've been discussing, wonk. We've been discussing whether the rise of big data is going to lead to more or less overt discrimination. Sebby contends that it will, while I contend it will not.

None of that goes to how to address the accumulated effects of history. Those issues will still exist. They are much bigger issues and much harder to address, and, sadly, are still largely ignored. But they were not the conversation we've been having and eliminating racial discrimination does not eliminate those issues.

Greedy,Greedy,Greedy 01-05-2015 05:30 PM

Re: This is not my beautiful house!
 
Just a point of order, is using the term "heuristic" a corollary to Godwin's Law or the basis for a separate but parallel law?

Adder 01-05-2015 05:33 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492567)
They don't need data to have an intent to discriminate. But here you are, yourself, using it as an excuse. That excuse is just that. The intent is to discriminate.

I want you to go back and re-read this discussion, because this is not okay. I don't appreciate you accusing me of something I've not done and I assume you're only doing so because you've misinterpreted something I've said.

If I've been too charitable, you can go make use of Atticus's angry fist of god with yourself if you think I've excused discrimination somehow.

Adder 01-05-2015 05:35 PM

Re: This is not my beautiful house!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492571)
Just a point of order, is using the term "heuristic" a corollary to Godwin's Law or the basis for a separate but parallel law?

Take it up with Kahneman and Tversky.

Greedy,Greedy,Greedy 01-05-2015 05:36 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492566)
That's a crock and we both know it. The fact that those things correlate with race is unfair and that's NOT a different issue. It's the same issue because those neighborhood lines were drawn that way for a reason, the school sucks because of the place it's located, and the job is directly related to the education, neighborhood, and yes, race or socioeconomic status (something often, but not exclusively, tied to race).

That is the exact issue.

Of course it's a crock. Everyone needs to go home, read Gould's "The Mismeasure of Man", and come back to talk again. The issue was settled long before big data came along.

Greedy,Greedy,Greedy 01-05-2015 05:40 PM

Re: This is not my beautiful house!
 
Quote:

Originally Posted by Adder (Post 492573)
Take it up with Kahneman and Tversky.

They ripped it off from Habermas, anyways. Wherever the word appears, someone is trying to make a small and questionable thought seem big and profound.

Adder 01-05-2015 05:41 PM

Re: This is not my beautiful house!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492575)
They ripped it off from Habermas, anyways. Wherever the word appears, someone is trying to make a small and questionable thought seem big and profound.

Would you say that "people don't actually think about or know why they make the decisions they do" is a small and questionable thought?

Greedy,Greedy,Greedy 01-05-2015 05:42 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492570)
It's not the issue we've been discussing, wonk. We've been discussing whether the rise of big data is going to lead to more or less overt discrimination. Sebby contends that it will, while I contend it will not.

It is just a tool, and a tool that will be used by those with deep biases and with hidden biases. It will only be unbiased if you can find an unbiased person to use it, and, as we all know, the idea of an unbiased person is a heuristic construct.

By the way, I enjoy the phrasing of your summary. "We've been discussing whether A will lead to B or C. Sebby says it will, and you say it won't." So your position is it leads nowhere at all, and Sebby thinks it will lead to one or the other of B or C?

