LawTalkers

LawTalkers (http://www.lawtalkers.com/forums/index.php)
-   Politics (http://www.lawtalkers.com/forums/forumdisplay.php?f=16)
-   -   Is Ted Cruz Satan? Discuss. (http://www.lawtalkers.com/forums/showthread.php?t=875)

ThurgreedMarshall 01-06-2015 02:18 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492625)
I don't think we are, either, except that I am saying that big data shouldn't be used, at least not without consequences (i.e., a person can sue and recover for discriminatory effect), unless and until those corrections are made. Allowing big data to be built on a racially biased platform just further institutionalizes the bias and makes it harder to identify and correct.

I'm pretty sure I am alone on that point. And I don't really expect anyone to give up their shiny new toy.

I would be with you if there were any chance that the foundation of bias on which absolutely everything has been built were to be fixed during the life of any person in being plus 100 years.

TM

Adder 01-06-2015 02:24 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492624)
We were just following orders.

As I have never made any attempt to keep them straight, I need a ruling on which type of whiff this is.

Quote:

That wasn't Wonk's point. Wonk's point was that the humans writing the algorithms are going to set up parameters that reflect racial and economic disparity. Big Data isn't likely to do anything. Big data will do exactly what it's told to do, which is going to be the same racial and economic profiling the underwriters, loan officers, and IT people use now. What's the old expression, garbage in, garbage out?
Please excuse me for improving your point for you.

The concern you state here is a small-data concern. Like your earlier list of things already in use, drawing lines by zip code or whatever is what's already happening.

Big data is going to mean finding patterns in your tweets, to take only a mildly far-fetched example, that suggest that you're a better or worse credit risk.

Those things are only going to be included if the numbers have predictive value. Sure, deciding which things to take a look at will start with a person (in the beginning, anyway), and thus will be subject to human biases, but they are only going to get used if the numbers work.

Which is a different critter from unconsciously going into the no pile because of your name.
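
A minimal sketch of the gate Adder is describing, in which a proposed signal only gets included if it actually predicts the outcome. Everything in it is an assumption for illustration: the data is synthetic, the feature names (tweet_pattern_score, favorite_color_code) are made up, and the 0.55 AUC cutoff is arbitrary.

Code:

import random

random.seed(0)

def auc(scores, labels):
    """Probability that a randomly chosen bad outcome scores higher than a good one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic lending history: 1 = defaulted, 0 = repaid.
labels = [int(random.random() < 0.2) for _ in range(2000)]

# Two hypothetical signals a human proposed looking at:
# one weakly related to the outcome, one pure noise.
tweet_pattern_score = [random.gauss(0.3 if y else 0.0, 1.0) for y in labels]
favorite_color_code = [random.gauss(0.0, 1.0) for _ in labels]

THRESHOLD = 0.55  # arbitrary stand-in for "the numbers work"
for name, feature in [("tweet_pattern_score", tweet_pattern_score),
                      ("favorite_color_code", favorite_color_code)]:
    score = auc(feature, labels)
    print(f"{name}: AUC = {score:.3f} -> {'include' if score > THRESHOLD else 'drop'}")

Note that the choice of which signals to test at all is still a human one, which is where Adder concedes bias can creep in; the gate only filters what someone thought to propose.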

Adder 01-06-2015 02:27 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492630)
I think the article generally said people denied having intent, not that they didn't actually have intent.

Lots of people say there is not a discriminatory bone in their body, but then intentionally give preference to alumni of their own lily white school, to people who their own lily white friends recommend, to people who grew up in their icky white suburb or to other people who fit their own tribal characteristics. Hiring nothing but white people while denying discriminatory intent is a time-honored American tradition.

But interviewing people who are shocked at their own discriminatory actions is like asking the guy walking out of your house holding a jewel box if he is stealing it. "Stealing what? Oh, this? Just have to bring it to the shop and fix a blown gasket. I don't know how it found its way into my hands. Is this yours? It looks like mine."

I wish it was that simple. I don't think it is.

ThurgreedMarshall 01-06-2015 02:28 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492635)
Right now, that means tackling the algorithms. Because that's what's being used...

Except for the many examples in the article?

TM

Adder 01-06-2015 02:29 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Hank Chinaski (Post 492641)
so the poor wrecked families left in Detroit need to be allowed to buy homes in real nice neighborhoods? that will solve it?

Does seem just a wee tad over-simplified, yeah.

Greedy,Greedy,Greedy 01-06-2015 02:40 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Hank Chinaski (Post 492641)
so the poor wrecked families left in Detroit need to be allowed to buy homes in real nice neighborhoods? that will solve it?



It's all about who you sell your house to, and how much it pisses off your neighbors.

sebastian_dangerfield 01-06-2015 02:41 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by ThurgreedMarshall (Post 492647)
Except for the many examples in the article?

TM

Christ... Do both. You know I'm not suggesting ignoring the sort of stuff in the article. But that stuff is really tough to address, as Wonk earlier explained (far more eloquently than I might here) in his post about choosing a neighborhood.

If you want to try to preclude discrimination in the future, as much as possible, you have to start addressing the big data angle.

ThurgreedMarshall 01-06-2015 02:45 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492636)
I didn't have intent. I was actually shocked and ashamed at the level of built-in racism I displayed when I actually started paying attention. Stuff like "I can't be racist; I dated the black girl (that's right, there was one) in my class, I was friends with three of the four black guys in my class and two of the three in the classes above and below."

Doesn't make me any less racist in action, but I do maintain there was no overt intent.

I think you're being a bit hard on yourself. Everyone's actions are colored by stereotypes. It is unavoidable. Although my wife's features are hardly European, I tend to prefer light-to-medium-skin-tone black and latino women with European-ish features. That's cultural programming. It doesn't mean I don't find all types of women attractive (and it doesn't mean I have a 'type'*), but I recognize how racism colors society and molds who we are.

As for your personal example, it might have made more sense to say that when you were looking to move you avoided poor, black neighborhoods because EMTs, police, firemen, etc., do not have the same response rates in those neighborhoods as they do in white neighborhoods, or that hospitals in poor, black neighborhoods are not as well equipped to deal with emergencies as those in white neighborhoods (the ratio of doctors-per-patient, that is; I think hospitals in poor neighborhoods always seem to have excellent triage units, by comparison).

But I understand that part of your decision is based on your perception of where violent crime occurs. I don't think it's a racist perception unless it's based on a belief that black people are naturally more violent than white people, and not on the understanding of the mix of poverty, neglect, oppression, discrimination, politics, lack of opportunity, etc. that created the atmosphere in which that violence thrives.

TM

*I hate when people say they have a type or she's not my type. I suppose it's the same as a preference, but it really sounds awful to me for some reason.

Hank Chinaski 01-06-2015 02:46 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492650)
It's all about who you sell your house to, and how much it pisses off your neighbors.

maybe I'm whiffing but you are talking about people who cannot pay their water bills. As part of the bankruptcy the city contractors were charged with turning off the water to tens of thousands of homes. If "I sell my house to them," I think my heirs would be the ones most pissed off.

ThurgreedMarshall 01-06-2015 03:12 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492651)
Christ... Do both. You know I'm not suggesting ignoring the sort of stuff in the article. But that stuff is really tough to address, as Wonk earlier explained (far more eloquently than I might here) in his post about choosing a neighborhood.

Look, weeks ago I said lots of decisions are being made on the basis of race. You said, "No, it just looks that way because of how people use data." I explained why you were wrong or at least what you were overlooking. You ignored my point in favor of some convoluted explanation of how decisions are made based on whatever your understanding is of how banks and employers use algorithms.

Yesterday I posted an article outlining how lots of different types of decisions are being made based on race. Apparently something bad will happen to you if you acknowledge that this still happens, because you again jumped to a discussion about how businesses use algorithms for everything. Maybe you don't want to discuss what may be abundantly obvious in favor of a discussion of the complex business models you find so very interesting, but that's not the way this conversation started. And frankly, it sure seems like you still think that all decisions are currently being made based on mathematical formulas* (whether or not you have acknowledged that some of them may be flawed). They are not. And that's the point. It wasn't my original point, but it was in my response to you weeks ago.

I made it again just now when I said to Wonk that "Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods." Your direct response to that was, "Right now, that means tackling the algorithms. Because that's what's being used..." Please explain to me why I shouldn't read that as you once again ignoring (at best, and refuting, at worst) my original response to you and the point of the article.

Maybe something different is being played out in your head?

TM

*Even if I acknowledge that huge businesses lean on these models, the country, our economy, and so very much of it is made up of people and businesses that do not use complex algorithms to make every fucking decision.

Replaced_Texan 01-06-2015 03:23 PM

Re: Is Ted Cruz Satan? Discuss.
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492629)
Dunno if anyone else has been following the Ched Evans case. He's a Welsh & Sheffield United soccer player who was convicted of rape and served his sentence, and now wants to resume his career. Protests seem to be closing the door at every club that might take him.

Here's one good take:
http://www.theguardian.com/football/...aw-marina-hyde

Here's another:
http://www.theguardian.com/football/...s-money-morals

Not sure what to think about this.


He's probably not worth it. He's not Luis Suarez or Ben Roethlisberger or (maybe) Adrian Peterson, with loads of talent to make up for the bad behavior. He hasn't even done the Michael Vick apology tour, which at least acknowledged that he did something wrong.

Maybe he's cheap, and that's why Oldham thinks he might be worth it, or maybe Oldham doesn't think the public cares that much about rape. I suspect it will boil down, as it always does, to the sponsors.

I don't think he "deserves" a second chance, but then I don't think anyone deserves anything, especially in situations like these. Fame is such a fleeting thing, and it's as much luck and circumstance as anything else.

Adder 01-06-2015 03:42 PM

Re: Is Ted Cruz Satan? Discuss.
 
Quote:

Originally Posted by Replaced_Texan (Post 492664)
He's probably not worth it. He's not Luis Suarez or Ben Roethlisberger or (maybe) Adrian Peterson, with loads of talent to make up for the bad behavior. He hasn't even done the Michael Vick apology tour, which at least acknowledged that he did something wrong.

Maybe he's cheap, and that's why Oldham thinks he might be worth it, or maybe Oldham doesn't think the public cares that much about rape. I suspect it will boil down, as it always does, to the sponsors.

I don't think he "deserves" a second chance, but then I don't think anyone deserves anything, especially in situations like these. Fame is such a fleeting thing, and it's as much luck and circumstance as anything else.

I don't know what to think about Evans. Seems like a particularly difficult variation on the theme, as you say.

But I have thoughts on Peterson. While I've permanently lost respect for him, will never wear my #28 jersey again, and will never really root for him on an individual level, I want him back on the Vikings (well, did, now it's the off-season and he's got a big contract and the business side of things needs to be considered).

I'm with you that "deserves" does not have much of anything to do with it. I want(ed) Peterson back because he makes the team better and gives them a better chance to win. I'm also certain he's not the only player on the team who has done something I find morally repugnant.

That said, I think he and his advisors (and maybe the union, although I think they have other interests in mind) have really botched the whole situation. No matter how he really feels, he should already have been doing and/or eager to do the kinds of things Goodell has insisted he do before being reinstated - counseling, parenting classes, public apologies, vowing to change. Instead, he's seemed to imply that he thinks his error was only one of degree.

Sidd Finch 01-06-2015 03:51 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492630)
I think the article generally said people denied having intent, not that they didn't actually have intent.

Lots of people say there is not a discriminatory bone in their body, but then intentionally give preference to alumni of their own lily white school, to people who their own lily white friends recommend, to people who grew up in their icky white suburb or to other people who fit their own tribal characteristics. Hiring nothing but white people while denying discriminatory intent is a time-honored American tradition.

But interviewing people who are shocked at their own discriminatory actions is like asking the guy walking out of your house holding a jewel box if he is stealing it. "Stealing what? Oh, this? Just have to bring it to the shop and fix a blown gasket. I don't know how it found its way into my hands. Is this yours? It looks like mine."

This is certainly true and a good point. Nonetheless, I believe that un-, sub-, or semi-conscious bias is real. To be honest, among people in their 40s or older at least (and younger people from many parts of the nation), I cannot fathom how that less-than-conscious bias would not exist, at least to some extent, given the culture in which people were raised. (And I further believe that conscious thought can counter it.)

Greedy,Greedy,Greedy 01-06-2015 04:00 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Hank Chinaski (Post 492654)
maybe I'm whiffing but you are talking about people who cannot pay their water bills. As part of the bankruptcy the city contractors were charged with turning off the water to tens of thousands of homes. If "I sell my house to them," I think my heirs would be the ones most pissed off.

My post was a veiled reference to a priest fucking joke, but, hey, I guess it was too obscure.

Or maybe the priest-fucking stuff wasn't too funny to begin with.

sebastian_dangerfield 01-06-2015 05:02 PM

Re: It was HAL 9000!
 
Quote:

Look, weeks ago I said lots of decisions are being made on the basis of race. You said, "No, it just looks that way because of how people use data." I explained why you were wrong or at least what you were overlooking. You ignored my point in favor of some convoluted explanation of how decisions are made based on whatever your understanding is of how banks and employers use algorithms.
I noted that decisions were made which appeared to be racist based solely on risk avoidance concerns. These decisions had discriminatory effects, but were not made with the intent to discriminate, but with the intent merely to "make money," as Adder put it. Part of making money is avoiding risk. You couch that however you like.

Quote:

Yesterday I posted an article outlining how lots of different types of decisions are being made based on race.
The article does not say all decisions made which have discriminatory impact are based on race. It notes merely that many discriminatory decisions are unconscious. Are you suggesting that even a blind algorithm which was not intended to discriminate, but does so because discriminating happens to dovetail with risk avoidance is nevertheless deciding based on race?

Quote:

Apparently something bad will happen to you if you acknowledge that this still happens, because you again jumped to a discussion about how businesses use algorithms for everything.
Apples and oranges. People do make decisions unconsciously based on race. I think that is such an obvious fact, I hijacked the point into what I thought was more interesting: How will discrimination be changed as we automate more and more of these decisions?

Quote:

Maybe you don't want to discuss what may be abundantly obvious in favor of a discussion of the complex business models you find so very interesting, but that's not the way this conversation started.
You got me. But that also means I agree with you, and with the article's observations. I disagree with the article's hopeful tone. What's unconsciously done is rarely fixed. Those behaviors are like heartbeats, or breathing.

Quote:

And frankly, it sure seems like you still think that all decisions are currently being made based on mathematical formulas* (whether or not you have acknowledged that some of them may be flawed). They are not. And that's the point. It wasn't my original point, but it was in my response to you weeks ago.
I don't think this at all. But I do think they will be, and soon. And I think if you want to combat future discrimination, there's lots more to be gained in avoiding mechanized electronic discrimination before it becomes a huge problem than there is in asking people to examine their unconscious biases.

Quote:

I made it again just now when I said to Wonk that "Right now, I'd settle for business approaches that aren't based on straight up de jure racist bullshit. Hire based on qualifications and not the sound of one's name. Give me an interest rate based on the credit of people with similar finances, not my race. Etc. Once we've tackled that, let's address the de facto discriminatory algorithm which draws its data from how we've unfairly educated whole groups of people or confined them to depressed neighborhoods." Your direct response to that was, "Right now, that means tackling the algorithms. Because that's what's being used..." Please explain to me why I shouldn't read that as you once again ignoring (at best, and refuting, at worst) my original response to you and the point of the article.
Because they aren't mutually exclusive propositions. Logically, I say pre-emptively adjust the technological thing that hasn't been polluted yet, and does not have human failings, to avoid it becoming infected with racism. Why? Because if we don't, as Wonk noted, it'll acquire all the prejudices of its users and creators. I'm not sure we can ever fully racism-proof the algorithms, but it's worth a try, and it's more likely to have results than suggesting people individually look at their own unconscious racism. Why would I say this? Because here's the thing about unconsciousness: A person doing something he doesn't even realize until after he's done it has a hard time either stopping the act or remedying its damage. Tweaking the algorithms that may engage in this discrimination in the future has a far greater chance of actual, measurable success.
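
A purely illustrative sketch of the simplest version of that pre-emptive adjustment: before any model trains, drop the protected attribute and any field that proxies for it too closely. The field names, synthetic data, and 0.6 correlation cutoff are assumptions for illustration (and statistics.correlation needs Python 3.10+); real de-biasing is much harder, since, as Wonk notes, bias can survive in the data that remains.

Code:

import random
from statistics import correlation  # Python 3.10+

random.seed(1)
n = 1000

# Hypothetical applicant records; 'group' stands in for the protected attribute.
group = [random.randint(0, 1) for _ in range(n)]
features = {
    "zip_code_score":  [g * 0.9 + random.gauss(0, 0.3) for g in group],  # strong proxy
    "income":          [random.gauss(60, 15) for _ in range(n)],
    "payment_history": [random.gauss(0.7, 0.1) for _ in range(n)],
}

PROXY_CUTOFF = 0.6  # arbitrary illustrative threshold
kept = []
for name, values in features.items():
    r = abs(correlation(values, group))
    if r >= PROXY_CUTOFF:
        print(f"dropping {name}: |corr with protected attribute| = {r:.2f}")
    else:
        kept.append(name)

print("fields passed on to the model:", kept)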

Quote:

Maybe something different is being played out in your head?
I'm not much interested in the human element because I hold little hope of it improving much in our lifetimes. Sorry to have hijacked. Can't help myself.

taxwonk 01-06-2015 05:40 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Adder (Post 492645)
Big data is going to mean finding patterns in your tweets, to take only a mildly far-fetched example, that suggest that you're a better or worse credit risk.

Those things are only going to be included if the numbers have predictive value. Sure, deciding which things to take a look at will start with a person (in the beginning, anyway), and thus will be subject to human biases, but they are only going to get used if the numbers work.

Which is a different critter from unconsciously going into the no pile because of your name.

Predicting my behavior based on my tweets is actually not a problem for me. Except that the algorithm is going to be built based on whether or not I knew that Sir Paul really didn't need jack shit in the way of musical boost from Kanye, or it's going to take into account whether I use certain abbreviations or anagrams. Again, the issue isn't whether the data is valid or not. The issue is what data is studied, and how it is weighted. Because an algorithm is going to have to weigh choices if it is to have any chance at all of producing a competitive advantage.

You (not you, personally and exclusively) keep trying to draw these distinctions between deep and shallow, fast or slow, big or little, as if they make a difference. All big data is, is a shitload of little data being looked at by a big computer array instead of by a roomful of grad students or junior associates.
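
A toy version of that point, in the form of a hand-built scorecard where someone literally decides what each piece of data is worth. Every field, weight, and applicant here is hypothetical; the two applicants have identical payment behavior, and the gap between their scores comes entirely from the proxy fields.

Code:

# A hand-tuned scorecard: each number is a human judgment call.
WEIGHTS = {
    "years_employed":   2.0,
    "late_payments":   -5.0,
    "prestige_school":  4.0,   # who gets into the "prestige" school?
    "zip_code_score":  -3.0,   # who lives in which zip code?
}

def credit_score(applicant):
    """Weighted sum of whatever fields the designer decided to study."""
    return sum(WEIGHTS[field] * applicant.get(field, 0.0) for field in WEIGHTS)

# Two applicants with identical payment behavior.
a = {"years_employed": 6, "late_payments": 1, "prestige_school": 0, "zip_code_score": 1}
b = {"years_employed": 6, "late_payments": 1, "prestige_school": 1, "zip_code_score": 0}

print(credit_score(a), credit_score(b))  # 4.0 vs. 11.0 -- the gap is entirely the proxies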

taxwonk 01-06-2015 06:02 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492679)
I noted that decisions were made which appeared to be racist based solely on risk avoidance concerns. These decisions had discriminatory effects, but were not made with the intent to discriminate, but with the intent merely to "make money," as Adder put it. Part of making money is avoiding risk. You couch that however you like.

Okay. I think I have this figured out. Risk avoidance is bad, at least as practiced in America today by bankers, insurers, employers, and any other economic actor that is large enough to be able to pretend that human beings are not responsible for their actions.

Decisions made "merely to make money" are inherently bad when the basis on which that risk avoidance is built is racially impacted. Note that I did not say racially motivated.

I don't give a fuck that your model is based on the mathematical determination that boys who went to Choate are more likely to wind up as managers or subject matter experts than boys who went to any public school in America. Who gets into Choate?

Same thing, someone earning $150,000/year in Bloomfield Hills is 64% less likely to default than someone earning $150,000 in downtown Detroit. So what? Who lives where?

Again, big data is just a shitload of small data. Some asshole still sits at a desk somewhere and decides what each piece of data is worth. Whether it's being made in the name of maximizing profits or not, somebody is still saying the black-sounding name or the Mexican neighborhood gets weighted less favorably.

The truth is, if people are still saying that "If I lend money to this black man or hire this Vietnamese woman, my risk profile is going to be X rather than Y," they are still saying nothing more than that colored folk is unreliable, and if they want to work here in America, why can't they bother to learn to speak American.

ThurgreedMarshall 01-06-2015 06:06 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492679)
I noted that decisions were made which appeared to be racist based solely on risk avoidance concerns. These decisions had discriminatory effects, but were not made with the intent to discriminate, but with the intent merely to "make money," as Adder put it. Part of making money is avoiding risk. You couch that however you like.



The article does not say all decisions made which have discriminatory impact are based on race. It notes merely that many discriminatory decisions are unconscious. Are you suggesting that even a blind algorithm which was not intended to discriminate, but does so because discriminating happens to dovetail with risk avoidance is nevertheless deciding based on race?



Apples and oranges. People do make decisions unconsciously based on race. I think that is such an obvious fact, I hijacked the point into what I thought was more interesting: How will discrimination be changed as we automate more and more of these decisions?



You got me. But that also means I agree with you, and with the article's observations. I disagree with the article's hopeful tone. What's unconsciously done is rarely fixed. Those behaviors are like heartbeats, or breathing.



I don't think this at all. But I do think they will be, and soon. And I think if you want to combat future discrimination, there's lots more to be gained in avoiding mechanized electronic discrimination before it becomes a huge problem than there is in asking people to examine their unconscious biases.



Because they aren't mutually exclusive propositions. Logically, I say pre-emptively adjust the technological thing that hasn't been polluted yet, and does not have human failings, to avoid it becoming infected with racism. Why? Because if we don't, as Wonk noted, it'll acquire all the prejudices of its users and creators. I'm not sure we can ever fully racism-proof the algorithms, but it's worth a try, and it's more likely to have results than suggesting people individually look at their own unconscious racism. Why would I say this? Because here's the thing about unconsciousness: A person doing something he doesn't even realize until after he's done it has a hard time either stopping the act or remedying its damage. Tweaking the algorithms that may engage in this discrimination in the future has a far greater chance of actual, measurable success.



I'm not much interested in the human element because I hold little hope of it improving much in our lifetimes. Sorry to have hijacked. Can't help myself.

I would respond to each and every point, but I figure you're just on your own frequency, so I'll just let it go.

TM

Tyrone Slothrop 01-06-2015 07:05 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492682)
You (not you, personally and exclusively) keep trying to draw these distinctions between deep and shallow, fast or slow, big or little, as if they make a difference. All big data is, is a shitload of little data being looked at by a big computer array instead of by a roomful of grad students or junior associates.

I've been trying to avoid getting into this particular exchange, despite some strong feelings, because conversations about big data are so tedious. Talking about big data is like talking about transportation or weather -- the subject is so incredibly broad that any sort of assertion about it is bound to be part right, part wrong, and completely useless.

There is data and then there is data, and it really depends on what you are actually talking about. Take the phenomenon, described in the NYT piece, that resumes with certain names on them fare much worse than the same resumes with other names on them. No one here can possibly think that any organization hires people in a particularly rational or effective way. Most jobs have specific requirements which set them apart from other jobs, and thus require a human to make subjective judgments about whether someone is a good fit. I'm sure everyone thinks they are better than average at doing this. I have heard that Google's HR head has tried to do some data analysis to try to figure out which indicators are the most effective at screening resumes to identify the better candidates, and that sounds like a good idea. But if anyone thinks that's going to dispel the racial bias in hiring mentioned in the NYT article, I have a bridge to sell you.
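
The resume finding comes from audit studies of roughly this shape: send out identical resumes that differ only in the name and compare callback rates. Here is a toy simulation of that measurement, with a deliberately biased stand-in screener and made-up numbers throughout; the point is the audit design, not the model.

Code:

import random

random.seed(2)

def screener(resume):
    """Hypothetical biased screen: qualifications matter, but so does the name."""
    base = 0.4 if resume["years_experience"] >= 3 else 0.1
    penalty = 0.15 if resume["name_group"] == "B" else 0.0
    return random.random() < base - penalty

def callback_rate(name_group, trials=20000):
    resume = {"years_experience": 5, "name_group": name_group}  # same resume every time
    return sum(screener(resume) for _ in range(trials)) / trials

rate_a = callback_rate("A")
rate_b = callback_rate("B")
print(f"group A callbacks: {rate_a:.1%}, group B callbacks: {rate_b:.1%}")
print(f"gap attributable to the name alone: {rate_a - rate_b:.1%}")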

taxwonk 01-06-2015 07:11 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492689)
I've been trying to avoid getting into this particular exchange, despite some strong feelings, because conversations about big data are so tedious. Talking about big data is like talking about transportation or weather -- the subject is so incredibly broad that any sort of assertion about it is bound to be part right, part wrong, and completely useless.

There is data and then there is data, and it really depends on what you are actually talking about. Take the phenomenon, described in the NYT piece, that resumes with certain names on them fare much worse than the same resumes with other names on them. No one here can possibly think that any organization hires people in a particularly rational or effective way. Most jobs have specific requirements which set them apart from other jobs, and thus require a human to make subjective judgments about whether someone is a good fit. I'm sure everyone thinks they are better than average at doing this. I have heard that Google's HR head has tried to do some data analysis to try to figure out which indicators are the most effective at screening resumes to identify the better candidates, and that sounds like a good idea. But if anyone thinks that's going to dispel the racial bias in hiring mentioned in the NYT article, I have a bridge to sell you.

I don't think the discussion was whether it is going to change the way anyone hires. I think the issue we are looking at is whether or not this crap gives someone cover (or should give someone cover) when their actions have a clearly discriminatory impact.

Tyrone Slothrop 01-06-2015 07:22 PM

caption, please
 
http://static01.nyt.com/images/2015/...izontal375.jpg

taxwonk 01-06-2015 08:55 PM

Re: caption, please
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492691)

My eyes, Mommy!! It burns!

Hank Chinaski 01-06-2015 08:59 PM

Re: caption, please
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492691)

Yes, if you could handle this you can handle me, I promise.

taxwonk 01-06-2015 09:30 PM

Re: caption, please
 
Quote:

Originally Posted by Hank Chinaski (Post 492694)
Yes, if you could handle this you can handle me, I promise.

Blatant double entendres with old people are icky. Stop this at once.

Greedy,Greedy,Greedy 01-07-2015 10:00 AM

Re: It was HAL 9000!
 
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?

Sidd Finch 01-07-2015 10:16 AM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492698)
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?

I would say not. But I would say you should give greater preference to the person who was top of his class at a lower-tier law school than the person who was 80th percentile at fancy-pants U. And you should consider that someone who didn't start life on third base has worked hard to be a credible candidate for you, and will likely be more appreciative and hard-working in the future.

Greedy,Greedy,Greedy 01-07-2015 11:06 AM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Sidd Finch (Post 492699)
I would say not. But I would say you should give greater preference to the person who was top of his class at a lower-tier law school than the person who was 80th percentile at fancy-pants U. And you should consider that someone who didn't start life on third base has worked hard to be a credible candidate for you, and will likely be more appreciative and hard-working in the future.

So, in other words, I should use UMich's approach and hire from the hoods?

ThurgreedMarshall 01-07-2015 11:41 AM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492698)
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?

If the person satisfies your firm's initial hiring requirements such that they are sitting in front of you, yes. Woman, minority, whatever. That's my approach. Give them preference. And that really just means that non-minority men have to absolutely shine in order to overcome that preference (or the minority or woman has to clearly be wrong for the position). That's how it works 95% of the time in the opposite direction.

TM

sebastian_dangerfield 01-07-2015 12:06 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492698)
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?

No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).

Tyrone Slothrop 01-07-2015 12:08 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492702)
No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).

Aren't we just talking about how to hire the best candidate?

sebastian_dangerfield 01-07-2015 12:29 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492703)
Aren't we just talking about how to hire the best candidate?

Yes, which also happens to be the best business decision.

taxwonk 01-07-2015 12:45 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Greedy,Greedy,Greedy (Post 492698)
So, after all this discussion on racial discrimination in hiring, I'm about to go into the market for a young corporate associate. Should I intentionally be giving preference to candidates who are minorities? To women candidates?

I'd like to take a moment to speak in support of the elderly white male....

taxwonk 01-07-2015 12:48 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by ThurgreedMarshall (Post 492701)
If the person satisfies your firm's initial hiring requirements such that they are sitting in front of you, yes. Woman, minority, whatever. That's my approach. Give them preference. And that really just means that non-minority men have to absolutely shine in order to overcome that preference (or the minority or woman has to clearly be wrong for the position). That's how it works 95% of the time in the opposite direction.

TM

The most honest thing a hiring partner ever said to me was "if you're sitting here, you already know you're qualified for the job. The interview is only to see how well you fit in." There are whole universes of meaning in that sentence.

taxwonk 01-07-2015 12:51 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492702)
No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).

Cost is equal. Either way, you're hiring a junior associate. Same for skill.

So you're saying that personality is the sole factor? Tell us more about how you assess personality. I think this is the point Thurgreed was focusing on: ceteris paribus, how do you decide?

Greedy,Greedy,Greedy 01-07-2015 12:58 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by ThurgreedMarshall (Post 492701)
If the person satisfies your firm's initial hiring requirements such that they are sitting in front of you, yes. Woman, minority, whatever. That's my approach. Give them preference. And that really just means that non-minority men have to absolutely shine in order to overcome that preference (or the minority or woman has to clearly be wrong for the position). That's how it works 95% of the time in the opposite direction.

TM

Does it matter whether I'm dealing with the Governor's daughter, who started on 3rd base, or some kid from the Bronx who started someplace far from the stadium?

Greedy,Greedy,Greedy 01-07-2015 12:59 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by sebastian_dangerfield (Post 492702)
No. You make a business decision. You hire the best candidate you can find based on the usual criteria (personality, cost, skill, etc.).

One of my criteria is a firm that doesn't look like an advertisement in a 1969 issue of GQ. Or like the average Silicon Valley social media startup.

That's a good business issue to consider, idiotic S.Ct. university admissions decisions aside, right?

Greedy,Greedy,Greedy 01-07-2015 01:01 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492705)
I'd like to take a moment to speak in support of the elderly white male....

Know one of those who wants a 2nd or 3rd year corporate associate position billing north of 2000 hours a year?

Tyrone Slothrop 01-07-2015 01:24 PM

Re: Is Ted Cruz Satan? Discuss.
 
Oy. Testilying by police officers -- a how-to guide.

Tyrone Slothrop 01-07-2015 01:28 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by taxwonk (Post 492706)
The most honest thing a hiring partner ever said to me was "if you're sitting here, you already know you're qualified for the job. The interview is only to see how well you fit in." There are whole universes of meaning in that sentence.

Agree with your last observation. One reaction that I have to this is that it shows how broken law firms are, in how they think about what they need from their lawyers and how they're going to try to serve their clients. Most businesses understand that different applicants for a job will bring a range of different skills to it. It takes a peculiarly insulated view of your organization's business to think that beyond a certain minimum set of qualifications, the only thing that really matters is whether you mesh with the current employees.

Greedy,Greedy,Greedy 01-07-2015 01:52 PM

Re: It was HAL 9000!
 
Quote:

Originally Posted by Tyrone Slothrop (Post 492712)
Agree with your last observation. One reaction that I have to this is that it shows how broken law firms are, in how they think about what they need from their lawyers and how they're going to try to serve their clients. Most businesses understand that different applicants for a job will bring a range of different skills to it. It takes a peculiarly insulated view of your organization's business to think that beyond a certain minimum set of qualifications, the only thing that really matters is whether you mesh with the current employees.

Yeah, I disagree violently with the hiring officer. My rule has always been to interview broadly, because we shouldn't be thinking that a certain GPA, or any other paper credential, qualifies you to become a good lawyer. And we shouldn't want people who fit too well.

By the way, once upon a time I thought much more deeply about these issues than I have had to in a long time. I was head of the hiring committee at another firm about a dozen years ago, and took a very aggressive approach along the lines Sidd suggested - I wanted people not born on 3rd base and I doubled the number of interviews we did at each school (while not increasing the number of hires - but this meant the high GPA folks didn't dominate the interviews the same way).

The classes I hired were each almost 50% minority and were majority women. The first year, my partners thought it was great, they were surprised at how many "good" candidates I found who were minorities. The second year, one law school complained, saying that we were "hostile" to a candidate who I had basically asked "you were born on third base; tell me how you overcome that?" (we got no complaints about questions about how people overcame all sorts of much tougher shit), and no one commented at all on all the great minorities we were hiring. That was my last year; they got someone else to run the process after that.

Looking back years later, the minorities were the ones, for the most part, who made partner. Though a couple of them then left for really sweet in-house jobs.

Now I just have to hire one person, and we'll see what the resumes look like. I've also recently been put in charge of hiring paralegals. First one hired was a woman and a minority; we'll see when people figure out my general philosophy.

