Einherjar86
Active Member
It's in her book We Real Cool: Black Men and Masculinity. Definitely worth looking at, as far as AfAm studies go.
And that's that.

What attitudes? Traditionalism generally puts women on a pedestal. Must not be that.
From what I've seen among those who make a hobby out of this stuff, there is such a thing as "black male privilege," which, when contrasted with black female oppression, apparently places black men above black women, regardless of prison statistics etc.
This link lays out, in particular detail, the logic some use to reach this conclusion:
http://projecthumanities.asu.edu/black-male-privileges-checklist
There are many more sources besides this one.
bell hooks, iirc, mostly focused on the slave era and the Jim Crow era. As much as modern social justice activists claim to base most of their views in those two eras of civil rights travesty, I think they actually just like bell hooks as a symbol, when really they haven't read what she has written. In fact, if they knew that she says black women have it better than black men, she'd probably be no-platformed, or at least some in-fighting would break out.
Black men exist in a paradox. It was the best of times, it was the worst of times. They have the availability of many legal and social benefits, but many of them also lack access to/appreciation of those same social and legal benefits.
Quick history lesson: the line for mental retardation used to be one standard deviation below the mean (-1SD) on IQ tests, i.e., an IQ of 85. After the passage of the Civil Rights Act, once AAs were included, schools were desegregated, and IQ tests were widely implemented, it was realized that the majority of AAs would fall under that standard. That would require either extra funds and special attention, or labeling large portions of a recently "protected" race as mentally retarded. So the standard was relaxed to -2SDs (an IQ of 70). On the one hand, we no longer have to provide services to those in need. On the other, we no longer have to classify many AAs as mentally retarded (or, in the modern parlance, "cognitively disabled" or "intellectually inhibited" or any other euphemism). In other words, you can perceive the change as sensitivity to historical oppression or as racist withholding of resources. What doesn't change are the scores.
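To put rough numbers on that cutoff shift: assuming the standard IQ norm (mean 100, SD 15), and, as the post implies, a hypothetical subgroup whose mean sits near 85, the share of people falling under each line can be read straight off the normal CDF. A quick stdlib sketch:

```python
from statistics import NormalDist

# General population: IQ normed to mean 100, SD 15.
iq = NormalDist(mu=100, sigma=15)

below_85 = iq.cdf(85)  # fraction below the old -1SD line
below_70 = iq.cdf(70)  # fraction below the relaxed -2SD line

print(f"below IQ 85 (old -1SD cutoff): {below_85:.1%}")  # ~15.9%
print(f"below IQ 70 (new -2SD cutoff): {below_70:.1%}")  # ~2.3%

# Hypothetical subgroup with mean 85 (an illustrative assumption,
# not a measured figure): half of it falls below the old cutoff.
subgroup = NormalDist(mu=85, sigma=15)
print(f"subgroup below IQ 85: {subgroup.cdf(85):.0%}")  # 50%
```

So moving the line from -1SD to -2SD shrinks the labeled share of the general population from roughly one in six to roughly one in forty, which is the trade-off the post describes.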
It can be sensitive and racist. In other words, the requirements could be changed in order to compensate for people who had been educationally disenfranchised; but if nothing's done to elevate those individuals to the same intellectual level as others, then it's not really solving the problem.
Before the Civil Rights bill, blacks developed an entirely different kind of educational support system and intellectual aptitude. When your main concerns are getting food on the table and not being lynched by white folks, you don't have time for calculus, biology, or Dickens. Now, once black children are allowed into white schools, they have access to this information--but the cultural perceptions and stigmas that surround blacks don't suddenly disappear. It's as though we expect them to adapt and "catch up" to these new environmental standards despite the fact that mainstream white attitudes toward blacks don't change all that much from 1960 to 1970. Or from 1970 to 1990, for that matter.
I'm sympathetic to the problems of the poor, but problems of finance and culture aren't strictly racial problems.

Giving someone a mental disability label provides access to potentially needed services (unless, of course, the system is overburdened by demand, which is its own problem), but it also sort of puts them in a hole of low expectations. I don't know what the answer is, but one can see a problem both with racially based low expectations and with expecting too much.
At this point I've developed a heuristic of suspicion regarding people who are suspicious of heuristics. Heuristics are an important cognitive tool.
I really do believe that our attitudes are shaped much more by our social groups than they are by facts on the ground. We are not great reasoners. Most people don't like to think at all, or like to think as little as possible. And by most, I mean roughly 70 percent of the population. Even the rest seem to devote a lot of their resources to justifying beliefs that they want to hold, as opposed to forming credible beliefs based only on fact.
Think about if you were to utter a fact that contradicted the opinions of the majority of those in your social group. You pay a price for that. If I said I voted for Trump, most of my academic colleagues would think I'm crazy. They wouldn't want to talk to me. That's how social pressure influences our epistemological commitments, and it often does it in imperceptible ways.
http://www.vox.com/conversations/20...cts-psychology-donald-trump-knowledge-science
I don't know if he just pulled the "70%" number out of his ass, but on the bell curve it corresponds to everyone falling below roughly half a standard deviation above the mean.
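For what it's worth, the percentile-to-SD conversion is easy to check directly, assuming a standard normal curve (Python's stdlib exposes the inverse CDF):

```python
from statistics import NormalDist

# Where does the 70th percentile sit on a standard normal (bell) curve?
z70 = NormalDist().inv_cdf(0.70)
print(f"70th percentile is at z = {z70:.2f}")  # ~ +0.52 SD

# On an IQ-style scale (mean 100, SD 15) that works out to about 108.
iq70 = NormalDist(mu=100, sigma=15).inv_cdf(0.70)
print(f"70th percentile on an IQ scale: {iq70:.0f}")  # ~108
```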
I think the latter bolded portion is extremely underrated, and no one wants to acknowledge it.
Of course, I believe in critical thinking.
In fact, I believe in it so much that I think individuals and institutions who misrepresent it are pernicious through and through. The most disastrous obstacle to critical thinking is the blithe assumption that you are, by dint of training or disposition, a critical thinker. Our psychology ensures that no human has been or ever will be a "critical thinker." The best we can hope for are moments of critical lucidity, like the one I had before bailing from my philosophy program. And the most we can hope from our institutions is that they maximize the frequency of those moments.
And the humanities, I fear, do nothing of the sort.
Human beings are rationalization machines. Some researchers even think we have a module in our brain dedicated to the production of self-serving reasons–confabulations that justify what we do and believe. In other words, we suffer matching compulsions: to judge others, on the one hand, and to justify ourselves on the other. And we all live, to varying degrees, in dream worlds as a result.
Of course, it doesn’t feel this way. As bent as it is, your inner yardstick is the only one you have, the very definition of straight and true–for you. This is why all those conservative Americans think that Fox News really is ‘fair and balanced.’ We’re literally hardwired to confuse agreement for intelligence. And this is why literature and philosophy departments are anything but the shining beacons of critical rationality they purport to be: human beings, no matter how polysyllabic their vocabularies, tend to use their intelligence to better leverage their stupidity.
I actually agree with most of this. Communities tend to reinforce common attitudes, and academia is no different. The primary caveat I would identify, and this is anecdotal, is that most academics are riddled with self-doubt about their own ideas/arguments. I am as well. I think the healthiest critical perspective to have is, paradoxically, a skeptical one.
Also, Bakker's comments above are very much in line with his critique of heuristic thinking. Heuristics don't foster self-criticism or skepticism, but in fact allow us to bypass self-criticism. That's how they operate. Again, in some cases this is probably a good thing; but when it comes to topics on a complex scale, heuristics tend to lead us down errant paths.
My take on Wittgenstein is that I think ethical conversations can be had, and ethical policies can be arrived at; but I don't think they'll ever be consistent or absolute. Ethics should always be up for debate.
I would rather have a world of people debating who agrees with scientific consensus or not, than a world of people debating whether scientific consensus is even valuable.
There is one caveat to the above: I think it's dangerous to promote a norm of agreeing with scientific consensus, insofar as that helps encourage exactly the mistakes about the nature of consensus that I discussed above. When poorly-informed diet industry gurus support the Bad Old Paradigm, their rallying cry is usually "You're a stupid crackpot, bow to the scientific consensus which agrees with me". I gave three examples above of cases where I would have gotten the scientific consensus 100% wrong if I didn't have access to a formal survey of scientific experts. In a world where these surveys had never been done – or some existing field without these surveys – or some field where these surveys have been done inaccurately or in a biased manner – people will often believe the consensus to be the opposite of what it really is. In those cases, demands that people respect consensus can be used to shut down people who are actually right – the field-wide equivalent of calling true facts you don't like debunked and well-refuted. I see this happening all the time and I worry that waxing too poetically about the unreasonable effectiveness of scientific consensus will only serve to empower these people. Goodhart's Law says that a measure which becomes a target ceases to be a useful measure, so we should be reluctant to target scientific consensus too strongly.
hitlermeme.jpg j/k
http://slatestarcodex.com/2017/04/17/learning-to-love-scientific-consensus/
The piece includes quite a few links on genetic/sex-based psychological and cognitive differences, but I'm quoting the final closing point:
I feel a deep temptation to sympathize with global warming denialists who worry that the climatological consensus is biased politicized crap, because that is exactly the sort of thing which I would expect to come out of our biased politicized crappy society. Yet again and again I have seen examples of scientific fields that have maintained strong commitments to the truth in the face of pressure that would shatter any lesser institution. I’ve seen fields where people believe incredibly-bizarre sounding things that will get them mocked at cocktail parties just because those things seem to be backed by the majority of the evidence. I’ve even seen people change their minds, in spite of all the incentives to the contrary. I can’t explain this. The idea that scientific consensus is almost always an accurate reflection of the best knowledge we have at the time seems even more flabbergasting than any particular idea that scientists might or might not believe. But it seems to be true.
But seriously, that's actually why ethics always needs to be up for debate, and why it needs to be democratized. If you tried to argue with Nazi Germany, you were sent to "Holocaust centers."
Wow, good piece. I understand his wariness of blindly accepting scientific consensus. But the point isn't that people shouldn't be skeptical; it's that they should take the time and do their own fucking research (if they can), like SSC did; and lo and behold, look what you find?
My frustration is with people who are skeptical without having any reason to be beyond their own personal doubt.
I like this comment:
But seriously, that's actually why ethics always needs to be up for debate, and why it needs to be democratized. If you tried to argue with Nazi Germany, you were sent to "Holocaust centers."
Well, I'm not sure those two things go together. You're assuming the majority won't go for things like genocide etc. The more I learn, the lower my opinion of the intelligence of the masses. Now, in what may seem a paradoxical position, I still think it's generally best to let people decide things about themselves for themselves, but this has more to do with data problems. However, letting people make their own decisions about themselves also requires intact feedback mechanisms - in other words, for a sociopolitical example, a lack of systemic safety nets. No learning can occur without feedback. However, letting the majority decide what is best for everyone is prone to all sorts of problems, which the previous Vox article sort of addressed.
I would call myself a "true" climate skeptic, but of course I'd run into problems there. My skepticism isn't with climate change itself. It's with the ability to accurately construct a causal model. In social sciences, even predicting something like 30% of the variance is a major achievement - but that still leaves 70% "error" (that is, stuff we haven't figured out yet). Planetary climate would seem to be at least as complex as human behavior and cognition. Furthermore, value assessments of climate change are entirely outside the scope of climate science, but do start involving the political realm. Statements like "climate change is happening" and "to some degree can be attributed to carbon levels in the atmosphere" have zero necessary connection to, or indication for, statements like "climate change is bad" or "climate change must be stopped" (or even "can be stopped").
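To make the "percent of variance explained" framing concrete: it's just R² from a fitted model. A toy sketch (all numbers made up for illustration, not real climate or social-science data):

```python
# R^2 from a simple least-squares fit: the fraction of the variance in y
# that the fitted line accounts for. Everything below is illustrative.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.0, 1.0, 4.0, 3.5, 3.0, 6.0, 4.5, 5.0]

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n

# Slope and intercept of the least-squares line.
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))  # unexplained
ss_tot = sum((y - my) ** 2 for y in ys)                        # total variance
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")  # -> R^2 = 0.63 here; 1 - R^2 is the "error" share
```

A model "explaining 30% of the variance" would have R² = 0.30, leaving 70% of the variation in the outcome unaccounted for - exactly the gap the post is pointing at.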
Because the average person lacks any specific interest in, or even the ability to, interpret and untangle these complex issues, my relatively self-interested model of human behavior and cognition says they are going to interpret them through their values.
One hundred thousand years is pretty much an eyeblink. But two million years is not. This is, rather loosely, the length of time in which our unconscious has been organizing and directing our lives. And without language you will note. At least for all but that recent blink. How does it tell us where and when to scratch? We dont know. We just know that it’s good at it. But the fact that the unconscious prefers avoiding verbal instructions pretty much altogether—even where they would appear to be quite useful—suggests rather strongly that it doesnt much like language and even that it doesnt trust it. And why is that? How about for the good and sufficient reason that it has been getting along quite well without it for a couple of million years?