Dakryn's Batshit Theory of the Week

It was tongue in cheek - not a jab, just a self-deprecating comment on the ivory tower.

I assume this is a point on which we differ, but I don't think that any intelligence differential within the population we classify as human is large enough to preclude the effects of communication. I think that people may refuse to change their minds about certain things, but that doesn't mean they lack the capacity to understand the concepts. That might be an indefinite distinction, but something about it seems worth considering.
 

"Preclude the effects". I can't disagree with something that vague. It's not even a matter of lacking the capacity to understand - more importantly they lack the capacity, in wetware, time, resources, and education, to even understand why they should care. If I had any reservations about making the following statement, the last year or two of observations at school, work, and otherwise have completely disabused me of them: You underestimate the degree to which simple managing day to day living in modernity is overwhelming to people of even average intelligence (eg 95-105IQ), to say nothing of the 30+% of people below those numbers. You severely underestimate the magnitudes of difference a mere 20pts of IQ makes. Combine that with heredity and local acculturation/norms of many people within the same ballpark and you get vastly different capabilities even if the "numbers seem close". I've had customers that cannot do math in their head that they could do on their fingers (but can't think to even use their fingers). I've had customers that have no concept of which is bigger: 16 or 48. I've had college tutees that have difficulty following along with extremely basic and/or logic. How are you going to get these people to give a real, carefully considered, intelligently thought out opinion or formulate a legitimate position on "~50% anthropogenic-influenced climate change based environmental and economic policy". How disconnected could such a demand of these people be? These people need real help, like social support and pressure to simply not keep eating Bojangles 3 meals a day AFTER a heart attack.


Formalized public/private K-12 education plays a part in this problem, but there are limits to the capabilities of the wetware, and the world is quickly outstripping them. While the ivory tower has existed for centuries, it is my contention that it has never been higher nor more blind to the situation on the ground.
 
I'm not sure exactly what this entails, but I'm going to assume that a significant part of it involves communication.

The study I'm currently involved in hypothesizes that it depends on existing values rather than on communication; communication acts as a post hoc reinforcement. At best, if the value isn't there, communication probably has to be ONE HUNDRED PERCENT EAT THIS SHIT AND YOU DIE TOMORROW to even get someone's attention. Kind of like how politics looks. And then they eat it and don't die the next day, and so who gives a fuck... that's the thought process as best as I can tell.
 

That's really interesting. That's close to what I was saying when I admitted that plenty of people will just be stubborn and won't change their minds. It's similar to how voters, when given specific examples of their candidate lying or doing something else reprehensible, only grow in ire and commitment.

The question of precluding the effects of communication is incredibly vague, I'll admit - and intentionally so, since my definition of communication exceeds linguistic exchange. As far as I'm concerned, communication is an almost ubiquitous condition, not something we find only in humans and other "intelligent" animals. I put "intelligent" in scare quotes because we often exclude things like microorganisms or plants from the category, but if we observe their behaviors on a broad scale we find that they exhibit pattern behavior, and pattern behavior is intelligent behavior. Humans participate in communication, but so do viruses, so do trees, so do nervous systems, so do ants, etc.

Lastly, I also say "participate in communication" rather than "communicate" because the latter suggests an established link between two (or more) distinct parties, but this is somewhat archaic. Like Hoffman said in the article you posted, it isn't so much about enjoying shared linguistic access to objects beyond language; rather, we use words in the hope that our meaning will be understood. Sometimes it is, and sometimes it isn't. Ultimately, communication happens somewhere between subjects, not in the minds of individuals. This may not fall in line with your understanding, but I think it supports your hypothesis in a way: namely, that if communication were truly a kind of established hermetic link between individuals that shut out the possibility of interference, then it would be far more successful than it currently is. Unfortunately, this kind of pure link is a fantasy. At the end of the day we're all solipsists. My own solipsistic view, however, is that a better understanding of communication can lead to a better understanding of the social and our role in it.

"Human beings cannot communicate. Not even their brains can communicate. Not even their conscious minds can communicate. Only communication can communicate."
 
It seems an easy assertion that "communication communicates" when looking at open broadcasts, like the books you work with or even forum posts. The spoken word is quite different, and even more so the smaller the audience. A blind spot for academia/rationalists is that "people matter" - and not just in that trite-sounding way, but in terms of connection. There is no such connection with the broadcast word.
 

These are good points, and hit on a central issue in linguistics/semiotics: namely, whether there's a difference between spoken and written language. I think it's safe to say that yes, there are many differences, some quite obvious; but I think there are also similarities that complicate the matter.

I don't really want to contend with the clarity and/or correctness of Derrida's work, but he makes a provocative point in an essay titled "Signature Event Context." In short, he contends that an implicit graphability, or iterability, subsists in both written and spoken language, and I find this to be a compelling claim:

[The] possibility of extraction and of citational grafting which belongs to the structure of every mark, spoken or written, and which constitutes every mark as writing even before and outside every horizon of semiolinguistic communication; as writing, that is, as a possibility of functioning cut off, at a certain point, from its "original" meaning and from its belonging to a saturable and constraining context. Every sign, linguistic or nonlinguistic, spoken or written (in the usual sense of this opposition), as a small or large unity, can be cited, put between quotation marks; thereby it can break with every given context, and engender infinitely new contexts in an absolutely nonsaturable fashion.

Derrida will refer to this as "arche-writing" - that even spoken language presupposes a written (i.e. citational) origin, even as it breaks away from and defies this origin. As I said, we could go back and forth on Derrida for eternity; but that's what he says, and I think it merits consideration.

Alternatively, someone like Wittgenstein would probably disagree and fall more on your side of the issue; but even Wittgenstein acknowledges that meaning is precarious in spoken contexts, even intimate ones. As the model of rule-following demonstrates, meaning circulates endlessly through a series of rules that are continuously followed and/or broken, and new rules are always being engendered. But the process always has to presuppose a set of rules that somehow existed prior to the speech act. This seems to mesh nicely (in my opinion) with Derrida's insistence that all marks and utterances act upon the (il)logic of an already-existing written form. In other words, even conversation, or "connection," as you say, relies on the preexistence of communication. A statement, written or spoken, only makes sense if it can be translated. This is why Wittgenstein contends that there can be no such thing as a private language.

I'm not sure what the distinction, or possibly the overlap, between "academia" and "rationalists" is; and I'm not sure I agree with the accusation that people don't matter for academicians (by which I assume you mean scholars in the humanities). I think that people can matter just as much even if we shift our focus from the nodes to the network.

EDIT: if we're thinking, for example, about psychology or psychoanalysis, the immediacy of the environment certainly plays a role; but I wouldn't say this negates the communicational structures on which the psychological relationship is constructed.
 
I'm referring to non-verbal, or non-logos, portions of communication: not only environment, but history, tone, body language, the eyes. As an example, and at the risk of triggering Mort: consent can lack words but include some or all of the other things I just mentioned.
 
I don't want to get into a discussion over what constitutes consent. As far as all the things you list, I would first be sure to point out that history is absolutely logocentric.

Otherwise, I'm not sure how any of that achieves "connection," to use your word. There's always a medium, never any pure linkage. All the items in your list simply fall into my category of complex communication. That's my two cents.
 
Well, I'm fascinated by the concept of "media" - on one hand it separates, and on the other it "connects." Without it we would have no access, and with it we can only have a troubled access. All communication involves noise.

In other news, I figured this may be of interest to some of us. ;) The New Yorker recently published a piece on epigenetics that I thought was compelling, but it is creating a bit of a stir in the scientific community - not because of the argument over environmental factors affecting gene expression, but over how the author (Siddhartha Mukherjee, believe it or not) chose to explain the findings of epigenetics. As it turns out, the article is overwhelmingly misleading and intellectually dishonest.

I won't say that I'm that surprised, since TNY is not a scientific journal, although it likes to portray itself as educated on the subject(s). As a subscriber, I find it disappointing to see articles like this, but also interesting to see how a conventionally literary/humanistic publication often skews the information. It is strange, though, that Mukherjee makes these errors, since he published a well-received book on cancer a few years back.

My dissertation project involves the overlap between science and literature around the mid-twentieth century, so I find this kind of stuff really interesting.

TNY link: http://www.newyorker.com/magazine/2016/05/02/breakthroughs-in-epigenetics

Link to scientific backlash: https://whyevolutionistrue.wordpres...criticize-the-mukherjee-piece-on-epigenetics/

Here are a few good quotes:

And when it came to mentioning actual evidence for phenotypic specification and memory, he cited the Yamanaka factors, seeming not to realize that these are transcription factors, not the etching of marks on histones or DNA, or enzymes responsible for these modifications, or anything else about DNA packaging proteins or their modifications. Mukherjee seemed not to realize that transcription factors occupy the top of the hierarchy of epigenetic information, that this has been widely accepted in the broader chromatin field, and that histone modifications at most act as cogs in the machinery that enforces the often complex programs specified by the binding of transcription factors. In no case that I recall is there an example of a change in gene expression that can be attributed to histone hyperacetylation to the exclusion of non-histone substrates, of which many have been identified.

DNA methylation might be doing some interesting things, but despite decades of effort there is still no hard evidence that implicates DNA methylation in the kinds of processes that underlie differences between Mukherjee’s mother and aunt. Indeed, epigenetic processes analogous to those performed by the Yamanaka factors are performed by bacteria that entirely lack histones and DNA methylation. Mukherjee’s description of evidence for Lamarckian inheritance through the germline is no better, implying that phenotypic effects passed through the germline may be somehow mediated by histone and DNA modifications. But the best evidence is contrary to this view. For one thing, nearly all the histones are removed when sperm is packaged, and DNA methylation is erased and reset between generations. More to the point, the only informational components that have been shown to be transmitted with sperm to the next generation are small RNAs, which like transcription factors, are not referred to at all in this article.

In his piece, Dr. Mukherjee paints a grotesquely distorted picture of how the environment influences our genome and of how genes are regulated. Not only does he represent the ideas propagated by Dr. Allis and Dr. Reinberg as set in stone, which they are not; in fact, many researchers actively debate whether the ‘epigenetic’ processes they study have indeed a causative, instructive role in gene regulation or whether they are just cogs downstream of proteins, known as ‘transcription factors’, that determine which genes get turned on or off. Ironically, the Yamanaka experiments mentioned in the text clearly argue for the latter. To say the least, the jury is still out on these matters. And there is certainly no evidence whatsoever that epigenetic mechanisms play a role in evolutionary adaptation.

...

Finally, even Dr. Mukherjee’s account of the origins of the term epigenetics as meaning ‘above genetics’ is wrong. Conrad H. Waddington coined the term as an adjective, ‘epigenetic’, pertaining to ‘epigenesis’, the de novo origin of structures of the embryo, as opposed to ‘preformation’, a mere unfolding of already pre-existing structures.
 
Well, I'm fascinated by the concept of "media" - on one hand it separates, and on the other it "connects." Without it we would have no access, and with it we can only have a troubled access. All communication involves noise.

Well sure. I don't know why this has to be "troubled". "Pure" communication appears to be an unreasonable standard.


I assume this sort of misrepresentation occurs often in journalistic coverage of the frontiers of science; it just doesn't always get the attention it should. Never mind the fact that retractions from any sort of publication are almost always quiet, back-of-the-publication affairs.
 
"Troubled" is my synonym for "noise." I borrow from Judith Butler actually, who uses it in relation to gender. Basically, there is no such thing as "pure" communication (as you say), just as there is no such thing as "pure" gender.
 
I forget now who said it (Berkeley, I believe), but this seems like a perfect example of "kicking up dust and then complaining about not being able to see". Why is it 100% or nothing? I know the claim is that what is being allowed for are shades of grey, but that doesn't seem to bear out in practice. My MOS in the military was heavily communications-related/dependent, and the comm stuff was never even remotely "pure". To apply this Butlerite approach in a very concrete way: regardless of how well I understood the person on the other end, I'd have to just reply "say again" - infinitely - to all attempts at communication ("say again" is military comm for "repeat the last thing you said", because "repeat" has to do with follow-on strikes on a target). Noise creates packet loss (https://en.wikipedia.org/wiki/Packet_loss), which isn't a problem - iirc some systems tolerate up to 50% packet loss. And that's within a very rigidly structured network system; I would expect human communication to be even more robust in its ability to deal with "impurities".
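If a toy version helps, here's a minimal sketch (nothing from actual military gear - just the basic retransmit-until-acknowledged move, a la TCP) of a link that delivers everything despite 50% loss:

```python
# Toy sketch: resend each packet until it gets through a channel that
# drops every transmission with probability 0.5. The message always
# arrives; the loss only costs extra sends.
import random

def deliver(n_packets, drop_rate=0.5, rng=random):
    """Deliver every packet over a lossy link; return total sends used."""
    sends = 0
    for _ in range(n_packets):
        while True:                       # resend until this packet survives
            sends += 1
            if rng.random() > drop_rate:  # this copy made it through
                break
    return sends

random.seed(0)
n = 1_000
total = deliver(n)
print(f"{n} packets delivered in {total} sends (~{total / n:.2f} sends/packet)")
# Expected sends per packet = 1 / (1 - drop_rate) = 2.0 at 50% loss.
```

Noise here never blocks communication; it just taxes it.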
 
You misunderstand me. I'm saying that noise is a good thing. Noise lets a system evolve, reorganize itself at higher levels. Noise is a productive disruption. This is a major tenet of media theory and (second-order) systems theory.

It is, however, a common tendency to assume that communication should be pure, as should gender - as should any identity. That's why we're so obsessed with authenticity. I'm saying that we need to loosen our hold on this notion.
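If a concrete toy makes the point: simulated annealing (my example from optimization, not anything in the systems-theory literature per se) literally injects random noise so a search can escape ruts that pure greedy improvement never leaves:

```python
# Noise as productive disruption, in miniature: simulated annealing
# accepts occasional *worse* moves (noise) so the search can climb out
# of local minima that strict improvement would be stuck in forever.
import math, random

def f(x):                      # a bumpy landscape with many local minima
    return x * x + 10 * math.sin(3 * x)

def anneal(steps=20_000, temp=5.0, cooling=0.9995, rng=random):
    x = 8.0                    # start far from the global minimum
    for _ in range(steps):
        candidate = x + rng.uniform(-0.5, 0.5)
        delta = f(candidate) - f(x)
        # always accept improvements; accept worsenings with a
        # temperature-controlled (i.e. noisy) probability
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        temp *= cooling
    return x

random.seed(1)
x = anneal()
print(f"settled near x = {x:.2f}, f(x) = {f(x):.2f}")  # global minimum is near x = -0.5
```

With zero noise (never accepting a worse move), the same search starting at x = 8 just slides into the nearest dip and stays there.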
 
If I am misunderstanding you, then I would assert that the vocal left also entirely misunderstands its philosophical roots. I can't fault them too much, though, given the level of self-inflicted "noise" in the French (why even call it Continental at this point?) philosophical writing style.

Again we have a sort of weird all-or-nothing. Noise isn't always productive; it can be, but that is always determined after the fact. I would agree that "authenticity" isn't all it's cracked up to be, if we speak of authenticity in terms of seeking some perfect Platonic form. What practical applications (such as manufacturing and construction) and risk management show is that there are acceptable "tolerances" and measures of deviation without critical damage to the structure, product, system, etc. However, that cannot be interpreted to say that the deviations aren't "bad", merely that function and stability aren't hampered up to a point. Redundancies or protocols allow for a certain measure of impurities, whether in material or labor. Design and use-fields are where we sometimes find productive deviation - not construction or application (application as in the particular deployment within a given field of usage).
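To put "tolerances" in concrete terms, here's a minimal sketch of the accept/reject gate that manufacturing QC runs on (the dimensions and limits are invented for illustration):

```python
# Toy tolerance gate: a part passes if every measured dimension falls
# within its band (nominal +/- tolerance). Deviation inside the band
# isn't "good" - it's merely non-critical, which is the point.
SPEC = {  # dimension: (nominal_mm, tolerance_mm) - invented numbers
    "shaft_diameter": (25.00, 0.05),
    "bore_depth": (40.00, 0.20),
}

def within_spec(measurements):
    return all(
        abs(measurements[dim] - nominal) <= tol
        for dim, (nominal, tol) in SPEC.items()
    )

print(within_spec({"shaft_diameter": 25.03, "bore_depth": 39.85}))  # True: deviates, tolerably
print(within_spec({"shaft_diameter": 25.08, "bore_depth": 40.00}))  # False: out of band
```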

I insist that we couch terms in the practical, the concrete. Otherwise the noise, if you will, renders them useless for anything other than mental masturbation. Abstract thought is an extremely valuable tool but only when there is a point of contact. Material application is the test of the abstract.
 
I'm not entirely sure what you're finding so disagreeable, but then that usually happens. :heh:

The "French" writing style is out of vogue. There's a reason why it emerged during the heyday of cybernetic paradox. It was a product of its time, and because of that it has value. Literary journals won't publish anything that reads like Derrida anymore.

And I know that noise isn't always productive. I said it was productively disruptive. There's always some production, and some disruption. That doesn't seem like all or nothing to me... Regarding practicality, I don't see anything to disagree with, so I'm not sure what you find controversial. Everything is determined only after the fact; practical matters can only proceed by acknowledging that they might fail. I don't see anything wrong with that, but I don't think that means that praxis should neglect theory altogether in favor of blind faith (which I don't think you're suggesting, but I am confused as to what exactly your issue is).

Finally, I'm not sure what the "vocal left" is. If anything, I don't think they're all that well-read on Derrida.
 

"Productively disruptive" (productive disruption) means disruptive in a productive way - absolutely. That doesn't mean some of this and some of that. If a little of both is what you meant I'm fine with that, but that's not how what you said originally reads.

Axioms and laws allow us to determine things prior to particular facts. If they are correct and followed correctly no failure - within accordant strict definitions - occurs. This is one of the problems surrounding discussions of "what works". When someone says "welfare doesn't work", the type of "work" must be specifically delineated when speaking to someone who believes it does "work".

Praxis is both blind and not blind, as per Hume. For a given thing/protocol/etc, future conditions/knowledge could change to render a certain praxis no longer efficacious. But as long as such changes remain "unknown unknowns", we must proceed with such "blindness".
 
"Productively disruptive" (productive disruption) means disruptive in a productive way - absolutely. That doesn't mean some of this and some of that. If a little of both is what you meant I'm fine with that, but that's not how what you said originally reads.

This right here would be an example of productively disruptive communication.

Axioms and laws allow us to determine things prior to particular facts. If they are correct and followed correctly no failure - within accordant strict definitions - occurs. This is one of the problems surrounding discussions of "what works". When someone says "welfare doesn't work", the type of "work" must be specifically delineated when speaking to someone who believes it does "work".

Praxis is both blind and not blind, as per Hume. For a given thing/protocol/etc, future conditions/knowledge could change to render a certain praxis no longer efficacious. But as long as such changes remain "unknown unknowns", we must proceed with such "blindness".

I have no qualms with any of this. Instead of things like welfare, I'm simply thinking more about how relativity theory "works" until you shrink your scale to the very, very small... at which point quantum theory "works", but not when you expand it to encompass the planetary. These theories also have significant practical implications for engineering and computer science.


Adams is funny, but his categories of "reality" and "illusion" seem much too fine. The universe as a simulation doesn't mean evolution isn't real, per se, if all we already have is some perception of it. It would just change what we see as evolutionary pattern into computer code. And of course, evolution already is a code...
 