Dakryn's Batshit Theory of the Week

Russia is getting a nice series of bombings before the Olympics start. I think it's a gift from the southern people. Maybe it's just the new year's fireworks though.

EDIT: Insert "in Russia..." meme here.
 
That economists might suffer from delusions of grandeur and desire for fame in an academic profession is something I won't contest because I'm unfamiliar with the territory.

But if the insinuation is that all individuals in academic professions work purely for fame, then I disagree wholeheartedly, of course.

EDIT: another new blog post, probably the last I'll make before the semester begins.
 
To address the conclusion - everything is horror to something. I assume that to make a statement about horror is practically humanist and, of course, not at all absolute.
 
Not "fictional theorizing" per se; because the theory itself isn't "fiction."

I'm trying to insinuate that a necessary part of philosophy must entail the absolute opposition to everything, even itself. If we take philosophy seriously, then it should not only tell us logical truths about the mathematizable world. It must also expose yawning gaps that structure the organization of that logic.

Many traditional logicians and scientists resist this necessity because it challenges the totality of their field models; and I'm not saying that their models are incorrect. I'm merely suggesting that looking at theory as the persistent illogical, or "unthought," component of philosophy - and assessing theory, in broad strokes, with philosophy's failure over form - might allow us to better understand the development of philosophy as an institution of Western thought and consciousness.

I'm looking at this conceptually, as one must do in this case. If philosophy is something, and is defined as something - even if it's a process - then there is something else that philosophy is not, but that it pulls along with it, something in its shadow, that we might call "un-philosophy." This is the locale of horror, the source of Nick Land's "deepening of unknowing."
 
Not "fictional theorizing" per se; because the theory itself isn't "fiction."

I'm trying to insinuate that a necessary part of philosophy must entail the absolute opposition to everything, even itself. If we take philosophy seriously, then it should not only tell us logical truths about the mathematizable world. It must also expose yawning gaps that structure the organization of that logic.

Many traditional logicians and scientists resist this necessity because it challenges the totality of their field models; and I'm not saying that their models are incorrect. I'm merely suggesting that looking at theory as the persistent illogical, or "unthought," component of philosophy - and assessing theory, in broad strokes, with philosophy's failure over form - might allow us to better understand the development of philosophy as an institution of Western thought and consciousness.

Only Western? I think that while it may be important at some early stage to point out that a hammer is not a lambchop, this importance can be quite overstated.

I'm looking at this conceptually, as one must do in this case. If philosophy is something, and is defined as something - even if it's a process - then there is something else that philosophy is not, but that it pulls along with it, something in its shadow, that we might call "un-philosophy." This is the locale of horror, the source of Nick Land's "deepening of unknowing."

Unintended consequences? Reductio? Or, in the case of "deepening of unknowing" (clever, but not original thought) - "The more you know, the more you realize you don't know". Not really groundbreaking. You know more in totality but know you know less relatively. Known knowns, known unknowns, unknown knowns, and unknown unknowns and so forth.

Horror is subjective, and making meta Ought statements about abstract value is not something I would tend to take very seriously.
 
Only Western? I think that while it may be important at some early stage to point out that a hammer is not a lambchop, this importance can be quite overstated.

Simplification. The point isn't that a hammer is not a lambchop, but that a hammer is also not a hammer.

Unintended consequences? Reductio? Or, in the case of "deepening of unknowing" (clever, but not original thought) - "The more you know, the more you realize you don't know". Not really groundbreaking. You know more in totality but know you know less relatively. Known knowns, known unknowns, unknown knowns, and unknown unknowns and so forth.

Horror is subjective, and making meta Ought statements about abstract value is not something I would tend to take very seriously.

The whole subjectivist/relativist thing is pretty outmoded itself - "nothing groundbreaking" as you would put it, and incredibly boring. You're claiming that referential horror doesn't exist (i.e. "horror" no longer refers to some abstract absolute), but that only structural horror exists (i.e. "horror" only means something because of its relation to other signifiers, and these meanings might change among individuals).

This is an illusion that we have to resist; in fact, the reason for this illusion is the very reason why I'm insisting on viewing theory in a fictional sense, as a coming-to-consciousness of philosophy. You want to reduce everything to relativism in a juvenile sense (no offense here, I'm merely stating that it's a lower step). We need to recognize relativism as part of the form by which we associate with the world. You want to deny us absolute access to reality by simply conceding that we cannot know it absolutely. I will readily agree with you (for the time being); but I'm interested in the formal limitations that prevent us from knowing it! The advent of theory, the study of literature, and the development of philosophy need to be assessed formally because this is how we come to recognize, criticize, interrogate, and (perhaps) overcome the restrictions imposed on us.

This is, in fact, a very literary way to look at theory. Rather than assess the content of these texts in a juvenile fashion, why not implement them as registers of formal paradoxes/problems/failures? Instead of seeing them as alternative philosophical models, we should see them as a difficulty of doing philosophy. Instead of simply dividing the concepts of the "for us" and the "not for us," and resigning ourselves to the "for us" and assigning the "not for us" as something eternally unattainable, why not investigate and study how this "not-for-us" appears to us through our formal relationship to the world?

I find a field of study in which we only focus on current needs and the best economic means to fulfill those needs to be utterly, mind-numbingly boring to the nth degree.
 
Simplification. The point isn't that a hammer is not a lambchop, but that a hammer is also not a hammer.

 
Simplification. The point isn't that a hammer is not a lambchop, but that a hammer is also not a hammer.

Semantics :cool:

The whole subjectivist/relativist thing is pretty outmoded itself - "nothing groundbreaking" as you would put it, and incredibly boring. You're claiming that referential horror doesn't exist (i.e. "horror" no longer refers to some abstract absolute), but that only structural horror exists (i.e. "horror" only means something because of its relation to other signifiers, and these meanings might change among individuals).

No longer or never has/will? Horror is "an intense feeling of fear, shock, or disgust." A feeling. Feelings are not original but rooted in other things. Those things will be subjective. (Ignoring, of course, arguments against subjects.)

Why should I be horrified by philosophy? Why should the carpet? Andromeda?

This is an illusion that we have to resist; in fact, the reason for this illusion is the very reason why I'm insisting on viewing theory in a fictional sense, as a coming-to-consciousness of philosophy. You want to reduce everything to relativism in a juvenile sense (no offense here, I'm merely stating that it's a lower step). We need to recognize relativism as part of the form by which we associate with the world. You want to deny us absolute access to reality by simply conceding that we cannot know it absolutely. I will readily agree with you (for the time being); but I'm interested in the formal limitations that prevent us from knowing it! The advent of theory, the study of literature, and the development of philosophy need to be assessed formally because this is how we come to recognize, criticize, interrogate, and (perhaps) overcome the restrictions imposed on us.

Of course I'm skeptical of sneaking up on anything - much less by calling It something Else. Seems like dressed up word games. Games are fun, but not necessarily educational.

I think I've said in the past that I believe most thought is not original, but merely rediscoveries, with the discovery that gets recorded and passed down hailed as an original (until it is possibly lost and "rediscovered" again). We can artificially enhance the number of rediscoveries by renaming and rephrasing old known discoveries. Like "deepening of unknowing," for instance. Rephrasing has its uses, but it's not original. On the other hand, hawking an imitation or redressed facsimile as the original is generally considered poor form.

This is, in fact, a very literary way to look at theory. Rather than assess the content of these texts in a juvenile fashion, why not implement them as registers of formal paradoxes/problems/failures? Instead of seeing them as alternative philosophical models, we should see them as a difficulty of doing philosophy. Instead of simply dividing the concepts of the "for us" and the "not for us," and resigning ourselves to the "for us" and assigning the "not for us" as something eternally unattainable, why not investigate and study how this "not-for-us" appears to us through our formal relationship to the world?

I would respond that the difficulty of doing philosophy is psychological and human, but then I would be guilty of as much domain bloat as Thacker ;)

Is/Ought is afaik still the biggest problem in philosophy, and I'm skeptical how new or reversed theories can rid us of this other than pretending it doesn't exist.

I find a field of study in which we only focus on current needs and the best economic means to fulfill those needs to be utterly, mind-numbingly boring to the nth degree.

Maybe because you don't experience or empathize with need? Of course, limiting it to "needs" is rather boring because actual needs are quite limited and the means for providing them equally so. What is more interesting is the unlimited field of subjective value and the equally unlimited field of creation and fulfillment. Even if we subscribe to the accelerationist view, where the ride is going is just as open and interesting.

I see a field of denial in the hands and mouths of those who want to spend time calling hammers not hammers. "You only think you're providing water to people; people and the water don't exist," etc. I wouldn't rule out the possibility of this sort of thought "breaking out of the Matrix," so to speak, but I won't believe it can happen until it does. Since I don't expect this sort of thought is really new, I'm going to say it's got thousands of years of failure at effecting change.

If something cannot come to bear in some concrete manner, I'm only going to give it the attention of limited amusement. If it can be brought to bear concretely but not positively, there is the horror.
 
Semantics :cool:

It isn't just semantics though, is it? In one sense, we can say that when I say "hammer" I'm referring to an instrument for driving nails into wood. In another, we can say that when you say "Hammer" you're referring to a rap artist.

However, in another sense we can say that in both instances, the word "hammer" fails to achieve a totality of meaning. Something escapes both instances, and this something isn't necessarily two different things in the same way that our relative definitions are two different things.

What escapes is beyond semantics.

No longer or never has/will? Horror is "an intense feeling of fear, shock, or disgust." A feeling. Feelings are not original but rooted in other things. Those things will be subjective. (Ignoring, of course, arguments against subjects.)

Why should I be horrified by philosophy? Why should the carpet? Andromeda?

"Horror" is a word. That's it. We use words to stand for things; but as I just explained, something always escapes. We can also conceive of this something, and of its escaping; but as soon as we use a word to label it, we appear to have drawn it back into semantics.

So keep from doing that, and consider for a moment that "horror" signifies (in this sense) the escape of something from conceptualization. What I am describing is analogous to a gravitational singularity - you cannot deny its existence, and you cannot claim to not know it entirely, because you and I and everyone do know that they're there, that they exist. This is the formal paradox I'm trying to consider. We can only conceptualize it negatively, but we have to consider how this conceptualization is formalized in a broader sense.

For instance, how does the organization of a gravitational singularity as an anomaly of space-time affect our conceptualization of historical development? And I'm not talking about cultural progress and other sociological perspectives; I'm asking how the anomaly of a black hole might also be present (in an analogous form) in our understanding of history, or the subject, or language.

Of course I'm skeptical of sneaking up on anything - much less by calling It something Else. Seems like dressed up word games. Games are fun, but not necessarily educational.

I think I've said in the past that I believe most thought is not original, but merely rediscoveries, with the discovery that gets recorded and passed down hailed as an original (until it is possibly lost and "rediscovered" again). We can artificially enhance the number of rediscoveries by renaming and rephrasing old known discoveries. Like "deepening of unknowing," for instance. Rephrasing has its uses, but it's not original. On the other hand, hawking an imitation or redressed facsimile as the original is generally considered poor form.

There's no such thing as an "original" discovery. Discovery is a process. And words are all we have to work with; so it might require some reorientation and/or redefinition on our part if we're to genuinely develop our modal thought.

I would respond that the difficulty of doing philosophy is psychological and human, but then I would be guilty of as much domain bloat as Thacker ;)

Is/Ought is afaik still the biggest problem in philosophy, and I'm skeptical how new or reversed theories can rid us of this other than pretending it doesn't exist.

This is why theories shouldn't be assessed on the basis of their content; they need to be recontextualized as part of a larger formal process. It has nothing to do with reversing them or angling their content differently. Rather, their content should be considered as consequences of formal engagement with reality. Even phenomena such as social Darwinism or Nazism, despite their egregious manipulations of science and economics, need to be considered in this manner.

Maybe because you don't experience or empathize with need? Of course, limiting it to "needs" is rather boring because actual needs are quite limited and the means for providing them equally so. What is more interesting is the unlimited field of subjective value and the equally unlimited field of creation and fulfillment. Even if we subscribe to the accelerationist view, where the ride is going is just as open and interesting.

How is what I'm saying ignoring the importance of "where the ride is going"? I feel as though I'm pushing in that direction. To simply study current biological needs and their economic achievement affords no development whatsoever, accelerationist or otherwise.

I see a field of denial in the hands and mouths of those who want to spend time calling hammers not hammers. "You only think you're providing water to people; people and the water don't exist," etc. I wouldn't rule out the possibility of this sort of thought "breaking out of the Matrix," so to speak, but I won't believe it can happen until it does. Since I don't expect this sort of thought is really new, I'm going to say it's got thousands of years of failure at effecting change.

If something cannot come to bear in some concrete manner, I'm only going to give it the attention of limited amusement. If it can be brought to bear concretely but not positively, there is the horror.

Processes take time; and without the necessary attention, perhaps they would never come to bear in some concrete manner.

You seem to be advocating the abandonment of everything that yields no immediate reward. Perhaps I'm misreading; but what looks absurd to you also looks useless. You want humans to focus on what works here and now. I would urge that we postpone such interests in certain circles or fields of study in order to maintain a mildly objective understanding of our place.

I'm not privileging the human here, or saying we have some duty to involve ourselves in studies of this sort; but humans are symbiotic with their environment. What emerges only does so according to some degree of our involvement. It may be that, in future centuries, the emergent phenomena that we help produce continue without our assistance; but until then, we can't ignore the possibility that the persistent interrogation of our minds might yield an alteration of our very formal relationship to the world around us.

And of course water and people exist; when I say "a hammer is not a hammer," I don't mean that the hammer doesn't exist. This is the form of our relationship to reality. It isn't all or nothing.
 
It isn't just semantics though, is it? In one sense, we can say that when I say "hammer" I'm referring to an instrument for driving nails into wood. In another, we can say that when you say "Hammer" you're referring to a rap artist.

However, in another sense we can say that in both instances, the word "hammer" fails to achieve a totality of meaning. Something escapes both instances, and this something isn't necessarily two different things in the same way that our relative definitions are two different things.

What escapes is beyond semantics.

I have an issue with using the word escape. Since words are symbology (compounded) anyway, they never have meaning in themselves. There is no escape, although there can be loss, or noise.

"Horror" is a word. That's it. We use words to stand for things; but as I just explained, something always escapes. We can also conceive of this something, and of its escaping; but as soon as we use a word to label it, we appear to have drawn it back into semantics.

So keep from doing that, and consider for a moment that "horror" signifies (in this sense) the escape of something from conceptualization. What I am describing is analogous to a gravitational singularity - you cannot deny its existence, and you cannot claim to not know it entirely, because you and I and everyone do know that they're there, that they exist. This is the formal paradox I'm trying to consider. We can only conceptualize it negatively, but we have to consider how this conceptualization is formalized in a broader sense.

The thing that cannot be named? Has it escaped though? Have we escaped it? Limiting horror to relations with escape is, again, problematic. But I don't think symbology escapes.

There's no such thing as an "original" discovery. Discovery is a process. And words are all we have to work with; so it might require some reorientation and/or redefinition on our part if we're to genuinely develop our modal thought.

Sure it's a process, I don't see how that precludes us from planting the proverbial flag. Ignoring the fact that others were already there (which I already alluded to re: rediscovery), although the Lewis and Clark Expedition was most certainly a process, there was an objective and ultimately a literal flag planting at Fort Clatsop. "Accidental discoveries" have similar moments with just a drastically reduced process.


This is why theories shouldn't be assessed on the basis of their content; they need to be recontextualized as part of a larger formal process. It has nothing to do with reversing them or angling their content differently. Rather, their content should be considered as consequences of formal engagement with reality. Even phenomena such as social Darwinism or Nazism, despite their egregious manipulations of science and economics, need to be considered in this manner.

How is what I'm saying ignoring the importance of "where the ride is going"? I feel as though I'm pushing in that direction. To simply study current biological needs and their economic achievement affords no development whatsoever, accelerationist or otherwise.

Processes take time; and without the necessary attention, perhaps they would never come to bear in some concrete manner.

......

I'm not privileging the human here, or saying we have some duty to involve ourselves in studies of this sort; but humans are symbiotic with their environment. What emerges only does so according to some degree of our involvement. It may be that, in future centuries, the emergent phenomena that we help produce continue without our assistance; but until then, we can't ignore the possibility that the persistent interrogation of our minds might yield an alteration of our very formal relationship to the world around us.

I do privilege the human. To do otherwise is suicidal. However, referencing "all or nothing thinking", to privilege the human doesn't require ignorance of everything else - to the contrary of a lot of nominal "free marketers" et al. (In fact, ignoring everything else is a quick way to fail.)

I think it's funny to accuse Darwinists (I refuse to make a non-existent distinction) and Nazis of scientific manipulation. No more or less than any other group or groups (and Nazi scientists shaped the world more than any other group in their era and post WWII - Operation Paperclip). However, they most certainly are part of the process of history.

How does study of the concrete afford nothing concrete? On the other hand, mistaking contingency for the absence of cause and effect does certainly afford nothing (or worse, believing in cause and effect but that they can be entirely manipulated). We can create things from science fiction in real time now, but we cannot also create the consequences.

The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design. - Hayek

You seem to be advocating the abandonment of everything that yields no immediate reward. Perhaps I'm misreading; but what looks absurd to you also looks useless. You want humans to focus on what works here and now. I would urge that we postpone such interests in certain circles or fields of study in order to maintain a mildly objective understanding of our place.

That certainly is a misreading. I've been pounding the pulpit of long time preference for some time now. However, it's possible to mistake a focus on what works period as only a focus on what works "now". That "this time it's different" type of thinking is why the same mistakes keep being made.

One thing I am very sympathetic to is the cyclic view of (human) history, rather than linear. We might be hurtling through space-time at a billion miles an hour, but we still turn on the axis and revolve around the sun more or less like we always have - tolerable variance isn't enough to throw the relation out.
 
I'm currently stranded in Buffalo, so I have the privilege of quick replies.

To be honest, I'm so confused by many of your comments. I don't see how you can overlook certain things, but I suppose this is a consequence of our differing areas of study.

I have an issue with using the word escape. Since words are symbology (compounded) anyway, they never have meaning in themselves. There is no escape, although there can be loss, or noise.

So because "escape" implies some sort of agency you want to avoid it. That's fine; noise then, or loss! I don't understand the distinction you're drawing, and I see no purpose in this separation other than to prolong and confuse the argument.

The thing that "escapes" has no intention or agency; "escape" is a purely rhetorical term. It's lost, or it's just noise; either works fine! All I'm saying is that when we mean different things by saying the word "hammer," there is still something missed (lost, noisy, escaping, etc.) that remains uncovered by the word.

The action of this missing, as it is always and forever (given the bizarre functioning of language), is a theoretical constant. I cannot stress this enough. It is something taking place beyond the symbols, something that betrays the efficacy of language. We can claim, logically, that this thing is absolute in language; that it is always there. Derrida calls it "the trace," but claims that it cannot be defined or positively determined.

The thing that cannot be named? Has it escaped though? Have we escaped it? Limiting horror to relations with escape is, again, problematic. But I don't think symbology escapes.

If it cannot be named, termed, or otherwise grasped by categorical methods, then it "escapes" in some sense; but we can still know it, because it necessitates itself via epistemological structures. If it isn't "the hammer that hammers," and if it isn't "the Hammer that raps," and if it isn't even "the hammer" in any functional sense of the word, then it is something else. We do not exhaust reality through speech.

Sure it's a process, I don't see how that precludes us from planting the proverbial flag. Ignoring the fact that others were already there (which I already alluded to re: rediscovery), although the Lewis and Clark Expedition was most certainly a process, there was an objective and ultimately a literal flag planting at Fort Clatsop. "Accidental discoveries" have similar moments with just a drastically reduced process.

Only because we assign them beginnings and ends. Human discoveries, à la Lewis and Clark, are particularly easy because they offer man-made positives for us to cling to. Trying to assign points to something as abstract and as negative as knowledge (since knowledge constitutes, by definition, the negotiation of frontiers) is futile and fruitless. It betrays the true nature of the process.

I do privilege the human. To do otherwise is suicidal.

False. This is absolutely and positively false. You say it over and over. Stop saying it. It isn't true. When I say "privilege" I don't mean that humans should stop trying to survive or something like that. I mean stop assuming that the central position we grant ourselves in organizing knowledge of the world is accurate, or approaches accuracy.

If you don't understand that, then it's hopeless to have this discussion.

However, referencing "all or nothing thinking", to privilege the human doesn't require ignorance of everything else - to the contrary of a lot of nominal "free marketers" et al. (In fact, ignoring everything else is a quick way to fail.)

I think it's funny to accuse Darwinists (I refuse to make a non-existent distinction) and Nazis of scientific manipulation. No more or less than any other group or groups (and Nazi scientists shaped the world more than any other group in their era and post WWII - Operation Paperclip). However, they most certainly are part of the process of history.

Social Darwinists are different from Darwinists. To claim it's a "nonexistent distinction" is pretty rash. Social Darwinism misappropriates and misunderstands the entire notion of Darwinian thought. I realize you probably want to defend them because you have a Herbert Spencer quote in your signature; but the truth is, he wasn't the most intelligent person when it came to applying Darwinian theory. The most egregious error is the misinterpretation of "survival of the fittest" (which isn't even the correct phrase to begin with).

How does study of the concrete afford nothing concrete? On the other hand, mistaking contingency for the absence of cause and effect does certainly afford nothing (or worse, believing in cause and effect but that they can be entirely manipulated). We can create things from science fiction in real time now, but we cannot also create the consequences.

I don't think I ever said that studying concrete things affords nothing concrete. These are the kinds of comments that bog down discussion. I said that the study of immediate needs affords no development; by which I mean, no possibility of revolutionary change. You would rather have us study the immediate and let the change occur of its own accord.

The problem with that outlook is that you ignore the possibility that your purportedly natural and "accurate" behavior is causing harm to others. All I have ever said is that the type of theory and study I espouse provides a critical challenge to our assumptions and biases.

That you want to stifle them entirely insinuates that you don't want to be criticized, which in turn suggests that you're trying to hide something.

The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design. - Hayek

This does nothing to contradict my accusation of immediate demands. Economics purports to be a science of needs, and how individuals can fulfill those needs; when pressed, of course humans can imagine new designs.

I have no problem with economics as a science and system of study. I have a problem with assumptions; and all I want to do is provide the necessary critique so that we don't ride off into the sunset on the backs of those assumptions, only to discover that the world is not what we thought it was.

That certainly is a misreading. I've been pounding the pulpit of long time preference for some time now. However, it's possible to mistake a focus on what works period as only a focus on what works "now". That "this time it's different" type of thinking is why the same mistakes keep being made.

But what works now isn't necessarily what works "period"; you see, you're making an assumption. Hume would say "tsk tsk." So I read any kind of absolute statement like that as commenting on what works now. Because that is all we can read it as, with any certainty. To project it into the future is to assume it won't change, and that it hasn't already changed for some.

One thing I am very sympathetic to is the cyclic view of (human) history, rather than linear. We might be hurtling through space-time at a billion miles an hour, but we still turn on the axis and revolve around the sun more or less like we always have - tolerable variance isn't enough to throw the relation out.

History is nothing more than representation; linear, cyclical, etc.
 
To be honest, I'm so confused by many of your comments. I don't see how you can overlook certain things, but I suppose this is a consequence of our differing areas of study.

I'm sure that it is. I won't say that I am confused by some of the things you write (here and otherwise), but I will say I am sure I don't always have a firm grasp of your references. However, the inferences are often pretty clear even if implicit or unconsidered, and it is usually those inferences that I respond to. Obviously this could lead to some confusion if you hadn't followed a particular thought down the same path.

So because "escape" implies some sort of agency you want to avoid it. That's fine; noise then, or loss! I don't understand the distinction you're drawing, and I see no purpose in this separation other than to prolong and confuse the argument.

The thing that "escapes" has no intention or agency; "escape" is a purely rhetorical term. It's lost, or it's just noise; either works fine! All I'm saying is that when we mean different things by saying the word "hammer," there is still something missed (lost, noisy, escaping, etc.) that remains uncovered by the word.

The action of this missing, as it is always and forever (given the bizarre functioning of language), is a theoretical constant. I cannot stress this enough. It is something taking place beyond the symbols, something that betrays the efficacy of language. We can claim, logically, that this thing is absolute in language; that it is always there. Derrida calls it "the trace," but claims that it cannot be defined or positively determined.

I actually hadn't considered it specifically in terms of agency, but I suppose that clarifies. I also suppose my amateur background in communication (particularly digital) technologies informs my understanding of human communication, or at least my language. When information is transmitted, barring physical malfunction, 100% is always transmitted. However, interference ("noise") and a lack of power over distance (loss) can affect the full receipt of the transmission on the other end. The information does not go anywhere. I suppose you could call this "escape", but I don't think it's as accurate as it could be - although it appears critically necessary for ascribing agency to anything and everything.
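For what it's worth, here's a toy sketch in Python of the distinction I mean. The channel model, the probabilities, and the names are all just illustrative assumptions on my part, not a claim about any real protocol:

```python
import random

random.seed(42)  # reproducible toy run

def transmit(bits, p_noise=0.05, p_loss=0.05):
    """Toy binary channel: every symbol gets sent, but interference ("noise")
    flips some and attenuation ("loss") drops others. Nothing "escapes";
    the received sequence differs from the sent one only by these effects."""
    received = []
    for bit in bits:
        if random.random() < p_loss:
            received.append(None)      # loss: the symbol never arrives
        elif random.random() < p_noise:
            received.append(bit ^ 1)   # noise: the symbol arrives flipped
        else:
            received.append(bit)       # clean receipt
    return received

sent = [random.randint(0, 1) for _ in range(20)]
got = transmit(sent)
print("sent:  ", sent)
print("got:   ", got)
print("intact:", sum(s == g for s, g in zip(sent, got)), "of", len(sent))
```

On this model there's no residue on the sending side at all; whatever fails to arrive is accounted for entirely by the channel.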


If it cannot be named, termed, or otherwise grasped by categorical methods, then it "escapes" in some sense; but we can still know it, because it necessitates itself via epistemological structures. If it isn't "the hammer that hammers," and if it isn't "the Hammer that raps," and if it isn't even "the hammer" in any functional sense of the word, then it is something else. We do not exhaust reality through speech.

Do not or cannot? It almost seems like this train of thought is reverting back to a form (lolz) of Platonic argument on forms.

Only because we assign them beginnings and ends. Human discoveries, à la Lewis and Clark, and particularly easy because they offer man-made positives for us to cling to. Trying to assign points to something as abstract and as negative as knowledge (since knowledge constitutes, by definition, the negotiation of frontiers) is futile and fruitless. It betrays the true nature of the process.

Are these frontiers then non-negotiable?

False. This is absolutely and positively false. You say it over and over. Stop saying it. It isn't true. When I say "privilege" I don't mean that humans should stop trying to survive or something like that. I mean stop assuming that the central position we grant ourselves in organizing knowledge of the world is accurate, or approaches accuracy.

If you don't understand that, then it's hopeless to have this discussion.

Either the knowledge is accurate (as it pertains to us) or it is not (with a general understanding of accuracy as "working"). That we are the center of our collection and organization is unavoidable - and that we work for ourselves is necessary. To think otherwise isn't just suicidal in an immediate and concrete "well we have to eat" sort of way.

Social Darwinists are different from Darwinists. To claim it's a "nonexistent distinction" is pretty rash. Social Darwinism misappropriates and misunderstands the entire notion of Darwinian thought. I realize you probably want to defend them because you have a Herbert Spencer quote in your signature; but the truth is, he wasn't the most intelligent person when it came to applying Darwinian theory. The most egregious error is the misinterpretation of "survival of the fittest" (which isn't even the correct phrase to begin with).

A complete sidenote, but whenever I see "survival of the fittest" now I just think "Survival of the Fitness boys" :lol:

http://youtu.be/hIX7l06VhIg

Anyway, I've read none of Spencer other than this quote, which sums up in a sentence the problem of our modernity. He could be wrong about any and everything else and this stands on its own.

The difference between the phenomenon observed by Darwin and its application is that the latter is active, or applied. Culling on preference rather than letting processes do their work. "Helping nature along".

I don't think I ever said that studying concrete things affords nothing concrete. These are the kinds of comments that bog down discussion. I said that the study of immediate needs affords no development; by which I mean, no possibility of revolutionary change. You would rather have us study the immediate and let the change occur of its own accord.

The problem with that outlook is that you ignore the possibility that your purportedly natural and "accurate" behavior is causing harm to others. All I have ever said is that the type of theory and study I espouse provides a critical challenge to our assumptions and biases.

That you want to stifle them entirely insinuates that you don't want to be criticized, which in turn suggests that you're trying to hide something.

I said nothing about stifling them. I do believe that if it ran wild it would stifle itself indirectly - an abundance of necessities is needed to fuel work on non-necessities (studies of any sort, higher-order/capital goods, etc.), and when the balance tips it self-corrects or self-eradicates over time.

I don't know what you mean by revolution though, since it seems you are in fact precluding anything physical from being revolutionary. I think technology is very revolutionary.

This does nothing to contradict my accusation of immediate demands. Economics purports to be a science of needs, and how individuals can fulfill those needs; when pressed, of course humans can imagine new designs.

I have no problem with economics as a science and system of study. I have a problem with assumptions; and all I want to do is provide the necessary critique so that we don't ride off into the sunset on the backs of those assumptions, only to discover that the world is not what we thought it was.

Well, avoiding assumptions as much as possible is fine, but it seems the conclusion of a lot of this is to leave no ground for any conclusion, assumed or otherwise, and conclusions must be reached - right or wrong - for action.


But what works now isn't necessarily what works "period"; you see, you're making an assumption. Hume would say "tsk tsk." So I read any kind of absolute statement like that as commenting on what works now. Because that is all we can read it as, with any certainty. To project it into the future is to assume it won't change, and that it hasn't already changed for some.

Haha Hume. Poor guy did try, didn't he? A for effort. Anyway, obviously, if we cannot find a way into another solar system, human existence will end with the sun and nothing will work at some point in the future. There really is a limited number of human/social functions and methods, although they take different names and sometimes hashes, but it's mostly recycling and stitching, and the same bad parts fail time and again for the same reasons. Softer landings are aided by technology, not through some sort of systemic revolution or a change in "what works".

History is nothing more than representation; linear, cyclical, etc.

That assumes all representation is equal.
 
I actually hadn't considered it specifically in terms of agency, but I suppose that clarifies. I also suppose my amateur background in communication (particularly digital) technologies informs my understanding of human communication, or at least my language. When information is transmitted, barring physical malfunction, 100% is always transmitted. However, interference ("noise") and a lack of power over distance (loss) can affect the full receipt of the transmission on the other end. The information does not go anywhere. I suppose you could call this "escape", but I don't think it's as accurate as it could be - although it appears critically necessary for ascribing agency to anything and everything.

Well, I don't think that when information is communicated linguistically, 100% is always transmitted. In fact, I would say that you cannot even measure linguistic communication in that sense. When it boils right down to it, I'm a firm believer that every time you and I communicate, what occurs unconsciously (and nearly instantaneously) is an educated guess as to how to respond.

I'm forced to inquire: what is information? You could say it's the "meaning" of the word, but this is problematic because words can always mean different things, even if it boils down to experiential reception. Okay, then; it's an author's (or speaker's) intended meaning. This is also problematic, because the language iterates (i.e. repeats) beyond the death of any and all speakers. After a speaker's death, we analyse and interpret their speech and organize it into a hierarchy of language and literature, and we associate the dead speaker with what he/she said, thus abiding by Foucault's "author function." Information, at this point, is no longer reducible to authorial intention, but to the hierarchies in which the author and his/her text are embedded.

Is information purely in the semantic content of language; or is it also in its style? Its inflection? Can information be transmitted unspoken, via a gesture or suggestive gaze?

In communication technologies, we might say that information is in the math, the algorithms, that we feed into our machines and that they then communicate to someone else. In this scenario, information is tautological: A equals A on both ends, if we reduce it purely to code.
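To make that tautology concrete, here's a throwaway sketch (the hashing and the sample string are my own illustrative choices, nothing more):

```python
import hashlib

# At the level of pure code, information is tautological: the bytes that
# arrive are, checksum-verifiably, the bytes that were sent.
message = "the hammer that hammers".encode("utf-8")

sent_digest = hashlib.sha256(message).hexdigest()
received = bytes(message)  # a faithful transmission: byte-for-byte copy
received_digest = hashlib.sha256(received).hexdigest()

assert sent_digest == received_digest  # A equals A on both ends
print("identical payloads:", sent_digest == received_digest)

# What the phrase *means* is nowhere in the digest; interpretation begins
# exactly where this code-level identity ends.
```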

But language isn't a code; once we "translate" the code (the math, the algorithms) into what it "says," then we also have to interpret the message.

Žižek gives a good hypothetical example of this:

A secret organization sends one of its spies to infiltrate a country with a purportedly cruel totalitarian government. The agent is to reside in the country for some months, investigate, and then send word back as to the veracity of the country's governmental system. However, it is assumed that the mail will be intently observed, so the agent must respond in code. If the stories are false, and the country is overseen by a benign government, the agent will respond in blue ink; but if the government is tyrannical and oppressive, the agent will respond in red ink.

Months later, the agency receives its reply: a message written in blue ink. It reads: "Everything is wonderful here. Everyone is provided food, clothing, and shelter, and all citizens are employed. The government is peaceful and unobtrusive. Everyone is content, and has everything they could ever want. There is only one thing you cannot buy: red ink."

Do not or cannot? It almost seems like this train of thought is reverting back to a form (lolz) of Platonic argument on forms.

I would say otherwise, because I'm saying that language cannot exhaust reality. Forms, concepts, language do not subsist in reality as Plato claims they do; something else, something non-conceptually real, exists beyond the symbolic. The real I'm arguing for isn't formal in the Platonic sense; it is real beyond formal comprehension. What I'm pushing for is a closer examination of the evolution of our formal relationship with, and representation of, reality. The "form" lies only with our engagement, not in reality itself.

Are these frontiers then non-negotiable?

Certainly not; but their advancement doesn't approach some total appropriation of the real. This seems like a Zeno's paradox; that the frontier never really moves at all, since it can never reach a total appropriation. But I think that continuous revision and critique of our formal relationship, which negotiates with and interrogates the frontier, can eventually lead to the possibility of forming posthuman ontologies.

Either the knowledge is accurate (as it pertains to us) or it is not (with a general understanding of accuracy as "working"). That we are the center of our collection and organization is unavoidable - and that we work for ourselves is necessary. To think otherwise isn't just suicidal in an immediate and concrete "well we have to eat" sort of way.

You're right that our situation at the conceptual "center" is unavoidable; any other possibility seems impossible. This is exactly the formal relationship that I'm saying we need to interrogate. You say that physical/biological suicide isn't necessarily what you meant. So why is the "suicide of the subject," so to speak, such a bad thing?

A complete sidenote, but whenever I see "survival of the fittest" now I just think "Survival of the Fitness boys" :lol:

http://youtu.be/hIX7l06VhIg

Anyway, I've read none of Spencer other than this quote, which sums up in a sentence the problem of our modernity. He could be wrong about any and everything else and this stands on its own.

The difference between the phenomenon observed by Darwin and its application is that the latter is active, or applied. Culling on preference rather than letting processes do their work. "Helping nature along".

Actually, the social Darwinists were the ones who "culled on preference." Their entire ideology was created and fomented for the purposes of rationalizing the politico-economic exploitation of mass quantities of labor. They were ideologues, pure and simple. Darwin was the one who truly vied for "letting processes do their work."

"Survival of the fittest" implies necessity and teleology; in fact, the language used by social Darwinists was that because wealthy capitalists had money and were successful, therefore they should have money and be successful. The true Darwinian perspective is not "survival of the fittest," but "survival of the luckiest."

I said nothing about stifling them. I do believe that if it ran wild it would stifle itself indirectly - an abundance of necessities is needed to fuel work on non-necessities (studies of any sort, higher-order/capital goods, etc.), and when the balance tips it self-corrects or self-eradicates over time.

I don't know what you mean by revolution though, since it seems you are in fact precluding anything physical from being revolutionary. I think technology is very revolutionary.

I agree that technology is revolutionary; but it isn't as simple as man-made technologies that process our "information" and aid us in completing our tasks. Technology is also revolutionary in the sense that it actively alters our formal relationship to the world. This is what I'm interested in.

Well, avoiding assumptions as much as possible is fine, but it seems the conclusion of a lot of this is to leave no ground for any conclusion, assumed or otherwise, and conclusions must be reached - right or wrong - for action.

Leaving no ground for conclusion isn't the point; the point is to force us to remember that, although we must act with some semblance of certainty, the ground on which we stand is unfathomably tenuous. In fact...

Haha Hume. Poor guy did try, didn't he? A for effort.

Hume concluded that there's no way for us to successfully determine causality or exist in a way that relies on causal proof. Therefore, all we can do is live as though causality exists, since it seems to work most of the time.

But he would remind us that we should never forget the logical aimlessness of our actions.

Anyway, obviously, if we cannot find a way into another solar system, human existence will end with the sun and nothing will work at some point in the future. There really is a limited number of human/social functions and methods, although they take different names and sometimes hashes, but it's mostly recycling and stitching, and the same bad parts fail time and again for the same reasons. Softer landings are aided by technology, not through some sort of systemic revolution or a change in "what works".

There may be a limited number of human functions; but again, that implies some sort of ground, or original basis, for the human. Part of what posthumanism entails is the study of how technology isn't simply an instrument in our hands, but an active mediating force that alters our relationship to the world, and contributes to those limited functions. As I think I've said before (and as many theorists claim), we are already posthuman.

That assumes all representation is equal.

I wasn't assuming anything. I was simply saying that conceptualizing history as cyclical in no way - not even closely - approximates what it's actually like. In fact, calling it history is problematic, since history can only be human to begin with (at least, as we conceive of it).

I don't believe that all representations are equal; this is part of the argument I'm making. We need to actively study our formal relationship to the world in order to better understand the symbols we use, the concepts we create. Understanding history and its representations (i.e. historiography) isn't something that economics can immediately yield; philosophy and the sciences are what reveal our conceptual understanding of history.
 
Screw neoreaction, I'll take necronautical!

This, perhaps, approaches what we’re trying to feel our way toward: the breach, the sudden, epiphanic emergence of the genuinely unplanned, the departure from the script. To put it in fashionable Badiouan, the Event. The INS believes in the Event—in the power of the event, and that of art, to carry that event within itself: bring it to pass, or hold it in abeyance, as potentiality. And, paradoxically, the best way that art can do this is by allowing itself to be distracted, gazing in the rear view mirror.

http://www.believermag.com/issues/201011/?read=article_necronautical

http://necronauts.org/ (homepage)
 
Guess Amazon and Google have driven the cost of starting an interweb movement to next to zero. :lol:

Sorry about not getting back to the other post, will hopefully get to it tomorrow.
 
http://www.forbes.com/sites/nathanlewis/2014/01/23/keynes-and-rothbard-agreed-todays-economics-is-mercantilism/


Economists were actually late to the party. The significance of Keynes’ General Theory of Employment, Interest and Money of 1936 was not that it showed a new way — by that time, world governments had been busily implementing Mercantilist policies for over six years. Rather, it showed economists how they could update their blather to the new political realities, so that they could reclaim their prized sinecures as elaborate justifiers for what politicians wanted to do anyway. Keynes’ book is essentially unreadable; the title alone tells you where Keynes intended to get his employment from.

Keynes himself understood what he had done. An entire chapter of the General Theory (chapter 23, Notes on Mercantilism …) is dedicated to cheering the return of the Mercantilist agenda.

“[the Mercantilists] were emphatic that an unduly high rate of interest was the main obstacle to the growth of wealth … and several of them made it clear that their preoccupation with increasing the quantity of money was due to their desire to diminish the rate of interest.”

Does that sum up the past several years of Bernanke and Quantitative Easing?

Murray Rothbard wrote extensively and wonderfully on this topic, especially in his fantastic book Economic Thought Before Adam Smith. This used to be a little-known text buried in university libraries, but it is now available for free in eBook form from mises.org.

One of Rothbard’s points is that Mercantilism reflected big government; the Classical or “laissez-faire” view reflected small government. Today’s Mercantilism is a reflection of the expansion of the U.S. government, from 7% of GDP in 1900 to about 40% today. This brief excerpt is also from mises.org:

“As the economic aspect of state absolutism, mercantilism was of necessity a system of state-building, of big government, of heavy royal expenditure, of high taxes, of (especially after the late 17th century) inflation and deficit finance, of war, imperialism, and the aggrandizing of the nation-state. In short, a politicoeconomic system very like that of the present day.”

That sounds a little familiar …

Here’s Rothbard, talking about Mercantilism in 1963:

“Mercantilism has had a ‘good press’ in recent decades, in contrast to 19th-century opinion. In the days of Adam Smith and the classical economists, mercantilism was properly regarded as a blend of economic fallacy and state creation of special privilege. But in our century, the general view of mercantilism has changed drastically.

Keynesians hail mercantilists as prefiguring their own economic insights; Marxists, constitutionally unable to distinguish between free enterprise and special privilege, hail mercantilism as a “progressive” step in the historical development of capitalism; socialists and interventionists salute mercantilism as anticipating modern state building and central planning.

Mercantilism, which reached its height in the Europe of the 17th and 18th centuries, was a system of statism which employed economic fallacy to build up a structure of imperial state power, as well as special subsidy and monopolistic privilege to individuals or groups favored by the state.”