Not "fictional theorizing" per se; because the theory itself isn't "fiction."
I'm trying to insinuate that a necessary part of philosophy must entail the absolute opposition to everything, even itself. If we take philosophy seriously, then it should not only tell us logical truths about the mathematizable world. It must also expose yawning gaps that structure the organization of that logic.
Many traditional logicians and scientists resist this necessity because it challenges the totality of their field models; and I'm not saying that their models are incorrect. I'm merely suggesting that looking at theory as the persistent illogical, or "unthought," component of philosophy - and identifying theory, in broad strokes, with philosophy's failure over form - might allow us to better understand the development of philosophy as an institution of Western thought and consciousness.
I'm looking at this conceptually, as one must do in this case. If philosophy is something, and is defined as something - even if it's a process - then there is something else that philosophy is not, but that it pulls along with it, something in its shadow, that we might call "un-philosophy." This is the locale of horror, the source of Nick Land's "deepening of unknowing."
Only Western? I think that while it may be important at some early stage to point out that a hammer is not a lambchop, this importance can be quite overstated.
Unintended consequences? Reductio? Or, in the case of "deepening of unknowing" (clever, but not an original thought): "The more you know, the more you realize you don't know." Not really groundbreaking. You know more in totality but know you know less relatively. Known knowns, known unknowns, unknown knowns, unknown unknowns, and so forth.
Horror is subjective, and making meta Ought statements about abstract value is not something I would tend to take very seriously.
Simplification. The point isn't that a hammer is not a lambchop, but that a hammer is also not a hammer.
The whole subjectivist/relativist thing is pretty outmoded itself - "nothing groundbreaking" as you would put it, and incredibly boring. You're claiming that referential horror doesn't exist (i.e. "horror" no longer refers to some abstract absolute), but that only structural horror exists (i.e. "horror" only means something because of its relation to other signifiers, and these meanings might change among individuals).
This is an illusion that we have to resist; in fact, the reason for this illusion is the very reason why I'm insisting on viewing theory in a fictional sense, as a coming-to-consciousness of philosophy. You want to reduce everything to relativism in a juvenile sense (no offense here, I'm merely stating that it's a lower step). We need to recognize relativism as part of the form by which we associate with the world. You want to deny us absolute access to reality by simply conceding that we cannot know it absolutely. I will readily agree with you (for the time being); but I'm interested in the formal limitations that prevent us from knowing it! The advent of theory, the study of literature, and the development of philosophy need to be assessed formally because this is how we come to recognize, criticize, interrogate, and (perhaps) overcome the restrictions imposed on us.
This is, in fact, a very literary way to look at theory. Rather than assess the content of these texts in a juvenile fashion, why not implement them as registers of formal paradoxes/problems/failures? Instead of seeing them as alternative philosophical models, we should see them as a difficulty of doing philosophy. Instead of simply dividing the concepts of the "for us" and the "not for us," and resigning ourselves to the "for us" and assigning the "not for us" as something eternally unattainable, why not investigate and study how this "not-for-us" appears to us through our formal relationship to the world?
I find a field of study in which we only focus on current needs and the best economic means to fulfill those needs to be utterly, mind-numbingly boring to the nth degree.
Semantics
No longer or never has/will? Horror is "an intense feeling of fear, shock, or disgust." A feeling. Feelings are not original but rooted in other things. Those things will be subjective. (Ignoring, of course, arguments against subjects.)
Why should I be horrified by philosophy? Why should the carpet? Andromeda?
Of course I'm skeptical of sneaking up on anything - much less by calling It something Else. Seems like dressed up word games. Games are fun, but not necessarily educational.
I think I've said in the past that I believe most thought is not original but merely rediscovery, with whichever discovery gets recorded and passed down hailed as the original (until it is possibly lost and "rediscovered" again). We can artificially enhance the number of rediscoveries by renaming and rephrasing old, known discoveries. Like "deepening of unknowing," for instance. Rephrasing has its uses, but it's not original. On the other hand, hawking an imitation or redressed facsimile as the original is generally considered poor form.
I would respond that the difficulty of doing philosophy is psychological and human; but then I would be guilty of as much domain bloat as Thacker.
Is/Ought is afaik still the biggest problem in philosophy, and I'm skeptical how new or reversed theories can rid us of this other than pretending it doesn't exist.
Maybe because you don't experience or empathize with need? Of course, limiting it to "needs" is rather boring because actual needs are quite limited and the means for providing them equally so. What is more interesting is the unlimited field of subjective value and the equally unlimited field of creation and fulfillment. Even if we subscribe to the accelerationist view, where the ride is going is just as open and interesting.
I see a field of denial in the hands and mouths of those who want to spend time calling hammers not hammers. "You only think you're providing water to people; the people and the water don't exist," etc. I wouldn't rule out the possibility of this sort of thought "breaking out of the Matrix," so to speak, but I won't believe it can happen until it does. Since I don't expect this sort of thought is really new, I'm going to say it's got thousands of years of failure at effecting change.
If something cannot come to bear in some concrete manner, I'm only going to give it the attention of limited amusement. If it can be brought to bear concretely but not positively, there is the horror.
It isn't just semantics though, is it? In one sense, we can say that when I say "hammer" I'm referring to an instrument for driving nails into wood. In another, we can say that when you say "Hammer" you're referring to a rap artist.
However, in another sense we can say that in both instances, the word "hammer" fails to achieve a totality of meaning. Something escapes both instances, and this something isn't necessarily two different things in the same way that our relative definitions are two different things.
What escapes is beyond semantics.
"Horror" is a word. That's it. We use words to stand for things; but as I just explained, something always escapes. We can also conceive of this something, and of its escaping; but as soon as we use a word to label it, we appear to have drawn it back into semantics.
So keep from doing that, and consider for a moment that "horror" signifies (in this sense) the escape of something from conceptualization. What I am describing is analogous to a gravitational singularity - you cannot deny its existence, and you cannot claim not to know it entirely, because you and I and everyone know that it's there, that it exists. This is the formal paradox I'm trying to consider. We can only conceptualize it negatively, but we have to consider how this conceptualization is formalized in a broader sense.
There's no such thing as an "original" discovery. Discovery is a process. And words are all we have to work with; so it might require some reorientation and/or redefinition on our part if we're to genuinely develop our modal thought.
This is why theories shouldn't be assessed on the basis of their content; they need to be recontextualized as part of a larger formal process. It has nothing to do with reversing them or angling their content differently. Rather, their content should be considered as consequences of formal engagement with reality. Even phenomena such as social Darwinism or Nazism, despite their egregious manipulations of science and economics, need to be considered in this manner.
How is what I'm saying ignoring the importance of "where the ride is going"? I feel as though I'm pushing in that direction. To simply study current biological needs and their economic achievement affords no development whatsoever, accelerationist or otherwise.
Processes take time; and without the necessary attention, perhaps they would never come to bear in some concrete manner.
......
I'm not privileging the human here, or saying we have some duty to involve ourselves in studies of this sort; but humans are symbiotic with their environment. What emerges only does so according to some degree of our involvement. It may be that, in future centuries, the emergent phenomena that we help produce continue without our assistance; but until then, we can't ignore the possibility that the persistent interrogation of our minds might yield an alteration of our very formal relationship to the world around us.
You seem to be advocating the abandonment of everything that yields no immediate reward. Perhaps I'm misreading; but what looks absurd to you also looks useless. You want humans to focus on what works here and now. I would urge that we postpone such interests in certain circles or fields of study in order to maintain a mildly objective understanding of our place.
I have an issue with using the word escape. Since words are symbology (compounded) anyway, they never have meaning in themselves. There is no escape, although there can be loss, or noise.
The thing that cannot be named? Has it escaped though? Have we escaped it? Limiting horror to relations with escape is, again, problematic. But I don't think symbology escapes.
Sure it's a process, I don't see how that precludes us from planting the proverbial flag. Ignoring the fact that others were already there (which I already alluded to re: rediscovery), although the Lewis and Clark Expedition was most certainly a process, there was an objective and ultimately a literal flag planting at Fort Clatsop. "Accidental discoveries" have similar moments with just a drastically reduced process.
I do privilege the human. To do otherwise is suicidal.
However, referencing "all or nothing thinking", to privilege the human doesn't require ignorance of everything else - to the contrary of a lot of nominal "free marketers" et al. (In fact, ignoring everything else is a quick way to fail.)
I think it's funny to accuse Darwinists (I refuse to make a non-existent distinction) and Nazis of scientific manipulation. No more or less than any other group or groups (and Nazi scientists shaped the world more than any other group in their era and post WWII - Operation Paperclip). However, they most certainly are part of the process of history.
How does study of the concrete afford nothing concrete? On the other hand, mistaking contingency for the absence of cause and effect does certainly afford nothing (or worse, believing in cause and effect but that they can be entirely manipulated). We can create things from science fiction in real time now, but we cannot also create the consequences.
The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design. - Hayek
That certainly is a misreading. I've been pounding the pulpit of long time preference for some time now. However, it's possible to mistake a focus on what works period as only a focus on what works "now". That "this time it's different" type of thinking is why the same mistakes keep being made.
One thing I am very sympathetic to is the cyclic view of (human) history, rather than the linear. We might be hurtling through spacetime at a billion miles an hour, but we still turn on the axis and revolve around the sun more or less like we always have - tolerable variance isn't enough to throw the relation out.
To be honest, I'm so confused by many of your comments. I don't see how you can overlook certain things, but I suppose this is a consequence of our differing areas of study.
So because "escape" implies some sort of agency you want to avoid it. That's fine; noise then, or loss! I don't understand the distinction you're drawing, and I see no purpose in this separation other than to prolong and confuse the argument.
The thing that "escapes" has no intention or agency; "escape" is purely rhetorical term. It's lost, or it's just noise; either works fine! All I'm saying is that when we mean different things by saying the word "hammer," there is still something missed (lost, noisy, escaping, etc.) that remains uncovered by the word.
The action of this missing, as it is always and forever (given the bizarre functioning of language), is a theoretical constant. I cannot stress this enough. It is something taking place beyond the symbols, something that betrays the efficacy of language. We can claim, logically, that this thing is absolute in language; that it is always there. Derrida calls it "the trace," but claims that it cannot be defined or positively determined.
If it cannot be named, termed, or otherwise grasped by categorical methods, then it "escapes" in some sense; but we can still know it, because it necessitates itself via epistemological structures. If it isn't "the hammer that hammers," and if it isn't "the Hammer that raps," and if it isn't even "the hammer" in any functional sense of the word, then it is something else. We do not exhaust reality through speech.
Only because we assign them beginnings and ends. Human discoveries, à la Lewis and Clark, are particularly easy because they offer man-made positives for us to cling to. Trying to assign such points to something as abstract and as negative as knowledge (since knowledge constitutes, by definition, the negotiation of frontiers) is futile and fruitless. It betrays the true nature of the process.
False. This is absolutely and positively false. You say it over and over. Stop saying it. It isn't true. When I say "privilege" I don't mean that humans should stop trying to survive or something like that. I mean stop assuming that the central position we grant ourselves in organizing knowledge of the world is accurate, or approaches accuracy.
If you don't understand that, then it's hopeless to have this discussion.
Social Darwinists are different from Darwinists. To claim it's a "nonexistent distinction" is pretty rash. Social Darwinism misappropriates and misunderstands the entire notion of Darwinian thought. I realize you probably want to defend them because you have a Herbert Spencer quote in your signature; but the truth is, he wasn't the most intelligent person when it came to applying Darwinian theory. The most egregious error is the misinterpretation of "survival of the fittest" (which isn't even the correct phrase to begin with).
I don't think I ever said that studying concrete things affords nothing concrete. These are the kinds of comments that bog down discussion. I said that the study of immediate needs affords no development; by which I mean, no possibility of revolutionary change. You would rather have us study the immediate and let the change occur of its own accord.
The problem with that outlook is that you ignore the possibility that your purportedly natural and "accurate" behavior is causing harm to others. All I have ever said is that the type of theory and study I espouse provides a critical challenge to our assumptions and biases.
That you want to stifle them entirely insinuates that you don't want to be criticized, which in turn suggests that you're trying to hide something.
This does nothing to contradict my accusation of immediate demands. Economics purports to be a science of needs, and how individuals can fulfill those needs; when pressed, of course humans can imagine new designs.
I have no problem with economics as a science and system of study. I have a problem with assumptions; and all I want to do is provide the necessary critique so that we don't ride off into the sunset on the backs of those assumptions, only to discover that the world is not what we thought it was.
But what works now isn't necessarily what works "period"; you see, you're making an assumption. Hume would say "tsk tsk." So I read any kind of absolute statement like that as commenting on what works now. Because that is all we can read it as, with any certainty. To project it into the future is to assume it won't change, and that it hasn't already changed for some.
History is nothing more than representation; linear, cyclical, etc.
I actually hadn't considered it specifically in terms of agency, but I suppose that clarifies. I also suppose my amateur background in communication (particularly digital) technologies informs my understanding of human communication, or at least my language. When information is transmitted, barring physical malfunction, 100% is always transmitted. However, interference ("noise") and a lack of power over distance (loss) can affect the full receipt of the transmission on the other end. The information does not go anywhere. I suppose you could call this "escape," but I don't think it's as accurate as it could be - although it appears critically necessary for ascribing agency to anything and everything.
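Since we're on channel models anyway, here's a minimal sketch of the point in Python (the function name transmit and its parameters are my own illustrative choices, not any real protocol): attenuation stands in for loss, additive jitter for noise, and the sender's data is never altered - only the received copy degrades.

```python
import random

def transmit(signal, attenuation=0.5, noise_amplitude=0.2, seed=0):
    """Toy channel: the sender's signal stays intact; only the received
    copy is degraded by loss (attenuation) and interference (noise)."""
    rng = random.Random(seed)
    return [attenuation * s + rng.uniform(-noise_amplitude, noise_amplitude)
            for s in signal]

sent = [1.0, 0.0, 1.0, 1.0, 0.0]   # the information at the source
received = transmit(sent)

# Nothing "escaped" the sender: the original list is unchanged.
assert sent == [1.0, 0.0, 1.0, 1.0, 0.0]
# The degradation lives entirely in the receiver's copy.
print(received)
```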
Do not or cannot? It almost seems like this train of thought is reverting back to a form (lolz) of Platonic argument on forms.
Are these frontiers then non-negotiable?
Either the knowledge is accurate (as it pertains to us) or it is not (with a general understanding of accuracy as "working"). That we are the center of our collection and organization is unavoidable - and that we work for ourselves is necessary. To think otherwise isn't just suicidal in an immediate and concrete "well we have to eat" sort of way.
A complete sidenote, but whenever I see "survival of the fittest" now I just think "Survival of the Fitness boys"
http://youtu.be/hIX7l06VhIg
Anyway, I've read none of Spencer other than this quote, which sums up in a sentence the problem of our modernity. He could be wrong about any and everything else and this stands on its own.
The difference between the phenomenon observed by Darwin and its application is that the latter is active, or applied: culling on preference rather than letting processes do their work. "Helping nature along."
I said nothing about stifling them. I do believe that if it ran wild it would stifle itself indirectly - an abundance of necessities is necessary to fuel work on non-necessities (studies of any sort, higher-order/capital goods, etc.), and when the balance tips it self-corrects or self-eradicates over time.
I don't know what you mean by revolution though, since it seems you are in fact precluding anything physical from being revolutionary. I think technology is very revolutionary.
Well, avoiding assumptions as much as possible is fine, but it seems the conclusion of a lot of this is to leave no ground for any conclusion, assumed or otherwise, and conclusions must be reached - right or wrong - for action.
Haha Hume. Poor guy did try didn't he? A for effort.
Anyway, obviously if we cannot find a way into another solar system, human existence will end with the sun, and then nothing will work at some point in the future. There really is a limited number of human/social functions and methods; although they take different names and are sometimes rehashed, it's mostly recycling and stitching, and the same bad parts fail time and again for the same reasons. Softer landings are aided by technology, not through some sort of systemic revolution or a change in "what works."
That assumes all representation is equal.
This, perhaps, approaches what we’re trying to feel our way toward: the breach, the sudden, epiphanic emergence of the genuinely unplanned, the departure from the script. To put it in fashionable Badiouan, the Event. The INS believes in the Event—in the power of the event, and that of art, to carry that event within itself: bring it to pass, or hold it in abeyance, as potentiality. And, paradoxically, the best way that art can do this is by allowing itself to be distracted, gazing in the rear view mirror.
Economists were actually late to the party. The significance of Keynes’ General Theory of Employment, Interest and Money of 1936 was not that it showed a new way — by that time, world governments had been busily implementing Mercantilist policies for over six years. Rather, it showed economists how they could update their blather to the new political realities, so that they could reclaim their prized sinecures as elaborate justifiers for what politicians wanted to do anyway. Keynes’ book is essentially unreadable; the title alone tells you where Keynes intended to get his employment from.
Keynes himself understood what he had done. An entire chapter of the General Theory (chapter 23, “Notes on Mercantilism …”) is dedicated to cheering the return of the Mercantilist agenda.
“[the Mercantilists] were emphatic that an unduly high rate of interest was the main obstacle to the growth of wealth … and several of them made it clear that their preoccupation with increasing the quantity of money was due to their desire to diminish the rate of interest.”
Does that sum up the past several years of Bernanke and Quantitative Easing?
Murray Rothbard wrote extensively and wonderfully on this topic, especially in his fantastic book Economic Thought Before Adam Smith. This used to be a little-known text buried in university libraries, but it is now available for free in eBook form from mises.org.
One of Rothbard’s points is that Mercantilism reflected big government; the Classical or “laissez-faire” view reflected small government. Today’s Mercantilism is a reflection of the expansion of the U.S. government, from 7% of GDP in 1900 to about 40% today. This brief excerpt is also from mises.org:
“As the economic aspect of state absolutism, mercantilism was of necessity a system of state-building, of big government, of heavy royal expenditure, of high taxes, of (especially after the late 17th century) inflation and deficit finance, of war, imperialism, and the aggrandizing of the nation-state. In short, a politicoeconomic system very like that of the present day.”
That sounds a little familiar …
Here’s Rothbard, talking about Mercantilism in 1963:
“Mercantilism has had a ‘good press’ in recent decades, in contrast to 19th-century opinion. In the days of Adam Smith and the classical economists, mercantilism was properly regarded as a blend of economic fallacy and state creation of special privilege. But in our century, the general view of mercantilism has changed drastically.
Keynesians hail mercantilists as prefiguring their own economic insights; Marxists, constitutionally unable to distinguish between free enterprise and special privilege, hail mercantilism as a “progressive” step in the historical development of capitalism; socialists and interventionists salute mercantilism as anticipating modern state building and central planning.
Mercantilism, which reached its height in the Europe of the 17th and 18th centuries, was a system of statism which employed economic fallacy to build up a structure of imperial state power, as well as special subsidy and monopolistic privilege to individuals or groups favored by the state.”