Wise and clever posts



Phil Coates wrote:


Atlantis was a huge downhill slide from OWL in terms of quality. People were clever, snarky, rowdy....and to the extent they focused on that they might have been amusing but the level of intellectual content, original insights dropped off.

End quote

The percentage of quality posts dropped off, but many fine posters simply did not like OWL's limited per-day posting allowance.

Michael wrote about our three part brains and creativity:


All three are interconnected. They do not operate independently. Instead, they operate with emphasis on their respective characteristics, not with exclusivity.

End quote

And I think that is why such good posts were generated on OWL and free-wheeling Atlantis. Roger Bissell, Ellen Moore, Barbara Branden, George H. Smith, Chris Sciabarra, and others wrote nearly exclusively on Atlantis, not on its sister site, OWL.

Phil Coates wrote nearly exclusively on its sister site, objectivism@wetheliving.com, or OWL as we usually called it. I searched four threads for your name, Phil, and never found you on Atlantis. There were about fifty other locations which I did not look at.

And you did some excellent writing on OWL, as the following posts prove.

Peter Taylor

From: "Philip Coates"

To: "owl" <objectivism@wetheliving.com>

Subject: OWL: Levels of Understanding

Date: Tue, 8 Oct 2002 14:37:38 -0700


>what I first thought Rand was saying wasn't quite what Rand was actually saying. (Even Phil has made this point when he has mentioned that someone can't really understand O'ism until they've studied/lived it for years.) [Allen, 10/3]

That would be a rather imprecise formulation. Hopefully I didn't put it in those words independent of context, but if I did, let me clarify:

There are several levels of understanding Rand.

You can read a key concept in her political philosophy in an essay such as "Man's Rights". And it is not a difficult essay. It is possible for a reader to relatively quickly understand that she has a very distinctive and very precise concept of rights: that it is along the lines of the classic concept of 'negative rights' (freedom from interference) rather than 'positive rights' (a right to some good, such as a right to a job). A careful reader will also understand, since she spells it out, that when she discusses a right to freedom of expression, or property, or 'the right to life', she does _not_ mean that this right is 'balanced' by the kind of everyday competing considerations ('stakeholder rights', 'the public interest') that one often sees in decisions of the Supreme Court today.

So this aspect of Rand and of Objectivism can be _completely_ understood just from one essay.

But there may be other issues involving rights, such as issues of validation or answering objections or integration with other issues. Or application: how do they apply in many specific contexts: Should patent rights have a time limit? What happens to rights in emergency situations (lifeboat situations and wars)? What about quarantine and subpoena?

So that's a second level of understanding Objectivism. It often comes later in time and builds on the first. It doesn't mean that the first level was not well understood.

And there are other levels.

Often involving the expansion of one's knowledge and breadth of integration.

It's a different level and a separate and later effort to integrate the concept of 'rights' with the anarchist idea (fallacy, actually) that one has a 'right' to delegate self-defense or retaliation to whoever one chooses based on whatever level of investigation one makes as to whether they intend to be scrupulous or not. To spot the fallacies requires a higher level of integration of rights, an integration with several other issues.

Plus, there are other areas of Objectivism which are more difficult than rights theory, requiring a different level of understanding (sometimes more abstract or with a more methodological focus). For the average person, the whole field of epistemology would qualify. For example, reading Introduction to Objectivist Epistemology for the first time is much less likely to be a bored "yeah, yeah, I already got that" kind of experience in the way that reading "Man's Rights" might be for those already exposed to the classical liberal tradition or to the ideas of the founding fathers.

In that sense, whether or not you "get" some aspect of Objectivism quickly or slowly may depend on your own context...how much or how little you have been exposed to similar ideas (or methods of thinking) in the past.

In the case of ITOE, the chances are you've never seen anything remotely like this before or on this level.

If you're a good introspector, you know that you are going to have to come back and carefully reread this several times over a period of years because, as Allen mentions, you sense on some level that you may find what you "thought Rand was saying wasn't quite what Rand was actually saying".

One level that ITOE forces you to reach if you are to fully understand it (Rand may have forced this on you before, or you may have achieved it earlier in life) is tremendous precision in language. She uses words and phrases such as nominalism, realism, intrinsicism, implicit knowledge . . . and many others in almost mathematically precise ways.

(In regard to the difficulty of grasping epistemology, I'm reminded of a bizarre, rationalistic, uncomprehending, enormously destructive piece written by Bryan Register in JARS [vol 1, no. 2]. As one more piece of evidence of the mistakes of 'highly academic thinkers', this is a student of Objectivism who is a graduate student in philosophy at a prestigious school and has attended many Objectivist conferences. But he just can't shake out of his head the idea that Rand is advocating "nominalism" in ITOE. Which is exactly one of the two major errors she is refuting. He didn't seem to have the humility to do the careful rereading several times until it sunk in what she was getting at. As one would expect, he also butchers the concepts 'realism' and 'intrinsicism'...apparently employing the usages of the analytic tradition in regard to these and attributing to her a set of views which she does not take anywhere. And then vigorously beating to death the straw philosophical thinker he has manufactured --- thereby causing enormous damage to the plausibility of Objectivism in the minds of any outsiders who would happen to read the piece and think that it represents careful thinking about Objectivism by knowledgeable insiders.)

The most advanced or difficult level of all is to integrate Objectivism fully into living life. There are more people still struggling with this than there are who think Rand is advocating nominalism in regard to universals or are unclear what she means by a moral law.

To summarize:

One level of understanding is to grasp the basic position or idea or definition or theory and the often unique way she uses words.

Another level is to integrate it with other views one has or hears on the same topic.

Another level is to apply the position to complex or borderline cases.

Another level is methodological: adopting Rand's precise and careful way of thinking about philosophical (not polemical) issues - and integrating them properly with other methodological tools from one's own discipline or experience. Another level is to take all this stuff and integrate it with all the details and complexity of living one's life. (Note: I'm not suggesting these levels divide up neatly in practice or come in this tidy a chronological sequence.)

--Philip Coates

From: "Philip Coates"

To: "owl" <objectivism@wetheliving.com>

Subject: OWL: Defending the Language

Date: Wed, 21 Aug 2002 13:48:14 -0700


One of the biggest mistakes young academics (this would be true of older, established, and influential full professors as well) often make in philosophy, psychology, and elsewhere in the humanities is that they fail to defend in their own minds the context, the richness, the multiple meanings, the derivation, and the conceptual effectiveness of the English language as it has developed into a more and more powerful and supple tool across the centuries.

They seem too easily persuaded that there are puzzles or problems or improper implications in the usage of words such as "he", "man", and "referent". They are persuaded by journal articles more than by a full exposure to how people on the street, historical writers, and magazine articles use words. Since they have often gone directly from high school to college to grad school, they (some or many of them) have never acquired enough depth of reading to be resistant to the idea that "man" only applies to one gender, or that "referent" only refers to a physical thing. [Paul Bryant, 8/21, claims this usage is insisted on by both analytic and continental philosophers.]

Those of us who did not pursue graduate study in linguistics but instead have read widely in Shakespeare and Browning, or Aristotle and Plato, or Gibbon and Herodotus, or Orwell and Strunk and White . . . or simply just read the New Yorker or the Atlantic Monthly. . . have a much clearer idea through actual practice (the gold standard here) of how, for example, "man" and "referent" are used.

In this post, I'd rather not defend (again) point-by-point the sense in which "man" is understood quite clearly by those who are widely read and have a thorough liberal education to refer to both genders and to the species as such. Nor do I want to go off on a tangent to hammer home the general knowledge fact that one usage in the English language of a 'referent' is 'that to which something refers' and the 'that' can be much more than a physical object.

There's a deeper and far more important point here.

When you fail to integrate and defend the language you grew up with and everyone around you grew up with...and instead try to substitute for what already works some new, invented terms (such as some gender neutral construct for 'man', or 'signification' for 'referent') you are causing crippling and sometimes irreversible harm to your own thinking processes . . . as well as those of everyone who adopts the new terminology or insists that it be used in academic discourse or to be published in a journal.

The new term does not have the richness of history or association.

It does not have the metaphors, the poetry, the immediate mental connection with things you learned in childhood or high school history or English literature or ancient myth or great essays by Mark Twain or E.B. White or George Orwell.

Even worse, when you discard a well-worn term one of whose sub-meanings or connotations already serves the exact purpose you want, you or others will constantly have a disconnect, a mental struggle going on:

People will misuse both the old term (whose meaning you have tried to alter or restrict) and the new one. If philosophers create technical terms such as 'intension/extension' to exactly indicate 'connotation/denotation' (or even a subtle shift or distinction involving these), they are creating a special jargon which is unnecessary if terms already exist in the language that do that work.

Maybe for elitist reasons the originators didn't want to use terms which every English junior high school teacher can understand and teach her class (connotation and denotation). But what they have accomplished is to muddy the waters. Every time someone reads it they may have to wrench themselves mentally away from, for example, the ordinary meaning of the word 'intention' [purpose or intent]...they will have to try to automatize a new technical jargon. And they will have to also hold in mind the (old-fashioned) terms connotation and denotation.

Also, there is a whole thousand-year history of discussions of denotation and connotation, or their root ideas, in the English language and in the Latin and other languages that preceded it and have fed into it from many sources. By switching terms, philosophers keep their discipline 'pure' in their journals. It can too easily be a false and Platonic purity. Integration with, for example, English literature and historical sources tends to be excluded in favor of another tradition...that of academic philosophy of much more recent vintage. The examples would tend to shift from historical and "public" ones widely known throughout the culture toward thought problems and technical formulations that only the priesthood can understand. [I'm not even mentioning the possibility that the new terms may be found to have their own ambiguities and imprecisions, such as when philosophers attempt to substitute the fuzzier, made-up term 'signification' for the crystal clear, already commonly known term 'referent'.]

The claim is often made by academics in the humanities that, like scientists, they are in need of sharper tools and finer-grained instruments. The English language is sometimes ambiguous or imprecise. There is no good word for what they need to be discussing.

But as Rand and many others of us have pointed out incessantly and academics don't seem to want to accept, philosophy and the humanities deal primarily with facts of reality about man and his everyday interactions with the world.

This is material accessible to any intelligent layman.

It is not like subatomic physics or the biochemistry of viruses.

There are two reasons why (with _extremely_ rare exceptions) it is generally unnecessary and inappropriate to invent special, technical terms (whose full context and usage can only be understood after years of mastering a specialized professional literature)...and a third point which is their consequence:

i) The facts of reality and everyday situations and introspections involved have been experienced by virtually every thoughtful human being; they are quite well-known; they have been discussed by gardeners, stockbrokers, librarians, court jesters, lawyers, and physicians for centuries;

ii) Since this (accessibility and commonality of experience) is true, language has _already evolved_ to discuss nearly all of these 'humanistic' issues quite precisely and accurately;

iii) Your responsibility as a graduate student is to become aware of this huge past cultural context -- to master that language and that historical-literary-psychological background. You do not have the epistemological right to discard it all, say "don't bother me I want to be a philosopher not a historian", and just start afresh. That would be irresponsible and unscholarly on your part.

The English language as it has evolved over the centuries . . . through absorbing concepts from other languages and through a certain amount of creation of new specialized terms . . . has long provided tremendous power to a skilled writer and to a deep thinker. It's hard to be certain, but based on the academic writers I have read and the journal articles I have had recommended to me, what seems to be true is that academic writers are seldom good writers. Academic prose is well-known for being clumsy and unskilled, long-winded and obscure or obtuse. It seldom displays a command of the English language at a supple, nuanced, literary, sophisticated, cultured level.

What (sometimes) seems to happen is the following:

a) The academics who invented or want to change terms did not do a thorough enough study of the language that is already out there. Sometimes the historical figures or current grad students came into philosophy from having been good at or devoted to science or math rather than literary or arts or history types ... so they came in _already_ not very language-sophisticated or relatively less developed in that sphere.

b) There _already exists_ a very good word or phrase (or a combination of words which taken together conveys the meaning needed). But they were just not aware of it, or it was too difficult to formulate, so they chose the "creative" satisfaction of inventing new language...which is also much easier and more fun than the long search to find just the right word or "public" formulation.

c) The new specialized jargon the academics create is readily accepted by journal editors because it is 'fresh' and 'original'...or is well-defined and presented in a seminal journal article.

d) The new language, new distinctions, long series of learned commentaries and critiques add new layers of clarification or adumbration to the special terms which all become part of the sizable body of knowledge new graduate students will be tested on/required to master.

e) New graduate students are epistemologically and linguistically vulnerable. They grew up in a century where they often did not acquire a good education. One where they often did not learn Latin (roots of words and their structure), did not thoroughly enough learn the canon of dead white males (great writers who used language forcefully and well) and other 'cultural and civilizational' areas such as world civ and world history.

That would have been the proper preparation to become philosophers. The grounding needed before one enters graduate school. Without it, they are susceptible to absorbing uncritically what their professors tell them is true about language and how it needs to be changed.

Here's the bottom line:

Language and words are the tools we use to think with.

The English language, rooted in other simpler and feeder languages, was developed over an enormously long period of time by tens of thousands of minds working in the planet-wide and millennia-deep laboratory of experience.

It needs to be treated with tremendous respect and mastered over decades. It is a very, very, very good tool, just as it exists right now and in common usage. While not unflawed, it is the most powerful thinking tool in the universe (more words and concepts than any other language...absorbs terms from other languages to identify previously unnamed concepts or existents...developing into a world language used by many civilizations and specializations).

To change the language we have grown up with and used to clearly specify issues, situations, and subtleties is an _incredibly_ serious matter. And it seems to be done too cavalierly too often by unscholarly or too pedantic or too disrespectful academics.

Words represent concepts. To change words is often to change our very conceptual understanding of reality.

It can affect our very ability to think.

A good, diligent graduate student does not merely absorb uncritically the terminology and definitions of his professors, no matter how prestigious or brilliant.

He may have to use them in papers to get a degree. But he always keeps a separate mental notebook in which he asks and answers questions such as:

Was this new term or distinction or series of journal articles parsing the term necessary? Does it add further clarity or further confusion? Does normal language already cover this issue and therefore should we throw out the whole discussion? Is the debate (over for example the morning star/evening star issue) a semantic and linguistic one or a substantive one?

It's really important that, no matter how busy you are, you do all this extra work. (I certainly had to in college and graduate level courses in philosophy and I can't imagine someone not doing this and retaining mental clarity.) I'm not sure from postings, conversations, and writings I've seen over the years from, for example, Oist grad students in philosophy that, for all their claims to scholarship, they have always been as diligent as is necessary for cognitive self-defense.

Or critical enough of their own professors or their own mental/linguistic processes.

--Philip Coates

From: "Philip Coates"

To: "owl" <objectivism@wetheliving.com>

Subject: OWL: Chris, Are You Falling for Libertarian/Marxist Revisionist History?...Don't Do It, Young Man!

Date: Wed, 30 Apr 2003 22:43:08 -0700


Chris Sciabarra [4/30] presents the view that if only the U.S. had not entered WWI, Germany and Russia would not have been so damaged by war that Nazism and communism inevitably arose.

The reason this is worth discussing today is that it is hardly "old news". Isolationists would argue that if Chris is right we should avoid making the same mistakes, and that fighting other wars overseas today may lead to similar 'destabilizing' consequences.

I've explained in great detail elsewhere* why the Iraq war was necessary (and the need to root out terrorist training camps and a host regime justified Afghanistan) so let's move forward (or backward in this case).

Let's deal with the if-only-we-hadn't-gone-to-war-with-Germany-in-1917 argument:

1. WWI wasn't a 'war of choice'. It was a response to aggression:

In early 1917, Germany announced that it would launch unlimited U-boat warfare against all commercial ships, including those of neutral America, anywhere near Britain and France. They followed up promptly by starting to sink unarmed American merchant vessels. As a Philadelphia newspaper put it at the time, the "difference between war and what we have now is that now we aren't fighting back." Unless Chris wants to argue that America can allow Germany to dictate its trade with Europe and to sink our ships if we don't comply, Germany was the initiator of acts of war.

And there is also the matter of the Zimmermann telegram at the same time, signaling an intent to escalate hostilities further: the German foreign secretary proposed an alliance with Mexico, in which Mexico would regain Texas, New Mexico, and Arizona as fruits of war.

2. It wasn't the war, it was the peace:

It's not the fact of the war's devastation that caused fascism, Nazism, and communism. It was the aftermath of the war. After WWII, which caused far more devastation, the U.S. participated in occupation, de-Nazification, war-crimes, and economic/free-trade policies which set the defeated countries back on their feet. So it's reasonable to assume that if the post-WWI Treaty of Versailles had been more benign, rather than aimed at crippling and impoverishing Germany, Hitler would not have been able to flourish. There is a whole section in most histories of the period on this topic. They are usually entitled something like "The Peace Treaty that Bred a New War."

3. It wasn't the war, it was the lack of free trade:

Moreover, if the free market low tariffs of post-WWII had been in place, rather than the high tariffs of post-WWI, prosperity rather than economic upheaval could have resulted.

4. It wasn't the war, it was the appeasement (...and the fact the bad guys thought they could 'get away with it'):

This is well-known both with regard to Hitler and with regard to the Bolsheviks so I won't spend much time. The totalitarians found they could get away with murder. Then they went further. The free countries could have defeated them early very easily and without much loss of life. And maybe even without war (hot or cold) in one or both cases. Instead they waited until they were entrenched, had conquered the continent of Europe (Hitler) and were on the verge of developing nukes, or had consolidated power and developed an arsenal of atomic ICBMs.

We are very, very, very lucky the Soviets, often operating through committee post-Stalin and post-Khrushchev, were somewhat more risk-averse than Hitler would have been in their position. And didn't destroy the world in some hare-brained miscalculation or brinkmanship during the years from the Cuban Missile Crisis through the late eighties.

5. With regard to whether Russia would have gone communist eventually if they had not been defeated in WWI, I have no informed opinion. But the following logic: U.S. does not enter the war, therefore peace breaks out, therefore Russia doesn't lose, therefore the czars stay in power forever, therefore the horrors of the century are averted --- seems like quite a stretch to me.

5b. With regard to the etiology of fascism in Italy, I'm going to take a pass. (My customary omniscience is in remission on this.)

6. I want to make a broader reply to a currently fashionable set of doctrines on the left, which libertarians have sometimes swallowed whole and which undergird much of the debate about isolationism (although Chris didn't mention them and may or may not subscribe to them as sweeping principles):

War is destabilizing. It causes more harm than good. So leave things alone. Leave the status quo alone. Find another peaceful way to deal with the bad guys.

The fallacy is that each of these statements is an overgeneralization. Some wars which leave matters unresolved -do- indeed destabilize. Some don't. WWII stabilized Germany and Japan and ended their threats to the world and their forms of totalitarianism or authoritarianism.

Why would it be the case that every war leaves things worse off or unleashes new demons?

It just doesn't follow. Did the American Revolution leave things worse off than they were before? What about the Cold War? What about the Civil War? And what about the cases when the bad guys are "hungry" and don't want to be deterred or kept in their box? (It's often forgotten how militantly expansionist the antebellum South was. This was a main cause of war. The Southern slavers wanted new lands for slavery - the fact they couldn't get Kansas/Nebraska, California, and south of the border drove hostilities.) Wouldn't you want to "destabilize" a status quo in these latter cases which involved dictatorship or slavery, especially when they threaten to spread further?

Assuming you could do it successfully without literally or figuratively blowing up the world...which first the colonies, then the North, then the United States did.

--Philip Coates

*I did so in the following posts: "Dummies Guide to Iraq" [1/23]...."The Case for War" [2/26], "The Case for War - Revisited" [3/8]. Observe that many of the arguments there and in today's post could be accumulated into a long treatise entitled AGAINST LIBERTARIAN ISOLATIONISM if I felt there were a large enough audience, which I doubt.

From: "Philip Coates"

To: "owl" <objectivism@wetheliving.com>

Subject: OWL: Students vs. Completed Objectivists

Date: Mon, 11 Mar 2002 13:41:22 -0800

Subject: Students vs. Completed Objectivists

For anyone put off by my "modest proposal" and "modest proposal part two", I'm going to continue to be the skunk at the garden party. I will extend even further into an additional area my views of the shortcomings of Objectivists and the Objectivist movement and the enormous distance we still have to travel.

Those who think we are about where we need to be as a subculture -or- who see most of the problems we face as being outside of us and in the inattentiveness of a basically irrational culture -or- who don't see a need for the negativity of an ongoing critique, should press delete right now. I may offend you even further.

Mike Miller mentions that "people are thin-skinned, for the most part" and Ram replies that this is not an Objectivist view of people and asks whether one should view people as so frightened or fragile. But an Objectivist view of people has to do with how people might be and ought to be, not how one finds them statistically in the general population...or how one observes tendencies within the Objectivist movement. Perhaps part of where I think Ram would be mistaken on this last aspect (what's true of people within Objectivist circles) comes from thinking in terms of an "Objectivist" here instead of a "student of Objectivism": an Objectivist, by definition, would not be thin-skinned because that would be a failure to be rational? But what about someone who is only still learning Objectivism? Would he make this mistake?

This leads to the wider issue of how we label ourselves--the words we use affect our thinking.

The conventional term for anyone who has read Ayn Rand and embraces her philosophy is an "Objectivist". Books about us, press coverage, and we ourselves use the term. A too-quick implication would be that one becomes the embodiment of a philosophy as soon as or by the fact that one accepts it. There used to be a tendency for us to call ourselves "students of Objectivism". The recognition was that, unlike becoming a nihilist or a born-again Christian or a Buddhist, there is no sudden moment of baptism, no one all-illuminating revelation that one can "get" very quickly. On some level, you can become thoroughly a Buddhist much more rapidly.

It takes years just to fully _understand_ a philosophy as complex, radical, and against-the-cultural-and-historical-grain as Objectivism. Plus it can take decades to _integrate_ it into all the myriad areas of life.

I'm not a historian of the Objectivist movement, but I gather that the term "students of Objectivism" fell out of usage because it had been used largely to distinguish between Ayn Rand and those in the inner circle as being the only _true_ Objectivists while everyone else was supposed to be perpetually a student, eternally on a lower rung either intellectually or morally.

That usage was never valid.

Those in the inner circle, the "teachers" (the NBI staff and faculty in the 1960's?) were not always unflawed and perfect at understanding or integrating the ideas into seamless practice. Just because you teach Objectivism or write or deliver lectures does not make you a better Objectivist than anyone else. Nor is there any reason why people outside that inner group around Ayn Rand might not understand or apply Objectivism just as thoroughly. Even though they had never written an article or delivered a lecture about it, or were not intellectuals by profession.

But there is nonetheless a valid and important distinction captured by "student of..."

Most, or at least a major fraction, of the people labeled Objectivists could actually more accurately be called students of Objectivism. Their learning of the system is still far from complete (this is natural and there is, of course, nothing wrong with this--to get from point A to point A+50 you have to pass through point A+20). Moreover there is no perfect or unflawed understanding of the system of ideas which provides a seamless road map to existential success. Being an Objectivist is no guarantee whatsoever against insufficient benevolence, being thin-skinned, using floating abstractions, losing your temper when you shouldn't, misjudging situations or people, dropping context, neglecting social skills, or being insufficiently knowledgeable in areas important to our lives.

Since life is a constant learning process, this means that at any particular point you don't yet know everything you need to know. And you will continue to make booboos (trust me on this!). Many students of Objectivism, on the other hand, seem on some level to think that accepting the philosophy explicitly makes them somehow morally or practically better than people who have not yet accepted (or never will accept) the philosophy explicitly. They feel an unearned sense of superiority to the whole rest of the world: to any Christian, to any person who is confused philosophically, to anyone who is just not a very deep philosophical thinker, regardless of how successful they are in living and pursuing goals.

It doesn't work that way. It's how you actually live your life in all the various areas that matter. And whether you continue to grow or stop developing. Not what philosophy you gave your verbal commitment to at age 19.

In fact, the man on the street, perhaps influenced by Christian humility or philosophical skepticism, is more likely to err in the direction of excessive uncertainty or humility or unwillingness to judge, to listen in silence when he should instead enunciate firm conclusions, to err in the direction of subjectivism, than a student of Objectivism. But I very frequently and regularly see long-time Objectivists or students of Objectivism erring in the opposite direction--toward intrinsicism, toward 'snap judgment certainty' in fields or areas where he has strong opinions and sharp conclusions but has not sufficient knowledge or exposure, toward arrogance rather than humility, and toward talking when he should shut up and listen or ask questions.

Just as a start (and as a corrective to "Objectivist arrogance" or smugness or complacency), it's time to sometimes use the term "students of Objectivism" again. And it's time for people who have complaints about what we do wrong as Objectivists or as a movement to air them so we can correct them. We are very far from perfection and very far from success as a movement. I see us, as people, as a great distance from reaching Atlantis. Plus, I see our movement as being in very deep trouble and losing the war of ideas (I've only scratched the surface here in this series of three posts), so while I intend to be diplomatic and persuasive, I hardly intend to be silent.

If we were more aware of our own shortcomings or need for further growth (or that of our movement) it would help immensely. This point was put best with some bemusement by the noted academic philosopher Clint Eastwood:

"A man's just GOT to know his limitations."

--Philip Coates


If anyone is interested--on a brain level--in why a spirited discussion is generally more fruitful than a dry "civil" one, here is something to think about.

One of the characteristics of the mammalian brain is that it controls long-term memory. This means--physically--a greater number of synaptic connections (dendrites and axons). Where you have more synaptic connections, you also have more deeply held conclusions. These are neural pathways. You find these neural pathways in all three brains (actually they all interconnect), but the mammalian brain is especially adept at forming new neural pathways quickly due to the emotion it controls.

When the neocortex interacts freely with the deeper mammalian brain portions of neural pathways, you can get all kinds of connections that were not evident before. But the only way to get the neocortex flashing gangbusters on the mammalian brain is to get the mammalian brain wound up with emotion.

Spirited discussions do the trick.

I think that's cool.



I refer to hard and intense debate and argument as mental gymnastics, which reflects what you have just described.

One of the reasons why I will sometimes vociferously argue a position that I do not believe in is because it exercises mental muscles that I need to use.

Also, it explains why I loved to teach argumentation, debate and rhetoric.

Thanks for the info.


Post script:

Peter you realize that almost no one has the patience to read those llllloooooooooonnnnnnnnggggggg posts that you stick in this type of medium, especially without breaking them up with the quote function to set off each piece.


> And you did some excellent writing on OWL as the following prove... [#26]

Thanks, Peter. I think I wrote at that length and in that detail because I was getting responses on OWL, because people seemed interested in exploring things in that depth. There were a lot of philosophy grad students and others with a strong academic or intellectual or exploring-the-theory-of-Objectivism bent.

I seem to remember Allen Costell, Neil Goodell, the Enrights, Joshua Zader, and a very sharp Israeli fellow, Moses ???, etc.: Lots of good discussions, very definite but polite disagreements.

Whether Atlantis appealed may have varied with whether it was Atlantis as it first started out or as (if I recall correctly) it degenerated a bit.


My online bud, Phil Coates wrote:


I seem to remember Allen Costell, Neil Goodell, the Enrights, Joshua Zader, and a very sharp Israeli fellow, Moses ???, etc.: Lots of good discussions, very definite but polite disagreements.

End quote

Eyal Moses. I think he went on to work for TOC. I agree. Those were the days. I loved the polite disagreements more than, “When you sit down to write, are you sitting on the toilet . . . ” which I recently got from Ghs on OL. Whether I deserved it or not, it stank :o)

And to think I was thinking about buying 100 copies of his latest book and sending them to various libraries. (I mistyped 1000 without doing the math when I posed the hypothetical scenario about buying Shane Wissler’s anti-anarchy book, but I was also thinking about Georgie. It’s good being part of the “leisure class.”)

Phil, as great as OWL was, I liked Atlantis even more. I always felt that OWL was academic and dry, while on Atlantis (at its best) you could establish a real personal contact with people. Atlantis was more like “off list,” and OL can be too. I think that Michael Stuart Kelly is better at the job of moderator than Jimbo was.

What? Oh. I hear no hisses of disagreement.

It is good to reminisce.

This is off topic, Phil, but is there anyone that you are hoping will run for President?

Peter Taylor


> This is off topic, Phil, but is there anyone that you are hoping will run for President?

Peter, I hate politics so much that I'll wait till mid-2011, or as Scarlett once said "I'll think about it tomorrow." The last Presidential election I voted in was when George Washington was running for a second term. :(



If this has not been mentioned elsewhere on OL, you have an excellent article on "Using The Atlas Shrugged movie to meet people."

For those that have not read Phil's piece, go to:





Yep, I posted it: Phil Coates Announces New Movie Dating Concept.

