A Mini-Course In Rhetoric For Writers. Concept 2.9: Imitation And Emulation With An Eye On Competition

Concept 2.9. Imitation and emulation with an eye on competition. The ancient Greek teacher Longinus is among the first thinkers to address what would become a recurrent theme in the history of writing, and most especially, literary criticism: the sublime (elevated wonder, ecstasy, and beauty mixed with terror). His reflections on the sublime can be found in On Sublimity (first century CE), but it is also there that he addresses another key issue of special concern to rhetoric: imitation and emulation (an attempt to match) with an eye on competition (an attempt to surpass). He invites would-be writers to enter into “the way of imitation and emulation of great writers of the past” (142).

The poet Homer, for Longinus, is the highest standard for imitation; he is the writer that both the historian Herodotus and the philosopher Plato read closely, looking for hints as to how their writing might produce in readers “wonder and astonishment” as opposed to language that is “merely persuasive and pleasant” (137). The distinction between wonder and something merely pleasant is important here, for Homer wrote more of sublime things (grand things, war, death) than merely beautiful or small things (items on a banquet table; a flower), as in this line from the Iliad (vi, 146): “Even as the generations of leaves, so are also those of men.” Leaves are beautiful, to be sure, but Homer emphasizes here their relation to life, death, history—and, implicitly, meaning (or rather, the seeming lack of meaning, as humans, from the vantage of cosmic time, come and go as leaves).

And here are some lines, also from the Iliad, in which Ajax—strongest of all Achaean (Greek) warriors in the poem—prays eloquently, but also sublimely, for the lifting of fog during war, the prayer having a literal and immediate meaning (Ajax beseeching Jove that his troops might see in the fog of war), but also a larger, existential meaning (Ajax perplexed by mortal, sublunary life—life as lived beneath the moon—with human vision lacking a higher vantage and played out in the fog of strife, the gods not interfering, but hidden and silent):

A veil of cloud
O’er men and horses all around is spread.
O Father Jove, from o’er the sons of Greece
Remove this cloudy darkness; clear the sky,
That we may see our fate, and die at least,
If such thy will, in the open light of day. (xvii, 644-47)

Agon. The way that Herodotus and Plato read Homer was, in Longinus’s view, not merely imitative, but agonistic (competitive, emulative, and rivalrous). If you’re a writer, you read a poet like Homer not just with the aspiration of learning from him, but also of matching or outdoing him: “Plato could not have put such a brilliant finish on his philosophical doctrines or so often risen to poetical subjects and poetical language, if he had not tried, wholeheartedly, to compete for the prize against Homer” (142). This idea of writing as competition was broadly shared among the Greeks: Athens’s most famous ancient playwrights (Aeschylus, Sophocles, Euripides, and Aristophanes), for instance, wrote for competitions put on each spring in honor of the god Dionysus. In Freudian terms, we might also say that competition is oedipal (akin to a son attempting to vanquish a father to obtain access to power and mates). And the Yale literary critic Harold Bloom, also concerned with competition in writing, uses the term belated to describe the ambitious writer—belated in the sense that he arrives late to the party, a line of great writers having already written original things before him. If you’re a playwright, for instance, how do you write a better or more original play than one of Shakespeare’s? How do you explore the shifts and somersaults of the human mind and emotions better than Hamlet? But your deficit—your belatedness—is also your advantage. You cannot speak before Shakespeare, but Shakespeare cannot speak to your time. Only you and your contemporaries can do that. Will you find a way to address your era that Shakespeare didn’t already anticipate and rehearse? Can you, in the poet Ezra Pound’s phrase, “make it new”?

In goading you to make it new, Longinus recommends having great writers with a severe eye present to consciousness (143):

[I]t is good to imagine how Homer would have said the same thing, […] [G]reat figures [in the history of writing], presented to us as objects of emulation and, as it were, shining before our gaze, will somehow elevate our minds to the greatness of which we form a mental image. They will be even more effective if we ask ourselves ‘How would Homer or Demosthenes have reacted to what I am saying, if he had been here? What would his feelings have been?’ It makes it a great occasion if you imagine such a jury or audience […] and pretend that you are answering for what you write.

In other words, imagine great writers as your audience.

Immerse yourself in the atmosphere of books. It should be emphasized that by imitation Longinus is not advocating plagiarism. Rather, he likens the close reader of model writers to Pythia, famed priestess of Apollo at Delphi, who, when prophesying, stood in the midst of divine vapors that exhaled from a “cleft in the ground” (142). This provides a hint as to how steeped in the aromatic liquors of great books Longinus thinks the serious aspirant to sublime writing should be. Reading books should get you high, and you should bask in them as lovers bask in the sunlight of one another. If you want to be an exemplary writer, it makes sense to be a sensuous, cult-like devotee of sublime authors. The end, however, is not in losing yourself in another’s vision, but in learning, then finding, your own voice.

Seneca on novelty. Seneca, the Roman Stoic, writing in the same century as Longinus, was also concerned with competition and surpassing his mentors, himself inventing a new literary genre (form) of writing. Through his letters to his friend Lucilius, Seneca created the genre we now recognize as the essay. (Essay derives from the French essai, an attempt or trial, as in an attempt to get one’s head around an issue.) Seneca’s letters, as a genre, tend to be short, focused meditations in an informal voice, directed to an intelligent reader, which in Seneca’s case is Lucilius. One translator of Seneca, in an introduction to his writings, observes that his “one hundred and twenty four letters to Lucilius comprise something entirely new in literature. For in these, which were his most conspicuous and immediate literary success, Seneca if anyone is the founder of the Essay” (Campbell 20).

In his thirty-third letter to Lucilius, Seneca addresses the dangers of over-reliance on the writings and thoughts of others. And midway through the letter’s admonitions, he writes the following:

‘Zeno said this.’ And what have you said? ‘Cleanthes said that.’ What have you said? How much longer are you going to serve under others’ orders? Assume authority for yourself and utter something that may be handed down to posterity. Produce something from your own resources. (80)

In other words, for Seneca, quoting authorities and leaving it at that amounts to intellectual outsourcing akin to plagiarism. It arrests one’s own thought, robbing a person of his or her chance at self-expression and self-development. To rely on the thoughts and words of others, with or without attribution, is to leave your own thinking to atrophy—or never to develop at all. “This is why,” says Seneca, “I look on people like this as a spiritless lot—the people who are forever acting as interpreters and never as creators, always lurking in someone else’s shadow” (ibid.). Instead, Seneca admonishes Lucilius to come out from the shadows and cease being a mere imitator of other people’s voices, memorizing their words and reciting them uncritically to others. An appeal to authority, or deploying its near cousins, plagiarism or mere summary writing, threatens to arrest thought. Seneca writes on imitation with caustic brilliance: “a man who follows someone else not only does not find anything, he is not even looking” (81).

Outer v. inner direction. In thinking of rhetoric and writing as agon (struggle and competition), it may be useful to reflect on the influential philosopher Georg Wilhelm Friedrich Hegel (1770-1831), whose philosophy surrounding history and art is one of relentless conflict. His ideas influenced figures of his own century, such as Karl Marx and Nietzsche, and they continue to have a wide circulation among contemporary intellectuals today.

For Hegel, coming into ever greater self-consciousness through contest is what human beings are doing on Earth. Unlike other animals, whose consciousness—to the extent that they possess any—appears to be directed outward toward survival and reproduction, and not much else, human beings are directed inward, toward an ever greater knowledge of themselves as geist (spirit).

Geist with a capital ‘G.’ And what is this spirit that each individual is aware of possessing? Who are you, really? Hegel’s answer is that you are an emanation of the World Spirit—the Geist with a capital “G”—and that Geist comes into ever greater awareness of itself through you in conflict. In this sense, you’re special. You’re connected to the bigger world, and that world is channeling and clashing through you. You are a locus for forces of conflict. When Geist is mirrored perfectly back to itself by the world as a whole—that is, when Geist’s essential unity with all that exists is realized in history—Hegel claims history will come to an end. But since we’re not at the end of history yet, we are in a process of increasing self-awareness. That would be you.

Thesis, antithesis, synthesis. So how does self-awareness grow, according to Hegel? How do you discover who you really are? Hegel’s answer: by living; by crashing into people and things and learning what you’re made of. It is through life’s demolition derby that you discover whether, and in what sense, you are a noble dominator of existence or a mere handmaid to forces greater than yourself. Sometimes you’re being defeated and incorporated into some larger existence or goal that you didn’t choose, but to which you submit; at other times, you’re incorporating other people and things into your own larger existence and vision. In either case, Geist–the World Spirit–is coming to discover itself through you, history, and struggle. It is the dialectical movement of thesis (assertion), antithesis (resistance), and synthesis (unity). For Hegel, the ultimate dialectic of thesis, antithesis, and synthesis is when Geist and the universe face off in one last round and Geist wins, reaching a final unity on its own terms, and transcending history. In the meantime, we live in the midst of our peculiar Zeitgeist (the Spirit as manifested in our present age and culture), and rhetoric and writing might thus be thought of, in this state of affairs, as war carried on by other means, inverting Carl von Clausewitz’s famous description of war as the continuation of politics “by other means.”

The Master-Slave dialectic: being for itself v. being for others. If you are what you eat and digest (or what you are eaten by and digested into), then where is this eating and being eaten taking place? For Hegel, there are four key arenas for the spirit’s struggle:

in physical labor (wrestling with material resistances to your existence, desires, and thriving)

in ideas

in art

with others

With regard to others, your first experience of being self-conscious is in the breaking of your unity with them: the recognition that you are not alone, but in a diverse world with diverse others who, as Hegel puts it in his Phenomenology of Spirit (1807), have “shapes of consciousness” other than your own.

On encountering another person, therefore, you discover the paradox that your very existence as a “being for itself”—a self-assertive being—is dependent upon your also being a “being for others” (that is, a part of the object world of other self-assertive beings, potentially as their masters or in their service). If it is necessary for you to acknowledge that another person exists to realize your own self-assertive existence (this is mine; that is yours), it also follows that she or he must acknowledge you. And this leads to a face-off. What will be the terms of this acknowledgment? Whose vision will be essential and whose will be secondary or absorbed?

This dilemma is Hegel’s famous (or infamous) Master-Slave dialectic, in which two people, being “unequal and opposed,” and therefore not yet synthesized into a unity, must contend for dominance:

[T]hey exist as two opposed shapes of consciousness; one is the independent consciousness whose essential nature is to be for itself, the other is the dependent consciousness whose essential nature is simply to live or to be for another. The former is lord [Herr], the other is bondsman [Knecht]. (544)

And who is to be independent and who dependent is discovered in struggle:

[The lord’s] essential nature is to exist only for himself; […] he is the pure, essential action in this relationship, while the action of the bondsman is impure and unessential. […] The outcome is a recognition that is one-sided and unequal. (545)

Discovering inner power through work. In the Master-Slave dialectic, what does it mean to lose, to be a loser? Two things. First, it means to be “seized with dread,” for the loser has had a foreboding of the ultimate submission to death, “the absolute Lord.” In being “quite unmanned,” the loser trembles “in every fibre,” for “everything solid and stable has been shaken to its foundations” (545). Second, in the loser’s upheaval a substitute satisfaction is discovered, one that leads to a rediscovery of her or his own inner power, her or his “being for itself”: work.

In thought and physical labor directed to some triumph over a material problem, the bondsman rediscovers an inner strength that the lord does not enjoy. The lord, in deriving pleasure from material things by setting others to labor on her or his behalf, is actually alienated from the material world (one step removed). The bondsman, however, finds, in the shaping of material reality (rather than of people or the master), dignity and self-knowledge:

[I]n fashioning the thing [as opposed to persons or the master], he becomes aware that being-for-itself belongs to him, that he himself exists essentially and actually in his own right. (546)

There is a great lesson for writers here. Writing–the command and control of words on a page–can empower the otherwise powerless and dispirited; it can restore to them a sense of being-for-oneself, of an inner-directed purpose. Karl Marx was influenced by Hegel here, asserting that capitalism, in its exploitative nature and ever finer divisions of labor in the name of efficiency, alienates both workers and bourgeois consumers from full enjoyment of what is actually produced, undercutting the dignity that Hegel accorded to losers in the Master-Slave dialectic. So what human satisfaction demands, at minimum, on Hegelian and Marxist terms, is command over a process of fashioning from start to finish. A product made wholly by oneself, and owned by oneself. The generation of an excellent piece of writing, of course, is one way this satisfaction might be achieved, empowering an individual.

The symbolic, classical, and romantic in art. Since, for Hegel, the master-slave dialectic of thesis, antithesis, and synthesis is what spiritual beings, in their embodiment as humans, are up to, he applies this dialectic to ideas and art, which are also part of the historical process by which Geist comes to self-knowledge. With regard to art, for example, Hegel offers a very particular, three-stage historical progression for its evolution. These consist of “the symbolic, classical, and romantic” stages, and they represent the three strategies “in the striving for, the attainment, and the transcendence of the Ideal as the true Idea of beauty” (555).

Nature as wholly other—wholly alien—from you, and beyond your control. What Hegel calls “the symbolic” is art that focuses on setting before the mind one’s relation to alien nature: there is a whole world out there functioning independent of you and that does not submit to you. It defies your attempts at meaning. Hegel’s example is “the early artistic pantheism of the East” in which the subjects for art entail “even the most worthless objects,” and in which they may appear “bizarre, grotesque, and tasteless,” making any potential human victories over them seem futile (“null and evanescent”). This first historical form of art, “with its quest, its fermentation, its mysteriousness, and its sublimity,” sets out the terms against which any being-for-itself must struggle (552).

Hegel’s speculation as to the first stage in the evolution of art might translate to the experience of the individual writer in the form of the first draft. It may present itself to the writer’s gaze as a hodgepodge of sentences and paragraphs without coherence, lacking rhyme or reason. What is needed is a governing thesis that will make order of the disorder–but that thesis has yet to appear.

The Greek body in art as synthesis of spirit and Nature. Hegel’s “classical stage” is the art innovation of the ancient Greeks, in which Nature is overcome by man. The struggles and victories of the Geist—the Ideal spirit—are represented by the synthesis of the human spirit and body with Nature: “[P]ersonification and anthropomorphism have often been maligned as a degradation of the spiritual, but in so far as art’s task is to bring the spiritual before our eyes in a sensuous manner,” it is necessary. In other words, Hegel suggests that the fashioning spirit of the human being is mirrored and symbolized in depictions of the fashioning actions of the body, and the body fashioned (either nude or elegantly clothed). On Hegel’s account, the Greeks understood, as have subsequent sculptors and painters, that the human body is the intersection of spirit and Nature, for “spirit appears sensuously in a satisfying way only in its body” (553). Thus for Hegel the first two stages in the history of art emphasize Nature, particularly wild Nature, and the human spirit as thesis and antithesis: two forces to be reckoned with and brought into contemplation, then conflict (agon), then unity.

Thus the application to the individual writer could not be clearer: the rough draft, as first thesis, as wild nature, as first essay (attempt) at energetic thoughts put on the page, is now in need of an antithesis, a counterforce for introducing order and discipline: the author’s edit.

Moving inward, toward Geist with a capital “G.” For Hegel, it belonged to the romantic stage to move art to its next level—not the synthesis of spirit and Nature, but the wholly inward turn—what Hegel calls “the inwardness of self-consciousness”; the synthesis of the human spirit with the Geist itself: “[M]an breaks the barrier of his implicit and immediate [animal] character, so that precisely because he knows that he is an animal, he ceases to be an animal and attains knowledge of himself as spirit.” This romantic turn is associated, for Hegel, with Christianity: “Christianity brings God before our imagination as spirit, not as an individual, particular spirit, but as absolute in spirit and in truth” (554). And so the “inner world constitutes the content of the romantic sphere […] its true reality and manifestation it can seek and achieve only within itself.” Thus it is that romantic art functions paradoxically: “[R]omantic art is the self-transcendence of art, but within its own sphere and in the form of art itself” (555). In the romantic stage, one does not struggle to shape art into a mirror reflecting the union of spirit and Nature, but to reflect instead solely the truth of one’s inwardness and power, though it is, literally speaking, invisible.

In other words, if we apply Hegel’s romantic stage of development to the writer, rather than the artist, the moment of beholding one’s final literary product is not from afar, witnessing a representation of wild nature synthesized with order, but as in a mirror. That is, one beholds in the writing oneself. The writing is one’s deepest, fashioning self made material; made flesh on the page. What is mirrored on the page is you.

What you see when you look in the mirror. In short, Hegel is a Jacob-wrestling-the-angel kind of person, and so, to the question—How do I come to know myself?—his answer is: by encountering and attempting to shape the material world, ideas, art, writing–and even others–in such a manner that when you contemplate them, they are in your service, mirroring back to you your triumphant, independent, fashioning spirit. The objects and people you have influenced and fashioned in the world reflect you. You look at them and experience a form of self-consciousness: that’s me in material manifestation. Of course, if you look, and discover that they don’t reflect you well, or are independent of you, or are out of your control, then this makes for another form of self-consciousness. You become aware of your weakness; your defeat; of perhaps being a secondary and dependent being in service to them.

Writing 2.9.1. In a piece of writing, express empathy for a person, group, non-human animal, or ecosystem, and justify your empathy. Then write another piece in which you withhold empathy from a person, group, non-human animal or ecosystem, and justify the withholding. After writing your paragraphs, observe the contrasts in the feeling tones of the two pieces of writing, noting the different aspects of the human psyche they appeal to—the better and worse “angels of our nature.”

Writing 2.9.2. In a piece of writing, intermix some appeals to the conventionally better angels of our nature with some conventionally worse angels of our nature, and see if what you write possesses a more-than-usual interest. Is your writing more energetic than otherwise for entertaining such tensions—or does it feel less coherent, hopeful, and inspiring? (This experiment may entail working with a dialogical, as opposed to a monological, voice.)

Writing 2.9.3. The tone of your writing primes readers to adopt and track your attitudes and energies. But if you’re not careful here, you may lose them immediately. So imagine you are starting a longer piece of writing on a topic of your choice and ask yourself this question: “What angel of human nature should I try to evoke to start a piece of writing on my topic—humor, empathy, sobriety, cooperation, selfishness, enthusiasm, snark, lust, etc.?” After you pick either a better or worse angel of our nature to start the hypothetical piece, actually write a couple of sentences—or even a full paragraph—attempting to evoke your chosen attitude. After you write those opening sentences, read them out to another person and ask what tones of voice and energies your hearer actually inferred from them. You should be able to answer all of these questions: (1) What’s my topic? (2) Do I have a thesis and thesis statement that’s clear as a bell? (3) What’s my genre? (4) What angel of our nature—what tone—am I trying to evoke?

Writing 2.9.4. Reflect on a writer you regard as exceptional and worthy of imitation. What makes their writing work?

Writing 2.9.5. Write a paragraph or more reflecting on competition, and whether it is something you take to be valuable in relation to writing. Do you think the ancient Greeks and Romans had it right, turning writing into an agon (a struggle with precursors and contemporaries)?

Writing 2.9.6. Select a paragraph from a book by a sublime author you especially admire and wish to emulate and write the paragraph out, word for word, by hand, slowly, noticing how the rhythm of the sentences functions, how punctuation is used, etc. Then attempt to write some paragraphs in the style of that writer. Imagine the writer at your shoulder as your audience. Please and surprise them. Justify your authorial choices to them. Edit according to the suggestions you imagine them offering. Pretend you are them, the gifted and sublime author, and see what happens to your writing.

Resources:

A selection from Longinus’s On Sublimity begins on page 136 of The Norton Anthology of Theory and Criticism (edited by Vincent Leitch et al., 2nd edition, 2010). An introduction to Hegel with selections from his Phenomenology of Spirit and Lectures on Fine Art can also be found in The Norton Anthology of Theory and Criticism (2nd edition, 2010), beginning on page 547.

__________


A Mini-Course In Rhetoric For Writers. Concept 2.8: Selection And Editing

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of the book, click here.

Concept 2.8. Selection and editing. The context for selection in writing is threefold. The writer has: (1) an end in mind—a thesis; (2) an audience in mind; and (3) some sense of the length-limit for the piece of writing. If, for instance, you’re an astronomer writing for a professional journal, your audience consists of professional colleagues—and the journal’s editor may accept no submission longer than, say, 8,000 words. If you’re writing a letter to the editor of a local newspaper, your audience is likely to consist of mostly nearby residents, and the editor may not accept letters longer than 200 words. So whether you’re writing something book-length or posting a tweet on Twitter, you don’t have endless words or time to address everything, and therefore you must weigh your choices, making some points, but not others. In this sense, writing is always a zero-sum game: some words and ideas must necessarily win your favor, while others go unselected. There are victors and vanquished. What is essential and what can be discarded? How can the matter-at-hand be addressed most clearly—and with the fewest words? There is a process of efficiency-seeking at work. Like evolution, writing can be formidably “red in tooth and claw” (think of the schoolteacher’s dreaded red editing pen), and as such, it simply cannot escape its stakes and competitions. To echo the poet Samuel Taylor Coleridge (1772–1834): may “the best words in the best order” win.

Change the environment, change the selection pressure. So writing is about choosing words and sentences for survival, and eliminating others. While writing, the author is constantly judging: “This seems like the best word choice for now; the best sentence; the best direction for this piece of writing.” But these judgments are subject to change. Imagine, for instance, that after taking an hour’s break from writing, an author returns to her work and discovers the emotional weather inside her has shifted from pessimism to optimism. What then? Naturally, she’ll bring that fresh energy to what she has already written, scrapping some sentences and adding others. What was fine and worthy of survival in one hour may not be so in the next. The environment for selection has shifted. Notice how similar this writerly process is to Charles Darwin’s description of natural selection in the fourth chapter of On the Origin of Species (1859):

It may be said that natural selection is daily and hourly scrutinizing, throughout the world, every variation, even the slightest; rejecting that which is bad, preserving and adding up all that is good; silently and insensibly working, whenever and wherever opportunity offers, at the improvement of each organic being in relation to its organic and inorganic conditions of life. (504)

Substitute, in Darwin’s passage, the writer for natural selection, the sentence for the organic being, and her writing for the world, and you have a startling description of the author’s actual profession:

It may be said that the writer is daily and hourly scrutinizing, throughout her writing, every variation, even the slightest; rejecting that which is bad, preserving and adding up all that is good; silently and insensibly working, whenever and wherever opportunity offers, at the improvement of each sentence in relation to its organic and inorganic conditions of life.
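Because the substitution is purely mechanical, it can even be carried out with a few string replacements. Here is a minimal illustrative sketch (in Python; the code is mine, not the author’s) that derives the rewritten passage from Darwin’s original:

# The three substitutions named above, applied to Darwin's sentence.
darwin = (
    "It may be said that natural selection is daily and hourly scrutinizing, "
    "throughout the world, every variation, even the slightest; rejecting that "
    "which is bad, preserving and adding up all that is good; silently and "
    "insensibly working, whenever and wherever opportunity offers, at the "
    "improvement of each organic being in relation to its organic and "
    "inorganic conditions of life."
)
substitutions = [
    ("natural selection", "the writer"),
    ("the world", "her writing"),
    ("organic being", "sentence"),
]
rewritten = darwin
for old, new in substitutions:
    rewritten = rewritten.replace(old, new)
print(rewritten)  # prints the "writer as natural selection" version above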

Why “daily and hourly scrutinizing”? Because the environment for selection is constantly shifting in both mental and physical space and time. Moods shift in authors, readers shift, and history shifts. Whether a sentence is conserved in its integrity and form or is revised or eliminated depends on how well it proves to be adapted to the environments it encounters. Threats to its existence come from the writer who might eliminate it altogether, the reader who might ignore it, and the sheer ravages of time (the last book or database in which it appears may go up in flames).

Shifting tastes in shifting environments. Writing is also like the selections surrounding taste. Imagine, for instance, a woman in a fudge shop, selecting three pieces of fudge of different types (one white, one without nuts, one with nuts). Now imagine she decides, an hour after purchasing them, that one of the pieces is insufficiently chewy, so she tosses it into the trash. Now the trash bin itself is a locus for selection, with ants in the role of potential consumers—or rejecters—of the fudge.

In other words, forms of selection are occurring in every moment all around us. We have selection on the producer’s side; selection on the consumer’s side. Think of the videos on the popular website “Funny or Die,” which either attract eyeballs or are eliminated. For a piece of fudge the motto might be “Yummy or Die,” attracting mouths—or, if we can imagine a conscious piece of fudge that doesn’t want to be consumed, it would be “Yucky or Die.” This latter formulation is the evolutionary strategy for survival of many species of beetles, which squirt acrid juices into the jaws of their predators. Darwin famously learned about this firsthand, to his chagrin. In order to catch three beetles on a specimen collection hunt as a young man, he held one in each hand and popped the third into his mouth—which he then quickly expelled in foul-tasting shock, the beetle scurrying away, uncaught. “Yucky or Die.”

“Interesting or Die.” Writing, in turn, has a general survival strategy of its own: “Interesting or Die.” Lionel Abel (1910-2001), an early contributor to Partisan Review, in an interview from the mid-1990s, said this about the Russian revolutionary and author, Leon Trotsky:

He had a literary verve which was unmistakable. He was a great journalist. And the intellectual power of his criticism of the Stalin regime…[is] accepted nowadays as justified, that he was right. But we didn’t know he was right. We knew he was interesting. And, in a way, if you lived in the Village [Greenwich Village in New York City in the 1930s], what was interesting was right. Certainly, the uninteresting was wrong. I’m not willing to altogether give that up, even today.

The rough draft as material for selection. Writing is about seeing the sentences that occur to you in your mind as wriggling forms of life, then giving them material form—one’s thought-worms made flesh, as it were—by laying them out in front of you on the page and seeing what happens from there. To extend the metaphor: the writer’s organelles are letters; her organs, words; her organisms, sentences—and once those sentences get started, they suggest other sentences—they breed other sentences. By the time the writer is done with, say, a twenty minute round of writing, perhaps her words have come together nicely to transform into a single, vivid, ecological community of sentences; a little system of relations on the page (a paragraph or two of argument or observation; a lyric poem, etc.).

Idea therefore leads to idea, thought to thought, as when the poet Robert Frost (1874-1963) wrote of his path through an autumnal woodland that branched unpredictably into new paths, way leading “on to way.” The writer is a pathfinder, unwinding sentences onto the page, then choosing further paths out of those sentences to still other sentences. And at some point the writer steps back to see where he or she has been. This is where editing begins.

Editing. The first step in writing is to just relax into the process of making initial selections among the ideas that occur to you. (“X comes to mind. I think I’ll say something about x. No, y is better. Yes, I’ll write about y instead.”) The second step is to get your thoughts about whatever it is you’ve decided to write about to flow out of you as actual, material sentences on the page. At this stage, you want to bring the stakes down on yourself, not over-fretting about perfection, completeness, grammar, or perhaps even coherence. It’s okay to make mistakes at this point. You know that writing is a process of imaginatively unleashing material, then chiseling it back. These sentences will be subject, soon enough, to later selection and correction in editing, so for now you let things fly. To let things fly naturally entails attention, not just to rational thoughts, but also to emotions. As the journalist and author Robert Wright concisely puts it, “emotions are judgments”; they are what surface in the writer as conclusions the brain has subconsciously–and perhaps quite quickly–worked out about such things as aptness of phrase and aptness of direction as they occur to you: “Yes, this phrase is good, this direction is good; these feel right.”

The sequence of selection thus proceeds from mind to emotion to material sentences to editing. At each of these steps you are imaginatively venturing outward to new ideas (“words are birds”), then back to the nest of four things:

(1) the music of your sentences together (how your language is flowing)

(2) your audience (the ear you are writing for or the group you are writing to)

(3) the piece of writing’s length limit

(4) your thesis or end (think in Shakespeare of the ghost of Hamlet’s father urging the ever erratic Hamlet not to forget his “almost blunted purpose”)

These four function together as form putting pressure on your flights of novelty, even as your flights of novelty put pressure on form.

Artificial vs. natural selection. So editing is where the writer’s actions have their most obvious analogy to artificial selection as opposed to natural selection. Before the eye of a writer are a variety of sentences gathered like dogs in kennels or roses in greenhouse beds, and just as the breeder of dogs or the cultivator of roses has a collection of specimens before her eyes, and an end in mind, so it is with the breeder and editor of sentences. The writer possesses what blind nature cannot have: an ironic perspective on the forms of life she has already generated. From your writerly vantage, you enjoy distance, and can cultivate what you want to say, cutting back on some words and allowing others to suggest still other words. Hence do your surviving words become “fruitful and multiply,” but to an end in mind: your object. If your sentences are organisms and your paragraphs little communities of organisms, your end may be to gather these communities into a greater ecosystem of interrelation, sending it forth as an essay or book–which might then undergo yet another form of selection by readers, perhaps going viral.  

Richard Dawkins. For being an early and vigorous defender of the theory of evolution by natural selection against its critics, 19th-century biologist Thomas Henry Huxley (1825-1895) became known as “Darwin’s bulldog.” In the late 20th and early 21st century, the sinewy and quick-witted Oxford biologist Richard Dawkins (b. 1941), in his equal enthusiasm for the power of evolutionary explanation, has been called “Darwin’s greyhound,” and in his seminal 1976 work on evolution, The Selfish Gene, he argues that what underlies all of life’s activity are the reproductive imperatives of genetic material: a chicken, as it were, is an egg’s way of making another egg; an anthill is a way to make another egg-filled queen ant, and so on. Dawkins in turn argues that human languages, being codes for carrying information, function analogously to genes, moving words, phrases, and ideas about like viruses from mind to mind, some being more successful at provoking humans to attend to them and reproduce them than others.

Memes. In the last chapter of The Selfish Gene, Dawkins coins a term for the viral and replicating nature of human cultures and languages. Cleverly mashing echoes of the words mimesis (imitation) and memory with genes, he calls those bits or clusters of culture and language that go viral memes. Memes, like genes, are replicators. Among their potential iterations, memes can travel small and independent (“Got milk?”), can mutate (“Got beer?”), and can also be carried along in larger memetic clusters (as in the familiar phrase from the 23rd Psalm—“The Lord is my Shepherd . . .”—in the King James Bible).

An obvious example of a meme is the repetition of the phrase “I would prefer not to” by Bartleby, a copyist in a 19th-century law office, in Herman Melville’s well-known short story, “Bartleby the Scrivener” (1853). In Melville’s story, the phrase comes to infect the minds of the copyist’s employer and coworkers, and it has even taken on a life of its own outside of the story itself, becoming readily associated with all forms of passive resistance to authority, from Henry David Thoreau and Leo Tolstoy in the 19th century, to Gandhi and Martin Luther King in the 20th.

Here’s Dawkins from the last chapter of The Selfish Gene: “Examples of memes are tunes, ideas, catch-phrases, clothes, fashions, ways of making pots or of building arches” (206). There is a cultural catch-all quality to Dawkins’ use of the term, and thus the takeaway for the writer thinking memetically is to attend not only to the content and syntax (word order) of what one says, but to the material style of communication—for these too are potentially memetic. Just as, for instance, physical building arches communicate an architect’s purposes and style to viewers, so do the writer’s letter arches to readers, as when a sans-serif font like Helvetica (a font lacking small lines at the end of strokes) goes viral in advertising, becoming ubiquitous in culture as a font associated with a clean and modern feel.

Here are two additional, memorable sentences from Dawkins using either the term “meme” or “memes”: “If a meme is to dominate the attention of a human brain, it must do so at the expense of ‘rival’ memes” (211), and “When we die there are two things we can leave behind us: genes and memes” (214).
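Dawkins’s picture of memes as replicators competing for finite attention can be made concrete with a toy simulation. The sketch below is a minimal illustration in Python (the memes, the “catchiness” weights, and the mutation rule are invented assumptions of mine, not anything from Dawkins): catchier memes tend to crowd out rivals over repeated copying, with an occasional copying error producing a variant.

# A toy model of memes competing for the limited attention of a population of
# minds. Catchier memes replicate at rivals' expense; rare copying errors
# "mutate" a meme into a new variant. All numbers here are illustrative.
import random
from collections import Counter

random.seed(42)
catchiness = {"Got milk?": 1.0, "I would prefer not to": 2.0, "Make it new": 3.0}
minds = [random.choice(list(catchiness)) for _ in range(100)]  # one meme per mind
MUTATION_RATE = 0.01

for generation in range(50):
    new_minds = []
    for _ in minds:
        # Each mind re-fills its attention by copying a meme from the current
        # population, weighted by catchiness (selection at rivals' expense).
        meme = random.choices(minds, weights=[catchiness[m] for m in minds])[0]
        if random.random() < MUTATION_RATE:
            meme = meme + "*"  # a garbled copy becomes a new variant
            catchiness.setdefault(meme, random.uniform(0.5, 3.5))
        new_minds.append(meme)
    minds = new_minds

print(Counter(minds).most_common(3))  # which memes came to dominate attention?

Run a few times, the catchiest meme typically sweeps the population, with a mutant variant occasionally hitching a ride; this is the dynamic Dawkins gestures at when he says a meme dominates a brain only at the expense of rival memes.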

Writing 2.8.1. Try imitating evolution, making way lead “on to way” (in the poet Robert Frost’s phrase). Start a path for thoughts. Write a paragraph or two in which you write exactly 200 words, no more or less, on any topic of interest. As with artificial selection in the breeding of dogs or roses, have an end in mind and an audience. While writing, notice your mental process of selection and elimination. You might say to yourself, for instance, things like the following: “What sentence most smoothly and naturally clarifies or builds upon the previous sentence?” or “This seems like the best word choice for now; the best sentence; the best direction for this piece of writing.” After writing a bit, go back and look at what you’ve written and interrogate each sentence with questions like these: “Why that word in this sentence and not another?”; “Why those words in that particular order?”; “Why that sentence next, and not another?”; and “Do the sentences move along smoothly and rhythmically together?” Alternate between writing and noticing what’s driving your choices in the writing. Attend to your feelings as you write (recall that “emotions are judgments”). Again, fiddle with the writing until you’ve shaped it into something consisting of exactly 200 words. A little organism, as it were. 

Writing 2.8.2. If, in 2.8.1, you generated 200 words, look at what you have and write another paragraph or two explaining to yourself the processes of selection you went through to get those words onto the page. Be a naturalist here. What’s the story you tell yourself of how your piece of writing came to be? What’s holding your paragraphs together as paragraphs (as opposed to just being a series of disconnected sentences)? What seemed essential to keep, and what inessential—and why? If you feel you’ve made something alive, ask yourself whether it is Leonardo’s symmetrical and harmonious Vitruvian man, or something more akin to a Frankenstein’s monster: a piece from here, a piece from there; a thing ugly, and perhaps even worthy of destruction. As the standard for evaluating your writing, think “Interesting or Die.” Is what you’ve produced worthy of going on? Who or what decides?

Writing 2.8.3. Take what you’ve generated from 2.8.1 and give it an edit. Edit at the level of word choice, word order (syntax), tone of voice, musicality, and energy. See if you can breathe some more life and interest into your piece of writing than it has now. Perhaps, in your edit, additional thoughts will occur to you. Get those down as well. Let the writing grow a bit beyond its existing 200 words, but do not exceed 300 words. To give yourself additional ideas for writing and editing, ask yourself what environments your words might encounter and how you hope to delight or interest the readers in those environments with what you’ve written (perhaps some fellow students you are taking a class with right now; perhaps yourself ten years hence, stumbling on the paragraphs again; perhaps a person a century from now flipping through your journal on sale at a thrift shop, etc.). What environment(s) are you aiming your words at? Memento mori (remember death; remember the limits placed on you by space and time); memento delectus (remember your power of selection in the situations and moments available to you).

Writing 2.8.4. Discuss what you wrote in 2.8.1 with one person or more, focusing on the issues you wrestled with as a writer. Is the writing working?

Writing 2.8.5. Write a paragraph to an individual you can speak with immediately after doing so (either by phone or in person), gambling on an argument that you think might persuade him or her to feel, act, or adopt a belief of your choosing. The purpose of this experiment is to gamble on an audience, then get immediate feedback from that audience as to the pay-off—or failure—of your argument. Was your argument persuasive to the person? Did you hit or miss the goal of your writing?

Writing 2.8.6. Pick a topic and write three completely separate paragraphs on it. To be clear, don’t write one piece of writing with three paragraphs, but three separate pieces of writing of one paragraph each—all on the same topic. Your goal here is to have three different, full paragraphs with different starts so that you can then choose the one you think is best after setting all three aside for a time. (After writing, you may have to set the paragraphs aside for at least a full day to really bring a fresh eye to them). As with three pieces of music, each paragraph will start differently, and so acquire its own unique tone and logic from its first “note” to the last. So try this. Select three of the below words or phrases to get you started on three completely separate paragraphs. Let them set the beat for your writing. Start one paragraph with one word or phrase, the second with another, etc. In part two of the book, we’ll refer to such phrases as mutagens—agents of change. (See the book’s Introduction for a discussion of mutagens.)

Once … Consider … Contrary to … It is … It was … One day … In the past … Have you ever … Although … It is a well-known saying that … It is well known that … After … The subject of x has become … Over the past x years … As we start … At the beginning of … Perhaps … Like most … Roughly speaking … I … In light of the recent … While it is true that … Contrary to popular opinion … Contrary to x … Imagine … Visualize … On the one hand … Back in the day … In the past … One of the ironies of existence … At one time I thought … At one time … I’m interested in x … I’m an x … For me, … What, where, when, why, how [as in, raise a question: “What do you suppose it would be like to live with perfect courage?”; “Where in our culture is stillness and quiet valued?” etc.]… I recently … The recent story of … Recently … Have you noticed …

__________


A Mini-Course In Rhetoric For Writers. Concept 2.7: Audience And Rhetorical Strategy


Concept 2.7. Audience and rhetorical strategy. Just as organisms have evolutionary strategies (more or less cooperative; more or less aggressive) for success, survival, and reproduction, writers have rhetorical strategies (more or less friendly; more or less obnoxious) for the success, survival, and reproduction of their messages. As a writer, you may choose to be generally low-key or high-strung; ironic or serious; optimistic or pessimistic; rational or irrational. These are gambles that you’ll make in the moment, hoping they’ll appeal to your audience and perhaps make for the success, spread, or preservation of your words.

As an example, consider your tone of voice in writing. Tone has to do with mood; it is conveyed from the very first sentence, and it is present in every sentence thereafter. Tone is a pathos appeal (an appeal to an audience’s passions or emotions) and can change through the course of a piece of writing (humorous at one moment, earnest the next). The hope is that your audience will pick up your emotional weather, and reflect it (laugh at your jokes, share your passions). Your success as a writer depends on it.

Writing is a bet laid down on the future. So your next sentence is always a gamble, and what you write is thus driven by an appraisal of your environment (whether you judge your audience to be skeptical, sympathetic, attentive, intelligent, etc.). When beginning a piece of writing, therefore, you are engaged in a process of selection: what model do you have in your head as to who is in your audience, and what do you bet will please them? Especially at the start of a piece of writing, there is an attempt to strike lightning in the consciousness of the reader by guessing which words, and in what order, are most likely to do that. Mark Twain (1835-1910) famously put the weight of the writer’s selective responsibility at the level of even the smallest word choices. He observed that the right word distinguished from the almost right word is akin to the distinction between the “lightning” and the “lightning bug.” The poet Emily Dickinson (1830-1886) once wrote in a letter,

If I read a book and it makes my whole body so cold no fire ever can warm me, I know that is poetry. If I feel physically as if the top of my head were taken off, I know that is poetry.

On opening a book of poetry, she hoped the writer had found words that would dazzle and spin her out, and as a writer, her goal was to locate words that would decapitate both herself and others through their aptness, velocity, energy, and truth. Sometimes she’d miss, sometimes hit. Hit and miss. The essay writer’s opening sentences have a similar function to the poet’s: arresting and capturing an audience’s attention.

Living is an art—and so is rhetoric. Because situations vary, and can shift dramatically in each hour, life has no recipe book for what to do from moment to moment. Living is an art. If, for instance, you were a literate tortoise in the Galapagos Islands, there’s no guidebook to read, instructing you in how to be a successful tortoise from day to day. (“On such-and-such date, move to the left of that tree over there, stop to eat for ten minutes, chat up the female tortoise that will also be there, then move back toward the ocean before noon.”) Instead, each tortoise just has to play the jazz of being a tortoise in each moment, doing the best it can with what it’s got.

So it is for humans. Humans are what Aristotle called political animals (social animals), finessing, not just their interactions with the material world, but also their interactions with each other. And that’s where the jazz of rhetoric comes in. Becoming good at deploying rhetoric is an indispensable part of making one’s life work. It entails sharing words in speech and writing, and sharing images and gestures (recall that one can practice oral, written, and visual forms of rhetoric). Dancing, for instance, can be a form of rhetoric. Your proficiency with playing the jazz of rhetoric from day to day can dramatically influence your access to resources, work, and mates. As with any art, decisions have to be made about one’s goals in practicing it and how to proceed. In the case of rhetoric, your aim may be to impress your audience as to how clever you are, raising your social status; or it may be to share something zany on Facebook or Instagram that you think will make your circle of friends smirk. Whatever your social goals, if you are not accomplishing them with swords—forcing people—you’re doing it with words—reasoning with, coaxing, delighting, and spell-casting others.

Words, not swords. Deploying words, images, and gestures instead of violence to get what you want is the basis for civil society. It makes possible the gathering of humans into ever larger, cooperative circles–and this trait–talking things out, rather than fighting them out–has been decisive in the blossoming of our species from narrow tribal affiliations of, say, 150 people to cities of ten million or more. The deploying of words rather than swords to achieve ends underlies our elections, advertising, courts, and educational practices. A good reason to go to college, for instance, is to become more proficient and self-aware surrounding critical and rhetorical modes of thought. That’s a big thing that one does throughout college: practice critical thinking and rhetoric so as to get better at them. In a community that relies on words more than swords to settle differences, knowledge of critical thinking and rhetoric really is power.

Lincoln’s evolutionary gamble: his “better angels of our nature” speech. On March 4, 1861, in front of the still-unfinished U.S. Capitol building, Abraham Lincoln took the oath of office as the 16th President of the United States of America, and in his first inaugural address, he delivered these now-famous words of appeal to the slave-holding southern states not to leave the Union:

One section of our country believes slavery is right, and ought to be extended, while the other believes it is wrong, and ought not to be extended….In your hands, my dissatisfied fellow-countrymen, and not in mine, is the momentous issue of civil war….We are not enemies, but friends. We must not be enemies. Though passion may have strained, it must not break our bonds of affection. The mystic chords of memory, stretching from every battle-field and patriot grave to every living heart and hearthstone all over this broad land, will yet swell the chorus of the Union when again touched, as surely they will be, by the better angels of our nature.

The better angels of our nature. Lincoln’s intimate appeal to fellow feeling and the best part of ourselves was a rhetorical attempt to avert war. It failed. In the month following his speech, the Confederacy of slave-holding states attacked Fort Sumter, a Union sea fort off the coast of South Carolina, initiating the first shots fired in the Civil War.

Who’s in, who’s out? Although Lincoln’s conciliatory words did not avert war, from a rhetorical perspective they are instructive in terms of thinking about human evolution in relation to speech and writing, for notice how saturated they are with vital questions surrounding community survival and identity, and how they attempt to sway the American people to undertake a shared evolutionary strategy: cooperation and unity, not distrust and war. Who’s in and who’s out of our community? How cooperative will we be with one another, as sharers of a broad continent? These are the great issues surrounding the speech: whether the nation’s circle of empathy should be narrowed or broadened.

And notice Lincoln’s deployment of the first person plural we and our throughout his sentences, priming the hearer to sympathy for a continent-wide tribal affiliation, placing all Americans securely in the same national family. What will be the status of us in relation to outsiders who don’t share our values or national identity? Will white and black, South and North, be friends—or enemies? These are issues surrounding borders. To whom do we extend our concern—and to whom do we shut our hearts? Will we deploy memory and narrative in the service of division, resentment, and grievance—or unity, fellow feeling, and a shared destiny? Will we submit to the darkest of our human passions (racism and war)–or love and reason? Lincoln cast his lot with the better angels of our nature; with sympathy and understanding—though a month later he would find himself in the role of a war President.

Competing selection pressures complicate Lincoln’s rhetorical choices. Lincoln clearly implies in his first inaugural that such things as trust, deference, and vulnerability in discourse belong to the better angels of our nature, and that withholding them from others is a bad thing, but noting this is not meant to imply that, rhetorically, adopting a speaking or writerly voice like Lincoln’s is necessarily preferable in all circumstances to other tonal voices. Evolution problematizes didactic formulations of good and bad, whether as to inclinations, emotions, or motivations, because evolution is about the production of variation cast into new environments: may the fittest organisms and words survive in the here and now. Sometimes the best evolutionary or rhetorical strategy, given the environment you find yourself in, is to abandon what worked in the past and, to echo the poet William Blake, “drive your cart and your plow over the bones of the dead.”

Thus the angels of human nature needn’t be treated as better or worse, per se, or as wholly at war with one another, for human beings possess both cooperative and selfish, truth-telling and manipulative—and even rational and irrational—traits for good reason: they can be adaptive. It depends on the environment you find yourself in. Indeed, it might be maladaptive to be too cooperative, trusting, or openly committed to rationality in certain contexts—leading to death, either for yourself or for your family or tribe. And human emotions and motives rarely come in their purest forms, but are fluid and unstable. They much more frequently arrive—and often in rapid fire—as ambivalent admixtures of the conventionally good and bad, as Shakespeare so effectively captures in characters like Othello, whose love is intermixed with a debilitating, and ultimately murderous, jealousy. Soliloquies in Shakespeare are also characterized by a roller-coaster range of rapidly shifting tones of voice, emotions, and motives, high and low, as in Hamlet’s famous “To be or not to be” soliloquy. Harvard evolutionary biologist E.O. Wilson puts it this way:

The internal conflict in conscience caused by competing levels of natural selection is more than just an arcane subject for theoretical biologists to ponder. It is not the presence of good and evil tearing at one another in our breasts. It is a biological trait fundamental to the human condition, and necessary for survival of the species. The opposed selection pressures during human evolution produced an unstable mix of innate emotional responses. They created a mind that is continuously and kaleidoscopically shifting in mood — variously proud, aggressive, competitive, angry, vengeful, venal, treacherous, curious, adventurous, tribal, brave, humble, patriotic, empathetic and loving. All normal humans are both ignoble and noble, often in close alternation, sometimes simultaneously.

To be or not to be, to think or not to think, to act or not to act–or to be nice or mean. These are the questions. As it is in the human breast, so it is in writing. An appeal to the better angels of our nature may thus arrive intermixed with lower appeals to, say, the seven deadly sins (lust, greed, envy, pride, sloth, gluttony, wrath). We may appeal to a reader’s vanity or a group’s narcissism (“our tribe is best, piss on the rest”); or we may make implicit threats of exclusion to those who don’t reciprocate our love. These too are evolutionary, rhetorical strategies that a social organism like us might adopt in communication, assisting the success and survival of our messages–and perhaps even of ourselves.

What social niche do you occupy? As Lincoln’s first inaugural suggests, tone of voice and mode of address are intimately bound up with one’s adoption of a persona (a mask) for writing and speaking, a concern in rhetoric surrounding ethos (the presentation and credibility of the sender of a message). In a social species like ours, the personae we present to one another can enhance or cripple our life prospects. Our personae play a large role in determining our status within groups, with whom we will affiliate, and with whom we will mate. Just as organisms in the wild fill ecological niches (the bottom-dweller, the one that clings to walls, the predator), so we fill social niches (the funny person, the serious person, the sexy person, the dominant person, etc.), and these social niches change—sometimes daily, sometimes hourly, sometimes by the very minute. In one moment, you might broadcast a relaxed persona to a friend via a phone text; in another, a studious persona to one of your professors as she passes your study table in a college library. The personae we emphasize to the world, interpersonally and in our writing, vary and are context dependent. With some people we want to come across as silly, with others as smart and in control. The personae we adopt function rhetorically, sometimes influencing or even spell-casting others in such a manner that they think, feel, or act in ways we want them to, providing us with advantages in thriving, access to mates, and so on.

How good are you at reading a room? If we’re projecting masks—personae—into social situations, it’s important that people believe them. Our personae also need to be attractive. We also want a good sense of timing (knowing when to deploy a persona to maximal effect), to read others well and gauge what they’re up to, and to be self-aware. That means it’s to our advantage to model the states of mind of others, and to model to ourselves our own states of mind—that is, to be both extrospective—outward-looking—and introspective—inward-looking. Scanning our outer and inner environments for clues as to what the communication situation demands, how different scenarios are likely to play out, what our purposes are, and what’s most likely to succeed in achieving our purposes, is what it means to read a room (whether that room is literal, as when mixing at a party, or metaphorical, as when a writer imagines her audience of readers).

The Big Five trait model. Is there a way to know ourselves and know our audiences so that we read rooms better and make better rhetorical decisions? Something that might help is the Big Five trait model (also known as the five-factor model), which some psychologists use, via a brief questionnaire such as the ten-question Newcastle Personality Assessor (readily available online), to gauge temperamental traits that tend to correlate, to a greater or lesser degree, with such things as career success, proclivity to addiction, and even longevity (high conscientiousness, for instance, is associated with roughly five additional years of life). Each trait seems to have some degree of heritability (a genetic component), and shows a significant measure of stability over an adult lifetime (how you score at twenty on a trait is unlikely to be substantially different a decade later). Via external observation methods, all five of these traits have also been measured in chimpanzees—and four have been measured in other species (humans and chimps alone share conscientiousness as an identifiable trait). So the Big Five trait model assumes a Darwinian framework: it recognizes evolutionary continuity between animals and humans, and identifies the traits measured as variations along continua that are subject to natural and cultural selection. Variations surrounding the traits are seen as measures, not of disease or health, but of different evolutionary strategies—evolutionary gambles—on the part of each individual, beneficial in some environmental contexts, and not in others.

Extroversion, neuroticism, conscientiousness, agreeableness, and openness. The five-factor model locates each individual as:

(1) more or less extroverted

(2) more or less neurotic

(3) more or less conscientious

(4) more or less agreeable

(5) more or less open

Individuals tend to be able to self-report rather quickly and accurately, even absent the use of a formal psychological questionnaire, where they stand in relation to these traits (extroversion, neuroticism, conscientiousness, agreeableness, and openness), and where others they know stand. If you’re more extroverted than introverted, you may find ready and motivating rewards in even the simplest things, and enjoy parties more than books; if you’re high in neuroticism, you may see in yourself a deep sensitivity to pain, and an inclination to worry and double-check things; if you’re high in conscientiousness, you may see in yourself a strong sense of order and duty, and find that you plan things well in advance; if you’re agreeable, you may see in yourself a deep concern for others—even strangers—a tendency to avoid conflict, and an ability to make friends easily; and if you’re open, you may find in yourself strong artsy and novelty-seeking impulses—and a decided aversion to behaving conservatively or conventionally. Recall that these are tendencies along continua and highly contextual. You may, for instance, be more characteristically introverted than extroverted, and yet when you are in some environments, you may be quite extroverted (talkative when you are among your closest circle of friends, etc.).

So, are you a shark or a bonobo? Perhaps the take-away here is that evolution has no preferred strategies. From an evolutionary vantage, if there’s a natural law at work with regard to success, survival, and reproduction, it’s whatever works. This would seem to apply to writing as well. Whether you’re an amoeba, an octopus, or a writer putting forward memes (memorable packets of language), rather than genes, what is staked in your gambles is the future. Whether you are a go-it-alone shark or a highly gregarious and social bonobo (a species of chimp with a reputation for being, among themselves, peaceful, cooperative, and highly, highly promiscuous), each animal positions itself as a more or less friendly, more or less cooperative, more or less risk-taking, more or less violent, more or less promiscuous organism–and this positioning may or may not serve success and reproduction. Thus it is that sharks and bonobos have different evolutionary strategies for getting their DNA into the future—as do writers with their words.

Writing 2.7.1. Reflect on where you might locate yourself along the shark-bonobo (go-it-alone/cooperative) and cautious-daring continua. How might your location along these two continua impact your style of writing? Who might be attracted to it—and who put off? Is there a middle ground here? How might you bridge the gaps between the sharks and bonobos in your audience without leaving both groups unsatisfied? Should you balance the interests of these competing groups, or go all-in on one side as opposed to the other? What is your end, ultimately, in writing?

Writing 2.7.2. Imagine that a friend recently inherited $100,000, and has now asked you for your advice as to what, exactly, he or she should do with it. Write two different responses. In the first response, advise caution and conservatism; in the second, risk-taking. After generating these responses, observe and think about the nature of the arguments and observations you’ve made on behalf of each of these contrasting evolutionary strategies, and how they worked in terms of tone within your writing. Then write a reflection on whether it is really possible to reliably advise another person. If you think it is, on what basis do you draw this conclusion? And if not, why not? Is there a middle ground here?

Writing 2.7.3. Draw a longish horizontal line on a piece of paper and place at the left pole the word, “shark,” and at the right, “bonobo.” Hash off seven places in between and locate seven well-known people along your continuum—from the more shark-like (go-it-alone and selfish) in temperament, behavior, and persona (presentation), to the more bonobo-like (cooperative and friendly). Of those included, who thinks, speaks, and writes most successfully? Who has produced the most offspring (either biologically or in terms of followers)?

Writing 2.7.4. Now ask this question, in turn, of each of the individuals along your shark-bonobo continuum generated in 2.7.3: What might it mean to write something in that persona? Who would be the best audience for receiving a message favorably from a persona of this sort? If you are doing the exercise with others, discuss your conclusions with the group and see if they arrived at similar author-audience conclusions. Ask the group these questions: Do the more shark-like personae only appeal to other sharks? Do the more bonobo-like personae only appeal to other bonobos? Who, along the continuum, if anyone, has cross-over appeal to both bonobo temperaments and shark temperaments? Why do they have this appeal? What are their rhetorical tricks (in terms of ethos, logos, and pathos)?

Writing 2.7.5. Write a paragraph or two in which you express empathy for a person, group, non-human animal, or ecosystem, and justify your empathy. Then write a paragraph or two in which you withhold empathy from a person, group, non-human animal or ecosystem, and justify the withholding. After writing your paragraphs, observe the contrasts in the feeling tones of the two pieces of writing, noting the different aspects of the human psyche they appeal to—the better and worse “angels of our nature.”

Writing 2.7.6. Write a paragraph or two in which you intermix some appeals to the conventionally better angels of our nature with some conventionally worse angels of our nature, and see if what you write possesses a more-than-usual interest. Is your writing more energetic than otherwise for entertaining such tensions—or does it feel less coherent, hopeful, and inspiring? (This experiment may entail working with a dialogical, as opposed to a monological, voice.)

Writing 2.7.7. The tone of your writing primes readers to adopt and track your attitudes and energies. But if you’re not careful here, you may lose them immediately. So imagine you are starting a longer piece of writing on a topic of your choice and ask yourself this question: “What angel of human nature should I try to evoke to start a piece of writing on my topic—humor, empathy, sobriety, cooperation, selfishness, enthusiasm, snark, lust, etc.?” After you pick either a better or worse angel of our nature to start the hypothetical piece, actually write a couple of sentences—or even a full paragraph—attempting to evoke your chosen attitude. After you write those opening sentences, read them out to another person and ask what tones of voice and energies your hearer actually inferred from them.

Writing 2.7.8. Estimate where you would place yourself along the traits measured in the Big Five trait model, and where you think someone else you know well stands. How might what you know about yourself, and what you know about that other person, influence how you try to communicate to them?

Writing 2.7.9. Imagine yourself writing to an audience of people high or low in one of the Big Five traits (a group, for instance, that is highly neurotic or low in openness). Address an issue of interest or concern to you—but angle it to appeal to your target audience.



A Mini-Course In Rhetoric For Writers. Concept 2.6: Moving From First Sentence To Thesis

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of the book, click here.

Concept 2.6. Moving from first sentence to thesis. Even as you might be refining your thesis statement and inventing argumentative paragraphs, contemplating their order of arrangement, you may also be thinking about how, exactly, you’re going to move from the first sentence of your piece (and what that might be) to your thesis statement. Like lightning between cloud and ground, first sentence and thesis statement will have to link up at some point. So how will you get from one to the other?

The first sentence of a piece of writing can be thought of as akin to a lightning strike. It’s the attention grabber, as with the beginning of Sylvia Plath’s The Bell Jar (1963): “It was a queer, sultry summer, the summer they electrocuted the Rosenbergs,…” It might be anecdotal, introducing the author’s motive for writing in the first place: “I underwent, during the summer that I became fourteen, a prolonged religious crisis” (James Baldwin, The Fire Next Time, 1962). A short opening sentence is a common attention grabber: “Something is lacking” (Albert Low, What More Do You Want? 2013). Or even a single word: “Chocolate. The very word conjures…” (John Robbins, “Is There Slavery in Your Chocolate?” 2002). If you can’t think of how to start your first sentence, consider building on one of the words or phrases below.

Once … Consider … Contrary to … It is … It was … One day … In the past … Have you ever … Although … It is a well-known saying that … It is well known that … After … The subject of x has become … Over the past x years … As we start … At the beginning of … Perhaps … Like most … Roughly speaking … I … In light of the recent … While it is true that … Contrary to popular opinion … Contrary to x … Imagine … Visualize … On the one hand … Back in the day … One of the ironies of existence … At one time I thought … At one time … I’m interested in x … I’m an x … For me, … What, where, when, why, how [as in, raise a question: “What do you suppose it would be like to live with perfect courage?”; “Where in our culture is stillness and quiet valued?” etc.] … I recently … The recent story of … Recently … Have you noticed … Given what so-and-so has said … It occurs to me …

Your first paragraph should justify your reason for writing. Why are you writing this? Why? Implicit in any sensitively written opening paragraph is an acknowledgment by you, as writer, that the reader deserves an explanation, either implicit or explicit, for why you’re writing to him or her in the first place. This is, in part, out of respect for the reader’s autonomy. The reader in a liberal democracy can always walk away from your message. You recognize that the reader’s autonomy and freedom are just, and that it’s your responsibility, if you want to communicate, to have something worth saying, not wasting another person’s time.

So the first paragraph carries this burden. How to meet it? One way, most obviously, is to be grammatical. If you show indifference to proofreading, you’re disrespecting the reader. Another way is to show care in your sentence writing and word choices. If you show indifference to sentences, and how they sing together, work, and imply their meanings, then why write anything at all? Writing consists of sentences—and good writing entails attention to saying things carefully, sentence by sentence. As Annie Dillard once suggested, if you don’t like sentences, you shouldn’t be a writer.

Yet another way to justify your taking of a reader’s time is with style, which is a display of tone, charge, surprise, persona (personality), and the musicality of language. Style is performative. Attending to style tells your readers that you want their undivided attention; that you mean to engage with their intelligence respectfully, ironically, and subtly—thereby delighting and entertaining them. Camille Paglia’s book, Sexual Personae (1990), starts, for instance, with an arresting echo of the first verse of Genesis—but with nature, not God, given pride of place in being first. In such a beginning, she means to startle the reader; to turn the reader’s habits of thought upside down:

In the beginning was nature. The background from which and against which our ideas of God were formed, nature remains the supreme moral problem. We cannot hope to understand sex and gender until we clarify our attitude toward nature. Sex is a subset of nature. Sex is the natural in man.

Notice that Paglia is arguing that human reasoning, contra the scholastics of the Middle Ages, begins with nature, and the experience of nature, not with God. God, for Paglia, is an abstraction—and everything that nature is not. Our ideas of God, suggests Paglia, are thus a traumatic reaction to our encounters with nature (its violence, its diversity). And our attitudes toward sex tell us about our attitudes toward nature, for “Sex is a subset of nature.”

So nature is supreme—and “the supreme moral problem.” Want to keep reading? Intrigued? That’s the function of the first paragraph. It might grab us by absorption into a controversy. It can also send a subtle (or not so subtle) tribal signal, suggesting to your reader what political or religious tribe you belong to. (With Paglia, she’s signaling straight off that she is of that tribe of intellectuals unafraid to speak unconventionally of nature, God, and gender, and she expects her readers to stay for a potentially offensive—or, at least, offensive to some—performance. She means to be provocative.)

Your first paragraph might get its impetus from what someone else has said or done. Notice in the Paglia quote above that by implicitly riffing off of Genesis in the first sentence, she deploys a common way to justify a piece of writing: quote someone else. Introducing a quote is a fast and tidy technique for entering the stream of an ongoing debate or conversation (“So-and-so recently said…and I say…”). Readers in a liberal democracy will rarely question another voice entering the fray of a conversation. It’s its own justification. Everyone can have a say. Attending to another’s voice is the hospitality a reader extends to a writer.

So enter the conversation. “They say, I say.” Entering a conversation may not need an exact quote (Paglia only alludes to the author of Genesis in her first sentence), and it may not even need reference to what a specific person said, but merely a general observation surrounding a common assumption, or something many people are saying, talking about, or doing. The philosopher Hannah Arendt deploys this method as her provocation for writing an entire book (On Violence, 1969), her first paragraph starting with this:

These reflections were provoked by the events and debates of the last few years as seen against the background of the twentieth century, which has become indeed, as Lenin predicted, a century of wars and revolutions,… (3)

A good deal of writing, then, is a response to what others have said or done. If, for example, you’re a biologist writing for a professional journal, you might start with a review of the specialized literature that preceded your own investigations (“On the matter of microbial life on Mars, so-and-so in 1963 observed…”).

Your first paragraph might be an opening anecdote. An opening anecdote is a very short story or vignette, usually no more than a paragraph or two, and it may have a personal component. The personal component might tell the reader how you came to have an interest in the topic you mean to discuss:

My interest in life on other planets and my interest in cosplay (costume play) intersected last year at Thy Geekdom Con, a convention held at…

As you launch into your anecdote, it will be the reader’s hope that you’re not just rambling or being digressive, but will bring him or her in relatively brisk fashion to a thesis or main point worth reflecting on or arguing about. Magazine writers often capture readers’ attention via an opening anecdote, as in this first sentence of an explication essay—an essay that attempts to explain something—by investigative journalist Evan Osnos titled “Survival of the Richest: Why Some of America’s Wealthiest People are Prepping for Disaster” (The New Yorker, Jan. 30, 2017):

Steve Huffman, the thirty-three-year-old co-founder and C.E.O. of Reddit, which is valued at six hundred million dollars, was nearsighted until November, 2015, when he arranged to have laser eye surgery.

From this first sentence, a two-paragraph tale is woven around the multi-millionaire’s reason for getting his eyes fixed now, not later: insurance against some unspecified day in the future when contact lenses won’t be available at all; a day of apocalypse. From this anecdote arrives the author’s implicit statement of purpose in writing:

Survivalism, the practice of preparing for a crackup of civilization, tends to evoke a certain picture: the woodsman in the tinfoil hat, the hysteric with the hoard of beans, the religious doomsayer. But in recent years survivalism has expanded to more affluent quarters, taking root in Silicon Valley and New York City, among technology executives, hedge-fund managers, and others in their economic cohort.

The author’s statement of purpose here sets up a contract with readers: we expect him to devote the rest of his essay to exploring the survivalism practiced by the fabulously wealthy.

So whether the provocation for writing comes from what others have said or in response to an event (an earthquake, a convention, an election, a rich man getting laser eye surgery), you may find your entire first paragraph or two taken up with an opening anecdote that winds its way to a thesis or statement of purpose that then functions as the center of gravity for all the sentences that will follow.

Writing 2.6.1. Analyze the opening sentences of a variety of essays or books. See what patterns you detect.

Writing 2.6.2. Select a couple of essays from good magazines, observing how the opening sentences are functioning in relation to their first paragraphs, and how their first paragraphs are functioning in relation to their thesis statements or statements of purpose. 

Writing 2.6.3. In a journal entry, chart a course from first sentence to a thesis question. Make “Imagine…” the opening word of your first sentence. Write whatever comes to mind after that, winding a path that lands at the end of your first paragraph with the following thesis question: “This raises a question: should humans in the 21st century eat meat?”



A Mini-Course in Critical Thinking For Writers. Concept 1.16: Thinking Critically About Beauty, Art, And Literature

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.16. Thinking critically about beauty, art, and literature. In 1757, a full century prior to the publication of Darwin’s On the Origin of Species (1859), the Scottish philosopher David Hume (1711-1776) published four essays under the title Four Dissertations, one of which he called “Of the Standard of Taste.” In it, Hume raises a question surrounding variation, a matter we now recognize as a characteristically Darwinian concern. We’ll call Hume’s puzzling over it proto-Darwinian, for just as Darwin sought, not a supernatural or moral, but a natural and rational explanation for why species vary, Hume sought, not a supernatural or moral, but a natural and rational explanation for why opinions vary, most particularly on matters of taste with regard to beauty, art, and writing.

So why do tastes vary on beauty, art, and writing?

In tackling this question, Hume not only walks us through what a plausible natural and rational explanation to this question might look like, but he also teaches us how to read closely, think critically, and see.

Is taste in art and writing akin to taste in ice cream? Hume begins by noting the problem that, even among persons sharing the same “narrow circle,” “educated under the same government,” and sharing “the same prejudices,” one can still discover differences of taste with regard to beauty, art, and literature.

Why is that?

Hume doesn’t think it’s because people disagree in the abstract about where, say, good writing tends to be found. We all seem to agree, in general, on positive criteria for good writing: “Every voice is united in applauding elegance, propriety, simplicity, spirit in writing.” We also agree on negative criteria. We tend to agree, for instance, that coldness in writing is bad, as is “fustian” writing, which Samuel Johnson, in his Dictionary (1756), defines as “a high swelling kind of writing made up of heterogeneous parts”—that is, a writing that sounds sophisticated at first, but is actually just a pompous shambles, inharmonious, lacking coherence or an ultimate point.

So, no, taste in beauty, art, and writing is not like taste in ice cream, according to Hume. We all generally agree on the criteria for what’s beautiful and makes art and writing good. Something not just subjective, but objective, seems to be going on when we judge beauty, art, and good writing. Beauty and excellence are in some sense in the art and writing, and there are people who see them immediately. They recognize them. But others don’t. So why don’t we agree?

Is the poetry of Homer and John Milton really as good as your literature professor says it is? Hume doesn’t think beauty is just in the eye of the beholder (strictly subjective). He doesn’t agree, for example, with this line of argumentation: “Beauty is no quality in things themselves. It exists merely in the mind which contemplates them; and each mind perceives a different beauty. . . . [E]very individual ought to acquiesce in his own sentiment, without pretending to regulate those of others.”

Again, Hume disagrees with this plausible and common sense argument, for it leads to absurd conclusions, such as applauding the opinion of a critic who might treat the poetry of John Ogilby (a minor Scottish poet and Homer translator of the 17th century) as equivalent to that of a major poet like John Milton (author of Paradise Lost, 1667). Such a judgment, says Hume, would be as if one “had maintained a mole-hill to be as high as Teneriffe [a volcanic peak in the Canary Islands],” and would not deserve respect from educated and aesthetically discriminating people.

Treating beauty as strictly subjective also fails to explain how a famous poet like Homer could please “Athens and Rome two thousand years ago” and still be “admired at Paris and London” today. Despite “the changes of climate, government, religion, and language,” educated people agree that Homer’s poetry has beauty and power. The interventions of space and time “have not been able to obscure his glory.”

So beauty, for Hume, is objective.

Then why don’t we agree? If broad principles of what tends to make for beautiful things (symmetries, coherences, novel contrasts, etc.) can be agreed upon, and beauty is, in some sense, objectively “out there” in nature, art, and writing, then why are there aesthetic disagreements? Hume locates the problem in us; in our sense of discrimination, which he takes to be delicate and subject to poor calibrations, like the mechanism of a watch:

Those finer emotions of the mind are of a very tender and delicate nature, and require the concurrence of many favourable circumstances to make them play with facility and exactness, according to their general and established principles. The least exterior hindrance to such small springs, or the least internal disorder, disturbs their motion, and confounds the operation of the whole machine.

In other words, for accurate detection of time, you need a fine watch, and for accurate detection of beauty, you need delicate discrimination. Put another way, the criteria we assign to the beautiful in the abstract can be picked up by us in practice only if our discerning faculties are well-tuned, and neither damaged nor working improperly. Just as you wouldn’t, for example, expect “a man in a fever” to be “able to decide concerning flavours,” so you cannot expect an agitated or distracted person to be especially discerning of beauty. Some people, likewise, have little native aptitude for “delicacy of imagination,” something Hume insists “is requisite to convey a sensibility of those finer emotions.”

Prejudice functioning as habit and temperamental bias. Another reason that people may not perceive the same things as beautiful has to do with prejudice: people possess different habits of attention and temperamental biases that make it difficult to notice all the things in the world that are actually beautiful:

A young man, whose passions are warm, will be more sensibly touched with amorous and tender images, than a man more advanced in years, who takes pleasure in wise, philosophical reflections concerning the conduct of life and moderation of the passions.

Likewise,

One person is more pleased with the sublime; another with the tender; a third with raillery. One has a strong sensibility to blemishes, and is extremely studious of correctness: Another has a more lively feeling of beauties, and pardons twenty absurdities and defects for one elevated or pathetic stroke.

Culturally, people also carry biases:

[W]e are more pleased, in the course of our reading, with pictures and characters, that resemble objects which are found in our own age or country, than with those which describe a different set of customs.

In contemporary psychology, the Big Five trait model (extraversion, neuroticism, conscientiousness, agreeableness, openness) would seem to support Hume here. Temperamental inclinations will tend to tug one’s attention and interest away from some things, and toward others.

Senses attuned to aspect seeing (and smelling, hearing, etc.). Still another reason that people may not agree on the beautiful is that their sense organs and powers of imagination and vision are differently calibrated: one may be naturally sensitive to one subtle quality in an object; another to a different quality. That is, one’s senses may be highly attuned to beauty, but to very different aspects of it. By way of analogy, Hume offers two people passing very different judgments as to the qualities adhering to a bottle of wine: one praises it as promising, but detects “a small taste of leather,” and the experience of it is thus compromised for him. The other too praises it as good, “but with the reserve of a taste of iron.” Before ridiculing their judgments as grounded in projective fantasy, Hume asks us to imagine that, on drinking the whole bottle, “there was found at the bottom, an old key with a leathern thong tied to it.” From this, Hume draws the conclusion that, just as there are qualities in wine that make for judgments as to its sweetness or bitterness, so there are qualities in objects that make for judgments as to their beauty or deformity: “[T]here are certain qualities in objects, which are fitted by nature to produce those particular feelings,” and these qualities of beauty and deformity can be very fine and difficult to detect:

Now as these qualities may be found in a small degree, or may be mixed and confounded with each other, it often happens, that the taste is not affected with such minute qualities, or is not able to distinguish all the particular flavors, amidst the disorder, in which they are presented. Where the organs are so fine, as to allow nothing to escape them; and at the same time so exact as to perceive every ingredient in the composition: This we call delicacy of taste, whether we employ these terms in the literal or metaphorical sense.

Our taste in beauty, in other words, is very like our taste in wine: just as we must have a developed and sensitive palate to detect the subtle qualities in a wine, so we must have a developed and sensitive faculty of aesthetic taste—a “delicacy of imagination”—to detect and render good judgments concerning all the qualities of a beautiful thing in nature, art, or writing. (And Darwin might say here that rendering such a judgment in the right company is socially adaptive.)

Nothing escapes notice: delicacy, sensitivity, and precision of sense. So “delicacy of imagination” is the reason that it’s not “easy to silence the bad critic, who might always insist upon his particular sentiment, and refuse to submit” to a critic of more refined judgments. People differ in their powers of sensitivity, and this means that some apprehend details far more perfectly than others:

It is acknowledged to be the perfection of every sense or faculty, to perceive with exactness its most minute objects, and allow nothing to escape its notice and observation. The smaller the objects are, which become sensible to the eye, the finer is that organ, and the more elaborate its make and composition. A good palate is not tried by strong flavours; but by a mixture of small ingredients, where we are still sensible of each part, notwithstanding its minuteness and its confusion with the rest. In like manner, a quick and acute perception of beauty and deformity must be the perfection of our mental taste; nor can a man be satisfied with himself while he suspects, that any excellence or blemish in a discourse has passed him unobserved.

Hume is insisting here on very close reading and seeing. To reach this highest experience of beauty—“perfection of our mental taste”—exactness is required. Nothing must get past the perceiver “unobserved.”

Practice makes perfect. But can you do anything about this? That is, can you obtain this well-calibrated aesthetic faculty—the faculty of taste—or is it just something a person is born with, as some are born with more sensitive ears and taste buds than others? Here’s Hume’s answer:

[T]hough there be naturally a wide difference in point of delicacy between one person and another, nothing tends further to increase and improve this talent, than practice in a particular art, and the frequent survey or contemplation of a particular species of beauty.

In other words, there’s hope for the person interested in becoming a close discriminator of beauty: practice makes perfect. (Practice entails focus, habit, and the close study of wide-ranging models.) But the best that people tend to do without practice is to recognize beauty and fine art and writing in only the most general fashion: “The [unpracticed] taste cannot perceive the several excellencies of the performance.” Also, the subsequent judgment lacks confidence:

If it pronounce the whole in general to be beautiful or deformed, it is the utmost that can be expected; and even this judgment, a person, so unpracticed, will be apt to deliver with great hesitation and reserve. But allow him to acquire experience in those objects, his feeling becomes more exact […]

In other words, whether you will confidently detect beauty, good art, and good writing is partly a social matter: the danger of being shamed by others who disagree is always in play. If you lack practice and experience, the contrary opinion of a more experienced observer may rattle you. You might not be sure within yourself as to what you’ve seen. So to obtain a confident proficiency in detecting beauty, you have to practice looking and engage in wide-ranging viewing and reading. Through practice, you also achieve a degree of self-esteem (inner confidence that your faculties are up to the task, and that, when you detect beauty, good art, or good writing, you see it right). You have to fight through a lot of static—including social static—to detect the signal in the noise.

Linger on, and rotate, the diamond. In addition to practice, Hume asserts that one must also learn to look at things more than once, and from multiple angles: “[B]efore we can give judgment on any work of importance, it will be […] requisite […] that [it…] be more than once perused by us, and surveyed in different lights with attention and deliberation.”

Hume’s advice thus entails slowing down; way, way down:

There is a flutter or hurry of thought which attends the first perusal of any piece, and which confounds the genuine sentiment of beauty. The relation of the parts is not discerned: The true characters of style are little distinguished: The several perfections and defects seem wrapped up in a species of confusion, and present themselves indistinctly to the imagination. Not to mention, that there is a species of beauty, which, as it is florid and superficial, pleases at first; but being found incompatible with a just expression either of reason or passion, soon palls upon the taste, and is then rejected with disdain, at least rated at a much lower value.

Comparison, contrast, degree, and hierarchy. Hume emphasizes comparison as another way of honing one’s aesthetic sense; that is, noticing differences in the degrees of beauty by asking the following question: this object of nature, art, or writing is beautiful, excellent, or powerful as compared to what? Here’s Hume:

It is impossible to continue in the practice of contemplating any order of beauty, without being frequently obliged to form comparisons between the several species and degrees of excellence, and estimating their proportion to each other.

Notice Hume’s emphasis here on “being frequently obliged to form comparisons.” Frequency in this context implies the leisure of repeated, wide-ranging exposures to nature, art, and literature.

Experience many things, and across times and cultures. Refined comparisons thus require exposing yourself to a lot of things so that you become conversant in the varieties of aesthetic experience. Only then can you render the best judgments based on the most refined criteria and models: “A great inferiority of beauty gives pain to a person conversant in the highest excellence of the kind, and is for that reason pronounced a deformity.” If you only have a limited experience with beauty, “the most finished object” you know of “is naturally supposed to have reached the pinnacle of perfection,” but once you become “accustomed to see, and examine, and weigh the several performances, admired in different ages and nations,” then you can competently “rate the merits of a work exhibited” and “assign its proper rank among the productions of genius.”

Hume in a nutshell. In short, for Hume beauty is objective and pervasive in the world, but also subtle and subject to degrees of perfection. And so it is that evaluations concerning beauty, art, and good writing are not either-or judgments, but judgments along a continuum (more or less beautiful, more or less excellent, etc.). Our failure to perceive things accurately and in their fullness can thus be accounted for by numerous factors:

  • We may be distracted or otherwise ill-tuned or damaged in our faculties.
  • We may have prejudices born of habits of attention, temperament, and culture.
  • We may have naturally diverse, differently calibrated senses that notice some aspects of beauty, but not others.
  • We may lack prolonged and wide-ranging experience in looking, reading, and judging.

Hume offers the following to those who wish to cultivate their receptivity and discernment:

  • Practice close reading and seeing.
  • When practicing reading and seeing, slow down, look repeatedly, and take views from multiple angles.
  • When contemplating a piece of art or writing, compare and rate it in relation to other pieces of art or writing that you know, including those across times and cultures.

There’s so much beauty in the world. How much of it are we really noticing? Obviously, very little, so it’s okay to start noticing a bit more just from where you are now. Little steps in a definite direction, persisted in, and with a goal in mind, make for a destination and purpose. You reach the summit of Everest one step at a time.

An objection to Hume: is he confusing beauty with a beauty criteria list? A 21st century person sensitive to the preservation of variety and experiment might wonder whether Hume has confused criteria for beauty with beauty itself. She might ask whether what Hume calls a dullness or insensitivity to beauty is, to the contrary, just a dullness or insensitivity to a particular, culturally determined, beauty criteria list, perhaps not even explicitly stated, but nevertheless present, applied to objects of contemplation.

A second objection: anything you can do, Hume can do “meta”? A 21st century person might also doubt the wisdom of Hume overlaying an aesthetic metalanguage (a language overlaying another language, dominating and interpreting it) onto all the other aesthetic languages at work in the world, as when Hume writes that the widely read and well-traveled critic, by long practice, becomes “accustomed to see, and examine, and weigh the several performances, admired in different ages and nations.” Any such meta-evaluative weighing of diverse aesthetic traditions may not really be closing in on the fullest apprehension of beauty that can be attained, but may, rather, be just another spell-casting enactment of marginalia and notation tacked onto somebody else’s art or cultural language game. To wholly see the beauty of something from a culture different from your own, you may be in need of full immersion in the language and aesthetic game(s) of that culture.

So the fact that anything you can do, Hume can do meta—and anything that Hume can do, you can do meta, meta—doesn’t mean that it’s getting one to a greater truth or perception of beauty. It may just mean that each of you is using a different language game or list of criteria to evaluate what the other is doing. There is thinking, and there is thinking about thinking (metacognition), and there is the artwork itself, and how Hume judges it—and how you judge Hume in the judging of it. By categorizing one language or criteria list for beauty as superior and others inferior, a follower of Hume may miss the beauty and inner logic of the ones deemed inferior. One person’s interpretation or notation–“Here’s what’s really going on; here’s what’s really beautiful and superior about this piece of writing over all others…”–may be another person’s experience of a hijacking.

Thinking about Hume after Darwin. Darwin would probably concur with the contemporary brake-tapping on Hume immediately above. What Hume might deem beautiful or good in art or writing depends on the criteria for what’s beautiful and good—and the historical, cultural, and aesthetic language games in which they are discussed. If the criteria for beauty, art, and good writing are agreed upon, and an author’s claims are repeated by others and stay in circulation (survive) within a cultural language game, then the evolutionist might say: whatever works.

But Darwin might also have an additional critique. If Hume is correct that differently calibrated sense organs and imaginations impact the degree to which we can make delicate discriminations of art and writing, then where did these variant calibrations come from? Darwin would say that they came from adaptations that serve survival. These adaptations can manifest as physical capacities (keen eyesight, for instance), but also as social capacities (a keen ability to read a room or emotionally salivate to the same social cues as those in your tribe). And some will have these abilities to a greater degree than others.

Beauty and social approval may not always line up. So a Darwinian advantage can accrue to you for gathering resources and mates if you have above average powers to impress a like-minded group of people—and this is regardless of what’s objectively beautiful (if objective beauty indeed exists). Put another way, a group can share a criteria list for beauty whether that list really and truly characterizes beauty well or not.

And outsiders don’t necessarily matter either. What matters in terms of your adaptive fitness to your tribe or group may be whether the group agrees with you or not, not whether outsiders agree. (The exception is if your group values disagreement. If it does, then playing the role of contrarian from within your group might be adaptive. You may disagree, for instance, with your group’s beauty criteria list, or question it, and not strain your relation to the group.)

But now imagine yourself with a lucky, variant social adaptation. As an individual in your group, you may be especially good at detecting or producing exactly the sorts of art that your group salivates to and agrees is beautiful. You may thus be well-adapted for making the sorts of discriminations that bind you to that particular group, but not to others, thereby marrying your fate to that group. (Evolution, recall, is about organisms varying in their evolutionary strategies, and laying down bets on future survival). Your bet is with your group—and you may find yourself quite fortunate in your contingent, delicate discriminations because they agree with your group. You may, for instance, have a delicate sensitivity to depictions of the sea, and life at sea, and your friends are all fishermen. They love looking at the sea with you. You see things from a vantage that they like. The actual truth of matters, or the accurate perception of objective beauty (whatever that might be), may have nothing to do with the actual success of your judgments with your friends.

Writing 1.16.1. Select a readily manageable item for delicate discrimination (a bottle of water, a pair of keys, a tree, a photograph, a building, a piece of art, a poem, a paragraph of writing). It can be anything that you can comfortably get out in front of you, and on which you can concentrate without significant distractions. Evaluate it for its excellencies and defects, and the relation of its parts to the whole. Linger, take notes, make repeated sensorial passes over the object (visual, of course, but perhaps also engage your other senses as well), and do this from multiple angles. Engage it with your thoughts as well. Notice distinctions—and make them. Do your best to get away from any external distractions, and if you have internal distractions (obsessive thoughts that are taking you out of the moment and elsewhere; a stuffy nose or ear-ache, for instance), do your best to set them aside. As you spend time with your item, write reflections on it. What do you notice? Write with precision and exactness. Be sure to address the item’s excellencies and defects, and the relation of its parts to the whole—and see to it that, as you write, you too are attendant to your own relation of parts to whole in your composition—and try to write and think with an eye on excellence. In other words, try not to write a jumble of disconnected thoughts, but a page that holds together as a piece of writing that is seeking to locate a main point (so that its parts are in service to that main point).

Writing 1.16.2. Get a photograph out in front of you, preferably from a good art or photography book. Think “slo-mo”: pause to ask questions, linger, look, and feel. Think, for instance, what a photograph achieves, arresting time, generating a crawl space around it. Think of how you might, in imagination, enter the scene of the photograph, savoring the sights, sounds, aromas, textures, and tastes that might be there–or how you might enter the scene from the vantage of each character in it, human, animal, plant, cloud, and stone. Or perhaps note how your photo depicts or functions as a process (ask what stage the scene is in; its parts in that stage; what it’s embedded in). Also count patterns. Note in what way the photo you have chosen to contemplate depicts something odd or is itself odd, and what’s sui generis (one of a kind) about it. Note in what sense things depicted are perhaps ugly, abhorrent, beautiful, good, or true. Also note what process or processes the scene might be embedded in (name & un-name things in the photograph; integrate & disintegrate them). Think of your photograph as a happening as opposed to a noun: notice that all things (nouns) are really events and relations in combination—and that nouns bear adjectives that frequently carry emotional, essentialist judgments (emotions are judgments). What emotions, what dominant impressions, come forward around this image?

Writing 1.16.3. Have another look at the writing you generated in response to either 1.16.1 or 1.16.2, and write a page of reflection on the prejudices (cultural and personal biases, passions, and temperamental inclinations) that found their way into your piece of writing. What got into the writing because of the contingent circumstances surrounding your inner and outer life? Think about age, both your own (in terms of your life cycle) and the one you live in (the cultural Zeitgeist—the spirit of the age). Think about your race, gender, and class. Did these make their way explicitly or implicitly into the writing–either as to the selection of your item of contemplation, or as to the content of your reflection? How about matters surrounding your temperament and passions? Perhaps you’re an extrovert rather than an introvert; an optimist rather than a pessimist; a conservative rather than a liberal; are into sports, not art. Perhaps you’re someone who doesn’t like to argue–or maybe you’re a perfectionist. How did such things about you, temperamentally, enter your writing (in ways subtle and not so subtle)? How about your geographic location on the planet? Your marital status? Whether you’ve travelled in your life, or live in the city or countryside? Try to notice and be honest with yourself about what sorts of things crept into your writing that were not, strictly speaking, about the item of contemplation–and an objective judgment concerning it–but about you.

Writing 1.16.4. Imaginatively cross a boundary of time or space (or both). Look at a cultural artifact that is not your own–either in art, photography, or writing. It may be something ancient, such as a paragraph of writing in translation from the Gilgamesh Epic or Bhagavad Gita, or a piece of art, but evaluate it on its own terms, writing about it. Reflect on what appears to be its inner logic (how its parts relate to the whole; the myths or worldview that might surround it). What do you surmise its use might have been to the first audience that received it (if you do not know), and what do you guess may have constituted its excellencies and defects from the vantage of that first audience, or as evaluated by its creator or first critics?

Writing 1.16.5. Select a piece of art, photography, or writing from your own time or culture and do a cross-cultural comparison and contrast with the art you wrote about in 1.16.4. Which one do you prefer personally? Is it fair to evaluate the inner logic of one piece against the inner logic of the other—given that they emerged out of different cultural language games–and perhaps had different priorities? What about if they had come out of the same culture?

Writing 1.16.6. Compare, contrast, and rank for excellence these three paragraphs of writing in relation to one another. What are their excellencies and defects? Which is most effective—and which least effective—in relating its parts to its whole? Which paragraph arouses in you the greatest amount of interest and energy in response? How does it achieve this effect? Is that effect in the writing—or in you?

[Insert three paragraphs here.]

Writing 1.16.7. In the following passage, x evaluates y as a bad writer…. [Complete.]

Writing 1.16.8. Write your first impression of the below paragraph as to its tone, content, quality, interest, etc. Don’t read the below paragraph twice. Read it only once. After you’ve generated a paragraph outlining your first impression, read it again—but this time, far more slowly and carefully. Then read it several times. Think about it from different angles. Notice its ironies, excellencies, and defects. Observe how its parts relate to the whole. Then write a second paragraph discussing your more considered opinions of the writing, and any delicate discriminations you might have caught on a closer pass with the work. Did your opinion, feeling, or perspective shift at all between your first paragraph of writing and the second? The more you thought about it, did you like the piece of writing more or less?

[Insert paragraph here.]

Writing 1.16.9. Engage in a meta-gesture for a paragraph in which you declare what the below paragraph is “really about.” Be creative. Put your reader in-the-know. “The overt subject of this paragraph is x, but what it’s really about is y.” Be ironic or serious, as you please. Just don’t take the text’s overt meaning to be its final meaning. Expose something of its subtext.

[Locate a paragraph.]

Writing 1.16.10. Write a paragraph in which you offer your main criteria (no more than five) for what constitutes excellence in writing or art. After generating this paragraph, contemplate it a bit further, and in a second paragraph, answer these questions: are the criteria you generated in paragraph one valuable absent context? Are they perhaps too specific—or too general? Would making them more specific or more general make them more meaningful and useful? Do they really translate reliably into all contexts—or just some contexts? What might those contexts be?



A Mini-Course In Critical Thinking For Writers. Concept 1.15: Know Where You’re Entering The Intellectual Conversation

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.15. Know where you’re entering the intellectual conversation. When you chance upon a group of people at a social gathering having an animated discussion, your first impulse may not be to jump in half-cocked with a comment, but to listen. You want to discover where the conversation is–at the beginning, middle, or end–and at what level it’s taking place (serious, ironic, analytical, etc.). You want the lay of the land before participating–and this is true of intellectual conversation in college as well. So naturally the question becomes: what’s the basic lay of the land for intellectual conversation? It can be answered in broad terms as falling into six distinct categories: conversations in which mimesis, romanticism, formalism, structuralism, poststructuralism, or social text theorizing are taken for granted. To navigate this territory in conversation and writing, let’s flesh out each of these in turn.

First, what is mimesis? Mimesis is imitation before an audience of what is most essentially true, holding, as Shakespeare famously put it, “the mirror up to nature” (Hamlet 3.2). A tragic play, for example, reflects the turmoil of the human soul to an audience, a book is a mimetic artifact of an author’s thoughts, and the description of phenomena in a science text presumably has some correspondence to the real world.

That, in any case, is the theory.

But not everyone accepts the mimetic assumption about communication. Indeed, two of the broad takeaway claims frequently encountered in the contemporary era—the postmodern era—are that meaning is slippery and that a great deal happens beneath awareness. Thus our communication is less mimetic–less reflective of reality and our intentions–than we suppose. As the British literary critic Frank Kermode observes, the central premise of all postmodern reasoning is that there is always more in a text than the author knows or intends–and this goes rather nicely with the 19th century philosopher Friedrich Nietzsche’s claim that “there are no facts, only interpretations.”

But before killing off the author, truth, and mimesis (words reflecting accurately what is actually the case) as archaic, flawed concepts from bygone days, chew on this question raised by two philosophers in 1982 (Knapp and Michaels): if the following stanza from a Wordsworth poem magically appeared scrawled in beach sand in the wake of a receding wave, how would you interpret it?

No motion has she now, no force;

She neither hears nor sees;

Rolled round in earth’s diurnal course,

With rocks, and stones, and trees.

In other words, were you to witness these words before you, apparently written via the action of water, how might you explain to yourself such a Twilight Zone moment? Absent an author, would the words mean anything? Your answer to that question is likely to reveal a good deal about your assumptions concerning mimesis and its importance to interpretation.

The evolution of Western thought in the light of mimesis. To orient oneself in the Western intellectual tradition generally, it helps to think of it as having passed through six periods or turns, the earliest of which were sketched out by literary scholar M. H. Abrams in The Mirror and the Lamp (1953): the classical, the romantic, the formalist, the structuralist, the poststructuralist, and the “social text.” When Abrams wrote, he hadn’t lived through the postmodern turns yet (the poststructuralist and “social text” turns), but these are the two readily discernible critical approaches that have emerged since the 1950s. And all six of them overlap and interact. None has ever really gone away. One way to explain their differences from one another is in their assumptions about mimesis and their relation to Aristotle’s famous rhetorical triangle (ethos, logos, pathos—speaker, message, audience).

The classical turn. Those who identify with the classical tradition (the tradition stemming from the ancient Greeks) concern themselves with the message (logos) sent to an audience (pathos) in its relation to the universe (the cosmos): does that message reflect the objective truth of matters? That is, does it point to reality as a whole—the cosmos—as it most truly is? If it does, the sender is to be praised for holding “the mirror up to nature.” Successful mimesis—which is, again, mirroring in language to an audience what is most essentially true—is taken in the classical tradition to be the highest end of all communication, displaying scientific, rhetorical, moral, and artistic excellence. If you believe that objective truth is, in some sense, “out there” in the world, and that communication ought to be consonant with it, your sensibilities are in accord with the classical tradition. 21st century scientists, for example, though living in a “postmodern” age, tend to take the classical tradition for granted. They assume, as the first scientists of the 17th and 18th centuries did (Galileo, Newton, etc.), that they’re discovering true things about nature, not just constructing imaginative interpretations of it, and they think of themselves as communicating their findings to others clearly, as one might hold “a mirror up to nature.” But some thinkers, especially in the humanities, focusing on the ways that time, language, and culture function, vigorously doubt these common-sense mimetic assumptions, and this is what makes them 21st century “postmoderns” as opposed to ancient Greco-Romans, 19th century romantics, or 20th century modernists—all of whom shared the assumption that the truth is “out there” and real—it’s not relative—and can be communicated clearly, simply, and directly.

The romantic turn. Those who identify with the romantic tradition (the tradition stemming from 19th century romantics like Johann Goethe and William Blake) concern themselves with the messages they send in relation to their “souls”: does that message accurately reflect the inner truth of the speaker, writer, or artist? If so, it is worthy of praise. As in the classical tradition, the romantic tradition is mimetic (imitative), but the concern is not so much with the message mirroring outer nature as with its mirroring the greater subjective truths and insights of the mind, imagination, and heart (inner nature) in interaction with the world. In this sense, it is phenomenalist (not excluding the experience of the subject from the truth of the matter at hand; what is true is not necessarily independent of what is subjectively experienced; indeed, to exclude the individual from the truth would be to miss something important).

The structuralist turn. No man (or word) is an island. That’s the structuralist’s insistence. Theorists in the structuralist tradition concern themselves with the field of relations. In linguistics, for example, signs (signifiers and signifieds, the words and the concepts they point to) become a great text—a field of signs—in which meaning does not reside outside of that field. Think of a dictionary in which words define words, and those words are defined by other words. Complete meaning necessarily disperses into the field of signs and is chased there. Think of ecology and the internet. What is the meaning of an organism or web page, but its location in a system of events in relation? Thinkers like Sigmund Freud, Karl Marx, and Charles Darwin can also be thought of as bringing structuralist theories into their respective disciplines: for Freud, the id, ego, and superego structure the energies of the psyche; for Marx, the class struggle structures history; and for Darwin, natural selection structures life into a great and branching evolutionary tree. In each of these, the human individual, like a word in a dictionary, takes her origin and meaning from the logic, systems, subsystems, and structures in which she finds herself embedded. The truth is “out there” and can be communicated—traditional mimetic assumptions are in play—but it resides in the structure. So when somebody asks–“What’s really going on?”–the structuralist is looking to explain things in light of some underlying pattern, system of relations, or structure: “Her relationship with her father was strained throughout her life, and so she dates older men because she has daddy issues.”

The formalist turn. Those who identify with the early and mid-20th century formalist tradition concern themselves with the message alone, and are not particularly concerned with its mimetic representations of either outer or inner truths (outer or inner nature), or its relation to outer structures or history. In formalism, the message itself is to be treated as whole, beautiful, and interesting—a “well-wrought urn,” in the critic Cleanth Brooks’ phrase. Each message is a universe all its own, possessing an inner language and logic apart from that functioning in the cosmos or the message’s sender. The formalist intellectual turn is to be fascinated with a dab of paint on a canvas in relation to another dab of paint on the same canvas; or to notice the materiality of a piece of window-glass itself, and not just of what is seen through it; or to wish to study, not a poem’s history or author, or how it relates to other poems, but its rhyme scheme, visual appearance on the page, word repetitions, its inner logic, its form. Have a look online at paintings by modernists like Kandinsky, Klee, and Miro for a visual sense of the formalist turn. Formalism doesn’t necessarily reject classical mimetic assumptions; it just backgrounds them in favor of a concern with a thing’s inner logic all by itself.

The poststructuralist turn. Like the structuralist, the poststructuralist is also concerned with fields of relations. But what divides a structuralist from a poststructuralist—and brings us to postmodernism—is whether or not to treat such fields as basically stable (like in a game of chess, with its spread-out board, accompanied by definite rules and determinate relations). Structuralists see history as largely playing out like a chess game. Interesting things happen, but within a structure governed by some law (perhaps discerned by a genius like Freud, Marx, or Darwin). By contrast, poststructuralists see things as playing out less lawfully, less predictably. Historical and chance contingencies are more in play. The poststructuralist foregrounds the gaps in the structures we purport to understand; the spaces where things can surprise. Think, for instance, of the gap depicted on the ceiling of the Sistine Chapel between the finger of Michelangelo’s Adam and the finger of God. They almost touch, yet don’t, and in that gap something unexpected can intervene, transforming the meaning of the image. (A fly could land in the space, or a wasp could build a nest there.) Emily Dickinson’s poem, “I heard a Fly buzz—when I died,” nicely anticipates the poststructuralist intellectual turn. When, at death, the character in her poem expects her life’s structure to close upon her quite meaningfully in that “last Onset—when the King / Be witnessed—in the Room,” she gets instead a fly “With Blue—uncertain stumbling Buzz / Between the light—and me.” That’s the Dionysian trickster to which the poststructuralist turns attention. The trickster is the contingency not anticipated; the margin that disrupts the center. Here’s Dickinson’s poem in full:

I heard a Fly buzz – when I died –
The Stillness in the Room
Was like the Stillness in the Air –
Between the Heaves of Storm –

The Eyes around – had wrung them dry –
And Breaths were gathering firm
For that last Onset – when the King
Be witnessed – in the Room –

I willed my Keepsakes – Signed away
What portion of me be
Assignable – and then it was
There interposed a Fly –

With Blue – uncertain – stumbling Buzz –
Between the light – and me –
And then the Windows failed – and then
I could not see to see –

 

So if you’re a poststructuralist, you’re particularly inclined to notice that models, rather than serving clarity, can blind us to surprise; that things aren’t stable; that the meanings of words in dictionaries possess ambiguities and evolve; that over time signifiers (words, signs) subtly shift their meanings in relation to their signifieds (concepts)–and in relation to one another–and you conclude from this that language, and everything else, is in flux and historically contingent, including the structures and laws that seem to govern them. Thus to “hold the mirror up to nature” is, from the vantage of the poststructuralist, to take for granted a stable correspondence between stable language and stable reality that is, in fact, illusory. If you’re a poststructuralist, you believe that people in the mimetic traditions (classicism, romanticism, structuralism) are possessed of what might be called the mirror delusion. They’re under the spell of nouns, images, and things they take to be fixed; images they’ve constructed. But time and poststructuralists themselves have their ways of undoing things. They are both deconstructionist. Thus, even when a poststructuralist is not performing a deconstructive reading of a text or event, foregrounding its margins and instabilities, time itself is, in the postmodern philosopher Jacques Derrida’s phrase, “always already” doing so. The idea of structures being unraveled by unanticipated actors is also captured by Derrida’s remark about his own work: “All I have done is dominated by the thought of a virus.” Thus the postmodern emphasis on becoming over being directs attention to what Derrida calls l’avenir, the time to come; the time which will break out of our models and structures, and is thus unpredictable. Derrida suggests that we can orient to l’avenir by being open, flexible, accepting, enacting toward it hospitality, as one might while on an LSD trip.

The “social text” turn. Those who identify with the “social text” turn in criticism have absorbed the insights of poststructuralists that the models and languages we overlay onto reality are problematic, but they nevertheless insist on making efforts to link models and languages, in a non-ironic fashion, to (left-leaning) social causes. This is why they can be properly designated social text theorists: everything is a text (a field of relations), but we still want non-ironic, progressive social engagement. Duke University, for example, puts out an academic journal of socially engaged criticism titled Social Text. So to be either a poststructuralist or social text theorist is what it means to call oneself “postmodern” as opposed to “modern” (formalist or structuralist) or “mimetic” (classical or romantic). But within postmodernism itself is this ongoing tension concerning mimesis: do the ways that people represent their inner lives and social struggles mirror accurately a fundamental inner and outer reality or not? Put another way, how do you keep faith with a romantic revolutionary like the poet Byron or a structuralist like Marx after absorbing Derrida (a poststructuralist)? Another issue in social text theorizing is the relation of the individual to the field of history: is an individual determined in any significant way by biology? Does she have free will? Social text theorists sometimes downplay biology and free will, emphasizing the individual’s embeddedness in the fields of language and culture. “Biology is not destiny,” “nature doesn’t speak, we speak,” and “the personal is political” can seem, at times, to be very near to articles of faith among social text theorists. Except when they’re not. (Think about the above slogans in relation to the question of whether or not some people are born gay.) The sociologist of science Bruno Latour complains that social text theorists are simply not coherent because they sometimes affirm and sometimes deny key assumptions. But in their defense, that’s because they’re trying to be a bit of everything (soulful romantics, Marxist structuralists, and Derridean poststructuralists) at the same time. How do you square these circles?

Concluding thought. Before entering a conversation, consider asking what the dominant implicit or explicit intellectual stance is of, say, a professor from whom you are taking a class: mimetic, romantic, structuralist, formalist, poststructuralist, or social text. At what level are they posing a question or having a discussion? Also consider thinking of your intellectual life as a game of chess. When a writer or speaker makes a claim, offers support for a claim, or engages in some other critical or rhetorical move, evaluate it as one might a move in chess and reflect on the following: there are lots of moves that could be made across the chessboard of intellectual life (classical mimetic, romantic, formalist, structuralist, poststructuralist, or social text ones). Why did this person make this particular intellectual move–and why now, at this moment? What does the person making this move want me to assume about models and languages in relation to the world, and what should be my move in response? What countermoves by others might I anticipate from my own move? Break the spell. See through the blue pipe smoke. Make your move from the vantage of knowing the intellectual terrain: “They say, I say.”

Writing 1.15.1. Read closely any sort of text (a prose essay, a story, an advertisement, a film, a poem), identifying its dominant sensibility and implicit assumptions about mimesis, romanticism, structuralism, formalism, poststructuralism, and social text theorizing. Would you say that the sensibility of the text has a stable, consistent, and coherent point of view—or is it unstable, contradictory, and confused? What implicit or explicit intellectual chess moves are being made, and what chess moves would you make in reply? Reflect on this in your journal, articulating to yourself your discoveries and insights.


A Mini-Course In Critical Thinking For Writers. Concept 1.13: The Committed Writer

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.13. The committed writer. If the critical thinker is about getting at the truth of matters, and the spell-casting mystifier is about blowing blue pipe-smoke over the chessboard of situations so that people can neither see nor think clearly, then the committed writer is about The Cause—whatever that cause might be. Committed writers may be distinguished from critical thinkers, not in searching for the truth, but in believing the truth is already in their possession. And they may be distinguished from spell-casting mystifiers, not in feigning seriousness, but in being quite serious. Committed writers are the true, non-ironic believers (the prophets and apologists for this or that ideology; the superheroes of their own minds; the patriots; the activists; the defenders of the outcast). Committed writers know right from wrong—or at least think they know. They may be at once confident and sincere. When in character, they may even present themselves to others in the posture of 100% certainty, without any doubts whatsoever—even though they may privately hold doubts. The content of that knowing may come from the political left or right, or from the vantage of religion or irreligion, or from the vantage of sales (they really, really believe in their product).

So if you’re a committed writer, you may be the lover of justice, the progressive activist, the person concerned with solidarity—but you may also be a resenter, an angry defender of your tribe, a hater of those you take to be freeloaders. Whatever your beliefs, you may not be especially ironic or critical about your own side—but you may be quite ironic and critical of your opposition. Your criticism may go all in one direction—outward. You might be especially concerned with either the preservation or advance of your community of believers—but not necessarily with those outside of your community. You may be quite nostalgic for a past condition—or long for a utopian future—and you might cast very dark shade on a particularly loathed enemy. (“If so-and-so would just get out of the way, my community could reach its goal.”)

Ad hoc reasoning and the committed writer. If you’re a committed writer, your argumentative style may be characterized by a lot of ad-hoc reasoning. Ad hoc means “for this instance only” in Latin. It’s haphazard, on-the-fly reasoning that adds premises to an argument to protect a favored thesis:

“If aliens have visited Earth, where are they now?”

“They’re hiding on the dark side of the moon.”

“But we’ve mapped the dark side of the moon. We know what’s there.”

“They’re underground.”

A person not engaged in ad hoc reasoning is more likely to reason in a balanced and just fashion, as when the philosopher and scientist Francis Bacon (1561-1626) wrote, “Read not to contradict and confute, nor to believe and take for granted…but to weigh and consider.” [from the first page of Logic and Contemporary Rhetoric by Kahane.]

So when you’re a committed writer, not taking Bacon’s advice, you may not really be doing your best to go wherever the truth might most fairly and naturally lead. If, for instance, you are a committed believer in aliens, your premises in argument are going to build up in such a way that they bring you, no matter what, to the conclusion that there are aliens. That is, you treat your belief–and the protection of your group of believers–as a special case for special deployments of reasoning in their defense (also known as “special pleading”).

You might not reason in this sort of ad-hoc fashion on other issues, but on your most treasured beliefs you may be prepared to do so. If you can find a logically possible way, however implausible or strained, to defend your position or group, you’ll go there. You’re not especially enamored of the principle of Occam’s razor. Occam’s razor holds that the simplest explanation is usually best. It is a heuristic—a rule of thumb—formulated by the medieval theologian William of Occam (1285-1347) for how to choose between two contending arguments: one that is simple, requiring few premises to be believed, and one that is complex, requiring numerous premises to be believed. Perhaps, for example, we don’t have evidence of aliens on the dark side of the moon, not because they have invisible powers and don’t want to be found, or because they’re hiding underground, but because the simplest explanation is best: they’re not there.

Occam’s razor vs. the premise-beggar’s razor. In place of Occam’s razor, the committed writer may be tempted to reach for what we might coin the question-beggar’s razor or the premise-beggar’s razor: you won’t stop adding premises to your argument until it has arrived at the destination you desire, and once your premises reach that destination, your reasoning comes to a dead stop. Any challenge is answered by an ad hoc premise that returns you back to your destination: the desired conclusion. In essence, whether one is willing to follow your argument to the location you take it to, and stop questioning at the place you do—accepting your list of premises, full stop—is a test of group loyalty. Are you one of us—the believers, the committed—or one of them—the unbelievers, the uncommitted–dissatisfied with where the group’s use of premises and justifications stops?

Pessimism and the committed writer. In the persona of the committed writer, you may come across as pessimistic, apocalyptic, or conspiratorial about your group’s relation to the world—or you may be quite optimistic, imagining that a breakthrough or victory for your side is just around the corner. So when you’re in this persona, you might find yourself making a very conscious choice about whether to project optimism or pessimism to your audience. In other words, rather than apportioning your optimism or pessimism to the evidence–being neither more nor less optimistic or pessimistic than the situation warrants–your stance becomes a performative choice of rhetorical style.

Thus if you opt for pessimism, you’re taking a risk both psychologically and rhetorically, as you may come across as defensive, worried, alienated, isolated, or fearful. These are states of mind stressful for both yourself and your audience. And if the pessimism is extreme, you may turn cranky, and so you may come across, even if you don’t mean to, as someone who is cruel, sadistic, and cynical toward out-groups and outsiders; an angry and aggressive person prepared to torment, ridicule, exclude, and demonize the non-committed. You may be perceived as someone who is prepared to narrow group affiliation in the name of group and ideological purity. In a pessimistic mode, you may also come across as vindictive, rigid, and inflexible.

If you are a committed writer in a pessimistic mode, you may also regard an open attitude toward the world as foolish, and opt instead for defensiveness. If you choose this way of being in the world, you’re akin to a soldier; a sword fighter, say, who is always “on guard.” But in being intellectually and emotionally armored, you may also prove impervious to reason; a person who suffers from epistemic closure (you’re not searching for the truth; instead, you know). Indeed, your loyalty may not actually be to the truth at all, but to an ideology, and your tribe may function as an idol that replaces the truth, as in “My tribe, right or wrong.”

This sort of rejection of objectivity may lead you into other traps of irrationality, such as being an all-or-nothing thinker, posing false dilemmas to your audience, and engaging in either-or reasoning. When you’re enthused in this manner, very little may fall into gray areas. You may see the world in largely black and white terms. You may also lack self-criticism or self-awareness (all criticism is directed in anger outward, toward “the Other”). And because you’re so emotionally invested in your position, you may also fall into confirmation bias, counting the hits for your pet theory, but never the misses, and never reading books or exposing yourself to media from the other side. You may also place an excess of reliance on your intuition, waving off reasoned and systematic deliberation, and you may over-rely in argumentation on poisoning the well of discourse with ad hominem (attacking the persons or groups that oppose you, making it about them personally, not the issues at hand). You may also prove an impatient grandstander, not really hearing others. Because, in your own mind, you already have the truth, you proclaim rather than dialogue. You’re not really listening. You may be quite proud to come across as the never-yielding combatant, but it may be coming from a place of shame and humiliation; of feeling alienated and outcast, akin to the lead character portrayed in Dostoevsky’s 19th century novella, Notes from Underground.

The committed writer can, of course, also be open and optimistic in ways that are unwarranted. Still, there is a more positive way to be a committed writer without falling into pessimism and emotional and epistemic closure, and that is via the route of optimism and openness. If you’re an optimistic true believer, you may come across as hopeful, enthusiastic, and happy; confident and, though non-ironic about your beliefs, open to scrutinizing them. You are not impatient, incurious, or rigid; you’re not uncaring for those outside of your group; you try to be self-aware when you are falling into cognitive dissonance; and you’re not impervious to reality testing. You recognize that life is rarely simple and change frequently difficult. You accept that choices often have trade-offs, and so you are more likely to be weighing competing goods in your commitments rather than casting them as good vs. evil.

Still, there are landmines here. Because optimism is wed to confidence, you may be tempted to pose as the confidence man (con-man). And as an optimistic thinker, you may be a wishful thinker who, when intellectually cornered, changes the subject, leaving yourself prone to cognitive dissonance, unrealistic goals, and looking only at the bright side of life. You may thus find yourself in the role of one who is naïve and innocent, akin to another of Dostoevsky’s protagonists, Prince Lev Nikolayevich Myshkin in The Idiot.

Empathy and the committed writer. Perhaps the most attractive version of the committed persona is the empathetic writer or speaker. But here, too, are critical thinking landmines. The person with vast stores of empathy may decide that focus on the pain of individuals is more important than some larger truth, concern, or goal. For example, the empathetic person may write and politically organize on behalf of forgiveness of college loan debt or a cure for cancer, focusing like a laser on the pain of individual students or patients under the burden of debt or ill health. But competing goods may not be wrestled with. In other words, the empathetic, committed writer may fail to address such issues as who in society will pay the taxes for relieving the pain of students in debt and the impoverished sick. Visceral images of the suffering of students or patients may take up all the oxygen of one’s concern, attention, and thought.

The empathetic person may also be in danger of seeing only the pain of those in his or her own tribe. In the name of empathy for his or her side’s dead soldiers or religious martyrs, for instance, he or she may demonize outsiders, supporting their slaughter in war. So in empathy, there can be a component of cognitive dissonance–and even cruelty or outright sadism (we care about our wounded and dead, not yours). To direct all of one’s thought, love, attention, and priority toward one thing necessarily means it is not directed at something else. In this sense, empathy is a narrowing of response to existence. It can render the particular visible after it has been unjustly invisible for too long, but it can also render a synoptic perspective on the whole invisible–and a synoptic perspective is a condition for coherent reasoning itself. “The truth is the whole” (Hegel).

Writing 1.13.1. Adopt in a piece of writing the tone and sensibilities of the committed writer writing to a college-educated audience. See if you can do this without falling into the intellectual mistakes committed writers frequently make.

Writing 1.13.2. Adopt in a piece of writing the tone and sensibilities of a committed writer who is highly empathetic, attuned to the suffering of individuals, writing to a college-educated audience.


A Mini-Course In Critical Thinking For Writers. Concept 1.14: Binomial Definition

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.14. Binomial Definition. In the ancient Greco-Roman world, species identification meant looking at a thing’s matter and form and locating it within a hierarchical classification system by asking, “What makes this similar to other things—and what makes it different?” This is known as binomial definition, and was first formulated by Aristotle. For example, in defining humans, Aristotle saw us as animals—but with a difference: we alone can reason. Hence his definition of the human was: rational animal. The definition has a genus (what makes us similar) and a species (what makes us different). Contemporary taxonomists still use Aristotle’s technique of binomial definition, but they apply it in a more fine-grained way: Homo (man) is the genus (type) to which we belong—the category we share with genetically nearby, but now extinct, cousins, such as Homo neanderthalensis (“Neanderthal man”), and our species (our subtype or difference—differentia in Latin) is sapiens, to be wise, echoing Aristotle’s “rational animal” definition. We are thus Homo sapiens—the wise animal.

Make your own binomial definitions. The lazy way to define a thing is to look it up in an online dictionary and quote it: “According to Webster’s Dictionary, a human is…” But that’s not very interesting, and if you do this, you pass up an opportunity to contemplate a word’s meaning for yourself. So let’s go back to Aristotle’s binomial definition of a human and think about it, seeing if we might generate other useful or interesting binomial definitions for what it means to be human.

Aristotle’s species component of a definition is sometimes referred to by the Latin word differentia. And so, in definition, we have a generalization about a thing accompanied by a differentia–an observation that marks a difference. But humans are unique in more ways than just being rational–so we could, for example, define ourselves by one of our other unique qualities, such as laughter: humans could thus be defined as the laughing animal.

But the reason laughing animal may not seem quite as good a definition of human as rational animal is that one appears more essential to being human than the other. Or, if that is too strong a claim, one might say that laughter is implied among the behaviors we would expect from a rational animal, for many things are ironic and absurd, and one would thus expect laughter from a rational animal that recognizes this. Laughing animal, it is arguable, can be more readily subsumed under the more fully encompassing term rational animal. And laughing animal also may not work well as a definition of human because there’s some evidence of laughter-like behaviors in other animals, thus weakening the differentia. In any case, in seeking a good definition for a word, we want to identify, not just its unique properties, but its most essential qualities for our purposes (and to foreground those). It’s important to emphasize for our purposes here, because that phrase relaxes the pressure to arrive at the single, objective, best definition for a word—which is really only arrived at in sentence context and authorial purpose.

Purpose in definition matters. So nothing immediately above should be taken as nixing laughing animal as a definition of the human, for if one is discussing, for instance, comedy, it may prove useful. It’s important to note that because our purposes in definition may vary, our definitions may therefore vary. Definition depends on what we mean to distinguish and make important. Thus, in the Nicomachean Ethics (1.13), Aristotle emphasized the capacity for reason as what distinguishes humans from animals. But Aristotle’s own definition shifted when he wrote in another context. When his concern was reflection on social behavior, he wrote in his Politics that “man is by nature a political animal.” That is, of all the capacities that animals have, only humans have the capacity for sophisticated politics. In this context, he singled out as our most essential differentia a high-level behavioral trait—our capacity and desire for socially navigating collective life in the polis (the city) as opposed to the oikos (the home).

So when we take it upon ourselves to define something, our individual purposes matter. There’s no one right way to define a thing. There are only insights as to how a word might be used or thought about in a particular context. In defining something, for instance, we may be shooting for an effect that is quite surprising, or we might say something a bit less obvious and concise—or even a tad more elaborate, such as the following: humans belong to the small group of self-aware social mammals that includes chimps and dolphins.

In the above definition, the genus is narrower than Aristotle’s: humans, rather than being broadly located within the hierarchy of living things, and most specifically within the kingdom of animals, are instead designated within the small group of mammals that are self-aware and social. So the genus in this definition of what humans are is: self-aware and social mammals. But wait. Where’s the differentia? There isn’t one. There’s no differentia attached to this definition yet.

So let’s add one. What would make for a good differentia here? What distinguishes, in an essential manner, humans from other self-aware social mammals? In answer to this, we might say the following: humans are uniquely characterized by their capacities to reason, speak, and extend their influence and control over their environments via tools. So this brings us to a pretty good definition for what it means to be human: humans are self-aware, social mammals generally possessing the ability to reason, speak, and use complex tools.

Now we’ve got a pretty interesting binomial definition that can get us places (open up avenues for reflection and discussion). But what if we preferred not to define ourselves in relation to animals? There are, after all, other relations or hierarchies that we might wish to place humans in, and to do so would bring us to other definitions of what it means to be human. This is important to notice, for it reminds us that binomial definition is always relational and set into some broader conceptual hierarchy of our choosing.

Gods, angels, and aliens? Instead of in relation to animals, we may wish to define what it means to be human within the hierarchy of conceivably conscious beings (gods, angels, aliens, etc.), in which case we might arrive at an answer to the genus question in which we share key characteristics, not with gods or angels (who are, presumably, immortal and free of bodies and materiality), but with aliens: humans belong to the genus (type, group) of conscious beings that are carbon-based, solar system dependent, limited in knowledge, prone to error, and mortal. Unless aliens are quite far in advance of us, most conscious life forms beyond Earth are likely to share these characteristics with us. Hence the saying, “To err is human,” is also almost certainly true of many aliens (“To err is alien”). What makes us different is that we are on Earth, and so we might reach, after thinking about it some more, a full genus-differentia definition something like the following: Humans are Earth-bound and body-limited conscious mammals.

In the conceptual hierarchy of conceivable, conscious beings, the above definition distinguishes humans from gods (who are not Earth-bound or body-limited) and aliens (who are not of this Earth and have not evolved as mammals on our planet). In a pinch, we might make a genus-differentia definition that is really compact: Humans are conscious mortals. Or: Humans are conscious earthlings. But, really, this is inadequate because now we are being tapped on the shoulder by the chimps and dolphins (who are also quite self-aware and live on Earth). So we might try again: Humans are conscious, rational, speech- and sophisticated-tool-producing, mortal earthlings.

In relation to the gods and aliens, our mortality and Earth-boundedness come to the fore of definition; in relation to other social animals, our rational, linguistic, and tool-using attributes come to the fore. The definition is not boiled down to a two-word binomial definition, but it is in the spirit of binomial definition, locating humans in relation to other things in terms of similarities and differences. So all this chasing after what a human is really reminds us that binomial definition is a way of arguing with yourself and others about what’s important. And notice that this definition of what a binomial definition is—a way of arguing with yourself and others about what’s important—also has a genus and a differentia! The genus is “a way of arguing with yourself and others,” and the differentia is “about what’s important.” But, of course, this definition needs more thinking about and work (which tends to be true of everything we value). The point is that, if you’re a thinker and writer, definition isn’t something you outsource to a dictionary; it’s something you do. There’s no one way of thinking about a word; there are only contexts for making distinctions. Thinking about binomial definition can help you navigate your way to a concise and efficient definition of a word that serves your purposes and communicates to readers what you take to be important—perhaps even most important. It helps you focus and think clearly and precisely.

Writing 1.14.1. Think about a single word. What does it mean, really? Any word. Love, aging, sleep. What is it? Once you’ve got some thoughts going about the similarities and differences it has with other things or concepts, and the things you might group it together with, work your way to a binomial, genus-species (type-subtype; categorization-distinction) definition that is as precise, compact, and interesting as you can make it (without losing the essence of your definition). Keep distilling the definition down to as few words as possible until only what you regard as most important about it remains. Perhaps also then provide an example: “So I would include w as part of the larger category, x, possessing the differentia y, and here is a specimen (z, as an example) [Aristotelian genus, species, specimen]…”


A Mini-Course In Critical Thinking For Writers. Concept 1.12: The Spell-Casting Mystifier

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.12. The spell-casting mystifier. Historical figures characteristic of this writerly and speaking persona (presentational mask) are Joseph Goebbels, Adolf Hitler’s Minister of Propaganda, and Edward Bernays, deemed the intellectual “father of public relations” in the United States. This persona is a walking, talking, propaganda-disseminating fog machine; the chief nemesis of the science-oriented critical thinker, and arguably, of social hope. Why in opposition to social hope? Because the spell-casting mystifier makes honest, vulnerable, and rational debate difficult—even exasperating—glibly shifting attention away from critical thinking to psychological and rhetorical manipulation, gossip, and emotion-driven memes (catch-phrases) and images. To be drawn into rhetorical battle with such a person on his or her own turf is to become him or her—and yet little can get accomplished in the contemporary world without Jacob-wrestling with this persona.

Like the critical thinker, the spell-casting mystifier is found everywhere in global culture, in forms both subtle and extravagant (in advertising, business, politics, religion, law, broadcasting, and lobbying). Perhaps armed with an Ivy League law degree, the spell-casting mystifier may be the hired gun of a public relations firm. Or perhaps he or she is a salesperson, cult leader, or surrogate to a politician. Spell-casting mystifiers are not concerned with truth so much as with using every rhetorical trick in the book to get people to think, feel, or act in the way they want—or the way their clients want. Spell-casting mystifiers generate fog not just around issues, but around themselves. In other words, they may act like the fog: smoke-like, ghostly, evasive. If you spot them in public and become too inquisitive, they may duck behind a door or curtain, Wizard of Oz-like.

Shakespeare’s Iago, Franz Kafka, and the spell-casting mystifier. In literature, the spell-casting mystifier is captured by William Shakespeare in the character of Iago in Othello, who says enigmatically, “I am not what I am.” Iago is the cynical soul-stealer of the play, opaque himself in motive, whispering metaphorical poison into the ears of the innocent, driving them without remorse, like a tobacco industry executive, to emotions and behaviors that lead to mayhem and death.

The spell-casting mystifier’s ideal environment is represented in Franz Kafka’s fiction, most specifically his novels The Castle and The Trial, where the honest and rational person is thwarted at every turn, and responsibility for the existing and seemingly all-pervasive system of things cannot be assigned, but is dispersed to a maze-like and inhuman bureaucracy that supports the ongoing production of opaqueness.

Miracle, mystery, and authority. The spell-casting mystifier is also captured in literature by the Russian novelist Fyodor Dostoevsky in The Brothers Karamazov. Dostoevsky embeds within his novel a fanciful short story that he situates in sixteenth-century Spain. In the story, Jesus has returned to Earth and is on fresh trial, this time before the Grand Inquisitor, the supreme judge presiding over the Spanish Inquisition. This was a time in history of widespread arrest, persecution, exile, torture, and murder of religious minorities in Catholic-majority Spain. It is here that Dostoevsky imaginatively makes Jesus subject to a lecture from the Grand Inquisitor as to what people really want. The masses, the Grand Inquisitor informs Jesus as he stands at trial, don’t want inconvenient truths or freedom; they want bread; they want miracle, mystery, and authority—“And I give that to them.” The path of truth and freedom—the path the Inquisitor says that Jesus offers—is hard, and not what people really want. They want to not think. Jesus’s way, says the Grand Inquisitor, is cruel to weakling humanity. Better to give the majority of people lies and delusions to balm their short and tormented lives; better to give them strong and confident headship surrounded by pomp and unshaking beliefs—and not to leave them vulnerable to their own weak thought.

So though the Grand Inquisitor is in the position of command as supreme judge in Dostoevsky’s tale, and the consistently truthful and honest person who ought to be ruling is the one standing silent at trial, listening without interruption to a hectoring lecture, it is nevertheless the Grand Inquisitor—the supreme judge himself—who has broken the law of the soul with his fellow humans. He has done this by knowingly presiding over a regime whose foundation consists of falsehoods and that persists because of the dissemination of falsehoods.

The story ends with Jesus responding to the Grand Inquisitor’s speech, not with words, but with a Judas-like kiss to his cheek. Dishonesty, the kiss suggests, severs the implicit social contract between people at the deepest level of the soul, poisoning the heart’s ability to be vulnerable and trusting. Dishonesty—the faking of reality—is a betrayal of mind, heart, and genuine community. This is why the poet Dante, in his Inferno, reserved the deepest circle of hell for the treacherous, placing Judas Iscariot in the jaws of the Great Deceiver himself, Satan, who is at the very center of hell, frozen in ice (a symbol of the unfeeling heart).

Willfulness and the Dark Triad. As illustrated by Kafka’s disorienting fiction, Shakespeare’s Iago, Dostoevsky’s Grand Inquisitor, and Judas Iscariot in Dante’s Inferno, the spell-casting mystifier is not a self-deluded person, but a willful person. He knows exactly what he’s doing—and he knows exactly what sort of system of things makes it possible for him to do it. Not recoiling, he wants to support the powerful and report for duty, but perhaps only to the highest bidder. Like Johann Goethe’s Faust, he will sell his soul to Mephistopheles.

So if you say, “Sign me up!”–adopting the writerly persona of the spell-casting mystifier–you know exactly how and why you’re distorting the truth. As a spell-casting mystifier, you may partake in what is known as the Dark Triad of personality traits (high in narcissism, sociopathy, and Machiavellianism). To be narcissistic is to be focused on the self; to be sociopathic suggests a lack of emotional states appropriate to situations; and to be Machiavellian is to be cynical, nihilistic, conniving, and crafty. So if you’re a spell-casting mystifier, you may delude others with perhaps few or no twinges of conscience, all the while without yourself ever being deluded as to what you’re actually doing as a rhetorical hired gun for Big Propaganda (the message management industry working for all-comers—Big Oil, Big Tobacco, nation states, authoritarian dictators, scandal entangled CEOs, etc.).

Rhetorical moves of the spell-casting mystifier. If you’re in the persona of the spell-casting mystifier, you’re highly selective in your presentation of evidence, subtly encouraging confirmation bias: you want your audience to attend only to the evidential hits, not the misses, in whatever pet theory or spidery narrative you’re spooling out to them. Your end is to guide them to your conclusion, not the truth. You’re prepared to lie. You send people down rabbit holes. If you display fairness or curiosity about the actual truth at all, it is for show—or you may even be quite brazen, displaying outright incuriosity and unfairness, betting that your audience is also incurious and unfair, wanting what you’re selling, and simply waiting on you to provide a reason–any reason–to quell their lingering doubts.

And so you are theatrical. Perhaps not given over to political or religious passions yourself, you are nevertheless content to be a cipher for other people’s passions—especially their magical thinking, lusts, and resentments—and you’re prepared to give them exactly what they want: a show; a ritual enactment of their longing or anger. You shrug off reality testing, or make only pro-forma gestures toward it, going through the motions of pantomiming fairness, all the while sending signals that you’re to be trusted for no other reason than that you’re a member of the in-tribe; one of them.

So as a spell-casting mystifier, you’re also a confidence man: you act like you’ve thought of everything, have everything under control, and are not innocent, but experienced. You are someone people think they can follow with trust, outsourcing their thought to the carnival of your charisma, charm, issue-framing, and storytelling. You promise fast results—and with minimal cost. You don’t acknowledge the dilemmas of competing goods. You seduce and oversimplify. 100% confidence is attractive, enthusiasm is attractive, and getting things done cleanly and fast is attractive—and so you are attractive.

Demonizing, emotionally blackmailing, intimidating, and tormenting others. For those in your audience who waver, you’re prepared to engage in Stockholm Syndrome-like rhetorical techniques, where you make it clear to the vulnerable that you can hurt them as well as help them. The Stockholm Syndrome is getting love and threat from the same source. So if you’re doing that to people in your sales pitch—whatever you’re selling—putting time urgency, for instance, on them to buy now!, then you’re an emotional blackmailer. You take hostages. Posing false either-or dilemmas, you promise extravagant rewards for following you or doing what you want—and suggest terrible, terrible consequences for those who balk. You’re also prepared to feign outrage, humiliate, shun, gossip, and engage in ad hominem (attacking not the argument of one’s opposition, but the opponent herself; making it about the person, not the issues at hand, demonizing her). The spell-casting mystifier might thus speak or write like this: The solutions to our problems are simple—if they will just get out of the way. They are all darkness, we are light, and I have a final solution for getting them out of the way.

Writing 1.12.1. Locate a piece of writing that is in the persona of the spell-casting mystifier, and analyze and deconstruct it. How is it functioning rhetorically? How might one in the writerly persona of the critical thinker respond?


A Mini-Course In Critical Thinking For Writers. Concept 1.11: The Writerly Persona Of The Critical Thinker

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter one. To have a look at other parts of the book, click here.

Concept 1.11. The writerly persona of the critical thinker. In literature, this persona (mask for presentation) is found in detective fiction in the character of the probing investigator, and in film it is captured by the scientist played by Jodie Foster in Contact, which was based on a novel by the famed Cornell astronomer Carl Sagan. Albert Einstein projected this persona, as does Martha Nussbaum, the well-known humanities polymath at the University of Chicago. You can find this persona projected everywhere in global culture, not just in academia, but in newspapers like The New York Times and magazines like The Atlantic and Skeptic. Wherever you encounter anyone writing in an open, vulnerable, probing, and self-critical way, you’re encountering the persona of the critical thinker. This latter trait—being self-critical—especially distinguishes the critical thinker from the non-critical thinker, as doubt and suspicion go, not just outward, but back upon oneself (“Am I quite sure I’m seeing things right? Am I fooling myself? Am I badly motivated? How do I know?”). If you’re writing in the persona of a critical thinker, you’re projecting an honest effort at capturing the truth of a matter, or reflecting truthfully and subtly on something complex, following wherever your search for truth, the good, or beauty leads you. There’s a universal quality to this voice, suggesting a wide circle of affiliation, incorporating any literate, fair-minded person of goodwill, regardless of nationality, gender, religion, or irreligion. With good reason, this is the voice of ideal college discourse, of civil international relations, and of science.

Rhetorical moves of the critical thinker. You may find yourself writing in the third person when you’re in the persona of the critical thinker, not necessarily avoiding the I or we voice, but perhaps not especially emphasizing it. Your focus is generally outward, pointing away from self or group to the matter at hand. Of course, in this persona you’re also seeking to come across as smart and thoughtful, which suggests that you’re very likely to do the following:

(1) use a broader than average range of vocabulary

(2) deploy (show-off would be too strong a word) intellectual and cultural literacy

(3) display marked accuracy and precision of speech

(4) vary sentence length and rhythm in your writing

Thus it is that you’re the inquisitive seeker who notices interesting things in interesting ways, and can write about them with fairness, sophistication, and style. You point and make observations, and have the capacity for sustained focus and attention—and you show it. You linger. You’re calm. You might also be mischievous: the friendly provocateur of the groups to which you belong, encouraging them to be self-critical and to challenge their own assumptions; assumptions that the writer and reader may share, but which you openly bring into scrutiny.

So to be the critical thinker in the room is to be the intellectually unpredictable person; the person prepared to look at the world from different angles, and not just the angles that are familiar to your group. You display independence. As a seeker of truth, knowledge, growth, and understanding, you’re also more interested in dialogue than proclamation—most especially dialogue between those who fundamentally disagree, calling to mind the philosopher Baruch Spinoza: “Non ridere, non lugere, neque detestari, sed intelligere” (“Not to laugh, not to lament, not to curse, but to understand”).

The critical thinker and science. The persona of the critical thinker is essentially the persona of the ideal scientist, meaning the critical thinker tempers careful thought and plausible theorizing, wherever possible, with reality testing. As a critical thinker, you never bring Galileo’s telescope down from the heavens. Always looking, you’re ever ready to adjust beliefs in response to new data. This means that you move readily and willingly from the realms of claim and theory into those of support, evidence, and, most importantly, disconfirmation. You’re hard-nosed, not just in evaluating others’ claims and theories, but in evaluating your own. (Again, you are not in the persona of the ideal critical thinker if your skepticism goes only outward, and never back upon yourself and your own beliefs.) You want to be proven wrong—or if that is too strong, you are at least open to being proven wrong, and you’re active, not passive, in searching out what might be wrong (where you might be fooling yourself).

The critical thinker’s audience. In the persona of the critical thinker, you’re not asking for faith from your readers, but for them to gauge evidence, logic, plausibilities, and probabilities with you. You want them to apportion their beliefs to the evidence you present, not believing things in excess of the evidence. You want your readers to read you critically and ironically—as you yourself are also an ironist (one who can stand back from a scene, not lost wholly in passionate advocacy). As a doubter, skeptic, and seeker, you want doubters, skeptics, and seekers for readers, keeping you on your toes.

The critical thinker and doubt. In the persona of the critical thinker, doubt is a virtue, not something to be tamped down. So you doubt, make distinctions, and want distinctions made. You’re not content with the easy answer, and as with any science-oriented person, you speak with measure (more in probabilities than in certainties). You’re nuanced, patient with ambiguity, and willing to live with uncertainty and complication. You don’t sweep difficult choices between competing goods and competing data points under the rug. If you don’t know, you say you don’t know. You don’t pretend to know what you don’t.

The temptation of contempt. There are more than a few landmines in deploying the persona of the critical thinker in discourse, especially in mixed company (i.e., critical thinkers in the midst of non-critical thinkers). Even if your social skills are excellent, concern for truth may put you in frequent conflict with the expectations of your broader social group, placing you in tension with others. You may come across as a cynical person, or as one insufficiently committed to the beliefs and projects of the tribe to which you belong (not a team player). You may appear too frequently impatient and angry with what the philosopher Immanuel Kant called “the crooked timber of humanity.”

Bored with the majority of people’s naïve enthusiasms, and forgetting Spinoza, you may be persistently tempted to introduce ridicule and gnawing worry into discourse. You may also come to feel outright contempt for the simpler perceptions of others, leaving, say, religious fundamentalists and surface-loving extroverts out of your circle of full understanding and empathy. You may therefore grow weary of being so frequently critical and conflicted, and find yourself on the outs with others: alone in your opinions, often left alone with them, not understood, and perhaps with few who want to understand you. You may even be feared.

Being baited into defensiveness. Because you’re critical and intellectual, and not a conformist, you may be seen by some as snobbish and elitist—and unethical debaters may send these accusations your way regardless of the evidence, putting you on the defensive. Unfairness of this sort may lure you away from the critical thinker’s natural advantage—your vulnerability and openness toward the world—and into a defensive posture of armoring, cunning, and aggressive counter-punching. Ironically, these tactics could end up alienating others from you further still. So you’re in a dilemma. If you’ve got the right methods for getting at the truth—and critical thinkers think they do—you nevertheless have to make hard choices, moment by moment, about when to shout rhetorically, adopting the methods of opponents, and when to trust that the truth will out and reality will assert itself.

So you may need to check rigidity and impulses to dominance in yourself if you adopt this persona as a habit, and be self-aware, in the heat of intellectual combat, as to when you’re being sadistic toward others, or—ironically—not thinking critically about your own social and rhetorical situation. Critical thinkers obviously tend to be good at slicing and dicing opponents, but you may not know when to quit. You may prove inflexible in contexts where social circumstances might dictate a lighter touch. So if you misjudge your rhetorical situation, you may find yourself firmly in the public jaws of your chief opponent in intellectual combat, the spell-casting mystifier.

Writing 1.11.1. Write on a topic of your choosing, adopting the tone and sensibilities of a critical thinker writing to a college-educated audience.

__________

A Mini-Course In Rhetoric For Writers. Concept 2.5: Genre

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of chapter two–or the whole of chapter one (Concepts 1.1 – 1.10), click here.

Concept 2.5. Genre. Let’s imagine you arrive by research and reflection at a working thesis you take to be genuinely the best among those on offer or that occur to you—It’s fair to call Rome’s destruction of Carthage a genocide—and you decide that your smartest rhetorical move in this context is to put your thesis statement squarely upfront in your essay. You might thus announce that you mean to rehearse how, exactly, you arrived at your opinion:

By numerous criteria, Rome’s destruction of Carthage was a genocide.

Or you may wish to simplify your thesis even further:

Rome’s destruction of Carthage amounted to genocide.

But wait. Notice that these two sentences imply different genres (forms) of writing. The second thesis is not really a simplification of the first, but a very different thesis–a different purpose in writing–altogether. If you pick the first of the two thesis statements, you’re essentially boxing yourself in. You’re announcing that your genre of writing will be that of the evaluation essay (you’ll be evaluating a proposition by criteria, and from these you’ll render a judgment). But if you pick the second thesis statement, you’re implying that your genre may include evaluative criteria, though these need not be fleshed out. You may be signaling instead that you’ll be emphasizing argument in general, and so your essay might be classified, genre-wise, in a catch-all fashion as an expository essay (an essay that incorporates both research and argument)—or simply an argumentative essay.

In this sense, the argumentative and evaluation essays are subgenres of the expository essay, for most argumentation sooner or later points to evidence and research. By opting for the second thesis—“Rome’s destruction of Carthage amounted to genocide”—you’re leaving yourself a lot of wiggle room for working with a broad range of arguments. Your thesis statement sets up an expectation in your reader that you will offer implicit and explicit reasons for your judgment concerning the nature of ancient Carthage’s destruction, but perhaps without an explicit explanation as to why those particular reasons themselves constitute the best criteria for drawing such a judgment.

Many genres can be deployed for tackling a single topic. But let’s say you don’t like either of these thesis statements. Perhaps rehearsing your criteria or reasons for arriving at a conclusion feels boring to you. You like your topic, but you don’t want to write an evaluation or general argument essay about it, justifying your opinion. Absent abandoning the topic altogether, what do you do?

You can pick a different genre for writing.

You might, for example, accept without argument that ancient Carthage’s destruction was indeed a genocide, trust that your audience already agrees with you (is already at yes with you), and then write an essay tracing what you take to be the causes that led up to Carthage’s destruction, and the subsequent effects it has had on the history of northern Africa. Such an essay might be described, in terms of genre, as a process essay or cause-effect essay (“this important thing happened, then this happened, and here’s why it happened, with impacts that reverberate to this day…”). That is, you’re walking your reader through a linked process, narrating and explaining how one gets from point A to point B. Such an essay becomes a type of story-telling, where choice of emphasis implies your opinion as to what is most important and worthy of attention.

But let’s say you don’t like the process essay as your genre choice either. Now what?

Well, you may wish to write a definitional essay: in light of what happened at Carthage, what is genocide, exactly? Or perhaps you decide, after additional reflection, that the most interesting aspect of the topic for you is not in definition, but in just writing a straightforward comparison and contrast essay, thinking about Carthage in the light of, say, the Holocaust, and so you write out the following thesis statement:

Like the Nazi genocide of Jews in the twentieth century, Rome’s destruction of Carthage also deserves to be considered a genocide, and thus comparisons and contrasts with the Holocaust illuminate it.

Now, you think, you’re cooking with gas (to echo the blind man in Raymond Carver’s short story, “Cathedral”). It’s something worth arguing about; you’re happy with your topic; you’re happy with the genre for discussing the topic; and you’ve got an explicit thesis statement around which to organize your thoughts.

The take-away here is: whatever thesis statement you settle upon, it will be implicitly accompanied by a genre—and you should make that genre explicit to yourself by asking the following: By posing this thesis statement, what genre am I actually going to be writing in?

Specific genres may constitute whole essays in themselves, or be deployed on a limited basis, as a few sentences or paragraphs within another genre (an argumentative essay that includes a paragraph of comparison, some narrative recounting, and paragraphs summarizing research). Typical genres of college essays–and genres and subgenres to be found in sentences or paragraphs within college essays–include things like the following:

reporting (summary of facts or events)

research (posing empirical questions–and trying to answer them)

narrating experiment and experience (telling a story, etc.)

argument (taking a position and supporting it, pointing to things like testimony, logic, and evidence)

evaluative and valuation claims (what’s true, good, or beautiful by some criteria, implicit or explicit)

grayscale evaluations (making arguments about what’s probable; what’s likely)

ethical and political appeals; appeals to action

placing or deflecting blame

aesthetic claims (making claims concerning art or what’s beautiful)

pointing to converging lines of evidence

offering examples

definition

comparison and contrast

cause and effect

describing

appealing to logic

appealing to common sense

theorizing, exploring possibilities, speculating

exploring the sublunary (beneath-the-moon observations; wrestling with aporias–impasses; things that appear uncertain)

engaging in framing gestures

centering and decentering (bringing something regarded as marginal to the center of attention, etc.)

Such a list can be multiplied, but here’s the take-away: it can be helpful for writers to make explicit to themselves the genre or sub-genre of writing that they’re deploying, either in the broadest sense (the genre of the essay as a whole) or in a narrower sense (the genre of whatever sentence or paragraph is being worked on in the moment).

After genre choice, ask what you want to qualify and/or clarify in your thesis statement. After writing a thesis statement on Carthage, and deciding to place it toward the beginning of your essay, it might occur to you, on looking again, that your readers could have an initial objection, arresting further or sympathetic reading of the rest of the essay. Is it really reasonable, for instance, to compare the Holocaust, a historically recent, continent-wide phenomenon, to the destruction of a single, ancient city? Doesn’t the comparison amount to a faulty analogy? Anticipating this objection, you might decide your thesis statement is in need of a qualifier or clarification that signals yes, you too have thought of this objection, and yet you still affirm your thesis, and hope the reader will stay with you as you make your case:

Though the Nazi genocide of six million Jews spanned a continent, and the Roman killing of 150,000 ancient Carthaginians focused on just a single city (a community of perhaps 200,000-400,000 people at the time), the destruction of Carthage can nevertheless still be illuminated by comparisons and contrasts with the Holocaust, and deserves itself to be considered a genocide.

That’s a clear, genre- and audience-attentive statement of a writer’s purpose. Notice that you’re doing your best to keep yourself and your readers on the same page, saying yes to your propositions. Notice further how your thesis statement is getting ever more precise as you reflect, write, and revise. Also notice how your thesis statement has evolved in such a way as to set up an expectation for your readers that your essay will emphasize—and perhaps even justify, at some point—the idea that the number of victims at Carthage is not the deciding or only factor in deploying the term genocide, and that the judgment is better conditioned on other, relevant criteria. One of the things determining your success in a piece of writing is whether it delivers on its implied promises.

A good thesis statement navigates skillfully between an excess of confidence and an excess of caution. Even with the acknowledgment or tackling of an initial objection—something you and your audience can reach yes regarding right upfront—you may feel that your thesis statement is still too strongly worded, and thus you may decide to bring your confidence down a notch before proceeding:

Though the Nazi genocide of six million Jews spanned a continent, and the Roman killing of 150,000 ancient Carthaginians focused on just a single city (a community of perhaps 200,000-400,000 people at the time), the destruction of Carthage can nevertheless still be illuminated by comparisons and contrasts with the Holocaust, and perhaps deserves itself to be considered a genocide.

Inserting that perhaps into your thesis, however, is a gamble on your audience’s tolerance for ambiguity. Are you now being too squishy? If you’re writing for college, for instance, professors are generally open to expressions of hesitation and doubt, but your particular professor may read the perhaps in your thesis as student-level insecurity grounded in a lack of self-confidence (even though that’s not your intent). It may also be read as an attempt to wiggle out of a commitment. The tension here, again, is in to whom you are writing and to what end–and what risks in writing you are prepared to run. Your purpose is clear—you mean to use the Holocaust to cast light on the destruction of Carthage, and to implicitly make a case for the fairness of calling the destruction of the ancient city a genocide—but as you proceed, you don’t want to oversell your level of confidence surrounding the issue. Maybe you have genuine doubts, or perhaps you feel the issue is undecidable. Yet that very last part of your gesture—though noble in its caution and humility—may not be rewarded by a professor who prefers displays of confidence. What works for one audience does not guarantee transfer to another audience.

Writing 2.5.1. Look again at the Carthage thesis. Imagine yourself as a professor. Would you keep the perhaps in the thesis—or advise the student to remove it? Or perhaps you would leave it to the student to decide, letting him or her evaluate the essay as a whole before deciding to keep it or drop it. Explain your rationale for whatever advice you would offer to the student.

Writing 2.5.2. Evaluate an essay in light of whether it announces its purpose clearly and meets its implied promises.

Writing 2.5.3. Think of a topic on which you have a strong opinion. Imagine you are going to write an essay on the topic, then write out a thesis statement regarding it. Narrow the thesis statement to something manageable for driving, say, a ten-page essay. Then qualify that thesis statement, honing it into a form of ever greater precision and clarity.

Writing 2.5.4. Look at a variety of nonfiction essays or books, reading their opening paragraphs, asking, “What genre does this writing seem to be in?” Would you call the genre expository (incorporating research and argument), evaluative (having implicit or explicit criteria for judgment), argumentative, process-oriented, definitional, comparative, or something else?

Writing 2.5.6. Search in a library or database for a short expository essay—an essay that incorporates both research and argument—and read it. (Most essays are expository at some level.) Make explicit to yourself where the research likely happened for the author, and where the arguments in the essay are. What research did the author conduct to bring about this particular piece of writing? Did the author perhaps locate an article in a database, then comment on it? Did the author do any original research or have an experience that she recounts?

__________

A Mini-Course In Rhetoric For Writers. Concept 2.4: Thesis

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of chapter two–or the whole of chapter one (Concepts 1.1 – 1.10), click here.

Concept 2.4. Thesis. The thesis is your main point; your ultimate yes. It’s the reason for a performance of rhetoric. Imagine, for instance, you believe marriage would be a good idea for your partner and yourself, and that you would be happy together “till death do you part.” That’s a thesis that you might have in your head–but it’s not necessarily the way you might propose it to your audience (your partner). You might say it more poetically, or you might simply pose it as a question: “Will you marry me?” Perhaps the actual sentence that you use to broach the subject ends up being: “I think marriage would be a good idea for us.” This way of saying it becomes your thesis statement. The thesis statement is your main point–your thesis–made clear. So your thesis and your thesis statement are not exactly the same things. If the thesis is your main point, it is the thesis statement that makes it explicit in carefully chosen words. A strong thesis statement seeks precision of expression, ideally leaving no room for a decoding error on the part of the hearer or reader. By contrast, a weak thesis statement is one that is vague or unclear.

What precedes and follows the thesis statement are other sentences–sentences that support the thesis statement. The thesis statement thus functions like a North Star around which all the other stars–all the other sentences–revolve. Sentences either lead up to or follow from the thesis statement, and if you say yes to each one of them along the way (or, at least, most of them), all those collective yeses should converge on an overarching conclusion: the thesis itself. The thesis is the thing you’re trying to get your audience to reach a final yes on.

The sentences that are not the thesis statement are akin to the accouterments to the performance of a marriage proposal: the flowers, the person on one knee, and the open ring box support and lead up to The Big Question (which you have an opinion on): marriage. So the thesis is your idea that marriage would be a good thing for your prospective partner and you; the thesis statement (“I think marriage would be a good idea for us”) makes explicit the point of putting on a rhetorical performance in the first place; and the other sentences are present to support your main point.

As a rule of thumb, your thesis statement belongs toward the beginning of your essay. If you don’t have a very definite main point—if your rhetorical flowers, gestures, and rings, as it were, don’t add up to anything—then you don’t really have a thesis, and so you risk losing those readers who might be looking for one. This is readily solved by putting your thesis near the beginning of your essay (within the first or second page). Otherwise, if you are being digressive, and not writing to the point, you might in fact deprive your actual, perhaps impatient, reader—say, a professor—of the hook that tells her where, exactly, your piece of writing is going, and whether it is worth reading beyond the first page or two.

A research question is not a thesis statement. So observe the following important distinction: you may have a piece of writing that, though it has a goal (to explore a question), does not have a thesis. It may, for instance, not take an interesting or novel position on a matter worth arguing over—worth getting to yes over. Instead of a thesis statement, what you may have is a research question: “Is it fair to call Rome’s destruction of Carthage a genocide?” This is a perfectly reasonable opening question for an essay or book. But again, it’s not a thesis or thesis statement. From this question, your reader, if she has the patience for your withholding of judgment, has a sense that you’ll be exploring the nuances of the question, presumably from different angles and perspectives (reflections on expert opinion; comparisons with contemporary acts of genocide; how one defines genocide, exactly, etc.), guiding her through some of the terrain of the question. Perhaps, at the end, the reader will expect, given your appraisals along the way, that you will actually provide an overarching opinion—a culminating judgment—with which she will finally concur (say yes) or from which she will demur (say no).

If you offer that culminating judgment at the end of your essay, then this is your thesis statement functioning as a conclusion. If the terrain is sufficiently complex or intractable at some level, this rhetorical move of holding off the thesis to the conclusion may be appreciated. And as with a proposal of marriage, there’s nothing inherently wrong in a build-up to an overwhelming conclusion. There’s no law set in stone, for instance, that says a thesis statement should be at the end of the first paragraph of a piece of writing. But as a rhetorical performance, it’s a risk to trust that your audience will not grow impatient with a delayed thesis statement.

So catch another important distinction here: the research question (“Is it fair to call Rome’s destruction of Carthage a genocide?”), located toward the beginning of a piece of writing, coaxes the reader to some culminating thesis (presumably arrived at toward the end of the essay), while the thesis statement hopes that the reader, made privy to the writer’s main point or conclusion straight-off, will stay for the details: “It’s fair to call Rome’s destruction of Carthage a genocide.” Is it more or less intuitively satisfying to get this thesis at the beginning of a piece of writing or toward the end? That’s a decision grounded in your purpose, a correct reading of your audience, and—if the writing is being done for a college course—the careful reading of a particular professor’s prompt.

A research question can be blended with a thesis statement. Book authors sometimes signal their thesis statement—which can be more than a sentence, and sometimes even extends to a full paragraph—with the phrase, This book…. Philosopher Susan Neiman uses this formulation in her widely acclaimed and influential Evil in Modern Thought (Princeton 2002), placing her thesis statement on page two of a book that is over three hundred pages long:

This book traces changes that have occurred in our understanding of the self and its place in the world from the early Enlightenment to the late twentieth century. Taking the intellectual reactions to Lisbon [the Lisbon earthquake of 1755] and Auschwitz [the Nazi-run death camp at Auschwitz, 1940-1945] as central poles of inquiry is a way of locating the beginning and end of the modern [era]. Focusing on points of doubt and crisis allows us to examine our guiding assumptions by examining what challenges them at points where they break down: what threatens our sense of the world? That focus also underlies one of this book’s central claims: the problem of evil is the guiding force of modern thought.

Notice that Neiman explicitly sketches, in just a few sentences, a broad map to both her research project (tracing “changes…in our understanding of the self…”) and one of her key claims: “the problem of evil is the guiding force of modern thought.” In doing both in a single paragraph, she heightens her readers’ interest in at once joining her search for answers and discovering whether she makes a convincing case that the problem of evil plays a central role in forming the modern mind. Her words also assist the readers’ focus, setting up an implicit contract with them that she’ll not disappoint them, but will actually perform exactly what she promises.

As with Susan Neiman above, the legal scholar Kenji Yoshino, in his book, Covering: The Hidden Assault on Our Civil Rights (Random House 2006), also uses the phrase–This book–to flag his thesis statement, but it occurs, not on page two, as in Neiman’s book, but on page twenty-six: “This book performs the point that the new civil rights requires both legal and cultural action.” This would seem to be a vague, weak thesis at first, but in a nearby paragraph he elaborates: “My argument begins at its source—gay rights. I retell the history of gay rights as the story of a struggle against…the demand to convert, the demand to pass, and the demand to cover” (27). He next goes on to write, “I then argue that this gay critique of assimilation has implications for all civil rights groups, including racial minorities, women, religious minorities, and people with disabilities” (ibid.). Notice that, like Susan Neiman, Yoshino has laid out a map of the territory for his book early on, and in a manner that announces both his route of exploration through the material and his central conclusion, which is that civil rights groups should preserve their unique identities by resisting assimilation (“the demand to convert, the demand to pass, and the demand to cover”). Also notice that the title of his book does the work of anticipating his thesis as well: Covering: The Hidden Assault on Our Civil Rights.

Writing 2.4.1. Think of a topic you know well and have researched in the past. Presumably you have the basic issues surrounding that topic in memory and have an opinion concerning it. Now attempt to write a paragraph about it akin to Susan Neiman’s above. Let her paragraph function as an exemplary model of someone announcing her purpose in a piece of writing. Sketch out a broad road map to your own topic and state, by the last sentence, your opinion about it (a claim worth defending or arguing about).

Writing 2.4.2. In Writing 2.4.1 immediately above, you generated some writing in which you staked out territory for discussion as well as an opinion. Imagine this constituted the outline for a book or essay. What would you then title it? In generating a title, deploy a colon in it after the manner of Kenji Yoshino’s Covering: The Hidden Assault on Our Civil Rights.

Writing 2.4.3. Read the beginning of any essay and see if you can locate an explicit thesis statement. What does it say, exactly? Is it strong enough in terms of specificity, relevance, controversy, and interest to make the essay seem intriguing, where you want to read the whole thing? If there is no explicit thesis statement, is there an implicit thesis that is readily discerned? How about a research question? Perhaps there is a research question combined with a thesis statement. In any case, discover how the author announces her or his purpose.

Writing 2.4.4. Browse some essays or books in search of explicit thesis statements. Once you’ve located one or two, ask yourself the following: How clear are they—and detailed? Do they consist of one sentence—or more than one? If the thesis statement contains more than one sentence, why did the author resort to more than one? What work were those additional sentences doing, exactly? Did the thesis statement include a reference to the author in the form of the I voice, such as, “I argue that…”?

__________

A Mini-Course In Rhetoric For Writers. Concept 2.3: Rhetoric Is Sexy

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of chapter two–or the whole of chapter one (Concepts 1.1 – 1.10), click here.

Concept 2.3. Rhetoric is sexy. Rhetoric enacts the drama of receptivity. When a matter comes before you, what is your judgment concerning it? What assertions will you submit to—and what assertions will you reject? If Plato is correct, the self is one and knows itself as one via a process of contending self-talk that gets to yes. Likewise, discourse is a search for maintaining or getting to oneness, solidarity, harmony. Assertions are made, ideas are proposed—and thumbs from the audience go up or down. And if thumbs go up in approval, signaling submission, it can be, not just a submission to logos—a rational submission through ever-heightened clarification, argument, and evidence-pointing—but a passionate submission—a submission to pathos—amounting to a seduction, as illustrated by James Joyce’s extraordinary soliloquy (self-talk) in his novel Ulysses, placed in the mouth of Molly Bloom as a stream of consciousness, with her eagerness for unification dramatized by the dropping of all commas:

[A]nd Gibraltar as a girl where I was a Flower of the mountain yes when I put the rose in my hair like the Andalusian girls used or shall I wear a red yes and how he kissed me under the Moorish wall and I thought well as well him as another and then I asked him with my eyes to ask again yes and then he asked me would I yes to say yes my mountain flower and first I put my arms around him yes and drew him down to me so he could feel my breasts all perfume yes and his heart was going like mad and yes I said yes I will Yes.

Molly and her lover had a main point to which they were moving, and that’s the goal of rhetoric as well: a coming together to an overriding question or conclusion, as when T.S. Eliot, in the first stanza of his poem, “The Love Song of J. Alfred Prufrock” (1915), invites the reader on a journey of discovery:

Let us go then, you and I,

When the evening is spread out against the sky

Like a patient etherized upon a table;

Let us go, through certain half-deserted streets,

The muttering retreats

Of restless nights in one-night cheap hotels

And sawdust restaurants with oyster-shells:

Streets that follow like a tedious argument

Of insidious intent

To lead you to an overwhelming question …

Oh, do not ask, “What is it?”

Let us go and make our visit.

Hopefully your own writing will not consist of any “tedious” arguments, but of novel and cogent (clear, coherent, convincing, relevant) ones, leading to an overarching question or thesis. In writing and speech, the point on which the yes is at ultimate stake is the thesis, the organizing and culminating goal of discourse.

Darwinian sexual selection and writing. The quotes from James Joyce and T. S. Eliot above are obviously erotically suggestive—and in Joyce’s case, explicitly so. And the fact that words in their suggestiveness are “fruitful and multiply,” breeding further thoughts and words in writers, editors, and readers, reminds one of writing’s relation to sex. Charles Darwin noticed that organisms do not just have adaptations for survival, but for spell-casting potential mates. By way of contrast with natural and artificial selection—may the fittest organisms in nature and on the farm survive—Darwin identified a third type of selection: sexual selection. May the most attractive survive. Sexual selection receives only brief mention in his seminal book, On the Origin of Species (1859); it was instead worked out in some detail in Parts II and III of his later book, The Descent of Man (1871).

For contemporary biologists building on Darwin’s original insights, human language and sexual selection are not merely analogous. Like birdsong and plumage, language has literally evolved, at least in part, to make the sexes ever more intensely attractive to one another. Certainly in a social species like Homo sapiens, facility with language is enormously advantageous for lubricating social interactions and heightening attraction. Think of the contrast between someone with a vocabulary of 5,000 words versus William Shakespeare (1564-1616), a man with an ever-ready vocabulary estimated in the tens of thousands of words. Which one would you predict would more readily attract a mate? In a competition for lovers, Shakespeare’s language displays would certainly have drawn a great deal of attention from potential mates.

So it’s beneficial to think of sentences not just in terms of their internet viral or memetic properties, or as akin to natural and artificial selection—may the best words in the best order win—but in terms of sexual selection. Like a feather in a peacock’s tail, a sentence functions to attract eyeballs—and the more attractive and memorable a sentence is, the better it does so. And in a piece of writing as a whole, it’s not just the content or matter presented on its pages, or the attractiveness of its material layout and font choices that count, but its evident persona, energy, coherence, tone, ease of reading, use of syntax, word-choice, and flair (that is, its style or manner of communicating; its attractive use of words).

Rhetoric is akin to music. So writing that arouses gets read, coaxing the reader onward with a desire for more, more, more. It can thus also be likened to jazz—which is itself a seductive performative display. When, for instance, the poet Allen Ginsberg gave the famous first public reading of his poem “Howl” in October of 1955 at the Six Gallery in San Francisco, he began his recital subdued and hesitant, as if starting a piece of jazz, but then a friend in the audience, the equally famous writer Jack Kerouac, goaded him to a greater expression of energy by shouting to him, “Go, go, go!”—which Ginsberg did—and to electrifying effect. Writing should go; it should matter, and the writer should believe that it matters, looking for ways to awaken what perhaps might be an otherwise sleepy audience.

So if you’re writing something, perhaps think of it in terms, not just of sexual selection, but of music. Think of it as your attempt to move, along a sliding scale, from merely catching a reader’s slight attention to actually holding her or his full attention. Seek to make your writing uncommon; a variant organism—a cultural organism—amidst the vast multiplicity of other organisms, different from all that has come before, attracting eyeballs to something novel and interesting. As the early modernist poet Ezra Pound (1885-1972) put it succinctly, “Make it new!”

Writers sometimes poison the well with others. Not all rhetoric is attractive. Sometimes it is repellent. While it may be, strictly speaking, always the case that the goal of rhetoric boils down to reaching agreement, that agreement may be with such a narrow constituency of the like-minded that it amounts to a no and a separation from everybody else. In such a case, the speaker or writer is indifferent to, or ignores, how his or her rhetoric might play to a broader audience of fair-minded people. This is illustrated by a nineteenth-century controversy that surrounded John Henry Newman (1801-1890), who changed from being a Protestant clergyman to a Catholic clergyman at a time when tensions between Protestants and Catholics were more pronounced than they generally are today. Newman was accused by a prominent writer (the novelist Charles Kingsley), after becoming a priest, of not being someone to safely dialogue with because he no longer could be trusted by Protestants to tell the truth. The implied prejudice being expressed was that Catholic priests deliberately lie to outsiders—to which Newman replied by coining a phrase that remains a part of the vocabulary of rhetorical studies to this day: poisoning the well—or “poisoning the wells” (viii).

To poison the well is to introduce into a dispute an accusation so toxic against one’s opponent in debate (they’re a liar, a racist, a murderer, a warmonger with blood on their hands, etc.), that dialogue cannot really be effectively sustained after making it. Poisoning the well is a signal of broken discourse, with an intent only to preach, as it were, to the tribe or choir on your side of the argument. It’s a way of demonizing others and sowing suspicion among groups. After the well has been poisoned, there’s talking, perhaps, but no real trust or listening. And so Newman writes in his book, Apologia Pro Vita Sua (Latin: “A defense of one’s life”), that the effect of his accuser’s accusation was “to cut the ground from under my feet,” and “to infuse into the imaginations of my readers, suspicion and mistrust of everything that I may say in reply to him. This I call poisoning the wells” (Ibid.). Newman also describes the accusation he experienced as at once “base and cruel,” one to which he found himself, rhetorically, at a loss as to how to reply (Ibid.). How does one respond to an accusation that places one in a damned-if-you-do, damned-if-you-don’t double bind? In other words, how does one prove a negative, marshaling evidence against an unspecified and general provocation to show others that one is not, and has never been, a liar, a thief, an extremist, or a murderer?

Writing 2.3.1. Think of a contemporary example of a piece of writing or speech poisoning the well of discourse. What was said, and how did the rhetorical poison do its work? Who did it function to persuade—and why did it succeed (if it succeeded)? Who was the audience for it—and who was alienated by it? What ultimate purpose was served by it? Did the poison ultimately benefit or harm the writer or speaker who deployed it?

Writing 2.3.2. Think of an audience that is inclined to disagree with you on an important matter, and write down, in a single sentence, a controversial claim surrounding it. Make sure it’s a claim you actually believe—that it’s your claim. Now develop one or two arguments in support of the claim, ever mindful of the unfriendly or even hostile audience that you’re writing to, and with a mind to winning them over; of having them ultimately submit to your judgment. Think about, not just what you’re saying, but the way you’re saying it (your tone of voice, etc.). How will you reach your end?

Writing 2.3.3. Write a couple of sentences on a topic of your choosing in such a way that you are very clearly and emphatically trying to bring attention and interest to it via evocations of eros, energy, and novelty. Let the writing, like thunder, be charged. The subject of the writing needn’t be sexual in content—that’s not the point—but it ought to be “sexy” in the sense of raising in the reader energy and attraction toward the writing itself. You should come across as an author interesting enough to go on a blind date with. Don’t go beyond the crafting of the first couple of sentences. Really focus on them as if they are a make-or-break moment to an implicitly longer piece of writing; the moment when you either net your readers or lose them. In your sentences, attempt to stand out.

Writing 2.3.4. In a paragraph or two respond to the first sentence (or couple of sentences) of any piece of writing by asking, “Is this writer making it new? Is there anything vaguely—or overtly!—sexy or interesting about it? Does it raise libido (animal energy) or attention in some way? If so, how so? If it doesn’t, why doesn’t it? Has the opening of this piece of writing really caught my eye, reason, heart, and imagination? To what degree, exactly? How is it achieving its effects—if it is in fact achieving any at all?” Be critical—but not gratuitously so. Think about it. How are those first sentences functioning?

__________

A Mini-Course In Rhetoric For Writers. Concept 2.2: Getting To Yes Is The Goal Of Rhetoric

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is part of chapter two. To have a look at other parts of chapter two–or the whole of chapter one (Concepts 1.1 – 1.10), click here.

Concept 2.2. Getting to yes is the goal of rhetoric. If the audience has the capacity to reply, the role of speaker-audience may switch up in such a way that what is at work is not proclamation or monologue (one-way discourse), but dialectic (two-way discourse). Dialectic is dialogue around a matter of importance and controversy in which the stakes are the following: will the parties in conversation come to agreement, stay in agreement, or break agreement? In other words, will they come to yes with one another? In rhetoric, the methods of message deployment can be, not just verbal or written, but visual, as with emojis or images in advertisements. Such methods are known as visual rhetorics.

An example of getting to yes: Mother Pollard and Martin Luther King. In introducing ethos, logos, and pathos (Concept 2.1), our first attempt at a definition of rhetoric was the following:

Rhetoric is the art of reaching or maintaining agreement through argumentation, with arguments being directed to the reason (Greek: logos), the emotions (pathos), or some combination of the two.

We are now in a position to offer a fuller definition:

Rhetoric is the art of reaching or maintaining agreement by appeals to character (“trust me, like me, follow me”), reason, and emotion, whether via one-way discourse (a political poster, writing an essay, giving a speech) or via dialectic (an exchange of Snapchat photos or emails; a conversation). It’s about getting to yes.

Martin Luther King recounts how he reached a state of life-energizing agreement–getting to yes–with a parishioner one evening through a process of dialogue in which he started at no:

On a particular Monday evening, following a tension-packed week that included being arrested and receiving numerous threatening telephone calls, I spoke at a mass meeting. I attempted to convey an overt impression of strength and courage, although I was inwardly depressed and fear-stricken. At the end of the meeting, Mother Pollard came to the front of the church and said, ‘Come here, son.’ I immediately went to her and hugged her affectionately. ‘Something is wrong with you,’ she said. ‘You didn’t talk strong tonight.’  Seeking further to disguise my fears, I retorted, ‘Oh, no Mother Pollard, nothing is wrong. I am feeling as fine as ever.’ But her insight was discerning. ‘Now you can’t fool me,’ she said, ‘I don told you we is with you all the way.’ Then her face became radiant and she said in words of quiet certainty, ‘But even if we ain’t with you, God’s gonna take care of you.’ As she spoke these consoling words, everything in me quivered and quickened with the pulsing tremor of raw energy. (Strength to Love 125)

This exchange took place in 1956. Looking back in 1963, King observed:

Since that dreary night in 1956, Mother Pollard has passed on to glory and I have known very few quiet days. I have been tortured without and tormented within by the raging fires of tribulation. I have been forced to muster what strength and courage I have to withstand howling winds of pain and jostling storms of adversity. But as the years have unfolded the eloquently simple words of Mother Pollard have come back again and again to give light and peace and guidance to my troubled soul. ‘God’s gonna take care of you.’ (Ibid.)

Through Mother Pollard’s argument, directed at once to the emotions (pathos) and the reason (logos), King’s no became a yes.

But what was Mother Pollard’s argument, exactly, and what is an argument? An argument is a claim supporting another claim. In this case, Mother Pollard’s central claim was that King should not feel alone or afraid. Her supporting claim for this was, “I don told you we is with you all the way,” and this was followed by a second supporting claim: “But even if we ain’t with you, God’s gonna take care of you.” These claims can be turned into an argumentative syllogism (two premises followed by a conclusion):

One need not feel alone or afraid when one has strong backing (first premise); Dr. King has strong backing from friends and God (second premise); therefore, he should feel neither alone nor afraid (conclusion).
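
For readers who want the logical skeleton laid bare, the same syllogism can be sketched in standard notation. (The symbols below are simply labels of convenience, not anything drawn from Mother Pollard’s or King’s words: B(x) stands for “x has strong backing,” and A(x) for “x should feel alone or afraid.”)

\[
\begin{aligned}
&\text{Premise 1: } \forall x\,\big(B(x) \rightarrow \neg A(x)\big)\\
&\text{Premise 2: } B(\textit{King})\\
&\text{Conclusion: } \therefore\ \neg A(\textit{King})
\end{aligned}
\]

The inference itself is valid: the universal first premise is applied to King, and the conclusion follows. As with any syllogism, though, it persuades only to the degree that the audience says yes to the premises.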

If an audience submits to your claims, nodding in agreement, you’ve won rhetorically; you’ve reached yes, as was the case between Mother Pollard and Martin Luther King. But if King hadn’t been convinced—if, for example, he had disagreed with her that God existed—then Mother Pollard would have had to continue the argument, perhaps offering claims that would lead him to belief in God—or, taking a different tack, saying something like this: “Even if the community isn’t with you, and God isn’t with you, I’m with you. Carry me in your heart. Carry on for me.” Each of these sentences advances the argument, looking for the point where sufficient clarity and a stable yes between speaker and audience are finally reached. The trick in any argument is knowing when you’ve truly gotten to that point of understanding and yes with an audience; of knowing when further sentences are not needed because the audience has accepted your premises and is with you; of knowing when to stop.

Rhetoric seeks an audience’s final judgment. As the art of reaching yes, rhetoric means getting all the parties to a discourse on the same page—or keeping them there—believing, feeling, or acting as one. This process has a curious parallel to thinking itself, as can be seen in this ancient dialogue between Socrates (470-399 BCE) and Theaetetus (417-369 BCE) in Plato’s (428-348 BCE) Theaetetus (190a):

Socrates: And how do you accept my description of the process of thinking?

Theaetetus: How do you describe it?

Socrates: As a discourse that the mind carries on with itself about any subject it is considering. You must take this explanation as coming from an ignoramus [a simpleton]; but I have the notion that, when the mind is thinking, it is simply talking to itself, asking questions and answering them, and saying Yes or No. When it reaches a decision—which may come slowly or in a sudden rush—when doubt is over and the two voices affirm the same thing, then we call that its ‘judgment.’ So I should describe thinking as discourse, and judgment as a statement pronounced, not aloud to someone else, but silently to oneself.

In other words, thoughts arrive to the mind as arguments arrive to an audience. If you accept them as your thoughts, and not intrusive thoughts; if you say yes to them implicitly or explicitly, then they are your judgment because they belong to you. If you say no, you reject the thoughts as belonging to you. But if you say maybe to them, entertaining the incorporation of some new thoughts, then you are either wrestling with self-clarification (“What do these thoughts that I’m having mean, exactly?”) or engaging in inward debate (“On the one hand this, on the other hand that”). The culmination of inward talking, listening, and questioning comes to this: which thoughts will you reject, and which will you say yes to? Reaching a judgment means arriving at your undivided self—the one who is, as it were, gathered up in agreement. A writer, for instance, might wrestle for hours with her exact wording, finally saying to herself, “Yes, these are indeed the best words in the best order for my purposes. That’s my judgment.”

Writing 2.2.1. In a journal, recount a time when a no became a yes for you, or a time when you were able to bring another person or an audience from no to yes. Think about this in the light of the rhetorical triangle (ethos, logos, and pathos). What mixture of emotional and rational appeals was at work? What role did ethos (the personality and character of the messenger) play in the switch? If you cannot recall either of these experiences (either being persuaded of something yourself or successfully persuading another), to what do you attribute this?

Writing 2.2.2. Wrestle with a question you have doubts about or find yourself in a dilemma over, expressing the problem to yourself in writing and venturing some possible solutions. See what judgment you finally arrive at after writing about the issue.

__________

A Mini-Course In Rhetoric For Writers. Concept 2.1: Rhetoric Is The Music Of You, Sent To Others (Ethos, Logos, Pathos)

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers; the second chapter is a mini-course in rhetoric for writers. This post is the beginning of chapter two, introducing its first concept (Concept 2.1). To have a look at chapter one (Concepts 1.1 – 1.10), click here.

Concept 2.1. Rhetoric is the music of you, sent to others. Rhetoric is the art of reaching or maintaining agreement through argumentation, with persuasion being directed to the reason (Greek: logos), the emotions (pathos), or some combination of the two. Logos is most directly associated with the messages you send and pathos with the passionate audiences to which you send them. Audiences are regarded as passionate in the sense that they presumably have an interest or stake in the outcomes of your arguments, and are thus susceptible, not only to appeals to reason, but to desire and aversion.

The philosopher Aristotle, writing more than two millennia ago, observed in his Rhetoric that, in addition to logos and pathos, there is always a character (ethos) component to argumentation. From the Greek word ethos we get the English word ethics. Ethos is associated with the sender of a message. To increase the odds of being rhetorically successful, the sender might seek to transmit to an audience a persona—a mask, a personality—that is interesting, friendly, credible, and ethical (because either truthful, virtuous, or courageous). This is achieved by such things as choice of mood in address, tone of voice, earned authority, expertise, reputation, presentation, and relationship (cultural touchstones and history with the audience). When you bring to an audience your essence—your spirit, character, habits of mind and action, and your true or best self (or even a fake or worst self)—you bring your ethos; it is, as it were, the music of you that arrives to your audience (pathos) accompanied by a message (logos).

So as lyrics are to music, messages are to ethos. Whatever messages you’re sending, they are accompanied by the music of your presentation—your mask, personality, style—your ethos. Thus it is that, say, a coffee-stained essay with grammar errors, and not in the format expected by your audience, reflects on you. In such a rhetorical situation, messiness and carelessness are part of your ethos. So, along with your logos and pathos appeals, you arrive to people: (1) with character (some sort of virtuous or not-so virtuous orientation or reputation); and (2) in character (donning the mask of a personality, style, and tone of voice). Will an audience read or listen to your message if it doesn’t like the ethos—the music—that accompanies it?

The rhetorical triangle. Argument is invariably combative (some arguments must win, others lose), but it need not be divisive. It can bring parties together. At bottom, rhetoric is about sensitive and skillful attention to audience; about reaching an audience, not losing or alienating it. This is true whether that audience is a group (“Vote for me!”), a single person (“Will you go on a date with me?”), or even yourself in self-talk (“You can get that job; you’re the best!”). Notice in the latter example how the self splits into two—it is of two minds—and in the role of both speaker and listener. Notice also that the self is sending the “other self” a message (“You’re the best!”). So, in communication, sender, message (reason), and audience (passion)—the rhetorical triangle, first theorized together in ancient Greece—are all three at work, even at the level of inner dialogue.

Writing 2.1.1. Think of a person or group with whom you have a disagreement. In a journal, and in no more than a sentence or two, state the nature of that disagreement. Next, address that person or group concerning your disagreement, seeking their agreement by appealing to both their reason (logos) and emotions (pathos).

Writing 2.1.2 Have another look at the journaling you did for Writing 2.1.1 immediately above, then try another piece of writing on the same topic to the same audience—but this time don (adopt) for your ethos—the music of you—one of the following personae (masks): (1) the energetic ironist or humorist who is irreverent and “in-the-know”; (2) the measured, serious, earnest nerd or scientifically oriented person; (3) the over-the-top salesperson; (4) the compassionate, empathy-driven liberal; or (5) the angry, resentful populist.

Writing 2.1.3: Write a paragraph to yourself on a matter of importance to you. Notice as you do this that you’re participating in the rhetorical triangle from all three vantages: ethos (sender, personality), logos (message, reason), and pathos (audience, emotion). Also notice your internal dialogue as you write, observing how it shifts from sender concerns to message concerns to receiver concerns: Is this what I want to say? Does the way I’m saying this resonate with how I actually feel about it now? Am I speaking honestly, and is this what I really want to say and hear?

__________

A Mini-Course in Critical Thinking For Writers. Concept 1.10: Skepticism

I’ve decided to attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter is a mini-course in critical thinking for writers. To have a look at Concepts 1.1 – 1.9, click here.

Concept 1.10. Skepticism. Skepticism (doubt), as a philosophical position, originated with the ancient Greeks. In its most extreme form, it entertains the idea that knowledge itself, on any matter whatsoever, is ultimately impossible–though this position has often been observed to be self-refuting. If one were actually correct about this, how would one ever know? Here, we will use the term in a looser, more contemporary sense: you are a skeptic if you regard doubt, not as a thing to be avoided, but as a virtue. Faith in some contexts may be a virtue–but so is doubt. So you’re a skeptic if you follow David Hume’s (1711-1776) advice, on encountering a claim, to apportion your belief or disbelief to the evidence. That is, you don’t believe or disbelieve claims with more confidence and enthusiasm than the evidence warrants; instead, you are rigorous and conscientious in being fair-minded, measured in your opinions, and keeping doubt in play. This means you turn your skepticism, not just on the opinions of others and on opinions you don’t like, but on yourself and your own opinions as well. Contemporary skeptical behavior thus includes:

scrutiny of sentences as to their precision and coherence

appraising ad hoc explanations in light of Occam’s razor

using deduction, induction, abduction, reality testing (experience), and scientific method (experiment) to evaluate claims and their supports, attempting to get at the actual truth of matters as nearly as humanly possible

apportioning belief to the evidence

ironically noticing the relation of what we call facts to the values, metaphors, heuristics, and models we overlay upon them

self-criticism (turning the tools of critical thinking not just selectively outward, toward the claims of others, but toward oneself; toward one’s own beliefs)

no sacred cows (no claim is exempted from rational scrutiny)

Skeptics listen. Skeptics bring rigorous, hard questions to claims. And because skepticism is doubt, it can feel dangerous to people who have a claim that they want to believe. Doubt from outsiders can marshal a group’s defenses against loss and humiliation, as can doubt from insiders (whistle-blowers). Nobody likes to lose hope or lose face in argumentation. But skepticism needn’t be dehumanizing or socially corrosive. As a seeker of truth, the vulnerable skeptic, ideally, is not a sadist. Instead, he or she is more interested in dialogue than in proclamation or in the deconstruction of others’ views. And again, ideally, the skeptic is most especially interested in civil dialogue between those who fundamentally disagree, calling to mind the philosopher Baruch Spinoza (1632-1677): “Non ridere, non lugere, neque detestari, sed intelligere” (Not to laugh, not to lament, not to curse, but to understand).

So a good skeptic is a good listener. For the skeptic, when it comes to getting at the truth of a matter, two heads are better than one, and two that disagree are better than two that agree. Skepticism that functions in an echo chamber, going all in one direction, circling the wagons with fellow skeptics, and never bringing skeptical questions back upon one’s own beliefs, is no skepticism at all. So when you’re thinking closely about a claim someone is making, one way to get clarity is to keep an open mind, listen carefully, and keep your humanity and the other person’s humanity squarely in the foreground of consciousness. Think of the person as a person, just as you are a person, and ask of them, with civility, the same sorts of tough, skeptical questions that you willingly and readily turn on your own beliefs. Be honest and kind.

Humility is also a virtue. It is good to recall that we are all subject to flawed reasoning, and any one of us may catastrophically misread the landscape we’re navigating, whether literal or metaphorical, arriving at false beliefs that end in our injury or death. We may also be thwarted in our purposes by setting them too high or too low. Someone might outmaneuver us. We may make all the wrong allies—and find ourselves with all the wrong enemies. There are so many ways, and so many levels at which, our critical thinking can fail, and it is in the knowledge of this that we question claims.

A list of questions to assist your critical inquiries. Below is a list of eighteen questions to help you navigate claims. The first nine address the claim itself; the subsequent nine address the claim’s adherents. Some might argue that to add skeptical consideration of a claim’s advocates to the analysis of the claim itself is not just to commit the ad hominem fallacy (“adding the man” to an argument), but to commit the genetic fallacy as well (substituting an analysis of the genealogy, the history of how a claim came to exist, for how the claim actually functions in argument). A claim, it is said, should stand or fall on its own argumentative merits, not on who says it or what its historical genealogy might be. Whether, for example, an atheist should eat meat is independent of the fact that Hitler was a vegetarian, and that the first vegetarians practiced vegetarianism for religious reasons. Vegetarianism can be evaluated independently of either its advocates or its history.

Strictly speaking, this is of course true. “Adding the man” or a genealogical history to a claim can distract critical focus. But once a claim has indeed been independently and fairly evaluated on its argumentative merits, skeptical questions surrounding believers in the claim and the historical context for the claim are often highly revealing. They can function as a reminder of the many social factors that cause thought to go wrong (or right). For instance, thinking about how a particular community of scientists reached consensus on an issue like continental drift in geology can be suggestive of how to engage in critical thinking generally. It might serve as an exemplary model. The community may have been highly deliberative, not rushed; it may have regularly met to hash out objections, and so on. In short, argument need not be in a zero-sum relationship with considerations of the people and history behind a claim.

(1) Does the claim concern facts (what’s true), values (what’s good), or aesthetics (what’s beautiful), and is the claim meaningful (i.e., specific, coherent, and controversial enough to argue over)?

(2) Does the claim have any real evidence in support of it—and what is the quality of that evidence?

(3) Are there converging lines of evidence supporting this claim? (For example, one can be confident the Holocaust occurred via multiple, converging lines of evidence: photographic, testimonial, physical, documentary, and so on. How about this claim? Is this claim of the nature that it could also have converging lines of evidence in support of it–and if so, do these lines of evidence actually exist–and what is their quality?)

(4) Is the claim backed by authorities and experts? If so, how reliable are these?

(5) Independent of physical evidence and what authorities and experts might say about it, what other good reasons (valid arguments) are there for actually accepting the claim? Are there also good reasons not to accept it?

(6) Is this claim coherent with other things that are seemingly well-known and established (the things most reasonable people think they already know about human flourishing, what’s good and beautiful, what the universe is and how it works, etc.)? In other words, is there anything in this claim that seems to be incoherent or in tension with our other well-established pieces of background knowledge (evolution is true, penguins can’t fly, etc.)?

(7) What premises underlie this claim? Why do the argumentative supports for this claim start and stop where they do? Do the starting and stopping points seem reasonable–or are there questions that seem to go begging (that are in need of further argument)?

(8) A Goldilocks question: do the rules of thumb, mental models, maps, metaphors, and narratives that accompany this claim seem about right, too complicated, or too simplistic?

(9) What’s your dominant impression of the claim? In other words, given the cumulative quantity and quality of all the evidence and reasons that can be marshaled for this claim, how strongly should you actually accept or reject it? (Think grayscale here: on a scale of one to a hundred, how confident would you say you are, for example, that bacterial life exists on Mars? Or that banning all guns is a good thing? Or that Picasso was the greatest artist of the 20th century? Or that it would be good if 21st-century humans stopped eating meat?)

(10) Are there ways to reality test this claim? If so, do those who support the claim actively seek out disconfirming evidence and arguments for the claim–or are they largely just engaged in confirmation bias (counting the hits, but not the misses, in relation to the claim; not really looking for counter-arguments; walling the claim off from sustained scrutiny or possibilities for falsification; ad hoc-ing)? In other words, do the reasons believers put forward for the claim amount to after-the-fact rationalizations for the belief?

(11) Are those who accept this claim fairly judging and weighing the competing goods that may attend it? In other words, are they simply denying or ignoring what trade-offs might be at stake in adopting the claim, thereby oversimplifying matters? (Example: does an advocate for universal, free health care (a good thing) ignore or deny that taxes might need to be raised (at the expense of low tax rates, also a good thing) to pay for it?)

(12) Did those who support this claim go through a process of abduction before fully embracing it? (Abduction entails dispassionately slowing down and weighing alternative beliefs or explanations before concluding that your position is the best belief or explanation on offer.) If abduction was not engaged in, then how, exactly, did those who accept the claim actually reach their professed level of confidence, intellectually? Was it, for instance, via emotions (as Robert Wright notes, “emotions are judgments”)? And if so, were these grounded in long experience and expertise, where an intuition or emotion might indeed function as a highly reliable judgment, the elimination of weaker hypotheses functioning unconsciously? Or was an emotional judgment rendered under pressures that might reduce its likelihood of being correct (emotional duress, intoxication, social conformity, time constraints, hope, fear, etc.)?

(13) What roles are group belonging, self-identity and esteem, financial interest, temperament, and desire—desire of any sort—playing in people’s adoption of this claim?

(14) How do those who accept the claim account for those who reject the claim? If there is a group that has formed around the claim, how do group members deal with outsiders and ambivalent insiders (fellow group members who express doubts, are insufficiently committed to the group, or perhaps even threaten to become whistle-blowers)?

(15) Does this claim seem to bring those who accept it under the spell of a metaphor, analogy, model, or narrative that appears to be dubious or in need of greater scrutiny? Are there other ways—better ways; simpler ways—to frame or tell the story of this claim that might break the spell of the claim on its adherents?

(16) Is there any indication that those who adhere to the claim on offer are dodging pointed and skeptical questions from outsiders? Is it the truth or something else that seems to be at stake among those who support the claim?

(17) Are those who accept this claim introducing any static into their arguments in support of it (emotional appeals, logical fallacies, the demonizing of outsiders, things that are beside the point, etc.)? If so, why are they doing this? What’s the signal in the noise here?

(18) Would you appraise the people who actually believe this claim as excessive in their belief, supporting the claim too confidently and enthusiastically, not in proportion to the evidence?

Again, the critical reasoner may bring skeptical questions, not just to claims, individuals, and the genealogy (history) of claims, but back upon himself or herself as well: what’s clouding my thought; what biases are at work in me? Skeptical questioning directed outward, but never back upon yourself, is not skepticism. Do you have the capacity, not just for bringing criticism to others, but for self-criticism, and for hearing criticism? If so, then you’re probably closing in on what’s good, beautiful, and true (the truth of matters), and so moving very far across the bridge from the merely logically possible to what’s actually the case.

Writing 1.10.1. Evaluate a claim in the light of the eighteen criteria listed above. Reflect on your discoveries in a piece of writing. Read out your analysis to others and discuss it with them.

Writing 1.10.2. In your journal, reflect on your relationship to skepticism and doubt.


A Mini-Course In Critical Thinking For Writers. Concept 1.9: Moving From Innocence To Experience

I thought I would attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter would be a mini-course in critical thinking for writers. To have a look at Concepts 1.1 – 1.8, click here.

Concept 1.9. Moving from innocence to experience via shifting models. If science (empiricism) is about comparing theories, models, and maps to reality through prediction and experiment, and learning from the result, then experience is about comparing earlier, perhaps quite innocent, assumptions about the world to what has actually been discovered over time, and learning from that. Experience is thus deeply akin to experiment in science. It’s what the Romantic poet William Blake (1757-1827) rhapsodized as the movement from innocence to experience. It entails innocent models recalibrated or overturned by experience. As a process, therefore, innocence to experience is the movement from what you thought you knew then to what you think you know now—and it can be a communal revelation, not just an individual one, as in what Jews learned collectively, as a people, from the Holocaust. It is whatever we think biography or history teaches; the stories we tell ourselves. Innocence to experience is a form of bearing witness, as when Blake depicts, in “The Garden of Love,” a man revisiting a carefree field of grasses and wildflowers from childhood, and discovering that a church has been built on it.

I went to the Garden of Love, 

And saw what I never had seen: 

A Chapel was built in the midst, 

Where I used to play on the green. 

 

And the gates of this Chapel were shut, 

And Thou shalt not writ over the door; 

So I turn’d to the Garden of Love, 

That so many sweet flowers bore. 

 

And I saw it was filled with graves, 

And tomb-stones where flowers should be: 

And Priests in black gowns, were walking their rounds, 

And binding with briars, my joys & desires.

Here the character in Blake’s poem moves from recollection of spring life in fenceless nature to a jarring encounter with private property (the door to the chapel is locked), prohibition (“Thou shalt not”), and death (“tomb-stones where flowers should be”). Bearing a similar tone of movement from innocence to experience is Joni Mitchell’s (b. 1943) song, “Big Yellow Taxi” (1970), the first lines of which are the following:

They paved paradise

And put up a parking lot

With a pink hotel, a boutique

And a swinging hot spot

At some point we reality test, are tested by reality, or discover that reality is not what we thought. In the first, we’re doing science; in the second and third, we’re living our lives, moving from innocence to experience, our models coming up against the full complexity of reality and time.

Heuristics. Heuristics are rules of thumb: the simplest sort of models or maps for navigating reality. As pattern-seeking animals looking to save time and energy, we are happy to repeat the coping strategies that have worked for us in the past. No one wants to reinvent the wheel. So, when we encounter a new thing, we might be in the quick habit of dealing with it via shortcuts or cues, asking pragmatic questions such as these:

Does this idea or thing already fit somewhere in my system of ideas and values, and do I know what to do with it?

Is this a readily recognizable friend or foe to me, or something to be neutral or indifferent about?

Sometimes we’ll have both the time and energy to raise our heuristics from rules-of-thumb to full-blown, carefully thought-out schemas (more detailed maps), theses, theories, or paradigms (overarching theories that guide our questions and priorities). But given the limitations on our knowledge, and the shortness of each individual life in relation to the cosmos, it’s difficult to know, exactly, what to say and do, what’s important, and at what level we should respond to things. Sometimes our quick heuristics are good enough, and sometimes not.

Should we, for instance, think fast or think slow on a matter? Each human being, if she is to go on navigating the cosmos at all, has to lay down bets on what she thinks is most likely going on around her and within her, both on the large scale and the small scale, and what response is appropriate. Whether we’re thinking loosely or precisely, fast or slow, we’re bringing models to bear on those things we encounter (a stranger, a rustling bush, a stray dog). We never encounter reality unmediated. It’s simply too complex. We would experience sensory overload if we took in all data equally. Things are far too complicated to know everything about them all at once, and we are often rushed for time, and so we typically isolate a few aspects or attributes from things, size them up, and make a quick determination on the whole.

The signal in the noise. Parts stand in for wholes. As with synecdoche (parts standing in for wholes) in literary studies, heuristics are those simplifying parts we focus on as important for getting our heads around the matter at hand. They are the models or procedures we deploy for dealing with something—and we hope that we are doing so accurately, quickly, and efficiently. If you’re using heuristics well and rationally, as opposed to with prejudice and passion, you’re trying to boil things down to their essentials and increase the probability that you’ve judged them rightly. When you deploy a heuristic, you’re basically laying down a bet that you’ve accurately detected a reliable signal in the noise and interpreted that signal correctly. You’re hoping the future will not leave you with regret at your snap judgment.

Hasty generalization and prejudice. You might use just one heuristic or criterion for evaluating a thing (you never eat a sandwich with meat on it because you deem sandwiches with meat “unhealthy”), or you may have a small cluster of briefly sketched criteria, as when you decide a sandwich is unappetizing (it has white bread and baloney), is “un-American” (Russian dressing), and lacks zest (it has no pickle). You might presume that you know what to do with a sandwich like that. Pick a different sandwich. Our heuristics simplify the world for us, saving energy and work. They’re time savers for thinking fast. But they’re grounded in probability, not certainty. We’re always trying to navigate our way from what is logically possible to what is plausible and actual, and wherever we can, we do so, not just via logic and reasoning, but via experience and investigation. Maybe we take a bite of the sandwich and find that, well, after all, it works for us. It’s a pretty good sandwich. Our heuristics can lead us astray if we never reality test them, and they should be simple, but not too simple, distinguishing what’s more important from what’s less important. They shouldn’t lead to hasty generalizations and prejudice.

Stephen Hawking’s model-dependent realism. As with simpler heuristics, science too is value-laden and model-dependent, not giving us an unmediated encounter with facts, reality, and the good. Thus the physicist Stephen Hawking (1942-2018) argued that, as human beings embedded within the very system we’re trying to navigate and explain, we should be very careful about speaking with absolute certainty, as if our models could never be overturned. Even our very best models of reality can be overthrown by better theories or new data. We don’t know, for instance, whether the Standard Model in particle physics, though spectacularly successful, will ever be replaced by a still better model, but for now it is certainly superior to all others on offer. We don’t need the truth unmediated (which is probably an impossible ambition in any case); we just need, pragmatically, the best model among those on offer now. We have models (theories about the world and how it works) that achieve our purposes (predicting and explaining things), and we have good reasons for thinking that some of our models are far better than others on offer, but we nonetheless stay open to new data and surprise. Hawking offered an alternative to naive notions of truth; he called it “model-dependent realism,” and he called himself a “model-dependent realist.” No final truths. No final say. No epistemic closure (knowledge closure). Just the best models for now.

No Country for Old Men and Michel de Montaigne. In the film No Country for Old Men, the sheriff, played by Tommy Lee Jones, comes across the scene of a drug deal gone bad in the desert, finding abandoned trucks and bodies splayed across it. His partner exclaims, “Ain’t this the mess!”—and Jones replies, “If it ain’t the mess, it will do till the mess gets here.” We might say something similar of our best models for navigating reality. If they’re not the truth, they’ll do till the truth gets here.

This open-minded attitude to new data is captured admirably by the French essayist, Michel de Montaigne (1533-1592), who famously asked of himself, again and again along the path of his life—“Que sais-je?”—What do I know? The implication here is a questioning of one’s level of confidence about things, the reasons for one’s confidence, and maintaining oneself, as it were, out of doors, susceptible to the weather of alternative explanations: What do I really know? Montaigne, in other words, held himself into the wind of experience. He was in the habit of revisiting premises. Alberto Manguel (b. 1948), in his book Curiosity (Yale 2015), sums up Montaigne’s open-ended and inquisitive attitude to life as “a continuous state of questioning of the territory through which the mind is advancing…” (2).

Metaphor, models, Shelley. The poet Percy Bysshe Shelley (1792-1822) famously described poets as “the unacknowledged legislators of the world.” One way to interpret this is that Shelley was suffering from ego-inflation: poets actually have little impact on how the world goes. But if you define a poet as one with a gift for metaphorical framing, that is, if you think of a metaphor (this is that: my love is a rose) as in some sense a heuristic or model for seeing a thing, then Shelley’s claim is more than plausible, for metaphor is indeed dramatically intrusive on all areas of human life, including our attempts to think critically.

Take Iran, for example. During the presidency of Barack Obama (b. 1961), the implicit historical analogies swirling around Iran were multiple. Was Iran Nazi Germany or the Soviet Union? Or was it something else? It’s hard to think of a contemporary foreign policy topic more important to reason clearly about than Iran. But, when we attempt to do so, we’re plunged immediately into a realm associated with poetry, i.e., associative thinking. Israeli Prime Minister Bibi Netanyahu, for example, once framed Iran as Nazi Germany with nuclear ambitions. Here’s the Israeli newspaper Haaretz from 2008:

Netanyahu said Iran differed from the Nazis in one vital respect, explaining that ‘where that [Nazi] regime embarked on a global conflict before it developed nuclear weapons,’ he said, ‘this regime [Iran] is developing nuclear weapons before it embarks on a global conflict.’

And below is Eric Edelman, writing at the Foreign Affairs website, thinking about Iran in a Cold War frame. He compared Iran to the Soviet Union, but suggested that its ambitions might not be containable in the same way (Eric Edelman et al., Nov. 9, 2011):

During the Cold War, of course, the United States managed to prevent nuclear use and discourage proliferation by containing the Soviet Union and providing security commitments to U.S. allies. According to the conventional wisdom, a similar approach would work in the Middle East today. Yet there are a number of important differences between the two cases, the biggest being that the United States had formal security commitments with partners across Europe and Asia and deployed hundreds of thousands of troops to their territories.

We might also think of the American standoff with Iran from the vantage of a different Cold War frame: the Cuban Missile Crisis. Is Iran Cuba? Yet another way to metaphorically frame Iran is to think of it as the United States in the 18th century seeking to secure and protect its revolution and sovereignty against an imperial power (that would be us).

So how might we decide which extended metaphor is best? Like Alice in Wonderland, we’re all groping for some familiar ground on which to reason about Iran. And depending on our framing metaphor, other metaphors, like rabbits, multiply. If you take Iran to be Nazi Germany, for example, then suddenly Bibi Netanyahu would expect to be cast in the role of Winston Churchill. But if you take Iran to be Cuba in 1962, Bibi Netanyahu might be analogized to President John Kennedy’s hotheaded general, Curtis LeMay, who infamously advised Kennedy to bomb Cuba with nuclear weapons if necessary. And if you take Iran to be America in the 18th century, then you might liken the President of Iran to Thomas Jefferson.

Change your metaphor, change your mind. This is the power of framing metaphors. Maybe Shelley is right. If you’ve got the poet’s gift for synthetic associations, you’re valuable or dangerous to the State, mapping out the grooves by which thoughts can travel, thereby “legislating” them. Metaphors, therefore, are ‘groovy.’ Or, to switch the metaphor, you can’t think what you don’t frame.

Writing 1.9.1. Observe something in the news. What heuristics, models, or metaphors are being overlaid on it—and by whom?

Writing 1.9.2. Describe a movement from innocence to experience. Describe in your journal a movement from then to now, and what was thought then compared to what is thought now. What metaphors overlaid the innocent world then? What metaphors overlay the world of experience now?

Writing 1.9.3. Describe a movement from experience to a recollection of innocence. Describe in your journal a movement from now to then, and what is thought now compared to what was thought then.

Writing 1.9.4. Think of a problem. In your journal, reflect on how much time and energy you think it deserves, and what models might best be overlaid on it. Is this an issue for thinking fast or slow? Are there external time constraints forcing a decision by a date certain? What are the consequences of getting one’s model about this wrong?


A Mini-Course In Critical Thinking For Writers. Concept 1.8: Scientific Method

I thought I would attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter would be a mini-course in critical thinking for writers. To have a look at Concepts 1.1 – 1.7, click here.

Concept 1.8. Scientific method. If we’re not engaging in self-deception, trying to ad hoc our way across the bridge from logical possibility to the actual truth of a matter, we see that we have a variety of genuinely powerful, seemingly objective tools ready to hand to help us reach beliefs that would appear to be rationally justified (warranted):

logical possibility

the three rules of thought

deduction, induction, and abduction

Occam’s razor

These intellectual tools are all brought together in scientific method. In the practice of scientific method, one:

brings questions to the cosmos 

generates competing theses or theories that are capable of falsification 

tests theories

incorporates the best theories on offer into a coherent network of other best theories 

Survival of the fittest theories. The scientific method is a sorting process for theses; it’s about discovering which theses survive an ordeal of testing, whether in or out of the lab. It’s about the survival, as it were, of the fittest models or maps for navigating reality. These fittest models are discovered in the way that the best gladiators in the ancient Roman arenas were discovered: by contrivances dreamed up for trial or testing. But in the case of science, these trials are dreamed up not by emperors but by scientists deploying scientific method.

Criteria of adequacy. A theory that cannot be falsified, or that is phrased in such a way that it’s impervious to new data, reality testing, or competition from rival theories, is not a scientific theory. Testing is a must. But aside from a theory being testable, by what general criteria is it to be judged in relation to other theories? First, scientists agree that an interesting theory is fruitful (it makes predictions that pan out; it’s not readily surprised by new data; it’s not constantly requiring the addition of ad hoc premises to save it). Second, it has scope (it explains a lot of things, not just, say, one or two things). Third, it slots into our already well-established background knowledge (a theory of biology shouldn’t contradict a well-established theory of physics). Fourth, it’s simple (it doesn’t multiply premises beyond necessity; it incorporates new data in a natural, as opposed to a strained, way). Alongside testing, these can be posed as straightforward questions brought to a theory:

Is it testable?

Is it fruitful?

Does it have scope?

Does it accord with our well-established background knowledge?

Is it simple?

Together, these concerns are sometimes referred to by scientists as the criteria of adequacy.
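Purely as an illustration (a minimal sketch in Python, not a procedure this chapter prescribes), the criteria of adequacy can be treated as a checklist brought to any theory. The criterion names below come from the list above; the example theory and the marks given to it are hypothetical, invented only to show the checklist in use.

# A minimal, hedged sketch: the criteria of adequacy as a checklist.
# The example theory and its marks are hypothetical, supplied for illustration.

CRITERIA = [
    "testable",
    "fruitful",
    "has scope",
    "accords with background knowledge",
    "simple",
]

def unmet_criteria(marks):
    """Return the criteria a theory fails, given a dict mapping criterion -> bool."""
    return [c for c in CRITERIA if not marks.get(c, False)]

ad_hoc_theory = {
    "testable": False,                           # phrased so nothing could count against it
    "fruitful": False,                           # makes no predictions that pan out
    "has scope": True,                           # claims to explain a great deal
    "accords with background knowledge": False,
    "simple": False,                             # keeps adding premises to survive objections
}

print(unmet_criteria(ad_hoc_theory))
# ['testable', 'fruitful', 'accords with background knowledge', 'simple']

A real appraisal is, of course, a matter of judgment and degree rather than simple true/false marks; the booleans here only keep the five questions in view.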

Abduction: reasoning to the best hypothesis. But it’s not enough to be adequate. Even if a theory passes through these five questions with flying colors, it still has to survive the judgment of the sixth and final criterion: Is it the best theory? It can’t be merely a good, interesting, or adequate theory. It needs to be the very best on offer. Isaac Newton’s theory of gravity works, but Albert Einstein’s has greater scope, and so works better. (Sorry, Isaac.) And perhaps someday a better theory of gravity will arrive, overthrowing Einstein’s. Scientific knowledge is always provisional. Metaphorically, Galileo’s telescope never comes down.

Fact-value entanglement. In one sense, when working with scientific method, we’re not dealing with values, but facts. Noticing, for instance, that evolution frequently entails competition (a fact) doesn’t tell you whether you should be competitive or cooperative with someone at work (a value). No “is” demands an “ought.” It was the philosopher and historian David Hume (1711-1776) who first made this important is-ought distinction. But while it is often crucial to maintain the is-ought distinction in reasoning (noticing when one is pointing to a fact, an “is,” and when to a value, an “ought”), it should also be observed that “is” and “ought” are difficult, arguably even impossible, to wholly disentangle in practice. Look again at the six criteria scientists broadly agree should be deployed to reach the truth of matters, and to lock down things we can take to be facts. A good theory:

(1) should be testable

(2) should be fruitful

(3) should have scope

(4) should slot naturally into our already well-established scientific background knowledge

(5) should be simple

(6) should be the best theory on offer

But notice all the shoulds in this list. In other words, our criteria for arriving at a thing we take to be a fact are laden with value statements (facts should be this, facts should be that, etc.). If, for example, we value the criteria of adequacy, then evolution is vastly more reasonable than alien intervention in explaining the variety of species on our planet. But why should reality and the truth be a matter so readily testable? Why do we value fruitfulness, scope, and coherence? Why should the things we take to be facts be simple, possessing an economy of premises, rather than elaborate and complex, possessing a multiplication of premises? And isn’t it obvious that there are many things that we take to be true that fail to meet one or more of the criteria of adequacy? Thus, even if we are convinced that we have adopted the best values and models for generally getting at the truth of matters, we cannot wholly disentangle the processes we have chosen to get at the facts from what we take to be the facts themselves. The criteria we value for bringing us to high confidence that we’ve reached the truth on a matter cannot, like a ladder, be kicked to the ground after reaching the roof. If we kick away the ladder, we lose our basis for certainty as to where we actually are. And yet our ladder–which includes the rungs of our value-laden criteria for reaching the truth–really only gives us the ability, if we are being honest, to proceed with caution about our statements of fact. Our values and models condition and infect our facts. Our human condition is such that we never have a wholly objective, value-free, and unmediated relation to reality, truth, and the good.

Writing 1.8.1. Pretend the internet doesn’t exist. Now, like Oedipus addressing the Sphinx, bring a question to the cosmos and puzzle over how one might tease an answer from it using scientific methods. Write out the process you would use to discover the truth of the matter. Think about the level at which you might be satisfied to get an answer. For example, if you asked why it is that old buses show such visible evidence of rust, at what level would you regard the question as answered (at just the level of being left out in the rain, or an answer from chemistry, etc.)? Perhaps speculate on tentative competing answers to the question as if you were beginning a process of abduction. In any case, if you really wanted an answer to a question concerning the cosmos, how would you go about getting it, absent a Google search? (Example questions: Do birch and eucalyptus trees shed their leaves at the same time of year, and if not, why? Why is glamour a magnet to some people, but not others? Why did Neanderthals become extinct?)

Writing 1.8.2. Use the criteria of adequacy to evaluate a claim. Is it testable? Fruitful? Does it have scope? Is it conservative (does it accord with what we’re reasonably confident we already know)? Is it simple? Then add to the five criteria of adequacy one more question, the abductive question: Is this the best thesis on offer? Write in your journal your observations and conclusions surrounding the claim.

Writing 1.8.3. Reflect on an is-ought distinction, noticed by yourself or another. Wrestle in your journal with the difficulties of the distinction. It may be true that no “is” demands an “ought,” but is it wholly true that no “is” should ever inform our “oughts”?

Writing 1.8.4. For your journal: Can we ever wholly disentangle what we take to be facts from our values? Can we have an unmediated encounter with truth? 


A Mini-Course In Critical Thinking For Writers. Concept 1.7: Distinguishing Best Explanation From Ad Hoc Explanation

I thought I would attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter would be a mini-course in critical thinking for writers. To have a look at Concept 1.6, click here; Concept 1.5, click here; for Concept 1.4, click here; for 1.3 here; 1.2 here; 1.1 here.

Concept 1.7. Distinguishing best explanation from ad hoc explanation. A sign that you may not be seeking the best theory surrounding the truth of a matter, and are instead protecting a favored theory—one you want to be true over all others—is if you’re doing a lot of ad hoc reasoning in response to objections to it. We might call this ad hoc-ing. Ad hoc is Latin for “for this,” as in for this explanation or moment only, or added for a special purpose, as when a business or bureaucracy forms an ad hoc committee to address an unforeseen situation. Ad hoc reasoning is deployed in, as it were, unanticipated or emergency situations; i.e., situations where a thesis or claim has failed to foresee an important objection or is at an impasse. When you engage in ad hoc explanation, you’re trying to save a favored thesis or belief from pointed and skeptical questions—or new and disconfirming data or competing theses—by seat-of-the-pants rationalizing that cannot be generalized to other theses of the same type. Here are two example sentences deploying ad hoc (seat-of-the-pants) reasoning:

My psychic powers failed today because the audience had too many doubters in it.

UFOs exist, but they never land on the White House lawn because they prefer not to be seen.

These explanations save the theses in question (the claims that psychic powers and UFOs are real), but at the expense of adding additional claims to them (psychic powers are real and they don’t work in the presence of doubters; UFOs are real and they don’t want to be known). That is, they add convenient, “just so” premises to blunt the force of obvious objections: If you have psychic powers, why did they fail in the presence of witnesses today? If UFOs exist, why do they never land on the White House lawn?

With the addition of a premise (a claim supporting another claim) to each thesis, the theses become less simple and probable—though still logically possible. In ad hoc premise adding, skeptical questions and new data are not really being anticipated or naturally incorporated into a theory or claim, but deflected with an additional claim or premise that is logically possible, but maybe not subject to empirical verification (reality testing). Each new premise added to the original claim thus renders the explanation less plausible. But because there are lots of logically possible ways that the world can be that cannot be verified by evidence—we may, for example, be a dream in the mind of a butterfly—if you are willing to believe things absent evidence, then you can engage in a lot of “ad hoc-ing” to save your favored beliefs, theories, explanations, and behaviors from skeptical inquiry and new data. Ad hoc premise generating can also be a sign that someone is arguing in bad faith (their motivations for posing such explanations are something other than the truth or the good).

One way to push back against ad hoc rationalizers is to deploy in response Occam’s razor, formulated by William of Occam (1285-1347) in this manner: “No more things should be presumed to exist than are absolutely necessary.” That is, if you can keep things simple, do it. Don’t multiply premises unnecessarily. Ad hoc, seat-of-the-pants rationalizing multiplies premises; Occam’s razor shaves them off. Maybe psychic powers fail in the presence of skeptics—and UFOs never land on the White House lawn—because neither psychic powers nor UFOs exist.

“Seek simplicity and distrust it.” So the principle here is: quite often, and arguably most often, the simplest explanation is best. And in some instances, mathematics can be deployed to support Occam’s razor. As a matter of sheer probability, for example, two inductions both being true is always going to be at least slightly less likely than a single induction being true. Each time a premise is added to an inductive thesis, the odds of the combined premises being true must necessarily come down (induction, recall, is about probabilities, not certainties). If, for instance, you’re 90% certain a particular woman is a Democratic voter, 90% certain she’s vegetarian, and 90% certain she signed your friend’s animal rights petition, the odds that you are actually right about all three of these in combination are not 90%, but drop to about 73% (.9 x .9 x .9 = .729). If you also infer, say, with a confidence of 65%, that she’s a feminist, then the odds that she’s all four of these things come down further still (.729 x .65 ≈ .474, or about 47%). With just four inferences or assumptions surrounding the woman in question, your odds of being right about her on all of these matters have plummeted to under 50% (assuming you scaled your levels of confidence surrounding each claim accurately in the first place, and that the claims are independent of one another). Occam’s razor proceeds with caution in the multiplication of premises, thus increasing our odds that we’re on the right track.
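Here is the same arithmetic as a minimal sketch in Python. The claims and confidence values are the ones from the example above; the one assumption, flagged in the comments, is that the claims are treated as independent, so their probabilities simply multiply.

# A minimal sketch of the arithmetic above: multiplying independent confidence
# estimates shows how quickly the joint probability of a cluster of inferences drops.
# Assumes the claims are independent of one another.

def joint_confidence(confidences):
    """Probability that every estimate in the list is correct, assuming independence."""
    joint = 1.0
    for c in confidences:
        joint *= c
    return joint

confidences = [0.90, 0.90, 0.90]      # Democratic voter, vegetarian, signed the petition
print(joint_confidence(confidences))   # 0.729 -> about 73%

confidences.append(0.65)               # ...and, less confidently, a feminist
print(joint_confidence(confidences))   # about 0.474 -> under 50%

The same point can be made without code: every premise added to a conjunction can only hold its probability steady or drag it down, which is the mathematical face of Occam’s razor.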

But a caution to simplicity (Occam’s razor) as a criterion for evaluating truth is offered by the mathematician and philosopher Alfred North Whitehead: “Seek simplicity and distrust it.” Why distrust it? One reason is that humans tend to find comfort in things they can control, and simple models or explanations might thus serve an emotional bias for control, distorting the complexity of the matter at hand, as in “the politician’s behavior can be summed up in one word: greed.” If you accept this simple heuristic (rule of thumb, model) for the politician’s behavior, you’ve got a lot of control over the processing of news you encounter about him or her, and you don’t have to expend energy thinking about it, but you may judge his or her words and actions wrongly, or fail to anticipate how those actions might impact your life. Simplicity as a criterion can arrest a deeper inquiry, and can open people up to such things as the availability heuristic (landing on the nearest and simplest rule of thumb, model, or map that comes to mind for explaining a problem or situation). Occam’s razor, used too casually, can signal lazy thinking.

So ideally, the critical thinker wants to locate heuristic rules of thumb, models, and maps that, like Goldilocks’s porridge, chair, and bed, are just right. (That is, useful and attending to the right signals in the noise, neither more nor less complicated than necessary.)

Writing 1.7.1. Imagine yourself as a playwright or screenwriter generating a funny scene between two characters. One has made a claim that is a lie, and the other is suspicious. Have the lying character field and answer pointed questions from the skeptical character, engaging in ad hoc, seat-of-the-pants explanation all the way.

Writing 1.7.2. Select a claim you regard as especially complicated and larded with unnecessary, ad hoc rationalizations, and write a paragraph slicing and dicing it with Occam’s razor. What’s a simpler explanation or thesis, and what makes it better than the claim you’re deconstructing?


A Mini-Course In Critical Thinking For Writers. Concept 1.6: Distinguishing The Logically Possible From The Actual

I thought I would attempt the first draft of a college-level textbook, writing it directly into my blog, bit by bit. Feedback and recommendations in the thread comments are welcome, either encouraging or critical. The first chapter would be a mini-course in critical thinking for writers. To have a look at Concept 1.5, click here; for Concept 1.4, click here; for 1.3 here; 1.2 here; 1.1 here.

Concept 1.6. Distinguishing the logically possible from the actual. As the three laws of logic suggest—and the early Wittgenstein observed—what we can say meaningfully about the world consists of an infinite number of logically possible sentences, but never impossible ones. “The penguin flew to the moon” is a logically coherent and meaningful sentence. Whether it is actually true or not is another matter. “The penguin flew over and under the moon at the same time,” however, is not a logically coherent sentence. It violates the law of non-contradiction, and you can’t even visualize it. “The penguin sort of flew to the moon and sort of didn’t” is also not a logically coherent sentence. It cannot be visualized. It violates the law of the excluded middle. “The car is in the garage and on the street” is also not a coherent sentence. “I’m a married bachelor” is not a coherent sentence. You cannot visualize, let alone think, such sentences. Thus, when we say that something is not, strictly speaking, logical, or not logically possible, we are speaking of words in relation. If the words make no sense in relation, then they cannot correspond to any actual state of affairs in the world itself.

Making distinctions is often central to reasoning, and so one way to think about the merely meaningful sentence in contrast with the factual sentence is to distinguish the logically possible from the physically possible, the technologically possible, and the actual. Something might be logically possible—a penguin flying to the moon—but not physically possible. When we test penguins’ ability to fly, we find that, though they are birds, they in fact can’t fly at all, let alone to the moon. It may one day, however, be technologically possible to put rockets on the backs of penguins that are powerful enough to propel them to the moon. We might also one day engineer bionic or robotic penguins that can fly over and survive on the moon. One day, perhaps, the moon might be terraformed (made earth-like, with an atmosphere). You may thus judge that it is plausible—or even probable—that technology will one day make it possible for biological penguins to fly around and live on the moon. But that still doesn’t mean it will actually happen. What is plausible, possible, or probable does not always become actual. And if we put rockets on the backs of bionic penguins, we run into disputes over definition: is a bionic penguin with a jet pack, capable of surviving on the moon, still a penguin?

We also might distinguish the logically possible from the actual in a still more fine-grained way, holding up a logically possible proposition for scrutiny from the vantage of a continuum, gray-scaling it (a rough numeric sketch follows the list below). Bacterial life on Mars may thus be deemed:

highly improbable

improbable

plausible

possessing even odds

probable (possessing better than even odds)

likely

highly likely

nearly certain

inevitable

logically necessary

actual
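One way to make the continuum concrete is to attach rough numeric bands to its labels. The sketch below does this in Python; the bands are illustrative assumptions only (the text assigns no numbers), and “logically necessary” and “actual” are left off the scale because they are statuses rather than probabilities.

# A hedged sketch: rough, assumed numeric bands for the continuum's labels.
# The bands are illustrative only; "logically necessary" and "actual" are
# statuses rather than probabilities, so they are omitted from the scale.

BANDS = [
    (0.05, "highly improbable"),
    (0.25, "improbable"),
    (0.50, "plausible"),
    (0.70, "probable (better than even odds)"),
    (0.85, "likely"),
    (0.97, "highly likely"),
    (1.00, "nearly certain"),
]

def label_for(p):
    """Return a continuum label for a probability estimate p between 0 and 1."""
    if p == 0.5:
        return "possessing even odds"
    for upper, label in BANDS:
        if p < upper:
            return label
    return "nearly certain"

print(label_for(0.10))  # "improbable" -- e.g., one estimate for bacterial life on Mars
print(label_for(0.90))  # "highly likely"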

Another example: someone might claim, “On the dark side of the moon is an advanced alien civilization from another galaxy, hiding,” which is logically possible, but how plausible is this? One might also make other distinctions. Observe, for instance, this claim: somewhere in the observable universe, another planet has evolved that looks almost exactly like our own. It’s logically possible, and for any particular planet it would seem highly improbable. Yet, because of the vast extent of time and space involved, it may nevertheless be deemed likely, or even inevitable, that such a planet exists. If the universe is infinite in space and time, it might even seem to make such a claim logically necessary.

Deductions, inductions, and abductions. With logical possibility and the three laws of thought or logic stabilizing our words in their definitions and our sentences in their coherences, we can move further over the bridge of logical possibility (the many things contending for the truth concerning a matter) to truth (that one thing that is actually the case). We can do this by evaluating sentences (claims) through deduction, induction, and abduction. Two of these methods for getting at the truth of matters are ancient, going back to Plato and Aristotle, and the third (abduction) is more recent, going back to the mathematician and philosopher Charles Sanders Peirce (1839-1914).

Deduction and induction are grounded in the syllogism, which is two premises supporting a conclusion. Abduction is grounded in criteria and data evaluation, reasoning one’s way to the best explanation concerning a matter via experience or the scientific method. All three–deduction, induction, and abduction–are easy to learn and easy to remember, both as to what they are and what distinguishes them. The classic example of a deduction is the following:

[First premise] Socrates is a man. [Second premise] All men are mortal. [Conclusion] Therefore, Socrates is mortal.

In other words, if the two premises in a deductive syllogism are true, the conclusion is 100% certain. Socrates is indeed a man; all men are indeed mortal; therefore it’s 100% certain that Socrates is mortal. It must be the case.

By contrast, induction concerns conclusions that are less than 100% certain:

Socrates has a runny nose this morning. It’s allergy season. Socrates may have an allergy.

Again, induction deals with conclusions that have some degree of plausibility or probability attached to them, and so is distinguished from deduction, which deals with 100% certainties. And abduction takes induction further, brainstorming possibilities and ranking them:

Some logically possible theories that might account for Socrates’s runny nose are the following: allergy; a cold virus; psychosomatic illness; the devil; a side effect of medication; an invisible gremlin tickling Socrates’s nose with a tiny, invisible feather. For a variety of good reasons, I dismiss some of these as ludicrous, others as unlikely, and therefore choose allergy as the most likely, best explanation.

In other words, abduction is reasoning to the very best thesis or explanation. In trying to get at the truth of a matter, you don’t want just any logically possible induction or hypothesis that accounts for the facts surrounding it. Instead, you want to lay out your options and reason your way, both logically and empirically (that is, experientially), to what appears to be the best hypothesis. Abduction is attentive to new data, seeking to reconcile it to what one already thinks one knows (one’s background knowledge).
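As a sketch only (the candidate explanations come from the Socrates example above; the plausibility scores are invented for illustration and would, in practice, be the product of reasoning and experience), abduction can be pictured as laying out the logically possible explanations, scoring them, and keeping the best on offer:

# A minimal, hedged sketch of abduction as ranking. The scores are invented for
# illustration; in real abduction they are earned logically and empirically.

candidate_explanations = {
    "seasonal allergy": 0.60,
    "cold virus": 0.30,
    "psychosomatic illness": 0.05,
    "side effect of medication": 0.04,
    "the devil": 0.0,                              # logically possible, dismissed as ludicrous
    "invisible gremlin with a tiny feather": 0.0,  # likewise
}

best = max(candidate_explanations, key=candidate_explanations.get)
print(best)  # seasonal allergy -- the best explanation on offer, pending new data

The scores, of course, are where the real work lies; abduction asks you to justify them and to revise them as new data arrive.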

Writing 1.6.1. Describe in your journal a state of affairs that is logically possible, but not yet actual, and appraise how plausible you think it is that it will come to pass or be. (For example, the end of war over the next century; the prospect for an alternative energy economy to evolve over the next three decades, etc.)

Writing 1.6.2. Write a deductive syllogism and an inductive syllogism (two premises accompanied by a conclusion). Recall that, with a deductive syllogism, if the premises are true, then the conclusion is 100% certain, but with an inductive syllogism, there is only a possibility that the conclusion is true. (Socrates will die vs. Socrates probably has a cold).

Writing 1.6.3. Generate a list of plausible inductions surrounding a question (either an imaginative question or a factual one), and reason your way to what you regard as the best or most important induction about it. In a paragraph, discuss your conclusion, and why you see it as the best answer to the question. Question examples: Why are dogs preferred to birds as pets? How might my life turn out if I change my major from biology to art? If we ever detect a radio signal from an alien intelligence, what will most likely be the chief consequence for humanity in the decade after the discovery?

Writing 1.6.4. Write some sentences that clearly violate the laws of logic and think about what it is, exactly, that makes them either impossible to visualize or to think.

Writing 1.6.5. Write a paragraph in which you describe something that might be logically possible, but which you take to be physically impossible or technologically impossible, and explain why. For instance, a person moving from one side of our galaxy to the other within a single second is logically possible, but it is neither physically nor technologically possible, because it would require traveling far faster than the speed of light, the cosmic speed limit (light travels at about 186,000 miles per second, and nothing can travel faster).
