This is interesting. The mental has been “understood” in many (relatively unsatisfactory) ways, from a standalone, self-contained, wispy kind of (non-)thing (Descartes) to some “thing” more akin to foam on the crest of a wave (re: physics generally).
Michael Silberstein thinks otherwise (concerning both of these approaches). His suggestion is, mental properties and everything else (the physical, the chemical, the biological/psychological, even the social/cultural) can be more productively understood as “entangled” systems/subsystems, all “mutually embedded and…interconnected.” So instead of worrying about ‘downward’ causation (which IS kinda hard to grasp), he calls his approach systemic causation, a universe that is “intrinsically nested and (again) entangled,” each system interacting and cutting across the others. So there is no ‘up’ or ‘down’, rather ‘around and around’, like the stirring of a giant cauldron of cosmic soup.
***
The Road of Excess leads to the Place of Wisdom. That’s from Wm. Blake, more or less. And what is that Wisdom? Moderation in all things! That’s Aristotle, more or less.
***
New information requires interpretation, an account of what the information means. It (interpretation) is modeled out of the materials of our past, viz., beliefs, themselves woven from prior beliefs, and so it goes. All rather risky business. (See Quine’s The Web of Belief.)
That’s interpreting in a nutshell.
***
If there is experience (and there is), then there’s a subject of experience. The jury is out concerning whether experience ‘emerges’ from the non-experiential or exists separately. Concerning the subject of experience, we don’t (yet) know if it is substance or accident. Ontologically, I’m banking on substance (wishful thinking). Concerning substance and accident, there is more agreement on the latter. Substance strains the imagination.
***
Concepts of the self (from Philosophy Now, Issue 107, Joshua Farris):
Prof. Farris,
I enjoyed your article on personal identity and have a few thoughts (I hope you don’t mind). Please bear in mind, however, that I have no formal training in either philosophy or theology and am only an occasional reader in either (less in theology). But because of my age (72), the subject of personal identity, which I assume dovetails nicely into the subject of the self, as well as consciousness, has become something of a project for me. Otherwise I would just be another old man building birdhouses in the garage (which I am).
The Body View. “…persons are identical to their bodies.” “…animalism.” I would argue (suggest) that dementia strains against dismissing a body-self identity — more so than, say, a toothache, or even cancer (does my wife ‘have’ dementia — where we at least don’t have to worry nights about her self/soul? — or is her ‘self’ being dragged along the same path as her body/brain?). A brilliant sentence in Arthur Frank’s The Wounded Storyteller nicely encapsulates this: “Am I a body, or do I have a body?”
Concerning the building-blocks of the body changing through time (the Ship of Theseus paradox, right?), G. Strawson in Selves claims he himself doesn’t experience a continuous, unified sense of self, but rather ‘selves’, which he calls the Transient View, claiming he’s in good company with the likes of Wm. James. I see his point (wasn’t I “different” at the party last night than at the funeral today?) even though I also sense something (some Thing) more enduring. As you may know, Strawson warns against concluding, from the mere experience of self, that a self exists. (Strawson has a nice couple of paragraphs (p. 14 in Selves) on philosophy and temperament, admitting that his own views on metaphysics could be due as much to his temperament as to anything else, “…a thought that strikes with particular force when one comes to the problem of the self.”)
The Brain View. I’ve read some of the literature on type and token identity, emergence, and eliminativism (this latter, very depressing, I must say). Do you know the work of Stuart Hameroff and Roger Penrose? Just in case: Hameroff, an anesthesiologist, became interested in consciousness when he first began to put people to sleep some thirty-plus years ago. In a nutshell, he feels the AI crowd has it all wrong, claiming, as they (apparently) do, the seat of consciousness to be at the level of the neuron; rather (and this is where Hameroff’s association with Penrose, the mathematician/physicist, comes in), it’s deeper down, at the level of the microtubules, the workings of which involve quantum machinations. By implication (because everything physical involves quantum weirdness), consciousness itself might be woven into the very fabric of the universe (smeared out, to change metaphors). Such a thought of course could lead us into the jungle of panpsychism — no time for that here. I must say, however, the thought gives me some solace, even though such a possibility might have little to do with my (ontological) self (assuming such a thing) remaining intact after I die. (By the way, you can find a grainy-blurry film of Richard Feynman on YouTube giving a talk in 1979 in New Zealand, trying to explain to the unhappy audience what quantum mechanics is. This is the one where he makes his famous statement that he doesn’t understand it either.)
Memory and Character. “…memory is not a sufficient condition for personhood”…which brings me back to my wife’s dementia. It’s a curious fact (from my experience) that a person who loses her memory very quickly begins to lose friends and family. People just stop coming by. I’ve sometimes wondered if this might be because there is an intuitive understanding that the afflicted individual is just “not there.” Yet all these people (in our case) are believers in an afterlife of the Christian variety, which would suggest that the afflicted one is not only there but there in spades, still. Character traits are of course a woolly topic, always hotly debated in personality theory, so I’ll spare you my thoughts on them.
The Simple/Soul View. I would bet that the soul is highly visible in the venues where you teach. I grew up in a similar milieu…down the street from Houston Baptist, actually, on Ave. B in Bellaire (I haven’t lived in Houston since the mid-60s). I have to admit, I’ve given religion short shrift through the years, in all its forms, except perhaps Buddhism (I’m not a practicing Buddhist, whatever that might entail, but Buddhist precepts have held a certain resonance for me –– unlike Christianity, which seems highly improbable, what with its claim to exclusivity and punitive rewards for the slow-witted.) (Just for the record, my Uncle Frank, a Southern Baptist minister, came up with a scheme during the Great Depression that saved the Southern Baptist Convention from bankruptcy. I never forgave him.)
The Not-So-Simple Simple View. “…persons are identified with a particular [first person] perspective” (Baker). This view strikes me as small potatoes. When reading this last section, an old memory popped up: fifty-some-odd years ago I was sitting in a bar in San Francisco listening to a female impersonator panning Peggy Lee’s rendition of “Is That All There Is?” (“…then keep on dancing,” as the song continues). I mean, there has to be more to it –– basing personal identity (simply) on a first-person perspective doesn’t seem to give us much. Too, I’m not so sure that the first-person perspective depends on proper brain functioning, not entirely. My wife, even in her relatively advanced stage of Alzheimer’s, still seems to have a good chunk of first-personhood about her: it’s second- and third-person that give her challenges (well, ‘give me’ would be more to the point). My own third-person experience of dementia might be characterized as a kind of revved-up Transient View of self a la Strawson.
Oh, I saw in your ‘Further Reading’ section John Perry’s book Personal Identity. I have that book — yellowing on my bookshelf. I tried, but most of it was beyond me — I’m thinking because, as they say, the brain shrinks by about a third by my age. Perhaps the fact that I can read it at all should give me solace.
Well, if you’ve read to this point, thanks. I realize of course my own thoughts are small potatoes, but then I’m not much of a cook. Funny aside, the editors at Philosophy Now were so desperate to fill space in Issue 105 that they slipped in my offer on how society should be structured (small thoughts on a big subject). I was supposed to get a “random book” for my entry. I never did. Oh, well.
Have a great day.
***
It’s interesting, is it not, how we drum ourselves down into smallness, losing sight of the miracle of (I don’t like the word ‘miracle’ but cannot think of another at the moment), not our lives per se (which can be quite horrible), but of being, period.
There can be (indeed, is) what we’ll call a phenomenal self, whether or not there is an ontological self. This is to say, we have an experience of being a such-and-such, un no sé qué, a thing: a single thing, a mental thing, a persisting-through-time thing, an agentic thing (all this from G. Strawson). Can we live with the fact (if it is a fact) that there is (turns out to be) only a phenomenal self? All the world’s religions bank on the addition, if you will, of an ontological self — with maybe the exception of Buddhism (certain forms of Buddhism). The phenomenal thingness of self seems an attenuated thing (a homeostatic property cluster? — see Richard Boyd on this interesting idea); whereas the ontological thingness of a self would be much more substantial (more enduring, as in “beyond the grave”). I suppose the latter would entail that the self is a natural kind — a thing over and above its properties? Near-death stories seem to suggest this may be so, but is this good evidence?
***
Could it be that consciousness, the what-it-is-likeness––batness, et al.––arises not from but beyond (outside?) the brain, from somewhere out in spacetime (or wherever), then becoming (for example) particularized, or individuated, in us––perhaps even in things, stuff, ‘mere’ matter, as the ‘hard’ panpsychists might have it? Google the work of Stuart Hameroff (an anesthesiologist) and Roger Penrose (a physicist and mathematician), their theory of where consciousness might be occurring in the human brain––viz., at the level of the microtubules deep within the neuron via the (oh so hard to grasp) machinations of quantum mechanics (collapse of the wave function and all that). Hameroff’s reasoning might go something like this: The quantum world of the tiny is ubiquitous, happening everywhere and in every thing (every material thing: animals, plants, rocks, in all the elements of the periodic table). So if the workings of consciousness are the result of quantum physics, and quantum physics is not confined to living systems (which it is not), then consciousness might be better understood (or at least considered) to be woven into the very (quantum physical) fabric of the universe. Could it be? And could such a thing even explain, say, the near-death experience? out-of-body experiences? Sound farfetched? Maybe. But if you want to read a first-rate metaphysical debate on consciousness, pick up Consciousness and Its Place in Nature, Galen Strawson et al. But be warned, it’s a slow, challenging read. (Strawson once complained that all philosophy is read too quickly, so don’t feel bad if you spend an evening close-reading a page, or even a paragraph.)
***
Identity theory, which states that mental states are “nothing but” brain states (that is, when a brain state changes, the mental state changes), seems somehow truncated, incomplete, for how do we then account for the brain states “doing” that mental thing?––are we simply moving the mental “down” to the brain and continuing with the problem there? Arthur Koestler, in The Ghost in the Machine (1967) says something to this effect: “As we move downward in the hierarchy…we nowhere strike rock bottom, find nowhere those ultimate constituents which the old mechanistic approach to life led us to expect. The hierarchy is open-ended in the downward, as it is in the upward direction.” (quoted in Skrbina, 2007).
***
Life is a series of events (thousands, tens of thousands).
Each event contains something old and something new, drawing on the past as it soaks in the present.
Events have soft boundaries, making it difficult to determine when one ends and another begins.
Events are ordered in time and reshuffled in memory.
Events are interpreted and remain open to reinterpretation.
Events are truth makers and reality enhancers.
***
John Searle in Mind says this: “Materialism tries to say truly that the universe is entirely made up of physical particles that exist in fields of force and are often organized into systems. But it ends up saying falsely that there are no ontologically irreducible mental phenomena. Dualism tries to say truly that there are irreducible mental phenomena. But it ends up saying falsely that these are something apart from the ordinary physical world we all live in, that they are something over and above their physical substrate.”
For me, being an “ontologically irreducible mental phenomenon” is only academically interesting, for I'm still clueless as to what the outcome of my being truly entails. For me, as for any creature, being is a project, not a fundamental ontological nature.
Wait, no, that can’t be right. I want my existence to be not just for now but forever. This seems, however, to be asking a lot. But can it be helped? Every creature wants to live one moment more––with qualifications of course, like being in a general state of health, security, and with enough life-sustaining resources. But this may be asking too much: no material state can sustain, or be sustained in, such a steady state for long, much less forever. Perhaps such a realization was the original impetus for positing a nonmaterial state of being, with religions coming along to (hopefully) make it possible.
Just a thought.
***
The choice between free will and determinism seems unsettling, perhaps because, well, all cultures (I believe all cultures) assume that each individual is responsible for his or her actions, plus the fact that our most cherished assumptions (personal responsibility and decision making, for example) will make little sense if the jury comes down on the side of determinism. I mean, what would even be the point of deliberating on our (supposed) options? (Note, there is a third option: compatibilism, claiming we can have our cake and eat it too, but that’s a story for another day.)
Famous examples of determinism can be found in Sophocles’s Oedipus Rex, in Homer’s Iliad, in the Fates (thus fatalism), and in Aeschylus’s Agamemnon; karma, central to Hinduism, is another; in certain Jewish sects, such as the Essenes, fate governs all; among the Islamic Jabarites, Allah determines all; and of course there is that most famous and brutal of Christian predestinators, John Calvin.
Why does this seemingly intractable problem persist? And why does the mere possibility of determinism being the case strike us as so disconcerting? The American philosopher John Searle believes it’s because physical reality is deterministic (a claim with a mountain of empirical evidence behind it), while our subjective experiences seem to suggest we are making choices as we go about our day. And, Searle adds, there doesn’t seem to be a way to reconcile these two facts, not currently.
But what about this? Maybe the free will/determinism debate is based on a false dichotomy. Why couldn’t free will be an emergent property? I know, I know, this would be just another example of epiphenomenalism, right?––froth on the wave, as Searle says. So in order to endow free will with some heft we would have to assign it some causal powers going back the other way, back “down” to the neural level, only currently there doesn’t seem to be any evidence for this.
***
There are instances when a species seems to have figured out how to get another species to do its dirty work. For example, I heard about a spider that will capture a prey animal (a caterpillar, I think it was), wrap it up in its web, not eat it, but keep it alive (feed it!) in order to store its own eggs inside the caterpillar, where, when they hatch, the young will have a ready-made food source, slowly eating the caterpillar’s insides. But get this: the little spiders will not eat the vital parts, like the heart or liver, until they have eaten the less vital parts! I don’t know how Leibniz would factor this into his “best of all possible worlds” thesis, though I can imagine what Voltaire would say (cf. Candide).
Then there’s this. The common cuckoo will lay its eggs in another species’ nest, the cuckoo’s eggs lying alongside those of the host bird. Well, through the process of natural selection (we can assume), the cuckoo’s eggs have an unusually short incubation period, hatching before the eggs of the birds whose nest they’ve usurped; the cuckoo hatchling then pushes those eggs out of the nest. But get this: a cuckoo fledgling in the nest of another bird can mimic the calls of, not just one, but multiple young chicks of the unsuspecting parents, tricking the clueless parents into bringing food not just for one offspring but for a whole brood!
Frankly, I have a hard time with these stories, because they seem, I don’t know, not just cruel, but fundamentally unfair. And what do they tell us about the creative force, or forces, behind them? And why does the cruelty jump out at us? Why has the process of evolution built such a sense of unfairness into our brains?
Take nature shows on TV. Nowadays the cruelty is choreographed in (it wasn’t thus in the early days of TV). I remember seeing one not too long ago where these water buffalo were about to cross a river on the Serengeti, a river full of crocodiles. When the herd reached the bank of the river, all forward movement suddenly stopped, and the animals began to pile up on each other, trying to keep their footing and not fall in––like they knew what was coming…strength in numbers, the commentator explained, each waiting for its neighbor to jump or fall in. Well, inevitably one did, then another, and another, then hundreds, churning up the water in a frantic swim for the other side. And predictably, here came the crocs, wide-mouthed and hungry; and as luck would have it (luck for the camera crew), a big croc lunged at a smallish juvenile and sank its teeth into its hindquarters. The croc splashed around to get a better hold––but get this: the juvenile just looked back at its attacker, like it was only mildly curious about what was happening. I mean, it didn’t even seem to want to struggle. The whole scene was horrifying. Who watches this stuff?
And again I ask: What is it about our own evolutionary history that makes this seemingly gratuitous violence even noteworthy? Wouldn’t you think it would seem normal, because it is so ubiquitous?
***
Here’s a story, true. The language is a little stilted because it was written in the 19th century, so I’ll bring it up to date, some.
The author wrote:
“As I looked down into the market, there was this woman, bent on business, carrying a heavy pile of ornamental fabric, shawls or something, on her head. I watched her as she entered shops, trying to sell her merchandise. I watched her for a long time, hoping she might make a sale. She didn’t. But for all that she bore herself with a dignity that, frankly, was unsurpassed, bearing each rejection with grace. Only her persistence showed how anxious she was to earn money. I shall always remember that tall, hard-looking woman as she passed with firm step and noble balance through the streets. To pity her would have been an insult. The glimpse I caught of her hard life revealed to me something worthy of admiration. Never have I seen such discouragement so silently and strongly borne.”
What about this:
“Looking back at the age of eighty-eight over the fifty-seven years of my political life in England, knowing what I aimed at and the results, meditating on the history of Britain and the world since 1914, I see clearly I have achieved practically nothing.”
—Leonard Woolf, British politician, from his autobiography
The Journey Not the Arrival Matters, 1969
At first glance, the excerpt from the book strikes one as rather odd, and frankly, discouraging, until you come to the title of the book: The Journey Not the Arrival Matters, which has an old, hard ring of truth that’s difficult to deny. It reminds one of the American writer Gertrude Stein and her quip that once you’re there, there’s no there there. Funny.
I’ll say it again: the more concern we have for ourselves, the less we have for others: significant others and otherwise.
I saw a little girl on TV the other day. In perfect Arabic she said, “I haven’t eaten anything but some olives for four days.” Indeed, there was no time to eat, for in her little corner of the world, in the Syrian desert, she was being hunted, she and her little band as they tried to make it to the Turkish border. I think about her now almost every day. I want to think she’s eating a hot meal in Turkey.
Here. This is good:
A young monk was busy making the courtyard perfect for the bishop, who would be arriving later in the day. He trimmed the grass, cleaned up the bird droppings, the trash that had blown in from the street, even the leaves that had fallen from a small tree in the middle of the yard. Finishing, he sat and marveled at how perfect it all looked. As he sat, an old monk walked up and asked, “What are you looking at?” The young monk said, “The yard. It’s perfect, for the bishop.” The old monk looked for a while in silence, then said, “No, there’s one thing.” “What?” asked the monk, panic in his voice. The old monk walked to the middle of the courtyard, grabbed the young tree by its trunk, and shook it until leaves fell helter-skelter onto the manicured lawn. “Now,” said the old monk, “now it’s perfect.”
Moral? Hmmm. Life’s a mess? no matter what?
Jesus, the night before he was to be crucified, sat alone in the garden. (We can only wonder what was going through his mind.) Finally, he couldn’t help himself, he had to say it, because he was afraid! “Father,” he said (in so many words), “I don’t think I can do this. I don’t think I can go through with it. Is there some other way?”
“No,” said his Father.
Well, we know the rest.
To me, the story of the Cross is the story of life. It’s the story of the woman in the market, the British politician, the young monk, and that little dark-eyed girl. It’s me, too, and you.
It’s the journey, right? So maybe we should take a page out of Thoreau, who tells us to “travel light.”
***
There appears to be a limit to decomposition––taking things apart.
Take a watch. You take a watch apart to see how it works, thus gleaning the functional relationships among the various mechanisms of the watch. Logically, you could take the parts themselves apart, some of which may still be compound, and so on, and by doing this second step the knowledge gleaned will add still further to an understanding of the macro-workings of the watch. But even so-called simple parts, like a spring, might be further reduced (“decomposed”) by exploring their molecular composition, although the information gleaned at this level might not dovetail so obviously into the macro-workings of the watch (although it might: for example, how the molecular composition of the spring results in the spring’s elasticity). Finally, decomposing the molecules of the spring into their “parts” (their atomic structure) would seem to add little to any further understanding of the workings of the watch.
Such diminishing returns call into question the utility of any further decomposition and raise the risk of an infinite regress. This doesn’t necessarily discourage the curious mind, of course; it’s just that now we have left the watch behind.
***
Prof. Tye,
Hello. I’m sure you’re replete with emails, so I’ll be brief.
I’m a recreational reader of philosophy (crazy, I know), and in the process of reading your article on qualia in SEP this thought occurred to me:
In the paragraph beginning “The Ability Hypothesis appears to be in trouble...,” I have to say that the physicalists’ argument about Mary (“Once she leaves the room, she acquires these new modes of thought as she experiences the various colors...”) seems like quibbling, perhaps even a failed attempt at “saving the appearances” (for themselves). Oh, and your example in the following paragraph of the difference in sense between Cicero and Tully (like Frege’s famous morning star and evening star, right?) reminds me of G. Strawson’s claim (somewhere, I can’t remember where at the moment) that even sentences can have qualia.
Here’s an example for qualia that may be more forceful:
First, let’s get away from “red.” Too subtle. Let’s say Mary’s not interested in the brain’s encounter with color, but sex. After all, as any teenager can assure you, “learning” about sex in health class is not sex, any more than “seeing” red via brain imaging studies is seeing the color red.
Anyway, one night Mary washes off her gray makeup (which she never really understood anyway), checks to see if the graduate student posted outside her door is (as usual) asleep, sneaks away, goes to a bar, meets a nice young man, they talk about her research, he sees an opening (no pun intended), invites her to his apartment, and Mary has first-person, raw-feel sex.
Please, don’t tell me Mary won’t notice a qualia-tative (sorry) difference between her “encounters” with “sex” in her room and sex with her new (boy-) friend in his room. And please (I say to the physicalists), don’t give me that lame “the qualities the new concepts pick out are ones she knew in a different way in her room [lol], for they are physical or functional qualities like all others.” Oookay.
None of this of course may, at the end of the day, speak to the metaphysics of qualia. I personally feel that something might qualify as qualia through more “fine-grained” laboratory techniques yet to be developed. Relatedly, I’ve often wondered about the fine-grained neurological difference between seeing an image for the first time and seeing it a second time. Could that second experience be a hint for our elusive qualia? Depends on who’s reviewing the article, right?
Hope you have a great day.
Roger Tripp
San Antonio
***
I have this T-shirt, given to me by my daughter upon her return from a trip to Guatemala. The shirt is navy blue, and across the chest area in bold white letters is: “duyuspikinglish?”, and underneath in smaller letters is the word “Guatemala.”
I’ve worn the shirt around town on several occasions. For the most part, people don’t appear to notice the message, the joke, and the few that have noticed have laughed, all except my sister-in-law, who says she finds it insulting. My sister-in-law is Hispanic, with about a quarter Irish thrown in, and with a fair amount of education (she has a doctorate in psychology). Of the few people who have noticed the shirt, and laughed, all but one were Hispanic, and all but maybe two were female. My sister-in-law told me the quip across the front of the shirt is an insult to Hispanics, the message being that they should learn English. That surprised me. I interpreted it as making fun of Anglos, who notoriously rarely take the trouble to learn another language, expecting everyone to know English.
So, what is going on here? The bugaboo of interpretation, of course, reminding us once again that facts don’t speak for themselves: they need us to speak for them, itself another fact that needs interpreting.
***
Several years ago I pointed out to my daughter that a junkyard (we were retrieving her car from an impound lot) was as much a part of nature as a forest. She was incredulous. But is it not the case? Is a junkyard qualitatively different from an abandoned termite mound? Two species created the contents of both, and the remains remain. Of course the content of the junkyard takes exponentially longer to “disappear” back into nature, but the processes of coming into being and decay are identical.
There is of course a cultural divide that helps explain this. Take the book of Genesis, as one of many examples: Man was created separate from Nature (which is to say, in God’s image) and was to rule over Nature. Such might explain why civilization is commonly thought of as something of an artificial construct separate from the natural world, and why we think of ourselves as going “back to nature” when we vacation in one of our national parks.
Just a thought.
***
My wife and I went to the McNay today––you know, to see the artwork, to relax, to wonder at the genius of Norman Rockwell, etc., etc.
What we didn't count on was being watched...I mean really watched!...by guards: blue-suited, angry people from the looks on their faces. We didn't notice this at first...until, until I pointed at a small, plexiglass-encased Rockwell magazine cover (a reproduction, I was thinking). And (this is when it really got tense) a guard (blue-suited and angry-faced) approached from my blind side and said: "18 inches! You must stay back 18"!" Startled, I moved back, wondering what 18" looked like...and was I back 18"? Apparently so, for the guard seemed satisfied and stepped back, yet, yet, she wouldn't take her eyes off me, and the piece (the small, plexiglass-encased reproduction) all but disappeared from the wall for me, even though I pretended to continue to look at it. So, we moved on, but all I could think about was: 18", 18", 18".
Oh, and you know that little yellow sticky? the one that gives you passage to move from room to room? Boy, did that guard swoop down on a woman wearing an outfit in the same family of colors, the guard demanding to see proof of passage! The young woman, startled, sheepishly pointed to, oh, there it was...she was allowed to continue on, yet I worried for her safety: was she aware of the Law of the 18"?
The rest of the afternoon (for me) was exhausting. I tried to read the inscriptions next to the works, but because I was so nervous about being the required distance from the wall, I added what I estimated to be another 5", just to cover myself.
Be back? Probably not. I'm thinking about a visit to another museum, or maybe I'll just stay home and read a book.
***
Who we decide to have a relationship with may be the most humongous decision any of us will make in life, for now and forever, assuming it is a conscious decision and not something that just happens––like for sure we don’t decide who our parents will be. Who we decide on of course will depend on the availability pool: Mr. Right, for instance, may live just beyond our reach; Mr. Wrong right next door. But when we do decide, our options disappear like bubbles in seltzer. There’s no getting around any of this of course, but still it seems many of us could have done a better job of it. A whole industry, marriage counseling, has grown and prospered from our evident lack of skill in this area.
Fewer options are available concerning who we work with or around, where and when we live, and the social conditions of the moment. Life of course may give us more latitude as we age––more choices, more options––yet (always the caveat) these also lessen with age––particularly after midlife as we become less and less productive and more and more dependent. Nature once managed all this; today, advances in science and technology have made Nature’s job more difficult. It’s not easy (or moral or perhaps even legal) to withhold resources, particularly from the elderly. Unlike with the other species, the human neocortex has somehow complicated all this.
It seems, therefore, that serendipity still rules, although there will always be those who claim they’ve pulled themselves up by their own bootstraps––and some have, and against tremendous odds, keeping the nature/nurture debate alive.
For my part, I will continue to hold to that old maxim: Success in life comes from hard work and luck––but mostly luck.
***
I’m thinking, justice (justice as fairness) may only occur, or have a better chance of occurring, when a society is enjoying an excess of resources––with resources to give away. Not in all cases of course (there are always the exceptions): I’m reminded of the Biblical story of the widow’s mite. But even in John Rawls’ famous thought experiment, his “original position,” where the participants are organizing a society without knowing which position each of them will occupy (because they are working behind “a veil of ignorance”), it seems that the participants are still just working to mitigate any possible adverse circumstances that may impact them once the veil is lifted––working not, in other words, from a sense of fairness, but from selfish motives––wouldn’t you?––which reminds me of another clever thought experiment, Adam Smith’s “invisible hand,” where, paradoxically, public goods are also supposed to result from selfish motives. Smith worried about this, about how public morality could fit into such a scheme. I suspect Rawls did also. Which brings Thomas Hobbes to mind, his famous quip about life being “nasty, brutish, and short,” where the principal drive is for self-preservation and the primary fear is of a violent death. To guard against this, Hobbes invented Leviathan.
So are we sugar coating justice just a bit? Is justice more of a necessary evil than some kind of metaphysical good? I’m thinking of the Donner party, where resources became so low they just had to eat each other.
***
William James wrote that truth is something that happens to an idea. On first reading this, it was like eating a piece of candy (chocolate, actually), and I wanted more. So I “ate” it again, and again it was like eating, ah, more chocolate. So I thought, What the hell, and ate it again, and again, and...hummm, I don’t know, not so much now. You know the feeling. Another piece? I don’t think so. All those empty calories. Maybe tomorrow, although maybe I should stay away from chocolate, and candy period.
***
I have a simple idea concerning whether “the author” (the flesh and blood writer of a text) should be given some credence, politely ignored, or vilified as an evil genius bent on controlling the reading process (see Roland Barthes’ “The Death of the Author” for the seminal article on all this).
Personally, I prefer not to know too much about an author prior to reading. This of course is impossible in some instances, but still I like to keep it at a minimum. I especially try to avoid photographs of an author, for reasons I can’t fully explain––maybe because I spent my childhood listening to radio melodrama (I’m that old), where you would have to build scenes in your head from just the dialogue. Of course this happens in reading dialogue, but normally there is exposition (“show don’t tell”) to flesh out dialogue in a written text.
Anyway, I think the author should be given her due––at least a nod, maybe after a first reading––this way one might experience something of an ‘open’ reading, because it is true (my experience) that becoming familiar with the author’s life, say, or comments by the author about the text can influence a reading.
If you want to know more about how this death-of-the-author business got started in the first place, go online and listen to lecture 2 of Paul Fry’s Open Yale Lecture series “Literary Theory”––or you can download it on your smart phone.
***
Just a thought about multitasking––as in, “Oh, I just got an email––Wait, let me Google that––I’m putting that on my Facebook––Should I ‘friend’ (?) him?”
I’ve always claimed (with no empirical evidence, I might add) that multitasking (doing two or more things at once) is actually...what shall I call it...“rapid switching” (starting one thing, switching to another, stopping that, going back, starting a third––rapid switching).
Try this, Multitasker: add 2 + 2 while simultaneously subtracting 2 - 2. Simple enough. Or this: email your new “friend” while simultaneously parsing the line, “These are the times that try men’s souls.” If you can do these things simultaneously (which I define as at-the-same-time) you’re a multitasker, otherwise you’re a rapid switcher. Believe me, you’re a rapid switcher.
But don’t take my word for it. Try this. You’re on the freeway, the traffic is light (traffic lite), you’re traveling a route you know by heart, and you’re listening to “breaking news,” like, oh, I don’t know, World War III just started. Jesus Christ! This is big, and you focus! Suddenly, however, the traffic is not so lite, drivers are changing lanes, an 18-wheeler is trying to muscle into your lane (let’s assume none of this has anything to do with World War III; it’s just traffic). Naturally, you concentrate on the traffic.
Ah, what happened to World War III? World War III is gone! And when the traffic melds back enough into what passes for normal you hear the announcer say something like, “All citizens in the metro area should do this immediately.” What? And the radio goes silent.
Now this is interesting (this from Frontline’s Digital Nation): a recent study conducted at Stanford put several self-described master multitaskers to the test. It was found that, not only did the participants not do some things all that well, they didn’t do anything well. Why? Because they were, you guessed it, rapidly switching, and when you do that, your work degrades (so do your grades, as some undergraduates at MIT discovered––also in the Frontline episode).
Well, I thought I’d pass this information on.
Oh, I just got a text that I’m going to get an email about something I have to Google. Gotta go!
***
Stars die. Everyone knows that––or would, with a little reflection. Time, particularly cosmic time, may soften such reality. There’s the story of a speaker telling his audience that, in a billion years the sun will “go out.” (That information is wrong of course, but the essential fact remains.) A man in the audience reportedly jumped to his feet and cried, “A million years?” “A billion,” corrected the speaker. “Oh, thank God,” responded the man, then sat back down, seemingly satisfied.
What will happen, climate scientists tell us, is this. In a billion years (the same billion), the earth’s average temperature is expected to rise by 100 degrees Fahrenheit. This will change everything of course, gradually enough for human populations to have adjusted––as best they might––like developing the necessary technology for living underground. Keep in mind, however, that 1 billion is 1000 million (years), and the human(-oid) species we know will have adapted in significant ways, perhaps (I’m going to venture “probably”) into a different species entirely.
Anyway, when I first heard this, I didn’t jump to my feet, not literally, but something happened. So now when I’m making my daily rounds I often catch myself looking around and thinking, Well, that won’t be there...this will be gone. But here’s the thing: I feel sad, but somehow relieved.
***
It is a commonplace that, in today’s world, there is just too much to learn, which apparently wasn’t always the case, at least not as recently as the 18th century––if you were smart enough, curious enough, and rich enough (had the time). In this country, Thomas Jefferson was reputed to have been an encyclopedic learner.
Today, however, even with ample resources, brains, and willpower, you still won’t be able to do much more than dissect a modest area of a discipline. Perhaps the only way to overcome this is to read introductions, which abound. For my part, I’ve spent my life reading introductions: to history, metaphysics, the world of finance, literary criticism, postmodernism, ethics, European art, Buddhism––even typeface––and the list goes on, and on. But truth be told, I still feel “unread”––and confused, about all of it, and now I think I know why.
Too much human, too little on everything else. I made this discovery recently while watching Animal Planet, and that other series––The Universe––and I’ve come to the conclusion that, unless you know something about other animals, and other worlds, you’re (well, me, I’m) going to remain confused.
Knowing more about other species and seeing the world from a greater distance is helping me build a larger mental template. So now when I see humans acting like, well, animals, I remember, Oh, yeah, like those chimps, or those bowerbirds in Australia––something along those lines. Did you know that crows will take time out from their busy schedule to slide down a hill in the snow on their backs––just for fun? And does anyone but me realize that the earth is overdue to reverse the polarity of its poles––north to south, south to north––and that no one is quite sure what will happen as a result!? Oh, and has it ever occurred to anyone that Homo sapiens sapiens is not an endpoint in evolution? Are you kidding me?
***
All during my growing-up years I heard the story of the Good Samaritan, over and over, and it weighed heavily on my conscience. Yet, still, it was so remote...until I read about a particular research study, a famous one in the annals of social psychology. The study went something like this:
Seminary students were told they would be giving a talk on the Good Samaritan. It would be an important part of their grade. They would be doing this individually, to a group of people they didn’t know. It would be good practice. Each was given their time and place.
Of course, it was a setup. As it turned out, each student would have to walk to a far corner of the campus to give their talk, and each would be pressed for time, purposely held back for some bogus reason. Along the way, they would encounter a person lying on the ground, needing help. Would the students rushing to give a talk on the Good Samaritan be good seminarians? Alas, for the most part, no. The vast majority rushed past the person lying on the ground. These were seminary students, mind you.
History of course is itself a study of human nature, but for some reason, when you (well, me) read it in the context of history––descriptive, chronologically explicated events––and not in the context of something like social psychology (or sociology or cultural anthropology), something is lost––perhaps the point itself. In the story of the Good Samaritan and the story of the good samaritans, the “same” story becomes a different story, because, I’m going to assume, each is unpacked differently. Both probably have their use. Perhaps the Biblical account was too remote for a young boy living in the Midwest in the mid-twentieth century.
Just a thought.
***
Human populations appear to be in the grip of some kind of global angst––perhaps the operative term being ‘global,’ at least for those of us who subscribe to cable TV and have access to the internet. Why so?
Humans of course don’t hold a monopoly on worry. Life is stressful. Case in point: mornings, as I eat my organic raisin bran, I watch the birds and squirrels quarrel over the seed, corn, and peanuts I put out for them. (I’m not the only one watching. Mr. Patience, the neighbor’s cat––he likes to watch, too. Hmmm.)
All God’s creatures, from expensive primates “down” to throwaway viruses, naturally work to maintain a semblance of control over their local environments.
Self-preservation, wrote Hobbes, is not only the primary drive (to give it a Freudian twist), but the most fundamental right: all “men” are created equal. It’s Nature’s law of laws.
But what’s going on today? What’s different? What’s in the air? Is it just me? I don’t think so.
I think it’s Gaia––Gaia’s sick.
Maybe the planet is an organism, and the angst is a symptom of illness. And what’s the underlying etiology? Overpopulation. Overpopulation––unbridled growth––may be to Gaia what cancer is to the human body––and population growth may be approaching critical mass.
In ages past, humans met this problem by colonizing. It’s what the Greeks did, the Romans, and eventually the Europeans, because when you have too many people (particularly unemployed and under-employed young males), there’s going to be trouble, a lesson currently being taught to the powers that be (or recently were) in north Africa and the Middle East. The Greeks colonized ancient Anatolia (modern day Turkey) and Magna Græcia (southern Italy); the Romans, the Mediterranean littoral and north as far as the British Isles; and the Europeans, during the age of exploration, populated the Americas, southern Africa, and portions of the Far East.
Would that we could do that today. But what Locke once wrote, that “in the beginning all the world was America,” which is to say, empty...well, pretty much so, when he wrote it...this no longer applies, anywhere.
***
The Newter's Among Us
Newt Gingrich said recently that the Palestinians are “an invented people.” And are they not? And are not the Jews? (no, not the Jews, says Newt) the Kurds? the Basques? the Outer Mongolians? (are there Outer Mongolians?). Indeed, are not the Americans? (Newt says, again, no, not the Americans.)
What? Are the Americans a natural kind? You know there is something of a controversy concerning this, this natural kind business––I believe it’s in the philosophy of science (I say “I believe” because I know little about the philosophy of science, or about any other philosophy for that matter). But maybe Newt knows. Newt’s smart. Everyone says so.
Is the Newter himself an invented person? Well, he is a politician, or was, and wants to be one again, evidently. And don’t you have to invent, then reinvent yourself if you’re going to get anywhere in life? Isn’t that the American way? Maybe the Palestinians ought to take a page out of Newt’s pretending. Newt’s no fool, although he will suffer them. He’s a politician for Christ’s sake.
I’m thinking Newt’s claim that the Palestinians are an invented people is a red herring, and will go away right after the Iowa Caucuses, by New Hampshire I’m figuring. Newt’s no fool.
I’ll tell you what Newt is, Newt’s a consummate rhetorician––he doesn’t give a flip about natural, artificial, or any other kind. Anyway, Newt’s alright in my opinion. I say more power to him (I mean that metaphorically; Newt gaining more real power makes me a little nervous.)