Saturday, January 20, 2018

A Prayer for Owen Meany by John Irving

I struggled with this one.

My first struggle was with the title character, who I found difficult to like and relate to. Owen Meany is so obviously a device for Irving to, as one of the blurbs on the back cover of my paperback says, “meditate on predestination, faith and the unrealized forces that shape one’s days,” that I can’t take him seriously as a human being. He is not a character in any full sense of the word. He is instead whatever Irving needs him to be in order to arrive at his scripted conclusion. Somewhere late in the novel, our narrator tells us:

It seemed to me that Owen Meany had been used as cruelly by ignorance as he had been used by any design.

It’s a good line, well supported by the preceding details, but it illustrates a fundamental weakness in the text. The design in question bears so many hallmarks of the author, and so few of a supposed in-story divine intervention, that it left me fondly wishing for ignorance to win the day. In other words, by that time I was rooting for Owen Meany to lose, not win, the faith jackpot Irving had decided was his lot.

My second struggle was with the narrator, who I couldn’t figure out. Specifically, I couldn’t figure out who he was talking to. The narrator in question is John Wheelwright, Owen Meany’s best friend, who is sometimes seemingly talking directly to the reader and sometimes seemingly writing in his diary. Both devices are perfectly acceptable structures within which to cage a first-person narrator, but Irving decides to use them both, the former primarily to relay the story of the narrator’s adolescent and young-adult interactions with Owen in New Hampshire and the latter primarily to relay the story of his current, grown-man interactions with a new community of people in Canada.

That sounds clear the way I just explained it, and for most of the novel I guess it is, but the larger point is that I noticed the two devices at all. When a first-person narrator has an engrossing story to tell, the reader tends to get rather pleasantly lost in the drama. The device of the narrator -- oral storytelling, written diary, or whatever -- fades into the background. You don’t challenge the written diction of what is supposed to be an oration, for example, nor the lengthy and supposedly transcribed dialogue in what is supposed to be a diary. You simply settle in and enjoy the process of turning the pages.

But by constantly switching the narrative device, Irving never allows me to fall in line with his often entertaining storytelling. With every break on the page, where the diary ends and the oration begins, or vice versa, Irving did little more than pull me out of the story. Who, I would find myself asking over and over again, is John Wheelwright talking to?

And the clincher comes near the very end, when Irving is supposed to be ramping his narrative up to its whiz-bang climax.

Owen Meany taught me to keep a diary; but my diary reflects my unexciting life, just as Owen’s diary reflected the vastly more interesting things that happened to him. Here’s a typical entry from my diary.

“Toronto: November 17, 1970 -- the Bishop Strachan greenhouse burned down today, and the faculty and students had to evacuate the school buildings.”

So now one narrative device has subsumed the other? I’m no longer secretly reading John Wheelwright’s diary? Now he’s quoting me passages from it? Was he doing that before? Then why aren’t those previous sections set off in quotation marks?

And my third struggle was with the ending, which, as I already mentioned, seemed overly contrived. Instead of dishing out a bunch of spoilers, let me give you a related example. [You? You might now be thinking. Who is he talking to?] Here’s essentially how Irving would have you write a novel.

First, come up with the ending. Make it as improbable as possible. Your main character -- let’s call him Oliver -- is going to lose his hand by intentionally holding it in the bubbling oil of a deep fryer until it is cooked away.

Next, come up with a list of character traits that individually seem unremarkable (or better yet, quirky and weird), but which, taken together could provide an explanation for the seemingly crazy action of your main character in your climactic scene. He hates fried food but works in a chicken shack. He’s in love with the girl who runs the fryer, who refuses to remove her dangling jewelry as required in the employee handbook. He’s plagued by dreams of being an amputee. He has an operation that leaves him with no feeling in his right arm.

Next, take those traits and write a chapter about each one of them. Call those chapters “The Shack,” “The Girl,” “The Dream,” and “The Operation.” Make each one a novella, populated with enough quirky characters and comic scenarios to distract the reader from the predetermined climax you’re building towards. At the same time, have your narrator make regular cryptic and foreshadowing remarks about that climax. Things like “If I had known then what I knew later, I would’ve forced Oliver to stop working at that chicken shack,” or “Sometimes love will drive people to do crazy things, and Oliver was no exception.”

Finally, write your climax. Pat yourself on your back for being so clever. Cash the constant stream of royalty checks as your novel goes into printing after printing.

I’ve oversimplified for effect, but I hope you see what I’m driving at. At the end of A Prayer for Owen Meany, I felt very much like I could see Irving’s outline sketch of the novel, written, as I’ve described, with the end firmly in mind, and indulging in whatever twist or turn was necessary to get us there.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.


Monday, January 15, 2018

The Idea Swap

I attended an educational conference last week and a wonderful thing happened to me there.

I got a new idea.

I don't know yet if it's a good idea or a bad idea. Maybe you can tell me. But that's not the point. The fact that something new occurred to me while attending this conference practically made it worth the price of admission.

Here it is.

At a lot of educational conferences there is time set aside for group-based discussions on ideas or problems that the attendees self-identify. My own association, in fact, has often tried something similar. We ask people on their registration forms to list one challenge that is "keeping them up at night." We then take all those responses and sort them into ten or so common topics and post them on a series of tables at one of the conference's networking breakfasts. Go find the topic that most resonates with you, we'll tell the attendees, and talk with the like-minded people at your table about ways to address or overcome the challenge.

It works -- to a degree. It certainly gets people talking who might not otherwise have spoken to each other. But does it actually help our attendees solve the problems they told us about on their registration forms? There's frankly little evidence of that.

So what if we tried something different?

First, rather than ask people when they register what's keeping them up at night, ask them when they come into the breakfast session what problem they are facing that they would like their peers at the conference to help them solve. Not everyone is going to have something -- but that's okay. For this idea, you don't need a hundred people to give you a hundred problems. You only need ten.

Second, once you have your list of ten problems, put them anonymously up on the screen. While people are enjoying their breakfasts, interrupt them and ask them to look at the screen. Tell them that everyone in the room has to pick one of these ten problems to solve. But here's the trick. Tell them that if the problem is yours -- whether you were the one who wrote it down or it is simply someone else's problem that you also share -- YOU CANNOT PICK IT. You must pick someone else's problem to solve.

Third, have everyone re-sort themselves in the room so that they are at a table filled with people who have also picked the same problem as them. I'm not sure how to do that logistically, but I'm sure it can be done. Maybe the tables can be pre-labeled with the numbers one through ten, and then you can use those numbers as an index for the ten problems up on the screen. If you picked problem #5, for example, move to a table with a #5 on it.

Fourth, let the people at the tables talk about ways to solve the problem they have chosen. Ideally, ask them to come up with as many possible solutions as they can, because no single solution is likely to solve every individual variation of the problem that their colleagues might have. To provide an overly simplistic example, if my problem is that I lost my green crayon, and your single solution to the problem is for me to buy a new green crayon, that's not going to work if I don't have the money for any new crayons. I'm going to want to hear multiple ideas for how to solve my problem. Buy a new one, borrow one from a friend, use your blue and yellow crayons together, etc.

Fifth and finally, go around the room and have a spokesperson from each table read out or otherwise describe their list of solutions.
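For anyone who wants to automate the re-sorting step, here is a minimal sketch of the table-assignment logic, assuming each attendee simply submits the number of the problem they picked. The names and data here are hypothetical, invented for illustration only:

```python
from collections import defaultdict

# Hypothetical data: attendee name -> number (1-10) of the problem
# they picked. Per the rules, no one may pick their own problem.
picks = {
    "Alice": 5,
    "Bob": 2,
    "Carol": 5,
    "Dave": 2,
    "Erin": 7,
}

# Group attendees by the problem they chose; the problem number
# doubles as the number on the pre-labeled table.
tables = defaultdict(list)
for attendee, problem in picks.items():
    tables[problem].append(attendee)

# Announce the seating assignments.
for table_number in sorted(tables):
    print(f"Table #{table_number}: {', '.join(tables[table_number])}")
```

In practice the same grouping could be done with index cards and table tents, but the sketch shows how little logistics is actually involved: one pass over the picks, grouped by problem number.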

What I like about this idea is that it at least attempts to deal with a fundamental and often overlooked reality of this situation. The reason the attendee has a challenge that is keeping her up at night is because she doesn't know how to solve it. If she knew how to solve it, it wouldn't be a problem that was keeping her up at night. And putting her together at a table with nine other people who are confounded by the same problem won't necessarily help.

So, is this a good idea or a bad idea? I'm open to your perspective, but in a way I'm not sure it matters. What matters is that the conference I attended got me thinking about different ways of doing things in my association. And that, above all else, is frequently what's most needed.

+ + +






Monday, January 8, 2018

The Importance of Making Commitments

It's been a while since I wrote about my association's ongoing experiment with 4DX -- The 4 Disciplines of Execution. The recent appearance of my write-up of this business book on my annual post about the top five most visited pages on this blog reminded me that we still haven't given up on it. Indeed, if anything, we have increased our commitment to this apparently simple, yet powerful system of execution.

One of my biggest takeaways from 4DX is the concept of the whirlwind -- all the day-to-day activities and energy that it takes to run your operation. While it is true that the whirlwind can keep your organization from achieving things of long-lasting consequence, it is wrong to think that the solution is to reduce or somehow tame it. While there may certainly be tasks within the whirlwind that could be dropped with little ill effect, the whirlwind will never be tamed because it is largely composed of things that are vitally important to your organization. The tasks within the whirlwind keep the lights on and the doors open. Reducing your organization's attention to those things can put you in even greater jeopardy.

So, to better fight against the whirlwind, 4DX provides the concept of WIGs -- Wildly Important Goals. These are a small number of objectives that the organization will consciously turn its attention toward, things that the whirlwind would typically distract the organization from, but which are actually "wildly important" to the long-term growth and success of the organization. 

In the past, my association erred in identifying too many WIGs to focus on. As we worked to adapt the 4DX nomenclature to our existing system of strategy and execution, we perceived too many of our strategic objectives as WIGs. They were certainly important, but many were important at the level of the whirlwind, and our whirlwind of established activity and energy was already well tuned to accomplish them. As a result, we diluted the extra attention that is meant to be preserved for WIGs and struggled to apply the necessary leverage to achieve the ones with long-term consequence.

So this year we made a change. There is only one WIG and it is going to get the special attention it requires each and every week. Because that's the 4DX prescription for WIGs. The system calls it a "cadence of accountability," and it manifests itself in a weekly WIG meeting where progress against the WIG is measured and decisions are made about what additional tactics are needed to drive success against the goal.

We've been conducting these weekly WIG meetings for about six months, and some new attention and energy has certainly found its way to our WIG as a result. But until just this past week one important facet of these WIG meetings had completely fallen off my radar screen -- the importance of making commitments. In these meetings, every member of the WIG team is supposed to make a commitment to perform a discrete number of tasks, completely under his or her control, that will help move the organization closer to achievement of the goal.

This, specifically, is how action against the WIG is designed to find its place in the whirlwind of activity that would otherwise consume it. Everyone already has a full plate -- myself included. But in making a commitment to my other team members that I am going to do X and Y in the upcoming week, the likelihood that those tasks will actually find their way on and then off my to-do list is dramatically increased. At next week's WIG meeting, after all, I'm supposed to report on the status of the tasks I committed to the week before, and use their relative success or failure as one of many data points in the on-going strategy of execution that we're building around our WIG.

Without these individual commitments, I believe, the whirlwind will consume all.

+ + +






Saturday, January 6, 2018

I’m OK -- You’re OK by Thomas A. Harris, M.D.

I think I mentioned this before. I like picking up pop psychology books from generations past at used book stores. It’s fun, I think, to read what past generations had “figured out” about the human condition, and to see if it still holds water today. I had a recent experience with Games People Play by Eric Berne that was especially challenging on that level and, unbeknownst to me when I picked it up, I’m OK -- You’re OK relies heavily on Dr. Berne and the work he laid out in Games People Play.

But I found I’m OK -- You’re OK much more accessible. Even useful on a certain level. Its basic premise relies on two formulations of human behavior.

The first, taken from Berne, is that we all have not one, but three personas within ourselves, two of which develop at an early enough age that we can do little about them. Before the age of two our Parent and our Child have been more or less fully formed -- the Parent with all its strict rules about human behavior, and the Child, who is the helpless emotional reaction to that often faulty understanding. Only later with maturity do we develop the third persona, the Adult, who can test the assumptions of the Parent and shelter the feelings of the Child.

The quest for healthy human relationships, then, under this construction, is for your Adult to interact with the Adults of other people. Too often, however, we fall into unhealthy “games,” where a Parent becomes co-dependent with another’s Child, or two Parents wind up butting heads, or two Childs conspire to avoid all painful reality. The power of this model is in giving people a common language for acknowledging and discussing what is otherwise too difficult to put into words.

The second formulation is what gives the book its title. That, within the context described above, our Child inevitably starts life in what Harris calls the I’m Not OK -- You’re OK life position. It comes from an infant’s early and inescapable need to be cared for. He’s hungry, he’s tired, he’s wet: he is Not OK, and it is not he but only his caregiver who can alleviate all of that suffering. I’m Not OK, but you’re OK. You have everything you need to be you, and you also have the thing that I need to be me.

That, according to Harris, is the life position we all begin in, and many people never leave it. Their whole lives they need someone else to alleviate the psychological pain of being Not OK. But some people migrate to one of the other three obvious life positions: either I’m Not OK -- You’re Not OK, where the child is not adequately cared for and concludes that no one is OK, that no one can alleviate his own feelings of being Not OK; or I’m OK -- You’re Not OK, where the child who has been neglected or abused learns how to help himself and becomes self-reliant (often, according to Harris, to a criminal degree); or I’m OK -- You’re OK, which is a conclusion that only a healthy Adult persona can reach, the true understanding that we’re all in the soup together, each in charge of our own happiness and able to help others reach theirs.

Like the three personas of Parent, Child, and Adult, these four life positions have some explanatory power, and can provide all of us with a simple framework and vocabulary for working out our psychological problems. The challenge I found with the book didn’t lie in these structures, but in their frequently archaic applications and assumed mechanisms of action.

First off, like all psychology I’ve read from the 1960s, it’s often best to just skip over the case studies.

One modern housewife with every up-to-date convenience in her home found she simply did not have any interest in buying a garbage-disposal unit. Her husband encouraged her to get one, pointing out all the reasons this would simplify her kitchen procedures. She recognized this but found one excuse after another to postpone going to the appliance store to select one. Her husband finally confronted her with his belief that she was deliberately not getting a garbage disposal. He insisted she tell him why.

A bit of reflection caused her to recognize an early impression she had about garbage. Her childhood years were the Depression years of the 1930s. In her home, garbage was carefully saved and fed to the pig, which was butchered at Christmas and provided an important source of food. The dishes were even washed without soap so that the dishwater, with its meager offering of nutrients, could be included in the slops. As a little girl she perceived that garbage was important, and as a grown woman she found it difficult to rush headlong into purchasing a new-fangled gadget to dispose of it.

You just can’t relate to things like that. Next it’ll be Ma Joad talking about the pig getting loose and eating the baby.

But, more important, probably, are the pieces that, with forty more years of scientific progress, we know now are pretty much wrong.

For example, Harris takes as one of his fundamental premises, based on the state of brain science at the time, that “the brain functions as a high-fidelity tape recorder,” and that “everything which has been in our conscious awareness is recorded in detail and stored in the brain and is capable of being ‘played back’ in the present.”

This, based on everything more contemporary that I have heard or read, however, is not actually how the brain -- or at least how memory -- works. Memories are not perfect recordings of past events. They are reconstructions based on the current biochemistry of the brain. As has been shown again and again -- memory is malleable and imperfect.

It is unfortunate, then, that Harris makes the “fidelity of brain recordings” such a crucial foundation of his resulting theory, both because it is essentially wrong and, I think, unnecessary. One doesn’t have to have a tape recorder in one’s head in order to find utility in Harris’s therapeutic approach and vocabulary. The personas of Parent, Child and Adult, and even the dichotomy of the OK and Not OK life positions, have, I believe, a certain utility, regardless of the scientific discoveries that do or do not support brain recordings.

Another troubling section is Harris’s analysis and defense of free will. His purpose, I think, is to provide reassurance to his readers that they, in fact, do have the power to change themselves and their lives, despite several distressing traditions of determinism that have permeated science in general and psychology specifically.

Can man really change if he wants to, and if he can, is even his changing a product of past conditioning? Does man have a will? One of the most difficult problems of the Freudian position is the problem of determinism versus freedom. Freud and most behaviorists have held that the cause-and-effect phenomenon seen in the universe also holds true for human beings, that whatever happens today can theoretically be understood in terms of what has happened in the past. If a man today murders another man, we are accustomed by Freudian orientation to look into the past to find out why. The assumption is that there must be a cause or causes, and that the cause or causes lie somewhere in the past. The pure determinist holds that man’s behavior is not free and is only a product of his past. The inevitable conclusion is that man is not responsible for what he does; that, in fact, he does not have free will.

If you're a regular reader of this blog you know what I’m going to say. Why does the absence of will “inevitably” lead to the absence of responsibility? As is so often done, Harris makes the knee-jerk comparison to criminality and the courts.

The philosophical conflict is seen most dramatically in the courts. The judicial position is that man is responsible. The deterministic position, which underlies much psychiatric testimony, is that man is not responsible by virtue of the events of his past.

As if this is some kind of existential crisis. We have to deny determinism because to do otherwise would be to open up all the prisons and let all the criminals go free! Of all the arguments against determinism, this one makes about the least amount of sense to me. Whether man is a free agent or a deterministic machine, I don’t see any difference in the appropriate response to criminal behavior. Shouldn’t broken machines be prevented from harming other machines, just as criminal humans should be prevented from harming other humans?

But rather than admit that simple solution, Harris, and many, many others, decide that they must weave together a bunch of over-thought philosophical concepts in order to lull everyone into believing that free agency can be retained -- for humans -- in an otherwise deterministic universe.

We cannot deny the reality of cause and effect. [But you will.] If we hit a billiard ball and it strikes several more, which then are impelled to strike other billiard balls in turn, we must accept the demonstration of the chain sequence of cause and effect. The monistic principle holds that laws of the same kind operate in all nature. Yet history demonstrates that while billiard balls have become nothing more than what they are as they are caught in the cause-and-effect drama, human beings have become more than what they were. The evidence of evolution -- and of personal experience -- convinces us that man has become more than his antecedents.

Did you catch that? Billiard balls are billiard balls, but human beings are something different. Why? Because we’re human beings, and we know we’re different.

There is an essential difference, however, between a man and a billiard ball. Man, through thought, is able to look to the future.

Here’s another place where Harris’s 1960s understanding of brain science does him a disservice. How would he react, I wonder, to the modern understanding that thought itself is another billiard ball -- a manifestation of similar deterministic realities, this time across biochemical synapses rather than across a green felt table?

He is influenced by another type of causal order which Charles Hartshorne calls “creative causation.” Elton Trueblood elaborates this point by suggesting that causes for human behavior lie not only in the past but in man’s ability to contemplate the future, or estimate probabilities:

“The human mind … operates to a large extent by reference to final causes. This is so obvious that it might seem impossible to neglect it, yet it is neglected by everyone who denies freedom in employing the billiard ball analogy of causation. Of course, the billiard ball moves primarily by efficient causation, but man operates in a totally different way. Man is a creature whose present is constantly being dominated by reference to the nonexistent, but nevertheless potent, future. What is not, influences what is.”

Let me be blunt. The above paragraph is incoherent, as most arguments against determinism eventually become. A nonexistent, but nevertheless potent, future? I guess that’s easier for some to swallow than the idea that they are billiard balls. And our evidence for this nonexistent thing? Well, who needs evidence when its potency is “obvious”?

Ortega defines man as “a being which consists not so much in what it is as in what it is going to be.” Trueblood points out

“...it is not enough to say that the outcome is determined even by one’s previous character, for the reality in which we share is such that genuine novelty can emerge in the very act of thinking. Thinking, as we actually experience it daily, is not merely awareness of action, as it is in all epiphenomenalist doctrine, but is a true and creative cause. Something happens, when a man thinks, which would not have occurred otherwise.”

Enough. Statements such as these are pervasive in the books I’ve read on the subject, but they rely on facts not in evidence. Something happens when we think. We don’t know what that something is, and everything we learn about the brain makes the possibility of that something more and more remote, but we will still assert that this something is real. Why? Because we can feel it. We feel, and therefore we have to be something more than unfeeling billiard balls. Harris, and many others, will twist themselves into logical and philosophical knots before they admit to being that.

+ + +



Monday, January 1, 2018

My Top 5 Blog Posts of 2017

As we end another year, here's a look back at the five posts on this blog that received the most page views in 2017.

1. Stop Calling It Strategic Planning
This has been on every year-end list since it was originally posted in January 2012, and keeps getting a ton of traffic, including as the page through which the highest number of people enter my site. It was inspired by the take-down of strategic planning in Humanize, and in it I pledge to stop using that term to describe the messy, constantly evolving process my association uses to determine our direction and set our objectives. In laying out the guidelines that govern our activities, I realize that only one term makes any sense--association management.

2. The 4 Disciplines of Execution by Chris McChesney, Sean Covey and Jim Huling
This one was originally posted in May 2014, and returns for a fourth placement on these year-end lists. It summarizes my takeaways from the book The 4 Disciplines of Execution. The book's subtitle is “Achieving Your Wildly Important Goals,” and it contains a deceptively simple and oddly compelling system for doing exactly that--with a lot of potential applicability for associations. Among the many practical tools it taught me was the need to create "winnable games" for your team to go after, with regular and visual scorecards showing the team's progress towards each goal. As the authors continually remind the reader, people play differently when they are keeping score. When they can see at a glance whether or not they are winning they become profoundly engaged.

3. The Chairman's Gift
Originally posted in July 2012, this one has now been on five of six possible year-end lists. It tells the story about how my association ensures that our outgoing Board Chair receives a gift that recognizes not just his service to the association, but the fact that he is an individual who has made a personal sacrifice to serve in that capacity. The true value is the message it sends to others who might be considering a similar commitment in their futures.

4. Action Plans Describe the Steps Staff Will Take
A newcomer last year, this one returns this year for its second appearance. It was originally posted in November 2015, and is part of a series I was doing describing the strategy and execution process my association uses instead of traditional "strategic planning." Action plans are on the deep end of the execution side, coming only after strategic goals have been set and specific program objectives needed to bring those goals about have been identified. As the post title implies, action plans detail the specific steps a staff leader (i.e., the person responsible for ensuring that the organization achieves the program objective) will take in that quest. In the post, I provide examples and explore the two most common questions I get with regard to action plans: (1) When do you set these Action Plans? Is there any room for adjustment? How can you possibly chart a course of action for an entire year? and (2) Who's in charge of these action plans? What happens when they are behind schedule or not progressing at all? Who do you hold accountable? Actually, for a more complete answer to that second question, you need to also go here.

5. The Crucible by Arthur Miller
The only newcomer to this year's list, this was originally posted back in January 2015. It's one of the many "mini term papers" I tend to offer up, free of charge, to desperate freshman English majors the world over. My overall thesis: This is a play about the balance between order and freedom, and specifically order’s ultimate triumph over its weaker counterbalance. The historical setting is, of course, the Salem witch trials of the 1690s. The order is that of the theocratic state, its functionaries able to convict, jail and hang those they determine to be in league with the Devil. The freedom is that of John Proctor, his wife Elizabeth, and their fellow villagers, who are held hostage by the accusations of a group of vengeful teenage girls. It may seem silly to our modern sensibilities, but these people very much believed in God and the Devil, and the way the two of them battled for people’s souls right here on earth. And Miller paints no one in his drama as a fool, just as people with clashing motivations interpreting the world as they understand it.

My thanks to everyone who has been reading what I've been putting up here. I hope you plan to stay engaged in 2018.

+ + +




Monday, December 25, 2017

A Holiday Break: American Pastoral by Philip Roth

Books are always the best holiday gift for me. The only thing I like better than the anticipation of reading a long-sought-after title is the fondness that comes with remembering the discovery of an unexpected treasure.

As I look back on all the books I've profiled here in 2017, the one I'd most like to revisit is American Pastoral by Philip Roth. I blogged about it back in July, and found it to be a novel of astonishing depth and complexity.

It is a novel of two people representing two generations. First, there is Seymour "the Swede" Levov, the child of Jewish immigrants, representing a generation of people embracing the American dream and all of its totems and rituals. And then there is his daughter Merry, the radical, representing a generation of people disillusioned with the very totems and rituals that define the generation that came before.

And although the novel delivers powerfully when the reader views Seymour and Merry as individuals in conflict with each other, the transcendent depth of the novel emerges when they are viewed as the generations they represent, wrestling with each other for the soul of America. The Swede, in blaming himself, embodies the mindset of an aspirational generation, while Merry, in rejecting all that her father has arranged and decoded for her, embodies the mindset of a nihilistic one -- the American pastoral versus the American berserk.

As you enjoy your holiday break, I hope you find some time to curl up with a good book. I know I will.

+ + +

This post was written by Eric Lanke, an association executive, blogger and author. For more information, visit www.ericlanke.blogspot.com, follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Saturday, December 23, 2017

The Incredible Shrinking Son of Man by Robert M. Price

I first learned of Robert M. Price by stumbling upon his Bible Geek podcast some years ago. As I confessed in my write-up of his The Reason Driven Life, listening to him answer questions about the origins, contradictions, and hidden meanings in the Bible is one of my favorite things to do.

Well, The Incredible Shrinking Son of Man is very much like one long, sustained, and better organized episode of the Bible Geek podcast. The format of the podcast often makes it difficult for Price to cover all the background information that a newcomer would need in order to understand the context and often just the terms he uses in his answers. In book form, however, Price has all the space he needs. And here, there is really only one question to answer. It doesn’t come from Price’s “rain barrel” of listener questions. Price asks it himself in his subtitle. How Reliable is the Gospel Tradition?

In 355 pages, Price gives his answer, and all the supporting context and defined terms that he needs to justify it. It’s not. That’s the essential takeaway. The Gospel Tradition is not reliable.

But that’s getting ahead of ourselves. Let’s first just capture a few of the things I feel I learned while reading this book.

The Pauline Epistles Pre-Date the Gospels

I frankly don’t know how this one escaped me for so long. I’ll blame myself rather than my Sunday School teachers, but I guess I always just assumed that Paul was writing in the time after the Gospels were written, as the early Christian Church was expanding out of its Jewish core and recruiting the Gentiles. Isn’t that what the cities referenced in the Epistle titles denote? Corinth (in Greece), Philippi (in Macedonia), and Rome (in Italy)?

Well, it turns out this is right -- it’s just my assumption about Paul having the Gospels of Matthew, Mark, Luke and John as reference material that is wrong. Paul, assuming he even wrote the epistles, was writing in a time before these canonical Gospels existed, even though he was writing to Gentiles who had established Christian churches far outside of Israel or Judea.

And this reality will figure significantly in Price’s calculation of the reliability of the Gospel tradition. For if the events described in the Gospels actually happened to the God/man named Jesus Christ, why does Paul never mention them?

There Are At Least Four Different Jesuses in the Gospels

This one I was already keyed into, but Price helped me tease out four clear examples of the “multiple Jesus” phenomenon. The dividing lines in question are based on when Jesus actually became the Christ -- that is, the Son of God and redeemer of mankind: upon his resurrection, upon his baptism, upon his birth, or from the beginning of time. Each tradition had its sects and advocates while the Gospels were being penned and transcribed, and each tried to influence the texts to better favor its interpretation of Jesus’s relationship with God.

If only each tradition had its own Gospel, and each consistently stuck to its conflicting story throughout. Unfortunately, it didn’t work out that way. What we have instead are four books where within each all four traditions have been piled on top of each other through a historical progression of redactions, with not a one having a reliably original source by which such edits could be consistently identified.

And all of that is complicated by the order that the Gospels are presented to modern readers.

The average reader of the New Testament reads Matthew before Mark and then goes on to Luke and John. Matthew gives him the impression that Jesus was born God’s Son in a miraculous fashion. Mark begins only with the baptism, but the reader will think little of this: perhaps Mark begins in medias res. With Luke we are back to a miraculous nativity for one born the Son of God. In John the reader learns that Jesus had already been God’s Son from all eternity. But suppose one read Mark by itself, as its first readers did. What impression would one receive? Surely in a book where the main character shows up as an adult and, right off the bat, experiences a vision of divine calling in which he and no one else is told he is God’s Son, the natural inference would be that the baptism was the beginning of an honorific Sonship. If he were already God’s son, wouldn’t he have known it? And then why should God tell him what he already knew? It seems that Mark might believe what others in the early church did, namely, in Jesus’ adoptive Sonship.

But as difficult as it may be to tease apart these traditions, the fact that they are there in the soup has amazing explanatory power. I’ll let Price make one of his extremely helpful analogies from today’s popular culture mythology.

When Siegel and Shuster first told tales of the Man of Steel, he was said to have developed his powers only once he reached maturity. But Superman’s adventures proved so phenomenally popular that the publisher suggested moving the origin of his powers, and hence his superhero career, back one stage to his adolescence. So the adventures of Superboy premiered and continued for decades. Why not go a step further? The legend was revised again, so that the infant Superbaby was already helping out with farm chores using his superstrength, for example, lifting the tractor single-handedly. Even so, when Jesus’ divine sonship was thought to have stemmed from his Spirit-baptism at the Jordan, his adult activities formed the content of the gospel. But once his sonship was believed to have started at his physical birth, his miraculous “adventures” had to be extended backward to fill the gap.

Price is talking there about the numerous infancy Gospels -- texts that, while not canonical, comically describe the child Jesus doing things like cursing people, healing others, and fashioning living sparrows out of clay. While few today take them seriously, they offer a certain logical consistency with the premise only partially addressed in the canonical Gospels -- that Jesus may have been God from birth.

Anachronisms Lurk Around Every Corner

These are some of my favorites -- all of which pretty much prove that large portions of the Gospels were written at a time far distant from the events they purport to describe.

Here’s a simple one.

We will return to the enigmatic figure of Judas later, but in the meantime, let us observe that his epithet “Iscariot” might mean, with about equal plausibility, three very different things. First, and traditionally, it has been taken to denote “Judas of Kerioth.” Kerioth was the name of a number of villages in Judea, which would make him the only non-Galilean in the group, if not even an Edomite (like Herod!), which is why he is given red hair in Nikos Kazantzakis’s The Last Temptation of Christ (book and film), the Edomites being notorious redheads. John’s gospel must have understood Iscariot this way, since John refers to Judas as the son of Simon Iscariot (13:26). Second, many understand Iscariot as meaning “the Sicarius,” making Judas a member of the assassin squad of the revolutionary Zealots. They carried the sicarius, or short sword, hidden in their robes from whence they would pluck it to stab their intended victim and then mix in with the shouting crowd. This would place Judas alongside Simon the Zealot and Simon Barjona as militant nationalists. I prefer the third option, the surmise of Bertil Gärtner and others, whereby Iscariot represents the Hebrew Ishqarya, “man of falsehood, betrayer.” This means, obviously, that Judas would have been called “Judas Iscariot” during his lifetime no more than Jesus would have been called “Jesus Christ.” This does not mean, however, that sufficient water has not passed under the bridge by the time of the Gospels that Iscariot could be mistaken for a surname. See Mark 3:19, “Judas Iscariot, who betrayed him.” Mark no longer recognized it as a redundancy.

In other words, by the time “Mark” started writing his Gospel, the meaning of Iscariot had already been lost in the dimly remembered past, assuming that the name and the character are based on an actual person. It reminds me a lot of the confusion over Jesus being called “the Nazarene.” Price dissects this one pretty well when examining the record to see if the Gospels can reliably tell us of Jesus’s birthplace.

Despite the rendering of many English Bible translations, Jesus is very seldom called “Jesus from Nazareth” in the Gospels. Mark calls him “Jesus the Nazarene,” as does Luke twice … while Matthew, John, and Acts always call him “Jesus the Nazorean” … with Luke using this epithet once. … Some critics have questioned whether the village of Nazareth even existed in the time of Jesus, since it receives no mention outside the Gospels until the third century. Whether that is important or not, the difference between “Nazarene” and “Nazorean” does give us reason to suspect that the familiar epithet does not after all denote Jesus’ hailing from a village called Nazareth. “The Nazarene” would imply that, but not “the Nazorean.” That seems to be a sect name, equivalent to “the Essene” or “the Hasid.” Epiphanius, an early Christian cataloguer of “heresies,” mentions a pre-Christian sect called “the Nazoreans,” their name meaning “the Keepers” of the Torah, or possibly of the secrets. … These Nazoreans were the heirs, supposedly, of the neoprimitivist sect of the Rechabites descending from the time of Jeremiah. … They were rather like Gypsies, itinerant carpenters. “Nazorean” occurs once unambiguously in the New Testament itself as a sect designation, in Acts 24:5: “a ring leader of the sect of the Nazoreans.” Robert Eisler, Hugh J. Schonfield, and others have plausibly suggested that Jesus (and early Christians generally) were members of this Jewish pious sect.

Again, in other words, any reference to Jesus being from a town called Nazareth based on his being called the Nazarene or the Nazorean in the Gospels is a faulty attempt to square the circle. No such town existed in the time of Jesus’s reported birth, and when the connection is made, it only shows that the Gospel writers didn’t know that.

But when it comes to anachronisms, there’s one I stumbled across that really takes the cake. It’s a little more complicated than misunderstood words, so bear with me.

Jesus is depicted in the Gospels in several contradictory ways when it comes to the matter of legal observance. Matthew’s gospel presents Jesus not merely as a new Moses but virtually as a new Torah. As “Moses” and “Torah” had become practically synonymous, so would “Jesus” and “Gospel” become interchangeable, and for Matthew, Jesus is the new Torah. Matthew organizes the teachings he attributes to Jesus into five major blocs: The Sermon on the Mount (chapters 5-7), the Mission Charge (10), the Parables (13), the Manual of Discipline/Community Rule (18), and the Diatribe against the Pharisees/Olivet Discourse (23-25). The fact that he has squeezed these last two, rather different, topics together only underlines his urgency to get all the material into five sections, each of which ends with a similar statement: “And when Jesus finished these sayings, the crowds were astonished at his teaching” (7:28). “And when Jesus had finished instructing his twelve disciples, he went on from there to teach and preach in their cities” (11:1). “And when Jesus had finished these parables, he went away from there” (13:53). “Now when Jesus had finished these sayings, he went away from Galilee…” (19:1). “When Jesus had finished all these sayings, he said to his disciples…” (26:1).

So, these are all interpreted as new and somewhat radical teachings, the lessons in the Sermon on the Mount “astonishing” those who heard them.

And yet this new Torah is in no way intended to replace the traditional one. It belongs to a curious genre of contemporary documents that provide a sort of “new edition” of the old Torah. Other examples are the Book of Jubilees and the Qumran Manual of Discipline. Thus, Matthew can have Jesus speak as if nothing at all has changed: “Do not think that I have come to abolish the Scriptures; I have come not to abolish them but to fulfill them. For amen: I say to you, till heaven and earth pass away, not a yodh, not a vowel point will pass from the Law until all is accomplished. So whoever relaxes one of the least [important] of these commandments and teaches others [to do] so, shall be called least in the kingdom of heaven; but he who does them and teaches them shall be called great in the kingdom of heaven” (Matt. 5:17-19). Of all this, only the [blue type] is from Q, paralleled by Luke 16:17, “But it is easier for heaven and earth to pass away than for one dot of the Law to become [null and] void.”

Q is what Bible scholars call a set of supposed source materials that they believe several of the Gospel writers were working from. But don’t let that distract you. The larger point here is that Matthew is evidently trying to have Jesus both change and not change Jewish law. So what? Well, remember who he is supposedly preaching to: Jews in and around Galilee.

The Q saying thus isolated is already strange if we take it as a saying of Jesus, for it is a polemical proposition against someone who posits the Torah is obsolete. Who would Jesus have been talking to? Reform Jews? But the saying fits perfectly into the context of the Gentile Mission and the Pauline debate over the Torah, and that is where we have to leave the saying.

In other words, this can’t possibly be the verbatim report of what a Galilean carpenter preached near the start of the first millennium. It is, again, a reflection of competing traditions vying for dominance with one another through a historical progression of redactions and embellishments. The very fact that Jesus is counseling Jews to accept a new Torah shows that none of these lessons can be historically accurate. They are anachronisms that belong in the time of Christian expansion to the Gentiles.

The Inevitable End of Shrinking

There are, in fact, so many anachronisms, so many sayings of Jesus and so many reports of Jesus’s activities that can’t possibly be historically consistent with the time in which he supposedly lived, that Price eventually comes to the conclusion alluded to in his book’s title.

According to such an understanding, there can have been no Galilean adventures of an itinerant teacher and healer named Jesus. Rather, these stories must necessarily have arisen only at a subsequent stage of belief when the savior’s glorification, along with his honorific name Jesus, had been retrojected back before his death. I would suggest that only such a scenario of early Christological development can account for, first, the utter absence of the gospel-story tradition from most of the New Testament epistles, and second, the fictive, nonhistorical character of story after story in the Gospels.

A critical analysis, in Price’s opinion, leads to a historical Jesus that has shrunk essentially to the vanishing point.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.