This is turning out to be disappointingly heavy sledding. Maybe because the whole thing feels too much like a cerebral exercise -- the characters are there just as a means of exploring the fallibility of memory, not because there's a story that needs to be told.
I like a good story myself.
Update: 2 months later.
Oh dear. This "critically acclaimed" "novel of ideas" turns out to be just the kind of book that exposes me for what I am. The kind of reader who finds a certain kind of serious "novel of ideas" to be essentially unreadable. I like Rosecrans Baldwin (for reasons I'll explain below), I could see him straining to write a serious novel that would receive critical acclaim, I was rooting for him to succeed. And possibly, for some subset of discerning readers, he has. Those readers will need more patience than me. I found the characters uninteresting, the structural devices a little too forced, and the prose was way too solemn for my taste.
Here's the thing. While he was writing this novel, Rosecrans Baldwin spent 18 months in Paris working for a French ad agency. He documents this (often surreal) experience in a much looser, almost gossipy account called "Paris, I Love You but You're Bringing Me Down". That book is terrific -- in it, he manages to nail the absurdities of Parisian life hilariously.
I hate to say it, but the book that was presumably written as a diversion ended up being far better than "You Lost Me There". Not an entirely fair judgment, of course, since I never did manage to finish that earnest "novel of ideas" (insert weak "He lost me there" jokelet ad lib).
I am, of course, a confirmed philistine*, and your mileage may vary.
*: Despite my philistine status, I make no apologies for holding all authors to the minimum standard of writing a book that the reader will actually, you know, want to keep reading. Because of its failure to meet that standard, I can only give "You Lost Me There" one star.
I think very highly of William Gibson. I've been vastly entertained by three of his novels and can't wait to get my hands on more of his fiction. But this collection of non-fiction pieces, written over a span of several decades, is a disappointment, likely to be of interest only to diehard Gibson fans.
Don't get me wrong. There's nothing here to change my impression that Gibson is smart, and a fundamentally nice guy. But pieces like the 1993 essay about his impressions of Singapore for "Wired", or two essentially similar 2001 pieces about the futuristic appeal of Tokyo as a setting for his fiction were probably only modestly interesting when first published and have not improved with age. As coiner of the term "cyberspace", Gibson is probably doomed to suffer a lifetime of being asked to write pieces that try to predict the future. Does reprinting such efforts really address some deep-seated need among the reading public? I doubt it.
To his credit, Gibson adds a little coda to each such dated piece, in which he signals his own embarrassment at serving it up again. One senses this book was the brainchild of some enthusiastic soul working for his publisher. It is a fundamentally misbegotten effort.
If you have yet to discover the fun to be had in reading Gibson, try "Neuromancer". Or "Pattern Recognition". Or "Spook Country". Or any of his fiction. But give this collection a miss.
I love books about language (check out my bookshelves). Imaginary languages? Weirdly specific glossaries? Talking bonobos? Delightful foreign idioms? The latest neurolinguistic breakthrough? Dubious folk etymologies? Yet another book about controversies in English usage? Add it to the bedside pile.
So you'd think I would have enjoyed this perfectly decent book by Henry Hitchings, who appears to be a perfectly decent fellow. He has already written two perfectly decent books about the English language - "Defining the World" and "The Secret Life of Words". The first was about the OED; I haven't read the second. He is obviously interested in English, given that he keeps coming back to it. He is thorough and methodical, meticulous in acknowledging the work of others. His sentences are grammatical.
Unfortunately, they are also, for the most part, exceedingly dull. Some authors write about language with a passion that is infectious. This is not Hitchings's way. His preferred style is a kind of restrained reasonableness that would be laudable if only it weren't so terribly dull.
The final six chapters, in which he focuses on the state of modern English, were more lively than the historical material that forms the core of the book. Unfortunately, given the total of 28 chapters, these represented only about 20% of the book. Earlier chapters had titles that titillated ("Bishop Lowth Was a Fool!", "Of Fish-knives and Fist-fucks"), but did not live up to their promise.
I want to give it a third star, but I can't. It's a perfectly decent book, though. The word "plodding" is surely undeserved.
This seems like a stunning misstep by the normally brilliant Steven Pinker. His ability to write with extraordinary force and clarity has been demonstrated repeatedly in two separate areas of expertise -- linguistics and cognitive science. Unfortunately, the brilliance of his earlier books in those areas is nowhere in evidence in this regrettable dog's breakfast of a book.
I found it almost unreadable - poorly argued, undisciplined, and self-indulgent - and, despite its grotesquely bloated length (800 pages), the support for its main thesis is woefully inadequate, dependent on a highly selective interpretation of existing data and completely unconvincing. Pinker can sling the statistical jargon (Poisson processes, power laws, the gambler's fallacy, the Gini coefficient) like a pro, but all the jargon in the world cannot make up for his recurrent habit of over- or mis-interpreting data whose limitations he consistently glosses over.
The jacket cover breathlessly promises "more than a hundred graphs and maps". Any graph is open to misinterpretation. Three of the most common ways of doing so are (i) selective interpretation (ignoring or explaining away the data that don't fit one's preconceived ideas), (ii) inappropriate extrapolation beyond the range of available data, and (iii) failure to acknowledge the data's limitations, such as likely sources of bias, or extreme sparsity of information.
Pinker commits each of these errors, with such numbing frequency that one loses all respect. We are seriously asked to draw conclusions from a graph of the "rate of battle deaths in state-based armed conflicts between 1900 and 2005" (Figure 6-1) while being instructed to ignore the figures for the first and second world wars. After all, "the world has seen nothing close to that level since". This kind of rubbish insults the intelligence. Or you could look at Figure 7-28. Lest you be distracted by the actual data, Pinker has helpfully superimposed some very impressive looking solid lines documenting his cheerful belief in the rise of vegetarianism. These are much darker than the actual data points, presumably in the hope that the reader might be distracted from noting their complete lack of fit to the data. Worried about racially motivated killing of black people? Here are the yearly data (number of such killings) from 1996 to 2008:
5,3,3,4,3,3,3,4,1,2,1,1,1
Pinker's gleeful trumpeting of a five-fold reduction seems to rest on a pretty flimsy foundation to me. Not to mention being a little premature.
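For anyone who wants to see just how flimsy: here is a minimal sketch (mine, purely illustrative, and not anything from Pinker's book) that treats each yearly count as a Poisson observation and computes exact 95% confidence intervals for the first and last years in the series quoted above.

```python
# Illustrative only (my sketch, not the book's analysis): exact Poisson
# confidence intervals for the first (1996) and last (2008) yearly counts.
from scipy.stats import chi2

def poisson_ci(k, alpha=0.05):
    """Exact (Garwood) confidence interval for a Poisson mean given an observed count k."""
    lower = 0.0 if k == 0 else 0.5 * chi2.ppf(alpha / 2, 2 * k)
    upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (k + 1))
    return lower, upper

counts = [5, 3, 3, 4, 3, 3, 3, 4, 1, 2, 1, 1, 1]   # yearly counts, 1996-2008
print(poisson_ci(counts[0]))    # 1996, count 5: roughly (1.6, 11.7)
print(poisson_ci(counts[-1]))   # 2008, count 1: roughly (0.03, 5.6)
# The two intervals overlap substantially, so single-year counts this small
# are entirely consistent with no real change at all.
```

The intervals overlap heavily, which is exactly the sense in which a claimed five-fold reduction built on counts this small rests on very little.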
But nothing as inconvenient as facts, or their absence, can stand in the way of a man who has already decided he knows the answer. The threat of nuclear holocaust? Exaggerated, because - as any fool can see - nuclear weapons have never been used in wartime since Hiroshima and Nagasaki. One imagines this argument must be of great comfort to those who survived those particular "anomalies". Just as Pinker's breezy insistence that the only meaningful way to interpret the number of people killed in a given conflict is relative to the world's population at the time is surely meaningless to anyone who has lost a family member in battle. It's at best breathtakingly insensitive; some would find it deeply offensive.
To anyone who respects the scientific method, this is a horrifyingly bad book, one which completely obliterates Pinker's credibility. Don't waste your time.
Lethem's boundless self-obsession and whiningly persistent neediness make this collection impossible to get through, despite the presence of an occasionally decent essay. But the guy's total narcissism just creeps you out after a while. Doesn't he have any friends? Or a decent literary agent? Someone to point out to him that the world might not have been thirsting for his pompously self-important post-9/11 musings, or his pathetic extended whine in response to a negative review by James Wood? Or his adolescent tastes in music? Or that publishing every scrap of text on his hard drive in a bloated omnibus collection just makes him seem pathologically narcissistic?
Dude. Just SHUT THE FUCK UP already. What makes you think we care?
I considered putting this on the "intellectual con artist at work" shelf, but that wouldn't be quite fair. It's not you, Doctor Pennebaker, it's me. I have no doubt that the research reported on in this book is genuine, if only because of its excruciatingly tedious nature. Frankly, it's hard to get excited (or even to stay awake) about work that uses word-counting as its primary tool, particularly given Doctor P's fawningly enthusiastic invocation of factor analysis as a legitimate statistical method. Even a reader willing to overlook this (serious) deficiency is likely to be bludgeoned into a state of anesthetized indifference by the pedestrian prose and the sheer banality of the conclusions.
It probably didn't help that I read this book immediately after finishing "Thinking Fast and Slow". Daniel Kahneman's clear, careful, measured exposition reminds us that work in experimental psychology can be reported with lucidity and elegance. The mix of anecdotal evidence, statement of the bloody obvious, and somewhat dubious over-generalization found in this book has to be considered a disappointment. And the whole obsession with pronoun usage seems entirely overblown, and not at all convincing.
Upon reflection, and after reading Trevor's excellent, take-no-prisoners review (http://www.goodreads.com/review/show/...), I have to agree with his assessment and downgrade this to a single star. I will spare Pennebaker the indignity of the "intellectual con artist at work" shelf, if only because I kind of feel sorry for anyone whose life's work involves research as pathetically boring as his appears to be.
Learning foreign languages is a topic that interests me greatly -- since retiring from my career in statistics I've made a concerted effort to achieve mastery of Spanish and French, and hope eventually to add Italian and Portuguese to that list. Over the last few years I've given a fair amount of thought to efficient strategies for language acquisition, as well as to the challenges of switching among languages. I don't have any simple answers. Neither does Michael Erard, which is probably a point in his favor. In researching this book he set out to investigate the phenomenon that he refers to (excruciatingly, in my view) as "hyperpolyglottery". Acknowledging the difficulties inherent in judging such concepts as "fluency" or "mastery" (particularly in the case of historical figures), Erard adopts a working definition of a "hyperpolyglot" as someone who exhibits mastery of at least six distinct languages. By studying a number of hyperpolyglots, he hopes to gain insight into the process of language acquisition. The results of his research are frankly disappointing. The book introduces us to a number of reasonably entertaining characters, though one of its primary conclusions seems to be that claims of "hyperpolyglottery" (it hurts me to type that "word", it's so ugly) are invariably exaggerated. The tribe of those claiming extraordinary linguistic capacities is rife with impostors and self-promoters. And the evidence provided by the small number of people whose linguistic abilities seem genuinely exceptional amounts to little more than a series of case studies. There are many anecdotes, but little in the way of firm conclusions. One suspects that Doctor Erard must have been a little disappointed by the results of his investigation. I certainly was, though I give him credit for trying.
I liked the author's previous short story collection well enough, but this was a major disappointment. The eccentricity that enlivened the short stories began to seem more like an annoying stylistic tic on prolonged exposure. And, as other reviewers have pointed out, this whole novel seemed structurally flawed -- the bifurcation into parallel narratives just didn't seem to work at all. I confess to having skimmed the final half just to see if things would get better - they didn't, unfortunately.
That said, I still think Karen Russell is a talented writer, and look forward to her future efforts.
The bookshelves constitute the review. Though I paid only $2.98 for this smug little nugget of crap, I'm tempted to sue the estate of Muriel Spark just on principle. The characters don't even rise to the level of caricature; they are stick figures that Dame Muriel pushes around her chessboard for a while. Until she can't be bothered anymore. The mystery is why she bothered at all. Surely she didn't need the money, and why would she choose to have this piece of mincingly clever dreck be her last "novel"?
I appear to be in a minority of one on this book. So be it. But this is really nothing more than a case of a talented author phoning it in. Muriel Spark's conversion to Catholicism and its effect on her writing are well documented. Somewhere during that conversion process she should have learned the meaning of shame. Because this is a book to be ashamed of.
I could allow my righteous indignation to sputter on for several more paragraphs, but I think I've made my point. There is nothing in this book that merits your attention.
Generally I'm a sucker for books about books, so I expected to like this more than I actually did. But, although Allison Hoover Bartlett writes well, she never quite managed to convince me that this book was anything other than a magazine article that got out of hand. John Charles Gilkey, the serial book thief at the center of the story, is not completely dull, but he's not as interesting as the author seems to believe and certainly not interesting enough to warrant a 250+ page book. I think that the time and energy Bartlett spent in researching the topic caused her to overestimate its general appeal. She's not the first non-fiction writer to fall into that particular trap, and I'm sure she won't be the last.
(A tip to all non-fiction authors: IT'S NOT ABOUT YOU. If you notice that you are starting to take a prominent role in the story, it's a dead giveaway that your story may be getting away from you. In olden days there was this priesthood of people known as editors who would step in and point this out to you, to save you from yourself. Sadly, this kind of editor (intelligent, engaged, firm) appears to have gone extinct, so let me say this explicitly here. If you're writing non-fiction, please stay out of the picture. Repeatedly insinuating yourself into the narrative will not make me like you more - instead it's likely to reduce the quality of your reporting and irritate the hell out of most readers. So, unless you're Richard Feynman, resist the temptation to make yourself a character in the narrative. We all have a boundless need to be liked; please don't pander to yours by gatecrashing your narrative.) Allison Hoover Bartlett's failure to resist this temptation weakens this book significantly, though not fatally.
The failure of the book to ignite my interest stems from something that was essentially beyond the author's control. The problem is that John Charles Gilkey's kleptomania is the only faintly interesting thing about him, and it's not as fascinating as you might think. According to the jacket blurb, "Gilkey steals for love -- the love of books". This is accurate, strictly speaking, but it's also highly misleading. His obsession centers only on books as status objects and has nothing whatever to do with their intellectual content or with the joy of reading. He could just as well have focused his energy on stealing collectible paperweights. Or Pez dispensers. The realization that Gilkey steals books, not because he wants to read them, but because he thinks they will enhance his status, is ultimately what made this book fall flat for me. Despite Bartlett's borderline obsession with her subject, for me the book amounted to little more than a meandering account of the petty misdeeds of a small-time, singularly uncharismatic drifter. When the account eventually just petered out, it came as a relief.
I'm making it sound worse than it is. Bartlett writes fluidly and the story is not completely without interest. It was just far less interesting than I'd expected.
The first foreign language I learned to complete fluency was German - after five years of high school German I spent a year at a German boys' boarding school. At the end of that year I was completely fluent, but noticed an odd phenomenon: I felt like a slightly different person when I spoke German than when speaking English. Since then I've also learned Spanish to a high degree of fluency, and the same observation holds. In both cases, the main difference that I perceive has to do with humor, and the way the language I'm speaking affects my sense of humor. So I've always been interested in the extent to which language affects thought. The notion that it does is what linguists refer to as the Sapir-Whorf hypothesis. Belief in Sapir-Whorf reached its peak in the first half of the 20th century, but since then the notion that language affects cognition has been discredited by almost all mainstream linguists.
In "Through the Language Glass" Guy Deutscher mounts a careful, very limited defence of the Sapir-Whorf hypothesis. He considers three major areas - the link between language and color perception, how different languages deal with spatial orientation, and the phenomenon of differences in noun genders across different languages. His examination of the link between language and color perception is extensive and thought-provoking - he traces the development of linguistic theory on color perception from British prime minister Gladstone's commentary on the relative paucity of color terms in Homer's work, through the Berlin-Kay model (stating essentially that languages all tend to split up the color spectrum in similar ways) through very recent experiments suggesting that the existence of a particular color distinction in a language (e.g. the existence of separate terms in Russian for light and dark blue) affects the brain's ability to perceive that distinction. Deutscher's account of the evolution of linguistic theory about color perception is a tour de force of scientific writing for a general audience - it is both crystal clear and a pleasure to read.
Two factors contributed to my eventual disappointment with this book. The first is that, even after Deutscher's careful, eloquent, persuasive analysis, one's final reaction has to be a regretful "So what?" In the end, it all seems to amount to little of practical importance.
The second disappointment pertained only to the experience of reading this book on an Amazon Kindle. Reference is made throughout to a "color insert" which evidently contained several color wheels as well as up to a dozen color illustrations. This feature was completely absent from the Kindle edition, which had a severe adverse effect on the overall experience of reading this book. Obviously, this point is relevant only if you are contemplating reading the Kindle version - DON'T!
This is an odd book, and not a very good one. As someone with over 120 books on my "words-and-language" shelf, I'm a confirmed language geek. Even the remotest byways of language have the potential to fascinate me. Though my interest in language is purely amateur, it is of long standing. (When I was learning Spanish a couple of years ago, my classmates were completely spooked by my enthusiasm for the subjunctive, which they deemed "unnatural".) The point is, where books about words and language are concerned, my bar is pretty low. I'm predisposed to like anything written by someone who is enthusiastic about language, generally willing to give the benefit of the doubt.
So a language book has to suck big time for me not to like it. "Secret Language" manages this by fitting squarely in the category of total pointlessness. For the life of me I can't figure out why this book was written, or what the author was trying to get across. The title seems to promise a unifying theme; I'm sure Barry Blake hoped for something more coherent than this dog's breakfast of a book.
There are two main problems. First, the author's inclusion criteria are far too broad. Under the rubric of "secret language" he drags in so many different topics that the result is an incoherent blur. Major chapter headings include:
1. From Anagrams to Cryptic Crosswords
2. Talking in Riddles
3. Ciphers and Codes
4. Biblical Secrets*
5. Words of Power
6. Words to Avoid
7. Jargon, Slang, Argot & Secret Languages
8. The Everyday Oblique
9. Elusive Allusions
This doesn't look too bad, but there's less to it than meets the eye. The book runs to about 300 pages, so you might reasonably ask
"How will Barry Blake manage to tell us something coherent and interesting about so many subjects in such a short book?"
The answer is simple. He doesn't. He flits from one topic to the next, like a slightly deranged hummingbird on speed, with about as much impact. The rapid darting minimizes any risk of saying anything of substance. The remarks that do make it in are, with rare exceptions, astonishingly banal. Here, for instance, is what Barry has to say about internet argot:
The invention of the internet has given rise to an extensive argot among those communicating by email, instant messaging, and other social media. There are a number of rebus-type substitutions for syllables such as B4 'before', C 'see', M8 'mate', U 'you', abbreviations such as LOL 'laughing out loud', and emoticons such as :-) for 'smile' and :-( for 'sad' (if they are not transparent, turn them 90 degrees clockwise). Similar abbreviations are used in texting by mobile phone.
Pretty edifying stuff, eh? There's a similar paragraph telling us how internet spammers like to incorporate deliberate misspellings in words like Ciali$ or V1agra to thwart email spam filters; I'll spare you that one, because the boredom of transcribing it might actually be lethal.
The basic problem is that most of the book is like this - the author has very little to say that's original, his writing style makes for heavy sledding, and whatever enthusiasm he might feel for his subject isn't evident to the reader. I found the final chapter, "Elusive Allusions", in which the author explains such difficult linguistic conundrums as the reason Ah-nold is referred to as the "Governator" and the origins of phrases like "Achilles heel" and "Trojan horse", particularly irritating.
*: The inclusion of a chapter devoted to such rubbish as "bible codes" and the like is dispiriting. The "History" channel has a lot to answer for.
I've run across so many non-fiction duds recently that I thought it would be helpful to make a short list of what I look for in a good non-fiction book. There are obvious additions or deletions, depending on whether one is talking about biography, current affairs, history or popular science. But for a book like this one, advertised as an accessible account of recent developments in an important branch of neuroscience, here's a short list of what I hope for:
# Context: give me a sense of where the material in the book fits in the overall scheme of things.
# Organization and signposting: find a reasonable scheme for organizing the material and stick to it. Tell me at the outset what you're going to cover, then cover it, then tell me again what the important takeaway message is.
# Establish credibility: be authoritative without being condescending or vain.
# Be intelligible: write clearly, and at a level that's appropriate for your target reader (note that this requires that you actually have a specific readership in mind). Avoid academic jargon as far as possible.
# Incorporate figures, graphs, and diagrams intelligently - this can often eliminate huge swaths of verbiage.
# Choose good examples. Ideally, they should grab the reader's attention, motivate the questions discussed and illuminate the answers.
Regrettably, Marco Iacoboni's Mirroring People fails to satisfy almost all of these desiderata. I found the book so annoying that I couldn't be bothered finishing it.
Many of the problems can be traced back to the fact that Dr Iacoboni seems to have believed that, since he wasn't writing primarily for neuroscientists, it would be acceptable to:
# give perfunctory descriptions of experimental work,
# sum up results only in the vaguest of qualitative terms (effects are almost never expressed quantitatively),
# over-interpret data to fit pre-existing hypotheses, and
# fail to discuss limitations of experimental work, or give fair discussion of competing interpretations.
Judged as an overview of a body of scientific work, the book fails miserably. Note that I am not saying that the conclusions presented are wrong*. But the summarization of data is so vague and sloppy that the reader is unable to judge whether or not they are valid.
This leaves one in the unsatisfactory position of being asked to take it on trust "from the expert". The problem there is that everything about Dr Iacoboni's style leads the reader in just the opposite direction - the way this book is written raises warning flags on almost every page. It begins with his hagiographic introduction to some of the key researchers in the area, all - not coincidentally - Italian like himself. Referring to them as the "Fab Four" was not a good idea. And this kind of blather only serves to antagonize the reader:
"maybe this talent for deep insight is why [he:]... reminds me of Albert Einstein" "He is one of the 27 members of the exclusive Club dei 27 (www.clubdei27.com), wherein each member personifies one of Verdi's 27 operas"
Intuitions of these uber-scientists are "incomparable", "uncanny"; their talents are unique, special, almost not of this world.
"Now we can add neuroscience to the list of Parma's world-class exports."
DUDE - I'm all for a little local pride, but there's such a thing as overkill. Most readers will be unable to suppress the eye-rolling reflex, you lay it on so thick. And by the same token, please don't keep dragging your lovely daughter Caterina into the damned book. I don't care that she's in sixth grade, gets a lot of homework, is passionate about her ballet lessons; nor do I care about your lovely wife. If you're feeling guilty that you may have neglected your family while working on this book, take them to Disneyland or something. Don't drag them in as bit players.
I wouldn't be anywhere near as cranky about the extraneous stuff that the author drags in if it weren't for the serious omissions. But failure to provide adequate detail about experimental results in a book that runs to 300 pages is a major problem. And what is the author's problem with figures and summary tables, which are criminally absent from this book? Almost every discussion of experimental work in the book could have been improved (and shortened) by a suitably chosen summary table or graph. It's really not hard. Even the newspapers are doing it nowadays.
I could go on, but you get the general drift. The book is an annoying mess, and I remain completely baffled by those reviews that described it as an "accessible account". Only if your expectations are very low indeed would this book not be a disappointment. Mirror neurons may indeed play a critical role in the future of neuroscientific research, but a decent, accessible account of their importance has yet to be written. This book fails on almost every level.
*: though conclusions about a putative link between mirror neurons and language acquisition seem clearly overstated; the imagined link between single-cell results and imaging data appears to be based on little more than anecdotal data in a handful of subjects. The author remembers to include the mantra "correlation is not causation" every 20 pages or so; unfortunately he appears to take this as license to ignore it completely for the intervening 19 pages. Rampant speculation abounds.
The evidence that I am a complete Philistine continues to accumulate, as yet another acknowledged classic sails right over my head. I did not like "The Good Soldier", for various reasons. Here are a few:
# The plot was an awkward mixture of implausible contrivance and overwrought melodrama, and seemed fundamentally not credible, from start to finish. The basic setup (Serial philanderer Edward cheats on controlling Leonora and cavorts with Florence, the slutty wife of the book's narrator John) was OK - this kind of love quadrangle is hardly unusual. But the way the plot unfolds from the basic premise seemed ludicrous, even allowing for the fact that the account of events is being delivered as the recollections of possibly one of the most unreliable narrators in all of 20th century fiction. The plot was little more than a series of random, largely implausible events, lurching from one improbable crisis to the next. Prussic acid capsules in the vanity case? Suicide by penknife? Telegram-induced catatonia? Give me a break.
# The silliness of the plot had a lot to do with the complete lack of depth of the protagonists. You never get the feeling that any of these characters are real people, so their weird antics never seem like anything other than the jerky behavior of cartoonish puppets. Though most puppets have more character than these annoying stick figures. The most annoying of the stick figures being, hands down, the idiot narrator, John Dowell. A man allegedly so stupid that he doesn't notice his wife is cuckolding him with his best friend and hero for 8 years. Or that her "heart condition" is pure invention and that she's healthy as a horse. Who is apparently the only person on the planet unaware that she committed suicide by ingesting prussic acid. There was an enormous sense of relief upon finishing the book, because at least one didn't have to suffer the idiocies of the obtuse narrator any longer. (Dowell wasn't just idiotic; he was also completely without charm, probably a virgin, and likely a closet case.)
# My final objection to the book was the profusion of passages like this one:
And, proud and happy in the thought that Edward loved her, and that she loved him, she did not even listen to what Leonora said. It appeared to her that it was Leonora's business to save her husband's body; she, Nancy, possessed his soul--a precious thing that she would shield and bear away up in her arms--as if Leonora were a hungry dog, trying to spring up at a lamb that she was carrying. Yes, she felt as if Edward's love were a precious lamb that she were bearing away from a cruel and predatory beast. For, at that time, Leonora appeared to her as a cruel and predatory beast. Leonora, Leonora with her hunger, with her cruelty had driven Edward to madness. He must be sheltered by his love for her and by her love--her love from a great distance and unspoken, enveloping him, surrounding him, upholding him; by her voice speaking from Glasgow, saying that she loved, that she adored, that she passed no moment without longing, loving, quivering at the thought of him.
Between this book and "Mr Peanut", it's been a bad month for marriage. But at least "Mr Peanut" was interesting. For me, "The Good Soldier" was kind of a snooze.
I am not a particularly violent person. But there were so many places in this book where I wanted to sit the author down, smack her briskly, and scream at her, "What were you thinking?" It started with the very first word in the book, freshly minted for the occasion by the author. You read it and experience an involuntary recoil of revulsion at the sheer tin-eared ugliness of it. For God's sake, Kathryn Schulz, please don't title your opening chapter "Wrongology". If the first word in your book already makes my flesh crawl, that's hardly a good sign.
I chose this book based on a fawningly positive NY Times review by Dwight Garner, who obviously suffers from some kind of unhealthy crush on Kathryn Schulz. According to Dwight, KS "flies high in the intellectual skies, leaving beautiful sunlit contrails". Now, I get off on the ozone rush of huffing a beautiful intellectual contrail as much as the next reader, but I'm afraid in this case Dwight is letting his slobbering fanboy worship cloud his judgement. Kathryn Schulz is not stupid, but she's certainly no intellectual goddess. Normally, this wouldn't be a problem, but given the direction she chose to take her investigation in this book, the reader begins to wonder if her choice was a wise one and (I hate to say it) if she really has the intellectual chops for the task she sets herself.
I think it would have been a perfectly straightforward matter to come up with a reasonable working definition of "being wrong", one that would cover the great majority (say 95%) of situations that are of practical interest. Had Kathryn Schulz chosen to adopt this kind of pragmatic approach, she would have written a considerably shorter book and, I think, a much better one.

Unfortunately, she has chosen instead to wax philosophical about epistemological difficulties in coming up with appropriate definitions of concepts such as "knowledge", "being right", "error", and, if one wants to bump it up a meta-level, knowledge of errors (one's own and those of others). Not surprisingly, this turns out to be a rabbit hole, and not a particularly interesting one. Schulz's choice to explore these questions has the immediate effect of lengthening the book considerably and burdening whole sections with the kind of jargon only a professional philosopher could love. A case in point: the First Person Constraint on Doxastic Explanation, which is apparently the phrase philosophers use for the phenomenon that anyone pressed to defend his/her beliefs will say that it's because they are true. Flawed logic, conflict of interest, reason clouded by emotion -- these problems may be immediately evident to other people, but we are notoriously bad at detecting them in our own thinking.

To her credit, the author acknowledges the clunkiness of the philosophical jargon; I just wish the replacement she proposes - the 'Cuz It's True Constraint - didn't set my teeth on edge with the faux-folksiness of that 'cuz. But this reflects a problem with Schulz's style that bothered me throughout the book. Maybe it's a consequence of her time as a reporter for Rolling Stone, but she never seems to find a consistent register. As she flips back and forth from discussing scientific results to illustrations from popular culture, this failure to establish an appropriate register becomes jarring. Whether it's her choice to write "fuck up" when "screw up" would clearly have been more appropriate, or sentences like this one, about Hamlet: "It's not as if the prince dillydallies for fourteen scenes over whether to order the BLT or the chicken salad", I just did not enjoy Schulz's writing style.
Which is a shame, really, because buried in there among the superfluous philosophical baggage and the rambling mess of her own prose, she has some genuinely interesting points to make. In particular, her observation that making mistakes is an intrinsic part of human nature, and an essential component of scientific enquiry, is important, if not particularly novel. However, Schulz is not the first person to have considered the questions in her book. A considerably clearer, more focused discussion can be found in the excellent "Mistakes Were Made" by Carol Tavris and Elliot Aronson. The rambling, undisciplined character of Schulz's writing prevents me from giving her book more than two stars. The disappointment is that, with some decent editing, she could have written a far better book.
I had a sufficiently positive impression of Dan Ariely from his first book, Predictably Irrational, to be willing to give this one a try. My residual impression from the earlier book was of a smart, likable guy, with a knack for designing clever experiments to capture the irrational side of human behavior, particularly when making decisions with economic consequences. This area of investigation has risen to prominence over the past 5 to 10 years, and there is now a flood of titles on the market that shows no sign of abating in the foreseeable future. Predictably Irrational holds up well against the competition: it covers a lot of ground in reasonably concise fashion, and is quite readable. Each chapter's primary message is grounded in, and illustrated by, specific experiments conducted by Ariely and colleagues, and this is the book's particular strength.
Given the strength of Ariely's first book, and the relatively short interval since its publication, it would be truly surprising if this second book reached the same high standard. "Sophomore slump" is a real phenomenon (just a manifestation of what statisticians would call "regression to the mean") and Professor Ariely is not immune to its effects. A reviewer predisposed to be critical of the author might argue that this is a sequel that is short on substance, presenting results that are either (i) blindingly obvious (e.g. that people need to believe their work is meaningful to feel motivated), (ii) needless and not particularly illuminating amplification of ideas already presented in the first book (overvaluing of ownership and the power of anchoring), or (iii) material presented previously, and better, by other authors.
That assessment seems unduly harsh to me - the sequel shares some of the positive qualities of the original, primarily Ariely's clear and engaging style, which guarantees readability at the very least. Unfortunately, an engaging style doesn't quite make up for some of the book's obvious weaknesses. The material in the earlier book was fascinating because most of the results were surprising -- counterintuitive or non-obvious -- but the experimental work was strong enough to be persuasive. The experimental foundation of the work discussed in the second book is noticeably weaker across the board, at times barely rising above the level of anecdotal data, with the author displaying a regrettable propensity to issue pronouncements of a general nature solely on the basis of his own personal experience. Even if one disregards the relative weakness of the empirical evidence to support them, the claims made in the second book are simply not as interesting as the earlier work - either they are immediately obvious, or they are restatements of material likely to be familiar to anyone who has done any prior reading in this general area.
Finally, there is the unavoidable impression that a significant portion of the material is nothing more than padding (the book is studded with space-filling sidebars that are notably lacking in content: examples include a one-page explanation of the myth of Sisyphus, complete with stick-figure diagram, a verbatim transcript of an online rant about the 2008 banking bailout, and graphs that were superfluous, cartoonish, or both). The most egregious padding is Ariely's inclusion of what seems like an endless stream of personal anecdotes from his own life, a feature that severely tests the reader's patience and is an implicit acknowledgement that this is a book based primarily on anecdotal evidence, rather than hard science.
With these caveats in mind, I quite enjoyed the book. But I can't give it a resounding endorsement. Instead, I would steer readers to Predictably Irrational, Stumbling on Happiness, and Nudge. Cumulatively they afford an accessible account of the same material that is more thorough and more rigorous than that in Professor Ariely's somewhat disappointing sophomore effort.