Mark D. White

Writer, editor, teacher

  • Mark D. White

    In a New York Times op-ed this morning, Daniel E. Lieberman, an evolutionary biologist at Harvard, makes a strong case for our strong attraction to sugar, but a weak case for paternalistic action on the part of the government to limit our consumption of it.

    The problem is captured by the question he poses after he lays out the scientific reasoning: "What should we do?" Professor Lieberman somehow makes the leap from "people are eating too much sugar" to "the government should do something to stop this" without appreciating the size of the chasm he's jumping. (Just ask David Hume.) He fails to explain why this is a problem that justifies government intrusion into the choices of individuals–he simply takes it for granted that because we have evolved to crave more sugar than he thinks is optimal, the government is entitled to adjust our sugar consumption to bring it in line with what he thinks is optimal.

    This is yet another example of the most offensive aspect of paternalism: value substitution. We are lucky to have access to Professor Lieberman's scientific insights regarding why we crave sugar so much. But then he asserts his opinion that we eat too much sugar, based on his opinion regarding our optimal diets (based on our ancestors' nutritional nirvana long before Coca-Cola and Nabisco). This much is fine–everyone is entitled to his (or her) opinion, and he is fortunate that The New York Times gives him a platform to express it. But like all who endorse paternalism based on what they think we can do better, Professor Lieberman crosses a line when he shifts from expressing his opinion regarding our behavior to endorsing government action to adjust our behavior based on that opinion.

    But isn't it common sense that we should eat less sugar (and salt, and trans fats) and more healthy foods? Of course. But is that our only concern? Our only interest? Should it be? Not only does value substitution misrepresent our true interests, but it also greatly oversimplifies them. I know I shouldn't eat too much sugar. But I also think a sugary treat or drink is a fine complement to a meal, or the cornerstone of a celebration, or a nice way to acknowledge a job well done. People's interests are complex, and while they do include health, they also include a myriad of other things, things that are ignored when a scientific expert or government regulator proclaims, "Too much! Too much!"

    After making a reasonable case for limiting unhealthy foods in schools–a proposal I have very little problem with–Professor Lieberman proclaims that "adults need help too." This is the paternalistic mindset in a nutshell: you need help because we know better than you how you should run your life. Again, he makes a tremendous leap, from "you are being tempted by cheap sugar from the food industry" to "you need the government to step in and counter this influence." First, there is no way to know how much sugar consumption is due to "irresistible temptation" and how much is due to open-eyed choice. Paternalistic regulation–whether in the form of prohibition, taxes, or nudges–is a blunt tool that misses the nuances of human decision-making.

    Second, this treats people as slaves to their passions–there's Hume again–which must be manipulated by the state to counter manipulation by industry. Professor Lieberman devotes just two sentences to spreading information, but decides that it hasn't done enough–"enough," of course, based on his opinion regarding what should have happened. But here's another possibility: people know how bad sugar is for them, and armed with that information plus all of their multifaceted interests, they nonetheless choose to eat more sugar than Professor Lieberman would like them to.

    Professor Lieberman concludes with this: "We have evolved to need coercion." I hope he's not making that claim on the basis of his scientific expertise, because science cannot tell us what we need. Science can help explain why we do what we do–as Professor Lieberman details well in the early part of his article–but it has nothing to contribute to what we need. Such a proclamation requires knowledge of our goals, interests, or "purpose"–the last a teleological notion that scientists normally disavow–none of which science or government knows better than people themselves.

  • Inspired by my friend Colin Smith's post on his blog, Too Busy Thinking about My Comics, regarding our friendly neighborhood Spider-Man's questionable actions in Amazing Spider-Man #685, and then further spurred on by a vigorous debate with Spidey editor Stephen Wacker on Twitter this morning (joined by Colin and several other stalwart discussants), I need to put my thoughts on this matter down for whatever counts as posterity these days.

    Although my comments will be much broader than this specific instance, here is the page of interest:

    [Image: page from Amazing Spider-Man #685]

    While Colin was firmly against what Spidey did in this issue, his larger concern (as I understand it) is with the lack of context and discussion of what Spidey did. No moral reflection, no consideration of the principles or consequences involved–he just "did what he had to do." (Another concern of his was with the increasing proliferation of acts like this, one of which I mentioned in an earlier post, and which is also related to another post on DC Comics' devolving views on killing.)

    One issue that arose in the Twitter discussion this morning was selectivity when applying moral standards to characters. I was accused of sanctioning Batman's practice of child endangerment over the last 70 years, but not Spidey's recent transgression. I replied first that I wasn't sanctioning or condemning either hero's actions, and then elaborated on two points:

    1) Batman's actions towards his various Robins have been discussed extensively in the comics–for instance, in Devin Grayson's wonderful Gotham Knights run, as well as Peter Tomasi's current Batman and Robin, which also deal with Damian's penchant for killing. (There is also a chapter titled "Is It Right to Make a Robin?" in Batman and Philosophy.) It is not the controversial nature of Spidey's actions, so much as lack of recognition of any controversy, that is distressing to me.

    2) Batman and Spider-Man are different characters, with different moral codes, implying different limits and lines they won't cross, as well as different processes of judgment. The Punisher is different, as is Captain America, and Superman, and Daredevil. Even within the Batman family, we see Bruce, Dick, Tim, and Damian all making moral decisions differently–and the fact that we can read that in the comics, that the characters are so well-defined that these differences come out, is a testament to comics storytelling.

    It is this second point about character on which I want to focus here: not only is the lack of reflection or context in Amazing Spider-Man #685 troublesome, but it seems out of character for Spidey to do what he did. (To be fair, he does acknowledge this briefly–see the panel in the lower right-hand corner in the image above–but then seems to dismiss such concerns.)

    Despite his long legacy in the Marvel Universe, Spider-Man is perpetually represented as the kid among grown-ups, representing a still-developing moral idealism alongside the more solidified moral positions of Captain America and Iron Man. We see this when he sided first with Iron Man and then with Cap during Civil War (which I described in my chapter "'My Name is Peter Parker': Unmasking the Right and the Good" in Spider-Man and Philosophy, edited by Jonathan J. Sanford, as well as in this blog post). He also serves as the point-of-view character for events like Civil War, and he has gained the status of moral center of the Marvel Universe–not in the sense that he is always right, but rather that he hasn't got everything figured out, and therefore he considers his actions like a conscientious young person would. For example, he makes idealist pledges like "no one dies," which makes his all-too-easy acceptance of torture all the harder to reconcile with his character.

    Torture is a hot button issue to be sure, and deservedly so. You could say it's the ultimate "trolley problem," in which grave consequences will follow if one adheres to cherished principles. Several years ago, I compared Batman's continued struggle over whether to kill the Joker to a trolley problem and to the contemporary torture debates. Among its other lessons, the trolley problem (which Spidey has also faced, of course) shows us the difficulty of adhering to moral absolutes. It is too easy to say "I will never kill" or "I will never torture," no matter how high the costs, until you're actually in a position–as a government official or superhero–to have to accept those costs on behalf of those who will bear them, in order to maintain your principles. (For a magnificent academic treatment of these issues, see Michael S. Moore's paper "Torture and the Balance of Evils" in his collection Placing Blame.)

    For that reason, I am no moral absolutist when it comes to torture (or anything else). As a Kantian I am a strong devotee of moral duty and principle, but no matter how firmly we adhere to a moral principle, there can always be another moral principle that can be judged to be more important–and this importance can be based, in part, on consequences. But this is a decision that everyone has to make for himself or herself–including fictional characters like superheroes who find themselves in these situations much more often than the average Jane or Joe.

    As much as I admire the character of Captain America, I find it more believable that he, compared to Spider-Man, would engage in torture, with a tremendously heavy heart, when he judged it to be necessary. Ideally, his struggle with such a decision would parallel America's struggle with it. It would rip him apart inside, representing a betrayal of his core principles of respect for human rights and dignity, and he would only do it if the costs of not doing it were unacceptably high, as in the "ticking time-bomb scenario." (He would not, however, be as flippant about it as he was in Secret Avengers #21.)

    As I understand the character of Spider-Man, he would be far less accepting of the "inevitability" of torture in whatever few circumstances Cap would. Spidey would want to look for another way, not stopping until he found it, no matter what the costs to himself (as with his "no one dies" pledge). If anything, he may be guilty of ignoring the costs to others of sticking to his own principles in the face of the "Ends of the Earth," and a thought process like this may in fact explain why he resorted to torture in Amazing Spider-Man #685. But we don't know this, because it wasn't fleshed out (at least not yet). As it stands, it appears that Spidey adopted this morally extreme course of action too easily, and that seems out of character.

    Why does this matter? (This question was at the core of much of the Twitter discussion this morning, and is a point I explored in my other chapter in Spider-Man and Philosophy, "The Sound of Fury Behind 'One More Day.'") It matters because the character of Spider-Man has been around for 50 years now, and for the most part has been defined very precisely, to the credit of the dozens of writers that have told his stories over the years. He has changed and he has grown, but organically within the stories, not suddenly or abruptly–except in cases like the deal with Mephisto in One More Day or the incident that prompted all this hubbub.

    Rare cases like these stand out because they are exceptions to the "rule," or in this case, the character that is Spider-Man. If you've read his stories long enough, you feel you know him and you come to care about him, despite his being a fictional character. (This is the problem with the ciphers walking around in familiar costumes in DC Comics' New 52.) Just as we call out our friends for doing things that are out of character ("this isn't you, Jimmy, it's not who you are!"), we can think of our fictional characters' actions the same way–even more so, actually, because fictional characters, no matter how complex, are much simpler and more well-defined than people in the real world.

    If comics publishers and creators want our loyalty to their characters, they have a responsibility to portray the characters consistently–and if they choose to have a character do something surprising, they should deal with it in-story so it feels organic. The backlash against One More Day is an obvious example of readers' feelings of betrayal at out-of-character writing based on editorial fiat. On the other hand, the frequent back-and-forth between Cap and Iron Man since Civil War, each calling out the other's hypocrisies, is fantastic, because it deals with issues of character and consistency in-story–between the characters themselves, even!

    To me, and to many other readers, Spider-Man never seemed like a character who would regard torture as an acceptable means to an end. If that is a true representation of the character as he's evolved, then I hope the creators will draw that out–that could make for some thoughtful and enjoyable superhero comics. But I also wonder what other lines he will no longer refuse to cross, and that fills me with despair.

  • Mark D. White

    That was fast–in a "Room for Debate" feature that went online Saturday evening, the New York Times asked "What's the Best Way to Break Society's Bad Habits?" The contributors, predictably, take the question at face value and answer accordingly. But the question is nonsensical and the answers beside the point.

    "Society" does not have bad habits–people do. And it is not for "society," or anyone in it, to decide whether a person's habits are bad, except that person himself or herself. Others are free to tell a person they think he or she has bad habits, to try to persuade or inform him or her about why these habits are bad, but only the person who has these habits can judge whether they are bad, based on his or her own interests.

    So the question the Times poses is based on a false premise. A better question would be, what can we do as a society to help people conquer habits that they themselves judge are bad? Paternalism won't work, since it paints with too broad a brush, affecting everyone with a particular habit whether they think it's bad or not. The best way to help people break self-identified bad habits is to hold them responsible for their consequences.

    But exactly the opposite is happening: we are moving away from individual responsibility and toward collective responsibility. This shows up most clearly in health care, where the more responsibility the government takes (or forces private insurers to take) for people's unhealthy behavior, without being able to charge more in premiums or deductibles to make up for it, the less incentive people have to moderate such behavior. If they were faced with even some of the costs of their behavior (as they would under a more flexible private health insurance system), people could make a fairly rational decision whether the cigarettes, or soda, or fatty foods, are worth the eventual cost. But now their personal costs are opaque, consisting of taxes or insurance premiums largely unrelated to their behavior.

    And the all-too-predictable result of more collective responsibility for health care is more governmental control of behavior. Restrictions on unhealthy behavior are not just paternalistic anymore–they're now a public cost problem. Cities and states are eager to cite rising Medicare costs as justification for their restrictions on smoking, trans fats, and other health risks. (Forget broccoli: academics today seriously endorse plans to mandate exercise.) But this is like the boy who shot his parents and then pleads for mercy because he's an orphan; by claiming responsibility for health care costs, the government has created the crisis (or at least this particular part of it) that "justifies" restrictions on behavior.

    Let people judge whether their own habits are good or bad, and let them take responsibility for the consequences of these decisions. That's the right answer to the right question.

  • Oh, Mayor Bloomberg–you make writing a book about libertarian paternalism and nudges too easy. (Thanks!) But seriously, you help show why it's important to write this book, that it's not just some pie-in-the-sky idea that lives only in the ivory tower, but one that affects the real world.

    Yesterday The New York Times reported that New York City Mayor Michael Bloomberg, through his Board of Health, is planning to limit sizes of sugary drinks like soda (other than diet), energy drinks, and sweetened coffee drinks, to 16 ounces. (One person on Twitter remarked that this is still 13 ounces more generous than the TSA.) This applies to prepackaged bottles of beverages sold in bodegas or delis (but not grocery stores or convenience stores) as well as drinks poured by an employee or customer, such as fountain soda sold at fast food restaurants, sports games, and movie theaters.

    According to the article,

    The mayor, who said he occasionally drank a diet soda “on a hot day,” contested the idea that the plan would limit consumers’ choices, saying the option to buy more soda would always be available.

    “Your argument, I guess, could be that it’s a little less convenient to have to carry two 16-ounce drinks to your seat in the movie theater rather than one 32 ounce,” Mr. Bloomberg said in a sarcastic tone. “I don’t think you can make the case that we’re taking things away.”

    No, he's not taking away people's soda or limiting consumer choices–people are free to buy more, smaller drinks or take advantage of free refills–but he is hoping to affect their choices, or he wouldn't be doing this in the first place. This element of cynical manipulation lies behind all nudges: the idea that regulators can leave your options substantively unchanged but still change your behavior for the better.

    This leads to another offensive aspect of nudges: to change behavior without curtailing options, they rely on the same cognitive biases and dysfunctions that their proponents use to justify their imposition. I assume that Bloomberg blames short-sightedness or lack of willpower for New Yorkers' heavy consumption of sugary drinks, but his plan will only work if people are too lazy, hurried, or absent-minded to consider other ways of getting more soda. (His sarcasm about the inconvenience of buying two sodas is ironic, since that inconvenience is one of the things he's counting on to drive the success of his plan.)

    What do I see coming from this? A lot of delis and bodegas working to reclassify themselves as grocery stores instead of "food service establishments" (a health department classification) and a lot more restaurants that serve fountain sodas offering free refills or "buy one cup, get one free" deals. Consumers won't have to "seek out" ways to get their fix; businesses will be more than happy to provide them. Like most poorly crafted regulation, this ban on large sugary drinks will certainly shift some behavior–but in efforts to circumvent the ban, not to conform to it.

    New Yorkers are smarter than you give them credit for, Mayor Bloomberg. Maybe it's all that sugar.

  • My letter in The Wall Street Journal on May 16 regarding the Pathways general education initiative at CUNY drew a quick response from Benno Schmidt, the chairman of the CUNY Board of Trustees, which was published on May 24:

    Mark D. White's criticisms of CUNY's general education initiative "Pathways" (Letters, May 16) are not only incorrect but preposterous in many ways. The Pathways structure was developed by dozens of tenured faculty members, and many hundreds more tenured faculty from across CUNY's 19 undergraduate colleges are deciding which courses should be offered and the learning outcomes to be achieved by each.

    The general education framework was unanimously approved by the university's board of trustees, which has the responsibility under New York state education law to ensure that CUNY, as one university, has clear transfer paths and curricular alignment across its colleges. The resolution calls for a "commitment to the highest academic standards and to the faculty's special responsibility for courses and curriculum."

    The goals of ensuring quality coursework, clearly defined learning outcomes and the reduction of artificial barriers to student progress have driven the initiative since its inception. The general education framework is now consistent with national norms and flexible enough to enable faculty at individual colleges to emphasize lab science and instruction in languages other than English if they choose. The initiative allows students who take any portion of their general education requirements at any CUNY college to transfer all those credits to any other CUNY college, graduating without excess credits and expenditure.

    Far from being an imposition, the work of articulating high learning standards and facilitating student progress must be our highest priority.

    Benno Schmidt, Chairperson, CUNY Board of Trustees, New York

    Rather than respond personally (and probably preposterously!), I will pass the baton to my colleague at the College of Staten Island, and chair of the University Faculty Senate, Professor Sandi Cooper, who asked that I publish her response here:

    Benno Schmidt’s letter (May 24 2012) applauding CUNY’s new Pathways for student transfer sadly reflects a profound communication gap between CUNY’s faculty and its managers. And an outrageous insult to Mark White, a serious faculty member.

    Unlike corporations, Universities run on cooperative arrangements between the professionals (faculty) who design the education and administrators and trustees who are responsible for broad policy, far removed from classrooms and modern scholarship.

    The Pathways project, as our colleague Mark White correctly stated, is an example of the worst kind of managerial micromanagement. When Schmidt states that it was designed by faculty, he is disingenuous. It was designed principally by an administrator who then appointed largely agreeable faculty to fill in the blanks. What respectable faculty senate would approve a core curriculum that requires only 3 credits in World Cultures to be fulfilled by ONE course from language, literature, history, sociology, anthropology, political science, philosophy, etc etc? What faculty senate would agree to only 3 credits of composition for students whose preparation is widely understood to be deficient? Or, for that matter, a 3 cr science course with no time for labs?

    Of the original 50 or so faculty who put this project together, more than half have disavowed it. Over half the full time faculty, over half our distinguished professors and over half of our department chairs have signed a petition asking for repeal and for a new start. Nearly every college senate, most discipline councils, many departments (see website of the UFS) agree. Pathways is a pathway to dumbing down general education to a junior high school level. It will harm those students who enter with the greatest deficits – while they may progress faster, they won't have the foundation to compete in the upper levels.

    As chair of the University Faculty Senate, I have ample evidence that faculty view this imposed core curricula as a trivialization of education. Do trustees and university management think that a university with a majority of minority enrollment should cater to the lowest common denominator? Broad educational policy is the purview of trustees but up to now this always included respect for faculty professionalism in curricula, graduation and admission standards. The trustees may get their way – stuffing a simplistic curriculum down our throats – but we have to take responsibility for the quality of the degree that they will have shaped.

    Sandi Cooper, Chair
    University Faculty Senate, CUNY

  • Mark D. White

    In The New York Times over the weekend, Tim Jackson contributed a piece titled "Let's Be Less Productive." In it, he decries the modern obsession with productivity gains, while recognizing the role they have played in increasing standards of living. He cites the necessarily stagnant productivity of the arts, services, and craft industries–which William Baumol noted years ago, terming it the "cost disease" (because wages must remain competitive even while productivity stays the same)–but he cautions against increasing productivity throughout the economy because of its other detrimental effects, specifically on jobs if higher productivity is not accompanied by growth.

    I have no problem with tempering the push for higher productivity, especially in areas in which it can hardly be expected. Productivity is a means to an end, and therefore it is only valuable insofar as it actually serves that end. But I think there is an end that can benefit from higher productivity which Jackson doesn't see: a less work-centered conception of the meaningful life. Instead, he sees higher productivity as a threat to full employment:

    Ever-increasing productivity means that if our economies don’t continue to expand, we risk putting people out of work. If more is possible each passing year with each working hour, then either output has to increase or else there is less work to go around. Like it or not, we find ourselves hooked on growth.

    On a certain level he's right; if we produce the same amount of output more efficiently, that means fewer resources will be required, including labor. For people who want to work, who need to work, this is of great concern, which makes this an important matter to discuss during these dire economic times.

    But more generally, we should consider whether work is a means to an end or an end in itself. It's the former for almost everybody, of course, but the latter for only some. It's a cultural stereotype that Americans live to work while Europeans work to live, but it is based on a kernel of truth. Some people find their life's meaning primarily in work, but others find it more in other aspects of life, such as service, art, family, or love. Higher productivity may result in fewer jobs, yes, but insomuch as some people find a job a burden–and have other means to support themselves, such as a spouse or partner–higher productivity frees them to enjoy other aspects of life.

    There are other benefits to this aspect of higher productivity. It would relieve the modern necessity of the two-earner family, either allowing a two-parent family to live on one earner's income, or a single-parent family to live more comfortably on one income. And higher productivity can also–if you're so inclined–finance a stronger welfare state, to support those who want to work but can't find a job, and have no partner or other financial support. Even without growth, higher productivity enables a state to fund social welfare programs. (Just look at Sweden, where a fairly unrestrictive regulatory environment for business has led to productivity gains and growth to support their extensive welfare state.)

    There is plenty of room to bemoan the single-minded focus on productivity espoused by many in business and government, and at the same time to recognize that the loss of jobs it creates (in the absence of corresponding growth) has some broader societal benefits, including lessening our reliance on our jobs and careers to give meaning to our lives and relaxing the economic burden on families. Work to live, indeed!

  • I'm no fan of Grant Morrison, especially on Batman. But I will always thank him for bringing the "product" of Son of the Demon into mainstream continuity in the form of Damian Wayne, which has led to a Batman/Robin relationship unlike any that preceded it. Rather than the purity of Dick Grayson and Tim Drake, or the loose cannon that was Jason Todd, Batman now has a partner with training and devotion equal to his own, but few of the principles that keep Batman from crossing the ultimate line into naked vengeance. (See this earlier post for father and son's discussion of killing from Peter Tomasi's Batman and Robin.)

    Batman Incorporated #1, written by Morrison and illustrated by the incomparable Chris Burnham, continues the Leviathan storyline from the pre-relaunch title, but seems just as much like Morrison's own work on Batman and Robin (when Grayson wore the cowl). And it was fun! If I found Morrison's work in the Batverse very up-and-down before the relaunch, it seems like a breath of fresh air now. His "technicolor" vision of Batman is more than welcome, not so much within the current Bat-titles, but more against the backdrop of the New 52 as a whole, which is distinctly lacking in fun.

    Don't get me wrong, I love what Snyder and Capullo have been doing on Batman, beating down Batman until he just can't take any more–and then he still comes back, because he's Batman. (Can't get enough of that kind of story, sincerely.) And Tomasi's Batman and Robin has been a quality title, but Daniel's Detective Comics has been pedestrian at best, and the less said about Batman: The Dark Knight, the better. None of the other Bat-titles excite me either; as much as I wanted to enjoy Batwing, Batgirl, and Batwoman, they all leave me cold.

    But Batman Incorporated #1, now this is fun. More details after the jump–and there may be spoilers.

  • A letter of mine, disputing an "inaccurate" description (to be kind) of Pathways–the rock that we at CUNY have had to roll up the hill, day after day, only, like Sisyphus, to see it roll back down again–was printed in the Wall Street Journal on May 16:

    William Bowen's "How to Keep American Colleges on Top" (op-ed, May 10) presents a grave mischaracterization of the Pathways initiative at the City University of New York. As a department chair and faculty member in the CUNY system, I can assure readers that Pathways is neither "rigorous" nor "faculty-defined," nor did it result from "a collaborative effort of committed faculty members and enlightened administrators."

    Pathways reduces the credit hours devoted to English composition, math and science at a time when our students need these subjects the most. It also makes little room for foreign-language instruction, also shown to be tremendously beneficial to cognitive development, but which is now considered a square peg that must be forced into the round holes of the Pathways general-education template.

    Pathways was imposed on the faculty of the CUNY colleges by the central administration, and most of our faculty organizations have issued public renunciations of it. Pathways is in direct violation of the spirit of faculty governance and negates all the hard work by the dedicated faculty at the individual CUNY colleges to structure their own general-education programs.

    In the end, Pathways is just one more attempt by the central CUNY administration to subvert the autonomy of faculty at the individual campuses. All of us who teach at CUNY, from full professors to adjuncts, want our students to succeed. But difficulties with transferring credits should and can be handled in a way that ensures a smooth road to graduation for our students while guaranteeing them the quality education they deserve.

    See here for the faculty reaction to Pathways throughout the CUNY system (as mentioned in the letter).

  • Wow, has it been another month?

    I had planned to use this blog to collect my online activity for various sites and blogs, as well as traditional publications, but I haven't been active in that area this past month. But it's for good reason–I've delved back into academic writing in a big way, and it's going really well. (Plus, you know, the end of the semester and all. And this doesn't help matters there.) I finished both my paper on Kantian ethics and altruism in the family (which is now under review for a special issue of a journal) and the first draft of a paper on welfare economics that I'll present next month at the World Congress of Social Economics in Glasgow (and which I plan to expand into one of these).

    My big news–from this morning–is that my book on libertarian paternalism (a la Nudge), proposed to Palgrave as a Pivot title (short ebook), was not only accepted but was upgraded to a simultaneous hardcover and softcover release, based on the editorial board's expectations of its appeal and potential. It will be longer than the Pivot format would allow, and ironically will be very close to the original proposal developed with my agent and shopped around to trade presses (unsuccessfully), which is very gratifying.

    In the meantime, I have a lot of things in the stores right now, including The Avengers and Philosophy, Downton Abbey and Philosophy, chapters in Spider-Man and Philosophy and The Big Bang Theory and Philosophy, plus the paperback version of The Thief of Time: Philosophical Essays on Procrastination.

    Finally, I did manage to write a little for blogs over the last month:

  • Mark D. White

    In this morning's New York Times, James Atlas discusses recent books about cognitive processes and neuroscience, such as Jonah Lehrer's Imagine: How Creativity Works, Charles Duhigg's The Power of Habit: Why We Do What We Do in Life and Business, and Leonard Mlodinow's Subliminal: How Your Unconscious Mind Rules Your Behavior. Atlas highlights several interesting things about this publishing trend, including the increasingly analytical focus on the "how" of thought rather than the more existential "why," and the shrinking space allowed for true agency to operate amid the ever-expanding, hidden wiring in the brain. The latter concern motivates the title of his piece, "The Amygdala Made Me Do It," as well as his characterization of this genre as "Can't-Help-Yourself" books.

    Even though I'm an advocate of autonomy, willpower, and self-knowledge myself, I find little to be troubled by here, and quite a bit to be excited about. Freud, of course, posited that much of our thought happens under the surface, and this idea has been brought into modern experimental psychology and described in books such as Timothy D. Wilson's Strangers to Ourselves: Discovering the Adaptive Unconscious (which I highly recommend). And it is the title of Wilson's more recent book, Redirect: The Surprising New Science of Psychological Change, that suggests a positive interpretation of these developments, as described by Atlas:

    The Power of Habit and Imagine belong to a genre that has become increasingly conspicuous over the last few years: the hortatory book, armed with highly sophisticated science, that demonstrates how we can achieve our ambitions despite our sensory cluelessness.

    The discovery of increasing levels of complex scaffolding underneath conscious thought processes need not threaten intentional choice, but rather can help to enable it. Citing David Hume, William James, and Daniel Kahneman (no Adam Smith, Jonathan!), Atlas focuses on the importance of habits to everyday action, and how the conscious mind can redirect those habits for its own ends, such as countering a habit of watching TV with a healthier habit of exercising. Such a person is making an intentional and strategic choice by harnessing the power of habit; our habitual nature is thus transformed from a liability to an asset, from a weight to a tool.

    The danger lies in letting the easy inertia of habit take over and forgetting we have the responsibility to choose which habits to nurture and which to reject. The current picture of our brains casts each of us as the CEO of our minds rather than the entry-level employee or even middle manager: we have the ability to command and direct our cognitive resources, but we retain responsibility for what we do with them.