The Fall

This post contains significant spoilers about True Detective and The Fall.

So I just finished watching the second season of The Fall, with Gillian Anderson. It’s really very well done, smart and political and beautifully shot. Anderson is a standout as detective Stella Gibson, but the cast is excellent overall. It’s got the high production quality and immense attention to detail that are easy to take for granted these days. A couple of things, though.

First: it’s another gorgeously made, lovingly created crime show that just doesn’t work on the level of a crime show. It reminds me of True Detective in that sense. It’s better than that show, and not just politically, but it’s still better acted, better shot, and better made than it is plotted.

True Detective was, for me, an interesting and noble failure because it was a mystery, at least in large part, and it utterly failed in that regard. I mean, really: as a mystery, it could not be worse. As a lot of people pointed out, in the lovely, ridiculous final episode, the whole boat ride meant nothing; they learned nothing of use in solving the mystery, and the creators seem to have included that whole plot point just for the “cool” scenes of showing the sheriff the ritual-rape-and-murder and for the terrible cliche of the secret sniper trick. But if you go back through a lot of the show, very often the things the detectives do make no difference for the overall case. In fact, the entire part where Rust goes undercover in that motorcycle gang, leading to the wildly overpraised single tracking shot, ultimately makes little difference in progressing from point A to point B. That could work in a series that played up that angle; the movie Zodiac very effectively explored the false starts and dead ends that are part of detective work. But that’s not the show True Detective is, or the show it wants to be. I mean, they catch the guy through an impossible intuitive leap based on a ridiculous clue. I’ve painted a lot of houses in my life, and I never got paint on one ear, let alone both. It’s impossible that the character guessed it, the audience had no way of making the connection earlier… it’s a mess. And that’s just in terms of how the detectives get from point A to point B in the story, not even going into the massive plot holes, dangling threads, and pointless contrivances. The show’s beautiful in many ways, but it’s just a bad crime story.

The Fall isn’t nearly as bad in that regard, but I also find it fails as a traditional crime story. It’s important to note that, unlike True Detective, The Fall isn’t a mystery; the killer is the first character we meet. But the show seems to go back and forth without confidence about whether Paul Spector is a master criminal or incompetent. At times, he’s the former, and at times, the latter. He meticulously covers his tracks while simultaneously accruing evidence against himself. The show suggests that this is the product of a manic reaction to a killing gone bad, but it doesn’t dramatize that adequately, and there’s no rhyme or reason to the things he does to cover his crimes up and the things he does to incriminate himself. Again, I think that showing all the ways someone who styles himself a master criminal has screwed up, gradually giving the cops the evidence they need, could be a good show. Something somewhat similar happens in the recent movie Night Moves. But I don’t think the show is doing that intentionally, or at least not cohesively. Whether Paul is bumbling or masterful, and whether he wants to get caught or not, are questions that the show seems perfectly incapable of making its mind up about from scene to scene. Also, a subplot about a powerful Belfast figure who holds sway over the top cop just up and disappears from the story. (Is there any resolution at all to the killing of Detective Olson?) The abusive husband figure is also both a very obvious and heavy-handed dramatization of the show’s key themes and a convenient vehicle to move the plot along whenever needed. He’s too obvious a device for both theme and plot, and this exacerbates his lack of character beyond that of a big violent oaf.

It’s still a very satisfying couple of seasons of television, don’t get me wrong. I just feel like this is a key problem with Prestige Disease, the very palpable awareness a lot of shows now have that TV is supposed to be a very big deal and that they are productions of the highest caliber. Production values are much more reliably bought than a compelling, tightly plotted story.

Second, and maybe more importantly: the show’s much-championed feminism. Which…

On the one hand, yes, it’s more explicitly and intelligently feminist than most any show I can imagine. And it’s a dark, absorbing, challenging feminism, one that really is profoundly ambivalent about the fundamental violence (sexual and otherwise) of maleness. It’s amazing to see feminism of this radicalism and dedication on a mainstream TV show. There’s no noble male figure to reassure viewers that most guys are good guys. There’s better men and worse, but every major male character is implicated in some kind of aggression against women. So that’s great.

But I also feel like the show is pulling a more sophisticated version of the old Law & Order: SVU two-step. That show has a simple and powerfully enticing formula for viewers: it titillates them, then assuages their moral sense by having the detectives righteously win in the end. When Detective Stabler says some version of “creepy perv, you like getting your jollies with little girls,” he’s ritualistically cleansing the prurient entertainment they enjoyed when the creepy perv was getting his jollies with little girls. Not that they approve, of course, but the relationship between viewers and the crimes that entertain them is a little unhealthy in that way. It’s a way to have your cake and eat it, too.

I feel like The Fall is sort of the same way, though surely not intentionally. Because there are a lot of hints that the creators don’t want to make the classic, Hannibal Lecter-style appealing serial killer story… but they kind of do. Near the end of season one, Gibson gives a speech to Spector over the phone that insists that he’s in fact weak, and not powerful at all. (One of the show’s minor weaknesses is a tendency to just announce its themes explicitly.) The show is self-aware enough to want to make that kind of statement. So why is Jamie Dornan shot to look cool, when he’s in his serial killer hoodie and mask? Why is there the classic, intense focus on his serial killer rituals, so common to the genre? Why the loving shots of his abs and pecs? Why does he get in one last cool little philosophical aphorism at the end of the second season, again portraying him as a figure of philosophical detachment and discernment? Why is his intelligence as insisted on as his sadism? Why is his relationship with his daughter so loving? The show’s head believes that he’s evil, but it doesn’t have the heart not to make Paul Spector another gorgeous, charismatic woman-destroyer.

Having been told by a detective she’s just bedded that he finds Spector fascinating, Gibson says, “I despise him with my entire being.” And maybe she does. But the show doesn’t. Or at least, the camera doesn’t. And while I applaud the show’s explicit feminism, I find the implicit sexism of yet another sexy killer of women depressing. I have come to wonder if the only way to make a serial killer story feminist is to have the killer himself be obviously pathetic on the outside as well as morally, or if there can ever be such a thing as a serial killer story that is genuinely feminist at all. Though I don’t think it’s intentional, and thus not cynical, I think the show’s effect is explicit feminism acting as a palate cleanser for another story about dead women and the charismatic evil man who killed them.

Maybe this is just the facts of life in the entertainment business; you’re not going to get a lot of show biz execs willing to cast their screen-dominating villain with someone who looks like John Wayne Gacy. But the show adds to what has become, for me, a growing ambivalence about the power of the protagonist in art and the powerlessness of creators to oppose it. Look at The Sopranos. As many have pointed out, the last several seasons of the show function as a rebuke to all of the viewers who thought that Tony was cool and the fun was in watching him whack people. And, yeah, I hate that as much as David Chase obviously did. But… can you blame them, in a sense? James Gandolfini was so charismatic, and the criminal monster as sexy, enviable figure is so prominent in American culture, that I wonder if asking people to recognize that Tony is a terrible, pitiful creature is too much. How many Scorsese movies have had the intended thematic purpose of demonstrating the moral rot of their characters, yet ended up with those characters plastered on dorm room walls? I knew enough to make fun of people who thought that the point of Fight Club was that you should start your own fight club and do Project Mayhem. But god, they made it look sexy, didn’t they? It takes the incredible beauty and tenderness of that final shot to wipe Brad Pitt’s sexy abs and cool clothes away.

I just wonder if the natural power of the protagonist, or the attractive, intense antagonist, overwhelms the political and artistic desires of creators in most cases. Which is precisely what I think happens in The Fall.

Look, it’s great TV. I wish I had more of it to watch. Its frequent thematic and plot confusion doesn’t outweigh the quality of the characterization, acting, and production values. And it’s remarkable to watch a show with such direct and unapologetic feminism. I just don’t know if, in the end, that feminism is achieved as well in practice as it was so clearly intended in theory.

a simple reform to improve data journalism

Like a lot of terms that get bandied around, it’s not always clear what “data journalism” means, but I’ll risk the potential for being a bit vague and assume that most people know what I’m talking about. We’ve seen a rapid growth in the use of arguments based on statistics in the popular media in the last several years. In particular, we’ve seen growth in journalists and commentators running statistical analyses themselves, rather than just reporting the statistics that have been prepared by academics or government agencies. This is potentially a real boon to our ability to understand the world around us, but it carries with it all of the potential for misleading statistical arguments.

My request is pretty simple. All statistical techniques, particularly the basic parametric statistical techniques that are most likely to show up in data journalism, require the satisfaction of assumptions and checking of diagnostic measures to ensure that hidden bias isn’t misleading us. Many of these assumptions and diagnostics are ultimately judgment calls, relying on practitioners to make informed decisions about what degree of wiggle room is appropriate given the research scenario. There are, however, conventions and implied standards that people can use to guide their decisions. The most important and useful kind of check, though, is the eyes of other researchers. Given that the ability to host graphs, tables, and similar kinds of data online is simple and nearly free, I think that data journalists should provide links to the graphs and tables they use to check assumptions and diagnostic measures. I don’t expect to find these graphs and tables sitting square in the center of a blog post, and I expect that 90% of readers wouldn’t bother to look. But there’s nothing to risk in having them available, and transparency, accountability, and collaboration to gain.

*****

That’s the simple part, and you can feel free to close the tab. For a little more:

What kind of assumptions and diagnostics am I talking about? Let’s consider the case of one of the most common types of parametric methods, linear regression. Whether we have a single predictor for simple linear regression or multiple predictors for multiple linear regression, fundamentally regression is a matter of assessing the relationship between quantitative (continuous) predictor variables and a quantitative (continuous) outcome variable. For example, we might ask how well SAT scores predict college GPA; we might ask how well age, weight, and height predict blood pressure. The types of regression analysis, and the issues therein, are vast, and I’m little more than a dedicated beginner. But I know enough to talk about some of the assumptions we need to check and some problems we have to look out for. I want to talk a little bit about these not because I think I’m in a position to teach others statistics, or because regression is the only statistical process that we need to see assumptions and diagnostics for. Rather, I think regression is an illustrative example through which to explore why we need to check this stuff.

There are four assumptions that need to hold for a linear (least squares) regression to be valid: independence of observations, linearity, constancy of variance, and normality. (Some purists add a fifth, existence, which, whatever.)
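To make the checks below concrete, here’s a minimal sketch of running such a regression in Python. None of this is from any particular journalist’s analysis: the choice of statsmodels and the SAT/GPA numbers are my own invented example, but every check discussed below starts from a fitted object like this one.

```python
# Minimal sketch: an ordinary least squares fit in Python with statsmodels,
# on invented SAT/GPA data. The checks below interrogate "results".
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

sat = rng.uniform(400, 1600, size=200)                   # hypothetical SAT scores
gpa = 1.0 + 0.0015 * sat + rng.normal(0, 0.4, size=200)  # hypothetical GPAs

X = sm.add_constant(sat)        # add the intercept column
results = sm.OLS(gpa, X).fit()  # simple linear regression

print(results.summary())        # coefficients, r-squared, and so on
```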

Independence of Observations

This is the biggie, and it’s why doing good research can be so hard and expensive. It’s the necessary assumption that one observation does not affect another. This is the assumption that requires randomness. Remember that in statistics error, or necessary and expected variation, is inevitable, but bias, or the systematic influence on observations, is lethal. Suppose you want to take the average height of the student body of your college. You get a sample size of 30. (Not necessarily too small!) If your sample is truly random, and you get a sample mean of 5’8, but your actual student population mean is 5’7, that’s error. That’s life. On the other hand, if you only sample people who are leaving basketball practice, and you get an average height of 6’2, that’s bias. The observations aren’t independent; they share a common feature which is influencing your results. When we talk about randomness in sampling, we mean that every individual in the population should have the same chance of being part of the sample. Practically, true randomness in this sense is often impossible, but there are standards for how random you can make things. Getting random samples is expensive because you have to find some way to compel or entice people in a large population to participate, which is why convenience samples, though inherently problematic, are so common.
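If it helps, here’s a toy simulation of that height example. The specific numbers (a 5’7” student body, a much taller basketball team) are invented for illustration, but they show why error washes out and bias doesn’t.

```python
# Toy simulation of error vs. bias in sampling, using invented numbers.
import numpy as np

rng = np.random.default_rng(0)

population = rng.normal(67, 3, size=10_000)  # student body, mean 67 in. (5'7")
ballers = rng.normal(74, 2, size=200)        # the basketball team, much taller

random_sample = rng.choice(population, size=30, replace=False)
biased_sample = rng.choice(ballers, size=30, replace=False)

# The random sample misses the true mean a little: that's error, that's life.
print(abs(random_sample.mean() - population.mean()))
# The basketball-practice sample misses it by over half a foot: that's bias.
print(abs(biased_sample.mean() - population.mean()))
```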

Independence is scary because threats to it so often lurk out of sight. And the presumption of independence often prohibits certain kinds of analysis that we might find natural. For example, think of assigning control and test conditions to classes rather than individual students in educational research. This is often the only practical way to do it; you can’t fairly ask teachers to only teach half their students one technique and half another. You give one set of randomly-assigned classes a new pedagogical technique, while using the old standard with your control classes. You give a pre- and post-test to both and pop both sets of results in an ANOVA. You’ve just violated the assumption of independence: we know that there are clustering effects of children within classrooms; that is, their results are not entirely independent of each other. We can correct for this sort of thing using techniques like hierarchical modeling, but first we have to recognize that those dangers exist!

How would a lack of independence affect regression? Well, suppose you wanted to define the relationship between average number of hours of sleep per night and Body Mass Index. But say you chose your sample by asking people as they left the gym. Your sample is now made up primarily of people who exercise regularly. Maybe the relationship is different for the sedentary. Maybe people who exercise a lot can sleep less and stay trim, but those who are sedentary have a strong relationship between BMI and number of hours of sleep. If your sampling means you’re only looking at the fit for regular exercisers, you have no way to know.

Independence is the assumption that is least subject to statistical correction. It’s also the assumption that is the hardest to check just by looking at graphs. Confidence in independence stems mostly from rigorous and careful experimental design. You can check a graph of your observations (your actual data points) against your residuals (the distance between your observed values and the values predicted by your model), which can sometimes provide clues. But ultimately, you’ve just got to know your data was collected appropriately. On this one, we’re largely on our own. However, I think it’s a good idea for data journalists to provide a Residuals vs. Observations graph when they run a regression.

Here’s a Residuals vs. Observations graph I pulled off of Google Images. This is what we want to see: snow. Clear nonrandom patterns in this plot are bad.

[Image: a residuals vs. observations plot showing patternless scatter]
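For what it’s worth, producing a plot like that yourself takes only a few lines. A sketch on invented data, assuming statsmodels and matplotlib are available:

```python
# Sketch: a residuals-vs-fitted plot for an invented fit. With well-behaved,
# independent data this should look like snow: no visible pattern.
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 200)
results = sm.OLS(y, sm.add_constant(x)).fit()

plt.scatter(results.fittedvalues, results.resid, s=10)
plt.axhline(0, color="gray", linewidth=1)
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.title("Residuals vs. fitted values")
plt.show()
```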

Linearity

The name of the technique is linear regression, which means that observed relationships should be roughly linear to be valid. In other words, you want your relationship to fall along a more or less linear path as you move across the x axis; the relationship can be weaker or it can be stronger, but you want it to be more or less as strong as you move across the line. This is particularly the case because curvilinear relationships can appear to regression analysis to be no relationship. Regression is all about interpolation: if I check my data and find a strong linear relationship, and my data has a range from A to B, I should be able to check any x value within A and B and have a pretty good prediction for y. (What “pretty good” means in practice is a matter of residuals and r-squared, or the portion of the variance in y that’s explained by my xs.) If my relationship isn’t linear, my confidence in that prediction is unfounded.

Take a look at these scatter plots. Both show close to zero linear relationship according to Pearson’s product-moment coefficient:

[Image: two scatter plots, both with near-zero Pearson correlation]

And yet clearly, there’s something very different going on from one plot to the next. The first is true random variance; there is no consistent relationship between our and variables. The second is a very clear association; it’s just not a linear relationship. The degree to which varies along changes over different values for x. Failure to recognize that non-linear relationship could compel us to think that there is no relationship at all. If the violation of linearity is as clear and consistent as in this scatter plot, it can be cleaned up fairly easily by transforming the data. I currently have the advantage of a statistical consulting service on campus, but I also find that the internet is full of sweet, generous nerds who enjoy helping with such things.
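You can reproduce this effect in a few lines. A sketch on invented data: a quadratic relationship, symmetric around zero, yields a Pearson coefficient near zero even though the association is strong.

```python
# Sketch: two invented datasets with near-zero Pearson correlation. One is
# pure noise; the other is a strong but curvilinear (quadratic) relationship
# that the linear coefficient completely misses.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)

noise = rng.normal(size=200)                 # no relationship at all
curve = x**2 + rng.normal(0, 0.5, size=200)  # clear, but not linear

print(stats.pearsonr(x, noise))  # r near zero: nothing there
print(stats.pearsonr(x, curve))  # r also near zero, but something IS there
```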

Regression is fairly robust to violations of linearity, as well, and it’s worth noting that any relationship with a correlation sufficiently lower than 1 will be non-linear in the strict sense. But clear, consistent curves in data can invalidate our regression analyses.

Readers could check data journalism for linearity if scatter plots were posted for simple linear regression. For multiple linear regression, it’s a bit messier; you could plot every individual predictor, but I would be satisfied if you just mentioned that you checked linearity.

Constancy of Variance

Also known by one of my very favorite ten-cent words, homoscedasticity. Constancy of variance means that, along your range of predictors, your y varies about as much; it has as much spread, as much error. That is, if an SAT score predicts freshman year GPA with a certain degree of consistency for students scoring 600, it should be about as consistent for students scoring 1200, 1800, and 2400.

Why? Think again about interpolation. I run a regression because I want to understand a relationship between various quantitative variables, and often because I want to use my predictor variables to… predict. Regression is useful insofar as I can move along the axis of my x values and produce a meaningful, subject-to-error-but-still-useful value for y. Violating the assumption of constant variance means that you can’t predict with equal confidence as you move around x(s).

Here’s a residuals plot showing the dreaded megaphone effect: the error (size of residuals, difference between observations and results expected from the regression equation) increases as we move from low to high values of x. The relationship is strong at low values of x and much weaker at high values.

[Image: a residuals plot with the megaphone shape, spread increasing with x]

We could check homoscedasticity by having access to residual plots. Violations of constant variance can often be fixed via transformation, although it may often be easier to use techniques that are more inherently robust to this violation, such as quantile regression.
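Alongside eyeballing the residual plot, there are formal tests for non-constant variance. The post above only discusses plots, so this is my addition: a sketch of one common option, the Breusch-Pagan test as implemented in statsmodels, run on invented data with a deliberate megaphone pattern.

```python
# Sketch of a formal check for non-constant variance: the Breusch-Pagan test.
# Invented data where the error spread grows with x, so the test should
# flag heteroscedasticity.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(0)
x = rng.uniform(1, 10, 300)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3 * x)  # error spread grows with x

X = sm.add_constant(x)
results = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(results.resid, X)
print(f"Breusch-Pagan p-value: {lm_pvalue:.4f}")  # small p: non-constant variance
```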

Normality

The concept of the normal distribution is at once simple and counterintuitive, and I’ve spent a lot of my walks home trying to think of the best way to explain it. The “parametric” in parametric statistics refers to the assumption that there is a given underlying distribution for most observable data, and frequently this distribution is the normal distribution or bell curve. Think of yourself walking down the street and noticing that someone is unusually tall or unusually short. The fact that you notice is in and of itself a consequence of the normal distribution. When we think of someone who is unusually tall or short, we are implicitly assuming that we will find fewer and fewer people as we move further along the extremes of the height distribution. If you see a man in North America who is 5’10”, he is above average height, but you wouldn’t bat an eye; if you see a man who is 6’3”, you might think to yourself, that’s a tall guy; when you see someone who is 6’9”, you say, wow, he is tall!, and when you see a 7-footer, you take out your cell phone. This is the central meaning of the normal distribution: that the average is more likely to occur than extremes, and that the relationship between position on the distribution and probability of occurrence is predictable.

Not everything in life is normally distributed. Poll 1,000 people and ask how much money they received in car insurance payments last year and it won’t look normal. But a remarkable number of naturally occurring phenomena are normally distributed, simply thanks to the reality of numbers and extremes, and the central limit theorem teaches us that essentially all averages are normally distributed. (That is, if I take a 100-person sample of a population for a given quantitative trait, I will get a mean; if I take another 100-person sample, I will get a similar but not identical mean, and so on. If I plot those means, they will be approximately normal even if the overall distribution is not.)
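That parenthetical is easy to verify with a quick simulation. The exponential distribution and the payout-style numbers here are my own invented example, but any skewed distribution will show the same thing.

```python
# Quick simulation of the central limit theorem: repeated sample means from
# a skewed distribution (exponential, nothing like a bell curve) come out
# approximately normal.
import numpy as np

rng = np.random.default_rng(0)

# 10,000 samples of 100 observations each from a skewed distribution.
samples = rng.exponential(scale=1000, size=(10_000, 100))
means = samples.mean(axis=1)

print(means.mean())  # close to 1000, the population mean
print(means.std())   # close to 1000 / sqrt(100) = 100, as theory predicts
```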

The assumption of normality in regression requires our errors (in practice, our residuals) to be roughly normally distributed; in order to assess the relationship of y as it moves across x, we need to know the relative frequency of extreme observations compared to observations close to the mean. It’s a fairly robust assumption, and you’re never going to have perfectly normal data, but too strong a violation will invalidate your analysis. We check normality with what’s called a qq plot. Here’s an almost-perfect one, again scraped from Google Images:

[Image: a near-perfect qq plot, points falling along the 45-degree line]

That strongly linear, nearly 45-degree angle is just what we want to see. Here’s a bad one, demonstrating the “fat tails” phenomenon; that is, too many observations clustered at the extremes relative to the mean:

[Image: a qq plot with heavy tails, points bending away from the line at both ends]

I will confess that, when I work with my statistics instructor, I still can’t predict what he will deem a “good enough” quantile plot. But this is just another way to say that I’m a beginner. Data journalists would do a good deed by posting publicly-accessible qq plots.
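For those who want to post (or inspect) one, here’s a minimal sketch of drawing a qq plot of regression residuals, again on invented data and assuming statsmodels and matplotlib:

```python
# Sketch: a qq plot of regression residuals with statsmodels. Normal
# residuals hug the 45-degree line; heavy tails peel away at the ends.
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 200)
results = sm.OLS(y, sm.add_constant(x)).fit()

sm.qqplot(results.resid, line="45", fit=True)  # standardized residuals vs. normal
plt.show()
```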

Diagnostics

OK, so 2000 words into this thing, we’ve checked out four assumptions. Are we good? Well, not so fast. We need to check a few diagnostic measures, or what my stats instructor calls “the laundry list.” This is a matter of investigating influence. When we run an analysis like regression, we’re banking on the aggregate power of all of our observations to help us make responsible observations and inferences. We never want to rely too heavily on individual or small numbers of observations because that increases the influence of error in our analysis. Diagnostic measures in regression typically involve using statistical procedures to look for influential observations that have too much sway over our analysis.

The first thing to say about outliers is that you want a systematic reason for eliminating them. There are entire books about the identification and elimination of outliers, and I’m not remotely qualified to say what the best method is. But you never want to toss an observation simply because it would help your analysis. When you’ve got that one data point that’s dragging your line out of significance, it’s tempting to get rid of it, but you want to analyze that observation for a methodology-internal justification for eliminating it. On the other hand, sometimes you have the opposite situation: your purported effect is really the product of one or a small number of influential outliers that have dragged the line in your favor (that is, to a p-value you like). Then, of course, the temptation is simply to not mention the outlier and publish it anyway. Especially if a tenure review is in your future…

Some examples of influential observation diagnostics in regression include examining leverage, or outliers in your predictors that have a great deal of influence on your overall model; Cook’s Distance, which tells you how different your model would be if you deleted a given observation; DFBetas, which tell you how a given observation influences a particular parameter estimate; and more. Most modern statistical packages like SAS or R have built-in commands for checking diagnostic measures like these. While offering numbers would be nice, I would mostly like it if data journalists reassured readers that they had run diagnostic measures for regression and found acceptable results. Just let me know: I looked for outliers and influential observations and things came back fairly clean.
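As a sketch of what that laundry list looks like in practice, here’s how statsmodels exposes these measures. The data are invented, and the 4/n cutoff for Cook’s distance is one common rule of thumb rather than anything prescribed here:

```python
# Sketch: leverage, Cook's distance, and DFBETAS via statsmodels'
# get_influence(), on invented data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)
y = 2.0 + 0.5 * x + rng.normal(0, 1, 200)
results = sm.OLS(y, sm.add_constant(x)).fit()

influence = results.get_influence()
leverage = influence.hat_matrix_diag   # pull of each observation on the fit
cooks_d, _ = influence.cooks_distance  # change in model if a point is dropped
dfbetas = influence.dfbetas            # per-coefficient influence

flagged = np.where(cooks_d > 4 / len(x))[0]  # flag unusually influential points
print("Potentially influential observations:", flagged)
```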

(Here’s a recent post I wrote about the frustration of researchers failing to speak about a potential outlier.)

*****

Regression is just one part of a large number of techniques and applications that are happening in data journalism right now. But essentially any statistical technique is going to involve checking assumptions and diagnostic measures. A typical ANOVA, for example, the categorical equivalent of regression, will involve checking some of the same assumptions. In the era of the internet, there is no reason not to provide a link to a brief, simple rundown of what quality controls were pursued in your analysis.

None of these things are foolproof. Sums of squares are spooky things; we get weird results as we add and remove predictors from our models. Individual predictors are strongly significant by themselves but not when added together; models are significant with no individual predictors significant; individual predictors are highly significant without model significance; the order you put your predictors in changes everything; and so on. It’s fascinating and complicated. We’re always at the mercy of how responsible and careful researchers are. But by sharing information, we raise the odds that what we’re looking at is a real effect.

This might all sound like an impossibly high bar to clear. There are so many ways things can go wrong. And it’s true that, in general, I worry that people today are too credulous towards statistical arguments, which are often advanced without sufficient qualifications. There are some questions that statistics certainly cannot answer. But there is a lot we can and do know. We know that age is highly predictive of height in children but not in adults; we know that there is a relationship between SAT scores and freshman year GPA; we know point differential is a better predictor of future win-loss record than past win-loss record. We can learn lots of things, but we always do it better together. So I think that data journalists should share their work to a greater degree than they do now. That requires a certain compromise. After all, it’s scary to have tons of strangers looking over your shoulder. So I propose that we get more skeptical and critical of statistical arguments as a media and readership, but more forgiving of individual researchers who are, after all, only human. That strikes me as a good bargain.

And one I’m willing to make myself, as I’m opening up my comments here so that you all can point out the mistakes I’ve inevitably made.

it eats everything

At New York Private Schools, Challenging White Privilege From the Inside

Establishment power is defended with the baton and tear gas only as a last resort. In the first instance, it is defended with far subtler, far more insidious means.

On a recent morning, 20 or so high school students, most of them white, milled about the meetinghouse at Friends Seminary, a private school in Manhattan. They were trying to unload on their classmates slips of paper on which they had jotted down words related to the topic “Things I don’t want to be called.”

Street level protests like #BlackLivesMatter are the most genuine and principled form of resistance to this power; counterintuitively, they inspire response from establishment power that is less true to establishment power’s typical modus operandi.

Several girls tried to get rid of “ditsy.” A sophomore in jeans and a gray hoodie who identifies as Asian-American was seeking to unload “minority.” And several white students, including a long-limbed girl in a checkered lumberjack shirt, wanted to get rid of “privileged.” Under the rules of the exercise, no other student was obligated to accept it.

As the history of dictatorship shows, armed, heavy-handed defense of establishment power is effective only until it isn’t. The obvious and crude nature of this form of defense reveals its real-world power but also its vulnerability.

“It’s just a very strong word to use,” the last girl said. “I don’t want to be identified with that just because my parents can afford things. I think it has a negative connotation.”

Contemporary capitalism has produced systems that are far more sophisticated. Modern neoliberal nations do not typically have to crush dissent. They rarely feel forced to meet strength with strength. Paradoxically this tendency to avoid the direct expression of force through violence demonstrates the true depth of establishment power.

The workshop was part of a daylong speaker series known at Friends as the Day of Concern. Students gathered in small groups to discuss a variety of social justice issues and participate in workshops; there were also talks about gender and the environment. But the overarching theme of the day was identity, privilege and power. And it was part of a new wave of diversity efforts that some of the city’s most elite private schools are undertaking.

Perhaps no form of subtle social control better exemplifies privilege’s ability to dominate through soft power than the way in which privilege theory itself becomes a commodity, monetized and peddled to the privileged as easily as consumer electronics or expensive clothes.

In the past, private school diversity initiatives were often focused on minority students, helping them adjust to the majority white culture they found themselves in, and sometimes exploring their backgrounds in annual assemblies and occasional weekend festivals. Now these same schools are asking white students and faculty members to examine their own race and to dig deeply into how their presence affects life for everyone in their school communities, with a special emphasis on the meaning and repercussions of what has come to be called white privilege.

Capitalism employs the power of the rifle only when necessary. Over time, the systems of commodification, appropriation, and undermining become more and more sophisticated; concurrently, the need to use brute force declines. Pinkertons are replaced by well-meaning cultural studies professors. The defense of privilege is carried out by those who rail against it.

The session at Friends Seminary, on East 16th Street, was led by Derrick Gay, a 39-year-old diversity consultant who has led similar programs at Collegiate School on the Upper West Side, Saint Ann’s in Brooklyn Heights and the Spence School on the Upper East Side.

Sincerity becomes a tool of power. When establishment power’s tactics were cruder, less refined, appropriation relied on insincerity; it was a form of outward deception. Now the deception is self-deception. The most committed, most passionate critics of privilege become the agents through which their own critique is packaged, consumed, and ultimately stored away in a mental closet like last season’s handbag.

Mr. Gay, who is black, says schools are increasingly drawn to conversations about privilege and race because they understand that “raising students to live in a bubble — a white bubble, a black bubble, a Latino bubble, whatever type of bubble you want to call it — is not to your benefit in a global society.”

In an earlier time, establishment power would have opposed the creation of an anti-establishment professional class. Today, establishment power recognizes that the surest way to blunt the impact of a social movement is to professionalize it. Thus the rise of the professional anti-racist, the professional anti-sexist, the professional opponent of privilege. Sincerity in pursuing the cause becomes not an impediment to serving the needs of establishment power but a powerful virtue.

For most of their history, private schools were the living embodiment of white privilege: They were almost all white and mostly moneyed. Not anymore. This year, according to the National Association of Independent Schools, minority students make up a third of the population of New York City private schools, and 18.5 percent of all students receive financial aid.

To improve the optics and keep overwhelming irony at bay, privilege enacts aesthetic reforms that deepen inequality. Like the woman elevated onto the board of a company where the CEO makes 300 times the average worker, establishment power looks to diversify systems and institutions that are unequal by their nature and elitist in their function.

Educators charged with preparing students for life inside these schools, in college and beyond, maintain that anti-racist thinking is a 21st-century skill and that social competency requires a sophisticated understanding of how race works in America. In turn, faculty members and students are grappling with race and class in ways that may seem surprising to outsiders and deeply unsettling to some longtime insiders. And the term “white privilege” is now bandied about with frequency.

Political discourse becomes, in the hands of the privilege education industry, inherently and existentially linguistic in its function. Language becomes inescapable: bad language is represented as the cardinal sin, good language the cardinal virtue, language is the means through which those worthy of punishment are identified, and language the tool to punish them.

It comes up during schoolwide assemblies like a recent one held to honor the Rev. Dr. Martin Luther King Jr. at the Little Red School House and Elisabeth Irwin High School, also known as LREI, a progressive school in the West Village. It is explored at parent gatherings at the Dalton School on East 89th Street during broader conversations about racial equity. It is examined in seventh-grade social studies at the Calhoun School on West End Avenue, where students read “White Privilege: Unpacking the Invisible Knapsack,” a 1989 article by Peggy McIntosh that outlines dozens of ways white people experience “unearned skin privilege.”

This obsessive focus on language seems, to those who have accepted its central premises, to be a trap that can catch all bad behavior within it. In fact, privileging language above all else merely empowers the more industrious to escape criticism through employing language themselves. If language is both the cage and the lock, language is inevitably the key.

And at a few schools, students and faculty members are starting white affinity groups, where they tackle issues of white privilege, often in all-white settings. The groups have sprung from an idea that whites should not rely on their black, Asian or Latino peers to educate them about racism and white dominance.

First, by making language the means through which inequality is identified, expressed, and combated, structural and material inequality become strangely marginalized in critical analysis, and those who focus on them are mocked and distrusted.

“In the past, there was a tendency to think: This isn’t my problem and it isn’t something I need to deal with because it isn’t something I even think a lot about,” said Louisa Grenham, a white senior at Brooklyn Friends School and a member of a white affinity group there.

Second, when the linguistic becomes the only means through which to understand the world, the linguistic rejection of privilege becomes an arbiter of who gets sorted into which camp. Curiously, the most effective way to undermine one’s place of privilege is to announce it; “I know I am privileged” becomes a tool with which to force others to see you as something else.

“Whiteness” as a concept is not new. W. E. B. Du Bois wrote about it in the 1920s; James Baldwin addressed it in the 1960s. But it did not gain traction on college campuses until the 1980s, as an outgrowth of an interdisciplinary study of racial identity and racial superiority. It presumes that in the United States, race is a social construct that had its origins in colonial America when white plantation owners were seeking dominance and order.

If all identities are social constructs, it becomes impossible to conduct a reality check. Social critique marches further and further from the material conditions it arose from.

Today “white privilege” studies center on the systemic nature of racism as well as the way it exposes minorities to daily moments of stress and unpleasantness — sometimes referred to as “micro-aggressions.” Freedom from such worries is a privilege in and of itself, the theory goes, one that many white people are not even aware they have.

Whatever critiques of the-thing-in-itself exist become subject to the appropriation of those who have it, and thus their capacity to harm is blunted. Since belonging is a matter of linguistic ritual, even those most directly indicted by these critiques feel no compunction against taking them up and directing them outward. Even the bullet with your name on it cannot harm you if you are allowed to grab the barrel of the gun and point it in the other direction.

It may seem paradoxical that students at elite institutions would decide to tackle the elitism they seem to cherish. But private schools’ diversity consultants brush aside insinuations that their social justice work is inauthentic.

The best defense is a good offense.

In recent months, for example, as the deaths of Michael Brown in Ferguson, Mo., and Eric Garner, on Staten Island, have prompted protests, schools have tried to make the conversation relevant for their students, taking them to Black Lives Matter marches and honoring white civil rights leaders in schoolwide assemblies.

So men who enthusiastically mock “Not All Men,” sharing memes and composing tweets, are inevitably themselves saying “not all men” in a different register. Just as shamelessly as the men who insist “Not All Men,” they extricate themselves from the critique which they ostensibly celebrate.

Talking about “whiteness,” administrators say, gives white students a way into conversations about equity and prejudice that previous diversity efforts at their schools may have excluded them from.

Thus the white person who rails the loudest about white privilege feels themselves to be least vulnerable to the accusation of being so.

At the LREI high school campus, the front entrance is adorned with a student art project, by the seniors Ana Maroto and Sage Adams, that includes a black-and-white photo of a somber-looking teenager, who identifies as mixed-race, holding a placard that reads: “I need justice because I’m sick of having to explain privilege.”

Like an auto-immune disorder, the systems designed to keep the body healthy attack it themselves. Privilege theory has become the instrument of the privileged.

At the Riverdale Country School in the Bronx, two white seniors started the Exploring Whiteness club in the fall, which now regularly attracts 15 students. They were inspired by reading “Waking Up White,” a memoir by Debby Irving, a self-proclaimed WASP from New England who discovered in her late 40s that many of the benefits her father had received in housing and education from the G.I. Bill had been denied to millions of African-American veterans. In the book, Ms. Irving writes about “stepping out of a dream” and realizing that the black people she knew lived in a more challenging world than she ever would face.

Capitalism being as it is, a new class of professional privilege educators is born. They react to market need. If the affluent are seeking to salve themselves through the careful application of privilege theory, a professional class will arise to commodify that desire. If the privileged are looking to be soothed, someone will sell them the balm.

Every year, an increasing number of New York City private schools select students to attend the White Privilege Conference, founded 16 years ago by Eddie Moore Jr., the former diversity director at Brooklyn Friends. This year, the theme of the conference, organized by the Dalton School, is “Race, Privilege, Community Building.”

As the ranks of the professional privilege opponents grow, the urge to defend the theory from external criticism grows.

The new focus on addressing white privilege has not been an unmitigated success. Dr. Moore, for example, despite the stature of his conference, is no longer working with Brooklyn Friends. Acknowledging the inherent tension, he said: “Not every student is saying: ‘I want to talk about white privilege. Give me the best book.’ ”

Luckily for these enthusiastic capitalists, the form of that defense is inscribed in their position: accusations of privilege and bigotry themselves. The initial political defense of these ideas and tactics intermingles with the naked financial self-interest until they are, by design, totally inextricable.

For years, private schools in New York avoided conversations about race and class by remaining uniformly white and wealthy. They began desegregating in earnest in the 1970s and 1980s, as programs for low-income students like Prep for Prep and A Better Chance brought in minority scholarship students. Many white parents welcomed the change, worried that their children would be ill prepared for an increasingly multicultural world if they did not have exposure to people from diverse backgrounds. Today, for example, at LREI, Calhoun and Dalton, at least one-third of the student body is not white.

These people become invulnerable, their commodification impregnable: there is no critique from within privilege theory that they cannot turn around on others, and no critique from outside of it that they cannot dismiss as itself the hand of privilege.

At some of the city’s top neighborhood public elementary schools, nonwhite populations are actually lower. At both Public School 6, on the Upper East Side, and P.S. 41, in Greenwich Village, 21 percent of the students in the 2013-14 school year were nonwhite, according to state figures. At P.S. 41, that is a dip from 31 percent in the 2003-4 school year.

The initial functions of these theories, to challenge and undermine and discomfit, are thus lost, at least to those savvy enough to appear forever on the right side of things.

Many of the private schools have struggled, though, to make these new minority students feel welcome, oscillating between a colorblind philosophy and a feel-good “festival approach” — reserving light discussions about race and class for Martin Luther King’s Birthday, Black History Month and an annual assembly or two.

That approach, diversity directors say, has proved ineffective.

The ameliorative potential of this kind of engagement is always asserted, rarely proven. Nor is serious consideration given to whether, by focusing so intently on feelings as a deracinated aspect of psychology, these efforts actually prevent serious efforts to dismantle the socioeconomic conditions that cause them.

Tim Wise, an anti-racism activist and the author of “White Like Me: Reflections on Race From a Privileged Son,” said: “If you’re still talking about food and festivals and fabrics with high school students, you’re probably not pushing them to think critically about these bigger issues.”

Indeed, in recent years, several documentaries filmed inside these schools — including Michèle Stephenson and Joe Brewster’s “American Promise,” Kavery Kaul’s “Long Way From Home” and “Allowed to Attend,” produced by Trinity’s director of communications — present in excruciating detail the alienation many minority students experience. The schools are depicted as institutions teeming with white students oblivious to their outsize privilege — the lavishness of their spring-break vacations, weekend homes and lunch money — and unaware of the challenges faced by their less privileged classmates.

Absurdly, the more immaterial and asystematic these critiques become, the more likely those who voice them are to self-style as radicals, as if radicalism exists in inverse proportion to the willingness to explore first causes and foundational inequality.

In “The Prep School Negro,” the filmmaker André Robert Lee explores what it was like to be one of the few African-American students enrolled, on scholarship, in the 1980s at Germantown Friends, an elite Quaker school in Philadelphia. He has taken his film, first completed in 2008 and reworked in 2014, to hundreds of schools around the country. He maintains that the screenings have helped spur conversations about race and class that would not have been possible even 15 years ago.

Mr. Lee is now touring schools with another film he produced, “I’m Not Racist … Am I?” Commissioned by the Calhoun School, the film follows 12 New York City private and public school students for a year while they attend workshops exploring racism and white privilege. “School administrators tell me: ‘We realize we have a lot more work to do on these issues,’ ” Mr. Lee said.

In these contexts, the obsessive focus on conversations, awareness, and knowing becomes inevitable. Solutions must, like causes, remain vague, indistinct, and resistant to material evaluation.

Administrators at Friends Seminary would seem to agree. In January, students gathered in the school’s slate-gray meetinghouse, a room virtually unchanged since 1860, to watch a presentation by Mr. Gay, a classically trained opera singer and the former director of community life and diversity at the Nightingale-Bamford School, a private institution for girls on the Upper East Side. With slides, videos and a series of pen-and-paper exercises, Mr. Gay talked to the students about how race, class, gender and able-bodiedness influence people’s perspective and contribute to whether they feel welcome “inside a space.”

During an exercise called “Who Are You?” Mr. Gay asked students to create their own “identity cards,” writing down terms they wanted to be associated with, in stark contrast to the other exercise, which focused on unwanted identities. One girl wrote “white,” “SoHo” and “Sag Harbor”; another wrote “a very nice person.” Then students paired up, with one responding to the question “Who are you?” The room erupted in noise, with students shouting, “black,” “white,” “straight,” “lesbian,” “Jewish,” “Spanish” and “smart.”

Whatever once remained of the material, objective conditions of oppression that first inspired the theory has dissolved. A wealthy 16-year-old becomes representative of marginalized identity; an out-of-work truck driver becomes classified by his male privilege.

“Everyone has a card,” Mr. Gay told the students. “It’s called an identity card. Society doesn’t value each of these identities equally.”

Later he added: “It’s no one’s fault. But you should be aware of it.”

Paradoxically, a movement often accused of essentialism teaches its adherents that they can wriggle out of any critique of their demographic and social qualities.

During another seminar that day, Darnell L. Moore, a writer and activist from Camden, N.J., divided students into small groups, giving them large sheets of paper and felt-tip markers and asking them to develop social-status charts, based on current conditions in America and general perceptions.

The students produced strikingly similar charts, with several envisioning a straight, white male as the most powerful citizen and a poor, black single mother as the least powerful one.

White privilege becomes other white people’s privilege; male privilege becomes the sin of other men; heteronormativity, the fault of some category “straight people” and not the particular “this straight person.”

“It was kind of gross how easy it was to be able to say, ‘This person has to be this,’ ” said Camille Fillion-Raff, a junior at the school.

Educators who do this work in New York private schools say one of the challenges white students face when exploring their own identity is the dearth of white anti-racist role models. They say white students have traditionally been offered only three ways to confront race: to be colorblind, ignorant or racist.

“Those are not happy identities,” said Beverly Daniel Tatum, the president of Spelman College and the author of “Why Are All the Black Kids Sitting Together in the Cafeteria?”

Identity, stripped of any plausible real-world referent, signals everything and means nothing.

With that in mind, the Trevor Day School on East 89th Street spends at least some time every year honoring the white civil rights activist Andrew Goodman, who was killed in Philadelphia, Miss., in 1964, while working to register black voters. This year, the school invited Mr. Goodman’s brother, David, to speak at the school.

But helping students explore their white identity has not been without its challenges.

Once synonymous with reactionary conservatism, pride in being part of a privileged class becomes reconciled with an ostensibly radical, counter-cultural worldview.

At the Ethical Culture Fieldston School, which has campuses in Manhattan and the Bronx, a plan this winter to roll out a racial awareness workshop series for third through fifth graders was met with fierce resistance by parents. Many objected that children as young as 8 were being asked to segregate themselves into race-based affinity groups. Ultimately, parents were told, students who chose not to identify with any of the racial categories would be allowed to sign up for a group that was not based on race. A fifth grader’s father, a white man who asked not to be identified because he did not want any repercussions for his daughter, called the plan “mind-boggling” and said his daughter found the entire concept confusing and unsettling.

Unmoored from the responsibility to actually demonstrate marginalization, groups like #GamerGate proceed to use the terminology and tactics of privilege theory against its champions. Having created the conditions for this appropriation themselves, those champions find themselves powerless against it. Aesthetics having totally eaten the actual, no one has a firm enough place to stand to deny their claims to marginalization, least of all the corporate advertisers towards whom they make their appeals.

At Brooklyn Friends, a controversy over the approach of Dr. Moore, the school’s former diversity director, ended abruptly when he left at the end of last year and did not return this fall. Many students, like Jumoke McDuffie-Thurmond, a black senior, said Dr. Moore was a warm and stimulating figure at the school who talked openly about what he called “subconscious racial bias.” But several sources inside the school said some white students complained that Dr. Moore was a polarizing figure whose focus on white privilege made them uncomfortable. Both Dr. Moore and a school representative described his departure as “amicable.”

Capital thus sends its newly educated young people out into the workplace, stuffed with the means to combat privilege but no idea why, ready to devote ostensibly left-wing theory to the cause of personal financial gain, and possessed of an iron-clad assurance that their self-conception is congruent with the brand new moral world. Political morality is as etched into their identities as their money, as intrinsic to them as will be their inevitable Ivy League diplomas.

At LREI, Sandra Chapman, the director of diversity and community, said conversations about white privilege could be difficult, with some students and faculty members more willing to engage than others. “This is messy work,” she said. “But these conversations are necessary.”

Establishment power then sits back to wait for the inevitable corruption and conservatism of age and time.

butterfly theory in the classroom

I learned this idea from a mentor of mine, and thus I can’t give proper credit to whoever thought it up. Perhaps it’s one of those pieces of lore that’s been floating around between teachers forever and belongs to no one. In any case.

Anyone who’s been a teacher for long enough has probably encountered a student who displays a certain degree of, for lack of a better term, performative eccentricity. Or you might look back to your high school days and think of someone like this. This is the kid who played up a particular kind of difference from the crowd, accentuating his or her “weirdness.” I’m not just talking about being different in general, and I’m not at all suggesting that these people are not expressing sincere aspects of their personality. I am saying that there’s a way in which they broadcast their difference in order to make that the salient aspect of their personality in the eyes of their peers. And I say this from experience, as in middle school, this was more or less me.

Butterfly theory is an attempt at an explanation for this tendency. To understand these students, think of how a butterfly flies. If you study the path of butterflies through the air, it looks like they’re drunk. To get from one point to another, they never travel in a straight line; they zig and zag through the air, dipping down strangely and without warning. The presumed reason for this is a survival advantage: if you are a creature as fragile as a butterfly, it’s a very bad thing if other animals can predict where you’ll go. Contrast with, say, a 1,000-pound moose. You might be prey, sometimes. Your physical advantage doesn’t make you invulnerable. But mostly you’re equipped to handle it if some other animal comes across your path. So you walk in straight lines. In a similar way, people who have certain social vulnerabilities — people who can be easily hurt thanks to the outward aspects that signal different types of social value, particularly when we’re young — have a vested interest in unpredictability. If no one knows who you really are, no one can insult who you really are.

Or, to put it another way: when I was in 8th grade, I think my implicit thinking was “If they define me as the weird kid, at least they aren’t defining me as the kid with the greasy hair, with the bad clothes, the kid who smells.”

As I could tell you, from my middle school experience, this system of self-defense is inadequate. But I think both from my own experience and from my years of teaching, as a sub in middle and high school and as a college instructor, there’s a great deal of truth in this theory. And I’ve often struggled to know how to react to students who I perceive to be enacting this kind of behavior, not out of judgement, but out of sympathy. How can I make them feel that my classroom is a place where they are safe enough to move in straight lines? And how can I think of them in this way without acting like eccentricity and difference are things to avoid, or like they are all performance rather than an expression of genuine personality?

Because I was on the receiving end of the very worst way to go about it. One day in 8th grade, I was quoting from Moby Dick, because I had watched Wrath of Khan the night before. So while we puttered around doing exercises, I was saying some lines to the members of my group. My teacher pulled me out of class and gave me a speech that has only gotten harder to believe over time. My problem, as she patiently explained to me, was that “you’re different from other kids.” “You act so unhappy,” she said; I was lonely because I acted strangely, and if I wanted to be happy, I had to stop. What bothers me in particular, with the weight of hindsight, is that while middle school was something like social hell for me, by then most of the people around me in classes had come to understand, if not accept, who I was. I’m still friends with a bunch of people from that very class. And I’m sure they thought it was odd that I was quoting Ahab, but they knew me well enough by then to leave it alone. As much as I was chased around and laughed at, for a couple years, none of my peers ever made me feel as bad as my math teacher did that day.

So I have some sympathy for fellow teachers who say that the personal or social eccentricities of students are simply none of our business, and that the most humane and fair thing to do is not to acknowledge those idiosyncrasies. By the time they come to us at the collegiate level, students are adult learners, and deserve to be treated in ways befitting the narrow exchange of pedagogical practice between teacher and student. The best way to avoid being like my 8th grade teacher is through benign neglect.

But context matters. There’s a funny reality of teaching freshman composition at a university like mine. So many of the classes our students take in their first couple years, at this huge STEM university, are giant lecture hall classes. I can’t tell you how many of my freshmen have said to me, “You’re the only instructor here who knows my name.” That, to me, dictates a certain responsibility. I guess the punchline of this piece is that I haven’t really discovered a way to meet it yet. The best I can come up with is to act in a way that I hope all teachers would act: to try my best to remain aware of the social dynamics within my classroom that can be so hard for instructors to notice, to extend sympathy and respect, to make sure students know that I am accessible. Then again, I think back to when I was a college freshman and try to imagine going to tell a professor I was lonely. Would never happen. I guess this is all weak brew.

Perhaps by the time they come to me it’s less pressing. Things got better for me, after middle school. I know the popular conception of high school is as a hellish wasteland of ceaseless cruelty, but things were OK for me, and they got better as time went on. Part of that was choosing to get more invested in my hygiene and my appearance — not for that teacher, or for the kids who teased me, but for me. Part of it was just aging into myself; I grew almost 4 inches in 18 months, my complexion cleared up, I lost weight. But a lot of it, I perhaps naively think, was just that people started to give each other an easier time. I became close friends with some of the very kids who had once chased me around. People let stuff go. I think people came to understand how rough life could be and resolved to just leave each other alone, more. You just grow up, you know? In any event, I got popular, to my surprise. I even started dating — although the first two women I dated came, not incidentally, from over the bridge in the next town, and never knew me when I was the awkward kid getting chased.

So maybe by the time they come to me as college students, they are past some of this stuff. Maybe there’s a virtue to not seeing all of the same people in all of the same classes. Maybe it’s the simple reality of not having to ride the bus or eat in the cafeteria. I’d like to think that, at a certain age, the social cost of acting like an asshole overwhelms the insecurity and self-hatred that provokes it. But then, now I’m tall, and I’ve lifted weights for forever, and I dress a certain way, and I have absorbed the subtle rules of the social hierarchy, and I’m educated and male and white, and I have been told I’m attractive often enough to realize that there’s a certain kind of arrogance in self-deprecation. So from that stance of abundant privilege, my optimism is cheap, and I find myself wondering about the abundant social cruelties that may be multiplying right under my nose. Reluctantly, I come to admit that I am powerless to understand, much less to prevent, the pain among the students who I cherish and do not understand.

this is why we can’t have nice things

So here’s a perfect example of how actually-important political debates get lost in the haze of useless slapfights: observe David Corn vs. Bill O’Reilly, on the subject not of foreign policy, or really even media ethics, but of the personal integrity of Bill O’Reilly. Sure, I’m happy to see a lying blowhard get called out in this way. But Greg Grandin wrote a similar piece earlier this month, well before the Mother Jones piece. (To be clear, I’m not accusing anyone of journalistic impropriety, but a link back to Grandin would have been natural and useful to readers, largely because it’s just a better, more thoughtful piece.) Why didn’t Grandin’s piece attract as much attention as the MoJo piece? I mean, it actually had a point beyond shooting spitballs at O’Reilly; it considered the way in which post-Vietnam journalism evolved in response to various late-Cold War conflicts, and in so doing had a broader point than “O’Reilly’s a dink.” O’Reilly is a dink, and he deserves to get called out in this way. But everybody who will side with Corn already thinks that O’Reilly’s the devil, and those who like O’Reilly will never listen to Corn. And of course, there’s Politico to cover it all like it’s a fight on a junior high school playground.

The answer is that Grandin’s post didn’t suffer in comparison despite the fact that it had a broader point than O’Reilly’s lack of integrity. It suffered in comparison because it had a broader point. The politics of personalities is the problem.

Update: I notice that MoJo has a link saying that the Nation had a video first. I’m not sure whether it was there originally or added retroactively. The broader point stands: we should be having a conversation about the Falklands, about South America and the West, and about foreign policy over this, not just a “showdown” between media personalities.

unless your site is about one thing, it’s about everything

I have, for some reason, spent a couple hours on Fusion tonight. (.net, teehee.) And, you know, as much as I might want to make light of Felix Salmon’s cool-guy Monty Burns routine, the site itself is… fine. It’s fine! I mean, it’s whatever. Fine is fine.

But it just isn’t about anything in the way that the site’s founders and editorial people clearly want it to be. I like Alexis Madrigal a lot. But you can write a manifesto, and you can have some sort of goofy TV channel sidepiece going on, and you’re still another site publishing people writing about news and politics and culture and sometimes sports. And in that, you’re joining every other website that publishes about news and politics and culture and sometimes sports.

The mix changes; Grantland is some more sports and a little less news and whatever intern is currently writing the “Bill Simmons” column. Slate is a little less sports and a little more politics and Troy Patterson endlessly writing the word “gentleman” into his Mead notebook in cursive while admiring his new glasses in the mirror. New York is a little of everything with some soothing noises to remind New Yorkers that they are very very important. The revamped New York Times Magazine is a lot of the same edited by people who think you can get more sexy Millennials to your website by adjusting the kerning on your font. The Atlantic is a lot of the same plus Ta-Nehisi Coates plus Coates’s creepshow commenters asking him to forgive their sins. Business Insider is a lot of the same only written for the illiterate. The New New Republic is the same stuff written by every non-white-male writer Gabriel Snyder could find to exorcise the vengeful presence of Marty Peretz’s farting ghost, and thank god for that, plus Jeet Heer with an essay made up of 800 numbered tweets. Buzzfeed is a lot of the same only if life was a Law & Order episode about the Internet from 1998. Salon is the same stuff but every single piece is headlined “Ten Things You Won’t Believe Rethuglicans Said on Fox News” regardless of content. Vox is a lot of the same stuff plus a new-fangled invention called the “card stack,” an innovative approach which allows webpages to “link” to other pages. The Awl is a lot of the same stuff brought to you by the emotion sadness. Gawker is a lot of the same stuff, cleverly hidden across 1,200 sub-blogs along with several thousand words of instructions for how to read the site that are somehow still an inadequate guide. Vice is a lot of the same stuff written by that guy you knew in high school who told you he did cocaine but seemed to only ever have that fake marijuana called Wizard Smoke you could buy at a gas station. Five Thirty Eight, I’m told, exists, although whenever I try to open it my browser seems to show me a strange lacuna into which the idea of a website was, once, meant to congeal. But one way or another, you could take 90% of what each of these sites publishes and stick it on any other, and nobody would ever know the difference.

I’m sure some people will think I’m talking poop and saying these sites aren’t good. That is not the case. I’m saying that they are all as good or as bad as whatever piece I am reading at the moment. Writers are good or bad, and much more, writing is good or bad. But I no longer know what a website means as an identity, unless that identity is a specific subject. I know what Guns and Ammo is. I know what Road and Track is. (I know what Redtube is.) I don’t know what Fusion is. I’m not saying there’s no good work. There’s lots! I’m spoiled, we’re all spoiled, people do good work. All of these places regularly publish stuff that I admire, that I enjoy, that I think is good. (OK not Business Insider.) But that’s the only designation that matters: good. The rest is a matter of logistics and who gets that week’s John Oliver video traffic.

For a website, or a publication, or a magazine, or a native-advertising content vertical, there is no such thing as a sensibility. Such a thing does not exist. I get the desire to have one. For inasmuch as Salmon and Madrigal might seem like utter opposites of Alex Balk, with his morose virtuosity on the subject of a dead dream, in this sense they are the same: they want their sites, their publications, their shops to matter. And it just doesn’t seem like it matters. If you want to publish talent, your identity has to be as insubstantial as the next good pitch. Getting paid? That matters. Rent matters. But not much else.

Which is my long-winded and less-perceptive way of looking at all the stuff John Herrman has been writing about and asking: if Facebook and Snapchat want to peddle your words themselves, what’s the difference? Unless, of course, the point is that Facebook and Snapchat get to keep that rent money for themselves.

Update: Again… I’m just teasing. Just a little bit. These places run good writing. I’m not disputing that. I’m just saying that they are always launched with fanfare about what makes them different, but I’m not sure that you can ever maintain that kind of vision as an actually-existing publisher unless your site has a very specific, subject-matter-based focus. When you’ve got to find enough writing to run, and you’ve received some great pitch that might not reflect your mission statement, what are you gonna choose? The abstraction of sensibility or the reality of good writing?

yes, the Atlantic has an Islam problem

1. Of course I take the question of European anti-Semitism seriously. And taking that question seriously involves discussing it responsibly, and discussing it responsibly means establishing an evidentiary basis. It’s a mark of how unhealthy our conversation about this topic is that discussion of that evidentiary basis occurs under the shadow of threat, the threat of being labeled anti-Semitic for asking whether a rise in anti-Semitism is or is not occurring. Whether or not Islamophobia is real at all is a question that The Atlantic finds perfectly permissible to ask. I invite you to consider whether the magazine would ever even think to ask a similar question about anti-Semitism. No conversation about these issues can possibly be constructive or worthwhile without acknowledging that casting aspersions on Islam is a permissible, mainstream activity for popular publications like The Atlantic in a way that it is not for Judaism. No conversation about these issues can possibly be constructive or worthwhile without acknowledging that the United States and the broader Western world have engaged in a ceaseless campaign of violence against the greater Muslim world for decades. If you write a piece in which you argue that Muslims are responsible for persecuting other groups without discussing the relentless campaign of invasion, manipulation, espionage, and slaughter that has been carried out against them by the most powerful governments in the world, you are not a journalist, you’re a propagandist.

2. I don’t expect Conor Friedersdorf to police his writing in order to avoid attracting the support of bigots, as the comments section of his piece demonstrates he surely did. I do expect him to acknowledge that the mainstream media in general and his magazine in particular are perfectly willing to ask dark questions about the nature of Islam and whether it is a threat to modernity in a way that they would never ask of other religions, and I further expect him to acknowledge that this tendency has teeth, given the incredibly casual way with which this country treats Muslim life. Of course we have a responsibility to be vigilant about European anti-Semitism. That responsibility is universally acknowledged in the American press. Is there any similar unanimity when it comes to protecting the lives of innocent Muslims living in the tribal borderlands of Afghanistan and Pakistan? To protecting the due process rights of Muslims rotting in Guantanamo? To protecting the human rights of Muslims in the Palestinian territories?

3. Along with many, I am of the opinion that the constant frivolous accusations of anti-Semitism that are used to discipline and exclude those who are critical of America’s foreign policy actually make it more difficult to identify and challenge actual anti-Semitism. Every time protest of Israel’s illegal, brutal occupation of the Palestinian territories is attacked as inherently anti-Semitic, our ability to identify genuine anti-Semitism is damaged. Every time that happens, we lose an opportunity to engage those who are not possessed by hatred of Jews but who are adamant in their criticism of Israel. Whether the priority should be to engage those who are capable of being engaged or to ritualistically shame and exclude those who do not toe the mainstream line on Israeli policy is up to the conscience of the individual.

4. Employing David Frum and Jeffrey Goldberg, giving them carte blanche to peddle calls for violence against the greater Muslim world, is not the same as the other kinds of failings I regularly criticize in the media. It’s not the same because Frum and Goldberg have blood on their hands. Through Goldberg’s horrendous failures to satisfy the most basic requirements of journalism, whether due to incompetence or careerism or political bias, he directly and unambiguously contributed to one of the greatest disasters in the history of American foreign policy, one which resulted in the deaths of hundreds of thousands of people. Employing someone who admitted in his book to participating in the abuse of prisoners while working as a prison camp guard is not the same as publishing dumb things on education or the economy. Having a writer stake your magazine’s credibility on a cover story saying an attack on Iran by Israel was imminent, when there was every reason to suspect that the writer was being played by hardliners within the Israeli government to make war seem inevitable, and then rehiring him, is not the same as publishing people whose work I don’t like. Employing the man who coined the phrase “Axis of Evil,” and in so doing helped ruin the best chance for detente between Iran and the United States in a generation, is not the same as publishing something I disagree with politically. Giving a man who was a key architect of the case for war on Iraq a platform from which to constantly troll for yet more American war on Muslim targets is not the same as publishing something annoying. To reward people who have such a record of miserable failure and existential professional incompetence by giving them a platform of such prominence to continue to deepen their mistakes is something different entirely, and I will not withhold criticism of The Atlantic for fear of hurting the feelings of other writers it employs.

5. If you are a young journalist or political writer, and you review the post-Iraq careers of those in the media who were for or against the Iraq war, the message is powerful and incontrovertible: when the next war effort comes around, as it surely will, be for it rather than against it. For if you go person by person through the rolls, you will find that those who were on what we widely acknowledge to be the wrong side of the question have achieved vastly more career success in media than those who were right. Those who were wrong, terribly, disastrously wrong, have gone on to far greater fame and fortune than those who were right, in dominant majorities. They are, as a class, speaking from positions of the greatest mainstream authority or drowning in VC cash, with the black swan exception of Judith Miller simply serving to prove the rule. That’s true whether or not the writers in question engaged in the apology theatrics that briefly came into fashion. (Such “apologies” usually took the form of being “wrong but for the right reasons,” of course.) No publication better reflects this tendency to reward those who were unforgivably wrong about the biggest foreign policy mistake in decades than The Atlantic.

People ask why media never gets better. It never gets better because its members have no incentive to get better. When failure is rewarded and success ignored, the result is a series of broken institutions. At the airport yesterday, I watched Wolf Blitzer and his “terrorism expert” guest busily validate the case for a ground war against ISIS. So: which way do you think the ambitious young strivers in our media will ultimately break?

6. How badly would you have to fail in your job as a journalist or opinion writer before The Atlantic would refuse to hire you? Just how badly do you have to fail before the publication says “no thanks”? This is a question that I have been asking for years and years. I find it a perfectly uncontroversial question given the hiring history of the magazine, and yet it is constantly dismissed as axe-grinding, as obsession, or as ad hominem. I will ask again: how badly does someone have to screw up before The Atlantic will refuse to hire them? Given, that is, that they are toeing the right line to satisfy the magazine’s self-identified neoconservative owner. It’s not a rhetorical question.

7. Any honest consideration of The Atlantic’s publishing history in the last fifteen years must admit that an inordinate number of its pieces take as their subject whether Muslims are uniquely violent, uniquely incompatible with modernity, uniquely deserving of suspicion, inherently bent towards extremism, or worthy of being considered the subject of bigotry and oppression. I am willing to discuss the magazine’s history on these matters, but I am only willing to do so with those who are committed to doing so honestly. On the very day that Friedersdorf’s piece appeared, so did yet another Frum piece inveighing against the refusal to blame Islam for extremism, and so did Graeme Wood’s infinitely self-impressed piece taking the courageous, bold, contrarian stance that ISIS is bad, which has been widely interpreted as an argument for the inherent extremist tendencies of Muslims. (Update: read Jacob Bacharach on Wood’s historical illiteracy and absurd pretensions to daring.) The magazine occasionally publishes pieces that cut against this narrative, including some from Friedersdorf, for which I’m glad; but it constantly publishes Muslim-trolling articles. You want to defend the magazine from these charges? Go ahead. But don’t tell me that I’m not identifying real aspects of the institutional and editorial culture of a publication that has given endless space to those who grind the axe against Islam, and in so doing helped normalize prejudices that are already mainstream.

SNL, the Yankees, and The Atlantic: insufferable for the same reason

Conor Friedersdorf has a piece out today, headlined “Europe’s Increasingly Targeted Jews Take Stock.” (No doubt the head was written by editors.) In it, Friedersdorf spends the requisite amount of time showing Grave Concern about the increasing threat to Europe’s increasingly threatened Jews, who are threatened, at an increasing level. Near the end, he helpfully includes the caveat: “The degree of danger that Jews in Europe actually face is beyond my knowledge.” Or, to paraphrase: the phenomenon that is the sole justification for my piece may or may not be occurring; I just don’t know. It’s an excellent little bit of postmodern maneuvering: I’ll take my Muslim-throngs-are-advancing-across-Europe clicks, please, but don’t take my word for any of this.

If you’re interested in looking at some actual facts about the constantly-expressed fear that Europe’s Muslims are creating an atmosphere of stifling anti-Semitism, rather than just read the assertion one more time, you might start here.

Why does a professional journalist get away with this? For one thing, it’s just a product of a consequence-free culture in professional writing. But I really think a lot of it has to do with The Atlantic, and the way in which institutions become so enamored of their own history that they become indifferent to their present. Friedersdorf’s piece is part of a genre that the magazine specializes in, which is the chin-scratching reflection on moral issues of vast import, which have left these writers introspective, concerned, and stuffed with portentous feelings of doom. It’s precisely the way you write when your publication can’t stop bragging that it once published Twain and Du Bois, instead of taking stock of the fact that it currently publishes David Frum and Jeff Goldberg. (Apparently ruining American foreign policy brings you a lot of cred there.)

The Atlantic comes in for a lot of criticism around these parts, and it’s for this reason: I just can’t stand the pompous invocations of noble history intermingling with the grubby realities of the pageview-obsessed present. Yes, cool, you’re protecting the sacred flame of American letters. You’re also publishing delightful larks, like this hilarious banter from Goldberg, Senior Editor of Being Terrible at Journalism, and James Hamblin, Associate Editor of Resembling an Infant! Remember your old buddy Curveball, Jeff? Ho ho! We are engaged in insouciant comedy! The Atlantic is like the National Honor Society come to life as a magazine, except for when it decides to be Buzzfeed.

The Atlantic, in other words, is rather like two other impossibly self-important institutions, the Yankees and Saturday Night Live. Barry Petchesky has a great thing out today on how the Yankees are becoming a real-life version of the Onion joke. The YES Network is seriously some of the creepiest propaganda I’ve ever seen in my life; most dictatorships would be embarrassed to run with that kind of marketing campaign. Every time I watch a Yankees game I’m overcome by the smell of Old Spice. Even if they weren’t so eminently hateable otherwise, the endless, self-satisfied, smug voiceovers by company men droning on over shots of second base dappled by the sun would send me screaming for the exits. No thanks. I stopped saying the Pledge of Allegiance when I was like 10.

Even worse, though, is SNL. SNL wants simultaneously to maintain this bogus reputation for badassery and being anti-establishment while constantly celebrating its past. I’m pretty sure when you stuff a theater full of a thousand self-important celebrities and leaders so they can ritualistically kiss your ass, you don’t deserve a lot of street cred anymore. It’s amazing: they constantly distract you from how many awful sketches there are by saying “Remember when Belushi was a samurai? That was awesome!” Sometimes I think the only reason they bother to run new shows is so that they can include the clips in retrospectives 10 years later. It’s America’s greatest comedy institution, but the comedy itself just keeps receding endlessly into the past or the future. Meanwhile, Kenan Thompson is in a dress and half the cast is laughing at their own unfunny jokes. And I hate to hurt anybody’s feelings but if you actually go back and watch the whole episodes from the 70s all the way through, you realize how bad so much of it was. Now here’s Bill Simmons with 10,000 words asking whether Church Lady is overrated or underrated. I would have watched the 40th anniversary special but then I remembered I would rather run steel wool across my tonsils.

You could throw MTV into this mix, too. The problem, in all cases, isn’t caring about history. It’s self-mythologizing. It’s spending too much of your time telling people why your enterprise is a very big deal because of the very big deal it’s always been.

By now, most everyone acknowledges that the great blog revolution never actually happened. A few good ideas got out there, some people who wouldn’t have otherwise been heard got an audience, a lot of commiserating happened. But mostly people got bought out or otherwise absorbed. I don’t blame them. You gotta pay the rent. But nobody but really committed nostalgics believes in the old dream of a light-on-its-feet, antiestablishment digital media that can act as a counterweight to the stentorian, self-absorbed old media. That’s OK; the dream was probably bogus and self-aggrandizing to begin with. But one thing that I really, really wanted to actually occur was the death of the immense, ponderous pomposity that has been such a part of media. The real cruel fate is not just the rise of Planet Listicle, endless throngs of minimum-wage 20-somethings out to make it big as writers in sexy NYC, churning out gifs and LOLs like some terrible culture-destroying contraption. I am ready to turn off my brain and sink into my vat of chemicals as Fusion directly accesses my brain’s zinger area. No, the true horror is that this stuff will come right alongside some 26-year-old intoning gravely about the pressure he feels to maintain some stuffy old rag’s claim to profundity. That, folks, is where I sit back and wait for the meteor. Heaven help us.

slice of life

Here’s a collection of random books from around my office, all of which have recently been a part of some sort of academic work of mine — my dissertation, articles I am trying to send out, a book proposal, or, most of all, the hundreds of pages of ideas and arguments and explorations stuffed into my hard drive that will never bend themselves into a form that will permit me to share them.

[photo of the books]

Showing books is always, in some ways, a brag. I suppose I can dilute that element a bit just by saying that I have not read some of these cover-to-cover, and have no intention of doing so, and also that in each of them, there is some aspect that I barely or don’t understand. Beyond that, I accept that this is a brag.

With that aside, I share them simply to demonstrate the variety of intellectual and mental work that I have been permitted to do, these past 6 years of graduate education, and to use that variety as an answer to a question I hear a lot. Typically, when people out there in internet land hear the name of my field, they laugh at it. (Take, for example, this emailer to Andrew Sullivan, and, to salve my ego, read also these responses to him.) That’s OK. When I made the decision to go to grad school, I knew that “grad student” generally and “humanities grad student” especially are codes, in our culture, that attract ridicule. I’ve written about my decision to go before, so I won’t belabor the point, but the long and the short of it is that if you try to do something that you think will bring meaning and purpose to your life, people will savage you for it these days. And savage they may: the last six years of my life have been the happiest and healthiest I have enjoyed since my very early childhood.

Still, the question “why do you study that?,” and a more sophisticated version of it, remain permanent fixtures of this double life I’m living as an academic and a blogger. The more sophisticated version comes from those who understand the labor and institutional conditions of the contemporary university. Since I have an interest in, and a growing competence with, using algorithms and coding languages to analyze text, why not pursue a degree outside of the humanities, such as in a dedicated linguistics department or a quantitative social science program, or in those rare slices of the humanities that retain an amount of sexiness, like the digital humanities? You seem to do a certain kind of work, the thinking goes, and that kind of work could be done in fields that are more likely to result in permanent employment, and more likely to get respect from people like that Dish emailer. So why didn’t you go to a department like that?

Well, to be clear: I’ve applied for jobs in those fields and would consider taking them, if offered, though I wouldn’t at all give them priority over jobs in my current field. I would weigh the pros and cons just like with any job. But as to why I didn’t start out in those fields, given the perceived employment and prestige advantages… The first response is the most pragmatic, which is that these questions presume that I could have gotten into those programs when I started. I’m not suggesting that my current program is easy to get into; in fact, if I can again get away with appearing to brag, it’s very hard. What I mean is that I didn’t discover my interest in, and facility for, some of the things I do until I was already here. I learned that I liked this stuff, and have the ability (if not talent) to do it, while I was exploring within rhetoric and composition. (More on that in a sec.) I doubt I could have gotten into some of the programs that people think I should have applied to, in large part because they wouldn’t have had any reason to take me. Second, while I recognize the sense in which these fields are sexier and more in keeping with the zeitgeist, it’s not at all clear to me that there actually would be a job market advantage in pursuing them. While my field has suffered as the entire humanities have, and especially in the great post-2008 contraction, the labor market in my field is about as healthy as the humanities get these days. It happens that the world still needs to train young people in how to argue and write, perhaps now more than ever. I am simultaneously amused and disturbed by people who presume that, because we are living in a world of a thousand facile arguments for teaching kids to code, we have suddenly lost the need to teach people to express themselves in writing and to make good arguments when they do so. For all the pressure being brought to bear by the Scott Walkers of the world, there are still enough people who recognize that we need to train communicators, citizens, and human beings, though I admit that this conviction is deeply threatened.

More important are these two points. First, I deeply value a lot of the work that goes down in my field, and the presumption that a lot of it isn’t useful or rigorous simply reflects the pervasive bias of our times. And it isn’t even just that I find this work valuable because I am at heart a humanist who believes in the traditional values of the liberal arts. Even many among the jaded who believe that we have nothing left to learn from the humanities could be persuaded to see the importance and pragmatism of what many of my colleagues are doing, if they could just open their minds a little bit. There’s so much of interest going on. My professor Thomas Rickert, for example, recently published a book called Ambient Rhetoric. Rhetoric is typically defined (when it isn’t mistaken as being necessarily a matter of deception) as the means of persuasion. That tends to create a certain narrowness in how we define the rhetorical situation: who is speaking, and to whom, and for what purpose. Rickert argues that arguments take place amid a vast number of ambient features that constantly shape their outcomes. Or look at a friend of mine, whose dissertation research is about communicative failure in large institutions, particularly corporations. What causes communication breakdowns in large businesses? Where do communicative failures occur, and why, and what are the consequences? There is so much interesting work being done, but none of it will be absorbed by those who have decided that the only knowledge of value these days comes out of algorithms. And I say that as someone who is waiting for an algorithm to finish spitting out its results at this very second.

Finally, though, there’s this: the books that you see above. Because to me, they symbolize freedom. True, immense, academic freedom, the freedom to explore all the different ways in which language and communication and persuasion intermingle. That’s what I’ve gained from being in my program and being in my field. There are a lot of departments in the world where I would receive great teaching and mentorship and support, but very few where I would be the recipient of the kind of freedom that’s been extended to me. I mean I have friends in educational psychology, for example, who are in great, supportive programs where they and their peers do innovative and important work. But they tell me plainly: they’ll be chasing a particular value their whole life. That’s indicative of what constitutes success in their field. My faculty have always had a straightforward response to my own efforts to learn and research stuff that seems remote from what the field is about: you have to finish our curriculum, but otherwise go for it. When I said I had to take a class just in applied regression analysis, that it was important for me, they said, cool. Go for it. When I said that I was going to take some time to start teaching myself (sllloooowwwllllyyyy) to code in R, they said, cool. Good luck. They have had warnings and advice — remember your exams; remember that you have to write a dissertation — but they always encouraged me to follow my own inquiry. They were sometimes afraid that the people who would eventually be reviewing my credentials might not understand who I was as an academic, and not without cause. But they told me to go for it. In many of the fields people think I would better belong in, if I asked to take a course in classical rhetoric, the answer would be, what on earth for? And yet I have learned, from my limited, partial, and ultimately amateur dalliances with the social sciences and computation, that my course in that subject was invaluable. Nobody here has ever told me no. And in a world where more and more people seem to define a field’s seriousness by the narrowness with which it defines success, that’s a blessing.

That great narrowing is, for me and for so many others, a great sadness. It is also an unnecessary one. As the man said, there is absolutely no inevitability as long as there is a willingness to contemplate what is happening. Nothing powers the growing threat to the humanities more than the widespread perception that they are threatened, often expressed by those who aver that they would change that condition if they could. I fixate on subjects like the STEM shortage myth because they demonstrate the danger of turning vast presumptions about the nature of knowledge and human progress into soundbites. So often, these arguments fly in the face of the very approach to human knowledge that they lionize. That is, they call for enumeration, but fail to provide meaningful numbers to defend their claims; they champion the rigor of certain fields, but apply none in that championing; they place “data,” whatever that is, on a pedestal, and yet assume great changes in the human economy without bringing any of it to bear. What could be more indicative of contemporary times than the growing throng of data journalists who run regression analyses while seeming barely to understand how to do so, or why? In an age of Tom Friedman-style aphoristic deepities, spouted endlessly by the most powerful people in the world at Davos or Aspen, wisdom and skepticism are not outmoded tools of a lesser age. They are the only life raft left available.

It happens that I have gotten a few bites lately, from a few institutions and businesses that are, perhaps, interested in putting coin in my palm to continue to do the work I have set out to do. Some are academic and some aren’t. Nothing is certain, and nothing may come of it. For now the interest is enough; I find that merely being asked “who are you?” brings a kind of gratitude I can hardly express. I began all of this with a simple reality about myself: that there are things I have to know, and things I have to share, and they come from all over the world of the mind. Maybe they will continue to let me make a little money doing those things, and maybe they won’t. Either way, I’m prepared. I don’t know what’s going to happen tomorrow. Today, I’m exactly where I need to be.

for whom the rules bend

David Carr died last night. The outpouring of genuine, deep grief from many of his peers and friends has been deeply moving. I don’t know what it’s like to have known or worked for him, but public sadness of this depth is impossible to fake. It seems clear that the Times and journalism have lost a remarkable man.

I’m going to make an observation now that will surely be taken by some as an insult to that very bout of public mourning. It isn’t; I personally have no reason at all to question the popular narrative of Carr’s life and death. I simply want to point out: one of the crimes that Carr was guilty of, during his years as an addict, was serial domestic violence. That’s a matter of public record, by his own telling. And I will further say that this is one of those crimes that is usually treated, by the amorphous but powerful group that polices norms online, as unforgivable. For most public people, having repeatedly beaten women would be the end of their good reputation, no matter if it happened under the influence of drugs and alcohol, no matter how many years ago it happened. More, many of the people who would enforce that damage to reputation are the same people who mourn Carr now. Again, some people will assume I’m saying that Carr’s reputation should be similarly damaged. I’m not. I’m just observing a discrepancy, and asking: what makes this person, in this time, exempt from the usual rules?

When Weev was revealed to be a white supremacist, I was shocked to find that, among the very vocal denizens of Twitter who enforce online shaming norms, he had supporters who knew about his history of anti-Semitism and casual racism. (Molly Crabapple was a prominent one, though she has since walked it back after very vocal criticism.) To this day, there are people who take part in Twitter shaming who have a soft spot for him. How is this permitted? Amber Lee Frost and a group of women sympathetic to her were dragged through the mud, their reputations subject to the most brutal assault, for linking to a public tweet. Weev is a literal Nazi, not a “that guy’s such a Nazi” Nazi but a giant swastika chest tattoo Nazi, and yet people can get away with defending him to this day. Why? What are the rules? How long ago does a bad deed have to happen before support for that person is no longer met with a shame spiral? Could, say, Roman Polanski point out that he was under the influence of drugs himself, and seek redemption? Why, or why not? Some people’s reputations follow them. Some don’t. Dr. Dre beat a female reporter nearly to death for doing her job, and has never expressed contrition. Yet as far as I can tell, he’s a beloved pop culture figure. Why? What are the rules?

Carr himself asked a version of this question.

“If I said I was a fat thug who beat up women and sold bad coke, would you like my story? What if instead I wrote that I was a recovered addict who obtained sole custody of my twin girls, got us off welfare and raised them by myself, even though I had a little touch of cancer? Now we’re talking. Both are equally true, but as a member of a self-interpreting species, one that fights to keep disharmony at a remove, I’m inclined to mention my tenderhearted attentions as a single parent before I get around to the fact that I hit their mother when we were together. We tell ourselves that we lie to protect others, but the self usually comes out looking damn good in the process.”

I am against the online shaming phenomenon for a variety of reasons — because I think it’s bad politics, because I think it’s ineffective, because I think it’s a performance that pleases those taking part in it without threatening establishment power, because it is inhumane. But this is the deepest reason: its fundamental fickleness, its singular hypocrisy, the way that these explosions of shame so conveniently map onto the contours of self-interest, popularity, and momentary convenience. David Carr was a complicated man, one who like all of us had good and bad things about him. I’m not interested in prosecuting the case of whether he was a good or bad man. I am interested in prosecuting the integrity of those who think they always have the right to do so.