Wednesday, November 1, 2017

Sexual Assault at America's Colleges: A Culture of Complicity

This month, the world heard that Harvey Weinstein is a serial sexual abuser. Hollywood actors summoned their craft to feign surprise. But even George Clooney and Meryl Streep can't act well enough to inspire shock in an audience that has learned to distrust everyone from presidents to priests.

Harvey Weinstein was no priest, and the stories of his sexual misconduct should surprise no one. Apparently his reputation was so well-known in entertainment circles that NYU professors discouraged female students from interning with him. Even entertainment outsiders could have guessed that Weinstein's company would be a lion's den for aspiring young women. Weinstein, after all, is the kingpin behind raunch-fests like "Grindhouse," "Dirty Girl," "Sin City," and "Zack and Miri Make a Porno." It doesn't take an investigative reporter to suspect that a person whose wares were the objectification and exploitation of women might not be averse to the objectification and exploitation of women in the workplace. Think of it this way: normal employers couldn't show many of Weinstein's movies in their offices without creating what sexual harassment lawyers call a "hostile work environment." But for Weinstein and the women who worked with him, those movies were the work environment.

Weinstein did not survive as a prolific sexual predator by stealth. Rather, he thrived thanks to what The New Yorker termed a "culture of complicity" in Hollywood.

There are two approaches to fighting sexual assault. The first and most obvious method is to hunt, expose, and terminate predators one by one after they have already claimed their prey. But a more proactive, if less satisfying, approach is to modify the habitats where predators like Weinstein thrive.

Indeed, America's college campuses, where sexual violence has reached epidemic levels, are one such habitat in sore need of modification.

By now, any college official who is interested in providing a safe educational environment for aspiring young women knows two things. The first is that sexual assaults are occurring at an alarming rate on American campuses. If you send your daughter to an American college this year, there is a one in five chance that she will be sexually assaulted during her time there. (By comparison, if you sent your son to war in Iraq, the odds that he would be killed or wounded were about one in fifty.)

The second thing they should know is that one of the most significant predictors of sexual assault in college is the presence of heavy drinking. 80% of campus sexual assaults involve alcohol and, not surprisingly, college environments with less drinking also have far fewer sexual assaults than those with more. Sexual violence is seven times more common at "party schools" such as Syracuse and Penn State than at "stone-cold sober schools" like BYU and Brooklyn College.

These statistics are not new--studies in the 1990s showed that sexual violence is more common at schools where there is more binge drinking. They are also not surprising--drunkenness and risky sexual venturing have become more readily associated with the college experience than academics. Even eight in ten college students themselves agree that less alcohol would prevent sexual assaults.

Bureaucrats are often accused of paying lip service to problems without finding solutions, and when it comes to campus sexual assault, America's public officials--those charged with giving aspiring young women a safe place to learn--have been in rare form. Thus far, politicians and experts have compiled volumes of statistics and organized no shortage of task forces, yet they continue to treat sexual assault as a conundrum without a solution. In January 2014, President Obama created a task force to address college sexual assault. Three years later, the task force issued a report that it gushingly dedicated to the courageous sexual assault survivors who were "agents of change." The task force then reported data that proved not much has changed at all. It didn't inspire confidence that change was on the horizon, either: a section shamefully mistitled "Prevention Programs That Work" does not even mention the schools that keep their rates of binge drinking and sexual assault low.

The report did warmly embrace colleges that used token awareness programs to counter sexual assaults. Since 1999, the federal government has thrown more than $131 million at such programs. But the programs have treated sexual assaults as if they occur in a vacuum and, not surprisingly, they haven't worked. The Department of Justice reviewed such programs in 2007 and lamented that "despite the link between substance use and sexual assault, it appears that few sexual assault prevention and/or risk reduction programs address the relationship."

Those colleges that have tried to lower alcohol consumption have done so with laughable half-measures that are doomed to failure. Indiana University banned hard liquor at frat parties. Stanford limits the size of alcohol bottles. (If only one could lose weight by eating ice cream in smaller dishes.) Michigan began "self-policing" its fraternities by sending its bravest--and certainly least popular--students to patrol frat houses in bright orange shirts, asking drunk partiers to please get off the roof. Meanwhile, in June 2016, more than 35 universities began earning fortunes by selling beer at football games. At the University of Maine, my alma mater, there is a pub in the middle of campus, and the school anthem--taught to every student on their first day of college--is a drinking song.

Sadly, America's pesky culture of sensitivity, not ignorance, is giving predators a habitat to thrive in. Fear of being seen as blaming victims and excusing perpetrators has chilled a long-overdue discussion about the connection between alcohol and sexual violence. As one famous victim of sexual assault rebuked her assailant: "Alcohol is not an excuse. Is it a factor? Yes. But alcohol was not the one who stripped me, fingered me, had my head dragging against the ground, with me almost fully naked."

She was right. Drunkenness cannot be a shield for perpetrators or a sword against victims. But such concerns didn't stop Americans from nearly halving alcohol-related traffic deaths by waging an aggressive and multi-faceted campaign against drunk driving. If such a campaign against sexual violence on college campuses is going to happen, it will have to begin with a recognition that America's beloved alcohol is playing a major role.

College officials do a shameful disservice to America's daughters if they let political correctness impede progress. One frustrated writer interviewed experts who confessed that they were reluctant to advise college girls to protect themselves from sexual predators by not drinking. She wrote:
[W]e are failing to let women know that when they render themselves defenseless, terrible things can be done to them. Young women are getting a distorted message that their right to match men drink for drink is a feminist issue. The real feminist message should be that when you lose the ability to be responsible for yourself, you drastically increase the chances that you will attract the kinds of people who, shall we say, don’t have your best interest at heart. That’s not blaming the victim; that’s trying to prevent more victims.
Sexual assault is endemic to college campuses not because the problem is concealed, but because a culture of complicity exists. Officials know that binge drinking and sexual assaults go hand-in-hand, yet they remain willfully mired in the brainstorming phase of their response, grasping for more convenient solutions to the problem and refusing to take anything but the most tepid measures to fight their colleges' binge-drinking cultures. But the most effective treatments for America's rape culture would be a healthy dose of some good, old-fashioned Victorian prudishness, and a resurrection of progressive temperance.

It is high time that booze gets some bad publicity. Microbreweries and country singers alike have successfully portrayed America's favorite drug as not only acceptable but downright wholesome. A discussion about the connection between alcohol and sexual assault would be a good start at giving alcohol the bad reputation that it has earned.

The colleges that give sexual predators an environment to thrive deserve some bad publicity too. Perhaps we could begin by abolishing the phrase "party schools" from our vocabularies--a euphemism that is more likely to invite than deter young people.

Perhaps "hotbeds of sexual violence" would be a more apt substitute.
_________________________________________________________________

Further Reading and Other Perspectives:

In 2015, CNN aired a documentary about campus rapes called The Hunting Ground, which, interestingly, Harvey Weinstein helped fund. One of the main arguments in the documentary is that 8% of college men commit 90% of the sexual assaults, so purging those predators from college campuses is an effective way to address the problem. The documentary has received some criticism for the data underlying its argument and for its portrayal of the accused.

Wednesday, October 11, 2017

Dear Baseball, It's Time to Get Rid of Home Plate Umpires

I'm not a baseball fan, strictly speaking. Don't get me wrong, I love the sport, but I harbor subversive ideas that a real baseball fan, by definition, cannot hold. I am a heretic. Take my view of umpiring, for example. In my opinion, it is high time Major League Baseball turned the job of calling balls and strikes over to the machines.

Logistically, it could be done. Already, home plates have become epicenters of scientific observation; there are probably more gizmos fixed on any given professional diamond than there are observing all of the planets in our solar system. Those gizmos give baseball fans--the most stat-hungry in sports--a trove of data that makes traditional metrics seem as antiquated as baseball cards. Gurus these days are more interested in the speed of a pitcher's fastball, the spin rate of his curve, and the exit velocity of a batter's home run than ERAs and batting averages.

These wonders of modern technology should be revolutionizing how umpiring is done. Instead, they are highlighting how atrocious umpiring is and probably always has been.

A short history: since 1887, baseball has relied upon the relatively frail eyesight of human beings to determine whether a pitch has passed through an imaginary box known as a "strike zone" (before then, batters could tell pitchers to throw a high, low, or fair pitch). Players and fans have been complaining ever since. Arguments over balls and strikes and the feeling of being ripped off by a blown call have become as integral to the baseball spectacle as chewing tobacco and facial hair.

Although their job hasn't changed much, being a major league umpire is as difficult as ever. A 92-mph fastball, once considered fast, is in the air for a mere 446 milliseconds. Today's pitchers have shaved precious milliseconds off that figure by throwing 100 mph more frequently than ever. Home plate umpires are expected to judge a ball's position at such speeds with no physical reference whatsoever: just an imaginary strike zone suspended in air.
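For the curious, that figure is easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python; it crudely assumes the ball travels the full 60.5 feet from the rubber to the plate at a constant speed (real pitches are released a few feet in front of the rubber and slow down in flight, which is why published numbers differ slightly):

```python
# Back-of-the-envelope flight time for a pitch (a rough sketch, not a physics model).
# Assumes constant speed over the full 60.5 ft from the rubber to the plate.

MOUND_TO_PLATE_FT = 60.5
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

def flight_time_ms(mph: float) -> float:
    """Milliseconds a pitch is in the air at a constant speed of `mph`."""
    ft_per_sec = mph * FT_PER_MILE / SEC_PER_HOUR
    return MOUND_TO_PLATE_FT / ft_per_sec * 1000

print(f"92 mph:  {flight_time_ms(92):.0f} ms")   # ~448 ms, near the cited 446
print(f"100 mph: {flight_time_ms(100):.0f} ms")  # ~413 ms
```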

Now, Major League Baseball has technology that could instantaneously call balls and strikes with the precision and impartiality that only machines can offer. It seems like a no-brainer. "The sad thing is you have no clue what could be called a ball or a strike at any point," complained former big-leaguer Eric Byrnes. "Why do millions of people sitting at home get to know whether or not it was a ball or strike, yet the poor dude behind home plate is the one who's left in the dark?"


If you can't relate to the emotional sting that comes from losing a game due to a blown strike call, consider some hard data that prove how bad the best umpires in the world are at calling pitches. When Yale professor Dr. Toby Moskowitz analyzed nearly a million major league pitches, he found that umpires correctly called a ball or strike about 88% of the time. That might not sound bad, but for close pitches--those within two inches of the plate--the success rate dropped to 68.7%. That's not good, especially considering that the umpires would have scored 50% if they were completely guessing.
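To put those percentages in raw counts, here is a minimal sketch; the accuracy rates are the cited figures, but the round million-pitch sample and treating each pitch as an independent binary call are my simplifying assumptions:

```python
# Rough scale of the miscalls implied by the cited accuracy figures.
# The accuracy rates come from the study as quoted above; the round
# sample size and independence of calls are simplifications.

total_pitches = 1_000_000    # the study analyzed "nearly a million" pitches
overall_accuracy = 0.88      # correct ball/strike calls overall
close_accuracy = 0.687       # correct calls on pitches within 2 inches of the plate
guessing_baseline = 0.50     # expected accuracy of a coin flip on a binary call

miscalls = total_pitches * (1 - overall_accuracy)
print(f"Implied miscalls: ~{miscalls:,.0f} of {total_pitches:,} pitches")  # ~120,000

edge = close_accuracy - guessing_baseline
print(f"Edge over pure guessing on close pitches: {edge:.1%}")  # 18.7%
```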

It gets worse. Not only do umpires regularly miss calls, but they are also susceptible to bias. According to research from Stanford University, umpires tend to adjust the strike zone to avoid sending runners to first base on a walk or to the dugout on a strikeout: their zones expand in three-ball counts and shrink in two-strike counts. Umpires are also reluctant to call two strikes in a row. And try as they might, umps can't quite shake home-field bias: statistics show that home pitchers benefit from slightly more generous strike zones.

So if umpires are judged for both accuracy and consistency, then they are failing on both fronts.

Yet letting machines become umpires doesn't sit right with old-school baseball fans. To them, baseball just wouldn't be baseball without umps to flamboyantly ring-'em-up after a called third (alleged) strike. The "human element"--i.e., blown calls and the saliva-swapping arguments that ensue--is baseball's patina.

They have a point. After all, bad umpires have driven the plot of some of baseball's most memorable dramas: think Jackie Robinson magically stealing home through the tag of Yogi Berra in Game 1 of the 1955 World Series, or George Brett flailing out of the dugout after Yankees manager Billy Martin complained about the pine tar on his bat and umpires negated his go-ahead home run. And who can forget Jim Joyce wiping tears away as he accepted a lineup card from Armando Galarraga a day after the umpire blew an easy call that cost Galarraga a perfect game?

But baseball should not let infatuation with its history slow its progress. In pitch-tracking technology, the MLB has a near-perfect solution to a century-old problem. If such things were possible, the NBA would leap at the chance to remove the subjectivity from foul-calling. The NFL would pay handsomely to get rid of its five-minute replay delays. For now, basketball and football are stuck with their officiating woes. Baseball is not. Automated pitch calling would give officiating the precision and objectivity of replay review without the snooze-inducing delays.

I'm not saying mechanical pitch-calling won't change the game. It would, and drastically. Pitchers and batters who master the strike zone would become more valuable as their skill is more consistently rewarded. Catchers, freed from the charade of framing pitches, would change their catching stances to hold runners more aggressively. Even the aesthetics would change. Without a home plate umpire crowding the action, baseball games would take on a more backyard feel. That mano a mano medieval joust that is the pitcher-batter face-off would look better unsupervised.

Alas, baseball purists have a love-hate relationship with umpires that they aren't quite ready to quit. As much as they pester umpires through their TV screens, fans have developed a vocabulary for shrugging off their mediocrity. When a player resists the urge to swing at an outside pitch and gets called out on strikes, the blame falls on him for not anticipating a blown call ("that was close enough to swing at with two strikes," "he's got to protect"). On a bad day, an umpire might still earn the praise of fans if he is "calling it both ways." That phrase sounds like a compliment. In fact, it is just a euphemism for, "well, at least both teams are getting equally ripped off."

"It's a part of the game," is my least favorite argument for tolerating bad umpiring. Before pitch-tracking technology, blown calls were tolerable--fans had no choice but to tolerate them. Now, they are an embarrassment.

I get it. After baseball replaces them, I will probably miss the human element that home plate umpires offer. (Likewise, after scientists cure the common cold, I may feel a certain nostalgia for the taste of cough syrup.) But the sport will fare better when its games hinge more on athletic feats and less on the foibles of its officials.
________________________________________________________________________________

Further Reading:

A summary of HBO's feature, Man vs. Machine, can be seen here. Apparently former umpire Jerry Crawford laughed off any suggestion that things needed to change. Umpires have gotten so good, he claimed, that "they're not missing any pitches."

Here is an article for those interested in the history of pitching rules.

For Another Perspective:

Derek Thompson, writing for The Atlantic, has argued that pitch trackers have made baseball worse by making it harder to score. The 2017 season--which set an all-time record for home runs--may have wrecked his theory.

Tuesday, October 3, 2017

Communists Yesterday, Racists Today: How America Deals with Ideological Pariahs

I'm puzzled by the way that Americans, liberal and conservative, treat their ideological pariahs. We, who claim devotion to the freedom of speech, nevertheless feel a moral obligation to shame, boycott, and ostracize those who hold and express unpopular beliefs.

I don't get it. I consider myself a patriotic American, but I haven't heard one good reason (just a lot of bad reasons) why I should adopt an attitude of scorn or disgust towards any NFL player--or Jehovah's Witness, for that matter--who doesn't stand during the National Anthem. The reasons for expressing--or not expressing--one's patriotism are highly personal. Colin Kaepernick might be an ungrateful punk for all I know, but I don't storm out of the chapel when a fellow churchgoer is noisy during a prayer, and I'm not going to boycott the NFL because some twenty-eight-year-old doofus is irreverent during a flag salute.

For all of our progress on racial and ethnic tolerance, ideological intolerance is a stubborn disease that survives by hiding under the guise of righteous causes.

Take, for example, the war against bigotry, today's most socially acceptable form of ideological intolerance. The national mood is such that one can't be too hard on racists. After the controversial Unite the Right rally in Charlottesville, Virginia, Governor Terry McAuliffe shooed away white supremacists with a phrase that sounded like it came straight from the mouth of a white supremacist: "There is no place for you here, there is no place for you in America." His motives may have been pure, but there was something unsettling about hearing a high public official use ideology to define Americanness.

Across the country, pockets of hysteria broke out as pictures of rally attendees went viral. When students at the University of Nevada-Reno discovered a racist among them, fretful students packed the Student Senate, demanding that the administration do something to protect them. Some complained that knowing there was a racist on campus made them fear for their lives. Some 32,000 people signed a petition demanding that the school expel the offender.

Ultimately, those pesky First and Fourteenth Amendments (and, hopefully, a bit of human decency) prevented UNR, as a public school, from yielding to the demands of its horror-struck student body.

The law isn't always so forgiving. Recently, otherwise talented individuals have lost their jobs when the public discovered their previously dormant bias. Take Fisher DeBerry, for example, who was the winningest football coach in Air Force Academy history before he said that his team needed to recruit more black players because "Afro American kids can run very well." He was publicly reprimanded and resigned soon after. Or former NPR news analyst Juan Williams, who was canned after confessing that he feels nervous when he sees passengers in Muslim garb on his plane. More recently, James Damore became a martyr of the alt-right when he suggested, as tactfully as one possibly could, that non-bias factors were contributing to Google's gender gap. Feminists decried Damore's memo (which was factually accurate) as sexist, and Google sent Damore packing.

Each of these victims has one thing in common: none was actually accused of acting in an intolerant way; each was convicted of bigotry for honestly and candidly expressing an unpopular opinion.

America has a long history of ostracizing ideological pariahs. If racism is the 21st century's cardinal sin, communism was the sin of the 20th, when socialists, communists, and radicals of all kinds were deemed un-American, disloyal, and a serious threat to America.

Persecution of radicals--or suspected radicals--became socially acceptable. In Chicago, for example, a sailor shot a man three times at a pageant when the man refused to stand for the national anthem. The crowd erupted in cheers. "It was believed by the authorities," the Washington Post reported, lest its readers shed any tears for the victim, "he had come here for the [Industrial Workers of the World] convention."

U.S. Attorney General A. Mitchell Palmer compiled a list of suspected radicals 450,000 names long, then set out to rid the country of its reds. Palmer's dragnet operation, which became known as the "Palmer Raids," involved the rounding up of tens of thousands of suspected radicals. Arrestees were often taken in Kafkaesque fashion, without being told the charges against them. Hundreds, without being convicted of a crime, were packed onto a ship dubbed the "Soviet Ark" and sent to Russia. Some arrestees became bait for other suspected radicals; those who visited political prisoners at the Seyms Street Jail in Hartford, Connecticut, were automatically arrested; they were, Palmer reasoned, essentially attending "revolutionary meetings." That, to Palmer, was proof enough of their guilt.

The public hysteria subsided, but it returned with a vengeance during the Cold War. Again, intense fear of communism fueled a nationwide inquisition. President Truman instituted a "Loyalty Program" beginning in 1947 to purge radicals from the federal workforce. Suspected radicals were brought (often on anonymous tips) before a "Loyalty Board," where they were probed about their belief in racial equality, association with labor unions, support for socialized medicine, sexual orientation, and anything else that whiffed of communism.

Though few employees were actually terminated, the mere threat of being dissected by a Loyalty Board had a repressive effect. In 1950 the American Scholar published the results of a survey which found:
The atmosphere in government is one of fear--fear of ideas and of irresponsible and unknown informers. Government employees are afraid to attend meetings of politically minded groups; they are afraid to read "liberal" publications; they screen their friends carefully for "left-wing" ideas. . . . Nobody wants to go through a "loyalty" investigation.
The fear of being pinned as a communist or even a "fellow traveler" spread far beyond the federal government. In Madison, Wisconsin, a journalist asked 112 people to sign a petition that contained nothing but lines from the Declaration of Independence and the Bill of Rights. All but one person refused, for fear of being caught affixing their names to a subversive document.

Those who spoke out against red-baiting risked being smeared as communists themselves. Drew Pearson, a prominent journalist, spoke out against Senator McCarthy's communist witch hunt. When McCarthy met him in a nightclub, the senator grabbed Pearson by the neck and kneed him in the groin before Richard Nixon pulled the two apart. Soon after their brawl, McCarthy called Pearson a "sugar-coated voice of Russia" from the floor of the Senate and called for "every loyal American" to boycott Pearson's radio sponsor, the Adam Hat Company, unless it revoked its sponsorship. The Adam Hat Company soon obliged.

The most shameful incidents of ideological persecution to come out of the Red Scare involve the thousands of ordinary citizens, far removed from nuclear secrets, who nevertheless found their livelihoods jeopardized by anti-communist witch hunts. Anne Hale's case is one such story. Ms. Hale was a member of the Communist Party during World War II (when the United States and Soviet Union were allies). After the war, she cut her ties to the Communist Party and became a second-grade schoolteacher in Wayland, Massachusetts.

Hale's past came to light in 1948 when an FBI informant caught her buying a Pete Seeger songbook. Agents, who did not think that Hale posed a security threat, told the Wayland School Committee about Hale's past. Hale assured the school committee that she was no longer a communist and that she was a loyal American, but the committee unanimously voted to suspend her without pay.

Hale's case would not have been unusual if she, like thousands of other accused communists, had simply resigned. Instead, she braved the public humiliation of a hearing. "I think it will do less harm to the children," she reasoned, "to see me standing up for what I believe to be true, than to see me run away." Hale admitted during the hearing that she had been a communist, but insisted that she did not believe in the overthrow of the government by force and violence and that she "believed in government by the majority, in the Bill of Rights, and in the protection of the rights of minorities."

Few dared vouch for her. Even her lawyer quit just days before the hearing. Hale's reverend, to his credit, told the press that he would give Hale a character reference. He paid a dear price. Attendance at his parish plummeted; one Sunday, just three parishioners showed up.

After eight grueling days of hearings, the school committee voted to dismiss her. Hale asked that she be allowed to say goodbye to her class. The committee refused. Instead, Hale sent each one of her students a farewell. "Just remember these things," she wrote, "which I am sure you know--I love my country and I love you."

The communist purges of the 20th century and today's purge of racists have at least one thing in common: they are driven by a fear that is grossly disproportionate to the threat posed. That is not to say that there are no hate-mongering racists among us. Just as there were communists during the Cold War who were intent on handing nuclear secrets to the Soviets and who, given the chance, would have joined a violent overthrow of the government, there are firebrand bigots today who would bring back Jim Crow.

But, as was the case during the Red Scares, the war against bigotry is being waged not just against those who act to harm others, but also against those who merely lean to the forbidden side of an ideological spectrum. Such ideological purges have no logical end, because self-righteous persecutors will always find another bad apple to drive from polite society. And thus both ideological scares will probably suffer the same fate; they will fizzle out after enough James Damores and Anne Hales make people realize that the quest for national purity was madness all along.

This is not to say that we should have a blasé attitude toward hate and prejudice, only that the way to suppress an idiotic ideology is more evangelical and less persecutory: a strategy of conversion rather than coercion; an acknowledgment that preserving America as a laboratory of ideas means tolerating those who hold opinions that we loathe.

My fellow Mainers still take pride in the heroic words of Maine Senator Margaret Chase Smith, who bravely rebuked Joe McCarthy and her fellow Republicans from the Senate floor:
Those of us who shout the loudest about Americanism in making character assassination are all too frequently those who, by their own words and acts, ignore some of the basic principles of Americanism: the right to criticize; the right to hold unpopular beliefs; the right to protest; the right of independent thought.
The exercise of these rights should not cost one single American his reputation or his right to a livelihood merely because he happens to know someone who holds unpopular beliefs. Who of us doesn't? Otherwise none of us could call our souls our own. Otherwise thought control would have set in.
America's mission in the 20th century should have been to expose the foolishness of communism and let the radicals exorcise themselves, not to exile the communists from every nook of society. The same may be said of the war against racism. One can only hope that we are faster learners this time around.

__________________________________________________

Further reading:

Much of my material about the Red Scares comes from Haynes Johnson's thoughtful book, The Age of Anxiety: McCarthyism to Terrorism.

You can read the full story of Anne Hale in the Boston Globe.

The Brookings Institution recently published the results of a poll of undergraduate students on freedom of speech issues. The results were disturbing. Among other findings, almost twenty percent of respondents thought that it was acceptable to act--including resorting to violence--to suppress expression that they consider offensive.


Thursday, August 17, 2017

The Golden Age of Misinformation

Shortly after taking office, Donald Trump decried the media as the "enemy of the people." That was, to put it mildly, a poor choice of words. "Enemy of the people" has had a grim connotation ever since violent revolutionaries and dictators started using it as a justification for hustling political opponents to the gulags and guillotines.

Still, there is no doubt that "the people" and "the media" are going through a bit of a rough patch. Only 32% of Americans--and a meager 16% of Republicans--trust the mass media to fully, accurately, and fairly report the news. That is down from a high of 72% in 1976, when journalists were still riding a wave of popularity after their work on the Vietnam War and Watergate scandal.

The unpopularity of the mass media helps to explain why Americans have not universally condemned the scuffles between their elected representatives and the press. When Trump insults the media, his supporters applaud. Constituents of Montana's Greg Gianforte didn't seem to mind that he body-slammed a reporter in May; they elected him to Congress the following day.

Americans have long had a strained relationship with the media. On the one hand, we have enshrined freedom of the press with protection under our nation's highest law. On the other hand, newspapers have always been America's least favorite troublemakers. Teddy Roosevelt famously coined the term "muckraker" to describe pessimistic investigative journalists. Richard Nixon gave the chairman of his Joint Chiefs of Staff a very Trump-esque warning: "The press is your enemy. Enemies. Understand that? . . . Because they're trying to stick the knife right in our groin." Lyndon Johnson, upset at Vietnam War coverage, privately complained that the newspapers were being run by "a bunch of commies."

In the 1800s, hostility to the press was even more widespread--and more violent. Abolitionist printers like Maine's Elijah Parish Lovejoy were frequent targets. Lovejoy had three printing presses destroyed in St. Louis before he relocated to Illinois. He fared worse there: a mob killed him in 1837 and, for good measure, threw his press into the Mississippi River. Two years earlier, another prominent abolitionist printer, William Lloyd Garrison, narrowly escaped death-by-riot when the mayor of Boston threw him in jail (on a charge of rioting).

But abolitionists weren't the only victims of anti-press sentiment. Joseph Smith and his Mormon followers found themselves on both the giving and receiving end of press hostility. When Mormon settlers established a newspaper in Jackson County, Missouri, an angry mob rode into town and demanded that the Mormons close it down. When they refused, the mob scattered the paper's type into the streets and forced over a thousand Mormons from their homes.

Meanwhile, anti-Mormon newspapers seemed to follow the Mormons--and fuel persecution--wherever they settled. In Illinois, Smith's frustrations finally got the better of him. When the Nauvoo Expositor, an especially hostile anti-Mormon paper, set up shop in the Mormons' home city, the city council resolved that it was inciting persecution and therefore a public nuisance. Smith, as city mayor, ordered the paper's destruction.

Predictably, the move backfired. The anti-Mormon Warsaw Signal, as if to prove that it, in fact, was a threat to public safety, wrote: "War and extermination is inevitable! Citizens ARISE, ONE and ALL!!! . . . We have no time for comment, every man will make his own. LET IT BE MADE WITH POWDER AND BALL!!!"

Anti-Mormon vigilantes answered the call to arms less than two weeks later. On June 27, 1844, an armed mob stormed Carthage Jail, killing Smith and his brother. Within two years, Mormons were on the move to the Salt Lake Valley. But anti-Mormon newspapers found them there. When Brigham Young, successor to Joseph Smith, died in August 1877, The Salt Lake Tribune couldn't resist a parting shot. "[W]e believe," it wrote, "that the most graceful act of his life was his death." With no one to lead Young's "defrauded followers," the disintegration of "the whole decaying structure" of the Mormon church was inevitable.

Though their methods could be extreme, the early 19th-century public had legitimate reasons to hate their newspapers. By today's standards, the quality of reporting at that time was downright horrendous. Newspaper publishers had neither the time nor the resources to do the investigation and fact-checking that today we call "journalism." They were first and foremost craftsmen. Their days were consumed by the tedious and dirty job of running a small printing press: setting their type letter by letter, spreading ink, hanging sheets of paper to dry, finding supplies of ink and paper.

They were passive news gatherers and generally had no qualms about publishing whatever rumor would sell. They ran local enterprises without the benefit of full-time reporters or correspondents, yet their readers wanted news on far-flung events from the Lewis and Clark Expedition in the far west to the French Revolution across the Atlantic. Printers thus had little choice but to publish whatever news they could pull from other newspapers or gossip from those who dropped by the print shop to chat.

Not surprisingly, newspapers were poor sources of information. Even Thomas Jefferson, who once said that he would rather have newspapers without government than government without newspapers, lost his patience with inaccurate reporting:
Nothing can now be believed which is seen in a newspaper.  Truth itself becomes suspicious by being put into that polluted vehicle . . . . I will add, that the man who never looks into a newspaper is better informed than he who reads them; inasmuch as he who knows nothing is nearer to truth than he whose mind is filled with falsehoods & errors . . . . Perhaps an editor might begin a reformation in some such way as this. Divide his paper into 4 chapters, heading the 1st, Truths. 2d, Probabilities. 3d, Possibilities. 4th Lies.  The first chapter would be very short.
Early American newspapers were not only inaccurate, they were also unapologetically biased, and their writing style did little to conceal it. Before Joseph Smith even had a chance to publish The Book of Mormon, a local paper picked up the story and dismissed the "golden bible" as "the greatest piece of superstition that has ever come within our knowledge." When Smith organized an 1834 expedition to return Mormon settlers from their Jackson County exile, newspapers as far east as Hallowell, Maine, sensationalized it as a revolutionary crusade. Smith, one paper wrote, was leading an "invasion force" and vowing to "cast out the infidels" of Missouri.

Much of the bias can be explained by following the flow of money. Conflicts of interest were not just common in the 19th century, they were part of the business model. Printers helped pet politicians win elections by giving them favorable coverage and, when elected, the politicians rewarded their pet printers with lucrative government contracts. This mutual backscratching ensured that reports of political events were comically skewed. In her book, Team of Rivals, Doris Kearns Goodwin described the absurd newspaper coverage of the debates between Abraham Lincoln and Stephen Douglas:
At the end of the first debate, the Republican Chicago Press and Tribune reported that "when Mr. Lincoln walked down from the platform, he was seized by the multitude and borne off on their shoulders, in the center of a crowd of five thousand shouting Republicans, with a band of music in front." Observing the same occasion, the Democratic Chicago Times claimed that when it was over, Douglas's "excoriation of Lincoln" had been so successful and "so severe, that the republicans hung their heads in shame."
The quality of journalism in the United States has vastly improved since those humble origins. New technology, especially the telegraph, drove the modernization. The Associated Press used the telegraph to pioneer a new business model. Instead of distributing a publication of its own, the AP sent reporters to news hot spots, then sold reports to various newspapers, thus giving locally-based newspapers easy access to firsthand accounts of distant events. The AP's model lent itself to a more modern style of news writing; to ensure their articles appealed to papers of all political stripes, their correspondents wrote in a "just the facts" style that became standard. The AP showed that impartiality could be profitable.

In time, journalism became a profession. Accuracy and independence became the newspapers' stock-in-trade. They distanced themselves from politicians, and built firewalls between their business and editorial branches to avoid conflicts of interest. Though they never became apolitical--newspapers often endorse political candidates to this day--modern papers are nothing like the tools of propaganda that they were in the 19th century.

That is not to say that today's media doesn't deserve criticism. The stress of new competition brought about by the proliferation of new media has led to a relaxation of the old guard's self-imposed ethical standards. Revenue-starved outlets have essentially lent their letterhead to "advertorials" which deliberately obscure the line between news and sales pitch. Meanwhile, cable news outlets have led a regression toward the hyper-partisan newspapers of old. The line between reporting and editorializing is blurring. Web pages can make fake news, once clearly identified by its place in checkout-aisle tabloids, look like newspapers of record.

Still, Americans are far better equipped to combat biased and inaccurate reporting. A generation that can research any topic and check any fact with a few thumb-taps has no excuse for being misinformed. Indeed, if Americans are misinformed, it is because we are misusing the tools at our disposal. The web browser that could lend itself to depths of understanding and appreciation of nuance is too often used for selective exposure, headline grazing, and a superficial familiarity with current affairs.

As much as we complain about the media, our power to discern food from poison has been diminished by a generation of spoon-feeding from the relatively good news sources of the late 20th century. In the short term, fake news and partisan reporting will find fertile ground where confirmation bias is high and media literacy is wanting. False claims from birthers, anti-vaxxers, and their kin will gain too much traction. Reliable news from outlets that maintain high levels of journalistic integrity will receive too little credit.

We can only hope that every fraud will make us savvier, and that, in time, Americans will learn who to trust.

Further Reading:

Salon has a quick article on how to distinguish fake from real news.

Jacob Soll, a U.S.C. historian, wrote an article for Politico on the long history of fake news.

Journalist/historian Christopher Daly wrote a very interesting book on the history of journalism in America.

Thursday, March 9, 2017

Why Trump's Immigration Orders Haven't Gone Far Enough (and Why He Should Ban British People)

Not long after I learned to read, I decided to become a professional writer. That decision, as so often happens with such things, gradually devolved: first into a goal, then into an aspiration, then into a hope, then into a remote fantasy that may or may not revive itself in the form of a mid-life crisis.

I only have two excuses for not chasing my dream: (1) I'm not that good, and (2) I know it. As sure as I know that I'll never dunk a basketball or hit a home run, I know that I'll never be as eloquent or concise as George Will and I'll never have David Brooks's eye for insights. 


Yet, I recently had a revelation when I came across Matt Walsh, a conservative blogger who gives hope to mediocre writers everywhere. 


Matt Walsh had a short and unsuccessful stint writing for the Huffington Post, where he did a poor Andy Rooney impersonation, airing curmudgeonly grievances about rude customers and participation trophies. It would take an above-average wit to enliven such clichéd topics--a wit that Walsh unfortunately lacks. If you've seen any of Walsh's work, you know that he has the writing style of an internet troll and the insight of a drunk uncle. Not surprisingly, Walsh's time at the Huffington Post didn't last long.


And yet, he is making it. Walsh started his own blog, where he had an epiphany: he didn't need to write like a Pulitzer Prize winner to have success--he just needed to give people the thinly-reasoned demagoguery that they want. Walsh ditched the droll slice-of-life topics in favor of controversial, attention-grabbing headlines like "There's No Way I'll Send My Kids to Public School to Be Brainwashed By the LGBT Lobby" and "While You Were Crying Over a Dead Ape, 125 Thousand Babies Were Murdered." Every post was an explanation of how the apocalypse was coming in the form of sexual deviants and social liberals.

The blog went viral. Irate Christian conservatives couldn't get enough of Walsh's liberal-bashing diatribes. Reputable sponsors--like the company pitching "A Weird Trick to Make Women Obsess Over You"--flocked to the blog. Glenn Beck even picked Walsh up for TheBlaze.com.

Walsh has become a real-life Rumpelstiltskin, spinning non sequitur and hyperbole into gold. He is to literature what The Black Eyed Peas are to music, proving that when it comes to getting noticed, talent is overrated. 

So, in the hopes of achieving my dream of becoming a professional writer notwithstanding my mediocrity, I've decided to stop aspiring to write well-reasoned and articulate posts. Instead, I'm going to give the angry public the insult-laden red meat that they obviously crave.


So here it is. Following the Walsh Elements of Style for Mediocre Writers*, here is my post on why Donald Trump's travel ban didn't go far enough.



Why Trump's Immigration Order Hasn't Gone Far Enough (and Why He Should Ban British People)

Thanks a lot, liberals. Thanks to you and your lying, hysterical ways, people will die. I hope that you're all happy.

Allow me to explain. A few weeks ago, President Trump tried to DO HIS JOB by protecting Americans from the hordes of radical Muslim extremists who are coming here to blow us up. 

And how did you all thank him? By doing what liberals do best: hyperventilating. And while you were hyperventilating, you managed to do the other thing that liberals do best: lying, this time by calling the executive order a "Muslim ban."

As usual, the mainstream media poured gasoline on the leftists' burning hair-dos by airing bleeding-heart liberal propaganda: images of dejected and confused refugees who had cleared a long, rigorous vetting process only to be turned away at the last moment. Had the media been reporting the truth, we would have seen airport terminals full of bloodthirsty terrorists who, upon learning about Trump's executive order, threw their machetes to the floor and shouted, "damn those tricky Americans, we almost had 'em!" 


Here's a headline that you won't see on the New York Times (because it's true): Trump's Executive Order Was Not a Muslim Ban. Just read the order itself. It isn't that complicated. A third grader could have written it. Trump simply banned everyone, Muslim or not, from seven Muslim-majority countries (with an exception for people who are not Muslim). Now how could that be a Muslim ban?


But alas, the pro-immigration (i.e., pro-death) lobby tattled to the federal courts and Trump had to go rewrite the thing. So now Middle-Eastern Christians are going to be treated just like everybody else. What's worse, the administration caved and decided to exclude Iraqis from the ban.

And as a result, people are going to die.

Unless you are either an idiot or a deluded leftist radical, you will acknowledge that America would be safer if we let Trump keep immigrants out of the United States entirely.

Think of it this way. Most immigrants are harmless. But a few--maybe one in a million--will kill someone. In other words, the government is sentencing one random American to death for every million immigrants that it lets in. Obama, who let in eight million legal immigrants during his presidency, basically offered one American per year as a human sacrifice to appease the leftist god of tantrums.


Sure, since 9/11, no terrorist from any of the seven Muslim-majority countries has succeeded in killing an American on our home soil. But that doesn't mean that Iraqis are safe. It just means we've pushed our luck long enough.

Besides, when we talk about the dangers posed by immigration, terrorism just scratches the surface. There are plenty of other ways that an immigrant could kill or seriously hurt you. Are you, like millions of Americans, suffering from high cholesterol? You probably have some foreigner's irresistible cuisine to thank (the Chinese have been slowly poisoning us with General Tso's chicken for decades). Do you like having your teeth drilled? Didn't think so. Well don't forget that some of those innocent-looking Syrian immigrants are actually dentists, hell-bent on coming to America to do exactly that. 


Maybe you do like dying of eggroll-induced heart disease or having your teeth drilled, but the rest of us were perfectly happy with the original travel ban. In fact, the only rational objection that a sane person could have to Trump's immigration order is that it didn't go far enough. It probably should have included Saudi Arabia, a hotbed for terrorism, Colombia, a hotbed for drugs, and France, a hotbed for French people.

It also should have included the United Kingdom, India, South Africa, Indonesia and Japan, hotbeds for dangerous drivers.

Of the million ways that an immigrant could kill you, vehicular homicide probably tops the list. Since immigrants make up about 13.3% of the U.S. population, they must account for about 4,278 car fatalities every year. If that mortality rate continues, you and every other native-born American will eventually be extinct.


In fact, immigrants probably account for more than their fair share of traffic deaths because they can be downright horrific drivers. Take British people, for instance. Brits--along with Indians, South Africans, Japanese, and Indonesians--drive on the wrong side of the road (moral anarchists might refer to it as the "left" side of the road). That is all well and good so long as they are all driving on the wrong side of the road together in their home country, but when they come to America and drive on the wrong side of the road, carnage ensues. Brits are killing Americans all the time this way.

I'm not calling for a permanent ban on British immigration, just a temporary hiatus so that immigration officials can develop a vetting process that will give a 100% guarantee that no British immigrant will kill me by driving on the wrong side of the road. Until then, letting Brits into America is like throwing American children into speeding traffic. It's the same exact thing.

If you are a Christian, you've probably heard some godless heathen tyrant maniac (a.k.a. a Democrat) tell you that your support of Trump's immigration policies is a “betrayal of your faith”--as if you have some sort of moral obligation to ease the suffering of others. Apparently you aren't a Christian unless you're willing to roll out the red carpet and hand machetes to Islamic State fighters as they parade off the docks into New York City.

Lest you feel the slightest pang of guilt, allow me to misconstrue a Bible passage to legitimize your callous attitude toward foreigners. When the Bible tells us to love our "neighbors," it refers to the people who live in our neighborhood. Last time I checked a map, neither Iraq nor Syria had a U.S. zip code; they're clear on the other side of Jihadistan. So say a prayer or two, but don't lose any sleep over images of drowned children. According to the unassailable word of God, they're not your problem.


Besides, it isn't a government's job to help foreign refugees. It's a government's job to (1) protect its citizens, and (2) have immigration policies that will prevent its citizens from helping foreigners, thus ensuring that they will direct their energies on more pressing issues, like keeping transgenders from strolling around women's restrooms with their penises out.


Just look at the Canadian government, which is so intent on importing could-be terrorists that it allows groups of private citizens to sponsor refugee families and take an active part in their assimilation. In 2016 Canadians, including many Christian groups, resettled about 18,000 Syrian refugees through the program. Canadian Christians were so distracted that they didn't even see that the apocalypse had arrived in the form of a 52-year-old "transager" who is currently ripping apart the very fabric of their society. With everyone so focused on helping the helpless, there was no one to call attention to that deviant's attention-seeking fetish.


As much as leftists like to pretend otherwise, Trump won the election. He won. So liberals should shut up and let the president do whatever he wants. That's how the electoral college works. Let's all turn our attention to the important issues of the day. Like Colin Kaepernick. Or who those spineless cowards running the Boy Scouts of America are letting into their club. Or whatever garbage those effeminate millennial progressives on MTV are talking about. 

Whether Donald Trump is dishonoring the American tradition of serving as a refuge for people of all faiths is small potatoes.
__________________________________________________________________________________________

*The Walsh Elements of Style for Mediocre Writers comprises six simple rules for success:

1. Know your audience. Angry people. That's a mediocre writer's ideal audience. The angrier the better. 


Ideally, a mediocre writer will target an audience that is really angry but doesn't know why. It's a mediocre writer's job to tell them why they are angry. In that respect, Matt Walsh has the recipe for success. He features a steady stream of posts about abortion, transgenderism, promiscuous Millennials, and Beyonce's babies. Topics like that are a perfect way to get those indignant juices flowing.

2. Know your enemy. As a rule, mediocre writers lack the persuasive ability to engage with normal, reasonable people. So don't try. The best mediocre writers have a gift for finding the most abnormal, irrational fringes of society--incest activists, for example--and writing as if they represent the popular mainstream.

A mediocre writer who is having trouble finding a straw man can simply make up a foil that doesn't actually exist, like feminists who want to murder children or scout leaders who give out merit badges for sexual experimentation.


Remember, mediocre writers won't draw a crowd by illuminating the nuances of controversial topics. They should make their enemies as one-dimensional as possible so as to tickle their readers' angry bones most effectively.


3. Throw logic out the window.  

The beauty of being a mediocre writer is that mediocre writers aren't constrained by traditional rules of logic. Angry people just don't expect the kind of justified inferences and careful reasoning that characterize more polished work.

The key is to make non sequitur sound logical. Take this example from Matt Walsh's blog: according to Walsh, allowing people to vote without paying taxes is a form of "taxation without representation." See what he did there? (Hint: representation without taxation and taxation without representation are not at all the same thing.) 

Once a mediocre writer breaks the shackles of reason, he will be amazed at the creative arguments that he can make.

4. Use insults.  Insults serve as a suitable substitute for logic as a centerpiece of a mediocre writer's work. If you were bullied in junior high, a blog can be a great place to vent all the rage that you've been suppressing. Matt Walsh, by all appearances, was picked on by the gay, feminist, and liberal bullies at his school, and now uses his blog to compile all of the feisty comebacks that would have earned him a swirly in 2001.

5. Talk about Jesus a lot. A mediocre writer can effectively camouflage questionable logic by claiming to have the savior of mankind on his side.

Also, picking on liberals can feel a little unwholesome after a while, even for angry people. So the mediocre writer should throw in enough Jesus-talk for the audience to feel that its indignation is of the righteous variety.

6. Use hyperbole. Mediocre writers use this a lot. So use it until your readers' heads explode.

Friday, February 3, 2017

You Just Don't Get It, Do You?

Some Democrats still don't get it. Senate Minority Leader Chuck Schumer commenced his party's momentous battle against President Trump over his coveted Supreme Court appointment with an opening volley that was downright pitiful. Neil Gorsuch, he complained, was not in the "legal mainstream." Let that sink in. Just a few months after his party suffered a humiliating defeat to the least "mainstream" presidential candidate in United States history, Schumer proved that he remains totally unaware that, outside Washington, "not mainstream" is usually a compliment, not an insult. People who refer to the "mainstream media," for example, are generally not praising the virtues of the New York Times. Yet Schumer designated Gorsuch's lack of "mainstream" credentials as the Democratic opposition's chosen point of departure.

But Schumer didn't just show his inability to read the vital signs of the American electorate; he also showed how little he knows about what the Senate should look for in a Supreme Court Justice. "Mainstream" is not something that any good judge aspires to be. Good judges strive to write opinions that are fair, impartial, objective, unprejudiced, and true to the law, even when those opinions fall outside the mainstream. They come from a profession whose heroes' greatest moments were decidedly anti-mainstream: Atticus Finch defending Tom Robinson and John Adams defending British soldiers.

Supreme Court Justices don't have a need to be "mainstream," either. Unlike legislators, who keep their jobs only so long as they can stay in the good graces of their constituents, Supreme Court Justices have no constituents; they are appointed, not elected. Their job is to say what the law is, however inconvenient or unpopular that might be. And if the people don't like it, then too bad, because there isn't a darn thing that they can do about it.

Justices who don't answer to mainstream popular opinion--that seems so undemocratic.

It is. That's the whole point.

As Alexander Hamilton explained in the Federalist Papers, Supreme Court Justices were not meant to respond to the "ill humors of society"; they were to stand apart from them and keep the people from trampling the separations and safeguards that were so carefully arranged in the Constitution. Good justices are bound to be unpopular. After all, they stand in the people's way when the people have an impulse to create a law or elect a president that would violate an unpopular minority's constitutional rights. They are like sober friends at a raucous party: everyone might roll their eyes, but the next morning they will be grateful that someone stopped them from starting a bonfire in the living room.

To that end, the Constitution almost totally insulates the Supreme Court from the political pressures that would otherwise push its justices into an unconstitutional mainstream. While legislators campaign, justices fulfill their constitutional duties in the quiet of libraries.

If the true test of a good justice is his ability to fairly and impartially interpret the law as written, then Neil Gorsuch is a fine choice. Conservatives are not the only ones praising the strength of his judicial character. Neal Katyal, an acting solicitor general in the Obama administration, praised Gorsuch as "someone who will stand up for the rule of law and say no to a president or Congress that strays beyond the Constitution and laws." If a justice with that kind of universally recognized integrity is not in the legal mainstream, then there is something wrong with the legal mainstream, not Neil Gorsuch.

Democrats should join in confirming Gorsuch's nomination. They will only hurt themselves if they don't. Unless they find better ammunition than Schumer could muster, opposing such a nominee will expose their own disaffection with a Constitution that doesn't say what they want it to say.

In the long run, Democrats will be better off if they put their support for judicial activism behind them. This will mean, of course, that the Supreme Court will no longer serve as their back channel for advancing (without enacting) socially progressive laws. Remedying every perceived social injustice or inequality won't be quite as easy (it is, after all, easier to get five votes on the Supreme Court than thousands of votes across fifty state legislatures), but their hard work will be rewarded. Their victories will be duly-enacted laws that bear a surer stamp of legitimacy than the Supreme Court could ever bestow.

Democracy will be the better for it as well. The undemocratic character of the Supreme Court only fits within a republic so long as the Court confines itself to the narrow task of legal interpretation. Alexander Hamilton wrote that the judiciary should be "beyond comparison the weakest" of the three branches of government. The Court cannot get into the business of lawmaking without also intruding on the voters' ability to decide controversies by more democratic means. 

Since the mid-twentieth century, the Supreme Court has been committing the cardinal sin of legislating from the bench. Decisions on abortion and so-called "fundamental rights," while praised by progressives, actually restrained Americans from bringing their arguments to democratic forums. As Justice Scalia wrote in his dissent in Obergefell (the decision that forced states to recognize same-sex marriages):
Until the courts put a stop to it, public debate over same-sex marriage displayed American democracy at its best. Individuals on both sides of the issue passionately, but respectfully, attempted to persuade their fellow citizens to accept their views. Americans considered the arguments and put the question to a vote. The electorates of 11 States, either directly or through their representatives, chose to expand the traditional definition of marriage. Many more decided not to. Win or lose, advocates for both sides continued pressing their cases, secure in the knowledge that an electoral loss can be negated by a later electoral win. That is exactly how our system of government is supposed to work.
The Supreme Court has allowed progressives to get away with changing laws without doing the hard work of changing minds. Social conservatives would be far more willing to accept defeat if defeat came in the arena of state legislatures and state-wide referendums. When defeat comes at the hands of five "mainstream" Ivy League law graduates, there is bound to be a backlash. 

The 2016 presidential election may have been that backlash. In some ways, the generalized anger that voters felt at being the victims of elitism was imagined. In others, it was not. Judicial activism is the most egregious form of elitism that Americans are forced to tolerate. Former Justice Brennan probably best encapsulated the arrogance that had crept into the Supreme Court when he used to tell his clerks that "with five votes, you can do anything around here."

He was right in one way and wrong in another. With five votes, "mainstream" justices can impose upon Americans an edict that is wholly undemocratic and only "mainstream" in the opinion of five robed intellectuals. But they can't make Americans accept such an edict as their own. Americans are right to revolt against Justice Brennan's brand of presumptuousness when it deprives them of their right and ability to vote on issues that they passionately care about.

The Supreme Court has a long way to go toward rectifying the damage done by judicial activism. Neil Gorsuch's appointment is a step in the right direction.

Sunday, January 29, 2017

Trump's Inauguration of a New World Order

Changes in the international system are difficult to track. National politics is easy.  If you want to know who is in power and what is changing, you need only look at election outcomes, legislative acts, and Supreme Court decisions. Not so with international relations. The health and the stability of the world order largely hinges on intangibles: the extent to which governments respect the legitimacy of treaties, how strictly they follow established norms of behavior that restrain national aggression, and the like. In many ways, gauging the stability of the world order resembles tracking climate change. Single events may reflect a long-term pattern, but they could just as easily be random outliers that distort longer patterns of change.

That being said, Donald Trump's first inaugural address was exceptional. It was a loud and unambiguous signal that the world order, for better or for worse, is experiencing a transformation that is more than just a passing storm. Trump heralded an "America First" ideology that will have the United States relinquish its place as the chief manager of the world order. Though couched in a commitment to make the interests of U.S. citizens his chief concern, it was nonetheless a rebuke to anyone who thought that the United States would be a stabilizing force or a levee against the rising tide of Russian aggression.

When an inauguration transfers power from one political party to another, there is an inevitable tendency to focus on the difference between the incoming and outgoing administrations. 2017 was no exception. After all, a black, eloquent, painfully deliberate former president standing beside a white, inarticulate, impulsive newcomer made for a striking contrast. But there was a more informative comparison on display: that between Trump and his Republican predecessor, George W. Bush. When it comes to their views on the United States' place in the international system, Bush, not Obama, represents Trump's polar opposite. Whereas Bush pushed benevolent hegemony to its limits and beyond, Trump has previewed a foreign policy characterized by unwavering national self-interest.

Let's revisit George W. Bush's second inaugural address, which he delivered in 2005. Speaking shortly before Iraq's first parliamentary election after the 2003 invasion, Bush unapologetically declared America's duty to advance freedom and liberty abroad. He also spoke of the unity of American and foreign interests: "We are led, by events and common sense, to one conclusion: The survival of liberty in our land increasingly depends on the success of liberty in other lands. The best hope for peace in our world is the expansion of freedom in all the world. America's vital interests and our deepest beliefs are now one." Ultimately, though, the animating force behind America's foreign policy was not self-interest but an evangelical belief in, and duty to, the cause of liberty itself. Americans, themselves graced with freedom, had a moral imperative to spread liberty abroad: "From the viewpoint of centuries," he warned, "the questions that come to us are narrowed and few. Did our generation advance the cause of freedom? And did our character bring credit to that cause?" In keeping with this commitment to fostering freedom, the United States would befriend governments that protected their citizens' liberty and oppose those that did not.

Whereas Bush reminded Americans of their privileged status and appealed to a higher sense of duty, Trump pointed to their unjust deprivation and appealed to a sense of indignation. Americans had suffered "carnage": economic stagnation, crime, violence, and, most significantly, exploitation at the hands of foreigners.
For many decades, we've enriched foreign industry at the expense of American industry, subsidized the armies of other countries while allowing for the very sad depletion of our military. We defended other nation’s borders while refusing to defend our own. And spent trillions and trillions of dollars overseas while America's infrastructure has fallen into disrepair and decay. We've made other countries rich while the wealth, strength, and confidence of our country has dissipated over the horizon. One by one, the factories shuttered and left our shores with not even a thought about the millions and millions of American workers that were left behind. The wealth of our middle class has been ripped from their homes and then redistributed all across the world.
To Bush, America is blessed with a wealth of freedom, and the logical extension of that blessing is an obligation to spread that wealth to the world. To Trump, America has been victimized, and the logical extension of that victimization is that it should seek its just deserts.

It is no surprise, then--given how differently their ledgers calculate America's debt to the world--that the two disagree on how America should behave in the international system. Bush saw America as the world's selfless hero. Trump, to the contrary, vowed that the United States under his leadership would make no sacrifice for the welfare of those outside its borders. "From this day forward," he declared, "it's going to be only America first. . . . Every decision on trade, on taxes, on immigration, on foreign affairs will be made to benefit American workers and American families." Accordingly, Trump did not subscribe to Bush's sentiment that America should ally itself with governments that respect democratic values. Under his "America First" foreign policy, self-interest alone would dictate the country's alliances: "We will seek friendship and goodwill with the nations of the world, but we do so with the understanding that it is the right of all nations to put their own interests first."

"America First" foreign policy does not aspire to uphold or advance any higher virtues. Trump's speech left no room for American altruism, only blatant, unflinching self-interest.

Whether the international system will change for the better under Trump's "America First" foreign policy is a fair topic of debate. On the one hand, Trump makes some valid points. America has shouldered a disproportionately heavy burden for policing the free world. And Trump's central proposition--that a government should protect its citizens' interests--seems reasonable on its face. Why should America spend blood and resources on freedom crusades in the Middle East rather than respect the sovereignty of other nations and mind its own business? Why check Russian aggression in Europe when that aggression does not threaten American borders?

Then again, Trump's foreign policy could spell trouble for both the United States and the international system. For the United States, being at the helm of the free world has had its perks. Although it has not always exercised its influence prudently, the United States has enjoyed tremendous influence over issues of global concern. But that influence will surely suffer. After all, a country vowing to seek its own interests is also a country disavowing any principled loyalty to its friends. The United States should expect a reciprocal erosion of trust and loyalty.

Things could also turn sour for the international system, especially if other nations follow Trump's lead and turn inward. The relative stability enjoyed by the western world since the end of the Cold War may be more fragile than most realize. So long as the United States was willing to flex its economic or military muscles, nations generally had too strong a disincentive to act aggressively beyond their own borders. Saddam Hussein learned this lesson the hard way after he ordered his military to invade Kuwait in 1990, gambling that the United States would stand by and watch. That proved to be a grave miscalculation. In 1999, Pakistan might have started a nuclear war with India had the United States not intervened and forced Pakistan's Prime Minister, Nawaz Sharif, to accept an embarrassing withdrawal. Sharif's decision was an unpopular one that led to his ouster in a coup and his exile, but it was an easier course than the one Hussein had chosen.

Trump's "America First" foreign policy could be a step toward international anarchy, a world in which national aggressors can flaunt international norms without fear of reprisal. The calculations of would-be aggressors will look far different under an America First foreign policy. History offers lessons in what can happen when world powers stop making it their business to maintain world order. As the western world learned during World War II, expressions of outrage alone do not deter acts of national aggression. Allies cannot collectively deter bad actors only when it is in their individual best interests to do so. Indeed, the very idea behind alliances is that allies commit themselves to one another's defense whether or not doing so is in their individual best interests. If the United States withdraws from the business of ensuring the integrity of the world order, Russia will not be the only nation to see how much aggression others will tolerate. 

One could defend Trump's foreign policy by arguing that, as a matter of principle, America should avoid conflicts abroad. But one should not confuse Trump's foreign policy with a return to a Westphalian ethic in which states acknowledge one another as coequal sovereigns, mutually agree to respect one another's borders, and adopt a policy of noninterference in one another's internal affairs. Trump has not revoked the right of the United States to intervene, just redefined the terms on which it will do so. Trump seems to see America as a national Ubermensch, unaccountable to any international code whatsoever. "America First" foreign policy is thus premised on a view of the world as a food chain of nations, not as a society of equal states. Trump's demand that Mexico pay for the United States' border wall--lest he impose crippling economic sanctions--is devoid of any moral logic. He explained his proposals to rob Iraq and Libya of their oil resources by hearkening back to the good old days when "to the victor belong the spoils." Those proposals, according to military historian Lance Janda, are "so out of step with any plausible interpretation of U.S. history or international law that they should be dismissed out of hand by anyone with even a rudimentary understanding of world affairs. . . ."

Trump's foreign policy resembles nothing in the modern world. These days, even the most bellicose dictators manage to concoct some half-baked rationale to legitimize aggression. Trump has what can only be described as a Viking ethos: the plunder itself justifies the plundering.

By even uttering his foreign policy proposals, Trump has fully surrendered the moral high ground that the world needed the United States to hold. Not long ago, there was an international consensus that America could be trusted to maintain the world order. That trust eroded after the United States led its allies into a dubious war in 2003. Nevertheless, the society of nations might have turned to America for leadership once again. Trump's inauguration speech extinguished any hope that the United States is willing to take on such a role. Moreover, some of Trump's foreign policy proposals have been sufficiently absurd to raise the question of whether the United States is morally fit for the role at all. Under Trump, America may actually threaten the international norms that it once defended.

As Lance Janda put it, "Are we the good guys or not? Because if we are, and if we want to convince the world we are, then we can't go around invading countries and stealing their oil. The long-term damage to our reputation would be irrevocable." Trump has given no indication that he wants to convince the world that we are the good guys. From his perspective, nice countries finish last.

For another perspective:

Many have observed that while Trump's rhetoric points in a radical new direction, his cabinet choices point in a more traditional one.

Further Reading:

Charles Krauthammer drew a similar insight from Trump's inaugural address and cited some other historical examples.

Time magazine published an article on this topic worth reading: "Donald Trump's New World Order Puts Nation Over Globe."