Wednesday, October 11, 2017

Dear Baseball, It's Time to Get Rid of Home Plate Umpires

I'm not a baseball fan, strictly speaking. Don't get me wrong, I love the sport, but I harbor subversive ideas that a real baseball fan, by definition, cannot hold. I am a heretic. Take my view of umpiring, for example. In my opinion, it is high time Major League Baseball turned the job of calling balls and strikes over to the machines.

Logistically, it could be done. Already, home plates have become epicenters of scientific observation; there are probably more gizmos fixed on any given professional diamond than there are observing all of the planets in our solar system. Those gizmos give baseball fans--the most stat-hungry in sports--a trove of data that makes traditional metrics seem as antiquated as baseball cards. Gurus these days are more interested in the speed of a pitcher's fastball, the spin rate of his curve, and the exit velocity of a batter's home run than ERAs and batting averages.

These wonders of modern technology should be revolutionizing how umpiring is done. Instead, they are highlighting how atrocious umpiring is and probably always has been.

A short history: Since 1887, baseball has relied upon the relatively frail eyesight of human beings to determine whether a pitch has passed through an imaginary box known as a "strike zone" (before then, batters could tell pitchers to throw a high, low, or fair pitch). Ever since, players and fans have been complaining. Arguments over balls and strikes and the feeling of being ripped off by a blown call have become as integral to the baseball spectacle as chewing tobacco and facial hair.

Although the job hasn't changed much, being a major league umpire is more difficult than ever. A 92-mph fastball, once considered fast, is in the air for a mere 446 milliseconds. Today's pitchers have shaved precious milliseconds off that figure by throwing 100 mph more frequently than ever. Home plate umpires are expected to judge a ball's position at such speeds with no physical reference whatsoever: just an imaginary strike zone suspended in the air.
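For what it's worth, the 446-millisecond figure checks out with back-of-the-envelope arithmetic. Here is a minimal sketch, assuming constant speed over the full 60.5 feet from the rubber to the plate and ignoring drag (the function name and constants are mine):

```python
# Flight time of a pitch at constant speed, ignoring drag and the
# fact that pitchers release the ball a few feet in front of the
# rubber (which is why the true figure is slightly lower).

MOUND_TO_PLATE_FT = 60.5
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

def flight_time_ms(mph: float) -> float:
    """Milliseconds the ball spends in the air at a given speed."""
    ft_per_sec = mph * FT_PER_MILE / SEC_PER_HOUR
    return MOUND_TO_PLATE_FT / ft_per_sec * 1000

print(round(flight_time_ms(92)))   # ~448 ms for a 92-mph fastball
print(round(flight_time_ms(100)))  # ~412 ms at 100 mph
```

The small gap between this estimate and the article's 446 ms is roughly the release-point correction.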

Now, Major League Baseball has technology that could instantaneously call balls and strikes with the precision and impartiality that only machines can offer. It seems like a no-brainer. "The sad thing is you have no clue what could be called a ball or a strike at any point," complained former big-leaguer Eric Byrnes. "Why do millions of people sitting at home get to know whether or not it was a ball or strike, yet the poor dude behind home plate is the one who’s left in the dark?"

Read more here: http://www.star-telegram.com/sports/spt-columns-blogs/gil-lebreton/article105378146.html

If you can't relate to the emotional sting that comes from losing a game to a blown strike call, consider some hard data showing how bad even the best umpires in the world are at calling pitches. When Yale professor Toby Moskowitz analyzed nearly a million major league pitches, he found that umpires correctly called a ball or strike about 88% of the time. That might not sound bad, but for close pitches--those within two inches of the plate--the success rate dropped to 68.7%. That's not good, especially considering that umpires would have scored 50% by guessing blindly.
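The scale those percentages imply is worth spelling out. A quick sketch, using the figures from the paragraph above (the pitch count is approximate, as the text says "nearly a million"):

```python
# Rough scale of the problem implied by Moskowitz's numbers.

total_pitches = 1_000_000      # "nearly a million" pitches analyzed
accuracy_all = 0.88            # overall ball/strike accuracy
accuracy_close = 0.687         # accuracy on close pitches

missed_calls = total_pitches * (1 - accuracy_all)
print(f"~{missed_calls:,.0f} blown ball/strike calls")  # ~120,000

# A coin flip gets a binary call right half the time, so on close
# pitches umpires beat pure guessing by fewer than 19 points:
edge_over_guessing = accuracy_close - 0.50
print(f"{edge_over_guessing:.1%} better than guessing")  # 18.7%
```

In other words, even a conservative reading of the data puts the blown-call count in the six figures per million pitches.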

It gets worse. Not only do umpires regularly miss calls, but they are also susceptible to bias. According to research from Stanford University, umpires tend to adjust the strike zone to avoid sending runners to first base on a walk or to the dugout on a strikeout: their zones expand in three-ball counts and shrink in two-strike counts. Umpires are also reluctant to call two strikes in a row. And try as they might, umps can't quite shake home-field bias: statistics show that home pitchers benefit from slightly more generous strike zones.

So if umpires are judged on both accuracy and consistency, then they are failing on both fronts.

Yet letting machines become umpires doesn't sit right with old-school baseball fans. To them, baseball just wouldn't be baseball without umps to flamboyantly ring 'em up after a called third (alleged) strike. The "human element"--i.e., blown calls and the saliva-swapping arguments that ensue--is baseball's patina.

They have a point. After all, bad umpires have driven the plot of some of baseball's most memorable dramas: think Jackie Robinson magically stealing home through the tag of Yogi Berra in Game 1 of the 1955 World Series, or George Brett flailing out of the dugout after Yankees manager Billy Martin complained about the pine tar on Brett's bat and umpires negated his go-ahead home run. And who can forget Jim Joyce wiping tears away as he accepted a lineup card from Armando Galarraga a day after the umpire blew an easy call that cost Galarraga a perfect game?

But baseball should not let infatuation with its history slow its progress. In pitch-tracking technology, MLB has a near-perfect solution to a century-old problem. If such things were possible, the NBA would leap at the chance to remove the subjectivity from foul-calling. The NFL would pay handsomely to get rid of its five-minute replay delays. For now, basketball and football are stuck with their officiating woes. Baseball is not. Automated pitch calling would give officiating the precision and objectivity of replay review without the snooze-inducing delays.

I'm not saying mechanical pitch-calling won't change the game. It would, and drastically. Pitchers and batters who master the strike zone would become more valuable as their skill is more consistently rewarded. Catchers, freed from the charade of framing pitches, would change their catching stances to hold runners more aggressively. Even the aesthetics would change. Without a home plate umpire crowding the action, baseball games would take on a more backyard feel. That mano-a-mano medieval joust that is the pitcher-batter face-off would look better unsupervised.

Alas, baseball purists have a love-hate relationship with umpires that they aren't quite ready to quit. As much as they pester umpires through their TV screens, fans have developed a vocabulary for shrugging off their mediocrity. When a player resists the urge to swing at an outside pitch and gets called out on strikes, the blame falls on him for not anticipating a blown call ("that was close enough to swing at with two strikes"; "he's got to protect"). On a bad day, an umpire might still earn the praise of fans if he is "calling it both ways." That phrase sounds like a compliment. In fact, it is just a euphemism for "well, at least both teams are getting equally ripped off."

"It's a part of the game," is my least favorite argument for tolerating bad umpiring. Before pitch-tracking technology, blown calls were tolerable--fans had no choice but to tolerate them. Now, they are an embarrassment.

I get it. After baseball replaces them, I will probably miss the human element that home plate umpires offer. (Likewise, after scientists cure the common cold, I may feel a certain nostalgia for the taste of cough syrup.) But the sport will fare better when its games hinge more on athletic feats and less on the foibles of its officials.
________________________________________________________________________________

Further Reading:

A summary of HBO's feature, Man vs. Machine, can be seen here. Apparently, former umpire Jerry Crawford laughed off any suggestion that things needed to change. Umpires have gotten so good, he claimed, that "they're not missing any pitches."

Here is an article for those interested in the history of pitching rules.

For Another Perspective:

Derek Thompson, writing for The Atlantic, has argued that pitch trackers have made baseball worse by making it harder to score. The 2017 season--which set an all-time record for home runs--may have wrecked his theory.

Tuesday, October 3, 2017

Communists Yesterday, Racists Today: How America Deals with Ideological Pariahs

I'm puzzled by the way that Americans, liberal and conservative, treat their ideological pariahs. We, who claim devotion to the freedom of speech, nevertheless feel a moral obligation to shame, boycott, and ostracize those who hold and express unpopular beliefs.

I don't get it. I consider myself a patriotic American, but I haven't heard one good reason (just a lot of bad reasons) why I should adopt an attitude of scorn or disgust towards any NFL player--or Jehovah's Witness, for that matter--who doesn't stand during the National Anthem. The reasons for expressing--or not expressing--one's patriotism are highly personal. Colin Kaepernick might be an ungrateful punk for all I know, but I don't storm out of the chapel when a fellow churchgoer is noisy during a prayer, and I'm not going to boycott the NFL because some twenty-eight-year-old doofus is irreverent during a flag salute.

For all of our progress on racial and ethnic tolerance, ideological intolerance is a stubborn disease that survives by hiding under the guise of righteous causes.

Take, for example, the war against bigotry, today's most socially acceptable form of ideological intolerance. The national mood is such that one can't be too hard on racists. After the controversial Unite the Right rally in Charlottesville, Virginia, Governor Terry McAuliffe shooed away white supremacists with a phrase that sounded like it came straight from the mouth of a white supremacist: "There is no place for you here, there is no place for you in America." His motives may have been pure, but there was something unsettling about hearing a high public official use ideology to define Americanness.

Across the country, pockets of hysteria broke out as pictures of rally attendees went viral. When students at the University of Nevada, Reno discovered a racist among them, fretful students packed the Student Senate, demanding that the administration do something to protect them. Some complained that knowing there was a racist on campus made them fear for their lives. A petition demanding that the school expel the offender gathered 32,000 signatures.

Ultimately, those pesky First and Fourteenth Amendments (and, hopefully, a bit of human decency) prevented UNR, as a public school, from yielding to the demands of its horror-struck student body.

The workplace isn't always so forgiving. Recently, otherwise talented individuals have lost their jobs when the public discovered their previously dormant biases. Take Fisher DeBerry, for example, who was the winningest football coach in Air Force Academy history before he said that his team needed to recruit more black players because "Afro-American kids can run very well." He was publicly reprimanded and resigned soon after. Or former NPR news analyst Juan Williams, who was canned after confessing that he feels nervous when he sees passengers in Muslim garb on his plane. More recently, James Damore became a martyr of the alt-right when he suggested, as tactfully as one possibly could, that non-bias factors were contributing to Google's gender gap. Feminists decried Damore's memo (which was factually accurate) as sexist, and Google sent Damore packing.

These victims have one thing in common: none of them was actually accused of acting in an intolerant way; they were convicted of bigotry for honestly and candidly expressing unpopular opinions.

America has a long history of ostracizing ideological pariahs. If racism is the 21st century's cardinal sin, communism was the sin of the 20th, when socialists, communists, and radicals of all kinds were deemed un-American, disloyal, and a serious threat to the country.

Persecution of radicals--or suspected radicals--became socially acceptable. In Chicago, for example, a sailor shot a man three times at a pageant when the man refused to stand for the national anthem. The crowd erupted in cheers. "It was believed by the authorities," the Washington Post reported, lest its readers shed any tears for the victim, "he had come here for the [Industrial Workers of the World] convention."

U.S. Attorney General A. Mitchell Palmer compiled a list of suspected radicals 450,000 names long, then set out to rid the country of its reds. Palmer's dragnet operation, which became known as the "Palmer Raids," rounded up tens of thousands of suspected radicals, often in Kafkaesque fashion, without telling the arrestees the charges against them. Hundreds, never convicted of a crime, were packed onto a ship dubbed the "Soviet Ark" and sent to Russia. Some arrestees became bait for other suspected radicals: those who visited political prisoners at the Seyms Street Jail in Hartford, Connecticut, were automatically arrested; they were, Palmer reasoned, essentially attending "revolutionary meetings," and that was proof enough of their guilt.

The public hysteria subsided, but returned with a vengeance during the Cold War. Again, intense fear of communism fueled a nationwide inquisition. President Truman instituted a "Loyalty Program" beginning in 1947 to purge radicals from the federal workforce. Suspected radicals were brought (often on anonymous tips) before a "Loyalty Board," where they were probed about their belief in racial equality, association with labor unions, support for socialized medicine, sexual orientation, and anything else that whiffed of communism.

Though few employees were actually terminated, the mere threat of being dissected by a Loyalty Board had a repressive effect. In 1950 the American Scholar published the results of a survey which found:
The atmosphere in government is one of fear--fear of ideas and of irresponsible and unknown informers. Government employees are afraid to attend meetings of politically minded groups; they are afraid to read "liberal" publications; they screen their friends carefully for "left-wing" ideas. . . . Nobody wants to go through a "loyalty" investigation.
The fear of being pinned as a communist or even a "fellow traveler" spread far beyond the federal government. In Madison, Wisconsin, a journalist asked 112 people to sign a petition that contained nothing but lines from the Declaration of Independence and Bill of Rights. All but one person refused for fear that they would be caught affixing their names to a subversive document.

Those who spoke out against red-baiting risked being smeared as communists themselves. Drew Pearson, a prominent journalist, spoke out against Senator McCarthy's communist witch hunt. When McCarthy met him in a nightclub, the senator grabbed Pearson by the neck and kneed him in the groin before Richard Nixon pulled the two apart. Soon after their brawl, McCarthy called Pearson a "sugar-coated voice of Russia" from the floor of the Senate and called for "every loyal American" to boycott the Adam Hat Company, Pearson's radio sponsor, unless it revoked its sponsorship. The company soon obliged.

The most shameful incidents of ideological persecution to come out of the Red Scare involve the thousands of ordinary citizens, far removed from nuclear secrets, who nevertheless found their livelihoods jeopardized by anti-communist witch hunts. Anne Hale's is one such story. Ms. Hale was a member of the Communist Party during World War II (when the United States and Soviet Union were allies). After the war, she cut her ties to the party and became a second-grade schoolteacher in Wayland, Massachusetts.

Hale's past came to light in 1948 when an FBI informant caught her buying a Pete Seeger songbook. Agents, who did not think that Hale posed a security threat, told the Wayland School Committee about Hale's past. Hale assured the school committee that she was no longer a communist and that she was a loyal American, but the committee unanimously voted to suspend her without pay.

Hale's case would not have been unusual if she, like thousands of other accused communists, had simply resigned. Instead, she braved the public humiliation of a hearing. "I think it will do less harm to the children," she reasoned, "to see me standing up for what I believe to be true, than to see me run away." Hale admitted during the hearing that she had been a communist, but insisted that she did not believe in the overthrow of the government by force and violence and that she "believed in government by the majority, in the Bill of Rights, and in the protection of the rights of minorities."

Few dared vouch for her. Even her lawyer quit just days before the hearing. Hale's reverend, to his credit, told the press that he would give Hale a character reference. He paid a dear price. Attendance at his parish plummeted; one Sunday, just three parishioners showed up.

After eight grueling days of hearings, the school committee voted to dismiss her. Hale asked that she be allowed to say goodbye to her class. The committee refused. Instead, Hale sent each one of her students a farewell. "Just remember these things," she wrote, "which I am sure you know--I love my country and I love you."

The communist purges of the 20th century and today's purge of racists have at least one thing in common: they are driven by a fear that is grossly disproportionate to the threat posed. That is not to say that there are no hate-mongering racists among us. Just as there were communists during the Cold War who were intent on passing nuclear secrets to the Soviets and who, given the chance, would have joined a violent overthrow of the government, there are firebrand bigots today who would bring back Jim Crow.

But, as was the case during the Red Scares, the war against bigotry is being waged not just against those who act to harm others, but also against those who merely lean to the forbidden side of an ideological spectrum. Such ideological purges have no logical end because self-righteous persecutors will always find a next-baddest apple to drive from polite society. And thus both ideological scares will probably suffer the same fate: they will fizzle out after enough James Damores and Anne Hales make people realize that the quest for national purity was madness all along.

This is not to say that we should have a blasé attitude toward hate and prejudice, only that the way to suppress an idiotic ideology is more evangelical and less persecutory: a strategy of conversion rather than coercion; an acknowledgment that preserving America as a laboratory of ideas means tolerating those who hold opinions that we loathe.

My fellow Mainers still take pride in the heroic words of Maine Senator Margaret Chase Smith, who bravely rebuked Joe McCarthy and her fellow Republicans from the Senate floor:
Those of us who shout the loudest about Americanism in making character assassination are all too frequently those who, by their own words and acts, ignore some of the basic principles of Americanism: the right to criticize; the right to hold unpopular beliefs; the right to protest; the right of independent thought.
The exercise of these rights should not cost one single American his reputation or his right to a livelihood merely because he happens to know someone who holds unpopular beliefs. Who of us doesn't? Otherwise none of us could call our souls our own. Otherwise thought control would have set in.
America's mission in the 20th century should have been to expose the foolishness of communism and let the radicals exorcise themselves, not to exile the communists from every nook of society. The same may be said of the war against racism. One can only hope that we are faster learners this time around.

__________________________________________________

Further reading:

Much of my material about the Red Scares comes from Haynes Johnson's thoughtful book, The Age of Anxiety: McCarthyism to Terrorism.

You can read the full story of Anne Hale in the Boston Globe.

The Brookings Institution recently published the results of a poll of undergraduate students on freedom-of-speech issues. The results were disturbing. Among other findings, almost twenty percent of respondents thought it acceptable to act--including resorting to violence--to suppress expression that they consider offensive.