In the early 1970s, American chicken hatcheries began killing off their stocks of chicks. One hatchery manager, to make a point, drowned 43,000 of them on television for all the world to see. It was cruel. It was also the farmers' rational economic response to a government policy that turned chicks from assets into liabilities by fixing the price that farmers could charge for their birds. Feed prices rose; chicken prices didn't. Cue the dirge.
One might assume that a socialist-leaning Democrat was responsible for such a policy. After all, the chick slaughters were reminiscent of scenes from The Grapes of Wrath, in which John Steinbeck described the perverse outcome of Roosevelt-era efforts to artificially prop up food prices by torching produce and limiting supply. ("And children dying of pellagra must die because a profit cannot be taken from an orange.")
In fact, the chicks had Richard Nixon, a theoretically pro-free-market Republican, to thank for their early demise. Nixon had been a critic of government price controls ever since he worked as an attorney for a government tire-rationing office during World War II. But as president, he built a persona as a champion of the working man and a crusader against foreign influence, and he had no qualms about exercising his executive power to--and beyond--its fullest extent.
So, when pressured to do something (anything) about rising inflation in 1971, Nixon took swift and sweeping action. On Sunday, August 15, he interrupted the hit Western television series Bonanza to declare that "the time has come for decisive action." Taking a measure that seems unthinkable today, Nixon ordered "a freeze on all prices and wages throughout the United States" for ninety days. Inflation would stop. Because Nixon said so.
The price controls were necessary, Nixon explained, because "international money speculators" were waging an all-out war on the American economy and profiting from the inflation crisis at the expense of American paychecks.
Indeed, foreign powers were the primary targets of Nixon's program, under which the United States suspended the conversion of dollars into gold--effectively devaluing the dollar against foreign currencies--imposed a 10% tariff on imports, and cut foreign aid. America, Nixon explained, had rescued foreign economies from the fallout of World War II; now it was time for them to contribute their fair share.
Politically, the gamble paid off. As Nixon had intended, the public viewed him as a decisive commander. Nixon bolstered his credentials as the champion of the working man and the enemy of international speculators. The price controls artificially curbed inflation long and well enough for Nixon to win reelection in 1972.
But as famed economist Milton Friedman predicted, the wage and price controls proved to be a merely cosmetic fix for a systemic problem. When the administration lifted the controls, prices shot up; by 1973, inflation had reached 9 percent. Although Nixon had warned that "we must not let controls become a narcotic," he needed more than ever to appear strong in the face of the looming Watergate scandal. To that end, he reimposed price controls and expanded the government staff charged with sanctioning businesses that raised prices. In the long term, the price controls did not resolve the inflation crisis. By the time Nixon left office and the controls were fully lifted in April 1974, inflation had soared to 12 percent.
There are lessons to be drawn from the failure of Nixon's price controls. For one, they prove that prices are more obedient to the invisible hand of the market than to presidential orders. To Nixon's head of the Office of Management and Budget, this was the silver lining. "At least," he told Nixon, "we convinced everyone else of the rightness of our original position that wage-price controls are not the answer."
Perhaps more importantly, the "Nixon shock" should be remembered as a cautionary tale about the dangers of giving broad authority to a president who will use his power to build a strong-man persona rather than protect the interests of his constituents.
There are deep parallels between Trump and Nixon. Like Nixon, Trump pushes policies that bolster his hard-nosed persona, even when--or especially when--they run contrary to well-established economic truths. Most notably, he invokes Nixon-esque xenophobia to justify trade tariffs that are certain to do more harm than good to the U.S. economy.
But lecturing Trump about the vices of protectionism is futile, because, like Nixon, Trump is less interested in the pursuit of national prosperity than in the pursuit of a masculine persona built on a sentimental narrative in which foreigners are a threat and he is the protector.
Apologists will undoubtedly deflect criticism of Trump's irresponsible trade policies with ad hominem attacks on Obama and his socialist band of Democrats. But history shows that traitors make the best assassins. As Nixon proved, a Republican president with an appetite for "decisive action" can wield executive power with far more impunity than a Democrat, because his own loyalty-fragmented party is a meager obstacle.
Fortunately, Nixon's wage and price controls lost their allure after his disgraceful departure gave Republicans the opportunity to resume their place as the party of the free market, ready and willing to check and reverse bureaucratic solutions to free-market problems--a role they solidified during the oil crisis of 1979.
Trumpism marks a worrisome departure from that post-Nixon political order. In a stunning reversal, a large majority of Republicans now favor tariff increases. Even those harmed by his trade policies seem inclined to follow the Trump persona over free market ideas; after Harley-Davidson announced that it would move manufacturing operations out of the United States in response to Trump's tariffs, the company's employees were still apt to trust the president's instincts and blame foreigners for their predicament. Persona is a powerful thing.
America will be fortunate if Trump leaves before he can break all of our nice things. It would also help if he would take his trade policies with him, so that Republicans can go back to pushing against Democrats for a freer market and an executive branch less inclined to Nixon's brand of "decisive action."
For Another Perspective:
Peter Navarro, an economic adviser to President Trump, offers a rationale for raising tariffs.
Megan McArdle wrote an op-ed arguing that free trade is under fire because people care more about their identity as producers than consumers.
Further Reading:
For more background on Nixon's economic policy and the conditions of his presidency, see PBS's "Commanding Heights" web page, based on the book of the same title.
For a full video of Nixon's August 15, 1971 speech, click here.
Monday, February 26, 2018
It's Hard for the Government to Isolate and Treat Potential Mass Shooters--And For Good Reason
The NRA has a talking point that never seems to generate much discussion: its call for the government to prevent mass shootings by isolating and treating the mentally ill. Some law enforcement authorities have recently made the same call for reform, arguing that laws need to make it easier for them to detain suspicious people, such as the man who went on a shooting rampage in Parkland, Florida, and bring them to mental health professionals. These sound like reasonable, even compassionate proposals, especially after massacres by deranged shooters who, in hindsight, were exhibiting signs of trouble.
I have some experience in this area. As an attorney, I have gone before a judge and asked him to commit law-abiding citizens to a psychiatric facility against their will. It can be an uncomfortable job for an attorney who feels passionately, as I do, about the virtues of constitutional rights and limited government. In the world of involuntary commitments, many of the features that would normally constrain a state's ability to lock up its citizens--juries, proof of guilt beyond a reasonable doubt, the right to bail, extensive discovery procedures--don't exist. A citizen whose liberty is at stake doesn't get a public trial, just a short hearing at the psychiatric hospital itself. Judges appear by teleconference and routinely rely on the opinion of the state's own experts (patients can, but generally don't, have experts of their own). The whole thing lasts little more than thirty minutes.
It's not a broken system, but it is a system of relaxed constitutional protections that relies heavily on the honesty and integrity of the professionals within it. And for that reason, it represents a potential constitutional loophole that a government--hamstrung elsewhere by constitutional protections but nevertheless pressured to guarantee public safety--could exploit.
Ever since the notions of freedom and liberty came into vogue, institutionalization on the grounds of mental illness has been a convenient loophole for imprisoning dissidents without proof of guilt. And no case study better illustrates how the legal mechanism of involuntary commitment can become a tool of oppressive regimes than Soviet Russia.
Toward the end of the Cold War, American psychiatrists examined Pyotr Grigorievich Grigorenko, a highly decorated Soviet major general who had become a vocal dissident. Soviet authorities had brought him to government psychiatrists, who decided that Grigorenko's political views were symptoms of a mental disorder. They reasoned that his fear of the government was paranoia--though he was in fact being closely monitored by the KGB--and that his obstinate criticism of the authorities--something that could have gotten him shot--was suicidal lunacy. Over his objections, psychiatrists committed him to a facility to cure him of "reformism."
But when American psychiatrists examined Grigorenko, they found no sign that he was, or ever had been, psychotic. To them, his willingness to advocate reform despite the inherent dangers was a mark of conviction, not a symptom of madness. Grigorenko's confinement raised suspicions that the Soviets were using psychiatric hospitals as political prisons and tools for social control.
A delegation of American observers confirmed those suspicions after visiting and inspecting Soviet psychiatric hospitals in 1989. On paper, the Soviet laws for involuntary commitments were similar to those here in the United States: in both systems, the government could commit only those who both suffered from a mental disorder and posed a danger to themselves or others.
But in practice, the Soviet system was much different. The delegation learned that Soviet psychiatrists' notions of what constituted a mental illness, and of when someone posed a threat of harm, were so broad and elastic that they could confine healthy individuals who posed no threat of violence. Thanks to the uncertainties inherent in psychiatry, psychiatrists could commit virtually anyone the KGB wanted them to. Soviet psychiatrists became notorious for using an invented diagnosis, "sluggish schizophrenia," to label neurotic behavior a symptom of a mental disorder. Patients like General Grigorenko who espoused disfavored views were considered diseased. Patients could also be committed even when they posed no danger of physical harm to anyone, because political harm was considered dangerous enough: those who spoke out against the government, became religious, or illegally crossed a border were deemed dangers to society.
As in the American system of involuntary confinements, Soviet psychiatrists were effectively the gatekeepers to their hospitals. But their professional integrity proved no match for pressure from the KGB. American observers met a patient who had come to a psychiatric facility after the KGB caught him visiting the apartment of a political subversive. Instead of conducting an examination, his examining psychiatrist told him, "I have a family and I need this job. I and the rest of the Commission will do what we are asked by the KGB." Observers also saw that clinicians would skew the results of their examinations by interviewing patients in an adversarial way--observers compared the exams to courtroom cross-examinations. Patients who responded with hostility were described as paranoid.
Not surprisingly, involuntary confinements became a Soviet tool of repression like the GULAGs of old. But in some ways, involuntary commitments were even better at suppressing dissent than the GULAGs. By declaring someone mentally ill and dangerous, the Soviets could simultaneously avoid embarrassing public trials while also stigmatizing and discrediting political or religious dissidents, whose views could then be shrugged off as lunacy.
Fortunately, involuntary commitments here in the United States are not systematically abused. Psychiatrists don't face reprisals when they don't recommend commitment. Commitment periods are limited in duration. Concepts of mental disorder and risk of harm are narrowly applied. Law enforcement generally respects a person's right to refuse treatment and uses involuntary commitment as a last resort.
Of course, these features that separate the American and Soviet systems also render involuntary commitment a woefully inadequate tool for protecting the public from mass shooters such as the one who went on a rampage in Parkland, Florida.
If Americans wanted to rid themselves of the weirdos who aren't breaking the law but who make us nervous nonetheless, the government could wield involuntary commitment in a heavier way. Commitments could become a first resort when arrest isn't an option. Mental health examiners could confine more mentally ill people by labeling any minor symptom as a mental disorder. Courts could read lawful, even constitutionally protected behavior--stockpiling weapons, posting controversial speech on an online forum, attending a mosque with a reputation for radicalism--as signs of danger. Psychiatric hospitals could become quasi-prisons for a new class of quasi-criminals: a holding place for those who make the neighbors squirm but can't be convicted of a crime.
Times like these call for a few important reminders. First, the limitations of involuntary commitments are the system's saving virtues. Second, our legal system is not designed to find and punish criminals before they become criminals--and for good reason. Third, we should not expect law enforcement to seek out and detain such people.
Finally, it is worth remembering that the NRA is not an advocate for freedom in general, just for one particular freedom. Those of us who value freedom in general should be wary of a call for reform that would protect one freedom at the expense of many others.
Further Reading:
You can read more about the story of General Grigorenko here.
Richard Bonnie has written extensively on the political abuse of psychiatry in the Soviet Union and elsewhere. You can read one of his papers here.
For Another Perspective:
The Atlantic recently published an exchange with John Snook, the director of an organization that advocates for more robust mental health treatment. Snook favors a relaxation of the standards needed for involuntary commitments; specifically, a relaxation of the requirement that a person pose a danger before being committed.
Changes to involuntary commitment laws are not the only reforms being proposed. Five states have implemented "red flag laws" that allow law enforcement to temporarily confiscate someone's firearms when that person shows signs of trouble. Such laws have been at least somewhat effective, especially in preventing suicides.
Note: Though I have some experience with involuntary commitments as an attorney, I don't purport to be an expert in that practice area.
Friday, February 9, 2018
Let Them Play
It's nice that Americans can test the patriotism of their fellow citizens with North Korean ease. Those who stand during the national anthem and applaud speeches from their Commander-in-Chief are the faithful. Those who do not are the traitors. It's that simple. These days no one gets too bogged down in the nuances of John F. Kennedy's call to "ask what you can do for your country." If Americans did concern themselves with such things, then those True Patriots who have been so adamant that the NFL coerce its players into standing for the national anthem might send a suggestion to the NHL, which refused to let its players play for their countries during the 2018 Winter Olympics.
The NHL's disappointing stance follows five consecutive winter games in which the world's best professional hockey players asked what they could do for their respective countries and answered by doing what they do best. The league owners generously accommodated their players by skipping the annual All-Star Game and taking a two-and-a-half-week break from the regular season schedule. It felt like a win for everyone. The players had the privilege of playing for their countries, while their countrymen had the privilege of watching some of the purest competition that sports has to offer. Though an inconvenience, the Olympics served as an opportunity for the league to showcase its product on the most international of stages before an audience that ordinarily doesn't care for hockey.
But in the year leading up to the Olympics, NHL ownership smelled an opportunity to capitalize on the patriotism of its players by using their participation as a bargaining chip in labor negotiations--a charge the owners tepidly deny. When the players refused to bend, ownership grounded them from attending the games, trotting out excuses that were either nakedly hypocritical (they expect us to believe that owners who hire goons specifically for their ability to punch other players' faces in but don't require facemasks are suddenly concerned about player safety?) or unapologetically pragmatic. "We certainly understand and appreciate the players want to be a part of the Olympics," one owner lamented, "but from our perspective, it is difficult for our business."
Thus we meet at the dubious intersection of capitalism and patriotism. It is an intersection where tax evaders and profiteers have put their own interests first--where, as Robert La Follette said in the midst of World War I, "wealth has never yet sacrificed itself on the altar of patriotism."
But it is also an intersection where some of the most unsentimental industrialists have put their country's welfare over personal wealth. Cornelius Vanderbilt, for example, often said that "there is no friendship in trade." But in 1862, when a Confederate ironclad warship threatened to destroy the entire wood-hulled U.S. Navy, Secretary of War Edwin Stanton desperately sent him a telegram asking how much it would cost for Vanderbilt to use his private fleet to stop the Confederate threat. Vanderbilt surprised Stanton by throwing his prized $1 million steamship--the Vanderbilt--into the fray free of charge, on the one condition that he could personally equip it for battle.
NHL owners might take a cue from Commodore Vanderbilt.
In the interest of fairness, let us concede a couple points. First, NHL owners have every right to tell their players where they can and can't play hockey. Second, sending players to the Olympics during prime hockey-viewing season is no small sacrifice, even once every four years. But even in the most sympathetic light, the owners still look ugly. Patriotic sacrifices are never convenient, as anyone who has voted, served on a jury, or held a job open for a soldier on military leave knows. Moreover, the players' vocal enthusiasm for donning their nation's colors puts the owners to shame. Just as the players deserve praise for being ready and willing to pad up for their homeland, the owners deserve criticism for standing in their way.
The casualty of the owners' pragmatism will be the unifying experience that comes from watching America's best push and shove not for money but for America herself. Perhaps the owners can think about that the next time they stand for the national anthem.
Wednesday, January 24, 2018
What Monkeys Can Tell Us About Tax Reform
According to Albert Einstein, the hardest thing in the world to understand is the income tax. Perhaps only the United States Congress could concoct a paper quagmire of tax breaks and loopholes thick enough to stump a man whose name is synonymous with genius. But as difficult as the U.S. Tax Code is to understand, reforming it is even more difficult. Tax reform is like getting 300 million people to agree on how to arrange their furniture; decades of lobbying and bickering have given us the tax equivalent of a sofa on the pool table.
Congress's most recent endeavor--imperfect as it might be--looks like a step in the right direction. For most Americans, filing taxes will become simpler (though they won't be filing their returns on postcards anytime soon). Most Americans will also see their taxes go down. Perhaps most importantly, the tax reform will spur economic growth that will translate to higher household incomes (though the tax cuts certainly won't pay for themselves).
But the rules of party politics are such that it has fallen to Democrats to form a resistance against a reform that will bring broad and tangible benefits to the majority of the electorate. Their task has been daunting, but not impossible. They could emphasize that the tax cut will cost future generations trillions of dollars and that the swelling public debt will probably push up interest rates as consumers and businesses compete for loans against a debt-crazy Uncle Sam.
But debt conservatism has never been a theme of the Democratic platform, so it should be no surprise that the main thrust of the resistance's argument is a familiar one: that tax cuts benefit the wealthy at the expense of the poor.
That would be a tough argument even if it were true; the wealthier 51% of Americans--those who actually pay income taxes--might reply that it is about time for the other 49% to make a token contribution. But the real problem with the Democrats' argument is that it isn't true at all. Under the Republican tax reform, the vast majority of taxpayers at every income level will get a tax cut. Earners in the top 0.1% are actually the most likely to see a tax increase. And yet Democrats have raised another banner in their crusade against the wealthy few on behalf of the exploited masses.
Democrats aren't stupid, and their opposition to the tax reform isn't doomed to fail. After all, they have a curious human tendency on their side.
Monkeys may shed light on why many Americans who stand to gain from the Republican tax cuts nevertheless oppose them. In 2003, two scientists tested how female monkeys react to unequal treatment by simultaneously rewarding one monkey with a cucumber--something that monkeys like--and another with grapes--something that monkeys like even more.
The results were enlightening. The monkeys consistently reacted negatively when they got a mere cucumber for returning a token while another monkey got a grape for doing the same thing. It isn't surprising that monkeys, like humans, have a sense of fairness. But the monkeys didn't just chafe at getting the short end of a bad deal; they reacted to being shorted in ways that were contrary to their self-interest. Once they saw another monkey getting grapes, the monkeys spurned a reward that they ordinarily would have enjoyed, refusing to return their tokens or throwing their cucumbers out of the test area.
Scientists have observed the same phenomenon in humans and given it a fancy name: inequity aversion. Understanding it has added nuance to economics, a field of study that traditionally assumed that people act rationally for their own gain. Inequity aversion could help explain why people don't respond to incentives as one would expect. The recent fixation on the disparity between "the 99%" and the wealthy could be the missing key to explaining why Americans are staying out of the labor force even as jobs become more readily available.
Inequity aversion also explains why tax reform is so politically perilous: even a reform that improves the efficiency of the tax code and benefits the vast majority of taxpayers may nevertheless prove deeply unpopular if it is perceived as giving grapes to some and cucumbers to others. Democrats may have success appealing to the less rational angels of our nature if they can convince enough Americans that their somewhat lower taxes and modest pay increases are insults relative to the billions of dollars that corporations will save. But if they succeed in undoing tax reform, their constituents will suffer economically with no consolation other than the deeper economic suffering of the wealthy.
Of course, tax debates would be more fruitful if politicians could agree that a good tax code is one that taxes undesirable activity--overconsumption, pollution, etc.--and encourages desirable activity such as production and child-rearing. Then they could stop wasting time arguing about what segment of the population needs to pay its "fair share" (whatever that means).
For now, we'll have to be content with fewer chairs on the pool table.
Wednesday, November 1, 2017
Sexual Assault at America's Colleges: A Culture of Complicity
This month, the world heard that Harvey Weinstein is a serial sexual abuser. Hollywood actors summoned their craft to feign surprise. But even George Clooney and Meryl Streep can't act well enough to inspire shock in an audience that has learned to distrust everyone from presidents to priests.
Harvey Weinstein was no priest, and the stories of his sexual misconduct should surprise no one. Apparently his reputation was so well-known in entertainment circles that NYU professors discouraged female students from interning with him. Even entertainment outsiders could have guessed that Weinstein's company would be a lion's den for aspiring young women. Weinstein, after all, is the kingpin behind raunch-fests like "Grindhouse," "Dirty Girl," "Sin City," and "Zack and Miri Make a Porno." It doesn't take an investigative reporter to suspect that a person whose wares were the objectification and exploitation of women might not be averse to the objectification and exploitation of women in the workplace. Think of it this way: normal employers couldn't show many of Weinstein's movies in their offices without creating what sexual harassment lawyers call a "hostile work environment." But for Weinstein and the women who worked with him, those movies were the work environment.
Weinstein did not survive as a prolific sexual predator by stealth. Rather, he thrived thanks to what The New Yorker termed a "culture of complicity" in Hollywood.
There are two approaches to fighting sexual assault. The first and most obvious method is to hunt, expose, and terminate predators one by one after they have already claimed their prey. But a more proactive, if less satisfying, approach is to modify the habitats where predators like Weinstein thrive.
Indeed, America's college campuses, where sexual violence is epidemic, are one such habitat in sore need of modification.
By now, any college official who is interested in providing a safe educational environment for aspiring young women knows two things. The first is that sexual assaults are occurring at an alarming rate on American campuses. If you sent your daughter to an American college this year, there is a one in five chance that she will be sexually assaulted during her time there. (By comparison, if you sent your son to war in Iraq, the odds that he would be killed or wounded were about one in fifty.)
The second thing they should know is that one of the most significant predictors of sexual assault in college is heavy drinking. Eighty percent of campus sexual assaults involve alcohol and, not surprisingly, college environments with less drinking have far fewer sexual assaults than those with more. Sexual violence is seven times more common at "party schools" such as Syracuse and Penn State than at "stone cold sober schools" like BYU and Brooklyn College.
These statistics are not new--studies in the 1990s showed that sexual violence is more common at schools where there is more binge drinking. They are also not surprising--drunkenness and risky sexual venturing have become more readily associated with the college experience than academics. Even eight in ten college students themselves agree that less alcohol would prevent sexual assaults.
Bureaucrats are often accused of paying lip service to problems without finding solutions, and when it comes to campus sexual assault, America's public officials--those charged with giving aspiring young women a safe place to learn--have been in rare form. Thus far, politicians and experts have compiled volumes of statistics and organized no shortage of task forces, yet they continue to treat sexual assault as a conundrum without a solution. In January 2014, President Obama created a task force to address college sexual assault. Three years later, the task force issued a report that it gushingly dedicated to the courageous sexual assault survivors who were "agents of change." The task force then reported data showing that not much had changed at all. It didn't inspire confidence that change was on the horizon, either: a section shamefully mistitled "Prevention Programs That Work" does not even mention the schools that keep their rates of binge drinking and sexual assault low.
The report did warmly embrace colleges that used token awareness programs to counter sexual assaults. Since 1999, the federal government has thrown more than $131 million at such programs. But the programs have treated sexual assaults as if they occur in a vacuum and, not surprisingly, they haven't worked. The Department of Justice reviewed such programs in 2007 and lamented that "despite the link between substance use and sexual assault, it appears that few sexual assault prevention and/or risk reduction programs address the relationship."
Those colleges that have tried to lower alcohol consumption have done so with laughable half-measures that are doomed to failure. Indiana University banned hard liquor at frat parties. Stanford limits the size of alcohol bottles. (If only one could lose weight by eating ice cream from smaller dishes.) Michigan began "self-policing" its fraternities by sending its bravest--and certainly least popular--students to patrol frat houses in bright orange shirts, asking drunk partiers to please get off the roof. Meanwhile, in June 2016, more than 35 universities began earning fortunes by selling beer at football games. At the University of Maine, my alma mater, there is a pub in the middle of campus, and the school anthem--taught to every student on their first day of college--is a drinking song.
Sadly, America's pesky culture of sensitivity, not ignorance, is giving predators a habitat to thrive in. Fear of being seen as blaming victims and excusing perpetrators has chilled a long-overdue discussion about the connection between alcohol and sexual violence. As one famous victim of sexual assault rebuked her assailant: "Alcohol is not an excuse. Is it a factor? Yes. But alcohol was not the one who stripped me, fingered me, had my head dragging against the ground, with me almost fully naked."
She was right. Drunkenness cannot be a shield for perpetrators or a sword against victims. But such concerns didn't stop Americans from nearly halving alcohol-related traffic deaths by waging an aggressive and multi-faceted campaign against drunk driving. If such a campaign against sexual violence on college campuses is going to happen, it will have to begin with a recognition that America's beloved alcohol is playing a major role.
College officials do a shameful disservice to America's daughters if they let political correctness impede progress. One frustrated writer interviewed experts who confessed that they were reluctant to advise college girls to protect themselves from sexual predators by not drinking. She wrote:
[W]e are failing to let women know that when they render themselves defenseless, terrible things can be done to them. Young women are getting a distorted message that their right to match men drink for drink is a feminist issue. The real feminist message should be that when you lose the ability to be responsible for yourself, you drastically increase the chances that you will attract the kinds of people who, shall we say, don't have your best interest at heart. That's not blaming the victim; that's trying to prevent more victims.
Sexual assault is endemic to college campuses not because the problem is concealed, but because a culture of complicity exists. Officials know that binge drinking and sexual assaults go hand in hand, yet they remain willfully mired in the brainstorming phase of their response, grasping for more convenient solutions to the problem and refusing to take anything but the most tepid measures to fight their colleges' binge-drinking cultures. But the most effective treatments for America's rape culture would be a healthy dose of good, old-fashioned Victorian prudishness and a resurrection of progressive temperance.
It is high time that booze gets some bad publicity. Microbreweries and country singers alike have successfully portrayed America's favorite drug as not only acceptable but downright wholesome. A discussion about the connection between alcohol and sexual assault would be a good start at giving alcohol the bad reputation that it has earned.
The colleges that give sexual predators an environment to thrive deserve some bad publicity too. Perhaps we could begin by abolishing the phrase "party schools" from our vocabularies--a euphemism that is more likely to invite than deter young people.
Perhaps "hotbeds of sexual violence" would be a more apt substitute.
_________________________________________________________________
Further Reading and Other Perspectives:
In 2015, CNN aired a documentary about campus rapes called The Hunting Ground which, interestingly, Harvey Weinstein helped fund. One of its main arguments is that 8% of college men commit 90% of the sexual assaults, so purging those predators from college campuses would be an effective way to address the problem. The documentary has received some criticism for the data underpinning its argument and for how it portrayed accused predators.
Harvey Weinstein was no priest, and the stories of his sexual misconduct should surprise no one. Apparently his reputation was so well-known in entertainment circles that NYU professors discouraged female students from interning with him. Even entertainment outsiders could have guessed that Weinstein's company would be a lion's den for aspiring young women. Weinstein, after all, is the kingpin behind raunch-fests like "Grindhouse," "Dirty Girl," "Sin City," and "Zack and Miri Make a Porno." It doesn't take an investigative reporter to suspect that a person whose wares were the objectification and exploitation of women might not be averse to the objectification and exploitation of women in the workplace. Think of it this way: normal employers couldn't show many of Weinstein's movies in their offices without creating what sexual harassment lawyers call a "hostile work environment." But for Weinstein and the women who worked with him, those movies were the work environment.
Weinsten did not survive as a prolific sexual predator by stealth. Rather, he thrived thanks to what The New Yorker termed a "culture of complicity" in Hollywood.
There are two approaches to fighting sexual assault. The first and most obvious method is to hunt, expose, and terminate predators one by one after they have already claimed their prey. But a more proactive, if less satisfying, approach is to modify the habitats where predators like Weinstein thrive.
Indeed, America's college campuses, where there is an epidemic of sexual violence, is one such habitat that is in sore need of modification.
By now, any college official who is interested in providing a safe educational environment for aspiring young women knows two things. The first is that sexual assaults are occurring at an alarming rate on American campuses. If you sent your daughter to an American college this year, there is a one in five chance that she will be sexually assaulted during her time there. (By comparison, if you sent your son to war in Iraq, the odds that he would be killed or wounded were about one in fifty.)
The second thing they should know is that one of the most significant predictors of sexual assault in college is the presence of heavy drinking. 80% of campus sexual assaults involve alcohol and, not surprisingly, college environments with less drinking also have far fewer sexual assaults than those with more. Sexual violence is seven times more common at "party schools" such as Syracuse and Penn State than "stone cold sober schools" like BYU and Brooklyn College.
These statistics are not new--studies in the 1990s showed that sexual violence is more common at schools where there is more binge drinking. They are also not surprising--drunkenness and risky sexual venturing have become more readily associated with the college experience than academics. Even eight in ten college students themselves agree that less alcohol would prevent sexual assaults.
Bureaucrats are often accused of paying lip service to problems without finding solutions, and when it comes to campus sexual assault, America's public officials--those charged with giving aspiring young women a safe place to learn--have been in rare form. Thus far, politicians and experts have compiled volumes of statistics and organized no shortage of task forces, yet they continue to treat sexual assault as a conundrum without a solution. In January, 2014, President Obama created a task force to address college sexual assault. Three years later, the task force issued a report that it gushingly dedicated to the courageous sexual assault survivors who were "agents of change." The task force then reported data that proved not much has changed at all. It didn't inspire confidence that change was on the horizon, either: a section shamefully mistitled "Prevention Programs That Work" does not even mention the schools that keep their rates of binge-drinking and sexual assaults low.
The report did warmly embrace colleges who used token awareness programs to counter sexual assaults. Since 1999, the federal government has thrown more than $131 million at such programs. But the programs have treated sexual assaults as if they occur in a vacuum and, not surprisingly, they haven't worked. The Department of Justice reviewed such programs in 2007 and lamented that "despite the link between substance use and sexual assault, it appears that few sexual assault prevention and/or risk reduction programs address the relationship."
Those colleges that have tried to lower alcohol consumption have done so with laughable half-measures that are doomed to failure. Indiana University banned hard liquor at frat parties. Stanford limits the size of alcohol bottles. (If only one could lose weight by eating ice cream in smaller dishes.) Michigan began "self-policing" their fraternities by sending its bravest--and certainly least popular--students to patrol frat houses in bright orange shirts, asking drunk partiers to please get off the roof. Meanwhile, in June, 2016, more than 35 universities began earning fortunes by selling beer at football games. At the University of Maine, my alma mater, there is a pub in the middle of campus, and the school anthem--taught to every student on their first day of college--is a drinking song.
Sadly, America's pesky culture of sensitivity, not ignorance, is giving predators a habitat to thrive in. Fear of being seen as blaming victims and excusing perpetrators has chilled a long-overdue discussion about the connection between alcohol and sexual violence. As one famous victim of sexual assault rebuked her assailant: "Alcohol is not an excuse. Is it a factor? Yes. But alcohol was not the one who stripped me, fingered me, had my head dragging against the ground, with me almost fully naked."
She was right. Drunkenness cannot be a shield for perpetrators or a sword against victims. But such concerns didn't stop Americans from nearly halving alcohol-related traffic deaths by waging an aggressive and multi-faceted campaign against drunk driving. If such a campaign against sexual violence on college campuses is going to happen, it will have to begin with a recognition that America's beloved alcohol is playing a major role.
College officials do a shameful disservice to America's daughters if they let political correctness impede progress. One frustrated writer interviewed experts who confessed that they were reluctant to advise college girls to protect themselves from sexual predators by not drinking. She wrote:
[W]e are failing to let women know that when they render themselves defenseless, terrible things can be done to them. Young women are getting a distorted message that their right to match men drink for drink is a feminist issue. The real feminist message should be that when you lose the ability to be responsible for yourself, you drastically increase the chances that you will attract the kinds of people who, shall we say, don’t have your best interest at heart. That’s not blaming the victim; that’s trying to prevent more victims.Sexual assault is endemic to college campuses not because the problem is concealed, but because a culture of complicity exists. Officials know that binge drinking and sexual assaults go hand-in-hand, yet they remain willfully mired in the brainstorming phase of their response, grasping for more convenient solutions to the problem and refusing to take anything but the most tepid measures to fight their colleges' binge-drinking cultures. But the most effective treatments for America's rape culture would be a healthy does of some good, old-fashioned Victorian prudishness, and a resurrection of progressive temperance.
It is high time that booze gets some bad publicity. Microbreweries and country singers alike have successfully portrayed America's favorite drug as not only acceptable but downright wholesome. A discussion about the connection between alcohol and sexual assault would be a good start at giving alcohol the bad reputation that it has earned.
The colleges that give sexual predators an environment to thrive deserve some bad publicity too. Perhaps we could begin by abolishing the phrase "party schools" from our vocabularies--a euphemism that is more likely to invite than deter young people.
Perhaps "hotbeds of sexual violence" would be a more apt substitute.
_________________________________________________________________
Further Reading and Other Perspectives:
In 2015, CNN aired a documentary about campus rapes called The Hunting Ground which, interestingly, Harvey Weinstein helped fund. One of the main arguments in the documentary is that 8% of college men commit 90% of the sexual assaults, so purging those predators from college campuses is an effective way to address the problem. The documentary has received some criticism for the data underlying its argument and for its portrayal of the accused.
Wednesday, October 11, 2017
Dear Baseball, It's Time to Get Rid of Home Plate Umpires
I'm not a baseball fan, strictly speaking. Don't get me wrong, I love the sport, but I harbor subversive ideas that a real baseball fan, by definition, cannot hold. I am a heretic. Take my view of umpiring, for example. In my opinion, it is high time Major League Baseball turns the job of calling balls and strikes over to the machines.
Logistically, it could be done. Already, home plates have become epicenters of scientific observation; there are probably more gizmos fixed on any given professional diamond than there are observing all of the planets in our solar system. Those gizmos give baseball fans--the most stat-hungry in sports--a trove of data that makes traditional metrics seem as antiquated as baseball cards. Gurus these days are more interested in the speed of a pitcher's fastball, the spin rate of his curve, and the exit velocity of a batter's home run than ERAs and batting averages.
These wonders of modern technology should be revolutionizing how umpiring is done. Instead, they are highlighting how atrocious umpiring is and probably always has been.
A short history: Since 1887, baseball has relied upon the relatively frail eyesight of human beings to determine whether a pitch has passed through an imaginary box known as a "strike zone" (before then, batters could tell pitchers to throw a high, low, or fair pitch). Ever since, players and fans have been complaining. Arguments over balls and strikes and the feeling of being ripped off by a blown call have become as integral to the baseball spectacle as chewing tobacco and facial hair.
Although the job hasn't changed much, being a major league umpire is as difficult as ever. A 92-mph fastball, once considered fast, is in the air for a mere 446 milliseconds. Today's pitchers have shaved precious milliseconds off that figure by hitting 100 mph more frequently than ever. Home plate umpires are expected to judge a ball's position at such speeds with no physical reference whatsoever: just an imaginary strike zone suspended in air.
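For the curious, those flight times fall out of simple arithmetic. Here is a minimal back-of-the-envelope sketch; it assumes the ball covers the full 60.5 feet from the rubber to the plate at constant speed (real pitches are released a few feet closer and slow down in flight), so treat the results as rough approximations rather than official measurements.

```python
# Rough flight-time check, assuming constant speed over the full
# 60 ft 6 in from the pitching rubber to home plate. Real pitches are
# released closer and decelerate, so these are approximations.

MOUND_TO_PLATE_FT = 60.5
FT_PER_SEC_PER_MPH = 5280 / 3600  # 1 mph is roughly 1.467 ft/s

def flight_time_ms(mph: float) -> float:
    """Milliseconds for a pitch at `mph` to cover the mound-to-plate distance."""
    return MOUND_TO_PLATE_FT / (mph * FT_PER_SEC_PER_MPH) * 1000

for speed in (92, 100):
    print(f"{speed} mph: ~{flight_time_ms(speed):.0f} ms")
# A 92-mph pitch arrives in roughly 448 ms; at 100 mph, roughly 413 ms --
# about 35 fewer milliseconds for the umpire to judge it.
```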
Now, Major League Baseball has technology that could instantaneously call balls and strikes with the precision and impartiality that only machines can offer. It seems like a no-brainer. "The sad thing is you have no clue what could be called a ball or a strike at any point," complained former big-leaguer Eric Byrnes. "Why do millions of people sitting at home get to know whether or not it was a ball or strike, yet the poor dude behind home plate is the one who’s left in the dark?"
(Byrnes's comments were reported by the Star-Telegram: http://www.star-telegram.com/sports/spt-columns-blogs/gil-lebreton/article105378146.html)
If you can't relate to the emotional sting that comes from losing a game due to a blown strike call, consider some hard data that prove how bad even the best umpires in the world are at calling pitches. When Yale professor Toby Moskowitz analyzed nearly a million major league pitches, he found that umpires correctly called a ball or strike about 88% of the time. That might not sound bad, but for close pitches--those within two inches of the plate--the success rate dropped to 68.7%. That's not good, especially considering that the umpires would have scored 50% if they were completely guessing.
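To put those percentages in game terms, here is a heavily hedged sketch. The accuracy rates are the cited figures, but the per-game pitch counts are hypothetical round numbers chosen only for illustration, so the output is an order-of-magnitude estimate, not a measured statistic.

```python
# Illustrative only: accuracy rates are the figures cited above; the
# pitch counts are assumed round numbers, not data from the study.

CALLED_PITCHES_PER_GAME = 150   # assumption: takes that need a ball/strike call
CLOSE_FRACTION = 0.25           # assumption: share of takes within ~2 in. of the zone's edge

OVERALL_ACCURACY = 0.88         # cited: correct calls on all pitches
CLOSE_ACCURACY = 0.687          # cited: correct calls on close pitches

missed_calls = CALLED_PITCHES_PER_GAME * (1 - OVERALL_ACCURACY)
close_takes = CALLED_PITCHES_PER_GAME * CLOSE_FRACTION
missed_close_calls = close_takes * (1 - CLOSE_ACCURACY)

print(f"~{missed_calls:.0f} missed ball/strike calls per game overall")
print(f"~{missed_close_calls:.0f} of those misses on close pitches alone")
```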
It gets worse. Not only do umpires regularly miss calls, but they are also susceptible to bias. According to research from Stanford University, umpires tend to adjust the strike zones to avoid sending runners to first base on a walk or to the dugout on a strikeout: their zones expand in three-ball counts and shrink in two-strike counts. Umpires are also reluctant to call two strikes in a row. And try as they might, umps can't quite shake home-field bias: statistics show that home pitchers benefit from slightly more generous strike zones.
So if umpires are judged for both accuracy and consistency, then they are failing on both fronts.
Yet letting machines become umpires doesn't sit right with old-school baseball fans. To them, baseball just wouldn't be baseball without umps to flamboyantly ring 'em up after a called third (alleged) strike. The "human element"--i.e., blown calls and the saliva-swapping arguments that ensue--is baseball's patina.
They have a point. After all, bad umpires have driven the plot of some of baseball's most memorable dramas: think Jackie Robinson magically stealing home through the tag of Yogi Berra in game 1 of the 1955 World Series, or George Brett flailing out of the dugout after Yankees manager Billy Martin complained about the pine tar on his bat and umpires negated his go-ahead home run. And who can forget Jim Joyce wiping tears away as he accepted a lineup card from Armando Galarraga a day after the umpire blew an easy call that cost Galarraga a perfect game?
But baseball should not let infatuation with its history slow its progress. In pitch-tracking technology, the MLB has a near-perfect solution to a century-old problem. If such things were possible, the NBA would leap at the chance to remove the subjectivity from foul-calling. The NFL would pay handsomely to get rid of its five-minute replay delays. For now, basketball and football are stuck with their officiating woes. Baseball is not. Automated pitch calling would give officiating the precision and objectivity of replay review without the snooze-inducing delays.
I'm not saying mechanical pitch-calling won't change the game. It would, and drastically. Pitchers and batters who master the strike zone would become more valuable as their skill is more consistently rewarded. Catchers, freed from the charade of framing pitches, would change their catching stances to hold runners more aggressively. Even the aesthetics would change. Without a home plate umpire crowding the action, baseball games would take on a more backyard feel. That mano a mano medieval joust that is the pitcher-batter face-off would look better unsupervised.
Alas, baseball purists have a love-hate relationship with umpires that they aren't quite ready to quit. As much as they pester umpires through their TV screens, fans have developed a vocabulary for shrugging off their mediocrity. When a player resists the urge to swing at an outside pitch and gets called out on strikes, the blame falls on him for not anticipating a blown call ("that was close enough to swing at with two strikes"; "he's got to protect"). On a bad day, an umpire might still earn the praise of fans if he is "calling it both ways." That phrase sounds like a compliment. In fact, it is just a euphemism for, "well, at least both teams are getting equally ripped off."
"It's a part of the game," is my least favorite argument for tolerating bad umpiring. Before pitch-tracking technology, blown calls were tolerable--fans had no choice but to tolerate them. Now, they are an embarrassment.
I get it. After baseball replaces them, I will probably miss the human element that home plate umpires offer. (Likewise, after scientists cure the common cold, I may feel a certain nostalgia for the taste of cough syrup.) But the sport will fare better when its games hinge more on athletic feats and less on the foibles of its officials.
________________________________________________________________________________
Further Reading:
A summary of HBO's feature, Man vs. Machine, can be seen here. Apparently former umpire Jerry Crawford laughed off any suggestion that things needed to change. Umpires have gotten so good, he claimed, that "they're not missing any pitches."
Here is an article for those interested in the history of pitching rules.
For Another Perspective:
Derek Thompson, writing for The Atlantic, has argued that pitch trackers have made baseball worse by making it harder to score. The 2017 season--which set an all-time record for home runs--may have wrecked his theory.
Tuesday, October 3, 2017
Communists Yesterday, Racists Today: How America Deals with Ideological Pariahs
I'm puzzled by the way that Americans, liberal and conservative, treat their ideological pariahs. We, who claim devotion to the freedom of speech, nevertheless feel a moral obligation to shame, boycott, and ostracize those who hold and express unpopular beliefs.
I don't get it. I consider myself a patriotic American, but I haven't heard one good reason (just a lot of bad reasons) why I should adopt an attitude of scorn or disgust towards any NFL player--or Jehovah's Witness, for that matter--who doesn't stand during the National Anthem. The reasons for expressing--or not expressing--one's patriotism are highly personal. Colin Kaepernick might be an ungrateful punk for all I know, but I don't storm out of the chapel when a fellow churchgoer is noisy during a prayer, and I'm not going to boycott the NFL because some twenty-eight-year-old doofus is irreverent during a flag salute.
For all of our progress on racial and ethnic tolerance, ideological intolerance is a stubborn disease that survives by hiding under the guise of righteous causes.
Take, for example, the war against bigotry, today's most socially acceptable form of ideological intolerance. The national mood is such that one can't be too hard on racists. After the controversial Unite the Right rally in Charlottesville, Virginia, Governor Terry McAuliffe shooed away white supremacists with a phrase that sounded like something that came straight from the mouth of a white supremacist: "There is no place for you here, there is no place for you in America." His motives may have been pure, but there was something unsettling about hearing a high public official use ideology to define Americanness.
Across the country, pockets of hysteria broke out as pictures of rally attendees went viral. When students at the University of Nevada, Reno discovered a racist among them, fretful students packed the Student Senate, demanding that the administration do something to protect them. Some complained that knowing there was a racist on campus made them fear for their lives. Some 32,000 people signed a petition demanding that the school expel the offender.
Ultimately, those pesky First and Fourteenth Amendments (and, hopefully, a bit of human decency) prevented UNR, as a public school, from yielding to the demands of its horror-struck student body.
The law isn't always so forgiving. Recently, otherwise talented individuals have lost their jobs when the public discovered their previously dormant bias. Take Fisher DeBerry, for example, who was the winningest football coach in Air Force Academy history before he said that his team needed to recruit more black players because "Afro American kids can run very well." He was publicly reprimanded and resigned soon after. Or former NPR news analyst Juan Williams, who was canned after confessing that he feels nervous when he sees passengers in Muslim garb on his plane. More recently, James Damore became a martyr of the alt-right when he suggested, as tactfully as one possibly could, that factors other than bias were contributing to Google's gender gap. Feminists decried Damore's memo (which was factually accurate) as sexist, and Google sent Damore packing.
Each of these cases has one thing in common: none of the offenders was accused of actually acting in an intolerant way; they were convicted of bigotry for honestly and candidly expressing unpopular opinions.
America has a long history of ostracizing ideological pariahs. If racism is the 21st century's cardinal sin, communism was the sin of the 20th, when socialists, communists, and radicals of all kinds were deemed un-American, disloyal, and a serious threat to the country.
Persecution of radicals--or suspected radicals--became socially acceptable. In Chicago, for example, a sailor shot a man three times at a pageant when the man refused to stand for the national anthem. The crowd erupted in cheers. "It was believed by the authorities," the Washington Post reported, lest its readers shed any tears for the victim, "he had come here for the [Industrial Workers of the World] convention."
U.S. Attorney General A. Mitchell Palmer compiled a list of suspected radicals 450,000 names long, then set out to rid the country of its reds. Palmer's dragnet operation, which became known as the "Palmer Raids," involved the rounding up of tens of thousands of suspected radicals. Arrestees were often taken in Kafkaesque fashion, without being told the charges against them. Hundreds, without being convicted of a crime, were packed onto a ship dubbed the "Soviet Ark" and sent to Russia. Some arrestees became bait for other suspected radicals; those who visited political prisoners at the Seyms Street Jail in Hartford, Connecticut, were automatically arrested; they were, Palmer reasoned, essentially attending "revolutionary meetings." That, to Palmer, was proof enough of their guilt.
The public hysteria subsided, but it returned with a vengeance during the Cold War. Again, intense fear of communism fueled a nationwide inquisition. President Truman instituted a "Loyalty Program" beginning in 1947 to purge radicals from the federal workforce. Suspected radicals were brought (often on anonymous tips) before a "Loyalty Board," where they were probed about their belief in racial equality, association with labor unions, support for socialized medicine, sexual orientation, and anything else that whiffed of communism.
Though few employees were actually terminated, the mere threat of being dissected by a Loyalty Board had a repressive effect. In 1950 the American Scholar published the results of a survey which found:
The atmosphere in government is one of fear--fear of ideas and of irresponsible and unknown informers. Government employees are afraid to attend meetings of politically minded groups; they are afraid to read "liberal" publications; they screen their friends carefully for "left-wing" ideas. . . . Nobody wants to go through a "loyalty" investigation.
The fear of being pinned as a communist or even a "fellow traveler" spread far beyond the federal government. In Madison, Wisconsin, a journalist asked 112 people to sign a petition that contained nothing but lines from the Declaration of Independence and the Bill of Rights. All but one person refused for fear that they would be caught fixing their names to a subversive document.
Those who spoke out against red-baiting risked being smeared as communists themselves. Drew Pearson, a prominent journalist, criticized Senator McCarthy's communist witch hunt. When McCarthy met him in a nightclub, the senator grabbed Pearson by the neck and kneed him in the groin before Richard Nixon pulled the two apart. Soon after their brawl, McCarthy called Pearson a "sugar-coated voice of Russia" from the floor of the Senate and called on "every loyal American" to boycott Pearson's radio sponsor, the Adam Hat Company, unless it revoked its sponsorship. The company soon obliged.
The most shameful incidents of ideological persecution to come out of the Red Scare involve the thousands of ordinary citizens, far removed from nuclear secrets, who nevertheless found their livelihoods jeopardized by anti-communist witch hunts. The case of Anne Hale is one example. Ms. Hale was a member of the Communist Party during World War II (when the United States and the Soviet Union were allies). After the war, she cut her ties to the party and became a second-grade schoolteacher in Wayland, Massachusetts.
Hale's past came to light in 1948 when an FBI informant caught her buying a Pete Seeger songbook. Agents, who did not think that Hale posed a security threat, told the Wayland School Committee about Hale's past. Hale assured the school committee that she was no longer a communist and that she was a loyal American, but the committee unanimously voted to suspend her without pay.
Hale's case would not have been unusual if she, like thousands of other accused communists, had simply resigned. Instead, she braved the public humiliation of a hearing. "I think it will do less harm to the children," she reasoned, "to see me standing up for what I believe to be true, than to see me run away." Hale admitted during the hearing that she had been a communist, but insisted that she did not believe in the overthrow of the government by force and violence and that she "believed in government by the majority, in the Bill of Rights, and in the protection of the rights of minorities."
Few dared vouch for her. Even her lawyer quit just days before the hearing. Hale's reverend, to his credit, told the press that he would give Hale a character reference. He paid a dear price. Attendance at his parish plummeted; one Sunday, just three parishioners showed up.
After eight grueling days of hearings, the school committee voted to dismiss her. Hale asked that she be allowed to say goodbye to her class. The committee refused. Instead, Hale sent each one of her students a farewell. "Just remember these things," she wrote, "which I am sure you know--I love my country and I love you."
The communist purges of the 20th century and today's purge of racists have at least one thing in common: they are driven by a fear that is grossly disproportionate to the threat posed. That is not to say that there are no hate-mongering racists among us. Just as there were communists during the Cold War who were intent on giving nuclear secrets to the Soviets and who, given the chance, would have joined a violent overthrow of the government, there are firebrand bigots today who would bring back Jim Crow.
But, as was the case during the Red Scares, the war against bigotry is being waged not just against those who act to harm others, but also against those who merely lean to the forbidden side of an ideological spectrum. Such ideological purges have no logical end, because self-righteous persecutors will always find another bad apple to drive from polite society. And thus both ideological scares will probably suffer the same fate: they will fizzle out after enough James Damores and Anne Hales make people realize that the quest for national purity was madness all along.
This is not to say that we should have a blasé attitude toward hate and prejudice, only that the way to suppress an idiotic ideology is more evangelical and less persecutory: a strategy of conversion rather than coercion; an acknowledgment that preserving America as a laboratory of ideas means tolerating those who hold opinions that we loathe.
My fellow Mainers still take pride in the heroic words of Maine Senator Margaret Chase Smith, who bravely rebuked Joe McCarthy and her fellow Republicans from the Senate floor:
Those of us who shout the loudest about Americanism in making character assassination are all too frequently those who, by their own words and acts, ignore some of the basic principles of Americanism: the right to criticize; the right to hold unpopular beliefs; the right to protest; the right of independent thought.
The exercise of these rights should not cost one single American his reputation or his right to a livelihood merely because he happens to know someone who holds unpopular beliefs. Who of us doesn't? Otherwise none of us could call our souls our own. Otherwise thought control would have set in.
America's mission in the 20th century should have been to expose the foolishness of communism and let the radicals exorcise themselves, not to exile the communists from every nook of society. The same may be said of the war against racism. One can only hope that we are faster learners this time around.
__________________________________________________
Further reading:
Much of my material about the Red Scares comes from Haynes Johnson's thoughtful book, The Age of Anxiety: McCarthyism to Terrorism.
You can read the full story of Anne Hale in the Boston Globe.
The Brookings Institution recently published the results of a poll of undergraduate students on freedom of speech issues. The findings were disturbing. Among them, almost twenty percent of respondents thought that it was acceptable to act--including resorting to violence--to suppress expression that they consider offensive.