Cato Op-Eds

Individual Liberty, Free Markets, and Peace

Five successive Secretaries of Defense have asked Congress for permission to reduce excess and unnecessary military bases. The fairest and most transparent way to make such cuts is through another Base Realignment and Closure (BRAC) round. So far, however, the SecDefs’ requests have gone unanswered. For their sake, but mostly for the sake of the men and women serving in our armed forces, I want one, too. All I want for Christmas is a BRAC.

According to the Pentagon’s latest estimates, the military as a whole has 19 percent excess base capacity. To visualize the problem: nearly 1 in every 5 facilities that DoD operates is superfluous to U.S. national security, or performs functions that could be consolidated into other facilities elsewhere. This is important because requiring the military to carry so much overhead necessarily compels the services to divert resources away from more important things – from salaries and benefits for military personnel, to maintenance and upkeep for their equipment, and even to the purchase of new gear.

As Secretary of Defense James Mattis said in congressional testimony earlier this year:

Of all the efficiency measures the Department has undertaken over the years, BRAC is one of the most successful and significant – we forecast that a properly focused base closure effort will generate $2 billion or more annually – enough to buy 300 Apache attack helicopters, 120 F/A-18E/F Super Hornets, or four Virginia-class submarines.

There are two leading arguments against a BRAC, but neither is very convincing. The first envisions a vastly larger military – especially a larger Army – and concludes that a BRAC at this time would be premature because it would deny some hypothetical future military the land and other facilities it needs in order to train and operate effectively.

But BRAC rounds don’t eliminate every square inch of infrastructure not deemed essential in the present day; they merely grant the Pentagon the authority to allocate scarce resources more efficiently and respond to changing circumstances. Each of the past five BRAC rounds has cut an average of about 5 percent of excess capacity. The military will always retain a surplus as a hedge against future contingencies.

What’s more, the latest estimate was constructed around the force structure from 2012, when the U.S. military was engaged in major operations in Iraq and Afghanistan. Given other pressures on the defense budget, and federal spending in general, it seems highly unlikely that the military will grow back to 2012 levels. But in the extreme scenario in which the military’s needs are dramatically greater than at any time in the recent past, I’m confident that the federal government could obtain what it needs. After all, the U.S. military was tiny for most of our history, and yet we somehow managed to find new locations for bases when they were truly needed for the nation’s security (e.g., World War II).   

The second argument against BRAC has less to do with the military’s requirements, and is more about the impact of base closures on local communities. For the Pentagon, BRAC is like a shiny package wrapped with a bow under the Christmas tree. For locals, BRAC is a lump of coal in the stocking.

Except that we shouldn’t look at BRAC in this way. To be sure, base closures are disruptive to communities that have grown dependent upon the economic activity that a base generates. A few places have struggled to adapt after their local base closed and the troops moved away. But the actual experiences of defense communities reveal a more complex, and ultimately more optimistic, reality. Most communities are able to find more productive uses for properties previously trapped behind fences and barbed wire. Most are able to attract new businesses, from a diverse array of industries. Some have taken pride in granting the public access to newly open space. The array of uses for former bases is practically limitless (see, for example, Atlanta, Georgia; Austin, Texas; Brunswick, Maine; Glenview, Illinois; and Philadelphia, Pennsylvania). A future BRAC round could be less disruptive than past ones if affected communities plan well, take account of lessons learned elsewhere, and apply some best practices to ease the transition.

As Secretary Mattis practically pleaded in a cover letter to the most recent report:

every unnecessary facility we maintain requires us to cut capabilities elsewhere. I must be able to eliminate excess infrastructure in order to shift resources to readiness and modernization.

If Congress doesn’t grant his wish, perhaps Secretary Mattis will climb onto Santa Claus’s lap, and whisper his desires into the jolly old elf’s ear – but I hope, for both men’s sake, it doesn’t come to that.

The U.S. Department of Justice, November 17 [press release/memo]:

Today, in an action to further uphold the rule of law in the executive branch, Attorney General Jeff Sessions issued a memo prohibiting the Department of Justice from issuing guidance documents that have the effect of adopting new regulatory requirements or amending the law. The memo prevents the Department of Justice from evading required rulemaking processes by using guidance memos to create de facto regulations.

In the past, the Department of Justice and other agencies have blurred the distinction between regulations and guidance documents. Under the Attorney General’s memo, the Department may no longer issue guidance documents that purport to create rights or obligations binding on persons or entities outside the Executive Branch….

“Guidance documents can be used to explain existing law,” Associate Attorney General Brand said. “But they should not be used to change the law or to impose new standards to determine compliance with the law. The notice-and-comment process that is ordinarily required for rulemaking can be cumbersome and slow, but it has the benefit of availing agencies of more complete information about a proposed rule’s effects than the agency could ascertain on its own. This Department of Justice will not use guidance documents to circumvent the rulemaking process, and we will proactively work to rescind existing guidance documents that go too far.”

This is an initiative of potentially great significance. For many decades, critics have noted that agencies were using Dear Colleague and guidance letters, memos and so forth — also known variously as subregulatory guidance, stealth regulation and regulatory dark matter — to grab new powers and ban new things in the guise of interpreting existing law, all while bypassing notice-and-comment and other constraints on actual rulemaking. To be sure, many judgment calls and hard questions of classification do arise as to when an announced position occupies new territory as opposed to simply stating in good faith what current law is believed to be. But the full text of the memo shows a creditable awareness of these issues. Note also, even before the Justice memo, Education Secretary Betsy DeVos’s statement in September, on revoking the Obama Title IX Dear Colleague letter: “The era of ‘rule by letter’ is over.”

Another notable pledge in the DoJ press release:

The Attorney General’s Regulatory Reform Task Force, led by Associate Attorney General Brand, will conduct a review of existing Department documents and will recommend candidates for repeal or modification in the light of this memo’s principles.

Note also this recent flap over certain financial regulations and the possibility that they may have been issued without notice to Congress, which could preserve Congress’s right to examine and block them under the terms of the Congressional Review Act. [cross-posted from Overlawyered; earlier in this space on the era of “rule by letter” at the Education Department]

Yesterday, Bangladesh-born Akayed Ullah attempted a suicide bombing in New York City. Fortunately, he injured only a few people, most severely burning his own torso. Ullah entered the United States on an F4 green card, the category for the brothers and sisters of U.S. citizens.

Some are using Ullah’s failed terrorist attack to call for further restricting family-based immigration and the green card lottery.  After hearing about the failed terrorist attack, President Trump argued that “Today’s terror suspect entered our country through extended-family chain migration, which is incompatible with national security … Congress must end chain migration.”  Rep. Bob Goodlatte (R-VA), Chairman of the House Judiciary Committee, also argued for ending chain immigration and the visa lottery program.  He said ending those green card programs “would make us safer.”

Neither President Trump nor Rep. Goodlatte indicated how much safer ending chain immigration or the diversity visa would make us.  Since September 2016, I have been updating information on the number of people killed in a terrorist attack on U.S. soil by foreign-born terrorists according to the visa they initially used to enter the United States.

From 1975 through December 11, 2017, foreign-born terrorists who entered on a green card murdered 16 people in terrorist attacks on U.S. soil. Assuming all of those green cards were issued in the family reunification categories or through the diversity visa lottery, the chance of being murdered in a terrorist attack committed by a chain immigrant or a diversity visa recipient was about 1 in 723 million per year. The chance of being murdered in a non-terrorist homicide during that time was about 1 in 14,394 per year. In other words, your annual chance of being murdered in a normal homicide is about 50,220 times greater than your chance of being killed in a terrorist attack by a chain immigrant or an immigrant who entered through the diversity visa lottery.

At least six of those victims were Argentinians here on a tourist visa, leaving 10 Americans as victims. If we take American nationalists at their word and only consider the harm of foreign-born terrorist attacks on U.S. citizens, then we would have to exclude those six victims of terrorist attacks by chain immigrants and diversity visa winners. This crazy nationalist calculus means that the annual chance of an American or a resident of the United States being murdered in a terrorist attack on U.S. soil committed by a chain immigrant or diversity visa winner is about 1 in 1.2 billion per year. Your annual chance of being murdered in a normal homicide is about 80,352 times greater than your chance of being killed in a terrorist attack by a chain immigrant or an immigrant who entered through the diversity visa lottery.

Of the 3,037 people murdered in terrorist attacks on U.S. soil during that time, about 98 percent perished in the 9/11 attacks.  Foreign-born terrorists who entered on tourist and student visas account for 99 percent of all murders committed by foreign-born terrorists on U.S. soil since 1975.  The annual chance of being murdered by any foreign-born terrorist during that time is about 1 in 3.8 million per year.
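
For readers who want to check this arithmetic, a minimal sketch of the person-years calculation is below. The roughly 270 million average U.S. population over the period is my assumption for illustration; the underlying population figures behind the post may differ slightly.

# A back-of-the-envelope check on the annualized odds cited above,
# using a person-years calculation. The ~270 million average U.S.
# population over 1975-2017 is an assumption chosen for illustration.

YEARS = 43                    # 1975 through December 11, 2017
AVG_POPULATION = 270_000_000  # assumed average U.S. population
PERSON_YEARS = YEARS * AVG_POPULATION

def one_in(deaths: int) -> float:
    """Annual risk expressed as '1 in N'."""
    return PERSON_YEARS / deaths

print(f"Chain/diversity immigrants (16 deaths): 1 in {one_in(16):,.0f}")
print(f"U.S.-citizen victims only (10 deaths):  1 in {one_in(10):,.0f}")
print(f"All foreign-born terrorists (3,037):    1 in {one_in(3037):,.0f}")
# -> roughly 1 in 726 million, 1 in 1.2 billion, and 1 in 3.8 million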

The 1 in 723 million chance a year of being murdered by a foreign-born terrorist who came in as a chain immigrant or on a diversity visa is the greatest possible risk, as I assume that all terrorists who entered on green cards did so through one of those two paths.  These numbers could change in the future and perhaps chain immigrants or those who entered on the diversity visa will become especially dangerous at some future date. That, however, is thin support for Trump’s policy proposal to remove about 400,000 green cards annually.  Whatever other merits could accrue from reforming chain immigration or the diversity visa, security is not a serious one as the danger is already so low.

During his campaign, President Trump promised to ban all Muslims outright until he could figure out “what is going on.” He later explained that this idea had developed into several policies that would have the same effect. Since his inauguration, Trump has begun to implement them—they include slashing the refugee program, banning all immigration and travelers from several majority Muslim countries, and imposing new burdens on all visa applicants as part of “extreme vetting” initiatives. So far, these policies appear to have “worked,” strongly reducing Muslim immigration and travel to the United States.

Muslim refugee admissions have fallen dramatically over the past year. According to figures from the State Department, Muslim refugee flows fell 94 percent from January to November 2017 (the last full month of available data). In calendar year 2016, the United States admitted almost 45,000 Muslim refugees, compared to a little more than 11,000 in 2017—fully half of whom entered in January and February. Of course, the administration has cut refugee flows generally, but the Muslim share of all refugees has dropped substantially too—from 50 percent in January to less than 10 percent in November.

Figure 1
Monthly Muslim Refugee Admissions for Each Month of 2017 and Average for 2016

Source: U.S. Department of State. *Monthly average; **Through December 11, 2017

This year’s drop is even more substantial when compared with the trend. In only one year over the last decade has the number of Muslim refugee admissions fallen, and Muslim admissions have increased on average 18 percent annually from 2007 to 2016.

As for foreign travelers and immigrants seeking to live permanently in the United States, the State Department does not ask on its visa application form about their religious affiliation (thankfully). But based on the number of visas issued to nationals of the nearly 50 majority Muslim countries, it certainly appears that the Trump administration policies have affected them as well.

America issues two types of visas: “immigrant” visas for permanent residents and “nonimmigrant” visas for temporary residents—mainly tourists, guest workers, and students. For majority Muslim countries, average monthly permanent visa issuances during the period of March to October 2017 (the only months available so far) dropped 13 percent from the FY 2016 monthly average. Average monthly visa issuances for temporary residents from majority Muslim countries have dropped 21 percent from the FY 2016 monthly average.
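
The method here is just a comparison of monthly averages. A minimal sketch, with placeholder counts rather than actual State Department data:

# A sketch of the comparison method described above: average monthly
# issuances in FY 2016 vs. March-October 2017, then the percent change.
# The counts below are made-up placeholders, not State Department data.

def pct_change(old: float, new: float) -> float:
    return (new - old) / old * 100.0

fy2016_total, fy2016_months = 120_000, 12        # hypothetical
mar_oct_2017_total, mar_oct_months = 69_600, 8   # hypothetical

avg_2016 = fy2016_total / fy2016_months          # 10,000 per month
avg_2017 = mar_oct_2017_total / mar_oct_months   # 8,700 per month
print(f"{pct_change(avg_2016, avg_2017):+.0f}%") # -13% with these numbers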

Figure 2
Average Monthly Visa Issuances—Permanent and Temporary—2016 and 2017

Sources: U.S. Department of State, “Monthly Nonimmigrant Visa Issuance Statistics”; “Monthly Immigrant Visa Issuance Statistics”; “Nonimmigrant Visas Issued by Nationality”; “Immigrant Visas Issued”

During the last decade, majority Muslim countries have never—even during the recession—seen temporary visa issuances fall by more than 1 percent in a single year and immigrant visas never more than 7 percent. From 2007 to 2016, temporary visa approvals for nationals of these countries actually grew 8 percent annually and permanent visas 9 percent annually. Again, compared to the expected increases, the declines are even more remarkable.

Immigration and travel from all countries have also declined this year, but the declines for Muslim majority countries were larger. They saw their share of all immigrant visa issuances fall by 3 percent and their share of temporary visa approvals fall by 15 percent.

The visa declines disproportionately affected certain countries. In particular, they impacted the eight majority Muslim countries that President Trump has singled out in his three “travel ban” executive orders—Chad, Iran, Iraq, Libya, Syria, Somalia, Sudan, and Yemen. (Iraq and Sudan are technically now off the list, though Iraqis are supposedly subject to higher scrutiny. Chad was added in September.)

All eight countries received fewer visa approvals: collectively, their monthly average immigrant visa issuances fell 36 percent, while temporary visas fell 42 percent. These declines occurred despite court orders that barred full implementation of the ban until this month.

Figure 3
Average Monthly Visa Issuances—Permanent and Temporary—2016 and 2017 for Eight “Travel Ban” Countries

Sources: U.S. Department of State (See Figure 2)

The decline in Muslim refugee admissions is almost entirely a consequence of policy. The administration selects the number and types of refugees that it wants. President Trump promised to “prioritize” Christian refugees, and he has done so, not by increasing their numbers—the number of Christian refugees has declined as well—but by decreasing Muslim admissions.

Policy is at least partially culpable for fewer visa approvals. Almost 80 percent of the drop in immigrant visas came from the eight targeted countries, but these countries explain only 14 percent of the drop in temporary visas.

The Trump administration has rolled out other policies designed to target Muslim extremists, including more complicated and lengthy immigration forms and requirements to supply more evidence to support certain claims, such as past addresses and jobs. These could be increasing the costs associated with an application, forcing immigrants to hire attorneys, or simply delaying their applications. Accounts of mysterious visa denials for Muslim applicants have surfaced as well.

Undoubtedly, some Muslim travelers are also afraid to travel to the United States right now—stories of lengthy detentions and other mistreatment at the border for Muslims could dissuade Muslims from even applying. Regardless, President Trump is certainly fulfilling a major campaign promise: few Muslims are entering the United States. One can only hope he will figure out “what is going on” soon.

Pundits of every political persuasion decry corporate lobbying in Washington, and a major tax bill is a great opportunity for businesses to gain benefits if they convince members of Congress to help them out. However, battles over tax provisions are sometimes not what they appear on the surface.

For years, liberal pundits have characterized efforts to repeal the estate, or death, tax as plutocrats pulling the levers of power on the Republican side of the aisle. But a new investigative piece at the Daily Caller by Richard Pollock exposes the lobbying that is undermining good policy on estate taxation.

I favor estate tax repeal, for numerous reasons, as I laid out here. One reason is the large waste of resources spent on paperwork and avoidance. I noted:

The estate tax is probably the most inefficient tax in America. It has a high marginal rate and is very difficult for the government to administer and enforce. It has also created a large and wasteful estate planning and avoidance industry. The industry overflows with high-paid lawyers and accountants doing paperwork, litigation, asset appraisals, and creating financial structures to minimize the tax burden using trusts, life insurance, and private foundations.

Pollock explored lobbying by the life insurance industry to retain the estate tax, and the large revenues the industry earns on estate planning and avoidance services:

The life insurance industry has handsomely profited from the estate tax for years through the sale of “survivorship,” or second-to-die life insurance policies that generate billions of dollars in sales. The insurance industry provides these products to cover the estimated estate tax the policyholders’ children or heirs would have to pay upon their death. The policies are a more affordable way to pay the tax to the federal government.

For example, if a husband or wife estimates their heirs could face $1 million in estate taxes, they could buy a life insurance policy that pays out $1 million upon their death. That sum is free of income tax. The costs for the $1 million whole or universal survivorship policy could cost them pennies on the dollar, making the protection affordable.

“If properly arranged, a survivorship life policy will be tax free to the beneficiary, no estate tax and no income tax,” one organization boasts on its website, adding, “If, for example, you only pay over time $200,000 of premiums into a $1,000,000 policy, you’ve effectively paid $1,000,000 of estate tax for $200,000! Twenty cents on the dollar!”

The life insurance industry has been tight-lipped about how much money they make from these policies. Survivorship policy “represents approximately four percent of the life insurance market and 10 percent of premium for companies who offer it annually,” according to a June 13, 2017 report by the Insurance News Network. That amount would deliver as much as $24 to $30 billion in annual profits to the industry based on premium data from the Insurance Information Institute.

… As talk of full repeal of the death tax echoed through the walls of Congress, “panic” gripped the life insurance industry, its estate planners and insurance agents, according to industry insiders. “All estate planning has almost come to a halt over the last six months due to the possibility of significant changes to the estate tax laws, and in particular, the possibility there could be repeal,” said retired estate planning lawyer Steve Hornig, in an interview with TheDCNF. Hornig opposes the estate tax.

“I would classify it as a panic in the industry,” added Ted Bernstein, who is a retirement-planning and life-insurance specialist in Florida and who supports the estate tax. “Survivorship insurance will go away completely if the legislation passes as expected.” “Permanent insurance policies,” he added, “are a very significant percentage of the life insurance sales of the leading life insurance companies in the U.S.”

… Between 2015 and 2016, lobbying expenditures by [the] American Council of Life Insurers were estimated at $9.4 million, according to the nonpartisan Center for Responsive Politics. The insurance council has 30 full-time paid lobbyists …
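
To make the quoted “pennies on the dollar” arithmetic concrete, here is a minimal sketch using the hypothetical figures from the excerpt:

# The "pennies on the dollar" arithmetic from the quoted example:
# cumulative premiums paid vs. the tax-free death benefit earmarked
# for the heirs' estimated estate tax bill. Hypothetical figures.

estimated_estate_tax = 1_000_000  # expected estate tax owed by heirs
total_premiums_paid = 200_000     # premiums paid into the policy over time

cost_per_dollar = total_premiums_paid / estimated_estate_tax
print(f"Effective cost per dollar of estate tax covered: ${cost_per_dollar:.2f}")
# -> $0.20, i.e., "twenty cents on the dollar"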

In Congress, the House tax bill included estate tax repeal, while the Senate expanded the exemption. We will see in the coming week or two whether lawmakers will buck the life insurance lobbyists and repeal this inefficient tax.

In March 1990, NASA’s Roy Spencer and University of Alabama-Huntsville’s (UAH) John Christy dropped quite a bomb when they published the first record of lower atmospheric temperatures sensed by satellites’ microwave sounding units (MSUs). While they only had ten years of data, it was crystal clear there was no significant warming trend.

It was subsequently discovered by Frank Wentz of Remote Sensing Systems (RSS), a Santa Rosa (CA) consultancy, that the orbits of the sensing satellites gradually decay (i.e., become lower), and this results in a slight but spurious cooling trend. Using a record ending in 1995, Wentz showed a slight warming trend of 0.07°C/decade, about half of what was being observed by surface thermometers.

In 1994, Christy and another UAH scientist, Richard McNider, attempted to remove “natural” climate change from the satellite data by backing out El Niño/La Niña fluctuations and the cooling associated with two big volcanoes in 1983 and 1991. They arrived at a warming trend of 0.09°C/decade after their removal.

Over the years, Spencer and Christy repeatedly made slight revisions to their record, and its latest iteration shows a warming trend of 0.13°C/decade, which includes natural variability. But it is noteworthy that this figure is biased upward by very warm readings near the end of the record, thanks to the 2015–16 El Niño.

Recently, Christy and McNider carried out an analysis similar to their 1994 one and found that removing the volcanoes and natural sea surface temperature changes resulted in a warming trend nominally the same as their 1994 finding, at 0.10°C/decade—far, far beneath the 0.2–0.3°C/decade predicted for the current era by the models in the latest (2013) report of the UN’s Intergovernmental Panel on Climate Change.
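
For readers curious how such decadal trends are computed, here is a minimal sketch of the standard approach: a least-squares slope over a monthly anomaly series, scaled to degrees per decade. The data below are synthetic, not the UAH record:

# A minimal sketch of how a decadal trend like 0.10 C/decade is
# extracted from a monthly anomaly series: an ordinary least-squares
# slope, scaled from per-month to per-decade. The series is synthetic.

import numpy as np

rng = np.random.default_rng(0)
months = np.arange(480)  # 40 years of monthly anomalies
anomalies = (0.10 / 120) * months + rng.normal(0, 0.15, months.size)

slope_per_month = np.polyfit(months, anomalies, 1)[0]
print(f"Trend: {slope_per_month * 120:.2f} C/decade")  # ~0.10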

Much as Christy and McNider said in 1994, it appears that the sensitivity of temperature to carbon dioxide changes in those models is just too high.

Here’s the illustration at the heart of the paper:

Because the print is so small in the figure legend, we’ll paraphrase it here. The top plot (red) is the temperature of the lower troposphere (“TLT”), from the surface to about eight kilometers in altitude. The blue plot is the “natural” sea surface temperature (SST) component, now a combination of El Niño and other known oscillations, such as the Pacific Decadal Oscillation (PDO) and the Atlantic Multidecadal Oscillation (AMO). The middle black plot is the raw satellite data minus the oceanic oscillations, and the bottom one adjusts that for the two big volcanoes in 1983 and 1991.

The new Christy and McNider paper also calculates the “transient sensitivity” of temperature to increasing carbon dioxide. The transient sensitivity is the temperature change observed at the time that atmospheric carbon dioxide doubles from its preindustrial background. Given observed rates of increase, this should occur sometime around 2070. The sensitivity works out to 1.1°C, which is slightly below half of the average transient sensitivity of all the climate models in the latest (2013) report of the UN’s Intergovernmental Panel on Climate Change.
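
A minimal sketch of the logarithmic relationship implied by a transient sensitivity is below. The 1.1°C value is the paper’s estimate; the 2.3°C value is my rough stand-in for the model average, which the post implies is slightly above twice the paper’s figure:

# The logarithmic CO2-warming relationship behind a "transient
# sensitivity": warming = S * log2(CO2 / CO2_preindustrial).
# S = 1.1 is the paper's estimate; S = 2.3 is an assumed stand-in
# for the model average described above.

import math

def transient_warming(s: float, co2_ppm: float, base_ppm: float = 280.0) -> float:
    return s * math.log2(co2_ppm / base_ppm)

for s in (1.1, 2.3):
    print(f"S = {s}: warming at doubling (560 ppm) = {transient_warming(s, 560.0):.1f} C")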

This is another indication that if business-as-usual continues, including a continued transition from coal to natural gas for electrical generation, the world will easily meet the Paris Accord target of total anthropogenerated warming of less than 2.0°C by the year 2100.

Note that this is based on the satellite-sensed lower atmospheric temperatures. Our next post will compare them to the reanalysis data described in our last Global Science Report.

Thirteen law professors have written about how the GOP tax plan will provide incentives for tax planning and behavioral changes that might undermine current revenue estimates.

The document, “The Games They Will Play: Tax Games, Roadblocks, and Glitches,” is clearly written by professors skeptical of the overall tax package. But it provides useful examples of potential problems, such as ways new passthrough provisions could lead to complex battle lines between tax authorities and taxpayers.

It also assesses how eliminating the deduction for state and local income and sales taxes (SALT) from the federal income tax code might encourage changes to state tax systems. Remember, both the House and Senate bills would retain only a deduction of up to $10,000 for property taxes.

The restriction of the SALT deduction will, ceteris paribus, raise the cost of state and local government expenditures for affected taxpayers, particularly in higher-income, higher-tax jurisdictions. That might be expected to put pressure on states to reduce spending—a feature of this reform, rather than a bug.

But according to the professors, states may also seek to “reshape their tax systems so as to respond to this change and retain the benefit of the deduction for their taxpayers.” One means is to shift towards collecting more revenue from deductible taxes.

How might they do so?

One way for states to achieve this is by shifting to use of the property tax. The liquidity impact on taxpayers of a shift to property taxes can be mitigated by circuit breakers administered through a state’s income tax—essentially, reducing income tax liability in exchange for higher property taxes. Such responses would effectively allow taxpayers to deduct the full amount of state and local property and income taxes, up to the $10,000 cap.
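
To see how the quoted workaround operates, consider a minimal sketch. The $10,000 cap is the one retained in both bills; the dollar amounts are hypothetical:

# Sketch of the "circuit breaker" workaround described in the quote:
# the state raises property tax and grants an offsetting income-tax
# credit, moving liability into the federally deductible (capped)
# property-tax bucket. All dollar amounts are hypothetical.

CAP = 10_000  # federal cap on the property-tax deduction

def federal_deduction(property_tax: float) -> float:
    # Income and sales taxes are no longer deductible in this scenario,
    # so only property tax counts, up to the cap.
    return min(property_tax, CAP)

# Before the shift: $7,000 state income tax + $4,000 property tax.
print(federal_deduction(4_000))   # 4000: only the property tax is deductible

# After the shift: the state moves $6,000 of income tax into property
# tax and grants a $6,000 income-tax credit, leaving the taxpayer's
# total state bill unchanged.
print(federal_deduction(10_000))  # 10000: the full capped amount is deductible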

Now, the professors paint this as a bad thing, because they believe it means federal revenue losses from tax reform will be greater than currently projected. From an economic perspective, though, property taxes are broadly regarded as less distortionary to economic decisions than income taxes. They also tend to encourage localism, and given that they are widely disliked (particularly by elderly constituencies who turn out in elections), they may be even more effective at bringing attention to the scale of state government spending than an increased income tax burden.

Whilst it would be better to have no SALT property deduction at all, if a consequence of tax reform is states using property taxes instead of income taxes, then that could be a good thing for the economy.

News stories are portraying the Republican tax bills as favoring the rich, even though the opposite is true. The GOP cuts would make the tax code more progressive, and the largest percentage cuts would go to middle-income households.

The Washington Post pushed another faulty narrative yesterday. The three layers of headlines on the hardcopy front page were, “Trump’s tax vow taking a U-turn—focus shifted away from middle class—GOP plan evolved into a windfall for the wealthy.” The story’s theme was that Trump originally promised middle-class tax cuts, but House and Senate tax bills have morphed into an orgy of tax cuts for corporations and rich people.

Ridiculous. Business tax cuts have been central to Trump’s message since 2015. He proposed slashing business tax rates to 15 percent and the top individual rate to 25 percent. House Republicans proposed in 2016 to cut the corporate rate to 20 percent and the top individual rate to 33 percent. Trump and House Republicans were elected in 2016 promising large business tax cuts and across-the-board individual rate cuts.

Rather than Trump and Republicans “shifting away” from middle-class cuts toward cuts for businesses and the wealthy as the Post claims, it is the opposite. Current House and Senate tax bills have sadly shifted away from pro-growth reforms toward redistribution from higher earners to lower earners.

Rather than a “windfall for the wealthy” as the Post claims, the GOP bills would provide larger percentage cuts for middle earners than for higher earners (see here and here). The GOP may not end up cutting the top individual tax rate at all. Much of the cuts attributed to high earners are corporate tax cuts allocated to them, but economists disagree about who would actually benefit from those cuts.

Furthermore, the GOP tax bills would increase spending subsidies (refundable credits) for people at the bottom who do not pay any individual income taxes. Look at this TPC analysis of the Senate bill. It shows the bottom two quintiles receiving tax “cuts” in 2019 and 2025, yet those groups do not currently pay any income taxes on net.

The Post complains, “the legislation would lower taxes for many in the middle class, but mostly temporarily.” That is true, but virtually all the individual provisions in the Senate bill are temporary, not just the ones for the middle class. The corporate tax rate cut would be permanent, but this JCT analysis shows that in 2027 much of the revenue loss from that cut would be offset by corporate tax increases. Not only that, the Tax Foundation found that the corporate rate cut would nearly pay for itself by 2027 as corporate investment expanded and tax avoidance fell.

The Post presents TPC data showing tax-cut shares for each income group but provides no context. The following chart shows the TPC data in context. First, note the enormous share of federal taxes paid by the top quintile under current law. The chart includes all federal taxes—income, payroll, estate, and excise for 2019.

Now observe that the top quintile would receive a smaller share of the Senate tax cut than their tax share under current law. For the three middle quintiles, it is the opposite.

That means that the Senate tax cut would make the federal tax code more progressive. If the Senate tax cut is enacted, higher earners would pay a larger share of the overall federal tax burden. That moves in the wrong direction because our tax code is already far too progressive.
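
The arithmetic behind that claim is worth spelling out. If a group pays taxes $T_g$ out of a total $T$ and receives $C_g$ of a total cut $C$, its post-cut share of the burden rises exactly when its share of the cut falls short of its share of taxes:

\[
\frac{T_g - C_g}{T - C} > \frac{T_g}{T}
\iff T\,(T_g - C_g) > T_g\,(T - C)
\iff \frac{C_g}{C} < \frac{T_g}{T}.
\]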

Trump and the Republicans did take a “U-turn.” They started down the pro-growth highway but veered off course into the redistributionist side roads. The conference committee would put the tax reform engine in reverse if it bumps up the corporate tax rate from 20 percent and makes other anti-growth changes.

Data behind the chart is here.

2017 has been a year of massive expansion for the Global War on Terror, but you could be forgiven for not noticing. In addition to the media focus on the ongoing chaos in the Trump White House, the Pentagon has consistently avoided disclosing where America’s armed forces are fighting, and whom they are fighting, until forced to do so.

Take Syria, where the Pentagon long claimed that there were only 500 boots on the ground, even though anecdotal accounts suggested a much higher total. When Maj. Gen. James Jarrard accidentally admitted to reporters at a press conference in October that the number was closer to 4,000, his statement was quickly walked back. Finally, last week, the Pentagon officially acknowledged that there are in fact 2,000 troops on the ground in Syria, and pledged that they will stay there ‘indefinitely.’

Even when we do know how many troops are stationed abroad, we often don’t know what they’re doing. Look at Niger, where a firefight in October left four soldiers dead. Prior to this news—and to the President’s disturbing decision to publicly feud with the widow of one of the soldiers—most Americans had no idea that troops deployed to Africa on so-called ‘train-and-equip’ missions were engaged in active combat.

Yet U.S. troops are currently engaged in counterterrorism and support missions in Somalia, Chad, Nigeria, and elsewhere, deployments which have never been debated by Congress and are authorized only under a patchwork of shaky, existing authorities.

Even in the Middle East, deployments have been increasing substantially under the Trump administration, with the number of troops and civilian support staff in the region increasing by almost 30% during the summer of 2017 alone. These dramatic increases were noted in the Pentagon’s quarterly personnel report, but no effort was made to draw public attention to them.

The fundamental problem is simple. With only limited knowledge of where American troops are, and what they are doing there, we cannot even have a coherent public discussion about the scope of U.S. military intervention around the globe. We should be discussing the increase in U.S. military actions in Africa or the growth in U.S. combat troops in the Middle East, but that discussion is effectively impossible—even for the relevant congressional committees—with so little information.

So if I could ask for one change to U.S. foreign policy for Christmas, I’d like to know where American troops are and what they’re doing there. It’s past time for a little more transparency, from the Trump administration, and from the Pentagon. 

A Wall Street Journal op-ed last week by liberal billionaire Tom Steyer complained that the proposed Republican tax cut “overwhelmingly helps the wealthy.” He said that the American people will be furious “if they see a bill passed that hands out filet mignon to the wealthy while leaving them struggling over scraps.”

Steyer’s op-ed had more rhetoric than data, but he did cite a Tax Policy Center (TPC) analysis of the Senate bill. So let’s look at the TPC data. The table below summarizes the Senate tax cuts for 2019 and compares them to current-law taxes.

Looking at the block on the right, TPC finds that 62.2 percent of the tax cuts would go to the highest quintile, or fifth of U.S. households, and 15.3 percent would go to the top 1 percent. Just 13.5 percent of the cuts would go to the middle quintile. Does that mean filet mignon for the top and scraps for the middle?

No, it does not. We need context. We need to know how much tax those groups are currently paying, but TPC does not show that in its analysis of the Senate plan. You have to dig through TPC’s website to find it here. TPC’s estimates of current law taxes for 2019 are below in the block on the left. “All Federal Taxes” includes the taxes shown plus payroll and excise taxes.

Without any tax cut, the top quintile will pay 67.0 percent of all federal taxes in 2019, and the top 1 percent will pay 26.7 percent. Since the tax cut shares for those groups are less than that, the cuts will make federal taxation more progressive. If the Senate bill were passed, the top quintile of higher earners would pay an even larger share of the overall federal tax burden. That would undercut the growth potential of tax reform and make our excessively progressive tax code even more so.

What about the middle quintile? TPC estimates that under current law the group will pay 5.4 percent of individual income taxes, 8.6 percent of corporate taxes, and 10.0 percent of all federal taxes in 2019. Yet this group would receive 13.5 percent of the Senate tax cuts. Thus, middle earners would gain a disproportionately large share of the tax cuts under the Senate plan.
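
The same share arithmetic can be checked directly with the TPC percentages quoted above; a group whose share of the cut is smaller than its share of current taxes ends up carrying a larger share of the remaining burden:

# The share arithmetic from the discussion above, using the TPC
# percentages quoted in this post. If a group's share of the cut is
# below its share of current taxes, its post-cut burden share rises.

groups = {
    "Top quintile":    {"tax_share": 67.0, "cut_share": 62.2},
    "Top 1 percent":   {"tax_share": 26.7, "cut_share": 15.3},
    "Middle quintile": {"tax_share": 10.0, "cut_share": 13.5},
}

for name, g in groups.items():
    trend = "rises" if g["cut_share"] < g["tax_share"] else "falls"
    print(f"{name}: {g['tax_share']}% of taxes, {g['cut_share']}% of the cut "
          f"-> post-cut burden share {trend}")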

So which group is dining on filet mignon? It is the overgrown federal government because—with or without a tax cut—spending is projected to soar in coming years. Federal spending is one-fifth of gross domestic product and rising, and unfortunately that quintile receives solid bipartisan support.

Data Notes 

Citing TPC, Steyer says, “Sixty-two percent of the benefits from the Senate bill’s tax cuts flow to the top 1% of earners.” Bernie Sanders used that statistic on TV yesterday.

That figure is for 2027 when nearly all the individual tax changes are scheduled to have expired in the Senate bill, so it is kind of meaningless. For one thing, it is 62 percent of a small overall revenue loss number. TPC finds that the 2027 tax cuts would reduce revenues by 0.2 percent of income, or just one-sixth the amount that revenues would be reduced in 2019.

The corporate tax rate cut would be the main cut in place in 2027, and TPC assumes that higher earners would receive most of those benefits. But other economists dispute that view, arguing that corporate tax cuts would benefit workers across the income spectrum, as discussed by the CEA.

Finally, most of the estimated static revenue losses from the corporate rate cut in 2027 would be offset by corporate tax increases that year, as shown in this Joint Tax Committee report.

The United States’ immigration system favors family reunification, even in the so-called employment-based categories. The family members of immigrant workers must use employment-based green cards, despite the text of the actual statute and other evidence that strongly suggest this was not Congress’ intent. Instead of a separate green card category for spouses and children, they get a green card that would otherwise go to a worker.

In 2015, 56 percent of all supposed employment-based green cards went to the family members of workers (Chart 1).  The other 44 percent went to the workers themselves.  Some of those family members are workers, but they should have a separate green card category or be exempted from the employment green card quota altogether. 

Chart 1

Employment-Based Green Cards by Recipient Types

Source: 2015 Yearbook of Immigration Statistics; author’s calculations

If family members were exempted from the quota or there was a separate green card category for them, an additional 76,711 highly skilled immigrant workers could have earned a green card in 2015 without increasing the quota.
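
That figure follows from simple quota arithmetic. A minimal sketch, assuming the statutory 140,000 employment-based cap (actual 2015 issuances differ somewhat, so the result is only approximate):

# The quota arithmetic behind the 76,711 figure: if dependents consume
# a given share of the employment-based quota, exempting them frees
# that many green cards for workers with no increase in the cap.
# The 140,000 total is the statutory cap; actual 2015 issuances
# differ somewhat, so this is an approximation.

quota = 140_000      # annual employment-based green card cap
family_share = 0.56  # share going to spouses and children (2015 figure above)

freed_for_workers = quota * family_share
print(f"Green cards freed for workers: {freed_for_workers:,.0f}")  # ~78,400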

About 85 percent of those who received an employment-based green card in 2015 were already legally living in the United States (Chart 2).  They were able to adjust their immigration status from another type of visa, like an H-1B or F visa, to an employment-based green card.  Exempting some or all of the adjustments of status from the green card cap would almost double the number of highly skilled workers who could enter.  Here are some other exemption options:

Chart 2

Adjustment of Status vs. New Arrivals

Source: 2015 Yearbook of Immigration Statistics; author’s calculations

  • Workers could be exempted from the cap if they have a higher level of education, like a graduate degree or a Ph.D.
  • A certain number of workers who adjust their status could be exempted in the way the H-1B visa exempts 20,000 graduates of American universities from the cap.
  • Workers could be exempted if they show five or more years of legal employment in the United States prior to obtaining their green card.
  • Workers could be exempted based on the occupation they intend to enter.  This approach is problematic because it requires the government to choose which occupations are deserving, but so long as it leads to a general increase in the potential number of skilled immigrant workers without decreasing numbers elsewhere, the benefits will outweigh the costs.

I’ve had lots of requests for a non-Scribd link to the 2004 DoD IG report on the THINTHREAD and TRAILBLAZER programs I mentioned in my JustSecurity.org piece yesterday, so you can now find it here.

I should point out that at the end of the excellent documentary on this topic, A Good American, the film’s creators noted that Hayden, NSA Signals Intelligence Directorate director Maureen Baginski, and two other senior NSA executives involved in this affair declined to be interviewed on camera.

Michael Currier, like more and more defendants in recent years, was charged with multiple, overlapping offenses: (1) breaking and entering, (2) grand larceny, and (3) possession of a firearm as a convicted felon. This charging decision turned on an aggressive application of Virginia’s felon-in-possession statute, because the alleged firearm violation here was fleeting happenstance: Currier supposedly “handled” the victim’s firearms by moving them out of the way in order to commit the different offense of stealing money from a safe. If Currier had been tried on all these charges at once, the evidence needed to show he was a convicted felon would have been unduly prejudicial on the two primary counts (evidence of past, unrelated criminal behavior is generally inadmissible). The Commonwealth recognized this potential for prejudice, and therefore moved to sever the felon-in-possession count. It opted to try the primary offenses first, and the jury acquitted Currier of the breaking and entering and grand larceny charges. Undeterred, the Commonwealth pressed forward on the felon-in-possession count, refining its case to present the same underlying factual theory to a second jury. And on this second go-round, Currier was convicted. 

As Cato argued in our recent amicus brief, Currier’s conviction is squarely in conflict with the Double Jeopardy Clause of the Fifth Amendment. That provision guarantees that no person shall be “twice put in jeopardy of life or limb” for the same offense, and includes the principle that when an issue of ultimate fact has necessarily been determined by a jury acquittal, the government cannot relitigate the same factual question in a second trial for a separate offense. Given how Currier’s charges were tried the first time, the jury necessarily concluded that he wasn’t guilty of participating in the underlying burglary and theft—he simply wasn’t there at all. But that’s the exact same set of facts the government needed to obtain a conviction in the second trial, because Currier was only alleged to have “handled” the guns in the course of the burglary.

The Commonwealth justifies this result by arguing that Currier waived his double-jeopardy rights by agreeing to severance, and that there was no blatant prosecutorial misconduct. But this position would deprive the Double Jeopardy Clause of much of its significance, and is inconsistent with the historical development of double jeopardy jurisprudence in the United States—in particular, its goal of guarding against the structural power imbalances that exist between prosecutors and defendants. It is also impossible to square the Commonwealth’s position with the sanctity of jury acquittals and the time-honored authority and prerogative of the jury, speaking for the community, to ultimately and finally determine facts. 

If the Commonwealth’s position becomes the law of the land, the government will be further incentivized to charge more offenses based on the same underlying conduct, thus increasing the need for (and likelihood of) multiple trials for the same underlying series of events. This type of overreach will allow the government to run dress rehearsals for successive prosecutions in more and more cases, thereby undermining the sacred liberty interests protected by the Double Jeopardy Clause, and diminishing the responsibility of the jury to stand between the accused and a potentially arbitrary or abusive government. This result would be a travesty; in today’s world of ever-expanding criminal codes and regulatory regimes, the government needs fewer, not greater, incentives for piling on theories of criminal liability.

This article originally appeared on Just Security on December 7, 2017.   

Retired Gen. Michael Hayden, former director of the NSA and CIA (and now a national security analyst at CNN), has recently emerged as a leading critic of the Trump administration, but not so long ago, he was widely criticized for his role in the post-9/11 surveillance abuses. With the publication of his memoir, Playing to the Edge: American Intelligence in the Age of Terror, Hayden launched his reputational rehab campaign.

Like most such memoirs by high-level Washington insiders, Hayden’s tends to be heavy on self-justification and light on genuine introspection and accountability. Also, when a memoir is written by someone who spent their professional life in the classified world of the American Intelligence Community, an additional caveat is in order: the claims made by the author are often impossible for the lay reader to verify. This is certainly the case for Playing to the Edge, an account of Hayden’s time as director of the NSA and, subsequently, the CIA.

Fortunately, with respect to at least one episode Hayden describes, litigation I initiated under the Freedom of Information Act (FOIA) has produced documentary evidence of Hayden’s role in the 9/11 intelligence failure and subsequent civil liberties violations. The consequences of Hayden’s misconduct during this time continue to be felt today. First, some background. 

The War Inside NSA, 1996 to 2001

By the mid-1990s, a group of analysts, cryptographers, and computer specialists at NSA realized that the growing volume of digital data on global communications circuits was both a potential gold mine of information on drug traffickers and terrorist organizations and a problem for NSA’s largely analog signals intelligence (SIGINT) collection, processing, and dissemination systems. As recounted in the documentary A Good American, three NSA veterans—Bill Binney, Ed Loomis, and Kirk Wiebe—set out to solve the problem of handling an ever-increasing stream of digital data while protecting the 4th Amendment rights of Americans against warrantless searches and seizures.

Through their Signals Intelligence Automation Research Center (SARC), they had, by 1999, developed a working prototype system, nicknamed THINTHREAD. A senior Republican House Permanent Select Committee on Intelligence (HPSCI) staffer, Diane Roark, was so impressed with what Binney, Loomis, and Wiebe had developed that she helped steer approximately $3 million to the THINTHREAD project to further its development. But by April 2000, Roark and the SARC team had run into the ultimate bureaucratic roadblock for their plan: Hayden, who had recently been installed as NSA director.

He had his own, preferred solution to the same problem the SARC team had been trying to solve. As Hayden noted in his memoir:

Our answer was Trailblazer. This much-maligned (not altogether unfairly) effort was more a venture capital fund than a single program, with our investing in a variety of initiatives across a whole host of needs. What we wanted was an architecture that was common across our mission elements, interoperable, and expandable. It was about ingesting signals, identifying and sorting them, storing what was important, and then quickly retrieving data in response to queries.

It was, of course, a description that fit THINTHREAD perfectly—except for the collection and storage of terabytes of digital junk. THINTHREAD’s focus on metadata mining and link analysis was designed to help analysts pinpoint the truly important leads to follow while discarding irrelevant data. Hayden’s concept mirrored that of his successor, Keith Alexander, who also had a “collect it all” mentality.

In his memoir, Hayden spoke of the need to “engage industry” (p. 20) in the effort to help NSA conquer the challenge of sorting through the mind-numbing quantity of digital data, but even Hayden admitted that “When we went to them for things nobody had done yet, we found that at best they weren’t much better or faster than we were” (page 20).

That should’ve been Hayden’s clue that NSA would be better off pursuing full deployment of THINTHREAD, a proven capability. But Hayden chose to pursue his industry-centric approach instead, and he tolerated no opposition or second-guessing of the decision he’d made.

In April 2000, Hayden’s message to the NSA workforce made it clear that any NSA employees who went to Congress to suggest a better way for the NSA to do business would face his wrath. Even so, the THINTHREAD team pressed on, managing to get their system deployed to at least one NSA site in a test bed status, working against a real-world target. Meanwhile, Roark continued to push NSA to make the program fully operational, but Hayden refused, and just three weeks before Sept. 11, 2001, further development of THINTHREAD was terminated in favor of the still hypothetical TRAILBLAZER program.

DoD IG Investigation vs. Hayden’s Memoir

As Loomis noted in his own account of the THINTHREAD-TRAILBLAZER saga, within days after the 9/11 attacks, NSA management ordered key components of THINTHREAD—the system Hayden had rejected—to be integrated (without the inclusion of 4th Amendment compliance software) into what would become known as the STELLAR WIND warrantless surveillance program. Terrified that the technology they’d originally developed to fight foreign threats was being turned on the American people, Loomis, Binney, and Wiebe retired from the NSA at the end of October 2001.

Over the next several months, they would attempt to get the Congressional Joint Inquiry to listen to their story, but to no avail. By September 2002, the trio of retired NSA employees, along with Roark, decided to file a Defense Department Inspector General (DoD IG) hotline complaint, in which they alleged waste, fraud, and abuse in the TRAILBLAZER program. Inside NSA, they still had an ally—a senior executive service manager named Tom Drake, who had become responsible for the remnants of THINTHREAD after the SARC team had resigned. Drake became the key source for the subsequent DoD IG investigation, which resulted in a scathing, classified report completed in December 2004.

The TRAILBLAZER-THINTHREAD controversy subsequently surfaced in the press, and I followed the reporting on it while working as a senior staffer for then-Representative Rush Holt (D-N.J.), a HPSCI member at the time. Once Holt was appointed to the National Commission on Research and Development in the Intelligence Community, I asked for and received copies of the published DoD IG reports dealing with the THINTHREAD and TRAILBLAZER programs.

The 2004 report remains the most damning IG report I’ve ever read, and after Holt announced his departure from Congress in 2014, I decided to continue my own investigation into this episode as an analyst at the Cato Institute. In March 2015, I filed a FOIA request seeking not only the original 2004 DoD IG report, but all other documents relevant to the investigation.

After being stonewalled by DoD and NSA for nearly two years, Cato retained the services of Loevy and Loevy of Chicago to prosecute a FOIA lawsuit to help get the documents I sought. In July 2017, the Pentagon released to me a still heavily redacted version of the 2004 DoD IG report. But there are fewer redactions in my copy than there were in the version provided to the Project on Government Oversight (POGO) in 2011, and it provides the clearest evidence yet that Hayden’s account of the THINTHREAD-TRAILBLAZER episode in his memoir is simply not to be believed.

On the IG Investigation Itself

On page 26 of his memoir, Hayden’s only mention of the IG investigation is a single sentence: “Thin Thread’s advocates filed an IG (inspector general) complaint against Trailblazer in 2002.”

Hayden makes no mention of the efforts he and his staff made to downplay THINTHREAD to the IG, or the climate of fear that Hayden and his subordinates created among those who worried TRAILBLAZER was a programmatic train wreck, and that THINTHREAD could, in fact, provide NSA with exactly the critical “finding the needle in the haystack” capability it needed in the digital age.

In its Executive Summary (page ii), the DoD IG report agreed THINTHREAD was the better solution and should be deployed:

And the DoD IG made it clear that NSA management—meaning Hayden—had deliberately excluded THINTHREAD as an alternative to TRAILBLAZER at a clear cost to taxpayers:

On Defying Congress

Hayden’s fury at the SARC team keeping HPSCI staffer Roark in the loop about their progress was palpable, as he made clear on page 22 of his book:

The alliance with HPSCI staffer Roark created some unusual dynamics. I essentially had several of the agency’s technicians going outside the chain of command to aggressively lobby a congressional staffer to overturn programmatic and budget decisions that had gone against them internally. That ran counter to my military experience—to put it mildly.

But Binney, Loomis, and Wiebe didn’t owe their allegiance to Hayden—they owed it to the Constitution and the American people. And to be clear, Roark was the driver behind briefing and information requests, performing her mandated oversight role, a fact Hayden clearly resented—to the point that he was willing to defy her requests, as the IG report noted on page 2:

That defiance of a congressional request went further, as the DoD IG noted on page 99 of their report:

Hayden didn’t just stiff-arm Roark, he stiff-armed the entire committee.

On Incompetent Program Management and Priorities

Hayden makes clear in his memoir (page 20) that he wanted an orderly approach to the digital traffic problem, even if it meant taking a lot of time to do it:

Our program office had a logical progression in mind: begin with a concept definition phase, then move to a technology demonstration platform to show some initial capability and to identify and reduce technological risk. Limited production and then phased deployment would follow.

The DoD IG investigators viewed Hayden’s approach as ill-considered (p. 4):

In other words, Hayden had learned nothing from his mistake in sandbagging THINTHREAD prior to 9/11, and he kept the original, full program on ice even after the loss of nearly 3,000 American lives, and despite daily concerns in the months after the terrorist attacks about possible “sleeper cells” and follow-on attacks.

On THINTHREAD’s Scalability

Hayden argues in his memoir (page 22) that THINTHREAD was not deployable across all NSA elements:

The best summary I got from my best technical minds was that aspects of Thin Thread were elegant, but it just wouldn’t scale. NSA has many weaknesses, but rejecting smart technical solutions is not one of them.

The DoD IG investigators disagreed, as this response to Hayden’s team at the time makes clear (p. 106):

On THINTHREAD’s Effectiveness

On page 21 of his book, Hayden gives the reader the impression that THINTHREAD was not that good at actually finding real, actionable intelligence:

We gave it a try and deployed a prototype to Yakima, a foreign satellite (FORNSAT) collection site in central Washington State. Training the system on only one target (among potentially thousands) took several months, and then it did not perform much better than a human would have done. There were too many false positives, indications of something of intelligence value when that wasn’t really true. A lot of human intervention was required.

An analyst who had actually used THINTHREAD after its initial prototype deployment in November 2000 had a very different view (p. 16):

The second to last sentence is worth repeating: “The analyst received intelligence data that he was not able to receive before using THINTHREAD.” “Not able to receive” from any other NSA system or program. Had THINTHREAD been deployed broadly across NSA and focused on al-Qaeda, it could have helped prevent the 9/11 attacks, as the SARC team and Roark have repeatedly claimed.

On THINTHREAD’s Legality

Hayden claims in his memoir (page 24) that NSA’s lawyers viewed THINTHREAD as illegal:

Sometime before 9/11, the Thin Thread advocates approached NSA’s lawyers. The lawyers told them that no system could legally do with US data what Thin Thread was designed to do. Thin Thread was based on the broad collection of metadata that would of necessity include foreign-to-foreign, foreign-to-US, and US-to-foreign communications. In other words, lots of US person data swept up in routine NSA collection.

In fact, as the SARC team noted in A Good American, THINTHREAD’s operational concept was just the opposite: scan the traffic for evidence of foreign bad actors communicating with Americans, segregate and encrypt that traffic, and let the rest go by. No massive data storage problem, no mass spying on Americans.

And the account the DoD IG investigators got from NSA’s Office of General Counsel (page 20) flatly contradicts Hayden’s memoir.

The “Directive 18” referenced in that account is United States Signals Intelligence Directive 18, which governs NSA’s legal obligations regarding the acquisition, storage, and dissemination of data on U.S. persons.

As you can probably imagine, I could cite many other instances of Hayden’s rewriting of the history of the THINTHREAD-TRAILBLAZER episode, but if you want as much of the story as is currently available, I suggest you read the entire (though still heavily redacted) version of the DoD IG report I obtained in July.

The Story Goes On

What’s remarkable is that Congress was well aware of Hayden’s misconduct and mismanagement at NSA, but it still allowed him to become the head of my former employer, the CIA. Meanwhile, Roark’s personal example of integrity and fidelity to congressional oversight was rendered meaningless by the failure of her then-boss, House Intelligence Committee Chairman (and former CIA operations officer) Porter Goss (R-FL), to fully investigate the THINTHREAD-TRAILBLAZER disaster, and by the Senate’s decision to confirm Hayden to head the CIA by a vote of 78-15. Hayden definitely got one thing very right: He knew he could snow House and Senate members and get away with it.

My FOIA lawsuit is ongoing, and additional document productions are—hopefully—just a few months away. To date, DoD continues to invoke the NSA Act of 1959 to keep many details of this saga—especially the amount of money squandered on TRAILBLAZER—from public view. For me, that is actually a key issue in this case: testing whether NSA, using the 1959 law, can indefinitely conceal waste, fraud, abuse, or even criminal conduct from public disclosure.

But the larger policy issue for me is laying bare, through a real-world case study, a prime example of a hugely consequential congressional oversight failure. The SARC team and Roark continue to argue that had THINTHREAD been fully deployed by early 2001, the 9/11 attacks could’ve been prevented. Drake asserts in A Good American that post-attack testing of THINTHREAD against NSA’s PINWALE database uncovered not only the attacks that happened, but also plots that, for various reasons, never came to pass.

And the SARC team and Roark maintain that THINTHREAD could have accomplished NSA’s digital surveillance and early warning mission without the kinds of constitutional violations seen or alleged with programs like the PATRIOT Act’s Sec. 215 telephone metadata program or the FISA Amendments Act Sec. 702 program, the latter currently set to expire at the end of this month and the subject of multiple legislative reform proposals.

None of this was examined by either the Congressional Joint Inquiry or the 9/11 Commission, which means the real history of how the 9/11 attacks happened has yet to be written.

Also pending are two Office of Special Counsel investigations into aspects of this episode—one involving Drake, and the other looking at former Assistant DoD IG John Crane, as I’ve written previously on this site. I’ll have more to say on all of this as documents become available or as events warrant.

French rocker Johnny Hallyday—the “French Elvis”—has passed away at 74. I do not know his music, but it appears that he was an innovator. His sounds were apparently new to French ears, and his willingness to adopt rock styles from the English-speaking world upset the French establishment. But the people adored his music, and he sold 110 million records. So Hallyday and the market got the better of France’s cultural rules.

Hallyday didn’t like French tax rules either. Here is what I wrote in Global Tax Revolution:

The solidarity tax on wealth was imposed in the 1980s under President Francois Mitterrand. It is an annual assessment on net assets above a threshold of about $1 million, and it has graduated rates from 0.55 percent to 1.8 percent. It covers both financial assets and real estate, including principal homes.

One of those hit by the wealth tax was Johnny Hallyday, a famous French rock star and friend of French president Nicolas Sarkozy. Hallyday created a media sensation when he fled to Switzerland in 2006 to avoid the tax. He has said that he will come back to France if Sarkozy “reforms the wealth tax and inheritance law.” Hallyday stated: “I’m sick of paying, that’s all … I believe that after all the work I have done over nearly 50 years, my family should be able to live in some serenity. But 70 percent of everything I earn goes to taxes.” A poll in Le Monde found that two-thirds of the French public were sympathetic to Hallyday’s decision.

France still has its wealth tax, but numerous other countries have scrapped theirs as global tax competition has heated up. As for Hallyday, he spent his last decade avoiding the wealth tax in Switzerland and Los Angeles.

The latest international academic assessment results are out—this time focused on 4th grade reading—and the news isn’t great for the United States. But how bad is it? I offer a few thoughts—maybe not that wise, but I needed a super-clever title—that might be worth contemplating.

The exam is the Progress in International Reading Literacy Study—PIRLS—which was administered to roughly representative samples of children in their fourth year of formal schooling in 58 education systems. The systems are mainly national, but some are sub-national, such as Hong Kong and the Flemish-speaking areas of Belgium. PIRLS seeks to assess various aspects of reading ability, including understanding plots, themes, and other aspects of literary works, and analyzing informational texts. Results are reported both in scale scores, which can range from 0 to 1000 with 500 being the fixed centerpoint, and in benchmark levels of “advanced,” “high,” “intermediate,” and “low.” The 2016 results also include a first-time assessment called ePIRLS, which looks at online reading, but it covers only 16 systems and has no trend data, so we’ll stick to plain ol’ PIRLS.

Keeping in mind that no test tells you even close to all you need to know to determine how effective an education system is, the first bit of troubling news is that the United States was outperformed by students in 12 systems. Among countries, we were outscored by the Russian Federation, Singapore, Ireland, Finland, Poland, Norway, and Latvia. Some other countries had higher scores, but the differences were not statistically significant, meaning there is a non-negligible possibility the differences were a function of random chance. Also, between 2011 and 2016 we were overtaken by Ireland, Poland, Northern Ireland, Norway, Chinese Taipei, and England.

The second concerning finding is that, on average, the United States has made no statistically significant improvement since 2001. As the chart below shows, our 2016 result was not significantly better than our 2001 score. We appear to have made some strides between 2001 and 2011 but have clearly dipped since then.

A few thoughts:

  • It is tempting to attribute the gains between 2001 and 2011 to the No Child Left Behind Act, and it is certainly possible that the standards-and-accountability emphasis of the NCLB era helped to goose scores. It is, however, impossible to conclude that without looking at numerous other variables that affect test scores, including student demographics and such difficult-to-quantify factors as student motivation. More directly, NCLB was passed in very early 2002, so by 2006 it had had several years to start working. But that year reading scores went down for all but the lowest 25 percent of test takers. By 2011, the next PIRLS administration, NCLB had become politically toxic.
  • The U.S. PIRLS results are broken down by various student attributes, including race/ethnicity. We need to be very careful about these blunt categories—they contain lots of subsets, and ultimately reduce to millions of individuals for whom race or ethnicity is just one among countless attributes—but they might hint at something of use. Most interesting, perhaps, is that scores for Asian Americans (591) beat the top-performing systems, the Russian Federation (581) and Singapore (576). This might suggest that there is something about culture—East Asian culture especially is thought to focus heavily on academic achievement, general American culture not so much—and that the education system itself might play a relatively small role in broad academic achievement.
  • Or maybe it’s not culture, or culture is wrapped up in lots of other things such as business success, or Asian Americans tend to arrive from wealthier backgrounds to begin with. As seen below, a simple correlation between median household income for each group and their 2016 score is almost perfect at 0.98. (A perfect positive correlation would be 1.0; a minimal sketch of the calculation appears after this list.) This also suggests that the system does not have nearly the impact of other factors, but whether it is culture, wealth, or some intertwining of those and many other factors is unclear.

  • If the system does not matter, at least for standardized reading assessments, then what really hurts about U.S. education policy is that we spend more per pupil than almost any other country for which we have data but get pretty mediocre results. As of 2013 we spent $11,843 per elementary and secondary student, and in 2016 were beaten by several countries that spent less, including Latvia ($5,995), Poland ($6,644), Ireland ($9,324), and Finland ($9,579).
  • That factors such as culture might matter much more than spending or the system might explain why American school choice programs tend to produce only slightly better standardized test scores but at a fraction of the cost of public schools. Of course, there are also many things people want out of education that might be much more important to them than test scores—raising morally upright, compassionate, creative human beings, for instance—and freeing people to get those things might be the most important and compelling argument for school choice.
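To make the arithmetic behind that correlation concrete, here is a minimal sketch in Python. The group labels mirror the post’s categories, but the income figures and every score except the Asian American 591 are hypothetical placeholders, not the actual Census and PIRLS data that produce the 0.98 figure.

    # Minimal sketch of the income-score correlation described above.
    # All figures below are hypothetical placeholders EXCEPT the Asian
    # American score of 591, which is cited in the post; the real 0.98
    # result comes from actual Census income and PIRLS 2016 data.
    import numpy as np

    groups = ["Asian American", "White", "Hispanic", "Black"]
    median_income = np.array([81_000, 63_000, 47_000, 39_000])  # hypothetical dollars
    pirls_2016 = np.array([591, 571, 538, 520])  # only 591 is from the post

    for group, income, score in zip(groups, median_income, pirls_2016):
        print(f"{group}: income ${income:,}, score {score}")

    # np.corrcoef returns the 2x2 correlation matrix; the off-diagonal
    # entry is the Pearson r between the two series.
    r = np.corrcoef(median_income, pirls_2016)[0, 1]
    print(f"Pearson correlation: {r:.2f}")

Substituting the published income and score figures into the same two arrays reproduces the calculation behind the 0.98 reported above.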

That’s it for PIRLS 2016 musings. On to the next standardized test results, and other things that may matter far more.

Political debate in the modern world is impossible without memorizing a list of euphemisms, and there is no shortage of public opprobrium for those who talk about certain topics without using them.  In addition to the many euphemisms that are accepted by virtually everybody, the political left has its own set of euphemisms associated with political correctness, while the political right has its own set linked to patriotic correctness.  Euphemisms tend to serve as signals of political-tribal membership, but also as means to convince ambivalent voters to support one policy or the other.  Violating the other political tribe’s euphemisms can even help a candidate get elected President.  This post explores why people use euphemisms in political debate and whether that effort is worthwhile. 

Euphemisms change over time.  Harvard psychologist Steven Pinker termed this linguistic evolution the “euphemism treadmill” and, over twenty years ago, argued that replacing old terms with new ones was likely inspired by the false theory that language influences thoughts, a notion that has long been discredited by cognitive scientists.  Pinker described how those who board the euphemism treadmill can never step off:

People invent new “polite” words to refer to emotionally laden or distasteful things, but the euphemism becomes tainted by association and the new one that must be found acquires its own negative connotations.

Few political debates are as riddled with euphemisms as immigration.  The accurate legal term “illegal alien,” which was once said without political bias and is now almost exclusively used by nativists, was replaced with “illegal immigrant,” which was supplanted by “undocumented immigrant” and, in rarer cases, “unauthorized immigrant.”  Goofy terms like “border infiltrator” and “illegal invader” have not caught on yet.  Proponents of the new term “undocumented immigrant” argue that nobody can be illegal, so the term “illegal immigrant” is inaccurate as well as rude.  Of course, nobody is undocumented either; they merely lack the specific documents required for legal residency and employment.  Many have driver’s licenses, debit cards, library cards, and school identifications, which are useful documents in specific contexts but not nearly so much for immigration.  “Misdocumented immigrant” would be better if the goal were accuracy, but the goal seems to be to change people’s opinions on emotional topics by changing the words they use.

In the immigration debate, the euphemism treadmill can sometimes run in reverse and actually make political language harsher.  This “cacophemism cliff” turned “birthright citizenship” into “anchor baby” and “liberalized immigration” into “open borders.” 

In the long run, stepping onto the euphemism treadmill can seem like a fool’s errand.  As Pinker explains, people’s feelings toward the replaced term are merely transferred to the euphemism, because we use words to describe concepts we already have; we do not use words to invent new concepts.  The concept-to-word cognitive production process affects only the sound of the output, not its meaning.


Framing: “Undocumented Immigrants” or “Illegal Aliens”

Not all is lost for exercisers on the euphemism treadmill.  They just have to lower their expectations and be satisfied with framing political discourse, rather than pursuing the quixotic goal of changing concepts with words.  Framing is a psychological technique that can influence the perception of a social phenomenon, a political or social movement, or a leader.  Research in political psychology has shown that framing works by making certain beliefs accessible in memory upon exposure to a particular frame.  Once certain beliefs are activated through the mechanism of framing, they affect all subsequent information processing.  An example of framing’s power to affect perception is that opinions about a Ku Klux Klan rally vary depending on whether it is framed as a public safety issue or a free speech issue.

Framing can steer public opinion toward opposite ends of the political spectrum.  The “undocumented immigrant” frame will invoke different beliefs from the “illegal alien” frame.  Specifically, the former describes the issue as a bureaucratic government problem afflicting ordinary immigrants.  The latter frames it as a law-and-order problem with foreign nationals.  These two euphemisms, although meant to represent the same concept, do so in different ways that convey different messages and will pull the receivers of the frames in different directions.  Most people feel sympathy toward those caught up in a cruel bureaucratic morass but are much less sympathetic to lawbreakers.

Following this logic, a policy proposal titled “path to citizenship for undocumented immigrants” is going to attract more support than “amnesty for illegal aliens.”  Both “path to citizenship” and “amnesty” here mean legalization.  However, the term “legalization” implies that there has been something illegal about that group of people, an association which many proponents want to avoid.  “Path to citizenship” is a much softer frame that invokes positive emotions.  On the other side of the debate, “legalization” has been replaced with “amnesty,” which has a more negative meaning.  Proponents and users of the term “amnesty” are emphasizing that it is a pardon for an offense rather than a fix for a bureaucratic problem.  “Path to citizenship” is also sometimes replaced by “earned legalization” or “comprehensive immigration reform.”  These two expressions bring up considerations about legality and reform, both of which are far more cognitively charged than “path to citizenship” and therefore less likely to be used by supporters of such policies.

Dog Whistles and the Threat Frame: “Extreme Vetting,” “Illegal Invader,” and “Anchor Baby”

Euphemisms can help legitimize otherwise prejudiced rhetoric.  Consider “extreme vetting,” a phrase that has been referred to as a euphemism for “discrimination against Muslims.”  Using this particular euphemism accomplishes two goals.  First, it helps separate oneself from blatant discrimination based on religion or national origin, which matters because prior research in political science has shown that people are increasingly sensitive to social desirability and so are unwilling to express bluntly prejudiced beliefs, since doing so has become less socially acceptable.  Masking such prejudice under a neutral euphemism is rather useful in that regard.  Second, it still conveys the overall message of hostility to the audience that is receptive to such rhetoric – also known as a dog whistle.  A speaker can thus signal his own beliefs and connect with an audience holding similar beliefs without coming across as bluntly prejudiced.

A somewhat similar idea is behind the use of the term “illegal invader,” which goes even further by invoking a threat frame.  Threats can be powerful tools: once threatened, people tend to overestimate the risk and support policies that minimize the threat, no matter how small it actually is.  Threat frames negatively bias listeners against the group in question.

An important effect of threat-frame euphemisms is that they can dehumanize certain groups and attach negative attitudes to them.  Consider the euphemisms “anchor baby” and “catch and release.”  “Anchor baby” stands for children born on U.S. soil to foreign nationals who are in violation of their immigration status.  Those children have automatic citizenship under the U.S. Constitution.  Such children are called “anchor babies” in order to highlight the idea that they are used by their parents to secure their stay in the country, although that rarely actually happens.  The term dehumanizes both the parents and their children by describing these individuals through association with an inanimate object, the “anchor,” and by implying that the only purpose of the children’s existence is to resolve their parents’ problem with immigration law.  Threat frames also extend to other criminal activity related to immigrants.

There are other indirect expressions that are not euphemisms.  Consider “catch and release” and “sanctuary city.”  “Catch and release” describes the act of apprehending illegal immigrants and subsequently releasing them.  A “sanctuary city” is a city that limits its cooperation with federal immigration enforcement.  These two are used by both sides of the immigration debate and have no positive or negative substitutes.  The problem is that both expressions might as well pertain to the animal kingdom, which can be demeaning and humiliating when used to talk about people.  “Catch and release” brings up associations with fishing and hunting, thus dehumanizing those who are being caught and released.  Similarly, the word “sanctuary” is frequently used to describe a wildlife refuge.  Like “anchor baby,” these expressions are dehumanizing in character.  Although not meant to do any harm and not created by political elites, they can still generate unfavorable attitudes.

Euphemisms as Subliminal Primes

Euphemisms are effective as subliminal primes because they are short, compact expressions.  According to research in political psychology, priming is an instrument that activates preconscious expectations.  Priming is similar to framing but has important differences: it invokes an automatic reaction without the reader having to read through a whole article.  Even a split-second glimpse at a title has a priming effect.  As opposed to frames, primes require less time and less cognitive effort to shape public opinion, and they color the perception of all information that follows.  Consider the hypothetical article titles “Birthright Citizenship for Children of Undocumented Immigrants” versus “Illegal Alien Anchor Babies.”  Although these two expressions technically have a similar meaning, they can subconsciously prime the reader and bias all of his subsequent information processing.  The reader who encounters the first of the two expressions is likely to have a pro-immigration bias primed, whereas the second will prime a bias in the opposite direction.

Euphemisms as primes are particularly meaningful for citizens who are ambivalent about immigration.  Consider a relatively liberal person who is undecided on immigration.  Upon encountering a random piece of news that uses “undocumented immigrants” instead of “illegal aliens,” such a voter is more likely to form a pro-immigration bias at a rather early stage because of his greater innate support for fairness, which is offended by the unequal distribution of documents.  A relatively conservative person who is undecided about immigration, by contrast, is far more likely to be swayed by the term “illegal alien,” because of his greater support for order and structure, which is offended by illegality.

Conclusion

This post explores the theoretical basis of using euphemisms as tools of influence.  Although there is some excellent research into these issues as they relate to immigration, the field is crying out for more experimental and empirical inquiry.  Laboratory experiments with human subjects could confirm the effectiveness of specific euphemisms as primes or frames.  Since such studies are often criticized for weak external validity, a follow-up study that combines content analysis of relevant media with opinion polls showing changes in attitudes could also be useful.

An underexplored possibility is how euphemisms and frames affect political debate by spreading confusion.  People accustomed to the term “illegal immigrant” to describe foreign-born persons unlawfully residing in the United States might initially fail to react as negatively to the term “undocumented immigrant” merely because they don’t know what it means.  As soon as they do know what it means, however, the negative feelings they associate with “illegal immigrant” would probably attach to “undocumented immigrant” as well.  Another is how euphemisms build walls around political tribes and prevent them from talking to each other, thus deepening policy divisions that prevent middle-ground solutions.

Special thanks to Jen Sidorova for her initial rough draft as well as her invaluable insights and research.  

Some years ago I published a paper on the banking theory and policy views of the important twentieth-century economist Friedrich A. Hayek, entitled “Why Didn’t Hayek Favor Laissez Faire in Banking?”[1] Very recently, working on a new paper on Hayek’s changing views of the gold standard, I discovered an important but previously overlooked passage on banking policy in a 1925 article by Hayek entitled “Monetary Policy in the United States After the Recovery from the Crisis of 1920.” I missed the passage earlier because the full text of Hayek’s article became available in English translation, in volume 5 of his Collected Works, only in 1999, the same year my article appeared. Only an excerpt had appeared in translation in Money, Capital, and Fluctuations, the 1984 volume of Hayek’s early essays.[2]

Hayek wrote the article in December 1924, very early in his career. In May 1924 he had returned from a post-doctoral stay in New York City and had begun participating in the Vienna seminar run by Ludwig von Mises. It is safe to say that the passage I am about to quote reflects Mises’ influence, since the article cites him, and in many ways takes positions opposite to those Hayek had taken in an earlier article that he wrote while still in New York.

The main topic of the 1925 article is the Federal Reserve’s policies in the peculiar postwar situation in which, as Hayek put it, the US “emerged from the war … as the only country of importance to have retained the gold standard intact.” The US had received “immense amounts” of European gold during and since the war (Hayek documents this movement with pertinent statistical tables and charts), and now held a huge share of the world’s gold reserves — more gold reserves than the Fed knew what to do with. European currencies, having left the gold standard to use inflationary finance during the First World War, and not having yet resumed direct redeemability, were for the time being pegged to the gold-redeemable US dollar. This was a new and unsettled “gold exchange standard,” unlike the prewar classical gold standard in which major nations redeemed their liabilities directly for gold and held their own gold reserves. Rather than delve into what Hayek had to say about that topic, I want to convey what he said about banking.

In section 8 of the article (pp. 145-47 in the 1999 translation), Hayek gives a favorable evaluation of free banking as against central banking. Having overlooked this passage, I had previously thought that Hayek first addressed free banking in his 1937 book Monetary Nationalism and International Stability. Hayek does not embrace free banking as an ideal, first-best system, because he thought it prone to over-issue (as I discussed in my 1999 article based on Hayek’s other writings). But he criticizes the Federal Reserve Act for relaxing rather than strengthening the prior system’s constraints against excess credit expansion by American commercial banks.

Hayek begins the passage with a caution that the intended result of creating a central bank, when the intention is to avoid or mitigate financial crises, need not be the actual result:

It cannot be taken for granted that a central banking system is better suited to prevent disturbances in the economy stemming from excessive variations in the volume of available bank credit than a system of independent and self-reliant commercial banks run on purely private enterprise (liquidity, profitability) lines.

By standing ready to help commercial banks out of liquidity trouble, central banks give “added incentive … to commercial banks to extend a large volume of credit.” In modern terminology, a lender of last resort creates moral hazard in commercial banking. A free banking system (my phrase, not his) restrains excessive credit creation by fear of failure:

In the absence of any central bank, the strongest restraint on individual banks against extending excessive credit in the rising phase of economic activity is the need to maintain sufficient liquidity to face the demands of a period of tight money from their own resources.

Hayek’s belief that the pre-Fed US system did not restrain credit creation firmly enough is understandable in light of the five financial panics during the fifty years of the federally regulated “National Banking system” that prevailed between the Civil War and the First World War. He might have noted, however, that the National Banking system was a system legislatively hobbled by branching and note-issue restrictions rather than a free banking system or a system “run on purely private enterprise lines.”[3] The Canadian banking system, lacking those restrictions, did not experience financial panics during this period (or even during the Great Depression) despite having an otherwise similar largely agricultural economy.

Despite the flawed character of the pre-Fed system, Hayek judged that the Federal Reserve Act made the situation worse rather than better by loosening the prevailing constraints against unwarranted credit expansions:

Had banking legislation had the primary goal to prevent cyclical fluctuations, its main efforts should have been directed towards limiting credit expansion, perhaps along the lines proposed — in an extreme, yet ineffective way — by the theorists of the “currency school,” who sought to accomplish this purpose by imposing limitations upon the issuing of uncovered notes. … Largely because of the public conception of their function, central banks are intrinsically inclined to direct their activities primarily towards easing the money market, while their hands are practically tied when it comes to preventing economically unjustified credit extension, even if they should favour such an action. …

This applies especially to a central banking mechanism superimposed on an existing banking system. … The American bank reform of 1913-14 followed the path of least resistance by relaxing the existing rigid restraints of the credit system rather than choosing the alternative path …

Thus the Fed was granted the power to expand money and credit, a power that “was fully exploited during and immediately after the war,” not waiting for a banking liquidity crisis. The annual inflation rate in the United States, as measured by the CPI, exceeded 20 percent in 1917, and remained in double digits for the next three years (17.5, 14.9, and 15.8 percent) before the partial reversal of 1921. Hayek (p. 147) observed ruefully “how large an expansion of credit took place under the new system without exceeding the legal limits and without activating in time automatic countermeasures forcing the banks to restrict credit.” He concluded: “There can be no doubt that the introduction of the central banking system increased the leeway in the fluctuations of the volume of bank credit in use.”

Here Hayek reminds us that a less-regulated banking system does not need to be perfect to be better than even well-intentioned heavier regulatory intervention. Good intentions do not equal good results in bank regulation.

_______________

[1] Lawrence H. White, “Why Didn’t Hayek Favor Laissez Faire in Banking?” History of Political Economy 31 (Winter 1999), pp. 753-769. I also published a companion paper on his monetary theory: Lawrence H. White, “Hayek’s Monetary Theory and Policy: A Critical Reconstruction,” Journal of Money, Credit, and Banking 31 (February 1999), pp. 109-20.

[2] F. A. Hayek, “Monetary Policy in the United States after the Recovery from the Crisis of 1920,” in Good Money Part I: The New World, ed. Stephen Kresge, vol. 5 of The Collected Works of F. A. Hayek (Chicago: University of Chicago Press, 1999); F. A. Hayek, Money, Capital, and Fluctuations: Early Essays, ed. Roy McCloughry (Chicago: University of Chicago Press, 1984).

[3] See Vera C. Smith, The Rationale of Central Banking (Indianapolis: Liberty Fund, 1990), chapter 11; and George A. Selgin and Lawrence H. White, “Monetary Reform and the Redemption of National Bank Notes, 1863-1913,” The Business History Review 68, no. 2 (1994), pp. 205-43.

[Cross-posted from Alt-M.org]

Over a decade ago, James Hamilton was convicted of a felony in Virginia, for which he served no jail time. Since then, the state of Virginia has restored all of his civil rights, including the right to possess firearms. In the years since, Hamilton has worked as an armed guard, firearms instructor, and protective officer for the Department of Homeland Security. Despite his never exhibiting any violent tendencies and his leading a stable family life, the state of Maryland, where Hamilton now resides, forbids him from possessing firearms because of that decade-old Virginia conviction.

Hamilton challenged Maryland’s absolute prohibition on the possession of firearms by felons as applied to him, arguing that, while there may be reasons for forbidding some felons from owning firearms, the prohibition makes no sense when applied to a person who committed a non-violent felony over a decade ago. The Fourth Circuit, however, decided that Hamilton was not eligible to bring an as-applied challenge to Maryland’s law, giving states in the Fourth Circuit wide latitude to abuse the constitutional rights of a huge class of citizens and leaving those citizens with no way to vindicate their rights.

On petition to the Supreme Court, Cato submitted a brief as amicus curiae, arguing for the Court to hear Hamilton’s case. We argued that, by deferring to state legislatures in defining who is and is not entitled to Second Amendment protection, the Fourth Circuit allowed Maryland to define the scope of a constitutional right, in direct contravention of Supreme Court precedent, specifically Heller. In general, lower courts have shown tremendous zeal in treating the Second Amendment as a second-class right—even after Heller and McDonald—and those concerns are magnified here, where the Fourth Circuit ruled that a person cannot even bring an as-applied challenge to a law that burdens the exercise of a constitutional right. The Fourth Circuit justified its position by quoting Supreme Court language referring to felon-in-possession bans as “presumptively lawful.” However, that is not how the Fourth Circuit has treated this law. A restriction that is not capable of being defeated is not “presumptively lawful”; it is absolutely and inviolably lawful. We therefore urged the Supreme Court to step in and rein in this abuse by the lower court. The Supreme Court declined.

Hamilton is another in a long line of Second Amendment cases that the Supreme Court has refused to hear, including one just last week challenging Maryland’s “assault weapons” ban. Hamilton is particularly unfortunate because, if its logic is taken far enough, states could deny large portions of their citizens the right to keep and bear arms without any way to remedy the loss. Hamilton’s case was a great vehicle for the Supreme Court to clarify Heller and McDonald and finally force the circuit courts to make Second Amendment decisions with some modicum of consistency. A decade-old, non-violent, non-firearm-related felony for which Hamilton served no time is no reason to strip him of the basic human right of effective self-defense.

There are good reasons to believe that fraud took place in Honduras’ presidential election. The Economist did a statistical analysis of the election results and found “reasons to worry” about the integrity of the vote—although its findings were not conclusive. A report from the Organization of American States Observation Mission points out “irregularities, mistakes, and systemic problems plaguing this election [that] make it difficult… to be certain about the outcome.”

At the heart of the controversy is how the results of the presidential election shifted dramatically after a blackout in the release of information that lasted nearly 38 hours. A first report released by the Electoral Tribunal (TSE) on Monday 27 November at 1:30 am (ten hours after polls closed and after both leading contenders had declared themselves the winners) showed opposition candidate Salvador Nasralla leading incumbent president Juan Orlando Hernández 45.17% versus 40.21%, with 57.18% of tally sheets from polling stations counted.

Then came the blackout, during which officials from Hernández’s National Party argued that the results would be reversed once the release of information resumed. Their claim was that the tally sheets initially reported came from polling stations in urban areas, whereas the National Party strongholds are in rural areas. Indeed, when the TSE began releasing information again on Tuesday afternoon, Nasralla’s five-point lead steadily declined and then disappeared. With almost all votes counted, Hernández is now ahead by 1.6 points.

Other irregularities documented by the OAS include missing tally sheets, opened and incomplete containers with electoral material from polling stations, and undisclosed criteria for processing the ballots that arrived at the TSE collection center.

What now? The opposition is demanding a full Florida-style recount. This would prolong the uncertainty about who won the election, but given the extent of the irregularities, it seems a fair request. However, some officials from Nasralla’s camp also claim that the election has been irretrievably tainted. Nasralla himself proposed a run-off vote with Hernández, but the constitution does not allow for such a possibility. The real danger is that the opposition will reject anything short of a repeat of the election, even if there is a transparent recount. A repeat of the election, expensive as it would be, would also create an ominous precedent for contesting close election results in the future.

It is also fair to say that Nasralla’s camp is not likely to concede defeat under any circumstances. His left-wing coalition—conspicuously named the “Opposition Alliance against the Dictatorship”—was going to cry foul if Nasralla was defeated, regardless of the margin. He also reneged on a signed pledge to respect the result emanating from the TSE and threatened to continue the chaos brought about by his supporters “until the country comes to an end.” Instead of being a responsible actor during the crisis, Nasralla is increasingly giving the impression that he does not want an institutional solution to it. For example, Nasralla has yet to file a formal challenge to the election, despite the fact that a legal deadline was extended until Friday in order to give his Alliance more time to do so. He has not presented evidence of manipulated tally sheets either.

There are no easy ways out of this quagmire, and it is likely that one side will end up feeling cheated. Still, a solution needs to be worked out: The TSE should facilitate the verification of all 18,103 tally sheets and, if anomalies arise, allow for a recount of those where there are discrepancies. This process should be closely monitored by observers from the Organization of American States and the European Union. It is their task to serve as ultimate arbiters and certify whether the conditions have been met for a transparent verification and recount process.

A post-election institutional arrangement could be part of the solution: Since Honduras’ Constitutional Court struck down the prohibition on presidential reelection, Congress should establish non-consecutive reelection (as in Chile, Costa Rica, and Uruguay). In addition, a run-off should be introduced for presidential elections. Finally, the appointment of the TSE justices should be taken away from Congress and given to the Supreme Court in order to guarantee their impartiality.
