Daily-Dose

Contents

From New Yorker

From Vox

I have asked the Attorney General to stand prepared to take all actions to oppose this administration’s unconstitutional overreach of executive power. It has no place in America. Not now, and not ever.

— Governor Mark Gordon (@GovernorGordon) September 9, 2021

Let’s get one thing out of the way first: Vaccine mandates are not unconstitutional. The Supreme Court upheld a local health board’s decision to mandate smallpox vaccinations in Jacobson v. Massachusetts (1905). And states routinely require nearly all school-age children to receive a long list of vaccines.

In Georgia, for example, nearly all children must be vaccinated against many diseases, including polio, hepatitis B, measles, mumps, rubella, varicella, tetanus, diphtheria, pertussis, meningitis, and septicemia.

But just because the Constitution permits the government to require vaccines does not necessarily mean that the Labor Department may, as Biden says it will, issue a binding rule requiring large employers to encourage vaccination. The Labor Department may only act pursuant to an act of Congress. So unless Congress passes a new law, the department must rely on an existing statute if it wishes to regulate employers.

There is a strong argument, however, that the Occupational Safety and Health Act of 1970 (OSH) permits the Labor Department to act. Among other things, that law permits the secretary of labor to issue an “emergency temporary standard” regarding workplace health or safety if they determine that “employees are exposed to grave danger from exposure to substances or agents determined to be toxic or physically harmful,” and that such a standard is “necessary to protect employees from such danger.”

The delta variant of the SARS-CoV-2 virus is a substance or agent that is physically harmful. Still, employers who object to the Labor Department’s new rules might claim that Covid-19 does not present a sufficiently “grave danger” to justify implementing an emergency standard. Or they might argue that the particular rule announced by President Biden is not “necessary” to protect workers from Covid-19.

Under existing case law, the Biden administration likely could overcome these objections. But it will also have to present its case to a judiciary that is dominated by Republican appointees and that has handed down several decisions undercutting public health rules intended to prevent the spread of Covid-19. So there’s no guarantee these courts will follow existing law.

The bottom line is that the fate of the new vaccination rules is uncertain. There is fairly little case law interpreting the Labor Department’s power to issue emergency standards, and there is no guarantee that an increasingly conservative judiciary will treat existing case law as binding.

But if the courts eventually strike down the Labor Department’s rule, it matters a great deal when they do so. If the rule is in effect for several months, many employers are likely to comply with it voluntarily even if the Supreme Court ultimately invalidates it, and millions of Americans could be vaccinated while the rule is in effect.

When can the Labor Department issue an “emergency temporary standard”?

The Occupational Safety and Health Act gives the Labor Department very broad authority to protect workers’ health and safety. Under the law, the secretary may issue binding regulations “to serve the objectives” of the statute. It enumerates several purposes that it is intended to achieve, including “authorizing the Secretary of Labor to set mandatory occupational safety and health standards applicable to businesses affecting interstate commerce,” and “providing medical criteria which will assure insofar as practicable that no employee will suffer diminished health, functional capacity, or life expectancy as a result of his work experience.”

Yet while the law gives the department a great deal of power to issue regulations, it normally may only do so in a lumbering, protracted process that typically takes years to complete. The Occupational Safety and Health Administration (OSHA), an agency within the Labor Department, often spends months or years meeting with stakeholders within an industry before proposing a new rule. The proposed rule must be announced to the public to give people who may be affected by the rule an opportunity to comment. Then OSHA must take these comments into account while devising a final rule.

According to the nonpartisan Congressional Research Service, the whole process, on average, takes seven years and nine months — meaning that, if the Biden administration started that process now in order to push out a vaccination rule, President Biden could be out of office by the time that rule is finalized.

The law also has a rarely invoked provision permitting OSHA to bypass this long process and issue an emergency temporary standard, which may remain in place for up to six months. Although this provision has not been used very often in the past, the Biden administration did use it last June to implement new, Covid-related rules governing health care employers.

Litigation challenging the Biden administration’s new vaccination rules is likely to focus on whether the Covid-19 pandemic meets the legal standard required to issue such an emergency rule — including whether Covid-19 presents a “grave danger” and whether the administration’s proposed rule is “necessary” to protect workers from that danger.

 Kent Nishimura/Los Angeles Times via Getty Images
On Thursday, President Biden announced several new policies to encourage vaccination, including a rule requiring large employers to protect their workers from unvaccinated colleagues by mandating either vaccination or weekly testing.

Prior to the pandemic, the last time OSHA attempted to issue an emergency standard was 1983, when the Reagan administration tried to reduce the amount of asbestos workers could be exposed to by 75 percent. That rule was eventually struck down by the United States Court of Appeals for the Fifth Circuit, although on fairly narrow grounds, and in an opinion that suggests OSHA has a fair amount of discretion to decide when an emergency standard is warranted.

Significantly, the Fifth Circuit’s opinion in Asbestos Information Association v. OSHA (1984) holds that courts should not second-guess the agency’s determination that a particular health hazard presents a “grave danger” to workers. “Gravity of danger is a policy decision committed to OSHA, not to the courts,” according to the Fifth Circuit.

Moreover, even if the courts were to make such judgments, the Fifth Circuit’s opinion suggests that health hazards that are far less threatening than Covid-19 can still be a “grave danger.” According to OSHA, the Reagan administration rule at issue in Asbestos Information Association was expected to save “eighty lives out of an estimated worker population of 375,399” during the six-month period that it would have been in effect — far fewer than the hundreds of thousands of lives lost to Covid-19.

And yet, the Fifth Circuit explained that “the Secretary determined that eighty lives at risk is a grave danger. We are not prepared to say it is not.”

While Asbestos Information Association has some good news for the Biden administration, the court’s opinion also suggests that OSHA may carry a difficult burden of proof when it claims that an emergency standard is “necessary” to protect workers.

The Fifth Circuit did not rule on whether the Reagan-era asbestos standards were “necessary,” but it did strike down those standards because OSHA failed to consider alternative rules, such as requiring employers to provide respirators to their workers that could filter out asbestos.

Under the Fifth Circuit’s opinion, OSHA could have potentially reissued the same emergency asbestos standards if it had explained why alternatives such as respirators are inadequate. But OSHA had to actually do that work.

So are the courts likely to uphold the Biden administration’s vaccination rules?

One piece of good news for the Biden administration is that anyone challenging an emergency temporary standard must file a petition in a United States Court of Appeals, not in the federal district courts that normally try cases prior to appeals. That will prevent these challengers from seeking out one of the several district judges with well-earned reputations as conservative ideologues who can often be relied on to block Democratic policies.

The Fifth Circuit’s opinion in Asbestos Information Association suggests one way that judges acting in bad faith might sabotage the new vaccine rules. That opinion implies that OSHA must consider all reasonable alternatives to a new policy, and explain why those alternatives are inadequate. (We won’t know why the Biden administration believes that alternatives such as mask requirements are inadequate until OSHA formally issues the document implementing the new policy.)

But a hostile panel might also force OSHA to consider unreasonable alternatives, and then send the vaccination rule back to the drawing board because OSHA failed to explain why it couldn’t require unvaccinated employees to wear hazmat suits or to seal themselves up in a hermetic bubble.

The Supreme Court’s conservative majority is also increasingly hostile to federal statutes that grant broad regulatory authority to federal agencies, and may view a challenge to new vaccine rules as a good opportunity to diminish OSHA’s authority.

Last month, in Alabama Association of Realtors v. HHS, the Supreme Court struck down a moratorium on many evictions promulgated by the Centers for Disease Control and Prevention (CDC). The CDC claimed that such a moratorium is justified because it would prevent the spread of Covid-19 by people who lost their homes and had to seek shelter with friends or in homeless shelters. And the CDC relied on a broadly worded statute permitting it to “make and enforce such regulations as in [its] judgment are necessary to prevent the introduction, transmission, or spread of communicable diseases.”

Nevertheless, a majority of the justices construed this statute narrowly to prohibit the eviction moratorium.

There are some important distinctions between the statute at issue in Alabama Association of Realtors and the text of OSH. The statute at issue in the eviction moratorium case contained some language that, at least according to a majority of the justices, should be read to limit the CDC’s authority to actions such as “inspection, fumigation, disinfection, sanitation, pest extermination, [and] destruction of animals or articles.” OSH does not contain similar language.

That said, a major thrust of the Court’s opinion in Alabama Association of Realtors was that the CDC claimed a “breathtaking amount of authority,” and that no previous CDC regulation issued under the same statutory provision “has even begun to approach the size or scope of the eviction moratorium.”

These are not legal judgments so much as value judgments. A majority of the Supreme Court was more uncomfortable with a federal agency having the power to halt evictions en masse than it was with the possibility that people who lost their homes might spread a deadly disease. It’s not hard to imagine the same justices applying a similar value judgment to the Biden administration’s vaccine rules.

All of which is a long way of saying that the fate of those rules is likely to turn less on what OSH (or any other federal law) actually says, and more on whether at least five members of the Supreme Court agree with Biden’s policy.

The man elected to govern the United States has come up with a plan to fight Covid-19. It’s now up to a panel of unelected justices to tell us if he can implement it.

But some things can be measured. There have been no 9/11-scale terrorist attacks in the United States in the past 20 years. Meanwhile, according to the most recent estimates from Brown University’s Costs of War Project, at least 897,000 people around the world have died in violence that can be classified as part of the war on terror; at least 38 million people have been displaced due to these wars; and the effort has cost the US at least $5.8 trillion, not including about $2 trillion more needed in health care and disability coverage for veterans in decades to come.

When you lay it all out on paper, an honest accounting of the war on terror yields a dismal conclusion: Even with an incredibly generous view of the war on terror’s benefits, the costs have vastly exceeded them. The past 20 years of war represent a colossal failure by the US government, one it has not begun to reckon with or atone for.

We are now used to the fact that the US government routinely bombs foreign countries with which it is not formally or even informally at war, in the name of killing terrorists. We are used to the fact that the National Security Agency works with companies like Facebook and Google to collect our private information en masse. We are used to the fact that 39 men are sitting in Guantanamo Bay, detained indefinitely without trial.

These realities were not inevitable. They were chosen as part of a policy regime that has done vastly more harm than good.

What America and the world might have gained from the war on terror

Before going further, it’s important to define our terms. By “war on terror,” I mean all policy initiatives undertaken by the US government from September 11, 2001, to the present with a goal of fighting Islamist — and particularly al-Qaeda/ISIS — terrorism.

This means that not all US policy initiatives in the Middle East and North Africa are counted here as part of the war on terror. The stated rationale behind the NATO intervention in Libya in 2011, for instance, was to force a ceasefire in the country’s incipient civil war and to prevent Muammar Qaddafi’s army from committing atrocities against civilians — so it does not count for our purposes.

The US invasion and occupation of Iraq, by contrast, does count as part of the war on terror, for the simple reason that the Bush administration considered it so. The administration argued for and justified the invasion as a necessary measure to prevent terrorist groups from acquiring weapons of mass destruction and striking the United States.

The costs of the war in Iraq, and indeed of every other front in the war on terror, are relatively easy to tally: hundreds of thousands of lives lost, trillions of dollars spent, incalculable damage to the US’s reputation in the world.

So let’s start with a harder question: What, if any, benefits accrued to the US and the world as a result of the war on terror?

The first, and most obvious, is the wrecking of al-Qaeda’s ability to carry out large attacks in the West. Before 9/11, al-Qaeda was able to operate fairly openly as an organization training and indoctrinating thousands of recruits in how to carry out attacks on the US and its allies.

“The very top leadership [was based] in Afghanistan and able to orchestrate things with a degree of impunity,” Daniel Byman, a senior fellow at the Center for Middle East Policy at the Brookings Institution and a professor at Georgetown University, told me. “They were able to invite literally thousands of recruits there to train.” They could also indoctrinate recruits like future 9/11 ringleader Mohammed Atta, who arrived planning to fight Russians in Chechnya but was persuaded by al-Qaeda leadership to target the US.

“Even with the US departing from Afghanistan, you don’t have the mass,” Byman continued. “You can’t invite thousands of people there without great risk. [Al-Qaeda] leaders are always vulnerable to drone strikes or special ops raids.”

This situation took some time to come about; even after the US invasion of Afghanistan, al-Qaeda was able to maintain an international network of members who went on to carry out massive attacks in Europe, like the March 11, 2004, train bombings in Madrid and the July 7, 2005, attacks in London. Upstart regional groups like al-Qaeda in Iraq and al-Qaeda in the Arabian Peninsula were able to operate with even greater impunity within those countries.

But between direct ground troop assaults (up to and including the assassination of Osama bin Laden), targeted drone strikes, and a greatly expanded system of intelligence sharing both among US intelligence agencies (like the CIA and FBI, which famously failed to share intelligence before 9/11) and with foreign intelligence agencies, al-Qaeda’s operational capabilities have been badly degraded, especially when it comes to attacking the US.

This is not merely because of successes in the US-led war on terror. ISIS, a group that emerged as a direct result of the war, became a more effective recruiter of young aspiring militants than al-Qaeda, especially in 2014 and 2015. But it seems fair to credit at least a good share of al-Qaeda’s weakening to US actions.

How much the destruction of al-Qaeda is worth to the US is a matter of perspective. Let us then take an incredibly generous estimate of its value, to see if that would justify the war on terror’s costs.

In the aftermath of 9/11, fears of attacks of that scale recurring on a regular basis were pervasive. Those fears were not realized because of the decimation of al-Qaeda and because the group, even at its height, was probably not capable of carrying out an attack like that every year.

A chart of circles showing that American terror deaths on 9/11 were nearly three times American terror deaths from 1995 to 2019 combined.

Let’s suppose for the sake of argument, though, that al-Qaeda was capable of more attacks on the scale of 9/11, and that absent the war on terror, the US would have lost 3,000 people (the approximate death toll on 9/11) annually due to al-Qaeda strikes. That amounts to some 60,000 lives saved to date. Whoa, if true.
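For readers who want the back-of-the-envelope math in one place, here is a minimal sketch; the 3,000-deaths-per-year counterfactual is the deliberately generous assumption stated above, not a measured figure.

```python
# Generous counterfactual: al-Qaeda repeats a 9/11-scale attack every year
# absent the war on terror. Both inputs come from the paragraph above.
annual_deaths_averted = 3_000   # approximate 9/11 death toll, assumed to recur yearly
years_since_911 = 20

lives_saved = annual_deaths_averted * years_since_911
print(f"{lives_saved:,} lives saved under the generous counterfactual")  # 60,000
```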

But even with that degraded capability, global deaths from al-Qaeda, ISIS, and Taliban attacks have not fallen since 9/11. While al-Qaeda’s ability to attack America has been badly degraded, its operations in countries like Yemen, Syria, and Libya are still significant and deadly. ISIS’s attacks, and those of the pre-conquest Taliban in Afghanistan, were even deadlier.

A chart showing global deaths from al-Qaeda, ISIS, and Taliban attacks rose much higher and then fell somewhat since 9/11.

More clearly relevant in an accounting of the war on terror are the potential benefits that accrued to some civilians.

Civilians in Afghanistan and Iraq suffered horrifically as a result of America’s invasions and occupations. But the prior regimes in those countries were also horrific. Pre-war Iraq was suffering both from Saddam Hussein’s policies and from international sanctions, and Taliban-governed Afghanistan was a human rights disaster for ethnic minorities and women in particular.

The Taliban has now returned to power, but women in Afghanistan had 20 years free of a theocratic regime, with many able to attend school and university, hold political positions, and generally be more independent of their fathers and husbands. In a 2016-2017 survey, the Afghanistan Central Statistics Organization estimated the literacy rate among women ages 15 to 24 at 38.7 percent — far below the 68.2 percent rate reached among young men, but well above the 19.6 percent rate among women recorded in 2005.

That said, these gains tended to be concentrated in cities like Kabul; many women in more rural regions, suffering under repeated American airstrikes and enjoying fewer gains in liberties, were eager to see the US-backed regime gone.

Health conditions also improved for Afghans during the occupation, with mortality rates for children in particular falling. A study in The Lancet Global Health found that between 2003 and 2015, mortality for children under 5 in Afghanistan fell by 29 percent. Given current birth levels in Afghanistan, that could translate to roughly 44,500 lives saved annually due to reduced child mortality.
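To illustrate how a 29 percent decline could translate into roughly 44,500 lives saved a year, here is a hedged sketch; only the 29 percent figure comes from the Lancet study, while the baseline mortality rate and annual birth count are illustrative assumptions, not figures from the article.

```python
# Hypothetical reconstruction of the ~44,500-lives-per-year figure. The 29%
# decline is from the Lancet Global Health study cited above; the baseline
# rate and annual births are assumed for illustration only.
baseline_u5_deaths_per_birth = 125 / 1000   # assumed under-5 mortality, circa 2003
relative_decline = 0.29                     # reported decline, 2003 to 2015
annual_births = 1_230_000                   # assumed annual births in Afghanistan

deaths_averted = baseline_u5_deaths_per_birth * relative_decline * annual_births
print(f"~{deaths_averted:,.0f} child deaths averted per year")  # ~44,600
```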

It would be a stretch, however, to give the war on terror sole credit for this; many neighboring countries saw child mortality gains in this period, too, at least in official statistics. Iraq, by contrast, did not see notable gains in child mortality post-invasion.

After the US destroyed Iraq’s relatively stable authoritarian regime and plunged the country into a sectarian civil war, the country eventually calmed somewhat, though factional violence continues at high levels. According to the University of Maryland’s Global Terrorism Database, in 2019, Iraq had the second-highest level of terrorism in the world (behind only Afghanistan), but “only” 564 people died in those attacks, down from a peak of 9,929 in 2014.

It would be a stretch to call the current Iraqi regime a “democracy”: Freedom House, a US government-funded nonprofit, rates it as “not free,” citing pervasive Iranian influence on Iraqi politics, endemic corruption, and ongoing violence. The Varieties of Democracy data set classifies Iraq as an “electoral autocracy.” But an electoral autocracy is still likely a step up from Saddam’s brutal regime, and Iraq’s Shia majority and Kurdish minority enjoy much more access to political power than they did pre-invasion.

Of course, these benefits weren’t the only outcomes of 20 years of war.

The costs of the war on terror

Since 2010, the best quantitative source on the toll exacted by US operations in Afghanistan, Iraq, and elsewhere to combat terrorism has been the Costs of War Project, based at Brown University and co-directed by Catherine Lutz, Neta Crawford, and Stephanie Savell.

The group’s mission is simple: to attempt a rigorous accounting of the human and financial cost of America’s post-9/11 wars and produce credible estimates of lives lost, people displaced, and dollars spent.

A chart showing the war on terror’s $8 trillion cost, including future veteran care costs.

Their most recent estimates were released on September 1. The Costs of War Project estimates the total cost of America’s post-9/11 conflicts at roughly $8 trillion, of which $5.8 trillion has been spent or requested so far and $2.2 trillion represents estimated future obligations to care for veterans of these conflicts.

The biggest costs arose from actual war budgets for the Defense Department and associated increases in its base budget; these total around $3 trillion, with interest costs adding another $1.1 trillion. The homeland security part of the war’s cost amounted to roughly $1.1 trillion as well, with $465 billion spent on veteran care to date.
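As a rough consistency check, the named categories can be summed against the $5.8 trillion figure; this sketch assumes the small remainder reflects categories the article does not itemize, which is an inference rather than a figure from the Costs of War Project.

```python
# Summing the article's named spending categories against the $5.8 trillion
# spent-or-requested figure. The "unlisted" remainder is an inference: the
# article does not itemize every category in the Costs of War accounting.
components = {
    "war budgets + DoD base increases": 3.0e12,
    "interest on war borrowing": 1.1e12,
    "homeland security": 1.1e12,
    "veteran care to date": 0.465e12,
}
subtotal = sum(components.values())   # ~$5.67 trillion
remainder = 5.8e12 - subtotal         # ~$0.14 trillion in unlisted categories
print(f"subtotal: ${subtotal/1e12:.2f}T, unlisted: ${remainder/1e12:.2f}T")
```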

That $5.8 trillion spent over 20 years can be a bit hard to picture. It amounts to $290 billion per year — though very unevenly distributed, with the bulk of the costs coming at the apex of the Iraq War. For comparison, $290 billion is more than the US spent on traditional anti-poverty programs (SNAP, SSI, refundable tax credits) last year.

A chart showing the vast majority of the war on terror’s 900,000 deaths were not American.

The Costs of War Project estimates that between 897,000 and 929,000 people have been killed in Afghanistan, Pakistan, Iraq, Syria, Yemen, and other post-9/11 war zones. These are conservative figures; they exclude, for instance, civilian deaths in countries like the Philippines and Kenya that have seen drone or special ops engagements but for which reliable civilian death figures are not available. The project also counts only confirmed deaths directly attributable to the wars, rather than estimating deaths through mortality surveys; the latter method has produced much higher civilian death estimates for the war in Iraq, for instance.

We can take a narrower view and look only at US lives lost. Crawford and Lutz estimate that 15,262 American military members, Defense Department civilians, and contractors have died in these conflicts — a much lower toll.

But if we’re looking myopically at the US as a self-interested actor, we also cannot count any benefits of the war beyond prevented terrorist attacks on the US. Any civilian lives saved through better health care in Afghanistan would be rendered irrelevant, as would any gains to women’s rights. And given how often humanitarian rationales were invoked to defend aspects of the war on terror, it feels important to include both the full humanitarian costs and the full humanitarian benefits in our accounting.

A chart showing civilians, national armies, and opposition fighters make up most deaths in the war on terror.

A little less than two-thirds of the war on terror’s deaths were of civilians or allied members of national armies, like the armies of Afghanistan and Iraq. About a third of deaths were of opposition fighters, like Iraqi insurgents, the Taliban, and ISIS.

Some may object to including the latter deaths here on an equal footing with civilians and allied militaries. But failing to do so risks dramatically undercounting civilian deaths. “A lot of times, there are political incentives for governments to undercount civilians and put people in the category of ‘opposition fighters’ or ‘militants’ because politically that looks a lot less bad,” Savell, co-director of the Costs of War Project, told me.

As part of the US drone war under President Obama, the US government embraced a policy that “in effect counts all military-age males in a strike zone as combatants … unless there is explicit intelligence posthumously proving them innocent,” per reporting by the New York Times’s Jo Becker and Scott Shane. Obviously, not every adult male killed in the drone war was an opposition fighter.

Then there are the costs in terms of people not killed but displaced by war. A paper released by the Costs of War Project last month estimates that Iraq produced 9.2 million refugees, the Syrian theater of the ISIS battle produced 7.1 million, and Afghanistan produced 5.9 million. The authors estimate a total of 38 million displaced people, mostly in their own countries, as a result of US wars.

There are indirect costs as well. The war on terror, and the Iraq War and the torture regime in particular, dealt a devastating blow to America’s standing in the world. “The U.S. image abroad is suffering almost everywhere,” the Pew Research Center reported in its assessment of global opinion of the US at the end of 2008.

The war created diplomatic catastrophes with America’s adversaries. Iran, which had been productively cooperating with the US in Afghanistan against their common enemy, the Taliban, cut off all cooperation when the Bush administration declared the country part of the “axis of evil” in its war-on-terror messaging. Kim Jong Il, the North Korean dictator, reportedly accelerated the country’s pursuit of nuclear weapons in 2002 in part as a response to the “axis of evil” speech, believing the country needed an ironclad deterrent against US attack.

America’s interventions also played a role in provoking further conflicts in the regions in question. Most notably, the invasion of Iraq led directly to the creation of al-Qaeda in Iraq, and to that group’s eventual transformation into the Islamic State of Iraq and Syria. ISIS’s rise in 2014 and 2015 exacted a horrific humanitarian toll on the people of the region, and the group’s subsequent attacks, like the November 13, 2015, strikes in Paris, along with attacks by individuals inspired by ISIS, like the June 12, 2016, Pulse nightclub shooting in Orlando, should be considered costs of America’s invasion of Iraq.

What sort of benefits would justify these costs?

The most comprehensive attempt I’ve seen at a cost-benefit analysis of counterterrorism policies is Terror, Security, and Money: Balancing the Risks, Benefits, and Costs of Homeland Security, a 2011 book by political scientist John Mueller and engineering professor Mark G. Stewart.

They estimate the cost of a 9/11-scale attack at roughly $200 billion, both in economic costs (rebuilding, health care for survivors, and reduced business activity in the wake of the attack) and, more important, in the value of the lives lost. To calculate the latter, they use a measure known as the value of a statistical life. The idea is to use, for instance, the extra wages that workers in especially dangerous jobs demand in order to estimate how much the typical person is willing to pay to extend their life.

In Mueller and Stewart’s book, they put the value of a statistical life in the US at $6.5 million (that’s actually lower than the $7 million a recent review of studies found). Valuing the war’s roughly 900,000 deaths at $6.5 million each adds about $5.8 trillion to the $8 trillion financial bill, bringing the gross cost of the war on terror to about $13.9 trillion.

That implies that for the war on terror to have been worth it, it had to have prevented more than 69 9/11-scale attacks over the past two decades, or about 3.5 attacks every single year.

More plausibly, the war on terror could be justified through, say, the far greater number of lives saved through aid to the Afghan health system.

Here, too, though, the necessary number of lives saved needs to be enormous to justify the costs. At a total cost of $13.9 trillion and a value of $6.5 million per life saved, the entire effort would have had to save at least 2.1 million lives to have been worthwhile.
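Putting Mueller and Stewart’s numbers together in one place, here is a minimal sketch of the break-even arithmetic; every input is a figure cited above, and the rest is straightforward division.

```python
# Break-even arithmetic for the war on terror, using the figures cited above.
financial_cost = 8.0e12              # Costs of War Project estimate
deaths = 897_000                     # lower-bound death toll
value_of_statistical_life = 6.5e6    # Mueller & Stewart
cost_per_911_scale_attack = 200e9    # Mueller & Stewart

gross_cost = financial_cost + deaths * value_of_statistical_life  # ~$13.8-13.9T
attacks_to_break_even = gross_cost / cost_per_911_scale_attack    # ~69
attacks_per_year = attacks_to_break_even / 20                     # ~3.5
lives_to_break_even = gross_cost / value_of_statistical_life      # ~2.1 million

print(f"gross cost: ${gross_cost/1e12:.1f}T")
print(f"{attacks_to_break_even:.0f} prevented attacks (~{attacks_per_year:.1f}/year)")
print(f"or {lives_to_break_even/1e6:.1f} million lives saved")
```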

There’s simply no evidence suggesting that the war on terror, or the public health programs launched as part of it, saved that many lives on net. The only estimate I’ve seen in that territory is the Brookings Institution’s Michael O’Hanlon telling his colleague Jonathan Rauch that he “guesstimates that U.S. activities [in Afghanistan] saved a million or more lives.”

I emailed O’Hanlon to ask where that number came from. This was his reply:

Here’s a rough start on the problem: if deaths to children under 5 went down from 100 per 1,000 to 50 per 1,000 (illustratively), and there were more than half a million births a year, right there is a reduction of at least 25,000 deaths per year. Times 20 means at least 500,000 lives saved. That’s on the child survival front. There were also gains with life expectancy for adults, reductions in maternal mortality. I didn’t do a formal calculation; this is a ballpark estimate.

The figure of 25,000 deaths averted a year he cites is actually lower than the rough estimate of 44,500 I came to above on reduced child mortality. But even so, 500,000 total lives saved is likely an overestimate. The reduction in child mortality did not occur instantaneously between 2001 and 2002; it was gradual, meaning the gains, if they were the result of US actions, were only in effect for a fraction of the US occupation. And doubling the lives saved estimate to 1 million, without a specific reason to think an equivalent number of lives were saved through reductions in non-child mortality, seems foolish.
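To see why gradual gains shrink the total, consider a simple sketch in which the annual savings ramp up linearly over the occupation instead of arriving all at once; the linear ramp is an assumption for illustration, not a claim about the actual trajectory.

```python
# If annual child deaths averted ramped linearly from near zero to O'Hanlon's
# 25,000 over 20 years (an illustrative assumption), the cumulative total is
# roughly half the flat 25,000-per-year estimate.
years = 20
peak_annual_savings = 25_000

total = sum(peak_annual_savings * (year / years) for year in range(1, years + 1))
print(f"~{total:,.0f} cumulative lives saved")  # ~262,500 vs. 500,000 flat
```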

It is also important to think of the opportunity cost of the war. Coincident with the war’s launch was the initiation of PEPFAR, the President’s Emergency Plan for AIDS Relief. That program, then and now, buys and distributes massive quantities of antiretroviral drugs to treat HIV and AIDS in developing countries, and promotes condom distribution and other prevention measures.

One influential study of PEPFAR’s impact found that in its first four years, in 12 specific focus countries, the program reduced the death rate from HIV by 10.5 percent, resulting in 1.2 million lives saved, at a cost of $2,450 per death averted. It is truly one of George W. Bush’s great achievements.

That implies that the US, by expanding funding for HIV treatment and in other cost-effective areas like malaria prevention, could save 2 million lives at a cost of more like $5 billion, or less than one-thousandth the cost of the war on terror.
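The implied arithmetic, sketched under the heroic assumption that PEPFAR’s observed cost per death averted would hold at much larger scale:

```python
# Scaling PEPFAR's observed cost-effectiveness. Assumes (heroically) that
# $2,450 per death averted holds at far larger spending levels.
cost_per_death_averted = 2_450        # dollars, from the PEPFAR study above
target_lives_saved = 2_000_000        # the war on terror's break-even threshold

total_cost = cost_per_death_averted * target_lives_saved
print(f"${total_cost/1e9:.1f} billion")                        # ~$4.9 billion
print(f"{total_cost/13.9e12:.4%} of the war on terror's cost")  # ~0.035%
```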

When you step back and think about the cost of the war on terror and all the possible benefits that could have come from it, you would be hard-pressed to arrive at a place where the benefits outstrip the costs. Indeed, the benefits never come remotely close to the costs. The war on terror was as wasteful, and as morally horrific, on the balance sheet as it is in the collective memory.

We need to remember the sheer magnitude of this disaster

At this juncture in history, perhaps the math above feels obvious, or even a non-story. Of course the war on terror, especially the war in Iraq, was a disaster. Of course the US wasted billions if not trillions of dollars, and ended hundreds of thousands of lives, when it did not need to. These are not original points, and many of us have internalized them so deeply that we no longer bat an eye.

It’s worth batting an eye, though. Jadedness has a tendency to cause us to glaze over outrages, to accept things that are not natural and to keep us from interrogating whether they are worthwhile parts of our world.

The war on terror has been naturalized to an astonishing degree over the past two decades. The drone war continues, usually off the news radar. Compare how much you heard about the ISIS strike on the Kabul airport to how much you heard about the 10 people — most or all of whom were reportedly civilians — the US killed in a reprisal drone strike.

 Wakil Kohsar/AFP via Getty Images
A Taliban fighter stands guard at the site of the August 26 twin suicide bombs near the airport in Kabul.

As president, Donald Trump ramped up the effort by moving strikes outside a cumbersome interagency vetting process and toward faster-moving “country teams” responsible for strikes in their areas, and President Joe Biden is partially continuing that approach. The prison at Guantanamo Bay is still open and holding 39 people, including 9/11 mastermind Khalid Sheikh Mohammed. All but two of them are being detained essentially without trial, and in the case of some, like Mohammed, after many excruciating rounds of torture.

These are policies that warrant evaluating in their own right. But they’re also worth considering as the remaining components of an outrageously wasteful policy disaster. It seems clear the war on terror was a bad idea. What are we going to do about it?

Epic Games may have won a small battle in the Epic-Apple trial, but Apple won the war.

The closely watched Apple-Epic Games trial over Apple’s control of its App Store, and whether that control makes it an unfair monopoly, now has a decision, and it’s not great for Epic Games.

Judge Yvonne Gonzalez Rogers ruled in Apple’s favor on almost every count. Epic Games had hoped to prove that the tech giant’s App Store was a monopoly, causing higher prices for consumers and forcing developers to follow all of its rules in order to be allowed on Apple’s mobile devices.

Gonzalez Rogers ruled that the App Store is not a monopoly and that Apple should not be punished for its success. And while the court is forcing Apple to allow developers to tell app users about alternative ways they can pay for subscriptions and in-app purchases — which may seem like (and in some cases, was initially reported as) a win for Epic — Apple will be allowed to continue most of the App Store practices Epic was fighting to get it to stop.

“The Court finds in favor of Apple on all counts except with respect to violation of California’s Unfair Competition law (Count Ten) and only partially with respect to its claim for Declaratory Relief,” the judge wrote.

But you don’t have to take her word for it; Epic’s and Apple’s statements also reflect whose side the verdict favored.

“Today’s ruling isn’t a win for developers or for consumers,” Epic CEO Tim Sweeney tweeted. “Epic is fighting for fair competition among in-app payment methods and app stores for a billion consumers.”

“The court has affirmed what we’ve known all along: the App Store is not in violation of antitrust law,” Apple said.

A big factor in the decision was the definition of the “market” Apple allegedly had a monopoly over. This was a sticking point in the trial: Apple argued that the market was all gaming platforms; Epic said the market was just Apple’s App Store. Gonzalez Rogers said during the trial that she thought the market might be all mobile gaming, which would include other mobile platforms and stores like Google Play. And that’s the definition she went with in her ruling. It’s hard to prove that Apple is a monopoly when the judge’s definition of the market also includes its competitors.

The one victory Epic Games did achieve was a limited one: Though Gonzalez Rogers ruled that Apple had to allow developers to show app users links where they can make purchases outside of the App Store (purchases Apple won’t get a cut of), Epic is still not allowed to insert its own payment method in the app itself, nor can it place its own app store on Apple devices.

“This measured remedy will increase competition, increase transparency, increase consumer choice and information while preserving Apple’s iOS ecosystem which has pro-competitive justifications,” the judge wrote.

But Apple had already decided (or was strongly pressured) a few weeks ago to end its prohibition on telling users they could purchase subscriptions and in-game items outside of the App Store. So this ruling doesn’t really change anything for Apple now, and companies like Epic and Spotify are already on record saying the ability to tell customers about their alternatives isn’t good enough.

As for Epic’s other claims, Gonzalez Rogers said the company “overreached” and couldn’t prove that Apple was a monopolist. That doesn’t necessarily mean that Apple isn’t a monopoly, nor that another plaintiff couldn’t make a better argument that it is. Gonzalez Rogers added: “The trial record was not as fulsome with respect to antitrust conduct in the relevant market as it could have been.” The 30 percent commission Apple takes on most subscriptions and in-app purchases, she said, “appears inflated” and was “potentially anticompetitive.” But, since Epic wasn’t challenging the amount of the commission (only the fact that there was one), she wasn’t able to rule on it.

So this one civil lawsuit won’t be the final word on Apple and antitrust. United States lawmakers and regulators, as well as those in several other countries, are pressuring Apple to change what they see as possible violations of their antitrust laws. Apple is one of several Big Tech companies included in the Biden administration’s big antitrust push, which includes appointing Big Tech critic Lina Khan to the chair of the Federal Trade Commission (FTC). The issue is bipartisan, too: Republican and Democratic lawmakers are vocal Big Tech opponents and have started to introduce new antitrust bills targeting it, while state attorneys general teamed up to sue Google for antitrust violations multiple times in the last year. Facebook has also been sued for antitrust violations by the FTC and almost every state — though the state attorneys general’s version of the suit was thrown out.

Sen. Amy Klobuchar, who has made antitrust in Big Tech one of her major issues, said the ruling showed that more antitrust laws were needed.

“App stores raise serious competition concerns,” Klobuchar said in a statement. “While the ruling addresses some of those concerns, much more must be done. We need to pass federal legislation on app store conduct to protect consumers, promote competition, and foster innovation.”

Spotify, which has been a vocal opponent of Apple’s App Store and complained about it to the European Union’s antitrust enforcement commission, said it was pleased with the part of the ruling that said Apple’s conduct was anti-competitive and barred its anti-steering rule, and hoped it would lead to more decisions like this.

“This and other developments around the world show that there is strong need and momentum for legislation to address these and many other unfair practices, which are designed to hurt competition and consumers,” Spotify’s head of global affairs and chief legal officer Horacio Gutierrez said in a statement.

As for Epic’s stunt that kicked all of this off — putting a direct payment system in Fortnite that was in violation of the App Store rules, which got it booted from iOS and macOS devices — the judge ruled in Apple’s favor. Not only did she declare that Apple’s decision to terminate its agreement with Epic was “valid, lawful, and enforceable,” she also ordered Epic to pay Apple damages: 30 percent of the revenue it collected through the forbidden direct payment system from its August 2020 installation to the present day.
