Friday, May 22, 2015

No Respect for the Poor


We all make mistakes. In 1996, I ventured a silly notion at the end of a grant-funded project study that criticized the over-optimistic labor market assumptions behind U.S. “welfare reform.”  Welfare “reform” was a euphemism for the elimination of poor families’ entitlement to basic family cash assistance in the name of “welfare-to-work” and “work first.” My fellow researchers and I (working under the rubric of the Midwest Job Gap project) showed that the U.S. economy was generating far too few decent-paying low-skilled jobs to absorb the millions of poor mothers being pushed into the job market by the bipartisan “Personal Responsibility and Work Opportunity Reconciliation Act.”  There wasn’t enough employment “opportunity” out there for welfare “reform” to meaningfully reduce poverty in the U.S., we argued.
Nonetheless, I found it necessary for some reason to hint that there might be a “silver lining” to the vicious policy in question. Maybe, I suggested, poor people would be treated with more respect in the U.S. since it would now be clearer than ever that most of the nation’s worst-off citizens were employed. I was thinking of opinion surveys I’d seen showing that the working poor were held in much higher regard than “the welfare poor” by the public and by policy makers.
Surrendering Basic Rights
Who was I trying to kid? In the late 1990s, at the peak of the “Clinton boom,” the brilliant left author Barbara Ehrenreich began the participant-observation research for what became her bestselling 2001 book Nickel and Dimed: On (Not) Getting By in America – a harrowing account of her attempts to pay her bills and maintain her dignity while working at the bottom of the American occupational structure. Ehrenreich wanted to know how anyone could make it on $6 an hour without benefits as a hotel maid, house cleaner, waitress, and Wal-Mart sales “associate,” working in the precarious region between fading public benefits eligibility and good jobs. She found that the nation’s lowest-status jobs were both physically and mentally exhausting and that one such job was not enough to pay for decent food, clothing, and shelter.
But what most particularly struck Ehrenreich about life at the low-wage end of the “Fabulous Nineties” was the remarkable extent to which working people were “required to surrender…basic civil rights…and self-respect” thanks to employer practices that helped “mak[e] ours not just an economy but a culture of extreme inequality.”  The humiliations she witnessed and experienced included routine mandatory drug testing, intrusive pre-employment tests full of demeaning questions, rules against “talking” and “gossip” (against organizing, often enough), restrictions on trips to the bathroom, abusive rants by overbearing supervisors, petty disciplinary measures, stolen labor time, and the constant threat of being fired for “stepping out of line.”  She learned as a waitress that management had the right to search her purse at any time.
So much for the notion that Bill Clinton and Newt Gingrich’s welfare “reform” (elimination) might restore some dignity and honor to the poor by moving more of them off the dole and into the paid workplace.
Two Cruel Jokes: The Minimum Wage and Poverty Level
Things have gotten worse for low-wage U.S. workers since Nickel and Dimed hit the bookshelves. Real hourly wages for those at the middle of the wage distribution have stagnated since 2000, consistent with deeper trends across the long neoliberal era. But no group of workers has suffered more than those at the very bottom. Americans with only a high school diploma or less have actually seen their wages fall since the turn of the millennium.
One part of the problem is that the U.S. minimum wage is a bad joke. If it had kept pace with increases in U.S. labor productivity since the 1970s, it would be $18 an hour today.  Instead it sits at a pathetic $7.25, which translates (assuming full-time, year-round work) into $14,500 per year, well below the notoriously inadequate federal poverty level for a three-person family ($19,790).
The most that “liberal” Democrats in Washington seem ready to pretend to fight for is an increase of the minimum wage to $10 an hour, that is, to a mere $20,000 a year for low-wage workers fortunate enough to work 40 hours a week, 50 weeks a year.
Which brings us to another bad joke: the U.S. poverty level. According to the Economic Policy Institute’s heroically researched Family Budget Calculator, the real cost of a minimally adequate no-frills standard of living for one parent with one kid in Iowa City, Iowa, is $48,235.  That sounds high until you add up the monthly expenses: housing ($853), food ($369), child care ($684), transportation ($459), health care ($891), other necessities ($313), and taxes ($450), for a total monthly outlay of $4,020. Go to the San Francisco metropolitan area and the cost of a basic family budget for one parent with one kid is $70,929. In the Chicago area, it’s $53,168. Make it two parents and two kids in Iowa City and the cost is $66,667.
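For anyone who wants to re-add those line items, here is a minimal sketch of the arithmetic. The monthly figures are the ones just cited; the small gap between twelve times the monthly total and EPI’s published $48,235 is simply rounding in the published numbers.

# Re-adding the EPI monthly budget cited above for one parent and one child
# in Iowa City (dollars per month).
monthly = {
    "housing": 853,
    "food": 369,
    "child care": 684,
    "transportation": 459,
    "health care": 891,
    "other necessities": 313,
    "taxes": 450,
}
monthly_total = sum(monthly.values())   # 4,019 -- EPI rounds this to $4,020
annual_total = monthly_total * 12       # about $48,200, vs. EPI's published $48,235
print(f"monthly: ${monthly_total:,}   annual: about ${annual_total:,}")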
It is absurd not only that the U.S. federal poverty level (based on a hopelessly antiquated 1950s-era formula that simply multiplies a minimum food budget by three) is so low but also that it is not adjusted for significant geographic variations in the cost of living across U.S. metro areas.
The EPI’s figures are worth keeping in mind the next time you hear the Chamber of Commerce or the American Enterprise Institute express horror at the notion that the minimum wage should go as “astronomically” high as $15 an hour.  Even such a dramatically increased minimum wage translates into just $30,000 a year for a worker fortunate enough to stay employed full time.
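As a back-of-the-envelope check on the wage figures above, here is a minimal sketch, assuming (as the text does) full-time work of 40 hours a week for 50 weeks a year and the $19,790 federal poverty level for a family of three:

# Back-of-the-envelope arithmetic behind the wage figures in this piece.
# Assumes full-time work: 40 hours a week, 50 weeks a year.
HOURS_PER_WEEK = 40
WEEKS_PER_YEAR = 50
POVERTY_LINE_FAMILY_OF_3 = 19_790  # the federal poverty level cited above

def annual_income(hourly_wage):
    """Gross annual pay for a full-time, year-round worker."""
    return hourly_wage * HOURS_PER_WEEK * WEEKS_PER_YEAR

for wage in (7.25, 10.00, 15.00):
    income = annual_income(wage)
    gap = income - POVERTY_LINE_FAMILY_OF_3
    print(f"${wage:.2f}/hr -> ${income:,.0f}/yr ({gap:+,.0f} vs. the poverty line)")

At $7.25 the result is $14,500 (well under the poverty line); at $10, $20,000; at $15, $30,000, exactly the figures cited above.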
With most Americans’ wages stagnating for more than a decade and with the lowest paid workers’ wages shrinking, it is no wonder that half of the more than 24 million Americans who rely on food banks for basic nutrition are employed.  The cost of living just keeps going up.
“Put a Bullet Through Your Head”
Psychological abuse from employers remains very much a problem for the working poor. As the working-class activist and journalist Bob Simpson reported from Chicago last year, a McDonald’s worker named Carmen Navarrette was “told that she ‘should put a bullet through her head,’ because she had requested permission to go home after becoming very ill at work. She is a diabetic and had just been released from the hospital.”  The daughter of a different Chicago fast food worker spoke “about how her mom comes home crying because ‘the manager would scream at her and yell mean things. And right now she is pregnant and he makes her carry more than she is supposed to and that’s not good for her. But he says he doesn’t care.’… On top of … [the] economic burden” that goes with working poverty in the U.S., Simpson noted, “comes the stress of cruel verbal abuse and the threat of arbitrary discipline without fair hearing.”
Dickensian Facts
Back to “welfare reform.” How’s that forgotten experiment in neoliberal “tough love” doing these days? As the Center on Budget and Policy Priorities (CBPP) reported to Congress three weeks ago, Temporary Assistance for Needy Families (TANF, the program that replaced Aid to Families with Dependent Children, AFDC, under the 1996 welfare “reform”) provides cash assistance to very few needy families and lifts far fewer children out of “deep poverty” (incomes below half the federal poverty line) than did its predecessor, AFDC – this while poverty has risen in the current century. CBPP Vice President LaDonna Pavetti’s testimony to the U.S. House Ways and Means Committee reads like something out of Charles Dickens:
“The national TANF average monthly caseload has fallen by almost two-thirds — from 4.7 million families in 1996 to 1.7 million families in 2013 — even as poverty and deep poverty have worsened. The number of families with children in poverty hit a low of 5.2 million in 2000, but has since increased to more than 7 million. Similarly, the number of families with children in deep poverty hit a low of about 2 million in 2000, but is now above 3 million. These opposing trends — TANF caseloads going down while poverty is going up — mean that TANF reaches a much smaller share of poor families than AFDC did. When TANF was enacted, nationally, 68 families received assistance for every 100 families in poverty; that number has since fallen to just 26 families receiving assistance for every 100 families in poverty…In ten states, fewer than 10 families receive cash assistance for every 100 families in poverty.”
On the eve of its elimination in 1995, AFDC lifted out of deep poverty 62% of the children who would otherwise have been there, saving 2,210,000 children from life at less than half the poverty level.  Fifteen years later, TANF did the same for a mere 629,000 children, lifting just 24% of children who would otherwise have been deeply poor. U.S. welfare payments were in fact never high enough to permit poor mothers to escape the necessity of participation in the job market, but, as the Public Broadcasting System recently reported, “welfare checks have shrunk so much that the very poorest single-parent families [now] receive…35 percent less than they did before welfare-to-work began.”
That is disgraceful in and of itself.  It is doubly shameful in a time when poverty has expanded while wealth and income have concentrated in ever fewer hands (the top 1% garnered 95% of the nation’s income gains during Obama’s first administration), bringing the nation to an openly acknowledged New Gilded Age of savage inequality and transparent plutocracy.
Welfare to Work?
Welfare to work? As Pavetti told Congress, most of the early employment gains among single mothers that were seen after TANF’s creation in 1997 have vanished thanks to the disappearance (after 2000) of the briefly favorable labor market for lesser-skilled workers that emerged in the late 1990s.  The late-1990s success of “work first” programs, which emphasize getting participants into the labor market quickly, is vastly overstated. Although employment increased, the vast majority of former welfare recipients pushed into the job market did not attain stable employment even at the height of the unsustainable, debt-leveraged Clinton expansion. And today, after two predictable (and predicted) capitalist recessions (one epic in nature) and with another recession looming, U.S. states “spend little of their TANF funds to help improve recipients’ employability.”  TANF recipients report that “welfare to work” programs typically involve little more than direction to short-lived, commonly seasonal low-wage jobs, and that serious training and placement programs are unavailable and unfunded.
“Welfare to work” is a scam to cover the slashing of government’s responsibility for the nation’s most vulnerable citizens in a society whose “free market” system offers ever fewer real opportunities for stability and upward mobility through employment while conferring vast government subsidies and protections on the wealthy corporate and financial Few.
Fight for 15 and for Dignity
The U.S. working class struggle for a Living Wage that has emerged in recent years in connection with the Fight for Fifteen – for a minimum wage of $15 an hour (still below basic family budgets in all U.S. metropolitan areas) – is more than an economic struggle. It is also a political and moral struggle for basic decency, for self-respect, and for dignity.
Connecting economic oppression to psychological mistreatment in her widely read book, Barbara Ehrenreich guessed in Nickel and Dimed “that the indignities imposed on so many low-wage workers – the drug tests, the constant surveillance, being ‘reamed out’ by managers – are part of what keep wages low.  If you’re made to feel unworthy enough,” Ehrenreich wrote, “you may come to think that what you’re paid is what you’re worth.”  It was an important point. Debilitating shame and the related psychological battering of working people in the all-too-unprotected, de-unionized, and hidden abode of the workplace are part of how the employer class rules over low-wage workers in “the land of freedom.”
Inspiringly enough, however, tens of thousands of those workers in the U.S. have in the last two years stood up to tell their bosses and the nation that they not only need but also deserve more than miserable wages and denigration on the job.  “The [workers] of the Fight for 15 campaign,” Simpson noted last year, “want a world where a decent standard of living and respect for all is the norm.”
The Fight for 15 is also a fight for dignity. Respect for workers, the struggle’s participants know, will only be won from the bottom up, through collective and militant action.  It will never be granted from the top down by elites who have little more respect for a Walmart or McDonald’s worker than they do for a TANF recipient or for one of the nation’s more than 2 million prisoners.

Only one in four workers worldwide has a stable job

Only one quarter of the world’s working population holds a permanent and stable job, according to a new report published Tuesday by the International Labour Organization (ILO).
Even as the number of unemployed people worldwide remains significantly higher than before the 2008 crisis, the few jobs that have been created in recent years have been disproportionately part-time, contingent and low-wage.
The ILO’s World Employment and Social Outlook—Trends 2015 report found that three-quarters of workers are “employed on temporary or short-term contracts, in informal jobs often without any contract, under own-account arrangements or in unpaid family jobs.”
The report notes that worldwide more than 60 percent of workers do not have any sort of employment contract, with most of them working on family farms and businesses in developing countries. But even among those who earn wages or salaries, less than half—only 42 percent—are employed on a permanent basis.
In what are categorized as high-income countries, the share of workers employed on a permanent basis has declined in recent years, from 74 percent in 2004 to 73.2 percent in 2012. For males this decline has been even sharper, with the share working on permanent contracts falling from 73.1 percent to 71.2 percent during the same time.
The report likewise found a global rise in part-time employment. “In the vast majority of countries with available information, the rise in the number of part-time jobs outpaced gains in full-time jobs between 2009 and 2013.”
The ILO notes, “In France, Italy, Japan, Spain and the [European Union] more broadly, increases in part-time employment occurred alongside losses in full-time jobs—leading in some instances to overall job losses during this period.” Since 2009, the number of full-time jobs in the European Union has fallen by nearly 3.3 million, while part-time employment has increased by 2.1 million.
Meanwhile, legal protections assuring workers a stable employment schedule have been slashed, with the ILO noting that, “labour protection has generally decreased since 2008.”
“The shift we’re seeing from the traditional employment relationship to more non-standard forms of employment is in many cases associated with the rise in inequality and poverty rates in many countries,” said Guy Ryder, Director-General of the ILO.
The report found “a shift away from the standard employment model, in which workers… have stable jobs and work full time. In advanced economies, the standard employment model is less and less dominant.”
This phenomenon was mirrored in developing countries where, “at the bottom of global supply chains, very short-term contracts and irregular hours are becoming more widespread.” As a result, “in emerging and developing economies, the historical trend toward more wage and salaried employment is slowing down.”
The report notes that “nearly eight years have passed since the first signs of crisis emerged in the global economy,” yet “the more recent period has seen global unemployment march higher” and has been “characterized by an uneven and fragile job recovery.”
The ILO estimates that the number of people unemployed worldwide hit 201 million last year, up by 30 million since the eruption of the global financial crisis in 2008. The report notes that, far from making any significant dent in the number of people unemployed worldwide, “providing jobs to more than 40 million additional people who enter the global labour market every year is proving to be a daunting challenge.”
The ILO notes that employment growth has largely stalled worldwide, with the number of jobs available growing by only 0.1 percent each year in developed countries since 2008, compared to a rate of 0.9 percent between 2000 and 2007.
This has corresponded with an overall slump in economic growth. For the “advanced economies” as a whole, growth in the period between 2007 and 2014 averaged about 0.7 percent per year, compared with an annual growth rate of two percent in the period before the crisis.
The report warned that falling wages and continued mass unemployment have contributed to a structural weakness in global demand, resulting in a further slump in the labor market. Director-General Ryder added, “These trends risk perpetuating the vicious circle of weak global demand and slow job creation that has characterized the global economy and many labour markets throughout the post-crisis period.”
The increasing prevalence of low-wage, part-time and contingent work has coincided with a massive enrichment of the financial elite. Since 2009, the wealth of the world’s richest 400 individuals has nearly tripled, from $2.4 trillion to $7.05 trillion in 2015, according to Forbes magazine. This massive growth of inequality has been the direct outcome of policies carried out by governments throughout the world, which responded to the 2008 crash by pumping trillions of dollars into the financial system while slashing social services and promoting poverty-wage employment.
The findings of the report constitute a scathing indictment of the capitalist system, which is incapable of addressing mass unemployment, poverty or any other social problem. Developing countries, robbed and exploited by imperialism, remain backward and impoverished, while in the “advanced” economies the ruling classes have carried out a relentless assault on jobs, wages and living conditions for the great majority of the population.
There is nothing in the 155-page report to indicate any prospect for improvement in the near future. This fact constitutes an implicit admission that soaring inequality, falling wages, mass unemployment and increasingly contingent employment constitute essential features of the present social order.

Token fines for banks caught rigging foreign exchange markets

In yet another wrist-slap settlement for bankers involved in criminality on a massive scale, the US government on Wednesday announced that five major banks had pleaded guilty to felony conspiracy and antitrust charges and agreed to pay a combined total of approximately $5 billion in fines.
The payouts, largely tax-deductible, are a fraction of the banks’ combined profits. The amounts have already been set aside by bank CEOs as the cost of doing business in an environment in which banks routinely break the law, secure in the knowledge that there will be no serious consequences.
The banks—JPMorgan Chase, Citigroup, UBS, Barclays and RBS—admitted to conspiring to rig global currency exchange rates. They made billions of dollars by illegally manipulating rates affecting countless businesses and individuals around the world. All of the banks were previously implicated in rigging Libor (the London Interbank Offered Rate), the global benchmark used to set short-term interest rates for hundreds of trillions of dollars in loans.
Two of the banks, UBS and Barclays, carried out the foreign exchange fraud in violation of the terms of their non-prosecution agreements with the US government stemming from their involvement in the Libor scandal.
The documents released by the Justice Department in relation to the settlement point to the culture of fraud and criminality on Wall Street. As one Barclays vice president put it, “If you ain’t cheating, you ain’t trying.”
Since the Wall Street crash of 2008, these and other major banks have been cited for crimes ranging from fraudulently selling worthless mortgage securities, to laundering money for Mexican drug lords, facilitating Bernard Madoff’s Ponzi scheme, and concealing billions in speculative losses. For these crimes they have suffered no serious consequences.
Instead, regulators in the US and internationally have crafted settlements in backroom negotiations with the criminals involving token fines that turn out to be significantly smaller than the nominal figures announced by government officials.
“The criminality occurred on a massive scale,” said FBI Assistant Director Andrew McCabe, announcing the foreign exchange fraud settlement on Wednesday. He explained that traders at multiple banks rigged estimates of global currency exchange rates every day for up to five years.
US Attorney General Loretta Lynch spoke of the conspiracy’s “breathtaking flagrancy, its systemic reach, and its significant impact.” Aitan Goelman, the head of enforcement at the Commodity Futures Trading Commission, called the five banks a “cabal.”
These statements, meant to give the appearance of government toughness toward the banks, only underscored the gaping discrepancy between the scale of the crimes and the toothless character of the punishment. Wednesday’s announcement was further confirmation that the US and international financial aristocracy is above the law.
Not a single major bank has been closed down or broken up since the 2008 crash, triggered by reckless and illegal speculative activities. Not a single bank CEO or top official has been prosecuted or jailed for crimes that have led to the impoverishment of countless millions of people.
But a petty crime carried out by a US worker or working-class youth brings down the wrath of a so-called “justice system” that is merciless when it comes to the lower social orders. Tens of thousands of workers and poor people are cast into America’s prison gulag every year for offenses that pale in comparison to the crimes carried out by Wall Street CEOs.
Or they are killed outright by the militarized police who occupy America’s working-class neighborhoods. Michael Brown, an 18-year-old unarmed youth, was gunned down last August by a Ferguson, Missouri cop who was tracking him for allegedly stealing a package of cigarillos from a convenience store.
In the deal announced Wednesday, the banks pleaded guilty to felony charges. This is a departure from previous settlements in which the government allowed the banks to avoid any admission of guilt.
But the guilty pleas were part of a scheme worked out between the government and the banks to render the pleas virtually meaningless. The Securities and Exchange Commission issued waivers exempting the banks from the legal repercussions of committing a felony, giving them continued preferential treatment in issuing debt as well as the continued ability to operate mutual funds.
In today’s thoroughly corrupt political environment, totally dominated by corporate money, there is no stigma attached to a bank that effectively admits to being a criminal enterprise. The media pays no attention and the markets couldn’t care less. Shares of most of the banks involved in the settlement spiked on Wednesday. UBS and Barclays both rose 3.4 percent. RBS finished the day up by 1.9 percent.
Wednesday’s settlement is further evidence of the reassertion of the aristocratic principle in contemporary capitalist society: there is one set of laws for the vast majority, the working people, and an entirely different legal framework for the financial oligarchs—one that can be summed up with the phrase “Anything goes.”

Britain’s anti-terror law and the global assault on democratic rights

In the year marking the 800th anniversary of England’s Magna Carta, which asserted that kings could not simply impose their will without oversight and that freemen could not be punished unless they violated the law of the land, Britain’s new Conservative government is preparing a massive assault on civil liberties.
The Tories are set to enact new legislation targeting “extremists” that poses a fundamental threat to political opponents of the government and to the working class. The claim that the new law is aimed simply or primarily at Islamic terrorists is a lie.
Under the legislation’s provisions, the authorities will be able to punish anyone engaged in “harmful” behaviour, ranging from public disorder to threatening the functioning of democracy. Individuals or groups subject to “extremist disruption orders” and “banning orders” will be compelled to submit to the police all material they intend to publish, including on social media. Individuals may also be prohibited from attending public gatherings and speaking at demonstrations or protests.
Prime Minister David Cameron indicated the sweep of the government’s intentions when he proclaimed that Britain has been a “passively tolerant society for too long, saying to our citizens: as long as you obey the law, we will leave you alone.” Freedom from persecution by the state will no longer be guaranteed, even to those who obey the law.
The British government’s proposal is only the latest in a raft of anti-democratic measures adopted internationally in recent months. The new push for police state powers was initiated by the United States.
In a speech at the United Nations last September, President Barack Obama called on Washington’s allies to step up efforts to combat Islamic extremism. The call came shortly after the US initiated its latest war of aggression in the Middle East with the bombardment of Islamic State of Iraq and Syria (ISIS) positions in Iraq and Syria.
In February, Obama hosted a terrorism summit in Washington involving 65 states. The US president remarked in a speech that both domestic and foreign policies had to target not only “terrorists who are killing innocent people,” but also the “ideologies, the infrastructure of extremists—the propagandists, recruiters, the funders who radicalize and recruit or incite people to violence.”
The WSWS warned at the time that such all-embracing formulations could “potentially include virtually anyone who condemns the supposedly ‘moderate’ policies of US imperialism.”
Within weeks of this summit, draconian measures have been imposed in a number of countries.
Last month, the French parliament adopted a new anti-terror law allowing for a vast expansion of state surveillance. Seizing on January’s attack on the offices of the publication Charlie Hebdo, the Socialist Party government of François Hollande included in the law an increase in military and intelligence service positions so as to expand monitoring of the Internet and social networking sites. Intelligence agents will be authorized to read documents of individuals under surveillance and a new database for air travel is to be established.
In Canada, the Conservative government of Stephen Harper is rushing a law through parliament that hands draconian new powers to the intelligence services. The Canadian Security Intelligence Service (CSIS) will now be empowered to actively “disrupt” groups considered to threaten economic or national security or the country’s territorial integrity. The CSIS will be given a green light to violate the Canadian Constitution’s Charter of Rights and Freedoms and break virtually any law in so doing.
As in the case of Cameron’s forthcoming legislation, the definition of national security threats in Canada’s Bill C-51 is so broad and elastic as to cover political opponents of the government’s militarist foreign policy and striking workers.
Bill C-51 also provides for the prosecution of individuals accused of “promoting terrorism,” in line with similar provisions passed in Australia last year targeting free speech.
A significant feature of the new “anti-terrorism” and “anti-extremist” laws is the rapidity with which they are being implemented. Over the course of a few months, legislation has been enacted in France and rammed through Canada’s House of Commons that undermines long-standing democratic norms. The British law is due to be included in the May 27 Queen’s Speech, which outlines the government’s legislative priorities, and will be implemented within months.
There is no serious opposition within the ruling elite internationally to the abandonment of democratic procedures and implementation of police state measures. The Cameron government’s new law was presented at the end of an election campaign in which all of the major parties upheld the right of the intelligence services to continue their mass surveillance of the population, supported Britain’s aggressive militarist foreign policy abroad, and advocated fresh attacks on the social rights of workers at home.
The British Labour Party, during its thirteen years in power, oversaw the strengthening of police powers to detain suspects, codified in the 2001 Terrorism Act, the criminalisation of the “encouragement” and “incitement” of terrorism in 2006, and Britain’s leading role in the operation of a global spying network in alliance with the US National Security Agency (NSA).
The US led the way in the initial assault on democratic rights in the aftermath of 9/11, with the passage of the Patriot Act, the opening of the gulag at Guantanamo, the policies of rendition and torture, and the vast expansion of the NSA’s spying programme.
Edward Snowden’s revelations brought to light the fact that all of the major imperialist powers are complicit in the mass surveillance of their own populations.
The claim that these governments are pursuing a crusade for democracy in the supposed “war on terror” has been exposed as a fraud. This was most recently illustrated by the revelation that the 2011 assassination of Al Qaeda leader Osama bin Laden came after US intelligence had been aware for a year that he was being sheltered by Pakistani intelligence.
In pursuit of their imperialist interests around the globe, the US and its allies have been more than willing to collaborate with the same Islamic extremist groups they cite to justify attacks on democratic rights domestically. In both Libya and Syria, the Western powers backed forces loyal to Al Qaeda, some of which went on to form ISIS. So brazen was the support for jihadist forces in Libya that, as revealed recently by the Ottawa Citizen, Canadian military officials joked in 2011 that NATO warplanes were operating as “Al Qaeda’s air force.”
There are two interconnected reasons for the establishment of the infrastructure for dictatorship. First, as the danger mounts of a global conflagration between the imperialist powers, fears grow within the bourgeoisie of the potential for a mass movement in opposition to militarism.
Second, the same limitless self-enrichment by the financial oligarchy that animates the turn to militarism and colonial-style wars of conquest underlies the assault on jobs, wages and essential services that Cameron has declared the new “age of austerity.”
The prevailing level of social inequality—the ever greater concentration of wealth at the very top of society alongside the impoverishment of the vast majority—is incompatible with democratic forms of rule. The preservation of such a social order demands coercion and state violence.

The American Military Uncontained


It’s 1990. I’m a young captain in the U.S. Air Force.  I’ve just witnessed the fall of the Berlin Wall, something I never thought I’d see, short of a third world war.  Right now I’m witnessing the slow death of the Soviet Union, without the accompanying nuclear Armageddon so many feared.  Still, I’m slightly nervous as my military gears up for an unexpected new campaign, Operation Desert Shield/Storm, to expel Iraqi autocrat Saddam Hussein’s military from Kuwait.  It’s a confusing moment.  After all, the Soviet Union was forever (until it wasn’t) and Saddam had been a stalwart U.S. friend, his country a bulwark against the Iran of the Ayatollahs.  (For anyone who doubts that history, just check out the now-infamous 1983 photo of Donald Rumsfeld, then special envoy for President Reagan, all smiles and shaking hands with Saddam in Baghdad.)  Still, whatever my anxieties, the Soviet Union collapsed without a whimper and the campaign against Saddam’s battle-tested forces proved to be a “cakewalk,” with ground combat over in a mere 100 hours.

Think of it as the trifecta moment: Vietnam syndrome vanquished forever, Saddam’s army destroyed, and the U.S. left standing as the planet’s “sole superpower.”

Post-Desert Storm, the military of which I was a part stood triumphant on a planet that was visibly ours and ours alone.  Washington had won the Cold War.  It had won everything, in fact.  End of story.  Saddam admittedly was still in power in Baghdad, but he had been soundly spanked.  Not a single peer enemy loomed on the horizon.  It seemed as if, in the words of former U.N. ambassador and uber-conservative Jeane Kirkpatrick, the U.S. could return to being a normal country in normal times.

What Kirkpatrick meant was that, with the triumph of freedom movements in Central and Eastern Europe and the rollback of communism, the U.S. military could return to its historical roots, demobilizing after its victory in the Cold War even as a “new world order” was emerging.  But it didn’t happen.  Not by a long shot.  Despite all the happy talk back then about a “new world order,” the U.S. military never gave a serious thought to becoming a “normal” military for normal times.  Instead, for our leaders, both military and civilian, the thought process took quite a different turn.  You might sum up their thinking this way, retrospectively: Why should we demobilize or even downsize significantly or rein in our global ambitions at a moment when we can finally give them full expression?  Why would we want a “peace dividend” when we could leverage our military assets and become a global power the likes of which the world has never seen, one that would put the Romans and the British in the historical shade?  Conservative columnist Charles Krauthammer caught the spirit of the moment in February 2001 when he wrote, “America is no mere international citizen. It is the dominant power in the world, more dominant than any since Rome. Accordingly, America is in a position to reshape norms, alter expectations, and create new realities. How? By unapologetic and implacable demonstrations of will.”

What I didn’t realize back then was: America’s famed “containment policy” vis-à-vis the Soviet Union didn’t just contain that superpower—it contained us, too.  With the Soviet Union gone, the U.S. military was freed from containment.  There was nowhere it couldn’t go and nothing it couldn’t do—or so the top officials of the Bush administration came into power thinking, even before 9/11.  Consider our legacy military bases from the Cold War era that already spanned the globe in an historically unprecedented way.  Built largely to contain the Soviets, they could be repurposed as launching pads for interventions of every sort.  Consider all those weapon systems meant to deter Soviet aggression.  They could be used to project power on a planet seemingly without rivals.

Now was the time to go for broke.  Now was the time to go “all in,” to borrow the title of Paula Broadwell’s fawning biography of her mentor and lover, General David Petraeus.  Under the circumstances, peace dividends were for wimps.  In 1993, Madeleine Albright, then U.S. ambassador to the United Nations and later Bill Clinton’s secretary of state, caught the coming post-Cold War mood of twenty-first-century America perfectly when she challenged Joint Chiefs Chairman Colin Powell angrily over what she considered a too-cautious U.S. approach to the former Yugoslavia. “What’s the point of having this superb military that you’re always talking about,” she asked, “if we can’t use it?”

Yet even as civilian leaders hankered to flex America’s military muscle in unpromising places like Bosnia and Somalia in the 1990s, and Afghanistan, Iraq, Libya, Pakistan, and Yemen in this century, the military itself has remained remarkably mired in Cold War thinking.  If I could transport the 1990 version of me to 2015, here’s one thing that would stun him a quarter-century after the collapse of the Soviet Union: the force structure of the U.S. military has changed remarkably little.  Its nuclear triad of land-based ICBMs, submarine-launched SLBMs, and nuclear-capable bombers remains thoroughly intact.  Indeed, it’s being updated and enhanced at mind-boggling expense (perhaps as high as a trillion dollars over the next three decades).  The U.S. Navy?  Still built around large, super-expensive, and vulnerable aircraft carrier task forces.  The U.S. Air Force?  Still pursuing new, ultra-high-tech strategic bombers and new, wildly expensive fighters and attack aircraft—first the F-22, now the F-35, both supremely disappointing.  The U.S. Army?  Still configured to fight large-scale, conventional battles, a surplus of M-1 Abrams tanks sitting in mothballs just in case they’re needed to plug the Fulda Gap in Germany against a raging Red Army.  Except it’s 2015, not 1990, and no mass of Soviet T-72 tanks remains poised to surge through that gap.

Much of our military today remains structured to meet and defeat a Soviet threat that long ago ceased to exist.  (Occasional sparring matches with Vladimir Putin’s Russia in and around Ukraine do not add up to the heated “rumbles in the jungle” we fought with the Soviet leaders of yesteryear.)  And it’s not just a matter of weaponry.  Our military hierarchy remains wildly and unsustainably top-heavy, with a Cold War-style cupboard of generals and admirals, as if we were still stockpiling brass in case of another world war and a further expansion of what is already uncontestably the largest military on the planet.  If you had asked me in 1990 what the U.S. military would look like in 2015, the one thing I wouldn’t have guessed was that, in its force structure, it would look basically the same.

The persistence of such Cold War structures and the thinking that goes with them is a vivid illustration of military inertia, the plodding last-war conservatism that is a common enough phenomenon in military history.  It’s also a reminder that the military-industrial-congressional complex that President Dwight Eisenhower first warned us about in 1961 remains in expansion mode more than half a century later, with its taste for business as usual (meaning, among other things, wildly expensive weapons systems).  Above all, though, it’s an illustration of something far more disturbing: the failure of democratic America to seize the possibility of a less militarized world.

Today, it’s hard to recapture the heady optimism of 1990, the idea that this country, as after any war, might at least begin to take steps to demobilize, however modestly, to become a more peaceable land.  That’s why 1990 should be considered the high-water mark of the U.S. military.  At that moment, we were poised on the brink of a new normalcy—and then it all began to go wrong.  To understand how, it’s important to see not just what remained the same, but also what began to change and just how we ended up with today’s mutant military. 

Paramilitaries Without, Militaries Within, Civilian Torturers, and Assassins Withal

Put me back again in my slimmer, uniformed 1990 body and catapult me for a second time to 2015.  What do I see in this military moment that surprises me?  Unmanned aerial vehicles, or drones, for sure.  Networked computers everywhere and the reality of a military preparing for “cyberwar.”  Incessant talk of terrorism as America’s chief threat.  A revival, however haltingly, of counterinsurgency operations, or COIN, a phenomenon abandoned in Vietnam with a stake through its heart (or so I thought then).  Uncontrolled and largely unaccountable mass surveillance of civilian society that in the Cold War era would have been a hallmark of the “Evil Empire.”

More than anything, however, what would truly have shocked the 1990 version of me is the almost unimaginable way the military has “privatized” in the twenty-first century.  The presence of paramilitary forces (mercenary companies like DynCorp and the former Blackwater, now joined with Triple Canopy in the Constellis Group) and private corporations like KBR doing typical military tasks like cooking and cleaning (what happened to privates doing KP?), delivering the mail, and mounting guard duty on military bases abroad; an American intelligence system that’s filled to the brim with tens of thousands of private contractors; a new Department of Defense called the Department of Homeland Security (“homeland” being a word I would once have associated, to be blunt, with Nazi Germany) that has also embraced paramilitaries and privatizers of every sort; the rapid rise of a special operations community, by the tens of thousands, that has come to constitute a vast, privileged, highly secretive military caste within the larger armed forces; and, most shocking of all, the public embrace of torture and assassination by America’s civilian leaders—the very kinds of tactics and techniques I associated in 1990 with the evils of communism.

Walking about in such a world in 2015, the 1990-me would truly find himself a stranger in a strange land.  This time-traveling Bill Astore’s befuddlement could, I suspect, be summed up in an impolite sentiment expressed in three letters: WTF?   

Think about it.  In 2015, so many of America’s “trigger-pullers” overseas are no longer, strictly speaking, professional military.  They’re mercenaries, guns for hire, or CIA drone pilots (some on loan from the Air Force), or warrior corporations and intelligence contractors looking to get in on a piece of the action in a war on terror where progress is defined—official denials to the contrary—by body count, by the number of “enemy combatants” killed in drone or other strikes.

Indeed, the very persistence of traditional Cold War structures and postures within the “big” military has helped hide the full-scale emergence of a new and dangerous mutant version of our armed forces.  A bewildering mish-mash of special ops, civilian contractors (both armed and unarmed), and CIA and other intelligence operatives, all plunged into a penumbra of secrecy, all largely hidden from view (even as they’re openly celebrated in various Hollywood action movies), this mutant military is forever clamoring for a greater piece of the action.

While the old-fashioned, uniformed military guards its Cold War turf, preserved like some set of monstrous museum exhibits, the mutant military strives with great success to expand its power across the globe.  Since 9/11, it’s the mutant military that has gotten the lion’s share of the action and much of the adulation—here’s looking at you, SEAL Team 6—along with its ultimate enabler, the civilian commander-in-chief, now acting in essence as America’s assassin-in-chief.

Think of it this way: a quarter-century after the end of the Cold War, the U.S. military is completely uncontained.  Washington’s foreign policies are strikingly military-first ones, and nothing seems to be out of bounds.  Its two major parts, the Cold War-era “big” military, still very much alive and kicking, and the new-era military of special ops, contractors, and paramilitaries, seek to dominate everything.   Nuclear, conventional, unconventional, land, sea, air, space, cyber, you name it: all realms must be mastered.

Except it can’t master the one realm that matters most: itself.  And it can’t find the one thing that such an uncontained military was supposed to guarantee: victory (not in a single place anywhere on Earth).

Loaded with loot and praised to the rafters, America’s uncontained military has no discipline and no direction.  It never has to make truly tough choices, like getting rid of ICBMs or shedding its obscenely bloated top ranks of officers or cancelling redundant weapon systems like the F-35.  It just aims to do it all, just about everywhere.  As Nick Turse reported recently, U.S. special ops touched down in 150 countries between 2011 and 2014.  And the results of all this activity have been remarkably repetitive and should by now be tragically predictable: lots of chaos spread, lots of casualties inflicted, and in every case, mission unaccomplished.

The Future Isn’t What It Used to Be

Say what you will of the Cold War, at least it had an end.  The overriding danger of the current American military moment is that it may lack one.

Once upon a time, the U.S. military was more or less tied to continental defense and limited by strong rivals in its hegemonic designs.  No longer.  Today, it has uncontained ambitions across the globe and even as it continually stumbles in achieving them, whether in Iraq, Afghanistan, Yemen, or elsewhere, its growth is assured, as our leaders trip over one another in continuing to shower it with staggering sums of money and unconditional love.

No military should ever be trusted and no military should ever be left uncontained.  Our nation’s founders knew this lesson.  Five-star general Dwight D. Eisenhower took pains in his farewell address in 1961 to remind us of it again.  How did we as a people come to forget it?  WTF, America?

What I do know is this: Take an uncontained, mutating military, sprinkle it with unconditional love and plenty of dough, and you have a recipe for disaster.  So excuse me for being more than a little nervous about what we’ll all find when America flips the calendar by another quarter-century to the year 2040.

Speculation Pays Billions

The absurdity of the tsunami of money crammed into speculators’ bank accounts is illustrated in the fact that the 25 highest-paid hedge-fund managers vacuumed up a collective $11.6 billion in 2014 — and that was considered to be a bad year for them by the business press. Stratospheric though that total is, it is barely more than half of what the top 25 took in a year earlier.
All together now: Awwww. Yes, somehow these speculators will have to get by on a paltry average of $467 million.
Institutional Investor’s Alpha magazine — one can hear their editors’ teeth gnashing at their heroes’ bitter fate — lamented that 2014 was the worst year since the 2008 stock meltdown for hedge-fund managers in announcing its “Rich List.”
Nonetheless, some observers might assume that these moguls made somebody serious money to collect such enormous paychecks. But that wasn’t necessarily the case. For the sixth consecutive year, hedge funds fell short of the average stock-market performance, returning a composite average of three percent. Perhaps the 25 hedge-fund managers who hauled in the most money for themselves were better? Not really. Alpha reports that the hedge funds of at least 12 of the individuals on its top-25 list posted gains below the 2014 average.
The S&P 500 Index, the broadest measure of U.S. stock markets, gained 11.4 percent in 2014 and the benchmark Dow Jones Industrial Average gained 7.5 percent. So somebody throwing darts, or parking their money in a passive fund that tracks a major index, would have done as well or better in many cases. Despite their subpar performances, hedge-fund managers continue to receive an annual fee of two percent of the value of the total assets under management and 20 percent of any profits. The fee gets paid even when the fund loses money.
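To make that fee arrangement concrete, here is a minimal sketch of the standard “2 and 20” structure just described; the fund size and yearly returns below are hypothetical, chosen only for illustration:

# Minimal sketch of the "2 and 20" hedge-fund fee structure described above.
# The fund size and yearly returns are hypothetical, for illustration only.
def hedge_fund_fees(assets, annual_return):
    """Return (management_fee, performance_fee) for one year."""
    management_fee = 0.02 * assets                        # 2% of assets, win or lose
    profit = assets * annual_return
    performance_fee = 0.20 * profit if profit > 0 else 0  # 20% of gains only
    return management_fee, performance_fee

assets = 10_000_000_000  # a hypothetical $10 billion under management
for yearly_return in (0.03, -0.05):  # a 3% year (the 2014 composite) and a losing year
    mgmt, perf = hedge_fund_fees(assets, yearly_return)
    print(f"return {yearly_return:+.0%}: management ${mgmt:,.0f}, performance ${perf:,.0f}")

Even in the losing year, the management fee on the hypothetical fund comes to $200 million, which is the point of the paragraph above: the fee gets paid regardless of performance.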
So it’s heads, Wall Street wins and tails, Wall Street wins. And hedge funders pay less in taxes. Much of their income is classified as capital gains under U.S. tax law, and the tax rate on capital gains is much lower than the rate on regular income.
Imposing austerity on others is a job never finished
What is it that hedge-fund managers do to “earn” such enormous sums of money? Let us take a look. The top person on the 2014 list is Kenneth Griffin of Citadel Capital, who hauled in $1.3 billion for the year. Citadel makes lots of money through computerized high-speed trading — buying and selling securities in microseconds to take advantage of momentary price changes. Apparently allowing computers to do the work leaves Mr. Griffin with time to pursue his hobby of widening inequality still more.
Not content with the fact that his 2014 earnings are equal to the combined median wage of 26,000 U.S. workers, he contributed $10 million to an Illinois campaign that seeks to cut workers’-compensation benefits, make it illegal for employees to contribute to political campaigns through their union, abolish prevailing-wage laws and render union dues collections much more difficult. He’s also contributed millions to the Koch brothers’ war chest. Mr. Griffin’s firm also owns a stake in ServiceMaster, a company that profits from the privatization of public services by firing employees and rehiring them at lower wages.
A Huffington Post article, noting that Mr. Griffin is also a major donor to Chicago Mayor Rahm Emanuel, nonetheless reports that he believes Mayor 1% is too soft on public employees despite the mayor’s attacks on pensions and teachers. The article said:
“Griffin, alone, could fund all of Chicago’s pension liabilities for [2014] (estimated at $692 million) and still have $208 million [from his 2013 income] left to scrape by on. Yet Griffin is terribly worried that the mayor is being too soft on retirees. He castigated Chicago and Illinois politicians for not making ‘tough choices,’ blaming Democrats who control city, county and state government for not fixing pension, education and crime problems.”
Second on the hedge-fund list is James Simons of Renaissance Technologies. Although Alpha reported that he no longer runs his firm on a day-to-day basis and “spends a good chunk of the year on his 226-foot yacht,” Mr. Simons hauled in $1.2 billion in 2014. His firm employs physicists, other scientists, and mathematicians to develop models for its computerized trading. Alas, speculation pays much more than scientific research that might benefit humanity.
Buy, strip, profit, repeat
Third on the list is Raymond Dalio of Bridgewater Associates, who took in $1.1 billion in 2014. He specializes in bond and currency speculation. Fourth on the list is William Ackman of Pershing Square Capital Management, who is what the corporate media likes to call an “activist investor.” In other words, someone who buys stock in a company and immediately demands massive cuts so he can make a large short-term profit is an “activist investor” because he does this more loudly than others.
Mr. Ackman hauled in $950 million in 2014. Forbes magazine, as consistent a cheerleader for the corporate overclass as any institution, summed him up this way last year:
“[H]edge fund billionaire William Ackman has tried to destroy a company that sells diet shakes, played a prominent role in nearly driving a 112-year-old retailer into the ground [and] helped launch a hostile takeover of a pharmaceutical company in a way that the Securities & Exchange Commission is reportedly examining for potential violations of insider trading law. Now, Ackman is suing the U.S. government.”
He is suing the U.S. government because it is taking the profits from federal housing-loan programs Fannie Mae and Freddie Mac to recoup money used to bail them out rather than handing the profits over to speculators such as himself. Never mind that the government spent hundreds of billions of dollars bailing out speculators. Among his most recent exploits, he was involved in two separate deals that would have moved a U.S. corporation’s headquarters to Canada so that it could avoid paying taxes, savings that would be earmarked for speculators’ wallets.
No summation of hedge-fund greed would be complete without a mention of Paul Singer, another entrant on the rich list. The vulture capitalist specializes in buying debt at pennies on the dollar and then demanding to be paid the full face value, regardless of human cost. Among other exploits, he has seized an Argentine naval ship, demanded $400 million from the Republic of the Congo for bonds he bought for less than $10 million and compelled the government of Peru to pay him a 400 percent profit on the debt of two banks he bought four years earlier.
The outsized remuneration of financiers is due to the disproportionate size of the financial industry. A rough calculation estimates that in 11 business days speculators trade instruments and contracts with a value greater than all the products and services produced by the entire world in one year. In other words, a year’s worth of gross world product is traded in about two weeks on the world’s stock, bond, derivative, futures and foreign-exchange markets.
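Reading that rough claim backward gives a sense of the scale involved; assuming a standard trading year of roughly 250 business days (an assumption of this sketch, not a figure from the report), the implied ratio works out as follows:

# Reading the rough claim above backward: if a year's worth of gross world
# product changes hands in about 11 business days of trading, then annual
# trading volume is roughly 250/11 times gross world product.
# The 250-business-day year is an assumption of this sketch.
BUSINESS_DAYS_PER_YEAR = 250
DAYS_TO_TRADE_ONE_GWP = 11
implied_ratio = BUSINESS_DAYS_PER_YEAR / DAYS_TO_TRADE_ONE_GWP
print(f"implied annual trading volume: about {implied_ratio:.0f}x gross world product")

That is, the markets in question turn over something on the order of twenty-odd times the world’s annual output every year.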
Such frenzied trading, often involving high-speed computers and ever more exotic betting, has little to do with actual economic needs and much to do with extracting money by ever more imaginative means. Such is a system that values financial engineering more than human life.