Friday, June 17, 2011

Why Texas is a Complete Basket-Case

 (So Perry and the GOP are proud of the fact that his policies of tax breaks and subsidies for the corporate cronies who fund his campaign have eliminated well-paying jobs and replaced them with more minimum-wage service jobs like fast-food clerks, dishwashers, busboys, and 7-11 clerks? Anyone with a good job who believes in him will be mopping up puddles of puke at an underfunded elementary school within a year. There are not enough jobs in Texas that pay what a person needs to cover a mortgage and raise a family. The Miracle is a myth--or rather a big pile of lying bullshit. And Perry, the fraud, would be a worse president than GW Bush was. He'll never get elected.--jef)
 +++

Conservatives claim the "Texas Miracle" is a model for the nation, but it's actually a blueprint for winning the race to the bottom.
By Joshua Holland, AlterNet
Posted on June 17, 2011

Conservative mythology now holds up Texas as a shining example of right-wing governance in action. Republicans would have us believe that gutting the state's social safety net, denying workers the right to bargain collectively and relentlessly cutting taxes unleashed a torrent of “job creation” and, ultimately, prosperity.

During Governor Rick “Goodhair” Perry's term in office, Texas has indeed been a model of conservative governance, but the truth is that this has resulted in anything but prosperity for the people of the Lone Star State. In fact, Texas is not only a complete basket-case; it would be faring far worse today without the help of policies enacted by Democrats at the federal level – policies Perry lambasted as “irresponsible spending that threatens our future.”

The kernel of truth on which the tale of the Texas Miracle is built is that the state has in fact added a lot of jobs over the past decade. In a gushing lead editorial, the Wall Street Journal noted that “37% of all net new American jobs since the recovery began were created in Texas.” The Journal then spun that fact like this:
Capital—both human and investment—is highly mobile, and it migrates all the time to the places where the opportunities are larger and the burdens are lower. Texas has no state income tax. Its regulatory conditions are contained and flexible. It is fiscally responsible and government is small. Its right-to-work law doesn't impose unions on businesses or employees. (Yeah, uh, Texas is also an "at-will employment" state, which means a company can fire an employee without giving a reason or warning first.--jef)
In the Journal's hyper-partisan view, the lesson to be learned is that “the core impulse of Obamanomics is to make America less like Texas and more like California, with more government, more unions, more central planning, higher taxes.” That spin was echoed during last week's GOP debate by none other than Newt Gingrich, who asked, “Why [would] you want to be at California's unemployment level when you can be [at] Texas's employment level?”

James Galbraith, an economist at the University of Texas, scoffed at the whole narrative, telling AlterNet, “the notion that our state government is a model is almost enough to beckon the spirit of Molly Ivins back from the shades.” Galbraith said “Texas has been a low-tax, low-service state since the time of the Republic,” and noted that it is “therefore impossible that this fact suddenly accounts for its better job performance over the past few years.”

“Texas is an energy state benefiting from high oil prices and the incipient boom in natural gas,” explained Galbraith. “That's an accident of nature.” He added that the state “went through the S&L crisis, had major criminal prosecutions and more restrictive housing finance regulations this time around; hence it was not an epicenter of the subprime housing disaster. That's called a learning experience.” Tighter regulation of the lending industry is also anathema to today's GOP. (Texas may, however, be doomed to become the epicenter of the coming commercial real estate crash. Houston and Dallas could be in serious trouble once that market slowly craters as hundreds of billions of dollars in bad loans mature.--jef)

Arguably the biggest sleight-of-hand in the Texas Miracle storyline, however, is that many of those jobs were a result of a huge surge in the state's population, much of it fueled by immigration from Latin America (rather than liberal hell-holes like California).

Texas' population grew by 20% over the past decade, and Hispanics accounted for almost two-thirds of that growth. A surge in people created greater demand for goods and services, which in turn created more jobs. But the jobs being created in Texas aren't keeping up with the state's expanding workforce – the Wall Street Journal somehow failed to mention that during the exact same period in which the state was adding all those new jobs, Texas' unemployment rate actually increased from 7.7% to 8% (it also failed to note that 23 states -- including such deep blue ones as Vermont, New York and Massachusetts -- enjoy lower unemployment rates than Texas).

But perhaps the most laughable claim in this whole narrative is that Texas has been “fiscally responsible.” Perry certainly adhered to the conservative playbook, offering massive tax breaks without the deep cuts in services that might inspire a voter backlash. As a result – an entirely predictable one – the Austin American-Statesman reported that “state lawmakers have spent much of the year grappling with a budget shortfall that left them $27 billion short of the money needed to continue current state services.”

CNN adds that while Perry was railing against the Democratic stimulus package passed over the fierce resistance of conservatives, the state “was facing a $6.6 billion shortfall for its 2010-2011 fiscal years,” and “it plugged nearly all of that deficit with $6.4 billion in Recovery Act money.” The stimulus package created or saved 205,000 jobs in Texas, second only to California. But as James Galbraith told AlterNet, while “the state budget has not yet been cut drastically” due to the stimulus boost, “the key phrase is 'not yet.'” Now that the stimulus has run its course, “if projections for the current budget cycle are correct, things will get much worse in the next year.”

Indeed, those cuts are now on their way. The Texas legislature imposed draconian cuts to Medicaid, cut tuition aid to 43,000 low-income students and is weighing $10 billion in cuts to the state's education system. According to Texas state senator Rodney Ellis, D-Fort Bend, the 2012-2013 budget will underfund “health and human services in Texas by $23 billion, 29.8% below what is needed to maintain current services.”

But Perry's tax breaks are indeed part of the state's jobs picture; as Time magazine's Massimo Calabresi noted, Perry established several massive business tax breaks “designed to lure companies from other states.”
[But] the funds have been controversial. They have channeled millions of dollars to companies whose officers or investors are major Perry campaign donors and Perry has allowed them to keep their subsidies in many cases even when they fail to deliver promised jobs. More important for the purposes of judging Perry’s job-creating record, even those that do produce jobs don’t necessarily create long-lasting ones, or increase the state’s overall prosperity.
In a report written for Perry last spring, Michael Porter of Harvard Business School noted that such tax breaks “ultimately don’t support long-term prosperity,” because companies that can move easily “are looking for the best deal and when the deal runs out they move” again, taking their jobs with them.

He also found that Texas’ per capita income growth was the eighth slowest of any state in the country between 1998 and 2008. That's because, as the American Independent's Patrick Brendel noted, “Texas has by far the largest number of employees working at or below the federal minimum wage,” and the number of crappy jobs has exploded while this supposed Texas Miracle was taking place. “From 2007 to 2010, the number of minimum wage workers in Texas rose from 221,000 to 550,000, an increase of nearly 150%,” wrote Brendel. As a result, Texas is now “tied with Mississippi for the greatest percentage of minimum wage workers, while California had among the fewest (less than 2%).”*

At a fundraiser this week, Rick Perry, who despite toying with the idea of secession in the past may now be eying a White House bid, told a group of Republican fat-cats that in his state, “you don't have to use your imagination, saying, 'What'll happen if we apply this or that conservative principle?' You just need to look around, because they've been in play across our state for years, generating real results.”

In this, Perry is absolutely, 100% correct. He slashed taxes to the bone, handing out credits to his political cronies like they were candy. He decried the evils of Big Government while hypocritically using federal stimulus funds to help close Texas' budget gap in the short term, and now he's using the state's longer-term fiscal disaster – one of his own creation – as a premise for destroying an already threadbare social safety net serving the neediest Texans. As a result of these policies, plus immigration and other external factors, his state has added a lot of low-paying poverty jobs without decent benefits. He's added very little in the way of “prosperity.”

In the final analysis, Texas is indeed a shining example of conservative governance, as well as an almost perfect model for winning the race to the bottom.

* It should be noted that the cost of living is higher in California than in Texas.

The Merger of Military, Law Enforcement and Public Safety

America's Judge Dredd Moment
By JOHN STANTON

According to the US Department of Justice, “Defending national security from both internal and external threats remains the [Department’s] highest priority. National security threats are constantly evolving and adapting, requiring additional resources to address new critical areas. Increasing global access to technological advancements results in new vulnerabilities that must be addressed…. technological and human capital [is needed] to detect, disrupt, and deter threats to our national security.” The DOJ’s FY 2012 Budget requested a total of $128.6 million and 170 positions in program increases that provide essential technological and human capital to detect, disrupt, and deter threats to our national security.

Wait! Isn’t that the Pentagon’s gig?

Secretary of Defense Leon Panetta expressed this view on the new national security planning and operating environment. “In the post-September 11th era, there have been significant benefits due to increased unity of effort and interagency cooperation. Civilian-military collaboration has improved, and our military commanders expect to operate in a coordinated and joint, multi-service environment. Diplomats, development experts, intelligence analysts, and law enforcement must work together in today’s complex operations.”

Does Panetta mean all levels of law enforcement in the USA?

According to the National Military Strategy of the United States, 2011, “There are no more vital interests than the security of the American people, our territory, and our way of life… Military power complements economic development, governance, and rule of law – the true bedrocks of counterterrorism efforts…We will defend the homeland and play a critical role in supporting homeland security…”

Isn’t the rule of law determined and enforced by civilians? Doesn’t he mean supports rather than “complements”?

The Department of Justice describes law enforcement as “the prevention, detection, and investigation of crime and the apprehension and detention of individuals suspected of law violation.”

Not Civil, Not Military: Judge Dredd

The fact is that the uniformed services of the USA are not the premier caretakers of the American people’s security or freedom. That is a myth. National security has become a team effort, and it’s time to recognize that. The USA has gone completely paramilitary in every aspect of its local, state, national and international life. America has transmogrified into the movie character Judge Dredd, played by Sly Stallone.

But what happens to a democratic republic, say the USA, when military and civilian law enforcement/public safety functions merge and become indistinguishable?

No one has a definitive answer to that question.

Most say that the civil-military divide is clear cut, a Rubicon that will never be crossed by the executive branch, the military, the defense industrial base and its many civilian employees, and uncritical supporters in the US Congress.

Admiral Mike Mullen, the outgoing Chairman of the Joint Chiefs of Staff, has said that civilian rule in the USA will be maintained at all costs. In fact, his National Military Strategy of the USA places emphasis again and again on the matter of civilian control of the military.

That’s good, of course, but why the reminder now in 2011? After all, Americans know that the US Constitution makes the military subordinate to civilians.

And yet determining who is in charge of, and responsible for, the nation’s security, public safety and law enforcement has become more difficult over the last decade. In fact, the US military and US law enforcement operate with the same end game in mind: preempt, deter, neutralize, and remediate. Or, in US COIN parlance: clear, hold, and build (CHB). CHB applies to a drug-infested neighborhood in Washington, DC, just as it does to a community in Afghanistan.

The special operations units in the US military have much in common with their law enforcement/public safety counterparts. As far as the end game of an operation is concerned, SWAT and SEAL teams operate on similar kill-pacify chains and “jointness of operations”.

Perhaps it is time for a grand restructure of the national security machinery.

If it is to be the case that military and law enforcement/public safety operate under the same strategies, operations and tactics, should domestic and global law enforcement/public safety functions be placed under some sort of civil-military geographic combatant command based on legislation similar to the Goldwater-Nichols model?

Should the Pentagon, Department of Justice, Central Intelligence Agency, Department of State and Homeland Security, etc., be merged into some sort of National Security Department? Would centralization of the national security function alleviate interagency madness?

Can the President and Secretary of Defense continue to maintain direct control over the expanding national security machinery? This question speaks not to a military coup, but instead to how two human beings—and their Cold War era staffs—can make the best decisions for the USA in the face of waves of near-instantaneous information and events that flood national security decision-making centers. Should the role of the largely national-security-illiterate members of the US House and Senate be changed?

And who are the American people supposed to honor and salute—sometimes deify--each day? Is it the brave men and women of the US military stationed at an outpost in Afghanistan? The contractor who serves the food at CENTCOM HQ or the security contractor—from Peru, say—that guards a US State Department official? Is it the courageous Predator pilot sitting in an air conditioned trailer thousands of miles away from his/her target package? How about the uniformed “bean counters” in the Pentagon?

Or is it the law enforcement/public safety personnel in Detroit, Washington, DC, or Houston who should be thanked for their service? Or the FBI agent shot to death while serving an arrest warrant? How about the guards at a state prison who detect, deter and disrupt plans for an inmate rebellion?

According to the Pew Center, “Second only to Medicaid, corrections has become the fastest growing general fund expenditure in the United States. Two million three hundred thousand people in the U.S. are now in prison or jail—more than one in 100 adults. On any given day 7.3 million adults are under federal, state, or local correctional control (including those on probation, parole, and other forms of supervision)—one in 31 adults. In FY2008, the most recent year data are available, states spent an estimated $47 billion of general funds on corrections, an increase of 303 percent since 1988. They spent an additional $4 billion in special funds and bonds and $900 million in federal funds, bringing total corrections expenditures to nearly $52 billion.” And this excludes the scores of US military/intelligence clandestine prisons located around the globe.

It has all gotten very confusing.

Kill ‘em All or Lock ‘em All Up

Arguably, the security of the American people is primarily the responsibility of civilian law enforcement and public safety personnel. An F-16 pilot or SOF operator in Libya, Somalia, Mexico or Guatemala is of no consequence to Americans living in high crime areas who want to get to school or work without being murdered, raped, assaulted or robbed.

The latest uniform crime statistics for the period January to December 2010 published by the Department of Justice show that:
  •   14,627 Americans were murdered
  •   53,481 were forcibly raped
  •   442,932 were robbed
  •   649,650 were brutally assaulted
The Virginia Association of Police Chiefs reports that 56 law enforcement officers were “feloniously killed” in 2010.

The neighborhood fire station is more relevant to security in a local US community than a forward operating base or outpost in Afghanistan or Iraq. It's a good thing, too, as the Department of Justice reports that there were 40,070 cases of arson from January to December 2010. That excludes the daily calls that firefighters must make to subdue other incendiary events. According to the US Fire Administration, “an estimated 103,000 multifamily residential building fires are reported each year and cause an estimated 390 deaths, 4,050 injuries, and $1.2 billion in property loss.” Over the first five months of 2011, 33 firefighters lost their lives in the line of duty, according to the US Fire Administration.

The national security consciousness is now preeminent in the minds of Americans. Everything from the seeds used to plant corn for Kellogg’s Corn Flakes to rare earth minerals is a matter of national security, of survival (agroterrorism). Americans now show a dangerous disregard for the sovereignty of other nations, terminating--21st century gangland style--any real or perceived enemy of the state.

The USA has planted the seeds of reprisal. American Homeland watch out!

There is only one other nation with a similar psyche, and that is Israel. That nation provides a very useful comparative template, a sort of mirror in which to look. There is much to be admired about Israel and its people, but adopting their narrow security-minded view of the world is absolutely wrong for the USA. How has it come to pass that an Israeli official speaking before members of the US Congress receives more applause than the President of the United States of America delivering a State of the Union Address? Is there a clearer message of America’s move away from itself and its ideals?

Selling Clinical Depression

Adding New Spin and Urgency to Depression Drug Sales
By MARTHA ROSENBERG


The discovery that many people with life problems or occasional bad moods would willingly dose themselves with antidepressants sailed the drug industry through the 2000s. A good chunk of the $4.5 billion a year spent on direct-to-consumer advertising has been devoted to convincing people they don't have problems with their job, the economy and their family; they have depression. The pitch is all the easier because depression can't be diagnosed from a blood test.

Unfortunately for the drug industry, three things dried up the depression gravy train: blockbusters went off patent and generics took off; antidepressants were linked with gory and unpredictable violence, especially in young users; and -- according to medical articles -- they didn't even work!

That's when the drug industry began debuting the concept of "treatment resistant depression." It wasn't that their drugs didn't work (or you didn't have depression in the first place), you had "treatment resistant depression." Your first expensive and dangerous drug needed to be coupled with more expensive and dangerous drugs because monotherapy, one drug alone, wasn't doing the trick!

You've got to admire the drug industry's audacity with this upsell strategy. Adding drugs to your treatment-resistant depression triples the industry's take, patients don't know which drug is working so they'll take all of them, and the defective drugs are exonerated! (Because the problem is you.)

Now the drug industry has a new whisper campaign to keep the antidepressant boat afloat. Your depression is "progressive."

Once upon a time, when depression was neither seasonal, atypical, bipolar nor treatment resistant, it was considered a self-limiting disease. In fact, just about the only good thing you could say about depression was that it wouldn't last forever.

But now, the drug industry is giving depression the don't-wait scare treatment like coronary events (statins), asthma attacks ("controller" drugs) and thinning bones (Sally Field). If you don't hurry and take medication, your depression will get worse!

"Depressive episodes become more easily triggered over time," floats an article on the physician Web site Medscape (flanked by ads for the antidepressant Pristiq). "As the number of major depressive episodes increases, the risk for subsequent episodes is predicted more from the number of prior episodes and less from the occurrence of a recent life stress." The article, unabashedly titled "Neurobiology of Depression: Major Depressive Disorder as a Progressive Illness," is written by Vladimir Maletic, who happens to have served on Eli Lilly's Speaker's Bureau, says the disclosure information, and whose co-authors are all employees and/or shareholders of Lilly.

On WebMD, a sister site to Medscape, the depression sell is even less subtle. An article called "Recognizing the Symptoms of Depression," smothered with five ads for the Eli Lilly antidepressant Cymbalta, submits, "Most of us know about the emotional symptoms of depression. But you may not know that depression can be associated with many physical symptoms, too."

Depression may masquerade as headaches, insomnia, fatigue, backache, dizziness, lightheadedness or appetite problems, mongers the article. "You might feel queasy or nauseous. You might have diarrhea or become chronically constipated." And here, you thought it was something you ate!

The danger with these symptoms, says the article, is that you might fail to diagnose yourself as suffering from a psychiatric problem and buy an over-the-counter drug like a normal person. "Because these symptoms occur with many conditions, many depressed people never get help, because they don't know that their physical symptoms might be caused by depression. A lot of doctors miss the symptoms, too."

But when headaches and backaches aren't labeled as depression, the drug industry makes no money and insurance rates could stop climbing from over-treatment with unnecessary, expensive and dangerous psychoactive drugs!

To prevent such goring of market share, the article (whose content was "selected and controlled by WebMD's editorial staff and is funded by Lilly USA," an original WebMD financial partner according to the Washington Post) counsels worry about physical symptoms. "Don't assume they'll go away on their own." Symptoms may "need additional treatment" and "some antidepressants, such as Cymbalta and Effexor, may help with chronic pain, too."

Before direct-to-consumer advertising, the health care system was devoted to preventing over-treatment and assuring patients they were probably okay. Who remembers "Take two aspirin and call me in the morning"? Now patients are assured they probably aren't okay but probably have a progressive disease. Luckily their disease can be treated with progressive prescriptions from pharma.

Too Big to Fail, Too Conflicted to Govern

Shameless on the Hill
By RUSSELL MOKHIBER

At a hearing on Capitol Hill this week on "too big to fail" banks, both corporate parties were posturing.

Strutting their stuff.

Ripping into each other.

But the reality?

When push came to shove, neither had the guts to do the right thing to prevent another bailout.

That would be – limit the size of the big banks so that they are no longer "too big to fail."

"Too big to fail" means exactly that.

The banks are too big to fail.

If they fail, we must bail them out.

Or the economy goes down in a spectacular flameout.

The six biggest banks in America?

Wells Fargo.

Citibank.

Bank of America.

JP Morgan Chase.

Morgan Stanley.

Goldman Sachs.

Together, they control assets equal to about 65 percent of GDP.

Twenty years ago, that number was about 15 percent of GDP.

The hearing yesterday was held by a House Financial Services Committee subcommittee chaired by Congresswoman Shelley Moore Capito (R-West Virginia).

Before the hearing got started, Public Citizen was passing out a letter calling on Capito to formally disclose that her husband now works for one of those too big to fail banks – Wells Fargo.

She has so far refused to do so.

But the conflicts on the committee on both sides of the aisle are deeper than the horse crap at the back of any barn in Capito's Second Congressional District of West Virginia.

True, the Republican side is marinated in Wall Street cash.

Just as an example, Capito's number one career contributor is another one of the too big to fail banks – Citibank.

But the Democrats are marinated in Wall Street cash too.

And therefore the hypocrisy on the Democratic side is as deep – maybe deeper – than the conflicts.

The best-in-show winner for posturing was Congressman Luis Gutierrez (D-Illinois).

Gutierrez used his five minutes to rip into the Republicans for being the party of Wall Street.

He ended pointing at his Republican colleagues and saying – "You should just tell people you're for big banks and make it clear and simple."

Then he got up and left.

Didn't want to contemplate the hypocrisy of it all.

But Congressman Brad Miller (D-North Carolina) was shameless.

Miller understands – as does almost everyone on the committee – that the way you deal with the problem of too big to fail banks is to limit their size so that they are no longer too big to fail.

There actually was a vote on the Senate side in 2010 on an amendment – the Brown-Kaufman amendment – that would have done the trick.

Miller introduced a similar amendment in the House.

Miller put it this way:

"Senator Kaufman introduced an amendment on the Senate side that failed, that would [have] limited the overall size of those -- of banks to 2 percent of the GDP. That's still like a $300 billion company. That's a pretty big -- pretty big bank, big enough to do pretty much anything, but it would have required that the six biggest firms be broken up into more than 30 banks."

"No Republican support for that law," Miller said. "I introduced the idea on the House side, but the fight was really over on the Senate side."

What is Congressman Miller not telling us?

He is not telling us that the Brown-Kaufman amendment was defeated by President Obama and his Treasury Secretary, Tim Geithner.

Neil Barofsky, who was the Special Inspector General for the TARP, told Corporate Crime Reporter last week that the Brown-Kaufman amendment would have passed had the Obama administration gotten behind it.

Instead, Treasury Secretary Geithner lobbied against the bill.

"The reason it didn't pass was because the Treasury Secretary lobbied individual Senators to convince them to vote against this bill," Barofsky said.

And what is the result of that vote?

"The largest banks are now 20 percent larger today than they were going into the crisis," Barofsky said. "They are systemically more significant, they are bigger, they are more important. And we just haven't seen the political or regulatory will to take on the fundamental problems that are presented by these institutions."

"Standard & Poor's recently put the U.S. government's credit rating on watch. And one of the things they talked about was the contingent liability to support our financial institutions. And they estimated that the up-front costs of another bailout could be up to $5 trillion."

"And when you think about the focus on our budget issues, our deficit and our debt – what happens with the next crisis and we have to come up with another $5 trillion to bail out our system once again?"

"It's a terrifying concept. One of TARP's biggest legacies is that it emphasized to the market that the government would not let these largest banks fail. And we haven't done anything to address this problem. So, we are going to be right back where we were in late 2008 – if not in a worse position."

When Only "Crazies" See the Bank Giveaway for What It Was

By MICHAEL HUDSON - Counterpunch

Financial crashes were well understood for a hundred years after they became a normal financial phenomenon in the mid-19th century. Much like the buildup of plaque deposits in human veins and arteries, an accumulation of debt gained momentum exponentially until the economy crashed, wiping out bad debts – along with savings on the other side of the balance sheet. Physical property remained intact, although much was transferred from debtors to creditors. But clearing away the debt overhead from the economy’s circulatory system freed it to resume its upswing. 

That was the positive role of crashes: They minimized the cost of debt service, bringing prices and income back in line with actual “real” costs of production. Debt claims were replaced by equity ownership. Housing prices were lower – and more affordable, being brought back in line with their actual rental value. Goods and services no longer had to incorporate the debt charges that the financial upswing had built into the system.

Financial crashes came suddenly. They often were triggered by a crop failure causing farmers to default, or by “the autumnal drain,” which drew down bank liquidity when funds were needed to move the crops. Crashes often also revealed large financial frauds and “excesses.”

This was not really a “cycle.”  It was an ascending curve, ending in a vertical plunge. But popular terminology called it a cycle because the pattern was similar again and again, every eleven years or so. When loans by banks and debt claims by other creditors could not be paid, they were wiped out in a convulsion of bankruptcy. 

Gradually, as the financial system became more “elastic,” each business recovery started from a larger debt overhead relative to output. The United States emerged from World War II relatively debt free. Downturns occurred, crashes wiped out debts and savings, but each recovery since 1945 has taken place with a higher debt overhead. Bank loans and bonds have replaced stocks, as more stocks have been retired in leveraged buyouts (LBOs) and buyback plans (to keep stock prices high and thus give more munificent rewards to managers via the stock options they give themselves) than are being issued to raise new equity capital.

But after the stock market’s dot-com crash of 2000 and the Federal Reserve’s flooding of the U.S. economy with credit after September 11, 2001, there was so much “free spending money” that many economists believed the era of scientific money management had arrived and the financial cycle had ended. Growth could occur smoothly – with no over-optimism as to debt, no inability to pay, no proliferation of over-valuation or fraud. This was the era in which Alan Greenspan was applauded as Maestro for ostensibly creating a risk-free environment by removing government regulators from the financial oversight agencies.

What has made the post-2008 crash most remarkable is not merely the delusion that the way to get rich is by debt leverage (unless you are a banker, that is). What is unique is the crash’s aftermath. This time around the bad debts have not been wiped off the books. There have indeed been the usual bankruptcies – but the bad lenders and speculators are being saved from loss by the government intervening to issue Treasury bonds to pay them off out of future tax revenues or new money creation. The Obama Administration’s Wall Street managers have kept the debt overhead in place – toxic mortgage debt, junk bonds, and most seriously, the novel web of collateralized debt obligations (CDOs), credit default swaps (almost monopolized by A.I.G.) and kindred financial derivatives of a basically mathematical character that developed in the 1990s and early 2000s.

These computerized casino cross-bets among the world’s leading financial institutions are the largest problem. Instead of this network of reciprocal claims being let go, they have been taken onto the government’s own balance sheet. This has occurred not only in the United States but even more disastrously in Ireland, shifting the obligation to pay – on what were basically gambles rather than loans – from the financial institutions that had lost on these bets (or simply held fraudulently inflated loans) onto the government (“taxpayers”). 

The U.S. government took over the mortgage lending guarantors Fannie Mae and Freddie Mac (privatizing the profits, “socializing” the losses) for $5.3 trillion – almost as much as the entire national debt. The Treasury lent $700 billion under the Troubled Asset Relief Program (TARP) to Wall Street’s largest banks and brokerage houses. The latter re-incorporated themselves as “banks” to get Federal Reserve handouts and access to the Fed’s $2 trillion in “cash for trash” swaps, crediting Wall Street with Fed deposits for otherwise “illiquid” loans and securities (the euphemism for toxic, fraudulent or otherwise insolvent and unmarketable debt instruments) – at “cost,” based on fully mark-to-model fictitious valuations.

Altogether, the post-2008 crash saw some $13 trillion in such obligations transferred onto the government’s balance sheet from high finance, euphemized as “the private sector” as if it were the core economy itself, rather than its calcifying shell. Instead of losing on their bad bets, bad loans, toxic mortgages and outright fraudulent claims, the financial institutions cleaned up, at public expense. They collected enough to create a new century’s power elite to lord it over “taxpayers” in industry, agriculture and commerce who will be charged to pay off this debt.

If there was a silver lining to all this, it has been to demonstrate that if the Treasury and Federal Reserve can create $13 trillion of public obligations – money – electronically on computer keyboards, there really is no Social Security problem at all, no Medicare shortfall, no inability of the American government to rebuild the nation’s infrastructure.

The bailout of Wall Street showed how central banks can create money, as Modern Money Theory (MMT) explains. But rather than explaining how this phenomenon worked, the bailout was rammed through Congress under emergency conditions. Bankers threatened economic Armageddon if the government did not create the credit to save them from taking losses. 

Even more remarkable is the attempt to convince the population that the new money and debt created to bail out Wall Street – and vest a new century of financial billionaires at public subsidy – cannot be mobilized just as readily to save labor and industry in the “real” economy. The Republicans and the Obama administration appointees held over from the Bush and Clinton administrations have joined to conjure up scare stories that Social Security and Medicare debts cannot be paid, although the government can quickly and with little debate take responsibility for paying trillions of dollars of bipartisan Finance-Care for the rich and their heirs.

The result is a financial schizophrenia extending across the political spectrum from the Tea Party to Tim Geithner at the Treasury and Ben Bernanke at the Fed. It seems bizarre that the most reasonable explanation of why the 2008 bank crisis did not require a vast public subsidy for Wall Street came at Monday’s Republican presidential debate on June 13 from none other than Congressional Tea Party leader Michele Bachmann – who had boasted in a Wall Street Journal interview two days earlier, on Saturday, that she:
“voted against the Troubled Asset Relief Program (TARP) ‘both times.’… She complains that no one bothered to ask about the constitutionality of these extraordinary interventions into the financial markets. ‘During a recent hearing I asked Secretary [Timothy] Geithner three times where the constitution authorized the Treasury's actions [giving the Treasury a $700 billion blank check], and his response was, “Well, Congress passed the law.” …With TARP, the government blew through the Constitutional stop sign and decided ‘Whatever it takes, that's what we're going to do.’”
Clarifying her position regarding her willingness to see the banks fail, Bachmann explained:
“I would have. People think when you have a, quote, ‘bank failure,’ that that is the end of the bank. And it isn't necessarily. A normal way that the American free market system has worked is that we have a process of unwinding. It’s called bankruptcy. It doesn't mean, necessarily, that the industry is eclipsed or that it's gone. Often times, the phoenix rises out of the ashes.”
There were easily enough sound loans and assets in the banks to cover deposits insured by the FDIC – but not enough to pay their counterparties in the “casino capitalist” category of their transactions. This super-computerized financial horseracing is what the bailout was about, not bread-and-butter retail and business banking or insurance.

It all seems reminiscent of the 1968 presidential campaign. The economic discussion back then between Democrat Hubert Humphrey and Republican Richard Nixon was so tepid that it prompted journalist Eric Hoffer to ask why only a southern cracker, third-party candidate Alabama Governor George Wallace, was talking about the real issues. We seem to be in a similar state in preparation for the 2012 campaign, with junk economics on both sides.

Meanwhile, the economy is still suffering from the Obama administration’s failure to alleviate the debt overhead by seriously making banks write down junk mortgages to reflect actual market values and the capacity to pay. Foreclosures are still throwing homes onto the market, pushing real estate further into negative equity territory while wealth concentrates at the top of the economic pyramid. No wonder Republicans are able to shed crocodile tears for debtors and attack President Obama for representing Wall Street (as if this is not equally true of the Republicans). He is simply continuing the Bush Administration’s policies, not leading the change he had promised. So he has left the path open for Congresswoman Bachmann to highlight her opposition to the Bush-McCain-Obama-Paulson-Geithner giveaways.

The missed opportunity
When Lehman Brothers filed for bankruptcy on September 15, 2008, the presidential campaign between Barack Obama and John McCain was peaking toward Election Day on November 4. Voters told pollsters that the economy was their main issue – their debts, soaring housing costs (“wealth creation” to real estate speculators and the banks getting rich off mortgage lending), stagnant wage levels and worsening workplace conditions. And in the wake of Lehman the main issue under popular debate was how much Wall Street’s crash would hurt the “real” economy. If large banks went under, would depositors still be safely insured? What about the course of normal business and employment? 

Credit is seen as necessary; but what of credit derivatives, the financial sector’s arcane “small print”? How intrinsic are financial gambles on collateralized debt obligations (CDOs, “weapons of mass financial destruction” in Warren Buffett’s terminology) – not retail banking or even business banking and insurance, but financial bets on the economy’s zigzagging measures? Without casino capitalism, could industrial capitalism survive? Or had the superstructure become rotten and best left to “free markets” to wipe out in mutually offsetting bankruptcy claims?

Obama ran as the “candidate of change” from the Bush Administration’s wars in Iraq and Afghanistan, its deregulatory excesses and its giveaways to the pharmaceutical industry and other monopolies and their Wall Street backers. Today it is clear that his promises of change were no more than campaign rhetoric, not intended to limit a continuation of the policies that most voters hoped to see changed. There has even been continuity of Bush Administration officials committed to promoting financial policies that keep the debts in place, enable banks to “earn their way out of debt” at the expense of consumers and businesses – and some $13 trillion in government bailouts and subsidy.

History is being written to depict the policy of saving the bankers rather than the economy as having been necessary – as if there were no alternative, that the vast giveaways to Wall Street were simply “pragmatic.” Financial beneficiaries claim that matters would be even worse today without these giveaways. It is as if we not only need the banks, we need to save them (and their stockholders) from losses, enabling them to pay and retain their immensely rich talent at the top with even bigger salaries, bonuses and stock options. It is all junk economics – well-subsidized illogic, quite popular among fundraisers.

From the outset in 2009, the Obama Plan has been to re-inflate the Bubble Economy by providing yet more credit (that is, debt) to bid housing and commercial real estate prices back up to pre-crash levels, not to bring debts down to the economy’s ability to pay. The result is debt deflation for the economy at large and rising unemployment – but enrichment of the wealthiest 1 per cent of the population as economies have become even more financialized.

This smooth continuum from the Bush to the Obama Administration masks the fact that there was a choice, and even a clear disagreement at the time within Congress, if not between the two presidential candidates, who seemed to speak as Siamese twins as far as their policies to save Wall Street (from losses, not from actually dying) were concerned. Wall Street saw an opportunity to be grabbed, and its spokesmen panicked policy-makers into imagining that there was no alternative. And as President Obama’s chief of staff Rahm Emanuel noted, a serious crisis is too important an opportunity to let go to waste. For Washington’s Wall Street constituency, the bold aim was to get the government to save them from having to take a loss on loans gone bad – loans that had made them rich already by collecting fees and interest, and by placing bets as to which way real estate prices, interest rates and exchange rates would move.

After September 2008 they were to get rich on a bailout – euphemized as “saving the economy,” if one believes that Wall Street is the economy’s core, not its wrapping or supposed facilitator, not to say a vampire squid. The largest and most urgent problem was not the inability of poor homebuyers to cope with the interest-rate jumps called for in the small print of their adjustable rate mortgages. The immediate defaulters were at the top of the economic pyramid. Citibank, AIG and other “too big to fail” institutions were unable to pay the winners on the speculative gambles and guarantees they had been writing – as if the economy had become risk-free, not overburdened with debt beyond its ability to pay. 

Making the government absorb their losses – instead of recovering the enormous salaries and bonuses their managers had paid themselves for selling these bad bets – required a cover story to make it appear that the economy could not be saved without the Treasury and Federal Reserve underwriting these losing gambles. Like the sheriff in the movie Blazing Saddles threatening to shoot himself if he weren’t freed, the financial sector warned that its losses would destroy the retail banking and insurance systems, not just the upper reaches of computerized derivatives gambling.

How America’s Bailouts Endowed a Financial Elite to Rule the 21st Century
The bailout of casino capitalists vested a new ruling class with $13 trillion of public IOUs (including the $5.3 trillion rescue of Fannie Mae and Freddie Mac) added to the national debt. The recipients have paid out much of this gift in salaries and bonuses, and used it to “make themselves whole” on the bad risks and defaulted bets they had to pay off. An alternative would have been to prosecute them and recover what they had paid themselves as commissions for loading the economy with debt.

Although there were two sides within Congress in September 2008, there was no disagreement between the two presidential candidates. John McCain ran back to Washington on the fateful Friday of their September 26 debate to insist that he was suspending his campaign in order to devote all his efforts to persuading Congress to approve the $700 billion bank bailout – and would not debate Mr. Obama until that was settled. But he capitulated and went to the debate. On September 29 the House of Representatives rejected the giveaway, with Republicans leading the opposition.

So McCain did not even get brownie points for being able to sway politicians on the side of his Wall Street campaign contributors. Until this time he had campaigned as a “maverick.” But his capitulation to high finance reminded voters of his notorious role in the Keating Five, standing up for bank crooks. His standing in the polls plummeted, and the Senate capitulated to a redrafted TARP bill on October 1. President Bush signed it into law two days later, on October 3, euphemized as the Emergency Economic Stabilization Act.

Forward to today. What does it signify when a right-wing cracker offers a more realistic diagnosis of bad bank lending than Treasury Secretary Geithner, Fed Chairman Bernanke or the other Bush-era financial experts retained by the Obama team? Without the bailout the gambling arm of Wall Street would have shriveled, but the “real” economy’s everyday banking and insurance operations could have continued. The bottom 99 percent of the U.S. economy would have recovered with only a speed bump to clean out the congestion at the top, and the government would have ended up in control of the biggest and most reckless banks and AIG – as it did in any case.

The government could have used its equity ownership and control of the banks to write down mortgages to reflect market conditions. It could have left families owning their homes at the same cost they would have had to pay in rent – the economic definition of equilibrium in property prices. The government-owned “too big to fail” banks could have been told to refrain from gambling on derivatives, from lending for currency and commodity speculation, and from making takeover loans and other predatory financial practices. Public ownership would have run the banks like savings banks or post office banks rather than as gambling schemes fueling the international carry trade (computer-driven interest rate and currency arbitrage) that has no linkage to the production-and-consumption economy.

The government could have used its equity ownership and control of the banks to provide credit and credit card services as the “public option.” Credit is a form of infrastructure, and such public investment is what enabled the United States to undersell foreign economies in the 19th and 20th centuries despite its high wage levels and social spending programs. As Simon Patten, the first economics professor at the nation’s first business school (the Wharton School) explained, public infrastructure investment is a “fourth factor of production.” It takes its return not in the form of profits, but in the degree to which it lowers the economy’s cost of doing business and living. Public investment does not need to generate profits or pay high salaries, bonuses and stock options, or operate via offshore banking centers.

But this is not the agenda that the Bush-Obama administrations chose. Only Wall Street had a plan in place to unwrap when the crisis opportunity erupted. The plan was predatory, not productive, not lowering the economy’s debt overhead or cost of living and doing business to make it more competitive. So the great opportunity to serve the public interest by taking over banks gone broke was missed. Stockholders were bailed out, counterparties were saved from loss, and managers today are paying themselves bonuses as usual. The “crisis” was turned into an opportunity to panic politicians into helping their Wall Street patrons.

One can only wonder what it means when the only common sense being heard about the separation of bank functions comes from a far-out extremist in the current debate. The social democratic tradition has been erased from the curriculum as it has from political memory.
Tom Fahey: Would you say the bailout program was a success? …
BACHMANN: John, I was in the middle of this debate. I was behind closed doors with Secretary Paulson when he came and made the extraordinary, never-before-made request to Congress: Give us a $700 billion blank check with no strings attached.
And I fought behind closed doors against my own party on TARP. It was a wrong vote then. It’s continued to be a wrong vote since then. Sometimes that’s what you have to do. You have to take principle over your party. (Presidential Debate, CNN, 6/13)
Proclaiming herself a libertarian, Bachmann opposes raising the federal debt ceiling, Obama’s Medicare reform and other federal initiatives. So her opposition to the Wall Street bailout turns out to lack an understanding of how governments and their central banks can create money with a stroke of the computer pen, so to speak. But at least she was clear that the bank counterparty gambles made by high rollers at the financial race track could have been wiped out (or left to settle among themselves in Wall Street’s version of mafia-style kneecapping) without destroying the banking system’s key economic functions.

The moral
Contrasting Bachmann’s remarks with the panicky claims of Geithner and Hank Paulson in September 2008 confirms a basic axiom of today’s junk economics: When an economic error becomes so widespread that it is adopted as official government policy, there is always a special interest at work to promote it.

In the case of bailing out Wall Street – and thereby the wealthiest 1 per cent of Americans – while saying there is no money for Social Security, Medicare or long-term public social spending and infrastructure investment, the beneficiaries are obvious. So are the losers. High finance means low wages, low employment, low industry and a shrinking economy under conditions where policy planning is centralized in the hands of Wall Street and its political nominees rather than in more objective administrators.

Pres. Jimmy Carter: Call Off the Global Drug War

Friday, June 17, 2011 by the Minneapolis Star Tribune
by Pres. Jimmy Carter

In an extraordinary new initiative announced earlier this month, the Global Commission on Drug Policy has made some courageous and profoundly important recommendations in a report on how to bring more effective control over the illicit drug trade.

The commission includes the former presidents or prime ministers of five countries, a former secretary general of the United Nations, human rights leaders, and business and government leaders, including Richard Branson, George P. Shultz and Paul A. Volcker.

The report describes the total failure of the present global antidrug effort, and in particular America's "war on drugs," which was declared 40 years ago today. It notes that the global consumption of opiates has increased 34.5 percent, cocaine 27 percent and cannabis 8.5 percent from 1998 to 2008.

Its primary recommendations are to substitute treatment for imprisonment for people who use drugs but do no harm to others, and to concentrate more coordinated international effort on combating violent criminal organizations rather than nonviolent, low-level offenders.

These recommendations are compatible with U.S. drug policy from three decades ago. In a message to Congress in 1977, I said the country should decriminalize the possession of less than an ounce of marijuana, with a full program of treatment for addicts.

I also cautioned against filling our prisons with young people who were no threat to society, and summarized by saying: "Penalties against possession of a drug should not be more damaging to an individual than the use of the drug itself."

These ideas were widely accepted at the time. But in the 1980s President Ronald Reagan and Congress began to shift from balanced drug policies, including the treatment and rehabilitation of addicts, toward futile efforts to control drug imports from foreign countries.

This approach entailed an enormous expenditure of resources and the dependence on police and military forces to reduce the foreign cultivation of marijuana, coca and opium poppy and the production of cocaine and heroin. One result has been a terrible escalation in drug-related violence, corruption and gross violations of human rights in a growing number of Latin American countries.

The commission's facts and arguments are persuasive. It recommends that governments be encouraged to experiment "with models of legal regulation of drugs ... that are designed to undermine the power of organized crime and safeguard the health and security of their citizens." For effective examples, they can look to policies that have shown promising results in Europe, Australia and other places.

But they probably won't turn to the U.S. for advice. Drug policies here are more punitive and counterproductive than in other democracies, and have brought about an explosion in prison populations. At the end of 1980, just before I left office, 500,000 people were incarcerated in America; at the end of 2009 the number was nearly 2.3 million.

There are 743 people in prison for every 100,000 Americans, a higher proportion than in any other country and seven times as great as in Europe. Some 7.2 million people are either in prison or on probation or parole — more than 3 percent of all American adults!
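The rate and the adult-population share quoted above are mutually consistent under rough 2009 population figures. A quick check (a sketch only; the ~307 million total and ~230 million adult population figures are my assumptions, not the article’s):

```python
# Rough consistency check of the incarceration figures (2009-era populations assumed).
rate_per_100k = 743                  # prisoners per 100,000 Americans (article's figure)
us_population = 307_000_000          # assumed total US population, circa 2009
us_adults     = 230_000_000          # assumed US adult population, circa 2009

incarcerated = rate_per_100k / 100_000 * us_population
supervised   = 7_200_000             # prison + probation + parole (article's figure)

print(f"{incarcerated/1e6:.2f}M incarcerated")   # ~2.28M, matching "nearly 2.3 million"
print(f"{supervised/us_adults:.1%} of adults")   # ~3.1%, matching "more than 3 percent"
```

Under these assumed populations, both of the article’s statistics fall out of the same underlying counts.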

Some of this increase has been caused by mandatory minimum sentencing and "three strikes and you're out" laws. But about three-quarters of new admissions to state prisons are for nonviolent crimes.

And the single greatest cause of prison population growth has been the war on drugs, with the number of people incarcerated for nonviolent drug offenses increasing more than twelvefold since 1980.

Not only has this excessive punishment destroyed the lives of millions of young people and their families (disproportionately minorities), but it is wreaking havoc on state and local budgets. Former California Gov. Arnold Schwarzenegger pointed out that, in 1980, 10 percent of his state's budget went to higher education and 3 percent to prisons; in 2010, almost 11 percent went to prisons and only 7.5 percent to higher education.

Maybe the increased tax burden on wealthy citizens necessary to pay for the war on drugs will help to bring about a reform of America's drug policies. At least the recommendations of the Global Commission will give some cover to political leaders who wish to do what is right.

A few years ago I worked side by side for four months with a group of prison inmates, who were learning the building trade, to renovate some public buildings in my hometown of Plains, Ga. They were intelligent and dedicated young men, each preparing for a productive life after the completion of his sentence. More than half of them were in prison for drug-related crimes, and would have been better off in college or trade school.

To help such men remain valuable members of society, and to make drug policies more humane and more effective, the American government should support and enact the reforms laid out by the Global Commission on Drug Policy.

The Bush Tax Cuts: A Decade of Economic Disaster

Friday, June 17, 2011 by The Providence Journal
by William Barclay

This month marks the 10th anniversary of the first of the two tax cuts sought by President George W. Bush. The Economic Growth and Tax Relief Reconciliation Act was enacted in 2001 to be followed, in 2003, by the Jobs and Growth Tax Relief Reconciliation Act.

Ten years later, it is time we assess the actual results of these tax cuts, looking at economic performance rather than political promises. The results have been a disaster for the U.S. economy and for almost all of the American people. We have had very slow income and employment growth for the vast majority of families, an extremely unequal distribution of the direct financial benefits from these measures and very slow growth in the economy as a whole.

As a high-income person who has received these tax cuts during the past 10 years, I feel that it is my responsibility to speak out.

Supporters of tax cuts for high-income households, such as House Speaker John Boehner (R-Ohio), argue that the rich are the “job creators”: tax cuts encourage them to create jobs, and these new jobs, in turn, increase employment opportunities and improve the wages of the rest of the population.

Did any of these benefits occur after the Bush tax cuts? The quick and accurate answer is, no, they did not. Adjusted for inflation, the median weekly earnings of working Americans actually fell 2.3 percent from the end of the 2000-01 recession to the onset of the Great Recession. This is unique in the post-World War II period.

Further, the recovery from the 2000-01 recession was the slowest from any post-World War II recession to date, requiring 39 months before the number of employed Americans reached the pre-recession level. Where is even a scintilla of evidence that tax cuts such as those passed in 2001 and 2003 generate income and employment growth for the vast majority of the population?

A significant part of the failure of the Bush tax cuts to generate jobs and income growth flows from the top-heavy distribution of the benefits conveyed by these measures. The vast bulk of the reduced taxes were reaped by a very small number of families. In 2011, the average tax reduction to families receiving an income of $1 million or more (about 321,000 families) will be $139,199.

For these families – fewer than 0.5 percent of all families – this amounts to a total tax reduction of about $860 million per week. Compare these tax benefits with the yearly savings proposed by cutting the Women, Infants and Children (WIC) health and nutrition program: $833 million. An obvious question is, why can’t this very small group of very high-income families give up just one week of their tax cut to provide nutrition for the tens of thousands of women and children that benefit from the WIC program?
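The weekly figure follows directly from the article’s own numbers. A back-of-the-envelope check (a sketch; the family count, average cut and WIC figure are the article’s, and dividing the annual total evenly over 52 weeks is the only assumption):

```python
# Back-of-the-envelope check of the article's tax-cut arithmetic.
families = 321_000        # families with income of $1 million or more (article's figure)
avg_cut  = 139_199        # average 2011 tax reduction per such family, in dollars

annual_total = families * avg_cut    # total yearly tax reduction for the group
weekly_total = annual_total / 52     # spread evenly over the year

wic_savings = 833_000_000            # proposed yearly WIC cut, in dollars

print(f"annual: ${annual_total/1e9:.1f}B, weekly: ${weekly_total/1e6:.0f}M")
# -> annual: $44.7B, weekly: $859M
print(weekly_total > wic_savings)    # one week of the cut exceeds a full year of WIC savings
```

So a single week of this group’s tax cut is indeed roughly equal to the entire annual WIC reduction, which is the comparison the paragraph draws.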

More significantly, in light of the deficit hysteria gripping Washington, D.C., the combined impact of the 2001 and 2003 Bush tax cuts has been the addition of more than $2.6 trillion to the federal debt. This included more than $400 billion in interest payments on the debt necessary to pay for the cuts.

Of course, one might forgive these policy failures if the promise of economic growth had been fulfilled. On this measure, however, the record is even worse.

The 2000-01 recession ended in the fourth quarter of 2001, just in time for the first Bush tax cut to take effect. From the end of the recession until the onset of the Great Recession, the economy grew at a slower rate than in any other post-recession period since World War II. Thus, despite promises from the advocates of the tax cuts, the reality was slower growth rather than faster growth. The additional tax cut in 2003 did nothing to increase the pace of economic growth.

In sum, the Bush tax cuts were a bad idea at the time and are an even worse idea today. Ending these cuts for incomes over $250,000 would generate over $100 billion a year in additional revenue. If we also created additional tax rates for very high-income families (e.g., at $500,000, $1 million, $5 million and $10 million) we could increase federal revenue by more than double that amount and put ourselves on the road to reducing deficits and debts.

We Invent Enemies to Buy Bombs--Eisenhower's Fears Came True

Friday, June 17, 2011 by the Guardian/UK
Britain faces no serious threat, yet keeps waging war. While big defence exists, glory-hungry politicians will use it
by Simon Jenkins

Why do we still go to war? We seem unable to stop. We find any excuse for this post-imperial fidget and yet we keep getting trapped. Germans do not do it, or Spanish or Swedes. Britain's borders and British people have not been under serious threat for a generation. Yet time and again our leaders crave battle. Why?

Last week we got a glimpse of an answer and it was not nice. The outgoing US Defense Secretary, Robert Gates, berated Europe's "failure of political will" in not maintaining defense spending. He said NATO had declined into a "two-tier alliance" between those willing to wage war and those "who specialize in 'soft' humanitarian, development, peacekeeping and talking tasks". Peace, he implied, is for wimps. Real men buy bombs, and drop them.

This call was echoed by NATO's chief, Anders Fogh Rasmussen, who pointed out how unfair it was that US defense investment represented 75% of NATO defense expenditure, where once it was only half. Having been forced to extend his war on Libya by another three months, Rasmussen wanted to see Europe's governments come up with more money, and no nonsense about recession. Defense to him is measured not in security but in spending.

The call was repeated back home by the navy chief, Sir Mark Stanhope. He had to be "dressed down" by the prime minister, David Cameron, for warning that an extended war in Libya would mean "challenging decisions about priorities". Sailors never talk straight: he meant more ships. The navy has used so many of its £500,000 Tomahawk missiles trying to hit Colonel Gaddafi (and missing) over the past month that it needs money for more. In a clearly co-ordinated lobby, the head of the RAF also demanded "a significant uplift in spending after 2015, if the service is to meet its commitments". It, of course, defines its commitments itself.

Libya has cost Britain £100m so far, and rising. But Iraq and the Afghan war are costing America $3bn a week, and there is scarcely an industry, or a state, in the country that does not see some of this money. These wars show no signs of being ended, let alone won. But to the defense lobby what matters is the money. It sustains combat by constantly promising success and inducing politicians and journalists to see "more enemy dead", "a glimmer of hope" and "a corner about to be turned".

Victory will come, but only if politicians spend more money on "a surge". Soldiers are like firefighters, demanding extra to fight fires. They will fight all right, but if you want victory that is overtime.

On Wednesday the Russian ambassador to NATO warned that Britain and France were "being dragged more and more into the eventuality of a land-based operation in Libya". This is what the defense lobby wants institutionally, even if it may appall the generals. In the 1980s Russia watched the same process in Afghanistan, where it took a dictator, Mikhail Gorbachev, to face down the Red Army and demand withdrawal. The west has no Gorbachev in Afghanistan at the moment. NATO's Rasmussen says he "could not envisage" a land war in Libya, since the UN would take over if Gaddafi were toppled. He must know this is nonsense. But then he said NATO would only enforce a no-fly zone in Libya. He achieved that weeks ago, but is still bombing.

It is not democracy that keeps western nations at war, but armies and the interests now massed behind them. The greatest speech about modern defense was made in 1961 by the US president Eisenhower. He was no leftwinger, but a former general and conservative Republican. Looking back over his time in office, his farewell message to America was a simple warning against the "disastrous rise of misplaced power" of a corporate military-industrial complex with "unwarranted influence on government". A burgeoning defense establishment, backed by large corporate interests, would one day employ so many people as to corrupt the political system. (His original draft even referred to a "military-industrial-congressional complex".) This lobby, said Eisenhower, could become so huge as to "endanger our liberties and democratic processes".

I wonder what Eisenhower would make of today's US, with a military grown from 3.5 million people to 5 million. The western nations face less of a threat to their integrity and security than ever in history, yet their defense industries cry for ever more money and ever more things to do. The cold war strategist, George Kennan, wrote prophetically: "Were the Soviet Union to sink tomorrow under the waters of the ocean, the American military-industrial complex would have to remain, substantially unchanged, until some other adversary could be invented."

The devil makes work for idle hands, especially if they are well-financed. Britain's former special envoy to Kabul, Sherard Cowper-Coles, echoed Kennan last week in claiming that the army's keenness to fight in Helmand was self-interested. "It's use them or lose them, Sherard," he was told by the then chief of the general staff, Sir Richard Dannatt. Cowper-Coles has now gone off to work for an arms manufacturer.

There is no strategic defense justification for the US spending 5.5% of its gross domestic product on defense or Britain 2.5%, or for the NATO "target" of 2%.

These figures merely formalize existing commitments and interests. At the end of the cold war soldiers assiduously invented new conflicts for themselves and their suppliers, variously wars on terror, drugs, piracy, internet espionage and man's general inhumanity to man. None yields victory, but all need equipment. The war on terror fulfilled all Eisenhower's fears, as America sank into a swamp of kidnapping, torture and imprisonment without trial.

The belligerent posture of the US and Britain towards the Muslim world has fostered antagonism and moderate threats in response. The bombing of extremist targets in Pakistan is an invitation for terrorists to attack us, and then a need for defense against such attack. Meanwhile, the opportunity cost of appeasing the complex is astronomical. Eisenhower remarked that "every gun that is made is a theft from those who hunger" – a bomber is two power stations and a hospital not built. Likewise, each Tomahawk Cameron fires at Tripoli destroys not just a Gaddafi bunker (are there any left?), but a hospital ward and a classroom in Britain.

As long as "big defense" exists it will entice glory-hungry politicians to use it. It is a return to the Hundred Years War, when militaristic barons and knights had a stranglehold on the monarch, and no other purpose in life than to fight. To deliver victory they demanded ever more taxes for weapons, and when they had ever more weapons they promised ever grander victories. This is exactly how Britain's defense ministry ran out of budgetary control under Labour.

There is one piece of good news. NATO has long outlived its purpose, now justifying its existence only by how much it induces its members to spend, and how many wars irrelevant to its purpose it finds to fight. Yet still it does not spend enough for the US Defense Secretary. In his anger, Gates threatened that "future US leaders … may not consider the return on America's investment in NATO worth the cost". Is that a threat or a promise?

Fukushima: It's Much Worse Than You Think

Thursday, June 16, 2011 by Al Jazeera English
Scientific experts believe Japan's nuclear disaster to be far worse than governments are revealing to the public.
by Dahr Jamail

"Fukushima is the biggest industrial catastrophe in the history of mankind," Arnold Gundersen, a former nuclear industry senior vice president, told Al Jazeera.

Japan's 9.0 earthquake on March 11 caused a massive tsunami that crippled the cooling systems at the Tokyo Electric Power Company's (TEPCO) nuclear plant in Fukushima, Japan. It also led to hydrogen explosions and reactor meltdowns that forced evacuations of those living within a 20km radius of the plant.

Gundersen, a licensed reactor operator with 39 years of nuclear power engineering experience who has managed and coordinated projects at 70 nuclear power plants around the US, says the Fukushima nuclear plant likely has more exposed reactor cores than commonly believed.

"Fukushima has three nuclear reactors exposed and four fuel cores exposed," he said, "You probably have the equivalent of 20 nuclear reactor cores because of the fuel cores, and they are all in desperate need of being cooled, and there is no means to cool them effectively."

TEPCO has been spraying water on several of the reactors and fuel cores, but this has led to even greater problems, such as radiation being emitted into the air in steam and evaporated sea water - as well as generating hundreds of thousands of tons of highly radioactive sea water that has to be disposed of.

"The problem is how to keep it cool," says Gundersen. "They are pouring in water and the question is what are they going to do with the waste that comes out of that system, because it is going to contain plutonium and uranium. Where do you put the water?"

Even though the plant is now shut down, the fuel and its fission products continue to generate decay heat, and therefore require cooling.

"The fuels are now a molten blob at the bottom of the reactor," Gundersen added. "TEPCO announced they had a melt through. A melt down is when the fuel collapses to the bottom of the reactor, and a melt through means it has melted through some layers. That blob is incredibly radioactive, and now you have water on top of it. The water picks up enormous amounts of radiation, so you add more water and you are generating hundreds of thousands of tons of highly radioactive water."

Independent scientists have been monitoring the locations of radioactive "hot spots" around Japan, and their findings are disconcerting.

"We have 20 nuclear cores exposed, the fuel pools have several cores each, that is 20 times the potential to be released than Chernobyl," said Gundersen. "The data I'm seeing shows that we are finding hot spots further away than we had from Chernobyl, and the amount of radiation in many of them was the amount that caused areas to be declared no-man's-land for Chernobyl. We are seeing square kilometres being found 60 to 70 kilometres away from the reactor. You can't clean all this up. We still have radioactive wild boar in Germany, 30 years after Chernobyl."

Radiation monitors for children

Japan's Nuclear Emergency Response Headquarters finally admitted earlier this month that reactors 1, 2, and 3 at the Fukushima plant experienced full meltdowns.

TEPCO announced that the accident probably released more radioactive material into the environment than Chernobyl, making it the worst nuclear accident on record.

Meanwhile, a nuclear waste advisor to the Japanese government reported that about 966 square kilometres near the power station - an area roughly 17 times the size of Manhattan - is now likely uninhabitable.

In the US, physician Janette Sherman MD and epidemiologist Joseph Mangano published an essay shedding light on a 35 per cent spike in infant mortality in northwest cities that occurred after the Fukushima meltdown, and may well be the result of fallout from the stricken nuclear plant.

The eight cities included in the report are San Jose, Berkeley, San Francisco, Sacramento, Santa Cruz, Portland, Seattle, and Boise, and the time frame of the report included the ten weeks immediately following the disaster.

"There is and should be concern about younger people being exposed, and the Japanese government will be giving out radiation monitors to children," Dr MV Ramana, a physicist with the Programme on Science and Global Security at Princeton University who specialises in issues of nuclear safety, told Al Jazeera.

Dr Ramana explained that he believes the primary radiation threat continues to be mostly for residents living within 50km of the plant, but added: "There are going to be areas outside of the Japanese government's 20km mandatory evacuation zone where radiation is higher. So that could mean evacuation zones in those areas as well."

Gundersen points out that far more radiation has been released than has been reported.

"They recalculated the amount of radiation released, but the news is really not talking about this," he said. "The new calculations show that within the first week of the accident, they released 2.3 times as much radiation as they thought they released in the first 80 days."

According to Gundersen, the exposed reactors and fuel cores are continuing to release micron-sized particles of caesium, strontium, and plutonium isotopes. These are referred to as "hot particles".

"We are discovering hot particles everywhere in Japan, even in Tokyo," he said. "Scientists are finding these everywhere. Over the last 90 days these hot particles have continued to fall and are being deposited in high concentrations. A lot of people are picking these up in car engine air filters."

Radioactive air filters from cars in Fukushima prefecture and Tokyo are now common, and Gundersen says his sources are finding radioactive air filters in the greater Seattle area of the US as well.

Inhaled or ingested, these hot particles can eventually lead to cancer.

"These get stuck in your lungs or GI tract, and they are a constant irritant," he explained, "One cigarette doesn't get you, but over time they do. These [hot particles] can cause cancer, but you can't measure them with a Geiger counter. Clearly people in Fukushima prefecture have breathed in a large amount of these particles. Clearly the upper West Coast of the US has people being affected. That area got hit pretty heavy in April."

Blame the US?

In reaction to the Fukushima catastrophe, Germany is phasing out all of its nuclear reactors over the next decade. In a referendum vote this Monday, 95 per cent of Italians voted in favour of blocking a nuclear power revival in their country. A recent newspaper poll in Japan shows nearly three-quarters of respondents favour a phase-out of nuclear power in Japan.

Why have alarms not been sounded about radiation exposure in the US?

Nuclear operator Exelon Corporation has been among Barack Obama's biggest campaign donors, and is one of the largest employers in Illinois where Obama was senator. Exelon has donated more than $269,000 to his political campaigns, thus far. Obama also appointed Exelon CEO John Rowe to his Blue Ribbon Commission on America's Nuclear Future.

Dr Shoji Sawada is a theoretical particle physicist and Professor Emeritus at Nagoya University in Japan.
He is concerned about the types of nuclear plants in his country, and the fact that most of them are of US design.

"Most of the reactors in Japan were designed by US companies who did not care for the effects of earthquakes," Dr Sawada told Al Jazeera. "I think this problem applies to all nuclear power stations across Japan."

Using nuclear power to produce electricity in Japan is a product of the nuclear policy of the US, something Dr Sawada feels is also a large component of the problem.

"Most of the Japanese scientists at that time, the mid-1950s, considered that the technology of nuclear energy was under development or not established enough, and that it was too early to be put to practical use," he explained. "The Japan Scientists Council recommended the Japanese government not use this technology yet, but the government accepted to use enriched uranium to fuel nuclear power stations, and was thus subjected to US government policy."

As a 13-year-old, Dr Sawada experienced the US nuclear attack against Japan from his home, situated just 1400 metres from the hypocentre of the Hiroshima bomb.

"I think the Fukushima accident has caused the Japanese people to abandon the myth that nuclear power stations are safe," he said. "Now the opinions of the Japanese people have rapidly changed. Well beyond half the population believes Japan should move towards natural electricity."

A problem of infinite proportions

Dr Ramana expects the plant reactors and fuel cores to be cooled enough for a shutdown within two years.
"But it is going to take a very long time before the fuel can be removed from the reactor," he added. "Dealing with the cracking and compromised structure and dealing with radiation in the area will take several years, there's no question about that."

Dr Sawada is not as clear about how long a cold shutdown could take, and said the problem will be "the effects from caesium-137 that remains in the soil and the polluted water around the power station and underground. It will take a year, or more time, to deal with this".

Gundersen pointed out that the units are still leaking radiation.

"They are still emitting radioactive gases and an enormous amount of radioactive liquid," he said. "It will be at least a year before it stops boiling, and until it stops boiling, it's going to be cranking out radioactive steam and liquids."

Gundersen worries about more earthquake aftershocks, as well as how to cool two of the units.

"Unit four is the most dangerous, it could topple," he said. "After the earthquake in Sumatra there was an 8.6 [aftershock] about 90 days later, so we are not out of the woods yet. And you're at a point where, if that happens, there is no science for this, no one has ever imagined having hot nuclear fuel lying outside the fuel pool. They've not figured out how to cool units three and four."

Gundersen's assessment of solving this crisis is grim.

"Units one through three have nuclear waste on the floor, the melted core, that has plutonium in it, and that has to be removed from the environment for hundreds of thousands of years," he said. "Somehow, robotically, they will have to go in there and manage to put it in a container and store it for infinity, and that technology doesn't exist. Nobody knows how to pick up the molten core from the floor, there is no solution available now for picking that up from the floor."

Dr Sawada says that nuclear fission generates radioactive materials that we simply do not yet know how to dispose of safely.

"Until we know how to safely dispose of the radioactive materials generated by nuclear plants, we should postpone these activities so as not to cause further harm to future generations," he explained. "To do otherwise is simply an immoral act, and that is my belief, both as a scientist and as a survivor of the Hiroshima atomic bombing."

Gundersen believes it will take experts at least ten years to design and implement the plan.

"So ten to 15 years from now maybe we can say the reactors have been dismantled, and in the meantime you wind up contaminating the water," Gundersen said. "We are already seeing Strontium [at] 250 times the allowable limits in the water table at Fukushima. Contaminated water tables are incredibly difficult to clean. So I think we will have a contaminated aquifer in the area of the Fukushima site for a long, long time to come."

Unfortunately, the history of nuclear disasters appears to back Gundersen's assessment.

"With Three Mile Island and Chernobyl, and now with Fukushima, you can pinpoint the exact day and time they started," he said, "But they never end."

Thursday, June 16, 2011

Researchers: Psilocybin Mushrooms' Positive Effects Last Over a Year

By Eric W. Dolan - RAW Story
Wednesday, June 15th, 2011

Scientists at the Johns Hopkins University School of Medicine claim to have determined the proper dose levels needed to create positive changes in attitudes, mood, life satisfaction, and behavior that persist for more than a year with the psychoactive substance in so-called "magic mushrooms."

The findings are the latest in a series of experiments done at Johns Hopkins to investigate psilocybin, a psychedelic substance contained in certain mushrooms. The findings were published online this week in the peer-reviewed journal Psychopharmacology.

“In cultures before ours, the spiritual guide or healer had to discern how much of what type of mushroom to use for what purposes, because the strength of psychoactive mushrooms varies from species to species and even from specimen to specimen," said Roland Griffiths, Ph.D., a professor of psychiatry and behavioral sciences and neuroscience at the Johns Hopkins University School of Medicine and the study's lead scientist.

"In our laboratory, weʼre working with the pure chemical psilocybin, which we can measure out precisely," he added. "We wanted to take a methodical look at how its effects change with dosage. We seem to have found levels of the substance and particular conditions for its use that give a high probability of a profound and beneficial experience, a low enough probability of psychological struggle, and very little risk of any actual harm."

The researchers said 94 percent of the study's 18 participants rated their experiences with psilocybin among the five most, or as the single most, spiritually significant experiences of their lives at a 14-month follow-up. Eighty-four percent also reported positive changes in their behaviors, such as improved relationships with family and others, increased physical and psychological self-care, and increased devotion to spiritual practice, which were corroborated by family members and others.

"I feel that I relate better in my marriage," one participant reported. "There is more empathy – a greater understanding of people and understanding their difficulties and less judgment."

The participants, ages 29 to 62, were screened to include only those who were deemed psychologically and physically healthy. Each participant received five closely monitored, eight-hour psilocybin sessions a month apart. Each session had a different dose of psilocybin, with one session using a placebo, and was "blind," meaning neither the participants, the monitors, nor the scientists knew how much psilocybin - if any - had been given during any session.

At higher doses of psilocybin, nearly a third of the participants experienced great fear or had delusions, but with gentle guidance from researchers, those reactions did not outlast the psilocybin session or harm the participants. Those who received lower doses of the psychedelic before the higher doses were the most likely to have long-lasting positive changes in attitudes and behavior.

Potential psychotherapeutic value of 'magic mushrooms'

"The Hopkins psilocybin studies clearly demonstrate that this route to the mystical is not to be walked alone," Jerome Jaffe, M.D., who served as the first White House "Drug Czar," commented. "But they have also demonstrated significant and lasting benefits."

"That raises two questions: Could psilocybin-occasioned experiences prove therapeutically useful, for example in dealing with the psychological distress experienced by some terminal patients? And should properly-informed citizens, not in distress, be allowed to receive psilocybin for its possible spiritual benefits, as we now allow them to pursue other possibly risky activities such as cosmetic surgery and mountain-climbing?"

A similar study published in the Archives of General Psychiatry in January found psilocybin can safely improve the moods of patients with advanced-stage cancer and anxiety.

The study, headed by Dr. Charles S. Grob, a professor of psychiatry at the Los Angeles Biomedical Research Institute at Harbor-UCLA Medical Center, found a significant improvement in mood and a reduction in symptoms of anxiety up to six months after patients received psilocybin-assisted therapy.

"We are working with a patient population that often does not respond well to conventional treatments," said Grob. "Following their treatments with psilocybin, the patients and their families reported benefit from the use of this hallucinogen in reducing their anxiety. This study shows psilocybin can be administered safely, and that further investigation of hallucinogens should be pursued to determine their potential benefits."

Although some may be concerned that a relatively unstudied and illegal substance such as psilocybin could pose health risks, a study published in Psychopharmacology in 2004 found there is "no cause for concern that [psilocybin] is hazardous with respect to somatic health" in healthy adults.

The only physiological side-effects the study found were a brief increase in blood pressure and small increases in levels of thyroid-stimulating hormone (TSH), prolactin, cortisol, and adrenocorticotropic hormone. None of these side-effects were considered dangerous to healthy adults.

"Typically, the experiences after [medium and high doses of psilocybin] were rated positive, with retrospective statements ranging from ‘pleasurable’ to ‘ineffably beautiful,’ " according to the study. One volunteer had a fearful experience after being given a high dose of psilocybin, but his anxiety was resolved without the need for pharmacological intervention.

"We know that psilocybin is remarkably non-toxic to the bodyʼs organ systems," Matthew Johnson, Ph.D., an assistant professor of psychiatry and behavioral sciences at Johns Hopkins and lead author of an earlier Hopkins paper on hallucinogen safety remarked. "But there are indirect risks: if someone experiences high anxiety, fear, or paranoia during a psilocybin session, itʼs not hard to imagine them behaving in ways harmful to themselves or others."

"These risks appear to be minimized when volunteers develop a trusting relationship with a skilled monitor, who remains present with them for the duration of the substanceʼs primary effects, and who is available afterwards for consultation."

SIMILAR: First therapeutic study of LSD in 35 years finishes treatment of last subject