Saturday, June 23, 2012

The Sky Is Pink - Fracking Looks to Tobacco Pseudo-science for Further Deception

High Court Produces a Politics Of, By and For Corporations



Though all political eyes and ears await an upcoming decision by the US Supreme Court on the Affordable Care Act, likely to be handed down next week, a new study released on Thursday suggests that many Americans, while awaiting one decision or another, are missing a larger trend pervading the Roberts court. The report, released by the Constitutional Accountability Center, found that in every case before the court this term, the majority has found in favor of the position taken by the US Chamber of Commerce, a conservative lobbying group.

This trend, in which the Chamber has scored seven consecutive victories, bolsters arguments made by many observers who note that the Supreme Court in recent years has taken decidedly favorable positions for business interests while siding against the arguments of consumer advocates, labor unions, and the public interest.

The Nation's John Nichols, responding to Thursday's SCOTUS decision requiring public-employee unions to get specific permission from the employees in workplaces they represent before spending special assessments on political campaigns, said the pattern of corporate interests winning out over the interests of "real human beings" is quite clear.

"The Court-ordered shift creates an incredible bureaucratic nightmare for organizations that represent hundreds of thousands of workers," contends Nichols. "And," he adds,"It was entirely unnecessary, as key unions have indicated that they would be willing not just to maintain their “opt-out” clauses but to refund special assessment money to any member or represented nonmember who might object to a political initiative."

Writing the dissenting opinion was Justice Stephen Breyer, who was joined only by Justice Kagan in opposing the majority. “The debate about public unions’ collective bargaining rights is currently intense,” Breyer wrote. “The question of how a nonmember indicates a desire not to pay constitutes an important part of this debate.… There is no good reason for this court suddenly to enter the debate, much less now to decide that the Constitution resolves it.”

If both history and legal precedent are being written at the Supreme Court, the tale seems to be that big business has a court quite willing to take its side in arguments. For many, the more troubling development is the manner in which the court has gone out of its way to make life that much harder for those who seek to put the reins on corporate power.

*  *  *

Huffington Post: Supreme Court: U.S. Chamber Of Commerce Undefeated This Term
The U.S. Chamber of Commerce is undefeated at the Supreme Court this term, continuing to improve its success in securing business-friendly judgments since Chief Justice John Roberts took the bench in 2005.
The Constitutional Accountability Center, a left-leaning think tank and law firm, reported its findings on Thursday, noting that this term, which began in October and will likely conclude by the end of June, could be the chamber's "first 'perfect' term before the Supreme Court since at least 1994."
This term's "string of seven straight victories brings the chamber’s overall win/loss rate before the Roberts Court up to 68 percent (60 of 88 cases)," wrote Neal Weare, the center's litigation counsel and Supreme Court fellow. That success rate is significantly higher than during comparable periods of personnel stability under the past two chief justices, William Rehnquist and Warren Burger. From 1994 to 2005, when Rehnquist was chief justice, the chamber succeeded in 56 percent of the cases it backed. The business lobby had a 43 percent success rate from 1981 to 1986 during the final years of Chief Justice Warren Burger's tenure.
The chamber, a pro-business lobbying group that supports conservative candidates and causes, formed its modern litigation strategy in 1971, when corporate attorney Lewis Powell wrote a memorandum that urged the group to aggressively pursue its interests in courts, citing the liberal American Civil Liberties Union's success as a model. "Under our constitutional system, especially with an activist-minded Supreme Court, the judiciary may be the most important instrument for social, economic and political change," Powell wrote. Then-President Richard Nixon nominated Powell that same year to the Supreme Court, where he wrote the opinion that the Roberts Court relied upon in the Citizens United decision to allow unlimited corporate spending in political campaigns.
The health care cases, likely to be decided next week, may yet spoil the chamber's perfect season. The chamber took no position on the individual mandate's constitutionality, but did urge the justices to strike down the entire law if they decide to void the mandate. That position, however, found little support among the justices during oral argument in March.
Still, not even a loss in the Supreme Court's highest profile case of the term, if not the decade, would do much to dent the chamber's high rate of success at the Roberts Court.
*  *  *

John Nichols: High Court Produces a Politics Of, By and For Corporations
The most politically partisan—and politically activist—Supreme Court in modern American history has already assumed that, when it comes to electioneering, corporations have pretty much the same rights as human beings. Indeed, the High Court’s Citizens United ruling has given corporations unprecedented flexibility to act on their own behalf to influence election campaigns and results.
Yet, the same Court has now said that groups of actual human beings—trade unions that have organized public-sector workers—must sacrifice their flexibility in order to meet standards never before demanded of labor organizations.
Do we detect a pattern here?
Of course.
That pattern was on stark display in Thursday’s Supreme Court decision to require that public-employee unions get specific permission from employees in workplaces they represent for special assessments that are used to advance political agendas. This high court swipe at the ability of working people to make their voices heard in the political life of communities, states and the nation won't get the attention that will be accorded the Supreme Court's equally politicized ruling on the Affordable Care Act.
But there's a good argument to be made that, by further skewing the democratic process that was so badly warped by the 2010 Citizens United ruling, the court has done even greater damage to the long-term prospects for renewing the republic.
The damage is contained in a structural shift that will make unions -- and potentially other membership-based organizations -- less flexible and functional in the political fights of the future.

The Truth Behind the Bath Salts "Epidemic"

The latest unfounded drug scare...bath salts have created a zombie epidemic! Aaaaahhhhhh!!!

++++

By Tony O'Neill, The Fix
Posted on June 22, 2012

Drug scares, like the seasons, are cyclical. Here in the US, we had media firestorms over crack in the 80’s, meth in the 90’s, and prescription painkillers in the 00’s. Right on schedule, the latest demon drug that is supposedly tearing our society apart has entered stage left: bath salts.

“Bath salts” really just means a combination of two stimulants — MDPV and mephedrone. Sold online and via headshops as a cheap, legal alternative to cocaine and ecstasy, mephedrone was first synthesized in 1929, while MDPV came along in 1969. Both were rediscovered in 2003, and they were perfect drugs for the Internet age—an ideal alternative to pricey illegal drugs that could be obtained legally with nothing more than a credit card and the click of a mouse.

The exact pharmacology of bath salts can vary, as compounds are constantly tweaked by chemists to stay one step ahead of the law. “Most of these substances seem to be cathinone derivatives, and as such are central nervous system stimulants that act through interruptions of dopamine, norepinephrine and—to a more limited extent—serotonin function,” explains Dr. Adi Jaffe, an addiction specialist at UCLA. While noting that actual research on these substances is in its early stages and reports are limited, Jaffe says that “at low to moderate doses the most common effects for MDPV can be thought of as meth-like: stimulation, euphoria and alertness. Mephedrone seems to act more like MDMA than meth.”

While the chemistry may change, one thing that has remained consistent is the ballooning popularity of this sector of the drug market.

After the explosion in use, the next phase of the drug scare comes in the form of demonization, and the authorities have certainly wasted no time in making some pretty wild allegations about the supposed effects of bath salts; recently we’ve heard that these drugs can cause cannibalism, a la the infamous Miami face-eater, pedophilia and even cross-dressing goat abuse.

The third part of any good drug scare happens when the press, despite a total lack of causal evidence, parrots these outlandish accusations. In the Miami cannibal case, the link to bath salts originated in a statement made by someone with no direct involvement in the case—the president of the Miami Fraternal Order of Police, Armando Aguilar—despite the fact that an autopsy and toxicology report had yet to be performed on attacker Rudy Eugene.

In the case of Shane Shuyler, the Miami man accused of exposing himself to children while “allegedly” under the influence of bath salts, the evidence was no less hazy. The police said they found something that “appeared” to be bath salts in his wallet (i.e. an unidentified white powder). And then there was this strange quote from a detective giving evidence against Shuyler: "Upon talking to him, he made some statements to me which led me to believe that he was cooling off in a fountain by the tot-lot, because he was hot, which was consistent with ingesting bath salts." The logic being that since bath salts cause users' body temperatures to rise, cooling off by a water fountain is evidence of bath salts use. Never mind the fact that the incident took place in June, in Miami, where the average temperature is 88.1 degrees.

After the hype comes the crackdown, which means that high-profile cases like these have created a push from both the media and law enforcement for a federal ban on the sale of bath salts.

Florida Republican Rep. Sandy Adams is one of the politicians who helped push the Combating Dangerous Synthetic Stimulants Act of 2011 through the House last December. The bill would federally ban MDPV and mephedrone, the two chemicals found in bath salts, as well as outlaw dozens of other chemicals found in synthetic drugs. The reasoning? "Looking at the Miami incident, we've seen people do some very bizarre acts on bath salts," Adams told U.S. News & World Report. If she gets her way, bath salts would be categorized alongside heroin and LSD.

So can bath salts really cause ordinary people to cannibalize strangers, expose themselves to children, or murder goats? And if they can, why on earth would anybody take them? A drug that has been described variously as “super-powered LSD” and “PCP on crack” seems like a confusing proposition. So which is it?

“The reason for the contrasting descriptions is most likely the small but very significant difference in the specific chemicals involved,” says Dr. Jaffe. “Meth and ecstasy are very close chemical cousins but obviously cause very different effects for the user; the same is true here.”

To get a better handle on what is happening here in the US, I looked toward the United Kingdom, which has recently been through a similar cycle of shock-horror media coverage of the “bath salt epidemic,” followed by a rush to ban. Bath salts were known under various aliases in the UK, including M-Kat, Meow-Meow and Bubbles, so for the sake of clarity I’m going to use the chemical name of the most common compound: mephedrone.

“The issues in the US and UK are very similar, except that, as in many things, the US hype is even more over the top than that in the UK,” says Danny Kushlick, founder of the Transform Drug Policy Foundation, a charitable think tank that attempts to draw public attention to the fact that drug prohibition is the major cause of drug-related harm. “I mean, we never got so far as cannibalism.”

Yet at its height, the mephedrone scare in the UK was still pretty lurid; according to some of the coverage, it was linked to overdose, patricide and at least one case of self-inflicted scrotum tearing. Many of these stories would later be disproved (the scrotum story, unsurprisingly, was revealed as a hoax), and the two deaths that started the whole media scare—those of Louis Wainwright, 18, and Nicholas Smith, 19—turned out to be totally unrelated to the use of the drug. By the time the toxicology reports on Wainwright and Smith had been published, however, it was too late: the drug had been banned a month earlier by the UK parliament, after a one-hour debate and no vote on the matter.

Kushlick, who is on the council of the International Harm Reduction Association, and is a member of the British Society of Criminology's Advisory Council, was a vocal opponent of the rush to ban mephedrone in the UK. In the months following the ban, he saw harm actually increase rather than decrease because, he said, “when mephedrone was banned, the price increased and it was sold not by legitimate retailers but by non-tax paying unregulated dealers. It was also immediately replaced by a more potent compound and traded as Ivory Wave, which users had little experience with so they were more likely to get into trouble.”

These sentiments are echoed by Dr. Fiona Measham, a senior lecturer in Criminology at Lancaster University and the author of several books on drug use among young people. Her groundbreaking research into the use of mephedrone in the UK has provided some of the only hysteria-free data on mephedrone and the people who use it. Her paper, “Tweaking, Bombing, Dabbing and Stockpiling: The Emergence of Mephedrone and the Perversity of Prohibition” (Measham et al., 2010), is the definitive account of the UK’s experience with the drug from the perspective of the people who actually use it.

“The government might claim that the ban was successful because mephedrone use and deaths have fallen,” says Measham, “but for me the question to ask is: what are users taking instead? In the UK, the rise and fall of mephedrone in 2009 and 2010 tracked the fall and now rise in the purity of cocaine, ecstasy pills and MDMA powder. [Since the ban], pills are back, big time, and the legal highs—whether now banned or not—just don’t compete for most recreational users.”

Despite the hazy evidence that it would have any effect at all on the level of drug use among young people, a federal ban is something that many in politics and law enforcement are in favor of here in the States. All it really seems guaranteed to do is drive up prices, lower purity and criminalize young people who continue to use the drug. At best, the use of the drug might be reduced as users switch over wholesale to different drugs. But in the politics of the drug war, no one ever let the truth get in the way of an expansion of hostilities. The police, Drug Enforcement Administration and court system will always welcome a new front in the war.

The big motivation, as always, is money. Local authorities want more money to “fight the war on drugs” even if they are fighting against a phantom menace. Already there are dark rumblings from the cops about needing more resources to fight against this army of bath-salts-crazed cannibals who are seemingly about to smash down your door and devour your children.

“The cities are deeply involved [in banning synthetic drugs] because the state can’t seem to get on top of it,” Pembroke Pines Commissioner Angelo Castillo said in an illuminating interview with Inquisitr.com. “We see cities jumping in to handle it because the state isn’t. Otherwise people die and people’s faces get chewed off.” Castillo added, “This is the next horizon in the war on drugs and we need to gear up and deal with it.”

Kushlick, who has seen firsthand the lack of effect that “gearing up to deal with it” has had on levels of use, suggests a more pragmatic approach. “The wiser move would have been to leave meph on the market and monitor its effect in order to ascertain its costs and benefits,” he tells me. “We know from experience that the Iron Law of Prohibition means that milder drugs will be replaced by more potent ones. That's what happened under alcohol prohibition in the US as bootlegged spirits replaced beer. Conducting research on its effects and how people were using it could have been used to provide information to current and potential new users.”

The bottom line is this: more and more new users are trying bath salts, at least in part because of the attention thrust on the drug by the hysterical press coverage. When they discover that it doesn’t cause them to chew off people’s faces or tear off their own genitals, they are more inclined to ignore all warnings about these drugs’ potential dangers. Kushlick’s advice to users is less likely to get the press excited but more likely to reduce harm. “As with any drug, if you are not used to using meph, you should take a small amount to assess its effects on you as an individual," he says. "Always take new drugs with someone who is an experienced and trusted friend. Just because you have a good effect from the dose you've taken, do not assume that you can double the fun by doubling the dose.”
 

All Americans are Suspected Terrorists


WAKE UP!!!



I don't want to live here anymore. This is not freedom, it's fascism. It's tyranny.

Thursday, June 21, 2012

SCOTUS Rules on Mandatory Minimum Sentencing for Crack and on Nudity & Profanity Fines Meted Out by FCC

Older convictions subject to new crack sentencing guidelines
By Stephen C. Webster - RAW Story
Thursday, June 21, 2012

In a 5-4 decision (PDF) on Thursday, the U.S. Supreme Court ruled that reduced sentences for crack cocaine, approved by Congress in 2010, must be applied to individuals with pending legal cases at the time of its passage if they had not yet been sentenced.

The nation’s top judges took up the Fair Sentencing Act (FSA) after two Chicago men were given mandatory minimum sentences as required by Congress in the 1980s, when a crack “epidemic” was sweeping the nation. Then, Congress set a five-year mandatory minimum for offenses involving just five grams of crack, whereas it took 500 grams of powder cocaine to trigger the same sentence—a 100-to-1 disparity.

Due to the popularity of crack in low-income, urban communities, the harsh sentencing laws saw a wildly inordinate number of African-Americans jailed for much longer than white offenders caught with the more expensive powder cocaine. Crack and cocaine are the same drug in different forms, but crack is thought to be more addictive because it is commonly smoked, rather than snorted, producing a stronger and faster high.

Congress finally recognized this disparity and passed the FSA with bipartisan support in August 2010. The new law adjusted the sentencing rules to bring crack and cocaine penalties closer in line with each other, setting the five-year mandatory minimum at 28 grams of crack instead of five and cutting the disparity to roughly 18-to-1.
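
For readers who want the disparity spelled out, a small sketch of the threshold arithmetic follows; the five-gram and 500-gram figures are the pre-FSA triggers for the five-year mandatory minimum described above, and 28 grams is the FSA's new crack threshold:

```python
# Quantity thresholds (in grams) that trigger the five-year
# mandatory minimum sentence, as described above.
powder_cocaine = 500   # unchanged by the Fair Sentencing Act
crack_1986 = 5         # Anti-Drug Abuse Act of 1986
crack_2010 = 28        # Fair Sentencing Act of 2010

print(f"Pre-FSA disparity:  {powder_cocaine / crack_1986:.0f}:1")  # -> 100:1
print(f"Post-FSA disparity: {powder_cocaine / crack_2010:.0f}:1")  # -> 18:1
```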

Just one month after that measure was signed into law, the two Chicago men — Corey Hill and Edward Dorsey — were both given mandatory minimum sentences in line with the Reagan-era penalties. Judges on the 7th Circuit Court of Appeals upheld those sentences, reasoning that the men had committed their crimes before the FSA was signed by President Barack Obama and that it was not clear whether the FSA should be applied to cases pending at that time.

The Supreme Court reversed that decision on Thursday, however, after Justice Anthony Kennedy sided with Sotomayor, Kagan, Breyer and Ginsburg to overcome the conservative justices in a split decision.

++++

Supreme Court overrules FCC on TV swearing ban 
By Arturo Garcia - RAW Story
Thursday, June 21, 2012

So it turns out you can curse on television – sort of.

According to MSNBC, the Supreme Court on Thursday threw out fines and sanctions against ABC and Fox, ruling that the Federal Communications Commission did not give the networks fair notice before punishing them over brief instances of curse words and nudity.

The ruling (PDF), which does not affect the FCC’s overall policy toward profanity, centered on outbursts by Cher and Nicole Richie on live awards shows on FOX and a brief instance of partial nudity shown on ABC’s NYPD Blue.

“Because the Commission failed to give Fox or ABC fair notice prior to the broadcasts in question that fleeting expletives and momentary nudity could be found actionably indecent, the Commission’s standards as applied to these broadcasts were vague,” Justice Anthony Kennedy wrote in the unanimous decision, adding that the FCC was free to revise its current policy “in light of its determination of the public interest and applicable legal requirements.”

CISPA sponsor says Obama will sign cybersecurity bill

21 June, 2012 - RT


The White House has gone on the record to say that US President Barack Obama will veto the controversial cybersecurity bill known as CISPA, but the author of the act has his doubts that the commander-in-chief will keep that promise.

Congressman Mike Rogers (R-Michigan), one of two US Representatives responsible for introducing the heated Cyber Intelligence Sharing and Protection Act to Congress, has opened up once more on the subject of CISPA. According to the lawmaker — who also sits as chairman of the House Permanent Select Committee on Intelligence — President Obama is likely to loosen his stance on the cybersecurity bill and sign it into law if given the chance.

“[I]f we can get a bill on information-sharing to the president’s desk, he’ll sign it. I do believe that,” Rep. Rogers said this week during a panel discussion on the bill, reports Daily Dot. 

Since being introduced earlier this year, CISPA has attracted criticism from across the country thanks to activists rallying against the attempt to involve Washington in the inner workings of the Web. If passed, it would give the government the go-ahead to peer into personal information stored online by Internet users and invite it to examine sensitive data provided to third-party companies and other private-sector corporations that operate on the Web — all while excusing those businesses from any liability — under the guise of national security.

The legislation has already cleared the US House of Representatives and is awaiting a vote in the Senate. As lawmakers on that side of the aisle prepare to ponder the bill, however, Rep. Rogers says he’s confident that, once the “dust settles,” Obama will authorize the act.

Earlier this year, the White House responded to the attention CISPA was accumulating by releasing a statement expressing the administration's opposition to the bill.
“The sharing of information must be conducted in a manner that preserves Americans' privacy, data confidentiality and civil liberties and recognizes the civilian nature of cyberspace,” the memo reads. “Cybersecurity and privacy are not mutually exclusive. Moreover, information sharing, while an essential component of comprehensive legislation, is not alone enough to protect the nation's core critical infrastructure from cyber threats. Accordingly, the administration strongly opposes H.R. 3523, the Cyber Intelligence Sharing and Protection Act, in its current form.”

Comparisons have been made, of course, with another piece of controversial legislation that Obama signed in recent months after a similar statement had been offered to the public. Last year the administration insisted that the president would veto the National Defense Authorization Act for Fiscal Year 2012, but Obama eventually inked his name to the bill on December 31, issuing a signing statement acknowledging that he had reservations about the legislation. Nonetheless, the attention-grabbing provisions of the NDAA that allow the indefinite detention of American citizens remained in the final bill, despite the president’s alleged opposition. Several states have since considered legislation of their own that would free them from complying with those federal provisions.

Last month, Howard A. Schmidt, a leading White House official who publicly condemned CISPA on behalf of the administration, suddenly stepped down as Obama’s cybersecurity coordinator. In turn, skeptics were quick to suggest that the White House may in the future be less persistent in its fight against CISPA than it was with Schmidt on board.

That is not to say, however, that CISPA will inevitably be authorized. It has garnered the attention of several leading lawmakers in Washington who are opposed to the bill, including Rep. Ron Paul (R-Texas) and Sen. Ron Wyden (D-Oregon). Last month Sen. Wyden told members of Congress that passing CISPA would create “a Cyber Industrial Complex” that would allow the federal government and its Big Business cohorts to profit off the personal info of any American with an Internet connection.

Wealth Distribution

GOP Enthusiasm for Romney (This Modern World)




Campaign Spending and States' Rights

by JAMES MARC LEAS
 
Never, never, never. That appears to be the word from Montana Attorney General Steve Bullock as he charges toward judgment day on Thursday, June 21, at the US Supreme Court with no assertion of Montana’s sovereign immunity, its best defense against summary reversal. The case came before the Supreme Court conference last Thursday but was held for further consideration at this Thursday's conference, giving Bullock one more chance. The Montana case is a state-level replay of Citizens United. The corporations suing Montana want the Supreme Court to grant review and then immediately issue a one-line decision summarily reversing the Montana Supreme Court decision upholding Montana’s Corrupt Practices Act, which outlaws the kind of private money in Montana elections that is flooding elections elsewhere.

But the clock is ticking down. When the Court closes its doors Wednesday night, Montana’s last best hope to file a motion asserting sovereign immunity under the 11th Amendment–an action that would effectively void any risk of summary reversal–comes to an end, as described in an article “Montana Citizens United Case Hanging in the Balance.” Any exception to 11th Amendment immunity because a case originated in state court has been overturned, as illustrated by a 2011 Supreme Court decision.

On Thursday, the nine US Supreme Court justices meet in conference to give final consideration to this case. A New York Times article suggested that acceptance of this “historic” case for review and immediate summary reversal will be the likely result. Immediate summary reversal means a decision reversing the Montana Supreme Court decision and ending the case without any briefs, without any hearing, and without Montana getting its day in court. In three recent cases the Supreme Court granted review and then summarily reversed in five-vote decisions.

Some liberal Democrats play down any form of states’ rights. But Steve Bullock appears to have made a fundamentalist religion of opposing state sovereign immunity in this case, no matter how extreme the facts, how compelling the argument for using it, or the potential cost to election integrity in Montana and to the country as a whole.

Not so for other attorneys general. For example, in my state of Vermont, Attorney General Bill Sorrell listed sovereign immunity among the affirmative defenses in his response to the 2011 federal court suit against Vermont state officials by Entergy Corporation, which is seeking to keep our aging and leaking Fukushima-type nuclear power plant operating.

But the Vermont Yankee case presents not nearly as strong an argument for denying federal jurisdiction as does Montana’s case. The Montana case is so important, and the facts are so extreme, that all parts of the political spectrum can join in supporting state sovereign immunity in this case. And they can do so without any risk of adverse consequences to other vitally important issues because none of those issues have anywhere near the extreme features the Montana case has.

Two amicus briefs to the US Supreme Court filed by public interest organizations supporting Montana describe how the five most conservative justices on the US Supreme Court, led by Justice Anthony Kennedy, laid out the stringent conditions under which sovereign immunity can trump an individual or corporate suit against state officials in federal court.  If the real party in interest is the state itself, rather than the state officials, then the 11th Amendment by its express terms prohibits the Supreme Court from taking jurisdiction over the suit. The state is the real party in interest when fundamental sovereignty interests of the state or the state treasury would be put at risk if the private or corporate party prevails. Another key factor is whether an alternate way exists, such as for the United States Government to defend its interest in enforcing federal law against the state. All of these line up in the Montana case, as described in the two amicus briefs and in the article, “Montana Citizens United Case Hanging in the Balance.”

First, the words of Montana Governor Brian Schweitzer summarize the sovereignty interests at stake in this suit: “This business of allowing corporations to bribe their way into government has got to stop.” Governor Schweitzer also said, “This is our government and we are not going to allow any corporation to steal it from us.”

Second, the concern that defending states’ rights in this case would affect immigrant rights in Arizona, or race, gender, labor, and civil and political rights elsewhere, does not apply. Just the opposite. In cases involving such rights, either Congress has already passed laws abrogating states’ immunity under the 11th Amendment, or, as in the Arizona case, the United States Government has brought the action to force the state to comply with the Constitution. In the Montana case, Congress has not passed a law requiring the states to obey Citizens United or requiring states to defend themselves in federal court against private suits seeking to overturn state laws restricting money in elections. Nor has the Executive Branch brought suit against the state. Until one of those two things happens, Montana should be immune under the 11th Amendment from any corporate suit seeking to overturn its Corrupt Practices Act.

By taking a fundamentalist position against even asserting Montana’s 11th Amendment right not to be sued in federal court, Steve Bullock is putting Montana’s 100-year-old anti-corruption law at risk. Bullock is putting even more at risk, for if Montana prevails against the corporate bullies and raiders, it sets a template for all the states to free themselves from rule by self-dealing private corporations and plutocrats.

Montana AG Steve Bullock has just hours left to reconsider and to file a one-page conditional motion with the US Supreme Court saying that Montana wishes to assert its defense of sovereign immunity under the 11th Amendment if the Court decides to accept the case. His motion can include the request that the Court consider whether it has jurisdiction in view of Montana’s sovereign immunity before making any decision–including summary reversal–on the merits.

Montana could mention the previous cases in which the Supreme Court decided that jurisdiction can be raised at any stage of the proceedings and that jurisdiction is the first thing to be decided–before any decision on the merits, including summary reversal. This way Montana can avoid the heavy risk of summary reversal. Montana should assert its sovereign immunity today, before the last day it can do so passes, or the nation and Montana voters will forever wonder why it did not.

The Recession Politicians Don't Want to Talk About

The Real Story of the Housing Crash
by DEAN BAKER

The economy is certain to occupy center stage in the presidential race this fall. Unfortunately, neither Governor Romney nor President Obama is likely to give us an accurate account of the economic problems we are now facing.

Romney’s efforts seem intended to convince the public that President Obama has turned the country into the Soviet Union, with government bureaucrats shoving aside business leaders to take the commanding role in the economy. He will have lots of money to make this case, which he will need since it is so far from reality.

Corporate profits are at their highest share of the economy in almost 50 years. The share of profits being paid in taxes is near its post-World War II low. The government’s share of the economy has actually shrunk in the Obama years, as has government employment. Perhaps Romney can convince the public that the private sector is being crushed by burdensome regulation and taxes, but that has nothing to do with reality.

Unfortunately, President Obama’s economic advisors have not been much more straightforward with the American people, never offering a clear explanation of why the economy has taken so long to recover. They have pointed out that economies often take a long time to recover from the effects of a financial crisis like the one we experienced in the fall of 2008, but that is not an explanation for why we have not recovered.

The basic story is actually quite simple. The housing bubble had been driving the economy prior to the recession. It created demand through several channels. A near-record pace of housing construction added about 2 percentage points of GDP to annual demand, or more than $300 billion in the current economy.

The $8 trillion in ephemeral housing wealth created by the bubble led to a huge surge in consumption. Tens of millions of people borrowed against bubble-generated equity or decided that they didn’t need to save for retirement. When house prices were going up 15-20 percent a year, the house was doing the saving. The result was a huge consumption boom on the order of 4 percent of GDP or $600 billion a year.

In addition, there was a bubble in non-residential real estate that followed in the wake of the housing bubble. This raised non-residential construction above its normal levels by close to 1 percent of GDP or $150 billion a year.

Adding these sources of demand together, the bubble generated well over $1 trillion in annual demand at its peak in 2005-2007. When the bubble burst, this $1 trillion in annual demand vanished as well. That is the central story of the downturn.
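
Baker's aggregation is simple enough to verify with back-of-the-envelope arithmetic. The sketch below assumes a GDP of roughly $15 trillion (the approximate level when this was written), which is the figure implied by the dollar amounts cited above:

```python
# Rough check of the bubble-era demand figures described above,
# assuming US GDP of about $15 trillion (an illustrative 2012-era figure).
gdp = 15e12  # dollars

housing_construction = 0.02 * gdp  # ~2 pct. points of GDP -> ~$300 billion
bubble_consumption   = 0.04 * gdp  # ~4 percent of GDP     -> ~$600 billion
nonresidential_boom  = 0.01 * gdp  # ~1 percent of GDP     -> ~$150 billion

total = housing_construction + bubble_consumption + nonresidential_boom
print(f"Annual bubble-driven demand: ${total / 1e12:.2f} trillion")
# -> about $1.05 trillion, consistent with "well over $1 trillion"
```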

To recover we must find some way to replace this demand; however, that is not easy. People will not go back to their old consumption patterns because they know they need to save more. Tens of millions of people have much less wealth than they expected at this point in their lives after they saw the equity in their homes largely vanish. Tens of millions of baby boomers are approaching retirement with almost nothing but their Social Security to support them.

Given the huge loss of wealth from the collapse of the housing bubble it is not reasonable to expect consumption to rise to fill the demand gap. It doesn’t make much more sense to expect investment to do the job. Historically, investment in equipment and software has been close to 8 percent of GDP. It is pretty much back to that level today. To fill the demand gap created by the collapse of the housing bubble the investment share of GDP would have to nearly double to 14 percent.
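
The same illustrative numbers show where the 14 percent figure comes from (again assuming the roughly $1 trillion gap and $15 trillion GDP used above):

```python
# Investment share of GDP needed to absorb the demand gap,
# using the same illustrative figures as the sketch above.
gdp, demand_gap = 15e12, 1e12
historical_share = 0.08  # equipment and software, historical norm

needed_share = historical_share + demand_gap / gdp
print(f"Needed investment share: {needed_share:.1%}")
# -> about 14.7% of GDP, in line with the claim that the 8 percent
#    share would have to nearly double to roughly 14 percent
```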

This would be almost impossible to imagine at any time, but it is especially far-fetched at a time when much of the economy is operating far below its capacity. Businesses are unlikely to spend a lot of money expanding their facilities when the existing capacity is sitting idle regardless of how nice we are to job creators.

Over a longer term we can expect that net exports will fill the demand gap. If we bring our huge trade deficit close to balance by selling more abroad and importing less it will provide a substantial boost to demand. However, this will require that the dollar fall in value relative to the currencies of our trading partners, making U.S. products more competitive. That is a process that will take time. With many of our trading partners also in severe slumps, we cannot expect any major improvement in our trade balance in the immediate future.

This leaves government as the only remaining source of demand. This is not a question of whether we prefer the government or the private sector. We need the government sector to fill the gap in demand because the private sector will not do it. And that will be true no matter how much we love the private sector and its job creators.

Until we get our trade deficit closer to balance we will need large government deficits to fill the gap in demand created by the housing bubble. That is the simple reality that neither party seems anxious to tell the people.

How Not to Waste Your Vote in November

Repressive Democracy
by ANDREW LEVINE

There is no sugar-coating the fact: the failure to recall Governor Scott Walker in Wisconsin was a defeat for organized labor and the entire American working class, and for everyone with interests opposed to those of the plutocrats for whom Walker fronts. By some estimates, this would be 99% of us.

Still, judging by the reaction of the corporate media, the election was barely newsworthy – except, for a night or two, to the Republican flacks at Fox News and their Democratic counterparts at “left leaning” MSNBC. What they both cared about was the impact on Barack Obama’s reelection.

Predictably, at Fox, Walker’s victory spelled doom for Obama; at MSNBC, also predictably, it wasn’t that big a deal. Sure, Ed Schultz, self-declared friend of the working stiff, was mad as hell, but he’ll get over it; Rachel Maddow, growing more tiresome by the day, seemed over it before she even got into it.

To the extent that they offered any analysis at all it was about how zillionaires, the Koch Brothers especially, pumped in so much money for Walker that he outspent Tom Barrett, the Democratic candidate, seven to one.

This translates into seven times more mind-numbing commercials. No one bothered to fault a political culture where this has become what it’s all about. Nobody complained that these ad campaigns leave little or no time or space for rational deliberation, the hallmark supposedly of collective decision-making in democratic states.

What the “left leaning” talking heads complained about instead was how unfair the PR wars have become now that the Supreme Court has given corporate “persons” carte blanche to spend all they please.

There was hardly a word, therefore, about how Obama remained aloof from the events in Wisconsin throughout 2011, as people from all walks of life mounted the most sustained revolt in decades against capital’s escalating attack on the working class. And neither was there much discussion of how, in the days before the election, Obama couldn’t even be bothered with the recall election – except to send a bland tweet of support for Barrett the day before.

He was too busy palling around with corporate CEOs in adjacent states, and chatting up hedge fund managers and banksters in Chicago; too busy, in other words, showing his true colors.

And, of course, no one thought to mention how little the Barrett campaign had to do with the aspirations of the tens of thousands of people who occupied the state Capitol in the winter and spring of 2011; or of how the class offensive Walker launched was only an extreme version of the neo-liberal assault on the gains of the middle decades of the twentieth century, a regressive project that Democrats and Republicans alike have been pursuing for more than thirty years.

Therefore no one pointed out how the most one could have hoped for from a Barrett victory was a kinder and gentler version of Walker’s overreaching.

None of this is surprising. Serious discussions of anything other than the horse race between Romney and Obama would be too much to expect from the pundits at MSNBC. They are basically cheerleaders for Obama. They work a different angle.

Because they have so little to cheer, they fear monger shamelessly — making much, perhaps too much, of Republican idiocy. It’s nice work if you can get it because Republicans are easy prey – their plutocrats are more than usually repellent, their elected officials are morons, and their useful idiots wear their ridiculousness on their sleeves.

Rubbing all this in is well and good. But it is also diversionary. It helps Obama and Company go ahead with their endless (and largely secret) wars. It draws attention away from their war-induced disregard of the rule of law. And it frees them up to toady to the capitalists whose hearts and minds they yearn, in vain, to win over.

While Obama and the others do all this and more, liberals don’t notice or else they notice but still cut them slack. And why shouldn’t they? By their lights, Obama and his minions are all that stand between those who still have the wits they were born with and the demented theocrats and market theologians on the other side.

No matter that if there was anything like equal justice under law in the Land of the Free, Obama would have been indicted or impeached long ago. Having American citizens put to death without even a semblance of due process is only the most egregious of the high crimes and misdemeanors with which he could be charged. Nixon did no worse; by comparison, Clinton was impeached for a trifle.

No matter too that many of the Nobel laureate’s machinations abroad are actionable under international law. Were the mighty held to the standards they impose on others, Obama, like Bush and Cheney before him, could count on spending the rest of his life in an orange jump suit.

But none of this matters to the “liberal” media — not when there are Republicans to mock and Democratic voters to frighten.

On most domestic issues, Kennedy and Johnson and even Nixon were better than Obama by orders of magnitude. But like Bush and Obama after them, they waged ruinous wars that outraged the moral sensibilities of people throughout the world. Then, like now, the situation they brought about seemed impossible to set right; not, anyway, through the usual political channels.

The apparent impossibility of changing the old order from within led some of the opponents of Kennedy’s, Johnson’s, and Nixon’s wars to call on intellectual traditions that transcend the horizons of normal politics by focusing on the contradictory character of prevailing institutions and norms.

The most venerable and soundest of these radical currents, and the most influential in that period, derived from the philosophy of G.W.F. Hegel (1770-1831). The usual point of entry to this tradition for anti-war and anti-imperialist militants in the United States in the sixties and seventies was through the writings of thinkers associated with the so-called Frankfurt School.

Herbert Marcuse was not the most eminent exponent of Frankfurt School thought but he was by far the most influential propagator of it in the United States. As such, he quickly rose from obscurity to fame, becoming a target of adoration in New Left circles and, needless to say, of media vilification. Today, he is an all but forgotten figure.

But his work remains a source of suggestive – and timely – insights that can be useful for thinking about overcoming hopelessness in the Age of Obama and in the aftermath of the Wisconsin defeat.

* * *

Four decades ago, Marcuse published a widely read essay with a seemingly self-contradictory title, “Repressive Tolerance.”

His aim in that essay was to account for the astonishing degree of political conformity he observed in the United States and other liberal democracies. What struck him was the fact that, despite a comparative absence of overt repression, “critical” thinking, counter-systemic thinking opposed to the established order, was, if anything, even less evident in liberal democracies than in societies where speech and other forms of expression were subject to government suppression and control.

For Hegelians like Marcuse, history is the story of the career and ultimate realization of Reason in society or, what comes to the same thing in the Hegelian view, of a certain idea of human freedom, according to which, at the end of history, universal principles of Right govern human interactions and institutional arrangements. The United States, for all its liberal virtues, was nothing like the ideal. Marcuse was struck by how politically inconsequential this fact was, and at how much support there was instead for the existing order.

From within a broadly Hegelian perspective, one possible explanation for political conformity could be a rational consensus supporting the status quo. It could be, in other words, that there is little critical thinking in societies like ours because the goal of criticism has already been realized. Marcuse, for obvious reasons, rejected this explanation. For him, as for all “left Hegelians” since the 1840s, liberal democracy was, at best, History’s penultimate stage; not its “end.”

How then can the inherently conservative, non-critical “one dimensionality” of our politics be explained? Marcuse’s ideas were sketchy and problematic, but they grew out of a striking insight that warrants careful consideration.

He thought that while liberalism had been and in many ways still is part of Reason’s forward advance, and while tolerance of speech and other forms of expression is a defining element of liberal doctrine, tolerance can and sometimes does work to maintain an oppressive status quo. Marcuse claimed that tolerance did for the regime in place in the United States and similar societies what repression did for societies on the other side of the “iron curtain.”

He was vague about how this comes about. Much like his contemporary, Marshall McLuhan, his reflections focused on the nature of modern mass media and, for reasons he never made clear, he ascribed great causal significance to the differences between, say, flat screens and printed words. His reflections were at most only suggestive. They were also vague enough to be ignored or dismissed once the political moment that had made them seem timely passed.

But however unsatisfactory or obscure his explanations were, he was clear as can be about the phenomenon itself: in (literally) repressive societies, subversive ideas are potent. On the other hand, where repressive tolerance reigns, anybody can say anything, but it doesn’t matter; critical thought is effectively neutered.

According to Marcuse, it is not the potency of critical ideas that necessitates their repression in authoritarian societies; quite the contrary, it is their repression that renders them potent. Similarly, it is not the impotence of critical thinking that makes pure tolerance possible in liberal regimes. Tolerance, Marcuse argued, is what renders criticism impotent.

To be clear: Marcuse was not an opponent of liberal values. For him as much as for any defender of pure tolerance, an ideal world would be a tolerant world. And he agreed with the founding figures of modern liberalism — John Stuart Mill, for example — when they maintained that in general tolerance is a means for advancing the ideal; that the end and the most effective means for achieving it are in many cases one and the same. His point was just that in societies like ours, for any of a variety of barely specified reasons, tolerance had “turned into its opposite”; that what had been and ideally is an instrument of human liberation had become a means for impeding humanity’s forward march.

Marcuse’s account of repressive tolerance was not just an idle philosophical reflection. It was a contribution to on-going philosophical debates about free speech, but it was also a political intervention at a time when students and others engaged in struggles for civil rights and against the Vietnam War would sometimes disrupt the speeches of racists and war defenders.

His views on disrupting speech were more subtle than those of many of the disruptors: he held that illiberal means are almost always counter-productive, even if there is no moral constraint in deploying them. Still, his brief against pure tolerance was taken as a theoretical justification for a political practice that had taken on illiberal colorations. This was yet another reason why Marcuse’s fame was brief.

But his position was not as out of line with mainstream thinking as might appear. For example, among those who think that violence has no place in an ideal world, there are some (the vast majority) who think that a judicious use of violence in the actual world can be, and often is, useful for advancing the ideal. Only pacifists, for whom violence is everywhere and always morally proscribed, think otherwise.

Liberals who uphold tolerance in any and all circumstances are like pacifists. Marcuse’s position is analogous to the non-pacifist’s. He thought that, in certain circumstances, a judicious use of intolerance can be beneficial for bringing a more tolerant society into being, just as others think that a judicious use of violence can help bring about a more peaceful world. Presumably, the bar with respect to both transgressions of the ideal is best set high.

Whatever we ultimately make of Marcuse’s position on free speech, his main point, properly generalized, is unassailable: that in the actual world, where the misfortunes consequent upon the indefinite prolongation of capitalist civilization are manifest, it can be and often is counter-productive to act as one should in a world closer to the Hegelian ideal.

In that spirit, it bears reflection whether, in our circumstances, democracy, or at least the electoral system that captures a large part of what we nowadays mean by that word, can work to the detriment of democratic ideals, whether it too can function repressively (or, more precisely, anti-democratically).

This is a possibility that the Wisconsin insurgents would have done well to worry about more, and that people involved with the Occupy movements should think carefully about as well, especially now that Team Obama is eager to coopt their energy while quashing the ideas and not-yet-fully-articulated intuitions that spurred them into action.

* * *

The analogy with repressive tolerance is imperfect, of course. Marcuse’s target was pure tolerance, tolerance of speech and expression regardless of content. His point was that acting as if we are already living in a world in which pure tolerance can work the beneficial effects its defenders imputed to it can impede efforts to bring that condition about; that there are times and places – our own, for example – where tolerance that takes no account of content actually works to the detriment of the ideal.

But our democracy is anything but ‘pure.’ It is an amalgam of institutional arrangements and practices concocted long ago by lawyers, merchants and slave owners who sought democratic legitimacy at the same time that they were intent on shielding themselves from the consequences of government of, by, and for the people. Thus we don’t even pretend to implement anything like the principle of one person, one vote.

In addition, our democracy has lately been sullied further by Republican Supreme Court Justices intent on making plutocratic domination – they call it “free speech” — the law of the land. And lately, again thanks to the GOP, efforts at voter suppression are rife.

Therefore even if we identify democracy with competitive elections that, like ours, are more or less “free and fair,” our democracy is not nearly as ‘pure’ as our tolerance is. Nevertheless, Marcuse’s reflections on how tolerance turned into its opposite are applicable to what passes for democracy in our time and place.

Implicitly, the people fighting Walker’s depredations in Wisconsin realized this before the force of circumstances caused their efforts to take an electoral turn. The masses of people involved in the Occupy movement realized it too with even greater clarity. They grasped what had not yet become clear to many of them a year or two earlier or indeed to the millions who voted for Obama in 2008: that if the idea is to make the world a better place, forget about an electoral regime dominated by Democrats and Republicans.

If only it could have stayed that way!

When the electoral season was still far off, it was still possible to ignore Obama and the national Democratic Party, to pay back Obama’s indifference in kind. [In Wisconsin, the state Democratic Party was a different story; at key moments – for instance, when the entire Democratic caucus in the Senate fled the state – Democrats actually played a constructive role.]

But it soon became apparent that democracy in the streets would have to assume a more political focus. It was not possible to ignore Obama and the Democrats indefinitely, especially in an election year.

After what happened in Wisconsin, it has become as plain as can be that, in a repressive democracy, it is essential to name the enemy. The enemy includes benighted theocrats and free marketeers and, of course, the plutocrats who bankroll GOP candidates. But the list includes Obama too, and the Democratic Party, and their paymasters.

That’s not exactly news, but it is a point that not everyone in the MSNBC demographic and in the ranks of organized labor realizes yet. Or, if they do, they don’t dare draw the obvious conclusion.

It’s that lesser evil thing again. And who can deny the reality of the menace? Romney and the Republicans truly are pieces of work.

Even so, it’s not clear what follows. Lesser-evil voting almost always has race-to-the-bottom consequences that must not be overlooked and that can be devastating over time. And it is demonstrably the case that when a Republican is in the White House, Democrats in Congress become better (less bad).

Compare the period from 2006 to 2008, when the executive branch was still in Republican hands and Democrats controlled Congress, with the period between 2008 and 2010, when Democrats controlled both branches of government. A case could be made that we were better off in the earlier period, even despite Nancy Pelosi’s, Harry Reid’s and other leading Democrats’ best efforts to keep the party on its rightward, Clintonite course.

On the other hand, it is almost certainly the case that we can’t get from here back to there because a Romney victory in November would all but assure another Democratic “shellacking” in the House and Senate. And there is the additional consideration that a Republican administration would make worse judicial appointments than Obama would, and that we’d be living with the consequences for decades to come.

In short, the coming election raises problems for which there is no obvious solution. The only sure thing is that the outcome, whatever it is, will be awful – for everyone for whom the failure to recall Walker is awful, but on a grander scale.

A first step in gaining a sound purchase on just how bad the situation is, and therefore for figuring out what to do about it eventually – there may be nothing that can be done in the short run — is to realize, in the spirit of Marcuse’s account of repressive tolerance, that our democracy is indeed a repressive democracy, and that the electoral system itself — not in general but in our time and place – has become a means for keeping an oppressive status quo in place.

* * *

The problem would be mitigated, of course, if elements of the old liberal-labor coalition would break free from the Democratic Party by voting for candidates who run on platforms that plutocrats don’t own. In the presidential contest this time around that would mean voting for Jill Stein of the Green Party.

However it is a deeply entrenched dogma of our political culture that third party votes are wasted. This self-fulfilling prophecy is all but impossible to dislodge, and so its consequences cannot be evaded. If even a deeply respected national figure like Ralph Nader could only garner 2.74 per cent of the vote in 2000, running against Al Gore and George W. Bush, what chance is there for someone who is unknown and massively underfunded to break out of the iron cage?

The short answer is: none at all. But that doesn’t mean that her candidacy can’t be useful for telling people, those who are able to listen, what they need to hear. And it doesn’t mean that there is no point in casting a protest vote when the alternative is piling on votes for the lesser evil, especially when it isn’t clear who the lesser evil is.

In our very impure repressive democracy, the only way to cast a protest vote against Obama is to vote for somebody else. Since voting for Romney is unthinkable for any ninety-nine percenter with minimally developed moral and intellectual capacities, voting for the Greens may be all that’s left.

To be sure, voting for someone, no matter how estimable, who has no chance of garnering a single electoral vote is making the best of a very bad situation. But at least it is not, as the conventional wisdom would have it, wasting a vote – at least not in states where, thanks to our impure democratic institutions, the electoral votes are already effectively cast.

In those cases, the best, perhaps the only, way to waste a vote is to cast it for the purported lesser evil, the drone-besotted corporate flunky who has made it all but impossible even to speak without derision of “hope” or “change.”

Silent Spring for Us?

by PAUL CRAIG ROBERTS
 
With her 1962 book, Silent Spring, Rachel Carson got DDT and other synthetic pesticides banned and saved bird life. Today it is humans who are directly threatened by technologies designed to extract maximum profit from natural resources at the lowest private cost and the highest social cost.

Once-abundant clean water has become a scarce resource. Yet in the US, groundwater and surface water are being polluted and made unusable by mountaintop-removal mining, fracking, and other such “new technologies.” Ranchers in eastern Montana, for example, are being forced out of ranching by polluted water.

Offshore oil drilling and chemical farming run-off have destroyed fisheries in the Gulf of Mexico. In other parts of the world, explosives used to maximize short-run fish catches have destroyed coral reefs that sustained fish life.   Deforestation for short-run agricultural production results in replacing bio-diverse rain forests with barren land. The “now generation” is leaving a resource scarce planet to future generations.

Nuclear power plants are thoughtlessly built in earthquake and tsunami zones. Spent fuel rods are stored within the plants, a practice that adds their destructive potential to a catastrophic accident or act of nature.

The newest threat comes from genetically modified seeds that produce crops resistant to herbicides. The active ingredient in Monsanto the devil’s Roundup herbicide is glyphosate, a toxic compound that now contaminates groundwater in Spain and, according to the US Geological Survey, is “commonly found in rain and streams in the Mississippi River Basin.”

In 2011 Don Huber, a plant pathologist and soil microbiologist, wrote to the US Secretary of Agriculture about the unexpected consequences of GMOs and the accompanying herbicides. He cited adverse effects on critical micronutrients, soil fertility, and the nutritional value of foods.  He cited the impairment of metabolic pathways that prevents plants from accumulating and storing minerals, such as iron, manganese, and zinc, minerals important for liver function and immune response in animals and people. He cited toxic effects on the microorganisms in the soil that have disrupted nature’s balance and resulted in large increases in plant diseases. He cited livestock deaths from botulism, premature animal aging, and an increase in animal and human infertility.

In an interview, Huber said that the power of agri-business has made it almost impossible to do research on GMOs and that regulatory agencies with the responsibility of protecting the public are dependent on the industry’s own self-serving studies and have no independent objective science on which to base a regulatory decision.

In short, in order to secure bumper crops for several years, we are destroying the fertility of the soil and of animal and human life.

Mankind has been destroying the world for a long time. In his fascinating book, 1493, Charles C. Mann describes the adverse effects on the environment, people, and civilizations of the globalism unleashed by Christopher Columbus. These include the international transfer of human and plant diseases, deforestation, the destruction of peoples and empires, and the impact on distant China of Spanish new world silver.

Mann provides a history lesson in unintended and unexpected consequences resulting from the actions of elites and of those that elites dominated. The Chinese government fixed taxation in terms of the quantity of silver, but the importation of Spanish silver inflated prices (decreased the value of a given quantity of silver) and left the government without sufficient revenues.
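
To see the mechanism concretely (the figures here are hypothetical, offered only to illustrate Mann’s point): suppose the tax was fixed at 10 ounces of silver per household, and an ounce of silver initially bought one sack of rice. If the influx of Spanish silver doubles prices, the same tax buys half as much:

real revenue = fixed silver tax ÷ price level = 10 ÷ 1 = 10 sacks before the influx, but 10 ÷ 2 = 5 sacks after.

The nominal take is unchanged; its purchasing power, and hence the government’s effective revenue, is halved.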

A successor government or dynasty evicted Chinese from the coast in order to deprive pirates of resources. The displaced millions of people deforested mountainsides in order to sustain themselves with terrace agriculture. The result of deforestation was floods that not only washed away the terraces but also the crops in the fertile valleys below. Consequently, floods became one of China’s greatest challenges to its food supply.

The first slaves were conquered new world natives, but the “Indians” had no immunity to European diseases. The second wave of slaves was European whites, but the Europeans had no immunity to malaria and yellow fever. By default, slavery fell to blacks, many of whom had immunity to malaria and yellow fever. Thus a black workforce could survive the infected environments and the newly created wetlands in which sugarcane was raised, wetlands that were ideal homes for malaria- and yellow fever-bearing mosquitoes. Mann, of course, is merely reporting, not justifying slavery, black or otherwise.

Mann points out that the lowly mosquito had a large impact on American history. The Mason-Dixon Line roughly splits the East Coast into two zones: the South, in which disease-carrying mosquitoes were an endemic threat, and the North, in which malaria was not a threat. In the South, a person who survived childhood and grew into an adult had acquired immunity. Northerners had no such protection.

This had enormous consequences when Northern armies invaded the South. Mann reports that “disease killed twice as many Union troops as Confederate bullets or shells.”  Between the summers of 1863 and 1864, the official annual infection rate for what was called “intermittent fevers” was 233 percent.  The average northern soldier was felled more than twice. In one year 361,968 troops were infected.  Most of the deaths from malaria were indirect.  The disease so badly weakened the troops that they died from dysentery, measles or strep infection.

The mosquito was the South’s most powerful ally and so prolonged the war, despite the vast numerical superiority of the Union force, that Lincoln was forced to take action that he opposed and declare emancipation of slaves. Thus, Mann writes, it is not farfetched to conclude that blacks were freed by the very malaria mosquito that had caused blacks to be the preferred workforce.

Mann shows that long before the birth of capitalism, greed drove men to barbarous treatment of their fellows. He also shows that policies, whether driven by greed or by well-intended socio-political design, inevitably had unexpected consequences. His multi-faceted history well illustrates the old adage, “the best-laid plans of mice and men often go awry.”

The old world’s colonization of the new world devastated new world peoples, but the new world bit back, spreading the potato blight to Europe and fueling inflation in Spain and the rest of Europe.

Environmental destruction resulted mainly from deforestation and soils washed away by consequent floods. Prior to modern technology and toxic chemicals, the planet survived mankind.

Today the prospects for the planet are different. The human population is vast compared to earlier times, putting far more pressure on resources, and the disastrous consequences of new technologies are unknown at the time that they are employed, when the focus is on the expected benefits. Moreover, these costs are external to the business, corporation, or economic unit. The costs are inflicted on the environment and on other humans and other animal life. The costs are not included when the business calculates its profit and return on its investment.  The external costs of fracking, mountain top removal mining, chemical farming, and GMOs could exceed the value of the marketable products.

Businesses have no incentive to take these costs into account, because to do so reduces their profits and could indicate that the full cost of production exceeds the value of the output. Governments have proven to be largely ineffective in controlling external costs, because of the ability of private interests to influence the decisions of government. Even if one country were to confront these costs, other countries would take advantage of the situation. Companies that externalize some of their costs can undersell companies that internalize all of the costs of their production.  Thus, the planet can be destroyed by the short-term profit and convenience interests of one generation.
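
A schematic illustration of this arithmetic (the numbers are hypothetical): suppose a unit of output sells for $100, costs the producer $60 in private costs, and inflicts $50 of pollution damage on others. The externalizing firm books a profit of

price − private cost = 100 − 60 = 40,

while a rival that internalizes the damage, spending the full $50 on abatement, books 100 − 60 − 50 = −10 and is driven from the market. And since the full social cost, 60 + 50 = 110, exceeds the $100 the product is worth, production that looks profitable is, in the aggregate, value-destroying.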

The main lesson that emerges from Mann’s highly readable book is that people today have no better grasp of the consequences of their actions than superstitious and unscientific people centuries ago.  Modern technological man is just as easily bamboozled by propaganda as ancient man was by superstition and ignorance.

If you doubt that the peoples of Western civilization live in an artificial reality created by propaganda, watch this documentary on psyops. The documentary does a good job despite wandering off into a couple of side issues on which it takes one-sided positions.  It is a bit heavy on blaming the rich, and overlooks that Stalin, for example, had plenty of propaganda and wasn’t looking to make himself a billionaire.  Not all the rich are against the people. Billionaires Roger Milliken and Sir James Goldsmith fought against jobs offshoring and globalism, which increases the powerlessness of the people vis-a-vis the elites. Both spoke for the people to no avail.

The documentary also blames the Constitution for limiting the participation of the mass of the people in governing themselves without acknowledging that the Constitution restricted the power of government and guaranteed civil liberty by making law a shield of the people instead of a weapon in the hands of the government. It is not the Constitution’s fault, or the fault of Founding Father James Madison, that the American people succumbed to propaganda by Bush and Obama and gave up their civil liberty in order to be “safe” from “Muslim terrorists.”

The documentary shows that propaganda is a form of mind control, and controlled minds are indeed the American predicament.

In 1962 Rachel Carson caught Monsanto the devil off guard and thus gained an audience. Today she would not get the same attention. Ready and waiting psyops would go into operation to discredit her.  I just read an article by an economist who wrote that economists have decided that environmentalism is a religion, in other words, an unscientific belief system that preaches “religious values.”  This demonstrates what little importance economists attribute to external costs and the ability of externalized costs to destroy the productive power of the planet.  Thus, the question, “silent spring for us?” is not merely rhetorical.  It is real.

Monday, June 18, 2012

How Capitalism Steered Innovation Toward Social Control Rather Than Technological Wonders

Why Don't We Have Flying Cars?
The technologies that did emerge proved most conducive to surveillance and work discipline.
By David Graeber, The Baffler
June 18, 2012

The following article has been adapted from the Baffler. To read the piece in its entirety, check out Baffler.com.

A secret question hovers over us, a sense of disappointment, a broken promise we were given as children about what our adult world was supposed to be like. I am referring not to the standard false promises that children are always given (about how the world is fair, or how those who work hard shall be rewarded), but to a particular generational promise—given to those who were children in the fifties, sixties, seventies, or eighties—one that was never quite articulated as a promise but rather as a set of assumptions about what our adult world would be like. And since it was never quite promised, now that it has failed to come true, we’re left confused: indignant, but at the same time, embarrassed at our own indignation, ashamed we were ever so silly to believe our elders to begin with.

Where, in short, are the flying cars? Where are the force fields, tractor beams, teleportation pods, antigravity sleds, tricorders, immortality drugs, colonies on Mars, and all the other technological wonders any child growing up in the mid-to-late twentieth century assumed would exist by now? Even those inventions that seemed ready to emerge—like cloning or cryogenics—ended up betraying their lofty promises. What happened to them?

We are well informed of the wonders of computers, as if this is some sort of unanticipated compensation, but, in fact, we haven’t moved even computing to the point of progress that people in the fifties expected we’d have reached by now. We don’t have computers we can have an interesting conversation with, or robots that can walk our dogs or take our clothes to the Laundromat.


* * *


For the technologies that did emerge proved most conducive to surveillance, work discipline, and social control. Computers have opened up certain spaces of freedom, as we’re constantly reminded, but instead of leading to the workless utopia Abbie Hoffman imagined, they have been employed in such a way as to produce the opposite effect. They have enabled a financialization of capital that has driven workers desperately into debt, and, at the same time, provided the means by which employers have created “flexible” work regimes that have both destroyed traditional job security and increased working hours for almost everyone. Along with the export of factory jobs, the new work regime has routed the union movement and destroyed any possibility of effective working-class politics.

Meanwhile, despite unprecedented investment in research on medicine and life sciences, we await cures for cancer and the common cold, and the most dramatic medical breakthroughs we have seen have taken the form of drugs such as Prozac, Zoloft, or Ritalin—tailor-made to ensure that the new work demands don’t drive us completely, dysfunctionally crazy.

With results like these, what will the epitaph for neoliberalism look like? I think historians will conclude it was a form of capitalism that systematically prioritized political imperatives over economic ones. Given a choice between a course of action that would make capitalism seem the only possible economic system, and one that would transform capitalism into a viable, long-term economic system, neoliberalism chooses the former every time. There is every reason to believe that destroying job security while increasing working hours does not create a more productive (let alone more innovative or loyal) workforce. Probably, in economic terms, the result is negative—an impression confirmed by lower growth rates in just about all parts of the world in the eighties and nineties.

But the neoliberal choice has been effective in depoliticizing labor and overdetermining the future. Economically, the growth of armies, police, and private security services amounts to dead weight. It’s possible, in fact, that the very dead weight of the apparatus created to ensure the ideological victory of capitalism will sink it. But it’s also easy to see how choking off any sense of an inevitable, redemptive future that could be different from our world is a crucial part of the neoliberal project.

At this point all the pieces would seem to be falling neatly into place. By the sixties, conservative political forces were growing skittish about the socially disruptive effects of technological progress, and employers were beginning to worry about the economic impact of mechanization. The fading Soviet threat allowed for a reallocation of resources in directions seen as less challenging to social and economic arrangements, or indeed directions that could support a campaign of reversing the gains of progressive social movements and achieving a decisive victory in what U.S. elites saw as a global class war. The change of priorities was introduced as a withdrawal of big-government projects and a return to the market, but in fact the change shifted government-directed research away from programs like NASA or alternative energy sources and toward military, information, and medical technologies.

Of course this doesn’t explain everything. Above all, it does not explain why, even in those areas that have become the focus of well-funded research projects, we have not seen anything like the kind of advances anticipated fifty years ago. If 95 percent of robotics research has been funded by the military, then where are the Klaatu-style killer robots shooting death rays from their eyes?

Obviously, there have been advances in military technology in recent decades. One of the reasons we all survived the Cold War is that while nuclear bombs might have worked as advertised, their delivery systems did not; intercontinental ballistic missiles weren’t capable of striking cities, let alone specific targets inside cities, and this fact meant there was little point in launching a nuclear first strike unless you intended to destroy the world.

Contemporary cruise missiles are accurate by comparison. Still, precision weapons never do seem capable of assassinating specific individuals (Saddam, Osama, Qaddafi), even when hundreds are dropped. And ray guns have not materialized—surely not for lack of trying. We can assume the Pentagon has spent billions on death ray research, but the closest they’ve come so far are lasers that might, if aimed correctly, blind an enemy gunner looking directly at the beam. Aside from being unsporting, this is pathetic: lasers are a fifties technology.

Phasers that can be set to stun do not appear to be on the drawing boards; and when it comes to infantry combat, the preferred weapon almost everywhere remains the AK-47, a Soviet design named for the year it was introduced: 1947.

The Internet is a remarkable innovation, but all we are talking about is a super-fast and globally accessible combination of library, post office, and mail-order catalogue. Had the Internet been described to a science fiction aficionado in the fifties and sixties and touted as the most dramatic technological achievement since his time, his reaction would have been disappointment. Fifty years and this is the best our scientists managed to come up with? We expected computers that would think!

Overall, levels of research funding have increased dramatically since the seventies. Admittedly, the proportion of that funding that comes from the corporate sector has increased most dramatically, to the point that private enterprise is now funding twice as much research as the government, but the increase is so large that the total amount of government research funding, in real-dollar terms, is much higher than it was in the sixties. “Basic,” “curiosity-driven,” or “blue skies” research—the kind that is not driven by the prospect of any immediate practical application, and that is most likely to lead to unexpected breakthroughs—occupies an ever smaller proportion of the total, though so much money is being thrown around nowadays that overall levels of basic research funding have increased.

Yet most observers agree that the results have been paltry. Certainly we no longer see anything like the continual stream of conceptual revolutions—genetic inheritance, relativity, psychoanalysis, quantum mechanics—that people had grown used to, and even expected, a hundred years before. Why?

Part of the answer has to do with the concentration of resources on a handful of gigantic projects: “big science,” as it has come to be called. The Human Genome Project is often held out as an example. After spending almost three billion dollars and employing thousands of scientists and staff in five different countries, it has mainly served to establish that there isn’t very much to be learned from sequencing genes that’s of much use to anyone else. Even more, the hype and political investment surrounding such projects demonstrate the degree to which even basic research now seems to be driven by political, administrative, and marketing imperatives that make it unlikely anything revolutionary will happen.

Here, our fascination with the mythic origins of Silicon Valley and the Internet has blinded us to what’s really going on. It has allowed us to imagine that research and development is now driven, primarily, by small teams of plucky entrepreneurs, or by the sort of decentralized cooperation that creates open-source software. This is not so, even though such research teams are most likely to produce results. Research and development is still driven by giant bureaucratic projects.

What has changed is the bureaucratic culture. The increasing interpenetration of government, university, and private firms has led everyone to adopt the language, sensibilities, and organizational forms that originated in the corporate world. Although this might have helped in creating marketable products, since that is what corporate bureaucracies are designed to do, in terms of fostering original research, the results have been catastrophic.

My own knowledge comes from universities, both in the United States and Britain. In both countries, the last thirty years have seen a veritable explosion of the proportion of working hours spent on administrative tasks at the expense of pretty much everything else. In my own university, for instance, we have more administrators than faculty members, and the faculty members, too, are expected to spend at least as much time on administration as on teaching and research combined. The same is true, more or less, at universities worldwide.

The growth of administrative work has directly resulted from introducing corporate management techniques. Invariably, these are justified as ways of increasing efficiency and introducing competition at every level. What they end up meaning in practice is that everyone winds up spending most of their time trying to sell things: grant proposals; book proposals; assessments of students’ jobs and grant applications; assessments of our colleagues; prospectuses for new interdisciplinary majors; institutes; conference workshops; universities themselves (which have now become brands to be marketed to prospective students or contributors); and so on.

As marketing overwhelms university life, it generates documents about fostering imagination and creativity that might just as well have been designed to strangle imagination and creativity in the cradle. No major new works of social theory have emerged in the United States in the last thirty years. We have been reduced to the equivalent of medieval scholastics, writing endless annotations of French theory from the seventies, despite the guilty awareness that if new incarnations of Gilles Deleuze, Michel Foucault, or Pierre Bourdieu were to appear in the academy today, we would deny them tenure.

There was a time when academia was society’s refuge for the eccentric, brilliant, and impractical. No longer. It is now the domain of professional self-marketers. As a result, in one of the most bizarre fits of social self-destructiveness in history, we seem to have decided we have no place for our eccentric, brilliant, and impractical citizens. Most languish in their mothers’ basements, at best making the occasional, acute intervention on the Internet.

If all this is true in the social sciences, where research is still carried out with minimal overhead largely by individuals, one can imagine how much worse it is for astrophysicists. And, indeed, one astrophysicist, Jonathan Katz, has recently warned students pondering a career in the sciences. Even if you do emerge from the usual decade-long period languishing as someone else’s flunky, he says, you can expect your best ideas to be stymied at every point:

You will spend your time writing proposals rather than doing research. Worse, because your proposals are judged by your competitors, you cannot follow your curiosity, but must spend your effort and talents on anticipating and deflecting criticism rather than on solving the important scientific problems. . . . It is proverbial that original ideas are the kiss of death for a proposal, because they have not yet been proved to work.


That pretty much answers the question of why we don’t have teleportation devices or antigravity shoes. Common sense suggests that if you want to maximize scientific creativity, you find some bright people, give them the resources they need to pursue whatever idea comes into their heads, and then leave them alone. Most will turn up nothing, but one or two may well discover something. But if you want to minimize the possibility of unexpected breakthroughs, tell those same people they will receive no resources at all unless they spend the bulk of their time competing against each other to convince you they know in advance what they are going to discover.

No accident Americans underestimate inequality: the rich prefer it that way

We’ve been brainwashed

By Joseph E. Stiglitz
This article was adapted from the new book The Price of Inequality.
 
How, in a democracy supposedly based on one person one vote, could the 1 percent have been so victorious in shaping policies in its interests? It is part of a process of disempowerment, disillusionment, and disenfranchisement that produces low voter turnout, a system in which electoral success requires heavy investments, and in which those with money have made political investments that have reaped large rewards — often greater than the returns they have reaped on their other investments.

There is another way for moneyed interests to get what they want out of government: convince the 99 percent that they have shared interests. This strategy requires an impressive sleight of hand; in many respects the interests of the 1 percent and the 99 percent differ markedly.

The fact that the 1 percent has so successfully shaped public perception testifies to the malleability of beliefs. When others engage in it, we call it “brainwashing” and “propaganda.” We look askance at these attempts to shape public views because they are often seen as unbalanced and manipulative, without realizing that there is something akin to it going on in democracies, too. What is different today is that we have far greater understanding of how to shape perceptions and beliefs — thanks to the advances in research in the social sciences.

It is clear that many, if not most, Americans possess a limited understanding of the nature of the inequality in our society: They believe that there is less inequality than there is, they underestimate its adverse economic effects, they underestimate the ability of government to do anything about it, and they overestimate the costs of taking action. They even fail to understand what the government is doing — many who value highly government programs like Medicare don’t realize that they are in the public sector.

In a recent study respondents on average thought that the top fifth of the population had just short of 60 percent of the wealth, when in truth that group holds approximately 85 percent of the wealth. (Interestingly, respondents described an ideal wealth distribution as one in which the top 20 percent hold just over 30 percent of the wealth. Americans recognize that some inequality is inevitable, and perhaps even desirable if one is to provide incentives; but the level of inequality in American society is well beyond that level.)

Not only do Americans misperceive the level of inequality; they underestimate the changes that have been going on. Only 42 percent of Americans believe that inequality has increased in the past ten years, when in fact the increase has been tectonic. Misperceptions are evident, too, in views about social mobility. Several studies have confirmed that perceptions of social mobility are overly optimistic.

Americans are not alone in their misperceptions of the degree of inequality. Looking across countries, it appears that there is an inverse correlation between trends in inequality and perceptions of inequality and fairness. One suggested explanation is that when inequality is as large as it is in the United States, it becomes less noticeable—perhaps because people with different incomes and wealth don’t even mix.

These mistaken beliefs, whatever their origins, are having an important effect on politics and economic policy.

Perceptions have always shaped reality, and understanding how beliefs evolve has been a central focus of intellectual history. Much as those in power might like to shape beliefs, and much as they do shape beliefs, they do not have full control: ideas have a life of their own, and changes in the world—in our economy and technology—impact ideas (just as ideas have an enormous effect in shaping our economy). What is different today is that the 1 percent now has more knowledge about how to shape preferences and beliefs in ways that enable the wealthy to better advance their cause, and more tools and more resources to do so.

Beliefs and perceptions, whether they are grounded in reality or not, affect behavior. If people see the “Marlboro man” as the type of person they aspire to be, they may choose that cigarette over others. If individuals overestimate some risk, they may take excessive precautions.

But important as perceptions and beliefs are in shaping individual behavior, they are even more important in shaping collective behavior, including political decisions affecting economics.

Economists have long recognized the influence of ideas in shaping policies. As Keynes famously put it,

The ideas of economists and political philosophers, both when they are right and when they are wrong, are more powerful than is commonly understood. Indeed the world is ruled by little else. Practical men, who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist.

Social sciences like economics differ from the hard sciences in that beliefs affect reality: beliefs about how atoms behave don’t affect how atoms actually behave, but beliefs about how the economic system functions affect how it actually functions. George Soros, the great financier, has referred to this phenomenon as reflexivity, and his understanding of it may have contributed to his success.

Keynes, who was famous not just as a great economist but also as a great investor, described markets as a beauty contest where the winner is the one who assessed correctly what the other judges would judge to be the most beautiful.

Markets can sometimes create their own reality. If there is widespread belief that markets are efficient and that government regulations only interfere with efficiency, then it is more likely that government will strip away regulations, and this will affect how markets actually behave. In the most recent crisis what followed from deregulation was far from efficient, but even here a battle of interpretation rages. Members of the Right tried to blame the seeming market failures on government; in their mind the government effort to push people with low incomes into homeownership was the source of the problem. Widespread as this belief has become in conservative circles, virtually all serious attempts to evaluate the evidence have concluded that there is little merit in this view. But the little merit that it had was enough to convince those who believed that markets could do no evil and governments could do no good that their views were valid, another example of “confirmatory bias.”

If individuals believe that they are being treated unfairly by their employer, they are more likely to shirk on the job. If individuals from some minority are paid lower wages than other equally qualified individuals, they will and should feel that they are being treated unfairly—but the lower productivity that results can, and likely will, lead employers to pay lower wages. There can be a “discriminatory equilibrium.”

Even perceptions of race, caste, and gender identities can have significant effects on productivity. In a brilliant set of experiments in India, low- and high-caste children were asked to solve puzzles, with monetary rewards for success. When they were asked to do so anonymously, there was no caste difference in performance. But when the low caste and high caste were in a mixed group where the low-caste individuals were known to be low caste (they knew it, and they knew that others knew it), low-caste performance was much lower than that of the high caste. The experiment highlighted the importance of social perceptions: low-caste individuals somehow absorbed into their own reality the belief that lower-caste individuals were inferior—but only so in the presence of those who held that belief.

Fairness, like beauty, is at least partly in the eyes of the beholder, and those at the top want to be sure that the inequality in the United States today is framed in ways that make it seem fair, or at least acceptable. If it is perceived to be unfair, not only may that hurt productivity in the workplace but it might lead to legislation that would attempt to temper it.

In the battle over public policy, whatever the realpolitik of special interests, public discourse focuses on efficiency and fairness. In my years in government, I never heard an industry supplicant looking for a subsidy ask for it simply because it would enrich his coffers. Instead, the supplicants expressed their requests in the language of fairness—and the benefits that would be conferred on others (more jobs, high tax payments).

The same goes for the policies that have shaped the growing inequality in the United States—both those that have contributed to the inequality in market incomes and those that have weakened the role of government in bringing down the level of inequality. The battle about “framing” first centers on how we see the level of inequality—how large is it, what are its causes, how can it be justified?

Corporate CEOs, especially those in the financial sector, have thus tried to persuade others (and themselves) that high pay can be justified as a result of an individual’s larger contribution to society, and that it is necessary to motivate him to continue making those contributions. That is why it is called incentive pay. But the crisis showed everyone what economic research had long revealed—the argument was a sham. What was called incentive pay was anything but that: pay was high when performance was high, but pay was still high when performance was low. Only the name changed: when performance was low, it became “retention pay.”

If the problems of those at the bottom are mainly of their own making and if those collecting welfare checks were really living high on the rest of society (as the “welfare deadbeats” and “welfare queen” campaign in the 1980s and 1990s suggested), then there is little compunction in not providing assistance to them. If those at the top receive high incomes because they have contributed so much to our society—in fact, their pay is but a fraction of their social contribution—then their pay seems justified, especially if their contributions were the result of hard work rather than just luck. Other ideas (the importance of incentives and incentive pay) suggest that there would be a high price to reducing inequality. Still others (trickle-down economics) suggest that high inequality is not really that bad, since all are better off than they would be in a world without such a high level of inequality.

On the other side of this battle are countering beliefs: fundamental beliefs in the value of equality, and analyses such as those presented in earlier chapters that find that the high level of inequality in the United States today increases instability, reduces productivity, and undermines democracy, and that much of it arises in ways that are unrelated to social contributions, that it comes, rather, from the ability to exercise market power—the ability to exploit consumers through monopoly power or to exploit poor and uneducated borrowers through practices that, if not illegal, ought to be.

The intellectual battle is often fought over particular policies, such as whether taxes should be raised on capital gains. But behind these disputes lies this bigger battle over perceptions and over big ideas—like the role of the market, the state, and civil society. This is not just a philosophical debate but a battle over shaping perceptions about the competencies of these different institutions. Those who don’t want the state to stop the rent seeking from which they benefit so much, and don’t want it to engage in redistribution or to increase economic opportunity and mobility, emphasize the state’s failings. (Remarkably, this is true even when they are in office and could and should do something to correct any problem of which they are aware.) They emphasize that the state interferes with the workings of the markets. At the same time that they exaggerate the failures of government, they exaggerate the strengths of markets. Most importantly for our purposes, they strive to make sure that these perceptions become part of the common perspective, that money spent by private individuals (presumably, even on gambling) is better spent than money entrusted to the government, and that any government attempts to correct market failures—such as the proclivity of firms to pollute excessively—cause more harm than good.

This big battle is crucial for understanding the evolution of inequality in America. The success of the Right in this battle during the past thirty years has shaped our government. We haven’t achieved the minimalist state that libertarians advocate. What we’ve achieved is a state too constrained to provide the public goods—investments in infrastructure, technology, and education—that would make for a vibrant economy and too weak to engage in the redistribution that is needed to create a fair society. But we have a state that is still large enough and distorted enough that it can provide a bounty of gifts to the wealthy. The advocates of a small state in the financial sector were happy that the government had the money to rescue them in 2008—and bailouts have in fact been part of capitalism for centuries.

These political battles, in turn, rest on broader ideas about human rights, human nature, and the meaning of democracy and equality. Debates and perspectives on these issues have taken a different course in the United States in recent years than in much of the rest of the world, especially in other advanced industrial countries. Two controversies—the death penalty (which is anathema in Europe) and the right of access to medicine (which most countries take to be a basic human right)—are emblematic of these differences. It may be difficult to ascertain the role that the greater economic and social divides in our society have played in creating these differences in beliefs; but what is clear is that if American values and perceptions are seen to be out of line with those in the rest of the world, our global influence will be diminished.