Monday, June 8, 2009

We Need the Right Kinds of Innovation

Innovation is the engine of the economy, but we need the right kinds of innovation. One promising area is mass transportation. Maybe we should consider putting G.M. to work manufacturing light-rail and high-speed rail cars and systems.

The Failed Promise of Innovation in the U.S.

During the past decade, innovation has stumbled. And that may help explain America's economic woes.

"We live in an era of rapid innovation." I'm sure you've heard that phrase, or some variant, over and over again. The evidence appears to be all around us: Google (GOOG), Facebook, Twitter, smartphones, flat-screen televisions, the Internet itself.

But what if the conventional wisdom is wrong? What if outside of a few high-profile areas, the past decade has seen far too few commercial innovations that can transform lives and move the economy forward? What if, rather than being an era of rapid innovation, this has been an era of innovation interrupted? And if that's true, is there any reason to expect the next decade to be any better?

These are not comfortable questions in the U.S. Pride in America's innovative spirit is one of the few things that both Democrats and Republicans—from Bill Clinton to George W. Bush to Barack Obama—share.

But there's growing evidence that the innovation shortfall of the past decade is not only real but may also have contributed to today's financial crisis. Think back to 1998, the early days of the dot-com bubble. At the time, the news was filled with reports of startling breakthroughs in science and medicine, from new cancer treatments and gene therapies that promised to cure intractable diseases to high-speed satellite Internet, cars powered by fuel cells, micromachines on chips, and even cloning. These technologies seemed to be commercializing at "Internet speed," creating companies and drawing in enormous investments from profit-seeking venture capitalists—and ordinarily cautious corporate giants. Federal Reserve Chairman Alan Greenspan summed it up in a 2000 speech: "We appear to be in the midst of a period of rapid innovation that is bringing with it substantial and lasting benefits to our economy."

Where are the new products?

With the hindsight of a decade, one thing is abundantly clear: The commercial impact of most of those breakthroughs fell far short of expectations—not just in the U.S. but around the world. No gene therapy has yet been approved for sale in the U.S. Rural dwellers can get satellite Internet, but it's far slower, with longer lag times, than the ambitious satellite services that were being developed a decade ago. The economics of alternative energy haven't changed much. And while the biotech industry has continued to grow and produce important drugs—such as Avastin and Gleevec, which are used to fight cancer—the gains in health as a whole have been disappointing, given the enormous sums invested in research. As Gary P. Pisano, a Harvard Business School expert on the biotech business, observes: "It was a much harder road commercially than anyone believed."

If the reality of innovation was less than the perception, that helps explain why America's apparent boom was built on borrowing. The information technology revolution is worth cheering about, but it isn't sufficient by itself to sustain strong growth—especially since much of the actual production of tech gear shifted to Asia. With far fewer breakthrough products than expected, Americans had little new to sell to the rest of the world. Exports stagnated, stuck at around 11% of gross domestic product until 2006, while imports soared. That forced the U.S. to borrow trillions of dollars from overseas. The same surges of imports and borrowing also distorted economic statistics so that growth from 1998 to 2007, rather than averaging 2.7% per year, may have been closer to 2.3% per year. While Wall Street's mistakes may have triggered the financial crisis, the innovation shortfall helps explain why the collapse has been so broad. (To see a full explanation of the problems with the economic statistics, go to Growth: Why the Stats Are Misleading.)

But here's some optimism to temper the gloom: Many of the technological high hopes of 1998, it turns out, were simply delayed. Scientific progress continued, the technologies have matured, and more innovations are coming to market—everything from the first gout treatment in 40 years to cloud computing, the long-ballyhooed phenomenon "information at your fingertips." The path has been long and winding, but if the rate of commercialization picks up, the current downturn may not be as protracted as expected.

To see both the reality of the innovation shortfall and its potentially happy ending, look at Organogenesis, a small company in Canton, Mass. Back in 1998, Organogenesis received approval from the Food & Drug Administration to sell the world's first living skin substitute. The product, Apligraf, was a thin, stretchy substance that could be grown in quantity and applied to speed the healing of diabetic leg ulcers and other wounds that had stayed open for years.

From a health perspective, the approval of Apligraf seemed to open up an entire world of "tissue engineering," growing all sorts of replacement body parts from living human cells. From an economic angle, the possibilities were equally appealing: Apligraf, approved in Canada and Switzerland, was being exported, creating skilled jobs in Massachusetts. This was the sort of high-tech product needed to drive the U.S. economy into the 21st century.

But there were several big problems, recalls Geoff MacKay, the company's current CEO, who repeatedly used the word "cautious" during our interview. For one, Apligraf cost more to make than the company could sell it for—never a good way to stay in business. In addition, Organogenesis couldn't figure out how to deliver Apligraf reliably, since it was shipping a product made out of living cells. "This is something no one had done before," says MacKay, who at the time was working for Novartis (NVS), then the marketing partner for Apligraf. "The way to commercialize this type of technology was more difficult than initially anticipated."

By 2002 the early enthusiasm for Apligraf had vanished, along with the money. Novartis pulled out, Organogenesis declared bankruptcy, and jobs were slashed. The company was not alone: The entire field of tissue engineering was languishing. Shortly after, MacKay took over at Organogenesis with a clear mandate to straighten out the company's manufacturing, logistics, and sales, and turn this tarnished product into a moneymaker.

And that's what he did. By bringing down costs, "we now have margins that are pharmaceutical-like," says MacKay. Sales of Apligraf are growing at more than 20% per year, the company is taking over two more buildings on the same street in Canton, and it has FDA approval to install high-reliability robots from Japan's Denso, the same supplier Toyota (TM) uses, he says. Employment is expected to climb from 350 jobs to about 600, the company is introducing new products, and MacKay is talking about "cautious globalization." In other words, Organogenesis is fulfilling the promise of 1998—a decade later.

stumbling blocks

Now multiply that story a hundredfold and extend it to other areas. Consider, for example, micromachines—miniaturized gyroscopes, pumps, levers, or sensors on a silicon chip. Also known as MEMS (microelectromechanical systems), micromachines have been around in one form or another for years, most notably as the sensors that trigger airbags in cars.

In 1998, MEMS suddenly became the "next big thing." Engineers started to see how the devices could be useful in all sorts of ways that conventional semiconductors were not. For example, MEMS, in theory, could be used to make miniature sensors to monitor a hospital patient's blood at far less cost than conventional medical equipment. Venture capitalists threw billions into optical MEMS, miniaturized arrays of tiny mirrors designed to run fiber optic networks.

"In 1998 friends of mine started a MEMS company and asked me if I wanted to live the semiconductor revolution again," says Jeff Hilbert, now president and chief operating officer at MEMS outfit WiSpry in Irvine, Calif. "I naively thought it was a lot closer to being commercialized than it was." A whole array of challenges arose when it came time to move to mass production. "We didn't know what we didn't know," says Hilbert. WiSpry, which just closed a $20 million round of venture funding, is now about to start shipping MEMS chips that will go into cell phones, improving battery life and reducing dropped calls.

And then there is the biotech sector. The story driving the biotech boom was both scientifically sound and economically compelling: By understanding DNA and the human genome, researchers could develop effective drugs more quickly and easily. Pharmaceutical companies would no longer have to rely on serendipity to find a treatment for an illness. Instead, they could focus like lasers on the biological mechanisms that were broken or needed to be shored up. And the benefits of biotech were supposed to stretch into new sources of energy, increased agricultural production, and better ways to clean up environmental problems.

But fixing and improving the human body turned out to be far more complicated than expected. Even the sequencing of the human genome—an acclaimed scientific achievement—has not reduced the cost of developing profitable drugs. One indicator of the problem's scope: 2008 was the first year that the U.S. biotech industry collectively made a profit, according to a recent report by Ernst & Young—and that performance is not expected to be repeated in 2009.

red flags

There's no government-constructed "innovation index" that would allow us to conclude unambiguously that we've been experiencing an innovation shortfall. Still, plenty of clues point in that direction. Start with the stock market. If an innovation boom were truly happening, it would likely push up stock prices for companies in such leading-edge sectors as pharmaceuticals and information technology.

Instead, the stock index that tracks the pharmaceutical, biotech, and life sciences companies in the Standard & Poor's (MHP) 500-stock index dropped 32% from the end of 1998 to the end of 2007, after adjusting for inflation. The information technology index fell 29%. To pick out two major companies: The stock price of Merck declined 35% between the end of 1998 and the end of 2007, after adjusting for inflation, while the stock price of Cisco Systems (CSCO) was down 9%.

Consider another indicator of commercially important innovation: the trade balance in advanced technology products. The Census Bureau tracks imports and exports of goods in 10 high-tech areas, including life sciences, biotech, advanced materials, and aerospace. In 1998 the U.S. had a $30 billion trade surplus in these advanced technology products; by 2007 that had flipped to a $53 billion deficit. Surprisingly, the U.S. was running a trade deficit in life sciences, an area where it is supposed to be a leader.

A more indirect indication of the lack of innovation lies in the wages of college-educated workers. These are the people we would expect to prosper in growing, innovative industries that need smart, creative employees. But the numbers tell a different story. From 1998 to 2007, earnings for a U.S. worker with a bachelor's degree rose only 0.4%, adjusted for inflation. And young college graduates—who should be able to take advantage of opportunities in hot new industries—were hit by a 2.8% real decline in wages.

The final clue: the agonizingly slow improvement in death rates by age, despite all the money thrown into health-care research. Yes, advances in health care can affect the quality of life, but one would expect any big innovation in medical care to result in a faster decline in the death rate as well.

The official death-rate stats offer a mixed but mostly disappointing picture of how medical innovation has progressed since 1998. On the plus side, Americans 65 and over saw a faster decline in their death rate compared with previous decades. The bad news: Most age groups under 65 saw a slower fall in the death rate. For example, for children ages 1 to 4, the death rate fell at a 2.3% annual pace between 1998 and 2006, compared with a 4% decline in the previous decade. And surprisingly, the death rate for people in the 45-to-54 age group was slightly higher in 2006 than in 1998.

Each of these statistics has shortcomings as an innovation indicator. The relatively small decline in the death rate for many age groups could reflect an increase in obesity-related diseases among the American population rather than a shortfall in health-care innovation. The import and export numbers leave out trade in services and innovative products produced by U.S. companies overseas. And drawing conclusions about innovation from movements in stock prices is a dicey business at best. But taken together, these statistics tell a story of weaker-than-expected innovation.

The final piece of evidence is the financial crisis itself. After the 2001 tech bust, trillions of dollars flowed into the U.S.—but most of it went into government bonds and housing rather than into innovative sectors of the economy. While subprime mortgages boomed, venture capital investments have more or less stagnated since 2001, with few tech startups going public. "The U.S. was awash in capital, much of it desperately seeking a good deal," says Robert D. Atkinson, president of the Information Technology & Innovation Foundation, a nonpartisan Washington think tank. "If this had truly been an innovative period, then a vast array of cutting-edge innovations and their commercialization would have demanded hundreds of billions of dollars of capital."

If the description of the last decade as an innovation shortfall turns out to be accurate, that could make a big difference in how we think about the U.S. economy. For one thing, it helps explain why the trade deficit skyrocketed. A high-wage country such as the U.S. either has to develop innovative products and services to compete with low-cost countries such as China or accept a lower standard of living. "The competitive advantage of the U.S. economy has to be leveraging our science capacity for economic growth," says Pisano of Harvard. Fewer innovative products mean a weaker trade performance.

An innovation shortfall might also have weakened the country's underlying productivity growth, which in turn influenced real wages and the ability of consumers to spend without borrowing. Certainly economists on both the left and the right believe innovation is an essential ingredient for growth. A December 2006 paper by the Brookings Institution, co-authored by Peter R. Orszag, now head of the Office of Management & Budget, observed: "Because the U.S. is at the frontier of modern technological and scientific advances, sustaining economic growth depends substantially on our ability to advance that frontier."

The flip side: A shortfall in innovation could undercut growth and incomes, especially over a decade-long period. True, the economic statistics appear to show decent productivity growth across this stretch. But since there is compelling evidence that the figures are overstated by the credit bubble and statistical problems, we can construct a plausible narrative for the financial bust that gives a starring role to innovation—or rather, to the lack of it. It goes something like this: In the late 1990s most economists and CEOs agreed that the U.S. was embarking on a once-in-a-century innovation wave—not just in info tech but also in biotech and many other technologies. Forecasters upped their long-run growth estimates for the U.S. economy. Consumers borrowed against their home equity, assuming their future incomes would rise. And foreign investors lent America money by buying up U.S. securities, assuming the country would come up with enough new products to pay off the accumulated trade deficit.

This underlying optimism about the economy's growth potential became an enabler for Wall Street's financial shenanigans and greed. In this narrative, investors and bankers could convince themselves that rising home prices were reasonable given the bright future, which was based in part on strong innovation. In the end, the credit market collapse in September 2008 reflected a downgrading of expectations about future growth, which put trillions of dollars of debt underwater.

beyond info tech

Many economists are skeptical about placing the blame on an innovation shortfall, preferring to focus on problems on Wall Street and in Washington. "I tend to see the direct causes in our regulatory system," says Paul Romer, an economist at Stanford University's Graduate School of Business renowned for his work on innovation. "The big task is to explain why risk was so badly mispriced, particularly the risk of a collapse of the housing bubble."

Whatever the ultimate cause of the downturn, a pickup in innovation would provide a welcome economic boost. In part, that could come from information technology, where the combination of Google, social networks, wireless technology, and the beginnings of cloud computing is substantially altering the way people live their lives.

Of course, no industrial revolution in the past has been based on a single technology. A combination of radio, television, flight, antibiotics, synthetic materials, and automobiles drove the productivity surge of the early and mid-20th century. The Industrial Revolution of the second half of the 19th century combined railroads, electricity, and the telegraph and telephone.

Similarly, for sustainable economic growth, the U.S. needs breakthrough innovations outside of core IT. Some technologies weren't ready for prime time 10 years ago but have matured. MacKay of Organogenesis says that after spending years cutting costs and increasing reliability, he is ready to "reinject innovation" back into the company. He is in the process of submitting new treatments for FDA approval, including a product made from living cells that helps stimulate the growth of gum tissue. "This is the type of manufacturing that won't be lost offshore," says MacKay.

MEMS, too, is maturing. Nintendo is about to release a new add-on for the Wii controller, containing an innovative MEMS gyroscope chip that makes it easier to sense a wider range of user movements. The chip is made by InvenSense, a Sunnyvale (Calif.) startup that was able to build on 10 years of industry false starts to produce something small enough and cheap enough to go into a mass consumer product. "If I had any idea how difficult it would be, I wouldn't have started the company," says CEO Steve Nasiri. He expects a fivefold increase in revenue this year, and he sees a competitive advantage for the U.S. "In MEMS technology, the U.S. is two to three years ahead of Europe and Japan," says Nasiri.

The imponderables are biotech and alternative energy. In biotech, the clinical-test pipelines are full of breakthrough treatments, any of which could turn out to be a blockbuster. But as we've seen far too many times, a drug can be promising up to the moment it is rejected by the FDA. Similarly, the potential for innovation in alternative energy is enormous, but it's hard to know which approach will pay off.

The professor, trader, and author Nassim Nicholas Taleb calls technological breakthroughs "positive Black Swans"—unexpected events with huge positive consequences that in retrospect look inevitable. Some, such as Google, come out of nowhere to dominate within a short time. Others take years to mature and are surprising only because people forgot they were there. We've learned over the past 10 years just how unpredictable technology can be. But right about now, the U.S. could use a few positive Black Swans.

Mandel is chief economist for BusinessWeek.

Ouch!


Treasury Secretary Tim Geithner is in China today, doing whatever it takes – no matter how shameless and sordid – to make sure that the Chinese don’t pull out of American bonds, which are tanking in value thanks to Geithner’s multi-trillion-dollar gift to Wall Street. The Chinese are not happy, and since the Chinese loaned America all the money it’s now blowing, they call the shots.

Lagging Recognition from Clusterfuck Nation

Here is an essay from James Kunstler, author of The Geography of Nowhere, a critical tome about suburbanization in the U.S. Kunstler's website is Clusterfuck Nation.

http://www.kunstler.com/



Enjoy


Through the tangle of green shoots and sprouting mustard seeds, a nervous view persists that the arc of events is taking us to places unimaginable. The collapse of General Motors and Chrysler signifies more than the collapse of US car manufacturing. It spells the end of the motoring era in America per se and the puerile fantasy of personal liberation that allowed it to become such a curse to us.


Of course, many Nobel prize-winning economists would argue that it has only been a blessing for us, but that only shows how the newspapers are committing suicide-by-irrelevance. And if other societies, such as China's late-entry industrial start-up, want to adopt a similar fantasy, they will only find themselves all the sooner in history's garage with a tailpipe in their mouths.

Here in the USA, we will mount the most strenuous campaign to keep the motoring system going -- in fact, we're already doing it -- but it will fail just as surely as two (so far) of the "big three" automakers have failed. It will fail because car-making is only one facet of a larger network of systems that is coming undone, namely a revolving-debt, cheap-energy economy.

Americans will never again buy as many new cars as they were able to do before 2008 on the terms that were normal until then: installment loans. Our credit system is completely broken. It choked to death on securitized debt engineered by computer magic and business school hubris. That complex of frauds and swindles coincided with the background force of peak oil, which meant, among other things, that economic growth based on ever-increasing energy resources was over, and along with it ever-increasing credit. What it boils down to now is that we can't service our debt at any level, personal, corporate, or government -- and that translates into comprehensive societal bankruptcy.

The efforts of our federal government to work around this now, to cover up the "non-performing" debt and to generate the new lending necessary to keep the old system going, are a tragic exercise in futility. I'm not saying this to be a "pessimistic" grandstanding doomer pain-in-the-ass, but because I would like to see my country make more intelligent choices that would permit us to continue being civilized, to move into the next phase of our history without a horrible self-destructive convulsion.

Another consequence of the debt problem is that we won't be able to maintain the network of gold-plated highways and lesser roads that was as necessary as the cars themselves to make the motoring system work. The trouble is you have to keep gold-plating it, year after year. Traffic engineers refer to this as "level-of-service." They've learned that if the level-of-service is less than immaculate, the highways quickly enter a spiral of disintegration. In fact, the American Society of Civil Engineers reported several years ago that the condition of many highway bridges and tunnels was at the "D-minus" level, so we had already fallen far behind on a highway system that had simply grown too large to fix even when we thought we were wealthy enough to keep up. Right now, we're pretending that the "stimulus" program will carry us over long enough to resume the old method of state-and-federal spending based largely on bonding (that is, debt).

The political dimension of the collapse of motoring is the least discussed part of the problem: as fewer and fewer citizens find themselves able to buy and run cars, they will feel increasingly aggrieved at the system set up to make motoring virtually mandatory for all the chores of everyday life, and their resentments will rise against the elite that can still manage to enjoy it. Because our car-dependency is so extreme, the reaction of the dis-entitled classes is liable to be extreme and probably delusional to an extreme, too.

You can already see it being baked in the cake. Happy Motoring is so entangled in our national identity that the loss of it is bound to cause a national identity crisis. In places like the American south, the old Dixie states, motoring lifted more than half the population out of the dust, and became the basis of the New South economy. The sons and grandsons of starving sharecroppers became Chevy dealers and developers of suburban housing tracts, malls, and strip malls. They don't have any nostalgia for the historical reality of hookworm and 14-hour-days of serf labor in hundred-degree heat. Theirs is a nostalgia for the present, for air-conditioned comfort and convenience and the groaning all-you-can-eat Shoney's breakfast buffet off the freeway ramp. When it is withdrawn from them by the mandate of events, they will be furious.

Given the history of the region and the predilections of its dominant ethnic group, one might imagine that they will want to take out their gall and grievance on the half-African politician who presides over the situation. Among the ever-expanding classes dis-entitled from the so-called American Dream, the crisis is only marginally different in other regions of the nation. Mr. Obama faces a range of awful dilemmas, and it is painful to see them go unrecognized and unacknowledged by his White House. It's hard to imagine that the president and his elite advisors are blind to these equations, but as the weeks tick by they seem stuck in a box of limited perception.

We're in a strange hiatus for now. "Hope" levitates the legitimacy of the dollar, the stock markets, and the authority of leadership. In the background, implosion continues, debt goes unpaid, banks ignore bad loans to keep them off their books, jobs and incomes vanish, cars and other things go unsold, and a tragic wishfulness strains to sustain the unsustainable. Our expectations are inconsistent with what is happening to us.

It will be very painful for us to walk away from the car-centered life. Half the population faces the ugly obstacle of being hopelessly over-invested in a suburban house and all the life-ways associated with it. There will be no easy way out for them, whatever they choose to do politically, whatever noise they make, whomever they scapegoat, whatever fantasies they cultivate about what the world owes them, or who they think they are.

Mr. Obama should not waste another week pretending that we can keep this old system going. The public needs to know that we will be making our livings differently, inhabiting the landscape differently, and spending our days and nights differently -- even while we suffer our losses. The public needs to hear this from more figures than Mr. Obama, too, from leaders in the state capitals, and the agencies, and business and education and what remains of the clergy. But somebody has to set in motion the chain of recognition, or events will soon do it for us.

Saturday, June 6, 2009

We Don't Need No Education

This article is from Reason. It reflects some of my thoughts about higher education in the U.S. I will write a more thorough essay in the future that explains the real reason for the infatuation with higher education in the U.S., and it does not have to do with money or education itself...


Ask random members of the professoriate at my alma mater, the University of Massachusetts, Amherst, and many will confide that too many people—not too few, as recently suggested by President Barack Obama—are attending college these days. This opinion is impolite and impolitic (perhaps, in the context of the American university, we should say "un-PC"). But years of furtive conversation with academics suggest it is commonly held. And one can see why. To the professor with expertise in Austro-Hungarian history, for instance, it is unclear why his survey course on the casus foederis of World War I is a necessary stop in a management-level job training program at Hertz.

This is not to say that some Americans should be discouraged from participating in a liberal arts education. As the social scientist Charles Murray writes in his book Real Education, "Saying 'too many people are going to college' is not the same as saying that the average student does not need to know about history, science, and great works of art, music, and literature. They do need to know—and to know more than they are currently learning. So let's teach it to them, but let's not wait for college to do it."

Take this bullet point, proudly included in a November 2008 press release from the Boston public school system: "Of the [Boston public school] graduates from the Class of 2000 who enrolled in college (1,904), 35.5 percent (675 students) earned a degree within seven years of high school graduation. An additional 14 percent (267 students) were still enrolled and working toward a degree." In a news conference celebrating these dismal numbers, Mayor Tom Menino called for a "100 percent increase" in the number of city students attending college, though he offered no suggestions on how to ensure that those students actually graduate or are properly prepared to handle undergraduate studies. Besides, if 14 percent of those enrolled are still ambling towards a degree after seven years, is Menino convinced that the pursuit of a university education was the right decision for these students, rather than, say, vocational training?

Alas, these numbers are not uncommon. (They're often worse in other major American cities.) Citing a recent study by two education experts at Harvard University, former Secretary of Education Margaret Spellings sighed, "The report shows that two-thirds of our nation's students leave high school unprepared to even apply to a four-year college." Nevertheless, a huge number of these students are matriculating at four-year universities, incurring mountains of debt, and never finishing their degrees.

The devalued undergraduate degree is one thing when the people doing the devaluing have privately financed their education. It is quite another when the federal government foots the bill. While America debates the merits of the Troubled Asset Relief Program, the nationalization of General Motors, and how to fix a broken health care system, the Obama administration has been quietly planning a massive expansion of the Pell Grant program, "making it an entitlement akin to Medicare and Social Security." Read that sentence again. As we spiral deeper into recession and debt, our dear leaders in Washington are considering the creation of a massive entitlement akin to the expensive, inefficient, and failing Medicare and Social Security programs.

According to a report in The Washington Post, Obama's proposals "could transform the financial aid landscape for millions of students while expanding federal authority to a degree that even Democrats concede is controversial." It is a plan that has met with outspoken—though likely toothless—resistance from Republicans. Rep. Paul D. Ryan (R-Wis.), the senior Republican on the House Budget Committee, suggested that the president reform existing entitlements before creating new ones. And, as noted in the Post, Obama is facing resistance from his own side of the aisle as well, with Sens. Ben Nelson (D-Neb.) and Tom Harkin (D-Iowa) expressing skepticism towards both the price tag and the necessity of such an expansion.

Beyond the massive cost of expanded Pell Grants, Ohio University economist Richard Vedder argues that, historically, "it is hard to demonstrate that enhanced federal assistance has either significantly expanded college participation or brought about much greater access to higher education by those who are financially disadvantaged." If the idea is expanded into an entitlement, Vedder sees rising demand for higher education leading to significantly higher costs. "When someone else is paying the bills, costs always rise."

With more than 40 percent of students who enter college dropping out before graduation, Vedder's suggestion that "a greater percentage of entering college students should be attending community colleges, moving up to four year universities only if they succeed well at the community college level," seems sound. But the idea pushed by President Obama that, regardless of a student's career aspirations, post-secondary education is a necessity in 21st-century America ensures that an undergraduate education will become a required (very expensive) extension of every high school diploma.

To the average high school senior, the American university has become an institution that one simply must slog through to reach a higher salary. As one college dropout recently told the Pittsburgh Post-Gazette, "I am determined to finish my degree. A high school job isn't cutting it these days." The former student, the reader is told, simply wants "to do something else with her life," though it is unclear just what that something else is. Perhaps she'll figure that out after getting the degree.
As Charles Murray observed in The Wall Street Journal, "Our obsession with the BA has created a two-tiered entry to adulthood, anointing some for admission to the club and labeling the rest as second-best." But not to worry. If Obama's plan for a post-secondary education entitlement is foisted upon us -- the final cost of which remains anyone's guess -- we might soon have a one-tiered system where everyone is second-best.

Wednesday, June 3, 2009

Keep the Heat on Obama as Well

Constructive criticism can be beneficial -- I know, how cliché. Still, I think the largely Republican criticism of the media "going easy" on President Obama is correct. The presidency is a precarious institution -- the abuse of the Commander in Chief power by just about every president is the most evident example of this. See the Prize Cases (1863) and Ex parte Milligan (1866) for President Lincoln's abuses; FDR's can be seen in the infamous Korematsu v. U.S. (1944) case. The potential for further abuse by the President should be the prime constitutional issue raised in light of Supreme Court nominee Sonia Sotomayor. Wise Americans will pay attention to this issue and NOT allow themselves to be distracted by the nonsense social issues -- abortion, gay marriage, illegal immigration, etc. -- that the Republicans will bring up.

Our Latin American neighbors copied the U.S. presidential system, and historically it has been a disaster. Scholars of Latin America like Arturo Valenzuela and Juan Linz have long called for its replacement with a parliamentary system. The head of government in a parliamentary system, usually called a Prime Minister, is generally chosen by the majority party. Political scientist Robert Dahl, in How Democratic Is the American Constitution?, points out the institutional flaws in the presidency and makes a subtle argument that a parliamentary system would be better. So, regardless of your political party affiliation, it is wise to keep the heat on the president and not become infatuated. We've become a celebrity-novelty worshipping culture; let's wise up a bit.

The Obama Infatuation
by Robert Samuelson

WASHINGTON -- The Obama infatuation is a great unreported story of our time. Has any recent president basked in so much favorable media coverage? Well, maybe John Kennedy for a moment; but no president since. On the whole, this is not healthy for America.

Our political system works best when a president faces checks on his power. But the main checks on Obama are modest. They come from congressional Democrats, who largely share his goals if not always his means. The leaderless and confused Republicans don't provide effective opposition. And the press -- on domestic, if not foreign, policy -- has so far largely abdicated its role as skeptical observer.

Obama has inspired a collective fawning. What started in the campaign (the chief victim was Hillary Clinton, not John McCain) has continued, as a study by the Pew Research Center's Project for Excellence in Journalism shows. It concludes: "President Barack Obama has enjoyed substantially more positive media coverage than either Bill Clinton or George W. Bush during their first months in the White House."

The study examined 1,261 stories by The Washington Post, The New York Times, ABC, CBS and NBC, Newsweek magazine and the "NewsHour" on PBS. Favorable stories (42 percent) were double the unfavorable (20 percent), while the rest were "neutral" or "mixed." Obama's treatment contrasts sharply with coverage in the first two months of the presidencies of Bush (22 percent of stories favorable) and Clinton (27 percent).

Unlike Bush and Clinton, Obama received favorable coverage in both news columns and opinion pages. The nature of stories also changed. "Roughly twice as much of the coverage of Obama (44 percent) has concerned his personal and leadership qualities than was the case for Bush (22 percent) or Clinton (26 percent)," the report said. "Less of the coverage, meanwhile, has focused on his policy agenda."

When Pew broadened the analysis to 49 outlets -- cable channels, news Web sites, morning news shows, more newspapers and National Public Radio -- the results were similar, despite some outliers. No surprise: MSNBC was favorable, Fox was not. Another study, released by the Center for Media and Public Affairs at George Mason University, reached parallel conclusions.

The infatuation matters because Obama's ambitions are so grand. He wants to expand health care subsidies, tightly control energy use and overhaul immigration. He envisions the greatest growth of government since Lyndon Johnson. The Congressional Budget Office estimates federal spending in 2019 at nearly 25 percent of the economy (gross domestic product). That's well up from the 21 percent in 2008, and far above the post-World War II average; it would also occur before many baby boomers retire.

Are his proposals practical, even if desirable? Maybe they're neither? What might be unintended consequences? All "reforms" do not succeed; some cause more problems than they solve. Johnson's economic policies, inherited from Kennedy, proved disastrous; they led to the 1970s' "stagflation." The "war on poverty" failed. The press should not be hostile; but it ought to be skeptical.

Mostly, it isn't. The idea of a "critical" Obama story is a tactical conflict with congressional Democrats or criticism from an important constituency. Larger issues are minimized, despite ample grounds for skepticism.

Obama's rhetoric brims with inconsistencies. In the campaign, he claimed he would de-emphasize partisanship -- and also enact a highly partisan agenda; both couldn't be true. He got a pass. Now, he claims he will control health care spending even though he proposes more government spending. He promotes "fiscal responsibility" when projections show huge and continuous budget deficits. Journalists seem to take his pronouncements at face value even when many are two-faced.

The cause of this acquiescence isn't clear. The press sometimes follows opinion polls; popular presidents get good coverage, and Obama is enormously popular. By Pew, his job performance rating is 63 percent. But because favorable coverage began in the campaign, this explanation is at best partial.

Perhaps the preoccupation with the present economic crisis has diverted attention from the long-term implications of other policies. But the deeper explanation may be as straightforward as this: most journalists like Obama; they admire his command of language; he's a relief after Bush; they agree with his agenda (so it never occurs to them to question basic premises); and they don't want to see the first African-American president fail.

Whatever, a great edifice of government may arise on the narrow foundation of Obama's personal popularity. Another Pew survey shows that since the election both self-identified Republicans and Democrats have declined. "Independents" have increased, and "there has been no consistent movement away from conservatism, nor a shift toward liberalism."

The press has become Obama's silent ally and seems in a state of denial. But the story goes untold: Unsurprisingly, the study of all the favorable coverage received little coverage.

Monday, June 1, 2009

Welcome to Imperium Americana

Imperium Americana is a blog dedicated to the study and discussion of the American Empire.

History has seen many empires. The United States, referred to colloquially as America, is an empire. However, it is a unique empire, one that runs counter to the intuitive notion of empires as evil -- that is, empires such as the short-lived empire of Nazi Germany (1933-1945), which acquired territory for the sake of grandiosity.

Rather, the description of the United States as an empire is better fitted to the term hegemon, from the Greek word for leader (ἡγεμονία hēgemonía). Historically, and in light of this definition, the United States is a de facto hegemon or world leader. The Realist scholar of International Relations John Mearsheimer calls the United States a Great Power. The United States became a hegemon as a result of its triumphal position following the end of World War II. At the time, the United States emerged with only one Great Power rival, the Union of Soviet Socialist Republics (USSR). And, indeed, the United States and the USSR engaged in what is today called the Cold War for about 45 years. The USSR dissolved in 1991, and Russia emerged as the only rival military great power to the United States, although its relative military strength was not on a par with that of the former USSR.

For many, the dissolution of the USSR meant the "End of History", the triumph of capitalism and democracy. As political philosopher Francis Fukuyama put it, "There is nothing else ideologically; as all else [historically] has failed." However, Fukuyama was wrong; the 1990s were soon marred by genocides in Southeastern Europe (the Balkans) and in African countries. The 1990s also gave rise to the term "failed states": states like Somalia and Afghanistan where government control ceased to exist. In the midst of this, the United States actually increased its security presence, although it did not intervene militarily in all such conflicts. After the terrorist attacks of September 2001, the United States found itself a true global hegemon, with military bases in every corner of the globe, including some in countries that were once part of the USSR.

Now the United States finds itself an empire, albeit a reluctant one. Many world leaders, despite their emphasis on nationalism and sovereignty, do not like to admit that they want the United States to retain its security presence. Political scientist Michael Mandelbaum describes the United States as the world's Goliath -- a necessary empire, lest the historical antagonisms between countries be released anew. Ironically, on the other hand, world public opinion is generally against a United States presence in other countries. This is a particularly difficult dilemma for politicians in democratic countries. Terrorist acts carried out by radical Muslim militants are not the acts of states but of groups, groups which often enjoy considerable support in public opinion. If you do not believe this, then reread the world's newspapers after September 11, 2001. Reflecting public opinion, but not the opinion of elites, these newspapers said: "We feel for you, but you had it coming."

On the home front, the American Empire is costly. The economic hegemony of the United States is in a perilous state; it has been in decline since the mid-1970s. The United States is not the physically productive economy that it was before the end of the Vietnam Conflict in 1975 and the Oil Shocks of 1973 and 1979, perpetrated by OPEC as deliberate acts of rebellion against the United States and Western Europe. Western Europe learned its lesson and revised its course accordingly. The United States did not, despite the pleas of President James "Jimmy" Carter, whom his successor, President Ronald Reagan, ridiculed as a wimp. President Barack "Barry" Obama has learned this history lesson well -- avoid emulating President Carter at all costs.

Since the mid-1970s, Americans' standard of living has fallen precipitously. Dual incomes are the norm not out of greed but out of necessity. Beginning with the election of President Reagan in 1980, the United States has become increasingly conservative; the better term is reactionary. Political scientist Theodore Lowi outlines this in The End of Liberalism: The Second Republic of the United States (1979), as does historian Thomas Frank in What's the Matter with Kansas? How Conservatives Won the Heart of America (2004). You do not have to do much reading to know that physically productive industry in the United States has been in decline for some time. The textile, steel, and other similar industries were lost by the 1980s, and we are now witnessing the end of the American automobile industry. As Frank illustrates, the Republicans masterfully replaced reality -- the decline of the U.S. physical economy -- with nonsense social issues such as abortion, gay marriage, and xenophobia about illegal immigrants.

Why did we not see the severity of the decline? We did not see it because the productive physical economy was replaced by a pseudo economy built on personal services -- tourism, fast food, etc. -- and financial services. The other factor was an economy driven by suburban development: the creation of new homes and the moral promotion of home ownership by developers, interest groups, and the Republican Party. New homes need public services and conveniences, and the developers were happy to provide them. Sadly, this suburban development was done entirely in light of a continuing reliance on automobiles, which in turn was geopolitically predicated on the low price of oil. Continuous suburban development created the illusion of growth, but most Americans were not getting wealthier.

Today, we face a dilemma unlike any we have encountered before in our history. How do we meet the demands of empire while taking care of our citizens at home? How do we maintain the empire while dealing with a destructive budget deficit? Some say we should abandon our far-flung military bases and retreat. Doing so, say Mandelbaum and others, would create chaos by undermining the security ties provided by the presence of the United States. The leaders of most countries, despite their rhetoric, understand this. What is the connection between our security provision and global trade? How do we deal with increasingly hostile world public opinion of the United States in light of the need to maintain the empire? Finally, the destruction of our productive physical economy has left the United States with one truly viable industry -- the military, or war, economy. Wars and more wars are needed to fuel this economy. This is not an exaggeration. President Eisenhower warned Americans about the need to mitigate a potentially dangerous "military-industrial complex" in his 1961 farewell address.