Archive | Politics

Victory

 

 

What a night!

For those of us in the progressive camp, it went at least as well as we could have imagined: a solid (if close) Obama win; a pick-up of some Senate seats, partly brought on by the defeat of what I call the “pro-rape coalition” of Neanderthals Todd Akin and Richard Mourdock; legalization of both marijuana and gay marriage in a few states — the first time such measures have actually won a general vote. Dylan nailed this one: the times, they indeed are a-changin’. For many of us, it wasn’t a moment too soon.

But still, much remains to be done: will the Obama camp drop their continued (if somewhat halfhearted) pursuit of the fruitless “war on drugs”? Will they overturn the Defense of Marriage Act? Will they go forth and implement “Obamacare” as promised? Will they tackle longstanding inefficiencies and inequities in the immigration system? Will they indeed push for a future that’s less dependent on dirty, climate-altering, nonrenewable, ever-depleting fossil fuels? I could go on.

If the mood this time around is less unabashedly jubilant than last, it’s maybe because we, collectively, have a clearer picture of the enormous work that still needs to be done — and we’re also less sanguine about the prospect of “reaching across the aisle.” In my years in this country, I’ve watched the continued devolution of “conservatism” from a movement emphasizing cautious-minded retention of core values into one that’s angry, deranged, nativist, theocratic. Free-market fundamentalist to a degree that would make Morgan and Rockefeller cringe. Bits of it make sense; most of it is hateful and brutish and belongs nowhere in an advanced democracy.

It’s that sometimes-hard lesson we carry into the morning after: if Republicans indeed are what they are, it’s time to stop harboring illusions about bipartisanship and begin forging a future without them. Be bold, Barry. You’ve got a four-year runway with no more elections to win. Way I see it, America faces a choice: it can remain an indispensable nation, a pre-eminent technological, economic, and military power whose great wealth can be used to help build a productive, just, outward-looking society. Or it can shrink backward into 18th-century ideology, turning itself into an irrelevant, paranoid, neo-gilded-age banana republic. This election laid the choice bare — and I, for one, am heartened that we pulled the lever forward, not back.

Now let’s go turn those votes into action.


Pride

 

“What shocked both gays and the straight establishment was that gays had, for once, openly fought back.”

I had a dream last night that I was being bullied.

It was some spoiled kid from my elementary/high school (who later actually came out, I believe). Admittedly, the bullying in question — both in the dream and in its real-life counterpart — was nothing serious or systematic, but one thing about both incidents (real and subconsciously synthesized) struck me.

I didn’t fight back.

Instead, I learned to hide, to “pass”, to subsume my true identity. I’d let parts of it discreetly shine through — I was a nerd, I liked computers and science fiction, all that — but that certain fascination, that quickening of my heart when I walked past a good-looking boy in high school… well, I ignored that impulse, pretended it wasn’t there, thought it was anything but what it really was. And for my troubles, I was (limitedly) rewarded: the bullying stopped. In exchange for suppressing who I really was, I was quietly ignored.

But eventually, the elimination of a fundamental part of what it means to be human — to love, to date, to marry, even (yes) to have sex both meaningful and not-so-meaningful — took its toll. From early adolescence right into my twenties, I could never remember my dreams after I woke up. It was as if my subconscious had switched itself off.

Then, one snowy February night, after a traumatizing friendship and falling out with a fetching lad in college (nothing happened, alas, except to me, internally), I sat beside my parents’ cat on the sofa at two-thirty in the morning and laid the truth bare.

The cat’s reaction was predictably unenthused, but for me it was a revelation — the revelation — that would change the course of my life. But it took time. I was scared — terrified of those feelings that are for many the baptism-by-fire of teenagerhood. I gradually told friends, then siblings, then parents. It wasn’t until my mid-twenties that I began dating. It wasn’t until my very late twenties that I experienced love, if only briefly. It was only in my thirties that I had my first serious, multi-year relationship.

I’ve often mused about the awkwardness of Pride: a combination party and political rally. But, then, maybe that’s the point. Unlike most minority identities, being gay is something that lies dormant inside you, only reaching fruition with the onset of adult emotion. As such, it’s easy to suppress, ignore, wish away, or be told it doesn’t exist.

Which is why Pride is what it is: a fight for your right not only to party, but to love, marry, form families, be happy with yourself. Proud. That’s what those youths and drag queens, fighting the New York City police at a speakeasy many Junes ago, understood better than so many of us in the mainstream did: that coming out is, for each of us who’ve gone through it, a mini Stonewall inside ourselves to awaken who we truly are.

Happy Pride to all.


Edward Conard is a Big Fat Idiot (with apologies to Al Franken)

 

The word “connard” in French is often translated as “arse,” “jerk-off,” “dipshit,” “douchebag,” and too many other epithets to list. Which is exactly what went through my mind as I read, a few weeks back, a New York Times article about one Edward Conard, former partner of presidential nominee Mitt Romney in his years at Bain Capital.

Actually, I should qualify: I kind-of respect Conard for doing what few, if any, of his fellow “one percenters” have done — take a stand for their team and lay out, chapter and verse, exactly why they believe that the accumulation of immense, gargantuan, humungous — what many of us term “obscene” — wealth is not only alright, but is actually a virtue.

His book on the subject, which comes out this week, lays out his case. If the introduction and sample chapter are any indicator, it seems like a respectably written treatise on the economic situation today — how it evolved, how we got into this mess, and how various actors played their parts to make it so. In the absence of his complete tome, I had to make do with the samples and his bits of wisdom shared with the Times — and suffice it to say, it ain’t pretty.

First off, I will say that I agree with some of what this haughty fellow posits: that risk-taking ought to be rewarded, and that people who are in a position to do so ought to be encouraged (partly in the form of generous remuneration) to take said risks. Though frankly, I can’t believe we’re still having this “Communism led to mass starvation and we won the Cold War” argument. Yes, capitalists: your system is better. Not everyone can, or should, a priori be granted equal reward no matter what their contribution is to the world. And yes, there is, somewhere in our social conditioning or DNA or from our expulsion from Eden (or whatever worldview you choose), a deep-seated human tendency to socially stratify (though I maintain that’s a tendency we can overcome). Old-school Bolshevism, that simplistic bit of Victorian-era social over-engineering, failed to account for all that… and so it crumbled. The few remaining “Communist countries” of the world are either destitute (North Korea, Cuba) or are (to paraphrase the Tea Party epithet for moderate Republicans) CINOs — Communists in Name Only (China).

I can also accept that the received opinion and conventional wisdom about why our economy crashed in 2008 might be flawed — and that a fellow with insider knowledge of “the system” might have a thing or two to say about it: his “bank run” hypothesis actually sounds rather compelling. Heck, I’m not even against mortgage-backed securities: when I started work at the former Countrywide Home Loans some fifteen years back, I learned that the presence of a secondary market in the United States allowed for a deeper, more diverse, more flexible housing market than existed in my native Canada. Back then, the overwhelming majority of loans were conventional 30-year fixed-rate loans (something the rather monopolistic banks in Canada could never offer); subprime and exotic loan products were small-scale, specialty niche items either for skilled investors or for the occasional borrower with adverse credit. I still recall one of our more expert staffers scoffing at the then-newfangled negative-amortization, no-money-down, and interest-only loans, calling them “junk bonds” and claiming “we’ll never do those.”

So much for that.

And here’s where I part company with Conard and his ilk, who draw a false connection between legitimate innovation and risk-taking, and the pathological foolishness that’s gone on these past fifteen — nay, thirty — years, ever since the real-life Gordon Gekkos promulgated the cult of “greed is good”. Conard’s true colors come out about halfway through the Times article, right when he spots, at a cafe off Madison Avenue, “three young people with plaid shirts and floppy hair.”

“What are they doing, sitting here, having a coffee at 2:30?” he snarls. Apparently coffee breaks are reserved only for jacketed-and-tied one-percenters. He goes on to deride “art history majors,” his blanket term for anyone who isn’t a member of his club of risk-taking “job creators.”

Asshole. First off, who knows what these young folks are doing on their coffee break? Perhaps taking the very risks and making the very innovations he celebrates. Furthermore, Conard seems oblivious to the gargantuan risks many “liberal arts majors” are in fact taking: entering fields with chronic underemployment and socio-economic uncertainty in the hopes of “making it” as a writer, artist, filmmaker, web designer, advertising creative director, or any of countless other professions that involve tremendous sacrifice, risk, and hard work. What’s a bigger risk: having your well-to-do parents pay for an MBA at Wharton that practically guarantees you a six-figure salary on Wall Street, or working multiple jobs while slogging away nights on a screenplay that (you hope) sells for a handsome sum and becomes a box-office smash?

There’s nothing new, of course, about Conard’s elitism. To him, “job-creators” or “risk-takers” is just shorthand for “our crowd”: an elect group of like-minded people, often from educated, comfortable (if not already wealthy, a la Romney or Bush) backgrounds, with a thorough knowledge of how to attain or further their wealth and power, and with that as their sole objective. Somehow, they posit, the sum of all these greeds is supposed to lead to a better world — even though it never, ever has when such a system’s been tried out in its purest form.

And so, I believe Conard and his crowd are actually crushing the “you can be anything” promise of America, replacing it with a gang of technocrats, educated at a small number of schools and ensconced in a select set of professions, who are quite successfully lobbying for a government that continues to pander to their every whim (and boy, do they howl mightily like spoiled children when a moderate such as Obama even tries to reform the system). In their full-throated, uncompromising belief in deranged, 18th-century Adam Smith fantasies about the “free market,” they seem set on bringing America back to an “earlier time” — a Dickensian gilded-age nightmare wherein the biggest, richest economy in the world is transformed into a corrupt, uber-scale banana republic with a tiny, wealthy elite astride a horde of poor and near-poor who keep the system afloat by forever believing in a promise of a “better life” that’s ultimately unattainable.

Good luck, Conard. Just don’t expect the rest of us to take this one lying down.


I Sing the Vehicle Electric: 24 Hours With a Nissan Leaf

Having grown up in the energy-crisis late 1970s, I’ve always found oil a subject of fascination — and occasional derision. No commodity has, I think, been so central or so fraught. Its energy density, its uncertain availability, its volatile price… no other natural resource makes front-page headlines the way it does (when was the last time anyone cared about the price of, say, phosphorus or magnesium?).

The question that’s been burning me up (pun sort-of intended) since kidhood is: when are we finally going to get off the stuff? It continues to amaze me that we haven’t, what with more solar energy striking the Earth in an hour than humankind uses in a year, and with Einstein’s E=mc² having long since found practical application in nuclear energy (and in more destructive forms of the same).

Oh, I know, the reasons are legion, from the realities of legacy energy infrastructure to facts about energy density to what I believe is a plain-old failure of the imagination. I recently got into a near-argument with a colleague from conservative Georgia who claimed that alternative-energy research is bunk and all these newfangled vehicles are doomed to failure. After he promulgated the usual Fox News canards (“none of this will work without subsidies”, “it’s just shifting pollution to the electrical grid”, etc.), I finally blurted out, “Come on; we’re scientists. We’re not supposed to throw up our hands and say ‘impossible!’”

So to that end, when the time came for me to pick up a rental car for a day’s worth of errands and a visit to some friends down the Peninsula, I responded with an enthusiastic “yes, please!” when the agent at the car-rental shop asked, “do you want to try the new Nissan Leaf?”

I was a bit concerned about the oldest bugaboo in the book: although I’d heard electric cars are a pleasure to drive (responsive, power-efficient motors), the limiting factor for them has always been energy storage. While gasoline, for all its faults, can store an incredible amount of energy (and is easily portable, being a liquid at room temperature), battery technology has always been comparatively weak. That’s why batteries typically only power small devices, why they’re constantly running down and needing to be recharged or replaced… and why, in spite of a heritage going back to the earliest days of the automobile, they never caught on the way their petroleum-distillate counterparts did. Only now, with historic rises in oil prices and talk of peak oil — coupled with incremental advances in battery efficiency — are electric cars starting to come out in greater (though still modest) numbers. So much so that my local Enterprise outlet now offers them as part of its regular fleet.

“You can definitely make it to Menlo Park and back,” said the agent as he sat me down in the car and gave me a tour of its Internet-age instrument cluster. “Just make sure to put it in ‘Eco mode.’” Apparently in this mode the vehicle uses less energy, sacrificing a bit of performance to do so.

Right away it was apparent that the boosters of electric-drivetrain vehicles were right: the car drives fantastically. Even in Eco mode it was smooth, zippy, and (of course) quiet. The range indicator ticked off the miles remaining at a slightly slower pace than actual miles driven: since the car is all-electric, it’s able to recapture some of the energy normally lost in braking and use it to charge the battery, a process known as regenerative braking. The car’s initial range was 80 miles (a bit more with Eco mode on); after half a day’s worth of errands, I still had some 70-plus miles left — more than enough for the 28 or so miles each way to get down to my friends’ party on the Peninsula that evening.

Still, I wanted to see if I could top up, and to that end went looking for one of those electric vehicle charging stations the eco-minded leadership in San Francisco has been busily installing these past few years. A few taps on my smartphone and I found one nearby — in the parking garage of a nearby Costco superstore.

Easier said than done, however: no signage existed inside the mammoth parking area, and only with the help of an employee did I discover the two spots, forlornly tucked away behind the exit and a tire service center. Both were available free of charge, and both had charging receptacles… but neither one of them featured the type of outlet my Nissan Leaf called for.

Still, I figured I’d be fine, and with a friend in tow I headed down south. Again, even in Eco mode the car was as assured on the highway as any comparable subcompact… but neither of us could help turning our eyes toward the range meter: the miles kept peeling off as we cruised at typical highway speeds (in these parts they nudge 70 mph). Our 30 miles of distance consumed 45 miles of charge… which meant we were down to 25 miles and wouldn’t have enough to get home.

Fortunately, my suburban friends have a garage (and a long three-pronged extension cord), and with the car’s own cord and adapter, we plugged the thing in and settled in for some St. Paddy’s Day revelry. Five hours later, the charge was up to 45 miles… which, based on previous consumption patterns, would barely be enough to get back to San Francisco. As it happened, SFO airport (and another Enterprise outlet) was on the way, so I figured that if we were running low on juice, stopping there to swap vehicles was always an option.

This time, however, I really took it easy (I am, admittedly, a bit of a leadfoot): we turned off the climate control and drove at a grandpa-style 55 mph the whole way home. Now the mileage corresponded far more closely to the range meter, and we got back with over a half-dozen miles to spare. That was more than enough to head back to the rental place the next morning, where I experienced the best part of the whole adventure: no need to fill up!
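For the curious, here’s a minimal sketch of the back-of-the-envelope math we were doing the whole way down and back. The “burn ratio” is just what we observed on this one trip — nothing official from Nissan — and the exact figures are rough.

```python
# Rough EV range math, using the (unofficial) numbers from this one trip.
# "Burn ratio" = indicated miles consumed on the range meter per actual
# mile driven; we saw roughly 1.5 at ~70 mph, and something close to 1.0
# at 55 mph with the climate control off.

def indicated_miles_needed(trip_miles: float, burn_ratio: float) -> float:
    """Indicated range the meter will burn for a trip at a given ratio."""
    return trip_miles * burn_ratio

burn_at_70mph = 45 / 30   # ~1.5, from the drive down (30 real miles cost 45)
burn_at_55mph = 1.1       # a guess at the gentler return-trip ratio

# Return trip on a partial 45-mile charge, roughly 30 actual miles to cover:
print(indicated_miles_needed(30, burn_at_70mph))  # 45.0 -- no margin at all
print(indicated_miles_needed(30, burn_at_55mph))  # about 33 -- home with miles to spare
```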

So what’s the verdict? I think this is a stellar step in the right direction, but a combination of improved battery range and better recharging infrastructure needs to be in place to make this truly viable. This has been the chicken-and-egg issue with electric vehicles all along: the expense of getting everything in place is staggering. Sure, this was true back in the early 20th century with gasoline vehicles as well, but the difference then was that no motor vehicle infrastructure existed, so any improvements in fueling and roadways were welcomed (and took a long time, too: from the invention of the motorcar in the 1880s to widespread popularity in the 1920s spans nearly half a century). But now we already have an infrastructure in place, a gasoline-based system, and turfing out that “legacy” investment is proving a tough nut to crack. And until there is a reliable electric infrastructure in place, people are understandably gun-shy about plunking down their ducats… and so the vicious cycle continues.

There are a number of ways out of this: for one thing, a “hot-swappable” battery would be great. At least one company is working on this. A global, universal charging standard would be nice as well — I shouldn’t have had an issue finding charging stations compatible with my model of vehicle. And improved battery life (and an accurately reported range estimate, at that) is critical: until cars can get 300-plus miles to a charge, I doubt many of us will be interested (one manufacturer is pretty close to that benchmark already). Sure, having a “city car” is nice, but for most of us, the ability to go anywhere, anytime, is what we pay to have an automobile for. As with all things environmental, it’ll only be when green technologies offer features comparable to their non-green counterparts that they’ll really become popular.

Ultimately, then, I think this is going to take a coordinated, concerted effort on the part of governments, corporations, and the public — a notoriously difficult combination to get in sync. But such efforts have succeeded in the past, from winning World Wars to putting men on the moon. In spite of the initial hurdles, I’m excited to see what comes next — and will definitely be first in line when this technology is more mature.


Can We All Be Rich?

It’s a comment Presidential wannabe Mitt Romney got reamed for last fall, and it even finds its way into far-right Tea Party manifestos: the notion that our society can make everyone wealthy.

I thought about this as I watched another meditation on this notion from last fall, the dystopian sci-fi thriller In Time. Written and directed by Andrew Niccol of Gattaca fame (a film I consider one of the best sci-fi movies ever made), In Time failed to resonate with critics, though it was a moderate box-office success — no thanks to me, who gave it a miss in theaters. But in spite of its tepid directing (Niccol’s measured, deliberative pacing doesn’t work as well here as in his earlier films) and needlessly Michael Bay-like action-adventure plotting, well, I maintain this is the most brilliantly conceived filmed sci-fi dystopia since 1999’s The Matrix.

It’s all in the premise: In Time posits a future world where the genetic code has been cracked so completely that we can control aging. Nobody ages past 25, but there’s a cruel caveat: you have to earn your “time” beyond that, and time left to live — eerily displayed on people’s arms as a glowing green “life clock” — has become the currency of the age. Just as the reimagined Battlestar Galactica brought us a contemporaneous, Western-style society with spaceships and polytheism, so too does In Time deliver up a world nearly like our own, but with a completely different medium of exchange standing in for present-day money. My back-of-the-envelope calculation from the film, based on “four minutes [of life] for a cup of coffee” and 100 years being considered a fortune, puts 1 minute as roughly equal to 50 present-day cents — which means a day is worth about $700, a month about 20 grand, and a year just over a quarter of a million. When a bank safe is revealed to contain a “time capsule” (which can be delivered into a person, and can be transferred from individual to individual) holding a million years ($250 billion), it elicits gasps — and a boast from its owner/tycoon that it’s not his first.
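For anyone who wants to check that arithmetic, here’s a minimal sketch of the conversion. The 50-cents-a-minute anchor is an assumption of mine, derived by treating the film’s four-minute cup of coffee as a roughly two-dollar one; none of the dollar figures come from the movie itself.

```python
# Back-of-the-envelope conversion of In Time's life-clock currency into
# present-day dollars. Anchor assumption (mine, not the film's): a cup of
# coffee costs 4 minutes of life on screen and about $2.00 in our world.
DOLLARS_PER_MINUTE = 2.00 / 4          # roughly $0.50 per minute of life

MINUTES_PER_DAY = 60 * 24
MINUTES_PER_YEAR = MINUTES_PER_DAY * 365

def to_dollars(minutes_of_life: float) -> float:
    return minutes_of_life * DOLLARS_PER_MINUTE

print(to_dollars(MINUTES_PER_DAY))               # ~$720        -> "about $700" a day
print(to_dollars(MINUTES_PER_DAY * 30))          # ~$21,600     -> "about 20 grand" a month
print(to_dollars(MINUTES_PER_YEAR))              # ~$262,800    -> just over a quarter-million a year
print(to_dollars(MINUTES_PER_YEAR * 1_000_000))  # ~$263 billion -> the tycoon's million-year stash
```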

It’s interesting that the movie came out within a month of Occupy Wall Street: both push the revolutionary premise that “there’s enough for everyone” if only the rich would quit hoarding it. But where the Occupy movement (or at least some of its supporters) and In Time part ways is over the notion that everyone can be rich. In sci-fi-land, it’s the wealthy members of the establishment who claim that time isn’t infinite, that distributing it to the unwashed masses will “crash the system.” In our present-day world, it’s liberals claiming that not everybody wants to be rich, but most everybody wants a job, a roof over their head, and some degree of security — what I always call a “basic minimum,” which I believe every First World country ought to, and can, provide its citizenry. Bill Maher famously claimed, “of course everybody can’t be rich; who’d be left to do stuff for rich people that nobody wants to do?”

Actually, that’s an excellent point: In Time never quite explains the mechanism by which its “currency” is produced. Does it require energy? Manufacturing? Even if we assume it’s nothing more than a digital countdown, and that the complexity of genetic aging processes can simply be switched on and off — not as far-fetched as it sounds, given research into telomeres — what are the raw economic costs to society of producing other goods and services? This is the basis of economics, at least as I understand it: the medium of exchange reflects, or should reflect, the aggregate cost of producing the stuff we use. As long as these products require human labor and — yes — time to create, an economic system involving scarcity (whether it’s monetary or temporal) seems inevitable.

But this raises the next question: what if stuff costs nothing, or next to nothing, to make? Well, speculative fiction addresses that too… I’m thinking, particularly, of Isaac Asimov’s robot novels.

In these — starting with The Caves of Steel in 1954 — Asimov imagines a far-ish future (around 1,000 years from now) in which groups of settlers have colonized a bunch of worlds with the aid of robots. But unlike Blade Runner’s maniacal, malfunctioning replicants, these robots have been (successfully) engineered to be docile, helpful, near-indestructible servants — happy, willing, perfect slaves. They are self-repairing, self-manufacturing, and can (and do) produce all the goods and services anybody could ever want. In contrast to overpopulated, impoverished Earth (gotta have one of those for contrast), these off-world colonies are paradises of wealth. There, too, people live for centuries (though not forever — the prospect of genetic agelessness was unknown to Asimov in 1954) and population is voluntarily held in check. Everyone lives in mansions surrounded by flotillas of well-meaning servants, and does as they please. Everyone is, indeed, wealthy.

And yet, the Spacer worlds (as they’re known) are far from utopian: jealousy, intrigue, even murder (the premise of at least one of the books) are present. It seems, in eradicating poverty and disease from these brave new worlds, future humans had forgotten to engineer away the not-so-better angels of our nature.

So maybe that’s where real-life activism and speculative fiction come together: technology can solve our problems (though it can also make them worse), but technology alone (or, for that matter, political policy) isn’t enough. As an unrepentant idealist, futurist, and geek, yes, I hold out hope that our scientific advances can improve our world. But we need to change, too. Many of us accept a certain degree of inequality in our lives (this very interesting blog post deals with such notions and how they fall on the political spectrum) — those who take initiative, who work harder, ought to reap a greater reward — but a society based entirely on one-upsmanship, on reinforcement of inequality, on scarcity as a self-reinforcing idea… that’s a society which, sadly, too many of us have decided to accept.

And here’s where I rejoin my Occupy cohorts, and almost anyone who doesn’t accept our world at face value: I don’t think our current state, our current way of being, is static and inevitable. It may take time — more than on most people’s clocks — but I for one believe we can change, and shouldn’t throw up our hands and refuse to try.

I’ll take a million years, too, if that’s on the table.


Occupations and Reactions

 

The “Occupy” movement has been spreading like wildfire these past weeks, fanning out from its perch in New York’s Zuccotti Park to spots across the world. It’s hard for us social-justice types to contain our enthusiasm: sure, the movement lacks leadership, it doesn’t have one coherent, focused message, and winter is coming. But still, it’s heartening, after decades of John Birchers and free-market fundamentalists and “Tea Party Patriots,” to see Americans (and others in concert) protesting what folks like me think are the right things to be protesting.

This is true locally as well: even here in “let’s-protest-the-least-relevant-stuff” San Francisco, where putting a small local-coffee-store kiosk in a popular city park or opening a progressive Trader Joe’s grocery store arouses (utterly unnecessary) ire, the Occupy movement has taken root — so much so that in neighboring Oakland, heavy-handed police tactics led to a widening of the movement and even a general strike. So unprecedented is this in America that most of us probably don’t even know what a general strike is or remember the last time one took place — it’s something consigned to history books, to black-and-white images of laborers from a century ago riding rattling streetcars.

And so, at the behest of some friends, I took off one lunch hour a few weeks back to march in San Francisco’s first such event; a week or so later I was in New York, and in addition to seeing the 9/11 Memorial (quite well done; you forget how massive those buildings were), I made sure to stop in Zuccotti Park to check it all out. I was only there briefly, but this excellent piece gives a nuanced, insightful picture of what these protests are all about.

For me, really, what I find most incredible is that the conversation in America is finally shifting to an honest criticism of our values and mores; growing up, I was always taught that, however corrupting it might be for some, the quest for ever-greater fortune is the right quest to be on. Refusing a promotion, turning down more responsibility, refusing to work gratuitously long hours… these were all things likely to brand you as “lazy,” “bitter,” even “worthless.” After the end of the Cold War (and, actually, even during it), anything that smacked of the Communistic was considered heretical, a sure-fire pathway to corrupt officials in dachas making the masses stand in line for toilet paper and fear deportation to Siberia. Oh, I know, it sounds a bit shrill and extreme to, say, liken the Tea Party movement’s cries of “socialism” to Stalinist excesses… but the very reason the teabagger crowd (as they’ve also come to be known) uses those words is that they know that, deep in our subconscious, those associations are in place.

This is my biggest hope for the movement: that it continues to awaken us to the skew inherent in our values. Of course, it could also use a more focused set of ideas, though it is in the nature of such organically-springing movements (in this case aided by social media and other new technology) to be hazy at first. Still, some of our sharpest minds are on the case — and so am I. When I was in New York I decided to do my part: I drafted a proposal for what I think the Occupiers should be demanding. It’s a bit radical, but also not unprecedented. Check it out here.

And feel free to comment below: what do YOU think the Occupiers should be demanding?


Hatfields and McCoys

 

The Hatfield-McCoy feud of the West Virginia-Kentucky borderlands is likely the best-known and least-understood family drama in American history (I myself mixed it up with the O.K. Corral gunfight, which is totally unrelated). But it’s most often used as a metaphor for one of the biggest unresolved disputes of all: that between Israelis and Palestinians.

I’ve read and studied this subject a lot, partly because I have a dog in this fight (a sizeable portion of my extended family resides in the region), and partly because I’m captivated by its symbolism as the flash point in the alleged “clash of civilizations” between Western power and Islamic tradition.

First off, I’m simultaneously grateful and annoyed at world media for the disproportionate amount of coverage this gets. All sides of the conflict are equally culpable here, and I’m sure it must frustrate war-ravaged sub-Saharan Africans or police-stated North Koreans to no end to see their far larger-scale problems upstaged by the travails of a nation the size of Delaware. It’s nice to have so much information about this conflict I follow; at the same time, it would be nice if, say, the war in the Congo (1998-2003; 5.4 million killed) got a bit more airplay.

Various explanations have been advanced for this media disproportion: the relative ease of covering a major civilizational conflict from comfortable Western-style hotel rooms in Jerusalem or Tel Aviv (the very smallness of the country may be part of what keeps it in the news); an anti-Semitic double standard; the importance of Middle East oil. Often overlooked, as it is in most news coverage and even opinion pieces on the subject, is the bigger picture behind the screaming headlines.

The issue has its passionate partisans (I know more than a few myself), most of whom spend much of their time railing about the savagery and cruelty of the other side. But what no one really states is what the conflict’s really about, i.e. what each side wants. My take on it, the one that I think a startling proportion of people on either side would agree with, is this:

Both sides want the entire place (more-or-less) exclusively for themselves.

At this point the partisans begin their carefully-worded “yes, but” responses: this side must stop doing this; that side did that barbaric or unforgivable act; we’re not “really” going to exclude people; and so forth. Platitudes are then put forth about colonial projects, latter-day imperialism, (literally) God-given promises made to Biblical figures. But all that really obscures a simple yet intractable fact: this is, at its heart, a custody battle over some real estate.

Since it’s impossible for two sovereign entities to exercise exclusive control over the same turf (and restrict residency in said turf to their own kind), the dance continues, from settlement-building (“facts on the ground,” the early Zionists would call it) to suicide bombs to rocket attacks to reprisal incursions to full-on wars. But really, truth be told, the true goal of either side will never be realized: there are millions of both peoples (and, to be sure, smatterings of other minorities) co-mingled in the same space; one of the most powerful militaries on Earth on one side; the backing of ethnically-related neighbors on the other, who happen to include some of the leading oil-producing nations; and the spectre of nuclear war still hanging like a Damoclean sword over all humankind (just because the Cold War has ended doesn’t mean nuclear weapons are gone, and it would take relatively little to ignite some future global nuclear conflagration).

So if no one’s going to win, is there any hope? Maybe, if the two sides can be persuaded to share the place. A truly multinational state with both Palestinian Arabs and Jewish Israelis as roughly a 50/50 split in population is anathema to both sides. I’m inclined, however, to think that short of this fight going on forever (which is a possibility), this will be what comes to pass long-term. Palestinian nationalists and die-hard Jewish Zionists may loathe and fear it, but I suspect it’s a demographic inevitability.

But in the short and medium term, it won’t work. The two sides are too different, too divided, too… well, Hatfield-McCoy-like, for a simple act of national union to happen. But the much-touted “two-state solution” is more likely.

If this does get closer to happening, I’ll offer up my one bit of highly-unconventional advice to both sides: forget the past. To heck with Biblical covenants or what village your grandpappy lived in before 1948 or who’s more wrong than the other. Make like a character in some movie who loses all memory and is forced to begin life anew.

First: from humble beginnings, the State of Israel has become a wealthy, technically-advanced nation — one of the most advanced on Earth. Whatever their motivations, whatever their actions, Israelis have wrought an enviable place from a cultural/economic-development standpoint. Meanwhile, you have another side that’s in dire need of something to do — assuming they can be persuaded not to spend their waking hours plotting their rival’s destruction like some Bond villain. What a great match, like the reunification of a once-divided Central European nation, or the pairing of two peoples of different skin colors in southern Africa. It’ll be uneven and bumpy at first — it always is, just ask the East Germans or South Africans — but it can work.

But that will mean Israel/Palestine abandoning their respective goals of an ethnically-homogeneous nation-state and instead settling for what we’ve (admittedly imperfectly) built here in North America, in Australia, and to a now-growing degree in Western Europe: a heterogeneous “melting pot” nation-state, where people are just distinct enough to be interesting but not so much so that they’re at each other’s throats. We in such nations often scoff at this notion, calling it flawed, racist, outmoded… pick your epithet. But this very dismissal indicates that our style of nation-ing works so well as to be taken for granted. Innumerable Palestinians and Congolese probably wish they had our “problems.” But in order to get there, they may need to embrace pragmatism and abandon romantic notions about “a state just for us.”

This will be dismissed as naive — and that’s precisely the point. Once every other approach has been tried, maybe a naive one, unencumbered by past trauma, is the way to go. After all, the Hatfield-McCoy feud eventually burned itself out — but who knows what might’ve happened if it were expanded out to the scale of two nations, in the twentieth (and twenty-first) century, across a civilizational divide, and in the backyard of some trillions of dollars in energy deposits. The only way for these two parties to see past their differences is to look in an entirely different direction.


Nine One One

 

It’s retrospection day in America, ten years after the worst foreign-sourced attack on our soil since Pearl Harbor. So here’s one more voice in the chorus. Not the most interesting or traumatic, I grant you, even within my family and friend circle: one of my sisters was living in Manhattan at the time; one friend, a native Manhattanite, was charged with working at the crisis center for Cantor Fitzgerald in the weeks and months after that terrible day.

It was a gorgeous morning in New York, one of those sunny September days described in a later Sex & the City episode (one that practically anticipated the day) as “when you could feel the seasons click.” I wasn’t living in New York. I was out here in San Francisco, where an equally sunny morning greeted me (seasons and weather patterns being what they are, Indian Summer in SF often resembles early fall Back East).

Back then I would fret about minutiae: a tech economy that was starting to tip, jeopardizing my job and Green Card application at the startup where I worked; an uncertain relationship with a young fellow, a New Jerseyan who’d just moved Out West himself; some second thoughts about my life in California, then my adopted state of five years within my adopted homeland, where I’d struggled, kicked and torn to get in; and the usual news of the day, from market uncertainty to Gary Condit. Chicago was a city I’d once visited and liked but never considered a place to do much more than lay over on connecting flights. Liver donations were unknown to me. Going around the world? I struggled to make it around the continent. And on the subject of flights, I remember reveling in the perk of being able to meet arriving passengers at the gate.

Nothing could have prepared me — not even that phone call from an early-rising friend I let go to voicemail at 6:30 in the morning Pacific Time (who the heck calls at that hour? mused my half-asleep brain at the time) — for the world-changing event that was to hit me like a Mack Truck when I switched on CNN to check the wobbly stock market on that sunny Tuesday morning, September 11, 2001.

All manner of musings today bray the conventional wisdom that “nothing was ever the same again.” That was true in America… though for denizens of poorer, strife-torn nations — like much of Africa or even parts of our own hemisphere — America’s latter-era Day of Infamy would have felt familiar: such acts of violence are tragically common in such places. Yes, as some have proffered, some of the world’s geopolitical catastrophes had even been, in some way, caused by American meddling, American profiteering… or simply the unintended consequences of Frankenstein monsters run amok (Osama was one of these, natch). Yet I rejected then — when I was more politically right-leaning than I am today — and reject today the notion that 9/11 was in any way legitimate payback or retribution for past American sins: the receptionist at Cantor Fitzgerald or the waiter at the Windows on the World restaurant was not to blame for the fate of Palestinians in Jenin or Mexicans in Chiapas — and let’s be honest, the psychopaths who planned these attacks weren’t avengers for anything but their own demented notions.

Still, it’s hard, living in the weaker beast that is the American imperium ten years on, not to suspect the bad guys won, if only just a little. If their goal was to terrorize America, to divide America, to bankrupt America… well? Although the attacks were, of course, an act of willful aggression, they really served the ultimate passive-aggressive goal of provoking an overreaction, deepening divisions, throwing the adversary off guard. Oh sure, the battle against Al-Qaeda itself has been largely won, its tentacles smashed, its leader dead. But it’s never the goal of a death cult to emerge sunlit and victorious: for them, true victory lies in taking the enemy down in subtler ways — even at the expense of their own existence.

But then, it’s all too easy to cement in one’s mind a future for the world’s most powerful superpower that is questionable at best and bleak at worst. Recurrent economic crisis — really just a delayed reaction to 9/11 that was put on the nation’s credit card in the years immediately following the attacks — only heightens present-day gloom. But as in anything in life, there is always the prospect of change, of altered shadows and better outcomes. Americans can look past their partisan divides — little more than the squabbles of outmoded ideologies anyway — and discover a new sense of purpose in the challenges of the age: the income divide, the energy question (itself tied up in 9/11 when you consider Mideast oil revenue and how it led to the wealth of the Bin Laden family… you see where this is going), the need to retain and celebrate and make ever-more accessible the “you can do anything” notion — however naive it may sometimes sound — that is a cornerstone of this country’s ideals. Talk here often shifts to the unity of the World War II era… and however sepia-tinged and swing-music-infused that bygone time may seem, we can probably learn a thing or two from the sense of purpose of the Greatest Generation.

I hope we do. Our country — nay, our world — is on the line.


Separate and Unequal

With the seemingly unending debate about the debt ceiling raging, I wanted to highlight a side issue that’s been burbling up in the United States over the past several decades: that of income inequality.

If we follow this way, way back, the widely-accepted story is this: before the agricultural revolution took hold, bands of foragers had little to differentiate one from another. There was no concept of “income,” and while certain leaders no doubt commanded respect, provisions were divided more or less equally.

We see the rise of inequality with the dawn of agricultural civilizations, where food surpluses enabled delineation and differentiation between people (interestingly enough, it’s also now considered to be the time when what we consider traditional mores — marriage, monogamy, sexual conservatism — took root). With tribal chiefs commanding armies and other forces of social coercion, it was only a matter of time before rulers became monarchs and descendants of great military heroes became kings and nobility. This was the Faustian bargain we humans struck for ourselves at the dawn of civilization: in exchange for predictable patterns of settlement and food production, most of us would enjoy only a meager share of society’s benefits; most of the reward would go to the rulership. In its most extreme form, the lowest elements of society — or those captured from other societies in massive wars — would become property themselves, forced to work with no reward at all. We call this concept slavery.

The division between rulers and commercial interests also began early; while kings and emperors may have held sovereign political power, there still needed to be a fungible medium of exchange by which services and goods could be traded. We have a name for this concept: money. While there had long been merchant and trader classes, it was really only in the Renaissance and the Enlightenment that they came into their own, as Europe, emerging from centuries of stagnation and capitalizing on advances from other places, took the lead and began (for better or worse) to explore and colonize the world. This was, in a sense, the first widespread philosophical uprising against concepts such as the divine right of kings, one which held that the mercantile class was entitled to share in society’s riches as well. With industrialization hyper-accelerating the ability for capital to be deployed and engaged (whereas before much was tied to land and agriculture, which were held largely by monarchs and the nobility), we entered the modern age with a new term: Capitalism.

Capitalism’s tenets, born in the Enlightenment, seem on the surface to be quite noble: the de-emphasis of peerage and inherited titles as a delineator of wealth and influence; the notion of “careers open to talent,” where one’s ability determines one’s standing. When the West’s newest nation of the late 1700s, founded largely by former subjects of one of Europe’s most mercantile-friendly powers, outlined its raison d’être in the country’s founding document, it more or less distilled these notions into a single, catchy phrase:

“All men are created equal.”

I’m sure that was heady stuff for 1776. So heady, in fact, that its underlying meme of “equality” caught fire around the world. The notion was eventually expanded to include women, provide redress for the persecution of minorities, and birth entire philosophical concepts regarding the equality of peoples. But what its underlying “free market” system didn’t do is make the world less unequal. Where once we had emperors, now we had captains of industry; where once kings ruled nations, now robber barons commanded corporate trusts. Beginning in the early-to-mid 1800s in Britain, and culminating with the Gilded Age in ascendant America at the turn of the 20th century, writers and social activists from Dickens to Twain to (yes) Marx and Engels pointed out the alarming trend: under classical economics, a small number of smart, greedy, fortunate folks ended up in much the same place as kings of old… with the rest of the populace forced to forage for the remaining scraps. Not literally, of course, but in a more figurative, modern sense: working within a system that was rigged against them, workers endured horrific conditions and low pay with little redress or recourse. The legal system was bought and paid for by the rich. A new aristocracy was born.

Of course, the huddled masses fought back, and over the middle decades of the twentieth century, until well into the 1970s, unprecedented new rights and benefits emerged. Between labor unions, old-age pension plans such as Social Security, grants and loans to attend college, and (in America) the prosperity of being the sole Western nation standing after World War Two and a net petroleum producer, the 1950s and 1960s were, in a sense, a Golden Age of socio-economic equality. As I wrote in my last piece, right-wing nostalgia for the 1950s obscures the fact that this was the most “socialist” era in the nation’s history. The income divide was the narrowest it had ever been, and tax rates on the wealthy were much, much higher than today.

Then, of course, everything changed. The reasons for this are many, but from what I’ve been able to piece together, a lot of it rests with Ronald Reagan. Although earlier in his movie career he was actually something of a liberal — as head of the Screen Actors Guild, he was pro-union — his work as spokesperson for General Electric and its cadre of old-line pro-business conservative executives slowly changed his mind. He spent those years riding trains across the country and honing his speech on the evils of government regulation… so much so that the breakthrough televised speech he gave during the 1964 presidential campaign was a near-clone of the one he’d given during his years at GE. This galvanized a movement and utterly changed the trajectory of young activism: whereas sixties radicals promoted free love and equality, by the 1980s the archetype of the young politico was arguably the fictional Alex P. Keaton of the sitcom Family Ties (played, ironically, by a rather liberal Canadian, Michael J. Fox). I came of age in that era, and between the cultural zeitgeist, a life lived in a corrupt, overtaxed and inefficient Canadian province, and some decidedly neo-conservative parents, I’d come to accept that the old liberal warhorses of unionization and social welfare were bunk. The future lay in the past.

What I think none of us could have anticipated (though in retrospect we should have) is the unintended consequences of that shift. With deindustrialization, de-unionization, rising health-care costs, and the build-up of the financial industry, the income divide in America is now the greatest it’s been since the Gilded Age of a century ago — and it’s as great in America as it is in Third World nations such as Ghana and Uganda.

But the nagging question this raises is: so what? Is there anything inherently “wrong” with a great income divide, if that’s the way the market works things out? In the free-market fundamentalism espoused by the conservative set, this is just the way of things. Some go even further; a fellow I knew used to claim that “poor people create their own drama.” Another claimed “there will always be this divide; there’s nothing we can do about it” — a worldview I’d seen echoed by commentators such as George Will. These people point — rightly sometimes — to the hypocrisy, inefficiency and corruption of government programs, and claim this is the best we can do. To the poor, to paraphrase that New York Daily News headline about a then-bankrupt New York, these people say: drop dead.

I’ve long rejected that worldview, which I guess in the American sphere tars me as a “liberal.” For one thing, I simply refuse to accept that what we’ve got is “as good as it gets,” and that any attempt to improve our collective lot is doomed to failure. For another, there’s a deep-seated emotional notion I have about inequality that has long guided me, and I suspect many others, in our life philosophies.

I keenly remember, as a boy growing up in a competitive, nouveau-riche suburb, the materialist one-upsmanship that marked my community. One school I went to imposed a uniform dress code — not out of some vague notions of Dead Poets Society tradition but rather to prevent a materialistic fashion show from transpiring among pre-teens. Sitting on a lakeside dock one summer with four other boys, all of us age ten, I was treated to each kid bragging unashamedly about their fathers’ weekly earnings.

But is that all that disdain of income inequality is — childhood jealousy? I would argue that it’s the very roots of this jealousy and resentment that are worth examining. While misapplied resentment can lead to a host of ills, the feeling probably has its roots in an adaptive mechanism for regulating how we treat each other. While many primate societies possess hierarchical structures — baboons are the prime example — there are others that do not, such as bonobos. “Might makes right” is not an absolute, and I feel that these feelings of disapproval and resentment at great wealth spring from our deep-seated desire for a world more like the one we’ve long abandoned — the forager world of bands of equals.

While I’m not suggesting we return to the African savannah, I believe that our post-industrial civilization offers the opportunity to reclaim what the past millennia have denied us: a more equal world. The income divide is an arbitrary measure we have chosen to impose upon ourselves, and while I fully support rewarding those who work harder and take the initiative, I feel our reward mechanism has, in America at least, gotten seriously out of kilter. It’s for that reason that the battle between liberal and conservative in America has gotten so pitched — it’s really a fundamental fight between two very different philosophies. It’s difficult, I admit, in a country founded on raw capitalism, for progressives to be heard as much as they should be. But in my view, this is a fight worth having. I know which side I’m on.


The Good Old Days

“It couldn’t always have been the way it is now. It must have been different in my grandfather’s time. You were there. You had Kennedy. I didn’t. I’ve never heard a president say ‘destiny’ and ‘sacrifice’ without thinking, ‘bullshit.'”
Primary Colors (the movie)

Nostalgia is a universal human conceit. An ache, as Mad Men’s Don Draper called it, to return to a time or a place where we know we are loved.

America’s right wing has long held the prize for nostalgia-speak — think Reagan and Morning in America. If we could go back to a simpler, more idyllic era, goes the notion, we’d revive that which is lost in our fallen times.

The validity of that notion is dubious — the halcyon 1950s weren’t so great for gays, or blacks, or women, or a host of other groups. So imagine my surprise to hear another group wax nostalgic about America’s post-World War II past: the progressives.

For me, the realization of that trend began with Thomas Frank’s What’s the Matter with Kansas?, an excellent look at how the American right captured those parts of America once known for wildfire leftie populism (including, of course, his titular home state). I saw evidence of this again in the protests this winter against bills stripping unions of their collective bargaining rights in Wisconsin and Ohio. All of America’s political sphere seems united in one thing: a notion that these are not the country’s best days. Heck, I’m even seeing it on an anecdotal, street-level basis in the form of general malaise and nastiness out there — even in relatively well-off San Francisco. People have become downright mean to each other these days — or else, channeling Donald Trump, are prone to fits of vulgar self-promotion.

As a techie with a philosophical bent, I’ve always looked with suspicion on bouts of nostalgia. This skepticism has a long tradition in my family: while her contemporaries pined for the simple life back in Poland, my great-grandmother reminded her fellow immigrants to these shores in the 1920s about the poverty, the rickets, the state-sponsored anti-Semitic riots in the old country. No, she said. The good old days are now.

And yet, there’s something about leftie nostalgia for the Bad Old Nineteen-Fifties that rings true: for all the conformity and repression of the era, it was, the statistics tell us, one of markedly reduced economic inequality. Money is by no means a measure of everything, but in these crazy times it matters a lot. This has to be one of the strangest recessions ever: while folks all over the income spectrum are struggling, the very wealthy are in fact better off now than three years ago.

Which makes the current Republican hissy fit about the debt ceiling all the more insane: they’ve adamantly opposed any increase in any taxes — even on the fabulously wealthy, even on corporate jets. Well… during the glorious era they revere so much, taxes on the wealthy were much, much higher than they are now.

I’ve always believed the label “conservative” is a misleading one for the American Right. They have little interest in “conserving” the status quo, and the real era they lionize is not the “socialist” Fifties but a time far further into our past: the euphemistically-named Gilded Age of the late 1800s. That was an era of staggering unfairness, when corporations routinely cut employee wages to guarantee higher stock dividends; when robber barons paid off police to beat the crap out of workers trying to organize unions; when the prevailing notion was that life was risky, dangerous, and hardscrabble — and if you failed, so be it. I’ve long quipped, when right-wingers in America complain that “the Democrats are trying to turn America into a European social-democratic state” (as if that were such a terrible fate), that they have an opposite plan: to turn this country into a banana republic, with a mostly-poor populace and a tiny elite of wealthy plutocrats. Defaulting on our debt would turn that joke into something all too serious.

I think, then, that the smartest thing to do with nostalgia is identify those parts of the past that worked well, and strive to integrate those into our future. But it’s equally important to recognize what didn’t work, and, well, not head back in that direction. It seems obvious, doesn’t it… so what are we waiting for?
