Louis Proyect: The Unrepentant Marxist

October 13, 2013

Obamacare’s Achilles Heel

Filed under: computers,health and fitness,technology — louisproyect @ 7:14 pm

Signing up for Obamacare

My political career (for lack of a better word) began in 1967 just one year before my professional career as a programmer/analyst. The software career came to an end in August 2012 but I am still going strong politically. With such a background, I probably had a keener interest in the lead article in the NY Times today titled “From the Start, Signs of Trouble at Health Portal” than the average person. The lead paragraphs should give you an idea of the depth of the problem. While it is too soon to say if the technical flaws of the Obamacare website will doom a flawed policy, it cannot be ruled out.

In March, Henry Chao, the chief digital architect for the Obama administration’s new online insurance marketplace, told industry executives that he was deeply worried about the Web site’s debut. “Let’s just make sure it’s not a third-world experience,” he told them.

Two weeks after the rollout, few would say his hopes were realized.

For the past 12 days, a system costing more than $400 million and billed as a one-stop click-and-go hub for citizens seeking health insurance has thwarted the efforts of millions to simply log in. The growing national outcry has deeply embarrassed the White House, which has refused to say how many people have enrolled through the federal exchange.

Even some supporters of the Affordable Care Act worry that the flaws in the system, if not quickly fixed, could threaten the fiscal health of the insurance initiative, which depends on throngs of customers to spread the risk and keep prices low.

“These are not glitches,” said an insurance executive who has participated in many conference calls on the federal exchange. Like many people interviewed for this article, the executive spoke on the condition of anonymity, saying he did not wish to alienate the federal officials with whom he works. “The extent of the problems is pretty enormous. At the end of our calls, people say, ‘It’s awful, just awful.’ ”

I got my first inkling of how screwed up the system was from my FB friend Ted Rall, the well-known leftist editorial page cartoonist who started off as an engineering student at Columbia University and who is technically proficient. You can find his scathingly witty account of trying to enroll here. I got a particular chuckle out of how the system responded when he entered his SS number:

[Screenshot: the site’s response to Rall entering his Social Security number]

Once he got past the SS number snafu and began the enrollment process he was shocked at the rates he would have to pay for “affordable” health care.

For this 50-year-old nonsmoker, New York State’s healthcare plans range from Fidelis Care’s “Bronze” plan at $810.84 per month to $2554.71 per month. I didn’t bother to look up the $2554.71 one because if I had $2554.71 a month lying around, I’d buy a doctor.

$810.84 per month. $10,000 a year. After taxes. Where I live, you have to earn $15,000 to keep $10,000.

Not affordable. Did I mention that?

I was surprised to see that the primary consultant for the Obamacare website was CGI, a Montreal-based company that was one of the chief competitors of Automated Concepts Inc., the consulting group I worked for in the late 70s and early 80s. I have no idea when ACI went out of business but CGI has obviously become a major power. What I found most shocking was the late date at which programming began: “The biggest contractor, CGI Federal, was awarded its $94 million contract in December 2011. But the government was so slow in issuing specifications that the firm did not start writing software code until this spring, according to people familiar with the process.”

For a project of this size, it would be difficult to meet a target date of Fall 2013/Winter 2014 even if work had started in Spring 2012, let alone Spring 2013. I am amazed that it is even 70 percent complete, as the Times reports. My guess is that it is probably only half-done.

There’s a lot of ass-covering going on now. Oracle, the company whose registration software gave Ted Rall such headaches, says, “Our software is running properly.” Oracle’s CEO is Larry Ellison, the third richest man in America, whose yacht just won the America’s Cup in San Francisco. After 9/11 Ellison offered to supply a national ID card system to help weed out terrorists. Given all of Ted Rall’s SS number woes, we can be thankful that his offer was turned down. Otherwise half the population would be in Guantanamo right now.

Like Bill Gates, Ellison got rich exploiting the intellectual breakthroughs made by others. Oracle was one of the first relational database systems marketed to corporations in the early 80s, along with Sybase, the proprietary software I supported for twenty years at Columbia University. Relational databases (basically a rows/columns approach similar to the spreadsheet concept) were invented by the mathematician E.F. Codd, who made much more of a contribution intellectually than Ellison but never had ambitions to be a billionaire.

The Times has a graphic to illustrate the problems of the Obamacare website at http://www.nytimes.com/interactive/2013/10/13/us/how-the-federal-exchange-is-supposed-to-work-and-how-it-didnt.html.

This particular feature would seem to explain not only the technical challenges that make the system difficult to implement but also a fatal policy flaw:

[Screenshot: the Times graphic on the application process]

The government is offering what is called a “many-to-many” relationship in database terms: many applicants choosing from many plans. This has historically been a challenge to implement in financial systems, such as the kind typically found in investment plans.
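In relational terms, a many-to-many link is conventionally resolved through a third, junction table, and every query across it requires a double join. Here is a minimal sketch using Python’s built-in sqlite3 module; the table and column names are my own invention for illustration, not anything from the actual exchange:

```python
import sqlite3

# Toy many-to-many schema: many applicants can enroll in many plans,
# linked through a junction (enrollment) table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE applicant (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE plan      (id INTEGER PRIMARY KEY, name TEXT, premium REAL);
CREATE TABLE enrollment (
    applicant_id INTEGER REFERENCES applicant(id),
    plan_id      INTEGER REFERENCES plan(id),
    PRIMARY KEY (applicant_id, plan_id)
);
""")
con.execute("INSERT INTO applicant VALUES (1, 'Rall')")
con.execute("INSERT INTO plan VALUES (1, 'Fidelis Bronze', 810.84)")
con.execute("INSERT INTO enrollment VALUES (1, 1)")

# Resolving the relationship always means joining through the junction table.
row = con.execute("""
    SELECT a.name, p.name, p.premium
    FROM applicant a
    JOIN enrollment e ON e.applicant_id = a.id
    JOIN plan p       ON p.id = e.plan_id
""").fetchone()
print(row)  # ('Rall', 'Fidelis Bronze', 810.84)
```

Even this toy version shows why the relationship is harder to implement than a simple one-to-many lookup: every question about who is enrolled in what has to go through the junction table, and at the scale of millions of applicants and hundreds of plans that complexity multiplies.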

It would have been a lot easier to simply extend Medicare to the entire population. Not only would the private insurance companies have been eliminated, but the existing software would have required only a relatively minor change: eliminating the age-65 eligibility criterion.
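In code terms, the point is that dropping an eligibility test is close to a one-line change. A purely hypothetical sketch (the function names and the simplified rule are mine; real Medicare eligibility also covers disability and other qualifying categories):

```python
# Hypothetical sketch: Medicare eligibility reduced to its age test.
def medicare_eligible(age: int) -> bool:
    """Current (simplified) rule: 65 and over."""
    return age >= 65

def medicare_for_all_eligible(age: int) -> bool:
    """Extended to the whole population: the age test simply disappears."""
    return True

# The 50-year-old nonsmoker from Rall's example would be covered
# under the second rule but not the first.
print(medicare_eligible(50), medicare_for_all_eligible(50))  # False True
```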

And going one step further, what is the purpose of having a bunch of different insurance companies competing with each other to provide the same service? Why not a single payer like in Canada that can be run on a nonprofit basis? And, then, to make it even more manageable why can’t we implement a public health system like in France with doctors functioning more as servants of the public rather than entrepreneurs? This sounds rather utopian, I realize, but only in terms of the resistance we would meet rather than the feasibility. Instead of policies that are economical and rational, we get jury-rigged, Rube Goldberg systems that can barely get off the ground like Howard Hughes’s plywood super-plane.

As long as we are talking in utopian terms, managing an economy would be a whole lot easier if we eliminated the profit motive that pits private enterprises against each other to offer basically the same goods and services. I defy anybody to tell me why he or she picks one detergent over another. There will always be a need for small businesses such as restaurants (something the Cubans unfortunately did not realize until too late—not too late, one hopes) but the commanding heights of the economy?

If you think in terms of spreadsheets (or relational database systems), planning an economy is not that big a deal. You think in terms of resources, labor, and social needs that can be arranged in rows and columns. From that you allocate on a rational basis and according to the priorities a democratically elected government deems wise—such as spending more on public transportation than automobiles.
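The rows-and-columns idea can be made concrete in a few lines. All of the figures below are invented for illustration; the point is only that totaling a plan’s resource requirements is an ordinary spreadsheet-style calculation:

```python
# Toy planning table: rows are sectors, columns are inputs required per
# unit of output. Every number here is invented for illustration.
inputs = ["labor_hours", "steel_tons", "energy_mwh"]
per_unit = {
    "public_transport": [40, 2.0, 5.0],
    "automobiles":      [30, 1.5, 4.0],
    "housing":          [120, 3.0, 8.0],
}
# A democratically set plan: units of output wanted from each sector
# (note the priority given to public transport over automobiles).
plan = {"public_transport": 500, "automobiles": 100, "housing": 200}

# Total requirements per input = sum over sectors of units * per-unit need,
# i.e. an ordinary matrix-vector product laid out as a spreadsheet.
totals = [
    sum(plan[s] * per_unit[s][i] for s in plan)
    for i in range(len(inputs))
]
print(dict(zip(inputs, totals)))
```

Scaling this up is a matter of more rows and columns, not of any new conceptual machinery.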

Of course, until an aroused population takes control of the economy and puts people like Larry Ellison and Barack Obama on a secluded island where they will be stripped of the power to exploit and to destroy, those hopes will remain utopian. The need to defend such an orientation will stay with me to my last breath.

July 28, 2013

Spammers, don’t waste my time or yours

Filed under: commercialism,computers,crime — louisproyect @ 9:07 pm

My readers may have noticed a post from the other day asking someone to stop posting what appeared to be legitimate comments from a page identified as spam in WordPress’s database. Since the comments did not have the usual “Excellent points you are make! I will definately bookmark you for future enjoyment” quality, I assumed that they were legitimate. As someone pointed out to me, the spammer took the trouble to find some text somewhere that plausibly corresponded to the content of my post. I should have taken WordPress at its word and simply deleted the bogus comment. Just now some other spammer has taken the same tack, as evidenced by this comment being held in my spam queue:

[Screenshot: the spam comment held in the moderation queue]

I googled the words highlighted above and discovered that they were first posted to Andy Newman’s blog. Some idiot spammer is taking the trouble to find a comment made elsewhere so that one of my readers will click his link. Doesn’t he understand that people who visit the Unrepentant Marxist are the most deeply suspicious people on earth, as likely to click such a link as they are to vote for Mitt Romney? I guess the URL of the link indicates the level of desperation. Bodaideal.blogbyt.es comes from Spain, where unemployment is over 50 percent for people in their early 20s. I would only advise my spammer to work for the overthrow of the capitalist system there. He will have much more success in that endeavor than in tricking my readers into going to a website titled “Ideal Wedding”.

January 14, 2013

Thoughts on Aaron Swartz’s suicide

Filed under: computers,imperialism/globalization,intellectual property,Internet — louisproyect @ 8:02 pm

On January 9th my CounterPunch tribute to Sol Yurick concluded with this offer:

There are 13 articles by Sol Yurick listed in JSTOR, including the two I cited above. As most of you know, JSTOR has been a kind of battleground over the past several years, with a 26-year-old Stanford dropout named Aaron Swartz being arrested for downloading all of the JSTOR articles from MIT with the obvious intention of making them available to the hoi polloi—in other words, the opposite of the Athenian ruling class revered by Sophocles.

If there were any justice in the world, those articles above all should be accessible to the kinds of people who would have taken a class with Sol 30 years ago at the Brecht Forum or who are CounterPunch readers today. In the same defiant spirit as Aaron Swartz, but on a scaled-down level, I invite anybody who wants to read a Sol Yurick article without paying 15 dollars for the privilege to contact me at lnp3@panix.com and we’ll work something out. You can get a list of Sol Yurick’s articles by doing a search on jstor.com without paying a penny, and I encourage you to do so without delay.

Two days later the world learned that Aaron Swartz had hanged himself. Needless to say, I feel as if I have lost a comrade even though I never met him. Like him, I have always believed in sharing JSTOR articles, even though I never would have taken the kind of risk that he did. For me, it has amounted to passing along some 200 JSTOR articles since I first gained access to the database as a Columbia University employee in 1991, including a dozen or so Sol Yurick articles in response to those who had accepted my invitation. One of the recipients was a fellow named Nicholas Levis, a Greek-American who chaired the protest against Golden Dawn in Astoria a while back. With his Hellenic ties, it was natural for him to request a copy of Sol’s article on Oedipus Rex. That, in my view, is exactly the kind of connection that Aaron Swartz sought to facilitate—even giving his life in the process. While the N.Y. Times obituary suggests that depression, an illness he had battled for many years, caused the suicide, his family and partner thought differently:

Aaron’s death is not simply a personal tragedy. It is the product of a criminal justice system rife with intimidation and prosecutorial overreach. Decisions made by officials in the Massachusetts U.S. Attorney’s office and at MIT contributed to his death. The US Attorney’s office pursued an exceptionally harsh array of charges, carrying potentially over 30 years in prison, to punish an alleged crime that had no victims. Meanwhile, unlike JSTOR, MIT refused to stand up for Aaron and its own community’s most cherished principles.

Like Julian Assange and Bradley Manning, Aaron Swartz was a victim of a national-security state that is anxious to keep critical information out of the hands of its citizens. Since most of JSTOR consists of narrowly focused, if not pedantic, articles mostly designed to help academics survive the “publish or perish” ordeal, one might wonder if there is any connection between Wikileaks and JSTORLeaks.

In one of a number of memorable tributes to Swartz, Glenn Greenwald explained what was at stake in the persecution of Aaron Swartz:

Nobody knows for sure why federal prosecutors decided to pursue Swartz so vindictively, as though he had committed some sort of major crime that deserved many years in prison and financial ruin. Some theorized that the DOJ hated him for his serial activism and civil disobedience. Others speculated that, as Doctorow put it, “the feds were chasing down all the Cambridge hackers who had any connection to Bradley Manning in the hopes of turning one of them.”

I believe it has more to do with what I told the New York Times’ Noam Cohen for an article he wrote on Swartz’s case. Swartz’s activism, I argued, was waged as part of one of the most vigorously contested battles – namely, the war over how the internet is used and who controls the information that flows on it – and that was his real crime in the eyes of the US government: challenging its authority and those of corporate factions to maintain a stranglehold on that information. In that above-referenced speech on SOPA, Swartz discussed the grave dangers to internet freedom and free expression and assembly posed by the government’s efforts to control the internet with expansive interpretations of copyright law and other weapons to limit access to information.

As I began collecting my thoughts on the meaning of Aaron Swartz’s death, it began to dawn on me that the stakes are even higher than those set down by Glenn Greenwald. If there was someone with the focus and the Marxist erudition of V.I. Lenin today, a primary task would be to analyze “the latest stage of capitalism” in terms of the evolving character of American imperialism and its cohorts in the advanced industrial countries, for whom intellectual property is beginning to assume the same critical function as a steel mill or a bank did in 1914.

I could not help but be reminded of this every time I put a screener from The Weinstein Company, the Walt Disney Corporation, et al. on my DVD player in November and December in order to help me nominate best picture, director, actor, etc. for NYFCO’s annual meeting. Every one of them starts with a warning that if I distribute the DVD to just about anybody, I face a five-year prison term and a $250,000 fine. When I sit through something like “The Dark Knight Rises”, I feel that is punishment enough.

In 2001 my colleague and friend Michael Perelman wrote a book titled Steal This Idea: Intellectual Property and the Corporate Confiscation of Creativity. You can read an article based on the primary thesis of the book in the January 2003 Monthly Review. Perelman states:

The dramatic expansion of intellectual property rights represents a new stage in commodification that threatens to make virtually everything bad about capitalism even worse. Stronger intellectual property rights will reinforce class differences, undermine science and technology, speed up the corporatization of the university, inundate society in legal disputes, and reduce personal freedoms.

We have no precise measure of the extent of intellectual property, but a rough calculation by Marjorie Kelly suggests the magnitude of intellectual property rights. At the end of 1995, the book value of the Standard and Poor (S&P) index of 500 companies accounted for only 26 percent of market value. Intangible assets were worth three times the value of tangible assets. Of course, not all intangible assets are intellectual property rights, but a substantial proportion certainly is.

While the legal protection of intellectual property might seem inseparable from contemporary global capitalism, until fairly recently capitalists were equivocal about such things. During the first six decades of the nineteenth century, corporations in the United States were not inclined to respect such intellectual property rights. For example, they often paid as little as possible, or nothing at all, to inventors. In addition, the United States did not even recognize international copyrights.

The free-marketeers of the nineteenth century vigorously opposed intellectual property rights as feudalistic monopolies. Their view of intellectual property rights mostly dominated political economic opinion in the United States until the massive depression of the 1870s weakened faith in market forces. In the context of the economic crisis, business was desperate for anything that would return profits to what they considered to be an acceptable level.

At first, business owners tried forming cartels and trusts to hobble competitive forces. In response to vigorous protests, Congress passed the Sherman Antitrust Act. However, corporations were able to use patents, which were perfectly legal, as a convenient loophole to evade the intent of that law. Through patent pools, they could divide up the market and exclude new competitors. In this way, intellectual property rights were important in establishing monopoly capitalism.

The strengthening of intellectual property rights accelerated once again as the bloom wore off the post-Second World War “Golden Age” and the United States’ export surplus disappeared. Behind closed doors, corporate leaders successfully lobbied the government to strengthen intellectual property rights that would give advantages to their industries. Just as in the late nineteenth century, business saw property rights as a means of increasing profits when economic conditions began to sour. The public never had a clue about the extent to which the government had given away important rights.
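Kelly’s rough calculation quoted above checks out arithmetically: if book value was 26 percent of market value, intangibles made up the remaining 74 percent, and 74/26 is roughly three.

```python
# Checking the rough calculation attributed to Marjorie Kelly above:
# tangible (book) value = 26% of market value for the S&P 500 at end-1995.
book_share = 0.26
intangible_share = 1 - book_share          # 0.74
ratio = intangible_share / book_share      # intangibles vs. tangibles
print(round(ratio, 2))  # 2.85, i.e. roughly "three times"
```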

With its hold over the developing world becoming ever more tenuous and enforced nowadays more by Predator drones than boots on the ground, American imperialism must do everything in its power to control information. That information can be the State Department cables that Bradley Manning turned over to Wikileaks. It can also be the JSTOR articles that are stockpiled behind a paywall as if they were gold bars at Fort Knox. That anybody can go to the N.Y. Public Library, take out a print journal, Xerox an article, take it home, scan it, and send it out to thousands of recipients shows how vulnerable corporations seeking to leverage their control over intellectual property can be. Since the true model for the Internet should be the public library, the powers-that-be have a big job on their hands. They want the Internet to be an open exchange of information, since so much of commerce is based on this model, but they want to keep certain parts of it off-limits to the unwashed masses—the very people whose cause Aaron Swartz took up.

Speaking of Monthly Review, it is worth mentioning that John Bellamy Foster, its current editor, and former editor Robert McChesney had some very interesting things to say about the struggle to keep the Internet open and free along the lines of the public library. In a March 2011 article titled The Internet’s Unholy Marriage to Capitalism, they wrote:

This economic context points to the paradox of the Internet as it has developed in a capitalist society. The Internet has been subjected, to a significant extent, to the capital accumulation process, which has a clear logic of its own, inimical to much of the democratic potential of digital communication, and that will be ever more so, going forward. What seemed to be an increasingly open public sphere, removed from the world of commodity exchange, seems to be morphing into a private sphere of increasingly closed, proprietary, even monopolistic markets.

There are many distinct levels at which Internet activity takes place, and all of them are in the process of being commercialized. The second area where conventional microeconomics would raise eyebrows if not ring alarm bells is how capitalist development of Internet-related industries has quickly, inexorably, generated considerable market concentration at almost every level, often beyond that found in non-digital markets. What this means is that there are multiple areas where private interests can get a chokehold on the Internet and seize monopoly profits, and they are all being pursued. Google, for example, holds 70 percent of the search engine market, and its share is increasing. It is on pace to challenge the market share that John D. Rockefeller’s Standard Oil had at its peak. Microsoft, Intel, Amazon, eBay, Facebook, Cisco, and a handful of other giants enjoy considerable monopolistic power as well. The crucial Wi-Fi chipset market, for example, is a duopoly where two firms have 80 percent of the market between them. Apple, via iTunes, controls an estimated 87 percent market share in digital music downloads and 70 percent of the MP3 player market.

As the citation above makes clear, Foster and McChesney are looking at “the latest stage of capitalism” even if they probably don’t see themselves as following in Lenin’s footsteps. In a way, this is the logical outcome of being disciples of Harry Magdoff and Paul Sweezy, who could be described as continuing the tradition of Lenin’s classic treatise on imperialism.

As Facebook, blogs, YouTube, and email lists become increasingly important in connecting the left globally, we can expect more and more efforts to hinder those who constitute its vanguard—like the young people who stepped to the forefront in the Arab Spring. Like Mayor Bloomberg forcing antiwar or Occupy activists into penned areas, we can expect the Mark Zuckerbergs of the world to draw closer to those in power in order to keep the left on a tight leash. It is one thing to allow people to put a video of a cat playing with a ball of cotton on YouTube. It is something else altogether to use the Internet to build a march on Washington demanding that the government step down. I strongly suspect that the moves against Swartz, Manning and Assange are designed with that future Armageddon in mind.

October 21, 2012

We are Legion

Filed under: computers,Film — louisproyect @ 11:26 pm

True to its name, and unlike Wikileaks, which is known mostly through its founder Julian Assange, the hacktivist group Anonymous is not easily tied to any particular individual. Operating in semi-clandestine conditions, its members have made public appearances only behind the famous V for Vendetta Guy Fawkes masks. Opening last Friday night at the Quad Cinema, Brian Knappenberger’s very fine documentary “We are Legion” not only interviews key figures associated with Anonymous but presents a fairly scholarly yet riveting account of its origins, much of which should be of avid interest to the left. When so many gray-haired veterans of the left fret over when “fresh blood” will arrive, “We are Legion” makes it clear that help is on the way, even if it does not exactly conform to past expectations.

Among the expert witnesses interviewed is Steven Levy, the author of “Hackers: Heroes of the Computer Revolution”. I was particularly interested to hear what he had to say since my review of his book was the very first article I ever posted on the Internet, long before blogs existed—and for that matter, when the Worldwide Web was still in its infancy. Here’s an excerpt from my piece that will give you a flavor of how the earliest generation of geeks tilted left:

Some of the key pioneers in the personal computing revolution were not driven by entrepreneurial greed. For example, the Community Memory project in Berkeley, California was launched in 1973 by Lee Felsenstein. The project allowed remote public access to a time-shared XDS mainframe in order to provide “a communication system which allows people to make contact with each other on the basis of mutually expressed interests, without having to cede judgement to third parties.” The Community Memory project served as a kind of bulletin board where people could post notes, information, etc., sort of like an embryonic version of the Internet.

Felsenstein, born in 1945, was the son of a CP district organizer and got involved in civil rights struggles in the 1950’s. Eventually, he hooked up with the Free Speech Movement at Berkeley and became a committed radical. Lee’s other passion was electronics and he entered the UC as an electrical engineering major.

Felsenstein then hooked up with another left-of-center computer hacker by the name of Bob Albrecht, and the two went on to form a tabloid called PCC (“People’s Computer Company”). Among the people drawn to the journal was Ted Nelson, a programmer who had bounced from one corporate job to another throughout the 60’s but who was always repelled by “the incredible bleakness of the place in these corridors.”

Nelson was the author of “Computer Lib” and announced in its pages that “I want to see computers useful to individuals, and the sooner the better, without necessary complication or human servility being required.” Community Memory flourished for a year and a half until the XDS started breaking down too often. The group disbanded in 1975.

Defying the stereotype of adolescent boys sitting behind a keyboard in the parents’ basement, many Anonymous members are female. One of them is Mercedes Haefer who is facing a 15-year sentence for a “denial of service” type attack on Paypal after it stopped processing donations to Wikileaks. At the time Haefer was a 20-year-old student at the University of Nevada, Las Vegas, the home of the “Running Rebels” basketball team, and a cashier at a Sony store. This is not exactly the typical profile of a Left Forum attendee but maybe it should be.

Haefer’s attorney Sidney Cohen, who works with the Center for Constitutional Rights, presented his legal strategy by comparing Haefer and her comrades to the civil rights activists who sat in at lunch counters in the South in the early 60s. While I wouldn’t dream of giving Cohen legal advice, I would remind him that the ruling class in the U.S. was divided over segregation back then. Today there are no such divisions, especially over the right of financial institutions to carry out their business unimpeded and especially the right to carry out their own “denial of service” to Julian Assange and Wikileaks. Even when hackers are capable of paralyzing the website of a major financial institution or some government agency for only a day, the authorities consider such actions an act of espionage deserving the harshest punishment.

Anonymous’s history is really fascinating. They started out as totally disconnected and anonymous posters to a bulletin board called www.4chan.org, bent mostly on what amounted to trolling just for the hell of it. Somewhere along the line a creep by the name of Hal Turner showed up there and got under everybody’s skin, particularly the “anonymous” regulars. Turner was a neo-Nazi who ran a webcast radio show from his home in North Bergen, New Jersey. Spontaneously, the opposition to Turner made his life hell on and off the Internet, hacking his website and subscribing to all sorts of magazines in his name. This step up from trolling ultimately led to a victory over Turner and a sense of empowerment.

Once those involved developed the sense that they had common social and political goals, they began to work together on a fairly organized basis. The next big campaign was against Scientology, which had filed copyright infringement injunctions against the posting of a totally embarrassing Tom Cruise interview on YouTube or elsewhere. This got their dander up, and they launched a drive to get the video up all over the Internet (maybe I should contact them about my comic book memoir).

This led to a pitched battle with Scientology and its lawyers employing the techniques that would make Anonymous famous (or infamous if you are against the left), taking their websites offline, jamming their phones, etc. If flooding a switchboard is a crime that carries a 15-year sentence, you can bet that Obama or some other big-time politician will have little to worry about when they invite you to do the same thing to his opponents.

The arrests of Haefer and company have undoubtedly had a dampening effect on hacktivism, or what the ruling class calls cyberterrorism. Given the nature of Anonymous, it is not surprising that their views on what Marxists call strategy are not to be found. This, of course, is the same issue with the black bloc. When you get into the business of challenging bourgeois legality, you have to expect the full might of the state to be used against you.

When I joined the Socialist Workers Party in 1967, I was told that I had to stop using illegal drugs. Since I had grown bored with pot, this was not that much of a sacrifice. Even if this kept our numbers smaller, it made a lot of sense especially in places like Houston, Texas where we had a branch. It was not unusual for an activist to face a stiff prison sentence when they were caught with a small amount of pot. Since we knew that there were informers in our movement, we had to be extra careful.

After the Scientology war ended, Anonymous turned to more and more political issues, such as working with Wikileaks and giving technical support to fellow hacktivists in the Middle East during the Arab Spring.

While most of what they do is commendable, there are some disturbing signs that the same kind of unaccountability that has characterized black bloc activism threatens to erode the good will that has been built up in the recent past.

While the ties between LulzSec and Anonymous were never clearly established, it can at least be said that most people regard them as part of the broader hacktivist movement, including the people who made “We are Legion”. Though it engaged in political protest, LulzSec’s overarching goal was to create mayhem. Their manifesto was an exercise in nihilism:

This is the lulz lizard era, where we do things just because we find it entertaining. Watching someone’s Facebook picture turn into a penis and seeing their sister’s shocked response is priceless. Receiving angry emails from the man you just sent 10 dildos to because he can’t secure his Amazon password is priceless. You find it funny to watch havoc unfold, and we find it funny to cause it. We release personal data so that equally evil people can entertain us with what they do with it.

Eventually the group’s leader was unveiled as an FBI informant who had entrapped a number of his comrades, who, like Mercedes Haefer, are facing stiff prison terms.

Recently Anonymous decided to break with Wikileaks after it began what Anonymous regarded as an intrusive fundraising campaign:

Since yesterday visitors of the Wikileaks site are presented a red overlay page that demands they donate money. This page cannot be closed, and unless a donation is made – the content like GI Files are not displayed.

While they have promised not to attack the Wikileaks website, it is not good to see the two stalwarts of hacktivism divided in this fashion.

Frankly, I have some trouble coming to grips with hactivism even though I spent nearly 5 years promoting leftwing computer programming efforts in the late 80s to early 90s through the auspices of Tecnica. Tecnica’s goal was just as radical as Anonymous’s but focused on supporting a government rather than challenging it. We sent hundreds of computer programmers to Nicaragua in order to train its citizens how to use spreadsheets, electronic publishing, databases, etc. as part of an effort to modernize the country and make socialism feasible.

Denial-of-service attacks on the Pentagon, the CIA and shitty financial institutions are obviously something I sympathize with, but I am not exactly sure how the broader goals of the left and those of Anonymous can properly mesh. The closer I look at the left today, the more convinced I become that accountability and transparency are urgently needed. At the risk of sounding hackneyed, I suppose that the premium must be on workers’ democracy, both in how we are organized and in our ultimate goal of transforming society.

If you have wrestled with these questions yourself, I strongly urge you to take a look at “We are Legion”, a film that sheds light on one of the more important developments in the past half-decade or so.

July 3, 2012

After programming computers for 44 years, I am finally retiring

Filed under: computers — louisproyect @ 6:38 pm

I should add that of those 44 years about half have been spent at Columbia University. While the university is most certainly a capitalist institution, it was a much better place than some of the truly scumbag institutions that have employed me over the years including Metropolitan Life, the First National Bank of Boston, Texas Commerce Bank, Salomon Brothers, and Goldman-Sachs.

A recent article in the NY Times by Ben Ratliff captured my feelings about working here:

The heat comes quickly in the summer. By early June, working at home with no air-conditioning, I have no concentration. Everything feels close and impolite and loud.

So I go to Butler Library, on the southern end of Columbia’s campus in Morningside Heights. What began as a diversion has become a self-preserving summer thing: not just Butler, but the Butler stacks, the stillness capital of my imagination…

Butler is a 1930s neo-Classical hulk. At the front, above 14 columns, runs a list of writers and thinkers; the last is Vergil, and I like that someone long ago took a stand and chose to spell it in the Anglicization closer to his real name, not the more common “Virgil.” It announces: nonsense not spoken here.

In the late ’80s, I’d been there a lot, studying and working as a summer employee. When I turned up at the Library Information office last year, there was much clucking about how I’d graduated so very long ago that they needed a whole other database to find my information. But that’s cool: I am from another time. Pre-air-conditioning.

I had come to work but also to tune myself up. So I split the day. Some for my bosses, some for me. After I met my deadline, writing in the reference room, I walked behind the main desk into the stacks. The Columbia library system owns over 10 million volumes; 1.5 million, humanities and history, live here. I moved around for a few hours in the stillness, looking things up, standing up or crouching the whole time, purely and almost dopily happy.

I’d forgotten. The Butler stacks are in a different sensory category, starting from the threshold: If you’re tall, you bow your head as you pass through the low door frame. They form an enclosed rectangular prism at the center of Butler — no windows, a bit cooler than the rest of the building. Two or three levels of the inner stacks can correspond to one floor of the outer library. All this reinforces the feeling that the stacks are something special: a separate province or a vital inner organ.

Inside there is the deep quiet of protection and near-abandonment. You hear the hum of the lights, turned on as needed; that’s it. There’s a phone to make outgoing calls on the fifth floor. To me the stacks are the most sacred space in the library, yet here nobody’s telling you not to talk. You’re on your own. It’s a situation for adults.

It is hard to explain what having access to Butler meant to someone like me. (As a retiree, I will continue to have access.) At one point, when I was researching the Marxist analysis of Reconstruction, I went over to Butler with the call letter for an Eric Foner book. Once I got to the shelf it resided on, I found another 5 or 6 books that were closely related. In some ways, it is a lot like following hyperlinks through Google except the connections in this case are “brick and mortar”. Being able to pick up the books in your own hands and browse through them is a mind-expanding experience.

The same goes for the library’s online resources, like JSTOR, a database of scholarly journal articles that I have relied on over the years, or Lexis-Nexis, a database of newspaper articles as well as television and radio transcripts. As I told the manager of applications development in Information Technology in a brief informal exit interview, access to such material has been more important to me than medical benefits.

Working at Columbia has also been different from other places on a human level as well. The average age on my project is over 50. Some people came here as a kind of escape from the Wall Street grind while for others it was the only employment opportunity available for the older job hunter. I interviewed here in the summer of 1991 when I was 46 years old and unemployed. I had solid skills but a downsizing financial industry was aggressively pursuing age discrimination policies that made it practically impossible to get a new job.

I got through the interviews with flying colors but was dismayed to discover in the final interview with the IT director that the salary would be about $20,000 less than I was making at Goldman-Sachs. I decided to take the job anyhow because my professional goals had changed. I simply wanted shelter from the storm. At the time there was a term to describe what motivated me: “downshifting”. From Wikipedia:

Downshifting is a social behavior or trend in which individuals live simpler lives to escape from the rat race of obsessive materialism and to reduce the “stress, overtime, and psychological expense that may accompany it.” It emphasizes finding an improved balance between leisure and work and focusing life goals on personal fulfillment and relationship building instead of the all-consuming pursuit of economic success.

Downshifting, as a concept, shares many characteristics with Simple living, but is distinguished, as an alternative form, by its focus on moderate change and concentration on an individual comfort level, a “dip your toes in gently” approach. In the 1990s this new form of Simple living began appearing in the mainstream media and has continually grown in popularity among populations living in industrial societies especially the United States, the United Kingdom, New Zealand and Australia.

Just about everybody on my project team that I have had the good fortune to work with over the years is here for reasons like mine, although none of them spends much time in Butler Library or reading JSTOR articles. They are all over 40 and quite “mellow” in their attitudes. Over the past 10 years at least there has not been a single defection to the financial industry. They all put a premium on a work environment where you are not surrounded by ambitious young fucks wearing pinstripe suits, male or female. Since it is almost impossible to get a promotion here, people are content to do their job and go home in the evening without obsessing over whether they will become a manager or how big their year-end bonus will be.

Although Karl Marx never worked in an office, he obviously had a handle on my world when he wrote the following in the Economic and Philosophical Manuscripts of 1844: “…the worker feels himself only when he is not working; when he is working, he does not feel himself. He is at home when he is not working, and not at home when he is working.”

One guy is fairly typical: Bill Thompson, who has never worked anywhere except Columbia. He is responsible for the financial systems that reside on the IBM mainframe, and I interact with him on a daily basis since files from that system are downloaded to a client-server application that I have supported for most of my time here. But Bill is also a passionate Japan scholar with a particular interest in film. Years ago he was the Asian film curator at Bleecker Street Cinema, one of the city’s outstanding art houses in the 60s and 70s. (Bill was acknowledged in the introduction to Michael Hoover’s “City on Fire: Hong Kong Cinema”, a Marxist film study from Verso.)

This morning Bill approached me to discuss a problem we had with a reconciliation process between the mainframe and the Unix server I support but I told him that I needed to talk to him about something far more important before we troubleshot the reconciliation problem. That was John Woo’s “Red Cliff”, an outstanding costume drama that I will be writing about in a few days. That’s what makes my job interesting. There are people who care about other things than Mammon, not that you are likely to find altars to it at the university outside the Business School.

But the best thing people-wise has been my boss Joan who is a no-nonsense Jamaican-American who came to Columbia around the same time as me and also out of the financial industry. Joan has cut me a lot of slack over the years, never saying a word about my obvious forays into JSTOR or Lexis-Nexis during working hours, let alone blogging as the Unrepentant Marxist. She learned early on that when I have something to do, I do it promptly and without errors. That was the implicit deal I cut with Columbia. They got someone with tons of experience who was being paid a lot less than his market value in the financial industry but who was not trying to set the world on fire. At least she was assured that I was not after her job! I managed to spend 44 years as a programmer without ever becoming a manager, thank god.

As it turned out, the work experience at Columbia was also far more interesting than I had any reason to expect. I was hired as a development methodology coordinator in 1991, a job that reflected the school’s determination to explore new technologies. I had worked with leading-edge systems on Wall Street and they expected me to provide insights into how to use databases, etc.

My responsibilities for most of my time here have been to support the Sybase database underlying the school’s financial system, as well as the Unix environment that the application runs in. There are more specialized groups to support Sybase and Unix, but within my application team I am the first line of support and liaison to the specialists. I have also written hundreds of Perl scripts; Perl is a language I have great affection for, unlike Java, an object-oriented language that I spent about three years using and hated every minute of.

If you read the trade press, Java is depicted as the greatest thing since sliced bread (an apt metaphor since sliced bread is actually pretty shitty), but I have always found it perplexing. For example, here’s a description of polymorphism from http://www.artima.com:

There are two good reasons to learn the meaning of polymorphism. First, using such a fancy word in casual conversation makes you sound intelligent. Second, polymorphism provides one of the most useful programming techniques of the object-oriented paradigm. Polymorphism, which etymologically means “many forms,” is the ability to treat an object of any subclass of a base class as if it were an object of the base class. A base class has, therefore, many forms: the base class itself, and any of its subclasses.

If you need to write code that deals with a family of types, the code can ignore type-specific details and just interact with the base type of the family. Even though the code thinks it is sending messages to an object of the base class, the object’s class could actually be the base class or any one of its subclasses. This makes your code easier for you to write and easier for others to understand. It also makes your code extensible, because other subclasses could be added later to the family of types, and objects of those new subclasses would also work with the existing code.


The only word that hits home whenever I read something like this is polymorphism, but not in the sense intended by the author.

Polymorphous perversity is a psychoanalytic term for human ability to gain sexual gratification outside socially normative sexual behaviors. Sigmund Freud used this term to describe the normal sexual disposition of humans from infancy to about age five.

Yeah! That’s the kind of polymorphism I’m for, what happens in bed not in IT.
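For the record, here is what all that verbiage boils down to in practice. A minimal Java sketch (the Animal, Dog and Cat names are my own invention, not from the artima piece):

```java
// Polymorphism as the artima passage describes it: code written against
// a base class works unchanged with objects of any of its subclasses.
abstract class Animal {
    abstract String speak();
}

class Dog extends Animal {
    String speak() { return "woof"; }
}

class Cat extends Animal {
    String speak() { return "meow"; }
}

public class PolymorphismDemo {
    // describe() only knows about the base class Animal...
    static String describe(Animal a) {
        return "it says " + a.speak();
    }

    public static void main(String[] args) {
        // ...yet it accepts an object of any subclass, and the
        // subclass's own speak() is the one that actually runs.
        System.out.println(describe(new Dog())); // it says woof
        System.out.println(describe(new Cat())); // it says meow
    }
}
```

Add a new subclass next year and describe() keeps working untouched; that is the “extensible” part of the sales pitch.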

I blame Java for my eye troubles in fact. Around 8 years ago when I was in a training class for Java and feeling all sorts of stress over my inability to grasp polymorphism, overloading, etc., I noticed spots in front of my left eye. It turned out to be a floater, the first episode in a continuing battle with eye diseases. It was the fucking Java, I tell you.

Perl, Sybase and Unix are much more my métier. Perl has adopted object-orientation in the past decade or so but I have been lucky enough to avoid it. I have been blessed by being able to work in a profession that is as much fun most of the time as playing chess.

Computer programming is essentially a game. You write instructions in order to get something to work, even if it is as mundane as preparing budgets for Columbia’s financial officers. Like chess, you have to operate within the logic of the game. Getting your program to work gives you the same kind of satisfaction as checkmating your opponent, a pleasure I suppose that is only open to those of us a bit on the anal-retentive side.

The other nice thing is that you are constantly learning things. Since technologies are always changing, your mind gets challenged. Most of the time when there’s something I am having trouble figuring out, I go to Google and search for an answer. 99 percent of the time, I come up with the answer. In the old days, before the Internet and when I was working with Cobol mainframe applications, I would ask a co-worker for some advice—an admission that I was not omniscient and not something useful for career advancement at a place like Goldman. Fortunately, by the time I got to Goldman I was a seasoned pro and usually being asked by more junior people how to solve a problem.

The other day I was stymied trying to identify the largest files on a given file system on the AIX server I work on. No matter how much digging I did in Google, nothing came up. So I turned to Eric, the Unix specialist I have relied on over the years, for an answer. I wanted to know what the ten largest files were for a given file system, and this was his answer:

# -xdev keeps find from crossing onto other file systems; field 7 of the
# "find -ls" output is the size in bytes, so sort descending on that field
find . -xdev -type f -ls | sort -k7nr,7 | head -n 10

You can try this on your Mac when you get home!
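And since I complained about Java above, fairness requires admitting that the same idea can be sketched there too, using the standard NIO file API. A rough equivalent only: Files.walk has no counterpart to find’s -xdev, so it will happily cross file system boundaries.

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;
import java.util.stream.*;

public class LargestFiles {
    // Roughly equivalent to: find . -type f -ls | sort -k7nr,7 | head -n 10
    // Walks the tree under root and returns the n largest regular files,
    // biggest first.
    static List<Path> largest(Path root, int n) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths.filter(Files::isRegularFile)
                        .sorted(Comparator.comparingLong(LargestFiles::size).reversed())
                        .limit(n)
                        .collect(Collectors.toList());
        }
    }

    // Files.size throws a checked exception, so wrap it for use in the sort.
    static long size(Path p) {
        try {
            return Files.size(p);
        } catch (IOException e) {
            return 0L; // unreadable files sort to the bottom
        }
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        for (Path p : largest(root, 10)) {
            System.out.printf("%12d  %s%n", size(p), p);
        }
    }
}
```

Compile it and run it with a directory argument, e.g. java LargestFiles /tmp.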

I should add that over the next month or so I will be blogging about jobs I have had over the years, starting with my first one at Met Life in 1968. Lots of yucks and valuable insights about my day job as a combination of Dilbert and Bartleby the Scrivener.

June 25, 2012

The University of Virginia fracas

Filed under: computers,Education — louisproyect @ 7:55 pm

Over a 22-year career in Columbia University’s IT department, I naturally followed administrative affairs at other universities. I began reading the Chronicle of Higher Education back in 1990, mostly as a way of keeping up-to-date with “back office” concerns, especially how computer systems were being used. After a few months, I discovered that this trade magazine could also be relied upon for useful coverage of “the culture wars”, such as Ward Churchill’s firing.

When I first got wind of the forced resignation of the University of Virginia’s president Teresa Sullivan, I wrote it off as some kind of turf battle. As a kind of relic of the medieval world, universities tend to divide into fiefdoms, so firings and forced resignations are par for the course. But after a while it became obvious that what happened there had a lot more to do with what has been happening in American society over the past decade or so, as the corporate elites of one percent infamy tighten their control over every aspect of our lives, including the Ivory Tower.

Sullivan’s resignation was announced on June 10 and reported in the Chronicle as resulting from “significant disagreements between Ms. Sullivan and the Board of Visitors [another term for board of trustees] about how best to position the historic institution for success in the 21st century.” She had come to U. Va. from the University of Michigan, the same institution that Columbia’s Lee Bollinger had ruled before coming here. As you would expect, she was probably no different from Bollinger or 90 percent of the presidents running colleges today, people once described by Upton Sinclair in “The Goose-Step: A Study of American Education”:

Thus the college president spends his time running back and forth between Mammon and God, known in the academic vocabulary as Business and Learning. He pleads with the business man to make a little more allowance for the eccentricities of the scholar; explaining the absurd notion which men of learning have that they owe loyalty to truth and public welfare. He points out that if the college comes to be known as a mere tool of special privilege it loses all its dignity and authority; it is absolutely necessary that it should maintain a pretense of disinterestedness, it should appear to the public as a shrine of wisdom and piety. He points out that Professor So-and-So has managed to secure great prestige throughout the state, and if he is unceremoniously fired it will make a terrific scandal, and perhaps cause other faculty members to resign, and other famous scientists to stay away from the institution.

Sullivan, like Bollinger, spent her time running between Mammon and God at U. of Va., but apparently not fast enough to assuage “visitor” Helen Dragas, the rector of the university and a real estate developer. What? You were expecting a poet or a sculptor maybe? Dragas’s main ally on the board was Vice Rector Mark Kington, who ran an asset management firm, another prerequisite for overseeing an institution of higher learning. The third and most interesting member of the anti-Sullivan triumvirate was Peter Kiernan, who was chairman of U. Va.’s Darden Business School board of trustees and formerly a Goldman-Sachs partner. This Kiernan is a real piece of work, judging from the fawning NY Times Dealbook profile from February 29th of this year, written by the loathsome Andrew Ross Sorkin, infamous for his article sneering at Occupy Wall Street.

Ultimately, Mr. Kiernan, 58, says he believes we need to put aside political differences to solve our national problems and avoid losing our place in the global pecking order, a doctrine he calls “radical centrism.”

“I was really writing the book to people to say, here’s what you’ve got to do to lead the country in uncertain times,” Mr. Kiernan said, over a recent lunch at the Peking Duck House in New York’s Chinatown. “For once, I wanted to read a book that is agnostic to political parties.”

At Goldman, Mr. Kiernan – a rugged Irish Catholic with a firm handshake and a polished demeanor straight out of Wall Street central casting – was better known for his philanthropy than his politics. He headed the Robin Hood Foundation, an antipoverty group whose ranks are populated by financial titans, and led a charity bicycle ride through Vietnam in 1998, with the proceeds going to disabled veterans.

So, when you put together people like Dragas, Kington and Kiernan, the results are predictable. They will be focused on the university’s “bottom line”, and all the rest—from scholarship to teaching young people how to become good citizens—be damned. Kiernan, like Kington, resigned not long after the national media got a hold of the Sullivan story. This was obviously meant to release some steam rather than solve the underlying problem, namely Mammon running roughshod over culture.

The smoking gun in all this was an email from Dragas to Kington calling attention to a Wall Street Journal article by John E. Chubb and Terry M. Moe touting the benefits of computerized classes. (Don’t miss Doug Henwood’s interview with Moe here.) Dragas’s subject heading was “we can’t afford to wait”. The thesis of the WSJ article was most revealing, almost as if written to illustrate volume one of Karl Marx’s Capital: “The substitution of technology (which is cheap) for labor (which is expensive) can vastly increase access to an elite-caliber education”. While Marx usually wrote about this in the context of Britain’s textile mills, apparently 21st century capitalism has every worker in its sights, including those in the halls of ivy.

Chubb and Moe are Hoover Institution scholars. Yes, I know, you weren’t expecting that. Both are fanatical rightwingers who have targeted teachers at both the college and secondary levels. They don’t see any particular need to be at Harvard to get a top-flight education:

The fact is, students do not need to be on campus at Harvard or MIT to experience some of the key benefits of an elite education. Moreover, colleges and universities, whatever their status, do not need to put a professor in every classroom. One Nobel laureate can literally teach a million students, and for a very reasonable tuition price. Online education will lead to the substitution of technology (which is cheap) for labor (which is expensive)–as has happened in every other industry–making schools much more productive.

One would be hard-pressed, however, to say whether they want to make a Harvard education available globally to any son or daughter foolish enough to part with their money, or to adopt a different education model altogether:

Don’t dismiss the for-profit colleges and universities, either. Institutions such as the University of Phoenix–and it is hardly alone–have embraced technology aggressively. By integrating online courses into their curricula and charging less-than-elite prices for them, for-profit institutions have doubled their share of the U.S. higher education market in the last decade, now topping 10%. In time, they may do amazing things with computerized instruction–imagine equivalents of Apple or Microsoft, with the right incentives to work in higher education–and they may give elite nonprofits some healthy competition in providing innovative, high-quality content.

As is to be expected, Chubb and Moe swept for-profit school failure under the rug. If this type of institution is supposed to be a harbinger of things to come in higher education, American society will be going down the drain a lot faster than anybody expected. What places like these are best at is not educating people, but ripping them off. Through clever advertising campaigns (from all appearances, the number one placement on NYC’s buses and subways), they tell working class kids—especially Blacks and Latinos—that a degree from such a school will get them a good job.

The Huffington Post reported on what really makes for-profit institutions tick. Here’s a hint: it is not computers, but the cash register:

And despite the considerable cost, federal data show that for-profit colleges on average devote less than a third of the money that public universities do toward student instruction, and less than a fifth of the money spent on students by private non-profit institutions.

Much of the money is instead going toward marketing and recruiting new students, and to executive compensation and profits. According to securities filings for some of the larger publicly traded corporations that own for-profit schools, more than 30 percent of revenues are being redirected toward marketing efforts and administrative costs.

There is a tremendous irony in the U. of Va. crisis considering the school’s origins in 1819. It was founded by Thomas Jefferson, and the first board of trustees included him and two other former presidents, James Madison and James Monroe.

In a letter written to British scientist Joseph Priestley, Jefferson declared: “We wish to establish in the upper country of Virginia, and more centrally for the State, a University on a plan so broad and liberal and modern, as to be worth patronizing with the public support, and be a temptation to the youth of other States to come and drink of the cup of knowledge and fraternize with us.”

We’ve come a long way from the “cup of knowledge” considering what can be found on the university’s website, even before Dragas’s vision for the future is realized. This is from the Corporate Connections page, shamelessly placed as a link on the university’s home page.

Welcome to the University of Virginia’s “Corporate Connections” gateway. This site will help you navigate through the variety of ways the University relates to and collaborates with business, industry and private foundations.

The Corporate and Foundation Relations office seeks to maximize contributions and other support to the University of Virginia from corporations and foundations, by creating, maintaining and enhancing mutually beneficial relationships between these entities and university units.

We provide an infrastructure for prospect coordination, planning, solicitation and other services that empower university units to conduct these activities in the most effective manner. Our central staff can help you get started based upon your interests and needs. Call (434) 924-4159, e-mail Nick Duke, or write: Office of Corporate and Foundation Relations, University of Virginia, P.O. Box 400807, Charlottesville, Virginia 22904-4807

Among the fruit borne by this poisonous bush is this:

Philip Morris USA Supports Medical Research and Business Leadership with a $25 Million Gift to U.Va.

This is a bit of Philip Morris PR designed to deflect attention from its primary purpose, namely selling cancer sticks. It should be mentioned as well that the “Nick Duke” soliciting emails above is none other than Nicholas R. Duke, a scion of the tobacco-growing empire. How appropriate.

In addition to the company’s support for the University of Virginia, Philip Morris USA has made significant investments in youth-smoking prevention and cessation programs and in research.

Since 1998, Philip Morris USA has invested $1 billion in youth-smoking prevention programs through its Youth Smoking Prevention department and its responsible retailing incentives.

Between 1999 and 2006, Philip Morris USA has provided grants in excess of $176 million to schools, school districts and youth-focused organizations across the United States to help them implement programs that help young people develop confidence and avoid risky behaviors, such as smoking.

Somehow the tobacco giant’s good intentions were lost on the government of Uruguay, which, like other subversive states in Latin America, decided to put the health of its population above what Upton Sinclair called Mammon. From the Daily Beast:

Except over a glass of ruby Tannat wine or a sizzling tenderloin, most people pay little mind to Uruguay. But just mention this demure South American nation to the tobacco industry and watch the smoke billow. A long-burning row between the government in Montevideo and cigarette maker Philip Morris is slowly turning into the mother of asymmetric battles.

Earlier this year, little Uruguay (68,000 square miles, half again the size of Cuba), with a population of 3.5 million and a GDP of $44 billion, tightened the already drastic restrictions on local sales of cigarettes. The international tobacco colossus, with a market capitalization of $107 billion and legions of high-priced lawyers and lobbyists from Bern to the Washington Beltway, struck back, filing a complaint with the World Bank’s International Centre for Settlement of Investment Disputes. The battlefield is minuscule, the size of a pack of smokes. But the case is starting rows over national sovereignty, free trade, and public health that show little sign of dissipating any time soon. Through it all, Uruguay has stood firm, showing it can go toe to toe with giants.

Of course, I am obliged to inform my readers that there is a certain consistency in the U. of Va.’s ties to Philip Morris. After all, the main cash crop on Thomas Jefferson’s plantation was tobacco.

April 22, 2012

Red Plenty

Filed under: computers,Red Plenty,Stalinism,ussr — louisproyect @ 8:15 pm

The NY Times was not content to give Francis Spufford’s “Red Plenty”, a mixture of fact and fiction about the USSR (“faction”, really), just one rave review. At least two were necessary. On February 14th of this year, Dwight Garner wrote this valentine:

Any reader with a pencil has a dozen ways to express negative sentiment in the margins of a book — I am partial to ick, ack, awk, ugh and the occasional wha? — but a writer’s great sentences, in their bid for posterity, mostly just get underlined. At the end of the first chapter of Francis Spufford’s “Red Plenty,” however, I printed a nerdy but heartfelt word: “Bravo.” I felt like giving the author a little bow, or maybe a one-man standing O.

For what it is worth, Garner also fell head over heels for Saïd Sayrafiezadeh’s “When Skateboards Will Be Free”, a callow memoir about growing up an SWP red diaper baby. I guess that even if there were no longer a single Marxist alive anywhere in the world, reviewers would still be singing the praises of such books. That is to be expected when the contradictions of capitalism create the objective conditions for a renewed interest in Marxism, as is the case today.

On March 2nd, Andrew Meier was just as effusive, concluding his review thus:

Yes, “Red Plenty” is a literary/historical seesaw, a work sure to have even the most bilious Kindle-haters tapping for hyperlinks. But it is a work, by turns learned and lyrical, that grows by degree, accreting into something lasting: a replica in miniature of a world of ideas never visible to most, and now gone.

Neither Garner nor Meier is a typical hardened anti-Communist. In fact, their political outlook is not that much different from that of Spufford, a liberal I have described as preferring the “devil you know” to the socialism some unrepentant types still uphold.

The introduction to Part Five of “Red Plenty” makes one thing abundantly clear, namely that the USSR was rotten from the beginning:

But the Soviet experiment had run into exactly the difficulty that Plato’s admirers encountered, back in the fifth century BC, when they attempted to mould philosophical monarchies for Syracuse and Macedonia. The recipe called for rule by heavily-armed virtue—or in the Leninist case, not exactly virtue, but a sort of intentionally post-ethical counterpart to it, self-righteously brutal. Wisdom was to be set where it could be ruthless. Once such a system existed, though, the qualities required to rise in it had much more to do with ruthlessness than wisdom. Lenin’s core of Bolsheviks, and the socialists like Trotsky who joined them, were many of them highly educated people, literate in multiple European languages, learned in the scholastic traditions of Marxism; and they preserved these attributes even as they murdered and lied and tortured and terrorized. They were social scientists who thought principle required them to behave like gangsters.

In other words, Lenin led to Stalin. This is the same formula that is at the heart of all Sovietology, whether of the liberal variety like Robert Tucker or the reactionary Robert Service. Spufford grudgingly admits that the USSR was proceeding along a viable path in the 1920s under the NEP, but concludes that Stalin’s war on the peasantry was practically necessary: “the farmers’ incomes made them dangerously independent, and food prices bounced disconcertingly up and down. Collectivisation saw to all these problems at once.”

One suspects that Spufford views the NEP as an outlier. For him, the main course of Soviet history is a blend of top-down economic planning and the police state. Even more disconcertingly, this historical narrative would appear to be in consonance with Marx’s writings rather than an assault on them.

The dead giveaway is the reference to Plato in the citation above. When Heinrich Blucher (Mr. Hannah Arendt) was indoctrinating me as an undergraduate at Bard College in the early 60s, he insisted that totalitarianism was the logical outcome of a German philosophical idealism rooted in Plato’s philosophy. The Soviet dictatorship was nothing else but the embodiment of the philosopher king. This obsession with ideology ran very deep in the 50s and early 60s. To some extent, it was a reflection of the existentialism of the day (Blucher and Arendt were part of Heidegger’s circle in the 1920s) but also Anglo-American sociology typified by Daniel Bell’s “End of Ideology”. Now that I am older and wiser (go ahead and laugh), I understand that American pragmatism–the official ideology of the ruling class–never received the same kind of critical scrutiny. After all, as Marx pointed out, the ideas of the ruling class are in every epoch the ruling ideas. Why should America be any different?

Although book reviewers are beside themselves over Spufford’s literary skills, I find them just as dubious as his political analysis. The bulk of this review will deal with the latter, but a few words about style are necessary.

I counted 56 characters in “Red Plenty”—some real, like Khrushchev, and others invented, like Galina, a college student who is a kind of hard-core Communist. Unlike in a novel, there is little attempt to connect them. They pop up in one part and only reappear a decade or so later. It is almost as if Spufford had written a thousand-page novel and about half of it got lost on the way to the printer: not just the first half or the second half, but random pages, perhaps strewn by the wind. In many ways, the chapters remind me of the “interludes” in John Dos Passos’s “U.S.A.”, his brief profiles of people like Henry Ford. But “Red Plenty” is all interludes with no connective plot tissue. This is certainly “novel” in the sense of being new, but no competition for the traditional novel of ideas that such a subject (the crisis of Soviet planning) deserves.

The other major problem is the rather clumsy attempt by Spufford to achieve verisimilitude by throwing in all sorts of reminders that this is happening in a very backward Russia and nowhere else. You can turn to practically any page and read something like this:

But the smell of vodka merged with the sweaty sourness of the workers a little further forward, whose factory had plainly lodged them in a barracks without a bathroom, and the fierce rosewater scent the short woman had on, into one, hot, composite human smell, just as all the corners and pieces of sleeve and collar he could see fused into one tight kaleidoscope of darned hand-me-downs, and worn leather, and too-big khaki.

Aaah! I can hear the Song of the Volga Boatmen in the background now.

From the very top of the hierarchy to the very bottom, Spufford’s characters are marked by a Quixotic belief that their system works, even if they disagree on how to make it work better. Unlike some of the classic anti-totalitarian literature of the 20th century, there is no Winston Smith to put forward a contrary analysis even though there is an implication throughout that a Grand Illusion is at work.

At its core, “Red Plenty” is a rehashing of issues that I first ran into nearly 20 years ago when “market socialism” was all the rage among certain elements of the left. On PEN-L, the Progressive Economists Network listserv, you had both Marxists and liberals similar to Spufford arguing that planning was futile. Some were even ready to agree with the Austrian school that markets were necessary for the proper allocation of resources.

There was always an assumption that planning existed in the USSR. Even with computers, planning was doomed to fail—a central thesis of “Red Plenty”. But how warranted was that assumption? While I admit to having only a skimpy knowledge of planning in the USSR in the 50s and 60s, I have to wonder how much it differed from what went on when its rulers were up-and-coming bureaucrats.

The Soviet government announced the first five-year plan in 1928. Stalin loyalists like Krzhizhanovsky and Strumilin, who headed Gosplan, the state planning agency, worried about the excess rigidity of this plan. They noted that the success of the plan rested on four factors: 1) five good consecutive crops, 2) more external trade and help than in 1928, 3) a “sharp improvement” in overall economic indicators, and 4) a smaller ratio than before of military expenditures in the state’s total expenditures.

How could anybody predict five consecutive good crops in the USSR? The plan assumed the most optimistic conditions and nobody had a contingency plan to allow for failure of any of the necessary conditions.

Bazarov, another Stalin loyalist in Gosplan, pointed to another area of risk: the lack of political cadres. He warned the Gosplan presidium in 1929, “If you plan simultaneously a series of undertakings on such a gigantic scale without knowing in advance the organizational forms, without having cadres and without knowing what they should be taught, then you get a chaos guaranteed in advance; difficulties will arise which will not only slow down the execution of the five-year plan, which will take seven if not ten years to achieve, but results even worse may occur; here such a blatantly squandering of means could happen which would discredit the whole idea of industrialization.”

Strumilin admitted that the planners preferred to “stand for higher tempos rather than sit in prison for lower ones.” Strumilin and Krzhizhanovsky had been expressing doubts about the plan for some time, and Stalin removed these acolytes from Gosplan in 1930.

In order for the planners, who were operating under terrible political pressure, to make sense of the plan, they had to play all kinds of games. They had to falsify productivity and yield goals in order to allow the input and output portions of the plan to balance. V.V. Kuibyshev, another high-level planner and one of Stalin’s protégés, confessed in a letter to his wife how he had finessed the industrial plan he had been developing: “Here is what worried me yesterday and today; I am unable to tie up the balance, and as I cannot go for contracting the capital outlays–contracting the tempo–there will be no other way but to take upon myself an almost unmanageable task in the realm of lowering costs.”

Eventually Kuibyshev swallowed any doubts he may have had and began cooking the books in such a way as to make the five-year plan, risky as it was, totally unrealizable.

Real life proved how senseless the plan was. Kuibyshev had recklessly predicted that costs would go down; instead they went up. Although the plan allocated 22 billion rubles for industry, transportation and building, the Soviets spent 41.6 billion. The money in circulation, which planners had limited to a growth of only 1.25 billion rubles, grew to 5.7 billion in 1933.

As madcap and as utopian as the original plan was, Stalin tossed it into the garbage can immediately after the planners submitted it to him. He commanded new goals in 1929-30 that disregarded any economic criteria. For example, instead of a goal of producing 10 million tons of pig iron in 1933, the Soviets now targeted 17 million. All this scientific “planning” was taking place when a bloody war against the Kulaks was turning the Russian countryside into chaos. Molotov declared that to talk about a 5-year plan during this period was “nonsense.”

Stalin told Gosplan to forget about coming up with a new plan that made sense. The main driving force now was speed. The slogan “tempos decide everything” became policy. The overwhelming majority of Gosplan, hand-picked by Stalin, viewed the new policy with shock. Molotov said this was too bad, and cleaned house in the old Gosplan with “all of its old-fashioned planners” as he delicately put it.

Now the USSR was clearly a different place in 1960 than it was in 1930. But it was subject to the same distortions as it was earlier. Politics trumped science. No matter how cogent the strategies put forward by software engineers, they were likely to be superseded by the feudal-like social relations that existed under Stalinism, with a King at the top—the General Secretary of the CP—and all the fiefdoms underneath him run by factory managers. Socialism is not just about planning. It is about workers control. Without thoroughgoing democracy that is fed by initiatives by those who produce the wealth of society, it is very difficult to use scientific methods to create “plenty”. The implied message of “Red Plenty” is that since democratic socialism is impossible, you might as well live with market relations and all the shit that goes along with it, well on display in his own country and the rest of Europe today.

In the 1980s I was president of the board of Tecnica, a kind of radical Peace Corps that sent programmers and engineers to Nicaragua to volunteer their skills. While Nicaragua was not socialist, it was committed to planning on a large scale. Our organization reported to Carl Oquist, who was Daniel Ortega’s chief economics adviser. But no matter how many volunteers we sent, including some very capable people from Bell Labs and other blue-chip American firms, and no matter how many Nicaraguans we trained in the use of spreadsheets and database management systems, it could never compensate for a contra war that was draining the country economically.

Missing from “Red Plenty” is any engagement with the costs of war on economic development in the USSR. Spufford has no use for Stalin but does not explain how the Soviet people ended up with him. A contra war that ensued after the Bolsheviks took power resulted in the deaths of many of the most democratically minded and politically educated worker cadres who made the revolution. A political vacuum and the country’s inherited backwardness made it all the more easy for a despot like Stalin to take power and build a “socialism” that had much more in common with the primitive accumulation stage of capitalism than the Marxist beliefs of those who were shot down by American and British rifles in 1919.

In an interview with the Browser, Spufford is asked whether he agrees with the thesis of Yuri Slezkine’s “The Jewish Century”, namely that “in the 1920s and 30s the Soviet Union was a brilliantly successful state for Jews.”

While admitting that some aspects of the book are “undeniable”, he repeats the catechism of the professional anti-Communist:

But communism is a bit embarrassing now. It is getting hard to get people to own up to the fact that once upon a time they thought it was sensible. It was part of the centre of gravity of the 20th century. What I agree with about it is that it brings an aspect of the 20th century into view. One of the hardest things for us to remember about Stalinism is that as well as being a system of horrors it also represented modernity and social mobility and opportunity for lots of people. In a horribly straightforward way the great purges opened up an incredible number of jobs, as we saw with Khrushchev, who is a fine example of Russia being a land of opportunity built on numerous graves.

In generational terms, I belong to the 60s—a time when many young people still believed that the Soviet Union was “progressive” but only in the most dialectical sense. Born in 1964 and nearly 20 years my junior, Spufford came of age at a time when Francis Fukuyama’s “end of history” and Austrian economics were becoming hegemonic.

Those born 20 years after Spufford are now in their late twenties and unlikely to be convinced that history is at an end or that markets and “plenty” go together. These are the young people with crappy jobs if they are lucky and/or with huge college debts. They are also the kinds of people in the vanguard of the Occupy movement and the democratic revolutions sweeping the Middle East.

No matter how many books get written about the evils of 20th century communism, I doubt that this will matter to them. They want a society that is based on economic justice and peace. Whatever name you call it, this is what they seek no matter how many glowing reviews that “Red Plenty” garners in the New York Times. As has been the case since Marx and Engels were the same age, capitalism creates socialism through its own contradictions. As long as there is capitalism, there will be a movement for socialism. It is our job in the 21st century to rebuild a movement that is the only hope for the future, starting now.

February 4, 2012

Anonymous hacks FBI conference call

Filed under: computers,repression — louisproyect @ 3:43 pm

NY Times February 3, 2012

F.B.I. Admits Hacker Group’s Eavesdropping


WASHINGTON — The international hackers group known as Anonymous turned the tables on the F.B.I. by listening in on a conference call last month between the bureau, Scotland Yard and other foreign police agencies about their joint investigation of the group and its allies.

Anonymous posted a 16-minute recording of the call on the Web on Friday and crowed about the episode via Twitter: “The FBI might be curious how we’re able to continuously read their internal comms for some time now.”

Hours later, the group took responsibility for hacking the Web site of a law firm that had represented Staff Sgt. Frank Wuterich, who was accused of leading a group of Marines responsible for killing 24 unarmed civilians in Haditha, Iraq, in 2005. The group said it would soon make public “mails, faxes, transcriptions” and other material related to the case, taken from the site of Puckett & Faraj, a Washington-area law firm. A voluminous 2.55 gigabyte file labeled as those files was later posted on a site often used by hackers, Pirate Bay.

Regarding the conference call, an F.B.I. official said Anonymous had not in fact hacked into it or any other bureau facilities. Instead, the official said, the group had simply obtained an e-mail giving the time, telephone number and access code for the call. The e-mail had been sent on Jan. 13 to more than three dozen people at the bureau, Scotland Yard, and agencies in France, Germany, Ireland, the Netherlands and Sweden. One recipient, a foreign police official, evidently forwarded the notification to a private account, he said, and it was then intercepted by Anonymous.

“It’s not really that sophisticated,” said the official, who would discuss the episode only on condition of anonymity. He said no Federal Bureau of Investigation system was compromised but noted that communications security was more challenging when agencies in multiple countries were involved.

“We’re always looking at ways to make our communications more secure, and obviously we’ll be taking a look at what happened here,” he said.

The bureau issued a brief statement confirming the intrusion, which was first reported by The Associated Press: “The information was intended for law enforcement officers only and was illegally obtained. A criminal investigation is under way to identify and hold accountable those responsible.”

The breach, clearly an embarrassment for investigators, is the latest chapter in a continuing war of words and contest of technology between hacking groups and their perceived opponents in law enforcement and the corporate world.

The F.B.I. e-mail titled “Anon-Lulz International Coordination Call” — a reference to Anonymous and to an allied group of hackers, Lulz Security — announced a conference call for investigators “to discuss the on-going investigations related to Anonymous, Lulzsec, Antisec, and other associated splinter groups.”

The recording posted on YouTube and elsewhere included American and British voices discussing suspects in the case. The call begins with banter between an American named Bruce and British officials named Stewart or Stuart and Matt, who are joined by another official from F.B.I. headquarters, Timothy F. Lauster Jr., who sent the e-mail announcing the conference call.

The conference call illustrates both the scale of the international police effort to identify and prosecute the hackers, and the striking contrast in age and status of the investigators and their targets: what seem to be middle-aged law enforcement officials on two continents are overheard dissecting the illicit activities of teenagers.

A British official refers to Ryan Cleary and Jake Davis, two British teenagers who have been arrested and are wanted in the United States on suspicion of having ties to Anonymous. The British official describes a 325-page report analyzing Ryan Cleary’s hard drive, and an F.B.I. agent in Los Angeles discusses various suspects and their nicknames.

The investigators also refer to several suspects who had not yet been arrested, including one who calls himself Tehwongz, described by the British official as “a 15-year-old kid who’s basically just doing this all for attention and is a bit of an idiot.”

The conversation was part of an international criminal investigation that began in 2010 after Anonymous championed WikiLeaks by mounting electronic attacks on MasterCard and PayPal and other sites that had stopped collecting donations for the antisecrecy organization.

Last month, Anonymous attacked the Web sites of the Justice Department and major entertainment companies in retaliation for criminal charges against the founders of Megaupload, a popular Internet service used to transfer music and movies anonymously.

The hackers could have penetrated the law-enforcement official’s personal e-mail account by guessing a weak password, sneaking into an unencrypted wireless network, or, most likely, with a common and relatively easy tactic known as a phishing attack, said Keith Ross, a computer science professor at Polytechnic Institute of New York University and a security expert. A phishing attack involves sending an e-mail that looks like it is from a friend or relative and persuading the recipient to click on a link that allows every keystroke entered on that particular computer to be recorded. Recording keystrokes is an efficient way to steal someone’s e-mail username and password.

“The real issue for law-enforcement officials is they need to be better educated about how they handle sensitive data on their e-mails,” Mr. Ross said. “It’s an easy vulnerability to crack. If you’re not careful it’s a very dangerous attack.”

The same methods may have been used to hack the Web site of the lawyers who represented Sergeant Wuterich, Neal Puckett and Haytham Faraj. Their Web site was defaced by the hackers to display a message from Anonymous saying it was exposing “the corruption of the court systems and the brutality of U.S. imperialism,” Gawker.com reported. Later, the site was taken down.

In an interview late Friday, Mr. Faraj said he thought that little of the material stolen from their site related to the Haditha case, though some documents might relate to a polygraph that he said Sergeant Wuterich had passed. He said he feared the documents might include a confidential statement from a rape victim in an unrelated case. “I think in their haste to put stuff out there, they’re going to hurt some people,” he said.

Mr. Faraj said he had represented Guantanamo detainees and had supported and offered to represent Pfc. Bradley Manning, the soldier accused of providing documents to WikiLeaks, suggesting that the hackers of Anonymous may be inadvertently attacking someone who shares some of their presumed political views. “They got the wrong guy,” he said.

He said the F.B.I. had contacted the law firm and opened an investigation.

Sergeant Wuterich, 31, pleaded guilty last month in a military court in California to dereliction of duty, telling the judge that he regretted ordering his men to “shoot first, ask questions later.” As part of a plea agreement, however, he received no prison time, though his rank was reduced to private. The sentence sparked anger in Iraq and among some human rights advocates, and the Anonymous message complained that Sergeant Wuterich had gotten “only a pay cut” as a penalty.

Somini Sengupta and Nicole Perlroth contributed reporting from San Francisco.


January 6, 2012

Desk Set

Filed under: computers,Film — louisproyect @ 6:37 pm

As I sit watching this 1957 Tracy-Hepburn classic on HBO, I realize that “The Company Men” followed in its footsteps, even though the downsizing in the earlier film was driven by a capitalist firm’s need to achieve “efficiencies” through a very early mainframe computer rather than by the whiplash of the 2008 economic collapse. Here’s my review of “Desk Set” from some years ago:

Made in 1957, “Desk Set” has the distinction of being the last comedy that Katharine Hepburn and Spencer Tracy costarred in. It is also one of the first movies (and probably the last) to tackle the subject of computers and unemployment.

Tracy plays Richard Sumner, an MIT graduate and computer expert who is consulting with a huge media corporation in order to introduce EMERAC into its research department. That department is run by Bunny Watson, played by Hepburn. She and her staff–all women–would seem resistant to any kind of drastic technological innovation. First of all, the questions put to them over the phone each day would seem to defy automation: “What is the tonnage of the planet Earth?”; “Who are Santa’s reindeer?”, etc. Second, their office evokes a time in the American corporate world when expressions of individuality were tolerated, if not encouraged. For instance, there is a vine in Hepburn’s office that snakes wildly across the walls and ceiling, an obvious statement that its owner will not allow herself to be subjected to the right-angled efficiency of Sumner’s automation schemes.

Once the computer is finally introduced, Hepburn and her staff receive pink slips on the very first payday, courtesy of another EMERAC that has been installed in the payroll department. In the climactic scene, when Sumner visits the research department to see how the new computer is working out, all hell is breaking loose. The research department is being deluged with phone calls that EMERAC’s new, anal-retentive operator cannot process accurately. When she feeds the machine a question as to whether the King of the Watusis drives a car, the machine can only spit out a movie review of “King Solomon’s Mines”, which includes the keyword “Watusi.”

Sumner implores Hepburn to pitch in and help process the complex queries. Why should she, she asks, since she has just been fired. Fired? That’s not possible, he replies: the research staff was not only meant to be kept on, there were going to be new hires to handle the expected increase in volume, in light of the company’s plans to merge with another corporate giant. Just as she shows him her pink slip, the CEO of the company storms into the research office and shows Sumner his own pink slip. It turns out that the payroll computer has screwed up and everybody in the company has been fired. This revelation is accompanied by the sight of EMERAC spitting out punch cards across the room like confetti and the sound of agonized electronic burbling as the burden of the complex queries finally proves too much for its logic circuits.

The movie ends happily with everybody retaining their jobs and Tracy and Hepburn (rather elderly at this point in their careers) smooching. For those who have not had the pleasure of seeing a Tracy-Hepburn comedy, this may not be the most rewarding of their films. Generally they are cast to type, with Tracy as a gruff, homespun, working class guy, while Hepburn is more refined, gifted with an ironically dry sense of humor. Their films also tend to be observations on society, such as the remarkable 1941 “Woman of the Year”. Written by Ring Lardner Jr., it is Hollywood’s ideological contribution to the Stalin-Hitler Nonaggression Pact. In this film, Tracy’s character believes that the United States should focus on its own problems, while Hepburn is a blueblood activist who is always running around to various meetings concerned with war and peace in some far-off hotspot.

Based on a Broadway play by William Marchant, “Desk Set” prefigures concerns that are still with us to this day. Will computers throw people out of work? What kind of progress will that be? As I have mentioned previously, the first attempt to come to grips with these sorts of questions in a Marxist framework was found in the pages of The American Socialist, a magazine that lasted from 1954 to 1959 and whose impact and legacy are much greater than would be suggested by its brief life span.

In the December 1955 issue, we find a symposium on “What’s Ahead for Labor” that tries to assess the impact of automation and mechanization on jobs. Kermit Eby, a professor at the University of Chicago who had begun contributing to the magazine that year, notes that 1.6 million fewer people are engaged in industrial production than at the high point. “These displaced persons, of course, push into the services and displace others. All the time, each is working for lower wages.”

Anticipating the neo-Luddite protests against automation that surfaced in the 1990s from people like Jeremy Rifkin and Kirkpatrick Sale, Eby defends a socialist perspective. If we can forgive the male chauvinism contained in his observations, there is still much that makes sense:

It is not my thesis that the machine does not liberate, nor do I argue for return to the primitive, as Gandhi did. However, I do insist that man’s ends are not defined in the volume of goods and services his industrial machines produce. Instead, man’s ends lie in the quality of life that increased leisure makes possible. And today, at least in America, more and more of us are freed to live life in dimensions which transcend survival, as measured in bread-and-butter terms. Consequently, not only must we today examine our work-ethic, but also our attitude toward play and leisure-time activity.

For example, it has been emphasized that ours is a spectator culture. It is, of course, but there are other signs already mentioned: do-it-yourself, travel, and so on. All these things point to something more than the spectator view. To begin with, I would examine what life would be like when we no longer need to eat our bread by the sweat of our brow. And how would our lives be changed if we realized that work is not a punishment for past sins and that play is not evil, but rather a creative expression of man’s creative and artistic self?

As our industrial revolution advances, we come face to face with a world moving towards the 30-hour week, paid vacations, field, early retirement. How many workers dream of their chicken farm? For the skilled operator and the maintenance man, going to the factory will perhaps not be so bad. On many operations there will be little to do except watch the machine. There will be time for talk-fest with the boys. Under such circumstances the factory kind of club where the worker goes to meet the boys will be one of the few man-dominated worlds left.

October 26, 2011

Reflections on the passing of John McCarthy and Dennis Ritchie

Filed under: computers — louisproyect @ 7:00 pm

John McCarthy

Dennis Ritchie

Two towering figures in computing died this month, besides Steve Jobs. A good case can be made that their contributions are as great as his, if not greater. The first was Dennis Ritchie, who died at the age of 70 on October 13th. Ritchie created the C programming language and, with Ken Thompson, the Unix operating system, two technologies key to the distributed computing that makes today’s Internet possible. The other is John McCarthy, who died yesterday at the age of 84. Like Ritchie, McCarthy was a major innovator in both languages and operating systems. He was the father of time-sharing, the key component of mainframe operating systems that allowed multiple users to access the computer as if each were its exclusive owner. McCarthy went on to become a pioneer of artificial intelligence and created a language called LISP that is widely used in that arena.

When I began working on IBM mainframes in 1970, I was trained in something called TSO, or time-sharing option. This was the framework for submitting test jobs, editing my COBOL programs, debugging, etc. TSO grew out of a project under John McCarthy’s supervision in 1957 on an IBM 704, the predecessor to the 360 generation of computers that I was introduced to in 1970. Before time-sharing, a computer was used on a serial, one-at-a-time basis. An IBM 704 might be loaded with a payroll program to print checks; afterwards, another program would be loaded to print checking account statements, and so on. If you wanted to make a change to the payroll program, you had to wait until these jobs were completed. With time-sharing, all three tasks—plus many more—could run simultaneously. The only limitation was the size of the computer’s memory. It is astonishing to think that the average mainframe in the early 70s, which cost $115,000 and required mammoth air-conditioning support, typically shipped with 8 megabytes of memory. By comparison, my wife’s iPod, with 8 gigabytes of memory, a thousand times more than that IBM 360, costs $199.

I had a brief exposure to LISP about 15 years ago when I enrolled in a computing and education program at Teachers College in Columbia. One of the classes I took was titled Cognition and Computers taught by John Black. It was supposed to introduce you to the use of artificial intelligence in primary education. We were required to use a LISP-like meta-language (don’t ask me to explain) for homework exercises, all of which were supposed to “teach” students about highly mechanical tasks such as playing billiards or starting a car. I seem to remember his explanation of how the language would be used:

I stupidly tried to use our LISP-like language for an exercise on American history that he told me was inappropriate. I tried again with a more mechanical application, only to be told—after 25 years of programming experience—that I still didn’t get it. I dropped out of the class with a great feeling of relief.

One of the books I have on my shelf at work is titled “LISP Lore: a Guide to Programming the LISP Machine”, a gift from the author, Hank Bromley, an MIT graduate who was one of our Tecnica volunteers in Nicaragua. I have opened it once or twice without making heads or tails of what I was reading.

In the NY Times obituary on McCarthy, the verdict on AI is mixed at best:

Artificial intelligence is still thought to be far in the future, though tremendous progress has been made in systems that mimic many human skills, including vision, listening, reasoning and, in robotics, the movements of limbs. From the mid-’60s to the mid-’70s, the Stanford lab played a vital role in creating some of these technologies, including robotics and machine-vision natural language.

The last time I heard AI being hyped was in the early 80s, when Reagan was pushing SDI, or “Star Wars”, an anti-ballistic-missile system that would supposedly put a shield over the U.S. Breakthroughs in AI were supposed to make the system fail-safe. Fat chance of that after what I saw at Teachers College.

Opposition to SDI became part of a broader anti-nuclear movement that opposed both nuclear weapons and nuclear power. Three Mile Island in 1979, Chernobyl in 1986, and the Reagan administration’s talk about surviving nuclear war got lots of people into motion, including the Green Party in Germany. In the U.S., you saw the growth of Computer Professionals for Social Responsibility, a group that was formed primarily to oppose SDI. I was a member briefly until my attention shifted to solidarity work in Central America.

I imagine that John McCarthy would have been gung-ho for SDI even though he started out early in life as a Communist. The Times obit recounts:

John McCarthy was born on Sept. 4, 1927, into a politically engaged family in Boston. His father, John Patrick McCarthy, was an Irish immigrant and a labor organizer.

His mother, the former Ida Glatt, a Lithuanian Jewish immigrant, was active in the suffrage movement. Both parents were members of the Communist Party. The family later moved to Los Angeles in part because of John’s respiratory problems.

He entered the California Institute of Technology in 1944 and went on to graduate studies at Princeton, where he was a colleague of John Forbes Nash Jr., the Nobel Prize-winning economist and subject of Sylvia Nasar’s book “A Beautiful Mind,” which was adapted into a movie.

At Princeton, in 1949, he briefly joined the local Communist Party cell, which had two other members: a cleaning woman and a gardener, he told an interviewer. But he quit the party shortly afterward.

In the ’60s, as the Vietnam War escalated, his politics took a conservative turn as he grew disenchanted with leftist politics.

Dennis Ritchie went in the opposite direction politically from McCarthy, as the NY Times obit reveals:

While a graduate student at Harvard, Mr. Ritchie worked at the computer center at the Massachusetts Institute of Technology, and became more interested in computing than math. He was recruited by the Sandia National Laboratories, which conducted weapons research and testing. “But it was nearly 1968,” Mr. Ritchie recalled in an interview in 2001, “and somehow making A-bombs for the government didn’t seem in tune with the times.”

I first ran into Ritchie’s writings back in the early 90s when Columbia trained us in the C programming language. Ritchie co-authored a book titled “The C Programming Language” with Brian Kernighan that I still have on my shelf from that time. The university was about to go full blast into client-server computing and C was being considered as a front-end to replace COBOL.

I have to confess that I never developed an appreciation for C and could never understand why it would be considered for business-oriented applications of the sort that had been written in COBOL. In COBOL, you would open a disk or tape file for output with the single instruction “OPEN OUTPUT EMPLOYEE_MASTER_FILE”. Here is the equivalent in C:

/* fopen example */
#include <stdio.h>

int main(void)
{
    FILE *pFile;

    pFile = fopen("myfile.txt", "w");
    if (pFile != NULL)
    {
        fputs("fopen example", pFile);
        fclose(pFile);
    }
    return 0;
}

Columbia wisely decided against getting involved with C and used FoxPro instead for a client-server system to handle the school’s basic financial functions (purchasing, budgets, etc.). I never got involved with the FoxPro work but was assigned to support the back-end of the system using Unix and Perl, a fairly high-level language that was as simple to use as COBOL but as succinct as C. My job consisted mainly of getting data from the mainframe loaded into Sybase tables that were updated by users during the day and sent back to the mainframe at night. Columbia is about to replace this nearly 20-year-old system with a package from PeopleSoft. I am in the strange position of supporting a “legacy” system that was bleeding-edge when it was introduced, but that’s the way it goes in data processing.

I am glad that I acquired basic skills in Unix, since my MacBook, like all Apple computers, runs on an operating system descended from Dennis Ritchie’s Unix. This means that I am able to write software on the computer using Perl, including a program called compare.prl that reconciles my checking account using a file I download from Chase online. The best thing about Unix, of course, is that it is far more stable than Windows, and we have Steve Jobs to thank for that. Jobs went against the grain when he started NeXT back in 1985, a company whose workstations ran a Unix-based operating system. Apple bought NeXT in 1996 and adapted its operating system for the Mac, an architecture that persists to this day.

What a long, strange trip it’s been.
