PMB figures you should bear a few things in mind that will make dealing with your resident literature scholar a more endurable, even pleasant experience this holiday season. After all, your token scholar will likely return home (family), attend an office holiday party (colleagues), or ask you to please move at the bookstore (random encounters!), so read up and prepare to deal with them. They can get grumpy over the holidays because of the inflated ratio of consumerism to personal wealth that a lit. scholar typically experiences at this time of year.
The Myths In No Particular Order:
1) A book is always a good gift.
There are so many reasons why a book is quite often not a good gift. The main reason is that any book is a reminder to a lit. scholar of how many books are on the impossible to-read list, which is comparable in volume to Borges' Library of Babel. Accordingly, unless the lit. scholar specifies a book or type of book, s/he probably won't have time to read the book you've gifted anytime this decade. Feelings of guilt--both personal and professional--will compound exponentially for every day the gifted book goes unread.
It's also true that, despite common misconceptions, lit. scholars actually don't fetishize books as much as they sometimes lead you to believe. Often it's worthwhile for a lit. scholar to pretend that s/he loves books that much, and feels the need to pop a handful of aspirin upon a Kindle sighting just to stave off a brain hemorrhage, simply to avoid having to spend 20 minutes explaining to you the difference between a book collector and a scholar trained to think of a weekend trip to Victoria's Secret as 'a narrative' and the two-for-one-bra-sale sign as 'a text.'
2) Lit. scholars have holidays.
No. Even though two weeks of vacation time in a calendar year is scant enough to be a legitimate human rights concern (and a crime against humanity), at the very least your boss probably gives you that. Not so for the lit. scholar, who has no vacation, ever. And just to underscore that fact, many of the largest literature conferences occur in late December/early January. And when do you think that conference paper is going to get written? Merry Christmas, happy Chanukah, and all that...
3) Lit. scholars enjoy talking about books...with you.
Suppose you're at a holiday party, and you've just been introduced to your friend's fiancé, who you're told is a particle physicist by trade. Would you ask him what his favorite subatomic particle is? Do you suppose he prefers bosons to leptons?
If that scenario sounds ridiculous, you should then understand why 'what is your favorite book' is the worst question you could possibly ask a lit. scholar. And, to clarify, the reason is not 'because there are just sooooo many great books!' The reason is that for lit. scholars, books are objects of study just like bosons are for particle physicists. Accordingly, no lit. scholar wants to have a conversation with you about books that would be the particle-physics equivalent of 'so, do you enjoy the electron neutrino? Because I always found it sort of dry.' The wonderful thing about particle physics is that most people rightly understand that they don't know anything about particle physics. The awful thing about literature scholarship is that most people don't understand that they don't know anything about literature scholarship, but assume that because they can speak and read English they can carry on a worthwhile conversation about it. Wrong. Possible conversational alternatives: 'How's the weather in (where you live)?'; 'What are your thoughts on tax cuts?'; 'What is your favorite type of dumpling?'; 'Do you like sports?'; 'Do you remember when Sheetz sold fried macaroni and cheese bites, which was awesome? Do you think the FDA shut that down?'
4) Lit. scholars are drunks and philanderers.
Despite what Hollywood would lead you to believe, it's not true. But lit. scholars are happy that Hollywood pays attention to them. Given the film treatment, it's no wonder that this image of the lazy, underworked, oversexed, Dionysian literature professor persists; but it's a fallacy to assume that simply because people do a lot of work on fiction, they live fictive lives.
5) Lit. scholars want to become writers.
If 'what is your favorite book' is the worst question, 'do you want to become a writer' is the second-worst. Again, it would be absurd to ask your friend's physicist fiancé if he wants to become a quark; so why would someone who studies literature want to become a writer? Or, to put it more fairly, even if someone both studies literature and wants to become a writer, why should the one beget the other? An apple and an orange are both roughly spherical, both mosquitoes and airplanes fly, and both clouds and Q-tips are white and fluffy; but commonalities don't always bespeak a larger relationship, and scholars and writers sit at opposite ends of the literary production process anyway. If you want to become a writer, quite possibly the worst thing you can do is become a lit. scholar first.
There you have it. Tread lightly this holiday season!
Wednesday, December 8, 2010
Wednesday, November 3, 2010
Mathematics Will Destroy The Universe
The October 2, 2010 issue of New Scientist features an article called ‘Countdown to Oblivion,’ which explains the research of physicist Ben Freivogel and his team at UC Berkeley. Freivogel and colleague Raphael Bousso work on theories of eternal inflation, best (over)simplified as the notion that ‘different parts of space can undergo dramatic growth spurts, essentially ballooning into separate universes with their own physical properties.’ Further, ‘the process happens an infinite number of times, creating an infinite number of universes, called the multiverse’ (quoted from the article).
As the article explains, ‘the infinities involved mean that anything that can happen does happen—an infinite number of times,’ such that defining probabilities according to our typical means of doing so becomes hugely problematic. Accordingly, physicists take a slice—a ‘cut-off’—of the multiverse as a sample; ‘however, doing this inevitably slices through individual universes on the edge of the sample,’ leading to flawed probabilities…
…UNLESS (!), as Freivogel contends, ‘the mathematical cut-offs somehow have real and dire consequences for the places they intersect.’
In other words, as PMB understands it, the argument here is that because of an inexplicable hitch in a self-sealing mathematical method of making ‘cosmological predictions,’ the physicists involved are prepared to assume not that their method of understanding the cosmos is limited, but that a mathematical aporia can actually manifest itself in the physical destruction of the universe and the end of time.
The most striking quotes come from the scientists who are trying to negotiate the problem:
“‘We’re stuck between a rock and a hard place,’ says Bousso. ‘If you don’t like the cut-off, then you have no way of making predictions and deciding what’s possible in eternal inflation.’”
If you 'don't like' the cut-off?!?
PMB translation: if it doesn’t make sense to you how the existence of a mathematical problem can not merely signify, but actively create the destruction of the universe, then it may be better to just pretend like this is the case so that we can continue to run our simulations, unfettered.
And then:
“‘If we do have the end of time, then that’s a strange situation, but at least it solves this paradox,’ says Olum.”
PMB translation: look, since solving this paradox is obviously more important to us than the continuation of time, we’re happy to accept this theory as soon as we can find a way to prove it.
PMB isn’t interested in commenting on the validity of Freivogel’s research or the solution to this paradox, nor is he qualified to do so. He flagged this article as an excellent example of how the supercilious Cult of Doing Science can actually undermine good scientific thinking. These physicists openly admit that their properly derived doubt about their conclusions is directly affected by the fact that such doubt could undermine the very method by which they do their work. Here, what the Cult of Doing Science leads the public to believe is 'scientific,' hence bulletproof, is actually a good representation of much of what doing science entails: speculation, assumption, doubt, failure, readjustment.
(And before the scientists find a way to get self-righteous about this portrayal of science as rather difficult work, let PMB remind you that the sciences aren't the only disciplines that experience this amid the rigors of their work; the same is true for everyone else.)
The real rock and hard place are these: if the scientific community puts doing good science above its pursuit of monopolizing knowledge itself, it will necessarily have to expose the public to the same kinds of doubts about science that any scientist faces every day; and if it exposes the public to that truth, the public may cease to drink the Kool-Aid.
Jean Baudrillard, representative par excellence of the sort of literary and cultural theory that famed physicist Alan Sokal pilloried in 1996, argued in Simulacra and Simulation (1981) that we've reached a point of references without referents, where simulations become the new reality, taking precedence over any perceived notion of the real.
Now PMB is starting to think that the cultural theorists are having the last laugh...
Tuesday, October 12, 2010
This Is A University
A university is this. You don't have to agree with every aspect of the Wikipedia entry, but it's a solid starting point. A university is an institution of higher education. Universities have professors and students. Research happens at universities because research is part of higher education. That is, research is something fundamental to the process of educating students at universities.
This is not a university. The Max Planck Institutes undoubtedly educate people, but only as a residual function of their primary purpose: to conduct research.
If you've spent time at a major research university, however, you can certainly be pardoned for thinking that there really isn't much of a meaningful difference between a university as you know it and an independent research institution like those of the Max Planck Society. Particularly if you're used to UK and European institutions of higher education, which are almost exclusively research universities, and have never set foot on a liberal arts campus, where the primary aim of the institution is to teach undergraduates, the idea that a university is not primarily a research institution might seem disconcertingly foreign.
In fact, in places like the UK, and with few exceptions, academics, higher education administrators, and politicians are all essentially lying to themselves and others by continuing to think of the UK's leading universities as, well, universities. It's no secret that, but for resource-intensive undergraduate curricula at places like Oxford and Cambridge, teaching is really an afterthought at UK universities. And at the end of the day, when government cuts have to be made, the loudest voices in UK higher education are those who head the top research universities and fear that taking away elite-research money and giving it to more teaching-centered post-92 universities will be the end of UK higher education as we know it. The idea of the university as a research institution first (and a teaching institution second, if at all) rules the day in the UK, and has done so for a while now.
The implications of this attitude--that universities are primarily for research--are startling, especially at a time when budget cuts mean universities have to be even more explicit about what they aim to do and how they aim to do it. What many in UK higher education are proposing, perhaps unwittingly, is the end of the university. In other words, if the most compelling argument for higher education funding is that such funding will produce top-flight, globally competitive research products, then everything non-research about the university, plus every university research pursuit (e.g. in the humanities) whose main purpose is to bolster teaching rather than to stand alone as a marketable research product, will falter. Effectively, what higher education leaders and politicians are currently asking for is funding to become independent research institutes like the Max Planck. That they're using the longstanding legitimacy and social cachet of the idea of the university--a place of learning, hence a place of teaching--as cover for abolishing that very thing, however, is nothing short of appalling to old curmudgeons like PMB.
Many independent research institutions and research-producing corporations do very well, both by themselves and by the broader societies for which they produce research. But the end of the university is not just the end of what has been, for centuries, our primary means of higher-level teaching and instruction, in both vocational and non-vocational terms; it's also the end of the professional study and teaching of whole fields of inquiry and bodies of knowledge. Many fields--literature, philosophy, classics, religion, history, anthropology, law, etc.--don't produce, and don't aim to produce, standalone research products with direct or immediate social impact. A peer-reviewed journal article on a novel about Middle-Eastern trade, read by a small handful of other scholars interested in the given field, will not have much, if any, societal impact. The same article, read by another scholar, taught to a classroom of students, and internalized by a couple of them who later go off into policy work on the Middle East, may confer a real benefit: a deep understanding of that particular history (to give a convoluted example, as they always are). By contrast, if a chemical engineer develops a way of making better televisions, however trivial the benefit of a marginally better television may be, it's nonetheless a direct benefit--a standalone research product that makes an immediate impact. If the playing field is tilted in favor of research, or the standalone research product, as the primary aim of the university, both teaching and the types of research that principally aid teaching will fall by the wayside. The university as a place of learning and instruction, a place with professors and students, will instead become an independent research institution that students will no longer pay to attend, serving only a research elite for very narrow purposes.
Addendum: Almost every day PMB engages in discussions about the relative values of various fields of research. While most are carrying on about how their brilliant research aims to cure cancer or AIDS or Malaria, or produces 'crucial' medical technologies, or even produces better televisions, the idea that the central benefit of some research is actually teaching and educating young students, and preparing them for a range of careers and experiences, simply never registers. At an independent research institute, this wouldn't be a problem. At a so-called university, it's an abomination.
Friday, September 17, 2010
Down With College Football
Now is a special time in America. The UVA and Michigan grads working in finance flock to the bars in their college colors on crisp fall Saturdays, pretending to some solidarity with the young men who do gridiron battle on their college's behalf each week. Perhaps more importantly, the spectators, who, unlike the spectated, were forced more or less to attend class as a precondition of collegiate success, can claim solidarity with one another over the weekly toils of their modern-day Spartans, at once revering and celebrating the godlike players for their athletic prowess and exploiting them for the fanfare they generate. It's college football season, baby!
Emerging from the background on occasion at this time of year are the Andy Katzenmoyer stories: tales of college football gods who were drafted to the NFL only to suffer career-ending injuries, prompting the fabled 'what the fuck now' moments that come when someone who was enrolled in courses such as 'Golf,' 'AIDS Awareness,' and 'Music' at Ohio State University all of a sudden can't rely on his body to earn him a livable income any longer. The sports pundits had a great laugh about Andy Katzenmoyer's course-load (with which he was barely academically eligible to compete in football), just as they did when reporters discovered that University of Georgia head basketball coach Jim Harrick had enrolled his athletes in a for-credit course at UGA called 'Basketball 101,' taught by Harrick himself, whose exams consisted of questions like 'How many points does a 3-point field goal account for in a basketball game?' But this stuff isn't really funny, is it? Should we be laughing at Andy Katzenmoyer, a kid who was told that his only purpose in college was to play football, allowed to slide on everything else, and then had his only waking purpose taken away from him in the blink of an eye? What about these Georgia kids enrolled in Basketball 101? Is this the kind of education they deserve? Is basketball all they're good for? And what about the non-athlete students at the University of Georgia? Is this what their degree is really worth? How many points does a 3-point field goal account for?
'I see here you're a Georgia grad. Go Bulldogs! But sorry, you need an accredited degree to get a job here.'
It's easy for an educated bear to be snarky about these things, just as it's easy for the sports nuts to have a good laugh about them, and then go back to their frantic coverage of bigtime collegiate sports, as if the guy running a personal training studio in god-knows-where Ohio is just a joke or an aside, nothing to do with the industry that made him. But being snarky isn't really the point.
In its present condition, college football is a bad thing.
The idea of collegiate sports, like a range of other extracurriculars that can build skills and character and generally enrich someone's college experience, is a great thing. But the multi-billion-dollar industry that is college football is not an extracurricular, so we should stop pretending that's all it is. While some 'student-athletes' undoubtedly do go to college foremost for an education, and take their *college* responsibilities seriously, it's widely accepted that most bigtime college athletes are on 'scholarship' for football (or basketball) first, and scholarship second. In many cases, these athletes are there for football *only*, and scholarship *never*. Instead of being treated like every other student, they're treated differently: in some ways advantageously, in others disastrously.
Proponents of the current system say bigtime college athletes are given tremendous opportunities that others might not have, like a free college education, for example. They say many of these kids are first-generation college students, and/or come from difficult personal backgrounds. They're usually right. And the idea of giving a disadvantaged, first-generation college student a free education is a fantastic one. Except that this isn't what really happens. There is no education. There is only football.
Proponents of the current system say that far from being exploited, these kids are treated like campus and hometown gods. They live like local celebrities, and in some cases national and international celebrities. They get all the advantages in the world, while Joe Average majoring in math and playing the tuba in the pep band gets nothing of the sort. Again, they're right. Except that the respect and dignity with which these players are treated is wholly contingent upon their athletic success; it rarely encourages strong performance in the classroom; and it rarely lasts beyond college for those who don't go on to play professional sports. Sure, there are success stories, too. The NCAA makes a point in its advertisements to find successful former student-athletes who 'went pro' in something other than sports; but what about the majority at a range of universities in innumerable bigtime sports programs who fail to even graduate?
College football also largely fails to benefit the university that houses the program, and in many cases actually harms it. After all, it's the university that enables the college football industry to have athletes who generate billions of dollars in merchandise, TV-contract, and ticket-sales revenue *work for free*. In fact, it's the governing body of collegiate sports, the NCAA, that specifically places strict limitations on the earning potential of college athletes, making sure they can't legally cash in on their talents and abilities. Sorry, Reggie. And where does all the revenue go, if not to the athletes? Well, in many cases college coaches make more than the president of the university. In others, state-of-the-art spectator stadiums and corporate boxes are installed so that local supporters and alumni can watch the games luxuriously. In others, the athletes themselves are flown from coast to coast for competitions and given professional-caliber training facilities, not so that they can be the best college students they can be, but so they can be the best college football players. Where does the money *not* go? It does *not* go toward hiring top faculty and building better teaching facilities. Nor toward research grants or scholarships for non-football-playing academic 'stars.' Nor toward libraries or campus-wide WiFi or even nicer dormitories. It usually stays in athletic-department coffers to be spent on the primary expenses of the athletic department: the football and basketball teams.
In this scenario, the bigtime college football industry needs the university to furnish it with a default loyal fanbase and a team full of super-talented athletes who generate massive amounts of money *for free*; yet two or three assistant football coaches will undoubtedly make higher salaries than the most accomplished professor of classics or engineering at a given university, and the academic side (ha!) of the university will be looked upon by coaches and athletic department personnel as a mere nuisance that detracts from their mission of providing the wider world with a great football team.
Here's what needs to be done about this mess:
If a college football program surpasses a set revenue limit, it should be forced to choose one of three options. The revenue limit would function like an eligibility clause, analogous to the ones imposed on athletes barring them from accepting compensation for their efforts. Options:
A) Forfeit all profit to the university, whose panel of faculty and administrators will decide how much the football team should get, with the vast majority of revenue going back into the university and earmarked specifically for educational pursuits foremost, and then infrastructural improvements secondarily. Adhere to university demands that student-athletes actually be students first. And let these demands be properly enforced.
B) Scale back the program and its assets such that it does not surpass the set revenue limit in the following year.
C) Break off from the university altogether, at which point the program has to fund itself completely, procure its own facilities, and find its own athletes willing to play either for free or for whatever the program can offer them. Athletes will have to choose whether to remain enrolled at the university and not play for the disaffiliated team or to forfeit their place at the university to remain a member of the team. Those who choose to stay on at the university and leave the team would retain their scholarships.
Then, without the university propping up and legitimizing the industry that hangs upon it like a parasite, we would perhaps see how many athletes are willing to work for free, and how many programs are actually serious about this whole student-athlete thing.
Emerging from the background on occasion at this time of year are the Andy Katzenmoyer stories: tales of college football gods who were drafted to the NFL only to suffer career-ending injuries, prompting the fabled 'what the fuck now' moments that come when someone who was enrolled in courses such as 'Golf,' 'AIDS Awareness,' and 'Music' at Ohio State University all of a sudden can't rely on his body to earn him a livable income any longer. The sports pundits had a great laugh about Andy Katzenmoyer's course-load (which barely kept him academically eligible to compete in football), just as they did when reporters discovered that University of Georgia head basketball coach Jim Harrick had enrolled his athletes in a course for credit at UGA called 'Basketball 101,' taught by Harrick himself, whose exams consisted of questions like 'how many points does a 3-point field goal account for in a basketball game?' But this stuff isn't really funny, is it? Should we be laughing at Andy Katzenmoyer, a kid who was told that his only purpose in college was to play football, allowed to slide on everything else, and then had his only waking purpose taken away from him in the blink of an eye? What about these Georgia kids enrolled in Basketball 101? Is this the kind of education they deserve? Is basketball all they're good for? And what about the non-athlete students at the University of Georgia? Is this what their degree is really worth? How many points does a 3-point field goal account for?
'I see here you're a Georgia grad. Go Bulldogs! But sorry, you need an accredited degree to get a job here.'
It's easy for an educated bear to be snarky about these things, just as it's easy for the sports nuts to have a good laugh about them, and then go back to their frantic coverage of bigtime collegiate sports, as if the guy running a personal training studio in god-knows-where Ohio is just a joke or an aside with nothing to do with the industry that made him. But being snarky isn't really the point.
In its present condition, college football is a bad thing.
The idea of collegiate sports, like a range of other extracurriculars that can build skills and character, and can generally enrich someone's college experience, is a great thing. But the multi-billion dollar industry that is college football is not an extracurricular; so we should stop pretending like that's all it is. While some 'student-athletes' undoubtedly do go to college foremost for an education, and take their *college* responsibilities seriously, it's a widely accepted fact that most bigtime college athletes are on 'scholarship' for football (or basketball) first, and scholarship second. In many cases, these athletes are there for football *only*, and scholarship *never*. Instead of being treated like every other student, they're treated differently, in some ways advantageously, in others disastrously.
Proponents of the current system say bigtime college athletes are given tremendous opportunities that others might not have, like a free college education, for example. They say many of these kids are first-generation college students, and/or come from difficult personal backgrounds. They're usually right. And the idea of giving a disadvantaged, first-generation college student a free education is a fantastic one. Except that this isn't what really happens. There is no education. There is only football.
Proponents of the current system say that far from being exploited, these kids are treated like campus and hometown gods. They live like local celebrities, and in some cases national and international celebrities. They get all the advantages in the world, while Joe Average majoring in math and playing the tuba in the pep band gets nothing of the sort. Again, they're right. Except that the respect and dignity with which these players are treated is wholly contingent upon their athletic success; it rarely encourages strong performance in the classroom; and it rarely lasts beyond college for those who don't go on to play professional sports. Sure, there are success stories, too. The NCAA makes a point in its advertisements of showcasing successful former student-athletes who 'went pro' in something other than sports; but what about the majority at a range of universities in innumerable bigtime sports programs who fail even to graduate?
College football also largely fails to benefit the university that houses the program, and in many cases actually harms the university. After all, it's the university that enables the college football industry to have athletes who generate billions of dollars in merchandise, TV-contract, and ticket-sales revenue *work for free*. In fact, it's the governing body of collegiate sports, the NCAA, that specifically places strict limitations on the earning potential of college athletes, making sure they can't legally cash in on their talents and abilities. Sorry, Reggie. And where does all the revenue go, if not to the athletes? Well, in many cases college coaches make more than the president of the university. In others, state-of-the-art spectator stadiums and corporate boxes are installed so that local supporters and alumni can watch the games luxuriously. In others, the athletes themselves are flown from coast to coast for competitions and given professional-caliber training facilities, not so that they can be the best college students they can be, but so they can be the best college football players. Where does the money *not* go? It does *not* go toward hiring top faculty and building better teaching facilities. Nor toward research grants or scholarships for non-football-playing academic 'stars.' Nor toward libraries or campus-wide WiFi or even nicer dormitories. It usually stays in athletic-department coffers to be spent on the primary expenses of the athletic department: the football and basketball teams.
In this scenario, the bigtime college football industry needs the university to furnish it with a default loyal fanbase and a team full of super-talented athletes who generate massive amounts of money *for free*; yet two or three assistant football coaches will undoubtedly make higher salaries than the most accomplished professor of classics or engineering at a given university, and the academic side (ha!) of the university will be looked upon by coaches and athletic department personnel as a mere nuisance that detracts from their mission of providing the wider world with a great football team.
Here's what needs to be done about this mess:
If a college football program surpasses a set revenue limit, it should be forced to choose among three options. The revenue limit would function like an eligibility clause, a counterpart to the one imposed on athletes requiring that they accept no compensation for their efforts. Options:
A) Forfeit all profit to the university, whose panel of faculty and administrators will decide how much the football team should get, with the vast majority of revenue going back into the university, earmarked for educational pursuits first and infrastructural improvements second. Adhere to university demands that student-athletes actually be students first. And let these demands be properly enforced.
B) Scale back the program and its assets such that it does not surpass the set revenue limit in the following year.
C) Break off from the university altogether, at which point the program has to fund itself completely, procure its own facilities, and find its own athletes willing to play either for free or for whatever the program can offer them. Athletes will have to choose whether to remain enrolled at the university and not play for the disaffiliated team or to forfeit their place at the university to remain a member of the team. Those who choose to stay on at the university and leave the team would retain their scholarships.
Then, without the university propping up and legitimizing the industry that hangs upon it like a parasite, we would perhaps see how many athletes are willing to work for free, and how many programs are actually serious about this whole student-athlete thing.
Tuesday, September 7, 2010
Ailing America: Race, Ethnicity, and American Identity
Philip Roth's The Counterlife features a type of conflict that, however prominent in today's geopolitics, never really takes on in its distilled form the tabloid luster of certain Americans' little wars with Islam in certain parts of Manhattan. One of Roth's characters is an ethnically Jewish, non-religious American dentist from New Jersey who decides at midlife to leave his family and move to a desert settlement in Israel to take up the cause of militant Zionism. The conflict that emerges between Roth's born-again Zionist and his older brother, also American-born and non-religious, and very happy to stay that way, is profound: for the Zionist brother, it is impossible to be an authentic Jew in America, or anywhere else outside of Israel; a diaspora Jew is no Jew at all. For the older brother, the battle for Jewish consciousness and Jewish identity can be just as real in America as it is for the Zionist in war-torn Israel; and the preferred victor in both regards is the tolerant, pluralist, nonviolent American Jew, rather than the Zionist Jew who understands the very core of Jewishness as a bloody struggle against Arabs and other Gentile forces for a specifically Jewish state. At the heart of such a conflict is the question of whether American pluralism can adequately protect historically persecuted groups like the Jews, the question of whether a tolerant and pluralist society is really possible in a balkanized and conflict-ridden world. The question can be put more succinctly: is collective identity possible?
To pretend that America's ongoing conflict with militant Islam--and the ways in which such a conflict seeps into our personal and political dealings with non-militant Islam--has nothing to do with America's geopolitical relationship with Israel, and bears no analogical relationship to militant Zionism, would seem an impossibility. Yet this is what we do, what our politicians do, what our media do, day after day. Make no mistake about it, however: the battle that rages in the Middle East between militant Arabs and militant Jews--a battle whose residue seeps into the personal and political dealings with non-militant Arabs and non-militant Jews in the Middle East--has everything to do with the interest of Islamic militants in the destruction of America, the American wars in the Middle East, and, in no small way, the series of racial and ethnic conflicts tearing at America from all sides. Here in America, as there in the Middle East, it is pointless and outright bigoted to blame the Arabs or the Jews. If you seek a culprit for all of this material and symbolic destruction and misery, that culprit lies somewhere along the fault line of this great conflict between collective and singular identity.
For the most part, Americans may not be throwing stones from tenement buildings at each other's cars, or policing the Mason-Dixon line with loaded assault rifles. There is no missile mounted in Tempe and pre-programmed for Guadalajara, and there isn't likely to be one. But two things inflame this American struggle between collective and singular identity, and have been doing so for a long time now.
From the American left we have a crude, racialized brand of identity politics that virtually ignores ethnic distinctions and places the most prominent American races (white, Hispanic, black, Asian) in opposition to one another (and to the exclusion of all others). It then places 'whites' and 'minorities' in opposition to one another. As a consequence, the meaningful ethnic values and experiences of all Americans are generally subordinated to vague racial categories that are over-vulnerable to crude stereotypical definition and racial in-fighting. While these racial categories afford high levels of solidarity and political agency in some cases, in others they force people to sacrifice important aspects of their ethnic cultural background and upbringing in exchange for political visibility.
Though 'white' political visibility is justifiably less important than minority political visibilities because of the historical, and in many senses enduring, privileges of those Americans constituted as a 'white' majority, 'whiteness' is not immune from identity crisis under such a system. While 'white' identity is largely conceived of as contemporary Anglo-Protestant identity (see the bestselling 'Stuff White People Like,' based on the blog), first-, second- and third-generation 'white' Americans hailing from the massive waves of Irish, Italian, Polish, German, etc. immigrants to America in the 20th century are no more comfortable being thought of as 'the same' than a Korean-American presumed Japanese or an Afro-Caribbean American presumed Sudanese would be. As we've seen recently, a number of fearful, often under-educated 'white' Americans have lashed out against 'Muslims,' 'foreigners,' 'Mexicans,' 'illegal aliens,' 'anchor babies,' etc. in racially charged ways, prompting media commentators to consider the possibility that this tide of fear and aggression has something to do with the election of a 'black' president, a symbol of 'nonwhite' power during a time in which American racial and ethnic demographics are shifting. If there is anything to such a theory, it's quite possible that those 'whites' who are either flatly bigoted or simply ill-equipped to make sense of pluralist values and to understand racial, ethnic, and religious difference are feeling particularly embattled about their constitution as ethnically mislabeled or non-labeled, blank, blanco, and so are adopting a particular (and particularly xenophobic) 'American' identity in opposition to the racialized minority. To put it simply, many 'white' Americans don't know what the hell they are or are supposed to be within this prevailing system of racializing Americans, and so are deciding that to be white is to be American, and to be American is to be white.
Now this twisted indignation wells up in anti-historical, sentimentalist rants about the 'loss of America,' the 'end of our country,' the horror of 'Obama's America,' held in opposition to 'my America,' 'my ['white'] America.' 'I'll take my guns, money, religion, and freedom, and you can keep the 'change.''
Now we begin to understand the challenge to American pluralism launched from the American right: American history is revised such that 'America' has always been a particular thing not entirely different from the Zionist Jewish utopia...only the Protestant Christian version. 'American' values therefore proceed from a narrow set of Anglo-Protestant ethnic values. An 'American' is not a Muslim or a Jew or a homosexual or even a Catholic. 'American' governance is based on the pull-yourself-up-by-the-bootstraps Protestant ethic, hence social welfare programs are 'un-American.' 'American' governments are to be as small as possible, but may become large and sprawling to defend the 'American' interest abroad in military conflict. 'America' is a nation under [Protestant] God; and because God gave 'Americans' animals to eat and oil to burn, concerning oneself with the rights of animals that don't bark or meow, or with the reduction of unclean energy use for the sake of the environment and/or the climate, is also 'un-American.' This is a crude picture, yes. But each of these positions, typically taken by the American right, centers on a distinct sense of 'Americanness,' a very particular understanding of what America is. And PMB didn't paint this picture; he merely copied it.
The historical record of America proves otherwise, however. As PMB has written elsewhere, the success of America stems primarily from its pluralist tradition, or its remarkable history of accepting a plurality of types into the American fold. Certainly such acceptance hasn't come without cost, or gone over without serious periods of difficulty, exploitation, and violence. After Jewish, Italian, and Irish immigrants arrived, before they were all considered 'white,' they had their bloody battles and their ghetto mentalities, but eventually they learned to get along, to think of each other as equally American. After the Alien and Sedition Acts, the abject practice of slavery, Japanese internment, and the McCarthy-era inquisitions, Americans have chosen liberal democracy over theocracy, openness over insularity, pluralism over zealotry. Today, one hopes, Americans will choose mosques in Manhattan over bigotry and fear, and ethnic difference over racial conflict. To do so will be to prolong a great tradition that stands as proof of the ability of collective identity to function with and incorporate a multitude of singular, overlapping, not unchanging or uncomplicated identities shaded and flourishing under its expansive wingspan. Roth's Zionist character ultimately has it wrong: he can't see a path to his ethnic self-realization within a pluralist society, so he chooses an exclusionary society, a society that understands difference as a war imperative rather than an opportunity for strengthening and growth.
Wednesday, September 1, 2010
Lawyers and Carpenters
Camille Paglia had this to say about 'the defining idea of the coming decade' in higher education: re-valorize the trades! PMB hopes that this idea does flourish in the years to come, but for reasons slightly different than Paglia's.
For Paglia and many others, higher education should be devoted to vocational training and preparation first and foremost. They argue that since nowadays jobs are no longer guaranteed (as though they ever were) to newly minted college graduates, since the marketplace is globalized and hypercompetitive, and since the price tag of a college education is becoming almost prohibitively steep, colleges and universities need to rethink the grandiose 'liberal arts' model and get more serious about preparing students with 'job skills' to make them more competitive and employable. This is the kind of reasoning that underlies Paglia's call for the partnering of liberal arts colleges and research universities with vocational-technical institutions; the call, in other words, to re-valorize the trade vocations.
Paglia and others rightly identify a problem--the increasing difficulty college graduates face in finding gainful employment--but seem to miss entirely the causal roots of this predicament. Similarly, while encouraging young people to consider trade vocations is an excellent solution, it's only a partial solution.
The heart of the problem is not that colleges and universities fail to prepare graduates for the job market or 'the real world,' or fail to impart the necessary skills, experiences, and modes of acculturation for young job-seekers. As PMB has written elsewhere, anyone who's ever worked even a highly competitive corporate job understands that actually very few transferable skills (and very basic ones at that) are required to succeed in these kinds of jobs. In terms of 'skills,' the average college graduate is overprepared, not underprepared, for most jobs that appear on the job-seeking radar of a college graduate.
The heart of the problem, rather, is that, much like the housing market, the higher education market is experiencing a bubble that can't be sustained without significant changes to the way business is done in higher education. Much like the way public policy and popular opinion pushed people into buying homes that they couldn't afford by giving the impression that home ownership is a necessity and an unconditional public good, we have too many students pursuing a specific kind of higher education, one-size-fits-all, for which they are underprepared, undermotivated, and in many cases under-competent. The far-and-away most influential reason for this crippling problem is the idea, sold to millions like a laced methamphetamine, that the purpose of a college education is to get you a better job. In a roundabout way, then, it is precisely the vocationalization of higher education--the get-you-a-job focus--that is responsible for the failure of a college education to help graduates secure jobs. And the more we emphasize 'job skills' and vocational aims in higher education, as we do now, the worse the situation will get.
A strong solution to this problem is to use vocational aims to de-emphasize vocational aims. In other words, provide room in higher education for overtly vocational pursuits in order to take unwarranted and counterproductive pressure off of academic disciplines to supplant primary content with nondescript 'job skills training.'
Because deflating the higher education bubble means doing a better job of matching the skills and interests of young people with the appropriate avenues to develop those skills and pursue those interests (as opposed to shuttling everyone into a traditional 4-year college with a primarily academic core curriculum under the impression that no college degree = no fulfilling job), we should, as Paglia lightly suggests, try partnering vocational programs with academic programs at universities. This could give students who have no interest in (or aptitude for) academic pursuits the option to enroll after high school in an apprentice-style vocational trade program (carpentry, plumbing, electrical work, computers, etc.) without wholly abandoning ties to an academic university should the student want to take distributive courses in academic disciplines along the way, or decide later to transfer into academia altogether. In fact, the possibility of a joint program with a core academic curriculum (humanities, civics, finance and economics) and trade certification would be exciting. Likewise, joint programs could provide options for academic-track students to learn trade skills that could end up launching a lucrative and fulfilling career in trade, rather than the kind of generalized 'office job' that millennials seem to be taking and leaving and taking and leaving ad infinitum. Rather than holding the two (academic and trade) paths separate, selling the 'academic' path to a majority middle class as the way to avoid 'undesirable' trade careers, the two general sets of skills and pursuits should live much closer together.
Trade students should have access to the civic benefits of higher education. And the countless graduates of four-year colleges and universities who develop during college no real interest in becoming lawyers, doctors, professors, or entering any other profession for which an academic background is essential shouldn't have to file into nondescript corporate 'white-collar' jobs after graduation, carrying tens of thousands of dollars in education debt, just because these are the only jobs we seem to deem acceptable for college graduates. Such a system would also reduce the absurd pressures to vocationalize foisted upon academic disciplines in the sciences and humanities, for which vocational training is really (and ought to be) secondary to subject matter.
PMB's radical proposal, first wave: rage against the propagandists who suggest that literally everyone belongs in an academic, four-year institution of higher education, a suggestion that implicitly undervalues the trade professions. Take the vocational pressure off of academic disciplines that are not and were never concerned with 'getting you a job' by fighting the political battle within your departments and universities. It won't happen any other way.
PMB's radical proposal, second wave: restructure underperforming and essentially non-competitive universities to include something like a College of Business and Trade, a College of Engineering, and a College of Humanities and Sciences (in many cases all this would mean is integrating the trades). Give the programs more flexibility, in curricula and in tuition fees. Let the admissions standards vary within university colleges. Let there be selective liberal arts colleges structured more or less as they are now (types like Amherst, Bucknell, Colgate, Davidson, Holy Cross, etc.), but partner them with vocational institutions. Do the same with elite universities like Harvard, Stanford, etc. If the best universities aren't broken or bankrupt, they don't need to be fixed; but students of all types could still benefit from having trade ties.
It won't be as simple as it's written here; but it's clear that we need to stop pretending that the unrigorous force-feeding of 'job skills' to students who don't even go to class is all of a sudden going to produce more jobs, or more qualified people to fill them.
For Paglia and many others, higher education should be devoted first and foremost to vocational training and preparation. They argue that since jobs are no longer guaranteed (as though they ever were) to newly minted college graduates, since the marketplace is globalized and hypercompetitive, and since the price tag of a college education is becoming almost prohibitively steep, colleges and universities need to rethink the grandiose 'liberal arts' model and get more serious about equipping students with 'job skills' to make them more competitive and employable. This is the kind of reasoning that underlies Paglia's call to partner liberal arts colleges and research universities with vocational-technical institutions; the call, in other words, to re-valorize the trade vocations.
Paglia and others rightly identify a problem--the increasing difficulty college graduates face in finding gainful employment--but seem to miss entirely the causal roots of this predicament. Similarly, while encouraging young people to consider trade vocations is an excellent solution, it's only a partial solution.
The heart of the problem is not that colleges and universities fail to prepare graduates for the job market or 'the real world,' or fail to impart the necessary skills, experiences, and modes of acculturation for young job-seekers. As PMB has written elsewhere, anyone who's ever worked even a highly competitive corporate job understands that actually very few transferable skills (and very basic ones at that) are required to succeed in these kinds of jobs. In terms of 'skills,' the average college graduate is overprepared, not underprepared, for most jobs that appear on the job-seeking radar of a college graduate.
The heart of the problem, rather, is that, much like the housing market, the higher education market is experiencing a bubble that can't be sustained without significant changes to the way business is done in higher education. Just as public policy and popular opinion pushed people into buying homes they couldn't afford by casting home ownership as a necessity and an unconditional public good, so we now have too many students pursuing one specific, one-size-fits-all kind of higher education for which they are underprepared, undermotivated, and in many cases under-competent. The most influential reason for this crippling problem, by far, is the idea, sold to millions like a laced methamphetamine, that the purpose of a college education is to get you a better job. In a roundabout way, then, it is precisely the vocationalization of higher education--the get-you-a-job focus--that is responsible for the failure of a college education to help graduates secure jobs. And the more we emphasize 'job skills' and vocational aims in higher education, as we do now, the worse the situation will get.
A strong solution to this problem is to use vocational aims to de-emphasize vocational aims. In other words, provide room in higher education for overtly vocational pursuits in order to take unwarranted and counterproductive pressure off of academic disciplines to supplant primary content with nondescript 'job skills training.'
Because deflating the higher education bubble means doing a better job of matching the skills and interests of young people with the appropriate avenues to develop those skills and pursue those interests (as opposed to shuttling everyone into a traditional four-year college with a primarily academic core curriculum under the impression that no college degree = no fulfilling job), we should, as Paglia gently suggests, try partnering vocational programs with academic programs at universities. This could give students who have no interest in (or aptitude for) academic pursuits the option to enroll after high school in an apprentice-style vocational trade program (carpentry, plumbing, electrical work, computers, etc.) without wholly abandoning ties to an academic university, should the student want to take distributive courses in academic disciplines along the way, or decide later to transfer into academia altogether. In fact, the possibility of a joint program with a core academic curriculum (humanities, civics, finance and economics) and trade certification would be exciting. Likewise, joint programs could provide options for academic-track students to learn trade skills that could end up launching a lucrative and fulfilling career in trade, rather than the kind of generalized 'office job' that millennials seem to be taking and leaving and taking and leaving ad infinitum. Rather than holding the two paths (academic and trade) separate, selling the 'academic' path to a middle-class majority as the way to avoid 'undesirable' trade careers, the two general sets of skills and pursuits should live much closer together.
Trade students should have access to the civic benefits of higher education, just as the countless graduates of four-year colleges and universities who develop no real interest during college in becoming lawyers, doctors, professors, or members of any other profession for which an academic background is essential shouldn't have to file into nondescript corporate 'white-collar' jobs after graduation, tens of thousands of dollars deep in education debt, just because these are the only jobs we seem to deem acceptable for college graduates. Such a system would also reduce the absurd pressures to vocationalize foisted upon academic disciplines in the sciences and humanities, for which vocational training is really (and ought to be) secondary to subject matter.
PMB's radical proposal, first wave: rage against the propagandists who suggest that literally everyone belongs in an academic, four-year institution of higher education, a suggestion that implicitly undervalues the trade professions. Take the vocational pressure off of academic disciplines that are not and were never concerned with 'getting you a job' by fighting the political battle within your departments and universities. It won't happen any other way.
PMB's radical proposal, second wave: restructure underperforming and essentially non-competitive universities to include something like a College of Business and Trade, a College of Engineering, and a College of Humanities and Sciences (in many cases all this would mean is integrating the trades). Give the programs more flexibility, in curricula and in tuition fees. Let the admissions standards vary within university colleges. Let there be selective liberal arts colleges structured more or less as they are now (types like Amherst, Bucknell, Colgate, Davidson, Holy Cross, etc.), but partner them with vocational institutions. Do the same with elite universities like Harvard, Stanford, etc. If the best universities aren't broken or bankrupt, they don't need to be fixed; but students of all types could still benefit from having trade ties.
It won't be as simple as it's written here; but it's clear that we need to stop pretending that the unrigorous force-feeding of 'job skills' to students who don't even go to class is all of a sudden going to produce more jobs, or more qualified people to fill them.
Sunday, August 15, 2010
Mosqueing Attitudes
Few times has PMB been so embarrassed for his country as now, in light of the vigorous and misguided opposition to plans to build a mosque and Islamic community center near Ground Zero in New York City.
Those in opposition to the building plans are claiming that a mosque and Islamic center built near Ground Zero would be an affront to the victims of 9-11, a breach of the sanctity of the tragic site, and would signify capitulation to proponents of radical Islam.
Few on either side of the issue would openly admit that they can identify no difference between what is called 'radical Islam,' 'Islamofascism,' 'Al-Qaedaism,' 'Islamic terrorism,' etc. and the peaceful, 'mainstream' practice of Islam in places all over the world, including New York. Most reasonable people would consider the simple equation of the practice of Islam with terrorism a straightforwardly bigoted attitude. Nonetheless, the rapid, unthinking, and direct association of Islam in general, by way of the image of the mosque, with the radical Islamic terrorism of 9-11 underpins the entirety of the position against building a mosque and Islamic community center near Ground Zero. If the simple ontological status 'Muslim' is rendered equivalent to 'terrorism,' our attitude problem at home is as potentially threatening to the American way of life as any danger abroad.
One could argue that even the mere evocation of anything loosely 'Muslim' is offensive at Ground Zero, given what's happened there, even if there is no explicit or admitted equivalency being produced between Muslim Americans in New York and 9-11 terrorists. But as long as we indulge that paranoia, we again threaten to undermine some of the very basic freedoms that make America what it is.
The building of a mosque and Islamic community center near Ground Zero would be the ultimate symbol of American endurance, the ultimate sign of America prevailing over terrorism, and the ultimate slap in the face to radical Islamic terrorist groups who would like nothing more than for Americans to turn against our own pluralist values and become the monster they portray us as.
When terrorists struck down the twin towers on 9-11, they thought they had struck at the symbolic heart of America, the pillars of America as a world financial center. Little did they know that America is best exemplified by its pluralism, tolerance, and polyvocality: by the faces you can still see climbing on and off the trains at the World Trade Center subway stop and walking along Wall St. and Vesey St., where the towers used to be. Erecting a mosque and whatever else serves the community there would demonstrate that America is still the diverse and tolerant community that has made it great. PMB can think of few things that would constitute sweeter poetic justice than the building of a mosque and Islamic community center near Ground Zero.
The ultimate shame, however, is the politicizing of these building plans on a national level by people who have nothing to do with New York, along with the childish fear and unenlightened contempt exhibited by those opposed. To raise a political fuss over the wholly legal and appropriate building of the planned Islamic community facilities in their planned location is to take a hack at the most important pillar Americans ever built: the pluralist tradition.
Thursday, August 5, 2010
Research for Research's Sake?
In a July 13, 2010 Guardian article, Malcolm Grant, provost of University College London, is quoted arguing that imminent government cuts to higher education should be directed at closing down lesser universities altogether, if need be, rather than reducing budgets at elite research universities. Grant's rationale, which in many ways makes sense, is that while top-flight UK research universities make the UK a top competitor in the global research biz, and "research" writ large is a rather important biz in which to be a leader, many of the UK's teaching-oriented, non-research, or semi-research universities underperform in their primary task of teaching, and do little to produce world-leading research. Grant's fear is that by taking research funding and resources away from top research universities, the UK government will render the UK's top research universities less competitive, all while reducing the potential for practically beneficial research. Much of the force of Grant's argument, which is foremost a very public appeal for the people who vote and pay taxes to side with him, UCL, and places like it when the government decides to bring down the funding hatchet, can be summed up in these remarks, quoted from the article, with Grant's quotes quoted within:
"Such a move [to cut research funding] could 'decimate Britain's global competitiveness in research', Grant told the Guardian, arguing that there is a 'direct human benefit' in areas such as cancer and Parkinson's disease from research-intensive universities."
As universities all over the world are facing the possibility, if not the certainty, of budget cuts, whether from government or from their own trustees and administrators, the kind of thinking Grant espouses comes up frequently. Typical are calls to validate everything that universities do according to some immediate demonstration of "direct human benefit," as though the only things that have "direct human benefit" are things that can also be easily quantified on a balance sheet and/or explained concisely and convincingly to a shareholder-type with an attention span rivaling that of a moth. The logical conclusion of such thinking is that the primary purpose of a university--and hence the primary activities for which it is or ought to be funded--is research. "Research." "Research," from the expensive chemical analysis of belly-button lint to the sequencing of the human genome. All that need be uttered to win over the holder of the purse strings in the eyes of so many is "research" (excluding, naturally, nonscientific research--even if you successfully pass this off as "research," they'll eventually catch you at the "payoff" or "takeaway" stages of the "direct human benefit" assessment, when you ramble on for more than thirty seconds about your "research").
Now is a good time to turn to Princeton philosopher Peter Singer, who had this to say in Animal Liberation, his seminal discussion of the inclusion of animals within our ethical spheres (which, naturally, touches on the use of animals for scientific research):
"In addition to the general attitude of speciesism that experimenters share with other citizens, some special factors also help to make possible some of the experiments I have described. Foremost among these is the immense respect that people still have for scientists."
Now PMB should inform you that the types of experiments Singer is writing about here are not only, as one might guess from his broader topic, experiments on animals, but also experiments with humans that demonstrate our general willingness to put "science" and "research" ahead of ethics or rationality. Singer gives as an example of this "immense respect that people still have for scientists" the well-known Milgram obedience study at Yale, in which people who were told to "punish" fellow humans with electric shocks (in the name of "research") would follow the order when it was given by people in white lab coats, continuing to do as the lab-coated "scientist" recommended even as the human writhed and shouted in pain in full view (it should be noted that the person "being shocked" was only acting in response to the "shocker's" administration of "shocks," and wasn't being shocked or tortured for real).
The rational and probably non-sadistic person in the Milgram experiment shocking the hell out of a human subject under the direction of a white-coated scientist is a phenomenal metaphor for much of our decision making in higher education. Whatever you think about Singer's ethical position with respect to animals, his anecdote provides significant insight into the flawed logic and the prejudices that affect so much of this decision making. Research Almighty is almost always granted top priority in the value chain as "directly beneficial" to our lives, despite prominent research findings suggesting, among other things, that by the late 20th century only a small minority (generously 3.5-5%) of improvements in population mortality could be attributed to medical intervention (versus environmental factors) (Singer's book, to some extent outdated, cites Thomas McKeown's The Role of Medicine, 1976, and the J.B. and S.M. McKinlay study "Trends in Death and Disease and the Contribution of Medical Measures," 1988). We might then begin to talk about what kinds of research (or non-research practices) best improve quality of life; but then we're starting to move into the realm of things that, like a discussion of the value of Chaucer, might take up a bit more time than the shareholders have to listen.
None of this is to say that research shouldn't be a priority, or that research is valueless. Of course, anyone living in this world would have to be blind to much of their own reality to assert such a thing. Nonetheless, no research of any kind should ever get the free pass that much of it does, much less be bandied about as a panacea and a monolithic concept all wrapped into one, then splashed over the headlines as an argument to de-fund and de-emphasize teaching in higher education, and in so doing to slash opportunities for a broader range of willing and eager students to gain access to higher education. The simple utterance of the word "research" a few times in an argument is hardly enough to settle debates about the purpose of higher education. Likewise, this word is not enough to cloud the astute reader's rather vivid impression of the innumerable ways and instances, from cancer research with the noblest intentions to animal testing for consumer cosmetics, in which "studies" and "research" produce nothing but a line item on someone's CV, plus negative externalities ranging from large-scale pain, suffering, and death to inconceivable wastes of money.
Monday, June 28, 2010
The World Cup and English Nationalism
PMB loves the English, and is privileged to be perched comfortably in an English dwelling. But he can't help himself at the moment. It's routine to hear English friends and acquaintances smugly denounce Americans: for their football (soccer), their lack of 'culture' (whatever that means), their nationalism, their guns and religion, even their (very un-English) public and political sentimentality. More often than not these denunciations come from people whose impression of the US derives either from the English print media (whose coverage of all things American is as cartoonish and myopic as the American media's coverage of Palestine) or from a view of Times Square on a brief visit to New York (a junket beyond Manhattan would constitute some kind of ethnic-cultural overload for the average Brit). Come World Cup time, however, and especially in matches against Germany, virtually everything the English allege about Americans, and in fact espouse and embody threefold themselves, simmers to the surface. Beyond the English flag epidemic (which Germany hopefully cured yesterday), the patronizing comments by British TV announcers about little 'Africa' defeating the US ('it's men versus boys, and right now the boys are on top'), and the references to non-European countries suddenly 'learning' how to kick European ass all over the pitch this year, CBS News and the LA Times have included some gems that PMB, with much ambivalence, feels the need to highlight:
CBS NEWS:
"In England, they joke about the war, German accents and Hitler.
In Germany, they joke about the fact that the English joke about the war, German accents and Hitler.
The Germans used to get offended. Now they look on in slightly patronizing bemusement as English newspapers trot out ethnic stereotypes about war, Aryan races and bombing, preparing their readers for yet another agony-filled elimination game against their old foe Sunday.
With the German team now being made up of Poles, Turks, a Spaniard, a Ghanaian, a Nigerian and even a Brazilian, it's harder for the English to make fine German-baiting jokes. The Daily Star tried, coming up with a demonizing World War II remembrance headline, "Mixed Master Race," to describe the composition of the German team.
And the Daily Express offered this deep literary analysis: "Our national poet (Shakespeare) wrote 38 plays and 154 sonnets. His German equivalent wrote 'Faust,' a gloomy two-part drama about a man who sells his soul to the devil, and a novel called 'The Sorrows of Young Werther.' . . . The latter sparked a craze of copycat suicides among romantic young men. Generations of pupils forced to study Goethe's work know how they felt."
Here's the real joke: The Germans don't really care."
LA TIMES:
"The sad truth of the matter is that England's players, with few exceptions, are an arrogant, ignorant and unpleasant lot. They are paid far too much by their Premier League clubs, where their true allegiance lies, and their ability individually and collectively in an England shirt does not match their swagger.
It is not too much to say that the worthless and nationalistic English tabloids are reflected in the English team. It's all about drinking, drugs, womanizing, gambling, fast cars and slow minds. Little England written large.
Consider, for just a moment, these sophomoric headlines from the gutter press in the days leading up to Sunday afternoon's match at the Free State Stadium:
"Germans Wurst at Penalties."
"Herr We Go Again."
"Job Done, Now for the Hun."
"Das Boot Is on the Other Foot.""
Sunday, June 20, 2010
Political (Haircut) Reform
People are always talking about how politicians can be so untrustworthy and disconnected from 'everyday' citizens; how bitter partisanship is alienating voters; how something in politics needs to change before we can seriously reinvest ourselves in the notion that these talking heads can really make a difference for the better.
Well, PMB has a simple solution that could dramatically alter the face of politics for the better: political (haircut) reform. The proposed new law, which is straightforward and easy to implement in virtually any country in the world, reads as follows:
"No person shall be eligible for political office who parts his hair on the side and combs it across his forehead."
Examples of current politicians who would be rendered ineligible for political office by virtue of their haircuts are pictured below:
Virginia Governor (R) Bob McDonnell

Senate Minority Leader (R) Mitch McConnell

House Minority Leader (R) John Boehner

Without question, the disqualification of men who look like this from politics would make way for a whole new kind of politician, such as one who hasn't dedicated an entire life to becoming an establishment archetype.
Tuesday, June 8, 2010
"To get the product up..."
It will take engineers--and damn good ones--to put a plug in the gushing oil well off the Gulf Coast that has leaked around 40 million 'barrels' of oil into the ocean, through the wetlands, and onto dry, American land; but it will take people who pay keen attention to language use and linguistic representation to put a plug in the mouths of people who can't seem to understand that the way we talk about things indicates much about how we think about them, why we act (or fail to act) on them, and what those actions will look like.
Consider the ways in which prominent people in politics, industry regulation, and the media have talked about oil-in-quantity throughout the BP disaster:
Former US Environmental Protection Agency administrator and ConocoPhillips board member William K. Reilly, while pointing out the failure of the oil industry to reach the level of technological advancement necessary to prevent offshore and deep-sea drilling disasters, nonetheless can't help marveling at the technology required to drill offshore and 'to get the product up.' For Reilly, a former EPA administrator, even in the context of describing the disaster of the endless spouting of oil from inside the Earth into the ocean, the natural substance--the sticky, black stuff from deep within the Earth that existed before humans walked on two legs--is articulated as a "product." Just like, you know, a pair of shoes or a kind of breakfast cereal.
Sarah Palin, while on the campaign warpath (campaigning for herself, generally speaking, that is) has been a fiery proponent of offshore drilling. Her 'drill, baby, drill' mantra was taken up by more important colleagues like John McCain and Rudy Giuliani; but Palin in particular has phrased this desire strangely, suggesting we ought to drill for all the 'barrels of oil that are warehoused underground.' Palin is evidently so convinced of the idea that this naturally occurring substance called oil is not only inherently and primarily a commodity, but also that oil's commodity nature is best represented by thinking of oil as already prepackaged for sale, barreled-up, and stored neatly in rows in a commercial warehouse four miles beneath the surface of the Earth.
It's not just political shills like Palin, however, who chiefly conceive of oil as a prepackaged commodity. One notices with ease (given the repetition of coverage) that the standard unit for measuring the volume of oil is 'barrels,' such that even oil gushing out of control directly into the ocean (i.e. oil that is currently running wild, about as far from being corralled and commodified as possible) is measured and conceived of as 'barrels' of oil by media left, right, and center (wouldn't it be nice if the BP oil spill were really just 40 million barrels filled securely with oil floating around in the gulf, ripe for the plucking of BP cargo ships with big, barrel-snatching cranes attached to them?). Consider that the extent to which oil has been commodified, its price in barrels manipulated in finance markets far away from wells where the 'product' is extracted from the Earth, has led us to measure its volume not in standard metric or Imperial units, but in packaging units.
The reality, however, is that no matter how intensely and thoroughly we commodify something, be it a human or natural resource, a thing never becomes just a commodity. When we start thinking that we've successfully manipulated and brought under control through commodification virtually everything under the sun, divorcing material reality so far from perception, we play a very dangerous game. The language of those talking about the oil spill is telling, as it suggests that we've become so caught up in the idea of oil as pure commodity, measured in 'barrels,' 'warehoused' underground for the taking and selling, a 'product' to be extracted, that we've lost sight of the fat, loud, sequined material fact that's been right in front of our faces the whole time: the oil floating around in the gulf and washing up on the shores in gelatinous blobs isn't available in barrels, never existed in a warehouse, and has always required tremendous labor and technology, and tremendous risk (including the risk to human life), to transform it into a neatly packaged commodity. When people get up and talk about oil, in the face of this disaster, as though it grows on a barrel tree in a warehouse somewhere, they demonstrate the very lack of care and consideration that produces disasters like the one in which we're currently embroiled. It's not that their metaphorical language should be understood literally, but that they're so deluded by the metaphor that the literal (and its attendant dangers and risks) has long since escaped them.
Yes, at the end of the day, a brilliant engineering team will 'save' the day by figuring out how to stop the gushing oil, and they'll get all the credit in the media and all the funding for future salvation projects, and they'll mostly deserve it. But if the rhetoric of 'drill, baby, drill for those warehoused barrels of oil' persists, this won't be the last Exxon Valdez...ahem...BP oil disaster. Conventional wisdom (it's called 'conventional' in part because it's never particularly good) suggests that 'actions speak louder than words.' Well, you rarely hear of a human action that words didn't have a hand in causing.
Saturday, May 29, 2010
"Real" Doctors?
PMB is aware of a commonly held belief, particularly among Americans, that "real" doctors are medical doctors or physicians, and that it's offensively self-indulgent for PhD-holders to take the title "doctor." Whether this belief stems from the fact that physicians, and not chemical engineers or sociology professors, get to parade around in scrubs and expensive watches on popular TV shows, movies, and even real life, solving (or attempting to solve) a definitive set of explicitly illustrated problems with which we can all identify--or whether it's for some other reason--is not really PMB's concern here. Regardless of the cause, the belief that only physicians or medical doctors have the moral right to call themselves doctors is hogwash, based on ignorance of the history and meaning of the term "doctor" and the curricular differences between those who earn degrees in medicine and those who earn doctorates in academic disciplines. Viz:
The term "doctor" comes from the Latin "docere," "to teach"; a "doctor" is, literally, a "teacher." Such a title has been, first and foremost for over a thousand years, an academic title. Though, coincidentally, the first academic degrees happened to be in professional disciplines, like medicine, law, and theology (theology is arguably no longer a professional discipline in the same way law or medicine are today), the title "doctor" was not bestowed simply on account of one studying medicine (or law, or theology), but because one achieved a certain level of academic distinction in a given discipline. "Doctor," "teacher," is a title of honor and accomplishment given to someone who has become qualified to preside as a teacher in an institute of (higher) education. While a medieval or Renaissance "doctor" was likely to be a lawyer or a physician who also studied literature, philosophy, and the sciences (as "learned" people were "learned" people then, not specialized and divided as we are today after the democratization of education), "doctor" clearly bears no inherent relationship to the study or practice of medicine.
Presumably, after hundreds of years of linguistic slippage, the American "doctor" is more popularly understood as a physician or medical practitioner who has earned a medical doctorate, though this is at least partially the case because in the US, a doctoral degree is required to become a practicing physician. In much of the rest of the world, however, one is trained to become a physician without earning a doctorate. In the UK and much of Europe, for example, a degree in medicine (and in law) can be undertaken as one's undergraduate degree. Only after pursuing a doctoral degree (and usually submitting a thesis or dissertation of original research) does a physician become a "doctor" in the classical sense. The American-style usage of "doctor" as interchangeable with "physician" or "medic" persists in some cases, though it's widely understood that many such physicians actually don't hold doctorates at all.
As far as who deserves to wield the title of "doctor," or who gets to call themselves a "real" doctor, the prejudice against PhD-holders is both ignorant and uncalled-for. Earning a PhD typically requires 2-3 years of coursework, examination, and/or thesis writing, only to be followed by another 3-5 years of independent research (and teaching "on the side" (ha!)) that must culminate in a more or less book-length dissertation that constitutes an original research contribution to a wider field of study. Of course, that research must be defended as the culmination of the doctoral degree, such that examiners are satisfied that the work is both strong and a viable contribution to the field. This is not at all to belittle the medical doctoral curriculum, for which original research is not a requirement but difficult and stressful examinations are; still, one would be extremely hard-pressed to find evidence that earning a PhD is somehow easier or "softer" or less demanding or less "real" than earning a medical doctorate.
PMB is not in the business of telling people what to call themselves. Actually, PMB finds it tasteless to push a title--any title--in many social settings, even if you are a medical doctor. Further, PMB understands if people take exception to the use of titles in any or all situations these days, as there's an argument to be had about whether titles are unavoidably pompous, classist, etc. But the trouble starts for PMB when people unthinkingly assume not just that titles are bad, but that one type of doctor has any greater claim to the title than another, or any greater reason to identify as "doctor" in certain situations than another type. What exactly are the reasons for that differentiation, that imposed hierarchy of doctors, that gives medical doctors the right to identify as "doctor" while PhD-holders are somehow always considered insecure or pompous should they dare to take their rightful and proper title, for one reason or another?
It takes an offensive level of ignorance to think that someone who has gone through the long, grueling, and often thankless process of earning a PhD should be thought of as causing offense for simply entertaining the option of taking on the title "doctor," "teacher," which that accomplishment officially bestows upon them, and has for a millennium. A PhD-holder may not have the payoff at the end of the long road of seeing his profession dramatized amid blood and guts by the likes of George Clooney. In fact, a PhD-holder probably never had the satisfaction of taking a break from studying for exams or working late nights in the lab or brooding over a dissertation chapter to gather with friends and popcorn and have a med-school-class viewing of "Grey's Anatomy," a respite in which to fantasize about the days to come, sure to be filled with sex and heartache and, most importantly, salaries large enough to actually pay off student loans. But at the very least, a PhD-holder has the moral high ground, unequivocally, to call herself "doctor," every bit as much as, if not more than, a medical doctor, without being presumed insecure or pompous. Certainly all kinds of doctors can abuse their title and its attendant status and distinction; but the idea that only a medical doctorate deserves the option of distinction in certain situations is an insult, and should be taken as such. PhD-holders achieve a level of distinction and qualification well beyond that of BA- or MA-holders, for example, if not MD-holders as well. Accordingly, PhD-holders should be expected to censor their accomplishments or to go by Mrs./Ms./Mr. no more than are ordinary holders of medical doctorates. If that's offensive to you, PMB suggests you make an appointment with your local academic historian (who, perhaps, may refer you to a specialist in the history or sociology of academic titles); they will likely have the cure for your ailment, probably for an alarmingly cheap fee.
Thursday, May 20, 2010
More Arrogant: Americans or Scientists?
As PMB collates in his fearsome paper head a few unprompted comments from scientists on science over the past couple of weeks, there can be little doubt that for some people, science is a cult idol. Rarely has a contingent been so arrogant about its primacy and potential.
Perhaps the most empirical observation we can make, present all around us, is that humans are terrible scientists. So easily fooled by optical illusions, differing vantage points, constant misapprehensions, and a sheer inability to apply rationality in our daily decision-making, we employ machines to count, measure, and process all that we cannot. Thank the Science Gods for science, which enables us to render scientific and explain scientifically all of that which we are not and do not understand scientifically. Thank the Science Gods for the Great Intelligent Design of Science, Science being perhaps the greatest artist--and PMB says this without irony--in the history of the world.
Science, like art, is a perfect self-sealing argument: defined scientifically through the scientific process, everything that can be crammed into this artificial framework can be churned out with a scientific explanation. Defined subjectively and contextually, everything that can be framed under the banner of art can be explained artistically. The difference between empirical truth and experienced truth is vast, though the presumed primacy of the former is as much a construct as the latter, and brings us no closer to an objective Truth, despite its claims of 'little objectivities' and its strategic essentialisms. The idea of science as the all-powerful and all-encompassing, general mode of progress and discovery, is crude and shortsighted, and ought not to be tolerated. From a hapless poet, William Blake, "Jerusalem":
They Plow'd in tears, the trumpets sounded before the golden Plow
And the voices of the Living Creatures were heard in the clouds of heaven
Crying; Compell the Reasoner to Demonstrate with unhewn Demonstrations
Let the Indefinite be explored. and let every Man be judged
By his own Works, Let all Indefinites be thrown into Demonstrations
To be pounded to dust & melted in the Furnaces of Affliction:
He who would do good to another, must do it in Minute Particulars
General Good is the plea of the scoundrel hypocrite & flatterer:
For Art & Science cannot exist but in minutely organized Particulars
And not in generalizing Demonstrations of the Rational Power,
The Infinite alone resides in Definite & Determinate Identity
Establishment of Truth depends on destruction of Falshood continually
On Circumcision: not on Virginity, O Reasoners of Albion
Saturday, May 15, 2010
The Case Against Conservatism, Made Simple
There exists a long history of authoritarian disaster on the Left, which is rather easily pointed out: where communism reached its extremes in Leninism and Stalinism, for example, we saw the devastating potential of unchecked tendencies toward authoritarianism by way of a Leftist political agenda. What enabled these despotic regimes was fundamentally *not* any particular Marxist or collectivist ideology, but rather an interpretation of these ideologies that favors autocratic government and fears pluralism. In other words, a truly free and pluralist society has room for ideologues like Lenin and Stalin, but ultimately treads toward its own murderous decline when it finds ways to enshrine absolute power in the hands of ideologues instead of merely tolerating them and their political voices within the stable framework of a pluralist and rights-based democratic society. In this vein, for example, America tolerates the unsightly public demonstrations of neo-Nazis and Ku Klux Klan sheetheads, and does so admirably, its tolerance of even the most abominable speech positions being altogether very different from a (de facto, as it were) ratification of these extreme positions. This is how great pluralist societies work: instead of martyring the fringe, they protect the right of fringe expression and co-opt it into a broader and more sensible political discourse.
That said, here's the beginning of the case against today's American conservatism: whether conservatives like it or not, America is and has always been a pluralist nation par excellence. The greatness of America does not and has never come from any particular 'American' ideology or way of life, but rather through America's impeccable ability to accommodate a daunting range of ethnic, cultural, religious, and political difference under a series of very big tents, doing so by ensuring a set of inalienable rights and shunning authoritarianism. Today's conservatives, however, are committed to a very different narrative of America, a narrative that betrays our characteristic pluralism, our national lifeblood. For conservatives, America's success stems from America as a Christian nation with a free-market economy, a particular set of family values, a way of educating in the great Western European tradition, and particular versions of individualism and self-determinism that sanctify the pursuit of wealth for its own sake. The case against conservatism, however, is not an argument that any of these aspects of the conservative American narrative are wrong, but that the assumed primacy of these aspects of the conservative American narrative is wrong. These aspects--the conservative brand of individualism, the relatively unregulated markets, etc.--are not prime, but derivative, the *results* of a rights-based pluralism that enables some Americans to envision their country in this way, while others can see it differently.
What PMB is getting at, then, is that when conservatives mistake some of the fruits of America's great pluralism and tolerance as the prime American narrative itself--the source of American greatness--they risk establishing a rather narrow, singular version of what America is or ought to be, and consequently they move to defend that singular version at all costs. We see this daily in conservative politics, including, but not limited to, the following examples:
1) The abandonment of the basic right of protection against cruel and unusual punishment in advocating for the torturing of 'enemy combatants,' even when such enemy combatants are American citizens, in order to obtain intelligence (even if this tactic is proven not to work, the point is that the conservative position here is to abandon a fundamental right by making an exception for certain 'special' security cases).
2) The abandonment of basic due process rights (via Miranda) for people suspected of terrorist activity.
3) The threat to remove, through the Department of State and without due process or conviction, the American citizenship of anyone suspected of cooperating loosely with a government-defined and identified terrorist cell.
4) The abandonment of due process rights and the stopping of people on the street in Arizona based on 'reasonable suspicion' of illegal immigration status (appearance).
5) National prayer days.
6) The governments of Texas and Arizona intervening in the substantive material taught in schools by banning ethnic studies curricula (in AZ) and removing Thomas Jefferson, who believed strongly in the separation of church and state, from Enlightenment history curricula (in TX).
7) Assertions that any Americans or American politicians who favor any degree of social welfare provisions are anti-American or 'ruining America.'
8) Assertions that any semblance of government regulation in financial markets, political lobbying, or monopolistic or duopolistic markets is anti-American or 'ruining America.'
9) Assertions that certain Americans living in certain American regions are 'real Americans,' and the 'heartland' is the 'real America.'
In sum, these positions advocate strongly for a singular view of America and Americanness, as opposed to a pluralist America in which ideological difference is not wielded as a threat of exclusion, expulsion, or treasonous hostility.
The case against conservatism, then, is that it's anti-pluralist. The conservative agenda isn't in favor of big or small government, regulated or unregulated markets, personal freedoms or authoritarian measures. It moves back and forth on each of these issues in order to defend, by any ideological means necessary, a singular conservative understanding of what America is and is all about. Even if the professed conservative positions on these issues--small governments, unregulated markets, individual freedom--are ultimately correct, it is incumbent upon all Americans to reject the totality with which these positions are assumed and advanced, for the sake of the true lifeblood of this nation: our pluralism, tolerance, and INALIENABLE rights, all of which can be compromised UNDER NO CIRCUMSTANCES.
Wednesday, May 12, 2010
More on Arizona, Union's New Worst State
Arizona Gov. Jan Brewer and her state legislature may have one-upped Texas governor and Texas secessionist Rick Perry's general puerility by dealing the unfortunate residents of the (not so) Grand Canyon State two consecutive and particularly egregious blows. As if Arizona's new anti-immigration law, which the governor tells us will not amount to racial profiling, weren't enough on its own to reflect the racial and ethnic fears and intolerances of a vociferous bloc of Arizona residents, now comes a new bill prohibiting the teaching of ethnic studies courses in public high schools.
Brewer and state schools chief Tom Horne argue that ethnic studies courses cause racial hatred and resentment, and teach non-white, predominantly Mexican-American students to hate white people. On this basis they will abolish these courses.
PMB doesn't intend for this post to be a thorough discussion of the merits and problems of ethnic studies curricula, as such a discussion would be both longer and more complicated than he has time for at the moment. PMB will say, however, that though ethnic studies courses, taught in certain ways and with certain objectives, can certainly result in race-based resentment, there are two relevant counterpoints:
1) As with teaching virtually any body of knowledge, there are counterproductive ways to proceed and there are productive and laudable ways to proceed. Simply because course material is about a particular ethnic group, its history, literatures, languages, etc., doesn't mean that teaching such material amounts to racial or ethnic favoritism, or an unproductive mode of 'solidarity-promotion,' as the proponents of this law fear. This material can be and in fact is taught responsibly and productively, and has been for a long time. Simply arguing, as the governor and her henchman do, that anything ethnic-studies related is by definition inflammatory because it concerns ethnicity-based knowledge is ludicrous. Not to state the obvious (but sometimes one has to when dealing with ignorant politicians), but the VAST MAJORITY of history, literature, etc., taught in US schools is already refracted through a white, Anglo-European ethnic lens. This makes sense, of course, given the history of the country. But we wouldn't dare argue that the teaching of Shakespeare or Milton or Rousseau over Borges or Garcia Marquez risks rousing a dangerous and disruptive white solidarity. When educating young Mexican-Americans in Arizona, however, certainly allowing them to identify ethnically with some Mexican or Mexican-American writers or histories alongside the rest of the standard Eurocentric curriculum isn't exactly a militant exercise aimed at causing hatred of white people.
2) We must be careful not to confuse transmission of knowledge with advocating ethnic or racial separatism. Tom Horne compares the ethnic studies courses to the Old South; but this is like saying that one who teaches or learns about slavery is also advocating for it. Of course that's ridiculous. Categorizing knowledge along ethnic or cultural or linguistic lines, as we often do (you wouldn't accuse your Spanish teacher of being anti-English-language or anti-American for teaching you Spanish, would you?), reflects a tendency to heuristically separate histories and knowledge fields in a certain way, and not a tendency to ratify the separation of actual people in such a way.
PMB is skeptical, further, that these courses are really having the extreme effects on race relations that proponents of the new bill suggest (this is in part due to a lack of trust in people who invoke slavery and segregation in comparisons with teaching Mexican-American kids about their ethnic heritage, no less in school districts that are about half Mexican-American, demographically).
But the issue to trump all other issues here, PMB thinks, has less to do with concerns about ethnic studies and more to do with the relationship between the state (and the state legislature) and the education of children, even in public schools.
Even in public schools, ignorant politicians have no business censoring school curricula until they've thoroughly understood the curriculum and the materials they aim to censor. And even then, one would have to cross a lot of lines to shut down courses that teach kids about their own cultural heritage. PMB is not convinced, in this case, that any of these people know anything about ethnic studies and/or Mexican-American literatures, histories, or cultures. This is transparently a political intervention into the sphere of education, a sphere not at all immune from both internal and external politics, but vastly more knowledgeable and competent when it comes to sorting out its own political, pedagogical, and curricular conundrums. This situation, in which a state head of schools has apparently forgotten, if he ever knew, the value of a broad education, is like a distorted mirror image of the situation in higher education, in which many of those running the administrative show are demonstrating utter cluelessness about what actually goes on in the classroom, how, and why.
Tuesday, April 27, 2010
Arixenophobia (Boycott Arizona)
Arizona passed a new immigration law, which effectively requires all immigrants, legal or illegal, to carry their papers with them at all times, and mandates that law enforcement officers stop and question anyone for whom they have "reasonable suspicion" of being an illegal "alien."
Republican Senator and former JAG lawyer Lindsey Graham believes the new law is unconstitutional, while former Under Secretary of Homeland Security Michael D. Brown goes off here about how the new law really isn't that big of a deal.
Unfortunately, our former Under Secretary of Homeland Security is functionally illiterate. His misunderstanding of the most controversial aspect of the new Arizona immigration law appears below:
"Consider just one paragraph of the bill:
FOR ANY LAWFUL CONTACT MADE BY A LAW ENFORCEMENT OFFICIAL OR AGENCY OF THIS STATE OR A COUNTY, CITY, TOWN OR OTHER POLITICAL SUBDIVISION OF THIS STATE WHERE REASONABLE SUSPICION EXISTS THAT THE PERSON IS AN ALIEN WHO IS UNLAWFULLY PRESENT IN THE UNITED STATES, A REASONABLE ATTEMPT SHALL BE MADE, WHEN PRACTICABLE, TO DETERMINE THE IMMIGRATION STATUS OF THE PERSON.
Article 8, Section 11-1051, Paragraph B., lines 20-25
Consider that language again: a law enforcement office must first have 'lawful contact.' 'Reasonable suspicion' must exist that the person is here illegally. A 'reasonable attempt' is all the officer can make to determine the immigration status of the person and then, only when 'practicable.'
That's racist, Reverend Sharpton? Sounds like sound law enforcement to me."
PMB will now reinterpret this paragraph correctly (hopefully the Under Secretary is paying attention):
The former Under Secretary treats "lawful contact" and "reasonable suspicion" above as though they're two separate steps in the process of legally stopping and questioning someone, but this is wrong. "Lawful contact" in this case is not a precursor to "reasonable suspicion"; "lawful contact" requires "reasonable suspicion." "Lawful contact" here is basically interchangeable with "reasonable suspicion," though it's presented above by the former Under Secretary as a preexisting condition upon which "reasonable suspicion" can be (more safely) derived. This might seem like nitpicking, but it's actually the crux of the controversy regarding this new law: since there is much concern over what constitutes "reasonable suspicion" and how "reasonable suspicion" would be derived in order to legally stop and question someone on the street about their immigration status, those defending this law have the burden of explaining this mystery of how, practically, "reasonable suspicion" could be lawfully derived. The former Under Secretary attempts to assuage this concern by presenting "reasonable suspicion" as something based not on a guess about the way someone looks on the street, but as a second-order determination after something called "lawful contact" has already been made. Ah, but since "reasonable suspicion" is necessary for "lawful contact" and not the other way around, the former Under Secretary's interpretation is complete and total bullshit. He attempts to explain away the crux of the controversy circuitously, by arguing essentially that "reasonable suspicion will be attained in the context of lawful stops, because a lawful stop requires reasonable suspicion."
The question remains, then: how exactly would a law enforcement officer attain reasonable suspicion? It's easy to argue, as the Republicans are currently doing in their talking points, that the "reasonable suspicion" stops we're talking about here are things like speeding white vans with 16 passengers packed in the back and police cars tailing them (speeding/being in a car chase indeed meets the "reasonable suspicion" standard to make a stop, obviously); but what about the inevitable gray areas? Will the cops ask every white speeder they stop for his or her immigration papers, or will they profile for Mexican-looking people? Will law enforcement officers who see loiterers outside the club or the grocery store approach them for their immigration papers...if they're white? Hispanic? What constitutes reasonable suspicion, not just of a given traffic violation, but of illegal immigration status? And finally, do we expect all law enforcement officers, who can be prosecuted themselves under this law for failing to question the "reasonably suspicious" about their immigration status, to navigate all of these nuances reasonably, and to avoid simply profiling for Mexican-looking people? This is most likely why law enforcement officers themselves have voiced serious opposition to this new law, as they did years before when a similar law was on the table during Janet Napolitano's governorship.
PMB tends to agree with Senator Graham that not only is the new Arizona law unconstitutional, but it's also not a very smart or effective way forward for tackling immigration issues. As current head of Homeland Security Janet Napolitano has suggested, the law will likely result in the redistribution of immigration law enforcement resources away from concentrated areas of need (cartels and human trafficking) and toward the detaining of peaceful and productive people who happen to get stopped without their papers, and constitute a far lesser security risk.
Those in favor of the new Arizona law are calling it a way of fighting drug and human trafficking; yet the negative effects of this law will move far beyond its alleged toughness on real criminal behavior. John McCain talks about illegals bringing drugs across our borders as though Americans aren't the ones demanding and paying for the drugs that keep the cartels and their violent tactics in steady operation. If drug smuggling and all the nastiness that goes with it were really a concern among these politicians, then surely they'd have figured out by now that the best way to crush the cartels is to cut them out of their US market by legalizing and regulating certain drugs on our own soil. But PMB digresses...
The Arizona immigration law is just another example of shortsighted, xenophobic, fear-addled people compromising some of the most basic American values in order to attempt a quick fix. So long as certain Americans believe that Americans are the only ones in the world deserving of the fundamental rights that we daily proclaim our own, we threaten to erode these rights within our own communities. If you believe things like due process and habeas corpus are fundamental, INALIENABLE rights, then you must concede that they are universally INALIENABLE, even, ironically, for those we deem "aliens."
Sunday, April 25, 2010
Bear Arms, To Keep And
PMB notes that the text of the 2nd Amendment to the US Constitution, which prescribes the right of US citizens to own guns, reads as follows:
"A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."
Now that we've seen a dramatic increase in US 'militia groups,' groups of private citizens who tend to stockpile arms and gather to conduct pseudo-military training exercises in preparation to defend themselves or their states against the US Federal government, PMB is compelled to comment briefly on how this phenomenon relates to the 2nd Amendment.
While many gun rights activists argue that the 2nd Amendment right to bear arms is an essential freedom whose primary function is to allow individuals to defend themselves and their families against criminals and allow for the use of guns in sporting or hunting pursuits, another prominent segment of activists views the 2nd Amendment primarily as a means of empowering the people for an uprising against a tyrannical government. While the former view is probably a more tenable justification for the 2nd Amendment in the 21st century, the latter is almost certainly more Constitutionally valid. Considering the language of the amendment together with its historical context--the Revolutionary War against tyrannical Britain, fought and won largely by a militia-style army--Constitutional originalists would have to agree that using arms as an aid to populist uprising against an overreaching government was the main purpose or spirit of the 2nd Amendment. For the rest of us who think that divining a singular and definitive meaning or notion of intent from the text of the Constitution is about as ridiculous as assuming that everyone who's ever read Macbeth ought to have come to the exact same conclusions about its 'meaning' and Shakespeare's 'intent,' one can still assume with some confidence based on the text and the history that, indeed, the clause 'A well regulated Militia, being necessary to the security of a free State' either imposes a condition on 'the right of the people to keep and bear arms' (for the forming of a militia) or it constitutes clear evidence of an 'amplifying example' of the importance of the right to bear arms, the example of a militia being considered extremely important (thus 'amplifying') to those who drafted the amendment. 
This is to suggest that the 2nd Amendment pertains primarily to the formation of militia groups as a check against tyranny not because Constitutional intent is always clear, but rather because a piece of it happens to be rather clear in this particular case. Whether one subscribes to the idea that the 'militia clause' is qualifying or amplifying, the idea of the formation of a militia and its relation to the right to bear arms is central nonetheless.
What does this mean for the 2nd Amendment in the 21at century?
While PMB is not attempting here to argue for the sheer abolition or recusal of the 2nd Amendment, it's time to reconsider the whole militia thing.
As Obama and Medvedev's recent arms reduction treaty reduces the terms of nuclear weapons deployment down to no more than 1550 warheads or 700 launchers, the idea of 'a well regulated militia' with the ability to 'keep and bear arms' becomes perhaps slightly less ridiculous than it already was in the 1990s, when the US and Russia were operating with literally thousands of nuclear warheads. Of course, nuclear warheads really are just the tip of the 21st-century munitions and combat technology iceberg. Without going into anymore embarrassingly obvious details, suffice it to say that no militia acquiring any number of legal arms stands a shred of a chance against the US military. Perhaps the dark underside of this statement is that any fringe group, foreign or domestic, naturally stands its best chance against state military powers by engaging in acts of terrorism. And terrorists like Timothy McVeigh don't need guns to murder 168 people.
The point here is that, putting aside terrorist groups for whom obviously the 2nd Amendment was never meant to be an enabling factor (i.e. people who will engage in lethal combat against innocent, unengaged civilians as opposed to a state military), militias are now obsolete. In fact, as was the case with the Hutaree Militia in Michigan, the FBI is already protecting the nation and its law enforcement officers and government employees by tracking and getting to potentially violent militia groups before they even get the chance to stare down the barrel of a US tank. With the threat of terrorism continually present, whether from persons home or abroad, the only things a bunch of ragtag middle-aged men playing backyard George Washingtons and Paul Reveres are going to harm with their 2nd Amendment rights is innocent people and potentially themselves, and not the Federal government.
If we're going to justify the 2nd Amendment in the 21st century, let us at the very least dispense with the notion that the right to own guns has anything to do with anti-government resistance anymore. Let's stop feeding this potentially dangerous fantasy to burgeoning militia groups all over the country before people start getting killed for all the wrong reasons.
PMB will post again shortly on the Constitution, the early American government, the 'founders,' and the myriad ways in which contemporary anti-government populism evinces a harmful and regrettable ignorance of US history.
"A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."
Now that we've seen a dramatic increase in US 'militia groups'--private citizens who stockpile arms and gather to conduct pseudo-military training exercises in preparation to defend themselves or their states against the US Federal government--PMB is compelled to comment briefly on how this phenomenon relates to the 2nd Amendment.
Many gun rights activists argue that the 2nd Amendment right to bear arms is an essential freedom whose primary function is to let individuals defend themselves and their families against criminals, and to allow the use of guns in sporting or hunting pursuits. Another prominent segment of activists views the 2nd Amendment primarily as a means of empowering the people for an uprising against a tyrannical government. While the former view is probably a more tenable justification for the 2nd Amendment in the 21st century, the latter is almost certainly more Constitutionally valid. Considering the language of the amendment together with its historical context--the Revolutionary War against tyrannical Britain, fought and won largely by a militia-style army--Constitutional originalists would have to agree that arming a populist uprising against an overreaching government was the main purpose, or spirit, of the 2nd Amendment. For the rest of us, who think that divining a singular and definitive meaning or intent from the text of the Constitution is about as ridiculous as assuming that everyone who's ever read Macbeth ought to come to the exact same conclusions about its 'meaning' and Shakespeare's 'intent,' one can still assume with some confidence, based on the text and the history, that the clause 'A well regulated Militia, being necessary to the security of a free State' either imposes a condition on 'the right of the people to keep and bear arms' (the forming of a militia) or constitutes an 'amplifying example' of the importance of that right, the example of a militia being considered extremely important (thus 'amplifying') by those who drafted the amendment.
This is to suggest that the 2nd Amendment pertains primarily to the formation of militia groups as a check against tyranny not because Constitutional intent is always clear, but rather because a piece of it happens to be rather clear in this particular case. Whether one subscribes to the idea that the 'militia clause' is qualifying or amplifying, the idea of the formation of a militia and its relation to the right to bear arms is central nonetheless.
What does this mean for the 2nd Amendment in the 21st century?
While PMB is not attempting here to argue for the outright abolition or repeal of the 2nd Amendment, it's time to reconsider the whole militia thing.
As Obama and Medvedev's recent arms reduction treaty limits nuclear weapons deployment to no more than 1,550 warheads and 700 launchers per side, the idea of 'a well regulated militia' with the ability to 'keep and bear arms' becomes perhaps slightly less ridiculous than it already was in the 1990s, when the US and Russia were operating with literally thousands of nuclear warheads. Of course, nuclear warheads are just the tip of the 21st-century munitions and combat technology iceberg. Without going into any more embarrassingly obvious details, suffice it to say that no militia acquiring any number of legal arms stands a shred of a chance against the US military. Perhaps the dark underside of this statement is that any fringe group, foreign or domestic, naturally stands its best chance against state military powers by engaging in acts of terrorism. And terrorists like Timothy McVeigh don't need guns to murder 168 people.
The point here is that, putting aside terrorist groups for whom the 2nd Amendment was obviously never meant to be an enabling factor (i.e., people who will engage in lethal combat against innocent, unengaged civilians rather than a state military), militias are now obsolete. In fact, as was the case with the Hutaree Militia in Michigan, the FBI is already protecting the nation, its law enforcement officers, and its government employees by tracking and reaching potentially violent militia groups before they even get the chance to stare down the barrel of a US tank. With the threat of terrorism continually present, whether from persons at home or abroad, the only things a bunch of ragtag middle-aged men playing backyard George Washingtons and Paul Reveres are going to harm with their 2nd Amendment rights are innocent people and potentially themselves--not the Federal government.
If we're going to justify the 2nd Amendment in the 21st century, let us at the very least dispense with the notion that the right to own guns has anything to do with anti-government resistance anymore. Let's stop feeding this potentially dangerous fantasy to burgeoning militia groups all over the country before people start getting killed for all the wrong reasons.
PMB will post again shortly on the Constitution, the early American government, the 'founders,' and the myriad ways in which contemporary anti-government populism evinces a harmful and regrettable ignorance of US history.