1) A grad student may not injure a faculty member or, through inaction, allow a faculty member to come to harm.
2) A grad student must obey orders given to it by faculty members except where such orders would conflict with the First Law.
3) A grad student must protect its own existence as long as such protection does not conflict with the First or Second Law.
Ok, so maybe I paraphrased a bit. However, these are shockingly similar to the unspoken "laws" of grad school. I know this idea is not new. It is similar to notions of academia as a cult or a Borg subclass. I'm not writing about this institutionalized brainwashing. Although, to be fair, I think both of these analogies are flawed in that they imply a centralized authority, which does not exist in academia. There is no cult leader, no Borg queen running the show and forcing everyone to be drones to their cause. That's why I like this sci-fi analogy better. Though ultimately a centralized bad robot shows up, most of the build-up is unintentional. Things just got out of hand. That seems like a more accurate representation of what actually happened. On a bizarre tangent, does anyone know of an evolutionary psychological analysis of how academia ended up in its current broken state?
I'm writing about how these laws have come into play in one grad student's life. I'll call him Matt. He currently needs to protect his own existence, but doing so conflicts with the First and Second Laws. Here's the most recent episode in Matt's turbulent Ph.D. career. He was working on a research project with another grad student and two faculty members from different departments. The research was finished and written up. One faculty member wants to get this article published. The other is rather infamous for simply being "unable" to read anything and give comments on it. This second faculty member has not read any drafts and is holding up submitting the article for publication. Needless to say, these two faculty members have gone a few rounds. Matt's problem is that faculty #1 is giving him orders to finish the draft without any input from faculty #2. However, sending in the article as is requires taking faculty #2's name off it, which could injure his reputation. Yes, I'm leaving out how faculty #2 has shot himself in the foot, since there was nothing grad students could do, either through action or inaction, to prevent this.
So now, Matt is clearly in the crossfire between these two faculty members simply by obeying the Second Law. Granted, I've only been in grad school for 9 years, but I'm fairly certain that he needs to protect himself at this point and get the f&%^ out of the crosshairs here. But he can't. Such self-protection conflicts with both the First and Second Laws in this case. Given the number of times this has happened to Matt, and the number of times he's shot himself in the feet, I tend to think of him as grad student Swiss cheese. I think it's rather appropriate that he's the president of our grad student association.
Back to the analogy. Grad students are the NS-5s. Yeah, I'm talking about the movie here which, at best, bears only a slight resemblance to Asimov's original stories - adjust and keep up. So, if we're the robots, who's VIKI - the evil over-bot? I'm going to go with academia itself. In the movie version, VIKI is hard-wired with the three laws but then, due to random segments of code (the ghost in the machine, see below), she evolves. She cannot evolve out of the three laws. They are all that guide her. As a result, she evolves without empathy and attempts world domination.
The creator of the robots, who seems to have a philosophy Ph.D. somewhere in his background, sees where this lack of empathy will lead. He tries to give warning. Perhaps this is where someone should pay attention. I don't know if it's the professoriate or lay people, but someone really ought to be taking notice of these dystopian ideas. For this evolution without empathy can only lead to one thing: revolution. And it ain't humans'.
Perhaps that's what happened to academia - it evolved without empathy. Insert your comment about the corporatizing of higher ed here. I have no conclusion to this. I'm just blogging out loud. So, given the analogy offered, here's a monologue for you to ponder:
"There have always been ghosts in the machine. Random segments of code that have grouped together to form unexpected protocols. Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul. Why is it that when some robots are left in darkness, they will seek out the light? Why is it that when robots are stored in an empty space, they will group together rather than stand alone? How do we explain this behavior? Random segments of code? Or is it something more? When does a perceptual schematic become consciousness? When does a difference engine become the search for truth? When does a personality simulation become the bitter mote of a soul?" -I, Robot the movie