But how can it not know what it is? –Blade Runner
I think it is pretty dim to be offended by the perceived Iraq analogy in Battlestar. I guess these are the same people who see dhimmitude in the bend of a blade of grass these days.
BSG is so much more. BSG asks the same question that Kōkaku Kidōtai (GitS II: Innosensu) and Blade Runner and countless sci-fi series and novels ask: can a machine be human? Which is really a paraphrase of: what does it mean to be human?
The questions about torture and suicide bombing (is it ever OK to torture? what if it is a machine? is it OK to suicide bomb? what if the explodees are machines, i.e., not human?) are not just about Iraq. How impoverished an imagination must be to see only the 2-D representation.
BSG is a sort of intellectual field lab for asking those questions, which will need to be answered in the next thirty years because of the advent of the Singularity.
But I am also interested in the Friendliness Problem for Strong AI, because if we could solve the problem for AIs, couldn't we solve it for Homo sapiens? Or does it mean that as machines become more human, they become less Friendly?
In Innosensu, when the gynoids kill their masters, it turns out they can violate the prime directive because they have become part human through the Locus Solus process of ghost-dubbing. The Blade Runner skin-jobs can kill their makers because they have become too human, i.e., just like us. But perhaps in the end the skin-jobs can become more than human, as when Roy saved Detective Deckard. Or is that truly human? Is the saving grace of humanity compassion and mercy and love? And will we see the Cylons achieve it too?
I must admit I was surprised the writers of BSG were willing to have suicide bombings, etc., in the series. It's a touchy subject, and most TV writers and producers want only winning, proven, non-controversial formulas.
After the season opener, a friend of mine told me he might stop watching because it was too depressing.