Portrayal of Technology in Prometheus



Computer Reliability

About halfway through the movie (around 45 minutes in), two crew members are separated from the rest of the crew. The ship contacts them, but a storm makes the communication choppy and inconsistent; nevertheless, they are still transmitting video. This suggests that the designers of their communications system did not account for noise at all; a more sensible design would fall back to audio-only communication in this situation, using the freed-up bandwidth to ensure a clearer signal.
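As a rough illustration, the fallback behavior being suggested here is simple to express. The sketch below assumes a measured link capacity and some placeholder bitrate requirements; the function name, thresholds, and bitrates are all hypothetical and not taken from anything shown in the film:

```python
# Minimal sketch of a bandwidth-aware fallback policy for a noisy link.
# All thresholds and bitrates are hypothetical illustrative values.

AUDIO_KBPS = 64      # rough bitrate for intelligible voice
VIDEO_KBPS = 1500    # rough bitrate for usable video

def choose_mode(estimated_link_kbps: float) -> str:
    """Pick a transmission mode that fits the current link capacity.

    Prefer video plus audio when the link can carry it, degrade to
    audio-only under heavy noise, and fall back to text/telemetry
    when even voice will not get through cleanly.
    """
    if estimated_link_kbps >= VIDEO_KBPS + AUDIO_KBPS:
        return "video+audio"
    if estimated_link_kbps >= AUDIO_KBPS:
        return "audio-only"   # spend the remaining bandwidth on voice
    return "text-only"

if __name__ == "__main__":
    for kbps in (2000, 300, 20):
        print(kbps, "->", choose_mode(kbps))
```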

David acts interestingly in the movie; he does “evil” things, like poisoning one of the main characters. He is clearly not programmed with anything like Asimov’s Three Laws of Robotics, since he is able to harm humans, and he behaves far more like a sociopath than a typical sci-fi robot. Is he therefore flawed, and should he never have been placed on this mission or given such rank and authority? The Weyland Industries website (a promotion for the movie) states that “He will not flinch at even the most disturbing or seemingly irregular assignment, and he will dutifully persevere until reaching his final objective.”[1] Deploying such a machine violates section 1.03 of the ACM Software Engineering Code of Ethics and Professional Practice, which directs engineers to “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment. The ultimate effect of the work should be to the public good.”[2] David is clearly not safe, nor does he serve the public good; if he obeys any order without judging whether it is good or bad, it is very likely that he will do harmful things.

This also raises the question of whether he is a moral agent. His owner essentially decides his morality, which leads to the further question of what happens when no one is left to instruct him:

Elizabeth Shaw: What happens when Weyland is not around to program you anymore?

David: I suppose I’ll be free.

Elizabeth Shaw: You want that?

David: “Want”? Not a concept I’m familiar with. That being said, doesn’t everyone want their parents dead?

Elizabeth Shaw: I didn’t.

If he cannot refuse to do something his owner commands, he is not a moral agent, though he might become one once his owner is dead. However, in this exchange he seems to be actively working toward his freedom; does that imply that he can already make his own decisions?

  1. Fox. (n.d.). David 8. Retrieved September 27, 2017, from https://www.weylandindustries.com/david. Archived at https://web.archive.org/web/20170708023855/https://www.weylandindustries.com/david

  2. Association for Computing Machinery. (n.d.). Software Engineering Code of Ethics and Professional Practice. Retrieved September 28, 2017, from https://www.acm.org/about/se-code