Moral Operating System a Must in a Changing Technological World, Asserts Google’s Horowitz

December 18, 2011

In this video, Damon Horowitz, professor of philosophy and cognitive science and Google’s official In-House Philosopher, presents his case for a “moral operating system” to guide individual and collective decisions about how technology is developed and used.

Over the past few decades, technology has advanced to the point where it is possible to gather intensely personal data about private citizens without their knowledge or permission and to use that data to predict individuals’ behavior. Horowitz equates data with power and argues that it is the job of developers to think about how their technology will be used and to weigh the ethical ramifications of that use.

It is imperative, Horowitz maintains, that those at the forefront of technological development examine their consciences and allow their own ethics to inform their work. At present, developers, and society in general, regard the developer’s role as purely technical: development is treated as divorced from morality, and ethical considerations are left to those who use the technology after it is built.

Horowitz contends that it is time for developers to begin operating within a moral framework. Citing Hannah Arendt’s observation that “the sad truth is that most evil done in the world is not done by people who choose to be evil but arises from not thinking,” he urges his audience to take the time to consider the moral implications of their work and to open dialogues with people from other fields in order to develop their own moral operating systems.
