ICTethics

Surviving and Flourishing in a Digital World

The development of digital Information and Communication Technology (ICT) has drastically changed the organization of our social, economic and political life. Still, people realize that many ethical, social and legal issues remain open.

The ICTethics project zooms in on key concepts such as information, intelligence and privacy, and underlines that our ill-framed understanding of these concepts may be the major stumbling block to a constructive social debate about, and acceptance of, smart tools.

Information changes our lives

The development of digital Information and Communication Technology (ICT) has drastically changed the organization of our social, economic and political life.

“Computing is not about computers anymore.
It’s about living.”
(Nicholas Negroponte)

“The real revolution is not in the machines that calculate data but in the data itself and how we use it.”
(Viktor Mayer-Schönberger and
Kenneth Cukier)

Working with information is seen as

  • the basis of biological and social activities
  • the way to build intelligence
  • the underpinning of smart decision-making and action
  • the basis of successful monitoring of human health and security
  • the driving force of economic and social innovation

Tools become intelligent

We are used to thinking of intelligence as the unique prerogative of humans. Still, in recent decades we have grown increasingly comfortable delegating intelligent action and decision-making to “smart tools” that process information.

Smart tools include

  • mainframe computer systems
  • smart cameras that identify threats and act or suggest actions
  • smart sensors
  • computer power embedded in daily appliances
  • nodes of information exchange that link different smart tools among themselves as well as with available information resources worldwide
  • human-like robots that interact with people in a naturally human way, understand how we feel, share leisure time activities and take over household chores
  • machine-like robots or drones that take over our dirty, dangerous and difficult jobs.

A typical European name for smart environments is Ambient Intelligence (AmI).

“Ambient Intelligence (AmI) is about creating environments that are sensitive and responsive to the presence of people.”
(Emile Aarts and Frits Grotenhuis)

Similar developments are known under a variety of other names, such as ubiquitous computing, pervasive computing and the Internet of Things.

Society is enthusiastic and cautious

Advanced information processing and smart tools have found their way into society. Still, people realize that many ethical, social and legal issues remain open. Most of these issues boil down to the requirement that new technologies must be harmoniously integrated into the social and legal context and should not infringe upon deeply rooted social values.

Strategies used to harmonize technologies and society include:

Technologies adapt to our social values

E.g. When airports started to install body scanners that accurately reveal any weapon or contraband travelers may have hidden under their clothes, privacy activists reacted, claiming that such a security measure is in fact the digital equivalent of a public strip search. To accommodate the social value of privacy, a new version of the body scanner was developed that replaces the human body with a generic dummy, so that no shape of the traveler’s body is shown.

Existing laws adapt to new technologies

E.g. Laws that regulate the confidentiality of mail were not written with e-mail in mind. Legislators and legal specialists must find out to what extent the law needs updating. Stealing and breaking in must be redefined: in a digital world it is possible to steal a document without taking the original away, or to break in without forcing a lock or a door.

Technologies are stopped or forbidden when they cannot adapt

E.g. Some people believe that networks of smart machines will never be able to make completely reliable decisions. They conclude that automatic decision-making should never be allowed when important aspects of human life are involved, or when life itself is at stake. On this view, a military drone should never be allowed to shoot after it has automatically identified a much sought-after terrorist.

E.g. Some people believe that worldwide networks of intelligent tools and information sharing must be stopped altogether, as no legal, technological, psychological, ethical or other fix will ever be able to contain the immense risk of data leaks, which endanger not only privacy but also the security of financial transactions and of every other social infrastructure, including the military.

The list of tensions between new technologies and social structures, perceptions and values shows a wide variety of potential conflicts. As societies and technologies are continuously changing, the list keeps growing, and the important work of resolving those tensions remains an ongoing process.

Patch Ethics and Deeper Questions

The ICTethics project draws on two kinds of input:

  • Patch Ethics.
    Many of the ethical issues and approaches evoked above can be labelled “patch ethics”, as they are concerned with finding out how one or more social practices and tools can or must be adapted to make them match and work together.
  • Deeper Questions.
    The ICTethics project focused on problems with key concepts that are typically used in patch ethics (such as information, intelligence and privacy) and that are often treated as a stable context: the facts about which patch-ethical discussions are held.
    The ICTethics project underlined that these key concepts are themselves part of the problems, and may even be the major stumbling blocks of ethical discussion about smart tools.

Information and Intelligence

Listed Information and Active Information.
In a digital age, copying and sending identical copies of information has become easy. Even machines can endlessly and reliably copy and transmit information without ever being required to understand its meaning. This “listed information” is the power of smart technologies. It is also their weakness. In people’s daily lives, what counts is “active information”. What matters for people is not the blind storage or processing of the symbols or sounds we saw or heard, but understanding and feeling what it is all about and acting accordingly.

Listed Intelligence and Active Intelligence.
Similarly, security agencies and smart tools take “intelligence” to be the accumulation of “listed information” and the generation of even more listed information (by data mining or other methods of mathematical processing of symbolic information). In real life, the “intelligence” that people develop goes beyond listed information and mathematical, rule-based processing.

Research Agenda.
The difference between “listed information” and “active information” (or “listed intelligence” and “active intelligence”) is of crucial importance to understanding how society can live with smart tools. Exploring the impact of this difference, and how listed information can enrich active information, was the leading theme of the ICTethics conference “The Power of Information” (Brussels, January 2013). This line of research is only beginning.


Privacy

There is a wide consensus that “privacy is in disarray” (Daniel Solove), or better: there is no consensus at all about whether privacy must be protected, what exactly must be protected, and how. In all social literature about smart tools, privacy comes out as the number one ethical and legal problem. While legislators continue to develop ever tighter privacy regulations, there is also a widespread feeling that it is better to stop protecting privacy altogether, because honest people have nothing to hide or because in a digital world privacy is a lost cause anyway.

A deeper analysis of privacy shows that the problems with privacy are not generated by our booming information technologies. They date back to the 19th century, when the Western world began its ambiguous attempts to protect “privacy” as a legal right.

What society wanted to protect by law was the possibility for individuals to develop their own lives without physical or social pressure. Much attention was paid to preventing the social pressure of widespread gossip, which could prevent people from deviating from what was seen as politically correct, could result in self-censorship, or could force people to live a hidden life. The fight against the dissemination of information, and the right and duty to hide, was soon seen as the core method of protecting privacy. This approach resulted in a host of attempts to identify which types of information are “personal” and must be hidden, and how. The original inspiration was often lost from sight. Our digital age has inherited the ambiguous approaches of former centuries and has even emphasized the ill-framed focus on the dissemination pattern of listed information.

We cannot ignore the link between information and privacy. It is clear that gossip and information of all kinds can be used to hinder and damage people. It is surely wise to be cautious with many types of information and to implement different levels of data protection. But to tackle the deeper problems of privacy, we will have to return to the original inspiration: protecting individual creativity and the right to experiment and color outside the lines, without being pushed to justify each and every action and exploration.