.C. Interactive television at home

This example shows the many actors involved in data collection and profile application, and the difficulty of obtaining transparency and control over the use of one's own personal data. It also demonstrates the fragmentation of our society, or the loss of a “common reality”, as everybody receives different information and enters into his or her own reality.

To provide the best home entertainment experience, interactive television providers monitor and store exactly what is being watched on each television. Using these data, computers can calculate precisely what kind of content has been watched by the largest group of televisions (group profile) and what content has been watched on a particular television (personalised profile). By examining the group profile, broadcasters can stop buying or producing programs that are rarely viewed. These group profiles can be sold to other television broadcasters. The computer will also be able to anticipate which content will be broadcast in the future, and at what price, for each individual television. The broadcaster can also, for instance, use the customer databases of supermarkets. By linking the IP addresses of televisions to the physical addresses of supermarket customers (obtained for the purpose of delivery of goods), supermarkets could buy exclusivity to send commercial messages to particular televisions, or to provide exclusive e-commerce applications in the food sector to particular televisions.
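To make the data flows in this scenario more concrete, the following sketch (written in Python, purely for illustration) separates the aggregate group profile from the personalised profile and shows the linking of television IP addresses to a supermarket customer database. All identifiers, addresses and data structures are hypothetical and do not describe any actual provider's system.

from collections import Counter, defaultdict

# Hypothetical viewing log entries: (television_ip, programme_id) pairs,
# as they might be collected by an interactive television provider.
viewing_log = [
    ("10.0.0.17", "cooking-show"),
    ("10.0.0.17", "evening-news"),
    ("10.0.0.23", "cooking-show"),
    ("10.0.0.42", "quiz-show"),
]

# Group profile: aggregate view of what is watched most across all
# televisions; no individual television needs to be identifiable here.
group_profile = Counter(programme for _, programme in viewing_log)

# Personalised profile: viewing history keyed to a particular television,
# and therefore potentially to an identifiable household (personal data).
personal_profiles = defaultdict(list)
for ip, programme in viewing_log:
    personal_profiles[ip].append(programme)

# Hypothetical supermarket customer database, holding the delivery address
# of a household and the IP address of its television.
supermarket_customers = {
    "10.0.0.17": {"name": "household A", "address": "1 Main Street"},
}

# Linking step described in the scenario: match televisions to physical
# addresses so commercial messages can be targeted at particular sets.
targeted_sets = {
    ip: {"customer": supermarket_customers[ip], "history": history}
    for ip, history in personal_profiles.items()
    if ip in supermarket_customers
}

print(group_profile.most_common(1))   # most-watched content (group profile)
print(targeted_sets["10.0.0.17"])     # profile linked to an identifiable customer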

Watching behaviour is a type of personal data, in so far as it relates to an identifiable person. This means that, in principle, the data subject must be informed about the purposes of the data collection, such as the building and application of profiles (see section .A.3). Data protection law does not apply to the construction of the group profile if it is built upon anonymised data, whereas it does apply to the construction of a personalised profile that can be related to an identifiable person. In the latter case, the recipients or the categories of recipients of the personalised profiles must be communicated to the data subject. As we have seen, such communication of rather abstract categories of recipients does not really help the data subject to control his/her own data. If supermarkets provide the personal data to third parties, the data subject has to be notified of this possible transfer.

Automated decisions in group profiling and personal profiling

Special attention should be paid to article 15 of the directive. This article deals with so-called “automated individual decisions” and is strongly related to profiling.

Article 15(1) states: “every person has the right not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc.”

However, article 15(2) again contains exceptions and states that “a person may nevertheless be subjected to an automated individual decision if that decision is taken [a] in the course of the entering into or performance of a contract, provided the request for the entering into or the performance of the contract, lodged by the data subject, has been satisfied or that there are suitable measures to safeguard his legitimate interests, such as arrangements allowing him to put his point of view, or [b] is authorised by a law which also lays down measures to safeguard the data subject’s legitimate interests.”
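Read together, articles 15(1) and 15(2) thus amount to a set of cumulative conditions with two exceptions. The following sketch (in Python, for illustration only) expresses that structure as a simple decision rule; the field names and the function are hypothetical modelling devices, and whether a concrete decision actually satisfies these conditions remains a matter of legal interpretation, as the discussion of Bygrave below shows.

from dataclasses import dataclass

@dataclass
class Decision:
    # Conditions of article 15(1) (illustrative field names).
    produces_legal_effects: bool
    significantly_affects_subject: bool
    based_solely_on_automated_processing: bool
    evaluates_personal_aspects: bool        # e.g. performance at work, creditworthiness
    # Conditions of the article 15(2) exceptions.
    taken_in_context_of_contract: bool
    request_of_subject_satisfied: bool      # the subject's request has been satisfied
    suitable_safeguards: bool               # e.g. the subject may put his point of view
    authorised_by_law_with_safeguards: bool

def right_not_to_be_subjected(d: Decision) -> bool:
    """Sketch of article 15: does the data subject have a right not to be
    subjected to this decision? Real cases require legal interpretation."""
    within_scope = (
        (d.produces_legal_effects or d.significantly_affects_subject)
        and d.based_solely_on_automated_processing
        and d.evaluates_personal_aspects
    )
    if not within_scope:
        return False
    # Exception [a]: decision taken in the course of entering into or
    # performing a contract, with the request satisfied or safeguards in place.
    if d.taken_in_context_of_contract and (
        d.request_of_subject_satisfied or d.suitable_safeguards
    ):
        return False
    # Exception [b]: decision authorised by a law that lays down safeguards.
    if d.authorised_by_law_with_safeguards:
        return False
    return True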

Contrary to the other provisions of the data protection directive, this article may be read regardless of the question whether or not the application of the profile involves collecting data of an identifiable individual. In fact, according to the Commission, article 15 has been designed to protect the interest of the data subject in participating in the making of decisions which are of importance to him. The use of extensive data profiles of individuals by powerful public and private institutions deprives the individual of the capacity to influence decision-making processes within those institutions, should decisions be taken on the sole basis of his ‘data shadow’. The Commission also stated that “the result produced by the machine, using more and more sophisticated software, and even expert systems, has an apparently objective and incontrovertible character to which a human decision-maker may attach too much weight, thus abdicating his own responsibilities”.

We did not find case law regarding this article. Lee Bygrave analysed the possible impact of this article on automated profiling in “Minding the Machine: Article 15 of the EC Data Protection Directive and Automated Profiling”. According to Bygrave, article 15(1) does not directly prohibit a particular type of decision-making or profile application. Rather, it confers on persons a right to prevent them from being subjected to such decision-making, if their personal data are processed. This would “leave the actual exercise of the right to the discretion of each person and allow, in effect, the targeted decision making to occur in the absence of the right being exercised”. In other words, Bygrave suggests that the data subject involved must actively exercise his right not to be subjected to automated decision-making if his personal data are processed. It is not clear whether a person has the right to nullify a decision a posteriori, for example in the case of automated decisions taken without his prior consent. Article 15 does not seem to require the prior consent of the data subject before he is subjected to automated decision-making.

However, article 15 does not prevent national legislators from implementing it in terms of an explicit prohibition on targeted decision-making. The Belgian Data Protection Act of 1992, revised in 1998, indeed contains in article 12bis an explicit prohibition that goes further than the minimum requirements of the Directive: “A decision resulting in legal effects for a person or affecting him seriously may not be taken purely on the basis of automatic data processing that is destined for the evaluation of certain aspects of his personality. The prohibition laid down in the first section is not applicable if the decision is taken in the context of an agreement or if it has its ground in a provision laid down by or by virtue of a law, decree or ordinance. In such agreement or provision appropriate measures shall be taken for the protection of the legitimate interests of the data subject. At least he shall be allowed to bring up his standpoint in a useful way.”

Furthermore, Bygrave analyses the difficulties that exist in interpreting the provisions laid down in article 15. It is not easy to anticipate what falls within the cumulative conditions of the article: do personalised advertising banners, which automatically adjust their content according to the visitor’s profile, involve an automated decision that significantly affects data subjects? When do decisions produce legal effects? When do decisions “significantly affect” data subjects? In which cases can a decision be said to be based solely on automated data processing?

Privacy and Electronic Communications Directive 2002/58

The Directive (article 13) contains a special provision concerning unsolicited communications to natural persons. It puts an end to the long-lasting controversy regarding direct marketing by explicitly adopting an opt-in system, which inherently implies a prohibition of unsolicited marketing mail or communications. The use of e-communication media such as e-mail and SMS for the purposes of direct marketing is only allowed in respect of subscribers who have given their prior consent (opt-in), except where the electronic contact details were obtained directly from the customer in the context of the sale of a product or service and are used for the marketing of similar products and services, provided that the customer has a clear and easy opportunity to object to such use at the moment of collection and on the occasion of each message (opt-out).
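The opt-in rule and its single “existing customer” exception can be illustrated, in a non-authoritative way, by the following sketch (in Python); the data fields and the function are hypothetical modelling devices, not terms of the directive.

from dataclasses import dataclass

@dataclass
class Subscriber:
    # Illustrative fields modelling the conditions of article 13.
    gave_prior_consent: bool               # opt-in consent for direct marketing
    contact_obtained_during_sale: bool     # details obtained in the context of a sale
    has_objected: bool                     # has used the opt-out opportunity

def direct_marketing_allowed(s: Subscriber, similar_products: bool) -> bool:
    """Rough decision rule for unsolicited e-mail/SMS direct marketing.

    Opt-in is the default; the only exception is the existing-customer case,
    limited to similar products and conditional on an unused opt-out.
    """
    if s.gave_prior_consent:
        return True
    return s.contact_obtained_during_sale and similar_products and not s.has_objected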

Direct marketing itself is not defined in the directive. Article 13 only states that unsolicited communication has to relate to direct marketing: “The use of automated calling systems without human intervention (automatic calling machines), facsimile machines (fax) or electronic mail for the purposes of direct marketing may only be allowed in respect of subscribers who have given their prior consent.”

It is clear, however, that the communications need to have a commercial content in order to fall under the opt-in regulation of directive 2002/58. Recital 40 indicates that direct marketing relates to commercial communications only: “Safeguards should be provided for subscribers against intrusion of their privacy by unsolicited communications for direct marketing purposes in particular by means of automated calling machines, telefaxes, and e-mails, including SMS messages. These forms of unsolicited commercial communications …”

As seen before, it is not clear whether personalised experiences in AmI environments and/or the delivery of personalised AmI goods and services can be considered direct marketing. In the case of AmI experiences and the delivery of goods and services, there is often no “marketing” at all, because the goods and services are in fact supplied without being promoted beforehand.

E-commerce and Consumer Protection Law; Standards and Operability

We refer to sections and .

 
