Privacy and Data Protection in the Digital Age

 
Privacy and Data Protection
PBLQ Trainee Programme
 
Bart van der Sloot
Senior Researcher
Tilburg Institute for Law, Technology, and Society (TILT)
Tilburg University, Netherlands
 
Overview
 
(1) Short interactive debate
(2) General Data Protection Regulation
(3) Break
(4) Short interactive debate
(5) Big Data, Open Data and Re-use
(6) Short interactive debate
(7) Liability of internet intermediaries
(8) Wrap-up
 
Bart van der Sloot
 
Specialises in privacy and Big Data, the liability of internet intermediaries, data protection and internet regulation.
Current focal points are the General Data Protection Regulation recently adopted by the European Union, international
data flows, particularly between Europe and the United States, and data breaches.
Studied law and philosophy in the Netherlands and Italy and also successfully completed the Honours Programme of
Radboud University.
Currently works at the Tilburg Institute for Law, Technology, and Society of Tilburg University. Previously worked at the
Institute for Information Law (Instituut voor Informatierecht), University of Amsterdam.
Also worked part-time at the Netherlands Scientific Council for Government Policy (WRR) (part of the Ministry of General
Affairs) on a report on the regulation of Big Data in relation to security and privacy. In that context he was also first
editor of an academic book with guest contributions from leading international scholars and first author of an
international, comparative legal study on the regulation of Big Data.
General editor of the international privacy journal European Data Protection Law Review.
PhD thesis on subjective rights in the Big Data era.
Involved in the Privacy and Identity Lab and coordinator of the Amsterdam Platform for Privacy Research (APPR), and
previously of the Amsterdam Privacy Conference 2012 and the Amsterdam Privacy Conference 2015.
More information at www.bartvandersloot.nl
 
Short interactive debate
 
 
General Data Protection Regulation
 
 
ARTICLE 8 ECHR
 
 
Right to respect for private and family life
1. Everyone has the right to respect for his private and family life, his home and his
correspondence.
2. There shall be no interference by a public authority with the exercise of this right
except such as is in accordance with the law and is necessary in a democratic society
in the interests of national security, public safety or the economic well-being of the
country, for the prevention of disorder or crime, for the protection of health or morals,
or for the protection of the rights and freedoms of others.
 
Charter of Fundamental Rights of the European Union
 
Article 7
Respect for private and family life
Everyone has the right to respect for his or her private and family life, home and communications.

Article 8
Protection of personal data
1. Everyone has the right to the protection of personal data concerning him or her.
2. Such data must be processed fairly for specified purposes and on the basis of the consent of the
person concerned or some other legitimate basis laid down by law. Everyone has the right of access
to data which has been collected concerning him or her, and the right to have it rectified.
3. Compliance with these rules shall be subject to control by an independent authority.
 
Privacy and data protection
 
TREATY ESTABLISHING THE EUROPEAN
COMMUNITY
 
Article 100a
1. By way of derogation from Article 100 and save where otherwise provided in this Treaty, the following provisions shall apply for
the achievement of the objectives set out in Article 7a. The Council shall, acting in accordance with the procedure referred to in
Article 189b and after consulting the Economic and Social Committee, adopt the measures for the approximation of the provisions
laid down by law, regulation or administrative action in Member States which have as their object the establishment and
functioning of the internal market.
2. Paragraph 1 shall not apply to fiscal provisions, to those relating to the free movement of persons nor to those relating to the
rights and interests of employed persons.
3. The Commission, in its proposals envisaged in paragraph 1 concerning health, safety, environmental protection and consumer
protection, will take as a base a high level of protection.
4. If, after the adoption of a harmonization measure by the Council acting by a qualified majority, a Member State deems it
necessary to apply national provisions on grounds of major needs referred to in Article 36, or relating to protection of the
environment or the working environment, it shall notify the Commission of these provisions.
The Commission shall confirm the provisions involved after having verified that they are not a means of arbitrary discrimination or
a disguised restriction on trade between Member States.
By way of derogation from the procedure laid down in Articles 169 and 170, the Commission or any Member State may bring the
matter directly before the Court of Justice if it considers that another Member State is making improper use of the powers
provided for in this Article.
5. The harmonization measures referred to above shall, in appropriate cases, include a safeguard clause authorizing the Member
States to take, for one or more of the non-economic reasons referred to in Article 36, provisional measures subject to a
Community control procedure.
 
Article 16 Treaty on the Functioning of the
European Union
 
1. Everyone has the right to the protection of personal data
concerning them.
2. The European Parliament and the Council, acting in accordance
with the ordinary legislative procedure, shall lay down the rules
relating to the protection of individuals with regard to the processing
of personal data by Union institutions, bodies, offices and agencies,
and by the Member States when carrying out activities which fall
within the scope of Union law, and the rules relating to the free
movement of such data. Compliance with these rules shall be subject
to the control of independent authorities. The rules adopted on the
basis of this Article shall be without prejudice to the specific rules laid
down in Article 39 of the Treaty on European Union.
 
Recent ECJ case law
 
Coty
Digital Rights Ireland
Weltimmo
Tele2
Breyer
Schrems
 
Data Protection Directive
 
No specific duties, but general standards of care
Data collection, use and processing should be
necessary and proportionate, and should have a clear and
legitimate goal
Technical and organisational measures
Personal data should be correct, complete and up to
date
Transparency
 
Data Protection Directive
 
Only three marginal ‘subjective rights’
Right to access
Right to information
Right to rectification if data are not processed according to the
data protection rules.
Right to object
At least in the cases referred to in Article 7 (e) and (f), to object
at any time on compelling legitimate grounds relating to his
particular situation to the processing of data relating to him
Automated individual decisions
which produces legal effects concerning him or significantly
affects him and which is based solely on automated processing
of data intended to evaluate certain personal aspects relating to
him
 
Data Protection Directive
 
Only a marginal role for supervisory authority
Limited possibilities for remedies, liability and sanctions > left to
national Member States
Notification requirement is mostly ignored
Sector-specific codes of conduct are very few and far between
The European assembly of national DPAs (such as the Dutch CBP), the Article 29
Working Party, may only adopt non-binding advisory opinions
 
General Data Protection Regulation
 
Rights
The rights to access, to object and to resist automated profiling have
been elaborated on
 
Data portability
Right to be forgotten
Protection against profiling
 
General Data Protection Regulation
 
 
Duties
All original duties have been retained + elaborated on
Accountability duty
Documentation
Risk assessments
Data protection officer
Privacy by design / by default
Reversal of the burden of proof for consent
Verification duty for consent of children
Data breach notification
 
 
General Data Protection Regulation
 
Enforcement
Harmonization of the rules:
Regulation
Commission
EDPB
Harmonization of enforcement: One stop shop/cooperation DPAs
Elaborated tasks and powers of DPAs
Sanctions and liability widened
 
Article 83 General conditions for imposing
administrative fines
 
5. Infringements of the following provisions shall, in accordance with paragraph 2,
be subject to administrative fines up to 20 000 000 EUR, or in the case of an
undertaking, up to 4 % of the total worldwide annual turnover of the preceding
financial year, whichever is higher: (a) the basic principles for processing,
including conditions for consent, pursuant to Articles 5, 6, 7 and 9; (b) the data
subjects' rights pursuant to Articles 12 to 22; (c) the transfers of personal data to
a recipient in a third country or an international organisation pursuant to Articles
44 to 49; (d) any obligations pursuant to Member State law adopted under
Chapter IX; (e) non-compliance with an order or a temporary or definitive
limitation on processing or the suspension of data flows by the supervisory
authority pursuant to Article 58(2) or failure to provide access in violation of
Article 58(1).
6. Non-compliance with an order by the supervisory authority as
referred to in Article 58(2) shall, in accordance with paragraph 2 of this Article, be
subject to administrative fines up to 20 000 000 EUR, or in the case of an
undertaking, up to 4 % of the total worldwide annual turnover of the preceding
financial year, whichever is higher.
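
To make the ceiling in Article 83(5) concrete, the sketch below (a minimal Python illustration with hypothetical turnover figures, not part of the Regulation itself) computes the 'whichever is higher' cap: the greater of EUR 20 000 000 and 4 % of the total worldwide annual turnover of the preceding financial year.

```python
def art83_5_fine_cap(annual_turnover_eur: float) -> float:
    """Upper bound for an Article 83(5) administrative fine: the higher of
    EUR 20 million and 4 % of the undertaking's total worldwide annual
    turnover of the preceding financial year."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# Hypothetical undertakings: a small firm and a large multinational.
print(art83_5_fine_cap(5_000_000))        # -> 20000000 (the fixed ceiling applies)
print(art83_5_fine_cap(2_000_000_000))    # -> 80000000.0 (4 % of turnover applies)
```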
 
Questions/discussion on the GDPR
 
 
Break
 
 
Short interactive debate
 
 
Big Data, Open Data and Re-use
 
 
Definition and delineation of Big Data
 
The Gartner Report focuses on three matters when describing Big Data: increasing
volume (amount of data), velocity (speed of data processing), and variety (range of
data types and sources). This is also called the 3V model or 3V theory.
Authors have added new V's such as Value (Dijcks, 2012; Dumbill, 2013), Variability
(Hopkins & Evelson, 2011; Tech America Foundation, 2012), Veracity (IBM, 2015) and
Virtual (Zikopoulos et al. 2011; Akerkar et al. 2015).
 
Definition and delineation of Big Data
 
The Article 29 Working Party: 'Big Data is a term which refers to the
enormous increase in access to and automated use of information. It refers
to the gigantic amounts of digital data controlled by companies, authorities
and other large organizations which are subjected to extensive analysis
based on the use of algorithms.
 Big Data may be used to identify general
trends and correlations, but it can also be used such that it affects
individuals directly.’
The European Data Protection Supervisor: 'Big data means large amounts
of different types of data produced at high speed from multiple sources,
whose handling and analysis require new and more powerful processors
and algorithms. Not all of these data are personal, but many players in the
digital economy increasingly rely on the large scale collection of and trade
in personal information. As well as benefits, these growing markets pose
specific risks to individual's rights to privacy and to data protection.’
 
Definition and delineation of Big Data
 
The Estonian DPA describes Big Data as ‘collected and processed open datasets, which are defined by
quantity, plurality of data formats and data origination and processing speed.’
The Luxembourg DPA: 'Big Data stems from the collection of large structured or unstructured
datasets, the possible merger of such datasets as well as the analysis of these data through computer
algorithms. It usually refers to datasets which cannot be stored, managed and analysed with average
technical means due to their size. Personal data can also be a part of Big Data but Big Data usually
extends beyond that, containing aggregated and anonymous data.’
The Dutch DPA: 'Big Data is all about collecting as much information as possible; storing it in ever
larger databases; combining data that is collected for different purposes; and applying algorithms to
find correlations and unexpected new information.’
The Slovenian DPA: ‘Big Data is a broad term for processing of large amounts of different types of
data, including personal data, acquired from multiple sources in various formats. Big Data revolves
around predictive analytics – acquiring new knowledge from large data sets which requires new and
more powerful processing applications.’
The UK DPA: 'repurposing data; using algorithms to find correlations in datasets rather than
constructing traditional queries; and bringing together data from a variety of sources, including
structured and unstructured data.’
The Swedish DPA argues that 'the concept is used for situations where large amounts of data are
gathered in order to be made available for different purposes, not always precisely determined in
advance.’
 
Definition and delineation of Big Data
 
Umbrella term
Open Data: Lots of Big Data initiatives are linked to Open Data. Open Data is the idea,
as the name suggests, that (government) data should be public. Traditionally, it is linked
to the striving for transparency in the public sector and for more control over government
power by the media and/or citizens. In particular, the Estonian DPA is very explicit about
the relationship between Open Data and Big Data: Big Data is defined as 'collected and
processed open datasets, which are defined by quantity, plurality of data formats and data
origination and processing speed'. The desk research also shows a clear link between the two
concepts in some countries, such as Australia, France, Japan and the United Kingdom.
 
Definition and delineation of Big Data
 
Re-Use: Linked to Open Data is the idea of re-use of data. Yet there is one important difference:
while Open Data traditionally concerned the transparency of and control over government power,
the re-use of (government) data is specifically intended to promote the commercial exploitation
of these data by businesses and private parties. The re-use of Public Sector Information is
stimulated through the PSI Directive of the European Union. More generally, re-use refers to
the idea that data can be used for another purpose than the one for which they were originally
collected. The Norwegian DPA, inter alia, has pointed to the relationship between Big Data and the
re-use of data. The Norwegians use the definition of the Article 29 Working Party, 'but also add what in our
opinion is the key aspect of Big Data, namely that it is about the compilation of data from several
different sources. In other words, it is not just the volume in itself that is of interest, but the fact
that secondary value is derived from the data through reuse and analysis.' The desk research also
showed a link between the two concepts. In France, for example, Big Data is primarily seen as a
phenomenon based on the re-use of data for new purposes and on the combination of different
data and datasets.
 Directive 2003/98/EC of the European Parliament and of the Council of 17
November 2003 on the re-use of public sector information. Directive 2013/37/EU of the European
Parliament and the Council of 26 June 2013 amending Directive 2003/98/EC on the re-use of
public sector information.
 
Definition and delineation of Big Data
 
Internet of Things: The term 'Internet of Things' refers to the idea that more and more
things are connected to the Internet. This may include cars, lampposts, refrigerators,
pants, or any other object. This allows for the development of smart devices - for
example, a refrigerator that registers that the milk has run out and automatically
orders more. By providing all objects with a sensor, large quantities of data can be
collected. Therefore, Big Data and the Internet of Things are often mentioned in the
same breath. An example would be the DPA of the United Kingdom noting 'that big data
may involve not only data that has been consciously provided by data
subjects, but also personal data that has been observed (e.g. from Internet
of Things devices), derived from other data or inferred through analytics
and profiling.'
 
Definition and delineation of Big Data
 
Smart: Because of the applications of the Internet of Things and the constantly
communicating devices and computers, the development of smart products and
services has taken off. Examples of such developments are smart cities, smart
devices and smart robots. The desk research indicates that in a number of
countries, such as the United States and the United Kingdom, a link is made
between such developments and Big Data systems. Also, the DPA from
Luxembourg emphasizes the relationship with smart systems, such as smart
metering: 'At a national level, a system of smart metering for electricity and gas
has been launched. The project is however still in a testing phase. - The CNPD has
not issued any decisions, reports or opinions that are directly dealing with Big
Data. The Commission has however issued an opinion in a related matter, namely
with regard to the problematic raised by smart metering.  In 2013, the CNPD
issued an opinion on smart metering. The main argument of the opinion
highlights the necessity to clearly define the purposes of the data processing as
well as the retention periods of the data related to smart metering.’
 
Definition and delineation of Big Data
 
Profiling: 
A term that is often associated with Big Data and is
sometimes included as part of the definition of Big Data is profiling.
Because increasingly large data sets are collected and analysed, the
conclusions and correlations are mostly formulated on a general or
group level. This mainly involves statistical correlations, sometimes of
a predictive nature. Germany is developing new laws on profiling and
a number of DPAs emphasize the relationship of Big Data with
profiling, such as the DPAs of the Netherlands, Slovenia, the UK and
Belgium. The latter argues: ‘The general data protection law applies,
and we expect that de new data protection regulation will be able to
provide a partial answer (profiling) to big data issues (legal
interpretation of the EU legal framework).’
 
Definition and delineation of Big Data
 
Algorithms: A term that recurs in very many definitions of Big Data is
algorithms. This applies to the definition of Working Party 29, the
EDPS and a number of DPAs such as those of Luxembourg, the
Netherlands and the UK. A number of countries also have a special
focus on algorithms. In Australia, a 'Program Protocol' applies to
certain cases - a report may be issued in which the following
elements are contained: a description of the data, a specification of
each matching algorithm, the expected risks and how they will be
addressed, the means for checking the integrity, and the security
measures used.
 
Definition and delineation of Big Data
 
Cloud Computing: Cloud computing is also often associated with Big Data
processes. In particular, in China and Israel, the two terms are often
connected to each other. For example, the Chinese vice-premier stressed
that the government wants to make better use of technologies like Big
Data and cloud computing to support innovation; according to the prime
minister, mobile Internet, cloud computing, Big Data and the Internet of
Things are integrated with production processes, and will thus be an
important engine for economic growth. In Israel, the plan is for the army to
have a cloud in which all data are stored by 2015 - there is even talk of a
"combat computing cloud", a data center that will make available different
tools to forces on the ground. Also, some DPAs suggest a relationship
between cloud computing and Big Data; the Slovenian DPA states, for
example, that 'new concepts and paradigms, such as cloud computing or
big data should not lower or undermine the current levels of data
protection as a fundamental human right.'
 
Use in practice of Big Data
 
In the United States, more than $200 million was reserved for a research and development initiative
for Big Data, to be spent by six federal government departments; the army invested the most in Big
Data projects, namely $250 million; $160 million was invested in a smart cities initiative, investing in
25 collaborations focused on data usage.
In the United Kingdom, £159 million was spent on high-quality computer and network infrastructure,
there are £189 million in investments to support Big Data and to develop the data infrastructure of
the UK, and £10.7 million will be spent on a center for Big Data and space technologies. In addition,
£42 million will be spent on the Alan Turing Institute for the analysis and application of big data,
£50 million for 'The Digital Catapult', where researchers and industry are brought together to come
up with innovative products, and lastly, the Minister of Universities and Science announced in
February 2014 a new investment of £73 million in Big Data. This is used for bioinformatics, open
data projects, research and the use of environmental data.
In South Africa, the government has invested 2 billion South African Rand, approximately €126.8
million, in the Square Kilometre Array (SKA) project, a project which revolves around very large data
sets.
In France, seven research projects related to Big Data were given €11.5 million.
In Germany, the Ministry of Education and Research invested €10 million in Big Data research
institutes and €20 million in Big Data research; this ministry will also invest approximately €6.4
million in the project Abida, a four-year interdisciplinary research project on the social and economic
effects of large data sets.
 
Use in practice of Big Data
 
What are the areas in which Big Data is (presumably) used?
Internet companies: advertisements
Health care sector: total genome analysis
Tax authorities: risk profiles
Police: predictive policing
Intelligence services: terror prevention
 
Use in practice of Big Data
 
Primarily in the private sector, to a lesser extent in the public sector, where use is especially security-related
The Hungarian DPA, for example, emphasizes that ‘in Hungarian business sphere more and more enterprises
such as banks, supermarkets, media and telecommunication companies use and take advantage of the
possibilities in Big Data.’
The DPA from Luxembourg holds: ‘To our knowledge there are no prominent examples of the use of Big Data
in the law enforcement sector or by police or intelligence services in Luxembourg. There are however other
actors which deal with Big Data.’
The Norwegian DPA argues along the same lines: 'There are, as far as we know, no usage of big data within
the law enforcement sector in Norway. In 2014, the intelligence service addressed in a public speech the
need to use big data techniques in order to combat terrorism more efficiently. However, politicians across all
parties reacted very negatively to this request and no formal request to use such techniques has since been
launched by the intelligence service. The companies that are most advanced when it comes to using big data
may be found within the telecom (eg. Telenor) and media (eg. Schibsted and Cxence) sector. The tax and
customs authorities have also initiated projects in which they look at how big data can be used to enhance
the efficiency of their work.’ The Norwegian DPA continues: ‘At the Norwegian DPA we are currently looking
into how it affects our privacy when personal data is more and more turning into an valuable commodity in
all sectors of the economy. We are writing a report on how big data is used within the advertising industry,
and how the use of automated, personalised marketing triggers an enourmous appetite for and exchange of
personal data.’
 
Use in practice of Big Data
 
The Slovenian DPA emphasizes: ‘We have thus far not seen prominent examples of the use of Big Data in our
country. To our knowledge Big Data applications are particularly of interest in insurance, banking and
electronic communications sector, mostly to battle fraud and other illegal practices. Another important field
is scientific and statistical research. Law enforcement use is to our knowledge currently at development
stages (e.g. in the case of processing Passenger Name Records), whereas information about the use of Big
Data at intelligence services is either not available or of confidential nature.’
The Swedish DPA holds: ‘We have not carried out any specific supervision related to the concept Big Data
and do not have any statistics or specific information on how this is used.  In our opinion, the law
enforcement sector does not use Big Data. Their personal data processing is strictly regulated in terms of
collection of data, limited purposes etc.’
Finally, the DPA from the United Kingdom states: ‘We have not carried out a comprehensive market
assessment of big data but, from our contacts with business and our desk research, our impression is that
the take up of big data is still at a relatively early stage in the UK.  Nevertheless, we know that companies are
actively investigating the potential of big data, and there are some examples of big data in practice, such as
the use of telematics in motor insurance, the use of mobile phone location data for market research, and the
availability of data from the Twitter ‘firehose’ for analytics. We do not have any specific information on the
use of big data in law enforcement or security. The UK Data Protection Act includes a wide-ranging
exemption from the data protection principles where it is required for safeguarding national security.’
 
Social and ethical dangers of Big Data
 
Power imbalance & Matthew effect: Individuals, as a general rule, have limited power to influence how
large corporations behave. Extensive use of Big Data analytics may increase the imbalance between large
corporations on the one hand and the consumers on the other. It is the companies that collect personal data
that extract the ever-growing value inherent in the analysis and processing of such information, and not the
individuals who submit the information. Rather, the transaction may be to the consumer's disadvantage in
the sense that it can expose them to potential future vulnerabilities (for example, with regard to
employment opportunities, bank loans, or health insurance options).
Data determinism and discrimination:
 The “Big data-mindset” is based on the assumption that the more
data you collect and have access to, the better, more reasoned and accurate decisions you will be able to
make. But collection of more data may not necessarily entail more knowledge. More data may also result in
more confusion and more false positives. Extensive use of automated decisions and prediction analyses may
have adverse consequences for individuals. Algorithms are not neutral, but reflect choices, among others,
about data, connections, inferences, interpretations, and thresholds for inclusion that advance a specific
purpose. Big Data may hence consolidate existing prejudices and stereotyping, as well as reinforce social
exclusion and stratification. Use of correlation analysis may also yield completely incorrect results for
individuals. Correlation is often mistaken for causality. If the analyses show that individuals who like X have
an eighty per cent probability rating of being exposed to Y, it is impossible to conclude that this will occur in
100 per cent of the cases. Thus, discrimination on the basis of statistical analysis may become a privacy
issue. A development where more and more decisions in society are based on use of algorithms may result
in a ”Dictatorship of Data”, where we are no longer judged on the basis of our actual actions, but on the
basis of what the data indicate will be our probable actions.
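
To illustrate the false-positive point above with some hedged arithmetic (the numbers are hypothetical, not taken from any DPA source): even a very accurate predictive profile flags mostly the wrong people when the behaviour it looks for is rare. A minimal Bayes'-theorem sketch in Python:

```python
def positive_predictive_value(base_rate: float, sensitivity: float, specificity: float) -> float:
    """Share of flagged individuals who truly belong to the target group (Bayes' theorem)."""
    true_positives = base_rate * sensitivity
    false_positives = (1 - base_rate) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Hypothetical profile: 99 % sensitivity and 99 % specificity,
# but only 1 in 10,000 people actually shows the target behaviour.
print(positive_predictive_value(0.0001, 0.99, 0.99))  # ~0.0098: roughly 99 % of flags are false positives
```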
 
Social and ethical dangers of Big Data
 
The Chilling effect: 
If there is a development where credit scores and insurance premiums are based
solely or primarily on the information we leave behind in various contexts on the Internet and in
other arenas in our daily life, this may be of consequence for the protection of privacy and how we
behave. In ten years, our children may not be able to obtain insurance coverage because we
disclosed in a social network that we are predisposed to a genetic disorder, for example. This may
result in us exercising restraint when we participate in society at large, or in us actively adapting our
behaviour – both online and elsewhere. We may fear that the tracks we leave behind in various
contexts may have an impact on future decisions, such as the possibility of finding work, obtaining
loans, insurance, etc. It may even deter users from seeking out alternative points of view online for
fear of being identified, profiled or discovered. With regard to the authorities' use of Big Data,
uncertainty concerning which data sources are used for collecting information and how they are
utilised may threaten our confidence in the authorities. This in turn may have a negative impact on
the very foundation for an open and healthy democracy. Poor protection of our privacy may weaken
democracy as citizens limit their participation in open exchanges of viewpoints. In a worst case
scenario, extensive use of Big Data may have a chilling effect on freedom of expression if the
premises for such use are not revealed and cannot be independently verified.
Echo chambers: 
Personalisation of the web, with customised media and news services based on the
individual's web behaviour, will also have an impact on the framework conditions for public debates
and exchanges of ideas – important premises for a healthy democracy. This is not primarily a privacy
challenge, but constitutes a challenge for society at large. The danger associated with so-called ”echo
chambers” or ”filter bubbles” is that the population will only be exposed to content which confirms
their own attitudes and values. The exchange of ideas and viewpoints may be curbed when
individuals are more rarely exposed to viewpoints different from their own.
Transparency paradox: 
The citizen is becoming more and more transparent to the government, while
the government is becoming more and more opaque to the citizen.
 
Juridical challenges of Big Data: Purpose
 
Article 6 Data Protection Directive
1. Member States shall provide that personal data must be:
(a) processed fairly and lawfully;
 
Juridical challenges of Big Data: Purpose
 
Article 7
Member States shall provide that personal data may be processed only if:
(a) the data subject has unambiguously given his consent; or
(b) processing is necessary for the performance of a contract to which the data subject is party or in
order to take steps at the request of the data subject prior to entering into a contract; or
(c) processing is necessary for compliance with a legal obligation to which the controller is subject;
or
(d) processing is necessary in order to protect the vital interests of the data subject; or
(e) processing is necessary for the performance of a task carried out in the public interest or in the
exercise of official authority vested in the controller or in a third party to whom the data are
disclosed; or
(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or
by the third party or parties to whom the data are disclosed, except where such interests are
overridden by the interests for fundamental rights and freedoms of the data subject which require
protection under Article 1 (1).
 
Juridical challenges of Big Data: Purpose
 
Article 8
The processing of special categories of data
1. Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or
philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.
2. Paragraph 1 shall not apply where:
(a) the data subject has given his explicit consent to the processing of those data, except where the laws of the Member State provide
that the prohibition referred to in paragraph 1 may not be lifted by the data subject's giving his consent; or
(b) processing is necessary for the purposes of carrying out the obligations and specific rights of the controller in the field of
employment law in so far as it is authorized by national law providing for adequate safeguards; or
(c) processing is necessary to protect the vital interests of the data subject or of another person where the data subject is physically
or legally incapable of giving his consent; or
(d) processing is carried out in the course of its legitimate activities with appropriate guarantees by a foundation, association or any
other non-profit-seeking body with a political, philosophical, religious or trade-union aim and on condition that the processing relates
solely to the members of the body or to persons who have regular contact with it in connection with its purposes and that the data
are not disclosed to a third party without the consent of the data subjects; or
(e) the processing relates to data which are manifestly made public by the data subject or is necessary for the establishment, exercise
or defence of legal claims.
 
Juridical challenges of Big Data: Purpose limitation
 
Article 6
1. Member States shall provide that personal data must be:
(b) collected for specified, explicit and legitimate purposes and not
further processed in a way incompatible with those purposes. Further
processing of data for historical, statistical or scientific purposes shall
not be considered as incompatible provided that Member States
provide appropriate safeguards;
 
Juridical challenges of Big Data: Purpose limitation
 
For example, the DPA of Luxembourg emphasises: ‘From a data protection point of view it can
raise many concerns, when it contains personal data, such as the respect of data subjects’
rights - for example in the context of data mining - and their ability to exercise control over the
personal data or the respect fundamental principles of data protection such as that of data
minimization or purpose limitation.’
The definition of Big Data of the Dutch DPA contains, among other elements, ‘combining data
that is collected for different purposes’ and it also holds: ‘Our key concern is that data
protection should be about surprise minimisation, while big data entails the risk of surprise
maximation. There is a real risk that those who are involved in the development and use of Big
Data are ignoring the basic principles of purpose limitation, data minimisation and
transparency. And an additional frightening fact is that the statistical information, even if the
data used is properly anonymised, can lead to such precise results that it essentially
constitutes re-identification.’
The Norwegian DPA states: ‘In other words, it is not just the volume in itself that is of interest,
but the fact that secondary value is derived from the data through reuse and analysis. This
aspect of Big Data, and the consequences it has, is in our opinion the most challenging aspect
from a privacy perspective.’
Finally, the Swedish DPA states about Big Data: ‘As we see it, the concept is used for situations
where large amounts of data are gathered in order to be made available for different
purposes, not always precisely determined in advance.’
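
The Dutch DPA's warning above, that even properly anonymised statistics can essentially amount to re-identification, can be made tangible with a small k-anonymity check (a minimal sketch with hypothetical records; the attribute names are invented for illustration): if the smallest group sharing the same quasi-identifiers has size 1, at least one person remains unique in the 'anonymised' data.

```python
from collections import Counter

def k_anonymity(records: list[dict], quasi_identifiers: list[str]) -> int:
    """Smallest group size over the given quasi-identifier columns;
    k = 1 means at least one record is unique and hence re-identifiable."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical 'anonymised' dataset: names removed, but postcode, birth year
# and gender together still single out the second record.
records = [
    {"postcode": "5037", "birth_year": 1980, "gender": "F", "diagnosis": "A"},
    {"postcode": "5038", "birth_year": 1975, "gender": "M", "diagnosis": "B"},
    {"postcode": "5037", "birth_year": 1980, "gender": "F", "diagnosis": "C"},
]
print(k_anonymity(records, ["postcode", "birth_year", "gender"]))  # 1 -> re-identifiable
```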
 
Juridical challenges of Big Data: Data minimization
 
Article 6
1. Member States shall provide that personal data must be:
(b) collected for specified, explicit and legitimate purposes and not further
processed in a way incompatible with those purposes. Further processing of data
for historical, statistical or scientific purposes shall not be considered as
incompatible provided that Member States provide appropriate safeguards;
(c) adequate, relevant and not excessive in relation to the purposes for which
they are collected and/or further processed;
(e) kept in a form which permits identification of data subjects for no longer than
is necessary for the purposes for which the data were collected or for which they
are further processed. Member States shall lay down appropriate safeguards for
personal data stored for longer periods for historical, statistical or scientific use.
 
Juridical challenges of Big Data: Data minimization
 
Almost all DPAs mention this principle when it comes to the dangers
of Big Data. The DPA from Luxembourg, inter alia, refers to a decision
in which it stressed the importance of a retention period for data
storage. The Dutch DPA summarizes the tension between Big Data
and data minimization in very clear words: ‘Big Data is all about
collecting as much information as possible’.
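
One way to read the data minimisation principle operationally is as a purpose-bound filter at collection time. The sketch below is a minimal illustration (the field names and purposes are hypothetical), not a statement of how any controller actually implements Article 6(1)(c).

```python
# Hypothetical mapping from processing purpose to the fields that are
# adequate, relevant and not excessive for that purpose.
PURPOSE_FIELDS = {
    "newsletter": {"email"},
    "delivery": {"name", "address", "postcode"},
}

def minimise(record: dict, purpose: str) -> dict:
    """Drop every attribute that is not necessary for the stated purpose."""
    allowed = PURPOSE_FIELDS[purpose]
    return {key: value for key, value in record.items() if key in allowed}

raw = {"name": "Jane Doe", "email": "jane@example.com",
       "address": "Main St 1", "postcode": "5037", "birth_year": 1980}
print(minimise(raw, "newsletter"))  # {'email': 'jane@example.com'}
```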
 
Juridical challenges of Big Data: Technical and organizational measures
 
Article 16 - Confidentiality of processing
 
Any person acting under the authority of the controller or of the processor, including the processor himself, who has access to personal data must not process them except on
instructions from the controller, unless he is required to do so by law.
 
Article 17 - Security of processing
 
1. Member States shall provide that the controller must implement appropriate technical and organizational measures to protect personal data against accidental or unlawful
destruction or accidental loss, alteration, unauthorized disclosure or access, in particular where the processing involves the transmission of data over a network, and against all other
unlawful forms of processing. Having regard to the state of the art and the cost of their implementation, such measures shall ensure a level of security appropriate to the risks
represented by the processing and the nature of the data to be protected.
2. The Member States shall provide that the controller must, where processing is carried out on his behalf, choose a processor providing sufficient guarantees in respect of the
technical security measures and organizational measures governing the processing to be carried out, and must ensure compliance with those measures.
3. The carrying out of processing by way of a processor must be governed by a contract or legal act binding the processor to the controller and stipulating in particular that:
- the processor shall act only on instructions from the controller,
- the obligations set out in paragraph 1, as defined by the law of the Member State in which the processor is established, shall also be incumbent on the processor.
4. For the purposes of keeping proof, the parts of the contract or the legal act relating to data protection and the requirements relating to the measures referred to in paragraph 1
shall be in writing or in another equivalent form.
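
As one concrete example of a 'technical measure' in the sense of Article 17, the sketch below encrypts a personal-data record at rest; it assumes the third-party Python `cryptography` package and is only an illustration, not a prescribed or sufficient security measure.

```python
from cryptography.fernet import Fernet  # third-party package: cryptography

key = Fernet.generate_key()              # in practice kept in a key management system
cipher = Fernet(key)

record = "Jane Doe, born 1980, postcode 5037".encode("utf-8")
token = cipher.encrypt(record)           # ciphertext that would be written to disk
print(cipher.decrypt(token).decode("utf-8"))  # readable only by holders of the key
```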
 
Juridical challenges of Big Data: Technical and organizational measures
 
Many DPAs also mention this principle when discussing the dangers
of Big Data; this holds especially true for countries and DPAs that
establish a link between Big Data and Open Data. The Slovenian DPA
stresses about this particular tension: ‘The principles of personal data
accuracy and personal data being kept up to date may also be under
pressure in Big Data processing. Data may be processed by several
entities and merged from different sources without proper
transparency and legal ground. Processing vast quantities of personal
data also brings along higher data security concerns and calls for strict
and effective technical and organisational data security measures.’
 
Juridical challenges of Big Data: 
Data quality
 
Article 6
1. Member States shall provide that personal data must be:
(d) accurate and, where necessary, kept up to date; every reasonable
step must be taken to ensure that data which are inaccurate or
incomplete, having regard to the purposes for which they were
collected or for which they are further processed, are erased or
rectified;
 
Juridical challenges of Big Data: 
Data quality
 
Often, Big Data applications do not revolve around individual profiles, but around
group profiles, not around retrospective analyses, but around probability and
predictive applications with a certain margin of error. Moreover, it is supposedly
becoming less and less important for data processors to work with correct and
accurate data about specific individuals, as long as a large percentage of the data
on which the analysis is based provides a generally correct picture. Quantity over
quality of data, so the saying goes, as more and more organizations are
accustomed to working with “dirty data”. In the public sector too, it seems that
working with contaminated data or unreliable sources is becoming less
uncommon. Reference can be made to the use by government agencies of open
sources on the internet, inter alia, Facebook, websites and discussion forums. The
Dutch DPA, for example, indicates: ‘There has been a lot of media attention for
big data use by the Tax administration (scraping websites such as Marktplaats [an
e-bay like website] to detect sales, mass collection of data about parking and
driving in leased cars, including use of ANPR-data, and profiling people to detect
potentially fraudulent tax filings’.
 
Juridical challenges of Big Data: 
Transparency
 
Article 10 Information in cases of collection of data from the data subject
Member States shall provide that the controller or his representative must provide a data
subject from whom data relating to himself are collected with at least the following
information, except where he already has it:
(a) the identity of the controller and of his representative, if any;
(b) the purposes of the processing for which the data are intended;
(c) any further information such as
- the recipients or categories of recipients of the data,
- whether replies to the questions are obligatory or voluntary, as well as the possible
consequences of failure to reply,
- the existence of the right of access to and the right to rectify the data concerning him in
so far as such further information is necessary, having regard to the specific circumstances
in which the data are collected, to guarantee fair processing in respect of the data subject.
 
Juridical challenges of Big Data: 
Transparency
 
Article 11 Information where the data have not been obtained from the data subject
1. Where the data have not been obtained from the data subject, Member States shall
provide that the controller or his representative must at the time of undertaking the
recording of personal data or if a disclosure to a third party is envisaged, no later than the
time when the data are first disclosed provide the data subject with at least the following
information, except where he already has it:
(a) the identity of the controller and of his representative, if any;
(b) the purposes of the processing;
(c) any further information such as
- the categories of data concerned,
- the recipients or categories of recipients,
- the existence of the right of access to and the right to rectify the data concerning him in
so far as such further information is necessary, having regard to the specific circumstances
in which the data are processed, to guarantee fair processing in respect of the data subject.
 
Juridical challenges of Big Data: 
Transparency
 
Article 12 Right of access
Member States shall guarantee every data subject the right to obtain from the controller:
(a) without constraint at reasonable intervals and without excessive delay or expense:
- confirmation as to whether or not data relating to him are being processed and information at least as to the
purposes of the processing, the categories of data concerned, and the recipients or categories of recipients to
whom the data are disclosed,
- communication to him in an intelligible form of the data undergoing processing and of any available
information as to their source,
- knowledge of the logic involved in any automatic processing of data concerning him at least in the case of the
automated decisions referred to in Article 15 (1);
(b) as appropriate the rectification, erasure or blocking of data the processing of which does not comply with
the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data;
(c) notification to third parties to whom the data have been disclosed of any rectification, erasure or blocking
carried out in compliance with (b), unless this proves impossible or involves a disproportionate effort.
 
Juridical challenges of Big Data: 
Transparency
 
This principle is in tension with the rise of Big Data too, partly because data
subjects often simply do not know that their data is collected and therefore
are not likely to invoke their right to information. This applies equally to the
flipside of the coin, the transparency obligation for data controllers. For
them, it is often unclear to whom the information relates, where the
information came from and how they could contact the data subjects,
especially when the processes entail the connection of different databases
and the re-use of information. As the Slovenian DPA emphasized: ‘Big Data
has important information privacy implications. Information on personal
data processing may not be known to the individual or poorly described for
the individual, personal data may be used for purposes previously
unknown to the individual. The individual may be profiled and decisions
may be adopted in automated and non-transparent fashion having more or
less severe consequences for the individual.’
 
Juridical challenges of Big Data: Individual rights
 
Article 14 The data subject's right to object
Member States shall grant the data subject the right:
(a) at least in the cases referred to in Article 7 (e) and (f), to object at any time on
compelling legitimate grounds relating to his particular situation to the processing of data
relating to him, save where otherwise provided by national legislation. Where there is a
justified objection, the processing instigated by the controller may no longer involve those
data;
(b) to object, on request and free of charge, to the processing of personal data relating to
him which the controller anticipates being processed for the purposes of direct marketing,
or to be informed before personal data are disclosed for the first time to third parties or
used on their behalf for the purposes of direct marketing, and to be expressly offered the
right to object free of charge to such disclosures or uses.
Member States shall take the necessary measures to ensure that data subjects are aware
of the existence of the right referred to in the first subparagraph of (b).
 
Juridical challenges of Big Data: Individual rights
 
Article 15
Automated individual decisions
1. Member States shall grant the right to every person not to be subject to a decision which
produces legal effects concerning him or significantly affects him and which is based solely on
automated processing of data intended to evaluate certain personal aspects relating to him, such
as his performance at work, creditworthiness, reliability, conduct, etc.
2. Subject to the other Articles of this Directive, Member States shall provide that a person may
be subjected to a decision of the kind referred to in paragraph 1 if that decision:
(a) is taken in the course of the entering into or performance of a contract, provided the request
for the entering into or the performance of the contract, lodged by the data subject, has been
satisfied or that there are suitable measures to safeguard his legitimate interests, such as
arrangements allowing him to put his point of view; or
(b) is authorized by a law which also lays down measures to safeguard the data subject's
legitimate interests.
 
Juridical challenges of Big Data: Individual rights
 
General Data Protection Regulation
Right to be forgotten
Right to data portability
Right to resist profiling
 
Juridical challenges of Big Data: Individual rights
 
The question is whether this focus can be attained in the age of Big Data. It is
often difficult for individuals to demonstrate personal injury or an individual
interest in a case, and individuals are often unaware that their rights are violated.
Even if they do know that their data is gathered, data collection in the Big Data era
will presumably be so widespread that it is impossible for the individual to assess
each data process to determine whether their personal data are contained therein,
whether the processing is lawful and, if it is not, to go to court or file a
complaint. The DPA of the United Kingdom states on this issue: 'It may be difficult
to provide meaningful privacy information to data subjects, because of the
complexity of the analytics and people’s reluctance to read terms and conditions,
and because it may not be possible to identify at the outset all the purposes for
which the data will be used. It may be difficult to obtain valid consent,
particularly in circumstances where data is being collected through being
observed or gathered from connected devices, rather than being consciously
provided by data subjects.’
 
Juridical challenges of Big Data: Legal regulation
 
The current system is primarily based on the legal regulation of rights
and obligations. Big Data challenges this basis for several reasons.
Data processing is becoming increasingly transnational, which implies
that more and more agreements must be made between jurisdictions
and states. Making such agreements legally binding is often difficult
due to the different traditions and legal systems. Rapidly changing
technology means that specific legal provisions can easily be
circumvented and that unforeseen problems and challenges arise.
The legal reality is thus often overtaken by events and technical
developments.
 
Juridical challenges of Big Data: Legal regulation
 
The fact that many of the problems resulting from Big Data processes, as
also highlighted by a number of DPAs, revolve predominantly around more
general social and societal issues makes it difficult to address all of the Big
Data issues within specific legal doctrines, as these legal doctrines are
often aimed at protecting the interests of individuals, of legal subjects. That
is why more and more national governments look for alternatives or
additions to traditional black letter law when regulating Big Data – for
example self-regulation, codes of conduct and ethical guidelines. The DPA
of the United Kingdom states, for example: 'It is notable however that
there is some evidence of a move towards self-regulation, in the sense that
some companies are developing what can be described as an ‘ethical’
approach to big data, based on understanding the customer’s perspective,
being transparent about the processing and building trust.’
 
Juridical challenges of Big Data: Difference between non-personal data and personal data
 
Article 2  Definitions
For the purposes of this Directive:
(a) 'personal data' shall mean any information relating
to an identified or identifiable natural person ('data
subject'); an identifiable person is one who can be
identified, directly or indirectly, in particular by
reference to an identification number or to one or
more factors specific to his physical, physiological,
mental, economic, cultural or social identity;
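
The phrase 'directly or indirectly' identifiable matters in practice: replacing a direct identifier with a hash (pseudonymisation) does not in itself take data outside this definition, because anyone holding a candidate list can recompute the hashes and restore the link. A minimal sketch with a hypothetical e-mail address:

```python
import hashlib

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a deterministic hash."""
    return hashlib.sha256(identifier.encode("utf-8")).hexdigest()

# The stored pseudonym still relates to an identifiable person: hashing a
# hypothetical candidate list reveals which individual it belongs to.
stored = pseudonymise("jane.doe@example.com")
candidates = ["john.roe@example.com", "jane.doe@example.com"]
print([c for c in candidates if pseudonymise(c) == stored])  # ['jane.doe@example.com']
```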
 
Juridical challenges of Big Data: Difference between types of data
 
Private data
Privacy sensitive data
Personal data
Sensitive personal data
Meta data
Location data
Customer data
Aggregated data
 
Juridical challenges of Big Data: Difference between different actors
 
Private sector
Public sector
Intelligence services
Police
Tax authorities
Etc.
International differences
 
Juridical challenges of Big Data: Personal interests and individual rights
 
Ratione personae
Ratione materiae
Subjective rights of natural persons
 
Juridical challenges of Big Data: Balancing
 
Weighing one interest against another, for example security against
privacy
 
Big Data, Open Data, Re-use
 
Wet openbaarheid van bestuur (Government Information (Public Access) Act)
Wet hergebruik overheidsinformatie (Re-use of Government Information Act)
Wet open overheid (Open Government Act)
 
Wet hergebruik overheidsinformatie
 
Article 2. Scope
1. This act does not apply to:
a. information that is not public under the law;
b. information for which a third party holds the rights within the meaning of the Auteurswet (Copyright Act),
the Wet op de naburige rechten (Neighbouring Rights Act) or the Databankenwet (Databases Act);
c. information held by a public broadcaster, by another institution charged with a public
broadcasting task, or by an institution operating under the responsibility of a public broadcaster
or another institution charged with a public broadcasting task;
d. information held by educational and research institutions;
e. information held by cultural institutions other than libraries and museums;
f. parts of documents containing only logos, coats of arms and insignia;
g. information relating to public personal data whose re-use is incompatible with the purposes
for which they were obtained.
 
Wet open overheid
 
The Netherlands has fallen behind countries that have recently adopted a freedom of information act or have energetically
set about opening up government information for re-use, and has not learned from foreign experiences. Those experiences
have taken flight over the past decade. In many countries around us the right of access to public information is anchored in
legislation and practical solutions have been found to give substance to this right. Much emphasis is placed on the active
disclosure of information and on the possibilities for an Information Commissioner to bridge the gap between citizen and
government. This acknowledges that the accessibility of public information is often not only a legal question, but also a
practical problem and a question of mentality. This proposal tries to learn from these foreign experiences and solutions.
By aligning better with the wishes and desires of citizens and businesses, the possibility of turning public information into
economic value is also increased; the current Wob leaves these opportunities of the digital open data agenda unused.
In their activities, governments collect large amounts of data that can be valuable for all kinds of economic activities and
for increasing the effectiveness of the functioning of government. Private parties can also help make complex information
comprehensible and searchable. A precondition, however, is that the information management of governments and
semi-public bodies is in order and that information is properly digitally accessible and can be re-used.
The emancipation of the citizen vis-à-vis the government and the rise of the information society mean that citizens and
entrepreneurs want quick and easy access to all kinds of government information. This places high technological demands
on the ICT capacity of the public sector. The legal infrastructure also requires investment to ensure that the protection
offered by the law in an information society is at an adequate level for citizens and businesses. Equally, a shift in the
Dutch administrative culture is required. As long as the government's information monopoly is not broken and a mentality
does not take hold that the government holds its information in service of the public domain, citizen participation in
politics and policy will remain a difficult point.
 
Questions on Big Data, Open Data, Re-use
 
 
Break
 
 
Short interactive debate
 
 
Liability of internet intermediaries
 
 
E-Commerce Directive
 
Article 1
Objective and scope
5. This Directive shall not apply to:
(a) the field of taxation;
(b) questions relating to information society services covered by Directives 95/46/EC and 97/66/EC;
(c) questions relating to agreements or practices governed by cartel law;
(d) the following activities of information society services:
- the activities of notaries or equivalent professions to the extent that they involve a direct and specific
connection with the exercise of public authority,
- the representation of a client and defence of his interests before the courts,
- gambling activities which involve wagering a stake with monetary value in games of chance, including
lotteries and betting transactions.
6. This Directive does not affect measures taken at Community or national level, in the respect of
Community law, in order to promote cultural and linguistic diversity and to ensure the defence of pluralism.
 
 
General Data Protection Regulation
 
Article 2 
Material scope
4. This Regulation shall be without prejudice to the application of
Directive 2000/31/EC, in particular of the liability rules of
intermediary service providers in Articles 12 to 15 of that Directive.
 
E-commerce Directive
 
Section 4: Liability of intermediary service providers
Article 12
"Mere conduit"
1. Where an information society service is provided that consists of the transmission in a communication
network of information provided by a recipient of the service, or the provision of access to a communication
network, Member States shall ensure that the service provider is not liable for the information transmitted,
on condition that the provider:
(a) does not initiate the transmission;
(b) does not select the receiver of the transmission; and
(c) does not select or modify the information contained in the transmission.
2. The acts of transmission and of provision of access referred to in paragraph 1 include the automatic,
intermediate and transient storage of the information transmitted in so far as this takes place for the sole
purpose of carrying out the transmission in the communication network, and provided that the information
is not stored for any period longer than is reasonably necessary for the transmission.
3. This Article shall not affect the possibility for a court or administrative authority, in accordance with
Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.
 
E-commerce Directive
 
Article 13
"Caching"
1. Where an information society service is provided that consists of the transmission in a communication network of information
provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the automatic,
intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the
information's onward transmission to other recipients of the service upon their request, on condition that:
(a) the provider does not modify the information;
(b) the provider complies with conditions on access to the information;
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used
by industry;
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the
use of the information; and
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge
of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has
been disabled, or that a court or an administrative authority has ordered such removal or disablement.
2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal
systems, of requiring the service provider to terminate or prevent an infringement.
 
E-commerce Directive
 
Article 14
Hosting
1. Where an information society service is provided that consists of the storage of information
provided by a recipient of the service, Member States shall ensure that the service provider is not
liable for the information stored at the request of a recipient of the service, on condition that:
(a) the provider does not have actual knowledge of illegal activity or information and, as regards
claims for damages, is not aware of facts or circumstances from which the illegal activity or
information is apparent; or
(b) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to
disable access to the information.
2. Paragraph 1 shall not apply when the recipient of the service is acting under the authority or
the control of the provider.
3. This Article shall not affect the possibility for a court or administrative authority, in accordance
with Member States' legal systems, of requiring the service provider to terminate or prevent an
infringement, nor does it affect the possibility for Member States of establishing procedures
governing the removal or disabling of access to information.
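 
Purely by way of illustration (this is not part of the Directive or of these slides), a minimal Python sketch of how a hosting
provider might operationalise the Article 14(1) conditions in a notice-and-takedown workflow. The class and method names are
hypothetical, and the legal test is simplified to its two core prongs: no actual knowledge, or expeditious removal once
knowledge is obtained.
 
# Hypothetical sketch of the Article 14(1) hosting safe harbour: the provider
# remains exempt as long as it has no actual knowledge of illegal content and,
# once it obtains such knowledge (e.g. through a notice), acts expeditiously
# to remove or disable access to it. Names and structure are illustrative only.
from dataclasses import dataclass, field

@dataclass
class HostedItem:
    item_id: str
    known_illegal: bool = False   # actual knowledge or awareness obtained?
    accessible: bool = True       # still available to the public?

@dataclass
class HostingProvider:
    items: dict = field(default_factory=dict)

    def add_item(self, item: HostedItem) -> None:
        self.items[item.item_id] = item

    def receive_notice(self, item_id: str) -> None:
        # A sufficiently substantiated notice gives the provider knowledge (Art. 14(1)(a)).
        self.items[item_id].known_illegal = True
        self.remove_expeditiously(item_id)

    def remove_expeditiously(self, item_id: str) -> None:
        # Acting expeditiously on that knowledge preserves the exemption (Art. 14(1)(b)).
        self.items[item_id].accessible = False

    def exemption_applies(self, item_id: str) -> bool:
        # Simplified test: either no knowledge, or the content has been taken down.
        item = self.items[item_id]
        return (not item.known_illegal) or (not item.accessible)

if __name__ == "__main__":
    provider = HostingProvider()
    provider.add_item(HostedItem("post-123"))
    provider.receive_notice("post-123")
    print(provider.exemption_applies("post-123"))  # True: notice handled expeditiously

The same conditional structure recurs in Articles 12 and 13: the exemptions apply only as long as the listed conditions are met,
and Article 15 adds that Member States may not turn them into a general monitoring duty.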
 
E-commerce Directive
 
Article 15
No general obligation to monitor
1. Member States shall not impose a general obligation on providers, when
providing the services covered by Articles 12, 13 and 14, to monitor the
information which they transmit or store, nor a general obligation actively
to seek facts or circumstances indicating illegal activity.
2. Member States may establish obligations for information society service
providers promptly to inform the competent public authorities of alleged
illegal activities undertaken or information provided by recipients of their
service or obligations to communicate to the competent authorities, at
their request, information enabling the identification of recipients of their
service with whom they have storage agreements.
 
E-commerce Directive
 
Article 21
Re-examination
1. Before 17 July 2003, and thereafter every two years, the Commission shall submit to
the European Parliament, the Council and the Economic and Social Committee a report
on the application of this Directive, accompanied, where necessary, by proposals for
adapting it to legal, technical and economic developments in the field of information
society services, in particular with respect to crime prevention, the protection of minors,
consumer protection and to the proper functioning of the internal market.
2. In examining the need for an adaptation of this Directive, the report shall in particular
analyse the need for proposals concerning the liability of providers of hyperlinks and
location tool services, "notice and take down" procedures and the attribution of liability
following the taking down of content. The report shall also analyse the need for
additional conditions for the exemption from liability, provided for in Articles 12 and 13,
in the light of technical developments, and the possibility of applying the internal market
principles to unsolicited commercial communications by electronic mail.
 
Scarlet/Sabam en Sabam/Netlog
 
Het HvJ heeft in de zaken Scarlet/Sabam en Sabam/Netlog onder meer
bepaald dat de e-commercerichtlijn, in samenhang gelezen met andere
richtlijnen, eraan in de weg staat ‘(...) dat een hostingdienstverlener door
een nationale rechter wordt gelast een filtersysteem te installeren: – voor
de informatie die de gebruikers van zijn diensten op zijn servers opslaan;
dat zonder onderscheid op al die gebruikers wordt toegepast; dat
preventief werkt; dat uitsluitend door hem wordt bekostigd, en dat geen
beperking in de tijd kent, – waarmee elektronische bestanden die muziek-,
cinematografische of audiovisuele werken bevatten waarvan de
verzoekende partij stelt bepaalde intellectuele-eigendomsrechten te
bezitten, kunnen worden geïdentificeerd, zodat kan worden voorkomen
dat die werken ter beschikking van het publiek worden gesteld en aldus het
auteursrecht wordt geschonden.’
 
Google/Louis Vuitton
 
Het HvJ oordeelde dat Google’s advertentiedienst, die in samenhang met de zoekmachine
wordt geleverd, ook onder artikel 14 kan vallen, aangezien dit artikel
aldus moet ‘worden uitgelegd dat de daarin genoemde regel geldt
voor de verlener van een zoekmachineadvertentiedienst op internet
wanneer die dienstverlener geen actieve rol heeft gehad waardoor hij
kennis heeft van of controle heeft over de opgeslagen gegevens.
Indien dat het geval is, kan de dienstverlener niet aansprakelijk
worden gesteld voor de gegevens die hij op verzoek van een
adverteerder heeft opgeslagen, tenzij hij niet snel die gegevens
verwijdert of de toegang daartoe onmogelijk maakt nadat hij kennis
heeft gekregen van het onwettige karakter van die gegevens of van
activiteiten van die adverteerder.’
 
L’Oréal/eBay
 
Het Hof overweegt 
ten aanzien van eBay dat ‘het enkele feit dat de
beheerder van een elektronische marktplaats de verkoopaanbiedingen op
zijn server opslaat, bepaalt hoe zijn dienst wordt verleend, daarvoor een
vergoeding ontvangt en algemene inlichtingen aan zijn klanten verstrekt, er
niet toe [kan] leiden dat hij geen beroep kan doen op de in richtlijn
2000/31 voorziene vrijstellingen van aansprakelijkheid’. Toch kan een
provider zich ‘niet op de vrijstelling van aansprakelijkheid als bedoeld in die
bepaling beroepen wanneer hij kennis heeft gehad van feiten of
omstandigheden op grond waarvan een behoedzame marktdeelnemer de
onwettigheid van de betrokken verkoopaanbiedingen had moeten
vaststellen en hij, ingeval hij deze kennis had, niet prompt heeft gehandeld
overeenkomstig lid 1, sub b, van genoemd artikel 14’
 
Google/Spanje
 
‘Het is de exploitant van de zoekmachine die het doel van en de
middelen voor deze activiteit vaststelt en dus van de door hem zelf in
dat kader verrichte verwerking van persoonsgegevens, zodat hij (...)
moet worden geacht de “verantwoordelijke” voor deze verwerking te zijn.’
‘Bovendien kan de verwerking door de redacteur van een webpagina,
bestaande uit de publicatie van informatie betreffende een
natuurlijke persoon, in voorkomend geval “voor uitsluitend
journalistieke (...) doeleinden” zijn verricht en aldus krachtens artikel
9 van richtlijn 95/46 onder de uitzonderingen op de vereisten van
deze richtlijn vallen, terwijl dit niet het geval is voor de door een
exploitant van een zoekmachine verrichte verwerking.’
 
Lindqvist
 
‘Die uitzondering moet derhalve aldus worden uitgelegd, dat zij
uitsluitend betrekking heeft op activiteiten die tot het persoonlijke of
gezinsleven van particulieren behoren, hetgeen klaarblijkelijk niet het
geval is met de verwerking van persoonsgegevens die bestaat in hun
openbaarmaking op internet waardoor die gegevens voor een
onbepaald aantal personen toegankelijk worden gemaakt.’
 
Handyside
 
‘Freedom of expression constitutes one of the essential foundations
of such a society, one of the basic conditions for its progress and for
the development of every man. Subject to paragraph 2 of Article 10, it
is applicable not only to “information” or “ideas” that are favourably
received or regarded as inoffensive or as a matter of indifference, but
also to those that offend, shock or disturb the State or any sector of
the population. Such are the demands of that pluralism, tolerance
and broadmindedness without which there is no “democratic
society”.’
 
Delfi/Estland
 
In the light of its accessibility and its capacity to store and
communicate vast amounts of information, the Internet plays an
important role in enhancing the public’s access to news and
facilitating the dissemination of information in general. The
maintenance of Internet archives is a critical aspect of this role and
the Court therefore considers that such archives fall within the ambit
of the protection afforded by Article 10.
 
Model
 
Vragen en opmerkingen over
aansprakelijkheid internet intermediairs
 
 
Einde
 