Privacy by design & case studies

George Danezis (g.danezis@ucl.ac.uk)

With help from:
Luca Melis (luca.melis.14@ucl.ac.uk)
Steve Dodier-Lazaro (s.dodier-lazaro.12@ucl.ac.uk)
GA17: Where are we at?
Communications content – E2E encryption
Communications meta-data – Anonymous communications
Computations – Homomorphic encryption / SMPC
Integrity – Zero-Knowledge proofs
Authentication / Authorization – Selective disclosure credentials
Regulations – Data protection
Data & Query anonymization – Differential privacy
Human Aspects & requirements
Storage & retrieval – PIR, ORAM, …
Case studies – putting it all together in one architecture.
+ Labs & code review!
Privacy by Design (PbD)
Economics of privacy engineering:
Thinking about privacy at the design stage is cheaper than retrofitting it at later stages.
Example 1. Alice builds a database and decides “name” & “gender” are immutable. She uses them as keys to the database records.
Example 2. Bob builds a web-site that records a lot of user activity, but presents none of it to the user. The activity is stored on a separate back end.
Problem? (Names and genders do in fact change; activity hidden from users frustrates their rights of access and rectification. A sketch of the fix to Example 1 follows.)
Privacy-by-design approach:
Integrate thinking about privacy and subject rights at all stages of development:
requirements, specification, implementation, testing, deployment, maintenance.
Integrate appropriate controls at all stages to mitigate privacy risks.
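A minimal sketch of the fix to Example 1 (assuming SQLite purely for illustration): key records by an opaque surrogate identifier, so mutable attributes such as name and gender can be rectified without rekeying the database.

```python
import sqlite3
import uuid

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        user_id TEXT PRIMARY KEY,   -- opaque surrogate key, never changes
        name    TEXT NOT NULL,      -- mutable attribute, safe to update
        gender  TEXT                -- mutable attribute, safe to update
    )
""")

# Insert a record keyed by a random identifier, not by personal data.
uid = str(uuid.uuid4())
conn.execute("INSERT INTO users VALUES (?, ?, ?)", (uid, "Alice", "F"))

# Rectification (a data-subject right) is now a simple UPDATE;
# nothing that references user_id breaks when name or gender changes.
conn.execute("UPDATE users SET name = ?, gender = ? WHERE user_id = ?",
             ("Alex", "M", uid))
print(conn.execute("SELECT * FROM users").fetchone())
```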
George Danezis, Josep Domingo-Ferrer, Marit Hansen, Jaap-Henk Hoepman, Daniel Le Metayer, Rodica Tirtea, Stefan Schiffner. Privacy and Data Protection by Design - from policy to engineering. European Union Agency for Network and Information Security (ENISA), ISBN: 978-92-9204-108-3
7 principles of PbD
1. Proactive not Reactive; Preventative not Remedial
2. Privacy as the Default Setting
3. Privacy Embedded into Design
4. Full Functionality – Positive-Sum, not Zero-Sum
5. End-to-End Security – Full Lifecycle Protection
6. Visibility and Transparency – Keep it Open
7. Respect for User Privacy – Keep it User-Centric
The 7 Principles: https://www.privacybydesign.ca/index.php/about-pbd/7-foundational-principles/
Gürses, Seda, Carmela Troncoso, and Claudia Diaz. "Engineering privacy by design." Computers, Privacy & Data Protection 14 (2011).
“[…] these principles remain vague and leave many open questions about their application when engineering systems.” – Gürses et al. (2011)
PbD and its discontents (I)
“Privacy by design can be reduced to a series of symbolic activities to assure consumers’ confidence, as well as the free flow of information in the marketplace.”
“From a security engineering perspective, control and transparency mechanisms do not provide the means to mitigate the privacy risks that arise through the collection of data in massive databases.”
Gürses, Seda, Carmela Troncoso, and Claudia Diaz. "Engineering privacy by design." Computers, Privacy & Data Protection 14 (2011).
PbD and its discontents (II)
“This becomes especially problematic with respect to large-scale mandatory systems like road tolling systems and smart energy systems, or de facto mandatory systems like telecommunications (e.g., mobile phones).”
Conclusion:
“From a security engineering perspective, the risks inherent to the digital format imply that data minimization must be the foundational principle in applying privacy by design to these systems.”
Gürses, Seda, Carmela Troncoso, and Claudia Diaz. "Engineering privacy by design." Computers, Privacy & Data Protection 14 (2011).
A note on software architecture
We present the case studies as architectures.
What is software architecture? (Shaw and Garlan 1996)
“Software architecture encompasses the set of significant decisions about the organization of a software system including the selection of the structural elements and their interfaces by which the system is composed;”
“behavior as specified in collaboration among those elements; composition of these structural and behavioral elements into larger subsystems;”
“and an architectural style that guides this organization.”
Architecture should:
Expose the structure of the system but hide the implementation details.
Realize all of the use cases and scenarios.
Try to address the requirements of various stakeholders.
Handle both functional and quality requirements.
Microsoft Application Architecture Guide, 2nd Edition. October 2009.
Case Study 1: e-petitions (Diaz et al. 2009)
A petition is a formal request to an authority.
E.g. EU Lisbon Treaty: 1M people across EU may request legislation.
Two key risks:
(1) Disclosure of who signed a petition (sensitive topics).
(2) Discrimination or profiling on the basis of signing a petition.
Requirements:
“The signatures correspond to existing individuals.”
“Only individuals eligible to sign a petition are able to do so.”
“Each individual can sign a petition only once.”
“The number of signatures is correctly counted.”
Note: “identifiability not inherent” -> Principle of data minimization …
Claudia Diaz, Eleni Kosta, Hannelore Dekeyser, Markulf Kohlweiss, and Girma Nigusse. Privacy preserving electronic petitions. Identity in the
Information Society, 1(1):203-209, 2009
E-Petition architecture
Registration Server: users authenticate and present attributes (encrypted); the server ensures only one / the same credential per person and issues a Selective Disclosure Credential.
Petition Servers A, B, C (Web): the user shows the credential over an anonymous channel, proving: possession of the credential; eligibility; a unique ID per petition (duplicate detection).
Each petition server keeps a log of all signatures’ NIZKs & public counting; the user checks inclusion over an anonymous channel.
Question: Re-evaluate risks!
Pattern: anonymize. (A sketch of per-petition duplicate detection follows.)
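A minimal sketch of the duplicate-detection idea (the HMAC construction and names are illustrative assumptions, not the paper's exact credential scheme): the credential lets a user derive one deterministic pseudonym per petition, so a second signature is rejected without ever identifying the signer.

```python
import hashlib
import hmac
import secrets

# Hypothetical user secret embedded in the credential at registration.
user_secret = secrets.token_bytes(32)

def petition_tag(user_secret: bytes, petition_id: str) -> str:
    """Deterministic per-petition pseudonym: the same user always produces
    the same tag for one petition (enabling duplicate detection), but tags
    for different petitions are unlinkable without the secret."""
    return hmac.new(user_secret, petition_id.encode(), hashlib.sha256).hexdigest()

# The petition server stores tags, not identities.
seen = set()

def record_signature(tag: str) -> bool:
    if tag in seen:
        return False          # duplicate: this user already signed
    seen.add(tag)
    return True

t = petition_tag(user_secret, "petition-A")
assert record_signature(t) is True
assert record_signature(t) is False                      # second attempt rejected
assert petition_tag(user_secret, "petition-B") != t      # unlinkable across petitions
```

In the real scheme the tag is produced inside a zero-knowledge credential show, which simultaneously proves eligibility; this sketch covers only the duplicate-detection tag itself.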
Case Study 2: Electronic Toll Pricing
Pay according to road use: time, distance, type of road, congestion.
Privacy risks:
(1) Third party access to traffic / location data of driver.
(2) Abuse of traffic data by authority performing the billing.
 
(location data cannot be easily anonymized)
Requirements:
“the provider needs to know the final fee to charge;”
“the provider must be reassured that this fee is correctly computed and users
cannot commit fraud”
Note: location as a means to the above -> not intrinsic.
Josep Balasch, Alfredo Rial, Carmela Troncoso, Christophe Geuens, Bart Preneel, and Ingrid Verbauwhede. PrETP: Privacy-preserving electronic toll pricing. In 19th USENIX Security Symposium, pages 63-78. USENIX Association, 2010.
ETP Architecture
On-Board-Unit: receives location (GPS) and tariffs, maps & policy; computes the fee for each road segment and the total fee; sends the Authority the final fee plus time / location commitments and fee commitments, with a homomorphic add of the fees.
Authority: holds location observations (speed cameras); challenges the unit with a handful of known locations (+ fees), for which it must reveal the commitments for that time.
Note: the challenge links GPS location to fee.
Question: Cap risk if one fee entry is wrong? (A sketch of the commitment arithmetic follows.)
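A minimal sketch, with toy and insecure parameters, of the homomorphic addition of fee commitments that PrETP-style schemes rely on: multiplying Pedersen commitments yields a commitment to the sum of fees, so the authority can verify the declared total, and a spot-check challenge opens a single segment.

```python
import secrets

# Toy Pedersen parameters. WARNING: demo values only; a real deployment
# uses proper group parameters and h generated so log_g(h) is unknown.
p = 2**127 - 1          # a Mersenne prime, fine for illustration
g, h = 3, 5

def commit(m: int, r: int) -> int:
    return (pow(g, m, p) * pow(h, r, p)) % p

# The on-board unit commits to each road-segment fee.
fees = [120, 75, 230]                             # cents per segment
rs = [secrets.randbelow(p - 1) for _ in fees]
cs = [commit(m, r) for m, r in zip(fees, rs)]

# Homomorphic add: the product of commitments commits to the total fee,
# so the authority checks the declared total without seeing segment fees.
product = 1
for c in cs:
    product = (product * c) % p
total_fee, total_r = sum(fees), sum(rs) % (p - 1)
assert product == commit(total_fee, total_r)

# On a spot-check challenge for one observed segment, the unit reveals
# (fee, r) for just that commitment and the authority verifies it.
i = 1
assert cs[i] == commit(fees[i], rs[i])
```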
Case Study 3: Smart metering
Smart energy meters record household consumption every 30 mins.
Privacy Risks:
(1) Inference of sensitive personal attributes (health, religion, work).
Requirements:
“Billing should be correct”
“Aggregate statistics per household or group should be available”
“Fraud / tampering detection”
Alfredo Rial and George Danezis. Privacy-Preserving Smart Metering. ACM WPES 2011, Chicago, USA, October 17, 2011.
Klaus Kursawe, George Danezis, Markulf Kohlweiss: Privacy-Friendly Aggregation for the Smart-Grid. PETS 2011, Waterloo, ON, Canada, July 27-29, 2011.
Smart meter billing architecture (I)
Secure H/W; ZKP for correct billing; ZKP for other apps; encrypted links.
Smart meter billing architecture (II)
Note: No LAN, all data in the cloud, but encrypted; ZKP; secret sharing allows for aggregation. (A sketch of the billing check follows.)
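A minimal sketch of the arithmetic behind a "ZKP for correct billing" of this kind (toy Pedersen parameters as above; the zero-knowledge proof layer itself is omitted): raising each reading commitment to its public tariff and multiplying yields a commitment to the bill, which the utility can check without opening any reading.

```python
import secrets

# Toy Pedersen parameters (insecure, illustration only).
p, g, h = 2**127 - 1, 3, 5

def commit(m: int, r: int) -> int:
    return (pow(g, m, p) * pow(h, r, p)) % p

# Meter commits to each half-hourly reading x_t.
readings = [4, 0, 7, 12]                       # consumption per period
tariffs  = [3, 3, 5, 9]                        # public tariff per period
rs = [secrets.randbelow(p - 1) for _ in readings]
cs = [commit(x, r) for x, r in zip(readings, rs)]

# Anyone can raise each commitment to the public tariff and multiply:
# the result commits to sum(tariff_t * x_t), the bill, without opening
# any individual reading. (The real scheme adds a ZK proof of opening.)
acc = 1
for c, t in zip(cs, tariffs):
    acc = (acc * pow(c, t, p)) % p

bill   = sum(t * x for t, x in zip(tariffs, readings))
bill_r = sum(t * r for t, r in zip(tariffs, rs)) % (p - 1)
assert acc == commit(bill, bill_r)
print("bill:", bill)
```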
Smart Meter Architecture (III)
Meter (H/W): holds reading x_i and keys K_i1, K_i2; sends encoded readings, blinded with 32-bit modular arithmetic (secret sharing / PK encryption).
DCC: holds keys K_i2, K_i3; maintains the repository of “encrypted” readings (secret sharing) and accepts authorized queries.
Decryption / Aggregation authorities receive shares y_1, y_2 and return the response (differential privacy).
Gilles Barthe, George Danezis, Benjamin Grégoire, César Kunz, Santiago Zanella Béguelin: Verified Computational Differential Privacy with Applications to Smart Metering. CSF 2013: 287-301
Decryption / Aggregation Authorities (Access control); Meter (H/W).
Note: little user involvement; aggregation despite failures.
Remember: more than one way to architect. (A sketch of the cancelling-shares aggregation follows.)
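A minimal sketch of the aggregation idea, in the style of Kursawe et al.: each meter blinds its reading with a random share, and the shares are constructed to cancel across the group, so the aggregator learns only the sum. A dealer-style share generation stands in here for the pairwise-key derivation of the real protocol, and a Laplace-noised query response illustrates the differential-privacy step.

```python
import random

MOD = 2**32  # 32-bit modular arithmetic for the shares

def blind_readings(readings):
    """Each meter i adds a blinding value s_i with sum(s_i) = 0 (mod 2^32):
    individual blinded readings look random, but the sum is preserved."""
    n = len(readings)
    shares = [random.randrange(MOD) for _ in range(n - 1)]
    shares.append((-sum(shares)) % MOD)        # forces the shares to cancel
    return [(r + s) % MOD for r, s in zip(readings, shares)]

readings = [412, 305, 598, 220]                # Wh per half hour, say
blinded = blind_readings(readings)

# The aggregator sums blinded values; the blinds cancel out.
assert sum(blinded) % MOD == sum(readings) % MOD

# Differentially private response to an authorized aggregate query:
# the difference of two exponentials is Laplace(sensitivity/epsilon) noise.
def dp_sum(values, sensitivity=1000.0, epsilon=1.0):
    lam = epsilon / sensitivity
    noise = random.expovariate(lam) - random.expovariate(lam)
    return sum(values) + noise

print(dp_sum(readings))
```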
Case Study 4: Danish Sugar Beet Auctions
Farmer contracts to produce sugar beet. Single sugar producer.
Rights / contracts are sold at an Auction
Risks:
(1) Bids for contracts reveal a farmer’s economy (can be abused)
(2) May be abused by single producer or other farmers.
Requirements:
“Run a double auction”
“Receive sealed bids”
“Compute Market Clearing Price”
“Perform trade at MCP – binding”
The Danish Sugar Beet Auctions, Tomas Toft, Aarhus University, PEC Workshop. Dec. 8th 2012
Beet Auction Architecture
Secret sharing among mutually distrustful parties:
- Danisco (sugar producer)
- DSK (farmers’ assoc.)
- SIMAP (researchers)
Authentication / Authorization; off-line keys.
“The auction has been run every year since 2008.” (A toy sketch of the shared computation follows.)
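A toy sketch of the double-auction computation under additive secret sharing (three servers standing in for Danisco, DSK and SIMAP; the deployed system used a different MPC protocol): bidders submit, per candidate price, the quantity they would trade, only as shares; the servers open only the aggregate supply-minus-demand curve, from which the market clearing price is read off.

```python
import random

PRICES = list(range(10))          # candidate prices (toy scale)
MOD = 2**31 - 1                   # share arithmetic modulus (toy)
N_SERVERS = 3

def share(value):
    """Additive secret sharing: split value into N_SERVERS random shares."""
    parts = [random.randrange(MOD) for _ in range(N_SERVERS - 1)]
    parts.append((value - sum(parts)) % MOD)
    return parts

# Each farmer submits, per candidate price, the quantity they would sell;
# each buyer the quantity they would buy. Only shares leave the bidder.
supply_bids = [[0, 0, 1, 2, 3, 4, 5, 6, 7, 8], [0, 1, 2, 3, 4, 4, 5, 5, 6, 6]]
demand_bids = [[9, 8, 7, 6, 5, 4, 3, 2, 1, 0], [6, 6, 5, 5, 4, 3, 2, 1, 0, 0]]

# servers[s][p] accumulates share s of (supply - demand) at price p.
servers = [[0] * len(PRICES) for _ in range(N_SERVERS)]
for bid, sign in [(b, 1) for b in supply_bids] + [(b, -1) for b in demand_bids]:
    for p, qty in enumerate(bid):
        for s, sh in enumerate(share((sign * qty) % MOD)):
            servers[s][p] = (servers[s][p] + sh) % MOD

# Servers jointly open only the aggregate per price; individual bids
# stay hidden. MCP = first price where supply meets or exceeds demand.
def open_value(p):
    v = sum(servers[s][p] for s in range(N_SERVERS)) % MOD
    return v - MOD if v > MOD // 2 else v     # recover signed value

mcp = next(p for p in PRICES if open_value(p) >= 0)
print("Market clearing price:", mcp)
```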
Case Study 5: SecureDrop for whistle blowing
Sources within Gov / Industry want to help journalists uncover wrongdoing.
Privacy Risks:
(1) The identity of the source may be uncovered.
(2) The documents may contain too much information.
Requirements:
“Source can submit story / documents”
“Journalist may converse with source”
“Documents can be redacted / selected”
“Selected documents can be made public”
Freedom of the Press Foundation -- https://securedrop.org/
SecureDrop Architecture
Tor (anonymous comms.)
Tails (OS privacy)
Airgap (architecture)
Encryption + H/W keys (encryption)
Lesson: Architecture can also be a privacy mechanism!
https://github.com/freedomofpress/securedrop/issues/274
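A minimal sketch of the encryption step, using PyNaCl sealed boxes as a stand-in (SecureDrop itself uses OpenPGP): the public-facing server only ever holds ciphertext for the journalist's key, and the private key lives behind the airgap on the secure viewing station.

```python
# pip install pynacl
from nacl.public import PrivateKey, SealedBox

# Journalist key pair: the private key stays on the air-gapped
# secure viewing station, never on the public-facing server.
journalist_key = PrivateKey.generate()

# The submission side encrypts to the public key only.
submit_box = SealedBox(journalist_key.public_key)
ciphertext = submit_box.encrypt(b"leaked-document.pdf contents")

# The server stores ciphertext it cannot decrypt; decryption
# happens offline, behind the airgap.
open_box = SealedBox(journalist_key)
assert open_box.decrypt(ciphertext) == b"leaked-document.pdf contents"
```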
Privacy Engineering (Gürses et al., 2011)
Process:
Functional Requirements Analysis: (Vague requirements lead to privacy problems.)
Data Minimization: (Identity or full data are not always necessary.)
Modelling Attackers, Threats and Risks: (Which parties have incentives to be hostile to the requirements?)
Multilateral Security Requirements Analysis: (Conflicting / contradicting security requirements of all parties.)
Implementation and Testing of the Design.
Crucial: iterate all stages.
“If the functionality was not properly delimited in our case studies, even following our methodology, we would be forced to go for a centralized approach collecting all the data” -- Gürses et al. (2009)