Catalyst Conference, Day 3 (Friday, July 31)

This morning's Privacy Track was the most intellectually stimulating set of sessions for me at the Catalyst Conference. The blend of theoretical background and practical application of privacy principles was a good combination. I certainly don't consider myself a privacy expert, so I learned much and gained valuable perspective, both from the point of view of an Identity Management practitioner and as a person who values personal privacy. Hats off to Burton Group for assembling an excellent set of speakers.

Here are the high points for me:

Privacy: Principles Yield Practice
Bob Blakley (Burton Group)
  1. Privacy is not about data, it is about people
  2. Protecting privacy means putting oneself in the place of another and understanding the consequences of one's actions
  3. Privacy means different things in different contexts
  4. Privacy principles:
    1. accountability
    2. transparency
    3. meaningful choice
    4. minimal collection and disclosure
    5. constrained use
    6. data quality and accuracy
    7. validated access
    8. security
  5. Put principles into context - then derive set of rules
  6. IdM systems contain a great deal of personal data.  Are we protecting the dignity of the people we know things about?


Privacy Issues Related to Healthcare and Identity
Speaker: David Miller (Covisint)
  1. IAM is not a security thing.  It is a privacy thing.
  2. Security is about keeping people out; privacy is about letting the right people in.
  3. Electronic Medical Records (EMR) are being dictated by legislation, but have challenges to overcome, including:
    1. authentication
    2. authorization
    3. data exists in many places
    4. patient access to records depends on many factors
    5. many organizations want access to information
    6. regulatory issues
    7. legal/tort issues
  4. One solution is a central Health Information Exchange (HIE).
  5. Different organizations at the national, state and health care organization levels approach HIEs differently.


Privacy - how to have a productive multi-stakeholder discussion
Robin Wilton (Future Identity Ltd.)
  1. Privacy is usually a multi-stakeholder discussion
  2. It is difficult for stakeholders to articulate their view of privacy problems in a way that other stakeholders understand
  3. Use the "Onion Model" to explore and use levels of importance of personal information
  4. Use the "Ladder Model" to facilitate different viewpoints about privacy
  5. We are doing all this technical interaction in online networking as if it works the same way as face to face interaction, but it does not.
  6. "Privacy management" implies being aware of relationships and contexts, and acting accordingly.
  7. Technology is not an automatic answer to privacy.


A Dual Mission: Identity Management and Privacy Protection in the Federal Government

Bob Mocny, Director, DHS-VISIT Program - Department of Homeland Security
  1. Identity management is critical to national security
  2. US-VISIT - check credentials for visitors entering the United States
  3. 100 million biometric records used for authentication, 200K transactions/day - largest in the world
  4. Built privacy into architecture of system
  5. Secure facilities and networks are in place to protect privacy
  6. Redress process to correct personal information in the system is essential
  7. No more important condition between the government and the people it protects than trust
  8. US VISIT built trust into the biometric system


Joint Q&A
Bob Blakley (Burton Group)
Bob Mocny (Department of Homeland Security)
Robin Wilton (Future Identity Ltd)
David Miller (Covisint)
  1. Privacy-enhancing governance is difficult (e.g. if you request that your PII be deleted from a list, is your PII still on the audit trail?)
  2. Considerable explicit effort, and dedicated systems, are necessary to avoid unintended consequences of amassing large amounts of personal information.
  3. People who have grown up in a hyper-connected, pervasive-surveillance world tend to have different perspectives on privacy than older people, for whom personal information was secret by default.
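The first Q&A point (can PII be deleted while its trace remains in an audit trail?) has a known engineering pattern sometimes called "crypto-shredding." The sketch below is my own hypothetical illustration, not anything the panel described: the audit trail references a person only through a keyed hash, so deleting the person's key unlinks their entries without rewriting the trail.

```python
import hashlib
import hmac
import os

class AuditLog:
    """Append-only audit trail that references people only via a keyed
    hash; deleting a person's key unlinks their PII without rewriting
    the trail, so deletion requests can be honored."""

    def __init__(self):
        self._keys = {}      # person_id -> secret key (deletable)
        self._entries = []   # append-only audit records

    def record(self, person_id: str, action: str) -> None:
        key = self._keys.setdefault(person_id, os.urandom(32))
        token = hmac.new(key, person_id.encode(), hashlib.sha256).hexdigest()
        self._entries.append({"subject": token, "action": action})

    def forget(self, person_id: str) -> None:
        # "Shred" the link: the entry remains for auditors, but can no
        # longer be tied back to the person once the key is gone.
        self._keys.pop(person_id, None)

log = AuditLog()
log.record("alice", "PII deleted at user request")
log.forget("alice")
# The trail still holds one record, but the link to "alice" is gone.
```

This is only one possible answer to the governance question; it trades exact deletion for unlinkability, which may or may not satisfy a given regulator.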


Partnering via Privacy
Ian Glazer (Burton Group)
  1. Increased regulatory action, higher penalties, more people looking at privacy - all increase the attention companies must focus on privacy.
  2. Increased reliance on partners requires companies to understand privacy practices of partners.
  3. Perform Privacy Impact Assessments (PIAs) to determine where we are, how we got here, and how changes can affect risks.
  4. PIA - opportunity to look at mission goals, design goals and privacy principles - are they in alignment?
  5. Reduce privacy risk by "cleaning your basement"
    1. Scary basements (something might be illegal)
    2. Messy basements (policy in place, but not well-applied)
  6. Procurement process is the best place to ask tough questions about partner privacy practices.


The Watchmen: UCLA & Georgetown Protect and Defend Privacy and Data Security
Heidi Wachs (Georgetown University)
  1. Although Georgetown University and UCLA have significant differences in size, organization and operational practices for privacy policy, the incident response process is quite similar
  2. Both suffered significant privacy breaches
  3. Response depends on what data is actually "acquired" vs. how much was "exposed"
  4. Privacy breaches triggered much public press and discussion
  5. New policies adopted quickly in the wake of the breaches have proven difficult to implement


How Google Protects the Privacy of Our Users

Shuman Ghosemajumder (Google)
  1. Google global design principles: transparency, choice, security.
  2. End to end security is an essential part of every Google Service.
  3. Google Latitude: make privacy choices very visible and easily accessible, with opt-out at multiple levels.
  4. Street view: blur faces and license plates automatically, but allow individuals to request blurring if automated process fails.
  5. Interest-based advertising: give users control over categories and the ability to opt out at different levels of granularity.
  6. Gmail: contextual ads caused concern - because of its proximity to and dependence on personal email.
  7. Data retention: Google anonymizes IP addresses in logs after 9 months.
  8. Google chose the paradigm of "opt-in after the fact", rather than "opt-in beforehand", so as not to disrupt the user experience or the advertising ecosystem.
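The log-retention point can be made concrete with a short sketch. The talk did not describe Google's actual anonymization method, so this simply zeroes the low-order bits of an address (the last octet for IPv4, everything past a /48 for IPv6), which is a common log-anonymization technique:

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Coarsen an IP address for retained logs by zeroing low-order
    bits: keep a /24 prefix for IPv4 and a /48 prefix for IPv6."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))  # -> 203.0.113.0
print(anonymize_ip("2001:db8::1"))   # -> 2001:db8::
```

Note that prefix-truncation reduces identifiability rather than eliminating it; whether it counts as "anonymization" depends on the threat model and the jurisdiction.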


