8.5 Online Privacy Requirements
This section follows the method used in Chapter 3, “Information Privacy Requirements and Guidelines,” to arrive at online privacy requirements. The first part of this section presents several different categorizations of principles for online privacy. The second part of this section discusses an online privacy framework developed by the U.S. Federal Trade Commission (FTC), which is based on the FTC’s statement of principles.
Online Privacy Principles
The FTC defines a set of fair information practice principles (FIPPs) appropriate for specifying online privacy requirements [FTC98]. The FTC derives these principles from the privacy FIPPs developed by the OECD and other organizations, as described in Chapter 3. The principles are:
Notice/awareness: Ensuring that consumers are notified or made aware of an organization’s information practices before any information is actually collected from them (e.g., an organization’s privacy policy). Such notification or awareness should include:
— Identification of the entity collecting the data
— Identification of the uses to which the data will be put
— Identification of any potential recipients of the data
— Nature of the data collected
— Whether the provision of the requested data is voluntary or required and the consequences of a refusal to provide the requested information
— Steps taken by the data collector to ensure the confidentiality, integrity, and quality of the data
Choice/consent: Ensuring that consumers are given the option to decide how personal information collected about them is to be used and whether it may be used for secondary purposes.
Access/participation: Ensuring an individual’s ability both to access data and to contest that data’s accuracy and completeness.
Integrity/security: Ensuring that data are both accurate and secure. Achieving security and accuracy requires action by both the consumer and the organization collecting the PII.
Enforcement/redress: Ensuring that mechanisms are in place to enforce these privacy principles and to provide redress when they are violated.
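The notice/awareness elements listed above lend themselves to a structured, machine-readable representation that an organization can present before any collection occurs. The following Python sketch is purely illustrative; the class and field names are assumptions, not part of any FTC specification:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class PrivacyNotice:
    """Illustrative structure covering the FTC notice/awareness elements."""
    collecting_entity: str        # identity of the entity collecting the data
    purposes: list                # uses to which the data will be put
    recipients: list              # potential recipients of the data
    data_collected: list          # nature of the data collected
    provision_required: bool      # is providing the data voluntary or required?
    refusal_consequence: str      # consequence of refusing to provide the data
    safeguards: list              # steps ensuring confidentiality, integrity, quality

# Hypothetical example notice for a fictional retailer
notice = PrivacyNotice(
    collecting_entity="Example Retailer, Inc.",
    purposes=["order fulfillment", "fraud prevention"],
    recipients=["payment processor", "shipping carrier"],
    data_collected=["name", "shipping address", "payment card number"],
    provision_required=True,
    refusal_consequence="order cannot be processed",
    safeguards=["TLS in transit", "encryption at rest", "annual accuracy review"],
)

# Serialize for display or audit; every notice element is present up front.
print(json.dumps(asdict(notice), indent=2))
```

A structured notice like this can also feed a consent interface, so that each disclosed purpose maps directly to a choice the consumer can accept or decline.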
A proposed consumer privacy bill of rights from the U.S. government [OWH12] develops a more detailed list of principles, which is useful in understanding the range of requirements that should guide implementation of online privacy:
Individual control: Consumers have a right to exercise control over what personal data companies collect from them and how they use it. Companies should provide consumers appropriate control over the personal data that consumers share with others and over how companies collect, use, or disclose personal data. Companies should enable these choices by providing consumers with easily used and accessible mechanisms that reflect the scale, scope, and sensitivity of the personal data that they collect, use, or disclose, as well as the sensitivity of the uses they make of personal data. Companies should offer consumers clear and simple choices, presented at times and in ways that enable consumers to make meaningful decisions about personal data collection, use, and disclosure. Companies should offer consumers means to withdraw or limit consent that are as accessible and easily used as the methods for granting consent in the first place.
Transparency: Consumers have a right to easily understandable and accessible information about privacy and security practices. At times and in places that are most useful to enabling consumers to gain a meaningful understanding of privacy risks and the ability to exercise individual control, companies should provide clear descriptions of what personal data they collect, why they need the data, how they will use it, when they will delete the data or de-identify it, and whether and for what purposes they may share personal data with third parties.
Respect for context: Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data. Companies should limit their use and disclosure of personal data to purposes that are consistent with both the relationship they have with consumers and the context in which consumers originally disclosed the data, unless required by law to do otherwise. If companies use or disclose personal data for other purposes, they should provide heightened transparency and individual control by disclosing these other purposes in a manner that is prominent and easily actionable by consumers at the time of data collection. If, subsequent to collection, companies decide to use or disclose personal data for purposes that are inconsistent with the context in which the data was disclosed, they must provide heightened measures of transparency and individual choice. Finally, the age and familiarity with technology of consumers who engage with a company are important elements of context. Companies should fulfill the obligations under this principle in ways that are appropriate for the age and sophistication of consumers. Helen Nissenbaum’s work in this area is particularly interesting [NISS11].
Security: Consumers have a right to secure and responsible handling of personal data. Companies should assess the privacy and security risks associated with their personal data practices and maintain reasonable safeguards to control risks such as loss; unauthorized access, use, destruction, or modification; and improper disclosure.
Access and accuracy: Consumers have a right to access and correct personal data in usable formats in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate. Companies should use reasonable measures to ensure that they maintain accurate personal data. Companies also should provide consumers with reasonable access to personal data that they collect or maintain about them, as well as the appropriate means and opportunity to correct inaccurate data or request its deletion or use limitation. Companies that handle personal data should construe this principle in a manner consistent with freedom of expression and freedom of the press. In determining what measures they may use to maintain accuracy and to provide access, correction, deletion, or suppression capabilities to consumers, companies may also consider the scale, scope, and sensitivity of the personal data that they collect or maintain and the likelihood that its use may expose consumers to financial, physical, or other material harm.
Focused collection: Consumers have a right to reasonable limits on the personal data that companies collect and retain. Companies should collect only as much personal data as they need to accomplish purposes specified under the respect for context principle. Companies should securely dispose of or de-identify personal data when they no longer need it, unless they are under a legal obligation to do otherwise.
Accountability: Consumers have a right to have personal data handled by companies with appropriate measures in place to ensure that they adhere to the Consumer Privacy Bill of Rights. Companies should be accountable to enforcement authorities and consumers for adhering to these principles. Companies also should hold employees responsible for adhering to these principles. To achieve this end, companies should train their employees as appropriate to handle personal data consistently with these principles and regularly evaluate their performance in this regard. Where appropriate, companies should conduct full audits. Companies that disclose personal data to third parties should at a minimum ensure that the recipients are under enforceable contractual obligations to adhere to these principles, unless they are required by law to do otherwise.
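The individual control principle requires, among other things, that withdrawing consent be as accessible and easily used as granting it. A minimal sketch of per-purpose consent records, using hypothetical names, might look like this:

```python
from datetime import datetime, timezone

class ConsentRegistry:
    """Minimal sketch: per-purpose consent that is as easy to withdraw as to grant."""

    def __init__(self):
        # (user_id, purpose) -> (granted flag, timestamp of last change)
        self._consents = {}

    def grant(self, user_id, purpose):
        self._consents[(user_id, purpose)] = (True, datetime.now(timezone.utc))

    def withdraw(self, user_id, purpose):
        # Withdrawal is a single call, symmetric with grant().
        self._consents[(user_id, purpose)] = (False, datetime.now(timezone.utc))

    def is_permitted(self, user_id, purpose):
        # Default is no consent: data may not be used for an unapproved purpose.
        granted, _ = self._consents.get((user_id, purpose), (False, None))
        return granted

reg = ConsentRegistry()
reg.grant("u1", "marketing_email")
print(reg.is_permitted("u1", "marketing_email"))   # consent granted
reg.withdraw("u1", "marketing_email")
print(reg.is_permitted("u1", "marketing_email"))   # consent withdrawn
```

Note the design choice that absence of a record means no consent, which aligns with requiring an affirmative grant rather than treating silence as agreement.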
Online Privacy Framework
The FTC presents an online privacy framework that is a useful guide to best practices for implementing online privacy policies and mechanisms [FTC12]. The framework consists of three elements, as shown in Figure 8.5 and described in the list that follows:
Privacy by design: Build in privacy at every stage of product development.
Simplified choice for businesses and consumers: Give consumers the ability to make decisions about their data at a relevant time and in a relevant context, including through a “do not track” mechanism, while reducing the burden on businesses of providing unnecessary choices.
Greater transparency: Make information collection and use practices transparent.
FIGURE 8.5 FTC Online Privacy Framework
Privacy by Design
As described in Chapter 2, “Information Privacy Concepts,” privacy by design (PbD) dictates that privacy requirements be considered at the time of designing a new system, subsystem, application, or other component of the IT infrastructure of an organization. The intent is to design privacy engineering mechanisms and techniques that can be incorporated in a holistic fashion during the implementation and deployment of a system.
The PbD element of the online privacy framework defines two components. First, companies should incorporate substantive privacy protections into their actions, referred to as PbD principles. Second, companies should maintain comprehensive data management procedures throughout the life cycle of their products and services.
The relevant PbD principles are as follows:
Data security: Effective security for PII involves both management practices and technical controls. Organizations can obtain guidance in this area from a number of private sector sources, such as the Payment Card Industry Data Security Standard for payment card data, the SANS Institute’s security policy templates, and standards and best practices guidelines for the financial services industry provided by BITS, the technology policy division of the Financial Services Roundtable. Standards organizations, such as NIST and ISO, also provide useful guidance for all types of organizations; these sources are described in Chapter 3.
Reasonable collection limits: Companies should limit data collection to that which is consistent with the context of a particular transaction or the consumer’s relationship with the business, or as required or specifically authorized by law. For any data collection that is inconsistent with these contexts, companies should make appropriate disclosures to consumers at a relevant time and in a prominent manner—outside of a privacy policy or other legal document. The FTC cites one example of a company innovating around the concept of privacy by design through collection limitation [FTC12]. The Graduate Management Admission Council (GMAC) previously collected fingerprints from individuals taking the Graduate Management Admission Test. After concerns were raised about individuals’ fingerprints being cross-referenced against criminal databases, GMAC developed a palm vein recognition system that could be used solely for test-taking purposes [CLIN10]. GMAC found this system more stable over time than fingerprinting, more accurate than facial recognition, and less invasive than iris or retinal scanning. It is less susceptible to function creep over time than the taking of fingerprints because palm prints are not widely used as a common identifier.
Sound retention practices: Companies should implement reasonable restrictions on the retention of data and should dispose of the data when they have outlived the legitimate purpose for which they were collected. In some contexts, companies could retain data after de-identification.
Data accuracy: Companies should take reasonable steps to ensure the accuracy of the data they collect and maintain, particularly if such data could cause significant harm or be used to deny consumers services.
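The sound retention practices principle can be illustrated with a routine that drops records once they have outlived their retention period and de-identifies those that must be kept, for example for aggregate statistics. The retention periods, record types, and field names below are hypothetical, chosen only to make the sketch concrete:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-record-type retention periods
RETENTION = {
    "order_history": timedelta(days=365),
    "support_ticket": timedelta(days=90),
}

def apply_retention(records, now=None):
    """Keep records within their retention period; de-identify expired records
    flagged for aggregate use; securely dispose of (drop) the rest."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        limit = RETENTION.get(rec["type"])
        if limit is None or now - rec["collected"] <= limit:
            kept.append(rec)  # still within its retention period
        elif rec.get("keep_aggregate"):
            # Expired but needed in aggregate: strip the identifying fields
            kept.append(dict(rec, customer_id=None, name=None))
        # otherwise: expired and not needed -> dispose (not appended)
    return kept
```

In practice a job like this would run on a schedule, and "dispose" would mean secure deletion across primary storage and backups, not merely dropping an in-memory record.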
Procedural Protections
The other aspect of PbD in the privacy framework is procedural protections. In essence, this means that companies should maintain comprehensive data management procedures throughout the life cycle of their products and services.
To understand the scope that is intended by the term procedural protections, it is useful to look at a settlement between the FTC and Google [FTC11]. The privacy programs that the settlement mandates must, at a minimum, contain certain controls and procedures, including:
The designation of personnel responsible for the privacy program.
A risk assessment that identifies reasonably foreseeable privacy risks in each area of relevant operation, including but not limited to (1) employee training and management, including training on the requirements of the order, and (2) product design, development, and research.
The implementation of controls designed to address the risks identified, together with regular testing or monitoring of the effectiveness of those privacy controls.
Appropriate oversight of service providers.
Evaluation and adjustment of the privacy program in light of regular testing and monitoring.
A 2016 independent assessor’s report [PROM16] found that Google had implemented the mandated privacy program, including the following controls:
Privacy program staffing and subject matter expertise
Employee privacy training and awareness
Internal and external policies, procedures, and guidelines
Privacy risk assessment activities
Product launch reviews for privacy considerations
Privacy code audits
End user privacy tools and settings
Complaint and feedback processes and mechanisms
Periodic internal and external privacy program assessments
Coordination with, and support of, the Google information security program
Third-party service provider oversight
Incident reporting and response procedures
Of particular interest are the end user privacy tools and settings. Table 8.3 indicates the privacy settings, guides, and tools for users to control how Google collects, uses, and protects their data. The Google privacy program is a good example of a set of online privacy policies and procedures and can serve as a guide for other companies.
TABLE 8.3 Google End User Privacy Settings and Tools

| Setting/Tool Type | Name | Description |
|---|---|---|
| Account management tools | My Account | Serves as the central hub of security, privacy, and general account settings, tools, and guides for each user |
| | Dashboard | Provides an “at-a-glance” view of the user’s recent activity (e.g., how many documents and emails the user has saved) and lets the user manage product settings directly |
| | Activity Controls | Displays settings to manage, edit, and delete activity and data use associated with a user’s account, including the user’s searches and browsing activity |
| | Account Permissions for Connected Apps | Shows the applications and external websites connected to a user’s Google account and allows the user to manage those permissions and remove applications as desired |
| | Inactive Account Manager | Allows the user to choose what happens to account data if an account becomes inactive for a length of time that the user specifies, including deleting the account or nominating a trusted contact who may access the account data when the account becomes inactive |
| | Account and Service Deletion | Allows the user to delete certain Google products (e.g., Gmail) or delete the user’s entire Google account |
| Product settings | Ads Settings | Allows the user to control the types of ads received by adjusting the user’s interests and demographic details and removing unwanted ads or to opt out of personalized ads altogether |
| | Search Settings | Allows the user to control search settings such as the use of SafeSearch filters and whether private results are included in search results |
| | Analytics Opt-Out | Allows the user to control whether the user’s data will be used by Google Analytics |
| Privacy tools and guides | Privacy Checkup | Facilitates a walkthrough of a user’s products and services that allows the user to adjust privacy settings |
| | Product Privacy Guide | Contains links to articles with information about how Google’s products work and how a user can manage his or her data within products |
| | Incognito Mode | Allows use of Google’s Chrome browser without Chrome saving the pages viewed in Incognito windows |
| Security tools and guides | Security Checkup | Facilitates a walkthrough of a user’s products and services that allows the user to adjust security settings, including the user’s recovery information, recent security events, connected devices, and account permissions |
| | 2-Step Verification | Allows the user to enable a stronger security sign-on process for the user’s Google accounts that requires two forms of authentication (e.g., password and verification code) |
| | Device Activity and Notifications | Allows the user to review which devices have accessed the user’s accounts and control how to receive alerts if Google detects potentially suspicious activity |
| | Service Encryption | Provides information about service encryption, which is available for several Google products, including Search, Maps, YouTube, and Gmail |
| | Chrome Safe Browsing | Provides warning messages about websites that could contain malware, unwanted software, and phishing schemes designed to steal personal information |
| Data export | Download Your Data | Allows the user to download and export data from his or her Google accounts |
Simplified Consumer Choice
The FTC considers the handling of personal data to fall into two categories [FTC12]:
Practices that do not require choice: Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company’s relationship with the consumer or that are required or specifically authorized by law.
Practices that require choice: For practices requiring choice, companies should offer the choice at a time and in a context in which the consumer is deciding about his or her data. Companies should obtain affirmative express consent before (1) using consumer data in a materially different manner than claimed when the data was collected or (2) collecting sensitive data for certain purposes.
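The distinction between the two categories can be sketched as a simple decision function that gates a proposed use of consumer data. The materiality test below is a deliberate simplification of the FTC criterion, and all names are hypothetical:

```python
def requires_express_consent(use_purpose, collection_context, sensitive):
    """Return True if affirmative express consent is needed before this use.

    Simplification: a use is treated as 'materially different' if it was not
    among the purposes disclosed to the consumer at collection time, and any
    use of sensitive data is treated as requiring express consent.
    """
    materially_different = use_purpose not in collection_context["disclosed_purposes"]
    return materially_different or sensitive

# Purposes disclosed when the data was collected (hypothetical)
context = {"disclosed_purposes": {"order fulfillment", "fraud prevention"}}

print(requires_express_consent("order fulfillment", context, sensitive=False))  # consistent use
print(requires_express_consent("ad targeting", context, sensitive=False))       # new purpose
```

A real implementation would also have to encode what counts as a sensitive data category and handle uses that are required or specifically authorized by law, which this sketch omits.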
Transparency of Data Practices
Users need to be aware of the privacy risks inherent in sharing information with particular companies. The FTC lists three principles that should guide companies in providing customers and other users with privacy information [FTC12]:
Privacy notices: Privacy notices should be clear, short, and standardized to enable better comprehension and comparison of privacy practices.
Access: Companies should provide reasonable access to the consumer data they maintain; the extent of access should be proportionate to the sensitivity of the data and the nature of their use.
Consumer education: All stakeholders should expand their efforts to educate consumers about commercial data privacy practices.