
What are Privacy Impact Assessments?

Introduction

Privacy Impact Assessments (PIAs) are a tool used by privacy professionals to determine the risk associated with handling personal information in a business or government setting. The same exercise is often referred to as a Data Protection Impact Assessment (DPIA). Once the risks of handling personal data are understood, controls are selected to mitigate them. However, modifying a process after implementation typically requires more resources than assessing privacy handling early in the product or service lifecycle. As a result, Privacy Impact Assessments are often conducted while a product that will handle personal data is still being conceptualized.

Privacy Impact Assessments act as early warning systems for a company or government that may be facing privacy issues (Wright, 2012). They can also be triggered by business requirements; for example, a business may choose to perform a privacy impact assessment after onboarding a new third-party service provider (Densmore, 2022). Businesses may also determine that a changing legal landscape requires an existing assessment to be revisited or repeated. Under the GDPR, for instance, a business may discover that it must conduct a privacy impact assessment because it monitors a protected group (“Art. 35 GDPR – Data Protection Impact Assessment,” n.d.). Overall, it is critical to tailor the Privacy Impact Assessment to the situation at hand.

Definitions

When considering how a Privacy Impact Assessment will be implemented, it is necessary to consider the overall goal of the PIA. Returning to the requirements shows the different scenarios in which a Privacy Impact Assessment is required, and in some cases the requirement is enforced by law: Article 35 of the GDPR mandates a Data Protection Impact Assessment (Bisztray & Gruschka, 2019). The individual requirements of the United States and the European Union are compared in Table 1.

| United States Government Requirements (Privacy Impact Assessment (PIA) Guide, 2007) | GDPR Requirements (“Art. 35 GDPR – Data Protection Impact Assessment,” n.d.) |
| --- | --- |
| Required when initiating a new electronic collection of information in identifiable form for 10 or more persons, consistent with the Paperwork Reduction Act (PRA) | Required for any system involving the systematic and extensive evaluation of personal aspects relating to natural persons which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the natural person or similarly significantly affect the natural person |
| Required when developing or procuring any new technologies or systems that handle or collect personal information | Required when processing large-scale special categories of data referred to in GDPR Article 9(1), or personal data relating to criminal convictions and offences referred to in GDPR Article 10 |
| Required when issuing a new or updated rulemaking that affects personal information | Required when performing systematic monitoring of a publicly accessible area on a large scale |
| Required when a system which stores personal data has been categorized with system security controls as “High-Major” or “Moderate-Major” | N/A – no equivalent category |
| Required when developing revisions to a system which was previously the subject of a Privacy Impact Assessment | N/A – no equivalent category |

Table 1 – Comparison of US Government and GDPR PIA Decision Requirements

The European Union defines a Privacy Impact Assessment as “a process for assessing the impacts on privacy of a project, policy, program, service, product, or other initiative and, in consultation with stakeholders, for taking remediation actions as necessary in order to avoid or minimize the negative impacts.” This definition originates with Wright and De Hert (Wright & De Hert, 2012) and is used throughout the European Union’s privacy impact assessment framework (A Privacy Impact Assessment Framework for Data Protection and Privacy Rights, 2011).

The United States government, specifically the Securities and Exchange Commission, defines a Privacy Impact Assessment as “an analysis of how personally identifiable information is collected, stored, protected, shared and managed.” The SEC further describes the PIA as a means of maintaining review and records for information systems, with ownership of the PIA falling under the purview of the system owner (Privacy Impact Assessment (PIA) Guide, 2007). The United States government was directed to create PIA governance rules by the E-Government Act of 2002, which mandates that a PIA be performed whenever a new system collects, maintains, or disseminates information in identifiable form (CMS Privacy Impact Assessment (PIA) Handbook - CMS Information Security & Privacy Group, n.d.).

The United States National Institute of Standards and Technology (NIST) takes a different approach. NIST SP 800-60 defines a Privacy Impact Assessment as “an analysis of how information is handled: (i) to ensure handling conforms to applicable legal, regulatory, and policy requirements regarding privacy, (ii) to determine the risks and effects of collecting, maintaining and disseminating information in identifiable form in an electronic information system, and (iii) to examine and evaluate protections and alternative processes for handling information to mitigate potential privacy risks” (M-03-22, OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002, n.d.). This definition was originally provided in an Office of Management and Budget memorandum and was carried into NIST SP 800-60.


Methods

Overall, considering when a Privacy Impact Assessment is required helps drive toward an ideal definition of what the assessment is. To identify the most accurate definition, concepts from engineering design modeling can be applied to map the privacy requirements against each candidate definition. This ensures that the chosen definition meets most requirements for both United States and European Union use cases.

For this determination, the manufacturing design technique known as the “house of quality” was used. The house of quality uses a grid to map customer requirements to engineering requirements with a strong/medium/weak scoring system: strong correlations are scored with nine points, medium correlations with three points, and weak correlations with one point (Pahl et al., 2007).

For this analysis, the customer requirements have been replaced by the government requirements for when a Privacy Impact Assessment is required. To score a strong correlation, the definition must directly mention the associated government requirement. To score a medium correlation, the relationship between the definition and the requirement must be reasonably inferable from the language; for example, large-scale systematic monitoring can be inferred from the concept of analyzing how personally identifiable information is collected. A weak correlation is the base case, indicating minimal direct or indirect correlation.
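As a minimal illustration of this 9/3/1 scoring, the snippet below tallies a few hypothetical requirement-to-definition ratings; the requirement labels and the strong/medium/weak assignments are placeholders, not the actual matrix used to produce Figure 1.

```python
# Illustrative sketch of the house-of-quality scoring described above.
# Requirement labels and ratings are placeholders, not the real matrix.

SCORES = {"strong": 9, "medium": 3, "weak": 1}

# Government requirements (rows) rated against each candidate definition (columns).
ratings = {
    "US: new collection of identifiable information":  {"EU": "weak",   "SEC": "medium", "NIST": "strong"},
    "GDPR: large-scale special-category processing":    {"EU": "medium", "SEC": "weak",   "NIST": "medium"},
    "GDPR: systematic monitoring of public areas":      {"EU": "weak",   "SEC": "weak",   "NIST": "medium"},
}

def total_scores(ratings):
    """Sum the 9/3/1 scores for each definition across all requirements."""
    totals = {}
    for requirement, row in ratings.items():
        for definition, strength in row.items():
            totals[definition] = totals.get(definition, 0) + SCORES[strength]
    return totals

totals = total_scores(ratings)
print(totals)                       # per-definition totals
print(max(totals, key=totals.get))  # definition with the highest total
```

Summing each column and selecting the maximum mirrors how the highest-scoring definition is chosen in the analysis below.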

Figure 1 – House of Quality Applied to Privacy Scoring


Based on this chart, the definition of a privacy impact assessment provided by NIST best satisfies the combined requirements of the United States government and the European Union. Privacy impact assessments may also be necessary purely for business reasons, but with the understanding that the NIST definition maps most directly to the government requirements, a privacy impact assessment derived from NIST fundamentals can be created.

Recommendations

As NIST was determined to be the most successful at conveying the goals of a privacy impact assessment when mapped to each governmental requirement, a modified form of the recommendations of NIST SP 800-122 should be considered the primary resource when creating a privacy impact assessment (McCallister et al., 2010). Based on the information provided in NIST SP 800-122, six key questions run through the course of the assessment (a sketch of how their answers might be recorded follows the list):

1) What information is being collected?
2) Why is the information being collected?
3) What is the intended use of the information?
4) Who will receive the information?
5) How will the information be secured?
6) What choices have been made about maintaining the information as a result of conducting the Privacy Impact Assessment?
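As a hedged illustration, the sketch below shows one way the answers to these six questions could be captured as a single structured record; the class and field names are hypothetical and are not drawn from NIST SP 800-122.

```python
# A minimal sketch of a record capturing answers to the six questions above.
# Field names are illustrative, not taken from NIST SP 800-122.
from dataclasses import dataclass

@dataclass
class PIARecord:
    information_collected: str    # 1) what information is collected
    collection_purpose: str       # 2) why it is collected
    intended_use: str             # 3) how it will be used
    recipients: list[str]         # 4) who will receive it
    security_controls: list[str]  # 5) how it will be secured
    retention_decisions: str      # 6) maintenance choices resulting from the PIA

record = PIARecord(
    information_collected="Name, email address",
    collection_purpose="Account registration",
    intended_use="Authentication and support contact",
    recipients=["Internal support team"],
    security_controls=["Encryption at rest", "Role-based access"],
    retention_decisions="Delete 90 days after account closure",
)
print(record)
```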

NIST SP 800-122 recommends first conducting a Privacy Threshold Analysis (PTA) to act as an initial Privacy Impact Assessment. The PTA provides the information needed before any form of testing and before any significant change is made to a system covered by a PIA. It examines points of contact, the system description, the medium of data storage (electronic or paper), certification requirements, accreditation requirements, government national security classification, FISMA applicability, distribution methodology, the ability to identify and correlate specific individuals, and the population being targeted. The questions asked are typically binary selections, with several multiple-choice items, which allows the PTA to be completed quickly and accurately.
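The sketch below illustrates, under stated assumptions, how a PTA's binary questions might feed a simple threshold decision on whether a full PIA is needed; the question set is abbreviated and hypothetical rather than an official PTA template.

```python
# Simplified sketch of a Privacy Threshold Analysis decision.
# The question set is abbreviated and hypothetical; a real PTA covers the
# full list of items described above (points of contact, storage medium,
# classification, FISMA applicability, and so on).

PTA_QUESTIONS = {
    "collects_identifiable_information": "Does the system collect information in identifiable form?",
    "stores_electronically": "Is the information stored electronically?",
    "shares_externally": "Is the information distributed outside the organization?",
}

def pia_required(answers: dict[str, bool]) -> bool:
    """Trigger a full PIA if identifiable information is collected at all;
    the remaining answers would shape the scope of the later assessment."""
    return answers.get("collects_identifiable_information", False)

answers = {question: True for question in PTA_QUESTIONS}  # binary selections, per the text
print(pia_required(answers))                               # True -> proceed to a full PIA
```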

Once a PTA has indicated that a Privacy Impact Assessment is necessary, the questions asked during the Privacy Threshold Analysis should be expanded in granularity to include pertinent information about the system or process. For example, rather than asking only for the overall system classification level, it may be necessary to include a list of the individuals or organizations that will have access, with a justification letter attached for each group. Personally Identifiable Information should be identified in the data schema and assigned an impact level, allowing an improved judgement of how severe a breach may be in the event of a major security incident (McCallister et al., 2010).
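A minimal sketch of tagging PII elements in a schema with impact levels follows; the field names and level assignments are illustrative, and the low/moderate/high scale follows the general convention used in NIST guidance rather than a prescribed mapping.

```python
# Sketch: tag each PII element in a schema with an impact level so the
# severity of a potential breach can be judged per element. Fields and
# level assignments are illustrative examples only.

PII_SCHEMA = {
    "full_name":      "low",
    "email_address":  "low",
    "date_of_birth":  "moderate",
    "ssn":            "high",
    "medical_record": "high",
}

def highest_impact(schema: dict[str, str]) -> str:
    """Overall confidentiality impact is driven by the most sensitive element."""
    order = {"low": 0, "moderate": 1, "high": 2}
    return max(schema.values(), key=order.get)

print(highest_impact(PII_SCHEMA))  # "high"
```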

The Privacy Impact Assessment should provide an overview of PII confidentiality safeguards. It should reference the policies and procedures associated with handling PII in the context of the system or process being assessed. Similarly, the Privacy Impact Assessment should include any required training for individuals who will handle PII in the organization, including the safeguards they will be trained on.

In addition to safeguards, any provisions for anonymizing information or otherwise decoupling it from the data subject should be documented. Standard practices to account for in the Privacy Impact Assessment are removing data elements, delinking data elements, and substituting randomized values for data elements that are not required for system operation. NIST also recommends using average values when the actual value is not required.
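The following sketch applies those practices to a single illustrative record: removing an element, delinking an identifier with a random surrogate, dropping a non-essential element, and substituting an average value. The field names are hypothetical.

```python
# Sketch of the de-identification practices listed above. Field names are
# illustrative and not tied to any particular system.
import secrets

record = {
    "name": "Jane Doe",       # not required for operation -> remove
    "customer_id": "C-1042",  # linkable identifier -> replace with a random surrogate
    "zip_code": "30301",      # not required for operation -> remove
    "age": 37,                # exact value not required -> replace with a group average
}

def deidentify(record: dict, average_age: int) -> dict:
    out = dict(record)
    out.pop("name", None)                      # removal of a data element
    out["customer_id"] = secrets.token_hex(8)  # delinking via random surrogate key
    out.pop("zip_code", None)                  # drop another non-essential element
    out["age"] = average_age                   # average value instead of the actual value
    return out

print(deidentify(record, average_age=40))
```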

Conclusions

When considering privacy risk assessments, it is critical for leadership to determine the most cost-effective methodology. By mapping government requirements to framework definitions, it is possible to determine mathematically which framework is most likely to offer a globally applicable methodology. Because Privacy Impact Assessments are often the first sign of trouble in a data system whose safeguards are not adequately implemented, it is critical to implement the findings of the PIA as early as possible in the engineering lifecycle. By adopting the engineering methodologies discussed in this paper, a more rigorous and applicable assessment may be possible.

References

A Privacy Impact Assessment Framework for Data Protection and Privacy Rights, JLS/2009-2010/DAP/AG, European Commission Directorate General Justice (2011).

Art. 35 GDPR – Data protection impact assessment. (n.d.). General Data Protection Regulation (GDPR). Retrieved March 17, 2024, from https://gdpr-info.eu/art-35-gdpr/

Bisztray, T., & Gruschka, N. (2019). Privacy Impact Assessment: Comparing Methodologies with a Focus on Practicality. In A. Askarov, R. R. Hansen, & W. Rafnsson (Eds.), Secure IT Systems (Vol. 11875, pp. 3–19). Springer International Publishing. https://doi.org/10.1007/978-3-030-35055-0_1

CMS Privacy Impact Assessment (PIA) Handbook—CMS Information Security & Privacy Group. (n.d.). Retrieved March 17, 2024, from https://security.cms.gov/policy-guidance/cms-privacy-impact-assessment-pia-handbook

Densmore, R. (2022). Privacy Program Management (3rd ed.). IAPP.

M-03-22, OMB Guidance for Implementing the Privacy Provisions of the E-Government Act of 2002. (n.d.). Retrieved March 17, 2024, from https://georgewbush-whitehouse.archives.gov/omb/memoranda/m03-22.html

McCallister, E., Grance, T., & Scarfone, K. A. (2010). Guide to protecting the confidentiality of Personally Identifiable Information (PII) (NIST SP 800-122). National Institute of Standards and Technology. https://doi.org/10.6028/NIST.SP.800-122

Pahl, G., Beitz, W., Feldhusen, J., & Grote, K.-H. (2007). Engineering Design. Springer. https://doi.org/10.1007/978-1-84628-319-2

Privacy Impact Assessment (PIA) Guide. (2007). U.S. Securities and Exchange Commission.

Wright, D. (2012). The state of the art in privacy impact assessment. Computer Law & Security Review, 28(1), 54–61. https://doi.org/10.1016/j.clsr.2011.11.007

Wright, D., & De Hert, P. (2012). Introduction to Privacy Impact Assessment. In D. Wright & P. De Hert (Eds.), Privacy Impact Assessment (pp. 3–32). Springer Netherlands. https://doi.org/10.1007/978-94-007-2543-0_1