Personally Identifiable Information—Not The Same All Over The World


Personally Identifiable Information, or PII as we tend to shorten it, is defined dramatically differently around the world. Given the global focus on individual privacy rights and on security breaches involving personal information, I thought it would be good to discuss this critical topic.

Further on in this post, you’ll see references to different jurisdictions’ definitions of personal and sensitive information. I know it may seem like overkill to quote the definitions from a handful of regulations, but there are two points I want to highlight with this exercise, and seeing the definitions side by side should help you understand exactly why we need to focus on a broader definition of PII than we may be inclined to.


Before I get into the discussion, a note on scope: this post is about what should be considered personal information and what should be considered sensitive information.

First point: In the United States, we treat information that could be used for fraud or identity theft as personal data requiring special protection (or “sensitive” data). That is not how other countries view sensitive information. Outside the U.S., sensitive information is most often focused on areas of potential discrimination. In our work as background screeners, criminal record history is the type of information most often treated differently around the world with respect to privacy protections. In the U.S., criminal history is considered a public record. Outside the U.S., criminal history is often considered sensitive data.

Second point: Take a close look at the definition of basic personal information. Outside of the U.S., the focus is on information about an identified or identifiable individual. That is very broad. The EU has broadened it further to include both direct and indirect identification. This means that if you can piece together two separate pieces of information that on their own do not identify a person, but together do, you now have information about an identifiable individual. A great example is the case involving the Canadian Adverse Drug Reaction Information System (CADRIS) database [i]. It was found that disclosing the province in this database would allow the information to be combined with other pieces of information, including public information such as obituaries, which would “substantially increase the possibility” that individuals could be identified [ii].
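To make the linkage idea above concrete, here is a toy sketch (with entirely hypothetical data, not the actual CADRIS records) showing how two datasets that each look harmless on their own can identify a person when combined:

```python
# Toy illustration of re-identification by linkage (hypothetical data).
# Neither dataset alone names a person, but joining on shared attributes can.

# "De-identified" adverse-event records: no names, just province + birth year.
adverse_events = [
    {"province": "PE", "birth_year": 1948, "reaction": "rash"},
    {"province": "ON", "birth_year": 1985, "reaction": "nausea"},
]

# Public information (e.g., obituaries or directories): names alongside the
# same quasi-identifiers.
public_records = [
    {"name": "A. Smith", "province": "PE", "birth_year": 1948},
    {"name": "B. Jones", "province": "ON", "birth_year": 1985},
    {"name": "C. Brown", "province": "ON", "birth_year": 1985},
]

def link(events, records):
    """Return (name, reaction) pairs where the quasi-identifiers match
    exactly one public record -- i.e., the person is re-identified."""
    results = []
    for e in events:
        key = (e["province"], e["birth_year"])
        matches = [r for r in records
                   if (r["province"], r["birth_year"]) == key]
        if len(matches) == 1:  # a unique match means an identifiable individual
            results.append((matches[0]["name"], e["reaction"]))
    return results

print(link(adverse_events, public_records))  # [('A. Smith', 'rash')]
```

Note that the record from the large province (two candidates share the same quasi-identifiers) stays ambiguous, while the record from the small province is uniquely identified. That asymmetry is exactly why disclosing the province field mattered in the CADRIS case.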

As background screeners, we need to be extremely mindful of the information we collect. We need to know where it is stored and how easy it is to combine seemingly inconsequential pieces of information into identified information, and we need to make sure we are only collecting what we must collect, when we must collect it. (For those of you who attended the NAPBS Mid-Year conference a couple of years ago, you will remember the famous sash made by Vu to memorialize this concept.) If we are processing criminal history information on individuals who are not residents of the U.S., we must be mindful of the added protections required for this very sensitive information.

Summary of key definitions:

In the U.S., PII is defined differently depending upon the jurisdiction and the regulation. The main federal privacy regulation for background screeners in the U.S. is the Fair Credit Reporting Act (FCRA). Other U.S. regulations also cover privacy, such as GLBA, DPPA, COPPA, and HIPAA, but for now let’s stick with just the FCRA part of the alphabet soup. The FCRA does not define PII, but rather uses the term “consumer’s file,” defined as “all of the information on that consumer recorded and retained by a consumer reporting agency…” [iii]. Other U.S. regulations call out higher levels of protection for information that could be used for fraud, including identity theft, such as a person’s name plus their Social Security Number, driver’s license number, or financial account numbers. In some states, health insurance, medical, or biometric (think fingerprints) information must also receive these higher levels of protection. We think of this as Sensitive Personally Identifiable Information, or SPII.


Outside of the U.S., personal information is often very clearly defined. Here is how several jurisdictions define it:

  • Canada
    • Personal Information: “Information about an identifiable individual” [iv]
    • Sensitive Information: Not defined, although there is a mention in PIPEDA about information being sensitive based upon context. See 4.3.4 of PIPEDA. [v]
  • Mexico [vi]
    • Personal Information: “Any information concerning an identified or identifiable individual”.
    • Sensitive Information: “Personal data touching on the most private areas of the data owner’s life, or whose misuse might lead to discrimination or involve a serious risk for said data owner. In particular, sensitive data is considered that which may reveal items such as racial or ethnic origin; present and future health status; genetic information; religious, philosophical and moral beliefs; union membership; political views; or sexual preference.”
  • European Union – Directive 95/46/EC (“EU Directive”) [vii]
    • Personal Information: “Any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural, or social identity”.
    • Sensitive Information (called Special Categories of Information): Personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life. (Note that criminal history information is often included in the Special Category as well.)
  • European Union – GDPR [viii]
    • Personal Information: “Any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural, or social identity of that natural person.”
    • Sensitive Information: “Personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health, or data concerning a natural person’s sex life or sexual orientation shall be prohibited”. (As with the EU Directive, criminal history information is often bundled into this category.)



[ii] A summary of this is found in Leading by Example.

[iii] Fair Credit Reporting Act, § 603. Definitions; rules of construction [15 U.S.C. § 1681a]

[iv] Personal Information, accessed January 11, 2018

[v] Personal Information Protection and Electronic Documents Act

[vi] Federal Law on Protection of Personal Data Held by Private Parties

[vii] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data

[viii] GDPR


Kerstin Bagus – Director, Global Initiatives

Kerstin Bagus supports ClearStar’s Global Screening Program as its Director of Global Initiatives. She has more than 30 years of background screening industry experience, working for a variety of firms, large and small. Kerstin is one of the few individuals in the industry who is privacy-certified through the International Association of Privacy Professionals (IAPP) for Canada, the EU, and the U.S.

Kerstin is a passionate participant in the Professional Background Screening Association (PBSA, formerly NAPBS) and is a current member of the Board, in addition to participating on several committees. She also participates on IFDAT’s Legal Committee, with a primary focus on global data privacy.

At ClearStar, we are committed to your success. An important part of your employment screening program involves compliance with various laws and regulations, which is why we are providing information regarding screening requirements in certain countries, regions, etc. While we are happy to provide you with this information, it is your responsibility to comply with applicable laws and to understand how such information pertains to your employment screening program. The foregoing information is not offered as legal advice but is instead offered for informational purposes. ClearStar is not a law firm and does not offer legal advice, and this communication does not form an attorney-client relationship. The foregoing information is therefore not intended as a substitute for the legal advice of a lawyer knowledgeable of the user’s individual circumstances or to provide legal advice. ClearStar makes no assurances regarding the accuracy, completeness, or utility of the information contained in this publication. Legislative, regulatory, and case law developments regularly impact this area, and it is evolving rapidly. ClearStar expressly disclaims any warranties or responsibility for damages associated with or arising out of the information provided herein.

