Blog · 7 minutes read
August 7, 2023

The Privacy Problem with Biometric Data

Biometrics, the unique physical or behavioral characteristics of individuals, are used in sectors ranging from personal devices to government systems. While biometric data collection offers convenience and enhanced security, it also raises serious concerns about privacy. This article explores the privacy issues associated with the collection, storage, and use of biometric data, and highlights the limitations and risks of the systems built on it.


What is biometric data?

Biometric data refers to any measurable biological or behavioral characteristic of an individual that can be used to identify them uniquely. This can include physical characteristics such as fingerprints, facial features, iris patterns, and DNA samples, as well as behavioral characteristics such as voice patterns, gait analysis, and keystroke dynamics.

How are biometrics used?

Biometric characteristics are collected and stored in databases for identification or authentication purposes. This type of data is considered more secure than traditional identification and authentication methods, such as passwords or PINs, because biometric traits are unique to each individual and cannot be easily replicated or stolen... It's not impossible, but it's at least harder to do.

Authentication: Biometric authentication verifies the identity of an individual based on their unique biometric traits. It is commonly used in everyday devices and situations, such as smartphones, laptops, virtual assistants, and access control systems. Using biometrics allows individuals to unlock their devices or gain access to restricted areas by scanning their fingerprint, face, or iris, or using their voice.

Identification: Biometric identification aims to determine the identity of an individual by matching their biometric traits against a database of enrolled biometric templates. Law enforcement agencies employ this technology for forensic investigations or identifying suspects in criminal cases.

Here are some common examples of biometric data collection and uses:

1. Unlocking smartphones or other devices: Many smartphones these days allow users to unlock their devices using fingerprint or facial recognition.

2. Airport security: Many airports now use facial recognition to verify passengers' identities at various checkpoints.

3. Banking and financial services: Some banks use voice recognition or fingerprint scanning to provide an additional layer of security for accessing customer accounts.

4. Employee time and attendance tracking: Some companies use fingerprint scanning to track employee time and attendance.

5. Physical access control: For example, iris recognition can be used to control physical access to buildings, rooms, or other secure areas.

6. Health monitoring: Wearable devices that track health data, such as heart rate and sleep patterns, often use biometric data to collect this information.

7. Law enforcement: Fingerprints and DNA are often used by law enforcement agencies to identify suspects or solve crimes.

8. Border control: Facial recognition can be used to verify the identity of travelers at border crossings.

9. Voting systems: Some countries are exploring the use of biometric systems to help prevent voter fraud and ensure the integrity of the voting process.

10. Sports and fitness: Heart rate and motion tracking are often used by athletes and fitness enthusiasts to monitor their performance and progress.

How do biometric systems work?

Biometric systems involve an individual's biometric data being collected and converted into a template, a mathematical representation of their unique features. When verification or identification is required, the system compares the captured data with the templates in the database and produces a match or non-match result.

Here's a breakdown of the general processes involved:

  • Capture: The first step is to capture the biometric traits of the individual. This is done using specialized sensors or cameras designed to detect and capture specific data such as fingerprints, facial images, iris scans, voice patterns, or DNA samples. The captured data is then converted into a digital format that can be stored in a database.
  • Pre-processing: The captured data is pre-processed to remove any noise or distortions and to extract relevant features that can be used for comparison. This process involves various techniques such as filtering, segmentation, normalization, and feature extraction.
  • Storage: The pre-processed data is then stored in a secure database and is usually encrypted to protect against unauthorized access.
  • Comparison: When an individual attempts to authenticate or identify themselves (or be identified by a third party) using the biometric system, the stored data is compared to newly captured biometric data. This process involves comparing the relevant features extracted from both sets of data to determine if they match.
  • Decision: Based on the comparison result, the biometric system makes a decision whether to accept or reject the authentication or identification attempt. The decision is based on a predetermined threshold that determines the level of similarity required for a match to be considered valid.
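To make the comparison and decision steps more concrete, here is a minimal sketch in Python. It assumes feature extraction has already produced numeric template vectors; the cosine-similarity measure, the 0.85 threshold, and all names and values are illustrative assumptions, not a description of any particular vendor's system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two feature vectors, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(captured: np.ndarray, stored_template: np.ndarray,
           threshold: float = 0.85) -> bool:
    """1:1 authentication: does the fresh capture match the enrolled template?"""
    return cosine_similarity(captured, stored_template) >= threshold

def identify(captured: np.ndarray, database: dict, threshold: float = 0.85):
    """1:N identification: return the best-matching enrolled identity, if any."""
    best_id, best_score = None, threshold
    for user_id, template in database.items():
        score = cosine_similarity(captured, template)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id

# Enroll two users with made-up 4-dimensional templates.
db = {
    "alice": np.array([0.9, 0.1, 0.3, 0.7]),
    "bob":   np.array([0.2, 0.8, 0.6, 0.1]),
}
fresh_scan = np.array([0.88, 0.12, 0.31, 0.69])  # noisy re-capture of "alice"
print(verify(fresh_scan, db["alice"]))            # True  -> authentication succeeds
print(identify(fresh_scan, db))                   # alice -> identification succeeds
```

In a real deployment, the threshold controls the trade-off between false positives and false negatives discussed in the next section: set it too low and impostors slip through, set it too high and legitimate users get rejected.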

What are the limitations of biometric systems?

Biometric systems are not 100% accurate; false positives and false negatives can and do occur. These errors can result from various factors such as environmental conditions, aging, and changes in physical appearance.

But they can also result from biases “learned” by the AI systems behind them.

Biometric biases can lead to incorrect identification or exclusion, or cause individuals to pay more for certain services.

Let's take a look at some of the biases and examples.

What is biometric bias?

Biometric bias refers to the tendency of biometric systems to inaccurately recognize or authenticate individuals from certain demographics or population groups. While some of these instances may cause minor inconveniences, they can have a serious and negative impact on a person's life.

It's worth stating that it is not the biometric data itself, or necessarily the process of data collection, that is biased. AI technology cannot make judgments of its own accord.

The biases come from either the context in which the data is used or from the algorithms used to train the machine learning models.

For example, if the machine learning models are trained on a biased data set that is not representative of the general population, the algorithms may be less accurate when dealing with certain demographics.

Similarly, the technology may reinforce existing biases if biometric data is used in contexts that are biased against certain groups, such as in law enforcement or hiring processes.
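One way this shows up in practice is in per-group error rates. The sketch below uses entirely synthetic scores and a hypothetical match_score stand-in (both are assumptions for illustration, not real measurements); it shows how a single global threshold can produce much higher false match and false non-match rates for a group the model has effectively seen less of, which is essentially what audits of commercial systems measure.

```python
import random

random.seed(0)

def match_score(same_person: bool, group: str) -> float:
    """Stand-in for a real matcher (hypothetical). Scores for group B are
    noisier, mimicking a model trained on too few group-B examples."""
    noise = 0.05 if group == "A" else 0.20
    base = 0.9 if same_person else 0.3
    return random.gauss(base, noise)

THRESHOLD = 0.6  # the same decision threshold is applied to everyone

for group in ("A", "B"):
    impostor_trials = genuine_trials = false_matches = false_non_matches = 0
    for _ in range(10_000):
        same = random.random() < 0.5          # half genuine, half impostor pairs
        score = match_score(same, group)
        if same:
            genuine_trials += 1
            if score < THRESHOLD:
                false_non_matches += 1        # genuine user rejected
        else:
            impostor_trials += 1
            if score >= THRESHOLD:
                false_matches += 1            # wrong person accepted
    print(f"group {group}: "
          f"false match rate {false_matches / impostor_trials:.2%}, "
          f"false non-match rate {false_non_matches / genuine_trials:.2%}")
```

With these synthetic numbers, group A sees near-zero error rates while group B is both falsely accepted and falsely rejected several percent of the time, even though everyone is judged against the same threshold.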

Here are some ways in which biometric bias appears and affects individuals:

Racial bias

Facial recognition technology has been found to have higher error rates when it comes to recognizing individuals with darker skin tones or other physical features that differ from the algorithm's training data.

In 2018, a study found that Amazon's facial recognition software, Rekognition, misidentified 28 members of Congress as criminals, and 40% of those misidentified were people of color.

Voice recognition systems have also been found to have lower accuracy rates for people with accents or dialects that differ from the system's training data.

For example, a Stanford University study tested the speech recognition systems developed by five big-tech companies: Amazon, IBM, Google, Microsoft, and Apple.

It found that the error rates for all five systems were almost twice as high for Black speakers as for white speakers, even for speakers of the same gender saying the exact same words.

Additionally, wearable devices such as fitness trackers that use photoplethysmographic (PPG) green light signaling to collect biometric information have been found to be less accurate or unable to work at all for people with darker skin tones.

These devices can provide low heart rate alerts, help detect arrhythmia, and track sleeping patterns and pulse pressure. These limitations mean individuals with darker skin tones miss out on information that could benefit their long-term health and well-being.

Fingerprint recognition bias

Fingerprint recognition systems are among the most prevalent biometric systems since they are cheaper to implement and generally provide a high rate of accuracy. But this type of biometric data collection can exclude some members of society. For example, individuals whose fingerprints have been worn down by medical treatments, prolonged use of harsh chemicals, manual labor, or aging can be disproportionately affected.

Gender bias

Some biometric systems have been found to have gender bias, where they are less accurate in recognizing or authenticating individuals of one gender than another. Back in 2018, Amazon was discovered to be using an AI tool that downgraded resumes that mentioned “women”, as in “women's college” or “women's team”.

Most wearable fitness trackers have also been programmed to assess data based on white male users of a given fitness level. The further the wearer is from that baseline demographic, the less accurate their data is likely to be.

What are the privacy issues surrounding biometrics?

As well as the above bias issues, biometric data collection poses one major disadvantage that affects every single user, regardless of age, race, or gender... and that is the risk to privacy.

The sheer amount of personal and highly unique, identifiable data being collected via biometric systems has given rise to profound concerns regarding privacy and personal autonomy.

Let's take a look at some of the ways personal privacy is being attacked.

System vulnerabilities

Biometric systems are just as vulnerable to data breaches as any other data collection system, but the fallout from these can be much worse.

Unlike passwords or credit card numbers, biometric data is irreplaceable. Once compromised, it cannot be changed, putting individuals at long-term risk. Unauthorized access can lead to identity theft, fraud, and other forms of misuse.

Data quality

As we've already discussed, the accuracy and reliability of biometric data can vary due to factors like aging, injuries, or changes in physical features. False positives or false negatives mean people may be denied access to their own devices, accounts, or premises, or unauthorized individuals may gain access. This creates privacy concerns as it involves sharing personal information with third parties and potential exposure to unauthorized entities.

Data deletion

When individuals choose to discontinue their association with a system or when the purpose for which the biometric data was collected has been fulfilled, they should have the right to request the deletion of that information.

However, unlike other types of personal data, biometric information is inherently permanent. Once captured, collected, and stored, it can be a challenge to delete completely.

Even if a system deletes the stored biometric template, there may be copies or backups of the data that remain accessible. Data collected by one entity may have been shared with or stored by third parties.

Consent and covert collection

Obtaining informed consent from individuals for the collection and use of their biometric details is crucial. However, in some cases, consent may be implicit or not adequately informed, leading to concerns about its transparency and the autonomy of individuals.

Biometrics can be collected without an individual's knowledge or consent, posing serious risks to personal privacy. Covert collection methods, such as facial recognition in public spaces or keystroke logging, compromise an individual's privacy rights.

Profiling and tracking

When combined with other personal data, biometric information can enable the creation of detailed profiles about individuals. This profiling can lead to surveillance, targeted advertising, or discriminatory practices based on sensitive characteristics. The pervasive collection and analysis of such unique data can undermine individuals' right to privacy and autonomy.

Function creep

Biometric data collected for a specific purpose may be used for unrelated purposes without individuals' knowledge or consent. For example, a company collecting fingerprints for employee attendance tracking might re-purpose that data for criminal investigations or surveillance without the individuals' knowledge or permission. This can lead to a loss of control over one's personal information and a violation of privacy expectations.

Lack of regulation and standards

In many jurisdictions, the legal frameworks governing the collection, storage, and use of biometric information are still evolving or inadequate. This lack of comprehensive regulations and standards leaves gaps in privacy protection and accountability, making potential abuses or unauthorized access easier.

Summing up biometric data and privacy

The widespread adoption of biometric data collection has given rise to profound concerns regarding privacy. While some may argue that biometrics offer convenience and heightened security, the privacy risks they pose cannot be ignored.

The limitations and vulnerabilities of biometric systems, coupled with the potential misuse of collected data, create a dangerous landscape.

Once obtained, this data becomes an irrevocable part of an individual's identity.

The accuracy and reliability of biometric systems are far from perfect, while covert collection without explicit consent infringes upon the fundamental right to privacy.

Biometric data collection poses a severe threat to civil liberties, potentially resulting in mass surveillance and widespread discriminatory practices.

The lack of transparency and accountability in how the data is collected, stored, and shared further exacerbates these privacy concerns.

Robust legal and regulatory frameworks are urgently needed to ensure strict limitations on the collection and use of biometrics, as well as to establish clear rules for data retention and deletion.

It is our responsibility to challenge the unchecked expansion of biometric systems and advocate for stronger privacy safeguards. Only by doing so can we strive to strike a balance between technological progress and the preservation of our most fundamental right—the right to privacy.

READ MORE: Deepfake Phishing: What is it & How to Protect Yourself

Ruby M
Hoody Editorial Team

Ruby is a full-time writer covering everything from tech innovations to SaaS, Web 3, and blockchain technology. She is now turning her virtual pen to the world of data privacy and online anonymity.
