Bulletproof privacy in one click
Discover the world's #1 privacy solution
Back in August 2019, Google invited some of the biggest advertisers in the world to beta test their new Privacy Sandbox. At the time, they were still developing their main goals and mission statement, so the advertisers didn't necessarily know what they were getting into. But there were hints.
In the invitation, they mentioned “new standards that advance privacy.” This was a massive alarm bell to some in the advertising industry. Many marketing professionals felt that browser companies were just posturing so that users felt safe about their personal data, and then things would go back to normal. But a few of them braced for impact, knowing that a shakeup was inevitable given the amount of effort Apple, Firefox, and then Google were putting into these new initiatives.
Two years prior, Apple had landed a surprise jab with Intelligent Tracking Prevention (ITP) in Safari. ITP began to limit website tracking capabilities, and over time Apple tightened the noose. Over the next two years, client-side tracking became more and more restricted, until tracking cookies were forced to expire anywhere between one and seven days after being set, depending on the type.
But that was Safari, around 20% of the market. Not the 69% juggernaut that was Chrome.
With chaos in the air, online marketing firms were right to be worried. Google was setting up for a change that would shift the privacy conversation into a whole new realm. But what Google was proposing wasn't really privacy. It was simply a monopoly.
The aftereffects of this policy announcement would ripple throughout the industry, touching on everything from who is responsible for ad content, to the future of web analytics, to the bidding process for ads, to developing new technologies that might let advertisers circumvent these privacy measures in the future.
In this article, we'll talk about what browser cookies are, why they're important, how to enable Chrome cookies, how to clear Chrome cookies, what the third-party cookie ban means to massive marketers like Facebook Ads, how Google has dropped the ball on their own plans, and how users can protect themselves from the next big threat to privacy: Browser fingerprinting.
An HTTP cookie, also known as a browser cookie, is a small block of data that websites ask to insert into a computer's web browser. Sites that ask for a user to log in will often have a login cookie, for example, to track that user's individual actions and preferences on the site. Other cookies might try to save previously entered data and automatically use it to customize a website. The cookie usually matches the domain of the website that placed it there, the same one that is shown in the web browser's address bar. Those are called first-party cookies.
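To make the mechanism concrete, here is a minimal TypeScript sketch (with made-up cookie names, not any particular site's implementation). A site stores a cookie with a `Set-Cookie` response header such as `Set-Cookie: session_id=abc123; Path=/; Secure`, and the browser sends it back on every matching request in the `Cookie` header, which the server parses roughly like this:

```typescript
// Hypothetical sketch: turning a "Cookie" request header back into the
// name=value pairs a site previously stored with "Set-Cookie".
function parseCookieHeader(header: string): Record<string, string> {
  const jar: Record<string, string> = {};
  for (const pair of header.split(";")) {
    const idx = pair.indexOf("=");
    if (idx === -1) continue; // skip malformed fragments
    const name = pair.slice(0, idx).trim();
    const value = pair.slice(idx + 1).trim();
    if (name) jar[name] = value;
  }
  return jar;
}

// A first-party login cookie and a saved preference, exactly as the
// browser would send them back to the domain that set them:
const demo = parseCookieHeader("session_id=abc123; theme=dark");
console.log(demo.session_id); // "abc123"
```

The key point is in that last line: the browser only replays a cookie to the domain that matches it, which is what makes a first-party cookie relatively benign and a third-party cookie, replayed across every site embedding the same advertiser, a tracking device.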
Sadly, one of the more common kinds of browser cookies is tracking cookies, which are usually third-party cookies. A third-party cookie is inserted by a program or service that is embedded in a website… an advertiser for example. The issue with third-party cookies is that any other website that also uses that same advertiser can be tracked by the ad firm. So if a user visited three different sites all being served ads from advertisingscum.com, that marketing firm would be able to start creating profiles about the user's habits: Shopping, hobbies, politics, and other personal data.
And when those ad companies build up a big enough list, they make even more money by selling that information. Bigger ad companies will buy up all the lists from smaller companies to get a more complete picture of a user's buying habits. User data is also passed from company to company through acquisition, meaning the bigger marketing firms have even more personal information about the user.
Although Firefox, Brave, Tor, and Safari have all blocked third-party cookies by default, Chrome is taking a slower approach and developing its own privacy system to put into place before they block all third-party cookies…
This was Google's stated plan for 2022, but Google went back on its word, pushing the date to 2023, and then again to the first quarter of 2024. The plan now is to migrate just 1% of Chrome users to the Privacy Sandbox, with no firm timeline for rolling it out to everyone. Google has also scrapped the method it originally proposed to the industry. More on that later.
In short: For the moment, there are still third-party cookies in Chrome.
Since the Chrome browser is one of the few remaining that accepts third-party cookies, it's important to know how they are controlled. Ultimately, it is the end user who is responsible for managing their cookies. Whether that's through manual review and deletion, or via a cookie manager, is their decision.
To get to the cookie settings in Chrome, click on the three stacked dots on the far right of the address bar (or press ALT-E or ALT-F on the keyboard, assuming the website doesn't have another function for those keypresses). Then select 'Settings' from the menu.
Select 'Privacy and Security' from the menu on the left, then click on 'Cookies and other site data'. The result should look something like this.
In order to enable Chrome cookies, one of the top three menu items needs to be selected. But the top setting, 'Allow all cookies', is certainly not recommended.
The default behavior for cookies in Chrome is 'Block third-party cookies in Incognito', which allows advertising cookies unless a Chrome Incognito window is used. For most people concerned with their privacy on the web, selecting 'Block third-party cookies' is recommended. Any of these three cases will enable Chrome cookies for first parties, allowing normal website logins and autofill features.
If a user ever wants to see what happens when all cookies are turned off, 'Block all cookies' is the option that they want. It will functionally cripple many websites, authentications, add-ons, and features. But it is an adventure.
In order to view what cookies are currently set in the browser, assuming the option to enable Chrome cookies has been active in the past, click on 'See all cookies and site data'.
The options are fairly straightforward. Clicking the trashcan icon will clear Chrome cookies associated with that URL. Clicking 'Remove all' will clear Chrome cookies across the entire browser. This is a good way to wipe out all of the old third-party cookies before turning on the 'Block third-party cookies' setting for a fresh start.
When Google's new ad platform launches in 2024, blocking third-party cookies will become the default behavior in Chrome. Why is one of the biggest advertising companies in the world motivated to change its privacy platform?
Maybe it's because they're holding all of the cards.
To answer that question, we need to travel back in time just a little bit.
In early 2020, Google announced its intention to eliminate the use of third-party cookies in Chrome. Google's Justin Schuh said:
“After initial dialogue with the web community, we are confident that with continued iteration and feedback, privacy-preserving and open-standard mechanisms like the Privacy Sandbox can sustain a healthy, ad-supported web in a way that will render third-party cookies obsolete. Once these approaches have addressed the needs of users, publishers, and advertisers, and we have developed the tools to mitigate workarounds, we plan to phase out support for third-party cookies in Chrome. Our intention is to do this within two years.”
That put Google on a 2022 roadmap to implement its new privacy plan. Over the course of 2020, they gathered all of the feedback from their Privacy Sandbox and assembled a new system to replace third-party cookies. Then in January 2021, they announced the creation of FLoC.
Google's Chetna Bindra described the new system:
“Federated Learning of Cohorts (FLoC) proposes a new way for businesses to reach people with relevant content and ads by clustering large groups of people with similar interests. This approach effectively hides individuals “in the crowd” and uses on-device processing to keep a person's web history private on the browser.”
FLoC's open rollout was supposed to start in Q4 2021. FLoC was intended to hide individual user profiles (unless a user chooses to completely enable Chrome cookies from third parties, for some reason). The new system would have created groups of people with similar interests. Those groups, without revealing any individual information, would have been offered as an entire pool to interested advertisers.
That would have meant that no matter how many websites a user visits that use the same advertising agency, that ad firm couldn't have used third-party cookies to track the individual's buying and browsing habits.
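The cohort idea can be sketched in a few lines. FLoC's trial implementation reportedly used SimHash over the domains a user visited; the toy TypeScript below (an illustration, not Google's actual code) shows the core trick: users with mostly overlapping histories tend to hash into the same small cohort id, and only that id would be shared with advertisers.

```typescript
import { createHash } from "node:crypto";

// Toy SimHash-style cohort assignment over visited domains.
function cohortId(domains: string[], bits = 8): number {
  const votes = new Array(bits).fill(0);
  for (const domain of domains) {
    const h = createHash("sha256").update(domain).digest();
    for (let i = 0; i < bits; i++) {
      // Each domain votes +1 or -1 on each output bit of the cohort id.
      votes[i] += (h[i >> 3] >> (i & 7)) & 1 ? 1 : -1;
    }
  }
  // The majority vote on each bit becomes that bit of the cohort id, so
  // similar histories usually disagree on only a few bits.
  return votes.reduce((id, v, i) => (v > 0 ? id | (1 << i) : id), 0);
}

// Two users with heavily overlapping browsing land in nearby cohorts:
console.log(cohortId(["news.example", "shop.example", "kpop.example"]));
```

With only 8 bits there are just 256 cohorts, so each one contains a large crowd of users, which is the privacy property FLoC was selling.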
But Google lied about their deadline, as noted previously. Their new deadline became 'some time in 2023'. This gave advertisers another year to milk profits from anyone who hadn't turned off third-party cookies manually. And, as we also have seen, it's been pushed back even further.
Even worse, Google scrapped FLoC. What they were supposed to do was address all of the privacy concerns that their test users raised about the system. Instead, they rebranded FLoC as the Topics API and refused to close the most serious privacy issues that had been pointed out, having to do with browser fingerprinting (more on that later).
While FLoC would have treated big and small advertisers alike, Topics is clearly skewed to give larger advertisers an advantage, which only adds to Google's market share. And it addresses only the most minor of the privacy concerns pointed out by the community.
This was a betrayal of the advice and key analysis given to them freely by the privacy and security communities. Topics is a slap in the face, an ineffective compromise that only helps Google's stock price in the end.
Still, lying and predatory tactics aside, it's nice that Chrome will eventually end third-party cookie use for advertising.
But there are other ways, of course...
Browser fingerprinting is a way that advertisers, government organizations, and website owners can build a profile of a user, even without cookies in Chrome being enabled.
To many outside of the privacy or security industries, it might seem impossible for someone to figure out exactly who a user is without the use of cookies. But there's a real-world example that better explains how the process of fingerprinting works.
Take a million people. Now start listing every detectable feature about each person and tag them all in a database. With enough detail about their physical attributes, mannerisms, and habits, no two people will share the exact same list of features. Each becomes unique through the exactness of the terms that define them.
It's a clever trick. And it can be applied to the online world as well. The most basic explanation is that certain information and permissions that users commonly give to websites can be abused. For example, in order to adjust a website's look and feel for various screen sizes (mobile, tablet, laptop, desktop, etc.), the browser will tell the website the height and width of the browser window. It will share browser version and operating system information. It will share regional information, audio availability, and graphics capabilities.
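A fingerprinting script folds those individually harmless attributes into one near-unique identifier. The TypeScript sketch below is illustrative (the attribute values are made up), but the shape is the real one: canonicalize the attributes, hash them, and you have a stable ID that survives cookie clearing.

```typescript
import { createHash } from "node:crypto";

// Fold a set of browser attributes into a short, stable identifier.
function fingerprint(attributes: Record<string, string>): string {
  const canonical = Object.keys(attributes)
    .sort() // stable ordering so the same attributes always hash the same
    .map((k) => `${k}=${attributes[k]}`)
    .join("|");
  return createHash("sha256").update(canonical).digest("hex").slice(0, 16);
}

// Example attributes a script can read without any special permission:
const id = fingerprint({
  screen: "1920x1080",
  userAgent: "Mozilla/5.0 ... Chrome/120.0",
  timezone: "Europe/Berlin",
  language: "en-US",
  gpu: "ANGLE (NVIDIA GeForce RTX 3060)",
});
console.log(id); // same device, same id — on every visit, cookies or not
```

Changing any single attribute changes the hash completely, which is exactly why the anti-fingerprinting tools discussed later work by randomizing attributes rather than blocking them.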
The result? A good browser fingerprinting model can identify individual users with a 95% accuracy rate if they aren't blocking anything.
Anybody without some kind of browser fingerprinting protection who isn't convinced should visit AmIUnique: 9 out of 10 visitors who aren't blocking anything will have a unique set of attributes.
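The arithmetic behind that uniqueness is simple: each attribute carries some number of "bits of identifying information," and the bits add up. The figures below are rough, in the spirit of the EFF's Panopticlick research, not exact measurements:

```typescript
// Approximate bits of identifying information per attribute (illustrative
// values loosely based on the EFF's "How Unique Is Your Web Browser?" study).
const bitsPerAttribute: Record<string, number> = {
  plugins: 15.4,
  fonts: 13.9,
  userAgent: 10.0,
  screen: 4.8,
  timezone: 3.0,
};

// Surprisal adds up across roughly independent attributes:
const totalBits = Object.values(bitsPerAttribute).reduce((a, b) => a + b, 0);

// 2^totalBits is roughly how many users the combination can tell apart.
// About 33 bits already singles out one person among the ~8.6 billion
// people on Earth, and five attributes blow well past that.
const distinguishable = Math.pow(2, totalBits);
console.log(totalBits.toFixed(1), distinguishable > 8.6e9);
```

This is why blocking one or two signals accomplishes little: the remaining attributes still carry far more than enough bits to identify a device.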
So even if a user doesn't enable Chrome cookies, or even if they clear Chrome cookies after every session, they can be tracked. And quite easily, in many cases.
One of the biggest differences between cookies in Chrome and browser fingerprinting is consent. There are laws in many nations that require websites to inform users of their cookie policy and give active consent. That's not true of browser fingerprinting. In most countries, organizations can use browser fingerprints without consent. They can collect, sell, and share fingerprint data in an unlimited fashion.
The Electronic Frontier Foundation is always pushing for stronger anti-fingerprinting legislation. They gave a detailed commentary on the GDPR and its impact on fingerprinting methods used in the EU. But in most countries, laws lag far behind advances in tech. And of course paid lobbyists often have something to say about fair and sane privacy laws, particularly if the protection of individuals will lighten their bosses' wallets.
Most fingerprinting information is gathered with scripts on websites, though some use browser add-ons and helper apps launched from a website.
There are three main methods used to collect a browser fingerprint: script-based attribute collection (querying the browser for screen size, fonts, time zone, and the like), Canvas and WebGL fingerprinting (rendering hidden graphics and hashing the output, which varies with the GPU and drivers), and audio fingerprinting (measuring how the device's audio stack processes a generated signal).
And there are dozens of other parameters that can be used if you have no browser fingerprinting protection. It's a hole wide enough to drive a virtual truck through.
Once a third party has a fingerprint of a user, they can track them from site to site. Even if their technique fails 5% of the time, it doesn't matter. 19 out of 20 times, they have valuable personal information that can then be linked to shopping habits, politics, social media, friends, groups, family, travel habits, and much more.
Among other things, social media sites use browser fingerprinting to better target their ads. By offering features such as tight geolocation, hobby identification, and lifestyle analysis to advertisers, they can sell a heck of a lot more ads. The more invasive they are, the better their conversion rate.
To fully understand how important the lack of privacy is to big social media sites, one only needs to take a look at Facebook.
In April of 2021, Apple announced its App Tracking Transparency (ATT) system. This software enhancement made apps on Apple devices ask users for explicit permission before tracking them across other companies' apps and websites; unless the user opts in, the app cannot share their data with advertisers.
Facebook responded with venom. They took out a number of big national ad campaigns, including full-page newspaper ads in major publications, accusing Apple of trying to 'take control of the Internet'. They said it was a monopolistic practice that would set back progress on the Internet by decades and destroy small businesses.
Apple's head of software engineering, Craig Federighi, responded:
“It's already clear that some companies are going to do everything they can to stop the App Tracking Transparency feature I described earlier - or any innovation like it - and to maintain their unfettered access to people's data.”
In regard to Facebook's tantrum, the Electronic Frontier Foundation (EFF) was far more outspoken. They said:
“Facebook touts itself in this case as protecting small businesses, and that couldn't be further from the truth. Facebook has locked them into a situation in which they are forced to be sneaky and adverse to their own customers. The answer cannot be to defend that broken system at the cost of their own users' privacy and control.
To begin with, we shouldn't allow companies to violate our fundamental human rights, even if it's better for their bottom line. Stripped of its shiny PR language, that is what Facebook is complaining about. If businesses want our attention and money, they need to do so by respecting our rights, including our right to privacy and control over our data.”
One thing that Facebook didn't mention in their press releases is that they were already using techniques that would circumvent ATT's effectiveness. In fact, even without using browser cookies at all, they could target ads tightly enough to hit individual cities on a global scale.
To demonstrate this, the encrypted Signal chat app decided to do a little experiment. In May 2021, they took out ads that simply told users their own demographics on Facebook's Instagram service. The specificity of the results was staggering. One shocked user received the message:
“You got this ad because you're a K-Pop-loving chemical engineer. This ad used your location to see that you were in Berlin. And you have a new baby. And just moved. And you're really feeling those pregnancy exercises lately.”
Signal's advertising account was quickly banned from Instagram, but not before showing off just how much data Facebook had collected on hundreds of millions, if not billions, of individuals. It's highly unlikely that browser cookies and ad agency databases are the only resources that they had at their disposal. More than likely, browser fingerprinting played a major role in Facebook's user surveillance.
And they aren't alone. At the major online advertising and marketing agencies around the globe, there is a race to develop the next machine learning or deep learning algorithm to analyze browser fingerprints. Defeating privacy measures is big business. Manipulating the system to get even an extra 1% advantage means billions more in ad revenue.
But ad revenue is just half the battle. Accurate user information is the bread and butter of data brokers, who create offerings for both business-to-business and business-to-client customers. What's on sale is people's private information, plain and simple. And that includes their browser fingerprints. Some of these companies operate legally, others quasi-legally, and many are completely illegal in their countries of origin. Illegal data brokers can only be found on the Dark Web, and only accept untraceable payment methods.
Also on sale from data brokers are users' E-mail addresses, physical addresses, phone numbers, and much more. These days there's no concept of exclusivity, at least at the bottom rungs of the identity-sales ladder. Whether this data is legally obtained or is blatantly stolen, after it has been washed through a couple of company names and acquired by a major player, pedigree hardly matters.
Going back to Facebook for a moment, the list of companies they've purchased is staggering. Many of those companies were search, consumer, or data-oriented in nature. When someone acquires a company, they also acquire their user data… which of course can be used to verify, cross reference, and enhance browser fingerprinting.
As some may have already guessed, although certain data points in Incognito mode are set to the most generic levels, that by itself isn't enough to evade browser fingerprinting. Even with Chrome cookies and form data completely cleared, and even if all extensions and external helper apps are disabled, the user is still using the same device. So video and audio fingerprinting remain unchanged, and it's quite likely that their fingerprint will remain unique.
This holds true of most 'safe browsing' modes found in modern browsers. Unless they specifically state that they have anti-fingerprinting tech built in, assume that they do next to nothing to protect the user from the kind of audio and video profiling that makes browser fingerprinting so dangerous.
Simply put, they need to use an anti-fingerprinting app. That might be a locally running app that randomizes all non-essential system information before handing it over to a querying script. Or, more appropriately, it might be an app that entirely disguises all aspects of your personal connection, because the real connection is being made by a remote virtual machine.
One example of this is the Hoody app. Hoody is a privacy tool that uses private encrypted networks and browser virtualization to your advantage.
Among its many privacy features is the option to spoof your location, one of the major signals advertisers use to determine whether a user is in the appropriate market. By shifting the apparent source of the query through major cities in four or five of the most populated countries in the world, the 'uniqueness' of the browser fingerprint starts to drop. And it certainly won't be correlated with your real location.
Another Hoody feature is falsifying the operating system and associated user agent (browser type), reporting a different version than the one you're using, or a different OS altogether. Each of your tabs and websites gets a new IP address, a new location, and a unique set of fingerprints, making tracking impossible.
This is what makes Hoody different from other similar tools. For example, VPNs can hide your IP address but they can't do anything about your browser or device fingerprints.
The other difference between Hoody and a VPN is that Hoody's servers keep no logs. In fact, they have no hard drives. And they have no other permanent writable storage… everything is kept in memory, which self-destructs if anyone tries to tamper with it.
These anti-fingerprinting methods throw off the uniqueness numbers. And the more people who adopt Hoody, the fewer people advertisers can use as 'typical' cases.
On websites where such attributes aren't critical, things like time zone, navigator properties, Java enabling, Flash legacy enabling, and visible menus and status bars can all be randomized. These are even more attributes that websites use to lock down unique identities through browser fingerprinting.
But perhaps most importantly, Hoody feeds a false set of audio and video driver information to the probing website. This includes WebGL data and parameters, Canvas data, screen and browser sizes, acceptable audio formats and codecs, audio frequency and context information, and media device specifics. Such scrambling, as long as it preserves the core functionality of the website, makes fingerprinting impossible. And every single tab is uncorrelated with the last, making it a trick that works again and again.
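The randomization idea described above can be sketched in a few lines of TypeScript. This is a toy illustration of the technique, not Hoody's actual implementation: non-essential attributes are overwritten with random but plausible values each session, so successive fingerprints of the same device no longer correlate.

```typescript
// Toy sketch of per-session attribute randomization (illustrative only).
function randomizeAttributes(
  real: Record<string, string>,
): Record<string, string> {
  const pick = (xs: string[]) => xs[Math.floor(Math.random() * xs.length)];
  return {
    ...real, // keep attributes the site genuinely needs
    timezone: pick(["America/New_York", "Europe/Berlin", "Asia/Tokyo"]),
    screen: pick(["1366x768", "1920x1080", "2560x1440"]),
  };
}

// The same device now reports a different plausible identity each session:
const spoofed = randomizeAttributes({
  userAgent: "Mozilla/5.0 ... Chrome/120.0",
  timezone: "Europe/Paris",
  screen: "3440x1440",
});
```

The design point is that randomized values must stay plausible and internally consistent; an impossible combination (say, a 320x200 screen reporting WebGL 2.0) is itself a fingerprint.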
Of the most popular web browsers, Firefox is probably the one with the best default browser fingerprinting protection. That's not to say it's good: as of Q4 2023, no web browser is great at preventing fingerprinting without a privacy app helping it out.
But Firefox is certainly better than Chrome, Safari, or Edge. At least it maintains a list of sites that use fingerprinting regularly and modifies its responses appropriately. Firefox uses a herd methodology: make everyone look the same, and nobody can be singled out. This would work better if Firefox were far more popular, of course, but it comes in a distant fourth in browser adoption. Sadly, that makes the herd too small to be very effective.
Tor and Brave are both better alternatives as far as browser fingerprinting protection is concerned since they both have modes that spoof and scramble critical information by default. But fingerprinting is an all-or-nothing proposition.
Sure, Brave is probably better out of the box than the other browsers (although website compatibility can be an issue with more restrictive policies). Tor may require a bit more configuration, but can achieve levels of fingerprint protection similar to Brave with fewer compatibility issues… but it still isn't perfect, and perfection is required here.
With all that said, the truth is that most sites use JavaScript heavily, and browser settings that mess with the way JavaScript functions can cripple web browsing. And even the best web browsers have pretty big gaps in their fingerprinting protection.
That makes the Hoody app one of the only reliable options.
One of the best ways to test browser fingerprinting security is a site provided by the EFF. Cover Your Tracks is a test that attempts to probe the privacy limits of a user's browser. Very few pass with flying colors. The best performers are generally, as one might expect, Tor and Brave. But the right combination of privacy tools and add-ons can produce stellar results as well.
The Device Info site is fast, simple, and brutally effective. In addition to all of the other functions that it serves, it offers a simple 'true or false' answer as to whether or not a user's browser has Canvas, Audio, or generic fingerprint resistance.
The world of browser privacy is very slowly creeping towards a more positive light. The elimination of third-party cookies is a good start. Although Google screwed us all with their planned and missed 2023 implementation of default third-party cookie blocking, at least they provide easy ways to clear Chrome cookies and then disable third-party cookies in their settings.
But mass surveillance is getting more common in the world, with India quickly joining the likes of China for both facial recognition and trying to destroy encryption. They, just like every major government, will continue to push the envelope in order to track their citizens.
Browser fingerprinting is becoming more and more sophisticated. There's no doubt that artificial intelligence (AI) or machine learning (ML) driven analysis will make fingerprinting an even more effective technique as time goes by. It is the next browser privacy battleground, without a doubt.
The future of browser privacy is a hopeful one. People are becoming more aware of the topic, which is step one. Step two is to adopt the most powerful encryption and browser protection available. And step three is to change our attitudes towards privacy, and make mass surveillance unacceptable in any form.
Learn more about browser fingerprinting here: What Is Browser Fingerprinting and Why You'll Want to Stop It
Will is a former Silicon Valley sysadmin and award-winning non-functional tester. After 20+ years in tech, he decided to share his experience with the world as a writer. His recent work involves documenting government hacking methods while probing the current state of privacy and security on the Internet.