
Browser fingerprinting and investigative research

Browser fingerprinting is a tracking technique in which a remote service collects data from a browser to build a fingerprint that may be unique among all visitors. Sophisticated techniques can even read information about the state of a device's processor, screen, graphics, and audio hardware.
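To make the mechanism concrete, here is a minimal sketch of how a fingerprinting service might collapse collected signals into one stable identifier. The field names are illustrative, not any specific tracker's schema, and real systems gather far more surfaces than this:

```python
import hashlib
import json

def fingerprint(signals: dict) -> str:
    """Collapse collected browser signals into one stable identifier.

    `signals` stands in for values a page script could read:
    navigator properties, screen geometry, canvas/audio output
    hashes, time zone, and so on.
    """
    canonical = json.dumps(signals, sort_keys=True)  # key-order independent
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

visitor = {
    "userAgent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "2560x1440x24",
    "timezone": "Europe/Kyiv",
    "canvasHash": "a91f03",
    "audioHash": "77c2be",
}
print(fingerprint(visitor))  # same signals -> same ID on every visit
```

The point is that no cookie is needed: as long as the collected signals stay stable, the derived ID stays stable across visits.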

Research teams should assume the sites they monitor can watch them back.

That is not a theoretical concern. The Berkeley Protocol on Digital Open Source Investigations, developed by the Human Rights Center at UC Berkeley and the UN Office of the High Commissioner for Human Rights, tells investigators to assume they are being monitored and to work inside technical systems or environments that limit exposure to cyberthreats.

For research organizations investigating governments, corporations, armed groups, or sanctioned networks, the problem is operational rather than abstract. Public-facing portals, procurement systems, company sites, media properties, and campaign pages can all log recurring visitors and tie them together across weeks of work.

Current security practices cover some risks well, but not this one.

The practical gap is well documented. Research teams often use Tor, VPNs, hardened browsers, and encrypted messaging for good reasons. Those tools help, but they do not eliminate the browser fingerprint that remote sites can still observe.

  • VPNs: They replace the outward network location, but the browser and device fingerprint can stay stable across visits and still be correlated over time.
  • Tor and hardened browsers: They reduce important tracking surfaces, but browser fingerprinting remains notoriously hard to combat completely and the use of Tor itself can be a visible signal.
  • Private browsing: It limits local storage and cookie persistence, but it does not remove the active fingerprint surfaces that remote services collect on each request.
  • Why the gap matters: Research teams often assume that IP privacy and encrypted communications are enough. They are not enough when the site being monitored can still recognize the browser profile itself.
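The correlation problem behind these bullet points can be sketched in a few lines. Assume a hypothetical visit log kept by a monitored site, recording source IP and fingerprint ID per request; the IPs below are documentation addresses, and the fingerprint IDs are made up:

```python
from collections import defaultdict

# Hypothetical visit log: (source IP, fingerprint ID) pairs.
# The IPs change (VPN exits, different networks); the fingerprint does not.
visits = [
    ("203.0.113.7",  "fp-a1b2"),
    ("198.51.100.4", "fp-a1b2"),
    ("192.0.2.99",   "fp-a1b2"),
    ("192.0.2.99",   "fp-ffee"),
]

by_fingerprint = defaultdict(set)
for ip, fp in visits:
    by_fingerprint[fp].add(ip)

# Three "different" visitors by IP collapse into one returning device.
print({fp: sorted(ips) for fp, ips in by_fingerprint.items()})
```

Grouping by fingerprint instead of IP is exactly why rotating the network location alone does not break linkability.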

404 addresses the fingerprinting layer by substituting one coherent identity across the full stack.

In plain language, 404 runs locally and replaces the browser identity that remote sites see. It does that across TLS, headers, JavaScript-visible browser surfaces, and network packet values, so the browser is not leaking a patchwork of host-specific defaults.

The coherence matters. Randomized spoofing often creates outliers that are easier to notice because the layers do not agree with each other. 404 is built around profile-driven substitution instead of noise, which makes the resulting identity look internally consistent over time.
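A toy consistency check illustrates why per-field randomization backfires. The profile values below are simplified stand-ins (real fingerprint surfaces are much richer), and the detector logic is a sketch of the kind of cross-layer comparison a site could run, not 404's or any real product's implementation:

```python
# Illustrative profiles; every layer's value comes from one template.
WINDOWS_CHROME = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ...",
    "platform": "Win32",   # navigator.platform
    "ttl": 128,            # Windows default initial TTL
}

def profile_substitution(profile: dict) -> dict:
    """Profile-driven substitution: all surfaces read one template."""
    return dict(profile)

def is_consistent(identity: dict) -> bool:
    """A detector's view: do the OS hints at each layer tell one story?"""
    claims_windows = "Windows" in identity["user_agent"]
    expected_platform = "Win32" if claims_windows else "Linux x86_64"
    expected_ttl = 128 if claims_windows else 64
    return (identity["platform"] == expected_platform
            and identity["ttl"] == expected_ttl)

# Substitution from one profile stays coherent:
print(is_consistent(profile_substitution(WINDOWS_CHROME)))  # True

# Independent randomization can mix layers that contradict each other:
noisy = {
    "user_agent": WINDOWS_CHROME["user_agent"],  # claims Windows...
    "platform": "Linux x86_64",                  # ...but leaks Linux
    "ttl": 64,
}
print(is_consistent(noisy))  # False
```

A randomized identity that fails checks like this is more conspicuous than an unmodified one, which is the argument for substitution over noise.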

One coherent profile, not random noise

404 applies a full browser identity that stays internally consistent. That matters because random inconsistencies are themselves detection signals.

TLS and HTTP alignment

The handshake, headers, ordering, and client hints are rewritten to match the selected profile instead of leaking the host machine defaults.

Browser-surface control

Navigator properties, screen values, graphics output, audio behavior, and other JavaScript-visible signals are normalized around the same identity.

Network packet rewrite

On Linux, 404 also rewrites packet-level values such as TTL, MSS, and window size through eBPF so the lower network layer tells the same story as the browser layer.
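To see why packet-level values matter, consider the classic passive inference on TTL (the technique used by tools such as p0f). Operating systems ship different initial TTLs, and each router hop decrements the value by one, so an observer can round an arriving TTL up to the nearest common default and guess the sender's OS. A minimal sketch of that inference:

```python
def guess_initial_ttl(observed_ttl: int) -> int:
    """Round an observed TTL up to the nearest common initial value.

    Common defaults: 64 (Linux/macOS), 128 (Windows), 255 (some
    network equipment). The observed value sits a few hops below
    the sender's OS default.
    """
    for initial in (64, 128, 255):
        if observed_ttl <= initial:
            return initial
    return 255

# A packet arriving with TTL 54 most likely left a host that started
# at 64 (Linux/macOS, ten hops away); TTL 116 points at a 128 start
# (Windows, twelve hops away).
print(guess_initial_ttl(54), guess_initial_ttl(116))  # 64 128
```

If the browser layer claims one operating system while the packet layer leaks another via TTL, MSS, or window size, the layers disagree; rewriting those values keeps the lower layer aligned with the presented profile.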

404 is designed to reduce external linkability at the web research layer. It is not a substitute for source protection workflows, compartmentalization, or the rest of an investigative security program.

What repeated monitoring looks like with and without fingerprint protection

Consider a researcher at an organization such as Bellingcat or Truth Hounds revisiting the same government portal over several weeks while building a case file. Without protection, that site can see a recurring browser identity even if the researcher changes networks or clears local state. The analytics view is not just another anonymous visitor. It is one stable device returning again and again.

With 404 running, the site still sees a browser, but it sees the profile 404 is presenting instead of the host machine's native stack. The goal is not invisibility. The goal is to prevent repeated visits from resolving into a clean, host-specific fingerprint that can be linked back to one investigator or one institution over time.