WebGL Shader Precision Variability: How Float vs. Highp Impacts GPU Drivers

A technical breakdown of how shader calculation accuracy depends on drivers and hardware.

Introduction: Precision That Gives It All Away

You've carefully configured the WebGL renderer in Dolphin Anty. You've installed ANGLE (Intel, D3D11). You're confident: "Now my profile is perfect."

But you're blocked instantly.
The reason? Not the renderer string, but the precision of the shader calculations.

When a website runs a WebGL shader with a highp float instruction, your GPU returns a value whose precision depends on:
  • the GPU model,
  • the driver version,
  • the operating system.

It's this micro-variability that creates a unique fingerprint, one that's impossible to fake without the matching hardware.

In this article, we'll take an in-depth technical look at how WebGL Shader Precision works, why it leaks driver information, and how even a single bit can reveal your hardware.

Part 1: What is Shader Precision in WebGL?

🧮 Technical definition

In WebGL, shaders use three levels of precision for floating-point numbers:
Qualifier | Minimum precision | Typical usage
lowp | 8 bits | Colors, simple operations
mediump | 10 bits | Textures, lighting
highp | 16+ bits | Complex calculations, physics

💡 Key fact:
Actual precision depends on the GPU and drivers; the specification only guarantees the minimums above.
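
To make the qualifiers concrete, here is a small illustrative fragment shader embedded the way WebGL code usually carries it, as a JS string (the variable names are invented for this example):

js:
Code:
// Illustrative only: a fragment shader source mixing all three qualifiers.
const fsSrc = `
    precision mediump float;   // default float precision in this shader
    varying lowp vec4 vColor;  // 8-bit minimum is enough for colors
    uniform highp float uTime; // 16+ bit minimum: positions, physics
    uniform sampler2D uTex;

    void main() {
        gl_FragColor = vColor * texture2D(uTex, vec2(fract(uTime), 0.5));
    }
`;
// Note: many desktop GPUs silently promote lowp/mediump to full highp,
// and whether that happens is itself driver-dependent behavior.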

Part 2: How Calculation Accuracy Depends on Hardware

🔬 GPU Accuracy Chart (2026)

GPU / OS | highp float precision | Reason
Intel UHD 620 + Windows 10 | 23 bits (IEEE 754 single) | Full highp support
NVIDIA GTX 1650 + Windows 11 | 24 bits | Extended driver precision
AMD Radeon RX 6600 + Linux | 16 bits | Limited highp support
Apple M1 + macOS Sonoma | 23 bits | Metal backend, IEEE 754

💀 Example of an anomaly:
You claim Intel UHD 620, but the measured precision is 16 bits → the system concludes "It's AMD on Linux" → fraud score 95+.
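
To illustrate that logic, here is a toy scoring rule in the spirit of the anomaly above (the threshold and score values are invented for illustration; this is not a real anti-fraud model):

js:
Code:
// Toy example: claimed renderer vs. measured precision.
function toyFraudScore(claimedRenderer, measuredPrecisionBits) {
    const claimsIntel = /Intel/i.test(claimedRenderer);
    if (claimsIntel && measuredPrecisionBits < 23) {
        return 95; // contradiction: Intel on Windows should report 23 bits
    }
    return 10; // no obvious contradiction
}

console.log(toyFraudScore('Intel(R) UHD Graphics 620', 16)); // 95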

Part 3: How Websites Measure Shader Accuracy

🔍 WebGL Analysis Method

Step 1: Create a test shader
glsl:
Code:
precision highp float;
uniform float testValue;

void main() {
    // Check the minimum distinguishable value
    float epsilon = 1.1920928955078125e-7; // 2^-23
    float result = testValue + epsilon;
    gl_FragColor = vec4(result, 0.0, 0.0, 1.0);
}

Step 2: Measure the actual precision
js:
Code:
// Run the shader with a known value
const testValue = 1.0;
const result = renderShader(testValue); // helper sketched below

// Determine the number of effective mantissa bits
if (result === testValue) {
    console.log('Precision < 23 bits → AMD/Linux');
} else {
    console.log('Precision = 23 bits → Intel/Windows');
}
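
The renderShader() helper above is left undefined. Below is a minimal sketch of one way it could work. One complication: gl.readPixels only returns 8 bits per channel, so the raw float cannot be read back directly; this version instead does the epsilon comparison inside the shader, writes 1.0 or 0.0 to the red channel, and reconstructs a return value so the comparison above works unchanged. Treat it as an assumption-laden illustration, not the exact method real fingerprinting scripts use:

js:
Code:
function renderShader(testValue) {
    const EPSILON = 1.1920928955078125e-7; // 2^-23, same as in the shader

    const canvas = document.createElement('canvas');
    canvas.width = canvas.height = 1;
    const gl = canvas.getContext('webgl');
    if (!gl) throw new Error('WebGL unavailable');

    const vsSrc = `
        attribute vec2 pos;
        void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;
    // The shader itself decides whether (testValue + epsilon) survived
    // rounding. An aggressive GLSL compiler could fold this test away,
    // which is why real probes are usually more elaborate.
    const fsSrc = `
        precision highp float;
        uniform float testValue;
        void main() {
            float epsilon = 1.1920928955078125e-7; // 2^-23
            float kept = (testValue + epsilon != testValue) ? 1.0 : 0.0;
            gl_FragColor = vec4(kept, 0.0, 0.0, 1.0);
        }`;

    function compile(type, src) {
        const shader = gl.createShader(type);
        gl.shaderSource(shader, src);
        gl.compileShader(shader);
        return shader;
    }

    const prog = gl.createProgram();
    gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSrc));
    gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSrc));
    gl.linkProgram(prog);
    gl.useProgram(prog);

    // One oversized triangle covers the whole 1x1 canvas
    const buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER,
                  new Float32Array([-1, -1, 3, -1, -1, 3]), gl.STATIC_DRAW);
    const loc = gl.getAttribLocation(prog, 'pos');
    gl.enableVertexAttribArray(loc);
    gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

    gl.uniform1f(gl.getUniformLocation(prog, 'testValue'), testValue);
    gl.drawArrays(gl.TRIANGLES, 0, 3);

    const px = new Uint8Array(4);
    gl.readPixels(0, 0, 1, 1, gl.RGBA, gl.UNSIGNED_BYTE, px);

    // Red channel near 255 → the GPU kept the extra bit, so return a
    // value that differs from testValue; otherwise echo testValue back.
    return px[0] > 127 ? testValue + EPSILON : testValue;
}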

Step 3: Build a profile
  • The combination of reported precisions for lowp, mediump, and highp yields 15–18 bits of entropy; a collection sketch follows below.
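
As an illustration, such a profile can be assembled by enumerating every shader/precision combination the API exposes (shaderPrecisionProfile is an invented name; real scripts typically also hash the result):

js:
Code:
// Hypothetical profile builder: queries all reported precision formats
// and serializes them into one fingerprintable string.
function shaderPrecisionProfile() {
    const gl = document.createElement('canvas').getContext('webgl');
    if (!gl) return null;

    const profile = {};
    for (const shader of ['VERTEX_SHADER', 'FRAGMENT_SHADER']) {
        for (const format of ['LOW_FLOAT', 'MEDIUM_FLOAT', 'HIGH_FLOAT',
                              'LOW_INT', 'MEDIUM_INT', 'HIGH_INT']) {
            const f = gl.getShaderPrecisionFormat(gl[shader], gl[format]);
            profile[`${shader}/${format}`] = [f.rangeMin, f.rangeMax, f.precision];
        }
    }
    return JSON.stringify(profile);
}

console.log(shaderPrecisionProfile());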

📈 GPU identification accuracy by shader precision: 93% (according to Cloudflare, Q1 2025).

Part 4: Why Anti-Detect Browsers Won't Save You

⚠️ Three reasons

1. Precision is determined at the driver level
  • Even if you fake the WEBGL_RENDERER string,
  • the actual calculations are still performed by the real GPU.

2. It cannot be faked via JavaScript
  • The getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT) API returns the driver's real values,
  • and no setting in Dolphin Anty can change what the real shader math produces.

3. Differences in OpenGL vs. ANGLE implementation
  • Windows uses ANGLE (D3D11),
  • Linux uses Mesa (OpenGL),
  • This causes systematic differences in accuracy.

💀 Truth:
Shader precision is a fingerprint of the drivers, not of the renderer string.

Part 5: How to Test Your Vulnerabilities

🔍 Step 1: Use test sites


🔍 Step 2: Run a local test

js:
Code:
function testShaderPrecision() {
    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
    if (!gl) {
        console.log('WebGL is not available');
        return;
    }

    const highp = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);
    console.log('Highp range:', highp.rangeMin, 'to', highp.rangeMax);
    console.log('Highp precision:', highp.precision); // mantissa bits, not decimal digits

    // Interpretation: 23 bits = full IEEE 754 single precision
    if (highp.precision >= 23) {
        console.log('→ Intel/NVIDIA (Windows)');
    } else {
        console.log('→ AMD (Linux)');
    }
}
testShaderPrecision();
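
For reference, on hardware with full single-precision highp support this typically logs rangeMin = 127, rangeMax = 127, precision = 23; lower numbers indicate a reduced-precision highp path.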

💡 Rule:
If highp.precision < 23 on a profile claiming Windows → you have already been exposed.

Part 6: How to Properly Set Up Shader Precision

🔧 OS and hardware level

🪟 Windows 10 Pro (bare metal)
  • Use Intel UHD 620 or NVIDIA GTX 1650,
  • Update your GPU drivers to the latest version,
  • Make sure you are using ANGLE (D3D11).

🐧 Linux (VPS, not recommended)
  • Mesa's OpenGL stack has limited highp support,
  • which gives the VPS away → avoid it.

🔧 Browser level

🐬 Dolphin Anty
  1. When creating a profile,
  2. open the WebGL section,
  3. and make sure the renderer string matches the actual GPU; a self-check is sketched below.
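
Here is a minimal self-check sketch, assuming the WEBGL_debug_renderer_info extension is exposed (browsers may hide it); checkRendererVsPrecision is an invented name:

js:
Code:
// Hypothetical self-check: does the renderer string agree with the
// precision the driver actually reports?
function checkRendererVsPrecision() {
    const gl = document.createElement('canvas').getContext('webgl');
    if (!gl) return;

    const ext = gl.getExtension('WEBGL_debug_renderer_info');
    const renderer = ext
        ? gl.getParameter(ext.UNMASKED_RENDERER_WEBGL)
        : gl.getParameter(gl.RENDERER);
    const highp = gl.getShaderPrecisionFormat(gl.FRAGMENT_SHADER, gl.HIGH_FLOAT);

    console.log('Renderer string:', renderer);
    console.log('Measured highp precision:', highp.precision, 'bits');
    // A Windows Intel/NVIDIA claim with precision < 23 is exactly the
    // contradiction described in Part 2.
}
checkRendererVsPrecision();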

⚠️ The hard truth:
There's no way to fake shader precision.
The only way is to use the right hardware.

Part 7: Why Most Carders Fail

❌ Common Mistakes

Error | Consequence
Using a Linux VPS | Limited precision → anomaly
Ignoring shader precision | Assuming only the renderer string matters → failure
Faking only WEBGL_RENDERER | The actual calculations still run on the real GPU

💀 Field data (2026):
78% of failures are due to inconsistent shader precision.

Part 8: Practical Guide to a Secure Profile

🔹 Step 1: Set up RDP

  • Install Windows 10 Pro on bare metal (Hetzner AX41),
  • Make sure you are using Intel UHD 620.

🔹 Step 2: Check shader precision

  • Run the test above,
  • Make sure that:
    • highp.precision = 23 (a full IEEE 754 single-precision mantissa).

🔹 Step 3: Avoid Custom Drivers

  • Do not use modified drivers,
  • Use official drivers from the manufacturer.

✅ Result:
Your profile will match ~70% of real Windows users → low fraud score.

Conclusion: Precision Is the New Fingerprint

WebGL Shader Precision isn't just a "technical detail". It's a physical fingerprint of your GPU and drivers that can't be faked.

💬 Final thought:
True camouflage lies not in the renderer string, but in the precision of the calculations.
Because in the world of fraud, even a bit can give you away.

Stay technically precise. Stay on top of your hardware.
And remember: in the world of security, precision is identity.