Modern news websites have become hostile infrastructure. The average news page now ships 49MB of payload to render a few hundred words of text, a ratio so absurd it functions as a case study in institutional negligence. The culprit is programmatic ad-tech: layered auction systems, tracking pixels, and third-party scripts that multiply unchecked because no single team owns the damage they cause.
The piece does not stop at payload size. It maps the specific architecture of failure: how ad slots trigger waterfall requests, how consent management platforms add latency before a single content byte loads, and how A/B testing frameworks run continuously in the background whether you are in a test or not. These are design choices, not accidents.
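The waterfall pattern described above can be sketched with a few lines of arithmetic: when each ad-tech dependency blocks the next, latencies add up instead of overlapping. The step names and millisecond values below are purely illustrative assumptions, not measurements from the article.

```typescript
// Hypothetical latencies (ms) for a waterfall ad-tech chain: each request
// waits for the previous response before firing. Names and numbers are
// illustrative only.
const waterfall = [
  { step: "consent-management platform", ms: 300 },
  { step: "header-bidding wrapper", ms: 250 },
  { step: "ad exchange auction", ms: 400 },
  { step: "winning creative + trackers", ms: 350 },
];

// Sequential (waterfall) cost: latencies sum.
const sequentialMs = waterfall.reduce((total, r) => total + r.ms, 0);

// If the same requests were issued in parallel, the page would wait
// only for the slowest one.
const parallelMs = Math.max(...waterfall.map((r) => r.ms));

console.log(`waterfall: ${sequentialMs} ms, parallel: ${parallelMs} ms`);
// waterfall: 1300 ms, parallel: 400 ms
```

The gap between the two numbers is the structural cost of chaining: every extra vendor added to the chain pushes content rendering back by its full round-trip, which is why these delays accumulate "before a single content byte loads."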
What makes this worth reading in full is the framing. The author treats hostile web architecture as a collective action problem, not a technical one. Every stakeholder (publishers, advertisers, analytics vendors) has an incentive that points away from the reader. Understanding that structure is the only way to know where a fix could actually come from.
[READ ORIGINAL →]