Link: ‘News Sites Are Fatter and Slower Than Ever’
Frédéric Filloux over at Monday Note dives into some of the ways major news sites are completely out-of-touch with what readers actually want: fast, clean pages.
Today, a news site web page consists of a pile of scripts and requests to multiple hosts, in which relevant content makes up only an insignificant proportion of the freight. (On the same subject, see this post by John Gruber, and Dean Murphy's account of an hour with Safari Content Blocker in iOS 9.)
Consider the following observations: When I click on a New York Times article page, it takes about 4 minutes to download 2 megabytes of data through… 192 requests, some to the Times' hosts, most to a flurry of other servers hosting scores of scripts. Granted: the most useful part (a 1,700-word / 10,300-character article plus pictures) will load in less than five seconds.
But when I go to Wikipedia, a 1,900-word story will load in 983 milliseconds, requiring only 168 kilobytes of data through 28 requests.
Ridiculous. There is no excuse for a news site to take this long. Gigabyte apps can download from the App Store in under 4 minutes.
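Filloux's request counts come from watching a real browser load each page, but you can get a rough sense of how request-heavy a page will be just by counting the subresources its HTML references. Here's a minimal sketch (the hostnames are made up for illustration) that tallies the scripts, images, and stylesheets a page pulls in, and how many distinct hosts they hit:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ResourceCounter(HTMLParser):
    """Counts subresource references (scripts, images, stylesheets)
    and the distinct hosts they point at -- a rough proxy for the
    number of requests a page triggers on load."""
    def __init__(self):
        super().__init__()
        self.resources = []
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        url = None
        if tag in ("script", "img") and "src" in attrs:
            url = attrs["src"]
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            url = attrs.get("href")
        if url:
            self.resources.append(url)
            host = urlparse(url).netloc
            if host:
                self.hosts.add(host)

# Hypothetical page: one stylesheet, two third-party trackers, one image.
sample = """
<html><head>
<link rel="stylesheet" href="https://cdn.example.com/site.css">
<script src="https://ads.tracker-one.net/beacon.js"></script>
<script src="https://ads.tracker-two.net/tag.js"></script>
</head><body>
<img src="https://images.example.com/photo.jpg">
</body></html>
"""

counter = ResourceCounter()
counter.feed(sample)
print(len(counter.resources), "subresources across", len(counter.hosts), "hosts")
# → 4 subresources across 4 hosts
```

This undercounts badly on a real news site, since each script can in turn fetch dozens more resources — which is exactly how you end up at 192 requests for one article.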
I stopped visiting most web pages a long time ago. I use an RSS reader and Instapaper to save the articles I want to read while simultaneously stripping out all the crap.
I try my best to make AfterPad a clean, fast, easy-to-read site. I think I'm beating the New York Times in that regard. And that's pathetic.