Chrome's New 'Cache Partitioning' System Impacts Google Fonts Performance (zdnet.com)
A change made in the Google Chrome browser in October has impacted the performance of the Google Fonts service for millions of websites. From a report: The change is an update to Chrome's internal cache system. A browser's cache serves as temporary storage for images, CSS, and JavaScript files used by websites. Files stored in the cache are typically reused across multiple sites instead of having the browser re-download each file for every page/tab load. But with the release of Chrome 86 in early October 2020, Google overhauled how Chrome's entire caching system works. Instead of using one big cache for all websites, Google has "partitioned" the Chrome cache, which now stores resources on a per-website and per-resource basis. While this is a big win for user security, preventing some forms of web attacks, the change has affected web services designed around the old cache system.
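Roughly, what "partitioned" means here, as a minimal sketch with illustrative names (not Chrome's actual internals): the cache key grows from just the resource URL to also include the site that loaded it.

// Sketch of old vs. new cache keys, per the description above.
// Names and shapes are illustrative, not Chrome's real data structures.
interface CacheRequest {
  topFrameSite: string; // site of the tab's top-level page, e.g. "https://example.com"
  frameSite: string;    // site of the iframe issuing the request, if any
  url: string;          // the resource being fetched
}

// Before Chrome 86: one shared cache, keyed only by the resource URL,
// so a given font file is stored once and reused across every website.
function sharedCacheKey(req: CacheRequest): string {
  return req.url;
}

// Chrome 86 and later: entries are additionally keyed by the requesting
// site, so the same font URL gets a separate copy per website.
function partitionedCacheKey(req: CacheRequest): string {
  return `${req.topFrameSite} ${req.frameSite} ${req.url}`;
}

The practical consequence for Google Fonts: a font cached while visiting one site no longer counts as a cache hit on the next site, even though the URL is identical.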
Summary (Score:5, Insightful)
Dear Editor:
The summary does not even mention fonts at all. A summary is not the first paragraph of an article. Please do not accept submissions like this one without editing it.
Yours
Re: Summary (Score:4, Informative)
I definitely consider external fonts harmful; if you can't use device-local fonts, then there's a problem.
Many downloaded fonts actually look worse than the device-local fonts as well.
Re: (Score:1)
In Firefox, open about:config and set gfx.downloadable_fonts.enabled to false.
Re: (Score:2)
I definitely consider external fonts harmful; if you can't use device-local fonts, then there's a problem.
Many downloaded fonts actually look worse than the device-local fonts as well.
Well, for one thing, they don't follow local font settings. I chose to enforce local font settings on remote fonts in QtWebEngine, but it turned out many online fonts, even popular ones, are terrible, buggy, and only look right with "default" settings. Which is why they don't look as sharp if you don't have a HiDPI screen: you need to render them pretty blurred, as they don't hint well. They look sharp in my web engine at least, but then kerning gets broken instead.
Re: (Score:2)
Yes, sites like that are always slow. I keep NoScript on all the time just to get some performance. We use Azure DevOps at work, and at times I am amazed at how astonishingly slow it is, and why customers would want to use it.
Re: Why would you rely on a particular caching sch (Score:4, Informative)
This type of caching is the main reason to use sites like cdnjs. If it won't be cached for you anyway, you might as well host all the files on your own CDN.
Re: Why would you rely on a particular caching sch (Score:4, Insightful)
The web development culture is a culture of hacks.*
They have been relying on non-standard browser quirks for the entirety of their existence. Sometimes out of necessity, but mostly out of "Whoo! Look what I can do and you can't!"
It's so deeply ingrained in the culture that it *is* the culture.
The only way to fix it now is to kill it. Unless you want to wait literal centuries.
_ _ _ _
* I've worked in that world, and the web started out as a trivial "platform" for home users (because it is!), only for loads and loads of fakers to pour in during the dotcom bubble. On the browser and standards-designing front too. Certainly when the HTML 3.2 spaghetti-browser-code standard-inverters killed the W3C and XHTML Strict, formed the WHATWG, and won. (Some say "the W3C isn't going fast enough" was the start of Google's plan to kill all other browser makers with endless, pointless kitchen-sink bloat changes that nobody can keep up with.)
Re: (Score:2)
The web development culture is a culture of hacks.*
They have been relying on non-standard browser quirks for the entirety of their existence. Sometimes out of necessity, but mostly out of "Whoo! Look what I can do and you can't!"
It's so deeply ingrained in the culture that it *is* the culture.
The only way to fix it now is to kill it. Unless you want to wait literal centuries.
_ _ _ _ * I've worked in that world, and the web started out as a trivial "platform" for home users (because it is!), only for loads and loads of fakers to pour in during the dotcom bubble. On the browser and standards-designing front too. Certainly when the HTML 3.2 spaghetti-browser-code standard-inverters killed the W3C and XHTML Strict, formed the WHATWG, and won. (Some say "the W3C isn't going fast enough" was the start of Google's plan to kill all other browser makers with endless, pointless kitchen-sink bloat changes that nobody can keep up with.)
It's a culture of making things work. It's why there is a true World Wide Web, instead of a network for just a few academics.
Re: (Score:2)
I think the whole point is that the "service" in question is spying on you, and the way it "breaks" is that you get a cache miss (despite the fact that you already downloaded the font when you visited pornhub), so you download it again, causing them to incorrectly infer you're not a pornhub user, so therefore you don't get your special pornhub ads and offers. Enjoy your lame-ass Bible ads!
Of course, the other breakage is just that your web browser is now slightly slower, since it's doing work it didn't need to.
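For what it's worth, one simplified form of the attack class that partitioning prevents looks like this; the URL and the timing threshold are made up for illustration, and real probes are far more careful.

// Sketch: time a cross-origin fetch to guess whether another site
// already put this resource in the (formerly shared) cache.
async function probablyCached(url: string): Promise<boolean> {
  const start = performance.now();
  // "no-cors" lets us request a cross-origin resource without reading it;
  // the timing alone leaks whether it came from cache or from the network.
  await fetch(url, { mode: "no-cors" });
  const elapsed = performance.now() - start;
  return elapsed < 10; // cache hits return in a few ms; the network takes longer
}

// With a shared cache, a fast response to this probe suggests the visitor
// recently loaded a site that uses this exact font file.
probablyCached("https://fonts.gstatic.com/s/somefont/v1/somefont.woff2")
  .then((hit) => console.log(hit ? "seen before" : "cold cache"));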
Don't rely on third parties to save bandwidth (Score:3)
Realistically, the solution here is to not depend on third-party sites for hotlinked content if you're trying to save bandwidth. Hotlinking still saves bandwidth for the website the content is linked from, but the end user no longer saves anything.
That could also be solved rather easily by some tweaks to the cache mechanism itself: designate fixed assets as globally cacheable, e.g. embedded content (think YouTube's player controls, or jQuery libraries), and thus de-duplicate the cached assets on the client side.
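One way to picture that de-duplication, as a minimal sketch of a hypothetical design (not how Chrome's cache actually works): keep the partitioned index for safety, but point identical bytes at a single stored copy keyed by a content hash, similar in spirit to subresource integrity.

import { createHash } from "node:crypto";

const storage = new Map<string, Buffer>(); // content hash -> bytes (stored once)
const index = new Map<string, string>();   // partitioned key -> content hash

function put(partitionedKey: string, body: Buffer): void {
  const digest = createHash("sha256").update(body).digest("hex");
  storage.set(digest, body); // identical assets from different sites collapse here
  index.set(partitionedKey, digest);
}

function get(partitionedKey: string): Buffer | undefined {
  const digest = index.get(partitionedKey);
  return digest ? storage.get(digest) : undefined;
}

Note this only saves disk space: the re-download still happens per site unless cross-partition hits are allowed, which would reintroduce the timing leak partitioning was meant to close.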
Re: (Score:3)
Yes, it does seem like a local whitelist saying which websites can have their caches merged would solve this. Google could pre-fill this list with their own services (and, I guess, promise to be careful to prevent any exploits from being stored in them).
Re: (Score:2)
You could fingerprint the cache based on the cached Google Fonts service alone. I don't think whitelisting it would work.
No more CDN advantage? (Score:2)
Re: (Score:2)
Nope. Zero, if using HTTP/2.
The problem is, all of our browser caches are going to be huge, stuffed full of duplicate fonts and JS libraries.
How the fuck is that a win?
Just randomize the response times!!
I seriously hope it uses deduplication. (Score:2)
With hardlinks, that would be trivial.
And while we're at it, how about for RAM too?
Because when our memory is filled with bloat and crap, then at the very least it should not have the same crap twice.
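The hardlink version of that idea, as a rough sketch in Node (the paths and on-disk layout are invented for illustration): write each unique body once under its content hash, then make every per-site cache entry just another hardlink to that file.

import { createHash } from "node:crypto";
import { existsSync, linkSync, mkdirSync, writeFileSync } from "node:fs";
import { join } from "node:path";

const root = "/tmp/cache-sketch"; // invented location for the sketch
mkdirSync(join(root, "objects"), { recursive: true });

function store(partitionedKey: string, body: Buffer): void {
  const digest = createHash("sha256").update(body).digest("hex");
  const object = join(root, "objects", digest);
  if (!existsSync(object)) writeFileSync(object, body); // one copy per unique content
  const entry = join(root, encodeURIComponent(partitionedKey));
  if (!existsSync(entry)) linkSync(object, entry);      // extra names, same inode
}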
HTTPS Everywhere (Score:1)
HTTPS is cacheable on client (Score:2)
HTTPS follows the same client-side cache rules as cleartext HTTP. It just rules out the intermediate caching proxies near the client, such as those once operated in remote areas or developing countries.
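For example, a minimal Node sketch (the header values are just typical choices): the Cache-Control header that lets the browser cache a response works identically whether the connection is HTTP or HTTPS.

import { createServer } from "node:http";

createServer((req, res) => {
  res.setHeader("Content-Type", "font/woff2");
  // A year-long, immutable lifetime: the browser may reuse this response
  // without revalidating, over TLS or not. Only shared proxies between
  // the client and the server are ruled out by HTTPS.
  res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
  res.end(); // font bytes would go here
}).listen(8080);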