Maybe I'm overlooking something (Score:5, Insightful)
I don't understand how picking between two different browser tracking technologies can "help combat racism, sexism, anti-LGBTQ+ discrimination, and discrimination against those with mental illness with four lines of code,"
Because, Jesus, if it were that easy, why didn't we solve racism and sexism decades ago?
Re: (Score:5, Insightful)
I can only speculate, but I believe the issue is that FLoC will start categorizing people by their browsing habits at the browser level. So over time, the browser will know which groups you belong to and start showing you tailored content. Since they frame it as a privacy concern, I assume the issue is something like this:
1) Your browser figures out that you are in a group such as LGBTQ+ and starts tailoring your ad/news content accordingly.
2) Through means undefined, a bad actor finds this out (or ma
Re:Maybe I'm overlooking something (Score:5, Interesting)
FLoC assigns users to "cohorts" based on their perceived interests. The cohort is then reported to websites so they can target ads, as an alternative to them tracking you online to figure out your interests.
Google claims this is better for privacy because all the interest tracking is done in the browser and kept local to your computer. The website only gets a general cohort identifier, and cohorts will contain thousands of users so are supposed to be of limited use in tracking individuals.
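As a rough sketch of that idea (not Chrome's actual code — the real implementation used SimHash over your visited domains, and everything here, down to the name `assignCohort`, is made up for illustration), "interest tracking done in the browser" looks something like:

```javascript
// Simple deterministic string hash (FNV-1a, 32-bit), used to give each
// visited domain a stable bit pattern.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h;
}

// SimHash-like reduction: every visited domain "votes" on each of `bits`
// bit positions; the majority sign at each position forms the cohort id.
// Users with similar histories therefore tend to land in the same cohort.
function assignCohort(domains, bits = 8) {
  const counts = new Array(bits).fill(0);
  for (const d of domains) {
    const h = fnv1a(d);
    for (let b = 0; b < bits; b++) {
      counts[b] += ((h >>> b) & 1) ? 1 : -1;
    }
  }
  let cohort = 0;
  for (let b = 0; b < bits; b++) {
    if (counts[b] > 0) cohort |= 1 << b;
  }
  return cohort; // the only value a website would ever see
}

const cohort = assignCohort(["news.example", "shoes.example", "gadgets.example"]);
console.log(cohort); // a small integer shared by everyone with a similar history
```

The whole history stays on the machine; the site only sees the small integer. Everything that follows is about why that integer still leaks more than it should.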
There are numerous problems. For a start, the implementation is half-baked: the cohort generation system is not sufficiently resistant to deanonymization. An adversary could simulate thousands of browsing sessions and observe which cohorts result from them, or, if they control a number of popular sites, use those to force users towards selected cohorts.
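That first attack can be sketched with a stand-in cohort function (any deterministic function of browsing history will do for illustration; `toyCohort` and `mapCohorts` are invented names, not Chrome internals):

```javascript
// Stand-in for the browser's cohort computation: deterministic and
// order-independent, like the real thing, but deliberately trivial.
function toyCohort(domains) {
  let h = 0;
  for (const d of [...domains].sort()) {
    for (const ch of d) h = (Math.imul(h, 31) + ch.charCodeAt(0)) >>> 0;
  }
  return h % 256; // FLoC-style small cohort space
}

// The attack: simulate synthetic browsing histories whose interests we chose
// ourselves, and record which cohort each one lands in. The resulting table
// lets a site read sensitive interests back out of a "privacy-preserving" id.
function mapCohorts(labeledHistories) {
  const table = new Map(); // cohort id -> interest labels observed there
  for (const { label, domains } of labeledHistories) {
    const id = toyCohort(domains);
    if (!table.has(id)) table.set(id, new Set());
    table.get(id).add(label);
  }
  return table;
}

const table = mapCohorts([
  { label: "gardening", domains: ["compost.example", "seeds.example"] },
  { label: "lgbtq-news", domains: ["pride.example", "queerpress.example"] },
]);
// Any site that later sees one of these cohort ids can now guess the interest.
```

Scale that loop up to thousands of generated histories and you have a lookup table for whichever cohorts you care about, no cooperation from Google required.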
Google claims that it will make sure sensitive cohorts are blocked, so e.g. there will be no religious ones, and nothing to do with sexual orientation or the like. Again, though, the problem is that their list of sensitive categories, which they already use for other purposes, is incomplete and mostly built around Western taboos and social problems. It's very likely that abusive cohorts will be created, endangering e.g. LGBTQ+ people in countries where being LGBTQ+ is illegal, or outing them.
FLoC also breaks private browsing mode. By default FLoC sends a null when there isn't enough data to assign a user to a cohort, or when they are in private browsing mode. That gives adversaries a way to detect private browsing.
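That leak can be sketched as follows (assuming the FLoC API, `document.interestCohort()`, either returns nothing or rejects when no cohort is available, as in the origin trial; the function name `looksLikePrivateBrowsing` is mine):

```javascript
// Hedged sketch of how a site could abuse FLoC to detect private browsing.
// A null cohort (or a rejected promise) in a FLoC-capable browser is a
// strong hint the user is in incognito or has too little history.
async function looksLikePrivateBrowsing(doc) {
  if (typeof doc.interestCohort !== "function") return false; // FLoC absent
  try {
    const cohort = await doc.interestCohort();
    return cohort == null; // null/undefined: no cohort was assigned
  } catch (_err) {
    return true; // rejection was the incognito signal in the origin trial
  }
}

// In a real page this would be called as looksLikePrivateBrowsing(document).
```

"Too little history" and "incognito" are indistinguishable here, but sites that want to block or nag private-browsing users won't care about the difference.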
Re: (Score:0)
If their ability to determine my interests is as crappy as Amazon, I'll end up in some very, very weird cohorts. Good luck selling that to advertisers!
They'd be better off (and probably will) saying "Oh sure, we think he's got interests in soft drinks, sports shoes and electronic gadgets. His browser says he also seems to have interests in springs, composting and cafes on Jersey".
In other words, it'll be just like Google's "amazing algorithm for search results", which is sort of automated, but bastardised by h