Or, and this is really more likely, WP is just full of it and thinks such wild claims will translate into compliance and/or more users. They're using a ridiculous (and empty) social/political claim to make a technical argument, and in my view it just makes them look like arrogant idiots or con-men. They invented a problem that might exist in the future, act like it already exists, inflate the f-k out of it, and then claim to be able to fix it.
Maybe I'm overlooking something (Score:5, Insightful)
I don't understand how picking between two different browser tracking technologies can "help combat racism, sexism, anti-LGBTQ+ discrimination, and discrimination against those with mental illness with four lines of code,"
Because, Jesus if it were that easy why didn't we solve racism and sexism decades ago?
Re:Maybe I'm overlooking something (Score:5, Insightful)
I can only speculate, but I believe the issue is that FLoC will categorize people by their browsing habits at the browser level. Over time, the browser will know which groups you belong to and start showing you tailored content. Since they describe it as a privacy concern, I assume the issue is something like this:
1) Your browser figures out you are in a group such as LGBTQ+ and starts tailoring your ad/news content accordingly.
2) Through means undefined, a bad actor finds this out (or maybe your dad just borrows your PC/phone and sees the ad content tailored for you) and uses this information in a negative way.
There may be other security implications, but one article I read summarized FLoC as every session with a website beginning with your browser telling it all of your habits/interests/preferences right up front, so the site could use that data to tailor ads/content. I can see where this might raise a few eyebrows.
Re: (Score:3, Funny)
FLoC will categorize a young black single man so that he sees a lot of black singles dating web sites. It will also not show him ads for engineering jobs or mortgage lenders.
Re: (Score:2, Insightful)
Do you look for engineering jobs enough for your history to skew that way? Or are these presented to people based on demographics inferred from browsing habits?
Re: (Score:3)
It can be applied to whatever they want: discounts, services, products, politics, etc.
A politician can run ads saying they will defend LGBT people and send them only to LGBT people, then run ads attacking LGBT people and send those only to radical religious people. With a clear profile of each user, the people who buy and sell ads can do whatever they want with the data, even abuse it... and you have zero control over that.
Re: (Score:3)
Re:Maybe I'm overlooking something (Score:5, Interesting)
FLoC assigns users to "cohorts" based on their perceived interests. The cohort is then reported to websites so they can target ads, as an alternative to them tracking you online to figure out your interests.
Google claims this is better for privacy because all the interest tracking is done in the browser and kept local to your computer. The website only gets a general cohort identifier, and cohorts will contain thousands of users so are supposed to be of limited use in tracking individuals.
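The claimed privacy property can be sketched like this. This is a toy stand-in, not Google's actual clustering (the real browser used SimHash over domain features, and the cohort count and domain names here are illustrative), but it shows the shape of the claim: the full history never leaves the machine, and the site only ever sees a coarse ID shared by thousands of users.

```python
import hashlib

NUM_COHORTS = 33_000  # illustrative; roughly the scale discussed during the origin trial

def assign_cohort(history: list[str]) -> int:
    """Toy stand-in for FLoC's clustering: hash the set of visited
    domains down to a small cohort ID. The computation happens locally;
    only the resulting ID is exposed to websites."""
    digest = hashlib.sha256("|".join(sorted(set(history))).encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_COHORTS

# Two users with the same coarse interests land in the same cohort;
# a website sees only the ID, not which sites were visited.
alice = assign_cohort(["knitting.example", "news.example"])
bob = assign_cohort(["news.example", "knitting.example", "knitting.example"])
assert alice == bob
```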
There are numerous problems. For a start, the implementation is half-baked: the cohort generation system is not sufficiently resistant to deanonymization. An adversary could simulate thousands of browsing sessions and observe which cohorts result from them, or, if they control a number of popular sites, use those to push users towards selected cohorts.
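The simulation attack is easy to sketch. Assuming the same toy hash-based cohort function as above (again, not the real SimHash clustering, and the domains are made up), an adversary who can script browser sessions builds a reverse index from cohort ID back to the histories that produce it:

```python
import hashlib

def assign_cohort(history: list[str], num_cohorts: int = 33_000) -> int:
    # Same toy stand-in for the browser's local cohort clustering.
    digest = hashlib.sha256("|".join(sorted(set(history))).encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_cohorts

# The adversary drives scripted sessions over candidate histories and
# records which cohort each one lands in.
candidate_histories = [
    ("lgbtq-forum.example", "news.example"),
    ("fishing.example", "news.example"),
    ("knitting.example", "recipes.example"),
]
reverse_index: dict[int, list[tuple[str, ...]]] = {}
for h in candidate_histories:
    reverse_index.setdefault(assign_cohort(list(h)), []).append(h)

# A single observed cohort ID now narrows a visitor down to the few
# histories known to produce it -- the deanonymization described above.
observed = assign_cohort(["lgbtq-forum.example", "news.example"])
assert ("lgbtq-forum.example", "news.example") in reverse_index[observed]
```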
Google claims it will make sure that sensitive cohorts are blocked, so e.g. there will be no religious ones, nothing to do with sexual orientation, or the like. The problem, again, is that their list, which they already use for other purposes, is incomplete and mostly built around Western taboos and social problems. It is very likely that abusive cohorts will be created, putting e.g. LGBTQ+ people in danger in countries where being gay is illegal, or outing them.
FLoC also breaks private browsing mode. By default FLoC sends a null when there isn't enough data to assign a user to a cohort, or when they are in private browsing mode. That gives adversaries a way to detect private browsing.
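The private-browsing leak follows directly from that null behavior. A minimal model (the threshold and cohort math are made up; only the null-vs-ID behavior mirrors what's described above):

```python
import hashlib
from typing import Optional

def interest_cohort(history: list[str], private_mode: bool) -> Optional[int]:
    """Toy model: return None (the 'null') in private browsing mode or
    when there is too little history, otherwise a coarse cohort ID."""
    if private_mode or len(set(history)) < 7:  # threshold is illustrative
        return None
    digest = hashlib.sha256("|".join(sorted(set(history))).encode()).digest()
    return int.from_bytes(digest[:4], "big") % 33_000

# The leak: a site that keeps seeing None from an otherwise active browser
# gains a signal that the user is likely in a private window.
history = [f"site{i}.example" for i in range(10)]
normal = interest_cohort(history, private_mode=False)
private = interest_cohort(history, private_mode=True)
assert normal is not None and private is None
```

Because "not enough data" and "private window" produce the same null, a site can't tell them apart with certainty, but combined with other signals the null is still a useful fingerprinting bit.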
This happened rather famously (Score:2)
Re: (Score:2)
They can put me into whatever "buckets" they like and I won't care, because I use an ad-blocker.
I occasionally do my part to mess with ad data, like suddenly spending a day browsing baby clothes, cribs, formula... then the next day browsing for race car parts... the day after that it's mountain climbing gear... then wheelchairs and canes.
So if this comes to pass I'll end up in the "we have no fucking clue what he likes" bucket.
Re:Maybe I'm overlooking something (Score:4, Interesting)
I want tailored ads about as much as I want tailored prices and tailored legal eligibility.
LGBTQ+ is the least of our concerns. Sometimes I just want to know what it's like to exist in society as someone else or understand how other people are treated. The idea of a "personalized experience", everywhere I go, is pretty frightening.
Re: (Score:2)