If legitimate interest is actually legitimate then there is no reason to allow an opt-out. They allow it because the truth is that it wouldn’t actually fall under legitimate interests.
According to the Finnish data protection ombudsman, the data subject has the right to object when processing is based on legitimate or public interest. The data subject does not have the right to object when it's based on a contract or legal obligations.
Objection itself may or may not stop the processing of data. Usually it should, but there are some situations where it would still be allowed (e.g. "a task in the public interest that requires scientific or historical research or the compilation of statistics")
Now I don't know whether there have been any decisions on what kind of tracking would actually count as legitimate interest (the text on the website is very ambiguous)
This site has a 3-item slider at the very top of the page promoting recent? decisions. 2 of the 3 have the same number of lines of text. The third one has an additional line of text. Every time the 3rd one comes/goes, the entire page is shifted up/down to accommodate it, causing the page to have a very slow bounce. tsk tsk tsk
I am not impressed. I clicked on my country and the four most recent fines are: 600 (private indiv.), 150 (private indiv.), 100 (Bank), 0 (Post office).
I'm not opposed to GDPR. I just think it's ridiculous how they boasted about fines up to 20 million or 4% of annual worldwide revenue, and then we get an interpretation of "up to" that we otherwise only know from ISPs. I mean, a "fine" of 0 Euro, and 100 Euro for a bank? That is not how you make organisations respect user privacy.
At this rate we're going to have three different any% categories of this speedrun before we can hope for an announcement of a plan to tighten restrictions in an unspecified amount of time.
I disagree, every company on the top fines list has more money in proportion to the fine than the examples the parent comment gave, except maybe the bank. These fines are so tiny no one will ever care about user privacy or data security.
Isn't it still supposed to be opt-in? Seems strange to allow the data processor to define what is legitimate interest, and then bypass the otherwise clear requirement of opt-in and informed consent?
If you invoke Legitimate Interest, you do not need consent (assuming your Legitimate Interest is valid). There are many common misunderstandings of GDPR, and one of them is that consent is always required. It is not.
To process data under GDPR, you need a Legal Basis. Consent is one Legal Basis. Legitimate Interest is a different Legal Basis. There are four others.
Consent is opt-in. That's the defining feature of Consent as a Legal basis, since that's what "consent" means. It can also be revoked.
Legitimate Interest is opt-out, as is Public Interest.
If your Legal Basis is one of the other three, then there isn't even an opt-out requirement. Which makes sense, because those cover essential or non-optional processing: Legal requirements (e.g. retaining credit card records), processing necessary to perform a contract the Data Subject has signed, and "Vital Interests" which means "literally life-or-death situation."
Note that cookies are regulated by the ePrivacy Directive in addition to GDPR. The ePD requires consent for cookies and does not have a concept of Legitimate Interest. If a company invokes Legitimate Interests for their cookies, they are Doing It Wrong.
What you describe makes sense, but the way it's implemented everywhere seems like a complete breach of GDPR. If I understand it correctly, "legitimate interest" would be the processing of data necessary to perform the service in question, the extent of which must be properly disclosed?
If I can turn the "legitimate interest" options off, and the service / product remains the same, then... isn't that a clear indication that the grounds for it being "legitimate" don't hold up? For example, I'd consider a service feedback functionality to be "legitimate interest". It's obvious that for it to work, there is a legitimate interest for processing the data transmitted.
Legitimate interest is very broad and very vague. It's the "wild card" Legal Basis, basically used to cover all of the cases that the law didn't explicitly address. The legal requirements are more-or-less "the company has a good reason, and the privacy impact is minimal." The validity of the good reason or minimal privacy impact are subject to regulatory review, but companies are trusted to make this decision on their own until a regulator gets involved.
A company can also decline opt-out if they have an "Overriding Legitimate Interest." This is true regardless of whether the original legal basis was Legitimate Interest or Consent. However the company must restrict processing only to that particular overriding interest.
"Fraud Detection" is the canonical example of an (Overriding) Legitimate Interest. To my knowledge, that's the only example that's actually given in the text of GDPR itself. Telemetry is generally believed to be another example, and in that case it's probably not Overriding.
Processing necessary to provide a service is kind of weird. If the service is part of a contract, then you use Performance of Contract as your Legal Basis. But if the use of the service doesn't actually form a contract, then you can't use that Legal Basis and have to use either Consent or Legitimate Interest. There are arguments for and against either.
Legitimate interest can be opt-out, but it's definitely a dark pattern to present the same processing under both options. It should be either one or the other.
It's not just a dark pattern, it's straight-up non-compliant.
Article 7 "Conditions for Consent," paragraph 2:
> If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters[...]
Most regulators have taken this to mean that requests for consent must be distinguished even from other notices required by GDPR. I.e. it must be a separate request from the Privacy Notice itself.
I have always wondered how a site is allowed to offer you an opt-in for anything that doesn't fall under legitimate interest. It would be driven by an illegitimate interest by assumption.
When using a legitimate interest (opt-out) as a legal basis, the interest must be both legitimate AND outweigh the data subject's rights and freedoms. This requires a balancing test between the various factors to be performed first.
Similarly, you can't just legitimize anything with consent (opt-in) – the consent must be valid, and of course can't override more specific laws. You can't consent to something illegal.
So no, failing to use legitimate interest doesn't mean it's illegitimate or that consent could always be used. It could also mean that the balancing test failed, or that laws prescribe a different legal basis. E.g. the “cookie law” prescribes consent for non-necessary cookies and similar technologies.
It becomes clearer if you look at it in terms of core business. So yes, they can collect X and Y because that's their core business and directly related to the product.
When it's for marketing, telemetry or similar purposes, it's tangential data, which need not be illegal or immoral to be an "illegitimate" interest. It becomes more of a dark pattern when they present a selectable option for "legitimate interests" - at best malicious compliance. They might think it's legitimate because it makes them money?
Similarly in the vein of malicious compliance is offering a cookie consent banner. As far as I know, they only need to do that if they're tracking you or storing TMI/PII. Worse, it works, too, because now everyone is complaining about the law and not the companies engaging in these dark patterns.
A legitimate interest is a use of personal information that is needed to fulfill a service. This would be something like a session cookie for storing the contents of a shopping cart, a site's preferences, or login information. Using a cookie is the only way to provide that, and the user is basically implicitly asking for something to be stored. It would be silly to have consent checkboxes like "before you can shop with us we need your permission to register what you want to buy" or "you give us permission to share your address details with the delivery company so they can actually deliver stuff to you".
Annoyingly, legitimate interest covers more than that - it also covers opt-in-by-default to direct marketing. Yes, if a customer registers an account or makes a purchase, you can opt them in by default on the basis of "legitimate" interest[0].
Yeah, the problem with "legitimate interests" is they're being used for "build a marketing profile of you" and "send you targeted advertisements" anyway, with the excuse that they're interested in doing that as the basis of their business.
I'm not saying I agree with it, but just for the sake of playing devil's advocate - what if the business legitimately makes its revenue by serving ad content on its site to its users?
> A legitimate interest is a use of personal information that is needed to fulfill a service.
No, it's not. If you need it to fulfill a service, then you are covered by (b) of Article 6 GDPR I cited earlier:
processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;
Legitimate interest under (f) would be something that is not strictly needed to provide the service but (1) beneficial to the processor and (2) does not unduly negatively affect the data subject.
> Thing I keep seeing and don't understand is "Legitimate interest" as a separate thing to consent.
I think it's like this:
Legitimate interest means you've signed up to use the product. It then is assumed that you understand that by signing up/logging in/buying something that you want to be tracked and known (otherwise, how will they know you are the same person who signed up just now?).
Consent doesn't require you to sign up for anything, just click "OK".
But as a result, if you have Legitimate Interest, then companies don't need to ask your permission to track you.
I guess we should ask the EU MPs who included this loophole in the GDPR law.
„Processing shall be lawful only if and to the extent that at least one of the following applies:
(f) processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.“
This "loophole" is necessary to allow certain usecases not to need a banner or opt-in at all. E.g. If I want to buy something online, the shop has to know my adress to ship me something. It shouldn't have to ask to use it for that usecase. Otoh, if it does not ship me anything and still asks me for an address, that would not be legitimate interest anymore, except it can argue for it (e.g. needs the adress for the invoice).
I would argue that this loophole is for convenience and was not a hot topic anywhere. How it is used now, however, is a different thing.
> This "loophole" is necessary to allow certain usecases not to need a banner or opt-in at all.
This use-case was already covered by letter b) of the same Article 6.
„b) processing is necessary for the performance of a contract to which the data subject is party or in order to take steps at the request of the data subject prior to entering into a contract;“
The problem is that not having the legitimate interests clause in there potentially causes far more problems - suddenly the law has to enumerate what all the purposes for data processing might be, and new purposes are illegal by default. That would have produced even more HN outrage about GDPR.
That's what consent is for. GDPR allows tracking with user consent (letter a) of Article 6). No need to enumerate all the purposes in the law. The problem is that the GDPR allows companies to hide tracking behind the concept of legitimate interest, and behind 1 million checkboxes that users now have to click in order to opt out of tracking.
Somewhat related, I got a new computer this week, and had to boot into windows so I could partition the HD to install linux. This was the first time in 15 years I have booted into a brand new "consumer" windows install (it was windows 10 pro). The "setup" was basically just 10 minutes of them asking in different ways if they could collect my personal data, track my location, send back telemetry etc. Office 365 is the same. I find some new thing every day that I have to opt out of to prevent them stealing my and my business data. It's like they have given up on trying to improve their products (which are basically stable) and shifted into finding more ways to steal data. As much as I dislike google for this, I realize I'm the product there; with Microsoft I thought I was paying to get business tools, not to be spied on. (To be fair, I then installed ubuntu which also wanted to send my data back to canonical)
Another example, I bought a car recently that defaults to stealing my personal information and sending it to the manufacturer. I had to call, and provide more information to them, to opt out (and I can only assume they are still stealing information they have deemed critical in some way)
Anyway, I'm reminded of all of this because I think the obfuscated cookie consents are just one facet of how hostile consumer tech has become to users. Aided by complex and ambiguous regulations, companies are able to stay within the letter of the law while making it impossible to just be left alone with your purchase and not be tracked and marketed to.
If there is a regulatory solution, it has to focus on clarity and spirit, not on just more rules. I'm not aware of an example of something like this working elsewhere.
One idea is a heavy tax on advertising. I've argued before that there is a lot in common between environmental pollution and the effects of advertising on the public value of the internet, and I would say this extends to tech generally. Charge a 25-40% tax on ad revenue, and make it less economic for companies to pollute.
> boot into windows so I could partition the HD to install linux
There's your mistake. Partitioning works just fine from the installer, or if the installer provides any live environment on the second virtual console, with gdisk or parted.
The money Microsoft gets from spying on paying users is sometimes called "surveillance dividend". Even if you pay for something, the company can make more money if they also spy on you.
Yup. It's so difficult to run Windows without accidentally agreeing to let them use your data however they'd like. They make it time consuming to opt out as well. Very frustrating at times.
Great advice and thanks for sharing. However, the risk of these programs is that they don't last forever. Microsoft can disable what they do, find alternative ways of stealing data, etc.
It easily could take 10 minutes to actually read/decipher the word games being played to confuse the reader into accepting the preferred option the vendor wants. Just like it only takes seconds to accept the ToS/EULA because nobody reads them. If people did actually read them, it would take hours/days to do a "simple" install.
I don't remember the wording being tricky or something along those lines
I don't have a screenshot of the newest Windows, but I found one of the older questions
Diagnostic data - send all basic diagnostic data along with info about websites you browse and how you use apps and features plus additional info about device and its health and enhanced error reporting
What is 'device'? In modern parlance, a device is a phone/tablet type of something. I've personally never heard of a computer being referred to as a device. What is 'enhanced error reporting'? What is 'basic'?
My natural instinct would be to have clicked no to everything, but just taking that approach screws you when it is worded like some of the options in the TFA 'Disable all basic diagnostic blah blah'. If you quickly select no for everything, then you just said no to disabling, thereby granting permission to do what you thought you were just disabling. These are the things to be looking out for.
In legal terms stealing != stealing. There is larceny, petty larceny, grand larceny. One type gets you a slap on the wrist (stealing one's personal data for monetary gain). Another gets you a ridiculous monetary fine (downloading a song/game from torrents). It's out of balance like the murderer going free while the kid with minor amounts of weed going to jail.
I find it absurd to frame making an observation (this person buys a lot of cereal from my store, so I'll tell him about a new brand of cereal that came in he might like) as stealing their data. How do you justify that? How has this cereal buying customer been robbed?
Harvesting peoples behavioral or other data is depriving them of privacy. There is lots of precedent for why privacy is important. So to be specific, the data collection is "stealing" privacy.
And with respect to the gp comment, this is actually different than copyright infringement, where the infringer is not depriving the copyright holder of anything, except potentially a business model based on withholding information. There is obviously a lively debate about the appropriate reach of copyright law, but imo at least, copyrights are more abstract than the privacy rights that are being infringed through surreptitious data collection
> Harvesting peoples behavioral or other data is depriving them of privacy. There is lots of precedent for why privacy is important. So to be specific, the data collection is "stealing" privacy.
It's not always black and white, as the OP demonstrates. Store owners memorize their customers' preferences or habits in the hope of their return; waiters do so to please their customers for tips. The customers didn't consent, but that still isn't stealing. Scaling it up, small mom-and-pop stores compete with big-box chains by providing better customer service. They can't do that without knowing their customers. That isn't stealing either.
So where is the line? I'd argue collecting customers' information en masse for the purpose of reselling it is the line, and even that isn't perfect.
A waiter doesn't sell to an advertising company the fact that I like my food prepared a specific way, nor does/did my local bartender sell to advertisers that I prefer a specific cocktail. These are night-and-day different from what pervasive online tracking is doing.
Are you talking about any particular individuals or organizations? Who are they? Is this just a general observation? In which case... what evidence do we have to support your statement? Are you suggesting 100% of individuals who support piracy also support privacy? Or is it more like 1%?
We're trying to prevent the data from being created and collected in the first place. Data is abundant after it's been created. Once it's in a database it's a lost cause.
That annoying "TRUSTe" modal. The one you see on java.com for example?
While I have seen less of the "30 seconds to save" issue recently (I dunno if it was a uBlock Origin update or the ad companies actually fixing their scripts), the cause of it was uBlock Origin. Looking at the network activity when it was happening (it pissed me off too), the script was sending a request to each of the partners with your preference and had to wait for the timeout on each request (as uBlock was blocking it) before moving on to the next batch. Scaled over all the partners listed in their ad/tracking partner list, this added up to a piss-take of a long time.
But as I said, for me personally, when I see that particular opt-in/out modal these days it saves almost instantly, so someone somewhere fixed it :-)
EDIT: thinking about it, it might even have been the addition of Firefox's built-in tracker protection that "fixed" the issue for me. I can't recall exactly when I stopped seeing the TRUSTe modal take forever to save my prefs.
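For anyone curious what that failure mode looks like in code, here is a minimal sketch of the pattern described above. The partner list, endpoint, and 10-second timeout are all hypothetical; the point is simply that a blocked request stalls the save until its timeout fires, once per partner.

```typescript
// Illustrative sketch only; partner domains, the endpoint and the timeout are made up.
const partners = ["ads.example", "trackr.example", "metricz.example"];

async function sendPreference(domain: string, accept: boolean): Promise<void> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 10_000); // per-partner timeout
  try {
    // If a content blocker kills this request, we sit here until the abort fires.
    await fetch(`https://${domain}/consent?accept=${accept}`, { signal: controller.signal });
  } catch {
    // Blocked or timed out; the script only moves on after waiting the full timeout.
  } finally {
    clearTimeout(timer);
  }
}

async function saveChoices(accept: boolean): Promise<void> {
  // Sequential batches: with every request blocked, total time approaches
  // (number of partners) x (timeout) instead of one quick round trip.
  for (const domain of partners) {
    await sendPreference(domain, accept);
  }
}
```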
I don't know if uBlock Origin increases this further, but even without it it's ridiculous. We measured this just for fun in a paper last year [1]:
> Compared to accepting cookies, opting out causes an additional 279 HTTP(S) requests to 25 domains, which amounts to an additional 1.2 MB / 5.8 MB of data transfer (compressed / uncompressed).
It's been an age since I looked into it. But I remember that if you disabled uBlock on the page before you hit save, it updated the settings a lot faster than if it was enabled. Same thing for the Ad Choices mass opt-out tool (though that would say it failed to opt out for all the companies, as it couldn't send the requests).
"We can't be bothered to not load trackers without consent so we're going to make calls to all their endpoints and trust they'll respect that and not use the calls themselves to track you"
with a mix of:
"Hey, if we put a sleep(1) every 5 entries it's going to be slow and annoying and less people opt out"
The people doing it just know you won't like the explanation so they're not going to.
Yes, this was on the Oracle site when downloading Java (don't know if it's there still). The thing had a progress bar when 'processing' cookies. Always made me wonder.
I always thought the intention was to make people angry at lawmakers for coming up with GDPR. “Look what your government made us do to you” kind of thing.
https://github.com/iamadamdev/bypass-paywalls-chrome is what I use and most of the time I never see a paywall. (If you are a FF user (as I am), ignore the word chrome in the url as it also supports Firefox. I think the chrome version got removed from the chrome extension store, so you might want to look for something else if you want auto updates and if the "you are using dev mode" message on chrome start annoys you.)
My browser is set to not accept cookies. I then use the Dev Tools to highlight the GDPR/cookie banner and add a display:none to the CSS. I'm trusting uBO/NoScript/etc. to protect me the rest of the way.
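As a rough sketch of that Dev Tools trick, something like this pasted into the console does the same job. The selectors here are hypothetical and vary from site to site; inspect the page to find the real banner element.

```typescript
// Hypothetical selectors; the actual banner element differs per site.
const banner = document.querySelector<HTMLElement>("#cookie-consent, .gdpr-banner");
if (banner) {
  banner.style.display = "none";
}
// Many overlays also lock scrolling on <body>, so undo that as well.
document.body.style.overflow = "auto";
```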
This cookie consent functionality should be something the browser reads and presents to you in a standard format - like the https lock and other privacy info.
This is the correct goddamn answer. Or, better yet, get rid of cookies as a thing. The one and only legitimate use for them is session tracking, so why not provide a session storage mechanism instead? Every website gets a standard login/logout button with pluggable functionality for how you authenticate. And maybe, just maybe, we can then also have Persona-type identities that are stored and synced across all your devices so you just choose from a drop down of which identity you want to use to log in rather than typing usernames and passwords.
Because when I hit the logout button, the local session ID is deleted from my browser session storage (because that action would be performed by the browser and not by the website’s code under this system), so I would look like a brand new user to the site (setting aside other identifying stuff like IP address, etc.). All the session store should hold is an opaque ID for the session and its expiration info, and it would be sent to the web server as a header (Session: djsisnxidnskxjf). The server would store all the info about you, but if you don’t send that header, the server has no idea who you are.
OK - so instead of cookies storing information on who you are and sending that back to the service, you instead have your browser telling the service who you are and the service storing that information on you?
I'm still not seeing the advantage. This isn't me arguing with you and telling you that you're wrong - I just genuinely don't see the difference and would like to understand better. This hypothetical service will still have the same amount of information on you either way (just stored service-side rather than in cookies), right? Unless you're claiming that the service wouldn't associate your various sessions with one another, which seems both incorrect (they certainly would, if they possibly can - as you say, via IP address, etc.) and undesirable (almost all moderately-sized-or-larger web services would feature some kind of persistent settings, at least).
Any browser authentication functionality you create will track people exactly as well as 1st party cookies. So, just disallow 3rd party cookies, and get the exact same level of privacy.
Firefox does the "synchronize the authentication data across devices" thing too.
If I hit “logout” in the browser under this system, I would be assured that my session ID is erased and not sent to the server. As is, a website can have a logout button that erases a session cookie but keeps tracking cookies.
Oh, ok, so you want a button to delete your cookies. That should be reasonably easy to do in an extension.
I would probably use such thing too, and not only for privacy reasons. Bonus if it deleted stored data, cache, and everything else related to the site.
Almost. Cookies are way too permissive and complex. What I am thinking of is essentially a Session: abcdef12345 type of header that would be used for authentication/sessions. This header would be directly tied to the domain I am talking to and should have a simple expiration (or not) policy that the web server may suggest but I can override. No more X-PHP_Session_widget_co-referer: … type crap. No multiple cookies. No large storage. No HTTP vs HTTPS, no HTTP-only, no third party anything. path is the only thing I am somewhat ambivalent about because it does present a certain complication but also allows you to have applications that aren’t at the top of the domain.
Along with this we have a browser API we can call Authentication with something like Authenticate.prompt() and Authenticate.clear(). prompt() would bring up the standard browser UI for logging in and the parameters to it would dictate how the authentication should happen: username and password, 2FA options, private/public key, client certificate, etc. Registration could be handled in the same UI or have a separate API. clear() lets you have an internal logout button or mechanism on the site. The same APIs should be available via HTTP headers for non-JS usage.
As a bonus, we can then develop a mechanism for creating a session ID based on me having a private key that has an associated identity with it. So when I am prompted with a login UI, instead of entering usernames and passwords I can simply choose from a drop-down which identity I want to use for this site. Of course the problem of syncing private keys across devices is hard, but not any harder than what password managers currently do with my passwords.
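To make the proposal concrete, here is a sketch of what a site might do with such an API. To be clear, nothing here exists in any browser today; the Authenticate object and the Session header are exactly the hypothetical mechanism described above.

```typescript
// Hypothetical, non-existent browser API sketched from the proposal above.
declare const Authenticate: {
  prompt(options: { methods: Array<"password" | "totp" | "publicKey"> }): Promise<void>;
  clear(): Promise<void>;
};

// "Log in": the browser shows its own standard UI, then attaches an opaque
// `Session: <id>` header to every request to this origin.
async function login(): Promise<void> {
  await Authenticate.prompt({ methods: ["password", "publicKey"] });
}

// "Log out": the browser forgets the session ID, so the next request carries
// no Session header and the server no longer knows who you are.
async function logout(): Promise<void> {
  await Authenticate.clear();
}
```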
It already exists. If the user agent sends a Do-Not-Track header, the HTTP server will know the user has made their lack of consent explicit. This knowledge is available before the web application even gets control. There are no excuses and no ambiguities.
All courts have to do is request server logs and look for this header. If it's present and the company is found to be violating people's privacy, they are obviously guilty and should be condemned and fined.
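A server could honour the header before any application or tracking code runs; here is a minimal sketch using Node's built-in http module. The endpoint and the logging are illustrative, not a compliance recipe.

```typescript
// Minimal sketch: check the DNT header before the application sees the request.
import { createServer } from "node:http";

createServer((req, res) => {
  const doNotTrack = req.headers["dnt"] === "1"; // header names are lower-cased by Node
  if (doNotTrack) {
    // Explicit, logged refusal of consent, available before any app code runs.
    console.log(`DNT=1 from ${req.socket.remoteAddress} for ${req.url}`);
  }
  res.end(doNotTrack ? "No tracking for you." : "Hello.");
}).listen(8080);
```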
Indeed. Not only is it useless for its intended purpose but it also adds an additional bit of data to track users with. Everything would've been different if it could be enforced by law.
The whole "website asks" thing seems like a stupid political answer to a technical problem. If the browser denied cookies by default (like it does with location, or webcam access etc) then the problem would be solved.
I suspect the reason Chrome doesn't do that already is that user tracking is essentially Google's business.
If Chrome denied cookies by default and required an explicit opt-in caused by a user action (basically deterring unprompted cookie prompts like we managed to deter popups), then that would change very quickly. I wonder why they don't?
This is amusing and on point, kudos to the creator!
The biggest takeaway from this is the dark patterns sites aggressively use to trick you into accepting all their cookies, by making use of creative language that might take a while to parse for the impatient reader or setting buttons to common colours that might confuse someone into clicking.
I really wish there was just a setting in the browser that just says
- Accept 'functional/mandatory' cookies (with exclusion support for sites that abuse this...)
- Reject advertising cookies
- Reject personalisation cookies
- Reject analytics cookies
- Reject tracking cookies
etc. and this config is available for these GDPR banners to query and apply the appropriate settings.
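Something along these lines, sketched in TypeScript; none of this exists as a real browser API and the names are purely hypothetical, but it shows how little a banner would actually need to ask.

```typescript
// Hypothetical preference object a browser could expose; not a real API.
interface CookiePreferences {
  functional: boolean;        // with an exclusion list for sites that abuse it
  advertising: boolean;
  personalisation: boolean;
  analytics: boolean;
  tracking: boolean;
  functionalExclusions: string[];
}

// What a GDPR banner could do instead of rendering a popup: ask the browser,
// and only fall back to its own defaults when no preference is exposed.
function resolveConsent(
  browserPrefs: CookiePreferences | undefined,
  fallback: CookiePreferences,
): CookiePreferences {
  return browserPrefs ?? fallback;
}
```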
I'm just using uBlockO as such a solution, with the hope that the vast majority of problematic ‘third parties’ are already in the blocklists at a given time.
I am not sure much trickery is needed having witnessed the speed at which some friends just click right past the warnings. Training Gerbils could not be easier.
People want their fix and they want it now, and many are just apathetic to the idea of privacy on the net, to the point that we need a better solution.
I think it's less apathy and more that they don't understand the stakes. It's a lot like how laws in the US were written when data collection and processing was a manual task.
Sure, I could tail someone for two weeks, flash their email and SMS data, and flip through publicly available images of them. Or I can get a bunch of digital data points like GPS, wireless APs, and the actual emails and SMS data. Computers and databases make it trivial to sift through this data.
The average person likely doesn't understand how deep digital profiles can go. They think that because they use incognito to look up birthday gifts and porn, everything that's private stays private. What about when screen sharing a work presentation and there's a banner ad for cancer or addiction treatment? What about months of funeral care ads after searching for what to do after a parent or child dies?
People think that advertisers are wasting money since they see ads for the same purchase made a week prior. They'd be devastated if health insurance providers partnered with Visa or a tracking network to extract a "health risk" profile.
The DNT header got abused and sent by default, which gave companies the excuse that it wasn’t actually conveying a user selection, thus wasn’t reflective of their actual choice to avoid tracking. So it goes.
It got sent by default, but I think calling that an abuse is stretching it. Do not track by default is what is meant to happen. That's what opt-in means.
And this is exactly why I enabled the global "Disable JavaScript" option in uBlock Origin. The frustration these popups constantly cause far outweighs the slight annoyance of having to re-enable JS for some websites (and you can ask uBO to remember those anyways).
That's a bit broken for me now. I don't see the popups but I still sometimes get the overlays that stop me scrolling and I have to turn off ublock for the site, click accept, and turn ublock back on.
And this is why the consent information/opt-in/out boxes ought to be able to run with JS disabled, too. It's easy enough to do that... but not that easy if it's something that gets put on the site via JS.
Personally I think all efforts to protect online privacy and stop tracking are wrong. But not in an obvious way. It seems right at first. But the truth is that it is impossible in the long run to keep your privacy and not be tracked on the internet.
But the effort is to fight tracking and protect privacy at all costs. Even if this destroys the foundations of the internet.
Moreover it gives the false belief that clicking NO will protect you from tracking, that companies protect your data.
But it is not a true belief. People should be aware that every password and everything transmitted through the internet can be tracked and may become public one day. And act accordingly.
It is just like data protection. You can have firewalls, antivirus and so on. But what you always really want to have is a backup.
The same goes for privacy and tracking. You can use some measures to protect yourself, but you should act as if you are tracked and everything can become public one day.
But such laws assure people that they don't need to act in such a way, which makes them less safe in the end; the laws end up making people less safe, contrary to the intention of lawmakers.
I know this might just be me, but I miss the good old days where the browser would simply allow you to accept or reject cookies from a specific domain and then remember the choice. It made things like this much easier, although I suspect it would be something of a nightmare in today's cookie-infested third-party hell.
Firefox has settings to allow or block cookies for websites in Privacy & Security settings. Even better would be an option to allow some specific cookies, e.g. language or sign-in information, and block everything else by default.
As long as I can block third-party cookies by default, I'm content to let the website I'm on set whatever it wants. Firefox is moving in the right direction with total isolation, including caches, to prevent Spectre-style timing attacks, and I only hope that Chrome will follow suit.
"Clear cookies on departure" feels like it goes too far -- I do want the ability of the site to remember my login, etc., as a default thing, and once you open that door, they can link any browser identification to whatever they want on the backend; cookies just give an easy way for them to not talk to their own backend, but introduce no new security or privacy issues as far as I'm concerned.
I'd love to know similar sites for other EU countries.
One thing that I particularly dislike (and seems to be a uniquely US take on the EU GDPR rules) is "you must consent to access our site" banner. Not give _or refuse_ consent, but actively agree to the marketing crap or be redirected to a "bugger off commie" wall. An example would be healthline.com
I think this explicitly violates Article 7, paragraph 4 of the regulation, which states:
> When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
But then -- I am not a lawyer. But if any HN readers are lawyers, I'd love to hear your take on it...
I turn off JS globally in Vivaldi, then the browser has a super easy way to enable JS for each website. Then when I hit something that I really want to view and it needs JS I open it in a private window.
Cookies are not the problem here. They don't need consent for cookies, but for tracking. And if you were to block cookies, they can still track you with a lot of other fingerprinting technologies - and would again ask you for consent for that.
I recommend enabling the EasyList Cookie blocking list in the adblocker of your choice (e.g. uBlock Origin). It's not enabled by default, so check your settings (Edit: this will block the consent popups, not the cookies).
If there was a browser setting to accept (or reject) all cookies regardless of intended use, that would actually solve the problem for many. Just because many people want to make case by case decisions, we shouldn't have to burden everyone with this task.
I personally would prefer to accept all cookies, and take responsibility for keeping separate cookie jars as needed.
As I said, there are hundreds of other ways to fingerprint you. If the number of users who block tracking cookies reaches a critical mass, advertisers would switch to those.
What you're talking about doesn't describe my problem. My only problem is that sites spend any time at all asking about cookies or tracking, which I can control on my end anyway.
I would globally pre-consent to all tracking if I could, because I don't mind supporting the websites I use. This should definitely be a browser feature.
How should a Transparency & Consent String be stored?
In version 1 of the TCF Specifications the consent string was specified to be stored as either a 1st party cookie for service-specific consent or a 3rd party cookie for global consent. In version 2 of the TCF Specifications, the storage mechanism used for service-specific TC Strings is up to a CMP, including any non-cookie storage mechanism. However, global TC Strings must still be stored as cookies under the consensu.org domain.[0]
In practice, some CMPs used to share positive consent across websites, but did not share negative consent. So if they tricked you into accepting once, they keep it; if you refuse, they keep annoying you. My understanding is that watchdogs pushed back, which is why the whole sharing thing isn't as prominent anymore.
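For reference, a service-specific TC String usually ends up in a first-party cookie that a page can read back; a minimal sketch follows. The cookie name "euconsent-v2" is a common CMP convention rather than something the v2 spec mandates (as the quote above says, storage is up to the CMP).

```typescript
// Sketch: read a TC String from a first-party cookie. "euconsent-v2" is a
// common convention, not guaranteed; CMPs may store it elsewhere entirely.
function readTcString(cookieName = "euconsent-v2"): string | undefined {
  const entry = document.cookie
    .split("; ")
    .find((c) => c.startsWith(`${cookieName}=`));
  return entry?.slice(cookieName.length + 1);
}

// The value is base64url-encoded bit fields (per-purpose and per-vendor
// consent); decoding it properly needs a TCF library, not this sketch.
console.log(readTcString() ?? "no first-party TC String cookie found");
```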
I think there is a browser extension for this. I forgot what it's called but it partners with the cookie banner companies, so that it automatically sets your preferences on most websites.
On a side note: the game is (fun) advertisement for a website selling a book.
When I visit this site (me sitting in Europe) they immediately set the _ga cookie (tested on vanilla Chrome on purpose).
There is no privacy banner at all, they just set the cookie.
They probably left out the banner to save my time, no?
EDIT: gumroad sets the cookie, not bigdatagirl. Does that make it better?
Hey, it's me, the guy who made both those things.
We spent a long time removing all non-essential cookies from the site, so I'm not sure what you're getting. I just panicked and tried on vanilla Chrome, and couldn't find that cookie? We have no GA but use Fathom instead. But if it's there I want to remove it ASAP. Let me know if you have any more info.
The "Cookie Law" and the GDPR aren't the same thing. I've noticed people make this mistake a few times recently.
The Cookie Law is circa 10 years ago, I think, and is widely considered to be poorly implemented. The GDPR is newer (implemented in 2018) and is widely considered to be a good idea. AFAIK, the GDPR didn't subsume the Cookie Law, but I may be wrong about that.
The law isn't poorly implemented. The way websites deal with it is. Just don't set any cookies for a read-only visitor and you don't need to add any popups.
It's both. The law itself is poorly thought-out and overly restrictive. And then websites also don't understand it and do stupid things in the name of compliance, which are neither compliant nor beneficial to the user.
"Make a fraction of the ad money you'd have had with targeting and you don't need any popups" doesn't help people running non-hobby websites put food on the table.
> AFAIK, the GDPR didn't subsume the Cookie Law, but I may be wrong about that
You are correct. GDPR repealed and replaced the Data Protection Directive (DPD) from 1995. The "cookie law" (ePrivacy Directive, ePD) was an extension of the DPD, and made heavy reference to it. As part of replacing the DPD, GDPR includes a provision that any law referring to the DPD now refers to GDPR instead, which affects the ePD.
So ePD is still in effect, and by reference uses GDPR's new stricter definition of consent. This is a problem. The ePD was dumb but mostly ignorable. The "upgrade" has made its dumb-ness actually impactful.
Yes, the Cookie Law was older, but websites' determination to harvest as much as they can despite GDPR is what spawned these giant horrible pop-ups that have ten rows of confusing switches. It's a trick to make you opt in to all the things that GDPR says you should be able to opt out of.
I like that they didn’t go all out on those dark patterns and created a rather user-friendly and straightforward version of how that experience feels in real life.
Cookie banners have been so badly designed everywhere I see them. This being mandatory makes me go the extra mile to ensure I don't require/use ANY cookies on the webapps/websites I make.
I made a tool-assisted speedrun to complete this in 00:00.00 which has been confirmed by the dev as the TAS world record, paste this into your browser console after clicking on "let's do this" https://pastebin.com/NZQGSxhL
"You opted out of our cookies, but we're going to say we need them anyway, but you can still opt out of that".
It's somewhere between underhand and downright disturbing ("our interests override your lack of consent"? Eww)