For now the spec calls for "holdbacks", which are designed for exactly this purpose: attestors will fail randomly for a set percentage of requests, so attestation can't be used as a whitelist. Surely these "holdbacks" will either never be implemented or be dropped by attestors in no time.
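To make the holdback idea concrete, here is a minimal sketch of how an attestor-side holdback could work. This is my own illustration, not from the spec; the rate, function name, and token string are all made up:

import kotlin.random.Random

// Hypothetical attestor-side holdback: deliberately withhold attestation for a
// fixed share of requests that would otherwise pass, so a missing token can't
// be read as "non-compliant" and attestation can't be used as a whitelist.
const val HOLDBACK_RATE = 0.05 // illustrative value, not from the spec

fun maybeAttest(devicePasses: Boolean): String? {
    if (!devicePasses) return null                        // genuinely failed the checks
    if (Random.nextDouble() < HOLDBACK_RATE) return null  // held back on purpose
    return "signed-attestation-token"                     // placeholder for a real signed token
}

The whole point of my comment is that nothing forces attestors to keep that rate above zero.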
I rechecked the current spec. It does not fully cover what a user agent can ask of the attestor (the "content binding" is still to be defined). So we can assume what gets attested is defined by the attestor.
Of course this does not mean the attestor can't have different profiles to attest against.
So what your comment describes is possible, just not defined yet. Which, I believe, we can rightfully assume will be in the final spec or implementation.
I am using Firefox too. However, I also consume lots and lots of general-purpose websites, which will probably become unusable over time if you are not compliant. Which in turn will either render FF unusable or force it to adopt the unfortunate standard.
Block users all you want, but don't expect me to "attest my hardware and software" through a 3rd party. Let alone make this a standard and leave the keys with parties that are probably "themselves" only.
How on earth can the expectation be to give third parties the authority to validate my hardware and software, so they can attest it against an arbitrary standard I will never have control over?
See the current SSL certificate authority mess. I have to pay a third party to assure my clients that my server can communicate with them securely. Now they are doing the same to clients, in an even stricter manner.
Because it just works (tm). And it is flexible to a degree no GUI can ever match. It's liberating. It's repeatable, it's automatable. It's about control. And most importantly, it's FAST!
If you try to max out the control, the GUI turns into a UX disaster. Check any enterprise software GUI to see what I mean. There will be lots and lots of buttons all around, and you'd still end up with some kind of text input or programming environment inside it.
put.io is another option. You provide links, it downloads and streams them for you. I haven't used it in a long, long while, but it was really, really good. With the deduplication it has, most of the content would already be downloaded for you.
It is, but if you review the 3 changed lines of the PR and not the whole 50 lines of the file, without thinking about the overall picture, this is what happens. I got this while reviewing a PR like that. And most probably I approved similar PRs for this file in the past. Shame on me too..
That's not the real issue; I introduced it while anonymizing the data. The real issue is that a 3-5 line piece of code grew into a huge switch case because people just kept adding one more case and never thought about how it should be done. Something like 15 engineers contributed to this over time :)
Something like below would be huge improvement:
val subs1 = listOf("cluster1", "cluster2")
if (clusterName in subs1) return subs1
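Taken a step further, the whole switch case could collapse into data plus a single lookup. A rough sketch along those lines (the second group and the groupFor name are made up for illustration):

// Hypothetical extension of the same idea: keep the cluster groups as data and
// return whichever group contains the cluster, instead of growing a switch case.
val clusterGroups = listOf(
    listOf("cluster1", "cluster2"),  // the subs1 list from above
    listOf("cluster3", "cluster4"),  // made-up second group
)

fun groupFor(clusterName: String): List<String> =
    clusterGroups.firstOrNull { clusterName in it } ?: emptyList()

Adding a new cluster then means touching one line of data instead of adding yet another case.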
You can use pencils as supports, too.