> This part of their defence is outright laughable. What's the point of having this sort of database without the associated metadata?
I'm not a fan of Clearview AI, but I understand it's still useful even for faces that aren't explicitly tied to an identity in the system, because if the police know where the matched photo is from, they can do the legwork to fill in the missing details.
I think one of the examples in an early article about it had the police feeding a suspect's face into it. The system found a match in the background of some photo taken at a trade show (the guy was working a booth). It only knew the website the photo was from, which allowed the police to identify the trade show, which allowed them to track down the people working at that booth to find their suspect.
The point being... _they then know where it's from_, that's the metadata. If they know the trade show, then they know where and when it was, whether it was in Canada and whether they need to delete it.
> The point being... _they then know where it's from_, that's the metadata. If they know the trade show, then they know where and when it was, whether it was in Canada and whether they need to delete it.
In most cases they probably only know the URL, and it's far from trivial (or even possible in the general case) to mechanically derive from that whether a photo was taken in Canada.
Personally I hope they're forced to overshoot and delete far more than necessary to ensure compliance (e.g. only keep photos they can positively determine were taken outside of Canada).
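To illustrate why a URL alone carries so little location signal, here's a minimal sketch (the function and the URLs are hypothetical, purely for illustration): the only thing a URL mechanically offers is perhaps a country-code TLD, and most photos sit on .com hosts or CDNs that say nothing about where the photo was taken.

```python
from typing import Optional
from urllib.parse import urlparse

# Hypothetical heuristic: guess a photo's country from its URL's TLD.
# This demonstrates the problem rather than solving it: a ccTLD is at
# best a weak hint about the *host*, not about where the photo was
# taken, and generic TLDs give no signal at all.
def guess_country_from_url(url: str) -> Optional[str]:
    host = urlparse(url).hostname or ""
    tld = host.rsplit(".", 1)[-1]
    return {"ca": "Canada"}.get(tld)

print(guess_country_from_url("https://example.ca/team.jpg"))
# "Canada" -- but only a hint; a .ca site can host photos taken anywhere
print(guess_country_from_url("https://cdn.example.com/img/1.jpg"))
# None -- no signal whatsoever, which is the common case
```

Anything beyond this (EXIF GPS tags, page-text geocoding) is unreliable too, since most sites strip metadata on upload.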
They don't, they just know the original URL. IIRC, they also scrape news websites and it's impossible to automatically and correctly detect the location of a person.
Even if it's from a social network, you still can't be sure, especially if a photo has multiple people in it.
It's so impossible that it's the actual use case they're selling?
"The system found a match in the background of some photo taken at a trade show (the guy was working a booth). It only knew the website the photo was from, which allowed the police to identify the trade show, which allowed them to track down the people working at that booth to find their suspect."
Just as evidence that I'm not defending them: I find their project totally unethical, and the owners and their associates are also heavily involved in internet fascism and credential phishing (I know the source isn't the best, but I recommend reading the article): https://www.huffpost.com/entry/clearview-ai-facial-recogniti...
With that said: the order is clearly impossible to execute. There's a huge difference between the police clicking a URL in a result and manually investigating to determine the location, and Clearview's system automatically determining the location of every photo and deleting the ones taken in Canada.
> With that said: the order is clearly impossible to execute.
It is clearly possible to execute; it would just mean much higher costs per image that ClearView processes, and ClearView would probably need to avoid using any images whose location they can't verify in an economical fashion.
Saying it is "impossible" is different from saying that the costs of compliance would fundamentally change ClearView's business model.
Sure, but the point is that their business is not viable then. If it's illegal to serve child porn and imgur can't profitably detect whether it's serving child porn or not, then imgur is doomed. I think no one would dispute that?
Depends, what level of accuracy are you demanding?
If the answer is 100%, then yes, all user-generated content on the internet will have to be banned unless subject to manual human review before allowing the post to be seen, and even that is probably not perfect.
The matched photo is some piece of evidence the police already have from some investigation, e.g. surveillance camera or whatever. They know exactly where they got it. This is matched against the Clearview AI database. So then that match attaches information to the Clearview AI photo. They may be able to figure out where that was taken, since it is narrowed down.