Hitoshi Kokumai

1 year ago · 1 min read



‘Harmful for security or privacy’ OR ‘Harmful for both security and privacy’


The situation is still the same, so I am bringing back an article posted 13 months ago.

From one angle, biometrics would be harmful to ‘privacy’ if it is as accurate as claimed, or harmful to ‘security’ if it is not.

From another angle, biometrics is harmful to ‘both security and privacy’, irrespective of whether it is accurate or inaccurate.

https://www.linkedin.com/pulse/security-vs-privacy-hitoshi-kokumai

........................................................

‘Security vs Privacy’ OR ’Security & Privacy’



[Image: ‘How does live facial recognition work?’ — infographic showing photos being compared against a watch list, with possible matches flagged]

Police facial recognition surveillance court case starts ( https://www.bbc.co.uk/news/uk-48315979 )

I am interested in what is not referred to in the linked BBC report: namely, the empirical rate of target suspects not being spotted (False Non-Match), when 92% of the 2,470 potential matches were wrong (False Match).

The police could have gathered such False Non-Match data in the street just as easily and quickly by having several officers act as suspects, some disguised with cosmetics, glasses, wigs, beards, bandages, etc., as many real suspects presumably do when walking the streets.

By combining the False Match and False Non-Match data, they would be able to obtain an overall picture of the performance of the AFR (automated facial recognition) system in question.
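The combined picture could be sketched as follows. The False Match figures come from the BBC report (92% of 2,470 potential matches wrong); the disguised-officer numbers are purely hypothetical placeholders for the street trial proposed above, not real data.

```python
# False Match side: figures reported by the BBC.
potential_matches = 2470
false_match_rate = 0.92                      # 92% of flags were wrong
false_matches = round(potential_matches * false_match_rate)
true_matches = potential_matches - false_matches

# False Non-Match side: HYPOTHETICAL street trial with disguised officers.
officers_in_trial = 50                       # assumption for illustration
officers_spotted = 30                        # assumption for illustration
false_non_match_rate = 1 - officers_spotted / officers_in_trial

print(f"False matches: {false_matches} of {potential_matches} flags")
print(f"True matches:  {true_matches}")
print(f"Estimated False Non-Match rate: {false_non_match_rate:.0%}")
```

Only with both rates in hand can anyone judge whether the system is ‘accurate enough to threaten privacy’ or ‘inaccurate enough to threaten security’.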

 1.   If the AFR is judged accurate enough to correctly identify a person with a meaningful probability, it could be viewed as a serious ‘threat to privacy’ in democratic societies, as civil-rights activists fear. This scenario is very unlikely, though, in view of the 92% false-spotting figure.

 2.   If the AFR is judged inaccurate enough to misidentify a person with a meaningful probability, as we suspect it is, we could conclude not only that deploying AFR is a waste of time and money, but also that the false sense of security created by misguided, excessive reliance on it could be a ‘threat to security’.

 Incidentally, should (2) be the case, we could draw two different observations.

 (A)  It could discourage civil-rights activists: a system that proves only that an individual may or may not be identified correctly is hardly worth being called a ‘threat’ to our privacy.

 (B)  It could encourage civil-rights activists: it debunks the story that AFR increases security so much that a certain level of threat to privacy must be tolerated.

 It would be up to civil-rights activists which viewpoint to take.

 Anyway, different people could come to different conclusions from this observation. I would like to see some police force conduct the empirical False Non-Match research in the street as indicated above, which could solidly establish whether ‘AFR is a threat to privacy though it may help security’ or ‘AFR is a threat to both privacy and security’.


Comments
Hitoshi Kokumai

1 year ago #2

#1
No objection. Biometrics, when deployed unwisely, would make racism worse. I am focusing on the threat of biometrics to the security of digital identity because few other people talk much about this particular issue, whereas many people are already talking about the threat of biometrics to privacy and civil rights.

Zacharias 🐝 Voulgaris

A black-box technology like this is bound to be much worse than that. For instance, it can be a way to further promote racism, considering that the police database is bound to have far more pictures of black people (thanks to the existing racism in arrests and the race-based social divides). So, some people would be more accurately identified, solidifying the argument that the system works, while other people would probably be dismissed due to the high chance of a false match. If that's not a cover for racial discrimination, I don't know what is. Cheers!
