7 comments

  • ChrisArchitect an hour ago

    [dupe] https://news.ycombinator.com/item?id=41726385

    https://news.ycombinator.com/item?id=41724957

    among other submissions.

    Just focus on the students' source instead of repeating these news posts over and over and over

    Source Doc: https://news.ycombinator.com/item?id=41724310

  • gradientsrneat 3 hours ago

    Really glad there are researchers out there coming up with bad actor scenarios, and I hope corporations, privacy advocates, and the government take these into consideration.

    These are solvable problems. Not always 100%, but the harms can be mitigated. But the truth is that some people don't want these problems solved.

    Another example in the wild was Apple AirTags.

  • rahimnathwani 3 hours ago

    I'm curious about how they do the reverse image search. Do they just use Google?

    • palmfacehn 3 hours ago

      The article mentions that they used publicly available databases. Their paper specifically mentions PimEyes and FaceCheck ID.

      • mrgoldenbrown 19 minutes ago

        This use seems like it would violate the terms of service for PimEyes, which say it should only be used with the consent of the people in the photos. (In practice, it's clear they don't actually care.)

  • Justin_K 2 hours ago

    While I like what they stitched together, it has nothing to do with Meta ... at the end of the day, any camera can do the same.

    • 39896880 an hour ago

      Meta just released an internet-connected camera disguised as eyewear. That makes this situation quite different. If a person wants to avoid being photographed, they can move away from the person holding the camera, or avoid public places that have surveillance systems. The expectation has now been raised: to avoid being identified, a person must avoid anyone wearing glasses.

      Far more people wear glasses than hold a camera.