Why you should think twice before uploading photos of family/friends...

Started by Mario, February 11, 2025, 06:34:45 PM


Mario

The https://theyseeyourphotos.com/ site is a project to inform users about what the Google bot (and Facebook bot, and Bing bot, and Apple bot, ...) learns from images scanned from personal and commercial websites, Facebook, Instagram etc.

For example, for this image from my collection (AI-generated, so no privacy issues for these ladies):
(right-click on the image > open in new tab for a better view).

The image contains only minimal EXIF data - no GPS or camera info. Data like that would only help Google dig even deeper.


Image3.jpg

Note: When you drag a test photo onto this web site, give it a few seconds to handle it.

voronwe

This is really scary. I just uploaded a picture of me (nothing special, just a normal portrait you would use for e.g. Teams).
I do not know what scares me more:
the assumptions this API makes that are correct,
or
the assumptions it makes that are incorrect.

For example: the assumption about my political position was correct - and I really wonder where it got this information from.


Mario

Quote from: voronwe on February 20, 2025, 03:51:26 PMFor example: the assumption about my political position was correct - and I really wonder where it got this information from
Facebook, Twitter, Instagram, Google, YouTube and several thousand other companies are tracking you across every web site, unless you use tools like UBlock or Privacy Badger. And probably even then.

If you visit web sites, watch YouTube videos, or maybe buy the same toilet paper as persons A, C, D, and X, and these all have a specific political view or tend towards a specific party, Google and others will factor this in.

Your smart phone is constantly feeding what you do, where you are, where you go and how fast, how long you stay etc. into the graphs Google, Apple and others maintain. Some popular apps out there feed your data to dozens or even hundreds of data brokers.  But hey, it's free!

Facebook, Google etc. use face recognition to detect persons in photos you upload. And in photos uploaded by others showing you. This is how they learn who you know. And who the persons you know know.

AI now allows them to learn everything about photos on the web, Facebook, Twitter, Instagram. From where they were taken and when, the clothes you wear, your weight and age, which car you drive, what the inside of your home looks like, which furniture you have etc.

They accumulate all of the data they gather in your graph, buy additional data, sell your data, and run powerful algorithms to learn who you are, and how to make you buy stuff by showing you the right ads. Or showing you the right feeds on Twitter or YouTube.

Web sites like https://theyseeyourphotos.com/ show the extent of information and knowledge these companies have about us.
Most people are not aware of any of this, unless they work in IT or take an interest in topics like security and privacy.
And the guy is using public APIs to get the data. I highly doubt that Google gives full access to their graphs to others.

In the E.U. we have rather strict privacy laws. And of course Big IT is moaning, fighting every law and regulation, and spending years in the courts just to delay the right to more privacy - to make some extra money in the meantime.

Just the other day I surprised somebody. We were talking about cookies and what they do. And, her words, the "damn cookie banners in Germany".

I gave her an example, using the website "The Verge". When you click the customize option in their cookie banner to learn what this is all about, you see this:

Image1.jpg

The term "nnn partners can use this purpose" means that this many companies have access to the cookies theverge.com plants on your PC or phone. And they can read these cookies from any web site in their marketing network, allowing them to track you wherever you go, whatever site you visit.

The blue highlight is mine. I'm not sure I want to allow over 800 companies I don't know anything about to track me everywhere on the internet. Even the "essential", always-on cookies are shared with 536 other companies.

The Verge is just an example. Most sites use the same tracking networks, with a similar number of companies they share data with (aka "partners").

Of course most people out there just click "I agree" and then move on with their lives.
And then wonder, why Google and others know so much about them  ;D




sinus

Quote from: Mario on February 20, 2025, 05:05:09 PMAnd then wonder, why Google and others know so much about them  ;D



I wonder, if you accept only the necessary cookies, as you do, and do not allow the others, whether the result is really always the same.
At least we (I) believe that then only the essential cookies are used... but who knows?
Best wishes from Switzerland! :-)
Markus

JohnZeman

I almost never post photos of people on Facebook or X, but before I upload any photos to social media, I use ExifTool to strip all of the metadata from them.
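ExifTool's `-all=` option (`exiftool -all= photo.jpg`) is the usual way to wipe EXIF, XMP and IPTC in one pass. For the curious, here is a minimal sketch of what such stripping does at the byte level, in plain Python with no third-party libraries. `strip_jpeg_metadata` is a hypothetical helper of my own, far less thorough than ExifTool, shown only to illustrate that JPEG metadata lives in discrete marker segments that can simply be dropped (it ignores some real-world quirks like padding bytes between markers):

```python
import struct

def strip_jpeg_metadata(data: bytes) -> bytes:
    """Return a copy of a JPEG byte stream without EXIF/XMP, ICC and comment segments."""
    if data[:2] != b"\xff\xd8":  # every JPEG starts with the SOI marker
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        if data[i] != 0xFF:
            raise ValueError("corrupt marker")
        marker = data[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows until EOI
            out += data[i:]
            break
        # Each remaining segment is: 0xFF, marker, 2-byte big-endian length
        # (the length includes its own two bytes, not the marker).
        length = struct.unpack(">H", data[i + 2:i + 4])[0]
        segment = data[i:i + 2 + length]
        # Drop APP1 (EXIF/XMP, 0xE1), APP2 (ICC profile, 0xE2) and COM (0xFE);
        # keep everything the decoder needs (quantization tables, frame header, ...).
        if marker not in (0xE1, 0xE2, 0xFE):
            out += segment
        i += 2 + length
    return bytes(out)
```

The image data itself is untouched, which is also why stripping metadata does nothing against face recognition - only the EXIF/XMP side channel goes away.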

Mario

Quote from: sinus on February 20, 2025, 05:43:37 PMI wonder, if you accept only the necessary cookies, as you do, and do not allow the others, whether the result is really always the same.
At least we (I) believe that then only the essential cookies are used... but who knows?
Essential cookies, in the case of The Verge, are shared with 536 other companies. If you look at the cookie settings for your favorite web site, you will see very similar numbers :-\

Mario

Quote from: JohnZeman on February 20, 2025, 07:02:52 PMI almost never post photos of people on Facebook or X, but before I upload any photos to social media, I use ExifTool to strip all of the metadata from them.
You can also simply use the Batch Processor to export the image without metadata, or with controlled metadata (like the tag I've explained in Protect Your Images from Data Mining and AI).

Metadata helps Big IT. But even without metadata, applying face recognition and image analysis, they learn what's there to learn about your photos.

onnod

A quick check of PimEyes and FaceCheck.ID should be sobering. Stripping metadata can only do so much. There are many situations where it is hard or impossible to stop personal images and data from being uploaded publicly, and once something is out there, it is available forever.

suttonbg

Dear Mario,

Thank you for this eye-opening demonstration. I've found a way to double your fun: submit the same photo multiple times and compare the results. I've received analyses that are completely contradictory and sometimes totally wrong, but someone somewhere is making a decision based on this data that may, directly or indirectly, impact our lives.

It also raises the question of how to safely share family photos. I recently photographed a relative's 70th birthday party, with many guests, globally dispersed, all of whom would like to see my best results. My preference is to keep the sharing private among the attendees. I've looked at a lot of photo sharing websites, guided by usually dependable reviews, and have really not found anything that fits the bill in terms of utility and security. For instance, I was very excited about group sharing in Amazon Photos, until I discovered, after lengthy chats with them, that this facility is not offered in places like the UK, Canada and Australia.

The conclusion I've arrived at is to use iMatch Anywhere over a private VPN, probably Tailscale. From my reading and a bit of experimentation, that seems to offer the ease of use and the security we want.

That decision raises some operational questions regarding deployment of IMA, but I'll ask those in the appropriate forum.

Thanks again.

sinus

One of the problems is that the "world" is now quite global, and people like to share photos and videos on social media.
And if you take any part in social life, or at least in some events, then the chance - or risk, depending on your view - is big that your face will flow out into the world on the internet, sooner or later.

Say you go to a wedding as a guest. People take a lot of pictures and videos there, and most of them do not ask, of course, because the mood is great - and it would sometimes not even be possible, or would feel a bit odd, to ask on such occasions.

And then people like to share, and after a while you will receive a message with links to where all the photos are. :) And sometimes you will spot images with your face that you did not know about.

This is just one example, but I believe there are a lot of such situations, like sports events, concerts and many other small or big events. Or you simply walk down the street and someone likes to take photos.

Well, this "new world" has a lot of nice parts, but it can also be very annoying, I think.
Best wishes from Switzerland! :-)
Markus

axel.hennig

Quote from: suttonbg on March 10, 2025, 07:53:50 AMThe conclusion I've arrived at is to use iMatch Anywhere over a private VPN, probably Tailscale. From my reading and a bit of experimentation, that seems to offer the ease of use and the security we want.

Interesting. I would be interested in more details on how exactly you set this up (from a security point of view).

Mario

One does not have to share everything with the entire world. Facebook offers closed groups, to share photos and information only with invited guests. Facebook will have access to the information, though.

In general, I try not to feed the AIs of Big IT with personal information.
The less their AIs know about you, and the less information about you is in their "graphs", the better. Less targeted ads, less potential manipulation via Facebook, Insta, and YouTube feeds. And less-biased Google search results.

All of this has become too creepy for my taste. But I'm me; other people's mileage will vary.