EyeEm Photo-Sharing Platform

EyeEm, the Berlin-based photo-sharing community acquired by Spanish company Freepik last year after going bankrupt, has added a new provision to its Terms & Conditions. The clause grants the company the right to use users’ photos for training AI models. In an email sent earlier this month, users were notified of the update and given a 30-day window to opt out by removing their content from EyeEm’s platform. Those who don’t are deemed to consent to this use of their work.

At the time of its acquisition in 2023, EyeEm’s library held 160 million images and the platform had nearly 150,000 users. The company said it would gradually merge its community with Freepik’s. Despite the platform’s decline, data from Appfigures indicates that nearly 30,000 people still download the app each month.

Once considered a potential rival to Instagram, or at least “Europe’s Instagram”, EyeEm had shrunk to a staff of just three by the time Freepik acquired it, as TechCrunch’s Ingrid Lunden previously reported. Freepik CEO Joaquin Cuenca Abela hinted at plans for EyeEm, suggesting the company would explore bringing more AI into the platform for creators.

EyeEm’s latest Terms & Conditions read as follows:

8.1 Grant of Rights – EyeEm Community

By uploading Content to EyeEm Community, you grant us regarding your Content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, make derivative works of, communicate to the public and/or promote such Content.

This specifically includes the sublicensable and transferable right to use your Content for the training, development and improvement of software, algorithms and machine learning models. In case you do not agree to this, you should not add your Content to EyeEm Community.

The rights granted in this section 8.1 regarding your Content remains valid until complete deletion from EyeEm Community and partner platforms according to section 13. You can request the deletion of your Content at any time. The conditions for this can be found in section 13.

Section 13 outlines a convoluted deletion process. It begins with the direct removal of photos, which, according to the company, won’t affect content previously shared to EyeEm Magazine or social media. To delete content from EyeEm Market (where photographers sold their photos) or other content platforms, users must email support@eyeem.com with the Content ID numbers of the photos they want deleted, specifying whether each should be removed from their account entirely or just from EyeEm Market.

Notably, the notice says that deletions from EyeEm Market and partner platforms may take up to 180 days. So while requested deletions face a lengthy processing time, users get only 30 days to opt out, leaving manual deletion of their photos as the sole option.

Worse, the company adds:

You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content according to sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all partner platforms within the time frame indicated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the request for deletion or the deletion.

Section 8 spells out the licensing rights for AI training. Section 10, meanwhile, notifies users that deleting their account means forfeiting any pending payouts for their work, a catch for anyone considering account deletion as a way to keep their photos out of AI models. Tricky, isn’t it?

EyeEm’s move reflects a broader trend of AI models being trained on users’ content, sometimes without their explicit consent. While EyeEm did offer an opt-out of sorts, photographers who missed the announcement risked losing control over how their photos are used in the future. Given EyeEm’s dwindling popularity as an Instagram alternative, many photographers may have forgotten about the platform altogether and overlooked or disregarded the email notification, which may well have landed in a spam folder.

Those who did catch wind of the changes have voiced frustration over the short 30-day notice period and the lack of a bulk-deletion option for their contributions, which makes opting out considerably more cumbersome.

EyeEm did not immediately respond to requests for comment; given the 30-day deadline, we decided to publish before receiving a response.

Conduct like this underscores why users are increasingly exploring alternatives on the open social web. One such platform, Pixelfed, which runs on the same ActivityPub protocol as Mastodon, is leveraging the EyeEm incident to court new users.

In an official statement, Pixelfed declared, “We prioritize your privacy and pledge never to utilize your images for AI model training. Privacy First, Pixels Forever.”

Read More: Shy Kids Explains Sora-powered AI-Generated Short’s Features and Constraints