
Artists Boycott Meta’s AI Policy: Leaving Instagram for Cara

Meta’s recent decision to use Instagram content to train its AI models has sparked a backlash among artists. As artificial intelligence develops at a rapid pace, the practice of training models on user-generated content by companies like Meta is drawing criticism from a growing number of groups. After learning that their Instagram posts were being used to train Meta’s AI, many artists have decided to move to a new platform. Here are the details…

Artists Migrate from Instagram to Cara

Artists, upon discovering that Meta was using Instagram content for AI training, have decided to switch to a different platform. Cara, an AI-resistant portfolio application that promises to protect artists’ works from being scraped for AI training, has gained significant attention as an alternative to Instagram. Over the past week, the platform’s user count has skyrocketed from roughly 40,000 to 650,000, at one point jumping from 100,000 to 300,000 in just a few days and climbing to the top of the app store charts. Cara only allows AI-generated content if it is explicitly labeled as such.

Cara is designed as a social networking app for creators, offering a platform where users can share their drawings, content, or text-based posts with others. It is available for free download on the Apple App Store and Google Play Store and also accessible via web browsers.

Cara combines elements of X (formerly Twitter) and Instagram, and its art-focused design and safeguards against AI scraping have garnered significant interest from artists over the past week. In December, Cara introduced Cara Glaze, a tool aimed at better protecting artists’ works from AI. Developed by the SAND Lab at the University of Chicago, Glaze alters how AI models perceive an artwork, making it harder for them to mimic the artist’s style. It makes minimal changes that are imperceptible to the human eye but effectively mislead AI models.
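To make the idea of an imperceptible edit concrete, here is a minimal sketch in Python (using NumPy and Pillow) that applies a small, bounded pixel perturbation to an image file. It only illustrates what “changes too small for the eye to notice” means: the actual Glaze and Nightshade tools compute their perturbations by optimizing against AI feature extractors rather than adding random noise, and the file names and EPSILON value below are placeholder assumptions.

# Conceptual sketch only: applies a small, bounded pixel perturbation to an image.
# Real cloaking tools such as Glaze choose the perturbation adversarially against
# AI feature extractors; random noise like this will NOT by itself fool a model.
import numpy as np
from PIL import Image

EPSILON = 4  # maximum per-pixel change (out of 255), small enough to be invisible

def add_bounded_perturbation(in_path: str, out_path: str) -> None:
    # Load the image as signed integers so the perturbation can go negative.
    img = np.asarray(Image.open(in_path).convert("RGB"), dtype=np.int16)
    # Random perturbation in [-EPSILON, +EPSILON] for every pixel and channel.
    noise = np.random.randint(-EPSILON, EPSILON + 1, size=img.shape, dtype=np.int16)
    # Clamp back to the valid 0-255 range and save the visually identical copy.
    cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save(out_path)

if __name__ == "__main__":
    add_bounded_perturbation("artwork.png", "artwork_cloaked.png")  # placeholder file names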

Advanced Tools for Artist Protection

Cara will soon introduce another protective tool called Nightshade, which makes imperceptible, pixel-level changes to artworks that lead AI models to misinterpret their content, further safeguarding artists’ creations.

Beyond the artist community, there is a broader ethical concern regarding Meta’s use of personal content for AI training without explicit permission. Many find it unethical for Meta to use users’ posts in this manner. What are your thoughts on Meta using your content to train its AI? Share your opinions in the comments below!
