by huewire
November 23, 2024
Artists and writers have launched several lawsuits against AI companies, arguing that their work has been scraped into databases for training AI models without consent or compensation. Tech companies have responded that anything on the public internet falls under fair use. But it will be years before we have a legal resolution to the problem.

Unfortunately, there is little you can do if your work has been scraped into a data set and used in a model that is already out there. You can, however, take steps to prevent your work from being used in the future. 

Here are four ways to do that. 

Mask your style 

One of the most popular ways artists are fighting back against AI scraping is by applying “masks” to their images, which protect their personal style from being copied.

Tools such as Mist, Anti-DreamBooth, and Glaze add tiny changes to an image’s pixels that are invisible to the human eye, so that if and when images are scraped, machine-learning models cannot decipher them properly. You’ll need some coding skills to run Mist and Anti-DreamBooth, but Glaze, developed by researchers at the University of Chicago, is more straightforward to apply. The tool is free and available to download as an app, or the protection can be applied online. Unsurprisingly, it is the most popular tool and has been downloaded millions of times. 
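
Under the hood, these tools rely on adversarial perturbations: small, bounded pixel changes chosen to mislead a model’s feature extractor. The sketch below is purely illustrative and is not the actual Glaze, Mist, or Anti-DreamBooth algorithm; `encoder` is an assumed stand-in for any pretrained image feature extractor, and `target_image` supplies the style the features are nudged toward.

```python
import torch
import torch.nn.functional as F

def cloak(image, target_image, encoder, eps=8 / 255, steps=200, lr=1e-2):
    """Nudge `image` so its features move toward `target_image`'s,
    while keeping every pixel change smaller than `eps`.

    Illustrative sketch of feature-space cloaking, not any specific tool.
    `image` and `target_image` are (1, 3, H, W) tensors in [0, 1];
    `encoder` is any differentiable pretrained feature extractor.
    """
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_features = encoder(target_image)
    for _ in range(steps):
        optimizer.zero_grad()
        features = encoder((image + delta).clamp(0.0, 1.0))
        loss = F.mse_loss(features, target_features)  # pull features toward target
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # bound keeps the mask invisible to the eye
    return (image + delta).clamp(0.0, 1.0).detach()
```

The `eps` bound is what keeps the mask imperceptible: the optimization can move the image a long way in feature space while each individual pixel shifts by only a few intensity levels.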

But defenses like these are never foolproof, and what works today might not work tomorrow. In computer security, breaking defenses is standard practice among researchers, as this helps people find weaknesses and make systems safer. Using these tools is a calculated risk: Once something is uploaded online, you lose control of it and can’t retroactively add protections to images. 

Rethink where and how you share 

Popular art profile sites such as DeviantArt and Flickr have become gold mines for AI companies searching for training data. And when you share images on platforms such as Instagram, its parent company, Meta, can use your data to build its models in perpetuity if you’ve shared it publicly. (See opt-outs below.) 

One way to prevent scraping is by not sharing images online publicly, or by making your social media profiles private. But for many creatives that is simply not an option; sharing work online is a crucial way to attract clients. 

It’s worth considering sharing your work on Cara, a new platform created in response to the backlash against AI. Cara, which collaborates with the researchers behind Glaze, is planning to add integrations to the lab’s art defense tools. It automatically implements “NoAI” tags that tell online scrapers not to scrape images from the site. It currently relies on the goodwill of AI companies to respect artists’ stated wishes, but it’s better than nothing. 
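
For reference, “NoAI” tags of this kind are usually expressed as robots meta directives; the `noai` and `noimageai` values below follow the convention popularized by DeviantArt. As noted above, compliance is voluntary on the scraper’s side.

```html
<!-- Ask AI crawlers not to use this page's content for model training.
     Honoring these directives is voluntary. -->
<meta name="robots" content="noai, noimageai">
```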

Opt out of scraping 

Data protection laws might help you get tech companies to exclude your data from AI training. If you live somewhere that has these sorts of laws, such as the UK or the EU, you can ask tech companies to opt you out of having your data scraped for AI training; Meta, for example, publishes instructions for submitting such a request. Unfortunately, opt-out requests from users in places without data protection laws are honored only at the discretion of tech companies.

The site Have I Been Trained, created by the artist-run company Spawning AI, lets you search to find out if your images have ended up in popular open-source AI training data sets. The organization has partnered with two companies: Stability AI, which created Stable Diffusion, and Hugging Face, which promotes open access to AI. If you add your images to Spawning AI’s Do Not Train Registry, these companies have agreed to remove your images from their training data sets before training new models. Again, unfortunately, this relies on the goodwill of AI companies and is not an industry-wide standard. 
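
If you host a portfolio on your own domain, you can also signal an opt-out at the crawler level. The robots.txt sketch below disallows two well-known crawlers associated with AI training data, OpenAI’s GPTBot and Common Crawl’s CCBot; like the registry, it works only if a crawler chooses to honor it.

```
# robots.txt: ask known AI-related crawlers to stay away (voluntary)
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```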

If all else fails, add some poison

The University of Chicago researchers who created Glaze have also created Nightshade, a tool that lets you add an invisible layer of “poison” to your images. Like Glaze, it adds invisible changes to pixels, but rather than just making it hard for AI models to interpret images, it can break future iterations of these models and make them behave unpredictably. For example, images of dogs might become cats, and handbags might become toasters. The researchers say relatively few samples of poison are needed to make an impact. 
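
Conceptually, poisoning reuses the same optimization skeleton as the cloaking sketch above, but aims the image’s features at a different concept rather than a different style. Again purely illustrative, reusing the hypothetical `cloak` and `encoder` from earlier:

```python
# Shift a dog photo's features toward a cat photo's; a model trained on
# enough such images can learn the wrong association for "dog".
poisoned_dog = cloak(dog_image, target_image=cat_image, encoder=encoder)
```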

You can add Nightshade to your images by downloading the Nightshade app. In the future, the team hopes to combine Glaze and Nightshade, but at the moment the two protections have to be added separately.
