Norobiik @Norobiik@noc.social

#Nightshade is an offensive #DataPoisoning tool, a companion to the defensive style-protection tool called #Glaze, which The Register covered in February last year.

Nightshade poisons #ImageFiles to give indigestion to models that ingest data without permission. It's intended to make those training image-oriented models respect content creators' wishes about the use of their work. #LLM #AI

How artists can poison their pics with deadly Nightshade to deter #AIScrapers
https://www.theregister.com/2024/01/20/nightshade_ai_images/
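For the curious, a minimal sketch of the broad family Nightshade belongs to: feature-space image poisoning, where an image is nudged so a model's encoder "sees" something different while the pixels barely change. This is not Nightshade's actual algorithm; the encoder choice (a torchvision ResNet-18 stand-in), the loss, and the parameters below are assumptions for illustration only, assuming PyTorch and torchvision are installed.

# Illustrative sketch only; NOT Nightshade's real method.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# A pretrained encoder stands in for the feature extractor of a model
# that might scrape and train on the image.
encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()          # keep penultimate features
encoder.eval().requires_grad_(False)

to_tensor = T.Compose([T.Resize((224, 224)), T.ToTensor()])

def poison(source_path, target_path, eps=8 / 255, steps=50, lr=0.01):
    """Shift the source image's features toward an unrelated target concept
    while keeping every pixel within a small budget (eps), so the change
    stays visually subtle but misleads feature-based training."""
    src = to_tensor(Image.open(source_path).convert("RGB")).unsqueeze(0)
    tgt = to_tensor(Image.open(target_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        tgt_feat = encoder(tgt)            # features we want to mimic

    delta = torch.zeros_like(src, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        poisoned = (src + delta).clamp(0, 1)
        loss = torch.nn.functional.mse_loss(encoder(poisoned), tgt_feat)
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)        # enforce the pixel budget

    return (src + delta).clamp(0, 1).detach()

The pixel budget is the key design choice in this kind of scheme: the image still looks right to a human viewer, but its features no longer match its apparent content, which is what "gives indigestion" to models trained on scraped copies.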