Stable Attribution
Stable Attribution is a tool that aims to restore credit to the artists whose work is used to train artificial intelligence (AI) models for image generation. It works by finding the most similar images from the training data that influenced the AI-generated image, and displaying them along with their original sources and creators. By doing so, Stable Attribution hopes to enable ethical collaboration between human artists and AI, as well as to foster creativity and discovery of new works.


Stable Attribution was developed by Chroma, a start-up that specializes in making AI understandable by analyzing how the training data affects the model's behavior. The tool currently supports images generated by Stable Diffusion, a popular text-to-image model that uses a large dataset of images and captions scraped from the web. Stable Attribution's algorithm decodes the generated image and compares it with the indexed dataset to find the most influential images, based on visual similarity and details.
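The page does not publish Chroma's actual algorithm, but the core idea it describes, finding the training images most similar to a generated image, can be sketched as a nearest-neighbor search over image embeddings. The function below is a hypothetical illustration, assuming images have already been encoded into fixed-length embedding vectors; it is not Stable Attribution's real implementation.

```python
import numpy as np

def top_k_influences(query_embedding, dataset_embeddings, k=5):
    """Return indices of the k dataset images most similar to the query.

    Similarity is measured as cosine similarity between embedding
    vectors. This is a simplified stand-in for the kind of
    visual-similarity search the article describes.
    """
    # Normalize vectors so dot products equal cosine similarities.
    q = query_embedding / np.linalg.norm(query_embedding)
    d = dataset_embeddings / np.linalg.norm(
        dataset_embeddings, axis=1, keepdims=True
    )
    scores = d @ q  # one cosine-similarity score per dataset image
    # Indices of the k highest-scoring images, best first.
    return np.argsort(scores)[::-1][:k]
```

In practice a service indexing billions of images would use an approximate nearest-neighbor index rather than a brute-force scan, but the ranking principle is the same.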

Stable Attribution is currently in beta and invites users to try it out and join their Discord community. They also provide a FAQ section that answers common questions about their tool, their motivation, and their vision for the future of AI and art.

Pros

  • It gives credit to artists whose work is used to train AI models.
  • It allows users to discover new creators whose work they like.
  • It opens up the possibility of ethical collaboration and compensation for artists.
  • It makes AI models more transparent and understandable.
  • It encourages creativity and expression with AI.


Cons

  • It may not be able to find all the contributing images for a given AI-generated image.
  • It may not be able to verify the original source or license of the contributing images.
  • It may not be able to handle images that are generated by other models or methods than Stable Diffusion.
  • It may not be able to prevent misuse or abuse of AI-generated images by malicious actors.
  • It may not be able to address other ethical issues related to AI, such as bias, privacy, or accountability.

Alternative AI Tools