
ConceptAttention: Diffusion Transformers Learn Highly Interpretable Features

A clever way to attend to the important region

ConceptAttention repurposes DiT attention layers to generate saliency maps by passing concept embeddings through a separate attention mechanism that interacts with the image features.

Method
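The idea can be illustrated with a minimal NumPy sketch: concept embeddings act as queries attending over image-patch features, and normalizing the resulting scores across concepts assigns each patch to a concept, yielding one saliency map per concept. The projection matrices, shapes, and the softmax-over-concepts choice here are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def concept_saliency(image_feats, concept_embeds, w_q, w_k):
    """Sketch: concepts as queries over image-patch keys.

    image_feats:    (num_patches, d)  image token features from a DiT layer
    concept_embeds: (num_concepts, d) embeddings of text concepts
    Returns a (num_concepts, num_patches) saliency map.
    """
    q = concept_embeds @ w_q                      # (num_concepts, d)
    k = image_feats @ w_k                         # (num_patches, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])       # concept-to-patch affinity
    # Normalize over concepts so each patch distributes mass among concepts
    return softmax(scores, axis=0)

rng = np.random.default_rng(0)
d, num_patches, num_concepts = 16, 64, 3
sal = concept_saliency(
    rng.normal(size=(num_patches, d)),
    rng.normal(size=(num_concepts, d)),
    rng.normal(size=(d, d)) / np.sqrt(d),
    rng.normal(size=(d, d)) / np.sqrt(d),
)
print(sal.shape)  # one row per concept, one column per patch
```

Each row of `sal` can then be reshaped to the patch grid and upsampled to get a per-concept heatmap over the image.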

Last updated: 2025-05-07