Ebook: Cloud-VAE: Variational autoencoder with concepts embedded
- Genre: Computers // Algorithms and Data Structures: Pattern Recognition
- Series: 140
- Year: 2023
- Publisher: Elsevier
- Language: English
- Format: PDF
The Variational Autoencoder (VAE) has been widely and successfully used to learn coherent latent representations of data. However, the lack of interpretability of the latent space that the VAE constructs under its prior distribution remains an open problem. This paper proposes Cloud-VAE, a VAE with understandable concept embedding that constructs an interpretable latent space by disentangling the latent variables and modeling their uncertainty with the cloud model. First, a cloud model-based clustering algorithm casts an initial constraint on the latent space into a concept prior distribution, which is embedded into the VAE's latent space to disentangle the latent variables. Second, a reparameterization trick based on the forward cloud transformation algorithm is designed to estimate the concepts of the latent space by increasing the randomness of the latent variables. Furthermore, the variational lower bound of Cloud-VAE is derived to guide the training process toward constructing concepts in the latent space, realizing a mutual mapping between the latent space and the concept space. Finally, experimental results on six benchmark datasets show that Cloud-VAE achieves good clustering and reconstruction performance, can explicitly explain the model's aggregation process, and discovers more interpretable disentangled representations.
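The forward cloud transformation the abstract refers to comes from the standard normal cloud model, in which a concept is described by three numerical characteristics: expectation Ex, entropy En, and hyper-entropy He. A minimal sketch of that sampling scheme, in reparameterized form as a VAE-style trick would require, is shown below; the function names and the exact way the paper wires this into the encoder are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def forward_cloud_transform(Ex, En, He, n_samples, seed=None):
    """Sample 'cloud drops' from a normal cloud concept (Ex, En, He).

    Each drop z is drawn in two stages, which adds a second layer of
    randomness on top of a plain Gaussian:
      1. En' ~ N(En, He^2)   -- a per-drop entropy
      2. z   ~ N(Ex, En'^2)  -- the drop itself
    Written in reparameterized form (as a VAE training step needs),
    with eps1, eps2 ~ N(0, 1):
      z = Ex + (En + He * eps1) * eps2
    """
    rng = np.random.default_rng(seed)
    eps1 = rng.standard_normal(n_samples)
    eps2 = rng.standard_normal(n_samples)
    # Reparameterized two-stage sampling: randomness enters only via eps1/eps2,
    # so gradients could flow through Ex, En, He in a deep-learning framework.
    return Ex + (En + He * eps1) * eps2

# Example: a concept centered at 0 with entropy 1 and a small hyper-entropy.
drops = forward_cloud_transform(Ex=0.0, En=1.0, He=0.1, n_samples=10000, seed=0)
```

When He = 0 this collapses to the ordinary Gaussian reparameterization trick z = Ex + En * eps; a positive He fattens the tails of the drop distribution, which is the "increased randomness of latent variables" the abstract mentions.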