From the course: Programming Generative AI: From Variational Autoencoders to Stable Diffusion with PyTorch and Hugging Face
Representing images as tensors
- This lesson is all about how we build generative models for images. But before we can get to that, we have to think about how we might actually represent image data itself in code, in a computational way. In the previous lesson, when we were talking about how machine learning and generative models can actually create things, I presented this image of a cat going into some black box: we're either predicting the label "cat," or we're giving the model the label "cat" and having it generate an image. What I didn't talk about is that when we represent images in code, they're usually represented numerically. The most basic case is a gray-scale image, represented as values between zero and 255 that correspond to the luminosity of a given pixel, or essentially how light or dark it is. Given this picture of a cat and a highly simplified version of a matrix (which, as we learned, is actually a tensor), for this gray-scale image we have a…
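As a rough illustration of the idea described above, the sketch below loads an image, converts it to gray-scale, and inspects the resulting 0-255 pixel values before turning them into a PyTorch tensor. The file name "cat.png" is purely hypothetical, and the printed shape depends on the image you actually load.

import numpy as np
import torch
from PIL import Image

# Load an image and convert it to gray-scale ("L" mode in PIL).
# "cat.png" is a hypothetical file name used only for illustration.
img = Image.open("cat.png").convert("L")

# As a NumPy array, the gray-scale image is a 2-D grid of unsigned
# 8-bit integers: one value per pixel, from 0 (black) to 255 (white).
pixels = np.array(img)
print(pixels.shape, pixels.dtype)   # e.g. (256, 256) uint8
print(pixels.min(), pixels.max())   # values somewhere in [0, 255]

# Wrapped in a tensor and rescaled to [0, 1], this matrix is the kind of
# numerical representation a PyTorch model actually consumes.
x = torch.from_numpy(pixels).float() / 255.0
print(x.shape)                      # torch.Size([256, 256])

A color image simply adds a channel dimension on top of this, typically three channels (red, green, blue), which PyTorch conventionally stores as a tensor of shape (channels, height, width).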
Contents
- Topics (54s)
- Representing images as tensors (7m 45s)
- Desiderata for computer vision (4m 57s)
- Features of convolutional neural networks (7m 56s)
- Working with images in Python (10m 20s)
- The Fashion-MNIST dataset (4m 48s)
- Convolutional neural networks in PyTorch (10m 43s)
- Components of a latent variable model (LVM) (8m 57s)
- The humble autoencoder (5m 29s)
- Defining an autoencoder with PyTorch (5m 42s)
- Setting up a training loop (9m 47s)
- Inference with an autoencoder (4m 16s)
- Look ma, no features! (8m 21s)
- Adding probability to autoencoders (VAE) (4m 49s)
- Variational inference: Not just for autoencoders (7m 20s)
- Transforming an autoencoder into a VAE (13m 26s)
- Training a VAE with PyTorch (13m 33s)
- Exploring latent space (11m 37s)
- Latent space interpolation and attribute vectors (12m 30s)