Mirror of https://github.com/lucidrains/DALLE2-pytorch.git, synced 2026-02-14 10:24:31 +01:00
Compare commits (3 commits):

- `bb86ab2404`
- `ae056dd67c`
- `033d6b0ce8`
```diff
@@ -18,7 +18,11 @@ There was enough interest for a <a href="https://github.com/lucidrains/dalle2-ja
 
 - A research group has used the code in this repository to train a functional diffusion prior for their CLIP generations. Will share their work once they release their preprint. This, and <a href="https://github.com/crowsonkb">Katherine's</a> own experiments, validate OpenAI's finding that the extra prior increases variety of generations.
 
-- Decoder is now verified working for unconditional generation on my experimental setup for Oxford flowers
+- Decoder is now verified working for unconditional generation on my experimental setup for Oxford flowers. 2 researchers have also confirmed Decoder is working for them.
+
+<img src="./samples/oxford.png" width="600px" />
+
+*ongoing at 21k steps*
 
 ## Install
 
```
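For context on the unconditional-generation note added above, here is a minimal sketch of training and sampling the Decoder without any CLIP image-embedding conditioning (the setup described for the Oxford flowers run). The constructor arguments `image_sizes`, `timesteps`, and `unconditional`, and the `sample(batch_size = ...)` call, are assumptions based on the project's documented API and may differ slightly at this commit; sizes and shapes are illustrative only.

```python
import torch
from dalle2_pytorch import Unet, Decoder

# a single unet acting as the denoising network of the ddpm decoder
unet = Unet(
    dim = 128,
    dim_mults = (1, 2, 4, 8)
).cuda()

# decoder trained purely unconditionally, i.e. without CLIP image embeddings
# (`image_sizes` and `unconditional` are assumed keyword names, see note above)
decoder = Decoder(
    unet = unet,
    image_sizes = (128,),
    timesteps = 1000,
    unconditional = True
).cuda()

# stand-in batch for real training images
images = torch.randn(4, 3, 128, 128).cuda()

loss = decoder(images)   # diffusion (noise prediction) training loss
loss.backward()

# after many training steps, sample images unconditionally
sampled = decoder.sample(batch_size = 4)   # (4, 3, 128, 128)
```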
```diff
@@ -366,7 +366,7 @@ class DecoderTrainer(nn.Module):
         lr = 1e-4,
         wd = 1e-2,
         eps = 1e-8,
-        max_grad_norm = None,
+        max_grad_norm = 0.5,
         amp = False,
         **kwargs
     ):
```
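The only functional change in this hunk is the default for gradient clipping: `max_grad_norm` now defaults to `0.5` rather than `None` (no clipping). Continuing the sketch above, a hedged example of constructing the trainer with these defaults; only the keyword names and default values come from the diff, while the top-level import and the `decoder` positional argument are assumptions.

```python
from dalle2_pytorch import DecoderTrainer   # import path is an assumption

# `decoder` is the unconditional Decoder built in the previous sketch
decoder_trainer = DecoderTrainer(
    decoder,
    lr = 1e-4,            # defaults as shown in the hunk above
    wd = 1e-2,
    eps = 1e-8,
    max_grad_norm = 0.5,  # new default; pass None to restore the old unclipped behaviour
    amp = False
)
```

Clipping gradients to a norm of 0.5 is a common stabilisation choice for diffusion training; anyone relying on the previous unclipped behaviour now has to opt out explicitly.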
New binary file: `samples/oxford.png` (985 KiB, not shown).