aljaz/DALLE2-pytorch
mirror of https://github.com/lucidrains/DALLE2-pytorch.git synced 2025-12-19 17:54:20 +01:00
132 Commits 2 Branches 353 Tags
f75d49c7811e5929b8ad43965674deea1bc3b0dd
Commit Graph

7 Commits

Author SHA1 Message Date
Phil Wang  f75d49c781  start a file for all attention-related modules, use attention-based upsampling in the unets in dalle-2  2022-04-25 18:59:10 -07:00
Phil Wang  3b520dfa85  bring in attention-based upsampling to strengthen vqgan-vae, seems to work as advertised in initial experiments in GAN  2022-04-25 17:27:45 -07:00
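The two commits above refer to attention-based upsampling. The sketch below is a hypothetical minimal 1D variant, not the repository's actual module: each upsampled position takes its nearest-neighbor source feature as the query and attends over a small local window of input positions with scaled dot-product softmax.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

def attention_upsample_1d(x, scale=2, window=3):
    """Upsample an (n, d) feature sequence to (n * scale, d).

    Each output position uses its nearest-neighbor copy as the query
    and attends over a local window of inputs (keys = values).
    Illustrative sketch only; DALLE2-pytorch's implementation differs.
    """
    n, d = x.shape
    out = np.zeros((n * scale, d))
    for i in range(n * scale):
        src = i // scale                  # nearest input position
        q = x[src]                        # query from nearest neighbor
        lo = max(0, src - window // 2)
        hi = min(n, src + window // 2 + 1)
        keys = x[lo:hi]                   # local window, shape (w, d)
        w = softmax(keys @ q / np.sqrt(d))
        out[i] = w @ keys                 # attention-weighted average
    return out

x = np.random.randn(8, 4)
y = attention_upsample_1d(x)              # shape (16, 4)
```

Compared with plain nearest-neighbor or transposed-convolution upsampling, the output here is a data-dependent mixture of neighboring features rather than a fixed kernel.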
Phil Wang  f82917e1fd  prepare for turning off gradient penalty, as shown in GAN literature, GP needs to be only applied 1 out of 4 iterations  2022-04-23 07:52:10 -07:00
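The commit above refers to lazy regularization: applying the gradient penalty only every few discriminator steps, scaled up so its time-averaged strength is unchanged. A hypothetical scheduling helper (the names and the weight of 10.0 are assumptions, not the repository's code):

```python
APPLY_GP_EVERY = 4  # gradient penalty on 1 of every 4 discriminator steps

def discriminator_loss(step, d_loss, gp_value, gp_weight=10.0):
    """Return the total discriminator loss for this training step.

    Lazy regularization: the gradient penalty term is added only every
    APPLY_GP_EVERY steps, multiplied by the interval so the average
    penalty over time matches applying it every step.
    """
    if step % APPLY_GP_EVERY == 0:
        return d_loss + gp_weight * APPLY_GP_EVERY * gp_value
    return d_loss

# toy run: constant raw loss 1.0 and penalty 0.05
losses = [discriminator_loss(s, 1.0, 0.05) for s in range(8)]
# steps 0 and 4 include the scaled penalty; the rest are just d_loss
```

Skipping the penalty on most iterations avoids the extra backward pass it requires, which is where the speedup comes from.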
Phil Wang  05b74be69a  use null container pattern to cleanup some conditionals, save more cleanup for next week  2022-04-22 15:23:18 -07:00
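The null container (null object) pattern mentioned above replaces `if x is not None:` branches with a do-nothing stand-in that shares the real object's interface. A minimal sketch with hypothetical class names (the repository's actual classes differ):

```python
class TextConditioner:
    """A real conditioner that transforms its input."""
    def __init__(self, scale):
        self.scale = scale

    def __call__(self, x):
        return [v * self.scale for v in x]

class NullConditioner:
    """Null-container stand-in: same interface, does nothing.

    Passing this instead of None lets the forward pass drop the
    `if conditioner is not None:` branch entirely.
    """
    def __call__(self, x):
        return x

def forward(x, conditioner):
    # no None check needed: the null object handles the "absent" case
    return conditioner(x)

with_cond = forward([1, 2], TextConditioner(3))   # [3, 6]
without_cond = forward([1, 2], NullConditioner()) # [1, 2]
```

In a PyTorch codebase the same role is often played by `torch.nn.Identity` for optional layers.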
Phil Wang  76b32f18b3  first pass at complete DALL-E2 + Latent Diffusion integration, latent diffusion on any layer(s) of the cascading ddpm in the decoder.  2022-04-22 13:53:13 -07:00
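The latent-diffusion commits above run the standard DDPM forward process on a VAE latent instead of pixels. A toy numpy sketch of that forward noising step, with an illustrative schedule and shapes (not the repository's configuration):

```python
import numpy as np

def q_sample(z0, t, alpha_bar, rng):
    """Forward diffusion in latent space: sample z_t ~ q(z_t | z_0).

    z_t = sqrt(alpha_bar_t) * z_0 + sqrt(1 - alpha_bar_t) * eps,
    the usual DDPM noising, applied to a VAE latent rather than pixels.
    """
    eps = rng.standard_normal(z0.shape)
    return np.sqrt(alpha_bar[t]) * z0 + np.sqrt(1.0 - alpha_bar[t]) * eps

# toy linear beta schedule over 10 steps
betas = np.linspace(1e-4, 0.02, 10)
alpha_bar = np.cumprod(1.0 - betas)       # strictly decreasing in t

rng = np.random.default_rng(0)
z0 = rng.standard_normal((4, 16))         # stand-in for a VQGAN-VAE latent
zt = q_sample(z0, t=5, alpha_bar=alpha_bar, rng=rng)
```

Because the latent is far smaller than the image, running any layer of the decoder's cascade this way trades pixel-space fidelity for a much cheaper diffusion model.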
Phil Wang  461347c171  fix vqgan-vae for latent diffusion  2022-04-22 11:38:57 -07:00
Phil Wang  ad17c69ab6  prepare for latent diffusion in the first DDPM of the cascade in the Decoder  2022-04-21 17:54:31 -07:00