Phil Wang | 848e8a480a | always rederive the predicted noise from the clipped x0 for ddim + predict noise objective (1.12.3) | 2023-03-05 10:45:44 -08:00
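The commit above keeps the x0 and noise estimates consistent: once the predicted x0 is clipped to the valid range, the noise must be rederived from the clipped value rather than reused from the network's raw output. A minimal sketch of the idea (function and argument names are illustrative, not the repository's):

```python
import torch

def rederive_noise(x_t, x0_clipped, alpha_cumprod_t):
    # invert x_t = sqrt(a) * x0 + sqrt(1 - a) * noise for the noise term,
    # using the clipped x0 instead of the network's raw prediction
    return (x_t - alpha_cumprod_t.sqrt() * x0_clipped) / (1. - alpha_cumprod_t).sqrt()

# inside a ddim step (pseudocode):
#   x0 = predict_x0(x_t, noise_pred, alpha_cumprod_t).clamp(-1., 1.)
#   noise_pred = rederive_noise(x_t, x0, alpha_cumprod_t)
```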
Phil Wang | cc58f75474 | bump to newer package of clip-anytorch that allows for text encodings < maximum context length (1.12.2) | 2023-03-04 09:37:25 -08:00
Phil Wang | 3b2cf7b0bc | fix for self conditioning in diffusion prior network https://github.com/lucidrains/DALLE2-pytorch/issues/273 (1.12.1) | 2023-02-11 17:18:40 -08:00
Phil Wang | 984d62a373 | default ddim sampling eta to 0 (1.12.0) | 2022-12-23 13:23:09 -08:00
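In DDIM (Song et al. 2020), eta scales the per-step noise: eta = 0 makes sampling fully deterministic, while eta = 1 recovers DDPM-style ancestral sampling. A sketch of the sigma schedule the eta parameter controls, assuming the alpha_cumprod values are tensors:

```python
def ddim_sigma(alpha_cumprod, alpha_cumprod_prev, eta = 0.):
    # eta = 0. -> sigma = 0, deterministic ddim
    # eta = 1. -> matches the ddpm posterior variance
    return eta * ((1 - alpha_cumprod_prev) / (1 - alpha_cumprod)
                  * (1 - alpha_cumprod / alpha_cumprod_prev)).sqrt()
```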
Phil Wang | 683dd98b96 | extra insurance in case eos id is not there (1.11.4) | 2022-12-15 10:54:21 -08:00
Phil Wang | 067ac323da | address https://github.com/lucidrains/DALLE2-pytorch/issues/266 (1.11.2) | 2022-11-23 08:41:25 -08:00
zion | 91c8d1ca13 | bug fix cosine annealing optimizer in prior trainer (#262) | 2022-11-11 12:15:13 -08:00
zion | 08238a7200 | depend on open-clip-torch (#261): fix the previous commit, which assumed open_clip is installed | 2022-11-07 16:19:08 -08:00
zion | 7166ad6711 | add open clip to train_config (#260): add the ability to use open_clip in the train configs (useful for the new SOTA h/14 model) | 2022-11-07 15:44:36 -08:00
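For reference, a hedged sketch of loading the LAION ViT-H/14 model through the open-clip-torch package referenced above; the exact pretrained tag is an assumption to verify against open_clip.list_pretrained(), and how the train configs consume the model may differ:

```python
import open_clip

# 'laion2b_s32b_b79k' is the LAION-2B ViT-H/14 checkpoint tag at the time
# of writing; confirm with open_clip.list_pretrained()
model, _, preprocess = open_clip.create_model_and_transforms(
    'ViT-H-14',
    pretrained = 'laion2b_s32b_b79k'
)
tokens = open_clip.tokenize(['a photo of a dog'])
```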
Phil Wang | fbba0f9aaf | bring in prediction of the v objective, combining findings from the progressive distillation paper and imagen-video, toward the eventual extension of dalle2 to make-a-video (1.11.1) | 2022-10-28 18:21:07 -07:00
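The v objective from the progressive distillation paper (Salimans & Ho 2022) has the network predict v = sqrt(a) * noise - sqrt(1 - a) * x0, where a is alpha_cumprod, from which both x0 and the noise are recoverable. A sketch of the two inversions, assuming broadcastable tensors:

```python
def x0_from_v(x_t, v, alpha_cumprod):
    # x0 = sqrt(a) * x_t - sqrt(1 - a) * v
    return alpha_cumprod.sqrt() * x_t - (1 - alpha_cumprod).sqrt() * v

def noise_from_v(x_t, v, alpha_cumprod):
    # noise = sqrt(1 - a) * x_t + sqrt(a) * v
    return (1 - alpha_cumprod).sqrt() * x_t + alpha_cumprod.sqrt() * v
```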
Romain Beaumont | 9f37705d87 | Add static graph param (#226): add and use the static graph param | 2022-10-25 19:31:29 +02:00
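static_graph is a torch.nn.parallel.DistributedDataParallel option (PyTorch >= 1.11) declaring that the graph topology and the set of used parameters do not change across iterations, which lets DDP skip repeated unused-parameter searches. A minimal sketch of wiring it up; model and local_rank are placeholders:

```python
from torch import nn
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_model(model: nn.Module, local_rank: int) -> DDP:
    # requires torch.distributed.init_process_group to have been called
    return DDP(model, device_ids = [local_rank], static_graph = True)
```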
Phil Wang | c3df46e374 | fix openclipadapter to be able to use latest open sourced sota model (1.10.9) | 2022-10-23 15:12:09 -07:00
Phil Wang | 41fabf2922 | fix a dtype conversion issue for the diffusion timesteps in the diffusion prior, thanks to @JiaHeng-DLUT (1.10.8) | 2022-10-19 09:26:06 -07:00
Heng Jia | 5975e8222b | Fix assert message (#253) | 2022-10-18 08:50:59 -07:00
Phil Wang | c18c080128 | fix for use with larger openai clip models by extracting dimension of last layernorm in clip (1.10.7) | 2022-09-29 09:09:47 -07:00
Phil Wang | b39653cf96 | fix readme dataloader example | 2022-09-20 08:39:52 -07:00
Phil Wang | 39f8b6cf16 | show example of using SOTA open sourced open clip | 2022-09-19 10:45:20 -07:00
Phil Wang | d0c11b30b0 | handle open clip adapter image size being a tuple (1.10.6) | 2022-09-19 10:27:14 -07:00
zion | 86e2d5ba84 | Minor Decoder Train Script Fixes (#242): ensure tokenized text is on the proper device; fix lpips image distribution | 2022-09-15 17:21:48 -07:00
Phil Wang | 0d82dff9c5 | in ddim, noise should be predicted after x0 is maybe clipped, thanks to @lukovnikov for pointing this out in another repository (1.10.5) | 2022-09-01 09:40:47 -07:00
Phil Wang | 8bbc956ff1 | fix bug with misnamed variable in diffusion prior network (1.10.4) | 2022-08-31 17:19:05 -07:00
Phil Wang | 22019fddeb | todo | 2022-08-31 13:36:05 -07:00
Phil Wang | 6fb7e91343 | fix ddim to use alpha_cumprod (1.10.3) | 2022-08-31 07:40:46 -07:00
Phil Wang | ba58ae0bf2 | add two asserts to diffusion prior to ensure matching image embedding dimensions for clip, diffusion prior network, and what was set on diffusion prior (1.10.1) | 2022-08-28 10:11:37 -07:00
Phil Wang | 1cc5d0afa7 | upgrade to best downsample (1.10.0) | 2022-08-25 10:37:02 -07:00
Phil Wang | 59fa101c4d | fix classifier free guidance for diffusion prior, thanks to @jaykim9870 for spotting the issue (1.9.0) | 2022-08-23 08:29:01 -07:00
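Classifier-free guidance mixes a conditional and an unconditional (null-conditioned) prediction; a guidance scale of 1 reduces to the plain conditional output. A sketch of the combination rule, with hypothetical function and keyword names:

```python
def guided_prediction(net, x, t, text_embed, cond_scale = 1.):
    # classifier-free guidance: extrapolate from the null-conditioned
    # prediction toward the conditioned one by cond_scale
    cond = net(x, t, text_embed = text_embed)
    if cond_scale == 1:
        return cond
    null = net(x, t, text_embed = None)  # null / dropped-out conditioning
    return null + (cond - null) * cond_scale
```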
Aidan Dempster | 916ece164c | Merge pull request #234 from Veldrovive/deepspeed_fp16: Fixed issues with clip and deepspeed fp16 | 2022-08-20 19:01:43 -04:00
Aidan | cbaadb6931 | Fixed issues with clip and deepspeed fp16; also more general compatibility fixes | 2022-08-20 17:58:32 +00:00
Phil Wang | 083508ff8e | cast attention matrix back to original dtype pre-softmax in attention (1.8.4) | 2022-08-20 10:56:01 -07:00
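One plausible shape of this fix, sketched below: compute the similarity logits in fp32 for stability, then cast them back to the original (possibly fp16) dtype before the softmax so downstream ops see a consistent dtype. Names here are illustrative, not the repository's exact code:

```python
from torch import einsum

def attention_weights(q, k, scale = 1.):
    dtype = q.dtype                      # remember original dtype (may be fp16)
    sim = einsum('b h i d, b h j d -> b h i j', q.float(), k.float()) * scale
    sim = sim - sim.amax(dim = -1, keepdim = True).detach()  # numerical stability
    sim = sim.to(dtype)                  # cast back to original dtype pre-softmax
    return sim.softmax(dim = -1)
```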
Phil Wang | 7762edd0ff | make it work for @ethancohen123 | 2022-08-19 11:28:58 -07:00
Phil Wang | de5e628773 | cite einops | 2022-08-17 08:58:41 -07:00
Phil Wang | 1b4046b039 | gratitude | 2022-08-17 08:57:33 -07:00
Phil Wang | 27f19ba7fa | make sure diffusion prior trainer can operate with no warmup (1.8.2) | 2022-08-15 14:27:40 -07:00
Phil Wang | 8f38339c2b | give diffusion prior trainer cosine annealing lr too (1.8.1) | 2022-08-15 07:38:01 -07:00
Phil Wang | 6b9b4b9e5e | add cosine annealing lr schedule (1.8.0) | 2022-08-15 07:29:56 -07:00
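Cosine annealing decays the learning rate along a half cosine from its initial value toward a floor over T_max steps; torch ships it as torch.optim.lr_scheduler.CosineAnnealingLR. A minimal sketch, assuming a plain Adam optimizer and a hypothetical step count:

```python
from torch import nn
from torch.optim import Adam
from torch.optim.lr_scheduler import CosineAnnealingLR

def build_optimizer(model: nn.Module, lr = 3e-4, total_steps = 100_000):
    # anneal the lr along a half cosine from lr down toward 0 over total_steps
    optimizer = Adam(model.parameters(), lr = lr)
    scheduler = CosineAnnealingLR(optimizer, T_max = total_steps)
    return optimizer, scheduler

# per training step: optimizer.step(); scheduler.step()
```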
Phil Wang | 44e09d5a4d | add weight standardization behind a feature flag, which may work well with group norm | 2022-08-14 11:34:45 -07:00
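Weight standardization normalizes each conv filter to zero mean and unit variance before the convolution, and was reported (Qiao et al.) to pair well with group norm. A sketch of the standard formulation, not necessarily the repository's exact code:

```python
import torch
import torch.nn.functional as F
from torch import nn

class WeightStandardizedConv2d(nn.Conv2d):
    def forward(self, x):
        w = self.weight
        mean = w.mean(dim = (1, 2, 3), keepdim = True)
        var = w.var(dim = (1, 2, 3), unbiased = False, keepdim = True)
        w = (w - mean) * (var + 1e-5).rsqrt()   # standardize per output channel
        return F.conv2d(x, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```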
Phil Wang | 34806663e3 | make it so diffusion prior p_sample_loop returns unnormalized image embeddings (1.6.5) | 2022-08-13 10:03:40 -07:00
Phil Wang | dc816b1b6e | dry up some code around handling unet outputs with learned variance (1.6.4) | 2022-08-12 15:25:03 -07:00
Phil Wang | 05192ffac4 | fix self conditioning shape in diffusion prior (1.6.3) | 2022-08-12 12:30:03 -07:00
Phil Wang | 9440411954 | make self conditioning technique work with diffusion prior (1.6.1) | 2022-08-12 12:20:51 -07:00
Phil Wang | 981d407792 | comment | 2022-08-12 11:41:23 -07:00
Phil Wang | 7c5477b26d | bet on the new self-conditioning technique out of geoffrey hinton's group (1.6.0) | 2022-08-12 11:36:08 -07:00
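Self-conditioning (Chen, Zhang & Hinton, "Analog Bits") feeds the model's own x0 estimate back in as extra input: during training, roughly half the time a first no-grad pass produces the estimate; at sampling time the previous step's estimate is reused. A training-side sketch with hypothetical keyword names:

```python
import torch

def self_conditioned_forward(model, x_t, t, prob = 0.5):
    x0_self_cond = None
    if torch.rand(()).item() < prob:
        with torch.no_grad():
            # first pass without self-conditioning, detached from the graph
            x0_self_cond = model(x_t, t, self_cond = None).detach()
    return model(x_t, t, self_cond = x0_self_cond)
```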
Phil Wang | be3bb868bf | add gradient checkpointing for all resnet blocks (1.5.0) | 2022-08-02 19:21:44 -07:00
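Gradient checkpointing trades compute for memory: a block's activations are discarded during the forward pass and recomputed during backward. A sketch of gating it per resnet block via torch.utils.checkpoint; the gating function is hypothetical:

```python
from torch.utils.checkpoint import checkpoint

def run_block(block, x, use_checkpoint = True):
    # only checkpoint when gradients are actually needed
    if use_checkpoint and x.requires_grad:
        return checkpoint(block, x)
    return block(x)
```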
Phil Wang | 451de34871 | enforce clip anytorch version (1.4.6) | 2022-07-30 10:07:55 -07:00
Phil Wang | f22e8c8741 | make open clip available for use with dalle2 pytorch (1.4.5) | 2022-07-30 09:02:31 -07:00
Phil Wang | 87432e93ad | quick fix for linear attention (1.4.4) | 2022-07-29 13:17:12 -07:00
Phil Wang | d167378401 | add cosine sim for self attention as well, as a setting (1.4.3) | 2022-07-29 12:48:20 -07:00
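Cosine-sim attention l2-normalizes queries and keys so the logits are bounded cosine similarities, then applies a temperature; bounded logits are far less prone to fp16 overflow. A sketch with a fixed temperature (the repository may use a learned one):

```python
import torch.nn.functional as F
from torch import einsum

def cosine_sim_attention(q, k, v, scale = 10.):
    q, k = map(lambda t: F.normalize(t, dim = -1), (q, k))   # unit-norm q and k
    sim = einsum('b h i d, b h j d -> b h i j', q, k) * scale
    attn = sim.softmax(dim = -1)
    return einsum('b h i j, b h j d -> b h i d', attn, v)
```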
Phil Wang | 2d67d5821e | change up epsilon in layernorm in the case of using fp16, thanks to @Veldrovive for figuring out this stabilizes training (1.4.2) | 2022-07-29 12:41:02 -07:00
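Under fp16 the default layernorm epsilon (1e-5) can underflow relative to the variance term and destabilize training; using a larger epsilon when the input is half precision is the usual remedy. A sketch, with 1e-3 as an assumed fp16 value:

```python
import torch
from torch import nn

class LayerNorm(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.g = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        eps = 1e-5 if x.dtype == torch.float32 else 1e-3  # larger eps under fp16
        var = torch.var(x, dim = -1, unbiased = False, keepdim = True)
        mean = torch.mean(x, dim = -1, keepdim = True)
        return (x - mean) * (var + eps).rsqrt() * self.g
```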
Phil Wang | 748c7fe7af | allow for cosine sim cross attention, modify linear attention in an attempt to resolve issue on fp16 (1.4.0) | 2022-07-29 11:12:18 -07:00
Phil Wang | 80046334ad | make sure entire readme runs without errors (1.2.2) | 2022-07-28 10:17:43 -07:00