Phil Wang
a922a539de
bring back convtranspose2d upsampling, allow for nearest upsample with hyperparam, change kernel size of last conv to 1, make configurable, cleanup
v0.15.0
2022-07-01 09:21:47 -07:00
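For context on the upsampling change above: nearest-neighbor upsampling followed by a plain convolution is the usual alternative to ConvTranspose2d, which can produce checkerboard artifacts. A minimal sketch of the two options behind such a hyperparameter (the helper name and flag are hypothetical, not this repo's actual API):

```python
import torch
from torch import nn

def upsample(dim, use_nearest = False):
    # hypothetical helper: nearest-neighbor interpolation + conv avoids
    # the checkerboard artifacts a learned ConvTranspose2d can produce
    if use_nearest:
        return nn.Sequential(
            nn.Upsample(scale_factor = 2, mode = 'nearest'),
            nn.Conv2d(dim, dim, 3, padding = 1)
        )
    # learned upsampling: doubles the spatial resolution
    return nn.ConvTranspose2d(dim, dim, 4, stride = 2, padding = 1)
```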
Phil Wang
8f2466f1cd
blur sigma for upsampling training was 0.6 in the paper, make that the default value
v0.14.1
2022-06-30 17:03:16 -07:00
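The sigma refers to the gaussian blur applied to the low-resolution conditioning image while training an upsampler unet. A minimal sketch using torchvision's functional API (the kernel size is an assumed hyperparameter; only the 0.6 sigma comes from the paper):

```python
import torch
from torchvision.transforms import functional as TF

def blur_lowres_cond(images, blur_sigma = 0.6, blur_kernel_size = 3):
    # corrupt the low-res conditioning image with a slight gaussian blur
    # during upsampler training; sigma defaults to the paper's 0.6
    return TF.gaussian_blur(images, kernel_size = blur_kernel_size, sigma = blur_sigma)
```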
Phil Wang
908ab83799
add skip connections for all intermediate resnet blocks, also add an extra resnet block for memory efficient version of unet, time condition for both initial resnet block and last one before output
v0.14.0
2022-06-29 08:16:58 -07:00
Phil Wang
46a2558d53
bug in pydantic decoder config class
v0.12.4
2022-06-29 07:17:35 -07:00
yytdfc
86109646e3
fix a NameError bug ( #179 )
2022-06-29 07:16:44 -07:00
Phil Wang
6a11b9678b
bring in the skip connection scaling factor, used by imagen in their unets, cite original paper using it
v0.12.3
2022-06-26 21:59:55 -07:00
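The scaling factor in question is 1/sqrt(2): rescaling the skip branch tempers its variance relative to the main path, which stabilizes deep unets. A minimal sketch of the idea, not the exact wiring inside the unet:

```python
import torch

SKIP_SCALE = 2 ** -0.5  # 1/sqrt(2)

def connect_skip(x, skip, scale = SKIP_SCALE):
    # unet skip connections concatenate encoder features onto the decoder
    # path; scaling them by 1/sqrt(2) is the trick the Imagen unets use
    return torch.cat((x, skip * scale), dim = 1)
```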
Phil Wang
b90364695d
fix remaining issues with deriving cond_on_text_encodings from child unet settings
v0.12.2
2022-06-26 21:07:42 -07:00
zion
868c001199
bug fixes for text conditioning update ( #175 )
2022-06-26 16:12:32 -07:00
Phil Wang
032e83b0e0
nevermind, do not enforce text encodings on first unet
v0.12.1
2022-06-26 12:45:05 -07:00
Phil Wang
2e85e736f3
remove unnecessary decoder setting, and if not unconditional, always make sure the first unet is conditionable on text
v0.12.0
2022-06-26 12:32:17 -07:00
Aidan Dempster
f5760bdb92
Add data flexibility to decoder trainer ( #165 )
...
* Added the ability to train the decoder with text embeddings
* Added the ability to train using embeddings generated on the fly with CLIP
* CLIP now generates embeddings for whatever is not precomputed
2022-06-25 19:05:20 -07:00
zion
c453f468b1
autoswitch tqdm for notebooks ( #171 )
...
avoids the `tqdm` progress bar printing to a new line when a notebook is detected
2022-06-25 16:37:06 -07:00
zion
98f0c17759
add samples-seen and ema decay ( #166 )
2022-06-24 15:12:09 -07:00
Phil Wang
a5b9fd6ca8
product management
2022-06-24 08:15:05 -07:00
Phil Wang
4b994601ae
just make sure decoder learning rate is reasonable and help out budding researchers
v0.11.5
2022-06-23 11:29:28 -07:00
zion
fddf66e91e
fix params in decoder ( #162 )
2022-06-22 14:45:01 -07:00
Phil Wang
c8422ffd5d
fix EMA updating buffers with non-float tensors
v0.11.4
2022-06-22 07:16:39 -07:00
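The fix above matters because parameters and floating-point buffers can be interpolated toward the online model, but non-float buffers (integer step counters, bool masks) cannot be lerped and have to be copied verbatim. A rough sketch of an EMA update handling both cases:

```python
import torch

@torch.no_grad()
def ema_update(ema_model, online_model, decay = 0.995):
    # parameters: standard exponential moving average
    for ema_p, p in zip(ema_model.parameters(), online_model.parameters()):
        ema_p.lerp_(p, 1 - decay)
    # buffers: only float tensors can be lerped; copy the rest directly
    for ema_b, b in zip(ema_model.buffers(), online_model.buffers()):
        if b.dtype.is_floating_point:
            ema_b.lerp_(b, 1 - decay)
        else:
            ema_b.copy_(b)
```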
Conight
2aadc23c7c
Fix train decoder config example ( #160 )
2022-06-21 22:17:06 -07:00
Phil Wang
c098f57e09
EMA for vqgan vae comes from ema_pytorch now
2022-06-20 15:29:08 -07:00
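ema_pytorch packages this logic as a standalone wrapper. A minimal usage sketch (the hyperparameter values are illustrative, not necessarily the defaults used here):

```python
import torch
from ema_pytorch import EMA

model = torch.nn.Linear(512, 512)  # stand-in for the vqgan-vae

ema = EMA(
    model,
    beta = 0.9999,            # EMA decay
    update_after_step = 100,  # start averaging only after some warmup steps
    update_every = 10         # refresh the average every 10 calls
)

# call after each optimizer step; evaluate with ema.ema_model
ema.update()
```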
Phil Wang
0021535c26
move ema to external repo
v0.11.3
2022-06-20 11:48:32 -07:00
Phil Wang
56883910fb
cleanup
2022-06-20 11:14:55 -07:00
Phil Wang
893f270012
project management
2022-06-20 10:00:22 -07:00
Phil Wang
f545ce18f4
be able to turn off p2 loss reweighting for upsamplers
v0.11.2
2022-06-20 09:43:31 -07:00
Phil Wang
fc7abf624d
in the paper, blur sigma was 0.6
v0.11.1
2022-06-20 09:05:08 -07:00
Phil Wang
67f0740777
small cleanup
2022-06-20 08:59:51 -07:00
Phil Wang
138079ca83
allow for setting beta schedules of unets differently in the decoder, as the paper used cosine, cosine, linear
v0.11.0
2022-06-20 08:56:37 -07:00
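The two schedules being mixed are the standard linear DDPM schedule and the cosine schedule of Nichol & Dhariwal; the paper pairs cosine with the first two unets and linear with the last. Reference sketches of both (the tuple at the end is an illustrative config, not the repo's exact field name):

```python
import math
import torch

def linear_beta_schedule(timesteps):
    # standard DDPM schedule, rescaled so any number of timesteps behaves
    # like the original 1000-step setup
    scale = 1000 / timesteps
    return torch.linspace(scale * 1e-4, scale * 0.02, timesteps, dtype = torch.float64)

def cosine_beta_schedule(timesteps, s = 0.008):
    # cosine schedule from Nichol & Dhariwal, derived from a cosine-shaped
    # cumulative alpha curve
    t = torch.linspace(0, timesteps, timesteps + 1, dtype = torch.float64) / timesteps
    alphas_cumprod = torch.cos((t + s) / (1 + s) * math.pi * 0.5) ** 2
    alphas_cumprod = alphas_cumprod / alphas_cumprod[0]
    betas = 1 - (alphas_cumprod[1:] / alphas_cumprod[:-1])
    return torch.clip(betas, 0, 0.999)

beta_schedules = ('cosine', 'cosine', 'linear')  # one schedule per unet
```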
zion
f5a906f5d3
prior train script bug fixes ( #153 )
2022-06-19 15:55:15 -07:00
Phil Wang
0215237fc6
update status
2022-06-19 09:42:24 -07:00
Phil Wang
461b91c5c1
also merge distributed training code for decoder, thanks to @Veldrovive
v0.10.1
2022-06-19 09:26:44 -07:00
Aidan Dempster
58892135d9
Distributed Training of the Decoder ( #121 )
...
* Converted decoder trainer to use accelerate
* Fixed issue where metric evaluation would hang on distributed mode
* Implemented functional saving
Loading still fails due to some issue with the optimizer
* Fixed issue with loading decoders
* Fixed issue with tracker config
* Fixed issue with amp
Updated logging to be more logical
* Saving checkpoint now saves position in training as well
Fixed an issue with running out of GPU memory due to loading weights onto the GPU twice
* Fixed ema for distributed training
* Fixed issue where get_pkg_version was reintroduced
* Changed decoder trainer to upload config as a file
Fixed issue where loading best would error
2022-06-19 09:25:54 -07:00
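The heart of the accelerate conversion described in the PR above is letting Accelerator own device placement, mixed precision, and distributed wrapping. A minimal sketch of that pattern with a stand-in model, not the actual decoder trainer:

```python
import torch
from accelerate import Accelerator

accelerator = Accelerator()  # reads distributed/amp config from the environment

model = torch.nn.Linear(512, 512)  # stand-in for the decoder
optimizer = torch.optim.Adam(model.parameters(), lr = 1e-4)

# wraps the model (DDP) and optimizer for the current setup
model, optimizer = accelerator.prepare(model, optimizer)

x = torch.randn(4, 512, device = accelerator.device)
loss = model(x).pow(2).mean()

accelerator.backward(loss)  # replaces loss.backward(), handles grad scaling
optimizer.step()
optimizer.zero_grad()
```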
Phil Wang
e37072a48c
0.10.0
v0.10.0
2022-06-19 08:50:53 -07:00
Phil Wang
41ca896413
depend on huggingface accelerate, move appreciation thread up for visibility
2022-06-19 08:50:35 -07:00
zion
fe19b508ca
Distributed Training of the Prior ( #112 )
...
* distributed prior trainer
better EMA support
update load and save methods of prior
* update prior training script
add test evaluation & ema validation
add more tracking metrics
small cleanup
2022-06-19 08:46:14 -07:00
Phil Wang
6651eafa93
one more residual, after seeing good results on unconditional generation locally
v0.9.2
2022-06-16 11:18:02 -07:00
Phil Wang
e6bb75e5ab
fix missing residual for highest resolution of the unet
2022-06-15 20:09:43 -07:00
Giorgos Zachariadis
b4c3e5b854
changed str to avoid confusion and collisions with the Python builtin ( #147 )
2022-06-15 13:41:16 -07:00
Phil Wang
b7f9607258
make memory efficient unet design from imagen toggleable
v0.8.1
2022-06-15 13:40:26 -07:00
Phil Wang
2219348a6e
adopt similar unet architecture as imagen
v0.8.0
2022-06-15 12:18:21 -07:00
Phil Wang
9eea9b9862
add p2 loss reweighting for decoder training as an option
v0.7.1
2022-06-14 10:58:57 -07:00
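P2 reweighting (Choi et al., "Perception Prioritized Training of Diffusion Models") scales the per-timestep loss by 1 / (k + SNR)^gamma, downweighting the easy high-SNR steps; gamma = 0 recovers uniform weighting, which is why it can later be switched off per unet. A sketch of the weight computation:

```python
import torch

def p2_loss_weight(alphas_cumprod, gamma = 1.0, k = 1.0):
    # snr = signal-to-noise ratio at each timestep; gamma = 0 disables
    # the reweighting entirely (weights become all ones)
    snr = alphas_cumprod / (1 - alphas_cumprod)
    return (k + snr) ** -gamma
```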
Phil Wang
5d958713c0
fix classifier free guidance for image hiddens summed to time hiddens, thanks to @xvjiarui for finding this bug
v0.7.0
2022-06-13 21:01:50 -07:00
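For reference, classifier-free guidance mixes a conditional and an unconditional prediction at sampling time; the bug above was in how the conditioning hiddens were summed into the time embedding. A generic sketch of the guidance step only (the model signature and cond_drop_prob argument are hypothetical):

```python
def guided_pred(model, x, t, cond, cond_scale = 3.0):
    # cond_scale = 1 recovers the plain conditional prediction;
    # larger values push samples toward the conditioning signal
    pred_cond = model(x, t, cond, cond_drop_prob = 0.)
    pred_uncond = model(x, t, cond, cond_drop_prob = 1.)
    return pred_uncond + (pred_cond - pred_uncond) * cond_scale
```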
Phil Wang
0f31980362
cleanup
2022-06-07 17:31:38 -07:00
Phil Wang
bee5bf3815
fix for https://github.com/lucidrains/DALLE2-pytorch/issues/143
2022-06-07 09:03:48 -07:00
Phil Wang
350a3d6045
0.6.16
v0.6.16
2022-06-06 08:45:46 -07:00
Kashif Rasul
1a81670718
fix quadratic_beta_schedule ( #141 )
2022-06-06 08:45:14 -07:00
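A quadratic beta schedule interpolates linearly in sqrt(beta) space and squares the result, so the betas grow quadratically between the endpoints. A sketch, with endpoint conventions assumed to match the linear schedule above:

```python
import torch

def quadratic_beta_schedule(timesteps):
    scale = 1000 / timesteps
    beta_start, beta_end = scale * 1e-4, scale * 0.02
    # linspace in sqrt space, then square, for a quadratic ramp
    return torch.linspace(beta_start ** 0.5, beta_end ** 0.5, timesteps, dtype = torch.float64) ** 2
```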
Phil Wang
934c9728dc
some cleanup
v0.6.15
2022-06-04 16:54:15 -07:00
Phil Wang
ce4b0107c1
0.6.13
v0.6.13
2022-06-04 13:26:57 -07:00
zion
64c2f9c4eb
implement ema warmup from @crowsonkb ( #140 )
2022-06-04 13:26:34 -07:00
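The warmup from @crowsonkb keeps the EMA decay near zero early on, so the average tracks the fast-moving young model, then ramps it toward its ceiling over training. A sketch of the schedule (parameter values are illustrative):

```python
def ema_decay_at(step, inv_gamma = 1.0, power = 2 / 3, max_value = 0.9999):
    # decay ~ 0 at step 0, approaching max_value as step grows
    value = 1 - (1 + step / inv_gamma) ** -power
    return min(max(value, 0.0), max_value)
```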
Phil Wang
22cc613278
ema fix from @nousr
v0.6.12
2022-06-03 19:44:36 -07:00
zion
83517849e5
ema module fixes ( #139 )
2022-06-03 19:43:51 -07:00
Phil Wang
708809ed6c
lower beta2 for adam down to 0.99, based on https://openreview.net/forum?id=2LdBqxc1Yv
v0.6.11
2022-06-03 10:26:28 -07:00
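In code this is just a different betas tuple for Adam; the linked review argues a faster-adapting second-moment estimate is more stable for diffusion training. A one-line sketch with a stand-in model:

```python
import torch

model = torch.nn.Linear(512, 512)  # stand-in module

# default betas are (0.9, 0.999); dropping beta2 to 0.99 shortens the
# second-moment horizon
optimizer = torch.optim.Adam(model.parameters(), lr = 1e-4, betas = (0.9, 0.99))
```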