Commit Graph

  • 461b91c5c1 also merge distributed training code for decoder, thanks to @Veldrovive v0.10.1 Phil Wang 2022-06-19 09:26:44 -07:00
  • 58892135d9 Distributed Training of the Decoder (#121) Aidan Dempster 2022-06-19 12:25:54 -04:00
  • e37072a48c 0.10.0 v0.10.0 Phil Wang 2022-06-19 08:50:53 -07:00
  • 41ca896413 depend on huggingface accelerate, move appreciation thread up for visibility Phil Wang 2022-06-19 08:50:35 -07:00
  • fe19b508ca Distributed Training of the Prior (#112) zion 2022-06-19 10:46:14 -05:00
  • 6651eafa93 one more residual, after seeing good results on unconditional generation locally v0.9.2 Phil Wang 2022-06-16 11:18:02 -07:00
  • e6bb75e5ab fix missing residual for highest resolution of the unet Phil Wang 2022-06-15 20:09:43 -07:00
  • 57f1ddf9d2 fix missing residual for highest resolution of the unet v0.9.1 Phil Wang 2022-06-15 19:11:58 -07:00
  • 6647050c33 fix missing residual for highest resolution of the unet v0.9.0 Phil Wang 2022-06-15 18:01:12 -07:00
  • b4c3e5b854 changed str to avoid confusion and collisions with Python (#147) Giorgos Zachariadis 2022-06-15 23:41:16 +03:00
  • b7f9607258 make memory efficient unet design from imagen toggle-able v0.8.1 Phil Wang 2022-06-15 13:40:26 -07:00
  • 2219348a6e adopt similar unet architecture as imagen v0.8.0 Phil Wang 2022-06-15 12:18:21 -07:00
  • 9eea9b9862 add p2 loss reweighting for decoder training as an option v0.7.1 Phil Wang 2022-06-14 10:58:57 -07:00 (see the p2 weighting sketch after this list)
  • 5d958713c0 fix classifier free guidance for image hiddens summed to time hiddens, thanks to @xvjiarui for finding this bug v0.7.0 Phil Wang 2022-06-13 21:01:50 -07:00
  • 0f31980362 cleanup Phil Wang 2022-06-07 17:31:38 -07:00
  • bee5bf3815 fix for https://github.com/lucidrains/DALLE2-pytorch/issues/143 Phil Wang 2022-06-07 09:03:48 -07:00
  • 350a3d6045 0.6.16 v0.6.16 Phil Wang 2022-06-06 08:45:40 -07:00
  • 1a81670718 fix quadratic_beta_schedule (#141) Kashif Rasul 2022-06-06 17:45:14 +02:00 (see the beta schedule sketch after this list)
  • 934c9728dc some cleanup v0.6.15 Phil Wang 2022-06-04 16:54:15 -07:00
  • bdc3b222f2 some cleanup v0.6.14 Phil Wang 2022-06-04 16:53:20 -07:00
  • ce4b0107c1 0.6.13 0.6.13 Phil Wang 2022-06-04 13:26:57 -07:00
  • 64c2f9c4eb implement ema warmup from @crowsonkb (#140) zion 2022-06-04 13:26:34 -07:00 (see the EMA warmup sketch after this list)
  • 22cc613278 ema fix from @nousr v0.6.12 Phil Wang 2022-06-03 19:44:36 -07:00
  • 83517849e5 ema module fixes (#139) zion 2022-06-03 19:43:51 -07:00
  • 708809ed6c lower beta2 for adam down to 0.99, based on https://openreview.net/forum?id=2LdBqxc1Yv v0.6.11 Phil Wang 2022-06-03 10:26:28 -07:00
  • 9cc475f6e7 fix update_every within EMA v0.6.10 Phil Wang 2022-06-03 10:21:05 -07:00
  • ffd342e9d0 allow for an option to constrain the variance interpolation fraction coming out of the unet for learned variance, if it is turned on v0.6.9 Phil Wang 2022-06-03 09:34:57 -07:00
  • f8bfd3493a make destructuring datum length agnostic when validating in training decoder script, for @YUHANG-Ma Phil Wang 2022-06-02 13:54:57 -07:00
  • 9025345e29 take a stab at fixing generate_grid_samples when real images have a greater image size than generated Phil Wang 2022-06-02 11:33:15 -07:00
  • 8cc278447e just cast to right types for blur sigma and kernel size augs v0.6.8 Phil Wang 2022-06-02 11:21:58 -07:00
  • 38cd62010c allow for random blur sigma and kernel size augmentations on low res conditioning (need to reread paper to see if the augmentation value needs to be fed into the unet for conditioning as well) v0.6.7 Phil Wang 2022-06-02 11:11:18 -07:00
  • 1cc288af39 Improve Readability (#133) Ryan Russell 2022-06-01 15:28:02 -05:00
  • a851168633 make youtokentome optional package, due to reported installation difficulties v0.6.6 Phil Wang 2022-06-01 09:25:35 -07:00
  • 1ffeecd0ca lower default ema beta value v0.6.5 Phil Wang 2022-05-31 11:55:21 -07:00
  • 3df899f7a4 patch v0.6.4 Phil Wang 2022-05-31 09:03:43 -07:00
  • 09534119a1 Fixed non-deterministic optimizer creation (#130) Aidan Dempster 2022-05-31 12:03:20 -04:00
  • 6f8b90d4d7 add packaging package v0.6.3 Phil Wang 2022-05-30 11:45:00 -07:00
  • b588286288 fix version v0.6.2 Phil Wang 2022-05-30 11:06:34 -07:00
  • b693e0be03 default number of resnet blocks per layer in unet to 2 (in imagen it was 3 for base 64x64) v0.6.1 Phil Wang 2022-05-30 10:06:48 -07:00
  • a0bed30a84 additional conditioning on image embedding by summing to time embeddings (for FiLM like conditioning in subsequent layers), from passage found in paper by @mhh0318 v0.6.0 Phil Wang 2022-05-30 09:26:46 -07:00
  • 387c5bf774 quick patch for new prior loader (#123) zion 2022-05-29 16:25:53 -07:00
  • a13d2d89c5 0.5.7 v0.5.7 Phil Wang 2022-05-29 07:40:25 -07:00
  • 44d4b1bba9 overhaul prior dataloader (#122) zion 2022-05-29 07:39:59 -07:00
  • f12a7589c5 commit to trying out grid attention Phil Wang 2022-05-26 12:56:10 -07:00
  • b8af2210df make sure diffusion prior can be instantiated from pydantic class without clip v0.5.6 Phil Wang 2022-05-26 08:47:30 -07:00
  • f4fe6c570d allow for full customization of number of resnet blocks per down or upsampling layers in unet, as in imagen v0.5.5 Phil Wang 2022-05-26 08:33:31 -07:00
  • 645e207441 credit assignment Phil Wang 2022-05-26 08:16:03 -07:00
  • 00743b3a0b update Phil Wang 2022-05-26 08:12:25 -07:00
  • 01589aff6a cite maxvit properly Phil Wang 2022-05-26 07:12:25 -07:00
  • 7ecfd76cc0 fix evaluation config splat in training decoder script Phil Wang 2022-05-26 07:11:31 -07:00
  • 6161b61c55 0.5.4 v0.5.4 Phil Wang 2022-05-25 09:32:05 -07:00
  • 1ed0f9d80b use deterministic optimizer params (#116) zion 2022-05-25 09:31:43 -07:00
  • f326a95e26 0.5.3 v0.5.3 Phil Wang 2022-05-25 09:07:28 -07:00
  • d7a0a2ce4b add more support for configuring prior (#113) zion 2022-05-25 09:06:50 -07:00
  • f23fab7ef7 switch over to scale shift conditioning, as it seems like Imagen and Glide used it and it may be important v0.5.2 Phil Wang 2022-05-24 21:46:12 -07:00 (see the scale-shift sketch after this list)
  • 857b9fbf1e allow for one to stop grouping out weight decayable parameters, to debug optimizer state dict problem v0.5.1 Phil Wang 2022-05-24 21:42:32 -07:00
  • 8864fd0aa7 bring in the dynamic thresholding technique from the Imagen paper, which purportedly improves classifier free guidance for the cascading ddpm 0.5.0a Phil Wang 2022-05-24 18:14:35 -07:00 (see the dynamic thresholding sketch after this list)
  • 72bf159331 update v0.5.0 Phil Wang 2022-05-24 08:25:40 -07:00
  • e5e47cfecb link to aidan's test run Phil Wang 2022-05-23 12:41:46 -07:00
  • fa533962bd just use an assert to make sure clip image channels never differ from the channels of the diffusion prior and decoder, if clip is given v0.4.14 Phil Wang 2022-05-22 22:43:14 -07:00
  • a0e41267f8 just use an assert to make sure clip image channels never differ from the channels of the diffusion prior and decoder, if clip is given 0.4.12 Phil Wang 2022-05-22 22:34:33 -07:00
  • 276abf337b fix and cleanup image size determination logic in decoder 0.4.11 Phil Wang 2022-05-22 22:28:45 -07:00
  • ae42d03006 allow for saving of additional fields on save method in trainers, and return loaded objects from the load method 0.4.10 Phil Wang 2022-05-22 22:14:25 -07:00
  • 4d346e98d9 allow for config driven creation of clip-less diffusion prior Phil Wang 2022-05-22 20:36:20 -07:00
  • dc50c6b34e allow for config driven creation of clip-less diffusion prior v0.4.9 Phil Wang 2022-05-22 20:13:20 -07:00
  • 2b1fd1ad2e product management Phil Wang 2022-05-22 19:23:40 -07:00
  • 82a2ef37d9 Update README.md (#109) zion 2022-05-22 19:22:30 -07:00
  • 5c397c9d66 move neural network creations off the configuration file into the pydantic classes 0.4.8 Phil Wang 2022-05-22 19:18:18 -07:00
  • 0f4edff214 derived value for image preprocessing belongs to the data config class 0.4.7 Phil Wang 2022-05-22 18:42:40 -07:00
  • 501a8c7c46 small cleanup 0.4.6 Phil Wang 2022-05-22 15:39:32 -07:00
  • 4e49373fc5 project management Phil Wang 2022-05-22 15:27:40 -07:00
  • 49de72040c fix decoder trainer optimizer loading (since there are multiple for each unet), also save and load step number correctly 0.4.5 Phil Wang 2022-05-22 15:21:00 -07:00
  • 271a376eaf 0.4.3 0.4.3 Phil Wang 2022-05-22 15:10:28 -07:00
  • e527002472 take care of saving and loading functions on the diffusion prior and decoder training classes Phil Wang 2022-05-22 15:10:15 -07:00
  • c12e067178 let the pydantic config base model take care of loading configuration from json path 0.4.2 Phil Wang 2022-05-22 14:47:23 -07:00
  • c6629c431a make training splits into its own pydantic base model, validate it sums to 1, make decoder script cleaner 0.4.1 Phil Wang 2022-05-22 14:43:22 -07:00
  • 7ac2fc79f2 add renamed train decoder json file Phil Wang 2022-05-22 14:32:50 -07:00
  • a1ef023193 use pydantic to manage decoder training configs + defaults and refactor training script 0.4.0 Phil Wang 2022-05-22 14:27:40 -07:00
  • d49eca62fa dep Phil Wang 2022-05-21 11:27:46 -07:00
  • 8aab69b91e final thought Phil Wang 2022-05-21 10:47:45 -07:00
  • b432df2f7b final cleanup to decoder script Phil Wang 2022-05-21 10:42:16 -07:00
  • ebaa0d28c2 product management Phil Wang 2022-05-21 10:30:52 -07:00
  • 8b0d459b25 move config parsing logic to own file, consider whether to find an off-the-shelf solution at future date Phil Wang 2022-05-21 10:30:10 -07:00
  • 6ea65f59cc move config parsing logic to own file, consider whether to find an off-the-shelf solution at future date 0.3.9 Phil Wang 2022-05-21 10:25:15 -07:00
  • 5340a96c0f move config parsing logic to own file, consider whether to find an off-the-shelf solution at future date 0.3.8 Phil Wang 2022-05-21 10:24:27 -07:00
  • 0064661729 small cleanup of decoder train script Phil Wang 2022-05-21 10:17:07 -07:00
  • b895f52843 appreciation section Phil Wang 2022-05-21 08:32:12 -07:00
  • 80497e9839 accept unets as list for decoder 0.3.7 Phil Wang 2022-05-20 20:31:26 -07:00
  • f526f14d7c bump 0.3.6 Phil Wang 2022-05-20 20:20:40 -07:00
  • 8997f178d6 small cleanup with timer Phil Wang 2022-05-20 20:05:01 -07:00
  • 022c94e443 Added single GPU training script for decoder (#108) Aidan Dempster 2022-05-20 22:46:19 -04:00
  • 430961cb97 it was correct the first time, my bad 0.3.5 Phil Wang 2022-05-20 18:05:15 -07:00
  • 721f9687c1 fix wandb logging in tracker, and do some cleanup Phil Wang 2022-05-20 17:27:43 -07:00
  • 9340d33d5f fix wandb logging in tracker, and do some cleanup 0.3.4 Phil Wang 2022-05-20 17:10:33 -07:00
  • e0524a6aff Implemented the wandb tracker (#106) Aidan Dempster 2022-05-20 19:39:23 -04:00
  • c85e0d5c35 Update decoder dataloader (#105) Aidan Dempster 2022-05-20 19:38:55 -04:00
  • db0642c4cd quick fix for @marunine 0.3.3 Phil Wang 2022-05-18 20:22:52 -07:00
  • bb86ab2404 update sample, and set default gradient clipping value for decoder training 0.3.2 Phil Wang 2022-05-16 17:38:30 -07:00
  • ae056dd67c samples Phil Wang 2022-05-16 13:46:26 -07:00
  • 033d6b0ce8 last update Phil Wang 2022-05-16 13:38:33 -07:00
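
A few of the techniques named in the log above are easy to sketch. The p2 loss reweighting added in v0.7.1 follows Choi et al., "Perception Prioritized Training of Diffusion Models": per-timestep losses are down-weighted where the signal-to-noise ratio is high. A minimal sketch assuming standard DDPM quantities (function and argument names here are illustrative, not the repository's exact API):

```python
import torch

def p2_loss_weight(alphas_cumprod, k=1.0, gamma=1.0):
    # signal-to-noise ratio at each timestep
    snr = alphas_cumprod / (1 - alphas_cumprod)
    # down-weight the easy, high-SNR (low-noise) timesteps
    return (k + snr) ** -gamma

betas = torch.linspace(1e-4, 2e-2, 1000)
alphas_cumprod = torch.cumprod(1 - betas, dim=0)
weights = p2_loss_weight(alphas_cumprod)  # multiply each timestep's MSE loss by these
```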
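The quadratic_beta_schedule fixed in #141 is commonly written as a linear interpolation in sqrt(beta) space that is then squared. A minimal sketch (the start/end values are illustrative defaults):

```python
import torch

def quadratic_beta_schedule(timesteps, beta_start=1e-4, beta_end=2e-2):
    # interpolate linearly in sqrt(beta) space, then square
    return torch.linspace(beta_start ** 0.5, beta_end ** 0.5, timesteps) ** 2
```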
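The EMA warmup from @crowsonkb (#140) ramps the decay from near 0 toward a ceiling, so the average forgets the noisy early weights quickly and only becomes sticky later in training. A minimal sketch, with illustrative hyperparameter names and defaults:

```python
def ema_decay(step, inv_gamma=1.0, power=0.75, max_decay=0.9999):
    # decay starts near 0 at step 0 and approaches max_decay as step grows
    decay = 1 - (1 + step / inv_gamma) ** -power
    return min(max_decay, max(0.0, decay))

# inside the training loop, for each (ema_p, online_p) parameter pair:
#   d = ema_decay(step)
#   ema_p.mul_(d).add_(online_p, alpha=1 - d)
```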
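Scale-shift conditioning (v0.5.2) replaces purely additive conditioning with a FiLM-style modulation: the conditioning embedding is projected to a per-channel scale and shift applied after normalization. A minimal sketch of one such block (module and argument names are illustrative):

```python
import torch
import torch.nn as nn

class ScaleShiftBlock(nn.Module):
    def __init__(self, dim, cond_dim, groups=8):
        super().__init__()
        self.norm = nn.GroupNorm(groups, dim)
        self.to_scale_shift = nn.Linear(cond_dim, dim * 2)

    def forward(self, x, cond):
        # x: (batch, dim, height, width), cond: (batch, cond_dim)
        scale, shift = self.to_scale_shift(cond).chunk(2, dim=-1)
        scale = scale[:, :, None, None]  # broadcast over spatial dims
        shift = shift[:, :, None, None]
        return self.norm(x) * (scale + 1) + shift
```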
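Dynamic thresholding (0.5.0a) comes from the Imagen paper: instead of statically clipping the predicted x0 to [-1, 1] at each sampling step, clamp it to a per-sample percentile of its absolute values and rescale, which keeps high classifier-free guidance scales from saturating pixels. A minimal sketch (the percentile default is illustrative):

```python
import torch

def dynamic_threshold(x0, percentile=0.95):
    # per-sample percentile of |x0|, floored at 1 so in-range images pass through
    s = torch.quantile(x0.abs().flatten(1), percentile, dim=1)
    s = s.clamp(min=1.0).view(-1, *((1,) * (x0.ndim - 1)))
    # clamp to [-s, s], then rescale back into [-1, 1]
    return x0.clamp(-s, s) / s
```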