zion
44d4b1bba9
overhaul prior dataloader (#122)
add readme for loader
2022-05-29 07:39:59 -07:00
Phil Wang
f12a7589c5
commit to trying out grid attention
2022-05-26 12:56:10 -07:00
Phil Wang
b8af2210df
make sure diffusion prior can be instantiated from pydantic class without clip
v0.5.6
2022-05-26 08:47:30 -07:00
Phil Wang
f4fe6c570d
allow for full customization of the number of resnet blocks per down or upsampling layer in the unet, as in Imagen
v0.5.5
2022-05-26 08:33:31 -07:00
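(For context, a usage sketch of what this commit enables; the keyword name and values below are assumptions based on the commit message, not a verified call signature.)

```python
from dalle2_pytorch import Unet

# assumed keyword: a tuple giving the resnet block count at each
# down/upsampling stage, mirroring Imagen's per-resolution configuration
unet = Unet(
    dim = 128,
    image_embed_dim = 512,
    dim_mults = (1, 2, 4, 8),
    num_resnet_blocks = (2, 2, 4, 4),
)
```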
Phil Wang
645e207441
credit assignment
2022-05-26 08:16:03 -07:00
Phil Wang
00743b3a0b
update
2022-05-26 08:12:25 -07:00
Phil Wang
01589aff6a
cite maxvit properly
2022-05-26 07:12:25 -07:00
Phil Wang
7ecfd76cc0
fix evaluation config splat in training decoder script
2022-05-26 07:11:31 -07:00
Phil Wang
6161b61c55
0.5.4
v0.5.4
2022-05-25 09:32:17 -07:00
zion
1ed0f9d80b
use deterministic optimizer params (#116)
2022-05-25 09:31:43 -07:00
Phil Wang
f326a95e26
0.5.3
v0.5.3
2022-05-25 09:07:28 -07:00
zion
d7a0a2ce4b
add more support for configuring prior (#113)
2022-05-25 09:06:50 -07:00
Phil Wang
f23fab7ef7
switch over to scale-shift conditioning, as it seems Imagen and GLIDE used it and it may be important
v0.5.2
2022-05-24 21:46:12 -07:00
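(A minimal sketch of scale-shift conditioning for context: the time embedding is projected to a per-channel scale and shift that modulate the normalized feature map. Names below are illustrative, not this repository's exact block.)

```python
import torch
from torch import nn

class ScaleShiftBlock(nn.Module):
    def __init__(self, dim, time_emb_dim):
        super().__init__()
        self.norm = nn.GroupNorm(8, dim)  # assumes dim divisible by 8
        # project the time embedding to a per-channel scale and shift
        self.to_scale_shift = nn.Linear(time_emb_dim, dim * 2)

    def forward(self, x, time_emb):
        scale, shift = self.to_scale_shift(time_emb).chunk(2, dim = -1)
        scale = scale[..., None, None]
        shift = shift[..., None, None]
        # condition after normalization: h * (1 + scale) + shift
        return self.norm(x) * (1 + scale) + shift
```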
Phil Wang
857b9fbf1e
allow one to stop grouping out weight-decayable parameters, to debug an optimizer state dict problem
v0.5.1
2022-05-24 21:42:32 -07:00
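(For context, a generic sketch of the parameter grouping being made optional here; the function name is illustrative. The usual convention excludes biases and norm parameters, i.e. tensors with fewer than 2 dimensions, from weight decay.)

```python
def separate_weight_decayable_params(params):
    # biases and norm scales (ndim < 2) conventionally skip weight decay
    wd_params, no_wd_params = [], []
    for param in params:
        (wd_params if param.ndim >= 2 else no_wd_params).append(param)
    return wd_params, no_wd_params
```

The trainer can then build two optimizer param groups, one with weight decay and one with weight decay set to 0; skipping the grouping yields a single flat group, which is easier to match against a saved optimizer state dict.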
Phil Wang
8864fd0aa7
bring in the dynamic thresholding technique from the Imagen paper, which purportedly improves classifier-free guidance for the cascading ddpm
0.5.0a
2022-05-24 18:15:14 -07:00
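(A minimal sketch of dynamic thresholding as described in the Imagen paper: at each sampling step, clamp the predicted x0 to a per-sample quantile s of its absolute values rather than a fixed [-1, 1], then rescale by s. The function name and quantile value are illustrative.)

```python
import torch

def dynamic_threshold(x0, quantile = 0.9):
    # s = q-th quantile of |x0| per sample, floored at 1 so samples
    # already within [-1, 1] are left untouched
    s = torch.quantile(x0.flatten(1).abs(), quantile, dim = 1)
    s.clamp_(min = 1.0)
    s = s.view(-1, *((1,) * (x0.ndim - 1)))
    # clamp to [-s, s], then rescale back into [-1, 1]
    return x0.clamp(-s, s) / s
```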
Phil Wang
72bf159331
update
v0.5.0
2022-05-24 08:25:40 -07:00
Phil Wang
e5e47cfecb
link to Aidan's test run
2022-05-23 12:41:46 -07:00
Phil Wang
fa533962bd
just use an assert to make sure the clip image channels never differ from the channels of the diffusion prior and decoder, if clip is given
v0.4.14
2022-05-22 22:43:14 -07:00
Phil Wang
276abf337b
fix and cleanup image size determination logic in decoder
0.4.11
2022-05-22 22:28:45 -07:00
Phil Wang
ae42d03006
allow saving of additional fields via the save method in trainers, and return the loaded objects from the load method
0.4.10
2022-05-22 22:14:25 -07:00
Phil Wang
4d346e98d9
allow for config-driven creation of a clip-less diffusion prior
2022-05-22 20:36:20 -07:00
Phil Wang
2b1fd1ad2e
product management
2022-05-22 19:23:40 -07:00
zion
82a2ef37d9
Update README.md (#109)
block in a section that links to available pre-trained models for those who are interested
2022-05-22 19:22:30 -07:00
Phil Wang
5c397c9d66
move neural network creation off the configuration file into the pydantic classes
0.4.8
2022-05-22 19:18:18 -07:00
Phil Wang
0f4edff214
derived value for image preprocessing belongs to the data config class
0.4.7
2022-05-22 18:42:40 -07:00
Phil Wang
501a8c7c46
small cleanup
0.4.6
2022-05-22 15:39:38 -07:00
Phil Wang
4e49373fc5
project management
2022-05-22 15:27:40 -07:00
Phil Wang
49de72040c
fix decoder trainer optimizer loading (since there are multiple, one per unet), and also save and load the step number correctly
0.4.5
2022-05-22 15:21:00 -07:00
Phil Wang
271a376eaf
0.4.3
0.4.3
2022-05-22 15:10:28 -07:00
Phil Wang
e527002472
take care of saving and loading functions on the diffusion prior and decoder training classes
2022-05-22 15:10:15 -07:00
Phil Wang
c12e067178
let the pydantic config base model take care of loading configuration from a json path
0.4.2
2022-05-22 14:47:23 -07:00
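(A sketch of the pattern this commit describes, assuming pydantic v1 as of this era; the method name follows the commit message but is otherwise an assumption.)

```python
import json
from pydantic import BaseModel

class BaseConfig(BaseModel):
    # hypothetical helper so every config subclass can load itself
    # straight from a json file
    @classmethod
    def from_json_path(cls, json_path):
        with open(json_path) as f:
            return cls(**json.load(f))
```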
Phil Wang
c6629c431a
make training splits into their own pydantic base model, validate they sum to 1, and make the decoder script cleaner
0.4.1
2022-05-22 14:43:22 -07:00
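(A sketch of the validation this commit describes, using pydantic v1 as of this era; the class and field names are assumptions.)

```python
from pydantic import BaseModel, root_validator

class TrainSplitConfig(BaseModel):
    train: float = 0.75
    val: float = 0.15
    test: float = 0.10

    @root_validator
    def validate_splits_sum_to_one(cls, values):
        # reject configs whose splits do not partition the dataset
        total = values['train'] + values['val'] + values['test']
        if abs(total - 1.0) > 1e-6:
            raise ValueError('train/val/test splits must sum to 1')
        return values
```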
Phil Wang
7ac2fc79f2
add renamed train decoder json file
2022-05-22 14:32:50 -07:00
Phil Wang
a1ef023193
use pydantic to manage decoder training configs + defaults and refactor training script
0.4.0
2022-05-22 14:27:40 -07:00
Phil Wang
d49eca62fa
dep
2022-05-21 11:27:52 -07:00
Phil Wang
8aab69b91e
final thought
2022-05-21 10:47:45 -07:00
Phil Wang
b432df2f7b
final cleanup to decoder script
2022-05-21 10:42:16 -07:00
Phil Wang
ebaa0d28c2
product management
2022-05-21 10:30:52 -07:00
Phil Wang
8b0d459b25
move config parsing logic to its own file; consider whether to find an off-the-shelf solution at a future date
2022-05-21 10:30:10 -07:00
Phil Wang
0064661729
small cleanup of decoder train script
2022-05-21 10:17:13 -07:00
Phil Wang
b895f52843
appreciation section
2022-05-21 08:32:12 -07:00
Phil Wang
80497e9839
accept unets as list for decoder
0.3.7
2022-05-20 20:31:26 -07:00
Phil Wang
f526f14d7c
bump
0.3.6
2022-05-20 20:20:40 -07:00
Phil Wang
8997f178d6
small cleanup with timer
2022-05-20 20:05:01 -07:00
Aidan Dempster
022c94e443
Added single GPU training script for decoder (#108)
Added config files for training
Changed example image generation to be more efficient
Added configuration description to README
Removed unused import
2022-05-20 19:46:19 -07:00
Phil Wang
430961cb97
it was correct the first time, my bad
0.3.5
2022-05-20 18:05:15 -07:00
Phil Wang
721f9687c1
fix wandb logging in tracker, and do some cleanup
2022-05-20 17:27:43 -07:00
Aidan Dempster
e0524a6aff
Implemented the wandb tracker (#106)
Added a base_path parameter to all trackers for storing any local information they need to store
2022-05-20 16:39:23 -07:00
Aidan Dempster
c85e0d5c35
Update decoder dataloader (#105)
* Updated the decoder dataloader
Removed unnecessary logging for required packages
Switched to using index width instead of shard width
Added the ability to select extra keys to return from the webdataset
* Added README for decoder loader
2022-05-20 16:38:55 -07:00
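(A generic webdataset sketch for context, showing the idea of decoding images and selecting extra keys per sample; the shard url and key names are placeholders, not this loader's interface.)

```python
import webdataset as wds

dataset = (
    wds.WebDataset("shards/shard-{00000..00099}.tar")
    .decode("pil")              # decode images to PIL
    .to_tuple("jpg", "json")    # image plus an extra metadata key
)
```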
Phil Wang
db0642c4cd
quick fix for @marunine
0.3.3
2022-05-18 20:22:52 -07:00