
Commit cde02b0

mymusise authored

Fix kontext finetune issue when batch size >1 (#11921)

set drop_last to True
Signed-off-by: mymusise <mymusise1@gmail.com>

1 parent: 5dc503a

1 file changed: 1 addition (+1), 1 deletion (-1)


examples/dreambooth/train_dreambooth_lora_flux_kontext.py

Lines changed: 1 addition & 1 deletion
@@ -1614,7 +1614,7 @@ def load_model_hook(models, input_dir):
     )
     if args.cond_image_column is not None:
         logger.info("I2I fine-tuning enabled.")
-    batch_sampler = BucketBatchSampler(train_dataset, batch_size=args.train_batch_size, drop_last=False)
+    batch_sampler = BucketBatchSampler(train_dataset, batch_size=args.train_batch_size, drop_last=True)
     train_dataloader = torch.utils.data.DataLoader(
         train_dataset,
         batch_sampler=batch_sampler,
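
Why the change helps: with drop_last=False, a batch sampler whose index count is not a multiple of train_batch_size emits a final, smaller batch, and downstream code that assumes a fixed batch size can trip on that ragged batch when the batch size is greater than 1 (the commit message does not spell out the exact failure, so this is the likely mechanism, not a confirmed one). The snippet below is a minimal, self-contained sketch of the effect; ToyDataset and ToyBucketBatchSampler are hypothetical stand-ins, and only the constructor arguments visible in the diff (batch_size, drop_last) are taken from the source.

# Sketch only: illustrates drop_last behaviour with a custom batch sampler,
# not the diffusers BucketBatchSampler implementation.
import torch
from torch.utils.data import DataLoader, Dataset, Sampler


class ToyDataset(Dataset):
    def __init__(self, n):
        self.n = n

    def __len__(self):
        return self.n

    def __getitem__(self, idx):
        # Dummy "image" tensor so the default collate can stack a batch.
        return torch.full((3, 8, 8), float(idx))


class ToyBucketBatchSampler(Sampler):
    """Hypothetical stand-in for the script's BucketBatchSampler."""

    def __init__(self, dataset, batch_size, drop_last):
        self.indices = list(range(len(dataset)))
        self.batch_size = batch_size
        self.drop_last = drop_last

    def __iter__(self):
        batch = []
        for idx in self.indices:
            batch.append(idx)
            if len(batch) == self.batch_size:
                yield batch
                batch = []
        # A partial final batch is only emitted when drop_last=False.
        if batch and not self.drop_last:
            yield batch

    def __len__(self):
        if self.drop_last:
            return len(self.indices) // self.batch_size
        return (len(self.indices) + self.batch_size - 1) // self.batch_size


dataset = ToyDataset(10)
for drop_last in (False, True):
    sampler = ToyBucketBatchSampler(dataset, batch_size=4, drop_last=drop_last)
    loader = DataLoader(dataset, batch_sampler=sampler)
    sizes = [batch.shape[0] for batch in loader]
    print(f"drop_last={drop_last}: batch sizes {sizes}")
# drop_last=False: batch sizes [4, 4, 2]  <- ragged final batch
# drop_last=True:  batch sizes [4, 4]

Setting drop_last=True simply discards the incomplete trailing batch, so every batch the training loop sees has exactly train_batch_size samples.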
