In-Context-LoRA fine-tuned model
1.0
This article is about fine-tuning with the In-Context-LoRA method.
Training on 9 images led to several conclusions:
1. Training on Chinese characters is ineffective
2. For clothing/products (as long as they are not overly complex), items shot on a white background with precise descriptions can be accurately reproduced after training
3. If certain elements are left out of the description, the trained model will likewise ignore the corresponding white-background elements when generating
4. A ginkgo tree image that previously gave poor results with standard Flux training worked very well with the In-Context-LoRA approach: it could both faithfully reproduce the scene and creatively alter scene elements
4.2 Training on Hanfu with a hand-held oil-paper umbrella gave very good results (the training images were essentially cutouts; how cutout images relate to the context of the original image, I don't fully understand)
5. Surprisingly, guqin training on the Flux base model works very well, only adding some guqin decorations
6. An old issue remains: reconstruction of the ski binding still has bugs, though the ski itself is reproduced slightly better than with direct Flux training
7. Training Config file: https://github.com/ali-vilab/In-Context-LoRA/tree/main/config
8. Training on a 4090 (24 GB) card ran out of memory; make sure to uncomment low_vram: true in the .yml file
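For reference, a minimal sketch of where that setting lives. Only the `low_vram: true` key comes from the note above; the surrounding structure is an assumption based on typical trainer YAML configs of this kind (check the actual files under the config URL in point 7):

```yaml
# Sketch only: surrounding keys are illustrative assumptions.
# The one setting taken from the notes is low_vram.
config:
  process:
    - model:
        low_vram: true   # uncomment this on 24 GB cards (e.g. RTX 4090) to avoid OOM
```

With this enabled, training trades speed for lower peak VRAM usage, which is what makes a 24 GB card workable.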

Reference prompts:
A woman wearing a white short-sleeve shirt and a black pleated skirt with a black tie, standing in a flower park. Her hair hangs naturally over her shoulders, looking very elegant.
This is a picture depicting an ancient-style scene: a woman in pink and purple Hanfu holds an oil-paper umbrella. Her hair is adorned with hair accessories, and she wears a wine-red belt around her waist. She holds a yellow lantern in her hand, with a background of green trees and flowing water.
A woman in a sleeveless, waist-cinched, ruffled short skirt with a blue pattern stands at the seaside. She has long black hair and wears a pearl necklace. Her g...
