Our training consists of two stages, and each stage used more than 200 A800 GPUs for roughly 120 hours (24 hours × 5 days). Starting from our pre-trained model, fine-tuning with LoRA needs only a single A10, or a GPU with equivalent memory, for about 2 hours to reach satisfactory results.
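The gap between the two costs comes from what LoRA trains: the pretrained weights stay frozen, and only two small low-rank matrices per adapted layer receive gradients. A minimal NumPy sketch of the idea (the dimensions and rank here are illustrative, not this model's actual sizes):

```python
import numpy as np

# Illustrative sizes only: hidden dimension d and LoRA rank r, with r << d.
d, r = 1024, 8
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))          # frozen pretrained weight (not trained)
A = rng.standard_normal((r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection, zero-initialized

def lora_forward(x, alpha=16):
    """y = W x + (alpha / r) * B (A x); only A and B would receive gradients."""
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.standard_normal(d)
# Because B starts at zero, the adapted model initially matches the frozen one.
assert np.allclose(lora_forward(x), W @ x)

# Trainable parameter count: 2*d*r for LoRA vs d*d for full fine-tuning.
print(2 * d * r / (d * d))  # → 0.015625, i.e. ~1.6% of the layer's parameters
```

With only a few percent of the parameters (and their optimizer states) in memory, the job fits on a single mid-range GPU such as an A10 rather than a multi-node A800 cluster.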