Finetuning GPT-2 Model For Generating News Headlines

The notebook provided can be run directly on Google Colab. The maximum batch size on a p2.x GPU is 2; training one epoch takes around 12 hours.

Training/Finetuning Details
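The notebook itself is the authoritative version; as a rough illustration of the finetuning setup, here is a minimal sketch using the Hugging Face `transformers` Trainer. File names such as `headlines.txt` and the output directory are assumptions, not taken from the repository; the batch size of 2 reflects the p2.x GPU limit noted above.

```python
# Hypothetical sketch: finetune GPT-2 on a file with one news headline per line.
# File names and hyperparameters are illustrative assumptions, not the
# notebook's actual settings.

def format_headline(headline: str, eos_token: str = "<|endoftext|>") -> str:
    """Append GPT-2's end-of-text token so the model learns where a headline ends."""
    return headline.strip() + eos_token

def main() -> None:
    # Heavy imports are kept inside main() so the helper above stays standalone.
    from transformers import (
        GPT2LMHeadModel,
        GPT2TokenizerFast,
        Trainer,
        TrainingArguments,
        DataCollatorForLanguageModeling,
    )

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Assumed input file: one headline per line.
    with open("headlines.txt") as f:
        texts = [format_headline(line) for line in f if line.strip()]
    encodings = tokenizer(texts, truncation=True, max_length=64)
    train_dataset = [{"input_ids": ids} for ids in encodings["input_ids"]]

    args = TrainingArguments(
        output_dir="gpt2-headlines",
        per_device_train_batch_size=2,  # README: max batch size on a p2.x GPU is 2
        num_train_epochs=1,             # README: one epoch takes around 12 hours
    )
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        # mlm=False gives standard causal (left-to-right) language modeling
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model("gpt2-headlines")
```

Calling `main()` starts training; the EOS-appending step matters because, without it, the model has no signal for where one headline stops and the next begins.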

Inference
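For inference, a sketch along these lines would sample headlines from the finetuned checkpoint. The checkpoint path `gpt2-headlines` and the sampling parameters are assumptions for illustration, not values from the notebook.

```python
# Hypothetical sketch: sample news headlines from a finetuned GPT-2 checkpoint.
# The checkpoint path and decoding parameters are illustrative assumptions.

def extract_headline(generated: str, eos_token: str = "<|endoftext|>") -> str:
    """Cut generated text at the first end-of-text token, if present."""
    return generated.split(eos_token, 1)[0].strip()

def generate_headlines(prompt: str = "", n: int = 3) -> list[str]:
    # Heavy imports are kept inside the function so the helper above stays standalone.
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2-headlines")
    model = GPT2LMHeadModel.from_pretrained("gpt2-headlines")
    model.eval()

    inputs = tokenizer(prompt or tokenizer.bos_token, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_length=32,
        do_sample=True,          # sampling gives varied headlines; greedy repeats
        top_k=50,
        top_p=0.95,
        num_return_sequences=n,
        pad_token_id=tokenizer.eos_token_id,
    )
    return [
        extract_headline(tokenizer.decode(out, skip_special_tokens=False))
        for out in outputs
    ]
```

`extract_headline` trims at the EOS token appended during training, so each returned string is a single headline rather than run-on generated text.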