Development of two Encoder-Decoder models based on the pretrained distilroberta and bert-tiny, fine-tuned on CoQA for Question Answering

Dundalia/Conversational_QA_CoQA


Conversational_QA_CoQA

In this project we evaluated the performance of four different encoder-decoder models on a question-answering task over the CoQA dataset. We studied the effect of model architecture and specification, the effect of providing the dialogue history to the models, possible causes of errors and shortcomings, and the patterns of mistakes the models exhibit.
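The repository text does not show its preprocessing code, but the idea of "providing the history of the dialogues to the models" can be illustrated with a minimal sketch: earlier question-answer turns are prepended to the current question before it is paired with the passage. The function name, separator token, and turn limit below are assumptions for illustration, not code taken from this project.

```python
def build_input(passage, history, question, sep="</s>", max_turns=2):
    """Illustrative sketch (not the repository's actual code):
    prepend up to `max_turns` previous (question, answer) pairs
    to the current question, joined with RoBERTa-style separators,
    then append the passage."""
    turns = [f"{q} {sep} {a}" for q, a in history[-max_turns:]]
    context = f" {sep} ".join(turns + [question])
    return f"{context} {sep} {passage}"

# Example CoQA-style dialogue turn
passage = "Jessica went to sit in her rocking chair."
history = [("Who went to the chair?", "Jessica")]
model_input = build_input(passage, history, "What kind of chair?")
print(model_input)
```

The resulting string can then be tokenized and fed to the encoder, so the model sees the conversational context alongside the current question.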
