Inference on custom trained model #672

Answered by fcakyon
kaphleamrit2 asked this question in Q&A

Thanks a lot for the issue @kaphleamrit2 !

We currently only support yaml config files for Detectron2 models. Check this post for details on how to save a model config as a yaml file: https://medium.com/codable/perform-sliced-inference-and-detailed-error-analysis-using-detectron2-models-37063e33073f
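
For reference, a minimal sketch of one way to export a custom-trained Detectron2 config to a yaml file; the base config name, class count, and file paths below are placeholder examples, not values from this thread:

```python
# Sketch: dump a custom-trained Detectron2 config to yaml.
# The model zoo entry, class count, and paths are illustrative placeholders.
from detectron2 import model_zoo
from detectron2.config import get_cfg

cfg = get_cfg()
# Start from the base config that was used for training (example entry).
cfg.merge_from_file(
    model_zoo.get_config_file("COCO-Detection/faster_rcnn_R_50_FPN_3x.yaml")
)
# Re-apply the training-time overrides (example values).
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 3
cfg.MODEL.WEIGHTS = "output/model_final.pth"

# cfg.dump() serializes the full config to a yaml string.
with open("custom_model_config.yaml", "w") as f:
    f.write(cfg.dump())
```

The resulting yaml file can then be supplied as the config for the Detectron2 model, as described in the linked post.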

This discussion was converted from issue #404 on September 30, 2022.