A script that can accept some flags and run an inference for demonstration purposes is useful for a quickstart:
```
# fetch model(s) first
$ docker run...
$ python run_inference.py --gpu --model=resnet50 ./image.jpg
Dog
```
It doesn't need to expose all of the options the server has (e.g. batch size); a simplified set of arguments that works with some set of models is enough for demo purposes.
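A minimal sketch of what such a script's argument handling could look like, using `argparse` from the standard library. The model names, the `classify` stub, and the flag set are all assumptions for illustration; the real script would load the chosen model and print a top-1 label:

```python
import argparse

# Assumed demo-only model registry; real model names may differ.
SUPPORTED_MODELS = ("resnet50", "mobilenet")


def parse_args(argv=None):
    """Parse the simplified flag set proposed in the issue."""
    parser = argparse.ArgumentParser(description="Quickstart demo inference.")
    parser.add_argument("image", help="path to the input image")
    parser.add_argument("--model", default="resnet50", choices=SUPPORTED_MODELS,
                        help="which demo model to run")
    parser.add_argument("--gpu", action="store_true",
                        help="run on GPU if available")
    return parser.parse_args(argv)


def classify(args):
    # Placeholder: a real implementation would load `args.model`,
    # run inference on `args.image`, and return the top-1 label.
    return f"model={args.model} gpu={args.gpu} image={args.image}"


if __name__ == "__main__":
    print(classify(parse_args()))
```

Restricting `--model` via `choices` keeps the script honest about which models the demo actually supports, while server-level options such as batch size stay hidden.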