Docker
Quickstart instructions are below; for more information on DeepDetect Docker images, see https://github.com/jolibrain/deepdetect/tree/master/docker.
Installation steps:
- Install nvidia-docker (see its installation instructions)
- Pull the Docker image
- Run the container
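The steps above can be sketched as shell commands; the image name `jolibrain/deepdetect_gpu` is taken from the `docker run` example in the Docker tips section below:

```shell
# Pull the GPU image (the image name used in the Docker tips section below)
docker pull jolibrain/deepdetect_gpu

# Start the DeepDetect server in the background, exposing its API on port 8080
docker run -d -p 8080:8080 jolibrain/deepdetect_gpu
```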
Usage
Get info from the DeepDetect server:
```shell
curl http://localhost:8080/info
```
Yields something like:
```json
{
  "status": {
    "code": 200,
    "msg": "OK"
  },
  "head": {
    "method": "/info",
    "version": "0.1",
    "branch": "master",
    "commit": "c8556f0b3e7d970bcd9861b910f9eae87cfd4b0c",
    "services": []
  }
}
```
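Responses like this are easy to post-process in the shell. The snippet below pulls the status code out of a sample `/info` response with `python3` (the response is inlined here so the snippet runs standalone; against a live server, pipe `curl -s http://localhost:8080/info` into the same one-liner):

```shell
# Sample /info response inlined for illustration; replace the echo with
# `curl -s http://localhost:8080/info` when a server is running
response='{"status":{"code":200,"msg":"OK"},"head":{"method":"/info","version":"0.1","services":[]}}'
echo "$response" | python3 -c 'import json, sys; print(json.load(sys.stdin)["status"]["code"])'
# prints 200
```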
The image embeds the googlenet and resnet_50 image classification models; see below for how to use them out of the box.
To use a pre-trained model from outside the docker image, see how to share a volume in the Docker Tips section.
Here is how to create a simple image classification service and run a prediction test:
Service creation
```shell
curl -X PUT "http://localhost:8080/services/imageserv" -d '{
  "mllib": "caffe",
  "description": "image classification service",
  "type": "supervised",
  "parameters": {
    "input": {
      "connector": "image"
    },
    "mllib": {
      "nclasses": 1000
    }
  },
  "model": {
    "repository": "/opt/models/ggnet/"
  }
}'
{
  "status": {
    "code": 201,
    "msg": "Created"
  }
}
```
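Hand-edited `-d` payloads are a common source of request errors. Before sending, the JSON can be checked locally with `python3 -m json.tool`, shown here on the service-creation payload above:

```shell
payload='{
  "mllib": "caffe",
  "description": "image classification service",
  "type": "supervised",
  "parameters": {
    "input": { "connector": "image" },
    "mllib": { "nclasses": 1000 }
  },
  "model": { "repository": "/opt/models/ggnet/" }
}'
# json.tool exits non-zero on malformed JSON, so a typo stops the pipeline here
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload OK"
# then send it: curl -X PUT "http://localhost:8080/services/imageserv" -d "$payload"
```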
Image classification
```shell
curl -X POST "http://localhost:8080/predict" -d '{
  "service": "imageserv",
  "parameters": {
    "input": {
      "width": 224,
      "height": 224
    },
    "output": {
      "best": 3
    },
    "mllib": {
      "gpu": true
    }
  },
  "data": [
    "http://i.ytimg.com/vi/0vxOhd4qlnA/maxresdefault.jpg"
  ]
}'
{
  "status": {
    "code": 200,
    "msg": "OK"
  },
  "head": {
    "method": "/predict",
    "time": 852.0,
    "service": "imageserv"
  },
  "body": {
    "predictions": {
      "uri": "http://i.ytimg.com/vi/0vxOhd4qlnA/maxresdefault.jpg",
      "classes": [
        {
          "prob": 0.2255125343799591,
          "cat": "n03868863 oxygen mask"
        },
        {
          "prob": 0.20917612314224244,
          "cat": "n03127747 crash helmet"
        },
        {
          "last": true,
          "prob": 0.07399296760559082,
          "cat": "n03379051 football helmet"
        }
      ]
    }
  }
}
```
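To pick only the top class out of a `/predict` response, a short `python3` one-liner works well. A trimmed sample response is inlined below so the snippet runs standalone; in practice, pipe the `curl` output into it instead:

```shell
# Trimmed sample of a /predict response; replace the echo with the real
# `curl -s -X POST "http://localhost:8080/predict" -d ...` call
response='{"body":{"predictions":{"classes":[
  {"prob":0.2255,"cat":"n03868863 oxygen mask"},
  {"prob":0.2092,"cat":"n03127747 crash helmet"},
  {"prob":0.0740,"cat":"n03379051 football helmet"}]}}}'
echo "$response" | python3 -c '
import json, sys
classes = json.load(sys.stdin)["body"]["predictions"]["classes"]
best = max(classes, key=lambda c: c["prob"])  # highest-probability class
print(best["cat"])'
# prints: n03868863 oxygen mask
```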
Docker tips
Use models from outside the container:
```shell
docker run -d -u $(id -u ${USER}):$(id -g ${USER}) -p 8080:8080 -v /path/to/models:/opt/models/ jolibrain/deepdetect_gpu
```
where `/path/to/models` is the path to the local models you'd like to use with the DeepDetect container.
You are now ready to use any available model.
Looking at DeepDetect server logs:
```shell
docker logs -f <container-name>
```
where `<container-name>` can be obtained via `docker ps`.