MobileNet-YOLO Caffe

MobileNet-YOLO

A Caffe implementation of the MobileNet-YOLO detection network, first trained on COCO trainval35k, then fine-tuned on VOC 07+12, and tested on VOC2007.

| Network | mAP | Resolution | Download | NetScope | Inference time (GTX 1080) | Inference time (i5-4440) |
|---|---|---|---|---|---|---|
| MobileNet-YOLOv3-Lite | 74.6 | 320 | caffemodel | graph | 4.79 ms | 150 ms |
| MobileNet-YOLOv3-Lite | 76.3 | 416 | caffemodel | graph | 6.52 ms | 280 ms |
  • Inference time was logged from the script and does not include pre-processing.
  • CPU performance was benchmarked on the Tencent/ncnn framework.
  • The deploy model was created with merge_bn.py, or you can try my custom version (a folding sketch follows this list).
  • The bn_model can be downloaded here.
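As a rough illustration of the idea behind merge_bn.py (a sketch only, not the actual script; the blob ordering assumed below is Caffe's usual BatchNorm/Scale layout), each BatchNorm + Scale pair is folded into the preceding convolution so the deploy model runs without BN layers:

```python
import numpy as np

def fold_bn(conv_w, conv_b, bn_blobs, scale_blobs, eps=1e-5):
    """Fold a Caffe BatchNorm + Scale pair into the preceding convolution.

    conv_w: (out_c, in_c, kh, kw), conv_b: (out_c,) or None
    bn_blobs: [mean_sum, var_sum, moving_avg_factor] as stored by Caffe's BatchNorm
    scale_blobs: [gamma, beta] from the Scale layer
    """
    factor = bn_blobs[2][0] if bn_blobs[2][0] != 0 else 1.0
    mean = bn_blobs[0] / factor
    var = bn_blobs[1] / factor
    gamma, beta = scale_blobs
    std = np.sqrt(var + eps)
    if conv_b is None:
        conv_b = np.zeros(conv_w.shape[0], dtype=conv_w.dtype)
    # y = gamma * (conv(x) - mean) / std + beta  ==  a plain convolution with folded params
    w_folded = conv_w * (gamma / std).reshape(-1, 1, 1, 1)
    b_folded = (conv_b - mean) * gamma / std + beta
    return w_folded, b_folded
```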

Knowledge Transfer

I use the following training path to improve accuracy and reduce the Lite version's training time (a fine-tuning sketch follows the list):

  • First, train MobileNet-YOLOv3 on the COCO dataset (IOU_0.5: 40.2 mAP).
  • Second, train MobileNet-YOLOv3-Lite on the COCO dataset, using the first step's weights for pre-training (IOU_0.5: 38.9 mAP).
  • Finally, train MobileNet-YOLOv3-Lite on the VOC dataset, using the second step's weights for pre-training (76.3 mAP).
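Each step is an ordinary Caffe fine-tuning job. A minimal pycaffe sketch of one such step; the solver and weight file names below are placeholders, not files from this repository:

```python
import caffe

caffe.set_device(0)
caffe.set_mode_gpu()

# Hypothetical file names: the solver for the model being trained and the
# .caffemodel produced by the previous step of the training path.
solver = caffe.SGDSolver('mobilenet_yolov3_lite_solver.prototxt')
solver.net.copy_from('mobilenet_yolov3_coco.caffemodel')  # load pre-trained weights
solver.solve()                                            # train to the solver's max_iter
```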

Windows Version

Caffe-YOLOv3-Windows

Original darknet-yolov3

Converter

Tested on coco_minival_lmdb (IOU 0.5).

| Network | mAP | Resolution | Download | NetScope |
|---|---|---|---|---|
| yolov3 | 54.2 | 416 | caffemodel | graph |
| yolov3-spp | 59.8 | 608 | caffemodel | graph |
  • correct_yolo_boxes and its related functions are not implemented yet, so only square input resolutions are supported (a sketch of the missing remapping follows).
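For reference, the missing step is roughly the following coordinate remapping (a sketch of what darknet's correct_yolo_boxes does for letterboxed, non-square inputs; coordinates are relative values in [0, 1], and this is not code from this repository):

```python
def correct_box(bx, by, bw, bh, net_w, net_h, img_w, img_h):
    # Size of the original image after it is scaled to fit inside the
    # letterboxed network input while keeping its aspect ratio.
    if net_w / img_w < net_h / img_h:
        new_w, new_h = net_w, img_h * net_w / img_w
    else:
        new_w, new_h = img_w * net_h / img_h, net_h
    # Undo the padding offset and the scale introduced by letterboxing.
    bx = (bx - (net_w - new_w) / 2.0 / net_w) / (new_w / net_w)
    by = (by - (net_h - new_h) / 2.0 / net_h) / (new_h / net_h)
    bw *= net_w / new_w
    bh *= net_h / new_h
    return bx, by, bw, bh
```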

Performance

Trained on COCO trainval35k (2014), compared with YOLO (IOU 0.5).

| Network | IOU 0.5:0.95 | IOU 0.5 | IOU 0.75 | Weight size | Resolution | NetScope | Resize Mode |
|---|---|---|---|---|---|---|---|
| MobileNet-YOLOv3-Lite | 19.9 | 35.5 | 19.6 | 22.0 MB | 320 | graph | WARP |
| MobileNet-YOLOv3-Lite | 21.5 | 38.9 | 21.2 | 22.0 MB | 416 | graph | WARP |
| MobileNet-YOLOv3 | 22.7 | 40.2 | 22.6 | 22.5 MB | 416 | graph | LetterBox |
| YOLOv3-Tiny | | 33.1 | | 33.8 MB | 416 | | |
| MobileNet-YOLOv3-Lite-trt | | 37.5 | | 23.5 MB | 416 | graph | WARP |
  • (*) The test-dev 2015 evaluation server was closed, so COCO 2014 minival is used here.
  • MobileNet-YOLOv3-Lite-trt is the fastest model. The WARP and LetterBox resize modes are sketched below.
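The two resize modes differ only in pre-processing: WARP stretches the image to the network input size, while LetterBox preserves the aspect ratio and pads the borders. A small OpenCV sketch of the difference (assumed pre-processing for illustration, not the repo's data layer):

```python
import cv2
import numpy as np

def warp_resize(img, size=416):
    # WARP: stretch to size x size; the aspect ratio is not preserved.
    return cv2.resize(img, (size, size))

def letterbox_resize(img, size=416, pad_value=127):
    # LetterBox: scale to fit, then pad the remaining borders with a constant.
    h, w = img.shape[:2]
    scale = min(size / w, size / h)
    new_w, new_h = int(w * scale), int(h * scale)
    resized = cv2.resize(img, (new_w, new_h))
    canvas = np.full((size, size, 3), pad_value, dtype=img.dtype)
    top, left = (size - new_h) // 2, (size - new_w) // 2
    canvas[top:top + new_h, left:left + new_w] = resized
    return canvas
```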

Other Models

You can find networks without depthwise convolutions in the Yolo-Model-Zoo.

| Network | mAP | Resolution | MACC | Params |
|---|---|---|---|---|
| PVA-YOLOv3 | 0.703 | 416 | 2.55G | 4.72M |
| Pelee-YOLOv3 | 0.703 | 416 | 4.25G | 3.85M |

Model visualization tool

Supported by Netron; a browser version is also available.

Build, Run and Training

See the wiki.

License and Citation

Please cite MobileNet-YOLO in your publications if it helps your research:

@article{MobileNet-YOLO,
  Author = {eric612,avisonic},
  Year = {2018}
}

Reference

https://github.com/weiliu89/caffe/tree/ssd

https://pjreddie.com/darknet/yolo/

https://github.com/gklz1982/caffe-yolov2

https://github.com/yonghenglh6/DepthwiseConvolution

https://github.com/alexgkendall/caffe-segnet

https://github.com/BVLC/caffe/pull/6384/commits/4d2400e7ae692b25f034f02ff8e8cd3621725f5c (cuDNN convolution)

https://github.com/chuanqi305/MobileNetv2-SSDLite/tree/master/src
