The GhostNet architecture is based on the Ghost module, which generates more features from cheap operations. Starting from a set of intrinsic feature maps, a series of cheap operations is applied to generate many ghost feature maps that fully reveal the information underlying the intrinsic features.
Paper: Kai Han, Yunhe Wang, Qi Tian, Jianyuan Guo, Chunjing Xu, Chang Xu. GhostNet: More Features from Cheap Operations. CVPR 2020.
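As a rough, hypothetical sketch of the idea only (the repository's implementation uses convolutions; the names and the toy "cheap op" below are illustrative), the Ghost module keeps a small set of intrinsic feature maps and derives extra "ghost" maps from them with cheap per-map operations:

```python
# Illustrative sketch of the Ghost module idea, not the repository's code.
# A handful of intrinsic feature maps is expanded by cheap per-map
# transforms; the real module uses a primary conv + depthwise convs.

def cheap_op(feature_map, scale=0.5, shift=0.1):
    """Stand-in for the cheap linear operation (e.g. a depthwise conv)."""
    return [scale * x + shift for x in feature_map]

def ghost_module(intrinsic_maps, ratio=2):
    """Return intrinsic maps plus (ratio - 1) ghost maps per intrinsic map."""
    ghosts = []
    for fmap in intrinsic_maps:
        for _ in range(ratio - 1):
            ghosts.append(cheap_op(fmap))
    return intrinsic_maps + ghosts

maps = [[1.0, 2.0], [3.0, 4.0]]   # 2 intrinsic feature maps
out = ghost_module(maps, ratio=2)  # 2 intrinsic + 2 ghost maps
```

The point of the design is that the ghost maps cost far less to produce than full convolutions while still enriching the feature set.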
Quantization refers to techniques for performing computations and storing tensors at lower bit widths than floating-point precision. For 8-bit quantization, we quantize the weights into [-128, 127] and the activations into [0, 255]. We finetune the model for a few epochs after post-training quantization to achieve better performance.
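As a hedged illustration of the scheme described above (the exact scale and zero-point choices used in this repository may differ), an affine 8-bit quantizer mapping weights to [-128, 127] and activations to [0, 255] can be sketched as:

```python
# Sketch of affine 8-bit quantization: one common scale/zero-point scheme,
# not necessarily the exact one used by this repository's quant.py.

def quantize(values, num_bits=8, signed=True):
    """Map floats to integers in the target range via an affine transform."""
    if signed:
        qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    else:
        qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0   # avoid div-by-zero if constant
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from the quantized integers."""
    return [(x - zero_point) * scale for x in q]

weights = [-0.9, -0.1, 0.0, 0.4, 0.8]
q_w, s_w, zp_w = quantize(weights, signed=True)    # int8 range [-128, 127]
acts = [0.0, 0.3, 0.7, 1.2]
q_a, s_a, zp_a = quantize(acts, signed=False)      # uint8 range [0, 255]
```

The finetuning step mentioned above then adjusts the float weights so that accuracy lost to this rounding is partially recovered.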
The overall network architecture of GhostNet is shown below:
Dataset used: Oxford-IIIT Pet
```
├── GhostNet
  ├── Readme.md              # descriptions about GhostNet
  ├── src
  │   ├── config.py          # parameter configuration
  │   ├── dataset.py         # creating dataset
  │   ├── launch.py          # start python script
  │   ├── lr_generator.py    # learning rate config
  │   ├── ghostnet.py        # GhostNet architecture
  │   ├── quant.py           # GhostNet quantization
  ├── eval.py                # evaluation script
  ├── mindspore_hub_conf.py  # export model for hub
```
To Be Done
After installing MindSpore via the official website, you can start evaluation as follows:
```shell
# infer example
# Ascend:
python eval.py --dataset_path ~/Pets/test.mindrecord --platform Ascend --checkpoint_path [CHECKPOINT_PATH]
# GPU:
python eval.py --dataset_path ~/Pets/test.mindrecord --platform GPU --checkpoint_path [CHECKPOINT_PATH]
```
The checkpoint can be produced during the training process.
```
result: {'acc': 0.825} ckpt= ./ghostnet_1x_pets_int8.ckpt
```
| Parameters | | |
|---|---|---|
| Model Version | GhostNet | GhostNet-int8 |
| Uploaded Date | 09/08/2020 (month/day/year) | 09/08/2020 (month/day/year) |
| MindSpore Version | 0.6.0-alpha | 0.6.0-alpha |
| Dataset | ImageNet2012 | ImageNet2012 |
| Parameters (M) | 5.2 | / |
| FLOPs (M) | 142 | / |
| Accuracy (Top1) | 73.9% | 72.2% (w/o finetune), 73.6% (w/ finetune) |
| Parameters | | |
|---|---|---|
| Model Version | GhostNet | GhostNet-int8 |
| Uploaded Date | 09/08/2020 (month/day/year) | 09/08/2020 (month/day/year) |
| MindSpore Version | 0.6.0-alpha | 0.6.0-alpha |
| Dataset | Oxford-IIIT Pet | Oxford-IIIT Pet |
| Parameters (M) | 3.9 | / |
| FLOPs (M) | 140 | / |
| Accuracy (Top1) | 82.4% | 81.66% (w/o finetune), 82.45% (w/ finetune) |
In dataset.py, we set the seed inside the `create_dataset` function. We also use a random seed in train.py.
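For illustration only (the repository uses MindSpore's own seed APIs; Python's stdlib `random` stands in for them here), fixing the seed is what makes dataset shuffling reproducible across runs:

```python
# Illustrative only: a fixed seed makes shuffling deterministic, which is
# why create_dataset and train.py set seeds for reproducible results.
import random

def shuffled_indices(n, seed=1):
    """Return a reproducible shuffle of range(n) for the given seed."""
    rng = random.Random(seed)  # local RNG so other code's state is untouched
    idx = list(range(n))
    rng.shuffle(idx)
    return idx

run1 = shuffled_indices(10, seed=1)
run2 = shuffled_indices(10, seed=1)
assert run1 == run2  # same seed, identical sample order
```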
Please check the official homepage.