The GhostNet architecture is based on the Ghost module, which generates more features from cheap operations. Starting from a set of intrinsic feature maps, a series of cheap operations is applied to generate many ghost feature maps that fully reveal the information underlying the intrinsic features.
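To illustrate the idea (not the MindSpore implementation in `src/ghostnet.py`), here is a minimal NumPy sketch of a Ghost module: intrinsic feature maps are kept as-is, and extra "ghost" maps are produced by a cheap per-channel operation. A simple 3x3 box filter stands in for the learned cheap depthwise convolution; the function name and `ratio` parameter are illustrative.

```python
import numpy as np

def ghost_module(x, ratio=2):
    """Sketch of a Ghost module on intrinsic feature maps x of shape (C, H, W).

    For each intrinsic map, (ratio - 1) ghost maps are generated by a cheap
    per-channel linear operation (here: a naive 3x3 box filter as a stand-in
    for the learned cheap convolution), then concatenated with the originals.
    """
    c, h, w = x.shape
    ghosts = []
    for _ in range(ratio - 1):
        g = np.zeros_like(x)
        for i in range(h):
            for j in range(w):
                i0, i1 = max(i - 1, 0), min(i + 2, h)
                j0, j1 = max(j - 1, 0), min(j + 2, w)
                # average the 3x3 neighborhood independently per channel
                g[:, i, j] = x[:, i0:i1, j0:j1].mean(axis=(1, 2))
        ghosts.append(g)
    # output has C * ratio channels: intrinsic maps followed by ghost maps
    return np.concatenate([x] + ghosts, axis=0)

out = ghost_module(np.ones((2, 4, 4)), ratio=2)
# out.shape == (4, 4, 4): 2 intrinsic + 2 ghost feature maps
```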
Paper: Kai Han, Yunhe Wang, Qi Tian, Jianyuan Guo, Chunjing Xu, Chang Xu. GhostNet: More Features from Cheap Operations. CVPR 2020.
The overall network architecture of GhostNet is shown below:
Dataset used: Oxford-IIIT Pet
├── GhostNet
  ├── Readme.md # descriptions about GhostNet
├── src
│ ├──config.py # parameter configuration
│ ├──dataset.py # creating dataset
│ ├──launch.py # start python script
│ ├──lr_generator.py # learning rate config
│ ├──ghostnet.py # GhostNet architecture
│ ├──ghostnet600.py # GhostNet-600M architecture
├── eval.py # evaluation script
├── mindspore_hub_conf.py # export model for hub
To Be Done
After installing MindSpore via the official website, you can start evaluation as follows:
# infer example
Ascend: python eval.py --model [ghostnet/ghostnet-600] --dataset_path ~/Pets/test.mindrecord --platform Ascend --checkpoint_path [CHECKPOINT_PATH]
GPU: python eval.py --model [ghostnet/ghostnet-600] --dataset_path ~/Pets/test.mindrecord --platform GPU --checkpoint_path [CHECKPOINT_PATH]
Checkpoints can be produced during the training process.
result: {'acc': 0.8113927500681385} ckpt= ./ghostnet_nose_1x_pets.ckpt
result: {'acc': 0.824475333878441} ckpt= ./ghostnet_1x_pets.ckpt
result: {'acc': 0.8691741618969746} ckpt= ./ghostnet600M_pets.ckpt
| Parameters | ||
|---|---|---|
| Model Version | GhostNet | GhostNet-600 |
| Uploaded Date | 09/08/2020 (month/day/year) | 09/08/2020 (month/day/year) |
| MindSpore Version | 0.6.0-alpha | 0.6.0-alpha |
| Dataset | ImageNet2012 | ImageNet2012 |
| Parameters (M) | 5.2 | 11.9 |
| FLOPs (M) | 142 | 591 |
| Accuracy (Top1) | 73.9 | 80.2 |
| Parameters | ||
|---|---|---|
| Model Version | GhostNet | GhostNet-600 |
| Uploaded Date | 09/08/2020 (month/day/year) | 09/08/2020 (month/day/year) |
| MindSpore Version | 0.6.0-alpha | 0.6.0-alpha |
| Dataset | Oxford-IIIT Pet | Oxford-IIIT Pet |
| Parameters (M) | 3.9 | 10.6 |
| FLOPs (M) | 140 | 590 |
| Accuracy (Top1) | 82.4 | 86.9 |
| Model | FLOPs (M) | Latency (ms)* | Accuracy (Top1) |
|---|---|---|---|
| MobileNetV2-1x | 300 | 28.2 | 78.5 |
| Ghost-1x w/o SE | 138 | 19.1 | 81.1 |
| Ghost-1x | 140 | 25.3 | 82.4 |
| Ghost-600 | 590 | - | 86.9 |
*The latency is measured on Huawei Kirin 990 chip under single-threaded mode with batch size 1.
In `dataset.py`, we set the seed inside the `create_dataset` function. We also use a random seed in `train.py`.
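The seeding described above can be sketched generically as follows; the exact MindSpore calls depend on the framework version, so this uses only the standard library and NumPy, and the helper name is illustrative.

```python
import random
import numpy as np

def set_reproducible_seed(seed=1):
    # Generic seeding sketch: fixing the seeds makes dataset shuffling and
    # random augmentations repeatable across runs.
    random.seed(seed)
    np.random.seed(seed)

set_reproducible_seed(1)
a = np.random.rand(3)
set_reproducible_seed(1)
b = np.random.rand(3)
# a and b are identical because the generator was re-seeded identically
```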
Please check the official homepage.
MindSpore is a new open source deep learning training/inference framework that could be used for mobile, edge and cloud scenarios.