
add resnet18 310 infer in readme

pull/14860/head
jiangzhenguang, 4 years ago
commit d0ab7afefe
3 changed files with 72 additions and 2 deletions
  1. model_zoo/official/cv/resnet/README.md (+35, -0)
  2. model_zoo/official/cv/resnet/README_CN.md (+35, -0)
  3. model_zoo/official/cv/resnet/scripts/run_infer_310.sh (+2, -2)

model_zoo/official/cv/resnet/README.md (+35, -0)

@@ -12,6 +12,10 @@
- [Script Parameters](#script-parameters)
- [Training Process](#training-process)
- [Evaluation Process](#evaluation-process)
- [Inference Process](#inference-process)
- [Export MindIR](#export-mindir)
- [Infer on Ascend310](#infer-on-ascend310)
- [Result](#result)
- [Model Description](#model-description)
- [Performance](#performance)
- [Evaluation Performance](#evaluation-performance)
@@ -479,6 +483,37 @@ result: {'top_5_accuracy': 0.9342589628681178, 'top_1_accuracy': 0.7680657810499


```


## Inference Process

### [Export MindIR](#contents)

```shell
python export.py --ckpt_file [CKPT_PATH] --file_name [FILE_NAME] --file_format [FILE_FORMAT]
```

The `ckpt_file` parameter is required.
`FILE_FORMAT` should be chosen from ["AIR", "MINDIR"].
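
For instance, a hypothetical export of a trained ResNet18 checkpoint to MINDIR could look like the following (the checkpoint path and output name below are placeholders, not files shipped with the repository):

```shell
# Hypothetical paths: point --ckpt_file at your own trained checkpoint.
python export.py --ckpt_file ./resnet18-90_625.ckpt --file_name resnet18 --file_format MINDIR
```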

### Infer on Ascend310

Before performing inference, the MINDIR file must be exported by the `export.py` script. We only provide an example of inference using the MINDIR model.
Currently, batch_size can only be set to 1. The accuracy calculation needs more than 70 GB of memory; otherwise the process will be killed for exceeding the memory limit.

```shell
# Ascend310 inference
bash run_infer_310.sh [MINDIR_PATH] [DATA_PATH] [DEVICE_ID]
```

- `DEVICE_ID` is optional; the default value is 0.
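
As a sketch, assuming the MINDIR file exported above and a local evaluation dataset directory (both paths below are placeholders), the script could be invoked as:

```shell
# Hypothetical paths: replace with your exported MINDIR file and dataset directory.
bash run_infer_310.sh ./resnet18.mindir /path/to/dataset 0
```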

### Result

The inference result is saved in the current path; you can find results like the following in the acc.log file.

```bash
top1_accuracy:70.42, top5_accuracy:89.7
```

# [Model Description](#contents)


## [Performance](#contents)


model_zoo/official/cv/resnet/README_CN.md (+35, -0)

@@ -14,6 +14,10 @@
- [Script Parameters](#script-parameters)
- [Training Process](#training-process)
- [Evaluation Process](#evaluation-process)
- [Inference Process](#inference-process)
- [Export MindIR](#export-mindir)
- [Infer on Ascend310](#infer-on-ascend310)
- [Result](#result)
- [Model Description](#model-description)
- [Performance](#performance)
- [Evaluation Performance](#evaluation-performance)
@@ -446,6 +450,37 @@ result:{'top_5_accuracy':0.9342589628681178, 'top_1_accuracy':0.768065781049936}


```


## Inference Process

### [Export MindIR](#contents)

```shell
python export.py --ckpt_file [CKPT_PATH] --file_name [FILE_NAME] --file_format [FILE_FORMAT]
```

The `ckpt_file` parameter is required.
`FILE_FORMAT` must be chosen from ["AIR", "MINDIR"].

### Infer on Ascend310

Before performing inference, the MINDIR file must be exported by the `export.py` script. The following shows an example of inference using the MINDIR model.
Currently, only a batch_size of 1 is supported. The accuracy calculation needs more than 70 GB of memory; otherwise the process will be killed by the system for exceeding the memory limit.

```shell
# Ascend310 inference
bash run_infer_310.sh [MINDIR_PATH] [DATA_PATH] [DEVICE_ID]
```

- `DEVICE_ID` is optional; the default value is 0.

### Result

Inference results are saved in the current path where the script is run; you can find accuracy results like the following in the acc.log file.

```bash
top1_accuracy:70.42, top5_accuracy:89.7
```

# Model Description


## Performance


model_zoo/official/cv/resnet/scripts/run_infer_310.sh (+2, -2)

@@ -55,7 +55,7 @@ fi


function compile_app()
{
-    cd ../ascend310_infer/src/
+    cd ../ascend310_infer/src/ || exit
    if [ -f "Makefile" ]; then
        make clean
    fi
@@ -64,7 +64,7 @@ function compile_app()


function infer()
{
-    cd -
+    cd - || exit
    if [ -d result_Files ]; then
        rm -rf ./result_Files
    fi

