FCN (Fully Convolutional Network) is an end-to-end method mainly used for image segmentation. FCN replaces the last fully connected layers of VGG with convolutional layers so that the network can process images of any size, which reduces the number of parameters and improves segmentation speed. FCN uses the VGG structure as the encoder and deconvolution / upsampling operations as the decoder to recover the image resolution. Finally, FCN8s applies an 8x deconvolution / upsampling operation to restore the output to the same size as the input image.
[Paper]: Long, Jonathan, Evan Shelhamer, and Trevor Darrell. "Fully convolutional networks for semantic segmentation." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2015.
FCN8s uses VGG16 without the fully connected layers as the encoder, and fuses the features from the 3rd, 4th and 5th pooling layers of VGG16. Finally, a deconvolution with stride 8 is used to obtain the segmented image.
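The skip-connection fusion can be summarized with the following minimal sketch. It is illustrative only, not the code in `src/nets/FCN8s.py`: the convolutionalized fc6/fc7 layers that follow pool5 are omitted for brevity, and the layer names are assumptions.

```python
# Minimal sketch of the FCN-8s decoder fusion: score maps from pool3, pool4
# and pool5 (VGG16 feature maps with 256/512/512 channels) are fused step by
# step, then upsampled by 8x back to the input resolution.
import mindspore.nn as nn

class FCN8sHead(nn.Cell):
    def __init__(self, num_classes=21):
        super().__init__()
        # 1x1 convolutions produce per-class score maps from each feature map
        self.score_pool3 = nn.Conv2d(256, num_classes, 1)
        self.score_pool4 = nn.Conv2d(512, num_classes, 1)
        self.score_pool5 = nn.Conv2d(512, num_classes, 1)
        # 2x upsampling for the two skip fusions, 8x for the final output
        self.up2_a = nn.Conv2dTranspose(num_classes, num_classes, 4, stride=2, pad_mode='same')
        self.up2_b = nn.Conv2dTranspose(num_classes, num_classes, 4, stride=2, pad_mode='same')
        self.up8 = nn.Conv2dTranspose(num_classes, num_classes, 16, stride=8, pad_mode='same')

    def construct(self, pool3, pool4, pool5):
        fuse4 = self.score_pool4(pool4) + self.up2_a(self.score_pool5(pool5))  # 1/16 resolution
        fuse3 = self.score_pool3(pool3) + self.up2_b(fuse4)                    # 1/8 resolution
        return self.up8(fuse3)                                                 # input resolution logits
```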
Dataset used: PASCAL VOC 2012
After installing MindSpore through the official website, you can start training and evaluation by following these steps:
Running on Ascend with default parameters:

```bash
# run training example
python train.py --device_id device_id

# run evaluation example with default parameters
python eval.py --device_id device_id
```
```
├── cv
    ├── FCN8s
        ├── README.md                    // descriptions about FCN
        ├── scripts
        │   ├── run_train.sh
        │   ├── run_standalone_train.sh
        │   ├── run_eval.sh
        │   ├── build_data.sh
        ├── src
        │   ├── data
        │   │   ├── build_seg_data.py    // creating dataset
        │   │   ├── dataset.py           // loading dataset
        │   ├── nets
        │   │   ├── FCN8s.py             // FCN-8s architecture
        │   ├── loss
        │   │   ├── loss.py              // loss function
        │   ├── utils
        │   │   ├── lr_scheduler.py      // learning rate scheduler
        ├── train.py                     // training script
        ├── eval.py                      // evaluation script
```
Parameters for both training and evaluation can be set in config.py.
config for FCN8s
```python
# dataset
'data_file': '/data/workspace/mindspore_dataset/FCN/FCN/dataset/MINDRECORED_NAME.mindrecord',  # path and name of one mindrecord file
'batch_size': 32,
'crop_size': 512,
'image_mean': [103.53, 116.28, 123.675],
'image_std': [57.375, 57.120, 58.395],
'min_scale': 0.5,
'max_scale': 2.0,
'ignore_label': 255,
'num_classes': 21,

# optimizer
'train_epochs': 500,
'base_lr': 0.015,
'loss_scale': 1024.0,

# model
'model': 'FCN8s',
'ckpt_vgg16': '',
'ckpt_pre_trained': '',

# train
'save_steps': 330,
'keep_checkpoint_max': 5,
'ckpt_dir': './ckpt',
```
For more information, see config.py.
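As an illustration of how the dataset parameters above are typically applied (random scaling between `min_scale` and `max_scale`, mean/std normalization, padding with `ignore_label`, and cropping to `crop_size`), here is a hedged sketch; the actual pipeline lives in `src/data/dataset.py` and may differ:

```python
# Illustrative preprocessing for one image/label pair using the config values
# above; `cfg` is assumed to be the dict shown in config.py.
import numpy as np
import cv2

def preprocess(image, label, cfg):
    # random scale in [min_scale, max_scale]
    scale = np.random.uniform(cfg['min_scale'], cfg['max_scale'])
    h, w = image.shape[:2]
    new_h, new_w = int(h * scale), int(w * scale)
    image = cv2.resize(image, (new_w, new_h), interpolation=cv2.INTER_LINEAR)
    label = cv2.resize(label, (new_w, new_h), interpolation=cv2.INTER_NEAREST)

    # normalize with image_mean / image_std
    image = (image.astype(np.float32) - cfg['image_mean']) / cfg['image_std']

    # pad (using ignore_label for the mask) and randomly crop to crop_size x crop_size
    crop = cfg['crop_size']
    pad_h, pad_w = max(crop - new_h, 0), max(crop - new_w, 0)
    image = np.pad(image, ((0, pad_h), (0, pad_w), (0, 0)), mode='constant')
    label = np.pad(label, ((0, pad_h), (0, pad_w)), mode='constant',
                   constant_values=cfg['ignore_label'])
    top = np.random.randint(0, label.shape[0] - crop + 1)
    left = np.random.randint(0, label.shape[1] - crop + 1)
    return image[top:top + crop, left:left + crop], label[top:top + crop, left:left + crop]
```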
Build MindRecord training data:

```bash
sh build_data.sh
```

or

```bash
python src/data/build_seg_data.py --data_root=/home/sun/data/Mindspore/benchmark_RELEASE/dataset \
                                  --data_lst=/home/sun/data/Mindspore/benchmark_RELEASE/dataset/trainaug.txt \
                                  --dst_path=dataset/MINDRECORED_NAME.mindrecord \
                                  --num_shards=1 \
                                  --shuffle=True
```
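For reference, a MindRecord file of image/label pairs can be written with `mindspore.mindrecord.FileWriter` roughly as sketched below; the schema and helper names are assumptions, so consult `src/data/build_seg_data.py` for the exact procedure:

```python
# Hedged sketch of writing segmentation samples into a MindRecord file.
from mindspore.mindrecord import FileWriter

def write_mindrecord(samples, dst_path, num_shards=1):
    """samples: iterable of (image_path, label_path) pairs."""
    writer = FileWriter(file_name=dst_path, shard_num=num_shards)
    # raw-bytes schema: the dataset loader decodes images/labels at training time
    schema = {"data": {"type": "bytes"}, "label": {"type": "bytes"}}
    writer.add_schema(schema, "segmentation schema")
    for img_path, lbl_path in samples:
        with open(img_path, "rb") as f_img, open(lbl_path, "rb") as f_lbl:
            writer.write_raw_data([{"data": f_img.read(), "label": f_lbl.read()}])
    writer.commit()
```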
Running on Ascend with default parameters:

```bash
python train.py --device_id device_id
```

Checkpoints will be stored in the default path.
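For orientation, the training options above (Momentum optimizer, fixed loss scale of 1024, a checkpoint every 330 steps with at most 5 kept) could be wired together roughly as in the sketch below; the function and variable names are placeholders, not the actual identifiers in `train.py`:

```python
# Hedged sketch of the training setup implied by config.py.
import mindspore.nn as nn
from mindspore.train.model import Model
from mindspore.train.callback import ModelCheckpoint, CheckpointConfig, LossMonitor
from mindspore.train.loss_scale_manager import FixedLossScaleManager

def build_and_train(net, loss_fn, train_dataset):
    # Momentum optimizer with base_lr and loss_scale taken from config.py
    optimizer = nn.Momentum(net.trainable_params(), learning_rate=0.015,
                            momentum=0.9, loss_scale=1024.0)
    loss_scale_manager = FixedLossScaleManager(1024.0, drop_overflow_update=False)

    model = Model(net, loss_fn=loss_fn, optimizer=optimizer,
                  loss_scale_manager=loss_scale_manager)

    # save a checkpoint every save_steps steps, keep at most keep_checkpoint_max
    ckpt_config = CheckpointConfig(save_checkpoint_steps=330, keep_checkpoint_max=5)
    ckpt_cb = ModelCheckpoint(prefix='FCN8s', directory='./ckpt', config=ckpt_config)

    model.train(500, train_dataset, callbacks=[LossMonitor(), ckpt_cb])
```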
Evaluate on the PASCAL VOC 2012 validation set using Ascend.

Before running the command, check the path of the checkpoint used for evaluation and set it to an absolute path.

```bash
python eval.py
```

After running the above command, you can see the evaluation results on the terminal. The accuracy on the validation set is as follows:

```text
mean IoU 0.6425
```
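Mean IoU is commonly computed by accumulating a confusion matrix over all validation images while skipping pixels labeled `ignore_label`; the following is a hedged sketch of that calculation, not necessarily the exact procedure in `eval.py`:

```python
# Illustrative mean-IoU computation from predicted and ground-truth masks.
import numpy as np

def update_confusion_matrix(conf, pred, label, num_classes=21, ignore_label=255):
    # accumulate counts of (true class, predicted class) pairs for valid pixels
    valid = label != ignore_label
    idx = num_classes * label[valid].astype(int) + pred[valid].astype(int)
    conf += np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
    return conf

def mean_iou(conf):
    # IoU per class = diagonal / (row sum + column sum - diagonal)
    inter = np.diag(conf)
    union = conf.sum(axis=0) + conf.sum(axis=1) - inter
    return np.mean(inter / np.maximum(union, 1))
```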
Training performance:

| Parameters | Ascend |
|---|---|
| Model Version | FCN-8s |
| Resource | Ascend 910; CPU 2.60GHz, 192 cores; Memory 755G; OS Euler2.8 |
| Uploaded Date | 12/30/2020 (month/day/year) |
| MindSpore Version | 1.1.0-alpha |
| Dataset | PASCAL VOC 2012 |
| Training Parameters | epoch=500, steps=330, batch_size=32, lr=0.015 |
| Optimizer | Momentum |
| Loss Function | Softmax Cross Entropy |
| Outputs | probability |
| Loss | 0.038 |
| Speed | 1pc: 564.652 ms/step |
| Scripts | FCN script |
Inference performance:

| Parameters | Ascend |
|---|---|
| Model Version | FCN-8s |
| Resource | Ascend 910; OS Euler2.8 |
| Uploaded Date | 12/30/2020 (month/day/year) |
| MindSpore Version | 1.1.0-alpha |
| Dataset | PASCAL VOC 2012 |
| batch_size | 16 |
| Outputs | probability |
| Mean IoU | 64.25 |
We set the random seeds in train.py.
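A minimal sketch of how such seeds are usually fixed (check `train.py` for the actual calls and seed values):

```python
# Fix the random seeds for weight initialization and dataset shuffling.
from mindspore.common import set_seed
import mindspore.dataset as ds
import numpy as np
import random

set_seed(1)             # MindSpore ops / weight initialization
ds.config.set_seed(1)   # dataset shuffling
np.random.seed(1)
random.seed(1)
```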
Please check the official homepage.