
modify readme

tags/v0.5.0-beta
wukesong 5 years ago
commit 0933862588
1 changed file with 13 additions and 12 deletions
  1. model_zoo/wide_and_deep/README.md  (+13, -12)

model_zoo/wide_and_deep/README.md  (+13, -12)

@@ -1,20 +1,14 @@
recommendation Model
Recommendation Model
## Overview
This is an implementation of WideDeep as described in the [Wide & Deep Learning for Recommender Systems](https://arxiv.org/pdf/1606.07792.pdf) paper.

The WideDeep model jointly trains a wide linear model and a deep neural network, combining the benefits of memorization and generalization for recommender systems.
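
For orientation, here is a minimal NumPy sketch of that wide-plus-deep combination; it is not the MindSpore implementation in this repository, and every name and shape below is illustrative:
```
import numpy as np

def wide_deep_forward(wide_x, deep_x, w_wide, deep_layers, bias):
    """Toy forward pass: a wide linear part and a deep MLP part fused by a sigmoid.

    wide_x      -- raw/cross features for the wide (linear) part, shape (batch, d_wide)
    deep_x      -- dense/embedded features for the deep part, shape (batch, d_deep)
    w_wide      -- weight vector of the wide linear model, shape (d_wide,)
    deep_layers -- list of (W, b) pairs for the deep network; the last pair maps to 1 unit
    bias        -- shared scalar bias
    """
    # Wide part: a plain linear model over the wide features (memorization).
    wide_logit = wide_x @ w_wide

    # Deep part: a small feed-forward network over dense features (generalization).
    h = deep_x
    for W, b in deep_layers[:-1]:
        h = np.maximum(h @ W + b, 0.0)        # ReLU hidden layers
    W_out, b_out = deep_layers[-1]
    deep_logit = (h @ W_out + b_out).squeeze(-1)

    # Joint training: both parts feed a single logit, squashed to a CTR-style probability.
    logit = wide_logit + deep_logit + bias
    return 1.0 / (1.0 + np.exp(-logit))

# Example shapes only: 4 samples, 10 wide features, 8 dense features, one hidden layer of 16 units.
rng = np.random.default_rng(0)
probs = wide_deep_forward(
    rng.random((4, 10)), rng.random((4, 8)), rng.random(10),
    [(rng.random((8, 16)), rng.random(16)), (rng.random((16, 1)), rng.random(1))],
    0.0,
)
```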

## Dataset
The [Criteo datasets](http://labs.criteo.com/2014/02/download-kaggle-display-advertising-challenge-dataset/) are used for model training and evaluation.
The Criteo datasets are used for model training and evaluation.

## Running Code

### Download and preprocess dataset
To download the dataset, please install the Pandas package first. Then issue the following command:
```
bash download.sh
```
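
The actual preprocessing lives in `src/preprocess_data.py`; as background, each raw Criteo record is a tab-separated line with a click label, 13 integer features and 26 hashed categorical features. The Pandas sketch below only illustrates loading such a file; the file name and column names are placeholders, not necessarily what the script uses:
```
import pandas as pd

# Conventional Criteo layout: a click label, 13 integer features and 26 hashed
# categorical features, stored as tab-separated lines without a header row.
int_cols = [f"I{i}" for i in range(1, 14)]
cat_cols = [f"C{i}" for i in range(1, 27)]
columns = ["label"] + int_cols + cat_cols

# "train.txt" is a placeholder path; point it at wherever download.sh left the raw data.
df = pd.read_csv("train.txt", sep="\t", header=None, names=columns, nrows=100_000)

# Typical cleanup before building model inputs: fill missing values per feature type.
df[int_cols] = df[int_cols].fillna(0)
df[cat_cols] = df[cat_cols].fillna("")
print(df.head())
```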

### Code Structure
The entire code structure is as follows:
```
@@ -26,13 +20,15 @@ The entire code structure is as following:
|--- src/ "entrance of training and evaluation"
    config.py "parameters configuration"
    dataset.py "Dataset loader class"
    process_data.py "process dataset"
    preprocess_data.py "pre_process dataset"
    WideDeep.py "Model structure"
    callbacks.py "Callback class for training and evaluation"
    metrics.py "Metric class"
```

### Train and evaluate model
To train and evaluate the model, issue the following command:
To train and evaluate the model, run the following command:
```
python train_and_test.py
```
@@ -51,7 +47,7 @@ Arguments:
* `--eval_file_name` : Eval output file.
* `--loss_file_name` : Loss output file.

To train the model, issue the following command:
To train the model on a single device, run the following command:
```
python train.py
```
@@ -70,7 +66,13 @@ Arguments:
* `--eval_file_name` : Eval output file.
* `--loss_file_name` : Loss output file.

To evaluate the model, issue the following command:
To train the model in distributed mode, run the following command:
```
# configure environment path, RANK_TABLE_FILE, RANK_SIZE, MINDSPORE_HCCL_CONFIG_PATH before training
bash run_multinpu_train.sh
```
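
For example, the environment might be prepared like this before launching the script; the device count and rank-table path below are placeholders for your own cluster configuration:
```
# placeholder values; replace with your own rank table and device count
export RANK_SIZE=8
export RANK_TABLE_FILE=/path/to/rank_table_8p.json
export MINDSPORE_HCCL_CONFIG_PATH=/path/to/rank_table_8p.json
bash run_multinpu_train.sh
```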

To evaluate the model, run the following command:
```
python test.py
```
@@ -90,4 +92,3 @@ Arguments:
* `--loss_file_name` : Loss output file.

There are other arguments related to the model and the training process. Use the `--help` or `-h` flag to get a full list of possible arguments with detailed descriptions.
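
For instance, directing the evaluation and loss logs to custom files could look as follows (the file names here are only illustrative):
```
python train_and_test.py --eval_file_name=my_eval.log --loss_file_name=my_loss.log
```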

