diff --git a/README.md b/README.md
index 0529495..9307f72 100644
--- a/README.md
+++ b/README.md
@@ -1,150 +1,140 @@
-# MindSpore-based Inference Service Deployment
+# MindSpore Serving
+[查看中文](./README_CN.md)
-- [MindSpore-based Inference Service Deployment](#mindspore-based-serving-service-deployment)
+- [MindSpore Serving](#mindspore-serving)
- [Overview](#overview)
- - [Starting Serving](#starting-serving)
- - [Application Example](#application-example)
- - [Exporting Model](#exporting-model)
- - [Starting Serving Inference](#starting-serving-serving)
- - [Client Samples](#client-samples)
- - [Python Client Sample](#python-client-sample)
- - [C++ Client Sample](#cpp-client-sample)
+ - [Installation](#installation)
+ - [Installing Serving](#installing-serving)
+ - [Configuring Environment Variables](#configuring-environment-variables)
+ - [Quick Start](#quick-start)
+ - [Documents](#documents)
+ - [Developer Guide](#developer-guide)
+ - [Community](#community)
+ - [Governance](#governance)
+ - [Communication](#communication)
+ - [Contributions](#contributions)
+ - [Release Notes](#release-notes)
+ - [License](#license)
-
-
## Overview
-MindSpore Serving is a lightweight and high-performance service module that helps MindSpore developers efficiently deploy online serving services in the production environment. After completing model training using MindSpore, you can export the MindSpore model and use MindSpore Serving to create an serving service for the model. Currently, only Ascend 910 is supported.
+MindSpore Serving is a lightweight and high-performance service module that helps MindSpore developers efficiently deploy online inference services in the production environment. After completing model training on MindSpore, you can export the MindSpore model and use MindSpore Serving to create an inference service for the model.
+MindSpore Serving architecture:
-## Starting Serving
-After MindSpore is installed using `pip`, the Serving executable program is stored in `/{your python path}/lib/python3.7/site-packages/mindspore/ms_serving`.
-Run the following command to start Serving:
-```bash
-ms_serving [--help] [--model_path ] [--model_name ]
- [--port ] [--device_id ]
-```
-Parameters are described as follows:
+Currently, MindSpore Serving consists of three kinds of nodes: client, master, and worker. A client node delivers inference requests through the gRPC or RESTful API. A worker node deploys the Servable model service; a Servable is a single model or a combination of multiple models and can provide different services in various ways. A master node manages all worker nodes and their model information, and manages and distributes tasks. The master and worker nodes can be deployed in the same process or in different processes. Currently, the client and master nodes do not depend on specific hardware platforms, while the worker node supports only the Ascend 310 and Ascend 910 platforms. GPUs and CPUs will be supported in the future.
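The node roles described above can be illustrated with a small, library-independent sketch. This is not MindSpore Serving code; the class names and routing logic are purely conceptual, showing only the idea of a master that tracks workers and routes a client request to the worker hosting the requested Servable.

```python
# Conceptual sketch (not MindSpore Serving code) of the client/master/worker
# roles: the master tracks which worker hosts which servable and dispatches
# each client request accordingly.

class Worker:
    def __init__(self, servable_name, fn):
        self.servable_name = servable_name
        self.fn = fn  # the model (or model combination) this worker serves

    def infer(self, instance):
        return self.fn(instance)

class Master:
    def __init__(self):
        self.workers = {}  # servable name -> worker

    def register(self, worker):
        self.workers[worker.servable_name] = worker

    def dispatch(self, servable_name, instance):
        # Route the request to the worker that hosts the servable.
        return self.workers[servable_name].infer(instance)

# A client request for a hypothetical "add" servable, routed via the master.
master = Master()
master.register(Worker("add", lambda inst: inst["x"] + inst["y"]))
print(master.dispatch("add", {"x": 1, "y": 2}))  # 3
```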
+
-|Parameter|Attribute|Function|Parameter Type|Default Value|Value Range|
-|---|---|---|---|---|---|
-|`--help`|Optional|Displays the help information about the startup command. |-|-|-|
-|`--model_path=`|Mandatory|Path for storing the model to be loaded. |String|Null|-|
-|`--model_name=`|Mandatory|Name of the model file to be loaded. |String|Null|-|
-|`--=port `|Optional|Specifies the external Serving port number. |Integer|5500|1–65535|
-|`--device_id=`|Optional|Specifies device ID to be used.|Integer|0|0 to 7|
+MindSpore Serving provides the following functions:
- > Before running the startup command, add the path `/{your python path}/lib:/{your python path}/lib/python3.7/site-packages/mindspore/lib` to the environment variable `LD_LIBRARY_PATH`.
+- gRPC and RESTful APIs on clients
+- Pre-processing and post-processing of assembled models
+- Batching: multiple instance requests are split and combined to meet the `batch size` requirement of the model.
+- Simple Python APIs on clients
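The batching behavior listed above can be shown with a short, library-independent sketch. This is not MindSpore Serving code, just the splitting-and-combining idea: pending instance requests are grouped into batches sized to the model's `batch size` (server-side padding of the last partial batch is omitted for brevity).

```python
# Conceptual sketch (not MindSpore Serving code): group pending instance
# requests into model-sized batches.
from typing import Dict, List

def make_batches(instances: List[Dict], batch_size: int) -> List[List[Dict]]:
    """Split the pending instances into consecutive batches of at most
    `batch_size`; the final batch may be smaller."""
    return [instances[i:i + batch_size]
            for i in range(0, len(instances), batch_size)]

# Seven instance requests and a model batch size of 4 yield two batches.
pending = [{"x": i} for i in range(7)]
batches = make_batches(pending, batch_size=4)
print([len(b) for b in batches])  # [4, 3]
```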
-## Application Example
-The following uses a simple network as an example to describe how to use MindSpore Serving.
+## Installation
-### Exporting Model
-Use [add_model.py](https://gitee.com/mindspore/mindspore/blob/master/serving/example/export_model/add_model.py) to build a network with only the Add operator and export the MindSpore serving deployment model.
+MindSpore Serving depends on the MindSpore training and inference framework. Therefore, install [MindSpore](https://gitee.com/mindspore/mindspore/blob/master/README.md#installation) and then MindSpore Serving.
-```python
-python add_model.py
-```
-Execute the script to generate the `tensor_add.mindir` file. The input of the model is two one-dimensional tensors with shape [2,2], and the output is the sum of the two input tensors.
+### Installing Serving
-### Starting Serving Inference
-```bash
-ms_serving --model_path={model directory} --model_name=tensor_add.mindir
-```
-If the server prints the `MS Serving Listening on 0.0.0.0:5500` log, the Serving has loaded the serving model.
+Install Serving in either of the following ways:
-### Client Samples
-#### Python Client Sample
-Obtain [ms_client.py](https://gitee.com/mindspore/mindspore/blob/master/serving/example/python_client/ms_client.py) and start the Python client.
-```bash
-python ms_client.py
-```
+- Download the .whl package from the MindSpore Serving page and install it.
-If the following information is displayed, the Serving has correctly executed the serving of the Add network.
-```
-ms client received:
-[[2. 2.]
- [2. 2.]]
-```
+ ```shell
+ pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/1.1.0/Serving/ascend/ubuntu_x86/mindspore_serving-1.1.0-cp37-cp37m-linux_x86_64.whl
+ ```
-#### C++ Client Sample
-1. Obtain an executable client sample program.
+- Install Serving using the source code.
- Download the [MindSpore source code](https://gitee.com/mindspore/mindspore). You can use either of the following methods to compile and obtain the client sample program:
- + When MindSpore is compiled using the source code, the Serving C++ client sample program is generated. You can find the `ms_client` executable program in the `build/mindspore/serving/example/cpp_client` directory.
- + Independent compilation
+ Download the [source code](https://gitee.com/mindspore/serving) and go to the `serving` directory.
- Preinstall [gRPC](https://gRPC.io).
+ Method 1: Specify the path of the installed or built MindSpore package on which Serving depends and install Serving.
- Run the following command in the MindSpore source code path to compile a client sample program:
- ```bash
- cd mindspore/serving/example/cpp_client
- mkdir build && cd build
- cmake -D GRPC_PATH={grpc_install_dir} ..
- make
- ```
- In the preceding command, `{grpc_install_dir}` indicates the gRPC installation path. Replace it with the actual gRPC installation path.
+ ```shell
+ sh build.sh -p $MINDSPORE_LIB_PATH
+ ```
-2. Start the client.
+ In the preceding command, `build.sh` is the build script in the `serving` directory, and `$MINDSPORE_LIB_PATH` is the `lib` directory in the installation path of the MindSpore software package, for example, `softwarepath/mindspore/lib`. This directory contains the library files that MindSpore depends on.
- Execute `ms_client` to send an serving request to the Serving.
- ```bash
- ./ms_client --target=localhost:5500
- ```
- If the following information is displayed, the Serving has correctly executed the serving of the Add network.
- ```
- Compute [[1, 2], [3, 4]] + [[1, 2], [3, 4]]
- Add result is 2 4 6 8
- client received: RPC OK
+ Method 2: Directly build Serving. The MindSpore package is built together with Serving. You need to configure the [environment variables](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_source_en.md#configuring-environment-variables) for building MindSpore.
+
+ ```shell
+ # Ascend 310
+ sh build.sh -e d -V 310
+ # Ascend 910
+ sh build.sh -e ascend
```
-The client code consists of the following parts:
+ Take the x86 system as an example. After the build is complete, find the .whl installation package of MindSpore in the `serving/third_party/mindspore/build/package/` directory and install it.
-1. Implement the client based on MSService::Stub and create a client instance.
- ```
- class MSClient {
- public:
- explicit MSClient(std::shared_ptr channel) : stub_(MSService::NewStub(channel)) {}
- private:
- std::unique_ptr stub_;
- };MSClient client(grpc::CreateChannel(target_str, grpc::InsecureChannelCredentials()));
-
- MSClient client(grpc::CreateChannel(target_str, grpc::InsecureChannelCredentials()));
-
+ ```shell
+ pip install mindspore_ascend-1.1.0-cp37-cp37m-linux_x86_64.whl
```
-2. Build the request input parameter `Request`, output parameter `Reply`, and gRPC client `Context` based on the actual network input.
- ```
- PredictRequest request;
- PredictReply reply;
- ClientContext context;
-
- //construct tensor
- Tensor data;
-
- //set shape
- TensorShape shape;
- shape.add_dims(4);
- *data.mutable_tensor_shape() = shape;
-
- //set type
- data.set_tensor_type(ms_serving::MS_FLOAT32);
- std::vector input_data{1, 2, 3, 4};
-
- //set datas
- data.set_data(input_data.data(), input_data.size());
-
- //add tensor to request
- *request.add_data() = data;
- *request.add_data() = data;
- ```
-3. Call the gRPC API to communicate with the Serving that has been started, and obtain the return value.
+
+ Find the .whl installation package of Serving in the `serving/build/package/` directory and install it.
+
+ ```shell
+ pip install mindspore_serving-1.1.0-cp37-cp37m-linux_x86_64.whl
```
- Status status = stub_->Predict(&context, request, &reply);
+
+Run the following commands to verify the installation. If the Python modules can be imported without error, the installation is successful.
+
+```python
+from mindspore_serving import master
+from mindspore_serving import worker
+```
+
+### Configuring Environment Variables
+
+To run MindSpore Serving, configure the following environment variables:
+
+- MindSpore Serving depends on MindSpore. You need to configure [environment variables](https://gitee.com/mindspore/docs/blob/master/install/mindspore_ascend_install_source_en.md#configuring-environment-variables) to run MindSpore.
+
+- MindSpore Serving depends on the MindSpore library files. You need to add the `lib` directory in the build or installation path of the MindSpore software package to `LD_LIBRARY_PATH`. For the meaning of `$MINDSPORE_LIB_PATH`, see the [Installation](https://gitee.com/mindspore/serving/blob/master/README.md#installation) section.
+
+ ```shell
+ export LD_LIBRARY_PATH=$MINDSPORE_LIB_PATH:${LD_LIBRARY_PATH}
```
-For details about the complete code, see [ms_client](https://gitee.com/mindspore/mindspore/blob/master/serving/example/cpp_client/ms_client.cc).
+## Quick Start
+
+See [MindSpore-based Inference Service Deployment](https://www.mindspore.cn/tutorial/inference/en/master/serving_example.html) for an example that demonstrates how to use MindSpore Serving.
+
+## Documents
+
+### Developer Guide
+
+- [gRPC-based MindSpore Serving Access](https://www.mindspore.cn/tutorial/inference/en/master/serving_grpc.html)
+- [RESTful-based MindSpore Serving Access](https://www.mindspore.cn/tutorial/inference/en/master/serving_restful.html)
+- [Servable Provided Through Model Configuration](https://www.mindspore.cn/tutorial/inference/en/master/serving_model.html)
+
+For more details about the installation guide, tutorials, and APIs, see [MindSpore Python API](https://www.mindspore.cn/doc/api_python/en/master/index.html).
+
+## Community
+
+### Governance
+
+[MindSpore Open Governance](https://gitee.com/mindspore/community/blob/master/governance.md)
+
+### Communication
+
+- [MindSpore Slack](https://join.slack.com/t/mindspore/shared_invite/zt-dgk65rli-3ex4xvS4wHX7UDmsQmfu8w) developer communication platform
+
+## Contributions
+
+Contributions to MindSpore Serving are welcome.
+
+## Release Notes
+
+[RELEASE](RELEASE.md)
+
+## License
+
+[Apache License 2.0](LICENSE)
\ No newline at end of file
diff --git a/docs/architecture.png b/docs/architecture.png
index c6ac6dd..6cc6883 100644
Binary files a/docs/architecture.png and b/docs/architecture.png differ
diff --git a/mindspore_serving/worker/check_type.py b/mindspore_serving/worker/check_type.py
index 261a701..097fb38 100644
--- a/mindspore_serving/worker/check_type.py
+++ b/mindspore_serving/worker/check_type.py
@@ -87,8 +87,6 @@ def check_and_as_int_tuple_list(arg_name, ints, mininum=None, maximum=None):
for item in ints:
if not isinstance(item, int):
raise RuntimeError(f"The item of parameter '{arg_name}' should be int, but actually {type(item)}")
- if not item:
- raise RuntimeError(f"The item of parameter '{arg_name}' should not be empty int")
if item in int_list:
raise RuntimeError(f"The item name '{item}' in parameter '{arg_name}' should not be repeated")
check_int(arg_name, item, mininum, maximum)