
!66 Update Serving API

From: @xu-yfei
Reviewed-by: @linqingke, @zhoufeng54
Signed-off-by: @linqingke
tags/v1.1.0
Committed by mindspore-ci-bot 5 years ago
commit 80c1fee260
6 changed files with 49 additions and 52 deletions
  1. +4
    -4
      README_CN.md
  2. +1
    -1
      mindspore_serving/ccsrc/common/servable.h
  3. +13
    -16
      mindspore_serving/client/python/client.py
  4. +8
    -8
      mindspore_serving/master/_master.py
  5. +22
    -22
      mindspore_serving/worker/_worker.py
  6. +1
    -1
      mindspore_serving/worker/register/method.py

README_CN.md  (+4, −4)

@@ -105,15 +105,15 @@ Running MindSpore Serving requires the following environment variables to be configured:

 ## Quick Start

-Use a simple [Add network example](https://www.mindspore.cn/tutorial/inference/zh-CN/serving_example.html) to demonstrate how to use MindSpore Serving.
+Use a simple [Add network example](https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_example.html) to demonstrate how to use MindSpore Serving.

 ## Documentation

 ### Developer Tutorials

-- [Access MindSpore Serving services via the gRPC API](https://www.mindspore.cn/tutorial/inference/zh-CN/serving_grpc.html)
-- [Access MindSpore Serving services via the RESTful API](https://www.mindspore.cn/tutorial/inference/zh-CN/serving_restful.html)
-- [Provide Servables through model configuration](https://www.mindspore.cn/tutorial/inference/zh-CN/serving_model.html)
+- [Access MindSpore Serving services via the gRPC API](https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_grpc.html)
+- [Access MindSpore Serving services via the RESTful API](https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_restful.html)
+- [Provide Servables through model configuration](https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_model.html)

 For more details on the installation guide, tutorials, and APIs, see the [User Documentation](https://www.mindspore.cn/doc/api_python/zh-CN/master/index.html).



mindspore_serving/ccsrc/common/servable.h  (+1, −1)

@@ -83,7 +83,7 @@ struct RequestSpec {
 struct MS_API ServableMeta {
   std::string servable_name;
   std::string servable_file;   // file name
-  ModelType model_format;      // OM MindIR
+  ModelType model_format;      // OM, MindIR
   bool with_batch_dim = true;  // whether there is batch dim in model's inputs/outputs
   size_t inputs_count = 0;
   size_t outputs_count = 0;


mindspore_serving/client/python/client.py  (+13, −16)

@@ -143,19 +143,17 @@ def _check_int(arg_name, int_val, mininum=None, maximum=None):
 class Client:
     """
     The Client encapsulates the serving gRPC API, which can be used to create requests,
-        access serving, and parse results.
+    access serving, and parse results.

     Args:
-        ip(str): Serving ip.
-        port(int): Serving port.
-        servable_name(str): The name of servable supplied by Serving.
-        method_name(str): The name of method supplied by servable.
-        version_number(int): The version number of servable, default 0,
-            0 meaning the maximum version number in all running versions.
-        max_msg_mb_size(int): The maximum acceptable gRPC message size in megabytes(MB), default 512,
-            value range [1, 512].
+        ip (str): Serving ip.
+        port (int): Serving port.
+        servable_name (str): The name of servable supplied by Serving.
+        method_name (str): The name of method supplied by servable.
+        version_number (int): The version number of servable, default 0,
+            which means the maximum version number in all running versions.
     Raises:
-        RuntimeError: The type or value of the parameters is invalid, or other error happened.
+        RuntimeError: The type or value of the parameters is invalid, or other errors happened.

     Examples:
         >>> from mindspore_serving.client import Client
@@ -169,13 +167,12 @@ class Client:
         >>> print(result)
     """

-    def __init__(self, ip, port, servable_name, method_name, version_number=0, max_msg_mb_size=512):
+    def __init__(self, ip, port, servable_name, method_name, version_number=0):
         _check_str("ip", ip)
-        _check_int("port", port, 0, 65535)
+        _check_int("port", port, 1, 65535)
         _check_str("servable_name", servable_name)
         _check_str("method_name", method_name)
         _check_int("version_number", version_number, 0)
-        _check_int("max_msg_mb_size", max_msg_mb_size, 1, 512)

         self.ip = ip
         self.port = port
@@ -184,7 +181,7 @@ class Client:
         self.version_number = version_number

         channel_str = str(ip) + ":" + str(port)
-        msg_bytes_size = max_msg_mb_size * 1024 * 1024
+        msg_bytes_size = 512 * 1024 * 1024  # 512MB
         channel = grpc.insecure_channel(channel_str,
                                         options=[
                                             ('grpc.max_send_message_length', msg_bytes_size),
@@ -197,11 +194,11 @@ class Client:
         Used to create requests, access serving, and parse results.

         Args:
-            instances(map, tuple of map): Instance or tuple of instance, every instance item is the inputs map.
+            instances (map, tuple of map): Instance or tuple of instances, every instance item is the inputs map.
                 The map key is the input name, and the value is the input value.

         Raises:
-            RuntimeError: The type or value of the parameters is invalid, or other error happened.
+            RuntimeError: The type or value of the parameters is invalid, or other errors happened.
         """
         if not isinstance(instances, (tuple, list)):
             instances = (instances,)


mindspore_serving/master/_master.py  (+8, −8)

@@ -72,12 +72,12 @@ def start_grpc_server(ip="0.0.0.0", grpc_port=5500, max_msg_mb_size=100):

     Args:
         ip (str): gRPC server ip.
-        grpc_port (int): gRPC port ip, default 5500,ip port range [0, 65535].
+        grpc_port (int): gRPC port ip, default 5500, ip port range [1, 65535].
         max_msg_mb_size (int): The maximum acceptable gRPC message size in megabytes(MB), default 100,
             value range [1, 512].

     Raises:
-        RuntimeError: Start gRPC server failed.
+        RuntimeError: Fail to start the gRPC server.

     Examples:
         >>> from mindspore_serving import master
@@ -96,17 +96,17 @@ def start_grpc_server(ip="0.0.0.0", grpc_port=5500, max_msg_mb_size=100):
 @stop_on_except
 def start_master_server(ip="127.0.0.1", master_port=6100):
     r"""
-    Start gRPC server for the commication between workers and the master.
+    Start the gRPC server for the communication between workers and the master.

     Note:
         The ip is expected to be accessed only by workers, not clients.

     Args:
-        ip (str): gRPC ip for worker to commnicate with, default '127.0.0.1'.
-        master_port (int): gRPC port ip, default 6100,ip port range [0, 65535].
+        ip (str): gRPC ip for workers to communicate with, default '127.0.0.1'.
+        master_port (int): gRPC port ip, default 6100, ip port range [1, 65535].

     Raises:
-        RuntimeError: Start gRPC server failed.
+        RuntimeError: Fail to start the master server.

     Examples:
         >>> from mindspore_serving import master
@@ -129,12 +129,12 @@ def start_restful_server(ip="0.0.0.0", restful_port=5900, max_msg_mb_size=100):

     Args:
         ip (str): RESTful server ip.
-        restful_port (int): gRPC port ip, default 5900,ip port range [0, 65535].
+        restful_port (int): gRPC port ip, default 5900, ip port range [1, 65535].
         max_msg_mb_size (int): The maximum acceptable RESTful message size in megabytes(MB), default 100,
             value range [1, 512].

     Raises:
-        RuntimeError: Start RESTful server failed.
+        RuntimeError: Fail to start the RESTful server.

     Examples:
         >>> from mindspore_serving import master
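The server entry points above cap message sizes in MB (value range [1, 512]) and convert to bytes, as the client side does with `512 * 1024 * 1024`. A small sketch of that conversion and of the channel options built from it; `msg_mb_to_bytes`/`channel_options` are hypothetical helper names, while the two option keys are the standard gRPC channel arguments:

```python
def msg_mb_to_bytes(max_msg_mb_size):
    """Validate the MB limit (range [1, 512]) and convert it to bytes."""
    if not isinstance(max_msg_mb_size, int) or not 1 <= max_msg_mb_size <= 512:
        raise RuntimeError("max_msg_mb_size should be an int in range [1, 512]")
    return max_msg_mb_size * 1024 * 1024

def channel_options(max_msg_mb_size=512):
    """Build the gRPC channel options that raise the default 4MB message cap."""
    size = msg_mb_to_bytes(max_msg_mb_size)
    return [('grpc.max_send_message_length', size),
            ('grpc.max_receive_message_length', size)]
```

With the default of 512 MB this yields 536870912 bytes on both the send and receive limits, matching the hardcoded value the client now uses.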


mindspore_serving/worker/_worker.py  (+22, −22)

@@ -98,22 +98,22 @@ def start_servable(servable_directory, servable_name, version_number=0,
     Serving has two running modes. One is running in a single process, providing the Serving service of a single model.
     The other includes a master and multiple workers. This interface is for the second scenario.

-    The master is responsible for providing the Serving access interface for client,
-    while the worker is responsible for providing the inference service of the specific model. The master
-    and worker communicate through gPRC defined as (master_ip, master_port) and (worker_ip, worker_port).
+    The master is responsible for providing the Serving access interface for clients,
+    while the worker is responsible for providing the inference service of the specific model. The communications
+    between the master and workers through gPRC are defined as (master_ip, master_port) and (worker_ip, worker_port).

     Args:
-        servable_directory (str): The directory where the servable located in, there expected to has a directory named
-            `servable_name`. For more detail:
-            `How to config Servable <https://gitee.com/mindspore/serving/blob/master/docs/MODEL.md>`_
+        servable_directory (str): The directory where the servable is located in. There expects to has a directory
+            named `servable_name`. For more detail:
+            `How to config Servable <https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_model.html>`_ .

         servable_name (str): The servable name.
-        version_number (int): Servable version number to be loaded. Version number should be a positive integer,
-            started from 1, and 0 means to load the latest version. Default: 0.
-        device_type (str): Current only support "Ascend", "Davinci" and None, Default: None.
-            "Ascend" which means device type can be Ascend910 or Ascend310, etc.
-            "Davinci" has the same meanings of "Ascend".
-            None: The device type is determined by the mindspore environment.
+        version_number (int): Servable version number to be loaded. The version number should be a positive integer,
+            starting from 1, and 0 means to load the latest version. Default: 0.
+        device_type (str): Currently only supports "Ascend", "Davinci" and None, Default: None.
+            "Ascend" means the device type can be Ascend910 or Ascend310, etc.
+            "Davinci" has the same meaning as "Ascend".
+            None means the device type is determined by the MindSpore environment.
         device_id (int): The id of the device the model loads into and runs in.
         master_ip (str): The master ip the worker linked to.
         master_port (int): The master port the worker linked to.
@@ -161,24 +161,24 @@ def start_servable(servable_directory, servable_name, version_number=0,
 def start_servable_in_master(servable_directory, servable_name, version_number=0, device_type=None,
                              device_id=0):
     r"""
-    Start up the servable named 'servable_name' defined in 'svable_directory', and the worker will running in
+    Start up the servable named 'servable_name' defined in 'svable_directory', and the worker will run in
     the process of the master.

     Serving has two running modes. One is running in a single process, providing the Serving service of a single model.
     The other includes a master and multiple workers. This interface is for the first scenario.

     Args:
-        servable_directory (str): The directory where the servable located in, there expected to has a directory named
-            `servable_name`. For more detail:
-            `How to config Servable <https://gitee.com/mindspore/serving/blob/master/docs/MODEL.md>`_
+        servable_directory (str): The directory where the servable is located in. There expects to has a directory named
+            `servable_name`. For more detail:
+            `How to config Servable <https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_model.html>`_ .

         servable_name (str): The servable name.
-        version_number (int): Servable version number to be loaded. Version number should be a positive integer,
-            started from 1, and 0 means to load the latest version. Default: 0.
-        device_type (str): Current only support "Ascend", "Davinci" and None, Default: None.
-            "Ascend" which means device type can be Ascend910 or Ascend310, etc.
-            "Davinci" has the same meanings of "Ascend".
-            None: The device type is determined by the mindspore environment.
+        version_number (int): Servable version number to be loaded. The version number should be a positive integer,
+            starting from 1, and 0 means to load the latest version. Default: 0.
+        device_type (str): Currently only supports "Ascend", "Davinci" and None, Default: None.
+            "Ascend" means the device type can be Ascend910 or Ascend310, etc.
+            "Davinci" has the same meaning as "Ascend".
+            None means the device type is determined by the MindSpore environment.

     Examples:
         >>> import os
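The reworded docstrings pin down the `version_number` rule: a positive integer starting from 1 selects that version, and 0 loads the latest. A hypothetical resolver illustrating that rule (not part of the Serving API; `resolve_version` and its arguments are this sketch's own names):

```python
def resolve_version(version_number, available_versions):
    """Pick the version to load from the available version numbers.

    version_number == 0 means the latest version; a positive integer
    selects that exact version and fails if it is not available.
    """
    if not available_versions:
        raise RuntimeError("no servable versions are available")
    if version_number < 0:
        raise RuntimeError("version_number should be >= 0")
    if version_number == 0:
        return max(available_versions)  # 0 -> load the latest version
    if version_number not in available_versions:
        raise RuntimeError(f"version {version_number} is not available")
    return version_number

# With versions 1..3 on disk, 0 resolves to 3 and 2 resolves to itself.
resolve_version(0, [1, 2, 3])  # -> 3
resolve_version(2, [1, 2, 3])  # -> 2
```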


mindspore_serving/worker/register/method.py  (+1, −1)

@@ -368,7 +368,7 @@ def register_method(output_names):
     Preprocess and postprocess are optional.

     Args:
-        output_names(str, tuple or list of str): The output names of method. The input names is
+        output_names (str, tuple or list of str): The output names of method. The input names is
             the args names of the registered function.

     Raises:
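As the docstring states, `output_names` accepts a single str, a tuple, or a list of str. A hedged sketch of the normalization a `register_method`-style decorator might perform before registering the outputs (`normalize_output_names` is a hypothetical helper, not the library's code):

```python
def normalize_output_names(output_names):
    """Return output_names as a tuple of non-empty strings."""
    if isinstance(output_names, str):
        output_names = (output_names,)  # a lone name becomes a 1-tuple
    if not isinstance(output_names, (tuple, list)):
        raise RuntimeError("output_names should be a str, tuple or list of str")
    for name in output_names:
        if not isinstance(name, str) or not name:
            raise RuntimeError("every output name should be a non-empty str")
    return tuple(output_names)

normalize_output_names("y")           # -> ("y",)
normalize_output_names(["y1", "y2"])  # -> ("y1", "y2")
```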

