# MindSpore Serving 1.2.0

## MindSpore Serving 1.2.0 Release Notes

### Major Features and Improvements

- [STABLE] Support distributed inference, which works with distributed training to export distributed models for super-large-scale neural network parameters (Ascend 910).
- [STABLE] Support the GPU platform; Serving worker nodes can be deployed on Nvidia GPU, Ascend 310, and Ascend 910.
- This release is based on MindSpore version 1.2.0.
- Support Python 3.8 and 3.9.
### API Change

#### API Incompatible Change

##### Python API

Support deployment of distributed models; refer to the [distributed inference tutorial](https://www.mindspore.cn/tutorial/inference/en/r1.2/serving_distributed_example.html) for the related API.
#### Deprecations

##### Python API

### Bug Fixes

## Contributors

Thanks go to these wonderful people:

chenweifeng, qinzheng, xujincai, xuyongfei, zhangyinxia, zhoufeng.

Contributions of any kind are welcome!
## MindSpore Serving 1.1.1 Release Notes

### Major Features and Improvements

- Adapt to the new C++ inference interface of MindSpore version 1.1.1.

### Bug Fixes

- [BUGFIX] Fix a bug in transforming int16-type results in the Python client.
- [BUGFIX] Fix bytes-type data being misidentified as str after Python preprocessing and postprocessing.
- [BUGFIX] Fix a bug where C++ tensor data was sometimes released while still wrapped as a numpy object.
- [BUGFIX] Downgrade the RuntimeError to a warning log when the Ascend environment check fails.
## MindSpore Serving 1.1.0 Release Notes

### Major Features and Improvements

- [STABLE] Support gRPC and RESTful API.
- [STABLE] Support simple Python APIs for Client and Server.
- [STABLE] Support model configuration; users can customize preprocessing and postprocessing for a model.
- [STABLE] Support multiple models; multiple models can run simultaneously.
- [STABLE] Support model batching; multiple instances are split and combined to meet the batch size requirements of the model.
- This release is based on MindSpore version 1.1.0.
### Bug Fixes

### Contributors
