
api.py 34 kB

[to #43040150] fix: login warning refactor, error message on exception, UT case moved to a new user
Refactored the not-logged-in warning: the user is prompted to log in only when a request fails with an exception while the cookie is empty; the unit tests were changed to use a dedicated user.
Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9283579

* [to #42322933] add space dialog-state tracking pipeline
  Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9227018
* init
* token to ids
* add model
* model forward ready
* add intent
* intent preprocessor ready
* intent success
* merge master
* test with model hub
* add flake8
* update
* update
* update
* Merge branch 'master' into nlp/space/gen
* delete file about gen
* init
* fix flake8 bug
* [to #42322933] init
* bug fix
* [to #42322933] init
* update pipeline registry info
* Merge remote-tracking branch 'origin/master' into feat/nli
* [to #42322933] init
* [to #42322933] init
* modify forward
* [to #42322933] init
* generation ready
* init
* Merge branch 'master' into feat/zero_shot_classification
  Conflicts: modelscope/preprocessors/__init__.py
* [to #42322933] bugfix
* [to #42322933] pre-commit fix
* fill mask
* register multiple models on model and pipeline
* add tests
* test level >= 0
* local gen ready
* merge with master
* dialog modeling ready
* fix comments: rename and refactor AliceMindMLM; adjust pipeline
* space intent and modeling (generation) are ready
* bug fix
* add dep
* add dep
* support dst data processor
* merge with nlp/space/dst
* merge with master
* Merge remote-tracking branch 'origin' into feat/fill_mask
  Conflicts: modelscope/models/nlp/__init__.py, modelscope/pipelines/builder.py, modelscope/pipelines/outputs.py, modelscope/preprocessors/nlp.py, requirements/nlp.txt
* merge with master
* merge with master 2/2
* fix comments
* fix isort for pre-commit check
* allow params to pass to pipeline's __call__ method
* Merge remote-tracking branch 'origin/master' into feat/zero_shot_classification
* merge with nli task
* merge with sentiment_classification
* merge with zero_shot_classification
* merge with fill_mask
* merge with space
* merge with master head
* Merge remote-tracking branch 'origin' into feat/fill_mask
  Conflicts: modelscope/utils/constant.py
* fix: pipeline module_name from model_type to 'fill_mask' & fix merge bug
* unfinished change
* fix bug
* unfinished
* unfinished
* revise modelhub dependency
* Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor
* add eval() to pipeline call
* add test level
* ut run passed
* add default args
* tmp
* merge master
* all ut passed
* remove a useless enum
* revert a mis-modification
* revert a mis-modification
* Merge commit 'ace8af92465f7d772f035aebe98967726655f12c' into feat/nlp:
  [to #42322933] Add cv-action-recognition-pipeline to maas lib
  [to #42463204] support PIL.Image for image_captioning_pipeline
  [to #42670107] restore pydataset test
  [to #42322933] add create if not exist and add (back) create model example
  Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661
  [to #41474818] fix: fix errors in task name definition
  Conflicts: modelscope/pipelines/builder.py, modelscope/utils/constant.py
* Merge branch 'feat/nlp' into feat/nlp_refactor:
  [to #42322933] Add cv-action-recognition-pipeline to maas lib
  [to #42463204] support PIL.Image for image_captioning_pipeline
  [to #42670107] restore pydataset test
  [to #42322933] add create if not exist and add (back) create model example
  Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661
  [to #41474818] fix: fix errors in task name definition
  Conflicts: modelscope/pipelines/builder.py
* fix compile bug
* refactor space
* Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor
* Merge remote-tracking branch 'origin' into feat/fill_mask
* fix
* pre-commit lint
* lint file
* lint file
* lint file
* update modelhub dependency
* lint file
* ignore dst_processor temporarily
* solve comment: 1. change MaskedLMModelBase to MaskedLanguageModelBase 2. remove a useless import
* recommit
* remove MaskedLanguageModel from __all__
* Merge commit '1a0d4af55a2eee69d89633874890f50eda8f8700' into feat/nlp_refactor:
  [to #42322933] test level check
  Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9143809
  [to #42322933] update nlp models name in metainfo
  Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9134657
  Conflicts: modelscope/metainfo.py
* update
* revert pipeline params update
* remove zeroshot
* update sequence classification outputs
* merge with fill mask
* Merge remote-tracking branch 'origin' into feat/fill_mask
* fix
* init dialog state tracking
* fix flake8 warning of dst
* Merge remote-tracking branch 'origin/feat/fill_mask' into feat/nlp
* merge with master
* remove useless test.py
* add init
* merge nlp
* Merge remote-tracking branch 'origin/master' into feat/nlp
* remove unformatted space trainer
* Merge branch 'feat/nlp' into nlp/space/dst
* revise based on comments, except Chinese comments
* skip ci blocking
* change Chinese notes of space3.0 into English
* translate Chinese comments to English
* add space to metainfo
* space dst pipeline is ready, but model's result is wrong
* merge feat/nlp
* merge with master
* change processor
* change example
* test case ready
* dst local ready
* update dst conf
* merge feat/nlp
* inform revise
* inherit bug fix
* init
* add 2 complete examples
* fix bug
* add test case
* merge with master
* modify model name
* add missing setting
* add outputs
* modify test level
* modify Chinese comments
* remove useless doc
* merge feat/nlp
* Merge remote-tracking branch 'origin' into nlp/space/dst
* Merge branch 'feat/nlp' into nlp/space/dst
* dst test ready
* merge feat nlp
* space outputs normalization
* update dst
* merge feat nlp
* Merge branch 'master' into nlp/space/dst
* update requirement
* merge with master
* Merge remote-tracking branch 'origin/master' into nlp/space/dst
* formatting output
* update requirements/nlp
* merge with master
* add test cases
* Merge remote-tracking branch 'origin/master' into nlp/space/dst
* merge with master
* login warning refactor, error message on exception, UT case moved to a new user
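The login-warning behavior described in the commit subject can be sketched as follows. This is a minimal illustration, not the actual MaaS-lib API: the function and parameter names (`call_with_login_hint`, `do_request`, `cookies`) are hypothetical.

```python
import logging

logger = logging.getLogger(__name__)


def call_with_login_hint(do_request, cookies=None):
    """Hypothetical sketch of the refactored warning behavior:
    prompt the user to log in only when a request actually fails
    with an exception and no login cookie is present."""
    try:
        return do_request()
    except Exception:
        if not cookies:
            # Only now is a login suggested; a successful anonymous
            # request no longer triggers the warning.
            logger.warning(
                'Request failed and no login cookie was found; '
                'please login first.')
        raise
```

The point of the refactor, as the commit message explains, is that the warning fires only on the combination of a failed request and an empty cookie, rather than on every unauthenticated call.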
3 years ago
[to #43040150] fix: login warn refactor, error message when exception, ut case to new user 未登录warn提示信息重构,只有遇到请求失败,exception时,如果cookie为空,提示用户login,单元测试用户修改成单独 Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9283579 * [to #42322933] add space dialog-state tracking pipeline Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9227018 * init * token to ids * add model * model forward ready * add intent * intent preprocessor ready * intent success * merge master * test with model hub * add flake8 * update * update * update * Merge branch 'master' into nlp/space/gen * delete file about gen * init * fix flake8 bug * [to #42322933] init * bug fix * [to #42322933] init * update pipeline registry info * Merge remote-tracking branch 'origin/master' into feat/nli * [to #42322933] init * [to #42322933] init * modify forward * [to #42322933] init * generation ready * init * Merge branch 'master' into feat/zero_shot_classification # Conflicts: # modelscope/preprocessors/__init__.py * [to #42322933] bugfix * [to #42322933] pre commit fix * fill mask * registry multi models on model and pipeline * add tests * test level >= 0 * local gen ready * merge with master * dialog modeling ready * fix comments: rename and refactor AliceMindMLM; adjust pipeline * space intent and modeling(generation) are ready * bug fix * add dep * add dep * support dst data processor * merge with nlp/space/dst * merge with master * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/models/nlp/__init__.py modelscope/pipelines/builder.py modelscope/pipelines/outputs.py modelscope/preprocessors/nlp.py requirements/nlp.txt * merge with master * merge with master 2/2 * fix comments * fix isort for pre-commit check * allow params pass to pipeline's __call__ method * Merge remote-tracking branch 'origin/master' into feat/zero_shot_classification * merge with nli task * merge with sentiment_classification * merge with zero_shot_classfication * merge with 
fill_mask * merge with space * merge with master head * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/utils/constant.py * fix: pipeline module_name from model_type to 'fill_mask' & fix merge bug * unfiinished change * fix bug * unfinished * unfinished * revise modelhub dependency * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * add eval() to pipeline call * add test level * ut run passed * add default args * tmp * merge master * all ut passed * remove an useless enum * revert a mis modification * revert a mis modification * Merge commit 'ace8af92465f7d772f035aebe98967726655f12c' into feat/nlp * commit 'ace8af92465f7d772f035aebe98967726655f12c': [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py # modelscope/utils/constant.py * Merge branch 'feat/nlp' into feat/nlp_refactor * feat/nlp: [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py * fix compile bug * refactor space * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * pre-commit lint * lint file * lint file * lint file * update modelhub dependency * lint file * ignore dst_processor 
temporary * solve comment: 1. change MaskedLMModelBase to MaskedLanguageModelBase 2. remove a useless import * recommit * remove MaskedLanguageModel from __all__ * Merge commit '1a0d4af55a2eee69d89633874890f50eda8f8700' into feat/nlp_refactor * commit '1a0d4af55a2eee69d89633874890f50eda8f8700': [to #42322933] test level check Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9143809 [to #42322933] update nlp models name in metainfo Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9134657 # Conflicts: # modelscope/metainfo.py * update * revert pipeline params update * remove zeroshot * update sequence classfication outpus * merge with fill mask * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * init dialog state tracking * fix flake8 warning of dst * Merge remote-tracking branch 'origin/feat/fill_mask' into feat/nlp * merge with master * remove useless test.py * add init * merge nlp * Merge remote-tracking branch 'origin/master' into feat/nlp * remove unformatted space trainer * Merge branch 'feat/nlp' into nlp/space/dst * revise based on comment except chinease comment * skip ci blocking * change Chinese notes of space3.0 into English * translate chinese comment to english * add space to metainfo * space dst pipeline is ready, but model's result is wrong * merge feat/nlp * merge with master * change processor * change example * test case ready * dst loacl ready * update dst conf * merge feat/nlp * inform revise * inherit bug fix * init * add 2 complete examples * fix bug * add test case * merge with master * modify model name * add missing setting * add outputs * modify test level * modify chinese comment * remove useless doc * merge feat/nlp * Merge remote-tracking branch 'origin' into nlp/space/dst * Merge branch 'feat/nlp' into nlp/space/dst * dst test ready * merge feat nlp * space outputs normalization * update dst * merge feat nlp * Merge branch 'master' into nlp/space/dst * update requirement * merge with master 
* Merge remote-tracking branch 'origin/master' into nlp/space/dst * formating output * update requirements/nlp * merge with master * add test cases * Merge remote-tracking branch 'origin/master' into nlp/space/dst * merge with master * login warn refactor, error message when exception, ut case to new user * login warn refactor, error message when exception, ut case to new user
3 years ago
[to #43040150] fix: login warn refactor, error message when exception, ut case to new user 未登录warn提示信息重构,只有遇到请求失败,exception时,如果cookie为空,提示用户login,单元测试用户修改成单独 Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9283579 * [to #42322933] add space dialog-state tracking pipeline Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9227018 * init * token to ids * add model * model forward ready * add intent * intent preprocessor ready * intent success * merge master * test with model hub * add flake8 * update * update * update * Merge branch 'master' into nlp/space/gen * delete file about gen * init * fix flake8 bug * [to #42322933] init * bug fix * [to #42322933] init * update pipeline registry info * Merge remote-tracking branch 'origin/master' into feat/nli * [to #42322933] init * [to #42322933] init * modify forward * [to #42322933] init * generation ready * init * Merge branch 'master' into feat/zero_shot_classification # Conflicts: # modelscope/preprocessors/__init__.py * [to #42322933] bugfix * [to #42322933] pre commit fix * fill mask * registry multi models on model and pipeline * add tests * test level >= 0 * local gen ready * merge with master * dialog modeling ready * fix comments: rename and refactor AliceMindMLM; adjust pipeline * space intent and modeling(generation) are ready * bug fix * add dep * add dep * support dst data processor * merge with nlp/space/dst * merge with master * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/models/nlp/__init__.py modelscope/pipelines/builder.py modelscope/pipelines/outputs.py modelscope/preprocessors/nlp.py requirements/nlp.txt * merge with master * merge with master 2/2 * fix comments * fix isort for pre-commit check * allow params pass to pipeline's __call__ method * Merge remote-tracking branch 'origin/master' into feat/zero_shot_classification * merge with nli task * merge with sentiment_classification * merge with zero_shot_classfication * merge with 
fill_mask * merge with space * merge with master head * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/utils/constant.py * fix: pipeline module_name from model_type to 'fill_mask' & fix merge bug * unfiinished change * fix bug * unfinished * unfinished * revise modelhub dependency * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * add eval() to pipeline call * add test level * ut run passed * add default args * tmp * merge master * all ut passed * remove an useless enum * revert a mis modification * revert a mis modification * Merge commit 'ace8af92465f7d772f035aebe98967726655f12c' into feat/nlp * commit 'ace8af92465f7d772f035aebe98967726655f12c': [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py # modelscope/utils/constant.py * Merge branch 'feat/nlp' into feat/nlp_refactor * feat/nlp: [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py * fix compile bug * refactor space * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * pre-commit lint * lint file * lint file * lint file * update modelhub dependency * lint file * ignore dst_processor 
temporary * solve comment: 1. change MaskedLMModelBase to MaskedLanguageModelBase 2. remove a useless import * recommit * remove MaskedLanguageModel from __all__ * Merge commit '1a0d4af55a2eee69d89633874890f50eda8f8700' into feat/nlp_refactor * commit '1a0d4af55a2eee69d89633874890f50eda8f8700': [to #42322933] test level check Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9143809 [to #42322933] update nlp models name in metainfo Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9134657 # Conflicts: # modelscope/metainfo.py * update * revert pipeline params update * remove zeroshot * update sequence classfication outpus * merge with fill mask * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * init dialog state tracking * fix flake8 warning of dst * Merge remote-tracking branch 'origin/feat/fill_mask' into feat/nlp * merge with master * remove useless test.py * add init * merge nlp * Merge remote-tracking branch 'origin/master' into feat/nlp * remove unformatted space trainer * Merge branch 'feat/nlp' into nlp/space/dst * revise based on comment except chinease comment * skip ci blocking * change Chinese notes of space3.0 into English * translate chinese comment to english * add space to metainfo * space dst pipeline is ready, but model's result is wrong * merge feat/nlp * merge with master * change processor * change example * test case ready * dst loacl ready * update dst conf * merge feat/nlp * inform revise * inherit bug fix * init * add 2 complete examples * fix bug * add test case * merge with master * modify model name * add missing setting * add outputs * modify test level * modify chinese comment * remove useless doc * merge feat/nlp * Merge remote-tracking branch 'origin' into nlp/space/dst * Merge branch 'feat/nlp' into nlp/space/dst * dst test ready * merge feat nlp * space outputs normalization * update dst * merge feat nlp * Merge branch 'master' into nlp/space/dst * update requirement * merge with master 
* Merge remote-tracking branch 'origin/master' into nlp/space/dst * formating output * update requirements/nlp * merge with master * add test cases * Merge remote-tracking branch 'origin/master' into nlp/space/dst * merge with master * login warn refactor, error message when exception, ut case to new user * login warn refactor, error message when exception, ut case to new user
3 years ago
[to #43040150] fix: login warn refactor, error message when exception, ut case to new user 未登录warn提示信息重构,只有遇到请求失败,exception时,如果cookie为空,提示用户login,单元测试用户修改成单独 Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9283579 * [to #42322933] add space dialog-state tracking pipeline Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9227018 * init * token to ids * add model * model forward ready * add intent * intent preprocessor ready * intent success * merge master * test with model hub * add flake8 * update * update * update * Merge branch 'master' into nlp/space/gen * delete file about gen * init * fix flake8 bug * [to #42322933] init * bug fix * [to #42322933] init * update pipeline registry info * Merge remote-tracking branch 'origin/master' into feat/nli * [to #42322933] init * [to #42322933] init * modify forward * [to #42322933] init * generation ready * init * Merge branch 'master' into feat/zero_shot_classification # Conflicts: # modelscope/preprocessors/__init__.py * [to #42322933] bugfix * [to #42322933] pre commit fix * fill mask * registry multi models on model and pipeline * add tests * test level >= 0 * local gen ready * merge with master * dialog modeling ready * fix comments: rename and refactor AliceMindMLM; adjust pipeline * space intent and modeling(generation) are ready * bug fix * add dep * add dep * support dst data processor * merge with nlp/space/dst * merge with master * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/models/nlp/__init__.py modelscope/pipelines/builder.py modelscope/pipelines/outputs.py modelscope/preprocessors/nlp.py requirements/nlp.txt * merge with master * merge with master 2/2 * fix comments * fix isort for pre-commit check * allow params pass to pipeline's __call__ method * Merge remote-tracking branch 'origin/master' into feat/zero_shot_classification * merge with nli task * merge with sentiment_classification * merge with zero_shot_classfication * merge with 
fill_mask * merge with space * merge with master head * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/utils/constant.py * fix: pipeline module_name from model_type to 'fill_mask' & fix merge bug * unfiinished change * fix bug * unfinished * unfinished * revise modelhub dependency * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * add eval() to pipeline call * add test level * ut run passed * add default args * tmp * merge master * all ut passed * remove an useless enum * revert a mis modification * revert a mis modification * Merge commit 'ace8af92465f7d772f035aebe98967726655f12c' into feat/nlp * commit 'ace8af92465f7d772f035aebe98967726655f12c': [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py # modelscope/utils/constant.py * Merge branch 'feat/nlp' into feat/nlp_refactor * feat/nlp: [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py * fix compile bug * refactor space * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * pre-commit lint * lint file * lint file * lint file * update modelhub dependency * lint file * ignore dst_processor 
temporary * solve comment: 1. change MaskedLMModelBase to MaskedLanguageModelBase 2. remove a useless import * recommit * remove MaskedLanguageModel from __all__ * Merge commit '1a0d4af55a2eee69d89633874890f50eda8f8700' into feat/nlp_refactor * commit '1a0d4af55a2eee69d89633874890f50eda8f8700': [to #42322933] test level check Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9143809 [to #42322933] update nlp models name in metainfo Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9134657 # Conflicts: # modelscope/metainfo.py * update * revert pipeline params update * remove zeroshot * update sequence classfication outpus * merge with fill mask * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * init dialog state tracking * fix flake8 warning of dst * Merge remote-tracking branch 'origin/feat/fill_mask' into feat/nlp * merge with master * remove useless test.py * add init * merge nlp * Merge remote-tracking branch 'origin/master' into feat/nlp * remove unformatted space trainer * Merge branch 'feat/nlp' into nlp/space/dst * revise based on comment except chinease comment * skip ci blocking * change Chinese notes of space3.0 into English * translate chinese comment to english * add space to metainfo * space dst pipeline is ready, but model's result is wrong * merge feat/nlp * merge with master * change processor * change example * test case ready * dst loacl ready * update dst conf * merge feat/nlp * inform revise * inherit bug fix * init * add 2 complete examples * fix bug * add test case * merge with master * modify model name * add missing setting * add outputs * modify test level * modify chinese comment * remove useless doc * merge feat/nlp * Merge remote-tracking branch 'origin' into nlp/space/dst * Merge branch 'feat/nlp' into nlp/space/dst * dst test ready * merge feat nlp * space outputs normalization * update dst * merge feat nlp * Merge branch 'master' into nlp/space/dst * update requirement * merge with master 
* Merge remote-tracking branch 'origin/master' into nlp/space/dst * formating output * update requirements/nlp * merge with master * add test cases * Merge remote-tracking branch 'origin/master' into nlp/space/dst * merge with master * login warn refactor, error message when exception, ut case to new user * login warn refactor, error message when exception, ut case to new user
3 years ago
3 years ago
3 years ago
[to #43040150] fix: login warn refactor, error message when exception, ut case to new user 未登录warn提示信息重构,只有遇到请求失败,exception时,如果cookie为空,提示用户login,单元测试用户修改成单独 Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9283579 * [to #42322933] add space dialog-state tracking pipeline Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9227018 * init * token to ids * add model * model forward ready * add intent * intent preprocessor ready * intent success * merge master * test with model hub * add flake8 * update * update * update * Merge branch 'master' into nlp/space/gen * delete file about gen * init * fix flake8 bug * [to #42322933] init * bug fix * [to #42322933] init * update pipeline registry info * Merge remote-tracking branch 'origin/master' into feat/nli * [to #42322933] init * [to #42322933] init * modify forward * [to #42322933] init * generation ready * init * Merge branch 'master' into feat/zero_shot_classification # Conflicts: # modelscope/preprocessors/__init__.py * [to #42322933] bugfix * [to #42322933] pre commit fix * fill mask * registry multi models on model and pipeline * add tests * test level >= 0 * local gen ready * merge with master * dialog modeling ready * fix comments: rename and refactor AliceMindMLM; adjust pipeline * space intent and modeling(generation) are ready * bug fix * add dep * add dep * support dst data processor * merge with nlp/space/dst * merge with master * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/models/nlp/__init__.py modelscope/pipelines/builder.py modelscope/pipelines/outputs.py modelscope/preprocessors/nlp.py requirements/nlp.txt * merge with master * merge with master 2/2 * fix comments * fix isort for pre-commit check * allow params pass to pipeline's __call__ method * Merge remote-tracking branch 'origin/master' into feat/zero_shot_classification * merge with nli task * merge with sentiment_classification * merge with zero_shot_classfication * merge with 
fill_mask * merge with space * merge with master head * Merge remote-tracking branch 'origin' into feat/fill_mask Conflicts: modelscope/utils/constant.py * fix: pipeline module_name from model_type to 'fill_mask' & fix merge bug * unfiinished change * fix bug * unfinished * unfinished * revise modelhub dependency * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * add eval() to pipeline call * add test level * ut run passed * add default args * tmp * merge master * all ut passed * remove an useless enum * revert a mis modification * revert a mis modification * Merge commit 'ace8af92465f7d772f035aebe98967726655f12c' into feat/nlp * commit 'ace8af92465f7d772f035aebe98967726655f12c': [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py # modelscope/utils/constant.py * Merge branch 'feat/nlp' into feat/nlp_refactor * feat/nlp: [to #42322933] Add cv-action-recongnition-pipeline to maas lib [to #42463204] support Pil.Image for image_captioning_pipeline [to #42670107] restore pydataset test [to #42322933] add create if not exist and add(back) create model example Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9130661 [to #41474818]fix: fix errors in task name definition # Conflicts: # modelscope/pipelines/builder.py * fix compile bug * refactor space * Merge branch 'feat/nlp_refactor' of http://gitlab.alibaba-inc.com/Ali-MaaS/MaaS-lib into feat/nlp_refactor * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * pre-commit lint * lint file * lint file * lint file * update modelhub dependency * lint file * ignore dst_processor 
temporary * solve comment: 1. change MaskedLMModelBase to MaskedLanguageModelBase 2. remove a useless import * recommit * remove MaskedLanguageModel from __all__ * Merge commit '1a0d4af55a2eee69d89633874890f50eda8f8700' into feat/nlp_refactor * commit '1a0d4af55a2eee69d89633874890f50eda8f8700': [to #42322933] test level check Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9143809 [to #42322933] update nlp models name in metainfo Link: https://code.alibaba-inc.com/Ali-MaaS/MaaS-lib/codereview/9134657 # Conflicts: # modelscope/metainfo.py * update * revert pipeline params update * remove zeroshot * update sequence classfication outpus * merge with fill mask * Merge remote-tracking branch 'origin' into feat/fill_mask * fix * init dialog state tracking * fix flake8 warning of dst * Merge remote-tracking branch 'origin/feat/fill_mask' into feat/nlp * merge with master * remove useless test.py * add init * merge nlp * Merge remote-tracking branch 'origin/master' into feat/nlp * remove unformatted space trainer * Merge branch 'feat/nlp' into nlp/space/dst * revise based on comment except chinease comment * skip ci blocking * change Chinese notes of space3.0 into English * translate chinese comment to english * add space to metainfo * space dst pipeline is ready, but model's result is wrong * merge feat/nlp * merge with master * change processor * change example * test case ready * dst loacl ready * update dst conf * merge feat/nlp * inform revise * inherit bug fix * init * add 2 complete examples * fix bug * add test case * merge with master * modify model name * add missing setting * add outputs * modify test level * modify chinese comment * remove useless doc * merge feat/nlp * Merge remote-tracking branch 'origin' into nlp/space/dst * Merge branch 'feat/nlp' into nlp/space/dst * dst test ready * merge feat nlp * space outputs normalization * update dst * merge feat nlp * Merge branch 'master' into nlp/space/dst * update requirement * merge with master 
* Merge remote-tracking branch 'origin/master' into nlp/space/dst * formating output * update requirements/nlp * merge with master * add test cases * Merge remote-tracking branch 'origin/master' into nlp/space/dst * merge with master * login warn refactor, error message when exception, ut case to new user * login warn refactor, error message when exception, ut case to new user
3 years ago
# Copyright (c) Alibaba, Inc. and its affiliates.
# yapf: disable
import datetime
import os
import pickle
import platform
import shutil
import tempfile
import uuid
from collections import defaultdict
from http import HTTPStatus
from http.cookiejar import CookieJar
from os.path import expanduser
from typing import Dict, List, Optional, Tuple, Union

import requests

from modelscope import __version__
from modelscope.hub.constants import (API_RESPONSE_FIELD_DATA,
                                      API_RESPONSE_FIELD_EMAIL,
                                      API_RESPONSE_FIELD_GIT_ACCESS_TOKEN,
                                      API_RESPONSE_FIELD_MESSAGE,
                                      API_RESPONSE_FIELD_USERNAME,
                                      DEFAULT_CREDENTIALS_PATH,
                                      MODELSCOPE_CLOUD_ENVIRONMENT,
                                      MODELSCOPE_CLOUD_USERNAME,
                                      ONE_YEAR_SECONDS, Licenses,
                                      ModelVisibility)
from modelscope.hub.errors import (InvalidParameter, NotExistError,
                                   NotLoginException, NoValidRevisionError,
                                   RequestError, datahub_raise_on_error,
                                   handle_http_post_error,
                                   handle_http_response, is_ok,
                                   raise_for_http_status, raise_on_error)
from modelscope.hub.git import GitCommandWrapper
from modelscope.hub.repository import Repository
from modelscope.utils.config_ds import DOWNLOADED_DATASETS_PATH
from modelscope.utils.constant import (DEFAULT_DATASET_REVISION,
                                       DEFAULT_MODEL_REVISION,
                                       DEFAULT_REPOSITORY_REVISION,
                                       MASTER_MODEL_BRANCH, DatasetFormations,
                                       DatasetMetaFormats, DownloadChannel,
                                       DownloadMode, ModelFile)
from modelscope.utils.logger import get_logger
from .utils.utils import (get_endpoint, get_release_datetime,
                          model_id_to_group_owner_name)

logger = get_logger()
class HubApi:

    def __init__(self, endpoint=None):
        self.endpoint = endpoint if endpoint is not None else get_endpoint()
        self.headers = {'user-agent': ModelScopeConfig.get_user_agent()}

    def login(
        self,
        access_token: str,
    ) -> Tuple[str, CookieJar]:
        """
        Login with a ModelScope SDK access token.

        Args:
            access_token(`str`): user access token on modelscope.

        Returns:
            git access token: to access private repos
            cookies: to authenticate yourself to the ModelScope open-api

        <Tip>
            You only have to login once within 30 days.
        </Tip>
        """
        path = f'{self.endpoint}/api/v1/login'
        r = requests.post(
            path, json={'AccessToken': access_token}, headers=self.headers)
        raise_for_http_status(r)
        d = r.json()
        raise_on_error(d)

        token = d[API_RESPONSE_FIELD_DATA][API_RESPONSE_FIELD_GIT_ACCESS_TOKEN]
        cookies = r.cookies

        # save token and cookie
        ModelScopeConfig.save_token(token)
        ModelScopeConfig.save_cookies(cookies)
        ModelScopeConfig.save_user_info(
            d[API_RESPONSE_FIELD_DATA][API_RESPONSE_FIELD_USERNAME],
            d[API_RESPONSE_FIELD_DATA][API_RESPONSE_FIELD_EMAIL])

        return token, cookies
    def create_model(
        self,
        model_id: str,
        visibility: int,
        license: str,
        chinese_name: Optional[str] = None,
    ) -> str:
        """
        Create a model repo on the ModelScope Hub.

        Args:
            model_id(`str`): The model id.
            visibility(`int`): visibility of the model (1-private, 5-public), default public.
            license(`str`): license of the model.
            chinese_name(`str`, *optional*): chinese name of the model.

        Returns:
            URL of the created model repo.

        <Tip>
            model_id = {owner}/{name}
        </Tip>
        """
        if model_id is None:
            raise InvalidParameter('model_id is required!')
        cookies = ModelScopeConfig.get_cookies()
        if cookies is None:
            raise ValueError('Token does not exist, please login first.')

        path = f'{self.endpoint}/api/v1/models'
        owner_or_group, name = model_id_to_group_owner_name(model_id)
        body = {
            'Path': owner_or_group,
            'Name': name,
            'ChineseName': chinese_name,
            'Visibility': visibility,  # value is checked server-side
            'License': license
        }
        r = requests.post(
            path, json=body, cookies=cookies, headers=self.headers)
        handle_http_post_error(r, path, body)
        raise_on_error(r.json())
        model_repo_url = f'{get_endpoint()}/{model_id}'
        return model_repo_url
    def delete_model(self, model_id):
        """Delete a model from the ModelScope Hub.

        Args:
            model_id (str): The model id.

        <Tip>
            model_id = {owner}/{name}
        </Tip>
        """
        cookies = ModelScopeConfig.get_cookies()
        if cookies is None:
            raise ValueError('Token does not exist, please login first.')
        path = f'{self.endpoint}/api/v1/models/{model_id}'
        r = requests.delete(path, cookies=cookies, headers=self.headers)
        raise_for_http_status(r)
        raise_on_error(r.json())
    def get_model_url(self, model_id):
        return f'{self.endpoint}/api/v1/models/{model_id}.git'

    def get_model(
        self,
        model_id: str,
        revision: str = DEFAULT_MODEL_REVISION,
    ) -> str:
        """
        Get model information from the ModelScope Hub.

        Args:
            model_id(`str`): The model id.
            revision(`str`): revision of the model.

        Returns:
            The model detail information.

        Raises:
            NotExistError: If the model does not exist.

        <Tip>
            model_id = {owner}/{name}
        </Tip>
        """
        cookies = ModelScopeConfig.get_cookies()
        owner_or_group, name = model_id_to_group_owner_name(model_id)
        if revision:
            path = f'{self.endpoint}/api/v1/models/{owner_or_group}/{name}?Revision={revision}'
        else:
            path = f'{self.endpoint}/api/v1/models/{owner_or_group}/{name}'

        r = requests.get(path, cookies=cookies, headers=self.headers)
        handle_http_response(r, logger, cookies, model_id)
        if r.status_code == HTTPStatus.OK:
            if is_ok(r.json()):
                return r.json()[API_RESPONSE_FIELD_DATA]
            else:
                raise NotExistError(r.json()[API_RESPONSE_FIELD_MESSAGE])
        else:
            raise_for_http_status(r)
    def push_model(self,
                   model_id: str,
                   model_dir: str,
                   visibility: int = ModelVisibility.PUBLIC,
                   license: str = Licenses.APACHE_V2,
                   chinese_name: Optional[str] = None,
                   commit_message: Optional[str] = 'upload model',
                   revision: Optional[str] = DEFAULT_REPOSITORY_REVISION):
        """
        Upload a model from a given directory to a given repository. A valid model directory
        must contain a configuration.json file.

        This function uploads the files in the given directory to the given repository. If the
        given repository does not exist on the remote, it is created automatically with the
        given visibility, license and chinese_name parameters. If the revision does not exist
        in the remote repository, a new branch is created for it.

        You must call HubApi's login with a valid token, which can be obtained from
        ModelScope's website, before calling this function.

        Args:
            model_id (`str`):
                The model id to be uploaded; the caller must have write permission for it.
            model_dir(`str`):
                The absolute path of the finetune result.
            visibility(`int`, defaults to `ModelVisibility.PUBLIC`):
                Visibility of the newly created model (1-private, 5-public). If the model
                does not exist on ModelScope, this function creates a new model with this
                visibility and the parameter is required. You can omit it if you are sure
                the model already exists.
            license(`str`, defaults to `Licenses.APACHE_V2`):
                License of the newly created model (see Licenses). If the model does not
                exist on ModelScope, this function creates a new model with this license
                and the parameter is required. You can omit it if you are sure the model
                already exists.
            chinese_name(`str`, *optional*, defaults to `None`):
                Chinese name of the newly created model.
            commit_message(`str`, *optional*, defaults to `'upload model'`):
                Commit message of the push request.
            revision (`str`, *optional*, defaults to DEFAULT_REPOSITORY_REVISION):
                Which branch to push to. If the branch does not exist, a new branch is
                created and pushed to.
        """
        if model_id is None:
            raise InvalidParameter('model_id cannot be empty!')
        if model_dir is None:
            raise InvalidParameter('model_dir cannot be empty!')
        if not os.path.exists(model_dir) or os.path.isfile(model_dir):
            raise InvalidParameter('model_dir must be a valid directory.')
        cfg_file = os.path.join(model_dir, ModelFile.CONFIGURATION)
        if not os.path.exists(cfg_file):
            raise ValueError(f'{model_dir} must contain a configuration.json.')
        cookies = ModelScopeConfig.get_cookies()
        if cookies is None:
            raise NotLoginException('Must login before upload!')
        files_to_save = os.listdir(model_dir)
        try:
            self.get_model(model_id=model_id)
        except Exception:
            if visibility is None or license is None:
                raise InvalidParameter(
                    'visibility and license cannot be empty when creating a new repo'
                )
            logger.info('Create new model %s' % model_id)
            self.create_model(
                model_id=model_id,
                visibility=visibility,
                license=license,
                chinese_name=chinese_name)
        tmp_dir = tempfile.mkdtemp()
        git_wrapper = GitCommandWrapper()
        try:
            repo = Repository(model_dir=tmp_dir, clone_from=model_id)
            branches = git_wrapper.get_remote_branches(tmp_dir)
            if revision not in branches:
                logger.info('Create new branch %s' % revision)
                git_wrapper.new_branch(tmp_dir, revision)
            git_wrapper.checkout(tmp_dir, revision)
            # clear the working tree (keeping hidden entries such as .git),
            # then copy over the files to be uploaded
            files_in_repo = os.listdir(tmp_dir)
            for f in files_in_repo:
                if f[0] != '.':
                    src = os.path.join(tmp_dir, f)
                    if os.path.isfile(src):
                        os.remove(src)
                    else:
                        shutil.rmtree(src, ignore_errors=True)
            for f in files_to_save:
                if f[0] != '.':
                    src = os.path.join(model_dir, f)
                    if os.path.isdir(src):
                        shutil.copytree(src, os.path.join(tmp_dir, f))
                    else:
                        shutil.copy(src, tmp_dir)
            if not commit_message:
                date = datetime.datetime.now().strftime('%Y_%m_%d_%H_%M_%S')
                commit_message = '[automsg] push model %s to hub at %s' % (
                    model_id, date)
            repo.push(commit_message=commit_message, local_branch=revision, remote_branch=revision)
        finally:
            shutil.rmtree(tmp_dir, ignore_errors=True)
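The working-tree sync inside `push_model` (drop every non-hidden entry of the cloned repo, then copy in the model files) can be exercised on its own. A minimal sketch, where `sync_dir` is a hypothetical stand-alone helper mirroring the two loops above, not part of this module:

```python
import os
import shutil
import tempfile


def sync_dir(model_dir: str, repo_dir: str) -> None:
    """Mirror push_model's sync step: remove non-hidden entries in repo_dir,
    then copy non-hidden entries of model_dir into it."""
    for name in os.listdir(repo_dir):
        if name.startswith('.'):
            continue  # keep .git and other hidden entries
        path = os.path.join(repo_dir, name)
        if os.path.isfile(path):
            os.remove(path)
        else:
            shutil.rmtree(path, ignore_errors=True)
    for name in os.listdir(model_dir):
        if name.startswith('.'):
            continue
        src = os.path.join(model_dir, name)
        if os.path.isdir(src):
            shutil.copytree(src, os.path.join(repo_dir, name))
        else:
            shutil.copy(src, repo_dir)


# exercise the helper with throwaway directories
model_dir = tempfile.mkdtemp()
repo_dir = tempfile.mkdtemp()
open(os.path.join(model_dir, 'configuration.json'), 'w').close()
os.makedirs(os.path.join(repo_dir, '.git'))  # must survive the sync
open(os.path.join(repo_dir, 'stale.bin'), 'w').close()  # must be removed
sync_dir(model_dir, repo_dir)
result = sorted(os.listdir(repo_dir))  # ['.git', 'configuration.json']
```

Because only non-hidden entries are touched, the clone's `.git` metadata survives and the subsequent `repo.push` can commit the diff.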
    def list_models(self,
                    owner_or_group: str,
                    page_number=1,
                    page_size=10) -> dict:
        """List models in owner or group.

        Args:
            owner_or_group(`str`): owner or group.
            page_number(`int`): The page number, default: 1
            page_size(`int`): The page size, default: 10

        Returns:
            dict: {"models": "list of models", "TotalCount": total_number_of_models_in_owner_or_group}
        """
        cookies = ModelScopeConfig.get_cookies()
        path = f'{self.endpoint}/api/v1/models/'
        r = requests.put(
            path,
            data='{"Path":"%s", "PageNumber":%s, "PageSize": %s}' %
            (owner_or_group, page_number, page_size),
            cookies=cookies,
            headers=self.headers)
        handle_http_response(r, logger, cookies, 'list_model')
        if r.status_code == HTTPStatus.OK:
            if is_ok(r.json()):
                data = r.json()[API_RESPONSE_FIELD_DATA]
                return data
            else:
                raise RequestError(r.json()[API_RESPONSE_FIELD_MESSAGE])
        else:
            raise_for_http_status(r)
        return None

    def _check_cookie(self,
                      use_cookies: Union[bool,
                                         CookieJar] = False) -> CookieJar:
        cookies = None
        if isinstance(use_cookies, CookieJar):
            cookies = use_cookies
        elif use_cookies:
            cookies = ModelScopeConfig.get_cookies()
            if cookies is None:
                raise ValueError('Token does not exist, please login first.')
        return cookies
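`_check_cookie` resolves its `use_cookies` argument three ways: a `CookieJar` is passed through unchanged, `True` loads the locally saved cookies (raising if none exist), and `False` yields no cookies at all. A stand-alone sketch of that resolution, where `resolve_cookies` is an illustrative function and its `saved` parameter stands in for `ModelScopeConfig.get_cookies()`:

```python
from http.cookiejar import CookieJar
from typing import Optional, Union


def resolve_cookies(use_cookies: Union[bool, CookieJar],
                    saved: Optional[CookieJar]) -> Optional[CookieJar]:
    # an explicit CookieJar always wins
    if isinstance(use_cookies, CookieJar):
        return use_cookies
    # True means "use whatever was saved at login time"
    if use_cookies:
        if saved is None:
            raise ValueError('Token does not exist, please login first.')
        return saved
    # False: proceed anonymously, with no cookies
    return None


jar = CookieJar()
assert resolve_cookies(jar, saved=None) is jar     # explicit jar passed through
assert resolve_cookies(False, saved=None) is None  # anonymous request
try:
    resolve_cookies(True, saved=None)              # wants local cookies, none saved
except ValueError as e:
    err = str(e)
```

This is why read-only calls such as `get_model_files` default to `use_cookies=False`: they work anonymously for public repos and only require a login when explicitly asked to authenticate.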
    def list_model_revisions(
            self,
            model_id: str,
            cutoff_timestamp: int = None,
            use_cookies: Union[bool, CookieJar] = False) -> List[str]:
        """List model revisions (tags).

        Args:
            model_id (str): The model id.
            cutoff_timestamp (int): Tags created before the cutoff will be included.
                The timestamp is represented by the seconds elapsed since the epoch.
            use_cookies (Union[bool, CookieJar], optional): If a CookieJar, this cookie
                will be used; if True, the cookie will be loaded from local storage.
                Defaults to False.

        Returns:
            List[str]: List of tag names created before the cutoff, ordered by create-time.
        """
        cookies = self._check_cookie(use_cookies)
        if cutoff_timestamp is None:
            cutoff_timestamp = get_release_datetime()
        path = f'{self.endpoint}/api/v1/models/{model_id}/revisions?EndTime=%s' % cutoff_timestamp
        r = requests.get(path, cookies=cookies, headers=self.headers)
        handle_http_response(r, logger, cookies, model_id)
        d = r.json()
        raise_on_error(d)
        info = d[API_RESPONSE_FIELD_DATA]
        # tags returned from the backend are guaranteed to be ordered by create-time
        tags = [x['Revision'] for x in info['RevisionMap']['Tags']
                ] if info['RevisionMap']['Tags'] else []
        return tags
    def get_valid_revision(self, model_id: str, revision=None, cookies: Optional[CookieJar] = None):
        """Resolve `revision` to a branch or tag that is valid for the current library version."""
        release_timestamp = get_release_datetime()
        current_timestamp = int(round(datetime.datetime.now().timestamp()))
        # for active development in library codes (non-release branches), release_timestamp
        # is set to a time far in the future, to ensure that we
        # get the master-HEAD version from the model repo by default (when no revision is provided)
        if release_timestamp > current_timestamp + ONE_YEAR_SECONDS:
            branches, tags = self.get_model_branches_and_tags(
                model_id, use_cookies=False if cookies is None else cookies)
            if revision is None:
                revision = MASTER_MODEL_BRANCH
                logger.info('Model revision not specified, use default: %s in development mode' % revision)
            if revision not in branches and revision not in tags:
                raise NotExistError('The model: %s has no branch or tag: %s.' % (model_id, revision))
            logger.info('Development mode use revision: %s' % revision)
        else:
            if revision is None:  # user did not specify a revision, use the latest revision before release time
                revisions = self.list_model_revisions(
                    model_id, cutoff_timestamp=release_timestamp, use_cookies=False if cookies is None else cookies)
                if len(revisions) == 0:
                    raise NoValidRevisionError('The model: %s has no valid revision!' % model_id)
                # tags (revisions) returned from the backend are guaranteed to be ordered by create-time,
                # so we obtain the latest revision created earlier than the release version of this branch
                revision = revisions[0]
                logger.info('Model revision not specified, use the latest revision: %s' % revision)
            else:
                # use the user-specified revision
                revisions = self.list_model_revisions(
                    model_id, cutoff_timestamp=current_timestamp, use_cookies=False if cookies is None else cookies)
                if revision not in revisions:
                    raise NotExistError(
                        'The model: %s has no revision: %s !' % (model_id, revision))
                logger.info('Use user-specified model revision: %s' % revision)
        return revision
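In release mode, `get_valid_revision` relies on the backend returning tags ordered by creation time: with no revision given it takes the first tag created before the library's release time, and a user-specified revision must appear among the tags created so far. A pure-Python sketch of just that selection rule (`pick_revision` is illustrative, not part of this module; tags are assumed newest-first, matching how `revisions[0]` is treated above):

```python
from typing import List, Optional


def pick_revision(tags_before_release: List[str],
                  tags_so_far: List[str],
                  revision: Optional[str]) -> str:
    """Sketch of the release-mode branch of get_valid_revision."""
    if revision is None:
        # no revision given: latest tag created before the release time
        if not tags_before_release:
            raise ValueError('no valid revision')
        return tags_before_release[0]
    # user-specified revision: must be an existing tag
    if revision not in tags_so_far:
        raise ValueError('no such revision: %s' % revision)
    return revision


# v1.3 was tagged after this library version was released
assert pick_revision(['v1.2', 'v1.1'], ['v1.3', 'v1.2', 'v1.1'], None) == 'v1.2'
# ...but a user may still ask for it explicitly
assert pick_revision(['v1.2', 'v1.1'], ['v1.3', 'v1.2', 'v1.1'], 'v1.3') == 'v1.3'
```

The cutoff keeps an older library from silently downloading a model revision published for a newer release, while still allowing explicit opt-in.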
    def get_model_branches_and_tags(
        self,
        model_id: str,
        use_cookies: Union[bool, CookieJar] = False,
    ) -> Tuple[List[str], List[str]]:
        """Get model branches and tags.

        Args:
            model_id (str): The model id.
            use_cookies (Union[bool, CookieJar], optional): If a CookieJar, this cookie
                will be used; if True, the cookie will be loaded from local storage.
                Defaults to False.

        Returns:
            Tuple[List[str], List[str]]: List of branch names and list of tag names.
        """
        cookies = self._check_cookie(use_cookies)
        path = f'{self.endpoint}/api/v1/models/{model_id}/revisions'
        r = requests.get(path, cookies=cookies, headers=self.headers)
        handle_http_response(r, logger, cookies, model_id)
        d = r.json()
        raise_on_error(d)
        info = d[API_RESPONSE_FIELD_DATA]
        branches = [x['Revision'] for x in info['RevisionMap']['Branches']
                    ] if info['RevisionMap']['Branches'] else []
        tags = [x['Revision'] for x in info['RevisionMap']['Tags']
                ] if info['RevisionMap']['Tags'] else []
        return branches, tags
    def get_model_files(self,
                        model_id: str,
                        revision: Optional[str] = DEFAULT_MODEL_REVISION,
                        root: Optional[str] = None,
                        recursive: Optional[bool] = False,
                        use_cookies: Union[bool, CookieJar] = False,
                        headers: Optional[dict] = {}) -> List[dict]:
        """List the model's files.

        Args:
            model_id (str): The model id.
            revision (Optional[str], optional): The branch or tag name.
            root (Optional[str], optional): The root path. Defaults to None.
            recursive (Optional[bool], optional): Whether to list files recursively.
                Defaults to False.
            use_cookies (Union[bool, CookieJar], optional): If a CookieJar, this cookie
                will be used; if True, the cookie will be loaded from local storage.
                Defaults to False.
            headers: request headers

        Raises:
            ValueError: If use_cookies is True but there is no local cookie.

        Returns:
            List[dict]: Model file list.
        """
        if revision:
            path = '%s/api/v1/models/%s/repo/files?Revision=%s&Recursive=%s' % (
                self.endpoint, model_id, revision, recursive)
        else:
            path = '%s/api/v1/models/%s/repo/files?Recursive=%s' % (
                self.endpoint, model_id, recursive)
        cookies = self._check_cookie(use_cookies)
        if root is not None:
            path = path + f'&Root={root}'
        r = requests.get(
            path, cookies=cookies, headers={
                **headers,
                **self.headers
            })
        handle_http_response(r, logger, cookies, model_id)
        d = r.json()
        raise_on_error(d)
        files = []
        for file in d[API_RESPONSE_FIELD_DATA]['Files']:
            if file['Name'] == '.gitignore' or file['Name'] == '.gitattributes':
                continue
            files.append(file)
        return files
    def list_datasets(self):
        path = f'{self.endpoint}/api/v1/datasets'
        params = {}
        r = requests.get(path, params=params, headers=self.headers)
        raise_for_http_status(r)
        dataset_list = r.json()[API_RESPONSE_FIELD_DATA]
        return [x['Name'] for x in dataset_list]
    def fetch_dataset_scripts(
            self,
            dataset_name: str,
            namespace: str,
            download_mode: Optional[DownloadMode],
            revision: Optional[str] = DEFAULT_DATASET_REVISION):
        if namespace is None:
            raise ValueError(
                f'Dataset from Hubs.modelscope should have a valid "namespace", but got {namespace}'
            )
        revision = revision or DEFAULT_DATASET_REVISION
        cache_dir = os.path.join(DOWNLOADED_DATASETS_PATH, namespace,
                                 dataset_name, revision)
        download_mode = DownloadMode(download_mode
                                     or DownloadMode.REUSE_DATASET_IF_EXISTS)
        if download_mode == DownloadMode.FORCE_REDOWNLOAD and os.path.exists(
                cache_dir):
            shutil.rmtree(cache_dir)
        os.makedirs(cache_dir, exist_ok=True)
        datahub_url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}'
        cookies = ModelScopeConfig.get_cookies()
        r = requests.get(datahub_url, cookies=cookies)
        resp = r.json()
        datahub_raise_on_error(datahub_url, resp)
        dataset_id = resp['Data']['Id']
        dataset_type = resp['Data']['Type']
        datahub_url = f'{self.endpoint}/api/v1/datasets/{dataset_id}/repo/tree?Revision={revision}'
        r = requests.get(datahub_url, cookies=cookies, headers=self.headers)
        resp = r.json()
        datahub_raise_on_error(datahub_url, resp)
        file_list = resp['Data']
        if file_list is None:
            raise NotExistError(
                f'The modelscope dataset [dataset_name = {dataset_name}, namespace = {namespace}, '
                f'version = {revision}] does not exist')

        file_list = file_list['Files']
        local_paths = defaultdict(list)
        dataset_formation = DatasetFormations(dataset_type)
        dataset_meta_format = DatasetMetaFormats[dataset_formation]
        for file_info in file_list:
            file_path = file_info['Path']
            extension = os.path.splitext(file_path)[-1]
            if extension in dataset_meta_format:
                datahub_url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/repo?' \
                              f'Revision={revision}&FilePath={file_path}'
                r = requests.get(datahub_url, cookies=cookies)
                raise_for_http_status(r)
                local_path = os.path.join(cache_dir, file_path)
                if os.path.exists(local_path):
                    logger.warning(
                        f"Reusing dataset {dataset_name}'s python file ({local_path})"
                    )
                    local_paths[extension].append(local_path)
                    continue
                with open(local_path, 'wb') as f:
                    f.write(r.content)
                local_paths[extension].append(local_path)
        return local_paths, dataset_formation, cache_dir
    def get_dataset_file_url(
            self,
            file_name: str,
            dataset_name: str,
            namespace: str,
            revision: Optional[str] = DEFAULT_DATASET_REVISION):
        if file_name.endswith('.csv'):
            file_name = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/repo?' \
                        f'Revision={revision}&FilePath={file_name}'
        return file_name

    def get_dataset_access_config(
            self,
            dataset_name: str,
            namespace: str,
            revision: Optional[str] = DEFAULT_DATASET_REVISION):
        datahub_url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/' \
                      f'ststoken?Revision={revision}'
        return self.datahub_remote_call(datahub_url)

    def get_dataset_access_config_session(
            self,
            cookies: CookieJar,
            dataset_name: str,
            namespace: str,
            revision: Optional[str] = DEFAULT_DATASET_REVISION):
        datahub_url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/' \
                      f'ststoken?Revision={revision}'
        r = requests.get(url=datahub_url, cookies=cookies, headers=self.headers)
        resp = r.json()
        raise_on_error(resp)
        return resp['Data']
  537. def list_oss_dataset_objects(self, dataset_name, namespace, max_limit,
  538. is_recursive, is_filter_dir, revision):
  539. url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/oss/tree/?' \
  540. f'MaxLimit={max_limit}&Revision={revision}&Recursive={is_recursive}&FilterDir={is_filter_dir}'
  541. cookies = ModelScopeConfig.get_cookies()
  542. resp = requests.get(url=url, cookies=cookies)
  543. resp = resp.json()
  544. raise_on_error(resp)
  545. resp = resp['Data']
  546. return resp
    def on_dataset_download(self, dataset_name: str, namespace: str) -> None:
        url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/download/increase'
        cookies = ModelScopeConfig.get_cookies()
        r = requests.post(url, cookies=cookies, headers=self.headers)
        raise_for_http_status(r)
    def delete_oss_dataset_object(self, object_name: str, dataset_name: str,
                                  namespace: str, revision: str) -> str:
        if not object_name or not dataset_name or not namespace or not revision:
            raise ValueError('Args cannot be empty!')
        url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/oss?Path={object_name}&Revision={revision}'
        cookies = self.check_local_cookies(use_cookies=True)
        resp = requests.delete(url=url, cookies=cookies)
        resp = resp.json()
        raise_on_error(resp)
        return resp['Message']

    def delete_oss_dataset_dir(self, object_name: str, dataset_name: str,
                               namespace: str, revision: str) -> str:
        if not object_name or not dataset_name or not namespace or not revision:
            raise ValueError('Args cannot be empty!')
        url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/oss/prefix?Prefix={object_name}/' \
              f'&Revision={revision}'
        cookies = self.check_local_cookies(use_cookies=True)
        resp = requests.delete(url=url, cookies=cookies)
        resp = resp.json()
        raise_on_error(resp)
        return resp['Message']
    @staticmethod
    def datahub_remote_call(url):
        cookies = ModelScopeConfig.get_cookies()
        r = requests.get(
            url,
            cookies=cookies,
            headers={'user-agent': ModelScopeConfig.get_user_agent()})
        resp = r.json()
        datahub_raise_on_error(url, resp)
        return resp['Data']
    def check_local_cookies(self, use_cookies) -> CookieJar:
        return self._check_cookie(use_cookies=use_cookies)

    def dataset_download_uv(self, dataset_name: str, namespace: str):
        if not dataset_name or not namespace:
            raise ValueError('dataset_name or namespace cannot be empty!')
        # Resolve the download channel and user name from the cloud environment,
        # falling back to a local, anonymous download.
        channel = DownloadChannel.LOCAL.value
        user_name = ''
        if MODELSCOPE_CLOUD_ENVIRONMENT in os.environ:
            channel = os.environ[MODELSCOPE_CLOUD_ENVIRONMENT]
        if MODELSCOPE_CLOUD_USERNAME in os.environ:
            user_name = os.environ[MODELSCOPE_CLOUD_USERNAME]
        url = f'{self.endpoint}/api/v1/datasets/{namespace}/{dataset_name}/download/uv/{channel}?user={user_name}'
        cookies = ModelScopeConfig.get_cookies()
        r = requests.post(url, cookies=cookies, headers=self.headers)
        resp = r.json()
        raise_on_error(resp)
        return resp['Message']
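The channel/user resolution above can be sketched as a pure function. The environment-variable names here are hypothetical placeholders; the real code reads the `MODELSCOPE_CLOUD_ENVIRONMENT` and `MODELSCOPE_CLOUD_USERNAME` constants defined elsewhere in the library.

```python
def resolve_download_identity(environ,
                              env_key='MODELSCOPE_ENVIRONMENT',
                              user_key='MODELSCOPE_USERNAME'):
    # Fall back to the local channel and an anonymous user when the
    # process is not running inside a managed cloud container.
    channel = environ.get(env_key, 'local')
    user_name = environ.get(user_key, '')
    return channel, user_name

print(resolve_download_identity({}))  # ('local', '')
print(resolve_download_identity({'MODELSCOPE_ENVIRONMENT': 'dsw',
                                 'MODELSCOPE_USERNAME': 'alice'}))
```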
class ModelScopeConfig:
    path_credential = expanduser(DEFAULT_CREDENTIALS_PATH)
    COOKIES_FILE_NAME = 'cookies'
    GIT_TOKEN_FILE_NAME = 'git_token'
    USER_INFO_FILE_NAME = 'user'
    USER_SESSION_ID_FILE_NAME = 'session'

    @staticmethod
    def make_sure_credential_path_exist():
        os.makedirs(ModelScopeConfig.path_credential, exist_ok=True)
    @staticmethod
    def save_cookies(cookies: CookieJar):
        ModelScopeConfig.make_sure_credential_path_exist()
        with open(
                os.path.join(ModelScopeConfig.path_credential,
                             ModelScopeConfig.COOKIES_FILE_NAME), 'wb+') as f:
            pickle.dump(cookies, f)
    @staticmethod
    def get_cookies():
        cookies_path = os.path.join(ModelScopeConfig.path_credential,
                                    ModelScopeConfig.COOKIES_FILE_NAME)
        if os.path.exists(cookies_path):
            with open(cookies_path, 'rb') as f:
                cookies = pickle.load(f)
                for cookie in cookies:
                    if cookie.is_expired():
                        logger.warning(
                            'Authentication has expired, please re-login.')
                        return None
                return cookies
        return None
    @staticmethod
    def get_user_session_id():
        session_path = os.path.join(ModelScopeConfig.path_credential,
                                    ModelScopeConfig.USER_SESSION_ID_FILE_NAME)
        session_id = ''
        if os.path.exists(session_path):
            with open(session_path, 'rb') as f:
                session_id = str(f.readline().strip(), encoding='utf-8')
        # Regenerate when missing or malformed (a valid id is a 32-char uuid4 hex).
        if session_id == '' or len(session_id) != 32:
            session_id = str(uuid.uuid4().hex)
            ModelScopeConfig.make_sure_credential_path_exist()
            with open(session_path, 'w+') as wf:
                wf.write(session_id)
        return session_id
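The `len(session_id) != 32` guard above matches the shape of `uuid.uuid4().hex`, which is always a 32-character lowercase hex string, so any truncated or hand-edited session file is discarded and regenerated:

```python
import uuid

# uuid4().hex drops the dashes of the canonical form, leaving 32 hex chars.
session_id = uuid.uuid4().hex
print(len(session_id))  # 32
```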
    @staticmethod
    def save_token(token: str):
        ModelScopeConfig.make_sure_credential_path_exist()
        with open(
                os.path.join(ModelScopeConfig.path_credential,
                             ModelScopeConfig.GIT_TOKEN_FILE_NAME), 'w+') as f:
            f.write(token)

    @staticmethod
    def save_user_info(user_name: str, user_email: str):
        ModelScopeConfig.make_sure_credential_path_exist()
        with open(
                os.path.join(ModelScopeConfig.path_credential,
                             ModelScopeConfig.USER_INFO_FILE_NAME), 'w+') as f:
            f.write('%s:%s' % (user_name, user_email))
    @staticmethod
    def get_user_info() -> Tuple[str, str]:
        try:
            with open(
                    os.path.join(ModelScopeConfig.path_credential,
                                 ModelScopeConfig.USER_INFO_FILE_NAME),
                    'r',
                    encoding='utf-8') as f:
                info = f.read()
                return info.split(':')[0], info.split(':')[1]
        except FileNotFoundError:
            pass
        return None, None
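The credential file uses a simple `name:email` layout, written by `save_user_info` and split apart by `get_user_info`. The helper names below are hypothetical; this sketch just round-trips the format (using `maxsplit=1` so a stray `:` in the second field would survive, whereas the method above uses a plain split):

```python
def save_line(user_name: str, user_email: str) -> str:
    # Same 'name:email' layout the credential file stores.
    return '%s:%s' % (user_name, user_email)

def parse_line(info: str):
    # maxsplit=1 keeps any ':' inside the second field intact.
    name, email = info.split(':', 1)
    return name, email

print(parse_line(save_line('alice', 'alice@example.com')))
# ('alice', 'alice@example.com')
```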
    @staticmethod
    def get_token() -> Optional[str]:
        """Get the saved git token, or None if it does not exist.

        Returns:
            `str` or `None`: The token, `None` if it doesn't exist.
        """
        token = None
        try:
            with open(
                    os.path.join(ModelScopeConfig.path_credential,
                                 ModelScopeConfig.GIT_TOKEN_FILE_NAME),
                    'r',
                    encoding='utf-8') as f:
                token = f.read()
        except FileNotFoundError:
            pass
        return token
    @staticmethod
    def get_user_agent(user_agent: Union[Dict, str, None] = None) -> str:
        """Format a user-agent string with basic info about the request.

        Args:
            user_agent (`str`, `dict`, *optional*):
                The user agent info in the form of a dictionary or a single string.

        Returns:
            The formatted user-agent string.
        """
        # Include some more telemetry when executing in dedicated
        # cloud containers.
        env = 'custom'
        if MODELSCOPE_CLOUD_ENVIRONMENT in os.environ:
            env = os.environ[MODELSCOPE_CLOUD_ENVIRONMENT]
        user_name = 'unknown'
        if MODELSCOPE_CLOUD_USERNAME in os.environ:
            user_name = os.environ[MODELSCOPE_CLOUD_USERNAME]
        ua = 'modelscope/%s; python/%s; session_id/%s; platform/%s; processor/%s; env/%s; user/%s' % (
            __version__,
            platform.python_version(),
            ModelScopeConfig.get_user_session_id(),
            platform.platform(),
            platform.processor(),
            env,
            user_name,
        )
        # Append caller-supplied info instead of discarding the base string.
        if isinstance(user_agent, dict):
            ua += '; ' + '; '.join(f'{k}/{v}' for k, v in user_agent.items())
        elif isinstance(user_agent, str):
            ua += '; ' + user_agent
        return ua
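The extension logic at the end of `get_user_agent` can be isolated as a small pure function, assuming the appending behavior above (the base string here is a simplified stand-in for the full telemetry string):

```python
def extend_user_agent(base: str, user_agent=None) -> str:
    # Dict entries are appended as 'key/value' pairs; a plain string is
    # appended verbatim; None leaves the base string untouched.
    if isinstance(user_agent, dict):
        base += '; ' + '; '.join(f'{k}/{v}' for k, v in user_agent.items())
    elif isinstance(user_agent, str):
        base += '; ' + user_agent
    return base

print(extend_user_agent('modelscope/1.0', {'task': 'nlp'}))  # modelscope/1.0; task/nlp
print(extend_user_agent('modelscope/1.0', 'ci/true'))        # modelscope/1.0; ci/true
```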