1260 Commits (decc8404a9fad3ab75c3d8c45d076a545de71b51)
Author SHA1 Message Date
  caojian05 73d4cf77d4 add model parameters for vgg16 to open mixed precision. 5 years ago
  mindspore-ci-bot b39247d3de !557 add README file for vgg16 5 years ago
  mindspore-ci-bot 07dcd6e0cc !616 remove the parameter 'batch_size' of VGG16, for we can use 'flatten' instead of 'reshape'. 5 years ago
  caifubi 246fc290d0 clean runtime codex 5 years ago
  lianliguang c4aeb5a0b8 add format choice when kernel selecting reduces or raises precision 5 years ago
  mindspore-ci-bot d132b61bcd !625 Fix check nullptr by calling function directly 5 years ago
  mindspore-ci-bot 928dcf4ec4 !554 Fix derelu fusion pass 5 years ago
  liyong 1f222ddb9e fix mindrecord c ut 5 years ago
  mindspore-ci-bot c646faed1e !626 gpu dynamic memory pool can't reuse allReduce in multi-stream 5 years ago
  huanghui cd6e8d6542 fix ReluV2's mask shape in derelu fusion pass 5 years ago
  mindspore-ci-bot ebc3f12b21 !620 [Auto parallel] Fix the code-style warnings in parallel-mode 5 years ago
  fary86 065e9e6a4e Fix when not set GLOG_log_dir, create log file at /tmp 5 years ago
  limingqi107 0f0e8fe874 gpu dynamic memory pool can not reuse allReduce in multi-stream 5 years ago
  lvliang 5b39a3ea6e fix-check-nullptr-by-calling-function 5 years ago
  mindspore-ci-bot 99e7e43cbb !599 fix clipbynorm op run error in pynative mode 5 years ago
  mindspore-ci-bot 4a3e5cb944 !492 Add AllGather fusion pass 5 years ago
  mindspore-ci-bot 82cb39274b !619 Clean code style check warning in pynative codes 5 years ago
  mindspore-ci-bot 85c889bc3b !615 Modify log level of context 5 years ago
  mindspore-ci-bot 54e0fa5c09 !556 [Auto Parallel] use DeviceMemory instead of fixed-size memory check 5 years ago
  mindspore-ci-bot ad75618ca6 !611 change error message type 5 years ago
  mindspore-ci-bot 5034bc10ce !612 Bugfix: correct wrong pattern of confusion_softmax_grad_rule pass 5 years ago
  mindspore-ci-bot f2a861516a !549 [MS][BERT] Add README.md file of BERT model 5 years ago
  caojian05 10994a5e7d add README file for vgg16 5 years ago
  Xiaoda Zhang ec043fcd56 fix the codex and bot warnings 5 years ago
  mindspore-ci-bot de4aca2025 !613 fix codedex warning and a datacopy coredump bug 5 years ago
  lvliang aec761c143 pynative-clean-reviewbot-warning 5 years ago
  YuJianfeng 39945d0f79 Add AllGather fusion pass 5 years ago
  mindspore-ci-bot eefb6edde2 !606 Fix dataset api doc problem 5 years ago
  caojian05 b36094e327 remove the parameter batch_size of VGG16, for we can use flatten instead of reshape. 5 years ago
  simson 8492f3dd7f modify log level of context 5 years ago
  mindspore-ci-bot ebd0fd33f6 !572 Add README to resnet50_cifar10 5 years ago
  liubuyu 0b6b5e5123 fix codedex warning 5 years ago
  ch-l f806b72447 use DeviceMemory for memory control 5 years ago
  mindspore-ci-bot 31f1d2956a !604 GeneratorDataset column names support string type 5 years ago
  mindspore-ci-bot 2d6c703794 !582 Add gpu test case for dynamic lr. 5 years ago
  zhaojichen 8963e39516 change error type 5 years ago
  zhousiyi f6a4f3d155 [static_analysis]: remove the TrivialPrimEvaluator cache. 5 years ago
  mindspore-ci-bot 496ffff3fd !573 Fix bug in `ParseAttribute` 5 years ago
  dinghao 60971f26d8 fix clipbynorm in pynative mode 5 years ago
  mindspore-ci-bot 19e1494a68 !375 Add prim name to error message for _grad_ops.py 5 years ago
  fary86 17adba5eb5 Optimize flow of exporting onnx 5 years ago
  huanghui 14df771175 fix confusion_softmax_grad_rule pass 5 years ago
  mindspore-ci-bot 507b63ea20 !461 Add interface to get attributes of network. 5 years ago
  zhaojichen ff57caceb9 change error type 5 years ago
  mindspore-ci-bot 112040791a !610 modify lenet and alexnet README.md 5 years ago
  mindspore-ci-bot 5ac07bb695 !504 Support Tensor assign A[slice] = u 5 years ago
  wsc f90629a01d Add readme file of BERT model 5 years ago
  zhaojichen 8f1d140de1 change error type 5 years ago
  wukesong 15ccc5c56e modify lenet&alexnet 5 years ago
  fary86 8bb93411f3 Add prim name to error message for _grad_ops.py 5 years ago