1808 Commits (d471ac491e6979b5cdd54d7c7d9ed06ad965103d)

Author           | SHA1       | Message                                                                                            | Date
mindspore-ci-bot | 9bd34a1b29 | !6673 Add stage information for ops and strategy                                                   | 5 years ago
Xun Deng         | ea57699ed1 | added raise_not_implemented_error in distribution                                                  | 5 years ago
Lixia Chen       | 477808e622 | Enable cache for other leaf ops                                                                    | 5 years ago
mindspore-ci-bot | 0d8015b043 | !5300 [MD] Add nD Slice Support                                                                    | 5 years ago
mindspore-ci-bot | b785b7d0ff | !6813 dataset UT: check PIL version, delete test_callback_2maps                                    | 5 years ago
hesham           | 9cee0d2143 | Add num_epochs to non-sink training                                                                | 5 years ago
huangxinjing     | 4ef439e27b | Add stage information for ops and strategy                                                         | 5 years ago
Cathy Wong       | f2b07d907a | dataset UT: check PIL version, delete test_callback_2maps                                          | 5 years ago
mindspore-ci-bot | 4c0b3c1bd3 | !6017 per_batch_map needs to support x input columns and y output columns                          | 5 years ago
mindspore-ci-bot | e88e114a50 | !5930 Cache server phase 2 single node                                                             | 5 years ago
nhussain         | 77d507279c | first working implementation                                                                       | 5 years ago
Lixia Chen       | 983827ec5c | Rebase up to 88ded11f59                                                                            | 5 years ago
Zirui Wu         | 0cab373223 | per_batch_map supports x num_columns to y num_columns                                              | 5 years ago
mindspore-ci-bot | 2f3add4acd | !6301 [MD] Combine c++ and python ops in map                                                       | 5 years ago
mindspore-ci-bot | 55be3c42a5 | !5875 Add IFMR op for new backend.                                                                 | 5 years ago
peixu_ren        | 23ff21edd8 | Added lognormal distribuition                                                                      | 5 years ago
mindspore-ci-bot | 39874d133f | !6644 fix testcase running slowly                                                                  | 5 years ago
mindspore-ci-bot | 9f84920f64 | !6761 remove redundant check about IsInstance op                                                   | 5 years ago
liangzelang      | 195d97ad46 | Remove redundant check about IsInstance op                                                         | 5 years ago
jonyguo          | 1c6c54ae0f | fix: testcase test_cutmix_batch_op.py slowly                                                       | 5 years ago
Yi Huaijie       | 6066b16838 | implement parallel Pack                                                                            | 5 years ago
mindspore-ci-bot | 9475f9a19a | !6548 Implement parallel Split                                                                     | 5 years ago
mindspore-ci-bot | f72f2c22fb | !6653 fix stream sync error for mixed precesion on pynative mode                                   | 5 years ago
liuxiao93        | 0e02df812a | Add IFMR op for new backend.                                                                       | 5 years ago
mindspore-ci-bot | 091ee5085e | !6651 Modify init interface to internal interface                                                  | 5 years ago
nhussain         | fda9462682 | embed python compose op                                                                            | 5 years ago
Jiaqi            | 4e3e6006b6 | modify init                                                                                        | 5 years ago
mindspore-ci-bot | dfe77372f5 | !6505 Set top graph parameters' name the same as original graph parameters.                        | 5 years ago
chujinjin        | 1cf8f3b777 | fix stream sync error for mixed precision                                                          | 5 years ago
mindspore-ci-bot | 428927bdff | !6554 fix pylint                                                                                   | 5 years ago
mindspore-ci-bot | dde9f5ac25 | !6607 delete SoftmaxCrossEntropyExpand interface                                                   | 5 years ago
Zhang Qinghua    | 6c72d88ba1 | Set top graph parameters' name as original graph parameters.                                       | 5 years ago
jinyaohui        | 334a32d501 | fix pylint                                                                                         | 5 years ago
Wei Luning       | cdbd16de0c | fix bug in parameter set & fix code style in pynative_executa.cc                                   | 5 years ago
guohongzilong    | a754dea90c | delete SoftmaxCrossEntropyExpand                                                                   | 5 years ago
mindspore-ci-bot | 861602bce9 | !6512 Clean up the view code                                                                       | 5 years ago
mindspore-ci-bot | 3aa07a4362 | !6320 change mix_precision to c++                                                                  | 5 years ago
mindspore-ci-bot | 4905de06bd | !6545 improve the recognition of Parameter object and raise error when convert keywordarg to pydata | 5 years ago
shenwei41        | f2e34b2eaf | Clean up the view code                                                                             | 5 years ago
mindspore-ci-bot | 53a82fa6e0 | !6472 fix get_dataset_size error in CelebaDataset when usage is not 'all'                          | 5 years ago
buxue            | 483c8a179a | improve the recognition of Parameter object and raise error when convert keywordarg to pydata      | 5 years ago
mindspore-ci-bot | 5a20b11012 | !6502 [AutoParallel]Fix auto parallel find loss bug                                                | 5 years ago
mindspore-ci-bot | 92aecf128f | !6549 [MD] minddata add iterator parameter validation                                              | 5 years ago
yanghaitao1      | 4ff4c17632 | fix get_dataset_size in CelebADataset when usage is not all                                        | 5 years ago
Yi Huaijie       | 18ed2bec53 | implement parallel Split                                                                           | 5 years ago
xiefangqi        | a6360cb2e4 | add output_numpy validation to iterator                                                            | 5 years ago
yanghaitao1      | dd0d9fe9ab | fix ci error                                                                                       | 5 years ago
kpy              | 44c01e27c0 | do mixprecision in c++ for pynative                                                                | 5 years ago
mindspore-ci-bot | 1cfad93704 | !6479 fix bucket_batch_by_length show pyfunc timeout                                               | 5 years ago
lichenever       | d4bba3f1d2 | fix_auto_parallel_find_loss_bug                                                                    | 5 years ago