1414 Commits (2139c7ddc6beda341ef25c871ff0db5b60888787)

Author SHA1 Message Date
  mahdi a5228c75c7 Fixed 2D one-hot label problems in CutMix and MixUp 5 years ago
  Wei Luning 051b019c96 fix bug in parameter init 5 years ago
  simson 7c406fb3a0 fix risk of memory leak 5 years ago
  mindspore-ci-bot fd8ad73689 !5194 fix: padded dataset when no div and with repeat op for br:r0.7 5 years ago
  qianlong bc8aec007f fix softdvpp coredump 5 years ago
  jonyguo d262c63214 fix: padded dataset with non div & repeat 5 years ago
  mindspore-ci-bot cee889e426 !5126 Fix problem in RandomPosterize & CutMixBatch 5 years ago
  luoyang a75ac9c445 Add type check for RandomPosterize & Add Float tensor support for CutMixBatch 5 years ago
  Yi Huaijie 524cf0ed9a raise RuntimeError when set different mode after Initializer created 5 years ago
  Zichun Ye 04b5b8c737 fix bernoulli prob formula; fix some other minor bugs 5 years ago
  mindspore-ci-bot 94a109f476 !4898 Fix coredump caused by function call depth too large 5 years ago
  fary86 04524b6bd3 Fix coredump caused by function call depth too large 5 years ago
  mindspore-ci-bot ac81886328 !4916 fix generator_dataset hangs and test_graphdata_distributed.py failing randomly 5 years ago
  mindspore-ci-bot b366608a3f !4952 Fix errors in log calculation logics 5 years ago
  mindspore-ci-bot 9b503e4f38 !4955 Fixes for Dynamic Augmentation Ops 5 years ago
  peixu_ren 1c8eb9b15d Fix errors in log calculation logics 5 years ago
  Mahdi a5f9b8f92e Added fix for MixUpBatch and CutMixBatch and for RandomAffine 5 years ago
  mindspore-ci-bot 9d7250c483 !4776 Introduce 2 extra ctrl flags to DataBuffer in dataset, address remaining cmts to PR4632 5 years ago
  Zirui Wu 74c1e6da60 introducing pause and quit flags to DataBuffer 5 years ago
  mindspore-ci-bot 82c888f065 !4930 Fix CI cifar hang issue 5 years ago
  xiefangqi e3e7820413 fix cifar stuck problem 5 years ago
  mindspore-ci-bot 0d1a7ac654 !4909 [bug]support implicit type conversion for parameter 5 years ago
  mindspore-ci-bot c92f7c9170 !4911 add func type check for switch layer 5 years ago
  panyifeng abab21ed57 add func type check for switch layer 5 years ago
  mindspore-ci-bot 6e8d3a3b82 !4859 Add CTCGreedyDecoder ops for old backend. 5 years ago
  heleiwang 4870abc848 1. fix generator_dataset hangs 5 years ago
  Wei Luning 77dcdd89ec support parameter update with implicit type conversion 5 years ago
  panyifeng f9f3cd7ce0 fix switch_layer_issues 5 years ago
  mindspore-ci-bot bccb92adf7 !4868 fix: concat with none sample dataset 5 years ago
  liuxiao93 9bc18ea2e5 Add CTCGreedyDecoder ops for GE. 5 years ago
  mindspore-ci-bot 2ac410e90d !4789 Add EditDistance op for GE. 5 years ago
  mindspore-ci-bot 3d06cbf987 !4801 Must set or change parallel mode before any Initializer created 5 years ago
  jonyguo 5b4b539751 fix: concat with none sample dataset 5 years ago
  mindspore-ci-bot 2cb22a0f78 !4688 Fix bool operation parsing to be consistent with pynative 5 years ago
  liuxiao93 4c99f4f649 Add EditDistance op for GE. 5 years ago
  Yi Huaijie 89a4ebf8a1 parallel mode must be set before creating an initializer 5 years ago
  huangdongrun f30418991c refactor bool op parsing to be consistent with pynative mode 5 years ago
  mindspore-ci-bot 9ee144ea40 !4744 [AutoParallel]Support bert 5 years ago
  mindspore-ci-bot ed954dc407 !4782 Python UT fails on RandomPosterize 5 years ago
  mindspore-ci-bot 24700afa2d !4689 add check for user define bprop in Pynative mode 5 years ago
  islam_amin 85357238fe Fixing posterize python ut 5 years ago
  Jesse Lee 8a08d0c37b Phase 2 of CacheOp 5 years ago
  Zirui Wu 78c1aa1d96 Implemented Callback for Dataset 5 years ago
  buxue 855d6b8fed add check for user define bprop in Pynative mode. 5 years ago
  lichenever 221a801395 auto parallel support bert 5 years ago
  mindspore-ci-bot 70a218123b !4722 modify error type 5 years ago
  mindspore-ci-bot 256dccc651 !4498 Gnn data processing supports distributed scenarios 5 years ago
  李嘉琪 c65ea1687b modify error type 5 years ago
  mindspore-ci-bot 2d683234fb !4723 Support to concat more than 3 tensors in auto parallel mode 5 years ago
  yangzhenzhang cda08f6a52 concat 3 tensors in auto parallel mode 5 years ago