1293 Commits (e52d85ba9d5a4d17d9fde72f8a7931041f28138c)

Author SHA1 Message Date
  islam_amin e52d85ba9d Added RandomAffine Op 5 years ago
  mindspore-ci-bot 9ad82f79fd !3979 Adding MixUp operation 5 years ago
  mindspore-ci-bot e195fa6f9e !4192 Fix md five_crop ci problem 5 years ago
  xiefangqi 3d7826b6df test five crop problem 5 years ago
  mindspore-ci-bot fe0b8d6272 !4068 Add parallel operator for Concat 5 years ago
  Mahdi 1896950ae5 Added Mixup 5 years ago
  peixu_ren 517fed55a5 Added LGamma op 5 years ago
  mindspore-ci-bot e19d382473 !3346 Maintain epoch/repeat count for ops 5 years ago
  yangzhenzhang f4bb43bbaf add concat op 5 years ago
  panyifeng 34e50e5d6e fix cell bprop 5 years ago
  Lixia Chen ac85b77b76 Maintain epoch/repeat count for ops 5 years ago
  mindspore-ci-bot 470328eeaf !4060 fix do concat in while loop specialize error 5 years ago
  Wei Luning d4d6457ea7 init parameter data by default; only keep no data as MetaTensor in auto parallel mode 5 years ago
  fary86 7602054acd Fix do concat in while loop specialize error 5 years ago
  kingfo 28dabf0332 fix grad flag update issue in pynative 5 years ago
  mindspore-ci-bot 0b407dfe78 !4005 unsortedsegsum grad 5 years ago
  mindspore-ci-bot c493859978 !4004 add SquareSumAll grad 5 years ago
  mindspore-ci-bot 235378d5d7 !3789 eager mode sparse 5 years ago
  mindspore-ci-bot 7ec0b5857a !3977 [MD] Refactor Concatenate Op 5 years ago
  nhussain 61769b2dd9 fix concat 5 years ago
  fangzehua 99f2be7064 unsortedsegsum grad 5 years ago
  fangzehua 7011508379 add SquareSumAll grad 5 years ago
  panyifeng 9927e6eb5c eager mode sparse 5 years ago
  mindspore-ci-bot 8dec74908a !4000 improve interface '__bool__' for tensor 5 years ago
  mindspore-ci-bot 617b98f104 !3966 [AutoParallel] Add dropout distributed op 5 years ago
  buxue ace34525cd improve interface '__bool__' for tensor 5 years ago
  lichenever bfc96de1b9 add dropout distributed op 5 years ago
  peixu_ren b9c8c0b439 Added Bijector and TransformDistribution base classes and two instances: power and exp bijectors 5 years ago
  mindspore-ci-bot a301fc1757 !3916 remove loss_scale range check which is a temp fix. 5 years ago
  mindspore-ci-bot 5827ba9a8a !3940 support interface 'all' and 'any' of tensor 5 years ago
  buxue 2c4cb49a11 support interface 'all' and 'any' of tensor 5 years ago
  zhousiyi 7d31deb6fa remove loss_scale range check to make FP32Imm(inf) comparison equal 5 years ago
  mindspore-ci-bot 8b396cea98 !3915 Revert modification of opt 5 years ago
  mindspore-ci-bot bac1781539 !3648 Add some test cases for mindspore.profiler 5 years ago
  mindspore-ci-bot c3edfe12bc !3715 Update python pattern expression 5 years ago
  mindspore-ci-bot af9398b39a !3756 Change distribution api 5 years ago
  mindspore-ci-bot 2da29bce66 !3370 same graph used in different contexts should be treated differently. 5 years ago
  simson 3617121ccf revert modification of opt 5 years ago
  mindspore-ci-bot aa65cbf733 !3846 Add TBE op SquaredDifference/Xdivy/Xlogy for VM. 5 years ago
  mindspore-ci-bot 27a0a2e333 !3880 fix auto mixed precision issue in pynative 5 years ago
  mindspore-ci-bot a182a87a6d !3820 support tensor attr shape and dtype in graph mode 5 years ago
  liuxiao93 374b7b8583 Add TBE op SquaredDifference for VM. 5 years ago
  kingfo 577535b387 fix gradients issue in pynative 5 years ago
  buxue b075674cf2 support tensor attr shape and dtype in graph mode 5 years ago
  mindspore-ci-bot be507b96e0 !3473 [refine] change base class of parameter 5 years ago
  zhousiyi f926650c64 If the AbstractFunction comparison succeeds in NewContext, the evaluator should be the same one; otherwise one of the evaluators will not be evaluated. 5 years ago
  BowenK 6d4c07c886 Update python pattern expression 5 years ago
  mindspore-ci-bot 25fdb67e3d !3744 [Auto parallel] Add a new graph operation in the DP algorithm 5 years ago
  Wei Luning a05c38bb63 make python Parameter inherit from Tensor 5 years ago
  mindspore-ci-bot d1b3096418 !3837 add_test_for_mindrecord 3 5 years ago