83 Commits (6fcd6cab684bd2d2ae2da5613efa3aa4e2cd7d5a)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| wangshuide2020 | 9f058af6d8 | add adapter of Erf, Expm1, Inv, etc. for graphengine. | 4 years ago |
| changzherui | 7781f135cc | modify topk adapter | 4 years ago |
| wangshuide2020 | a0a260dca1 | add adapter of Conv3D and Conv3DTranspose operators for graphengine. | 5 years ago |
| changzherui | f1543b69f2 | modify adapter tensor size limit for ma | 5 years ago |
| wangshuide2020 | 4e8bfc2862 | 1.add adapter of Mod, MaxPool3D and BCEWithLogitsLoss operators for graphengine. | 5 years ago |
| huangbingjian | 63a89925ff | remove ControlDepend and its use | 5 years ago |
| mindspore-ci-bot | dfd368a574 | !12757 Change all io_format in master | 5 years ago |
| zhoufeng | ec7f9f395a | fix cmake lint | 5 years ago |
| l00591931 | 7a683ed293 | Change all io_format in master | 5 years ago |
| changzherui | 44f8fbc219 | mod Conv2DBackpropInput adapter | 5 years ago |
| mindspore-ci-bot | 2b2964e6dd | !12925 [MS][Maskrcnn][310infer]Maskrcnn 310 infer failed in halfway | 5 years ago |
| lanzhineng | c8899e1f59 | fix 310 maskrcnn infer failed in halway | 5 years ago |
| l00591931 | 680324f225 | Change make tuple in core.ops | 5 years ago |
| l00591931 | 65cbb7b08f | Change io_format in adapter | 5 years ago |
| He Wei | 7d9a783993 | [auto-monad] Support side-effects by auto-monad | 5 years ago |
| jinyaohui | 30a27b2adb | modify Gelu, FastGelu to GeLU and FastGeLU | 5 years ago |
| mindspore-ci-bot | aebe263dce | !11895 unify mindir for different backend: the output num of optimizer ops, the backward of concat | 5 years ago |
| mindspore-ci-bot | c9aaa70b39 | !12092 Change L2Norm, merge from r1.1 to master | 5 years ago |
| mindspore-ci-bot | ad5b033cc5 | Change L2Norm, r1.1 to master | 5 years ago |
| jinyaohui | 8022f9a6ed | modify pack to stack | 5 years ago |
| wangnan39@huawei.com | cd9173fdfd | unify the output num of optimizer ops | 5 years ago |
| mindspore-ci-bot | b189f177bb | Change tuple_getitem to TupleGetItem and some other ops, merge from r1.1 to master | 5 years ago |
| l00591931 | 9ec100d069 | Change TensorAdd to Add, from r1.1 to master | 5 years ago |
| mindspore-ci-bot | 9fa0499fa0 | Change GatherV2 to Gather r1.1 to master | 5 years ago |
| xsmq | 73b7154e55 | fix cmakelint error | 5 years ago |
| yuchaojie | 611a9a3654 | unify Conv's attr pad_list | 5 years ago |
| yuchaojie | 1932d87a26 | update some op's attr name | 5 years ago |
| yuchaojie | b51b3a6764 | update Pool's attr kernel_size, pad_mode | 5 years ago |
| wangshuide2020 | d574de4c42 | add adapter of Asin, Asinh, Atan etc. operators for graphengine. | 5 years ago |
| jiangzhenguang | d3f42e7d6b | add nll_loss operation. | 5 years ago |
| mindspore-ci-bot | 142f9c2d3e | !9069 register fusion operator for lamb optimizer | 5 years ago |
| shibeiji | b2d98d2751 | add tbe fusion operators LambApplyOptimizerAssign and LambApplyWeightAssign for lamb optimizer | 5 years ago |
| mindspore-ci-bot | 9b2d79c14c | !10841 handle empty tensor in ge adapter | 5 years ago |
| zhoufeng | f107fbd51e | handle empty tensor in ge adapter | 5 years ago |
| yanghaoran | b1ee7d9926 | Synchronize latest Ascend software suite 29 Dec 2020 | 5 years ago |
| changzherui | e4b730260a | modify export air and check file_name | 5 years ago |
| shibeiji | cd850bd210 | register activation operator fast_gelu | 5 years ago |
| chenhaozhe | b3add83bf0 | support const input in graph_ir convertor, add value inference in Concat | 5 years ago |
| simson | 0c52d69ffa | fix bug int32 to int64 | 5 years ago |
| yangzhenzhang | 7780e1893e | fix int64 | 5 years ago |
| Yi Huaijie | d7faa77b5e | support int64 shape | 5 years ago |
| zhoufeng | 183742009f | mindspore cxx api for 310 inference | 5 years ago |
| liuxiao93 | 45d343257b | Add DynamicGRU. | 5 years ago |
| liangchenghui | 8093f770bf | Add ReverseV2 op for old backend. | 5 years ago |
| mindspore-ci-bot | 6e2010ae75 | !7479 Add Softplus,SoftplusGrad for old backend. | 5 years ago |
| liangchenghui | b4fda82ee5 | Add Softplus,SoftplusGrad for old backend. | 5 years ago |
| liuxiao93 | 54c96fe13b | Add DynamicRNN for old backend. | 5 years ago |
| mindspore-ci-bot | 6af5bafcce | !7318 modify convert.cc warning log | 5 years ago |
| changzherui | f48d30ba8a | modify convert.cc warning log | 5 years ago |
| liangchenghui | 8eedd68b8e | Fix MatrixSetDiag, Select, Conv2D op annotation problem. | 5 years ago |