11 Commits (dfd894c19093e3fa878ff2b46e209fe1c1dbdc22)

Author SHA1 Message Date
  zengzitao 3ef0e9f053 substitute dropout by cudnnuniformreal and dropout 5 years ago
  mindspore-ci-bot 232dff3598 !8685 [GraphKernel] For fp16 value, declare fp32 first and then cast to fp16 in expander 5 years ago
  tronzhang 80f071e9fa declare fp32 and then cast to fp16 in expander 5 years ago
  zengzitao 266bfa50bf expand logsoftmax and logsoftmax_grad, delete softmax's cast and fix layernorm op 5 years ago
  zengzitao 326540cbbd expand layernorm_grad op 5 years ago
  zengzitao 28f1db74dd expand maximum_grad minimum_grad dropout_grad op 5 years ago
  zengzitao db27783d54 expand tanh_grad and reduce_mean, fix bug and add test_case in ci 5 years ago
  zengzitao 53043ae18f support expand fused_adam and fused_adam_weight_decay op 5 years ago
  zengzitao 5cfa172720 expand gelu and gelugrad op 5 years ago
  zengzitao febdb1850c expand bias_add and bias_add_grad op 5 years ago
  dayschan 37a48f6aac GraphKernel supports GPU 5 years ago