i-robot
8d00a8d803
!22360 Fix Transformer Mirror Error
Merge pull request !22360 from huangxinjing/fix_transformer_mirror_error
4 years ago
huangxinjing
62496d75f3
reduce the exposed interface
4 years ago
baihuawei
a9694a9230
ascend: add non-task sink mode
4 years ago
huangxinjing
eaa8027903
Fix documentation API error
4 years ago
i-robot
20df29faba
!22257 add parallel owners yao_yf
Merge pull request !22257 from yao_yf/code_docs_add_owners
4 years ago
yao_yf
e97f2d3655
add committer yao_yf
4 years ago
ms_yan
36a8886ca2
Revert "[feat] [assistant] [I3T96T] add new Dataset operator CMUARCTICDataset"
This reverts commit b077aa1cab.
Revert "[feat] [assistant] [I3T96X] add new Dataset operator LibriSpeechDataset"
This reverts commit 4e6f7dc97d.
delete pass_registry_test.cc
comment hiai_nlu_model_multi.pb related line
4 years ago
djc
4e6f7dc97d
[feat] [assistant] [I3T96X] add new Dataset operator LibriSpeechDataset
4 years ago
huangxinjing
d777742904
1. Move the class to mindspore.parallel, support activation sharding
4 years ago
yao_yf
a83bf73298
unify auto_parallel_context interface dataset_strategy
4 years ago
yao_yf
dc7dc7d3fa
dataset strategy set
4 years ago
Xiaoda Zhang
bb5d4212f7
enable All2All in inferring redistribution ops
4 years ago
i-robot
c9d3c1d346
!20411 enable optimizer parallel for inference
Merge pull request !20411 from gziyan/enable_opt_shard_predict
4 years ago
jin-xiulang
276b1b1638
Fix issues in federated learning's secure aggregation
4 years ago
Ziyan
1c9166e0a6
remove restriction for opt shard in inference
4 years ago
ZPaC
a9a0f590e6
Fix master static check
4 years ago
Xiaoda Zhang
04381273b3
Add the sharding propagation function:
1) users configure sharding strategies for operators;
2) the framework propagates the strategies from configured ops to non-configured ops using BFS;
3) the propagation goal is to minimize redistribution communication cost.
4 years ago
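The sharding-propagation commit above describes a BFS that walks outward from user-configured operators and assigns strategies so as to minimize redistribution communication. A toy sketch of that idea is below; it is not MindSpore's actual implementation, and the graph/strategy representation is hypothetical:

```python
from collections import deque

def propagate_strategies(graph, configured):
    """Propagate sharding strategies from configured ops via BFS.

    graph: dict mapping an op name to its neighbor op names
    configured: dict mapping op name -> user-chosen strategy
    Returns a strategy for every op reachable from a configured one.
    """
    strategies = dict(configured)          # start from user-configured ops
    queue = deque(configured)              # BFS frontier
    while queue:
        op = queue.popleft()
        for nbr in graph.get(op, []):
            if nbr not in strategies:
                # Reusing the predecessor's strategy keeps the tensor
                # layout unchanged on this edge, so no redistribution
                # (i.e., zero extra communication cost) is incurred.
                strategies[nbr] = strategies[op]
                queue.append(nbr)
    return strategies
```

In the real framework the cost model is richer (per-edge redistribution costs between differing layouts), but the BFS order and the "prefer the zero-redistribution strategy" preference are the essence of the commit's description.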
i-robot
eeeabf5d07
!20064 fix_pipeline_opt_shard_master
Merge pull request !20064 from yao_yf/fix_pipeline_opt_shard_master
4 years ago
yao_yf
4aae231a8a
fix_pipeline_opt_shard
4 years ago
jin-xiulang
33de9e66d3
Fix issues of MindSpore federated learning
4 years ago
ZPaC
f15f4f3377
Add value check
4 years ago
jin-xiulang
257cc23ca2
Add secure parameters for MindSpore federated learning.
Fix prime initialization and key-building bug.
Fix reconstruct-access bug and kernels' return codes.
Fix init issue.
Add FL secure parameter cipher_time_window.
4 years ago
jin-xiulang
ebc71d3306
Add secure parameters for MindSpore federated learning.
4 years ago
ZPaC
fb76361d81
sync bug fix
4 years ago
i-robot
d33adf825b
!18475 cipher calling code
Merge pull request !18475 from ql_12345/master_merge_0617_1
4 years ago
ql_12345
5af0c890a1
cipher calling code
4 years ago
ZPaC
c053100ef6
Add hybrid script
4 years ago
Ziyan
be1f5a43d7
opt shard fit micro batch
4 years ago
ZPaC
9a8a7c215d
Sync from enterprise
4 years ago
chendongsheng
bfaab72934
add node recovery
4 years ago
lichenever
cb438ce350
rectification_log
4 years ago
Ziyan
95ac0f6d58
fix optimizer weight shard config
4 years ago
ZPaC
2ab4d78374
Add server round kernel for hybrid.
4 years ago
Xiaoda Zhang
07e1e39a82
fix some codestyle warnings
4 years ago
chendongsheng
1a72c6ac35
added scale-out and scale-in
4 years ago
ZPaC
768f6b40c3
Optimize ps doc
4 years ago
huangxinjing
e79db658e8
Fix codex for python file
4 years ago
ZPaC
19a2585ba4
Optimize server log
4 years ago
chendongsheng
2aae0b01ec
TCP: support SSL
4 years ago
mindspore-ci-bot
ee885b4e58
!15180 enable not fully use opt shard
From: @gong_zi_yan
Reviewed-by: @stsuteng
Signed-off-by: @stsuteng
4 years ago
ZPaC
a6f9814552
Add Server part 3
4 years ago
Ziyan
2a752f24bf
enable not fully use opt shard
5 years ago
Ziyan
b3eebea4de
fix shard strategy for batchnorm
5 years ago
Ziyan
27c7c8618e
fix get seed validation
5 years ago
mindspore-ci-bot
444ff97206
!13505 typo fix
From: @wudenggang
Reviewed-by: @ouwenchang,@Hanshize,@kingxian
Signed-off-by: @kingxian
5 years ago
Ziyan
6e475bea3f
add warning for parameter sharing among multiple devices
5 years ago
Ziyan
d19d42ee44
modify gradient accumulation and comm fusion API
5 years ago
wudenggang
b17a558af4
typo fix
5 years ago
liujunzhu
6541b96c40
Add communication parallel mode.
5 years ago
chendongsheng
6c22dc0d55
added worker
5 years ago