[PaddleSpeech] Add OPs and others needed by fastspeech_2 model (#9716)
* [X86] Add set value op and double data type to framework. (#9580)

* [QualcommQnn] add ops (#9538)

support fusion_elementwise_mul_activation, fusion_elementwise_sub_activation, fusion_elementwise_div_activation, fusion_elementwise_min_activation, fusion_elementwise_max_activation, fusion_elementwise_pow_activation, instance_norm, prelu, arg_max, arg_min, flatten, flatten2, norm
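
For readers unfamiliar with the fused ops listed above, the sketch below shows the shape of the computation: an elementwise binary op and its trailing activation executed in a single pass, so the intermediate result never hits memory. The mul + relu pairing and all names are illustrative only, not taken from the QNN backend.

// Sketch of fusion_elementwise_mul_activation-style fusion: elementwise mul
// and relu applied in one loop over equal-shaped 1-D buffers. The real ops
// also handle broadcasting and other activation types.
#include <algorithm>
#include <cstdio>
#include <vector>

std::vector<float> fused_mul_relu(const std::vector<float>& a,
                                  const std::vector<float>& b) {
  std::vector<float> out(a.size());
  for (size_t i = 0; i < a.size(); ++i) {
    out[i] = std::max(a[i] * b[i], 0.0f);  // mul and relu in one pass
  }
  return out;
}

int main() {
  auto y = fused_mul_relu({1.f, -2.f, 3.f}, {4.f, 5.f, -6.f});
  for (float v : y) std::printf("%g ", v);  // prints: 4 0 0
  std::printf("\n");
  return 0;
}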

* add float64 type to lite

* add float64 kernel for set value
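
A minimal sketch of the slice assignment a set_value-style kernel performs, templated so the same code instantiates for float and the newly added float64 (double). It is simplified to a 1-D strided slice and is not the actual Lite kernel, which also handles multi-axis slices and tensor-valued inputs.

// Illustrative set_value-style slice assignment: write a scalar into a
// strided 1-D slice of a dense buffer.
#include <cstdio>
#include <vector>

template <typename T>
void set_value_1d(std::vector<T>& data, int start, int end, int step, T value) {
  for (int i = start; i < end; i += step) data[i] = value;
}

int main() {
  std::vector<double> x(8, 0.0);   // double buffer, i.e. the float64 case
  set_value_1d(x, 2, 6, 2, 1.5);   // roughly x[2:6:2] = 1.5
  for (double v : x) std::printf("%g ", v);  // prints: 0 0 1.5 0 1.5 0 0 0
  std::printf("\n");
  return 0;
}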

* change the third-party-libs url due to the FlatBuffers update.

* fix include files conflict

* fix bug

* Fix heterogeneous execution errors

* fix control_flow_op_shared_inputs_and_outputs_place_sync_pass bug

* fix comment

Co-authored-by: zhupengyang <[email protected]>

* [PaddleSpeech] Add OPs and others needed by fastspeech_2 model (#9706)

* [Host] add 3 OPs: set_value, round, share_data
test=develop

* [Host] add expand_v2 OP registration with type kBool
test=develop

* [Arm] add reduce_sum OP kInt64 registration and NEON implementation & add reduce_max OP kInt32 registration
test=develop
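
A standalone sketch of an int64 NEON reduction in the spirit of that kernel: two int64 lanes accumulated with vaddq_s64, plus a scalar tail. This is illustrative only, not the code added in this commit.

// Sum an int64 buffer two lanes at a time with NEON intrinsics.
#include <arm_neon.h>
#include <cstdint>
#include <cstdio>

int64_t reduce_sum_i64(const int64_t* x, int n) {
  int64x2_t acc = vdupq_n_s64(0);
  int i = 0;
  for (; i + 2 <= n; i += 2) {
    acc = vaddq_s64(acc, vld1q_s64(x + i));  // accumulate two int64 lanes
  }
  int64_t sum = vgetq_lane_s64(acc, 0) + vgetq_lane_s64(acc, 1);
  for (; i < n; ++i) sum += x[i];            // scalar tail for odd lengths
  return sum;
}

int main() {
  int64_t data[5] = {1, 2, 3, 4, 5};
  std::printf("%lld\n", (long long)reduce_sum_i64(data, 5));  // prints: 15
  return 0;
}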

* [X86] fix bug in set_value OP
test=develop

* [Extra] move 2 OPs (round and share_data) to extra
test=develop

* [proto] fix a bug
test=develop

Co-authored-by: csy0225 <[email protected]>
Co-authored-by: zhupengyang <[email protected]>
Authored by 3 people on Nov 22, 2022
1 parent f294964 commit ba943a9
Showing 54 changed files with 2,798 additions and 79 deletions.
1 change: 1 addition & 0 deletions lite/api/_paddle_use_ops.h
@@ -131,3 +131,4 @@ USE_LITE_OP(box_clip)
 USE_LITE_OP(assign_value)
 USE_LITE_OP(hard_sigmoid)
 USE_LITE_OP(rsqrt)
+USE_LITE_OP(set_value)
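
The added line pulls set_value into statically linked binaries. USE_LITE_OP-style macros generally rely on a "touch" symbol defined next to the op's static registration, so that referencing it from _paddle_use_ops.h stops the linker from discarding the op's object file. The sketch below shows that general pattern with invented names (OpRegistry, USE_OP_SKETCH, touch_op_set_value); it is not Paddle Lite's actual macro.

// General "touch symbol" registration pattern, compressed into one file.
#include <functional>
#include <map>
#include <string>

struct OpRegistry {
  static std::map<std::string, std::function<void()>>& Table() {
    static std::map<std::string, std::function<void()>> table;
    return table;
  }
};

// Normally lives in the op's own translation unit: register at static-init
// time and expose a "touch" function whose only job is to be referenced.
[[maybe_unused]] static bool set_value_registered = [] {
  OpRegistry::Table()["set_value"] = [] { /* construct the op here */ };
  return true;
}();
int touch_op_set_value() { return 0; }

// Normally lives in a header like _paddle_use_ops.h: referencing the touch
// symbol forces the linker to keep the op's object file, and with it the
// static registration above, when the library is linked statically.
#define USE_OP_SKETCH(op)                                   \
  extern int touch_op_##op();                               \
  [[maybe_unused]] static int use_op_##op = touch_op_##op();

USE_OP_SKETCH(set_value)

int main() { return OpRegistry::Table().count("set_value") ? 0 : 1; }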
