[PaddleSpeech] Add OPs and others needed by fastspeech_2 model (#9716)
* [X86] Add set_value OP and double data type to the framework. (#9580)
* [QualcommQnn] Add OPs (#9538): support fusion_elementwise_mul_activation, fusion_elementwise_sub_activation, fusion_elementwise_div_activation, fusion_elementwise_min_activation, fusion_elementwise_max_activation, fusion_elementwise_pow_activation, instance_norm, prelu, arg_max, arg_min, flatten, flatten2, norm
* Add float64 type to Lite
* Add float64 kernel for set_value
* Change the third-party-libs URL due to the flatbuffers update
* Fix include file conflicts
* Fix bug
* Fix heterogeneous execution errors
* Fix control_flow_op_shared_inputs_and_outputs_place_sync_pass bug
* Fix comment

Co-authored-by: zhupengyang <[email protected]>

* [PaddleSpeech] Add OPs and others needed by fastspeech_2 model (#9706)
* [Host] Add 3 OPs: set_value, round, share_data test=develop
* [Host] Add expand_v2 OP registration with type kBool test=develop
* [Arm] Add reduce_sum OP int64 registration and Neon implementation; add reduce_max OP kInt32 registration test=develop
* [X86] Fix bug in set_value OP test=develop
* [Extra] Move the 2 OPs round and share_data to extra test=develop
* [proto] Fix a bug test=develop

Co-authored-by: csy0225 <[email protected]>
Co-authored-by: zhupengyang <[email protected]>
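As a rough illustration of why the set_value OP and its float64 kernel matter for this model (a sketch based on general PaddlePaddle behavior, not taken from this commit): in Paddle, in-place slice assignment on a tensor is typically lowered to a set_value OP when the program is exported for inference, so Lite needs matching kernels to run such models.

```python
import paddle

# Sketch (assumption, not from the commit): slice assignment on a tensor
# is what usually produces a set_value OP in the exported inference program.
x = paddle.zeros([2, 3], dtype='float64')  # float64 exercises the new kernel
x[0, :] = 1.0                              # becomes a set_value OP on export
print(x)
```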