
[tmva][sofie] Add HardSigmoid operator for ONNX inference#20933

Open
AdityaDRathore wants to merge 1 commit into root-project:master from AdityaDRathore:feature/sofie-hardsigmoid

Conversation

Contributor

@AdityaDRathore AdityaDRathore commented Jan 18, 2026

This Pull Request

Add support for the ONNX HardSigmoid operator in SOFIE.

Checklist:

  • tested changes locally (TestSofieHardSigmoid passing)
  • updated the docs (if necessary)

cc: @moneta @sanjibansg @guitargeek

Collaborator

@sanjibansg sanjibansg left a comment


Hi @AdityaDRathore,
Thanks for your contribution and for implementing the HardSigmoid operator. The implementation looks good to me, but we are in the process of consolidating unary operations (such as activation functions) into the ROperator_BasicUnary.hxx file. Can you make the corresponding changes?

Thanks.


github-actions bot commented Jan 28, 2026

Test Results

    21 files      21 suites   2d 23h 37m 41s ⏱️
 3 837 tests  3 836 ✅  1 💤 0 ❌
72 256 runs  72 238 ✅ 18 💤 0 ❌

Results for commit d575330.

♻️ This comment has been updated with latest results.

@AdityaDRathore
Contributor Author

> Hi @AdityaDRathore, Thanks for your contribution and for implementing the HardSigmoid operator. The implementation looks good to me, but we are in the process of consolidating unary operations (such as activation functions) into the ROperator_BasicUnary.hxx file. Can you make the corresponding changes?
>
> Thanks.

Hi @sanjibansg, thanks for the review!

I started refactoring to move this into ROperator_BasicUnary, but I noticed a potential architectural mismatch: ROperator_BasicUnary appears to be designed specifically for stateless operators via the UnaryOpTraits pattern.

Since HardSigmoid is stateful (it must store alpha and beta attributes parsed from the ONNX node), it doesn't fit naturally into the current BasicUnary design without forcing all other simple operators (like Abs, Cos) to carry unused attribute overhead.

I also checked how other stateful unary operators are handled and saw that LeakyRelu (which also carries an alpha attribute) is implemented as a standalone operator.

Given this precedent, do you think it is safer to keep HardSigmoid separate to preserve the clean, stateless nature of BasicUnary? Or do you have a specific pattern in mind for handling attributes within the traits system?

@sanjibansg
Collaborator

> Given this precedent, do you think it is safer to keep HardSigmoid separate to preserve the clean, stateless nature of BasicUnary? Or do you have a specific pattern in mind for handling attributes within the traits system?

Ah yes! That is correct. Since this is a stateful operator, it is better to keep it separate. Can you fix the formatting issues flagged in the CI? In the meantime, I will wait for the review of @lmoneta before we can merge this.

Thanks!

@AdityaDRathore force-pushed the feature/sofie-hardsigmoid branch from 7963b08 to 0c6123d on February 1, 2026 at 17:46
Contributor Author

AdityaDRathore commented Feb 1, 2026

Hi @sanjibansg
I have pushed the formatting fixes. The implementation is now clean and ready for review.

AdityaDRathore added a commit to AdityaDRathore/SOFIE that referenced this pull request Mar 9, 2026
Port of root-project/root#20933, #20944, and #21092.
- HardSigmoid: stateful operator with alpha/beta attributes (ONNX Opset 6+)
- HardSwish: inline generation with split topology (ONNX Opset 14)
- Softplus: numerical stability fix using log1p and threshold at 20.0f
All use hexfloat constants for bit-exact reproducibility.
Collaborator

@sanjibansg sanjibansg left a comment


Hi @AdityaDRathore,
Can you try fixing the merge conflicts, and add a test following the convention used in the SOFIE test subdirectory:
https://github.com/root-project/root/tree/master/tmva/sofie/test

@AdityaDRathore force-pushed the feature/sofie-hardsigmoid branch from 0c6123d to d575330 on April 2, 2026 at 14:34
@AdityaDRathore
Contributor Author

Hi @sanjibansg,
I fixed the merge conflicts and added an end-to-end test to follow the standard SOFIE convention you mentioned.

@AdityaDRathore force-pushed the feature/sofie-hardsigmoid branch from d575330 to c56694d on April 18, 2026 at 12:56
@AdityaDRathore
Contributor Author

Hi @sanjibansg,
I've updated my addition to follow the SOFIE operator conventions and also resolved the merge conflicts. Can you please review it once again?

