mm_add_relu() — PyTorch Function Reference
Architecture documentation for the mm_add_relu() function in bench_mm_fusion.py from the PyTorch codebase.
Entity Profile
Dependency Diagram
graph TD
    6f68b04e_8de5_dbe9_8c34_ecbacc0af2f5["mm_add_relu()"]
    fe1cc36a_6b6c_badf_4954_01fbf4301526["mm()"]
    6f68b04e_8de5_dbe9_8c34_ecbacc0af2f5 -->|calls| fe1cc36a_6b6c_badf_4954_01fbf4301526
    style 6f68b04e_8de5_dbe9_8c34_ecbacc0af2f5 fill:#6366f1,stroke:#818cf8,color:#fff
Source Code
benchmarks/dynamo/microbenchmarks/bench_mm_fusion.py lines 41–44
def mm_add_relu(a, b, bias):
    y = torch.mm(a, b)
    y += bias
    return torch.relu(y)
Frequently Asked Questions
What does mm_add_relu() do?
mm_add_relu() is a microbenchmark helper in benchmarks/dynamo/microbenchmarks/bench_mm_fusion.py in the PyTorch codebase. It multiplies a and b with torch.mm, adds bias in place, and applies torch.relu — that is, it computes relu(a @ b + bias) as three separate ops, a pattern the mm-fusion benchmark can then time.
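A minimal usage sketch, reproducing the benchmark's four-line definition (the tensor shapes below are illustrative, not taken from the benchmark) and checking it against the equivalent composed expression:

```python
import torch

def mm_add_relu(a, b, bias):
    # Matrix multiply, in-place bias add, then ReLU — three separate ops.
    y = torch.mm(a, b)
    y += bias
    return torch.relu(y)

# Illustrative shapes: (4, 8) @ (8, 16) -> (4, 16), bias matches the output.
a = torch.randn(4, 8)
b = torch.randn(8, 16)
bias = torch.randn(4, 16)

out = mm_add_relu(a, b, bias)

# The same computation as one composed expression.
expected = torch.relu(a @ b + bias)
```

The result is non-negative everywhere (ReLU clamps negatives to zero) and matches the composed form exactly.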
What does mm_add_relu() call?
mm_add_relu() calls two PyTorch functions: torch.mm and torch.relu (the dependency diagram above tracks only the mm edge).