speedup_experiment_fx2trt() — PyTorch Function Reference
Architecture documentation for the speedup_experiment_fx2trt() function in benchmarks/dynamo/common.py from the PyTorch codebase.
Entity Profile
Dependency Diagram
graph TD
    19aa986b_7c81_6518_3e84_45819cd8e90d["speedup_experiment_fx2trt()"]
    04a3a4a6_8db3_854d_a893_02c9542bf9dd["speedup_experiment()"]
    19aa986b_7c81_6518_3e84_45819cd8e90d -->|calls| 04a3a4a6_8db3_854d_a893_02c9542bf9dd
    style 19aa986b_7c81_6518_3e84_45819cd8e90d fill:#6366f1,stroke:#818cf8,color:#fff
Relationship Graph
Source Code
benchmarks/dynamo/common.py lines 842–848
def speedup_experiment_fx2trt(args, model_iter_fn, model, example_inputs):
"""
Measure speedups over eager using the TRT inference backend. The TRT backend is based on the
FX graph generated by torch._dynamo.
Writes to ./speedups_fx2trt.csv
"""
return speedup_experiment(args, model_iter_fn, model, example_inputs)
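The function itself is a thin wrapper: it delegates all measurement to speedup_experiment(). The sketch below illustrates the shape of such a speedup measurement — timing a baseline callable against an optimized one and appending the ratio to a CSV file, per the docstring's note about ./speedups_fx2trt.csv. The helper names (measure_speedup, write_speedup_csv) are hypothetical and not the real harness API in common.py:

```python
import csv
import time

def measure_speedup(baseline_fn, optimized_fn, repeats=5):
    """Return baseline_time / optimized_time, taking the best of
    several repeats of each callable. (Hypothetical helper; the real
    speedup_experiment() also handles warmup, args parsing, and
    model-specific iteration.)"""
    def best_time(fn):
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            fn()
            times.append(time.perf_counter() - start)
        return min(times)

    return best_time(baseline_fn) / best_time(optimized_fn)

def write_speedup_csv(path, name, speedup):
    # Mirrors the docstring's "Writes to ./speedups_fx2trt.csv":
    # one row per benchmarked model, appended to the CSV.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([name, f"{speedup:.4f}"])
```

A speedup greater than 1.0 means the optimized path (here, the TRT backend) ran faster than eager mode for the same workload.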
Frequently Asked Questions
What does speedup_experiment_fx2trt() do?
speedup_experiment_fx2trt() measures inference speedups over eager mode using the fx2trt (TensorRT) backend, which operates on the FX graph generated by torch._dynamo, and writes results to ./speedups_fx2trt.csv. It delegates the actual measurement to speedup_experiment().
What does speedup_experiment_fx2trt() call?
speedup_experiment_fx2trt() calls 1 function(s): speedup_experiment.