sestep
a day ago
Hey Eric, great to see you've now published this! I know we chatted about this briefly last year, but it would be awesome to see how the performance of jax-js compares against that of other autodiff tools on a broader and more standard set of benchmarks: https://github.com/gradbench/gradbench
ekzhang
a day ago
For sure! It looks like this benchmarks the autodiff CPU time rather than the actual kernels, though, which (correct me if I'm wrong) isn't really relevant for an ML library; it's more useful when you have a really complex scientific expression.
sestep
a day ago
Nope, both are measured! In fact, the time to do the autodiff transformation isn't even reflected in the charts shown on the README and the website; those charts only show the time to actually run the computations.
ekzhang
a day ago
Hm, okay, seems like an interesting set of benchmarks. Let me know if there's anything I can do to help make jax-js more compatible with your Docker setup.
sestep
a day ago
It should be fairly straightforward; feel free to open a PR following the instructions in CONTRIBUTING.md :)
ekzhang
a day ago
I don’t think this is straightforward, but it may be a skill issue on my part. It would require dockerizing headless Chrome with WebGPU support, dynamically injecting custom bundled JavaScript into the page, and then extracting the results over Chrome IPC.
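For illustration, here's a rough sketch of what that harness might look like, assuming Puppeteer as the driver (it talks to Chrome over the DevTools Protocol) and a hypothetical benchmark bundle that writes its results to a `window` global when it finishes; the exact WebGPU flags depend on the Chrome version and the GPU/driver setup inside the container:

```ts
// Hypothetical harness: launch headless Chrome with WebGPU enabled, inject a
// pre-bundled jax-js benchmark script, and pull the results back out over
// Puppeteer's DevTools Protocol connection.
import puppeteer from "puppeteer";

async function runBenchmark(bundlePath: string): Promise<unknown> {
  const browser = await puppeteer.launch({
    headless: true,
    args: [
      // Flags typically needed for WebGPU in headless/containerized Chrome;
      // the exact set varies by Chrome version and GPU driver setup.
      "--enable-unsafe-webgpu",
      "--enable-features=Vulkan",
      "--no-sandbox",
    ],
  });
  try {
    const page = await browser.newPage();
    await page.goto("about:blank");

    // Inject the pre-bundled benchmark script into the page. The bundle is
    // assumed to set window.__benchmarkResults when it completes.
    await page.addScriptTag({ path: bundlePath });

    // Wait for the (assumed) global that signals the benchmark is done.
    await page.waitForFunction("window.__benchmarkResults !== undefined", {
      timeout: 10 * 60 * 1000,
    });

    // Serialize the results back out of the page context.
    return await page.evaluate(() => (window as any).__benchmarkResults);
  } finally {
    await browser.close();
  }
}

runBenchmark("./dist/benchmark-bundle.js").then((results) =>
  console.log(JSON.stringify(results)),
);
```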
sestep
a day ago
Ahh, no, you're right; I forgot about the difficulties specific to GPU. Apologies for my overly curt earlier message. More accurately: I think this is definitely possible (Troels and I have talked a bit about this previously), and I'd be happy to work together if it's something you're interested in. I probably won't work on this if you're not interested on your end, though.