Library for Jacobian descent with PyTorch. It enables optimization of neural networks with multiple losses (e.g. multi-task learning).
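For intuition, here is a minimal sketch of the Jacobian descent idea in plain PyTorch: each loss contributes one gradient row of a Jacobian, and the rows are aggregated into a single parameter update. This is an illustration only, not torchjd's actual API, and the plain mean used as the aggregator is just a placeholder for the smarter aggregators such a library provides.

```python
# Minimal Jacobian descent sketch in plain PyTorch (illustration only;
# does not use torchjd's API). One gradient row per loss, then aggregate.
import torch

model = torch.nn.Linear(10, 2)           # shared parameters, two task outputs
x = torch.randn(32, 10)
target = torch.randn(32, 2)

params = list(model.parameters())
optimizer = torch.optim.SGD(params, lr=0.01)

out = model(x)
losses = [                               # one loss per task
    torch.nn.functional.mse_loss(out[:, 0], target[:, 0]),
    torch.nn.functional.mse_loss(out[:, 1], target[:, 1]),
]

# Build the Jacobian: one flattened gradient row per loss.
rows = []
for loss in losses:
    grads = torch.autograd.grad(loss, params, retain_graph=True)
    rows.append(torch.cat([g.flatten() for g in grads]))
jacobian = torch.stack(rows)             # shape: (num_losses, num_params)

# Aggregate the rows into a single update direction (placeholder: mean).
update = jacobian.mean(dim=0)

# Write the aggregated direction into .grad and take an optimizer step.
offset = 0
for p in params:
    n = p.numel()
    p.grad = update[offset:offset + n].view_as(p)
    offset += n
optimizer.step()
```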
Optimization of simple functions with JD, with plots of the trajectories.
Simple C++ Inverse Kinematics library that takes axes, lengths, and a goal position and solves the inverse kinematics using Jacobian Gradient Descent.
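As a rough illustration of the technique (written in Python rather than the library's C++, and using a hypothetical planar 2-link arm), one Jacobian-transpose step moves the joint angles in the direction that reduces the end-effector error:

```python
# Hypothetical Jacobian-transpose IK step for a planar 2-link arm
# (sketch of the general technique, not the C++ library's code).
import numpy as np

def end_effector(thetas, lengths):
    """Forward kinematics: position of the end effector."""
    angles = np.cumsum(thetas)
    return np.array([np.sum(lengths * np.cos(angles)),
                     np.sum(lengths * np.sin(angles))])

def jacobian(thetas, lengths):
    """2 x n Jacobian of the end-effector position w.r.t. joint angles."""
    angles = np.cumsum(thetas)
    J = np.zeros((2, len(thetas)))
    for i in range(len(thetas)):
        # Joint i affects every link from i onward.
        J[0, i] = -np.sum(lengths[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(lengths[i:] * np.cos(angles[i:]))
    return J

def ik_step(thetas, lengths, goal, step=0.1):
    """One Jacobian-transpose gradient-descent step toward the goal."""
    error = goal - end_effector(thetas, lengths)
    return thetas + step * jacobian(thetas, lengths).T @ error

thetas = np.array([0.3, 0.5])
lengths = np.array([1.0, 1.0])
goal = np.array([1.2, 0.8])
for _ in range(200):
    thetas = ik_step(thetas, lengths, goal)
```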
This repo is used to compare different torchjd functions (e.g. aggregators), potentially from different commits of torchjd, on a variety of aspects (computation time, precision, etc.).
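A comparison of that kind might look roughly like the following timing harness; the two aggregator functions below are hypothetical stand-ins (functions mapping a Jacobian to a single update vector), not torchjd's implementations.

```python
# Hypothetical timing harness for comparing aggregation strategies
# (illustrative only; not the repo's actual tooling).
import timeit
import torch

def mean_aggregator(jacobian: torch.Tensor) -> torch.Tensor:
    return jacobian.mean(dim=0)

def norm_weighted_aggregator(jacobian: torch.Tensor) -> torch.Tensor:
    # Weight rows inversely to their norm so no single task dominates.
    weights = 1.0 / (jacobian.norm(dim=1, keepdim=True) + 1e-12)
    return (weights * jacobian).mean(dim=0)

jacobian = torch.randn(8, 100_000)   # 8 losses, 100k parameters
for name, agg in [("mean", mean_aggregator),
                  ("norm-weighted", norm_weighted_aggregator)]:
    t = timeit.timeit(lambda: agg(jacobian), number=100)
    print(f"{name}: {t / 100 * 1e3:.3f} ms per call")
```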