Rework scalar multiplication and division #289

Open
jwallwork23 wants to merge 3 commits into main
Conversation

jwallwork23 (Contributor) commented:
Towards #158.
(Refined from #286.)

This PR revises the scalar multiplication/division approach, as described in #286:

I spent far too much time trying to get gradients to be computed correctly for scalar multiplication and division as currently implemented. However, I discovered that everything works nicely if we simply define scalars as rank-1 tensors holding a single value. The revised approach to scalar multiplication/division adds a bit of boilerplate, but this could perhaps be reduced by introducing a torch_scalar class (see #285 for discussion).
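For context, the idea can be sketched in FTorch-style Fortran. The sketch below is illustrative rather than code from this PR: `torch_tensor_from_array` and `torch_kCPU` follow FTorch's public API, while the overloaded `*` and `/` operators and tensor assignment reflect the autograd work this PR builds on; the exact interfaces touched here may differ.

```fortran
program scalar_as_tensor_sketch
  ! Illustrative sketch only: a "scalar" is represented as a rank-1
  ! tensor wrapping a single-element array, so the overloaded
  ! operators see tensors on both sides and autograd can track it.
  use ftorch
  implicit none

  type(torch_tensor) :: multiplier, x, y
  real, target :: scalar_data(1) = [3.0]
  real, target :: x_data(2, 3)
  integer :: scalar_layout(1) = [1]
  integer :: x_layout(2) = [1, 2]

  x_data(:, :) = 2.0

  ! Wrap the scalar value in a rank-1, single-element tensor instead
  ! of passing a bare Fortran real to the operator overloads.
  call torch_tensor_from_array(multiplier, scalar_data, scalar_layout, torch_kCPU)
  call torch_tensor_from_array(x, x_data, x_layout, torch_kCPU)

  ! Scalar multiplication and division via the overloaded operators.
  y = multiplier * x
  y = y / multiplier

end program scalar_as_tensor_sketch
```

Defining scalars this way means the operator overloads only ever handle tensor-tensor cases, which is consistent with the removal of the now-unneeded fypp sections described below.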

There are lots of deletions because several fypp sections are no longer needed.

The PR also adds information on this approach to both the autograd documentation page and the autograd example.

@jwallwork23 added the enhancement (New feature or request) and autograd (Tasks towards the online training / automatic differentiation feature) labels on Feb 18, 2025
@jwallwork23 jwallwork23 self-assigned this Feb 18, 2025