
Conversation

@yf225 (Contributor) commented Apr 1, 2020

Accompanying example code: pytorch/examples#745.

netlify bot commented Apr 1, 2020

Deploy preview for pytorch-tutorials-preview ready!

Built with commit 7a1634d

https://deploy-preview-921--pytorch-tutorials-preview.netlify.com

@yf225 yf225 force-pushed the cpp_autograd_tutorial branch from 75c48f5 to ea0a7a0 Compare April 2, 2020 03:39
@yf225 yf225 force-pushed the cpp_autograd_tutorial branch from ea0a7a0 to cf844b7 Compare April 3, 2020 00:17
@yf225 yf225 changed the title [WIP] C++ autograd tutorial C++ autograd tutorial Apr 3, 2020
@yf225 yf225 requested a review from albanD April 3, 2020 01:15
@albanD (Contributor) left a comment

Looks mostly good to me.
Just some small questions.


1 1
1 1
[ CPUFloatType{2,2} ]
Contributor:

The C++ tensor print doesn't show requires_grad and grad_fn the way the Python one does?

Contributor Author:

No, it doesn't :( We might want to fix it to be consistent with the Python tensor print. I filed an issue here: pytorch/pytorch#36004.

}

static variable_list backward(AutogradContext *ctx, variable_list grad_outputs) {
  auto saved = ctx->get_saved_variables();
Contributor:

Why is this not get_saved_tensors()?

Contributor Author:

Yeah, I think this should be get_saved_tensors(). My guess is that this API was created before the Variable/Tensor merge was completed, when the author wanted to distinguish tensors from variables. Now that the distinction is gone, I think we should change it to match the Python API: pytorch/pytorch#36099.

Contributor Author:

I've renamed all Variables to Tensors now, so the only remaining inconsistency with Python is this get_saved_variables() function :)

Contributor:

Yes, I guess it's just a remnant of the C++ Tensor/Variable merge.

----------

You should now have a good overview of PyTorch's C++ autograd API.
You can find the code examples displayed in this note (yf225 TODO update link) `here
Contributor:

nit: TODO

Contributor Author:

Will merge the examples PR and update the link here once everything else looks good on this PR :)

@albanD (Contributor) left a comment

LGTM, thanks for the updates.

@yf225 (Contributor Author) commented Apr 6, 2020

Note to self: merge pytorch/examples#745 first and update link in this PR


Done

@jlin27 jlin27 merged commit 5d1149b into pytorch:master Apr 14, 2020
rodrigo-techera pushed a commit to Experience-Monks/tutorials that referenced this pull request Nov 29, 2021