This repository was archived by the owner on Mar 21, 2024. It is now read-only.

Conversation

@dccastro (Member) commented Feb 3, 2022

No description provided.

@harshita-s harshita-s self-requested a review February 4, 2022 14:40
@harshita-s previously approved these changes Feb 4, 2022
```python
# prod and as_tensor here are torch.prod and torch.as_tensor
num_features = int(prod(as_tensor(feature_shape)).item())
# Fix weights: no fine-tuning
for param in feature_extractor.parameters():
    param.requires_grad = False
```
Contributor

This should not be set to False for fine-tuning, right?
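The reviewer's point is that the snippet hard-codes `requires_grad = False`, which silently disables fine-tuning; the freeze should instead follow a flag. A minimal, torch-free sketch of that fix, where `FakeEncoder`, `configure_encoder`, and `tune_encoder` are hypothetical stand-ins for the PR's PyTorch `nn.Module` and its fine-tuning option (in real code you would pass the actual module and use `torch.prod(torch.as_tensor(...))` as above):

```python
from dataclasses import dataclass
from math import prod


@dataclass
class FakeParam:
    """Stand-in for a torch Parameter: only the requires_grad flag matters here."""
    requires_grad: bool = True


class FakeEncoder:
    """Stand-in for a PyTorch nn.Module feature extractor (hypothetical)."""

    def __init__(self, n_params: int = 3):
        self._params = [FakeParam() for _ in range(n_params)]

    def parameters(self):
        return iter(self._params)


def configure_encoder(feature_extractor, feature_shape, tune_encoder=False):
    # The PR's int(prod(as_tensor(feature_shape)).item()) collapses the
    # feature-map shape into a flat feature count; math.prod does the same.
    num_features = int(prod(feature_shape))
    # The fix: requires_grad tracks the fine-tuning flag instead of being
    # hard-coded to False, so the encoder can still be fine-tuned.
    for param in feature_extractor.parameters():
        param.requires_grad = tune_encoder
    return num_features
```

With `tune_encoder=False` this reproduces the frozen-weights behaviour in the diff; with `tune_encoder=True` gradients flow through the encoder again.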

@harshita-s harshita-s self-requested a review February 7, 2022 13:05
@dccastro dccastro merged commit eda7635 into main Feb 7, 2022
@dccastro dccastro deleted the dacoelh/deepmil_dropout branch February 7, 2022 13:09
3 participants