
Conversation

@joecummings (Member)

Bug

Expect a batched input to produce the same outputs as the corresponding single inputs, e.g.

  1. [seq1, ..., seq_m] -> generate -> [output1, ..., output_m]
  2. [seq1] -> generate -> [output1]

Previously these two paths did not produce the same output1. The issue was that the src_key_padding_mask was not being propagated through the forward pass.

Fix

Create a padding mask from the input ids, add it to model_kwargs, and pass it through to the forward function.
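
A minimal sketch of the idea (the helper name and kwarg wiring here are illustrative assumptions, not the actual torchtext API): build a boolean mask that marks the padding positions, store it in model_kwargs, and let every forward call consume it as the src_key_padding_mask.

import torch

def _make_padding_mask(input_ids: torch.Tensor, pad_idx: int) -> torch.Tensor:
    # True where the token is padding; shape (batch_size, seq_len)
    return input_ids.eq(pad_idx)

# Hypothetical wiring: the mask rides along in model_kwargs so each
# decoding step applies the same src_key_padding_mask.
input_ids = torch.tensor([[5, 7, 2, 0, 0]])  # 0 is the pad index in this toy example
model_kwargs = {"src_key_padding_mask": _make_padding_mask(input_ids, pad_idx=0)}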


  def greedy_search(
-     self, input_ids: torch.Tensor, max_length: int, eos_idx: int, pad_idx: Optional[int] = None, **model_kwargs
+     self, input_ids: torch.Tensor, max_length: int, eos_idx: int, pad_idx: int, **model_kwargs
Contributor

Does changing pad_idx from Optional to required break any call sites?

Member Author

Nope. Only being called from the entry point method atm.


  # Append the next tokens to the previous tokens
- input_ids = torch.cat([input_ids, next_tokens], dim=-1)
+ input_ids = torch.cat([input_ids, next_tokens[:, None]], dim=-1)
Contributor

What does the [:, None] do here?

Member Author

Same thing as unsqueezing the last dim: it reshapes next_tokens from (batch_size,) to (batch_size, 1) so it can be concatenated along the sequence dimension.
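
A quick standalone check of the equivalence in plain PyTorch:

import torch

next_tokens = torch.tensor([5, 7, 2])  # shape: (batch_size,)
assert torch.equal(next_tokens[:, None], next_tokens.unsqueeze(-1))
# Both produce shape (batch_size, 1), which is what torch.cat needs to
# append one new column to input_ids of shape (batch_size, seq_len).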

# Generate from a single-sequence input and compare with the batched run above
tokens_for_single_example = generation_model.generate(inputs, num_beams=1, max_length=30)
generated_text_for_single_example = self.transform.decode(tokens_for_single_example.tolist())

self.assertEqual(generated_text[0], generated_text_for_single_example[-1])
Contributor

Why do we do generated_text_for_single_example[-1] instead of generated_text_for_single_example[0]?

Member Author

I was originally going to pass multiple sequences through the second generate call, but ended up not doing so. Both indices give the same result though: the list has a single element, so -1 == 0.
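
For a single-element list the two indices hit the same item:

outputs = ["generated text"]
assert outputs[-1] == outputs[0]  # one element, so index -1 and index 0 coincide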

@joecummings joecummings requested review from Nayef211 and rshraga March 7, 2023 18:41
Contributor

@Nayef211 Nayef211 left a comment

LGTM

@joecummings joecummings merged commit db26565 into pytorch:main Mar 7, 2023
@joecummings joecummings deleted the fix-diff-generation-batch branch March 7, 2023 19:53