
Conversation

@krishnakalyan3 (Contributor) commented Jan 6, 2021

Changes made to:

  • test/torchaudio_unittest/librosa_compatibility_test.py
  • test/torchaudio_unittest/functional/functional_cpu_test.py

fixes: #1156

        torch.testing.assert_allclose(norm_tensor, expected_norm_tensor, atol=1e-5, rtol=1e-5)


class TestComplexNorm(common_utils.TorchaudioTestCase):

    @parameterized.expand([
@krishnakalyan3 (Contributor, Author):

I suppose this is not quite right. list(itertools.product(...)) might be the right way to go about this. However, that would require importing itertools.

@mthrok, which would be the best way to go about this?

@mthrok (Contributor):

Yeah, that's right. That's the pattern I use everywhere. itertools is a standard library, so there is no problem.
Maybe it's a good time to add a helper decorator for the @parameterized.expand(list(itertools.product(...))) pattern, but let's not bother at this moment.
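As a minimal sketch of the pattern being discussed: itertools.product builds the cross product of the parameter lists, and the resulting tuples are what @parameterized.expand would unpack into the test method's arguments. The shapes and powers below are illustrative values borrowed from the hunks in this PR, not necessarily the final ones.

```python
import itertools

# Illustrative parameter lists; the real values live in the test file.
shapes = [(1, 2, 1025, 400, 2), (1025, 400, 2)]
powers = [1, 2]

# Every (shape, power) combination, in a form parameterized.expand accepts:
#
#     @parameterized.expand(list(itertools.product(shapes, powers)))
#     def test_complex_norm(self, shape, power):
#         ...
params = list(itertools.product(shapes, powers))
```

Each element of `params` is one `(shape, power)` tuple, so two shapes and two powers yield four test cases.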

@krishnakalyan3 (Contributor, Author):
The following occurrences will eventually be substituted (pytest.mark.parametrize -> parameterized) as part of this PR:

[screenshot of the matching occurrences]

        (torch.randn(1, 2, 1025, 400, 2), 1),
        (torch.randn(1025, 400, 2), 2)
    ])
    def test_complex_norm(self, complex_tensor, power):
@mthrok (Contributor):

Performing torch.randn in the decorator makes it difficult to control the randomness, so can you parameterize the shape instead?

def test_complex_norm(self, shape, power):
    torch.manual_seed(0)
    complex_tensor = torch.randn(*shape)
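The point of the suggestion above can be sketched in isolation: seeding inside the test body makes the random input reproducible on every run, whereas torch.randn evaluated in the decorator is drawn once at import time, outside the test's control. The shape below is arbitrary, chosen only for the demonstration.

```python
import torch

# Seeding immediately before the draw makes the input deterministic.
torch.manual_seed(0)
first = torch.randn(2, 3, 2)

# Re-seeding with the same value reproduces the exact same tensor.
torch.manual_seed(0)
second = torch.randn(2, 3, 2)

assert torch.equal(first, second)  # identical draws under the same seed
```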

@krishnakalyan3 krishnakalyan3 requested a review from mthrok January 7, 2021 18:01
@mthrok mthrok left a comment

Looks good. Made some suggestions.

        complex_tensor = torch.randn(*shape)
        expected_norm_tensor = complex_tensor.pow(2).sum(-1).pow(power / 2)
        norm_tensor = F.complex_norm(complex_tensor, power)
        torch.testing.assert_allclose(norm_tensor, expected_norm_tensor, atol=1e-5, rtol=1e-5)
@mthrok (Contributor):

Can you replace torch.testing.assert_allclose with self.assertEqual?

The use of torch.testing.assert_allclose is discouraged now.
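To illustrate the requested change, here is a self-contained sketch. The TensorTestCase below is a hypothetical stand-in for torchaudio's TorchaudioTestCase, whose assertEqual (inherited from PyTorch's internal test utilities) compares tensors with tolerances; the stand-in only imitates that behavior so the example runs on plain unittest.

```python
import unittest
import torch


class TensorTestCase(unittest.TestCase):
    """Hypothetical stand-in for TorchaudioTestCase: its assertEqual
    compares tensors with tolerances instead of strict equality."""

    def assertEqual(self, actual, expected, atol=1e-5, rtol=1e-5, msg=None):
        if isinstance(actual, torch.Tensor) and isinstance(expected, torch.Tensor):
            self.assertTrue(
                torch.allclose(actual, expected, atol=atol, rtol=rtol), msg)
        else:
            super().assertEqual(actual, expected, msg=msg)


class TestComplexNorm(TensorTestCase):
    def test_complex_norm(self):
        torch.manual_seed(0)
        complex_tensor = torch.randn(4, 3, 2)
        power = 2
        expected = complex_tensor.pow(2).sum(-1).pow(power / 2)
        # Instead of torch.testing.assert_allclose(actual, expected, ...):
        self.assertEqual(complex_tensor.pow(2).sum(-1).pow(power / 2), expected)
```

With the real TorchaudioTestCase, the test body would stay the same and only the assertion call changes.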


class TestDB_to_amplitude(common_utils.TorchaudioTestCase):
    def test_DB_to_amplitude(self):
        torch.random.manual_seed(42)
@mthrok (Contributor):

Let's leave this test untouched. #1113 is modifying this test suite.

complex_stretch = complex_specgrams_stretch[index].numpy()
complex_stretch = complex_stretch[..., 0] + 1j * complex_stretch[..., 1]

assert np.allclose(complex_stretch, expected_complex_stretch, atol=1e-5)
@mthrok (Contributor):

Can you use self.assertEqual here? You might need to wrap the expected value using torch.from_numpy.
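The conversion the reviewer suggests can be sketched on its own: torch.from_numpy wraps a NumPy array as a tensor (sharing memory and keeping NumPy's float64 dtype), so both sides of the comparison become tensors. The values below are placeholders, not data from the actual test.

```python
import numpy as np
import torch

# Hypothetical expected value computed in NumPy, as in the test above.
expected = np.array([1.0, 2.0, 3.0])
actual = torch.tensor([1.0, 2.0, 3.0], dtype=torch.float64)

# torch.from_numpy wraps the array without copying; the result keeps
# NumPy's float64 dtype, so the dtypes on both sides match.
expected_tensor = torch.from_numpy(expected)
assert torch.equal(actual, expected_tensor)
```

After the wrap, self.assertEqual(actual, expected_tensor) can replace the np.allclose assertion.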

@krishnakalyan3 krishnakalyan3 requested a review from mthrok January 8, 2021 04:12
@mthrok mthrok merged commit 1e7d8d2 into pytorch:master Jan 8, 2021
@mthrok (Contributor) commented Jan 8, 2021

Thanks!

@krishnakalyan3 krishnakalyan3 deleted the pytest_param branch January 8, 2021 16:17


Development

Successfully merging this pull request may close these issues.

Replace pytest's parameterization with parameterized

3 participants