
Conversation

@nicoddemus (Member)

Given that our examples use a lot of custom marks, they end up generating a lot of warnings which are only noise in the documentation.

nicoddemus added 2 commits May 8, 2019 21:46
A lot of our examples use custom markers to make a point and showcase
features, which generates a lot of warnings
@codecov

codecov bot commented May 8, 2019

Codecov Report

Merging #5234 into features will not change coverage.
The diff coverage is n/a.


@@            Coverage Diff            @@
##           features    #5234   +/-   ##
=========================================
  Coverage     96.14%   96.14%           
=========================================
  Files           115      115           
  Lines         26125    26125           
  Branches       2577     2577           
=========================================
  Hits          25118    25118           
  Misses          706      706           
  Partials        301      301

Last update 5eeb5ee...5d76869.

@blueyed (Contributor)

blueyed commented May 8, 2019

> which are only noise in the documentation.

Where do they show up? Or would they only show up without this fix (i.e. when running regendoc without the patch)?

@nicoddemus (Member, Author)

They show up in the docs themselves, when we run pytest and capture its output to display in the documentation.

Here's an example `git diff` when we run `tox -e regendoc` on `features`:

diff --git a/doc/en/example/simple.rst b/doc/en/example/simple.rst
index ff422c585..5509612a8 100644
--- a/doc/en/example/simple.rst
+++ b/doc/en/example/simple.rst
@@ -194,10 +194,16 @@ and when running it will see a skipped "slow" test:
     collected 2 items

     test_module.py .s                                                    [100%]
+
+    ============================= warnings summary =============================
+    $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321
+      $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
+        PytestUnknownMarkWarning,
+
+    -- Docs: https://docs.pytest.org/en/latest/warnings.html
     ========================= short test summary info ==========================
     SKIPPED [1] test_module.py:8: need --runslow option to run
-
-    =================== 1 passed, 1 skipped in 0.12 seconds ====================
+    ============= 1 passed, 1 skipped, 1 warnings in 0.12 seconds ==============

 Or run it including the ``slow`` marked test:

@@ -212,7 +218,13 @@ Or run it including the ``slow`` marked test:

     test_module.py ..                                                    [100%]

-    ========================= 2 passed in 0.12 seconds =========================
+    ============================= warnings summary =============================
+    $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321
+      $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321: PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
+        PytestUnknownMarkWarning,
+
+    -- Docs: https://docs.pytest.org/en/latest/warnings.html
+    =================== 2 passed, 1 warnings in 0.12 seconds ===================

 Writing well integrated assertion helpers
 --------------------------------------------------
@@ -524,10 +536,16 @@ If we run this:
     E       assert 0

     test_step.py:11: AssertionError
+    ============================= warnings summary =============================
+    $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321
+      $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321: PytestUnknownMarkWarning: Unknown pytest.mark.incremental - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
+        PytestUnknownMarkWarning,
+
+    -- Docs: https://docs.pytest.org/en/latest/warnings.html
     ========================= short test summary info ==========================
     XFAIL test_step.py::TestUserHandling::test_deletion
       reason: previous test failed (test_modification)
-    ============== 1 failed, 2 passed, 1 xfailed in 0.12 seconds ===============
+    ======== 1 failed, 2 passed, 1 xfailed, 1 warnings in 0.12 seconds =========

 We'll see that ``test_deletion`` was not executed because ``test_modification``
 failed.  It is reported as an "expected failure".
@@ -640,7 +658,13 @@ We can run this:
     E       assert 0

     a/test_db2.py:2: AssertionError
-    ========== 3 failed, 2 passed, 1 xfailed, 1 error in 0.12 seconds ==========
+    ============================= warnings summary =============================
+    $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321
+      $PYTHON_PREFIX/lib/python3.6/site-packages/_pytest/mark/structures.py:321: PytestUnknownMarkWarning: Unknown pytest.mark.incremental - is this a typo?  You can register custom marks to avoid this warning - for details, see https://docs.pytest.org/en/latest/mark.html
+        PytestUnknownMarkWarning,
+
+    -- Docs: https://docs.pytest.org/en/latest/warnings.html
+    ==== 3 failed, 2 passed, 1 xfailed, 1 warnings, 1 error in 0.12 seconds ====

 The two test modules in the ``a`` directory see the same ``db`` fixture instance
 while the one test in the sister-directory ``b`` doesn't see it.  We could of course
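The warnings above go away once the custom marks are registered, which is what this PR does for the docs examples. A minimal sketch of that registration using pytest's documented `pytest_configure` hook and `config.addinivalue_line` (the mark names `slow` and `incremental` come from the examples above; the description strings here are illustrative, not the PR's exact text):

```python
# conftest.py (sketch) -- register the custom marks used in the docs
# examples so pytest no longer emits PytestUnknownMarkWarning for them.
# Mark names are taken from the warnings shown in the diff; the
# descriptions are illustrative.


def pytest_configure(config):
    config.addinivalue_line(
        "markers", "slow: mark test as slow to run (skipped without --runslow)"
    )
    config.addinivalue_line(
        "markers", "incremental: mark a test class for incremental testing"
    )
```

Registered marks show up in `pytest --markers` output and no longer count toward the `warnings summary` section of the captured docs output.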

@nicoddemus nicoddemus merged commit 0bd02cd into pytest-dev:features May 9, 2019
@nicoddemus nicoddemus deleted the marks-regen branch May 9, 2019 14:03