Trying to execute mutmut run I get KeyError #342

Open
udelledo opened this issue Nov 6, 2024 · 7 comments

udelledo commented Nov 6, 2024

I'm running on Python 3.9 with mutmut 3.2.0.

Steps taken:

  1. Created a sample project with PyScaffold
  2. Added mutmut to the venv in the project
  3. Ran `mutmut run` and got the following stack trace:
❯ mutmut run
generating mutants
    done in 35ms
⠦ running stats
    done
⠼ running clean tests
    done
⠸ running forced fail test
    done
Traceback (most recent call last):
  File "/home/my_user/mutmut_issue/venv/bin/mutmut", line 8, in <module>
    sys.exit(cli())
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/mutmut/__main__.py", line 1202, in run
    mutants = sorted(mutants, key=lambda x: estimated_worst_case_time(x[1]))
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/mutmut/__main__.py", line 1202, in <lambda>
    mutants = sorted(mutants, key=lambda x: estimated_worst_case_time(x[1]))
  File "/home/my_user/mutmut_issue/venv/lib/python3.9/site-packages/mutmut/__main__.py", line 1091, in estimated_worst_case_time
    return sum(mutmut.duration_by_test[t] for t in tests)
  File "/home/nsbuild/mutmut_issue/venv/lib/python3.9/site-packages/mutmut/__main__.py", line 1091, in <genexpr>
    return sum(mutmut.duration_by_test[t] for t in tests)
KeyError: 'tests/test_skeleton.py::test_main'

Not sure if it's a matter of Python version or of test structure.
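For context, the line that raises (shown in the traceback above) sums per-test durations from a dict keyed by test id. Below is a minimal reconstruction with made-up data, just to show how a test id without a recorded duration produces this KeyError; the names mirror the traceback, and the .get() fallback is only an illustration, not mutmut's actual fix:

    # Simplified stand-in for mutmut's duration bookkeeping; data is made up.
    duration_by_test = {
        "tests/test_skeleton.py::test_other": 0.01,
        # no entry for 'tests/test_skeleton.py::test_main'
    }
    tests = ["tests/test_skeleton.py::test_main"]

    # Shape of the failing expression from __main__.py line 1091:
    #     sum(duration_by_test[t] for t in tests)   # -> KeyError
    # A defensive variant would treat an unrecorded test as zero seconds:
    estimated = sum(duration_by_test.get(t, 0) for t in tests)
    print(estimated)  # 0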

boxed (Owner) commented Nov 7, 2024

Could you create a project like this, put it up on GitHub, and send the link?

udelledo (Author) commented Nov 7, 2024


nedbat commented Nov 13, 2024

BTW, I get the same issue with https://github.com/nedbat/templite. With Python 3.10, install pytest and mutmut, then:

% python -V
Python 3.10.15
% pip install pytest mutmut
% pip install -e .
% mutmut run
generating mutants
    done in 231ms
⠋ running stats
    done
⠙ running clean tests
    done
⠼ running forced fail test
    done
Traceback (most recent call last):
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/bin/mutmut", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/mutmut/__main__.py", line 1202, in run
    mutants = sorted(mutants, key=lambda x: estimated_worst_case_time(x[1]))
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/mutmut/__main__.py", line 1202, in <lambda>
    mutants = sorted(mutants, key=lambda x: estimated_worst_case_time(x[1]))
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/mutmut/__main__.py", line 1091, in estimated_worst_case_time
    return sum(mutmut.duration_by_test[t] for t in tests)
  File "/usr/local/virtualenvs/tmp-7bde11ae9ca1d4c/lib/python3.10/site-packages/mutmut/__main__.py", line 1091, in <genexpr>
    return sum(mutmut.duration_by_test[t] for t in tests)
KeyError: 'tests/test_templite.py::TempliteTest::test_bad_nesting'

boxed (Owner) commented Nov 13, 2024

I made a new release.

There were a bunch of fixes on main that made at least the templite project run properly. Thanks, Ned, for a nice and clean reproduction scenario.

@udelledo please try 3.2.1 now
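If it helps, a plain upgrade inside the project's venv should pick up the new release (the exact pin is up to you):

    pip install --upgrade "mutmut==3.2.1"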


nedbat commented Nov 13, 2024

Thanks, and templite itself seems good:

% mutmut run
⠼ Generating mutants
    done in 1ms
⠋ Listing all tests
Found 26 new tests, rerunning stats collection
⠙ Running stats
    done
⠹ Running clean tests
    done
⠴ Running forced fail test
    done
Running mutation testing
⠋ 214/214  🎉 214 🫥 0  ⏰ 0  🤔 0  🙁 0  🔇 0
34.97 mutations/second

udelledo (Author) commented Nov 14, 2024

Awesome, version 3.2.1 didn't show the error anymore, but I'm still facing a problem.
This time it is related to the interaction with another test library I have in my project:

ApprovalTests

Running `mutmut run` I get this output:
❯ mutmut run
⠹ Generating mutants
    done in 790ms
⠇ Running stats     
    done
⠧ Running clean testsF
============================================================================================================ FAILURES =============================================================================================================
___________________________________________________________________________________________ TestTeamCityConfig.test_save_configuration ____________________________________________________________________________________________

self = <tests.ezconfig_test.TestTeamCityConfig object at 0x7f20f970a370>, args = (), kwargs = {}, config_file = <tempfile._TemporaryFileWrapper object at 0x7f20f9638160>

    @patch("getpass.getpass", lambda x: "my_test_token")
    @patch("builtins.input", lambda *args: "")
    @patch("requests.Session.get", lambda url, x: MagicMock())
    def test_save_configuration(self, *args, **kwargs):
        with tempfile.NamedTemporaryFile() as config_file:
            sut = TeamCityConfig(config_file.name)
            sut.populate_from_user_input()
            sut.configure_token()
>           verify(
                "\n".join(
                    [f"{field.name}={sut.get(field.name)}" for field in sut.fieldList]
                ),
                options=Options().with_scrubber(DateScrubber(r"\d{8}T\d{6}").scrub),
            )
E           approvaltests.approval_exception.ApprovalException: We noticed that you called verify more than once in the same test. 
E           This is the second call to verify:
E               approved_file: /webdev/teamcity-lib/tests/approved_files/TestTeamCityConfig.test_save_configuration.approved.txt
E           
E           By default, ApprovalTests only allows one verify() call per test.
E           To find out more, visit: 
E           https://github.com/approvals/ApprovalTests.Python/blob/main/docs/how_to/multiple_approvals_per_test.md
E           
E           # Possible Fixes
E           1. Separate your test into two tests
E           2. In your verify call, add `options=NamerFactory.with_parameters("your_paramater")`
E           3. In your test, call `approvals.settings().allow_multiple_verify_calls_for_this_method()`

pytest itself runs the tests without issues:
❯ pytest tests
======================================================================================================= test session starts =======================================================================================================
platform linux -- Python 3.9.18, pytest-8.3.3, pluggy-1.5.0
rootdir: /webdev/teamcity-lib
configfile: pyproject.toml
plugins: approvaltests-14.1.0
collected 27 items                                                                                                                                                                                                                

tests/ezconfig_test.py ..s....                                                                                                                                                                                              [ 25%]
tests/teamcity_test.py ....s......s........                                                                                                                                                                                 [100%]

======================================================================================================== warnings summary =========================================================================================================
tests/ezconfig_test.py::EzConfigTests::test_is_complete
  /webdev/teamcity-lib/ns-teamcity-lib-venv/lib64/python3.9/site-packages/_pytest/unraisableexception.py:85: PytestUnraisableExceptionWarning: Exception ignored in: <function _TemporaryFileCloser.__del__ at 0x7f204a735e50>
  
  Traceback (most recent call last):
    File "/usr/lib64/python3.9/tempfile.py", line 461, in __del__
      self.close()
    File "/usr/lib64/python3.9/tempfile.py", line 457, in close
      unlink(self.name)
  FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmpng0kucgh'
  
    warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================================================================================ 24 passed, 3 skipped, 1 warning in 0.47s =============================================================================================

In my regular test suite I do have a test case that uses multiple verify calls (it's a parametrized test), but I'm not sure why this issue is reported only when running under mutmut.

I understand that integration with other plugins/libraries might not be a primary concern of this library; nonetheless, is there any strategy you can think of that would let me use the two libraries together? Maybe a mechanism that would let me extend the configuration for test execution under mutmut? The ApprovalTests error above does list some test-side options; see the sketch below.
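For reference, here is a rough sketch of those test-side options; the import paths and the test shape are assumptions from memory and this is untested under mutmut:

    # Option 2 from the error message: give each verify() call its own
    # approved file via NamerFactory so repeated calls are not flagged.
    # Import paths may differ between ApprovalTests versions.
    import pytest
    from approvaltests import verify
    from approvaltests.namer import NamerFactory

    @pytest.mark.parametrize("value", ["first", "second"])
    def test_parametrized_example(value):
        verify(f"result for {value}", options=NamerFactory.with_parameters(value))

    # Option 3 from the error message, called inside the affected test:
    #     approvals.settings().allow_multiple_verify_calls_for_this_method()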

boxed (Owner) commented Nov 15, 2024

I would think this is an issue with that library that you should discuss with them. Mutmut does not modify each test, so I would guess their check for multiple calls to verify() per test is slightly broken, and doesn't actually check per test, but per process of the entire testing run or something. At the very least they should probably reset their check logic when pytest starts, as pytest will start multiple times in the same process in mutmut.
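If it does have to be handled on the ApprovalTests side, a reset like the one suggested above could hang off a session-start hook. pytest_sessionstart is a real pytest hook, but the reset function below is a hypothetical stand-in, since ApprovalTests' internal state isn't known here:

    # conftest.py / plugin sketch: clear any "verify() already called"
    # bookkeeping whenever a pytest session starts, because mutmut starts
    # pytest several times inside one long-lived process.
    def pytest_sessionstart(session):
        _reset_verify_tracking()

    def _reset_verify_tracking():
        # Hypothetical placeholder: the real attribute or function that holds
        # ApprovalTests' per-test verify() registry is not known here.
        pass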
