  Data Management / DM-28109

Test failure in obs_base test_cameraMapper.Mapper2TestCase


    Details

    • Type: Bug
    • Status: Done
    • Resolution: Done
    • Fix Version/s: None
    • Component/s: obs_base
    • Labels: None
    • Story Points: 1
    • Sprint: AP S21-1 (December)
    • Team: Alert Production
    • Urgent?: No

      Description

      This seems like it has to be caused by DM-27178. But I see Jenkins passed on that branch at 2020-12-16T04:11Z (a force-push thereafter did not significantly change the code), and the nightly lsst_distrib Jenkins build also passed.

      The weird thing is that (on lsst-devl02) pytest seems to succeed, as does scons -j 1, but scons -j 8 fails. 20 runs of pytest -n 8 tests/test_cameraMapper.py, separated by rm -rf .pytest_cache, give 2 passes and 18 failures. 20 similar runs with -n 4 give 14 passes and only 6 failures. Interestingly, -n 5 gave 20 failures, and -n 3 gave 19.

      Is this a race condition having to do with the Filter singleton?

      But the pytest workers run in separate processes, so I don't see how that happens.
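
      A minimal sketch of the reproduction loop described above (the worker count and test file come from this report; the 20-iteration loop, cache cleanup, and pass/fail reporting are illustrative and assume obs_base is already set up with pytest-xdist available):

          # Illustrative repro loop; run from the obs_base working directory.
          for i in $(seq 1 20); do
              rm -rf .pytest_cache
              if pytest -n 8 tests/test_cameraMapper.py > /dev/null 2>&1; then
                  echo "run $i: pass"
              else
                  echo "run $i: FAIL"
              fi
          done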

        Attachments

          Activity

          Kian-Tat Lim (ktl) created issue
          Kian-Tat Lim (ktl) made changes
            Description: edited to add the -n 5 and -n 3 results and the closing note that the pytest workers run in separate processes
          Krzysztof Findeisen (krzys) made changes
            Status: To Do → In Progress
          Krzysztof Findeisen (krzys) made changes
            Epic Link: DM-27906
          Krzysztof Findeisen (krzys) made changes
            Sprint: AP S21-1 (December)
            Story Points: 1
          Krzysztof Findeisen (krzys) made changes
            Reviewers: Kian-Tat Lim
            Status: In Progress → In Review
          Kian-Tat Lim (ktl) made changes
            Status: In Review → Reviewed
          Krzysztof Findeisen (krzys) made changes
            Resolution: Done
            Status: Reviewed → Done

            People

            Assignee:
            Krzysztof Findeisen (krzys)
            Reporter:
            Kian-Tat Lim (ktl)
            Reviewers:
            Kian-Tat Lim
            Watchers:
            Kian-Tat Lim, Krzysztof Findeisen
            Votes:
            0

              Dates

              Created:
              Updated:
              Resolved:

                Jenkins

                No builds found.