Data Management / DM-11607

obs_base fails with pytest-xdist


      Description

      Running the obs_base tests in parallel with pytest-xdist produces several failures:

      =================================== FAILURES ===================================
      ______________________________ TestInputOnly.test ______________________________
      [gw5] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestInputOnly testMethod=test>
       
          def setUp(self):
              packageDir = getPackageDir('obs_base')
              self.testData = os.path.join(packageDir, 'tests', 'composite')
              self.firstRepoPath = os.path.join(self.testData, 'repo1')
              self.objA = dpTest.TestObject("abc")
              self.objB = dpTest.TestObject("def")
              self.policy = dafPersist.Policy(
                                         {'camera': 'lsst.afw.cameraGeom.Camera',
                                          'datasets': {
                                              'basicObject1': {
                                                  'python': 'lsst.daf.persistence.test.TestObject',
                                                  'template': 'basic/id%(id)s.pickle',
                                                  'storage': 'PickleStorage'},
                                              'basicObject2': {
                                                  'python': 'lsst.daf.persistence.test.TestObject',
                                                  'template': 'basic/name%(name)s.pickle',
                                                  'storage': 'PickleStorage'},
                                              'basicPair': {
                                                  'python': 'lsst.daf.persistence.test.TestObjectPair',
                                                  'composite': {
                                                      'a': {
                                                          'datasetType': 'basicObject1'
                                                      },
                                                      'b': {
                                                          'datasetType': 'basicObject2',
                                                          'inputOnly' : True
                                                      }
                                                  },
                                                  'assembler': 'lsst.daf.persistence.test.TestObjectPair.assembler',
                                                  'disassembler': 'lsst.daf.persistence.test.TestObjectPair.disassembler'
          
                                              }
                                          }})
          
              repoArgs = dafPersist.RepositoryArgs(root=self.firstRepoPath,
                                                   mapper='lsst.obs.base.test.CompositeMapper',
                                                   policy=self.policy)
      >       butler = dafPersist.Butler(outputs=repoArgs)
       
      tests/testComposite.py:485: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:527: in __init__
          self._getCfgs(repoDataList)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = <[AttributeError("'Butler' object has no attribute '_repos'") raised in repr()] Butler object at 0x7fa3b8ae57f0>
      repoDataList = [RepoData(id=140341154832112,repoArgs=RepositoryArgs(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-...fg=None,cfgOrigin=None,cfgRoot=None,repo=None,parentRepoDatas=[],isV1Repository=False,role=output,parentRegistry=None)]
       
          def _getCfgs(self, repoDataList):
              """Get or make a RepositoryCfg for each RepoData, and add the cfg to the RepoData.
                  If the cfg exists, compare values. If values match then use the cfg as an "existing" cfg. If the
                  values do not match, use the cfg as a "nested" cfg.
                  If the cfg does not exist, the RepositoryArgs must be for a writable repository.
          
                  Parameters
                  ----------
                  repoDataList : list of RepoData
                      The RepoData that are output and inputs of this Butler
          
                  Raises
                  ------
                  RuntimeError
                      If the passed-in RepositoryArgs indicate an existing repository but other cfg parameters in those
                      RepositoryArgs don't
                      match the existing repository's cfg a RuntimeError will be raised.
                  """
              def cfgMatchesArgs(args, cfg):
                  """Test if there are any values in an RepositoryArgs that conflict with the values in a cfg"""
                  if args.mapper is not None and cfg.mapper != args.mapper:
                      return False
                  if args.mapperArgs is not None and cfg.mapperArgs != args.mapperArgs:
                      return False
                  if args.policy is not None and cfg.policy != args.policy:
                      return False
                  return True
          
              for repoData in repoDataList:
                  cfg, isOldButlerRepository = self._getRepositoryCfg(repoData.repoArgs)
                  if cfg is None:
                      if 'w' not in repoData.repoArgs.mode:
                          raise RuntimeError(
                              "No cfg found for read-only input repository at {}".format(repoData.repoArgs.cfgRoot))
                      repoData.setCfg(cfg=RepositoryCfg.makeFromArgs(repoData.repoArgs),
                                      origin='new',
                                      root=repoData.repoArgs.cfgRoot,
                                      isV1Repository=isOldButlerRepository)
                  else:
                      if 'w' in repoData.repoArgs.mode:
                          # if it's an output repository, the RepositoryArgs must match the existing cfg.
                          if not cfgMatchesArgs(repoData.repoArgs, cfg):
                              raise RuntimeError(("The RepositoryArgs and RepositoryCfg must match for writable " +
                                                  "repositories, RepositoryCfg:{}, RepositoryArgs:{}").format(
      >                                               cfg, repoData.repoArgs))
      E                       RuntimeError: The RepositoryArgs and RepositoryCfg must match for writable repositories, RepositoryCfg:RepositoryCfg(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1', mapper='lsst.obs.base.test.CompositeMapper', mapperArgs={}, parents=[], policy={'camera': 'lsst.afw.cameraGeom.Camera', 'datasets': {'basicObject2': {'template': 'basic/name%(name)s.pickle', 'storage': 'PickleStorage', 'python': 'lsst.daf.persistence.test.TestObject'}, 'basicPair': {'composite': {'b': {'datasetType': 'basicObject2'}, 'a': {'datasetType': 'basicObject1'}}, 'disassembler': 'lsst.daf.persistence.test.TestObjectPair.disassembler', 'assembler': 'lsst.daf.persistence.test.TestObjectPair.assembler', 'python': 'lsst.daf.persistence.test.TestObjectPair'}, 'bypassTestType': {'composite': {'b': {'datasetType': 'basicObject2'}, 'a': {'datasetType': 'basicObject1'}}, 'python': 'lsst.daf.persistence.test.TestObjectPair'}, 'basicObject1': {'template': 'basic/id%(id)s.pickle', 'storage': 'PickleStorage', 'python': 'lsst.daf.persistence.test.TestObject'}, 'stdTestType': {'composite': {'b': {'datasetType': 'basicObject2'}, 'a': {'datasetType': 'basicObject1'}}, 'python': 'lsst.daf.persistence.test.TestObjectPair'}}}), RepositoryArgs:RepositoryArgs(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1', cfgRoot=None, mapper='lsst.obs.base.test.CompositeMapper', mapperArgs=None, tags=set(), mode='w', policy={'camera': 'lsst.afw.cameraGeom.Camera', 'datasets': {'basicObject2': {'template': 'basic/name%(name)s.pickle', 'storage': 'PickleStorage', 'python': 'lsst.daf.persistence.test.TestObject'}, 'basicPair': {'python': 'lsst.daf.persistence.test.TestObjectPair', 'disassembler': 'lsst.daf.persistence.test.TestObjectPair.disassembler', 'assembler': 'lsst.daf.persistence.test.TestObjectPair.assembler', 'composite': {'b': {'datasetType': 
'basicObject2', 'inputOnly': True}, 'a': {'datasetType': 'basicObject1'}}}, 'basicObject1': {'template': 'basic/id%(id)s.pickle', 'storage': 'PickleStorage', 'python': 'lsst.daf.persistence.test.TestObject'}}})
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:805: RuntimeError
      ______________________ TestFindParentMapperV1Butler.test _______________________
      [gw5] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testFindParentMapper.TestFindParentMapperV1Butler testMethod=test>
       
          def setUp(self):
              packageDir = getPackageDir('obs_base')
              self.testDir = os.path.join(packageDir, 'tests', 'findParentMapper')
          
              self.parentRepoDir = os.path.join(self.testDir, 'parentRepo')
      >       os.makedirs(self.parentRepoDir)
       
      tests/testFindParentMapper.py:48: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      name = '/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/findParentMapper/parentRepo'
      mode = 511, exist_ok = False
       
          def makedirs(name, mode=0o777, exist_ok=False):
              """makedirs(name [, mode=0o777][, exist_ok=False])
          
              Super-mkdir; create a leaf directory and all intermediate ones.  Works like
              mkdir, except that any intermediate path segment (not just the rightmost)
              will be created if it does not exist. If the target directory already
              exists, raise an OSError if exist_ok is False. Otherwise no exception is
              raised.  This is recursive.
          
              """
              head, tail = path.split(name)
              if not tail:
                  head, tail = path.split(head)
              if head and tail and not path.exists(head):
                  try:
                      makedirs(head, mode, exist_ok)
                  except FileExistsError:
                      # Defeats race condition when another thread created the path
                      pass
                  cdir = curdir
                  if isinstance(tail, bytes):
                      cdir = bytes(curdir, 'ASCII')
                  if tail == cdir:           # xxx/newdir/. exists if xxx/newdir exists
                      return
              try:
      >           mkdir(name, mode)
      E           FileExistsError: [Errno 17] File exists: '/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/findParentMapper/parentRepo'
       
      ../../miniconda/lib/python3.5/os.py:241: FileExistsError
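The FileExistsError above is a classic pytest-xdist race: two workers run `setUp` at once and both call `os.makedirs` on the same fixed path under the package tree. A minimal sketch of a collision-proof alternative (a hypothetical helper, not obs_base code; `make_parent_repo` and its prefix are invented for illustration):

```python
import os
import tempfile


def make_parent_repo():
    """Create a per-test repo directory that cannot collide across
    pytest-xdist workers (hypothetical helper, not obs_base API)."""
    # mkdtemp returns a unique, freshly created directory on every call,
    # so two workers can never end up sharing (or racing on) one path.
    test_dir = tempfile.mkdtemp(prefix='findParentMapper-')
    parent_repo = os.path.join(test_dir, 'parentRepo')
    # exist_ok=True additionally guards against leftovers from an
    # earlier aborted run of the same test.
    os.makedirs(parent_repo, exist_ok=True)
    return parent_repo
```

With per-call temporary directories, `tearDown` can simply `shutil.rmtree` its own tree without affecting any other worker.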
      _________________ TestCompositeTestCase.testDottedDatasetType __________________
      [gw1] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestCompositeTestCase testMethod=testDottedDatasetType>
       
          def testDottedDatasetType(self):
              """Verify that components of a composite can be loaded by dotted name in the form
                  DatasetType.componentName
                  """
              thirdRepoPath = os.path.join(self.testData, 'repo3')
              # child repositories do not look up in-repo policies. We need to fix that.
              repoArgs = dafPersist.RepositoryArgs(root=thirdRepoPath, policy=self.policy)
              butler = dafPersist.Butler(inputs=self.firstRepoPath, outputs=repoArgs)
      >       verificationButler = dafPersist.Butler(inputs=thirdRepoPath)
       
      tests/testComposite.py:139: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:527: in __init__
          self._getCfgs(repoDataList)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = <[AttributeError("'Butler' object has no attribute '_repos'") raised in repr()] Butler object at 0x7fa8df3556a0>
      repoDataList = [RepoData(id=140363276102008,repoArgs=RepositoryArgs(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-...cfg=None,cfgOrigin=None,cfgRoot=None,repo=None,parentRepoDatas=[],isV1Repository=False,role=input,parentRegistry=None)]
       
          def _getCfgs(self, repoDataList):
              """Get or make a RepositoryCfg for each RepoData, and add the cfg to the RepoData.
                  If the cfg exists, compare values. If values match then use the cfg as an "existing" cfg. If the
                  values do not match, use the cfg as a "nested" cfg.
                  If the cfg does not exist, the RepositoryArgs must be for a writable repository.
          
                  Parameters
                  ----------
                  repoDataList : list of RepoData
                      The RepoData that are output and inputs of this Butler
          
                  Raises
                  ------
                  RuntimeError
                      If the passed-in RepositoryArgs indicate an existing repository but other cfg parameters in those
                      RepositoryArgs don't
                      match the existing repository's cfg a RuntimeError will be raised.
                  """
              def cfgMatchesArgs(args, cfg):
                  """Test if there are any values in an RepositoryArgs that conflict with the values in a cfg"""
                  if args.mapper is not None and cfg.mapper != args.mapper:
                      return False
                  if args.mapperArgs is not None and cfg.mapperArgs != args.mapperArgs:
                      return False
                  if args.policy is not None and cfg.policy != args.policy:
                      return False
                  return True
          
              for repoData in repoDataList:
                  cfg, isOldButlerRepository = self._getRepositoryCfg(repoData.repoArgs)
                  if cfg is None:
                      if 'w' not in repoData.repoArgs.mode:
                          raise RuntimeError(
      >                       "No cfg found for read-only input repository at {}".format(repoData.repoArgs.cfgRoot))
      E                   RuntimeError: No cfg found for read-only input repository at /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo3
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:794: RuntimeError
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1
      ________________________ TestCompositeTestCase.testStd _________________________
      [gw2] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestCompositeTestCase testMethod=testStd>
       
          def testStd(self):
              """Verify that composite dataset types with a std_ function are passed to the std_ function after
                  being instantiated."""
              secondRepoPath = os.path.join(self.testData, 'repo2')
              repoArgs = dafPersist.RepositoryArgs(root=secondRepoPath, policy=self.policy)
      >       butler = dafPersist.Butler(inputs=self.firstRepoPath, outputs=repoArgs)
       
      tests/testComposite.py:173: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:527: in __init__
          self._getCfgs(repoDataList)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = <[AttributeError("'Butler' object has no attribute '_repos'") raised in repr()] Butler object at 0x7f21b681cba8>
      repoDataList = [RepoData(id=139782770081184,repoArgs=RepositoryArgs(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-...cfg=None,cfgOrigin=None,cfgRoot=None,repo=None,parentRepoDatas=[],isV1Repository=False,role=input,parentRegistry=None)]
       
          def _getCfgs(self, repoDataList):
              """Get or make a RepositoryCfg for each RepoData, and add the cfg to the RepoData.
                  If the cfg exists, compare values. If values match then use the cfg as an "existing" cfg. If the
                  values do not match, use the cfg as a "nested" cfg.
                  If the cfg does not exist, the RepositoryArgs must be for a writable repository.
          
                  Parameters
                  ----------
                  repoDataList : list of RepoData
                      The RepoData that are output and inputs of this Butler
          
                  Raises
                  ------
                  RuntimeError
                      If the passed-in RepositoryArgs indicate an existing repository but other cfg parameters in those
                      RepositoryArgs don't
                      match the existing repository's cfg a RuntimeError will be raised.
                  """
              def cfgMatchesArgs(args, cfg):
                  """Test if there are any values in an RepositoryArgs that conflict with the values in a cfg"""
                  if args.mapper is not None and cfg.mapper != args.mapper:
                      return False
                  if args.mapperArgs is not None and cfg.mapperArgs != args.mapperArgs:
                      return False
                  if args.policy is not None and cfg.policy != args.policy:
                      return False
                  return True
          
              for repoData in repoDataList:
                  cfg, isOldButlerRepository = self._getRepositoryCfg(repoData.repoArgs)
                  if cfg is None:
                      if 'w' not in repoData.repoArgs.mode:
                          raise RuntimeError(
      >                       "No cfg found for read-only input repository at {}".format(repoData.repoArgs.cfgRoot))
      E                   RuntimeError: No cfg found for read-only input repository at /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:794: RuntimeError
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1
      ________________ TestCompositeTestCase.testDatasetDoesNotExist _________________
      [gw7] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestCompositeTestCase testMethod=testDatasetDoesNotExist>
       
          def testDatasetDoesNotExist(self):
              """Verify that Butler.datasetExists returns false for a composite dataset where some of the
                  components do not exist."""
              repoPath = os.path.join(self.testData, 'repo')
              repoArgs = dafPersist.RepositoryArgs(root=repoPath, policy=self.policy,
                                                   mapper='lsst.obs.base.test.CompositeMapper')
          
              butler = dafPersist.Butler(outputs=repoArgs)
              self.objA = dpTest.TestObject("abc")
      >       butler.put(self.objA, 'basicObject1', dataId={'id': 'foo'})
       
      tests/testComposite.py:165: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1431: in put
          location.getRepository().write(location, obj)
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/repository.py:189: in write
          return butlerLocationStorage.write(butlerLocation, obj)
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/posixStorage.py:280: in write
          with SafeFilename(os.path.join(self.root, locations[0])) as locationString:
      ../../miniconda/lib/python3.5/contextlib.py:59: in __enter__
          return next(self.gen)
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/safeFileIo.py:137: in SafeFilename
          temp = tempfile.NamedTemporaryFile(mode="w", dir=outDir, prefix=outName, delete=False)
      ../../miniconda/lib/python3.5/tempfile.py:549: in NamedTemporaryFile
          (fd, name) = _mkstemp_inner(dir, prefix, suffix, flags, output_type)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      dir = '/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo/basic'
      pre = 'idfoo.pickle', suf = '', flags = 131266, output_type = <class 'str'>
       
          def _mkstemp_inner(dir, pre, suf, flags, output_type):
              """Code common to mkstemp, TemporaryFile, and NamedTemporaryFile."""
          
              names = _get_candidate_names()
              if output_type is bytes:
                  names = map(_os.fsencode, names)
          
              for seq in range(TMP_MAX):
                  name = next(names)
                  file = _os.path.join(dir, pre + name + suf)
                  try:
      >               fd = _os.open(file, flags, 0o600)
      E               FileNotFoundError: [Errno 2] No such file or directory: '/home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo/basic/idfoo.pickle87_nf71r'
       
      ../../miniconda/lib/python3.5/tempfile.py:260: FileNotFoundError
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo1
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/composite/repo
      _____________________ TestGenericAssembler.testConstructor _____________________
      [gw6] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestGenericAssembler testMethod=testConstructor>
       
          def testConstructor(self):
              """Test the case where the arguments to the default constructor match the component names and so the
                  default constructor can be used by the generic assembler to assemble the object
                  Uses getters named by the policy to disassemble the object.
                  """
              repoArgs = dafPersist.RepositoryArgs(root=self.secondRepoPath, policy=self.policy)
              butler = dafPersist.Butler(inputs=self.firstRepoPath, outputs=repoArgs)
              verificationButler = dafPersist.Butler(inputs=self.secondRepoPath)
          
      >       objABPair = butler.get('basicPair', dataId={'id': 'foo', 'name': 'bar'})
       
      tests/testComposite.py:289: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1396: in get
          return callback()
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1391: in <lambda>
          callback = lambda: self._read(location)
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1530: in _read
          obj = self.get(componentInfo.datasetType, location.dataId, immediate=True)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = Butler(datasetTypeAliasDict={}, repos=RepoDataContainer(_inputs=[RepoData(id=140353425559680,repoArgs=RepositoryArgs(r...role=input,parentRegistry=None)]), persistence=<lsst.daf.persistence.persistence.Persistence object at 0x7fa6941388f0>)
      datasetType = 'basicObject1'
      dataId = DataId(initialdata={'id': 'foo', 'name': 'bar'}, tag=set())
      immediate = True, rest = {}
       
          def get(self, datasetType, dataId=None, immediate=True, **rest):
              """Retrieves a dataset given an input collection data id.
          
                  Parameters
                  ----------
                  datasetType - string
                      The type of dataset to retrieve.
                  dataId - dict
                      The data id.
                  immediate - bool
                      If False use a proxy for delayed loading.
                  **rest
                      keyword arguments for the data id.
          
                  Returns
                  -------
                      An object retrieved from the dataset (or a proxy for one).
                  """
              datasetType = self._resolveDatasetTypeAlias(datasetType)
              dataId = DataId(dataId)
              dataId.update(**rest)
          
              location = self._locate(datasetType, dataId, write=False)
              if location is None:
      >           raise NoResults("No locations for get:", datasetType, dataId)
      E           lsst.daf.persistence.butlerExceptions.NoResults: No locations for get: datasetType:basicObject1 dataId:DataId(initialdata={'id': 'foo', 'name': 'bar'}, tag=set())
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1384: NoResults
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/genericAssembler/repo1
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/genericAssembler/repo1
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/genericAssembler/repo1
      _______________________ OutputRootTestCase.testDiffInput _______________________
      [gw5] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testOutputRoot.OutputRootTestCase testMethod=testDiffInput>
       
          def testDiffInput(self):
              """Verify that if an output repository is loaded/created twice, and the second time it has a different
                  parent than the first time, then the second instantiation should raise an exception."""
              butler = dafPersist.Butler(outputs={'root': testInput1, 'mapper': MinMapper1})
              del butler
              butler = dafPersist.Butler(outputs={'root': testInput2, 'mapper': MinMapper1})
              del butler
      >       butler = dafPersist.Butler(inputs=testInput1, outputs=testOutput)
       
      tests/testOutputRoot.py:189: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:527: in __init__
          self._getCfgs(repoDataList)
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = <[AttributeError("'Butler' object has no attribute '_repos'") raised in repr()] Butler object at 0x7fa3b8acbac8>
      repoDataList = [RepoData(id=140341155215176,repoArgs=RepositoryArgs(root='/home/jenkins-slave/workspace/stack-os-matrix/label/centos-...cfg=None,cfgOrigin=None,cfgRoot=None,repo=None,parentRepoDatas=[],isV1Repository=False,role=input,parentRegistry=None)]
       
          def _getCfgs(self, repoDataList):
              """Get or make a RepositoryCfg for each RepoData, and add the cfg to the RepoData.
                  If the cfg exists, compare values. If values match then use the cfg as an "existing" cfg. If the
                  values do not match, use the cfg as a "nested" cfg.
                  If the cfg does not exist, the RepositoryArgs must be for a writable repository.
          
                  Parameters
                  ----------
                  repoDataList : list of RepoData
                      The RepoData that are output and inputs of this Butler
          
                  Raises
                  ------
                  RuntimeError
                      If the passed-in RepositoryArgs indicate an existing repository but other cfg parameters in those
                      RepositoryArgs don't
                      match the existing repository's cfg a RuntimeError will be raised.
                  """
              def cfgMatchesArgs(args, cfg):
                  """Test if there are any values in an RepositoryArgs that conflict with the values in a cfg"""
                  if args.mapper is not None and cfg.mapper != args.mapper:
                      return False
                  if args.mapperArgs is not None and cfg.mapperArgs != args.mapperArgs:
                      return False
                  if args.policy is not None and cfg.policy != args.policy:
                      return False
                  return True
          
              for repoData in repoDataList:
                  cfg, isOldButlerRepository = self._getRepositoryCfg(repoData.repoArgs)
                  if cfg is None:
                      if 'w' not in repoData.repoArgs.mode:
                          raise RuntimeError(
      >                       "No cfg found for read-only input repository at {}".format(repoData.repoArgs.cfgRoot))
      E                   RuntimeError: No cfg found for read-only input repository at /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/testInput1
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:794: RuntimeError
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/testInput1
      CameraMapper INFO: Loading Posix exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/testInput2
      ____________________ OutputRootTestCase.testReuseOutputRoot ____________________
      [gw4] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testOutputRoot.OutputRootTestCase testMethod=testReuseOutputRoot>
       
          def testReuseOutputRoot(self):
              """Set up an output repositoriy and verify its parent relationship to the input repository.
                  Then set up an output repository with the first output as an input, and verify the parent
                  relationships."""
              butler = dafPersist.Butler(inputs={'root': testPath, 'mapper': MinMapper1},
                                         outputs=testOutput)
      >       self.assertTrue(os.path.exists(testOutput))
      E       AssertionError: False is not true
       
      tests/testOutputRoot.py:163: AssertionError
      ----------------------------- Captured stdout call -----------------------------
      CameraMapper INFO: Loading exposure registry from /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/build/obs_base/tests/registry.sqlite3
      ___________ TestGenericAssembler.testInferredNameUnderscoreSeparator ___________
      [gw3] linux -- Python 3.5.2 /home/jenkins-slave/workspace/stack-os-matrix/label/centos-7/python/py3/lsstsw/miniconda/bin/python
       
      self = <testComposite.TestGenericAssembler testMethod=testInferredNameUnderscoreSeparator>
       
          def testInferredNameUnderscoreSeparator(self):
              """Test the case where the name of the setter & getter is inferred by the policy name by prepending
                  'set_' and get_
                  """
              repoArgs = dafPersist.RepositoryArgs(root=self.secondRepoPath, policy=self.policy)
              butler = dafPersist.Butler(inputs=self.firstRepoPath, outputs=repoArgs)
              obj = butler.get('underscoreSetter', dataId={'id': 'foo'})
              self.assertEqual(self.objA, obj.get_foo())
              butler.put(obj, 'underscoreSetter', dataId={'id': 'foo'})
          
              verificationButler = dafPersist.Butler(inputs=self.secondRepoPath)
      >       componentObj = verificationButler.get('basicObject1', dataId={'id': 'foo'})
       
      tests/testComposite.py:354: 
      _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
       
      self = Butler(datasetTypeAliasDict={}, repos=RepoDataContainer(_inputs=[RepoData(id=140341476539584,repoArgs=RepositoryArgs(r...ole=parent,parentRegistry=None)]), persistence=<lsst.daf.persistence.persistence.Persistence object at 0x7fa3cbe885a8>)
      datasetType = 'basicObject1'
      dataId = DataId(initialdata={'id': 'foo'}, tag=set()), immediate = True
      rest = {}
       
          def get(self, datasetType, dataId=None, immediate=True, **rest):
              """Retrieves a dataset given an input collection data id.
          
                  Parameters
                  ----------
                  datasetType - string
                      The type of dataset to retrieve.
                  dataId - dict
                      The data id.
                  immediate - bool
                      If False use a proxy for delayed loading.
                  **rest
                      keyword arguments for the data id.
          
                  Returns
                  -------
                      An object retrieved from the dataset (or a proxy for one).
                  """
              datasetType = self._resolveDatasetTypeAlias(datasetType)
              dataId = DataId(dataId)
              dataId.update(**rest)
          
              location = self._locate(datasetType, dataId, write=False)
              if location is None:
      >           raise NoResults("No locations for get:", datasetType, dataId)
      E           lsst.daf.persistence.butlerExceptions.NoResults: No locations for get: datasetType:basicObject1 dataId:DataId(initialdata={'id': 'foo'}, tag=set())
       
      ../../stack/Linux64/daf_persistence/tickets.DM-11595-ga8129d171e/python/lsst/daf/persistence/butler.py:1384: NoResults
      

      These might be related to the similar failures in DM-11595, where a test directory is shared between tests rather than being unique, so concurrent pytest-xdist workers collide on the same path.
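A common fix for this class of failure (a sketch only, not the actual DM-11595 change; the class and prefix names here are hypothetical) is to give each test its own scratch directory via `tempfile.mkdtemp`, so xdist workers (`gw0`…`gwN`) never reuse the same output path:

```python
import os
import shutil
import tempfile
import unittest


class UniqueOutputDirTestCase(unittest.TestCase):
    """Hypothetical example: isolate each test in its own scratch directory."""

    def setUp(self):
        # mkdtemp creates a fresh, uniquely named directory per test,
        # so concurrent xdist workers each write to a different path.
        self.testDir = tempfile.mkdtemp(prefix="obs_base_test-")

    def tearDown(self):
        # Clean up the per-test directory regardless of test outcome.
        shutil.rmtree(self.testDir, ignore_errors=True)

    def testWritesAreIsolated(self):
        path = os.path.join(self.testDir, "output.txt")
        with open(path, "w") as f:
            f.write("ok")
        self.assertTrue(os.path.exists(path))
```

Using a unique directory per test (instead of a fixed path like `tests/testInput1`) removes the ordering and sharing assumptions that only hold when the suite runs serially.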

      People

      • Assignee: tjenness Tim Jenness
      • Reporter: tjenness Tim Jenness
      • Reviewers: Nate Pease
      • Watchers: John Parejko, Nate Pease, Tim Jenness
