XCP-ng

    Attempting to migrate VM to a different host in a different pool gets 'operation failed'

      tc-atwork

      Full error output:

      ```
      vm.migrate
      {
        "vm": "06afe27b-045e-d1a9-1d57-aaea2e4394ed",
        "mapVifsNetworks": {
          "e92e3c62-6457-5f3f-de12-ba7c54c91cce": "2187cbd6-f957-c58c-c095-36ddda241566"
        },
        "migrationNetwork": "bc68b95e-6b2b-8331-ffc8-3b7f831c9027",
        "sr": "6d5f5f12-80af-dc6e-f67c-b016136a8728",
        "targetHost": "60889d12-d9e1-403c-9997-7ed2889fb9c0"
      }
      {
        "code": 21,
        "data": {
          "objectId": "06afe27b-045e-d1a9-1d57-aaea2e4394ed",
          "code": "SR_BACKEND_FAILURE_40"
        },
        "message": "operation failed",
        "name": "XoError",
        "stack": "XoError: operation failed
          at operationFailed (/usr/local/lib/node_modules/xo-server/node_modules/xo-common/src/api-errors.js:21:32)
          at file:///usr/local/lib/node_modules/xo-server/src/api/vm.mjs:567:15
          at Xo.migrate (file:///usr/local/lib/node_modules/xo-server/src/api/vm.mjs:553:3)
          at Api.#callApiMethod (file:///usr/local/lib/node_modules/xo-server/src/xo-mixins/api.mjs:417:20)"
      }
      ```
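      For reference, the same call can be replayed outside the web UI with xo-cli, which helps when bisecting parameters (a sketch; it assumes xo-cli is installed and registered against this XO instance, and reuses the UUIDs from the call above):

      ```
      # Replay the failing migration from the CLI with the same parameters
      xo-cli vm.migrate \
        vm=06afe27b-045e-d1a9-1d57-aaea2e4394ed \
        targetHost=60889d12-d9e1-403c-9997-7ed2889fb9c0 \
        sr=6d5f5f12-80af-dc6e-f67c-b016136a8728 \
        migrationNetwork=bc68b95e-6b2b-8331-ffc8-3b7f831c9027 \
        mapVifsNetworks=json:'{"e92e3c62-6457-5f3f-de12-ba7c54c91cce":"2187cbd6-f957-c58c-c095-36ddda241566"}'
      ```
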
        olivierlambert (Vates 🪐 Co-Founder & CEO)

        Hi,

        You are not providing a lot of context. To me, it's likely a storage issue, so take a look at your XCP-ng host's SMlog 🙂
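        For example, from dom0 on the host running the VM (a minimal sketch; /var/log/SMlog is where the storage manager logs on XCP-ng):

        ```
        # Watch the storage manager log while retrying the migration
        tail -f /var/log/SMlog

        # Or, after a failed attempt, pull out the most recent errors
        grep -iE 'exception|error' /var/log/SMlog | tail -n 40
        ```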

          tc-atwork @olivierlambert

          @olivierlambert here is the SMlog output from when I attempt to migrate with the VM shut down. The VM UUID is 06afe27b-045e-d1a9-1d57-aaea2e4394ed and the disk UUID is 5bcc1298-712a-4288-95c8-11b37de30191.

          ```
          Aug 22 10:09:04 H2SPH180034 SM: [26008] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:04 H2SPH180034 SM: [26008] lock: acquired /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:04 H2SPH180034 SM: [26008] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:04 H2SPH180034 SM: [26008]   pread SUCCESS
          Aug 22 10:09:05 H2SPH180034 SM: [26008] vdi_attach {'sr_uuid': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'subtask_of': 'DummyRef:|c5a430fd-b0bc-4572-a15e-2505a9a882f6|VDI.attach2', 'vdi_ref': 'OpaqueRef:737c43e7-81e3-40d8-9f3d-7ba4fa7b8d5e', 'vdi_on_boot': 'persist', 'args': ['true'], 'o_direct': False, 'vdi_location': '5bcc1298-712a-4288-95c8-11b37de30191', 'host_ref': 'OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8', 'session_ref': 'OpaqueRef:4b319af9-5119-4f8a-804a-a91f317b4fc2', 'device_config': {'device': '/dev/disk/by-id/scsi-36d09466058006700293b74fd0e7c6873-part3', 'SRmaster': 'true'}, 'command': 'vdi_attach', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:4ecc1a6e-8115-4104-b06b-6b1431b60de7', 'local_cache_sr': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'vdi_uuid': '5bcc1298-712a-4288-95c8-11b37de30191'}
          Aug 22 10:09:05 H2SPH180034 SM: [26008] lock: opening lock file /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:05 H2SPH180034 SM: [26008] <__main__.EXTFileVDI object at 0x7febd9eeb110>
          Aug 22 10:09:05 H2SPH180034 SM: [26008] result: {'params_nbd': 'nbd:unix:/run/blktap-control/nbd/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191', 'o_direct_reason': 'NO_RO_IMAGE', 'params': '/dev/sm/backend/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191', 'o_direct': True, 'xenstore_data': {'scsi/0x12/0x80': 'AIAAEjViY2MxMjk4LTcxMmEtNDIgIA==', 'scsi/0x12/0x83': 'AIMAMQIBAC1YRU5TUkMgIDViY2MxMjk4LTcxMmEtNDI4OC05NWM4LTExYjM3ZGUzMDE5MSA=', 'vdi-uuid': '5bcc1298-712a-4288-95c8-11b37de30191', 'mem-pool': '54938185-cf60-f996-e2d7-fe8c47c38abb'}}
          Aug 22 10:09:05 H2SPH180034 SM: [26008] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: acquired /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]   pread SUCCESS
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26037] vdi_activate {'sr_uuid': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'subtask_of': 'DummyRef:|a7064e4e-5a08-4b68-afb9-a4aba04c93d3|VDI.activate', 'vdi_ref': 'OpaqueRef:737c43e7-81e3-40d8-9f3d-7ba4fa7b8d5e', 'vdi_on_boot': 'persist', 'args': ['true'], 'o_direct': False, 'vdi_location': '5bcc1298-712a-4288-95c8-11b37de30191', 'host_ref': 'OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8', 'session_ref': 'OpaqueRef:28ba8e04-21f5-4bbc-b1ab-dafab080dec2', 'device_config': {'device': '/dev/disk/by-id/scsi-36d09466058006700293b74fd0e7c6873-part3', 'SRmaster': 'true'}, 'command': 'vdi_activate', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:4ecc1a6e-8115-4104-b06b-6b1431b60de7', 'local_cache_sr': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'vdi_uuid': '5bcc1298-712a-4288-95c8-11b37de30191'}
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: opening lock file /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:05 H2SPH180034 SM: [26037] blktap2.activate
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: acquired /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:05 H2SPH180034 SM: [26037] Adding tag to: 5bcc1298-712a-4288-95c8-11b37de30191
          Aug 22 10:09:05 H2SPH180034 SM: [26037] Activate lock succeeded
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]   pread SUCCESS
          Aug 22 10:09:05 H2SPH180034 SM: [26037] PhyLink(/dev/sm/phy/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191) -> /var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd
          Aug 22 10:09:05 H2SPH180034 SM: [26037] <EXTSR.EXTFileVDI object at 0x7f727c2cc9d0>
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/tap-ctl', 'allocate']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]  = 0
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/tap-ctl', 'spawn']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]  = 0
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/tap-ctl', 'attach', '-p', '26090', '-m', '2']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]  = 0
          Aug 22 10:09:05 H2SPH180034 SM: [26037] ['/usr/sbin/tap-ctl', 'open', '-p', '26090', '-m', '2', '-a', 'vhd:/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd', '-t', '40']
          Aug 22 10:09:05 H2SPH180034 SM: [26037]  = 0
          Aug 22 10:09:05 H2SPH180034 SM: [26037] Set scheduler to [noop] on [/sys/dev/block/254:2]
          Aug 22 10:09:05 H2SPH180034 SM: [26037] tap.activate: Launched Tapdisk(vhd:/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd, pid=26090, minor=2, state=R)
          Aug 22 10:09:05 H2SPH180034 SM: [26037] DeviceNode(/dev/sm/backend/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191) -> /dev/xen/blktap-2/tapdev2
          Aug 22 10:09:05 H2SPH180034 SM: [26037] NBDLink(/run/blktap-control/nbd/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191) -> /run/blktap-control/nbd26090.2
          Aug 22 10:09:05 H2SPH180034 SM: [26037] lock: released /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:05 H2SPH180034 SM: [26118] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26118] lock: acquired /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:05 H2SPH180034 SM: [26118] sr_scan {'sr_uuid': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'subtask_of': 'DummyRef:|d9d1c923-13d3-4284-a2b0-166f836619c9|SR.scan', 'args': [], 'host_ref': 'OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8', 'session_ref': 'OpaqueRef:5c779a32-0bb5-4034-a630-2739e4462614', 'device_config': {'device': '/dev/disk/by-id/scsi-36d09466058006700293b74fd0e7c6873-part3', 'SRmaster': 'true'}, 'command': 'sr_scan', 'sr_ref': 'OpaqueRef:4ecc1a6e-8115-4104-b06b-6b1431b60de7', 'local_cache_sr': '54938185-cf60-f996-e2d7-fe8c47c38abb'}
          Aug 22 10:09:05 H2SPH180034 SM: [26118] ['/usr/bin/vhd-util', 'scan', '-f', '-m', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/*.vhd']
          Aug 22 10:09:05 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:05 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/d21f98dc-c385-4d2a-81ae-f2f79c229e5c.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/f0d05834-e6ec-45ee-b5a9-d8036f8e7340.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/a8631a05-efd9-4e56-9b7f-ee7627051e76.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/ea8fcb2d-0dd1-4ce9-b894-ee1445fa6d80.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/e36aeb72-33dc-47ec-8db9-915d5420a12f.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/nwweb1_disk2.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/c8886b66-010f-4288-9139-1e1c19d557dd.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/nwweb1_disk1.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/ec0ce545-2930-48d4-8d57-eb99f35be554.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['vhd-util', 'key', '-p', '-n', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/8a33a5d9-d26f-4c4d-acca-189b0d6abd3e.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ['ls', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb', '-1', '--color=never']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26118] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/running
          Aug 22 10:09:06 H2SPH180034 SM: [26118] lock: tried lock /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/running, acquired: True (exists: True)
          Aug 22 10:09:06 H2SPH180034 SM: [26118] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/running
          Aug 22 10:09:06 H2SPH180034 SM: [26118] Kicking GC
          Aug 22 10:09:06 H2SPH180034 SMGC: [26118] === SR 54938185-cf60-f996-e2d7-fe8c47c38abb: gc ===
          Aug 22 10:09:06 H2SPH180034 SMGC: [26145] Will finish as PID [26146]
          Aug 22 10:09:06 H2SPH180034 SMGC: [26118] New PID [26145]
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/running
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/gc_active
          Aug 22 10:09:06 H2SPH180034 SM: [26118] missing config for vdi: nwweb1_disk2
          Aug 22 10:09:06 H2SPH180034 SM: [26118] missing config for vdi: nwweb1_disk1
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 25685762560 <> 25141539328
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 52819563008 <> 495104
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 64737485312 <> 52703637504
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 32077820416 <> 30558618112
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 46813917696 <> 45536358912
          Aug 22 10:09:06 H2SPH180034 SM: [26118] utilisation 37654450688 <> 87552
          Aug 22 10:09:06 H2SPH180034 SM: [26118] new VDIs on disk: set(['nwweb1_disk2', 'nwweb1_disk1'])
          Aug 22 10:09:06 H2SPH180034 SM: [26118] VDIs changed on disk: ['d21f98dc-c385-4d2a-81ae-f2f79c229e5c', 'a8631a05-efd9-4e56-9b7f-ee7627051e76', 'ea8fcb2d-0dd1-4ce9-b894-ee1445fa6d80', 'e36aeb72-33dc-47ec-8db9-915d5420a12f', 'c8886b66-010f-4288-9139-1e1c19d557dd', 'ec0ce545-2930-48d4-8d57-eb99f35be554']
          Aug 22 10:09:06 H2SPH180034 SM: [26118] Introducing VDI with location=nwweb1_disk2
          Aug 22 10:09:06 H2SPH180034 SM: [26118] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ***** sr_scan: EXCEPTION <class 'XenAPI.Failure'>, ['UUID_INVALID', 'VDI', 'nwweb1_disk2']
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SRCommand.py", line 110, in run
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     return self._run_locked(sr)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SRCommand.py", line 159, in _run_locked
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     rv = self._run(sr, target)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SRCommand.py", line 364, in _run
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     return sr.scan(self.params['sr_uuid'])
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/FileSR.py", line 216, in scan
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     return super(FileSR, self).scan(sr_uuid)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SR.py", line 346, in scan
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     scanrecord.synchronise()
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SR.py", line 616, in synchronise
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     self.synchronise_new()
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SR.py", line 589, in synchronise_new
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     vdi._db_introduce()
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/VDI.py", line 488, in _db_introduce
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     vdi = self.sr.session.xenapi.VDI.db_introduce(uuid, self.label, self.description, self.sr.sr_ref, ty, self.shareable, self.read_only, {}, self.location, {}, sm_config, self.managed, str(self.size), str(self.utilisation), metadata_of_pool, is_a_snapshot, xmlrpclib.DateTime(snapshot_time), snapshot_of, cbt_enabled)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 264, in __call__
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     return self.__send(self.__name, args)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 160, in xenapi_request
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     result = _parse_result(getattr(self, methodname)(*full_params))
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/usr/lib/python2.7/site-packages/XenAPI.py", line 238, in _parse_result
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     raise Failure(result['ErrorDescription'])
          Aug 22 10:09:06 H2SPH180034 SM: [26118]
          Aug 22 10:09:06 H2SPH180034 SM: [26118] Raising exception [40, The SR scan failed  [opterr=['UUID_INVALID', 'VDI', 'nwweb1_disk2']]]
          Aug 22 10:09:06 H2SPH180034 SM: [26118] ***** Local EXT3 VHD: EXCEPTION <class 'SR.SROSError'>, The SR scan failed  [opterr=['UUID_INVALID', 'VDI', 'nwweb1_disk2']]
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SRCommand.py", line 378, in run
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     ret = cmd.run(sr)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]   File "/opt/xensource/sm/SRCommand.py", line 120, in run
          Aug 22 10:09:06 H2SPH180034 SM: [26118]     raise xs_errors.XenError(excType, opterr=msg)
          Aug 22 10:09:06 H2SPH180034 SM: [26118]
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] Found 0 cache files
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: tried lock /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/gc_active, acquired: True (exists: True)
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: tried lock /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr, acquired: True (exists: True)
          Aug 22 10:09:06 H2SPH180034 SM: [26146] ['/usr/bin/vhd-util', 'scan', '-f', '-m', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/*.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26146]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] SR 5493 ('0034.datastore1') (11 VDIs in 11 VHD trees):
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         d21f98dc(50.000G/23.922G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         5bcc1298(120.000G/120.166G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         f0d05834(4.000G/591.161M)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         a8631a05(232.887G/49.192G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         ea8fcb2d(64.000G/60.291G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         e36aeb72(80.000G/29.875G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         nwweb1_d(40.001G/35.009G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         c8886b66(100.000G/43.599G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         nwweb1_d(232.886G/49.098G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         ec0ce545(40.002G/35.068G)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]         8a33a5d9(302.000M/98.192M)
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146]
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] No work, exiting
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] GC process exiting, no work left
          Aug 22 10:09:06 H2SPH180034 SM: [26146] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/gc_active
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] In cleanup
          Aug 22 10:09:06 H2SPH180034 SMGC: [26146] SR 5493 ('0034.datastore1') (11 VDIs in 11 VHD trees): no changes
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: acquired /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26186] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26186]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26186] vdi_deactivate {'sr_uuid': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'subtask_of': 'DummyRef:|d5df22a3-6aa7-40ac-b53e-60c564f8bd03|VDI.deactivate', 'vdi_ref': 'OpaqueRef:737c43e7-81e3-40d8-9f3d-7ba4fa7b8d5e', 'vdi_on_boot': 'persist', 'args': [], 'o_direct': False, 'vdi_location': '5bcc1298-712a-4288-95c8-11b37de30191', 'host_ref': 'OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8', 'session_ref': 'OpaqueRef:c2ab8621-e768-495d-9cb9-733fcc2a7191', 'device_config': {'device': '/dev/disk/by-id/scsi-36d09466058006700293b74fd0e7c6873-part3', 'SRmaster': 'true'}, 'command': 'vdi_deactivate', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:4ecc1a6e-8115-4104-b06b-6b1431b60de7', 'local_cache_sr': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'vdi_uuid': '5bcc1298-712a-4288-95c8-11b37de30191'}
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: opening lock file /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:06 H2SPH180034 SM: [26186] blktap2.deactivate
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: acquired /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:06 H2SPH180034 SM: [26186] ['/usr/sbin/tap-ctl', 'close', '-p', '26090', '-m', '2', '-t', '30']
          Aug 22 10:09:06 H2SPH180034 SM: [26186]  = 0
          Aug 22 10:09:06 H2SPH180034 SM: [26186] ['/usr/sbin/tap-ctl', 'detach', '-p', '26090', '-m', '2']
          Aug 22 10:09:06 H2SPH180034 SM: [26186]  = 0
          Aug 22 10:09:06 H2SPH180034 SM: [26186] ['/usr/sbin/tap-ctl', 'free', '-m', '2']
          Aug 22 10:09:06 H2SPH180034 SM: [26186]  = 0
          Aug 22 10:09:06 H2SPH180034 SM: [26186] tap.deactivate: Shut down Tapdisk(vhd:/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd, pid=26090, minor=2, state=R)
          Aug 22 10:09:06 H2SPH180034 SM: [26186] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26186]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26186] Removed host key host_OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8 for 5bcc1298-712a-4288-95c8-11b37de30191
          Aug 22 10:09:06 H2SPH180034 SM: [26186] lock: released /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:06 H2SPH180034 SM: [26231] lock: opening lock file /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26231] lock: acquired /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          Aug 22 10:09:06 H2SPH180034 SM: [26231] ['/usr/sbin/td-util', 'query', 'vhd', '-vpfb', '/var/run/sr-mount/54938185-cf60-f996-e2d7-fe8c47c38abb/5bcc1298-712a-4288-95c8-11b37de30191.vhd']
          Aug 22 10:09:06 H2SPH180034 SM: [26231]   pread SUCCESS
          Aug 22 10:09:06 H2SPH180034 SM: [26231] vdi_detach {'sr_uuid': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'subtask_of': 'DummyRef:|12db307f-14fd-44cc-84c8-34b7bfeb02bb|VDI.detach', 'vdi_ref': 'OpaqueRef:737c43e7-81e3-40d8-9f3d-7ba4fa7b8d5e', 'vdi_on_boot': 'persist', 'args': [], 'o_direct': False, 'vdi_location': '5bcc1298-712a-4288-95c8-11b37de30191', 'host_ref': 'OpaqueRef:e0d2b385-379f-4342-914a-66aa75eb0ec8', 'session_ref': 'OpaqueRef:3c833184-2a57-426e-a86a-3e5452790c81', 'device_config': {'device': '/dev/disk/by-id/scsi-36d09466058006700293b74fd0e7c6873-part3', 'SRmaster': 'true'}, 'command': 'vdi_detach', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:4ecc1a6e-8115-4104-b06b-6b1431b60de7', 'local_cache_sr': '54938185-cf60-f996-e2d7-fe8c47c38abb', 'vdi_uuid': '5bcc1298-712a-4288-95c8-11b37de30191'}
          Aug 22 10:09:06 H2SPH180034 SM: [26231] lock: opening lock file /var/lock/sm/5bcc1298-712a-4288-95c8-11b37de30191/vdi
          Aug 22 10:09:06 H2SPH180034 SM: [26231] lock: released /var/lock/sm/54938185-cf60-f996-e2d7-fe8c47c38abb/sr
          ```
          
            tc-atwork @tc-atwork

            My assumption is that the error is due to these lines:

            Aug 22 10:09:06 H2SPH180034 SM: [26118] ***** sr_scan: EXCEPTION <class 'XenAPI.Failure'>, ['UUID_INVALID', 'VDI', 'nwweb1_disk2']

            Aug 22 10:09:06 H2SPH180034 SM: [26118] Raising exception [40, The SR scan failed [opterr=['UUID_INVALID', 'VDI', 'nwweb1_disk2']]]

              olivierlambert

              That's correct. Check your SR folder: you have a disk that's not named correctly. It's nwweb1_disk2 when it should be <UUID>.vhd. Please remove this file and everything will work again 🙂
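              A quick way to spot such files (a sketch using the SR UUID from the log above; the grep simply filters out UUID-shaped names):

              ```
              SR=54938185-cf60-f996-e2d7-fe8c47c38abb
              # List VHDs in the SR mount whose basename is not a UUID
              ls /var/run/sr-mount/$SR/*.vhd \
                | grep -Ev '/[0-9a-f]{8}(-[0-9a-f]{4}){3}-[0-9a-f]{12}\.vhd$'
              ```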

                tc-atwork

                Okay, yup, that was it. Migration is in progress now. When I manually added these disks to the host, I never cleaned up the original files after importing them from the CLI, so removing them allowed the SR scan to work properly (or something like that). Thanks for the direction @olivierlambert!
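                In case it helps someone else, the cleanup amounts to something like this (a sketch; the SR UUID and file names are the ones from this thread, and moving the files out is a cautious alternative to deleting them outright):

                ```
                SR=54938185-cf60-f996-e2d7-fe8c47c38abb
                mkdir -p /root/vhd-leftovers
                # Move the stray, non-UUID-named import files out of the SR mount
                mv /var/run/sr-mount/$SR/nwweb1_disk1.vhd \
                   /var/run/sr-mount/$SR/nwweb1_disk2.vhd /root/vhd-leftovers/
                # Rescan so XAPI picks up the change
                xe sr-scan uuid=$SR
                ```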

                Kind of annoying that totally unrelated disks would cause a migration failure for another VM, but alas. Are there any plans to improve the error reporting in XO?

                  olivierlambert

                  The issue is that the initial report from XAPI doesn't give many details, sadly (so there's little we can do about it in XO). But it's good practice to never put files manually into your SR folder, otherwise this breaks things 🙂

                    tc-atwork

                    Yeah, the trouble is that manually migrating from a different hypervisor platform (Hyper-V) through web-browser downloads and uploads (with my laptop as the middleman) is extremely slow, so I just transferred the files directly using SCP and then imported them using xe commands.
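                    For what it's worth, a variant that avoids leaving stray files in the SR mount is to import into a freshly created VDI instead (a sketch; the name, size, and path are placeholders, and format=vhd assumes a reasonably recent xe):

                    ```
                    SR=54938185-cf60-f996-e2d7-fe8c47c38abb
                    # Create a destination VDI at least as large as the source disk
                    VDI=$(xe vdi-create sr-uuid=$SR name-label=nwweb1_disk1 \
                      type=user virtual-size=40GiB)
                    # Import the copied VHD into it
                    xe vdi-import uuid=$VDI filename=/tmp/nwweb1_disk1.vhd format=vhd
                    ```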

                      olivierlambert

                      You can generate a valid name for the VHD file with the uuidgen command 🙂
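                      In other words, something like this from dom0 (a sketch; the SR UUID is the one from this thread):

                      ```
                      SR=54938185-cf60-f996-e2d7-fe8c47c38abb
                      # Give the copied VHD a UUID-based name that the SR scan accepts
                      NEW=$(uuidgen)
                      mv /var/run/sr-mount/$SR/nwweb1_disk1.vhd /var/run/sr-mount/$SR/$NEW.vhd
                      # Rescan so the VDI gets introduced
                      xe sr-scan uuid=$SR
                      ```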
