XCP-ng

    Failed to migrate vdi

    Xen Orchestra
    • lol_tux
      last edited by olivierlambert

      Hi. I am migrating the storage of several virtual machines, and on one of them an error occurred during the disk migration, freezing the VM and forcing me to power it off. After that, the disk simply disappeared. Note that the procedure migrates the disk between different LUNs within the same pool. I'm using XCP-ng 8.2.1 and Xen Orchestra commit 9d122 | master commit 6444f. Logs below:

      vdi.migrate
      {
        "id": "e0d805fb-d325-4af7-af40-a4249b5ddec4",
        "sr_id": "4534d3f4-59d6-f7ce-93b7-bafc382ed183"
      }
      {
        "code": "INTERNAL_ERROR",
        "params": [
          "Xenops_interface.Xenopsd_error([S(Failed_to_suspend);[S(d3fbc98c-c742-4f77-45ca-f0579c3e837f);F(1200)]])"
        ],
        "task": {
          "uuid": "02e690ad-a5fa-ef89-9e1b-3e0fb7af59b6",
          "name_label": "Async.VDI.pool_migrate",
          "name_description": "",
          "allowed_operations": [],
          "current_operations": {},
          "created": "20240516T00:31:36Z",
          "finished": "20240516T01:58:09Z",
          "status": "failure",
          "resident_on": "OpaqueRef:f2af7892-f475-4c61-a289-3cb7bffc07c4",
          "progress": 1,
          "type": "<none/>",
          "result": "",
          "error_info": [
            "INTERNAL_ERROR",
            "Xenops_interface.Xenopsd_error([S(Failed_to_suspend);[S(d3fbc98c-c742-4f77-45ca-f0579c3e837f);F(1200)]])"
          ],
          "other_config": {},
          "subtask_of": "OpaqueRef:NULL",
          "subtasks": [],
          "backtrace": "(((process xapi)(filename ocaml/xapi-client/client.ml)(line 7))((process xapi)(filename ocaml/xapi-client/client.ml)(line 19))((process xapi)(filename ocaml/xapi-client/client.ml)(line 12325))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 134))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 233))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 104)))"
        },
        "message": "INTERNAL_ERROR(Xenops_interface.Xenopsd_error([S(Failed_to_suspend);[S(d3fbc98c-c742-4f77-45ca-f0579c3e837f);F(1200)]]))",
        "name": "XapiError",
        "stack": "XapiError: INTERNAL_ERROR(Xenops_interface.Xenopsd_error([S(Failed_to_suspend);[S(d3fbc98c-c742-4f77-45ca-f0579c3e837f);F(1200)]]))
          at Function.wrap (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/_XapiError.js:16:12)
          at _default (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/_getTaskResult.js:11:29)
          at Xapi._addRecordToCache (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/index.js:998:37)
          at forEach (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/index.js:1032:14)
          at Array.forEach (<anonymous>)
          at Xapi._processEvents (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/index.js:1022:12)
          at Xapi._watchEvents (/opt/xo/xo-builds/xen-orchestra-202305301025/packages/xen-api/src/index.js:1195:14)"
      }
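
      For reference, the same operation can also be triggered from the CLI on the pool master. This is just a sketch of the xe equivalent of the vdi.migrate call above, assuming "id" is the VDI and "sr_id" is the destination SR, as in that call:

        xe vdi-pool-migrate uuid=e0d805fb-d325-4af7-af40-a4249b5ddec4 sr-uuid=4534d3f4-59d6-f7ce-93b7-bafc382ed183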
      
      • Danp (Pro Support Team)

        I've never encountered this particular error. Have you tried rescanning both SRs to see if the disk reappears?
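
        If it helps, the rescan can also be done from the CLI. A minimal sketch, with placeholder values for your two SRs:

          xe sr-list params=uuid,name-label            # look up the SR UUIDs
          xe sr-scan uuid=<source-sr-uuid>             # placeholder: SR the VDI was on
          xe sr-scan uuid=<destination-sr-uuid>        # placeholder: SR it was migrating to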

        • lol_tux @Danp

          @Danp said in Failed to migrate vdi:

          I've never encountered this particular error. Have you tried rescanning both SRs to see if the disk reappears?

          Yes, I performed a rescan on both storage volumes, but the disks do not appear. The strange thing is that out of fifty virtual machines where the VDIs were migrated, the problem occurred in three, including one that was offline. For now, I am having to use the warm migration option to ensure greater reliability in the process until I find out what is causing this problem.
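
          For anyone hitting the same thing, a couple of checks can confirm whether the VDI record and its logical volume still exist at all. A sketch only, using the VDI UUID from the error above; the grep relies on the VG_XenStorage-<SR-UUID>/VHD-<VDI-UUID> naming visible in the SMlog further down:

            xe vdi-list uuid=e0d805fb-d325-4af7-af40-a4249b5ddec4 params=all   # is the record still in XAPI?
            lvs | grep e0d805fb-d325-4af7-af40-a4249b5ddec4                    # is the LV still present on either SR?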

          • Danp (Pro Support Team) @lol_tux

            @lol_tux How did you come to the conclusion that warm migration is more reliable? 🤔 Did you check SMlog for any obvious errors?
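
            A quick way to do that is to grep SMlog on the host that handled the migration for the VDI UUID from the error; just a sketch, and the rotated-log line only applies if compressed rotations exist:

              grep e0d805fb-d325-4af7-af40-a4249b5ddec4 /var/log/SMlog
              zgrep e0d805fb-d325-4af7-af40-a4249b5ddec4 /var/log/SMlog.*.gz   # rotated logs, if any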

            • lol_tux @Danp
              last edited by Danp

              @Danp I am using warm migration based on past experience: we have already migrated more than four hundred virtual machines that way without incident, whereas migrating only the VDI within the same pool is rarely used in our environment. Regarding SMlog, Xen Orchestra reported the error at 22:58:08 on May 15th; here is the SMlog data from around that time:

              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: opening lock file /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Refcount for lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183:96be9922-4bb1-4fea-8c09-c5e29a5475a0 (0, 0) + (1, 0) => (1, 0)
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Refcount for lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183:96be9922-4bb1-4fea-8c09-c5e29a5475a0 set => (1, 0b)
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/lvchange', '-ay', '/dev/VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183/VHD-96be9922-4bb1-4fea-8c09-c5e29a5475a0']
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] ['/usr/bin/vhd-util', 'query', '--debug', '-vsf', '-n', '/dev/VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183/VHD-96be9922-4bb1-4fea-8c09-c5e29a5475a0']
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] ['/usr/bin/vhd-util', 'set', '--debug', '-n', '/dev/VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183/VHD-96be9922-4bb1-4fea-8c09-c5e29a5475a0', '-f', 'hidden', '-v', '1']
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Deleting vdi: 96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Entering deleteVdi
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] entering updateVdi
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Entering getMetadataToWrite
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Entering VDI info
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Entering VDI info
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Refcount for lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183:96be9922-4bb1-4fea-8c09-c5e29a5475a0 (1, 0) + (-1, 0) => (0, 0)
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] Refcount for lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183:96be9922-4bb1-4fea-8c09-c5e29a5475a0 set => (0, 0b)
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:07 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/lvchange', '-an', '/dev/VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183/VHD-96be9922-4bb1-4fea-8c09-c5e29a5475a0']
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/dmsetup', 'status', 'VG_XenStorage--4534d3f4--59d6--f7ce--93b7--bafc382ed183-VHD--96be9922--4bb1--4fea--8c09--c5e29a5475a0']
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/lvremove', '-f', '/dev/VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183/VHD-96be9922-4bb1-4fea-8c09-c5e29a5475a0']
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/dmsetup', 'status', 'VG_XenStorage--4534d3f4--59d6--f7ce--93b7--bafc382ed183-VHD--96be9922--4bb1--4fea--8c09--c5e29a5475a0']
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: closed /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: unlinking lock file /var/lock/sm/lvm-4534d3f4-59d6-f7ce-93b7-bafc382ed183/96be9922-4bb1-4fea-8c09-c5e29a5475a0
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] Setting virtual_allocation of SR 4534d3f4-59d6-f7ce-93b7-bafc382ed183 to 7884496699392
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] ['/sbin/vgs', '--noheadings', '--nosuffix', '--units', 'b', 'VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183']
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357]   pread SUCCESS
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: opening lock file /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/running
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: tried lock /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/running, acquired: True (exists: True)
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/running
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] Kicking GC
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SMGC: [22357] === SR 4534d3f4-59d6-f7ce-93b7-bafc382ed183: gc ===
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SMGC: [22432] Will finish as PID [22433]
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: closed /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: opening lock file /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/running
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: opening lock file /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/gc_active
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SMGC: [22357] New PID [22432]
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: opening lock file /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/sr
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] LVMCache created for VG_XenStorage-4534d3f4-59d6-f7ce-93b7-bafc382ed183
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: tried lock /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/gc_active, acquired: True (exists: True)
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22433] lock: tried lock /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/sr, acquired: False (exists: True)
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:08 SDE-RK8-R6525-P01-H01 SM: [22357] lock: released /var/lock/sm/4534d3f4-59d6-f7ce-93b7-bafc382ed183/sr
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Setting LVM_DEVICE to /dev/disk/by-scsid/360002ac00000000000000096000205b4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Setting LVM_DEVICE to /dev/disk/by-scsid/360002ac00000000000000096000205b4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: opening lock file /var/lock/sm/87edd82a-f612-d1f9-bfcd-bc2cae7cff98/sr
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] LVMCache created for VG_XenStorage-87edd82a-f612-d1f9-bfcd-bc2cae7cff98
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: opening lock file /var/lock/sm/.nil/lvm
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] ['/sbin/vgs', '--readonly', 'VG_XenStorage-87edd82a-f612-d1f9-bfcd-bc2cae7cff98']
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510]   pread SUCCESS
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/87edd82a-f612-d1f9-bfcd-bc2cae7cff98/sr
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] LVMCache: will initialize now
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] LVMCache: refreshing
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] ['/sbin/lvs', '--noheadings', '--units', 'b', '-o', '+lv_tags', '/dev/VG_XenStorage-87edd82a-f612-d1f9-bfcd-bc2cae7cff98']
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510]   pread SUCCESS
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: released /var/lock/sm/87edd82a-f612-d1f9-bfcd-bc2cae7cff98/sr
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Entering _checkMetadataVolume
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/87edd82a-f612-d1f9-bfcd-bc2cae7cff98/sr
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] vdi_snapshot {'sr_uuid': '87edd82a-f612-d1f9-bfcd-bc2cae7cff98', 'subtask_of': 'DummyRef:|f06fb873-be2b-46ff-8910-e0489003b63c|VDI.snapshot', 'vdi_ref': 'OpaqueRef:f60beba2-8202-4a43-be62-5935e240be01', 'vdi_on_boot': 'persist', 'args': [], 'vdi_location': 'e0d805fb-d325-4af7-af40-a4249b5ddec4', 'host_ref': 'OpaqueRef:f2af7892-f475-4c61-a289-3cb7bffc07c4', 'session_ref': 'OpaqueRef:5e183ddc-f65c-4dff-8322-0e909c489425', 'device_config': {'SCSIid': '360002ac00000000000000096000205b4', 'SRmaster': 'true'}, 'command': 'vdi_snapshot', 'vdi_allow_caching': 'false', 'sr_ref': 'OpaqueRef:85b38e21-10be-4ea3-8578-65686851a0e1', 'driver_params': {'type': 'internal'}, 'vdi_uuid': 'e0d805fb-d325-4af7-af40-a4249b5ddec4'}
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/.nil/lvm
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: released /var/lock/sm/.nil/lvm
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Pause request for e0d805fb-d325-4af7-af40-a4249b5ddec4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Calling tap-pause on host OpaqueRef:e352bec8-d91e-4cfe-af1b-8f53876c4e56
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] LVHDVDI._snapshot for e0d805fb-d325-4af7-af40-a4249b5ddec4 (type 3)
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: opening lock file /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/e0d805fb-d325-4af7-af40-a4249b5ddec4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/e0d805fb-d325-4af7-af40-a4249b5ddec4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:e0d805fb-d325-4af7-af40-a4249b5ddec4 (1, 0) + (1, 0) => (2, 0)
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:e0d805fb-d325-4af7-af40-a4249b5ddec4 set => (2, 0b)
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] lock: released /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/e0d805fb-d325-4af7-af40-a4249b5ddec4
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] ['/usr/bin/vhd-util', 'query', '--debug', '-vsf', '-n', '/dev/VG_XenStorage-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/VHD-e0d805fb-d325-4af7-af40-a4249b5ddec4']
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510]   pread SUCCESS
              May 15 22:58:09 SDE-RK8-R6525-P01-H01 SM: [22510] ['/usr/bin/vhd-util', 'scan', '-f', '-m', 'VHD-e0d805fb-d325-4af7-af40-a4249b5ddec4', '-l', 'VG_XenStorage-87edd82a-f612-d1f9-bfcd-bc2cae7cff98', '-a']
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510]   pread SUCCESS
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] lock: opening lock file /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/77aed767-3cd9-428b-b088-a16d1a04d91d
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/77aed767-3cd9-428b-b088-a16d1a04d91d
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:77aed767-3cd9-428b-b088-a16d1a04d91d (1, 0) + (1, 0) => (2, 0)
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:77aed767-3cd9-428b-b088-a16d1a04d91d set => (2, 0b)
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] lock: released /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/77aed767-3cd9-428b-b088-a16d1a04d91d
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] lock: opening lock file /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/00a5a7dd-5651-4ee2-aba1-dbae12c3dae9
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] lock: acquired /var/lock/sm/lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98/00a5a7dd-5651-4ee2-aba1-dbae12c3dae9
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:00a5a7dd-5651-4ee2-aba1-dbae12c3dae9 (0, 0) + (1, 0) => (1, 0)
              May 15 22:58:10 SDE-RK8-R6525-P01-H01 SM: [22510] Refcount for lvm-87edd82a-f612-d1f9-bfcd-bc2cae7cff98:00a5a7dd-5651-4ee2-aba1-dbae12c3dae9 set => (1, 0b)
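
              For anyone wanting to pull the same window from their own host, a time-based grep along these lines should reproduce it (timestamp format taken from the entries above):

                grep 'May 15 22:58' /var/log/SMlog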
              