XCP-ng

    VDI missing after trying to fix backup issue

    Solved | Backup
    anik

      Hi,

      A little over a week ago, the backups for one of my VMs started having hiccups after I accidentally rebooted XO (or restarted the toolstack, I can't remember which) during a backup...

      After that hiccup, my delta backup started throwing this error:

      Error: The value of "offset" is out of range. It must be >= 0 and <= 102399. Received 102400
      

      I kept getting this error every day, so I tried resetting the backups by removing the existing snapshot.

      Removing the snapshot did not help, so I disabled CBT, removed the snapshot and all existing delta backups as well, waited for the coalesce to finish, then tried the backup again and got this error:

      Error: SR_BACKEND_FAILURE_82(, Failed to snapshot VDI [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:2ebd2753-709c-4bf9-afdb-9ce38938b5cf', 'paused']], )
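
      (For reference, a minimal sketch of how CBT can be disabled per disk from dom0; the VM UUID is the one from this thread, the VDI UUID is a placeholder:)

      # find the VDIs attached to the VM via its VBDs
      xe vbd-list vm-uuid=d5d0334c-a7e3-b29f-51ca-1be9c211d2c1 params=vdi-uuid
      # disable changed block tracking on each VDI
      xe vdi-disable-cbt uuid=<vdi-uuid>
      # confirm CBT is now off
      xe vdi-list uuid=<vdi-uuid> params=cbt-enabled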
      

      After this I tried migrating the VM to another host and got this error:

      vm.migrate
      {
        "vm": "d5d0334c-a7e3-b29f-51ca-1be9c211d2c1",
        "migrationNetwork": "1f6f4495-1045-6fe2-3da6-4e43862e623d",
        "sr": "6b24cd1c-22ad-0994-5b6b-a75389a6ddba",
        "targetHost": "d09b0fef-7aab-4516-a21e-0f72806655aa"
      }
      {
        "code": "SR_BACKEND_FAILURE_202",
        "params": [
          "",
          "General backend error [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:2ebd2753-709c-4bf9-afdb-9ce38938b5cf', 'paused']]",
          ""
        ],
        "task": {
          "uuid": "1d7d6b6c-1d5d-f166-c1e9-d42a94a08253",
          "name_label": "Async.VDI.disable_cbt",
          "name_description": "",
          "allowed_operations": [],
          "current_operations": {},
          "created": "20250613T08:15:55Z",
          "finished": "20250613T08:15:56Z",
          "status": "failure",
          "resident_on": "OpaqueRef:010eebba-be27-489f-9f87-d06c8b675f19",
          "progress": 1,
          "type": "<none/>",
          "result": "",
          "error_info": [
            "SR_BACKEND_FAILURE_202",
            "",
            "General backend error [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:2ebd2753-709c-4bf9-afdb-9ce38938b5cf', 'paused']]",
            ""
          ],
          "other_config": {},
          "subtask_of": "OpaqueRef:NULL",
          "subtasks": [],
          "backtrace": "(((process xapi)(filename lib/backtrace.ml)(line 210))((process xapi)(filename ocaml/xapi/storage_access.ml)(line 32))((process xapi)(filename ocaml/xapi/xapi_vdi.ml)(line 1403))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 131))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 205))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 95)))"
        },
        "message": "SR_BACKEND_FAILURE_202(, General backend error [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:2ebd2753-709c-4bf9-afdb-9ce38938b5cf', 'paused']], )",
        "name": "XapiError",
        "stack": "XapiError: SR_BACKEND_FAILURE_202(, General backend error [opterr=['MAP_DUPLICATE_KEY', 'VDI', 'sm_config', 'OpaqueRef:2ebd2753-709c-4bf9-afdb-9ce38938b5cf', 'paused']], )
          at Function.wrap (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/_XapiError.mjs:16:12)
          at default (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/_getTaskResult.mjs:13:29)
          at Xapi._addRecordToCache (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1072:24)
          at file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1106:14
          at Array.forEach (<anonymous>)
          at Xapi._processEvents (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1096:12)
          at Xapi._watchEvents (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1269:14)"
      }
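
      (A hedged aside: the MAP_DUPLICATE_KEY error names a 'paused' key in that VDI's sm_config map, which can at least be inspected from dom0; the VDI UUID below is a placeholder:)

      # dump all parameters of the VDI, including the sm-config map
      xe vdi-param-list uuid=<vdi-uuid> | grep sm-config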
      

      I then tried restarting the VM and got this error:

      vm.restart
      {
        "id": "d5d0334c-a7e3-b29f-51ca-1be9c211d2c1",
        "force": true,
        "bypassBlockedOperation": false
      }
      {
        "code": "SR_BACKEND_FAILURE_46",
        "params": [
          "",
          "The VDI is not available [opterr=VDI c6853f48-4b06-4c34-9707-b68f9054e6fc locked]",
          ""
        ],
        "task": {
          "uuid": "421e7331-322d-5db1-02eb-62a6e443f4e7",
          "name_label": "Async.VM.hard_reboot",
          "name_description": "",
          "allowed_operations": [],
          "current_operations": {},
          "created": "20250613T08:27:36Z",
          "finished": "20250613T08:31:42Z",
          "status": "failure",
          "resident_on": "OpaqueRef:010eebba-be27-489f-9f87-d06c8b675f19",
          "progress": 1,
          "type": "<none/>",
          "result": "",
          "error_info": [
            "SR_BACKEND_FAILURE_46",
            "",
            "The VDI is not available [opterr=VDI c6853f48-4b06-4c34-9707-b68f9054e6fc locked]",
            ""
          ],
          "other_config": {},
          "subtask_of": "OpaqueRef:NULL",
          "subtasks": [],
          "backtrace": "(((process xapi)(filename ocaml/xapi-client/client.ml)(line 7))((process xapi)(filename ocaml/xapi-client/client.ml)(line 19))((process xapi)(filename ocaml/xapi-client/client.ml)(line 6122))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 134))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 205))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 95)))"
        },
        "message": "SR_BACKEND_FAILURE_46(, The VDI is not available [opterr=VDI c6853f48-4b06-4c34-9707-b68f9054e6fc locked], )",
        "name": "XapiError",
        "stack": "XapiError: SR_BACKEND_FAILURE_46(, The VDI is not available [opterr=VDI c6853f48-4b06-4c34-9707-b68f9054e6fc locked], )
          at Function.wrap (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/_XapiError.mjs:16:12)
          at default (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/_getTaskResult.mjs:13:29)
          at Xapi._addRecordToCache (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1072:24)
          at file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1106:14
          at Array.forEach (<anonymous>)
          at Xapi._processEvents (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1096:12)
          at Xapi._watchEvents (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/index.mjs:1269:14)"
      }
      

      I then noticed that the VM is actually frozen and the VDI cannot be found anywhere, so obviously the VM won't start...

      Then I spotted this error in the Storage > Advanced tab:

      Failed to unpause tapdisk for VDI c6853f48-4b06-4c34-9707-b68f9054e6fc, VMs using this tapdisk have lost access to the corresponding disk(s)
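
      (If you hit this state, the live tapdisk instances can be checked from dom0; a sketch:)

      # list running tapdisks and the VHD paths they serve
      tap-ctl list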
      

      I can see the disks in XO and in the CLI:

      xe vdi-list uuid=c6853f48-4b06-4c34-9707-b68f9054e6fc
      uuid ( RO)                : c6853f48-4b06-4c34-9707-b68f9054e6fc
                name-label ( RW): vm_disk1
          name-description ( RW): xvda1
                   sr-uuid ( RO): 6b24cd1c-22ad-0994-5b6b-a75389a6ddba
              virtual-size ( RO): 53689188352
                  sharable ( RO): false
                 read-only ( RO): false
      
      

      But when I check my storage repository in /var/run/sr-mount/, the file is not actually there:

      file c6853f48-4b06-4c34-9707-b68f9054e6fc
      c6853f48-4b06-4c34-9707-b68f9054e6fc: cannot open (No such file or directory)
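
      (A handy way to see what is actually present in a file-based SR is to print the whole VHD tree from dom0; a sketch using the SR UUID from this thread:)

      # print the VHD parent/child tree for every .vhd in the SR mount
      vhd-util scan -f -p -m "/var/run/sr-mount/6b24cd1c-22ad-0994-5b6b-a75389a6ddba/*.vhd"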
      
      

      Since I had just deleted all backups before I began this operation, my only hope is that the "base copy" VDI, which still exists, will allow me to recover the VM.

      file 3507a5eb-748b-4d13-bdc8-e8128da0bfdb.vhd
      3507a5eb-748b-4d13-bdc8-e8128da0bfdb.vhd: Microsoft Disk Image, Virtual Server or Virtual PC
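
      (Before relying on the base copy, it can be sanity-checked and its header inspected; a sketch:)

      # verify the VHD's metadata integrity
      vhd-util check -n 3507a5eb-748b-4d13-bdc8-e8128da0bfdb.vhd
      # print its virtual size, and its parent if it has one
      vhd-util query -n 3507a5eb-748b-4d13-bdc8-e8128da0bfdb.vhd -v
      vhd-util query -n 3507a5eb-748b-4d13-bdc8-e8128da0bfdb.vhd -p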
      

      Is this my only hope of recovering the disk, or is there some other way I have missed?

      anik

        UPDATE:

        I managed to find the missing VDI "c6853f48-4b06-4c34-9707-b68f9054e6fc"; I believe it was from the last successful backup taken. Luckily I have snapshots of the backup folder on TrueNAS as well!

        root@truenas01[.../d5d0334c-a7e3-b29f-51ca-1be9c211d2c1]# ll
        total 341
        drwxr-xr-x  3 root      19 Jun  7 08:35 ./
        drwxr-xr-x 31 nobody    31 Jun  7 09:08 ../
        -rw-r--r--  1 root   20305 Oct  7  2024 20241007T113432Z.json
        -rw-r--r--  1 root   20131 May 27 10:51 20250527T065026Z.json
        -rw-r--r--  1 root   20327 May 28 09:13 20250528T050955Z.json
        -rw-r--r--  1 root   20327 May 28 13:18 20250528T091600Z.json
        -rw-r--r--  1 root   20440 May 29 08:35 20250529T044854Z.json
        -rw-r--r--  1 root   20327 May 30 09:13 20250530T051201Z.json
        -rw-r--r--  1 root   20131 May 30 10:43 20250530T064322Z.json
        -rw-r--r--  1 root   20327 May 31 09:13 20250531T051135Z.json
        -rw-r--r--  1 root   20327 Jun  1 09:13 20250601T051144Z.json
        -rw-r--r--  1 root   20327 Jun  2 09:12 20250602T051127Z.json
        -rw-r--r--  1 root   20327 Jun  3 09:17 20250603T050936Z.json
        -rw-r--r--  1 root   20327 Jun  4 09:07 20250604T050627Z.json
        -rw-r--r--  1 root   20327 Jun  5 09:07 20250605T050623Z.json
        -rw-r--r--  1 root   20327 Jun  6 09:07 20250606T050650Z.json
        -rw-r--r--  1 root   20327 Jun  7 09:08 20250607T050648Z.json
        -rw-r--r--  1 root   17855 Jun  7 09:08 cache.json.gz
        drwx------  5 root       5 Oct  7  2024 vdis/
        root@truenas01[.../d5d0334c-a7e3-b29f-51ca-1be9c211d2c1]# cd vdis
        root@truenas01[...334c-a7e3-b29f-51ca-1be9c211d2c1/vdis]# cd 5a7a5647-d2ca-4014-b87b-06c621c09a20
        root@truenas01[.../5a7a5647-d2ca-4014-b87b-06c621c09a20]# ls
        9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5  c6853f48-4b06-4c34-9707-b68f9054e6fc
        root@truenas01[.../5a7a5647-d2ca-4014-b87b-06c621c09a20]# cd c6853f48-4b06-4c34-9707-b68f9054e6fc
        root@truenas01[.../c6853f48-4b06-4c34-9707-b68f9054e6fc]# ll -hl
        total 21G
        drwx------ 2 root   3 Oct  7  2024 ./
        drwx------ 4 root   4 Oct  7  2024 ../
        -rw-r--r-- 1 root 50G Oct  7  2024 20241007T113432Z.vhd
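
        (For anyone in the same spot: ZFS snapshots are exposed read-only under the hidden .zfs/snapshot directory, so a single file can be copied back out without rolling back the whole dataset; a sketch with hypothetical pool, dataset and snapshot names:)

        # list the snapshots available for the backup dataset
        zfs list -t snapshot -r tank/backups
        # copy the missing VHD back out of a snapshot into the live dataset
        cp -a /mnt/tank/backups/.zfs/snapshot/auto-20250607/<path>/20241007T113432Z.vhd \
              /mnt/tank/backups/<path>/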
        

        Do I only need to restore this 20241007T113432Z.vhd and the VM should be back, since the base disk still exists, or is there more to it?

        anik @anik

          OK, I managed to restore all the .vhd files from the TrueNAS snapshots, but I still cannot start the VM, because one of the disks is stuck attached to dom0:

          vm.start
          {
            "id": "d5d0334c-a7e3-b29f-51ca-1be9c211d2c1",
            "bypassMacAddressesCheck": false,
            "force": false
          }
          {
            "code": "SR_BACKEND_FAILURE_46",
            "params": [
              "",
              "The VDI is not available [opterr=VDI 9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5 not detached cleanly]",
              ""
            ],
            "call": {
              "duration": 4622,
              "method": "VM.start",
              "params": [
                "* session id *",
                "OpaqueRef:cd8f9bbd-de5e-49a4-bbc0-38abdff5f1ac",
                false,
                false
              ]
            },
            "message": "SR_BACKEND_FAILURE_46(, The VDI is not available [opterr=VDI 9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5 not detached cleanly], )",
            "name": "XapiError",
            "stack": "XapiError: SR_BACKEND_FAILURE_46(, The VDI is not available [opterr=VDI 9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5 not detached cleanly], )
              at Function.wrap (file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/_XapiError.mjs:16:12)
              at file:///opt/xo/xo-builds/xen-orchestra-202506030701/packages/xen-api/transports/json-rpc.mjs:38:21
              at runNextTicks (node:internal/process/task_queues:65:5)
              at processImmediate (node:internal/timers:453:9)
              at process.callbackTrampoline (node:internal/async_hooks:130:17)"
          }
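
          (To see which VBDs still reference the disk, including any plugged into dom0, they can be listed by VDI; the UUID below is the one from the error above:)

          # list every VBD for this VDI and whether it is currently attached
          xe vbd-list vdi-uuid=9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5 params=uuid,vm-uuid,vm-name-label,currently-attached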
          

          I tried manually unplugging the VBD:

          xe vbd-unplug uuid=60c80d37-351c-22e0-81c6-bae24082f4e1 --force
          The server failed to handle your request, due to an internal error. The given message may give details useful for debugging the problem.
          message: Expected 0 or 1 VDI with datapath, had 2
          

          I already tried restarting the toolstack; it didn't help.

            olivierlambert (Vates 🪐 Co-Founder & CEO)

            You need to forget the VDI, then rescan the SR, and that should do the trick.

              anik @olivierlambert

              @olivierlambert

              xe vdi-forget uuid=c6853f48-4b06-4c34-9707-b68f9054e6fc
              xe sr-scan uuid=6b24cd1c-22ad-0994-5b6b-a75389a6ddba
              xe vdi-list uuid=c6853f48-4b06-4c34-9707-b68f9054e6fc params=sr-uuid
              sr-uuid ( RO)    : 6b24cd1c-22ad-0994-5b6b-a75389a6ddba
              

              Yup, that worked, thanks 🙂
