VDI_IO_ERROR (Device I/O errors) when running a scheduled backup
-
Hi good day.
I have an installation of XCP-ng 8.1, xo-server 5.57.3 and xo-web 5.57.1. When I try to make delta backups of VMs larger than 500 GB, I get the following error:
VDI_IO_ERROR (Device I/O errors)
I looked for a solution on the internet without success.
I ran several tests and found that I had Timeout = 1 set in the advanced options.
This caused the backup to be cancelled after one hour, but the backup status reported the following:
transfer Start: May 3, 2020, 4:15:46 PM End: May 3, 2020, 5:15:11 PM Duration: an hour Error: VDI_IO_ERROR (Device I/O errors)
The error makes no mention of the timeout cancellation, which makes the problem confusing to track down.
The solution: increase the timeout value or set it to zero, after which the backup worked correctly.
(Sorry for the translation; it was done with Google Translate.)
-
-
May be unrelated, but I had a similar issue on my most recent backup. Here's the relevant portion of the log --
"result": { "code": "VDI_IO_ERROR", "params": [ "Device I/O errors" ], "task": { "uuid": "007fb970-600c-5dc8-c527-c047778b2854", "name_label": "[XO] VDI Export", "name_description": "xxxx03", "allowed_operations": [], "current_operations": {}, "created": "20200504T05:09:54Z", "finished": "20200504T06:01:31Z", "status": "failure", "resident_on": "OpaqueRef:e2781353-8cd6-0938-108d-1b7933d4c739", "progress": 1, "type": "<none/>", "result": "", "error_info": [ "VDI_IO_ERROR", "Device I/O errors" ], "other_config": {}, "subtask_of": "OpaqueRef:NULL", "subtasks": [], "backtrace": "(((process xapi)(filename ocaml/xapi/vhd_tool_wrapper.ml)(line 59))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename ocaml/xapi/export_raw_vdi.ml)(line 41))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename ocaml/xapi/export_raw_vdi.ml)(line 62)))" }, "message": "VDI_IO_ERROR(Device I/O errors)", "name": "XapiError", "stack": "XapiError: VDI_IO_ERROR(Device I/O errors)\n at Function.wrap (/opt/xen-orchestra/packages/xen-api/src/_XapiError.js:16:12)\n at _default (/opt/xen-orchestra/packages/xen-api/src/_getTaskResult.js:11:29)\n at Xapi._addRecordToCache (/opt/xen-orchestra/packages/xen-api/src/index.js:837:24)\n at forEach (/opt/xen-orchestra/packages/xen-api/src/index.js:858:14)\n at Array.forEach (<anonymous>)\n at Xapi._processEvents (/opt/xen-orchestra/packages/xen-api/src/index.js:848:12)\n at /opt/xen-orchestra/packages/xen-api/src/index.js:1009:14\n at Generator.next (<anonymous>)\n at asyncGeneratorStep (/opt/xen-orchestra/packages/xen-api/dist/index.js:62:103)\n at _next (/opt/xen-orchestra/packages/xen-api/dist/index.js:64:194)\n at tryCatcher (/opt/xen-orchestra/node_modules/bluebird/js/release/util.js:16:23)\n at Promise._settlePromiseFromHandler (/opt/xen-orchestra/node_modules/bluebird/js/release/promise.js:547:31)\n at Promise._settlePromise (/opt/xen-orchestra/node_modules/bluebird/js/release/promise.js:604:18)\n at Promise._settlePromise0 (/opt/xen-orchestra/node_modules/bluebird/js/release/promise.js:649:10)\n at Promise._settlePromises (/opt/xen-orchestra/node_modules/bluebird/js/release/promise.js:729:18)\n at _drainQueueStep (/opt/xen-orchestra/node_modules/bluebird/js/release/async.js:93:12)\n at _drainQueue (/opt/xen-orchestra/node_modules/bluebird/js/release/async.js:86:9)\n at Async._drainQueues (/opt/xen-orchestra/node_modules/bluebird/js/release/async.js:102:5)\n at Immediate.Async.drainQueues [as _onImmediate] (/opt/xen-orchestra/node_modules/bluebird/js/release/async.js:15:14)\n at processImmediate (internal/timers.js:456:21)" } },
This is with xo-server 5.59.0 and xo-web 5.59.0. I have
timeout=6
under advanced settings. I restarted the failed backups and they completed on the 2nd attempt.
-
I don't have an immediate solution, but the goal for the future is to make the timeout cancellation clearly visible in the logs.
-
Having the same issue after the latest update in 8.2. It was working fine before. I am not sure what caused the issue. Is there a way to revert the latest update?
-
@fachex I doubt the latest 8.2 updates caused this directly. However, the update process (updating software, rebooting) might reveal issues that were already there, waiting to become visible.
The first step would be to look at the logs for errors: https://xcp-ng.org/docs/troubleshooting.html
I'd start with
/var/log/SMlog
since this is storage-related. Also check kernel messages.
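For example, something along these lines on the host, around the time of the failed backup (just a sketch; adjust the patterns and time window to your case):
grep -iE "error|exception|fail" /var/log/SMlog | tail -n 100
dmesg -T | grep -iE "error|fail" | tail -n 50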
-
@stormi The problem continues. I decided to reinstall without applying updates, and we are experiencing the same problem.
-
@olivierlambert No further feedback on this problem? We reinstalled the whole server with XCP-ng 8.2 and we are still experiencing it. We are unable to do delta backups.
-
Well, my answer from 7 days ago still stands: please check the logs for more information about that I/O error.
-
@stormi There is really nothing there. I've checked, but there is nothing related to this problem. Now let me explain my setup; maybe I am doing something wrong.
- XCP-ng 8.2 with a VM (Debian, XOA)
- I've created an LV storage and attached it to the XOA machine.
- I back up "locally": the remote is a "local" storage on the XOA machine, but it is actually an LVM volume located on another disk. I hope that makes sense.
- The VMs to be backed up are on other XCP-ng servers.
The problem appears to be that it does not have permission to write, or otherwise cannot write, to that LV partition.
The weird thing is that it worked before, and after we updated all our hypervisors, it stopped working.
To test, I created a VM on the same hypervisor where the XOA is located and tried to back up that very machine, and I got the same error.
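As a sanity check for the permission theory, something like this from inside the XOA VM should rule it in or out (a rough sketch; /mnt/backup-lv is a made-up path standing in for the actual mount point of the LVM remote):
mount | grep backup-lv
df -h /mnt/backup-lv
touch /mnt/backup-lv/xo-write-test && rm /mnt/backup-lv/xo-write-test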
-
@fachex said in VDI_IO_ERROR (Device I/O errors) when running a scheduled backup:
@olivierlambert No further feedback on this problem? We reinstalled the whole server with XCP-ng 8.2 and we are still experiencing it. We are unable to do delta backups.
Hi @fachex, @stormi
If you upgrade XCP-ng, remember that the installation clears all custom settings on the XCP-ng host. In my case, the installation deleted the folders where the mount point of the external device was.
Maybe this information will help you with your LVM permission problem.
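As an illustration only (the path and device below are hypothetical, not taken from your setup), this is the kind of thing that has to be recreated by hand on the host after such an installation:
mkdir -p /media/external-backup
mount /dev/sdX1 /media/external-backup
ls -ld /media/external-backup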
-
@cbaguzman Thank you! Unfortunately, I can't reproduce that. After the update, backups were not working at all; even those that "worked" failed when we tried to restore them. So we decided to reinstall 8.2 and not apply updates. But in the process, we think we configured things differently, and now we can't reproduce what we did before, even with the documentation we have. The moral of the story? Don't make your backup systems too complicated.
-
Does this mean it's working fine now?
-
@stormi I confess we're now encountering the same error message on nearly all our backups, including CR to a local host. We started from a fresh install with cleaned SRs; to avoid memory being the culprit we have upped the memory for dom0 to 16 GB (128 GB machine), and XO is running with 16 GB of memory, of which 12 GB is allocated to Node.
We've got the same problem occurring across all our hosts. Over 90% of backups error out with VDI_IO_ERROR; however (weirdly), looking at the target end, I'd say that 75% of the backups 'seem' to complete successfully. I need to restore a couple to find out for sure, but I confess I've been concentrating on finding out what triggers the error rather than on whether it is misleading.
I've gone through the logs in detail and unfortunately nothing jumps out. I'm going to take the time to extract the relevant sections from them all to see if you can spot something that I can't, but apart from lots of memory messages from squeezed there aren't any obvious errors.
Bizarrely SMlog is pretty clean - it's almost like it receives a termination signal from somewhere rather than erroring out of its own accord - for example tapdisk shuts down with "Closing after 0 errors" and no further explanation. I have found some talk that tapdisk can trigger a target reset after excessive i/o activity but I've not managed to prove that yet.
I'll keep digging into things; in short, it's not something only experienced by @fachex, but I haven't yet recreated it in XOA - it's on my to-do list.
(If you want me to download all the logs and send them across directly, or to do anything under the covers of XOA, please do let me know. I'm dipping into this when I get time, so it's not a continuous effort I'm afraid.)
-
We can't promise to put a lot of time into investigating it, and the causes could be various, but feel free to share the logs so that anyone with a bit of time can have a quick look for obvious issues.
-
I have recreated this using XOA; unfortunately it failed as described above. I'm afraid this is going to be a long, painful post with lots of logs, so my apologies in advance, but hopefully someone cleverer than I am can see something that I can't.
Setup: Cleaned all the remnant VMs and VDIs from previous runs before starting the backup (for some reason we seem to get a lot that aren't listed in the GUI but show up in the CLI); same for the halted VMs.
Objective: Three VMs are being backed up from the source (xen11-lon1 in the logs) via CR to an identical target host (xen12-lon2 in the logs) on the local LAN. The 1st backup (98 GB VM) succeeds, the second (68 GB) fails after 7 minutes - this is the one we're discussing - and the third is automatically cancelled. With concurrency set to 1 we get a failure probably 50% of the time; with concurrency set to 2 it fails every time.
Outcome:
Attached is the backup log from XOA: backup NG.log.txt. Looking through the source's logs, there are quite a few storage errors repeating all the way through xensource.log, e.g.:
xapi: [error||3 ||storage_migrate] Tapdisk mirroring has failed xapi: [debug||181905 ||storage_access] Received update: ["Mirror","34bd3a97-3562-75af-a24e-a266caf368e3/269a33a0-4a59-4cc3-8b29-1245c402e51c"] xapi: [debug||23 |sm_events D:2b460f139f9e|storage_access] sm event on mirror: 34bd3a97-3562-75af-a24e-a266caf368e3/269a33a0-4a59-4cc3-8b29-1245c402e51c xapi: [debug||181905 ||storage_access] Calling UPDATES.get Async.VM.migrate_send R:61453ededbc7 3663 30 xapi: [ info||23 |sm_events D:2b460f139f9e|storage_impl] DATA.MIRROR.stat dbg:sm_events D:2b460f139f9e id:34bd3a97-3562-75af-a24e-a266caf368e3/269a33a0-4a59-4cc3-8b29-1245c402e51c xapi: [debug||23 |sm_events D:2b460f139f9e|storage_access] Mirror 34bd3a97-3562-75af-a24e-a266caf368e3/269a33a0-4a59-4cc3-8b29-1245c402e51c has failed xapi: [debug||23 |sm_events D:2b460f139f9e|storage_access] Mirror associated with task: OpaqueRef:61453ede-dbc7-4aed-bca3-00b91253bdda xapi: [error||23 |sm_events D:2b460f139f9e|storage_access] storage event: Caught Db_exn.DBCache_NotFound("missing row", "task", "OpaqueRef:61453ede-dbc7-4aed-bca3-00b91253bdda") while updating mirror
...whilst on the target we get the following xensource.log errors at the point the backup starts (even though the first VM backed up successfully?!):
Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80|Importing raw VDI R:063fb70f6b64|vhd_tool_wrapper] vhd-tool failed, returning VDI_IO_ERROR Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80|Importing raw VDI R:063fb70f6b64|vhd_tool_wrapper] vhd-tool output: vhd-tool: Non-zero size required (either a pre-existing destination file or specified via --destination-size on the command line)\x0A Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621685 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.slave_login D:e90616ae272a created by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621685 UNIX /var/lib/xcp/xapi|session.slave_login D:e20578ac50c9|xapi_session] Session.create trackid=dabf19084add1449e2bb4eefb181a0fe pool=true uname= originator=xapi is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621686 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:pool.get_all D:b215ffc57ba1 created by task D:e20578ac50c9 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:VBD.unplug D:6ea0c0aa3771 created by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621687 UNIX /var/lib/xcp/xapi||taskhelper] task VBD.unplug R:1ac7c9ffd569 (uuid:3078efb7-ffd8-2b8a-f6d9-c7b3be9feb98) created (trackid=dabf19084add1449e2bb4eefb181a0fe) by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|audit] VBD.unplug: VBD = '4ccde4ad-9d36-948a-db99-abfa6b80e1ec' Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|vbdops] VBD.unplug of loopback VBD 'OpaqueRef:d7d882f5-b195-4ece-981f-00fc9511ee8a' Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] DP.destroy dbg:OpaqueRef:1ac7c9ff-d569-4e1d-8fe2-688917213cd2 dp:vbd/0/xvda allow_leak:false Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] [destroy_sr] Filtered VDI count:0 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: message repeated 3 times: [ [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] [destroy_sr] Filtered VDI count:0] Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] [destroy_sr] Filtered VDI count:1 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] [destroy_sr] VDI found with the dp is f9d37af0-10d4-4824-8306-f849c0c74d52 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] dbg:OpaqueRef:1ac7c9ff-d569-4e1d-8fe2-688917213cd2 dp:vbd/0/xvda sr:cf2dbaa3-21f3-903b-0fd1-fbe68539f897 vdi:f9d37af0-10d4-4824-8306-f849c0c74d52 superstate:activated RW Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621687 UNIX /var/lib/xcp/xapi|VBD.unplug R:1ac7c9ffd569|storage_impl] dbg:OpaqueRef:1ac7c9ff-d569-4e1d-8fe2-688917213cd2 dp:vbd/0/xvda sr:cf2dbaa3-21f3-903b-0fd1-fbe68539f897 vdi:f9d37af0-10d4-4824-8306-f849c0c74d52 superstate:activated RW Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621692 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:VBD.destroy D:7ee64ab7600c created by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621692 UNIX 
/var/lib/xcp/xapi||taskhelper] task VBD.destroy R:dd91e7b81fb2 (uuid:27a52aa2-68dd-1e3d-6c49-184ab9ccdfa7) created (trackid=dabf19084add1449e2bb4eefb181a0fe) by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621692 UNIX /var/lib/xcp/xapi|VBD.destroy R:dd91e7b81fb2|audit] VBD.destroy: VBD = '4ccde4ad-9d36-948a-db99-abfa6b80e1ec' Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621692 UNIX /var/lib/xcp/xapi|VBD.destroy R:dd91e7b81fb2|xapi_vbd_helpers] VBD.destroy (uuid = 4ccde4ad-9d36-948a-db99-abfa6b80e1ec; ref = OpaqueRef:d7d882f5-b195-4ece-981f-00fc9511ee8a) Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621693 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.logout D:bdbdfd349487 created by task R:063fb70f6b64 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621693 UNIX /var/lib/xcp/xapi|session.logout D:443bf85f4ecf|xapi_session] Session.destroy trackid=dabf19084add1449e2bb4eefb181a0fe Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80|Importing raw VDI R:063fb70f6b64|import] Caught exception: VDI_IO_ERROR: [ Device I/O errors ] Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621599 INET :::80|Importing raw VDI R:063fb70f6b64|taskhelper] the status of R:063fb70f6b64 is failure; cannot set it to `failure Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Importing raw VDI R:063fb70f6b64 failed with exception Server_error(VDI_IO_ERROR, [ Device I/O errors ]) Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Raised Server_error(VDI_IO_ERROR, [ Device I/O errors ]) Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 1/12 xapi Raised at file ocaml/xapi/vhd_tool_wrapper.ml, line 75 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 2/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 3/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 4/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 5/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 6/12 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 156 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 7/12 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 194 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 8/12 xapi Called from file ocaml/xapi/server_helpers.ml, line 100 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 9/12 xapi Called from file ocaml/xapi/server_helpers.ml, line 121 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 10/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 11/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 12/12 xapi Called from file lib/backtrace.ml, line 177 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [debug||621694 UNIX 
/var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.logout D:bdc1513b3d35 created by task D:39d05a9ec99c Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [ info||621694 UNIX /var/lib/xcp/xapi|session.logout D:cc7477457aac|xapi_session] Session.destroy trackid=758ce247f3c6d83f646e031b6504357f Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||import] Caught exception in import handler: VDI_IO_ERROR: [ Device I/O errors ] Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] VDI.import D:39d05a9ec99c failed with exception Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Raised Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 1/9 xapi Raised at file unix.ml, line 328 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 2/9 xapi Called from file lib/xapi-stdext-unix/unixext.ml, line 469 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 3/9 xapi Called from file lib/xapi-stdext-unix/unixext.ml, line 471 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 4/9 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 232 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 5/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 100 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 6/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 121 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 7/9 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 8/9 xapi Called from file map.ml, line 135 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 9/9 xapi Called from file sexp_conv.ml, line 147 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] VDI.import D:29850c1bafd6 failed with exception Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] Raised Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace] 1/1 xapi Raised at file (Thread 621599 has no backtrace table. Was with_backtraces called?, line 0 Jan 15 18:19:48 xcp-ng-xen12-lon2 xapi: [error||621599 INET :::80||backtrace]
...followed by the errors below once we get to the VM whose backup (as it turns out) fails - the error being 'duplicate key', even though the target host and SR were completely cleaned before the backup, including removing via the CLI the usual remnants that don't show in the GUI:
Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80|VM.add_to_other_config D:48da5df44120|sql] Duplicate key in set or map: table VM; field other_config; ref OpaqueRef:fde7f87b-0f36-4d02-810b-249a7cede061; key xo:backup:sr Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] VM.add_to_other_config D:48da5df44120 failed with exception Db_exn.Duplicate_key("VM", "other_config", "OpaqueRef:fde7f87b-0f36-4d02-810b-249a7cede061", "xo:backup:sr") Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] Raised Db_exn.Duplicate_key("VM", "other_config", "OpaqueRef:fde7f87b-0f36-4d02-810b-249a7cede061", "xo:backup:sr") Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 1/8 xapi Raised at file ocaml/database/db_cache_impl.ml, line 310 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 2/8 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 3/8 xapi Called from file ocaml/xapi/rbac.ml, line 231 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 4/8 xapi Called from file ocaml/xapi/server_helpers.ml, line 100 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 5/8 xapi Called from file ocaml/xapi/server_helpers.ml, line 121 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 6/8 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 7/8 xapi Called from file map.ml, line 135 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] 8/8 xapi Called from file sexp_conv.ml, line 147 Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [error||623599 INET :::80||backtrace] Jan 15 19:17:36 xcp-ng-xen12-lon2 xapi: [debug||623600 INET :::80||api_effect] VM.remove_from_other_config
...then lots more debug messages, concluding with the VDI_IO_ERROR being returned:
Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623670 UNIX /var/lib/xcp/xapi|VBD.plug R:caeff9d15528|vbdops] device path: sm/backend/cf2dbaa3-21f3-903b-0fd1-fbe68539f897/ea4a2c16-5268-4e53-9efb-c637198fe77d Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623653 INET :::80|Importing raw VDI R:ce4f60abdac4|vhd_tool_wrapper] Executing /bin/vhd-tool serve --source-format vhd --source-protocol none --source-fd 61bc8345-8fa7-4193-b514-99023da700e4 - -tar-filename-prefix --destination file:///dev/sm/backend/cf2dbaa3-21f3-903b-0fd1-fbe68539f897/ea4a2c16-5268-4e53-9efb-c637198fe77d --destination-format raw --progress --machine --direct --destination-size 0 --prezeroed Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623660 INET :::80|[XO] VDI Content Import R:d79ee3e4c7cd|vhd_tool_wrapper] Executing /bin/vhd-tool serve --source-format vhd --source-protocol none --source-fd 41731693-3426-42e6-9a43-5fda7eaf c59f --tar-filename-prefix --destination file:///dev/sm/backend/cf2dbaa3-21f3-903b-0fd1-fbe68539f897/ea4a2c16-5268-4e53-9efb-c637198fe77d --destination-format raw --progress --machine --direct --destination-size 75491041792 --prezeroed Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80|Importing raw VDI R:ce4f60abdac4|vhd_tool_wrapper] vhd-tool failed, returning VDI_IO_ERROR Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80|Importing raw VDI R:ce4f60abdac4|vhd_tool_wrapper] vhd-tool output: vhd-tool: Non-zero size required (either a pre-existing destination file or specified via --destination-size on the command line)\x0A Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623738 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.slave_login D:db69be120ea6 created by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623738 UNIX /var/lib/xcp/xapi|session.slave_login D:094ed09a64e9|xapi_session] Session.create trackid=ba066585a7083cd229be4a26ef154317 pool=true uname= originator=xapi is_local_superuser=true auth_user_sid= parent=trackid=9834f5af41c964e225f24279aefe4e49 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623739 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:pool.get_all D:9bcd81e4dd62 created by task D:094ed09a64e9 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:VBD.unplug D:a0deb2a382f6 created by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623740 UNIX /var/lib/xcp/xapi||taskhelper] task VBD.unplug R:c0cac1edb5c5 (uuid:8eab9844-7218-b1a9-a94a-ade7e36b70a3) created (trackid=ba066585a7083cd229be4a26ef154317) by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|audit] VBD.unplug: VBD = 'a375bd27-cea7-4263-ad72-33012568155e' Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|vbdops] VBD.unplug of loopback VBD 'OpaqueRef:314f1ee1-0dd4-462a-8177-cd1bcbde0f0d' Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] DP.destroy dbg:OpaqueRef:c0cac1ed-b5c5-4475-a71a-b4303260e39d dp:vbd/0/xvda allow_leak:false Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] [destroy_sr] Filtered VDI count:0 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: message repeated 3 times: [ [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] [destroy_sr] Filtered VDI count:0] Jan 15 19:17:40 
xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] [destroy_sr] Filtered VDI count:1 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] [destroy_sr] VDI found with the dp is ea4a2c16-5268-4e53-9efb-c637198fe77d Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] dbg:OpaqueRef:c0cac1ed-b5c5-4475-a71a-b4303260e39d dp:vbd/0/xvda sr:cf2dbaa3-21f3-903b-0fd1-fbe68539f897 vdi:ea4a2c16-5268-4e53-9efb-c637198fe77d superstate:activated RW Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623740 UNIX /var/lib/xcp/xapi|VBD.unplug R:c0cac1edb5c5|storage_impl] dbg:OpaqueRef:c0cac1ed-b5c5-4475-a71a-b4303260e39d dp:vbd/0/xvda sr:cf2dbaa3-21f3-903b-0fd1-fbe68539f897 vdi:ea4a2c16-5268-4e53-9efb-c637198fe77d superstate:activated RW Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623743 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:VBD.destroy D:830c73f393b5 created by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623743 UNIX /var/lib/xcp/xapi||taskhelper] task VBD.destroy R:43bb36d95384 (uuid:c96a5c81-59f0-a49f-c934-0aa4b57dece6) created (trackid=ba066585a7083cd229be4a26ef154317) by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623743 UNIX /var/lib/xcp/xapi|VBD.destroy R:43bb36d95384|audit] VBD.destroy: VBD = 'a375bd27-cea7-4263-ad72-33012568155e' Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623743 UNIX /var/lib/xcp/xapi|VBD.destroy R:43bb36d95384|xapi_vbd_helpers] VBD.destroy (uuid = a375bd27-cea7-4263-ad72-33012568155e; ref = OpaqueRef:314f1ee1-0dd4-462a-8177-cd1bcbde0f0d) Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623744 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.logout D:528a483a90fa created by task R:ce4f60abdac4 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623744 UNIX /var/lib/xcp/xapi|session.logout D:89cb21d91bc0|xapi_session] Session.destroy trackid=ba066585a7083cd229be4a26ef154317 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80|Importing raw VDI R:ce4f60abdac4|import] Caught exception: VDI_IO_ERROR: [ Device I/O errors ] Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623653 INET :::80|Importing raw VDI R:ce4f60abdac4|taskhelper] the status of R:ce4f60abdac4 is failure; cannot set it to `failure Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Importing raw VDI R:ce4f60abdac4 failed with exception Server_error(VDI_IO_ERROR, [ Device I/O errors ]) Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Raised Server_error(VDI_IO_ERROR, [ Device I/O errors ]) Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 1/12 xapi Raised at file ocaml/xapi/vhd_tool_wrapper.ml, line 75 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 2/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 3/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 4/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 5/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 
INET :::80||backtrace] 6/12 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 156 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 7/12 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 194 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 8/12 xapi Called from file ocaml/xapi/server_helpers.ml, line 100 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 9/12 xapi Called from file ocaml/xapi/server_helpers.ml, line 121 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 10/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 11/12 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 35 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 12/12 xapi Called from file lib/backtrace.ml, line 177 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623745 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:session.logout D:b305b9778ea7 created by task D:378e6880299b Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [ info||623745 UNIX /var/lib/xcp/xapi|session.logout D:32b261df8c78|xapi_session] Session.destroy trackid=2de39b11c565eee741af8c59195f712e Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||import] Caught exception in import handler: VDI_IO_ERROR: [ Device I/O errors ] Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] VDI.import D:378e6880299b failed with exception Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Raised Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 1/9 xapi Raised at file unix.ml, line 328 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 2/9 xapi Called from file lib/xapi-stdext-unix/unixext.ml, line 469 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 3/9 xapi Called from file lib/xapi-stdext-unix/unixext.ml, line 471 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 4/9 xapi Called from file ocaml/xapi/import_raw_vdi.ml, line 232 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 5/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 100 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 6/9 xapi Called from file ocaml/xapi/server_helpers.ml, line 121 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 7/9 xapi Called from file lib/xapi-stdext-pervasives/pervasiveext.ml, line 24 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 8/9 xapi Called from file map.ml, line 135 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] 9/9 xapi Called from file sexp_conv.ml, line 147 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] VDI.import D:c2494631d071 failed with exception Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Raised Unix.Unix_error(Unix.EPIPE, "single_write", "") Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET 
:::80||backtrace] 1/1 xapi Raised at file (Thread 623653 has no backtrace table. Was with_backtraces called?, line 0 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [error||623653 INET :::80||backtrace] Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||575 ||xenops] Event on VM 35d318ed-1b0b-44f9-bf9e-da6ce90f0f24; resident_here = true Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||575 ||dummytaskhelper] task timeboxed_rpc D:2765a7d8c02f created by task D:7b99bbf373c5 Jan 15 19:17:40 xcp-ng-xen12-lon2 xapi: [debug||623748 UNIX /var/lib/xcp/xapi||dummytaskhelper] task dispatch:event.from D:7e5dfbe36065 created by task D:7b99bbf373c5
Nothing jumps out in SMlog on either source or target.
In the XOA syslog, the only line that refers to the failure is
xo:xo-server ERROR unhandled error event { error: Cancel { message: 'aborted' } }
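For reference, something like this should pull the xo-server side of the logs (assuming xo-server runs as a systemd unit named xo-server, which I believe is the case on the appliance):
journalctl -u xo-server --since "1 hour ago" | grep -iE "error|vdi_io"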
fsck shows the disks as clean, and dom0 has 16 GB of memory allocated. I've tried numerous variations: cleaning the target down, a fresh install of XCP-ng, etc.
If anyone has any ideas or other areas I should be looking at, please do jump in. I'm out of ideas.
-
Forgot to say - the duplicate referred to in the error is the '[Importing...BackupProcessName-timestamp]' halted VM that the backup process itself creates, i.e. the remnants that I spoke of at the outset that needed to be removed, which don't show in the GUI but can be seen via the CLI using
xe vm-list power-state=halted
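For anyone else cleaning these up, something along these lines should work (just a sketch - the UUID is a placeholder, and double-check each entry before removing anything, since vm-uninstall also deletes the VM's disks):
xe vm-list power-state=halted params=uuid,name-label
xe vm-uninstall uuid=<uuid-of-leftover-importing-vm> force=true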
-
Have just updated the hosts with the swathe of patches that came out this morning, and tested with a fresh build from master as of today.
Unfortunately still getting these VDI_IO_ERROR errors when running CR, be it from a trial XOA install or from the latest build from master (rebuilt this morning); nothing particularly helpful jumping out of the logs:
2021-01-23T15_43_13.502Z - backup NG.log.txt
Question to the community - are other people seeing the same VDI_IO_ERROR error occasionally, or is this a unique case? Even if you can't provide access to your logs for privacy reasons, it would be interesting to know if others are seeing it. Of course, that relies upon others seeing this message....
Thanks!
-
Check the usual logs (SMlog and
dmesg
) to see if we can spot the root cause.
-
@olivierlambert Nothing jumps out from dmesg and there is nothing useful in SMlog when running the CR; however, I do get the following exceptions in SMlog upon boot (the same error for each disk array):
SMGC: [24044] *~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~* SMGC: [24044] *********************** SMGC: [24044] * E X C E P T I O N * SMGC: [24044] *********************** SMGC: [24044] gc: EXCEPTION <class 'util.SMException'>, SR 7e772759-f44e-af14-eb45-e839cc67689c not attached on this host SMGC: [24044] File "/opt/xensource/sm/cleanup.py", line 3354, in gc SMGC: [24044] _gc(None, srUuid, dryRun) SMGC: [24044] File "/opt/xensource/sm/cleanup.py", line 3233, in _gc SMGC: [24044] sr = SR.getInstance(srUuid, session) SMGC: [24044] File "/opt/xensource/sm/cleanup.py", line 1552, in getInstance SMGC: [24044] return FileSR(uuid, xapi, createLock, force) SMGC: [24044] File "/opt/xensource/sm/cleanup.py", line 2330, in __init__ SMGC: [24044] SR.__init__(self, uuid, xapi, createLock, force) SMGC: [24044] File "/opt/xensource/sm/cleanup.py", line 1582, in __init__ SMGC: [24044] raise util.SMException("SR %s not attached on this host" % uuid) SMGC: [24044] SMGC: [24044] *~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~* SMGC: [24044] * * * * * SR 7e772759-f44e-af14-eb45-e839cc67689c: ERROR SMGC: [24044] SM: [24066] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr SM: [24066] sr_update {'sr_uuid': '7e772759-f44e-af14-eb45-e839cc67689c', 'subtask_of': 'DummyRef:|ed453e0f-248d-403f-9499-ee7255fdf429|SR.stat', 'args': [], 'host_ref': 'OpaqueRef:c1b3f3e8-579b-4b35-9bb6-dcad830583c3', 'session_ref': 'OpaqueRef:dafc19b5-65af-4501-a151-1999d8e7f550', 'device_config': {'device': '/dev/disk/by-id/scsi-2ad3a730200d00000-part1', 'SRmaster': 'true'}, 'command': 'sr_update', 'sr_ref': 'OpaqueRef:bfd47d28-1be1-4dd6-8a7b-2c0a986b8d47', 'local_cache_sr': '34bd3a97-3562-75af-a24e-a266caf368e3'} SM: [24088] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr SM: [24088] lock: acquired /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr SM: [24088] sr_scan {'sr_uuid': '7e772759-f44e-af14-eb45-e839cc67689c', 'subtask_of': 'DummyRef:|017c9f00-7289-4983-a942-f91c830fed33|SR.scan', 'args': [], 'host_ref': 'OpaqueRef:c1b3f3e8-579b-4b35-9bb6-dcad830583c3', 'session_ref': 'OpaqueRef:a907696c-b7cf-4ac8-b772-b22833073585', 'device_config': {'device': '/dev/disk/by-id/scsi-2ad3a730200d00000-part1', 'SRmaster': 'true'}, 'command': 'sr_scan', 'sr_ref': 'OpaqueRef:bfd47d28-1be1-4dd6-8a7b-2c0a986b8d47', 'local_cache_sr': '34bd3a97-3562-75af-a24e-a266caf368e3'} SM: [24088] ['/usr/bin/vhd-util', 'scan', '-f', '-m', '/var/run/sr-mount/7e772759-f44e-af14-eb45-e839cc67689c/*.vhd'] SM: [24088] pread SUCCESS SM: [24088] ['ls', '/var/run/sr-mount/7e772759-f44e-af14-eb45-e839cc67689c', '-1', '--color=never'] SM: [24088] pread SUCCESS SM: [24088] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/running SM: [24088] lock: tried lock /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/running, acquired: True (exists: True) SM: [24088] lock: released /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/running SM: [24088] Kicking GC SMGC: [24088] === SR 7e772759-f44e-af14-eb45-e839cc67689c: gc === SMGC: [24104] Will finish as PID [24105] SM: [24105] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/running SMGC: [24088] New PID [24104] SM: [24105] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/gc_active SM: [24088] lock: released /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr SM: [24105] lock: opening lock file /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr SMGC: [24105] Found 0 cache files SM: [24105] lock: tried lock 
/var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/gc_active, acquired: True (exists: True) SM: [24105] lock: tried lock /var/lock/sm/7e772759-f44e-af14-eb45-e839cc67689c/sr, acquired: True (exists: True) SM: [24105] ['/usr/bin/vhd-util', 'scan', '-f', '-m', '/var/run/sr-mount/7e772759-f44e-af14-eb45-e839cc67689c/*.vhd'] SM: [24105] pread SUCCESS
Subsequent to the above, I found that I get an identical error when reattaching the SR (after detaching it).
xapi-explore-sr does not show anything interesting
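In case it helps anyone following along, the PBD state of that SR can be checked like this, watching SMlog while replugging it (the SR UUID is the one from the exception above; the PBD UUID is a placeholder):
xe pbd-list sr-uuid=7e772759-f44e-af14-eb45-e839cc67689c params=uuid,host-uuid,currently-attached
xe pbd-plug uuid=<pbd-uuid>
tail -f /var/log/SMlog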
Thanks chap; v kind of you