Backup timeout with compression
-
I have a Windows Server 2019 VM with two disks:
- the first is 140 GB with 20% used (about 26 GB); it has 4 partitions (EFI, MSR, System with NTFS, Data with NTFS)
- the second is 1024 GB with 0.3% used (about 3 GB); it has 1 partition (Data with NTFS)
I made a backup plan with compression and I get the message below (log). This happens with every VM that has two disks with little data on the second disk.
When I disable compression, everything is OK. If I switch to Continuous Replication, everything is OK. But if I remove the second drive and leave the VM with only the first disk, the backup plan with compression works.
If I remove the first drive and leave the VM with only the second drive, the backup plan with compression fails. Without compression it is OK.
XOCE is at commit a4bb453401166041a7a6e8a5740508624eb2fe99.
{ "data": { "mode": "full", "reportWhen": "failure" }, "id": "1637149486885", "jobId": "38d7e43a-88f6-474c-af6d-1ff70ef8813d", "jobName": "tb-sema", "message": "backup", "scheduleId": "4e631714-8a75-49f6-8c58-ddcaa716c78d", "start": 1637149486885, "status": "failure", "infos": [ { "data": { "vms": [ "629bdfeb-7700-561c-74ac-e151068721c2" ] }, "message": "vms" } ], "tasks": [ { "data": { "type": "VM", "id": "629bdfeb-7700-561c-74ac-e151068721c2" }, "id": "1637149498626", "message": "backup VM", "start": 1637149498626, "status": "failure", "tasks": [ { "id": "1637149501402", "message": "snapshot", "start": 1637149501402, "status": "success", "end": 1637149504817, "result": "78cb549b-fcce-aa58-e034-8f17ddf319d5" }, { "data": { "id": "ebfc8472-6e08-4bdb-b848-c06ed6c3c4fe", "type": "remote", "isFull": true }, "id": "1637149504843", "message": "export", "start": 1637149504843, "status": "failure", "tasks": [ { "id": "1637149504847", "message": "transfer", "start": 1637149504847, "status": "failure", "end": 1637150445637, "result": { "canceled": false, "method": "GET", "url": "https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } } ], "end": 1637150445638, "result": { "canceled": false, "method": "GET", "url": "https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } }, { "data": { "id": "0cc4ad0f-aa78-467a-a24a-ab8555883306", "type": "remote", "isFull": true }, "id": "1637149504844", "message": "export", "start": 1637149504844, "status": "failure", "tasks": [ { "id": "1637149504851", "message": "transfer", "start": 1637149504851, "status": "failure", "end": 1637150446316, "result": { "canceled": false, "method": "GET", "url": 
"https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } } ], "end": 1637150446317, "result": { "canceled": false, "method": "GET", "url": "https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } }, { "data": { "id": "d7b5ba1c-6ab9-4a6c-8dd0-a786ae89382d", "type": "remote", "isFull": true }, "id": "1637149504845", "message": "export", "start": 1637149504845, "status": "failure", "tasks": [ { "id": "1637149504852", "message": "transfer", "start": 1637149504852, "status": "failure", "end": 1637150446584, "result": { "canceled": false, "method": "GET", "url": "https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } } ], "end": 1637150446584, "result": { "canceled": false, "method": "GET", "url": 
"https://192.168.50.15/export/?ref=OpaqueRef%3A8e6b8951-4830-4914-b219-df42a76998da&use_compression=zstd&session_id=OpaqueRef%3A1355bc36-351e-4fe8-b761-fc4787d17a7a&task_id=OpaqueRef%3A9ccbd32c-a211-4bbc-a4af-e4eceaeb02dd", "timeout": true, "message": "HTTP connection has timed out", "name": "Error", "stack": "Error: HTTP connection has timed out\n at IncomingMessage.emitAbortedError (/opt/xen-orchestra/node_modules/http-request-plus/index.js:83:19)\n at Object.onceWrapper (node:events:509:28)\n at IncomingMessage.emit (node:events:390:28)\n at IncomingMessage.patchedEmit (/opt/xen-orchestra/@xen-orchestra/log/configure.js:118:17)\n at IncomingMessage.emit (node:domain:475:12)\n at IncomingMessage._destroy (node:_http_incoming:179:10)\n at _destroy (node:internal/streams/destroy:102:25)\n at IncomingMessage.destroy (node:internal/streams/destroy:64:5)\n at TLSSocket.socketCloseListener (node:_http_client:407:11)\n at TLSSocket.emit (node:events:402:35)" } } ], "end": 1637150544832, "result": { "message": "all targets have failed, step: writer.run()", "name": "Error", "stack": "Error: all targets have failed, step: writer.run()\n at VmBackup._callWriters (/opt/xen-orchestra/@xen-orchestra/backups/_VmBackup.js:131:13)\n at async VmBackup._copyFull (/opt/xen-orchestra/@xen-orchestra/backups/_VmBackup.js:249:5)\n at async VmBackup.run (/opt/xen-orchestra/@xen-orchestra/backups/_VmBackup.js:410:9)" } } ], "end": 1637150544834 }
Thank you
-
Can you try with GZIP compression instead of ZSTD to see if there's any difference?
-
I tried both compression options; the result is the same, it doesn't work.
For now I'm not using compression at all.
-
Putting @julien-f in the loop
-
Compression is handled by XCP-ng itself; it doesn't change XO's behavior.
We need to investigate to be sure, but I don't think there is anything we could do in XO.
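To illustrate the point (a sketch of what the log already shows, not XO's actual code): from XO's side, the only compression-related difference is a query parameter on the export URL it requests from the host; the zstd (or gzip) stream itself is produced by XCP-ng, and XO merely consumes the HTTP response. The OpaqueRef values below are elided placeholders:

```js
// Sketch, not XO's actual code: the export is a plain HTTPS GET against the
// host, and compression is selected purely by a query parameter that XAPI
// (i.e. XCP-ng) interprets.
const params = new URLSearchParams({
  ref: 'OpaqueRef:...',        // VM snapshot ref (placeholder)
  session_id: 'OpaqueRef:...', // XAPI session ref (placeholder)
})
params.set('use_compression', 'zstd') // omit this line for an uncompressed export

console.log(`https://192.168.50.15/export/?${params}`)
```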
-
Is there any option to set the timeout value?
I ask because, monitoring the process, everything seems to go well until the moment it starts to check the free space.
XCP-ng is 8.2, up to date, and the SR holding the VM's disks is formatted as local ext.
The connection to the backup server is over NFS on the LAN at 1 Gb/s.
In iotop, the space check seems to run at 650 Gbit/s+.
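In case it helps isolate where the stall happens, here is a hypothetical diagnostic sketch using the xen-api client library (the one XO itself is built on) to watch the export task's progress on the host. If task.get_progress keeps advancing while XO's HTTP response stays silent, the ~15-minute failure above would point at a client-side inactivity timeout rather than the host actually hanging. The host address, credentials and task ref are placeholders:

```js
// Hypothetical diagnostic sketch using the `xen-api` npm package: poll the
// export task on the host while the backup runs. All identifiers below are
// placeholders, not values from this thread.
const { createClient } = require('xen-api')

async function watchTask(taskRef) {
  const xapi = createClient({
    url: 'https://192.168.50.15',
    allowUnauthorized: true, // self-signed host certificate
    auth: { user: 'root', password: 'xxx' },
  })
  await xapi.connect()

  // Poll every 30 s until the task leaves the "pending" state.
  for (;;) {
    const progress = await xapi.call('task.get_progress', taskRef)
    const status = await xapi.call('task.get_status', taskRef)
    console.log(new Date().toISOString(), status, (progress * 100).toFixed(1) + '%')
    if (status !== 'pending') break
    await new Promise(resolve => setTimeout(resolve, 30e3))
  }

  await xapi.disconnect()
}

watchTask('OpaqueRef:...').catch(console.error)
```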