CBT: the thread to centralize your feedback
-
@icompit Full backups of the same VMs are processed without a problem.
```json
{ "data": { "mode": "delta", "reportWhen": "failure" }, "id": "1728349200005", "jobId": "af0c1c01-8101-4dc6-806f-a4d3cf381cf7", "jobName": "ISO", "message": "backup", "scheduleId": "0e073624-a7eb-47ca-aca5-5f1cd0fc996b", "start": 1728349200005, "status": "success", "infos": [ { "data": { "vms": [ "81dbf29f-adfa-5bc4-9bc1-021f8eb46b9e", "fc1c7067-69f3-8949-ab68-780024a49a75", "07c32e8d-ea85-ffaa-3509-2dabff58e4af", "66949c31-5544-88fe-5e49-f0e0c7946347" ] }, "message": "vms" } ], "tasks": [ { "data": { "type": "VM", "id": "81dbf29f-adfa-5bc4-9bc1-021f8eb46b9e", "name_label": "MX" }, "id": "1728349200726", "message": "backup VM", "start": 1728349200726, "status": "success", "tasks": [ { "id": "1728349200738", "message": "clean-vm", "start": 1728349200738, "status": "success", "end": 1728349200773, "result": { "merge": false } }, { "id": "1728349200944", "message": "snapshot", "start": 1728349200944, "status": "success", "end": 1728349202402, "result": "c9285da2-1b13-e8a7-7652-316fbfde76fd" }, { "data": { "id": "22d6a348-ae2b-4783-a77d-a456e508ba64", "isFull": true, "type": "remote" }, "id": "1728349202402:0", "message": "export", "start": 1728349202402, "status": "success", "tasks": [ { "id": "1728349202859", "message": "transfer", "start": 1728349202859, "status": "success", "end": 1728355287481, "result": { "size": 358491499520 } }, { "id": "1728355310406", "message": "health check", "start": 1728355310406, "status": "success", "tasks": [ { "id": "1728355311238", "message": "transfer", "start": 1728355311238, "status": "success", "end": 1728359446575, "result": { "size": 358491498496, "id": "17402cd8-bd52-98e1-7951-3d47a03dcbf4" } }, { "id": "1728359446575:0", "message": "vmstart", "start": 1728359446575, "status": "success", "end": 1728359477580 } ], "end": 1728359492584 }, { "id": "1728359492598", "message": "clean-vm", "start": 1728359492598, "status": "success", "end": 1728359492625, "result": { "merge": false } } ], "end": 1728359492627 } ], "infos": [ { "message": "will delete snapshot data" }, { "data": { "vdiRef": "OpaqueRef:dea183a0-abf2-422b-b7fd-4d0ed1d2ac12" }, "message": "Snapshot data has been deleted" }, { "data": { "vdiRef": "OpaqueRef:e1c325ac-e8c9-4a75-83d0-a0ee50289665" }, "message": "Snapshot data has been deleted" } ], "end": 1728359492628 }, { "data": { "type": "VM", "id": "fc1c7067-69f3-8949-ab68-780024a49a75", "name_label": "ISH" }, "id": "1728359492631", "message": "backup VM", "start": 1728359492631, "status": "success", "tasks": [ { "id": "1728359492636", "message": "clean-vm", "start": 1728359492636, "status": "success", "end": 1728359492649, "result": { "merge": false } }, { "id": "1728359492792", "message": "snapshot", "start": 1728359492792, "status": "success", "end": 1728359493650, "result": "8c4a61e7-9f4c-a58b-f1d7-e9edb86e73bb" }, { "data": { "id": "22d6a348-ae2b-4783-a77d-a456e508ba64", "isFull": true, "type": "remote" }, "id": "1728359493650:0", "message": "export", "start": 1728359493650, "status": "success", "tasks": [ { "id": "1728359494085", "message": "transfer", "start": 1728359494085, "status": "success", "end": 1728360044219, "result": { "size": 49962294784 } }, { "id": "1728360047118", "message": "health check", "start": 1728360047118, "status": "success", "tasks": [ { "id": "1728360047256", "message": "transfer", "start": 1728360047256, "status": "success", "end": 1728360544642, "result": { "size": 49962294272, "id": "3d7e38b1-2366-e657-884e-67f33bee9b56" } }, { "id": "1728360544642:0", "message": "vmstart", "start": 1728360544642, "status": 
"success", "end": 1728360567171 } ], "end": 1728360568598 }, { "id": "1728360568628", "message": "clean-vm", "start": 1728360568628, "status": "success", "end": 1728360568663, "result": { "merge": false } } ], "end": 1728360568667 } ], "infos": [ { "message": "will delete snapshot data" }, { "data": { "vdiRef": "OpaqueRef:18e03c56-b537-490c-8c1d-0f87fef61940" }, "message": "Snapshot data has been deleted" } ], "end": 1728360568667 }, { "data": { "type": "VM", "id": "07c32e8d-ea85-ffaa-3509-2dabff58e4af", "name_label": "VPS1" }, "id": "1728360568671", "message": "backup VM", "start": 1728360568671, "status": "success", "tasks": [ { "id": "1728360568677", "message": "clean-vm", "start": 1728360568677, "status": "success", "end": 1728360568702, "result": { "merge": false } }, { "id": "1728360568803", "message": "snapshot", "start": 1728360568803, "status": "success", "end": 1728360569642, "result": "eac56d7c-fa78-f6ae-41c3-d316fbc597a0" }, { "data": { "id": "22d6a348-ae2b-4783-a77d-a456e508ba64", "isFull": true, "type": "remote" }, "id": "1728360569643", "message": "export", "start": 1728360569643, "status": "success", "tasks": [ { "id": "1728360570055", "message": "transfer", "start": 1728360570055, "status": "success", "end": 1728360787031, "result": { "size": 15826942976 } }, { "id": "1728360788525", "message": "health check", "start": 1728360788525, "status": "success", "tasks": [ { "id": "1728360788592", "message": "transfer", "start": 1728360788592, "status": "success", "end": 1728360987674, "result": { "size": 15826942464, "id": "52962bc5-2ed4-c397-9131-8894f45523bd" } }, { "id": "1728360987675", "message": "vmstart", "start": 1728360987675, "status": "success", "end": 1728361007502 } ], "end": 1728361008575 }, { "id": "1728361008588", "message": "clean-vm", "start": 1728361008588, "status": "success", "end": 1728361008622, "result": { "merge": true } } ], "end": 1728361008636 } ], "infos": [ { "message": "will delete snapshot data" }, { "data": { "vdiRef": "OpaqueRef:00c688b0-555d-42bb-93e9-db8b09fea339" }, "message": "Snapshot data has been deleted" } ], "end": 1728361008636 }, { "data": { "type": "VM", "id": "66949c31-5544-88fe-5e49-f0e0c7946347", "name_label": "DNS" }, "id": "1728361008638", "message": "backup VM", "start": 1728361008638, "status": "success", "tasks": [ { "id": "1728361008643", "message": "clean-vm", "start": 1728361008643, "status": "success", "end": 1728361008669, "result": { "merge": false } }, { "id": "1728361008780", "message": "snapshot", "start": 1728361008780, "status": "success", "end": 1728361009626, "result": "20408aa8-5979-23a5-e8db-c32eb5ec52a8" }, { "data": { "id": "22d6a348-ae2b-4783-a77d-a456e508ba64", "isFull": true, "type": "remote" }, "id": "1728361009626:0", "message": "export", "start": 1728361009626, "status": "success", "tasks": [ { "id": "1728361010081", "message": "transfer", "start": 1728361010081, "status": "success", "end": 1728361179509, "result": { "size": 14742417920 } }, { "id": "1728361181070:0", "message": "health check", "start": 1728361181070, "status": "success", "tasks": [ { "id": "1728361181124", "message": "transfer", "start": 1728361181124, "status": "success", "end": 1728361321066, "result": { "size": 14742417408, "id": "cd3ce194-28ca-0c3f-8a78-6873ebac0da2" } }, { "id": "1728361321067", "message": "vmstart", "start": 1728361321067, "status": "success", "end": 1728361368268 } ], "end": 1728361369283 }, { "id": "1728361369296", "message": "clean-vm", "start": 1728361369296, "status": "success", "end": 1728361369327, 
"result": { "merge": true } } ], "end": 1728361369340 } ], "infos": [ { "message": "will delete snapshot data" }, { "data": { "vdiRef": "OpaqueRef:b6f2b416-83b1-4f50-84f1-662f03aa626d" }, "message": "Snapshot data has been deleted" } ], "end": 1728361369340 } ], "end": 1728361369340 }
```
-
Switched to latest channel:
Current version: 5.99.1 - XOA build: 20241004

Exact same error for me as I posted before:
```
Oct 08 04:12:21 xoa xo-server[152702]: 2024-10-08T09:12:21.139Z xo:backups:worker INFO starting backup Oct 08 04:12:27 xoa xo-server[152702]: 2024-10-08T09:12:27.213Z xo:xapi:vdi INFO found changed blocks { Oct 08 04:12:27 xoa xo-server[152702]: changedBlocks: <Buffer a0 11 60 00 00 00 00 60 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... 511950 more bytes> Oct 08 04:12:27 xoa xo-server[152702]: } Oct 08 04:12:41 xoa xo-server[152702]: 2024-10-08T09:12:41.036Z xo:xapi:vdi INFO found changed blocks { Oct 08 04:12:41 xoa xo-server[152702]: changedBlocks: <Buffer 00 00 80 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 ... 102353 more bytes> Oct 08 04:12:41 xoa xo-server[152702]: } Oct 08 04:12:43 xoa xo-server[152702]: 2024-10-08T09:12:43.021Z xo:xapi:vdi INFO OpaqueRef:0c5189b6-8ef7-4d55-80c9-0aed7b165e18 has been disconnected from dom0 { Oct 08 04:12:43 xoa xo-server[152702]: vdiRef: 'OpaqueRef:8b3097a7-0038-475f-a45e-5d386f3e3e26', Oct 08 04:12:43 xoa xo-server[152702]: vbdRef: 'OpaqueRef:0c5189b6-8ef7-4d55-80c9-0aed7b165e18' Oct 08 04:12:43 xoa xo-server[152702]: } Oct 08 04:23:43 xoa xo-server[152702]: 2024-10-08T09:23:43.779Z xo:xapi:vdi INFO OpaqueRef:3ce8200f-f3ca-4548-b1b3-beac8b19ae2b has been disconnected from dom0 { Oct 08 04:23:43 xoa xo-server[152702]: vdiRef: 'OpaqueRef:75005d3e-4d4c-4799-ab70-ee7e7a4f7e23', Oct 08 04:23:43 xoa xo-server[152702]: vbdRef: 'OpaqueRef:3ce8200f-f3ca-4548-b1b3-beac8b19ae2b' Oct 08 04:23:43 xoa xo-server[152702]: } Oct 08 04:24:59 xoa xo-server[152702]: 2024-10-08T09:24:59.120Z xo:xapi:vdi INFO OpaqueRef:98b20faa-2c5e-4b5a-9946-63b71857f200 has been disconnected from dom0 { Oct 08 04:24:59 xoa xo-server[152702]: vdiRef: 'OpaqueRef:89cedb32-81c9-482c-b6b0-e9d0102fd198', Oct 08 04:24:59 xoa xo-server[152702]: vbdRef: 'OpaqueRef:98b20faa-2c5e-4b5a-9946-63b71857f200' Oct 08 04:24:59 xoa xo-server[152702]: } Oct 08 04:24:59 xoa xo-server[152702]: 2024-10-08T09:24:59.980Z xo:xapi WARN retry { Oct 08 04:24:59 xoa xo-server[152702]: attemptNumber: 0, Oct 08 04:24:59 xoa xo-server[152702]: delay: 5000, Oct 08 04:24:59 xoa xo-server[152702]: error: XapiError: VDI_IN_USE(OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e, destroy) Oct 08 04:24:59 xoa xo-server[152702]: at XapiError.wrap (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/_XapiError.mjs:16:12) Oct 08 04:24:59 xoa xo-server[152702]: at default (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/_getTaskResult.mjs:13:29) Oct 08 04:24:59 xoa xo-server[152702]: at Xapi._addRecordToCache (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1041:24) Oct 08 04:24:59 xoa xo-server[152702]: at file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1075:14 Oct 08 04:24:59 xoa xo-server[152702]: at Array.forEach (<anonymous>) Oct 08 04:24:59 xoa xo-server[152702]: at Xapi._processEvents (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1065:12) Oct 08 04:24:59 xoa xo-server[152702]: at Xapi._watchEvents (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1238:14) Oct 08 04:24:59 xoa xo-server[152702]: at process.processTicksAndRejections (node:internal/process/task_queues:95:5) { Oct 08 04:24:59 xoa xo-server[152702]: code: 'VDI_IN_USE', Oct 08 04:24:59 xoa xo-server[152702]: params: [ 
'OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e', 'destroy' ], Oct 08 04:24:59 xoa xo-server[152702]: call: undefined, Oct 08 04:24:59 xoa xo-server[152702]: url: undefined, Oct 08 04:24:59 xoa xo-server[152702]: task: task { Oct 08 04:24:59 xoa xo-server[152702]: uuid: '240ae893-9d16-1762-572f-ec8da918dc49', Oct 08 04:24:59 xoa xo-server[152702]: name_label: 'Async.VDI.destroy', Oct 08 04:24:59 xoa xo-server[152702]: name_description: '', Oct 08 04:24:59 xoa xo-server[152702]: allowed_operations: [], Oct 08 04:24:59 xoa xo-server[152702]: current_operations: {}, Oct 08 04:24:59 xoa xo-server[152702]: created: '20241008T09:24:58Z', Oct 08 04:24:59 xoa xo-server[152702]: finished: '20241008T09:24:59Z', Oct 08 04:24:59 xoa xo-server[152702]: status: 'failure', Oct 08 04:24:59 xoa xo-server[152702]: resident_on: 'OpaqueRef:010eebba-be27-489f-9f87-d06c8b675f19', Oct 08 04:24:59 xoa xo-server[152702]: progress: 1, Oct 08 04:24:59 xoa xo-server[152702]: type: '<none/>', Oct 08 04:24:59 xoa xo-server[152702]: result: '', Oct 08 04:24:59 xoa xo-server[152702]: error_info: [Array], Oct 08 04:24:59 xoa xo-server[152702]: other_config: {}, Oct 08 04:24:59 xoa xo-server[152702]: subtask_of: 'OpaqueRef:NULL', Oct 08 04:24:59 xoa xo-server[152702]: subtasks: [], Oct 08 04:24:59 xoa xo-server[152702]: backtrace: '(((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 4711))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 205))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 95)))' Oct 08 04:24:59 xoa xo-server[152702]: } Oct 08 04:24:59 xoa xo-server[152702]: }, Oct 08 04:24:59 xoa xo-server[152702]: fn: 'destroy', Oct 08 04:24:59 xoa xo-server[152702]: arguments: [ 'OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e' ], Oct 08 04:24:59 xoa xo-server[152702]: pool: { Oct 08 04:24:59 xoa xo-server[152702]: uuid: 'fe688bb2-b9ac-db7b-737a-cc457195f095', Oct 08 04:24:59 xoa xo-server[152702]: name_label: 'Private Pool' Oct 08 04:24:59 xoa xo-server[152702]: } Oct 08 04:24:59 xoa xo-server[152702]: } ..... 
Oct 08 04:30:12 xoa xo-server[152702]: 2024-10-08T09:30:12.650Z xo:xapi:vdi WARN Couldn't disconnect OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e from dom0 { Oct 08 04:30:12 xoa xo-server[152702]: vdiRef: 'OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e', Oct 08 04:30:12 xoa xo-server[152702]: vbdRef: 'OpaqueRef:872dd8e6-031c-4f12-adb5-c7bb61f9dc0f', Oct 08 04:30:12 xoa xo-server[152702]: err: XapiError: OPERATION_NOT_ALLOWED(VBD '8817456d-774e-d82f-e63a-32a393491a0e' still attached to '9eebab44-b26e-4b6a-96e5-54734c459929') Oct 08 04:30:12 xoa xo-server[152702]: at XapiError.wrap (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/_XapiError.mjs:16:12) Oct 08 04:30:12 xoa xo-server[152702]: at file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/transports/json-rpc.mjs:38:21 Oct 08 04:30:12 xoa xo-server[152702]: at process.processTicksAndRejections (node:internal/process/task_queues:95:5) { Oct 08 04:30:12 xoa xo-server[152702]: code: 'OPERATION_NOT_ALLOWED', Oct 08 04:30:12 xoa xo-server[152702]: params: [ Oct 08 04:30:12 xoa xo-server[152702]: "VBD '8817456d-774e-d82f-e63a-32a393491a0e' still attached to '9eebab44-b26e-4b6a-96e5-54734c459929'" Oct 08 04:30:12 xoa xo-server[152702]: ], Oct 08 04:30:12 xoa xo-server[152702]: call: { method: 'VBD.destroy', params: [Array] }, Oct 08 04:30:12 xoa xo-server[152702]: url: undefined, Oct 08 04:30:12 xoa xo-server[152702]: task: undefined Oct 08 04:30:12 xoa xo-server[152702]: } Oct 08 04:30:12 xoa xo-server[152702]: } Oct 08 04:30:17 xoa xo-server[152702]: 2024-10-08T09:30:17.708Z xo:xapi:vm WARN VM_destroy: failed to destroy VDI { Oct 08 04:30:17 xoa xo-server[152702]: error: XapiError: VDI_IN_USE(OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e, destroy) Oct 08 04:30:17 xoa xo-server[152702]: at XapiError.wrap (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/_XapiError.mjs:16:12) Oct 08 04:30:17 xoa xo-server[152702]: at default (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/_getTaskResult.mjs:13:29) Oct 08 04:30:17 xoa xo-server[152702]: at Xapi._addRecordToCache (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1041:24) Oct 08 04:30:17 xoa xo-server[152702]: at file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1075:14 Oct 08 04:30:17 xoa xo-server[152702]: at Array.forEach (<anonymous>) Oct 08 04:30:17 xoa xo-server[152702]: at Xapi._processEvents (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1065:12) Oct 08 04:30:17 xoa xo-server[152702]: at Xapi._watchEvents (file:///usr/local/lib/node_modules/xo-server/node_modules/xen-api/index.mjs:1238:14) Oct 08 04:30:17 xoa xo-server[152702]: at process.processTicksAndRejections (node:internal/process/task_queues:95:5) { Oct 08 04:30:17 xoa xo-server[152702]: code: 'VDI_IN_USE', Oct 08 04:30:17 xoa xo-server[152702]: params: [ 'OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e', 'destroy' ], Oct 08 04:30:17 xoa xo-server[152702]: call: undefined, Oct 08 04:30:17 xoa xo-server[152702]: url: undefined, Oct 08 04:30:17 xoa xo-server[152702]: task: task { Oct 08 04:30:17 xoa xo-server[152702]: uuid: '81ce7cd4-6e11-e60d-6fc2-22522ba2dd74', Oct 08 04:30:17 xoa xo-server[152702]: name_label: 'Async.VDI.destroy', Oct 08 04:30:17 xoa xo-server[152702]: name_description: '', Oct 08 04:30:17 xoa xo-server[152702]: allowed_operations: [], Oct 08 04:30:17 xoa xo-server[152702]: current_operations: {}, Oct 08 04:30:17 xoa xo-server[152702]: created: 
'20241008T09:30:17Z', Oct 08 04:30:17 xoa xo-server[152702]: finished: '20241008T09:30:17Z', Oct 08 04:30:17 xoa xo-server[152702]: status: 'failure', Oct 08 04:30:17 xoa xo-server[152702]: resident_on: 'OpaqueRef:010eebba-be27-489f-9f87-d06c8b675f19', Oct 08 04:30:17 xoa xo-server[152702]: progress: 1, Oct 08 04:30:17 xoa xo-server[152702]: type: '<none/>', Oct 08 04:30:17 xoa xo-server[152702]: result: '', Oct 08 04:30:17 xoa xo-server[152702]: error_info: [Array], Oct 08 04:30:17 xoa xo-server[152702]: other_config: {}, Oct 08 04:30:17 xoa xo-server[152702]: subtask_of: 'OpaqueRef:NULL', Oct 08 04:30:17 xoa xo-server[152702]: subtasks: [], Oct 08 04:30:17 xoa xo-server[152702]: backtrace: '(((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 4711))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 205))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 95)))' Oct 08 04:30:17 xoa xo-server[152702]: } Oct 08 04:30:17 xoa xo-server[152702]: }, Oct 08 04:30:17 xoa xo-server[152702]: vdiRef: 'OpaqueRef:7d08bb03-3247-4ace-98b0-4110851a893e', Oct 08 04:30:17 xoa xo-server[152702]: vmRef: 'OpaqueRef:fac41617-6192-46b5-9d01-993da8ce039f' Oct 08 04:30:17 xoa xo-server[152702]: } Oct 08 04:30:18 xoa xo-server[152702]: 2024-10-08T09:30:18.041Z xo:backups:MixinBackupWriter INFO deleting unused VHD { Oct 08 04:30:18 xoa xo-server[152702]: path: '/xo-vm-backups/d5d0334c-a7e3-b29f-51ca-1be9c211d2c1/vdis/05ce27a6-b447-4b12-b963-7e8d4978c95a/9f8da2fd-a08d-43c2-a5b1-ce6125cf52f5/20241007T135637Z.vhd' Oct 08 04:30:18 xoa xo-server[152702]: } Oct 08 04:30:18 xoa xo-server[152702]: 2024-10-08T09:30:18.041Z xo:backups:MixinBackupWriter INFO deleting unused VHD { Oct 08 04:30:18 xoa xo-server[152702]: path: '/xo-vm-backups/d5d0334c-a7e3-b29f-51ca-1be9c211d2c1/vdis/05ce27a6-b447-4b12-b963-7e8d4978c95a/c6853f48-4b06-4c34-9707-b68f9054e6fc/20241007T135637Z.vhd' Oct 08 04:30:18 xoa xo-server[152702]: } Oct 08 04:30:18 xoa xo-server[152702]: 2024-10-08T09:30:18.148Z xo:backups:worker INFO backup has ended Oct 08 04:30:48 xoa xo-server[152702]: 2024-10-08T09:30:48.256Z xo:backups:worker WARN worker process did not exit automatically, forcing... 
Oct 08 04:30:48 xoa xo-server[152702]: 2024-10-08T09:30:48.257Z xo:backups:worker INFO process will exit { Oct 08 04:30:48 xoa xo-server[152702]: duration: 1107117425, Oct 08 04:30:48 xoa xo-server[152702]: exitCode: 0, Oct 08 04:30:48 xoa xo-server[152702]: resourceUsage: { Oct 08 04:30:48 xoa xo-server[152702]: userCPUTime: 839002829, Oct 08 04:30:48 xoa xo-server[152702]: systemCPUTime: 176610166, Oct 08 04:30:48 xoa xo-server[152702]: maxRSS: 89720, Oct 08 04:30:48 xoa xo-server[152702]: sharedMemorySize: 0, Oct 08 04:30:48 xoa xo-server[152702]: unsharedDataSize: 0, Oct 08 04:30:48 xoa xo-server[152702]: unsharedStackSize: 0, Oct 08 04:30:48 xoa xo-server[152702]: minorPageFault: 4860813, Oct 08 04:30:48 xoa xo-server[152702]: majorPageFault: 0, Oct 08 04:30:48 xoa xo-server[152702]: swappedOut: 0, Oct 08 04:30:48 xoa xo-server[152702]: fsRead: 4224, Oct 08 04:30:48 xoa xo-server[152702]: fsWrite: 209821368, Oct 08 04:30:48 xoa xo-server[152702]: ipcSent: 0, Oct 08 04:30:48 xoa xo-server[152702]: ipcReceived: 0, Oct 08 04:30:48 xoa xo-server[152702]: signalsCount: 0, Oct 08 04:30:48 xoa xo-server[152702]: voluntaryContextSwitches: 451688, Oct 08 04:30:48 xoa xo-server[152702]: involuntaryContextSwitches: 480143 Oct 08 04:30:48 xoa xo-server[152702]: }, Oct 08 04:30:48 xoa xo-server[152702]: summary: { duration: '18m', cpuUsage: '92%', memoryUsage: '87.62 MiB' } Oct 08 04:30:48 xoa xo-server[152702]: }
```
-
@icompit
Xen Orchestra, commit a5967
Master, commit a5967
-
Please use the markdown code block when posting logs, otherwise it's horrible.
-
I still have the error, even after disabling CBT and purging the snapshots.
"stream has ended with not enough data (actual: 446, expected: 512)"
It's a production VM for a customer. What can I do quickly?
Ticket #7729749
Thanks!
-
@still_at_work are you able to spin up a test XOA based on stable? Maybe you can check if it works in that version?
-
@rtjdamen said in CBT: the thread to centralize your feedback:
are you able to spin up a test XOA based on stable? Maybe you can check if it works in that version?
I'm already on the stable channel.
-
@still_at_work said in CBT: the thread to centralize your feedback:
spin up a test XOA based on stable
Ah, in that case it won't help! I see a lot of people have open tickets on these issues; I expect @florent is actively working on resolving them. Hope we hear something soon!
-
Since the latest release we are seeing a lot more "fall back to base" errors. Is anyone else having the same issue?
-
@rtjdamen Seems to happen when you run a mixture of backup solutions.
-
@StormMaster Thanks, that seems logical if you back up the same VM with two different solutions, but in our case we don't use a different backup tool for these. We do use Alike for some smaller backups, but not for these specific VMs.
-
@rtjdamen Sorry! Just to clarify: when I said a mixture of backup solutions, I meant the different backup modes that XCP-ng backup provides, i.e. running a delta backup after running a continuous replication backup.
When running a mixture of XCP-ng incremental backups, there appears to be a bug somewhere that causes the fall-back-to-base errors, along with a couple of other errors that break the backup process.
-
@olivierlambert @florent If it helps to know: disabling "Use NBD + CBT to transfer disk if available" on the same backup jobs I used above works flawlessly, although on big backup jobs not having NBD available adds about a quarter more time to the backup process.
-
@StormMaster I understand. We are also not using this on these VMs, but it does make sense: something is breaking the CBT chain and causing a full. The question is whether this is caused by the backup job or by something else. Thanks for your input.
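To narrow that down, a rough check between two job runs could look like this from the xe CLI (only a sketch; the UUIDs are placeholders and I'm assuming `cbt-enabled` is accepted as a list filter):

```bash
# List every VDI on the pool that still has CBT enabled
# (assumes the cbt-enabled field can be used as a vdi-list filter).
xe vdi-list cbt-enabled=true params=uuid,name-label,sr-name-label

# Check one specific disk of an affected VM before and after a backup run.
xe vdi-param-get uuid=<vdi-uuid> param-name=cbt-enabled
```

If the flag flips to false between runs, whatever disabled it broke the chain outside of the backup job itself.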
-
I was looking to do some updates on our TrueNAS Scale device, which provides an NFS share to my XCP-ng hosts (8.2.1); we have CBT enabled for backups.
However, when I try to move the Xen Orchestra VDI from TrueNAS to local storage, I receive the following error:
```json
{ "id": "0m261vorl", "properties": { "method": "vdi.migrate", "params": { "id": "f91f81f2-308d-4de9-879e-c1fa84a37d27", "sr_id": "49822b62-3367-7e7c-76ee-1cfc91a262e9" }, "name": "API call: vdi.migrate", "userId": "7b63bade-51f3-4916-9174-f969da17774a", "type": "api.call" }, "start": 1728731129889, "status": "failure", "updatedAt": 1728731132752, "end": 1728731132752, "result": { "code": "VDI_CBT_ENABLED", "params": [ "OpaqueRef:4f16cd0e-fbaf-48c3-aae4-092b9906b9e4" ], "task": { "uuid": "7ce61fba-d6d3-12cb-2585-79d5b69d3857", "name_label": "Async.VDI.pool_migrate", "name_description": "", "allowed_operations": [], "current_operations": {}, "created": "20241012T11:05:31Z", "finished": "20241012T11:05:32Z", "status": "failure", "resident_on": "OpaqueRef:fe0440a3-4a31-44d6-8317-a0e64d0ee01e", "progress": 1, "type": "<none/>", "result": "", "error_info": [ "VDI_CBT_ENABLED", "OpaqueRef:4f16cd0e-fbaf-48c3-aae4-092b9906b9e4" ], "other_config": {}, "subtask_of": "OpaqueRef:NULL", "subtasks": [], "backtrace": "(((process xapi)(filename ocaml/xapi-client/client.ml)(line 7))((process xapi)(filename ocaml/xapi-client/client.ml)(line 19))((process xapi)(filename ocaml/xapi-client/client.ml)(line 12359))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 134))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 35))((process xapi)(filename lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/rbac.ml)(line 205))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 95)))" }, "message": "VDI_CBT_ENABLED(OpaqueRef:4f16cd0e-fbaf-48c3-aae4-092b9906b9e4)", "name": "XapiError", "stack": "XapiError: VDI_CBT_ENABLED(OpaqueRef:4f16cd0e-fbaf-48c3-aae4-092b9906b9e4)\n at Function.wrap (file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/_XapiError.mjs:16:12)\n at default (file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/_getTaskResult.mjs:13:29)\n at Xapi._addRecordToCache (file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/index.mjs:1041:24)\n at file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/index.mjs:1075:14\n at Array.forEach (<anonymous>)\n at Xapi._processEvents (file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/index.mjs:1065:12)\n at Xapi._watchEvents (file:///opt/xo/xo-builds/xen-orchestra-202410111017/packages/xen-api/index.mjs:1238:14)" } }
```
I can see a task disabling CBT on the disk, and the UI shows CBT as disabled.
I experience the same issue when attempting to migrate other VDIs too.
-
You need to remove all snapshots before migration and disable CBT. Storage migration is not supported when CBT is invalid. I believe XOA should do this automatically, however.
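From the CLI the sequence would be roughly the following (an untested sketch; parameter names are from memory and all UUIDs are placeholders):

```bash
# 1. Find and remove any remaining snapshots of the VM.
xe snapshot-list snapshot-of=<vm-uuid> params=uuid --minimal
xe snapshot-uninstall snapshot-uuid=<snapshot-uuid> force=true

# 2. Disable CBT on the disk so XAPI allows the storage migration.
xe vdi-disable-cbt uuid=<vdi-uuid>

# 3. Retry the move to the destination SR.
xe vdi-pool-migrate uuid=<vdi-uuid> sr-uuid=<destination-sr-uuid>
```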
-
@rtjdamen Thanks for the super fast response!
Just removed the existing snapshots and the task is proceeding.
Did you mean cbt is enabled as opposed to cbt is invalid?
-
@andyh No, CBT should be disabled; you can't migrate a CBT-enabled VDI.
-
@rtjdamen I understand now
-
After a number of XOA updates I decided to test CBT with snapshot delete again.
Instead of "can't create a stream from a metadata VDI, fall back to a base" I am seeing a more verbose error, but the issue remains the same. In a two-host pool with shared NFS storage, if I have CBT with snapshot delete enabled and a VM is migrated from host A to host B (remaining on the shared NFS SR), the next delta backup fails and a full runs. This time the error shows "Can't do delta with this vdi, transfer will be a full".
This is with XOA Latest: 5.100.0
I have attached the backup log if this helps.