Hooray, fix_xva_import_thin works! Huge thanks to @florent for the fix!
FWIW, jumping around between commits was a breeze thanks to that external tool.
@olivierlambert Indeed, it's the tool that @john-c mentioned. While not official, it's a fairly useful and popular one that significantly lowered the barrier to entry for getting me started with XO from sources.
I mostly made that post for other people in my exact situation, and even mentioned the external tool in the prerequisites header. I didn't want to link the external repo and risk it being received as spam, and I'm sure you guys wouldn't be thrilled to have an external tool like that mistakenly wind up in your support queue. I figured those who already use it would recognize it. I'm just a humble, casual homelab user and thought there might be someone else out there pooping their pants in a similar way.
I'll give some of the other commits mentioned in this thread a try and let you know how it goes!
No luck on latest master, commit 5d4723e, I'm afraid. Going to try fix_xva_import_thin next.
{
  "id": "0ls6joz2k",
  "properties": {
    "name": "importing vms 30",
    "userId": "f424aa77-c82d-432a-9228-709a614019e6",
    "total": 1,
    "done": 0,
    "progress": 0
  },
  "start": 1706993226381,
  "status": "failure",
  "updatedAt": 1706994410318,
  "tasks": [
    {
      "id": "8ggmvpx0fno",
      "properties": {
        "name": "importing vm 30",
        "done": 1,
        "progress": 100
      },
      "start": 1706993226382,
      "status": "failure",
      "tasks": [
        {
          "id": "juhjiy9w6y7",
          "properties": {
            "name": "connecting to my-notsoawesome-vmware-server.mydomain.net"
          },
          "start": 1706993226383,
          "status": "success",
          "end": 1706993226546,
          "result": {
            "_events": {},
            "_eventsCount": 0
          }
        },
        {
          "id": "79wfj2zidiu",
          "properties": {
            "name": "get metadata of 30"
          },
          "start": 1706993226547,
          "status": "success",
          "end": 1706993226851,
          "result": {
            "name_label": "my-awesome-vm",
            "memory": 2147483648,
            "nCpus": 2,
            "guestToolsInstalled": false,
            "firmware": "uefi",
            "powerState": "poweredOff",
            "disks": [
              {
                "capacity": 64424509440,
                "isFull": true,
                "uid": "94dc7d43",
                "fileName": "my-awesome-vm-flat.vmdk",
                "parentId": "ffffffff",
                "vmdFormat": "VMFS",
                "nameLabel": "my-awesome-vm-flat.vmdk",
                "datastore": "my-awesome-vmware-nfs-datastore",
                "path": "my-awesome-vm",
                "descriptionLabel": " from esxi",
                "node": "scsi0:0"
              }
            ],
            "networks": [
              {
                "label": "my-awesome-network",
                "macAddress": "00:50:56:9f:8a:cc",
                "isGenerated": false
              }
            ]
          }
        },
        {
          "id": "soh8ame3hok",
          "properties": {
            "name": "build disks and snapshots chains for 30"
          },
          "start": 1706993226852,
          "status": "success",
          "end": 1706993226852,
          "result": {
            "scsi0:0": [
              {
                "capacity": 64424509440,
                "isFull": true,
                "uid": "94dc7d43",
                "fileName": "my-awesome-vm-flat.vmdk",
                "parentId": "ffffffff",
                "vmdFormat": "VMFS",
                "nameLabel": "my-awesome-vm-flat.vmdk",
                "datastore": "my-awesome-vmware-nfs-datastore",
                "path": "my-awesome-vm",
                "descriptionLabel": " from esxi",
                "node": "scsi0:0"
              }
            ]
          }
        },
        {
          "id": "wvybfjogex",
          "properties": {
            "name": "creating MV on XCP side"
          },
          "start": 1706993226852,
          "status": "success",
          "end": 1706993226954,
          "result": {
            "uuid": "a3dd5318-24ec-063b-3f98-9f66d50312e4",
            "allowed_operations": [
              "changing_NVRAM",
              "changing_dynamic_range",
              "changing_shadow_memory",
              "changing_static_range",
              "make_into_template",
              "migrate_send",
              "destroy",
              "export",
              "start_on",
              "start",
              "clone",
              "copy",
              "snapshot"
            ],
            "current_operations": {},
            "name_label": "my-awesome-vm",
            "name_description": "from esxi",
            "power_state": "Halted",
            "user_version": 1,
            "is_a_template": false,
            "is_default_template": false,
            "suspend_VDI": "OpaqueRef:NULL",
            "resident_on": "OpaqueRef:NULL",
            "scheduled_to_be_resident_on": "OpaqueRef:NULL",
            "affinity": "OpaqueRef:NULL",
            "memory_overhead": 20971520,
            "memory_target": 0,
            "memory_static_max": 2147483648,
            "memory_dynamic_max": 2147483648,
            "memory_dynamic_min": 2147483648,
            "memory_static_min": 2147483648,
            "VCPUs_params": {},
            "VCPUs_max": 2,
            "VCPUs_at_startup": 2,
            "actions_after_shutdown": "destroy",
            "actions_after_reboot": "restart",
            "actions_after_crash": "restart",
            "consoles": [],
            "VIFs": [],
            "VBDs": [],
            "VUSBs": [],
            "crash_dumps": [],
            "VTPMs": [],
            "PV_bootloader": "",
            "PV_kernel": "",
            "PV_ramdisk": "",
            "PV_args": "",
            "PV_bootloader_args": "",
            "PV_legacy_args": "",
            "HVM_boot_policy": "BIOS order",
            "HVM_boot_params": {
              "order": "cdn"
            },
            "HVM_shadow_multiplier": 1,
            "platform": {
              "timeoffset": "0",
              "nx": "true",
              "acpi": "1",
              "apic": "true",
              "pae": "true",
              "hpet": "true",
              "viridian": "true"
            },
            "PCI_bus": "",
            "other_config": {
              "mac_seed": "22ad11b7-fb42-517f-91b1-1e834eb184af",
              "vgpu_pci": "",
              "base_template_name": "Other install media",
              "install-methods": "cdrom"
            },
            "domid": -1,
            "domarch": "",
            "last_boot_CPU_flags": {},
            "is_control_domain": false,
            "metrics": "OpaqueRef:d405d6a0-178b-45b2-8a92-5db36d65a220",
            "guest_metrics": "OpaqueRef:NULL",
            "last_booted_record": "",
            "recommendations": "<restrictions><restriction field=\"memory-static-max\" max=\"137438953472\" /><restriction field=\"vcpus-max\" max=\"32\" /><restriction property=\"number-of-vbds\" max=\"255\" /><restriction property=\"number-of-vifs\" max=\"7\" /><restriction field=\"has-vendor-device\" value=\"false\" /></restrictions>",
            "xenstore_data": {},
            "ha_always_run": false,
            "ha_restart_priority": "",
            "is_a_snapshot": false,
            "snapshot_of": "OpaqueRef:NULL",
            "snapshots": [],
            "snapshot_time": "19700101T00:00:00Z",
            "transportable_snapshot_id": "",
            "blobs": {},
            "tags": [],
            "blocked_operations": {},
            "snapshot_info": {},
            "snapshot_metadata": "",
            "parent": "OpaqueRef:NULL",
            "children": [],
            "bios_strings": {},
            "protection_policy": "OpaqueRef:NULL",
            "is_snapshot_from_vmpp": false,
            "snapshot_schedule": "OpaqueRef:NULL",
            "is_vmss_snapshot": false,
            "appliance": "OpaqueRef:NULL",
            "start_delay": 0,
            "shutdown_delay": 0,
            "order": 0,
            "VGPUs": [],
            "attached_PCIs": [],
            "suspend_SR": "OpaqueRef:NULL",
            "version": 0,
            "generation_id": "0:0",
            "hardware_platform_version": 0,
            "has_vendor_device": false,
            "requires_reboot": false,
            "reference_label": "",
            "domain_type": "hvm",
            "NVRAM": {}
          }
        },
        {
          "id": "fx32r2hmr3w",
          "properties": {
            "name": "Cold import of disks scsi0:0"
          },
          "start": 1706993226955,
          "status": "failure",
          "end": 1706994410295,
          "result": {
            "message": "already finalized or destroyed",
            "name": "Error",
            "stack": "Error: already finalized or destroyed\n    at Pack.entry (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/tar-stream/pack.js:138:51)\n    at Pack.resolver (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:5:6)\n    at Promise._execute (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/debuggability.js:384:9)\n    at Promise._resolveFromExecutor (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:518:18)\n    at new Promise (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:103:10)\n    at Pack.fromCallback (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:9:10)\n    at writeBlock (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:6:22)\n    at addDisk (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:27:13)\n    at importVm (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVm.mjs:22:5)\n    at importVdi (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVdi.mjs:6:17)\n    at file:///opt/xo/xo-builds/xen-orchestra-202402031433/packages/xo-server/src/xo-mixins/migrate-vm.mjs:260:21\n    at Task.runInside (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:158:22)\n    at Task.run (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:141:20)"
          }
        }
      ],
      "end": 1706994410318,
      "result": {
        "message": "already finalized or destroyed",
        "name": "Error",
        "stack": "Error: already finalized or destroyed\n    at Pack.entry (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/tar-stream/pack.js:138:51)\n    at Pack.resolver (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:5:6)\n    at Promise._execute (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/debuggability.js:384:9)\n    at Promise._resolveFromExecutor (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:518:18)\n    at new Promise (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:103:10)\n    at Pack.fromCallback (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:9:10)\n    at writeBlock (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:6:22)\n    at addDisk (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:27:13)\n    at importVm (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVm.mjs:22:5)\n    at importVdi (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVdi.mjs:6:17)\n    at file:///opt/xo/xo-builds/xen-orchestra-202402031433/packages/xo-server/src/xo-mixins/migrate-vm.mjs:260:21\n    at Task.runInside (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:158:22)\n    at Task.run (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:141:20)"
      }
    }
  ],
  "end": 1706994410318,
  "result": {
    "succeeded": {},
    "message": "already finalized or destroyed",
    "name": "Error",
    "stack": "Error: already finalized or destroyed\n    at Pack.entry (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/tar-stream/pack.js:138:51)\n    at Pack.resolver (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:5:6)\n    at Promise._execute (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/debuggability.js:384:9)\n    at Promise._resolveFromExecutor (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:518:18)\n    at new Promise (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/bluebird/js/release/promise.js:103:10)\n    at Pack.fromCallback (/opt/xo/xo-builds/xen-orchestra-202402031433/node_modules/promise-toolbox/fromCallback.js:9:10)\n    at writeBlock (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:6:22)\n    at addDisk (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/_writeDisk.mjs:27:13)\n    at importVm (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVm.mjs:22:5)\n    at importVdi (file:///opt/xo/xo-builds/xen-orchestra-202402031433/@xen-orchestra/xva/importVdi.mjs:6:17)\n    at file:///opt/xo/xo-builds/xen-orchestra-202402031433/packages/xo-server/src/xo-mixins/migrate-vm.mjs:260:21\n    at Task.runInside (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:158:22)\n    at Task.run (/opt/xo/xo-builds/xen-orchestra-202402031433/@vates/task/index.js:141:20)"
  }
}
I ran into the exact same issue with the "already finalized or destroyed" message. I'm running XO from sources; it seems to have started after 5.91 was released, and it's happened to multiple VMs, all with a single disk.
For those who are...
...here's what got me back to a working state.
Only consider doing this if you're in this exact, recently developed situation. It is not an appropriate long-term fix, and it may leave you with a vulnerable instance of Xen Orchestra if enough time has passed since this post.
1. Run ./xo-install.sh and attempt to use the built-in rollback feature.
2. Edit xo-install.cfg and change the BRANCH variable from "master" to "fix_xva_import_thin" (commit 89a4de5b21104fb3fa4a6c301eb3fe98328c90d0). This will pin your installation to a specific commit.
3. Run ./xo-install.sh again and select the option to install.

I hope at least one person finds this helpful. Like probably many others, I'm in the middle of a migration from VMware in my homelab. Really digging XCP-ng and Xen Orchestra so far, and it feels like the switch was long overdue. Thanks for all the awesome work!
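For reference, the config edit in step 2 can be done with a one-liner. This is only a sketch: it assumes you run it from the directory containing xo-install.cfg and that the file already has a BRANCH= line; adjust the path to your own setup.

```shell
# Sketch: pin the installer to the fix branch by rewriting the BRANCH line.
# Assumes xo-install.cfg is in the current directory and contains a BRANCH= line.
sed -i 's/^BRANCH=.*/BRANCH="fix_xva_import_thin"/' xo-install.cfg
grep '^BRANCH=' xo-install.cfg   # should now show BRANCH="fix_xva_import_thin"
# ./xo-install.sh                # then re-run the installer and choose install
```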