XCP-ng
    ph7
    • Following 0
    • Followers 0
    • Topics 13
    • Posts 181
    • Groups 0

    Posts

    • RE: VM association with shared storage

      New job: try pool.rollingUpdate
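
      A minimal CLI sketch, assuming a registered xo-cli and that the method takes the pool UUID as a "pool" parameter (my assumption; xo-cli --list-commands shows the real signature):

        # hypothetical parameter name, sketch only
        xo-cli pool.rollingUpdate pool=<pool-uuid>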

      posted in Management
    • RE: VM association with shared storage

      @McHenry
      Use the built-in function: just press Rolling Pool Update when there are updates available.
      Home/Pools/"Ryssen"
      (screenshot attached)

      The rest is automatic, or should I say, magic 🙂

      posted in Management
    • RE: XCP-ng 8.3 updates announcements and testing

      The new template for Debian 13 is working in XO Lite 👍

      posted in News
    • RE: XO Lite: Change URL

      @mx234
      I'm running NetBird with an exit node and I get access to my LAN.
      Everything works great.

      posted in XO Lite
    • RE: Having issues installing StartOS as a VM. Cant detect a disk for it to install to.

      @rk9268vc
      I have no info for you, but is the host updated?
      (screenshot attached)

      posted in XCP-ng
    • RE: XCP-ng 8.3 updates announcements and testing

      @stormi

      "I also take this opportunity to call for more feedback on the previous batch of updates"

      Well, I updated a few days ago. Though I don't run many of the updated functions on my simple home lab, it all seems to work fine.
      i7 gen 4 and NFS.

      Now testing the new updates…

      posted in News
    • RE: Alarms in XO

      @McHenry
      Check your performance plugin settings.
      I don't remember whether a trigger level of 0.95 means 95% or 0.95%.
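
      (If the plugin reads the value as a fraction of 1, which is my assumption, then 0.95 × 100% = 95%, so the alarm would fire at 95% utilization rather than at 0.95%.)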

      posted in Management
    • RE: VM association with shared storage

      @McHenry
      You can try to schedule a rolling pool update.

      posted in Management
    • RE: Performing automated shutdown during a power failure using a USB-UPS with NUT - XCP-ng 8.2

      @Forza
      Yes, it can.
      This is how I do it:
      https://xcp-ng.org/forum/topic/10834/apcupsd-install/9
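
      For reference, a minimal NUT sketch for a USB UPS; the linked thread uses apcupsd instead, and every name and value here is illustrative:

        # /etc/nut/ups.conf — declare the USB UPS
        [myups]
          driver = usbhid-ups
          port = auto

        # /etc/nut/upsmon.conf — shut the host down when the UPS reports low battery
        MONITOR myups@localhost 1 upsmon <password> master
        SHUTDOWNCMD "/sbin/shutdown -h +0"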

      posted in Compute
    • RE: Fail backup report XO CE commit a67ad

      Since I'm not used to using git, I was running a few tests on my "test" host.
      I had made a change in my one-line update command.
      My memory is not in the best shape, so I forgot about the change and ran it in my update.
      Now it's sorted out; I ran a new update, and voilà.
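
      For context, a from-source XO CE update usually boils down to something like this (assuming the repo lives in /root/xen-orchestra, as the logs in my other posts suggest, and that xo-server runs as a systemd service):

        cd /root/xen-orchestra
        git pull --ff-only              # fetch the latest commits
        yarn && yarn build              # reinstall dependencies, rebuild the packages
        systemctl restart xo-server     # load the new build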

      posted in Backup
    • RE: Fail backup report XO CE commit a67ad

      @olivierlambert
      As usual, I'm an idiot.
      No need to involve @lsouai-vates.

      I'm checking what went wrong and will report back.

      posted in Backup
    • RE: Fail backup report XO CE commit a67ad

      @olivierlambert
      I'll report tomorrow, after the next backup tonight.

      posted in Backup
    • RE: Fail backup report XO CE commit a67ad

      I now see that I also got an error message around the time I updated:

      Unable to get the true granularity: 60
      
      vm.stats
      {
        "id": "46bfadff-6002-299e-506a-abbf9a9aa8a8"
      }
      {
        "message": "Unable to get the true granularity: 60",
        "name": "FaultyGranularity",
        "stack": "FaultyGranularity: Unable to get the true granularity: 60
          at XapiStats._getAndUpdateStats (file:///root/xen-orchestra/packages/xo-server/src/xapi-stats.mjs:396:13)
          at runNextTicks (node:internal/process/task_queues:65:5)
          at processImmediate (node:internal/timers:453:9)
          at process.callbackTrampoline (node:internal/async_hooks:130:17)
          at Task.runInside (/root/xen-orchestra/@vates/task/index.js:175:22)
          at Task.run (/root/xen-orchestra/@vates/task/index.js:159:20)
          at Api.#callApiMethod (file:///root/xen-orchestra/packages/xo-server/src/xo-mixins/api.mjs:469:18)"
      }
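
      If I read xo-server's xapi-stats.mjs right (an assumption on my part), the named stats intervals map to fixed RRD steps in seconds, and this error means the host's RRD reply did not contain the requested 60-second step:

        { "seconds": 5, "minutes": 60, "hours": 3600, "days": 86400 }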
      
      posted in Backup
    • Fail backup report XO CE commit a67ad

      Updated XO CE yesterday; this morning I got a "fail mail".
      (screenshot attached)

      I checked my backup log:

      {
        "data": {
          "mode": "delta",
          "reportWhen": "failure"
        },
        "id": "1759798920278",
        "jobId": "5f847711-37e6-4b87-9057-54ec653288dd",
        "jobName": "Daily_Offline",
        "message": "backup",
        "scheduleId": "98c42ed9-df3d-4094-8786-338d343f7736",
        "start": 1759798920278,
        "status": "success",
        "infos": [
          {
            "data": {
              "vms": [
                "5142d7d2-e278-f380-8a9f-cb7fc45f8f9c",
                "0f5c4931-a468-e75d-fa54-e1f9da0227a1",
                "0334c083-8543-57dc-6a3d-908e1aa772b4"
              ]
            },
            "message": "vms"
          }
        ],
        "tasks": [
          {
            "data": {
              "type": "VM",
              "id": "5142d7d2-e278-f380-8a9f-cb7fc45f8f9c",
              "name_label": "Docker"
            },
            "id": "1759798923897",
            "message": "backup VM",
            "start": 1759798923897,
            "status": "success",
            "tasks": [
              {
                "id": "1759798923910",
                "message": "clean-vm",
                "start": 1759798923910,
                "status": "success",
                "end": 1759798924717,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759798933285",
                "message": "snapshot",
                "start": 1759798933285,
                "status": "success",
                "end": 1759798934156,
                "result": "7084f483-51a5-4950-77a0-a371c6f95d21"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759798934157",
                "message": "export",
                "start": 1759798934157,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759798935946",
                    "message": "transfer",
                    "start": 1759798935946,
                    "status": "success",
                    "end": 1759798952957,
                    "result": {
                      "size": 1426063360
                    }
                  },
                  {
                    "id": "1759798954177",
                    "message": "clean-vm",
                    "start": 1759798954177,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759798954342",
                        "message": "merge",
                        "start": 1759798954342,
                        "status": "success",
                        "end": 1759798992668
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/5142d7d2-e278-f380-8a9f-cb7fc45f8f9c/20251007T010215Z.json",
                          "actual": 1426063360,
                          "expected": 1426563072
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759798992757,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759798992759
              }
            ],
            "end": 1759798992759
          },
          {
            "data": {
              "type": "VM",
              "id": "0f5c4931-a468-e75d-fa54-e1f9da0227a1",
              "name_label": "Sync Mate"
            },
            "id": "1759798992785",
            "message": "backup VM",
            "start": 1759798992785,
            "status": "success",
            "tasks": [
              {
                "id": "1759798992797",
                "message": "clean-vm",
                "start": 1759798992797,
                "status": "success",
                "end": 1759798993754,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759799002480",
                "message": "snapshot",
                "start": 1759799002480,
                "status": "success",
                "end": 1759799004460,
                "result": "37d321ca-ea78-2e81-a630-3f93f9ac2ca2"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759799004461",
                "message": "export",
                "start": 1759799004461,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759799007511",
                    "message": "transfer",
                    "start": 1759799007511,
                    "status": "success",
                    "end": 1759799026016,
                    "result": {
                      "size": 1134559232
                    }
                  },
                  {
                    "id": "1759799027505",
                    "message": "clean-vm",
                    "start": 1759799027505,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759799027717",
                        "message": "merge",
                        "start": 1759799027717,
                        "status": "success",
                        "end": 1759799048142
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/0f5c4931-a468-e75d-fa54-e1f9da0227a1/20251007T010327Z.json",
                          "actual": 1134559232,
                          "expected": 1135043072
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759799048229,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759799048231
              }
            ],
            "end": 1759799048231
          },
          {
            "data": {
              "type": "VM",
              "id": "0334c083-8543-57dc-6a3d-908e1aa772b4",
              "name_label": "Home Assistant"
            },
            "id": "1759799048256",
            "message": "backup VM",
            "start": 1759799048256,
            "status": "success",
            "tasks": [
              {
                "id": "1759799048268",
                "message": "clean-vm",
                "start": 1759799048268,
                "status": "success",
                "end": 1759799049008,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759799067232",
                "message": "snapshot",
                "start": 1759799067232,
                "status": "success",
                "end": 1759799068300,
                "result": "24fd84b8-f1a0-abbf-b8d7-048387eb22e9"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759799068301",
                "message": "export",
                "start": 1759799068301,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759799070236",
                    "message": "transfer",
                    "start": 1759799070236,
                    "status": "success",
                    "end": 1759799079551,
                    "result": {
                      "size": 713031680
                    }
                  },
                  {
                    "id": "1759799080824",
                    "message": "clean-vm",
                    "start": 1759799080824,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759799080992",
                        "message": "merge",
                        "start": 1759799080992,
                        "status": "success",
                        "end": 1759799116807
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/0334c083-8543-57dc-6a3d-908e1aa772b4/20251007T010430Z.json",
                          "actual": 713031680,
                          "expected": 713273344
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759799116916,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759799116917
              }
            ],
            "end": 1759799116918
          }
        ],
        "end": 1759799116918
      }
      

      The usual "cleanVm: incorrect backup size in metadata" warning was there, but otherwise it seemed OK.
      Perhaps this is a suspect:

                    "result": {
                      "merge": true
                    }
      

      There might be one } missing, but what do I know...

      So I ran a restore health check and got this message:

      INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, "16: Device or resource busy")))
      
      This is a XenServer/XCP-ng error
      
      backupNg.checkBackup
      {
        "id": "0927f4dc-4476-4ac3-959a-bc37e85956df//xo-vm-backups/0334c083-8543-57dc-6a3d-908e1aa772b4/20251007T010430Z.json",
        "settings": {
          "mapVdisSrs": {}
        },
        "sr": "ebc70898-d9c2-33dc-b22b-a465e39075a2"
      }
      {
        "code": "INTERNAL_ERROR",
        "params": [
          "xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\"))"
        ],
        "task": {
          "uuid": "3a02bd0d-adbc-67ba-a496-343846190081",
          "name_label": "Async.VM.start",
          "name_description": "",
          "allowed_operations": [],
          "current_operations": {},
          "created": "20251007T07:04:17Z",
          "finished": "20251007T07:04:32Z",
          "status": "failure",
          "resident_on": "OpaqueRef:ed939ef0-4062-e338-ea05-6fc86232259e",
          "progress": 1,
          "type": "<none/>",
          "result": "",
          "error_info": [
            "INTERNAL_ERROR",
            "xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\"))"
          ],
          "other_config": {
            "debug_info:cancel_points_seen": "1"
          },
          "subtask_of": "OpaqueRef:NULL",
          "subtasks": [],
          "backtrace": "(((process xenopsd-xc)(filename lib/backtrace.ml)(line 210))((process xenopsd-xc)(filename ocaml/xenopsd/xc/device.ml)(line 1298))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2157))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1893))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1901))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2490))((process xenopsd-xc)(filename list.ml)(line 121))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2483))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2637))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1893))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1901))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3294))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3304))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3325))((process xenopsd-xc)(filename ocaml/xapi-idl/lib/task_server.ml)(line 192))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3363))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3742))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3372))((process xapi)(filename lib/backtrace.ml)(line 210))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3378))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3748))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3372))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3481))((process xapi)(filename ocaml/xapi/xapi_vm.ml)(line 353))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 141))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1514))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1896))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1878))((process xapi)(filename ocaml/xapi/rbac.ml)(line 188))((process xapi)(filename ocaml/xapi/rbac.ml)(line 197))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 77)))"
        },
        "message": "INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\")))",
        "name": "XapiError",
        "stack": "XapiError: INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\")))
          at Function.wrap (file:///root/xen-orchestra/packages/xen-api/_XapiError.mjs:16:12)
          at default (file:///root/xen-orchestra/packages/xen-api/_getTaskResult.mjs:13:29)
          at Xapi._addRecordToCache (file:///root/xen-orchestra/packages/xen-api/index.mjs:1073:24)
          at file:///root/xen-orchestra/packages/xen-api/index.mjs:1107:14
          at Array.forEach (<anonymous>)
          at Xapi._processEvents (file:///root/xen-orchestra/packages/xen-api/index.mjs:1097:12)
          at Xapi._watchEvents (file:///root/xen-orchestra/packages/xen-api/index.mjs:1270:14)
          at runNextTicks (node:internal/process/task_queues:65:5)
          at processImmediate (node:internal/timers:453:9)
          at process.callbackTrampoline (node:internal/async_hooks:130:17)"
      }
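
      A guess, and only a guess: 0000:02:00.3 looks like a PCI address, so the health-check VM may have tried to attach a passed-through PCI device that the original, still-running VM already holds, hence "Device or resource busy".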
      

      Edit: deleted a silly remark.

      posted in Backup
    • RE: XO6 dashboard and backup view feedback

      @lsouai-vates
      👍

      posted in Backup
    • XO6 dashboard and backup view feedback

      XO version: commit cf044.
      In the dashboard, when a scheduled replication job has run, a message is shown:
      (screenshot: Xen Orchestra dashboard, 2025-10-02 08:42)

      It is not cleared after the job has run; you have to refresh the page.


      In the dashboard, before the replication job has run:
      (screenshot: Xen Orchestra dashboard, 2025-10-02 08:30)

      If I remember correctly, it used to say "All is good" or something like that.
      At least it should.

      But in the backup view it's OK 👍
      (screenshot: Xen Orchestra backup view, 2025-10-02 08:31)

      posted in Backup
    • RE: visual bug / backup state filter

      @olivierlambert
      Not by me.

      posted in Backup
    • RE: visual bug / backup state filter

      @olivierlambert
      I reported this in April:
      https://xcp-ng.org/forum/topic/10755/feedback-xo-v6

      posted in Backup
    • RE: XCP-ng 8.3 updates announcements and testing

      @gduperrey
      Ran updates on my old hosts (i7 gen 4 and Ryzen 5).
      Nothing has exploded yet after ~10 h of "testing".

      posted in News
    • RE: Boot New VM to ISO in XO-Lite?

      @MathieuRA
      Yes, I can confirm that if I disable the Secure Boot option, the VM will start.
      After a quick flash of the PXE boot screen, the Debian installer appears and I can continue the installation.
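
      For anyone who prefers the CLI, the same toggle should be reachable from dom0 via the VM's platform flags (the parameter name is my reading of the XCP-ng UEFI docs, and the values are illustrative):

        # run on the host while the VM is halted
        xe vm-param-set uuid=<vm-uuid> platform:secureboot=false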

      posted in XO Lite