XCP-ng

    Failed backup report: XO CE commit a67ad

    Solved | Backup | 6 Posts | 2 Posters
    • ph7 (last edited by ph7)

      I updated XO CE yesterday, and this morning I got a "fail mail":
      [screenshot of the failure e-mail]

      I checked my backup log:

      {
        "data": {
          "mode": "delta",
          "reportWhen": "failure"
        },
        "id": "1759798920278",
        "jobId": "5f847711-37e6-4b87-9057-54ec653288dd",
        "jobName": "Daily_Offline",
        "message": "backup",
        "scheduleId": "98c42ed9-df3d-4094-8786-338d343f7736",
        "start": 1759798920278,
        "status": "success",
        "infos": [
          {
            "data": {
              "vms": [
                "5142d7d2-e278-f380-8a9f-cb7fc45f8f9c",
                "0f5c4931-a468-e75d-fa54-e1f9da0227a1",
                "0334c083-8543-57dc-6a3d-908e1aa772b4"
              ]
            },
            "message": "vms"
          }
        ],
        "tasks": [
          {
            "data": {
              "type": "VM",
              "id": "5142d7d2-e278-f380-8a9f-cb7fc45f8f9c",
              "name_label": "Docker"
            },
            "id": "1759798923897",
            "message": "backup VM",
            "start": 1759798923897,
            "status": "success",
            "tasks": [
              {
                "id": "1759798923910",
                "message": "clean-vm",
                "start": 1759798923910,
                "status": "success",
                "end": 1759798924717,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759798933285",
                "message": "snapshot",
                "start": 1759798933285,
                "status": "success",
                "end": 1759798934156,
                "result": "7084f483-51a5-4950-77a0-a371c6f95d21"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759798934157",
                "message": "export",
                "start": 1759798934157,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759798935946",
                    "message": "transfer",
                    "start": 1759798935946,
                    "status": "success",
                    "end": 1759798952957,
                    "result": {
                      "size": 1426063360
                    }
                  },
                  {
                    "id": "1759798954177",
                    "message": "clean-vm",
                    "start": 1759798954177,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759798954342",
                        "message": "merge",
                        "start": 1759798954342,
                        "status": "success",
                        "end": 1759798992668
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/5142d7d2-e278-f380-8a9f-cb7fc45f8f9c/20251007T010215Z.json",
                          "actual": 1426063360,
                          "expected": 1426563072
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759798992757,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759798992759
              }
            ],
            "end": 1759798992759
          },
          {
            "data": {
              "type": "VM",
              "id": "0f5c4931-a468-e75d-fa54-e1f9da0227a1",
              "name_label": "Sync Mate"
            },
            "id": "1759798992785",
            "message": "backup VM",
            "start": 1759798992785,
            "status": "success",
            "tasks": [
              {
                "id": "1759798992797",
                "message": "clean-vm",
                "start": 1759798992797,
                "status": "success",
                "end": 1759798993754,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759799002480",
                "message": "snapshot",
                "start": 1759799002480,
                "status": "success",
                "end": 1759799004460,
                "result": "37d321ca-ea78-2e81-a630-3f93f9ac2ca2"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759799004461",
                "message": "export",
                "start": 1759799004461,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759799007511",
                    "message": "transfer",
                    "start": 1759799007511,
                    "status": "success",
                    "end": 1759799026016,
                    "result": {
                      "size": 1134559232
                    }
                  },
                  {
                    "id": "1759799027505",
                    "message": "clean-vm",
                    "start": 1759799027505,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759799027717",
                        "message": "merge",
                        "start": 1759799027717,
                        "status": "success",
                        "end": 1759799048142
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/0f5c4931-a468-e75d-fa54-e1f9da0227a1/20251007T010327Z.json",
                          "actual": 1134559232,
                          "expected": 1135043072
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759799048229,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759799048231
              }
            ],
            "end": 1759799048231
          },
          {
            "data": {
              "type": "VM",
              "id": "0334c083-8543-57dc-6a3d-908e1aa772b4",
              "name_label": "Home Assistant"
            },
            "id": "1759799048256",
            "message": "backup VM",
            "start": 1759799048256,
            "status": "success",
            "tasks": [
              {
                "id": "1759799048268",
                "message": "clean-vm",
                "start": 1759799048268,
                "status": "success",
                "end": 1759799049008,
                "result": {
                  "merge": false
                }
              },
              {
                "id": "1759799067232",
                "message": "snapshot",
                "start": 1759799067232,
                "status": "success",
                "end": 1759799068300,
                "result": "24fd84b8-f1a0-abbf-b8d7-048387eb22e9"
              },
              {
                "data": {
                  "id": "0927f4dc-4476-4ac3-959a-bc37e85956df",
                  "isFull": false,
                  "type": "remote"
                },
                "id": "1759799068301",
                "message": "export",
                "start": 1759799068301,
                "status": "success",
                "tasks": [
                  {
                    "id": "1759799070236",
                    "message": "transfer",
                    "start": 1759799070236,
                    "status": "success",
                    "end": 1759799079551,
                    "result": {
                      "size": 713031680
                    }
                  },
                  {
                    "id": "1759799080824",
                    "message": "clean-vm",
                    "start": 1759799080824,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1759799080992",
                        "message": "merge",
                        "start": 1759799080992,
                        "status": "success",
                        "end": 1759799116807
                      }
                    ],
                    "warnings": [
                      {
                        "data": {
                          "path": "/xo-vm-backups/0334c083-8543-57dc-6a3d-908e1aa772b4/20251007T010430Z.json",
                          "actual": 713031680,
                          "expected": 713273344
                        },
                        "message": "cleanVm: incorrect backup size in metadata"
                      }
                    ],
                    "end": 1759799116916,
                    "result": {
                      "merge": true
                    }
                  }
                ],
                "end": 1759799116917
              }
            ],
            "end": 1759799116918
          }
        ],
        "end": 1759799116918
      }
      

      The usual "cleanVm: incorrect backup size in metadata" warning was there, but otherwise it seemed OK. Perhaps this is the suspect:

                    "result": {
                      "merge": true
                    }
      

      There might be one } missing, but what do I know...
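As an aside, warnings buried in a log like this can be pulled out programmatically. A minimal sketch (not an official XO tool; it only assumes the `tasks` / `warnings` / `data.actual` / `data.expected` field names visible in the pasted log):

```python
def collect_size_warnings(node, found=None):
    """Recursively gather "incorrect backup size" warnings from a log entry and its sub-tasks."""
    if found is None:
        found = []
    for w in node.get("warnings", []):
        if "incorrect backup size" in w.get("message", ""):
            d = w.get("data", {})
            found.append((d["path"], d["expected"] - d["actual"]))
    for sub in node.get("tasks", []):
        collect_size_warnings(sub, found)
    return found

# Minimal excerpt of the log structure shown above (one VM, one warning)
sample = {
    "tasks": [{
        "tasks": [{
            "warnings": [{
                "data": {"path": ".../20251007T010215Z.json",
                         "actual": 1426063360,
                         "expected": 1426563072},
                "message": "cleanVm: incorrect backup size in metadata",
            }]
        }]
    }]
}

for path, delta in collect_size_warnings(sample):
    print(f"{path}: expected exceeds actual by {delta} bytes")
```

Run against the full log above, this prints one line per VM, each off by a few hundred KiB, matching the three warnings shown.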

      So I ran a restore health check and got this message:

      INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, "16: Device or resource busy")))
      
      This is a XenServer/XCP-ng error; it looks like the health-check VM failed to start because PCI device 0000:02:00.3 could not be attached (device or resource busy).
      
      backupNg.checkBackup
      {
        "id": "0927f4dc-4476-4ac3-959a-bc37e85956df//xo-vm-backups/0334c083-8543-57dc-6a3d-908e1aa772b4/20251007T010430Z.json",
        "settings": {
          "mapVdisSrs": {}
        },
        "sr": "ebc70898-d9c2-33dc-b22b-a465e39075a2"
      }
      {
        "code": "INTERNAL_ERROR",
        "params": [
          "xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\"))"
        ],
        "task": {
          "uuid": "3a02bd0d-adbc-67ba-a496-343846190081",
          "name_label": "Async.VM.start",
          "name_description": "",
          "allowed_operations": [],
          "current_operations": {},
          "created": "20251007T07:04:17Z",
          "finished": "20251007T07:04:32Z",
          "status": "failure",
          "resident_on": "OpaqueRef:ed939ef0-4062-e338-ea05-6fc86232259e",
          "progress": 1,
          "type": "<none/>",
          "result": "",
          "error_info": [
            "INTERNAL_ERROR",
            "xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\"))"
          ],
          "other_config": {
            "debug_info:cancel_points_seen": "1"
          },
          "subtask_of": "OpaqueRef:NULL",
          "subtasks": [],
          "backtrace": "(((process xenopsd-xc)(filename lib/backtrace.ml)(line 210))((process xenopsd-xc)(filename ocaml/xenopsd/xc/device.ml)(line 1298))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2157))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1893))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1901))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2490))((process xenopsd-xc)(filename list.ml)(line 121))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2483))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 2637))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1893))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 1901))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3294))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3304))((process xenopsd-xc)(filename ocaml/xenopsd/lib/xenops_server.ml)(line 3325))((process xenopsd-xc)(filename ocaml/xapi-idl/lib/task_server.ml)(line 192))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3363))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3742))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3372))((process xapi)(filename lib/backtrace.ml)(line 210))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3378))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3748))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3372))((process xapi)(filename ocaml/xapi/xapi_xenops.ml)(line 3481))((process xapi)(filename ocaml/xapi/xapi_vm.ml)(line 353))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 141))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1514))((process xapi)(filename 
ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1896))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 24))((process xapi)(filename ocaml/libs/xapi-stdext/lib/xapi-stdext-pervasives/pervasiveext.ml)(line 39))((process xapi)(filename ocaml/xapi/message_forwarding.ml)(line 1878))((process xapi)(filename ocaml/xapi/rbac.ml)(line 188))((process xapi)(filename ocaml/xapi/rbac.ml)(line 197))((process xapi)(filename ocaml/xapi/server_helpers.ml)(line 77)))"
        },
        "message": "INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\")))",
        "name": "XapiError",
        "stack": "XapiError: INTERNAL_ERROR(xenopsd internal error: Cannot_add(0000:02:00.3, Xenctrlext.Unix_error(4, \"16: Device or resource busy\")))
          at Function.wrap (file:///root/xen-orchestra/packages/xen-api/_XapiError.mjs:16:12)
          at default (file:///root/xen-orchestra/packages/xen-api/_getTaskResult.mjs:13:29)
          at Xapi._addRecordToCache (file:///root/xen-orchestra/packages/xen-api/index.mjs:1073:24)
          at file:///root/xen-orchestra/packages/xen-api/index.mjs:1107:14
          at Array.forEach (<anonymous>)
          at Xapi._processEvents (file:///root/xen-orchestra/packages/xen-api/index.mjs:1097:12)
          at Xapi._watchEvents (file:///root/xen-orchestra/packages/xen-api/index.mjs:1270:14)
          at runNextTicks (node:internal/process/task_queues:65:5)
          at processImmediate (node:internal/timers:453:9)
          at process.callbackTrampoline (node:internal/async_hooks:130:17)"
      }
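The meat of that wall of text is the Cannot_add(...) string, which packs a PCI address (0000:02:00.3) and a Unix errno (16, "Device or resource busy"). A hedged sketch of a parser for it, useful when grepping host logs (hypothetical helper, not part of XO or XAPI):

```python
import re

def parse_cannot_add(err):
    """Extract (pci_address, errno, reason) from a xenopsd Cannot_add error string."""
    m = re.search(
        r'Cannot_add\(([0-9a-fA-F:.]+), Xenctrlext\.Unix_error\(\d+, "(\d+): ([^"]+)"\)\)',
        err,
    )
    return m.groups() if m else None

err = ('xenopsd internal error: Cannot_add(0000:02:00.3, '
       'Xenctrlext.Unix_error(4, "16: Device or resource busy"))')
print(parse_cannot_add(err))  # ('0000:02:00.3', '16', 'Device or resource busy')
```

Errno 16 is EBUSY: something on the host already owns PCI function 0000:02:00.3 when the health-check VM tries to start. With PCI passthrough configured, a common cause is the device still being attached to another running VM.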
      

      (edit: deleted a silly remark)

    • ph7 @ph7

        I now see that I also got an error message around the time I updated:

        Unable to get the true granularity: 60
        
        vm.stats
        {
          "id": "46bfadff-6002-299e-506a-abbf9a9aa8a8"
        }
        {
          "message": "Unable to get the true granularity: 60",
          "name": "FaultyGranularity",
          "stack": "FaultyGranularity: Unable to get the true granularity: 60
            at XapiStats._getAndUpdateStats (file:///root/xen-orchestra/packages/xo-server/src/xapi-stats.mjs:396:13)
            at runNextTicks (node:internal/process/task_queues:65:5)
            at processImmediate (node:internal/timers:453:9)
            at process.callbackTrampoline (node:internal/async_hooks:130:17)
            at Task.runInside (/root/xen-orchestra/@vates/task/index.js:175:22)
            at Task.run (/root/xen-orchestra/@vates/task/index.js:159:20)
            at Api.#callApiMethod (file:///root/xen-orchestra/packages/xo-server/src/xo-mixins/api.mjs:469:18)"
        }
        
        • olivierlambert (Vates 🪐 Co-Founder & CEO)

          Thanks for your feedback. Adding @lsouai-vates to the loop to let the XO team know something might be broken on master.

          • ph7 @olivierlambert

            @olivierlambert
            I'll report back tomorrow, after tonight's backup.

            • ph7 @olivierlambert

              @olivierlambert
              As usual, I'm an idiot. No need to involve @lsouai-vates.

              I'm checking what went wrong and will report back.

              • ph7 @ph7

                Since I'm not used to git, I was running a few tests on my "test" host.
                I had made a change to my one-line update command, but my memory is not in the best shape, so I forgot about the change and ran it during the update.
                It's sorted out now: I ran a fresh update and voilà.

                • olivierlambert marked this topic as a question, then as solved.
