XCP-ng
    ph7
    • Profile
    • Following 0
    • Followers 0
    • Topics 15
    • Posts 240
    • Groups 0

    Posts

    • RE: Just FYI: current update seams to break NUT dependancies

      You can install NUT on a separate (small) machine and let it SSH to your master to run a shutdown script.
      That's how I've done it since a while back, when the NUT package was temporarily removed from the extra repo.
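
      A minimal sketch of that setup, assuming upsmon runs as the NUT client on the small machine and its SHUTDOWNCMD (in upsmon.conf) is pointed at a script like the one below. The hostname, key path and remote script name are illustrative, not from the thread:

      ```shell
      #!/bin/sh
      # /usr/local/bin/shutdown-xcp-master.sh (hypothetical name).
      # Invoked by upsmon via SHUTDOWNCMD when the UPS reports
      # on-battery + low battery. Uses a dedicated SSH key to run a
      # site-specific shutdown script on the XCP-ng pool master, which
      # shuts the VMs down cleanly and then powers off the host.
      ssh -i /root/.ssh/nut_key root@xcp-master '/root/ups-shutdown.sh'
      ```

      The point of the separate machine is that it stays up (on the UPS) long enough to tell the master to go down, instead of NUT having to live inside dom0.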

      posted in XCP-ng
      ph7
    • RE: xcp-ng patches install fail

      https://xcp-ng.org/forum/topic/11951/just-fyi-current-update-seams-to-break-nut-dependancies

      posted in XCP-ng
    • RE: Failed backup jobs since updating

      No compression

      posted in Backup
    • RE: Failed backup jobs since updating

      @peo said:

      Error: _removeUnusedSnapshots don't handle vdi related to multiple VMs

      I had the same error when I updated to 449e7.
      It ran 2 CRs, then threw the error.
      There were other problems, so I rolled back to 5bdd7:
      https://xcp-ng.org/forum/topic/11969/timestamp-lost-in-continuous-replication

      posted in Backup
    • RE: Timestamp lost in Continuous Replication

      I did get a snapshot on one of the CR VMs with the 449e7 version.
      I don't with 5bdd7.

      posted in Backup
    • RE: Timestamp lost in Continuous Replication

      I did a rollback to 5bdd7 and got the date back, and the delta as well.

      posted in Backup
    • RE: Timestamp lost in Continuous Replication

      The latest XCP-ng update included a change regarding NTP:

      XAPI, XCP-ng's control plane, was updated to version 26.1.3.
      Added API for controlling NTP.
      

      This might be a longshot, and I don't know if this has anything to do with my "problem".
      The timestamp on ContRep VMs has always been in UTC.
      I'm using Stockholm SE as my timezone, and other bugs regarding the presentation of time have been fixed, but not this one.

      posted in Backup
    • RE: Timestamp lost in Continuous Replication
      {
        "data": {
          "mode": "delta",
          "reportWhen": "failure"
        },
        "id": "1773474108084",
        "jobId": "3dcd13a8-de8b-47b6-945b-12dbad9c6234",
        "jobName": "ContRep",
        "message": "backup",
        "scheduleId": "522d611e-7cd9-4fe0-a9e1-b409927cd8c8",
        "start": 1773474108084,
        "status": "success",
        "infos": [
          {
            "data": {
              "vms": [
                "b1940325-7c09-7342-5a90-be2185c6d5b9",
                "86ab334a-92dc-324c-0c42-43aad3ae3bc2",
                "0f5c4931-a468-e75d-fa54-e1f9da0227a1"
              ]
            },
            "message": "vms"
          }
        ],
        "tasks": [
          {
            "data": {
              "type": "VM",
              "id": "b1940325-7c09-7342-5a90-be2185c6d5b9",
              "name_label": "PiHole wifi"
            },
            "id": "1773474110343",
            "message": "backup VM",
            "start": 1773474110343,
            "status": "success",
            "tasks": [
              {
                "id": "1773474111649",
                "message": "snapshot",
                "start": 1773474111649,
                "status": "success",
                "end": 1773474113141,
                "result": "d4b0607f-0837-c7ae-5c2d-6426995470bd"
              },
              {
                "data": {
                  "id": "4f2f7ae2-024a-9ac7-add4-ffe7d569cae7",
                  "isFull": true,
                  "name_label": "Q1-ContRep",
                  "type": "SR"
                },
                "id": "1773474113142",
                "message": "export",
                "start": 1773474113142,
                "status": "success",
                "tasks": [
                  {
                    "id": "1773474114002",
                    "message": "transfer",
                    "start": 1773474114002,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1773474189159",
                        "message": "target snapshot",
                        "start": 1773474189159,
                        "status": "success",
                        "end": 1773474190075,
                        "result": "OpaqueRef:fa356b5f-4b25-d3e6-5507-6ed81c32b1d8"
                      }
                    ],
                    "end": 1773474190075,
                    "result": {
                      "size": 4299161600
                    }
                  }
                ],
                "end": 1773474190682
              }
            ],
            "end": 1773474191523
          },
          {
            "data": {
              "type": "VM",
              "id": "86ab334a-92dc-324c-0c42-43aad3ae3bc2",
              "name_label": "Home Assistant"
            },
            "id": "1773474191534",
            "message": "backup VM",
            "start": 1773474191534,
            "status": "success",
            "tasks": [
              {
                "id": "1773474191707",
                "message": "snapshot",
                "start": 1773474191707,
                "status": "success",
                "end": 1773474193196,
                "result": "c3f038c3-7ca9-cbbb-9f84-61e1fd30c9d5"
              },
              {
                "data": {
                  "id": "4f2f7ae2-024a-9ac7-add4-ffe7d569cae7",
                  "isFull": true,
                  "name_label": "Q1-ContRep",
                  "type": "SR"
                },
                "id": "1773474193196:0",
                "message": "export",
                "start": 1773474193196,
                "status": "success",
                "tasks": [
                  {
                    "id": "1773474194123",
                    "message": "transfer",
                    "start": 1773474194123,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1773474462529",
                        "message": "target snapshot",
                        "start": 1773474462529,
                        "status": "success",
                        "end": 1773474463434,
                        "result": "OpaqueRef:c13f3cab-29c8-4ef0-253f-de5998580cd9"
                      }
                    ],
                    "end": 1773474463434,
                    "result": {
                      "size": 15548284928
                    }
                  }
                ],
                "end": 1773474464311
              }
            ],
            "end": 1773474466186
          },
          {
            "data": {
              "type": "VM",
              "id": "0f5c4931-a468-e75d-fa54-e1f9da0227a1",
              "name_label": "Sync Mate"
            },
            "id": "1773474466193",
            "message": "backup VM",
            "start": 1773474466193,
            "status": "success",
            "tasks": [
              {
                "id": "1773474466371",
                "message": "snapshot",
                "start": 1773474466371,
                "status": "success",
                "end": 1773474470399,
                "result": "36a17271-1c2f-4b26-0d86-dc0faf27fa17"
              },
              {
                "data": {
                  "id": "4f2f7ae2-024a-9ac7-add4-ffe7d569cae7",
                  "isFull": true,
                  "name_label": "Q1-ContRep",
                  "type": "SR"
                },
                "id": "1773474470399:0",
                "message": "export",
                "start": 1773474470399,
                "status": "success",
                "tasks": [
                  {
                    "id": "1773474471561",
                    "message": "transfer",
                    "start": 1773474471561,
                    "status": "success",
                    "tasks": [
                      {
                        "id": "1773476263789",
                        "message": "target snapshot",
                        "start": 1773476263789,
                        "status": "success",
                        "end": 1773476264925,
                        "result": "OpaqueRef:88bd55f4-64ea-fb16-f389-4b9f42ac459f"
                      }
                    ],
                    "end": 1773476264925,
                    "result": {
                      "size": 105526591488
                    }
                  }
                ],
                "end": 1773476267354
              }
            ],
            "end": 1773476268187
          }
        ],
        "end": 1773476268187
      }
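
      The nested task structure above (backup VM → export → transfer, with result.size in bytes) can be walked mechanically. A small sketch, using a trimmed copy of the first VM's entry from the log above (only one VM and the fields the walk needs are kept):

      ```python
      # Trimmed from the ContRep log above: one "backup VM" task with a
      # snapshot step and an export step containing the transfer result.
      log = {
          "tasks": [
              {
                  "data": {"type": "VM", "name_label": "PiHole wifi"},
                  "message": "backup VM",
                  "tasks": [
                      {"message": "snapshot", "status": "success"},
                      {
                          "message": "export",
                          "tasks": [
                              {"message": "transfer", "status": "success",
                               "result": {"size": 4299161600}},
                          ],
                      },
                  ],
              },
          ],
      }

      def transfer_sizes(log):
          """Return {vm_name_label: bytes transferred} per 'backup VM' task."""
          out = {}
          for vm_task in log.get("tasks", []):
              name = vm_task.get("data", {}).get("name_label", "?")
              for step in vm_task.get("tasks", []):       # snapshot, export, ...
                  for leaf in step.get("tasks", []):       # transfer, target snapshot
                      if leaf.get("message") == "transfer":
                          out[name] = leaf.get("result", {}).get("size")
          return out

      print(transfer_sizes(log))  # {'PiHole wifi': 4299161600}
      ```

      The same walk over the full log yields the sizes for all three VMs (4.3 GB, 15.5 GB, 105.5 GB), which matches the "isFull": true in every export: these CR runs are fulls, not deltas.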
      
      posted in Backup
    • RE: Timestamp lost in Continuous Replication

      I was on 5bdd7, installed at 202603091803.

      posted in Backup
    • RE: Timestamp lost in Continuous Replication

      And they are full:
      Screenshot 2026-03-14 at 09-09-23 Backup.png

      Screenshot 2026-03-14 at 09-41-08 Backup.png

      posted in Backup
    • Timestamp lost in Continuous Replication

      Updated to 449e7 (ronivay script) today, and there used to be a timestamp at the end of the name of the Cont. Repl. VM:
      d77f9787-c488-4f13-b89f-a2050c52e691-image.png

      Now it looks like this
      Screenshot 2026-03-14 at 01-03-17 Xen Orchestra.png

      posted in Backup
    • RE: XCP-ng 8.3 updates announcements and testing

      @gduperrey
      In my homelab I've had the same problem for at least 18 months:
      when shutting down the XO VM, it hangs for 2-3 minutes while it tries to umount the remotes

      umount /run/xo-server/mounts/xxxxx (SMB and NFS)
      umount /run/xo-server/mounts/yyyyy (SMB)
      

      I did report this in an earlier thread

      posted in News
    • RE: XCP-ng 8.3 updates announcements and testing

      @rzr said:
      xo-lite: 0.18.0-1.xcpng8.3

      Well, I was, and still am, on v0.19.0:
      Screenshot 2026-03-04 at 12-22-48 Settings - XO Lite.png

      I am running an old AMD Ryzen 5 2400GE homelab with NFS,
      no XOSTOR or SDN.
      Tested NTP (default and DHCP), SSH, wget, cont. repl. ...
      all worked fine so far.

      posted in News
    • RE: backup mail report says INTERRUPTED but it's not ?

      I also have this problem in my XO CE (ronivay script) at home.
      I get the mail report after 4 days.
      A reboot resets the memory.

      The XO CE VM has 3.1 GB and the control domain has 2 GB of memory.
      Node v24.13.1
      Running Continuous Replication and Delta Backups

      posted in Backup
    • RE: VM Pool To Pool Migration over VPN

      @acebmxer said in VM Pool To Pool Migration over VPN:

      Maybe VPN overhead.

      Have you checked the VPN capacity spec of your firewalls?

      posted in Management
    • RE: Failed unmounting remotes at XO/XOA shutdown

      No idea if anyone has "fixed" anything

      No, the XO commit 5fcb6 hung for ~3 min at reboot today.
      Edit: I disabled the scheduled reboot yesterday.

      posted in Xen Orchestra
    • RE: Failed unmounting remotes at XO/XOA shutdown

      @DwightHat
      No
      I did schedule a reboot of XO every morning, and it seems to have worked, because I "forgot" about it.

      No idea if anyone has "fixed" anything

      posted in Xen Orchestra
    • RE: XO5 breaks after defaulting to XO6 (from source)

      I'm not running HTTPS and am still on Node v22.

      posted in Xen Orchestra
    • RE: XO5 breaks after defaulting to XO6 (from source)

      @Gheppy
      My links from 6 to 5 work fine.
      Haven't tested all of them, of course, but quite a few, and none of them have failed.

      posted in Xen Orchestra
    • RE: 🛰️ XO 6: dedicated thread for all your feedback!

      I run these kinds of backup jobs

      • Full Backup x2
      • Delta backup
      • Continuous Replication
      • XO Config and Metadata

      In ContRep, Conf/meta, and one of the Full Backup jobs, "Report when" shows Never:
      5afa6b9b-8502-469d-ae1a-8948f52639ec-image.png

      But they are enabled in XO5, with a mail address and "Report when" set to Skipped and Failure:
      2f6e166c-cfd6-4182-bb5e-1b79bc040a81-image.png

      posted in Xen Orchestra