Just updated a day ago.
All of the backups that are failing have no existing snapshots.
This seems to be because each one has 3 (one has 4) base copies, as it's not coalescing.
Output of grep -A 5 -B 5 -i exception /var/log/SMlog
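For anyone hitting the same thing, a rough way to confirm the stuck coalesce from the pool master's shell is below (a sketch; the "base copy" name-label convention and the idea of kicking the GC with an sr-scan are assumptions on my part, not official guidance):

# count the leftover base copies across the pool
xe vdi-list name-label="base copy" params=uuid,sr-uuid
# check whether the coalesce/GC is actually making progress
grep -i coalesce /var/log/SMlog | tail -n 20
# an SR rescan normally re-triggers the garbage collector
xe sr-scan uuid=<sr-uuid>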
This seems to be a Python code error? Could this be a bug in the GC script?
We are having this exact same issue, and I have posted in the Discord server to no avail.
Mar 5 10:05:57 ops-xen2 SMGC: [25218] *~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*
Mar 5 10:05:57 ops-xen2 SMGC: [25218] ***********************
Mar 5 10:05:57 ops-xen2 SMGC: [25218] * E X C E P T I O N *
Mar 5 10:05:57 ops-xen2 SMGC: [25218] ***********************
Mar 5 10:05:57 ops-xen2 SMGC: [25218] gc: EXCEPTION <class 'XenAPI.Failure'>, ['XENAPI_PLUGIN_FAILURE', 'multi', 'CommandException', 'Input/output error']
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 2961, in gc
Mar 5 10:05:57 ops-xen2 SMGC: [25218] _gc(None, srUuid, dryRun)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 2846, in _gc
Mar 5 10:05:57 ops-xen2 SMGC: [25218] _gcLoop(sr, dryRun)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 2813, in _gcLoop
Mar 5 10:05:57 ops-xen2 SMGC: [25218] sr.garbageCollect(dryRun)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 1651, in garbageCollect
Mar 5 10:05:57 ops-xen2 SMGC: [25218] self.deleteVDIs(vdiList)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 1665, in deleteVDIs
Mar 5 10:05:57 ops-xen2 SMGC: [25218] self.deleteVDI(vdi)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 2426, in deleteVDI
Mar 5 10:05:57 ops-xen2 SMGC: [25218] self._checkSlaves(vdi)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 2650, in _checkSlaves
Mar 5 10:05:57 ops-xen2 SMGC: [25218] self.xapi.ensureInactive(hostRef, args)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/opt/xensource/sm/cleanup.py", line 332, in ensureInactive
Mar 5 10:05:57 ops-xen2 SMGC: [25218] hostRef, self.PLUGIN_ON_SLAVE, "multi", args)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/usr/lib/python2.7/site-packages/XenAPI.py", line 264, in __call__
Mar 5 10:05:57 ops-xen2 SMGC: [25218] return self.__send(self.__name, args)
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/usr/lib/python2.7/site-packages/XenAPI.py", line 160, in xenapi_request
Mar 5 10:05:57 ops-xen2 SMGC: [25218] result = _parse_result(getattr(self, methodname)(*full_params))
Mar 5 10:05:57 ops-xen2 SMGC: [25218] File "/usr/lib/python2.7/site-packages/XenAPI.py", line 238, in _parse_result
Mar 5 10:05:57 ops-xen2 SMGC: [25218] raise Failure(result['ErrorDescription'])
Mar 5 10:05:57 ops-xen2 SMGC: [25218]
Mar 5 10:05:57 ops-xen2 SMGC: [25218] *~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*~*
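Looking at that traceback again, it may not be a bug in cleanup.py itself: the GC is surfacing an "Input/output error" returned by the "multi" on-slave plugin call, i.e. one of the other hosts in the pool failed while the GC was making sure the VDI was inactive there. So the next place to look is probably the slave hosts rather than the master. A first pass might be the following (a sketch, assuming shell access to each pool member; the exact strings to grep for are guesses):

# on each slave, look around the same timestamp in its own SMlog
grep -A 5 -B 5 -i "input/output error" /var/log/SMlog
# and check the kernel log for underlying storage errors
dmesg | grep -i error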
@olivierlambert I see, it seems the GPG key for the Node.js apt repo was out of date; I will get this sorted and then post back this evening or tomorrow as to whether it corrected any issues.
This is a development/prod environment, so I will have to just pull the latest later today.
I do not feel that will be the issue, as we have several backups that take place, and the one on the 25th ran without issue and saved off 500GB.
node -v
v16.20.2
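In case it helps anyone following along, the usual way to refresh a stale NodeSource repo on Debian/Ubuntu is to re-run their setup script and reinstall the package (a sketch; it assumes /usr/bin/node comes from the NodeSource apt repo and that the 16.x script is the one in use here):

# re-import the NodeSource repo and key, then update Node.js (run as root)
curl -fsSL https://deb.nodesource.com/setup_16.x | bash -
apt-get install -y nodejs
node -v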
Good morning,
I am hoping some of the gurus here can help me figure this out: we had a backup that has been running for over a year without issues just start spitting out this error:
spawn /usr/bin/node EAGAIN
Sep 27 13:42:12 orchestra xo-server[4109]: 2023-09-27T13:42:12.134Z xo:api WARN {redacted} | backupNg.runJob(...) [125ms] =!> Error: spawn /usr/bin/node EAGAIN
I have checked syslog and got the string mentioned above, and it seems there is no verbosity with Orchestra.
If anyone would be willing to help me start diagnosing this issue, I would be most thankful, as I don't even know where to begin. It seems like Node.js is broken, but why would it just break?
Running the following build from source:
Xen Orchestra, commit c714b
xo-server 5.122.0
xo-web 5.124.1
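In case it gives a starting point: "spawn ... EAGAIN" from Node usually means the fork failed because a resource limit was hit (per-user process count or available memory) rather than the Node binary itself being broken, so a first pass at diagnosis might look like this (a sketch; the xo-server systemd unit name is an assumption about how a from-source install is run):

# per-user process limit for the current shell (the service may have its own limit)
ulimit -u
# how many processes/threads are already running system-wide
ps -eLf | wc -l
# memory pressure can also make fork/spawn fail with EAGAIN
free -m
# follow xo-server's own log output for more context than syslog gives
journalctl -u xo-server -f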