View Issue Details
ID | Project | Category | View Status | Date Submitted | Last Update |
---|---|---|---|---|---|
0000829 | bareos-core | storage daemon | public | 2017-06-30 11:07 | 2023-08-31 09:49 |
Reporter | PaoloM | Assigned To | bruno-at-bareos | ||
Priority | normal | Severity | trivial | Reproducibility | unable to reproduce |
Status | closed | Resolution | unable to reproduce | ||
Platform | Linux | OS | Debian | OS Version | 8 |
Product Version | 16.2.4 | ||||
Summary | 0000829: Consolidate job fails because of corrupted volume | ||||
Description | Since two days ago, my "Consolidate" job fails. Worse, the failing volume is one of the Full backups of my Always Incremental job. It worked for a couple of weeks, and then the problem appeared.

The first error was:

    29-Jun 12:20 bareos-sd JobId 3614: Error: block.c:333 Volume data error at 1:2666845125! Block checksum mismatch in block=657335 len=64512: calc=4814efdd blk=9589a529

With the bls utility I found exactly the same error:

    29-Jun 18:24 bls JobId 0: Error: block.c:333 Volume data error at 1:2666845125! Block checksum mismatch in block=657335 len=64512: calc=4814efdd blk=9589a529
    bls: block.c:96-0 Dump block with checksum error 101d0e0: size=64512 BlkNum=657335 Hdrcksum=9589a529 cksum=4814efdd

I tried the "Block Checksum = no" directive in the device configuration file and retried the consolidation; now I get this error:

    29-Jun 18:59 bareos-sd JobId 3618: Error: block.c:286 Volume data error at 1:2666909637! Wanted ID: "BB02", got "". Buffer discarded.

I think this is "correct", because bls on the corrupted volume shows, after the corrupted block:

    bls: block.c:109-0 Rec: VId=982 VT=1495786927 FI=366903 Strm=contGZIP len=16541 p=1024cb8
    bls: block.c:109-0 Rec: VId=982 VT=1495786927 FI=0 Strm=0 len=0 p=1028d61
    bls: block.c:109-0 Rec: VId=982 VT=1495786927 FI=0 Strm=0 len=0 p=1028d6d
    bls: block.c:109-0 Rec: VId=982 VT=1495786927 FI=0 Strm=0 len=0 p=1028d79
    ...

while a good volume ends with:

    bls: block.c:109-0 Rec: VId=143 VT=1498492591 FI=340630 Strm=contGZIP len=18710 p=998ca8
    bls: block.c:109-0 Rec: VId=143 VT=1498492591 FI=340630 Strm=GZIP len=34704 p=99d5ca
    bls: block.c:109-0 Rec: VId=143 VT=1498492591 FI=340630 Strm=GZIP len=37975 p=9a5d66
    30-Jun 10:18 bls JobId 0: End of file 2 on device "FileStorage" (/storage/qnap/bareos), Volume "AI-Consolidated-0343"
    30-Jun 10:18 bls JobId 0: Got EOM at file 2 on device "FileStorage" (/storage/qnap/bareos), Volume "AI-Consolidated-0343"

So I need a way to solve the problem, if possible without doing a new Full backup, because that would mean losing 3 months of backups! Possible solutions I can imagine:

- a command to write an EOF marker on the file volume, like btape's weof, which unfortunately works only for tapes
- a directive that permits ignoring the missing EOF on the volume
- a way to set the last block (EOF) in the catalog or its database (PostgreSQL)
- a working bcopy version, since the Bareos documentation states: "One of the objectives of this program is to be able to recover as much data as possible from a damaged tape. However, the current version does not yet have this feature"

Thanks in advance. | ||||
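For reference, the "BB02" strings in these errors are the magic ID in each block header on the volume. As a minimal sketch of how one might locate where the corruption begins on a *copy* of the file volume, assuming the Bacula-era on-disk layout (a 24-byte big-endian header of checksum, block size, block number, the 4-byte ID "BB02", volume session id and session time, with the size field covering the whole block including its header); the `scan_volume` helper is hypothetical, not a Bareos tool:

```python
import struct

# Assumed layout of a Bareos/Bacula "BB02" block header (big-endian):
# CheckSum, BlockSize, BlockNumber, ID ("BB02"), VolSessionId, VolSessionTime.
# BlockSize is assumed to cover the whole block, header included.
HDR = struct.Struct(">III4sII")   # 24 bytes
BLOCK_ID = b"BB02"

def scan_volume(path):
    """Follow the chain of block headers from the start of a file volume.

    Returns ((offset, block_number) of the last consistent header,
    offset at which the chain breaks) -- i.e. where corruption or
    truncation appears to begin.
    """
    with open(path, "rb") as f:
        data = f.read()
    off = 0
    last_good = None
    while off + HDR.size <= len(data):
        cksum, size, number, ident, ses_id, ses_time = HDR.unpack_from(data, off)
        if ident != BLOCK_ID or size < HDR.size:
            break                  # header no longer looks like a "BB02" block
        last_good = (off, number)
        off += size                # next header should start right here
    return last_good, off
```

Note this only validates the header chain, not the per-block checksum (Bareos computes that with its own CRC routine), and any truncation at the break offset is exactly the kind of destructive step that should only ever be tried on a copy of the volume.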
Tags | No tags attached. | ||||
After almost a month, the problem is still there. All Consolidate jobs fail because one of the volumes of the Full backup is corrupted. Is there a way to solve the problem? Is it possible to bypass that volume and consolidate with the other ones? Thanks in advance. |
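On the bypass question, which jobs actually depend on the corrupted volume can be read from the catalog, since Bareos records the job-to-volume mapping in the JobMedia table. A minimal sketch, assuming the standard catalog tables Job, JobMedia and Media; the helper name and the `placeholder` argument are mine (for the PostgreSQL catalog the placeholder is `%s`):

```python
def jobs_on_volume(conn, volume_name, placeholder="%s"):
    """List jobs whose data is stored (at least partly) on the given volume.

    Works with any DB-API connection to the Bareos catalog; Job, JobMedia
    and Media are standard catalog tables.
    """
    sql = (
        "SELECT DISTINCT Job.JobId, Job.Name, Job.Level "
        "FROM Job "
        "JOIN JobMedia ON JobMedia.JobId = Job.JobId "
        "JOIN Media ON Media.MediaId = JobMedia.MediaId "
        f"WHERE Media.VolumeName = {placeholder} "
        "ORDER BY Job.JobId"
    )
    cur = conn.cursor()
    cur.execute(sql, (volume_name,))
    return cur.fetchall()
```

With that list one can judge which jobs would have to be re-run if the volume were abandoned; the interactive equivalent is browsing `list volumes` and `list jobs` in bconsole.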
|
Debunking oldies during our summer cleanup of Mantis, we want to know whether a) you still have this issue, b) you still need a solution (this would require paid support/development), or c) we can just close the issue due to its age. |
|
Closing with no feedback; cannot be reproduced in a recent version. |
|
Date Modified | Username | Field | Change |
---|---|---|---|
2017-06-30 11:07 | PaoloM | New Issue | |
2017-07-26 14:08 | PaoloM | Note Added: 0002691 | |
2023-07-27 15:46 | bruno-at-bareos | Assigned To | => bruno-at-bareos |
2023-07-27 15:46 | bruno-at-bareos | Status | new => feedback |
2023-07-27 15:46 | bruno-at-bareos | Note Added: 0005267 | |
2023-08-31 09:49 | bruno-at-bareos | Status | feedback => closed |
2023-08-31 09:49 | bruno-at-bareos | Resolution | open => unable to reproduce |
2023-08-31 09:49 | bruno-at-bareos | Note Added: 0005357 |