Yesterday I was just about to lose my mind… was just able to switch off and seem to have recovered overnight, thank you…
I was doing a bit of devving on CapitalBe. I have a file “MyAccountData” with accounts, transactions and so on, and I use a fresh copy of it for testing. Worked a treat in the past.
Then, yesterday, it started to go terribly wrong. Upon launch, CapitalBe reads that test file. But now, instead of opening the data in the app’s window, the file size just keeps increasing!
I kill the CapitalBe app, and after a short while an additional file appears beside the test file named “MyAccountData-journal”. What the…?!
First I tried to figure out which of my changes might have broken the data loading. Didn’t find anything obvious. Built various versions from past commits. Same weirdness. Installed CapitalBe from HaikuDepot, a version I’ve thoroughly tested before with that data file. Same thing!
Today I booted into another test Haiku beta5 installation, and wouldn’t you know it… Everything’s working fine!
So…
What’s going on? Has my BFS journal been corrupted, as that mystery second “journal” file could suggest? I see the same weirdness when loading a file from another partition, BTW.
However, now the equally mysterious question becomes: why does the same binary with the same “MyAccountData” work without problems on my other Haiku installation?
I see the same weirdness when loading a file from another partition
I would first check that it really is the same binary. And I mean checking that what you think is running is what is really being run, and then comparing the files.
When you say "the same “MyAccountData” ", does that include the -journal file?
If you are running the depot version in both cases, opening the same file (with its extra friends or the same lack of them), I’m as stumped as you are. Maybe differences in settings? Venturing into the very improbable: a different libsqlite?
It must be the same binary and data file. I always remove the “-journal” file together with the size-increased data file. So I always start fresh.
In both Haiku installations I have the “Source” partition mounted, where I keep the checked-out git code, incl. the test data file. I copy the “MyAccountData” file into the previously emptied ~/config/settings/CapitalBe/ folder and double-click the binary sitting in the source folder.
The sqlite package is identical.
The troubling install is a beta5, the working one is a recent nightly.
I guess I’ll have to bulldoze that beta5 install. I do wonder what’s going on though…
With that recipe, maybe we all want to know what’s going on now, if only to discard a problem in beta5. But I’d be shooting in the dark, maybe using Debugger or strace in case seeing what’s happening when the broken one gets stuck gives any clue.
Good news: I’ve solved the immediate problem of the endless loop increasing the file size. I still have no clue why it fails on this Haiku install and not the other, same binary, same test data file…
I had a look at the size-increased test data file with SQLiteBrowser. It turns out that the file was filling up with “Scheduled transactions”, i.e. CapitalBe checks if a recurring transaction (weekly, monthly etc.) is due and inserts it. At some point the “due date” in the table became “-1”.
In Debugger, I stopped that process and saw that it got stuck in ScheduledExecutor.cpp in the loop:
while (sdata.GetNextDueDate() < current) {
	updated = true;
	sdata.SetDate(sdata.GetNextDueDate());
	gDatabase.AddTransaction(sdata);
	// If this is a transfer between accounts, duplicate the transaction object
	// and modify account ID, payee and amount
	if (destAccount >= 0) {
		ScheduledTransData ddata = sdata;
		ddata.SetAccount(gDatabase.AccountByID(destAccount));
		BString payee = B_TRANSLATE_CONTEXT("Transfer from '%%PAYEE%%'", "MainWindow");
		payee.ReplaceFirst("%%PAYEE%%", sdata.GetAccount()->Name());
		ddata.SetPayee(payee);
		ddata.SetAmount(sdata.Amount().InvertAsCopy());
		gDatabase.AddTransaction(ddata);
	}
	sdata.CalculateNextDueDate();
}
My solution is to check if sdata.GetNextDueDate() is < 0 and break out of the loop if it is.
I don’t know though how the next due date becomes -1, nor why that’s not happening on the other Haiku install with the same binary and data file.
I’ll mark the topic as “solved” anyway…
(Hmm, I just saw that some of the “Scheduled transactions” have a due date of 01.01.1970. Probably because they somehow became “-1”. I bet there’s a bug in “CalculateNextDueDate()”, whose result shouldn’t be -1. Need to investigate this further…)
Thanks for playing, @madmax . If anyone has theories about what may be going on here, let’s hear it.
The Epoch Converter says:
1725314400 = Monday, September 2, 2024 10:00:00 PM and
1727906400 = Wednesday, October 2, 2024 10:00:00 PM
So, after adding a month and arriving at a date in the future, mktime(&timestruct); returns -1.
I booted into the “other” Haiku installation and lo, we don’t get -1!
Was there something related changed between beta5 (hrev57937+121) and that nightly (haiku-r1~beta5_hrev58035-1-x86_64 from 30.08.2024)?
(ETA: to make it clear: the error is with beta5, the nightly hrev58035 is OK)
That reminds me of playing around with the Date and Time preferences recently, and when I selected a day in the calendar that was too far in the future (two weeks worked, something near the end of the current month didn’t), my system date was reset back to 01.01.1970.
I first thought it was some weirdness of my development system, but seeing that you have exactly the same issue, it looks like it’s a bug in Haiku.
I tested that on a recent Nightly install.
Will look later today if anything related to Date and Time changed recently and whether I can fix it.
It’s interesting that for you Beta5 is affected; in my case it’s a Nightly install, so it must be a change that has been backported.
I tried a few previous boot states, and the oldest where the error happens is hrev58067.
The next one I have is hrev58006, which works fine, but if your hrev58035 also works right, the bug must have been introduced somewhere between hrev58035 and hrev58067.
Now it’s time to look at each commit in this range and see what could have caused this issue.