When you add a lot of data to a BMessage, it seems that it can crash with a segfault. I guess it's due to a maximum size, but I was expecting the BMessage::AddXxxx() methods to return an error in that case.
I ran into this kind of issue in TextSearch when searching for a word that occurs a very large number of times in a text file with lines of average length. I'm looking for a way to avoid this crash, by sending the message as-is and restarting from a clean one to continue.
The problem is that I can't find a way to detect this saturation.
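For reference, the pattern I have in mind is roughly this: check the return code of every Add call, and flush the message to the receiver once its flattened size passes some budget, then continue with a clean message. The sketch below uses a stand-in struct so it compiles outside Haiku; on Haiku the real BMessage::AddString() returns a status_t, FlattenedSize() reports the serialized size, and the flush would be a BMessenger::SendMessage() followed by MakeEmpty(). The budget value is an arbitrary placeholder, not a documented limit.

```cpp
#include <cassert>
#include <cstring>
#include <string>
#include <vector>

// Stand-in types so the sketch is self-contained; on Haiku these come
// from the Application Kit (BMessage, status_t, B_OK).
typedef int status_t;
const status_t B_OK = 0;

struct FakeMessage {
	std::vector<std::string> fields;
	size_t flattenedSize = 0;

	status_t AddString(const char* name, const char* value) {
		// The real BMessage::AddString() can fail; always check the result.
		fields.push_back(value);
		flattenedSize += strlen(name) + strlen(value) + 16; // rough per-field overhead
		return B_OK;
	}
	size_t FlattenedSize() const { return flattenedSize; }
	void MakeEmpty() { fields.clear(); flattenedSize = 0; }
};

// Append one match; if the message has grown past `budget` bytes,
// flush it (here just counted) and restart from a clean message.
int AddMatchChunked(FakeMessage& msg, const char* line, size_t budget,
	int& flushes)
{
	if (msg.AddString("line", line) != B_OK)
		return -1; // saturation detected via the return code
	if (msg.FlattenedSize() > budget) {
		// On Haiku this would be: messenger.SendMessage(&msg);
		++flushes;
		msg.MakeEmpty(); // continue with a fresh message
	}
	return 0;
}
```

This way the receiver gets several smaller messages instead of one huge one, and no single BMessage ever grows unboundedly.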
That’s more likely a bug than a limit set on purpose.
I'm not an expert, but from quickly reading through the code it seems that the BMessage should automatically resize to fit the space needed.
It starts with preallocating these sizes:
I tried to reproduce your issue with TextSearch, hoping that I might find the reason for the crash, but I can't reproduce it here.
What I tried:
Downloading the exact NSS source archive from your ticket and searching it for “config” - works perfectly fine, finds many results and finishes after about 2-3 seconds.
Cloning the libreoffice/core repository and searching it for “mysql” - takes a bit longer, finds many results again and works perfectly fine.
Checking out the exact commit a1dd8098e6e2a7d5ba4b9c1a2d094db11d3d6b27 of libreoffice/core as mentioned in the ticket, then repeating the search - behaves exactly the same as with the latest commit, working fine.
My own test idea: searching the Haiku source tree for “B”. There must be some 6-digit number of occurrences, maybe even millions. In that case, the searching threads finished after maybe 1-2 minutes without crashing, but the GUI thread was still processing all the results, at nearly 350 MB RAM usage for TextSearch and still growing. So yeah, not usable for that amount of results, but no crash either.
I tested that on the latest Haiku nightly hrev59580 x86_64.
The ticket is a few years old; can you still reproduce the issue today with a recent nightly?
Any other examples where I could try reproducing the bug?
I used to see the issue last year, but indeed I haven't retried to reproduce it. While working on how to handle too many line matches in the same file, to avoid the issue you see (a very long time to consume and display the result messages), I was thinking about having a max lines limit (and therefore a max number of fields added to the same message), with the last field reporting something like “… and N more matching lines”.
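The capping idea could look something like this; the helper name and the summary wording are just illustrative, and in TextSearch the capped list would then be added field by field to the result BMessage:

```cpp
#include <cassert>
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical helper: keep at most `maxLines` matches per file; if there
// are more, replace the tail with a single summary entry so the result
// message stays bounded in size.
std::vector<std::string> CapMatches(const std::vector<std::string>& matches,
	size_t maxLines)
{
	if (matches.size() <= maxLines)
		return matches;

	std::vector<std::string> capped(matches.begin(),
		matches.begin() + maxLines);

	char summary[64];
	snprintf(summary, sizeof(summary), "... and %zu more matching lines",
		matches.size() - maxLines);
	capped.push_back(summary);
	return capped;
}
```

That would bound both the message size and the time the GUI thread spends displaying results for pathological searches.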