I was writing some alternative features for a tar.bz2 compressor/decompressor and ran into a problem handling large data in memory.
For example: I tar my home folder into RAM, then compress the resulting tar object with the bz2 module. But at tar archive creation time, using this object:
tar_data = io.BytesIO()
tar = tarfile.open(fileobj=tar_data, mode='w')
I get a MemoryError once the archive exceeds roughly 2.5 GB of RAM:
tar_data=create_tar_in_ram_with_attributes(input_paths)
File "/boot/home/Projects/HTPBZ2/./HTMZ.py", line 1165, in create_tar_in_ram_with_attributes
tar.add(input_path, arcname=relative_path)
File "/boot/system/lib/python3.10/tarfile.py", line 2186, in add
self.add(os.path.join(name, f), os.path.join(arcname, f),
File "/boot/system/lib/python3.10/tarfile.py", line 2186, in add
self.add(os.path.join(name, f), os.path.join(arcname, f),
File "/boot/system/lib/python3.10/tarfile.py", line 2186, in add
self.add(os.path.join(name, f), os.path.join(arcname, f),
[Previous line repeated 2 more times]
File "/boot/system/lib/python3.10/tarfile.py", line 2180, in add
self.addfile(tarinfo, f)
File "/boot/system/lib/python3.10/tarfile.py", line 2208, in addfile
copyfileobj(fileobj, self.fileobj, tarinfo.size, bufsize=bufsize)
File "/boot/system/lib/python3.10/tarfile.py", line 255, in copyfileobj
dst.write(buf)
MemoryError
With smaller archives I get no errors, so it looks like there is a limit on how large an in-memory object can grow.
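For reference, here is a stripped-down sketch of what create_tar_in_ram_with_attributes does. This is simplified from my actual code (the real function also handles file attributes); it just shows the in-memory tar step that eventually raises MemoryError on large inputs:

```python
import bz2
import io
import os
import tarfile

def create_tar_in_ram(input_paths):
    """Simplified sketch: build the whole tar archive in a BytesIO buffer.

    This keeps the entire archive in RAM, which is what fails once the
    buffer grows past roughly 2.5 GB.
    """
    tar_data = io.BytesIO()
    with tarfile.open(fileobj=tar_data, mode='w') as tar:
        for input_path in input_paths:
            relative_path = os.path.basename(input_path.rstrip('/'))
            tar.add(input_path, arcname=relative_path)
    tar_data.seek(0)
    return tar_data

# Afterwards the whole buffer is compressed in one shot:
# compressed = bz2.compress(tar_data.getvalue())
```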
I cannot detect or set the free available "app RAM" from Python with the resource
module, because it is not supported on Haiku.
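This is the kind of check I would like to do; on most Unix-like systems it reads the address-space limit, but on Haiku the resource module simply is not usable, so the sketch below falls back to "unknown":

```python
def memory_limit():
    """Try to read the soft address-space limit via the resource module.

    Returns the RLIMIT_AS soft limit on platforms that support it, or
    None where the resource module is missing or unusable (as on Haiku).
    """
    try:
        import resource
        soft, _hard = resource.getrlimit(resource.RLIMIT_AS)
        return soft
    except (ImportError, AttributeError, ValueError, OSError):
        return None
```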
Any help is appreciated