Lisp HUG Maillist Archive

memory problems in FLI

Hello all,
      I'm trying to use FLI to solve a major garbage collection
performance problem in LispWorks when traversing and updating a very
large 1.4 GB radix tree composed of structures and adjustable arrays.

     Over a period of three months I've written an equivalent C-coded
radix tree, which unfortunately has pointer problems somewhere in its
allocation of over seven million pointers, so I'm rewriting it to
allocate a single 800 MB block up front and squeeze the entire 1.4 GB
radix tree into that block, with bounds checking on all pointers.

     Before I do all this, I'd like to know whether anyone has seen
problems creating and accessing large blocks of FLI memory that are
malloc'ed and assigned to C pointers in C global memory. For instance,
do the blocks become invalid or corrupt over time?

Lawrence Au
Uphrase LLC


RE: memory problems in FLI

>Hello all,
>      I'm trying to use FLI to solve a major garbage collection performance
>problem in LispWorks when traversing and updating a very large
>1.4 GB radix tree composed of structures and adjustable arrays.
>
Hello,

This reply will be a bit late for you, but it may help other people with
similar problems.
I also had large garbage collection delays with such structures. It
seems that the garbage collector needs to scan the large arrays, as I
suspect they get marked as dirty as soon as you modify them.

What I did was split all the collections of structures into several
one-dimensional specialized arrays, so that there are no pointers in the
arrays; i.e., I use integers to index into the arrays.

The garbage collection problems then seem to go away, as the garbage
collector no longer scans the arrays.

Rene.



Re: memory problems in FLI

Rene,
      Thank you for your advice!  I went over my code again to see
whether specialized arrays could work. Unfortunately, I have about
12 million adjustable arrays pointing to string-integer dotted pairs,
and the length of each of the 12 million arrays varies with how often
particular terms are used in the dictionary. It's the price of
cross-indexing the meaning of every term. When phrases are added to the
dictionary, these cross-indexes grow, triggering garbage collection.

     Possibly I could use specialized arrays to buffer that growth,
avoiding garbage collection until an individual specialized array is
outgrown and must be replaced. I'll keep thinking about it. The strings
in the dotted pairs might be tricky to store, but it could still be
easier than writing C code.

-- Lawrence Au



On Jun 15, 2005, at 6:52 AM, Rene de Visser wrote:

>> Hello all,
>>      I'm trying to use FLI to solve a major garbage collection
>> performance problem in LispWorks when traversing and updating a very
>> large 1.4 GB radix tree composed of structures and adjustable arrays.
>>
> Hello,
>
> This reply will be a bit late for you, but it may help other people
> with similar problems.
> I also had large garbage collection delays with such structures. It
> seems that the garbage collector needs to scan the large arrays, as I
> suspect they get marked as dirty as soon as you modify them.
>
> What I did was split all the collections of structures into several
> one-dimensional specialized arrays, so that there are no pointers in
> the arrays; i.e., I use integers to index into the arrays.
>
> The garbage collection problems then seem to go away, as the garbage
> collector no longer scans the arrays.
>
> Rene.
>


Updated at: 2020-12-10 08:52 UTC