ORIGINAL QUESTIONS:
> Architecture
>         Solaris 2.5.1, recommended patches.
>         Machine Ultra 1
> QUESTION 1: 
> ***********
> In a C or C++ program, when a process allocates memory with
> malloc or new and then frees it with free or delete, the de-allocated
> memory becomes available to that process for later use, but not to
> the system or to other processes.
> Is there any command or workaround to make the program really free the
> memory, even if that means reorganizing the data it has in the heap?
On this one, I got different responses.
Answer 1:
        Pages of memory are allocated to a process as required.  Pages 
        that are not being used will be swapped out, so in effect memory 
        will be released to other processes.
Answer 2:
        First <Insert Standard Disclaimer> ... Yes, but if the memory 
        isn't used, it gets swapped out of RAM so that RAM is made 
        available for other processes.
That's not the way it works. I tried it.
Answer 3:
        This is a result of the way that Unix systems allocate memory, 
        and as far as I know is not adjustable. 
Answer 4:
        The only thing I can think of is to fork a child process in your
        program, do the mallocs from there, and just exit the child when 
        you're done.  At that point, the memory should be returned to 
        the system.
I found out that this last option is the right answer. There is not
much to do except this little trick with child processes. The process
keeps the memory and simply reuses it within the same process on a new
malloc, but as far as the OS is concerned it is still in use.
********************************
> QUESTION 2:
> ***********
> In the following source code:
>         if ( (punt = new char[800000000])==NULL) {
>                 printf ("Not enough memory\n");
>                 return;
>         }
> 
> when I ask for more memory than is available, instead of "new"
> returning NULL and reaching the error message, the program aborts
> with the following error:
>          Run-time exception error; current exception: xalloc
>                 No handler for exception.
>          Abort (core dumped)
Answer 1:
        New on very large blocks ( > 1 meg or so ) does tend to fail.  
        Use malloc for these.
        The latest C++ standard and compilers throw an exception when 
        new fails.
That was the answer. The compiler was throwing an exception that I was
not catching.
Many thanks to those who answered:
        Ian TCollins <itc1@scigen.co.uk>
        Harvey Wamboldt <harvey@iotek.ns.ca>
        foster@bial1.ucsd.edu
        Karl E. Vogel <vogelke@c17mis.region2.wpafb.af.mil>
=============================
 Mariel Feder - I.T. Consultant
 unix.support@meralco.com.ph
 Phone: (63) (2) 632.8862 / 632.8977
 Fax:   (63) (2) 632.8868
 Meralco Electric Company
 Distributed Information Technology Team
 Manila - Philippines
=============================
This archive was generated by hypermail 2.1.2 : Fri Sep 28 2001 - 23:12:03 CDT