
According to the SPARK 2014 documentation, one is not allowed to handle exceptions in SPARK code.

With verification, most run-time errors can be proven not to occur inside a program, but exceptions like Storage_Error cannot be excluded.

Since a Storage_Error could occur on every procedure/function call or when dynamically allocating memory using new (please correct me if I am wrong about that), catching and handling this exception in a code segment with SPARK_Mode => Off would only be valid at the highest level of a program (its entry point). I really don't like this approach, since you lose nearly all possibilities to react to such an exception.
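To illustrate, this is roughly what such a top-level handler would look like (Real_Main is just a placeholder name for the verified entry point):

with Real_Main;  --  Placeholder for the SPARK-analyzed entry point.

procedure Wrapper
  with SPARK_Mode => Off  --  Exception handlers are only allowed outside SPARK.
is
begin
   Real_Main;
exception
   when Storage_Error =>
      null;  --  At this point there is almost no context left to react to.
end Wrapper;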

What I would like to do:

Assume an unbounded tree is built with a procedure Add(). Inside this procedure I would like to check whether there is enough space on the heap to create a new node in the tree. If there is, memory is allocated for the node and it is added to the tree; otherwise an out parameter could signal the failure with some kind of flag, as sketched below.
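Sketched as a hypothetical spec (Unbounded_Trees, Add, and Success are illustrative names only, not an existing API):

package Unbounded_Trees with SPARK_Mode is

   type Node;
   type Node_Access is access Node;

   type Node is record
      Value       : Integer;
      Left, Right : Node_Access;
   end record;

   type Tree is record
      Root : Node_Access;
   end record;

   procedure Add
     (T       : in out Tree;
      Value   : in     Integer;
      Success :    out Boolean);
   --  Success = False signals that the heap had no room for a
   --  new node; no Storage_Error escapes to the caller.

end Unbounded_Trees;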

I have searched through the SPARK User's Guide, but was not able to find out how this should be handled, only that the programmer has to make sure that there is enough memory available, not how one could actually do that.

How could one handle this kind of exception in SPARK?

mhatzl

2 Answers


SPARK indeed cannot prove (guarantee) the absence of storage errors, as these errors are raised from outside the scope of the program. This is true both for a failing heap allocation and for exhausted stack space.

One may, however, cheat a little by excluding the allocation routine from SPARK analysis, as shown in the example below. The allocation subprogram New_Integer has a postcondition that SPARK can use to analyze the pointer, but the subprogram's body is excluded from analysis. This allows a Storage_Error to be handled. Of course, care must now be taken that the body does indeed conform to its specification: the Ptr field must not be null when Valid is True. SPARK now just assumes this is true, but will not verify it.

Note: All pointer dereferences and the absence of memory leaks can be proved using GNAT CE 2021. However, it would've been nice to actually set the Valid discriminant to False during Free and use a postcondition like Post => P.Valid = False. This, unfortunately, made SPARK complain about possible discriminant check failures.

Update (3-jun-2021)

I updated the example based on the hints of @YannickMoy (see below). Free now ensures that the weak pointer's Valid discriminant is set to False on return.

main.adb

with Test;

procedure Main with SPARK_Mode is

   X : Test.Weak_Int_Ptr := Test.New_Integer (42);

   Y : Integer;

begin

   --  Dereference.
   if X.Valid then
      Y := X.Ptr.all;
   end if;

   --  Free.
   Test.Free (X);

end Main;

test.ads

package Test with SPARK_Mode is

   type Int_Ptr is access Integer;
   
   --  A weak pointer that may or may not be valid, depending
   --  on whether the allocation succeeded.
   
   type Weak_Int_Ptr (Valid : Boolean := False) is record
      case Valid is
         when False => null;
         when True  => Ptr : Int_Ptr;
      end case;
   end record;

   function New_Integer (N : Integer) return Weak_Int_Ptr
     with Post => (if New_Integer'Result.Valid then New_Integer'Result.Ptr /= null);
   --  Allocates a new integer.
   
   procedure Free (P : in out Weak_Int_Ptr)
     with 
       Pre  => not P'Constrained, 
       Post => P.Valid = False;
   --  Deallocates an integer if needed.

end Test;

test.adb

with Ada.Unchecked_Deallocation;

package body Test with SPARK_Mode is
   
   -----------------
   -- New_Integer --
   -----------------
   
   function New_Integer (N : Integer) return Weak_Int_Ptr is
      pragma SPARK_Mode (Off);      --  Exclude the body from analysis.
   begin
      return Weak_Int_Ptr'(Valid => True, Ptr => new Integer'(N));
      
   exception
      when Storage_Error =>
         return Weak_Int_Ptr'(Valid => False);
      
   end New_Integer;
   
   ----------
   -- Free --
   ----------
   
   procedure Free (P : in out Weak_Int_Ptr) is
   
      procedure Local_Free is new Ada.Unchecked_Deallocation
        (Object => Integer, Name => Int_Ptr);
   
   begin
      if P.Valid then
         Local_Free (P.Ptr);
         P := Weak_Int_Ptr'(Valid => False);
      end if;
   end Free;

end Test;

output (gnatprove)

$ gnatprove -Pdefault.gpr -j0 --level=0 --report=all
Phase 1 of 2: generation of Global contracts ...
Phase 2 of 2: flow analysis and proof ...
main.adb:5:04: info: absence of memory leak at end of scope proved
main.adb:13:13: info: discriminant check proved
main.adb:13:18: info: pointer dereference check proved
main.adb:17:08: info: precondition proved
test.adb:31:23: info: discriminant check proved
test.adb:32:12: info: discriminant check proved
test.adb:32:12: info: absence of memory leak proved
test.ads:22:16: info: postcondition proved
Summary logged in [...]/gnatprove.out

Summary (added by OP)

The provided code helps to prevent a Storage_Error for dynamic allocations using the new keyword. As the absence of infinite recursion can already be proved by SPARK (as mentioned in the comments below), the only remaining cause of a Storage_Error is a program that needs more stack during normal execution than is available. This, however, can be monitored, and worst-case stack usage can also be determined statically by tools like GNATstack (also mentioned in the comments below). A minimal sketch of SPARK's recursion check follows.
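For reference, a minimal sketch (with made-up names, not taken from the original post) of the Subprogram_Variant aspect referenced in the comments; GNATprove checks that the variant strictly decreases on each recursive call, which rules out infinite recursion, though not stack exhaustion:

procedure Count_Down (N : Natural)
  with Subprogram_Variant => (Decreases => N);

procedure Count_Down (N : Natural) is
begin
   if N > 0 then
      Count_Down (N - 1);  --  N strictly decreases on each call.
   end if;
end Count_Down;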

DeeDee
  • Thanks for your detailed answer and code sample. Having the Valid flag remain True after free is not that nice, but for a short sample it is perfectly fine. It is still sad that you must turn off SPARK_Mode, and this solution only prevents `Storage_Error` for dynamic allocation. Exceeding the stack size by too many recursions could still happen, right? – mhatzl Jun 02 '21 at 20:04
  • @mhatzl Yes, this can still happen. Although SPARK can guarantee that the call stack (i.e., the number of stack frames) will always remain bounded (for example, no infinite recursion; see also [here](https://docs.adacore.com/live/wave/spark2014/html/spark2014_ug/en/source/subprogram_contracts.html#subprogram-variant)), it may still exceed the storage limit. To analyze stack usage and related risks, other (commercial) analysis tools, like [GNATstack](https://www.adacore.com/gnatpro/toolsuite/gnatstack), are available. – DeeDee Jun 02 '21 at 21:48
  • Nice example. You can in fact add the postcondition you wanted to Free, provided you specify that parameter P is not constrained with precondition `not P'Constrained`, and you give a default discriminant to type `Weak_Int_Ptr` so that you can declare unconstrained values of the type (whose discriminant can be changed) with `type Weak_Int_Ptr (Valid : Boolean := False) is record...` Then GNATprove proves the code. – Yannick Moy Jun 03 '21 at 07:54

I think you could create your own Storage Pool (ARM 13.11 ff) to support an additional operation "would new be OK?". Making it safe against concurrency would be more complex.

I suppose you could make its Allocate swallow the exception and return null. Whatever, I think you’d have to program such a thing "outside SPARK"!
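A spec-only outline of such a pool might look as follows (Checked_Pool and Can_Allocate are made-up names, the body is omitted, and, as noted in the comments below, user-defined storage pools are not supported by SPARK, so this would have to live in code with SPARK_Mode => Off):

with System;
with System.Storage_Elements; use System.Storage_Elements;
with System.Storage_Pools;    use System.Storage_Pools;

package Checked_Pools is

   type Checked_Pool is new Root_Storage_Pool with private;

   function Can_Allocate
     (Pool : Checked_Pool; Size : Storage_Count) return Boolean;
   --  The additional "would new be OK?" query.

   overriding procedure Allocate
     (Pool                     : in out Checked_Pool;
      Storage_Address          :    out System.Address;
      Size_In_Storage_Elements :        Storage_Count;
      Alignment                :        Storage_Count);

   overriding procedure Deallocate
     (Pool                     : in out Checked_Pool;
      Storage_Address          :        System.Address;
      Size_In_Storage_Elements :        Storage_Count;
      Alignment                :        Storage_Count);

   overriding function Storage_Size
     (Pool : Checked_Pool) return Storage_Count;

private

   type Checked_Pool is new Root_Storage_Pool with record
      Used : Storage_Count := 0;  --  Storage handed out so far.
   end record;

end Checked_Pools;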

Simon Wright
  • That would indeed be nice, but user-defined storage pools are not (yet) available in SPARK (see also [SPARK RM 3.10](https://docs.adacore.com/live/wave/spark2014/html/spark2014_rm/declarations-and-types.html#access-types), static semantics, 4th paragraph). However, you can, of course, always allocate some array statically and use it as a "storage pool" without using Ada's storage pool facilities. – DeeDee Jun 02 '21 at 17:44
  • Thanks for your answer. As @DeeDee mentioned, using an array as a "storage pool" might also be an option (a minimal sketch follows below), but in that case you would be stuck with the size of the array, which is not as unbounded as I would want my implementation to be. – mhatzl Jun 02 '21 at 20:11
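For completeness, a sketch of the static-array idea from the last two comments (Node_Pool and its contents are illustrative names; the body of Allocate, e.g. a simple bump counter or free list over an array held in the package body, is omitted):

package Node_Pool with SPARK_Mode is

   Capacity : constant := 1_000;  --  Fixed bound, unlike the heap.

   type Node_Index is range 0 .. Capacity;
   No_Node : constant Node_Index := 0;  --  Plays the role of null.

   type Node is record
      Value       : Integer    := 0;
      Left, Right : Node_Index := No_Node;
   end record;

   procedure Allocate (Index : out Node_Index);
   --  Index = No_Node when the pool is exhausted: the failure
   --  "flag" the question asks for, without any Storage_Error.

end Node_Pool;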