
Exporting the data of a table before deleting it works like a charm.
Example:

OUTPUT TO VALUE("C:\mytable.d") APPEND.
EXPORT mybd.mytable.
OUTPUT CLOSE. 
DELETE mybd.mytable.

However, I've yet to make it work when using a buffer handle instead. The following exports an integer instead of the data deleted.

DEF INPUT PARAM hlTableToDelete AS HANDLE NO-UNDO.
...
OUTPUT TO VALUE("C:\" + hlTableToDelete:NAME + ".d") APPEND.
EXPORT hlTableToDelete:HANDLE.
OUTPUT CLOSE. 
hlTableToDelete:BUFFER-DELETE().

Which syntax is needed for command export to work and actually export the data of the buffer handle?

AXMIM
    Question should be how to use `export` with a `buffer-handle` - your first example is exporting a /static/ buffer. Every (temp-)table has a static buffer with the same name as the table by default. – Stefan Drissen Aug 16 '19 at 05:17

3 Answers


EXPORT only works with static buffers. There is no EXPORT method on buffer handles.

To get equivalent functionality you will need to write some code that loops through the field list.

Something along these lines should get you started:

/* export data like EXPORT does
 *
 * assumes that stream expFile and the delimiter field_sep
 * are defined by the caller
 *
 * makes no attempt to handle LOB data types
 *
 */

function exportData returns logical ( input bh as handle ):

  define variable bf as handle    no-undo.                                      /* handle to the field                          */
  define variable f  as integer   no-undo.                                      /* field number                                 */
  define variable i  as integer   no-undo.                                      /* array index                                  */

  do f = 1 to bh:num-fields:                                                    /* for each field...                            */

    bf = bh:buffer-field( f ).                                                  /* get a pointer to the field                   */

    if f > 1 then put stream expFile unformatted field_sep.                     /* output field separator                       */

    if bf:extent = 0 then                                                       /* is it an array?                              */
      put stream expFile unformatted
        ( if bf:data-type = "character" then                                    /* character data needs to be quoted to */
          quoter( string( bf:buffer-value ))                                    /* handle (potential) embedded delimiters       */
         else                                                                   /* and quotes within quotes!                    */
          string( bf:buffer-value )                                             /* other data types should not be quoted        */
        )
      .
     else                                                                       /* array fields need special handling           */
      do i = 1 to bf:extent:                                                    /* each extent is exported individually         */
        if i > 1 then put stream expFile unformatted field_sep.                 /* and a field separator                        */
        put stream expFile unformatted
          ( if bf:data-type = "character" then                                  /* see above...                                 */
            quoter( string( bf:buffer-value( i )))
           else  
            string( bf:buffer-value( i ))
          )
        .
      end.

  end.

  put stream expFile skip.                                                      /* don't forget the newline! ;-)                */

  return true.

end.
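The function above assumes that stream expFile and the delimiter field_sep already exist in the calling scope. A minimal calling sketch using a dynamic query, so it works with any buffer handle — the table and file names here are placeholders, not part of the answer:

OUTPUT-EXAMPLE (untested sketch):

define stream expFile.
define variable field_sep as character no-undo initial ",".
define variable hb as handle no-undo.
define variable hq as handle no-undo.

create buffer hb for table "mytable".            /* placeholder table name */
create query hq.
hq:set-buffers( hb ).
hq:query-prepare( "for each mytable" ).
hq:query-open().

output stream expFile to value( "C:\mytable.d" ) append.

do while hq:get-next():
  exportData( hb ).                              /* one line per record    */
end.

output stream expFile close.

hq:query-close().
delete object hq.
delete object hb.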
Tom Bascom

A shorter version of Tom's routine that handles extent fields and normal fields with the same code path.

function exportBuffer returns logical ( input bh as handle ):

  define variable bf as handle    no-undo.
  define variable f  as integer   no-undo.
  define variable i  as integer   no-undo.

  do f = 1 to bh:num-fields:

    bf = bh:buffer-field( f ).

    do i = if bf:extent = 0 then 0 else 1 to bf:extent:        
      put stream expFile unformatted
        ( 
          if bf:data-type = "character" then
            quoter( string( bf:buffer-value( i ) ) )
          else  
            string( bf:buffer-value( i ) )
        )
        (
          if f = bh:num-fields and i = bf:extent then 
            ""
          else
            field_sep
        )
      .
    end.

  end.

  put stream expFile skip.

  return true.

end.

https://abldojo.services.progress.com:443/#/?shareId=5d5643554b1a0f40c34b8bed

I would only use the proprietary EXPORT statement format if really, really, really, really needed; in any other case I would use the built-in serialize methods (WRITE-XML / WRITE-JSON). Apart from escaping special characters correctly, they also export all data in formats that the whole world understands.
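For example, WRITE-JSON is available directly on a buffer handle and serializes the record the buffer currently holds — a minimal sketch, with the table and file names as placeholders:

OUTPUT-EXAMPLE (untested sketch):

define variable hb as handle no-undo.

create buffer hb for table "mytable".            /* placeholder table name */
hb:find-first().                                 /* position on a record   */
hb:write-json( "file", "C:\mytable.json", true )./* true = formatted       */
delete object hb.

To serialize more than one record at a time, copy the records into a temp-table (or attach the buffer to a ProDataSet) and call WRITE-JSON on that handle instead.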

Stefan Drissen
    EXPORT format with a comma delimiter is a CSV file. Lots of stuff still uses that. Having said that - I agree that JSON or XML is better. – Tom Bascom Aug 16 '19 at 13:48

This was helpful! In my tests I found that this code does not round-trip through the IMPORT statement for DATETIME fields; this is the code I used to format them the way IMPORT expects:

DEFINE VARIABLE OutStr AS CHARACTER NO-UNDO.

CASE FieldHandle:DATA-TYPE:
    WHEN "datetime" THEN
    DO:
        OutStr = STRING(FieldHandle:BUFFER-VALUE,"99-99-9999THH:MM:SS.SSS").
        IF OutStr <> ? THEN
            OutStr = SUBSTRING(OutStr,7,4) + "-" + SUBSTRING(OutStr,1,5) + SUBSTRING(OutStr,11).
        PUT STREAM snapshot UNFORMATTED OutStr.
    END.
END CASE.
Andrew H