
I'm using a Firebird 2.5 server to write to a database file (BD.fbd). My Delphi XE8 project has a data module (DMDados) with:

  • SQLConnection (conexao)

  • TSQLQuery1 (QueryBDPortico_Inicial) + TDataSetProvider1 (DSP_BDPortico_Inicial) + TClientDataSet1 (cdsBDPortico_Inicial)

  • TSQLQuery2 (QueryConsulta) (used just for executing SQL strings)

My database file has this table:

  • PORTICO_INICIAL

The table has these fields (all integer):

  • NPORTICO

  • ELEMENTO

  • ID

None of these fields is a primary key, because some values will be repeated. The connection to the file is OK. The client dataset is open when I run the code, and TSQLQuery2 (QueryConsulta) is opened when needed.

My code, when triggered by a button, has to delete all of the table's records (if any exist) and then fill the table with integer numbers created by a loop. On the first try the code works fine, but when I press the button a second time I get the error 'Unable to find record. No key specified', and when I check the records the table is empty.

I tried changing the ProviderFlags of my query but it made no difference. I checked the field names, the table name and the SQL text for errors but found nothing. My suspicion is that when my code deletes the records the old values stay in memory, so when ApplyUpdates is called with the new values the database uses the old values to locate the records, causing this error.

    procedure monta_portico();
    var
      I, K, L, M: Integer;
    begin
      with DMDados do
      begin
        QueryConsulta.SQL.Text := 'DELETE FROM PORTICO_INICIAL;';
        QueryConsulta.ExecSQL();
        K := 1;
        for I := 1 to 10 do
        begin
          L := I * 100;
          for M := 1 to 3 do
          begin
            cdsBDPortico_Inicial.Insert;
            cdsBDPortico_Inicial.FieldByName('NPORTICO').AsInteger := M + L;
            cdsBDPortico_Inicial.FieldByName('ELEMENTO').AsInteger := M;
            cdsBDPortico_Inicial.ApplyUpdates(0);
            K := K + 1;
          end;
        end;
      end;
    end;

I want that every time I run the code above it first deletes all records in the table and then fills it again with the loop. The first time I run it, it does what I want, but the second time it just deletes the records and cannot fill the table with the values.

3 Answers

Update: I've added some example code below. Also, when I wrote the original version of this answer, I'd forgotten that one of the TDataSetProvider Options is poAllowMultiRecordUpdates, but I'm not sure that's involved in your problem.

The error message Unable to find record. No key specified is generated by the DataSetProvider, so it isn't directly connected to your

QueryConsulta.SQL.Text := 'DELETE FROM PORTICO_INICIAL;'

because that bypasses the DataSetProvider. The error is coming from a failed attempt to ApplyUpdates on the CDS. Try changing your call to it to

Assert(cdsBDPortico_Inicial.ApplyUpdates(0) = 0);

That will show you when the error occurs, because the return value of ApplyUpdates gives the number of errors that occurred during the call.

You say

will have repeated values in some cases

If that's true when the problem occurs, it's because you are hitting a fundamental limitation in the way a DataSetProvider works. To apply the updates on the source dataset, it has to generate SQL to send back to the source dataset (TSqlQuery1) which uniquely identifies the row to update in the source data, which is impossible if the source dataset contains duplicated rows.

Basically, you need to re-think your code so that the source dataset rows are all unique. Once you've done that, setting the DSP's UpdateMode to upWhereAll should avoid the problem. It would be best for the source dataset to have a primary key, of course.
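
For illustration, something along these lines (a rough, untested sketch: it assumes the table gains a unique ID column, uses the component names from your q, and the ConfigureResolver name is made up):

procedure ConfigureResolver;  // name is just for illustration
begin
  // upWhereAll: the resolver's generated WHERE clause matches on all fields, as suggested above.
  DMDados.DSP_BDPortico_Inicial.UpdateMode := upWhereAll;
  // Better still, once a unique ID column exists, mark it as the key so the
  // DataSetProvider can identify exactly one row (needs the query open, or persistent fields).
  DMDados.QueryBDPortico_Inicial.FieldByName('ID').ProviderFlags :=
    [pfInKey, pfInUpdate, pfInWhere];
end;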

A quick work-around would be to use CDS.Locate in the loop where you insert the records, to see if it can locate an already-existing record with the values you're about to add.

Btw, sorry for raising the point about the ProviderFlags. It's irrelevant if there are duplicated rows, because whatever they are set to, the DSP will still fail to update a single record.

In case it helps, here is some code which might help populate your table in a way which avoids getting duplicates. It only populates the first two columns, as in the code you show in your question.

function RowExists(ADataSet : TDataSet; FieldNames : String; Values : Variant) : Boolean;
begin
  Result := ADataSet.Locate(FieldNames, Values, []);
end;

procedure TForm1.PopulateTable;
var
  Int1,
  Int2 : Integer;
  i : Integer;
  RowData : Variant;
begin
  CDS1.IndexFieldNames := 'Int1;Int2';
  for i := 1 to 100 do begin
    Int1 := Random(100);
    Int2 := Random(100);
    RowData := VarArrayOf([Int1, Int2]);
    // Only insert the row if an identical one is not already present
    if not RowExists(CDS1, 'Int1;Int2', RowData) then
      CDS1.InsertRecord([Int1, Int2]);
  end;
  CDS1.First;
  Assert(CDS1.ApplyUpdates(0) = 0);
end;
MartynA
  • Thanks for the information about the limitation. I thought that as it worked the first time it should work the second. I will try other code and put a primary-key field in my table, like a row counter, to work around it. – Johnny Arza Apr 02 '19 at 22:10
  • @JohnnyArza while you do not show all the relevant code, what most probably happened was this: when you first start it ("first time") your CDS is just created and because of that it is empty. After you issue your `DELETE` command, it just happens that the table in the database and the just-created CDS have the same data - nothing at all. They happen to be in sync. Then you add rows to the CDS and make the CDS flush those rows into the database using an unknown SQL transaction (hopefully auto-committed eventually). Both the table and the CDS have data, still in sync. But then you go for the "second time" – Arioch 'The Apr 04 '19 at 08:46
  • When you start your second time both the CDS and the DB table have data. Then you issue `DELETE` and kind-of clean the DB table. You actually do not clean it yet, you only register your intention to do so. The actual cleaning happens when you commit the transaction (tx) that the query operates under. Your code does not show it, so we may only guess where you do it. Also, your code does not show whether the CDS and the Query are linked to the same database and the same transaction. If they happen to belong to different transactions, then they probably DO NOT see changes of one another. – Arioch 'The Apr 04 '19 at 08:49
  • However, let's for simplicity guess (just a guess, it is not a sure thing) that they share the same TX. Then (within the volatile bubble of that not-yet-committed TX) they can see changes of one another - if you somehow tell them to do it. But you do not. You delete rows using the query - but you do not re-read the CDS from the database, nor do you destroy and re-create it. Now your table (within the TX) has no rows, but the CDS still does (after the "first time"). Now they are out of sync, and when you try to "apply changes" all the mismatches are revealed as errors (lucky you there were errors). – Arioch 'The Apr 04 '19 at 08:51
  • @Arioch'The: Thanks for the comments. I confess I had not come across TDBXTransaction - it has been a long time (D7) since I used DBX for real. – MartynA Apr 04 '19 at 09:09
  • @MartynA it does not matter whether it is DBX or something else; transactions are a core concept of an RDBMS. Different libraries have different names for types and methods, and different calling conventions, but the core idea is still the same. – Arioch 'The Apr 04 '19 at 09:37
  • @Arioch'The, sure, I understand quite a bit about transactions, but in the version of DBX I worked with, transactions were handled transparently. – MartynA Apr 04 '19 at 11:40
  • @MartynA many libraries have one or another form of auto-commit. The problem is that this auto-commit is ignorant of your real program structure, so it basically either has to happen after every single command - much more often than is required - or only when all queries are closed - way too rarely. Either way, this "removes transactions from your path", but at the price of making your programs incorrect - now, at any change, your database becomes inconsistent for quite a while, before hopefully becoming consistent again (maybe not). Briefcase-model libraries like DataSnap might do better. – Arioch 'The Apr 04 '19 at 13:29
  • They have the concept of accumulating changes across many tables, then issuing ApplyUpdates over ALL the datasets at once. This can naturally be wrapped into a transaction. But, well, this has the issue of trying to apply updates over data already changed by some other application. Like what we see in this topic (technically it is not "another application", but the change is nonetheless made outside the DataSnap framework, so DataSnap loses track of what is happening) :-D – Arioch 'The Apr 04 '19 at 13:31
  • This code just solved my issue. And I agree with @Arioch'The about my error. Thank you very much guys! – Johnny Arza Apr 12 '19 at 04:02

Split the problem into small parts using functions and procedures: create an instance of TSQLQuery, execute the SQL statements, and destroy the instance when you are finished with it...

procedure DeleteAll;
var
  Qry: TSqlQuery;
begin
  Qry := TSqlQuery.Create(nil);
  try
    Qry.SqlConnection := DMDados.conexao;
    Qry.Sql.Text := 'DELETE FROM PORTICO_INICIAL;';
    Qry.ExecSql;
  finally
    Qry.Free;
  end;
end;

You can even execute it directly from the TSQLConnection with one line...


DMDados.conexao.ExecuteDirect('DELETE FROM PORTICO_INICIAL;')

procedure monta_portico();
var
  I, K, L, M: Integer;
begin
  with DMDados do
  begin
    DeleteAll;

    K := 1;
    for I := 1 to 10 do
    begin
      L := I * 100;
      for M := 1 to 3 do
      begin
        cdsBDPortico_Inicial.Insert;
        cdsBDPortico_Inicial.FieldByName('NPORTICO').AsInteger := M + L;
        cdsBDPortico_Inicial.FieldByName('ELEMENTO').AsInteger := M;
        cdsBDPortico_Inicial.ApplyUpdates(0);
        K := K + 1;
      end;
    end;
  end;
end;
S.FATEH
  • Sorry, but this answer is just plain wrong. The problem has nothing to do with the bulk deletion, so replacing the code which does it is pointless. Do you understand how a TDataSetProvider works? That's where the problem lies. – MartynA Apr 02 '19 at 19:42
  • @S.FATEH thanks for the help. I did not know that I could execute SQL directly from the SQLConnection with one line. Now I can make my code cleaner. – Johnny Arza Apr 02 '19 at 22:11
  • @JohnnyArza notice however that this one-liner includes the full processing of the query (parsing, preparation, etc). If you issue it, say, once per few minutes - that is OK. But if there are queries you run a few times per second (such as when saving some complex object), which only differ in parameter values, then you had better have them pre-cooked (prepared); it may make saving a complex object, or similar multi-query interrelated work, several times faster. – Arioch 'The Apr 04 '19 at 08:07

Just a few observations, because the primary answers were already given, but they do not deal with the secondary problems.

cdsBDPortico_Inicial.FieldbyName('NPORTICO').AsInteger := 

FieldByName is a slow function - it is a linear search over an array of objects, with an uppercased string comparison against each one. You had better call it only once per field, not again and again inside the loop.

cdsBDPortico_Inicial.ApplyUpdates(0);

Again, applying updates is relatively slow - it requires a round trip to the server through all the internal guts of the DataSnap library - so why do it so often?

BTW, you delete rows from the SQL table - but where do you delete the rows from cdsBDPortico_Inicial? I do not see that code.

Were I in your shoes I would write something like this (granted, I am not a big fan of DataSnap and CDS):

procedure monta_portico ();
var
  Qry: TSqlQuery;
  _p_EL, _p_NP: TParam;
  Tra: TDBXTransaction;
  I, K, L, M: Integer;
begin
  Tra := nil;
  Qry := TSqlQuery.Create(DMDados.conexao); // this way the query would have owner
  try   // thus even if I screw and forget to free it - someone eventually would

    Qry.SqlConnection := DMDados.conexao;
    Tra := Qry.SqlConnection.BeginTransaction;

    // think about making a special function that would create query
    // and set some its properties - like connection, transaction, preparation, etc
    // so you would not repeat yourself again and again, risking mistyping

    Qry.Sql.Text := 'DELETE FROM PORTICO_INICIAL'; // you do not need ';' for one statement, it is not script, not a PSQL block here
    Qry.ExecSql;

    Qry.Sql.Text := 'INSERT INTO PORTICO_INICIAL(NPORTICO,ELEMENTO) '
                  + 'VALUES (:NP,:EL)';
    Qry.Prepared := True;
    _p_EL := Qry.ParamByName('EL'); // cache objects, do not repeat linear searches
    _p_NP := Qry.ParamByName('NP'); // for simple queries you can even do ... := Qry.Params[0]

    K := 1;
    for I := 1 to 10 do
      begin
      L := I*100;
      for M := 1 to 3 do
        begin
          _p_NP.AsInteger := M+L;
          _p_EL.AsInteger := M;
          Qry.ExecSQL;
          Inc(K); // why? you seem to never use it 
        end;
      end;

    Qry.SqlConnection.CommitFreeAndNil(tra);
  finally
    if nil <> tra then Qry.SqlConnection.RollbackFreeAndNil(tra);

    Qry.Destroy;
  end;
end;

This procedure does not populate cdsBDPortico_Inicial - but do you really need it to? If you do - maybe you can re-read it from the database: there could be other programs that added rows into the table too. Or you can insert many rows and then apply them all with one ApplyUpdates call, before committing the transaction (often abbreviated tx), but even then, do not call FieldByName more than once.
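
Something like this sketch of the batched CDS approach (the procedure name is hypothetical; it assumes the CDS is already open and that one ApplyUpdates at the end is acceptable):

procedure PreencherPortico;  // hypothetical name, just to illustrate the pattern
var
  fldNPortico, fldElemento: TField;
  I, L, M: Integer;
begin
  with DMDados.cdsBDPortico_Inicial do
  begin
    // Look the fields up once, outside the loops.
    fldNPortico := FieldByName('NPORTICO');
    fldElemento := FieldByName('ELEMENTO');
    for I := 1 to 10 do
    begin
      L := I * 100;
      for M := 1 to 3 do
      begin
        Append;
        fldNPortico.AsInteger := M + L;
        fldElemento.AsInteger := M;
        Post;  // stays in the CDS change log; no server round trip yet
      end;
    end;
    // One round trip for all 30 rows; 0 means "stop at the first error".
    Assert(ApplyUpdates(0) = 0);
  end;
end;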

Also, think about the logical blocks of your program's work in advance - those very transactions, temporary TSQLQuery objects, etc. However boring and tedious it is now, you will bring yourself much more spaghetti trouble if you don't. Adding this logic retroactively, after you have many small functions calling one another in unpredictable order, is very hard.

Also, if you make the Firebird server auto-assign the ID field (and your program does not need any special values in ID and will be OK with Firebird-made values) then the following command might serve you even better: INSERT INTO PORTICO_INICIAL(NPORTICO,ELEMENTO) VALUES (:NP,:EL) RETURNING ID
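
In Firebird 2.5 "auto-assign" usually means a generator (sequence) plus a BEFORE INSERT trigger. A rough one-time setup sketch - you could just as well run the DDL from an admin tool, and the generator and trigger names here are only examples:

// One-time DDL, shown via the connection from the question.
DMDados.conexao.ExecuteDirect('CREATE GENERATOR GEN_PORTICO_ID');
DMDados.conexao.ExecuteDirect(
  'CREATE TRIGGER PORTICO_INICIAL_BI FOR PORTICO_INICIAL '
  + 'ACTIVE BEFORE INSERT POSITION 0 AS '
  + 'BEGIN '
  + '  IF (NEW.ID IS NULL) THEN NEW.ID = GEN_ID(GEN_PORTICO_ID, 1); '
  + 'END');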

Arioch 'The