
The eBay Platform Notifications documentation recommends periodically polling the GetOrders API to ensure that every order is received.

In my case, I have Platform Notifications set up to parse the received XML and insert it into a MySQL database using PHP.

Now I am looking to, as recommended, "double pass" using GetOrders, which should essentially hand me a duplicate of every row (order) I have already stored.

My structure is rather straightforward, but I have a UNIQUE INDEX on OrderLineItemID, which, to my understanding, is the unique identifier for each eBay order.
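Something along these lines (the column types below are just illustrative, not my exact schema; the important part is the UNIQUE INDEX):

CREATE TABLE eBayOrders (
    OrderLineItemID   VARCHAR(50) NOT NULL,  -- unique identifier for the eBay order line item
    SalesRecordNumber INT,                   -- column types here are illustrative only
    BuyerUserID       VARCHAR(64),
    BuyerEmail        VARCHAR(128),
    Title             VARCHAR(255),
    SKU               VARCHAR(80),
    Quantity          INT,
    TransactionPrice  DECIMAL(10,2),
    UNIQUE INDEX idx_orderlineitemid (OrderLineItemID)
);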

Is there a better way to do this than I am currently doing?

// retrieve and escape variables for insertion

$sql = "INSERT INTO eBayOrders (OrderLineItemID, SalesRecordNumber, BuyerUserID, BuyerEmail, Title, SKU, Quantity, TransactionPrice)
VALUES ('".$orderlineitemid."', '".$recordnumber."', '".$buyeruserid."', '".$buyeremail."', '".$title."', '".$sku."', '".$qty."', '".$transactionprice."')";
}

if ($connect->query($sql) === TRUE) {
    echo "New Record Created Successfully";
} else {
    echo "Error: " . $sql . "<br />" . $connect->error;
    $connect->close();
    die();
}

Because of my UNIQUE INDEX on OrderLineItemID, when a duplicate order comes in the query results in an error, the connection is closed, and the script exits.

I've thought about first checking (maybe with a SELECT statement) whether the row already exists and only then attempting the insert. However, I'm running these queries in a foreach loop over up to 100 orders returned by GetOrders, so simply letting duplicates fall through to an error seems like it might be the quicker option. I'm wary, though, of whether that could cause issues down the line.
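Roughly, the check-first idea I had in mind looks like this (just a sketch; the variable names match my loop below):

// Sketch of the check-then-insert idea: look the order up first and only
// insert (and do any follow-up work) if it isn't already in the table.
$check  = "SELECT 1 FROM eBayOrders WHERE OrderLineItemID = '" . $orderlineitemid . "' LIMIT 1";
$result = $connect->query($check);

if ($result && $result->num_rows > 0) {
    // Row already exists (duplicate from the GetOrders pass) - skip it
    continue;
}

// Otherwise run the INSERT as usual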

All in all, I'm not familiar with best practices for MySQL "double passes". Does anyone have insight into the best way to handle this?

edit: here is my entire foreach loop:

foreach ($orders as $order) {
    $i++;

    // Pull each field out of the GetOrders response and escape it for the query
    $buyeruserid       = mysqli_real_escape_string($connect, $order->BuyerUserID);
    // $extendedorderid = $order->TransactionArray->Transaction->ExtendedOrderID;
    $buyeremail        = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->Buyer->Email);
    $salesrecordnumber = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->ShippingDetails->SellingManagerSalesRecordNumber);
    $orderlineitemid   = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->OrderLineItemID);
    $title             = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->Item->Title);
    $sku               = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->Item->SKU);
    $quantitypurchased = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->QuantityPurchased);
    $transactionprice  = mysqli_real_escape_string($connect, $order->TransactionArray->Transaction->TransactionPrice);

    echo $i;
    echo "\n";
    echo "BuyerUserID: " . $buyeruserid . "\n";
    echo "BuyerEmail: " . $buyeremail . "\n";
    echo "SellingManagerSalesRecordNumber: " . $salesrecordnumber . "\n";
    echo "OrderLineItemID: " . $orderlineitemid . "\n";
    // echo "ExtendedOrderID: " . $extendedorderid . "\n";
    echo "Title: " . $title . "\n";
    echo "SKU: " . $sku . "\n";
    echo "QuantityPurchased: " . $quantitypurchased . "\n";
    echo "TransactionPrice: " . $transactionprice . "\n";
    echo "\n";

    $sql = "INSERT INTO eBayOrders (OrderLineItemID, SalesRecordNumber, BuyerUserID, BuyerEmail, Title, SKU, Quantity, TransactionPrice)
            VALUES ('" . $orderlineitemid . "', '" . $salesrecordnumber . "', '" . $buyeruserid . "', '" . $buyeremail . "', '" . $title . "', '" . $sku . "', '" . $quantitypurchased . "', '" . $transactionprice . "')";

    if ($connect->query($sql) === TRUE) {
        echo "New Record Created Successfully";
    } else {
        echo "Error: " . $sql . "<br />" . $connect->error;
        $connect->close();
        die();
    }
}
bbruman
  • Learn about prepared statements to prevent SQL injection. – Jens Mar 16 '17 at 14:55
  • Okay, I could stand to learn more about prepared statements. I am using `mysqli_real_escape_string` to escape my variables against SQL injection, but did not include that as I felt it wasn't relevant to the question at hand. – bbruman Mar 16 '17 at 14:58
  • `INSERT ... ON DUPLICATE KEY` may be what you need https://dev.mysql.com/doc/refman/5.7/en/insert-on-duplicate.html or `INSERT ... SELECT` https://dev.mysql.com/doc/refman/5.7/en/insert-select.html – Funk Forty Niner Mar 16 '17 at 15:03
  • *"but I'm doing a foreach loop of up to 100 orders using the GetOrders"* - You should probably include that part and any other possible relevant code in your question. Or, see the answer that was given just now below. – Funk Forty Niner Mar 16 '17 at 15:07
  • Thanks fellas. Looking into this stuff now. @Fred -ii-, I edited my original question with the full `foreach` loop – bbruman Mar 16 '17 at 15:17
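A minimal sketch of the prepared-statement approach suggested in the comments above (mysqli `prepare`/`bind_param`; the column list mirrors the question's INSERT, and binding everything as strings is an assumption that lets MySQL coerce the numeric columns):

// Sketch only: the same INSERT using placeholders. With a prepared statement the
// manual mysqli_real_escape_string() calls are no longer needed; the raw values
// from the GetOrders response can simply be cast to strings before binding.
$orderlineitemid = (string) $order->TransactionArray->Transaction->OrderLineItemID;
// ... cast the remaining fields the same way ...

$stmt = $connect->prepare(
    "INSERT INTO eBayOrders
        (OrderLineItemID, SalesRecordNumber, BuyerUserID, BuyerEmail, Title, SKU, Quantity, TransactionPrice)
     VALUES (?, ?, ?, ?, ?, ?, ?, ?)"
);

$stmt->bind_param(
    "ssssssss",            // bound as strings; MySQL coerces the numeric columns
    $orderlineitemid,
    $salesrecordnumber,
    $buyeruserid,
    $buyeremail,
    $title,
    $sku,
    $quantitypurchased,
    $transactionprice
);

if (!$stmt->execute()) {
    echo "Error: " . $stmt->error;
}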

1 Answer


To avoid an error when an INSERT fails due to a unique key constraint, we can use the IGNORE option on the INSERT statement.

INSERT IGNORE INTO eBayOrders ...

If you use the IGNORE modifier, errors that occur while executing the INSERT statement are ignored. For example, without IGNORE, a row that duplicates an existing UNIQUE index or PRIMARY KEY value in the table causes a duplicate-key error and the statement is aborted. With IGNORE, the row is discarded and no error occurs. Ignored errors generate warnings instead.

But this also affects error conditions other than duplicate key exceptions.
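If you do use IGNORE, one way to tell whether a given row was actually inserted or silently skipped is to check `affected_rows` after the query; it is 0 when the duplicate was ignored and 1 when a new row went in. A sketch, reusing the variable names from the question:

$sql = "INSERT IGNORE INTO eBayOrders (OrderLineItemID, SalesRecordNumber, BuyerUserID, BuyerEmail, Title, SKU, Quantity, TransactionPrice)
        VALUES ('" . $orderlineitemid . "', '" . $salesrecordnumber . "', '" . $buyeruserid . "', '" . $buyeremail . "', '" . $title . "', '" . $sku . "', '" . $quantitypurchased . "', '" . $transactionprice . "')";

if ($connect->query($sql) === TRUE) {
    if ($connect->affected_rows === 1) {
        echo "New Record Created Successfully";
    } else {
        // affected_rows of 0 means the duplicate was ignored; skip any follow-up work
        echo "Duplicate OrderLineItemID - skipped";
    }
} else {
    echo "Error: " . $connect->error;
}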

As another option, we can use INSERT ... ON DUPLICATE KEY ...
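For example, something along these lines (the UPDATE clause here is arbitrary; it just needs to assign at least one column so that a duplicate becomes a harmless update instead of an error):

INSERT INTO eBayOrders (OrderLineItemID, SalesRecordNumber, ...)
VALUES (...)
ON DUPLICATE KEY UPDATE SalesRecordNumber = VALUES(SalesRecordNumber);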

Reference: https://dev.mysql.com/doc/refman/5.7/en/insert.html

spencer7593
  • I suppose this answers my question, and I will probably accept it (still reading over the reference material). What I left out is that when an order comes in, I also have a PHP script that subtracts `Quantity` levels from an `Inventory` table. Therefore if the record already exists, *maybe* I'd prefer to get an error and abort the script. Otherwise, if the row is inserted, it will continue on with subtracting Inventory quantities (which I of course don't want to 'double-pass' on for the same order). At least, that's how I have it set up now. – bbruman Mar 16 '17 at 15:27
  • Actually, erroring out, closing the connection, and aborting the script is not going to work in a `foreach` (unlike with Platform Notifications, which provide only one order at a time), because if it errors and exits it's going to stop future iterations and some orders will be missed. Looking into alternatives – bbruman Mar 16 '17 at 15:52
  • @bbruman: the code doesn't necessarily need to close the connection and die when a SQL error occurs. The code can detect the error (like it's already doing), use `mysqli_error` to retrieve the error (like it's already doing)... and *then* it can inspect the error, and if it's the duplicate key error that you are expecting, skip over the rest of the processing (e.g. don't subtract quantity from inventory), and continue in the loop, processing the next order. – spencer7593 Mar 16 '17 at 16:42
  • Exactly my thinking, but it's my first time doing something like this, so I wasn't sure. I tested it like this and it skipped the orders with a duplicate unique key and went through the rest. My only concern was that most of the `GetOrders` rows will result in an error, up to 100 at a time, and I wasn't sure whether repeatedly generating that many errors should be a concern. – bbruman Mar 16 '17 at 17:02
  • @bbruman: Running INSERTs that generate errors shouldn't be a problem. That's going to use resources (time) on the database server, but it's not really any more expensive than running INSERT statements that succeed. Both will do work to parse, evaluate, prepare and execute. Both will attempt to obtain and hold locks. So it's really just going to cost some time. (The usual concurrency concerns always apply.) But that won't cause a problem in terms of the database server itself; other clients may be inconvenienced. I'm not aware of any setting that will disconnect a MySQL client after a certain number of errors. – spencer7593 Mar 16 '17 at 20:09
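A minimal sketch of that error-inspection approach (1062 is MySQL's duplicate-key error code; the inventory step is the one mentioned in the comments above, and this snippet assumes it sits inside the question's foreach loop):

if ($connect->query($sql) === TRUE) {
    echo "New Record Created Successfully";
    // ... subtract Quantity from the Inventory table here ...
} else {
    if ($connect->errno == 1062) {
        // Duplicate OrderLineItemID: this order was already handled via Platform
        // Notifications, so skip the inventory adjustment and move to the next order.
        echo "Duplicate OrderLineItemID - skipped";
        continue;
    }
    // Any other error is unexpected; bail out as before.
    echo "Error: " . $sql . "<br />" . $connect->error;
    $connect->close();
    die();
}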