
I have two PL/SQL systems residing in two separate databases. SystemA will need to populate SystemB's tables, probably over a database link. Every time a set of records is inserted into SystemB's tables, a process in SystemB must run. I could wait for SystemA to complete and then run a script to start the processing in SystemB, but since SystemA could spend many hours processing and populating SystemB, I'd rather that SystemB handle each set of records as soon as it becomes available (each set can be processed independently of the others, so this should work OK).

What I'm not sure of is how to do event-driven programming in PL/SQL. I'd need SystemA to notify SystemB that a set is ready for processing. My first idea was to have a special "event" table in SystemB: when SystemA finishes a set, it inserts into the "event" table, and an insert trigger starts the process (which could be a long one, possibly 5-10 minutes per set) in SystemB. I don't have enough experience with triggers in Oracle to know whether this is an established way of doing it, or whether there's a better mechanism. Suggestions? Tips? Advice?
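
To make that idea concrete, here's roughly what I had in mind (all table, trigger, and procedure names are made up):

    -- On SystemB: the "event" table that SystemA would insert into over the db link
    CREATE TABLE set_ready_event (
      set_id     NUMBER PRIMARY KEY,
      created_on DATE DEFAULT SYSDATE
    );

    -- Trigger that kicks off SystemB's processing of the newly arrived set.
    -- This is the part I'm unsure about, since process_set could run 5-10 minutes.
    CREATE OR REPLACE TRIGGER trg_set_ready
      AFTER INSERT ON set_ready_event
      FOR EACH ROW
    BEGIN
      systemb_pkg.process_set(:NEW.set_id);
    END;
    /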

FrustratedWithFormsDesigner

2 Answers

6

Use Oracle Advanced Queuing; it's designed for this. I believe you'll still have to set up a database link between the two systems (from B to A in this case, to consume the queue on A).
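
Very roughly, the producer side on A looks something like this (all names are placeholders, and the DBMS_AQADM setup would typically be done by your DBA):

    -- One-time setup on A: payload type, queue table, and queue
    CREATE OR REPLACE TYPE set_ready_t AS OBJECT (set_id NUMBER);
    /
    BEGIN
      DBMS_AQADM.CREATE_QUEUE_TABLE(
        queue_table        => 'set_ready_qt',
        queue_payload_type => 'set_ready_t');
      DBMS_AQADM.CREATE_QUEUE(
        queue_name  => 'set_ready_q',
        queue_table => 'set_ready_qt');
      DBMS_AQADM.START_QUEUE('set_ready_q');
    END;
    /

    -- SystemA enqueues one message per finished set
    DECLARE
      l_enq_opts DBMS_AQ.ENQUEUE_OPTIONS_T;
      l_props    DBMS_AQ.MESSAGE_PROPERTIES_T;
      l_msgid    RAW(16);
    BEGIN
      DBMS_AQ.ENQUEUE(
        queue_name         => 'set_ready_q',
        enqueue_options    => l_enq_opts,
        message_properties => l_props,
        payload            => set_ready_t(42),  -- 42 = id of the finished set
        msgid              => l_msgid);
      COMMIT;
    END;
    /

The consumer side is the mirror image: a blocking DBMS_AQ.DEQUEUE (or a notification callback) that hands each set_id to the processing procedure on B.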

Adam Musch
  • There's going to be a dblink from A to B so A can insert into B's tables. The queue would have to be in A and B would need a link into A? – FrustratedWithFormsDesigner Feb 16 '11 at 19:30
  • Most commonly, you'd have a queue on A that the process on A would insert into, a queue on B that the process on B would dequeue messages from, and you'd use AQ to tell the queue on A to deliver messages to the queue on B. That way, your code never has to handle issues with temporarily interrupted network connections-- AQ will handle the delivery to the remote machine. – Justin Cave Feb 16 '11 at 19:42
  • No, the queue lives on A. B reads the queue on A to do whatever it needs to. If A "pushed" data to B using triggers over a database link, for example, A's ability to process is limited by B's availability and the reliability of the network between them. I'm operating on the assumption that the OP wants to process each record or batch of records on B once and only once; if that's the case, a persistent, single-consumer queue on A would allow B to read each message and be the only recipient of it. – Adam Musch Feb 16 '11 at 19:45
  • @Justin Cave: Thanks for the tip. Is this relatively easy to implement? From the little research I've done, it looks like most of the setup work would be for the DBAs, the PL/SQL code itself looks almost *too* easy... – FrustratedWithFormsDesigner Feb 16 '11 at 19:46
  • @Adam Musch: Correct, each batch should only be processed once (though for testing we might want to repeat a batch if we're trying to debug a problem). – FrustratedWithFormsDesigner Feb 16 '11 at 19:47
  • @Frustrated - From a developer standpoint, yes, it's pretty straightforward. It's also a relatively cookbook process for the DBA's though there is some work there. – Justin Cave Feb 16 '11 at 19:49
  • I won't have a chance to test this for a while, but when we reopen this area of development I will definitely suggest looking into Oracle Advanced Queues. – FrustratedWithFormsDesigner Feb 23 '11 at 15:39
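
A rough sketch of the queue-to-queue propagation Justin Cave describes in the comments above, again with placeholder names (to_systemb is the database link; A's queue table would need multiple_consumers => TRUE for propagation, and B needs its own matching payload type and queue):

    -- On A: register B's queue as a remote subscriber of A's queue ...
    BEGIN
      DBMS_AQADM.ADD_SUBSCRIBER(
        queue_name => 'set_ready_q',
        subscriber => SYS.AQ$_AGENT(NULL, 'systemb_user.set_ready_q@to_systemb', NULL));

      -- ... and schedule propagation across the database link
      -- (latency 0 = push messages as soon as possible)
      DBMS_AQADM.SCHEDULE_PROPAGATION(
        queue_name  => 'set_ready_q',
        destination => 'to_systemb',
        latency     => 0);
    END;
    /

    -- On B: a dedicated session (or job) blocks on the local queue and
    -- processes each set as its message arrives
    DECLARE
      l_deq_opts DBMS_AQ.DEQUEUE_OPTIONS_T;
      l_props    DBMS_AQ.MESSAGE_PROPERTIES_T;
      l_msgid    RAW(16);
      l_payload  set_ready_t;
    BEGIN
      l_deq_opts.wait := DBMS_AQ.FOREVER;  -- block until a message shows up
      DBMS_AQ.DEQUEUE(
        queue_name         => 'set_ready_q',
        dequeue_options    => l_deq_opts,
        message_properties => l_props,
        payload            => l_payload,
        msgid              => l_msgid);
      systemb_pkg.process_set(l_payload.set_id);
      COMMIT;
    END;
    /
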
2

Yes, Oracle Advanced Queuing, or even having A submit a venerable Oracle Job to B, would be a better idea than a trigger.
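
If you go the job route, one rough sketch (untested, and assuming a database link named to_systemb from A to B plus a systemb_pkg.process_set procedure on B) is to have A submit the job remotely once a set is loaded:

    -- Run on A after it finishes loading a set into B's tables:
    -- ask B's job queue to run the processing asynchronously.
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT@to_systemb(
        l_job,
        'systemb_pkg.process_set(42);');  -- runs once on SystemB, then drops off the job queue
      COMMIT;  -- the remote job isn't visible to B's job queue until the commit
    END;
    /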

And, if your process is going to need complete replication of the data from A to B, then you might want to look at something like an Oracle Streams process to copy the data over and then do the processing.

Michael Broughton