
Recently, one of our DBAs suggested that I remove DBMS_OUTPUT.PUT_LINE() from one of my packages, or at least guard it with a compilation flag. His strongest argument is that the buffer limit can cause exceptions in the database. Although I fully agree that we shouldn't have such debugging code in production, I don't think DBMS_OUTPUT.PUT_LINE() can introduce problems if SET SERVEROUTPUT ON has not been issued.
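For context, the kind of compilation flag the DBA suggested can be sketched with PL/SQL conditional compilation (the flag name `debug_on` is my own choice; it would be set per compile via `PLSQL_CCFLAGS`):

```sql
-- Compile with: ALTER SESSION SET PLSQL_CCFLAGS = 'debug_on:TRUE';
CREATE OR REPLACE PROCEDURE demo_proc AS
BEGIN
  $IF $$debug_on $THEN
    DBMS_OUTPUT.PUT_LINE('debug: entering demo_proc');
  $END
  NULL;  -- real work here
END;
/
```

When `debug_on` is FALSE or unset, the PUT_LINE call is not even compiled into the unit.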

So my first question is whether DBMS_OUTPUT.PUT_LINE() writes to the buffer regardless of whether serveroutput is on. If it only writes to the buffer when serveroutput is on, then in the normal case, where we don't turn it on, everything should be fine. Tom's view on the performance aspect ("Frankly, I wouldn't care.") seems to support my argument.
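To make the first question concrete, this is the scenario I have in mind; my assumption (which I'd like confirmed) is that PUT_LINE becomes effectively a no-op when output is disabled:

```sql
BEGIN
  DBMS_OUTPUT.DISABLE;                     -- buffering off
  DBMS_OUTPUT.PUT_LINE('never buffered');  -- assumed to be a no-op here
  DBMS_OUTPUT.ENABLE(buffer_size => 20000);
  DBMS_OUTPUT.PUT_LINE('buffered now');    -- stored until GET_LINE/GET_LINES
END;
/
```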

My second question is about where the buffer sits (on the DB server or the client). I guess it sits on the server side; if so, does SET SERVEROUTPUT ON or DBMS_OUTPUT.ENABLE() make the buffer readable by all sessions via DBMS_OUTPUT.GET_LINE()?
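To make the second question concrete: within one session I can drain the buffer myself as below, but what I don't know is whether another session could do the same against my buffer:

```sql
DECLARE
  l_line   VARCHAR2(32767);
  l_status INTEGER;
BEGIN
  DBMS_OUTPUT.ENABLE(NULL);                -- NULL buffer size = unlimited
  DBMS_OUTPUT.PUT_LINE('hello');
  DBMS_OUTPUT.GET_LINE(l_line, l_status);  -- l_status = 0 means a line was returned
  -- l_line should now hold 'hello' in this session
END;
/
```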

My last question is about what happens when the buffer reaches its limit. Will the last caller of DBMS_OUTPUT.PUT_LINE() get an error?
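For the third question, this is the failure mode I'm worried about; with a small explicit buffer I'd expect the call that overflows it to raise ORA-20000 (ORU-10027: buffer overflow), though I haven't verified exactly where the error surfaces:

```sql
BEGIN
  DBMS_OUTPUT.ENABLE(2000);  -- small explicit buffer (the old default size)
  FOR i IN 1 .. 100 LOOP
    DBMS_OUTPUT.PUT_LINE(RPAD('x', 100, 'x'));
  END LOOP;
  -- expectation: ORA-20000, ORU-10027: buffer overflow, limit of 2000 bytes
END;
/
```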

ivenxu
    Tom Kyte addresses this [here](https://asktom.oracle.com/pls/apex/f?p=100:11:0::::P11_QUESTION_ID:4951703900346942776) – thatjeffsmith Nov 07 '14 at 01:45
  • They are right, although since 10g the size of the buffer is "unlimited". BTW even in PL/SQL you can use conditional compilation like in C++. So you can wrap these dbms_output(s) into `#IFDEF`s. – ibre5041 Nov 07 '14 at 07:34

0 Answers