
I am performing an ETL job via Pentaho 7.1. The job populates a table 'PRO_T_TICKETS' in PostgreSQL 9.2 via Pentaho jobs and transformations.

I have mapped the table fields to the corresponding stream fields:

[Screenshot: Mapped Fields]

My table PRO_T_TICKETS has its column names in UPPERCASE. Is this the reason I can't populate the table PRO_T_TICKETS with my ETL job?

I duplicated the TABLE_OUTPUT step for PRO_T_TICKETS and changed the Target table field to 'PRO_T_TICKETS2'. Pentaho created a new table with lowercase column names and populated the data into it.

But I want this data to be loaded into the table PRO_T_TICKETS only, and with the UPPERCASE column names if possible.
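The behavior described above matches PostgreSQL's identifier-folding rule: unquoted identifiers in SQL are folded to lowercase, while double-quoted identifiers keep their exact case. A minimal Python sketch of that rule (a simulation for illustration, not Pentaho or PostgreSQL code) shows why an unquoted table name cannot match a table created with uppercase names:

```python
def pg_fold(identifier: str) -> str:
    """Simulate how PostgreSQL resolves an SQL identifier:
    double-quoted identifiers keep their exact case, unquoted
    identifiers are folded to lowercase."""
    if len(identifier) >= 2 and identifier.startswith('"') and identifier.endswith('"'):
        return identifier[1:-1]  # quoted: case preserved
    return identifier.lower()    # unquoted: folded to lowercase

# An unquoted PRO_T_TICKETS resolves to pro_t_tickets, which does NOT
# match a table whose name was created as quoted uppercase:
print(pg_fold('PRO_T_TICKETS'))    # -> pro_t_tickets
print(pg_fold('"PRO_T_TICKETS"'))  # -> PRO_T_TICKETS
```

This is why the duplicated step silently created a brand-new lowercase table: the generated DDL used unquoted (or lowercased) names.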

I am attaching the whole job here along with the error thrown by Pentaho: [Screenshot: Pentaho Error]. I have also tried adding double quotes to the column names in my query, as you can see in the error, but it didn't help.

What do you think I should do?

Pranjal Kaushik

2 Answers


When you create (or modify) the connection, select Advanced in the left panel and tick Force to upper case, Force to lower case, or, even better, Preserve case of reserved words.

To know which option to choose, copy the 4th line of your error log (the line starting with INSERT INTO "public"."PRO_T_TICKETS("OID"...) into your SQL-developer tool, run it there, and change the connection's advanced parameters until it works.

Also, at debug time, don't use batch updates, don't use lazy conversion on previous steps, and try with one (1) field rather than all (25).
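For clarity, the three Advanced options named above differ only in how they transform identifier case before the SQL is generated. A rough sketch of that difference (option names as they appear in the Pentaho connection dialog; the exact internal logic is an assumption):

```python
def apply_case_option(identifier: str, option: str) -> str:
    """Rough sketch of the effect of the connection's Advanced case
    options on an identifier before it reaches the generated SQL."""
    if option == 'force_upper':    # "Force identifiers to upper case"
        return identifier.upper()
    if option == 'force_lower':    # "Force identifiers to lower case"
        return identifier.lower()
    return identifier              # "Preserve case of reserved words": untouched
```

With an all-uppercase PostgreSQL schema, forcing lowercase would make the names miss the table entirely, which is why trying each option against the copied INSERT line is the quickest way to find the right one.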

AlainD
  • I did not get you. After I copy the 4th line of my error into the SQL developer tool, then do what? @AlainD – Pranjal Kaushik Sep 27 '17 at 09:54
  • Is SQL-developer giving you an error? If not, then the problem is not with the output step (yes, Kettle sometimes gets confused about the source of the error). If yes, then you know what the error is (which may not be an upper/lower case issue as you are hypothesizing). – AlainD Sep 27 '17 at 12:32

Just as a complement: it worked for me following AlainD's tips, using specific configurations that I'd like to share with you. I have a transformation streaming data from MySQL to PostgreSQL using a Table Input and a Table Output step. In both databases I have uppercase object names.

These are the steps that made it work correctly:

  1. In the Table Input (MySQL) the objects are uppercase too, but I typed them in lowercase and it worked; I didn't set any special option on that DB connection.
  2. In the Table Output (PostgreSQL) I typed everything in uppercase (schema, table name and columns) and I also enabled "Specify the database fields" (clicking "Get fields").
  3. In the target DB connection (PostgreSQL) I enabled the options (in the "Advanced" section): "Quote all in database" and "Preserve case of reserved words".

PS: Ah, the last option is because I found out that there was one more problem with my fields: there was a column called "Admin" (yes guys, they created a camel-case column using a reserved word!), and for that reason I had to enable "Preserve case of reserved words" and type it as Admin (without quotes and in camel case) in the Table Output.
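The "Quote all in database" option essentially makes every identifier in the generated SQL double-quoted, so PostgreSQL stops folding case and names like Admin survive intact. A small sketch of what such a fully quoted INSERT looks like (quoted_insert is a hypothetical helper for illustration, not Pentaho code):

```python
def quoted_insert(schema: str, table: str, columns: list) -> str:
    """Build an INSERT statement with every identifier double-quoted,
    roughly what 'Quote all in database' makes Kettle emit.
    Embedded double quotes are doubled, per the SQL standard."""
    q = lambda ident: '"' + ident.replace('"', '""') + '"'
    cols = ', '.join(q(c) for c in columns)
    placeholders = ', '.join(['?'] * len(columns))
    return f'INSERT INTO {q(schema)}.{q(table)} ({cols}) VALUES ({placeholders})'

sql = quoted_insert('public', 'PRO_T_TICKETS', ['OID', 'Admin'])
print(sql)
# -> INSERT INTO "public"."PRO_T_TICKETS" ("OID", "Admin") VALUES (?, ?)
```

Because every identifier is quoted, the statement hits the uppercase table and the camel-case "Admin" column exactly as they were created.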

furuka