3

I've successfully triggered a pipeline manually in development many times. We're loading CSV files from SQL Server into an Azure Blob Storage container and sinking them into an Azure SQL database. Now when I run the same pipeline I get the error: `'StatusCode':'DF-Executor-InvalidOutputColumns','Message':'Job failed due to reason: The result has 0 output columns. Please ensure at least one column is mapped. The schema is late binding.'` I can preview in the Source (sometimes), but not in the Sink. I haven't changed anything since the last successful run. Has anyone else ever experienced this?

KimA
  • 31
  • 1
  • 3
  • Please clarify your specific problem or provide additional details to highlight exactly what you need. As it's currently written, it's hard to tell exactly what you're asking. – Community Oct 18 '21 at 11:49

6 Answers

1

Make sure the 'Allow schema drift' option is enabled on the 'Sink' tab.
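In data flow script terms, that checkbox corresponds to the `allowSchemaDrift` property on the sink. A minimal sketch (stream and sink names here are hypothetical, not from the question):

```
IncomingStream sink(allowSchemaDrift: true,
	validateSchema: false,
	skipDuplicateMapInputs: true,
	skipDuplicateMapOutputs: true) ~> MySink
```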

[screenshot: 'Allow schema drift' checkbox on the Sink tab]

0

I had the same issue, so I will try to explain my problem in more detail. I suspect it is the same thing.

I have a Data Flow within a pipeline. The data flow has only one source: a CSV file in a blob storage container. The data flow also has the following components: a Filter, a Derived Column, an Aggregate, and two sinks (one for the full flow, the other for the aggregate). The flow ran multiple times with success in the past, even one hour ago!

After I changed the container the source loaded from, the problem started to happen. I tried going back to the previous container, without success either.

Now, I can still see the dataset and preview the data. No issues here, regardless of where the file is. When I debug the data flow, though, the error occurs in the source component. When I select "Data Preview", the message "The result has 0 output columns. Please ensure at least one column is mapped" appears.

My source settings are set through two parameters:

  • FileName: tested with files with .txt and .csv extensions (which previously worked fine)
  • ColumnDelimiter: tested with tab- and pipe-delimited files (which previously worked fine)

My projection correctly displays my 24 output columns with the correct data types. Everything works up to the point where I try to debug. I tried logging out and restarting debug mode, both without success.

Below are the JSON for the full flow and the script for the source component.

Data Flow

```json
{
    "name": "DCV_Parent_Load",
    "properties": {
        "folder": { "name": "Data Ingestion" },
        "type": "MappingDataFlow",
        "typeProperties": {
            "sources": [
                {
                    "dataset": { "referenceName": "FileDCVParent", "type": "DatasetReference" },
                    "name": "DCVParentConnector"
                }
            ],
            "sinks": [
                {
                    "dataset": { "referenceName": "LegacyDCVParent", "type": "DatasetReference" },
                    "name": "SinkDCVParent"
                },
                { "name": "SaveMetada" }
            ],
            "transformations": [
                { "name": "NullTreatment" },
                { "name": "AddColumns" },
                { "name": "RemoveHeader" },
                { "name": "GetMetadata" }
            ],
            "script": "..." /* removed for data security */
        }
    }
}
```

Source Info

```
source(output(
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string,
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string,
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string,
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string,
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string,
        /* input name */ as string, /* input name */ as string, /* input name */ as string, /* input name */ as string
    ),
    allowSchemaDrift: true, /* I also tried disabling schema drift, without success */
    validateSchema: false,
    ignoreNoFilesFound: false,
    wildcardPaths:[($FileName)]) ~> DCVParentConnector
```

0

Possible scenario 1:

There are no files left to process

  • Maybe you deleted the source files with the "Delete source files" setting, or moved them with the "After completion" setting
  • Maybe you filtered new files by a start/end time window and there are no new files to process

There is an "Allow no files found" checkbox that is supposed to suppress the error when there are no files; yet as of 11.2021 I found that while it works in debug mode, it does not work well in triggered mode and still gives the "The result has 0 output columns" error.

[screenshot: source settings showing the "Allow no files found" checkbox]
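In data flow script, that checkbox maps to the `ignoreNoFilesFound` property on the source. A hedged sketch (the wildcard path and stream name are made up for illustration):

```
source(allowSchemaDrift: true,
	validateSchema: false,
	ignoreNoFilesFound: true, /* don't fail when the wildcard matches no files */
	wildcardPaths:['input/*.csv']) ~> SourceFiles
```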

Possible scenario 2:

  • You are trying to do an Alter Row transformation for Insert/Update/Delete on some condition, and the Sink does not allow one of the required methods. For example, if the Sink already contains data and you are going to Update, but the Sink does not have the Update right, you may get this error.

In the case below, Upsert (Insert/Update on a condition; or maybe you use `upsertIf(true())` too) uses both Insert and Update. [screenshot: Alter Row settings with Insert and Update conditions]

On the sink side, if you do not have the Update method ("Allow update") enabled, and your items already exist so the flow will perform an update (or update overwrite), it will fail. [screenshot: sink Update method settings without "Allow update" enabled]
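For completeness, a sketch of what scenario 2 looks like in data flow script: an Alter Row marking rows for upsert, feeding a sink that must have the matching methods enabled (stream names and the key column are hypothetical):

```
IncomingStream alterRow(upsertIf(true())) ~> MarkUpserts
MarkUpserts sink(allowSchemaDrift: true,
	validateSchema: false,
	insertable: true,
	updateable: true, /* must be enabled if existing rows will be updated */
	upsertable: true,
	keys:['SomeID']) ~> MySink
```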

Gorkem
  • 701
  • 8
  • 22
0

Check the schema, table, and column names in your destination dataset. Characters like a blank space in a schema, table, or column name can cause this issue.
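If renaming the column in the database is not an option, data flow script can reference names containing spaces or special characters by wrapping them in braces, so you can rename them on the ADF side with a Select transformation. A hypothetical sketch (column and stream names invented for illustration):

```
SourceFiles select(mapColumn(
		CustomerName = {Customer Name}, /* brace syntax for names with spaces */
		OrderId = {Order Id}
	),
	skipDuplicateMapInputs: true,
	skipDuplicateMapOutputs: true) ~> RenameColumns
```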

AskSam
  • 1
  • This does not provide an answer to the question. Once you have sufficient [reputation](https://stackoverflow.com/help/whats-reputation) you will be able to [comment on any post](https://stackoverflow.com/help/privileges/comment); instead, [provide answers that don't require clarification from the asker](https://meta.stackexchange.com/questions/214173/why-do-i-need-50-reputation-to-comment-what-can-i-do-instead). - [From Review](/review/late-answers/30341826) – Bracken Nov 16 '21 at 13:59
0

This issue may also happen if you have not defined a schema for your sink table in your database, e.g. in Azure Database for PostgreSQL.

0

In my case the problem was a case-sensitive column name.

For example, I had a "SomeID" column in the SQL DB table schema, while in the ADF parameter value it was "SomeId", which is fine from a SQL perspective but apparently not from ADF's.

After specifying the column names exactly as they are in the DB schema, the problem was solved. Hope this helps somebody!
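One way to re-align the casing without touching the database is a Select transformation in the data flow; a sketch using the "SomeId"/"SomeID" names from the example above (the stream names are hypothetical):

```
SourceData select(mapColumn(
		SomeID = SomeId /* emit the exact casing the DB schema uses */
	),
	skipDuplicateMapInputs: true,
	skipDuplicateMapOutputs: true) ~> FixCasing
```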

Oleg
  • 1,467
  • 4
  • 26
  • 39