I have a table in SQL Server with 3 columns, one of which is a data column whose rows contain several values concatenated together and delimited by commas. The first row is also the header row.
What you can do is read the file as is and split those values in a Script Component.
So from the source, connect a Script Component (as a transformation). In the component, select the column containing those values (InputColumn1) as an input column, then specify the output columns; from your example there appear to be five, so add OutputColumn1 through OutputColumn5.
After that is done, go to the script itself (C#).
Under:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
}
Put the following code in there:
// Split the delimited value into its individual parts.
var ColumnValue = Row.InputColumn1.Split(',');
// Assign each part to its own output column.
Row.OutputColumn1 = ColumnValue[0];
Row.OutputColumn2 = ColumnValue[1];
Row.OutputColumn3 = ColumnValue[2];
Row.OutputColumn4 = ColumnValue[3];
Row.OutputColumn5 = ColumnValue[4];
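If some rows might hold fewer than five values, or carry stray spaces around them, a slightly more defensive version of the whole method could look like the sketch below. It is only a sketch and assumes InputColumn1 and the output columns are all string (DT_STR/DT_WSTR) columns:

public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    // Split the incoming value on commas.
    string[] parts = Row.InputColumn1.Split(',');

    // Assign (and trim) a part only if the row actually contains it.
    Row.OutputColumn1 = parts.Length > 0 ? parts[0].Trim() : string.Empty;
    Row.OutputColumn2 = parts.Length > 1 ? parts[1].Trim() : string.Empty;
    Row.OutputColumn3 = parts.Length > 2 ? parts[2].Trim() : string.Empty;
    Row.OutputColumn4 = parts.Length > 3 ? parts[3].Trim() : string.Empty;
    Row.OutputColumn5 = parts.Length > 4 ? parts[4].Trim() : string.Empty;
}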
After the Script Component, all the columns from the source as well as OutputColumn1-5 will be available and you can do what you have to.
OUTPUT
Data      | ID | Source | OutputColumn1 | OutputColumn2 | etc. 3-5
===================================================================
a,b,c,d,e | 1  | a.csv  | a             | b             |
f,g,h,i,j | 2  | b.csv  | f             | g             |
Please ask if something is not clear.
You can use the TOKEN expression to isolate strings delimited by, well, delimiters.
Use a Derived Column transformation and something like this:
TOKEN([Name_of_your_Column], "," , 1)
Should give you "a"
TOKEN([Name_of_your_Column], "," , 2)
Should give you "b"
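The occurrence argument is 1-based, so the same pattern continues for the remaining values. Assuming each row really has five parts, you would add one derived column per expression:
TOKEN([Name_of_your_Column], ",", 3)
TOKEN([Name_of_your_Column], ",", 4)
TOKEN([Name_of_your_Column], ",", 5)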
You can also set up a simple transformation Script Component. Use your "DATA" column as an input and add as many outputs as you need. Use the Split method and you're set.
// Inside Input0_ProcessInputRow, split the incoming DATA value on commas:
string[] myNewColumns = Row.DATA.Split(',');