Teradata: Results with duplicate values converted into comma delimited strings

Submitted by *爱你&永不变心* on 2021-01-29 22:17:01

Question


I have a typical table where each row represents a customer-product holding. If a customer has multiple products, there will be multiple rows with the same customer ID. I'm trying to roll this up so that each customer is represented by a single row, with all product codes concatenated together in a single comma-delimited string. A diagram in the original post illustrates this.

After googling this, I managed to get it to work using the XMLAGG function, but this only worked on a small sample of data; when scaled up, Teradata complained about running out of spool space, so I figure it's not very efficient.
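For reference, the standard XMLAGG rollup for this shape of problem looks roughly like the following. The table and column names (`cust_products`, `customer_id`, `product_code`) are illustrative, not from the original post:

```sql
-- Hypothetical schema: cust_products(customer_id, product_code)
SELECT customer_id,
       TRIM(TRAILING ',' FROM
            (XMLAGG(product_code || ',' ORDER BY product_code)
             (VARCHAR(10000)))) AS product_list
FROM cust_products
GROUP BY customer_id;
```

XMLAGG builds the whole concatenated value in spool per group, which is why it can exhaust spool space on large inputs.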

Does anyone know how to efficiently achieve this?


Answer 1:


Newer versions of Teradata support NPath, which can be used for this. You have to get used to the syntax; it's a table operator :-)

E.g. this returns the column list for each table in your system:

SELECT * 
FROM 
   NPath(ON(SELECT databasename, tablename, columnname, columnid 
            FROM dbc.columnsV
           ) AS dt                            -- input data
         PARTITION BY databasename, tablename -- group by columns
         ORDER BY columnid                    -- order within list
         USING
           MODE (NonOverlapping)              -- required syntax 
           Symbols (True AS F)                -- every row
           Pattern ('F*')                     -- is returned
           RESULT(First (databasename OF F) AS DatabaseName, -- group by column
                  First (tablename OF F) AS TableName,       -- group by column
                  Count (* OF F) AS Cnt,
                  Accumulate(Translate(columnname USING unicode_to_latin) OF ANY (F)) AS ListAgg
                 )
        );

Should be waaaaaay better than XMLAgg.
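Adapted to the customer/product rollup from the question, the same pattern would look something like this (again, `cust_products`, `customer_id`, and `product_code` are assumed names):

```sql
SELECT *
FROM NPath(ON (SELECT customer_id, product_code
               FROM cust_products) AS dt
           PARTITION BY customer_id        -- one output row per customer
           ORDER BY product_code           -- order of codes within the list
           USING
             MODE (NonOverlapping)
             Symbols (True AS F)
             Pattern ('F*')
             RESULT (First (customer_id OF F) AS customer_id,
                     Count (* OF F) AS product_count,
                     Accumulate (product_code OF ANY (F)) AS product_list)
          );
```

Accumulate produces the comma-delimited list directly; the Translate call in the example above is only needed when the source column is Unicode.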



Source: https://stackoverflow.com/questions/63683088/teradata-results-with-duplicate-values-converted-into-comma-delimited-strings
