SQLTeam.com | Weblogs | Forums

How to merge multiple rows into one single row

Say I have data as follows:

create table #FakeTable
(id int,
eventtype varchar(50),
comments varchar(Max)
)

insert into #FakeTable
select 1, 'Code Red', 'some text that is long but not too long' union all
select 1, NULL, ' for this column. Notice that the text in th' union all
select 1, NULL, 'is column is a continuous statement or comment' union all
select 2, 'Code Red', 'This is also a long description but not as lo' union all
select 2, NULL, 'ng as the previous entry' union all
select 3, 'Code Red', 'This is a comment that fit on a single line' union all
select 4, 'Code Red', 'This also fits on one line'

I would like to consolidate this into the following format:
id  eventtype  comments
1   Code Red   some text that is long but not too long for this column. Notice that the text in this column is a continuous statement or comment
2   Code Red   This is also a long description but not as long as the previous entry
3   Code Red   This is a comment that fit on a single line
4   Code Red   This also fits on one line

Any ideas? Solutions?

Is there a column that can be used to control the order of the comments? A SQL table has no inherent order, without a column you can't be sure which row will be concatenated first.

For example, you need an ordering column, something like this (the third value controls the order of the continuation lines):

select 1, 'Code Red', NULL, 'some text that is long but not too long'
select 1, NULL, 1, ' for this column. Notice that the text in th'
select 1, NULL, 2, 'is column is a continuous statement or comment'
select 1, NULL, 3, '. I added this line to show a third continuation line with the number 3'

In the real data, yes, there is, and I feel foolish for not including it. Call it Line for simplicity's sake 🙂

Awesome. You could write some sophisticated logic -- with all the associated debugging and testing time -- to do all rows at once, but since this is almost certainly a one-time thing, I'd just loop through and handle one row at a time for each ID. I'll post sample code for that in a bit.
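For reference, on SQL Server 2017 or later the all-rows-at-once approach can be sketched with STRING_AGG. This is only a sketch; it assumes the ordering column is named line and that the non-NULL eventtype sits on the first fragment of each id (whose line is NULL, so it sorts first):

```
-- Set-based sketch (SQL Server 2017+), assuming an ordering column "line".
-- STRING_AGG skips NULL comments; NULL line sorts first, so the header
-- fragment's text leads the concatenated result.
SELECT FT.id,
       MAX(FT.eventtype) AS eventtype,
       STRING_AGG(FT.comments, '') WITHIN GROUP (ORDER BY FT.line) AS comments
FROM #FakeTable FT
GROUP BY FT.id;
```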

It would be like this:
select 1, 'Code Red', 1, 'some text that is long but not too long'
select 1, NULL, 2, ' for this column. Notice that the text in th'
select 1, NULL, 3, 'is column is a continuous statement or comment'

Sounds good ... it will be a one-time thing for the current data, but it will have to be done for future data as well. I have no control over the application that stores the data like this, but I have been tasked with gathering it, cleaning it up, and analyzing it. Fun stuff, huh?

Oh, and there are several other columns similar to the comments one that will also need to be either concatenated or simply brought into the main record.

drop table if exists #FakeTable;
go
create table #FakeTable
(id int NOT NULL,
line smallint NULL,
eventtype varchar(50) NULL,
comments varchar(Max) NULL
)

insert into #FakeTable
select 1, NULL, 'Code Red', 'some text that is long but not too long' union all
select 1, 1, NULL, ' for this column. Notice that the text in th' union all
select 1, 2, NULL, 'is column is a continuous statement or comment' union all
select 2, NULL, 'Code Red', 'This is also a long description but not as lo' union all
select 2, 1, NULL, 'ng as the previous entry' union all
select 3, NULL, 'Code Red', 'This is a comment that fit on a single line' union all
select 4, NULL, 'Code Red', 'This also fits on one line'

SELECT 'Before', * FROM #FakeTable


-- Loop: each pass appends the first remaining continuation row's text
-- to its parent row, then deletes that continuation row.
WHILE EXISTS(SELECT 1 FROM #FakeTable WHERE eventtype IS NULL)
BEGIN

-- Append the lowest-numbered continuation line for each id.
UPDATE FT
SET comments = FT.comments + FT2.comments
FROM #FakeTable FT
CROSS APPLY (
    SELECT TOP (1) *
    FROM #FakeTable FT2
    WHERE FT2.id = FT.id AND FT2.eventtype IS NULL
    ORDER BY FT2.line
) AS FT2
WHERE FT.eventtype IS NOT NULL

-- Remove the continuation row that was just appended.
DELETE FROM FT
FROM #FakeTable FT
INNER JOIN (
    SELECT *, ROW_NUMBER() OVER(PARTITION BY id ORDER BY line) AS row_num
    FROM #FakeTable
    WHERE eventtype IS NULL
) AS FT2 ON FT2.row_num = 1 AND FT2.id = FT.id AND FT2.line = FT.line

END /*WHILE*/

SELECT 'After', * FROM #FakeTable
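For the other comment-like columns mentioned earlier, one sketch (assuming a hypothetical extra varchar(max) column named notes alongside comments) is simply to widen the SET list of the UPDATE inside the loop; the DELETE step is unchanged. ISNULL guards are added here so a NULL fragment doesn't wipe out the accumulated value:

```
-- Sketch only: "notes" is a hypothetical second continuation column.
UPDATE FT
SET comments = ISNULL(FT.comments, '') + ISNULL(FT2.comments, ''),
    notes    = ISNULL(FT.notes, '')    + ISNULL(FT2.notes, '')
FROM #FakeTable FT
CROSS APPLY (
    SELECT TOP (1) *
    FROM #FakeTable FT2
    WHERE FT2.id = FT.id AND FT2.eventtype IS NULL
    ORDER BY FT2.line
) AS FT2
WHERE FT.eventtype IS NOT NULL
```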

I will not be able to test this on real data until tomorrow morning. Thank you for your help so far, Scott! I will be in touch here ....

No problem. Of course be sure to do this on a test copy of the data!