Comments on "Microsoft SQL Server Integration Services: Redirect duplicate rows" by Joost van Rossum (15 comments)

Anonymous, 2014-03-06:
Awesome... saved my day. Thanks a bunch!

Anonymous, 2013-11-06:
I appreciate you posting this article with the steps in detail. Thank you so much.

Anonymous, 2013-08-22:
I am new to SSIS, and that was pretty cool. I understand what is going on, but I need to review the code to understand how you make it work. Super post. Thanks!

Paritosh, 2013-05-09:
That can also be achieved by changing the data source from a table to a query: add a column to the query with ROW_NUMBER() (with a PARTITION BY clause), use a CTE to select all rows plus the new row-number column, and then use a Conditional Split to route all records based on the ROW_NUMBER value.

Anonymous, 2013-04-25:
Thank you so much. It helped me a lot in removing the duplicates from a table as well as from a flat file.

Joost van Rossum, 2013-02-06:
You can add a request for that at Connect (https://connect.microsoft.com/SQLServer/Feedback) and then promote it...

Anonymous, 2013-02-06:
This is a great article. I needed the row count of the duplicates found so we could fail the entire load and reject the source file, and this provided that row count. Does anyone know if the Sort transformation will get a redirect option in the next release of SSIS?

Kavitha, 2012-07-22:
Hi Joost, :) It is working fine now; I am able to get the duplicate values. My mistake was that in the Sort transformation I had selected the "remove duplicate values" checkbox, so the Sort transformation was already filtering out all the duplicates. I am using SQL Server 2008. Thanks for the script; I am totally new to this. Thank you very much for helping.

Joost van Rossum, 2012-07-20:
Hi Kavitha, steps 4 and 5 are the most important. Make sure you followed each step thoroughly. Contact me via the contact form (http://microsoft-ssis.blogspot.com/p/contact-me.html).
1) Did you have any errors?
2) Did you add input columns without re-editing the script?
3) What are the data types?
4) Which version of SSIS do you use, and are you using VB or C#?

Kavitha, 2012-07-20:
Hi Joost, the source used was an XML source. I followed all the steps; unique records come through properly, but duplicate records are not being displayed. If I add a breakpoint, the script just keeps running; I am not able to stop it and check why the duplicate key is not working. Can you please help?

Anonymous, 2012-07-17:
Mate, you can't use the Aggregate: you'll lose one column.

Joost van Rossum, 2012-07-12:
@Mohd: All roads lead to Rome, but please show me how you would accomplish the same result with the Aggregate transformation. I'm curious about your solution.

Anonymous, 2012-07-12:
Excellent article... but the same can be achieved using the Aggregate transformation, which is more efficient and takes less time.

Anonymous, 2012-06-25:
Very useful article. It can even be used for redirecting error rows as well. Thanks for the explanations.

Anonymous, 2012-05-27:
Very useful and helpful article. The only one that showed me how to properly manage duplicates from flat files. Keywords for French search: SSIS et gestion des doublons (SSIS and handling of duplicates).
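Editor's note: Paritosh's suggestion in the comments above (number the rows in the source query itself with ROW_NUMBER() over a PARTITION BY, then route on that column in a Conditional Split) can be sketched as follows. This is a minimal illustration, not code from the article: the table, column names, and data are made up, and SQLite stands in for SQL Server, since both support ROW_NUMBER() OVER (PARTITION BY ...).

```python
import sqlite3

# Hypothetical source table with one duplicated row.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Clients (Name TEXT, City TEXT)")
con.executemany(
    "INSERT INTO Clients VALUES (?, ?)",
    [("Joost", "Amsterdam"), ("Joost", "Amsterdam"), ("Kavitha", "Delhi")],
)

# The CTE adds a RowNumber column per duplicate group; in SSIS a
# Conditional Split would then test RowNumber == 1 (first occurrence)
# versus RowNumber > 1 (duplicate) and redirect accordingly.
query = """
WITH Numbered AS (
    SELECT Name, City,
           ROW_NUMBER() OVER (PARTITION BY Name, City ORDER BY Name) AS RowNumber
    FROM Clients
)
SELECT Name, City, RowNumber FROM Numbered
"""
rows = con.execute(query).fetchall()
unique_rows = [r for r in rows if r[2] == 1]      # would go to the normal output
duplicate_rows = [r for r in rows if r[2] > 1]    # would be redirected
print(len(unique_rows), len(duplicate_rows))      # prints: 2 1
```

The trade-off versus the Script Component approach from the article: this pushes the deduplication work into the database engine, but it only works when the source is a relational query, not a flat file or XML source.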