tag:blogger.com,1999:blog-23030581998159589462024-03-17T09:30:54.206+01:00Microsoft SQL Server Integration ServicesJoost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.comBlogger246125tag:blogger.com,1999:blog-2303058199815958946.post-77628002489496780402021-02-13T19:32:00.000+01:002021-02-13T19:32:02.200+01:00Start and Stop SSIS Integration Runtimes with ADF only<b><span style="font-size: large;">Case</span></b><div>I want to stop and start my SSIS Integration Runtime from within my Azure Data Factory pipeline, but I don't want to write any <a href="http://microsoft-ssis.blogspot.com/2018/02/pause-and-resume-integration-runtime-to.html" target="_blank">code</a> or use other Azure services like Azure Automation or Azure Logic Apps to do this. Is there a solution that uses only the standard pipeline activities of Azure Data Factory?</div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-vi00v5fsESU/YCgXSFw_n0I/AAAAAAAAH-Q/3nCuqV0QtkERYr2Z8dXhWDNfrrUYAlGtQCLcBGAsYHQ/s278/ssisir01.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="278" data-original-width="241" height="400" src="https://1.bp.blogspot.com/-vi00v5fsESU/YCgXSFw_n0I/AAAAAAAAH-Q/3nCuqV0QtkERYr2Z8dXhWDNfrrUYAlGtQCLcBGAsYHQ/w347-h400/ssisir01.png" width="347" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">SSIS Integration Runtime</td></tr></tbody></table><br /><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><b><span style="font-size: large;">Solution</span></b></div><div>Yes, there is a no-code solution where you use the Web Activity to call the Rest API of Integration Runtimes (as part of ADF), but oddly enough that requires you to give ADF permissions to its own Integration Runtime via its Managed Service Identity (MSI).</div><div><br />
<br /><div><b>1) Give ADF access to ADF via MSI</b></div><div>For this example we will give ADF access to its own resources. Giving access is done via MSI (managed service identity). The minimum role needed is Data Factory Contributor, but you could also use a regular Contributor or Owner (but less is more).</div><div><ul style="text-align: left;"><li>Go to the Data Factory in the Azure Portal</li><li>In the left menu click on Access control (IAM)</li><li>Click on the +Add button and choose Add role assignment</li><li>Select Data Factory Contributor as Role</li><li>Use Data Factory as Assign access to</li><li>Changing the subscription is probably not necessary</li><li>Optionally enter a (partial) name of your parent ADF (if you have a lot of data factories)</li><li>Select your ADF and click on the Save button</li></ul></div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-vZ_hCepbLMU/YCgWD0LU-yI/AAAAAAAAH-I/I80zio1b9bQDLEu49CcPnKete3VZtoR6wCLcBGAsYHQ/s1409/ssisir02.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="718" data-original-width="1409" height="204" src="https://1.bp.blogspot.com/-vZ_hCepbLMU/YCgWD0LU-yI/AAAAAAAAH-I/I80zio1b9bQDLEu49CcPnKete3VZtoR6wCLcBGAsYHQ/w400-h204/ssisir02.png" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Giving ADF access to its own resources</td></tr></tbody></table><br /><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><br /><br /><br />
<b>2) Add Web Activity</b><br />
In your ADF pipeline you need to add a Web Activity to call the Rest API of the integration runtimes. The first step is to determine the Rest API URL. In the string below, replace the <xxx> values with the subscription id, resource group, data factory name and the name of the integration runtime. The Rest API method we will be using is '<a href="https://docs.microsoft.com/en-us/rest/api/datafactory/integrationruntimes/start" target="_blank">Start</a>', but you can replace that word with '<a href="https://docs.microsoft.com/en-us/rest/api/datafactory/integrationruntimes/stop" target="_blank">Stop</a>' to pause the SSIS IR:</div><div><span style="font-size: x-small;"><b>https://management.azure.com/subscriptions/<xxx>/resourceGroups/</b></span><b style="font-size: small;"><xxx></b><span style="font-size: x-small;"><b>/providers/Microsoft.DataFactory/factories/</b></span><b style="font-size: small;"><xxx></b><span style="font-size: x-small;"><b>/integrationRuntimes/</b></span><b style="font-size: small;"><xxx></b><span style="font-size: x-small;"><b>/start?api-version=2018-06-01</b></span><br />
<br />
Example:<br />
<b style="font-size: small;">https://management.azure.com/subscriptions/a74a173e-4d8a-48d9-9ab7-a0b85abb98fb/resourceGroups/bitools/providers/Microsoft.DataFactory/factories/bitools/integrationRuntimes/bitoolsir/start?api-version=2018-06-01</b><br />
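As a sketch, the URL pattern above can also be composed programmatically. The helper below is only an illustration of the URL format from this post; the function name and parameters are my own, not part of any Azure SDK:

```python
# Sketch: compose the Integration Runtime Rest API URL from its parts.
# The URL pattern and api-version come from the post above; the helper
# itself is illustrative, not an official SDK function.

def ir_action_url(subscription_id: str, resource_group: str,
                  factory_name: str, ir_name: str,
                  action: str = "start") -> str:
    """Build the management URL for 'start', 'stop' or 'getStatus'."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.DataFactory/factories/{factory_name}"
        f"/integrationRuntimes/{ir_name}"
        f"/{action}?api-version=2018-06-01"
    )

print(ir_action_url("a74a173e-4d8a-48d9-9ab7-a0b85abb98fb",
                    "bitools", "bitools", "bitoolsir"))
```

Passing "stop" or "getStatus" as the action gives you the other two URLs used later in this post.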
<br />
The second step is to create a JSON message for the Rest API. The Rest API itself ignores the body, but the Web activity requires one when you use POST as the method, so a dummy JSON message is enough:<br />
<pre class="brush: plain; toolbar: false;">{
"Dummy": "Dummy"
}
</pre>
<ul>
<li>Add the <b>Web</b> activity to your pipeline</li>
<li>Give it a descriptive name like <b>Start SSIS</b> (or Stop SSIS)</li>
<li>Go to the Settings tab</li>
<li>Use the Rest API URL from above in the <b>URL </b>property</li>
<li>Choose <b>POST </b>as Method</li>
<li>Add the dummy JSON message from above in the <b>Body </b>property</li>
<li>Under advanced choose <b>MSI </b>as Authentication method</li>
<li>Add 'https://management.azure.com/' in the <b>Resource </b>property</li>
</ul>
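Put together, the Web activity in the pipeline JSON looks roughly like the sketch below. This is a hand-written approximation of the ADF pipeline schema, so verify it against the JSON view of your own pipeline; the URL placeholder stands for the full Rest API URL from above:

```json
{
  "name": "Start SSIS",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://management.azure.com/subscriptions/<xxx>/resourceGroups/<xxx>/providers/Microsoft.DataFactory/factories/<xxx>/integrationRuntimes/<xxx>/start?api-version=2018-06-01",
    "method": "POST",
    "body": { "Dummy": "Dummy" },
    "authentication": {
      "type": "MSI",
      "resource": "https://management.azure.com/"
    }
  }
}
```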
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-2B_jV3frU2Y/YCgLO0Gvt_I/AAAAAAAAH9w/HycDqUwbxh4uaJZnWakeu0JZbBMk4Rh0gCLcBGAsYHQ/s774/ssisir03.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="768" data-original-width="774" height="398" src="https://1.bp.blogspot.com/-2B_jV3frU2Y/YCgLO0Gvt_I/AAAAAAAAH9w/HycDqUwbxh4uaJZnWakeu0JZbBMk4Rh0gCLcBGAsYHQ/w400-h398/ssisir03.png" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Web Activity calling the SSIS IR Rest API</td></tr></tbody></table><div class="separator" style="clear: both; text-align: center;"><br /></div>
Now run the pipeline by hitting the debug button in the pipeline editor and check the output.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/--_K4DNblhRI/YCgT_Gl5A4I/AAAAAAAAH98/uu9rQa9G-vYMgGbXjNvXtFypSKFPPbNygCLcBGAsYHQ/s980/ssisir04.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="459" data-original-width="980" height="188" src="https://1.bp.blogspot.com/--_K4DNblhRI/YCgT_Gl5A4I/AAAAAAAAH98/uu9rQa9G-vYMgGbXjNvXtFypSKFPPbNygCLcBGAsYHQ/w400-h188/ssisir04.png" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Then Debug the Pipeline to check the stop/start action</td></tr></tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br /><br /><br />
<br />
<br />
<br /><br />
<br />
<b>3) Retrieve info</b><br />
By changing the operation in the URL (stop or start) to '<a href="https://docs.microsoft.com/en-us/rest/api/datafactory/integrationruntimes/getstatus" target="_blank">getStatus</a>', you can retrieve the current status of the integration runtime. With this information you could for example first check the status before changing it. The expression in the If condition could be something like:<br />@equals(activity('Get SSIS IR Status').output.properties.state,'Stopped')</div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-EIu4u-mqzuU/YCgJt54PpPI/AAAAAAAAH9Y/2SZYJWd6wD8XrhH46BDUDN8BUSoNBWhOwCLcBGAsYHQ/s812/ssisir05.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="812" data-original-width="778" height="400" src="https://1.bp.blogspot.com/-EIu4u-mqzuU/YCgJt54PpPI/AAAAAAAAH9Y/2SZYJWd6wD8XrhH46BDUDN8BUSoNBWhOwCLcBGAsYHQ/w384-h400/ssisir05.png" width="384" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Using 'getstatus' operation to retrieve current status</td></tr></tbody></table><br /><div><br /></div><table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left;"><tbody><tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-WdONFE6BTKU/YCgJlC9XfdI/AAAAAAAAH9Q/ihhnmCkDP7EVZ46cM6fx1LkAnbXP33YDgCLcBGAsYHQ/s981/ssisir06.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="630" data-original-width="981" height="258" src="https://1.bp.blogspot.com/-WdONFE6BTKU/YCgJlC9XfdI/AAAAAAAAH9Q/ihhnmCkDP7EVZ46cM6fx1LkAnbXP33YDgCLcBGAsYHQ/w400-h258/ssisir06.png" width="400" /></a></td></tr><tr><td class="tr-caption" style="text-align: center;">Status available in output</td></tr></tbody></table><br /><div><br /></div><div><br /></div><div><br /></div><div><br />
</div>
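To sketch the decision logic behind that If condition: the getStatus response contains a properties.state field, and you only want to start the IR when it reports 'Stopped'. The Python helper below mirrors that check on a sample response; the helper name and the sample payload are mine, based on the activity output shown above:

```python
import json

# Sketch of the check the If condition performs: only start the SSIS IR
# when getStatus reports it as 'Stopped'. The response layout mirrors
# the activity output above; this helper is illustrative, not an ADF API.

def should_start(get_status_response: str) -> bool:
    state = json.loads(get_status_response)["properties"]["state"]
    return state == "Stopped"   # same test as @equals(..., 'Stopped')

sample = '{"name": "bitoolsir", "properties": {"state": "Stopped"}}'
print(should_start(sample))  # True
```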
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br /><br /><br />
<br />
<br /><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><br /></div><div><b><span style="font-size: large;">Conclusion</span></b></div><div>In this post you learned how easy it is to add a stop and start option in your pipeline to save some money on your Azure bill. Check out my other <a href="https://microsoft-bitools.blogspot.com/p/azure-data-lake-what-is-azure-data-lake.html" target="_blank">blog</a> (https://microsoft-bitools.blogspot.com/) for more Rest API solutions in Azure Data Factory.</div>Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Nederland52.132633 5.291265999999999423.822399163821153 -29.864984 80.442866836178837 40.447516tag:blogger.com,1999:blog-2303058199815958946.post-19916262656348847982020-01-09T21:12:00.001+01:002020-01-09T21:37:38.355+01:00Fixed IP addresses for ADF Integration Runtimes<b><span style="font-size: large;">Case</span></b><br />
I want to give my Integration Runtime access to my sources via a firewall rule and block other machines or services. How do I arrange that?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-ajaof6gCipg/XheDR3SwB1I/AAAAAAAAGMo/q5ChDaiW5NkjVzbEicCpzj04Mpa0i_3GACLcBGAsYHQ/s1600/AzureFireWall.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="311" data-original-width="366" height="338" src="https://1.bp.blogspot.com/-ajaof6gCipg/XheDR3SwB1I/AAAAAAAAGMo/q5ChDaiW5NkjVzbEicCpzj04Mpa0i_3GACLcBGAsYHQ/s400/AzureFireWall.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Firewall exceptions for SSIS IR</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
Good news! Microsoft <a href="https://docs.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses#azure-integration-runtime-ip-addresses-specific-regions" target="_blank">published a list of IP addresses</a> per Azure region for the Integration Runtimes in Azure Data Factory. This means you can narrow down the list of machines accessing your sources: only Integration Runtimes from a specific region (like West Europe) can get access. Perhaps not enough for everybody, but it is better than giving ALL Azure services access to, for example, your Azure SQL Database.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-6jXPBldV9rs/XheFDBC8VBI/AAAAAAAAGM0/VpO4mJiDEL8Pmwa3prERlNQAHmqhlCI9wCLcBGAsYHQ/s1600/AzureFireWall02.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="55" data-original-width="343" src="https://1.bp.blogspot.com/-6jXPBldV9rs/XheFDBC8VBI/AAAAAAAAGM0/VpO4mJiDEL8Pmwa3prERlNQAHmqhlCI9wCLcBGAsYHQ/s1600/AzureFireWall02.png" /></a></div>
<br />
<br />
<br />
<br />
<br />
<br />
Note 1: The IP addresses are listed as CIDR. For some firewalls you have to convert those to an IP range. You can use a <a href="https://www.ipaddressguide.com/cidr" rel="nofollow" target="_blank">CIDR to IPv4 calculator</a> to convert them. For example:<br />
40.74.26.0/23 => 40.74.26.0 to 40.74.27.255 (512 hosts in total)<br />
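You can also do that conversion with Python's standard ipaddress module, which reproduces the example above:

```python
import ipaddress

# Convert the CIDR notation from the published list to a plain IP range.
net = ipaddress.ip_network("40.74.26.0/23")
first, last = net[0], net[-1]
print(f"{first} to {last} ({net.num_addresses} hosts in total)")
# 40.74.26.0 to 40.74.27.255 (512 hosts in total)
```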
<br />
Note 2: It is not clear how often this list of IP addresses changes, so you might want to put the URL of the list in your documentation for future reference.<br />
<br />
Note 3: Azure Data Factory Data Flows do not use the same IP addresses. A list of <a href="https://www.microsoft.com/en-us/download/details.aspx?id=56519" target="_blank">all Azure IP addresses</a> can be downloaded as a JSON file. This JSON file gets updated on a weekly basis.<br />
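If you script against that weekly JSON download, it is roughly a "values" array of service tags, each with a list of address prefixes under "properties". The sketch below filters out the prefixes for one tag; the field names are assumptions based on the structure of that download and may change, so check them against the actual file:

```python
import json

# Hedged sketch: pull the address prefixes for one service tag out of
# the downloadable Azure IP ranges JSON. The field names ("values",
# "name", "properties", "addressPrefixes") are assumptions based on the
# file's structure - verify them against your own download.

sample_download = json.dumps({
    "values": [
        {"name": "DataFactory.WestEurope",
         "properties": {"addressPrefixes": ["40.74.26.0/23"]}},
        {"name": "Sql.WestEurope",
         "properties": {"addressPrefixes": ["40.68.37.158/32"]}},
    ]
})

def prefixes_for(tag_name: str, raw_json: str) -> list:
    doc = json.loads(raw_json)
    return [p for v in doc["values"] if v["name"] == tag_name
            for p in v["properties"]["addressPrefixes"]]

print(prefixes_for("DataFactory.WestEurope", sample_download))
```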
<br />
<br />
<br />
<br />
<span style="font-size: xx-small;">Credits <a href="https://www.linkedin.com/in/geertendekruijf/" rel="nofollow" target="_blank">Geerten de Kruijf</a></span><br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-51392447938382694682019-12-06T19:27:00.002+01:002019-12-06T19:27:44.709+01:00Azure DevOps - New Microsoft SSIS Deploy task <span style="font-size: large;"><b>Case</b></span><br />
Until now we needed <a href="https://marketplace.visualstudio.com/search?term=ssis&target=AzureDevOps&category=Azure%20Pipelines&sortBy=Relevance" target="_blank">Third Party</a> tasks or <a href="https://microsoft-ssis.blogspot.com/2019/06/azure-devops-build-ssis-project-ci.html" target="_blank">PowerShell </a>to deploy an SSIS project in DevOps. Now Microsoft finally released its own SSIS DevOps tasks. How does it work?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-3EMf_AJmG_s/XcfVw9dYdJI/AAAAAAAAGDg/0lUeq6WgyMwNrcSTMDwUSbsNo_TKh_tpwCLcBGAsYHQ/s1600/SSISDeploy00.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="283" data-original-width="643" height="175" src="https://1.bp.blogspot.com/-3EMf_AJmG_s/XcfVw9dYdJI/AAAAAAAAGDg/0lUeq6WgyMwNrcSTMDwUSbsNo_TKh_tpwCLcBGAsYHQ/s400/SSISDeploy00.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Replace PowerShell code with Microsoft SSIS Deploy task</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="font-size: large;"><b>Solution</b></span><br />
Microsoft just released the <a href="https://marketplace.visualstudio.com/items?itemName=SSIS.ssis-devops-tools" target="_blank">SSIS Deploy task</a> (public preview) which makes it much easier to deploy an SSIS project. Below you will find the codeless steps to deploy artifacts created by the <a href="https://microsoft-ssis.blogspot.com/2019/12/azure-devops-new-microsoft-ssis-build.html" target="_blank">SSIS Build task</a>.<br />
<br />
<b>1) New Release Pipeline - Empty Job</b><br />
The first step is to create a new Release Pipeline. For this example we will use an empty job to start with and later on add tasks manually.<br />
<ul>
<li>Go to the Pipelines menu on the left</li>
<li>Go to Releases</li>
<li>Click on + New to create a New Release pipeline</li>
<li>Next choose 'Empty job'</li>
</ul>
<br />
The next step is to give the first Stage its name. A Stage is an environment, like acceptance or production, where you want to deploy your SSIS packages. For this example we will have three stages / environments: Dev/Tst, Acc and Prd. The next two stages will be created by cloning the first one, but first we will add and configure the new deploy task so that you don't have to repeat yourself.<br />
<ul>
<li>After choosing 'Empty job' a first stage is created. Rename it 'DEV/TST'</li>
<li>Close the Stage pane by clicking the cross in the upper right corner</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-UCUfCEIJNdk/XcHsnVB8DsI/AAAAAAAAGBE/wTrqyYbSTH4xIYyawkHXzWgE_GektZFIACLcBGAsYHQ/s1600/release01.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-UCUfCEIJNdk/XcHsnVB8DsI/AAAAAAAAGBE/wTrqyYbSTH4xIYyawkHXzWgE_GektZFIACLcBGAsYHQ/s400/release01.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create Release Pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Select Artifact</b><br />
In the Artifact pane you need to select the Artifact (the ispac file) that you created in the <a href="http://microsoft-ssis.blogspot.com/2019/12/azure-devops-new-microsoft-ssis-build.html" target="_blank">Build Pipeline</a>. So look up the name before you start.<br />
<ul>
<li>In the Artifacts pane click on + Add to add an artifact</li>
<li>Choose the project if you have multiple DevOps projects</li>
<li>Select the Artifact of the CI pipeline</li>
<li>Select Latest version</li>
<li>Optionally rename the source alias</li>
<li>Click on the Add button</li>
</ul>
<br />
We also need to determine whether we want to use the Continuous deployment trigger or perhaps another trigger or a schedule. For this example we will use the CD trigger.<br />
<ul>
<li>Click on the lightning icon</li>
<li>Enable Build trigger (to release after each new successful build)</li>
<li>Close the CD trigger pane by clicking the cross in the upper right corner</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-LDt8gf_wAes/XcSC7eSYZ5I/AAAAAAAAGBQ/wFGjLlITzxwnL-48uthZx381T-g0TVL2QCLcBGAsYHQ/s1600/release02.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-LDt8gf_wAes/XcSC7eSYZ5I/AAAAAAAAGBQ/wFGjLlITzxwnL-48uthZx381T-g0TVL2QCLcBGAsYHQ/s400/release02.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Artifact and enable Continuous deployment</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Add variables</b><br />
For this example we need three variables: SsisServer, SsisDBUsername and SsisDBPassword<br />
<ul>
<li>Go to the Variables tab</li>
<li>Click on the + Add to add the first variable</li>
<li>Enter the name of the variable</li>
<li>Enter the value of the variable</li>
<li>Hit the padlock icon when it's a sensitive value like a password</li>
<li>Select the first stage 'Dev/Tst' as scope</li>
</ul>
<br />
Repeat this for all three variables and, <b>after the next step</b>, repeat it for each Stage. Each stage gets the same three variables, but with a different scope (Acc or Prd) and with different values.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-35EF4twGTrc/XcXejlOaR1I/AAAAAAAAGBo/Q0ql4Gd3TL4id2WNWcNFv4XMAQfgBiHLACLcBGAsYHQ/s1600/release04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-35EF4twGTrc/XcXejlOaR1I/AAAAAAAAGBo/Q0ql4Gd3TL4id2WNWcNFv4XMAQfgBiHLACLcBGAsYHQ/s400/release04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add variables to DevOps</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>4) Add task to first stage</b><br />
Now we need to add a task to do the actual deployment of the SSIS ispac file. This is where the new Microsoft SSIS Deploy DevOps task is used. Note: Click on the Info-icon behind each field title to get more information about the desired value for that field.<br />
<ul>
<li>Within the first Stage pane, click on the '1 job, 0 task' link to go to the job</li>
<li>Optionally change the settings of the Agent job (for example its name)</li>
<li>Click on the + icon in the Agent Job to add a new task</li>
<li>Search for 'SSIS Deploy'</li>
<li>Add the task called SSIS Deploy (This is the official Deploy Task for SSIS DevOps from Microsoft). Note: There are various third party tasks with the same name.</li>
<li>Click on the new task and rename it to Deploy SSIS [projectname]</li>
<li>Use the ellipsis button to browse to the ISPAC file</li>
<li>Use SSISDB as Destination Type</li>
<li>Use $(SsisServer) as Destination Server (this retrieves the value of the variable)</li>
<li>Use "/SSISDB/[projectname]" as Destination Path (replace it with the foldername you need)</li>
<li>Use SQL Server Authentication as Authentication type</li>
<li>Use $(SsisDBUsername) as Username (this retrieves the value of the variable)</li>
<li>Use $(SsisDBPassword) as Password (this retrieves the value of the variable)</li>
<li>Leave Overwrite on (you will usually want to overwrite an existing deployment)</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-K_SrqFZzdVU/Xeqbq1Q-HRI/AAAAAAAAGJw/113YBslaJJcC_8aesWQqYAbfBG93SkrNgCLcBGAsYHQ/s1600/release03a.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-K_SrqFZzdVU/Xeqbq1Q-HRI/AAAAAAAAGJw/113YBslaJJcC_8aesWQqYAbfBG93SkrNgCLcBGAsYHQ/s400/release03a.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add SSIS Deploy task to Release pipeline</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
<br /></div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Add stages</b><br />
Now we need to clone the Dev/Tst stage to Acc and to Prd.<br />
<ul>
<li>Click on the Pipeline tab if you don't see your first Stage</li>
<li>Hover over the first stage, wait for the clone button to appear and click it</li>
<li>Next rename it to Acc</li>
<li>Repeat this for Prd</li>
</ul>
<br />
Notice the red exclamation mark before the variables tab after adding a stage: you need to add the same variables for each stage<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-HYOQ2wE4rCI/XcXjGPSC5ZI/AAAAAAAAGB0/shQ-hFLtFrguoXFzhjbd5U-USQc-Oa1xwCLcBGAsYHQ/s1600/release05.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-HYOQ2wE4rCI/XcXjGPSC5ZI/AAAAAAAAGB0/shQ-hFLtFrguoXFzhjbd5U-USQc-Oa1xwCLcBGAsYHQ/s400/release05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding stages for Acc and Prd</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
For each Stage you can determine how that Stage starts. These are the 'Pre-deployment conditions'. For example for Dev/Tst you could do an automatic deployment, for Acc you first need an approval from one person and for Prd you need two approvals.<br />
<br />
<ul>
<li>Click on the lightning/person icon on the left side of each Stage to set it</li>
<li>Dev/Tst will use the 'After release' trigger and the next stages will use the 'After stage' trigger</li>
<li>For Acc and Prd you could add Pre-deployment approvals by selecting one of your team members.</li>
<li>Close the Pre-deployment conditions pane by clicking the cross in the upper right corner</li>
<li>Repeat this for all stages</li>
</ul>
<br />
And don't forget to add the variables when you have finished the Pre-deployment settings.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-hMaxTwEgOdo/XcXn53jO2eI/AAAAAAAAGCA/E3wPtk3OcdUyRLzx-MfWAP7Egz8yKmAowCLcBGAsYHQ/s1600/release06.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-hMaxTwEgOdo/XcXn53jO2eI/AAAAAAAAGCA/E3wPtk3OcdUyRLzx-MfWAP7Egz8yKmAowCLcBGAsYHQ/s400/release06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding Pre-deployment conditions for each stage</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Test</b><br />
Now create a new release by activating the build task. After this a new artifact will be created and this will trigger the release pipeline. You can also hit the Create Release button in the upper right corner to use the last created artifact. The artifact will automatically be released to the first stage, but the next step waits for approval (click on the blue ACC button to approve).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-x7uVKjjXFsA/XcXsEV47BkI/AAAAAAAAGCU/m-OCjlmCU0crV5HfFsUm_j3wkn2UFF_VwCLcBGAsYHQ/s1600/release08.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="447" data-original-width="1402" height="127" src="https://1.bp.blogspot.com/-x7uVKjjXFsA/XcXsEV47BkI/AAAAAAAAGCU/m-OCjlmCU0crV5HfFsUm_j3wkn2UFF_VwCLcBGAsYHQ/s400/release08.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Dev/Tst successful, Acc waiting for approval</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Note: You can also check the result of the release by clicking on stage DEV/TST and then click on the SSIS Deployment step. If it failed you can see the error messages per step.<br />
<div style="text-align: left;">
</div>
<br />
<br />
<b><span style="font-size: large;">Conclusion</span></b><br />
In this post you saw how easy it is to deploy an SSIS project to the Integration Services Catalog with this new task, and it is easy to switch between various Authentication types. One small disadvantage is that you don't have any options for SSIS Catalog Environments: you either have to configure those manually or write some PowerShell or TSQL code after the deployment.<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com2Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-91090628591989922932019-12-06T08:50:00.001+01:002019-12-18T09:09:57.975+01:00Azure DevOps - New Microsoft SSIS Build task<b><span style="font-size: large;">Case</span></b><br />
Until now we needed <a href="https://marketplace.visualstudio.com/search?term=ssis&target=AzureDevOps&category=Azure%20Pipelines&sortBy=Relevance" target="_blank">Third Party</a> tasks or <a href="https://microsoft-ssis.blogspot.com/2019/06/azure-devops-build-ssis-project-ci.html" target="_blank">PowerShell (with NuGet)</a> to build an SSIS project in DevOps. Now Microsoft finally released its own SSIS DevOps tasks. How does it work?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/--XHd68Wrh_0/Xcb04MfQ_VI/AAAAAAAAGCs/89t14ttp5W8xR7rM6brYE_9EdrE40PQ1wCLcBGAsYHQ/s1600/ssisbuild00.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="573" data-original-width="641" height="357" src="https://1.bp.blogspot.com/--XHd68Wrh_0/Xcb04MfQ_VI/AAAAAAAAGCs/89t14ttp5W8xR7rM6brYE_9EdrE40PQ1wCLcBGAsYHQ/s400/ssisbuild00.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Four tasks that will be replaced by one new task</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
Microsoft just released <a href="https://marketplace.visualstudio.com/items?itemName=SSIS.ssis-devops-tools" target="_blank">SSIS Build</a> (public preview) which makes it much easier to build an SSIS project. Below you will find the codeless steps to build and publish the artifacts so that they can be <a href="https://microsoft-ssis.blogspot.com/2019/12/azure-devops-new-microsoft-ssis-deploy.html" target="_blank">deployed</a>.<br />
<br />
<b>1) Create new build pipeline</b><br />
In this step we will create an empty build pipeline which is connected to Azure Repos Git where our SSIS project is located.<br />
<ul>
<li>Go to the Pipelines menu in DevOps and then to Builds. In the list of pipelines you will find the + New sign on top. Click on it and choose New build pipeline.</li>
<li>Now you can choose where your code is located. However, at the bottom you will find the option "<b>Use the classic editor to create a pipeline without YAML</b>". After clicking on it you must select the source you want to use for building the code (Azure Repos Git in our example).</li>
<li>The next step is to create an empty job/pipeline and give it a useful name.</li>
</ul>
<div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-1AyKgFiemkM/XccDeZaTmjI/AAAAAAAAGC4/NVLan_ZFAL0Fs0scUqoRduZ1WJIgJN65ACLcBGAsYHQ/s1600/ssisbuild02.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-1AyKgFiemkM/XccDeZaTmjI/AAAAAAAAGC4/NVLan_ZFAL0Fs0scUqoRduZ1WJIgJN65ACLcBGAsYHQ/s400/ssisbuild02.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create new empty build pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br /></div>
<div>
<b>2) Add SSIS Build task</b></div>
<div>
In this step we will add the new SSIS Build task from Microsoft. Make sure to pick the right one because there are several tasks with the same name. You want to look for "by Microsoft Corporation" or "by BeisiZhou".</div>
<ul>
<li>Click on the plus icon behind "Agent job 1" (you probably want to give that one a better name), search for "SSIS Build" and choose the SSIS Build task from Microsoft (see animated gif)</li>
<li>Edit the new SSIS Build task. The only thing you need to change is the Project Path property. You can use the ellipsis button to browse to your SSIS project file.</li>
</ul>
<div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/--WV9zCM2QSw/XccDpXQCkXI/AAAAAAAAGC8/-nuDBjR12ksIdETDm0w2CE1OopzZXBTOQCLcBGAsYHQ/s1600/ssisbuild03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/--WV9zCM2QSw/XccDpXQCkXI/AAAAAAAAGC8/-nuDBjR12ksIdETDm0w2CE1OopzZXBTOQCLcBGAsYHQ/s400/ssisbuild03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add SSIS Build task by Microsoft</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b>3) Publish Artifact</b><br />
The previous task will create an IsPac file that we need to publish so that we can use it in a release pipeline later on.</div>
<ul>
<li>Click on the plus icon behind "Agent job 1", search for "Publish artifact" and choose the Publish build artifacts task from Microsoft (see animated gif)</li>
<li>The only thing you need to change is the Artifact name. Don't use the default 'drop', but use your SSIS project name instead.</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-dx0qACDa5Dk/XccD1o8ntkI/AAAAAAAAGDE/cgXCKJXqufg5ZOlGxIjnVlOG4wUKeO0egCLcBGAsYHQ/s1600/ssisbuild04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-dx0qACDa5Dk/XccD1o8ntkI/AAAAAAAAGDE/cgXCKJXqufg5ZOlGxIjnVlOG4wUKeO0egCLcBGAsYHQ/s400/ssisbuild04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Publish artifact (IsPac)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>4) Add Trigger</b><br />
To trigger this build pipeline automatically when changes are committed to the master branch we need to add a trigger. Go to the Triggers menu and enable continuous integration. Then add a path filter so that this build will only run when this specific project changes. Save the build pipeline and trigger it by committing a change, or queue a run manually.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-XD5bxbi3ma4/Xccst7Le5OI/AAAAAAAAGDU/PeS4zoX4-Uw1uSI4fFEvfl04TjdgfCDbwCLcBGAsYHQ/s1600/ssisbuild05.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-XD5bxbi3ma4/Xccst7Le5OI/AAAAAAAAGDU/PeS4zoX4-Uw1uSI4fFEvfl04TjdgfCDbwCLcBGAsYHQ/s400/ssisbuild05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add trigger</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
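For reference, the four classic-editor steps above can also be written as a YAML pipeline. The snippet below is only a sketch: the SSISBuild task name and its inputs are assumptions based on the SSIS DevOps Tools extension (verify them after installing it), and the project paths are placeholders for your own repository layout.<br />
<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML equivalent of the classic build pipeline above.
# Task and input names are assumptions; check them against the
# installed SSIS DevOps Tools extension before using this.
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - SSIS/MyProject/*     # only build when this project changes

pool:
  vmImage: 'windows-2019'  # SSIS builds require a Windows agent

steps:
- task: SSISBuild@1        # SSIS Build task by Microsoft
  inputs:
    projectPath: 'SSIS/MyProject/MyProject.dtproj'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)'
    ArtifactName: 'MyProject'  # use your SSIS project name, not 'drop'
</pre>
<br />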
<b><span style="font-size: large;">Conclusion</span></b><br />
This new task is much easier to use than the PowerShell code and also easier than most of the third party tasks. With a little practice you can now create a build pipeline in under two minutes, which is probably faster than the build itself.<br />
<br />
If your build fails with the following error message then you are probably using a custom task or component (like the Blob Storage Download Task). These tasks are not installed on the build agents hosted by Microsoft. You can either <a href="https://erwindekreuk.com/2019/02/azure-devops-and-azure-feature-pack-for-integration-services/" target="_blank">install them via PowerShell</a> or use a self-hosted agent where you can install all custom components.<br />
<br />
<span style="background-color: black; color: white; font-family: "consolas" , "courier new" , monospace; font-size: 8px; white-space: pre;">System.ArgumentException: Value does not fall within the expected range.</span><br />
<div style="background-color: black; color: white; font-family: Consolas, "Courier New", monospace; font-size: 8px; line-height: 8px; white-space: pre;">
at Microsoft.SqlServer.Dts.Runtime.Interop.ProjectInterop.ReferencePackage(Package package, String packageLocation)<br />
at Microsoft.SqlServer.Dts.Runtime.PackageItem.Load(IDTSEvents events)<br />
at Microsoft.SqlServer.Dts.Runtime.PackageItem.get_Package()<br />
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.IncrementalBuildThroughObj(IOutputWindow outputWindow)<br />
at Microsoft.DataTransformationServices.Project.DataTransformationsProjectBuilder.BuildIncremental(IOutputWindow outputWindow)<br />
ERR:The process 'C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\devenv.exe' failed with exit code 1<br />
Build D:\a\1\s\SSIS\STG\STG_MySource\STG_MySource.dtproj failed<br />
ERR:Build failed</div>
<div style="background-color: black; color: #e92d3d; font-family: Consolas, "Courier New", monospace; font-size: 8px; line-height: 8px; white-space: pre;">
##[error]Build failed
</div>
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-4153561929898147932019-11-08T23:50:00.002+01:002019-12-06T19:30:27.932+01:00Azure DevOps - Deploy SSIS project (CD) <b><span style="font-size: large;">Case</span></b><br />
Recently you showed how to <a href="https://microsoft-ssis.blogspot.com/2019/06/azure-devops-build-ssis-project-ci.html" target="_blank">build an SSIS project</a> in DevOps, but how do you deploy an SSIS project within DevOps?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-OZ8MdMY6-10/Xb8_CTRDD5I/AAAAAAAAGA4/kfHM2K5AEigGR4siGkga6JiwCmqy110VQCLcBGAsYHQ/s1600/ReleaseSSIS01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="435" data-original-width="1286" height="135" src="https://1.bp.blogspot.com/-OZ8MdMY6-10/Xb8_CTRDD5I/AAAAAAAAGA4/kfHM2K5AEigGR4siGkga6JiwCmqy110VQCLcBGAsYHQ/s400/ReleaseSSIS01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Release SSIS in DevOps</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
There are no out-of-the-box tasks yet, but there are a couple of Third Party DevOps tasks for SSIS which you could try out. For this example we will use a little <a href="http://microsoft-ssis.blogspot.com/2015/07/deploying-ispac-files-with-powershell.html" target="_blank">PowerShell code</a> which we have used before.<br />
<i>Update Dec 6: New <a href="https://microsoft-ssis.blogspot.com/2019/12/azure-devops-new-microsoft-ssis-deploy.html" target="_blank">Microsoft SSIS Deploy task</a></i><br />
<br />
<b>1) New Release Pipeline - Empty Job</b><br />
The first step is to create a new Release Pipeline. For this example we will use an empty job to start with and later on add tasks manually.<br />
<ul>
<li>Go to the Pipelines menu on the left</li>
<li>Go to Releases</li>
<li>Click on + New to create a New Release pipeline</li>
<li>Next choose 'Empty job'</li>
</ul>
The next step is to give the first Stage its name. A Stage is an environment like acceptance or production that you want to deploy to. For this example we will have three stages/environments: Dev/Tst, Acc and Prd. The next two stages will be created by cloning, but first we will add tasks so that you don't have to repeat yourself.<br />
<ul>
<li>After choosing 'Empty job' a first stage is created. Rename it 'DEV/TST'</li>
<li>Close the Stage pane by clicking the cross in the upper right corner</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-UCUfCEIJNdk/XcHsnVB8DsI/AAAAAAAAGBE/wTrqyYbSTH4xIYyawkHXzWgE_GektZFIACLcBGAsYHQ/s1600/release01.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-UCUfCEIJNdk/XcHsnVB8DsI/AAAAAAAAGBE/wTrqyYbSTH4xIYyawkHXzWgE_GektZFIACLcBGAsYHQ/s400/release01.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create Release Pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><br /></b>
<b><br /></b>
<b>2) Select Artifact</b><br />
In the Artifact pane you need to select the Artifact (the ispac file) you created in the <a href="https://microsoft-ssis.blogspot.com/2019/06/azure-devops-build-ssis-project-ci.html" target="_blank">Build Pipeline</a>. So look up its name before you start.<br />
<ul>
<li>In the Artifacts pane click on + Add to add an artifact</li>
<li>Choose the project if you have multiple DevOps projects</li>
<li>Select the Artifact of the CI pipeline</li>
<li>Select Latest version</li>
<li>Optionally rename the source alias</li>
<li>Click on the Add button</li>
</ul>
We also need to determine whether we want to use the Continuous deployment trigger or perhaps another trigger or a schedule. For this example we will use the CD trigger.<br />
<ul>
<li>Click on the lightning icon</li>
<li>Enable Build trigger (to release after each new successful build)</li>
<li>Close the CD trigger pane by clicking the cross in the upper right corner</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-LDt8gf_wAes/XcSC7eSYZ5I/AAAAAAAAGBQ/wFGjLlITzxwnL-48uthZx381T-g0TVL2QCLcBGAsYHQ/s1600/release02.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-LDt8gf_wAes/XcSC7eSYZ5I/AAAAAAAAGBQ/wFGjLlITzxwnL-48uthZx381T-g0TVL2QCLcBGAsYHQ/s400/release02.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Artifact and enable Continuous deployment</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Add task to first stage</b><br />
Now we need to add a task to do the actual deployment of the SSIS ispac file. There are a couple of third party SSIS deployment tasks, but the instructions and examples of those are not always that clear. Hopefully Microsoft will create their own SSIS DevOps tasks soon. For now we will use a little PowerShell scripting.<br />
<ul>
<li>Within the first Stage pane, click on the '1 job, 0 task' link to go to the job</li>
<li>Optionally change the settings of the Agent job (for example its name)</li>
<li>Click on the + icon in the Agent Job to add a new task</li>
<li>Search for 'Powershell'</li>
<li>Add the task called PowerShell (Run a PowerShell script on Linux, macOS, or Windows)</li>
<li>Click on the new task and rename it to Deploy SSIS project</li>
<li>Select Inline as Type</li>
<li>Paste the script below in the Script textbox</li>
</ul>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-aWUrVrbPM6c/XcSLXowyTuI/AAAAAAAAGBc/1e9emjkECIUgvo_faVQVyl7XNv3MbuEXwCLcBGAsYHQ/s1600/release03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-aWUrVrbPM6c/XcSLXowyTuI/AAAAAAAAGBc/1e9emjkECIUgvo_faVQVyl7XNv3MbuEXwCLcBGAsYHQ/s400/release03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add PowerShell code to deploy SSIS</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Notice the Params piece in the PowerShell code. You could hardcode each value, but for this example we will use Variables from the Release pipeline to set the value of these parameters in the next step. The string <b>"$(SsisServer)"</b> in the code will be replaced by the value of the DevOps variable and result in: <b>"bitools.database.windows.net"</b> (see next step).<br />
<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
# Params
$SsisServer = "$(SsisServer)"
$SSISDBUsername = "$(SsisDBUsername)"
$SSISDBPassword = "$(SsisDBPassword)"
$FolderName = "RM"

# Mask the password to show something on
# screen, but not the actual password.
# This is for testing purposes only.
$SSISDBPasswordMask = $SSISDBPassword -replace '.', '*'

Write-Host "================================================================================"
Write-Host "== Used parameters =="
Write-Host "================================================================================"
Write-Host "SSIS Server  : " $SsisServer
Write-Host "SQL Username : " $SSISDBUsername
Write-Host "SQL Password : " $SSISDBPasswordMask
Write-Host "================================================================================"

############## ASSEMBLY ###############
# Add SSIS assembly so you can do SSIS stuff in PowerShell
Write-Host "Referencing SSIS assembly"
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
# Load the IntegrationServices Assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null;

############## SQL SERVER ###############
Write-Host "Connecting to Azure SQL DB server $($SsisServer)"
# Create a connectionstring for the Azure DB Server
# Make sure you use SSISDB as the Initial Catalog!
$SqlConnectionstring = "Data Source=$($SsisServer);User ID=$($SSISDBUsername);Password=$($SSISDBPassword);Initial Catalog=SSISDB;"
# Create a connection object
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
# Check if the connection works
Try
{
    $SqlConnection.Open();
    Write-Host "Connected to Azure SQL DB server $($SsisServer)"
}
Catch [System.Data.SqlClient.SqlException]
{
    Throw [System.Exception] "Failed to connect to Azure SQL DB server $($SsisServer), exception: $($_)"
}

############## SSISDB ###############
# Create the Integration Services object
$IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection
# Check if SSISDB connection succeeded
if (-not $IntegrationServices)
{
    Throw [System.Exception] "Failed to connect to SSISDB on $($SsisServer)"
}
else
{
    Write-Host "Connected to SSISDB on $($SsisServer)"
}
# Create object for SSISDB Catalog
$Catalog = $IntegrationServices.Catalogs["SSISDB"]
# Check if the SSISDB Catalog exists
if (-not $Catalog)
{
    # Catalog doesn't exist. The user should create it manually.
    # It is possible to create it, but that shouldn't be part of
    # deployment of packages.
    # Also make sure the catalog is SSISDB and not master or any
    # other database.
    Throw [System.Exception] "SSISDB catalog doesn't exist. Create it manually!"
}
else
{
    Write-Host "Catalog SSISDB found"
}

############## CATALOG FOLDER ###############
# Create object to the (new) folder
$Folder = $Catalog.Folders[$FolderName]
# Check if folder already exists
if (-not $Folder)
{
    # Folder doesn't exist, so create the new folder.
    Write-Host "Creating new folder" $FolderName
    $Folder = New-Object $SsisNamespace".CatalogFolder" ($Catalog, $FolderName, $FolderName)
    $Folder.Create()
}
else
{
    Write-Host "Folder" $FolderName "found"
}

############## LOOP ISPACS ###############
# Note: the filter below is hardcoded to this example's project
Get-ChildItem -Filter "StandardReportGenerator.ispac" -Recurse | Where-Object { -Not ($_.FullName -match "obj") } | ForEach-Object {
    ############## PROJECT ###############
    $IspacFilePath = $_.FullName
    # Check if ispac file exists
    if (-Not (Test-Path $IspacFilePath))
    {
        Throw [System.IO.FileNotFoundException] "Ispac file $IspacFilePath doesn't exist!"
    }
    else
    {
        $IspacFileName = Split-Path $IspacFilePath -Leaf
        Write-Host "Ispac file" $IspacFileName "found"
    }
    # Get project name from ispac file
    $ProjectName = [System.IO.Path]::GetFileNameWithoutExtension($IspacFilePath)
    # Read ispac file as binary
    [byte[]] $IspacFile = [System.IO.File]::ReadAllBytes($IspacFilePath)
    $Folder.DeployProject($ProjectName, $IspacFile) | Out-Null
    $Project = $Folder.Projects[$ProjectName]
    if (-not $Project)
    {
        # Something went wrong with the deployment.
        # Don't continue with the rest of the script.
        return ""
    }
    Return "Ready deploying $IspacFileName"
}
</pre>
<br />
Note: this script uses a database user to connect to SSISDB. For other authentication methods you need to change the code block starting around line 30.<br />
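As a sketch, switching from SQL authentication to Windows Integrated authentication would only require a different connection string in that block. The keywords below are standard SqlClient connection-string keywords, but this variant is untested here and only works on a (self-hosted) agent whose account has access to SSISDB:<br />
<br />
<pre class="brush: powershell; toolbar: false;"># Sketch: Integrated Security instead of a database user.
# Only works when the account running the agent can reach SSISDB;
# Microsoft-hosted agents normally cannot use your domain account.
$SqlConnectionstring = "Data Source=$($SsisServer);Initial Catalog=SSISDB;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
</pre>
<br />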
<br />
<b>4) Add variables</b><br />
For this example we need three variables: SsisServer, SsisDBUsername and SsisDBPassword<br />
<ul>
<li>Go to the Variables tab</li>
<li>Click on the + Add to add the first variable</li>
<li>Enter the name of the variable</li>
<li>Enter the value of the variable</li>
<li>Hit the padlock icon when it's a sensitive value like a password</li>
<li>Select the first stage 'Dev/Tst' as scope</li>
</ul>
Repeat this for all three variables and later on, after the next step, repeat this for each Stage. Just add the same variables but change the scope to Acc or Prd.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-35EF4twGTrc/XcXejlOaR1I/AAAAAAAAGBo/Q0ql4Gd3TL4id2WNWcNFv4XMAQfgBiHLACLcBGAsYHQ/s1600/release04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-35EF4twGTrc/XcXejlOaR1I/AAAAAAAAGBo/Q0ql4Gd3TL4id2WNWcNFv4XMAQfgBiHLACLcBGAsYHQ/s400/release04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add variables to DevOps</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Add stages</b><br />
Now we need to clone the Dev/Tst stage to Acc and to Prd.<br />
<ul>
<li>Click on the Pipeline tab if you don't see your first Stage</li>
<li>Hover above the first stage and wait for the clone button to appear and click it</li>
<li>Next rename it to Acc</li>
<li>Repeat this for Prd</li>
</ul>
Notice the red exclamation mark before the variables tab after adding a stage: you need to add the same variables for each stage.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-HYOQ2wE4rCI/XcXjGPSC5ZI/AAAAAAAAGB0/shQ-hFLtFrguoXFzhjbd5U-USQc-Oa1xwCLcBGAsYHQ/s1600/release05.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-HYOQ2wE4rCI/XcXjGPSC5ZI/AAAAAAAAGB0/shQ-hFLtFrguoXFzhjbd5U-USQc-Oa1xwCLcBGAsYHQ/s400/release05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding stages for Acc and Prd</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
For each Stage you can determine how that Stage starts. These are the 'Pre-deployment conditions'. For example for Dev/Tst you could do an automatic deployment, for Acc you first need an approval from one person and for Prd you need two approvals.<br />
<ul>
<li>Click on the lightning/person icon on the left side of each Stage to set it</li>
<li>Dev/Tst will use the 'After release' trigger and the next stages will use the 'After stage' trigger</li>
<li>For Acc and Prd you could add Pre-deployment approvals by selecting one of your team members.</li>
<li>Close the Pre-deployment conditions pane by clicking the cross in the upper right corner</li>
<li>Repeat this for all stages</li>
</ul>
And don't forget to add the variables when you have finished the Pre-deployment settings.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-hMaxTwEgOdo/XcXn53jO2eI/AAAAAAAAGCA/E3wPtk3OcdUyRLzx-MfWAP7Egz8yKmAowCLcBGAsYHQ/s1600/release06.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="850" data-original-width="1383" height="245" src="https://1.bp.blogspot.com/-hMaxTwEgOdo/XcXn53jO2eI/AAAAAAAAGCA/E3wPtk3OcdUyRLzx-MfWAP7Egz8yKmAowCLcBGAsYHQ/s400/release06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding Pre-deployment conditions for each stage</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Test</b><br />
Now create a new release by running the build pipeline. After that a new artifact will be created, which triggers the release pipeline. You can also hit the Create Release button in the upper right corner to use the last created artifact. The artifact will automatically be released to the first stage, but the next stage waits for approval (click on the blue ACC button to approve).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-x7uVKjjXFsA/XcXsEV47BkI/AAAAAAAAGCU/m-OCjlmCU0crV5HfFsUm_j3wkn2UFF_VwCLcBGAsYHQ/s1600/release08.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="447" data-original-width="1402" height="127" src="https://1.bp.blogspot.com/-x7uVKjjXFsA/XcXsEV47BkI/AAAAAAAAGCU/m-OCjlmCU0crV5HfFsUm_j3wkn2UFF_VwCLcBGAsYHQ/s400/release08.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Dev/Tst successful, Acc waiting for approval</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
You can also check the result of the release by clicking on Dev/Tst and then click on the PowerShell step. If it failed you can see the error messages per step.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-FW0_cpVp0zs/XcXreOHNAkI/AAAAAAAAGCM/k9Qgkmfi9vAcw6SzK9s3mTahzv58PrpjgCLcBGAsYHQ/s1600/release07.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="522" data-original-width="1253" height="166" src="https://1.bp.blogspot.com/-FW0_cpVp0zs/XcXreOHNAkI/AAAAAAAAGCM/k9Qgkmfi9vAcw6SzK9s3mTahzv58PrpjgCLcBGAsYHQ/s400/release07.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">See result of deployment</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Conclusions</span></b><br />
In this post you saw how to create a Release pipeline with several Stages and learned how these deployment stages are activated with the Pre-deployment conditions. Unfortunately there are no out-of-the-box Microsoft SSIS DevOps tasks yet, which forces you to use either Third Party tasks or PowerShell. Also take a look around and explore all the other options in the release pipeline.<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com1Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-54428506269250767972019-09-20T11:25:00.003+02:002019-09-20T13:31:42.959+02:00GIT Snack: missing buttons in Team Explorer<b><span style="font-size: large;">Case</span></b><br />
I can commit and sync, but where are the GIT buttons Pull Request, Work Items and Builds in Visual Studio's Team Explorer?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-dR8Vbit9cUg/XYR4WrocQvI/AAAAAAAAF9o/DKy_s_Ni0rkIoJ786tGatqza901HkQjagCLcBGAsYHQ/s1600/SSDTGIT01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="284" data-original-width="487" height="232" src="https://1.bp.blogspot.com/-dR8Vbit9cUg/XYR4WrocQvI/AAAAAAAAF9o/DKy_s_Ni0rkIoJ786tGatqza901HkQjagCLcBGAsYHQ/s400/SSDTGIT01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Missing 3 Git buttons: Pull Request, Work Items and Builds</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
You are not connected to the project in GIT. Let's connect to your project:<br />
<ol>
<li>In Visual Studio/SSDT go to <b>Team</b> in the menu and click on <b>Manage Connections...</b></li>
<li>In the Team Explorer an option "<b>Manage Connections</b>" appears. Click on it and choose <b>Connect to a Project...</b></li>
<li>A new GIT window "<b>Connect to a Project</b>" appears, find and select your project and click on the <b>Connect</b> button.</li>
</ol>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-64djlmvz5MU/XYR_ZZ3Z0hI/AAAAAAAAF90/ojUYpV8KWYMMfX_qOxEV8E50csogbzBJQCLcBGAsYHQ/s1600/SSDTGIT02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="847" data-original-width="1600" height="211" src="https://1.bp.blogspot.com/-64djlmvz5MU/XYR_ZZ3Z0hI/AAAAAAAAF90/ojUYpV8KWYMMfX_qOxEV8E50csogbzBJQCLcBGAsYHQ/s400/SSDTGIT02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Connect to GIT project</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
After connecting, the three buttons Pull Request, Work Items and Builds will appear in the Team Explorer pane and you can start a Pull Request from within Visual Studio/SSDT.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-pSBQatWoA2A/XYS4ewZHVoI/AAAAAAAAF-M/nG5WGfsbUI883GHjhAWq6EWunEzfnk02wCLcBGAsYHQ/s1600/SSDTGIT03.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="343" data-original-width="485" height="282" src="https://1.bp.blogspot.com/-pSBQatWoA2A/XYS4ewZHVoI/AAAAAAAAF-M/nG5WGfsbUI883GHjhAWq6EWunEzfnk02wCLcBGAsYHQ/s400/SSDTGIT03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">All buttons are now available</td></tr>
</tbody></table>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Nederland52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-28844112892458846312019-06-12T20:53:00.003+02:002019-12-06T19:31:32.919+01:00Azure DevOps - Build SSIS project (CI)<b><span style="font-size: large;">Case</span></b><br />
I have my SSIS project in Git (Azure Repos Git) and I want to build my project in Azure DevOps. How do I add Continuous Integration (CI) for my SSIS project?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-EsVKWaHkZrc/XQEmhAb5wlI/AAAAAAAAF6g/D5foRQm3BzgbaiMBWJtAX3-_R71eKGKsQCLcBGAs/s1600/devopsssisci00.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="573" data-original-width="641" height="286" src="https://1.bp.blogspot.com/-EsVKWaHkZrc/XQEmhAb5wlI/AAAAAAAAF6g/D5foRQm3BzgbaiMBWJtAX3-_R71eKGKsQCLcBGAs/s320/devopsssisci00.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - Build pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
For this example we start with an SSIS project already in Azure Repos Git and build the project automatically when committing changes. Deployment (CD) of SSIS will be handled in a <a href="https://microsoft-ssis.blogspot.com/2019/11/azure-devops-deploy-ssis-project-cd.html" target="_blank">separate post</a>.<br />
<i>Update Dec 6: New <a href="https://microsoft-ssis.blogspot.com/2019/12/azure-devops-new-microsoft-ssis-deploy.html" target="_blank">Microsoft SSIS Build task</a></i><br />
<br />
<b>1) New Build Pipeline</b><br />
Go to Pipelines and then to Builds. In the list of pipelines you will find the + New sign. Click on it and choose New build pipeline. Now you can choose where your code is located. However, at the bottom you will find the option "Use the classic editor to create a pipeline without YAML." For this example we will use this option, since SSIS is a bit old school. As a last step, choose the Empty job and give your CI pipeline a useful name.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-K9J4f_sLQ6E/XP67hkbdnEI/AAAAAAAAF5M/bZOs38LH3u0Oczri719c5t05nXQje63vQCLcBGAs/s1600/devopsssisci01.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="659" data-original-width="1228" height="213" src="https://1.bp.blogspot.com/-K9J4f_sLQ6E/XP67hkbdnEI/AAAAAAAAF5M/bZOs38LH3u0Oczri719c5t05nXQje63vQCLcBGAs/s400/devopsssisci01.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Creating new CI pipeline in DevOps for SSIS</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) NuGet Tool Installer</b><br />
Since we need an SSIS build tool that is available as a NuGet package, we first need to install NuGet itself. Click on the plus icon behind "Agent job 1" (you should give that a better name), search for "NuGet", choose the NuGet Tool Installer and determine which version of NuGet you need. For this example I used 5.1.0, but click on the information icon behind the version field to get a list of available versions.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-NTWcpkL_sMY/XQAaNhm6gCI/AAAAAAAAF5Y/ZLXnRrhUNLsXUrJu-QtzhgpCXJqtdPEsgCLcBGAs/s1600/devopsssisci02.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="220" src="https://1.bp.blogspot.com/-NTWcpkL_sMY/XQAaNhm6gCI/AAAAAAAAF5Y/ZLXnRrhUNLsXUrJu-QtzhgpCXJqtdPEsgCLcBGAs/s400/devopsssisci02.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - NuGet Tool Installer</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
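If you prefer a YAML pipeline over the classic editor, this NuGet Tool Installer step could be sketched as follows (an untested sketch using the standard Azure DevOps task names):<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML equivalent of step 2
steps:
- task: NuGetToolInstaller@1
  displayName: 'Use NuGet 5.1.0'
  inputs:
    versionSpec: '5.1.0'
</pre>
<br />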
<b>3) NuGet SSISBuild</b><br />
Now that we have installed NuGet itself, we can install a so-called NuGet package. The one we need is called <a href="https://www.nuget.org/packages/SSISBuild/" target="_blank">SSISBuild</a>. Add a Task to the agent job and search for NuGet again. Choose NuGet and change the command to Custom. Then enter the following command to install SSISBuild on the build server: <i>install SSISBuild -Version 2.3.0</i><br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-HC9h5A0yi8c/XQAdlKuDloI/AAAAAAAAF5k/J87NW77id4g58QzR7WbSNd_BdbATUPuRwCLcBGAs/s1600/devopsssisci03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="221" src="https://1.bp.blogspot.com/-HC9h5A0yi8c/XQAdlKuDloI/AAAAAAAAF5k/J87NW77id4g58QzR7WbSNd_BdbATUPuRwCLcBGAs/s400/devopsssisci03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - NuGet install SSISBuild</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
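In a YAML pipeline this custom NuGet command could be sketched like this (an untested sketch; NuGetCommand is the standard Azure DevOps NuGet task):<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML equivalent of step 3
- task: NuGetCommand@2
  displayName: 'NuGet install SSISBuild'
  inputs:
    command: 'custom'
    arguments: 'install SSISBuild -Version 2.3.0'
</pre>
<br />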
<b>4) PowerShell to build SSIS project</b><br />
Now the most important step: building the SSIS project. We will use a PowerShell task for this. Add a PowerShell task (it runs PowerShell on Windows, macOS or Linux). Then change the script type to Inline and copy & paste the code below. On the first line of code (excluding comments) you have to specify the path of your SSIS project, starting from the solution folder up to and including the <i>.dtproj</i> extension. In the code, a predefined variable called Build.SourcesDirectory will be concatenated to this path to determine the full project path on the build server.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-VZnFPgVjfho/XQAiqO9FHbI/AAAAAAAAF5w/LoD5hxzCqhMa6L0Ven5cZKX0U3JrfZD1QCLcBGAs/s1600/devopsssisci04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="221" src="https://1.bp.blogspot.com/-VZnFPgVjfho/XQAiqO9FHbI/AAAAAAAAF5w/LoD5hxzCqhMa6L0Ven5cZKX0U3JrfZD1QCLcBGAs/s400/devopsssisci04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - PowerShell task to build the SSIS project</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
# The path of the project starting from the solution folder
$ProjectToBuild = "\SSIS\RM\StandardReportGenerator\StandardReportGenerator.dtproj"

# Join the path of the build server with the path of the project
$ProjectFilePath = Join-Path -Path $env:BUILD_SOURCESDIRECTORY -ChildPath $ProjectToBuild

# Check if the file exists on the build server
if (!(Test-Path $ProjectFilePath)) {
    # Not found, throw error
    throw "Project $($ProjectFilePath) not found!"
}
else {
    # Call ssisbuild.exe with parameters to build this project
    &"$($env:BUILD_SOURCESDIRECTORY)\SSISBuild.2.3.0\tools\ssisbuild.exe" $ProjectFilePath -Configuration Development -ProtectionLevel DontSaveSensitive

    # Check whether the build was successful
    if ($LASTEXITCODE -ne 0) {
        # Build failed, throw error
        throw "Build of $($ProjectFilePath) failed.";
    }
}
</pre>
<br />
You can also test this locally on your own computer. Then you have to download and unzip the SSISBuild NuGet package to your Visual Studio solution folder and add one line of code at the top to fill the predefined variable with the solution path on your local computer.<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
$env:BUILD_SOURCESDIRECTORY = "D:\sources\xxxxxxbi\"
</pre>
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-5BL6cBGvnkM/XQH7blt1WgI/AAAAAAAAF64/2YK8WwMUKXY0X5C8RYs2yNoaKSqSJm2dACLcBGAs/s1600/devopsssis13.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="859" data-original-width="1600" height="213" src="https://1.bp.blogspot.com/-5BL6cBGvnkM/XQH7blt1WgI/AAAAAAAAF64/2YK8WwMUKXY0X5C8RYs2yNoaKSqSJm2dACLcBGAs/s400/devopsssis13.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Test locally with PowerShell ISE</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
The PowerShell script is deliberately basic for now (KISS), but there are several options to extend it. These will be covered in a separate post.<br />
<br />
<b>5) Copy Files</b><br />
Now that we have successfully built the project, it is time to copy the result, the .ispac file, to the Artifact Staging Directory. Add a new task called Copy Files. Select *.ispac as content and use the <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml" target="_blank">predefined variable</a> build.ArtifactStagingDirectory as Target Folder.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-2ypuA01RCzs/XQD7r_YzrMI/AAAAAAAAF58/YVLb4_h2xD0jziy9EdAx_kc04OFypaCFACLcBGAs/s1600/devopsssisci04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="221" src="https://1.bp.blogspot.com/-2ypuA01RCzs/XQD7r_YzrMI/AAAAAAAAF58/YVLb4_h2xD0jziy9EdAx_kc04OFypaCFACLcBGAs/s400/devopsssisci04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - Copy ispac file to stage folder</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
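A rough YAML equivalent of this Copy Files task (an untested sketch; the minimatch pattern and predefined variable come from the standard CopyFiles task):<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML equivalent of step 5
- task: CopyFiles@2
  displayName: 'Copy ispac file to staging directory'
  inputs:
    Contents: '**\*.ispac'
    TargetFolder: '$(build.ArtifactStagingDirectory)'
</pre>
<br />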
<b>6) Publish Build Artifacts</b><br />
The last task is to publish the artifact, the .ispac file, so you can use it in a Release pipeline later on. Add a task called Publish Build Artifacts. By default the same predefined variable build.ArtifactStagingDirectory is used as the path to publish. Rename the artifact so that you can easily find it when building the Release pipeline.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-4ZT3SMR09dg/XQECkrfMGYI/AAAAAAAAF6I/b5Gm-YaBAFYBvc7btfhlKA-BBCs7kAnBQCLcBGAs/s1600/devopsssisci05.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="221" src="https://1.bp.blogspot.com/-4ZT3SMR09dg/XQECkrfMGYI/AAAAAAAAF6I/b5Gm-YaBAFYBvc7btfhlKA-BBCs7kAnBQCLcBGAs/s400/devopsssisci05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - Publish Build Artifacts</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
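And a rough YAML equivalent of this publish task (an untested sketch; the artifact name SSIS is just an example):<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML equivalent of step 6
- task: PublishBuildArtifacts@1
  displayName: 'Publish SSIS artifact'
  inputs:
    PathtoPublish: '$(build.ArtifactStagingDirectory)'
    ArtifactName: 'SSIS'
</pre>
<br />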
<b>7) Add Triggers</b><br />
To trigger this build pipeline automatically when changes are committed to the Master branch we need to add a trigger. Go to the triggers menu and enable continuous integration. Then add a path filter so that this build will only trigger when the specific project changes. Then save the Build pipeline and trigger it by committing a change.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-PXC1VfsurPs/XQEGZa1HhUI/AAAAAAAAF6U/dk5Us99SHKUJTl1phdpEpnaARZrQWgHAwCLcBGAs/s1600/devopsssisci06.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="888" data-original-width="1600" height="221" src="https://1.bp.blogspot.com/-PXC1VfsurPs/XQEGZa1HhUI/AAAAAAAAF6U/dk5Us99SHKUJTl1phdpEpnaARZrQWgHAwCLcBGAs/s400/devopsssisci06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">DevOps - Enable trigger to build SSIS project</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
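In a YAML pipeline the same trigger with a path filter could be sketched like this (an untested sketch; the path is the example project folder from step 4, so adjust it to your own repository):<br />
<pre class="brush: text; toolbar: false;"># Hypothetical YAML trigger with path filter
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - SSIS/RM/StandardReportGenerator/*
</pre>
<br />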
<b><span style="font-size: large;">Conclusion</span></b><br />
In this post I showed you how to build an SSIS project in Azure DevOps with the SSISBuild NuGet package. It took quite a few steps, but most of them are relatively simple and easy to replicate in your own DevOps project.<br />
<br />
One downside of this method is that it is a bit stricter when building your SSIS project. It could even happen that you can build your project successfully in Visual Studio and even deploy it to the catalog, but that the build fails when using the SSISBuild NuGet, throwing this error: <i>ERROR: An item with the same key has already been added.</i><br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-6PKG8yfryzE/XQFCzZd_mSI/AAAAAAAAF6s/rIUrjEzh4ikeH7FdugK7_oWFz2P3j5ZxACLcBGAs/s1600/devopsssisci10.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="655" data-original-width="1525" height="171" src="https://1.bp.blogspot.com/-6PKG8yfryzE/XQFCzZd_mSI/AAAAAAAAF6s/rIUrjEzh4ikeH7FdugK7_oWFz2P3j5ZxACLcBGAs/s400/devopsssisci10.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="font-size: 12.8px;">ERROR: An item with the same key has already been added</span></td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
The likely cause of this error is duplicate Connection Manager GUIDs. This happens when you copy a package (with a package connection manager) in Visual Studio. I solved it by recreating the connection manager manually.<br />
If you have duplicate package connection managers and you convert one of them to a project connection manager, the duplicate connection manager in the copied package will be hidden, making it very hard to find.<br />
<br />
Next step: <a href="https://microsoft-ssis.blogspot.com/2019/11/azure-devops-deploy-ssis-project-cd.html" target="_blank">Deploy your SSIS project</a><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com1Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-1252983672795974262019-04-02T10:47:00.001+02:002020-03-06T13:36:32.587+01:00Stopping an Integration runtime that is already stopping<b><span style="font-size: large;">Case</span></b><br />
My integration runtime is stopping, but it has already been stopping for more than an hour and I see errors in the ADF monitor:
<br />
<br />
<span lang="EN" style="margin: 0px;"><span style="font-family: "calibri";"><b>Error 1</b>: Last
operation 'Stop' get the status 'Failed'. Error Code:
PoolDeleteFailedDueToLockOrReference Error Message: Activity ID:
a67615ac-94ff-4547-9244-728d3ae5b25e </span></span><br />
<br />
<div style="margin: 0px;">
<span lang="EN" style="margin: 0px;"><span style="font-family: "calibri";"><b>Error 2</b>: Error
Code: VNetResourceGroupLockedDuringStop Error Message: We are unable to clean
up unneeded resources when stopping your IR. Please unlock the resource group
containing your VNet and/or remove any reference to the resources created for
your IR, and then retry to stop your IR. </span></span></div>
<br />
<div style="margin: 0px;">
<span lang="EN" style="margin: 0px;"><span style="font-family: "calibri";"><b>Error 3</b>: Error
Code: IntegrationRuntimeStoppingLongerThanUsual Error Message: Your integration
runtime is taking longer than usual to stop, so please retry to stop it.</span></span><span lang="EN-US" style="margin: 0px;"></span></div>
<br />
It suggests retrying the stop of the SSIS IR, but the start and stop buttons are greyed out. How do I retry the stop?<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-S3zWlgrvyvM/XKMMfVSSq8I/AAAAAAAAF2Y/XrdYKQs3IAMaorZac1v5ay6vrsYV1MfCgCLcBGAs/s1600/stopping.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="341" data-original-width="631" height="215" src="https://4.bp.blogspot.com/-S3zWlgrvyvM/XKMMfVSSq8I/AAAAAAAAF2Y/XrdYKQs3IAMaorZac1v5ay6vrsYV1MfCgCLcBGAs/s400/stopping.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Stop and start are greyed out</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
The Azure portal interface does not support retrying the stop at the moment, but you can force-stop an SSIS IR that is already stopping with PowerShell!<br />
<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
# Login with your Azure account (popup will open)
Login-AzureRmAccount
# Your subscription name (needed if you have more than one subscription)
$SubscriptionName = "xxxxxxx"
# Change the current and default Azure subscription
Select-AzureRmSubscription -SubscriptionName $SubscriptionName
Get-AzureRmSubscription -SubscriptionName $SubscriptionName
# Force stopping the SSIS IR
Stop-AzureRmDataFactoryV2IntegrationRuntime -DataFactoryName "MyDataFactoryName" -Name "SSISxxxxxxxIR" -ResourceGroupName "MyResourceGroup" -Force
</pre>
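Note: the AzureRm module used above has since been deprecated in favor of the Az module. With Az installed, the equivalent force stop would look something like this (an untested sketch):<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code (Az module variant)
# Login with your Azure account (popup will open)
Connect-AzAccount
# Select the right subscription (needed if you have more than one)
Set-AzContext -Subscription "xxxxxxx"
# Force stopping the SSIS IR
Stop-AzDataFactoryV2IntegrationRuntime -DataFactoryName "MyDataFactoryName" `
    -Name "SSISxxxxxxxIR" -ResourceGroupName "MyResourceGroup" -Force
</pre>
<br />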
<br />
<br />
Other steps you should take if the PowerShell script does not stop the Integration Runtime:<br />
<ol>
<li>Check if there are locks on the Resource Group or on individual items (like ADF, VNET or SSISDB).</li>
<li>If you are using a VNET, check whether it works. (Create a new SSIS IR and press <b>validate</b> on the VNET step.)</li>
<li>If there are no locks and the VNET is valid, but the Integration Runtime is still stopping, contact the Azure help + support desk, either via Twitter or via an official support request in the portal.</li>
</ol>
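Checking for locks (step 1) can also be done with PowerShell instead of the portal; a minimal sketch using the same AzureRm module as above (the resource group name is just an example):<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
# List all locks in the resource group (an empty result means no locks)
Get-AzureRmResourceLock -ResourceGroupName "MyResourceGroup"
</pre>
<br />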
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-n14WfON15Yg/XKMbnu1V4xI/AAAAAAAAF2k/GzInKoV2w6AmvBSs41ZNz6wuYpJ9RNJnQCLcBGAs/s1600/stopping03.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="467" data-original-width="995" height="187" src="https://1.bp.blogspot.com/-n14WfON15Yg/XKMbnu1V4xI/AAAAAAAAF2k/GzInKoV2w6AmvBSs41ZNz6wuYpJ9RNJnQCLcBGAs/s400/stopping03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">1) Check for locks</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-l7SKx27A7-Q/XKMcWw4mUhI/AAAAAAAAF2s/DarwbnzMpFcJUI_SQpiQ5pPYohlrgBhEQCLcBGAs/s1600/stopping04.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="579" data-original-width="511" height="400" src="https://4.bp.blogspot.com/-l7SKx27A7-Q/XKMcWw4mUhI/AAAAAAAAF2s/DarwbnzMpFcJUI_SQpiQ5pPYohlrgBhEQCLcBGAs/s400/stopping04.png" width="352" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">2) Validate VNET</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-21ID89lcexg/XKMcqSIPHVI/AAAAAAAAF20/4iDMv-xQuUQJqVqqxTLlKxSOS29eUF9dQCLcBGAs/s1600/stopping02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://2.bp.blogspot.com/-21ID89lcexg/XKMcqSIPHVI/AAAAAAAAF20/4iDMv-xQuUQJqVqqxTLlKxSOS29eUF9dQCLcBGAs/s400/stopping02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">3) Add support request (in worst case)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Update</b>: there is a new "retry stop" option available. Click on the Author (pencil) button on the left and then on Connections in the bottom-left corner. Now open the Integration Runtimes tab and click on the new "retry stop" button. (I hope they will add this option to the monitor as well.)<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-ukkkuHVWL1Q/XKzwsL3wMSI/AAAAAAAAF3A/AhxZRDXzKGwp0XO1HrMFDY9exlRoITGDgCLcBGAs/s1600/retrystopir.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="699" data-original-width="1429" height="195" src="https://4.bp.blogspot.com/-ukkkuHVWL1Q/XKzwsL3wMSI/AAAAAAAAF3A/AhxZRDXzKGwp0XO1HrMFDY9exlRoITGDgCLcBGAs/s400/retrystopir.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Retry stopping the SSIS IR</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-29786036025311469222019-02-18T08:02:00.001+01:002019-02-18T08:28:13.891+01:00Power Query Source (Preview)<b><span style="font-size: large;">Case</span></b><br />
Yes, new SSIS functionality! In a <a href="https://microsoft-ssis.blogspot.com/2019/01/introducing-data-flows-in-azure-data.html" target="_blank">previous post</a> I had almost written it off, but Microsoft just introduced the SSIS source that we were all waiting for: the Power Query source. How does it work?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-RmZhtR-9268/XGozHYnT_fI/AAAAAAAAF0w/CD6j-nmQl5QyQYgktted6AViNQCih6iuACLcBGAs/s1600/pqs001.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="189" data-original-width="287" src="https://4.bp.blogspot.com/-RmZhtR-9268/XGozHYnT_fI/AAAAAAAAF0w/CD6j-nmQl5QyQYgktted6AViNQCih6iuACLcBGAs/s1600/pqs001.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">SSIS - Power Query Source</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
First you need to <a href="https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017" target="_blank">download</a> and install the latest SSDT (version 15.9.0) and, since it is still in preview, you can only use it within SSDT or on an Azure-SSIS Integration Runtime in Azure Data Factory.<br />
<br />
If you drag the new Power Query Source to your Data Flow canvas and edit it, you can paste your Power Query script from Power BI (or Excel) in the query textbox. So there is no editor yet(?), but this should simplify a lot of tasks that previously could only be solved with .NET scripting.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-vuH4uJeXTzI/XGpXOrBZHWI/AAAAAAAAF1I/JstQyS2HAdgTaHAQMnuUZlDSACjE_kHxQCLcBGAs/s1600/pqs003.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="997" data-original-width="1594" height="250" src="https://4.bp.blogspot.com/-vuH4uJeXTzI/XGpXOrBZHWI/AAAAAAAAF1I/JstQyS2HAdgTaHAQMnuUZlDSACjE_kHxQCLcBGAs/s400/pqs003.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Paste your Power Query script in the editor</td></tr>
</tbody></table>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<br />
After pasting the Power Query script, go to the Connection Managers pane and create a Power Query Connection Manager by first clicking on Detect Data Source and then adding the new Connection Manager via the drop-down list.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-Fk-Dp_X0Rzo/XGpYGPj5bhI/AAAAAAAAF1Q/67L9JzHHBogXz1fLTrEjrruNLLxXa6DUACLcBGAs/s1600/pqs004.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="885" data-original-width="952" height="371" src="https://1.bp.blogspot.com/-Fk-Dp_X0Rzo/XGpYGPj5bhI/AAAAAAAAF1Q/67L9JzHHBogXz1fLTrEjrruNLLxXa6DUACLcBGAs/s400/pqs004.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Power Query Connection Manager</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Preview notes:</b><br />
<ul>
<li>Only works in SSDT or on an Azure-SSIS IR in ADF</li>
<li>Web source does not work on an Azure-SSIS IR with custom setup</li>
<li>Oracle source only works via Oracle ODBC driver on an Azure-SSIS IR </li>
</ul>
<br />
<br />
For more details read <a href="https://docs.microsoft.com/en-us/sql/integration-services/data-flow/power-query-source?view=sql-server-2017" target="_blank">Power Query Source (Preview)</a><br />
<br />
<b><span style="font-size: large;">Conclusion</span></b><br />
This new source was announced (and canceled) a long time ago, but it is finally available. This preview version is still very basic and has some limitations. I'm not sure whether they will integrate an editor like the one in Excel and Power BI, but let's hope they will.<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-11905715884707473762019-02-17T17:50:00.000+01:002019-02-18T08:20:55.681+01:00Updated ADF Pipeline Activity for SSIS packages<b><span style="font-size: large;">Case</span>
</b><br />
In April 2018 Microsoft added the SSIS pipeline activity to ADF. A couple of days ago they released a new version. What are the changes?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-ImLo7-5QZq8/XGmOVZvMvII/AAAAAAAAFzs/__BdHjhjndkFar4FEBbsSexJS4nt-GOvgCLcBGAs/s1600/SSISactivity00.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="727" data-original-width="1050" height="276" src="https://1.bp.blogspot.com/-ImLo7-5QZq8/XGmOVZvMvII/AAAAAAAAFzs/__BdHjhjndkFar4FEBbsSexJS4nt-GOvgCLcBGAs/s400/SSISactivity00.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ADF - Execute SSIS Package activity</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
The big difference is that you can now select your packages and environment instead of typing the full paths manually, but there is also a new "Manual entries" option which allows you to set parameters (and connection managers) directly within the pipeline.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-vDocWgEnKlw/XGmOm4hMBcI/AAAAAAAAFz0/b8FB_-g1keAOLnJ1OpiZPwt_51YXZJzDACLcBGAs/s1600/SSISactivity01.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="806" data-original-width="1600" height="201" src="https://4.bp.blogspot.com/-vDocWgEnKlw/XGmOm4hMBcI/AAAAAAAAFz0/b8FB_-g1keAOLnJ1OpiZPwt_51YXZJzDACLcBGAs/s400/SSISactivity01.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Selecting package and environment</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Below you see the old version of the SSIS activity, where you had to type the full paths manually.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-Hh_US9lcANw/WtEibiDylYI/AAAAAAAAFfY/L1H-KU-ApX4cbXQTpfxs74YcCpYiI_AQgCLcBGAs/s1600/ssisactivity04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1407" height="246" src="https://3.bp.blogspot.com/-Hh_US9lcANw/WtEibiDylYI/AAAAAAAAFfY/L1H-KU-ApX4cbXQTpfxs74YcCpYiI_AQgCLcBGAs/s400/ssisactivity04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The old manual entry</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Conclusion</span></b><br />
Again much easier than before, but in case of package errors still not very helpful: it still forces you to search for errors in the SSIS catalog. If you are using the ADF monitor to check for errors, I would probably still prefer the <a href="http://microsoft-ssis.blogspot.com/2018/03/get-ssis-messages-to-adf-monitor.html" target="_blank">Stored Procedure activity</a>.Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-47841667819605611152019-01-30T13:43:00.006+01:002019-01-30T14:30:23.856+01:00Introducing Data Flows in Azure Data Factory <span style="font-size: large;"><b>Case</b></span><br />
Since last year we can run the good old SSIS packages in the Azure cloud: not on a self-created and self-maintained virtual machine, but with the Integration Runtime service in Azure Data Factory.<br />
<br />
However, Microsoft also introduced Azure Databricks and called it <a href="https://www.youtube.com/watch?v=lquF9x8Lw8E" target="_blank">ETL 2.0</a>, so I'm not sure how much effort Microsoft will put into any future release of SSIS. But Azure Databricks is not a visual tool and not everybody is into coding. Is there an alternative?<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-iLiAVh-6r34/XFF0TGPx46I/AAAAAAAAFxc/hfTktejUOG8y-IVz1rzUFTYKw2FszswFQCLcBGAs/s1600/DataFlow00.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="660" data-original-width="1223" height="215" src="https://1.bp.blogspot.com/-iLiAVh-6r34/XFF0TGPx46I/AAAAAAAAFxc/hfTktejUOG8y-IVz1rzUFTYKw2FszswFQCLcBGAs/s400/DataFlow00.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Visual ETL in Azure</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
You can probably keep on using SSIS for another decade, but if the development of SSIS indeed stagnates while the rest of the data platform doesn't, then it is good to learn a couple of new tricks. Within the Microsoft Data Platform we have a couple of alternatives for SSIS:<br />
<br />
<br />
<ol>
<li><b>Azure Databricks</b><br />As mentioned above this requires learning some new coding skills since this isn't a visual development tool. I will post an introduction in a later blog post.</li>
<li><b>Azure Data Factory with Pipelines and T-SQL</b><br />You could use the Copy Data activity in combination with the Stored Procedure activity and build all transformations in T-SQL. If you are a diehard SSIS developer then this is probably not your cup of tea.</li>
<li><b>Power BI Dataflows</b><br />The Power BI team just introduced self-service ETL within Power BI. We just added a <a href="http://microsoft-bitools.blogspot.com/2018/12/power-bi-introducing-data-flows.html" target="_blank">new post</a> to explain this new option, which will make some people very excited.</li>
<li><b>Azure Data Factory Dataflows</b><br />This is a new <b>preview</b> feature in Azure Data Factory to visually create ETL flows. Below I will show you the steps to create your own first simple Data Flow.</li>
</ol>
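To make option 2 concrete, a minimal pipeline for that pattern could look like the JSON sketch below: a Copy Data activity that stages the data, followed by a Stored Procedure activity that runs the T-SQL transformations. The dataset, linked service and stored procedure names are made up for illustration.<br />
<pre><code>{
  "name": "StageAndTransformPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyToStaging",
        "type": "Copy",
        "inputs": [ { "referenceName": "SourceBlobDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "StagingTableDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "BlobSource" },
          "sink": { "type": "SqlSink" }
        }
      },
      {
        "name": "TransformInTSQL",
        "type": "SqlServerStoredProcedure",
        "dependsOn": [ { "activity": "CopyToStaging", "dependencyConditions": [ "Succeeded" ] } ],
        "linkedServiceName": { "referenceName": "AzureSqlDatabase", "type": "LinkedServiceReference" },
        "typeProperties": { "storedProcedureName": "[dbo].[usp_TransformStaging]" }
      }
    ]
  }
}</code></pre>
<br />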
<br />
<br />
<b>1) Request preview access</b><br />
If you are reading this during the preview period (early 2019), then first request access via <a href="http://aka.ms/dataflowpreview"> this form</a>. You need to provide your Azure subscription GUID, and Microsoft will then turn this new feature on for that specific subscription.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-eTZRm5M6oGI/XEgzlaeXuVI/AAAAAAAAFuM/ww4y3Nu96EsLhLqlU0h84ODOlxGCLFH9QCLcBGAs/s1600/DataFlow01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="643" data-original-width="583" height="400" src="https://2.bp.blogspot.com/-eTZRm5M6oGI/XEgzlaeXuVI/AAAAAAAAFuM/ww4y3Nu96EsLhLqlU0h84ODOlxGCLFH9QCLcBGAs/s400/DataFlow01.png" width="362" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Check 'Version' for the preview version</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Create Databricks Service</b><br />
Yes, you are reading this correctly. Under the hood Data Factory uses Databricks to execute the Data Flows, but don't worry, you don't have to write code.<br />
Create a Databricks Service and choose the right region. This should be the same as your storage region to prevent high data-movement costs. As Pricing Tier you can use Standard for this introduction. Creating the service itself doesn't cost anything.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-yJht9Z20MRY/XEsLnWSzsHI/AAAAAAAAFus/3DvgDPAdGMch_GaaLsormzUhzUzzSakJwCLcBGAs/s1600/DataFlow03.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="552" data-original-width="822" height="267" src="https://2.bp.blogspot.com/-yJht9Z20MRY/XEsLnWSzsHI/AAAAAAAAFus/3DvgDPAdGMch_GaaLsormzUhzUzzSakJwCLcBGAs/s400/DataFlow03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create Databricks Service</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Create Databricks cluster</b><br />
Now go to the newly created service and click on the Launch Workspace button under the big Databricks logo in the center.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-P3CWFDRDKSk/XEse-WXQoMI/AAAAAAAAFvQ/LKqiIaX90gIkLv6gniB5bP5E8mJ_TMvIgCLcBGAs/s1600/DataFlow06.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="678" data-original-width="1531" height="176" src="https://1.bp.blogspot.com/-P3CWFDRDKSk/XEse-WXQoMI/AAAAAAAAFvQ/LKqiIaX90gIkLv6gniB5bP5E8mJ_TMvIgCLcBGAs/s400/DataFlow06.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Launch Workspace</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
A new tab will open (just like with data factory) and in this workspace we need to create a cluster. A cluster is a collection of virtual machines that will do the ETL work for you. This is also where you start paying! Click on <b>Clusters</b> in the left menu and then on Create Cluster (or click on <b>New Cluster</b> under Common Tasks in the center).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-LVUzltBLw6U/XEsW53aPe3I/AAAAAAAAFvE/WklF2uXwBNoyHhKbJfO9hnMsxibvq479QCLcBGAs/s1600/DataFlow05.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="746" data-original-width="847" height="351" src="https://4.bp.blogspot.com/-LVUzltBLw6U/XEsW53aPe3I/AAAAAAAAFvE/WklF2uXwBNoyHhKbJfO9hnMsxibvq479QCLcBGAs/s400/DataFlow05.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create new Databricks cluster</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
For the <b>Cluster Mode</b> use the default option: Standard. For the <b>Runtime Version</b> choose <i>Runtime 5.0 (Scala 2.11, Spark 2.4.0)</i> and pick a Python version. Another option to change is the number of minutes of inactivity before the cluster terminates. For this example 60 minutes should be enough to create and execute your new data flow. Most important for your wallet are the <b>Worker Type</b> and the number of workers. For this example one of the cheapest types (Standard_DS3_v2) should be more than enough.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-z2-qq56lzmg/XEsUodHjfJI/AAAAAAAAFu4/mIQ2jMvB63oBvuBK4Dcp_NvxxBi67Z9DQCLcBGAs/s1600/DataFlow04.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="635" data-original-width="828" height="306" src="https://4.bp.blogspot.com/-z2-qq56lzmg/XEsUodHjfJI/AAAAAAAAFu4/mIQ2jMvB63oBvuBK4Dcp_NvxxBi67Z9DQCLcBGAs/s400/DataFlow04.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create cluster manually</td></tr>
</tbody></table>
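The same settings can also be expressed as a payload for the Databricks Clusters REST API (<i>2.0/clusters/create</i>). A minimal sketch mirroring the demo values chosen above; the cluster name is hypothetical and the field names are the real API fields.<br />
<pre><code>def cluster_spec(name: str, workers: int = 1) -> dict:
    """Build a Clusters API 2.0 'create' payload that mirrors the
    settings chosen above in the Databricks portal."""
    return {
        "cluster_name": name,
        "spark_version": "5.0.x-scala2.11",   # Runtime 5.0 (Scala 2.11, Spark 2.4.0)
        "node_type_id": "Standard_DS3_v2",    # one of the cheapest worker types
        "num_workers": workers,
        "autotermination_minutes": 60,        # terminate after 60 minutes of inactivity
    }

# POST this payload to https://&lt;region&gt;.azuredatabricks.net/api/2.0/clusters/create
# with an 'Authorization: Bearer &lt;token&gt;' header (the token is covered in the debug step).
print(cluster_spec("dataflow-demo"))</code></pre>
<br />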
<br />
<br />
<br />
<b></b><b></b><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Note: </b>In a future post I will show you how to create a cluster automatically. For now we create one manually and let it terminate automatically when you are done (= not doing anything with the cluster).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-f55MjlpqZlY/XE9mIR8yyRI/AAAAAAAAFxI/L0tqhfagtGkgij8UsuGUoZcV2-nA_KIFACLcBGAs/s1600/DataFlow15.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="579" data-original-width="1600" height="144" src="https://2.bp.blogspot.com/-f55MjlpqZlY/XE9mIR8yyRI/AAAAAAAAFxI/L0tqhfagtGkgij8UsuGUoZcV2-nA_KIFACLcBGAs/s400/DataFlow15.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Creating a cluster takes a few minutes</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>4) Create Data Factory V2 with data flow (preview)</b><br />
Create a new Data Factory and choose 'V2 with data flow (preview)' as the version. At the moment of writing this new feature is only available in 4 regions, but more regions will be added on the way from preview to general availability. So watch out for moving lots of data between regions. Also notice the option to add a couple of examples, which will help you to master the Data Flows.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-fuDFhEefJbg/XEg-kHHfyWI/AAAAAAAAFug/ekn5WPC1P6AJdxJ-e8KqYOcB2B8KX7TUACLcBGAs/s1600/DataFlow02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="643" data-original-width="583" height="400" src="https://1.bp.blogspot.com/-fuDFhEefJbg/XEg-kHHfyWI/AAAAAAAAFug/ekn5WPC1P6AJdxJ-e8KqYOcB2B8KX7TUACLcBGAs/s400/DataFlow02.png" width="362" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Only available in 4 regions at the moment</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Add new Data Flow</b><br />
Now go to the newly created Data Factory and click on Author & Monitor to go to the Data Factory portal.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-7FMTHtiHYtE/XEsiXFpkERI/AAAAAAAAFvc/nwd5nftwCFAYR1FDyB6j0f6ixtUFp2VBQCLcBGAs/s1600/DataFlow07.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="641" data-original-width="1463" height="175" src="https://2.bp.blogspot.com/-7FMTHtiHYtE/XEsiXFpkERI/AAAAAAAAFvc/nwd5nftwCFAYR1FDyB6j0f6ixtUFp2VBQCLcBGAs/s400/DataFlow07.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Factory, open portal</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
In the portal go to the Author page (pencil icon in the left menu) and then click on the three dots behind Data Flows and choose Add Dataflow. A new empty Dataflow will be created and we can start adding a source.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-xw0xKaY6-nM/XEslsnfdXwI/AAAAAAAAFvo/w468wzJ5L5A2DE1VYo94BX6ioXrj2C3fACLcBGAs/s1600/DataFlow08.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="807" data-original-width="1550" height="207" src="https://4.bp.blogspot.com/-xw0xKaY6-nM/XEslsnfdXwI/AAAAAAAAFvo/w468wzJ5L5A2DE1VYo94BX6ioXrj2C3fACLcBGAs/s400/DataFlow08.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Dataflow</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Add source</b><br />
For this example I will use an existing file that is located in an Azure Blob Storage Container. Go to the new dataflow and click on the source to specify the file from the Blob Storage Container. First give the source a suitable name. In the second field you can find all existing datasets, but if you click on the +New button behind it, you can create a new dataset. Now you first need to give the dataset a name and then you can specify the connection to the Azure Blob Storage Container.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-kjiwn9H0ZGo/XEzq6EyjqII/AAAAAAAAFv0/_LGwTS7p0kU6OlkY5wd1e_lCNJva2xcwwCLcBGAs/s1600/DataFlow09.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://1.bp.blogspot.com/-kjiwn9H0ZGo/XEzq6EyjqII/AAAAAAAAFv0/_LGwTS7p0kU6OlkY5wd1e_lCNJva2xcwwCLcBGAs/s400/DataFlow09.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding source part 1</td></tr>
</tbody></table>
<b></b><i></i><u></u><sub></sub><sup></sup><strike></strike><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
When creating a new Blob Storage Container dataset it <b>only </b>creates a link to the storage account itself. You still need to specify the container, filename and properties like delimiter and column names. To do that hit the Edit button and go to the Connections tab. Here you can specify all properties like location and file format. Next go to the Schema tab to specify the columns. You can do that manually or by clicking on the Import Schema button that will read the column names from the file. After this you can specify the datatypes for each column. Now the source dataset is ready, but we still have to map this to the source in the dataflow.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-lHMR0MM_2c8/XEztzYqJIBI/AAAAAAAAFwA/V3xi4h1LIcMOAOi5oxLAM0HqXbE7YCYqQCLcBGAs/s1600/DataFlow10.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-lHMR0MM_2c8/XEztzYqJIBI/AAAAAAAAFwA/V3xi4h1LIcMOAOi5oxLAM0HqXbE7YCYqQCLcBGAs/s400/DataFlow10.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding source part 2</td></tr>
</tbody></table>
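The Import Schema button essentially reads the column names from the header row of the delimited file. A plain-Python sketch of that behaviour; the sample content and delimiter are made up.<br />
<pre><code>import csv
import io

def import_schema(file_content: str, delimiter: str = ",") -> list:
    """Mimic the 'Import Schema' button: take the column names from
    the header row of a delimited file."""
    reader = csv.reader(io.StringIO(file_content), delimiter=delimiter)
    return next(reader)

sample = "Id;Location;Sales\n1;Amsterdam;100\n2;Utrecht;80"
print(import_schema(sample, delimiter=";"))  # → ['Id', 'Location', 'Sales']</code></pre>
<br />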
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Go back to the new dataflow and go to the Define Schema tab. Use the Import from dataset button to import all columns from the dataset. The datatype is <b>not </b>copied from the dataset; I'm not sure whether this is deliberate or a bug. Now the source is ready and we can start adding transformations.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-fkhWhoWFajE/XEzv5O_1z9I/AAAAAAAAFwM/3BdqdLE4kb0xyrkaAIi31z2oGU0jFbLPgCLcBGAs/s1600/DataFlow11.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-fkhWhoWFajE/XEzv5O_1z9I/AAAAAAAAFwM/3BdqdLE4kb0xyrkaAIi31z2oGU0jFbLPgCLcBGAs/s400/DataFlow11.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding source part 3</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>7) Add transformation</b><br />
The number of different transformations is not yet as rich as those of the SSIS Data Flow Task, but the most important transformations are available and there are also some handy new ones like SurrogateKey and Window (for aggregation on time windows) which are not available in SSIS.<br />
<br />
On the other hand, the expression language in the Derived Column is much more comprehensive, with for example hashing functions and a lot of string functions that are not available in SSIS.<br />
<br />
For this example I will add a Derived Column to uppercase the Location column. Click on the little + behind the source to add a new transformation. Click on Derived Column and edit an existing column. When editing the expression a new window will appear. The upper function can be found under String and the column name can be found under Input. If you know SSIS then this should be very familiar.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-XAxD2hgJ9Cg/XE23drvlr4I/AAAAAAAAFwY/zQqNFqlndD0INBd6vDcTtAbPQv5KLp0uQCLcBGAs/s1600/DataFlow11.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://2.bp.blogspot.com/-XAxD2hgJ9Cg/XE23drvlr4I/AAAAAAAAFwY/zQqNFqlndD0INBd6vDcTtAbPQv5KLp0uQCLcBGAs/s400/DataFlow11.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding a Derived Column transformation</td></tr>
</tbody></table>
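To make the transformation concrete: the Derived Column above evaluates the Data Flow expression upper(Location) for every incoming row. A plain-Python stand-in for that step; the column name and sample rows are made up.<br />
<pre><code>def derive_upper(rows: list, column: str = "Location") -> list:
    """Apply the equivalent of the Data Flow expression upper(Location)
    to every row, overwriting the existing column."""
    return [{**row, column: row[column].upper()} for row in rows]

rows = [{"Id": 1, "Location": "Amsterdam"}, {"Id": 2, "Location": "Utrecht"}]
print(derive_upper(rows))  # → Location becomes 'AMSTERDAM' and 'UTRECHT'</code></pre>
<br />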
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>8) Add destination</b><br />
For this example I will use an existing table in an Azure SQL Database as destination. Click on the little + behind the Derived Column to add a new item to the dataflow. Scroll down to the bottom and click on Sink. Now you can add a new destination dataset (sink dataset) for this output. First give the output stream a name and then click on the <b>+ New</b> button to add a dataset. Choose Azure SQL Server and give the dataset a name in the next step. After that we need to create a new connection to the database server. See the animated gifs for the details.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-QahJKHk-gpY/XE9eW1J9rCI/AAAAAAAAFws/meH6YwuF7KIkrpcWMaGnWAhhsY5s22MawCLcBGAs/s1600/DataFlow12.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-QahJKHk-gpY/XE9eW1J9rCI/AAAAAAAAFws/meH6YwuF7KIkrpcWMaGnWAhhsY5s22MawCLcBGAs/s400/DataFlow12.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Destination - Add sink dataset and connection</td></tr>
</tbody></table>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<b></b><i></i><u></u><sub></sub><sup></sup><strike></strike><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
When creating a new sink dataset and its connection, it only creates a link to the SQL Server database. You still need to specify the table name and columns. To do that hit the Edit button on the dataflow destination and go to the Connections tab. Here you can specify which table you want to use. Next go to the Schema tab to specify the columns. You can do that by clicking on the Import Schema button that will read the columns from the database table. Now the sink dataset is ready, but you still need to create a column mapping.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-rPS5hNuSnUs/XE9gBiyRUsI/AAAAAAAAFw0/Gi3qF_I0F1YU4JC0RBmOFONFf25eB4haQCLcBGAs/s1600/DataFlow13.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://1.bp.blogspot.com/-rPS5hNuSnUs/XE9gBiyRUsI/AAAAAAAAFw0/Gi3qF_I0F1YU4JC0RBmOFONFf25eB4haQCLcBGAs/s400/DataFlow13.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Destination - Choose table and columns</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Go back to the dataflow, click on the destination and then go to the Mapping tab. If the column names in your dataflow are the same as in the table then you can use Auto Mapping. Otherwise turn off the auto mapping and do it manually.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-NgAWsPZRSIQ/XE9gBjtBJCI/AAAAAAAAFw4/ApLiV7RoJLEHyLJq_obT-Y7BFwpN3th-wCLcBGAs/s1600/DataFlow14.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://3.bp.blogspot.com/-NgAWsPZRSIQ/XE9gBjtBJCI/AAAAAAAAFw4/ApLiV7RoJLEHyLJq_obT-Y7BFwpN3th-wCLcBGAs/s400/DataFlow14.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Destination - Mapping</td></tr>
</tbody></table>
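Auto Mapping simply pairs columns by name. A small sketch of the idea; the column lists are hypothetical.<br />
<pre><code>def auto_map(source_columns: list, sink_columns: list) -> dict:
    """Pair every source column with the sink column of the same name;
    anything left unmatched has to be mapped manually."""
    sink = set(sink_columns)
    return {col: col for col in source_columns if col in sink}

print(auto_map(["Id", "Location", "Sales"], ["Id", "Location", "Amount"]))
# 'Sales' and 'Amount' differ in name, so that pair needs a manual mapping</code></pre>
<br />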
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>9) Debug</b><br />
You probably already clicked on the Data Preview tab of the source, derived column or destination and saw that you first have to turn on debug mode.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-5NpEGLmunik/XFGJi0SP5gI/AAAAAAAAFx0/BbZJfcqEv3c7sn4SNkccoLPbYaD1MA3XwCLcBGAs/s1600/DataFlow17.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="536" data-original-width="738" height="290" src="https://4.bp.blogspot.com/-5NpEGLmunik/XFGJi0SP5gI/AAAAAAAAFx0/BbZJfcqEv3c7sn4SNkccoLPbYaD1MA3XwCLcBGAs/s400/DataFlow17.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Please turn on the debug mode and wait until cluster is ready to preview data</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
To do this we first need to get a new token from Azure Databricks to connect from Data Factory. Go to the Databricks portal and click on the person icon in the top right. Then choose User Settings and hit the Generate New Token button. Copy it and keep it safe in, for example, KeePass, because you won't be able to retrieve it again.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-oqqZTyIfAE8/XFGHzJflO-I/AAAAAAAAFxo/AMAvOJDVh2kDyT1_XQyHUvJdSE39xPRbwCLcBGAs/s1600/DataFlow16.gif" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-oqqZTyIfAE8/XFGHzJflO-I/AAAAAAAAFxo/AMAvOJDVh2kDyT1_XQyHUvJdSE39xPRbwCLcBGAs/s400/DataFlow16.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Generate New Token in Azure Databricks</td></tr>
</tbody></table>
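The generated token is a Databricks personal access token: every REST call (and the linked service that Data Factory creates) authenticates with it as a Bearer header. A minimal sketch to verify a token by hand; the workspace URL and token value are hypothetical.<br />
<pre><code>def databricks_headers(token: str) -> dict:
    """Personal access tokens are passed as a Bearer token on every
    Databricks REST call."""
    return {"Authorization": "Bearer " + token}

# Verify the token manually (requires the 'requests' package):
# import requests
# requests.get("https://westeurope.azuredatabricks.net/api/2.0/clusters/list",
#              headers=databricks_headers("dapi0123456789abcdef"))

print(databricks_headers("&lt;your-token&gt;"))</code></pre>
<br />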
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Now go back to your dataflow and switch on the debug mode in the top left of the dataflow. A new form appears on the right side where you can create a new linked service to the Databricks service. This is where you need the token from the previous step (see the animated gif for details). After that, select the cluster that you created and hit the start button. Now it will connect to the cluster and you will be able to see the contents of the Data Preview tabs.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-Y8JNxGxFj4o/XFGSFNNU4aI/AAAAAAAAFyA/fRUKZ03jfegO1A7tewFGh90MMOOVCNCPACLcBGAs/s1600/DataFlow18.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-Y8JNxGxFj4o/XFGSFNNU4aI/AAAAAAAAFyA/fRUKZ03jfegO1A7tewFGh90MMOOVCNCPACLcBGAs/s400/DataFlow18.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Debugging dataflow (not the actual speed)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>10) Create pipeline</b><br />
The dataflow is ready and we can now add it to a Data Factory pipeline with the new Data Flow (preview) activity. You need to provide the name of the dataflow you want to execute, but also the link to the Azure Databricks service. You can reuse the linked service from the previous step.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-FMq_zFCJ6pk/XFGXhnShXWI/AAAAAAAAFyI/dqT7R9LMwkA5fa7CN4oCfdj8Yifn9ZkrgCLcBGAs/s1600/DataFlow19.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://4.bp.blogspot.com/-FMq_zFCJ6pk/XFGXhnShXWI/AAAAAAAAFyI/dqT7R9LMwkA5fa7CN4oCfdj8Yifn9ZkrgCLcBGAs/s400/DataFlow19.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Adding dataflow to the pipeline (not the actual speed)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-xRiPGsb1znw/XFGYFXrWpGI/AAAAAAAAFyM/3Y2Eam_65EsgevK6t1Y_fI02Njisz6sagCLcBGAs/s1600/dataflow20.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="586" data-original-width="578" height="400" src="https://1.bp.blogspot.com/-xRiPGsb1znw/XFGYFXrWpGI/AAAAAAAAFyM/3Y2Eam_65EsgevK6t1Y_fI02Njisz6sagCLcBGAs/s400/dataflow20.png" width="393" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">And the result viewed with SSMS</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Summary</span></b><br />
Microsoft created a new visual ETL tool with lots of promising features that could become a great successor to SSIS. However, it does not yet have the Visual Studio feel of SSIS, so that will take some time to get used to. Debugging is another thing: it works, but at the moment it costs me way too much waiting time. It will probably be faster with a more expensive cluster, but I'm not sure it will match the SSDT experience. Besides these two notes, I can't wait to get some actual working experience with it.<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358820000000488 57.120607 15.61841399999995tag:blogger.com,1999:blog-2303058199815958946.post-7133963718434903062018-10-29T18:30:00.002+01:002018-11-28T11:27:11.692+01:00Bug: Script Task - Cannot load script for execution<b><span style="font-size: large;">Case</span></b><br />
My Script Tasks are running fine in Visual Studio 2017, but when deployed to the catalog I get an error in all Script Tasks: Cannot load script for execution<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-OntSIDHxLQc/W9ck0TPRf_I/AAAAAAAAFqw/9q8WOUCJ21089PCJ7BbCUbN83bUbgbh7QCLcBGAs/s1600/scripttaskerror01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="165" data-original-width="850" height="123" src="https://1.bp.blogspot.com/-OntSIDHxLQc/W9ck0TPRf_I/AAAAAAAAFqw/9q8WOUCJ21089PCJ7BbCUbN83bUbgbh7QCLcBGAs/s640/scripttaskerror01.png" width="640" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Error: Cannot load script for execution</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="font-size: large;"><b>Solution</b></span><br />
There
is a bug in SSDT for Visual Studio 2017 (15.8.1). The (temporary) workaround is to NOT use SSDT 2017 for deployment. Instead you could use SSMS to deploy your SSIS projects.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-scRvI2wYVNQ/W9dAiDs9LZI/AAAAAAAAFq8/-ZWL_EmcI5YxmJReI_EMxEszEIq33LSHwCLcBGAs/s1600/scripttaskbug02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="278" data-original-width="401" height="276" src="https://4.bp.blogspot.com/-scRvI2wYVNQ/W9dAiDs9LZI/AAAAAAAAFq8/-ZWL_EmcI5YxmJReI_EMxEszEIq33LSHwCLcBGAs/s400/scripttaskbug02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Deploy packages with SSMS</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
Or use <b>ISDeploymentWizard.exe</b> in the folder C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\CommonExtensions\Microsoft\SSIS\<b>150</b>\Binn\ to deploy your projects.<br />
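The wizard can also run unattended from the command line. A minimal sketch from a PowerShell prompt (the source path, server and catalog folder are example values; switches as documented for ISDeploymentWizard):<br />
<pre class="brush: powershell; toolbar: false;"># Example: silent deployment of an .ispac file to the catalog
# /S = silent, /ST = source type, /SP = source path,
# /DS = destination server, /DP = destination path in SSISDB
& "C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\CommonExtensions\Microsoft\SSIS\150\Binn\ISDeploymentWizard.exe" `
  /S /ST:File `
  /SP:"C:\Projects\MyAzureProject\bin\Development\MyAzureProject.ispac" `
  /DS:"myserver\myinstance" `
  /DP:"/SSISDB/ssisjoost/MyAzureProject"
</pre>
<br />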
<br />
Expect an update soon!<br />
<br />
<br />
UPDATE: <a href="https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017" target="_blank">SSDT 15.8.2 is available</a><br />
<i>Fixed an issue that deploying SSIS project which contains packages containing Script Task/Flat file destination to Azure-SSIS will result in the packages failing to execute in Azure-SSIS</i><br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com1Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-8768904022495978132018-06-30T12:00:00.002+02:002018-07-01T10:32:45.167+02:00The database 'SSISDB' has reached its size quota.<b><span style="font-size: large;">Case</span></b><br />
I'm running SSIS in Azure for a couple of months now, but I'm getting an error: <span style="background-color: transparent; color: #212121; display: inline; float: none; font-family: "calibri" , sans-serif; font-size: 14.66px; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><i>The database 'SSISDB' has reached its size quota. Partition or delete data, drop indexes, or consult the documentation for possible resolutions.</i> The SQL Server Agent jobs that are scheduled to clean up the log and project versions are not available in Azure.</span><br />
<span style="background-color: transparent; color: #212121; display: inline; float: none; font-family: "calibri" , sans-serif; font-size: 14.66px; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<span style="background-color: transparent; color: #212121; display: inline; float: none; font-family: "calibri" , sans-serif; font-size: 14.66px; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">The first time it occurred I scaled up the database tier and that solved the problem, but now it reoccurs and I don't want to keep scaling up the database tier. How do I solve this?</span><br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
Since there is no SQL Server Agent there are no clean-up jobs available, but there will be a solution within a couple of weeks. If you can't wait, there is a workaround. First make sure the retention period and the number of project versions are set to an acceptable level. If you are running and updating packages frequently, then the default settings are probably a bit too high.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-eTjTWfQIg4U/WzdMVvZyY9I/AAAAAAAAFi4/CIOxWfTNrEo995kXaDJtvMCRpCHdSFXuQCLcBGAs/s1600/catalogproperties.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="507" data-original-width="758" height="267" src="https://4.bp.blogspot.com/-eTjTWfQIg4U/WzdMVvZyY9I/AAAAAAAAFi4/CIOxWfTNrEo995kXaDJtvMCRpCHdSFXuQCLcBGAs/s400/catalogproperties.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The default settings (right click SSISDB to see properties)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
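If you prefer T-SQL over the properties dialog, the same settings can be changed with the catalog.configure_catalog Stored Procedure of the SSIS catalog. A sketch (run against SSISDB; the values are examples):<br />
<pre class="brush: sql; toolbar: false;">-- Keep 30 days of operational log
EXEC [catalog].[configure_catalog] @property_name = N'RETENTION_WINDOW', @property_value = 30;
-- Keep a maximum of 3 versions per project
EXEC [catalog].[configure_catalog] @property_name = N'MAX_PROJECT_VERSIONS', @property_value = 3;
</pre>
<br />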
<b>Project versions</b><br />
I set the Maximum Number of Versions per Project to 3. To clean up the old project versions, locate the Stored Procedure <b>[internal].[cleanup_server_project_version]</b> (in SSISDB) and execute it. There are no parameters.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-RJQqjDYO0K0/WzdQ1UzTjhI/AAAAAAAAFjE/fpp9OwyysWAUcPS9LpKPQ9D8KCpkz0vDACLcBGAs/s1600/projectversions.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="507" data-original-width="822" height="246" src="https://1.bp.blogspot.com/-RJQqjDYO0K0/WzdQ1UzTjhI/AAAAAAAAFjE/fpp9OwyysWAUcPS9LpKPQ9D8KCpkz0vDACLcBGAs/s400/projectversions.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Before and after running the Stored Procedure</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Log retention</b><br />
The same can be done for the log retention with the Stored Procedure <b>[internal].[cleanup_server_retention_window]</b>. Again no parameters. This Stored Procedure works with a T-SQL cursor, so if your log is massive and your server very busy, it could take a while.<br />
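Executing both clean-up Stored Procedures only takes two lines of T-SQL (run against the SSISDB database):<br />
<pre class="brush: sql; toolbar: false;">-- Remove old project versions (above the configured maximum)
EXEC [internal].[cleanup_server_project_version];
-- Remove log records older than the configured retention window
EXEC [internal].[cleanup_server_retention_window];
</pre>
<br />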
<br />
<br />
You could even schedule these Stored Procedures with, for example, Azure Data Factory to clean up regularly, but keep an eye on the SSIS announcements to see when an out-of-the-box solution arrives.<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-87871120045344151532018-05-31T23:45:00.001+02:002018-05-31T23:46:45.322+02:00User Group meeting: SSIS in the cloud<span style="background-color: transparent; color: #444444; display: inline; float: none; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 13px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; line-height: 18.2px; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Recently
I presented an SSIS in the Cloud session at a User Group evening </span><span style="background-color: transparent; color: #444444; display: inline; float: none; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 13px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; line-height: 18.2px; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">in Amsterdam, The Netherlands. Thank you <a href="https://qnh.eu/" target="_blank">QNH</a> for hosting that evening. You can download the PowerPoint and in the comments you will find the <a href="http://microsoft-ssis.blogspot.com/p/azure-integration-services.html" target="_blank">blogposts</a> that I used in my demo's.</span><br />
<span style="background-color: transparent; color: #444444; display: inline; float: none; font-family: "arial" , "tahoma" , "helvetica" , "freesans" , sans-serif; font-size: 13px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; line-height: 18.2px; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"><br /></span>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://sites.google.com/site/ssisjoost/Azure%20Integration%20Services%20public.pptx?attredirects=0&d=1" target="_blank"><img border="0" data-original-height="900" data-original-width="1600" height="180" src="https://4.bp.blogspot.com/-I8oBT5yQ5Us/WxBsS1jsLsI/AAAAAAAAFhg/r7t0QWekATgh5WsqzOuOJMUpVs-Os7JbgCLcBGAs/s320/usergroup00.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-RXAPpFW9XKg/WxBrb7feeLI/AAAAAAAAFhU/qZC677tnNuYxTB0kdEPVvkmW0gbGUCq_QCLcBGAs/s1600/usergroup01.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://4.bp.blogspot.com/-RXAPpFW9XKg/WxBrb7feeLI/AAAAAAAAFhU/qZC677tnNuYxTB0kdEPVvkmW0gbGUCq_QCLcBGAs/s320/usergroup01.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-0zFvaH-UXBQ/WxBrb2abE5I/AAAAAAAAFhQ/Eat8Fze8fso0F3C6G09dmC8-LvNuUa8LQCLcBGAs/s1600/usergroup02.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://2.bp.blogspot.com/-0zFvaH-UXBQ/WxBrb2abE5I/AAAAAAAAFhQ/Eat8Fze8fso0F3C6G09dmC8-LvNuUa8LQCLcBGAs/s320/usergroup02.png" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-Ok9PnlOFirs/WxBrb1R2IfI/AAAAAAAAFhM/jT4iPfEIpb8fwGnoIiujJBT4wbzl7VJegCLcBGAs/s1600/usergroup03.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1200" data-original-width="1600" height="240" src="https://1.bp.blogspot.com/-Ok9PnlOFirs/WxBrb1R2IfI/AAAAAAAAFhM/jT4iPfEIpb8fwGnoIiujJBT4wbzl7VJegCLcBGAs/s320/usergroup03.png" width="320" /></a></div>
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0tag:blogger.com,1999:blog-2303058199815958946.post-71789806732610712812018-04-14T00:15:00.003+02:002018-04-14T13:52:24.345+02:00New ADF Pipeline activity: Execute SSIS Package<b><span style="font-size: large;">Case</span></b><br />
Microsoft released a new ADF Pipeline activity today: <b>Execute SSIS Package</b>. How does it work and is it easier/better than the trick with the <a href="http://microsoft-ssis.blogspot.com/2018/02/schedule-package-in-azure-integration.html" target="_blank">Stored Procedure Activity</a>?<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-5E9bbcroxio/WtEbws6qBxI/AAAAAAAAFe0/Kh9qxMiec68WGVTnH-6oadwQQG0bRVfmQCLcBGAs/s1600/SSISActivity01.png" imageanchor="1" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="298" data-original-width="193" height="320" src="https://2.bp.blogspot.com/-5E9bbcroxio/WtEbws6qBxI/AAAAAAAAFe0/Kh9qxMiec68WGVTnH-6oadwQQG0bRVfmQCLcBGAs/s320/SSISActivity01.png" width="207" /></a></div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="font-size: large;"><b>Solution</b></span><br />
The new activity can be found under General (just like the Stored Procedure activity) and it is indeed much easier than the Stored Procedure solution. If you want to execute the package below, then follow these steps.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-8-4MEx_eSr0/WtEeywPs3xI/AAAAAAAAFfA/C8QxAbXiopcbY0KvzAJT-n8pAADZk2e3QCLcBGAs/s1600/SSISActivity02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="145" data-original-width="223" src="https://4.bp.blogspot.com/-8-4MEx_eSr0/WtEeywPs3xI/AAAAAAAAFfA/C8QxAbXiopcbY0KvzAJT-n8pAADZk2e3QCLcBGAs/s1600/SSISActivity02.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The package which I want to execute</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>1) Add activity </b><br />
Drag the new SSIS activity to the canvas of the pipeline and give it a descriptive name, for example something with the project name or package name in it.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-Jj82LGkK8_A/WtEgl2JFV4I/AAAAAAAAFfM/Jj7qF9qc1QUSMSj7ehLeHV2jjMQHb4LYACLcBGAs/s1600/ssisactivity03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1407" height="246" src="https://1.bp.blogspot.com/-Jj82LGkK8_A/WtEgl2JFV4I/AAAAAAAAFfM/Jj7qF9qc1QUSMSj7ehLeHV2jjMQHb4LYACLcBGAs/s400/ssisactivity03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Execute SSIS Package activity</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Settings</b><br />
Go to the Settings tab and first select the name of the Integration Runtime that should execute the package. The second mandatory setting is the Logging Level, which already defaults to 'Basic', and the last mandatory setting is the package path. The path starts with the catalog folder name followed by a forward slash, then the project name, another forward slash and finally the package name. It should look like this: ssisjoost/MyAzureProject/Package.dtsx<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-Hh_US9lcANw/WtEibiDylYI/AAAAAAAAFfY/L1H-KU-ApX4cbXQTpfxs74YcCpYiI_AQgCLcBGAs/s1600/ssisactivity04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1407" height="246" src="https://3.bp.blogspot.com/-Hh_US9lcANw/WtEibiDylYI/AAAAAAAAFfY/L1H-KU-ApX4cbXQTpfxs74YcCpYiI_AQgCLcBGAs/s400/ssisactivity04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Settings</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Run Trigger</b><br />
Now publish the new pipeline and run the trigger to see the result.<br />
<br />
Possible errors:<br />
When the Integration Runtime is <u>not running</u>, it shows "<i>Activity Execute my first package failed: The integration runtime 'IR-SSISJoost' under data factory 'ADF-SSISJoost' does not exist.</i>". This message is a bit strange.<br />
<br />
When the Integration Runtime is <u>starting up</u>, it shows "<i>Activity Execute my first package failed: The state of Azure ssis integration runtime 'IR-SSISJoost' under data factory 'ADF-SSISJoost' is not ready.</i>"<br />
<br />
When the package fails it shows "<i>Execute my first package failed: Package execution failed.</i>". This shows the shortcoming compared to the Stored Procedure activity, which allows you to show the Execution Id and even the <a href="http://microsoft-ssis.blogspot.com/2018/03/get-ssis-messages-to-adf-monitor.html" target="_blank">error messages from the catalog</a> if you fancy a bit of T-SQL scripting.<br />
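For completeness: the error messages that the activity hides can be queried from the standard SSIS catalog views. A sketch (run against SSISDB; 120 is the message type for errors):<br />
<pre class="brush: sql; toolbar: false;">-- Error messages of the most recent execution
SELECT  m.message_time
,       m.message
FROM    [catalog].[event_messages] AS m
WHERE   m.operation_id = (SELECT MAX(execution_id) FROM [catalog].[executions])
AND     m.message_type = 120
ORDER BY m.message_time;
</pre>
<br />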
<br />
<b><span style="font-size: large;">Conclusion</span></b><br />
Much easier, but in case of package errors not very helpful, because it forces you to search for errors in the catalog. If you are using the ADF monitor to check for errors, I would probably still prefer the <a href="http://microsoft-ssis.blogspot.com/2018/03/get-ssis-messages-to-adf-monitor.html" target="_blank">Stored Procedure activity</a>.<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0tag:blogger.com,1999:blog-2303058199815958946.post-33334478508041624422018-04-01T20:58:00.003+02:002021-02-14T12:11:20.408+01:00Start and stop Integration Runtime in ADF pipeline<b><span style="font-size: large;">Case</span></b><br />
You showed me how to schedule a <a href="http://microsoft-ssis.blogspot.com/2018/02/pause-and-resume-integration-runtime-to.html" target="_blank">pause and resume</a> of the Integration Runtime (IR) in Azure Automation, but can you also start and stop IR in the Azure Data Factory (ADF) pipeline with one of the activities? This will save the most money possible, especially when you only have one ETL job.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-35GJR3CDfp8/WrZF2pwMe0I/AAAAAAAAFcI/MPI88IY4yPIVobwZwUu_pf8HCXA8deItgCLcBGAs/s1600/pauseir01.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="505" data-original-width="1551" height="130" src="https://1.bp.blogspot.com/-35GJR3CDfp8/WrZF2pwMe0I/AAAAAAAAFcI/MPI88IY4yPIVobwZwUu_pf8HCXA8deItgCLcBGAs/s400/pauseir01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Pause and Resume IR in ADF pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
Yes you can and even better... you can reuse the existing Runbook PowerShell script that <a href="http://microsoft-ssis.blogspot.com/2018/02/pause-and-resume-integration-runtime-to.html" target="_blank">pauses and resumes</a> the IR. Instead of scheduling it, which is more appropriate when you have multiple projects to run, we will call the scripts via their webhooks.<br />
<br /><i>Update: now easier done <a href="https://microsoft-ssis.blogspot.com/2020/06/start-ssis-ir-via-rest-api.html" target="_blank">via Rest API</a></i><br /><br />
<b>Prerequisites</b>
<br />
<ul>
<li>You have an existing pipeline that <a href="http://microsoft-ssis.blogspot.com/2018/02/schedule-package-in-azure-integration.html" target="_blank">executes</a> an SSIS package (optionally with an <a href="https://microsoft-bitools.blogspot.com/2018/03/add-email-notification-in-azure-data.html" target="_blank">error notification</a>).<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-pPPlz8uFqCk/Wr_xWa-P__I/AAAAAAAAFdA/_RFwlEKWKY82vdPMyoSyvVDA9yJZ6hLkQCLcBGAs/s1600/PauseResumeIR04.png" style="clear: left; float: left; margin-bottom: 1em; margin-right: 1em;"><img border="0" data-original-height="395" data-original-width="978" height="129" src="https://4.bp.blogspot.com/-pPPlz8uFqCk/Wr_xWa-P__I/AAAAAAAAFdA/_RFwlEKWKY82vdPMyoSyvVDA9yJZ6hLkQCLcBGAs/s320/PauseResumeIR04.png" width="320" /></a></div>
<br /><br /><br /><br /><br /><br /><u></u></li>
</ul>
<br />
<br />
<br />
<b><span style="font-size: large;">A) Azure Automation Runbook</span></b><br />
If you already have the two <span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Runbooks that </span><a href="http://microsoft-ssis.blogspot.com/2018/02/pause-and-resume-integration-runtime-to.html" style="-webkit-text-stroke-width: 0px; background-color: transparent; color: #0066cc; font-family: "Times New Roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; orphans: 2; text-align: left; text-decoration: underline; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;" target="_blank">pause and resume</a><span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;"> the IR, then you could skip steps 1 to 8 and only do step 9 (create webhook) for both scripts. 
But then make sure to <span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">delete, <a href="https://4.bp.blogspot.com/-sW7YO0cI3rI/Wr6gF1O-ZrI/AAAAAAAAFcY/V7_ImepNTK8h1nHk2BthZZ81Dm-PhU8tACLcBGAs/s1600/PauseResumeIR01.gif" target="_blank">disable</a> or <a href="https://4.bp.blogspot.com/-Vwpe_8IQBqU/Wr6kG3ZNAHI/AAAAAAAAFcc/rrBFjtZr_YQsoTwTSdHFP7XYHhS8ev7WwCLcBGAs/s1600/PauseResumeIR02.gif" target="_blank">unlink</a> the schedule of the two Runbooks</span> and remove the optional trigger part of the code.</span><br />
<br />
<b>1) Collect parameters</b><br />
Before we start coding we first we need to get the name of the Azure Data Factory and its Resource group.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-wpMRrmowtwM/WoCVV5dOokI/AAAAAAAAFS8/MPVYYseiXPcTAeAoLjTzLaw4qct7wkQZACLcBGAs/s1600/pausais02.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="475" data-original-width="813" height="232" src="https://4.bp.blogspot.com/-wpMRrmowtwM/WoCVV5dOokI/AAAAAAAAFS8/MPVYYseiXPcTAeAoLjTzLaw4qct7wkQZACLcBGAs/s400/pausais02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Your Azure Data Factory (V2)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Azure Automation Account</b><br />
Create an Azure Automation Account. You can find it under + New, <b>Monitoring + Management</b>. Make sure that Create Azure Run As account is turned on.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-Ulyhha-yDko/WoCYlk7VBTI/AAAAAAAAFTQ/zhD6gZFen_Up_kxv414JBFd9zeHz5Qm8ACLcBGAs/s1600/pausais04.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="948" data-original-width="1124" height="336" src="https://4.bp.blogspot.com/-Ulyhha-yDko/WoCYlk7VBTI/AAAAAAAAFTQ/zhD6gZFen_Up_kxv414JBFd9zeHz5Qm8ACLcBGAs/s400/pausais04.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Automation Account</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Import Module</b><br />
We need to tell our code about Integration Runtimes in Azure Data Factory. You do this by adding modules. Scroll down in the menu of the Automation Account and click on Modules to see all installed modules. We need to add the module called <b>AzureRM.DataFactoryV2</b>, but it depends on <b>AzureRM.Profile</b> (≥ 4.2.0). Click on Browse gallery, search for <b>AzureRM.Profile</b> and import it, then repeat this for <b>AzureRM.DataFactoryV2</b>. Make sure to add the V2 version!<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-ag0mQH7pft8/WoCcOUv1TcI/AAAAAAAAFTc/VRNh58oGKjkHwocK3xJlv8mjvTykq45GgCLcBGAs/s1600/pausais05.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="685" data-original-width="1129" height="242" src="https://1.bp.blogspot.com/-ag0mQH7pft8/WoCcOUv1TcI/AAAAAAAAFTc/VRNh58oGKjkHwocK3xJlv8mjvTykq45GgCLcBGAs/s400/pausais05.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Import modules</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>4) Connections</b><br />
This step is for your information only, to help you understand the code. Under Connections you will find a default connection named 'AzureRunAsConnection' that contains information about the Azure environment, like the tenant id and the subscription id. To prevent hardcoded connection details we will retrieve some of these fields in the PowerShell code.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-aGDAyDrfCrg/WoNVxv-T8EI/AAAAAAAAFTs/xiBgbD8AH50tx4ymjOeJIJ8dnAKAkKGYQCLcBGAs/s1600/pausais06.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="518" data-original-width="1383" height="148" src="https://3.bp.blogspot.com/-aGDAyDrfCrg/WoNVxv-T8EI/AAAAAAAAFTs/xiBgbD8AH50tx4ymjOeJIJ8dnAKAkKGYQCLcBGAs/s400/pausais06.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span face=""az_ea_font" , "segoe ui" , , "segoe wp" , "tahoma" , "arial" , sans-serif" style="background-color: transparent; color: #252525; display: inline; float: none; font-size: 12px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">AzureRunAsConnection</span></td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Runbooks</b><br />
Now it is time to add a new Azure Runbook for the PowerShell code. Click on Runbooks and then add a new runbook (There are also five example runbooks of which AzureAutomationTutorialScript could be useful as an example). Give your new Runbook a suitable name and choose PowerShell as type. There will be two separate runbooks/scripts: one for pause and one for resume. When finished with the Pause script you need to repeat this for the Resume script (steps 5 to 9).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-75mtx-V6ABg/WoNbZh3uYYI/AAAAAAAAFT8/a950DX5w4xcs0GicCh25CPXlILGnwp_gwCLcBGAs/s1600/pausais07.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="959" data-original-width="1582" height="241" src="https://2.bp.blogspot.com/-75mtx-V6ABg/WoNbZh3uYYI/AAAAAAAAFT8/a950DX5w4xcs0GicCh25CPXlILGnwp_gwCLcBGAs/s400/pausais07.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create new Runbook</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Edit Script</b><br />
After clicking Create in the previous step the editor will be opened. When editing an existing Runbook you need to click on the Edit button first. You can copy and paste the code of one of the scripts below into your editor. Study the green comments to understand the code and make sure to fill in the right values for the variables (see parameters).<br />
<br />
The first script is the pause script and the second script is the resume script. You could merge both scripts and use an if statement on the status property or <span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">some parameters</span> to either pause or resume, but I prefer two separate scripts.<br />
<br />
<pre class="brush: powershell; toolbar: false;"># This script pauses your Integration Runtime if it is running

# Parameters
$ConnectionName = 'AzureRunAsConnection'
$DataFactoryName = 'ADF-SSISJoost'
$ResourceGroup = 'joost_van_rossum'

# Do not continue after an error
$ErrorActionPreference = "Stop"

########################################################
# Log in to Azure (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
try
{
    # Get the connection "AzureRunAsConnection"
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName
    'Log in to Azure...'
    $null = Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account'
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}

########################################################
# Get Integration Runtime in Azure Data Factory
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName $DataFactoryName `
    -ResourceGroupName $ResourceGroup

# Check if Integration Runtime was found
if (!$IntegrationRuntime)
{
    # Your ADF does not have an Integration Runtime
    # or the ADF does not exist
    $ErrorMessage = "No Integration Runtime found in ADF $($DataFactoryName)."
    throw $ErrorMessage
}
# Check if the Integration Runtime is running
elseif ($IntegrationRuntime.State -eq "Started")
{
    # Stop the integration runtime
    Write-Output "Pausing Integration Runtime $($IntegrationRuntime.Name)."
    $null = Stop-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $IntegrationRuntime.DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $IntegrationRuntime.ResourceGroupName `
        -Force
    Write-Output "Done"
}
else
{
    # Write message to screen (not throwing an error)
    Write-Output "Integration Runtime $($IntegrationRuntime.Name) is not running."
}
</pre>
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;"># This script resumes your Integration Runtime if it is stopped

# Parameters
$ConnectionName = 'AzureRunAsConnection'
$DataFactoryName = 'ADF-SSISJoost'
$ResourceGroup = 'joost_van_rossum'

# Do not continue after an error
$ErrorActionPreference = "Stop"

########################################################
# Log in to Azure (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
try
{
    # Get the connection "AzureRunAsConnection"
    $ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName
    'Log in to Azure...'
    $null = Add-AzureRmAccount `
        -ServicePrincipal `
        -TenantId $ServicePrincipalConnection.TenantId `
        -ApplicationId $ServicePrincipalConnection.ApplicationId `
        -CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch
{
    if (!$ServicePrincipalConnection)
    {
        # You forgot to turn on 'Create Azure Run As account'
        $ErrorMessage = "Connection $ConnectionName not found."
        throw $ErrorMessage
    }
    else
    {
        # Something went wrong
        Write-Error -Message $_.Exception.Message
        throw $_.Exception
    }
}

########################################################
# Get Integration Runtime in Azure Data Factory
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName $DataFactoryName `
    -ResourceGroupName $ResourceGroup

# Check if Integration Runtime was found
if (!$IntegrationRuntime)
{
    # Your ADF does not have an Integration Runtime
    # or the ADF does not exist
    $ErrorMessage = "No Integration Runtime found in ADF $($DataFactoryName)."
    throw $ErrorMessage
}
# Check if the Integration Runtime is stopped
elseif ($IntegrationRuntime.State -ne "Started")
{
    # Resume the integration runtime
    Write-Output "Resuming Integration Runtime $($IntegrationRuntime.Name)."
    $null = Start-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $IntegrationRuntime.DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $IntegrationRuntime.ResourceGroupName `
        -Force
    Write-Output "Done"
}
else
{
    # Write message to screen (not throwing an error)
    Write-Output "Integration Runtime $($IntegrationRuntime.Name) is already running."
}
</pre>
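<br />
If you do prefer a single runbook, the check on the State property above can be combined with a runbook parameter. Below is a rough sketch of that pattern (the $Action parameter name is my own choice; the parameter block and Azure login code are identical to the scripts above and are abbreviated to a comment):<br />
<pre class="brush: powershell; toolbar: false;"># Sketch: one runbook that pauses or resumes depending on a parameter
Param (
    # 'Start' resumes the Integration Runtime, 'Stop' pauses it
    [Parameter(Mandatory = $true)]
    [ValidateSet('Start', 'Stop')]
    [string]$Action
)

# ... same parameters and Azure login code as in the scripts above ...

# Get Integration Runtime in Azure Data Factory
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName $DataFactoryName `
    -ResourceGroupName $ResourceGroup

if ($Action -eq 'Stop' -and $IntegrationRuntime.State -eq 'Started')
{
    # Pause the Integration Runtime
    $null = Stop-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $IntegrationRuntime.DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $IntegrationRuntime.ResourceGroupName `
        -Force
}
elseif ($Action -eq 'Start' -and $IntegrationRuntime.State -ne 'Started')
{
    # Resume the Integration Runtime
    $null = Start-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $IntegrationRuntime.DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $IntegrationRuntime.ResourceGroupName `
        -Force
}
</pre>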
<br />
<b>7) Testing</b><br />
You can use the Test Pane menu option in the editor to test your PowerShell scripts. When you click Run, the script is first queued before it starts. Running takes a couple of minutes.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-i33ymjhBoow/WoXiD0QJrRI/AAAAAAAAFUk/grTGXRffAYYVV-xCZQdVvOWz2WanCOQRACLcBGAs/s1600/pausais.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="504" data-original-width="1200" height="167" src="https://2.bp.blogspot.com/-i33ymjhBoow/WoXiD0QJrRI/AAAAAAAAFUk/grTGXRffAYYVV-xCZQdVvOWz2WanCOQRACLcBGAs/s400/pausais.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Pausing Integration Runtime</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-nyzSusjregU/WoXiNuE8d8I/AAAAAAAAFUo/x8TvUMtEUMkQtNmhSTht7T_lfQDW2SqsACLcBGAs/s1600/pauseais9.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="504" data-original-width="1200" height="167" src="https://2.bp.blogspot.com/-nyzSusjregU/WoXiNuE8d8I/AAAAAAAAFUo/x8TvUMtEUMkQtNmhSTht7T_lfQDW2SqsACLcBGAs/s400/pauseais9.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Resuming Integration Runtime (20 minutes+)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>8) Publish</b><br />
When your script is ready, it is time to publish it. Above the editor click on the Publish button. Confirm overriding any previously published versions.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-7P82FadETRo/WoXoruEo9NI/AAAAAAAAFU8/sGyouK9yKX4GJCQPL7R11IWXlF1CMvQlwCLcBGAs/s1600/pausais08.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="400" data-original-width="847" height="188" src="https://4.bp.blogspot.com/-7P82FadETRo/WoXoruEo9NI/AAAAAAAAFU8/sGyouK9yKX4GJCQPL7R11IWXlF1CMvQlwCLcBGAs/s400/pausais08.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Publish your script</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>9) Adding a Webhook</b><br />
To start the runbooks we will be using a Webhook. This is a URL with a long key in it. <b>Do not</b> share this URL, because with it others could execute your runbook. Treat it like a password.<br />
On the Runbook overview page click on <b>Webhook</b> and create a new Webhook. Give it a suitable name (similar to that of its runbook). Set the expiration date and don't forget to copy the URL. This is the only time you can see the URL; if you lose it you need to recreate the Webhook. The URL looks something like: <span style="font-size: xx-small;">https://s2events.azure-automation.net/webhooks?token=vfIx%2fOAHcsJCn95abSXbklPrPXNlFUHwpr%2bSWyANlk0%3d</span><br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-Sj40Y5XhW7w/Wr_nxKNGw2I/AAAAAAAAFcw/BcHbi76XFdwcdmK2AcrSjNueTO3P7JibwCLcBGAs/s1600/PauseResumeIR03.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="813" data-original-width="1197" height="271" src="https://1.bp.blogspot.com/-Sj40Y5XhW7w/Wr_nxKNGw2I/AAAAAAAAFcw/BcHbi76XFdwcdmK2AcrSjNueTO3P7JibwCLcBGAs/s400/PauseResumeIR03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Webhook</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
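Before wiring the Webhooks into ADF you can verify them from any machine, because a Webhook is just an HTTP POST endpoint. A minimal PowerShell sketch (replace the placeholder with your own copied URL):<br />
<pre class="brush: powershell; toolbar: false;"># Sketch: trigger the runbook manually via its Webhook
$webhookUri = 'paste-your-webhook-url-here'

# Webhooks only accept POST; the response should contain the id of the started job
$response = Invoke-RestMethod -Method Post -Uri $webhookUri
$response.JobIds
</pre>
<br />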
<b><span style="font-size: large;">B) Azure Data Factory</span></b><br />
The scripts are now ready. Next go to your existing ADF pipeline that executes the SSIS package. We will be adding two activities before the activity that executes the package and one after it.<br />
<br />
<br />
<b>1) Add Web Activity for Resume IR</b><br />
First collapse the General activities and drag a Web activity into your pipeline as the first activity. This activity will be calling the Webhook of the Resume-SSIS runbook. Give it a suitable name like 'Resume-SSIS'.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-F_PV2g22eEg/WsEFaRn1zxI/AAAAAAAAFdo/xPiHzIIUdB8E6SEofyFahtaOFJUAtE-kwCLcBGAs/s1600/PauseResumeIR04.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1419" height="245" src="https://1.bp.blogspot.com/-F_PV2g22eEg/WsEFaRn1zxI/AAAAAAAAFdo/xPiHzIIUdB8E6SEofyFahtaOFJUAtE-kwCLcBGAs/s400/PauseResumeIR04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Web Activity for Resume IR</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Resume IR Web activity settings</b><br />
Select the newly added Web activity and go to the Settings page. In the URL field paste the Webhook URL from step A9 (the one that resumes the IR) and select <b>Post</b> as the method.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-SoC9QnwML9U/WsEI2EOGQ0I/AAAAAAAAFd0/YG_-9Pj9LGU3b3fkWY5Pw_DPmHdO_3ruwCLcBGAs/s1600/PauseResumeIR05.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1419" height="245" src="https://4.bp.blogspot.com/-SoC9QnwML9U/WsEI2EOGQ0I/AAAAAAAAFd0/YG_-9Pj9LGU3b3fkWY5Pw_DPmHdO_3ruwCLcBGAs/s400/PauseResumeIR05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Webhook URL</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Wait until IR is online (Stored Procedure)</b><br />
Because calling the Webhook is asynchronous (it does not wait for the result) and starting the IR takes 15 to 20 minutes, we need to wait for that amount of time. The database view [catalog].[worker_agents] in the SSISDB database tells you when the IR is active.<br />
<br />
On the SQL Account tab you can select the existing connection to the SSISDB. Then go to the Stored Procedure tab and add 'sp_executesql' as the Stored procedure name. Next add a string parameter called stmt (statement) and paste the code below as its value.<br />
<br />
When finished connect the Resume-SSIS activity to this new Wait activity and then connect the Wait activity to the activity that executes the package.<br />
<pre class="brush: sql; toolbar: false;">-- Wait until Azure-SSIS IR is started
WHILE NOT EXISTS (SELECT * FROM [SSISDB].[catalog].[worker_agents] WHERE IsEnabled = 1 AND LastOnlineTime > DATEADD(MINUTE, -1, SYSDATETIMEOFFSET()))
BEGIN
WAITFOR DELAY '00:00:30';
END
</pre>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-76QCX4unqA0/WsENHx3MEUI/AAAAAAAAFeA/uhSeuVLSVwM4L4MIbDAjA44sx1TZFy1zQCLcBGAs/s1600/PauseResumeIR06.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1419" height="245" src="https://3.bp.blogspot.com/-76QCX4unqA0/WsENHx3MEUI/AAAAAAAAFeA/uhSeuVLSVwM4L4MIbDAjA44sx1TZFy1zQCLcBGAs/s400/PauseResumeIR06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add wait</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Tip</b>: add a 30-minute time-out on this activity to prevent endless waiting in case of unexpected errors.<br />
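Such a maximum wait can also be built into the T-SQL statement itself. A sketch, assuming a ceiling of 30 minutes (the counter variable and threshold are illustrative):<br />
<pre class="brush: sql; toolbar: false;">-- Wait until Azure-SSIS IR is started, but give up after 30 minutes
DECLARE @waited int = 0;
WHILE @waited < 1800 AND NOT EXISTS (SELECT * FROM [SSISDB].[catalog].[worker_agents] WHERE IsEnabled = 1 AND LastOnlineTime > DATEADD(MINUTE, -1, SYSDATETIMEOFFSET()))
BEGIN
    WAITFOR DELAY '00:00:30';
    SET @waited = @waited + 30;
END
IF @waited >= 1800
    THROW 50000, 'Integration Runtime did not start within 30 minutes.', 1;
</pre>
<br />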
<br />
<b>4) Pause IR Web Activity</b><br />
The last step is to add another Web activity, called Pause-SSIS. It is similar to the Resume-SSIS activity, but uses the other Webhook URL. When finished, connect the activity that executes the package to this new Pause activity and change the dependency condition from the default Success to Completion. Otherwise the IR keeps running when a package fails.<br />
<br />
The only thing left is some realignment and then you can publish and test your ADF pipeline.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-mH7fgel0mmg/WsEQNx1cknI/AAAAAAAAFeQ/zfc6c61GFnkDGLxr3neAIarRnbsiLJLZQCLcBGAs/s1600/PauseResumeIR07.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="870" data-original-width="1419" height="245" src="https://3.bp.blogspot.com/-mH7fgel0mmg/WsEQNx1cknI/AAAAAAAAFeQ/zfc6c61GFnkDGLxr3neAIarRnbsiLJLZQCLcBGAs/s400/PauseResumeIR07.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Web Activity for Pausing IR</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Test result</b><br />
If you run the pipeline and check its activities in the monitor, you see all the individual steps. Starting the IR took almost 25 of the 32 minutes in total, and you now only have to pay for those 32 minutes.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-6Conwa36HXg/WsEqoDxU1aI/AAAAAAAAFek/h66fHaJrXr04XDNG9HYIR4mi2-ffnp9jQCLcBGAs/s1600/PauseResumeIR08.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="415" data-original-width="1600" height="102" src="https://3.bp.blogspot.com/-6Conwa36HXg/WsEqoDxU1aI/AAAAAAAAFek/h66fHaJrXr04XDNG9HYIR4mi2-ffnp9jQCLcBGAs/s400/PauseResumeIR08.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Activities</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b style="text-align: start;"><span style="font-size: large;">Summary</span></b><br />
In this post I showed you how to pause and resume the Integration Runtime in the ADF pipeline. This is especially useful if you only have one job or just a few that don't run at the same time. For all other cases I recommend scheduling <a href="http://microsoft-ssis.blogspot.com/2018/02/pause-and-resume-integration-runtime-to.html" target="_blank">pause and resume</a>.Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-36364092107431214042018-03-08T18:56:00.003+01:002018-04-14T14:07:47.634+02:00Show SSIS error messages in the ADF monitor<b><span style="font-size: large;">Case</span></b><br />
I want to see SSIS error messages in the ADF monitor or in the <a href="http://microsoft-bitools.blogspot.com/2018/03/add-email-notification-in-azure-data.html" target="_blank">ADF email notification</a> so that I can have a quick look to determine the severity of the problem without logging in with SSMS to the catalog in Azure. Is that possible?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-vPutThfwCtE/WpLlEPxWXdI/AAAAAAAAFbI/HW8zD5wId4cqTIGRpUw8mEHEZ1lJGXmfQCLcBGAs/s1600/ADFNotification10.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="565" data-original-width="1510" height="148" src="https://3.bp.blogspot.com/-vPutThfwCtE/WpLlEPxWXdI/AAAAAAAAFbI/HW8zD5wId4cqTIGRpUw8mEHEZ1lJGXmfQCLcBGAs/s400/ADFNotification10.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Need more details in case of an error</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
It is possible, but it requires adjusting the T-SQL statement that executes the package. Please follow all steps in this <a href="http://microsoft-ssis.blogspot.com/2018/02/schedule-package-in-azure-integration.html">blog post</a>, but replace the T-SQL code from step 4 with the code below.<br />
<br />
The change is in the last part only. Previously it only showed that the execution failed, but now it also retrieves error messages from the catalog. Because space is a bit limited, we only show the first 7 errors. Errors containing 'validation' are less useful for the quick look we want, so those are filtered out as well. All messages are separated by a linefeed for a better overview.<br />
<br />
<pre class="brush: sql; toolbar: false;">-- Variables for execution and error message
declare @err_msg as varchar(8000)
declare @err_msg_part as varchar(1000)
declare @execution_id as bigint

-- Create execution and fill @execution_id variable
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'Package.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'SSISJoost', @project_name=N'MyAzureProject', @use32bitruntime=False, @reference_id=2, @useanyworker=True, @runinscaleout=True

-- Set logging level: 0=None, 1=Basic, 2=Performance, 3=Verbose
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=1

-- Set synchronized option 0=A-SYNCHRONIZED, 1=SYNCHRONIZED
-- A-SYNCHRONIZED: don't wait for the result
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'SYNCHRONIZED', @parameter_value=1

-- Execute the package with parameters from above
EXEC [SSISDB].[catalog].[start_execution] @execution_id, @retry_count=0

-- Check if the package executed successfully (only for SYNCHRONIZED execution)
IF (SELECT [status] FROM [SSISDB].[catalog].[executions] WHERE execution_id=@execution_id) <> 7
BEGIN
    SET @err_msg = 'Your package execution did not succeed for execution ID: ' + CAST(@execution_id AS NVARCHAR(20)) + CHAR(13) + CHAR(10)

    DECLARE err_cursor CURSOR FOR
    SELECT top(7) CAST([message] as varchar(1000)) as message -- Max 7 errors
    FROM   [catalog].[event_messages]
    WHERE  [event_name] = 'OnError' -- Only show errors
    AND    [operation_id] = @execution_id
    AND    [message] not like '%validation%'
    -- Exclude less useful validation messages like:
    -- Error: One or more component failed validation.
    -- Error: There were errors during task validation.
    -- Error: Error 0xC0012050 while executing package from project reference package "xxx". Package failed validation from the ExecutePackage task. The package cannot run.
    -- Error: xxx failed validation and returned error code 0xC020801C.
    -- Error: "xxx" failed validation and returned validation status "VS_ISBROKEN".

    OPEN err_cursor
    FETCH NEXT FROM err_cursor INTO @err_msg_part
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @err_msg = @err_msg + @err_msg_part + CHAR(13) + CHAR(10)
        FETCH NEXT FROM err_cursor INTO @err_msg_part
    END
    CLOSE err_cursor
    DEALLOCATE err_cursor

    RAISERROR(@err_msg,15,1)
END
</pre>
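<br />
If seven filtered messages turn out to be too few for a particular failure, you can still query the catalog directly in SSMS. For example (replace the operation_id value with the execution ID shown in the error message):<br />
<pre class="brush: sql; toolbar: false;">-- All error messages of one execution (run against the SSISDB)
SELECT  [message_time]
,       [message]
FROM    [SSISDB].[catalog].[event_messages]
WHERE   [event_name] = 'OnError'
AND     [operation_id] = 12345 -- your execution ID
ORDER BY [message_time];
</pre>
<br />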
<br />
Now it shows more details in the ADF monitor and if you are also using the <a href="http://microsoft-bitools.blogspot.com/2018/03/add-email-notification-in-azure-data.html" target="_blank">ADF email notifications</a> then the same messages will appear in the email. Feel free to suggest improvements in the comments.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-6WXOqpY-RHQ/WqF2IJvBZiI/AAAAAAAAFb4/n8ZDqWJZrww63Ws5IyY7K5zLZCdE4NNcQCLcBGAs/s1600/errormessagesadf.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="548" data-original-width="1600" height="136" src="https://4.bp.blogspot.com/-6WXOqpY-RHQ/WqF2IJvBZiI/AAAAAAAAFb4/n8ZDqWJZrww63Ws5IyY7K5zLZCdE4NNcQCLcBGAs/s400/errormessagesadf.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">More error details</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<div>
Note that it is not a complete overview of all messages, but in most cases it should be enough for the seasoned developer to quickly identify the problem and take action to solve it.</div>
<br />
<br />
<i>*update 13-04-2018: There is a new <a href="http://microsoft-ssis.blogspot.com/2018/04/new-adf-pipeline-activity-execute-ssis.html" target="_blank">Execute SSIS Package</a> activity, but without error options.*</i><br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Bellevue, WA, USA47.6101497 -122.201515947.4388297 -122.5242394 47.7814697 -121.87879240000001tag:blogger.com,1999:blog-2303058199815958946.post-21819394595464773862018-03-01T05:00:00.000+01:002018-03-01T05:00:03.680+01:00Azure Blob Source ≠Flat File Source<span style="font-size: large;"><b>Case</b></span><br />
I'm running my SSIS packages in Azure and my source is a flat file in an Azure Blob Storage container. Therefore I use the <a href="http://microsoft-ssis.blogspot.com/2015/06/azure-blob-source-and-destination.html" target="_blank">Azure Blob Source</a> as a source in my Data Flow Task. However this source has just a few formatting options compared to the Flat File Source (and its connection manager). I want to specify things like qualifiers and data types. How do I do that?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-e3oC-J6AtbE/Woig3ONckHI/AAAAAAAAFWc/-LFZfeOsumUveqpomEhHxaXkA5QCtLeLwCLcBGAs/s1600/flatfileblob02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="1030" data-original-width="946" height="400" src="https://4.bp.blogspot.com/-e3oC-J6AtbE/Woig3ONckHI/AAAAAAAAFWc/-LFZfeOsumUveqpomEhHxaXkA5QCtLeLwCLcBGAs/s400/flatfileblob02.png" width="366" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Blob Source has too few options</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
With the current version of the Azure Blob Source you can only specify the column separator, but there is a workaround available. Your Integration Runtime (IR) that is hosted in ADF is actually a virtual machine with Integration Services on it. A simple Script Task running on that IR reveals the drives and their available space; it shows that several drives are available on that virtual machine.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-Q8y3Zb1Qs-A/WoiiYFk4IrI/AAAAAAAAFWo/KsJt_U3xifgdoeZyNesI-uavmIeBMgZYACLcBGAs/s1600/flatfileblob03.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="412" data-original-width="1061" height="155" src="https://4.bp.blogspot.com/-Q8y3Zb1Qs-A/WoiiYFk4IrI/AAAAAAAAFWo/KsJt_U3xifgdoeZyNesI-uavmIeBMgZYACLcBGAs/s400/flatfileblob03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Log with drive details</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<pre class="brush: c#; toolbar: false;">// C# Code to replace your Main() method
public void Main()
{
    // Create array with drive information
    System.IO.DriveInfo[] allDrives = System.IO.DriveInfo.GetDrives();

    // Create string to store a message
    String DriveDetails = "";

    // Loop through all drives to get info about it
    foreach (System.IO.DriveInfo d in allDrives)
    {
        // Get drive letter (C:) and type (NTFS)
        DriveDetails = d.Name + "(" + d.DriveType + ")" + Environment.NewLine;

        // If drive is ready you can get more details
        if (d.IsReady == true)
        {
            DriveDetails += " - Volume label: " + d.VolumeLabel + Environment.NewLine;
            DriveDetails += " - File system: " + d.DriveFormat + Environment.NewLine;
            DriveDetails += " - Available space to current user: " + d.AvailableFreeSpace + Environment.NewLine;
            DriveDetails += " - Total available space: " + d.TotalFreeSpace + Environment.NewLine;
            DriveDetails += " - Total size of drive: " + d.TotalSize;
        }

        // Fire the message as warning to stand out between other messages
        Dts.Events.FireWarning(0, "Details", DriveDetails, "", 0);
    }

    // End Script Task
    Dts.TaskResult = (int)ScriptResults.Success;
}
</pre>
<br />
So the solution is to first use the <a href="http://microsoft-ssis.blogspot.com/2015/06/azure-upload-and-download-tasks.html" target="_blank">Azure Blob Download Task</a> to download the file from the Blob Storage Container to the Virtual Machine. After that you can use a regular Flat File Source in the Data Flow Task.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-eM7psC37dVw/Woie5-C9VJI/AAAAAAAAFWQ/qqPwYNjdl1MSw8Mk0gwqybuIFnmLlpDwgCLcBGAs/s1600/flatfileblob01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="519" data-original-width="362" height="400" src="https://1.bp.blogspot.com/-eM7psC37dVw/Woie5-C9VJI/AAAAAAAAFWQ/qqPwYNjdl1MSw8Mk0gwqybuIFnmLlpDwgCLcBGAs/s400/flatfileblob01.png" width="278" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Blob Download Task</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
I'm not sure what the purpose of these disks is and whether one of them is for non-persistent data (disks that are automatically cleaned), but I recommend using the E drive to temporarily store the downloaded files and to clean up afterwards.<br />
<br />
<b>Windows Temp folder</b><br />
An alternative way to pick a temporary folder on your IR machine is to use a very simple Script Task with only one line of code that retrieves the path of the Windows temp folder. The path looks something like <b>D:\Users\WATASK_1\AppData\Local\Temp\</b>. If you store this path in an SSIS string variable, you can use it in expressions on your tasks and the Flat File connection manager. After the next reboot Windows removes all old files in this folder.<br />
<pre class="brush: c#; toolbar: false;">// C# code (see line 5)
public void Main()
{
    // Store the path of the Windows temp folder in an SSIS variable
    Dts.Variables["User::tempPath"].Value = System.IO.Path.GetTempPath();
    Dts.TaskResult = (int)ScriptResults.Success;
}
</pre>
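<br />
With the variable filled you can, for example, use an expression on the ConnectionString property of the Flat File connection manager (the file name below is illustrative):<br />
<pre class="brush: plain; toolbar: false;">@[User::tempPath] + "downloadedfile.csv"</pre>
<br />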
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-IjXKizADUfc/Wosurx3EdxI/AAAAAAAAFXI/r6iw4Cjmrk80Fh803LFZJNno7D0DHoeDwCLcBGAs/s1600/scripttask01.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="795" data-original-width="1223" height="260" src="https://3.bp.blogspot.com/-IjXKizADUfc/Wosurx3EdxI/AAAAAAAAFXI/r6iw4Cjmrk80Fh803LFZJNno7D0DHoeDwCLcBGAs/s400/scripttask01.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Script Task that fills a variable</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-86479805089019114652018-02-15T21:44:00.003+01:002021-02-14T12:12:11.080+01:00Pause and resume Integration Runtime to save money<b><span style="font-size: large;">Case</span></b><br />
Azure Data Factory (V2) now <a href="http://microsoft-ssis.blogspot.com/2017/12/azure-integration-services-preview-adf.html" target="_blank">supports running SSIS packages</a> in an Integration Runtime, but you are charged by the hour. How can I automatically pause (and resume) my SSIS environment in Azure to save some money on my Azure bill?<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-Pxwyb5Q0NDg/WoXzGQ_f2sI/AAAAAAAAFVY/BR-AJMEZcJUvTamq90nhK9F0rM5_03BwgCLcBGAs/s1600/pausais.png" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="452" data-original-width="1318" height="136" src="https://1.bp.blogspot.com/-Pxwyb5Q0NDg/WoXzGQ_f2sI/AAAAAAAAFVY/BR-AJMEZcJUvTamq90nhK9F0rM5_03BwgCLcBGAs/s400/pausais.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 12.8px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: center; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Pause your Integration Runtime in the portal</span></td></tr>
</tbody></table>
<b><br /></b>
<br />
<div style="text-align: left;">
</div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
For this example I'm using an Integration Runtime (IR) that runs an SSIS package each hour during working hours. After working hours it is no longer necessary to refresh the data warehouse, so the IR can be suspended to save money. I will also suspend the <a href="http://microsoft-ssis.blogspot.com/2018/02/schedule-package-in-azure-integration.html" target="_blank">trigger</a> that runs the pipeline (package) each hour, to prevent errors. For this solution I will use a PowerShell script that runs in an Azure Automation Runbook.<br />
<br />
<b>1) Collect parameters</b><br />
Before we start coding, we first need to get the name of the Azure Data Factory and its resource group.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-wpMRrmowtwM/WoCVV5dOokI/AAAAAAAAFS8/MPVYYseiXPcTAeAoLjTzLaw4qct7wkQZACLcBGAs/s1600/pausais02.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="475" data-original-width="813" height="232" src="https://4.bp.blogspot.com/-wpMRrmowtwM/WoCVV5dOokI/AAAAAAAAFS8/MPVYYseiXPcTAeAoLjTzLaw4qct7wkQZACLcBGAs/s400/pausais02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Your Azure Data Factory (V2)</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
If you also want to disable the trigger, we need its name as well. This is probably only necessary if you run hourly and didn't create separate triggers for each hour. You can find the name by clicking on Author & Monitor under Quick links.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-OHyUTVOrR_4/WoCV0Csmw-I/AAAAAAAAFTA/5MrH676DPLIYI4SdqWWGe4l8PnZKEntOwCLcBGAs/s1600/pausais03.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="549" data-original-width="1383" height="158" src="https://2.bp.blogspot.com/-OHyUTVOrR_4/WoCV0Csmw-I/AAAAAAAAFTA/5MrH676DPLIYI4SdqWWGe4l8PnZKEntOwCLcBGAs/s400/pausais03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Your trigger</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Azure Automation Account</b><br />
Create an Azure Automation Account. You can find it under + New, <b>Monitoring + Management</b>. Make sure that Create Azure Run As account is turned on.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-Ulyhha-yDko/WoCYlk7VBTI/AAAAAAAAFTQ/zhD6gZFen_Up_kxv414JBFd9zeHz5Qm8ACLcBGAs/s1600/pausais04.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="948" data-original-width="1124" height="336" src="https://4.bp.blogspot.com/-Ulyhha-yDko/WoCYlk7VBTI/AAAAAAAAFTQ/zhD6gZFen_Up_kxv414JBFd9zeHz5Qm8ACLcBGAs/s400/pausais04.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Automation Account</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Import Module</b><br />
We need to tell our code about Integration Runtimes in Azure Data Factory. You do this by adding modules. Scroll down in the menu of the Automation Account and click on Modules. Now you see all installed modules. We need to add the module called <b>AzureRM.DataFactoryV2</b>, but it depends on <b>AzureRM.Profile</b> (≥ 4.2.0). Click on Browse gallery, search for <b>AzureRM.Profile</b> and import it, and then repeat this for <b>AzureRM.DataFactoryV2</b>. Make sure to add the V2 version!<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-ag0mQH7pft8/WoCcOUv1TcI/AAAAAAAAFTc/VRNh58oGKjkHwocK3xJlv8mjvTykq45GgCLcBGAs/s1600/pausais05.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="685" data-original-width="1129" height="242" src="https://1.bp.blogspot.com/-ag0mQH7pft8/WoCcOUv1TcI/AAAAAAAAFTc/VRNh58oGKjkHwocK3xJlv8mjvTykq45GgCLcBGAs/s400/pausais05.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Import modules</td></tr>
</tbody></table>
<b></b><b></b><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
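<div>If you prefer scripting over clicking through the gallery, the modules can also be imported from the PowerShell Gallery with the AzureRM.Automation cmdlets. Below is just a sketch to illustrate the idea; the account name is an example and you should of course use your own names.</div>
<pre class="brush: powershell; toolbar: false;"># Sketch: import the modules via PowerShell instead of the portal
# (account and resource group names are examples)
$AutomationAccount = 'MyAutomationAccount'
$ResourceGroup = 'joost_van_rossum'

# Note: the import runs asynchronously; in a real script you would
# poll Get-AzureRmAutomationModule until AzureRM.Profile is Available
# before importing AzureRM.DataFactoryV2
foreach ($ModuleName in 'AzureRM.Profile', 'AzureRM.DataFactoryV2')
{
    $null = New-AzureRmAutomationModule `
        -AutomationAccountName $AutomationAccount `
        -ResourceGroupName $ResourceGroup `
        -Name $ModuleName `
        -ContentLink "https://www.powershellgallery.com/api/v2/package/$ModuleName"
}
</pre>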
<b>4) Connections</b><br />
This step is for your information only, to help you understand the code. Under Connections you will find a default connection named 'AzureRunAsConnection' that contains information about the Azure environment, like the tenant id and the subscription id. To prevent hardcoded connection details we will retrieve some of these fields in the PowerShell code.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-aGDAyDrfCrg/WoNVxv-T8EI/AAAAAAAAFTs/xiBgbD8AH50tx4ymjOeJIJ8dnAKAkKGYQCLcBGAs/s1600/pausais06.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="518" data-original-width="1383" height="148" src="https://3.bp.blogspot.com/-aGDAyDrfCrg/WoNVxv-T8EI/AAAAAAAAFTs/xiBgbD8AH50tx4ymjOeJIJ8dnAKAkKGYQCLcBGAs/s400/pausais06.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span face=""az_ea_font" , "segoe ui" , , "segoe wp" , "tahoma" , "arial" , sans-serif" style="background-color: transparent; color: #252525; display: inline; float: none; font-size: 12px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">AzureRunAsConnection</span></td></tr>
</tbody></table>
<b></b><i></i><u></u><sub></sub><sup></sup><strike></strike><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Runbooks</b><br />
Now it is time to add a new Azure Runbook for the PowerShell code. Click on Runbooks and then add a new runbook (there are also five example runbooks, of which AzureAutomationTutorialScript could be useful as an example). Give your new Runbook a suitable name and choose PowerShell as type. There will be two separate runbooks/scripts: one for pausing and one for resuming. When finished with the pause script you need to repeat these steps for the resume script.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-75mtx-V6ABg/WoNbZh3uYYI/AAAAAAAAFT8/a950DX5w4xcs0GicCh25CPXlILGnwp_gwCLcBGAs/s1600/pausais07.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="959" data-original-width="1582" height="241" src="https://2.bp.blogspot.com/-75mtx-V6ABg/WoNbZh3uYYI/AAAAAAAAFT8/a950DX5w4xcs0GicCh25CPXlILGnwp_gwCLcBGAs/s400/pausais07.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Create new Rubook</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Edit Script</b><br />
After clicking Create in the previous step, the editor will be opened. (When editing an existing Runbook you need to click on the Edit button first.) You can copy and paste the code of one of the scripts below into your editor. Study the green comments to understand the code, and make sure to fill in the right values for the variables (see the parameters collected in step 1).<br />
<br />
The first script is the pause script and the second is the resume script. You could merge both scripts and use an if statement on the State property to either pause or resume, but I prefer two separate scripts, each with its own schedule.<br />
<br />
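<div>The merged variant mentioned above could look something like the sketch below: one runbook that checks the State property and then either pauses or resumes. It reuses the login code and variables of the full scripts, so only the decision part is shown here.</div>
<pre class="brush: powershell; toolbar: false;"># Sketch of a merged runbook (login code and parameters as in the scripts below)
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName $DataFactoryName `
    -ResourceGroupName $ResourceGroup

if ($IntegrationRuntime.State -eq "Started")
{
    # Running, so pause it
    $null = Stop-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $ResourceGroup `
        -Force
}
else
{
    # Stopped, so resume it
    $null = Start-AzureRmDataFactoryV2IntegrationRuntime `
        -DataFactoryName $DataFactoryName `
        -Name $IntegrationRuntime.Name `
        -ResourceGroupName $ResourceGroup `
        -Force
}
</pre>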
<pre class="brush: powershell; toolbar: false;"># This scripts pauses your Integration Runtime (and its trigger) if it is running
# Parameters
$ConnectionName = 'AzureRunAsConnection'
$DataFactoryName = 'ADF-SSISJoost'
$ResourceGroup = 'joost_van_rossum'
$TriggerName = 'Hourly'
# Do not continue after an error
$ErrorActionPreference = "Stop"
########################################################
# Log in to Azure (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
try
{
# Get the connection "AzureRunAsConnection"
$ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName
'Log in to Azure...'
$null = Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $ServicePrincipalConnection.TenantId `
-ApplicationId $ServicePrincipalConnection.ApplicationId `
-CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch
{
if (!$ServicePrincipalConnection)
{
# You forgot to turn on 'Create Azure Run As account'
$ErrorMessage = "Connection $ConnectionName not found."
throw $ErrorMessage
}
else
{
# Something went wrong
Write-Error -Message $_.Exception.Message
throw $_.Exception
}
}
########################################################
# Get Integration Runtime in Azure Data Factory
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
-DataFactoryName $DataFactoryName `
-ResourceGroupName $ResourceGroup
# Check if Integration Runtime was found
if (!$IntegrationRuntime)
{
# Your ADF does not have an Integration Runtime
# or the ADF does not exist
$ErrorMessage = "No Integration Runtime found in ADF $($DataFactoryName)."
throw $ErrorMessage
}
# Check if the Integration Runtime is running
elseif ($IntegrationRuntime.State -eq "Started")
{
<# Start Trigger Deactivation #>
# Getting trigger to check if it exists
$Trigger = Get-AzureRmDataFactoryV2Trigger `
-DataFactoryName $DataFactoryName `
-Name $TriggerName `
-ResourceGroupName $ResourceGroup
# Check if the trigger was found
if (!$Trigger)
{
# Fail options:
# The ADF does not exist (typo)
# The trigger does not exist (typo)
$ErrorMessage = "Trigger $($TriggerName) not found."
throw $ErrorMessage
}
# Check if the trigger is activated
elseif ($Trigger.RuntimeState -eq "Started")
{
Write-Output "Stopping Trigger $($TriggerName)"
$null = Stop-AzureRmDataFactoryV2Trigger `
-DataFactoryName $DataFactoryName `
-Name $TriggerName `
-ResourceGroupName $ResourceGroup `
-Force
}
else
{
# Write message to screen (not throwing error)
Write-Output "Trigger $($TriggerName) is not activated."
}
<# End Trigger Deactivation #>
# Stop the integration runtime
Write-Output "Pausing Integration Runtime $($IntegrationRuntime.Name)."
$null = Stop-AzureRmDataFactoryV2IntegrationRuntime `
-DataFactoryName $IntegrationRuntime.DataFactoryName `
-Name $IntegrationRuntime.Name `
-ResourceGroupName $IntegrationRuntime.ResourceGroupName `
-Force
Write-Output "Done"
}
else
{
# Write message to screen (not throwing error)
Write-Output "Integration Runtime $($IntegrationRuntime.Name) is not running."
}
</pre>
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;"># This scripts resumes your Integration Runtime (and its trigger) if it is stopped
# Parameters
$ConnectionName = 'AzureRunAsConnection'
$DataFactoryName = 'ADF-SSISJoost'
$ResourceGroup = 'joost_van_rossum'
$TriggerName = 'Hourly'
# Do not continue after an error
$ErrorActionPreference = "Stop"
########################################################
# Log in to Azure (standard code)
########################################################
Write-Verbose -Message 'Connecting to Azure'
try
{
# Get the connection "AzureRunAsConnection"
$ServicePrincipalConnection = Get-AutomationConnection -Name $ConnectionName
'Log in to Azure...'
$null = Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $ServicePrincipalConnection.TenantId `
-ApplicationId $ServicePrincipalConnection.ApplicationId `
-CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint
}
catch
{
if (!$ServicePrincipalConnection)
{
# You forgot to turn on 'Create Azure Run As account'
$ErrorMessage = "Connection $ConnectionName not found."
throw $ErrorMessage
}
else
{
# Something went wrong
Write-Error -Message $_.Exception.Message
throw $_.Exception
}
}
########################################################
# Get Integration Runtime in Azure Data Factory
$IntegrationRuntime = Get-AzureRmDataFactoryV2IntegrationRuntime `
-DataFactoryName $DataFactoryName `
-ResourceGroupName $ResourceGroup
# Check if Integration Runtime was found
if (!$IntegrationRuntime)
{
# Your ADF does not have an Integration Runtime
# or the ADF does not exist
$ErrorMessage = "No Integration Runtime found in ADF $($DataFactoryName)."
throw $ErrorMessage
}
# Check if the Integration Runtime is running
elseif ($IntegrationRuntime.State -ne "Started")
{
# Resume the integration runtime
Write-Output "Resuming Integration Runtime $($IntegrationRuntime.Name)."
$null = Start-AzureRmDataFactoryV2IntegrationRuntime `
-DataFactoryName $IntegrationRuntime.DataFactoryName `
-Name $IntegrationRuntime.Name `
-ResourceGroupName $IntegrationRuntime.ResourceGroupName `
-Force
Write-Output "Done"
}
else
{
# Write message to screen (not throwing error)
Write-Output "Integration Runtime $($IntegrationRuntime.Name) is already running."
}
<# Start Trigger Activation #>
# Getting trigger to check if it exists
$Trigger = Get-AzureRmDataFactoryV2Trigger `
-DataFactoryName $DataFactoryName `
-Name $TriggerName `
-ResourceGroupName $ResourceGroup
# Check if the trigger was found
if (!$Trigger)
{
# Fail options:
# The ADF does not exist (typo)
# The trigger does not exist (typo)
$ErrorMessage = "Trigger $($TriggerName) not found."
throw $ErrorMessage
}
# Check if the trigger is activated
elseif ($Trigger.RuntimeState -ne "Started")
{
Write-Output "Resuming Trigger $($TriggerName)"
$null = Start-AzureRmDataFactoryV2Trigger `
-DataFactoryName $DataFactoryName `
-Name $TriggerName `
-ResourceGroupName $ResourceGroup `
-Force
}
else
{
# Write message to screen (not throwing error)
Write-Output "Trigger $($TriggerName) is already activated."
}
<# End Trigger Activation #>
</pre>
<br />
<b>Note: </b>if you don't want to disable and re-enable your trigger, remove the lines between the start and end trigger markers (the <# ... #> comments) in both scripts.<br />
<br />
<b>7) Testing</b><br />
You can use the Test Pane menu option in the editor to test your PowerShell scripts. When clicking on Run it will first Queue the script before Starting it. Running takes a couple of minutes.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-i33ymjhBoow/WoXiD0QJrRI/AAAAAAAAFUk/grTGXRffAYYVV-xCZQdVvOWz2WanCOQRACLcBGAs/s1600/pausais.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="504" data-original-width="1200" height="167" src="https://2.bp.blogspot.com/-i33ymjhBoow/WoXiD0QJrRI/AAAAAAAAFUk/grTGXRffAYYVV-xCZQdVvOWz2WanCOQRACLcBGAs/s400/pausais.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Pausing Integration Runtime</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-nyzSusjregU/WoXiNuE8d8I/AAAAAAAAFUo/x8TvUMtEUMkQtNmhSTht7T_lfQDW2SqsACLcBGAs/s1600/pauseais9.png" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="504" data-original-width="1200" height="167" src="https://2.bp.blogspot.com/-nyzSusjregU/WoXiNuE8d8I/AAAAAAAAFUo/x8TvUMtEUMkQtNmhSTht7T_lfQDW2SqsACLcBGAs/s400/pauseais9.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Resuming Integration Runtime (20 minutes+)</td></tr>
</tbody></table>
<b></b><i></i><u></u><sub></sub><sup></sup><strike></strike><br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>8) Publish</b><br />
When your script is ready, it is time to publish it. Above the editor click on the Publish button. Confirm overriding any previously published versions.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-7P82FadETRo/WoXoruEo9NI/AAAAAAAAFU8/sGyouK9yKX4GJCQPL7R11IWXlF1CMvQlwCLcBGAs/s1600/pausais08.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="400" data-original-width="847" height="188" src="https://4.bp.blogspot.com/-7P82FadETRo/WoXoruEo9NI/AAAAAAAAFU8/sGyouK9yKX4GJCQPL7R11IWXlF1CMvQlwCLcBGAs/s400/pausais08.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Publish your script</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>9) Schedule</b><br />
And now that we have a working and published Azure Runbook, we need to schedule it. Click on Schedule to create a new schedule for your runbook. My packages run each hour during working hours, so for the resume script I created a schedule that runs every working day at 7:00 AM. The pause script could, for example, be scheduled on working days at 9:00 PM (21:00).<br />
Now hit the refresh button in the Azure Data Factory dashboard to see if it really works. It takes a few minutes to run, so don't worry too soon.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-AsUuGPt4Wyw/WoXsKbA96OI/AAAAAAAAFVI/hgYFEBv1pxIGDOcqfAmlwWGlFAeEkpUxwCLcBGAs/s1600/pausais09.gif" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="606" data-original-width="847" height="285" src="https://1.bp.blogspot.com/-AsUuGPt4Wyw/WoXsKbA96OI/AAAAAAAAFVI/hgYFEBv1pxIGDOcqfAmlwWGlFAeEkpUxwCLcBGAs/s400/pausais09.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add schedule</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
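<div>Creating the schedule can also be scripted with the AzureRM.Automation cmdlets, run from your own machine. The sketch below creates a weekday 7:00 AM schedule and attaches it to the resume runbook; all names are examples.</div>
<pre class="brush: powershell; toolbar: false;"># Sketch: schedule the resume runbook on working days at 7:00 AM
# (account, resource group and runbook names are examples)
New-AzureRmAutomationSchedule `
    -AutomationAccountName 'MyAutomationAccount' `
    -ResourceGroupName 'joost_van_rossum' `
    -Name 'WorkingDays 7AM' `
    -StartTime (Get-Date '07:00:00').AddDays(1) `
    -WeekInterval 1 `
    -DaysOfWeek Monday, Tuesday, Wednesday, Thursday, Friday

# Attach the schedule to the runbook
Register-AzureRmAutomationScheduledRunbook `
    -AutomationAccountName 'MyAutomationAccount' `
    -ResourceGroupName 'joost_van_rossum' `
    -RunbookName 'ResumeIR' `
    -ScheduleName 'WorkingDays 7AM'
</pre>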
<span style="font-size: large;"><b>Summary</b></span><br />
In this post you saw how you can pause and resume your Integration Runtime in ADF to save some money on your Azure bill during the quiet hours. As said before, pausing and resuming the trigger is optional. When creating the schedules, keep in mind that resuming/starting takes around 20 minutes to finish, and note that you also pay during this startup phase.<br />
Steps 5 to 9 need to be repeated for the resume script after you have finished the pause script.<br />
<br />
<b>Update:</b> you can also do a <a href="http://microsoft-ssis.blogspot.com/2018/04/start-and-stop-integration-runtime-in.html" target="_blank">pause and resume in the pipeline itself</a> if you only have one ETL job.<br />
<br />
<b>Update 2</b>: now even easier done <a href="https://microsoft-ssis.blogspot.com/2020/06/start-ssis-ir-via-rest-api.html" target="_blank">via Rest API</a>Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-87500774608705648432018-02-10T00:07:00.000+01:002018-03-04T16:02:37.444+01:00SSIS Snack: The semaphore timeout period has expired<b><span style="font-size: large;">Case</span></b><br />
My SSIS package that runs in Azure Data Factory V2 (and gets data from an on-premises source) suddenly stops working and throws a communication error. After this error it won't run again. What is happening?<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-nNlTqWp4z1Q/Wn4UB4B-90I/AAAAAAAAFR8/PivDBPG5ouoDLYNXeUkOuS32UKOz8TR4QCLcBGAs/s1600/ais_error01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="247" data-original-width="1066" height="91" src="https://4.bp.blogspot.com/-nNlTqWp4z1Q/Wn4UB4B-90I/AAAAAAAAFR8/PivDBPG5ouoDLYNXeUkOuS32UKOz8TR4QCLcBGAs/s400/ais_error01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">TCP Provider: The semaphore timeout period has expired</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="font-size: xx-small;">DFT - DIM_xxxxx:Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.<br />An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Communication link failure".<br />An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "TCP Provider: The semaphore timeout period has expired.".</span><br />
<span style="font-size: xx-small;"><br /></span>
<br />
<b><span style="font-size: large;">Solution</span></b><br />
The error seems to be caused by a network communication hiccup with our on-premises source, which is connected via a VNET in Azure. Although I do not know the actual cause of the error, there is a solution: restart your Integration Runtime.<br />
<br />
<b>1) ADF dashboard</b><br />
Go to your Azure Data Factory (ADF) that hosts your Integration Runtime and click on <b>Author & Monitor</b> within the Quick links section. This will open the ADF dashboard.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-PAAy_7ZA3nI/Wn4ZZo4TFsI/AAAAAAAAFSM/65g2d1ynvUkFXzddb83fHRFFZ7N07LZsgCLcBGAs/s1600/ais_error02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="533" data-original-width="735" height="290" src="https://3.bp.blogspot.com/-PAAy_7ZA3nI/Wn4ZZo4TFsI/AAAAAAAAFSM/65g2d1ynvUkFXzddb83fHRFFZ7N07LZsgCLcBGAs/s400/ais_error02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Author & Monitor</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Stop & Start IR</b><br />
Click on Author (pencil), then on Connections and then on Integration Runtimes (IR). Then stop and start your IR. This could take up to 30 minutes! After that, rerun your package and it should be working again.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-RXpo_GsW0No/Wn4m90X1WeI/AAAAAAAAFSc/pqv73nV8cEMER8dpZ1UblV9inKgCJLBkACLcBGAs/s1600/aisstartstop.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="786" data-original-width="1270" height="247" src="https://1.bp.blogspot.com/-RXpo_GsW0No/Wn4m90X1WeI/AAAAAAAAFSc/pqv73nV8cEMER8dpZ1UblV9inKgCJLBkACLcBGAs/s400/aisstartstop.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Stop & Start IR</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
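<div>If you prefer PowerShell over clicking in the portal, the same restart can be scripted with the cmdlets from the pause and resume post. This is just a sketch; the factory, runtime and resource group names are examples.</div>
<pre class="brush: powershell; toolbar: false;"># Sketch: restart the Integration Runtime (names are examples)
$null = Stop-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName 'ADF-SSISJoost' `
    -Name 'SSISJoost-IR' `
    -ResourceGroupName 'joost_van_rossum' `
    -Force

# Starting again can take up to 30 minutes
$null = Start-AzureRmDataFactoryV2IntegrationRuntime `
    -DataFactoryName 'ADF-SSISJoost' `
    -Name 'SSISJoost-IR' `
    -ResourceGroupName 'joost_van_rossum' `
    -Force
</pre>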
Please let me know in the comments whether it worked for you and if you found the actual cause of the error.Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-27774580186442782082018-02-04T22:51:00.001+01:002018-04-14T13:55:49.266+02:00Schedule package in Azure Integration Services (ADF V2) <b><span style="font-size: large;">Case</span></b><br />
I have <a href="http://microsoft-ssis.blogspot.com/2017/12/deploying-to-azure-integration-services.html" target="_blank">deployed</a> my SSIS project to an SSIS Catalog in Azure (Data Factory) and now I want to schedule the package to run each day at 7:00AM. How do I do that?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-a05GALXPK-0/WnWy7OHe_II/AAAAAAAAFP0/U6vQh1JYiTYB8OB_IyxHF-Vq-XhvfErSQCLcBGAs/s1600/scheduleAIS01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="430" data-original-width="534" height="321" src="https://2.bp.blogspot.com/-a05GALXPK-0/WnWy7OHe_II/AAAAAAAAFP0/U6vQh1JYiTYB8OB_IyxHF-Vq-XhvfErSQCLcBGAs/s400/scheduleAIS01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">How to schedule this SSIS package in Azure?</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">
Solution</span></b><br />
Microsoft suggests the following options:<br />
<ul>
<li><b>SQL Server Agent</b>. Here you use an on-premises SQL Server Agent to schedule the SSIS package in the cloud. This sounds like the least attractive solution, but if you still have a lot of on-premises projects that all run via SQL Server Agent, it is easy to use and keeps all scheduled executions in one location.</li>
<li><b>SQL Database elastic jobs</b>. This requires adding an extra component to your subscription, but if you already use a SQL Elastic database pool then this could be a good solution. Will try this in a future post.</li>
<li><b>Azure Data Factory</b>. Since we already use ADF V2 to host the Integration Runtime, this is the most obvious solution. It executes a stored procedure that executes the package.</li>
</ul>
<div>
This post explains the ADF solution with the stored procedure, but executing a stored procedure can also be done in various other Azure components, such as a Runbook in Azure Automation, Logic Apps or even Azure Functions with a trigger on newly arriving files in a Blob Storage container.</div>
<br />
<i>*update 13-04-2018: new <a href="http://microsoft-ssis.blogspot.com/2018/04/new-adf-pipeline-activity-execute-ssis.html" target="_blank">Execute SSIS Package</a> activity*</i><br />
<i></i><br />
<b>1) ADF V2 - Author & Monitor</b><br />
Go to your Azure Data Factory that hosts your SSIS Integration Runtime. Under Quick links, click on the <b>Author & Monitor</b> link. A new tab will be opened with the Azure Data Factory dashboard. Next click on <b>Create Pipeline</b> circle and go to the next step.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-FC_OKH0SwUo/WnYUZprPPPI/AAAAAAAAFQA/quNnGMmZo08_q9j0YYXA7sS2xcvqOEKNQCLcBGAs/s1600/AISschedule02.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="925" data-original-width="1193" height="310" src="https://2.bp.blogspot.com/-FC_OKH0SwUo/WnYUZprPPPI/AAAAAAAAFQA/quNnGMmZo08_q9j0YYXA7sS2xcvqOEKNQCLcBGAs/s400/AISschedule02.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Factory dashboard</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Create pipeline with Stored Procedure</b><br />
You just created a new pipeline in the previous step. Give it a descriptive name (like 'SSIS Job MyAzureProject') in the General tab; a description is optional. Next collapse the General activities and drag a Stored Procedure activity to the canvas. Again, give it a descriptive name in the General tab.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-fo286J2C9Kw/WncNDvlnWWI/AAAAAAAAFQM/NnK5oQGu3k09DxW_4UDxdFGJp1u5tXD5wCLcBGAs/s1600/AISschedule03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="962" data-original-width="1193" height="322" src="https://2.bp.blogspot.com/-fo286J2C9Kw/WncNDvlnWWI/AAAAAAAAFQM/NnK5oQGu3k09DxW_4UDxdFGJp1u5tXD5wCLcBGAs/s400/AISschedule03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Stored Procedure to Pipeline</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Linked service</b><br />
The next step is to create a linked service to the SSISDB to execute the Stored Procedure. Go to the SQL Account tab and add a new <b>Linked service</b>. Point it to the SSISDB that hosts the package that you want to execute. Hit the test button after filling in all fields to make sure everything is correct. Then click Finish and continue to the next step.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-JwQnJr2q0_0/WndlqXHxDJI/AAAAAAAAFQk/RFIdq1HR0EsDhV--T2LwlVVTsn4rLobvACLcBGAs/s1600/AISschedule05.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="962" data-original-width="658" height="400" src="https://3.bp.blogspot.com/-JwQnJr2q0_0/WndlqXHxDJI/AAAAAAAAFQk/RFIdq1HR0EsDhV--T2LwlVVTsn4rLobvACLcBGAs/s400/AISschedule05.png" width="272" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Most important fields of the Linked service</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-5b-eTPIlvTw/WnctDCwgm9I/AAAAAAAAFQY/zGAkri0kcMcX8BYoYYkcZYqNl5fV1DzWACLcBGAs/s1600/AISschedule04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="962" data-original-width="1549" height="247" src="https://4.bp.blogspot.com/-5b-eTPIlvTw/WnctDCwgm9I/AAAAAAAAFQY/zGAkri0kcMcX8BYoYYkcZYqNl5fV1DzWACLcBGAs/s400/AISschedule04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Linked service</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>4) Creating Stored Procedure call</b><br />
The code to execute a package consists of multiple Stored Procedure calls. To keep it simple we will use <b>sp_executesql</b> to execute a single string of SQL code containing all those calls. You can easily generate the SQL for this in SSMS.<br />
<br />
Go to your package in the Catalog, right-click it and choose Execute... Now set all options like Logging Level, Environment and 32/64bit. After setting all options, hit the Script button instead of the Ok button. This is the code you want to use, and you can fine-tune it with some code to check whether the package finished successfully.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-7Z2ArG9tPz8/Wndr520rNrI/AAAAAAAAFQw/0Lgu2Sz6zA4TTJo2R42KCIzJ9pGeCa6HwCLcBGAs/s1600/AISschedule06.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="742" data-original-width="1552" height="190" src="https://4.bp.blogspot.com/-7Z2ArG9tPz8/Wndr520rNrI/AAAAAAAAFQw/0Lgu2Sz6zA4TTJo2R42KCIzJ9pGeCa6HwCLcBGAs/s400/AISschedule06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Generating code in SSMS</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
The code below was generated and fine-tuned. Copy your code; we will need it in the next step.<br />
<pre class="brush: sql; toolbar: false;">-- Variables for execution and error message
DECLARE @execution_id bigint, @err_msg NVARCHAR(150)
-- Create execution and fill @execution_id variable
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'Package.dtsx', @execution_id=@execution_id OUTPUT, @folder_name=N'SSISJoost', @project_name=N'MyAzureProject', @use32bitruntime=False, @reference_id=Null, @useanyworker=True, @runinscaleout=True
-- Set logging level: 0=None, 1=Basic, 2=Performance, 3=Verbose
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'LOGGING_LEVEL', @parameter_value=1
-- Set synchronized option 0=ASYNCHRONIZED, 1=SYNCHRONIZED
-- ASYNCHRONIZED: don't wait for the result
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id, @object_type=50, @parameter_name=N'SYNCHRONIZED', @parameter_value=1
-- Execute the package with parameters from above
EXEC [SSISDB].[catalog].[start_execution] @execution_id, @retry_count=0
-- Check if the package executed successfully (only for SYNCHRONIZED execution)
IF(SELECT [status] FROM [SSISDB].[catalog].[executions] WHERE execution_id=@execution_id)<>7
BEGIN
SET @err_msg=N'Your package execution did not succeed for execution ID: ' + CAST(@execution_id AS NVARCHAR(20))
RAISERROR(@err_msg,15,1)
END
</pre>
<br />
The last part of this T-SQL code is very useful. It will cause an error in the pipeline monitor of ADF when a package fails. When you click on the text balloon it will show which SSIS execution failed.<br />
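If you want to see the underlying SSIS error messages as well, you can query the catalog yourself. The sketch below uses the standard SSISDB catalog view <b>catalog.event_messages</b> (message_type 120 = error); the execution ID is just an example value that you would replace with the ID shown in ADF's error balloon.<br />
<pre class="brush: sql; toolbar: false;">-- Example: list the error messages for one execution
-- Replace 12345 with the execution ID from ADF's error message
SELECT  [message_time]
,       [message]
FROM    [SSISDB].[catalog].[event_messages]
WHERE   [operation_id] = 12345
AND     [message_type] = 120  -- 120 = error
ORDER BY [message_time]
</pre>
<br />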
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-O8mkjBbnP2s/Wn4QpHRQJbI/AAAAAAAAFRw/Y-2dCTwkLX4qwTa3nkRZ2GGbzPDE058HACLcBGAs/s1600/scheduleAIS09.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="817" data-original-width="1600" height="203" src="https://1.bp.blogspot.com/-O8mkjBbnP2s/Wn4QpHRQJbI/AAAAAAAAFRw/Y-2dCTwkLX4qwTa3nkRZ2GGbzPDE058HACLcBGAs/s400/scheduleAIS09.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Showing when a package fails in the Pipeline monitor</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>5) Adding Stored Procedure</b><br />
Go to the Stored Procedure tab, check the Edit option and enter the Stored Procedure name <b>sp_executesql</b> manually. Then add a new parameter with the name <b>stmt</b> (type string) and paste the complete code from the previous step into the value field. After this we are ready to test and schedule this ADF pipeline.<br />
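The activity then effectively executes something like the sketch below: your entire script is passed as one Unicode string to the <b>@stmt</b> parameter of sp_executesql. The script body is abbreviated here for readability.<br />
<pre class="brush: sql; toolbar: false;">-- Rough equivalent of what the Stored Procedure activity executes:
-- the complete script from step 4 becomes the value of @stmt
EXEC sp_executesql @stmt = N'DECLARE @execution_id bigint, @err_msg NVARCHAR(150)
EXEC [SSISDB].[catalog].[create_execution] @package_name=N''Package.dtsx'', ...
EXEC [SSISDB].[catalog].[start_execution] @execution_id, @retry_count=0'
</pre>
Note that any single quotes inside the script must be doubled when it is embedded in the Unicode string literal.<br />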
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-kCAHTK2SLnI/WndyQE2eMOI/AAAAAAAAFQ8/ukBcnSCYWqQmoa8_6SeGmtmPM5eh-QP1QCLcBGAs/s1600/AISschedule07.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="959" data-original-width="1552" height="246" src="https://4.bp.blogspot.com/-kCAHTK2SLnI/WndyQE2eMOI/AAAAAAAAFQ8/ukBcnSCYWqQmoa8_6SeGmtmPM5eh-QP1QCLcBGAs/s400/AISschedule07.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Stored Procedure code</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>6) Publish and trigger</b><br />
Now it is time to test the Pipeline, but first hit the <b>Publish All</b> button (on the left side) to publish your new pipeline to ADF. Then click on the Trigger button and choose <b>Trigger Now</b> to execute the pipeline immediately.<br />
<br />
After that click on the Monitor button to check your execution and/or go to the Catalog and open an execution report to see the result.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-DEZaMMC1_1E/Wnd3a1cAQCI/AAAAAAAAFRI/A8bxt0x_AtobpQ_ImPsAZcVXMjF3ctU_ACLcBGAs/s1600/AISschedule08.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://1.bp.blogspot.com/-DEZaMMC1_1E/Wnd3a1cAQCI/AAAAAAAAFRI/A8bxt0x_AtobpQ_ImPsAZcVXMjF3ctU_ACLcBGAs/s400/AISschedule08.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Publish and trigger manually</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>7) Schedule</b><br />
If everything went well it is time to schedule your package. Go back to the <b>Author</b> page by clicking on the pencil (on the left side). Then click on the Trigger button, but now choose <b>New/Edit</b> to create a new schedule for your pipeline (package). For this example I chose Daily at 7:00 AM. After adding the new schedule you have to publish your pipeline again.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-glgDnwf7bhY/Wnd6cs82MWI/AAAAAAAAFRU/fyVHAT5_8GUYbsxNya93ZfgYcGum3DOZACLcBGAs/s1600/AISschedule09.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="803" data-original-width="1600" height="200" src="https://1.bp.blogspot.com/-glgDnwf7bhY/Wnd6cs82MWI/AAAAAAAAFRU/fyVHAT5_8GUYbsxNya93ZfgYcGum3DOZACLcBGAs/s400/AISschedule09.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Add Schedule</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Note: </b>the trigger schedule uses UTC time.<br />
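If you are unsure about the offset of your local time, you can compare it with UTC using standard T-SQL functions, for example:<br />
<pre class="brush: sql; toolbar: false;">-- Compare UTC with the server's local time and offset
-- Note: on Azure SQL Database the server time zone is always UTC
SELECT GETUTCDATE()        AS [CurrentUtcTime]
,      SYSDATETIMEOFFSET() AS [ServerTimeWithOffset]
</pre>
<br />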
<br />
<br />
<br />
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-19858065286490753092017-12-17T21:02:00.001+01:002018-02-10T00:08:09.409+01:00Deploying to Azure Integration Services Preview (ADF V2)<b><span style="font-size: large;">Case</span></b><br />
I just <a href="https://microsoft-ssis.blogspot.com/2017/12/azure-integration-services-preview-adf.html" target="_blank">created</a> an Integration Services Catalog in Azure Data Factory V2, but how do I deploy SSIS packages to this new catalog in Azure?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-2dMas_rDOwk/Wja-3NNEftI/AAAAAAAAFM8/20RMF8q5pyIUVOZD3MvwqYSLTYX4mEAiwCLcBGAs/s1600/aisdeploy00.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="336" data-original-width="395" height="340" src="https://1.bp.blogspot.com/-2dMas_rDOwk/Wja-3NNEftI/AAAAAAAAFM8/20RMF8q5pyIUVOZD3MvwqYSLTYX4mEAiwCLcBGAs/s400/aisdeploy00.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Integration Services</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
The Integration Services catalog in Azure doesn't support Windows Authentication like the on-premises version does. Therefore the PowerShell <a href="http://microsoft-ssis.blogspot.com/2015/07/deploying-ispac-files-with-powershell.html" target="_blank">deployment script</a> and the deployment in SSDT won't work without some changes.<br />
For this example I will focus on the SQL Server Authentication for which you can use the same user that you used to <a href="https://microsoft-ssis.blogspot.com/2017/12/azure-integration-services-preview-adf.html" target="_blank">configure the catalog</a> in Azure or create a <a href="http://microsoft-bitools.blogspot.com/2017/01/azure-snack-grant-access-to-your-azure.html" target="_blank">new SQL user</a>.<br />
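As a sketch, creating such a dedicated SQL deployment user could look like the code below. The login name and password are just examples; the <b>ssis_admin</b> database role is part of the SSISDB catalog.<br />
<pre class="brush: sql; toolbar: false;">-- Execute on the master database of your Azure SQL server
CREATE LOGIN [SSISDeployer] WITH PASSWORD = 'S0meStr0ngP@ssword'

-- Then execute on the SSISDB database itself
CREATE USER [SSISDeployer] FOR LOGIN [SSISDeployer]
ALTER ROLE [ssis_admin] ADD MEMBER [SSISDeployer]
</pre>
<br />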
<br />
<b>Solution 1: SSDT</b><br />
First make sure you have the latest version of SSDT 2015 (17.4 or higher) or SSDT 2017 (15.5.0 or higher). If you already have SSDT 2017, make sure you first <a href="https://microsoft-ssis.blogspot.com/2017/08/where-is-new-ssis-project-type.html" target="_blank">remove</a> any installed Visual Studio extensions for SSRS or SSAS projects before installing SSDT. For downloads and more details see <a href="https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt" target="_blank">this page</a>.<br />
<br />
If you use an older version that doesn't have the option for SQL Server Authentication you will get an error: <i><span style="font-size: x-small;">Failed to connect to server bitools2.database.windows.net. (Microsoft.SqlServer.ConnectionInfo)<br />
Windows logins are not supported in this version of SQL Server. (Microsoft SQL Server, Error: 40607)</span></i>
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-tL7Ts45KpZs/WjZJXdz5QNI/AAAAAAAAFMY/fH3FAoc1wnIp091W_lGk2DcSJ7rV-hP2QCLcBGAs/s1600/aisdeploy01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="678" data-original-width="729" height="371" src="https://4.bp.blogspot.com/-tL7Ts45KpZs/WjZJXdz5QNI/AAAAAAAAFMY/fH3FAoc1wnIp091W_lGk2DcSJ7rV-hP2QCLcBGAs/s400/aisdeploy01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><div>
Windows logins are not supported</div>
<div>
in this version of SQL Server. </div>
</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
After updating SSDT you will see a new Integration Services Deployment Wizard, which supports three new authentication methods. After filling in the Server name (the URL of the Azure SQL server that hosts your SSISDB), choose the SQL Server Authentication method. Fill in the username and password and click on the Connect button. After that you can use the Browse button to browse your Integration Services catalog in Azure.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-EeAs65hZXBA/WjZbze7YfaI/AAAAAAAAFMo/rq3GASUpBvUjPiIZVsARV7PwyevimnlUQCLcBGAs/s1600/aisdeploy02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="678" data-original-width="729" height="371" src="https://3.bp.blogspot.com/-EeAs65hZXBA/WjZbze7YfaI/AAAAAAAAFMo/rq3GASUpBvUjPiIZVsARV7PwyevimnlUQCLcBGAs/s400/aisdeploy02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">New Integration Services Deployment Wizard</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 700; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Solution 2: PowerShell</span><br />
If you regularly deploy SSIS projects you probably use a PowerShell script. The 'old' <a href="http://microsoft-ssis.blogspot.com/2015/07/deploying-ispac-files-with-powershell.html" target="_blank">deployment script</a> uses Windows Authentication to create a 'System.Data.SqlClient.SqlConnection' connection to the master database on the server that hosts the SSISDB. See this snippet:<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code snippet for on premises deployment
#################################################
############ CONNECT TO SSIS SERVER #############
#################################################
# First create a connection to SQL Server
$SqlConnectionstring = "Data Source=$($SsisServer);Initial Catalog=master;Integrated Security=SSPI;"
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
</pre>
<br />
We need to change two things. First we need to change the connection string to SQL Server Authentication by adding a username and password and removing 'Integrated Security=SSPI;'. Secondly, we need to change the database in the Initial Catalog part: it should be SSISDB instead of master.<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code snippet for Azure deployment
#################################################
##################### SERVER ####################
#################################################
Write-Host "Connecting to Azure SQL DB server $($SsisServer)"
# Create a connectionstring for the Azure DB Server
# Make sure you use SSISDB as the Initial Catalog!
$SqlConnectionstring = "Data Source=$($SsisServer);User ID=$($SSISDBUsername);Password=$($SSISDBPassword);Initial Catalog=SSISDB;"
</pre>
<br />
<br />
Selecting the SSISDB database also applies when you want to use SSMS to connect to your catalog in Azure. Below is the complete script for deploying ISPAC files. The solution consists of two files. The first file contains all the parameters and changes per project; its last line executes the second script.<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code snippet for Azure deployment
#PowerShell: finance.ps1
#################################################################################################
# Change source, destination and environment properties
#################################################################################################
# Source
$IspacFilePath = "d:\projects\Finance\bin\Development\Finance.ispac"
# Destination
$SsisServer ="bitools2.database.windows.net"
$SSISDBUsername = "Joost"
$SSISDBPassword = "5ecrtet"
$FolderName = "Finance"
$ProjectName = ""
# Environment
$EnvironmentName = "Generic"
$EnvironmentFolderName = "Environments"
#################################################################################################
# Execute generic deployment script
. "$PSScriptRoot\generalAzureDeployment.ps1" $IspacFilePath $SsisServer $SSISDBUsername $SSISDBPassword $FolderName $ProjectName $EnvironmentName $EnvironmentFolderName
</pre>
<br />
The second file is the generic script, which is the same for each project. If you want to make any changes to the script, you now only need to maintain one generic PowerShell script instead of dozens of copies for each project.<br />
<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code snippet for Azure deployment
#PowerShell: generalAzureDeployment.ps1
################################
########## PARAMETERS ##########
################################
[CmdletBinding()]
Param(
# IsPacFilePath is required
[Parameter(Mandatory=$True,Position=1)]
[string]$IspacFilePath,
# SsisServer is required
[Parameter(Mandatory=$True,Position=2)]
[string]$SsisServer,
# SSISDB Username is required
[Parameter(Mandatory=$True,Position=3)]
[string]$SSISDBUsername,
# SSISDB Password is required
[Parameter(Mandatory=$True,Position=4)]
[string]$SSISDBPassword,
# FolderName is required
[Parameter(Mandatory=$True,Position=5)]
[string]$FolderName,
# ProjectName is not required
# If empty filename is used
[Parameter(Mandatory=$False,Position=6)]
[string]$ProjectName,
# EnvironmentName is not required
# If empty no environment is referenced
[Parameter(Mandatory=$False,Position=7)]
[string]$EnvironmentName,
# EnvironmentFolderName is not required
# If empty the FolderName param is used
[Parameter(Mandatory=$False,Position=8)]
[string]$EnvironmentFolderName
)
# Replace empty projectname with filename
if (-not $ProjectName)
{
$ProjectName = [system.io.path]::GetFileNameWithoutExtension($IspacFilePath)
}
# Replace empty Environment folder with project folder
if (-not $EnvironmentFolderName)
{
$EnvironmentFolderName = $FolderName
}
# Mask the password to show something on
# screen, but not the actual password
# This is for testing purposes only.
$SSISDBPasswordMask = $SSISDBPassword -replace '.', '*'
clear
Write-Host "========================================================================================================================================================"
Write-Host "== Used parameters =="
Write-Host "========================================================================================================================================================"
Write-Host "Ispac File Path : " $IspacFilePath
Write-Host "SSIS Server : " $SsisServer
Write-Host "SQL Username : " $SSISDBUsername
Write-Host "SQL Password : " $SSISDBPasswordMask
Write-Host "Project Folder Path : " $FolderName
Write-Host "Project Name : " $ProjectName
Write-Host "Environment Name : " $EnvironmentName
Write-Host "Environment Folder Path: " $EnvironmentFolderName
Write-Host "========================================================================================================================================================"
Write-Host ""
# Stop the script if an error occurs
$ErrorActionPreference = "Stop"
#################################################
##################### ISPAC #####################
#################################################
# Check if ispac file exists
if (-Not (Test-Path $IspacFilePath))
{
Throw [System.IO.FileNotFoundException] "Ispac file $IspacFilePath doesn't exist!"
}
else
{
$IspacFileName = split-path $IspacFilePath -leaf
Write-Host "Ispac file" $IspacFileName "found"
}
#################################################
############### ADD SSIS ASSEMBLY ###############
#################################################
# Add SSIS assembly so you can do SSIS stuff in PowerShell
# The number 14.0.0.0 refers to SQL Server 2017
# 13.0.0.0 to SQL Server 2016, 12.0.0.0 to SQL
# Server 2014 and 11.0.0.0 to SQL Server 2012
$SsisNamespace = "Microsoft.SqlServer.Management.IntegrationServices"
Add-Type -AssemblyName "$($SsisNamespace), Version=14.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91"
#################################################
##################### SERVER ####################
#################################################
Write-Host "Connecting to Azure SQL DB server $($SsisServer)"
# Create a connectionstring for the Azure DB Server
# Make sure you use SSISDB as the Initial Catalog!
$SqlConnectionstring = "Data Source=$($SsisServer);User ID=$($SSISDBUsername);Password=$($SSISDBPassword);Initial Catalog=SSISDB;"
# Create a connection object
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection $SqlConnectionstring
# Check if the connection works
Try
{
$SqlConnection.Open();
Write-Host "Connected to Azure SQL DB server $($SsisServer)"
}
Catch [System.Data.SqlClient.SqlException]
{
Throw [System.Exception] "Failed to connect to Azure SQL DB server $($SsisServer), exception: $($_)"
}
# Create the Integration Services object
$IntegrationServices = New-Object $SsisNamespace".IntegrationServices" $SqlConnection
# Check if SSISDB connection succeeded
if (-not $IntegrationServices)
{
Throw [System.Exception] "Failed to connect to SSISDB on $($SsisServer)"
}
else
{
Write-Host "Connected to SSISDB on $($SsisServer)"
}
#################################################
#################### CATALOG ####################
#################################################
# Create object for SSISDB Catalog
$Catalog = $IntegrationServices.Catalogs["SSISDB"]
# Check if the SSISDB Catalog exists
if (-not $Catalog)
{
# Catalog doesn't exist. The user should create it manually.
# It is possible to create it, but that shouldn't be part of
# deployment of packages.
# Also make sure the catalog is SSISDB and not master or any
# other database.
Throw [System.Exception] "SSISDB catalog doesn't exist. Create it manually!"
}
else
{
Write-Host "Catalog SSISDB found"
}
#################################################
#################### FOLDER #####################
#################################################
# Create object to the (new) folder
$Folder = $Catalog.Folders[$FolderName]
# Check if folder already exists
if (-not $Folder)
{
# Folder doesn't exist, so create the new folder.
Write-Host "Creating new folder" $FolderName
$Folder = New-Object $SsisNamespace".CatalogFolder" ($Catalog, $FolderName, $FolderName)
$Folder.Create()
}
else
{
Write-Host "Folder" $FolderName "found"
}
#################################################
#################### PROJECT ####################
#################################################
# Deploying project to folder
if($Folder.Projects.Contains($ProjectName)) {
Write-Host "Deploying" $ProjectName "to" $FolderName "(REPLACE)"
}
else
{
Write-Host "Deploying" $ProjectName "to" $FolderName "(NEW)"
}
# Reading ispac file as binary
[byte[]] $IspacFile = [System.IO.File]::ReadAllBytes($IspacFilePath)
$Folder.DeployProject($ProjectName, $IspacFile) | Out-Null
$Project = $Folder.Projects[$ProjectName]
if (-not $Project)
{
# Something went wrong with the deployment
# Don't continue with the rest of the script
return ""
}
#################################################
################## ENVIRONMENT ##################
#################################################
# Check if environment name is filled
if (-not $EnvironmentName)
{
# Kill connection to SSIS
$IntegrationServices = $null
# Stop the deployment script
Return "Finished deploying $IspacFileName without adding environment references"
}
# Create object to the (new) folder
$EnvironmentFolder = $Catalog.Folders[$EnvironmentFolderName]
# Check if environment folder exists
if (-not $EnvironmentFolder)
{
Throw [System.Exception] "Environment folder $EnvironmentFolderName doesn't exist"
}
# Check if environment exists
if(-not $EnvironmentFolder.Environments.Contains($EnvironmentName))
{
Throw [System.Exception] "Environment $EnvironmentName doesn't exist in $EnvironmentFolderName "
}
else
{
# Create object for the environment
$Environment = $Catalog.Folders[$EnvironmentFolderName].Environments[$EnvironmentName]
if ($Project.References.Contains($EnvironmentName, $EnvironmentFolderName))
{
Write-Host "Reference to" $EnvironmentName "found"
}
else
{
Write-Host "Adding reference to" $EnvironmentName
$Project.References.Add($EnvironmentName, $EnvironmentFolderName)
$Project.Alter()
}
}
#################################################
############## PROJECT PARAMETERS ###############
#################################################
$ParameterCount = 0
# Loop through all project parameters
foreach ($Parameter in $Project.Parameters)
{
# Get parameter name and check if it exists in the environment
$ParameterName = $Parameter.Name
if ($ParameterName.StartsWith("CM.","CurrentCultureIgnoreCase"))
{
# Ignoring connection managers
}
elseif ($ParameterName.StartsWith("INTERN_","CurrentCultureIgnoreCase"))
{
# Internal parameters are ignored (where name starts with INTERN_)
Write-Host "Ignoring Project parameter" $ParameterName " (internal use only)"
}
elseif ($Environment.Variables.Contains($Parameter.Name))
{
$ParameterCount = $ParameterCount + 1
Write-Host "Project parameter" $ParameterName "connected to environment"
$Project.Parameters[$Parameter.Name].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $Parameter.Name)
$Project.Alter()
}
else
{
# Variable with the name of the project parameter is not found in the environment
# Throw an exception or remove the next line to ignore the parameter
Throw [System.Exception] "Project parameter $ParameterName doesn't exist in environment"
}
}
Write-Host "Number of project parameters mapped:" $ParameterCount
#################################################
############## PACKAGE PARAMETERS ###############
#################################################
$ParameterCount = 0
# Loop through all packages
foreach ($Package in $Project.Packages)
{
# Loop through all package parameters
foreach ($Parameter in $Package.Parameters)
{
# Get parameter name and check if it exists in the environment
$PackageName = $Package.Name
$ParameterName = $Parameter.Name
if ($ParameterName.StartsWith("CM.","CurrentCultureIgnoreCase"))
{
# Ignoring connection managers
}
elseif ($ParameterName.StartsWith("INTERN_","CurrentCultureIgnoreCase"))
{
# Internal parameters are ignored (where name starts with INTERN_)
Write-Host "Ignoring Package parameter" $ParameterName " (internal use only)"
}
elseif ($Environment.Variables.Contains($Parameter.Name))
{
$ParameterCount = $ParameterCount + 1
Write-Host "Package parameter" $ParameterName "from package" $PackageName "connected to environment"
$Package.Parameters[$Parameter.Name].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $Parameter.Name)
$Package.Alter()
}
else
{
# Variable with the name of the package parameter is not found in the environment
# Throw an exception or remove the next line to ignore the parameter
Throw [System.Exception] "Package parameter $ParameterName from package $PackageName doesn't exist in environment"
}
}
}
Write-Host "Number of package parameters mapped:" $ParameterCount
#################################################
##################### READY #####################
#################################################
# Kill connection to SSIS
$IntegrationServices = $null
Return "Finished deploying $IspacFileName"
</pre>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-L7N1YGCCQjg/WjbVBn95GBI/AAAAAAAAFNM/PRhjdG293wQNfclt434H8MeEeQ4Nh3m1ACLcBGAs/s1600/aisdeploy04.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="789" data-original-width="1288" height="245" src="https://4.bp.blogspot.com/-L7N1YGCCQjg/WjbVBn95GBI/AAAAAAAAFNM/PRhjdG293wQNfclt434H8MeEeQ4Nh3m1ACLcBGAs/s400/aisdeploy04.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">The script</td></tr>
</tbody></table>
<br />Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com1Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-37845960659676273222017-12-14T22:14:00.000+01:002018-02-10T00:08:31.921+01:00Azure Integration Services Preview (ADF V2)<span style="font-size: large;"><b>Case</b></span><br />
I just created an Azure Data Factory V2 to start with SSIS in the cloud, but I cannot find the SSIS options in ADF. How do I configure SSIS in Azure?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-o3q0I304Kk8/Wi2Owc2Ya0I/AAAAAAAAFKk/23VUEXRtNucRKtRZd4fcRyvlHFDjb3Z0QCLcBGAs/s1600/AIS02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="696" data-original-width="897" height="310" src="https://4.bp.blogspot.com/-o3q0I304Kk8/Wi2Owc2Ya0I/AAAAAAAAFKk/23VUEXRtNucRKtRZd4fcRyvlHFDjb3Z0QCLcBGAs/s400/AIS02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Factory V2</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
At the moment SSIS in ADF V2 is still in preview. The user interface for SSIS is not yet available in the Azure portal (it is visible in <a href="https://www.youtube.com/watch?v=Crmi56v_YA8" target="_blank">this video</a>, but probably only for Microsoft).<br />
<br />
For now you are stuck with a little PowerShell scripting. Before you start, make sure you have an <b>ADF V2</b> and an <b>Azure SQL Database server</b> available, preferably in the same Azure region. ADF V2 is currently only available in East US, East US 2 and West Europe. Since I'm from the Netherlands I selected West Europe.<br />
<br />
<b>PowerShell ISE</b><br />
If you have never used PowerShell for Azure before, you first need to start Windows PowerShell ISE as Administrator to install the <a href="https://docs.microsoft.com/en-us/powershell/azure/install-azurerm-ps?view=azurermps-5.0.0" target="_blank">Azure Module</a>. Execute the following command:<br />
<pre class="brush: powershell; toolbar: false;"># PowerShell code
Install-Module AzureRM
</pre>
<br />
And if you have worked with Azure PowerShell before, you probably only have to update the Azure Module by adding the -Force parameter at the end of the command.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-KVc4MNJq-aA/Wi2XDPFDucI/AAAAAAAAFK4/uV-gqSSyFFsrLOXmNtVR6_DkSfaERTaJQCLcBGAs/s1600/AIS03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="728" data-original-width="1022" height="283" src="https://1.bp.blogspot.com/-KVc4MNJq-aA/Wi2XDPFDucI/AAAAAAAAFK4/uV-gqSSyFFsrLOXmNtVR6_DkSfaERTaJQCLcBGAs/s400/AIS03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Install-Module AzureRM</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>Parameters</b><br />
This PowerShell script first starts with a 'parameter' section to provide all details necessary for the script to run.<br />
<b><br /></b>
First start with the name of your subscription. If you are not sure which one it is then you can look it up in the overview page of your Azure Data Factory under Subscription name.<br />
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-yYleY3BIRts/Wi2suhsX2OI/AAAAAAAAFLM/LTCwYLhrha4Fe6A6MotWC-uxwlgVVUTEACLcBGAs/s1600/ais4.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" data-original-height="447" data-original-width="897" height="198" src="https://1.bp.blogspot.com/-yYleY3BIRts/Wi2suhsX2OI/AAAAAAAAFLM/LTCwYLhrha4Fe6A6MotWC-uxwlgVVUTEACLcBGAs/s400/ais4.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 12.8px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: center; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Subscription name in ADF overview page</span></td></tr>
</tbody></table>
<b><br /></b>
<br />
<div style="text-align: left;">
</div>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;">#################################################
################## PARAMETERS ###################
#################################################
$SubscriptionName = "mySubscriptionName"
</pre>
<br />
To store the SSISDB we need to provide a database server URL and the server admin and its password. The URL can be found on the SQL <b>database</b> overview page and the user can be found on the SQL <b>server</b> overview page.<br />
<pre class="brush: powershell; toolbar: false;"># Provide login details for the existing database server
$CatalogServerEndpoint = "myDBServer.database.windows.net"
$DBUser = "Joost"
$DBPassword = "5ecret!"</pre>
<br />
Next we need to provide the details of our newly created Azure Data Factory V2 environment. You can find all the required information on the overview page of ADF. The location is WestEurope, EastUs or EastUs2.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-lXXCP_WL57c/Wi2s-mhc9mI/AAAAAAAAFLQ/JI5cDsRGrTgXR7s7LW-G98IJzpghwEH4gCLcBGAs/s1600/ais5.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="450" data-original-width="899" height="200" src="https://2.bp.blogspot.com/-lXXCP_WL57c/Wi2s-mhc9mI/AAAAAAAAFLQ/JI5cDsRGrTgXR7s7LW-G98IJzpghwEH4gCLcBGAs/s400/ais5.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;"><span style="background-color: transparent; color: black; display: inline; float: none; font-family: "times new roman"; font-size: 12.8px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: center; text-decoration: none; text-indent: 0px; text-transform: none; white-space: normal; word-spacing: 0px;">Azure Data Factory V2 overview page</span></td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;"># Provide details about your existing ADF V2
$DataFactoryName = "bitools"
$ResourceGroupName = "Joost_van_Rossum"
$Location = "WestEurope"
</pre>
<br />
The last part is to configure the Integration Runtime. The Catalog Pricing Tier is the pricing tier (database size) of your SSISDB database: Basic or S1, S2, S3 ... S12, etc. The node size should be Standard_A4_v2, Standard_A8_v2, Standard_D1_v2, Standard_D2_v2, Standard_D3_v2 or Standard_D4_v2. After the preview phase more sizes will become available. Prices can be found <a href="https://azure.microsoft.com/en-us/pricing/details/data-factory/v2/" target="_blank">here</a>.<br />
<pre class="brush: powershell; toolbar: false;"># Provide details for the new Integration Runtime
$IntegrationRuntimeName = "SSISJoostIR"
$IntegrationRuntimeDescription = "My First Azure Integration Catalog"
$CatalogPricingTier = "Basic" # S0, S1, S2, S3
$IntegrationRuntimeNodeSize = "Standard_A4_v2"
$IntegrationRuntimeNodeCount = 2
$IntegrationRuntimeParallelExecutions = 2
</pre>
<br />
<br />
<b>The script</b><br />
And now the script itself. It starts with a setting to stop after an error (not the default setting) and a login to Azure. Executing Login-AzureRmAccount will show a login popup; login with your Azure account.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-uuhwEytKTmQ/Wi2xo3mN2qI/AAAAAAAAFLY/RKZHL9Pn1JgmnLyRKlUl5Fv9EnLBGPpzwCLcBGAs/s1600/ais6.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="912" data-original-width="788" height="400" src="https://2.bp.blogspot.com/-uuhwEytKTmQ/Wi2xo3mN2qI/AAAAAAAAFLY/RKZHL9Pn1JgmnLyRKlUl5Fv9EnLBGPpzwCLcBGAs/s400/ais6.png" width="345" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Login to Azure</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<pre class="brush: powershell; toolbar: false;">#################################################
################## THE SCRIPT ###################
#################################################
$ErrorActionPreference = "Stop"
# Login to Azure (a pop-up will appear)
Login-AzureRmAccount
</pre>
<br />
After the login we need to select the right subscription. The Out-Null will prevent showing all properties of your subscription to the screen.<br />
<pre class="brush: powershell; toolbar: false;"># Select the right subscription
Select-AzureRmSubscription -SubscriptionName $SubscriptionName | Out-Null
</pre>
<br />
And this will create a new credential with the user id and password. We need it for the next command.<br />
<pre class="brush: powershell; toolbar: false;"># Create database credential with user id and password
$SecureDBPassword = ConvertTo-SecureString $DBPassword -AsPlainText -Force
$ServerCreds = New-Object System.Management.Automation.PSCredential($DBUser, $SecureDBPassword)
</pre>
<br />
This command will create the Integration Runtime environment in Azure Data Factory.<br />
<pre class="brush: powershell; toolbar: false;"># Create the Integration Runtime
Write-Host "Creating your integration runtime."
Set-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
-DataFactoryName $DataFactoryName `
-Name $IntegrationRuntimeName `
-Type Managed `
-CatalogServerEndpoint $CatalogServerEndpoint `
-CatalogAdminCredential $ServerCreds `
-CatalogPricingTier $CatalogPricingTier `
-Description $IntegrationRuntimeDescription `
-Location $Location `
-NodeSize $IntegrationRuntimeNodeSize `
-NodeCount $IntegrationRuntimeNodeCount `
-MaxParallelExecutionsPerNode $IntegrationRuntimeParallelExecutions
</pre>
<br />
After creating the Integration Runtime environment we need to start it; only then can you use it. Starting the environment takes 20 to 30 minutes! There is also a Stop-AzureRmDataFactoryV2IntegrationRuntime cmdlet which takes the same parameters and completes in 2 to 3 minutes.<br />
<pre class="brush: powershell; toolbar: false;"># Start the Integration Runtime (takes 20 to 30 minutes)
Write-Warning "Starting your integration runtime. This command takes 20 to 30 minutes to complete."
Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
-DataFactoryName $DataFactoryName `
-Name $IntegrationRuntimeName `
-Force
</pre>
<br />
<br />
<b>Connecting with SSMS</b><br />
Now we can use SQL Server Management Studio (SSMS) to connect to our newly created SSISDB in Azure. A little different compared to on-premises: you need to click on the Options button and select the SSISDB first. Otherwise you won't see the Integration Services Catalog.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-oD-otGVbk1o/Wi28djfDRwI/AAAAAAAAFL0/oVttG3m90QQ8Tmqv9ZG_hYYp22D5H6kGwCLcBGAs/s1600/ais7.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="527" data-original-width="491" height="400" src="https://2.bp.blogspot.com/-oD-otGVbk1o/Wi28djfDRwI/AAAAAAAAFL0/oVttG3m90QQ8Tmqv9ZG_hYYp22D5H6kGwCLcBGAs/s400/ais7.gif" width="372" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Connecting to the SSISDB in Azure</td></tr>
</tbody></table>
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-vylNx3Luris/Wi28dT1HtDI/AAAAAAAAFLw/rT9P1wi3sS8pR5AvYj_30D8AtW6EXh0PACLcBGAs/s1600/ais8.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="315" data-original-width="487" height="257" src="https://3.bp.blogspot.com/-vylNx3Luris/Wi28dT1HtDI/AAAAAAAAFLw/rT9P1wi3sS8pR5AvYj_30D8AtW6EXh0PACLcBGAs/s400/ais8.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Spot the differences</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
In the <a href="http://microsoft-ssis.blogspot.com/2017/12/deploying-to-azure-integration-services.html" target="_blank">next post</a> I will show the deployment to the Integration Services Catalog in Azure. And now the complete script for copy and paste purposes.
<br />
<pre class="brush: powershell; toolbar: false;">#################################################
################## PARAMETERS ###################
#################################################
$SubscriptionName = "mySubscriptionName"
# Provide login details for the existing database server
$CatalogServerEndpoint = "myDBServer.database.windows.net"
$DBUser = "Joost"
$DBPassword = "5ecret!"
# Provide details about your existing ADF V2
$DataFactoryName = "bitools"
$ResourceGroupName = "Joost_van_Rossum"
$Location = "WestEurope" # or EastUs/EastUs2
# Provide details for the new Integration Runtime
$IntegrationRuntimeName = "SSISJoostIR"
$IntegrationRuntimeDescription = "My First Azure Integration Catalog"
$CatalogPricingTier = "Basic" # S0, S1, S2, S3
$IntegrationRuntimeNodeSize = "Standard_A4_v2"
$IntegrationRuntimeNodeCount = 2
$IntegrationRuntimeParallelExecutions = 2
# In public preview, only Standard_A4_v2, Standard_A8_v2, Standard_D1_v2,
# Standard_D2_v2, Standard_D3_v2, Standard_D4_v2 are supported.
#################################################
################## THE SCRIPT ###################
#################################################
$ErrorActionPreference = "Stop"
# Login to Azure (a pop-up will appear)
Login-AzureRmAccount
# Select the right subscription
Select-AzureRmSubscription -SubscriptionName $SubscriptionName | Out-Null
# Create database credential with user id and password
$SecureDBPassword = ConvertTo-SecureString $DBPassword -AsPlainText -Force
$ServerCreds = New-Object System.Management.Automation.PSCredential($DBUser, $SecureDBPassword)
# Create the Integration Runtime
Write-Host "Creating your integration runtime."
Set-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
-DataFactoryName $DataFactoryName `
-Name $IntegrationRuntimeName `
-Type Managed `
-CatalogServerEndpoint $CatalogServerEndpoint `
-CatalogAdminCredential $ServerCreds `
-CatalogPricingTier $CatalogPricingTier `
-Description $IntegrationRuntimeDescription `
-Location $Location `
-NodeSize $IntegrationRuntimeNodeSize `
-NodeCount $IntegrationRuntimeNodeCount `
-MaxParallelExecutionsPerNode $IntegrationRuntimeParallelExecutions
# Start the Integration Runtime (takes 20 to 30 minutes)
Write-Warning "Starting your integration runtime. This command takes 20 to 30 minutes to complete."
Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName $ResourceGroupName `
-DataFactoryName $DataFactoryName `
-Name $IntegrationRuntimeName `
-Force
Write-Host "Done"
</pre>
<br />
<br />
<b>Note: </b>when you get the error message "<i>There is no active worker agent</i>" while executing an SSIS package then you probably need to start the Integration Runtime (see Start-AzureRmDataFactoryV2IntegrationRuntime).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-pLtYXOrIZ8E/WlE35vkMp0I/AAAAAAAAFPc/YYUV9BwKXw8IojeDeie451dJNcdnZCOPwCLcBGAs/s1600/adfv2execute.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="557" data-original-width="1158" height="191" src="https://3.bp.blogspot.com/-pLtYXOrIZ8E/WlE35vkMp0I/AAAAAAAAFPc/YYUV9BwKXw8IojeDeie451dJNcdnZCOPwCLcBGAs/s400/adfv2execute.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">There is no active worker agent.</td></tr>
</tbody></table>
<span style="background-color: transparent; color: black; display: inline; float: none; font-family: "consolas"; font-size: 13.33px; font-style: normal; font-variant: normal; font-weight: 400; letter-spacing: normal; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; white-space: pre; word-spacing: 0px;"><br /></span>
Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com0Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-60493192535515547712017-10-29T18:07:00.001+01:002017-10-30T09:18:42.311+01:00Calculating Hash values in SSIS<b><span style="font-size: large;">Case</span></b><br />
I want to calculate a hash value for a couple of columns in SSIS. In T-SQL you can use HASHBYTES, but that doesn't work for other sources like flat files, and for SQL 2012/2014 the input is limited to only 8000 bytes. Is there an alternative for HASHBYTES?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-BdUGzdrUVoM/WfTUEjz5cnI/AAAAAAAAFFI/hLXrGOpLgRM-iZY5mWewNngvssdZmWPzwCLcBGAs/s1600/hashbytes01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="617" data-original-width="1163" height="212" src="https://4.bp.blogspot.com/-BdUGzdrUVoM/WfTUEjz5cnI/AAAAAAAAFFI/hLXrGOpLgRM-iZY5mWewNngvssdZmWPzwCLcBGAs/s400/hashbytes01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Calculating a hash value over multiple columns</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Solution</span></b><br />
There are several alternatives to the T-SQL HASHBYTES function. First of all there are various custom components available for SSIS, like the SSIS Multiple Hash on <a href="https://ssismhash.codeplex.com/" target="_blank">codeplex</a>, but if you don't want to (or cannot) use custom components, you can accomplish the same result with a little .NET scripting. If you really want to stick to T-SQL, you can also stage your files in a table first and calculate the hash with T-SQL afterwards. This blog shows the scripting solution.<br />
<br />
But first, why do you need a hash? When you want to keep track of history with a Persistent Stage, Data Vault or Data Warehouse, you want to know whether the record from the stage layer is different than the one you have in your historical data layer. You could check each column one by one, but with a whole bunch of columns that is a lot of work and a bit slow.<br />
<br />
A hash in ETL is used to generate a single, shorter value that represents a whole bunch of columns. It is stored in the stage table as a new column. If one character changes in one of those columns, the hash value will also be different. When comparing the two records (one from the stage layer and one from the historical layer) you now only have to compare the hash value. If it did not change, you know you don't have to process the record in your historical layer. Since you only want to calculate the hash once (in the stage package), you also store it in the historical layer.<br />
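The comparison step described above can be sketched in a few lines of C#. This is a standalone illustration, not part of the SSIS package; the variable names and hash values are made up for this example:<br />
<pre class="brush: c#; toolbar: false;"></pre>

```csharp
using System;

class ChangeDetectionSketch
{
    static void Main()
    {
        // Hypothetical values: in practice the stage hash comes from the
        // stage table and the history hash from the historical layer.
        string stageHash   = "0b1c9867bcc957611db099c15cfe8d5d";
        string historyHash = "0b1c9867bcc957611db099c15cfe8d5d";

        // One string comparison replaces comparing every column separately.
        bool recordChanged = !string.Equals(stageHash, historyHash, StringComparison.Ordinal);

        Console.WriteLine(recordChanged ? "Process record" : "Skip record");
    }
}
```
<br />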
<br />
<b>Now it is time to explain the scripting solution</b><br />
<br />
<b>1) Starting point</b><br />
The starting point of this example is a Data Flow Task with a Flat File source component.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-pERvyp1meVw/WfTYWxtMNpI/AAAAAAAAFFU/0NWyoGnyMcYfhC2RifM4N4LHkD_u6qA3ACLcBGAs/s1600/hashbytes02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="382" data-original-width="1115" height="136" src="https://1.bp.blogspot.com/-pERvyp1meVw/WfTYWxtMNpI/AAAAAAAAFFU/0NWyoGnyMcYfhC2RifM4N4LHkD_u6qA3ACLcBGAs/s400/hashbytes02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Flat File Source</td></tr>
</tbody></table>
<br />
<div>
<b><br /></b></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b>2) Script Component - Input Columns</b></div>
<div>
Add a new Script Component (transformation) to the Data Flow Task. Give it a suitable name and connect it to your flow. Then edit it and select all columns you want to hash on the <b>Input Columns</b> pane. Since we are not changing the existing columns you can keep the default Usage Type 'ReadOnly'.</div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-4hNvsGFFrjU/WfTc4xc6F7I/AAAAAAAAFFg/4Nz9qCrpcoUYSeVw1v0Jt_D1ybw1_3EjQCLcBGAs/s1600/hashbytes03.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="824" data-original-width="1400" height="235" src="https://2.bp.blogspot.com/-4hNvsGFFrjU/WfTc4xc6F7I/AAAAAAAAFFg/4Nz9qCrpcoUYSeVw1v0Jt_D1ybw1_3EjQCLcBGAs/s400/hashbytes03.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Script Component Input Columns</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br />
<br /></div>
<div>
Which columns do you want to hash? The three most common options:</div>
<ol>
<li>If you do not know the Primary Key: select all columns to calculate the hash.</li>
<li>If you do know the Primary Key: select all columns except the Primary Key to calculate the hash.</li>
<li>If the Primary Key consists of multiple columns you could even calculate a separate hash for the key columns only.</li>
</ol>
<div>
<b>3) Script Component - Output Column</b></div>
<div>
We need to store the calculated hash in a new column. Go to the Inputs and Outputs pane and add a new column in <b>Output 0</b>. The data type is string and the size depends on which hash algorithm you want to use. For this example we use the MD5 algorithm, which returns a 128-bit hash. Converted to a string this becomes a 32-character string that only contains hexadecimal digits.</div>
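As a quick sanity check on that size: an MD5 hash is always 128 bits (16 bytes), and formatting each byte as two hexadecimal digits gives the 32 characters mentioned above. A standalone console sketch (the input text is just an example):<br />

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class HashLengthDemo
{
    static void Main()
    {
        using (MD5 md5 = MD5.Create())
        {
            byte[] data = md5.ComputeHash(Encoding.Unicode.GetBytes("Mr.|Syed|E|Abbas"));

            // MD5 always produces 128 bits = 16 bytes, regardless of input length.
            Console.WriteLine(data.Length);

            // Two hexadecimal digits per byte gives a 32-character string.
            StringBuilder sb = new StringBuilder();
            foreach (byte b in data)
            {
                sb.Append(b.ToString("x2"));
            }
            Console.WriteLine(sb.ToString().Length);
        }
    }
}
```
<br />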
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-7Dx1xd44s-k/WfTg6_gPYzI/AAAAAAAAFFs/TS1WU7OmGXg6e3HTR1NLv-FexDuIPfmMQCLcBGAs/s1600/hashbytes04.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="724" data-original-width="834" height="346" src="https://1.bp.blogspot.com/-7Dx1xd44s-k/WfTg6_gPYzI/AAAAAAAAFFs/TS1WU7OmGXg6e3HTR1NLv-FexDuIPfmMQCLcBGAs/s400/hashbytes04.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Script Component Inputs and Outputs</td></tr>
</tbody></table>
<div>
<i><b><u><br /></u></b></i></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b>4) Script Component - The script preparation</b></div>
<div>
Now we are almost ready to add the actual script. Go to the Script pane. Select your scripting language. This example will be in C#. Then hit the <b>Edit Script...</b> button to start the Vsta environment. This is a new instance of Visual Studio and will take a few moments to start.</div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-CPKCNdnwqhw/WfTktWfSFzI/AAAAAAAAFF4/13eD9oCB3yE_Ohu6tDvI9t_ZMMIqa2h8ACLcBGAs/s1600/hashbytes05.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="724" data-original-width="834" height="346" src="https://3.bp.blogspot.com/-CPKCNdnwqhw/WfTktWfSFzI/AAAAAAAAFF4/13eD9oCB3yE_Ohu6tDvI9t_ZMMIqa2h8ACLcBGAs/s400/hashbytes05.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Edit Script...</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<b>Optional:</b></div>
<div>
I always start by removing all unnecessary methods and comments to keep the code clean. For this example we do not need the PreExecute and PostExecute methods and I do not want to keep the default help comments.</div>
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-bNdZWnz7pUA/WfTmNRUDpMI/AAAAAAAAFGE/oaNDH1VXb9MT5t84SPMYTCe4Q1B5n7U6QCLcBGAs/s1600/hashbytes06.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="787" data-original-width="1219" height="257" src="https://2.bp.blogspot.com/-bNdZWnz7pUA/WfTmNRUDpMI/AAAAAAAAFGE/oaNDH1VXb9MT5t84SPMYTCe4Q1B5n7U6QCLcBGAs/s400/hashbytes06.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Clean up before start</td></tr>
</tbody></table>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br />
<b>5) Script Component - The code</b></div>
<div>
First we need to add two extra usings to shorten the code. Unfold the Namespaces region at the top and add the following usings:<br />
<pre class="brush: c#; toolbar: false;">using System.Security.Cryptography;
using System.Text;</pre>
<br />
Then locate the method called <b>Input0_ProcessInputRow</b> and add a new <b>GetMd5Hash</b> method below this existing method (below its closing }). The new method is copied from <a href="https://msdn.microsoft.com/en-us/library/system.security.cryptography.md5(v=vs.110).aspx" target="_blank">this MSDN page</a>. I only changed the encoding to Unicode <i>(see note 1)</i>:<br />
<pre class="brush: c#; toolbar: false;">static string GetMd5Hash(MD5 md5Hash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = md5Hash.ComputeHash(Encoding.Unicode.GetBytes(input));

    // Create a new StringBuilder to collect the bytes
    // and create a string.
    StringBuilder sBuilder = new StringBuilder();

    // Loop through each byte of the hashed data
    // and format each one as a hexadecimal string.
    for (int i = 0; i < data.Length; i++)
    {
        sBuilder.Append(data[i].ToString("x2"));
    }

    // Return the hexadecimal string.
    return sBuilder.ToString();
}
</pre>
<br />
<br />
And at last change the code of the existing method Input0_ProcessInputRow to:<br />
<pre class="brush: c#; toolbar: false;">public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string Separator = "|";
    string RowData = "";
    using (MD5 md5Hash = MD5.Create())
    {
        Row.Hash = GetMd5Hash(md5Hash, RowData = (Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator
                                               + (Row.FirstName_IsNull ? "" : Row.FirstName.ToString()) + Separator
                                               + (Row.MiddleName_IsNull ? "" : Row.MiddleName.ToString()) + Separator
                                               + (Row.LastName_IsNull ? "" : Row.LastName.ToString()));
    }
}
</pre>
<br /></div>
<div>
<br /></div>
<div>
The code above first concatenates all columns with a separator between them <i>(see note 2)</i> and it checks whether the value isn't NULL because we cannot add NULL to a string <i>(see note 3)</i>. You will see that it repeats this piece of code for each column before calling the hash method:</div>
<pre class="brush: c#; toolbar: false;">(Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator
</pre>
For the first record in our example it will hash the following text: Mr.|Syed|E|Abbas<br />
And for the third row, which contains a null value, it will hash this text: Ms.|Kim||Abercrombie
<br />
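The concatenation with null handling can be illustrated outside SSIS as well. A standalone sketch with hypothetical stand-ins for the Row columns of the third record:<br />

```csharp
using System;

class ConcatSketch
{
    static void Main()
    {
        // Hypothetical stand-ins for the Row columns of the third record.
        string title = "Ms.", firstName = "Kim", middleName = null, lastName = "Abercrombie";
        string separator = "|";

        // Same null handling as in the Script Component: null becomes "".
        string rowData = (title ?? "") + separator
                       + (firstName ?? "") + separator
                       + (middleName ?? "") + separator
                       + (lastName ?? "");

        Console.WriteLine(rowData);
    }
}
```
<br />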
<div>
<br />
<b>6) Testing the code</b><br />
After closing the Vsta editor and clicking OK in the Script Component to close it, add a dummy Derived Column behind it and add a Data Viewer to see the result.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-vOKPmAXkWCM/WfXVg3G66jI/AAAAAAAAFGU/2J3q7aR6Lpwsu8BM_1j7nYArH6KFKNh5QCLcBGAs/s1600/hashbytes07.gif" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="522" data-original-width="886" height="235" src="https://3.bp.blogspot.com/-vOKPmAXkWCM/WfXVg3G66jI/AAAAAAAAFGU/2J3q7aR6Lpwsu8BM_1j7nYArH6KFKNh5QCLcBGAs/s400/hashbytes07.gif" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Similar result to T-SQL HASHBYTES</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br />
<br /></div>
<div>
<br /></div>
<div>
<b>Note 1:</b><br />
When you want the exact same result as with T-SQL HASHBYTES, you have to make sure you use the same encoding, otherwise you get a different hash. In the method <b>GetMd5Hash</b>, on the first line of code, you see Encoding.Unicode.GetBytes(. There are more options besides Unicode, for example ASCII, UTF7, UTF8, UTF32, etc. However, as long as you don't have to compare hashes generated by two different methods (T-SQL and .NET) it doesn't matter. In this <a href="https://stackoverflow.com/questions/27908449/tsql-md5-hash-different-to-c-sharp-net-md5" target="_blank">stackoverflow post</a> you will find more examples.<br />
<br /></div>
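To see why the encoding matters, compare the number of bytes the different encodings produce for the same text: different bytes going into the hash method means a different hash coming out. A standalone console sketch:<br />

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        string input = "Abc";

        // Unicode (UTF-16) uses two bytes per character here, ASCII one,
        // so the byte arrays (and therefore the hashes) differ.
        Console.WriteLine(Encoding.Unicode.GetBytes(input).Length);
        Console.WriteLine(Encoding.ASCII.GetBytes(input).Length);
    }
}
```
<br />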
<div>
<b>Note 2:</b><br />
The column separator is added to prevent unwanted matches. If you have these two records with two columns:<br />
<table>
<tbody>
<tr>
<th>Column1</th>
<th>Column2</th>
</tr>
<tr>
<td>123</td>
<td>456</td>
</tr>
<tr>
<td>12</td>
<td>3456</td>
</tr>
</tbody></table>
Without the separator these two will both be concatenated to 123456 and therefore generate the same hash. With the separator you have two different values to hash: 123|456 and 12|3456. Choose your separator wisely; the number 3 would not be a wise choice in this case.<br />
<br />
<b>Note 3:</b></div>
<br />
<b>Note 3:</b></div>
<div>
In the code you see that the columns are checked for null values, because you cannot add null to a string. The null values are replaced with an empty string. However, this shows a small imperfection of this method, because a column with a null value isn't the same as a column with an empty string. To overcome this you could replace nulls with a string that is unlikely to occur in your data. For numeric and date data types you can just use an empty string, something like:</div>
<pre class="brush: c#; toolbar: false;">(Row.MyNumberColumn_IsNull ? "" : Row.MyNumberColumn.ToString()) + Separator
(Row.MyDateColumn_IsNull ? "" : Row.MyDateColumn.ToString()) + Separator
</pre>
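The NULL-versus-empty-string issue can be sketched like this (Python, illustration only; the &lt;NULL&gt; marker is a hypothetical sentinel, pick one that is unlikely to occur in your data):

```python
import hashlib

def row_hash(columns, null_marker=""):
    # Replace NULLs by the marker, join with a separator, then hash.
    parts = [null_marker if c is None else str(c) for c in columns]
    return hashlib.md5("|".join(parts).encode("utf-16-le")).hexdigest()

# Replacing NULL with "" makes a NULL column and an empty string collide:
print(row_hash([None, "x"]) == row_hash(["", "x"]))                      # True
# A sentinel value keeps them apart:
print(row_hash([None, "x"], "<NULL>") == row_hash(["", "x"], "<NULL>"))  # False
```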
<b><br /></b>
<br />
<div>
<b>Note 4:</b></div>
<div>
MD5 only uses 128 bits and there are better, safer (but also a bit slower) methods to calculate hashes:<br />
SHA and SHA1 - 160 bits<br />
SHA2_256 - 256 bits<br />
SHA2_512 - 512 bits<br />
<br />
Safer? As long as you don't use it to hash passwords, you are still OK with MD5.<br />
Better? In rare cases two different strings could return the same MD5 hash, but you have a higher chance of winning the galaxy lottery.<br />
<br />
Rather use SHA2_512? Just use this code instead:</div>
<pre class="brush: c#; toolbar: false;">public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    string Separator = "|";
    string RowData = "";

    using (SHA512 shaHash = new SHA512Managed())
    {
        // Concatenate the columns (replacing nulls by an empty string)
        // with a separator in between.
        RowData = (Row.Title_IsNull ? "" : Row.Title.ToString()) + Separator
                + (Row.FirstName_IsNull ? "" : Row.FirstName.ToString()) + Separator
                + (Row.MiddleName_IsNull ? "" : Row.MiddleName.ToString()) + Separator
                + (Row.LastName_IsNull ? "" : Row.LastName.ToString());
        Row.hash2 = GetShaHash(shaHash, RowData);
    }
}

static string GetShaHash(SHA512 shaHash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = shaHash.ComputeHash(Encoding.Unicode.GetBytes(input));

    // Create a new StringBuilder to collect the bytes
    // and create a string.
    StringBuilder sBuilder = new StringBuilder();

    // Loop through each byte of the hashed data
    // and format each one as a hexadecimal string.
    for (int i = 0; i < data.Length; i++)
    {
        sBuilder.Append(data[i].ToString("x2"));
    }

    // Return the hexadecimal string.
    return sBuilder.ToString();
}
</pre>
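To cross-check the script's SHA2_512 output outside SSIS, the same hash can be reproduced with a few lines of Python (a sketch under the same assumptions: UTF-16LE encoding, a | separator and lowercase hex; the sample values are made up):

```python
import hashlib

def sha512_row_hash(*columns, separator="|"):
    # Encoding.Unicode in .NET is UTF-16LE; hexdigest() matches ToString("x2").
    data = separator.join(columns).encode("utf-16-le")
    return hashlib.sha512(data).hexdigest()

h = sha512_row_hash("Mr.", "John", "", "Doe")
print(len(h))  # 128 hex characters = 512 bits
```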
<br />
<b>Too many columns => too much coding?</b><br />
In my book <b><a href="http://www.apress.com/gp/book/9781484206393" target="_blank">Extending SSIS with .NET Scripting</a></b> you will find a Script Component example that just loops through all columns to generate the hash. No money to buy it? I used <a href="http://microsoft-ssis.blogspot.com/2010/12/do-something-for-all-columns-in-your.html" target="_blank">this code</a> as the base for that script.<br />
Another alternative is to generate the Script Component and its code with BIML. Here is an example of a <a href="http://microsoft-ssis.blogspot.com/2015/02/creating-biml-script-component.html" target="_blank">Script Component in BIML</a>; getting the hash to work there is a bit of a challenge, but doable.Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com5Netherlands52.132633 5.291265999999950547.144659 -5.0358825000000493 57.120607 15.61841449999995tag:blogger.com,1999:blog-2303058199815958946.post-39022041391379974462017-08-29T23:48:00.002+02:002017-08-30T00:03:01.333+02:00Azure Data Lake Store in SSIS <b><span style="font-size: large;">Case</span></b><br />
Microsoft just released a new <a href="https://blogs.msdn.microsoft.com/ssis/2017/08/29/new-azure-feature-pack-release-strengthening-adls-connectivity/" target="_blank">Azure Feature Pack</a> for SSIS with ADLS Connectivity. What's new?<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-PBHOML788N4/WaXkfdkOdhI/AAAAAAAAE_Y/3IPflApqnYAUDHwRsJVjCBdj8sKz0WCkgCLcBGAs/s1600/ADLS00.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="150" data-original-width="309" src="https://2.bp.blogspot.com/-PBHOML788N4/WaXkfdkOdhI/AAAAAAAAE_Y/3IPflApqnYAUDHwRsJVjCBdj8sKz0WCkgCLcBGAs/s1600/ADLS00.png" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Lake Store File System Task</td></tr>
</tbody></table>
<b><br /></b>
<br />
<div style="text-align: left;">
</div>
<br />
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
<br /></div>
<span style="font-size: large;"><b>Solution</b></span><br />
It contains four new items and a new connection manager:<br />
- Azure Data Lake Store File System Task<br />
- Foreach ADLS File Enumerator<br />
- Azure Data Lake Store Source<br />
- Azure Data Lake Store Destination<br />
<b><br /></b><b>Azure Data Lake Store File System Task</b><br />
This task only allows you to upload or download files to and from the Azure Data Lake Store, similar to the <a href="http://microsoft-ssis.blogspot.nl/2015/06/azure-upload-and-download-tasks.html" target="_blank">Azure Blob Upload / Download Task</a>. In the near future new operations will be added; a delete file or delete folder operation would be a handy addition.<br />
<b><br /></b>
<b>1) Start</b><br />
First download and install the new <a href="https://blogs.msdn.microsoft.com/ssis/2017/08/29/new-azure-feature-pack-release-strengthening-adls-connectivity/" target="_blank">Azure Feature Pack</a>. Then check the Azure folder in the SSIS Toolbox and drag the Azure Data Lake Store File System Task to the surface. Give it a suitable name.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-AMv3zuG7aYE/WaXOUi550zI/AAAAAAAAE9Q/H3Oc95Ih6MQZKe_ju6rWtFDKqZ4q-7tVwCLcBGAs/s1600/ADLS01.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="882" data-original-width="1092" height="322" src="https://2.bp.blogspot.com/-AMv3zuG7aYE/WaXOUi550zI/AAAAAAAAE9Q/H3Oc95Ih6MQZKe_ju6rWtFDKqZ4q-7tVwCLcBGAs/s400/ADLS01.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Lake Store File System Task</td></tr>
</tbody></table>
<b><br /></b>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><br /></b>
<b>2) Operation and source</b><br />
Edit the new task and select an Operation. For this example I will use the CopyToADLS operation. First specify where the files are located on the local machine. This is a hardcoded path, but it can be overwritten with an expression. The FileNamePattern is a wildcard pattern using ? or *; I use *.csv to upload all csv files in that folder. SearchRecursively allows you to also find files in subfolders.<br />
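The FileNamePattern behaves like an ordinary shell-style wildcard. Conceptually (a Python illustration only, with made-up file names; the task itself does the matching):

```python
import fnmatch

files = ["sales.csv", "customers.csv", "readme.txt"]

# "*.csv" keeps only the csv files, like the FileNamePattern in the task:
print(fnmatch.filter(files, "*.csv"))  # ['sales.csv', 'customers.csv']
```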
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-3Es-LaWr9gE/WaXPjw_TRnI/AAAAAAAAE9Y/gOfZLbipwJUXaAETcpKH2maoeq_nhZ_oQCLcBGAs/s1600/ADLS02.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="638" data-original-width="751" height="338" src="https://4.bp.blogspot.com/-3Es-LaWr9gE/WaXPjw_TRnI/AAAAAAAAE9Y/gOfZLbipwJUXaAETcpKH2maoeq_nhZ_oQCLcBGAs/s400/ADLS02.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Specify local source</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) Destination - Connection manager</b><br />
Next we need to create a new ADLS connection manager or select an existing one.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-L9A9XhD1n5w/WaXSRjTL3sI/AAAAAAAAE9s/6s6Fv0OY8-chFQ5fZa8DeSfOrD0BnjJ1gCLcBGAs/s1600/ADLS04.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="638" data-original-width="751" height="338" src="https://2.bp.blogspot.com/-L9A9XhD1n5w/WaXSRjTL3sI/AAAAAAAAE9s/6s6Fv0OY8-chFQ5fZa8DeSfOrD0BnjJ1gCLcBGAs/s400/ADLS04.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ADLS Connection Manager</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
As host you can use the URL property from the ADLS Overview page. Go to the Azure Portal and copy that URL.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-DVyXSzG8YWc/WaXRlX5slXI/AAAAAAAAE9k/vzOwbz9BizocX_6eoMLVx5cvOcU9pB7hQCLcBGAs/s1600/ADLS03.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="435" data-original-width="1253" height="138" src="https://1.bp.blogspot.com/-DVyXSzG8YWc/WaXRlX5slXI/AAAAAAAAE9k/vzOwbz9BizocX_6eoMLVx5cvOcU9pB7hQCLcBGAs/s400/ADLS03.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">URL = ADLS Host</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
For this example I will use the easier authentication option: Azure AD User Identity. It uses your email address and password from Azure. The Azure AD Service Identity option will be handled in a later post.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-0LESjUjwwgc/WaXSqoj1_cI/AAAAAAAAE90/xzzYJBp4Z-sPv-0Wj-zESAW_0hHp4-4_gCLcBGAs/s1600/ADLS05.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="339" data-original-width="446" height="303" src="https://3.bp.blogspot.com/-0LESjUjwwgc/WaXSqoj1_cI/AAAAAAAAE90/xzzYJBp4Z-sPv-0Wj-zESAW_0hHp4-4_gCLcBGAs/s400/ADLS05.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ADLS Connection Manager</td></tr>
</tbody></table>
<br />
<b><br /></b>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
When hitting OK or Test Connection, it will open an Azure login page where you need to log in and confirm that SSIS may connect to that ADLS.<br />
<br />
<b>4) Destination - ADLS folder</b><br />
Next we need to specify a folder name or path. You can either specify the name of an existing folder or a new folder name that will be created on execution. To find out which folders already exist, you can use the Data Explorer page in ADLS.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-oH70H1yfT9M/WaXUNpIur_I/AAAAAAAAE-A/6ULZBzLISfg4cOsjfu3CnyDfN_-CQt2pQCLcBGAs/s1600/ADLS06.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="492" data-original-width="1600" height="122" src="https://4.bp.blogspot.com/-oH70H1yfT9M/WaXUNpIur_I/AAAAAAAAE-A/6ULZBzLISfg4cOsjfu3CnyDfN_-CQt2pQCLcBGAs/s400/ADLS06.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Data Explorer</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-b0OkKka17fQ/WaXUnkhdWPI/AAAAAAAAE-E/DQAAIllZR7o0Nc7KSwUjIkxp49d6jzjeACLcBGAs/s1600/ADLS07.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="638" data-original-width="751" height="338" src="https://1.bp.blogspot.com/-b0OkKka17fQ/WaXUnkhdWPI/AAAAAAAAE-E/DQAAIllZR7o0Nc7KSwUjIkxp49d6jzjeACLcBGAs/s400/ADLS07.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Specify Folder</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
The <a href="https://blogs.msdn.microsoft.com/azuredatalake/2016/10/09/experience-updates-to-the-azure-data-lake-store-and-analytics-portal/" target="_blank">FileExpiry</a> option lets you specify the date that will be used to expire the files in ADLS. You can leave it empty to never expire them.<br />
<br />
<b>5) The result</b><br />
Now run the task/package and use the Data Explorer in ADLS to see the actual result.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-vGkLtij05uo/WaXf9LSq3dI/AAAAAAAAE_M/-XvdY6RL-7oh1B4D0C9CB3Ckd2e8Iun2QCLcBGAs/s1600/ADLS15.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="492" data-original-width="1600" height="122" src="https://4.bp.blogspot.com/-vGkLtij05uo/WaXf9LSq3dI/AAAAAAAAE_M/-XvdY6RL-7oh1B4D0C9CB3Ckd2e8Iun2QCLcBGAs/s400/ADLS15.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Data Explorer</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><br /></b>
<br />
<hr />
<b>Foreach ADLS File Enumerator</b><br />
The Foreach ADLS File Enumerator is a new enumerator for the Foreach Loop Container. It allows you to loop through an ADLS folder and return the paths of the files. It is very similar to the <a href="http://microsoft-ssis.blogspot.nl/2015/06/azure-blob-enumerator.html" target="_blank">Azure Blob Enumerator</a>. You can use this enumerator with the Azure Data Lake Store Source in the Data Flow Task.<br />
<br />
<b>1) Select Enumerator</b><br />
When you select the ADLS File Enumerator, you need to specify the connection manager (see step 3 of the task above), the remote folder (use the Data Explorer to find an existing folder), the wildcard pattern and the Search recursively option.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://2.bp.blogspot.com/-Nl6XlTAv45I/WaXXSE_oEUI/AAAAAAAAE-Q/IaGAfduUDJgedP7CfiPo-YImbEUjjulOwCLcBGAs/s1600/ADLS08.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="638" data-original-width="751" height="338" src="https://2.bp.blogspot.com/-Nl6XlTAv45I/WaXXSE_oEUI/AAAAAAAAE-Q/IaGAfduUDJgedP7CfiPo-YImbEUjjulOwCLcBGAs/s400/ADLS08.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Collection</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) Variable Mappings</b><br />
In the Variable Mappings pane you need to map the first item of the collection (zero based) to an SSIS string variable.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-7fGM0gOA_yc/WaXXpQ7mQjI/AAAAAAAAE-U/4V0YHv1E25osNmjydi0ih8etHigb_HebgCLcBGAs/s1600/ADLS09.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="638" data-original-width="751" height="338" src="https://4.bp.blogspot.com/-7fGM0gOA_yc/WaXXpQ7mQjI/AAAAAAAAE-U/4V0YHv1E25osNmjydi0ih8etHigb_HebgCLcBGAs/s400/ADLS09.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Variable Mappings</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>3) The Result</b><br />
To show the content of the variable during execution, I added a simple Script Task and a little C# code: MessageBox.Show(Dts.Variables["User::filepath"].Value.ToString());<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-Uldkz3ZDCrI/WaXY8S5QKqI/AAAAAAAAE-c/X7CA-WsgSQMVHcKLe8Bwcm0heX4_UXJcwCLcBGAs/s1600/ADLS10.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="509" data-original-width="406" height="400" src="https://3.bp.blogspot.com/-Uldkz3ZDCrI/WaXY8S5QKqI/AAAAAAAAE-c/X7CA-WsgSQMVHcKLe8Bwcm0heX4_UXJcwCLcBGAs/s400/ADLS10.png" width="318" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">MessageBox.Show</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<hr />
<b>Azure Data Lake Store Source</b><br />
This allows you to use files from the Azure Data Lake Store as a source in SSIS. Again very similar to the <a href="http://microsoft-ssis.blogspot.nl/2015/06/azure-blob-source-and-destination.html" target="_blank">Azure Blob Source</a>.<br />
<b><br /></b>
<b>1) Edit Source</b><br />
Drag the Azure Data Lake Store Source to the surface and give it a suitable name. Then edit the source and specify the connection manager, file path and file format. You cannot specify the data types or sizes; in this first test everything became (DT_WSTR,100).<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-PincbqGmc6U/WaXbNQfFcHI/AAAAAAAAE-o/NQN3rlfHBfYTvaUBZ6GnbCD0Y6QdfYSrwCLcBGAs/s1600/ADLS11.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="885" data-original-width="1098" height="321" src="https://3.bp.blogspot.com/-PincbqGmc6U/WaXbNQfFcHI/AAAAAAAAE-o/NQN3rlfHBfYTvaUBZ6GnbCD0Y6QdfYSrwCLcBGAs/s400/ADLS11.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Lake Store Source </td></tr>
</tbody></table>
<b><br /></b>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><br /></b>
<br />
<b>2) The Result</b><br />
To test the result (with a very small file) I added a dummy Derived Column and a Data Viewer.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://3.bp.blogspot.com/-_0TDzsKpLzQ/WaXb8iQIrPI/AAAAAAAAE-w/n3gv5WZLG4cxqY7JUYI8w_Jl19OnYQIVACLcBGAs/s1600/ADLS12.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="376" data-original-width="665" height="225" src="https://3.bp.blogspot.com/-_0TDzsKpLzQ/WaXb8iQIrPI/AAAAAAAAE-w/n3gv5WZLG4cxqY7JUYI8w_Jl19OnYQIVACLcBGAs/s400/ADLS12.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Azure Data Lake Store Source</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<hr />
<b>Azure Data Lake Store Destination</b><br />
This allows you to stream your Data Flow Task data to Azure Data Lake Store. Again very similar to the <a href="http://microsoft-ssis.blogspot.nl/2015/06/azure-blob-source-and-destination.html" target="_blank">Azure Blob Destination</a>.<br />
<b><i><br /></i></b>
<b>1) Edit Destination</b><br />
Add an Azure Data Lake Store Destination after your source or transformation and give it a suitable name. You can specify the connection manager, file path and file format option.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://1.bp.blogspot.com/-ApoAcXCOKTk/WaXd55nPrxI/AAAAAAAAE-8/KlYBnggPhFAHVVPM31VfWbNIXF1xgFAjQCLcBGAs/s1600/ADLS13.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="831" data-original-width="1055" height="252" src="https://1.bp.blogspot.com/-ApoAcXCOKTk/WaXd55nPrxI/AAAAAAAAE-8/KlYBnggPhFAHVVPM31VfWbNIXF1xgFAjQCLcBGAs/s320/ADLS13.png" width="320" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">ADLS destination - test3.csv</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b>2) The Result</b><br />
To test the result, run the package and open the Data Explorer in ADLS to check the output.<br />
<table cellpadding="0" cellspacing="0" class="tr-caption-container" style="float: left; margin-right: 1em; text-align: left;"><tbody>
<tr><td style="text-align: center;"><a href="https://4.bp.blogspot.com/-KK22hMQH-9s/WaXfC30xFOI/AAAAAAAAE_E/ypwrum5bUAEyVs8NbSPZ3m8R6lkMK6ayQCLcBGAs/s1600/ADLS14.png" imageanchor="1" style="clear: left; margin-bottom: 1em; margin-left: auto; margin-right: auto;"><img border="0" data-original-height="429" data-original-width="1600" height="106" src="https://4.bp.blogspot.com/-KK22hMQH-9s/WaXfC30xFOI/AAAAAAAAE_E/ypwrum5bUAEyVs8NbSPZ3m8R6lkMK6ayQCLcBGAs/s400/ADLS14.png" width="400" /></a></td></tr>
<tr><td class="tr-caption" style="text-align: center;">Data Explorer</td></tr>
</tbody></table>
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<br />
<b><span style="font-size: large;">Conclusion</span></b><br />
A much-needed ADLS extension of the Azure Feature Pack, but nothing spectacular compared to the Blob Storage items already in this Feature Pack. Hopefully the Azure Data Lake Store File System Task will soon be extended with new operations, and perhaps they could also introduce the <a href="http://microsoft-ssis.blogspot.com/2015/10/azure-file-system-task-for-ssis.html" target="_blank">Azure Blob Storage File System Task</a>.Joost van Rossumhttp://www.blogger.com/profile/01125981589974671317noreply@blogger.com2