August 8, 2014 at 3:00 PM

Sentinet is highly extensible through standard Microsoft .NET, WCF and WIF extensibility points. The Sentinet API can also be used to extend Sentinet's capabilities.

In some previous posts released on this blog we saw how to build a custom access rule expression and how to leverage WCF extensibility by setting up a virtual service with a custom endpoint behavior.


In this post I would like to explain another extensibility point of Sentinet: Custom Alert Handlers.

Alerts, or Violation Alerts, are triggered by Sentinet Service Agreements when certain configured SLA violations occur. More details about Sentinet Service Agreements can be found here.



The main scenario for this blog post is very simple. We will create our own Custom Alert Handler class by implementing the IAlertHandler interface, register our custom Alert Handler in Sentinet, and use it as an alert when a certain SLA is violated.

Creating the Custom Alert Handler

We start our implementation of the custom alert handler by creating a new class that implements the IAlertHandler interface. This interface is available in Nevatech.Vsb.Repository.dll.

This interface contains a single method, ProcessAlerts, in which you put the logic to handle the alert after an SLA violation has occurred.


One more thing to do before we can start our implementation of the ProcessAlerts method is to add a reference to the Twilio REST API through NuGet. More information about Twilio can be found here.



The final implementation of our custom Alert Handler looks like below. We start by initializing some variables needed for Twilio. After that everything is pretty straightforward: we read our handler configuration. Here we chose a CSV configuration string, but you could equally use an XML configuration and parse it into an XmlDocument or XDocument.

Once we've read the receivers from the configuration, we create an alert message by concatenating the alert descriptions; after that we send an SMS message using the Twilio REST API.
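The handler described above can be sketched roughly as follows. This is a minimal sketch, not the exact code from the post: the IAlertHandler/Alert signature should be verified against the Sentinet SDK documentation, and the Twilio account values are placeholders.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Nevatech.Vsb.Repository;   // assumption: namespace that exposes IAlertHandler/Alert
using Twilio;                    // Twilio REST API client, added via NuGet

public class SmsAlertHandler : IAlertHandler
{
    // Placeholder Twilio credentials - replace with your own account values
    private const string AccountSid = "YOUR_ACCOUNT_SID";
    private const string AuthToken  = "YOUR_AUTH_TOKEN";
    private const string FromNumber = "+15550000000";

    // Called by the Sentinet agent when SLA violation alerts are raised.
    // Signature shown here is indicative; consult the Sentinet SDK for the exact contract.
    public void ProcessAlerts(string configuration, IEnumerable<Alert> alerts)
    {
        // Our handler configuration is a CSV string of receiver phone numbers
        string[] receivers = configuration.Split(',');

        // Build one message by concatenating the alert descriptions
        string message = string.Join(Environment.NewLine,
                                     alerts.Select(a => a.Description));

        var client = new TwilioRestClient(AccountSid, AuthToken);
        foreach (string receiver in receivers)
        {
            client.SendSmsMessage(FromNumber, receiver.Trim(), message);
        }
    }
}
```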



The first step in registering your Custom Alert is to add your DLL(s) to the Sentinet installation folder. This way they can be accessed by the "Nevatech.Vsb.Agent" process, which is responsible for generating the alerts. There is no need to add your DLL(s) to the GAC.

The next step is to register our Custom Alert in Sentinet itself and assign it to the desired Service Agreement.

In the Sentinet Repository window, click on the Service Agreement, navigate to Alerts and choose Add Alert. The following screen will pop up.


The next step is to click the 'Add Custom Alert Action' button as shown below.


In the following screen we provide the parameters needed to configure our Custom Alert:

  • Name: The friendly name of the Custom Alert
  • Assembly: The fully qualified assembly name that contains the custom alert
  • Type: The .NET class that implements the IAlertHandler interface
  • Default Configuration: The optional default configuration. In this example I specified a CSV value of different phone numbers. You can access this value inside your class that implements the IAlertHandler interface.


Confirm the configuration by clicking 'OK'. In the next screen be sure to select your newly configured Custom Alert.

You will end up with the following Configured Violation Alerts.




To test this alert I modified my Service Agreement Metrics to a very low value (e.g. 'only 1 call is allowed per minute') so I could easily trigger the alert. After calling my Virtual Service multiple times within a minute, I received the following SMS:

Service agreement "CoditBlogAgreement"  has been violated one or more times. The most recent violation is reported for the time period started on 2014-07-24 18:15:00 (Romance Standard Time).


Sentinet is designed to be extensible in multiple areas of the product. In this post I’ve demonstrated how to create a Custom Alert Handler that will send an SMS when an SLA has been violated.



Glenn Colpaert

Posted in: .NET | BizTalk | Sentinet


July 22, 2014 at 4:00 PM

Recently I noticed some odd behavior while troubleshooting a SQL integration scenario with BizTalk 2010. I was using the WCF-Custom adapter to perform Typed Polling via a stored procedure. This stored procedure used dynamic SQL to fetch the data because it targets multiple tables with one generic stored procedure.

In this blog post I will show how we implemented this scenario and where the adapter failed when polling.
I will also cover some "problems" with the receive location and the adapter.
I will finish with some small hints that made our development easier.



In this simplified scenario we have an application that is polling on two tables called ‘tbl_Shipment_XXX’ where XXX is the name of a warehouse. Each warehouse will have a corresponding receive location that will poll the data that is marked as ready to be processed.

This is performed by a stored procedure called ‘ExportShipments’, which requires the name of the target warehouse and proceeds in the following steps: lock the data as being processed, export the data to BizTalk, and mark it as successfully processed.



Creating our polling statement

In our polling statement we execute our generic stored procedure. This procedure marks our data as being processed, calls a function that returns a SELECT statement as an NVARCHAR(MAX), executes that statement, and finally marks the data as processed.

	CREATE PROCEDURE ExportShipments @Warehouse nvarchar(10) AS
	BEGIN
		DECLARE @lockData NVARCHAR(MAX), @exportData NVARCHAR(MAX), @markProcessed NVARCHAR(MAX);
		SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
		EXEC sp_executesql @lockData;
		SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse);
		EXEC sp_executesql @exportData;
		SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
		EXEC sp_executesql @markProcessed;
	END

In our function we will compose a simple SELECT-statement where we fill in the name of our warehouse.

CREATE FUNCTION ComposeExportShipmentSelect (@Warehouse nvarchar(10))
RETURNS NVARCHAR(MAX) AS
BEGIN
	DECLARE @result NVARCHAR(MAX);
	SET @result = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
	RETURN @result;
END

I chose to separate the composition of the SELECT statement into a function because we were using a lot of JOINs & UNIONs, and I wanted to keep this logic out of the stored procedure for the sake of code readability.

Although our scenario is very simple we will use a function to illustrate my problem.

Generating the Typed-Polling schema

Now that everything is set in our database we are ready to generate our Typed-Polling schema for the WarehouseA-polling to our BizTalk project.

1. Right-click on your project and select Add > Add generated items... > Consume Adapter Service


2. Select the SqlBinding and click "Configure"

3. Fill in the Server & InitialCatalog and a unique InboundId. The InboundId will be used in the namespace of your schema and in the URI of your receive location. Each receive location requires its own schema. I used "WarehouseA" since I am creating the schema for polling on "tbl_Shipments_WarehouseA".

4. Configure the binding by selecting "TypedPolling" as the InboundOperationType and execute our stored procedure as PollingStatement with parameter "WarehouseA". (Note that we are not using ambient transactions)


5. Click Connect, select "Service" as contract type and "/" as category. If everything is configured correctly you can select TypedPolling and click Properties to see the metadata. If so, click Add and a schema will be generated.


The wizard will create the following schema with a sample configuration for your receive location.


It's all about metadata

The problem here is that executing the dynamic SQL doesn't provide the required metadata to BizTalk in order to successfully generate a schema for that result set.

I solved this by replacing the function 'ComposeExportShipmentSelect' with a new stored procedure called 'ExportShipment_AcquireResults'.
This stored procedure executes the dynamic SQL, inserts the result set into a table variable and returns it to the caller. This tells BizTalk what columns the result set will contain and of what type they are.

CREATE PROCEDURE ExportShipment_AcquireResults
	@Warehouse nvarchar(10)
AS
BEGIN
	DECLARE @result TABLE ([ID] [INT] NOT NULL, [ORDER_ID] [INT] NOT NULL,
		[WAREHOUSE_FROM] [NVARCHAR](10) NOT NULL, [WAREHOUSE_TO] [NVARCHAR](10) NOT NULL, [QUANTITY] [INT] NOT NULL);
	DECLARE @select NVARCHAR(MAX);
	SET @select = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
	INSERT INTO @result EXEC sp_executesql @select;
	SELECT * FROM @result;
END

Our generic polling stored procedure now simply executes our new stored procedure.

	ALTER PROCEDURE ExportShipments @Warehouse nvarchar(10) AS
	BEGIN
		DECLARE @lockData NVARCHAR(MAX), @markProcessed NVARCHAR(MAX);
		SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
		EXEC sp_executesql @lockData;
		EXEC [ShopDB].[dbo].[ExportShipment_AcquireResults] @Warehouse;
		SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
		EXEC sp_executesql @markProcessed;
	END

When we regenerate our schema the problem should be resolved and your schema looks like this -
(Note that the new filename includes your InboundId)


Why not use a #tempTable?

You can also achieve this by setting FMTONLY OFF and using a temporary table but this will not work in every scenario.

	ALTER PROCEDURE ExportShipments @Warehouse nvarchar(10) AS
	BEGIN
		SET FMTONLY OFF;
		DECLARE @lockData NVARCHAR(MAX), @exportData NVARCHAR(MAX), @markProcessed NVARCHAR(MAX);
		SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
		EXEC sp_executesql @lockData;
		SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse);
		CREATE TABLE #tempTable ([ID] [INT] NOT NULL, [ORDER_ID] [INT] NOT NULL,
			[WAREHOUSE_FROM] [NVARCHAR](10) NOT NULL, [WAREHOUSE_TO] [NVARCHAR](10) NOT NULL, [QUANTITY] [INT] NOT NULL);
		INSERT INTO #tempTable EXEC sp_executesql @exportData;
		SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
		EXEC sp_executesql @markProcessed;
		SELECT * FROM #tempTable;
		DROP TABLE #tempTable;
	END

We were using transactions at SQL level with Try/Catch statements, but this conflicted with FMTONLY and resulted in a severe error in the event log. It also didn't make sense for us, since we had separated the SELECT composition into a separate function/stored procedure, and the definition of the result set should be defined there.

If you want to read more about #tempTables, I recommend this post.


PollingDataAvailableStatement & no ambient transactions

With our schema generated I deployed my application to my machine and started creating the receive locations for the polling on my database. The configuration is pretty easy - no ambient transactions, Typed Polling, the polling statement is our stored procedure, and a SELECT in the PollDataAvailableStatement to check whether we need to run the stored procedure.

Apparently the PollDataAvailableStatement is only used when you enable ambient transactions according to this article.

The problem is that when you clear out the PollDataAvailableStatement and start your receive location, it is disabled automatically with the following error -

PollDataAvailableStatement seems to be a mandatory field even though it is not being used; I easily fixed this with "SELECT 0".
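As a sketch, these are the two flavors of PollDataAvailableStatement we ended up with - the dummy statement for the non-transactional setup, and a real availability check (using the warehouse table and status values from earlier in this post) for when the statement is actually consulted:

```sql
-- Dummy statement: always reports data available, so the polling statement always runs
SELECT 0

-- Alternative: only poll when there are rows ready to be processed (STATUS_ID = 1)
SELECT COUNT(*) FROM [ShopDB].[dbo].[tbl_Shipment_WarehouseA] WHERE [STATUS_ID] = 1
```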


Empty result sets & TimeOutExceptions

Later in the project we had to move from transactions at SQL level to ambient transactions, and our PollDataAvailableStatement was consulted before the stored procedure ran.

In our stored procedure we used the polled data to perform cross-references and only returned a result set when required, so it is possible that no result set is returned at all.

We started experiencing locks on our tables, and TimeOutExceptions occurred without any obvious reason.
The adapter turned out to have a bug: when the adapter finds data, it must return a result set to BizTalk; if it doesn't, it keeps SQL resources locked and blocks the tables.

This issue can be resolved by installing the standalone fix (link) or BizTalk Adapter Pack 2010 CU1 (link).


Tips & Tricks

During the development of this scenario I noticed some irregularities when using the wizard, here are some of the things I noticed -

  • It is good practice to validate the metadata of the TypedPolling operation before adding it. If something is misconfigured or your stored procedure is incorrect, you will receive an error with more information.
    If this is the case you should click OK instead of clicking the X, because otherwise the wizard will close. The same error might occur when you finalize the wizard; then it is important to click Cancel instead of OK, or the wizard will close automatically.
  • If your wizard closes for some reason, it will remember the configuration of the previous run when you restart it, and you can simply connect, select the requested operation and request the metadata or finish the wizard.
    It might happen that this results in a MetadataException saying that the PollingStatement is empty and therefore invalid -

    This is a bug in the wizard: you always need to open the URI configuration, even though it still remembers your settings. You don't need to change anything; just open & close it and the exception is gone.


In this blog post I highlighted the problem where I was unable to generate a schema for my stored procedure because it only executed the result of my function. I also illustrated how I fixed it and why I didn't use FMTONLY and a temporary table.

Next to the generation of our schemas I talked about the problems with the receive location where it required a PollDataAvailableStatement even when it was not used and the locking of our tables because our stored procedure wasn't returning a result set. Last but not least I gave two examples of common irregularities when using the wizard and how you can bypass them.

For me it is important to write your SQL scripts like you write your code - use decent comments, and separate your procedure into subprocedures & functions according to separation of concerns.
While writing your scripts everything might seem obvious, but will it still be in a couple of months? And what about your colleagues?

All the scripts for this post, incl. DB generation, can be found here.

Thank you for reading,


July 3, 2014 at 4:03 PM

With some luck your environment was installed by a BizTalk professional and the most critical maintenance tasks were configured at installation. By critical maintenance tasks I mean the BizTalk backup job, DTA Purge & Archive, and so on. If these tasks are not configured, I'm sure your production BizTalk environment will only run smoothly for a couple of days or months, but definitely not for years!


I'm not going to write about basic configuration tasks as most BizTalk professionals should be aware of these. 

At Codit Managed Services our goal is to detect, avoid and solve big and small issues before it's too late. This way we often detect smaller missing configuration tasks. Not necessarily critical at first, but definitely necessary for the BizTalk environment to keep running smoothly through the years!


Purging/archiving the BAM databases:

I'm going to explain how you should maintain the BizTalk BAM databases:


In most environments these maintenance tasks are missing. You will find the necessary information on the internet, but probably only when it's urgent and you really need it.


My goal is to provide you with this information in time, so you are able to configure the BAM maintenance tasks without putting your SQL server (or yourself) under heavy stress. When it comes to BAM and data purging, it's rather simple: by default, nothing is purged or archived. Yes, even if you have the BizTalk backup and all the other jobs running successfully!


I have seen a production environment, running for only two years, where the BAMPrimaryImport database had grown to 90GB! It takes a lot of time and processing power to purge such a database. To maintain and purge the BAM databases you need to configure how long you want to keep the BAM data, configure archiving, trigger several SQL SSIS packages, and so on.


The problem is that purging is configured per activity, so this is a task for the developer and not for the person who installed the BizTalk environment. You will find all the information to do this on the following sites:


BAM data not being purged immediately?

Something very important to be aware of: if you, for example, want to keep a year of data, you will have to wait another year for the data to be purged/archived in a supported way!

That's why it's so important to configure these jobs in time! It's not like the DTA P&A job, where the data is purged immediately.

You can find more information about this on following blog:


Purging the BAMAlertsApplication database:

I'm rather sure the following maintenance task is not scheduled to clean your BAMAlertsApplication database; I only discovered it myself a couple of days ago! Probably not a lot of people notice this database because it's rather small. After two years running in production with a small load, it had a size of 8GB. But that's 8GB of wasted disk space!


If you search the internet on how to clean this database, you will find nothing official from Microsoft.

Credits on how to purge the BAMAlertsApplication go to Patrick Wellink and his blogpost:

If you wonder what the NSVacuum stored procedure looks like, you can find it below:

USE [BAMAlertsApplication]
GO

CREATE PROCEDURE [dbo].[NSVacuum]
	@SecondsToRun				INT
AS
BEGIN
	DECLARE @StartTime			DATETIME
	DECLARE @CutoffTime			DATETIME
	DECLARE @QuantumEndTime		DATETIME
	DECLARE @QuantumsVacuumed	INT
	DECLARE @QuantumsRemaining	INT
	DECLARE @VacuumStatus		INT

	SET @QuantumsVacuumed = 0
	SET @QuantumsRemaining = 0

	SET @StartTime = GETUTCDATE()

	-- Volunteer to be a deadlock victim
	SET DEADLOCK_PRIORITY LOW

	EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

	IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
		GOTO CountAndExit

	DECLARE @RetentionAge		INT
	DECLARE @VacuumedAllClasses	BIT

	-- Remember the last run time and null counts
	UPDATE [dbo].[NSVacuumState] SET LastVacuumTime = @StartTime, LastTimeVacuumEventCount = 0, LastTimeVacuumNotificationCount = 0

	-- Get the retention age from the configuration table (there should only be 1 row)
	SELECT TOP 1 @RetentionAge = RetentionAge FROM [dbo].[NSApplicationConfig]

	SET @CutoffTime = DATEADD(second, -@RetentionAge, GETUTCDATE())

	-- Vacuum incomplete event batches
	EXEC [dbo].[NSVacuumEventClasses] @CutoffTime, 1

	-- Mark expired quantums as 'being vacuumed'
	UPDATE	[dbo].[NSQuantum1] SET QuantumStatusCode = 32
	WHERE	(QuantumStatusCode & 64) > 0 AND		-- Marked completed
			(EndTime < @CutoffTime)					-- Old

	DECLARE @QuantumId			INT

	DECLARE QuantumsCursor CURSOR FOR
	SELECT	QuantumId, EndTime
	FROM	[dbo].[NSQuantum1]
	WHERE	QuantumStatusCode = 32

	OPEN QuantumsCursor

	-- Do until told otherwise or the time limit expires
	WHILE (1=1)
	BEGIN
		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
			GOTO CloseCursorAndExit

		FETCH NEXT FROM QuantumsCursor INTO @QuantumId, @QuantumEndTime

		IF (@@FETCH_STATUS != 0)
		BEGIN
			SET @VacuumStatus = 2		-- VacuumStatus 2 == Completed
			SET @QuantumsRemaining = 0
			GOTO CloseCursorAndExit
		END

		-- Vacuum the Notifications
		EXEC [dbo].[NSVacuumNotificationClasses] @QuantumId, @VacuumedAllClasses OUTPUT

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
			GOTO CloseCursorAndExit

		-- Vacuum the Events in this quantum
		EXEC [dbo].[NSVacuumEventClasses] @QuantumEndTime, 0

		-- Delete this Quantum from NSQuantums1 if its related records were also deleted
		IF (1 = @VacuumedAllClasses)
		BEGIN
			DELETE [dbo].[NSQuantum1] WHERE QuantumId = @QuantumId

			-- Update the count of quantums vacuumed
			SET @QuantumsVacuumed = @QuantumsVacuumed + 1
		END

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
			GOTO CloseCursorAndExit

	END	-- Main WHILE loop

CloseCursorAndExit:

	CLOSE QuantumsCursor
	DEALLOCATE QuantumsCursor

CountAndExit:

	-- Report progress
	SET @QuantumsRemaining = (SELECT COUNT(*) FROM [dbo].[NSQuantum1] WITH (READUNCOMMITTED) WHERE QuantumStatusCode = 32)

	SELECT	@VacuumStatus AS Status, @QuantumsVacuumed AS QuantumsVacuumed,
			@QuantumsRemaining AS QuantumsRemaining

END -- NSVacuum

You need to schedule the NSVacuum stored procedure on your environment. Roll this out step by step, as it puts a lot of stress on your SQL server.
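As a sketch, the scheduled job step can simply execute the procedure with a time limit; the @SecondsToRun parameter (checked by NSVacuumCheckTimeAndState above) caps how long a single run may take, so you can start with short runs and repeat them:

```sql
-- Run the vacuum for at most 5 minutes; repeat (e.g. nightly) until
-- the QuantumsRemaining column in the returned result set reaches 0.
EXEC [BAMAlertsApplication].[dbo].[NSVacuum] @SecondsToRun = 300
```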


The content of this post is something every BizTalk professional should add to his BizTalk Server installation/deployment procedure!

June 25, 2014 at 11:00 AM

As discussed yesterday in my blog post on the enhancements to the WCF-WebHttp Adapter, the R2 releases of the BizTalk products focus more on 'compatibility and platform' alignment than on shipping major new features/add-ons to the platform.

To give you another overview, the following features were added in the new BizTalk Server 2013 R2 release:

  • Platform Alignment with Visual Studio, SQL Server,…
  • Updates to the SB-Messaging Adapter
  • Updates to the WCF-WebHttp Adapter
  • Updates to the SFTP Adapter
  • Updates to the HL7 Accelerator

This blog post will focus on the updates to the SB-Messaging Adapter that were shipped with this new release of Microsoft BizTalk Server.


SB-Messaging Adapter enhancements

With this new version of BizTalk Server the SB-Messaging adapter now also supports SAS (Shared Access Signature) authentication, in addition to ACS (Access Control Service).

Due to this improvement BizTalk Server can now also interact with the on-premise edition of Service Bus that is available through the Windows Azure Pack.


More information on SAS Authentication can be found here:

More information on the Windows Azure Pack can be found here:


The image below is a head to head comparison between the ‘old’ version of the SB-Messaging adapter authentication properties window and the ‘new’  version that ships with BizTalk Server 2013 R2.





Out with the old, in with the new

When we compare the 'Access connection information' from both the Windows Azure Portal (cloud) and the Windows Azure Pack Portal (on premise), we already see a clear difference in the ways to authenticate to Service Bus.

You also notice why the update to include SAS was really a "must have" for BizTalk Server.

On the left side you see the cloud portal, which supports both SAS and ACS; on the right side, the on-premise version of Service Bus, which only supports SAS and Windows authentication.





This ‘small’ addition to the SB-Messaging adapter has a great impact on interacting with Service Bus as we finally can use the full potential of the on-premise version of Service Bus.

We have started implementing this feature at one of our customers, where one of the requirements is to use the on-premise version of Service Bus. First tests are looking good, and the behavior and way of implementing are the same as when connecting to Service Bus in the cloud.

One downside, in my opinion, is the lack of support for Windows Authentication in the adapter.


Happy Service Bus’ing !!

Glenn Colpaert

June 24, 2014 at 9:52 AM

Now that BizTalk 2013 R2 is released on MSDN, it’s time to take a first look at the new features and possibilities of this release.

As Guru already clarified at the last BizTalk Summit (BizTalk Summit Summary), the R2 releases of the BizTalk products focus more on 'compatibility/platform' alignment and less on shipping (major) new features/add-ons to the platform.

To give you an overview, the following features were added in the new BizTalk Server 2013 R2 release:

  • Platform Alignment with Visual Studio, SQL Server,…
  • Updates to the SB-Messaging Adapter
  • Updates to the WCF-WebHttp Adapter
  • Updates to the SFTP Adapter
  • Updates to the HL7 Accelerator

This blog post will focus on the updates to the WCF-WebHttp Adapter that were shipped with this new release of Microsoft BizTalk Server.


WCF-WebHttp Adapter enhancements

With this new version of BizTalk Server the WCF-WebHttp adapter now also supports sending and receiving JSON messages. This new version of BizTalk ships with a wizard to generate an XSD schema from a JSON instance and two new pipelines (and components) for processing JSON messages in a messaging scenario.


Out with the old, in with the new

Let us take a quick glance at the new components that are shipped with this new BizTalk release. We first of all have 2 new pipelines and components for encoding and decoding JSON messages in our BizTalk Ports.


Configuring these two new components is very straightforward. On the encoder side there is one component property that specifies whether the XML root element should be ignored while encoding the XML message to JSON. On the decoder side there are two properties: the root node name and the root node namespace used by the decoder in the generated XML message.
You can find a screenshot of the properties of both components below.
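To make the decoder properties concrete, here is an illustrative (hypothetical) example: suppose the decoder is configured with root node name `ItunesResults` and root node namespace `http://example.org/itunes`. A JSON response like

```json
{ "resultCount": 1, "results": [ { "artistName": "Metallica" } ] }
```

would be decoded into XML along these lines, with the configured root node and namespace wrapping the converted content:

```xml
<ItunesResults xmlns="http://example.org/itunes">
  <resultCount>1</resultCount>
  <results>
    <artistName>Metallica</artistName>
  </results>
</ItunesResults>
```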





Next to 2 new components for parsing the JSON messages there is also the new JSON Schema Wizard which allows you to generate XSD files based on a JSON instance. You can find this new wizard in the "Add -> New Item" menu of Visual Studio.



JSON, do you speak it?

To demonstrate the new features and possibilities of the enhanced WCF-WebHttp adapter, I created a little POC that uses the iTunes Search API.

First of all I downloaded a JSON instance from the API from following URL:


Next, I used this JSON instance to generate the XSD with the new JSON Schema Wizard. The wizard itself is pretty straightforward: you specify the ‘Instance File’, choose your ‘Root Node Name’ and ‘Target Namespace’, and press ‘Finish’ to get your XSD generated.


When using the ‘Metallica’ instance this results in the following schema.




After generating the schema I started configuring the Send Port to actually consume the service from BizTalk.

Below you can see the configuration of my 2-Way Send Port that is sending requests to the API. We have a PassThruTransmit pipeline and as receive pipeline we have a custom pipeline with the JSON decoder inside it.






When sending the request to the API we get the following message in the BizTalk MessageBox (the parsed JSON instance).


This sums up the basic demo of the new enhancements of the WCF-WebHttp adapter.

If you have questions regarding this scenario, please share them in the comments section of this blog post and I’ll be happy to answer them.



Glenn Colpaert

Posted in: BizTalk | Schemas | WCF


June 19, 2014 at 4:29 PM

Some technologies have withstood the test of time very well. One of them is the IBM AS/400 system (now called IBM System i). A lot of software still runs on AS/400 or mainframe systems.

Luckily, Microsoft provides us with the right tools to integrate with System i and System z environments. These tools are packaged into Host Integration Server and allow you to integrate on different levels with Host environments:

  • Network integration
  • Application or Program integration
  • Data integration (databases and Host files)
  • Message integration (queuing)
  • Security integration

In this post, I will go deeper into Application Integration. More specifically calling a piece of RPG code hosted on an AS/400 system. I will call the RPG program from BizTalk Server by using the adapter for Host Applications.

The scenario

The AS/400 is running a RPG program that returns the balance of someone’s account when passing his name and account number.
BizTalk Server will call into this RPG program by passing in the input parameters and receiving the return value (balance).
BizTalk Server will leverage the Adapter for Host Applications to make this distributed program call into RPG possible.

The RPG program looks like this (*will be used later, you don’t have to understand RPG code):


      *  PROGRAM      - GBALDPC
      *  WRITTEN FOR  - MICROSOFT Corporation

      * Data Definitions
     D Name            s             30A
     D AccNum          S              6A
     D AccBal          S              9P 2

      * Mainline
     C     *ENTRY        PLIST
     c                   PARM                    Name
     c                   PARM                    AccNum
     c                   PARM                    AccBal

     c                   Z-ADD     2003.66       AccBal

     C                   EVAL      *INLR         = *ON


For the sake of simplicity, but also to make it easy for you to reproduce this, I’m using the standard Microsoft ‘GetBal’ sample here.

Let's get started…

To realize this scenario, the following components are used:

  • AS/400 running the RPG program (in my case, the ‘real’ AS/400 is replaced by a ‘simulator’, because I don’t have an AS/400 on my laptop…)
  • BizTalk Server 2013
  • Host Integration Server 2013
  • Visual Studio 2012

After you setup BizTalk Server and Host Integration Server (not part of this blog post), the work in Visual Studio can start.

Visual Studio will display 4 new project types that were installed together with Host Integration Server 2013.


The project types:

  • BizTalk Pipeline Conversion: allows you to integrate with Host data structures without using the Adapter for Host Applications. This is very useful when you are – for example – reading a Host data structure from a queue.
  • Host Application: used when you call a Host program via a .NET program or via the BizTalk Adapter for Host Applications (BAHA). A Host Application project describes how the Host data structure translates to .NET data structures and vice versa.
  • Host Files: integrates with the Host file system (VSAM for example).
  • Message Queuing: integrates with Host queuing systems.


For this scenario, use the ‘Host Application’ project type and create a new project based on that template.

After creating the project, a blank canvas is presented:


To continue, we will create a .NET Definition. There are two types of .NET Definitions, a Client and a Server Definition.

The .NET Client Definition is used when .NET will call a program on the Host system (also called Windows-Initiated Processing or WIP).
The .NET Server Definition is used when the Host System will call a .NET program (also called Host-Initiated Processing or HIP).

In this case, we will call the Host program, so choose ‘Add .NET Client Definition’.


Name your .NET Client Definition:


On the next page, provide a name for the interface, and an initial version number.


The next page in the wizard looks like this:


Protocol: Choose the protocol used to communicate with your Host system (TCP, HTTP or LU6.2). AS/400 uses TCP, so we select that. Note: for Mainframe integration, 2PC (two-phase commit) is only possible over LU6.2.

Target environment: Choose the target environment. In this case System i Distributed Program Call (other options: System i, System z, CICS, IMS)

Programming model: DPC. The programming model depends on the Target environment that was selected. More information on programming models and their functionalities can be found here.

Host Language: Choose the language of the Host program that will be called. In this case RPG. (Can also be COBOL)

On the next screen in the wizard, we have to specify in which Program library our Host program is located.
The Host administrator or programmer should be able to provide you with this value:


Finally, when the wizard is complete, a new .NET Client Definition is added to the Host Application project.


Now that we have a base Host Application project with a .NET Client Definition, we are ready to import the data definition of the RPG program. This import will set the in/out and return parameters of the RPG program. This can be done by importing the RPG program, or manually by adding functions and parameters to the IBanking interface.

The easiest way is to import the RPG program. Right click the ‘GetBalClient’ and select ‘Import Host Definition’.


In the wizard, browse to the RPG program:


When your RPG program is read without errors, the following screen of the wizard is shown:


Choose a friendly name for the method we will call. In this case ‘RPGGetBalance’.

The Library name is inferred from the information we entered earlier, so this should be correct.
Finally, set the Program name. This is the program name on the Host environment; this value should be provided by the Host administrator (or developer).

On the next screen, set the direction of each parameter:


In this case, ‘NAME’ and ‘ACCNUM’ are input parameters. ‘ACCBAL’ is the return value.

Click ‘Finish’.

The Host Definition was imported successfully:


With that, our Host Application project is finished! We will now generate a DLL and Xsd schema that we can use in BizTalk Server.
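Conceptually, the client definition we just imported describes an interface along the lines of the hypothetical C# sketch below; the actual generated types and signatures depend on the wizard entries and the RPG field definitions:

```csharp
// Hypothetical sketch of the imported client definition.
// NAME and ACCNUM are the input parameters, ACCBAL is the return value;
// the actual .NET types are derived from the RPG data types.
public interface IBanking
{
    decimal RPGGetBalance(string NAME, string ACCNUM);
}
```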

Right click the Host Application project and click ‘Save AS400RPG’. (Unlike other .NET project types, a Host Application project is not compiled but saved!)


After saving, a DLL and XSD are generated in the project's folder structure.

We are now ready to set up the BizTalk part. We will keep the BizTalk part very simple: a receive location will read a request message from a file location, a solicit/response send port will send the request to AS/400, and a send port will subscribe to the result and write it to an output folder.

Let’s focus on the solicit/response send port that will call the RPG program.

Open the send port's properties and select the HostApps adapter:


Open the adapter's configuration:


Set the HostApps Transport properties as shown above.
Then click the ellipsis button on the ‘Connection Strings’ field.

The connection string window is shown.


Click the ‘Add assemblies – Browse…’ button.


Browse to the DLL that was generated by the Host Application Project in Visual Studio.
The DLL is added to the connection string window:


Double click the newly added DLL to set its connection string properties:


Set a TimeOut value for calls to the RPG program.

Scroll down to the TCP/IP properties:


Set the IP address and port number of the AS/400.

The send port to call the RPG program on AS/400 is now configured correctly.

The last thing to do is test our RPG integration. To do that, I will create a sample message based upon the XSD generated by the Host Application project.

My sample message looks like this:

    <ns0:NAME>Kim Akers</ns0:NAME>
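The snippet above shows only the NAME element. A complete request could look like the sketch below; the root element name, namespace and account number are hypothetical, as the actual values come from the XSD generated by the Host Application project:

```xml
<!-- Hypothetical request message; root name and namespace depend on the generated XSD -->
<ns0:RPGGetBalance xmlns:ns0="http://tempuri.org/AS400RPG">
  <ns0:NAME>Kim Akers</ns0:NAME>
  <ns0:ACCNUM>123456</ns0:ACCNUM>
</ns0:RPGGetBalance>
```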

When BizTalk picked up the message, the RPG program replied with the expected balance response:



As you can see, integrating with Host programs (RPG, COBOL) is possible out-of-the-box with BizTalk Server and Host Integration Server.
In straightforward scenarios like this, no coding at all was necessary!


Peter Borremans

May 26, 2014 at 2:11 PM

When working with a lifecycle product such as Microsoft BizTalk Server, it is recommended to always keep a close eye on its support lifecycle and that of its supported platforms.

For a couple of common BizTalk setups, mainstream support is ending soon:

  • SQL Server 2008R2 : July 2014
  • BizTalk 2009 : July 2014
  • Windows Server 2008 SP2 : January 2015

An upgrade to the, at this time, latest BizTalk (2013) and SQL Server version (2012 SP1) is the best way to ensure further support and updates. Depending on the environment and risk assessment, this can be done by moving to a new environment or by upgrading the existing BizTalk infrastructure. For older versions the last option might not be a viable solution, since a direct upgrade will not be supported.

To upgrade to BizTalk 2013, the existing environment (for server systems) should be BizTalk 2010, running on Windows Server 2008 R2 SP1 (or later). For the SQL environment, Microsoft supports an upgrade starting from SQL Server 2005. The minimum requirement for BizTalk 2013 is SQL Server 2008 R2 SP1; for BizTalk 2010 this was SQL Server 2008 SP1 / SQL Server 2008 R2.


As a consequence, a supported upgrade to BizTalk 2013 might require more than one upgrade step:




No two deployments of BizTalk Server are ever the same, and this holds true for the environment and applications in each setup. A detailed analysis of the environment and the upgrade path is highly recommended.

Codit supports and maintains the BizTalk environments of a large number of companies. This makes our consultants well aware of the challenges companies face when upgrading or moving to BizTalk 2013.


To make the right decision in regard to the upgrade of your BizTalk environment, we can assess your current environment and propose an upgrade plan based on the experiences we have gained in multiple projects.


Contact us if you would like to get started, or require more information:



May 21, 2014 at 4:03 PM


Everybody probably acknowledges that building and deploying a BizTalk solution without any help can be a serious pain. It's even more of a hassle when your development BizTalk Server and Visual Studio are on separate machines.

An additional problem that we encountered at the customer was that the test environment consists of two nodes, so our project had to be deployed on both, since no access to our personal development machines was wanted or allowed.

Because we didn't feel like building the project, adding the built files as resources and extracting an MSI file every time we needed to deploy to test, we searched for an easy solution to this problem. We had a few options: use custom build scripts, use the built-in capabilities of Visual Studio, or use the BizTalk Deployment Framework. The decision was made to go for custom build scripts, since they were the most versatile, didn't require a lot of additional setup, and were very simple, so usable by everyone.

In the following paragraphs, I will try and teach you how to use simple MSBuild scripts to build and deploy your solutions, allowing you to simplify your deployment lifecycle.

These scripts will allow you to do the following:

  • Get the latest sources
  • Build the project
  • Prepare our bindings with a custom tool for development / test / acceptance / production
  • Add the resources to the development machine
  • Import bindings
  • Restart development host instances
  • Extract an MSI
  • Import MSI in the Test environment
  • Install MSI on the nodes in the environment
  • Import test bindings
  • Restart test host instances
  • Prepare packages for validation and production releases
  • Create / update databases
  • Many, many more…

Just a little teaser: doing the above (except for the prepping and databases) takes only 10 seconds for a small project:

I'm going to start with a general deployment, and look into a few sidetracks afterwards.



Script skeleton

<Project xmlns="">
	<Import Project="$(MSBuildExtensionsPath)\ExtensionPack\4.0\MSBuild.ExtensionPack.tasks" />
	<Target Name="Default">
		<CallTarget Targets="Development;Test " />
	<Target Name="Development">
		<Message Text="1. Cleanup" Importance="High" />
		<Message Text="2. Building source" Importance="High" />
		<Message Text="3. Deploying Application" Importance="High" />
		<Message Text="3.1. Stopping host instances" Importance="High" />
		<Message Text="3.2. Creating application" Importance="High" />
		<Message Text="3.3. Adding resources" Importance="High" />
		<Message Text="3.4. Importing bindings" Importance="High" />
		<Message Text="3.5. Starting host instances" Importance="High" />
		<Message Text="3.6. Starting application" Importance="High" />
		<Message Text="4. Create Release Package" Importance="High" />
		<Message Text="4.1. Exporting MSI" Importance="High" />
		<Message Text="4.2. Copying bindings" Importance="High" />
	<Target Name="Test">
		<Message Text="1. Cleanup" Importance="High" />
		<Message Text="2. Create Release package" Importance="High" />
		<Message Text="3. Install on test" Importance="High" />
		<Message Text="3.1. Create application" Importance="High" />
		<Message Text="3.2. Importing MSI" Importance="High" />
		<Message Text="3.3. Installing MSI" Importance="High" />
		<Message Text="3.4. Importing bindings" Importance="High" />
		<Message Text="3.5. Restarting Host Instances" Importance="High" />
		<Message Text="3.6. Starting application" Importance="High" />

General properties

  • Application: The name of the application (in my case also the name of the solution file)
  • Version: Version number used in ReleasePath, to keep an archive of different MSI's and releases
  • DevMachine: Computer name of your BizTalk development machine (if it's the same one you're running the script on, you can just use $(COMPUTERNAME))
  • TestMachine: Similar to the DevMachine property, but for the TEST environment
  • TestDatabaseServer: Name of the database server for your Test environment (if it's the same as the TestMachine, you can just use $(TestMachine))
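These properties would typically be declared in a PropertyGroup at the top of the script; a sketch could look like the fragment below, where all values are placeholders for illustration (including the ReleasePath layout):

```xml
<PropertyGroup>
	<!-- Placeholder values; adapt to your own solution and machines -->
	<Application>MyBizTalkApplication</Application>
	<Version>1.0.0</Version>
	<DevMachine>$(COMPUTERNAME)</DevMachine>
	<TestMachine>TESTBTS01</TestMachine>
	<TestDatabaseServer>$(TestMachine)</TestDatabaseServer>
	<ReleasePath>C:\Releases\$(Application)\$(Version)</ReleasePath>
</PropertyGroup>
```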

Development Section


This is a pretty straightforward section; it only contains two commands to clean two folders:

<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" Path="$(APPDATA)\Microsoft\BizTalk Server\Deployment" />
<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" Path="$(ReleasePath)" Condition="Exists($(ReleasePath))" />

Since I was having issues with binding files being cached, I opt to clean the cache folder before the build. The ReleasePath folder is being cleaned as well, just so you can verify that your installer was created (if you were to automate the deployment). The Condition first checks if the folder exists, since it would be of no use to clean a non-existent folder.

Building source

Building the source isn't a hard task. Just point to your solution file and MSBuild takes care of the rest. (Don't forget to keep your built assemblies, since you'll be needing them later)

<MSBuild Projects="$(Application).sln">
	<Output TaskParameter="TargetOutputs" ItemName="Assemblies"/>
</MSBuild>

Deploy the application

Stopping host instances

For development I always like to stop the host instances before deploying. So first you have to get a list of all host instances:

<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Get">
	<Output TaskParameter="HostInstances" ItemName="His" />

After which you can stop them:

<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Stop" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4" />

Note that I use a Condition here to check the "ServiceState", so I only stop the "running" host instances.

Creating application

Now that the host instances have been stopped, you can create the application. Before we can do so, we have to check if it already exists; if it does, we just stop it completely:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="CheckExists" Application="$(Application)" MachineName="$(DevMachine)">
	<Output TaskParameter="Exists" PropertyName="ApplicationExists" />
<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="StopAll" Applications="$(Application)" MachineName="$(DevMachine)" Condition="$(ApplicationExists)" />
<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="Create" Applications="$(Application)" MachineName="$(DevMachine)" Condition="!$(ApplicationExists)" />

Adding resources

Next up is adding your assemblies to your newly created application:

<MSBuild.ExtensionPack.BizTalk.BizTalkAssembly TaskAction="Add" Application="$(Application)" Assemblies="@(Assemblies)" MachineName="$(DevMachine)" GacOnMsiFileImport="false" Force="true" GacOnMsiFileInstall="true" />

By default the assemblies are set to be added to the GAC on MSI file import, but since I want them to be added to the GAC on install, I swap the parameters. The Force option here is to overwrite existing assemblies.

Importing bindings

Following that, the bindings should be imported. I keep my bindings separated per environment in a subdirectory of the solution, so your BindingFile location may vary; nothing very special about this:

<MSBuild.ExtensionPack.Biztalk.BizTalkApplication TaskAction="ImportBindings" Application="$(Application)" MachineName="$(DevMachine)" BindingFile=".\Bindings\DEV\$(Application).xml" />

Starting host instances

Once the import has succeeded, restart the Host Instances. Note that the same condition applies here: only the Host Instances that were running are started again, to avoid starting ones that shouldn't be:

<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Start" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4" />

Start application

Last thing to do is start your application again:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="StartAll" Applications="$(Application)" MachineName="$(DevMachine)" />

Create release package

Exporting MSI

Now that the deployment has (hopefully) succeeded, we need to export the MSI package for deployment to other environments.

The export needs the fully qualified name of the assemblies if you want to exclude the binding and party information from the MSI, and we only have the path, so we first need to get extended information about every assembly:

<MSBuild.ExtensionPack.Framework.Assembly TaskAction="GetInfo" NetAssembly="%(Assemblies.Identity)">
	<Output TaskParameter="OutputItems" ItemName="AssembliesToExport" />
</MSBuild.ExtensionPack.Framework.Assembly>

Now that you have it, you can pass this information on to the ExportToMsi task, which will export the MSI into the root directory of your ReleasePath (because one MSI per version should be installed on all your environments):

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="ExportToMsi" Application="$(Application)" Resources="@(AssembliesToExport->'%(FullName)')" MsiPath="$(ReleasePath)\$(Application).msi" MachineName="$(DevMachine)" />

Copying bindings

For the sake of completeness, I also copy the binding file to the ReleasePath directory (the "/z" option is to allow restartable transfers to network drives if your connection would fail):

<MSBuild.ExtensionPack.FileSystem.RoboCopy Source=".\Bindings\DEV" Destination="$(ReleasePath)\DEV" Files="*.xml" Options="/z" />

Test section

Since I don't have a test machine for these purposes, I assume my development machine is also my test machine.


This is almost the same as the cleanup in the Development section (Cleanup), with the difference that instead of clearing the whole ReleasePath directory, I only clear the TEST subdirectory:

<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" Path="$(APPDATA)\Microsoft\BizTalk Server\Deployment" />
<MSBuild.ExtensionPack.FileSystem.Folder TaskAction="RemoveContent" Path="$(ReleasePath)\TEST" Condition="Exists('$(ReleasePath)\TEST')" />

Create release package

Since my release package only needs the binding file, all I need to do is copy my binding file from my solution directory to my release path directory:

<MSBuild.ExtensionPack.FileSystem.RoboCopy Source=".\Bindings\TEST" Destination="$(ReleasePath)\TEST" Files="*.xml" Options="/z" />

Install on test

Create application

This is identical to the part in the development section (Creating application), except that it's on another server this time:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="CheckExists" Application="$(Application)" MachineName="$(TestMachine)" DatabaseServer="$(TestDatabaseServer)">
	<Output TaskParameter="Exists" PropertyName="ApplicationExists" />
<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="StopAll" Applications="$(Application)" MachineName="$(TestMachine)" Condition="$(ApplicationExists)" DatabaseServer="$(TestDatabaseServer)" />
<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="Create" Applications="$(Application)" MachineName="$(TestMachine)" Condition="!$(ApplicationExists)" DatabaseServer="$(TestDatabaseServer)" />

Importing MSI

Once your release package has been prepared, you can import the MSI file into BizTalk. Yet again, the Overwrite switch enables you to overwrite existing assemblies; otherwise the import fails:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="ImportFromMsi" Application="$(Application)" MachineName="$(TestMachine)" MsiPath="$(ReleasePath)\$(Application).msi" Overwrite="True"DatabaseServer="$(TestDatabaseServer)" />

Installing MSI

Using psexec, we can remotely install the MSI file on the server:

<Exec Command="psexec \\$(TestMachine) msiexec /i $(ReleasePath)\$(Application).msi /quiet" />

Importing bindings

Since no binding files are included in the MSI, we have to import our own from the ReleasePath directory, similar to what we did before (Importing bindings):

<MSBuild.ExtensionPack.Biztalk.BizTalkApplication TaskAction="ImportBindings" Application="$(Application)" MachineName="$(TestMachine)" BindingFile="$(ReleasePath)\TEST\$(Application).xml"DatabaseServer="$(TestDatabaseServer)" />

Restarting host instances

This part is similar to the stopping (Stopping host instances) and starting (Starting host instances) of the host instances in the development part. It just does it after the import of the MSI and bindings, so it causes the least amount of downtime possible:

<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Get">
	<Output TaskParameter="HostInstances" ItemName="His" />
<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Stop" MachineName="$(TestMachine)" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4"/>
<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Start" MachineName="$(TestMachine)" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4" />

Starting the application

After the deploy has succeeded, start the application:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="StartAll" Applications="$(Application)" MachineName="$(TestMachine)" DatabaseServer="$(TestDatabaseServer)" />

Special scenarios

Validation and Production

As you might have noticed, there are no sections for Validation and Production in this script. I've left them out because there is no support for checking for running service instances.

What you could do is first disable the receive locations using the following command:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="DisableAllReceiveLocations" Applications="$(Application)" MachineName="$(TestMachine)" DatabaseServer="$(TestDatabaseServer)" />

Afterwards you can force a wait, so you have time to check for running instances (and stop the build if you have to do manual interactions):

<MSBuild.ExtensionPack.UI.Dialog TaskAction="Show" Text="Please confirm all running instances have completed. (No will stop build)" Button1Text="Yes" Button2Text="No">
	<Output TaskParameter="ButtonClickedText" PropertyName="Clicked" />
</MSBuild.ExtensionPack.UI.Dialog>
<Error Text="User stopped build." Condition="'$(Clicked)' != 'Yes'" />

The steps to take from now on should be similar to Test (at least in my case). However, this may vary, so I'm not responsible for lost production messages ( I should add a disclaimer, shouldn't I? :-) ).

Multiple nodes in Test

If you have multiple nodes in your Test environment, you have to make a couple of changes, including installing the MSI on all nodes. This can easily be done by editing the psexec command to run on multiple servers:

<Exec Command="psexec \\$(TestMachine),$(TestMachine2) msiexec /i $(ReleasePath)\$(Application).msi /quiet" />

You'll also have to restart the Host instances on the second node, by repeating the restarting lines:

<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Get" MachineName="$(TestMachine2)">
	<Output TaskParameter="HostInstances" ItemName="His" />
<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Stop" MachineName="$(TestMachine2)" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4" />
<MSBuild.ExtensionPack.BizTalk.BizTalkHostInstance TaskAction="Start" MachineName="$(TestMachine2)" HostName="%(His.Identity)" Condition="%(His.ServiceState) == 4" />

I want to delete the application for every deploy

If you want to delete the application before you deploy, you can specify the Force parameter in the Create action:

<MSBuild.ExtensionPack.BizTalk.BizTalkApplication TaskAction="Create" Applications="$(Application)"Force="True"MachineName="$(DevMachine)" Condition="!$(ApplicationExists)" />


I hope this post gave you an idea on how using simple scripts can really simplify your life. If you have any questions, feel free to ask them.

I've included a quick sample solution, that shows how it can be done. The solution can be found here:

April 10, 2014 at 3:48 PM

Recently I wrote a post about advanced orchestration monitoring for BizTalk Server using System Center Operations Manager (SCOM).


There, I wrote about some of the shortcomings in the default monitor. In less critical environments, the default monitoring for suspended orchestrations will be sufficient in most cases. I bumped into the same issue a couple of times now and I would like to share it with you, hoping it saves you some troubleshooting. For us, Codit Managed Services, it's very important to receive the right alerts when instances get suspended.


This blog post concerns the default monitor for suspended orchestrations and the alerts it generates.
I often hear the following question: "My orchestration has been suspended for 4 hours now and still I didn't receive any alerts about it?!". At first I also did not find an explanation for this. Let's take a deeper look at this monitor.


If you currently have suspended orchestrations on your environment, you should see them in the SCOM console with a critical or a warning state:


If you open the Health Explorer for this Orchestration you can see some history concerning the health state:


Now the question arises: "Why am I not receiving an alert even though my orchestration has a warning state...?".

If we take a look at the properties of the monitor, at first sight everything seems to be correctly enabled:

I also missed it several times, but then I took a look at the first property: "Alert on State".

When you check the possible options of the property, you will see two options. Choose the second one! By default, a warning state will trigger no alerts!


The warning limit (the last property) has a default value of 10, so more than 10 orchestrations of the same type should be suspended before an alert is triggered... If you change the "Alert on State" property, you will always receive alerts when your environment contains a suspended orchestration instance, no matter how many instances.


But remember, as soon as the monitor is in a critical state you will no longer receive an alert when new instances get suspended! If you want an alert per instance, check out my previous post about advanced orchestration monitoring.

Posted in: BizTalk | Monitoring


March 28, 2014 at 3:45 PM

Recently, I needed to send a message from BizTalk to an external WCF service with Windows Authentication.

For easy testing, I created a Windows console application that will act as the client. I used basicHttpBinding with the security mode set to TransportCredentialOnly. In the transport node, I chose Windows as clientCredentialType.

The web.config file looks like this:

	<binding name="DemoWebService_Binding" textEncoding="utf-16">
		<security mode="TransportCredentialOnly">
			<transport clientCredentialType="Windows" />
		</security>
	</binding>

Before sending the test message, I needed to authenticate myself and insert the Windows Credentials:

proxy.ClientCredentials.Windows.ClientCredential.Domain = "xxx";
proxy.ClientCredentials.Windows.ClientCredential.UserName = "xxx";
proxy.ClientCredentials.Windows.ClientCredential.Password = "xxx";

This works, so now to get it right in BizTalk!


Nothing special, just a Send Port, WCF-Custom with basicHttpBinding as Binding Type and the same binding configuration as in the console application:


I thought I just needed to add the credentials to the Credentials-tab in BizTalk to be able to do proper authentication.

Unfortunately, this does not work!

Apparently, when "Windows" is chosen as clientCredentialType, the Credentials-tab is ignored and the credentials of the Host-Instance running the Send Port are used instead.


After some searching, I found the answer thanks to Patrick Wellink's blog post on the Axon Olympos blog:


The credentials of the Host Instance can't be right, because the web-service is from an external party.

To use the Windows Credentials, a custom Endpoint Behaviour has to be created.

So I've created a new Class Library in Visual Studio with a class that inherits from both BehaviorExtensionElement and IEndpointBehavior:

public class WindowsCredentialsBehaviour : BehaviorExtensionElement, IEndpointBehavior

For extensibility, the class needs to have these public properties:

[ConfigurationProperty("Username", DefaultValue = "xxx")]
public string Username
	get { return (string)base["Username"]; }
	set { base["Username"] = value; }

[ConfigurationProperty("Password", DefaultValue = "xxx")]
public string Password
	get { return (string)base["Password"]; }
	set { base["Password"] = value; }

[ConfigurationProperty("Domain", DefaultValue = "xxx")]
public string Domain
	get { return (string)base["Domain"]; }
	set { base["Domain"] = value; }
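Besides these configuration properties, BehaviorExtensionElement and IEndpointBehavior require a few more members. A minimal sketch of what they could look like is shown below; the remaining IEndpointBehavior methods can stay empty for this scenario:

```csharp
// BehaviorExtensionElement members: tell WCF which behavior type this
// configuration element produces and how to create it. Since this class
// implements IEndpointBehavior itself, it can simply return itself.
public override Type BehaviorType
{
    get { return typeof(WindowsCredentialsBehaviour); }
}

protected override object CreateBehavior()
{
    return this;
}

// IEndpointBehavior members that need no logic here.
public void ApplyClientBehavior(ServiceEndpoint endpoint, System.ServiceModel.Dispatcher.ClientRuntime clientRuntime) { }
public void ApplyDispatchBehavior(ServiceEndpoint endpoint, System.ServiceModel.Dispatcher.EndpointDispatcher endpointDispatcher) { }
public void Validate(ServiceEndpoint endpoint) { }
```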

In the function "AddBindingParameters", I've added this piece of code that sets the Windows Credentials:

public void AddBindingParameters(ServiceEndpoint endpoint, System.ServiceModel.Channels.BindingParameterCollection bindingParameters)
{
	if (bindingParameters != null)
	{
		SecurityCredentialsManager manager = bindingParameters.Find<ClientCredentials>();

		var cc = endpoint.Behaviors.Find<ClientCredentials>();
		cc.UserName.UserName = this.Domain + @"\" + this.Username;
		cc.UserName.Password = this.Password;
		cc.Windows.ClientCredential.UserName = this.Username;
		cc.Windows.ClientCredential.Password = this.Password;
		cc.Windows.ClientCredential.Domain = this.Domain;

		if (manager == null)
			throw new ArgumentNullException("bindingParameters cannot be null.");
	}
}

Now after building and putting the assembly in the GAC, we need to let BizTalk know that it can use this custom Endpoint Behavior:

We need to add this line below to the behaviorExtensions (system.serviceModel - extensions) in the machine.config (32-bit and 64-bit):

<add name="WindowsCredentialsBehaviour" type="BizTalk.WCF.WindowsCredentials.WindowsCredentialsBehaviour, BizTalk.WCF.WindowsCredentials, Version=, Culture=neutral, PublicKeyToken=1de22c2808f4ac2e" />

Restart the host instance that runs the send port and you will be able to select the custom EndpointBehavior: