July 25, 2014 at 3:30 PM

Sentinet is highly extensible through standard Microsoft .NET, WCF and WIF extensibility points, and through the Sentinet API interfaces.

In the previous post we saw how to build a custom access rule expression (a standard .NET component). With this post I would like to continue the series on Sentinet extensibility and demonstrate how to leverage WCF extensibility by setting up a virtual service with a custom endpoint behavior.

 

Scenario

The scenario is pretty simple. A third-party REST service, GetCalledNumbers, which provides the list of recently called numbers, needs to be exposed. To meet the customer’s privacy policies, the dialed phone numbers have to be masked.

 

To meet these requirements I created a REST façade service that virtualizes the third-party one. Then I built a custom endpoint behavior that replaces the digits of the phone numbers with asterisks, except the last four.

 

In this demo the inspector is positioned at the client side, so the IClientMessageInspector interface has been implemented. It is not the intent of this post to describe how to build a message inspector; here you can find a complete example.

 

InspectorCode
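
Since the inspector code above is only shown as an image, here is a minimal C# sketch of what a client-side masking inspector along these lines could look like. The class name, the regex-based masking and the raw <Binary> body handling are assumptions for illustration (they presume the virtual service talks to the backend through the WCF web/raw message encoder); this is not the exact code behind the screenshot.

using System;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;
using System.Text;
using System.Text.RegularExpressions;
using System.Xml;

// Client-side message inspector that masks a JSON field in the reply coming
// back from the backend service, leaving only the last four digits readable.
public class JsonFieldMaskInspector : IClientMessageInspector
{
    private readonly string fieldName;

    public JsonFieldMaskInspector(string fieldName)
    {
        this.fieldName = fieldName;
    }

    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // Nothing to do on the outgoing request.
        return null;
    }

    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        // Buffer the reply so the body can be read and the message rebuilt.
        MessageBuffer buffer = reply.CreateBufferedCopy(int.MaxValue);
        Message original = buffer.CreateMessage();

        // With the web (raw) encoder the JSON payload sits in a single
        // <Binary> element as base64-encoded bytes.
        byte[] bodyBytes;
        using (XmlDictionaryReader reader = original.GetReaderAtBodyContents())
        {
            reader.ReadStartElement("Binary");
            bodyBytes = reader.ReadContentAsBase64();
        }

        string json = Encoding.UTF8.GetString(bodyBytes);
        string maskedJson = MaskField(json, fieldName);

        // Recreate the reply with the masked payload, preserving the original
        // headers and properties (content type, HTTP response, ...).
        Message masked = Message.CreateMessage(
            original.Version, null, new RawBodyWriter(Encoding.UTF8.GetBytes(maskedJson)));
        masked.Headers.CopyHeadersFrom(original);
        masked.Properties.CopyProperties(original.Properties);
        reply = masked;
    }

    private static string MaskField(string json, string field)
    {
        // "PhoneNumber":"0123456789" becomes "PhoneNumber":"******6789".
        Regex pattern = new Regex("(\"" + Regex.Escape(field) + "\"\\s*:\\s*\")([^\"]*)(\")");
        return pattern.Replace(json, m =>
            m.Groups[1].Value +
            Regex.Replace(m.Groups[2].Value, @"\d(?=\d{4})", "*") +
            m.Groups[3].Value);
    }

    // Writes raw bytes back as the <Binary> body expected by the web encoder.
    private class RawBodyWriter : BodyWriter
    {
        private readonly byte[] content;

        public RawBodyWriter(byte[] content) : base(true)
        {
            this.content = content;
        }

        protected override void OnWriteBodyContents(XmlDictionaryWriter writer)
        {
            writer.WriteStartElement("Binary");
            writer.WriteBase64(content, 0, content.Length);
            writer.WriteEndElement();
        }
    }
}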

 

Register

To use the custom component, it first has to be registered.

The registration procedure is pretty straightforward:

 

1) Drop Codit.Demo.Sentinet.Extensions.dll in the bin directory of both the Repository and Nodes applications (for the Sentinet Nodes you might need to create an empty bin directory first).

 

2) Register/add the behavior extension in the web.config files of both the Repository and Nodes applications:

<behaviorExtensions>
<add name="jsonMask" type="Codit.Demo.Sentinet.Extensions.Formatting.JsonFormatterEndpointBehaviorExtension, Codit.Demo.Sentinet.Extensions, Version=2.0.0.0, Culture=neutral, PublicKeyToken=9fc8943ef10c782f"/>
</behaviorExtensions>
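
Purely as a hedged illustration, the extension registered above could be shaped roughly like the sketch below, reusing the JsonFieldMaskInspector sketched earlier. The class and attribute names mirror the configuration snippet (JsonFormatterEndpointBehaviorExtension, toBeBlurred), but the actual implementation inside Codit.Demo.Sentinet.Extensions.dll may differ.

using System;
using System.Configuration;
using System.ServiceModel.Channels;
using System.ServiceModel.Configuration;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

// Endpoint behavior that plugs the masking inspector into the client runtime
// of the virtual service's outbound endpoint.
public class JsonFormatterEndpointBehavior : IEndpointBehavior
{
    private readonly string toBeBlurred;

    public JsonFormatterEndpointBehavior(string toBeBlurred)
    {
        this.toBeBlurred = toBeBlurred;
    }

    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }

    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
        // Attach the inspector so every reply flowing back is masked.
        clientRuntime.MessageInspectors.Add(new JsonFieldMaskInspector(toBeBlurred));
    }

    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { }

    public void Validate(ServiceEndpoint endpoint) { }
}

// Configuration element that exposes the behavior (and its 'toBeBlurred'
// attribute) to web.config, matching the <jsonMask toBeBlurred="..."/> element above.
public class JsonFormatterEndpointBehaviorExtension : BehaviorExtensionElement
{
    [ConfigurationProperty("toBeBlurred", IsRequired = true)]
    public string ToBeBlurred
    {
        get { return (string)base["toBeBlurred"]; }
        set { base["toBeBlurred"] = value; }
    }

    public override Type BehaviorType
    {
        get { return typeof(JsonFormatterEndpointBehavior); }
    }

    protected override object CreateBehavior()
    {
        return new JsonFormatterEndpointBehavior(ToBeBlurred);
    }
}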

3) Then you can add this endpoint behavior to the Sentinet Console shared behaviors (so this shared component can be reused across virtual services):

<behavior name="JsonFieldMask"><jsonMask toBeBlurred="PhoneNumber" /></behavior>

4) Now you can apply this shared endpoint behavior to the inbound endpoint(s) of the virtual service.

Using the Sentinet Administration console, first create a new behavior.

CreateBehaviors

Then apply it to the virtual service.

AddEndpointBehavior

 

Test

For testing the virtual service I used Postman to send a GET request to my virtual API.

PostmanTest

 

During this test I enabled full transaction recording so that, in the Monitor tab, I can access the transmitted content and check the result of the message inspector.

MonitorRecording

 

Conclusion

Message inspectors are among the most used extensibility points of the WCF runtime. In this article we saw how to set up and use custom/third-party components in Sentinet. In the next post I will discuss Sentinet's routing feature and how to build a custom router.

 

Cheers,

Massimo

Posted in: REST | Sentinet | SOA | WCF

Tags:


July 22, 2014 at 4:00 PM

Recently I noticed some odd behavior when I was troubleshooting a SQL integration scenario with BizTalk 2010. I was using the WCF-Custom adapter to perform typed polling that executed a stored procedure. This stored procedure used dynamic SQL to fetch the data because it targets multiple tables with one generic stored procedure.

In this blog post I will tell you how we implemented this scenario, as well as where the adapter was failing when polling.
Next to that I will talk about some "problems" with the receive location and the adapter.
I will finish with some small hints that made our development easier.

  

Scenario

In this simplified scenario we have an application that is polling on two tables called ‘tbl_Shipment_XXX’ where XXX is the name of a warehouse. Each warehouse will have a corresponding receive location that will poll the data that is marked as ready to be processed.

This is performed by using a stored procedure called ‘ExportShipments’, which requires the name of the target warehouse and proceeds in the following steps: lock the data as being processed, export the data to BizTalk and mark it as successfully processed.

troubleshooting_wcf_dynamic_sql

 

Creating our polling statement

In our polling statement we will execute our generic stored procedure. This procedure marks our data as being processed and calls a function that returns a SQL statement as an NVARCHAR(MAX). Afterwards we execute that statement and mark the data as processed.

CREATE PROCEDURE ExportShipments
	@Warehouse nvarchar(10)
AS
BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	DECLARE @exportData NVARCHAR(MAX);
	SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse)
	EXEC(@exportData);
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
END

In our function we will compose a simple SELECT-statement where we fill in the name of our warehouse.

CREATE FUNCTION ComposeExportShipmentSelect
(
	@Warehouse nvarchar(10)
)
RETURNS NVARCHAR(MAX)
AS
BEGIN
	-- DECLARE SELECT-STRING
	DECLARE @result NVARCHAR(MAX);
 
	-- COMPOSE SELECT STATEMENT
	SET @result = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
 
	-- RETURN SELECT STRING
	RETURN @result
 
END

I chose to separate the composition of the SELECT statement into a function because we were using a lot of JOINs and UNIONs, and I wanted to keep this logic out of the stored procedure for the sake of code readability.

Although our scenario is very simple, we will use a function to illustrate the problem.


Generating the Typed-Polling schema

Now that everything is set up in our database, we are ready to generate the Typed-Polling schema for the WarehouseA polling in our BizTalk project.

1. Right-click on your project and select Add > Add generated items... > Consume Adapter Service

 

2. Select the SqlBinding and click "Configure"

3. Fill in the Server & InitialCatalog and a unique InboundId. The InboundId will be used in the namespace of your schema and the URI of your receive location. Each receive location requires its own schema. I used "WarehouseA" since I am creating the schema for polling on "tbl_Shipment_WarehouseA".

4. Configure the binding by selecting "TypedPolling" as the InboundOperationType and executing our stored procedure as the PollingStatement with the parameter "WarehouseA". (Note that we are not using ambient transactions.)

 

5. Click Connect, select "Service" as the contract type and select "/" as the category. If everything is configured correctly you can select TypedPolling and click Properties, which will show you the metadata. If this is the case, click Add and a schema will be generated.

 

The wizard will create the following schema with a sample configuration for your receive location.

 

It's all about metadata

The problem here is that executing the dynamic SQL doesn't provide the required metadata to BizTalk in order to successfully generate a schema for that result set.

I solved this by replacing the function 'ComposeExportShipmentSelect' with a new stored procedure called 'ExportShipment_AcquireResults'.
This stored procedure executes the dynamic SQL, inserts the result set into a TABLE variable and returns it to the caller. This tells BizTalk which columns the result set will contain and of what type they are.

CREATE PROCEDURE ExportShipment_AcquireResults
	@Warehouse nvarchar(10)
AS DECLARE @result 
      TABLE(
			  [ID] [INT] NOT NULL,
			  [ORDER_ID] [NVARCHAR](50) NOT NULL,
			  [WAREHOUSE_FROM] [NVARCHAR](20) NOT NULL,
			  [WAREHOUSE_TO] [NVARCHAR](18) NULL,
			  [QUANTITY] [INT] NOT NULL
		   )
BEGIN
	-- DECLARE SELECT-STRING
	DECLARE @select NVARCHAR(MAX);
 
	-- COMPOSE SELECT STATEMENT
	SET @select = 'SELECT [ID], [ORDER_ID], [WAREHOUSE_FROM], [WAREHOUSE_TO], [QUANTITY] FROM [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] WHERE [STATUS_ID] = 7';
 
	-- EXECUTE SELECT AND INSERT INTO RESULT
	INSERT INTO @result
		(
			[ID],
			[ORDER_ID],
			[WAREHOUSE_FROM],
			[WAREHOUSE_TO],
			[QUANTITY]
		)
		EXEC(@select);
 
	-- RETURN RESULT SET
	SELECT * FROM @result;
END

Our generic polling stored procedure now simply executes our new stored procedure.

BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	DECLARE @exportData NVARCHAR(MAX);
	EXEC [ShopDB].[dbo].[ExportShipment_AcquireResults] @Warehouse
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
END

When we regenerate our schema the problem should be resolved and your schema will look like this
(note that the new file name includes your InboundId):

 

Why not use a #tempTable?

You can also achieve this by setting FMTONLY to OFF and using a temporary table, but this will not work in every scenario.

CREATE PROCEDURE ExportShipments
	@Warehouse nvarchar(10)
AS
BEGIN
	SET NOCOUNT ON;
 
	-- MARK DATA AS LOCKED
	DECLARE @lockData NVARCHAR(MAX);
	SET @lockData = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 7 WHERE [STATUS_ID] = 1';
	EXEC(@lockData);
 
	-- EXTRACT DATA
	DECLARE @exportData NVARCHAR(MAX);
	SET @exportData = [ShopDB].[dbo].[ComposeExportShipmentSelect] (@Warehouse)
 
	-- CREATE TEMP TABLE
	SET FMTONLY OFF;
	CREATE TABLE #tempTable
		(
			  [ID] [INT] NOT NULL,
			  [ORDER_ID] [NVARCHAR](50) NOT NULL,
			  [WAREHOUSE_FROM] [NVARCHAR](20) NOT NULL,
			  [WAREHOUSE_TO] [NVARCHAR](18) NULL,
			  [QUANTITY] [INT] NOT NULL
		   )
	
	-- INSERT SELECT RESULTS INTO TEMP TABLE
	INSERT INTO #tempTable
		EXEC(@exportData);
	
	SET FMTONLY ON;
 
	-- MARK DATA AS PROCESSED
	DECLARE @markProcessed NVARCHAR(MAX);
	SET @markProcessed = N'UPDATE [ShopDB].[dbo].[tbl_Shipment_' + @Warehouse + '] SET [STATUS_ID] = 10 WHERE [STATUS_ID] = 7';
	EXEC(@markProcessed);
 
	-- RETURN RESULT
	SELECT * FROM #tempTable;
END

We were using transactions at the SQL level together with TRY/CATCH statements, but this conflicted with FMTONLY and resulted in a severe error in the event log. This approach also didn't make much sense, since we separated the SELECT composition into a separate function/stored procedure and the definition of the result set should be defined there.

If you want to read more about #tempTables, I recommend this post.

 

PollingDataAvailableStatement & no ambient transactions

With our schema generated I deployed my application to my machine and started creating the receive locations for the polling on my database. The configuration is pretty easy: no ambient transactions, typed polling, the PollingStatement is our stored procedure, and a SELECT in the PollDataAvailableStatement to check whether we need to run the stored procedure.

Apparently, according to this article, the PollDataAvailableStatement is only used when you enable ambient transactions.

The problem here is that when you clear out the PollDataAvailableStatement and start your receive location, it will be disabled automatically with the following error -

PollDataAvailableStatement seems to be a mandatory field although it is not being used; I easily fixed it with "SELECT 0".

 

Empty result sets & TimeOutExceptions

Later on in the project we had to move from transactions at the SQL level to ambient transactions, and our PollDataAvailableStatement was consulted before the stored procedure ran.

In our stored procedure we used the retrieved data to perform cross-references and only return a set of data when required, so it is possible that no result set is returned at all.

We started experiencing locks on our tables, and TimeOutExceptions occurred without any obvious reason.
It seemed the adapter had a bug: when the adapter has found data, it is required to return a result set to BizTalk; if it doesn't, it will keep SQL resources locked and thus lock the tables.

This issue can be resolved by installing the standalone fix (link) or BizTalk Adapter Pack 2010 CU1 (link)

 

Tips & Tricks

During the development of this scenario I noticed some irregularities when using the wizard; here are some of the things I noticed -

  • It is good practice to validate the metadata of the TypedPolling operation before adding it. If something is misconfigured or your stored procedure is incorrect, you will receive an error with more information.
    If this is the case you should click OK instead of clicking the X, because otherwise the wizard will close. The same error might occur when you finalize the wizard; then it is important to click Cancel instead of OK, or the wizard will close automatically.
     
  • If your wizard closes for some reason, it will remember the configuration of the previous run when you restart it, and you can simply connect, select the requested operation and request the metadata or finish the wizard.
    It might occur that this results in a MetadataException that says that the PollingStatement is empty and therefore invalid -


    This is a bug in the wizard where you always need to open the URI configuration, even though it still remembers your configuration. You don't need to change anything; just open and close it and the exception is gone.

Conclusion

In this blog post I highlighted the problem where I was unable to generate a schema for my stored procedure because it only executed the result of my function. I also illustrated how I fixed it and why I didn't use FMTONLY and a temporary table.

Next to the generation of our schemas, I talked about the problems with the receive location, where it required a PollDataAvailableStatement even when it was not used, and about the locking of our tables because our stored procedure wasn't returning a result set. Last but not least, I gave two examples of common irregularities when using the wizard and how you can bypass them.

For me it is important to write your SQL scripts like you write your code - use decent comments and separate your procedure into sub-procedures and functions according to the separation of concerns.
While writing your scripts everything might seem obvious, but will it still be in a couple of months? And how about your colleagues?
 

All the scripts for this post, incl. DB generation, can be found here.
 

Thank you for reading,

Tom.


July 8, 2014 at 9:18 AM

In Sentinet, authorization and access to any virtual service is defined using an Access Rule which is a combination of authorization expressions and logical conditions. Sentinet provides an out-of-the-box access rule composer with a set of common Access Rule Expressions like X509 certificate, Claim and User validation, etc...

 

Running out of built-in tools to cover all the business scenarios is almost inevitable; extensibility is the way to fill this gap. Extensibility is one of the key features of every successful product, and it particularly shines in Sentinet, where you can work with different extensibility points.

 

In this blog post I will go through the steps involved in creating a custom access rule expression, registering it and testing it.

 

Create a Custom Rule Expression

A custom access rule expression is a regular .NET component that implements the IMessageEvaluator interface (ref. Nevatech.Vbs.Repository.dll). This interface contains three methods:

  • Evaluate – where the access rule logic goes.
  • ImportConfiguration – read (and apply) the component’s configuration.
  • ExportConfiguration – save the component’s configuration.

 

In this example I’m going to define a component for evaluating an APIKey sent in a custom header (for SOAP services) or as a part of the service URL (for REST services).

As shown in the figure below, the implementation is pretty straightforward.

  • The getSecurityContext method evaluates the System.ServiceModel.Channels.Message object to read the APIKey.
  • The isValidKey method evaluates the key.
  • The properties ServiceName and isRest are set with the values specified in the component configuration.

 

ApiKeyValidator
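
The screenshot above is the only view of the code, so here is a rough, hypothetical C# sketch of the evaluation part. The real component implements IMessageEvaluator from Nevatech.Vbs.Repository.dll, whose exact method signatures are not reproduced here; the simplified Evaluate signature, the header/query-string names, the stored procedure name and the connection string below are assumptions for illustration only.

using System;
using System.Data;
using System.Data.SqlClient;
using System.ServiceModel.Channels;

// Hypothetical sketch of the APIKey validator; the real class implements
// Sentinet's IMessageEvaluator (Nevatech.Vbs.Repository.dll), whose signatures differ.
public partial class ApiKeyValidator
{
    // Set from the component configuration (see the next snippet).
    public string ServiceName { get; set; }
    public bool IsRest { get; set; }

    public bool Evaluate(Message message)
    {
        string apiKey = GetSecurityContext(message);
        if (string.IsNullOrEmpty(apiKey))
            throw new InvalidOperationException("No API key found in the request.");

        return IsValidKey(apiKey);
    }

    private string GetSecurityContext(Message message)
    {
        if (IsRest)
        {
            // REST: the key is part of the service URL, e.g. .../offers?apikey=ABC123 (hypothetical).
            return GetQueryValue(message.Headers.To, "apikey");
        }

        // SOAP: the key travels in a custom header (hypothetical name/namespace).
        int index = message.Headers.FindHeader("ApiKey", "http://codit.eu/demo/security");
        return index >= 0 ? message.Headers.GetHeader<string>(index) : null;
    }

    private bool IsValidKey(string apiKey)
    {
        // Validate the key against the SQL table through a stored procedure
        // with two parameters (names and connection string are placeholders).
        using (var connection = new SqlConnection("<connection string>"))
        using (var command = new SqlCommand("CheckApiKeyAccess", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@ApiKey", apiKey);
            command.Parameters.AddWithValue("@ServiceName", ServiceName);
            connection.Open();
            return Convert.ToBoolean(command.ExecuteScalar());
        }
    }

    private static string GetQueryValue(Uri uri, string key)
    {
        // Tiny query-string lookup to keep the sketch self-contained.
        foreach (string pair in uri.Query.TrimStart('?').Split('&'))
        {
            string[] parts = pair.Split('=');
            if (parts.Length == 2 && string.Equals(parts[0], key, StringComparison.OrdinalIgnoreCase))
                return Uri.UnescapeDataString(parts[1]);
        }
        return null;
    }
}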

 

Here is the simple implementation for reading the component configuration.

Configuration
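
Again as a hypothetical sketch, continuing the partial class from the previous snippet: the configuration-handling methods could read and write a small XML fragment such as <apiKeyValidator serviceName="OfferService" isRest="true" />. The real ImportConfiguration/ExportConfiguration signatures in Nevatech.Vbs.Repository.dll may differ.

using System.Globalization;
using System.IO;
using System.Xml;

// Configuration handling for the same (partial) ApiKeyValidator sketch.
public partial class ApiKeyValidator
{
    public void ImportConfiguration(string configuration)
    {
        // Read the component configuration stored in the Repository,
        // e.g. <apiKeyValidator serviceName="OfferService" isRest="true" />
        using (var reader = XmlReader.Create(new StringReader(configuration)))
        {
            reader.MoveToContent();
            ServiceName = reader.GetAttribute("serviceName");
            IsRest = bool.Parse(reader.GetAttribute("isRest") ?? "false");
        }
    }

    public string ExportConfiguration()
    {
        // Persist the current settings back as the same XML fragment.
        return string.Format(
            CultureInfo.InvariantCulture,
            "<apiKeyValidator serviceName=\"{0}\" isRest=\"{1}\" />",
            ServiceName, IsRest);
    }
}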

 

In this example the API Key is validated against an SQL table. A stored procedure with two parameters evaluates whether the key has access to the specific service.

SQLImplementation

 

Register

The first step is to copy the DLL(s) to the Sentinet node(s). In this case, notice that it’s not mandatory to sign the assembly and register it in the GAC. I simply created a bin folder in my Sentinet node and copied the DLLs.

Bin

 

The custom component can be registered and graphically configured using the Sentinet Administrative Console. Click on the AccessRule node, add a new Access Rule, then at the bottom of the rule designer press Add.

 

Five parameters need to be specified:

  • Name. The friendly name of the custom rule expression (APIKey).
  • Assembly. The fully qualified assembly name that contains the custom rule expression (Codit.Demo.Sentinet.AccessControl, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null).
  • Type. The .NET class that implements the IMessageEvaluator interface (Codit.Demo.Sentinet.AccessControl.APIKeyValidator).
  • IsReusable. Set it to true if your component is thread-safe. The Sentinet Authorization Engine will then create and use a single instance of this component.
  • Default Configuration. The default configuration. In this example I left it blank; I will specify the parameters when I use the expression in a specific Access Rule.

 

Register

 

Test

First I created a SOAP and a REST virtual service that virtualize the Offer backend service (SOAP). Then I defined a couple of access rules using the new APIKey Custom Access Rule Expression and I applied those rules to the Virtual Service using the Access Control tab.

In this example the service name is passed to the Expression through a configuration parameter but a better solution would be to extract it from the Message class.

 

AccessRules

 

For this example I defined a few API keys in the SQL table.

SQLApiKeys

 

The SOAP scenario has been tested with soapUI. Depending on the value returned by the Evaluate method the virtual service returns different objects:

  • True => Access granted, the response message is returned.
  • False => Generic Access denied via Soap Fault.
  • Exception => Custom Exception via Soap Fault.

 

SOAPTest

 

The REST scenario has been tested with Fiddler. Depending on the value returned by the Evaluate method, the virtual service returns different HTTP codes:

  • True => HTTP 200
  • False => HTTP 403
  • Exception => HTTP 500

RESTtest

 

Finally, below, you can see how the test results are presented in the Sentinet Monitoring tab.

SentinetMonitor

 

Conclusion

The Sentinet extensibility model is intended to support custom scenarios by enabling you to modify the platform behavior at different levels. In this post I have discussed how to extend the Access Rule engine with an additional authorization component.

 

Cheers,

Massimo

Posted in: .NET | Monitoring | Sentinet | WCF

Tags: , ,


July 3, 2014 at 4:03 PM

With some luck your environment was installed by a BizTalk professional and the most critical maintenance tasks were configured at installation. When I say critical maintenance tasks, I mean the BizTalk backup job, DTA Purge & Archive,... If these tasks are not configured I'm sure your production BizTalk environment will only be running smoothly for a couple of days or months but definitely not for years!

 

I'm not going to write about basic configuration tasks as most BizTalk professionals should be aware of these. 

At Codit Managed Services our goal is to detect, avoid and solve big and small issues before it's too late. This way we often detect smaller missing configuration tasks. Not critical at first, but definitely necessary for the BizTalk environment to keep running smoothly through the years!

 

Purging/archiving the BAM databases:

I'm going to explain how you should maintain the BizTalk BAM databases.

 

In most environments these maintenance tasks are missing. You will find the necessary information on the internet, but probably only when it's urgent and you really need it.

 

My goal is to provide you with this information in time, so you are able to configure the BAM maintenance tasks without putting your SQL server or yourself under heavy stress. When it comes to BAM and data purging, it's rather simple: by default, nothing will be purged or archived. Yes, even if you have the BizTalk backup and all the other jobs successfully running!

 

I have seen a production environment, running for only two years, where the BAMPrimaryImport database had a size of 90GB! It takes a lot of time and processing power to purge such a database. To maintain and purge the BAM databases you will need to configure how long you want to keep the BAM data, configure archiving, trigger several SQL SSIS packages,...

 

The problem is: purging is configured per activity, so this is a task for the developer and not a task for the person who installed the BizTalk environment. You will find all the information to do this on the following sites:

http://blogs.biztalk360.com/bam-production-environment-management

http://blogs.msdn.com/b/nabeelp/archive/2013/10/22/sql-script-to-clean-up-old-bamarchive-tables.aspx

http://www.biztalkbill.com/Home/tabid/40/EntryId/103/BizTalk-BAM-Archiving.aspx

http://blogs.msdn.com/b/appfabriccat/archive/2010/02/10/best-practices-for-configuring-bam-data-maintenance-and-cube-update-ssis-packages-in-biztalk-solutions.aspx

http://geekswithblogs.net/andym/archive/2009/05/21/132346.aspx

  

BAMData not being purged immediately?

Something very important that you should be aware of is the fact that if you, for example, want to keep a year of data, you will have to wait another year before the data is purged/archived in a supported way!

That's why it's so important to configure these jobs in time! It's not like the DTA P&A job, where the data is purged immediately.

You can find more information about this on the following blog: http://www.richardhallgren.com/bam-tracking-data-not-moved-to-bam-archive-database

 

Purging the BAMAlertsApplication database:

I'm rather sure the following maintenance task will not be scheduled to clean your BAMAlertsApplication database. I only discovered this myself a couple of days ago! Probably not a lot of people notice this database because it's rather small. After 2 years running in production with a small load it had a size of 8GB. But it's 8GB of (wasted) disk space!

 

If you search the internet for how to clean this database, you will find nothing official from Microsoft...

Credit for how to purge the BAMAlertsApplication database goes to Patrick Wellink and his blog post: http://wellink.bloggingabout.net/2011/02/03/millions-of-records-in-the-bamalertsapplication-and-how-to-get-rid-of-them-nsvacuum-to-the-rescue

If you wonder what the NSVacuum stored procedure looks like, you can find it below:

USE [BAMAlertsApplication]
GO

BEGIN

	DECLARE @QuantumsVacuumed	INT
	DECLARE @QuantumsRemaining	INT
	DECLARE @VacuumStatus		INT
	DECLARE @StartTime			DATETIME

	SET @QuantumsVacuumed = 0
	SET @QuantumsRemaining = 0

	SET @StartTime = GETUTCDATE()

	-- Volunteer to be a deadlock victim
	SET DEADLOCK_PRIORITY LOW

	EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

	IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
	BEGIN
		GOTO CountAndExit
	END

	DECLARE @CutoffTime			DATETIME
	DECLARE @RetentionAge		INT
	DECLARE @VacuumedAllClasses	BIT

	-- Remember the last run time and null counts
	UPDATE [dbo].[NSVacuumState] SET LastVacuumTime = @StartTime, LastTimeVacuumEventCount = 0, LastTimeVacuumNotificationCount = 0

	-- Get the retention age from the configuration table (there should only be 1 row)
	SELECT TOP 1 @RetentionAge = RetentionAge FROM [dbo].[NSApplicationConfig]

	SET @CutoffTime = DATEADD(second, -@RetentionAge, GETUTCDATE())

	-- Vacuum incomplete event batches
	EXEC [dbo].[NSVacuumEventClasses] @CutoffTime, 1

	-- Mark expired quantums as 'being vacuumed'
	UPDATE	[dbo].[NSQuantum1] SET QuantumStatusCode = 32
	WHERE	(QuantumStatusCode & 64) > 0 AND		-- Marked completed
			(EndTime < @CutoffTime)					-- Old

	DECLARE @QuantumId			INT
	DECLARE @QuantumEndTime		DATETIME

	DECLARE QuantumsCursor CURSOR
	LOCAL READ_ONLY FAST_FORWARD
	FOR
	SELECT	QuantumId, EndTime
	FROM	NSQuantum1 WITH (READUNCOMMITTED)
	WHERE	QuantumStatusCode = 32
	ORDER BY EndTime

	OPEN QuantumsCursor

	-- Do until told otherwise or the time limit expires
	WHILE (1=1)
	BEGIN
		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)			-- VacuumStatus 0 == Running
		BEGIN
			BREAK
		END

		FETCH NEXT FROM QuantumsCursor INTO @QuantumId, @QuantumEndTime

		IF (@@FETCH_STATUS != 0)
		BEGIN
			SET @VacuumStatus = 2		-- VacuumStatus 2 == Completed
			SET @QuantumsRemaining = 0
			GOTO CloseCursorAndExit
		END

		-- Vacuum the Notifications
		EXEC [dbo].[NSVacuumNotificationClasses] @QuantumId, @VacuumedAllClasses OUTPUT

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
		BEGIN
			BREAK
		END

		-- Vacuum the Events in this quantum
		EXEC [dbo].[NSVacuumEventClasses] @QuantumEndTime, 0

		-- Delete this Quantum from NSQuantums1 if its related records were also deleted
		IF (1 = @VacuumedAllClasses)
		BEGIN
			DELETE [dbo].[NSQuantum1] WHERE QuantumId = @QuantumId

			-- Update the count of quantums vacuumed
			SET @QuantumsVacuumed = @QuantumsVacuumed + 1
		END

		EXEC @VacuumStatus = [dbo].[NSVacuumCheckTimeAndState] @StartTime, @SecondsToRun

		IF (0 != @VacuumStatus)
		BEGIN
			BREAK
		END

	END	-- Main WHILE loop

CloseCursorAndExit:

	CLOSE QuantumsCursor
	DEALLOCATE QuantumsCursor

CountAndExit:

	-- Report progress
	SET @QuantumsRemaining = (SELECT COUNT(*) FROM [dbo].[NSQuantum1] WITH (READUNCOMMITTED) WHERE QuantumStatusCode = 32)

	SELECT	@VacuumStatus AS Status, @QuantumsVacuumed AS QuantumsVacuumed,
			@QuantumsRemaining AS QuantumsRemaining

END -- NSVacuum

You need to schedule the NSVacuum command on your environment. Run this step by step, as it puts a lot of stress on your SQL server.

 

The content of this post is something every BizTalk professional should add to his BizTalk Server installation/deployment procedure!


June 25, 2014 at 11:00 AM

As discussed yesterday in my blog post about the enhancements to the WCF-WebHttp Adapter, the R2 releases of the BizTalk products focus more on 'compatibility and platform' alignment than on shipping major new features/add-ons to the platform.

To give you another overview, the following features were added in the new BizTalk Server 2013 R2 release:

  • Platform Alignment with Visual Studio, SQL Server,…
  • Updates to the SB-Messaging Adapter
  • Updates to the WCF-WebHttp Adapter
  • Updates to the SFTP Adapter
  • Updates to the HL7 Accelerator

This blog post will focus on the updates to the SB-Messaging Adapter that were shipped with this new release of Microsoft BizTalk Server.

 

SB-Messaging Adapter enhancements

With this new version of BizTalk Server the SB-Messaging adapter now also supports SAS (Shared Access Signature) authentication, in addition to ACS (Access Control Service).

Due to this improvement BizTalk Server can now also interact with the on-premise edition of Service Bus that is available through the Windows Azure Pack.

 

More information on SAS authentication can be found here: http://msdn.microsoft.com/en-us/library/dn170477.aspx

More information on the Windows Azure Pack can be found here: http://www.microsoft.com/en-us/server-cloud/products/windows-azure-pack/default.aspx?nv1if0=1#fbid=nVinI5x6SOz?hashlink=s1section6

 

The image below is a head-to-head comparison between the ‘old’ version of the SB-Messaging adapter authentication properties window and the ‘new’ version that ships with BizTalk Server 2013 R2.

 

clip_image002[1]clip_image002[3]

      

 

Out with the old, in with the new

When we compare the 'Access connection information' from both the Windows Azure Portal (cloud) and the Windows Azure Pack Portal (on-premise), we already see a clear difference in the ways to authenticate to the Service Bus.

You also notice why the update to include SAS was really a "must have" for BizTalk Server.

On the left side you see the cloud portal, which has support for SAS and ACS; on the right side you see the on-premise version of Service Bus, which only supports SAS and Windows authentication.

 

clip_image002clip_image002[1]

 

Conclusion

This ‘small’ addition to the SB-Messaging adapter has a great impact on interacting with Service Bus, as we can finally use the full potential of the on-premise version of Service Bus.

We have started implementing this feature at one of our customers, where one of the requirements is to use the on-premise version of Service Bus. The first tests are looking good, and the behavior and way of implementing are the same as when connecting to Service Bus in the cloud.

However, one downside that I personally find is the lack of support for Windows Authentication.

 

Happy Service Bus’ing !!

Glenn Colpaert


June 24, 2014 at 9:52 AM

Now that BizTalk 2013 R2 is released on MSDN, it’s time to take a first look at the new features and possibilities of this release.

As Guru already clarified at the last BizTalk Summit (BizTalk Summit Summary), the R2 releases of the BizTalk products focus more on 'compatibility/platform' alignment and less on shipping new (major) features/add-ons to the platform.

To give you an overview, the following features were added in the new BizTalk Server 2013 R2 release:

  • Platform Alignment with Visual Studio, SQL Server,…
  • Updates to the SB-Messaging Adapter
  • Updates to the WCF-WebHttp Adapter
  • Updates to the SFTP Adapter
  • Updates to the HL7 Accelerator

This blog post will focus on the updates to the WCF-WebHttp Adapter that were shipped with this new release of Microsoft BizTalk Server.

 

WCF-WebHttp Adapter enhancements

With this new version of BizTalk Server the WCF-WebHttp adapter now also supports sending and receiving JSON messages. This new version of BizTalk ships with a wizard to generate an XSD schema from a JSON instance and two new pipelines (and components) for processing JSON messages in a messaging scenario.

 

Out with the old, in with the new

Let us take a quick glance at the new components that are shipped with this new BizTalk release. We first of all have 2 new pipelines and components for encoding and decoding JSON messages in our BizTalk Ports.

clip_image001[4]

Configuring these 2 new components is very straightforward. On the encoder side there is one component property to specify whether the XML root element should be ignored while encoding the XML message to JSON. On the decoder side there are 2 properties to specify: the root node and the root node namespace to be used by the decoder in the generated XML message.
You can find a screenshot of the properties of both components below.

 

clip_image002

clip_image002[6]

 

Next to 2 new components for parsing the JSON messages there is also the new JSON Schema Wizard which allows you to generate XSD files based on a JSON instance. You can find this new wizard in the "Add -> New Item" menu of Visual Studio.

clip_image002[8]

 

JSON, do you speak it?

To demonstrate the new features and possibilities of the enhanced WCF-WebHttp adapter I created a little POC that will use the iTunes API (http://itunes.apple.com).

First of all I downloaded a JSON instance from the API from the following URL: http://itunes.apple.com/search?term=metallica.

clip_image002[1]

Next, I used this JSON instance to generate the XSD with the new JSON Schema Wizard. The wizard itself is pretty straightforward: you specify the ‘Instance File’, choose your ‘Root Node Name’ and ‘Target Namespace’ and simply press ‘Finish’ to get your XSD generated.

clip_image002[3]

When using the ‘Metallica’ instance this results in the following schema.

image

 

 

After generating the schema I started configuring the send port to actually consume the service from BizTalk.

Below you can see the configuration of my 2-way send port that sends requests to the API. The send pipeline is a PassThruTransmit pipeline and the receive pipeline is a custom pipeline with the JSON decoder inside it.

clip_image002[7]

 

clip_image002[9]clip_image002[11]

 

 

When sending the request to the API we get the following message in the BizTalk MessageBox (parsed JSON instance).

image

This sums up the basic demo of the new enhancements of the WCF-WebHttp adapter.

If you have questions regarding this scenario, please share them in the comments section of this blog post and I’ll be happy to answer them.

 

Cheers,

Glenn Colpaert

Posted in: BizTalk | Schemas | WCF

Tags: , ,


June 23, 2014 at 4:48 PM

A desktop environment in the Cloud

I recently created a virtual machine in the cloud with the purpose of using it as my own remote desktop. For me, one of the biggest benefits of the IaaS model is the fact that I can easily access the machine from anywhere with any device, and I don't need to worry about buying or maintaining hardware, or Windows licenses.

In general, the biggest cost for Microsoft Azure virtual machines is “compute hours”, which means I pay for every minute my remote desktop uses CPU power. To limit the costs, I start the machine in the morning and shut it down in the evening.
Starting a virtual machine in the morning means I need to log in to the Azure management portal, perform some clicks and wait for the provisioning, which can take up to 30 minutes in my experience. Afterwards I provide my credentials and establish a remote desktop connection. The annoying part is that I have to do this manually every morning.

I have now automated this process: when my laptop boots, the virtual machine spins up, and a remote desktop session automatically connects and displays the virtual machine with my account. The reverse process happens in the evening: when I close the lid of my laptop, the Azure VM shuts down.

Pre Requirements

PowerShell with Azure SDK
The first requirement is to install PowerShell and the Microsoft Azure SDK. The installation can be found here:
http://www.windowsazure.com/en-us/downloads/?fb=en-us
You might need to update your version of PowerShell as well.

Azure virtual machine
If you do not have an Azure Virtual machine, create one through the portal and make sure to enable the remote desktop endpoint:

Azure endpoints

Azure subscription file
To access your account through PowerShell, download your azure subscription file on this link:
https://windows.azure.com/download/publishprofile.aspx

User rights
You need user rights to edit the group policy. According to MSDN: "You must be logged on as a member of the Domain Administrators security group, the Enterprise Administrators security group, or the Group Policy Creator Owners security group."

Create the script

Provision virtual machine and start remote desktop session

The startup script can be found below.
The service name of the virtual machine can be found by taking the DNS name of the VM and removing the ".cloudapp.net" string.
The name of the VM is as shown in the Azure management portal.

Azure_vm_startup.ps1

#Import Azure PowerShell module and your publish settings file
Import-Module 'C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1'
Import-AzurePublishSettingsFile 'C:\start_vm\Visual Studio Premium with MSDN-1-3-2014-credentials.publishsettings'

#Access the VM
$vm = Get-AzureVM -ServiceName 'myservicename' -Name 'myname'

#Start the VM
$result = Start-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name

do {
  $vm = Get-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name
  sleep 5
} until ($vm.PowerState -eq 'Started')

#Read the remote desktop address
$endpoint = Get-AzureEndpoint -Name "Remote Desktop" -VM $vm
$remotedesktopurl = $vm.ServiceName + ".cloudapp.net:" + $endpoint.Port;

#Start remote desktop session
mstsc /v:$remotedesktopurl /f

Shutdown the virtual machine

Azure_vm_shutdown.ps1

#Import Azure PowerShell module and your publish settings file
Import-Module 'C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\Azure\Azure.psd1'
Import-AzurePublishSettingsFile 'C:\start_vm\Visual Studio Premium with MSDN-1-3-2014-credentials.publishsettings'

#Access the VM
$vm = Get-AzureVM -ServiceName 'myservicename' -Name 'myname'

#Stop the VM
if($vm.PowerState.Equals("Started")){
    $result = Stop-AzureVM -ServiceName $vm.ServiceName -Name $vm.Name -Force
}

Create logon and log off triggers

To trigger the start and stop scripts, I used the group policy in Windows, which gives us the possibility to trigger scripts at the logon and logoff events of the user.

Open the Group Policy editor:
Start => Run => gpedit.msc

Navigate to the Scripts section of the user:
Local Computer Policy => User Configuration => Windows Settings => Scripts (Logon/Log off)

 

Group policy scripts

Double-click the Logon record, navigate to the PowerShell Scripts tab, click Add and navigate to your script.
Repeat the same steps for the Logoff task.

Add logon scripts group policy

To apply the changes immediately, update your Group Policy:
Start => Run => gpupdate

Update group policy

Windows Policy Pitfalls

If you log off your user and log on again, you should see a remote desktop login popup.
The first time you'll need to provide your credentials and click "Remember my credentials".

Oddly enough, you'll see that the "Remember my credentials" button won't work: by default, Windows is not allowed to remember those credentials.
I tested this on my Windows 7 machine and saw this message: "Your system administrator does not allow the use of saved credentials to log on to the remote computer because its identity is not fully verified. Please enter new credentials"

save credentials remote desktop

Luckily, we can change this behavior, so open the Group Policy Editor again (Start => Run => gpedit.msc).
Navigate to the Credentials Delegation section:
Local Computer Policy => Computer Configuration => Administrative Templates => System => Credentials Delegation

credentials delegation group policy

Double-click the record "Allow delegating saved credentials" and select Enabled in the section at the top.
Click the "Show..." button and add "*" to the list.

Update the Group Policy again (Start => Run => gpupdate) and log off and log on with your user account.

Conclusion

We learned some basics on how to use PowerShell with Microsoft Azure: how to start and stop a VM and how to dynamically start a remote desktop session based on the configured endpoints.


The solution gives us a lot of comfort in the morning, as long as the machine is connected to the internet at logon and logoff time!
Automating the provisioning of a virtual machine saves us time and money. You can now enjoy your coffee even more in the morning.

Posted in: Azure | PowerShell

Tags: ,


June 19, 2014 at 4:29 PM

Some technologies have withstood the test of time very well. One of those technologies is the IBM AS/400 system (now called IBM System i). A lot of software is still running on AS/400 or mainframe systems.

Luckily, Microsoft provides us with the right tools to integrate with System i and System z environments. These tools are packaged into Host Integration Server and allow you to integrate on different levels with Host environments:

- Network integration
- Application or Program integration
- Data integration (databases and Host files)
- Message integration (queuing)
- Security integration

In this post, I will go deeper into application integration; more specifically, calling a piece of RPG code hosted on an AS/400 system. I will call the RPG program from BizTalk Server by using the Adapter for Host Applications.

The scenario

The AS/400 is running an RPG program that returns the balance of someone’s account when passing in his name and account number.
BizTalk Server will call into this RPG program by passing in the input parameters and receiving the return value (the balance).
BizTalk Server will leverage the Adapter for Host Applications to make this distributed program call into RPG possible.

The RPG program looks like this (it will be used later; you don’t have to understand RPG code):

      *********************************************************************
      *  PROGRAM      - GBALDPC
      *  DESCRIPTION  - AS400 GETBALANCE SERVER
      *  WRITTEN FOR  - MICROSOFT Corporation
      *********************************************************************
      *
     H
      *
      * Data Definitions
      *
     D Name            s             30A
     D AccNum          S              6A
     D AccBal          S              9P 2
      *
      * Mainline
      *
     C     *ENTRY        PLIST
     c                   PARM                    Name
     c                   PARM                    AccNum
     c                   PARM                    AccBal
      *
     c                   Z-ADD     2003.66       AccBal
      *
     C                   EVAL      *INLR         = *ON

 

For the sake of simplicity, but also to make it easy for you to reproduce this, I’m using the standard Microsoft ‘GetBal’ sample here.

Let's get started…

To realize this scenario, the following components are used:

- AS/400 running the RPG program (in my case, the ‘real’ AS/400 will be replaced by a ‘simulator’, because I don’t have an AS/400 on my laptop…)
- BizTalk Server 2013
- Host Integration Server 2013
- Visual Studio 2012

After you set up BizTalk Server and Host Integration Server (not part of this blog post), the work in Visual Studio can start.

Visual Studio will display 4 new project types that were installed together with Host Integration Server 2013.

clip_image002[4]

The project types:

- BizTalk Pipeline Conversion
This project type allows you to integrate with Host Data Structures without the use of the Adapter for Host Applications. This is very useful when you are, for example, reading a Host Data Structure from a queue.

- Host Application
This project type is used when you call a Host program via a .NET program or via the BizTalk Adapter for Host Applications (BAHA). The Host Application project type describes how the Host Data Structure translates to .NET data structures and vice versa.

- Host Files
Project type to integrate with the Host file system (VSAM for example).

- Message Queuing
Project type to integrate with Host queuing systems.

 

For this scenario, use the ‘Host Application’ project type and create a new project based on that template.

After creating the project, a blank canvas is presented:

clip_image003[4]

To continue, we will create a .NET Definition. There are two types of .NET Definitions: a Client and a Server Definition.

The .NET Client Definition is used when .NET will call a program on the Host system (also called Windows-Initiated Processing or WIP).
The .NET Server Definition is used when the Host System will call a .NET program (also called Host-Initiated Processing or HIP).

In this case, we will call the Host program, so choose ‘Add .NET Client Definition’.

clip_image005[4]

Name your .NET Client Definition:

clip_image007[4]

On the next page, provide a name for the interface, and an initial version number.

clip_image009[4]

The next page in the wizard looks like this:

clip_image011[4]

Protocol: Choose the correct protocol that is used to communicate with your Host system (TCP, HTTP or LU6.2). The AS/400 is using TCP, so we select that. Note: when doing mainframe integration, 2PC (two-phase commit) is only possible over LU6.2.

Target environment: Choose the target environment. In this case System i Distributed Program Call (other options: System i, System z, CICS, IMS)

Programming model: DPC. The programming model depends on the Target environment that was selected. More information on programming models and their functionalities can be found here.

Host Language: Choose the language of the Host program that will be called. In this case RPG. (Can also be COBOL)

On the next screen in the wizard, we have to specify in which program library our Host program is located.
The Host administrator or programmer should be able to provide you with this value:

clip_image013[4]

Finally, when the wizard is complete, a new .NET Client Definition is added to the Host Application project.

clip_image015[4]

Now that we have a base Host Application project with a .NET Client Definition, we are ready to import the data definition of the RPG program. This import will set the in/out and return parameters of the RPG program. This can be done by importing the RPG program, or manually by adding functions and parameters to the IBanking interface.

The easiest way is to import the RPG program. Right click the ‘GetBalClient’ and select ‘Import Host Definition’.

clip_image017[4]

In the wizard, browse to the RPG program.

clip_image019[4]

When your RPG program is read without errors, the following screen of the wizard is shown:

clip_image021[4]

Choose a friendly name for the method we will call, in this case ‘RPGGetBalance’.

The Library name is inferred from the information we entered earlier; this should be correct.
Finally, set the Program name. This is the program name on the Host environment; this value should be given by the Host administrator (or developer).

On the next screen, set the direction of each parameter:

clip_image023[4]

In this case, ‘NAME’ and ‘ACCNUM’ are our input parameters; ‘ACCBAL’ is the return value.

Click ‘Finish’.

The Host Definition was imported successfully:

clip_image025[4]

With that, our Host Application project is finished! We will now generate a DLL and XSD schema that we can use in BizTalk Server.

Right click the Host Application project and click ‘Save AS400RPG’. (Unlike other .NET project types, a Host Application project is not compiled but saved!)

clip_image027[4]

After saving, a DLL and an XSD are generated in the project's folder structure.

We are now ready to set up the BizTalk part, which we will keep very simple: a receive location will read a request message from a file location, a solicit/response send port will send the request to the AS/400, and a send port will subscribe to the result and write it to an output folder.

Let’s focus on the solicit/response send port that will call the RPG program.

Open the send port's properties and select the HostApps adapter.

clip_image029[4]

Open the adapter’s configuration

clip_image031[4]

Set the HostApps Transport properties as shown above.
Then click the ellipsis button on the ‘Connection Strings’ field.

The connection string window is shown.

clip_image033[4]

Click the ‘Add assemblies – Browse…’ button.

clip_image035[4]

Browse to the DLL that was generated by the Host Application Project in Visual Studio.
The DLL is added to the connection string window:

clip_image037[4]

Double-click the newly added DLL to set its connection string properties:

clip_image039[4]

Set a TimeOut value for the call to the RPG program.

Scroll down to the TCP/IP properties:

clip_image041[4]

Set the IP address and port number of the AS/400.

The send port to call the RPG program on the AS/400 is now configured correctly.

The last thing to do is test our RPG integration. To do that, I will create a sample message based on the XSD generated by the Host Application project.

My sample message looks like this:

<ns0:GetBalClient__Banking__RPGGetBalance__Request TIAssemblyVersion="1.0" xmlns:ns0="http://microsoft.com/HostApplications/TI/WIP">
  <ns0:RPGGetBalanceInDocument>
    <ns0:NAME>Kim Akers</ns0:NAME>
    <ns0:ACCNUM>123456</ns0:ACCNUM>
  </ns0:RPGGetBalanceInDocument>
</ns0:GetBalClient__Banking__RPGGetBalance__Request>

When BizTalk picked up the message, the RPG program replied with the expected balance response:

<?xml version="1.0" encoding="utf-8"?>
<ns0:GetBalClient__Banking__RPGGetBalance__Response xmlns:ns0="http://microsoft.com/HostApplications/TI/WIP" TIAssemblyVersion="1.0">
  <ns0:RPGGetBalanceOutDocument>
    <ns0:returnValue>777.12</ns0:returnValue>
  </ns0:RPGGetBalanceOutDocument>
</ns0:GetBalClient__Banking__RPGGetBalance__Response>

 

As you can see, integrating with Host programs (RPG, COBOL) is possible out of the box with BizTalk Server and Host Integration Server.
In straightforward scenarios like this, no coding at all is necessary!

 

Peter Borremans


June 13, 2014 at 1:37 PM

ITPROceed is a new free annual event in Belgium that is "the technology geek fest for IT Professionals on Microsoft technologies, tools, platforms and services" as they like to call it. Its goal: fill the TechDays void in the ITPro landscape.

It is heavily supported by Microsoft Belux but organised by all the ITPro user groups in Belgium - Azug, Pro-Exchange, SQL Server UG, System Center UG & Win-Talks!

Some Codit employees were present at the event, and here is their feedback and review of the sessions:

 

Hybrid Cloud with SQL Server 2014 by Pieter Vanhove

Pieter Vanhove has done a nice job showing us an overview of the new Microsoft Azure features that are available in SQL Server 2014.

The main focus of his session was how you can create a hybrid cloud solution (hybrid = hosting a part in the almighty Azure cloud and a part in your own datacenter).

 

Here are some interesting topics:

Backup to the cloud:

- Built-in in SQL Server 2012 and 2014.

- A tool is available for earlier versions. ("We are already using this in production.")

- A new feature is the Smart Admin backup to Azure, definitely interesting for smaller organisations!

 

Migrating databases to Azure:

- SQL 2014 has made it even easier to migrate your databases! A couple of clicks in Management Studio and you are good to go.

 

Microsoft has definitely made moving from on premise to Azure a lot easier! The other way takes more work. But why would you want to go back? :)

- Brecht
 
 

What is happening in a Windows Azure DC and how do they achieve running with 99.95% SLA? by Panagiotis Kefalidis

I was immediately impressed by the knowledge, enthusiasm and pace of the speaker. You could easily see he would be able to go on for hours about his passion: Microsoft Azure. 

He explained how Windows Azure's Fabric Controller and the infrastructure around it do their magic like a fine-tuned orchestra, keeping your machines running and self-healing when things go bad.

After this session I understand how they can guarantee this high SLA. You can’t imagine what happens behind the scenes when you, for example, create a new virtual machine. And that’s maybe the nicest thing about Azure: you don’t have to worry about it!

- Brecht

IT Pros: Meet Azure... again! by Mike Martin & Kristof Rennen

Mike & Kristof kicked off their session with a global overview of the Microsoft Azure landscape from an IaaS perspective.
The rest of their talk focused on the announcements and new features from this month; they even had to leave some out due to time constraints.
 

Here are some topics they covered -
- Manageability with Puppet & Chef
- New 'Ibiza' portal
- VM Extensions, e.g. anti-virus
- Disaster recovery with Site Recovery
- ExpressRoute
- Azure Files
- RemoteApp
- ... 
 

This was a really great session to get an overview of the recent changes and new features and what they are capable of.
Even I, as a developer, enjoyed the content :)

- Tom


Deep Dive into Microsoft Azure Storage Services by Yves Goeleven (slides)

I had heard a lot about this session and its precious level 666, and attended with big expectations. Yves analysed the official white paper and explained the whole architecture of how Microsoft Azure Storage works behind the scenes.
It is not just a simple set of disks but uses different layers to achieve this, and so on.
 

A heavy session, but I was able to follow the big picture and I also learned about some important effects, e.g. when you're using geo-replication this is not done at runtime but asynchronously in bulk.
Next to the main architecture we saw some small but important parts about supporting previous versions of the API: depending on the x-ms-version in the HTTP request, the service knows which API version you are using and routes the call to the corresponding service, ...
 

Very interesting session! 

- Tom


Deep dive into Windows Azure Pack by Alexandre Verkinderen & Christopher Keyaert

In this session they did a deep dive into the Windows Azure Pack and demonstrated all the benefits that you can get from it. For those who don’t know what the Azure Pack is: it’s running your private cloud using the Microsoft Azure technology.

The session was good and it was nice to see the possibilities, but for me personally this was a less interesting session, as I don’t see myself using this in my daily job.

- Brecht

 

Azure Integration Patterns with Windows Azure by Sam Vanhoutte (slides)

Last but not least, Sam was on stage to talk about integration patterns in the cloud and how you can bring the cloud to your enterprise, giving a short overview of several architectural and operational challenges.
 

He proceeded with three main topics - Network integration, Data integration and Application integration.

- Network integration was a short overview of how you can connect at the protocol/network level by using virtual networking, ExpressRoute & hybrid connections, as we saw in Mike & Kristof's talk.
 

- Data integration covered how we can use Azure Storage & Azure SQL databases and how we can sync that on-prem, as we also saw in Yves's session.
 

- Application integration was the main topic of his talk, where he introduced us to the Service Bus landscape and compared Service Bus Relay, BizTalk Services Hybrid Connections & BizTalk Services, with some nice demos that illustrate the benefit of each separate feature.
 

A nice comparison session that highlighted the power of each separate feature and what to use in specific scenarios.

- Tom

 

ITPROceed was a very good replacement for TechDays, with great content, great speakers and lovely catering, and all that for a free event!
This event once again proved that the community can be very powerful, interesting and fun!

I suggest that ITPros mark their calendars for next year's event on the 11th of June!

 

Thanks for reading,

Brecht & Tom.

Posted in: Azure | Community | SQL Server

Tags: , ,


June 2, 2014 at 4:00 PM

With “Mobile-First, Cloud-First” being the new trending mantra, communication between devices, on-premise services and the cloud is growing tremendously. Such a scenario drives the need for a means that provides a high-level perspective and complete control of all the services, irrespective of their hosting model, and that can aggregate, secure and tune them for business efficiency.

 

Sentinet by Nevatech

 

Sentinet is a lightweight and scalable SOA and API management platform that helps to define, secure and evolve your API program.

It delivers runtime SOA management by enforcing design-time decisions using policies and remote declarative configurations. These capabilities are provided to SOA and REST implementations in a completely non-intrusive manner.
Based on the concepts of service virtualization and service brokerage, it allows you to transparently manage solutions that run on a diverse SOA infrastructure and to quickly adapt to changes.

 

In this blog post I want to give you an overview of the components and the main features.

 

The components

 

A) Sentinet Nodes. A high-performance, low latency, scalable hosting model that can dynamically and non-intrusively extend and modify the behavior of existing services.

B) Sentinet Console. A web-based interactive application that allows SOA administrators and IT operators to manage and monitor the APIs and SOA services.

C) Sentinet Management API. An API that developers can leverage and extend to build their own management extensions and applications.

D) SOA Repository. It provides a centralized and secured repository of all SOA managed assets like services, policies, authorization rules, service level agreements and metrics.

           

SentinetComponents2

 

Explore the main features

 

Virtualization

The Sentinet Nodes hosting model makes it possible to aggregate and compose multiple business services into a single virtual service. Thanks to the fine-grained virtualization it is possible to configure details like which operations are virtualized, the URI templates, versions, routing criteria, etc.

 

In this sample two different services (one SOAP and one REST) have been virtualized into a single REST service; two operations have been renamed and included, and two have been excluded from the virtualization.

 

Virtualization

 

Mediation

Business services can be developed and deployed in the application layer with a unified communication and security pattern, while aspects like protocols, security, authorization and versioning are delegated to the Sentinet platform.

 

In this example, a service is exposed as netTcp with integrated security, and the security configuration is delegated to the virtual service, which has multiple endpoints with different bindings and different security models, like TransportWithMessageCredential or Message security with a client certificate. In other words: a protocol mapping and a security mapping have been applied.

Mediation

 

Security and Access Control

Sentinet Nodes dynamically implement and enforce SOA solutions’ security via managed authentication, authorization and access control.

Sentinet security models enable SOA services with Single-Sign-On and Federated Security scenarios and extend implementations with industry standard Security Token Services.

 

In this example I applied a custom access control rule that implements a rate limit of 7000 requests per 10 minutes, an IP filter and a time-range filter. An access rule can be applied to different scopes (Service, Operation, Endpoint); it’s also possible to apply multiple rules to the same scope to create a chained access control.

AccessRuleSimple

Then I ran a quick load test to exercise the rule I created.

zs_RuntTesta

When the rate limit is hit an HTTP 403 status code is returned.

AccessDenied

 

 

Monitoring and Reporting

Sentinet provides real-time and historical monitoring, auditing and message recording.

The image below shows the real-time graph related to the load test for the access control rule. At a glance we can see the performance trend and other metrics like the number of successful/failed calls, maximum message size and response times.

 

In this particular case, the real-time view helped me quickly notice that the test was run in a scenario with high network latency. Indeed, the summary box reports an average duration of 10 ms, while the average response time measured by the test client was 413 ms.

zs_Report

Switching to the logs tab we can find the list of the transactions that occurred, with additional details like the operation and the triggered access rule. It’s also possible to record the message content, or to disable recording completely, by changing the monitoring settings.

zs_LogWithDetails

Other reports with aggregated metrics are available. For more details, visit the Nevatech website.

 

SLA management

Sentinet Service Agreements help to monitor products and keep them reliable and scalable. A service agreement can cover multiple services and different service scopes (Interface, Operation, Endpoint), and it’s validated against multiple performance metrics.

During the definition of the scope to be monitored, you can choose which messages will be targeted by specifying an access rule.

SLA violations can trigger alerts and custom actions.

 

In this example, I created a new service agreement that covers two versions of the same service. The SLA is applied to different services and SLA violations are calculated every 5/10 minutes.

SLA1

Then I created another SLA for the maximum duration. Positioned on the Service Agreements node, you can monitor all the agreements merged together in real time. This is very helpful, especially when you define different groups of SLAs.

 

SLA4

Finally, in the logs tab you can find all the violation details within the agreement: the metric violated and the metric value at the time the violation occurred.

 

Conclusion

There are many SOA management products out there. Sentinet is the one we've chosen to enrich our integration offering because it fits perfectly with solutions that leverage the Microsoft technology stack, has a small footprint, and is highly extensible with remarkable performance.

 

Cheers,

Massimo

Posted in: Monitoring | REST | Sentinet | SOA | WCF

Tags: , , ,