How to ‘solve’ host instance state ‘Stop pending’

Every now and then you run into this, like I did today: you try to stop or restart a host instance, but it hangs in the ‘Stop pending’ state. Nothing seems to help; you cannot do anything in the BizTalk management console, nor in the Windows Services console.
Integration MVP Sandro Pereira blogged about this some time ago.

His blog post mentions some solutions, but my favorite one is to just kill the process and start it again. This option is mentioned, but how do you figure out which process to kill? They all look alike in the Task Manager. In Windows Server 2012 and up the processes in the Task Manager can be expanded so the actual name is displayed, which makes it easy to kill the correct process. However, in Windows Server 2008 this isn’t the case, and that’s where this post can help you.

The first step is to determine which process id is causing the trouble. You can find this piece of PowerShell in another post from Sandro as well. For completeness’ sake I’ll show it here:

$machineName = hostname
# WMI query for in-process host instances (HostType = 1) that are in the Running state (ServiceState = 4)
$query = "root\MicrosoftBizTalkServer", "Select * from MSBTS_HostInstance where HostType = 1 and ServiceState = 4 and RunningServer = '$machineName'"
$hostInstanceSearch = New-Object System.Management.ManagementObjectSearcher($query)
$hostInstanceList = $hostInstanceSearch.Get()
foreach ($hostInstanceItem in $hostInstanceList)
{
    # The 'ID Process' performance counter maps a host name to its process id
    $processName = $hostInstanceItem.HostName
    $perfCounter = New-Object System.Diagnostics.PerformanceCounter("BizTalk:Messaging", "ID Process", $processName)
    $processID = $perfCounter.NextValue()
    Write-Host
    Write-Host "HostName: " -ForegroundColor Yellow -NoNewline
    Write-Host $hostInstanceItem.HostName -ForegroundColor White
    Write-Host "Process Id: " -ForegroundColor Yellow -NoNewline
    Write-Host $processID -ForegroundColor White
    Write-Host
}

The output of this script will be like this:

[Screenshot: PowerShell output listing HostName and Process Id]

However, I found that this list is sometimes not complete: the problem host instance is missing, so at least we know which instances aren’t the problem. With the next command, run from a command window, you can get all BizTalk host instance processes on your machine:

tasklist /FI "IMAGENAME eq btsntsvc.exe"

The output is like this:

[Screenshot: tasklist output]

In this particular example there is no difference between the two, because I cannot force the ‘Stop pending’ problem on demand. However, you can now compare the two outputs and see which PID is missing and is thus causing the problems.

Finally use this command from a command window to kill the process:

taskkill /PID 4160
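
If you run into this more often, the compare-and-kill routine can be scripted as well. Below is a minimal PowerShell sketch; the PID list is a placeholder, paste in the PIDs reported by the WMI script above:

# PIDs reported by the WMI/perf-counter script above (placeholder values)
$reportedPids = @(4212, 5104)
# All BizTalk host instance processes currently running on this machine
$allPids = Get-Process -Name BTSNTSvc -ErrorAction SilentlyContinue | Select-Object -ExpandProperty Id
# Any running PID that was not reported is the likely hung host instance
$suspects = $allPids | Where-Object { $reportedPids -notcontains $_ }
foreach ($suspectPid in $suspects)
{
    Write-Host "Hung host instance candidate, PID: $suspectPid"
    # Stop-Process -Id $suspectPid -Force   # uncomment to actually kill it
}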

Hello Logic App!

During Integrate 2014 last November in Seattle, Microsoft gave a sneak preview of what today is known as API apps and Logic apps. Back then it was all fuzzy and there was no preview to play with. Right before the BizTalk Summit in London in April, Microsoft released the big news regarding their new App Service platform: the existing Web apps (Azure websites) and Notification services, combined with the new API apps and Logic apps.

For us BizTalk developers the most interesting of all are the Logic apps. You can’t use Logic apps without API apps though, because API apps are (among other things) the new adapters to receive and send messages.

I was inspired by this blog post from the BizTalk360 team, which describes how to read messages from one on-premise location and write them to another using Logic apps and hybrid connections. That blog post saved me quite some research, as there are some tricky things to know. At the end of the post I had the Logic app below.

[Screenshot: basic Logic app with two file connectors]

For me that was the first step, but in a real BizTalk scenario there has to be at least one map! My scenario is to read from a local ‘inbox’ folder on my laptop, transform the message, and write it to an ‘output’ folder on the same laptop.

So I installed the Microsoft Azure BizTalk Services SDK to have mapping functionality in Visual Studio without the need for any BizTalk assembly. I created a small schema and a map to be used for the Logic app.

[Screenshot: schema and map in Visual Studio]

The next step is to have an API app perform the transform, because on this platform everything is an API. To create a new BizTalk Transform Service you can also use this Azure documentation as a reference.

[Screenshot: creating a new BizTalk Transform Service]

Select BizTalk Transform Service and click ‘Create’. The Azure portal start page will open and show the API app to be created. When the API app is created, click it to open the details. The next step is to add a map to the Transform Service.

[Screenshot: adding a map to the Transform Service]

Now that the BizTalk Transform Service is ready, we need to put it between the two file connectors in the existing Logic app. This is a challenge in itself, because currently it isn’t possible to re-organize the API apps within a Logic app. Just dragging the Transform Service in between the two file connectors is a no-go. I tried to add the Transform Service to the Logic app and then use the code view to re-organize, but that appeared not to be that simple (although the design of a Logic app is described in a plain, readable JSON file). This will be possible in the future, but for now I removed the sending file connector, added the Transform Service, and then added the file connector again. This results in the Logic app below.

[Screenshot: Logic app with the Transform Service between the two file connectors]

The complexity at the moment lies in the expressions you need to provide as parameters to the API apps. There is no validation, intellisense or syntax highlighting yet, which complicates development. This will come in the near future; there is demand from the community for it as well.

A Logic app in fact chains API apps together, using (a part of) the output of one API app as the input for the next. For the BizTalk Transform Service we need to take the output of the first File Connector, which is the body of the message, as input for the map. We can use this expression for that: @triggers().outputs.body.Content

It is pretty easy to read: take the content of the body of the output of the previous API app.

Next is to take the output of the Transform Service and use that in the sending File Connector. To determine the file name to use for the File Connector we can use this: @concat('/Outbox/', triggers().outputs.body.FileName)

The name is appended to the default folder. The content of the message is grabbed from the Transform Service, but has a different expression compared to the one used as input to the Transform Service: @body('transformservice').OutputXml

Like a typical ‘hello world’ with BizTalk, I throw this message into the Logic app:

<ns0:SourceSchema xmlns:ns0="http://didago.nl/sourceschema">
  <ns0:FirstName>Indiana</ns0:FirstName>
  <ns0:LastName>Jones</ns0:LastName>
</ns0:SourceSchema>

This is the output and as expected the map has been executed.

<?xml version="1.0" encoding="utf-8"?>
<ns1:TargetSchema xmlns:ns0="http://didago.nl/sourceschema" xmlns:ns1="http://didago.nl/targetschema">
  <ns1:FullName>Indiana Jones</ns1:FullName>
</ns1:TargetSchema>

Hurray, my first Hello Logic app!

It was a very interesting exercise to play with these basic components of the App Service platform. Logic apps are currently certainly not a replacement for BizTalk, and that is also not what they are meant to be. Integration MVP Michael Stephenson has a nice blog post about it.

Currently Logic apps are in preview and a lot still needs to be done before they are mature enough to build enterprise solutions with. For example, the entire ALM story needs to be figured out, and the designer should be available in Visual Studio instead of only in the browser. Microsoft is betting big on this, so it will be a matter of time before these topics are covered.

It is great playing with new stuff!

The status of the Windows Azure Pack

Until now my focus was not really on the private cloud, but I was triggered by the fact that it was mentioned at the Integrate 2014 conference in Seattle, which I attended. During the conference I realized I didn’t know a lot about the Azure Pack, while it will become very important. The current on-premise BizTalk version will eventually be replaced by what Microsoft is building at the moment. The future Azure version of BizTalk (orchestration engine, connectors, BizTalk microservices, etc.) will be deployed on-premise by means of the Azure Pack. By the way, the product team indicated a preview will be released around the BizTalk Summit in London on 13/14 April 2015.

Then the blog post from Sven van den Brande about this topic came by, and that was the trigger to actively take a look at it. I started my 6-year-old server to take the first step: installing Windows Server 2012 R2.

There are quite a few blog posts about installing the Azure Pack, like:

  • http://blogs.technet.com/b/privatecloud/archive/2013/12/06/windows-azure-pack-installing-amp-configuring-series.aspx
  • http://blogs.msdn.com/b/nick_meader/archive/2014/07/31/building-a-self-service-private-cloud-using-windows-azure-pack.aspx

I took the ‘express route’ to quickly install the Azure Pack. This installation is meant for single-server installs, whereas a typical Azure Pack production install requires multiple servers to host the features you find in Azure: IIS for websites, SQL servers for databases, Active Directory servers for authentication and Hyper-V servers for VMs.

I found this blog post especially helpful for guidance: https://www.helloitsliam.com/2014/11/21/windows-azure-pack-part-1/

I followed that blog post to install everything necessary to run the Azure Pack on a single server, using SQL Server 2014 Express edition. Installing the Azure Pack via the Web Platform Installer is pretty simple. You can check all the screenshots in the blog post, as it wouldn’t make sense to repeat them here.

After installation and configuration you get the interface below (after a bit of playing around), which looks pretty familiar but is also far behind in features compared to Microsoft Azure today.

[Screenshot: Azure Pack management portal]

The documentation is dated October 17, 2013 and the Azure Pack feature installers are from October 21, 2014. The Windows Service Bus, for example, also hasn’t been updated since October 2013.

So why would you install the Azure Pack in a production environment at the moment, if it is so far behind in features and doesn’t get regular updates?

Asking the question is answering it. I think the private cloud is going to become very important as a replacement for current Windows Server farms, but for now it is not an option. It is fun to play with, but that’s about it. I wouldn’t advise a customer to use this in production.

My guess is that Microsoft is not putting any effort in the current Azure Pack anymore, but is building a new Azure Pack for Windows Server vNext, which is scheduled to be released in 2016, even after Windows 10. Or even better: Windows Server vNext will be built to support a private cloud. The timeline for this will be aligned with other Azure features like Service Bus, API Management and BizTalk microservices. As with the new BizTalk developments, the private cloud will just be an instance of Microsoft Azure, which results in feature parity between the two.

We have interesting times ahead!

My migration to Exchange Online

Being a freelance consultant I have a domain registered, which comes with email and hosting. When I started my company back in 2008 I took the cheapest solution (typically Dutch) and everything went fairly well. Every now and then I had an email outage, but only a couple of times a year and not for long periods of time.
Nonetheless this started to annoy me so much that I decided to move to the cloud: Exchange Online.

This was back in March 2014, just after another outage. I wrote the words ‘Exchange Online’ as a TODO on the whiteboard in my study. But moving to another email provider is pretty scary, for me at least: you don’t want to be left without email for a couple of days (or worse). So I postponed it again and again, until the next outage in November 2014, after which I decided to really move during Christmas.
That after all is a time of limited email traffic, so I felt confident enough to make the journey.

The first step was checking out which ‘plan’ I needed: the $4 or the $8 a month plan. Basically the difference is unlimited storage, but ‘Basic’ already offers 50 GB per user.

The next step is registering, and I’ll save you from the rest of the steps because they’re pretty well described in other blog posts.

When logging in you’ll get an Office 365 portal with the typical Outlook items and an Exchange Online admin section. Being a developer, it was an eye opener to find the number of options you can configure, because this is normally the system engineer’s domain. Suddenly you’re an Exchange administrator!

The next step is configuring the domain to be used, because otherwise you’ll end up with <your name>@<your company>.onmicrosoft.com. This was pretty easy: just add some DNS records and you’re done.
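
For reference, the records the portal asks you to add look roughly like this (the exact MX target contains a value derived from your own domain, so treat these as an illustration):

MX didago-nl.mail.protection.outlook.com
TXT "v=spf1 include:spf.protection.outlook.com -all"
CNAME autodiscover -> autodiscover.outlook.com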

Then finally the actual reason I started this blog post, which obviously isn’t a post like my regular ones, but it might be helpful for others. As mentioned, I’m a freelance consultant and I own the domain ‘Didago.nl’. For tracing reasons I provide a customized email address to every customer, supplier or other contact I need to provide an email address to. The customization means I put the contact name in the email address, so for example for LinkedIn I would use “linkedin(at)didago.nl”. This way I can set up Outlook rules, but it is also traceable which source used this email address. More than once I found one of my email addresses in a place it shouldn’t be, like a spam list. Some time ago I was actually able to inform a Dutch blog of their compromised CMS before they knew it themselves…

When configuring Exchange Online you need a license for every mailbox you create. Since I don’t want to create a mailbox for each and every email address, I had configured a ‘catch all’ account at my previous provider. I was under the assumption this was common, but it appears Microsoft has a policy not to allow catch-all accounts because they attract spam. While this is a viable reason, it is not very handy for me, so I started my search for a solution because this had to be fixed.

Luckily I found the solution in this blog post by YourItHelp. It describes that you have to tell Exchange Online upfront that it is not authoritative for your domain, which means it shouldn’t assume it manages all accounts for that domain. With this setting, Exchange Online assumes accounts may also exist in other locations and just functions as a relay in case it cannot find a recipient.
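
In the portal this is the domain type setting; if you prefer PowerShell, a sketch using the standard Exchange Online cmdlets would look like this (the domain name is mine, obviously):

# Connect to Exchange Online via remote PowerShell
$session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential (Get-Credential) -Authentication Basic -AllowRedirection
Import-PSSession $session

# Internal relay = not authoritative: relay mail for unknown recipients instead of rejecting it
Set-AcceptedDomain -Identity "didago.nl" -DomainType InternalRelay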

After that you can configure a rule to catch all email which has a recipient ‘outside the organization’; this is perfectly described in the blog post from Your IT Help. Although I had a short fight with the catch-all rule definition (I found out it also caught all of my outgoing email :-( ), I’m very happy with the result. And Exchange Online integrates very well with my Nokia 930, so I found out :-)
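
For completeness, a sketch of such a catch-all transport rule in PowerShell (the mailbox name is hypothetical; the -FromScope condition is what keeps the rule from also catching your own outgoing email):

# Redirect mail sent to unknown recipients in the domain to a single mailbox,
# but only for mail coming from outside the organization
New-TransportRule -Name "CatchAll" -FromScope NotInOrganization -SentToScope NotInOrganization -RedirectMessageTo "catchall@didago.nl"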

I’m one step closer to the cloud!

Cheers, Jean-Paul

Integrate 2014 – Impact on Integration Consultants

Like a lot of fellow integration consultants I attended Integrate 2014, the integration summit on the Microsoft campus in Redmond. The event was organized by BizTalk360 and they did a great job.

It was the first time I attended a BizTalk event abroad, but I heard something new was to be announced so I registered early. Although it is quite a trip, for a couple of days, all the way from Amsterdam to Seattle, it definitely was worth the jetlag.

I’m not going to describe the sessions, because that has already been done in several great posts.

As with many such events, the sessions are interesting, but the most interesting part is meeting new people and discussing with fellow enthusiasts what the impact of the announcements is. Besides the discussions, it is always nice to shake hands with community leaders whom I sometimes only knew from Twitter or blogs.

Although some very interesting new things were announced, we as integration consultants mainly need to work with what’s available today. So in this post I’ll focus on the impact on our day-to-day work and the near future.

My takeaways from Integrate 2014 regarding this topic are mainly that:

  • BizTalk is going to stay around
  • There is a real and major shift towards the cloud
  • It will take some time for Microsoft to be feature complete
  • Microsoft Windows Azure BizTalk Services (MABS) will be discontinued, but migration will be possible

Although the announced changes have a huge impact on the BizTalk community, BizTalk itself will stay around for many years. Microsoft again pointed out their release cadence regarding BizTalk, with a major release every 2 years and a platform alignment every alternate year. For 2015 a major release is scheduled, but the worrying thing is that Microsoft has not shown a clear picture of what the ‘major’ enhancements will be. Actually I think Microsoft will only do platform alignment (make BizTalk ready for the latest Windows/SQL/Visual Studio versions) and maybe add one or two features to improve cloud connectivity or adopt new standards (like happened with REST), but nothing new and innovative will be added to the current BizTalk platform. This in fact means a stand-still from an innovation point of view, so don’t expect any new investments in for example BAM or the workflow engine.

After one of the sessions I had a chat with one of the Program Managers on the BizTalk team, and he explained that all of their 800 developers are working on the new stuff to get it ready for the announced preview. This leaves few resources to work on BizTalk Server 2015, and it also makes very clear where Microsoft’s focus is at the moment. He also mentioned there will be a shared codebase between BizTalk on-premise and BizTalk-in-the-cloud. Later during Integrate it became clear that this actually means Microsoft will only build for the cloud and make that available on-premise in the form of a private cloud (delivered via the Azure Pack for Windows Server). Looking at it from that perspective, it makes sense that there will be a single codebase and feature parity between on- and off-premise, because they’re exactly the same product. The only difference is the datacenter it will be deployed in: public cloud or private cloud. One important thing regarding the private cloud though: as mentioned before, Microsoft will provide Azure BizTalk via the Azure Pack for Windows Server, but this pack currently doesn’t provide what’s available in Azure. So a lot of work needs to be done there as well.

This focus on the cloud leads to the conclusion that there will be minimal development on the current on-premise version of BizTalk Server, and that it will become a different track in the integration space. By that I mean we’ll see exactly the same as happened with ASP and ASP.NET: There will be developers doing ‘classic’ BizTalk and others doing ‘Azure’ BizTalk. Although they functionally do the same work, the technology is way different. This also means different design skills will be needed. In case of Azure, for the first time BizTalk developers will need to really include cost efficiency in their considerations.

I don’t think this separation will happen in the next few years, but it will eventually. The preview of Azure BizTalk (which isn’t an official term, by the way) is scheduled for Q1 2015. The demos shown at Integrate made me think the first preview will be nice to play with, but far from feature complete. For Azure Integration Services (which is an official term from the slides) new features will be added in an update cadence of 3 months, starting with the features needed to implement some content/context-based routing scenarios. For enterprise solutions with complex orchestrations it will be a different story. Microsoft actually hopes these orchestrations can be broken down into a (large) set of (BizTalk) microservices, but we’ll have to see whether that is actually the case. This is also one of the concerns I have regarding migration of existing solutions. Some solutions will need to be redesigned, because they can’t be migrated.

One of the sessions was about cloud integration at Microsoft itself. They are currently investing heavily in moving their integration to MABS, although they know MABS will be discontinued in favor of the new platform. That means they’re very confident in the migration path, as Kannan C. Iyer answered to the same question from the audience. As far as I know, here in The Netherlands MABS isn’t used very often (if at all?), but in the USA it is mainly used for B2B, where it is a suitable solution looking at the current capabilities.

To conclude my view on the new announcement: these are exciting times for integration consultants. For the first time in about 10 years things are really going to change for us and you have to decide for yourself whether you want to stay the ‘classic’ BizTalk consultant or add ‘Azure’ BizTalk to your skillset. The classic BizTalk consultant will be around for at least 10-15 years (lots of customers still use BizTalk 2006 or older, and the end-of-life of BizTalk 2013 R2 is in 2023), so plenty of time to make up your mind.

Didago IT Consultancy

BizTalk Software Factory v5 for BizTalk 2013R2

If you’re not familiar with the BizTalk Software Factory, please read the documentation on http://bsf.codeplex.com/releases.

Every new release of BizTalk requires some changes in the BizTalk Software Factory (BSF), and the 2013R2 edition is no exception. The most important change is in the Visual Studio version; the BSF relies on the Guidance Automation Toolkit (GAT) and Extensions (GAX) being available for it.

Since Microsoft stopped development of the toolkit and extensions I’m happy to see the community continues it by means of the Open GAT/GAX (http://opengax.codeplex.com) project.

To be able to install the BSF (http://bsf.codeplex.com) you need to install the Visual Studio 2013 SDK and the Open GAT/GAX for Visual Studio 2013 upfront.

The functionality of the BSF hasn’t changed, but it is important to know that at this moment there is no version of the BizTalk Deployment Framework (BTDF) that supports Visual Studio 2013. Since the BSF supports the BTDF, that option is currently present but doesn’t work. Installing BTDF v5.5 alongside Visual Studio 2013 does work, but Visual Studio will not contain any of the BTDF functionality.

If you run into issues or you like to have some additional functionality in the BSF please let me know.

Didago IT Consultancy

Book review Getting Started With BizTalk Services

Recently I was invited to take a look at the Packt Publishing book ‘Getting Started with BizTalk Services’ by Karthik Bharathy and Jon Fancey.

This is the first available book on Microsoft Azure BizTalk Services, and for that reason alone an interesting read. I’ve already played with BizTalk Services, so I was curious to measure my knowledge against the book, all the more because the book assumes no prior BizTalk knowledge. The Microsoft Azure platform is expanding at a tremendous rate, so I was also interested to see how up to date this book is, as it was published in March 2014 (although that’s only 3 months ago, Azure features are released quarterly).

To start with the last question: the book is still very much up to date. No major changes or new features have been announced that directly impact BizTalk Services. So from that point of view it is still a reliable source of information (besides that WABS is called MABS now :-)).

The book is organized as follows. It starts with a generic overview of what Azure is and for what scenarios it is useful. It also covers the basics of BizTalk Services.

The next chapter is about the Mapper, which from an on-premise BizTalk Server perspective has been seriously improved. One thing the book briefly touches on is the fact that the mapper is no longer based on XSLT behind the scenes; it uses XAML. Another thing is the ability to have some sort of exception handling in the map, which has been one of the missing pieces of integration in BizTalk Server. For each operation you can specify what to do in case of an exception: fail, or continue and output a null value. Of course this is not really exception handling, but it’s better than nothing.

Then there is a chapter about Bridges. I knew a bridge was comparable to a BizTalk pipeline, but I was surprised to read that behind the scenes a bridge uses Windows Workflow Foundation. This is an indication that the announced workflow engine for BizTalk Services will most probably be Workflow Foundation as well.

Other chapters cover topics like:

  • EAI scenarios
  • B2B scenarios (EDI with X12/EDIFACT)
  • Using the BizTalk Adapter Service to connect to on-premise LOB systems
  • Using custom code in bridges
  • Maintaining BizTalk Services using the API via PowerShell or REST service
  • Tracking and Troubleshooting
  • Moving current BizTalk Server investments to BizTalk Services (and when not to)

Especially the chapter about B2B is comprehensive. The X12 standard is used quite often in the US, so it makes sense to dive deeper into that part (European customers use EDIFACT). Besides that, the B2B market is the most suitable to move to the cloud first, because it involves integration with other companies, which is typically a cloud scenario.

Everything in the book is described in clear language, and the book doesn’t just scratch the surface. Some topics are explained in more detail, including some background as well.

For anyone who is interested in BizTalk Services and wishes to get up to speed quickly, this is a perfect start.

Didago IT Consultancy

Introducing the BizTalk Port Info Query Tool

Sometimes for a project you have to create a tool which you plan to share with the community. This tool has been on the shelf for at least a year and I finally found time to share it with you.

It started when I got involved as a firefighter in a BizTalk project at a customer that had a lot of applications and, with them, hundreds of receive and send ports. The architecture was fully based on pub/sub, and it was sometimes very difficult to understand the flow of a message and where it would end up. To get a better view of the inner workings, I decided to create a small tool to answer questions like:

  • Which ports use this specific map?
  • Where is a certain pipeline used?
  • Which ports subscribe to this field?

The tool is capable of reading the management database and retrieving the receive port and send port information via the ExplorerOM assembly. A port name alone is not detailed enough, so I added support for retrieving the following information to get a complete view:

  • Receive Port Name
  • Receive Locations
  • Receive Transform
  • Receive Pipelines and Pipeline Components + Configuration
  • Receive Adapter Type and Settings like URLs
  • Send Port Name
  • Send Transforms
  • Send Pipelines and Pipeline Components + Configuration
  • Send Adapter Types and Settings like URLs
  • Send Port Filter Subscriptions

This information is retrieved for each of the selected applications. Once this information is available, it is easy to search through it as well. So search was added, to quickly see which receive or send ports use a certain map, or which send ports subscribe to a certain field value.
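
By the way, the same ExplorerOM API the tool uses is scriptable too. A minimal PowerShell sketch (the assembly path assumes a default BizTalk 2013 install, and note that ExplorerOM requires a 32-bit process):

Add-Type -Path "C:\Program Files (x86)\Microsoft BizTalk Server 2013\Developer Tools\Microsoft.BizTalk.ExplorerOM.dll"

$catalog = New-Object Microsoft.BizTalk.ExplorerOM.BtsCatalogExplorer
$catalog.ConnectionString = "Server=.;Database=BizTalkMgmtDb;Integrated Security=SSPI"

foreach ($app in $catalog.Applications)
{
    Write-Host "Application: $($app.Name)"
    foreach ($sendPort in $app.SendPorts)
    {
        # The Filter property holds the subscription expression as XML, handy for searching
        Write-Host "  Send port: $($sendPort.Name), filter: $($sendPort.Filter)"
    }
}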

It proved to be pretty handy for research purposes. The tool doesn’t require an installer; just launch the executable to bring up the WinForms application. The sources and executable are on CodePlex and have been tested with BizTalk 2010 and BizTalk 2013.

If you launch the tool you’ll get the screen below where you can adjust the user id and password to connect to the management database.

[Screenshot: start screen with connection settings]

Once you click ‘Retrieve BizTalk Applications’, the management database will be queried using the specified credentials to retrieve all deployed BizTalk applications. Depending on the number of applications this can take some time.

[Screenshot: list of loaded BizTalk applications]

The applications in the list are prefixed with a checkbox, so you can retrieve the information for specific applications only if you wish. When you’re done selecting applications, click ‘Retrieve Application Info’ to get the details.

[Screenshot: retrieved port information]

You now see two populated tree view controls on the right-hand side. The top one contains the receive port information and the bottom one the send port information. Per BizTalk application there is a hierarchy containing all the details. If you expand the tree, you’ll for example see information about the receive locations, transforms, pipelines and so on.

[Screenshot: expanded port information tree]

The following image shows the details of pipeline components and their settings.

[Screenshot: pipeline component details]

The best part is that you can search through all information in the receive and send port tree views. The search text box scans the trees for a match and displays the results in the bottom list view. The example below shows a search for every instance containing ‘_to_’, and as expected the maps on the send and receive ports show up, including the path to the applications they reside in.

[Screenshot: search results]

Sometimes the information (path) is too long, like with filter subscriptions which can take up many characters. In that case you can double-click the entry and the information is displayed in a message box.

[Screenshot: search result details in a message box]

Finally, in situations where you cannot run the tool yourself, or if you want to have the information ‘offline’, you can use the ‘Export App Info’ button. This button saves the tree as a CSV-formatted file, which you can open in Excel for example. The export is saved in the folder the executable is launched from.
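
Since the export is plain CSV, you can also search it without the tool, for example from PowerShell (the export file name is just an example):

# Search the exported port information for a map name
Select-String -Path .\PortInfoExport.csv -Pattern "_to_"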

Since this started as a tool-with-maximum-use-minimum-UI it is clear the tool can be improved a lot to be more comprehensive and user friendly, but it worked for me. Feel free to take the code and adjust it to your needs!

I hope it helps you guys getting a better understanding of the BizTalk applications you’re faced with :-)

Greetings,

Jean-Paul

Didago IT Consultancy

BTDF and “The mapping does not exist” SSO Error

Recently I needed to use SSO in an existing BizTalk solution where we use the BizTalk Deployment Framework.

Adding SSO support is so easy, but I ran into the famous “The mapping does not exist” SSO error. It took me some time to figure out what the cause was, and by posting it here I hope someone will benefit from it.

To start with some background: the requirement was a configurable value in a mapping, which would differ between the Dev/Test/Prod environments. So typically something for SSO.

To deploy SSO as part of the BTDF is pretty easy. In the btdfproj file you have to specify: <IncludeSSO>true</IncludeSSO>

Next, you have to make sure to define the SSO security groups in the btdfproj and the settings Excel file. You can also define custom values in the settings Excel, like ‘SomeValueFromSettings’:

<ItemGroup>
  <PropsFromEnvSettings Include="SsoAppUserGroup;SsoAppAdminGroup;SomeValueFromSettings" />
</ItemGroup>

Finally you have to add this to the btdfproj:

<Target Name="CustomSSO" Condition="'$(Configuration)' == 'Server'">
  <UpdateSSOConfigItem BizTalkAppName="$(BizTalkAppName)" SSOItemName="SomeValueToBeUsed" SSOItemValue="$(SomeValueFromSettings)" />
</Target>

So far so good, this deploys the SSO settings as expected. The interesting thing is how to get these values out of SSO again. The BTDF uses a special technique to store the settings, which means you should only use the provided SSOSettingsFileReader.dll assembly to get values out of SSO.

I used the SSO Config Get functoid from the BizTalk Mapper Extensions Utility Pack. This all seemed to work fine, but when the map is executed at runtime you’ll receive the error “The mapping does not exist”.

Although more reasons exist for this error, in this case it turned out that the way the SSO values were retrieved was not supported. The extension pack obviously uses a different way to retrieve values from SSO, which works fine if you deploy values to SSO using, for example, the SSO Configuration Application MMC Snap-In.

Because I wanted to use the BTDF I changed the functoid to a Scripting Functoid which calls an external assembly method of the SSOSettingsFileReader assembly. After this change it worked right away.
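
The scripting functoid simply calls the reader’s static method. You can use the same assembly to verify a deployed value from PowerShell; a sketch (the assembly path depends on your BTDF version, and the application and setting names are examples):

Add-Type -Path "C:\Program Files (x86)\Deployment Framework for BizTalk 5.0\Framework\DeployTools\SSOSettingsFileReader.dll"

# First argument is the SSO affiliate application (the BizTalk application name in a BTDF deployment),
# second argument is the setting name from the settings spreadsheet
[SSOSettingsFileManager.SSOSettingsFileReader]::ReadString("MyBizTalkApp", "SomeValueFromSettings")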

After all a pretty simple solution, but isn’t that always the case :-)

HTH,

Jean-Paul Smit

Didago IT Consultancy

[BizTalk 2010] Testing a multiple input map with Visual Studio 2010

Since BizTalk 2010 we have a feature in Visual Studio 2010 which allows us to test maps without having to deploy them into BizTalk Server. Before that, we were lucky to have frameworks like BizUnit. The BizTalk Software Factory also uses this feature to test maps.

To test maps just using Visual Studio 2010 we use this piece of code in our unit test project:

Microsoft.BizTalk.TestTools.Mapper.TestableMapBase map = new …map_name…();

map.TestMap(inputFile, Microsoft.BizTalk.TestTools.Schema.InputInstanceType.Xml, outputFile, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML);

After this statement we have the output file available, which we can compare with the expected values.

But what if we have a map with multiple inputs? We’re not able to provide multiple input files. So how do we test them?

To understand this, we need to know how BizTalk handles these kinds of maps. By the way, this post assumes you know how to create a multiple input map. If not, you can check it here.

To handle multiple input maps, BizTalk creates a new schema that wraps both input schemas. That way BizTalk is facing only one (combined) schema again. This schema is not visible to us developers, but is used behind the scenes. So to be able to test a multiple input map, we only have to create such a combined message. How do we create such a message?

To explain this I’ve setup a test scenario, which uses the following artifacts.

First Schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.FirstSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.FirstSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="FirstSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FirstName" type="xs:string" />
        <xs:element name="City" type="xs:string" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

Second Schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="SecondSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="LastName" type="xs:string" />
        <xs:element name="Country" type="xs:string" />
        <xs:element name="Age" type="xs:int" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

Combined (target) schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.CombinedSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.CombinedSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="CombinedSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FirstName" type="xs:string" />
        <xs:element name="LastName" type="xs:string" />
        <xs:element name="City" type="xs:string" />
        <xs:element name="Country" type="xs:string" />
        <xs:element name="Age" type="xs:int" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

I’ve created a map which transforms the first and second schema into the combined schema. If you open the map in an XML editor, you can view what BizTalk does with multiple input maps.

<xs:schema xmlns:ns2="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:tns="http://schemas.microsoft.com/BizTalk/2003/aggschema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" xmlns:ns1="http://Didago.Samples.MultipleInputMap.FirstSchema" targetNamespace="http://schemas.microsoft.com/BizTalk/2003/aggschema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:import schemaLocation=".\FirstSchema.xsd" namespace="http://Didago.Samples.MultipleInputMap.FirstSchema" />
  <xs:import schemaLocation=".\SecondSchema.xsd" namespace="http://Didago.Samples.MultipleInputMap.SecondSchema" />
  <xs:element name="Root">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="InputMessagePart_0">
          <xs:complexType>
            <xs:sequence>
              <xs:element ref="ns1:FirstSchema" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element name="InputMessagePart_1">
          <xs:complexType>
            <xs:sequence>
              <xs:element ref="ns2:SecondSchema" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

BizTalk creates a new schema with a ‘Root’ element and two sub-nodes, ‘InputMessagePart_0’ and ‘InputMessagePart_1’. If we recreate this message for our test, we can use the regular unit test code and it will work. The easy way is to take this xs:schema part from the map, create a new schema in BizTalk based on it, and use the ‘Generate Instance’ feature to get a valid combined-message example. We can then fill this sample with our test message content to have a valid test. For my test scenario it looks like the example below.

<ns0:Root xmlns:ns0="http://schemas.microsoft.com/BizTalk/2003/aggschema">
  <InputMessagePart_0>
    <ns1:FirstSchema xmlns:ns1="http://Didago.Samples.MultipleInputMap.FirstSchema">
      <FirstName>FirstName_0</FirstName>
      <City>City_0</City>
    </ns1:FirstSchema>
  </InputMessagePart_0>
  <InputMessagePart_1>
    <ns2:SecondSchema xmlns:ns2="http://Didago.Samples.MultipleInputMap.SecondSchema">
      <LastName>LastName_0</LastName>
      <Country>Country_0</Country>
      <Age>10</Age>
    </ns2:SecondSchema>
  </InputMessagePart_1>
</ns0:Root>

All we have to do now is adjust the content of the test message to fit our needs. Based on this single input test message, we can perform the tests we need and verify it!

Two remarks though:

  1. Besides the Visual Studio testing, there are other test frameworks available, for example the BizTalk Map Test Framework or the Fakes Framework.
  2. This blog is specifically targeted at BizTalk 2010. Be aware that in BizTalk 2013 there is a bug that prevents testing maps using Visual Studio!