Integrate 2016

I knew it had been quite some time since my last blog post, but man, more than 9 months! Shame on me.

The main reason I picked up blogging again is that there is so much going on in the integration space. This became especially visible during last week's conference, the largest integration-focused and Microsoft-oriented conference in the world: Integrate 2016 in London, organised by the BizTalk360 team.

There are so many recaps available, like those from Steef-Jan Wiggers, Rob Fox, Eldert Grootenboer, Kent Weare and of course BizTalk360 itself, so I won't go into that. The purpose of this blog post is to add new insights, or rather my insights, based on the sessions and discussions during the conference.

BizTalk Server 2016

First of all, the session on BizTalk Server 2016 (scheduled for RTM in Q4 of 2016). Microsoft clearly states that its on-premise tool for integration is BizTalk Server and that it will continue to invest in it, but the (only!) session about BizTalk 2016 was quite disappointing. While 45-minute time slots were available, the talk took only 32 minutes. True, some demos were shown as part of the keynote, but if there is so much love for BizTalk Server, it shouldn't be a problem to talk for hours about it. The main takeaways from this session:

  • SQL Server AlwaysOn support
  • Platform alignment (Windows Server 2016, Visual Studio 2015, SQL Server 2016, Office 2016)
  • New adapter for interfacing with Logic Apps (available in CTP2), which is cool by the way
  • Lots of customer asks and pain points solved (which ones remain quite unclear, besides “the BizTalk mapper Schema dialog window is now resizable”)
  • Nothing mentioned on ESB

I would have expected to actually see those 'lots of customer asks and resolved pain points', but I guess it still isn't possible to generate a multi-input message map from the map creation wizard……
Sorry for being sarcastic, but this really didn't show a lot of love for the product.

Microsoft Flow

One thing that couldn't be skipped is the recently introduced competitor of Zapier and IFTTT: Microsoft Flow.
This is a lightweight version of Logic Apps, meant for business users. It won't be directly part of the integrator's toolkit, but it contains some easy integrations you should know about. A rule of thumb would be to use Microsoft Flow 'when you can do development in production', meaning no need for source control etc. Although this is very convenient for business users, I fear the management around it as long as there is no tooling available to maintain the endless integrations business users will create. Microsoft said that when Flow goes GA there will be tooling available to maintain, monitor, limit and create blueprints of company flows.

Azure Functions

One topic that really caught my eye is the power of Azure Functions, demonstrated by Chris Anderson. You can do great things with it! It allows for creating (preferably) small logic components exposed as HTTP endpoints, for example functions to convert currency or perform calculations. Besides other features like triggers and schedulers, it makes it easy to create small APIs without having to host them yourself. From that perspective you can look at it as 'API Apps light', and as with Flow I think some best practices are needed for the ALM part of this integration option.
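Conceptually, such a function is just a tiny, stateless handler behind an HTTP endpoint. Here is a minimal Python sketch of the currency-conversion logic such a function would wrap; the rates and names are made up for illustration, and the real thing would of course run inside an Azure Function (C# or Node at the time):

```python
# Hypothetical currency converter -- the kind of small, stateless logic that
# fits an Azure Function behind an HTTP trigger. Rates are hard-coded examples.

RATES = {("EUR", "USD"): 1.12, ("USD", "EUR"): 0.89}

def convert(amount, source, target):
    """Convert 'amount' from 'source' to 'target' using a fixed rate table."""
    if source == target:
        return amount
    return round(amount * RATES[(source, target)], 2)

print(convert(100, "EUR", "USD"))  # -> 112.0
```

The point is that nothing here needs a server of its own: the platform handles hosting, scaling and the HTTP plumbing, and you only supply the few lines in the middle.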

Digital Transformation

This session on the role of the integration expert by Michael Stephenson was outstanding in my view. He clearly showed that we as integration experts can no longer be on an island and need to transform into integration coaches. The times when the 'integration team ruled the company' (or blocked the company!) are behind us, and we need to be aware that other (.NET) developers will do integration work as well. This doesn't mean we're obsolete, but it means we have to adjust, take up the role of integration coaches and take care of governance, as we have great expertise and experience in that area. For complex integrations we'll still be needed, but building API Apps or even Logic Apps will also be done by non-integration people. We have to define the blueprints and govern the integrations in the company.

Nick Hauenstein

I really don't know where to start: with his performance or with his session content. Man, both were awesome! He had a great story about the tools we have, and about looking outside our boundaries because there is a lot of greatness out there. BizTalk isn't the only tool to do integration, so get out of your comfort zone. He demonstrated a solution where he built a BizTalk-like solution (including correlation) in Logic Apps and API Apps. The well-known BizTalk pipeline becomes just another Logic App, where we have full flexibility over the content and are no longer bound to the stages (nor limited to a single component per stage!). You can download the entire solution here. Last but certainly not least, his performance: he was by far the most enthusiastic speaker at the conference. His energy blew me away!

How to ‘solve’ host instance state ‘Stop pending’

Every now and then you run into this, like I did today: you try to stop or restart a host instance but it hangs in the 'Stop pending' state. Nothing seems to help; you cannot do anything in the BizTalk management console nor in the Windows Services console.
Integration MVP Sandro Pereira blogged about this some time ago.

His blog post mentions several solutions, but my favorite one is to just kill the process and start again. This option is mentioned, but how do you figure out which process to kill? They all look alike in the Task Manager. In Windows Server 2012 and up, the processes in the Task Manager can be expanded so the actual host name is displayed, which makes it easy to kill the correct process. In Windows Server 2008, however, this isn't the case, and this post can help you.

The first step is to determine which process id is causing the trouble. You can find this piece of PowerShell in a post from Sandro as well. For completeness' sake I'll show it here:

$machineName = hostname
$query = "Select * from MSBTS_HostInstance where HostType = 1 and ServiceState = 4 and RunningServer = '$machineName'"
$hostInstanceSearch = New-Object System.Management.ManagementObjectSearcher("root\MicrosoftBizTalkServer", $query)
$hostInstanceList = $hostInstanceSearch.Get()
foreach ($hostInstanceItem in $hostInstanceList)
{
    $processName = $hostInstanceItem.HostName
    $perfCounter = New-Object System.Diagnostics.PerformanceCounter("BizTalk:Messaging", "ID Process", $processName)
    $processID = $perfCounter.NextValue()
    Write-Host "HostName: " -ForegroundColor Yellow -NoNewline
    Write-Host $processName
    Write-Host "Process Id: " -ForegroundColor Yellow -NoNewline
    Write-Host $processID
}

The output of this script will be like this:


However, I found that this list is sometimes incomplete: the problematic host instance is missing, so at least we know which processes are not the problem. With the next command, run from a command window, you can get all BizTalk host instance processes on your machine:

tasklist /FI "IMAGENAME eq btsntsvc.exe"

The output is like this:


In this particular example there is no difference between the two outputs, because I cannot reproduce the 'Stop pending' problem on demand. Normally, however, you can compare the two outputs and see which PID is missing from the first list and thus causing the problems.
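Comparing the two lists is easy to automate. A small illustrative Python sketch (not part of the original workflow; the PIDs are made up) that takes the PIDs reported by the WMI script and the PIDs from tasklist and returns the ones only tasklist knows about:

```python
# Illustrative only: find PIDs reported by tasklist but missing from the
# WMI/perf-counter output -- those are the candidates stuck in 'Stop pending'.

def stuck_candidates(wmi_pids, tasklist_pids):
    """Return the PIDs that appear in tasklist but not in the WMI output."""
    return sorted(set(tasklist_pids) - set(wmi_pids))

# Example: WMI sees three healthy host instances, tasklist sees four processes.
wmi = [2124, 3388, 5044]
tasks = [2124, 3388, 4160, 5044]

print(stuck_candidates(wmi, tasks))  # -> [4160]
```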

Finally, use this command from a command window to kill the process:

taskkill /PID 4160

Hello Logic App!

During Integrate 2014 last November in Seattle, Microsoft gave a sneak preview of what today is known as API apps and Logic apps. Back then it was all fuzzy and there was no preview to play with. Right before the BizTalk Summit in London in April, Microsoft released the big news regarding its new App Service platform, combining the existing Web apps (Azure websites) and Notification services with the new API apps and Logic apps.

For us BizTalk developers, the most interesting of all are the Logic apps, although you can't use Logic apps without API apps, because API apps are (among other things) the new adapters to receive and send messages.

I was inspired by this blog post from the BizTalk360 team, which describes how to read messages from one on-premise location and write them to another using Logic apps and hybrid connections. That blog post saved me quite some research, as there are some tricky things to know. At the end of the post I had the Logic app below.


For me that was the first step, but in a real BizTalk scenario there has to be at least one map! My scenario is to read from a local 'inbox' folder on my laptop, transform the message and write it to an 'output' folder on the same laptop.

So I installed the Microsoft Azure BizTalk Services SDK to get mapping functionality in Visual Studio without needing any BizTalk assembly. I created a small schema and a map to be used in the Logic app.


The next step is to have an API app to perform the transform, because everything is an API on this platform. To create a new BizTalk Transform Service, you can also use this Azure documentation as a reference.


Select BizTalk Transform Service and click 'Create'. The Azure portal start page will open and show the API app being created. When the API app is created, click it to open the details. The next step is to add a map to the Transform Service.


Now that the BizTalk Transform API app is ready, we need to put it between the two file connectors in the existing Logic app. This is a challenge in itself, because currently it isn't possible to re-organize the API apps within a Logic app, so just dragging the Transform Service in between the two file connectors is a no-go. I tried adding the Transform Service to the Logic app and then using the code view to re-organize, but it appeared not to be that simple (although the design of the Logic app is described in plain, readable JSON). This will be possible in the future, but for this case I removed the sending file connector, added the Transform Service and then added the file connector again. This results in the Logic app below.


The complexity at the moment is the expressions you need to provide as parameters to the API apps. Currently there is no validation, IntelliSense or syntax highlighting, which complicates development. This will come in the near future; there is demand from the community for it as well.

Logic apps is in fact chaining API apps together, using (part of) the output of one as the input for the next API app. For the BizTalk Transform Service we need to take the output of the first File Connector, which is the body of the message, as input for the map. We can use this expression for that: @triggers().outputs.body.Content

It is pretty easy to read: take the content of the body of the output of the previous API app.

Next, we take the output of the Transform Service and use it in the sending File Connector. To determine the file name for the File Connector we can use: @concat('/Outbox/', triggers().outputs.body.FileName)

The name is appended to the default folder. The content of the message is grabbed from the Transform Service, using a different expression than the one used as input to the Transform Service: @body('transformservice').OutputXml
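Putting the three expressions together, the code view of the Logic app boils down to JSON along these lines. This is a heavily simplified, illustrative sketch: the action names and property names (InputXml, FilePath, Content) are assumptions, the preview schema had considerably more plumbing, and only the expressions themselves are the ones used above.

```json
{
  "actions": {
    "transformservice": {
      "inputs": {
        "InputXml": "@triggers().outputs.body.Content"
      }
    },
    "filesend": {
      "inputs": {
        "FilePath": "@concat('/Outbox/', triggers().outputs.body.FileName)",
        "Content": "@body('transformservice').OutputXml"
      }
    }
  }
}
```

The pattern to notice is that each action's inputs are just expressions over the trigger's or a previous action's outputs — that is all the 'chaining' amounts to.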

Like a typical 'hello world' with BizTalk, I throw this message into the Logic app:

<ns0:SourceSchema xmlns:ns0="">

This is the output and as expected the map has been executed.

<?xml version="1.0" encoding="utf-8"?>
<ns1:TargetSchema xmlns:ns0="" xmlns:ns1="">
<ns1:FullName>Indiana Jones</ns1:FullName>

Hurray, my first Hello Logic app!

It was a very interesting exercise to play with these basic components of the App Service platform. Logic apps are currently certainly not a replacement for BizTalk, and that is also not what they're meant to be. Integration MVP Michael Stephenson has a nice blog post about it.

Currently Logic apps are in preview and a lot still needs to be done before they are mature enough to build enterprise solutions with. For example, the entire ALM story needs to be figured out, and the designer should also be available in Visual Studio, not only in the browser. Microsoft is betting big on this, so it is only a matter of time before these topics are covered.

It is great playing with new stuff!

The status of the Windows Azure Pack

Until now my focus was not really on the private cloud, but I was triggered by the fact that it was mentioned at the Integrate 2014 conference in Seattle, which I attended. During the conference I realized I didn't know a lot about the Azure Pack, while it will become very important: the current on-premise BizTalk version will eventually be replaced by what Microsoft is building at the moment, and the future Azure version of BizTalk (orchestration engine, connectors, BizTalk microservices, etc.) will be deployed on-premise by means of the Azure Pack. By the way, the product team indicated a preview will be released around the BizTalk Summit in London on 13/14 April 2015.

Then the blog post from Sven van den Brande about this topic came along, and that was the trigger to actively take a look at it. I started my 6-year-old server to take the first step: installing Windows Server 2012 R2.

There are quite some blog posts about installing the Azure Pack, like:


I took the 'express route' to quickly install the Azure Pack. This installation is meant for single-server installs, whereas a typical Azure Pack production install requires multiple servers to host the features you find in Azure: IIS for websites, SQL Server for databases, Active Directory for authentication and Hyper-V for VMs.

This blog post in particular I found very helpful for guidance:

I followed this blog post to install everything necessary to run the Azure Pack on a single server, using SQL Server 2014 Express edition. Installing the Azure Pack via the Web Platform Installer is pretty simple. You can check all the screenshots in that blog post; it wouldn't make sense to repeat them here.

After installation and configuration you get the interface below (after a bit of playing around), which is pretty familiar but also far behind in features compared to Microsoft Azure today.

Azure Pack

The documentation is dated October 17, 2013, and the Azure Pack feature installers are from October 21, 2014. Also, for example, Windows Service Bus hasn't been updated since October 2013.

So why would you install the Azure Pack in a production environment at the moment, if it is so far behind in features and doesn't get regular updates?

Asking the question is answering it. I think the private cloud is going to become very important as a replacement for current Windows Server farms, but for now it is not an option. It is fun to play with, but that's about it. I wouldn't advise a customer to use this in production.

My guess is that Microsoft is no longer putting effort into the current Azure Pack, but is building a new Azure Pack for Windows Server vNext, which is scheduled to be released in 2016, after Windows 10. Or even better, Windows Server vNext will be built to support a private cloud. The timeline for this will be aligned with other Azure features like Service Bus, API Management and BizTalk microservices. As with the new BizTalk developments, the private cloud will just be an instance of Microsoft Azure, which results in feature parity between the two.

We have interesting times ahead!

My migration to Exchange Online

As a freelance consultant I have a domain registered, which comes with email and hosting. When I started my company back in 2008 I took the cheapest solution (typically Dutch) and everything went fairly well. Every now and then I had an email outage, but only a couple of times a year and never for long periods of time.
Nonetheless this started to annoy me so much that I decided to move to the cloud: Exchange Online.

This was back in March 2014, just after another outage. I wrote the words 'Exchange Online' as a TODO on the whiteboard in my study. But moving to another email provider is pretty scary, for me at least: you don't want to be left without email for a couple of days (or worse). So I postponed it again and again, until the next outage in November 2014, after which I decided to really move during Christmas.
That after all is a time of limited email traffic, so I felt confident enough to make the journey.

The first step was checking which 'plan' I needed: the $4 or the $8 a month plan. Basically the difference is unlimited storage, but 'Basic' already offers 50 GB per user.

The next step is registering, and I'll spare you the rest of the steps because they're pretty well described in other blog posts and in

When logging in you'll get an Office 365 portal with the typical Outlook items and an Exchange Online admin section. Being a developer, it was an eye-opener to see the number of options you can configure, because this is normally the system engineer's domain. Suddenly you're an Exchange administrator!

The next step is configuring the domain to be used, otherwise you'll end up with <your name>@<your company>, but it was pretty easy: just add some DNS records and you're done.

Then, finally, the actual reason I started this blog post, which obviously isn't a post like my regular ones, but it might be helpful for others. As mentioned, I'm a freelance consultant and I own the domain ''. For tracing reasons I provide a customized email address to every customer, supplier or other contact I need to give an email address. The customization means I put the contact's name in the email address; for example, for LinkedIn I would use 'linkedin(at)'. This way I can set up Outlook rules, but it is also traceable which source used the email address. More than once I found one of my email addresses in a place it shouldn't be, like a spam list. Some time ago I was actually able to inform a Dutch blog of their compromised CMS before they knew it themselves……

When configuring Exchange Online you need a license for every mailbox you create. Since I don't want to create a mailbox for each and every email address, I had configured a 'Catch All' account at my previous provider. I was under the assumption this was common, but it appears Microsoft has a policy not to allow catch-all accounts because they attract spam. While this is a valid reason, it is not very handy for me, so I started searching for a solution because this had to be fixed.

Luckily I found the solution in this blog post by YourITHelp. It describes that you have to tell Exchange Online upfront that it is not authoritative for your domain, which means it shouldn't manage its accounts. By disabling this, it assumes accounts exist in other locations, and Exchange Online just functions as a relay in case it cannot find a recipient.

After that you can configure a rule to catch all email with a recipient 'outside the organization'; this is perfectly described in the blog post from YourITHelp. Although I still had a short fight with the catch-all rule definition (I found out it also caught all of my outgoing email 🙁), I'm very happy with the result, and Exchange Online integrates very well with my Nokia 930, so I found out 🙂
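For reference, the gist of that configuration in Exchange Online PowerShell looks roughly like the snippet below. This is a sketch from memory, not the exact commands from the linked post; the domain and mailbox names are placeholders, so verify the cmdlets and parameters against the Exchange Online documentation before running anything.

```powershell
# Sketch only -- names are placeholders, verify before use.
# 1. Make Exchange Online non-authoritative (internal relay) for the domain,
#    so mail to unknown recipients is not rejected outright.
Set-AcceptedDomain -Identity "" -DomainType InternalRelay

# 2. Redirect mail for the domain to a single catch-all mailbox. Scoping the
#    rule to senders outside the organization avoids catching outgoing mail
#    (the fight I mentioned above).
New-TransportRule -Name "CatchAll" `
    -FromScope NotInOrganization `
    -RecipientDomainIs "" `
    -RedirectMessageTo ""
```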

I’m one step closer to the cloud!

Cheers, Jean-Paul

Integrate 2014 – Impact on Integration Consultants

Like a lot of fellow integration consultants I attended Integrate 2014, the integration summit on the Microsoft campus in Redmond. The event was organized by BizTalk360 and they did a great job.

It was the first time I attended a BizTalk event abroad, but I heard something new was to be announced, so I registered early. Although it is quite a trip for just a couple of days, all the way from Amsterdam to Seattle, it definitely was worth the jet lag.

I’m not going to describe the sessions, because that has already been done, for example in these great posts:

As with many such events, the sessions are interesting, but the most interesting part is meeting new people and discussing with fellow enthusiasts what the impact of the announcements is. Besides the discussions, it is always nice to shake hands with community leaders; sometimes I only knew them from Twitter or blogs.

Although some very interesting new things were announced, we as integration consultants mainly need to work with what’s available today. So in this post I’ll focus on the impact on our day-to-day work and the near future.

My takeaways from Integrate 2014, regarding this topic are mainly that:

  • BizTalk is going to stay around
  • There is a real and major shift towards the cloud
  • It will take some time for Microsoft to be feature complete
  • Microsoft Windows Azure BizTalk Services (MABS) will be discontinued, but migration will be possible

Although the announced changes have a huge impact on the BizTalk community, BizTalk itself will stay around for many years. Microsoft again pointed out its release cadence for BizTalk, with a major release every 2 years and a platform alignment every alternate year. For 2015 a major release is scheduled, but the worrying thing is that Microsoft has not shown a clear picture of what the 'major' enhancements will be. Actually, I think Microsoft will only do platform alignment (make BizTalk ready for the latest Windows/SQL/Visual Studio versions) and maybe add one or two features to improve cloud connectivity or adopt new standards (like happened with REST), but nothing new and innovative will be added to the current BizTalk platform. This in fact means a standstill from an innovation point of view, so don't expect any new investments in, for example, BAM or the workflow engine.

After one of the sessions I had a chat with one of the Program Managers on the BizTalk team, and he explained that all of their 800 developers are working on the new stuff to get it ready for the announced preview. This leaves few resources to work on BizTalk Server 2015 and also makes very clear where Microsoft's focus is at the moment. He also mentioned there will be a shared codebase between BizTalk on-premise and BizTalk in the cloud. Later during Integrate it became clear that this actually means Microsoft will only build for the cloud and make that available on-premise in the form of a private cloud (delivered via the Azure Pack for Windows Server). Looking at it from that perspective, it makes sense that there will be a single codebase and feature parity between on- and off-premise, because they're exactly the same product; the only difference is the datacenter it is deployed in, public cloud or private cloud. One important thing regarding the private cloud though: as mentioned, Microsoft will provide Azure BizTalk via the Azure Pack for Windows Server, but this pack currently doesn't provide what's available in Azure, so a lot of work needs to be done there as well.

This focus on the cloud leads to the conclusion that there will be minimal development on the current on-premise version of BizTalk Server, and that it will become a separate track in the integration space. By that I mean we'll see exactly what happened with ASP and ASP.NET: there will be developers doing 'classic' BizTalk and others doing 'Azure' BizTalk. Although they functionally do the same work, the technology is very different, which also means different design skills will be needed. In the case of Azure, for the first time BizTalk developers will really need to include cost efficiency in their considerations.

I don't think this separation will happen in the next few years, but it will eventually. The preview of Azure BizTalk (which isn't an official term, by the way) is scheduled for Q1 2015. The demos shown at Integrate made me think the first preview will be nice to play with, but far from feature complete. For the Azure Integration Services (which is an official term from the slides), new features will be added in an update cadence of 3 months, starting with the features needed to implement content/context-based routing scenarios. For enterprise solutions with complex orchestrations this will be a different story: Microsoft actually hopes these orchestrations can be broken down into a (large) set of (BizTalk) microservices, but we'll have to see whether that is actually feasible. This is also one of my concerns regarding migration of existing solutions: some solutions will need to be redesigned, because they can't be migrated.

One of the sessions was about cloud integration at Microsoft itself. They are currently investing heavily in moving their integration to MABS, although they know MABS will be discontinued in favor of the new platform. That means they're very confident in the migration path, as Kannan C. Iyer answered to the same question from the audience. As far as I know, here in the Netherlands MABS isn't used very often (if at all?), but in the USA it is used mainly for B2B, where it is a suitable solution given its current capabilities.

To conclude my view on the announcements: these are exciting times for integration consultants. For the first time in about 10 years, things are really going to change for us, and you have to decide for yourself whether you want to stay a 'classic' BizTalk consultant or add 'Azure' BizTalk to your skillset. The classic BizTalk consultant will be around for at least 10-15 years (lots of customers still use BizTalk 2006 or older, and the end-of-life of BizTalk 2013 R2 is in 2023), so there is plenty of time to make up your mind.

Didago IT Consultancy

BizTalk Software Factory v5 for BizTalk 2013R2

If you’re not familiar with the BizTalk Software Factory, please read the documentation on

Every new release of BizTalk requires some changes in the BizTalk Software Factory (BSF), and the 2013 R2 edition is no exception. The most important change is in the Visual Studio version, as the BSF relies on the Guidance Automation Toolkit (GAT) and Extensions (GAX) being available.

Since Microsoft stopped development of the toolkit and extensions, I'm happy to see the community continue it by means of the Open GAT/GAX ( project.

To be able to install the BSF ( you need to install the Visual Studio 2013 SDK and the Open GAT/GAX for Visual Studio 2013 upfront.

The functionality of the BSF hasn't changed, but it is important to know that at this moment there is no version of the BizTalk Deployment Framework (BTDF) that supports Visual Studio 2013. Since the BSF supports the BTDF, that option is currently present but doesn't work. Installing BTDF v5.5 on Visual Studio 2013 does work, but Visual Studio will not contain any of the BTDF functionality.

If you run into issues or would like some additional functionality in the BSF, please let me know.

Didago IT Consultancy

Book review Getting Started With BizTalk Services

Recently I was invited to take a look at the Packt Publishing book 'Getting Started with BizTalk Services' by Karthik Bharathy and Jon Fancey.

This is the first available book on Microsoft Azure BizTalk Services, and for that reason alone an interesting read. I've already played with BizTalk Services, so I was curious to measure my knowledge against the book, all the more because the book assumes no prior BizTalk knowledge. The Microsoft Azure platform is expanding at a tremendous rate, so I was also interested to see how up to date the book is, as it was published in March 2014 (although that's only 3 months ago, Azure features are released quarterly).

To start with the last question: the book is still very up to date. No major changes or new features have been announced that directly impact BizTalk Services. So from that point of view it is still a reliable source of information (besides that WABS is called MABS now 🙂).

The book is organized as follows. It starts with a generic overview of what Azure is and for which scenarios it is useful, and also covers the basics of BizTalk Services.

The next chapter is about the Mapper, which has been seriously improved compared to on-premise BizTalk Server. One thing the book briefly touches on is the fact that the mapper is no longer based on XSLT behind the scenes; it uses XAML. Another is the ability to have some form of exception handling in the map, which has been one of the missing pieces in BizTalk Server integration. For each operation you can specify what to do in case of an exception: fail, or continue and output a null value. Of course this is not real exception handling, but it's better than nothing.

Then there's a chapter about Bridges. I knew a bridge is comparable to a BizTalk pipeline, but I was surprised to read that behind the scenes a bridge uses Windows Workflow Foundation. This is an indication that the announced workflow engine for BizTalk Services will most probably be Workflow Foundation as well.

Other chapters cover topics like

  • EAI scenarios
  • B2B scenarios (EDI with X12/EDIFACT)
  • Using the BizTalk Adapter Service to connect to on-premise LOB systems
  • Using custom code in bridges
  • Maintaining BizTalk Services using the API via PowerShell or REST service
  • Tracking and Troubleshooting
  • Moving current BizTalk Server investments to BizTalk Services (and when not to)

The chapter about B2B in particular is comprehensive. The X12 standard is used quite often in the US, so it makes sense to dive deeper into that part (European customers use EDIFACT). Besides that, the B2B market is the most suitable to move to the cloud first, because it involves integration with other companies, which is typically a cloud scenario.

Everything in the book is described in clear language and doesn't just scratch the surface; some topics are explained in more detail, including some background.

For anyone interested in BizTalk Services who wishes to get up to speed quickly, this is a perfect start.

Didago IT Consultancy

Introducing the BizTalk Port Info Query Tool

Sometimes for a project you have to create a tool which you then plan to share with the community. This tool has been on the shelf for at least a year, and I finally found time to share it with you.

It started when I got involved as a firefighter in a BizTalk project at a customer with a lot of applications and, accordingly, hundreds of receive and send ports. The architecture was fully based on the pub/sub pattern, and it was sometimes very difficult to understand the flow of a message and where it would end up. To get a better view of the inner workings, I decided to create a small tool to answer questions like:

  • Which ports use this specific map?
  • Where is a certain pipeline used?
  • Which ports subscribe to this field?

The tool reads the management database and retrieves the receive port and send port information via the ExplorerOM assembly. Just the port name and so on is not detailed enough, so I added support to retrieve the following information to get a complete view:

  • Receive Port Name
  • Receive Locations
  • Receive Transform
  • Receive Pipelines and Pipeline Components + Configuration
  • Receive Adapter Type and Settings like URL’s
  • Send Port Name
  • Send Transforms
  • Send Pipelines and Pipeline Components + Configuration
  • Send Adapter Types and Settings like URL’s
  • Send Port Filter Subscriptions

This information is retrieved for each of the selected applications. Once it is available, it is easy to search through as well. So search was added, to quickly see which receive or send ports use a certain map, or which send ports are subscribed to a certain field value.
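The search itself is conceptually simple. A tiny Python sketch (illustrative only; the tool's actual code is C# against ExplorerOM, and the application and artifact names below are made up) of scanning a nested port-info structure for a term:

```python
# Illustrative sketch of the tool's search: walk a nested structure of
# application -> port category -> artifact names, yielding matching paths.

def search(node, term, path=""):
    """Recursively yield 'App/Category/Artifact' paths whose leaf contains term."""
    if isinstance(node, dict):
        for key, value in node.items():
            yield from search(value, term, f"{path}/{key}" if path else key)
    else:
        for leaf in node:
            if term.lower() in leaf.lower():
                yield f"{path}/{leaf}"

apps = {
    "OrderApp": {
        "ReceivePorts": ["rp_Orders", "map_Order_to_Invoice"],
        "SendPorts": ["sp_Invoices", "map_Invoice_to_Archive"],
    }
}

print(list(search(apps, "_to_")))
# -> ['OrderApp/ReceivePorts/map_Order_to_Invoice',
#     'OrderApp/SendPorts/map_Invoice_to_Archive']
```

Because the matches carry their full path, the result immediately tells you which application and port a map or subscription belongs to — exactly what the bottom list view in the tool shows.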

It proved to be pretty handy for research purposes. The tool doesn't require an installer; just launch the executable to bring up the WinForms application. The sources and executable are on CodePlex and have been tested with BizTalk 2010 and BizTalk 2013.

If you launch the tool you'll get the screen below, where you can adjust the user id and password used to connect to the management database.


Once you click 'Retrieve BizTalk Applications', the management database will be queried using the specified credentials to retrieve all deployed BizTalk applications. Depending on the number of applications this can take some time.


Each application in the list is prefixed with a checkbox, so you can retrieve information for specific applications only if you wish. When you're done selecting applications, click ‘Retrieve Application Info’ to get the details.


You now see two populated tree view controls on the right-hand side. The top one contains the receive port information and the bottom one the send port information. For each BizTalk application there is a hierarchy containing all the details. If you expand the tree you'll see, for example, information about the receive locations, transforms, pipelines and so on.


The following image shows the details of pipeline components and their settings.


The best part is that you can search through all information in the receive and send port tree views. The search text box scans the trees for a match and displays the results in the bottom list view. The example below shows a search for every instance containing ‘_to_’, and as expected the maps on the send and receive ports show up, including the path to the application they reside in.


Sometimes the information (path) is too long, as with filter subscriptions, which can run to many characters. In that case you can double-click the entry and the information is displayed in a message box.


Finally, in situations where you cannot run the tool yourself, or if you want to have the information ‘offline’, you can use the ‘Export App Info’ button. This saves the tree as a CSV file, which you can open in Excel for example. The export is saved in the folder the executable was launched from.

Since this started as a maximum-use, minimum-UI tool, it can clearly be improved to be more comprehensive and user friendly, but it worked for me. Feel free to take the code and adjust it to your needs!

I hope it helps you guys getting a better understanding of the BizTalk applications you’re faced with 🙂



Didago IT Consultancy

BTDF and “The mapping does not exist” SSO Error

Recently I needed to use SSO in an existing BizTalk solution where we use the BizTalk Deployment Framework.

Adding SSO support is easy, but I ran into the infamous “The mapping does not exist” SSO error. It took me some time to figure out the cause, and by posting it here I hope someone else will benefit from it.

To start with some background: the requirement was a configurable value in a mapping, which would differ between the Dev/Test/Prod environments. So typically something for SSO.

Deploying SSO as part of the BTDF is pretty easy. In the btdfproj file you have to specify: <IncludeSSO>true</IncludeSSO>

Next, you have to make sure to define the SSO security groups in the btdfproj and settings Excel file. You can also define custom values in the settings Excel, like ‘SomeValueFromSettings’:

  <ItemGroup>
    <PropsFromEnvSettings Include="SsoAppUserGroup;SsoAppAdminGroup;SomeValueFromSettings;" />
  </ItemGroup>

Finally you have to add this to the btdfproj:

  <Target Name="CustomSSO" Condition="'$(Configuration)' == 'Server'">
    <UpdateSSOConfigItem BizTalkAppName="$(BizTalkAppName)" SSOItemName="SomeValueToBeUsed" SSOItemValue="$(SomeValueFromSettings)" />
  </Target>

So far so good, this deploys the SSO settings as expected. The interesting thing is how to get these values out of SSO again. The BTDF uses a special technique to store the settings, which means you should only use the provided SSOSettingsFileReader.dll assembly to get values out of SSO.

I used the BizTalk Mapper Extensions Utility Pack and its SSO Config Get functoid. This all seemed to work fine, but when the map is executed at runtime you'll receive the error “The mapping does not exist”.

Although this error has several possible causes, in this case it turned out that the way the SSO values were retrieved was not supported. The extension pack apparently retrieves values from SSO in a different way, which works fine if you deploy the values using, for example, the SSO Configuration Application MMC Snap-In.

Because I wanted to use the BTDF, I changed the functoid to a Scripting Functoid that calls an external assembly method of the SSOSettingsFileReader assembly. After this change it worked right away.
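For reference, the call behind the Scripting Functoid boils down to something like this. SSOSettingsFileReader.ReadString is the BTDF-provided way to read a deployed setting; the application and method names below are placeholders for illustration:

```csharp
// External assembly method called from the Scripting Functoid.
// "MyBizTalkApp" is a placeholder for the affiliate application name
// the BTDF deployed the settings under.
public static string GetSomeValue()
{
    return SSOSettingsFileReader.ReadString("MyBizTalkApp", "SomeValueFromSettings");
}
```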

In the end a pretty simple solution, but isn't that always the case 🙂


Jean-Paul Smit

Didago IT Consultancy
