
Integrate 2016

I knew it had been quite some time since my last blog post, but man, more than 9 months! Shame on me.

The main reason I picked up blogging again is that there is so much going on in the integration space. This became especially visible during last week's conference, the largest Microsoft-oriented integration conference in the world: Integrate 2016 in London, organised by the BizTalk360 team.

There are plenty of recaps available, from Steef-Jan Wiggers, Rob Fox, Eldert Grootenboer, Kent Weare and of course BizTalk360 itself, so I won't repeat those. The purpose of this blog post is to add new insights, or rather my insights, based on the sessions and discussions during the conference.

BizTalk Server 2016

First of all, the session on BizTalk Server 2016 (scheduled for RTM in Q4 of 2016). Microsoft clearly states that BizTalk Server is its on-premises integration tool and that it will continue to invest in it, but the (only!) session about BizTalk 2016 was quite disappointing. While 45-minute time slots were available, the talk took only 32 minutes. True, some demos were shown as part of the keynote, but if there is so much love for BizTalk Server, it shouldn't be a problem to talk for hours about it. Main takeaways from this session:

  • SQL Server AlwaysOn support
  • Platform alignment (Windows Server 2016, Visual Studio 2015, SQL Server 2016, Office 2016)
  • New adapter for interfacing with Logic Apps (available in CTP2), which is cool by the way
  • Lots of customer asks and pain points solved (which ones remain quite unclear, besides “the BizTalk mapper Schema dialog window is now resizable”)
  • Nothing mentioned on ESB

I would have expected to actually get to see those 'lots of customer asks and resolved pain points', but I guess it still isn't possible to generate a multi-input message map from the map creation wizard……
Sorry for being sarcastic, but this really didn't show a lot of love for the product.

Microsoft Flow

One thing that couldn't be skipped is the recently introduced competitor of Zapier and IFTTT: Microsoft Flow.
This is a lightweight version of Logic Apps, meant for business users. It won't be directly part of the integrator's toolkit, but it contains some easy integrations you should know about. A rule of thumb would be to use Microsoft Flow 'when you can do development in production', meaning no need for source control and so on. Although this is very convenient for business users, I fear the management around it as long as there is no tooling available to maintain the endless integrations those business users will create. Microsoft said that when Flow goes GA, tooling will be available to maintain, monitor, limit and create blueprints of company flows.

Azure Functions

One topic that really caught my eye is the power of Azure Functions, which was demonstrated by Chris Anderson. You can do great things with it! It allows you to create (preferably) small logic components exposed as HTTP endpoints, for example functions to convert currency or perform calculations. Besides other things like triggers and schedulers, it is easy to create small APIs with it without having to host them yourself. From that perspective you can look at it as API Apps light, and as with Flow I think some best practices are needed for the ALM part of this integration option.
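To give an idea, a currency-conversion function in the preview-era C# script model (run.csx) could look like the sketch below. The exchange rate, parameter name and messages are all made up for illustration:

```csharp
// run.csx: HTTP-triggered Azure Function (preview-era C# script model).
// Converts ?amount=100 from EUR to USD using a hard-coded, illustrative rate.
using System.Net;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    const decimal EurToUsd = 1.12m; // hypothetical fixed rate

    // Read the 'amount' query string parameter.
    string amountText = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "amount", true) == 0).Value;

    decimal amount;
    if (!decimal.TryParse(amountText, out amount))
        return req.CreateResponse(HttpStatusCode.BadRequest,
            "Please pass an 'amount' query parameter");

    log.Info($"Converting {amount} EUR to USD");
    return req.CreateResponse(HttpStatusCode.OK, amount * EurToUsd);
}
```

Deployed behind the Functions HTTP endpoint, this is a complete, hosted mini-API without any server management, which is exactly the "API Apps light" perspective.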

Digital Transformation

This session on the role of the integration expert by Michael Stephenson was outstanding in my view. He made clear that we as integration experts can no longer sit on an island and need to transform into integration coaches. The times when the 'integration team ruled the company' (or blocked the company!) are behind us, and we need to be aware that other (.NET) developers will do integration work as well. This doesn't mean we're obsolete, but it means we have to adjust: take up the role of integration coach and take care of governance, as we have great expertise and experience in that area. For complex integrations we'll still be needed, but building API Apps or even Logic Apps will also be done by non-integration people. We have to define the blueprints and govern the integrations in the company.

Nick Hauenstein

I really don't know where to start: with his performance or with his session content. Man, both were awesome! He had a great story on the tools we have and on looking outside our boundaries, because there is a lot of greatness out there. BizTalk isn't the only tool to do integration, so get out of your comfort zone. He demonstrated a solution where he built a BizTalk-like solution (including correlation) in Logic Apps and API Apps. The well-known BizTalk pipeline is just another Logic App where we have full flexibility over the content and are no longer bound to the stages (nor limited to a single component per stage!). You can download the entire solution here. Last but certainly not least, his performance: he was by far the most enthusiastic speaker at the conference. His energy blew me away!

The status of the Windows Azure Pack

Until now my focus hadn't really been on the private cloud, but I was triggered by the fact that it was mentioned at the Integrate 2014 conference in Seattle, which I attended. During the conference I realized I didn't know a lot about the Azure Pack, while it will become very important. The current on-premises BizTalk version will eventually be replaced by what Microsoft is building at the moment. The future Azure version of BizTalk (orchestration engine, connectors, BizTalk microservices, etc.) will be deployed on-premises by means of the Azure Pack. By the way, the product team indicated a preview will be released around the BizTalk Summit in London on 13/14 April 2015.

Then the blog post from Sven van den Brande about this topic came along, and that was the trigger to actively take a look at it. I fired up my 6-year-old server to take the first step: installing Windows Server 2012 R2.

There are quite a few blog posts about installing the Azure Pack, like:


I took the 'express' route to quickly install the Azure Pack. This installation is meant for single-server setups, whereas a typical Azure Pack production install requires multiple servers to host the features you find in Azure: IIS for websites, SQL Server for databases, Active Directory for authentication and Hyper-V for VMs.

I found this blog post especially helpful for guidance:

I followed it to install everything necessary to run the Azure Pack on a single server, using SQL Server 2014 Express edition. Installing the Azure Pack via the Web Platform Installer is pretty simple. You can check all the screenshots in that blog post, as it wouldn't make sense to repeat them here.

After installation and configuration you get this interface (after a bit of playing around), which looks pretty familiar but is also far behind in features compared to Microsoft Azure today.

Azure Pack

The documentation is dated October 17, 2013 and the Azure Pack feature installers are from October 21, 2014. Windows Service Bus, for example, also hasn't been updated since October 2013.

So why would you install Azure Pack in a production environment at the moment, if it is so behind in features and it doesn’t get regular updates?

The question answers itself. I think the private cloud is going to become very important as a replacement for current Windows Server farms, but for now it is not an option. It is fun to play with, but that's about it. I wouldn't advise a customer to use this in production.

My guess is that Microsoft is no longer putting effort into the current Azure Pack, but is building a new Azure Pack for Windows Server vNext, which is scheduled for release in 2016, even after Windows 10. Or even better, Windows Server vNext will be built to support a private cloud out of the box. The timeline for this will be aligned with other Azure features like Service Bus, API Management and BizTalk microservices. As with the new BizTalk developments, the private cloud will then just be an instance of Microsoft Azure, which results in feature parity between the two.

We have interesting times ahead!

Introducing the BizTalk Port Info Query Tool

Sometimes for a project you have to create a tool which you plan to share with the community. This tool has been on the shelf for at least a year and I finally found time to share it with you.

It started when I got involved as a firefighter in a BizTalk project at a customer with a lot of applications and hundreds of accompanying receive and send ports. The architecture was fully based on pub/sub, and it was sometimes very difficult to understand the flow of a message and where it would end up. To get a better view of the inner workings I decided to create a small tool to answer questions like:

  • Which ports use this specific map?
  • Where is a certain pipeline used?
  • Which ports subscribe to this field?

The tool reads the management database and retrieves the receive port and send port information via the ExplorerOM assembly. The port name alone is not detailed enough, so I added support for retrieving the following information to get a complete view:

  • Receive Port Name
  • Receive Locations
  • Receive Transform
  • Receive Pipelines and Pipeline Components + Configuration
  • Receive Adapter Type and Settings like URLs
  • Send Port Name
  • Send Transforms
  • Send Pipelines and Pipeline Components + Configuration
  • Send Adapter Types and Settings like URLs
  • Send Port Filter Subscriptions

This information is retrieved for each of the selected applications. Once it is available, it is easy to search through it as well. So search was added, to quickly see which receive or send ports use a certain map, or which send ports are subscribed to a certain field value.
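The retrieval part is a matter of walking the ExplorerOM object model. A minimal sketch of the idea, assuming a default local management database (this is not the actual tool code, and property names follow Microsoft.BizTalk.ExplorerOM as I know it):

```csharp
// Sketch: enumerate receive/send ports per application via ExplorerOM.
// Requires a reference to Microsoft.BizTalk.ExplorerOM.dll and a 32-bit process.
using System;
using Microsoft.BizTalk.ExplorerOM;

class PortInfoSketch
{
    static void Main()
    {
        BtsCatalogExplorer catalog = new BtsCatalogExplorer();
        catalog.ConnectionString =
            "Server=.;Initial Catalog=BizTalkMgmtDb;Integrated Security=SSPI;";

        foreach (Application app in catalog.Applications)
        {
            Console.WriteLine("Application: " + app.Name);

            foreach (ReceivePort rp in app.ReceivePorts)
            {
                Console.WriteLine("  Receive port: " + rp.Name);
                foreach (ReceiveLocation rl in rp.ReceiveLocations)
                    Console.WriteLine("    Location: " + rl.Name +
                                      " (" + rl.Address + ")");
            }

            foreach (SendPort sp in app.SendPorts)
            {
                // sp.Filter holds the subscription filter as an XML string,
                // which is what the field-subscription search digs through.
                Console.WriteLine("  Send port: " + sp.Name);
            }
        }
    }
}
```

With all ports, locations, transforms and filters loaded into memory like this, searching is just a matter of string matching over the collected values.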

It proved to be pretty handy for research purposes. The tool doesn't require an installer; just launch the executable to bring up the WinForms application. The sources and executable are on CodePlex and have been tested with BizTalk 2010 and BizTalk 2013.

If you launch the tool you’ll get the screen below where you can adjust the user id and password to connect to the management database.


Once you click 'Retrieve BizTalk Applications', the management database is queried using the specified credentials to retrieve all deployed BizTalk applications. Depending on the number of applications this can take some time.


The list of applications is prefixed with checkboxes, so you can retrieve the information for specific applications only if you wish. When you're done selecting applications, click 'Retrieve Application Info' to get the details.


You now see two populated tree view controls on the right-hand side. The top one contains the receive port information and the bottom one the send port information. Per BizTalk application there is a hierarchy containing all the details. If you expand the tree you'll see, for example, information about receive locations, transforms, pipelines and so on.


The following image shows the details of pipeline components and their settings.


The best part is that you can search through all information in the receive and send port tree views. The search text box scans the trees for a match and displays the results in the bottom list view. The example below shows a search for every instance containing '_to_', and as expected the maps on the send and receive ports show up, including the path to the applications they reside in.


Sometimes the information (path) is too long, like with filter subscriptions which can take up many characters. In that case you can double-click the entry and the information is displayed in a message box.


Finally, in situations where you cannot run the tool yourself, or if you want to have the information 'offline', you can use the 'Export App Info' button. This saves the tree as a CSV file, which you can open in Excel, for example. The export is saved in the folder the executable was launched from.

Since this started as a tool-with-maximum-use-minimum-UI it is clear the tool can be improved a lot to be more comprehensive and user friendly, but it worked for me. Feel free to take the code and adjust it to your needs!

I hope it helps you guys getting a better understanding of the BizTalk applications you’re faced with 🙂



Didago IT Consultancy

BTDF and “The mapping does not exist” SSO Error

Recently I needed to use SSO in an existing BizTalk solution where we use the BizTalk Deployment Framework.

Adding SSO support is so easy, but I ran into the famous “The mapping does not exist” SSO error. It took me some time to figure out what the cause was, and by posting it here I hope someone will benefit from it.

To start with some background: the requirement was a configurable value in a mapping, which would differ between Dev/Test/Prod environments. So typically something for SSO.

Deploying SSO as part of the BTDF is pretty easy. In the btdfproj file you have to specify: <IncludeSSO>true</IncludeSSO>

Next, you have to make sure to define the SSO security groups in the btdfproj and in the settings Excel file. You can also define custom values in the settings Excel, like 'SomeValueFromSettings':

<ItemGroup>
  <PropsFromEnvSettings Include="SsoAppUserGroup;SsoAppAdminGroup;SomeValueFromSettings;" />
</ItemGroup>

Finally you have to add this target to the btdfproj:

<Target Name="CustomSSO" Condition="'$(Configuration)' == 'Server'">
  <UpdateSSOConfigItem BizTalkAppName="$(BizTalkAppName)" SSOItemName="SomeValueToBeUsed" SSOItemValue="$(SomeValueFromSettings)" />
</Target>

So far so good, this deploys the SSO settings as expected. The interesting thing is how to get these values out of SSO again. The BTDF uses a special technique to store the settings, which means you should only use the provided SSOSettingsFileReader.dll assembly to get values out of SSO.

I used the BizTalk Mapper Extensions Utility Pack with its SSO Config Get functoid. This all seems to work fine, but when the map is executed at runtime you'll receive the error "The mapping does not exist".

Although more reasons exist for this error, in this case it turned out the way of retrieving the SSO values was not supported. The extension pack obviously uses a different way to retrieve values from SSO, which works fine if you deploy values to SSO using, for example, the SSO Configuration Application MMC Snap-In.

Because I wanted to use the BTDF, I changed the functoid to a Scripting Functoid which calls a method of the external SSOSettingsFileReader assembly. After this change it worked right away.
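For reference, the scripting functoid body boiled down to something like the sketch below. The application name is hypothetical, and the exact SSOSettingsFileReader method names may differ per BTDF version:

```csharp
// Inline C# for a Scripting Functoid (or point an "External Assembly" script
// at SSOSettingsFileManager.dll directly). A minimal sketch, not verified
// against every BTDF release.
public string GetSomeValue()
{
    // The first argument must match the BizTalk application name used at
    // deploy time; the second is the setting defined in the settings Excel.
    return SSOSettingsFileManager.SSOSettingsFileReader.ReadString(
        "MyBizTalkApplication",    // hypothetical application name
        "SomeValueFromSettings");
}
```

Because this goes through the same reader the BTDF uses for deployment, the values come back in the format the BTDF stored them in, which is exactly what the SSO Config Get functoid could not handle.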

After all a pretty simple solution, but isn’t that always the case 🙂


Jean-Paul Smit

Didago IT Consultancy

Installing and Configuring ESB Toolkit 2.2

With the announcement of BizTalk Server 2013 Beta Microsoft also announced the next minor version of the ESB Toolkit: v2.2

My experience with the previous versions of the ESB Toolkit was that the installation procedure was complex, which prevented people from using it. Microsoft had already announced that the installation of the ESB Toolkit would be simplified, and with this blog post I would like to see what has changed. By the way, the procedure is documented here.

If you start the setup of BizTalk, the bottom option allows for installation of the ESB Toolkit 2.2.

Also in the ESB Toolkit we have to accept the license agreement.

Next we find the first new screen, which shows that Microsoft keeps its promise. The installation of the ESB Toolkit is very easy, just a matter of selecting the components you need.

Next the regular summary screen before installation starts.

And as always we end with the progress and result screens.

Plain and simple! The components have been installed in C:\Program Files (x86)\Microsoft BizTalk ESB Toolkit

That's all that needs to be done to install the ESB Toolkit on the system, but if you open the BizTalk Administration Console you'll see nothing has been installed in BizTalk yet.

Installing into BizTalk is done via configuration. Different from BizTalk itself is that it's not possible to start the configuration from the 'Installation Completed' screen; the tool needs to be started separately. Once started we see some familiar parts as well as something new, but first a status check is performed.

Then the configuration screen as we know it is started, although I got an error, which I could bypass by clicking 'Continue'. I haven't read anything about this error in other blog posts, so I'm not sure what's causing it.

This initial configuration screen doesn't seem very different from v2.1, but if you take a closer look you'll notice an additional configuration option at the bottom: "ESB BizTalk Applications".

This is the most interesting part, because the option to enable components will install the ESB application in BizTalk. If you enable it and apply the configuration, the core components get installed and the BizTalk Administration Console shows that a new application has been added.

Besides the application, the ESB also needs policies in the Business Rules Engine, and these have been added as well.

By the way, it is clear this is a beta, because the BRE version is already changed to the BizTalk Server 2013 version but the picture still shows BizTalk Server 2010.

So the installation and configuration procedure is extremely simplified, which is a great advantage over the previous versions of the ESB Toolkit. This won’t be a showstopper anymore.

But there is also a disappointment: the ESB Toolkit 2.2 still uses Enterprise Library 4.1, where I would have expected Microsoft to upgrade it at least to the latest version, 5.0, which has been around since April 2010. The components used from Enterprise Library are "Microsoft.Practices.EnterpriseLibrary.Common" and "Microsoft.Practices.EnterpriseLibrary.Caching". As mentioned in a lot of blog posts (here, here and here for example), this will still be a problem when developers use Enterprise Library 5.0 on their systems for other applications while the ESB Toolkit uses 4.1.

Didago IT Consultancy

Installing and Configuring BizTalk Server 2013 beta

This week Microsoft released the first public downloadable beta of BizTalk Server 2013, which is scheduled for release in H1 2013. Curious about changes in the installation procedure, I decided to grab a Windows Server 2012 VHD, Visual Studio 2012 and SQL Server 2012 to create a base setup for the BizTalk Server 2013 beta. I don't expect much change in the installation and configuration procedure, because it hasn't really changed in the past 6 years and nothing shocking has changed in BizTalk itself.

If you unzip the downloaded BizTalk Server 2013 beta bits, you can start the installation by running the setup.

I picked “Install BizTalk Server 2013 Beta”, and the consumer information screen is shown.

Next the license agreement, which we accept of course. 🙂

The following screen asks if we want to participate in the customer experience improvement program, which we do. The more people use the beta, the more bugs and issues are found.

Next the components we would like to install, where we select almost everything.

Finally a question about where the installer should get the prerequisites.

Then the summary and the installation process can start.

First the redistributable components.

Then BizTalk Server 2013 beta itself.

After this a new screen appears, although it could also be a one-time screen to enable Microsoft Update; at least for me it popped up after installing BizTalk Server 2013 beta.

The strange thing is that this step isn't mentioned in the installation guide, and I'm not sure what it means. I can hardly imagine Microsoft Update auto-updating BizTalk.

After this screen, the installation is finished.

So far nothing really new in the procedure, as expected.

Next, the configuration of the BizTalk Server 2013 beta installation. As with the install procedure, I don't expect a lot of change compared to the configuration procedure of the past 6 years.

I created a BizTalk service account to be used by the configuration wizard and chose 'Basic configuration'.

The next screen is also familiar.

As well as the configuration process itself.

Including the SSO warning not to use the admin account.

Now that BizTalk Server 2013 beta is installed and configured, we need to perform some standard post-installation steps, as mentioned here. Next we can take a look at the new features, which will be the subject of a future blog post.

One final interesting thing is the new BizTalk product version:

With this number all kinds of tools can be updated to recognize the latest offspring.

Didago IT Consultancy

Don’t use the BizTalk 2010 Assembly Checker (BTSAssemblyChecker.exe)!

I'm working on some system engineering documentation for my current customer, and of course I'm using the BizTalk Server 2010 Operations Guide as a reference. One of the tools mentioned there, in "Checklist: Performing Monthly Maintenance Checks", is the BTSAssemblyChecker tool.

One of the steps in the checklist is this one:

“Ensure that the correct version of a set of assemblies is installed on each BizTalk machine (integrity check).”

“Use the BizTalk Assembly Checker and Remote GAC tool (BTSAssemblyChecker.exe) to check the versions of assemblies deployed to the BizTalk Management database and to verify that they are correctly registered in the GAC on all BizTalk Server computers. You can use this tool to verify that all the assemblies containing the artifacts of a certain BizTalk application are installed on all BizTalk nodes. The tool is particularly useful in conjunction with a solid versioning strategy to verify that the correct version of a set of assemblies is installed on each BizTalk machine, especially when side-by-side deployment approach is used. The tool is available with the BizTalk Server 2010 installation media at Support\Tools\x86\BTSAssemblyChecker.exe.”

If you use the tool, it looks in the wrong GAC (the old .NET 2.0 GAC instead of the .NET 4.0 GAC that BizTalk 2010 assemblies are deployed to, as mentioned in this MSDN forum thread) and is therefore pretty much useless, since the whole job of the tool is to verify that the assemblies deployed in the management database are also in the GAC. Besides that, the readme documentation still mentions BizTalk 2006 in several places!

Why would Microsoft ship a tool that doesn't work, with outdated documentation?

The risk of having assemblies in the management database but not in the GAC still exists, but I’m not aware of an alternative tool to check this.
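In the meantime, a crude manual spot check is possible with a few lines of .NET: ask the runtime to load each assembly full name that the management database reports. This is a sketch, and the assembly full name below is hypothetical; on a BizTalk 2010 machine a .NET 4.0 console app resolves strong-named assemblies against the .NET 4.0 GAC:

```csharp
// Spot check: does a deployed assembly actually resolve from the GAC on this node?
using System;
using System.IO;
using System.Reflection;

class GacSpotCheck
{
    static void Main()
    {
        // Hypothetical full name; in practice, take these from the
        // management database (e.g. the adm_Assembly table or ExplorerOM).
        string fullName =
            "MyCompany.MyBizTalkApp.Schemas, Version=1.0.0.0, " +
            "Culture=neutral, PublicKeyToken=abc123abc123abc1";
        try
        {
            Assembly asm = Assembly.Load(fullName);
            Console.WriteLine("Resolved: " + asm.Location);
        }
        catch (FileNotFoundException)
        {
            // Deployed in the management database but not GAC'ed here.
            Console.WriteLine("NOT found: " + fullName);
        }
    }
}
```

Run from a folder without local copies of the assemblies, a FileNotFoundException is a strong hint the assembly is missing from the GAC on that machine.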



Didago IT Consultancy

Interesting rendez-vous with NDepend

This story starts in December 2008, but I found out only last June…..

Back in 2008 Patrick Smacchia of NDepend contacted me via my blog to ask if I was interested in testing NDepend, but unfortunately I wasn't notified of his message. Almost 4 years later (!) I came across his message by accident and decided to contact him; the offer still stood 🙂.

I already knew NDepend from its early releases. Back then I ran into the famous "File or assembly name or one of its dependencies not found" exception, and it was sometimes a challenge to find out which dependency was missing, especially because those dependencies might have missing dependencies of their own, and so on. A first look at the current NDepend made me happy, because the tool has evolved tremendously.

One of the reasons I started developing the BizTalk Software Factory is the fact that I think developers should be assisted in their work to build consistent, reliable and maintainable software. While looking at the features of NDepend I got enthusiastic about that part of the capabilities of NDepend.

This blog post is a review of some of the capabilities of NDepend. It is capable of much much more, so I’d like to refer to the website for additional information, documentation and instruction videos.

From the list of NDepend features, I'd like to focus on Code Quality, Code Query, Code Metrics, Architecture Exploration, Dependency Cycle Detection, Complexity and Diagrams.

If you're involved in BizTalk development you probably know that tools like FxCop (now called Visual Studio Code Analysis) and Visual Studio Code Coverage are almost meaningless there, because most of the code is generated. But that doesn't apply to custom components like libraries, helper components, custom pipelines, custom functoids, etc. It is still valuable to have tools that check the quality of your code and detect unwanted dependencies. And of course, the more custom-written code you have, the more valuable such tools become.

So let's start with some of the features I like most from a code quality perspective. For this blog I use NDepend v4.0.2 Professional, which I installed as an add-in for Visual Studio 2010. As test code I took the open source Orchard CMS, which I use for my website.

In Visual Studio a new NDepend menu option shows up. With the option “Attach a new NDepend Project to VS Solution” you can add NDepend analysis to a project and it starts analyzing right away (if you leave the checkbox checked).

In the lower right corner of Visual Studio a circle-shaped indicator is visible after installing NDepend, and after analyzing Orchard CMS the indicator turns red. The results of the analysis are in the image below. Clicking the indicator displays the "Queries and Rules Explorer", which shows which rules are actually violated. This is a nice and quick overview of the violations.


In this example the "Methods too complex" violation is clicked, resulting in the details being displayed in the "Queries and Rules Edit" dialog. By hovering over method "indexPOST", a small dialog appears on the right side of the explorer with details about the file and code metrics. Double-clicking "indexPOST" takes you directly to the actual code. It is so easy to navigate to problem locations this way.

Because so many rules are checked, it is also valuable to simply browse the violations to learn from best practices. Besides the built-in rules, custom queries over code are possible as well. All queries use a LINQ dialect called CQLinq to retrieve information. All code metrics are written as LINQ queries, so it is easy to add your own (or take advantage of queries written by others).


Before NDepend I would have had no idea how to find the methods in my code with more than 6 parameters, more than 10 declared variables or more than 50 lines of code, or any combination of those. With this LINQ syntax it is a piece of cake.

For example, a query to find all public methods where the number of variables is greater than 40(!):

from m in Methods where m.NbVariables > 40 && m.IsPublic select m
It is amazing to see what’s possible by just trying, assisted by intellisense!
Another really (and I mean really, really) awesome feature is the Metric View, which is a visual representation of the code by module and size.

The size of every rectangle indicates the size of the method.

A second view uses LINQ queries again to visualize, in this case, the top 50 methods by number of lines of code. By visualizing this in the Metric View it becomes immediately clear where problems are located. Of course, other metrics can be visualized in the same way.
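Such a view is driven by a query along these lines. This is a CQLinq sketch, assuming NDepend's NbLinesOfCode metric (which can be null for methods without a body):

```csharp
// CQLinq: the 50 largest methods by lines of code, biggest first.
(from m in Methods
 where m.NbLinesOfCode != null
 orderby m.NbLinesOfCode descending
 select m).Take(50)
```

Swap NbLinesOfCode for another metric (cyclomatic complexity, number of parameters, and so on) and the same Metric View highlights a different hot spot.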

I'd like to end with another nice feature: the dependency graph and dependency matrix. The matrix shows an overview of the dependencies modules have on each other. Not only dependencies between project components, but also dependencies on .NET system libraries are shown, which gives you insight into which assemblies are actually used in your project.

This view will definitely give you a better understanding of your code, and might raise an eyebrow if you find out you're depending on an assembly you're not supposed to depend on. Especially if you're responsible for keeping the architecture clean in a large development team, you need to verify that the code doesn't violate that architecture.

This tool is so comprehensive that I couldn’t possibly cover all features, but I’ve touched the ones that I’m most interested in. I’m really impressed by NDepend and I’m sure I’ll use it in upcoming projects. Being able to query the quality of your code is just awesome!


Didago IT Consultancy

SSO Config Cmd Tool

Every experienced BizTalk developer knows that the SSO store is a good place to safely store configuration information. It is secure, distributed and can be updated without having to dive into configuration files (though unfortunately the host instances have to be restarted before a change is picked up). Accessing the SSO store used to be difficult, until Richard Seroter wrote a tool for it.

In 2009 Microsoft acknowledged the problems and introduced an MMC snap-in to do basically the same thing, but with a nice user interface. However, so far no command-line version appears to be available to support automated deployment. Although the Microsoft package contains MSBuild support, I couldn't find a command-line version, so I decided to create one myself.

The tool is no rocket science; most of the code is borrowed from the code the MMC snap-in executes to perform its tasks (long live ILSpy).

The tool supports importing, exporting, deleting, retrieving details and listing SSO applications.

With the MMC snap-in there are now two SSO tools available, because the good old ENTSSO Administration tool is also still present. The weird thing is that the ENTSSO Administration tool displays more SSO applications than the MMC snap-in does. The reason is the strange fact that the MMC snap-in only retrieves SSO applications where the contact information equals "BizTalkAdmin@<company_specified_in_sso>.com". This is not always an issue, because creating an SSO application automatically sets "BizTalkAdmin@<yourcompany>.com" as the contact email address. It will become an issue if you have to change the email address, for example because you don't have a .com domain…..

Filtering still serves a purpose, because there are some default SSO applications that shouldn't show up in the tool, like adapter configuration settings. In the SSO Config Cmd Tool this potential problem is solved by taking the opposite approach: excluding the contact email addresses "" and "", because they are related to the built-in SSO applications.

Let’s take a look at the tool and go over the actions one by one.

If you open the SSO snap-in, you’ll get this user interface. For demo purposes I already created a ‘TestApp2’ with two keys.

If you run the SSOConfigCmdTool without arguments, you’ll get help about the usage of the tool.

To list all SSO applications, where the contact info is not “” or “”:

To retrieve the details of this ‘TestApp2’ application (as you can see the tool is case insensitive):

To export an SSO application, you need an encryption/decryption key and of course a file to export to. If you omit the extension, '.sso' is appended automatically. After the export, a file is present at the specified location.

To be able to demonstrate the import, first the delete:

After the delete, as expected, the SSO snap-in no longer displays the SSO application:

Next we run the import to reinstall the SSO application. The nice thing is that you can supply a different name for it (here 'ImportedTestApp' is used).

Now we run into the issue mentioned above. The SSO snap-in doesn't display the newly created application, while the good old ENTSSO Administration tool does. This is caused by the filter I discussed earlier. The SSOConfigCmdTool doesn't know your SSO company name, so it imports (creates) the SSO application with contact information '', which is filtered out by the SSO snap-in.

How to solve this?

One option is not to solve it at all, because the SSO application is actually there, despite the fact that it isn't displayed.

Another option is to go into the ENTSSO Administration tool and change the contact info to “BizTalkAdmin@&lt;company_specified_in_sso&gt;.com”, but that would involve manual intervention.

The easiest and most sustainable way is to grab the source code of the tool and change “YourCompany” to &lt;company_specified_in_sso&gt;, which is in fact just a single line of code in the Program class.

Just compile it after the change and you’re good to go.

Anyhow, after setting the contact information correctly, the imported SSO application shows up.


The executable and the source code of the tool can be downloaded here at CodePlex.

If you run into an issue, please let me know so I can improve the tool. And of course I’m curious if this tool already existed…..


Didago IT Consultancy

Deploying WCF Services using Web Deploy

More than two years ago I wrote a blog post about deploying WCF services using a Web Setup project. Back then it wasn’t an easy task to get services deployed that way. Especially the UI customization of the installation wizard dialogs gave me headaches, all the more because they were difficult to test and debug.

As also mentioned in the comments of that post, we now have Web Deploy! So I decided it was about time to rewrite that blog post with the same requirements, but this time using Web Deploy. Let’s see how much easier it has become.

A small revisit of the requirements as formulated in the other blog post:

  1. support deployment of the WCF artifacts
  2. be able to be deployed to whatever web site is available on the server
  3. be able to set ASP.NET security to NTLM
  4. be able to gather service settings during setup
  5. be able to change custom AppSettings in the web.config according to the settings

Web Deploy is available for Visual Studio 2005/2008, but I’ll be focusing on Visual Studio 2010. The approaches of Web Deploy and Web Setup are different. With Web Setup the output was an MSI which could be run by a system administrator. The Web Deploy approach is focused on IIS as the host of the web application or service. This means the output is no longer an MSI but a (ZIP) package that can be imported in IIS or IIS Express, or installed using the MSDeploy command-line tool.
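As a taste of that command-line route, a minimal sketch of pushing a package with MSDeploy. The package name and server URL are placeholders, and the command is built as a string for illustration:

```shell
# Sketch: deploying a Web Deploy package with the MSDeploy command-line tool.
# Package path and server URL are placeholders.
PACKAGE="MyWcfService.zip"
SERVER="https://targetserver:8172/msdeploy.axd"

# -verb:sync        synchronize the package contents with the destination
# -source:package   the ZIP produced by 'Build Deployment Package'
# -dest:auto        resolve the destination from the parameters inside the package
CMD="msdeploy.exe -verb:sync -source:package=\"$PACKAGE\" -dest:auto,computerName=\"$SERVER\""
echo "$CMD"
```

The `-dest:auto` part is what makes the package portable: the actual target is resolved from the parameters file at install time, as we’ll see later on.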

What are the typical steps to take to deploy a web application or WCF service using Web Deploy from Visual Studio?

First, select the ‘Publish’ option on the project context menu.

A profile page will show up in which you can specify which publish method to use. The options are:

  • Web Deploy
  • FTP (File Transfer Protocol)
  • File system
  • FPSE (FrontPage Server Extensions)

Also define a profile name, the server to deploy to and the name of the web application or WCF service. Next click ‘Publish’ and the artifacts are deployed to the target IIS server and site/application.

This covers requirement 1: support deployment of the WCF artifacts

As a side step: if you want to deploy to a remote server, you might run into a few challenges. Web Deploy is a client/server installation, so you need to set up your (IIS) server correctly for it to receive the deployment request. If you want to know more, take a look at these posts:

Requirement 2 is: be able to be deployed to whatever web site is available on the server

In the previous step the target web site was defined in Visual Studio, but a system administrator should be able to change it dynamically when running the package on, for example, a production environment.

First the package must become portable so it can be installed on a different server, and the next step is enabling the system administrator to pick the website to deploy the service to.

To create a package there are two options:

  • Visual Studio
  • IIS (use the Export Application functionality, which is a topic by itself and not covered here)

In Visual Studio, choose ‘Package/Publish Settings’ from the project context menu to define what the package looks like; ‘Build Deployment Package’ then actually builds it.

Important settings here are the location of the package, which is a ZIP file, and the target location of the WCF service on the IIS server. Next select ‘Build Deployment Package’ to create it.

Now the package has been built, it is interesting to see what’s in it. Browse to the folder specified in the ‘Package/Publish Settings’ to find the files below.

The package consists of:

  • CMD file
    • This batch file can be used to deploy the package to a server without having to access IIS via the administration console. It uses the command-line tool MSDeploy to deploy the package using Web Deploy, which makes it suitable for unattended and/or bulk installations.
  • Readme file
    • Interesting file describing all options for the CMD file. One very interesting one is the /T option, which calls msdeploy.exe with the “-whatif” flag, which simulates deployment. This does not deploy the package. Instead, it creates a report of what will happen when you actually deploy the package. Very useful for testing purposes!
  • Parameter file
    • This file contains the parameters defined for this service, which are extracted from the project file (DeployIisAppPath element) during packaging. By default it contains only one item, see below. Here we immediately notice that this is the value we need to cover requirement 2.
  • Manifest file
    • This file is not used during deployment, but during generation of the package. It contains provider settings and, as you can see below, some security-related entries for the IisApp and setAcl providers.
  • ZIP package
    • Contains the service to deploy, including configuration, binaries and other artifacts. If you open the ZIP you’ll see all kinds of configuration files that are used during import of the application

Now let’s see how to install the package on a different server. First open IIS and select the web site you want to import the web application or WCF service into.

It is out of scope to describe all the wizard dialogs, but at a certain point you’re asked to specify the application:

Here you can define the destination application, and if it doesn’t exist, it will be created for you. This is out-of-the-box functionality, and nothing different from the Web Setup approach. Anyhow, this is what we need to fulfill requirement 2.
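For the unattended route, the same website choice can be made on the command line by overriding the ‘IIS Web Application Name’ parameter from the parameters file. Package and application names below are assumptions, and the command is built as a string for illustration:

```shell
# Sketch: pick the target site/application at install time via -setParam,
# instead of clicking through the IIS import wizard.
PACKAGE="MyWcfService.zip"
TARGET_APP="Default Web Site/MyWcfService"

# "IIS Web Application Name" is the default parameter Web Deploy generates
# from the DeployIisAppPath element in the project file.
CMD="msdeploy.exe -verb:sync -source:package=\"$PACKAGE\" -dest:auto -setParam:name=\"IIS Web Application Name\",value=\"$TARGET_APP\""
echo "$CMD"
```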

Requirement 3 is: be able to set ASP.NET security to NTLM

By default the security is set to anonymous access, so we need to change that to windows authentication.

At first this didn’t seem like a big deal, because I expected it to be some parameter or manifest setting, but the more I read on the internet the more I got worried. There is a lot to find about Web Deploy and Windows Authentication, but that mostly relates to using Windows Authentication to connect to the Web Deploy server (up to adding keys to the registry), not to changing the authentication of a deployed WCF service or web application.

So far I haven’t found an easy way to change the authentication of a deployed WCF service or web application. What I did find was a thread in the IIS Forums about specifying the authentication in the web.config, but for that to work the applicationHost.config must be changed to allow overrides. That config file is maintained by the system administrator, and I can understand that the system administrator should know about deployment scripts that change the authentication. On the other hand, having a WCF service with Windows Authentication is not that uncommon.

Another option mentioned in this thread refers to running a command file, which can be added to the manifest file. This command file will probably contain some VBScript to change the authentication after the package has been installed. The difficult thing here is knowing in which web site the WCF service has been deployed…
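One workaround along those lines that I can think of (untested here, and it assumes you know where the package ended up) would be to flip the authentication with appcmd after the import. The application path is an assumption, and the commands are built as strings for illustration:

```shell
# Sketch: switch a deployed application from anonymous access to Windows
# authentication using appcmd.exe. The application path is an assumption;
# adjust it to wherever the package was actually imported.
APP_PATH="Default Web Site/MyWcfService"
APPCMD='%windir%\System32\inetsrv\appcmd.exe'

# Turn off anonymous access for the application...
DISABLE_ANON="$APPCMD set config \"$APP_PATH\" -section:system.webServer/security/authentication/anonymousAuthentication /enabled:false -commit:apphost"

# ...and turn on Windows authentication (NTLM) instead.
ENABLE_WINAUTH="$APPCMD set config \"$APP_PATH\" -section:system.webServer/security/authentication/windowsAuthentication /enabled:true -commit:apphost"

echo "$DISABLE_ANON"
echo "$ENABLE_WINAUTH"
```

Note that `-commit:apphost` writes to applicationHost.config, so this still requires the deployment account to have administrative rights on the server.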

I really hope I overlooked something, so if somebody knows how to do this please leave a comment! Thanks in advance!

Requirement 4 and 5 will be discussed combined: “be able to gather service settings during setup” and “be able to change custom AppSettings in the web.config according to the settings”

As we’ve seen before, there is a parameters file. It can be customized from within Visual Studio to change the UI the system administrator will see. For this example I’ll demonstrate what it takes to change the value of a custom AppSetting in the web.config, from within the IIS UI. The value to change is a key named ‘Test’ in the AppSettings section of the web.config.

First, add an XML file to your project called “Parameters.xml”; this will contain the parameters the UI will show. This is a different file from the ‘SetParameters.xml’ file we saw earlier: that one is used with the command-line option, this one with the IIS import.

Next add this piece of XML. By the way, this is a good resource to learn more about parameters in Web Deploy.

This piece describes two things:

  • It defines a parameter with a description to show up in the UI and a default value which will be filled in.
  • It defines where the value, read from the UI, should be applied. In this case it is targeted at the web.config file and the XPath points to the AppSettings key ‘Test’.
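The two things above could look like this in Parameters.xml; the parameter name, description and default value are illustrative, only the appSettings key ‘Test’ comes from the example:

```xml
<!-- Parameters.xml: declares a parameter shown in the IIS import wizard and
     applies its value to the 'Test' appSetting in web.config. -->
<parameters>
  <parameter name="Test AppSetting"
             description="Value for the 'Test' key in the appSettings section"
             defaultValue="SomeDefaultValue">
    <!-- Where to apply the entered value: an XPath into web.config -->
    <parameterEntry kind="XmlFile"
                    scope="\\web.config$"
                    match="/configuration/appSettings/add[@key='Test']/@value" />
  </parameter>
</parameters>
```

The `scope` attribute is a regular expression over file paths inside the package, so the same parameter could target multiple config files if needed.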

Building a package from this and importing it in IIS results in the following UI:

As you’ll notice, the name and description are displayed nicely, together with the default value. When the value is changed, it is written to the web.config once the wizard finishes.

You can imagine the possibilities of this approach. It is very easy to change the UI and set values for example in the web.config without having to code it.

So, did it actually become easier to deploy WCF services? Yes and no. Obviously it is much easier to customize the UI and perform all kinds of actions while deploying, but on the other hand it seems very difficult to perform a common task like changing the authentication of a web site. As mentioned before, I hope someone will leave a comment with a clarification on the authentication issue.

Anyhow, there is much more you can do with Web Deploy than is covered in this blog post. The resources below are a useful way to spend some spare time.

Didago IT Consultancy
