Hello Logic App!

During INTEGRATE 2014 last November in Seattle, Microsoft gave a sneak preview of what today is known as API apps and Logic apps. Back then it was all fuzzy and there was no preview to play with. Right before the BizTalk Summit in London in April, Microsoft released the big news about their new App Service platform, which combines the existing Web apps (Azure websites) and Notification services with the new API apps and Logic apps.

For us BizTalk developers the most interesting of all are the Logic apps, although you can't use Logic apps without API apps, because API apps (also) serve as the new adapters to receive and send messages.

I was inspired by this blog post from the BizTalk360 team, which describes how to read messages from one on-premises location and write them to another using Logic apps and hybrid connections. That post saved me quite some research, as there are some tricky things to know. At the end of the exercise I had the Logic app below.

FileConnectorBasic

For me that was the first step, but a real BizTalk scenario has to contain at least one map! My scenario is to read from an 'inbox' folder on my local laptop, transform the message, and write it to an 'output' folder on the same laptop.

So I installed the Microsoft Azure BizTalk Services SDK to have mapping functionality in Visual Studio without the need for any BizTalk assembly. I created a small schema and a map to be used for the Logic app.

VisualStudioMap

The next step is to have an API app perform the transformation, because everything is an API on this platform. To create a new BizTalk Transform Service, you can also use this Azure documentation as a reference.

NewTransformService

Select BizTalk Transform Service and click ‘Create’. The Azure portal start page will open and show the API app to be created. When the API app is created, click it to open the details. The next step is to add a map to the Transform Service.

TransformServiceMap

Now the BizTalk Transform API service is ready, we need to put it between the two file connectors in the existing Logic app. This is a challenge in itself, because currently it isn't possible to re-organize the API apps within a Logic app, so simply dragging the Transform Service in between the two file connectors is a no-go. I tried adding the Transform Service to the Logic app and then re-organizing in the code view, but that turned out not to be that simple either (even though the design of a Logic app is described in a plain, readable JSON file). This will be possible in the future, but for now I removed the sending file connector, added the Transform Service, and then added the file connector again. This results in the Logic app below.

BizTalkBasicScenario

The complexity at the moment is the expression you need to provide as a parameter to the API apps. There is no validation, intellisense, or syntax highlighting yet, which complicates development. This will come in the near future; there is demand from the community for it as well.

A Logic app is in fact a chain of API apps, where (a part of) the output of one is used as the input for the next API app. For the BizTalk Transform Service we need to take the output of the first File Connector, which is the body of the message, as input for the map. We can use this expression for that: @triggers().outputs.body.Content

It is pretty easy to read: take the content of the body of the output of the previous API app.

Next we take the output of the Transform Service and use it in the sending File Connector. To determine the file name for the File Connector we can use: @concat('/Outbox/', triggers().outputs.body.FileName)

The name is appended to the default folder. The content of the message is taken from the Transform Service, but with a different expression than the one used as its input: @body('transformservice').OutputXml
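Since a Logic app's design is stored as plain JSON, the wiring of these expressions can be sketched in the code view. The fragment below is only an illustration of how the three expressions hang together; the action names ('transformservice', 'fileconnector') and property names are illustrative, not copied from an actual preview app definition:

```json
{
  "actions": {
    "transformservice": {
      "type": "ApiApp",
      "inputs": {
        "content": "@triggers().outputs.body.Content"
      }
    },
    "fileconnector": {
      "type": "ApiApp",
      "inputs": {
        "filePath": "@concat('/Outbox/', triggers().outputs.body.FileName)",
        "content": "@body('transformservice').OutputXml"
      }
    }
  }
}
```

The pattern to recognize: the trigger's output feeds the transform, and the transform's output feeds the sending connector by name.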

Like a typical ‘hello world’ with BizTalk, I throw in this message in the Logic app:

<ns0:SourceSchema xmlns:ns0="http://didago.nl/sourceschema">
  <ns0:FirstName>Indiana</ns0:FirstName>
  <ns0:LastName>Jones</ns0:LastName>
</ns0:SourceSchema>

This is the output and as expected the map has been executed.

<?xml version="1.0" encoding="utf-8"?>
<ns1:TargetSchema xmlns:ns0="http://didago.nl/sourceschema" xmlns:ns1="http://didago.nl/targetschema">
  <ns1:FullName>Indiana Jones</ns1:FullName>
</ns1:TargetSchema>

Hurray, my first Hello Logic app!

It was a very interesting exercise to play with these basic components of the App Service platform. Logic apps are currently certainly not a replacement for BizTalk, and that is also not what they are meant to be. Integration MVP Michael Stephenson has a nice blog post about it.

Currently Logic apps are in preview and a lot still needs to be done before they are mature enough for building enterprise solutions. For example, the entire ALM story needs to be figured out, and the designer should be available in Visual Studio and not only in the browser. Microsoft is betting big on this, so it will be a matter of time before these topics are covered.

It is great playing with new stuff!

My migration to Exchange Online

Being a freelance consultant I have a domain registered, which comes with email and hosting. When I started my company back in 2008 I picked the cheapest solution (typically Dutch) and everything went fairly well. Every now and then I had an email outage, but only a couple of times a year and never for long periods of time.
Nonetheless this started to annoy me so much that I decided to move to the cloud: Exchange Online.

This was back in March 2014, just after another outage. I wrote the words 'Exchange Online' as a TODO on the whiteboard in my study. But moving to another email provider is pretty scary, for me at least: you don't want to be left without email for a couple of days (or worse). So I postponed it again and again, until the next outage in November 2014, after which I decided to really move during Christmas.
That, after all, is a time of limited email traffic, so I felt confident enough to make the journey.

The first step was checking out which 'plan' I needed: the $4 or the $8 a month plan. Basically the difference is unlimited storage, but 'Basic' already offers 50 GB per user.

The next step is registering, and I'll spare you the rest of the steps because they're pretty well described in other blog posts.

When logging in you'll get an Office 365 portal with the typical Outlook items and an Exchange Online admin section. Being a developer, it was an eye-opener to see how many options you can configure, because this is normally the system engineer's domain. Suddenly you're an Exchange administrator!

The next step is configuring your own domain, otherwise you'll end up with <your name>@<your company>.onmicrosoft.com. This was pretty easy: just add some DNS records and you're done.

Then finally the actual reason I started this blog post, which obviously isn't a post like my regular ones, but it might be helpful for others. As mentioned, I'm a freelance consultant and I own the domain 'Didago.nl'. For tracing reasons I provide a customized email address to every customer, supplier, or other contact I need to give an email address. The customization means I put the contact name in the email address; for example, for LinkedIn I would use "linkedin(at)didago.nl". This way I can set up Outlook rules, but it is also traceable which source used the email address. More than once I found one of my email addresses in a place it shouldn't be, like a spam list. Some time ago I was actually able to inform a Dutch blog of their compromised CMS before they knew it themselves…

When configuring Exchange Online you need a license for every mailbox you create. Since I don't want to create a mailbox for each and every email address, I had configured a 'catch-all' account at my previous provider. I assumed this was common, but it appears Microsoft has a policy not to allow catch-all accounts because they attract spam. While this is a valid reason, it is not very handy for me, so I started searching for a solution because this had to be fixed.

Luckily I found the solution in this blog post by YourItHelp. It describes how to tell Exchange Online upfront that it is not authoritative for your domain, which means it shouldn't manage its accounts. With this disabled, it assumes accounts may exist in other locations, and Exchange Online just functions as a relay in case it cannot find a recipient.

After that you can configure a rule to catch all email with a recipient 'outside the organization'; this is perfectly described in the blog post from YourItHelp. Although I still had a short fight with the catch-all rule definition (I found out it also caught all of my outgoing email 🙁), I'm very happy with the result, and Exchange Online integrates very well with my Nokia 930, as I found out 🙂

I’m one step closer to the cloud!

Cheers, Jean-Paul

[BizTalk 2010] Testing a multiple input map with Visual Studio 2010

Since BizTalk 2010 we have a feature in Visual Studio 2010 which allows us to test maps without having to deploy them to BizTalk Server. Until then we were lucky to have frameworks like BizUnit. The BizTalk Software Factory also uses this feature to test maps.

To test maps using just Visual Studio 2010, we use this piece of code in our unit test project (replace …map_name… with the class name of your map):

Microsoft.BizTalk.TestTools.Mapper.TestableMapBase map = new …map_name…();

map.TestMap(inputFile, Microsoft.BizTalk.TestTools.Schema.InputInstanceType.Xml, outputFile, Microsoft.BizTalk.TestTools.Schema.OutputInstanceType.XML);

After this statement we have the output file available which we can use to compare it with the expected values.
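The comparison step itself is left to the developer. One robust approach is to compare the parsed XML trees rather than the raw file text, so formatting differences don't cause false failures. A minimal sketch of the idea in Python (the sample messages are made up for illustration):

```python
import xml.etree.ElementTree as ET

def xml_equal(a, b):
    """Recursively compare two elements on tag, trimmed text, attributes and children."""
    if a.tag != b.tag or a.attrib != b.attrib or len(a) != len(b):
        return False
    if (a.text or "").strip() != (b.text or "").strip():
        return False
    return all(xml_equal(ca, cb) for ca, cb in zip(a, b))

# The produced output and the expected file differ only in whitespace.
produced = ET.fromstring("<Out><FullName>Indiana Jones</FullName></Out>")
expected = ET.fromstring("<Out>\n  <FullName>Indiana Jones</FullName>\n</Out>")
```

The same tree-based comparison can of course be written in the C# unit test with XDocument or XmlDocument.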

But what if we have a map with multiple inputs? We're not able to provide multiple input files, so how do we test those?

To understand this, we need to know how BizTalk handles these kinds of maps. By the way, this post assumes you know how to create a multiple input map. If not, you can check it here.

To handle multiple input maps, BizTalk generates a new schema which wraps both input schemas, so BizTalk is again facing only one (combined) schema. This schema is not visible to us developers but is used behind the scenes. So to test a multiple input map, we only have to create such a combined message. How do we create one?

To explain this I’ve setup a test scenario, which uses the following artifacts.

First Schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.FirstSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.FirstSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="FirstSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FirstName" type="xs:string" />
        <xs:element name="City" type="xs:string" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

Second Schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="SecondSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="LastName" type="xs:string" />
        <xs:element name="Country" type="xs:string" />
        <xs:element name="Age" type="xs:int" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

Combined (target) schema:

<?xml version="1.0" encoding="utf-16"?>
<xs:schema xmlns="http://Didago.Samples.MultipleInputMap.CombinedSchema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" targetNamespace="http://Didago.Samples.MultipleInputMap.CombinedSchema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="CombinedSchema">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="FirstName" type="xs:string" />
        <xs:element name="LastName" type="xs:string" />
        <xs:element name="City" type="xs:string" />
        <xs:element name="Country" type="xs:string" />
        <xs:element name="Age" type="xs:int" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

I’ve created a map which transforms the first and second schema into the combined schema. If you open the map in an XML editor, you can view what BizTalk does with multiple input maps.

<xs:schema xmlns:ns2="http://Didago.Samples.MultipleInputMap.SecondSchema" xmlns:tns="http://schemas.microsoft.com/BizTalk/2003/aggschema" xmlns:b="http://schemas.microsoft.com/BizTalk/2003" xmlns:ns1="http://Didago.Samples.MultipleInputMap.FirstSchema" targetNamespace="http://schemas.microsoft.com/BizTalk/2003/aggschema" xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:import schemaLocation=".\FirstSchema.xsd" namespace="http://Didago.Samples.MultipleInputMap.FirstSchema" />
  <xs:import schemaLocation=".\SecondSchema.xsd" namespace="http://Didago.Samples.MultipleInputMap.SecondSchema" />
  <xs:element name="Root">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="InputMessagePart_0">
          <xs:complexType>
            <xs:sequence>
              <xs:element ref="ns1:FirstSchema" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element name="InputMessagePart_1">
          <xs:complexType>
            <xs:sequence>
              <xs:element ref="ns2:SecondSchema" />
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>

BizTalk creates a new schema with a 'Root' node and two sub nodes, 'InputMessagePart_0' and 'InputMessagePart_1'. If we recreate this message for our test, we can use the regular unit test code and it will work. The easy way is to take this (xs:)schema part from the map, create a new schema in BizTalk based on it, and use the 'Generate Instance' feature to get a valid combined example message. We can then fill this sample with our test message content to have a valid test. For my test scenario it looks like the example below.

<ns0:Root xmlns:ns0="http://schemas.microsoft.com/BizTalk/2003/aggschema">
  <InputMessagePart_0>
    <ns1:FirstSchema xmlns:ns1="http://Didago.Samples.MultipleInputMap.FirstSchema">
      <FirstName>FirstName_0</FirstName>
      <City>City_0</City>
    </ns1:FirstSchema>
  </InputMessagePart_0>
  <InputMessagePart_1>
    <ns2:SecondSchema xmlns:ns2="http://Didago.Samples.MultipleInputMap.SecondSchema">
      <LastName>LastName_0</LastName>
      <Country>Country_0</Country>
      <Age>10</Age>
    </ns2:SecondSchema>
  </InputMessagePart_1>
</ns0:Root>

All we have to do now is adjust the content of the test message to fit our needs. Based on this single combined test message, we can perform the tests we need and verify the result!
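Building the combined message by hand gets tedious, so the wrapping can be automated. A sketch in Python (the part names InputMessagePart_0/1 and the aggschema namespace are fixed by BizTalk's generated schema; the input messages below are made-up instances of the sample schemas):

```python
import xml.etree.ElementTree as ET

AGG_NS = "http://schemas.microsoft.com/BizTalk/2003/aggschema"

def wrap_inputs(*input_xml_strings):
    """Wrap each input message in InputMessagePart_<i> under the aggschema Root."""
    root = ET.Element(f"{{{AGG_NS}}}Root")
    for i, xml_string in enumerate(input_xml_strings):
        part = ET.SubElement(root, f"InputMessagePart_{i}")
        part.append(ET.fromstring(xml_string))
    return ET.tostring(root, encoding="unicode")

first = ('<ns1:FirstSchema xmlns:ns1="http://Didago.Samples.MultipleInputMap.FirstSchema">'
         '<FirstName>Indiana</FirstName><City>Seattle</City></ns1:FirstSchema>')
second = ('<ns2:SecondSchema xmlns:ns2="http://Didago.Samples.MultipleInputMap.SecondSchema">'
          '<LastName>Jones</LastName><Country>US</Country><Age>40</Age></ns2:SecondSchema>')
combined = wrap_inputs(first, second)
```

The resulting string can be written to disk and fed to TestMap as the single input file.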

Two remarks though:

  1. Besides Visual Studio testing, there are other test frameworks available, for example the BizTalk Map Test Framework or the Fakes Framework.
  2. This blog post specifically targets BizTalk 2010. Be aware that BizTalk 2013 has a bug preventing testing maps with Visual Studio!

[BizTalk] Binding Gotcha

Last week I ran into something I’d like to share with you, and at the same time is a note to self.

Like in any BizTalk project, every now and then I have to add a port to one of the projects, and most of the time the settings are 90% the same, so I copy the binding section from one of the existing bindings and paste it where I need it.

So I did last week. It was a project where I needed to add a send port, and there wasn’t any yet.

So as anyone knows, the send port section starts with:

<SendPortCollection>

</SendPortCollection>

So I opened the binding master of another project, copied one of the send port sections, and pasted it into the other project. When I deployed that project I got no errors, but the send ports weren't imported into the application either. No errors, no warnings; it just seemed like the send port part was skipped. The receive ports were imported correctly.

What could this be?

At first I thought there was an error in the BizTalk deployment framework variables, but after stripping that part completely it couldn’t be the cause anymore.

In the end it turned out that together with the 'SendPort' section I had also accidentally copied the surrounding "<SendPortCollection></SendPortCollection>" tags, resulting in:

<SendPortCollection>
  <SendPortCollection>
    <SendPort …>…</SendPort>
  </SendPortCollection>
</SendPortCollection>

It appears this isn't noticed by the import task; it just skips the entire 'SendPortCollection' section and moves on.

It would have been nice if some kind of parsing error was returned.
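Until the import task does report this, a tiny pre-deployment check can catch the mistake. A sketch in Python (the element names match the binding file format; the sample binding content is of course made up):

```python
import xml.etree.ElementTree as ET

def count_sendport_collections(binding_xml):
    """Count every SendPortCollection element, including accidentally nested ones."""
    root = ET.fromstring(binding_xml)
    return sum(1 for _ in root.iter("SendPortCollection"))

good = """<BindingInfo>
  <SendPortCollection>
    <SendPort Name="SpExample" />
  </SendPortCollection>
</BindingInfo>"""

broken = """<BindingInfo>
  <SendPortCollection>
    <SendPortCollection>
      <SendPort Name="SpExample" />
    </SendPortCollection>
  </SendPortCollection>
</BindingInfo>"""
```

Anything other than exactly one SendPortCollection is a red flag before importing.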

Beta release BizTalk Software Factory for BizTalk Server 2013

After some struggle I proudly present the BizTalk Software Factory for BizTalk Server 2013.

Struggle, because I found out that the Guidance Automation Extensions (GAX) and Guidance Automation Toolkit (GAT) are no longer maintained by Microsoft. They turned them over to the community, and they are now known as Open GAX/GAT. This change also changed the way things work. BSF, for example, uses the GAX Extension Library, also a community project, but it had never been compiled against Open GAX/GAT. It took some time to update the code to make it all work.

The BizTalk Software Factory is now at its 4th version and the functionality is basically similar to the previous versions. Support for BizUnit for testing schemas and maps has been discontinued, however, because that project seems no longer maintained. Some new out-of-the-box test functionality has been added to test maps, which is much better, provided Microsoft fixes this bug. For orchestration unit tests there is still BizUnit sample code you can take advantage of.

To use BSF v4, you need to install the following prerequisites:

The Open GAX/GAT depends on the Visual Studio 2012 SDK, so you have to install it.

Install GAX and GAT from http://opengax.codeplex.com, in this order:

  • GAX2010-VS2012.vsix
  • GAT2010-VS2012.vsix

If you like the BizTalk Deployment Framework, it is supported in this version as well; make sure you install v5.1 beta for BizTalk Server 2013.

So from now on you can also build structured BizTalk 2013 solutions with the guidance of BSF. If you experience issues or run into a bug, please report it on the CodePlex site.

Enjoy!

Didago IT Consultancy

BizTalk in the spotlights on TechEd NA

This week was TechEd North America week again: a great show with overview and deep-dive sessions on almost every aspect of Microsoft technology. I went to TechEd a couple of times when it was in Amsterdam, and I felt like a kid in a candy store. 🙂

The last couple of years there was little attention for Microsoft's integration stack at TechEd; SharePoint was hot and Azure also got lots of attention. This time integration was in the spotlight again, and with reason. Microsoft took the first serious step towards integration in the cloud with BizTalk Services, and this is big! It is getting clearer that integration will be a key feature in tomorrow's world of technology. On-premises systems and SaaS platforms will need to exchange information like never before, and the integration layer is going to be the intermediary, like it has been for years connecting on-premises systems.

Of course BizTalk Server will be around for the next decade. Microsoft has committed to a two-year release cycle, so the next version will probably be BizTalk Server 2015. However, you need to keep an eye on the new 'BizTalk-in-the-cloud' initiative called Windows Azure BizTalk Services, or WABS. It was silently tried out as the EAI/EDI labs in April 2012, but it showed where we're heading. Now WABS is in preview and will be generally available soon. At this time it is nowhere near what BizTalk Server offers, but this will change rapidly. Microsoft is on a quarterly(!) release cycle for services, which means they can and will add more BizTalk-Server-like functionality at a rapid pace. One of the extensions will be support for ESB in the cloud.

BizTalk Server is an enterprise-level, business-critical product which will be around for a long time, because companies will need an integration layer on premises. However, while BizTalk Services is catching up with the functionality of the on-premises version of BizTalk at the moment, I expect this to be the other way around in two years. From 2015 on you'll see innovation in BizTalk Services that won't be available in the on-premises server version until the next release. Will customers wait two years to get access to new features? With that in mind it will be more and more interesting and important for companies to seriously look at WABS.

If you want to keep up with BizTalk, you can start by watching these interesting BizTalk sessions from TechEd:

Didago IT Consultancy

BizTalk – Unit Testing Enabled in Release mode?

Today I ran into something which took me too much time to figure out, so this blog post should save others from wasting their time. In the end the solution was pretty simple, although not very obvious.

I’m working on a project which consists of 15 BizTalk applications. These applications are deployed using the BizTalk Deployment Framework. I tested the deployments on my development environment and everything worked as expected.

When I deployed it to the test environment, I experienced an issue with just one of the applications. It mentioned a missing dependency: Microsoft.BizTalk.TestTools.

This is the assembly that is added as a reference to your Visual Studio project when you enable unit testing in the project properties (http://msdn.microsoft.com/en-us/library/dd257907(v=bts.10).aspx). I always thought this was bound to building the solution in Debug mode and since the BizTalk Deployment Framework builds everything in Release mode, the reference should have been dropped.

Why did this show up in only this particular project?

I inspected the assembly with ILSpy and found that the reference to Microsoft.BizTalk.TestTools actually was there, even though it was built in Release mode. The first thing I did was check the configuration manager to make sure the test project was excluded in Release mode, but that was already the case.

Finally I found out that the project properties depend on the Debug or Release configuration you've selected: it appeared I had accidentally enabled unit testing while the configuration was set to Release. When I disabled it for Release, leaving it enabled for Debug, everything deployed fine.

It is good to know that not all debug- and test-related settings are bound to Debug builds only, so be aware they might show up in your Release build.

Didago IT Consultancy

BizTalk Server 2013 SharePoint Adapter Walkthrough

The SharePoint adapter has been redesigned in BizTalk 2013. It now supports SharePoint 2013 and SharePoint Online, for example, but it can still communicate with the current common versions like SharePoint 2007 and 2010.

If SharePoint 2013 is your platform, the adapter utilizes the new SharePoint Client-Side Object Model (CSOM). For older versions of SharePoint it still uses the WSS adapter service component, which needs to be installed on the SharePoint server itself.

I wanted to see how easy it has become to use the SharePoint adapter with CSOM, so I decided to do some testing. The adapter is obviously capable of reading from and writing to SharePoint document libraries, and I was surprised how easy it actually is.

My scenario was pretty simple: read a message from file and write it to a SharePoint document library, then read the message from the document library again and write it back to disk. For this I installed SharePoint 2013 Foundation (by accident in Dutch) on my BizTalk 2013 environment, which runs Windows Server 2012 and SQL Server 2012.

So first I created the Visual Studio solution with a source schema, a target schema, and a map between the two. No rocket science; from the Visual Studio point of view nothing has changed, you create artifacts like you're used to in BizTalk 2013.

After deploying the artifacts to BizTalk, I created a receive port and location to read from file (with a simple map to transform from source to target) and, more interestingly, a send port to send the message to SharePoint. After selecting 'Windows SharePoint Services' as the transport type of the send port, the following dialog is displayed.

What can be configured here:

  • Adapter Web Service Port – The HTTP port of the IIS website where the WSS adapter web service is installed.
  • Timeout – Value in milliseconds the adapter waits for the send action to finish.
  • Use Client OM – Specify whether the adapter should use the new CSOM or the older WSS adapter service component. You can also set this property in an orchestration.
  • Destination Folder URL – Location of the list to write the messages to, relative to the SharePoint site URL.
  • Filename (optional) – This is pretty cool: you can specify a fixed filename or construct one based on an XPath expression that gets a value from the message.
  • Namespaces (optional) – Define the namespaces used for the XPath expressions, like in the filename just mentioned.
  • Overwrite – Like in the FILE adapter, allow a message with the same name to be overwritten; here rename is also possible.
  • SharePoint Site URL – Location of the SharePoint site to use.
  • Microsoft Office Integration – Specify values here to integrate solutions with Microsoft Office InfoPath, by adding processing instructions.

And continuing with the second half of the transport properties dialog:

  • SharePoint Online – For the first time it is possible to connect to the SharePoint Online cloud solution!
  • Windows SharePoint Services Integration – Allows filling up to 16 columns in the destination SharePoint list by means of name/value pairs. In my scenario I added two additional columns to test this: 'CategoryFixed' with a fixed value, and 'NameFromXpath' to figure out how XPath expressions work. The columns must already exist in the SharePoint list; they won't be created if they are missing.

An example of an XPath expression used as a column value could be this:

%XPATH=/*[local-name()='Target' and namespace-uri()='http://Didago.SharePointAdapter.Target']/*[local-name()='Fullname' and namespace-uri()='']%

It should start and end with the '%' symbol; the rest is copied from the schema editor in Visual Studio. As you can see, I have a target schema containing a (single) 'Fullname' element.
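To sanity-check what such an expression selects, the same lookup can be reproduced locally in a few lines. A sketch in Python (ElementTree doesn't support local-name(), so the equivalent namespace-aware navigation is used; the message below is a hypothetical instance of the target schema):

```python
import xml.etree.ElementTree as ET

NS = "http://Didago.SharePointAdapter.Target"

# Hypothetical target message. Note that Fullname itself is in no namespace,
# which matches the namespace-uri()='' predicate in the adapter expression.
message = f'<ns0:Target xmlns:ns0="{NS}"><Fullname>Indiana Jones</Fullname></ns0:Target>'

root = ET.fromstring(message)
assert root.tag == f"{{{NS}}}Target"   # the root matches the first predicate
fullname = root.findtext("Fullname")   # the unqualified child matches the second
```

If the local check returns the expected value, the adapter's column value should too.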

The SharePoint site I used is plain and simple and looks like this (unfortunately in Dutch):

There are two document lists named 'Processed' and 'BizTalkList', and the latter has the two columns mentioned in the configuration: 'CategoryFixed' and 'NameFromXpath'. With all this configured and set up, we can start testing by dropping a test file into the folder.

The test message is also simple:

The map between the source and target just concatenates the first and last name into a full name. After dropping the test message into the folder, it is picked up by BizTalk and sent to the SharePoint list.

The message is nicely written to the SharePoint list: the fixed column is filled with 'Target' and the concatenated full name is extracted from the message. This was very straightforward. One thing to remember: the host instance account must have write permissions on the list.

The second scenario is reading from the SharePoint list, therefore we first need to create a receive location configured for SharePoint.

What can be configured here:

  • Adapter Web Service Port – The HTTP port of the IIS website where the WSS adapter web service is installed.
  • Timeout – Value in milliseconds the adapter waits for the receive action to finish.
  • Use Client OM – Specify whether the adapter should use the new CSOM or the older WSS adapter service component. You can also set this property in an orchestration.
  • Archive Filename (optional) – Name of the file when archived, fixed or via an XPath expression. I used XPath here.
  • Archive Location URL – List, relative to the SharePoint Site URL, to archive the processed messages to.
  • Archive Overwrite – Allow existing messages in the archive to be overwritten.
  • Batch Size – Maximum number of messages read in one batch.
  • Error Threshold – Maximum number of consecutive polling failures before the receive location is disabled.
  • Namespaces (optional) – Define the namespaces used for the XPath expressions, like in the archive filename mentioned before.
  • Polling Interval – Interval in seconds between polls for new messages.
  • SharePoint Site URL – Location of the SharePoint site to use.
  • Source Document Library URL – List, relative to the SharePoint Site URL, to read the messages from.
  • View Name – Name of the view used to filter the messages to read; leave empty to process all messages.
  • Microsoft Office Integration – (Optionally try to) remove the Microsoft Office InfoPath processing instructions from the message.
  • SharePoint Online – For the first time it is possible to connect to the SharePoint Online cloud solution!

With this scenario, messages will be read from the view named 'CategorySource', which filters on the 'CategoryFixed' column and shows only messages where that column has the value 'Source'. Because the first scenario only writes messages with the value 'Target', those shouldn't be picked up by the adapter, so I needed to manually add a document with the required field value to the list. The screenshot below shows the manually added message in the 'BizTalkList' SharePoint list, with 'CategorySource' as the selected view.

To my surprise both the message from scenario 1 and the manually added message were picked up, not only the one that shows up in the 'CategorySource' view. I'm not sure if this is a bug; I couldn't find anything about it, and after all it is a beta.

In the configuration of the receive location I specified to archive the message, which is correctly done by the adapter.

Some nice behavior I experienced is caused by the 'Archive Overwrite' setting, which dictates what to do when a message with the same name already exists in the archive document list: in that case the adapter performs a check-out on the file to process and then waits. This prevents the message from being removed without being processed.

To conclude I must say it is very easy to use the new SharePoint adapter in BizTalk 2013, but I’m going to do some more research on the view name issue I ran into.

Didago IT Consultancy

BizTalk in the Cloud, One Step Closer Part 2

In my previous post I discussed part of the recently released CTP of Windows Azure Service Bus EAI and EDI. I went over the installation and the steps to create schemas and mappings (in this post I use a different set of schemas and a different mapping, because due to Visual Studio issues I had to start over).

This post is about how the schemas and mappings are used in Windows Azure. Like we're used to with BizTalk, we need the schemas to define the message types and the mappings to transform one message type into another. But how does it all fit together?

When you create a new ‘ServiceBus’ project, you’ll notice a new “BridgeConfiguration.bcs” file:

This file contains the definition of the 'Bridge' between two systems connected to Azure. If you open the .bcs file you'll get an empty canvas, and the toolbox offers many options to fill the bridge configuration with:

For example, pick the 'Xml One-Way Bridge'. This is a one-way WCF service endpoint, as opposed to the 'Xml Request-Reply Bridge', which, as you probably guessed, is a request/response WCF service endpoint. The canvas looks like below:

In the solution explorer you'll find the two schemas (SourceSchema.xsd and TargetSchema.xsd) and a mapping (Source_to_Target.trfm) I use to test with. The mapping, by the way, is a simple one with a concatenation and the new if-then-else functoid (which isn't called a functoid anymore).

 

A configuration file has also been added to the solution: “XmlOneWayBridge1.BridgeConfig”. In the end this is just an XML configuration file, but a so-called ‘Bridge Editor’ is provided for ease of editing. Double-clicking the file results in this.

In this we recognize the BizTalk pipeline! In the message types module you specify the schemas you wish to be accepted by this endpoint. The validator is just an XML validator, which I assume can be replaced by a custom component in the future. However, keep in mind that this runs in the cloud, so Microsoft will be careful with custom components that might impact the performance and stability of Azure.

The enrich stage is interesting: it feels like property promotion in BizTalk and opens the door to content-based routing. You can enrich from SOAP headers, HTTP headers, a (SQL) lookup or XPath. In the picture below I created XPath properties on the SourceSchema.xsd.
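To make the idea concrete, here is a minimal sketch of what XPath-based enrichment boils down to: evaluate an XPath expression per property against the incoming message and promote the result into a property bag that routing can use. The property names and expressions below are hypothetical, chosen just for illustration; the real configuration lives in the Bridge Editor.

```python
import xml.etree.ElementTree as ET

# Same shape as the test message sent to the bridge later in this post.
message = """<SourceSchema xmlns="http://MyFirstAzureEAI.SourceSchema">
  <FirstName xmlns="">FirstName1</FirstName>
  <LastName xmlns="">LastName1</LastName>
  <Age xmlns="">1</Age>
</SourceSchema>"""

root = ET.fromstring(message)

# Hypothetical property-name-to-XPath configuration, as set up in the
# enrich stage of the bridge.
xpath_properties = {
    "FirstName": "FirstName",
    "Age": "Age",
}

# Evaluate each XPath against the message and promote the value.
promoted = {name: root.findtext(xpath) for name, xpath in xpath_properties.items()}
print(promoted)  # {'FirstName': 'FirstName1', 'Age': '1'}
```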

 

The transform module also looks familiar to BizTalk developers. Obviously, only the maps whose source schema is defined as a message type in the bridge show up here:

If you build the solution now, you’ll get errors. The errors indicate that without a so-called ‘Connected Entity’ messages sent to this endpoint cannot be processed, so we need to add one to the bridge configuration. In this case I’ll write the message to a queue, but it could also be an external service the message is relayed to. When the Queue is dropped on the canvas, it can be connected to the XmlOneWayBridge by using a ‘Connection’. Be aware that you can only connect on the red dots, and that dragging the queue shape onto the canvas doesn’t create a queue on the Service Bus.

Building this still results in an error about a filter expression. This is because by default a filter expression must be specified on the connection to the Queue. To fix this, click the connection, select ‘Filter Condition’ and set the filter to ‘Match All’ (or specify a filter).
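The effect of that filter condition can be sketched as follows. This is a toy evaluator, written here only to show the idea: ‘Match All’ routes every message, while a filter expression is evaluated against the promoted properties. The real connection filters use a richer SQL-like syntax, and the ‘Age’ property is a hypothetical example.

```python
def matches(filter_expr, properties):
    """Decide whether a message's promoted properties satisfy a route filter."""
    if filter_expr == "Match All":
        return True
    # A tiny illustrative evaluator for expressions like "Age > 18";
    # the real filters support a much richer SQL-like grammar.
    name, op, value = filter_expr.split()
    left, right = int(properties[name]), int(value)
    return {"<": left < right, ">": left > right, "=": left == right}[op]

props = {"Age": "1"}
print(matches("Match All", props))  # True
print(matches("Age > 18", props))   # False
```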

Now the project builds, and the next step is to deploy it to Azure!

Because Azure EAI is still in a lab phase, you need to go to a different portal than the regular Azure Management portal: https://portal.appfabriclabs.com/Default.aspx

Go there and create a so-called “labsubscription”. If you haven’t created such a subscription before deploying, you’ll receive an error: “The remote name could not be resolved: ‘<servicenamespace>-sb.accesscontrol.appfabriclabs.com’”

From the project file menu select ‘Deploy’ and the regular Azure deploy dialog shows up. Specify the Shared Secret and the project will be deployed:

 

Now it is time for some fun: testing the bridge with the tools that come with the SDK. Among them is the ‘MessageSender’ tool, which takes the account credentials, the bridge URL, a sample message and the content type. This message was sent to the bridge:

<?xml version="1.0" encoding="utf-8"?>
<SourceSchema xmlns="http://MyFirstAzureEAI.SourceSchema">
  <FirstName xmlns="">FirstName1</FirstName>
  <LastName xmlns="">LastName1</LastName>
  <Age xmlns="">1</Age>
  <Country xmlns="">Country1</Country>
  <BirthDate xmlns="">1900-01-01</BirthDate>
</SourceSchema>

The message was sent to the bridge and went through the pipeline, where it was transformed from ‘SourceSchema’ to ‘TargetSchema’. After the pipeline processing was done, the message was written to the queue, ‘TestEAIQueue’ in my case.

With another tool from the SDK, ‘MessageReceiver’, you can read messages from the queue.

In the message read from the queue you can see that the FirstName and LastName fields from the SourceSchema have been concatenated in the TargetSchema. You can also see that someone with an age of 1 is not an adult (IsAdult=false).
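The logic the map applies can be sketched in plain Python. The target field name ‘FullName’, the space separator and the adult threshold of 18 are assumptions made for illustration; the actual rules live in the Source_to_Target.trfm map.

```python
import xml.etree.ElementTree as ET

def transform(source_xml: str) -> dict:
    """Mimic the Source_to_Target map: concatenate FirstName and LastName,
    and apply the if-then-else rule to derive IsAdult from Age."""
    root = ET.fromstring(source_xml)
    first = root.findtext("FirstName")
    last = root.findtext("LastName")
    age = int(root.findtext("Age"))
    return {
        "FullName": first + " " + last,  # concatenation (separator assumed)
        "IsAdult": age >= 18,            # if-then-else rule (threshold assumed)
    }

message = """<SourceSchema xmlns="http://MyFirstAzureEAI.SourceSchema">
  <FirstName xmlns="">FirstName1</FirstName>
  <LastName xmlns="">LastName1</LastName>
  <Age xmlns="">1</Age>
</SourceSchema>"""

print(transform(message))  # {'FullName': 'FirstName1 LastName1', 'IsAdult': False}
```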

Wow, my first BizTalk-in-the-cloud-round-trip is working!

There is a lot more to discover, and it is only a CTP, but this is working quite nicely already. One thing we’re lacking in BizTalk today is the ability to debug by pressing F5. I wonder whether that will change with Azure EAI in the future.

 

Didago IT Consultancy

Article Published in Dutch .NET Magazine

Just before my holiday started, I got the great news that my article on the BizTalk Software Factory had been published on the Dutch .NET magazine site! It is my first article ever!

The BizTalk Software Factory is a community tool that helps BizTalk developers build consistent solutions.

Together with Mr. Dijkstra I wrote an article explaining the advantages of using the BizTalk Software Factory, though its concepts can be applied to other technologies as well.

You can find the article here, but keep in mind it is in Dutch…

 

Didago IT Consultancy