Tag Archive: Dynamic Transformation


A couple of months ago I released the BRE Pipeline Framework v1.5 (since superseded by v1.5.1) to CodePlex.  One of the new features in this version of the framework is support for dynamic transformation.  In this blog post I’ll explain some scenarios in which this feature might be useful to you and show you how you can use the BRE Pipeline Framework to execute your maps dynamically.

 

Why you’d want to use the BRE Pipeline Framework for Dynamic Transformation

The first reason to take advantage of the BRE Pipeline Framework for dynamic transformation is to build messaging-only applications that support hot deployments for maps.  A hot deployment, from a BizTalk perspective, is one in which a BizTalk application doesn't need to be stopped while you redeploy your map assemblies; the only requirement is that the relevant host instances are restarted after the deployment is complete.  In a messaging application with maps on your receive/send ports, you might find that BizTalk complains if you try to import an MSI containing maps that are used on those ports while the application is in a running state; in fact, you might even need to delete the receive/send ports altogether, or at the very least remove the maps from the ports, before your redeployment succeeds.  A hot deployment requires no more than a momentary outage of your application while host instances restart, which fits in well with an application that is not tolerant of outages.

The second reason to take advantage of the BRE Pipeline Framework for dynamic transformation is transformation selectivity.  When you apply inbound/outbound maps on your receive/send ports, BizTalk chooses the first map whose source schema matches the message type of the message in question.  If you apply two maps with the same source schema on a receive/send port, there is no way for you to specify any additional conditions that determine which map executes; the second map will always be ignored.  With the BRE Pipeline Framework you can apply complex conditions that combine checks against the message body (via XPath statements or regexes), values in the message context, values from SSO configuration stores or EDI trading partner agreements, or even the current time of day.  The best part is that where the selectivity functionality already exists in the BRE Pipeline Framework (as it does in all the scenarios mentioned, and many more) you can achieve all of this with zero lines of code, and where it doesn't, the framework provides extensibility points to cater for it.  Another great advantage of using the BRE to choose which maps get executed is that you can change the selectivity rules at runtime without any code changes, which offers a lot of flexibility for applications that are not tolerant of outages.

The third reason to take advantage of the BRE Pipeline Framework for dynamic transformation is that you can chain maps.  As mentioned above, only the first matching map executes on a BizTalk receive/send port; it is not possible to execute multiple maps sequentially within a port in a single direction (you can of course specify one inbound and one outbound map, thus two, if your port is two-way).  In a messaging-only solution where you receive a message and send it out on a send port, the maximum number of maps you can execute is two: one on the receive port in the inbound direction, and one on the send port in the outbound direction.  For the most part this is adequate, but there are scenarios where it isn't.  With the BRE Pipeline Framework you can specify as many maps as you like to execute sequentially, within a single rule or across multiple rules (make sure you set priorities across the different rules to guarantee the required map execution order), or you can execute a map in your BRE Policy and then apply another on your port as well (keep in mind that for an inbound message the pipeline executes before the port maps, and the reverse is true for outbound messages).  Beyond chaining maps, you can also chain the other instructions contained within the BRE Pipeline Framework together with your maps; you could, for example, execute a map dynamically and then perform a string find/replace against the message body.

Another possible benefit of dynamic transformation is avoiding inter-application references, possibly one of the most painful aspects of BizTalk Server. When you want to pass a message from one application to another you often have no choice but to have one application reference the other (the referenced application containing the schema). Inter-application references make deployments a lot more difficult, since you can't update the referenced application without first deleting all the referencing applications (forget about hot deployments altogether; this is the other extreme). To get around this problem you could potentially keep separate schemas in each application, create a map in an assembly that is deployed only to the GAC rather than to the BizTalk Management Database, and then use the BRE Pipeline Framework to execute that map in a pipeline (in either application; it shouldn't matter). This allows you to create more decoupled applications with much simpler deployment models.

The aforementioned benefits can also be achieved through the use of the ESB Toolkit.  The main difference between the BRE Pipeline Framework and the ESB Toolkit is that the former is intended to provide utility, whereas the latter is more about implementing the routing slip pattern, with added utility as well.  The ESB Toolkit comes with a fair learning curve, as there are a whole lot of new concepts to wrap your head around (itineraries, resolvers, messaging extenders, orchestration extenders, etc.), and you'll find that at least the out-of-the-box utility can be quite limited (there are of course community projects that have improved on these shortcomings).  I definitely see valid scenarios in which either framework should be used, and wouldn't consider them to be competing frameworks; they could even be used in tandem.

One final reason that I can think of off the top of my head is traceability.  Given that the BRE Pipeline Framework caters for tracing based on the CAT Instrumentation framework and also provides rules execution logs (see this post for more info) you can always tell why a map was chosen for a given message.  This can be especially handy when you are debugging a BizTalk application.

Just like with the ESB Toolkit, executing a map dynamically within your pipeline goes against the best-practice principle of building streaming pipeline components, so please carefully evaluate the pros and cons of using this feature before implementing it.

 

Implementing Dynamic Transformation with the BRE Pipeline Framework

To illustrate how to use the BRE Pipeline Framework to execute maps dynamically, I will walk you through a simple example solution.  The solution contains three schemas: PurchaseOrder, PurchaseOrderEDI, and PurchaseOrderXML.  Each schema contains an element (with varying names) that holds the purchase order number.  The PurchaseOrder schema also contains an additional node called OrderType, which is linked to a promoted property of the same name.  The rule we want to put in place for PurchaseOrder messages is: if the value of the OrderType context property is XML, execute a map that converts the message to a PurchaseOrderXML; if the result of an XPath query against the OrderType node is EDI, execute a map that converts the message to a PurchaseOrderEDI.

On to the implementation.  You will of course need to download and install at least v1.5.1 of the framework from the CodePlex site and import the required vocabularies from the program files folder.  Next, create a pipeline (receive and send pipelines are both supported) and drag the BREPipelineFrameworkComponent from the toolbox to the Validate stage (you can choose any stage except Disassemble/Assemble).  If the component isn't in your toolbox yet, right-click within the toolbox, select “Choose Items…”, and tick the component in the Pipeline Components tab.  The only parameter you have to set on the pipeline component is ExecutionPolicy, which specifies the BRE Policy that will be called to resolve the maps to execute (you can optionally set the ApplicationContext parameter if you plan on calling the BRE Policy from multiple pipelines and want some rules to apply only to certain pipelines).  For the purpose of this example we will place an XML Disassembler component before the BREPipelineFrameworkComponent and leave the StreamsToReadBeforeExecution parameter at its default value of Microsoft.BizTalk.Component.XmlDasmStreamWrapper, so that we can inspect context property values promoted by the XML Disassembler (see this post for more info).

Pipeline

Once all the components are deployed to BizTalk we’ll create a receive location that picks up a file and makes use of the aforementioned receive pipeline, as well as a file send port that subscribes to messages from this receive port.  Finally we’ll create the BRE Policy which contains two rules.

The first rule is used to transform messages to the PurchaseOrderXML message format as below.  The rule is made of a single condition which uses the GetCustomContextProperty vocabulary definition from the BREPipelineFramework.SampleInstructions.ContextInstructions vocabulary to evaluate the value of a custom context property.

MapToXMLRule

The second rule is used to transform messages to the PurchaseOrderEDI message format as below.  The rule is made of a single condition which uses the GetXPathResult vocabulary definition from the BREPipelineInstructions.SampleInstructions.HelperInstructions vocabulary to evaluate the value of a node from within the message body with the use of an XPath statement.

MapToEDIRule

Both of the aforementioned rules use the TransformMessage vocabulary definition from the BREPipelineInstructions.SampleInstructions.HelperInstructions vocabulary in their actions to apply a map to the message.  The format of the vocabulary definition is: “Execute the map {0} in fully qualified assembly {1} against the current message – {2}”.  The first parameter is the fully qualified map name (.NET namespace + .NET type), and the second is the fully qualified assembly name, including the assembly version and PublicKeyToken (you can run gacutil -l with the assembly name from a Visual Studio Command Prompt to get this).  The third parameter specifies what validation is performed against the input message before the map executes, and is an enumeration with the values below.

  • ValidateSourceSchema – This option validates the current message's BTS.MessageType context property against the source schema of the specified map.  If they don't match, an exception is thrown rather than the map being executed.  If a message type is not available, an exception is also thrown.
  • ValidateSourceSchemaIfKnown – This option validates the current message's BTS.MessageType context property against the source schema of the specified map.  If they don't match, an exception is thrown rather than the map being executed.  If a message type is not available, no exception is thrown and the map executes.
  • DoNotValidateSourceSchema – This option performs no validation of the BTS.MessageType context property, and the map executes regardless, possibly resulting in a runtime error during map execution or an empty output message.  If a message type is not available, no exception is thrown and the map executes.  I haven't experimented with this myself, but it might allow you to create generic maps that apply generic XSLT against varying input messages to produce a given output format.  If anyone decides to experiment with this, please do let me know your results.
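If you'd rather not use gacutil, reflection gives you both vocabulary parameters directly.  Here is a minimal sketch; it uses a BCL type as a stand-in for your map class (substitute your own map type, e.g. BREMaps.PurchaseOrder_To_PurchaseOrderXML, once its assembly is referenced):

```csharp
using System;

class AssemblyNameExample
{
    static void Main()
    {
        // Stand-in for your map type; in a real solution you would use
        // typeof(YourNamespace.YourMap) from the referenced map assembly
        Type mapType = typeof(string);

        // Fully qualified map name (.NET namespace + .NET type) - parameter {0}
        Console.WriteLine(mapType.FullName);

        // Fully qualified assembly name including Version, Culture and
        // PublicKeyToken - parameter {1}
        Console.WriteLine(mapType.Assembly.FullName);
    }
}
```

This is handy at design time, or even in a small LINQPad/console scratch pad, to avoid transcription errors in the BRE rule parameters.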

That’s all that is required to stitch together a solution making use of the BRE Pipeline Framework which involves dynamic transformation.  If you push through a PurchaseOrder message with an OrderType of XML then it will get converted to a PurchaseOrderXML message, if the OrderType is EDI then it will get converted to a PurchaseOrderEDI message, and if the OrderType is anything else then the message will remain a PurchaseOrder as expected.

As previously mentioned, the BRE Pipeline Framework comes with a lot of traceability features (also documented here).  If you set up the CAT Instrumentation Framework Controller to capture pipeline component trace output, you will get information such as the below, which tells you which map is being executed, what the source message type is, and what the destination message type is.

EventTrace
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:TRACEIN: BREPipelineFramework.PipelineComponents.BREPipelineFrameworkComponent.TraceIn() => [102f63bb-2c86-4213-a892-2a5175569469]
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:START -> 102f63bb-2c86-4213-a892-2a5175569469
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - BRE Pipeline Framework pipeline component has started executing with an application context of , an Instruction Execution Order of RulesExecution and an XML Facts Application Stage of BeforeInstructionExecution.
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - BRE Pipeline Framework pipeline component has an optional Execution policy paramater value set to BREMaps.
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - BRE Pipeline Framework pipeline component has an optional tracking folder paramater value set to c:\temp.
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Inbound message body had a stream type of Microsoft.BizTalk.Component.XmlDasmStreamWrapper
[3]1FF4.2690::08/16/2014-22:13:01.245 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Inbound message body stream was not seekable so wrapping it with a ReadOnlySeekableStream
[3]1FF4.2690::08/16/2014-22:13:01.246 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Reading stream to ensure it's read logic get's executed prior to pipeline component execution
[1]1FF4.2690::08/16/2014-22:13:01.255 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding MetaInstruction BREPipelineFramework.SampleInstructions.MetaInstructions.CachingMetaInstructions to Execution Policy facts.
[1]1FF4.2690::08/16/2014-22:13:01.255 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding MetaInstruction BREPipelineFramework.SampleInstructions.MetaInstructions.ContextMetaInstructions to Execution Policy facts.
[1]1FF4.2690::08/16/2014-22:13:01.255 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding MetaInstruction BREPipelineFramework.SampleInstructions.MetaInstructions.HelperMetaInstructions to Execution Policy facts.
[1]1FF4.2690::08/16/2014-22:13:01.255 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding MetaInstruction BREPipelineFramework.SampleInstructions.MetaInstructions.MessagePartMetaInstructions to Execution Policy facts.
[1]1FF4.2690::08/16/2014-22:13:01.255 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding MetaInstruction BREPipelineFramework.SampleInstructions.MetaInstructions.XMLTranslatorMetaInstructions to Execution Policy facts.
[1]1FF4.2690::08/16/2014-22:13:01.265 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Executing Policy BREMaps 1.0
[0]1FF4.2690::08/16/2014-22:13:01.277 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Adding Instruction BREPipelineFramework.SampleInstructions.Instructions.TransformationInstruction to the Instruction collection with a key of 0.
[0]1FF4.2690::08/16/2014-22:13:01.277 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Starting to execute all MetaInstructions.
[0]1FF4.2690::08/16/2014-22:13:01.277 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Executing instruction BREPipelineFramework.SampleInstructions.Instructions.TransformationInstruction.
[0]1FF4.2690::08/16/2014-22:13:01.277 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Applying transformation BREMaps.PurchaseOrder_To_PurchaseOrderXML,   BREMaps, Version=1.0.0.0, Culture=neutral, PublicKeyToken=21bb7669ee013ee3 to the message
[0]1FF4.2690::08/16/2014-22:13:01.277 [Event]:102f63bb-2c86-4213-a892-2a5175569469 - Message is being transformed from message type http://BREMaps.PurchaseOrder#PurchaseOrder to message type http://BREMaps.PurchaseOrderXML#PurchaseOrderXML
[0]1FF4.2690::08/16/2014-22:13:01.278 [Event]:END <- 102f63bb-2c86-4213-a892-2a5175569469: 32ms
[0]1FF4.2690::08/16/2014-22:13:01.278 [Event]:TRACEOUT: BREPipelineFramework.PipelineComponents.BREPipelineFrameworkComponent.Execute(...) = "102f63bb-2c86-4213-a892-2a5175569469"

If you set the TrackingFolder parameter on the BREPipelineFrameworkComponent pipeline component to a valid folder, you will get output like the below (note this is just an excerpt), which provides valuable information telling you which BRE rules fired and why.

CONDITION EVALUATION TEST (MATCH) 16/08/2014 10:13:01 p.m.
Rule Engine Instance Identifier: f2c966cf-b248-4e94-a96a-99d110d59a9b
Ruleset Name: BREMaps
Test Expression: BREPipelineFramework.SampleInstructions.MetaInstructions.ContextMetaInstructions.GetContextProperty == XML
Left Operand Value: XML
Right Operand Value: XML
Test Result: True

CONDITION EVALUATION TEST (MATCH) 16/08/2014 10:13:01 p.m.
Rule Engine Instance Identifier: f2c966cf-b248-4e94-a96a-99d110d59a9b
Ruleset Name: BREMaps
Test Expression: BREPipelineFramework.SampleInstructions.MetaInstructions.HelperMetaInstructions.GetXPathResult == EDI
Left Operand Value: XML
Right Operand Value: EDI
Test Result: False

RULE FIRED 16/08/2014 10:13:01 p.m.
Rule Engine Instance Identifier: f2c966cf-b248-4e94-a96a-99d110d59a9b
Ruleset Name: BREMaps
Rule Name: Map To XML Format
Conflict Resolution Criteria: 0

One more thing worth mentioning is that once the BRE Pipeline Framework executes a map, it will promote the output message type to the BTS.MessageType context property just as a map on a port would.  This means that you can reliably create routing filters based on the BTS.MessageType context property if you make use of the dynamic transformation feature in the BRE Pipeline Framework.

The aforementioned solution is available for download here.  I've included the Visual Studio solution with the source code, an export of the BRE Policy, and an MSI installer that will create the example application for you (you might need to reconfigure the folders on the file receive location and send port based on where you unzip the solution, and you might have to grant full control permissions on these folders to your host instance user).  I've also included some example XML messages for your convenience.

Happy transforming.


While implementing dynamic transformation in the BRE Pipeline Framework I ran into an interesting problem.  In BizTalk 2013 Microsoft changed the way transformations are executed to be based on XslCompiledTransform, rather than on the long-deprecated XslTransform, which delivers performance benefits in the mapping engine.  This, however, is a breaking change for all those who chose to implement dynamic transformation via custom .NET code in prior versions of BizTalk.  My specific problem was that I wanted to implement dynamic transformation in the BRE Pipeline Framework without forking the code to provide separate BizTalk 2010 and 2013+ support.
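For context, XslCompiledTransform is straightforward to use on its own, outside of BizTalk.  Here is a minimal, self-contained sketch; the stylesheet and sample message are illustrative stand-ins, not the framework's actual code:

```csharp
using System;
using System.IO;
using System.Xml;
using System.Xml.XPath;
using System.Xml.Xsl;

class XslCompiledTransformExample
{
    static void Main()
    {
        // Illustrative stylesheet mapping a PurchaseOrder to a PurchaseOrderXML shape
        const string stylesheet = @"
<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>
  <xsl:output omit-xml-declaration='yes'/>
  <xsl:template match='/PurchaseOrder'>
    <PurchaseOrderXML>
      <OrderNumber><xsl:value-of select='OrderNumber'/></OrderNumber>
    </PurchaseOrderXML>
  </xsl:template>
</xsl:stylesheet>";

        // Compile the stylesheet once; compiled transforms are thread-safe and reusable
        var transform = new XslCompiledTransform();
        using (var reader = XmlReader.Create(new StringReader(stylesheet)))
        {
            transform.Load(reader);
        }

        // Apply the transform to a sample input message
        var input = new XPathDocument(
            new StringReader("<PurchaseOrder><OrderNumber>PO-1</OrderNumber></PurchaseOrder>"));
        var output = new StringWriter();
        transform.Transform(input, null, output);

        Console.WriteLine(output.ToString());
    }
}
```

The performance gain in BizTalk 2013 comes largely from this compile-once, execute-many model, which the old XslTransform lacked.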

The BizTalk 2010 dynamic transformation code in the BRE Pipeline Framework looks like the below (note that it has been truncated for readability; visit the CodePlex page if you'd like to see the full source code).

// Look up the compiled map's metadata based on the resolved map type
TransformMetaData transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

// Load the original message body and run the map's stylesheet over it;
// in BizTalk 2010 the Transform property returns an XslTransform
XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
XslTransform transform = transformMetaData.Transform;
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());
output.Position = 0; // rewind before handing the stream back to the pipeline
inmsg.BodyPart.Data = output;

The above wouldn't build on a BizTalk 2013 development machine, since an ITransform object was expected instead of an XslTransform object.  The working BizTalk 2013 code looks like the below.

TransformMetaData transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
// In BizTalk 2013 the Transform property returns an ITransform instead
ITransform transform = transformMetaData.Transform;
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());
output.Position = 0;
inmsg.BodyPart.Data = output;

Note that the major point of difference between the two code snippets above is the type of the transform variable.  To cater for both scenarios I decided to take advantage of .NET 4's dynamic typing: instead of specifying a class name (XslTransform or ITransform) I use the dynamic keyword, as below.

// dynamic defers member binding to runtime, so this compiles against
// both the BizTalk 2010 and BizTalk 2013 assemblies
dynamic transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
dynamic transform = transformMetaData.Transform; // XslTransform or ITransform, resolved at runtime
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());
output.Position = 0;
inmsg.BodyPart.Data = output;

Note that in the above I also had to use the dynamic keyword in place of the TransformMetaData type, since this class appears to belong to a different namespace in BizTalk 2013 than in prior versions.

The dynamic keyword instructs the compiler not to perform any validation on methods/properties called on that object (so no IntelliSense) and to instead assume that the developer knows what they are doing.  The object type is resolved at runtime, and if any of the called methods/properties don't exist, a runtime error is thrown.
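To illustrate the trade-off outside of BizTalk, here is a minimal, self-contained sketch: the calls compile regardless of the target type, and binding only happens at runtime (the TransformMessage member below is deliberately made up to trigger the failure case):

```csharp
using System;
using Microsoft.CSharp.RuntimeBinder;

class DynamicExample
{
    static void Main()
    {
        // Statically typed as dynamic; the real type (string) is bound at runtime
        dynamic value = "hello";

        // Compiles and binds fine: string really has a ToUpper() method
        Console.WriteLine(value.ToUpper()); // prints "HELLO"

        try
        {
            // Also compiles, but string has no such member, so binding
            // fails at runtime with a RuntimeBinderException
            value.TransformMessage();
        }
        catch (RuntimeBinderException ex)
        {
            Console.WriteLine("Runtime binding failed: " + ex.Message);
        }
    }
}
```

This is exactly the bargain struck in the framework's transformation code: one code base for both BizTalk versions, in exchange for errors surfacing at runtime instead of compile time.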

This is of course only a valid approach if you are targeting .NET 4.0 at a minimum, since the dynamic keyword didn't exist in earlier versions; it works well for solutions targeting BizTalk 2010 and above.  I would also encourage any BizTalk 2010 shops dabbling in dynamic transformation to future-proof their code by using the dynamic keyword.

This of course only scratches the surface of dynamic types.  If you want to read more, check out this MSDN article.  I would definitely encourage thorough unit testing (as was done for the BRE Pipeline Framework) to make up for the loss of compile-time validation.
