While implementing dynamic transformation in the BRE Pipeline Framework I ran into an interesting problem.  In BizTalk 2013 Microsoft changed the way transformations are executed, basing them on XslCompiledTransform rather than the long-deprecated XslTransform, which delivers performance benefits in the mapping engine.  This is, however, a breaking change for anyone who chose to implement dynamic transformation via custom .NET code in prior versions of BizTalk.  My specific problem was that I wanted to implement dynamic transformation in the BRE Pipeline Framework without forking the code to provide separate BizTalk 2010 and BizTalk 2013+ builds.

The code for BizTalk 2010 dynamic transformations in the BRE Pipeline Framework looks like the below (note that it has been truncated to make it easier to view; visit the CodePlex page if you’d like to see the full source code).

// Fetch metadata for the map, including its source and target schemas
TransformMetaData transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

// In BizTalk 2010 the Transform property returns an XslTransform
XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
XslTransform transform = transformMetaData.Transform;
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());

// Rewind the transformed stream and swap it in as the new message body
output.Position = 0;
inmsg.BodyPart.Data = output;

The above wouldn’t build on a BizTalk 2013 development machine, since the Transform property now returns an ITransform rather than an XslTransform.  The working BizTalk 2013 code looks like the below.

TransformMetaData transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
// In BizTalk 2013 the Transform property returns an ITransform instead
ITransform transform = transformMetaData.Transform;
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());
output.Position = 0;
inmsg.BodyPart.Data = output;

Note that the major point of difference between the two code snippets above is the type of the transform variable.  In order to cater for both scenarios I decided to take advantage of .NET 4’s dynamic typing: instead of specifying a concrete type (XslTransform or ITransform) I use the dynamic keyword, as below.

// dynamic defers member binding to runtime, so this line compiles
// regardless of which assembly/namespace TransformMetaData comes from
dynamic transformMetaData = TransformMetaData.For(mapType);
SchemaMetadata sourceSchemaMetadata = transformMetaData.SourceSchemas[0];
string schemaName = sourceSchemaMetadata.SchemaName;
SchemaMetadata targetSchemaMetadata = transformMetaData.TargetSchemas[0];

XPathDocument input = new XPathDocument(inmsg.BodyPart.GetOriginalDataStream());
// Resolves to an XslTransform on BizTalk 2010 and an ITransform on BizTalk 2013
dynamic transform = transformMetaData.Transform;
Stream output = new VirtualStream();
transform.Transform(input, transformMetaData.ArgumentList, output, new XmlUrlResolver());
output.Position = 0;
inmsg.BodyPart.Data = output;

Note that in the above I also had to use the dynamic keyword in place of the TransformMetaData type, since this class appears to belong to a different namespace in BizTalk 2013 than in prior versions.

The dynamic keyword instructs the compiler not to perform any compile-time validation of the methods and properties accessed on that object (so no IntelliSense) and to instead assume that the developer knows what they are doing.  The object’s type is resolved at runtime, and if any of the invoked methods or properties don’t exist the call fails with a runtime error (a RuntimeBinderException).
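
To illustrate, here is a minimal standalone example (my own, nothing to do with BizTalk or the framework) showing both the happy path and the runtime failure mode.

using System;

class Greeter
{
    public string Greet(string name)
    {
        return "Hello " + name;
    }
}

class Program
{
    static void Main()
    {
        dynamic greeter = new Greeter();

        // Compiles without validation and binds to Greeter.Greet at runtime
        Console.WriteLine(greeter.Greet("BizTalk"));

        // Also compiles, but throws a RuntimeBinderException at runtime
        // because Greeter has no Farewell method
        Console.WriteLine(greeter.Farewell("BizTalk"));
    }
}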

This of course is only a valid solution if you are targeting .NET 4.0 at a minimum, since the dynamic keyword didn’t exist in previous framework versions.  That makes it a good fit for solutions targeting BizTalk 2010 and above, and I would also encourage any BizTalk 2010 shops that are dabbling in dynamic transformation to future-proof their solutions by using the dynamic keyword.

This of course only scratches the surface of dynamic types.  If you want to read more, check out this MSDN article.  I would definitely encourage thorough unit testing (as was the case for the BRE Pipeline Framework) to make up for the loss of compile-time validation.
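
As a rough sketch of what such a test might look like, the following uses MSTest with stand-in types of my own invention (V2010Transform and V2013Transform are hypothetical; the real framework tests exercise the pipeline component end to end) to verify that the dynamically bound call resolves against either shape of the transform type.

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DynamicBindingTests
{
    // Hypothetical stand-ins for the BizTalk 2010 and 2013 transform types;
    // both expose a Transform member that the dynamic call must bind to
    class V2010Transform { public string Transform(string input) { return input; } }
    class V2013Transform { public string Transform(string input) { return input; } }

    [TestMethod]
    public void DynamicTransformCall_BindsAgainstEitherType()
    {
        foreach (dynamic transform in new object[] { new V2010Transform(), new V2013Transform() })
        {
            // A RuntimeBinderException here would fail the test, which is
            // exactly the kind of regression we want to catch before runtime
            string result = transform.Transform("<xml/>");
            Assert.AreEqual("<xml/>", result);
        }
    }
}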