Channel: Tallan’s Technology Blog » biztalk 2013

BizTalk Server 2013 beta (not 2010 R2)


It’s very exciting to see the new major release of BizTalk – BizTalk 2013 beta.  There has been much speculation about what it would be called since this year’s TechEd announcement in Orlando.  Microsoft will continue to support the platform with BizTalk 2013 (not 2010 R2 as previously announced).  I’m reassured that BizTalk 2013 represents a significant commitment from Microsoft: the typical 5-year major-release support model takes us well into 2017.

I’m especially excited about some of the new features including: REST support, Azure Service Bus send/receive adapters, SFTP adapter support, and especially ESB Toolkit integrated into the platform!  Tallan will be blogging more on this new platform as we start working through client installations, feature proof of concepts, and cloud demo/presentations.

Download BizTalk 2013 beta here http://www.microsoft.com/en-us/download/details.aspx?id=35553


An Overview of BizTalk Server 2013


Microsoft BizTalk Server 2013 was recently released and it includes many new features that make developing and deploying for BizTalk considerably more efficient.  There are also many new features which expand on the already vast capabilities of a BizTalk Solution.

In this post I will go over the new features of Microsoft BizTalk Server 2013 and explain how a few of them can contribute to a more efficient solution.

General

One of the usual updates to any Microsoft software release comes in the form of newly supported platform stacks, and BizTalk Server 2013 is no different.  BizTalk now supports:

  • Windows Server 2008 R2 SP1
  • Windows Server 2012
  • Windows 7 SP1
  • Windows 8
  • Visual Studio 2012
  • .NET Framework 4.5
  • SQL Server 2008 R2 SP1
  • SQL Server 2012

New Adapters

Included with BizTalk 2013 is a set of new adapters that allow connectivity to other applications and protocols, such as Windows Azure and RESTful services.

  • SB-Messaging Adapter: Allows BizTalk to send and receive messages from the Queues, Topics, and Relays of the Windows Azure Service Bus.
  • WCF-BasicHttpRelay Adapter: Allows BizTalk to communicate with ASMX web services using WCF.
  • WCF-NetTcpRelay Adapter: Sends and receives WCF calls using the secure NetTcpRelayBinding.
  • WCF-WebHttp Adapter: Allows BizTalk to send and receive messages, or expose BizTalk artifacts, using REST.
  • SharePoint Services Adapter: Sends messages to and receives messages from SharePoint Services.
  • SFTP Adapter: Enables BizTalk to securely send and receive messages to and from FTP servers using SSH (SFTP).

One new adapter that may have a large impact on how BizTalk projects are developed is the SFTP adapter.  Many of my clients have needed secure messaging to an FTP server, and we have often had to settle for third-party SFTP adapters that are sometimes poorly documented or supported.  Even the well-documented and well-supported ones are not trivial to implement in a client environment because of doubts about the credibility and security of a third-party adapter, a valid concern for clients that, rightfully, demand a high level of security in their solutions.  With the SFTP adapter provided out of the box by BizTalk Server 2013, these concerns are addressed, and new interfaces using SFTP are easier to implement without the need to install third-party software.

The SB-Messaging, WCF-BasicHttpRelay, and WCF-NetTcpRelay adapters allow for easy communication with Azure.  Microsoft has a short article on configuring the SB-Messaging adapter, and Kent Weare has a great article about implementing the SB-Messaging adapter to communicate with an Azure Service Bus.

Configurable Dynamic Send Port

In BizTalk Server 2010 and previous versions, dynamic send ports used the default host instance, with no way to change the host instance handler used by the port.  BizTalk Server 2013 adds the ability to assign a send handler to each dynamic send port, bringing dynamic send ports in line with the functionality of static send ports.  MSDN has a short page on this enhancement to Dynamic Send Ports.

Viewing the Artifact Dependencies

A possible cause of frustration when dealing with larger BizTalk applications, or applications with a large number of artifacts, is determining what dependencies exist between artifacts: for example, which ports an orchestration depends on, or which maps a port uses to transform messages.  BizTalk 2013 adds a feature that easily shows all of the dependencies between artifacts.

The MSDN BizTalk blog has a great tutorial on viewing the dependencies of artifacts.

ESB Toolkit included ‘Out of the Box’

The Microsoft Enterprise Service Bus Toolkit is a collection of tools that extend the service oriented capabilities of BizTalk, though one of the most widely used features of the ESB Toolkit is its centralized exception management capabilities.  The toolkit contains an ASP.NET portal and accompanying set of BizTalk applications and assemblies which subscribe to warning and error messages and provide an easy to navigate web site to view and manage the errors.

In previous versions of BizTalk Server, the Microsoft BizTalk ESB Toolkit was a separate installation with its own configuration procedure, and it took considerable effort to install and configure correctly.  In BizTalk 2013, the ESB Toolkit installer is contained within the BizTalk 2013 installation media and can be installed from the start menu.  Most of the remaining configuration is the same as in the previous version; however, some of the ESB Toolkit configuration steps have been streamlined.

More information about configuring the ESB Toolkit can be found on the MSDN page.

Provisioning BizTalk Server 2013 on Azure IaaS in Under 30 Minutes


I needed a sandbox BizTalk server for BizTalk proof of concepts.  I decided to use Windows Azure IaaS and more specifically the BizTalk Server 2013 Evaluation.  Here is what I did:

9:11 AM

Logged into Windows Azure Portal, started provisioning the server.  See print screens below…

Step 1

Add new VM

Step 2

Create VM from Gallery

Step 3

Select BizTalk Server 2013 Evaluation

Step 4 (note: do not use ‘Administrator’; that account is already used on the VM when Azure provisions it)

Enter VM name, size, username and pw

Step 5

Azure cloud service, dns name, region, and storage account

Step 6 (note: an Availability Set is used to deploy VMs across multiple fault domains, or sets of racks; for a sandbox, this is not needed)

Select Availability Set

9:13 AM

Azure is provisioning a new server based on our configuration.

9:23 AM

Azure completed provisioning.  Now I can remote desktop into the server.

Step 7

Endpoints - Remote Desktop connection information.

Step 8

Connect to our server via RDP

Step 9
After logging in, I copied over a BizTalk configuration file I already had.  I opened the file and searched and replaced the user name and domain/server name where applicable.

You can also simply use Basic Configuration if it’s for a sandbox like mine.

Step 10

Choose Custom (if you have a configuration file or want to use more than one user/group)

Step 11

I'll import the configuration file I copied onto the server.

9:39 AM

Voila!  The server is provisioned, with SQL Server and BizTalk Server 2013 installed and configured.

BizTalk Configuration Completed and ready for use.

BizTalk Server 2013 – Unable to configure BAM


I was installing and configuring BizTalk Server 2013 for a client in a multi-server configuration (separate BizTalk application and BizTalk SQL servers).  While configuring BAM, I received the following error:

A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 – Could not open a connection to SQL Server) (.Net SqlClient Data Provider)

Solution:

Microsoft has released a hotfix for this issue:

To resolve the issue, apply the fix that is introduced in the following Microsoft Knowledge Base (KB) article:

FIX: The vertical scroll bar on the target schema does not work correctly when you use Visual Studio to design a BizTalk Server 2013 map

Note: KB 2830546 includes the fixes from KB articles 2830546, 2832136, and 2832137.

I noticed that this was not an issue on single-server installations.  You will not need to reboot the server after applying the patch; just restart the host instances.

Cleaning up BizTalk Databases


One maintenance task that I often perform when finishing development at a client is to refresh my BizTalk environment.  This usually includes cleaning old tracking data and messages out of the tracking and MessageBox databases.  The BizTalk 2010/2013 installation includes a set of stored procedures that can assist in this effort and ensure a clean BizTalk environment.

One word of warning: the following should not be performed on a production environment.

  • The first step is to stop all BizTalk Host Instances and stop IIS.
  • Create the ‘bts_CleanupMsgbox’ stored procedure by executing <BizTalk Installation>\Schema\msgbox_cleanup_logic.sql against the BizTalkMsgBoxDb database.
  • After the creation of the stored procedure, run the following SQL query:

execute bts_CleanupMsgBox

  • Then execute the bts_PurgeSubscriptions stored procedure using the following query:

execute bts_PurgeSubscriptions

  • The dtasp_CleanHMData stored procedure within the BizTalkDTADb database will clean up the tracking database.  Execute the following query against the BizTalkDTADb:

execute dtasp_CleanHMData

  • Start all BizTalk Host Instances and start IIS.
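
For convenience, the queries above can be combined into a single T-SQL script.  This is only a sketch: it assumes the bts_CleanupMsgbox procedure has already been created from msgbox_cleanup_logic.sql, and, as noted above, it should never be run against a production environment.

```sql
-- Run only after stopping all BizTalk host instances and IIS.

USE BizTalkMsgBoxDb;
GO
-- Purge messages and subscriptions from the MessageBox
EXEC bts_CleanupMsgbox;
EXEC bts_PurgeSubscriptions;
GO

USE BizTalkDTADb;
GO
-- Purge tracking data from the tracking database
EXEC dtasp_CleanHMData;
GO
```

Remember to restart the host instances and IIS once the script completes.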

More information can be found on MSDN

MSVCP100.dll Missing When Installing BizTalk Server 2013


A quick post today.  While installing Microsoft BizTalk Server 2013 on a fresh Windows Server, I received the following error during the Enterprise Single Sign-On Server installation step:

“The program can’t start because MSVCP100.dll is missing from your computer.  Try reinstalling the program to fix the problem.”

Also:

“The following platform components failed to install and will need to be manually installed before setup can proceed.  Enterprise Single Sign-On Server: Unspecified error.  Check the log for details.”

The BizTalk Server 2013 installation wizard then fails.  The cause of the above errors is the absence of the Microsoft Visual C++ 2010 Redistributable Packages.  The fix is very painless and involves installing the VC redistributable packages from the Microsoft site.

For 64-bit installations, ensure that both the x86 and x64 packages are installed, x86 first and then x64.  For 32-bit installations, only the x86 package is needed.

Microsoft Visual C++ 2010 Redistributable Package (x86)

Microsoft Visual C++ 2010 Redistributable Package (x64)

Utilizing Promoted Properties within Receive Shape Filters in BizTalk


Receive shape filters using promoted properties can provide great flexibility in what messages your orchestration subscribes to.  However, that flexibility also introduces some difficulty when using an untyped XmlDocument message.  Recently I attempted to use a receive shape filter with a promoted property in my project, and I noticed that using the XmlDocument message type as the receiving message prevented me from using my project’s promoted properties within the receive shape filter.

As an example, my Property Schema contains a CurrentProcess promoted property.

A field within my RoutingSlip schema uses the same promoted property.

After building the project, my receive shape filter does not offer my promoted properties when the message is an XML document.  The fix, other than forgoing the XmlDocument message type, is to change the ‘Property Schema Base’ property within the Property Schema from MessageDataPropertyBase (the default) to MessageContextPropertyBase.  MessageContextPropertyBase signifies to BizTalk that this particular promoted property does not necessarily have to exist within the message data of the message it was promoted from.  Put simply, this allows the promoted properties to be used without requiring that the message schema exist within the same project as the Property Schema, and without the orchestration having to match the message type of the incoming message with that of the promoted property.

After building the project with the above change, I can select my promoted property within the receive shape filter of the XML document.

Failed to Validate BAM Portal Web Site (BAMPortal)


I recently encountered an error while trying to reinstall the BAM Tools and the BAM Portal on a Microsoft BizTalk Server 2013 installation.  The error appeared while configuring the BAM Portal in the Microsoft BizTalk Server Configuration window.

The specific error, shown when clicking on the red validation icon, is:

Failed to validate BAM Portal Web Site (BAMPortal)
ADDITIONAL INFORMATION:

The BAM Portal website Default Web Site is not valid. (Microsoft.BizTalk.Bam.CfgExtHelper.PortalHelper)

Exception from HRESULT: 0x80005008 (System.DirectoryServices)

The usual culprit for this error is a missing “IIS 6 Management Compatibility” Windows feature.  However, that was not the case in this scenario, as the feature was already present.  Because I was reinstalling the BAM Portal, the Microsoft BizTalk Server Configuration wizard had not completely uninstalled the BAM Portal from IIS.  A few lingering pieces had to be manually removed, specifically the BAMManagementService and BAMQueryService IIS applications.

The solution is to manually remove those two IIS applications using the following steps.

1.  Open IIS Manager and select your website; by default it is ‘Default Web Site’.
2.  Click on ‘View Applications’ within the right hand ‘Actions’ panel.
3.  Remove BAMManagementService and BAMQueryService.

4.  Go back to the Microsoft BizTalk Server Configuration Window, uncheck ‘Enable BAM Portal’ and recheck ‘Enable BAM Portal’ to revalidate the settings.


5.  Apply the Configuration.



Solution to Common BizTalk Deployment errors in Visual Studio


There are a few common deployment errors in Microsoft Visual Studio when redeploying a previously deployed BizTalk project.

“Failed to add resource(s). Change requests failed for some resource.  BizTalkAssemblyResourceManager failed to complete end type change request.  Failed to update binding information.  Cannot update receive port “”.  Transform “” not found”


There are a few other similar errors that share the same root message, “Failed to add resource(s). Change requests failed for some resources”, such as:

“Failed to add resource(s). Change requests failed for some resources. BizTalkAssemblyResourceManager failed to complete end type change request. Cannot update assembly”

and

“Failed to add resource(s). Change requests failed for some resources. BizTalkAssemblyResourceManager failed to complete end type change request. Failed to update binding”

These all share the same root cause.  Visual Studio uses cached binding files when deploying BizTalk applications, and removing these cached binding files results in a ‘clean’ deployment that should resolve any binding-related deployment errors.  The files are stored in:

%APPDATA%\Microsoft\BizTalk Server\Deployment\BindingFiles\

Clearing the contents of this directory should resolve any deployment issues related to cached bindings.

XLANGs Object Reference Error when dynamically loading a map in BizTalk


This was one of those very frustrating errors that had a very simple solution.  I had an orchestration that was dynamically loading a map using a Fully Qualified Name (FQN) that is stored in BRE.  The exception looked like this:

{ORCHESTRATION_NAME} encountered an error: Object reference not set to an instance of an object.
at Microsoft.XLANGs.Core.Service.ApplyTransform(Type mapRef, Object[] outParams, Object[] inParams)
at {ORCHESTRATION_NAME}.segment3(StopConditions stopOn)
at Microsoft.XLANGs.Core.SegmentScheduler.RunASegment(Segment s, StopConditions stopCond, Exception& exp)

Many times, this is the result of not deploying your DLL to the GAC, or not having the schema available in the Management Database.  I double checked both, and they were there.  I removed them and manually reinstalled them using gacutil, and still got the same error.  It finally occurred to me to use reflection on the assembly to see if the FQN was wrong, and that was indeed the culprit.  The class name had .Outbound appended to it, while my orchestration was trying to load a class that ended in .Maps.  I could have changed the FQN in the BRE, but that would have been inconsistent with naming conventions elsewhere in the project.

The source of the error was adding the map to a Solution subfolder in Visual Studio, which made the Namespace of the map default to Maps.Outbound.  Changing the map’s Namespace property back to .Maps and rebuilding/redeploying fixed the issue.

Introducing the T-Connect EDI File Splitter


A trend we have seen recently is how similar the pain points are that different companies face.  One such pain point is the difficulty in quickly receiving and processing large EDI files, those with a file size over 10 MB.  In one instance, there was a need to receive a HIPAA EDI 834 Enrollment file totaling 1.3 GB and containing roughly 800,000 enrollments.  Even the already powerful BizTalk Server, in a 1-1 architecture (1 BizTalk Server, 1 SQL Server), had difficulty processing the file.  The file would take several hours to process, often consuming several gigabytes of drive space while the EdiDisassembler pipeline attempted to disassemble it.

Benefits

Our solution to this problem was the creation of the T-Connect EDI File Splitter, a Microsoft BizTalk Server pipeline component that enables the rapid splitting of large EDI files within BizTalk Server.  The split files are delivered to a user-configurable folder location.  Some of the benefits of this solution include:

  • Splitting large files up into smaller size files improves processing time significantly
  • Easy configuration settings promote flexibility
  • Deploy easily as a BizTalk Pipeline Component and use with any supported BizTalk adapter
    • Split file by record count allows you to specify the individual file record count as they are split from the larger EDI file
    • Delay split file timer allows you to control and throttle your own delivery of individual files as they are split
    • Split file location allows you to specify where the individual split files will be delivered
  • Process files independently by reconstructing the ISA and GS envelope for each split file
  • Updates ST segment counts to reflect new split file record counts
  • Supports HIPAA X12 EDI Transaction Sets
  • Supports Supply Chain/Order Processing/Transportation X12 EDI Transaction Sets

The tool is able to process files very quickly.  Below are some general speed metrics for a few configurations of an 834 Enrollment file; other file types will have similar, if not identical, performance.

Record Count    File Size    Records per split file    Total split time
40,000          60 MB        500                       10 seconds
40,000          60 MB        1000                      10 seconds
800,000         1.3 GB       1000                      3.75 minutes

Installation and Setup

The following steps can be used to install and configure the T-Connect EDI File Splitter in an existing BizTalk application:

  1. Download the installation package from the following page and unzip it.
  2. Execute the Install.bat file to install the package to the default directory “C:\Program Files (x86)\EDISplitter”.  This directory can be changed by opening Install.bat and editing the following line:
    • set InstallPath=C:\Program Files (x86)\EDISplitter
  3. The bat file will uninstall any previous versions, configure the license, and install the latest version of the T-Connect EDI File Splitter.
  4. The installation consists of the pipeline components and a sample receive port within the default BizTalk Application 1 application.
  5. Below is an example of the configuration options within the T-Connect EDI Splitter pipeline component.
  6. By default, the license file is installed to the documents folder of the currently logged-in user.  The file can be moved to any location as long as the LicenseFilePath configuration option within the pipeline component settings is updated to match.  The settings that should be configured by the user are described below:
  • EdiTransactionType (Required; default: empty): The type of EDI transaction.  Possible values are: X12_00501_834, X12_00501_837_P, X12_00501_837_I, X12_00501_837_D, X12_00501_835, X12_00501_820, X12_00501_278, X12_00501_270, X12_00501_271, X12_00501_276, X12_00501_277
  • OutFolder (Required; default: empty): The local directory or file share path for output file delivery, e.g. C:\Archive
  • ReleaseAfterXSeconds (Optional; default: 0): The number of seconds to delay before the next EDI file is written to the output folder
  • SegmentTerminator (Required; default: ~): The EDI segment terminator
  • SplitAfterOccurances (Required; default: 1): The number of records in an EDI file before it is written to the output folder

Information

We have rolled this product into a simple-to-install package, which can be downloaded from the following page.  A 30-day full-functionality trial is included in the download.  If you would like to learn more about the T-Connect EDI File Splitter, or to purchase a license, please contact us.

MABS EAI Bridge LoB Lookup (Part 1 of 2)


Microsoft Azure BizTalk Services (MABS) has a lot to offer companies looking for a PaaS middleware solution.  EAI bridges provide B2B communication as well as LoB access functionality for EDI, XML, and flat file interchanges.  The new mapping system offers some exciting and powerful new functionality as well, vastly simplifying certain tasks that previously required several functoids, and opening up new possibilities and enhanced performance with lists.

However, it is a new technology, and certain tasks that are very straightforward in BizTalk Server 2013 require a different way of thinking in MABS.  For example, it is fairly trivial to create an orchestration that accesses a LoB adapter (using, for example, the WCF sqlBinding) to do data validation or enrichment, and to publish that orchestration as a web service for client consumption.  If your SQL database is SQL Azure, there is built-in functionality to do a “Lookup” in the Azure database, but this may not be an option for an infrastructure that makes use of features not currently available in SQL Azure, such as encryption at rest.  It may also simply be that an existing LoB SQL database cannot easily be moved for various other reasons.  In this series of posts, I will outline the process for implementing this functionality using the powerful custom code functionality available in MABS.

The tasks include the following (this post will cover steps 1-6):

  1. Creating the BizTalk services
  2. Setting up BizTalk Adapter Services in a local (or IaaS) environment to run a stored procedure in SQL Server
  3. Creating a sample table and stored procedure
  4. Creating a ServiceBus namespace with ACS
  5. Create the Relay to the LOB system
  6. Creating an EAI bridge to access the LoB system
  7. Testing and debugging the bridge with a Visual Studio add-on
  8. Writing a custom component to call the LoB adapter in an EAI Bridge stage and parse the response
  9. Having the component send an email notification using an Office 365 server

Step 1: Create the BizTalk Service

This is fairly straightforward.  Log into your Azure portal, click BizTalk Service, click New, and then Custom Create.  Choose a name, edition, and region, and your service will begin provisioning.

You’ll need the ACS Information for your newly created service for several of the following steps.  You can get this information by selecting the service you created and clicking “Connection Information” at the bottom.  You’ll need similar information for the ServiceBus relay (created in step 4) as well; I found it convenient to copy this information into an Excel sheet for quick reference during development (however, be sure it is stored in a secure manner consistent with your IT security policies – this is sensitive information that should not be disclosed to non-administrators of the service).

Step 2: Setting up BizTalk Adapter Services

For this, you’ll need the BizTalk Services SDK: http://www.microsoft.com/en-us/download/details.aspx?id=39087.  The installation instructions can be found at http://msdn.microsoft.com/en-us/library/azure/hh689760.aspx#BKMK_Installation.  Take note of the requirement regarding machine.config if you have previously installed Azure BizTalk SDK components, and install the optional runtime (this had me going in circles for a while!).  Also take note of the requirements for downloading and installing the certificate from Azure.

To add it to your security store (see http://msdn.microsoft.com/en-us/library/azure/hh949825.aspx):

  1. On your test/development machine, double-click the .cer file. Select Install Certificate.
  2. Select Place all certificates in the following store, select Browse, and select Trusted Root Certification Authorities.
  3. Complete the installation. You’ll get a Security Warning, which is normal; click Yes.

When prompted for the service identity, be sure to use an account that has internet access and permissions to SQL (or whatever LoB system you’re working with).  A successful install should result in a service page when you navigate to https://localhost:8080/BAService/ManagementService.svc/

Step 3: Creating a sample stored procedure

For this sample, I’m working with a SQL Server database running in a VM behind Tallan’s firewall.  I did not have to do any special firewall configuration for the VM, my host machine, or IT to get this up and running, nor did I have to use any tricky proxying methods.  In my database (named BTSTrainDb, a database I use for testing and samples), I created a table, dbo.sn, with an ID column, a varchar SerialNumber column, and a bit IsValid column.  I also created a simple stored procedure that takes a varchar parameter and uses it to look up rows in the dbo.sn table:

USE [BTSTrainDb]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO

CREATE TABLE [dbo].[sn](
 [ID] [int] IDENTITY(1,1) NOT NULL,
 [SerialNumber] [varchar](50) NOT NULL,
 [IsValid] [bit] NOT NULL
) ON [PRIMARY]

GO

-- ID is an IDENTITY column and is not supplied here, so IDENTITY_INSERT
-- must remain OFF for this insert to succeed.
INSERT INTO [dbo].[sn] (SerialNumber, IsValid) VALUES ('asdf', 1), ('fdsa', 0), ('test', 1);
GO

CREATE PROCEDURE [dbo].[usp_ValidateSN]
(
 @sn varchar(50)
)
AS
BEGIN
SET NOCOUNT ON;
SELECT IsValid from dbo.sn WHERE SerialNumber = @sn;
END
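
As a quick sanity check before wiring anything up in MABS, the procedure can be exercised directly.  This is just a sketch; 'unknown' is a hypothetical value that is not in the seeded data above.

```sql
USE [BTSTrainDb]
GO

-- Seeded value: returns a row with IsValid = 1
EXEC [dbo].[usp_ValidateSN] @sn = 'asdf';

-- Value not present in dbo.sn: returns an empty result set
EXEC [dbo].[usp_ValidateSN] @sn = 'unknown';
GO
```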

Obviously, this procedure could contain any amount of business logic and/or perform other CRUD operations if desired.  I also have a SQL user that has permissions on the database, table, and stored procedure.

Step 4: Create a ServiceBus relay with an ACS namespace

Microsoft has recently removed the ability to do this from the management portal, so it has to be done through a PowerShell cmdlet.  On top of that, the current documentation for this command is out of date!  Cesar Cordero (a colleague here at Tallan) recently wrote a blog post about this with more details: http://blog.tallan.com/2014/11/06/service-bus-authentication-and-authorization/.  Here’s an overview of what you need to do:

  1. Get the Azure PowerShell module(s) if you don’t already have them: http://azure.microsoft.com/en-us/documentation/articles/install-configure-powershell/
  2. Add-AzureAccount; follow the steps in the dialog that pops up
  3. Select-AzureSubscription (if your account is associated with more than one subscription; this information can be found by clicking on the “Subscriptions” button near your username at the top of the Azure Portal)
  4. New-AzureSBNamespace -Name <name> -Location “East US”; at the prompt for NamespaceType, type Messaging.  Note: The name of this namespace must be different from the name of the BizTalk service.  If it is not, there will be a clash between the ACS namespace used for BizTalk Services and the ServiceBus ACS.
  5. You can now retrieve the connection information for this ServiceBus namespace from the management portal (or copy it from the output of the PowerShell cmdlet).  I made a note of it as it’s required for configuration in step 5.

Step 5: Create the LoB relay

Open Visual Studio.  In the Server Explorer, right click on BizTalk Adapter Services and select Add a BizTalk Adapter Service.  Enter the URL set up in step 2 (https://localhost:8080/BAService/ManagementService.svc/).  Expand the newly created adapter service.  You will be prompted to enter ACS information; enter the information for the BizTalk Adapter Service ACS from Step 1.  Right click on SQL and click “Add a Target”.  For connection parameters, fill in the server information.

Operations should look familiar to anyone who’s used the WCF SQL binding.  Go to Strongly-Typed Procedures and add the dbo.usp_ValidateSN:


Choose your runtime security.  Here, I entered the username that has access to the procedures and its password as a Fixed username:


Specify the LOB Relay URL using your newly created service bus namespace; for the LOB Relay path, choose something memorable for this particular SQL relay, and for the sub-path choose a unique identifier (like the InboundID parameter for the WCF SQL binding in on-premises BizTalk).

Review your information on the next page and click create.   Step 6: Create an EAI Bridge for this relay In Visual Studio, create a new BizTalk Service project (Under Visual C#).  Right click on your new LOB SQL type and click “Add schemas to <ProjectName>”; enter the project folder and filename prefix you want, and enter the credentials you set up in the previous step under runtime security.  This will generate two schemas that should look pretty familiar if you’ve used WCF SQL before. Drag the LOB SQL type over onto your design surface (named MessageFlowItnerary.bcs by default).  Drag an XML Request-Reply bridge onto the surface from the toolbox.  Drag a connector between the two. On the connector, set the following properties: Filter Condition: Match All (will set to 1=1): this sets the connector to always send messages through; we’ll cover content routing in the next article in this series

filter-condition

Route Action: configure as follows (again, this should look familiar!).  This sets the SOAP action header.

Rename the XML bridge to SQLRequest.

Double click the SQL LOB icon.  In the config file that opens, edit the sharedSecret node with the information from your ServiceBus relay ACS (not the BizTalk Services ACS).

Build and deploy the solution (you’ll be prompted for your BizTalk Services ACS information on deployment).  This operation should not require a refresh of the service, but go ahead and select it anyway.  Grab the MessageSender app: https://code.msdn.microsoft.com/windowsazure/Windows-Azure-BizTalk-EAI-af9bc99f.  Build it and run the following command with a generated instance of the request schema:

messagesender.exe <BizTalk Service Namespace Name> owner <BizTalk Service ACS Key> https://<name>.biztalk.windows.net/default/SQLRequest <instance of the request schema.xml> application/xml

A sample request message looks like this:

<ns0:usp_ValidateSN xmlns:ns0="http://schemas.microsoft.com/Sql/2008/05/TypedProcedures/dbo">
 <ns0:sn>asdf</ns0:sn>
</ns0:usp_ValidateSN>

And here’s the successful output:

message-sender

Stay tuned for part 2!

BizTalk Build Error: The Namespace already contains a definition for _Module_PROXY_


While building BizTalk projects within Visual Studio, you may receive the following error when trying to compile a project with multiple orchestrations:

“The namespace ‘[namespace]’ already contains a definition for ‘_MODULE_PROXY_’”

While the error seems obvious, in my case, the namespaces were indeed unique, such that the above error did not make much sense.  The underlying issue was in fact that the type names of two of the orchestrations within the project were identical.  It turns out that an orchestration was at one point duplicated, and only the namespace was changed.  After making the type names unique, the project successfully compiled and deployed.

MABS EAI Bridge LoB Lookup (Part 2 of 2)


Last week month (sorry about that!), I wrote a post about using MABS to access a LoB system (in the example, SQL Server) behind several layers of firewalls (here).

We looked at the following tasks

  1. Creating the BizTalk services
  2. Setting up BizTalk Adapter Services in a local (or IaaS) environment to run a stored procedure in SQL Server
  3. Creating a sample table and stored procedure
  4. Creating a ServiceBus namespace with ACS
  5. Creating the Relay to the LOB system
  6. Creating an EAI bridge to access the LoB system

This week, we’ll look at these tasks:

  1. Testing and debugging the bridge with a Visual Studio add on
  2. Writing a custom component to call the LoB adapter in an EAI Bridge stage and parse the response
  3. Having the component send an email notification using an Office 365 server

Using the MessageSender application from Microsoft is handy for a quick test (and for some sample code), but to do more serious work there’s a Visual Studio add-on for debugging and sending test messages to EAI bridges called the BizTalk Service Explorer.  It’s available here: https://visualstudiogallery.msdn.microsoft.com/1f75a6a6-a54e-44eb-8b11-1b5ea8928754 (it can also be found through the Visual Studio Extensions manager).  This adds a new item to the Server Explorer window in Visual Studio:

Server Explorer 1

Right click on the highlighted item, and click “Add New BizTalk Service”.  Here you’ll have to provide the service URL and the ACS information about the service, as well as a “Friendly Name” used to display the service in the Server Explorer:

add a biztalk service

(if you can’t remember your ACS information, see my last post linked at the top of this entry; hopefully you saved it somewhere, otherwise you can get it from the Azure portal).

Once you enter that information, you will see your assemblies, bridges, certificates, schemas, and transforms deployed to MABS.  There are two particularly helpful functions for bridges: sending a test message and debugging the bridge:

Right click on SQLREQUEST (created last time), and click “Send Test Message…”.  In this dialog, you can load and send a test message and get the response (here, I got back a response of valid, because asdf is a valid serial number in my local database):

Send test message

Debugging this bridge is a similar process, but it’s not really doing a whole lot yet.  The next phase of this project is to call the LoB bridge from a one way bridge to make a routing decision.

 

There is currently no built-in way to do that in MABS; a two-way bridge’s output cannot be routed to a one-way destination.  However, the various stages of a bridge can all be extended with custom .NET code, and it’s possible to send a message to a bridge and get the response using a custom .NET module.  This is something like a BizTalk pipeline component.  I wrote a C# library to place in a new EAI bridge.  The library has two classes:

  • MessageSender class (based off of Microsoft MessageSender application, but refactored to take string inputs instead of files; source included at end of this post)
  • SetPropertiesInspector, which implements IMessageInspector.  This class reads some promoted properties, sends a message to the LoB system, parses the response, and promotes some new properties for routing.

propertiesinspector code

Note that the http scheme is used, not https; this is to ensure that the ACS lookup will happen correctly.  Obviously, a “real” integration would have more sophisticated message parsing, perhaps using an XDocument to load and parse the data.
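As an illustrative sketch of the “more sophisticated message parsing” mentioned above, the inspector could load the LoB response into an XDocument and read the result element from there.  Note that the element name “Valid” and the true/false convention are assumptions for this example, not the actual response schema:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

// Hypothetical helper: parses the LoB bridge response and decides whether
// the serial number was reported as valid. Element names are placeholders.
public static class LobResponseParser
{
    public static bool IsValidSerialNumber(string lobResponseXml)
    {
        XDocument doc = XDocument.Parse(lobResponseXml);

        // Find the first element whose local name is "Valid", ignoring namespaces.
        XElement valid = doc.Descendants()
                            .FirstOrDefault(e => e.Name.LocalName == "Valid");

        return valid != null &&
               (valid.Value == "1" ||
                valid.Value.Equals("true", StringComparison.OrdinalIgnoreCase));
    }
}
```

The Valid routing property promoted by SetPropertiesInspector would then be set from this return value.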

The bridge itself needs to be created and configured as well.  I dropped a new XmlOneWayBridge onto the MessageFlowItinerary design surface, along with two Queues:

Design Surface

 

On the properties for the connector to the first queue, I added the filter condition “Valid=’True’”, and on the other, “Valid=’False’” (see the last post for more about filter conditions).  Valid is a property that gets promoted by the custom component.

 

For the Bridge configuration, I have the following: the message type is a new Request Schema that has a serial number and other information as well.  In the enrich stage, I added two XPath property definitions to promote the serial number node and another node:

enrich stage

  1. Click Enrich
  2. Open the Property Definitions
  3. Add/Edit your properties
  4. Enter the information; the Xpath type promotes properties by XPath expression.

Finally, I set up the “On Exit Inspector” to refer to the Fully Qualified Name of my signed assembly:

exit inspector

  1. Select the outer Enrich stage box
  2. Open the properties for the On Exit Inspector
  3. Provide the fully qualified name of the class in the assembly.  You can also pass parameters to the assembly in here if desired.

Deploy your solution (ensure that the custom assembly is referenced and set to Copy Local by the MABS project) as before.

Deployment may take a minute or two to complete, but now we can debug the bridge.  Right click the bridge in Server Explorer and, instead of sending a test message, this time we’ll debug; each stage will have new information to show, including the message body and any properties getting promoted.  This example is really only working with the requestMessageExtractor stage, so I’ll show those screens.  First, the “Before requestMessageExtractor” view (two properties have been promoted by earlier stages):

debug 1

 

The XPath promotions come next:

debug 2

Then the Exit Inspector portion of this stage; our new properties are promoted, including “Valid” set to true, because asdf is a valid serial number in this case.  I would not ordinarily promote the outgoing and return messages, but for debugging it’s helpful:

debug 3

And that’s that!  The custom component can do other things with the message or message data if desired.  This solution has the data going to a queue, but the data could also be sent via Exchange 365, like so:

sendmail

There’s a pretty wide range of possibilities here!

 

Finally, here’s the MessageSender class:

 

using System;
using System.Collections.Specialized;
using System.Configuration;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Mime;
using System.ServiceModel.Channels;
using System.Text;
using System.Web;
using System.Xml;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Collections.Generic;

namespace LoBLookup
{

 /// <summary>
 /// MessageSender is used to send a message to a deployed bridge end point
 /// </summary>
 public class MessageSender
 {
 /// <summary>
 /// Send message bytes to the runtime address of the deployed bridge end point
 /// </summary>
 /// <param name="acsNamespace">ACS namespace</param>
 /// <param name="issuerName">Issuer name for the specified namespace</param>
 /// <param name="issuerKey">Issuer key for the specified namespace</param>
 /// <param name="runtimeAddress">Runtime address of the deployed bridge</param>
 /// <param name="msg">String of the message to be sent</param>
 /// <param name="contentType">Content type of the message</param>
 /// <returns></returns>
 public static string SendMessage(string acsNamespace, string issuerName, string issuerKey, string runtimeAddress, string msg, string contentType)
 {
 string runtimeToken;
 UriBuilder endpointToGetAcsTokenForBuilder = new UriBuilder(runtimeAddress);
 endpointToGetAcsTokenForBuilder.Scheme = Uri.UriSchemeHttp;
 endpointToGetAcsTokenForBuilder.Port = -1;
 runtimeToken = GetAccessControlToken(endpointToGetAcsTokenForBuilder.ToString(), issuerName, issuerKey, acsNamespace);
 return SendMessageToBridge(runtimeAddress, msg, runtimeToken, contentType);
 }

 /// <summary>
 /// Get the Access Control token for the Service Bus URI
 /// </summary>
 /// <param name="endpointUri">Represents the End Point URI</param>
 /// <param name="issuerName">Issuer name for the Service Bus URI</param>
 /// <param name="issuerKey">Issuer key for the Service Bus URI</param>
 /// <returns>Access Control token</returns>
 private static string GetAccessControlToken(string endpointUri, string issuerName, string issuerKey, string acsNamespace)
 {
 string acsAddress = GetAcsAddress(acsNamespace);
 return GetAcsToken(acsAddress, issuerName, issuerKey, endpointUri);
 }

 /// <summary>
 /// Get the ACS address from the ACS namespace
 /// </summary>
 /// <param name="acsNamespace">Represents ACS Namespace</param>
 /// <returns>ACS Address</returns>

 private static string GetAcsAddress(string acsNamespace)
 {
 //UriBuilder acsUri = new UriBuilder(Uri.UriSchemeHttps + "://" + acsNamespace + "." + "accesscontrol.windows.net");
 //return acsUri.ToString();
 return "https://" + acsNamespace + ".accesscontrol.windows.net:443/";
 }

 /// <summary>
 /// Gets the ACS token for the specified Service Bus URI
 /// </summary>
 /// <param name="acsAddress">Represents ACS address</param>
 /// <param name="issuerName">Issuer name for the specified Service Bus namespace</param>
 /// <param name="issuerKey">Issuer key for the specified Service Bus namespace</param>
 /// <param name="appliesToAddress">Represents Service Bus URI</param>
 /// <returns>ACS Token</returns>
 private static string GetAcsToken(string acsAddress, string issuerName, string issuerKey, string appliesToAddress)
 {
 HttpClient client = new HttpClient();

 HttpContent content = new FormUrlEncodedContent(new Dictionary<string, string>
 {
 {"wrap_name", issuerName},
 {"wrap_password", issuerKey},
 {"wrap_scope", appliesToAddress}
 });

 var message2 = client.PostAsync(acsAddress + "WRAPv0.9/", content).Result;
 string response = message2.Content.ReadAsStringAsync().Result;
 //string response = Encoding.UTF8.GetString(responseBytes);

 // Extract the SWT token and return it.
 return response
 .Split('&')
 .Single(value => value.StartsWith("wrap_access_token=", StringComparison.OrdinalIgnoreCase))
 .Split('=')[1];

 }
 

 /// <summary>
 /// Sends message
 /// </summary>
 /// <param name="address">Represents the runtime address of the bridge end point</param>
 /// <param name="msg">Represents the string of the message to be sent</param>
 /// <param name="token">Represents ACS token</param>
 /// <param name="contentType">Content type of the message</param>
 /// <returns>Success/Failure message of the Send operation</returns>
 private static string SendMessageToBridge(string address, string msg, string token, string contentType)
 {
 string response;
 //WebClient webClient = new WebClient();
 //webClient.Headers[HttpRequestHeader.Authorization] = "WRAP access_token=\"" + HttpUtility.UrlDecode(token) + "\"";
 //webClient.Headers["Content-Type"] = contentType;
 //byte[] uploadData = webClient.UploadData(address, "POST", messageBytes);
 //response = Encoding.UTF8.GetString(uploadData);
 HttpClient c = new HttpClient();

 c.DefaultRequestHeaders.Add("Authorization", "WRAP access_token=\"" + HttpUtility.UrlDecode(token) + "\""); //Authorization = new AuthenticationHeaderValue("WRAP access_token", HttpUtility.UrlDecode(token));
 //c.DefaultRequestHeaders.Add("Content-type", "application/xml");

 //string msg = Encoding.UTF8.GetString(messageBytes);
 HttpContent content = new StringContent(msg);
 content.Headers.ContentType = new MediaTypeHeaderValue("application/xml");

 var message2 = c.PostAsync(address, content).Result;
 response = message2.Content.ReadAsStringAsync().Result;
 return response;
 }

 }
}

Muenchian Grouping in BizTalk while keeping Mapper functionality


Muenchian Grouping is a powerful technique that allows grouping by common values among looping/repeating nodes in an XML document.  BizTalk does not have out of the box support for this, but it can be achieved by adding custom XSLT to a map.  Chris Romp wrote a post about this years ago that serves as an excellent example of the idea in BizTalk: http://blogs.msdn.com/b/chrisromp/archive/2008/07/31/muenchian-grouping-and-sorting-in-biztalk-maps.aspx.  The drawback of his method is that you lose all other Mapper functionality by using completely custom XSLT, and custom XSLT is more difficult to maintain than a BizTalk map.

Enter Sandro Periera’s phenomenal tutorial on Muenchian Grouping in BizTalk maps (https://code.msdn.microsoft.com/windowsdesktop/Muenchian-Grouping-and-790347d2).  His solution is particularly powerful because it allows you to maintain the functionality and simplicity of the BizTalk Mapping engine while extending it to allow for the Muenchian Grouping technique as well.  However, there is still a limitation to this approach; the XSLT functoids will still be responsible for any transformation of child nodes that are grouped.  That poses a problem if your grouping logic requires that a parent (or perhaps even a root) node gets grouped on the criteria and many child nodes must be appended to the proper parent.

I recently faced just this situation while working for a client.  The XML data coming in needed to be extensively transformed, and in particular, duplicate child nodes had to be converted to unique parent nodes, with the original parents being appended to the correct new unique node.  Custom XSLT is clearly required here, but a hybrid approach can be used to still allow a regular BizTalk map to transform the resultant data.

My keys look like the following: each duplicate node has a pair of elements that, when joined, make it unique (ContactID and ContactType):

 <xsl:key name="Contacts" match="Contact" use="concat(ContactID, '|', ContactType)"/>

I could then use an xsl:for-each loop to join these nodes as the new parent node; the rest of the map was pretty straightforward:

<Contacts>
  <xsl:for-each select="//Root/Node1/Node2/Contact[generate-id(.) = generate-id(key('Contacts', concat(ContactID, '|', ContactType)))]">
    <Contact>
    ... parent nodes appended here ...
    </Contact>
  </xsl:for-each>
</Contacts>
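Putting the key and the loop together, a minimal standalone stylesheet for this grouping looks roughly like the following sketch; the child mapping inside each Contact is elided here, just as it is above:

```xml
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="xml" indent="yes"/>

  <!-- One key entry per Contact, grouped on the ContactID|ContactType pair -->
  <xsl:key name="Contacts" match="Contact" use="concat(ContactID, '|', ContactType)"/>

  <xsl:template match="/">
    <Contacts>
      <!-- Muenchian grouping: keep only the first Contact for each key value -->
      <xsl:for-each select="//Root/Node1/Node2/Contact[generate-id(.) =
          generate-id(key('Contacts', concat(ContactID, '|', ContactType)))]">
        <Contact>
          <ContactID><xsl:value-of select="ContactID"/></ContactID>
          <ContactType><xsl:value-of select="ContactType"/></ContactType>
          <!-- key() returns every duplicate in the group, so the original
               parent data can be appended inside this inner loop -->
          <xsl:for-each select="key('Contacts', concat(ContactID, '|', ContactType))">
            <!-- ... parent nodes appended here ... -->
          </xsl:for-each>
        </Contact>
      </xsl:for-each>
    </Contacts>
  </xsl:template>
</xsl:stylesheet>
```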

To avoid having very complicated logic and needing to replicate custom functoids in that “…” part, I started considering ways to have two maps: one using custom XSLT and the second being a regular BizTalk map.

The most basic way to achieve this strategy would be to have two maps called in sequence from an orchestration: the first map has the custom XSLT to do the Muenchian grouping, and the second is a regular BizTalk map that works from the output of the first.  This would work, but not if you want to do mapping on a receive port (which the architecture called for), where only the first map would get triggered.  It is also a bad idea if the message sizes are large (which occasionally happens with this trading partner).  Another method would be the one discussed here (again by Chris Romp): http://blogs.msdn.com/b/chrisromp/archive/2008/08/06/stacking-maps-in-biztalk-server.aspx, but this would involve creating several dummy receive locations to achieve something that can be done fairly simply in a pipeline component.

My eventual solution was pretty simple: create a pipeline component with the custom XSLT as an embedded resource.  I decided to put it in the decode stage, but it could have gone in the XML Disassembler stage, or in a C# helper library that gets called by an orchestration.  Here’s the relevant code:

muenchian

A quick explanation of the code: a VirtualStream is used to manipulate the incoming message; the embedded XSLT is loaded as a resource, and XslCompiledTransform runs the transform against the message.  The new stream is added to the context’s ResourceTracker to ensure the streams are properly disposed of when the pipeline is done with them, and the message is passed on to the next stage of the pipeline.
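Since the code itself is shown only as a screenshot, here is a hedged reconstruction of what such a decode-stage Execute method can look like.  The embedded resource name is a placeholder, and the details may differ from the actual component:

```csharp
// Sketch of a decode-stage Execute method that applies embedded XSLT.
// Assumes references to Microsoft.BizTalk.Pipeline.dll and
// Microsoft.BizTalk.Streaming.dll; "MyProject.Muenchian.xslt" is a
// hypothetical embedded-resource name.
public Microsoft.BizTalk.Message.Interop.IBaseMessage Execute(
    Microsoft.BizTalk.Component.Interop.IPipelineContext pContext,
    Microsoft.BizTalk.Message.Interop.IBaseMessage pInMsg)
{
    var transform = new System.Xml.Xsl.XslCompiledTransform();

    // Load the XSLT compiled into this assembly as an embedded resource.
    using (System.IO.Stream xsltStream = System.Reflection.Assembly
        .GetExecutingAssembly()
        .GetManifestResourceStream("MyProject.Muenchian.xslt"))
    using (var xsltReader = System.Xml.XmlReader.Create(xsltStream))
    {
        transform.Load(xsltReader);
    }

    // VirtualStream overflows to disk past a threshold, so large messages
    // won't exhaust memory.
    var outStream = new Microsoft.BizTalk.Streaming.VirtualStream();
    using (var bodyReader = System.Xml.XmlReader.Create(
        pInMsg.BodyPart.GetOriginalDataStream()))
    {
        transform.Transform(bodyReader, null, outStream);
    }
    outStream.Position = 0;
    pInMsg.BodyPart.Data = outStream;

    // Let BizTalk dispose the stream once the pipeline is done with it.
    pContext.ResourceTracker.AddResource(outStream);
    return pInMsg;
}
```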

This approach does have a drawback.  If the trading partner changes the schema, there will now be several updates to make: the pipeline component’s XSLT, the map, and the inverted schema.  However, this change is not likely to occur often, and using this method means that future developers can continue to use the BizTalk mapper functionality (including some custom functoids) when mapping the partially transformed trading partner data to the internal canonical format.

And the upside is worth it.  The complexity of the XSLT grouping is handled in one place, and it will be easier to maintain changes to that separately from changes to the rest of the mapping logic.  Plus, new developers on the solution will be able to quickly see how the data is getting sent into the canonical, a task that is more difficult to understand when looking at raw XSLT.


BizTalk: Slow Performance while viewing or restarting host instances


Recently, while troubleshooting performance issues at a client, I came across a peculiar issue.  The environment in question was a multi-server environment with multiple BizTalk Servers (2010) connected to a SQL Server cluster.  The issue was extremely slow performance when trying to view, refresh, start, or stop the host instances through the Administration Console.  Usually it takes a few seconds for host instances to start, stop, or restart; in this case, however, just refreshing the status of the host instances was taking over a minute.

The problem was that one of the BizTalk Servers in the group was offline for some time, causing the host instances tied to that BizTalk Server, within the same group, to become unresponsive.  This caused slowdown in viewing the host instances for any of the other servers in the BizTalk Group.

There are two ways to resolve this:

1. Turn on any offline BizTalk Servers

2. Remove the host instances for any BizTalk Servers that are no longer needed.

In our case, we chose the second option, as the offline BizTalk Server was no longer needed.  Once we removed its host instances from the group, the Administration Console was much snappier and responded more quickly when refreshing, starting, or stopping host instances.

Capturing and Debugging a SQL Stored Procedure call from BizTalk


So you design your strongly typed stored procedure to take table types from BizTalk, and it runs great with your test cases.  It works well through unit testing, but then you start running larger jobs and suddenly SQL is choking on it.

Ever wish you could just run that SQL call directly in SSMS with those exact several thousand rows for the table type parameters, and step through it using the debugger?  Well, you can using SQL Server Profiler (and/or Server Traces).  I used this technique recently to help a client resolve a particularly thorny issue that came up when they tried to process some larger messages.

To walk through the process of doing this, I’ll use a database named BTSTrainDb with a stored procedure (dbo.usp_DemoTableTypeSP) that takes a user-defined Table Type (dbo.DemoTableType) as a parameter and then just selects * from it (echoing it back to the caller).
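For reference, the demo objects can be created with something like the following; the two columns are arbitrary placeholders for this walkthrough:

```sql
-- Demo table type and "echo" stored procedure used in this walkthrough.
CREATE TYPE dbo.DemoTableType AS TABLE
(
    Id   int,
    Name varchar(50)
);
GO

CREATE PROCEDURE dbo.usp_DemoTableTypeSP
    @Rows dbo.DemoTableType READONLY
AS
BEGIN
    -- Just select * from the table type, echoing it back to the caller.
    SELECT * FROM @Rows;
END
GO
```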

First, fire up SQL Server Profiler and create a new trace.  Uncheck all of the event boxes.  On the Events Selection tab, check Show all events and Show all columns, and then select the RPC:Completed and RPC:Starting events; it’s not a bad idea to uncheck the Text column for the RPC:Completed event (as this will repeat the data from the RPC:Starting event).

Trace properties

 

Next, you’ll want to add filters that will capture only calls to usp_DemoTableTypeSP on BTSTrainDb; here, I’m filtering on the database and object names – you would want a username filter as well if there were other users calling the procedure that you’d like to ignore:

DBNameFilter

ObjNameFilter

 

Click OK and then click Run to start the trace.

This will work fine for RPCs using small amounts of data (like test cases with only a few records).  That can be helpful if you just need to get a quick template for running the stored procedure directly during testing, but once you start feeding in larger amounts of data you’ll see the message “Trace Skipped Records.”  The sample below shows both types of occurrences:

TraceSample

So, I have my small sample (super handy for quickly debugging the procedure when I make changes), but when I tried to send in a large sample, the Profiler avoided crashing its GUI by not loading several megabytes of text data.  To resolve this, you’ll need to start a server-side trace.  This is easily accomplished by exporting the trace you just made; click “File->Export->Script Trace Definition->For SQL Server 2005-SQL11…”.  Load the resulting file up in SSMS.  You’ll want to increase the @maxfilesize parameter and change the filename to something SQL Server will be able to write to; take note of the TraceID that it creates as well, as you’ll probably want to stop or dispose of it later:

trace-start

When I run the big file through again now, I get the whole procedure call in the c:\temp\DemoTrace.trc file.  I can stop the trace by running the following stored procedure (the first parameter is the TraceID from earlier, the second is the status value):

exec sp_trace_setstatus 2, 0

If I wanted to remove the trace entirely, it’d be exec sp_trace_setstatus 2, 2; or to start it up again, exec sp_trace_setstatus 2, 1.  See http://msdn.microsoft.com/en-us/library/ms176034.aspx for more information.  When I open up the DemoTrace.trc file in Profiler, I can get the whole call:

LargeTrace

(note the ID column here, which indicates how many rows there are – the file generated almost 165000 rows!)

You can export it by going to File->Export->Extract SQL Server Events->Extract Transact-SQL Events… and saving it as a .sql file.  I now have full debugging access to the SQL stored procedure with the exact data that’s been causing problems!
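The exported .sql file contains runnable T-SQL along these lines (values abbreviated; a real large capture contains one insert line per captured row, which is exactly what makes it so useful):

```sql
-- Illustrative shape of an exported RPC:Starting event for a
-- table-type parameter; the actual values come from the captured trace.
DECLARE @p1 dbo.DemoTableType;
INSERT INTO @p1 VALUES (1, 'Widget');
INSERT INTO @p1 VALUES (2, 'Gadget');
-- ...thousands more rows in a large capture...

EXEC dbo.usp_DemoTableTypeSP @Rows = @p1;
```

With this loaded in SSMS, you can step through the stored procedure in the debugger with the exact problem data.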

 

 

 

Inserting into Multiple Parent-Child SQL Tables in a Single Stored Procedure Call using the BizTalk Server WCF-SQL Adapter


This post covers how to load records into several tables that have parent-child relationships, via a single stored procedure call and the use of SQL user-defined table types. You may have seen how to use table types for BizTalk SQL loads before (such as in this post), but the approach detailed below shows how to load multiple parent and child records within the same stored procedure, while also executing no more than one insert statement for each table being loaded. (Note that user-defined table types are only available in SQL Server 2008 or later.)

Having BizTalk make just one SQL stored procedure call for all records has multiple benefits:

  • Reduces the number of BizTalk messages going through the BizTalkMsgBoxDb database, as well as the number of roundtrips between BizTalk and SQL Servers, reducing overhead and improving speed of loads
  • All record loads can be performed in the same transaction
  • No need to de-batch and create separate messages (for each root parent record) within BizTalk

In addition, the stored procedure calls just one insert for each table (avoiding having to loop through records), resulting in significant performance gains.

Suppose you need to load the following Order tables:

clip_image002[4]

Now also suppose that for the input, your BizTalk solution will receive an Order XML file containing one or more orders.

clip_image004[4]

In the above CanonOrders schema, there can be one or more Order nodes. Each Order can have one Header, multiple Addresses, multiple Details, and each Detail can have multiple Notes.

 

Step 1: Create SQL Table types

First, we need to create the SQL table types. When creating these, you only need to include the fields that you’ll be passing into the stored procedure. For example, fields such as CreatedDate or ModifiedDate are excluded here, as those fields will be set within the stored procedure.

clip_image006[4]

Also, in place of primary and foreign key Identity columns, add ‘Temporary Id’ columns. For example, in the OrderDetail table type, OrderDetail_TempId (varchar(20)) takes the place of the OrderDetailId (bigint) Identity column, and OrderHeader_TempId (varchar(20)) takes the place of the OrderHeaderId (bigint) foreign key.

There’s no reason to include the identity columns, because we won’t know the identity values until after the records have been loaded into the tables. And the temporary Id columns allow us to link parent and child records together when inserting into the tables later.

Note: Here we’re using varchar(20) for the TempIds because later we’ll be mapping alphanumeric values to it, and the length of 20 will cover our range of values. But know that the datatype doesn’t need to be limited to varchar (it can be a numeric type such as int or bigint, depending on how you later map to it).
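As a sketch, the OrderHeader and OrderDetail table types described above might be declared as follows; the non-key columns are placeholders:

```sql
-- Temp Id columns stand in for the Identity primary/foreign key columns.
CREATE TYPE dbo.OrderHeaderType AS TABLE
(
    OrderHeader_TempId varchar(20),  -- replaces OrderHeaderId (bigint identity)
    OrderNumber        varchar(50)   -- placeholder business column
);
GO

CREATE TYPE dbo.OrderDetailType AS TABLE
(
    OrderDetail_TempId varchar(20),  -- replaces OrderDetailId (bigint identity)
    OrderHeader_TempId varchar(20),  -- replaces the OrderHeaderId foreign key
    ItemNumber         varchar(50)   -- placeholder business column
);
GO
```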

clip_image008[4]

Step 2: Create the stored procedure

We need to create the stored procedure next.

clip_image010[4]

Declare ‘IdMapping’ cross-reference (xref) table variables, which will be used to look up the actual Identity value tied to a temp Identity value. We only need to create these xref tables for parent tables that have child tables. (For this Order example, we only need ones for OrderHeader and OrderDetail.)

clip_image012[4]

Next we insert into each table, starting with the root parent and drilling down from there. When inserting into a parent table, we’ll also use SQL’s OUTPUT clause to capture the newly inserted Identity value, as well as the corresponding temp id, in our ‘IdMapping’ xref table.

 

Inserting into Order Header

clip_image014[4]

You’ll see that we’re using a MERGE statement. This is done to get around a limitation of the OUTPUT clause: with MERGE, we can select a column from the source table in the OUTPUT clause. In this case, it allows us to select the OrderHeader_TempId field from the @OrderHeader table type parameter. This field needs to be loaded into the @OrderHeaderIdMapping xref table variable, along with the actual inserted Identity, for foreign key lookups later in the procedure.
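The OrderHeader insert shown in the screenshot boils down to something like this sketch (columns other than the ids are placeholders):

```sql
-- Xref table variable: maps each temp id to the real identity value.
DECLARE @OrderHeaderIdMapping TABLE
(
    OrderHeader_TempId varchar(20),
    OrderHeaderId      bigint
);

-- ON 1 = 0 never matches, so every source row falls into WHEN NOT MATCHED
-- and gets inserted. Unlike a plain INSERT...OUTPUT, MERGE's OUTPUT clause
-- may reference source-table columns, letting us capture the temp id
-- alongside the generated identity.
MERGE dbo.OrderHeader AS tgt
USING @OrderHeader AS src
    ON 1 = 0
WHEN NOT MATCHED THEN
    INSERT (OrderNumber, CreatedDate)
    VALUES (src.OrderNumber, GETDATE())
OUTPUT src.OrderHeader_TempId, inserted.OrderHeaderId
    INTO @OrderHeaderIdMapping (OrderHeader_TempId, OrderHeaderId);
```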

 

Inserting into OrderAddress

clip_image016[4]

This table doesn’t have any child tables, so we don’t need to output the OrderAddressId identity values. It’s a more straightforward insert. But we do need to join with @OrderHeaderIdMapping to look up the OrderHeaderId foreign key value (based on @OrderAddress.OrderHeader_TempId).
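That child-table insert can be sketched as a plain INSERT...SELECT joined back to the xref table (the address columns are placeholders):

```sql
-- Look up the real OrderHeaderId for each address via the temp id.
INSERT INTO dbo.OrderAddress (OrderHeaderId, AddressType, AddressLine1)
SELECT  hm.OrderHeaderId,
        a.AddressType,
        a.AddressLine1
FROM    @OrderAddress AS a
JOIN    @OrderHeaderIdMapping AS hm
        ON hm.OrderHeader_TempId = a.OrderHeader_TempId;
```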

 

Inserting into OrderDetail

clip_image018[4]

Just as we did in the OrderHeader insert, we need to output the inserted identity values along with the corresponding temp id. For OrderDetail we insert these values into @OrderDetailIdMapping, so that we can look up OrderDetailId later when inserting into OrderDetailNote. In addition, when specifying the source fields in the MERGE statement, we need to join with @OrderHeaderIdMapping to look up the OrderHeaderId foreign key value (based on @OrderDetail.OrderHeader_TempId).

 

Inserting into OrderDetailNote

clip_image020[4]

This table doesn’t have any child tables, so we don’t need to output the OrderDetailNoteId identity values. But we do need to join with @OrderDetailIdMapping to look up the OrderDetailId foreign key value (based on @OrderDetailNote.OrderDetail_TempId).

 

Step 3: Generate BizTalk SQL Schemas

To generate the SQL schemas for the uspInsertOrders stored procedure…

  • In Visual Studio, right-click the BizTalk (schemas) project, and select Add –> Generated Items…
  • In the ‘Add Generated Items’ window, select Consume Adapter Service

clip_image022[4]

  • In the Consume Adapter Service window:
    • Select ‘sqlBinding’, and configure your SQL connection string URI
    • Click Connect
    • Select ‘Client (Outbound operations)’ as the contract type
    • Click on Strongly-typed Procedures, select the stored procedure under ‘Available categories and operations’ (in the case below ‘dbo.uspInsertOrders’), and click ‘Add’

clip_image024[4]

Two schemas will be generated, one for the table types (InsertOrdersTableType.dbo.xsd), and one for the stored procedure call (InsertOrdersTypedProcedure.dbo.xsd). Additionally, a bindings file for a two-way WCF-SQL Send Port is created. Before you import this into your application, you’ll want to edit the bindings to rename the Send Port and set the ‘ApplicationName’. I also renamed the file to InsertOrders.bindinginfo.xml, and changed it from a two-way Send Port to a one-way Send Port.

Optionally, you can specify a Filename Prefix in the ‘Consume Adapter Service’ window. Otherwise, you’ll end up with schemas named TableType.dbo.xsd and TypedProcedure.dbo.xsd, which is not recommended, and can be problematic if you need to generate additional schemas for other stored procedure calls.

clip_image026[4]

Step 4: Map to the SQL Stored Procedure Request schema

When mapping CanonOrders to the uspInsertOrders stored procedure call, set loops between the source nodes and the corresponding ‘Type’ Node.

clip_image028[4]

Next we need to map the temporary Primary and Foreign Key Ids in the table types. In the mapping below, we create a separate mapping page called ‘Ids’ for readability.

clip_image030[4]

In this Order mapping, the temporary Id value we’re using is a combination of: <parent record temp id> + <identifier text> + <Iteration Id>. If we only used the Iteration functoid value for, say, the OrderAddress_TempId, we’d end up with duplicate OrderAddress_TempId values, because the Iteration functoid restarts at 1 when iterating Addresses on the next Order record. By including the parent record’s temp id as a prefix, we ensure all child TempIds are unique across all parents.

But note that you can use any value for the temp id, as long as it’s unique for each child record, and as long as the same value is used for the corresponding ‘XXX_TempId’ Primary and Foreign keys.

Some examples of Temp Id values that would result from the above mapping:

  • First Order: OrderHeader_TempId value=HDR1
  • First Order, first OrderDetail: OrderDetail_TempId value=HDR1DTL1
  • First Order, first OrderDetail: OrderHeader_TempId (temp ‘foreign key’ id) value=HDR1
  • First Order, second OrderDetail: OrderDetail_TempId value=HDR1DTL2
  • Second Order, first OrderDetail: OrderDetail_TempId value=HDR2DTL1

Below is the String Concatenate functoid configuration for the OrderDetail_TempId value (where the Header temp Id is included as a prefix).

clip_image032[4]

The following is an example map result, given an input CanonOrders.xml with two orders. The first Order has two Addresses, and the second Order has one Address.

Mapped OrderHeaderType records

clip_image034[4]

Mapped OrderAddressType records

clip_image036[4]

Step 5: Deploy and Test

Finally, we have all of the components needed to execute end to end. The example input order file here is rather small (with only two orders, and a couple of addresses and order details each).

Sample CanonOrders.xml input

clip_image038[4]

clip_image040[4]

As you can see in the SQL table results below, all inserted records have proper foreign keys set for all child tables.

clip_image042[4]

To recap, the basic steps to this approach:

  1. Create SQL table types for the tables being loaded. Include Temporary Id columns in place of primary and foreign key Identity key columns.
  2. Create the stored procedure that takes the table types as parameters and loads the data. This stored procedure executes one insert per table. While inserting into parent tables, the generated identity values, along with the temp id values, are output to an id xref table variable.
  3. Generate BizTalk schemas for the SQL procedure call and table type parameters.
  4. Map to the stored procedure, setting each table type record’s temp primary and foreign key ids (to maintain the link between parent and child records).
  5. Deploy the schemas and map. Deploy the WCF-SQL Send Port and test.
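The key to step 2 is the xref between temp ids and the real identity values generated on insert. The following Python sketch is illustrative only; in the actual solution this logic lives in T-SQL (using OUTPUT ... INTO an xref table variable). It simulates how parent inserts populate the xref and how child rows then resolve their temp foreign keys:

```python
def load_orders(header_rows, detail_rows, next_identity=1):
    """Simulate the stored procedure's insert logic: parents receive
    identity values, an xref maps temp ids to real ids, and the
    children's temp foreign keys are resolved through that xref."""
    xref = {}      # OrderHeader_TempId -> real OrderHeaderId
    headers = []
    for temp_id, data in header_rows:
        xref[temp_id] = next_identity   # like OUTPUT inserted.OrderHeaderId in T-SQL
        headers.append((next_identity, data))
        next_identity += 1
    # The child insert joins its temp foreign key through the xref.
    details = [(xref[parent_temp_id], data) for parent_temp_id, data in detail_rows]
    return headers, details

headers, details = load_orders(
    [("HDR1", "order 1"), ("HDR2", "order 2")],
    [("HDR1", "detail a"), ("HDR1", "detail b"), ("HDR2", "detail c")],
)
```

After the simulated load, every detail row carries the real identity of its parent (1 or 2), which is precisely what the SQL table results above show.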


Alternatives to using Table Types

Of course, the use of table types is not required for this approach. If you’re loading into SQL Server 2005, table types are not an option. Instead, you can pass in XML as an argument to the stored procedure, and have SQL query the XML.

However, if given a choice I’d always recommend passing in table types over XML for a few reasons:

  • Passing in XML bloats the data, due to the XML tags.
  • The XML must be parsed within SQL Server, which is expensive. Table types are native to SQL and can be read/used immediately without any transformation.
  • You can enforce data/type integrity with table types. With XML, you can pass in bad data/incorrect types, and not know it until the XML is parsed within the stored procedure.

Announcing the T-Connect EDI Viewer!


EDI files offer many advantages in business and health care messaging, and BizTalk offers a premier suite for handling EDI interchanges and agreements.  At the same time, EDI files are notoriously difficult to read, even for EDI experts and highly skilled Business Analysts. The T-Connect EDI Viewer is designed to address this gap.  It supports converting the HIPAA EDI 837 transaction sets (for Professional, Institutional, and Dental) as well as the HIPAA EDI 834 Enrollment transaction set to a human-readable and printable PDF format.

The T-Connect EDI Viewer offers a library and a pipeline component that can be used to convert EDI messages sent to BizTalk into a PDF version of their Paper Form equivalents.  This conversion can be done as part of an existing integration, or on archived/historical data on a case-by-case basis.  The pipeline component offers a very simple way of adding this functionality to an existing BizTalk application with little to no development effort, while the library offers a simple but powerful API that gives BizTalk application developers more fine-grained control over the process.

A fully functional 30-day trial is available here.

Keep reading to see a sample conversion!

To install the trial, simply start the MSI.  The User Installation Guide offers step-by-step support for each option during the installation process.

Once installed, you can test the application with any 837P, I, or D file.  It will consume the file and write PDF versions of the transactions to a specified location.  An 837P file that starts out like this:

ISA*00* *00* *ZZ*EDIConvTPsamp1 *ZZ*Tallansamp1 *120926*0642*U*00501*000000252*0*T*:~
GS*HC*XXXXX-CPS*Tallan*20120926*0642*252*X*005010X222A1~
ST*837*000000252*005010X222A1~
BHT*0019*00*000000252*20120926*0642*CH~
NM1*41*2*Mayo Clinic*****46*S32433~
PER*IC*Mayo Clinic*TE*9547487111*FX*9547487222~
NM1*40*2*Zirmed*****46*Zirmed~
HL*1**20*1~
NM1*85*1*John*Smith*J***XX*6565656565~
N3*5537 Some Street*APT ZZ~
N4*Big City*NY*000008888~
REF*EI*222222222~
....

is converted to this:

sample837p
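Under the hood, any EDI-to-readable conversion starts by tokenizing the X12 text into segments and elements. The following Python sketch is not T-Connect code, just an illustration of that first step, applied to a few segments from the sample above:

```python
def parse_segments(edi, seg_term="~", elem_sep="*"):
    """Split raw X12 text into segments, then into trimmed elements."""
    return [
        [e.strip() for e in seg.split(elem_sep)]
        for seg in edi.strip().split(seg_term) if seg.strip()
    ]

# A few segments from the sample 837P above.
sample = "NM1*85*1*John*Smith*J***XX*6565656565~N3*5537 Some Street*APT ZZ~N4*Big City*NY*000008888~"
for seg in parse_segments(sample):
    if seg[0] == "NM1" and seg[1] == "85":   # billing provider name
        print("Billing Provider:", seg[3], seg[4])
    elif seg[0] == "N3":                     # street address
        print("Address:", seg[1])
    elif seg[0] == "N4":                     # city/state/ZIP
        print("City/State/ZIP:", seg[1], seg[2], seg[3])
```

The viewer's real job is the much larger mapping from these coded elements to the labeled boxes of the paper form, but the tokenization above is where that process begins.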


Does your organization have custom needs for this tool?  Tallan can provide custom solutions around this product, including:

  • Use of your own in-house custom form
  • Alternate or nonstandard coding/value meanings
  • Providing additional information from the EDI transactions that isn’t normally captured on the paper form

Simply contact BizTalk@tallan.com!

New Update to the T-Connect EDI Splitter Now Available!


Picture1

In September of last year, we released the T-Connect EDI Splitter for BizTalk.  We are proud to announce the release and immediate availability of the newest version of the T-Connect EDI Splitter.  For the uninitiated, the EDI File Splitter is a Microsoft BizTalk Server pipeline component that enables the rapid splitting of large EDI files within BizTalk Server.  The split files are delivered to a user-configurable folder location.  Along with the additional features added to the EDI Splitter for BizTalk, we are also releasing a standalone application for Microsoft Windows, available today: the T-Connect EDI Splitter for Windows.

The T-Connect EDI Splitter for BizTalk and the T-Connect EDI Splitter for Windows provide these benefits and features:

  • Faster file processing speed - Splitting large files into smaller files improves processing time significantly
  • Simple Deployment - Easy configuration settings promote flexibility
  • Broad Transaction Support
    • Supports HIPAA X12 EDI transaction sets
    • NEW: Supports Supply Chain/Order Processing/Transportation X12 EDI transaction sets
  • Flexibility
    • Split by record count - specify the record count of the individual files as they are split from the larger EDI file
    • NEW: Split incoming messages into a predetermined number of target files
  • Throttling - A delay-split-file timer allows you to control and throttle the delivery of individual files as they are split
  • NEW: Enhanced, easy-to-use installation wizard
  • Platform Support
    • BizTalk Support - Deploys as a BizTalk Pipeline Component that easily ‘bolts on’ to existing BizTalk environments and works with any supported BizTalk adapter
    • NEW: Windows Support - A standalone Windows version for quick splitting and viewing of files without the need for additional servers or software
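To make the split-by-record-count feature concrete, here is a simplified Python sketch of the general technique (this is not the T-Connect implementation): group the ST..SE transaction sets, chunk them by the desired count, and repeat the envelope around each chunk:

```python
def split_by_transaction_count(segments, per_file):
    """Group ST..SE transaction sets into chunks of `per_file` transactions,
    repeating the envelope (ISA/GS ... GE/IEA) around each chunk.
    `segments` is a list of element lists, as tokenized from the raw X12."""
    header, trailer, transactions, current = [], [], [], None
    for seg in segments:
        tag = seg[0]
        if tag == "ST":                      # start of a transaction set
            current = [seg]
        elif tag == "SE":                    # end of a transaction set
            current.append(seg)
            transactions.append(current)
            current = None
        elif current is not None:            # segment inside a transaction set
            current.append(seg)
        elif tag in ("ISA", "GS"):           # interchange/group header
            header.append(seg)
        else:                                # GE, IEA trailer
            trailer.append(seg)
    chunks = [transactions[i:i + per_file]
              for i in range(0, len(transactions), per_file)]
    # NOTE: a production splitter must also renumber control counts
    # (SE01, GE01, IEA01) so each output file remains a valid interchange.
    return [header + [s for t in chunk for s in t] + trailer for chunk in chunks]

segments = [["ISA", "00"], ["GS", "HC"],
            ["ST", "837", "1"], ["SE", "2", "1"],
            ["ST", "837", "2"], ["SE", "2", "2"],
            ["ST", "837", "3"], ["SE", "2", "3"],
            ["GE", "3"], ["IEA", "1"]]
files = split_by_transaction_count(segments, per_file=2)
```

With three transactions and a record count of two, the sketch produces two output files, each wrapped in its own copy of the envelope.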

Both products support a wide range of transaction sets including:

  • HIPAA Transaction sets version 005010 Errata
    • 270 Transaction Eligibility/Benefit Inquiry
    • 271 Transaction Eligibility or Benefit Information (response to 270)
    • 276 Transaction Claim Status Request
    • 277 Transaction Claim Status Notification (response to 276)
    • 278 Transaction Referral/Authorization Request, Response to Request for Review
    • 820 Transaction Premium Payment
    • 834 Transaction Benefit Enrollment and Maintenance
    • 835 Transaction Claim Payment/Advice (Electronic Remittance)
    • 837 Transaction (three implementations: Institutional, Professional, and Dental)
  • NEW: Non-HIPAA Transaction sets version 005010
    • 810 Invoice
    • 832 Price/Sales Catalog
    • 846 Inventory
    • 850 Purchase Order
    • 856 Ship Notice/Manifest

Note: Contact us for 004010 transaction set support!

Information

We have rolled this product into a simple-to-install package, which can be downloaded from the following page.  A 30-day, full-functionality trial is included in the download.  If you would like to learn more about the T-Connect EDI File Splitter, or to purchase a license, please contact us.
