QuickLearn Re-Launches Azure Logic Apps Class

By John Callaway

QuickLearn Is Excited to Announce the Availability of the Improved Logic Apps Course

The week of July 25th was a big one for Microsoft’s Azure Logic Apps, with the announcement that Logic Apps had reached general availability, and it was a big week for QuickLearn as well. We have been working for months on honing our expertise with Logic Apps so that we would be ready to deliver the new and improved Cloud-Based Integration Using Azure App Service course in conjunction with Microsoft’s release. This course has been expanded from the original three-day version to a five-day version that includes a full one-day workshop where attendees build a complete integration solution using Logic Apps, Azure Service Bus, and various API App connectors.

Nick Hauenstein and Rob Callaway worked tirelessly over the last few weeks putting the finishing touches on what I have to say is a killer course. The rest of the team provided support in testing and editing, but those two did the heavy lifting in truly getting up to speed. Nick did a Herculean job delivering the course to a truly international audience, with attendees in London, Sydney, and, of course, our office in Kirkland.

I had the opportunity to attend the class as a student. I have been following the development of Azure App Services as I’m sure many of you have, and felt I had a pretty good handle on how they all work, but I have to admit I came away with a much better idea of how all the parts can work together. Nick has a way of building great scenarios and explaining how the available parts can be used to build a complete integration.

For those of you keeping track, the timing meant that Nick had to shift gears mid-week as Microsoft pushed the GA bits into production. It was interesting to see things work one way in one demo and literally an hour later work a different way!

We also had a real treat when Jeff Hollan, Program Manager for Azure Logic Apps, dropped by and spent about an hour talking about Logic Apps and answering questions for the students. It’s great being so close to the Microsoft campus; we always appreciate visits from our friends there.

What Does the Future of Integration Look Like?

I wish I had a nickel for every time a student has asked that question in class. It has been a puzzling question to answer, since the story coming out of Redmond has been evolving over the last few years. Fortunately, the story is a good one.

Everyone needs integration. For years, if you wanted to build a robust integration solution using .NET, you really only had two options: start from scratch and build the whole thing yourself (a very time-consuming process), or buy BizTalk Server. Although BizTalk is an awesome and powerful product, the learning curve is rather steep and the cost of ownership is often high. What was needed was integration for the little guys.

Azure App Service is (becoming) the solution to this problem. Azure App Service is a fully managed platform for web, mobile, and integration scenarios. Our course focuses on connecting your on-premises resources to cloud services such as Service Bus and on building complexity into your solutions via Logic Apps. Although it isn’t a replacement for BizTalk, it shares many of the capabilities and features that BizTalk developers would be familiar with.

Does that mean you don’t need BizTalk anymore? Not at all! BizTalk still provides a very powerful processing engine whether you choose to run it in Azure or on your own hardware. Azure App Service simply provides an option to do some of the things BizTalk is capable of. It is probably best suited for .NET developers who aren’t familiar with BizTalk Server but are looking to integrate with Azure resources.

From time to time I have been asked how Microsoft Flow fits into all of this. Flow uses the same connectors and services that are built into App Service; it just doesn’t give developers the ability to extend it using Logic Apps and API Apps. With Flow, you are moving into a home that is already furnished for you. With Azure App Service, you have the house, a toolbox, and a pile of wood to finish it off just the way you want it.

Is This the Right Course for You?

If you happen to be new to integration and are looking for a good place to start, this course is it. On the other hand, if you are an experienced BizTalk developer and you are interested in exploring the future, this is also the course for you. The amount of crossover between the two products is surprisingly small as far as the tools that you use, although of course the concepts will seem very familiar to you.

There are still seats available for the September 19th delivery being presented by Rob Callaway at our Kirkland location (also available for remote attendance). If you are in Europe, you have two opportunities coming up. I will be delivering the class in Oslo, Norway with our partner Bouvet on October 24th, or you can celebrate Halloween with an American (October 31st) with our partner InfoSupport in Utrecht, Netherlands.

Logic Apps is Officially GA + New Features

By Nick Hauenstein

Today the Logic Apps team officially announced the general availability of Logic Apps! We’ve been following developments in the space since it was first unveiled back in December of 2014. The technology has certainly come a long way since then, and is becoming capable of serving as part of enterprise integration solutions in the cloud. A big congratulations is in order for the team that has carried it over the finish line (and that is already hard at work on the next batch of functionality to be delivered)!

Along with hitting that ever important GA milestone, Logic Apps has recently added some new features that really improve the overall experience in using the product. The rest of this post will run through a few of those things.

Starter Templates


When you go and create a new Logic App today, rather than being given an empty slate and a dream, you are provided with some starter templates with which you can build some simple mash-ups that integrate different SaaS solutions with one another and automate common tasks. If you’d still rather roll up your sleeves and dig right into the code of a custom Logic App, there is nothing preventing you from starting from scratch.

Designer Support for Parallel Actions

Ever since the designer went vertical, it has been very difficult to visualize the flow of actions whenever there were actions that could execute in parallel. No longer! You can now visualize the flow exactly as it will execute – even if there are actions that will be executing in parallel!


Logic Apps Run Monitoring

Another handy improvement to the visualization of your Logic Apps is the new runtime monitoring visualization provided in the portal. Instead of seeing a listing of each action in your flow alongside their statuses – with tens of clicks involved in taking in the full state of the flow at any given time – a brand new visualizer can be used to see everything in one shot.

The visualization captures essentially the same thing that you see in the Logic App designer, but shows both the inputs and the outputs on each card along with a green check mark (Success), red X (Failure), or gray X (skipped) in the top-right corner of the cards.


Additionally, if you have a for each loop within your flow, you can actually drill into each iteration of the loop and see the associated inputs/outputs for that row of data.


Visual Studio Designer

There is one feature that you won’t see in the Azure portal. In fact, it’s designed for offline use – the Visual Studio designer for Logic Apps. The designer can be used to edit those Logic App definitions that you’d rather manage in source control as part of an Azure Resource Group project – so that you can take advantage of things like TFS for automated build and deployment of your Logic Apps to multiple environments.

Unfortunately, at the moment you will not experience feature parity with the Azure Portal (i.e., it doesn’t do scopes or loops), but it can handle most needs and sure is snappy!


That being said, do note that at the moment, the Visual Studio designer is still in preview and the functionality is subject to change, and might have a few bugsies still lingering.

Much More

These are just a few of the features that stick out immediately while using the GA version of the product. However, depending on when you last used the product, you will find that there are lots of runtime improvements and expanded capabilities as well (e.g., being able to control the parallelism of the for each loops so that they can be forced to execute sequentially).
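For example, forcing a for each loop to run sequentially comes down to an option on the action in code view. Here is a minimal illustrative fragment (action names invented; property names per the GA workflow definition language as I understand it):

```json
"For_each_order": {
  "type": "Foreach",
  "foreach": "@body('Get_orders')",
  "actions": {
    "Process_order": {
      "type": "Http",
      "inputs": { "method": "POST", "uri": "https://example.com/orders" }
    }
  },
  "runAfter": {},
  "operationOptions": "Sequential"
}
```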

Be Prepared

So how can you be prepared to take your integrations to the next level? Well, I’m actually in the middle of teaching all of these things right now in QuickLearn Training’s Cloud-based Integration using Logic Apps class, and in my humble and biased opinion, it is the best source for getting up to speed in the world of building cloud integrations. I highly recommend it. There are still a few slots left in the September run of the class if you’re interested in keeping up with the cutting edge, but don’t delay too long, as we expect to see these classes fill up through the end of the year.

As always, have fun and do great things!

Deploying and Managing Logic Apps using Visual Studio Team Services

By Rob Callaway

One of the concerns that I have repeatedly heard from customers when we talk about Azure is application lifecycle management. If you do most of your resource deployment and management using the Azure Portal, then you probably picture a very manual migration process if you want to move your app from dev to test, or if you want to share your app with another developer.

A clear example of this occurred during a run of QuickLearn’s Cloud-Based Integration Using Azure App Service course, when my students were quick to see that the Logic Apps they created were pretty much stuck where they created them. Moving from one resource group to another was impossible at the time, and exporting the Logic App (and all the API Apps it depended on) was only a dream, so the only option was to redo all of your work in order to create the Logic App in another resource group or subscription.

Logic Apps and Azure App Service have come a long way since then and the QuickLearn staff has been working its collective noodle to come up with application lifecycle management guidance for Logic Apps using the tools that are available today, which will hopefully improve the way you go about deploying and managing your Logic Apps.

A Comforting ARM Around Your Shoulders

Some readers may already be aware of the Azure Resource Manager, or ARM for short. For those who haven’t previously met my little friend, I’ll give a short introduction to ARM and the tools that exist around it. ARM is the underlying technology that the Azure Portal uses for all its deployment and management tasks. For example, if you create any resource within a new Resource Group using the Portal, it’s really ARM behind the scenes orchestrating the provisioning process.

“Great Rob, but why do I care?”

I’ll tell you why. There are tools designed around ARM that make it not only possible, but downright easy, to run ARM commands. For example, you can get the Azure PowerShell module or the Azure Command Line Interface (CLI) and script your management tasks.

There’s a little more to it, though. You see, those Azure resources (Logic Apps, Resource Groups, Azure App Service plans, etc.) are complex objects. Resource Groups, for example, have dozens of configurable properties and serve as containers for other objects (e.g., Web Sites, API Apps, Logic Apps, etc.). Let’s not oversimplify reality; your cloud applications aren’t made up of a single resource, but instead are many resources that work in tandem. Therefore, any deployment or management strategy needs to bear that in mind. If you want to pull back the covers on your own resources, head over to the Azure Resource Explorer and you’ll see what I’m talking about.

“It’s nice to have a command that I can run in a console window to create a Resource Group, but I need more than that!”

You’re right. You do need more than that. The way you get more is using ARM Templates. ARM Templates provide a declarative way to define deployment of resources. The ARM Template itself is a JSON file that defines the structure and configuration of one or more Azure resources.

“So how do I get one of these templates?”

There are several ways that you can get your hands on the ARM Template that you want.

  • Build it by hand – The template is a JSON file so I guess if you understand the schema of the JSON well enough you could write an ARM Template using Notepad, Kate, or Visual Studio Code. This doesn’t seem very practical to me.
  • Use starter templates – The Azure SDK for Visual Studio includes an Azure Resource Group project type which includes empty templates for an array of Azure resources. These templates are actually retrieved from an online source and can be updated at any time to include the latest resources. This looks a lot more viable than using Notepad, but in the end you are still modifying a JSON file to define the resource that you want.
  • Export the template – You can export existing resources into a new ARM Template file. The process varies slightly from one type of resource to the next, but you essentially go to the resource in the Azure Portal and export the resource to an ARM Template file. Sadly, at the time this article is being written, this is not supported for Logic Apps, but Jeff Hollan has a custom PowerShell cmdlet that he built to export a Logic App to an ARM Template file.

One more thing — these templates are designed to utilize parameter files, so any aspect of the resource you’re deploying could be set at deploy-time via a parameter in a parameter file. For example, the pricing tier utilized by your App Service plan might be Free in your development environment and Standard in your test environment. The obvious approach is to create a different parameter file for each environment or configuration you want to use.
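To make that concrete, here’s a heavily trimmed sketch of what such a template might look like (the resource names and values are invented for illustration, not taken from a real project):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "servicePlanName": { "type": "string" },
    "servicePlanSku": { "type": "string", "defaultValue": "F1" }
  },
  "resources": [
    {
      "type": "Microsoft.Web/serverfarms",
      "apiVersion": "2015-08-01",
      "name": "[parameters('servicePlanName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "[parameters('servicePlanSku')]" }
    }
  ]
}
```

A matching parameter file for the development environment might then pin the plan to the Free tier (a second file could select a Standard-tier SKU for test):

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "servicePlanName": { "value": "bobblehead-plan-dev" },
    "servicePlanSku": { "value": "F1" }
  }
}
```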


“I see what you did there… So now what?”

Well, now you’ve got your template and a way to represent the differences in environments as your application flows through the release pipeline, and you have an easy and repeatable way to deploy your resources wherever and whenever you want. The only piece that’s missing is the tooling to perform the deployment.

As mentioned above, you could use the Azure PowerShell tools or Azure CLI to create scripts that you manually execute. Those Visual Studio ARM Template projects even include a pre-built PowerShell script that you could execute.

Personally, I love automation but I’ve never been a big fan of asking a person to manually run a random script and feed it some random files. I want something that’s more streamlined. I want something that is simultaneously:

  • Automated – The process once triggered should not require manual help or intervention
  • Controlled – The process should accommodate appropriate approvals along the way if needed
  • Consistent and Repeatable – The process should not vary with each execution; it should have predictable outcomes based on the same inputs
  • Transparent – The whole team should have visibility into the deployments that have taken place, and be able to identify which versions of the code live where, and why (i.e., I should have work item-level traceability)
  • Versioned – Changes within the process and/or the process inputs (i.e., Logic App code) should be documented and discoverable
  • Scalable – It should be just as easy to deploy 20 things as it is to deploy 1 thing.

What’s Team Build?

For the past few years my team has been using TFS / VSTS as our primary source control and project management tool. In that time we’ve become more reliant on the excellent build system (Team Foundation Build) that TFS offers.

Team Build is much more than a traditional local build using Visual Studio. Team Builds run on a build server (i.e., not on your local computer) and are defined using a Build Definition. The Build Definition is a declarative definition of both the process that the build server will execute, as well as the settings regarding how the build is triggered, and how it will execute. It’s essentially a workflow for preparing your application for deployment.

The Build Definition is made up of tasks. Each task performs a specific step required in the build process. For example, the Visual Studio Build task is used to compile .NET projects within Visual Studio solutions, and within the step you can control the Platform (Win32, x86, x64, etc.) and the Configuration (debug or release), while the Xamarin.Android task is used to compile Android applications with settings appropriate for them.

Build Definitions can have Tasks that do more than compile your code. You might include tasks to run scripts, copy files to the build server, execute tests (Load Tests, Web Performance Tests, Unit Tests, Coded UI tests etc.), or create installation packages (though this would generally just be done through another project in your solution [e.g., with Flexera InstallShield and/or the WiX Toolset]). This gives you the power to quickly and automatically execute the tasks that are appropriate for your application.

Furthermore, a single Team Project in TFS could have multiple build definitions associated with it; because sometimes you want the build to simply compile, but other times you want to burn down the village, compile, run tests, and then deploy your web site to Azure for manual testing. Or perhaps you’re managing builds for multiple feature branches or even multiple applications within the Team Project.

“So what does this have to do with Logic Apps?”

If I add one of those ARM Template Visual Studio projects to my TFS / VSTS source control repository (whether it’s a Git repository or TFVC), I can create a Build Definition that compiles the ARM Deployment Project and other Visual Studio projects that include resources used by my cloud application (e.g., custom API Apps, Web Sites, etc.), and then publishes the ARM Template files (templates and parameter files) to a shared location where they can be accessed by automated deployment processes.

This was surprisingly easy to set up; I think it only took about 5 minutes. The best part is that I can have this build trigger on check-in, so my deployment files are always up-to-date.

Here’s what my Build Definition looks like:

First I compile the project.


Then I copy the ARM Template files and parameter files from the build output directory to a temporary file location.


Finally, I publish the files from the temporary location. I’m using a Server location that other steps in the build (or a Release Manager release task) could use. It could have also been a file share to give access to processes not hosted in TFS.


“So what does all this add up to?”

Whenever someone changes the ARM Deployment project (whether modifying the template or parameters file or adding a new template/parameter file to it) Team Build runs my Build Definition to: (1) compile my project, (2) extract the ARM deployment files from the build directory, and (3) publish the files as an Artifact named templates. That Artifact lives on the build server and can be accessed by VSTS Release Management release tasks that will actually deploy my Azure resources to the cloud.

What’s Release Management?

Release Management (a component of TFS / VSTS) helps you automate the deployment and testing of your software in multiple environments. You can either fully automate the delivery of your software all the way to production, or set up semi-automated processes with approvals and on-demand deployments.

In Release Management, you create Release Definitions that are conceptually similar to build definitions. A Release Definition is a declarative definition of the deployment process. Just like a Build Definition, a Release Definition is composed of tasks and each task provides a deployment step. The primary input for a Release Definition is one or more Artifacts created by your Build(s).

Release Definitions add a couple extra layers of complexity. One of those layers is the Environment. We all know that release pipelines are made up of multiple environments, and often each environment will come with its own unique requirements and/or configuration details. Within a single release definition you can create as many environments as you want, and then configure the Tasks within a given environment as appropriate for that system. The various Environments in your Release Definition can have similar or different Tasks.


Each environment can also utilize variables if you’d prefer to avoid hard-coding things that are subject to change.


In this simple example, I created a Release Definition with two environments: Development and Test. Within each environment I used the Azure Resource Group Deployment task to deploy my Logic App, Service Plan, and Resource Group as defined in my ARM Deployment Template JSON file.


I configured the deployment to Development to happen automatically upon a successful build (remember, the build runs when I check in the source code). But I wanted Test deployments to be manual.


I also created variables that enabled me to parameterize the name of the Resource Group, and the name of the Parameter File to use in each environment.


You can see here how I’m using those variables within the Azure Resource Group Deployment task.


But Does It Work?

Of course it works.

Suppose I go to my Visual Studio project and modify something about my Logic App template. Maybe I finally get around to fixing that grammatical error in my response message.


Then I check in my changes.


In VSTS, I can see that my build automatically started.


After the build completes, in the Release Hub I can see that a new release (Release-4) using the latest build (13) has started deploying to the Development environment.


I’ve got logs to show me what happened during the deployment.


I can see the commits or changesets included in this release compared to earlier releases. So a month from now Nick can see what modifications were deployed in Release-4.


What’s going on in Azure though? It looks like the Logic App in the Development Resource Group was updated to match my changes.


But my Test environment wasn’t touched.


Over on the Release Hub, I can manually start the Deployment to Test.


I almost forgot: deploying to Test requires an approval as well.


Just like that, it’s done.


What Does It Mean?

In about 30 minutes I was able to create a deployment pipeline for my Logic App. The deployment pipeline is flexible enough that changes can be made easily, but structured in a way that I (and everyone else on my team) can see exactly what it does.

Give Me More

QuickLearn Training offers courses to enhance your understanding of TFS / VSTS and Logic Apps. Our Build and Release Management Using TFS 2015 course has all the finer details that you’ll never get out of a blog article, and our Cloud-Based Integration Using Azure App Service course teaches you how to build enterprise-ready integration solutions using features of the Azure cloud.

Integrate 2016 Talk In Text: API Apps 101 for BizTalk Server Developers

By Nick Hauenstein

In this post, I am going to try to capture in text the presentation that I gave at the Integrate 2016 conference over in London. Text is likely the worst medium in which to capture such a session, but, alas, I do realize that sometimes it is the best medium for proper digestion of such content. If you’d rather see it in video form, click here.

So with that, let’s pretend that you are sitting among fellow professionals in a beautiful room on the 3rd floor of ExCeL London – complete with bright colored lights to set the mood. A wild American then appears, flailing his arms and babbling about how it’s actually 3 AM, and we’ve all been deceived. Then he starts talking about food.

Getting Our Priorities Straight

The world of software development might be a better place if we approached our tasks in that world the way that we approach each meal. We don’t really start each meal with a trip to the racks or shelves that hold our appliances – thinking, “Well, I have a vegetable peeler and a fondue pot; I guess that means we’re eating melted Gruyère and Emmentaler mixed with white wine, and carrot strings, for every meal.”

[Slide: Utensils vs. Cravings + Ingredients]

Usually, the way it actually works out is that I’m thinking about what I’m craving, the ingredients that I have on hand and their flavor/nutritional value relative to my needs. From there, I look to proven recipes that satisfy those things, and finally, reach for the specific tools needed to do the job. If I don’t have them, I acquire them, or fashion a workable approximation.

We have to be really careful that, when approaching software development and integration, we take the same approach as we would when crafting an excellent meal: an approach that looks first to the needs and constraints, then to proven patterns/recipes, and allows the tools used to flow from the rest – even to the point of crafting/buying new tools that we haven’t used before if necessary.

[Slide: Priorities. Lunch values cravings and ingredients, then proven recipes, then tools; integration should consider business challenges and constraints, then proven patterns, and finally tools.]

Business Challenges / Constraints

So, let’s imagine that we all work together now. We want to take the approach outlined above – one in which we have to consider the business need and the constraints that we may very well simply be stuck with. From there, we can consider proven patterns that might help us overcome those challenges, and then finally identify/acquire/create the tools required to get the job done.

Our company makes custom bobbleheads.

[Slide: Imagine that we make custom bobbleheads. A Dan Rosanova bobblehead is pictured along with his wife, BizTalk Server, also a bobblehead. Next to them, a T-Rex bobblehead bobs his head in honor of Sandro Pereira’s stickers from the conference.]

The way that it works is that a customer uploads a 3D model of their face, and then selects a pre-built body from the gallery. The 3D print of their face starts immediately so that the order can be shipped as soon as possible. The customer is permitted to take as long as they need to select the body from the gallery of pre-built bodies. Once they select a body, we attach the printed face to the chosen body and ship the assembled dolls to the customer.

So what happens behind the scenes that we can’t escape?

[Slide: What happens behind the scenes (the diagram is described in the text below)]

Well, sadly we don’t do greenfield development here. It’s not really brownfield development either. It’s more like a house haunted by ghost IT – shadow IT that has left.

Whenever a customer uploads their 3D face model, an XML notification message is created that contains the order id and a reference to their face model. At the time it was built, our developers emulated the BizTalk Server demos of the day and built distributed, fault-tolerant XML file copy operations, whilst applying the wisdom of Chris Maden, who has been quoted as saying, “XML is like violence. If it doesn’t solve your problem, you’re not using enough of it.”

These same developers learned while attending conferences over the past few years that Dropbox is the next big thing in enterprise integration. They may have been wrong, but it’s now up to us to Make Dropbox Great Again™.

Once the customer selects a body for their doll, another XML message is created and dropped into the same folder in Dropbox with the same file naming conventions. Ultimately, we need to pair up the two components of the order – the head and body – in order to complete the processing.

Proven Patterns

It’s at this point that we would consult the great oracles of all wisdom and knowledge in the world of integration, Gregor Hohpe and Bobby Woolf. We will search through the patterns to find those that help solve each piece of the puzzle.

[Slide: Which patterns?]

Which patterns might we find? Well, to handle the communication with Dropbox, we might utilize the adapter pattern in hopes that one day the data will be sourced from a different system. We could apply the pipes and filters pattern and build a reusable translation/transformation pipeline made up of reusable independent steps organized in the proper order to provide the required translation/transformation/message enrichment for each interface.

From there, we could apply the publish and subscribe pattern to enable loosely coupled communication between the source system and any number of downstream subscribers – maybe routing the message in a content-based fashion. We could also layer on top of this a process manager to enable content-based correlation.

Tools

How would we use these patterns in concert with the tools we have/don’t have? BizTalk Server might seem like a natural fit.

[Slide: BizTalk Server architecture]

It already provides the concept of a port that begins with an adapter, which delivers a byte stream to a pipes-and-filters style pipeline responsible for translation of the message and promotion of context properties used for routing. This is followed by a transform before publishing to the Message Box. From there, we have process managers in the form of BizTalk orchestrations that understand the concept of content-based correlation of published messages, allowing them to be reunited by the messaging engine. You get adapters, pipelines, maps, and orchestration out of the box – and publish-subscribe whether you want it or not!

It’s already bringing to the table everything I need. Everything but a handy Dropbox adapter. Now, I know that we could always build our own, or use out of the box adapters with ungodly amounts of WCF extensions to make some magic happen, but maybe that won’t be our best bet here.

So, let’s set that aside for a moment, and consider what might become possible if we started using Logic Apps like this. It’s really the same question I posed before about MABS.

[Slide: What if we did Logic Apps like this?]

In this case, we’re marrying Logic Apps and Service Bus. We have some Logic Apps that act in a similar fashion as BizTalk Ports. They provide adaptation and message enrichment and transformation through the use of relevant API apps for those concerns. Others act more like BizTalk Server orchestrations, coordinating the sends and receives of messages and operating on the content.

The messages are routed to the “orchestration” style Logic Apps through Service Bus. Each flow is triggered by subscribing to messages that arrive on a given Service Bus topic subscription (pre-created). Correlation can then be enabled by subscriptions dynamically created mid-flow.

At this point, you may have the following thought (which I humbly share indeed):


Demo Walkthrough

This isn’t all just a pipe dream – it’s real. I’ve built it. So, let’s see how it can fit together. The flow kicks off with an XML message. For this message, I have created a BizTalk Server 2016 schema (i.e., a regular XML schema with special notations about properties that should be promoted to the message context for routing purposes). The message looks like this:

[Screenshot: the first message]

The message contains a promoted OrderId property that we should be able to correlate on. In other words, the second message that will show up in Dropbox for the order should also contain the same OrderId value – which allows us to determine that they are indeed related messages. The first message also contains a reference to the head for the bobblehead doll that we will be printing.
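The screenshot isn’t reproduced here, but a hypothetical reconstruction of such a message (element names invented for illustration; the real schema wasn’t published) might look like this:

```xml
<ns0:PrintJob xmlns:ns0="http://bobbleheads.example.com/schemas/printjob">
  <OrderId>31337</OrderId>
  <HeadModelUri>https://example.com/faces/31337-face.stl</HeadModelUri>
</ns0:PrintJob>
```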

When this message is uploaded to Dropbox, it will be picked up by our “Port” style Logic App that looks like this:

[Screenshot: the XmlIn-FILE Logic App]

The first API App after the Dropbox receive is a custom API app that essentially builds a context property bag when it is passed an XML payload. It does this by comparing the document to a BizTalk schema, and using the instructions in the schema to “promote” properties by extracting the relevant content. It takes two inputs to operate.

The first input is a URL to the root of an Azure Blob Storage container that contains BizTalk schemas. It will use these schemas to perform message type resolution and property promotion. The second input is a string containing the entirety of an XML message. Not exactly the screaming performance of a forward-only streaming pipeline component, but it gets the job done, considering we’re already taking on latency to get to the cloud in the first place.
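The internals of the API app weren’t shown in the session, but the core technique can be sketched in a few lines of C#. Assume, for illustration, that the XPath-to-property mappings have already been read out of the schemas’ annotations; the real app resolves them from the blob container at runtime:

```csharp
using System.Collections.Generic;
using System.Xml;
using System.Xml.XPath;

public static class PropertyPromoter
{
    // In the real API app these mappings come from the xs:annotation blocks
    // of the BizTalk schemas stored in the Azure Blob Storage container.
    private static readonly Dictionary<string, string> Promotions =
        new Dictionary<string, string>
        {
            { "OrderId", "/*[local-name()='PrintJob']/*[local-name()='OrderId']" }
        };

    // Builds a property bag from an XML payload, including the resolved
    // message type (namespace + '#' + root node name).
    public static IDictionary<string, string> Promote(string xmlPayload)
    {
        var document = new XmlDocument();
        document.LoadXml(xmlPayload);

        var propertyBag = new Dictionary<string, string>
        {
            ["MessageType"] = document.DocumentElement.NamespaceURI + "#" +
                              document.DocumentElement.LocalName
        };

        XPathNavigator navigator = document.CreateNavigator();
        foreach (var promotion in Promotions)
        {
            XPathNavigator node = navigator.SelectSingleNode(promotion.Value);
            if (node != null) propertyBag[promotion.Key] = node.Value;
        }

        return propertyBag;
    }
}
```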

The output of that API app looks like this:

[Screenshot: output of the ExtractPromotedProperties API App]

The next API app takes the payload, along with the property bag (which it treats as a set of brokered message properties) and publishes the message to an Azure Service Bus Topic. This is just the out of the box connector using the outputs of our custom API app. The call out to that app looks like this:

[Screenshot: inputs to the Service Bus publish action]

This published message is picked up by our Logic App that is acting like an Orchestration. That Logic App has a pre-defined subscription on the same service bus topic for any message with a Message Type of Print Job.

[Slide: the Print Process Logic App]

After the message is received, the Logic App must quickly set up a subscription for any related messages that come in for that order. Unfortunately, the out of the box connector for Service Bus doesn’t yet have a way to create a new subscription – only ways to subscribe for messages on an existing subscription.

Thus, we will have to use a custom API app to create a subscription unique to this running instance of the Logic App – one that is based on the OrderId property of the received message. To provide this capability, we have a custom API app called CreateInstanceSubscription.

It requires quite a few inputs to function since we don’t yet have the capability of reading details from a stored API connection in a custom API app.

[Screenshot: the CreateInstanceSubscription API App (custom)]

The API takes in a Correlation Property value, which contains the name of the property that is shared between the message that triggered the Logic App instance and the message that will be correlated with this running instance.

It also takes in the Message Type (in the namespace + # + root node name format) of the next expected message. Both of these properties will be used to create a new subscription on the service bus topic referenced by the last two configuration properties (Service Bus Topic, and Service Bus Connection String).
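The CreateInstanceSubscription internals weren’t shown either, but a rough C# sketch of what an app like this would likely do with those inputs (using the Service Bus NamespaceManager; all names here are illustrative) looks like the following. Note the matching delete helper that gets used later in the flow:

```csharp
using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

public static class InstanceSubscriptions
{
    // Creates a subscription that matches only the correlated follow-up
    // message for this running Logic App instance.
    public static string Create(string connectionString, string topicPath,
        string correlationProperty, string correlationValue, string messageType)
    {
        var namespaceManager =
            NamespaceManager.CreateFromConnectionString(connectionString);

        // A unique, per-instance subscription name.
        string subscriptionName = Guid.NewGuid().ToString("N");

        // Match on both the shared correlation property (e.g., OrderId) and
        // the message type of the next expected message.
        var filter = new SqlFilter(
            $"{correlationProperty} = '{correlationValue}' " +
            $"AND MessageType = '{messageType}'");

        namespaceManager.CreateSubscription(
            new SubscriptionDescription(topicPath, subscriptionName), filter);

        return subscriptionName;
    }

    // Called near the end of the flow so we don't leave orphaned
    // ("zombie") subscriptions behind.
    public static void Delete(string connectionString, string topicPath,
        string subscriptionName)
    {
        NamespaceManager.CreateFromConnectionString(connectionString)
            .DeleteSubscription(topicPath, subscriptionName);
    }
}
```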

After it executes, we might expect to see a subscription like the following:

[Slide: the subscription created by the CreateInstanceSubscription API App]

Now that we have the subscription created, we can take our time with the rest of the process until we absolutely require the second message. In this case, we’re calling another custom API which provides a visualization of the received messages. In order to read the content, we can either use the xpath() function of Logic Apps to read the XML directly, or we can convert it to JSON first using the json() function, and then simply dot into it. I decided to use the json() function since I hadn’t attempted to use it in a situation like this yet. It was okay (it was pretty darn verbose). xpath() would have been a better choice here – and the more natural choice given an XML payload.


This yields the following visualization (a body-less bobblehead awaiting its correlated message containing information about which body to use):

[Screenshot: the bobblehead visualizer]

And we would expect that there is both an instance subscription in service bus and a Logic App that is still actively processing – waiting for Service Bus to re-activate it with a new message.


At this point, the new message can arrive at any moment in time. It will land in the same Dropbox folder, and process through the same Logic App serving as a “Port” – with the same XML property promotion, and Service Bus publishing action. It will land in the same Service Bus topic as before, and with a matching order id to the originally submitted message.

[Slide: the second message is submitted]

This second message carries some new information, however. In this case, it contains the body that the customer selected for their custom bobblehead doll.

Once published to the topic, the instance subscription previously created by the second Logic App in the process will be matched by a listening Service Bus connector.

[Slide: the Service Bus connector subscribing to the second message]

The connector uses the topic name and instance subscription name passed to it from the Create Instance Subscription API App. The name of the subscription will be a randomly generated id for that running instance of the Logic App.

Now that we have the message, it’s time to ensure that we don’t bring the problem of zombies into the world of Logic Apps. There is a step that follows which will clean up the instance subscription for the Logic App before continuing with the final bits of the process.

[Slide: the Delete Instance Subscription API]

Again, since the OOTB Service Bus connector does not contain any operations for managing subscriptions, the custom API App is called to delete the instance subscription using the details returned from the original call to the API which created it.

After that, it’s back off to the bobblehead visualizer with the details from the correlated message received.

[Slide: last step]
[Slide: final result]

Call to Action

So that’s pretty cool. We can now stand with confidence and proclaim that content-based correlation is possible with Logic Apps! However, it was built out of necessity, and required custom crafted components – as is often the case with anything worth doing.

[Slide: We needed custom components]

You may be wondering why this talk was titled API Apps 101 for BizTalk Developers. I didn’t really tell you how to create API apps. Instead, I showed that API apps behave in a fashion similar to different components within BizTalk Server (adapters, pipeline components, orchestration shapes, etc…). I don’t want to leave you hanging though, because we are at a point in time where there is a golden opportunity to make your mark in the foundations of this new world.

This is the ground floor of Logic Apps and API Apps. As BizTalk Server developers, we know the required ingredients of enterprise integrations. We know the recipes for success. It’s just a matter of crafting some additional tools for use in the world of Logic Apps, and for the first time we have a unified marketplace to share and even sell these components.

From working on BizTalk Server integrations, we know that we will need custom API apps that can serve as adapters, pipeline components, and pattern utility apps (e.g., content-based correlators). In fact, you may have built such things before. It’s honestly not that difficult to port those things over into this new world of integration (where it makes sense) and reap the rewards. If you need inspiration, check out the listings of such components that have already been created for BizTalk Server. Each component represents a solution to a specific integration challenge – many of which are timeless challenges.

[Slide: What now?]

We write BizTalk components and API Apps in the same languages, though with different techniques, and targeting a different runtime.

How do we make that all happen? Well, today we are providing the world with “the goods”. All of the slides from this talk, a sample module from the February 2016 version of our Cloud-based Integration Using Azure App Service course, and all of the code involved in the demo. With those combined resources, you should be set on the right track to start building custom API Apps for use in Logic Apps – leveraging skills and work you’ve already accomplished.

If you’re ready to get started, click the image below to download the resources:

[Image link: download the resources]

Until Next Time

That’s all for now! Again, go forth and create API apps and come visit us in our Cloud-based Integration Using Azure App Service course if you’d like to learn more.

I’ll let Simon Young have the final word – and dining tip!

API Apps 101 for BizTalk Developers at Integrate 2016

By Nick Hauenstein

I’m happy to announce that I’ve been asked to speak at the BizTalk 360 Integrate 2016 conference, May 11th-13th in London. When brainstorming session topics, I had lots of ideas of fun things that could be accomplished with Logic Apps and API Apps, but decided to take things in a slightly different direction.

Over the last few years, I’ve seen BizTalk Server developers who are skilled as general .NET developers, but who may not have the time or energy to keep up-to-date with the evolution of Logic Apps and API Apps – and all of the things that come with them: Web API, Node.js, Swagger, etc. That is perfectly understandable, because as a developer building enterprise integrations exchanging X12 or EDIFACT data using BizTalk Server, you might not have needed to interact with JSON serialization in the past.

This Will Make Your Dreams Come True

This last year, my team and I have been working hard to stay on top of changes to Microsoft’s cloud-based integration technologies. Time and again, we’ve fallen (and seen others fall as well) into the trap of looking at a tool and then trying to figure out how it can solve all of our problems – even to the point of searching out problems (imagined or otherwise) that it could tackle. It happens anytime there’s a sufficiently impressive tool, or a sufficiently impressive salesperson (or both). Go ahead, click the link, I’ll wait. But if you do click that link, you will end up trying to figure out how you too can buy a 10-pack of vegetable peelers. Then you would be trying to figure out how to incorporate zucchini strands into dessert.

Getting Back on Track

Whenever that happens we’ve been able to course correct by forgetting about the tool and focusing on the problem we want to solve, or even better, the ideal solution to the problem. When describing the solution using Enterprise Integration Patterns as our vocabulary, we can quickly model an answer that works without assigning a particular tool or technology. Maybe a given solution needs a content enricher, or a resequencer, or guaranteed delivery, or whatever – not necessarily a specific technology applied.

As a BizTalk Developer, I know how to implement these patterns using BizTalk; the real question that we keep running into is: can we implement these patterns using API Apps and/or Logic Apps – and better yet, should we?

Can != Should?

My session is designed to help BizTalk Developers learn how to answer those questions for themselves. Specifically, it identifies the capabilities of the new additions to our toolset, and then shows how to use them without assuming in-depth knowledge of the underpinnings. In the talk, you will see an API App that creates promoted properties from an XML document (just like BizTalk Server) so that we can then potentially reach out to other Azure capabilities and implement publish-subscribe in concert with content-based routing and correlation.

Azure App Service Logic Apps Refresh

By Nick Hauenstein

Much has happened in the world of Logic Apps and API Apps since the original announcement back in December of 2014. We have seen the continued development of SaaS connectivity within the product, along with the overall expansion of integration capabilities. We have also seen the team responding to customer feedback actively while maintaining transparency in the process, and even providing a roadmap to give insight into what is coming and when we can expect to see the sweet moment that is GA.

Sometimes, customer feedback causes fairly large shifts in the underlying product. Such is the case seen in the latest updates for the product in the form of a completely overhauled designer, new feature support for triggering flows (i.e., any action can be a trigger), and an API deployment model that is more consistent with the rest of App Service and does not require a dedicated gateway.

New Designer

One of the most obvious changes that will stick out immediately as you go out to create a Logic App is the new designer that moved over into App Service from Power Apps.


The new designer supports editing workflows built using the updated workflow language (schema version: 2015-08-01-preview), and sports a vertical layout, rather than a horizontal one, and conditions that appear to wrap around actions instead of being embedded inside actions (though the code view demonstrates that the underlying behavior is similar).


You might also notice that the experience of adding actions is much quicker, as this act no longer provisions a new instance of an API App within your own subscription. Instead, in an interesting reversal, Microsoft hosts managed instances of out-of-the-box API Apps. The result is that configuration information is sent as part of the request, instead of stored inside the API App container, and you will have far more simplified ARM deployment templates, given that your deployment will no longer need to take into account each API App used by your Logic Apps.

So how do my own custom API Apps end up in the list? Well, you can apply to have them registered in the Azure Marketplace, or you can use the Http + Swagger action in order to point to a custom API App that already exists. Of course that brings us to the question of what it looks like to actually build a custom API App in this refresh of the preview.

New API App Development Model

In the preview refresh, the process to develop and consume a custom API using the designer is quite a bit different. You still have the ability to use swagger extensions for a clean designer experience – but there are new extensions intended to take advantage of new designer capabilities. These capabilities include things like dynamic schemas for parameters / return types of API (imagine a different object shape depending on the type of entity within a CRM system, or a table within a database), and dynamic values for enumerations.
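To give a flavor of the metadata involved (a hand-written sketch, not an excerpt from a real connector), a parameter can point the designer at another operation in the same API for its list of legal values via the x-ms-dynamic-values extension:

```json
{
  "name": "entityType",
  "in": "query",
  "required": true,
  "type": "string",
  "x-ms-dynamic-values": {
    "operationId": "GetEntityTypes",
    "value-collection": "value",
    "value-path": "id",
    "value-title": "displayName"
  }
}
```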

The biggest change here, though, is that we no longer have a gateway managing authentication, internal storage, or configuration for our APIs; we get to manage that ourselves. As a side effect, we’re no longer constrained by where our APIs live – all APIs get the same first-class experience.

I would definitely recommend taking some time to read each link within this article before starting out on building a new API. I’m working on building out updates to T-Rex to help with the metadata – while also providing a few example APIs to take advantage of all of the new capabilities – but if you want a head start, the knowledge is out there!

New Triggering Capabilities

What other changes are under the hood that you should know about? Well, you may have noticed the announcement of the availability of webhooks for Logic Apps for one, or even seen the x-ms-trigger extension called out in the article linked above. The end result of this is that any action within a Logic App can have a polling trigger style behavior, or even an async push style behavior, and the Logic App itself can be triggered manually at an endpoint that isn’t tied to a specific Azure subscription.
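For instance, an operation in your API’s swagger can be flagged as a polling trigger roughly like this (a sketch based on my reading of the documentation; by convention a 200 response carries the new data, while a 202 tells the engine there’s nothing yet and to poll again later):

```json
{
  "/api/messages/poll": {
    "get": {
      "operationId": "PollForMessage",
      "x-ms-trigger": "single",
      "responses": {
        "200": { "description": "A new message is available; fire the Logic App" },
        "202": { "description": "No new messages yet; poll again later" }
      }
    }
  }
}
```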

We can see some of these changes in action as we look at actions like the Send approval email action from the Office 365 connector/API. The action sends an email, and then notifies the Logic App of the response when it becomes available – without polling.


It even includes the shape of the notification as part of the swagger metadata that is exposed, so that the designer can support using the shape of that async output in later steps. The result is that as a developer, I can use the action to build what looks like a synchronous flow without the complexity of an async flow, and yet I’m benefitting from the performance characteristics of the async implementation (i.e., immediate notification when the event happens rather than polling at a fixed or variable interval).

What Are We Doing About It?

Reading about all of this might have you wondering what QuickLearn Training is doing about it, and what you should do about it.

Well, I (Nick Hauenstein) am hard at work on an update to QuickLearn’s T-Rex metadata library that takes into account the new way to build API Apps. I’m on target to wrap up the core code by end of week, and hopefully have some decent sample apps out there shortly thereafter.

We’re all busy learning everything we can about the new functionality so that we can rapidly integrate those changes into our Cloud-Based Integration Using Azure App Service course.

Meanwhile, keep an eye out for announcements from the BizTalk 360 folks about Integrate 2016 Europe. You might be able to meet up with me (Nick Hauenstein), Rob Callaway, or John Callaway to talk about BizTalk Server, or any of the things in this post. Also watch for the next release of TRex on NuGet, which will include support for all of the new goodies we have available in Logic Apps.

In the meantime, take care, and have fun building great things!

BizTalk Server’s Road Ahead for the Next Year

By Nick Hauenstein

I’m finally settling back into the swing of things as we kick off the year 2016! It has been quite a relaxing break, spending Christmas and New Year’s with my family out in the woods of Snohomish, WA. Since getting back to the office, I’ve been catching up on quite the backlog of emails. Among them was an email that called out a file that was uploaded to the Microsoft download site at the end of last month – the long awaited BizTalk Server Roadmap for 2016, or should I say the Microsoft Integration Roadmap (more on that below).

Continued Commitment to BizTalk Server

The document opens up with a bullet-pointed summary of the core takeaways (I, for one, appreciate that it leads with the TL;DR):

  • Continuing commitment to BizTalk Server, with our 10th release of BizTalk Server in Q4 2016.
  • Expansion of our iPaaS vision to provide a comprehensive and compelling integration offering spanning both traditional and modern integration requirements. Preview refresh in January 2016 and General Availability (GA) in April 2016.
  • Deliver our iPaaS offering on premises through Logic Apps on Azure Stack in preview around Q3 2016 and GA around end of the year.
  • Strong roadmap and significant investments to ensure we continue to be recognized as a market leader in integration.
  • The next release of Host Integration Server is planned on the same timeline as BizTalk Server below.

BizTalk Server 2016 Roadmap

That’s right; 2016 is the year where we start to see Microsoft’s integration investments in the cloud start to pay dividends on-premises – with two complementary offerings that each offer their own approach to solving integration challenges, while still ensuring that you can build mission critical BizTalk Server integrations on the latest Microsoft platform. Though Microsoft is expanding the integration toolbox beyond just BizTalk Server, the focus is still firmly on integration, and the tools are built on proven platforms with a proven infrastructure.

BizTalk Server 2016 New Features

So what can we expect in BizTalk Server 2016?

  • Platform alignment – SQL 2016, Windows Server 2016, Office 2016 and latest release of Visual Studio.
  • BizTalk support for SQL 2016 AlwaysOn Availability Groups both on-premises and in Azure IaaS to provide high availability (HA).
  • HA production workloads supported in Azure IaaS.
  • Tighter integration between BizTalk Server and API connectors to enable BizTalk Server to consume our cloud connectors such as SalesForce.Com and O365 more easily.
  • Numerous enhancements including
    • Improved SFTP adapter,
    • Improved WCF NetTcpRelay adapter with SAS support
    • WCF-SAP adapter based on NCo (.NET library)
    • SHA2 support
  • Host Integration Server “2016”
    • New and improved BizTalk adapters for Informix, MQ & DB2
    • Improvements to PowerShell integration, and installation and configuration

I don’t know about you, but I’m fairly excited to see this listing. With the death of SHA1 certificates this year, it’s good to see SHA2 support finally coming to BizTalk Server 2016; for that alone, if for nothing else, a BizTalk Server 2016 upgrade is going to be a must.

Also, notice the tighter integration between BizTalk Server and API connectors. That’s fantastic! One thing that Logic Apps do really well is provide friendly connectivity to SaaS endpoints. One thing they don’t do as well is content-based correlation and long-running transactions. One thing that BizTalk Server doesn’t do too well is provide friendly connectivity to SaaS endpoints (there is generic REST connectivity, but you’re going to wish that you had built/bought/downloaded an adapter once you start going down that road). One thing that BizTalk Server does really well is content-based correlation and long-running transactions. Here we’re seeing the best of Azure App Service Logic Apps meeting the best of BizTalk Server. That should make anyone happy.

An Integration Taxonomy

One interesting thing found in the roadmap is a brief discussion of an integration taxonomy that makes a distinction between “Modern Integration” – which is usually SaaS and web-centric, based in the cloud, and within the realm of web and mobile developers – and “Enterprise Integration” – which includes support for industry standards (e.g., X12 and EDIFACT), targets mission critical workloads, and caters more towards enterprise integration specialists.

In a way, this sets the context for the two core integration offerings of BizTalk Server and Logic Apps – defining the persona that might gravitate towards each. However, Logic Apps will offer an Enterprise Integration Pack for the pro developer that wants the power of BizTalk Server with the elasticity of a PaaS offering.

Where Is This Going?

Well, you might be reading this because you’re passionate about Logic Apps; you might be reading this because you’ve been working with BizTalk Server since the year 2000. Either way, you’re in the business of doing integration. Microsoft isn’t interested in building up cliques of developers, but instead in catering to all while providing an easy-to-use, location-agnostic (cloud/on-prem), rock-solid, highly scalable platform for mission critical integration.

The focus is on evolving capabilities; it doesn’t matter what brand name is slapped on the side of it (whether it’s Logic Apps, Power Apps, or BizTalk Server). Microsoft is committed to making the world of enterprise integration a better place!

A Brief History of Cloud-Based Integration in Microsoft Azure

By Rob Callaway

Mission Briefing

In conversations with students and other integration specialists, I’m discovering more and more how confused some people are about the evolution of cloud-based integration technologies. I suspect that cloud-based integration is going to be big business in the coming years, but this confusion will be an impediment to us all.

To address this, I want to write a less technical, very casual blog post explaining where we are today (November of 2015) and generally how we got here. I’ll try to refrain from passing judgement on the technologies that came before, and I’ll avoid theorizing on what may come in the future. I simply want to give a timeline that anyone can use to understand this evolution, along with a high-level description of each technology.

I’ll only speak to Microsoft technologies because that’s where my expertise lies, but it’s worth acknowledging that there are alternatives in the marketplace.

If you’d like a more technical write-up of these technologies and how to use them, Richard Seroter has a good article on his blog that can be found here.

On the First Day, Microsoft Created Azure

Way, way back in October of 2008 Microsoft unveiled Windows Azure (although it wouldn’t be until February of 2010 that Azure went “live”). On that first day, Azure wasn’t nearly the monster it has become.

It provided a service platform for .NET services, SQL Services, and Live Services. Many people were still very skeptical about “the cloud” (if they even knew what that meant). As an industry we were entering a brave new world with many possibilities.

From an integration perspective, Windows Azure .NET Services offered Service Bus as a secure, standards-based messaging infrastructure.

What’s the Deal with Service Bus?

Over the years, Service Bus has been rebranded several times but the core concepts have stayed the same: reduce the barriers for building composite applications, even when their components have to communicate across organizational boundaries. Initially, Service Bus offered Topics/Subscriptions and Queues as a means for systems and services to exchange data reliably through the cloud.

Service Bus Queues are just like any other queueing technology. We have a queue to which any number of clients can post messages. These messages can be received from the queue later by some process. Transactional delivery, message expiry, and ordered delivery are all built-in features.

Sample Service Bus queue
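
If you’re curious what that looks like in code, here’s a minimal sketch using the .NET client of the era (the WindowsAzure.ServiceBus NuGet package); the connection string and queue name are placeholders you’d swap for your own:

    using System;
    using Microsoft.ServiceBus.Messaging;

    class QueueSample
    {
        static void Main()
        {
            // Placeholder: grab the real connection string from the Azure portal
            string connectionString = "Endpoint=sb://<namespace>.servicebus.windows.net/;<credentials>";

            // Any number of clients can post messages to the queue...
            var client = QueueClient.CreateFromConnectionString(connectionString, "orders");
            client.Send(new BrokeredMessage("Hello from System A"));

            // ...and some process can receive them later
            BrokeredMessage message = client.Receive();
            Console.WriteLine(message.GetBody<string>());
            message.Complete(); // removes the message from the queue
        }
    }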

I like to call Topics/Subscriptions “smart queues.” We have concepts similar to queues with the addition of message routing logic. That is, within a Topic I can define one or more Subscription(s). Each Subscription is used to identify messages that meet certain conditions and “grab” them. Clients don’t pick up messages from the Topic, but rather from a Subscription within the Topic. A single message can be routed to multiple Subscriptions once published to the Topic.

Sample Service Bus Topic and Subscriptions

If you have a BizTalk Server background, you can essentially think of each Service Bus Topic as a MessageBox database.
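
To make that analogy concrete, here’s a similarly hedged sketch (placeholder names again) that creates a Topic with a filtered Subscription and publishes a message that routes to it:

    using System;
    using Microsoft.ServiceBus;
    using Microsoft.ServiceBus.Messaging;

    class TopicSample
    {
        static void Main()
        {
            string connectionString = "Endpoint=sb://<namespace>.servicebus.windows.net/;<credentials>";

            // A Subscription only "grabs" messages matching its filter --
            // conceptually a lot like a subscription in the MessageBox
            var ns = NamespaceManager.CreateFromConnectionString(connectionString);
            if (!ns.TopicExists("orders")) ns.CreateTopic("orders");
            if (!ns.SubscriptionExists("orders", "bigorders"))
                ns.CreateSubscription("orders", "bigorders", new SqlFilter("Amount > 1000"));

            // Publish to the Topic; routing is driven by message properties
            var topicClient = TopicClient.CreateFromConnectionString(connectionString, "orders");
            var message = new BrokeredMessage("order payload");
            message.Properties["Amount"] = 5000;
            topicClient.Send(message);

            // Clients receive from the Subscription, not from the Topic itself
            var subClient = SubscriptionClient.CreateFromConnectionString(connectionString, "orders", "bigorders");
            Console.WriteLine(subClient.Receive().GetBody<string>());
        }
    }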

Interacting with Service Bus is easy to do across a variety of clients using the .NET or REST APIs. With the ability to connect on-premises applications to cloud-based systems and services, or even connect cloud services to each other, Service Bus offered the first real “integration” features to Azure.

Since its release, Service Bus has grown to include other messaging features such as Relays, Event Hubs, and Notification Hubs, but at its heart it has remained the same and continues to provide a rock-solid foundation for exchanging messages between systems in a reliable and programmable way. In June of 2015, Service Bus processed over 1 trillion (1,000,000,000,000) messages! (Starts at 1:20)

What About VETRO?

As integration specialists we know that integration problems are more complex than simply grabbing some data from System A and dumping it in System B.

Message transport is important but it’s not the full story. For us, and the integration applications we build, VETRO (Validate, Enrich, Transform, Route, and Operate) is a way of life. I want to validate my input data. I may need to enrich the data with alternate values or contextual information. I’ll most likely need to transform the data from one format or schema to another. Identifying and routing the message to the correct destination is certainly a requirement. And ultimately I need to operate on the data to do some real work with it. Any integration solution that fails to deliver all of these capabilities probably won’t interest me much.

VETRO Diagram

So, in a world where Service Bus is the only integration tool available to me, do I have VETRO? Not really.

I have a powerful, scalable, reliable, messaging infrastructure that I can use to transport messages, but I cannot transform that data, nor can I manipulate that data in a meaningful way, so I need something more.

I need something that works in conjunction with this messaging engine.

You Got Your BizTalk in My Cloud!

Microsoft’s first attempt at providing a more traditional integration platform that provided VETRO-esque capabilities was Microsoft Azure BizTalk Services (MABS) (to confuse things further, this was originally branded as Windows Azure BizTalk Services, or WABS). You’ll notice that Azure itself has changed its name from Windows Azure to Microsoft Azure, but I digress.

MABS was announced publicly at TechEd 2013.

Despite the name, Microsoft Azure BizTalk Services DOES NOT have a common code-base with Microsoft BizTalk Server (on second thought, perhaps the EDI pieces share some code with BizTalk Server, but that’s about all). In the MABS world we could create itineraries. These itineraries contained connections to source and destination systems (on-premises & cloud) and bridges. Bridges were processing pipelines made up of stages. Each stage could be configured to provide a particular type of VETRO function. For example, the Enrich stage could be used to add properties to the context of the message travelling through the bridge/itinerary.

Stages of a MABS Bridge

Complex integration solutions could be built by chaining multiple bridges together using a single itinerary.

MABS message flow

MABS was our first real shot at building full integration solutions in the cloud, and it was pretty good, but Microsoft wasn’t fully satisfied, and the industry was changing its approach to service-based architectures. Now we want Microservices (more on that in the next section).

The MABS architecture had some shortcomings of its own. For example, there was little to no ability to incorporate custom components into the bridges, and the selection of connectors to source and destination systems was limited.

Give Me Those Sweet, Sweet Microservices

Over the past couple of years the trending design architecture has been Microservices. For those of you who aren’t already familiar with it, or don’t want to read pages of theory, it boils down to this:

“Architect the application by applying the Scale Cube (specifically y-axis scaling) and functionally decompose the application into a set of collaborating services. Each service implements a set of narrowly related functions. For example, an application might consist of services such as the order management service, the customer management service etc.

Services communicate using either synchronous protocols such as HTTP/REST or asynchronous protocols such as AMQP.

Services are developed and deployed independently of one another.

Each service has its own database in order to be decoupled from other services. When necessary, consistency between databases is maintained using either database replication mechanisms or application-level events.”

So the shot-callers at Microsoft see this growing trend and want to ensure that the Azure platform is suited to enable this type of application design. At the same time, MABS has been in the wild for just over a year and the team needs to address the issues that exist there. MABS itineraries are deployed as one big chunk of code, and that does not align well with the Microservices way of doing things. Therefore, they need something new but familiar!

App Service, and API Apps, and Logic Apps, Oh My!

Azure App Service is a cloud platform for building powerful web and mobile apps that connect to data anywhere, in the cloud or on-premises. Under the App Service umbrella we have Web Apps, Mobile Apps, API Apps, and Logic Apps.

Azure App Service

I don’t want to get into Web and Mobile Apps. I want to get into API Apps and Logic Apps.

API Apps and Logic Apps were publicly unveiled in March of 2015, and are currently still in preview.

API Apps provide capabilities for developing, deploying, publishing, consuming, and managing RESTful web APIs. The simple, less sales-pitch sounding version of that is that I can put RESTful services in the Azure cloud so I can easily use them in other Azure App Service-hosted things, or call the API (you know, since it’s an HTTP service) from anywhere else. Not only is the service hosted in Azure and infinitely scalable, but Azure App Service also provides security and client consumption features.

So, API Apps are HTTP / RESTful services running in the cloud. These API Apps are intended to enable a Microservices architecture. Microsoft offers a bunch of API Apps in Azure App Service already and I have the ability to create my own if I want. Furthermore, to address the integration needs that exist in our application designs, there is a special set of BizTalk API Apps that provide MABS/BizTalk Server style functionality (i.e., VETRO).
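
Under the covers, an API App is really just an ASP.NET Web API project with some extra metadata, so a bare-bones one is little more than a controller. Here’s a minimal sketch (the controller name and route are hypothetical):

    using System.Web.Http;

    // A bare-bones API App: just a Web API controller. Azure App Service
    // layers hosting, security, scaling, and Swagger metadata on top.
    public class GreetingController : ApiController
    {
        // GET api/greeting?name=World
        [HttpGet]
        public IHttpActionResult Get(string name)
        {
            return Ok(string.Format("Hello, {0}!", name));
        }
    }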

What are API Apps?

This is all pretty cool, but I want more. That’s where Logic Apps come in.

Logic Apps are cloud-hosted workflows made up of API Apps. I can use Logic Apps to design workflows that start from a trigger and then execute a series of steps, each invoking an API App whilst the Logic App run-time deals with pesky things like authentication, checkpoints, and durable execution. Plus it has a cool rocket ship logo.

What are Logic Apps?

Putting the Pieces Together

What does all this mean? How can I use these Azure technologies together to build awesome things today?

Service Bus review

Service Bus provides an awesome way to get messages from one place to another using either Queues or Topics/Subscriptions.

API Apps are cloud-hosted services that do work for me – for example, hitting a SaaS provider or talking to an on-premises system (we call these connectors), transforming data, changing an XML payload to JSON, etc.

Logic Apps are workflows composed of multiple API Apps. So I can create a composite process from a series of Microservices.

Logic App review

But if I were building an entire integration solution, breaking the process across multiple Logic Apps might make great sense. In that case, I can use Service Bus to connect those workflows to each other in a loosely-coupled way.

Logic Apps and Service Bus working together

And as my integration solution becomes more sophisticated, perhaps I have need for more Logic Apps to manage each “step” in the process. I further use the power of Topics to control the workflow to which a message is delivered.

More Logic Apps and Service Bus Topics provide a sophisticated integration solution

In the purest of integration terms, each Logic App serves as its own VETRO (or subset of VETRO features) component. Decomposing a process into several different Logic Apps and then connecting them to each other using Service Bus gives us the ability to create durable, long-running composite processes that remain loosely-coupled.

Doing VETRO using Service Bus and Logic Apps

Summary

Today Microsoft Azure offers the most complete story to date for cloud-based integration, and it’s a story that is only getting better and better. The Azure App Service team and the BizTalk Server team are working together to deliver amazing integration technologies. As an integration specialist, you may have been able to ignore the cloud for the past few years, but in the coming years you won’t be able to get away with it.

We’ve all endeavored to eliminate those nasty data islands. We’ve worked to tear down the walls dividing our systems. Today, a new generation of technologies is emerging to solve the problems of the future. We need people like you, the seasoned integration professional, to help direct the technology, and lead the developers using it.

If any of this has gotten you at all excited to dig in and start building great things, you might want to check out QuickLearn Training’s 5-day instructor-led course detailing how to create complete integration solutions using the technologies discussed in this article. Please come join us in class so we can work together to build magical things.

Integration Monday Recap and Push-Button Push Trigger Introduction

By Nick Hauenstein

This blog post serves as a quick recap of and expansion on my October 19th Integration Monday talk titled Building Push Triggers for Logic Apps. You can view the session and look through the slides over at integrationusergroup.com.

Building Push Triggers for Logic Apps

In the talk, I explored the bare minimum requirements for building push triggers, expanding on my AzureCon 2015 talk about a specific push trigger for dealing with NFC tag reads. I showed how you could use the QuickLearn Push Trigger Tools and QuickLearn Push Trigger Client Tools to implement a simple interface for storing callbacks, and build a re-usable set of callback storage mechanisms.

I also introduced the Push-Button Push Trigger: a push trigger that responds to a button press on a Windows 10 IoT device (in this case a Raspberry Pi 2), relying on Azure Storage for callback storage. In the remainder of this post, I’m going to show you how to get your own Push-Button Push Trigger up and running.

Where Do I Get a Push-Button Push Trigger?

At the moment, there are two ways that you can get one. You can come out to QuickLearn’s 5-day Cloud-Based Integration using Azure App Service course (or attend remotely), or you can build one for yourself!

Push-Button Push Trigger for Logic Apps

Even if you’ve never worked with anything like this before — don’t panic. You can’t really get something simpler than this.

Essentially you’ll need a Windows 10 IoT device (Raspberry Pi 2, DragonBoard 410C, MinnowBoard Max, etc…), a momentary switch (button), some wiring to connect the button to a GPIO port and ground, and optionally a breadboard for even more fun later. I went with a Raspberry Pi 2 for mine, but if I had it to do over again, I would have chosen a DragonBoard 410C given its built-in Wi-Fi capabilities, which don’t require an additional accessory or external module.

To get started with developing, you will need some software on your own machine (Visual Studio, Windows 10 IoT Project Templates, etc…) and you will need to get Windows 10 IoT Core onto your device. Microsoft has provided a pretty decent write-up of that part of the procedure over here.

Assembling Your Push-Button Push Trigger

You may or may not have a case to go along with your Push-Button Push Trigger. I ended up buying the cheapest case I could find on the internet. This was likely a poor choice as it quickly disintegrated and pieces chipped off. Since then, I’ve had a lot better luck with this one. Of course, you could always print/build your own custom enclosure as well.

First things first: make sure your device has Windows 10 IoT Core by inserting a prepared MicroSD card into the device.

Next, place your device in its case (if applicable). Mine has a mini-breadboard mounted on top for easier portability of the device.

Raspberry Pi 2 in Enclosure

Now, on to the wiring. We’re going to run one wire to GPIO pin 4 (chosen randomly) and another to ground. Eventually we’re going to put a momentary switch in between so that we can quickly toggle that connection between high and low.

Wiring up to GPIO 4 and Ground

Let’s get that switch up on the breadboard (and make sure we put on a nice and colorful cover).

Switch on the breadboard

You can see in the image above how the posts are reaching down into the board. To connect wires to those pins, you will simply plug in one of the jumper wires to the same row as one of the pins.

Wiring up the switch

One connection down, one to go.

All wired up

At this point, everything is wired up, and it’s time to get power and internet to the device.

Completed Push-Button Push Trigger

How Do I Make The Sample Code Work?

First of all, you can find the sample code over at https://github.com/nihaue/PushButtonPushTrigger. You can either download it as a ZIP file if you’re not comfortable with Git, or if you are comfortable with Git, you can clone it directly from here: https://github.com/nihaue/PushButtonPushTrigger.git

Once you have the sample downloaded, you should immediately Build the code to restore NuGet packages, and make all of the references happy. Next, you should take some time to look through the CallbacksController class for the Push Trigger (the part actually hosted in Azure with which the Logic App registers its interest in certain data), and the StartupTask class for the Universal Windows App (the part that actually looks for and handles the button press):

Interesting Classes within the Push-Button Push Trigger Solution
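
To give you a rough feel for the registration side before you dig in, here’s a heavily trimmed sketch. The real sample leans on the QuickLearn Push Trigger Tools, so treat these names and signatures as illustrative only (the AzureStorageConfig class referenced here is the one you’ll configure shortly):

    using System.Threading.Tasks;
    using System.Web.Http;
    using Microsoft.WindowsAzure.Storage.Table;

    // Each row stores one Logic App's callback URI, keyed by trigger ID
    public class CallbackEntity : TableEntity
    {
        public CallbackEntity() { }
        public CallbackEntity(string triggerId, string callbackUri)
        {
            PartitionKey = "callbacks";
            RowKey = triggerId;
            CallbackUri = callbackUri;
        }
        public string CallbackUri { get; set; }
    }

    public class CallbacksController : ApiController
    {
        // A Logic App registers its interest by PUT-ing its callback URI here
        [HttpPut]
        public async Task<IHttpActionResult> Put(string triggerId, string callbackUri)
        {
            CloudTable table = AzureStorageConfig.StorageAccount
                .CreateCloudTableClient()
                .GetTableReference("callbacks");
            await table.CreateIfNotExistsAsync();
            await table.ExecuteAsync(
                TableOperation.InsertOrReplace(new CallbackEntity(triggerId, callbackUri)));
            return Ok();
        }
    }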

After you have a decent understanding of what’s going on, you’ll realize that the CallbacksController is storing callbacks from interested Logic Apps in Azure Table Storage, and the StartupTask (think background service on the device) is reading callbacks from Azure Table Storage when the button is pressed (moving this code to initialization and caching/polling for updates would be a better choice – and something you’re free to implement). So in order to get this thing working, you’re going to need an Azure Storage account.
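
On the device side, the heart of the StartupTask looks something like the following sketch, assuming the GPIO pin 4 wiring from earlier; InvokeCallbacksAsync is a hypothetical stand-in for the sample’s table-reading and HTTP callback logic:

    using System;
    using Windows.ApplicationModel.Background;
    using Windows.Devices.Gpio;

    public sealed class StartupTask : IBackgroundTask
    {
        private BackgroundTaskDeferral deferral;
        private GpioPin buttonPin;

        public void Run(IBackgroundTaskInstance taskInstance)
        {
            // Take a deferral so the background task keeps running indefinitely
            deferral = taskInstance.GetDeferral();

            // Watch the pin the button is wired to (pin 4 in this build)
            buttonPin = GpioController.GetDefault().OpenPin(4);
            buttonPin.SetDriveMode(GpioPinDriveMode.InputPullUp); // pressed = low
            buttonPin.DebounceTimeout = TimeSpan.FromMilliseconds(50);
            buttonPin.ValueChanged += OnPinValueChanged;
        }

        private void OnPinValueChanged(GpioPin sender, GpioPinValueChangedEventArgs args)
        {
            // With the pull-up, a button press shows up as a falling edge
            if (args.Edge == GpioPinEdge.FallingEdge)
            {
                // Hypothetical stand-in: read the registered callback URIs from
                // Azure Table Storage and POST to each one to fire the Logic Apps
                InvokeCallbacksAsync();
            }
        }

        private void InvokeCallbacksAsync()
        {
            // See the sample code for the Table Storage + HTTP POST details
        }
    }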

If you don’t already have an Azure Storage account, head over to the Azure Portal and create one.

Creating an Azure Storage Account

The only thing you need from this storage account will be the connection string, which you can find after it’s created over here:

Getting the Storage Account Credentials

With those credentials in hand, you’ll need to visit the two files in the solution responsible for storing configuration. They’re both named AzureStorageConfig.cs.

Locating AzureStorageConfig.cs

Inside that file, you will see a line of code with a TODO comment indicating that you should paste your connection string for your Azure Storage account in that location. This is indeed your next step (make sure to do it in both the code for the API App that lives in Azure, and the code for the device itself).

Configuration Location

Ultimately, this is a terrible way to handle configuration. You can get the sample working with a simple copy/paste in that file, but the intent is that you would simply decide for yourself how you’d like to manage the configuration and creation of the CloudStorageAccount instance, and make that instance available through the StorageAccount property of the AzureStorageConfig class. This instance is used in both the AzureStorageCallbackStore and the AzureStorageClientCallbackStore classes.
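
With those caveats stated, the quick-and-dirty version that gets the sample running looks roughly like this (with obvious placeholders for the account name and key):

    using Microsoft.WindowsAzure.Storage;

    public static class AzureStorageConfig
    {
        // TODO: paste your own connection string here (or better yet, load it
        // from app settings rather than hard-coding it in source)
        private const string ConnectionString =
            "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>";

        // Consumed by both the AzureStorageCallbackStore and the
        // AzureStorageClientCallbackStore classes
        public static CloudStorageAccount StorageAccount
        {
            get { return CloudStorageAccount.Parse(ConnectionString); }
        }
    }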

Publishing the Code to Azure

We’re now ready to get this code all in place and running. The first step toward that goal will be to publish the API App project. You can do this by right-clicking the QuickLearn.ButtonPress.PushTrigger project, and then clicking Publish.

Publishing the QuickLearn.ButtonPress.PushTrigger project

Make sure to select Microsoft Azure API Apps (Preview) as the target.

In the Microsoft Azure API Apps window, select your Azure Subscription, and then click New… Fill out the form to create a new API App container into which you can deploy your code.

Creating a New API App Container

Once the creation of the container is complete, you will see the following status message appear in Visual Studio.

API App Provisioned

Then, you will once again right-click the project and then click Publish… This time, the form will be pre-filled with the settings from the publish profile of the Azure API App container that you just provisioned. You might find it helpful to deploy the Debug configuration of your API App (Settings > Configuration > Debug – Any CPU), but that is entirely up to you. Once you click Publish, your code will be deployed to the API App container, and the API App will be usable within a Logic App.

Publishing the API App

Next up, we will deploy code to the device, and configure it to run in the background.

Deploy the Code to the Device

First of all, you will need to edit the project properties for the QuickLearn.ButtonPress.App project so that it attempts deployment to the correct device. In this case, that will mean navigating to the Debug tab, setting the Target device to Remote machine, setting the Remote machine to the name of your Windows IoT Core device (default: minwinpc), and then unchecking the Use authentication box.

Configuring Project Properties for Deployment

You will want to make sure to save the project properties and get your device connected to the same network as your development machine (a laptop in my case). Next, you can right-click QuickLearn.ButtonPress.App, and click Deploy.

Once deployment is complete, head over to the Windows IoT Core Watcher utility that ended up on your system after installing everything you needed to get your device set up initially. If you can’t find it, reboot your system and it will be there waiting for you. The Windows IoT Core Watcher utility finds IoT Core devices on your network and provides quick links to gain access and configure them.

In the utility, right-click your device, and then click Web Browser here.

Windows IoT Core Watcher

Log in using your user name and password (the default is Administrator / p@ssw0rd).

Next, head over to the Apps tab, and verify that QuickLearn.ButtonPress shows up in the list. You will want to note the full name as it appears here because you will need it in a few minutes.

QuickLearn Button Press App Installed

Since this app was created as a Startup Task rather than as a graphical application, you will need to register it with the device to be run on startup. At the moment, this is not something that you can accomplish in the browser. Instead, you will need to fire up PowerShell for this next bit.

In PowerShell, you will need to enter a remote session on your device. You can do this using the Enter-PSSession cmdlet like this:

Enter-PSSession on Raspberry Pi 2
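
If you’ve never remoted into the device before, you may first need to add it to your trusted hosts list. Assuming the default device name, the pair of commands looks something like Set-Item WSMan:\localhost\Client\TrustedHosts -Value minwinpc followed by Enter-PSSession -ComputerName minwinpc -Credential minwinpc\Administrator.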

The connection process will take a while. Just get a cup of coffee, and when you have it ready, the session should be connected. Once connected, you are in PowerShell on the device, and are executing commands against the device (not your own local machine).

On the device is a utility called iotstartup. This utility provides access to configure what tasks run at device startup. In this case, you want to configure the device to constantly be running the Push-Button Push Trigger code in the background.

iotstartup usage

At the prompt, type iotstartup add headless “QuickLearn.ButtonPress.*”

Add Headless Startup Task

Verify that the app it added matches exactly what appeared in the list on the device web page that you examined earlier. Then, at the prompt, type shutdown /r /t 0

This will cause the device to reboot and your application to start up. It may take 60-90 seconds for the reboot to complete.

Building and Testing a Logic App using the Push-Button Push Trigger

In the Azure Portal, create a new empty Logic App in the same resource group in which you deployed the Push-Button Push Trigger API App (otherwise the API App won’t be available to select in the designer). In the Logic App designer, in the API Apps pane (which you may have to expand in order to see), click QuickLearn.ButtonPress.PushTrigger.

Push Trigger in the Designer

Configure the Push Trigger as shown, and then click the green check mark to save the settings.

Push Trigger Configuration

After the push trigger, add any other actions to your Logic App that you wish. Maybe this triggers a build in TFS, maybe it connects to a device that opens a door, maybe it brews you coffee remotely, maybe it posts a message in a chat service, maybe it closes out the latest support ticket that you were working on in your help desk system – it’s up to you. For me, I’m going to add a simple HTTP action (since it’s built into the runtime), and have it POST a message to a requestb.in indicating that the button has been pressed.

Completed Logic App

Save the Logic App, and it’s all ready to go!

PUSH THE BUTTON

There’s only one thing left to do – push the button. If everything has been set up correctly, the Logic App’s callback should be invoked and magic should happen in the cloud.

End Result

What If It Didn’t Work?

Well, there are a few troubleshooting things you can do. Using the Cloud Explorer window (part of the Azure SDK) in Visual Studio, you can navigate to your API App, right-click it, and then click Attach Debugger. You can set breakpoints within the callback registration method of the controller class and step through looking for problems as the Logic App registers the callback. This only happens when the trigger is first added to the Logic App (after clicking Save), and then every hour or so after that (assuming it worked on the first try).

You can see past registrations of the callback by navigating to the trigger history for the Logic App. If you see a string of failures there, it’s likely a bug in the callback registration code, or a problem with your storage account credentials.

Clicking any one of those line items will bring up the details (inputs/outputs) for debugging. If you want to attempt a callback registration manually (so that you can do it on demand), you can use the Swagger UI page for the API App, and manually fire the callback registration method.

Using Swagger UI to debug Logic App Push Trigger

The above screenshots were generated by replacing the configuration details for the Cloud Storage account with completely invalid data.

If everything looks good as far as the API App in Azure is concerned, you may want to debug the Windows IoT Core task from within Visual Studio. This can be done by right-clicking the project and then clicking Start Debugging (nothing special there).

The End

That’s all for now. Stay tuned for more samples and course updates!

Azure App Services Training is Awesome!

By John Callaway

If you weren’t one of the students that attended the recent Cloud-Based Integration Using Azure App Service class offered by QuickLearn you really missed out. I was able to attend and found the experience very informative.

Some technologies lend themselves to simply picking up a book and reading about how they work. BizTalk Server has never been one of those products, and it looks like, for the foreseeable future at least, Azure App Service and Logic Apps are going to fall into that same category. It’s a good thing that Rob Callaway and Nick Hauenstein are braving the front lines to create and deliver quality training for all of us who are too busy to keep up with the rapid changes in Azure.

About the Class

This class was delivered by Rob Callaway, one of the best BizTalk and now Azure App Service instructors in the world! This three-day class had an eclectic international audience, with people traveling from Canada and Europe to attend the course.

As you can tell from the overview, this class is jam-packed with everything you need to prepare to build integration solutions using Microsoft’s newest addition to Azure App Service: Logic Apps. Since there is so much that goes into creating a Logic App, the class feels a bit like a snowball rolling downhill: it starts small, but as it progresses the knowledge you gain becomes almost overwhelming. The labs ensure that you don’t get lost in the cloud (pun IS intended) by providing a rich hands-on experience to match the excellent lecture.

As an integration specialist, I felt very comfortable with the early concepts. By the time we got into day two everything was new as we built first simple and then more complex logic apps.

For the uninitiated, a Logic App is composed of triggers and actions, which are themselves API Apps. These API Apps are in turn Web Apps that perform some simple function. The whole thing is hosted in Azure. When these pieces are strung together, a Logic App can be very powerful, providing capabilities similar to BizTalk orchestrations.

We didn’t just explore Microsoft Azure App Services, but learned how to integrate with Microsoft Azure Service Bus as a reliable and persistent store for inbound and outbound data, and identified the role that Microsoft Azure BizTalk Services (MABS) plays in cloud-based integration. By the end of the course the participants were even able to build their own custom API Apps, no small feat!

The goal of the course, one that I think all the participants would agree was achieved, is to provide the best training possible on these evolving technologies. QuickLearn Training is able to deliver on this goal because we have spent the last two years digging through the sometimes scant documentation and Microsoft presentations to find the golden acorns of knowledge that we happily share with our customers.

Special Guests

One of the benefits of our close relationship with the product team and our proximity to the Microsoft campus is that from time to time we have special visitors. We appreciated Mark Mortimore and Jeff Hollan taking time out of their busy schedules to drop by on Thursday evening for a meet and greet with the students. Students provided Jeff with some great feedback on features in Logic Apps, and they even convinced Jeff to take a BizTalk Server course. While we don’t always get our friends at Microsoft to visit, when we do it’s exciting and fun.

To wrap up the class on Friday, we had a special guest appearance by our own Nick Hauenstein, who previewed the Creating a Push Trigger API App to Process NFC Tag Reads demonstration that he will be delivering at the upcoming AzureCon 2015 on September 29th.

What The Participants Are Saying

Rob did a great job helping the attendees navigate the minefield (or maybe, given the mental challenge, that should read MIND field) of creating and configuring Microsoft Azure Logic Apps.

Some of the feedback that we received from the attendees of this class:

…the QuickLearn materials were flawless and perfectly adapted to the objective of the course…I think that the learning environment is close to perfect and I’m having a hard time thinking of anything that should be changed.

The pace of his speech is very easy and pleasant to follow. Important points are made and repeated, often with humor, which is yet another demonstration that Rob masters his topic and enjoys sharing his knowledge.

This class probably needs to be at least 4 days if not a week.  Need more time to complete labs.

Announcement

With an evolving set of technologies such as this, there were inevitably additions made between the initial deliveries of this course and the most recent one. With this new content, the class simply will not fit into the three-day time-box that we initially allotted. As a result of these additions, and the feedback that we got from the recent class, we are excited to announce that the Cloud-Based Integration Using Azure App Service class is being extended to five days!

Your first opportunity to attend this new expanded version of the class is November 30th. As with all QuickLearn Training classes, it is offered for remote attendance if you prefer, but of course you are all invited to attend at our state-of-the-art facilities in Kirkland, Washington as well.

I predict that within one year, your customers will be asking you about cloud-based integration. Wouldn’t you rather be the one who already knows the answers, with several months’ experience under your belt?

If you are worried that new features will be added that you miss out on by being an early adopter, QuickLearn Training always offers the opportunity for students to retake any class within six months, thus future-proofing your learning. As new features are added to these technologies you can bet that we will do our best to stay on top of the changes so that we can share that knowledge with you.

So indeed, if you weren’t one of the students who attended the recent Cloud-Based Integration Using Azure App Service class offered by QuickLearn, you really missed out – but you have another chance to mend your ways and get an improved and lengthened version of the course. Don’t miss out on this great opportunity.