See Docker Enterprise Edition
See Overview of Azure Service Fabric
Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices and containers. Service Fabric also addresses the significant challenges in developing and managing cloud native applications. Developers and administrators can avoid complex infrastructure problems and focus on mission-critical, demanding workloads that are scalable, reliable, and manageable. Service Fabric represents the next-generation platform for building and managing these enterprise-class, tier-1, cloud-scale applications running in containers.
Service Fabric lets you build and manage scalable and reliable applications composed of microservices that run at high density on a shared pool of servers, referred to as a cluster. It provides a sophisticated, lightweight runtime for building distributed, scalable, stateless, and stateful microservices running in containers. It also provides comprehensive application management capabilities to provision, deploy, monitor, upgrade/patch, and delete deployed applications, including containerized services.
Azure Automation is a managed service that focuses on automating frequent deployment and lifecycle management tasks using runbook-based PowerShell Workflow functionality.
Automation reduces the chance of errors and provides support for repeatable and reproducible results.
To use Azure Automation, enable it in the Azure Portal. You can have up to 30 Azure Automation accounts per subscription.
When you create a new Automation Account, Azure will create two Run As Accounts and four template Runbooks. These sample accounts and runbooks demonstrate how to sign in to your Azure subscription using Classic and Resource Management accounts. You can copy and modify these templates to include the PowerShell script required for your automation task.
Runbooks encompass a set of processes and procedures that can be automated. Each Runbook should automate a single task.
Runbooks can be exported and imported. Microsoft provides a library of Runbook templates at the Microsoft Azure Automation Runbook Gallery.
Runbooks are imported from the gallery in draft state and must be published before they can be scheduled or called by other runbooks.
If you execute a runbook in test mode (draft or edit), it operates against the real targets, so be careful how you test!
If you put a published runbook into edit mode, it creates an additional draft copy of the runbook that can only be run in test mode. Anything attached to the runbook uses the currently published version until a new version is published. NOTE – any workflow already in progress when you publish the new version will continue to use the older published version until the workflow completes.
Multiple users can edit the same runbook concurrently, but this is not recommended!
You can pass parameters into a runbook when it is called. Input parameters can be made mandatory, or they can be optional, in which case you must specify a default value.
Parameters are specified at the start of the runbook in the param(...) block.
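For example, a runbook's param(...) block might look like this (the workflow and parameter names are illustrative):

```powershell
workflow Restart-TargetVm
{
    param (
        # Mandatory: the caller must supply a value.
        [Parameter(Mandatory = $true)]
        [string]$VmName,

        # Optional: this default is used when no value is supplied.
        [string]$ResourceGroup = "Default-RG"
    )

    Write-Output "Restarting $VmName in $ResourceGroup"
}
```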
Checkpoints can be added to runbooks to provide a persistence mechanism within the workflow. You can add a Checkpoint-Workflow before/after any workflow command except inline code blocks.
Checkpoints persist the state of the runbook (including variables) to the Azure Automation database. They persist until overwritten by the next checkpoint, or the workflow completes.
If a workflow fails, it can be resumed at a checkpoint. They help ensure that steps already completed are not executed twice if the workflow is restarted/resumed.
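A minimal sketch of checkpointing in a runbook (paths and names are illustrative):

```powershell
workflow Copy-LargeDataSet
{
    # Long-running step 1.
    Copy-Item -Path "\\ServerA\share\data" -Destination "C:\staging" -Recurse

    # Persist the workflow state (including variables) to the Azure
    # Automation database. If the runbook is suspended or fails after
    # this point, resuming restarts from here instead of re-running
    # the copy above.
    Checkpoint-Workflow

    # Step 2.
    Write-Output "Post-checkpoint processing"
}
```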
Azure Automation limits runbook execution time to 30 minutes. Azure will unload any runbook that takes longer than that.
Runbooks can be manually suspended in the portal, or within the workflow by calling Suspend-AzureAutomationJob or Suspend-Workflow. Runbooks can also be suspended by Azure if they run for more than 30 minutes or fail.
Runbooks can be manually resumed in the portal, or within the workflow by calling Resume-AzureAutomationJob.
Assets are global resources shared across all runbooks in the automation account. Examples include credentials, variables, modules, certificates, connections, and schedules.
Azure Automation can use Azure Active Directory, which reduces the need for management certificates. AAD authentication is the recommended approach.
Start by creating an AAD User WITHOUT multifactor authentication, then add the new user as a co-administrator for the Azure Subscription.
You can create an Azure Automation credential asset with the login credentials of the user through Get-AutomationPSCredential from within the runbook.
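A sketch of signing in from a runbook, assuming a credential asset named "AzureAadUser" has already been created (asset and subscription names are illustrative; the cmdlets shown are from the classic Azure module):

```powershell
workflow Use-AadCredential
{
    # Retrieve the credential asset created in the Automation account.
    $cred = Get-AutomationPSCredential -Name "AzureAadUser"

    # Sign in to Azure with the AAD user's credentials (classic module).
    Add-AzureAccount -Credential $cred
    Select-AzureSubscription -SubscriptionName "My-Subscription"

    # The runbook can now manage resources in the subscription.
    Get-AzureVM | Select-Object Name, Status
}
```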
Variables are created and maintained globally. They can be strings, integers, Booleans, or datetimes; an initial value may be specified.
Variables can be marked as encrypted, which masks the value (displayed as dots) in the UX; the value is stored encrypted in the Automation service and is decrypted when retrieved from within a runbook.
Variables can be read in scripts using Get-AutomationVariable and updated using Set-AutomationVariable.
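For example (the variable asset "LastRunTime" is assumed to already exist in the account):

```powershell
workflow Track-LastRun
{
    # Read the shared variable asset.
    $lastRun = Get-AutomationVariable -Name "LastRunTime"
    Write-Output "Previous run: $lastRun"

    # Update it; the new value is visible to all runbooks in the account.
    Set-AutomationVariable -Name "LastRunTime" -Value (Get-Date).ToString("s")
}
```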
You can import published PowerShell modules (up to 30 MB, in zipped format) into an automation account, where they can be shared by all runbooks.
The zip file must contain a single folder with the same name as the zip file. Within this folder there must be at least one file with the same base name and a .psd1, .psm1, or .dll extension.
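As a sketch, packaging a module named MyHelpers for import (run locally in PowerShell 5+, which provides Compress-Archive; the module name and function are illustrative):

```powershell
# The folder name must match the zip name, and the module file inside
# must share the same base name ("MyHelpers" is illustrative).
New-Item -ItemType Directory -Path .\MyHelpers | Out-Null
Set-Content -Path .\MyHelpers\MyHelpers.psm1 -Value @'
function Get-Greeting { param([string]$Name) "Hello, $Name" }
'@

# Produces MyHelpers.zip containing the MyHelpers folder, ready to import.
Compress-Archive -Path .\MyHelpers -DestinationPath .\MyHelpers.zip
```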
Credentials are used to provide access to external systems, networks, databases, and services.
They are referenced inside workflows using Get-AutomationCredential or Get-AutomationPSCredential.
Certificate credentials are based on management certificates. Best practice is to use AAD to generate certificates.
Connections are used to connect to external networks or systems. They need to specify all connection data, including ports, protocols, usernames and passwords.
Schedules execute runbooks automatically. When the schedule triggers, the runbook is loaded and executed. When the runbook completes, Azure deallocates the execution resources.
Schedules are not referenced from runbooks. Instead a runbook is linked to a schedule.
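A sketch using the classic Azure module cmdlets (account, runbook, and schedule names are illustrative; parameter names differ in newer Az modules):

```powershell
# Create a schedule that fires once a day starting at 02:00.
New-AzureAutomationSchedule -AutomationAccountName "MyAutomation" `
    -Name "NightlySchedule" -StartTime "02:00" -DayInterval 1

# Link a runbook to the schedule; the runbook itself never references
# the schedule.
Register-AzureAutomationScheduledRunbook -AutomationAccountName "MyAutomation" `
    -Name "Restart-TargetVm" -ScheduleName "NightlySchedule"
```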
Not everything has to be built in the cloud and deployed on PaaS. When building services that will be hosted on a physical or virtual server, using WCF services is still a viable option.
WCF endpoints can be exposed via IIS web sites. However, WCF can be difficult to configure correctly and can be a performance hog.
See Azure Service Bus
Microsoft Azure Service Bus is a reliable information delivery service for brokered messaging that can be thought of as asynchronous, or "temporally decoupled". The message sender can also require a variety of delivery characteristics, including transactions, duplicate detection, time-based expiration, and batching.
Applications should be decoupled from sending messages to Service Bus namespaces directly – this stops the application from hanging when the network is slow or the service is down. Instead, the application should write outbound messages to a database table, from which a Windows service transmits them to the queue.
Outbound messages can be sent to either queues, or topics and subscriptions.
Reading a message off the queue may be done in one of two modes: ReceiveAndDelete (the message is removed as soon as it is read) or PeekLock (the message is locked and only removed when the receiver completes it; if the receiver abandons it or the lock expires, it becomes available to other receivers again).
The service can also discard duplicate messages by inspecting the MessageId of each message, if the RequiresDuplicateDetection and DuplicateDetectionHistoryTimeWindow settings are enabled on the queue.
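As a sketch, duplicate detection is enabled when the queue is created; with the Az.ServiceBus PowerShell module this looks roughly as follows (names are illustrative, and parameter names may vary between module versions):

```powershell
# Duplicate detection must be enabled at queue creation time;
# messages repeating a MessageId already seen within the history
# window are then discarded by the service.
New-AzServiceBusQueue -ResourceGroupName "MyRg" -NamespaceName "MyNamespace" `
    -Name "orders" `
    -RequiresDuplicateDetection $true `
    -DuplicateDetectionHistoryTimeWindow "00:10:00"
```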
Inbound messages can be retrieved from a Service Bus queue using, for example, an Azure Functions trigger. A service bus can be secured using Shared Access Signatures (SAS) or Azure Active Directory.
Service Bus doesn’t provide a geo-replication option, so it has no inbuilt DR capability. This can be mitigated by partitioning the queue, but data can still be lost. The only way to be sure is to send the message to multiple queues and let the recipient sort it out by keeping track of which messages have already been processed.
These are the Azure PaaS equivalent of BizTalk, recently introduced but rapidly becoming the workflow platform of choice for orchestration in the cloud. They support serverless scaling. They are created using a web-based visual designer in the Azure portal or by installing the Extension “Visual Studio tools for Logic Apps”.
Primary development is done using a graphical designer, which creates a workflow definition in a JSON representation. Reference
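A minimal workflow definition skeleton, to show the shape of that JSON (the trigger and action here are illustrative):

```json
{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "triggers": {
    "When_a_request_arrives": {
      "type": "Request",
      "kind": "Http"
    }
  },
  "actions": {
    "Respond": {
      "type": "Response",
      "inputs": { "statusCode": 200 }
    }
  },
  "outputs": {}
}
```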
Many managed Microsoft connectors and workflow templates are already available in the Enterprise Integration Pack. Additional connectors, flow control, and BizTalk-like capabilities are being developed. You can create additional components using the ASP.NET MVC 6 framework, or integrate with on-premises BizTalk using a Data Connector and a Service Bus Relay endpoint.
Azure Logic Apps provide built-in auditing of all management operations, including the date/time a workflow is triggered, its duration, and its status. Role-based access controls can be applied through the portal, and diagnostic information can be exported to Event Hubs or Azure Storage.
Logic Apps can support content in JSON, XML, Flat Files and binary data. Content type can be configured in the connector and built in conversions are available. Reference
Exception and error handling:
EAI/B2B Integration using Logic Apps:
Data sources are made available through connections, which are used to build forms-based applications that support not only standard create, read, update, and delete functionality, but also complex search and field validation. PowerApps provides a range of connectors out of the box, including Office 365, Dynamics 365, Dropbox, Azure services, and social networks.
Applications built in PowerApps work immediately on tablets and phones. PowerApps uses small integration gateways to connect to on-premises data via a Service Bus relay. Microsoft has introduced the Common Data Services, which includes the Common Data Model, to provide data structure and governance.
Microsoft Flow is a recently introduced, lightweight version of Logic Apps for Office 365. Download
Information on Common Data Services