Artificial Intelligence (AI), Business Intelligence (BI), Gunnebo Business Solutions, Machine Learning (ML), Microsoft Azure

Machine Learning and Cognitive Services

Machine learning is gradually becoming the driving force for every business. Business organizations, large and small, are turning to machine learning models to predict present and future demand and to innovate in the production, marketing, and distribution of their products.

Business value concerns all forms of value that determine the well-being of a business. It is a much broader term than economic value, encompassing many other factors such as customer satisfaction, employee satisfaction, and social values. It is the key measure of the success of a business. AI helps you accelerate this business value in two ways: by enabling correct decisions and by enabling innovation.


Remember the days when Yahoo was the major search engine and Internet Explorer was the major web browser? One of the main reasons for their downfall was their inability to make correct decisions. Wise decisions are made by analyzing data: the more data you analyze, the better decisions you make. Machine learning is a great support in this cause.

There was a time when customers accepted whatever companies offered them. Things are different now: customer demand for new features is ever increasing. Machine learning has been the decisive factor behind almost every new innovation, whether face recognition, personal assistants, or autonomous vehicles.

Machine Learning in more detail

Let's start with what machine learning actually is. Machine learning enables systems to learn and make decisions without being explicitly programmed for them. It is applied in a broad range of fields, and nowadays almost every human activity is being automated with its help. A particular area of study where machine learning is heavily exploited is data science.

Data science works with data: data must be extracted and analyzed to make the best decisions for a business.

The amount of data that a business has to work with is enormous today; social media platforms alone produce billions of data points every day. To stay ahead of your competitors, every business must make the best use of this data. That's where machine learning comes in.

Machine learning offers many techniques for making better decisions from large data sets, including neural networks, support vector machines (SVMs), reinforcement learning, and many other algorithms.

Among them, neural networks are leading the way. They improve consistently, spawning child technologies such as convolutional and recurrent neural networks that provide better results in different scenarios.


Learning machine learning from the beginning and trying to develop models from scratch is not a wise idea: it incurs huge costs and demands a lot of expertise in the subject. That is why you should consider the assistance of a machine learning vendor. Google, Amazon, and Microsoft all provide machine learning services. Let's take Microsoft as an example and review what qualities we should look for when selecting a vendor.

Using the cloud as a solution for machine learning

The cloud simplifies and accelerates the building, training, and deployment of machine learning models. It provides a set of APIs to interact with when creating models, hiding all the complexity of devising machine learning algorithms. Azure can identify suitable algorithms and tune hyperparameters faster. Autoscaling, a built-in feature of Azure cloud services that automatically scales applications, lets your application perform at its best while keeping costs to a minimum. Azure Machine Learning APIs can be used from any major technology, such as C# or Java.

There are many other advantages to cloud machine learning:

  • Flexible pricing. You pay for what you use.
  • High user-friendliness. Easier to learn and less restrictive.
  • More accurate predictions based on a wide range of algorithms.
  • Fine tuning results are easier.
  • Ability to publish your data model as a web service, which is easy to consume (see the sketch after this list).
  • The tool allows data streaming platforms like Azure Event Hubs to consume data from thousands of concurrently connected devices.
  • You can publish experiments for data models in just a few minutes whereas expert data scientists may take days to do the same.
  • Azure security measures manage the security of Azure Machine Learning, protecting data in the cloud and offering security-health monitoring of the environment.
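As a sketch of how simple consumption can be, here is how a model published as a classic Azure ML web service might be called from C#. The endpoint URL, API key, and input schema below are placeholders that depend on your own experiment:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ScoringClient
{
    static async Task Main()
    {
        // Placeholders: copy the real values from your published web service page.
        const string endpoint = "https://<region>.services.azureml.net/workspaces/<ws>/services/<id>/execute?api-version=2.0";
        const string apiKey = "<api-key>";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiKey);

            // The input schema mirrors your experiment's input dataset.
            var body = "{\"Inputs\":{\"input1\":{\"ColumnNames\":[\"feature1\"],\"Values\":[[\"42\"]]}}}";
            var response = await client.PostAsync(endpoint,
                new StringContent(body, Encoding.UTF8, "application/json"));

            Console.WriteLine(await response.Content.ReadAsStringAsync()); // scored result as JSON
        }
    }
}
```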

Using Cognitive Services to power your business applications

We will go on to discuss how Azure Cognitive Services can be used to power up a business application. Azure Cognitive Services is a collection of APIs, SDKs, and services that allows developers to build intelligent applications without having expertise in data science or AI. These applications can see, hear, speak, understand, and even begin to reason.


Azure Cognitive Services were introduced to extend Microsoft's existing portfolio of APIs.

New services provided by Azure Cognitive Services include:

  • Computer Vision API, which provides the advanced algorithms necessary for image processing
  • Face API, which enables face detection and recognition
  • Emotion API, which offers options to recognize the emotion in a face
  • Speech service, which adds speech functionality to applications
  • Text Analytics, which can be used for natural language processing

Most of these APIs were built with business applications in mind. Text Analytics can be used to harvest user feedback, allowing businesses to take the actions needed to accelerate their value. Speech services let business organizations provide better customer service to their clients. All these APIs have a free trial that can be used to evaluate them. You can use these cognitive services to build many kinds of AI applications that solve complex problems for you, accelerating your business value.
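To make this concrete, here is a minimal sketch of calling the Text Analytics sentiment endpoint from C#; the region and subscription key are placeholders for your own Cognitive Services resource:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class SentimentCheck
{
    static async Task Main()
    {
        // Placeholders: use your own resource's region and key.
        const string endpoint = "https://<region>.api.cognitive.microsoft.com/text/analytics/v2.1/sentiment";
        const string key = "<subscription-key>";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", key);

            var body = "{\"documents\":[{\"id\":\"1\",\"language\":\"en\",\"text\":\"The new app is fantastic!\"}]}";
            var response = await client.PostAsync(endpoint,
                new StringContent(body, Encoding.UTF8, "application/json"));

            // The response contains a sentiment score between 0 (negative) and 1 (positive).
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}
```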

If you want to talk more about ML and AI, feel free to contact me: bjorn.nostdahl@gunnebo.com 🙂

Cosmos DB, Gunnebo Business Solutions, Microservices, Microsoft Azure, Mongo DB, Technical

Microsoft LEAP: Designing for the Cloud

Microsoft LEAP is an event for developers worldwide who are looking for original training from Microsoft. It takes place annually at Microsoft headquarters in Redmond, WA. The five-day conference helps attendees fully understand how Microsoft products can be used and how they can solve companies' problems. This time, the participants learned how to design for the cloud in an up-to-date fashion.

 


The following piece will give you a glimpse of the Microsoft LEAP program. The sections below are the highlights with the greatest impact on the developer community.

Deep Dive into Cloud Computing: Azure Strategy

On January 28, Microsoft kicked off the LEAP program for software architects and engineers. There were many speakers on the agenda, and Scott Guthrie was one of the strongest. Scott is in charge of Microsoft's cloud infrastructure, servers, CRM, and many more tools, and he led the team that created Microsoft Azure. In his keynote, "Designed for Developers", he discussed cloud computing technology. His aim was to help developers of different skill levels reach one goal: sustainable development and use of cloud computing.


Scott focused on how to develop cloud applications and maintain them. The session concluded with a presentation of Microsoft's anticipated plan to provide quantum computing in its Azure technology.

The Strong Impact of Microservice Architecture

On this topic, the most memorable session was presented by Francis Cheung and Masashi Narumoto. They talked about microservices and the strength of the architecture behind them. This architecture is considered a paragon in the world of cloud computing, as it has raised the bar.


The speakers mentioned several important features of a strong company that has the potential to succeed, and it was well established that the success of a microservice implementation depends mostly on a well-developed team with a strong strategy (preferably domain-driven).

 

No matter how beneficial microservices can be, they are not necessarily the right choice for your business. You need to be well aware of your products and of the level of complexity your business needs. Having extra, unrequired tools will set you back rather than take you anywhere.

SQL Hyperscale as a Cloud-Based Data Solution

This session was different, as it celebrated two decades of PASS and 25 years of SQL Server technology. The speaker, Raghu Ramakrishnan, has been a CTO at Microsoft since he moved from Yahoo in 2012. With his strong background and experience, Raghu was the best candidate to discuss SQL Hyperscale and how groundbreaking this technology has been.


The Hyperscale service has become a crucial update to the currently existing services. According to Ramakrishnan, it is the most modern SQL offering, with the highest storage capacity and the most computing performance; this model supports databases of up to 100 TB.

 

This technology is generally used to replace cloud database structures, as it is more reliable and accessible than the alternatives. Microsoft has added many features to SQL Hyperscale, making it the leading database solution on the market. With the amazing features discussed in the talk, it was well worth a separate session.

The Commercial Database: Cosmos Database

Deborah Chen, the Cosmos DB program manager at Microsoft, took the time to discuss one of the most popular commercial databases out there. Most current implementations use non-relational databases, and Cosmos DB is one of the most widely used.


As Deborah mentioned, Cosmos DB is a very flexible and responsive tool. With numerous transactions taking place every second, response time is a very sensitive matter for applications, especially real-time ones. Since Cosmos DB is a non-relational database, retrieving and storing data is easier and faster. This is where it stands out, as it was intentionally created with an architecture aimed at handling such tasks.

 

She also discussed the use of Service Level Agreements (SLAs). These agreements provide guarantees on availability and latency for all users, making Cosmos DB one of the most sought-after products out there.

Monitoring Your Procedures Using Azure Monitor

Rahul Bagaria, a product manager for Azure Monitor, joined later on to talk about the importance of monitoring your work, flow, and operations. The monitoring process is not limited to single tasks; it covers connections, workflow, and final output. Monitoring all the steps of a procedure is important for maintaining efficient delivery and quality assurance as a whole. It is also helpful for picking out errors and problems in the cycle, should they arise.


This is where Azure Monitor kicks in, with strong components like Log Analytics and Application Insights. Rahul emphasized the importance of this tool and all the features it provides; his team has worked hard to deliver a service that can help with multiple tasks, milestones, and services. This session helped the developers learn why and how to monitor their work processes.

 

All in all, the first day at Microsoft LEAP 2019 was very on-topic and interesting. I look forward to the next sessions. If you have any questions, feel free to contact me at bjorn.nostdahl@gunnebo.com

Artificial Intelligence (AI), Business Intelligence (BI), Gunnebo Business Solutions, Machine Learning (ML), Microsoft Azure

Microsoft LEAP: Looking into the future

Cloud computing has become one of the most profitable industries in the world, and the cloud will remain a very hot topic for the foreseeable future. There is huge competition among cloud service providers to win customers by providing them the best services, and providers invest a lot of money in innovation. Thus, cloud services set most of the trends for the future IT industry, and Microsoft Azure and Amazon AWS are among the leaders in innovation in their respective fields.

Data centers around the world

As the demand for cloud services rapidly increases in all parts of the world, establishing data centers around the globe becomes a necessity. Azure has understood this well and is expanding its service by constructing data center regions in many parts of the world.

From a news.microsoft.com article about Project Natick's Northern Isles datacenter at a Naval Group facility in Brest, France. Photo by Frank Betermin

The world is divided into geographies defined by geopolitical boundaries or country borders. These geographies define the data-residency boundaries for customer data. Azure geographies respect the requirements within geographical boundaries, ensuring data residency, compliance, sovereignty, and resiliency. Azure regions are organized into geographies; a region is defined by a bandwidth and latency envelope. Azure has the greatest number of global regions among cloud providers, which is a great benefit for businesses that seek to bring their applications closer to users around the world while protecting data residency.

The Two Major Azure’s Global Expansion of Cloud Services

Two of the most important expansions that Microsoft Azure has incorporated to improve its services are the following:

Expansion of Virtual Network and Virtual Machine Support

With utility virtual machines like A8 and A9, which provide advantages such as rapid processors and interconnection between more virtual cores, virtual networks can now be seamlessly configured for specific geographical locations and regions.

This feature gives more room for optimal operations, cloud services, complex engineering design, video encoding, and a lot more.

Incorporation of Azure Mobile Services and Its Expansion to Offline Features

Even when disconnected, this makes it possible for applications to operate effectively using offline features. Furthermore, it extends Azure cloud services to apps on various platforms, including Android and iOS mobile phones.

Then there are Availability Zones, the third level in the Azure network hierarchy.

Availability Zones are physically separate locations inside a region, each made up of one or more data centers. Constructing Availability Zones is not easy: they are not just data centers; they need advanced networking, independent power, cooling, and more. The primary purpose of Availability Zones is to help customers run mission-critical applications.

You get the following benefits with Azure Availability Zones:

  • Better protection for your data – you won't lose your data due to the destruction of a single data center
  • High availability, better performance, and more resources for business continuity
  • A 99.99% SLA on virtual machines

Open source technology

Microsoft took some time to understand the value of open-source technologies, but now they are doing really well. With .NET Core and the .NET Standard, Microsoft has made a major commitment to open source. Looking at GitHub alone, Microsoft is one of the largest contributors to open source.

In June 2018, Microsoft confirmed it was acquiring GitHub.
“Microsoft is a developer-first company, and by joining forces with GitHub we strengthen our commitment to developer freedom, openness and innovation,” said Satya Nadella, CEO, Microsoft.

With .NET Core 3.0, Microsoft introduced many features that enable developers to create secure, fast, and productive web and cloud applications. .NET Core 3 is a major update which adds support for building Windows desktop applications using Windows Presentation Foundation (WPF), Windows Forms, and Entity Framework 6 (EF6). ASP.NET Core 3 enables client-side development with Razor Components, EF Core 3 has support for Azure Cosmos DB, and the release also includes support for C# 8 and .NET Standard 2.1, and much more.
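As a small taste of C# 8, here are two of the headline features, switch expressions and nullable reference types:

```csharp
using System;

#nullable enable // opt in to nullable reference types (new in C# 8)

static class CSharp8Demo
{
    // Switch expression: a terser form of switch, new in C# 8.
    public static string DescribeDay(DayOfWeek day) => day switch
    {
        DayOfWeek.Saturday => "weekend",
        DayOfWeek.Sunday => "weekend",
        _ => "weekday"
    };

    // string? documents that null is an expected value here.
    public static int NameLength(string? name) => name?.Length ?? 0;
}
```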

Mixed reality and AI perceptions

Mixed reality tries to reduce the gap between our imagination and reality, and with AI it is about to change the way we see the world. It seems set to become a primary source of entertainment. Although mixed reality first became popular in the gaming industry, you can now see its applications in other industries as well. The global mixed reality market is booming, which is why the biggest names in tech are battling it out to capture it. All the major players have introduced MR devices, such as the Meta 2 headset, Google Glass 2.0, and Microsoft HoloLens.

Mixed reality and AI perception are the result of the cooperation of many advanced technologies. This technology stack includes natural language interaction, object recognition, real-world perception, real-world visualization, contextual data access, cross-device collaboration, and cloud streaming.


As I said earlier, although the gaming industry was the first to adopt mixed reality, MR applications are now widely used in other industries as well. Let's visit some of those industries and see how mixed reality has transformed them and what benefits they get from mixed reality and AI perception.

You can see tech giants such as SAAB, NETSCAPE, and DataMesh using mixed reality in the manufacturing industry. According to research, mixed reality helps increase worker productivity by 84%, improve collaboration among cross-functional teams by 80%, and improve customer service interactions by 80%. You may wonder how mixed reality is able to achieve this and what it offers the manufacturing industry. There are many applications of mixed reality in manufacturing; the following is a small list of them.

  • Enhanced Predictive Maintenance
  • Onsite Contextual Data Visualization
  • Intuitive IOT Digital Twin Monitoring
  • Remote collaboration and assistance
  • Accelerated 3D modeling and product design
  • Responsive Simulation training

Retail, healthcare, engineering, and architecture are some other industries that use mixed reality heavily.

Quantum revolution

Quantum computing could be the biggest thing in the future. It is a giant leap forward from today's technology, with the potential to alter our industrial, academic, societal, and economic landscapes forever. You will see massive implications in nearly every industry, including energy, healthcare, smart materials, and environmental systems. Microsoft is taking a unique, revolutionary approach to quantum with its Quantum Development Kit.

Picture from a cloudblogs.microsoft.com article about the potential of quantum computing

Microsoft can be considered the company that has taken quantum computing most seriously in the commercial world. They have a quantum dream team formed from some of the greatest minds in physics, mathematics, computer science, and engineering to provide cutting-edge quantum innovation, and their quantum solution integrates seamlessly with Azure. They have taken a scalable, topological approach to quantum computing, which helps to harness superior qubits that can perform complex computations with high accuracy at a lower cost.

There are three important features in the Quantum Development Kit that make it the go-to quantum computing solution.

First, the kit introduces its own language, Q#, created specifically for quantum programming. Q# has general programming features such as operators, native types, and other abstractions. It integrates easily with Visual Studio and VS Code, which makes it feature-rich, and it is interoperable with the Python programming language. With the support of enterprise-grade tools, you can easily work on any OS: Windows, macOS, or Linux.
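To give a flavor of the language, here is a minimal Q# operation, a quantum coin flip (a sketch; the syntax follows the Quantum Development Kit of that era):

```qsharp
namespace Demo {
    open Microsoft.Quantum.Intrinsic;

    // Put a qubit into superposition and measure it: a true 50/50 random bit.
    operation FlipCoin() : Result {
        mutable outcome = Zero;
        using (q = Qubit()) {
            H(q);               // the Hadamard gate creates the superposition
            set outcome = M(q); // measurement collapses it to Zero or One
            Reset(q);           // qubits must be returned to |0> before release
        }
        return outcome;
    }
}
```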

Second, the Quantum Development Kit provides a simulated environment that greatly supports optimizing your code. This is very different from other quantum computing platforms, which still exist at a rather crude level. The simulation environment also helps you debug your code, set breakpoints, estimate costs, and much more.

Third, as we discussed earlier, Microsoft has become a major contributor to the open-source world. They provide an open-source license for libraries and samples and have worked hard to make quantum computing easier: a lot of training material is available to attract developers into the quantum programming realm. The open-source license is a great encouragement for developers to use the Quantum Development Kit in their applications while contributing to the Q# community.

Cloud services will shape the future of the IT industry, and quantum computing, open-source technologies, and mixed reality will play a great role in it.

This is my last day in Redmond, but I really look forward to coming again next year! If you have any questions, feel free to contact me at bjorn.nostdahl@gunnebo.com

Artificial Intelligence (AI), Gunnebo Business Solutions, Machine Learning (ML), Microsoft Azure

Microsoft LEAP: Adding Business Value and Intelligence

The concept of business value and intelligence is aimed at more productive measures through the use of various tech applications and analytical tools for the assessment of raw data. Business intelligence makes use of activities like data mining, analytical processing, querying, and reporting. Companies take advantage of it to improve their operations and accelerate their decision-making. Business intelligence is also useful for reducing costs and expenses and for identifying new business opportunities.


A lot of experts shared their ideas and spoke on various aspects of business value and intelligence relating to AI in Redmond. Notable speakers included Jennifer Marsman, Maxim Lukiyanov, Martin Wahl, and Noelle LaCharite. They spoke extensively on machine learning fundamentals, an introduction to the new Azure Machine Learning service, using Cognitive Services to power your business applications, and how to solve business problems using AI, respectively.

Machine Learning Fundamentals

The fundamentals of machine learning have to do with understanding both the theoretical and the programming aspects. It is also important to stay up to date with the latest algorithms and technology being implemented by the various programming tools for machine learning. The simplest explanation of the term machine learning is the operation of a machine in such a way that it is able to learn to perform various tasks.


Algorithms can learn how to perform these tasks in various ways, and this brings us to the different types of machine learning. They include supervised learning, which is carried out to enable the machine to identify and differentiate between various data, and unsupervised learning, which, in contrast, does not prescribe specific data or structure that the machine is supposed to produce. Another type of machine learning is reinforcement learning.

The importance of a machine learning model's accuracy cannot be overstated, as accuracy is what really determines how effective a model will be for the operations of a company. Models are estimated or measured mainly by making predictions and putting them to work in the real world; in the business world, a model is not accepted until it has been tested against the real world with satisfactory results. How a model is measured depends on the characteristics of that particular model and the circumstances in which it is needed in the real world.

Two vital aspects of machine learning are CNNs and RNNs: convolutional neural networks and recurrent neural networks. CNNs mainly generate fixed-size outputs and are used where minimal preprocessing is required. RNNs, on the other hand, work with arbitrary input and output sizes and can be used for processing arbitrary sequences. So in basic terms, CNNs are built to recognize images, while RNNs recognize sequences.


Furthermore, Jennifer Marsman described various methods related to artificial intelligence, including the following.

  • Search and Optimization

Search and optimization methods help AI systems explore possible solutions and rank them. Explaining the role of AI for search and optimization purposes on search engines can get very technical, and machines are also taught how to work with these ranking algorithms.

  • Logic

Logic also plays a major role in AI. It can be applied as an analytical tool, as a knowledge representation formalism, and as a method of reasoning. Logic can also be used as a programming language, and with this it can explore both the prospects and the problems of the success of AI.

  • Probabilistic Methods for Uncertain Reasoning

Probability is one of the most widely used methods for representing uncertainty in AI. Over the years, many certainty factors have been utilized to quantify uncertainty in alternative numerical schemes.

  • Classifier and Statistical Learning Methods

Classifiers associated with AI include Naive Bayes, decision trees, and the perceptron, among a host of others. There are also various statistical learning methods and theories in use to evaluate the uncertainties of AI. However, there are limitations to these statistical models, and this is where logic comes in.

  • Artificial Neural Networks

This covers the impact of the earlier-mentioned RNNs and CNNs on the concept of AI. A typical example of an ANN is a natural-language-processing AI that can be used to interpret human speech.

  • Evaluation Progress in AI

This is imperative for estimating the progress of AI across all sectors, including business models. Three evaluation types are human discrimination, peer confrontation, and problem benchmarks.

An Introduction to the New Azure Machine Learning Service

Maxim Lukiyanov spoke about the working principles of the new Azure Machine Learning service. The service helps simplify and accelerate the building, training, and deployment of various machine learning models. Furthermore, automated machine learning can be utilized so that the algorithms needed are easily identified and the hyperparameters are tuned faster.

The new Azure Machine Learning service also helps improve productivity and reduce costs with auto-scaling compute for the machine learning workflow. It has the added advantage of storing data easily in the cloud. Using the latest tools is also seamless, with support for open-source frameworks like PyTorch, TensorFlow, and scikit-learn.

Maxim also spoke further on some benefits of the new Azure Machine Learning service:

  • Easy and flexible pricing: you pay only for the features you use.
  • High user-friendliness: the service is easy to understand, and the tools that come with it are not in any way restrictive.
  • With the various data sets and algorithms of the tool, there will be more accurate predictions.
  • The tools make it very easy to import data, as well as to fine-tune the results.
  • A lot of other devices can be connected easily to the platform with the aid of the tools.
  • Data models can be easily published as a web service.
  • Experiments can be published in a matter of minutes, a major upgrade compared with the days it may take expert data scientists.
  • There is adequate security from the Azure security measures, which is very useful for storing data in the cloud.

Using Cognitive Services to Power your Business Applications: An Overview and Look at Different AI Use Cases

Martin Wahl explained that with Azure Cognitive Services, customers benefit from AI alongside developers, without even needing the services of a data scientist, which is a major advantage in saving both time and cost. This is achieved by packaging the learning models, pipelines, and infrastructure into cognitive services for important activities such as vision, speech, search, text processing, language understanding, and many more operations. This means that anyone capable of writing a program at all can use machine learning to improve an application.


Customers who have adopted this service are already benefiting from cognitive services such as the Face container, Text container, Custom Vision service support for logo detection, language detection, in-depth analysis, and more.

Martin Wahl concluded by explaining that with the Azure service, more value is added to the business, and implementing artificial intelligence is easier than ever.

How to Solve Complex Business Problems Using AI Without Needing a Data Scientist or Machine Learning Expert

With the possession of basic skills like Python coding, data visualization, the Hadoop platform, Apache Spark, etc., complex business problems can be solved even without being a machine learning expert or a data scientist. All of this is made possible through the help of AI, and all that is needed is dedication and willingness. Some steps to go about this include:

  • Understanding the basics: this means acquiring general knowledge of the fundamentals, both theoretical and practical.
  • Learning statistics: statistics is core to solving business problems, and some of the aspects to look at include sampling, data structures, variables, correlation, and regression.
  • Learning Python
  • Making attempts on an explanatory data analysis project
  • Creation of learning models
  • Understanding the technologies that are related to big data
  • Exploring deeper models
  • Completing a complex business problem.

Finally, Noelle LaCharite gave a vivid explanation of how a PoC was made, and with the aid of Azure AI I built one myself in Delphi in 30 minutes.

Gunnebo Business Solutions, IBM International Business Machines, Microsoft Azure, Node RED, Technical

Node-RED deployment on Azure

Today I would like to talk about the process of deploying Node-RED instances on the Azure platform.

The initial tasks were:

  1. Deploy a Node-RED instance to the Azure cloud and provide a public IP address/DNS name for it.
  2. Secure Node-RED instance access with user credentials.
  3. Update the instance with the actual node set and provide the ability to keep it up to date.

Let’s discuss all steps one by one.

Azure deployment

The most common and convenient way to deploy your application on the Azure platform is by using Azure Resource Manager. It enables you to treat all application resources as a group and to deploy, manage, or delete them in one operation. With Resource Manager, you can create a template (an Azure Resource Manager template) that defines the infrastructure and configuration of your Azure solution. It allows you to deploy your solution repeatedly throughout its lifecycle, confident that your resources are deployed in a consistent state.

A Resource Manager template is a JSON file that defines the resources you need to deploy to a resource group. Resource Manager analyzes the template and then converts its syntax into REST API operations for the appropriate resource providers. For the resources to be deployed in the correct order, you can set dependencies between them. This is done when one resource relies on a value from another resource, for example, a virtual machine that needs a storage account for its disks.

You may wonder: what are resources, and why do we need them? We just want to deploy a NodeJS application (Node-RED) on Azure. Well, a resource is a manageable item that is available on Azure. Some common resources are a virtual machine, a storage account, and a virtual network, but there are many more. To start Node-RED in the cloud, we need to create a VM and deploy a Docker container (image) with Node-RED inside. Since one resource relates to another, we should create a bunch of resources in our resource group (that is, a container holding related resources for our Azure solution). It includes:

  • Storage account
  • Public IP address
  • Virtual Network
  • Network interface
  • Network security group
  • Virtual Machine
  • Extensions

Resource Manager provides extensions for scenarios where you need additional operations, such as installing particular software that is not included in the setup. We used the Docker extension in order to set up the Docker container on the VM.

Ok, so now we are ready to create a template. The detailed description can be found here.

Here I would like to talk only about the extension section:
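A sketch of what this section of the template might look like (names and API versions are illustrative, not the original values):

```json
{
  "type": "Microsoft.Compute/virtualMachines/extensions",
  "name": "[concat(parameters('vmName'), '/DockerExtension')]",
  "apiVersion": "2017-03-30",
  "location": "[resourceGroup().location]",
  "dependsOn": [
    "[resourceId('Microsoft.Compute/virtualMachines', parameters('vmName'))]"
  ],
  "properties": {
    "publisher": "Microsoft.Azure.Extensions",
    "type": "DockerExtension",
    "typeHandlerVersion": "1.1",
    "autoUpgradeMinorVersion": true,
    "settings": {
      "compose": {
        "node-red": {
          "image": "nodered/node-red-docker"
        }
      }
    }
  }
}
```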

At this stage, we define a DockerExtension resource that depends on our virtual machine resource, and we specify the “nodered/node-red-docker” image from Docker Hub.

Also, we need to enable the Docker Remote API for further use:
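With the Docker extension this is a matter of adding a docker block to the extension settings, alongside the compose block shown above; a sketch (2375 is the conventional unencrypted Docker API port, so consider TLS on 2376 for production):

```json
"settings": {
  "docker": {
    "port": "2375"
  }
}
```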

Since we need to get access to the API, we also expose the port in the network security group:
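A sketch of the corresponding rule inside the network security group resource (name and priority are illustrative):

```json
{
  "name": "docker-remote-api",
  "properties": {
    "priority": 1010,
    "direction": "Inbound",
    "access": "Allow",
    "protocol": "Tcp",
    "sourceAddressPrefix": "*",
    "sourcePortRange": "*",
    "destinationAddressPrefix": "*",
    "destinationPortRange": "2375"
  }
}
```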

Also, we need to map VM port 80 to port 1880 (the default port for Node-RED):
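In the compose block this is a standard port mapping; a sketch:

```json
"compose": {
  "node-red": {
    "image": "nodered/node-red-docker",
    "ports": [
      "80:1880"
    ]
  }
}
```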

After defining the template, we are ready to deploy the resources to Azure. There are several ways to do that: PowerShell, the Azure CLI, the Azure portal, the REST API, or an Azure SDK.

Since we want to develop an automation solution for application deployment, the REST API and the Azure SDK seem the most suitable for us. The reason I want to highlight the Azure SDK for .NET is that it is much easier to build an application using existing wrapper classes for the API than to create your own REST wrappers and methods.

Take these four steps to deploy your template with the C# SDK:

1. To be able to make any requests to the API, we first need to authenticate and authorize our request. Let's create the management client:
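With the Azure Management Libraries (fluent) for .NET, creating the client takes a couple of lines. The snippets in these four steps are fragments of one deployment routine, and all names in them are illustrative:

```csharp
using Microsoft.Azure.Management.Fluent;
using Microsoft.Azure.Management.ResourceManager.Fluent;

// Reads the client ID, authentication key, and tenant ID from the authorization file.
var credentials = SdkContext.AzureCredentialsFactory.FromFile("azureauth.properties");

var azure = Azure.Configure()
    .Authenticate(credentials)
    .WithDefaultSubscription();
```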

azureauth.properties is the authorization file. Before you can deploy a template, you need to acquire a token for authenticating requests to Azure Resource Manager. You should also record the application ID, the authentication key, and the tenant ID, which belong in the authorization file.

2. Create a resource group and a storage account:
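A sketch (the storage account name must be globally unique and lowercase):

```csharp
var resourceGroup = azure.ResourceGroups.Define("node-red-rg")
    .WithRegion(Region.EuropeWest)
    .Create();

var storage = azure.StorageAccounts.Define("noderedtemplates")
    .WithRegion(Region.EuropeWest)
    .WithExistingResourceGroup(resourceGroup)
    .Create();
```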

3. Upload your template file to Azure:
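One way is to put the template into blob storage so that Resource Manager can fetch it by URI; a sketch using the WindowsAzure.Storage package:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

var storageKey = storage.GetKeys()[0].Value;
var account = CloudStorageAccount.Parse(
    $"DefaultEndpointsProtocol=https;AccountName={storage.Name};AccountKey={storageKey}");

var container = account.CreateCloudBlobClient().GetContainerReference("templates");
await container.CreateIfNotExistsAsync();
// Resource Manager must be able to read the template during deployment.
await container.SetPermissionsAsync(
    new BlobContainerPermissions { PublicAccess = BlobContainerPublicAccessType.Blob });

var blob = container.GetBlockBlobReference("template.json");
await blob.UploadFromFileAsync("template.json");
var templateUri = blob.Uri.ToString();
```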

4. Deploy the template:
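A sketch; the content version must match the one declared in the template:

```csharp
using Microsoft.Azure.Management.ResourceManager.Fluent.Models;

var deployment = azure.Deployments.Define("node-red-deployment")
    .WithExistingResourceGroup(resourceGroup)
    .WithTemplateLink(templateUri, "1.0.0.0")
    .WithParameters("{}") // or a JSON string with your template parameters
    .WithMode(DeploymentMode.Incremental)
    .Create();
```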

That’s it. On the whole, the deployment process in our case takes about 3-5 mins.

To retrieve the public IP address our Docker container is available on:
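With the fluent SDK (illustrative names again):

```csharp
var publicIp = azure.PublicIPAddresses.GetByResourceGroup("node-red-rg", "node-red-ip");
Console.WriteLine($"Node-RED editor: http://{publicIp.IPAddress}/");
```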

So now we have a Node-RED instance up and running on the Azure cloud, accessible via a public IP/DNS name. Let's proceed to the next step.

Secure Node-RED instance

The Node-RED editor supports two types of authentication:

  • username/password credential-based authentication
  • starting from Node-RED 0.17, authentication against any OAuth/OpenID provider, such as GitHub or Twitter

If we choose the first option, we need to add the following to our settings.js file:
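The adminAuth block below follows the Node-RED documentation; the password value is a bcrypt hash of the chosen password, shown here as a placeholder:

```javascript
adminAuth: {
    type: "credentials",
    users: [{
        username: "admin",
        password: "<bcrypt-hash>", // generated as shown in the next snippet
        permissions: "*"
    }]
}
```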

Since we want to make these credentials customizable for each deployment, we can't embed this configuration in the Docker file. So we need a way to execute commands inside the Docker container after deployment. That's why we use the Docker Remote API to adjust the credentials settings, and this is the reason for exposing an additional port in our template, as mentioned above.

Here is a command example to set up credentials for Node-RED:
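One documented approach is to hash the password with the bcryptjs module that ships with Node-RED, and then write the adminAuth block into /data/settings.js inside the container:

```sh
node -e "console.log(require('bcryptjs').hashSync(process.argv[1], 8));" your-password
```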

We used the .NET client for the Docker Remote API as a wrapper around the REST API:
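A sketch using the Docker.DotNet package (method names vary slightly between package versions, and the command here is illustrative):

```csharp
using System;
using System.Linq;
using System.Threading;
using Docker.DotNet;
using Docker.DotNet.Models;

var docker = new DockerClientConfiguration(new Uri("http://<vm-ip>:2375")).CreateClient();

// Find the running Node-RED container.
var containers = await docker.Containers.ListContainersAsync(new ContainersListParameters());
var nodeRed = containers.First(c => c.Image.Contains("node-red"));

// Create and start an exec instance that runs a command inside the container.
var exec = await docker.Containers.ExecCreateContainerAsync(nodeRed.ID,
    new ContainerExecCreateParameters
    {
        Cmd = new[] { "sh", "-c", "node /data/update-settings.js" }, // illustrative command
        AttachStdout = true
    });
await docker.Containers.StartContainerExecAsync(exec.ID, CancellationToken.None);
```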

Now we have secured our Node-RED editor with a custom username and password.

Keeping nodes and flows up to date

Now we need a way to provide our cloud Node-RED instance with a custom node set and keep it up to date. We already have all the tools for that. Custom nodes are stored in a separate Git repository, and a few options are available:

  1. Execute npm install <git repo url> inside the Node-RED userDir (/data for the nodered/node-red-docker container)
  2. Copy the custom nodes to /data/nodes inside the container.

Node-RED flows can be synchronized in a similar way. By default, the Node-RED Docker container stores flow data in /data/flows.json. The flows configuration file is set using an environment parameter (FLOWS), which can be changed by setting environment variables in the docker-compose configuration section:
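For example (a sketch; in our setup this lives in the compose section of the Docker extension settings rather than in a standalone docker-compose.yml):

```yaml
node-red:
  image: nodered/node-red-docker
  ports:
    - "80:1880"
  environment:
    - FLOWS=/data/flows.json
```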

Using this approach, we can put the nodes and flows files under version control inside the container and synchronize them with a remote repository.

All commands can be executed via Docker Remote API in the same way, as described in the previous section.

Each time we need to update our nodes, we just call the Docker API and pull the updates from the repository. We can also back up our flows.json by committing and pushing it to the repository.

As an improvement, we could create a Git hook to update our Node-RED instances whenever changes are pushed to our nodes repository. But that is out of the scope of this post.

Summary

In this post, we made a short overview of how to automate your deployments on the Azure cloud with Azure Resource Manager and the Azure SDK for .NET. In our example, we set up a Node-RED Docker container in the cloud, but all the steps mentioned are applicable to any similar Docker deployment.