
Microsoft LEAP: Design with Best Practices

All good things come to an end and LEAP is no exception. It was a great week full of interesting and enlightening sessions. Day 5 was a fitting end to the week with its focus on Design with Best Practices.


Let’s get to the sessions. The day began with a keynote by Derek Martin on the topic of Design for Failure. Derek is a Principal Program Manager and spoke about what not to do when designing a product, drawing on the experience of building Azure and how the lessons learned there can be used to understand and anticipate challenges.


The focus was given to managing unexpected incidents not only in the application environment but also in the cloud as a whole.

Brian Moore took over with his keynote on Design for Idempotency – DevOps and the Ultimate ARM Template. He is a Principal Program Manager for Azure. The focus of the session was on creating reusable Azure Resource Manager (ARM) templates and on language techniques to optimize deployments on Azure. The intention of these reusable templates is to introduce a “config as code” approach to DevOps.

He took his time to explain “the Ultimate ARM Template” and its key characteristics. Brian explained that the Ultimate ARM Template uses the template language’s constructs to get the most impact out of minimal code: it aims to simplify your deployment work and offers a variety of benefits to its users. To keep ARM deployments efficient, he also called out the practices to avoid. In short, it is a template that gives you the most effective options while lacking nothing essential.
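
To make the “config as code” idea concrete, here is a minimal sketch of my own (not from the session): it deploys a small inline template with the Azure SDK for Python. The subscription ID, resource group, and storage account name are placeholders, and the azure-identity and azure-mgmt-resource packages are assumed.

```python
# Minimal sketch: idempotent "config as code" deployment with the Azure SDK for Python.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

template = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "resources": [{
        "type": "Microsoft.Storage/storageAccounts",
        "apiVersion": "2021-04-01",
        "name": "leapdemostorage01",   # hypothetical storage account name
        "location": "westeurope",
        "sku": {"name": "Standard_LRS"},
        "kind": "StorageV2",
    }],
}

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Running this twice should leave the environment unchanged the second time:
# ARM deployments are declarative, which is the idempotency the session title refers to.
client.deployments.begin_create_or_update(
    "demo-rg",             # hypothetical resource group
    "ultimate-arm-demo",   # deployment name
    {"properties": {"template": template, "mode": "Incremental"}},
).result()
```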


Alexander Frankel, Joseph Chan, and Liz Kim conducted their joint keynote on Architecting a well-governed environment using Azure Policy and Governance after the morning coffee break.

They illustrated real-life examples of how large enterprises scale their Azure applications with Azure Governance services like Azure Policy, Blueprints, Management Groups, Resource Graph and Change History.

The next session was on Monitor & Optimize your cloud spend with Azure Cost Management and was conducted by Raphael Chacko. Raphael is a Principal Program Manager at Azure Cost Management.

The keynote’s main focus was optimizing expenditure on Azure and AWS through cost analysis, budgeting, cost allocation, optimization, and purchase recommendations. The main features of Azure Cost Management were highlighted.


It was right back to business after a quick lunch break. Stephen Cohen took over with his session on Decomposing your most complex architecture problems.

Most of the session was spent on analyzing and coming up with answers to complex architecture-related problems raised by participants. It was a very practical session and addressed many commonly faced issues.


The next session was conducted by Mark Russinovich, the CTO of Microsoft Azure.


Day 5 had a shorter agenda and concluded with Derek Martin returning for another keynote on Networking Fundamentals. Derek spoke about Azure networking primitives and how they can be used to strengthen networking security for any type of organization running Azure environments. The primitives are flexible enough that newer, modern approaches to governance and security protocols can be adopted easily.
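
To give a feel for what an Azure networking primitive looks like in code, here is a sketch of my own (not from the keynote), assuming the azure-mgmt-network package and placeholder names: it creates a network security group that only allows HTTPS inbound.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

network = NetworkManagementClient(DefaultAzureCredential(), "<subscription-id>")

# A minimal network security group with a single inbound rule for HTTPS traffic.
network.network_security_groups.begin_create_or_update(
    "demo-rg",   # hypothetical resource group
    "web-nsg",   # hypothetical NSG name
    {
        "location": "westeurope",
        "security_rules": [{
            "name": "allow-https-inbound",
            "priority": 100,
            "direction": "Inbound",
            "access": "Allow",
            "protocol": "Tcp",
            "source_address_prefix": "*",
            "source_port_range": "*",
            "destination_address_prefix": "*",
            "destination_port_range": "443",
        }],
    },
).result()
```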

And that was it. The completion of a great week of LEAP. I hope all of you enjoyed this series of articles and that they gave you some level of understanding about the innovations being done in the Azure ecosystem.


Microsoft LEAP: Design for Efficiency, Operations and DevOps

I just left Microsoft Headquarters after another interesting day at LEAP. Today’s topics were engaging, especially DevOps, because of all the innovations being made in that space. I’m actually a little emotional that there’s just one more day remaining.

Jason Warner began the day’s sessions with his keynote on From Impossible to Possible: Modern Software Development Workflows. As the CTO of GitHub, Jason shared much of his experience regarding the topic.

The underlying theme of the keynote was creating an optimal workflow that leads to the success of both the development process and the team. He pointed out the inevitable nature of modernization and said it’s important that a company does not become mediocre or get worse.


Before he went on to the topic of the day, Jason spoke about himself and shared some valuable history and information about his life. He also gave the audience brief insight into the capabilities of GitHub and the success it has managed to achieve so far.

According to Jason, proper modernisation requires a workflow that consists of the following: automation, intelligence, and open source. Next, he pointed to GitHub’s ability to produce the best workflows for improving company efficiency. It didn’t end there, as he continued by talking about the benefits of workflow inflation.

Abel Wang continued with the next session and his keynote was on Real World DevOps. Abel is a Principal Cloud Advocate for Azure.
This session was truly valuable as it covered the full process of a production SDLC and many other important areas such as infrastructure, DNS, web front ends, mobile apps, and Kubernetes APIs.

At the start of his presentation, Abel Wang introduced us to his team and gave a rundown of some vital information about DevOps. Why do you need DevOps? Well, it provides solutions, supports any language, and boasts a three-stage conversation process for results.

After a much-needed coffee break, we embarked on the next session on Visual Studio and Azure, the peanut butter and jelly of cloud app devs. The speaker, Christos Matskas, is a Product Marketing Manager at Microsoft.

The session focused on explaining how well Azure and Visual Studio support development, live debugging, and zero-downtime deployments. Christos also spoke about leveraging integrated Azure tools to modernize .NET applications.

The Visual Studio team is committed to providing developers with the best tools available. The product supports all types of developers and redefines their coding experience. The great thing about Visual Studio is that the team doesn’t rest on its laurels and is constantly in search of innovation. It even comes with a Live Share feature that allows developers to share content with each other in real time.

Evgeny Ternovsky and Shiva Sivakumar jointly conducted the next session on Full stack monitoring across your applications, services, and infrastructure with Azure Monitor. Many demonstrations were performed to show the capabilities of Azure Monitor.
The demos included monitoring VMs, containers, other Azure services, and applications. In addition, setting up predictive monitoring for detecting anomalies and forecasting was also discussed.

Azure has a full set of services to oversee all your security and management needs. The tools you need are built into the platform, reducing the need for third-party integration. On top of that, Azure has developed a set of newer features: partner integrations, container monitoring everywhere, new pricing options, and network issue troubleshooting.
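
For a rough idea of how that telemetry can be queried programmatically (my own sketch, not from the demos), the azure-monitor-query package runs a Log Analytics query against a workspace; the workspace ID and the query are placeholders.

```python
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Count application requests per hour over the last day.
response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query="requests | summarize count() by bin(timestamp, 1h)",
    timespan=timedelta(days=1),
)

for table in response.tables:
    for row in table.rows:
        print(row)
```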


After lunch, I joined the alternative session, which was on Artificial Intelligence and Machine Learning. The session was about using Azure Cognitive Services, with optimized scaling, to improve the customer care services provided by organizations such as telecoms and telemarketers.
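
To give a flavour of what that might look like in code (a hedged sketch of my own, not the presenter’s), the azure-ai-textanalytics package can score the sentiment of support transcripts, which a customer care team could use to prioritise follow-ups; the endpoint and key are placeholders.

```python
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<resource-name>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<api-key>"))

transcripts = [
    "The agent solved my billing issue in five minutes, great service.",
    "I have been on hold for an hour and nobody can help me.",
]

# Sentiment scores can drive routing or prioritisation in a customer care workflow.
for doc in client.analyze_sentiment(transcripts):
    print(doc.sentiment, doc.confidence_scores.positive)
```
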
Then we were back at another joint session by Satya Srinivas Gogula and Vivek Garudi and the keynote was on the topic Secure DevOps for Apps and Infrastructure @ Microsoft Services.


The speakers talked about the wide adoption of DevOps practices and Open Source Software (OSS) and the vulnerabilities they introduce. The latter part of the session focused on best practices for secure DevOps with Azure.

The next keynote was on Transforming IT and Business operations with real-time analytics: From Cloud to the intelligent edge. It was jointly delivered by Jean-Sébastien Brunner and Krishna Mamidipaka and focussed on the challenges faced by IT and Business teams trying to understand the behavior of applications.
The speakers explained the benefits of Azure Stream Analytics to ingest, process, and analyze streaming data in order to enable better analytics.

A good example of Azure at its best is its use for earthquake and storm prediction.
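
Stream Analytics jobs typically read from an ingestion service such as Event Hubs. As a small sketch of my own (not the speakers’ material), assuming the azure-eventhub package and a placeholder connection string, this is roughly how telemetry lands in the stream a job would then query.

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholders: supply your own Event Hubs connection string and hub name.
producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="telemetry")

batch = producer.create_batch()
batch.add(EventData(json.dumps({"deviceId": "sensor-42", "temperature": 21.7})))
producer.send_batch(batch)   # a Stream Analytics job can consume this stream downstream
producer.close()
```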

Taylor Rockey concluded the day with his keynote on MLOps: Taking machine learning from experimentation to production. MLOps is the integration of machine learning and DevOps. It has proven to have numerous benefits, including scalability, monitoring, repeatability, accountability, and traceability, and the platform has impressive features that make it a first choice for many developers.
The problem that many organizations face is a lack of proper understanding and tooling for using machine learning in production applications. The session focussed on putting machine learning into production with Azure Machine Learning and Azure DevOps.
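
As a rough sketch of the experimentation-to-production flow (my own, assuming the v1 azureml-core SDK, a workspace config.json, and a locally trained model.pkl file), logging a run and registering the model is what later makes deployments traceable from Azure DevOps pipelines.

```python
from azureml.core import Workspace, Experiment
from azureml.core.model import Model

ws = Workspace.from_config()                       # reads config.json downloaded from the portal
experiment = Experiment(workspace=ws, name="churn-experiment")   # hypothetical experiment name

run = experiment.start_logging()
run.log("accuracy", 0.93)                          # metrics tracked per run for repeatability
run.upload_file("outputs/model.pkl", "model.pkl")  # assumes a locally trained model file
run.complete()

# Registering the model makes it traceable and deployable from a CI/CD pipeline.
Model.register(workspace=ws, model_path="model.pkl", model_name="churn-model")
```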

And that’s a wrap. Don’t forget to tune into tomorrow’s article.


Microsoft LEAP: Design for Performance and Scalability

I’m at Microsoft for LEAP and we just wrapped up another day of interesting discussions. If you missed my update regarding day 1, make sure to have a look at it here.

Today’s theme was Design for Performance and Scalability. Many legacy applications are being replaced because they are not performance-oriented and scalable at their core. This is something that has to be introduced right from the design stage. Today’s speakers covered many of the core areas which need to be optimized to enable both performance and scalability.


Vamshidhar Kommineni took us right from breakfast into using Azure Storage for the data storage needs of Azure applications and how it can be used to enhance performance. Vamshidhar spoke about the innovations made in the storage services layer in 2019 and also briefly shared the plans for 2020.
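
For context, this is roughly what using Azure Storage from application code looks like: a minimal sketch of my own, assuming the azure-storage-blob package, a placeholder connection string, and an existing container named app-data.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string; in production prefer managed identities over account keys.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("app-data")   # hypothetical, pre-created container

# Upload a small blob and read it back.
container.upload_blob(name="hello.txt", data=b"hello, Azure Storage", overwrite=True)
print(container.download_blob("hello.txt").readall())
```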

Corey Newton-Smith was next and focused on IoT applications. Corey has been with Microsoft since 2003 and currently functions as the Principal Group PM for IoT Central. She shared the current state of IoT and Microsoft’s plans for the near future, highlighting their vision.

Corey explained that Azure IoT represents a new era of digitization across industries, an innovation that lets brands do so much more. The objective behind the platform is to enable a digital feedback loop. She discussed how much Microsoft has done to improve IoT: it is now capable of bidirectional communication, can be scaled to suit enterprises of any size, and provides end-to-end security. Microsoft is also planning improvements that will support scenarios that are not currently feasible in the cloud. What’s more, everything can be tailored to the exact solution you need.
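
The device-to-cloud side of that feedback loop can be pictured with a small sketch of my own (not from the talk) using the azure-iot-device package and a placeholder device connection string; cloud-to-device messages and twin updates flow in the other direction.

```python
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder device connection string from IoT Hub or IoT Central.
client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")
client.connect()

# Device-to-cloud telemetry; the platform also supports commands back to the device.
client.send_message(Message('{"temperature": 22.5}'))
client.disconnect()
```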

The next session began after some light mingling during the coffee break. It was back to business with Jose Contreras and his keynote on decomposing Monoliths into Microservices.


Enterprise applications have made a gradual transition from being monolithic to being microservice-based. Jose explained strategies that can help with this process, focussing on memory, computing, and schema. He then discussed migrating existing monolithic applications into microservices without affecting ongoing operations, covering the design, execution, and DevOps aspects.

Jose spoke on a number of factors that prove the usefulness of transforming a monolith into microservices. As part of his talk, he highlighted the factors to consider when adopting this approach, the differences between a private and a shared cache, and considerations for using caching.
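
To illustrate the shared-cache point, a shared cache lives outside any single service instance, so all replicas see the same entries. Here is a minimal cache-aside sketch of my own, assuming a Redis-compatible cache such as Azure Cache for Redis; the host, key, and load_from_database helper are hypothetical.

```python
import json
import redis

# Placeholder connection details for a shared cache such as Azure Cache for Redis.
cache = redis.Redis(host="<cache-name>.redis.cache.windows.net",
                    port=6380, ssl=True, password="<access-key>")

def load_from_database(customer_id: str) -> dict:
    # Hypothetical helper standing in for a real database query.
    return {"id": customer_id, "name": "Ada"}

def get_customer(customer_id: str) -> dict:
    """Cache-aside: try the shared cache first, fall back to the database."""
    cached = cache.get(f"customer:{customer_id}")
    if cached is not None:
        return json.loads(cached)
    record = load_from_database(customer_id)
    cache.set(f"customer:{customer_id}", json.dumps(record), ex=300)  # 5-minute TTL
    return record
```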

Interestingly, he then moved on to Azure Compute. He listed the available services and gave detailed information on their hosting models, DevOps criteria, scalability criteria, and other selection criteria.

Clemens Vasters’s keynote focussed on how messaging is shaping enterprise applications. Importantly, he spoke on how Microsoft Azure could make all of it better.
He is a Product Architect at Microsoft and highlighted how open standards and messaging can be used to move applications to the cloud. Some of the areas he touched on were Event Hubs, Service Bus, Event Grid, CNCF CloudEvents, and Relay with WebSockets.

According to him, users have a series of options for connecting a range of devices, and ease of connectivity is guaranteed whether at the intelligent edge or in the intelligent cloud. These options apply at varying scales and work well with telco 4G/5G networks. Beyond that, cloud messaging services can be applied to automotive and smart-city scenarios, support industrial automation, and speed up processes.

Clemens continued by clearing the air on the standards the cloud services operate on: everything is built according to open standards and designed to be secure. Such was the level of quality on display.
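
To give a flavour of the messaging services he covered (a sketch of my own, not Clemens’ material), sending and receiving through a Service Bus queue with the azure-servicebus package looks roughly like this; the connection string and queue name are placeholders.

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholder connection string and queue name.
with ServiceBusClient.from_connection_string("<service-bus-connection-string>") as client:
    with client.get_queue_sender("orders") as sender:
        sender.send_messages(ServiceBusMessage('{"orderId": 1001}'))

    with client.get_queue_receiver("orders", max_wait_time=5) as receiver:
        for message in receiver:
            print(str(message))
            receiver.complete_message(message)  # settle so the message is not redelivered
```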

After a quick lunch break, an alternative session was conducted for those who were already familiar with the campus. This session on Messaging Guidance was conducted by Francis Cheung and was related to session 4. However, Francis focused more on how we could assess if some of those tools were a good fit for our projects. He also touched on managing and versioning message schemas.

Next was David Barkol’s session focusing on Designing an Event-driven Architecture on Azure through a workshop approach. He challenged attendees to solve problems related to messaging in a group setting. As a Principal Technical Specialist for Azure, David used his vast experience to reinforce the existing knowledge of attendees about Azure messaging services. He really had a lot of interesting things to say.

Using a few simple statements, he was able to highlight the customer’s problems, identify their needs, and show how to solve them with an event-driven architecture. Such an architecture eliminates bottlenecks and allows information to flow more easily, and Azure messaging services address the demands the customers identified. He also mentioned that Event Hubs Geo-DR provides a backup in a secondary region.
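
To picture the event-driven pattern, here is a small sketch of my own (not from the workshop) that publishes a custom event to an Event Grid topic with the azure-eventgrid package; the topic endpoint, key, and event type are placeholders. Any number of subscribers can then react to the event without the publisher knowing about them.

```python
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridPublisherClient, EventGridEvent

# Placeholder endpoint and access key for a custom Event Grid topic.
client = EventGridPublisherClient(
    "https://<topic-name>.<region>-1.eventgrid.azure.net/api/events",
    AzureKeyCredential("<topic-access-key>"))

client.send(EventGridEvent(
    subject="orders/1001",
    event_type="Demo.OrderPlaced",   # hypothetical event type
    data={"orderId": 1001, "total": 42.0},
    data_version="1.0"))
```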


Derek Li conducted his keynote next, focussing on serverless platforms based on Azure Functions and Logic Apps. Derek is a Senior Program Manager. His keynote covered how serverless technologies have changed the way applications are built, and how Azure Functions and Logic Apps can be used to speed up delivery.
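
As a minimal illustration of the serverless model, here is a sketch of my own using the newer Python programming model for Azure Functions (it runs under the Functions runtime rather than as a plain script): an HTTP-triggered function that scales on demand and is billed per execution.

```python
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="hello")
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # Serverless HTTP endpoint: no server to manage, scales with incoming requests.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")
```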

The last session was after a very welcome Cola Zero break. It refreshed us for Rahul Kalaya’s keynote on deriving insights from IoT Data with Azure Time Series Insights.
Rahul spoke about design choices, principles, and lessons learned with regard to maintaining the highest possible uptime of cloud databases and servers. Many stories from his experiences with Azure SQL made the keynote even more interesting.
And that was it. The completion of a day of meaningful sessions.

I look forward to sharing my next article on Day 3: Designing for Availability and Recoverability.


Microsoft LEAP: Design for Security

This year is already off to a fantastic start! I am so excited to be here at the LEAP conference at the Microsoft Headquarters in Redmond, near Seattle. LEAP is a perfect way for me to keep up to date with new technology and how to apply it here at Gunnebo.


The focus of the day was Design for Security. The threat of cyber attacks and hackers is as pressing as ever, so the need for cloud security is crucial. Although technological advancement has driven an evolution in cloud security over the years, keeping the right level of visibility and control over applications is still a challenge for many organizations, which makes finding a balance between cloud security and ease of use a hard nut to crack. Today’s program discussed how Azure copes with this issue, and the speakers introduced the new and updated features Azure has recently brought out to improve the security of cloud applications.


The highlight of today’s program consisted of five great keynotes. The first on the list was Scott Guthrie, the Executive Vice President for Microsoft’s cloud business. He is an incredible orator and kept the audience thrilled with his in-depth explanations of how Azure helps organizations deliver product innovation and better customer experiences securely. It was frankly impossible to have been there without taking away more than a few vital points and a better understanding of Azure.


Then Stuart Kwan, a Principal Program Manager at Microsoft, was next in line. He backed up Scott Guthrie with a great keynote on how authentication works in today’s applications. Stuart has a wealth of experience under his belt, having worked on identity and security-related technologies since joining Microsoft in 1996; few people have more experience in that field. He is the guy to listen to on topics like Active Directory Federation Services and Windows Identity Foundation. The main focus was on OAuth, OpenID Connect, and SAML. OpenID Connect is a simple identity layer built on top of the OAuth 2.0 protocol. OAuth 2.0 defines mechanisms to obtain and use access tokens to access protected resources, but it does not define a standard method for providing identity information. OpenID Connect implements authentication as an extension of the OAuth 2.0 authorization process. It includes information about the end user in the form of an id_token that verifies the identity of the user and provides basic profile information about them.
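
To make the id_token part concrete, a relying party validates the token’s signature and claims before trusting the identity. Below is a hedged sketch using the PyJWT library; the tenant, client ID, and JWKS URL are placeholders for whatever the identity provider publishes in its discovery document.

```python
import jwt  # PyJWT
from jwt import PyJWKClient

ISSUER = "https://login.microsoftonline.com/<tenant-id>/v2.0"   # placeholder issuer
AUDIENCE = "<client-id>"                                         # placeholder app client ID

# The signing keys are advertised in the issuer's JWKS document.
jwks = PyJWKClient("https://login.microsoftonline.com/<tenant-id>/discovery/v2.0/keys")

def validate_id_token(id_token: str) -> dict:
    signing_key = jwks.get_signing_key_from_jwt(id_token)
    # Verifies signature, expiry, audience, and issuer before the claims are trusted.
    return jwt.decode(id_token, signing_key.key, algorithms=["RS256"],
                      audience=AUDIENCE, issuer=ISSUER)
```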

When Yuri Diogenes took control of the stage, everyone knew that his talk would be primarily based on how cloud security is evolving and becoming more mature. Yuri is a Senior Program Manager at Microsoft for Cloud and AI Security.


Before Yuri moved on to talk about Azure security, he provided some insights into the problematic scenarios that many companies find themselves in. According to him, security hygiene has to be taken seriously, or any cloud-based infrastructure will suffer; organizations have to protect themselves against modern-day threats. He carefully explained that Azure Security Center is a unified infrastructure security management system that strengthens the security posture of your data centers and provides advanced threat protection across your hybrid workloads in the cloud – whether they’re in Azure or not – as well as on-premises. In simple terms, Azure Security Center is the new security hygiene you need.

Yuri went further to explain the benefits of Azure Security Center and Azure Sentinel. They provide all-round security and also afford a degree of customizability. According to him, Azure is capable of protecting Linux and Windows VMs from threats, protecting cloud-native workloads, detecting fileless attacks, providing cloud workload protection for containers, and so on.


The next person on stage was Nicholas DiCola, a Security Jedi at Microsoft. He thrilled the audience with his discussion of Azure Sentinel, explaining how Sentinel functions as a cloud-native SIEM providing intelligent security analytics for an entire organization. It offers limitless cloud speed, can be used at any scale, provides its users with faster threat protection, and easily integrates with existing tools.

According to him, Azure Sentinel was designed to collect data for visibility, detect threats through analytics and hunting, investigate incidents, and respond to them automatically. Sentinel draws its data from numerous sources such as the Linux agent, the Windows agent, cloud services, custom apps, appliances, Azure services, and so on. After collating all the necessary data, its analytics scan for possible threats, and you can then monitor your data and activity.

Last but not least, we had a session with Sumedh Barde and Narayan Annamalai. They opened a fascinating discussion on how to secure certificates, connection strings, and encryption keys, as well as on new networking capabilities of Azure. Sumedh Barde is a Program Manager on the Azure Security team, and Narayan leads the SDN product management group in Microsoft Azure, which focuses on virtual networks, load balancing, and network security.

These two gave us great insight into Azure Key Vault. They explained how it functions as a tool for securely storing and accessing secrets. From what I learned at the conference, the secret to tightly controlling and securing access to things like API keys, passwords, and certificates is to use a vault: a vault is your very own logical group of secrets.
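
In practice that looks something like the following sketch (my own, assuming the azure-identity and azure-keyvault-secrets packages and a placeholder vault URL): the application fetches the secret at runtime instead of keeping it in configuration files or source code.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL; access is granted through Azure AD rather than hard-coded keys.
client = SecretClient(vault_url="https://<vault-name>.vault.azure.net",
                      credential=DefaultAzureCredential())

client.set_secret("database-connection-string", "Server=...;Password=...")  # hypothetical secret
retrieved = client.get_secret("database-connection-string")
print(retrieved.name)  # the value is read at runtime and never stored in source control
```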

It was a great day here in Redmond and an excellent opportunity to brush up my knowledge of cloud security. I’m actively looking forward to tomorrow.


Microsoft Power BI: Dashboard in a Day

What better way to start the week than by crunching and visualizing data? I joined the Dashboard in a Day workshop by Microsoft and Random Forest, a one-day hands-on workshop designed for business analysts.


I have to say that I really enjoy digging into new tools and learning about new technology. Even if I am probably not going to be working directly with PowerBI, it makes my job easier by understanding what is possible and the boundaries of the tool.

Kicking off with a short introduction (and the mandatory Microsoft advertisements), we dove straight into the PowerBI Desktop application itself, learning how to connect to, import and transform data from various sources. After preparing the data, reformatting and splitting fields, we moved on to exploring data with powerful visualization tools.


The exercises were quite comprehensive, and at some point I went rogue and chose to implement some of my own visualizations instead.

The event was well organized and planned, and the assets were categorized in a way that made it easy to identify the specific assets that suited our needs. The attendee content consisted of lab manuals and datasets that were available for download on MPN without requiring an MPN ID.

Towards the end of the day, the guys from Random Forest made sure that we had a good working knowledge of and familiarity with Power BI and could answer questions about the workshop or Power BI in general. It was a tremendous learning experience, and I can’t wait to try out these technologies! They even spent the last couple of hours of the day supporting and guiding us through our own datasets. I brought some statistics from one of our Business Units, and it was quite impressive how I could visualize and interactively navigate through the data.

All in all an exciting workshop, and I look forward to playing more with PowerBI in the future. If you have any questions or great ideas, feel free to contact me at bjorn.nostdahl@gunnebo.com 🙂