Connect to AKS cluster nodes

Sometimes you need to access an AKS worker node to troubleshoot an issue, but how do you do that with AKS?

Run the following command to list your nodes:

kubectl get nodes

The output lists the worker nodes in your cluster.

To establish a connection to a node, run a container image on it with the kubectl debug command. The following command starts a privileged container on your node and connects to it.

kubectl debug node/<node-name-you-wish-to-connect> -it --image=mcr.microsoft.com/dotnet/runtime-deps:6.0
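Once the debug pod is running, the node's root filesystem is mounted at /host inside the debug container, so you can chroot into it and work as if you were logged on to the node itself:

chroot /host
# you are now on the node's filesystem; for example, check the kubelet logs
journalctl -u kubelet --no-pager | tail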

Regards

Osama

Storing Container Data in Azure Blob Storage

This time: how to store your container data in Azure Blob Storage 👍

Let’s start

Configuration

  • Obtain the Azure login credentials
az login
  1. Copy the code provided by the command.
  2. Open a browser and navigate to https://microsoft.com/devicelogin.
  3. Enter the code copied in a previous step and click Next.
  4. Use the login credentials from the lab page to finish logging in.
  5. Switch back to the terminal and wait for the confirmation.

Storage

  • Find the name of the Storage account
 az storage account list | grep name | head -1

Copy the name of the Storage account to the clipboard.
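If you prefer to let the CLI do the filtering, the Azure CLI also supports JMESPath queries; assuming the account you want is the first one returned, something like this prints just its name:

az storage account list --query "[0].name" --output tsv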

  • Export the Storage account name
 export AZURE_STORAGE_ACCOUNT=<COPIED_STORAGE_ACCOUNT_NAME>
  • Retrieve the Storage access key
az storage account keys list --account-name=$AZURE_STORAGE_ACCOUNT

Copy the key1 “value” for later use.

  • Export the key value
export AZURE_STORAGE_ACCESS_KEY=<KEY1_VALUE>
  • Install blobfuse
sudo rpm -Uvh https://packages.microsoft.com/config/rhel/7/packages-microsoft-prod.rpm
sudo yum install blobfuse fuse -y
  • Modify the fuse.conf configuration file
sudo sed -ri 's/# user_allow_other/user_allow_other/' /etc/fuse.conf

Use the Azure Blob Storage container

  • Create necessary directories
sudo mkdir -p /mnt/Osama /mnt/blobfusetmp
  • Change ownership of the directories
sudo chown cloud_user /mnt/Osama/ /mnt/blobfusetmp/
  • Mount the Blob Storage from Azure
blobfuse /mnt/Osama --container-name=website --tmp-path=/mnt/blobfusetmp -o allow_other
  • Copy the files you want into the Blob Storage container, for example website files.
 cp -r ~/web/* /mnt/Osama/
  • Verify the copy worked
ll /mnt/Osama/
  • Verify the files made it to Azure Blob Storage
az storage blob list -c website --output table
  • Finally, run a Docker container that serves content from the Azure Blob Storage mount
docker run -d --name web1 -p 80:80 --mount type=bind,source=/mnt/Osama,target=/usr/local/apache2/htdocs,readonly httpd:2.4
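As a quick sanity check (assuming your web files include an index page), you can confirm httpd is serving content from the Blob-backed mount:

curl -s http://localhost/ | head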

Enjoy 🎉😁

Osama

Setting up a Jenkins-Based Continuous Delivery Pipeline with Docker

As an important step in agile development, continuous integration is designed to maintain high quality while accelerating product iteration. Every time the code is updated, an automated test checks the code and its function validity; the code can be delivered and deployed only after it passes. This post describes how to combine Jenkins, one of the most popular integration tools, with Alibaba Cloud Container Service to realize automated testing and automatic image building and pushing.


Deploying Jenkins Applications and the Slave Nodes

1. Create a Jenkins orchestration template.

Create a new template and create the orchestration based on the following content.

jenkins:
  image: 'registry.aliyuncs.com/acs-sample/jenkins:latest'
  ports:
    - '8080:8080'
    - '50000:50000'
  volumes:
    - /var/lib/docker/jenkins:/var/jenkins_home
  privileged: true
  restart: always
  labels:
    aliyun.scale: '1'
    aliyun.probe.url: 'tcp://container:8080'
    aliyun.probe.initial_delay_seconds: '10'
    aliyun.routing.port_8080: jenkins
  links:
    - slave-nodejs
slave-nodejs:
  image: 'registry.aliyuncs.com/acs-sample/jenkins-slave-dind-nodejs'
  restart: always
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
  labels:
    aliyun.scale: '1'

2. Use the template to create Jenkins applications and slave nodes.

You can also directly use a Jenkins sample template provided by Alibaba Cloud Container Service to create Jenkins applications and slave nodes.


3. After the successful creation, Jenkins applications and slave nodes will be displayed in the service list.


4. After opening the access endpoint provided by the Container Service, you can use the Jenkins application deployed just now.


Realizing Automatic Test and Automatic Build and Push of Image

Configure the slave container as the slave node of the Jenkins application.

Open the Jenkins application and enter the System Settings interface. Select Manage Node > Create Node, and configure corresponding parameters. See the figure below.


Note: the Label is the unique identifier of the slave. The slave container and the Jenkins container both run on the Alibaba Cloud platform, so you can fill in a container node IP address that is inaccessible from the Internet to isolate the test environment.


When adding the Credential, use the jenkins account and password (the initial password is jenkins) from the Dockerfile used to create the slave-nodejs image. The image's Dockerfile address is HERE.

1. Create a project to implement the automatic test.

  1. Create an item and choose to build a software project of free style.
  2. Enter the project name and select a node for running the project. In this example, enter the slave-nodejs-ut node created above.

Configure the source code management and the code branch. In this example, GitHub manages the source code.


Configure the build trigger. In this example, project execution is triggered automatically by combining GitHub webhooks and services.


Add the Jenkins service hook to GitHub to implement automatic triggering.

Click the Settings tab on the GitHub project homepage, then click Webhooks & services > Add service and select Jenkins (Git plugin). Enter ${Jenkins IP}/github-webhook/ in the Jenkins hook URL dialog box, for example:

http://jenkins.cd****************.cn-beijing.alicontainer.com/github-webhook/

Add a build step of the Execute shell type and write a shell script to execute the test.


The command in this example is as follows.

pwd
ls
cd chapter2
npm test

Create a project to automatically build and push images.

  1. Create an item and choose to build a software project of free style.
  2. Enter the project name and select a node for running the project. In this example, enter the slave-nodejs-ut node created above.
  3. Configure the source code management and code branch. In this example, use GitHub to manage source codes.
  4. Add the following trigger and set it so the image is built automatically only after the unit test succeeds.

Write shell scripts for building and pushing images.


The command in this example is as follows.

cd chapter2
docker build -t registry.aliyuncs.com/qinyujia-test/nodejs-demo .
docker login -u ${yourAccount} -p ${yourPassword} registry.aliyuncs.com
docker push registry.aliyuncs.com/qinyujia-test/nodejs-demo

Automatically Redeploy the Application

Deploy the application for the first time

Use the orchestration template to deploy the image created above to the Container Service and create the nodejs-demo application.

Example

express:
  image: 'registry.aliyuncs.com/qinyujia-test/nodejs-demo'
  expose:
    - '22'
    - '3000'
  restart: always
  labels:
    aliyun.routing.port_3000: express

1. Select the nodejs-demo application you just created, and create the trigger.


Add a line to the shell script you wrote in Realizing Automatic Test and Automatic Build and Push of Image. The address is the trigger link created above.

curl 'https://cs.console.aliyun.com/hook/trigger?triggerUrl=***==&secret=***'

Change the command in the example from Realizing Automatic Test and Automatic Build and Push of Image as follows.

cd chapter2
docker build -t registry.aliyuncs.com/qinyujia-test/nodejs-demo .
docker login -u ${yourAccount} -p ${yourPassword} registry.aliyuncs.com
docker push registry.aliyuncs.com/qinyujia-test/nodejs-demo
curl 'https://cs.console.aliyun.com/hook/trigger?triggerUrl=***==&secret=***'

After pushing the image, Jenkins automatically triggers redeployment of the nodejs-demo application.

Configure Email Notification for the Results

If you want to send the unit test or image configuration results to relevant developers or project execution initiators through email, perform the following configurations.

On the Jenkins homepage, click System Management > System Settings, and configure a Jenkins system administrator email.


Install the Extended Email Notification plugin, configure SMTP server and other relevant information, and set the default recipient list. See the figure below.


The above example shows the parameter settings of the Jenkins application system. The following example shows the relevant configurations for Jenkins projects whose results are to be pushed through email.

1. Add post-building operation steps in the Jenkins project, select Editable Email Notification, and enter a recipient list.


2. Add a mailing trigger.


Cheers

Osama

Create a Serverless Website with Alibaba Cloud Function Compute

According to Wikipedia, serverless computing is a cloud computing execution model in which the cloud provider runs the server and dynamically manages the allocation of machine resources. Pricing is based on the actual amount of resources consumed by an application, rather than on pre-purchased units of capacity.

Today I will show you an example of how to create a serverless website, but this time not on AWS, Azure, or OCI, but on Alibaba Cloud.

Create a Function Compute Service

Go to the console page and click through to Function Compute.

Click the add button beside Services.

In the Service slide out, give your service a name, an optional description, and then slide open the Advanced Settings.

In Advanced Settings you can grant access for Functions to the Internet, to VPC resources, and you can attach storage and a log service to a Function. You can also configure roles.

For our tutorial, we will need Internet access so make sure this configuration is on.

We will leave VPC and Log Configs as they are.

In the Role Config section, select Create New Role, and in the dropdown list pick AliyunOSSReadOnlyAccess as we will be accessing our static webpages from an Object Storage Service bucket.

Click Authorize.

You will see a summary of the Role you created.

Click Confirm Authorization Policy.

You have successfully added the Role to the Service.

Click OK.

You will see the details of the Function Compute Service you just created.

Now let’s create a Function in the Service. Click the add button next to Functions.

You will see the Create Function process. The first part of the process is Function Template.

There are many Function Templates available, including an empty Function for writing your own bespoke Functions.

Alibaba Cloud-supplied Template Functions are very useful as they have relevant method invocation and demo code for getting started quickly with Function Compute.

Let's choose the flask-web Function, written in Python 2.7.

Click Select.

We are now at the Configure Triggers section of creating a Function.

Select HTTP Trigger from the dropdown list. Give the Trigger a name and choose Authorization details (anonymous does not require authorization).

Choose your HTTP methods and click Next. We are going to build a simple web-form application so we will need both the GET and POST HTTP methods.

Now we arrive at the Configure Function Settings.

Give the Function a name then scroll down to Code details.

We’ll leave the supplied code for now. Scroll down to below the code sample.

You will see Environment Variable input options and Runtime Environment details.

Click Next.

Click Next at Configure Function Permissions.

Verify the Configuration details and click Create.

You will arrive at the Function’s IDE. Here you can enter new code, edit the code directly, upload code folders, run, test, and fix your code.

Scroll down.

Copy the URL as we will need to add this to our static webpages so they can connect to our Function Compute Service and Function.
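As a quick sanity check, you can also hit the Function URL from a terminal. The URL below is only a placeholder for the one you copied from the console:

curl -i 'https://<account-id>.<region>.fc.aliyuncs.com/2016-08-15/proxy/<service-name>/<function-name>/'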

Set Up and Configure an OSS Bucket

Click through to Object Storage Service on the Products page.

If you haven’t yet activated Object Storage Service, go ahead and activate it. In the OSS console, click Create Bucket.

Choose a name for the OSS Bucket and pick the region – you cannot change the region later. Select the Storage Class – you also cannot change this later.

We have selected Public Read for the Access Control List.

When you’re ready, click OK.

You will see the Overview page for your bucket. Make a note of the public Internet URL.

In the Files tab, upload your static web files.

I uploaded a simple index.html homepage and a background picture. The homepage includes a small script that calls the Function URL you copied earlier:

<script type="text/javascript">
    const functionURL = '<<Function URL>>';
    const doHome = new XMLHttpRequest();
    doHome.open('GET', functionURL, true);
    doHome.onload = function () {
        document.getElementById('home_message').innerHTML = doHome.responseText;
    };
    doHome.send();
</script>

In Basic Settings, click Configure to configure your Static Pages.

Add the homepage details and click Save.

Now go to a new browser window and access the OSS URL you saved earlier.

Back in the Function Compute console, you can now test the flask-app paths directly from the code.

We already tested index.html with no Path variable. Next, we test the app route signin with GET and check the Headers and status code.

The signin page code is working correctly. You can also check the Body to make sure the correct HTML will render on the page. Notice that because I entered the path variable, signin is appended to the URL.

Of course, any errors you encounter will show up in the Logs section for easy debugging.

Now, let’s test this page on the Internet.

If you get an error here, implement a soft link for the page in OSS. Go to the OSS bucket, click the More dropdown for the HTML file in question, and choose Set soft link.

Give the link a name and click OK.

A link file will appear in the list of static files and you will now be able to access the page online with the relevant soft link and it will render as above.

Back in Function Compute, we can test the POST method in the console with the correct username and password details in the same way.

Add the POST variables to the form upload section in the Body tab.

Now you can test this function online.
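The same POST test can be done from a terminal as well; the form field names here are an assumption based on a typical sign-in form, so match them to whatever the flask-web template actually expects:

curl -i -X POST \
  -d 'username=<your-username>' -d 'password=<your-password>' \
  'https://<account-id>.<region>.fc.aliyuncs.com/2016-08-15/proxy/<service-name>/<function-name>/signin'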

Cheers

Osama

DevOps Overview, Cloud Version this time Azure

DevOps is the union of people, process, and products to enable continuous delivery of value to your end users. Azure DevOps is a set of services that gives you the tools you need to do just that. With Azure DevOps, you can build, test, and deploy any application, either to the cloud or on premises. DevOps practices that enable transparency, cooperation, continuous delivery and continuous deployment become embedded in your software development lifecycle.

Azure DevOps provides several tools you can use for better team collaboration. It also has tools for automated build processes, testing, version control, and package management. That’s quite a bit to cover! We’ll get to all the tools eventually. For now, let’s follow the team as they begin with an overview of what Azure DevOps is and how they can get started.

Azure DevOps services and their descriptions:

  • Azure Boards: agile tools that help us plan, track, and discuss our work, even with other teams.
  • Azure Pipelines: build, test, and deploy with CI/CD that works with any language, platform, and cloud.
  • Azure Test Plans: manual and exploratory testing tools.
  • Azure Repos: unlimited, cloud-hosted private and public Git repos.
  • Azure Artifacts: create, host, and share packages.

What is Agile?

Agile is a term that's used to describe approaches to software development, emphasizing incremental delivery, team collaboration, continual planning, and continual learning. Agile isn't a process as much as it is a philosophy or mindset for planning the work that a team will do. It's based on iterative development and helps a team better plan for and react to the inevitable changes that occur in software development.

Recommendations for adopting Agile

  • Create an organizational structure that supports Agile practices
  • Mentor team members on Agile techniques and practices
  • Enable in-team and cross-team collaboration: if collaboration is the key to becoming successful at Agile, look for ways to encourage it.

What is Azure Boards?

Azure Boards is a tool in Azure DevOps to help teams plan the work that needs to be done. The Tailspin team will use this tool to get a better idea of what work needs to be done and how to prioritize it.

Set up Azure Boards using the Basic process

  • Create the project
  1. Sign into your account at dev.azure.com.
  2. Select + Create project.
  3. In the Project name field, type Space Game – web.
  4. In the Description field, type The Space Game website.
  5. Under Visibility, you choose whether to make your project public or private. For now, you can choose private.
  6. Select Advanced.
  7. Under Version control, make sure that Git is selected. Under Work item process, make sure that Basic is selected.
  8. Select Create.
  • Create a team
  1. Select Project settings in the lower-left corner.
  2. On the Project details page, under General, select Teams.
  3. Select Space Game – web Team.
  • Add team members
  1. Under Members, select + Add.
  2. Enter the email address of the user you’d like to add. Then select Save changes.
  3. Repeat the process for any other members you’d like to add.
  • Create the board
  1. In the column on the left, point to Boards and select Boards from the menu that appears.
  2. Select Space Game – web Team boards. A blank board appears.
  3. In the To Do column, select the green + button next to the New item field.
  4. Enter Stabilize the build server and then press Enter.
  5. Select the ellipsis (…), and then select Open.
  6. In the Description field, enter this text (The build server keeps falling over. The OS, Ubuntu 16.04, requires security patches and updates. It’s also a challenge to keep build tools and other software up to date.)
  7. Select Save & Close.
  8. Follow the same steps for the next two items.
  • Create a Git-based workflow: Migrate source code to GitHub and define how we'll collaborate.
  • Create unit tests: Add unit tests to the project to help minimize regression bugs.

Drag Stabilize the build server to the top of the stack. Then, drag Create a Git-based workflow to the second item position. Your final board looks like this.

  • Define a sprint
  1. In the left-side column, select Sprints.
  2. Select Set dates from the upper right.
  3. Leave the name as Sprint 1.
  4. In the Start date field, select the calendar and pick today’s date.
  5. In the End date field, select the calendar and pick the date two weeks from today.
  6. Select Save and Close.

Assign tasks and set the iteration

  1. Under Boards, select Work items.
  2. Select Stabilize the build server.
  3. In the Iteration drop-down list, select Sprint 1.
  4. From the same window, select Unassigned and set yourself as the task owner.
  5. Repeat the process for the other two work items.
    1. Create a Git-based workflow
    2. Create unit tests
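The steps above use the portal; if you prefer the command line, the Azure DevOps CLI extension can create the same work items. A minimal sketch, assuming the extension is installed and your organization URL is filled in (in the Basic process, the work item type is Issue):

az extension add --name azure-devops
az devops configure --defaults organization=https://dev.azure.com/<your-organization>
az boards work-item create --title "Stabilize the build server" --type "Issue" --project "Space Game – web"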

Cheers
Osama

Connect To An Instance In A Private Subnet on Cloud

If you have ever worked with the cloud and configured different subnets, you know there will be public and private subnets, each with its own set of servers. Have you ever wondered how to access an instance in a private subnet without associating a public IP with the VM? In this post I will show you how.

The figure shows a simple example of such a setup. In this post you will learn how to connect to an instance that is hosted in a private subnet.

The first thing you need is to install Pageant.

After that, open the software, add your private key (.ppk) to it, and enter the passphrase.

Once you add the key, it will appear inside the program; you can then close the window.

Now you can use PuTTY and configure the software as follows. You are now connected to the instance in the public subnet through agent forwarding.

From the public subnet you can simply SSH to the private subnet.
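If you are on Linux or macOS rather than Windows, the OpenSSH equivalent of the Pageant/PuTTY flow is a short sketch like this (user name, key path, and IPs are placeholders):

eval "$(ssh-agent -s)"               # start the SSH agent
ssh-add ~/.ssh/id_rsa                # load your private key into the agent
ssh -A user@<bastion-public-ip>      # -A forwards the agent to the bastion
# then, from the bastion host:
ssh user@<private-instance-ip>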

Cheers

Osama

Cloud Services Mapping for AWS, Azure, GCP, OCI, IBM, and Alibaba

This blog post is the kind that takes a lot of time and energy; it took me around ten days to complete, to make sure I covered most of the available services and made it readable. Keep in mind that the services can change while you are reading this post; if you have any comments, or want to add something, please send me an email using the contact us page or leave a comment below.

I am writing this post to share the different cloud providers' services and a comparison between them; it shows the various names each provider uses for equivalent services.

Earlier we used to store our data on HDDs or USB flash drives; cloud computing services have replaced such hard drive technology. Cloud computing is nothing but providing services like storage, databases, servers, networking, and software through the Internet.

Cloud computing is moving fast; in 2020 the cloud is more mature, going multi-cloud, and likely to become more focused on verticals and a sales ground war as the leading vendors battle for market share.

Notes:

  • GCP: Google Cloud Platform
  • OCI: Oracle Cloud Infrastructure
  • None: does not necessarily mean the service is unavailable from that provider; it may just mean I did not look deeper into it or have not used it before.

Marketplace

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
AWS Marketplace | Azure Marketplace | Oracle Cloud Marketplace | Google Cloud Platform (GCP) Marketplace | IBM Marketplace | Alibaba Cloud Marketplace

AI and machine learning

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
SageMaker | Machine Learning | OCI Machine Learning | Google Datalab, Cloud AutoML (Alpha), Cloud Machine Learning Services | Machine Learning | Machine Learning Platform for AI
Alexa Skills Kit | Bot Framework | Oracle Digital Assistant | Google Assistant | None | None
Polly, Transcribe | Speech Services, Bing Speech API | None | Translation API, Speech API | None | None
Lex | Speech Services | Oracle Chatbots | Cloud Text-to-Speech, DialogFlow Enterprise Edition (Beta), Natural Language API | Watson Assistant | Intelligent Service Robot
Rekognition | Cognitive Services | None | Cloud Video Intelligence, Vision API | Visual Recognition | Image Search
Skills Kit | Virtual Assistant | None | None | None | None
Amazon Comprehend | Language Understanding (LUIS) | None | Cloud Text-to-Speech, DialogFlow Enterprise Edition (Beta), Natural Language API | Visual Recognition | Image Search

Big data and analytics

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Redshift | Synapse Analytics | Oracle Autonomous Data Warehouse | BigQuery | Db2 Warehouse | Alibaba MaxCompute (ODPS)
Lake Formation | Data Share | None | None | None | None
Amazon EMR | HDInsight, Data Lake Storage | Oracle Big Data Service | Cloud DataProc | Analytics Engine | E-MapReduce Service

Data orchestration / ETL

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Data Pipeline, Glue | Data Factory, Data Catalog | Data Integrator | Cloud DataPrep, Cloud Composer | DataStage | DataWorks, Data Integration
Dynamo DB | Table Storage, Cosmos DB | NoSQL Database | Cloud Datastore, Cloud BigTable | Cloudant NoSQL DB, Compose for JanusGraph, Databases for MongoDB | ApsaraDB for MongoDB, Table Store

Analytics and visualization

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Kinesis Analytics | Stream Analytics, Data Lake Analytics, Data Lake Store | Event Hub (Apache Kafka as a Service) | Cloud Dataflow | Streaming Analytics | None
QuickSight | Power BI | Data Visualization, Business Intelligence | Google Data Studio | Watson Studio | Data IDE
CloudSearch | Cognitive Search | None | None | None | None
Athena | Data Lake Analytics | None | BigQuery | SQL Query | E-MapReduce Service

Compute

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Elastic Compute Cloud (EC2) Instances | Virtual Machines | Compute | Compute Engine | Classic Virtual Server | Alibaba ECS
Batch | Batch | None | Preemptible VMs | IBM Cloud Functions | Batch Compute
Auto Scaling | Virtual Machine Scale Sets | Auto Scaling | Auto Scaler | Auto Scaling | Auto Scaling
VMware Cloud on AWS | VMware by CloudSimple | None | None | None | None
Parallel Cluster | CycleCloud | Cluster Networking | slurm gcp | None | None
Amazon EC2 – I3.metal | None | Compute – Bare Metal | None | None | None
Amazon EC2 – P2, P3, G3 instances | Azure N-Series | Oracle Cloud Infrastructure Compute – GPU | Google GPU | None | None

Containers and container orchestrators

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Elastic Container Service (ECS) | Container Instances | Oracle Cloud Infrastructure Registry | Container Registry | IBM Cloud Container Registry | Container Registry
Elastic Kubernetes Service (EKS) | Kubernetes Service (AKS) | Container Engine for Kubernetes (OKE) | Kubernetes Engine | IBM Cloud Kubernetes Service | Container Service for Kubernetes
App Mesh | Service Fabric Mesh | None | Google Istio Service Mesh | None | None

Serverless

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Lambda | Functions | Oracle Functions | Google Cloud Functions | IBM Cloud Functions | Function Compute

Database

Area | AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Relational database | RDS | SQL Database, Database for MySQL, Database for PostgreSQL | Oracle Database Cloud Service, MySQL Service | Cloud SQL, Cloud Spanner | Db2, Db2 Hosted, Informix, Databases for PostgreSQL, Compose for MySQL (Beta) | ApsaraDB for RDS MySQL, ApsaraDB for RDS SQL Server, ApsaraDB for RDS PostgreSQL, Distributed Relational Database Service (DRDS)
NoSQL / Document | DynamoDB, SimpleDB, Amazon DocumentDB | Cosmos DB | NoSQL Database | Cloud Datastore, Cloud BigTable | Cloudant NoSQL DB, Compose for JanusGraph, Databases for MongoDB | ApsaraDB for RDS MySQL, ApsaraDB for RDS SQL Server, ApsaraDB for RDS PostgreSQL, Distributed Relational Database Service (DRDS)
Caching | ElastiCache | Cache for Redis | In-memory Option | Cloud MemoryStore (Beta) | Informix | HiTSDB (High-Performance Time Series Database)
Database migration | Database Migration Service | Database Migration Service | Migrate to the Cloud | None | Lift CLI | Data Transmission Service
Relational Database Management Service | Aurora | Azure SQL Database, Azure Cosmos DB | Oracle Autonomous Transaction Processing | Cloud SQL, Cloud Spanner | Cloudant NoSQL DB, Compose for JanusGraph, Databases for MongoDB | ApsaraDB for MongoDB, Table Store

DevOps and application monitoring

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
CloudWatch, X-Ray, AWS Cloud9, AWS CodeStar, AWS CodeBuild, CodeDeploy, CodeCommit, CodePipeline | Monitor, Azure Boards, Azure Pipelines, Azure Repos, Azure Test Plans, Azure Artifacts, DevOps | Developer Cloud Service | Cloud Source Repositories, Cloud Build | Continuous Delivery, DevOps Insights, Globalization Pipeline | None
Developer Tools | Developer Tools | Developer Cloud Service | Cloud Source Repositories, Cloud Build | Continuous Delivery, DevOps Insights, Globalization Pipeline | None
Command Line Interface | CLI, PowerShell | OCI CLI | Cloud Shell, Cloud Console | None | Alibaba Cloud CLI
OpsWorks (Chef-based) | Automation | Oracle Orchestration Cloud Service | None | None | Resource Orchestration Service
CloudFormation | Resource Manager, VM extensions, Azure Automation, Azure Building Blocks | Stack Manager | Cloud Resource Manager, Cloud Deployment Manager | Schematics | Resource Orchestration Service

Internet of things (IoT)

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
IoT | IoT Hub | Internet of Things Cloud Service | Cloud IoT Core (Beta), Google Cloud IoT | Internet of Things Platform | IoT Platform
Greengrass | IoT Edge, Azure IoT SDK | None | None | None | None
Kinesis Firehose, Kinesis Streams | Event Hubs, Azure Stream Analytics | Event Hub (Apache Kafka as a Service) | Cloud Dataflow | Streaming Analytics | None
IoT Things Graph | Digital Twins | None | None | Digital Transcoding | ApsaraVideo Live
AWS IoT Button | Azure Sphere | None | None | None | None

Management

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Trusted Advisor | Advisor | None | Google Cloud Platform Security | None | None
Usage and Billing Report | Billing API | Oracle Management Console | Billing API | None | Alibaba Cloud CLI
Management Console | Portal | Console | Portal/Console | Console | Console
Application Discovery Service | Migrate, Azure Active Directory | None | None | None | None
EC2 Systems Manager | Monitor | Oracle Management Cloud | None | None | None
Personal Health Dashboard | Resource Health, Azure Monitor | Oracle Management Cloud | None | None | Cloud Monitoring, Notification and Alerts
CloudTrail | Monitor | Application Performance Monitoring | Google StackDriver (Monitoring, Logging, Error Reporting, Trace, Debugger) | Application Performance Monitoring | CloudMonitor
Cost Explorer | Cost Management | Oracle Management Cloud | None | None | None
CloudWatch | Application Insights | Application Performance Monitoring | Google StackDriver (Monitoring, Logging, Error Reporting, Trace, Debugger) | IBM Cloud Log Analysis with LogDNA | CloudMonitor

Messaging and eventing

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Simple Queue Service (SQS), Amazon MQ | Queue Storage, Service Bus, Service Bus topics, Service Bus relay | Integration Messaging | Cloud Pub/Sub | Event Streams | Message Queue
Simple Notification Service | Event Grid, Azure Notification Services | Messaging | Firebase Cloud Messaging | Push Notifications | Short Message Service (SMS)
Amazon SES | Marketplace – Email | Oracle Cloud Infrastructure Email Delivery | Partners | SendGrid | Direct Mail

Mobile services

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Mobile Hub | App Center, Xamarin Apps | Mobile & Chatbots | Cloud Mobile App | Mobile Foundation | None
Mobile SDK | App Center, Azure Mobile SDK, Offline/Sync, Azure DevTest Labs (Back End), Hockey App | Mobile Cloud Service | Cloud Tools for Android Studio | None | None
Amazon Pinpoint | Azure Mobile Engagement | None | None | Mobile Foundation | None
Cognito | App Center | Mobile Cloud Service | Cloud Tools for Android Studio | App ID | None
Mobile Analytics | Hockey App | Mobile Cloud Service | Firebase Analytics | Mobile Foundation | None

Networking

Area | AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Cloud virtual networking | Virtual Private Cloud (VPC) | Virtual Network | Oracle Virtual Cloud Network | Virtual Private Cloud | IBM Cloud VPC on Classic | Virtual Private Cloud
Cross-premises connectivity | Amazon VPN | VPN Gateway | VPN Connect | Cloud VPN | Classic IPSEC-VPN | VPN Gateway
DNS management | Route 53 | DNS | Oracle DNS | Cloud DNS | Internet Services | Alibaba Cloud DNS
Global traffic management | Amazon Route 53 Traffic Flow | Azure Traffic Manager | OCI Traffic Management | None | Internet Services | None
Dedicated network | Direct Connect | ExpressRoute | FastConnect | Cloud Interconnect | Direct Link | Express Connect
Load balancing | Elastic Load Balancing | Load Balancer | Oracle Load Balancer | Cloud Load Balancing | IBM Cloud Load Balancing | Server Load Balancer

Security, identity, and access

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Identity and Access Management (IAM) | Azure Active Directory, Role Based Access Control | Identity | Cloud IAM | Identity & Access Management | Resource Access Management
Organizations | Subscription Management + RBAC, Policy, Management Groups | Audit | None | Resource Group | None
Multi-Factor Authentication | Multi-Factor Authentication | Multi-factor authentication | Multi-factor authentication | Multi-factor authentication | Multi-factor authentication
Directory Service | Azure Active Directory Domain Services | None | None | None | None
Cognito | Azure Active Directory B2C | Mobile Cloud Service | Firebase Authentication | App ID | None

Encryption

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Server-side encryption with Amazon S3 Key Management Service | Azure Storage Service Encryption | None | None | None | None
Key Management Service (KMS), CloudHSM | Key Vault | Key Management | Cloud Key Management Service | Key Protect | Key Management Service

Firewall

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Web Application Firewall | Web Application Firewall, Firewall | Web Application Firewall | None | Internet Services | Web Application Firewall

Security

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Inspector | Security Center | Configuration and Compliance Service | None | Infrastructure Vulnerability Scan | None
Certificate Manager | App Service Certificates (available on the Portal) | None | Cloud Key Management Service | Key Protect | Key Management Service
GuardDuty | Advanced Threat Protection | None | None | None | None
Artifact | Service Trust Portal | Compliance | Cloud Security Command Center (Alpha) | None | Alibaba Trust Center
Shield | DDoS Protection Service | Oracle Cloud Infrastructure DDoS Protection | Cloud Armor (Beta) | Internet Services | DDoS Pro and Basic

Storage

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Simple Storage Services (S3) | Blob Storage | Object Storage | Cloud Storage | Cloud Object Storage | Object Storage Service
Elastic Block Store (EBS) | Managed Disks | Block Storage | Persistent Disk | Block Storage | Block Storage
Elastic File System | Files | OCI File Storage | File Store | File Storage | NAS File Storage
S3 Infrequent Access (IA) | Storage cool tier | None | None | None | None
S3 Glacier | Storage archive access tier | Archive Storage | Cloud Storage | Object Storage ColdVault | Object Storage Archive
Backup | Backup | None | None | None | None
Storage Gateway | StorSimple | Storage Software Appliance | None | None | Hybrid Cloud Storage Array
DataSync | File Sync | None | None | None | None

Bulk data transfer

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Import/Export Disk | Import/Export | Data Transfer Services – Hard Disk Import | None | Data Transfer Service | None
Import/Export Snowball, Snowball Edge, Snowmobile | Data Box | Data Transfer Services – Storage Appliance Import | Transfer Appliance (Beta) | Mass Data Migration Service | Data Transport

Web applications

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Elastic Beanstalk | App Service | Application Container Cloud, Java Cloud Service | Google App Engine | Cloud Foundry Apps | Enterprise Distributed Application Service
API Gateway | API Management | API Platform | Cloud Endpoints | API Connect | API Gateway
CloudFront | Content Delivery Network | None | Cloud CDN | Content Delivery Network | Alibaba Content Delivery Network
Global Accelerator | Front Door | None | None | None | None
LightSail | App Service | None | None | Classic Virtual Server, Virtual Server for VPC | Simple Application Server

Miscellaneous

Area | AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
Backend process logic | Step Functions | Logic Apps | Functions | App Engine | IBM Cloud Functions | None
Enterprise application services | WorkMail, WorkDocs | Office 365 | None | G Suite | None | None
Gaming | GameLift, GameSparks | PlayFab | None | None | None | None
Media transcoding | Elastic Transcoder | Media Services | None | None | Digital Transcoding | ApsaraVideo Live
Workflow | Simple Workflow Service (SWF) | Logic Apps | Data Integrator | Cloud DataPrep (Private Beta), Cloud Composer (Beta) | DataStage, Watson Knowledge Catalog | DataWorks, Data Integration
Hybrid | Outposts | Stack | Cloud at Customer | Anthos | None | None
Media | Elemental MediaConvert | Media Services | None | None | Digital Transcoding | ApsaraVideo Live
Region | Availability Zone (AZ) | Availability Zone (AZ) | Availability Domain (AD) | Zones | Availability Zones | Zones

Disaster Recovery Services

AWS | Azure | OCI | GCP | IBM Cloud | Alibaba Cloud
AWS Disaster Recovery | Azure Site Recovery | Oracle Database Backup, DR Site | None | None | Alibaba Disaster Recovery, Hybrid Backup Recovery

Enjoy

Cheers

Osama

Azure Resource quick guide

In general, a load balancer distributes traffic evenly among the systems in a pool. A load balancer can help you achieve both high availability and resiliency.

Say you start by adding additional VMs, each configured identically, to each tier. The idea is to have additional systems ready, in case one goes down, or is serving too many users at the same time.

Azure Load Balancer is a load balancer service that Microsoft provides that helps take care of the maintenance for you. Load Balancer supports inbound and outbound scenarios, provides low latency and high throughput, and scales up to millions of flows for all Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) applications. You can use Load Balancer with incoming internet traffic, internal traffic across Azure services, port forwarding for specific traffic, or outbound connectivity for VMs in your virtual network.

When you manually configure typical load balancer software on a virtual machine, there’s a downside: you now have an additional system that you need to maintain. If your load balancer goes down or needs routine maintenance, you’re back to your original problem.
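To make this concrete, here is a minimal sketch of creating a Load Balancer with the Azure CLI; the resource group and resource names are assumptions:

az network lb create \
  --resource-group myResourceGroup \
  --name myLoadBalancer \
  --sku Standard \
  --frontend-ip-name myFrontEnd \
  --backend-pool-name myBackEndPool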

Azure Application Gateway

If all your traffic is HTTP, a potentially better option is to use Azure Application Gateway. Application Gateway is a load balancer designed for web applications. It uses Azure Load Balancer at the transport level (TCP) and applies sophisticated URL-based routing rules to support several advanced scenarios.

Benefits

  • Cookie affinity. Useful when you want to keep a user session on the same backend server.
  • SSL termination. Application Gateway can manage your SSL certificates and pass unencrypted traffic to the backend servers to avoid encryption/decryption overhead. It also supports full end-to-end encryption for applications that require that.
  • Web application firewall. Application gateway supports a sophisticated firewall (WAF) with detailed monitoring and logging to detect malicious attacks against your network infrastructure.
  • URL rule-based routes. Application Gateway allows you to route traffic based on URL patterns, source IP address and port to destination IP address and port. This is helpful when setting up a content delivery network.
  • Rewrite HTTP headers. You can add or remove information from the inbound and outbound HTTP headers of each request to enable important security scenarios, or scrub sensitive information such as server names.

What is a Content Delivery Network (CDN)?

A content delivery network (CDN) is a distributed network of servers that can efficiently deliver web content to users. It is a way to get content to users in their local region to minimize latency. CDN can be hosted in Azure or any other location. You can cache content at strategically placed physical nodes across the world and provide better performance to end users. Typical usage scenarios include web applications containing multimedia content, a product launch event in a particular region, or any event where you expect a high-bandwidth requirement in a region.

DNS

DNS, or Domain Name System, is a way to map user-friendly names to their IP addresses. You can think of DNS as the phonebook of the internet.
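You can watch this name-to-address mapping happen with a standard lookup tool, for example:

# resolve a hostname to the IP addresses DNS maps it to
dig +short www.example.com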

How can you make your site, which is located in the United States, load faster for users located in Europe or Asia?

Network latency in Azure

Latency refers to the time it takes for data to travel over the network. Latency is typically measured in milliseconds.

Compare latency to bandwidth. Bandwidth refers to the amount of data that can fit on the connection. Latency refers to the time it takes for that data to reach its destination.

One way to reduce latency is to provide exact copies of your service in more than one region and use Traffic Manager to route users to the closest endpoint. Azure Traffic Manager uses the DNS server that's closest to the user to direct traffic to a globally distributed endpoint. Traffic Manager doesn't see the traffic that passes between the client and server; rather, it directs the client web browser to a preferred endpoint. Traffic Manager can route traffic in a few different ways, such as to the endpoint with the lowest latency.
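A minimal sketch of creating such a profile with the Azure CLI (the names are placeholders; the Performance routing method sends users to the lowest-latency endpoint):

az network traffic-manager profile create \
  --name MyTrafficManagerProfile \
  --resource-group myResourceGroup \
  --routing-method Performance \
  --unique-dns-name <a-globally-unique-dns-name>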

Cheers

Osama

Migrating from MongoDB to Azure Cosmos DB, using mongodump and mongorestore (manual/offline)

In this post I will discuss how to migrate from MongoDB (in my case the database was hosted on AWS) to Azure Cosmos DB. I searched online and found different articles about how to do that; the problem I faced was that most of them discuss the same approach, which is online and using third-party software, and that was not applicable for me due to security reasons. Therefore I decided to post about it; maybe it will be useful for someone else.

Usually the easiest way is to use Azure Database Migration Service to perform an offline/online migration of databases from an on-premises or cloud instance of MongoDB to Azure Cosmos DB's API for MongoDB.

There are some prerequisites before starting the migration; to learn more about them, read here. The same link explains the different migration options. However, before you start, you should create an instance of Azure Cosmos DB.

Preparation of target Cosmos DB account

Create an Azure Cosmos DB account and select MongoDB as the API. Pre-create your databases through the Azure portal.

The home page for Azure Cloud

From the search bar, just search for "Azure Cosmos DB".

Azure Cosmos DB

You have to add a new account for the migration. Since we are migrating from MongoDB, the API should be "Azure Cosmos DB for MongoDB API".

Create Cosmos DB

The target is now ready for the migration, but we have to check the connection string so we can use it to migrate from AWS to Azure.

Get the MongoDB connection string to customize

  • In the Azure Cosmos DB blade, select the API.
  • In the left pane of the account blade, click Connection String.
  • The Connection String blade opens. It has all the information necessary to connect to the account using a MongoDB driver, including a preconstructed connection string.
Connection string

On MongoDB (the source server), take a backup of the database. After the backup completes there is no need to move it to another server; Mongo provides two backup methods: mongodump, which produces a binary dump, and mongoexport, which generates JSON files.

For example, using mongodump:

mongodump --host <hostname:port> --db <database-to-back-up> --collection <collection-name> --gzip --out /u01/user/

For mongoexport:

mongoexport --host <hostname:port> --db <database-to-back-up> --collection <collection-name> --out=<location-for-JSON-file>

My advice is to run these commands in the background, especially if the database size is big, and to generate a log for the background process so you can check it frequently.
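A minimal sketch of running the dump in the background with a log, using the same options as above:

nohup mongodump --host <hostname:port> --db <database-name> --collection <collection-name> \
  --gzip --out /u01/user/ > /u01/user/mongodump.log 2>&1 &
tail -f /u01/user/mongodump.log   # check progress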

After the backup finishes, run the restore/import command from the source server. Remember the connection string? Now we will use it to connect to Azure Cosmos DB. If you used mongodump, then to restore you have to use mongorestore, like below:

mongorestore --host testserver.mongo.cosmos.azure.com --port 10255 -u testserver -p  w3KQ5ZtJbjPwTmxa8nDzWhVYRuSe0BEOF8dROH6IUXq7rJgiinM3DCDeSWeEdcOIgyDuo4EQbrSngFS7kzVWlg== --db test --collection test /u01/user/notifications_service/user_notifications.bson.gz  --gzip --ssl --sslAllowInvalidCertificates

Notice the following:

  • host: from the Azure portal / connection string.
  • Port: from the Azure portal / connection string.
  • Password: from the Azure portal / connection string.
  • DB: the name of the database you want created in Azure Cosmos DB; it will be created during the migration.
  • Collection: the name of the collection you want created in Azure Cosmos DB; it will be created during the migration.
  • The location of the backup.
  • --gzip, because I compressed the backup.
  • The migration requires SSL authentication, otherwise it will fail.

If you used mongoexport, use mongoimport:

mongoimport --host testserver.mongo.cosmos.azure.com:10255 -u testserver -p w3KQ5ZtJbjPwTmxa8nDzWhVYRuSe0BEOF8dROH6IUXq7rJgiinM3DCDeSWeEdcOIgyDuo4EQbrSngFS7kzVWlg== --db test --collection test --ssl --sslAllowInvalidCertificates --type json --file /u01/dump/users_notifications/service_notifications.json

Once you run the command, the import starts.

Note: if you are migrating huge databases, you need to increase the Cosmos DB throughput at the collection and database level; after the migration finishes, return everything to the normal settings because of the cost.

Cheers

Osama