Azure CLI For Beginners

What is the Azure CLI?

The Azure command-line interface (Azure CLI) is a set of commands used to create and manage Azure resources. The Azure CLI is available across Azure services and is designed to get you working quickly with Azure, with an emphasis on automation.

Documentation here

Create a Linux VM with the Azure CLI

The Azure CLI includes the vm command to work with virtual machines in Azure. We can supply several subcommands to do specific tasks. The most common include:

Sub-command   Description
create        Create a new virtual machine
deallocate    Deallocate a virtual machine
delete        Delete a virtual machine
list          List the created virtual machines in your subscription
open-port     Open a specific network port for inbound traffic
restart       Restart a virtual machine
show          Get the details for a virtual machine
start         Start a stopped virtual machine
stop          Stop a running virtual machine
update        Update a property of a virtual machine
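For example, a quick way to see the virtual machines in your subscription in a readable form (a simple sketch using the list subcommand above):

az vm list --output table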

Create a new VM using the Azure CLI:

az vm create --resource-group [resource group name] --location westus --name OsamaVM \
  --image UbuntuLTS --admin-username osama --generate-ssh-keys --verbose

After the VM is created, a public IP address is assigned to it; the output of the az vm create command includes this public IP.

Another way to check the IP is by using the command below:

az vm list-ip-addresses -n OsamaVM -o table

You can also open ports using the Azure CLI, for example:

az vm open-port --port 80  --resource-group learn-9c22c502-355e-437b-9682-eb54b8c48e1c  --name OsamaVM

You can connect to the VM over SSH using the command below:
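A minimal sketch, assuming the osama admin username from the create command above; replace <public-ip> with the VM's public IP address (the key pair generated by --generate-ssh-keys is picked up from ~/.ssh by default):

ssh osama@<public-ip>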

There are predefined images available from Azure; you can list them with the command below:
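For example, a quick sketch (without --all, the command returns the cached list of popular images):

az vm image list --output table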

Or you can check the available images in a specific location; the results can differ between locations:
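A sketch assuming the westus location used earlier; adding --all queries the live marketplace list for that region, so it can take a while:

az vm image list --location westus --all --output table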

As another example, the command below shows only the images published by Microsoft:

az vm image list --publisher Microsoft --output table --all

One more thing: to resize a VM, we use the vm resize command. For example, perhaps we find our VM is underpowered for the task we want it to perform. We could bump it up to a Standard_D2s_v3, which has 2 vCPUs and 8 GB of memory. Type this command in Cloud Shell:

az vm resize --resource-group test-7223198d-cbdf-4fb7-bfd9-b609eaca3671 --name OsamaVM --size Standard_D2s_v3

Cheers

Enjoy

Osama

Complete Project Built on Amazon AWS

I would like to share the following project. The idea is to share knowledge and let people try a full implementation on AWS with hands-on experience:

  • Create the VM
  • Install Nginx as a reverse proxy listening on port 3333 and forwarding locally to port 8080, and make a small application that simply echoes the HTTP request on port 8080 (in any language you want; a minimal sketch of such an echo application appears after this list).
  • Harden it using this script (or, if you want to make your own for a different OS configuration or a more up-to-date one, we will compensate you for it).
  • Ensure that you are using a custom service account and not the default one.
  • Make sure that everything is logged in Stackdriver, including system logs, SSH login attempts, and all application servers.
  • Make sure that the firewall is well configured and no extra port is opened.
  • Publish the VM.
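Here is a minimal sketch of the echo application mentioned above (an illustrative assumption, not the project's actual code): it listens locally on port 8080 and echoes the incoming HTTP request line, headers, and body back to the caller, so Nginx can proxy port 3333 to it.

from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def _echo(self):
        # Read any request body, then echo the request line, headers, and body back
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else b""
        payload = (self.requestline + "\n" + str(self.headers)).encode() + body
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    # Echo for both GET and POST requests
    do_GET = do_POST = _echo

if __name__ == "__main__":
    # Bind to localhost only; Nginx listens on 3333 and proxies to this port
    HTTPServer(("127.0.0.1", 8080), EchoHandler).serve_forever()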

The solution that I created was the following:

  • Use this CloudFormation script to deploy an instance with the following setup:
    • Echo request application
    • Centralized logging to CloudWatch
    • Hardened Operating System
  • The CloudFormation script makes use of several scripts and configuration files. Their links and descriptions are as follows:
  • For details on deployment architecture and configuration, please refer to the following documents:
    • Documentation: contains an overview of the deployment architecture and the high-level OS hardening and logging configurations.
    • CloudFormation script: contains details on the architecture and OS hardening, as well as the logging configurations.

Cheers & thank you

Osama

Dealing with s3 using Python – Boto3 Module – Upload/Download files from bucket

We all know Amazon AWS. One of the AWS features, S3 (Simple Storage Service), is a service offered by Amazon Web Services that provides object storage through a web service interface.

Amazon provides different ways to deal with S3, either through the console or through the AWS CLI, but in this post I chose to work with S3 in a different way. As most of you know, I love Python because it is a very simple, uncomplicated, cross-platform programming language.

You can manage S3 with Python using a module called boto3 (in previous versions it was called boto, without the 3). You have to install this module to allow Python to import the library, which is a very simple step:

pip3 install boto3

or through the project interpreter that you are using. In my case I am using PyCharm Community, which is nice to work with; you can install any module by doing the following:

File --> Settings --> Project Interpreter --> press the + sign --> search for the module you want to install

I uploaded the scripts to my GitHub as usual here; the repository includes two different files:

  • Upload to S3
  • Download from S3

They are very simple scripts, but you have to update the following to ensure access to AWS S3 (a minimal usage sketch appears after the lists below):

Inside the upload-to-S3 file:

  • ACCESS_KEY_ID
  • ACCESS_SECRET_KEY
  • BUCKET_NAME
  • The Location of the files

Inside the download-from-S3 file:

  • ACCESS_KEY_ID
  • ACCESS_SECRET_KEY
  • BUCKET_NAME
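For illustration, here is a minimal sketch of how those values are typically used with boto3. The placeholder credentials, bucket name, key, and file paths are assumptions, not the repository's actual code:

import boto3

ACCESS_KEY_ID = "<your-access-key-id>"          # placeholder
ACCESS_SECRET_KEY = "<your-secret-access-key>"  # placeholder
BUCKET_NAME = "<your-bucket-name>"              # placeholder

# Build an S3 client with explicit credentials
s3 = boto3.client(
    "s3",
    aws_access_key_id=ACCESS_KEY_ID,
    aws_secret_access_key=ACCESS_SECRET_KEY,
)

# Upload a local file to the bucket
s3.upload_file("files/report.csv", BUCKET_NAME, "report.csv")

# Download an object from the bucket to a local file
s3.download_file(BUCKET_NAME, "report.csv", "report-copy.csv")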

Cheers ✌😉

Osama

Build, Deploy and Run Node Js Application on Azure using Docker

This documentation explains step by step how to build, deploy, and run a Node.js application on Azure using Docker.

The idea came about when one of the customers asked me to do this automation for them. They already had an application written in Node.js, and since I can't post the client's code here, I searched online and found this sample to use instead of the actual code 😅

Readers should have some knowledge of Azure Cloud; this document will guide you through creating and working on Azure, so you need to understand Azure Cloud concepts, plus basic knowledge of Node.js and how to write a Dockerfile. I provided everything on my GitHub here. The code is a sample that was originally deployed on Heroku, but it can still be deployed on Azure using this documentation 🤔

The documentation is uploaded to my SlideShare here.

Cheers

Osama

Error: Server refused our key or No supported authentication methods available

If you use PuTTY to connect to your instance and get either of the following errors, Error: Server refused our key or Error: No supported authentication methods available, verify that you are connecting with the appropriate user name for your AMI. Enter the user name in the User name box in the PuTTY Configuration window.

The appropriate user names are as follows:

  • For an Amazon Linux AMI, the user name is ec2-user.
  • For a RHEL AMI, the user name is ec2-user or root.
  • For an Ubuntu AMI, the user name is ubuntu or root.
  • For a CentOS AMI, the user name is centos.
  • For a Fedora AMI, the user name is ec2-user.
  • For SUSE, the user name is ec2-user or root.
  • Otherwise, if ec2-user and root don’t work, check with the AMI provider.
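For comparison, the same user-name rule applies when connecting with OpenSSH from a command line instead of PuTTY; a sketch, assuming an Amazon Linux instance and your own key file and hostname:

ssh -i my-key.pem ec2-user@<public-dns-or-ip>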

Thanks
Osama

Configure AWS Command Line Interface

In this lesson I will show you how to configure the AWS command line and how to start working with your AWS account through the command line, with very simple and basic steps:

  • First of all, I will assume that you don't have any user or group in your AWS console.
  • From the AWS console:
  • From IAM (Identity and Access Management), choose Groups and create a group with the AdministratorAccess permission, then hit Create.
  • Create a user and add that user to the group (in my case, the group name shown above), and save the access key ID and secret key as a CSV file.
  • Now, from this link here, download the AWS Command Line Interface for your operating system and open cmd, a terminal, etc.
  • Now, from the command prompt, enter aws configure and fill in the information like the sketch shown after this list:
  • Open the command line and test if it’s connected to AWS now.
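A minimal sketch of what the prompts look like; the key values and region here are placeholders, and aws s3 ls is just one simple way to test that the CLI can reach your account:

aws configure
AWS Access Key ID [None]: <your-access-key-id>
AWS Secret Access Key [None]: <your-secret-access-key>
Default region name [None]: eu-west-2
Default output format [None]: json

aws s3 ls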
You can learn more about the command line from the AWS documentation. If you receive the following error:
“Could not connect to the endpoint URL: https://s3.london.amazonaws.com/”
then make sure you are using the right region.
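For example, you can set the CLI to the correct region name (London is eu-west-2) with a one-liner like this:

aws configure set region eu-west-2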
Thanks
Osama

failed to associate the token : AWS

Once you create your AWS account, you should enable “Activate MFA on your root account”.

To do that, use your phone (iPhone or Android) and download the Google Authenticator app from the App Store/Google Play. Once the installation is done, scan the QR code and enter the code (the code changes automatically). If the first try does not work and you receive the error “failed to associate the token”, uninstall the app and try again.

Thank you
Osama

Migrating From AWS to Oracle Using SQL Developer

The data was uploaded to the cloud vendor Amazon Web Services (AWS), but the client decided to move their data on-premises. At first sight you might think this is hard and needs a lot of work, but thanks to SQL Developer and Jeff Smith, the product manager for SQL Developer (an amazing man, by the way, and a CrossFitter at the same time 😛), it is not.

However, let's start:

  • Open SQL Developer.
  • Choose the Database Copy option from the Tools menu.
  • Select the source database (this should be the AWS database):
    • Provide the hostname of the AWS instance.
    • Listener Port
    • DB Name
    • Username/Password 
    • Test your connection.
  • Select the destination database (this should be the on-premises Oracle database):
    • Provide the hostname/IP of the server.
    • Listener Port
    • DB Name
    • Username/Password
    • Test your connection.

  • Press the Next button; if the migration was done before on the same schema, press Replace and then Next.

  • Press Next after choosing what you want to move: data, functions, triggers, etc.
  • Check Proceed to Summary and press the Finish button; the migration will start after this. It will take some time, depending on the internet connection and the data size.
Enjoy the migration
Osama Mustafa

Moving from VMware/KVM to the Oracle Cloud

Are you running a VMware or KVM solution in your infrastructure and are afraid to move it to the cloud? Oracle provides one simple solution without losing anything: you can now easily move your virtual machines to the Oracle Cloud using Ravello, and you don't have to change anything in your network, storage, or anything else you did on your local infrastructure.

To learn more about this product, you can request a free trial account to experience Ravello's unique features and capabilities. For any questions, please contact your local Oracle Cloud Infrastructure and Platform Sales Executive. The following is the URL for requesting the free trial account:

https://www.ravellosystems.com/

Thank you
Osama

Plug and unplug PDB on the Cloud

You can use DBaaS Monitor to plug and unplug a pluggable database.

  • Open DBaaS Monitor as usual.

  • Once you do this, you will be redirected to a new page. In my case I have 2 PDBs; let's choose one of them and try to unplug it, then plug it back in.
  • From the right panel, press Unplug and a new screen will open. As you can see from the picture below, keep the XML path in mind in case you need to plug the database back in, and use the same password you used when you created the PDB.

The output should be “PDB unplugged successfully”.

Let's plug it back in.

  • Press Plug PDB on the right and a new screen will open.

Enter the following:
  • The name of the new PDB you want.
  • The XML file, which should be saved under the directory you chose.

The output should indicate the plug completed successfully.
Cheers
Osama Mustafa