Again, but this time virtual. I remember the tour three years ago, one of the most fantastic trips, where I met new people and made friends. This time it will be virtual due to the coronavirus, with great topics and geeks.
Register now and don’t miss it; there is always time to learn something new.
Many of you know that I have been working on different cloud vendors: Oracle Cloud Infrastructure, Amazon AWS, and Microsoft Azure. I have had the chance to get hands-on experience and implement projects on all of them.
Now I am working on a second book that will cover different topics across all three of them: DevOps, a comparison between the three cloud vendors, and more.
During the lockdown, I was working to sharpen my skills and test them in the cloud, so I decided to go for Azure first, and trust me when I say “it’s one of the hardest exams I have ever taken”.
The exam itself is totally different from what I was used to: real-world scenarios that require you to know Azure features, all of them, and how to configure them.
To become an “Azure Solutions Architect Expert”, there are some conditions you should go through; first, you need to pass two exams, AZ-301 & AZ-300:
AZ-301 Microsoft Azure Architect Design
AZ-300 Microsoft Azure Architect Technologies
Both are part of the requirements for Microsoft Certified: Azure Solutions Architect Expert. The first exam, AZ-301, covers designing secure, scalable, and reliable solutions. Candidates should have advanced experience and knowledge across various aspects of IT operations, including networking, virtualization, identity, security, business continuity, disaster recovery, data management, budgeting, and governance. This role requires managing how decisions in each area affect an overall solution. Candidates must be proficient in Azure administration, Azure development, and DevOps, and have expert-level skills in at least one of those domains. The AZ-301 objectives are:
Determine workload requirements
Design for identity and security
Design a data platform solution
Design a business continuity strategy
Design for deployment, migration, and integration
Design an infrastructure strategy
For AZ-300, the objectives are:
Deploy and configure Azure infrastructure
Implement workloads and security on Azure
Create and deploy apps on Azure
Implement Azure authentication and secure data
Develop for the cloud
After you complete both exams successfully, you will receive your badge. The duration of each exam is around 3 hours, and trust me, you will need it.
Many of you know that Oracle announced a month ago that the six tracks from Oracle University include their exams for free; so far I have completed four of them and am looking at the other two.
In this post I will discuss how to prepare for exam 1Z0-1084-20. In my opinion, this is more of a DevOps exam, so if you have knowledge of Docker and Kubernetes, have worked with them, and have worked with OCI (Oracle Cloud Infrastructure) before, go ahead and apply for this exam.
The funny thing is that when I pass one exam and post about it on social media, I immediately start receiving multiple messages from different people I don’t know, asking “could you please provide us with the dumps?” First of all, how did you assume I am using dumps? I have failed multiple times in different Oracle exams. Second, I am against dumps for various reasons: the exam proves that you are ready to go through this track and work on it. Imagine you put this on your resume and someone asks you a question about it; it will not look professional for you.
However, I would like to discuss 1Z0-1084-20 in particular, because I didn’t feel it’s only related to Oracle; you should have knowledge across different areas:
Software architecture patterns
For sure, OCI
When you are studying for this exam, you should follow Lucas Jellema’s blog here, and you can also follow him on Twitter.
His blog saved me a lot of time and explains everything you need to know in detail.
Ambari is an open-source administration tool deployed on top of Hadoop clusters, and it is responsible for keeping track of the running applications and their status. Apache Ambari can be referred to as a web-based management tool that manages, monitors, and provisions the health of Hadoop clusters.
With Ambari, Hadoop operators get the following core benefits:
Simplified Installation, Configuration and Management.
Centralized Security Setup.
Full Visibility into Cluster Health.
Highly Extensible and Customizable.
For more information about Ambari, review the documentation here.
The picture below shows the Ambari architecture:
The Ambari installation is pretty simple; the section below walks through the installation steps. SSH to your server, become root, and add the Ambari repository file to /etc/apt/sources.list.d:
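A minimal sketch of that repository file (the URL and version here are examples from the Hortonworks public repository; pick the list file that matches your distribution and the Ambari release you want):

```
# /etc/apt/sources.list.d/ambari.list  -- URL and version are examples
deb http://public-repo-1.hortonworks.com/ambari/ubuntu16/2.x/updates/2.7.5.0 Ambari main
```

After adding the file, import the repository’s signing key and run apt-get update so the ambari-server package becomes visible.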
Checking the repository file should give you the following result:
Now we will install the Ambari server, which also installs the default PostgreSQL Ambari database mentioned above:
apt-get install ambari-server
The next part is running the command to set up the Ambari server:
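The setup is driven by a single command from the Ambari documentation:

```
# Launches the interactive setup wizard: JDK selection, database
# configuration, and the account the Ambari daemon runs under.
ambari-server setup
```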
The above command starts configuring Ambari: it connects Ambari to the database (the default is PostgreSQL), installs the JDK and the Ambari daemon, and sets up the service account. You can choose the advanced database settings in case you want to change the database type; Ambari gives you different options such as Oracle, MySQL, Microsoft SQL Server, and DB2. Otherwise, it uses the default database username and password.
You can manage the Ambari server with the following commands:
To start the Ambari server: service ambari-server start
To stop the Ambari server: service ambari-server stop
To restart the Ambari server: service ambari-server restart
To check the Ambari server processes: ps -ef | grep ambari
In this post I will discuss how to install the Boto3 module for Python (I am using Python 3.6). What is Boto3?
Boto3 is the Amazon Web Services (AWS) Software Development Kit (SDK) for Python, which allows Python developers to write software that makes use of services like Amazon S3 and Amazon EC2. You can find the latest, most up-to-date documentation at the Boto3 doc site, including a list of the services that are supported.
The module is very big and covers all AWS features; you can integrate it into your code and start dealing with S3, for example: download/upload, create a bucket, delete, and more. The documentation is here.
To install Boto3, follow the steps below.
yum install python3-pip
Once you run the above command, pip will be installed on the local machine; pip is a package manager for Python packages, or modules if you like.
pip3 install boto3 --user
I prefer this method over Option #1 because it is run by Python itself:
python3 -m pip install --user boto3
Now that Boto3 is installed on your machine, you can start using it.