I recently had the chance to work with and test Alibaba Cloud, so I thought it would be a good idea to write something about it. I have already used AWS, Azure, and OCI, so this is my fourth cloud vendor.
Alibaba Cloud is a subsidiary of the e-commerce giant Alibaba Group, which launched its cloud services in 2009. Today, cloud is one of the group's most ambitious projects, and it is investing heavily to compete with AWS.
The company offers an extensive range of cloud computing products and services, divided into seven categories: Elastic Computing and Networking, Security and Management, Database, Application Services, Domains and Websites, Storage and CDN, and Analytics. Alibaba Cloud customers benefit from strong cloud security, high computing power, data protection, and more.
I really like the platform and its portal; it is simple and easy to use. In addition, it has many features comparable to AWS, which you can check out here.
Alibaba Cloud is also known by another name, Aliyun. It has 19 regional data centres globally, including China North, China South, China East, US West, US East, Europe, the United Kingdom, the Middle East, Japan, Hong Kong, Singapore, Australia, Malaysia, India, and Indonesia. Right now the data center in Germany is operated by Vodafone Germany.
Some of the clients using this cloud: Ford, AirAsia, Lazada, and more.
Some of the services provided by Alibaba:
Storage & CDN
I will discuss each of them in a separate post; the next one will cover Alibaba services.
A load balancer distributes traffic evenly among each system in a pool. A load balancer can help you achieve both high availability and resiliency.
Say you start by adding additional VMs, each configured identically, to each tier. The idea is to have additional systems ready, in case one goes down, or is serving too many users at the same time.
Azure Load Balancer is a load balancer service that Microsoft provides that helps take care of the maintenance for you. Load Balancer supports inbound and outbound scenarios, provides low latency and high throughput, and scales up to millions of flows for all Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) applications. You can use Load Balancer with incoming internet traffic, internal traffic across Azure services, port forwarding for specific traffic, or outbound connectivity for VMs in your virtual network.
When you manually configure typical load balancer software on a virtual machine, there’s a downside: you now have an additional system that you need to maintain. If your load balancer goes down or needs routine maintenance, you’re back to your original problem.
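The distribution idea above can be sketched in a few lines. This is a toy round-robin rotation, purely illustrative (Azure Load Balancer itself distributes flows with a hash-based algorithm, and the VM names here are made up):

```python
from itertools import cycle

# Toy sketch of load balancing: spread requests across a pool of
# identically configured backends so no single VM takes all the traffic.
backends = ["vm-1", "vm-2", "vm-3"]
pool = cycle(backends)

def route_request():
    """Return the next backend in the rotation for an incoming request."""
    return next(pool)

assignments = [route_request() for _ in range(6)]
print(assignments)  # each VM receives two of the six requests
```

If one VM in the pool goes down, a real load balancer stops sending it traffic, which is what gives you the availability and resiliency described above.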
Azure Application Gateway
If all your traffic is HTTP, a potentially better option is to use Azure Application Gateway. Application Gateway is a load balancer designed for web applications. It uses Azure Load Balancer at the transport level (TCP) and applies sophisticated URL-based routing rules to support several advanced scenarios.
Cookie affinity. Useful when you want to keep a user session on the same backend server.
SSL termination. Application Gateway can manage your SSL certificates and pass unencrypted traffic to the backend servers to avoid encryption/decryption overhead. It also supports full end-to-end encryption for applications that require that.
Web application firewall. Application gateway supports a sophisticated firewall (WAF) with detailed monitoring and logging to detect malicious attacks against your network infrastructure.
URL rule-based routes. Application Gateway allows you to route traffic based on URL patterns, source IP address and port to destination IP address and port. This is helpful when setting up a content delivery network.
Rewrite HTTP headers. You can add or remove information from the inbound and outbound HTTP headers of each request to enable important security scenarios, or scrub sensitive information such as server names.
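The URL rule-based routing described above boils down to matching path prefixes against configured rules. Here is a toy sketch of that idea (the pool names are hypothetical; Application Gateway does this with configured routing rules, not code you write):

```python
# Toy illustration of URL path-based routing: requests are sent to
# different backend pools depending on the path prefix.
ROUTES = {
    "/images/": "image-servers",
    "/video/": "video-servers",
}

def pick_backend(path: str, default: str = "web-servers") -> str:
    """Return the backend pool whose path prefix matches the request path."""
    for prefix, pool in ROUTES.items():
        if path.startswith(prefix):
            return pool
    return default

print(pick_backend("/images/logo.png"))  # image-servers
print(pick_backend("/checkout"))         # web-servers
```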
What is a Content Delivery Network (CDN)?
A content delivery network (CDN) is a distributed network of servers that can efficiently deliver web content to users. It is a way to get content to users in their local region to minimize latency. CDN can be hosted in Azure or any other location. You can cache content at strategically placed physical nodes across the world and provide better performance to end users. Typical usage scenarios include web applications containing multimedia content, a product launch event in a particular region, or any event where you expect a high-bandwidth requirement in a region.
DNS, or Domain Name System, is a way to map user-friendly names to their IP addresses. You can think of DNS as the phonebook of the internet.
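You can see this name-to-address mapping from a couple of lines of code. This sketch resolves the loopback name, which works even without internet access; swap in any public hostname to query real DNS:

```python
import socket

# DNS maps a user-friendly name to an IP address. "localhost" resolves
# locally (via the hosts file), so no network connection is needed.
address = socket.gethostbyname("localhost")
print(address)  # 127.0.0.1
```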
How can you make your site, which is located in the United States, load faster for users located in Europe or Asia?
Network latency in Azure
Latency refers to the time it takes for data to travel over the network. Latency is typically measured in milliseconds.
Compare latency to bandwidth. Bandwidth refers to the amount of data that can fit on the connection. Latency refers to the time it takes for that data to reach its destination.
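The distinction becomes concrete with some back-of-the-envelope arithmetic (the numbers below are made up for illustration):

```python
def transfer_time_ms(payload_bytes: int, latency_ms: float, bandwidth_mbps: float) -> float:
    """Rough total transfer time: fixed network latency plus the time
    the payload spends on the wire at the given bandwidth."""
    wire_time_ms = payload_bytes * 8 / (bandwidth_mbps * 1_000_000) * 1000
    return latency_ms + wire_time_ms

# A 1 MB page fetched across an ocean (~150 ms latency) on a 100 Mbit/s link:
print(transfer_time_ms(1_000_000, 150, 100))   # 230.0 ms
# Ten times the bandwidth barely helps, because the latency term is fixed:
print(transfer_time_ms(1_000_000, 150, 1000))  # 158.0 ms
```

This is why reducing latency, not just buying bandwidth, matters for faraway users.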
One way to reduce latency is to provide exact copies of your service in more than one region and route users to the closest endpoint. This is where Azure Traffic Manager comes in. Traffic Manager uses the DNS server that's closest to the user to direct user traffic to a globally distributed endpoint. Traffic Manager doesn't see the traffic that's passed between the client and server; rather, it directs the client web browser to a preferred endpoint. Traffic Manager can route traffic in a few different ways, such as to the endpoint with the lowest latency.
In this post, I will discuss how to migrate from MongoDB (in my case, the database was hosted on AWS) to Azure Cosmos DB. I searched online for articles on how to do this, but most of them discussed the same approach: an online migration using third-party software, which was not applicable for me due to security reasons. Therefore, I decided to post about it; maybe it will be useful for someone else.
Usually the easiest way is to use Azure Database Migration Service to perform an offline/online migration of databases from an on-premises or cloud instance of MongoDB to Azure Cosmos DB's API for MongoDB.
There are some prerequisites before starting the migration; to learn more about them, read here. The same link explains the different migration approaches. However, before you start, you should create an Azure Cosmos DB instance.
Preparation of target Cosmos DB account
Create an Azure Cosmos DB account and select MongoDB as the API, then pre-create your databases through the Azure portal.
From the search bar, just search for "Azure Cosmos DB".
You have to add a new account for the migration. Since we are migrating from MongoDB, the API should be "Azure Cosmos DB for MongoDB API".
The target is now ready for the migration, but we still have to note down the connection string so we can use it in our migration from AWS to Azure.
Get the MongoDB connection string to customize
In the Azure Cosmos DB blade, select the API.
In the left pane of the account blade, click Connection String.
The Connection String blade opens. It has all the information necessary to connect to the account by using a driver for MongoDB, including a preconstructed connection string.
On the MongoDB side (the source server), you have to take a backup of the database; once the backup completes, there is no need to move it to another server. Mongo provides two backup methods: mongodump, which produces a binary dump, and mongoexport, which generates a JSON file.
For example, using mongodump:
mongodump --host <hostname:port> --db <database_name> --collection <collection_name> --gzip --out /u01/user/
Or using mongoexport:
mongoexport --host <hostname:port> --db <database_name> --collection <collection_name> --out=<location_for_JSON_file>
My advice is to run these commands in the background, especially if the database is big, and to generate a log for the background process so you can check on it frequently.
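The background-plus-log pattern looks like this. A stand-in command plays the role of mongodump here, since the real invocation depends on your environment; replace it with your actual command, e.g. ["mongodump", "--host", ...]:

```python
import os
import subprocess
import sys
import tempfile

# Start a long-running backup in the background, redirecting its output to
# a log file you can check later. The stand-in command below is a
# placeholder for the real mongodump invocation.
cmd = [sys.executable, "-c", "print('dump finished')"]
log_path = os.path.join(tempfile.gettempdir(), "mongodump.log")

with open(log_path, "w") as log:
    proc = subprocess.Popen(cmd, stdout=log, stderr=subprocess.STDOUT)

# In real life you would leave the process running and `tail -f` the log;
# here we wait so we can show the captured output.
proc.wait()
with open(log_path) as log:
    print(log.read().strip())  # dump finished
```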
After the backup finishes, run the restore/import command from the source server. Remember the connection string? Now we will use it to connect to Azure Cosmos DB. If you used mongodump, then you have to restore with mongorestore, like below:
mongorestore --host testserver.mongo.cosmos.azure.com --port 10255 -u testserver -p w3KQ5ZtJbjPwTmxa8nDzWhVYRuSe0BEOF8dROH6IUXq7rJgiinM3DCDeSWeEdcOIgyDuo4EQbrSngFS7kzVWlg== --db test --collection test /u01/user/notifications_service/user_notifications.bson.gz --gzip --ssl --sslAllowInvalidCertificates
Notice the following:
Host: from the Azure portal/connection string.
Port: from the Azure portal/connection string.
Password: from the Azure portal/connection string.
DB: the name of the database you want created in Azure Cosmos DB; it will be created during the migration.
Collection: the name of the collection you want created in Azure Cosmos DB; it will be created during the migration.
The location of the backup.
--gzip: because I compressed the backup.
The migration requires SSL authentication; otherwise it will fail.
If you used mongoexport, import the JSON file with mongoimport instead:
mongoimport --host testserver.mongo.cosmos.azure.com:10255 -u testserver -p w3KQ5ZtJbjPwTmxa8nDzWhVYRuSe0BEOF8dROH6IUXq7rJgiinM3DCDeSWeEdcOIgyDuo4EQbrSngFS7kzVWlg== --db test --collection test --ssl --sslAllowInvalidCertificates --type json --file /u01/dump/users_notifications/service_notifications.json
Once you run the command, the migration starts.
Note: if you are migrating huge databases, you need to increase the Cosmos DB throughput at the database level; after the migration finishes, scale everything back down because of the cost.
Encryption is the process of making data unreadable and unusable to unauthorized viewers. To use or read the encrypted data, it must be decrypted, which requires the use of a secret key.
There are two different types:
Symmetric encryption: the same key is used to encrypt and decrypt the data.
Asymmetric encryption: different keys are used, for example a public and a private key.
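A toy example makes the symmetric case concrete. This is NOT real cryptography (a production system would use AES through a vetted library); it only illustrates the defining property that the exact same key both encrypts and decrypts:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    Applying it twice with the same key returns the original data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"s3cret"
ciphertext = xor_cipher(b"confidential data", secret_key)
plaintext = xor_cipher(ciphertext, secret_key)  # same key reverses it

print(ciphertext != b"confidential data")  # True: the data is scrambled
print(plaintext)  # b'confidential data'
```

In the asymmetric case, by contrast, the key that encrypts (the public key) cannot decrypt; only the matching private key can.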
Both types are applied in two different scenarios:
Encryption at rest, which protects data stored in a database or a storage account.
Encryption in transit, which protects data actively moving from one location to another.
Azure provides several types of encryption:
Encrypt raw storage
Azure Storage Service Encryption: encrypts your data before persisting it to Azure Managed Disks, Azure Blob storage, Azure Files, or Azure Queue storage, and decrypts the data before retrieval.
Encrypt virtual machine disks
Azure Disk Encryption: provides low-level encryption protection for data written to the physical disk. This method helps you encrypt the actual Windows or Linux VM disks, and the best way to manage the encryption keys is with Azure Key Vault.
Transparent data encryption (TDE): helps protect Azure SQL Database and Azure Data Warehouse against the threat of malicious activity. It performs real-time encryption and decryption of the database.
The best way to manage these keys and secrets is Azure Key Vault, a cloud service for storing your application secrets. Key Vault helps you control your applications' secrets by keeping them in a single, central location. Why should you use it?
Centralized storage of application secrets.
Securely stored secrets and keys.
Monitoring of access and use.
Simplified administration of application secrets.
There are also two kinds of certificates in Azure that help you secure, for example, a website or an application. You need to know that certificates used in Azure are x.509 v3; they can be signed by a trusted certificate authority or be self-signed.
Types of certificates
Service certificates are used for cloud services. They are attached to cloud services and enable secure communication to and from the service. For example, if you deploy a web site, you would want to supply a certificate that can authenticate an exposed HTTPS endpoint. Service certificates, which are defined in your service definition, are automatically deployed to the VM that is running an instance of your role.
Management certificates are used for authenticating with the management API. They allow you to authenticate with the classic deployment model. Many programs and tools (such as Visual Studio or the Azure SDK) use these certificates to automate configuration and deployment of various Azure services. However, these types of certificates are not related to cloud services.
Note that you can use Azure Key Vault to store your certificates.
Oracle Access Management is a Java, Enterprise Edition (Java EE)-based enterprise-level security application that provides a full range of Web-perimeter security functions and Web single sign-on services including identity context, authentication and authorization; policy administration; testing; logging; auditing; and more. It leverages shared platform services including session management, Identity Context, risk analytic, and auditing, and provides restricted access to confidential information.
As you can see from the picture above, OAM provides a single point of control for all resource grants in an enterprise where multiple applications exist on different platforms.
There is more, but you can refer to the documentation above.
OIM: Oracle Identity Manager
OIM enables enterprises to manage the entire user life cycle across all enterprise resources, both within and beyond the firewall. An Oracle identity management solution provides a mechanism for implementing the user management aspects of a corporate policy. It can also be a means to audit users and their access privileges.
The best example for understanding OIM is a new employee. When a new employee joins the company, HR traditionally handles everything for them: email accounts, permissions, etc. With OIM it's different, and all of this can be done automatically.
An online directory is a specialized database that stores and retrieves collections of information about objects. The information can represent any resources that require management, for example:
Employee names, titles, and security credentials
Information about partners
Information about shared resources such as conference rooms and printers.
The information in the directory is available to different clients, such as single sign-on solutions, email clients, and database applications. Clients communicate with a directory server by means of the Lightweight Directory Access Protocol (LDAP). Oracle Internet Directory is an LDAP directory that uses an Oracle Database for storage.
con1: Command> call ttgridcreate('samplegrid');
15022: OraclePwd connection attribute needs to be specified and has to be non-empty for using IMDB Cache features
5109: Cache Connect general error: BDB connections not open.
As you can see from the error, the OraclePWD connection attribute is not specified, which means caching will not be enabled until you set the OraclePWD parameter. To fix this, reconnect with a connection string that includes the OraclePWD attribute.
Don’t be dismayed by good-byes. A farewell is necessary before you can meet again. And meeting again, after moments or lifetimes, is certain for those who are friends.
I thought it would be nice to share this quotation before starting to talk about the Oracle OTN Tour – MENA Tour 2014.
For the people who don't know anything about this tour: it is happening for the first time in the Middle East.
“Courtesy of the Oracle Technology Network (OTN) and the ARABOUG, the inaugural 2014 OTN MENA Tour brings a star-studded cast, consisting of some of the world’s best Oracle ACEs, ACE Directors and Rock Star Speakers to the region. The tour aims at sharing cutting edge knowledge and independent research in the MENA region, by accomplished Oracle experts from all over the world”