Oracle Autonomous Database (ADB): A Technical Guide

Oracle Autonomous Database (ADB) on Oracle Cloud Infrastructure (OCI) is a cloud service that leverages machine learning to automate routine database tasks, offering users a self-driving, self-securing, and self-repairing database solution. This blog post walks through setting up and interacting with an Autonomous Transaction Processing (ATP) instance, from provisioning and connecting through creating a sample schema and using key ADB features.

Overview of Oracle Autonomous Database

Self-Driving: Automates performance tuning and scaling.

Self-Securing: Applies security patches automatically.

Self-Repairing: Offers built-in high availability and backup solutions.

Step 1: Creating an Autonomous Database

Log into OCI Console: Go to cloud.oracle.com and sign in to your account.

Create Autonomous Database:

  • Navigate to the Database section and click on Autonomous Database.
  • Click on Create Autonomous Database.
  • Fill in the required details:
    • Display Name: MyATPDB
    • Database Name: MYATPDB
    • Database Type: Autonomous Transaction Processing
    • CPU Count: 1 (can be adjusted later)
    • Storage: 1 TB (adjust as necessary)
  • Configure the Admin Password and ensure you store it securely.
  • Click Create Autonomous Database.

Step 2: Setting Up the Network

2.1: Create a Virtual Cloud Network (VCN)
  1. Navigate to the Networking Section.
  2. Click on Create VCN and fill in the necessary details:
    • VCN Name: MyVCN
    • CIDR Block: 10.0.0.0/16
    • Subnets: Create a public subnet with a CIDR block of 10.0.0.0/24.
2.2: Configure Security Lists
  1. In the VCN settings, add a security rule to allow traffic to your database. (This applies when the database uses a private endpoint in your VCN; a database with a public endpoint is restricted through its access control list instead.)
    • Source CIDR: Your public IP address as a /32 CIDR (for SQL Developer access).
    • IP Protocol: TCP
    • Source Port Range: All
    • Destination Port Range: 1522 (default for ADB)
Step 3: Connecting to the Autonomous Database
3.1: Download Wallet
  1. In the ADB console, navigate to your database and click on DB Connection.
  2. Download the Client Credentials (Wallet). This will be a zip file containing the wallet and connection files.
3.2: Set Up SQL Developer
  1. Open Oracle SQL Developer.
  2. Go to Tools > Preferences > Database > Advanced and set the Use Wallet option to true.
  3. In the Connections pane, click on the green + icon to create a new connection.
  4. Set the connection type to Cloud Wallet, then specify:
    • Connection Name: MyATPConnection
    • Username: ADMIN
    • Password: Your admin password
    • Wallet Location: Path to the unzipped wallet directory
  5. Click Test to verify the connection, then click Save.
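The wallet is not limited to SQL Developer; application code can use it as well. Below is a minimal sketch using the python-oracledb driver, assuming the wallet has been unzipped locally and that the connect alias myatpdb_high (generated from the database name) exists in the wallet's tnsnames.ora; the paths and credentials are placeholders.

import oracledb

# Connect with the unzipped wallet. The connect alias comes from tnsnames.ora
# inside the wallet; paths and passwords here are illustrative placeholders.
connection = oracledb.connect(
    user="ADMIN",
    password="<admin_password>",
    dsn="myatpdb_high",
    config_dir="/path/to/wallet",
    wallet_location="/path/to/wallet",
    wallet_password="<wallet_password>",
)

with connection.cursor() as cursor:
    cursor.execute("SELECT sysdate FROM dual")
    print(cursor.fetchone())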

Step 4: Creating a Sample Schema and Table

Once connected to your database, execute the following SQL commands to create a sample schema and a table:

-- Create a new user/schema (ADB enforces a complex password: 12-30 characters
-- with at least one uppercase letter, one lowercase letter, and one digit)
CREATE USER sample_user IDENTIFIED BY "SamplePassword123";
GRANT CREATE SESSION, CREATE TABLE TO sample_user;
ALTER USER sample_user QUOTA UNLIMITED ON DATA;

-- Create the objects in the new schema (or reconnect as SAMPLE_USER)
ALTER SESSION SET CURRENT_SCHEMA = sample_user;

-- Create a sample table
CREATE TABLE employees (
  employee_id NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
  first_name  VARCHAR2(50) NOT NULL,
  last_name   VARCHAR2(50) NOT NULL,
  email       VARCHAR2(100) NOT NULL UNIQUE,
  hire_date   DATE DEFAULT CURRENT_DATE
);

-- Insert sample data
INSERT INTO employees (first_name, last_name, email)
VALUES ('John', 'Doe', 'john.doe@example.com');

INSERT INTO employees (first_name, last_name, email)
VALUES ('Jane', 'Smith', 'jane.smith@example.com');

COMMIT;

Querying the Data

To verify the data insertion, run:

SELECT * FROM employees;

Step 5: Using Autonomous Database Features

5.1: Auto-Scaling

ADB allows you to scale compute and storage resources automatically. To enable auto-scaling:

  1. Navigate to your Autonomous Database instance in the OCI console.
  2. Click on Edit.
  3. Enable Auto Scaling for compute (and, if desired, for storage).
  4. Save the change. With compute auto scaling enabled, the database can automatically use up to three times its base CPU allocation, so there is no separate minimum/maximum to configure. A programmatic way to toggle the setting is sketched below.
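Auto scaling can also be enabled outside the console. A minimal sketch with the OCI Python SDK follows; the database OCID is a placeholder, and the field name is taken from the SDK's update model.

import oci

# Uses the same config file as the CLI (~/.oci/config by default).
config = oci.config.from_file()
db_client = oci.database.DatabaseClient(config)

# Enable CPU auto scaling on an existing Autonomous Database.
details = oci.database.models.UpdateAutonomousDatabaseDetails(
    is_auto_scaling_enabled=True
)
db_client.update_autonomous_database(
    autonomous_database_id="<autonomous_database_OCID>",
    update_autonomous_database_details=details,
)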

5.2: Monitoring Performance

Utilize the Performance Hub feature to monitor real-time database performance. You can view metrics like:

  • Active Sessions
  • Wait Events
  • Resource Consumption

Regards
Osama

Automating Block Volume Backups in Oracle Cloud Infrastructure (OCI) using CLI and Terraform

Block volumes are the persistent storage behind most OCI compute workloads, so automated backups are essential for protecting data against accidental deletion, corruption, and failures. This post covers two ways to automate block volume backups: the OCI CLI and Terraform.

Automating Block Volume Backups using OCI CLI

Prerequisites:

  • Install and configure the OCI CLI on your machine (the "How to set up the OCI CLI" section later in this guide walks through the steps).
  • Ensure that you have the right permissions to manage block volumes.

Step-by-step guide:

  • Create a block volume:

oci bv volume create --compartment-id <your_compartment_ocid> --availability-domain <your_ad> --display-name "MyVolume" --size-in-gbs 50

  • Take a manual backup of the block volume:

oci bv backup create --volume-id <your_volume_ocid> --display-name "MyVolumeBackup"

  • Schedule backups with a cron job for automation. For example, to take a backup every day at 02:00:

0 2 * * * /usr/local/bin/oci bv backup create --volume-id <your_volume_ocid> --display-name "ScheduledBackup" >> /var/log/oci_backup.log 2>&1

Automating Block Volume Backups using Terraform

Prerequisites

  1. OCI Credentials: Make sure you have the proper API keys and permissions configured in your OCI tenancy.
  2. Terraform Setup: Terraform should be installed and configured to interact with OCI, including the OCI provider setup in your environment.
Step 1: Define the OCI Block Volume Resource

First, define the block volume that you want to automate backups for. Here’s an example of a simple block volume resource in Terraform:

resource "oci_core_volume" "my_block_volume" {
  availability_domain = "your-availability-domain"
  compartment_id      = "ocid1.compartment.oc1..your-compartment-id"
  display_name        = "my_block_volume"
  size_in_gbs         = 50
}
Step 2: Define a Backup Policy

OCI provides predefined backup policies such as gold, silver, and bronze, which define how frequently backups are taken. You can create a custom backup policy as well, but for simplicity, we’ll use one of the predefined policies in this example. The Terraform resource oci_core_volume_backup_policy_assignment will assign a backup policy to the block volume.

Here’s an example to assign the gold backup policy to the block volume:

resource "oci_core_volume_backup_policy_assignment" "backup_assignment" {
  volume_id       = oci_core_volume.my_block_volume.id
  policy_id       = data.oci_core_volume_backup_policy.gold.id
}

data "oci_core_volume_backup_policy" "gold" {
  name = "gold"
}
Step 3: Custom Backup Policy (Optional)

If you need a custom backup policy rather than using the predefined gold, silver, or bronze policies, you can define a custom backup policy using OCI’s native scheduling.

You can create a custom schedule by combining these elements in your oci_core_volume_backup_policy resource.

resource "oci_core_volume_backup_policy" "custom_backup_policy" {
  compartment_id = "ocid1.compartment.oc1..your-compartment-id"
  display_name   = "CustomBackupPolicy"

  schedules {
    backup_type = "INCREMENTAL"
    period      = "ONE_DAY"
    retention_duration = "THIRTY_DAYS"
  }

  schedules {
    backup_type = "FULL"
    period      = "ONE_WEEK"
    retention_duration = "NINETY_DAYS"
  }
}

You can then assign this policy to the block volume with oci_core_volume_backup_policy_assignment, exactly as in Step 2, pointing policy_id at oci_core_volume_backup_policy.custom_backup_policy.id.

Step 4: Apply the Terraform Configuration

Once your Terraform configuration is ready, apply it using the standard Terraform workflow:

  1. Initialize Terraform:

terraform init

  2. Plan the Terraform deployment:

terraform plan

  3. Apply the Terraform plan:

terraform apply

This process will automatically provision your block volumes and assign the specified backup policy.



Regards
Osama

Building a Scalable Data Pipeline on OCI with Data Flow

In this blog, we will explore how to build a scalable data pipeline on Oracle Cloud Infrastructure (OCI) using OCI Data Flow. We’ll cover the end-to-end process, from setting up OCI Data Flow to processing large datasets, and integrating with other OCI services.

Introduction to OCI Data Flow

  • Overview of OCI Data Flow and its key features.
  • Benefits of using a serverless, scalable data processing service.
  • Common use cases for OCI Data Flow, including ETL, real-time analytics, and machine learning.

Setting Up OCI Data Flow

Prerequisites

  • An active Oracle Cloud account.
  • Necessary permissions and quotas for creating OCI resources.

Configuration Steps

  1. Create a Data Flow Application:
    • Navigate to the OCI Console and open the Data Flow service.
    • Click on “Create Application” and provide the necessary details.
    • Define your application’s parameters and Spark version.
  2. Configure Networking:
    • Set up Virtual Cloud Network (VCN) and subnets.
    • Ensure proper security lists and network security groups (NSGs) for secure communication.

Creating a Scalable Data Pipeline

Designing the Data Pipeline

  • Outline the flow of data from source to target.
  • Example pipeline: Ingest data from OCI Object Storage, process it using Data Flow, and store results in an Autonomous Database.

Developing Data Flow Jobs

  • Write Spark jobs in Scala, Python, or Java.
  • Example Spark job to process data:
val df = spark.read.json("oci://<bucket_name>@<namespace>/data/")
df.filter("age > 30").write.csv("oci://<bucket_name>@<namespace>/output/")

Deploying and Running Jobs

  • Deploy the Spark job to OCI Data Flow.
  • Schedule and manage job runs using OCI Console or CLI.
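As a sketch of that last point, a run of an existing Data Flow application can also be submitted programmatically with the OCI Python SDK. The OCIDs and arguments below are placeholders, and the client and model names assume the current SDK.

import oci

config = oci.config.from_file()
df_client = oci.data_flow.DataFlowClient(config)

# Submit a run of an existing Data Flow application.
run_details = oci.data_flow.models.CreateRunDetails(
    compartment_id="<compartment_OCID>",
    application_id="<application_OCID>",
    display_name="nightly-etl-run",
    arguments=["--input", "oci://<bucket_name>@<namespace>/data/",
               "--output", "oci://<bucket_name>@<namespace>/output/"],
)
response = df_client.create_run(run_details)
print(response.data.lifecycle_state)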

Processing Large Datasets

Handling Big Data

  • Techniques for optimizing Spark jobs for large datasets.
  • Using partitions and caching to improve performance.
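Here is a minimal PySpark sketch of those two techniques; the bucket, column, and path names are illustrative.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("large-dataset-tuning").getOrCreate()

# Repartition by a commonly filtered column so downstream stages work on
# evenly sized, co-located partitions instead of a few huge ones.
events = spark.read.parquet("oci://<bucket_name>@<namespace>/events/")
events = events.repartition(200, "event_date")

# Cache an intermediate result that is reused by several later steps.
recent = events.filter("event_date >= '2024-01-01'").cache()

recent.groupBy("event_type").count() \
    .write.mode("overwrite").parquet("oci://<bucket_name>@<namespace>/summary/")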

Example: Processing a 1TB Dataset

  • Step-by-step guide to ingest, process, and analyze a 1TB dataset using OCI Data Flow.

Integrating with Other OCI Services

OCI Object Storage

  • Use Object Storage for data ingestion and storing intermediate results.
  • Configure Data Flow to directly access Object Storage buckets.

OCI Autonomous Database

  • Store processed data in an Autonomous Database.
  • Example of loading data from Data Flow to Autonomous Database.
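A minimal sketch of that load in PySpark follows. It assumes the Oracle JDBC driver and the unzipped wallet are available to the Spark runtime; the connect alias, schema, table, and credentials are illustrative placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("load-to-adb").getOrCreate()

# Read the processed results, then write them to Autonomous Database over JDBC.
processed_df = spark.read.parquet("oci://<bucket_name>@<namespace>/summary/")

processed_df.write \
    .format("jdbc") \
    .option("url", "jdbc:oracle:thin:@myatpdb_high?TNS_ADMIN=/opt/dataflow/wallet") \
    .option("dbtable", "SAMPLE_USER.EMPLOYEE_SUMMARY") \
    .option("user", "SAMPLE_USER") \
    .option("password", "<password>") \
    .option("driver", "oracle.jdbc.OracleDriver") \
    .mode("append") \
    .save()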

OCI Streaming

  • Integrate with OCI Streaming for real-time data processing.
  • Example: Stream processing pipeline using OCI Streaming and Data Flow.
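As a sketch of the streaming side, a stream can be consumed through OCI Streaming's Kafka-compatible endpoint with Spark Structured Streaming. The bootstrap server, stream name, and SASL credentials below are placeholders; OCI Streaming authenticates with a tenancy/user/stream-pool username and an auth token.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("streaming-pipeline").getOrCreate()

# Read messages from an OCI Streaming stream via its Kafka-compatible endpoint.
stream_df = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "cell-1.streaming.<region>.oci.oraclecloud.com:9092")
    .option("subscribe", "<stream_name>")
    .option("kafka.security.protocol", "SASL_SSL")
    .option("kafka.sasl.mechanism", "PLAIN")
    .option(
        "kafka.sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required '
        'username="<tenancy_name>/<user_name>/<stream_pool_OCID>" '
        'password="<auth_token>";',
    )
    .load()
)

# Decode the message payload; a writeStream sink (for example to Object Storage)
# would complete the pipeline and is omitted to keep the sketch short.
messages = stream_df.select(col("value").cast("string").alias("message"))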

Optimizing Data Flow Jobs

Performance Tuning

  • Tips for optimizing resource usage and job execution times.
  • Adjusting executor memory, cores, and dynamic allocation settings.
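The bullets above map to Spark settings such as the following PySpark sketch. The values are illustrative; in OCI Data Flow the driver/executor shapes and executor count are also chosen on the application itself, so treat these as knobs to experiment with rather than fixed recommendations.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("tuned-job")
    .config("spark.executor.memory", "8g")
    .config("spark.executor.cores", "4")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "2")
    .config("spark.dynamicAllocation.maxExecutors", "20")
    .getOrCreate()
)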

Cost Management

  • Strategies for minimizing costs while running Data Flow jobs.
  • Monitor job execution and cost metrics using the OCI Console.

Implementing Data Replication and Disaster Recovery with OCI Autonomous Database

Introduction

  • Overview of OCI Autonomous Database and its capabilities.
  • Importance of data replication and disaster recovery for business continuity.

Step-by-Step Guide

  1. Setting Up OCI Autonomous Database
  • Creating an Autonomous Database Instance:
oci db autonomous-database create --compartment-id <compartment_OCID> --db-name "MyDatabase" --cpu-core-count 1 --data-storage-size-in-tbs 1 --admin-password "<password>" --display-name "MyAutonomousDB" --db-workload "OLTP" --license-model "BRING_YOUR_OWN_LICENSE" --wait-for-state AVAILABLE

2. Configuring Data Replication

  • Creating a Database Backup:
oci db autonomous-database-backup create --autonomous-database-id <db_OCID> --display-name "MyBackup" --wait-for-state ACTIVE

3. Setting Up Autonomous Data Guard for High Availability:

  • Enabling Autonomous Data Guard on the primary database (ADB creates and manages the standby for you, so no separate standby OCID is supplied):
oci db autonomous-database update --autonomous-database-id <primary_db_OCID> --is-data-guard-enabled true

4. Implementing Disaster Recovery

  • Configuring Backup Retention Policies:
  • Set up automated backups with a specific retention period through the OCI Console or CLI:
oci db autonomous-database update --autonomous-database-id <db_OCID> --backup-retention-period-in-days 30
  • Restoring the Database to a Point in Time (the restore is performed in place on the same database):
oci db autonomous-database restore --autonomous-database-id <db_OCID> --timestamp "2024-01-01T00:00:00Z"

5. Testing and Validating Disaster Recovery

  • Performing a Failover Test:
    • Failover to the Standby Database (the operation is issued against the database OCID; Autonomous Data Guard manages the standby):
oci db autonomous-database failover --autonomous-database-id <db_OCID>
  • Verifying Data Integrity:
    • Connect to the standby database and validate data consistency and application functionality.

6. Automating and Monitoring

  • Automating Backups and Replication:
    • Use OCI’s built-in scheduling features to automate backup creation and data replication.
  • Monitoring Database Health and Performance:
    • Use OCI Monitoring to set up alarms and dashboards for tracking the health and performance of your Autonomous Database.
    • Example Alarm (the alarm condition is expressed as an MQL query; the metric namespace and notification topic below are placeholders):
oci monitoring alarm create --compartment-id <compartment_OCID> --display-name "HighIOWaitTime" --metric-compartment-id <compartment_OCID> --namespace "<metric_namespace>" --query-text "io_wait_time[5m].mean() > 1000" --severity "CRITICAL" --destinations '["<notification_topic_OCID>"]' --is-enabled true

Building a Secure Data Pipeline with OCI Data Flow and OCI Data Integration

Setting Up OCI Data Flow

Creating a Data Flow Application (a representative command; the bucket, file, Spark version, and shape values are illustrative):

oci data-flow application create --compartment-id <compartment_OCID> --display-name "MyDataFlowApp" --description "Data processing application" --language PYTHON --file-uri "oci://<bucket_name>@<namespace>/my_job.py" --spark-version "3.2.1" --driver-shape "VM.Standard2.1" --executor-shape "VM.Standard2.1" --num-executors 2

Creating a Data Flow Run:

oci data-flow run create --application-id <application_OCID> --display-name "MyDataFlowRun" --compartment-id <compartment_OCID> --arguments '["--input", "<input_data_location>", "--output", "<output_data_location>"]' --wait-for-state SUCCEEDED

Setting Up OCI Data Integration

  • Creating a Data Integration Task:
    • Go to Data Integration > Data Tasks > Create Task.
    • Define your task type (e.g., Copy Data, Data Mapping) and configure source and target data stores.
  • Setting Up Data Flows:
    • Define and configure data flows that transform and move data between different sources and targets.
    • Example: Copy data from an OCI Object Storage bucket to a database.

Securing Your Data Pipeline

  • Data Encryption:
    • At Rest: Ensure data stored in OCI Object Storage is encrypted using server-side encryption.
    • In Transit: Use HTTPS for secure data transfers between services.
  • Access Control:
    • Configure IAM policies to restrict access to data sources and pipelines.
    • Example IAM Policy (dis-workspaces is the Data Integration resource-type; the compartment is referenced by name):
allow group <group_name> to manage dis-workspaces in compartment <compartment_name>

Network Security:

  • Use VCNs and subnets to isolate data processing environments.
  • Example: Set up a private endpoint for data flow applications.
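As a sketch of that last example, a Data Flow private endpoint can be created with the OCI Python SDK. The subnet OCID and DNS zone below are placeholders, and the client and model names are assumptions based on the current SDK.

import oci

config = oci.config.from_file()
df_client = oci.data_flow.DataFlowClient(config)

# Create a private endpoint so Data Flow runs can reach resources that are
# only resolvable and reachable inside the VCN.
details = oci.data_flow.models.CreatePrivateEndpointDetails(
    compartment_id="<compartment_OCID>",
    display_name="dataflow-private-endpoint",
    subnet_id="<private_subnet_OCID>",
    dns_zones=["example.internal"],
)
df_client.create_private_endpoint(details)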

Monitoring and Managing Data Pipelines

Monitoring Data Flow Runs:

oci data-flow run list --compartment-id <compartment_OCID> --application-id <application_OCID>

Setting Up Alarms:

  • Use OCI Monitoring to create alarms based on metrics from data flows and integration tasks.

Example Alarm (the alarm condition is expressed as an MQL query; the metric namespace and notification topic are placeholders):

oci monitoring alarm create --compartment-id <compartment_OCID> --display-name "HighErrorRate" --metric-compartment-id <compartment_OCID> --namespace "<metric_namespace>" --query-text "error_rate[5m].mean() > 5" --severity "WARNING" --destinations '["<notification_topic_OCID>"]' --is-enabled true

This puts in place a secure data pipeline that stages log data in an OCI Object Storage bucket, processes it with OCI Data Flow, and uses OCI Data Integration to load it into an OCI Autonomous Database. To protect the security and integrity of the data, the pipeline includes access controls, encryption, and monitoring.

Thank you
Osama

How to set up the OCI CLI

Setting up the OCI CLI (Command Line Interface) involves several steps to authenticate, configure, and start using it effectively. Here’s a detailed guide to help you set up OCI CLI.

Step 1: Prerequisites

  1. OCI Account: Ensure you have an Oracle Cloud Infrastructure account.
  2. Access: Make sure you have appropriate permissions to create and manage resources.
  3. Operating System: OCI CLI supports Windows, macOS, and Linux distributions.

Step 2: Install OCI CLI

Install Python: OCI CLI requires Python 3.5 or later. Install Python if it’s not already installed:

On Linux (Debian/Ubuntu):

sudo apt update
sudo apt install python3

On macOS (via Homebrew):

brew install python3

On Windows: Download and install Python from python.org.

Install OCI CLI: Use pip, Python’s package installer, to install OCI CLI:

pip3 install oci-cli

Step 3: Configure OCI CLI

  1. Generate API Signing Keys: OCI CLI uses API signing keys for authentication. If you haven’t created keys yet, generate them through the OCI Console:
    • Go to Identity > Users.
    • Select your user.
    • Under Resources, click on API Keys.
    • Generate a new key pair if none exists.

  2. Configure OCI CLI: After installing OCI CLI, configure it with your tenancy, user details, and API key:

  • Open a terminal or command prompt.
  • Run the following command:
oci setup config
  • Enter a location for your config file: Choose a path where OCI CLI configuration will be stored (default is ~/.oci/config).
  • Enter a user OCID: Enter your user OCID (Oracle Cloud Identifier).
  • Enter a tenancy OCID: Enter your tenancy OCID.
  • Enter a region name: Choose the OCI region where your resources are located (e.g., us-ashburn-1).
  • Do you want to generate a new API Signing RSA key pair?: If you haven’t generated API keys, choose yes and follow the prompts.

Once configured, OCI CLI will create a configuration file (config) and a key file (oci_api_key.pem) in the specified location.
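To confirm the new configuration actually authenticates, you can make a simple call with the OCI Python SDK, which reads the same config file. A minimal sketch (install the SDK first with pip3 install oci):

import oci

# Reads the ~/.oci/config file generated above and makes a simple authenticated
# call: listing the regions the tenancy is subscribed to.
config = oci.config.from_file()
identity = oci.identity.IdentityClient(config)

for subscription in identity.list_region_subscriptions(config["tenancy"]).data:
    print(subscription.region_name)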

Thank you

Osama

Exploring Oracle Cloud Infrastructure (OCI)

In today’s rapidly evolving digital landscape, choosing the right cloud infrastructure is crucial for organizations aiming to scale, secure, and innovate efficiently. Oracle Cloud Infrastructure (OCI) stands out as a robust platform offering a comprehensive suite of cloud services tailored for enterprise-grade performance and reliability.

Overview of OCI: Oracle Cloud Infrastructure (OCI) provides a highly scalable and secure cloud computing platform designed to meet the needs of both traditional enterprise workloads and modern cloud-native applications. Key components include:

  • Compute Services: OCI offers Virtual Machines (VMs) for general-purpose and high-performance computing, Bare Metal instances for demanding workloads, and Container Engine for Kubernetes clusters.
  • Storage Solutions: Includes Block Volumes for persistent storage, Object Storage for scalable and durable data storage, and File Storage for file-based workloads.
  • Networking Capabilities: Virtual Cloud Network (VCN) enables customizable network topologies with VPN and FastConnect for secure and high-bandwidth connectivity. Load Balancer distributes incoming traffic across multiple instances.
  • Database Options: Features Autonomous Database for self-driving, self-securing, and self-repairing databases, MySQL Database Service for fully managed MySQL databases, and Exadata Cloud Service for high-performance databases.

Example: Implementing Autonomous Database

Autonomous Database handles routine tasks like patching, backups, and updates automatically, allowing the IT team to focus on enhancing customer experiences.

Security and Compliance: OCI provides robust security features such as Identity and Access Management (IAM) for centralized control over access policies, Security Zones for isolating critical workloads, and Web Application Firewall (WAF) for protecting web applications from threats.

Management and Monitoring: OCI’s Management Tools offer comprehensive monitoring, logging, and resource management capabilities. With tools like Oracle Cloud Infrastructure Monitoring and Logging, organizations gain insights into performance metrics and operational logs, ensuring proactive management and troubleshooting.

Integration and Developer Tools: For seamless integration, OCI offers Oracle Integration Cloud and API Gateway, enabling organizations to connect applications and services securely across different environments. Developer Tools like Oracle Cloud Developer Tools and SDKs support agile development and deployment practices.

Oracle Cloud Infrastructure (OCI) emerges as a robust solution for enterprises seeking a secure, scalable, and high-performance cloud platform. Whether it’s deploying mission-critical applications, managing large-scale databases, or ensuring compliance and security, OCI offers the tools and capabilities to drive innovation and business growth.

AWS Infrastructure

The AWS Global Cloud Infrastructure is the most secure, extensive, and reliable cloud platform, offering over 200 fully featured services from data centers globally.

AWS Data Center

AWS pioneered cloud computing in 2006 to provide rapid and secure infrastructure. AWS continuously innovates on the design and systems of data centers to protect them from man-made and natural risks. Today, AWS provides data centers at a large, global scale. AWS implements controls, builds automated systems, and conducts third-party audits to confirm security and compliance. As a result, the most highly-regulated organizations in the world trust AWS every day.

Availability Zone – AZ

An Availability Zone (AZ) is one or more discrete data centers with redundant power, networking, and connectivity in an AWS Region. Availability Zones are multiple, isolated areas within a particular geographic location. When you launch an instance, you can select an Availability Zone or let AWS choose one for you. If you distribute your instances across multiple Availability Zones and one instance fails, you can design your application so that an instance in another Availability Zone can handle requests.

Region

Each AWS Region consists of multiple, isolated, and physically separate Availability Zones within a geographic area. This achieves the greatest possible fault tolerance and stability. In your account, you determine which Regions you need. You can run applications and workloads from a Region to reduce latency to end users. You can do this while avoiding the upfront expenses, long-term commitments, and scaling challenges associated with maintaining and operating a global infrastructure.

AWS Local Zone

AWS Local Zones can be used for highly demanding applications that require single-digit millisecond latency to end users. Media and entertainment content creation, real-time multiplayer gaming, and machine learning hosting and training are some use cases for AWS Local Zones.

CloudFront – Edge Location

An edge location is the nearest point to a requester of an AWS service. Edge locations are located in major cities around the world. They receive requests and cache copies of your content for faster delivery.

Regards

Osama

JCON ONLINE 2022

Save the date and don’t forget to register.

I will be speaking about Infrastructure as Code (IaC).

Where to start?

Check out our website at https://2022.jcon.one/

Our session planner is available as an event platform at https://2022.jcon.one/session-plan  

Social media:

Please use the hashtag #JCON2022 to promote the conference. Our Twitter handle is @jcon_conference / https://twitter.com/jcon_conference.

Please join our event on LinkedIn: https://www.linkedin.com/events/6915612844054999040

Please join our event on XING: https://www.xing.com/events/jcon-online-2022-3886249

Thank you

Enjoy the event.

Another year – Top 100 Oracle Blogs and Websites

The best Oracle blogs from thousands of blogs on the web ranked by traffic, social media followers, domain authority & freshness.

Happy to share that my blog has been chosen for another year as one of the Top 100 Oracle Blogs around the world; the list contains talented, experienced, and professional people 🎉🎉🎉

Thank you all for the support.

Cheers

Osama