Pakistan's First Oracle Blog

Blog by Fahd Mirza Chughtai

Cloud data lake house

Tue, 2022-04-12 21:44

A data lake can help you break down data silos and combine different types of analytics in a centralized repository, where you can store all of your structured and unstructured data. However, setting up and managing a data lake involves many manual, complicated, and time-consuming tasks.

We’re seeing the use of data analytics expanding among new audiences within organizations, for example with users like developers and line of business analysts who don’t have the expertise or the time to manage a traditional data warehouse. Also, some customers have variable workloads with unpredictable spikes, and it can be very difficult for them to constantly manage capacity.

A data lake is a place to store your structured and unstructured data, as well as a method for organizing large volumes of highly diverse data from diverse sources. Data lakes are becoming increasingly important as people, especially in business and technology, want to perform broad data exploration and discovery. Bringing all, or most, of that data together in a single place makes such exploration simpler.

A data lakehouse can be defined as a modern data platform built from a combination of a data lake and a data warehouse. A data lakehouse takes the flexible storage of unstructured data from a data lake and the management features and tools from data warehouses, then strategically implements them together as a larger system.   

Oracle Cloud Infrastructure Data Integration is a fully managed, serverless, cloud-native service that extracts, loads, transforms, cleanses, and reshapes data from a variety of data sources into target Oracle Cloud Infrastructure services, such as Autonomous Data Warehouse and Oracle Cloud Infrastructure Object Storage. 

Categories: DBA Blogs

OCI Vulnerability Scanning Service (VSS) and Oracle Cloud Guard

Tue, 2022-04-12 21:38

 The Cloud Native Computing Foundation reported that over 92% of firms are using containers in production in 2020, up from 23% in 2016. The need to innovate faster and shift to cloud-native application architectures isn’t just driving complexity, it’s creating significant vulnerability blind spots.

Oracle has a new Oracle Cloud Guard detector for container image scanning. Customers can set the risk level for container images in the new Cloud Guard detector. The detector collects image findings, which then become a container image 'problem' in Cloud Guard. This additional feature is great for users who don't normally use the VSS or OCIR consoles to check the status of their container images. Cloud Guard will alert users when VSS detects container images with high-risk vulnerabilities, so everyone will know that a development team needs to address the issues quickly.

Container security is the process of implementing tools and policies to ensure that container infrastructure, apps, and other container components are protected. Linux containers allow both developers and IT operations to create a portable, lightweight, and self-sufficient environment for every application. However, securing containerized environments is a significant concern for Dev/Sec/Ops teams.

Unfortunately, container security is much more difficult to achieve than security for more traditional compute platforms, such as virtual machines or bare metal hosts.

A container is a standalone file or package of software files with everything you need to run an application. The application’s code, dependencies, library, runtime, and system tools are all “contained” within the container. As a result, containers have made the process of developing an application faster, simpler, and more powerful than ever.

To reduce an application’s attack surface, developers need to remove any components that aren’t needed. Use scripts to configure hosts properly based on the CIS benchmarks. Although legacy SCA and SAST tools can be slow and cumbersome to use, many have been evolving in recent years to support DevOps initiatives and automation, and they are still an important part of container security. 
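As a minimal, illustrative sketch of shrinking that attack surface (the base image, file names, and user below are placeholders, not a hardened production recipe), a container image can start from a slim base, copy in only what the app needs, and drop root privileges:

```dockerfile
# Illustrative Dockerfile: small base image, only the files the app needs,
# and a non-root user -- fewer components means a smaller attack surface.
FROM python:3.10-slim
WORKDIR /app
COPY app.py .
RUN useradd --no-create-home appuser
USER appuser
CMD ["python", "app.py"]
```

Pairing an image built like this with VSS scanning and the Cloud Guard detector covers both build-time and runtime hygiene.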

Categories: DBA Blogs

OCI AI services and NVIDIA

Tue, 2022-04-12 21:32

Make accurate predictions, get deeper insights from your data, reduce operational overhead, and improve customer experience with Oracle AI and ML services. Oracle Cloud helps you at every stage of your ML adoption journey with the most comprehensive set of artificial intelligence (AI) and ML services, infrastructure, and implementation resources.

Improve your business outcomes with ready-made intelligence for your applications and workflows. From computer vision to automated data extraction, and from analysis to quality control, Oracle has all kinds of offerings.

Oracle has teamed up with NVIDIA to demo HPC applications on NVIDIA accelerated computing. Oracle has pretrained ML services that developers, data scientists, and even business analysts can integrate directly and seamlessly.

OCI AI services are a collection of prebuilt machine learning models that developers can easily add to their applications and business operations. These models, which you can further custom train on an organization’s own business data, are pretrained on industry data, which helps them deliver more accurate results. Developers can now focus on accelerating application development without needing data science experience.

The following services all run directly on NVIDIA GPU offerings in OCI:

  • Oracle Digital Assistant
  • OCI Language
  • OCI Vision
  • OCI Speech
  • OCI Anomaly Detection
  • OCI Forecasting

Categories: DBA Blogs

What to Expect from First Oracle CloudWorld

Tue, 2022-04-12 21:27

Oracle OpenWorld has become CloudWorld. So what can we expect from the first Oracle CloudWorld, the premier Oracle event of the year, in glittering Las Vegas?

From October 16-20, 2022, Oracle is all set to dazzle the world. From IT administrators to business professionals, CloudWorld attendees will learn from organizations of all sizes and industries as they share how they’ve used Oracle Cloud to modernize their IT operations.

Just like its other solutions, Oracle aims to cover the whole business fleet of any kind. Oracle Cloud wants to be a cloud with solutions for every type of industry, including but not limited to finance, supply chain, customer experience, and human resources.

If there is one event this year you want to attend, it should be Oracle CloudWorld.

Categories: DBA Blogs

Know Your AWS Cloud Carbon Footprint

Mon, 2022-03-07 20:45
If you stop or delete what you are not using in the cloud, you are saving energy, which means you are reducing carbon emissions and playing your part in making this planet more sustainable. You can now keep track of this lasting impact on the environment by using the AWS Customer Carbon Footprint Tool.

It's a free tool from AWS; you can access it in your account from the AWS Billing console under Cost & Usage Reports.

This tool aligns perfectly with the AWS sustainability pillar, which is all about energy consumption and its impact on the environment. As per Jeff Barr, "If you are an AWS customer, then you are already benefiting from our efforts to decarbonize and to reach 100% renewable energy usage by 2025, five years ahead of our original target."

Categories: DBA Blogs

Oracle Cloud Has Lot for Free

Tue, 2022-02-01 19:35

Oracle is making a great effort to compete with other cloud providers. Besides its ever-improving cloud services, its free offerings are massive and constantly being added to.

Following is the latest list of Oracle Cloud's free offerings:


  • AMD-based Compute
  • Block Volume
  • Flexible Load Balancer
  • Virtual Cloud Networks
  • Arm-based Ampere A1 Compute
  • Object Storage
  • Flexible Network Load Balancer
  • Site-to-site VPN
  • Outbound Data Transfer
  • Archive Storage
  • VPN Connect

Managed database

  • Autonomous Data Warehouse
  • NoSQL Database
  • Autonomous Transaction Processing
  • Autonomous JSON Database

Resource management and automation

  • Resource Manager (Terraform)
  • Vault (key management)
  • Bastions
  • Monitoring
  • Logging
  • Security Advisor
  • Service Connector Hub
  • Application Performance Monitoring (APM)
  • Security Zones

PaaS and app dev

  • APEX Application Development
  • Notifications (pub/sub)
  • Content Management Starter Edition

Enjoy Oracle Cloud!

Categories: DBA Blogs

Using AWS Cloudfront for URL Redirection

Tue, 2022-02-01 19:20

AWS CloudFront is a content delivery network (CDN) and can be used for URL redirection. Sure enough, any DNS or web server can be used for that matter, but CloudFront lets us use SSL certificates through ACM, plus WAF rules and AWS Shield for extra protection.

All you have to do is specify your target URL, enable redirection from HTTP to HTTPS, specify your WAF, attach a free custom SSL certificate for your domain from ACM, set any HTTP headers, and configure an S3 bucket for logging, and that's it.

Hope that helps.

Categories: DBA Blogs

Failed to start The nginx HTTP and reverse proxy server on RedHat EC2 Linux

Tue, 2021-10-19 21:55

I needed a simple reverse proxy to redirect connections to an RDS database in a private subnet, so I quickly created a Red Hat Linux EC2 instance, installed NGINX, and set up the nginx.conf file for session redirection. My nginx.conf looked like the following:

user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;
pid /run/nginx.pid;

include /usr/share/nginx/modules/*.conf;

events {
    worker_connections 1024;
}

stream {
    upstream target_server {
        server targetdb:1521;
    }

    server {
        listen 1521;
        proxy_pass target_server;
    }
}


But starting the nginx process gave the following error:

[root@test nginx]# systemctl start nginx

Job for nginx.service failed because the control process exited with error code. See "systemctl status nginx.service" and "journalctl -xe" for details.

[root@test nginx]# systemctl status nginx.service
● nginx.service - The nginx HTTP and reverse proxy server
   Loaded: loaded (/usr/lib/systemd/system/nginx.service; enabled; vendor preset: disabled)
   Active: failed (Result: exit-code) since Wed 2021-10-20 13:40:57 AEDT; 5s ago
  Process: 14702 ExecStartPre=/usr/sbin/nginx -t (code=exited, status=1/FAILURE)
  Process: 14700 ExecStartPre=/usr/bin/rm -f /run/ (code=exited, status=0/SUCCESS)

Oct 20 13:40:57 test systemd[1]: Starting The nginx HTTP and reverse proxy server...
Oct 20 13:40:57 test nginx[14702]: nginx: [emerg] unknown directive "stream" in /etc/nginx/nginx.conf:9
Oct 20 13:40:57 test nginx[14702]: nginx: configuration file /etc/nginx/nginx.conf test failed
Oct 20 13:40:57 test systemd[1]: nginx.service: control process exited, code=exited status=1
Oct 20 13:40:57 test systemd[1]: Failed to start The nginx HTTP and reverse proxy server.
Oct 20 13:40:57 test systemd[1]: Unit nginx.service entered failed state.
Oct 20 13:40:57 test systemd[1]: nginx.service failed.

Solution: The stream module is not compiled into the base Red Hat nginx package; it ships as a dynamic module. Just install nginx-mod-stream:

[root@test nginx]# ls -ltr /usr/lib/nginx/modules/
ls: cannot access /usr/lib/nginx/modules/ No such file or directory
[root@ip-10-219-40-147 nginx]# yum install nginx-mod-stream

Now validate the configuration with nginx -t, start the nginx service again, and it should work.
Categories: DBA Blogs

Introduction to Oracle Blockchain Tables in 21c

Mon, 2021-08-16 18:32

When I first heard a few years back that Oracle was now available in blockchain, I fell from my chair in awe at how quickly they had adopted this bleeding-edge technology. I thought Oracle Corporation would also release its own cryptocurrency. But then I learned that a blockchain oracle is not that Oracle.

A blockchain oracle is just a service which feeds required information to smart contracts. A smart contract is just a program on a blockchain which creates an immutable agreement between stakeholders without a third party.

Anyway, at last Oracle has dipped its feet into blockchain technology, it seems. In Oracle 21c, we now have something called blockchain tables. A blockchain is simply a linked chain of blocks which store a ledger of transactions, verified and duplicated by all the nodes based on a complex cryptographic algorithm. This transparency, decentralization, and cryptographic foundation make it more secure and eliminate the need for third parties.

Oracle blockchain is not really that decentralized: it centralizes the whole blockchain within the Oracle database. Blocks in a blockchain table are chained together by row hashes. You can only insert data into these tables; rows can be deleted only subject to a retention value.

CREATE BLOCKCHAIN TABLE cardano_ledger (ada_id NUMBER, ada_tokens NUMBER)
    NO DROP UNTIL 365 DAYS IDLE
    NO DELETE LOCKED
    HASHING USING "SHA2_512" VERSION "v1";
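To illustrate the row-chaining idea, here is a toy sketch in Python (this is only an illustration of hash chaining in general, not Oracle's actual hashing scheme or column layout): each row's hash covers the previous row's hash, so tampering with any earlier row invalidates the rest of the chain.

```python
import hashlib

def row_hash(prev_hash: str, payload: str) -> str:
    # Each row's hash covers the previous row's hash, chaining the rows.
    return hashlib.sha512((prev_hash + payload).encode()).hexdigest()

def build_chain(rows):
    # Build a list of (payload, hash) pairs, each hash linked to the last.
    chain, prev = [], ""
    for payload in rows:
        h = row_hash(prev, payload)
        chain.append((payload, h))
        prev = h
    return chain

def verify_chain(chain) -> bool:
    # Recompute every hash; any altered payload breaks the chain.
    prev = ""
    for payload, h in chain:
        if row_hash(prev, payload) != h:
            return False
        prev = h
    return True

ledger = build_chain(["ada_id=1,tokens=100", "ada_id=2,tokens=250"])
print(verify_chain(ledger))    # the untouched chain verifies
tampered = [(p.replace("100", "999"), h) for p, h in ledger]
print(verify_chain(tampered))  # tampering is detected
```

In the real feature, the database maintains this chaining for you and exposes verification through its own packages.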

You can then check the attributes of a blockchain table in USER_BLOCKCHAIN_TABLES.

You can delete rows in a blockchain table only by using the DBMS_BLOCKCHAIN_TABLE package, and only rows that are outside the retention period. You can only increase the retention period and not decrease it.





DECLARE
   NUMBER_ROWS NUMBER;
BEGIN
   DBMS_BLOCKCHAIN_TABLE.DELETE_EXPIRED_ROWS(USER, 'CARDANO_LEDGER', NULL, NUMBER_ROWS);
   DBMS_OUTPUT.PUT_LINE('Number of rows deleted=' || NUMBER_ROWS);
END;
/




And you cannot truncate a blockchain table in Oracle database.

Now, is an Oracle blockchain table really a blockchain? It compromises on decentralization and is conditionally mutable. I will leave the decision to you.

For more practice with Oracle blockchain tables, check the Oracle docs.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 8

Thu, 2021-07-01 21:55

Welcome to the next edition of weekly Cloud Vanity. By 2025, 85 percent of companies will have containerized applications in production, as per Gartner. Due to their very nature, containerized apps have to be orchestrated, and as of now, Kubernetes is the most viable option. To take the edge off the Kubernetes complexity, we have EKS, AKS, ACK, and GKE, among various other options. Kubernetes is here to stay, one way or another.


The iOS clients rely on Amazon Cognito for authentication and authorization, provisioned using the Amplify CLI authentication category. The Amplify DataStore is used to persist the data locally on the client side, providing capabilities that allow automatic synchronization of data between devices and the cloud.

MYCOM OSI offers assurance, automation, and analytics software-as-a-service (SaaS) applications for the digital era. The Assurance Cloud Service™ provides critical end-to-end performance, fault and service quality management, and supports AI and machine learning (ML)-driven closed-loop assurance for hybrid, physical, and virtualized networks, across all domains, within a SaaS model.

Binlog replication is a popular feature serving multiple use cases, including offloading transactional work from a source database, replicating changes to a separate dedicated system to run analytics, and streaming data into other systems, but the benefits don’t come for free.

With AWS Glue DataBrew, data analysts and data scientists can easily access and visually explore any amount of data across their organization directly from their Amazon Simple Storage Service (Amazon S3) data lake, Amazon Redshift data warehouse, Amazon Aurora, and other Amazon Relational Database Service (Amazon RDS) databases. You can choose from over 250 built-in functions to merge, pivot, and transpose the data without writing code.

Everyone has their favorite integrated development environment, or IDE, as it’s more commonly known. For many of us, it’s a tool that we rely on for our day-to-day activities. In some instances, it’s a tool we’ve spent years getting set up just the way we want – from the theme that looks the best to the most productive plugins and extensions that help us optimize our workflows.


Run a "provisioning" automation for a completely hands-off experience for instrumenting and monitoring any new applications that you create and deploy—using Terraform or ARM Template. Or you can run it on-demand using the Azure CLI for greater flexibility and control.

As defined by Gartner, AIOps enhances IT operations through insights that combine big data, machine learning, and visualization to automate IT operations processes, including event correlation, anomaly detection, and causality determination. 

From edge to cloud, companies are eager to find innovative solutions that meet them where they are. Today’s business environment is increasingly complex, and customers tell us they need solutions that are multi-cloud, platform-agnostic, and offer integrated apps and services that are always up to date. 

This article introduces MSIX & a deep dive/walkthrough on MSIX App Attach, Microsoft’s layering solution for delivering applications to a modern workspace.

A virtual network is much like an on-premises network, where we use switches and routers to communicate with servers and clients; an Azure VNet is likewise used to communicate with Azure resources (virtual machines, databases, etc.).


The financial services industry is changing—an estimated $68 trillion in wealth transferring from baby boomers to millennials.

For Data Preview 0, the IDF leverages Cloud Storage, Google Kubernetes Engine (GKE), and Compute Engine to provide the Rubin Observatory user community access to simulated LSST data in an early version of the RSP. 

Digital native companies have no shortage of data, which is often spread across different platforms and Software-as-a-service (SaaS) tools.

Many of our customers want to know how to choose a technology stack for solving problems with machine learning (ML).

Setting up Cloud Monitoring dashboards for your team can be time consuming because every team's needs are different. 


This article describes how to use the ORAS client to push Wasm modules with allowed media types to the Alibaba Cloud Container Registry (ACR) (an OCI-compatible registry).

Before the Dubbo-go development tests started recently, samples were used for functional verification. Now, to reproduce a functional problem, a project of Dubbo-go and Dubbo calling has been planned from scratch to record the potential problems newcomers might encounter.

They transform mission-critical business systems by automating transactions and processes. Functions like campaign to lead, order to cash, procure to pay, incident to resolution, concept to market and hire to retire can now all be optimized and accelerated by artificial intelligence.

HPE helps customers recover from cyberattacks in minutes with acquisition of data protection leader, Zerto

Simplify your software deployments with the Oracle Cloud Infrastructure DevOps service

Categories: DBA Blogs

Best Ultimate Definition of DevOps

Thu, 2021-06-24 23:50

DevOps enables us to build software in a faster and more reliable way.

That's it.

Yes that's pretty much all there is to it. 

Over the decades, what was marring software development was that software took too long to build, because any serious software needs multiple teams to develop, store, build, test, integrate, and deploy it.

Earlier, at every step, teams worked in silos, taking their sweet time to finish their part and chucking the software over the fence to the next team. Without any automation or collaboration between these teams, software took longer to build, and the end product was always replete with bugs.

DevOps is a natural evolution from that process: code is stored in a single repo, and all developers of an application work in that repo; then there is a single automated build process for that repo, which gets tested on the way to deployment. Agile methodology and shorter sprints ensure that software passes through these phases in days rather than months, and bugs are identified and rectified as early as possible in the cycle.

Then comes the slew of tooling available, ranging from Git or Bitbucket and others for source code, Jenkins or TeamCity and others for build and testing, and then either automatic or manual deployment to different environments.
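As an illustration of that single-repo, single-automated-build idea, here is a minimal pipeline definition in the style of GitHub Actions (the workflow name and the build, test, and deploy commands are made-up placeholders, not a prescription):

```yaml
# .github/workflows/ci.yml -- illustrative pipeline: build, test, deploy
name: ci
on:
  push:
    branches: [main]
jobs:
  build-test-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Build
        run: make build
      - name: Test
        run: make test
      - name: Deploy
        if: github.ref == 'refs/heads/main'
        run: ./deploy.sh staging   # hypothetical deploy script
```

Every push triggers the same build and test steps, so problems surface in minutes instead of at the end of a long release cycle.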

I hope that helps.

Categories: DBA Blogs

EKS Login error: You must be logged in to the server (Unauthorized)

Tue, 2021-06-22 09:00

The reason why you would receive the following error when you try to access an EKS cluster is that the user who created the EKS cluster is different from the one you are using to run kubectl commands:

error: You must be logged in to the server (Unauthorized)

In order to resolve this, either use aws-iam-authenticator or use the same user to run your kubectl commands with which you created the cluster. I normally like to work with the kubeconfig file present, so I use the same user for both creating the EKS cluster and running kubectl commands. I also set my config and credentials in AWS as follows:

AWS .config file entry:

[profile eks-np]
region = ap-southeast-2
output = json

AWS credentials file entry:

[eks-np]
aws_access_key_id = <key id>
aws_secret_access_key = <access key>
region = ap-southeast-2

and then run the following command to update your .kube config with the cluster and context info:

aws eks --region ap-southeast-2 update-kubeconfig --name ekscluster --profile eks-np
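If you do need to let a second IAM user in, instead of reusing the cluster creator, the usual approach is to map that user in the cluster's aws-auth ConfigMap in the kube-system namespace. The account ID, ARN, username, and group below are illustrative only:

```yaml
# Edit with: kubectl edit -n kube-system configmap/aws-auth
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapUsers: |
    - userarn: arn:aws:iam::111122223333:user/eks-np-user
      username: eks-np-user
      groups:
        - system:masters
```

Once the mapping is in place, that user's kubectl commands authenticate without the "You must be logged in to the server" error.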

I hope that helps.

Categories: DBA Blogs

3 Tips For Mental Health as Cloud Engineer

Mon, 2021-06-21 17:52

It's simple: our brain is our biggest asset as a cloud or DevOps engineer. If you start losing your mind, or if it becomes chaotic and unsound, you won't be a cloud engineer for long. Following are 3 tips to maintain your mental health as a cloud engineer.

1- You don't have to Learn it all 

You don't have to learn all the hundreds of services offered by AWS, Azure, GCP, Alibaba, Oracle, etc. Don't stress about learning every DevOps CI/CD tool under the sun. You are not supposed to learn everything in the cloud with respect to networking, compute, storage, databases, serverless, streams, ML, AI, integration, and the rest. Stop falling into that bottomless chasm, as you would never be able to get out of it. Just learn what's needed now, what fits your background, what interests you, and what gets the job done.

2- Don't take Cloud Job postings too seriously

One of the ways I was gauging what today's businesses require in terms of skillset was to review cloud job postings. After going through the required skills, I was getting very depressed, as the list was long. Then I realized it is humanly impossible for anyone to possess all those skills in full. How can I be a full-stack developer with a rich networking background, having worked in a true agile environment, staying abreast of CI/CD pipelines, with a mandatory five years of Docker and Kubernetes experience, topped by some SQL and NoSQL database skills? Job postings don't end there, either. You have to know all those tools like DC/OS, K8s, Maven, Ant, Gradle, TeamCity, Jenkins, Bash, PowerShell, Python, Ruby, MongoDB, Kafka, Oracle, IAM, OKTA, AD, ELK, and a plethora of other acronyms. Trust me, most of these job postings are unrealistic, and they know it. If you fit even half the bill in these job postings and are good at what you do, you should be fine.

3- Keep Learning Enjoyably

Don't switch off, and don't go to some remote monastery to meditate. We have built these careers after lots of hard work. We have to stay relevant in the industry, pay mortgages, and provide for our families. What really stresses us is the fear of lagging behind and becoming obsolete. We keep doing this cloud stuff because we enjoy doing it; what bothers us is the endless list of things to learn. Just follow the above 2 points and this stress should be very manageable, and then you can focus on the things you really should be learning and enjoy learning.

Happy learning and progressing.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 6

Thu, 2021-06-17 21:55

Welcome to the next edition of weekly Cloud Vanity. There is so much razzmatazz about the cloud that for a second we tend to think every company is already on some sort of cloud. A recent study has found that still only 35% of companies are in the cloud, and the rest are still thinking about or planning their migration. So there is still a lot of opportunity out there.


Announcing the AWS Security and Privacy Knowledge Hub for Australia and New Zealand

Few things have changed the world more than the internet, and at the heart of the internet is the open source LAMP stack. LAMP, short for Linux, Apache, MySQL, and PHP, enabled developers to build new, interactive web experiences

Australian Commonwealth Government agencies are subject to specific requirements set by the Protective Security Policy Framework (PSPF) for securing connectivity between systems that are running sensitive workloads, and for accessing less trusted environments, such as the internet.

Today, Dave Brown, VP of Amazon EC2 at AWS, announced the Graviton Challenge as part of his session on AWS silicon innovation at the Six Five Summit 2021.

AWS Step Functions allow you to build scalable, distributed applications using state machines. Until today, building workflows on Step Functions required you to learn and understand Amazon State Language (ASL).


Now’s the time to register for the free Azure Hybrid and Multicloud Digital Event on Tuesday, June 29, 2021, from 9:00 AM–11:00 AM Pacific Time, delivered in partnership with Intel.

For over three years, I have had the privilege of leading the SAP solutions on Azure business at Microsoft and of partnering with outstanding leaders at SAP and with many of our global partners to ensure that our joint customers run one of their most critical business assets safely and reliably in the cloud. 

There are many factors that can affect critical environment (CE) infrastructure availability—the reliability of the infrastructure building blocks, the controls during the datacenter construction stage, effective health monitoring and event detection schemes, a robust maintenance program, and operational excellence to ensure that every action is taken with careful consideration of related risk implications.

The power of 5G, IoT, and real-time AI will unlock new and innovative services for enterprises across the world to accelerate their transformation toward Industry 4.0 as they evolve and adopt diverse new business models. 

Cloud and edge computing are coming together as never before, leading to huge opportunities for developers and organizations around the world. Digital twins, mixed reality, and autonomous systems are at the core of a massive wave of innovation from which our customers already benefit.


As your organization evolves, the cloud can be a powerful tool to drive growth, improve efficiency, and reduce costs. In fact, the cloud is so powerful that most organizations find themselves running on multiple clouds

At its core, Data and Analytics allows us to make impactful decisions by deriving insights from our data. In the pursuit of making data meaningful, data scientists and engineers are often tasked with building end-to-end workflows to ingest, process and analyze data.

At Google Cloud, we believe moving to the cloud shouldn’t have to mean starting over from scratch. That’s why we’re on a mission to give you choices for how you run your enterprise workloads, including migrating and modernizing your Windows workloads. 

In times of significant disruption, organizations are faced with three choices: Retrench within legacy solutions, pause and do nothing while waiting for more data or different circumstances, or press ahead, potentially even accelerating to realize the desired outcome.

Our collective understanding of work—where it takes place and how it gets done—has been transformed over the last year.


The Clouds are Thickening: An Overview of The SaaS Ecosystem and Big Cloud Providers

The Move Toward Simplicity: Why a Single-Vendor Approach to AI-Powered Automation Matters

Crafting with XieSandi is an educational simulation video game that describes how to craft different things, such as handcrafting and DIY.

Imagine one hybrid cloud platform that provides the automation, observability and cloud-native capabilities necessary to keep business, technology and teams connected while delivering the best digital experiences now and in the future.

Cloud operating model refers to the operational model used by I&O organization when adopting cloud and trying to execute their cloud strategy. The model is used to transform the traditional IT management within an organization to be capable of managing the clouds that the business consumes.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 5

Thu, 2021-06-10 21:09

Welcome to the next edition of weekly Cloud Vanity. As usual, this edition casts light on multiple cloud providers and what's happening in their sphere. From the mega players to the small fish in the ocean, it covers it all. Enjoy!


Reducing risk is the fundamental reason organizations invest in cybersecurity. The threat landscape grows and evolves, creating the need for a proactive, continual approach to building and protecting your security posture. Even with expanding budgets, the number of organizations reporting serious cyber incidents and data breaches is rising.

Streaming data presents a unique set of design and architectural challenges for developers. By definition, streaming data is not bounded, having no clear beginning or end. It can be generated by millions of separate producers, such as Internet of Things (IoT) devices or mobile applications. Additionally, streaming data applications must frequently process and analyze this data with minimal latency.

This post presents a solution using AWS Systems Manager State Manager that automates the process of keeping RDS instances in a start or stop state.

Over the last few years, Machine Learning (ML) has proven its worth in helping organizations increase efficiency and foster innovation. 


In recent years, the grocery industry has had to shift to facilitate a wider variety of checkout journeys for customers. This has meant ensuring a richer transaction mix, including mobile shopping, online shopping, in-store checkout, cashierless checkout or any combination thereof like buy online, pickup in store (BOPIS).  

At Google I/O this year, we introduced Vertex AI to bring together all our ML offerings into a single environment that lets you build and manage the lifecycle of ML projects. 

Dataflow pipelines and Pub/Sub are the perfect services for this. All we need to do is write our components on top of the Apache Beam sdk, and they’ll have the benefit of distributed, resilient and scalable compute.

In a recent Gartner survey of public cloud users, 81% of respondents said they are working with two or more providers. And as well you should! It’s completely reasonable to use the capabilities from multiple cloud providers to achieve your desired business outcomes. 


Generators at datacenters, most often powered by petroleum-based diesel, play a key role in delivering reliable backup power. Each of these generators is used for no more than a few hours a year or less at our datacenter sites, most often for routine maintenance or for backup power during a grid outage. 

5 reasons to attend the Azure Hybrid and Multicloud Digital Event


There are many factors that can affect critical environment (CE) infrastructure availability—the reliability of the infrastructure building blocks, the controls during the datacenter construction stage, effective health monitoring and event detection schemes, a robust maintenance program, and operational excellence to ensure that every action is taken with careful consideration of related risk implications.


Anyone who has even a passing interest in cryptocurrency has probably heard the word ‘blockchain’ branded about. And no doubt many of those who know the term also know that blockchain technology is behind Bitcoin and many other cryptocurrencies.

Alibaba Cloud Log Service (SLS) cooperates with RDS to launch the RDS SQL audit function, which delivers RDS SQL audit logs to SLS in real time. SLS provides real-time query, visual analysis, alarm, and other functionalities.

How AI Automation is Making a First-of-its-Kind, Crewless Transoceanic Ship Possible

Enterprise organizations have faced a compendium of challenges, but today it seems like the focus is on three things: speed, speed, and more speed. It is all about time to value and application velocity—getting applications delivered and then staying agile to evolve the application as needs arise.

Like many DevOps principles, shift-left once had specific meaning that has become more generalized over time. Shift-left is commonly associated with application testing – automating application tests and integrating them into earlier phases of the application lifecycle where issues can be identified and remediated earlier (and often more quickly and cheaply).

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 4

Thu, 2021-06-03 19:31

 Welcome to the next edition of weekly Cloud Vanity. The foundation of any cloud matters. Cloud is and always will be a distributed, hybrid phenomenon. That is why architecting, developing, and operating a hybrid mix of workloads requires stable, scalable, and reliable cloud technologies. This edition discusses a few of them from across the different clouds out there.


AWS SAM, or the Serverless Application Model, is an open source framework that you can use to develop, build, and deploy your serverless applications.

Pluralsight, Inc., the technology workforce development company, today announced that it has entered into a definitive agreement to acquire A Cloud Guru (ACG).

AWS Lambda Extensions are a new way to integrate your favorite operational tools for monitoring, observability, security, and governance with AWS Lambda. Starting today, extensions are generally available with new performance improvements and an expanded set of partners including Imperva, Instana, Sentry, Site24x7, and the AWS Distro for OpenTelemetry.

Amazon SQS is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. 

Let’s say your Python app uses DynamoDB and you need some unit tests to verify the validity of your code, but you aren’t sure how to go about doing this.


Personal access tokens (PATs) make it easy to authenticate against Azure Devops to integrate with your tools and services. However, leaked tokens could compromise your Azure DevOps account and data, putting your applications and services at significant risk.

Azure announces general availability of scale-out NVIDIA A100 GPU Clusters: the fastest public cloud supercomputer.

La Liga, the foremost Spanish football league, has expanded its partnership with Microsoft Azure to focus on machine learning (ML), over the top (OTT) services, as well as augmented reality.

A little over a year ago, Microsoft Build 2020 was Microsoft’s first flagship event to become all-digital early in the COVID-19 pandemic.



Having constant access to fresh customer data is a key requirement for PedidosYa to improve and innovate our customer’s experience. Our internal stakeholders also require faster insights to drive agile business decisions. 

5 ways Vertex Vizier hyperparameter tuning improves ML models

Getting started with Kubernetes is often harder than it needs to be. While working with a cluster “from scratch” can be a great learning exercise or a good solution for some highly specialized workloads, often the details of cluster management can be made easier by utilizing a managed service offering. 

Zero-trust managed security for services with Traffic Director

Databases are part of virtually every application you run in your organization and great apps need great databases. This post is focused on one such great database—Cloud Spanner.


Kubernetes is a robust yet complex infrastructure system for container orchestration, with multiple components that must be adequately protected. 

It is no contradiction to say that being ‘cloud-native’ has not much to do with cloud computing. There is an idea that cloud is a place, a suite of technologies or services that run somewhere in data centres. But the cloud is not a place; it is a way of working.

The most innovative companies of 2021 according to BCG: Alphabet, Amazon, Microsoft all make it.

In this article, the author discusses how cloud computing has changed the traditional approach to operation and maintenance (O&M).

This June, a small marine research non-profit with a huge vision will launch a first-of-its-kind, crewless transoceanic ship that will attempt to cross the Atlantic Ocean without human intervention.

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 3

Thu, 2021-05-27 20:48

 Welcome to the next edition of weekly Cloud Vanity. IBM Cloud is not having a good time out there. It was hit by another outage this week, just five days after a similar incident. The root cause was an unidentified "severity-one" incident that impacted multiple services across multiple locations. This once again underlines that you need a high availability and DR plan in the cloud, too. 


Authorizing functionality of an application based on group membership is a best practice. If you’re building APIs with Amazon API Gateway and you need fine-grained access control for your users, you can use Amazon Cognito

VMware Cloud on AWS allows customers to run VMware vSphere workloads on the AWS global infrastructure. This means you can run vSphere workloads across all of the AWS Regions where VMware Cloud on AWS is available.

CloudFormation Guard is an open source tool that helps validate your AWS CloudFormation templates against a rule set to keep AWS resources in compliance with company guidelines.

AWS Security Hub provides a comprehensive view of the security alerts and security posture in your accounts. Now you can import AWS IoT Device Defender audit findings into Security Hub.

Customers who are running fleets of Amazon Elastic Compute Cloud (Amazon EC2) instances use advanced monitoring techniques to observe their operational performance. Capabilities like aggregated and custom dimensions help customers categorize and customize their metrics across server fleets for fast and efficient decision making. 


Microsoft has published a root cause analysis of an outage of its Azure Domain Name System that struck the cloud platform over Easter, causing intermittent failures for customers accessing and managing their Microsoft services globally.

Machine learning is changing the way we interact with each other and improving many aspects of our lives. In recent years, a variety of tools and frameworks have been developed to make it easier to build and deploy machine learning models into user-facing applications.

Function App keys are placed in the azure-webjobs-secrets folder in Blob Container. If this folder is missing, this could mean that the Function App is unable to connect to the storage account referenced by the Function App Application Setting “AzureWebJobsStorage”. This could happen either because of a network misconfiguration or because of an issue on the storage side.

Java is one of the most popular programming languages, used by over seven million developers to create everything from enterprise applications to complex robots. 

Azure at Microsoft Build recap: build amazing things on your terms, anywhere


Using AI-powered machine learning models to identify fraudulent unemployment claims

Google has won a deal to provide cloud services to Elon Musk's SpaceX, which has launched a slew of Starlink satellites to provide high-speed internet, it said on Thursday. SpaceX will set up ground stations within Google's data centres that connect to the Starlink satellites

How to leverage global address resolution using Cloud DNS in Google Cloud VMware Engine

Analyze your logs easier with log field analytics

With Datashare, data publishers, aggregators, and consumers can come together to exchange licensed datasets on Google Cloud securely, quickly, and easily.


CNA Financial, the US insurance conglomerate, has apparently paid $40m to ransomware operators to get its files back.

Oracle Chairman and CTO Larry Ellison was seemingly omnipresent at the annual Oracle OpenWorld conference last week, providing his unique insights on subjects ranging from why autonomous technologies are so fundamentally important to what keeps him engaged after decades in the business.

Use the "o" command-line tool to radically accelerate your cloud operations workflow. Spend less time searching the docs and say goodbye to the days of copy-and-pasting those long OCIDs.

TimescaleDB is a time series data plug-in for PostgreSQL. Its version 1.5 enables automatic compression.

Alibaba’s Winning Cloud Formula Is Coming Under Pressure

Categories: DBA Blogs

Cloud Vanity: A Weekly Carnival of AWS, Azure, GCP, and More - Edition 2

Thu, 2021-05-27 20:47

 Welcome to the next edition of weekly Cloud Vanity. With all the hype around cloud computing, you might think that everyone is already in the cloud, but that's not really true. It's still just the beginning, with a long way to go. So don't think it's too late to jump on the bandwagon. If you are thinking about shifting to a cloud career, do it now. If you are a company thinking about moving workloads to the cloud, do it. There is no other option if you want to survive.


With the launch of AWS Distro for OpenTelemetry, AWS will continue to help drive advances in observability technologies, enhancing innovation and scalability for the entire OpenTelemetry community by contributing 100% of all changes to the upstream.

CloudEndure Migration can move applications from any physical, virtual, or cloud-based infrastructure to AWS at no charge. This complements AWS Server Migration Service (AWS SMS), which is an agentless service for migrating on-premises workloads to AWS. And now we have AWS Application Migration Service for lift and shift migrations.

Monitoring SQL Server is an essential aspect of any relational database management system (RDBMS) when dealing with performance problems. 

A common practice when creating a data model design, especially in the relational database management system (RDBMS) world, is to start by creating an entity relationship diagram (ERD). Afterwards, you normalize your data by creating a table for each entity type in your ERD design. 

Even if you don't like AWS IAM, drop by to wish it a happy birthday. Yes, AWS is marking the birthday of AWS Identity and Access Management (IAM). 


Infrastructure-as-code tools like ARM templates and Terraform are more and more used to deploy infrastructure solutions. In general, these tools run within the context of a service principal so there needs to be an account with high privileges – at least high enough to deploy a given type of resource in your cloud environment. 

Software available under the partnership includes Oracle WebLogic, Oracle Linux, and Oracle Database, as well as interoperability between Oracle Cloud Infrastructure (OCI) and Azure. 

With the ever-increasing adoption of cloud-based solutions and the incredibly complex make-up of application architectures, the ability to effectively manage, orchestrate, and monitor the scenarios for search, security, and operations is becoming critical to the success of businesses.

Modern web app design patterns leverage microservices best practices for performance, agility, and extensibility. Azure Static Web Apps is a turnkey service for these modern full-stack web apps with pre-built and pre-rendered static front-ends, and serverless API backends. 

One detail updating today is the Azure “A” icon, which will be rolled out in product experiences and across related sites in the coming weeks. The new Azure icon represents the unity of Azure within the larger Microsoft family of product icons.


Today’s healthcare organizations are grappling with mountains of data, increasing regulations, and escalating customer expectations. To add to these, healthcare organizations deal with highly sensitive personal data that needs to be protected.

Since its launch in 2016, the Google Cloud Public Datasets Program has provided a catalog of curated public data assets in optimized formats on BigQuery and Cloud Storage in partnership with a number of data providers.

A data cloud offers a comprehensive and proven approach to cloud and embraces the full data lifecycle, from the systems that run your business, where data is born, to analytics that support decision making, to AI and machine learning (ML) that predict and automate the future. 

VPC Flow Logs is one such enterprise-grade network traffic analysis tool, providing information about TCP and UDP traffic flow to and from VM instances on Google Cloud, including the instances used as Google Kubernetes Engine (GKE) nodes.


This post covers setting up API Gateway logging, setting up Logging Analytics, setting up Service Connector Hub to send API Gateway logs to Logging Analytics, and creating a Dashboard. Prerequisites include a working knowledge of OCI API Gateway and OCI in general.

Applications and use cases continue to evolve around data and enhanced storage needs in the cloud. For organizations, building their own infrastructure and storage solutions to address the variable demands of their applications is expensive and complex. Oracle makes storage seamless, inexpensive, high-performing, and flexible to support a wide range of use cases without sacrificing enterprise capabilities.

Businesses spend billions of hours a year on work that strips people of time and keeps them from focusing on higher-value things. AI-powered Automation helps people reclaim up to 50% of their time, and that’s something we all need.

The word “automation” often reminds people of assembly lines and manufacturing processes. But in a digital world, automation isn’t about the delivery of goods — it’s about making every interaction, experience and process more intelligent and impactful. This helps companies deliver value to their customers and gain a competitive advantage in their industry.

Financial institutions around the world are dramatically accelerating digital transformation. In the financial services industry, over 36 billion customer records were exposed in Q3 of 2020. The IBM Cloud for Financial Services provides a way for banks and financial institutions to migrate workloads to the cloud platform.

Categories: DBA Blogs

How to Enable SSH Equivalency Between EC2 Instances

Tue, 2021-04-20 01:55

 If you want to log in to one Linux instance from another without a password and without specifying the key, then SSH equivalency is the solution. 

Normally, in order to set up SSH equivalency between two Linux instances, you generate a public/private key pair, copy the public key over to the other instance, and add it to that instance's authorized_keys file. 

But for an EC2 instance in AWS, you have to create or specify the key pair at launch time. When you launch an EC2 instance, the public key is already present in the home directory of the user. For example, on Amazon Linux, the public key is already in the /home/ec2-user/.ssh/authorized_keys file. That is why you only need the private key to SSH into that server.

Let's say you have another Linux-based EC2 instance and you want to establish SSH equivalency between these two instances. Let's suppose both use the same key pair. That means both already have the public key present in their /home/ec2-user/.ssh/authorized_keys file. In that case, all you need to do on both servers to establish SSH equivalency is the following:

1- Login to Instance 1

2- Go to /home/ec2-user/.ssh/ directory

3- Create a new file for the private key and restrict its permissions (SSH refuses to use a private key that is readable by group or other)

touch id_rsa

chmod 600 id_rsa

4- Copy the content of your pem key and paste it into this id_rsa file

Now you should be able to SSH to the other server, which has the same key pair.

Repeat the above steps on the other server if you want to enable reverse SSH equivalency.
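If the two instances do not share a launch-time key pair, a common alternative (not from the steps above) is to generate a dedicated key pair on the first instance and append its public key to the other instance's authorized_keys. Here is a minimal sketch under that assumption; the key name equiv_key is illustrative, and the authorized_keys append is shown locally here, whereas in practice you would copy the public key to the second instance, for example with ssh-copy-id:

```shell
# Sketch: SSH equivalency with a dedicated key pair
# (the key name "equiv_key" is hypothetical, chosen for this example).
SSH_DIR="$HOME/.ssh"
mkdir -p "$SSH_DIR" && chmod 700 "$SSH_DIR"

# 1- On instance 1, generate a passphrase-less RSA key pair
rm -f "$SSH_DIR/equiv_key" "$SSH_DIR/equiv_key.pub"
ssh-keygen -t rsa -b 4096 -N "" -f "$SSH_DIR/equiv_key" -q

# 2- Private keys must not be readable by group or other
chmod 600 "$SSH_DIR/equiv_key"

# 3- Append the public key to the other instance's
#    /home/ec2-user/.ssh/authorized_keys (demonstrated locally here)
cat "$SSH_DIR/equiv_key.pub" >> "$SSH_DIR/authorized_keys"
chmod 600 "$SSH_DIR/authorized_keys"

# 4- Then, from instance 1:
#    ssh -i ~/.ssh/equiv_key ec2-user@<instance-2-private-ip>
```

The advantage over copying the launch-time pem key around is that the dedicated key can be revoked later simply by removing its line from the other server's authorized_keys file.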

Categories: DBA Blogs