Development

Sacred Forests

Greg Pavlik - Mon, 2020-07-27 12:28
An acquaintance sent this article on the small forest preserves in Ethiopia. The video is less than 10 minutes and well worth watching. The pictures in many ways tell thousands of words. Interesting to me: many of the visuals remind me of parts of northern and central California where the trees and shrubs were removed to make way for cattle grazing - the visual effect I think is best captured by the late great radical novelist Edward Abbey's description of a "cow-burnt west". Deforestation in Ethiopia was likewise driven, to a large extent, by agriculture.

Now these forests are occupied by a handful of eremites. Their lived experience in these patches of natural oasis lends itself to a wisdom that we seem to have lost in our industrialized and bustling commercial existence: “In this world nothing exists alone,” he said. “It’s interconnected. A beautiful tree cannot exist by itself. It needs other creatures. We live in this world by giving and taking. We give CO2 for trees, and they give us oxygen. If we prefer only the creatures we like and destroy others, we lose everything. Bear in mind that the thing you like is connected with so many other things. You should respect that co-existence.” As Alemayehu explained, biodiversity gives rise to a forest’s emergent properties. “If you go into a forest and say, ‘I have ten species, that’s all,’ you’re wrong. You have ten species plus their interactions. The interactions you don’t see: it’s a mystery. This is more than just summing up components, it’s beyond that. These emergent properties of a forest, all the flowering fruits—it’s so complicated and sophisticated. These interactions you cannot explain, really. You don’t see it.”

In my mind I see these eremites like Zosima in the Brothers Karamazov: "Love to throw yourself on the earth and kiss it. Kiss the earth and love it with an unceasing, consuming love. Love all men, love everything. Seek that rapture and ecstasy. Water the earth with the tears of your joy and love those tears. Don’t be ashamed of that ecstasy, prize it, for it is a gift of God and a great one; it is not given to many but only to the elect." Of course I may be romanticizing these good people's experience in these forest patches - I've never been there and never met any of the eremites who live there.

And yet, as the author notes: "The trees’ fate is bound to ours, and our fate to theirs. And trees are nothing if not tenacious." For these Ethiopians, at least, a tree is tied inextricably to their salvation. But isn't it true that for all of us the tree is a source of life and ought to be honored as such?

Android Oracle Client 2.0

XTended Oracle SQL - Sun, 2020-06-28 07:42

I’ve just released a new version of my Simple Android Oracle Client.

New features:

  • Supported Oracle versions: 11.2, 12.1, 12.2, 18, 19, 20.
  • SQL Templates: now you can save and load your own script templates
  • Server output (dbms_output)
  • Export results as JSON, CSV and HTML files (long tap on results)
  • Copy results to the Clipboard as JSON or CSV

I use it just for basic troubleshooting and small fixes, but please let me know if you need anything else.
Screenshots:

Categories: Development

Oracle Legend Bryn Llewellyn on Distributed, Cloud-Native Databases

Gerger Consulting - Tue, 2020-06-23 13:12
Bryn Llewellyn ran product management for PL/SQL and Edition-Based Redefinition at Oracle for more than a decade, and we are incredibly happy to have him as our next speaker!



Cloud-native databases are coming of age. They are open source and high performance. They are built on a shared-nothing architecture, with linear write scalability, low read latency and built-in fault tolerance. And now they are even compatible with PostgreSQL.

Attend the webinar by the Oracle legend Bryn Llewellyn and learn how distributed databases can help you in your daily work.
Categories: Development

Modern Times

Greg Pavlik - Sun, 2020-06-21 17:18
I’ve found myself, quite unintentionally, immersed in modernism recently. I had previously been spending a lot of time on Renaissance-era music and art, so I don’t have a good explanation as to how I got from there to here. But taking stock of things, I was reading Fernando Pessoa’s Book of Disquiet, listening to a strange melange of Iannis Xenakis, Holly Herndon, and Pink Floyd’s The Wall, and looking closely at a series of paintings by Makoto Fujimura. Pretty much the only active exception I could come up with was znamenny chant recordings. None of these works necessarily relate, and I’m not sure I can explain the reason for this clustering outside of coincidence.

I think many times the term "modernism" is conflated with "contemporary" in casual use. But by "modernism" in this case I mean, first and foremost, a mode of artistic exploration that breaks with prior, established forms, be they “rules” or aesthetic norms, seeing them as having exhausted their capacity to express themselves. Of course, these also involve the introduction of new forms and rationalizations for those shifts - ways to capture meaning in a way that carries forward a fresh energy of its own (at least for a time), often with an inchoate nod to "progress". I suppose the most recent manifestation of modernism may be transhumanism, but this obsession with form seems to have pervaded so much of the 20th century - in painting, from the emergence of cubism to the obsession with abstraction (which finally gave way to a resurgence of figurative painting); in literary theory, the move from structuralism to post-structuralism and the disintegration into deconstruction. Poetry as well: proto-modernists like Emily Dickinson paved the way for not only "high modernists" like Eliot but a full range of form-experimental poets, from ee cummings to BH Fairchild. These were not always entirely positive developments - I’ll take Miles Davis’s Kind of Blue over Bitches Brew any day of the week. But then again, I’ll take Dostoevsky over Tolstoy 10 times out of 10. In some sense, we have to take these developments as they come and eventually sift the wheat from the chaff.

Which brings me back to Pessoa, one of the literary giants of the Portuguese language. His Book of Disquiet was a lifelong project, which features a series - a seemingly never-ending series - of reflections by a number of "heteronym" personalities he developed. The paragraphs are often redundant and the themes seem to run on, making for a difficult book to read in long sittings. As a consequence I've been pecking away at it slowly. It becomes more difficult as time goes by for another reason: the postured aloofness to life seems sometimes fake, sometimes pretentious: more what one would expect from an 18-year-old than from a mature writer who has mastered his craft. And yet Pessoa himself seems at times to long for a return to immaturity: "My only regret is that I am not a child, for that would allow me to believe in my dreams and believe that I am not mad, which would allow me to distance my soul from all those who surround me."

But still, the writing at times is simply gorgeous. There's not so much beauty in what Pessoa says as in how he says it. He retains completely the form of language, but deliberately evacuates the novel of its structure. What we are left with are in some sense "micro-essays" that sometimes connect and at other times disassociate. Taken as words that invoke meaning, they are often depressing, sometimes nonsensical. Taken as words that invoke feeling - a feeling of language arranged to be something more than just words - they can be spectacular.

The tension between the words as meaning and words as expression is impossible to escape: "Nothing satisfies me, nothing consoles me, everything—whether or not it has ever existed—satiates me. I neither want my soul nor wish to renounce it. I desire what I do not desire and renounce what I do not have. I can be neither nothing nor everything: I’m just the bridge between what I do not have and what I do not want.” What does one make of this when considered as creed? Unlikely anything positive. Yet this pericope is rendered in a particularly dreamy sort of way that infects the reader when immersed in the dream-like narrative in which it is situated. It's almost inescapable.

Few novels have made me pause for such extended periods of time to ponder not so much what the author has to say but how he says it. It's like a kind of poetry rendered without a poem.

---

A nod to New Directions Publishing, by the way, for making this project happen. Their edition of Disquiet I suspect will be seen as definitive for some time.

Unity and Difference

Greg Pavlik - Wed, 2020-06-03 09:43
One of the themes that traveled from Greek philosophy through until the unfolding of modernity was the neoplatonic notion of "the One". A simple unity in which all "transcendentals" - beauty, truth, goodness - both originate and in some sense coalesce. In its patristic and medieval development, these transcendentals were "en-hypostasized" or made present in persons - the idea of the Trinity, where a communion of persons exist in perfect love, perfect peace and mutual self-offering: most importantly, a perfect unity in difference. All cultures have their formative myths and this particular myth made its mark on a broad swath of humanity over the centuries - though I think in ways that usually obscured its underlying meaning (unfortunately).

Now I have always identified with this comment of Dostoevsky: "I will tell you that I am a child of this century, a child of disbelief and doubt. I am that today and will remain so until the grave": sometimes more strongly than others. But myths are not about what we believe is "real" at any point in time. The meaning of these symbols I think says something for all of us today - particularly in the United States: that the essence of humanity may be best realized in a unity in difference that can only be realized through self-offering love. In political terms we are all citizens of one country and our obligation as a society is to care for each other. This much ought to be obvious - we cannot exclude one race, one economic class, one geography, one party, from mutual care. The whole point of our systems, in fact, ought to be to realize, however imperfectly, some level of that mutual care, of mutual up-building and mutual support.

That isn't happening today. Too often we are engaged in the opposite - mutual tearing down and avoiding our responsibilities to each other. I wish there were a magic fix for this: it clearly is a problem that has plagued our history for a long, long time. The one suggestion I can make is to find a way to reach out across boundaries with care on a day-by-day basis. It may seem like a person cannot make a difference. No individual drop of rain thinks it is responsible for the flood.

Introduction to Apache Kafka

Gerger Consulting - Tue, 2020-06-02 13:03
Apache Kafka is a product that should be in every IT professional's toolbox.

Attend the webinar by developer advocate Ricardo Ferreira and learn how to use Kafka in your daily work.


About the Webinar:

The use of distributed streaming platforms is becoming increasingly popular among developers, but have you ever wondered why?
Part pub/sub messaging system, part distributed storage, part CEP-type event-processing engine, this type of technology brings a whole new perspective on how developers capture, store, and process events. This talk will explain what distributed streaming platforms are and how they can be a game changer for modern data architectures. We'll discuss the road in IT that led to the need for this type of platform, the current state of Apache Kafka, as well as scenarios where this technology can be implemented.


About the Presenter:
Ricardo is a Developer Advocate at Confluent, the company founded by the original co-creators of Apache Kafka. He has over 22 years of experience in software engineering and specializes in service-oriented architecture, big data, cloud, and serverless architecture. Prior to Confluent, he worked for other vendors, such as Oracle, Red Hat, and IONA Technologies, as well as several consulting firms. While not working, he enjoys grilling steaks in his backyard with his family and friends, where he gets the chance to talk about anything that is not IT related. Currently, he lives in Raleigh, North Carolina, with his wife and son.
Categories: Development

Machine Learning for Oracle Professionals

Gerger Consulting - Tue, 2020-05-26 17:00

In this webinar, Oracle ACE Director Craig Shallahamer will introduce you to the world of applied Machine Learning from an Oracle Professional (DBA/Developer/Manager) perspective. This includes understanding what ML is, why use it and why now.

He will demonstrate how to create an automated anomalous performance detection system. He'll use industry standard Python with its ML libraries and Jupyter Notebooks. You will be able to download and do everything he does in this webinar!


If you have ever wondered how ML can be applied in an IT environment, you don't want to miss this webinar.



Categories: Development

How to Improve Oracle Performance with NVM

Gerger Consulting - Mon, 2020-05-18 13:26
Non-Volatile Memory hardware is a technology that every Oracle customer must know about. Attend the webinar by Yaron Dar and find out how you can improve the performance of your Oracle database with NVM.

Covered topics:
  • Proof points, best practices, and guidelines for achieving peak performance for Oracle workloads with NVMe and Storage Class Memory.
  • Maintaining high availability through disasters with Oracle Extended RAC (demo).
  • Achieving amazing data reduction and storage efficiency for Oracle databases.

Register at: http://www.prohuddle.com/webinars/yarondar/oracle_performance.php


Categories: Development

Centrally Managed Users in Oracle (MS Active Directory)

Gerger Consulting - Tue, 2020-05-05 18:09
Attend the webinar by Oracle ACE Simon Pane and learn how to integrate Microsoft Active Directory with the Oracle database.

About the Webinar
This presentation details the advantages of using Centrally Managed Users (CMU) over the previous technologies, covers the installation/set-up process, and gives practical examples and use cases for organizations to leverage and benefit from the simplified integration of database user management (authentication and authorization) with Active Directory (AD).
An exciting new feature of Oracle Database 18c and 19c is "Centrally Managed Users" (CMU). This, for the first time, allows for simple and complete user management integration with Active Directory. No additional licenses required!
Previously, Oracle Enterprise User Security (EUS) and Oracle Unified Directory (OUD) were needed as intermediate components bridging the Oracle Database to Active Directory. While EUS/OUD is still available and does offer some additional features and benefits, they are no longer required for database-AD user integration and management.
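As an illustration only (the user name and the AD distinguished name below are made up, and the exact DN depends on your directory layout), mapping an Active Directory group to a shared, globally identified database user with CMU looks roughly like this:

-- Hypothetical example: a shared schema mapped to an AD group via CMU
create user app_sales_users identified globally as
  'CN=SalesAppUsers,OU=Groups,DC=example,DC=com';

-- privileges granted to the shared schema apply to every AD user
-- who is a member of the mapped group
grant create session to app_sales_users;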

This webinar is sponsored by Gitora, the source control tool for the Oracle database.
Categories: Development

Practical Oracle SQL Webinar by ACE Director Kim Berg Hansen

Gerger Consulting - Wed, 2020-04-08 14:26
ACE Director Kim Berg Hansen is the author of the book "Practical Oracle SQL, Mastering the Full Power of Oracle Database".



Kim Berg Hansen is known for presenting complicated SQL features to developers in a very accessible way, so that they can apply these features in their daily work.

In this webinar, Kim will present several SQL techniques, taken from his new book Practical Oracle SQL, and show you how you can apply them in real life scenarios.

Kim will cover the following topics (a small illustration of one of them follows the list):
  • Tree Calculations with Recursion (Recursive subquery factoring)
  • Functions Defined Within SQL (Functions in the WITH clause)
  • Answering Top-N Questions (Analytic ranking functions)
  • Rolling Sums to Forecast Reaching Minimums (Analytic window clause, recursive subquery factoring, model clause)
  • Merging Date Ranges (Row pattern matching MATCH_RECOGNIZE)
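As a taste of the second topic, here is a minimal sketch (not taken from the book, just an illustration) of a function defined directly in the WITH clause, available in Oracle Database 12c and later. Note the slash terminator, which SQL*Plus and SQL Developer need for statements that embed PL/SQL:

with
  function vat_price(p_price number) return number is
  begin
    -- add 25% VAT; the rate is only an example value
    return round(p_price * 1.25, 2);
  end;
select level                 as item_id,
       level * 10            as net_price,
       vat_price(level * 10) as gross_price
from   dual
connect by level <= 3
/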
Categories: Development

Oracle Database + Git = Gitora 4

Gerger Consulting - Thu, 2020-04-02 15:17

We are happy to announce the new version of Gitora, the version control tool for the Oracle Database.

Gitora 4 enables you to manage your entire Oracle code base with Git. New features in Gitora 4 include:
  • Push to and pull from repositories in GitHub, GitLab, Bitbucket, etc.
  • Simplified workflow and vastly improved working directory structure
  • SQL Developer Extension (Coming soon.)


Categories: Development

PL/SQL functions and statement level consistency

XTended Oracle SQL - Sun, 2019-12-29 18:40

You may know that whenever you call a PL/SQL function from within a SQL query, each query inside the function is consistent to the SCN of its own start and not to the SCN of the parent query.

Simple example:

create table test as 
  select level a, level b from dual connect by level<=10;

create or replace function f1(a int) return int as
  res int;
begin
  -- this query runs with its own SCN, not the SCN of the calling query
  select b into res 
  from test t 
  where t.a=f1.a;
  -- slow the function down so concurrent changes become visible
  dbms_lock.sleep(1);
  return res;
end;
/

As you can see, we created a simple PL/SQL function that returns the result of the query select b from test where a=:input_var

But let's check what it returns if another session changes the data in the table:

-- session 2:
begin
    for i in 1..30 loop
      update test set b=b+1;
      commit;
      dbms_lock.sleep(1);
    end loop;
end;
/
-- session 1:
SQL> select t.*, f1(a) func from test t;

         A          B       FUNC
---------- ---------- ----------
         1          1          1
         2          2          3
         3          3          5
         4          4          7
         5          5          9
         6          6         11
         7          7         13
         8          8         15
         9          9         17
        10         10         19

10 rows selected.

As you can see, we got inconsistent results in the FUNC column, but we can easily fix that using an OPERATOR:

CREATE OPERATOR f1_op
   BINDING (INT) 
   RETURN INT 
   USING F1;

Let's revert the changes and check our query with the new operator:

--session 1:
SQL> update test set b=a;

10 rows updated.

SQL> commit;

Commit complete.

-- session 2:
begin
    for i in 1..30 loop
      update test set b=b+1;
      commit;
      dbms_lock.sleep(1);
    end loop;
end;
/

-- session 1:
SQL> select t.*, f1(a) func, f1_op(a) op from test t;

         A          B       FUNC         OP
---------- ---------- ---------- ----------
         1          2          2          2
         2          3          5          3
         3          4          8          4
         4          5         11          5
         5          6         14          6
         6          7         17          7
         7          8         20          8
         8          9         23          9
         9         10         26         10
        10         11         29         11

10 rows selected.

As you can see, the values in the OP column are equal to the values of B, while the function F1 still returns inconsistent values.

Categories: Development

Just short note for myself: OJPPD limitations

XTended Oracle SQL - Mon, 2019-12-02 06:26

As of Oracle 19c, OJPPD (old-style join predicate push-down) does not support CONNECT BY or TABLE() expressions; the optimizer trace reports:

OJPPD: OJPPD bypassed: query block contains START WITH/CONNECT BY.
OJPPD: OJPPD bypassed: View contains TABLE expression.
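For context, a hedged sketch of the kind of query shapes these messages refer to - an outer join to a view whose query block contains CONNECT BY or a TABLE() expression. The tables and names are made up, and whether push-down is even attempted depends on the full statement; checking the optimizer trace for the lines above is the way to confirm the bypass in your own case:

-- view with START WITH/CONNECT BY on the right side of an outer join
select d.dname, h.lvl
from   dept d
       left join (select deptno, level as lvl
                  from   emp
                  start with mgr is null
                  connect by prior empno = mgr) h
       on h.deptno = d.deptno;

-- view containing a TABLE() expression on the right side of an outer join
select d.dname, t.column_value
from   dept d
       left join (select e.deptno, x.column_value
                  from   emp e,
                         table(sys.odcinumberlist(1, 2, 3)) x) t
       on t.deptno = d.deptno;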
Categories: Development

Free Oracle Cloud: 15. The request could not be mapped to any database

Dimitri Gielis - Wed, 2019-11-06 13:25
This post is the last post of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

At some point you might face the following message: "The request could not be mapped to any database":


Oracle is monitoring usage on your Always Free account and whenever it finds there's been no activity for 7 days, it will stop your database automatically. It will preserve the data in the database, but the database won't be accessible anymore.

To fix the issue, log in to your Oracle Cloud Account and go to your Autonomous Database:


You will see the database is in a stopped state. Click the Start button:


The state will change to Starting...


And after a minute it becomes available again:


The above behavior is written in the end-user documentation:
Inactivity Monitoring and Database Stoppage
Persistently inactive Always Free Autonomous Databases are detected and handled as follows:
  • After being inactive for 7 days, the database will be stopped automatically, preserving its stored data. Inactivity measurements leading up to 7 days are based on database connections. Successfully making a SQL*Net or HTTPS connection resets these measurements to zero.
  • A database that is automatically or manually stopped and stays inactive for 90 days, cumulative, may be reclaimed and permanently deleted. Inactivity measurements leading up to 90 days are based on the database being inactive or in the stopped state. Starting a stopped database resets these measurements to zero.
    Start an Always Free Autonomous Database by clicking the Start button on the Oracle Cloud Infrastructure console. Start a stopped Always Free Autonomous Database before 90 days to avoid losing access to its data.
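Given the first bullet above (a successful SQL*Net or HTTPS connection resets the 7-day counter), one possible workaround is a tiny scheduled "ping" from outside the database. A minimal sketch, assuming you run it with SQL*Plus or SQLcl from a cron job or similar scheduler against your Always Free database (the script name is made up):

-- keep_alive.sql (hypothetical): simply connecting and running this
-- query counts as activity, since the SQL*Net connection itself resets
-- the inactivity measurement
select 'pinged at ' || to_char(systimestamp, 'YYYY-MM-DD HH24:MI:SS') as ping
from   dual;
exit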

But this week there were some people complaining that although they had activity, their database was stopped anyway. I witnessed the same behavior in my account, so I reached out to Oracle and they confirmed that their code to identify inactivity is not properly accounting for APEX/ORDS usage. They are already working on a fix, which they hope to apply very soon. I will update this post when I get confirmation the fix is in the data centers.

Categories: Development

Native Oracle DB JSON functionality as alternative for using cursor() in AOP (and APEX_JSON)

Dimitri Gielis - Sun, 2019-10-27 14:46
When using external (WEB/REST) services, you often communicate in JSON. So it's important to be able to generate JSON in the format that is expected by the external service.

In the case of APEX Office Print (AOP), we made it super simple to communicate with the AOP server from the database through our PL/SQL API. You just have to enter a SQL statement and the AOP PL/SQL API, which uses APEX_JSON behind the scenes, generates the necessary JSON that the AOP Server understands.

Here's an example of the Order data in JSON: a customer with multiple orders and multiple order lines:

As we are living in the Oracle database, we have to generate this JSON. The data is coming from different tables and is hierarchical. In SQL you can create hierarchical data by using the cursor() syntax.

Here's an example of the SQL statement that you would typically use in AOP (note the nested cursor() expressions):

select
  'file1' as "filename", 
  cursor(
    select
      c.cust_first_name as "cust_first_name",
      c.cust_last_name  as "cust_last_name",
      c.cust_city       as "cust_city",
      cursor(select o.order_total      as "order_total", 
                    'Order ' || rownum as "order_name",
                cursor(select p.product_name as "product_name", 
                              i.quantity     as "quantity",
                              i.unit_price   as "unit_price"
                         from demo_order_items i, demo_product_info p
                        where o.order_id = i.order_id
                          and i.product_id = p.product_id
                      ) "order_lines"
               from demo_orders o
              where c.customer_id = o.customer_id
            ) "orders"
    from demo_customers c
    where customer_id = 1
  ) "data"
from dual

From AOP 19.3 onwards, the AOP PL/SQL API not only supports this cursor() syntax but also the native JSON functionality of the Oracle Database (version 12c and upwards).

The query above can also be written as the following using JSON support in the Oracle Database:

select 
  json_arrayagg( 
    json_object( 
      'filename' value 'file1', 
      'data'     value (
          select 
            json_arrayagg(
              json_object( 
                'cust_first_name' value c.cust_first_name, 
                'cust_last_name'  value c.cust_last_name,
                'cust_city'       value c.cust_city, 
                'orders'          value (
                    select 
                      json_arrayagg(
                        json_object(                               
                          'order_total' value o.order_total, 
                          'order_name'  value 'Order ' || rownum,
                          'order_lines' value (
                              select 
                                json_arrayagg(
                                  json_object(                               
                                    'product_name' value p.product_name, 
                                    'quantity'     value i.quantity,
                                    'unit_price'   value i.unit_price
                                  )
                                returning clob)      
                                from demo_order_items i, demo_product_info p
                               where o.order_id = i.order_id
                                 and i.product_id = p.product_id
                            )
                        )
                      returning clob)      
                      from demo_orders o
                    where o.customer_id = c.customer_id
                  )
              )
            returning clob)  
            from demo_customers c
            where c.customer_id = 1
          )
    )
  returning clob) as aop_json
  from dual 

You have to get used to this syntax and have to think a bit differently. Unlike the cursor syntax, where you define the column first and give it an alias, with the JSON functions you define the JSON object and its attributes first and then map them to the correct columns.

I find the cursor syntax really elegant; especially in combination with APEX_JSON, it's a really cool solution to generate the JSON you need. But I guess it's a personal choice what you prefer, and I must admit, the more I use the native JSON way, the more I like it. If performance is important, you most likely want to use native database functionality as much as possible, but I go into more detail further on in this post. Lino also found an issue with the cursor syntax in Oracle Database 19c, so if you are on that release you will want to look at the support document.

Before I move on with my test case, if you need more info on JSON in the database: Carsten did a nice blog post about parsing JSON in APEX, and although it's about parsing JSON and this blog post is more about generating JSON, the conclusions are similar. You can read more about APEX_JSON and the native JSON database functions in Tim's write-up on Oracle-Base.

As I was interested in the performance of both implementations, I ran a few test cases. There are different ways to test performance, e.g. dbms_profiler, Method R Workbench, tracing, timing the results, ... Below I use Tom Kyte's runstats script to compare two PL/SQL implementations. The interesting thing about the script is that it compares not only timings but also latches, which give you an idea of how hard the database has to work. You can download it from AskTom under the resources section:


Here's my test script:

declare
  l_sql             clob;
  l_return          blob;
  l_output_filename varchar2(100);  
  l_runs            number(5) := 1;
begin
  runStats_pkg.rs_start;
  
  -- sql example with cursor
  for i in 1..l_runs
  loop
      l_output_filename := 'cursor';
      l_sql := q'[
                select
                'file1' as "filename",
                cursor
                (select 
                    c.cust_first_name as "cust_first_name",
                    c.cust_last_name  as "cust_last_name",
                    c.cust_city       as "cust_city"
                   from demo_customers c
                  where c.customer_id = 1 
                ) as "data"
                from dual   
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local); 
  end loop;  
  
  runStats_pkg.rs_middle;  
  
  -- sql example with native JSON database functionality
  for i in 1..l_runs
  loop
      l_output_filename := 'native_json';
      l_sql := q'[
                select 
                  json_arrayagg( 
                    json_object( 
                      'filename' value 'file1', 
                      'data'     value (select 
                                          json_arrayagg(
                                            json_object( 
                                              'cust_first_name' value c.cust_first_name, 
                                              'cust_last_name'  value c.cust_last_name,
                                              'cust_city'       value c.cust_city 
                                            )
                                          )  
                                          from demo_customers c
                                         where c.customer_id = 1
                                       )  
                    )
                  ) as aop_json
                  from dual 
               ]';
      l_return := aop_api_pkg.plsql_call_to_aop (
                    p_data_type       => aop_api_pkg.c_source_type_sql,
                    p_data_source     => l_sql,
                    p_template_type   => aop_api_pkg.c_source_type_aop_template,
                    p_output_type     => 'docx',
                    p_output_filename => l_output_filename,
                    p_aop_remote_debug=> aop_api_pkg.c_debug_local);                     
  end loop;    

  runStats_pkg.rs_stop;   
end;
/

I ran the script (with different l_runs settings) a few times on my 18c database and with the above use case on my system, the native JSON implementation was consistently outperforming the cursor (and APEX_JSON) implementation.

Run1 ran in 3 cpu hsecs
Run2 ran in 2 cpu hsecs
run 1 ran in 150% of the time

Name                                  Run1        Run2        Diff
STAT...HSC Heap Segment Block           40          41           1
STAT...Heap Segment Array Inse          40          41           1
STAT...Elapsed Time                      4           3          -1
STAT...CPU used by this sessio           4           3          -1
STAT...redo entries                     40          41           1
STAT...non-idle wait time                0           1           1
LATCH.simulator hash latch              27          26          -1
STAT...non-idle wait count              13          12          -1
STAT...consistent gets examina          41          43           2
LATCH.redo allocation                    1           3           2
STAT...active txn count during          21          23           2
STAT...cleanout - number of kt          21          23           2
LATCH.transaction allocation             1           3           2
LATCH.In memory undo latch               1           3           2
LATCH.JS Sh mem access                   1           3           2
STAT...consistent gets examina          41          43           2
LATCH.keiut hash table modific           3           0          -3
STAT...calls to kcmgcs                  64          69           5
STAT...dirty buffers inspected           6           0          -6
STAT...workarea executions - o           2          12          10
STAT...free buffer requested            71          52         -19
STAT...lob writes unaligned             80          60         -20
STAT...lob writes                       80          60         -20
STAT...sorts (rows)                      0          20          20
STAT...execute count                    91          71         -20
STAT...sorts (memory)                    0          20          20
LATCH.active service list                0          25          25
STAT...consistent gets                 183         156         -27
STAT...consistent gets from ca         183         156         -27
STAT...consistent gets pin (fa         142         113         -29
STAT...consistent gets pin             142         113         -29
STAT...lob reads                       160         130         -30
LATCH.JS queue state obj latch           0          42          42
LATCH.object queue header oper         151         103         -48
STAT...workarea memory allocat          66          -6         -72
STAT...db block changes                431         358         -73
STAT...consistent changes              390         315         -75
LATCH.parameter table manageme          80           0         -80
STAT...undo change vector size       8,748       8,832          84
LATCH.enqueue hash chains                1          88          87
STAT...parse count (total)             100          10         -90
STAT...session cursor cache hi         171          71        -100
STAT...opened cursors cumulati         171          71        -100
STAT...free buffer inspected           126           0        -126
STAT...calls to get snapshot s         470         330        -140
STAT...db block gets from cach         958         744        -214
STAT...hot buffers moved to he         220           0        -220
STAT...redo size                    12,016      12,248         232
STAT...db block gets                 1,039         806        -233
STAT...db block gets from cach       1,029         796        -233
STAT...session logical reads         1,222         962        -260
STAT...file io wait time             5,865       6,279         414
STAT...recursive calls                 561         131        -430
LATCH.cache buffers chains           3,224       2,521        -703
STAT...session uga memory          196,456           0    -196,456
STAT...session pga memory        1,572,864           0  -1,572,864
STAT...logical read bytes from   9,928,704   7,798,784  -2,129,920

Run1 latches total versus runs -- difference and pct
        Run1        Run2        Diff       Pct
       3,853       3,180        -673    121.16%

There are many different iterations of this test, using bind variables, etc. It seems "logical" that a native DB implementation performs better than a combination of PL/SQL (APEX_JSON) and SQL (cursor). But I always recommend you just run the test in your own environment. What is true today might be different tomorrow, and a lot comes into play, so if there's one thing I learned from Tom Kyte, it's don't take things for granted, but test in your unique situation.

So, in real life using AOP, will you see a big difference? It depends on the complexity of your SQL statement and data, how many times you call a report, etc., but my guess is that, in most cases, there won't be much of a difference in user experience.

A simple test would be to do "set timing on" and compare the implementations:
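For instance, something along these lines in SQL*Plus or SQLcl, where the two script names are just placeholders for the cursor() based and native JSON statements shown earlier:

set timing on

-- cursor() / APEX_JSON based call
@aop_cursor_version.sql

-- native JSON function based call
@aop_native_json_version.sql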


Or if you are using AOP on an Oracle APEX page, you can run your APEX page in Debug mode and you will see exactly how long the generation of the JSON took for the data part:


Happy JSON'ing :)

Categories: Development

Free Oracle Cloud: 14. Your Oracle Cloud Free Trial has expired (but FREE still running)

Dimitri Gielis - Thu, 2019-10-17 16:19
This post is the last post of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

Today I got an email that my Oracle Cloud account had expired. While I have an Always Free Oracle Cloud account, when I signed up I also got some extra credits that lasted for a month. Those credits are no longer valid.


When you log in to your Oracle Cloud Dashboard you will get a notification on top too, but nothing to worry about.


It has some consequences though: on the menu, some options are grayed out. The one I actually use is Email Delivery, which seems to be grayed out too, although normally you should be able to send 1,000 emails per month. So maybe grayed out also means not full service.


When I checked it out, it said it's part of the paid plan. I remember some discussions at Oracle Open World where they recommended upgrading to a paid account; as long as you only use the Always Free services, you are not charged.


So I decided to upgrade to a Paid account: Pay As You Go:


You have to provide a credit card, but that was a bit of an issue for me. Apparently, Safari doesn't really work well with this screen, so I switched to Chrome. The next hiccup I had was when I added my AMEX card... it said it was an invalid card.


Then I used my VISA card and that seemed to work well:

Click the Start Paid Account:


Finally, it will say your payment method will be reviewed and after that you are live.


It wasn't immediately clear to me that I had to wait for the confirmation email, but when I went to Payment Method again, I saw the review was still in progress:


And a few minutes later I got the email that my account was upgraded:


When you look at your Oracle Cloud Dashboard, there's a cost calculator, so you see how much you have to pay. As long as I use the Always FREE components, I expect the amount to stay 0 :)


But the nice thing now is that you have access to all of Oracle Cloud again (e.g. Email Delivery).
Categories: Development

OGB Appreciation Day: add an error in a PL/SQL Process to the inline notification in Oracle APEX

Dimitri Gielis - Thu, 2019-10-10 12:01
This post is part of the OGB (Oracle Groundbreakers) Appreciation Day 2019, a thank you to everyone that makes the community great, especially those people that work at keeping us all moving!

Before I give my tip on how to add an error message from your PL/SQL code in your Page Process to a notification message in Oracle APEX, I want to start by thanking some people.

What keeps me going are a few things:

  • The innovations of technology and more specifically the Oracle Database, ORDS, and Oracle APEX. I want to thank all the developers and the people behind those products. They allow me to help other people with the tools they create and keep on learning about the new features that are released. 
  • I want to thank the fantastic #orclapex (Oracle APEX) and Groundbreakers community. I believe we are a great example of how people help and support each other and are motivated to bring the technology further. Over time I got to know a lot of people, many of whom I now consider friends.
  • I want to thank you because you read this, show your appreciation and push me forward to share more. I'm passionate about the technology I use. I love helping people with my skill set of developing software and while I learn, share my knowledge through this blog. 

So back to my tip for today... how do you show a message in the notification area of a page?


You can do that with the APEX_ERROR PL/SQL API.


To see the usage yourself, create an empty page with one region and a button that submits the page.
In the Submit process, simulate some PL/SQL code that raises an error.

For example:
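A minimal sketch of what such a process body could look like (the page item name and message text are illustrative; it's the APEX_ERROR.ADD_ERROR call with apex_error.c_inline_in_notification that places the message in the notification area):

begin
  -- hypothetical check inside an "On Submit" page process;
  -- P1_AMOUNT is an illustrative page item name
  if :P1_AMOUNT is null then
    apex_error.add_error(
      p_message          => 'Amount is required.',
      p_display_location => apex_error.c_inline_in_notification);
  end if;
end;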


That's it! Now you can get your errors in the notification message area of your Oracle APEX Page.

Categories: Development

Free Oracle Cloud: 13. Final things to take away

Dimitri Gielis - Thu, 2019-10-10 04:42
This post is the last post of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

By now we have seen how you can set up the different components from the Always Free Oracle Cloud.

During Oracle Open World I talked to the people behind the Always Free Oracle Cloud, and they told me that when your account is inactive for a specified amount of time (I forgot if it's 5 days, or a week or more?), your instance is backed up to Object Storage. You can see it as a VM that is put in standby or halted and saved to disk. When you need it again, it can be restored, but that takes time and it might be annoying when you don't know this is what is happening.

If you have a production app running in the Free Oracle Cloud, be sure people use your app at least once inside the window Oracle foresees. Maybe in the future, Oracle could provide a setting where we, as developers, can specify the (in-)activity window.

I'm really impressed by this free offering of Oracle and see many use cases for development environments and small to midsize applications. I believe the limits we get in the free plan are really generous of Oracle and much more than what any other cloud provider offers.
Here's a quick overview of what it looks like at the time of writing:
  • 2 Autonomous Databases, each with 1 OCPU and 20 GB storage
  • 2 Compute virtual machines, each with 1/8 OCPU and 1 GB memory
  • Storage:  2 Block Volumes, 100 GB total. 10 GB Object Storage. 10 GB Archive Storage.
  • Additional Services:  Load Balancer, 1 instance, 10 Mbps bandwidth. Monitoring, 500 million ingestion data points, 1 billion retrieval data points. Notifications, 1 million delivery options per month, 1,000 emails sent per month. Outbound Data Transfer, 10 TB per month.
So what if you outgrow these limits? It means your applications are successful, so you can be proud of that :) and at that time hopefully there's enough revenue to upgrade to a paid Oracle Cloud plan. This can be done super easily... you click the upgrade to the paid plan button and there you go!
Oracle will copy your DB, instance, ... and you go from there.

The way that Oracle is doing the upgrade is really cool, as it means you keep your free instance. So I see myself doing some development on the free instance, then upgrading to a paid plan for production. At that time I still have the development environment. The other free service could be the TEST environment, so you have DEV and TEST for free and PROD paid.


If you didn't check it out by now, go and try out the FREE Oracle Cloud yourself by going to https://www.oracle.com/cloud/free/ :)


Thanks Oracle!
Categories: Development

Free Oracle Cloud: 12. Create a 2nd Compute Instance and a Load Balancer

Dimitri Gielis - Fri, 2019-10-04 11:51
This post is part of a series of blog posts on the Best and Cheapest Oracle APEX hosting: Free Oracle Cloud.

In my blog post Create a VM Instance (Compute Cloud) we created a VM instance in the Free Oracle Cloud. The cool thing is that you get two VMs for free. In this post, we will set up the other always free compute instance.

Just like when we created our first instance, hit the Create a VM instance:


Give your instance a name. Previously I just hit the Create button, BUT this time you want to open the Show Shape, Network and Storage Options section first:


The most important part of that screen is the "Assign public IP address" section. If you don't need this Compute instance to be accessible from the internet you can ignore it, but if you want to host a website, for example, you might want to check it. If you didn't do it, you can always add a public IP later, but I personally found the network piece cumbersome and hard to understand. I had to do many different steps to get an internet connection to that machine working, whereas when you have a public IP address, Oracle does everything for you... anyway, what you need depends on your use case, but I do want to highlight it. Also, it seems that the default changed from when I wrote the first post; by default, you don't have a public IP address. It might be that Oracle is trying to push you to use a Load Balancer (see later on in this blog post) and that might actually make sense.



Clicking the Create button will show that your instance is being provisioned.


When you go back to the overview you should see both of your Always Free Compute instances:


Clicking on the name, you will get the details. This screenshot shows when you don't specify a public IP address.


To access that machine, as it doesn't have a public IP, I connected to my first instance and from there, as I am on the subnet, I can connect to the Private IP Address:


An alternative to pointing a URL directly at your VM instance is to front it with a Load Balancer.

Which brings us to the Load Balancer topic. With the Always Free Oracle Cloud, we also get a Load Balancer for free. There are different use cases for using a Load Balancer, but here are my own reasons why I have used a Load Balancer before:

  1. Distribute the traffic automatically over different machines. For example, when you use our APEX Office Print (AOP) Cloud you will actually hit our load balancer; behind the load balancer we have two to five different machines. It's not only to handle the large number of prints we get, but it also makes our lives easier when we want to upgrade without downtime. We upgrade one clone instance, and when done, new machines are brought online and old ones are shut down. We patch our own service with zero downtime.
  2. The Load Balancer has the SSL certificate and handles the HTTPS requests while the backend servers have HTTP.
  3. On a Load Balancer, you have integrated health checks, so you can be warned when things go wrong, even when there's only one server behind the Load Balancer.

So let's get started setting up a Load Balancer in the Oracle Cloud:

Click on Networking > Load Balancers:


Click the Create Load Balancer button:


It will ask for a name and type. For the Always Free instance, use Micro with Maximum Total Bandwidth.
By default Small is selected, so don't forget to change it:


Next you want to add a Backend to this Load Balancer, so click the Add Backends button:


In the pop-up you can select the instances you want to put behind this Load Balancer:


Furthermore, on the screen you can select a Health Check Policy:


In the next step, you can upload the SSL certificate, in case you want the Load Balancer to be accessible through HTTPS. You can also choose to just configure the Load Balancer for HTTP (which I don't recommend):


Hit the Create Load Balancer button and you will get an overview showing that the Load Balancer is being created:


Once it's ready the icon turns green and you will see the Public IP Address of your Load Balancer:


Instead of putting the IP Address of your instance directly in the DNS of your domain name, you put the IP Address of the Load Balancer in.

A Load Balancer can do much more: you can have different rules, SSL tunneling, etc. You can read more about that in the online documentation.

Hopefully, now you know how to set up a second compute instance and you have an idea what a Load Balancer can do for you.

We are almost done with this series... but you definitely want to read the next blog post, which is the last one, where I give some important information on keeping your Always Free instance running.
Categories: Development
