One last 19c upgrade issue. While working on our new 19c database, several things failed with errors like this:
SQL> execute DBMS_STATS.CREATE_STAT_TABLE ('MYSCHEMA','MYSTATTAB','MYTS');
BEGIN DBMS_STATS.CREATE_STAT_TABLE ('MYSCHEMA','MYSTATTAB','MYTS'); END;
*
ERROR at line 1:
ORA-60019: Creating initial extent of size 14 in tablespace of extent size 8
ORA-06512: at "SYS.DBMS_STATS", line 20827
ORA-06512: at "SYS.DBMS_STATS", line 20770
ORA-06512: at "SYS.DBMS_STATS", line 20765
ORA-06512: at line 1
Our tablespaces had small uniform extents and our 19c database had defaulted the parameter db_securefile to PREFERRED. We bumped our uniform extent sizes up to 1 megabyte and the problem went away. Setting db_securefile to PERMITTED also resolved the issue.
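For reference, here is a hedged sketch of the two fixes; the tablespace name, datafile path, and sizes are illustrative, not our actual settings:
-- Option 1: use 1 MB (or larger) uniform extents for the tablespace that will hold the stats table
create tablespace myts datafile '/u01/oradata/mydb/myts01.dbf' size 100m
extent management local uniform size 1m;
-- Option 2: make basicfile LOBs the default again so the small uniform extent size is not a problem
alter system set db_securefile=PERMITTED scope=both;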
Oracle’s support site has a bunch of good information about this. This might be a relevant bug:
Bug 9477178 : ORA-60019: CREATING INITIAL EXTENT OF SIZE X IN TABLESPACE FOR SECUREFILES
Quick post before I leave on vacation. We used Datapump to import a schema from an 11.2 HP-UX database to a 19c Linux database and got errors like these on a few tables:
ORA-39083: Object type TABLE:"MYSCHEMA"."TEST" failed to create with error:
ORA-00904: "SYS_STU0S46GP2UUQY#45F$7UBFFCM": invalid identifier
Failing sql is:
ALTER TABLE "MYSCHEMA"."TEST" MODIFY ("SYS_STU0S46GP2UUQY#45F$7UBFFCM" NUMBER GENERATED
ALWAYS AS (SYS_OP_COMBINED_HASH("COL1","COL2","COL3")) VIRTUAL )
The workaround was to pre-create the table empty, with no indexes or constraints, and then import the data. Today I was trying to figure out why this happened. It turns out the table has extended statistics on its three primary key columns. I found a post by Jonathan Lewis that shows a virtual column like the one on this table, created by extended statistics. The error occurs on the Datapump import, impdp, of a table that has extended statistics. It is similar to some documented Oracle issues such as:
DataPump Import (IMPDP) Raises The Errors ORA-39083 ORA-904 Due To Virtual Columns Dependent On A Function (Doc ID 1271176.1)
But I could not immediately find anything that says extended statistics can cause a table to fail to import with Datapump impdp.
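If you are planning a similar migration, a query like this one (my own hedged sketch, not from Oracle support or my original scripts) lists the tables that have extended statistics and so might hit the error:
select owner, table_name, extension_name, extension
from dba_stat_extensions
where owner='MYSCHEMA'
order by table_name, extension_name;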
If you want to recreate the problem, try adding extended stats like this (which I derived from Jonathan Lewis’s post):
select dbms_stats.create_extended_stats(NULL,'TEST','(COL1, COL2, COL3)') name from dual;
select * from user_tab_cols where table_name='TEST';
Then export the table from 11.2 and import it into a 19c database using Datapump. Anyway, I am posting this here for my own memory and in case others find it useful. Maybe this is a bug?
Bobby
1/8/2020 update
Simplified my test case table to this:
CREATE TABLE test
(
COL1 NUMBER,
COL2 NUMBER
)
;
select dbms_stats.create_extended_stats(NULL,'TEST','(COL1, COL2)') name from dual;
Datapump export of this table from 11.2.0.4 fails on import into 18c and 19c with ORA-00904, but not into 12.2.
1/9/20
Oracle support verified that there are a couple of bugs already out there for this.
1/13/20
You only get this error if you use the SCHEMAS parameter in Datapump. It works if you use the TABLES parameter.
This in the parfile fails:
SCHEMAS=MYSCHEMA
INCLUDE=TABLE:"IN ('TEST')"
This works:
TABLES=TEST
1/15/20
My bug number for what it’s worth: 30763851. Probably will be a duplicate since there are several already out there. I don’t think there is a patch yet.
6/3/20
I applied the supplied patch today and it works great. Boom!
This is nothing new, but I wanted to throw out a quick post to document it. If you have a sequence.nextval in the insert part of a merge statement, the merge calls nextval for all the updated rows as well.
Oracle has a bug report about this from a 9.2 issue, so this is nothing new:
Bug 6827003 : SEQUENCE # IN MERGE BEING UPDATED FOR BOTH INSERT AND UPDATE
Oracle’s bug report says you can work around the issue by wrapping the sequence.nextval call in a function, so I tried it and it works.
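Here is a hedged sketch of what I mean, with illustrative object names (these are not my real tables or the bug report’s test case):
create table merge_target (id number primary key, seq_val number, val number);
create table merge_src (id number, val number);
create sequence test_seq;
-- With nextval directly in the insert branch, the sequence is advanced for updated rows too:
merge into merge_target t
using merge_src s
on (t.id = s.id)
when matched then update set t.val = s.val
when not matched then insert (id, seq_val, val)
values (s.id, test_seq.nextval, s.val);
-- Workaround from the bug report: wrap nextval in a function so it is only evaluated for inserted rows:
create or replace function next_test_seq return number is
begin
  return test_seq.nextval;
end;
/
merge into merge_target t
using merge_src s
on (t.id = s.id)
when matched then update set t.val = s.val
when not matched then insert (id, seq_val, val)
values (s.id, next_test_seq, s.val);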
Anyway, you cannot count on the sequence being advanced only for inserted rows if you include sequence.nextval in the insert part of a merge statement.
I have mentioned in previous posts that I am working on migrating a large 11.2 database on HP Unix to 19c on Linux. I ran across a database link to an older 9.2 database in the current 11.2 database. That link does not work in 19c so I thought I would blog about my attempts to get it to run in 19c. It may not be that useful to other people because it is a special case, but I want to remember it for myself if nothing else.
First, I’ll just create a test table in my own schema on a 9.2 development database:
SQL> create table test as select * from v$version;
Table created.
SQL>
SQL> select * from test;
BANNER
----------------------------------------------------------------
Oracle9i Enterprise Edition Release 9.2.0.5.0 - 64bit Production
PL/SQL Release 9.2.0.5.0 - Production
CORE 9.2.0.6.0 Production
TNS for HPUX: Version 9.2.0.5.0 - Production
NLSRTL Version 9.2.0.5.0 - Production
Next, I will create a link to this 9.2 database from a 19c database. I will hide the part of the link creation that has my password and the database details, but they are not needed.
SQL> create database link link_to_92
... removed for security reasons ...
Database link created.
SQL>
SQL> select * from test@link_to_92;
select * from test@link_to_92
*
ERROR at line 1:
ORA-03134: Connections to this server version are no longer supported.
So I looked up ways to get around the ORA-03134 error. I can’t remember all the things I checked but I have a note that I looked at this one link: Resolving 3134 errors. The idea was to create a new database link from an 11.2 database to a 9.2 database. Then create a synonym on the 11.2 database for the table I want on the 9.2 system. Here is what that looks like on my test databases:
SQL> select * from v$version;
BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
... removed for brevity ...
SQL> create database link link_from_112
... removed for security ...
Database link created.
SQL> create synonym test for test@link_from_112;
Synonym created.
SQL>
SQL> select * from test;
BANNER
----------------------------------------------------------------
Oracle9i Enterprise Edition Release 9.2.0.5.0 - 64bit Production
Now that I have the link and synonym on the 11.2 middleman database, I go back to the 19c database and create a link to the 11.2 database and query the synonym to see the original table:
SQL> select * from v$version;
BANNER ...
-------------------------------------------------------------------------------- ...
Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production ...
...
SQL> create database link link_to_112
...
Database link created.
...
SQL> select * from v$version@link_to_112;
BANNER
--------------------------------------------------------------------------------
Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 - 64bit Production
...
SQL> select * from test@link_to_112;
BANNER
----------------------------------------------------------------
Oracle9i Enterprise Edition Release 9.2.0.5.0 - 64bit Production
So far so good. I am not sure how clear I have been, but the point is that I could not query the table test on the 9.2 database from a 19c database without getting an error. By going through an 11.2 database I can now query it. But, alas, that was not the end of my problems with this remote 9.2 table.
When I first started looking at these remote 9.2 tables in my real system, I wanted to get an execution plan of a query that used them. The link through an 11.2 database trick let me query the tables but not get a plan of the query.
SQL> truncate table plan_table;
Table truncated.
SQL>
SQL> explain plan into plan_table for
2 select * from test@link_to_112
3 /
Explained.
SQL>
SQL> set markup html preformat on
SQL>
SQL> select * from table(dbms_xplan.display('PLAN_TABLE',NULL,'ADVANCED'));
PLAN_TABLE_OUTPUT
------------------------------------------------------------------------------
Error: cannot fetch last explain plan from PLAN_TABLE
SQL>
SQL> select object_name from plan_table;
OBJECT_NAME
------------------------------------------------------------------------------
TEST
Kind of funky but not the end of the world. Only a small number of queries use these remote 9.2 tables, so I should be able to live without explain plan. Next, I needed to use the remote table in a PL/SQL package. For simplicity I will demonstrate with a procedure:
SQL> CREATE OR REPLACE PROCEDURE BOBBYTEST
2 AS
3 ver_count number;
4
5 BEGIN
6 SELECT
7 count(*) into ver_count
8 FROM test@link_to_112;
9
10 END BOBBYTEST ;
11 /
Warning: Procedure created with compilation errors.
SQL> SHOW ERRORS;
Errors for PROCEDURE BOBBYTEST:
LINE/COL ERROR
-------- -----------------------------------------------------------------
6/3 PL/SQL: SQL Statement ignored
6/3 PL/SQL: ORA-00980: synonym translation is no longer valid
I tried creating a synonym for the remote table but got the same error:
SQL> create synonym test92 for test@link_to_112;
...
SQL> CREATE OR REPLACE PROCEDURE BOBBYTEST
2 AS
3 ver_count number;
4
5 BEGIN
6 SELECT
7 count(*) into ver_count
8 FROM test92;
9
10 END BOBBYTEST ;
11 /
Warning: Procedure created with compilation errors.
SQL> SHOW ERRORS;
Errors for PROCEDURE BOBBYTEST:
LINE/COL ERROR
-------- -----------------------------------------------------------------
6/3 PL/SQL: SQL Statement ignored
6/3 PL/SQL: ORA-00980: synonym translation is no longer valid
Finally, by chance I found that if I used a view instead of a synonym for the remote table, the procedure would compile:
SQL> create view test92 as select * from test@link_to_112;
View created.
...
SQL> CREATE OR REPLACE PROCEDURE BOBBYTEST
2 AS
3 ver_count number;
4
5 BEGIN
6 SELECT
7 count(*) into ver_count
8 FROM test92;
9
10 END BOBBYTEST ;
11 /
Procedure created.
SQL> SHOW ERRORS;
No errors.
SQL>
SQL> execute bobbytest;
PL/SQL procedure successfully completed.
SQL> show errors
No errors.
Now for one last thing to check: will explain plan work with the view?
SQL> explain plan into plan_table for
2 select * from test92
3 /
Explained.
SQL> select * from table(dbms_xplan.display('PLAN_TABLE',NULL,'ADVANCED'));
PLAN_TABLE_OUTPUT
------------------------------------------------------------------------------
Error: cannot fetch last explain plan from PLAN_TABLE
Sadly, the view was not a cure-all. So, here is a summary of what to do if you have a procedure on a 19c database that needs to access a table on a 9.2 database:
Create a link on an 11.2 database to the 9.2 database
Create a synonym on the 11.2 database pointing to the table on the 9.2 database
Create a link on the 19c database to the 11.2 database
Create a view on the 19c database that queries the synonym on the 11.2 database
Use the view in your procedure on your 19c database
Explain plan may not work with SQL that uses the view
I mentioned in my previous two posts that I had tried to figure out if it would be safe to turn on force logging on a production database that does a bunch of batch processing on the weekend: post1, post2. We know that many of the tables are set to NOLOGGING and some of the inserts have the append hint. We put in force logging on Friday and the heavy weekend processing ran fine last weekend.
I used an AWR report to check the top INSERT statements from the weekend and I only found one that was significantly slower. But the table it inserts into is set for LOGGING, it does not have an append hint, and the parallel degree is set to 1. So, it is a normal insert that was slower last weekend for some other reason. Here is the output of my sqlstatsumday.sql script for the slower insert:
It averaged 197062 milliseconds last weekend but 89684 the previous one. The target table has always been set to LOGGING so FORCE LOGGING would not change anything with it.
One of the three INSERT statements that I expected to be slowed by FORCE LOGGING was faster this weekend than without FORCE LOGGING last weekend:
It ran 2421533 milliseconds this weekend and 5672230 the prior one. So clearly FORCE LOGGING did not have much effect on its overall run time.
It went so well this weekend that we decided to leave FORCE LOGGING in for now to see if it slows down the mid-week jobs and the web-based front end. I was confident on Friday, but I am even more confident now that NOLOGGING writes have minimal performance benefits on this system. But we will let it bake in for a while. Really, we might as well leave it in for good if only for the recovery benefits. Then when we configure GGS for the zero downtime upgrade it will already have been there for some time.
The lesson for me from this experience and the message of my last three posts is that NOLOGGING writes may have less benefit than you think, or your system may be doing fewer NOLOGGING writes than you think. That was true for me for this one database. It may be true for other systems that I expect to have a lot of NOLOGGING writes. Maybe someone reading this will find that they can safely use FORCE LOGGING on a database that they think does a lot of NOLOGGING writes, but which really does not need NOLOGGING for good performance.
I am working on upgrading an Oracle database from 11.2.0.4 to 19c and migrating it from HP Unix to Linux. This 15-terabyte database is too large to copy from the old to the new system during our normal weekend downtime window. It also has a ton of weekend batch updates that overlap the normal weekend change window so it would be best for our business processing if the cut over from the old to the new system was as fast as possible.
I want to use GoldenGate to minimize the downtime for the cutover using an approach similar to what is described in this Oracle document:
You start GoldenGate collecting changes on the current production system and then take your time copying the 15 TB of data from the old system to the new one. Once you are done with the initial load, you apply the changes that happened in the meantime. Finally, you cut over to the new system. You could even switch the direction of the replication to push changes on the new production system back to the old system to allow for a mid-week back out several days after your upgrade. Pretty cool. A teammate of mine successfully used this approach on an important database some years back.
But the database that I am working on now, unlike the one that my colleague worked on, has a lot of tables set to nologging. Under the right conditions inserts into tables set to nologging are not written to the redo logs and will be missed by GoldenGate. This Oracle article recommends setting your database to FORCE LOGGING so GoldenGate will not miss any updates:
In order to ensure that the required redo information is contained in the Oracle redo logs for segments being replicated, it is important to override any NOLOGGING operations which would prevent the required redo information from being generated. If you are replicating the entire database, enable database force logging mode.
We could also switch all our application tables and partitions in the source system to LOGGING, but we have so many that I think we would just set the whole database to force logging.
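Both options, as a hedged sketch with illustrative object names:
-- Database-level force logging (what we are leaning toward):
alter database force logging;
-- Or per object, repeated for every table and partition involved:
alter table myowner.some_table logging;
alter table myowner.some_part_table modify partition p1 logging;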
But the big question, which I touched on in my previous post, is whether force logging will slow down our weekend batch processing so much that we miss our deadlines and hurt the business. The more I investigate, the more convinced I am that force logging will have minimal impact on our weekend jobs. This is an unexpected and surprising result. I really thought that our batch processing relied heavily on nologging writes to get the performance it needs. It makes me wonder why we are using nologging in the first place. It would be a lot better for backup and recovery to have all our inserts logged to the redo logs. Here is a nice Oracle Support document that lays out the pros and cons of using nologging:
The Gains and Pains of Nologging Operations (Doc ID 290161.1)
I have an entry in my notes for this upgrade project dated 8/26/19 in which I wrote “Surely force logging will bog the … DB down”. Now I think the opposite. So, what changed my mind? It started with the graph from the previous post:
I was really surprised that the purple line was so low compared to the other two. But I felt like I needed to dig deeper to make sure that I was not just misunderstanding these metrics. The last thing I want to do is make some production change that slows down our weekend processes that already struggle to meet their deadlines. I was not sure what other metrics to look at since I could not find something that directly measures non-logged writes. But then I got the idea of using ASH data.
In my “Fast way to copy data into a table” post I said that to copy data quickly between two Oracle tables “you want everything done nologging, in parallel, and using direct path”. I may have known then and forgotten but working on this now has me thinking about the relationship between these three ways of speeding up inserts into tables. I think there are the following two dependencies:
Nologging requires direct path
Parallel requires direct path
Oracle document “Check For Logging / Nologging On DB Object(s) (Doc ID 269274.1)” describes the first dependency. For the second, if the target table is set to a parallel degree greater than 1 and you enable parallel DML, you get direct path writes when you insert into the target table, as sketched below.
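Here is a hedged sketch of that second case, with illustrative table names:
alter session enable parallel dml;
alter table target_tab parallel 8;
-- This conventional-looking insert now runs as a parallel, direct path load:
insert into target_tab select * from source_tab;
commit;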
From all this I got the idea to look for direct path write waits in the ASH views. I could use ASH to identify insert statements that are using direct path. Then I could check that the target tables or partitions are set to nologging. Then I would know they are doing non-logged writes even if I did not have a metric that said so directly.
My directwritesql.sql script looked at all the SQL statements that had direct path write waits over the entire 6 weeks of our AWR history. Here is the query and its output:
select
2 sql_id,count(*) active
3 from DBA_HIST_ACTIVE_SESS_HISTORY a
4 where
5 event = 'direct path write'
6 group by sql_id
7 order by active desc;
SQL_ID ACTIVE
------------- ----------
2pfzwmtj41guu 99
g11qm73a4w37k 88
6q4kuj30agxak 58
fjxzfp4yagm0w 53
bvtzn333rp97k 39
6as226jb93ggd 38
0nx4fsb5gcyzb 36
6gtnb9t0dfj4w 31
3gatgc878pqxh 31
cq433j04qgb18 25
These numbers startled me because they were so low. Each entry in DBA_HIST_ACTIVE_SESS_HISTORY represents 10 seconds of activity. So over 6 weeks our top direct path write waiter waited 990 seconds. Given that we have batch processes running full out for a couple of days every weekend 990 seconds over 6 weekends is nothing.
I took the top SQL ids and dumped out the SQL text to see what tables they were inserting into. Then I queried the LOGGING column of dba_tables and dba_tab_partitions to see which insert was going into a table or partition set to nologging.
select logging,table_name
from dba_tables
where owner='MYOWNER' and
table_name in
(
... tables inserted into ...
)
order by table_name;
select logging,table_name,count(*) cnt
from dba_tab_partitions
where table_owner='MYOWNER' and
table_name in
(
... tables inserted into ...
)
group by logging,table_name
order by table_name,cnt desc;
This simple check for LOGGING or NOLOGGING status eliminated several of the top direct path write waiters. This process reduced the list of SQL ids down to three top suspects:
SQL_ID ACTIVE
------------- ----------
cq433j04qgb18 25
71sr61v1rmmqc 17
0u0drxbt5qtqk 11
These are all inserts that are not logged. Notice that the most active one has 250 seconds of direct path write waits over the past 6 weeks. Surely enabling force logging could not cause more than about that much additional run time over the same length of time.
TOTAL_SAMPLE_COUNT was all the samples for that SQL_ID value for the past 6 weeks. DW_SAMPLE_COUNT is the same count of samples that are direct write waits that we already talked about. DW_SAMPLE_PCT is the percentage of the total samples that were direct write wait events. They were all around 1% or lower which means that write I/O time was only about 1% of the entire run time of these inserts. The rest was query processing best I can tell.
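I have not included that script here, but a hedged reconstruction of the kind of query that produces those columns looks like this:
select sql_id,
count(*) total_sample_count,
sum(case when event = 'direct path write' then 1 else 0 end) dw_sample_count,
round(100*sum(case when event = 'direct path write' then 1 else 0 end)/count(*),2) dw_sample_pct
from DBA_HIST_ACTIVE_SESS_HISTORY
where sql_id in ('cq433j04qgb18','71sr61v1rmmqc','0u0drxbt5qtqk')
group by sql_id
order by dw_sample_pct desc;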
Also I used my sqlstat script to look at the average run time for these inserts:
These queries run at most a couple of hours. If direct path writes are 1% of their total run time, I estimated that force logging would add about 1% to the elapsed time or about 2 minutes per execution.
The final step was to try to run one of these top nologging I/O inserts in a test environment with and without force logging to see if the test matches the expected performance slowdown. I was not able to run 0u0drxbt5qtqk without setting up a more elaborate test with the development team. My test of cq433j04qgb18 ran considerably faster with force logging than without it so I think other factors were hiding whatever effect force logging had. But 71sr61v1rmmqc had some nice results that matched my estimates well. This is on a Delphix clone of production so the data was up to date with prod but the underlying I/O was slower.
The individual run times are in seconds and the averages are listed in seconds and in minutes. I ran the insert 5 times without force logging and 5 times with it, alternating between the two. I dropped the primary key and unique index of the target table to keep from getting constraint errors, and I rolled back the insert each time. It averaged about 1.2 minutes more out of 40 minutes of run time, which is about a 3% increase. My estimate from ASH was about 1% so this test matches that well.
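Here is a hedged sketch of one pair of runs; the real insert statement and object names are not shown, this is just the shape of the test:
-- run without force logging, time it, then roll back so the next run starts from the same data
alter database no force logging;
set timing on
insert into target_tab select * from source_tab;
rollback;
-- flip to force logging and repeat the same insert
alter database force logging;
insert into target_tab select * from source_tab;
rollback;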
The final test remains. In some upcoming production weekend, I will put in a change to flip the database to force logging and see how it goes. My tests were run on a test system with a different storage system and with no other activity. We might see different results on a heavily loaded system with a queue for the CPU. But, after all this analysis and testing I feel confident that we won’t be able to tell that force logging is enabled. Unfortunately, we sometimes have performance issues anyway due to plan changes or data volume so the force logging switch might get blamed. But I feel confident enough to push for the final test and I think we ultimately will pass that test and be able to use force logging to enable GoldenGate to support a short cut over time for our migration and upgrade project.
Bobby
P.S. A good question came in as a comment about direct path write waits and asynchronous I/O. The system I am testing on does not support async I/O because it is HP-UX and the database files are on a filesystem. This older blog post talks a bit about async and direct I/O on HP-UX:
So, your mileage may vary (YMMV) if you do these same queries on a system with asynchronous writes. Linux filesystems support async writes and on HP-UX our RAC system on ASM supports it. It is one of the challenges of writing blog posts. Other people may be in different situations than I am.
I am trying to figure out how much non-logged write I/O an Oracle database is doing. I want to run an ALTER DATABASE FORCE LOGGING command on the database so that I can use Oracle GoldenGate (GGS), which reads updates from Oracle’s logs. GGS will miss writes that are not logged. But if I turn on force logging it may slow down applications that depend on non-logged writes for good performance. So, I want to find some Oracle performance metrics that give me an idea of how much non-logged write I/O we have, so that I can estimate how much force logging will degrade performance.
I created SQL*Plus and PythonDBAGraphs reports based on DBA_HIST_IOSTAT_FUNCTION that gives some insight into the write I/O that is not logged. Here is the Python based graphical version of the report for one recent weekend:
The purple-blue line represents Direct Writes. These may or may not be logged. The red-orange line represents writes through the DBWR process. These are always logged. The light green line represents log I/O through the LGWR process. My theory is that if the purple line is above the green line the difference must be write I/O that is not logged. But if the green line is equal to or greater than the purple line you really do not know if there was any write I/O that was not logged. But if there is non-logged write I/O it cannot be more than the amount indicated by the purple line. So, this graph does not directly answer my question about how much write I/O was not logged but it does show some numbers that relate to the question.
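The report is based on a query something like this. This is a hedged reconstruction, not the exact SQL*Plus or PythonDBAGraphs script; DBA_HIST_IOSTAT_FUNCTION values are cumulative, so the report differences consecutive snapshots:
select sn.end_interval_time,
f.function_name,
(f.small_write_megabytes + f.large_write_megabytes)
- lag(f.small_write_megabytes + f.large_write_megabytes)
over (partition by f.function_name order by f.snap_id) write_mb
from dba_hist_iostat_function f
join dba_hist_snapshot sn
on f.snap_id = sn.snap_id
and f.dbid = sn.dbid
and f.instance_number = sn.instance_number
where f.function_name in ('Direct Writes','DBWR','LGWR')
order by sn.end_interval_time, f.function_name;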
I did some experiments with the V$IOSTAT_FUNCTION view that populates DBA_HIST_IOSTAT_FUNCTION to see what values it gives for Direct Writes, DBWR, and LGWR using different scenarios. Here is the zip of these scripts and their output: nologgingscriptsandlogs09122018.zip. I tested four scenarios:
Insert append nologging
Insert append logging
Insert noappend logging
Insert noappend nologging
1 and 2 did Direct Writes. 3 and 4 did DBWR writes. 2, 3, and 4 all did LGWR writes.
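Here is a hedged sketch of the four test inserts; the object names are illustrative and the actual scripts and their output are in the zip linked above:
create table test_nolog (a number) nologging;
create table test_log (a number) logging;
insert /*+ append */ into test_nolog select level from dual connect by level <= 100000; -- 1: direct writes, no redo for the data
commit;
insert /*+ append */ into test_log select level from dual connect by level <= 100000;   -- 2: direct writes, redo written by LGWR
commit;
insert into test_log select level from dual connect by level <= 100000;                 -- 3: DBWR writes, redo written by LGWR
commit;
insert into test_nolog select level from dual connect by level <= 100000;               -- 4: DBWR writes, redo written by LGWR anyway
commit;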
Here are the relevant sections of the output that correspond to these statements.
This pattern is similar to that in an Ask Tom post that I wrote about a while back. That post showed the different situations in which writes were logged or not. I also got some ideas about direct writes and logging from this Oracle support document:
Check For Logging / Nologging On DB Object(s) (Doc ID 269274.1)
It sounds like conventional inserts into tables eventually get written to disk by DBWR, while inserts with the append hint write directly to the datafiles and may or may not generate redo for LGWR to write out.
These tests and documents gave me the idea of building a report and graph based on DBA_HIST_IOSTAT_FUNCTION showing the values for the Direct Writes, DBWR, and LGWR FUNCTION_NAME values. The graph above shows an example of a real system. I was surprised to see how high the DBWR and LGWR values were and how low the Direct Writes were. That made me think that it would be safe to try turning on FORCE LOGGING because it likely will have minimal impact on the overall weekend processing. It gave me enough evidence to push for approval to do a controlled test of enabling FORCE LOGGING in production over an upcoming weekend. I will update this post with the results if we move forward with the test.
The Arizona Oracle User Group (AZORA) is cranking up its meeting schedule again now that the blazing hot summer is starting to come to an end. Our next meeting is Friday, September 27, 2019 from 12:00 PM to 4:00 PM MST.
Thank you to Republic Services for allowing us to meet in their fantastic training rooms.
Thanks also to OneNeck IT Solutions for sponsoring our lunch.
OneNeck’s Biju Thomas will speak about three highly relevant topics:
Oracle’s Autonomous Database — “What’s the Admin Role?”
Oracle Open World #OOW 19 Recap
Let’s Talk AI, ML, and DL
I am looking forward to learning something new about these areas of technology. We work in a constantly evolving IT landscape so learning about the latest trends can only help us in our careers. Plus, it should be interesting and fun.
Definition of analog: of, relating to, or being a mechanism or device in which information is represented by continuously variable physical quantities
Merriam Webster Dictionary
Introduction
I just finished going off on a major tangent away from my normal home computer pursuits such as studying algorithms, messing with Nethack source code, or practicing Python programming. Prior to this diversion I pursued these computer-related activities for learning and fun in my free time outside of work. But I spent the last three months pursuing activities related to pens, pencils, paper, and notebooks. Some people like to call using a pen and notebook an “analog” activity, so I used that word for the post title.
For several years my primary analog tools have been cheap wide ruled composition notebooks and Pilot G-2 gel pens. For example, I used a composition notebook to keep track of the homework and tests for my OpenCourseWare algorithms classes. I also used a composition notebook for non-technical purposes such as taking notes about things I read or heard, or just writing about things going on in my life. But otherwise most of what I wrote down was in computer form. I use Microsoft Outlook for my calendar at work and keep notes and tasks in Microsoft Word documents and text files, both in the office and on my home laptop. A lot of information was just kept in email. I have stuff on my iPhone.
But back in May I started looking at better ways to use pens, pencils and paper. I started looking on the internet for reviews and recommendations of what to buy and I ended up spending a lot of time and probably a few hundred dollars trying out different products and coming up with my own preferences. Finally, two or three weeks back I stepped back from going deeper and deeper into this exploration of pens, pencils, paper, and notebooks. I had spent enough money and time researching the best options. Now it was time for me to just use the tools I had and not buy any more and not read any more about them.
Now that I have stopped exploring these analog tools, I thought I should write a blog post about what I learned. I have a bunch of bookmarks of interesting web sites that I found helpful. I also have the results of my own use of the tools. Clearly, I am not a pen, pencil, paper, or notebook expert. This is an Oracle database blog and my strongest skills are in the database arena. Also, would people who read this blog for Oracle tuning scripts and information care about pens and paper? I am not sure. But the information that I learned and gathered has been helpful to me and it has been fun. Maybe others can benefit from my experience and if they want more expert advice, they can follow the links in this post to people who specialize in these areas.
I have decided to break this post, which is almost surely going to be my longest to date, into sections that alternate between things you write with and things you write on. Here is the outline:
Introduction
Pilot G-2 Gel Pens
Graph Paper
Pen Party
Bullet Journal
Fountain Pens
Rhodia Dot Pads
Pencils
Conclusion
Lastly, I want to end the introduction with a caveat that is like those I find in a lot of the pen and paper blogs. I will have links to various businesses that sell products like pens or notebooks. I have not received any money to advertise these products, nor have I received any products for free. I bought all the products with my own money. As I mentioned in my privacy page, this blog is completely non-commercial and does not generate income in any way. I do not sell advertising or people’s emails to spammers or anything like that. My only income is from my job doing Oracle database work. I like having a blog and I cough up a couple hundred dollars a year for hosting, a domain name, and a certificate so I can put these posts out on the internet. So, don’t worry that I’m trying to sell you something because I am not.
Pilot G-2 Gel Pens
I have been using Pilot G-2 gel pens for several years but did not realize that they came in different widths before I started researching all of these analog tools. I was using my gel pen with my composition notebook and kept accidentally smearing the ink. So, I finally Google searched for something like Pilot G-2 review and that really kicked off all of this research into types of pens and related tools. I found out that Pilot G-2 pens came in four width tips and that I had accidentally bought the widest tip pens which are the worst at smearing. They just put the most ink on the paper. I looked back at my Amazon orders and found that I bought a dozen “Fine Point” G-2 pens in 2015, but when I ran out and reordered 12 more in 2017, I got the “Bold Point” which has a thicker line. If you look at the pictures on Amazon, the boxes look very similar. You must know what the widths are named. So, as simple as this sounds, it was helpful to learn about the different tip sizes of my favorite gel pens and to try out each size to find which one I wanted. Here is a table of the four sizes:
Clip #   Millimeters   Name         # Colors
0.38     .38           Ultra Fine   4
05       .5            Extra Fine   5
07       .7            Fine         16
10       1.0           Bold         8
The clips of the pens have numbers on them but only the .38 millimeter tip has a decimal point so that can be confusing. The .38 mm pen writes a very thin line. Evidently several manufacturers compete for the very thin line gel pen market. I have tried all four G-2 sizes and right now my favorite is the .5 mm extra fine.
I got the number of colors from Pilot’s site. It looks like the .7 mm fine has the most colors. I pretty much just use black, but I like red for editing.
A key thing that I learned about testing a new pen is that you must write a lot with it before you really know if you like it. Gel pens seem to start out a little uneven when you first write with them. Maybe they have been on a shelf somewhere for a long time. But after you write say 5 pages the ink starts to really flow well.
As I said in the introduction, I liked the Pilot G-2 gel pen before all this investigation began. But now I know much more about my favorite pen, such as the different tip sizes. I will talk about this later, but one result of all this research is that I have started to like pens that write thinner lines. I accidentally bought a dozen of the thickest line G-2 pens in 2017 and it almost threw me off the pen. Now I have settled down with a G-2 pen with a tip half the size of the one I struggled with and it has really helped me out.
Graph Paper
I got the idea of switching to graph paper from the handwritten lecture notes from an algorithms class I was working through. Evidently the professor wrote out his notes using several color pens on graph paper. It made me wonder if I should use graph paper too.
I had already started using lined loose leaf filler paper in addition to composition notebooks. I think part of the problem is that it has been 30 years since I was a student in school, and I am out of practice using paper and 3 ring binders and all. My daughters use all these tools in school without thinking about it but for me the idea of using some loose filler paper is a revelation. Maybe some of this is not so much learning but relearning how to use these analog tools that I used regularly during my school days.
The lined filler paper was great as scratch paper when I was working on a problem for my online class but then I could clean up the answer and write it in my sturdy composition notebook to keep long term. But it made sense to get graph paper instead of lined paper because it would help to line things up along the vertical lines. If I need to write some code or pseudo-code and line up blocks within loops or if statements, I can use the vertical lines. The horizontal lines just separate the lines of code.
I ended up buying a nice pad of graph paper from Amazon but didn’t realize that some of the lines were darker than others. I prefer the lines to all be the same darkness. Then I ended up returning something that I bought from Office Depot with a gift card and they would not give me cash back. So, I used the gift card that was the refund to buy several pads of loose three-hole punched graph paper as well as two or three composition books with graph paper instead of lines. The composition books are kind of neat but they seem a little cheap and flimsy.
Of the three types of graph paper that I tried I like the Office Depot loose paper the best. All the lines are the same darkness and it is already three-hole punched so I can throw it in a binder if I want to keep it. I can use it for scratch paper, or I can save what I write down. So, at this point I like my loose paper to be graph paper but I still like the wide rule lined composition notebooks over the ones with graph paper.
Pen Party
After reading the Pilot G-2 review I started reading reviews about other pens. Several web sites have best pen lists. The Gentleman Stationer has a nice best pens for 2020 page. The Pen Addict has lists of top 5 pens in different categories. Lastly, the online store JetPens, which specializes in Japanese pens, has their list of 33 best pens for 2019. One challenge that I had when I was looking at the different widths of G-2 pens is that I ended up buying packs of 4 or more pens on Amazon when I really just wanted one black pen to test. JetPens.com sells individual pens. If you spend $25 or more, you get free shipping. So, to test out some of the pens I had been reading about in the top pen lists I ordered several different kinds of pens, pencils, and an eraser, one of each kind.
Here was what I bought in my first order:
Uni-ball Signo UM-151 Gel Pen - 0.38 mm - Black $2.85
Pentel EnerGel Euro Needle-PoinT - 0.35 mm - Black $2.50
Platinum Preppy Fountain Pen - Black - 05 Medium Nib $4.00
Tombow Mono 100 Pencil - HB $2.35
Zebra Sarasa Dry Gel Pen - 0.7 mm - Black $2.95
Sakura Foam Eraser W 80 $1.50
Sakura Pigma Micron Pen - Size 02 - 0.3 mm - Black $2.50
Tombow Mono Drawing Pen - 03 - Black $2.89
Tombow Fudenosuke Brush Pen - Hard - Black $2.75
Uni Jetstream Sport Ballpoint Pen - 0.7 mm - Black Ink $3.15
Some of these were suggested by Bullet Journal enthusiasts but I will talk about that in a later section. I will also talk about fountain pens and pencils later. The point here is that JetPens.com is cool. You must chunk up your purchase to be $25 or more but it is nice picking out individual pens to try them. One of the reasons I wanted to write this post is that I wanted to share about fun vendors like JetPens for people like me who had never heard of them before.
I took these things from my JetPens order and other pens and pencils that I already had and convinced my wife and three daughters to join me for a “pen party” to try them out. We sat around our kitchen table passing around groups of similar pens and pencils and tried each one out. After everyone finished trying them, we all picked our favorites.
My favorite was the .38 mm UM-151 or Signo DX gel pen. I have been using the Signo DX pen for tiny writing in a smaller notebook where it helps to have the smallest possible lines. I think it is slightly better than its G-2 .38 mm competitor. But I prefer the .5 mm G-2 gel pen for writing on full size 8 1/2 x 11 inch paper. So, my favorite gel pens are the .5 mm G-2 for normal writing and the .38 mm UM-151 for extra small writing.
My wife and two of my daughters preferred the Zebra Sarasa Dry .7 mm gel pen because it had the thick line of the .7 mm G-2 pen but with fast drying ink so that it did not smear. I’m not as big of a fan of the Sarasa Dry because the clip kind of sticks out and gets in my way. I may also just be a Pilot G-2 loyalist. Also, I have moved toward the thinner lines so the fast dry ink is not as important to me. We have a few of these Sarasa Dry pens in our pen cup in our kitchen.
My youngest daughter liked the Uni Jetstream Sport .7 mm ballpoint pen. This pen draws a finer line than you would think a .7 mm pen would because it is a ballpoint and not a gel pen. It also does not smear and is waterproof. We got her a bunch of these to take off to college as an incoming freshman this year.
We did not have this pen for our pen party but I wanted to mention the space pen that my family got me for Father’s Day. I carry it around in my pocket with my iPhone 8. I do not like the way the ink looks nearly as well as that of my gel pens, but the space pen is supposed to write at any angle and even on wet paper and it is mostly waterproof. Plus, it comes in a just under 4-inch-long bullet shape that fits comfortably in my pocket and cannot leak. The space pen gives me a way to write on just about anything when I am away from my home or office.
I am not a pen expert, but I thought I would pass along my own experiences as well as the links to the much more knowledgeable bloggers and vendors with their top pens lists. Someone out there might enjoy the top pens lists and trying out individual pens as much as we did. It was fun and even practical.
Bullet Journal
At some point in this analog detour I started a Bullet Journal or BuJo for short. The main web site describes a BuJo as “The Analog Method for the Digital Age”. It sounds a little pretentious to call it “the” analog method instead of “an” analog method as if the Bullet Journal cures all ills. But I took the term analog from there to name this post. This post focuses on the analog tools and not the underlying philosophy or psychology behind the Bullet Journal. If you want to go deeper into that I recommend starting on the web site and then reading the book. I do not endorse everything in the book and web site or know if this method of journal writing really has all the claimed benefits. But I do endorse the tools that I have used such as the notebook which is very cool.
The Little Coffee Fox blog has a nice list of Bullet Journal supplies. I bought the black Leuchtturm1917 A5 Dotted Hardcover Notebook for my journal in May. A lot of the BuJo enthusiasts recommend this notebook. Even if you never get into bullet journals you might want to try one of these notebooks. The pages have dots instead of lines which is kind of like graph paper but less intrusive. The paper is nice quality and the notebook is hardbound and sturdy. The pages are numbered. I am writing this on August 25th, so I have been using my notebook for over 3 months. I like it a lot. Even if all the BuJo philosophy/psychology/method does not appeal to you the notebook itself is worth checking out.
I have tried several of the other BuJo supplies but I mainly use my Signo DX UM-151 .38 mm gel pen with my Leuchtturm A5 notebook along with a ruler. I got a foot long metal ruler with cork backing. I probably could have used any old straight edge just as well. I use it to draw lines and make boxes. I have not gotten into drawing and lettering as some BuJo enthusiasts do but I have purchased a couple of pens and a stencil to try. But I cannot really endorse something I do not use.
The Bullet Journal is all about using paper and pens instead of a computer which is really what this blog post is all about. What tools have I checked out to use offline? Can they work together with my computer tools to make me more productive and to help me have more fun?
Fountain Pens
I added a $4 fountain pen to my first JetPens order on a whim. I had to get the order up to $25 to get free shipping and $4 was reasonable for a fountain pen. If you look at the top pen lists above you will see that beginner fountain pens tend to run around $15 and yet my $4 Platinum Preppy was still considered a good pen. I got a second Preppy with my second JetPens order. The first was a medium nib and the second a fine nib. The medium has a .5 mm tip and the fine .3 mm. I had a lot of fun playing with these two pens. I ended up getting two matching converters so that I could use them with bottled ink and then I bought a nice bottle of black ink. Before I bought the full bottle of ink I got an ink sample and a pair of ink syringes so I could test out the ink.
While I experimented with my two Preppies I got a lot of helpful advice from the Reddit fountain pens group. Also vendors like JetPens and Goulet Pens have helpful videos and pages such as how to fill a fountain pen, how to clean a fountain pen, and how to use an ink sample. I think it makes good sense to start with a less expensive fountain pen and learn the ropes. The stereotypical experience of a new fountain pen user is that they do not learn how to take care of the pen, it stops working, and ends up in the back of a drawer unused. For example, I had trouble with the ink flow in my Preppies, so it helped to get advice on cleaning them and getting the ink flowing better.
After playing with my Preppies for a while I decided to get a nicer pen. If you read the top pen lists, they break fountain pens into price ranges like “under $50”, “$50 to $100”, and “over $100”. I tried to be good and get a pen in the middle range, but I had my eye on several gold nib pens in the over $100 range. Japanese fountain pens mess up these neat price ranges because some pens that cost over $150 in the US can be purchased for less than $100 if you buy them directly from Japan. So, I told myself that I could get a $170 Platinum 3776 gold nib pen from Japan for under $100 and that is still in the middle range. This led to a lot of stress and frustration. I tried eBay first. A week after eBay charged over $80 to my credit card, I got an email from the seller saying that my pen was not in stock and asking if I wanted a blue one instead of the black one I ordered. I cancelled the order, but it took several days to get my money back. Then I ordered the same pen from a seller on Amazon and that was a total scam. Criminals broke into some poor unsuspecting inactive Amazon seller’s account and redirected the account to their bank account. Then they put out a bunch of bogus products at bargain prices including the fountain pen that I ordered. It took me a little over two weeks to get my money back.
After three or four weeks of frustration trying to buy an expensive fountain pen at a discount directly from Japan, I decided that it made more sense to buy from a reputable dealer in the US. I bought my 3776 at Pen Chalet and the buyer experience could not have been more different from my eBay and Amazon experiences. I got the black pen with gold trim and a fine nib. Lots of people on the fountain pens group on Reddit swear by buying fountain pens from Japan and they have more experience than I do. I suggest that wherever you buy your expensive fountain pen, you contact the seller first and ask them if they have the pen in stock. If they do not respond to your message, then run away very fast. Also, you probably should not try to get the best deal. Pay $20 more to buy the pen from a well-known dealer in Japan that sells a lot of fountain pens instead of going for the lowest price. Or just forget shopping for bargains from Japan and go with a well-regarded US vendor like Pen Chalet. I did contact Pen Chalet about something else before buying my pen and they responded quickly. A quick response to a question is a good sign for fountain pen sellers.
My Platinum 3776 pen writes like a dream. I have the matching gold colored converter and I use Aurora Black ink. It writes a nice thin black line, kind of like the ones my .5 mm G-2 and .38 mm Signo DX gel pens write. The big question is why spend $179 (including sales tax) for a fountain pen when a $3 gel pen makes pretty much the same line? I am not sure. Some people argue that fountain pens are better for the environment because you can fill them with bottled ink but with other pens you use them once and throw them away filling up landfills with plastic. Someone on the internet said, “gel pens are for work and school and fountain pens are spiritual”. I am not sure what they meant by spiritual but using a high-quality fountain pen is a nice experience. I keep mine at home, so I do not lose it somewhere like all my other pens. It is kind of nice to sit down at my kitchen table and write with my fountain pen. Maybe someone will read this post and find enjoyment in fountain pens themselves.
Rhodia Dot Pads
Once I got into fountain pens, I needed some nice paper to write on. Many paper brands advertise as fountain pen friendly but I focused on Rhodia Dot Pads. These have dots like my Leuchtturm bullet journal notebook, but the pages are perforated so they can be removed. I started with the 6 x 8 1/4 inch pad because it was the best deal. I ended up writing on both sides of all 80 sheets and trying out different kinds of pens and pencils on it. We used these sheets in our family pen party. When I finished off this pad I bought the more expensive 8 1/4 by 11 3/4 inch pad and I really like it. I three-hole punch the pages after I rip them off and save them as fountain pen writing samples. I get a lot of enjoyment writing with my gold nib Platinum 3776 fountain pen on my full size Rhodia dot pad.
Before I started this analog detour, I wrote in a composition book with a gel pen. Today I write on my Rhodia pad with a fountain pen. One thing about the Rhodia paper is that it is smoother and less absorbent than cheaper paper. As a result, pens draw thinner lines on Rhodia paper. This probably would be more important with really wide fountain pen nibs, but it is nice that my fine nib pen leaves a nice sharp thin black line. The Rhodia paper is more expensive. At this instant you can get a full size Rhodia pad for $14.99. It has 80 sheets so that is about 19 cents per sheet. A 5 pack of Mead composition books will run you $16.75 for 500 sheets which is less than 4 cents per sheet. Is the Rhodia pad worth over 4 times as much? Why not stick with my Pilot G-2 .5 mm gel pen and write on Mead wide ruled composition books instead of using my Platinum 3776 fountain pen on a Rhodia Dot Pad? I think I could be happy with either. There is a small advantage to the more expensive pair. The fountain pen does not show through very much on the Rhodia paper. The gel pen shows through on the Mead composition book in my testing. At the end of the day, I just enjoy the Rhodia pad like I enjoy the nice fountain pen. It goes beyond simple practicality even though there are some practical benefits of the more expensive Rhodia pads.
Pencils
The last type of analog tool that I checked out was pencils. I have not been using pencils at all in my work or in my computer science study at home. But I remember back in college writing my Pascal code in my first CS class all on paper. The TA that graded my programs said that I had the third least amount of CPU usage of anyone in the class and that the other two with less CPU usage than me dropped the class. I had been programming in BASIC and FORTRAN before coming to college so learning Pascal was not that hard. But I liked to write code out with pencil and edit using an eraser, so I did not spend a lot of time on the computer screen. These days I mainly do things on the screen. I think I need to get back to using pencils and erasers for writing and editing code and pseudo-code both in my online class and for work. I guess that goes along with the analog way of thinking like the bullet journal philosophy of using a notebook and pen instead of a program or app for your planner and journal.
My favorite pencil that I tried was the Tombow Mono 100 HB pencil that I bought in my first JetPens order. It is a pretty thing. It is basically a drawing pencil. It writes a nice dark line and is very smooth. When I was trying out various pencils, I found a fantastic pencil store called CW Pencil Enterprise. You can buy individual pencils of all kinds including types from countries all over the world. I only bought two or three pencils, but they were great to order from. They included a nice handwritten note written with a pencil. Vendors like CW Pencil motivated me to write this blog. JetPens, Goulet Pens, Pen Chalet, and CW Pencil were all very nice stores to buy from. I am sure that they are not perfect, but I had a great experience with them all and I wanted to share that with other people.
In addition to pencils I also looked at erasers. The Sakura Foam Eraser that I got from JetPens is a step up from the traditional pink hand held eraser. It erases all the pencil marks and leaves less stuff behind. A couple of my pencils like the Tombow Mono 100 did not have erasers on the end so I got a pack of Pentel Hi-Polymer Eraser Caps. These convert a drawing pencil into a more conventional writing pencil with eraser. When I use pencils for programming I alternate between the erasers on the end of the pencil and the stand-alone foam eraser.
As much fun as I had looking at pencils, I really did not find much difference between them when I used them for hand coding. The much less expensive Dixon Ticonderoga pencils that my children and wife swear by worked really well for me. I can barely tell the difference between them and the Tombow Mono 100. The Tombow is a little darker and smoother but it really does not matter much for what I need. So, I splurged on the expensive fountain pen but at this point I’m pretty much sold on the affordable and quite nice Dixon Ticonderoga standard in pencils.
Conclusion
I went back and forth on whether I should write this post and what to put in it. There is a lot more that I could talk about that goes beyond just the tools themselves and I thought about writing multiple posts. But, really, this is a DBA blog and not a pen, pencil, paper, and notebook blog so one post is probably enough. I thought about making this a lot shorter and just having links to the various web sites and products without explanation – bullet points with URLs behind them. I settled on a pretty long single post that wove my personal experiences from the past 3 months or so in with links to the sites and products.
My exploration of these “analog” tools is like a breadth first search of a very large search tree of products and information. For example, there are many kinds of fountain pens and each pen has multiple nib sizes. How many pens would you have to buy to really try them all? If you look at Platinum’s 3776 site there are many different colors and designs of this one pen, plus multiple nib sizes for each. Then there are the other manufacturers each with multiple pens. It is a huge search tree. I got just a little way into this massive search and pulled the plug. This post documents the results of how far I got. I thought about waiting a year before writing a post about this to see if I am still using these tools and what benefits I got from them. But, by then I would have forgotten much of what I learned in my initial search. Maybe a year from now I can follow up with a post about whether this detour has had a lasting positive impact on my work and personal life.
Thanks to everyone who checks out this post. If you have questions or comments, it would be great if you left them below. I hope that something in here will be helpful to others. I had a lot of fun and learned a few useful things.
For Oracle versions 11.2 through 18 I have been applying quarterly patches to a test database and then gathering the Oracle home into a tar file. When we want to roll out a new database server with the latest patches, we untar the file and clone the Oracle home with a command like this:
Before running this command, I had to unzip my gold image zip file into $ORACLE_HOME. I created the gold image zip file using a command like this:
I ran this through MobaXterm to use their X server. I created the response file when I initially installed 19c on this test server. Then I patched the Oracle home with the July 2019 PSU and finally ran the above command to create the gold image.
I tried running the $ORACLE_HOME runInstaller without a response file, based on Oracle support document 2565006.1, but it gave an error saying that I needed a global database name even though I was doing a software-only install. Here is what I tried:
We used the Oracle home cloning approach for several years with a lot of success. 19c has a new way to do the same thing and it was not difficult to modify our existing scripts and process to use the new method. Going forward I will apply quarterly 19c patch sets to my test database and then create a new gold image zip file for future deployments. I plan to continue to use the older Oracle home cloning method for 12.2 and 18c for now if it keeps working. For 12.2 and 18c I just apply quarterly updates to our test database servers and create new Oracle home tars for deployments.