If there's anything inevitable in our daily lives, it is the browser.
Choosing the right browser for the kind of work we do really matters.
What are the options we have?
Let me give a quick warm-up before jumping into how this works: a very generic introduction to the environment. Long story short -
In my previous two posts we discussed some basics of Perl coding, namely
Keeping these basics in mind, and building on them, here we develop our first Perl script.
Although it's the DBA who deploys STORED PROCEDUREs and FUNCTIONs, sometimes you want to quickly check the PROCEDUREs deployed over the last week or so and check the cost (timerons) against them to find the culprit.
I don't use this script very often, but every time I do, it gets the job done.
This script helps you check, on the go, the ESTIMATED COSTS of the STORED PROCEDUREs deployed over the last n days.
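The core idea can be sketched in a few lines of Perl. This is a minimal, hypothetical version, not the full script: it assumes the db2 CLP is on the PATH with a connection already established, and the n-day window and the column list are placeholders to adapt.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch: list stored procedures created in the last N days.
# Assumes a db2 connection already exists in this session.
my $days = $ARGV[0] // 7;    # default window: 7 days

my $sql = "SELECT ROUTINESCHEMA, ROUTINENAME, CREATE_TIME "
        . "FROM SYSCAT.ROUTINES "
        . "WHERE ROUTINETYPE = 'P' "          # 'P' = procedure
        . "AND CREATE_TIME > CURRENT TIMESTAMP - $days DAYS "
        . "ORDER BY CREATE_TIME DESC";

# -x suppresses column headers, giving clean rows to post-process
my @rows = `db2 -x "$sql"`;
print @rows;
```

From there, each routine name can be fed into an EXPLAIN to pull the timeron estimates.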
In this post we will look at some good practices for writing your Perl scripts. In my previous post, Getting started with Awesome Perl, I discussed the usage of the default variable $_; what we see in this post is how good an idea it really is for a programmer to rely on the default variable.
The default variable ($_) certainly does the work, but when you have a large chunk of code, readability becomes a problem. Let's take a piece of code and analyze it.
Let's load some SQL output into an array, as shown below:
my @data = `db2 "list tables for schema DB2INST1"`;
Now that the array @data is populated with some data, let's go and do some manipulation on it.
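To make the readability point concrete, here is a small self-contained comparison. The sample lines below are illustrative stand-ins for what the db2 command would return, so the snippet runs without a database; both loops extract the table name (first column), once via $_ and once via a named variable.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-in for `db2 "list tables for schema DB2INST1"` output;
# the table names here are illustrative, not from a real database.
my @data = (
    "EMPLOYEE        DB2INST1    T    2013-01-01",
    "DEPARTMENT      DB2INST1    T    2013-01-01",
);

# Implicit style: $_ does the work, but the reader must track what it holds.
my @implicit;
for (@data) {
    push @implicit, (split ' ', $_)[0];
}

# Explicit style: a named variable documents itself.
my @explicit;
for my $table_line (@data) {
    my ($table_name) = split ' ', $table_line;
    push @explicit, $table_name;
}

print "@explicit\n";    # EMPLOYEE DEPARTMENT
```

Both produce the same result; in a two-line loop the difference is small, but across a few hundred lines the named variable pays for itself.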
A must-have quality of a DBA is to know the recovery strategy before entering the battlefield. That is the reason I have started the Accidents Happen series, so that we can discuss recovery strategies further in this series of posts.
I have not found many search results where somebody shows in detail how to recover an accidentally dropped table. So I thought of giving an example scenario, as it will help at least some people who are dealing with this for the first time.
The first precaution you can take is to add the WITH RESTRICT ON DROP clause to the CREATE TABLE statement.
Say you created a table as shown below:
CREATE TABLE "DB2INST1"."EMPLOYEE" (
    "EMPNO"     CHAR(6 OCTETS)     NOT NULL,
    "FIRSTNME"  VARCHAR(12 OCTETS) NOT NULL,
    "MIDINIT"   CHAR(1 OCTETS),
    "LASTNAME"  VARCHAR(15 OCTETS) NOT NULL,
    "WORKDEPT"  CHAR(3 OCTETS),
    "PHONENO"   CHAR(4 OCTETS),
    "HIREDATE"  DATE,
    "JOB"       CHAR(8 OCTETS),
    "EDLEVEL"   SMALLINT           NOT NULL,
    "SEX"       CHAR(1 OCTETS),
    "BIRTHDATE" DATE,
    "SALARY"    DECIMAL(9,2),
    "BONUS"     DECIMAL(9,2),
    "COMM"      DECIMAL(9,2)
) IN "USERSPACE1"
WITH RESTRICT ON DROP
ORGANIZE BY ROW;
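With that clause in place, a plain DROP TABLE is rejected until the protection is removed. A sketch of the sequence (statements only, output omitted):

```sql
-- This DROP fails while the table carries RESTRICT ON DROP
DROP TABLE DB2INST1.EMPLOYEE;

-- Remove the protection first, then the drop succeeds
ALTER TABLE DB2INST1.EMPLOYEE DROP RESTRICT ON DROP;
DROP TABLE DB2INST1.EMPLOYEE;
```

The extra ALTER step is exactly the point: nobody drops the table by accident.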
Perl (Practical Extraction and Report Language) is one of the most widely used scripting languages. I will be discussing this post from a DBA's perspective. Most of our databases reside on either Linux or Windows servers, so we choose either shell or batch scripting.
Let's take a case: you have been very comfortable writing shell scripts, as your databases reside on Linux boxes, but for some reason your client decides to move or migrate them to Windows machines. Now there is a definite necessity to port all your shell scripts to batch scripts, and you know you need to start all the way from the bottom.
So what helps us out of this kind of situation? Perl. One big advantage of Perl is its portability: make sure you have the Perl interpreter installed on your Linux or Windows machine, and you are free from platform concerns.
Let's see some basics of Perl and some sample scripts.
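As a tiny taste of that portability, the script below runs unchanged on Linux and Windows; Perl's built-in $^O variable reports which platform it is on.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# $^O holds the operating system name Perl was built for,
# e.g. "linux" or "MSWin32" -- the same script runs on both.
print "Running on: $^O\n";

# A numeric sort: the spaceship operator <=> compares numbers,
# where the default sort would compare strings.
my @numbers = (3, 1, 2);
my @sorted  = sort { $a <=> $b } @numbers;
print "Sorted: @sorted\n";    # Sorted: 1 2 3
```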
It was almost 3 years ago, when my career started, that I had the chance of creating users and granting them privileges. I created a user and was shocked that the moment I created the user in the OS, he could connect to the database even before I had granted any privileges to him.
This eventually led me to the documentation, where I came to know that the pseudo-group, rather the PUBLIC group, is the culprit. Culprit? No, no... not actually. DB2 may have introduced this to make things a bit easier for the DBA, granting a predefined set of privileges to every user created.
But if you don't believe in accidents and don't want to leave any loopholes, then you may not be very comfortable with this group, as it hands some control over to others.
Now let's get to work. DB2 provides us with various views under the SYSCAT catalog schema for administering authorizations and privileges; here is one way you can look at them.
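For example, the database-level authorities PUBLIC holds show up in SYSCAT.DBAUTH; the column list below is trimmed to a few of the authority flags for brevity.

```sql
-- What has PUBLIC been granted at the database level?
SELECT GRANTEE, CONNECTAUTH, CREATETABAUTH, BINDADDAUTH, IMPLSCHEMAAUTH
FROM   SYSCAT.DBAUTH
WHERE  GRANTEE = 'PUBLIC';
```

Each *AUTH column holds Y or N, so one glance tells you what every newly created user inherits.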
In this post we assume 3 mount points, namely C:\, D:\ and E:\.
C:\ and D:\ are considered to be SSDs, and
E:\ is considered to be a RAID device or a normal HDD.
I went through Ember's recent post, A Few Differences When Moving From 9.7 to 10.1, where I came across storage groups - a new way of handling physical data. I thought of giving it a try and went through the DB2 v10.1 documentation where the feature was introduced. Let me tell you, DB2 is getting rock-steady version by version.
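Mapping the mount points above onto storage groups looks roughly like this; the group names and subdirectories are placeholders I made up, so adjust them to real paths on your system.

```sql
-- SSDs (C:\ and D:\) go into one group, the HDD (E:\) into another
CREATE STOGROUP SG_SSD ON 'C:\db2data', 'D:\db2data';
CREATE STOGROUP SG_HDD ON 'E:\db2data';

-- An automatic-storage tablespace can then be pinned to a group,
-- e.g. hot data on the SSD group:
CREATE TABLESPACE HOT_TS USING STOGROUP SG_SSD;
```

The nice part is that the tablespace-to-disk mapping becomes a named, alterable object instead of a list of container paths.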
Most of the time we DBAs treat developers like underdogs. Even I used to, but things have changed as my years of experience in the industry have grown. Every person is equally important, right from the application developers, DB developers and network administrators to the DBAs. If any one role in the software life cycle is neglected, the whole application gets affected.
I myself have neglected DB development tasks like creating procedures. Recently - very recently - I happened to create some custom stored procedures which help DBAs; they save a lot of time.
I'll discuss one case I came across. I usually use Google Drive to store my documents, including all the custom administrative queries I use daily.
I used to copy-paste the respective query depending on my need, because you don't always have time to write the query from scratch. Say a tablespace's DMS container is full for some reason and you need to EXTEND it immediately: you cannot frame the query at that peak moment, as it is time-consuming and every second counts.
One day an issue came up and I didn't have my laptop at hand, so I connected through a colleague's system, where I could not access my Google Drive as I was already connected to the VPN. By the time I framed the query and issued the command, some transactions in the application had been affected.
After resolving the issue, I started thinking about how I could save the time I had wasted framing my query at that peak moment. That made me go through the basics of STORED PROCEDURES, and yes, they really help me a lot.
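To give a flavor of what such a helper looks like: below is a hypothetical procedure (the name TBSP_USAGE is mine, not from a real deployment) that wraps a routine tablespace-utilization query, so that one short CALL replaces retyping the whole SELECT under pressure.

```sql
-- Hypothetical DBA helper: tablespaces ordered by fullness.
-- Wraps a query on the SYSIBMADM.TBSP_UTILIZATION admin view.
CREATE OR REPLACE PROCEDURE DB2INST1.TBSP_USAGE ()
DYNAMIC RESULT SETS 1
LANGUAGE SQL
BEGIN
  DECLARE c1 CURSOR WITH RETURN FOR
    SELECT TBSP_NAME,
           TBSP_UTILIZATION_PERCENT
    FROM   SYSIBMADM.TBSP_UTILIZATION
    ORDER BY TBSP_UTILIZATION_PERCENT DESC;
  OPEN c1;
END
```

At the peak moment, `CALL DB2INST1.TBSP_USAGE()` is all you need to type.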