After a long period of absence, I posted something today and once again sat with some of my writing material on SAP monitoring… configuration of a central monitoring system with Solution Manager and so on. How about creating a new section called ‘/log’? Logs and alerts are an indispensable part of monitoring…
I was preparing a document on Solution Manager configuration and going through my old pieces of paper where I used to scribble down vital information… I thought it better to write it here before organising it…
Note 796998 – CCMS: Setting titles for alert mails
Note 176492 – Automatic email when an alert occurs (RZ20)
Note 1525363 – CCMS: Client and User fields in alert email
Note 934834 – CCMS: CCMS_OnAlert_Email parameter TIME_ZONE
(This one is very important when escalation is used.)
Note 617547 – RZ20: Sending alerts as mail and SMS
All of these were actually read and used during configuration of central monitoring…
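For orientation, this is roughly what the auto-reaction method looks like in RZ21 once those notes have been applied. The parameter values below are placeholders, and the exact parameter names should be double-checked against the notes listed above; this is a from-memory sketch, not a reference:

```text
Method: CCMS_OnAlert_Email            (auto-reaction method, maintained in RZ21)
Parameters tab:
  SENDER            <SAPoffice user>       sender of the alert mail
  RECIPIENT         <address or user>      where the alert goes
  RECIPIENT-TYPEID  U                      U = internet mail address
  SUBJECT_ALERT     <subject text>         mail title, see note 796998
  TIME_ZONE         <time zone>            see note 934834, matters for escalation
```

Assign the method as auto-reaction on the MTE class or node you want to alert on, and test it by deliberately breaching a threshold.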
This is my favourite dialogue whenever the performance of a business application is in question, whether it is SAP, some other ERP, or a tailor-made one. Every business application is backed by a database storing business transactions as records, and a management system/engine to run (retrieve, add, modify or delete information) and manage that database, relational or otherwise.
I still remember those days when we used to challenge each other to write smaller, better executables that produced fewer memory dumps. In those days, the 4 MB of RAM in one friend’s computer was something we all envied. We used 360 KB floppies to copy programs and games from one machine to another to try them out. Gone are those days; everything is bigger now.
But the bigger the head, the more intense the headache: it needs bigger doses to come round, and carries the risk of more side effects. Big data is the equivalent of an overloaded truck on the road, unsafe for people (the driver as well as everyone else travelling on the road) and harmful for the road as well as the truck itself.
I recently came across a situation where a server running a SAP BW system temporarily faced a storage crunch, and a ping-pong of communication started between the storage admin, DBAs, Basis and BW consultants… This happens, however strong the planning is.
Now, any application has three major parts: the binaries, the data container and the interface engine. And the application database is the part that always grows. As SAP and Oracle obviously come into the picture in most of my discussions, I will refer to them here as well.
So what causes an SAP OLTP system to grow bigger? And what are the different approaches to gaining control over it?
Typically, the following things make an SAP database grow bigger:
- System Logs
- Audit trails and logs
- Business process configuration
- Data accumulated but not used.
- Too much detailed information being collected.
- Trying to provide intelligence within the scope of transactional reporting (this is nothing but fooling your customer as well as yourself), resulting in:
- Rampant creation of indexes to reduce the execution time of so-called intelligence reports.
- Creation of duplicate record containers.
- Storage of duplicate data in the database because of some customised need.
- The database growing older and becoming porous.
- No policy on information life cycle.
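Before deciding which of these causes is hurting a particular system, it helps to see where the space actually goes. On the Oracle side, a query along these lines is a reasonable starting point (run as a DBA user; the schema owner `SAPSR3` is only an illustration and must be replaced with the actual ABAP schema of the system):

```sql
-- Top 20 space consumers in the SAP schema, by segment.
-- Replace SAPSR3 with the actual schema owner of your system.
SELECT * FROM (
  SELECT owner,
         segment_name,
         segment_type,
         ROUND(SUM(bytes) / 1024 / 1024 / 1024, 2) AS size_gb
    FROM dba_segments
   WHERE owner = 'SAPSR3'
   GROUP BY owner, segment_name, segment_type
   ORDER BY SUM(bytes) DESC
) WHERE ROWNUM <= 20;
```

The names that come back (change documents, IDoc tables, spool, application logs, or big Z* tables and their indexes) usually tell you straight away which bullet above you are dealing with.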
And the ideal methodology should be a combination of the following:
- Review the configuration, thus helping data avoidance.
- Define the data life: create a policy for data archival and deletion, and the frequencies.
- Database de-fragmentation.
- Stop building business intelligence in an OLTP system, especially when a more open type of parameter screen is used (e.g. ABAP reports). This gradually forces you to create more and more indexes on the actual OLTP tables to keep the execution speed up, which is suicidal; remember, the more indexes you create, the more you increase the transaction time.
- Plan for separate hardware, segregate current data from history data, and build up the intelligence there.
- Don’t forget to clean up the mess you made (created by the approach of building business intelligence in OLTP). My experience is that people forget it, but it is one of the most important steps, and that is why I kept it as a separate bullet point.
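For that clean-up step, Oracle’s built-in index usage monitoring can help identify which of the “intelligence” indexes no real transaction ever touches. A sketch; the index name is invented for illustration, and note that `V$OBJECT_USAGE` only shows indexes owned by the schema you are logged in as:

```sql
-- Switch on usage monitoring for a suspect custom index
ALTER INDEX "ZVBAK~Z01" MONITORING USAGE;

-- ... let at least one full business cycle pass, then check:
SELECT index_name, table_name, used, start_monitoring
  FROM v$object_usage
 WHERE index_name = 'ZVBAK~Z01';

-- If USED stays 'NO' over a representative period, the index is a
-- candidate for dropping -- after due verification, of course.
```

In an SAP context, remember to remove a dropped index from the ABAP Dictionary (SE11) as well, not only from the database, or the next transport may quietly recreate it.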
Remember: don’t let your application logic merely satisfy the business logic; let it satisfy it better and properly. Otherwise it actually hampers the business by increasing operation time, thereby reducing organisation-wide operational efficiency.
What I had in mind is to create a small document on this topic for an SAP-and-Oracle scenario, which can be a ready reckoner, at least to start the analysis…
The new puzzle is as below…..
Let us assume a business with its application running on an Oracle database. Before running into the deep detail of the puzzle, let us name the database D1, and consider that it has a highly customised physical structure (different file systems, different data file names and sizes), which we can name DF1. This database has three application schemas, namely D1S1, D1S2 and D1S3. Transactions happen 24x7; there is no downtime.
Now another Oracle database is installed on different hardware, with the same database SID but using OFA (let us name this physical structure DF2, with much bigger database files), and let us call this database D2. I am allowed to create the application schemas; remember that the schemas in the new database have different tablespace names. And you are also allowed to copy the data.
Let us say you took 8 hours to create the second database D2, starting from 00:00:01 hrs, so the data is consistent up to 00:00:00 hrs. By 08:00:00 hrs, plenty of transactions have happened in the first database D1.
The puzzle is: how do you reflect the data in D1 into D2, so that the users can be shifted to database D2 for the application?
Is there any native Oracle tool or technology (one that comes free if Oracle Enterprise Edition is purchased) available to do such a job?
Or is there a need for special tools? Who are the vendors?
- The basic rule is minimum downtime. Downtime is permitted only to stop users pointing to D1 and start them pointing to D2.
- The other point is that you should have evidence proving the data in D1 and D2 is the same at the point of stoppage. This is a compliance requirement… No data should be missed, and evidence that it is not missed is a must…
So enjoy solving the puzzle……
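For those who want a head start (and don’t mind a partial spoiler), one native-tools direction, sketched under the assumption of Oracle 10g or later Enterprise Edition, is to anchor the bulk copy to an SCN and then handle the delta separately. Directory, dump file and tablespace names below are invented for illustration:

```sql
-- On D1: record the SCN that the bulk copy should be consistent to.
SELECT current_scn FROM v$database;

-- Export the three schemas consistent as of that SCN with Data Pump,
-- which ships with the database (run at the OS prompt, not in SQL*Plus):
--   expdp system SCHEMAS=D1S1,D1S2,D1S3 FLASHBACK_SCN=<scn>
--         DIRECTORY=dp_dir DUMPFILE=d1_%U.dmp PARALLEL=4

-- Import into D2, remapping to the new tablespace names:
--   impdp system SCHEMAS=D1S1,D1S2,D1S3 DIRECTORY=dp_dir
--         DUMPFILE=d1_%U.dmp REMAP_TABLESPACE=D1_DATA:D2_DATA

-- The transactions after the SCN are the real puzzle: candidates are
-- Streams, a logical standby, or a third-party replication tool. For
-- the compliance evidence, per-table row counts on both sides at the
-- stoppage point, or DBMS_COMPARISON (11g onwards), come to mind.
```

This is only one line of attack, not the full answer; the evidence requirement and the zero-transaction-loss cutover are where the sketch stops and the thinking starts.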
This is a classic example of the hiccups of a single-window solution. A customer of Oracle gets access to Metalink. A customer of SAP purchasing Oracle as a bundle earlier used to get at least read-only access to Metalink; it has been months since this access was stopped, citing security reasons.
It is a frustrating situation: every time you need to look something up on the fly, you now need to raise an SAP message in the marketplace, wait out the SLA time, and then get the documents…
A customer of SAP who purchases the Oracle bundle is actually a customer of both Oracle and SAP, and both OEMs must understand that they have a responsibility to those customers… They may fight lawsuits over millions of dollars, but they must not forget that responsibility.
I get bored whenever I open SAP “Note 758563 – Oracle Metalink access for SAP customers”, which says:
“Because Oracle does not allow generic users for metalink access anymore for any customer due to security reasons, the metalink users SAP customers used in the past (see below) are not working any longer. SAP and Oracle are currently working on a solution. Further details will be announced in this note immediately when available.
If a special metalink document is needed please open a message on BC-DB-ORA with reference to this note and the request to forward the message to Oracle developement support to provide the document.
Please ignore the rest of the note until this header section is removed.”