This is my favourite opening line whenever the performance of a business application is in question, whether it is SAP, another ERP, or a tailor-made one. Every business application is backed by a database storing business transactions as records, plus a management system/engine to run it (retrieve, add, modify or delete information) and manage the database, relational or otherwise.
I still remember the days when we used to challenge each other to write smaller, better executables that produced fewer memory dumps. Back then, the 4 MB of RAM in one friend's computer was something we all envied. We used 360 KB floppies to copy programs and games from one machine to another to try them out. Those days are gone; everything is bigger now.
But the bigger the head, the more intense the headache: it needs bigger doses to come round, and carries the risk of more side effects. Big data is the equivalent of an overloaded truck on the road, unsafe for people (the driver as well as everyone else on the road) and harmful to both the road and the truck itself.
I recently came across a situation where a server running an SAP BW system was temporarily facing a storage crunch, with a ping-pong of communication between the storage admin, DBAs, Basis and BW consultants…. This happens no matter how strong the planning is.
Now, any application has broadly three parts: the binaries, the data container and the interface engine. The application database is the part that always grows. Since SAP and Oracle come into the picture in most of my discussions, I will be referring to them here as well.
So what causes an SAP OLTP system to grow bigger? And what are the different approaches to gain control over it…?
Typically, the following things make an SAP database grow bigger:
- System logs
- Audit trails and logs
- Business process configuration
- Data accumulated but never used
- Too much detailed information being collected
- Trying to provide intelligence within the scope of transactional reporting (this is nothing but fooling your customer as well as yourself), resulting in:
  - Rampant creation of indexes to reduce the execution time of these so-called intelligence reports
  - Creation of duplicate record containers
  - Storage of duplicate data in the database to satisfy some customized need
- The database growing older and becoming porous (fragmented)
- No information life-cycle policy
And the ideal methodology should be a combination of the following:
- Review the configuration, thus helping data avoidance
- Define data life: create a policy for data archival and deletion, and the frequencies for each
- Database defragmentation
- Stop building business intelligence in an OLTP system, especially when an open type of parameter screen is used (e.g. ABAP reports). This gradually forces you to create more and more indexes on the actual OLTP tables to keep up the execution speed, which is suicidal: remember, the more indexes you create, the more you increase the transaction time.
- Plan for separate hardware, segregate current data from history data, and build up intelligence there.
- Don't forget to clean up the mess you made (created by the approach of building business intelligence in OLTP)…. My experience is that people forget this, but it is one of the most important steps, and that is why I kept it as a separate bullet point.
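The index trade-off mentioned above can be demonstrated with any relational engine. Here is a minimal sketch using Python's built-in sqlite3 module (the table, column and index names are made up for illustration, not from any real SAP schema): inserting the same rows into a heavily indexed table takes measurably longer than into a plain one, because every index must be maintained on every write.

```python
import sqlite3
import time

def insert_timing(extra_indexes):
    """Time 50,000 inserts into a table carrying the given number of indexes."""
    con = sqlite3.connect(":memory:")
    cur = con.cursor()
    cur.execute("CREATE TABLE sales (doc INTEGER, item INTEGER, "
                "customer INTEGER, material INTEGER, amount REAL)")
    # Each extra index is one more structure the engine updates per INSERT.
    columns = ["doc", "item", "customer", "material", "amount"]
    for i, col in enumerate(columns[:extra_indexes]):
        cur.execute(f"CREATE INDEX ix_{i} ON sales ({col})")
    rows = [(n, n % 10, n % 1000, n % 500, n * 1.5) for n in range(50_000)]
    start = time.perf_counter()
    cur.executemany("INSERT INTO sales VALUES (?, ?, ?, ?, ?)", rows)
    con.commit()
    elapsed = time.perf_counter() - start
    con.close()
    return elapsed

plain = insert_timing(0)
indexed = insert_timing(5)
print(f"no indexes: {plain:.3f}s, five indexes: {indexed:.3f}s")
```

On a typical machine the indexed inserts come out several times slower; the same effect on busy OLTP tables is exactly what stretches transaction time.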
Remember: don't write your application logic merely to satisfy the business logic, but to satisfy it better and properly. Otherwise it actually hampers the business by increasing operation time, thereby reducing organisation-wide operational efficiency.
What I have in mind is to create a small document on this topic in an SAP and Oracle scenario, which can be a ready reckoner, at least to start the analysis….
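The usual first step of such an analysis is finding out which tables are the biggest. On Oracle you would query the data dictionary (for example the DBA_SEGMENTS view); the sketch below shows the same idea in miniature with Python's built-in sqlite3 module against a throwaway in-memory database, ranking tables by row count (BSEG and BKPF are used only as familiar-looking demo names, with invented contents).

```python
import sqlite3

def largest_tables(con):
    """Return (table_name, row_count) pairs, biggest table first."""
    cur = con.cursor()
    tables = [r[0] for r in cur.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    counts = []
    for t in tables:
        n = cur.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
        counts.append((t, n))
    return sorted(counts, key=lambda pair: pair[1], reverse=True)

# Throwaway demo database: line items typically outgrow document headers.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bseg (id INTEGER)")  # demo: line items
con.execute("CREATE TABLE bkpf (id INTEGER)")  # demo: document headers
con.executemany("INSERT INTO bseg VALUES (?)", [(i,) for i in range(500)])
con.executemany("INSERT INTO bkpf VALUES (?)", [(i,) for i in range(100)])
print(largest_tables(con))  # [('bseg', 500), ('bkpf', 100)]
```

The report produced this way tells you where archival, deletion or configuration review will pay off first.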