Cary Millsap has an excellent (as usual) blog posting today about the software development process employed by so many organizations. You can read his full posting here:
This plays into one thing that I see quite a bit as a DBA at client sites. Most developers I encounter don't tend to focus much on performance (unless it becomes painful during their testing). This isn't, specifically, their fault. In many cases, the developers are under significant pressure to "make it work" and move on to the next thing. As a result, they don't really have the time to worry about performance. Besides, that's the DBA's job, isn't it?
Well, actually, it isn't. See, here's the thing: from a DBA's perspective, we have a relatively small handful of tools at our disposal. From the bottom up, here is a basic list of things a DBA generally does to affect the performance of a system:
Properly configure disk and I/O. This is (or should be) a collaboration between the DBA, the systems administrators, and the storage administrators. Done correctly, this shouldn't really be a problem. However, as with everything, it is still possible to screw it up. Make sure that you have redundant I/O paths with sufficient bandwidth for your system. Spread your I/O across many physical drives. With RAID technologies (particularly RAID 1/0), this is VERY easy to accomplish. Forget the old practice of the DBA moving datafiles around to avoid a "hot disk"; the disk array can balance I/O at a much more granular and consistent level than any human possibly can. Your primary goal here should be to have as many spindles involved in every I/O operation as possible.
Properly size your hardware. This is another collaboration between the DBA and the systems administrators. Make sure you have adequate horsepower and RAM, and ALWAYS plan for growth! My general recommendation is to put more RAM into the box than you think you'll need. Given that so many systems I encounter these days are x86-based Linux systems (rather than "big iron" like Solaris or AIX), memory for these systems is a relatively small portion of their cost. Also, RAM doesn't impact your Oracle licensing!
Properly tune your OS kernel and database parameters. I think this is one area where developers, managers, and users tend to have gross misconceptions. While it's true that tuning a system to truly "optimal" performance is a dark art, the truth is that, unless you've really screwed something up (sized your SGA too small, allocated too few buffers, etc.), odds are you're not going to see huge performance gains by tweaking these parameters. In most cases, "decent performance" is fairly easy to achieve. Now, squeezing out that extra 20%? That can require in-depth analysis of statistics, utilization, I/O patterns, etc. This is where the "dark art" comes into play. And, honestly, it requires time to observe and adjust.
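To give a feel for what "kernel parameters" means in practice, here is an illustrative fragment of the sort of /etc/sysctl.conf settings the Oracle installation guides call for on Linux. The values below are examples only; the right numbers depend on your RAM, your SGA size, and the install guide for your Oracle release.

```
# /etc/sysctl.conf -- illustrative values only; consult the Oracle
# installation guide for your release before setting these.
kernel.shmmni = 4096                 # max number of shared memory segments
kernel.sem = 250 32000 100 128       # semaphores: semmsl semmns semopm semmni
fs.file-max = 6815744                # system-wide open file handle limit
fs.aio-max-nr = 1048576              # max concurrent async I/O requests
net.ipv4.ip_local_port_range = 9000 65500
```

Getting these wrong can certainly hurt you (an Oracle instance that can't allocate its SGA won't start at all), but getting them "right enough" is routine, which is exactly the point above.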
Unfortunately, too many developers, managers, and even users seem to wonder why that idiot DBA didn't just set the magic "MAKE_SQL_RUN_FASTER=TRUE" parameter. (Shh! Don't tell anyone; that's OUR little secret!)
The truth is, unless something is wrong at these lower levels, the biggest performance gains are going to come from tuning the code. And, in my opinion, since it's the developer's job to produce a quality product, it's ultimately the developer's job to tune their code. Unfortunately, as Cary points out, too many organizations are structured in a manner that breaks the feedback loop required for developers to do this properly.
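A minimal sketch of the kind of gain that lives in the code, not in any parameter. This uses SQLite rather than Oracle purely so it runs anywhere, but the principle is identical in both: wrapping an indexed column in a function makes the index unusable, while an equivalent range predicate lets the optimizer use it.

```python
# Sketch (SQLite stand-in for the same behavior you'd see in Oracle):
# a function applied to an indexed column forces a full scan; the
# equivalent range predicate allows an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE INDEX idx_customers_name ON customers (name)")
conn.executemany("INSERT INTO customers (name) VALUES (?)",
                 [("Adams",), ("Baker",), ("Clark",)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a text description of each step
    return " ".join(row[-1] for row in
                    conn.execute("EXPLAIN QUERY PLAN " + sql))

# Function on the column: the index cannot be used -> full table scan
slow = plan("SELECT * FROM customers WHERE substr(name, 1, 1) = 'A'")

# Equivalent sargable predicate: the optimizer uses the index
fast = plan("SELECT * FROM customers WHERE name >= 'A' AND name < 'B'")

print(slow)  # plan shows a full scan ("SCAN ...")
print(fast)  # plan shows an index search ("SEARCH ... USING ... INDEX ...")
```

No amount of SGA or kernel tuning changes which of those two plans the optimizer picks; only the person who wrote the SQL can.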
That MUST be fixed.
– James