Reducing data footprints on test systems


Picture the scene:

You are a DBA. You have to make everyone happy, including DEVs (yes, including them too). Part of that is ensuring your DEVs have a fresh copy of your production data so that they can test their code against current data and structures. The problem is that your production database is several hundred GB in size and your test system doesn’t have the capacity to store a full copy of it. Your IT budget will also not stretch to purchasing more storage! What can you do?

I was in a similar situation recently, took a look at the possibilities, and here is what I came up with:

1. Try out tools like SQL Virtual Restore, which allow you to “mount” a backup from a remote server, avoiding the space issue altogether. This is, of course, a solution that requires a license (no budget, maybe?)

2. Use the great features offered to you by SQL Server Developer Edition: data compression to the rescue!

So of course, I chose door number 2.

As you may know, Developer Edition offers the same features as Enterprise Edition, with the caveat that the instance cannot be used for production purposes in any way. This means that the seriously powerful and useful data compression feature becomes available for your test servers! This holds even if you only use Standard Edition in production – joy of joys! Better still, Developer Edition only costs around $50 (or less if you have an MSDN subscription) – even more joys!

If you have never seen or used compression, take a quick look over on MSDN to see what it is all about (Data Compression). Basically, you can compress data in indexes and tables at the storage level, avoiding any query rewrites while still profiting from the storage savings. This can be a major advantage in terms of raw storage needs, but it can also benefit you in high-read environments with little RAM. The data remains compressed when held in the buffer pool and is only decompressed when being accessed. This means that you can keep more data in the buffer pool and reduce hard disk I/O (obviously not for free: compressing and decompressing costs CPU cycles). That trade-off may well be acceptable on a test system with extremely constrained storage.
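
As a quick illustration (the table and index names here are placeholders, not from my real system), compressing existing data is nothing more than a rebuild with a compression option – no query or application changes needed:

-- Page-compress the data of a (hypothetical) table
ALTER TABLE dbo.SalesOrderDetail REBUILD WITH (DATA_COMPRESSION = PAGE);

-- Page-compress a single nonclustered index on the same table
ALTER INDEX IX_SalesOrderDetail_ProductID ON dbo.SalesOrderDetail
REBUILD WITH (DATA_COMPRESSION = PAGE);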

The usage in my example scenario is now an even better proposition: not only can I reduce my storage footprint, I can also potentially improve test system performance on the I/O side of the equation (who doesn’t have I/O issues, especially on a test box!).

The next hurdle is, of course, identifying which indexes and tables in the database you want to squeeze down. This is possible via the SSMS object explorer, but only if you want to spend an eternity doing so! The better way is to use the metadata views and functions to interrogate the system objects. These are really interesting in their own right (at least to me), but after writing my Index Script Creator I realised the potential for using the information in these system objects to automate tasks like this one.
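
If you just want to see the current compression state per partition before scripting anything, a query along these lines against the system views does the job (a simple sketch, not the full script):

SELECT  s.name  AS schema_name,
        o.name  AS object_name,
        i.name  AS index_name,        -- heaps appear with index_name = NULL
        p.partition_number,
        p.data_compression_desc
FROM    sys.partitions AS p
JOIN    sys.objects    AS o ON o.object_id = p.object_id
JOIN    sys.schemas    AS s ON s.schema_id = o.schema_id
JOIN    sys.indexes    AS i ON i.object_id = p.object_id
                           AND i.index_id  = p.index_id
WHERE   o.is_ms_shipped = 0
  AND   p.data_compression = 0        -- 0 = NONE, i.e. not yet compressed
ORDER BY s.name, o.name, i.name, p.partition_number;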

Similarly to the Index Script Creator, my Index Compressor Script (uploaded as a .doc, but it is really a .sql file – stupid WordPress restrictions!) investigates all tables and indexes (down to the partition level) and generates an index rebuild command for every currently non-compressed partition of an index or table. The resulting commands can then be run and space can be reclaimed – just be aware that rebuilding indexes does cause log activity.
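
The core idea of the generator is simply turning that metadata query into ALTER INDEX … REBUILD statements. A stripped-down sketch (my actual script also handles heaps, partition schemes and a few edge cases):

SELECT  'ALTER INDEX ' + QUOTENAME(i.name)
      + ' ON ' + QUOTENAME(s.name) + '.' + QUOTENAME(o.name)
      + CASE WHEN MAX(p.partition_number) OVER (PARTITION BY p.object_id, p.index_id) > 1
             THEN ' REBUILD PARTITION = ' + CAST(p.partition_number AS varchar(10))
             ELSE ' REBUILD'
        END
      + ' WITH (DATA_COMPRESSION = PAGE);'  AS rebuild_command
FROM    sys.partitions AS p
JOIN    sys.objects    AS o ON o.object_id = p.object_id
JOIN    sys.schemas    AS s ON s.schema_id = o.schema_id
JOIN    sys.indexes    AS i ON i.object_id = p.object_id
                           AND i.index_id  = p.index_id
WHERE   o.is_ms_shipped = 0
  AND   i.index_id > 0                -- skip heaps in this simplified version
  AND   p.data_compression = 0;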

After compressing your tables and indexes you should have a much smaller amount of data (depending on how well your data compresses) and be able to reclaim some disk space.
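
If you want to gauge beforehand how well a given table is likely to compress, SQL Server ships a stored procedure that samples the data and estimates the saving before you rebuild anything (the object names here are placeholders):

-- Estimate the effect of PAGE compression on a (hypothetical) table's clustered index (index_id = 1)
EXEC sp_estimate_data_compression_savings
     @schema_name      = 'dbo',
     @object_name      = 'SalesOrderDetail',
     @index_id         = 1,
     @partition_number = NULL,   -- NULL = all partitions
     @data_compression = 'PAGE';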

I realise that this method still requires a disk large enough to hold the database in its uncompressed state to begin with, but you can end up with a cost-effective solution for a budget-constrained test environment.

Happy zipping!

Be prepared!


While I was on my recent trans-global sojourn, I came to the conclusion that the Boy Scouts really have a valid point with their motto “Be Prepared”. Of course, I don’t mean it quite in the semi-military sense that Robert Baden-Powell did, but being prepared before a large system roll-out really saved a lot of time, nerves and money.

The roll-out was a set of reporting servers in a transactional replication setup, pushing operational data from a central server to the outlying branch offices around the world. I spent a great deal of time preparing these systems for the roll-out; standardising the installations, scripting out each step required to set up users, roles, permissions, DB objects, linked servers, jobs etc. This long preparation period was very tedious and I would often lose the motivation to keep at it.
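
To give a purely illustrative flavour of what those scripts looked like, each step was written so that it could be re-run safely at any office – all of the names below are made up:

-- Create a reporting login and database user only if they do not already exist (hypothetical names)
IF NOT EXISTS (SELECT 1 FROM sys.server_principals WHERE name = N'ReportReader')
    CREATE LOGIN [ReportReader] WITH PASSWORD = N'ChangeMe!123', CHECK_POLICY = ON;

USE ReportingDB;   -- hypothetical reporting database
IF NOT EXISTS (SELECT 1 FROM sys.database_principals WHERE name = N'ReportReader')
    CREATE USER [ReportReader] FOR LOGIN [ReportReader];

-- Grant read access via the fixed database role
EXEC sys.sp_addrolemember @rolename = N'db_datareader', @membername = N'ReportReader';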

The final payoff for this long drawn-out process has been the actual roll-out at each location. I have been able to arrive at each office, totally jet-lagged and tired from the entire travel regime and basically sit and watch the scripts run through and see the systems come to life. Had I not been prepared, I would have been in a world of pain, trying to remember what needed doing and when whilst fighting off sleep and headaches.

As a side note: a former colleague/mentor told me to save every script I ever run. If you don’t, you can be sure that the script that took you an hour to write will be needed again the day after you deleted it. This task has been made much easier thanks to the SSMS Tools Pack provided by the more than awesome Mladen Prajdić. His SSMS extension has saved me hours when I have written code and accidentally thrown it away, or when SSMS has crashed. Use his tool and donate to the cause!

So, before you start any project, always keep the Scouts’ motto in mind – “Be Prepared”. Hard disk space is so cheap as to be practically free; how cheap is your time?

Automating SQL Server installations / maintenance


I am laying down the plans for a standard server installation at work.  I want all production systems to be set up the same way to reduce the installation and maintenance overhead for each server.  Ideally, I won’t have to do anything by hand until the instance is set up and SQL Server is running.

Of course, I am a firm believer in not re-inventing the wheel, so my first course of action was to see what the great encyclopaedia called the internet could offer up… and it didn’t fail me!

Ola Hallengren’s Maintenance Solution

The first resource that I found comes from Ola Hallengren (http://ola.hallengren.com).

He has developed a set of maintenance scripts that do integrity checks, database backups and intelligent index reorganisations/rebuilds.  I took a first look at these scripts quite some time ago and proceeded to set them up on one of my production systems (mainly for the index maintenance).  I must say that the code is nicely written and commented, and it lets you easily see what Ola intends to do.  I made a few small changes to fit my environment and the jobs have run happily ever since.
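
To give a feel for how the solution is driven, the installed stored procedures are simply called with parameters from SQL Agent jobs. The calls below are the kind of thing I used, not a recommendation – check ola.hallengren.com for the current parameter set:

-- Intelligent index maintenance: reorganise lightly fragmented indexes, rebuild heavily fragmented ones
EXECUTE dbo.IndexOptimize
        @Databases = 'USER_DATABASES',
        @FragmentationLow = NULL,
        @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
        @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
        @FragmentationLevel1 = 5,
        @FragmentationLevel2 = 30;

-- Full backups of all user databases, keeping 24 hours of backup files (path is just an example)
EXECUTE dbo.DatabaseBackup
        @Databases = 'USER_DATABASES',
        @Directory = 'C:\Backup',
        @BackupType = 'FULL',
        @CleanupTime = 24;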

SQL Server FineBuild

The second resource that I found was SQL Server FineBuild from Ed Vassie (http://sqlserverfinebuild.codeplex.com).

This is a tool to help standardise SQL Server installations (ha-ha! just what I want). After reading the very detailed documentation and spending a few hours going through the installation options, I had a completed configuration file for the installation.  I then took a brand new test server, set FineBuild on its way and waited to see the results.

FineBuild warned me that the installation could take 40 minutes or more, so I got on with something else, and when I looked back to see how FineBuild was doing it had created a summary .txt file showing that it had finished the installation in 20 minutes and that everything had been successful.  I now had an all-singing, all-dancing SQL Server installation with extra tools installed, all data files separated from log files, and a plethora of added extras to boot.

I then went about modifying the config file to further tune the installation for my environment.  This was not strictly necessary, but I wanted to keep some of my cosmetic folder naming schemes and the like.

I am still tinkering with this config and integrating Ola’s scripts into the mix, and I will be setting up 5 identical servers for a replication system in the near future.  This will easily save me a couple of days of setup and configuration.

Conclusion

Ola’s scripts are great for all DBAs, but especially for new/accidental DBAs who have little experience or knowledge yet have to look after SQL Server.  They have an easy entry point, but still allow experienced users to configure the scripts to fit their requirements.

FineBuild is definitely for the more advanced user/DBA. The wide range of options and add-ons means that a greater base knowledge of SQL Server is required before even thinking of using this tool.  If, however, you are comfortable setting up and administering SQL Server, you will find it extremely helpful in reducing your admin overhead.  Once the settings are finalised (normally a one-off task), you can install any number of servers, secure in the knowledge that each one will be identical in its basic structure.

A big thank you goes out to Ed Vassie and Ola Hallengren, both of whom have created resources that will save DBAs everywhere a lot of time!