Friday, July 27, 2012

Solid State Drive (SSD) Migration

Solid State Drives (SSDs) are all the buzz lately, and for good reason: they are very, very fast.  Ask anyone who has had to performance tune SQL Server (especially for a Data Warehouse) and I/O is usually the largest area for improvement.  Not always, but usually, because today's hardware is very powerful and not very expensive.  I have used SSDs on SQL Server to house TempDB for an Operational Data Store (ODS) before, and while it was not cheap, it was the first time I had seen hardware thrown at a SQL Server problem make such a drastic difference right out of the gate.  Well, there was that one time when a client had SQL Server installed on a web server with 1 processor, 1 GB of RAM, and a pair of slow-rpm local drives...I guess that doesn't count.

This post isn't about a SQL Server migration to SSDs and the good results, sorry; that is a good idea and it is in the queue.  This is about how I decided to follow the advice I give to my clients as a consultant, bought an SSD for my laptop for better performance, and the steps I took to do the swap in a few hours.

I use VMware Workstation on my laptop to run a handful of virtual machines. The more I loaded up the VMs with the SQL Server DB Engine, SharePoint, Power View, and SSAS, the slower the machine became.  I ponied up and ordered a 512 GB SSD, and wow, what a difference.  If you are thinking of purchasing an SSD for your laptop or tower, don't think about it, just do it.  My laptop boots almost instantaneously, and the VM performance is probably 10-20x better.

The steps below document how I swapped the SATA drive for an SSD on my Dell XPS 15z and used free imaging software called Macrium Reflect to clone the existing drive.  I thoroughly enjoyed doing this, and my wife just smiled at me because she knows I love taking things apart.  My father would be proud to see how organized this is compared to his Sony rack stereo, which had a box of "spare" parts left over when I was 12.

Here are the steps.  I hope they will help you, because I spent some time learning them...then learning them some more.

1. Buy or borrow an external hard drive.  I used a Seagate GoFlex Ultra eSATA drive. eSATA is native to almost all machines, while USB can require drivers.  I didn't want to deal with drivers, so I used Luis Figueroa's eSATA drive.  He has the best toys, and every piece of software on MSDN for the last 5 years.

2. Install Macrium Reflect Free Edition.  You can purchase an advanced version, but the free edition works just fine.

3. Image your existing hard drive using Macrium and store the .img file on the external drive.

4. Create a "rescue" CD from Macrium.  You will need a CD-R and a burner.  You can do this with a USB drive as well, but there could be driver issues, so I took the safe bet.  The Macrium rescue CD is bootable media (Windows PE or Linux) preconfigured with the Macrium software to restore your .img file.

5. Go into your BIOS (F2 for Dell) and change the boot order to boot from the CD drive first.

6. Now the fun begins.  Unscrew and remove the cover of your laptop.

7. Find the battery and unplug it. (This step is very important; you could fry your laptop if you skip it.)

8. Locate the hard disk.

9. Unscrew the supporting bracket.  Keep it around, as you are going to put the SSD in it.

10. Place the new SSD in the bracket and reverse all the steps until you get the cover back on.

11. Place the CD in the drive and boot from it.  The Macrium program comes up instantly in Windows PE; select Restore Image and browse to the .img file on the external drive.

It does take some time to restore.  I had a 274 GB .img file and it took 2 hours and 24 minutes.
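For a rough sanity check on that restore time, the implied throughput works out to around 32 MB/s. A quick sketch of the arithmetic (sizes taken from the figures above; 1 GB is assumed to be 1024 MB):

```python
# Throughput implied by restoring a 274 GB image in 2 hours 24 minutes.
image_size_mb = 274 * 1024            # 274 GB in MB (1024 MB/GB assumed)
elapsed_seconds = (2 * 60 + 24) * 60  # 2 h 24 min in seconds

throughput_mb_s = image_size_mb / elapsed_seconds
print(f"{throughput_mb_s:.1f} MB/s")  # about 32.5 MB/s
```

That is closer to spinning-disk territory than SSD speeds, so the external drive was likely the bottleneck during the restore, not the new SSD.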

Enjoy your lightning fast laptop or tower now!

Tuesday, July 17, 2012

SQL Saturday #151

I received an email over the weekend that I've been selected to speak at SQL Saturday #151 in Orlando on September 29, 2012.  I am honored and excited to be selected again. I am not going to present on Scaling Out SQL Server with Replication this time; instead I will present on SQL Server 2012 Analysis Services Tabular, part of Microsoft's Business Intelligence Semantic Model.

Hope to see you all there!

You can find more info on the event here:

Friday, July 13, 2012

ETL, Not Always In That Order

Almost everyone in the Data Warehouse business has extracted data from a source system; applied processes to cleanse it, de-duplicate it, aggregate it, or perform many other operations on it; and then loaded it into its final resting place in a respective Operational Data Store (ODS) or Data Warehouse.  We as data professionals call that process ETL, which stands for Extraction, Transformation, and Loading.  Nothing earth-shattering here, and the term is used commonly.

At a Data Warehouse client of ours I have heard their employees use the terms ETL and ELT interchangeably. Every time I heard them say this I thought maybe they just got the acronym mixed up, which is understandable.  I don't know how many times I have confused CTE (common table expression) and called it CET, but I know it was more than a handful.

So I asked an employee at the client why he used the acronym in that order, and to my surprise he said Extract, Load, then Transform, and explained it.  I had the look of a cartoon character whose mouth drops, hits the floor, and whose tongue rolls out.  Like this...although my overalls and bow tie are much cooler.

I decided to research this. To my surprise, both terms are correct and used in the industry, along with a third term: ETLT.  Now, I'd be remiss if I didn't state that I always speak about ETL as a layer of a BI solution, or in general the process of transporting data.  I usually don't speak about the exact procedure of ETL, ELT, or ETLT, although I have done it many times.  I've just never spoken about it in this way.

Let's define each procedure and the differences:

ETL: Classically the way Data Warehouse / Business Intelligence professionals categorize the process of extracting data from the source; combining, de-duping, and cleansing (transforming) it; and loading it into the target Data Warehouse or ODS. ETL can also refer to the exact order in which the procedure is performed.

ELT: An alternative procedure that performs the transformations on the target server. That is, extract the data from the source, load it into the ODS or Data Warehouse, then transform it into its final form.  This tends to be heavy on the destination relational database management system where the Data Warehouse or ODS lives, so that server needs enough horsepower to handle it.

ETLT: The same process as ETL above, followed by one last transformation on the target Data Warehouse or ODS. ETLT is slightly heavier on the destination RDBMS than ETL, but not as heavy as ELT, because in ETLT the target usually handles only a few final transformations rather than the bulk of them.
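To make the difference in ordering concrete, here is a minimal ELT sketch in Python, using an in-memory SQLite database as a stand-in for the target warehouse. All table and column names are hypothetical, and real duplicates would carry keys rather than be identical rows; the point is only that the raw extract is loaded first, and the de-dupe and aggregation happen in SQL on the destination:

```python
import sqlite3

# Hypothetical source rows, as extracted: (customer, amount).  Note the
# duplicate row -- in ELT we load it anyway and clean up on the target.
extracted = [("Acme", 100.0), ("Acme", 100.0), ("Beta", 50.0), ("Acme", 25.0)]

warehouse = sqlite3.connect(":memory:")  # stand-in for the target RDBMS

# L: load the raw extract, untouched, into a staging table on the target.
warehouse.execute("CREATE TABLE stg_sales (customer TEXT, amount REAL)")
warehouse.executemany("INSERT INTO stg_sales VALUES (?, ?)", extracted)

# T: transform on the target -- de-dupe and aggregate in SQL, which is
# why ELT leans on the warehouse server's horsepower.
warehouse.execute("""
    CREATE TABLE fact_sales AS
    SELECT customer, SUM(amount) AS total
    FROM (SELECT DISTINCT customer, amount FROM stg_sales)
    GROUP BY customer
""")

rows = warehouse.execute(
    "SELECT customer, total FROM fact_sales ORDER BY customer"
).fetchall()
print(rows)  # [('Acme', 125.0), ('Beta', 50.0)]
```

In classic ETL, that same de-dupe and aggregation would run in the ETL tool's data flow before anything is inserted into fact_sales; in ETLT, most of it would, with a final pass in SQL on the target.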

Friday, July 6, 2012

SQL Saturday #141 - South Florida

A year ago I attended my first SQL Saturday in South Florida, and I intentionally did not want to make a blog post until this year's SQL Saturday in South Florida. Okay, I'm lying. I've been extremely busy over the last year running multiple BI projects for TekPartners BI Solutions, and my wife and I welcomed our second child into the world.  I actually stated last year in my blog post about SQL Saturday #79 that I would present this year, and I held true to that by presenting "Dividing and Conquering Reporting by Scaling Out with Replication"; you can view the slides here.  I was also going to present on SSAS Tabular in SQL Server 2012, but that will have to wait until SQL Saturday #151 in Orlando.  I had about 25 people in the session, and the interaction and questions were great.  Replication is one of those SQL Server topics that few really know and understand, and most are simply scared of it.  Over the years I have spent a lot of time implementing it with clients that really needed to scale out SQL Server for reporting, and it's a great tool to facilitate that.

SQL Saturday was a great event put on by the South Florida SQL Server User Group, and I am definitely going to attend and present at more of them in the future.

TekPartners BI Architects: Luis Figueroa, David Ortiz, and Me 

The sessions I attended were very educational.  Jen Stirrup's (Blog | Twitter | LinkedIn) session on Mobile BI was great and very informative, covering Azure, Performance Point, and many other great topics. Devin Knight's (Blog | Twitter | LinkedIn) session on Performance Point dashboards was awesome: he took a dashboard from whiteboard idea to final product with Analysis Services as a source.  Luis Figueroa's (Blog | Twitter | LinkedIn) session on super-fast dimension loading with Change Tracking and Change Data Capture was also very informative, as not many people are familiar with the change tables and how to benefit from using them in SQL Server 2008+ sources.