Automatically Update your MDT Reference Images

Typical workflow for handling reference images in MDT:

  1. Import OS install media from an ISO you downloaded off Technet / Microsoft Volume Licensing
  2. Build a task sequence to deploy it (I call this my Build Image task)
  3. Capture it to the Captures folder on your deployment share
  4. Import the captured OS
  5. Build a task sequence to deploy it (I call this my Deploy Image task)
This looks mundane, but doing steps 3, 4 and 5 sucks! Trying to remember exactly how you customized your task sequence is no way to live when it'd be way easier to re-use the existing Deploy Image task when updating your reference image. I also would love it if I'm not the only one who can perform updates to reference images… so I figured it all out and now I live happily ever after!
It’s a little extra work up front, but here’s how you can turn updating your reference images into a one step process that anyone could perform:
  1. Create a script called Relocate.cmd in your Scripts directory off the Deployment Share that contains the following one-liner:
    • move /Y "%DEPLOYDRIVE%\Captures\%1.wim" "%DEPLOYDRIVE%\Operating Systems\%1"
  2. Create your Build Image task. Keep the ID short. For example, let’s say we’re deploying a general purpose Windows 8 image.  My Task Sequence ID that builds the reference image is 8_GP
  3. Run this task sequence and capture your reference image. Make sure to save it to the Captures folder and name it after your task sequence. For my example, this is .\Captures\8_GP.wim
  4. This one time, you’ll need to use the Import Operating System wizard. Be sure to name the folder for this operating system to match your task sequence that builds the reference image. For my example, I have .\Operating Systems\8_GP\8_GP.wim
  5. Go back into your Build task sequence and add a custom task that runs the following command as the final step (you don’t need to set the Start In value):
    • cmd /c %SCRIPTROOT%\Relocate.cmd %TaskSequenceID%

  6. Create your new Deploy Image task sequence using the OS from the previous step. I recommend that for your Task Sequence ID you use something like 8_GP_DEPLOY
You’re done! At this point, to get the latest Windows Updates into an image, just run your “Build Image” task sequence – the WIM is captured and automatically replaces the OS that gets deployed when someone runs the “Deploy Image” task.
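To make it concrete, here’s what that final task sequence step expands to for my 8_GP example (a sketch, assuming the stock MDT variables %SCRIPTROOT% and %DEPLOYDRIVE% resolve to your Scripts folder and deployment share as usual):

    cmd /c %SCRIPTROOT%\Relocate.cmd 8_GP

    REM ...which invokes Relocate.cmd with %1 = 8_GP, so the one-liner runs as:
    move /Y "%DEPLOYDRIVE%\Captures\8_GP.wim" "%DEPLOYDRIVE%\Operating Systems\8_GP"

The /Y on move suppresses the overwrite prompt, which is what lets the freshly captured WIM silently replace the one your Deploy Image task points at.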
There is one word of caution: Significant changes to the OS in your WIM (service pack, new IE version, etc.) might break the Deploy Image task. If that happens, go through steps 3 and 6 again so that MDT can “refresh” what it knows about the deployment OS you’re using.


Microsoft Deployment Toolkit trickiness

So over the past few days I learned a very long and slow lesson about why the Microsoft Deployment Toolkit only has instructions for using local storage. Turns out, there’s either a bug in WinPE or in certain storage filers’ (NetApp) implementation of CIFS. Due to one of those two factors, connecting to network storage from WinPE is buggy. It works just enough to make you want to blame everything else in the universe because it should “just work”.

Well, after I got over that, I was still stuck with a server with no available local storage, and a huge NetApp volume sitting there doing nothing. So I decided to get tricky. We have good network performance and an awesome storage admin, so I decided to virtualize my deployment share. I created a 200 GB fixed-size VHD file and had it mount to a path on the local storage. Using the Disk Manager GUI in Server 2008 R2 made this easy, but it also could have been done via diskpart if you want to go all CLI (I have yet to see straightforward PowerShell stuff for working with VHDs directly).
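In case it helps, here’s roughly the DiskPart sequence for creating and mounting the VHD (a sketch; the file path, label, and mount folder are placeholders, not my actual names):

    CREATE VDISK FILE="D:\DeployShare.vhd" MAXIMUM=204800 TYPE=FIXED
    ATTACH VDISK
    CREATE PARTITION PRIMARY
    FORMAT FS=NTFS QUICK LABEL="Deploy"
    ASSIGN MOUNT="D:\DeploymentShare"

MAXIMUM is in megabytes, so 204800 gets you the 200 GB fixed disk, and ASSIGN MOUNT is what attaches the volume to a folder on local storage instead of a drive letter.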

This was cool and so far has been working pretty well, although I have yet to do an in depth comparison of deployment speeds when I have more than one or two clients doing an install.

One final challenge I had was that on a reboot the VHD would disappear until someone went and remounted it. I fixed that by doing the following:

  1. Create a script with two lines (don’t include the 1. and 2.) and save it in the same folder on the NetApp/filer/whatever as your VHD file. I called mine “Dev.dp” because this is a Dev environment and the script will be run with DiskPart
    1. SELECT VDISK FILE="\\unc\path\to\your\file.vhd"
    2. ATTACH VDISK
  2. Open Task Scheduler and in the Actions pane pick Create Basic Task…
  3. Give your task a name. For the trigger, pick “When the computer starts”
  4. For the action, choose Start a Program. Use the following info:
  5. Program/script:  c:\windows\system32\diskpart.exe
  6. Add arguments:  /s "\\unc\path\to\your\Dev.dp"
  7. Click the check box to open properties when you’re done. In the General tab, change these settings:
    1. Use Change User or Group to set an account that has permissions on the NetApp. For me that was MYDOMAIN\svc.deploy
    2. Pick Run whether user is logged on or not
    3. Check Run with highest privileges
  8. In the Settings tab, you may want to back off the failure conditions or have the task retry if it fails (but don’t forget that if the VHD was already mounted for some reason, the task will fail because of that)
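If you’d rather skip the Task Scheduler GUI, the whole task can be created in one shot from an elevated prompt with schtasks (a sketch; the task name, account, and UNC path are placeholders for your own):

    schtasks /create /tn "Attach Deploy VHD" /tr "c:\windows\system32\diskpart.exe /s \"\\unc\path\to\your\Dev.dp\"" /sc onstart /ru MYDOMAIN\svc.deploy /rp * /rl highest

The /rp * switch makes schtasks prompt you for the password, and /rl highest corresponds to the “Run with highest privileges” checkbox.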

At this point you’re all done. When diskpart mounts the VHD it will automatically restore the mount point or drive letter that was used last time it was mounted. You can now use a filer to store your deployment data, but have it behave as if it were local storage because that’s the only way your deployment server will work. And your system can survive a reboot without manually re-attaching the drive!



When Virtual Worlds Collide

Seems like lately I only remember to post after taking a training class.  This time it was a series of two classes, both for vSphere.  One was a “What’s New” class that was mostly a repeat of a previous vSphere 5.0 Anything and Everything class, and the other was for automation and scripting via PowerCLI. One of the classes came with a voucher for a free VCP exam, and I just barely squeezed that in before it expired and just barely squeezed out a passing score (more on that later).

I think I’ve stated my suspicions before, but I’ll reiterate that PowerShell is the future for Windows systems administration.  I’m almost at the point where I’m mad when I’m not using PowerShell to do things, but it’s a very conflicted state to be in since I never actually use PowerShell for anything… well, until today that is. I did a quick script to list all the VMs in our environment along with their VMware Tools versions since we have many that are running sans Tools or with an old version. Yeah, it was simple and not anything beyond a sample script, but it felt great to do because it’s practical.
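For the curious, the script wasn’t much fancier than this kind of PowerCLI snippet (a sketch, assuming you’ve already run Connect-VIServer against your vCenter; the output path is made up):

    Get-VM |
        Select-Object Name,
            @{N='ToolsStatus';E={$_.ExtensionData.Guest.ToolsStatus}},
            @{N='ToolsVersion';E={$_.Guest.ToolsVersion}} |
        Export-Csv C:\Temp\VMToolsReport.csv -NoTypeInformation

Nothing but Get-VM, a couple of calculated properties, and Export-Csv, but it beats clicking through every VM in the vSphere Client.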

As for my VCP certification… phew, that was an ordeal. Let me tell you, the testing facility was absolutely the worst I’ve ever seen.  I’m pretty sure the front desk receptionist was a stripper, but was also the main tech support for the testing systems.  I hope her night job is a more fruitful career than her day job – my test was scheduled for 7 AM but didn’t start until 8:30 due to what I believe was operator error on the part of the receptionist.  Her troubleshooting methods for a broken desktop shortcut included using Windows Search (remember the sidebar in XP that shows the dog Fetch who will find your files? Yeah, that search) to search for the shortcut (not the target, the shortcut itself) and then clicking it a million times expecting it to load better somehow. Seriously, an hour and a half of this and similar techniques.  I walked out and sat in the lobby because it was too painful to watch after the first 30 minutes.  By the time I was finally able to start, my nerves and my mojo were shot, and on top of that it’s a genuinely hard test!  Maybe I just don’t use enough of the available features to have a strong working knowledge, but I really don’t think VCP certification syncs with real-world challenges.

I don’t want to put the name of the company here and just blast out negativity towards them in case I had a unique and atypical experience (but mostly because I managed to pass my exam), but if you’re planning on taking a test in San Diego county and are wondering who to avoid, or where to find questionable receptionists, feel free to contact me.