Cops. V-cops.

Work just closed the deal on the full vCenter suite this week, so on Tuesday I spent about 2.5 hours upgrading to vCenter 5.1 (up from 5.0). As a matter of pride I tried to do it completely solo.

About 2 hours in I panicked and called up VMware support because I was getting a weird error and starting to run low on time. By the time they got back to me I had figured out that I'd misunderstood exactly which database I was supposed to point to. Their upgrade documentation is really vague in certain areas but pretty solid in others.

I must say it’s weird getting comfortable in SQL. Just hopping into the admin tool and doing stuff…. was kind of fun, but I’m still sure DBA is never going to be in my future. Of course, I say that about SharePoint too, and today I was told (paraphrasing), “Oh hey, you DO know a lot about that. You’re my new escalation point.” Me and my big mouth…. maybe I shouldn’t be commenting on my SQL exploits. At least I don’t think my coworkers know I blog, so I’ll be safe for a little while…



Microsoft Deployment Toolkit trickiness

So over the past few days I learned a very long, slow lesson about why the Microsoft Deployment Toolkit only has instructions for using local storage. Turns out, there’s a bug either in WinPE or in certain storage filers’ (NetApp’s) implementation of CIFS. Because of one of those two factors, connecting to network storage from WinPE is flaky. It works just enough to make you want to blame everything else in the universe, because it should “just work”.

Well, after I got over that, I was still stuck with a server with no available local storage and a huge NetApp volume sitting there doing nothing. So I decided to get tricky. We have good network performance and an awesome storage admin, so I decided to virtualize my deployment share: I created a 200 GB fixed-disk VHD file and mounted it to a path on the local storage. The Disk Manager GUI in Server 2008 R2 made this easy, but it could also be done via diskpart if you want to go all CLI (I have yet to see straightforward PowerShell stuff for working with VHDs directly).
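For reference, the same create-and-mount dance can be sketched as a DiskPart script – the UNC path, size (MAXIMUM is in MB, so 204800 = 200 GB), label, and mount folder below are all placeholders, so adjust to taste:

```
REM Create and attach a 200 GB fixed VHD, then mount it to a local folder
REM (paths and label are examples only)
CREATE VDISK FILE="\\filer\share\deploy.vhd" MAXIMUM=204800 TYPE=FIXED
SELECT VDISK FILE="\\filer\share\deploy.vhd"
ATTACH VDISK
CREATE PARTITION PRIMARY
FORMAT FS=NTFS QUICK LABEL="DeployShare"
ASSIGN MOUNT="C:\DeployShare"
```

Save it to a text file and run it from an elevated prompt with `diskpart /s yourscript.txt`.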

This was cool and so far has been working pretty well, although I have yet to do an in-depth comparison of deployment speeds with more than one or two clients installing at once.

One final challenge I had was that on a reboot the VHD would disappear until someone went and remounted it. I fixed that by doing the following:

  1. Create a script with two lines (don’t include the 1. and 2.) and save it in the same folder on the NetApp/filer/whatever as your VHD file. I called mine “Dev.dp” because this is a Dev environment and the script will be run with DiskPart
    1. SELECT VDISK FILE="\\unc\path\to\your\file.vhd"
    2. ATTACH VDISK
  2. Open Task Scheduler and in the Actions pane pick Create Basic Task…
  3. Give your task a name. For the trigger, pick “When the computer starts”
  4. For the action, choose Start a Program. Use the following info:
  5. Program/script:  c:\windows\system32\diskpart.exe
  6. Add arguments:  /s "\\unc\path\to\your\Dev.dp"
  7. Click the check box to open properties when you’re done. In the General tab, change these settings:
    1. Use Change User or Group to set an account that has permissions on the NetApp. For me that was MYDOMAIN\svc.deploy
    2. Pick Run whether user is logged on or not
    3. Check Run with highest privileges
  8. In the Settings tab, you may want to back off the failure conditions or have the task retry if it fails (but remember: if the VHD was already mounted for some reason, the attach will error out and the task will report failure because of that)
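If you’d rather script the task than click through the wizard, here’s a rough schtasks equivalent of the steps above (the task name is made up, and the account and UNC path are the same placeholders used earlier):

```
schtasks /create /tn "Mount Deploy VHD" ^
  /tr "c:\windows\system32\diskpart.exe /s \\unc\path\to\your\Dev.dp" ^
  /sc onstart /ru MYDOMAIN\svc.deploy /rp * /rl highest
```

The /rp * switch prompts for the account’s password, /sc onstart matches the “When the computer starts” trigger, and /rl highest matches the “Run with highest privileges” checkbox.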

At this point you’re all done. When DiskPart attaches the VHD, it automatically restores the mount point or drive letter from the last time it was mounted. You can now use a filer to store your deployment data while it behaves as if it were local storage (which is the only thing your deployment server will accept), and the whole setup survives a reboot without anyone manually re-attaching the drive!



When Virtual Worlds Collide

Seems like lately I only remember to post after taking a training class. This time it was a series of two classes, both for vSphere. One was a “What’s New” class that was mostly a repeat of a previous vSphere 5.0 anything-and-everything class, and the other covered automation and scripting via PowerCLI. One of the classes came with a voucher for a free VCP exam; I just barely squeezed that in before it expired, and just barely squeezed out a passing score (more on that later).

I think I’ve stated my suspicions before, but I’ll reiterate that PowerShell is the future of Windows systems administration. I’m almost at the point where I’m mad when I’m not using PowerShell to do things, but it’s a very conflicted state to be in since I never actually used PowerShell for anything…. well, until today, that is. I wrote a quick script to list all the VMs in our environment along with their VMware Tools versions, since we have many that are running sans Tools or with an old version. Yeah, it was simple and not anything beyond a sample script, but it felt great to do because it’s practical.
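For the curious, a Tools-inventory script along those lines looks roughly like this – a sketch only, assuming PowerCLI is installed; the vCenter hostname and output path are placeholders, not what I actually used:

```powershell
# Connect to vCenter (hostname is a placeholder)
Connect-VIServer -Server vcenter.example.com

# Pull every VM with its VMware Tools version and status,
# then dump the lot to CSV for easy sorting/sharing
Get-VM |
    Select-Object Name,
        @{Name = 'ToolsVersion'; Expression = { $_.Guest.ToolsVersion }},
        @{Name = 'ToolsStatus';  Expression = { $_.ExtensionData.Guest.ToolsStatus }} |
    Sort-Object Name |
    Export-Csv -Path .\vmtools-report.csv -NoTypeInformation
```

The calculated properties are the only mildly clever bit – Get-VM’s default output doesn’t surface Tools info, so you reach into the Guest object (and the raw vSphere API view via ExtensionData) to get it.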

As for my VCP certification…. phew, that was an ordeal. Let me tell you, the testing facility was absolutely the worst I’ve ever seen. I’m pretty sure the front desk receptionist was a stripper, but she was also the main tech support for the testing systems. I hope her night job is a more fruitful career than her day job – my test was scheduled for 7 AM but didn’t start until 8:30 due to what I believe was operator error on the part of the receptionist. Her troubleshooting methods for a broken desktop shortcut included using Windows Search (remember the XP sidebar with Rover, the dog who fetches your files? Yeah, that search) to search for the shortcut (not the target, the shortcut itself) and then clicking it a million times expecting it to load better somehow. Seriously, an hour and a half of this and similar techniques. I walked out and sat in the lobby because it was too painful to watch after the first 30 minutes. That killed my nerves and my mojo for when I was finally able to start, and on top of that it’s a genuinely hard test! Maybe I just don’t use enough of the available features to have a strong working knowledge, but I really don’t think the VCP exam syncs with real-world challenges.

I don’t want to put the company’s name here and blast out negativity toward them in case my experience was unique and atypical (but mostly because I managed to pass my exam). Still, if you’re planning on taking a test in San Diego county and are wondering who to avoid – or where to find questionable receptionists – feel free to contact me.



I’m Still Alive… I think

Yes, it’s been quite a few months since I blogged, but sometimes life is just busy and you have to concentrate on what matters most. I’m in the midst of some back-to-back training – it’s nice when class gets done early, but since I just moved it’s not worth fighting traffic to head out – so instead here I am blogging. And hopefully I’ll keep it up without having to make a New Year’s resolution!

Virtualization is cool stuff. Last week I finished up a foundational class all about VMware’s vSphere/vCenter products. It wasn’t really “new” to me, but they went really in depth on enterprise storage fundamentals and how to hook up SANs – that’s actually where I got the most benefit! And now this week I’m learning all about Puppet. I’m pretty jealous, because we’re diving headlong into wrangling our Linux environment and getting things properly managed. Now if only I could convince someone that doing the same with Windows is just as important!

Over the past few months I’ve been prepping my group (Systems Engineering) for taking ownership of our company’s AD environment (previous owner being “….uhh?”). Our boss is pushing hard to align what our customers want/need with specific services that IT provides. And at the same time we’re aligning our department’s strategy on managing those services in a Plan/Build/Run model. I have no idea if it’s an actual thing, but I like the premise – We have a team that plans it out, another that builds it, and another that does daily run tasks.

As an Engineer I’m excited, because I might get a little more distance from the daily break/fix distractions and do more quality ‘building’ work. My real question is where the line falls between Planning and Building, but whatever. I ended up writing about 13 pages in a Word doc that spells out anything and everything related to the AD service, and I believe it’s what all our future projects should embrace when trying to match this PBR model. If we stick with it, I think there’s real hope of climbing out of technical debt and eventually becoming a much more valuable asset to the business teams our IT group supports.