Michael's Minutes - 04/28/2017

Hello! Another week, another summary of this week's readings!

  • Been thinking about Git branching strategy again. This one caught my eye. It's from last year but still valuable.
  • Been ramping up on Azure skills. Got this Introduction for Azure Operators. I know a lot of it since the principles haven't changed since I was last using/selling it. But the technical details change every month with both AWS and Azure.
  • I submitted my first GitHub issue/feature request on the PowerShell/xComputerManagement DSC Repo. I may even play around and see if I can code it myself.

Another week another dollar...

Michael's Minutes - 04/21/2017

It was a light week - I was pretty busy since I moved teams and have been learning new systems and tools. I was also watching the US/North Korea fiasco a bit more than tech news, but here's what's interesting -

Onward and upward into next week!

Michael's Minutes - 04/14/17

Figured I'd start a series to help me blog a bit more. There is so much good content that I read and I figured since we're all about sharing in the community, this would be a perfect opportunity.

I had the privilege of attending the PowerShell & DevOps Global Summit this year. You can find more information here. So a lot of the list below is content from the Summit -

  • Ever have the problem of using credentials from a remote host to another remote host using PowerShell? Ashley McGlone explains here about Kerberos and the double hop problem.
  • Want an automated way to create PowerShell module templates? Plaster is here to help.
  • Ruby on Rails running on Windows hasn't been the greatest experience, until now. Scott Hanselman explains why it's a little bit better now.
  • Infrastructure testing has been on my mind lately. Adam Bertram released a course on PluralSight showing us how to do it correctly with PowerShell and Pester.
  • Chef was at the Summit this week, and it reminded me of a post a while back discussing their take on DevOps principles.


So I found myself over the weekend needing to automate something. I have a backlog of all the small tools I thought would be cool to build while I was consulting but never had time to actually do. Now I have some time! Introducing the MDTDownloader...


I found myself constantly setting up and configuring MDT at client sites, often multiple times, and wished I had an easy command line or script to do it for me. That is exactly the point of MDTDownloader. This script reaches out to Microsoft and downloads MDT 2013 Update 2 and the ADK prerequisite. It will also download some common OSD necessities such as .NET Framework 4.6.2 and the Visual C++ Redistributable binaries. Why? Because this script will also create a deployment share and a folder hierarchy, and create applications for those downloads. Cool, huh?

But as with most things with PowerShell right now, I set out to learn something new. In this instance, I wanted to abstract the configuration into a separate file. So I'm using a PowerShell Data File (".psd1") to hold all of the application data such as the URL, command line and so on, so you can add applications without modifying the script.
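
To give a rough idea of the approach (the actual key names in my data file may differ, and the URLs here are deliberately elided), the data file and the code that consumes it look something like this:

```powershell
# Applications.psd1 - a hypothetical layout for illustration
@{
    Applications = @(
        @{
            Name        = 'Microsoft .NET Framework 4.6.2'
            URL         = 'https://download.microsoft.com/...'
            CommandLine = '/q /norestart'
        }
        @{
            Name        = 'Visual C++ Redistributable'
            URL         = 'https://download.microsoft.com/...'
            CommandLine = '/install /quiet /norestart'
        }
    )
}
```

```powershell
# The script loads the data file, so adding an app never touches the code
$Config = Import-PowerShellDataFile -Path .\Applications.psd1
foreach ($App in $Config.Applications) {
    Write-Verbose ("Downloading {0} from {1}" -f $App.Name, $App.URL)
}
```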

As always, community input is welcome. There are a few features I want to implement and will over the next couple of weeks. Got an idea? Make a pull request or create an issue!

PowerShell Logging

While I've been writing scripts and other PowerShell code, one of the things that has helped me greatly is the idea of documentation and logging. I can't emphasize this enough.

I leverage Write-Debug and Write-Verbose in almost every script I write, and I'm using Start-Transcript more now as well. Not only does it document everything, but when I'm troubleshooting scripts I have a nice flow of information to guide my line of thinking. And the Start-Transcript cmdlet dumps everything into a nice text file for me to reference in production if I ever need to. It just feels complete.
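
A minimal sketch of how this looks in practice (Install-Widget is a made-up function purely for illustration):

```powershell
function Install-Widget {
    [CmdletBinding()]
    param (
        [string]$Name
    )

    # -Verbose surfaces these messages; -Debug surfaces the Write-Debug ones
    Write-Verbose "Starting installation of $Name."
    Write-Debug "Staging files under $env:TEMP."
    # ...the actual work would happen here...
    Write-Verbose "Finished installing $Name."
}

# Start-Transcript captures the session, verbose stream included, to a file
Start-Transcript -Path "$env:TEMP\Install-Widget.log"
Install-Widget -Name 'Example' -Verbose
Stop-Transcript
```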

Then #PSBlogWeek happened. It actually occurred in December 2015, but it was republished in the monthly PowerShell TechLetter (sign-up here). The topic for the blog week was logging, and two of the posts were directed towards the Windows Event Log system. I realized there is a better logging system already in place, and I should be using it instead.
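
For example, writing to the Application log takes just two cmdlets (the 'MyScripts' source name here is a placeholder):

```powershell
# One-time setup (requires admin): register an event source for your scripts
New-EventLog -LogName Application -Source 'MyScripts'

# From then on, any script can write structured, centrally collectable entries
Write-EventLog -LogName Application -Source 'MyScripts' -EntryType Information `
    -EventId 1000 -Message 'Deployment script completed successfully.'
```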

So I want to call out the two specific blog posts -

And again, do consider signing up for the PowerShell newsletter. Don Jones and team put some amazing content into it and it's well worth your time.

About that DATA

Haven't been posting here as much as I want to. I'm sure some of you guys can understand and appreciate that. It's been a constant nag though, an itch in the back of my mind while working on other things...

Michael, you should really post something...it's kinda getting lame!

I've kept myself busy with a number of projects, both professionally and personally. Probably the most noteworthy change as of late is that I decided to take a break from the world of consulting...something that I've done for pretty much my entire 14-year career. Some say that variety is the spice of life, right?

That's right - no longer with Slalom Consulting

I can't say enough about how awesome of a company they are. So passionate and focused on their clients' strategic objectives. I learned a ton just being surrounded by brilliant people. Thank you for the opportunity to learn and grow.

I landed at an equally awesome place - Tableau Software! They're big on DATA. They build a data analytics application that is becoming quite popular and is extremely easy to use, helping you better illustrate what your data is telling you. They're no longer a start-up, but they still act like one, which keeps them flexible. I'm doing some automation work with a primary focus on PowerShell and configuration management tools such as Puppet and Chef. Exciting times indeed!

Now that the holidays are over and I have some more free time, I'm hoping to release some of the tooling I'm building here on GitHub soon. I will of course let the community know. Also, if you're in the Seattle area and are interested in Microsoft user groups, there are several. I attend the Northwest System Center User Group ("NWSCUG") pretty regularly, and there is a new Azure User Group over in Bellevue.

PowerShell WinToGo Solution

And you all thought that I forgot about this project...

I'm happy to announce that I've finished the initial release of this project. I've used it several times (outside of testing) to make USB sticks and feel that it's ready for other people to use and abuse. I've included my rudimentary Pester test cases if anyone is interested as well.

You can find the solution here. It includes a script and a PowerShell module. I'll update it as I see fit. If there are any features that you feel might be useful, let me know.

Ever use Windows to Go?

My latest project has been the development and production deployment of Windows to Go using Kingston DT Workspace USB Storage sticks. In short - a very cool solution.

The client wanted to reduce the expenses of contractor on-boarding for a big-rock, high-impact project. Instead of giving these temporary workers expensive laptops, the decision was made to have them use either AWS Workspaces (cloud) or WTG. Due to some requirement restrictions, WTG was chosen and I was charged with leading the technical effort around building and deploying these sticks.

So what are these sticks? It's pretty simple: it's Windows 8.1 loaded on a USB drive, and when you turn on your workstation, you boot from the USB stick as opposed to the drive inside the computer. These sticks are USB 3.0, and what we found was that for all laptops with internal spinning hard drives, this solution provided a performance boost. This is because the sticks (Kingston Workspaces) are actually SSD drives with a USB interface. So how did we build them?

First things first, we needed requirements for the images. There were four distinct roles that we discovered needed separate images. We performed some requirements-gathering tasks and got a list of applications & settings that needed to be present within each image. We also uncovered some security restrictions, such as preventing the user from accessing the local disk of the workstation, and other minor things. Now comes the fun part...

The tools we leveraged were MDT and PowerShell scripting. MDT 2013 is where we performed our image creation process. We ensured that the application installs were silent and unattended. Some of them were not, so instead we scripted copying the bits down to the root of the drive and had the user install those manually. Not a perfect solution (I'd prefer to properly automate them), but it worked within the time constraints we were given. I was also able to get the Chocolatey package manager into the images to make installation of some commonly used apps available to the task sequence and kick them off. The Client Engineering lead also appreciated that added piece of software. Nice win for us.
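
For reference, pulling Chocolatey into an image and kicking off installs can look something like this (the package names below are just examples, not the client's actual app list):

```powershell
# Bootstrap Chocolatey using the official install script
Set-ExecutionPolicy Bypass -Scope Process -Force
Invoke-Expression ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))

# Then silently install commonly used apps from the task sequence
choco install 7zip notepadplusplus -y
```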

For creation of the sticks themselves, we used a PowerShell script based on code from Microsoft (link) and a previous consultant. Unfortunately it has some custom code so I can't share it, but I improved it by parameterizing some info and leveraging jobs. Once I generalize it I'd like to share it, so stay tuned.
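
While I can't post that script, a generic sketch of the provisioning flow (the drive letters, sizes and image path below are placeholders, and it assumes a single qualifying stick is attached) looks roughly like this:

```powershell
# Find the USB stick (assumes exactly one qualifying drive is attached)
$Disk = Get-Disk | Where-Object { $_.BusType -eq 'USB' -and $_.Size -gt 20GB }

# Wipe and partition: a small FAT32 boot partition plus an NTFS OS partition
Clear-Disk -Number $Disk.Number -RemoveData -Confirm:$false
Initialize-Disk -Number $Disk.Number -PartitionStyle MBR
New-Partition -DiskNumber $Disk.Number -Size 350MB -IsActive |
    Format-Volume -FileSystem FAT32 -NewFileSystemLabel 'WTG-Boot'
New-Partition -DiskNumber $Disk.Number -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel 'WTG-OS'

# ...assign drive letters (say S: for boot, W: for the OS volume), then
# apply the image and write the boot files
dism.exe /Apply-Image /ImageFile:install.wim /Index:1 /ApplyDir:W:\
bcdboot.exe W:\Windows /s S: /f ALL
```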

Overall, the project was a success, and we were able to demonstrate WTG working on a wide range of hardware...including a MacBook Air. The client was happy, and we delivered serious value in the form of cost savings.


So I'm sitting here listening to some Skrillex, pounding away at a test environment that I'm about to capture (have you used the PowerShell Deployment Toolkit yet??? It's amazing...when it works) and of course I'm multitasking and reading blogs at the same time.

Going through my blog reader, I decided to read up on some articles that have been in the queue for a while. I came across Christopher Hunt's article on Pester and wow. Just wow. I've always struggled with testing scripts in a managed and consistent manner, and I knew there were tools out there. This one is made in PowerShell. Awesome.
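
To give a taste of the syntax (Get-Greeting is a throwaway function just for this example):

```powershell
# A function under test and its Pester spec, side by side
function Get-Greeting ($Name) { "Hello, $Name!" }

Describe 'Get-Greeting' {
    It 'greets the user by name' {
        Get-Greeting -Name 'Michael' | Should Be 'Hello, Michael!'
    }
}
```

Drop that into a .Tests.ps1 file and run Invoke-Pester to watch it pass.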

Win10 Came and Went

Well it was a pretty fast turnaround, but I had to switch back to Win 8.1. Build 9879 had some issues.

As mentioned in the first post, the performance of the system was an issue. Not all of it: VM performance was pretty decent. But when it came to GUI transitions and other operations within the OS, there were noticeable issues. Sometimes it was the rendering of text in an email, sometimes it was IE crashing. Small issues that all add up. But the final straw was that Word and Outlook crashed quite a bit, and I need those programs up and running as they impact my work in a significant way.

Again, I'm fully aware that this is test software and that these issues are to be expected. Given the responsibilities that I have, though, I can no longer afford to be a tester when the next update isn't anticipated until next year rather than on a more frequent update schedule. When the next build is out, though...I'll be loading it up. Count on it. :)

Thoughts on Win10 Build

My current thoughts on Windows 10 build 9860. I've been running it since it was released.

In terms of functionality, it's great. The Start menu is powerful and customizable compared to the v1 feature it used to be. Having Modern apps within the desktop is much appreciated, as I am someone who valued the simplistic approach to their design but didn't really enjoy having them full screen on a desktop box. On tablets it makes sense, but not on laptops/desktops.

In terms of stability, I've had it blue screen once. My recommendation is that you don't load 3rd-party drivers...keep the MS WHQL drivers from Microsoft Update. Since I removed the 3rd-party drivers, I haven't had another BSOD.

For performance, it's evident that there is debug code in it, which takes its toll. You notice a performance drop when opening apps and sometimes during simple transitions. With that said, I use a lot of Hyper-V VMs and I haven't had much of a problem at all in that regard. So it seems that anything leveraging API calls above a certain level from the kernel takes a hit, but if you're VM-heavy...it's not a problem from my perspective.

[Clear-WindowsUpdateCache] New in 1.4

So I made a few updates to my PowerShell experiment over the past week or so. Here are the highlights -

  • Renamed the script to adhere to the standard Verb-Noun syntax
  • Added a bunch more Verbose and Debug messages to help users and myself
  • Added much better error handling, though it's still my initial stab at it
  • Uses PowerShell advanced functions so that it can support multiple computers
    • More to come - I haven't tested objects from the pipeline, but targeting several machines via the -ComputerName parameter works
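
The skeleton of that advanced function pattern (simplified; not the actual script) looks like:

```powershell
function Clear-WindowsUpdateCache {
    [CmdletBinding()]
    param (
        [Parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
        [string[]]$ComputerName = $env:COMPUTERNAME
    )
    process {
        # The process block runs once per pipeline object
        foreach ($Computer in $ComputerName) {
            Write-Verbose "Processing $Computer."
            # ...the per-machine cleanup work goes here...
        }
    }
}

# Several machines via the parameter works today...
Clear-WindowsUpdateCache -ComputerName 'PC01','PC02' -Verbose
# ...and pipeline input is what still needs testing
'PC01','PC02' | Clear-WindowsUpdateCache -Verbose
```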

You can grab the script here.

[Clean-WindowsUpdates] New in 1.1

So I've updated the script with some new functionality, and it was also a good opportunity for me to learn a few things. The biggest idea I got out of it was that you want "clean" code that is well organized. Another was the idea of commenting. I'm of the mindset that comments should describe what you're trying to do and not just the code itself (although that helps).

You will notice that if you run the script without any arguments, it doesn't display anything. If you want output, you will need to add "-Verbose" or "-Debug" to the command when executing it. I wanted this to be as silent as possible for use within automation tools such as SCCM or MDT.

Of particular interest is how I needed to remove the updates in Windows 8.1. I was looking forward to using the DISM PowerShell cmdlets to remove the superseded updates in the Component Store, but I was surprised to find that none of the existing cmdlets had functionality equivalent to running the DISM.exe utility itself. Hopefully the product team will complete the functionality of those cmdlets soon. Until then, I am simply calling Start-Process and pointing it at DISM.exe to do the work.
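
The workaround is only a few lines; something along these lines:

```powershell
# No cmdlet equivalent exists yet, so shell out to DISM.exe directly
$Arguments = '/Online /Cleanup-Image /StartComponentCleanup'
Start-Process -FilePath 'Dism.exe' -ArgumentList $Arguments -NoNewWindow -Wait
```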

Future work - I think I need to work on the identification of OSes. The logic is rough. For instance, what about Windows 8.0? I'm not sure I handle that well, and I should look at it soon.

I've created a GitHub repo for myself (first time for everything!) for this project, and it's located here. As previously indicated, this is a learning experience for me with PowerShell. If you find something wrong or have a feature suggestion, please tell me.


So I posted a couple of weeks ago that I discovered a project I could do with PowerShell and some sort of automation. Well I'd like to debut my first public script, Clean-WindowsUpdates.

As I've mentioned previously, this is more of a project to help me learn PS and if it helps someone else, all the better. I'm sure there are better scripts out there to accomplish the same or similar results but this is for my learning experience. :)

So what does this script do? It performs the following -

  • Check for Administrator Rights
  • Define a registry key and property using the New-ItemProperty cmdlet
  • Use a Try/Catch/Finally block
  • Start the Cleanup Wizard using the Start-Process cmdlet
  • Perform a cleanup task

The goal of this script is to safely remove superseded Windows Updates from the CBS store. There have been plenty of ways to do this in an unsupported fashion on Windows 7, but since KB2852386 was released back in October 2013, we can use the functionality in the Windows Cleanup Wizard.

I've released the script here. There are several features I'd like to add, such as detecting the Windows version and whether the Windows Cleanup Wizard or DISM would be appropriate.

<#
.SYNOPSIS
Removes Windows Updates using the Microsoft supported Disk Cleanup Wizard.

.DESCRIPTION
This script will use the Microsoft supported Disk Cleanup Wizard plugin to
safely remove applied Windows updates from the system. Once removed, these
updates cannot be uninstalled. The primary benefit is recovering disk space.

.NOTES
Author: Michael Sainz
#>
[CmdletBinding()]
param ()

Write-Verbose "Checking for Administrator rights."
$Identity = [Security.Principal.WindowsIdentity]::GetCurrent()
$Principal = New-Object Security.Principal.WindowsPrincipal $Identity
if (-not $Principal.IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)) {
    Write-Verbose "Script isn't running with Administrator rights. Exiting."
    return
}

Write-Verbose "Checking if the required registry key exists."
$Key = "HKLM:\Software\Microsoft\Windows\CurrentVersion\Explorer\VolumeCaches\Update Cleanup"
if (-not (Test-Path -Path $Key)) {
    Write-Verbose "The Update Cleanup key isn't present (is KB2852386 installed?). Exiting."
    return
}

try {
    Write-Verbose "Writing the registry configuration for the Cleanup function."
    New-ItemProperty -Path $Key -Name StateFlags0128 -PropertyType DWord -Value 2 -ErrorAction Stop | Out-Null
}
catch [System.Management.Automation.ActionPreferenceStopException] {
    Write-Verbose "Couldn't write the registry key needed for the Cleanup function. Exiting."
    return
}
catch {
    Write-Verbose "A general error occurred. Exiting."
    return
}

Write-Verbose "Executing the Cleanup Manager."
Start-Process CleanMgr.exe -ArgumentList "/sagerun:128" -NoNewWindow -Wait

Write-Verbose "Cleaning up registry configuration for the Cleanup function."
Remove-ItemProperty -Path $Key -Name StateFlags0128

Script Project

So I just figured out what my script project will be.

So a while ago I came across a blog post detailing a new update that adds functionality to the Windows Update Cleanup wizard. What it basically does is remove the backups of superseded update files from the WinSxS folder.

But this update only adds the new feature to the GUI. What happens if you want to clean up the updates on a large group of computers?

Well there's going to be a script for that.
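
The heart of it will likely be fanning the cleanup out with PowerShell remoting, something like this (Computers.txt and the script name here are hypothetical):

```powershell
# Run the cleanup script on a whole list of machines at once
$Computers = Get-Content -Path .\Computers.txt
Invoke-Command -ComputerName $Computers -FilePath .\Clean-WindowsUpdates.ps1
```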

System Center & PowerShell = Happiness

So I've been doing a lot with the System Center suite of tools for enterprise management. I've always had exposure to System Center - Configuration Manager due to its OSD features, but in the past couple of years I've had more opportunities to gain some insight into Service Manager and a more complete understanding of Configuration Manager as well.

One of the pains of this suite, though, is that it's just huge, both in terms of infrastructure in production environments (labs are smaller but still put some load on hardware) and the time it takes to deploy. I've always likened it to spinning plates...lots of them.

And then I found out about the PowerShell Deployment Toolkit. And it's...amazing.

First, the name doesn't do it justice. This is not a tool for deploying PowerShell; rather, it deploys System Center by leveraging PowerShell! The team who put this together (the Windows Server and System Center Group within Microsoft) has done an amazing job of integrating several features of PowerShell, including Workflows, into a cohesive and automated solution to provision virtual machines and install core dependencies and System Center.

The team is led by Rob Willis and he blogs over at the Building Clouds site under the Deployment track, but pretty much all of those posts/tracks are great for cloud construction and systems automation content.

You can grab the PowerShell Deployment Toolkit here.

Where the Rubber Hits the Road

So throughout last year I've been talking to clients and fellow IT professionals about the incredible benefits of systems automation and DevOps. This is turning out to be something that I just really enjoy doing.

But there was something that held me back and I've started to change this - PowerShell.

It's funny, really...I go on at length evangelizing the benefits of PowerShell and have even helped people write some scripts (verbally) without fully understanding and learning the scripting language itself. Well, that stops here. And what better way to learn something than by doing!

Sometime soon I'm going to zero in on a problem or task that I need solved, make a script project out of it, and share it with the community. The community has always been such a great learning resource, and I'm excited to give something tangible back instead of just troubleshooting advice.

TechEd 2013: Announcing Windows Server 2012 R2

Oh hey, I have a blog. 

During TechEd 2013, Microsoft announced the next version of its server operating system, Windows Server 2012 R2.

You can watch both the keynote and a list of TechEd sessions, but here is a short overview of what I took away.

Windows Server 2012 R2 is part of Microsoft's evolving approach to a Cloud OS, or platform. Remember, people: they may be a "Devices and Services" company, but they are also a platform/partner company. Windows Server 2012 R2 plays a role in their platform strategy, which involves the following:

  • Private Cloud
    • Windows Server 2012 R2 plays the lead role in this area for obvious reasons. What was emphasized here, though, was that all the experience and insight Microsoft gained while building out Windows Azure on the previous server versions (2008 R2 & 2012) are brought into focus with Windows Server 2012 R2, so that private companies can take advantage of these benefits at scale as well.
  • Public Cloud
    • This is where Microsoft delivers on their SaaS, PaaS and IaaS story. Windows Azure is currently built on Windows Server 2012 Hyper-V, but make no mistake, they are testing and iterating at least some portion of Azure on Windows Server 2012 R2, and when it's released Azure will absolutely get those bits.
  • Service Partners
    • Microsoft specifically touted the fact that they haven't forgotten about scenarios where a customer needs a certain amount of customization that Azure cannot provide but a partner could. Jeff Woolsey described a scenario with a requirement that data couldn't leave Canada, where the customer still wanted to take advantage of at-scale computing. Providers can deliver on this.

This is how Microsoft described its approach to cloud computing. They took a few jabs at competitors such as VMware, Amazon and Salesforce, saying that they're only focused on specific pieces of the puzzle. Judging by what they illustrated, they may be right.

Just because it's the most thought-out solution doesn't mean customers will follow suit.

Time will tell.

BYOD From a Different Perspective

I'm sitting on this plane doing my usual thing: Working. Part of my job is to read...a lot.

I follow a variety of people. One of them is Mike Rigsby. Recently he wrote an article in which he lays out the opinion that BYOD (Bring Your Own Device) just wouldn't work in today's enterprises.

Obviously this piqued my interest. :)

Mike lists out the concerns he believes he would have. And I have to tell everyone, they are valid issues. But at the same time, I don't believe they're as bad as he anticipates. I have the benefit of seeing first-hand what this would be like...because I've deployed technologies that enable BYOD in enterprise environments. Let's hit the list...

1. Standardization

I believe the consumerization of IT is forcing this issue. You can actually make the argument that the same force applies to BYOD as well. But the question Mike is raising is "Do you really want them in the enterprise?". Well, the reason you want standardization is, above all, support, and also easier procurement. With a BYOD technology like VDI (Virtual Desktop Infrastructure), you rely less on the client hardware and more on the operating environment and infrastructure. This has the potential to reduce hardware support costs because you don't support as much hardware. Enterprises are realizing this and taking advantage.

2. Security

Security is a tough one. I think it really depends on the business requirements that drive the functional requirements in determining whether BYOD is a good fit for you. Typically a risk assessment is performed to evaluate if BYOD is acceptable, and a risk assessment is going to take into consideration technology that can prevent and mitigate a lot of these concerns. From VLANs, 802.1x and NAP/NAC to ACLs, many companies have discovered limited and acceptable risk when contained appropriately.

VDI can also help reduce the attack surface for the company in question. Mike describes a situation where users have their laptops stolen. In the case of VDI, since nothing is on the end user's laptop, the impact of the stolen property is limited to the end user.


3. Productivity

Mike frames this topic in the context of user productivity, which is absolutely a concern. What if the user loses or damages their equipment? Even if it's their own, the business still has to deal with the lost productivity. I've discovered that most companies who employ BYOD don't deploy it company-wide. There are certain scenarios that require company-owned hardware (like a dependency on interfacing with another hardware device, or a security policy in place). But depending on your business, the vast majority of users are probably information workers. In this case, VDI can be ideal for a BYOD solution. Several companies I know simply keep hardware on reserve for this type of scenario.

In conclusion, are there reasons not to deploy BYOD/VDI? Absolutely. It's a large capital expense. You need to purchase the backend infrastructure and software licenses, which are not cheap.

But is it worthwhile? I believe more companies are asking themselves that question and the answer may surprise Mike.