As I was uploading Office 2010 Pro Plus (from the Office 365 subscription), I noticed that Office 2010 Service Pack 1 was not part of the upload. At first this struck me as a bit odd, since you would normally want users to receive the latest version of the software, but after some consideration I decided that distributing only the RTM version is the better approach.
Why was this? There are several reasons I can think of immediately, and possibly more will come to mind over time.
What if the users only need Office 2010, not Office 2010 SP1?
This is the best way to deal with any application compatibility issues that may have arisen between the RTM and SP1 versions with a custom add-in, for example. Granted, I don’t think this is really the top reason, but listing too many different versions of Office 2010 for download in the Microsoft Online portal could get confusing. Supplying just the RTM version also means you only need one package deployed to your Windows Intune Software Distribution storage, and you can then use update approvals to determine which computers move to SP1 and which stay on RTM.
Does integrating the patch save any time?
Office 2010 doesn’t slipstream the SP1 bits into the existing install files; instead it runs the new updates from the Updates folder after the initial installation. This may minimise the delay before SP1 gets applied, but it is still running the full SP1 installation.
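For anyone unfamiliar with the mechanism, dropping the extracted SP1 files into the Updates folder looks something like the sketch below. The package file name and target path are examples from my setup and may differ for your download.

```shell
:: Sketch of building an install source with SP1 in the Updates folder.
:: File and folder names here are examples; adjust to your own downloads.
:: Extract the RTM install source (from the Office 365 portal) first, then:
officesuite2010sp1-kb2460049-x86-fullfile-en-us.exe /extract:C:\Office2010\Updates
:: setup.exe applies anything it finds in the Updates folder after the
:: initial RTM installation completes - it is not a true slipstream.
```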
Does integrating the patch save any bandwidth?
As far as bandwidth savings are concerned, making a large, monolithic install could have negative bandwidth implications. What if some of the clients already have Office 2010, and have done the SP1 update via Microsoft Update, and it’s already cached in your local proxy server? The monolithic package can’t take advantage of that, because it’s a different encrypted, compressed file.
Does this have an impact on future updates Microsoft releases?
It’s fairly safe to assume that there will be an Office 2010 SP2 at some point in the future. If we assume that the way to take the RTM install files to SP2 is once again to drop the files into the Updates folder, do you want to go through this packaging and upload process again, or is it easier to approve the updates via Windows Intune and let Microsoft’s distribution servers provide the files? That’s an easy choice if you ask me: leverage Microsoft’s work, don’t reinvent the wheel.
Is it using my precious online storage for no reason?
The 20GB allowance currently provided with a Windows Intune subscription may be more than enough for some companies, but nowhere near enough for others. Before you start purchasing additional online storage, does it make more sense to remove bits that can be delivered in smarter ways while leveraging Microsoft’s infrastructure?
What about integrating hotfixes that may not be available from the Windows Intune update service yet?
Again, keep them as separate packages. Once they go live on the Windows Intune update service, you can easily remove them from your managed software list. Decide on an easy to follow naming scheme so they can be recognised at a glance.
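As a sketch of what such a scheme might look like (the package names below are hypothetical, not the ones in my console), a consistent suffix makes the temporary packages easy to pick out later:

```shell
# Hypothetical names: product, KB number, architecture, and a TEMP marker so
# interim hotfix packages stand out in the managed software list.
pkgs="Office2010-KB2597011-x86-TEMP
Office2010-KB2597051-x86-TEMP
Lync2010-x86"
# List only the interim packages to retire once the updates go live in the
# Windows Intune update service.
echo "$pkgs" | grep -- '-TEMP$'
```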
As highlighted in previous posts, I’ve been following the advice of the Windows Intune team on setting up a caching solution, in this case Forefront Threat Management Gateway, as a means to accelerate the Windows Intune and Office 365 deployments.
What I’ve found is that the initial Windows Intune installation components don’t benefit greatly from the caching in my setup. That download seems to hover around 50MB for a Windows 7 Enterprise system that was completely patched prior to the Windows Intune installation, though Microsoft suggests it could be up to 120MB. The real attraction is just how effectively it caches Windows Update and Microsoft Update downloads, as well as the packages being distributed via Windows Intune online software distribution.
The simplest way to put it is that it works incredibly well at speeding up the patching of multiple systems. Of course, the more varied your Windows versions and service packs, the greater the chance that some items will be one-off downloads that don’t really benefit, but overall you will still see bandwidth savings.
Here are some numbers for those of you who like numbers. This is data I obtained from patching a number of Windows 7 Ultimate and Windows 7 Enterprise machines with varying degrees of updates installed. Note that a few new machines had already been using the cache at this point, so it had a head start.
Installing Windows Intune and allowing it to install all available updates on an RTM version of Windows 7 Ultimate required roughly 1.5GB of data delivered to the client, but generated only 165-205MB of Internet traffic. Windows 7 Ultimate and Enterprise with integrated SP1 generated 250MB of client traffic, but only 50MB of that came from the Internet, and that 50MB was the Windows Intune client downloads at the start of the process.
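To put those figures in perspective, a quick back-of-envelope calculation (using the midpoint of the 165-205MB range) shows the share of update traffic served from the cache:

```shell
# Rough maths on the RTM Windows 7 Ultimate figures quoted above.
client_mb=1500      # ~1.5GB of update data delivered to the client
internet_mb=185     # midpoint of the 165-205MB fetched from the Internet
saved_pct=$(( (client_mb - internet_mb) * 100 / client_mb ))
echo "Roughly ${saved_pct}% of the update traffic came from the TMG cache"
```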
For a handful of machines these numbers are very impressive, but hopefully you aren’t encountering too many PCs out there that haven’t been updated since they were switched on. The other element in play here is that these are just the numbers for Windows Updates; I hadn’t gotten around to installing Office 365 Pro Plus yet, which was one of the requirements for these machines.
I downloaded the 32 bit Office Pro Plus installer from the Office 365 portal, extracted the files out into their folder structure, downloaded and extracted Office 2010 SP1 into the Updates folder, and created a package that was just under 1GB in size. After the lengthy compression and upload process in the Windows Intune console I realised that this really wasn’t the best approach, so I removed the updates and created a new package. I will discuss the reasoning behind this in my next post. The Lync client was repackaged and uploaded into the online storage space, a much faster process due to the smaller file size.
Then it was time to ensure that all of the required, applicable updates were made available through Windows Intune. The ones that aren’t currently available through Windows Intune are as follows.
Microsoft Online Services Sign-In Assistant – Needs to be downloaded separately
KB2597011 – Hotfix not currently available via Windows Intune Updates; needs to be repackaged and uploaded as managed software
KB2523130 – Hotfix not currently available via Windows Intune Updates; needs to be repackaged and uploaded as managed software. EDIT: This update is included in Office 2010 SP1, so it isn’t required here
KB2597051 – Hotfix not currently available via Windows Intune Updates; needs to be repackaged and uploaded as managed software
At this point the question becomes: should we package them all up as one software package for Windows Intune to distribute, or should we upload them individually? I will go with individual packages, for reasons that I will outline in the next post. The Microsoft Online Services Sign-In Assistant is an MSI file, so we don’t need command line options for Windows Intune, while the other three, being hotfixes, all need the /quiet switch added before uploading. At this stage the Windows Intune Managed Software screen looks like this.
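For the hotfix packages, the command line supplied during the Windows Intune upload is just the silent install switch. A sketch with a hypothetical file name:

```shell
:: Hypothetical file name - substitute the actual hotfix executable.
:: The MSI-based Sign-In Assistant needs no switches, while the
:: self-extracting hotfix executables need /quiet (and typically /norestart)
:: supplied as command line arguments when uploading as managed software.
office2010-kb2597011-fullfile-x86-glb.exe /quiet /norestart
```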
There are a couple of things that require further explanation at this point. The number of packages required has been minimised because all of the clients are Windows 7 64 bit with Office 2010 32 bit installed. If I were deploying a 32 bit client OS, I would need the 32 bit Sign-In Assistant as well as the 32 bit Lync client. If they were running the 64 bit version of Office 2010, they would need the 64 bit hotfixes deployed. This is just a small example of how implementing a standard desktop OS and application suite really does start reducing overhead, even in a simple manner when compared to a full SOE.
Now that everything is prepared, it’s time to deploy the software to the appropriate computer groups. For the first round of testing I deployed it to a group containing a single machine to ensure that all went smoothly. As you would imagine, the first time around, all of those software installs, plus the additional Windows and Office updates they triggered under my existing update approvals, took some time to download. Thanks to the caching, though, the subsequent installs onto additional PCs as I increased the deployment scope were served almost entirely from the TMG cache, so the bandwidth savings were enormous. There were a few teething issues with the hotfixes that I am still troubleshooting, but otherwise it’s been smooth sailing.
Finally, going back to answer the question asked in the post title: just how effective is Forefront TMG for caching Windows Intune? For approved updates and software distribution, the answer is outstanding. Having all of the patches and updates delivered out of the cache really does change how you can approach deployment moving forward. The initial deployment of the Windows Intune client pieces doesn’t really benefit, but chances are there will be some accompanying updates that make that a non-issue for most people.
Thus far the Windows Intune client won’t install on Windows 8, but that’s to be expected at this early pre-release stage. The big benefit at this point in time is the upgrade eligibility to Windows 8 Enterprise, or whatever the similarly capable edition will be called in Windows 8. It’s not a safe assumption that there will be a 1:1 edition mapping; below I give a couple of reasons why.
ARM tablets are one of the obvious areas that the Windows Intune team will need to develop for, considering the strong push into the enterprise that these tablets will have alongside traditional Windows PCs. At this stage of the game I’m not 100% convinced of the real viability of ARM based Windows tablets; what frustrates me with existing tablet solutions in the marketplace is that they don’t run all the Windows apps I want to run, so I still need my laptop. Over time my dependence on these PC-only applications may be reduced, but it is going to take a while. During that period Intel and AMD aren’t going to be sitting on their hands; they will no doubt be chasing the power consumption numbers that ARM based systems tout. If someone from the Windows Intune team is looking for a tester for this scenario, I’m more than happy to put my hand up for the task.
With Windows versions, some choice is good, but too much choice isn’t necessarily good, and can be quite detrimental. While Microsoft has been attempting to simplify its Windows lineup, the Windows 7 lineup leaves a lot to be desired, and Windows Intune is a great example of where there is some confusion and inconsistency. Windows 7 Enterprise and Windows 7 Ultimate provide the same functionality; the primary differences are how they are sold and licensed (retail and OEM for Ultimate versus volume license for Enterprise) and their different approaches to activation.
Where the pain comes in is that Enterprise needs to be a clean installation, whereas Ultimate can do an in place upgrade of lower end versions of Windows 7, as well as Windows Vista clients. In a well managed corporate environment, the upgrade discussion doesn’t usually happen, instead a pristine image, tweaked and tested, is deployed out to users when the time for a new OS rolls around. User data in these environments should be redirected, so the dependency on the physical machine and the OS are minimized.
But what about the SMB customer who doesn’t have the necessary infrastructure, doesn’t necessarily want to invest in data migration during the upgrade process, and instead just wants a good old in-place upgrade? Ultimate allows this with ease, but Enterprise isn’t in the running. To add insult to injury, many of the smaller customers out there may not have been domain joined and had no need for Professional or higher, so they are in fact not eligible for the Windows 7 Enterprise upgrade. To take advantage of these upgrade rights they need to purchase Windows 7 Professional upgrades at retail or via Windows Anytime Upgrade. I wouldn’t like to be the person who has to explain this to the customer who thought they were all set to move across to Windows 7 Enterprise.
Hopefully Windows 8 sees a further reduction in the SKU lineup. There is much speculation on this at the moment, and I’m sure there are groups within Microsoft and within OEMs who have these details, but the rest of us must wait. For OEMs, the more SKUs Microsoft makes available, the more decisions they need to make in matching Windows versions to PC models, and that’s quite a large matrix when you look at the hardware lineups of the major OEMs.
The flip side of this is what Apple does: one OS version across a limited range of hardware choices. Some may scoff at the lack of choice Apple offers compared to HP, Dell, Acer and others, but the economies of scale really favour the Apple approach. Suppliers can ramp up production, warehousing and shipping are simplified, resellers can reduce stock on hand, and the right stock is more likely to be available within a short transit time. Sometimes it seems like the other OEMs are deliberately limiting their profitability, while Apple continues to make a very healthy margin.
One or two Windows options for OEMs would be a great start, preferably one, then using the Windows Marketplace, Windows Intune, Software Assurance, or even retail media to allow upgrades to a limited range of premium SKUs. This approach would make Windows Intune and desktop Software Assurance much more attractive to customers that have traditionally avoided SA on the desktop, as they would be seeing immediate value with a much more feature rich, business targeted upgrade to Windows. This would be a step in the right direction, but I think it could be just a bit too drastic.
The other issue we currently see with the SKU lineup that impacts Windows Intune’s Windows 7 Enterprise upgrade rights is that customers on Windows 7 Professional don’t necessarily see the value in a new OS deployment just to get BitLocker, BranchCache, DirectAccess and Enterprise Search. For an SMB customer relying heavily on other cloud services, some of these capabilities just aren’t appealing or even terribly useful, and at this stage SMB customers really are the best targets for Windows Intune. Consolidating the Professional and Enterprise/Ultimate versions would make this value clearer, adopted alongside a single version of Windows that is the default in the market.
The last update gave a quick rundown of what MDOP includes, but now it’s time to see how these components can be used in various types of organizations. We will start with an unmanaged, distributed workforce, and add structure (and potentially complexity) through each example. The chart at the end of the post gives each component an applicability rating, but remember that these are just the views of the author, and I am more than willing to be swayed to change my view.
Unmanaged Distributed PCs
This is probably the scenario that will benefit the least, as many of the MDOP components require a more traditional, well designed, highly available network infrastructure for effective deployment and maintenance. Windows Intune can be installed onto unmanaged, distributed, non-domain-joined PCs, but that means some of the MDOP inclusions just aren’t applicable, and others, like the Asset Inventory Service, may have only very short term value. DaRT, however, is definitely a very useful addition to the IT arsenal.
Unmanaged Centralized PCs
The big differentiator here is that we have bandwidth. Just what MDOP tools can we start using if we aren’t part of an AD domain? While many in the world of IT have spent years arguing over the best directory or network operating system to deploy, there are still many networks running in peer-to-peer mode in the SMB space, and while we may want to provide them some more infrastructure, it may not always be as applicable as we like to think. It pains me to write this, as I was a user on one of the first NT-style domain rollouts, and later a lab rat on one of the first AD global domain rollouts, so I fully appreciate the benefits of a directory.
Lightly Managed Distributed PCs
Now we have added domain joined PCs into the mix, even if they are at the other end of slow connections and the AD management is quite basic. Suddenly the AGPM tool starts to bring value, and more of the MDOP components start lighting up for applicability.
Lightly Managed Centrally Located PCs
Now that we have bandwidth and Active Directory capabilities, what more could we ask for? Well, a lot, and that’s why MDOP and other tools exist. This is where MDOP really starts to shine.
Well Managed Distributed Or Centrally Located PCs
I’ve included this category for completeness, but in reality I don’t see anyone swapping out their existing management solution for Windows Intune unless they have some very specific requirements to do so. An on-premises solution such as System Center may be more complex to deploy and support than Windows Intune, but the capabilities the combined System Center family offers far exceed anything that Windows Intune will offer for quite a while.
| Component | Unmanaged Distributed | Unmanaged Centralized | Lightly Managed Distributed | Lightly Managed Centralized | Well Managed |
| --- | --- | --- | --- | --- | --- |
| Application Virtualization (App-V) | Low | Low | Medium | High | High |
| Enterprise Desktop Virtualization (MED-V) | | | | | |
| Diagnostics and Recovery Toolset (DaRT) | High | High | High | High | High |
| BitLocker Administration and Monitoring (MBAM) | | | | | |
| Advanced Group Policy Management (AGPM) | | | | | |
| Asset Inventory Service (AIS) | Low | Low | Low | Low | Low |
I don’t want to go into the ratings of everything above, but I do want to focus on a few of them.
Firstly, AIS is listed as Low across the board because AIS is the basis for the software reporting capabilities already built into Windows Intune. There is a small window of opportunity for AIS to provide some inventorying capability, as it has a lighter client footprint than the Windows Intune install, but this wouldn’t be a normal scenario.
App-V and MED-V only receive Medium scores in distributed environments due to the potential bandwidth requirements for deployments. If bandwidth isn’t a concern they can both become High.
DaRT is useful across the board as it will help solve issues with non-booting PCs, which is great inside of a large organisation, but also highly valuable if you have to do any remote recovery and repair work.
Looking at the chart, one of the things that should be clear is that MDOP shines when it has the right infrastructure to work with. Microsoft would like all of its customers to have some type of Software Assurance (SA) on the desktop OS, combined with MDOP, but that isn’t the case. Windows Intune allows customers who chose not to go down this path, or missed the window of opportunity, to get many of the benefits of SA without an SA agreement. Now that I’ve typed that out, I think I may have to write an article comparing a Windows Intune subscription with SA. That will be the first licensing post for the site!
The MDOP add-on for Windows Intune is an interesting offering from Microsoft, allowing subscribers to get even more of the capabilities that would usually be available only to a customer under a Microsoft volume license agreement that includes desktop Software Assurance.
MDOP is a collection of technologies from Microsoft, and here’s a description of it in Microsoft’s own words…looks like we need to remind them that it’s available via Windows Intune as well…
The Microsoft Desktop Optimization Pack (MDOP) is a suite of technologies available as a subscription for Software Assurance customers. MDOP helps to improve compatibility and management (App-V/MED-V), reduce support costs (DaRT), improve asset management (AIS) and improve policy control (MBAM/AGPM).
Microsoft Application Virtualization (App-V) transforms applications into centrally managed services that are never installed and don’t conflict with other applications.
Microsoft Enterprise Desktop Virtualization
MED-V removes the barriers to Windows upgrades by resolving application incompatibility with Windows 7 and delivering applications in a Windows XP-based application compatibility workspace. Upgrades can proceed on schedule, and users can take advantage of the power of Windows 7 right away without losing access to applications they need while IT departments can remediate incompatible applications.
Advanced Group Policy Management
Microsoft Advanced Group Policy Management (AGPM), a core component of the Microsoft Desktop Optimization Pack for Software Assurance, makes it easier for IT organizations to keep enterprise-wide desktop configurations up to date, enabling greater control, less downtime, and reduced total cost of ownership (TCO).
Diagnostics and Recovery Toolset
Microsoft Diagnostics and Recovery Toolset, a core component of the Microsoft Desktop Optimization Pack for Software Assurance, helps IT teams make PCs safer to use, keeps employees productive, and enables desktops that are easier and less expensive to manage. Administrators can easily recover PCs that have become unusable, rapidly diagnose probable causes of issues, and quickly repair unbootable or locked-out systems, all faster than the average time it takes to reimage the machine. When necessary, you can also quickly restore critical lost files.
BitLocker Administration and Monitoring
Organizations around the world rely on BitLocker Drive Encryption and BitLocker To Go to protect data on Windows 7 PCs and portable storage devices. To make large-scale BitLocker implementations easier to manage, enterprises turn to Microsoft® BitLocker® Administration and Monitoring (MBAM).
Asset Inventory Service
Microsoft Asset Inventory Service (AIS), a core component of the Microsoft Desktop Optimization Pack for Software Assurance, provides a comprehensive view of your enterprise’s desktop software and hardware environment. AIS helps reduce total cost of ownership (TCO) and improve license compliance through advanced software inventory scanning and by translating inventory data into actionable information.
All of this is well and good, but how well will these components work for your company? That is the topic for the next post, where I will delve into how organisations of different sizes may benefit from MDOP, as all of the pieces aren’t necessarily right for everyone.
Follow the link for the full article, but here’s my take…
The inbuilt and integrated AV is one of the core benefits of Windows Intune, and is really the only way for a customer larger than about 10 seats but not yet at System Center Configuration Manager scale to get the Microsoft AV endpoint technologies. It’s part of what I call the perfect storm for Intune applicability to a customer: expired or expiring AV, a relatively unmanaged environment, and preparing for an SOE or desktop OS upgrade. As one or more of these is removed, the value proposition for Intune is reduced, making it a harder sell for the partner, or a harder justification for the customer.
For a partner looking at Intune as a scale-out support option, possibly as an MSP, the integration of Intune Endpoint Protection into the Intune administration console is a great convenience, and I would ask them to find genuinely good reasons to prefer the third party offerings over what Intune provides. I’m not saying that Intune Endpoint Protection will necessarily check all the boxes for all customers, but it’s worth checking whether it does before committing to alternatives.
Why the Windows Experience Index? Well, it’s simple: it’s included in Windows 7, and I didn’t want to run an extensive test of disk performance under different RAID options, not yet anyway. That would be outside the scope of what I’m really doing with these devices.
First up, the HP MicroServer N36L, which has a dual core Athlon Neo at 1.3GHz. As you can see, the CPU score is quite low by modern standards, but when this is being used as a basic server or NAS device it shouldn’t really matter; the disk subsystem and network throughput are going to make a bigger difference. The N40L is also available with a 1.5GHz CPU, but I highly doubt it would do much to close the gap in CPU and RAM performance. Today it’s easy enough to saturate Gigabit Ethernet with a single HDD, so unless you drop in additional NICs you aren’t going to hit any real throughput issues with this server. Note that this server, like the Acer AC100, has 8GB of RAM installed.
For server purposes the graphics scores are largely irrelevant, and the HDD score is capped due to testing against a single 7200RPM HDD.
Now on to the Acer. As expected, the WEI score for the CPU blows the HP out of the water. I was surprised, though, that what is in many ways an almost entry level Xeon was able to score so well. Again, as a NAS the CPU speed isn’t going to have much of an impact, but for running our test VMs it will make a big difference, as will the ability to take it up to 16GB of RAM instead of being maxed out at 8GB like the HP. As this server also has the ability to take an additional PCI-E card, you can add a multiport NIC if you really want to push the throughput across the wire.
The Acer is going to be my primary server for the next few months, so it will certainly get put through its paces. My MicroServer has been doing 24/7 duty in various roles for a while now, so the reliability of the unit is a known quantity; now it’s time to see how the Acer copes, but with bigger workloads than the HP was ever really capable of.
The Acer unit arrived yesterday, and the first thing I noticed was that it ships in a much smaller box than the HP MicroServer, though I wasn’t surprised by how much smaller the Acer is in comparison. This is instantly a big plus for me at the moment, with some extended travel coming up; something small and light, with a degree of flexibility, is what I need.
So far I have 8GB installed, and have been in contact with my favourite Kingston employee to get the scoop on supported memory to take it to 16GB, which is going to be a much better option longer term for some of my virtualisation and testing projects I need to perform around Windows Intune and Windows 7 deployments.
The faster CPU is really noticeable, although it’s a bit of an unfair comparison: a low power dual core Athlon against a quad core Xeon with Hyper-Threading, and that’s before the clock speeds are even taken into account. Like most techs, I like to see more cores in Task Manager, and this certainly delivers, but the overall responsiveness under load is also much, much better. I will do a Windows 7 install within the next few days so that I can provide a sample WEI comparison between the two micro servers. Remember, though, that one is a quarter of the price of the other, and while HP and Acer may be targeting them at similar audiences (SMBs with a need for Small Business Server 2011 Essentials, Windows Server 2008 R2 Foundation or Windows Server 2008 R2 Standard), the way they go about the task is very different.
I’m not completely sold on the Acer concept of keeping the power button behind the locked front panel, with the keyhole on the side of the unit, but that’s a minor quibble. Dropping in new drives is just as simple as on the HP, but here you are limited to 4 internal HDDs plus 1 external eSATA drive, versus the HP’s ability to take up to 6 internal drives if you forgo the optical drive and route the eSATA cable into one of the free internal drive bays.
Setup was simple; the only catch was needing to change the boot device order to get Windows to install off the flash drive. I’ve encountered this on my Acer Iconia W500 tablet as well, so it was easy enough to change, but it did have me scratching my head for a few minutes. I’ve kept the drives in AHCI mode rather than taking advantage of either the LSI or Intel onboard RAID capabilities; I’ll test those out at some point in the future.
With an extended overseas journey approaching, and some spare time which will be dedicated to some software and scenario testing, I’ve decided to purchase one of the Acer AC100 micro servers for this purpose.
The Acer box is a very different beast to the HP MicroServer, with Acer going down the path of a high performance Xeon versus the ultra low voltage AMD CPU in the HP. For general NAS, storage or other light CPU overhead work this difference won’t really be seen, but given the amount of work I’ll be doing building out VMs and running various test loads, the AMD CPU in the HP is going to be a little anaemic. Four cores with Hyper-Threading versus two, plus the ability to go to 16GB of RAM instead of just 8GB, are the big winners on this front. I will be sticking with 8GB to start with, due to the lack of supported 8GB ECC unbuffered RAM on the Acer compatibility list (and the lack of support from the major RAM manufacturers as well), but this will change. Having an Intel NIC on board is also a nice sweetener.
That’s not to say the HP doesn’t have its own charm. The ease with which you can turn this 4-HDD device into a 6-HDD device means it’s potentially a better option for the storage junkie. It also has two PCI-E slots instead of the single slot in the Acer (which will be occupied by an Intel i350-T2 NIC, due to its support for advanced virtualisation and iSCSI capabilities), and it’s built like a tank. The modular design of the HP generally impresses; hopefully the Acer comes close. The HP unit is also a quarter of the price of the Acer, which is definitely going to be a deciding factor for most.
There are some things both servers lack: neither supports hardware RAID 5. While Acer promotes support for RAID 5, the fine print reveals that it is via an Intel software solution. Both can support RAID 5 via OS configuration, but hardware offloading would definitely be appreciated. The extra horsepower in the Acer should reduce the potential performance impact of parity calculations, but a better RAID implementation wouldn’t hurt.
I’ll give a further update when the Acer unit arrives, and give some feedback on setup and build quality versus the HP, but to me they are very different beasts, even though they appear similar at first.
This is just a short update, primarily confirming that the NIC traffic in Hyper-V VMs is being accurately reported again. I’m not quite sure what has triggered this, but I’ve got a few VM snapshots I can revert to when I want to dig further into the issue.
The other piece of the testing that I’ve just confirmed is working as expected, namely that traffic is being cached comprehensively, is the single NIC proxy/caching-only TMG scenario that I initially wanted. The TMG wizards make it easy enough to reconfigure the server after removing the additional virtual NIC, and the client was updated with the changed IP address of the proxy server via IE and netsh, as shown in previous posts. With these changes in place, I now have a working TMG configuration for all the machines on my network, not just machines in a virtual network, and I’ll certainly save myself some Windows Intune and Windows Update traffic on my ISP connection each month.
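For reference, pointing the WinHTTP stack (which the update agents use) at the proxy can be done with netsh along these lines; the address below is a placeholder for your own TMG server:

```shell
:: Placeholder address - substitute your TMG server's IP and listening port.
netsh winhttp set proxy proxy-server="192.168.1.10:8080" bypass-list="<local>"
:: Confirm the change took effect.
netsh winhttp show proxy
```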
After spending way too much time doing updates and rebuilds (or reverting to snapshots…), I’ve been noticing some interesting differences between the way Windows Intune delivers updates versus the way Windows Update does, but that will be a topic for a future post.