Tuesday, December 30, 2008

The “Albuquerque Photo”



by Jeff Greene

An Iconic Microsoft Company Photo – 30 Years Later

ABQ photo2

Top row, L-R: Steve Wood, Bob Wallace and Jim Lane;

Middle row, L-R: Bob O'Rear, Bob Greenberg, Marc McDonald and Gordon Letwin;

Front row, L-R: Bill Gates, Andrea Lewis, Marla Wood and Paul Allen.

(Missing: Miriam Lubow, who missed the photo shoot because of a snowstorm)

Thirty years ago, on December 7, 1978, nearly the entire Microsoft staff gathered at Royal Frontier Studios in Albuquerque for a company portrait. It is the only image of its kind from that era and one of the most iconic images in Microsoft's history. For those of us who work here, it serves as a historical touchpoint, illustrating how a small yet passionate and dedicated group could quite literally change the world.

It was during this time in Albuquerque that the name “Microsoft” was trademarked, and it was here that Bill Gates laid out his famous mission statement for the company: a computer on every desk and in every home, running Microsoft software.

In April this year, 11 of Microsoft’s 12 original employees gathered again to re-create the famous photo that was taken shortly before the company relocated in 1978 from its Albuquerque birthplace to Washington State.

ABQ photo 3

Back row, L-R: Bob O'Rear, Steve Wood, Bob Greenberg, Marc McDonald, Gordon Letwin, and Jim Lane;

Front row, L-R: Bill Gates, Andrea Lewis, Miriam Lubow, Marla Wood and Paul Allen.

(Missing: Bob Wallace, who passed away in 2002)

Click here for more details about the history of this photo and the early days of Microsoft…

Microsoft Pro Photo Community Blog : The “Albuquerque Photo”

Samples for creating SSIS packages programmatically



This is an index post for a series of posts with examples of how to create packages programmatically. Creating and adding tasks to a control flow is pretty straightforward, but programmatically adding data flow components can be tricky due to the interaction with the native COM APIs. Each post provides a simple scenario for using a single component.

All samples are written using SQL Server 2008, but they will also work with SQL Server 2005 with minor code modifications.

I'll continue to update this post as more samples are added. Please post in the comments if there are specific data flow components you'd like to see.

For an end-to-end solution, check out the Package Generation sample available on CodePlex.

Other resources

Published Tuesday, December 30, 2008 1:00 PM by mmasson

SSIS Team Blog : Samples for creating SSIS packages programmatically

Who was General Tso? and other mysteries of American Chinese food




Friday, December 26, 2008

Updated Visual Studio Team System 2008 Trial VPC images available



For those of you who want to try out Visual Studio Team System 2008, including Team Foundation Server, without having to go through the process of installing the product, we have recently released four VPC and Hyper-V images for your use.  The “all-up” image includes Team Foundation Server, Team Build, Team Explorer, and Team Suite, while the TFS-only version has just Team Foundation Server, Team Build, and Team Explorer.

These images are set to expire on December 31, 2009 (over a year from now) and are a replacement for the original VPC images we released last year.



For more information and to find links to the images go to http://www.pluralsight.com/community/blogs/brian/archive/2008/12/24/happy-holidays-and-look-what-santa-s-brought.aspx.  Many thanks to Brian Randell for his outstanding work pulling these images together and getting them through the sometimes Byzantine release process here at MS.


Published 26 December 08 01:30 by jeffbe

Jeff Beehler's Blog : Updated Visual Studio Team System 2008 Trial VPC images available

Resolution Randomizer





How to identify whether a Database Backup Set was compressed using WITH COMPRESSION



This post is a continuation of my first blog post, “SQL Server 2008 Backup Compression”. I received a query from one of my readers: how do we determine whether a database backup set was compressed using WITH COMPRESSION? A simple way to identify this is to use RESTORE HEADERONLY.

The RESTORE HEADERONLY command can be used to fetch the header information for all backup sets residing on a backup device. The command output includes a column called ‘Compressed’ that indicates whether the backup set is compressed using software-based compression:

  • 0 = No
  • 1 = Yes

Let’s verify this with an example:

Please note, we are reusing our existing backup sets ‘Before_Compression.bak’ and ‘With_Compression.bak’.



RESTORE HEADERONLY
FROM DISK = 'D:\tempdb\Before_Compression.bak'

The ‘Compressed’ column returns 0 for this backup set.

and now,

RESTORE HEADERONLY
FROM DISK = 'D:\tempdb\With_Compression.bak'

This time the ‘Compressed’ column returns 1, confirming that the backup set was created WITH COMPRESSION.



As always, for more details on RESTORE HEADERONLY, please refer to SQL Server 2008 Books Online.


Disclaimer: I work at Microsoft. Everything here, though, is my personal opinion and is not read or approved by Microsoft before it is posted. No warranties or other guarantees will be offered as to the quality of the opinions or anything else offered here.

Posted: Friday, December 26, 2008 7:06 AM by varund

Varun Dhawan's Blog : How to identify whether a Database Backup Set was compressed using WITH COMPRESSION

Wednesday, December 24, 2008

TechNet Virtual Lab: Deploying SSTP Remote Access



Event ID: 1032370149

Language(s): English.

Product(s): Windows Server 2008.

Audience(s):  IT Professional.

Duration: 90 Minutes

Start Date: Friday, February 15, 2008 12:00 AM Pacific Time (US & Canada)

Event Overview

After completing this lab, you will be better able to configure the PKI necessary for SSTP, configure RRAS to accept remote VPN connections, and configure and test an SSTP-based VPN connection.


By registering for this virtual lab, you will receive a one-time follow up call from a Microsoft representative to inform you of special discounts and offers related to products and services presented in the virtual lab.

For an overview of SSTP, see: http://technet.microsoft.com/en-us/magazine/2007.06.cableguy.aspx

TechNet Virtual Lab: Deploying SSTP Remote Access

New demos



Published 24 December 08 11:19 AM | ukdynamics

I blogged back in July about the Silverlight demo that we had produced for Dynamics GP; well, now we have a library of 20 demos available to the public. Take a look at them here.

They cover:

  • Dynamics GP overview
  • Financials
  • Supply chain
  • Human Resources
  • Reporting
  • IT Systems

They are a really great resource for customers and prospects to see how Dynamics GP can help them to be more productive, streamline their business processes and ultimately help save them money.

Posted by Tom Brookes

Microsoft Dynamics GP UK Blog : New demos

BenkoTIPS January Webcasts



We've been working hard to get January's webcasts into the schedule with MSDN Events. Here is what we've got. I'll be updating the links for the last one...

  • January 7, 2009 - Working with LINQ and Visual C#.  Data access has evolved to make working with databases easier and more productive. In the Microsoft .NET Framework version 3.5, access to data is integrated natively into the languages. In this webcast, we look at some of these access technologies, including Language-Integrated Query (LINQ) and related tools, to see how they impact our development approach. We dive into data wherever it lives and show how we can integrate it into Microsoft Visual C# applications.
  • January 14, 2009 - DemoFest: SQL 2008, AJAX and Virtual Earth SDK. In DemoFest webcasts, we demonstrate an end-to-end solution that takes technologies and tools we discussed in other BenkoTips webcasts and applies them to create something fun. We look at some of the new features in Microsoft SQL Server 2008, including data types for spatial data, and how we can use these features to build interesting applications. We then take this functionality and expose it using a Windows Communication Foundation (WCF) service. Finally, we explore the software development kit (SDK) for Microsoft Virtual Earth and how we can build on top of this rich interactive platform using new Asynchronous JavaScript and XML (AJAX) and JavaScripting capabilities in Microsoft Visual Studio 2008. 
  • January 21, 2009 - Working with XAML in Expression Design and Blend. XAML is a powerful platform for building the next generation of usable applications and Microsoft Expression takes your creative possibilities to a new level. The professional design tools and innovative technologies in Expression give you the flexibility and freedom to bring your vision to reality whether you are designing creative and highly interactive Web sites with Microsoft Silverlight, rich user experiences on the desktop with Windows Presentation Foundation (WPF), or simply managing digital assets and content. In this webcast, we look at the products that make up Expression Studio and how they are used together to make great user experiences a reality both on the Web and on the desktop. We dive deep into Microsoft Expression Blend and Expression Design to see how they can be used along with Microsoft Visual Studio 2008 to deliver rich interfaces based on Windows Presentation Foundation for online and offline scenarios.
  • January 28, 2009 - Visual Studio Team System Data Edition. Have you ever had to test code under development against the "production" version of your database simply because it was the only copy that was the "truth"? If so, you know firsthand the challenges of managing a database as it moves through the software development lifecycle. Visual Studio Team Edition for Database Professionals dramatically extends Visual Studio Team System and makes it easier to create and test applications that work with databases. In this session we'll cover best practices for keeping your database environments in sync from a schema and data perspective. We'll explore how to create and version database schemas, how to generate test data that truly mimics real data, and how to use unit testing to validate your application against the database.

Published Wednesday, December 24, 2008 5:28 PM by benko

BenkoBLOG : BenkoTIPS January Webcasts

Tuesday, December 23, 2008

A TechMasters Holiday Poem...


'Twas the night before implementation and all through the house
  not a system was working, not even the mouse.
The programmers all hung by their tubes in despair,
  with hopes that a miracle soon would be there.

The users were nestled all snug in their beds,
  while visions of queries danced in their heads.

When out in the server room there arose such a clatter,
  I sprang from my desk to see what was the matter.
And what to my wondering eyes should appear?
  but an alpha geek programmer with a six pack of beer.

More rapid than eagles, his programs they came,
  and he cursed and he muttered and he called them by name!
On Update! On Insert! On Query! On Delete!
  On batch jobs! On Closing! On functions complete!

His eyes were glazed over, his fingers nimble and lean,
  from weekends and nights in front of a screen.
But with a wink of his eye and a twitch of his head,
  soon gave me to know I had nothing to dread.

He spoke not a word, but went straight to his work,
  turning specs into code; then turned with a jerk.
And laying his finger upon the "ENTER" key,
  the system came up and worked perfectly.

The updates updated; the deletes, they deleted,
  the queries inquired; the closings completed.
He tested each whistle, and tested each bell,
  with nary a glitch; it had all gone so well.

The system was finished, the tests were concluded.
  The last minute changes were even included!
And the users exclaimed with a snarl and a taunt,
  "It's just what we asked for, but not what we want!"

BenkoBLOG : A TechMasters Holiday Poem...

Support Debugging Tool Build 10 Released



The Support Debugging Tool for Dynamics GP 10 has been released by Dave Musgrave. Below is a list of enhancements to the tool:

  • Screen Shots
  • Security Profiler
  • Resource Information
  • Security Information
  • Administrator Settings

To view detailed information about each of the above, along with links to download the tool, please review the entire article posted here: View article...

If you are not sure what the Support Debugging Tool is or what it can do, please read the article posted here: Support Debugging Tool for Microsoft Dynamics GP.

Published Tuesday, December 23, 2008 6:24 PM by jeffk

US Microsoft Dynamics GP Field Team : Support Debugging Tool Build 10 Released

Services for Microsoft Dynamics



With a powerful, growing portfolio of options, Services for Microsoft Dynamics® helps you and your customers increase efficiency, productivity, and profitability. In other words, everybody wins with Services for Microsoft Dynamics.

Business Ready Enhancement Plan, Customer Services, CustomerSource, Partner Services, Sure Step, Licensing—with these powerful services, it’s no wonder that Microsoft partners and customers worldwide see Microsoft as a strategic business partner for every stage of expansion and development. And by providing customers with access to the most robust services, you can help them meet ever-higher business objectives—which will help you do the same.

To view the following list of Services, please visit the PartnerSource article located here: Services

  1. Licensing and Financing
  2. Business Ready Enhancement Plan
  3. CustomerSource
  4. Customer Services
  5. Partner Services
  6. Sure Step

Published Tuesday, December 23, 2008 6:04 PM by jeffk

US Microsoft Dynamics GP Field Team : Services for Microsoft Dynamics

Monday, December 22, 2008

Here Comes Another Bubble v1.1 - The Richter Scales

Office Communications Server 2007 R2 Goes RTM With Improved Tools for ISV Applications



Office Communications Server 2007 R2 has been released to manufacturing (RTM). Customers will see the software at a virtual launch event in February. OCS 2007 R2 incorporates new tools that developers can use to embed communication features into business applications.

New application programming interfaces (APIs) and Visual Studio integration improve the efficiency of everyday business processes by enabling businesses to build communications-enabled applications and embed communications into existing business applications.

The release improves mobile device support. Your customers will see more PBX-like features that improve productivity and cut the cost burden of legacy PBX infrastructure.

SIP trunking, new in R2 and also an important part of Microsoft's Response Point small-business VoIP offering, enables companies to move to a standards-based infrastructure: your customers can connect OCS 2007 directly to the carrier network, reducing infrastructure costs at the gateway. SIP trunking could help spur OCS adoption in the midmarket and SMB segments.

Microsoft will mark the general availability of OCS 2007 R2 on February 3 at a virtual launch event. At that time, Microsoft's volume licensing customers will be able to download the software.

Those interested can register to participate in the launch at http://www.microsoft.com/communicationsserver

Through deep integration with Microsoft Office, Microsoft Exchange Server and Office SharePoint Server, Office Communications Server delivers organizations the power of one: one infrastructure for enterprise communications and one cohesive user experience. Customers do not need to deploy and adopt dozens of different applications to make unified communications a reality. ISVs are adding features that integrate communications deeply into their applications.

Posted: Monday, December 22, 2008 6:00 PM by Bruce Kyle

US ISV Developer Evangelism Team : Office Communications Server 2007 R2 Goes RTM With Improved Tools for ISV Applications

A new powertoy (nee test tool) used to find OneNote pages with embedded files



    Every so often I'll get a question from someone who has a need to find pages that have embedded files in them. The reasons vary, but when you need that information, it is hard to get with OneNote 2007. The same question was facing me last week as we were testing notebooks on SharePoint, and I was thinking of creating a tool to identify those pages. I remembered someone posting the same question to the newsgroup many months ago, and figured if anyone else wanted this, I would give it away. Lo and behold, as I was in the process of adapting Nani's Table of Contents powertoy to my needs, I got an email forwarded to me from someone who wanted the same functionality. And finally, about a year and a half ago, Kathy Jacobs, a OneNote MVP, asked if we ever used our powertoys to test with, and this is a great example of a tool we use for testing turned into a powertoy. Talk about getting a lot of bang for the buck.

    Before I describe how this works, I want to point out that this is an internal tool which I hope you find useful. It has a bare-bones UI, and I did not try to optimize it for speed. It's a tool that solves one specific problem and is aimed at people who need that particular bit of functionality.

    To install it, click the download link below. Unzip the file and run setup.exe. Then when you start OneNote, you will notice a button named "FDO" in the toolbar, like this:


    (You'll notice two other addins installed here. Can you identify them?)

    When you press the FDO button, the addin will look through all the pages in the current section - whatever section you are viewing - and identify the pages with embedded files or images. Since this is a test tool designed for our testing purposes, I needed to know which pages had those two specific types of information on them. When it gets done checking, it creates a table on a new page at the beginning of the section with links to the pages with the files I looked for:


    This is from my Unfiled Notes section. I have one page with 2 images on it (one is a screen clipping and the second is one I pasted there) and another page with one inserted file. Again, for testing purposes, I log a count of how many of each file type I have. The links in the left hand column will take you directly to the page.

    If the table looks familiar, I re-purposed Nani's Table of Contents powertoy to create the table and the new page. It worked very well and provided a nice solution of how to display the results of the search. The code will also group the Image pages first in the table.

    Some miscellaneous notes:

    1. FDO stands for "File Data Object." This probably only means anything to us, but I wanted to point it out to be complete.
    2. If you click the FDO button and it turns gray and gets disabled, check this out to ensure you have the .NET programmability support for OneNote installed.
    3. It's not fast, and if you have a large-ish section (30 pages or more), it may take a minute or more to finish.
    4. You can get the source code here.

    Let me know what you think!

    Questions, comments, concerns and criticisms always welcome,


Published Monday, December 22, 2008 10:17 AM by JohnGuin

OneNote Testing : A new powertoy (nee test tool) used to find OneNote pages with embedded files

Community Technology Preview-3 (CTP3) of Windows PowerShell V2


Early Christmas Present from PowerShell Team: Community Technology Preview-3 (CTP3) of Windows PowerShell V2

While Santa and co. are getting busy for Christmas, the Windows PowerShell Team is pleased to release the third Community Technology Preview (CTP3) of Windows PowerShell V2!

First, let us thank you for all your great feedback on CTP1 and CTP2.  This is your product, so never be shy about letting us know what you want from it.  We made quite a few changes based upon your feedback.  That is the benefit of these CTPs: they allow us to change things before we release them.  That also means that if you wrote scripts that used the features we changed, those scripts will have to be modified to run properly.  We'll have some more changes before we release, but we are getting to the end game, so fewer and fewer things will change, by smaller and smaller amounts, going forward.

This release brings, among other things, performance improvements ... things will be faster and more efficient than before. PowerShell remoting now allows implicit remoting, where command execution appears to be local even though the commands run on a remote machine. We have added over 60 new cmdlets in this release ... cmdlets for adding/removing/renaming computers, cmdlets for event logs, cmdlets for WS-Man functionality and even a WS-Man provider. The “graphical” host, Windows PowerShell ISE, now supports a graphical debugger, context-sensitive F1 help and a programmable interface for you to party on.

These are just a few of the new features we have packaged in this CTP3 release. Additionally this CTP3 includes some simple updates... like new parameters to several existing cmdlets. More feature descriptions and details are in the Release Notes and in the “about” topics included with the installation.

A reminder to the brave souls who want to use these bits in a production environment ... don't; these bits are still a CTP. This CTP is not a beta: the software is a pre-release version, it may not work the way the final version will, and these CTP3 bits have not gone through rigorous testing. Even with these caveats, we hope you will try them out and let us know your feedback.

Last but certainly not least, V2 builds upon Windows PowerShell 1.0 by providing backward compatibility – your 1.0 cmdlets and scripts will run on this CTP3 (with the exceptions noted in the Release Notes - mostly new keywords/cmdlets). If a working 1.0 script doesn’t run on V2 and is not in the known list of exceptions, please tell us about it!

Download Windows PowerShell V2 CTP3

Download WinRM 2.0 CTP3 (required for PowerShell remoting)

Hemant Mahawar [MSFT]
Program Manager
Windows PowerShell

Submitting Feedback

Please submit your feedback by using the Connect Website (adding “CTP3:” to the title), posting on the Windows PowerShell Discussion Group, or commenting on the Windows PowerShell Blog.

Published Tuesday, December 23, 2008 1:17 AM by PowerShellTeam

Windows PowerShell Blog : Early Christmas Present from PowerShell Team: Community Technology Preview-3 (CTP3) of Windows PowerShell V2

Sunday, December 21, 2008

NAS Adapter from Addonics


NAS (network attached storage) Adapter

The Addonics NAS Adapter is a convenient and economical solution for adding any USB storage device onto your LAN (Local Area Network). Once on the network, the USB storage can be shared by any network user, just like an ordinary NAS device. When used in conjunction with Addonics Storage Towers or Storage Racks, multi-terabyte storage with various RAID capabilities can be instantly added to the LAN. With the NAS Adapter, you can custom build your own NAS appliance with RAID capability and plenty of storage expansion using the Addonics family of Drive Enclosures, Port Multipliers, and IO converters.

It comes built in with a USB 2.0/1.1 connection and a Fast Ethernet 10/100 Mbps connection, and it supports the SMB (Server Message Block) protocol by way of the open source Samba server, allowing cross-platform access to all shared data for most versions of Windows, Mac OS X, and various Linux distributions. For remote users who are not connected over the LAN, the NAS Adapter provides FTP access for up to 8 simultaneous users anywhere in the world with an internet connection. In addition, the NAS Adapter can also be used as a print server or as a BitTorrent downloading appliance.

Illustration of a NAS Adapter connected over the network

Product Features

  • Converts any USB 2.0/1.1 mass storage device into a Network Attached Storage device
  • Great for adding an Addonics Storage Tower, Storage Rack or any Addonics USB storage device onto the network
  • Great for creating a custom Network Attached Storage appliance
  • USB port can power most 2.5" USB hard drives or any low-powered USB storage device
  • Supports Fast Ethernet 10/100 Mbps
  • Simple to install
  • Small and lightweight; only slightly longer than a C-size battery, so it can be installed practically anywhere
  • Can be set as a DHCP server or client
  • Supports a Samba server for up to 64 concurrent clients
  • Supports an FTP server for up to 8 concurrent users
  • Can be set as a print server to attach any USB printer to the network
  • Built-in BitTorrent client for direct download to the attached USB storage device
  • Can be set as a UPnP AV server to share photo/music files stored on the file server with Xbox 360 video game consoles connected to the LAN
  • User management to allow read-only or read/write access to folders
  • Administrative management access via web browser, with password security
  • Compatible with all Windows OS, Mac OS, Linux 2.6.x and above

Model and Pricing

Model : NASU2 - $55

Package Contents: NAS Adapter, 1 foot CAT5 Ethernet cable, 110/220 5V power adapter (AAPAC5V), User manual, Driver CD


NAS Adapter from Addonics

Checkin’ those components!!



Well, it finally happened: Component Checker 2.0 was posted on Microsoft.com.  I don't know that the actual functionality is earth-shattering, but it is nice to finally see it out there.

“What is Component Checker?” and “Why should I care that it was updated?” are probably right on the tip of your tongue.  Wait for it, wait for it…

OK, no more teasing. Component Checker is a tool that can be used to check the state of your Microsoft Data Access Components (affectionately known as “MDAC” by those who deal with it) installation.  These components are used by all native applications to talk to various types of databases and are basically required parts of any client/server application that involves a database as the server component.  If you want to see exactly what turning off one of the core parts of MDAC does to your system, follow the workaround of unregistering OLEDB32 from MS08-078.  Good luck trying to do anything functional with data of any type!

The cool thing about the newest release of Component Checker is that it can be used to check every OS installation in existence that might have a bad MDAC installation.  This includes all the variants of XP, 2003, and 2000 (2008 and Vista are left off this list intentionally, since it is almost impossible to corrupt their data access stack).  All you have to do is run Component Checker, select “Perform analysis against a selected version” and then select your version of Windows.

You should get something that looks like this:


If your installation is clean, you will see nothing but matches above.

So, you might ask, why do I have mismatches above?  For that, I go back to the reference to MS08-078 above.  Component Checker is not smart enough to know that files that are LATER than the ones it is checking for are OK.  Unfortunately, thanks to the bad guys out there these days, it is pretty common to find later versions of the various binaries pushed out via Windows Update.

However, it is still pretty easy to check your installation.  Just look to make sure that the Mismatch Fields column has a version number less than the Version field:


(click on the image above for a larger view)

As you can see above, Component Checker is looking for 2.81.1117 and I have 2.81.1135 (MSADCE) and 2.81.1124 (MSADCO).  Good to go!
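The version check above is mechanical enough to script. Here is a small, hypothetical Python sketch (my own illustration, not part of Component Checker) of the rule being described: an installed file passes when its dotted version number is greater than or equal to the version Component Checker expects.

```python
def parse_version(version):
    # Turn a dotted version string like "2.81.1117" into a tuple of
    # integers so comparisons are numeric, not lexicographic.
    return tuple(int(part) for part in version.split("."))

def is_acceptable(installed, expected):
    # A file LATER than (or equal to) the expected version is fine;
    # only an older file suggests a broken MDAC installation.
    return parse_version(installed) >= parse_version(expected)

# The versions from the example above: Component Checker expects
# 2.81.1117, but later builds were pushed out via Windows Update.
print(is_acceptable("2.81.1135", "2.81.1117"))  # MSADCE: True
print(is_acceptable("2.81.1124", "2.81.1117"))  # MSADCO: True
print(is_acceptable("2.81.1100", "2.81.1117"))  # older file: False
```

Comparing tuples of integers avoids the classic string-comparison pitfall where "2.81.1117" would sort after "2.81.120".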

Published Sunday, December 21, 2008 8:49 PM by evanba

Everyday Adventures : Checkin’ those components!!

Automatically restart failed Exchange services using PowerShell...



New blog... first post.  Unfortunately I don't have anything groundbreaking or earth-shattering to share.

Oh wait, there's this, at least...

This morning when I arrived at work, I found that my main Hyper-V server had been restarted due to the installation of security patches.  While all 29 of my virtual machines restarted successfully, more than half of my Exchange services failed on each of my seven Exchange servers.  Frustrated with having to deal with this yet again, I sat down for a few minutes and threw together a little script to help automatically start all of those failed Exchange services upon logging onto the server. 

There are two files involved:

  • CheckServices.cmd (stored at C:\Documents and Settings\All Users\Start Menu\Programs\Startup)
  • CheckServices.ps1 (stored at C:\Program Files\Microsoft\Exchange Server\Scripts)


CheckServices.cmd:

@echo off

powershell.exe -PSConsoleFile "C:\Program Files\Microsoft\Exchange Server\bin\exshell.psc1" -command "checkservices.ps1"

CheckServices.ps1:

# Check the health of the required services for the Exchange roles
# installed on this server.
$servicestatus = Test-ServiceHealth

if ($servicestatus.RequiredServicesRunning -match "False")
{
    Write-Host " "
    Write-Host "Server Roles:"
    $servicestatus.Role
    Write-Host " "
    Write-Host "Services Not Started:"
    $servicestatus.ServicesNotRunning
    Write-Host " "
    Write-Host "Starting required services. Please wait..."
    Write-Host " "
    foreach ($service in $servicestatus.ServicesNotRunning) { Start-Service $service }
    Write-Host " "

    # Check health again after attempting to start the services.
    $servicestatus = Test-ServiceHealth
}

if ($servicestatus.RequiredServicesRunning -match "True")
{
    Write-Host " "
    Write-Host "All required services are started..."
    Write-Host " "
}
else
{
    Write-Host " "
    Write-Host "The following services failed to start:"
    Write-Host " "
    $servicestatus.ServicesNotRunning
    Write-Host " "
}
CheckServices.cmd acts as a launcher for CheckServices.ps1, which executes test-servicehealth against Exchange services. If any of the services are in a stopped state for the installed Exchange role(s), the script will attempt to restart them.  If you decide to use this script, you will likely need to modify your file paths.

Yeah, I know. There's a seven-year-old out there somewhere who could have done this in 30 characters or less… Try not to laugh too hard at my l33t scripting skillz! :)

Published Sunday, December 21, 2008 9:20 PM by DaveH

My life as a UC n00b... : Automatically restart failed Exchange services using PowerShell...

Querying Windows Desktop Search


IronPython: Querying Windows Desktop Search

I needed to find some files without resorting to the Desktop Search UI. This IronPython script implements a very simple search that uses System.Data.OleDb to query the Windows Desktop Search catalog:

import clr
clr.AddReference("System.Data")
import System
import System.Data

max_results_to_show = 25    # cap on the number of rows printed
n = 0

wds_connection_string = "Provider=Search.CollatorDSO;Extended Properties='Application=Windows';"
wds_query = "SELECT System.FileName FROM SYSTEMINDEX"
wds_connection = System.Data.OleDb.OleDbConnection( wds_connection_string )
wds_connection.Open( )
wds_command = System.Data.OleDb.OleDbCommand( wds_query, wds_connection )
wds_results = wds_command.ExecuteReader()

while wds_results.Read():
    print wds_results.GetString(0)
    n += 1
    if n >= max_results_to_show: break

wds_connection.Close()


For Example

You can directly paste the code into the IronPython console to see the results…


For more information

Microsoft TechNet: “Seek and Ye Shall Find: Scripting Windows Desktop Search 3.0”


Microsoft Windows Search 3.x SDK


NOTE: You don’t need to download the SDK to run the sample code.

Published Sunday, December 21, 2008 11:20 AM by saveenr

Attachment(s): ironpython-windows-desktop-search-example-(2008-12-21).py

Saveen Reddy's blog : IronPython: Querying Windows Desktop Search

Office 2007 SP2 is on its way



Office Sustained Engineering announced the upcoming release of Office 2007 SP2 here. You can also view the official SP2 press release and learn more about the background and history of SP2 from Gray Knowlton. The exact release date has not been announced, but it is expected to fall between February and April 2009. The biggest news is that SP2 will include support for Open Document Format (ODF), XML Paper Specification (XPS), and Portable Document Format (PDF). Currently, support for XPS and PDF is available as a separate download. SP2 also includes performance improvements and additional interoperability enhancements.

- Andrea Weiss

Office Resource Kit Blog : Office 2007 SP2 is on its way

Welcome to the Dynamics Confessor Blogspot


Welcome to the Dynamics Confessor Blogspot

David Meego: Another one of the Microsoft Dynamics GP MVPs has just created a blog.

So welcome to the blogosphere, Leslie Vail and the Dynamics Confessor Blogspot. Leslie is asking for topics, so please go visit her site and put in your requests.

I have also added the link to the Microsoft Dynamics GP Blogs page and to the Blog Links on the left hand pane of the blog.


Developing for Dynamics GP : Welcome to the Dynamics Confessor Blogspot

Saturday, December 20, 2008

Exchange Activesync protocols now open


Exchange Activesync protocols now open

Published 21 December 08 11:00 AM

A further sign of Microsoft’s commitment to being more open: the protocols for Exchange ActiveSync (EAS) have now been published to the public (previously they were available only to licensees).

EAS is already built into pretty much every major smartphone/PDA out there – but now that it is open I’m sure we will see some innovative and creative apps/tools from the developer community.

Documentation of the EAS protocols is available at http://msdn.microsoft.com/en-us/library/cc307725.aspx

From the press release:

REDMOND, Wash. — Dec. 18, 2008 — Microsoft Corp. today announced it is expanding its Exchange ActiveSync Intellectual Property (IP) Licensing program, facilitated by Version 1.0 releases of technical documentation for protocols built into Exchange ActiveSync, which Microsoft posted on the Microsoft Developer Network (MSDN) earlier this month. The posting of this documentation completes the set of Exchange ActiveSync protocols Microsoft committed to publish as part of its Interoperability Principles announced in February 2008 (http://www.microsoft.com/interop/principles/default.mspx).

“The Exchange ActiveSync IP Licensing program is another example of how we are continuing to deliver on our commitment to increased openness and collaboration,” said Horacio Gutierrez, vice president of intellectual property and licensing at Microsoft. “This technology is being sought out by our partners and competitors alike because it enhances their value proposition to their customers, and we believe that to be a testament to the innovation taking place at Microsoft.”

by jkruse

Johann's Unified Communications : Exchange Activesync protocols now open

Practice Practice Practice


Practice Practice Practice

Yeah, what he said:  http://radar.oreilly.com/2008/12/hard-work-and-practice-in-programming.html

One thing to add: this is not learning by rote. This is discovery and creation that only surfaces through practice. The same applies equally to being a program manager. There is no substitute for practice (and, by extension, experience). I need to go practice now.

Posted: Saturday, December 20, 2008 8:07 PM by Sean Olson

Unified Me : Practice Practice Practice

PowerShell for Failover Clustering in Windows Server 2008 R2


PowerShell for Failover Clustering in Windows Server 2008 R2

Hi Cluster Fans,

In Windows Server 2008 R2 (“R2”) we are introducing PowerShell as the new scripting language for clustering technologies. PowerShell with Failover Clustering will replace Cluster.exe, and the Windows Server 2008 R2 release will be the deprecation release for Cluster.exe. This means it will still be available for use so it doesn’t break legacy scripts, but no improvements have been made, and Cluster.exe will be completely removed in the next release of Windows Server. This allows ample time for you to learn (and love) PowerShell.

PowerShell provides numerous benefits over standard command line interfaces, including easily customizable scripts and the dynamic use of variables. In Windows Server 2008 R2, PowerShell can also be run on Server Core machines. Using PowerShell on a Core cluster, you can directly run cluster Validation and generate dependency reports, without needing to manage the Core node through a UI-based remote machine.

This blog post will provide an overview of PowerShell with Failover Clustering. In the next few weeks, a post about PowerShell with Network Load Balancing (NLB) will be added to the site.

How do I get R2 Beta?

There are numerous ways to get the Windows Server 2008 R2 Beta build which includes Failover Clustering (on Enterprise and Datacenter editions). If you work for an organization which partners with Microsoft, try contacting your Technology Account Manager (TAM) to see if they can provide you with access. If your organization is enrolled in the Technology Adoption Program (TAP) you may also have access through this channel. Others may enroll in the Microsoft Connect program (http://connect.microsoft.com/) to receive access to major builds. The Beta build will be available very shortly for deployment and testing.

We want your feedback!

PowerShell is going to be the cluster scripting language for the future – and you have the opportunity to influence its design and use for the next decade during the Beta feedback period. Some high-level areas of feedback for the PowerShell commands (cmdlets) which we are looking for include the following:

· Was it easy to find the cmdlet you wanted?

· Are the parameters consistent between cmdlets?

· Is the in-box help and example useful?

· Has the PowerShell utility met your scripting needs?

· Was there a cmdlet which did something different than you expected from its name or description?

· Anything else?

We encourage you to provide feedback through the appropriate channels on the Microsoft Connect site, through your TAM, or TAP Program Manager. You may also email your feedback to us via the “Email” link in the upper left corner of this page.

Running Failover Clustering with PowerShell

The name of the Failover Clustering module is FailoverClusters.

Loading PowerShell with Failover Clustering can be done in two ways:

1. Open Failover Cluster PowerShell Management from the shortcut in Administrative Tools

§ This option will appear after the Failover Clustering feature is installed


2. Open PowerShell on your machine through right-clicking and selecting Run as administrator

§ Load the module with the command: Import-Module FailoverClusters

Failover Clustering PowerShell Cmdlets

Cmdlets (“command-lets”) form the basis of the PowerShell instruction set. The design goal was feature parity between the Failover Cluster Manager GUI and PowerShell, so any operation can be performed with either (with cluster migration being the exception, unavailable with PowerShell). Other new R2 Failover Clustering features are also supported by PowerShell, including Cluster Shared Volumes (CSV) and Live Migration. Integration with and manageability of Hyper-V Virtual Machines (VMs) is also important, and PowerShell gives you the ability to create and manage highly available VMs, configure CSV volumes, and perform a Quick Migration (Move-ClusterGroup) or a Live Migration (Move-ClusterVirtualMachineRole).

To get a complete list of the cmdlets, run the following command: Get-Command -Module FailoverClusters

The following is a list of Failover Clustering cmdlets for the Beta release. Note that these are subject to change.

Help Documentation

This all sounds pretty good, right? But what happens when you cannot figure out what command syntax is needed? Well, PowerShell has all the help documentation built into the PowerShell utility itself, and it even includes examples. Please refer to this in-box help when you use PowerShell. This is also an area where we would like your feedback – please let us know if there is anything which you cannot find or you believe was misleading.

To get full help for a cmdlet: Get-Help <cmdlet_name> -Full

Let’s take a look at the Add-ClusterDisk cmdlet:

PS C:\Users\symonp> Get-Help Add-ClusterDisk -Full

NAME
    Add-ClusterDisk

SYNOPSIS
    Make a new disk available for use in a failover cluster. The disk (LUN) must be exposed to all nodes in the failover cluster, and should not be exposed to any other servers.

SYNTAX
    Add-ClusterDisk [-InputObject] <ClusterDiskInfo[]> [-confirm] [-whatif] [<CommonParameters>]

DESCRIPTION
    When adding a disk, make sure that the configuration of the storage allows the operating system to recognize and mount the disk as needed. The disk must be a basic disk (not a dynamic disk) and should not be exposed to servers outside the cluster. The cmdlet Get-ClusterAvailableDisk gets information about disks that you can add to the cluster.

PARAMETERS

-InputObject <ClusterDiskInfo[]>

Required? true

Position? 1

Default value

Accept pipeline input? true (ByValue)

Accept wildcard characters? false

-Confirm [<SwitchParameter>]

Prompts you for confirmation before executing the command.

Required? false

Position? named

Default value

Accept pipeline input? false

Accept wildcard characters? false

-WhatIf [<SwitchParameter>]

Describes what would happen if you executed the command without actually executing the command.

Required? false

Position? named

Default value

Accept pipeline input? false

Accept wildcard characters? false


This cmdlet supports the common parameters: -Verbose, -Debug, -ErrorAction, -ErrorVariable, -WarningAction, -WarningVariable, -OutBuffer and -OutVariable. For more information, type "get-help about_commonparameters".






-------------------------- EXAMPLE 1 --------------------------

C:\PS>Get-ClusterAvailableDisk | Add-ClusterDisk

Name State Group ResourceType

---- ----- ----- ------------

Cluster Disk 7 OnlinePending Available Storage Physical Disk

Cluster Disk 8 OnlinePending Available Storage Physical Disk



This command adds to the cluster all disks that are ready for cluster use but have not been added to the cluster yet.

-------------------------- EXAMPLE 2 --------------------------

C:\PS>Get-ClusterAvailableDisk | ?{ $_.ScsiAddress -eq 50331651 } | Add-ClusterDisk

Name State Group ResourceType

---- ----- ----- ------------

Cluster Disk 4 OnlinePending Available Storage Physical Disk



This command adds to the cluster a disk (with the given SCSI Address) that is ready for cluster use but has not been added to the cluster yet.




To get full help for all Failover Clustering cmdlets: Get-Command -Module FailoverClusters | %{ Get-Help $_.Name -Full }


Here’s a sample PowerShell script where I set some variables, create a highly-available File Server, get some information about the cluster and resources, move the File Server, then delete it.

# Set some variables
$Node1 = "symonp-n1" ; $Node2 = "symonp-n2"
$FileServerGroupName = "symonp-fsBlog"
$FileServerDiskResourceName = "Cluster Disk 1"

# Create a highly available file server
Add-ClusterFileServerRole -Storage $FileServerDiskResourceName -Name $FileServerGroupName

# See which resources are in my group
Get-ClusterGroup $FileServerGroupName | Get-ClusterResource

# Get resources on a specific node
Get-ClusterNode $Node1 | Get-ClusterResource
Get-ClusterNode $Node2 | Get-ClusterResource

# Move the file server cluster group to this node if it is on another node
Get-ClusterGroup $FileServerGroupName | %{ if ( $_.OwnerNode.NodeName -ne $env:computername ) { Move-ClusterGroup -Group $FileServerGroupName -NodeName $env:computername } }

# Delete the highly available file server
Remove-ClusterGroup $FileServerGroupName -RemoveResources

Good luck with your PowerShell deployments and please send us your feedback!


Symon Perriman

Program Manager

Clustering & HA

Published Saturday, December 20, 2008 3:21 AM by msclustm

Clustering and High Availability : PowerShell for Failover Clustering in Windows Server 2008 R2

Friday, December 19, 2008

Introduction to Internet Protocol Version 6


Introduction to Internet Protocol Version 6

While doing some research on IPv6, I ran across this Introduction to Internet Protocol Version 6 webcast that I thought might be of interest to those of you who need to get up to speed on IPv6.

From the website:

This presentation summarizes the content in the Introduction to IPv6 white paper. It provides an overview of Internet Protocol Version 6 (IPv6) and detailed discussions of IPv6 addressing, the Internet Control Message Protocol for IPv6, Multicast Listener Discovery, Neighbor Discovery, and address autoconfiguration.

This is a Level 300 session that was recorded May 18, 2006 and presented by Joseph Davies.

Joseph is a technical writer in the Windows Networking and Device Technologies group at Microsoft. He has been a technical writer and instructor of TCP/IP and networking technology topics since 1992 and has written product documentation, Web content, training courseware, and five books for Microsoft Press that include the award winning title Understanding IPv6.

Published Friday, December 19, 2008 8:56 PM by johnbaker

John Baker's BritBlog : Support WebCast: Introduction to Internet Protocol Version 6




I am working on a project that is churning out the builds, sometimes with just a single change in an assembly (we are in bug-fixing mode).  Doing a full install each time is tedious given that it is a server application with a Windows service, several configuration tools, three Microsoft Office Project Server event handlers, and quite a few dependency assemblies.  Manually hunting through the server replacing files got old, and batch files assumed knowledge about target directories and other configuration information.  I decided to build a patching tool to ease the pain a bit.  I call it QuickPatch.  Last week I wrote about a new GUI for IExpress, which is the foundation on which QuickPatch is built.  Read about IExpress here.


QuickPatch was designed to meet the following requirements:

  • Create a tool that generates self-extracting executables (SEE) taking advantage of IExpress (while updating IExpress’ user experience).  The following requirements are a rehash of some of IExpress’ capabilities:
    • The SEE contains a set of files as specified by the user
    • The SEE may include a license file to be displayed to the user before taking another action
    • The SEE may include a simple prompt to be displayed to the user before taking another action
    • The SEE may include an executable to be run after extraction
    • The SEE may include a second executable to be run after the one mentioned above
    • The SEE may control whether the system is rebooted after the executable is run
    • The SEE may control whether the user is shown any user interface whatsoever
    • The SEE may control whether feedback (via an animated dialog) during the unpacking of the contained files is displayed to the user
  • Create a patching tool delivered via a SEE that has the following capabilities:
    • Given a target directory (we will get to how the target directory is discovered in a minute):
      • Check the set of files included in the SEE to see if any files in the target directory are a match and update them
        • Check file version first if applicable
      • Optionally check the set of files included against files found in subdirectories of the target directory and update them
      • Optionally create a “file map” that allows for the placement of files in the target directory or subdirectories that may not exist (this allows for adding things to the target installation—in my case, I have “agents” that simply need to be dropped in a subdirectory to become active in the installation)
    • Provide for the discovery of the target directory through four mechanisms:
      • Prompt the user to choose the installation directory
      • Scan the machine for the files and replace them wherever they are found (this is useful because my assemblies’ names always match their enclosing namespace, such as “Mcs.Epm.Services.EpmExecutive.exe”)
      • Scan the machine for a single file name. Once found, the location of the file becomes the target directory and the update continues from there
      • Provide a mechanism for taking a “registry hint”, i.e. a registry value that represents the target directory
        • The registry value may contain a directory (for example, “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\12.0\Project\InstallRoot\Path”)
        • The registry value may contain a file name, which will be stripped away to arrive at the target directory
        • The registry hint may also allow for appending a subdirectory (for example, “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\12.0\Project\InstallRoot\Path[MySubDirectory]”)
        • The registry hint may also allow for walking up the path one level (for example, “HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\12.0\Project\InstallRoot\Path[…]”)
    • Includes a bootstrapper that ensures the .NET Framework is installed
      • If not, take the user to a link where it can be downloaded
      • If so, launch the patch component
  • Provide a user interface for creating, building, and managing patches.
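The registry-hint rules above can be sketched in a few lines of Python. This is an illustrative model, not QuickPatch’s actual code: the hint suffixes ([MySubDirectory] to append a subdirectory, [...] to walk up a level) and the file-name stripping follow the description, the hint and path strings are hypothetical, and ntpath is used so Windows-style paths resolve correctly on any platform.

```python
import ntpath  # Windows path semantics, regardless of host OS

def resolve_registry_hint(hint, registry_value):
    """Turn a QuickPatch-style registry hint plus the value read from the
    registry into a target directory. A sketch of the rules described above."""
    target = registry_value
    # A registry value naming a file is stripped back to its directory.
    if ntpath.splitext(target)[1]:
        target = ntpath.dirname(target)
    if hint.endswith("[...]"):
        # Walk up the path one level.
        target = ntpath.dirname(target)
    elif hint.endswith("]") and "[" in hint:
        # Append the subdirectory named in the hint's bracket suffix.
        sub = hint[hint.rindex("[") + 1:-1]
        target = ntpath.join(target, sub)
    return target

# Hypothetical hint and registry values, for illustration only:
print(resolve_registry_hint(r"HKLM\SOFTWARE\Contoso\InstallRoot\Path[Agents]", r"C:\Contoso\Svc"))
# -> C:\Contoso\Svc\Agents
```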
Usage Scenarios

I can envision many ways this can be used, but I will give you a concrete real world example where this has already been used.  I wrote a Windows service that looks to an “agents” directory to periodically load and execute any assembly which implements the IAgent interface.  I pushed a full install to the customer’s environment.  The next day the engagement manager wanted a change to one of the agents’ behavior.  The change was so small that it didn't require a full reinstall so I used QuickPatch to deliver the change.  The client received an executable.  They ran it and the patch was complete.  I used a registry hint (the location of the Windows service) to figure out where the patch should be applied.

QuickPatch Design

There are four major components to QuickPatch:

  1. IExpress – The tool supplied by Microsoft for generating self-extracting executables and running setup programs.
  2. QuickPatch – The WinForms application I wrote for creating, building, and managing patches
  3. QuickPatchInstaller – The application I wrote for applying a patch
  4. QuickPatchBootStrapper – A small C++ program used to detect whether or not the .NET framework is installed before launching QuickPatchInstaller.  QuickPatch is not intended to be an installation tool.  It assumes that you are patching an existing .NET installation, so the framework should be present.  The bootstrapper is just a backstop to ensure the user experience is decent if the framework is not present.

Let’s take a look at 2-4.

The QuickPatch Application

Here is a screen shot of the application with a patch open for editing:


See my previous post for a detailed description of the “Package” tab.  Below is a screen shot of the same screen with the “Patch” tab displayed:


Note the Advanced section has a “file map” defined with several sub directories.

Let’s build the patch:


And run the patch:


Clicking the button changes the user interface to show the log of changes (and errors, if necessary):


In the registry you can see the path used:


And in the file system you can see the results:


More next week.  Have a good weekend.

Published Friday, December 19, 2008 8:53 PM by coafrica

Colby Africa : QuickPatch

WiE project and source code posted to CodePlex!


WiE project and source code posted to CodePlex!

Good Afternoon,

I’ve just completed posting the WiE community project to CodePlex.  As this is my first CodePlex project, I expect that I may have forgotten some steps, so please let me know of any issues you experience and I will work to resolve them quickly.

You can find the project at: http://www.codeplex.com/wie

In the coming weeks I will put together some documentation for developers interested in and willing to contribute to the CodePlex project and turn this sample into a full-fledged location-based social networking application.

Until then you can refer to previous blog articles for a review of the design and implementation of the current WiE release:

Here is a very high level architecture diagram of the components and underlying technologies that make up the current release of WiE.


Have a great holiday and a wonderful new year.



Posted: Friday, December 19, 2008 10:51 PM by olmeyer

Discussions on SQL Server Manageability and Development. : WiE project and source code posted to CodePlex!

.NET 3.0 and 3.5 in Windows Server 2008 R2 Server Core


.NET 3.0 and 3.5 in Windows Server 2008 R2 Server Core

Much like .NET 2.0 in Server Core, .NET 3.0 and 3.5 are also subsets of functionality. Included are:

· Windows Communication Foundation (WCF)

· Windows Workflow Foundation (WF)


The only functionality not included is Windows Presentation Foundation (WPF).

.NET 3.0 and 3.5 functionality is installed with a single package:

· Start /w ocsetup NetFx3-ServerCore

If 32-bit support is needed, you first need to install WoW64 and then the .NET WoW64 support. The following commands assume the command above has already been run:

· Start /w ocsetup ServerCore-WOW64

· Start /w ocsetup NetFx3-ServerCore-WOW64

Published Friday, December 19, 2008 3:51 PM by amason

Server Core : .NET 3.0 and 3.5 in Windows Server 2008 R2 Server Core

Service Pack 1 for DPM 2007 is now available


Service Pack 1 for DPM 2007 is now available

System Center Data Protection Manager

The DPM team is very excited to announce the release of Service Pack 1 for DPM 2007.

What is new in Service Pack 1

DPM 2007 SP1 x86    

DPM 2007 SP1 x64   

SP1 videos on TechNet Edge

Upcoming Webcast on SP1
on January 8, 2009

Service Pack 1 for Microsoft System Center Data Protection Manager (DPM) 2007 provides continuous data protection for Windows application and file servers using seamlessly integrated disk and tape media and includes the following expanded capabilities:

  • Protection of Hyper-V™ virtualization platforms, including both Windows Server 2008 Hyper-V and the Microsoft Hyper-V Server, has been added to the existing set of protected workloads, building on the virtualization protection originally delivered for Virtual Server 2005.
  • Enhanced SQL Server 2008 protection, including the addition of new protection capabilities for mirrored databases, support for parallel backups of databases within a single instance, and the ability to move data from SQL Server 2005 to SQL Server 2008 for migration scenarios.
  • Microsoft Office SharePoint Server 2007 and Windows SharePoint Services 3.0 receive index protection, significant catalog optimization, and support for mirrored content databases.
  • Added protection for Exchange Server 2007 Standby Continuous Replication (SCR), enabling a complete disaster recovery solution that leverages SCR failover alongside DPM point-in-time restores.

In addition to enhancing the protection of each of the core Microsoft application workloads, additional capabilities have also been introduced with the release of DPM 2007 SP1, such as:

  • Local Data Source Protection enabling the DPM 2007 SP1 server to act as a branch office server offering self-protecting File Services and Virtualization hosting within one platform.
  • Cross-Forest Protection allowing large enterprise customers with multiple Active Directory® forests to now have even more flexibility in their DPM deployments.
  • Provision for a Client DPML (Data Protection Management License) answers customer demand for a more cost-effective way to protect Windows XP and Windows Vista clients using the same DPM 2007 infrastructure that protects their servers.
  • Disaster Recovery capabilities within DPM 2007 SP1 now include the ability to leverage a third-party vaulting partner via the cloud (SaaS).

All of this new functionality builds on the features released in the DPM 2007 ‘Rollup Update’ in June 2008, which provided protection of Windows Server 2008, including Windows Server 2008, Windows Server 2008 core, Windows Server 2008 System State and BitLocker™ support – as well as new tape media capabilities around tape sharing and media library sharing. 

Between the ‘Rollup Update’ and Service Pack 1, most of the core features of DPM 2007 have seen incremental capabilities or workload advancements, which promise to keep Data Protection Manager on a trajectory toward improving how Microsoft customers protect and recover their Windows application and file servers with the Microsoft backup and recovery solution.

-- jason

Published Friday, December 19, 2008 2:16 PM by JasonBuffington

Ctrl P - The Data Protection Manager Blog! : Service Pack 1 for DPM 2007 is now available

Sysinternals Updates: Autoruns v9.37, AccessChk v4.23


Updates: Autoruns v9.37, AccessChk v4.23

Autoruns v9.37: This update adds support for viewing the Local System account's profile and adds a new option, Hide Microsoft and Windows Entries.

AccessChk v4.23: Changes the behavior of object manager name parsing to make enumerating the objects in an object manager directory more straightforward.

Published Friday, December 19, 2008 10:44 PM by curtismetz

Sysinternals Site Discussion : Updates: Autoruns v9.37, AccessChk v4.23

How to Manage Remote Employees


How to Manage Remote Employees - Part 1 18 December 08 01:19 PM
Due to the globalization of today's business, managers are faced with having to manage remote employees.  This can be a daunting task, especially if you are new to managing people, let alone remote ones.  Over the next couple of weeks, TechLeaders is going to be sharing a series of tips and ideas that Kelly Pate Dwyer wrote about in 2007.
We would also like to gather your thoughts as well regarding this topic so feel free to comment.
Build a Strong Team, Starting with You
Goal: Make sure you’re up for the task of managing remotely.
Managers who run dispersed teams successfully share several traits. They work a lot, they travel — some more than half the time — and they thrive on their work and the culture they’ve created. “Remote managers need more energy, because a lot of what you have to do is transfer that energy to your team,” says Juliana Slye, who manages remote employees as director of the government division at software maker Autodesk, based in San Rafael, California. The successful remote manager has the following traits:

Passion. A remote set-up won’t work unless your employees are motivated and running in sync — collaborating, asking each other for help, sharing ideas. That energy has to start with you. You don’t need to start each day smiling from ear to ear, but if you’re annoyed every time an IM breaks your train of thought or you’re not good about remembering to check in with people, running remote teams probably isn’t for you.

Availability. Good remote communication requires extra effort. You need to go out of your way to address issues that would come up naturally and spontaneously if you all worked in one place. When your staff is spread across a number of time zones, they need to feel comfortable calling you at odd hours — even if it’s dinner hour. Beyond the guidance or answers you can provide, which allows them to move forward with their work, your availability shows support, which helps strengthen your relationships with everyone. That said, establish reasonable guidelines about when to call.

Patience. A two-hour dinner with an employee across the country may take up two days with travel time. And it may take two hours instead of 10 minutes to schedule a conference call. The lesson here? Budget extra time for common group tasks. This doesn’t necessarily hurt productivity. For instance, conference calls are usually shorter and more to the point than a meeting in person, where members of the group are bound to do more small talk.

Reliability. By doing what you say you’ll do — whether it’s helping solve a problem or sending a new laptop — you foster trust. Your reliability shows respect for what your workers are doing. Without that, they’ll quit asking for help, and you’ll fall out of the loop. “Trust is particularly important in distance relationships,” says management consultant Debra Dinnocenzo, author of “How to Lead from a Distance.” “You build trust through actions that demonstrate reliability, integrity, and familiarity.”

Five Ways to Build Trust

Asked how he makes sure his team is keeping him in the loop, remote manager Dan Belmont, chief marketing officer of the Marketing Arm, a Dallas-based agency that promotes sports and entertainment events, says he makes himself part of their “network” by working beside them. “If you’re in the trenches doing the work,” he says, “you’re not just perceived as someone who is managing people and processes.” Belmont makes himself available to brainstorm or solve problems and typically spends an hour a week on the phone with each of his 14 employees.

Here are more ways to build trust:

1. Be available. Don’t let employee calls go to voice mail. When you absolutely can’t be reached, reply ASAP.

2. Beware of using sarcasm and teasing in distance interactions, like email and conference calls, where signals can easily get crossed.

3. Handle sensitive issues with discretion. One team member might tell Belmont that another is having a bad day. He’ll immediately call the person having the bad day, without exposing the colleague who told him.

4. Communicate in a variety of ways (email, phone, in person, etc.) and often.

5. Visit employees on their turf. It shows respect for their time and interest in their life outside the job.

Managing Remote Employees - Part 2 19 December 08 09:57 AM
The traditional office and the 9-to-5 day have morphed into home offices, teams in different countries, and no standard set hours.  Some employees flourish in a remote role, while others are just not cut out for it.  It takes a certain type of individual to handle working remotely, so you want to be sure that you gather the right people for the remote roles.
Goal: Build a team that can work well at a distance.
A dispersed team depends on people who can be productive without a boss roaming the hallways or a trusted co-worker sitting nearby. Team members should be motivated, disciplined, and flexible with their time, allowing them to connect with clients or co-workers in different time zones. “People who like to quit at 5 p.m. aren’t the people who work well remotely,” says Michelle LaBrosse, CEO of Cheetah Learning, a project-management training company based in Carson City, Nevada. They also need to communicate clearly in writing (since e-mail and instant messaging are the new standard for daily communication) and should be willing to suggest ideas, ask for and offer help, make decisions, and collaborate.  Below are a few suggestions for setting up a remote work arrangement.

Match people to the work.

Extroverts and idea people tend to like tasks that require frequent and ongoing communication. Make sure they’re in an office with teammates they can collaborate with. Introverts and people confident making decisions can work more easily at home or on solo projects.

Match work to the time zone.

If some employees are working while others sleep, try to avoid assigning work that leaves team members perpetually in the hurry-up-and-wait cycle, as their counterparts half a world away complete their part of a project. 

Assign backups.

For the most critical tasks, make sure you or someone else in your group can fill in on a moment’s notice, like when someone is ill or quits. (And make sure you can access a remote worker’s files and contacts from afar.)

Sign an agreement.

Specify when and how much a person may need to work, times they need to be available, performance objectives, and frequency of in-person meetings. This codifies expectations and provides something tangible for your employee to refer back to.


At least a few times a year, ask what’s working and what’s not, then make changes if necessary. Withdrawal is a common sign of a problem. Even if a person is meeting deadlines and producing quality work, they may be unhappy if you hear from them less and less.

Connecting the Dots.

After studying dozens of virtual teams, including groups at BP, Nokia, and Ogilvy & Mather, researchers at the London Business School recommend the following:

Recruit volunteers.

Look within the company for volunteers to lead a new committee or research a new opportunity, rather than just assigning such tasks. “Virtual teams appear to thrive when they include volunteers with valuable skills — people whose proof of commitment is their willingness to join the team on their own,” writes Lynda Gratton, a professor of management with the school, in a 2007 Wall Street Journal article.

Add “boundary spanners” to each virtual team.

“Boundary spanners are people who, as a result of their personality, skills, or work history, have lots of connections to useful people outside the team,” Gratton writes. They play a strong networking role, keeping the team and its accomplishments visible within the company. Nokia cultivates boundary spanners by introducing each new hire to at least 10 people both inside and outside their department.


