Saturday, January 31, 2009

Fannie Mae Logic Bomb Would Have Caused Weeklong Shutdown



By Kevin Poulsen January 29, 2009 | 1:41:19 PM


A logic bomb allegedly planted by a former engineer at mortgage finance company Fannie Mae last fall would have decimated all 4,000 servers at the company, causing millions of dollars in damage and shutting down Fannie Mae for at least a week, prosecutors say.

Unix engineer Rajendrasinh Babubha Makwana, 35, was indicted (.pdf) Tuesday in federal court in Maryland on a single count of computer sabotage for allegedly writing and planting the malicious code on Oct. 24, the day he was fired from his job. The malware had been set to detonate at 9:00 a.m. on Jan. 31, but was instead discovered by another engineer five days after it was planted, according to court records.

Makwana, an Indian national, was a consultant who worked full time on-site at Fannie Mae's massive data center in Urbana, Maryland, for three years.

On the afternoon of Oct. 24, he was told he was being fired because of a scripting error he'd made earlier in the month, but he was allowed to work through the end of the day, according to an FBI affidavit (.pdf) in the case.  "Despite Makwana's termination, Makwana's computer access was not immediately terminated," wrote FBI agent Jessica Nye.

Five days later, another Unix engineer at the data center discovered the malicious code hidden inside a legitimate script that ran automatically every morning at 9:00 a.m. Had it not been found, the FBI says the code would have executed a series of other scripts designed to block the company's monitoring system, disable access to the server on which it was running, then systematically wipe out all 4,000 Fannie Mae servers, overwriting all their data with zeroes.

"This would also destroy the backup software of the servers making the restoration of data more difficult because new operating systems would have to be installed on all servers before any restoration could begin," wrote Nye.

As a final measure, the logic bomb would have powered off the servers.

The trigger code was hidden at the end of the legitimate program, separated by a page of blank lines. Logs showed that Makwana had logged onto the server on which the logic bomb was created in his final hours on the job.
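The hiding technique described here, real commands appended after a page of blank lines, is straightforward to screen for. Below is a hypothetical Python sketch (my own illustration, in no way the actual scripts involved in the case) that flags a script where code resumes after a long blank gap:

```python
# Hypothetical detector: flag a script whose executable lines resume
# after a suspiciously long run of blank lines (the concealment trick
# described in the affidavit). The threshold and logic are my own choices.

def find_hidden_tail(script_text, min_gap=25):
    """Return the 1-based line number where code resumes after a run of
    at least min_gap blank lines, or None if nothing looks hidden."""
    blank_run = 0
    for i, line in enumerate(script_text.splitlines()):
        if line.strip() == "":
            blank_run += 1
        else:
            if blank_run >= min_gap:
                return i + 1
            blank_run = 0
    return None

clean = "echo nightly job\nexit 0\n"
suspect = "echo nightly job\n" + "\n" * 30 + "wipe_all_servers\n"
print(find_hidden_tail(clean), find_hidden_tail(suspect))  # None 32
```

A periodic sweep of scheduled jobs with something like this could surface such a payload well before its trigger date.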

Makwana is free on a $100,000 signature bond. His lawyer didn't immediately return a phone call Thursday.

(Updated January 30, 2009 | 3:00:00 PM to correct Makwana's employment information)


Photo:  Fannie Mae's data center in Urbana, Maryland

Fannie Mae Logic Bomb Would Have Caused Weeklong Shutdown | Threat Level

A Dashboard for Financial Controllers' KPI's



Microsoft IT has a dual role. They must not only execute projects and programs, but also define how it was done in a replicable model for others to use. One valuable item to come from this mission is the dashboard that was created for Microsoft finance to monitor key performance indicators (KPIs) on a real-time basis using Excel Services on SharePoint.

What has been published is an overview, a technical white paper on how it's done, and a couple of demos as well.

Here is a link to all of it so you can establish a controller portal of your own.

Published Friday, January 30, 2009 9:18 PM by Craig Swartz

Craig Swartz on Microsoft : A Dashboard for Financial Controllers' KPI's

Friday, January 30, 2009

Visualizing O'Reilly with Silverlight Deep Zoom



Motley says: "To be a lead developer, all you need is technological know-how"




Motley: To be a lead developer, technical skills are absolutely the most important. Everything else is secondary, tertiary, and whatever word comes next.

Maven: A lead developer must lead from several different perspectives, including people, process, and technology. To properly balance technology and people, build a great team, learn to delegate, and break up your responsibilities.


[Context: Maven and Motley are hanging out at lunch debating what it takes to be a good lead developer]

Motley: No, no, no - you're wrong. The absolute most important skill that a lead developer possesses is technical skills. Everything else is far down the priority list. If I cannot make technical decisions then I am worthless.

Maven: Well, I partially agree-

Motley: You can't agree! You just said technical skills were not most important!

Maven: Let me finish, please. Remember to practice "seek first to understand, then to be understood" instead of "listening to respond", but I digress.

Motley: Blah, blah, blah. Get on with it.

Maven: The phrase "lead developer" means different things to different people. If you are leading a small team of 3-4 developers, then I agree with you, a lead can focus more on the technical aspects of the job, although not completely at the expense of some other areas. However, if you lead a team of 10 developers, that typically does not leave much time for technical stuff and you will spend more time managing people. Note that I am using the term "lead" and "manager" here together - my definition of a "lead developer" assumes that you have direct reports in the organizational structure, and thus, have management duties.

Motley: I can see job responsibilities varying by size of team, but I still think technical stuff is the most important. The phrase "lead developer" still has the term "developer" in it after all.

Maven: True, and you are still expected to be solid technically, but there are other aspects to development, as you know. A lead developer must lead from several different perspectives, including people, process, and, of course, technology. You are a key decision maker in your role and you represent the business, but not at the expense of the people. People issues often have to come first. You are the one expected to grow their careers after all.

Motley: I admit I have other responsibilities than technology. But my team of 10 people is doing pretty well. I don't have to focus on managing a whole lot. I can concentrate on writing code.

Maven: Who mentors your people? Who grows their careers? Who manages the dependencies of your team? Can you possibly attend all technical meetings? Who handles recruiting new people? Who attends upper management meetings? Who reviews all the designs? Who triages the bugs? Perhaps you could do all of that, but you would work a very long day and probably do a half-assed job at everything.

Motley: I just find that a lot of that stuff takes care of itself. My boss, however, has been screaming at me a bit more lately, I must admit. But I just love the technical stuff and would rather spend time on that!

Maven: Loving technology is great, but you have to balance your time between technical leadership and management. Your job involves both axes: leading and managing. If you don't enjoy both pieces, then why are you a lead? Companies these days have growth paths for individual contributor developers so as not to force them into management.

Motley: I do like the other stuff, but I like the technical stuff more.

Maven: That's fine. You just have to balance technology and people. In order to do that, there are a few keys:

  • Delegate: You cannot do everything. Delegate tasks when necessary and fully trust the person you are delegating to. Additionally, do not just delegate the work associated with the task - you should delegate ownership of that task. Make the person completely accountable for finishing it.
  • Build a great team: Surround yourself with highly diversified good people. As a lead with a reasonable-sized team, you likely need to give up some of the technical stuff. To compensate, ensure you have one or two senior technical gurus on your team to delegate decisions to.
  • Break up your responsibilities. Perhaps ask for another lead at your level of the organization so that you can have a smaller team allowing you to focus on technology more.

Motley: Not bad, Mave. Good suggestions overall. Perhaps what I can do is concentrate on making the key decisions, focus more on design, and give myself some smaller development tasks that are off the critical path of the project to keep my skills fresh.

Maven: Now you're talking! That will free up some of your time for managing careers in addition to the product, mentoring, reviewing the work of the team, refining and enhancing team processes, managing relationships between your team and other teams in the company, and overall making the team a well-oiled machine. Of course, we have been mostly talking about management here and less about leading. We should follow up on this conversation one of these days and expand on that. We could also talk a lot more about being a good manager.

Motley: I will be a better dev lead tomorrow. Glad I came up with some refinements to my working style.

Maven: Do you think I had something to do with your refinements in this conversation?

Motley: Not a chance.

Maven: Figures.


Maven's Pointer: As a lead developer at Microsoft, I spend far less time in the source code than I have in the past. Let me clarify - I am in the source code quite a bit, just not contributing new code. Instead, I have chosen to focus more on design issues and code reviews when I find the time. However, if I had far fewer than 10 reports, that would allow me to be closer to the code and even be a contributing developer, although not at the expense of the people I need to help grow. Different people have differing opinions on exactly what the role of a lead developer is, and truthfully, it really depends on the specific situation. However, most "leads" are also managers, and caring about your people and removing distractions from their everyday work is a high priority. Let me know if you want to discuss leading and managing more, in the context of developers of course.

Maven's Resources:

Posted: Friday, January 30, 2009 7:57 AM by James Waletzky

Progressive Development : Motley says: "To be a lead developer, all you need is technological know-how"

It’s nice to be wanted!



Published 30 January 09 08:43 AM

Personally, I’ve always believed that “It’s the data, stupid!” when it comes to enterprise application development. I’m gratified to see that enterprises are figuring this out! It’s unfortunate that salaries seem to be slumping in all but three IT areas, but I’m glad to be part of one that’s not!

That counter-trending is what Foote called "an urgent demand for talent" in three areas: management/methodology/process, database, and messaging and communications skills. ("IT pay up in three areas, down elsewhere")

As I’m sitting on the review board today for prospective MCA|SQL Server candidates (formerly known by the less politically correct moniker “SQL Rangers”), I see yet more validation that companies are learning more every day that the data is where the real value is. The database architects that we certify through the MCA program are top-notch, all-around gurus; most of them come from previous lives with deep experience in non-Microsoft database platforms, and many of them know SQL Server better than they know themselves!

Just remember, there are lots of ways to get to the data… and it doesn’t always have to be a web app!

by reedme

Reed Me : It’s nice to be wanted!

Tips too good to pass up



Have I got a tip for you…

This was previously announced in the newsletter (and quite popular, I must say), but if you haven’t seen it yet for yourself, the TechNet Magazine site is now running a Tip of the Day feature, with a new tip offered every business day. To make them a little easier to organize and digest, these are not random tips. The editors at the magazine have organized them around monthly themes. December was Windows Vista month and January featured SQL Server tips. Check out the TechNet Magazine Tips index for all past tips and bookmark the site to stay current on new tips. What's the tip theme for February? Stay tuned!

More to come,


Subscribe to the TechNet Flash newsletter

Published Friday, January 30, 2009 10:18 AM by mitchirs

TechNet Flash Feed : Tips too good to pass up

Windows XP, Vista, 7…what should my desktop strategy be?



Published 30 January 09 04:47 PM

So, the past several years have seen some interesting developments in the Windows client lifecycle.  Many customers are taking a hard look at their current environment, the overall technology landscape, and the Windows roadmap.  Now is a good time to try to determine what the best course of action is with respect to the Windows platform.  To that end, I thought that it would be prudent to put some thoughts in writing – some of them the official corporate stance, some of them my own personal spin.  Here we go…

Microsoft Windows 2000

Windows 2000 has largely gone by the wayside on the client – thankfully so, as support for 2000 ends on July 13, 2010.  Anyone who is convinced that we’ll extend that date is playing with fire. Yes, we did so with NT4, but that shouldn’t be considered a precedent-setting event.  Retire those machines this year so you don’t have a fire drill in 2010 – especially if your company is subject to SOX, PCI, etc.

Windows XP

Windows XP is still going strong – the most popular version of Windows from a deployment perspective, with extensions to the OEM licensing deadlines and the introduction of lower-cost netbooks having an impact.  That being said, there’s definitely a level of comfort out there in the enterprise for Windows XP.  It’s been around for over 7 years now.  ISVs and OEMs have been developing applications and drivers on this platform for a long time, and business users and consumers alike are familiar with the interface.  At the same time, let’s think back to what was happening in or around 2001: Apple introduced the iPod and OS X, Google had just 8 employees two years earlier (they have almost 17,000 now, if you were wondering), 3G cell service was first launched (in Japan), a 1 GHz processor was very fast, and you almost undoubtedly used the modem in your laptop with great frequency (and maddening slowness).  Times have changed, but you’re probably still running the same version of Windows that became available back then.  Imagine if you were still using a first-generation iPod or had a 1 GHz Celeron under your desk!  Of course, moving music to a new device is different than upgrading an operating system, but you get the idea.

Service Pack 3 was released last spring, and as a consequence, SP2 support will be retired on July 13, 2010.  Put that date in your calendar and consider it D-Day for legacy Windows!  Windows XP leaves Mainstream Support and enters Extended Support in just a few months - on April 14, 2009.  For the full breakdown on what this means, I encourage you to speak with your Technical Account Manager, if applicable, and/or visit the front page of the Microsoft Support Lifecycle site.

Windows XP with SP3 will be supported until April 8, 2014.  So that gives XP environments some breathing room, but not necessarily as much as you might think (more on that in a minute).

Windows Vista

Windows Vista is a loaded topic.  No one that I’ve met feels ambivalently about it, for one reason or another.  Admittedly, the launch did not go as well as we would have hoped.  We made huge changes ‘under the hood’ that made it difficult for driver and application developers to ramp up – as a result, the Windows ‘ecosystem’ was not prepared.  Vista itself had some problems with performance and stability that were fairly well publicized (and the Mac vs PC ads definitely didn’t help)…so 2007 was a rough year for adoption.  The beginning of 2008 saw the release of Service Pack 1, and a year’s time had given our partners a chance to get their hardware, drivers, and applications updated for the latest operating system release.  So, the folks who have seen and used Vista over the past twelve months probably have had a good experience with it.

We’re finding that the major problem out there today is public perception.  Go to if you don’t believe me.  All that being said, enterprise adoption is about what we saw with Windows XP two years after its release.  The one recurring theme in discussions with corporate customers is that application compatibility is a problem.  Applications may not run in Vista, or maybe they can, but it’s not supported by the vendor.  Remediation will be costly and time consuming.  We get it.  Many of the acquisitions and investments we’ve made in the past few years are targeting that problem specifically (Application Virtualization – SoftGrid, Enterprise Desktop Virtualization – Kidaro, etc.).  Operating systems are traditionally tied to hardware, user data, and applications.  We want to decouple them so that it is feasible and relatively easy to perform an in-place OS migration.  Our Desktop Optimization Pack technologies are a must-have for those considering an upgrade any time soon.

On the positive side, there should be no one out there who can deny the security enhancements Vista brings to the table.  It’s far and away the most secure OS we’ve released to date (the first time that we’ve gone from drawing board to release with security as priority #1).  Mobility functions and power-saving features are also prime examples of the benefits that you can reap from Vista.  The 64-bit release is very solid, and people should be looking in that direction as memory grows cheaper and cheaper.  Any modern hardware (2-3 years old) with 2 GB of RAM or more should run Vista quite happily.

Windows 7

The hot topic.  The Windows 7 beta release several weeks ago was met with overwhelming interest (we couldn’t handle the initial flood of download requests).  The reviews are almost all positive – even Walt Mossberg of the Wall Street Journal had good things to say.  And he’s not, ahem, usually all that much of a Microsoft fan.  Millions of people are out there running it on thousands of different hardware configurations, and I’ve been running it myself on laptops, desktops, and server-class machines with minimal hiccups.  The caveat to this early success is that we’re hearing from a lot of folks, ‘Why should I upgrade to Vista when Windows 7 is right around the corner?’  Well, the answer to that is what I’ve been building up to.  If we look at it from the perspective of an enterprise with fairly unaggressive adoption cycles, then you’ll see that you may be putting yourself in an untenable situation a few years down the road.  For the sake of argument, make these assumptions:

  • Company A doesn’t deploy new operating systems or major applications until Service Pack 1 (or a similar bug-fix milestone) has been provided by the vendor
  • Company A probably won’t even begin testing their application footprint against the new OS until said SP1 is available
  • Windows 7 ships in the fourth quarter of 2009
  • Service Pack 1 for Windows 7 would likely not be final until the first half of 2011, if not later (going by our historical timelines for SP1 releases)

So, Company A would begin testing migration from Windows XP to Windows 7 SP1 sometime in 2011.  How long would it take to perform adequate testing of your application suite to certify/remediate it for Windows 7?  For most, this is at least a 6 to 12 month process…so now we’re in mid-2012.  At that point, you’re ready to start building an image (hopefully using the MDT to make your lives easier).  Maybe the image is ready to go in early 2013.  Then you have a little over a year to get it out company-wide before Windows XP hits end-of-life.  Is that enough time?  Perhaps…but is it worth backing yourself into a corner?
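The back-of-envelope timeline above can be sketched in a few lines; the milestone dates below are this post's assumptions, not official commitments:

```python
# Rough sketch of the post's migration timeline for "Company A".
from datetime import date, timedelta

win7_sp1   = date(2011, 6, 30)               # assumed: SP1 in "the first half of 2011"
app_tested = win7_sp1 + timedelta(days=365)  # 6-12 months of app certification
image_done = date(2013, 1, 1)                # image "ready to go in early 2013"
xp_eol     = date(2014, 4, 8)                # Windows XP SP3 end of support

rollout_days = (xp_eol - image_done).days
print("company-wide rollout window: about", rollout_days // 30, "months")
```

A little over a year of rollout window for an entire enterprise fleet is exactly the squeeze the paragraph above describes.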

A few points to make in conclusion.  Thanks for sticking with me this long.  :)

  • While Windows 7 is a major release from our perspective, it does not involve any huge architectural changes, so I personally feel that the ‘waiting for SP1’ methodology is unnecessary.  Given how aggressively we’ve been working with the partner ecosystem, I would expect that they will be much more ready to support Windows 7 at the time of availability than any prior release of Windows.  Based on the timelines that I described above, I would also argue that it’s a greater risk to sit on XP and wait for Win7 SP1 than it is to go to Vista and/or Win7 RTM.
  • One of the primary tenets of Windows 7 development is maintaining compatibility with Windows Vista applications and drivers. 
  • Starting to test against Vista now will bear fruit in the long run – you won’t have to perform two runs of testing on Vista and Win7
  • Deploying Vista now to at least some subset of systems will provide you with experience that will be valuable for Windows 7, and will let you take advantage of the much-improved security, mobility, and power-saving features in Windows Vista today
  • The UI change going from XP to Win 7 will be greater than going from Vista to Win7.  I know that user training is a very important topic to a lot of folks.
  • Start to investigate desktop virtualization now.  If you have a significant portion of desktops (as opposed to mobile PCs), then a VDI scenario warrants serious consideration.  We have partners such as Quest and Citrix that have solutions in this arena that make it a reality *today*.  I’ll be writing up a Windows VDI post soon to cover technology and licensing requirements.
  • Analysts such as Forrester and Gartner are already on the record as saying that customers should not plan to skip Windows Vista
  • The Desktop Optimization Pack technologies are going to make OS migrations much, much easier than what was dealt with in the past
  • The Microsoft Deployment Toolkit (Formerly the BDD) provides the tools necessary to build and customize images very easily, with simultaneous administrator ease-of-use and handling of complex scenarios

All in all, please remember that Microsoft is here to assist with your OS migration and testing.  Whether you would like someone to help get your organization going with the MDT, application compatibility testing, Windows Vista or 7 readiness – we have resources that can help (at no cost).  Give me a shout to get the process started.

by mfiorina

NorthEast Technology Landscape : Windows XP, Vista, 7…what should my desktop strategy be?

TechNet Magazine: February 2009 Issue Available



The February 2009 edition of TechNet Magazine is now online and highlights business intelligence and what you can do to make your data smarter.  Some of the highlights are below.

  • Microsoft Office – Business Intelligence with SharePoint and Excel – In my opinion, SharePoint and Excel are two of the most powerful products we make; however, most people don’t know how to take full advantage of them.  This article goes into some detail about how to use both of these products along with Excel Services to build a comprehensive BI solution with tools you may already have.
  • SQL Server – Understanding Logging and Recovery – Logging and recovery are not the most exciting features in SQL Server, right?  But they are critical to any successful deployment, and often misunderstood.  This article goes into some detail about these two critical features, which will help move your business along the path to a better-performing database.
  • Virtualization – Automating Virtual Machine Host Deployment – Ah, one of my favourite topics of all.  This article talks about how Hyper-V in Windows Server 2008 is simple to deploy and redeploy with just a few clicks.  There is even some sample code for you to have a look at.
  • Microsoft Dynamics – Deploying Microsoft Dynamics CRM 4.0 – This is not a topic I know much about, but one that is gaining momentum all the time in the enterprise.  This article goes into detail about planning the right solution for your business.

Of course, as always, there is heaps more content in this edition.  Make sure to check it all out when you get a chance!


Posted: Saturday, January 31, 2009 10:34 AM by jeffa36

Jeff Alexander's Weblog : TechNet Magazine: February 2009 Issue Available

MVPs recognised as Dynamic Writers!




Microsoft MVPs Mariano Gomez and Mark Polino have been recognised as writers of two of the most viewed/read articles of 2008! The site launched in January 2008 and has since attracted a large number of Microsoft Dynamics experts who frequently write articles on a wide range of topics relating to Dynamics. The site’s editorial team judged Mariano’s and Mark’s articles as most “useful” to the community!

Read all of Mark’s stories here and Mariano’s full posts here

MVP Mariano Gomez is a Microsoft MCP, PMP and the CEO and founder of Maximum Global Business. He is the original developer of the Microsoft Dynamics GP Spanish release for Latin America and has been consulting and implementing technology solutions for organizations across the United States, the Caribbean, and Latin America for the last 12 years.

MVP Mark Polino is a Senior Consultant with I.B.I.S., Inc., the 2007 Worldwide GP Partner of the Year. He regularly blogs on a range of topics related to Microsoft Dynamics here.

Published Friday, January 30, 2009 9:07 AM by Jas Dhaliwal

The Microsoft MVP Award Program Blog : MVPs recognised as Dynamic Writers!

Extracting and Loading SharePoint Data in SQL Server Integration Services



SSIS developers often wonder how they can get information into, and out of, SharePoint lists. There is a solution.

Look at for an explanation of the SharePoint List Source and Destination Sample available on CodePlex.

Published Friday, January 30, 2009 10:17 AM by mepprecht

Swiss IT Professional and TechNet Blog : Extracting and Loading SharePoint Data in SQL Server Integration Services

Visio/IronPython/Powershell – Part 3 – Shapesheet fun



See my previous posts:

Launch Visio 2007 and draw some shapes


The window opened larger than I wanted. Let’s resize it to something nicer

>>> vi.Dev.ResizeWindow( 1024, 768 )


Much better.

Now manually draw some shapes and set some formatting.


Select all the shapes, get all the cell values, and send them to Excel.

>>> vi.Select.All()
>>> cells = vi.ShapeSheet.GetCells()
>>> vi.Data.ExportToExcel( cells )

Excel 2007 will launch


Each row is a cell.

The columns:

  • A = the shape name
  • B = the shape ID
  • C = the name of the section the cell is in
  • D = the row index of the cell
  • E = the cell name
  • F = cell’s formula value
  • G = cell’s result value
  • H = cell’s result value cast to a double

You can export the data in other ways

  • vi.Data.ExportToCSV( cells, "shapesheet.csv" )
  • vi.Data.ExportToExcelXML( cells, "shapesheet.xml" )
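The CSV form is easy to post-process. As a rough illustration, here is how the exported rows could be consumed in Python; the header names below are my own labels for columns A through H, not anything the tool emits:

```python
# Hedged sketch: read shapesheet rows exported as CSV, mapping each row
# onto the A-H column layout described above. The sample line stands in
# for a real export file.
import csv
import io

COLUMNS = ["shape_name", "shape_id", "section", "row_index",
           "cell_name", "formula", "result", "result_as_double"]

sample = 'Sheet.1,1,Shape Transform,0,PinX,"4 in","4 in",4.0\n'
rows = [dict(zip(COLUMNS, r)) for r in csv.reader(io.StringIO(sample))]
print(rows[0]["cell_name"])  # PinX
```

With rows as dicts, it is trivial to filter by section or cell name before feeding the data anywhere else.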

Sometimes when working with a shapesheet, it’s convenient for shapes to display their shape names …

>>> vi.Text.SetToField( VA.Fields.ObjectName )


Let’s make the text a little bigger

>>> vi.Text.SizeUp()
>>> vi.Text.SizeUp()


Or we can set the text size to a specific value

>>> vi.Text.Size = 40


Now let’s manually connect some shapes with the Dynamic Connector


In this diagram:

  • Sheet.1 is connected to Sheet.2
  • Sheet.1 is connected to Sheet.3
  • Sheet.2 is connected to Sheet.3
  • Sheet.2 is connected to Sheet.4

Let’s discover this programmatically

>>> pairs = vi.Connect.GetConnectedShapePairs()
>>> for pair in pairs:
...     print pair.ConnectedShape0.Name, "is connected to", pair.ConnectedShape1.Name, "by way of", pair.ConnectingShape.Name

This will print …

Sheet.1 is connected to Sheet.2 by way of Dynamic connector
Sheet.2 is connected to Sheet.3 by way of Dynamic connector.6
Sheet.4 is connected to Sheet.2 by way of Dynamic connector.7
Sheet.3 is connected to Sheet.1 by way of Dynamic connector.8

Using this information, and looking up the line endpoints for the connectors, you can construct a directed graph.
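As a sketch of that idea, the pairs printed above can be folded into an adjacency list. Pair below is a stand-in holding plain strings; the real objects returned by vi.Connect.GetConnectedShapePairs() expose shape objects with a .Name property:

```python
# Build a directed adjacency list from connected-shape pairs.
# Pair is a simplified stand-in for the objects used in the session above.
from collections import namedtuple

Pair = namedtuple("Pair", ["ConnectedShape0", "ConnectedShape1", "ConnectingShape"])

pairs = [
    Pair("Sheet.1", "Sheet.2", "Dynamic connector"),
    Pair("Sheet.2", "Sheet.3", "Dynamic connector.6"),
    Pair("Sheet.4", "Sheet.2", "Dynamic connector.7"),
    Pair("Sheet.3", "Sheet.1", "Dynamic connector.8"),
]

graph = {}
for p in pairs:
    graph.setdefault(p.ConnectedShape0, []).append(p.ConnectedShape1)

print(graph)
```

Which connector shape joins each pair is also available (p.ConnectingShape), so edges could carry labels if needed.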

Finally, we can manually set cells, for example the PinX cell.

Select all the rectangles


>>> vi.ShapeSheet.SetFormula( Cells.PinX, "0" )

And you’ll see all the shapes have moved


We could likewise manually set their color

>>> vi.ShapeSheet.SetFormula( Cells.FillForegnd, "rgb(0,0,255)" )


Published Thursday, January 29, 2009 9:43 PM by saveenr

Saveen Reddy's blog : Visio/IronPython/Powershell – Part 3 – Shapesheet fun

Thursday, January 29, 2009

Hyper-V R2 Live Migration Overview & Architecture



There is a new whitepaper that details how live migration in Windows Server 2008 R2 and Microsoft Hyper-V Server R2 works:


Published Thursday, January 29, 2009 5:22 PM by Virtual PC Guy

Virtual PC Guy's WebLog : Hyper-V R2 Live Migration Overview & Architecture

Code Snippets in Visual Studio



Recently I found myself writing a ton of VB code; however, I am working on a new gig and it requires me to switch over to C#.  The tedious task of remembering to type "foreach" or not typing "dim" has caused me much frustration.  A little tip that will save time and effort, and possibly help re-wire my brain so that I can be more efficient, is to use Code Snippets.  Code Snippets in VS 2005 or VS 2008 are really one of the coolest things to make you productive.  To use snippets, check out the following, straight from the Tips and Tricks on MSDN.

Code Snippets

Code snippets are one of the best productivity features introduced in Visual Studio 2005. They allow you to quickly insert fragments of code to avoid tedious typing (such as typing a for loop) or to give you a template of how to accomplish a certain task (such as sending data over the network). Most of the built-in C# snippets are of the first type – they help you minimize repetitive typing – while most of the built-in VB snippets are of the second type – they let you code up a specific task more easily.

There are two ways to insert a snippet. You can type the snippet's alias in the code editor and press Tab twice (you only need to press Tab once for VB) to insert the snippet immediately. After the code snippet has been inserted, you can press Tab and Shift+Tab to jump to different fields within the snippet. This allows you to quickly change the parts of code that need to be modified. Notice that in C#, code snippet aliases also have IntelliSense. You can tell that an item is a code snippet in the IntelliSense list by its snippet icon.

Figure 8. IntelliSense fully supports code snippets

If you don't remember your code snippet's alias, you can also insert it by pressing "Ctrl+K, Ctrl+X" within the code editor, or by right-clicking with the mouse and selecting Insert Snippet.... This shows the code snippet picker, which enables you to browse all the snippets that are applicable to your current programming language and to choose the one you want to insert. This method of inserting code snippets works for both C# and Visual Basic. Visual Basic users have yet another way to insert snippets: type the first few letters of a snippet alias, followed by "?", and press Tab. Visual Studio will display an alphabetical listing of all the code snippet aliases with the most closely matched one highlighted. This feature is available only for Visual Basic users.
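If the built-in snippets aren't enough, you can write your own: a snippet is just an XML file saved with a .snippet extension and registered via the Code Snippets Manager (Ctrl+K, Ctrl+B). Here is a minimal, hypothetical C# example; the shortcut and literal names are my own, only the schema is the standard VS 2005 snippet format:

```xml
<?xml version="1.0" encoding="utf-8"?>
<CodeSnippets xmlns="http://schemas.microsoft.com/VisualStudio/2005/CodeSnippet">
  <CodeSnippet Format="1.0.0">
    <Header>
      <Title>Indexed for loop (example)</Title>
      <Shortcut>myfor</Shortcut>
    </Header>
    <Snippet>
      <Declarations>
        <!-- $index$ becomes a Tab-navigable field after insertion -->
        <Literal>
          <ID>index</ID>
          <Default>i</Default>
        </Literal>
      </Declarations>
      <Code Language="csharp">
        <![CDATA[for (int $index$ = 0; $index$ < length; $index$++)
{
    $end$
}]]>
      </Code>
    </Snippet>
  </CodeSnippet>
</CodeSnippets>
```

Type the shortcut ("myfor" here) in the editor, press Tab twice, and the expansion appears with the literal fields ready to edit.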


Happy coding!

Posted: Thursday, January 29, 2009 2:42 PM by Karimka

Coding Times of a .Net Developer from Yard : Code Snippets in Visual Studio

Fixed Disks vs. Physical Disks



I recently was setting up a Home Server on top of Hyper-V (Note: this is not a supported configuration, so if you wish to do the same - be it on your head) and had to decide between using physical disks directly attached to the virtual machine, or using fixed size virtual hard disks.  I had 5 SATA disks that I wanted to connect to the Home Server - and in the end I decided to use fixed size virtual hard disks where each physical disk had a single fixed size virtual hard disk that took up all the space available.

There were a number of things that I considered before coming to this decision:

  1. Performance: There is a big performance difference between dynamically expanding virtual hard disks and fixed size virtual hard disks.  But there really is not much difference between fixed size virtual hard disks and physical disks, so this did not really influence my decision here.
  2. Mobility: This is an interesting one, as it can go either way depending on your hardware.  In my case - using direct attached SATA storage - fixed size virtual hard disks are easier to move around.  If - however - you are using network based storage or SAN infrastructure, then physical disks are easier to move around.
  3. Backup: This was a big one for me.  If you use physical disks you cannot use VSS to backup your virtual machine - but you can if you are using fixed size virtual hard disks (similarly - you cannot use virtual machine snapshots with physical disks - but using virtual machine snapshots with fixed size virtual hard disks is not a good idea either).
  4. Data Safety: I spent a while pondering this one.  The question is which option is least likely to be significantly affected by random data corruption.  As I discussed a little while ago, the fixed size virtual hard disk format is mostly just data - and corruption there would have the same effect as it would on a physical hard disk.  For a fixed size virtual hard disk, the extra risk is corruption in the 511 byte footer, or in the parent's NTFS file system structure.  So while there is definitely a greater potential for "catastrophic" data loss, the difference is fairly small.
  5. Hardware monitoring: This was another big one for me.  By using fixed size virtual hard disks I can run hardware monitoring tools (like SMART monitoring tools) in the parent partition.  If you use physical disks with a virtual machine you cannot use these sorts of tools in the parent partition or in the virtual machine.
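As a side note on the data-safety point above: the VHD footer sits in the last bytes of the file, so a basic integrity check is easy to sketch. The published VHD format begins the footer with the ASCII cookie "conectix"; the sketch below assumes the common 512-byte footer (early Virtual PC images used 511 bytes) and is purely illustrative.

```python
# Minimal sketch: check whether a disk image ends with a VHD footer by
# looking for the "conectix" cookie that the VHD format places at the
# start of the footer stored in the final bytes of a fixed-size image.
# Assumes the common 512-byte footer variant.

def has_vhd_footer(path):
    """Return True if the last 512 bytes of `path` begin with the VHD cookie."""
    with open(path, "rb") as f:
        f.seek(0, 2)                  # jump to end of file
        if f.tell() < 512:            # too small to hold a footer at all
            return False
        f.seek(-512, 2)               # footer occupies the final 512 bytes
        return f.read(8) == b"conectix"
```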

Hope this helps you when you are faced with similar decisions.


Published Wednesday, January 28, 2009 6:47 PM by Virtual PC Guy

Virtual PC Guy's WebLog : Fixed Disks vs. Physical Disks

SQL Server 2005 and 2008 System Views Posters


SQL Server 2005 and 2008 System Views Posters

Use the following links to download the official posters with the map of the System Views for SQL Server 2005/2008:

SQL Server 2005

and for SQL Server 2008

For SQL Server 2000 you can download a zip file with a more interactive presentation:


Posted: Thursday, January 29, 2009 2:32 AM by luiscan

Blog do Ezequiel : SQL Server 2005 and 2008 System Views Posters

Regarding Windows Server 2008 R2 : Windows Management Infrastructure and BITS Compact Server


Windows Management Infrastructure and BITS Compact Server

Much of the online information (blogs, script archives, etc.) related to Windows Management fails to address the interests of developers, focusing largely on the IT-pro community instead.  Now, the Windows Management Infrastructure team (they own WMI, WinRM, BITS, etc.) has published a developer-focused blog.

Check out the WMI blog here:

With Windows Server 2008 R2, the Background Intelligent Transfer Service (BITS) team introduces the BITS Compact Server.  This product provides application developers an “in the box” service to host files for BITS clients.  What's new with the BITS Compact Server is that there is no longer any need to install and administer a full web server as well.  A key scenario being addressed is the asynchronous transfer of a limited number of large files between machines using HTTPS within a subnet/workgroup.  BITS can be leveraged by applications as a complete bi-directional file transfer solution, and it includes several APIs for interfacing from script, managed code, PowerShell, or COM.

Posted: Wednesday, January 28, 2009 7:29 PM by PhilPenn

Regarding Windows Server 2008 R2 : Windows Management Infrastructure and BITS Compact Server

Even more Ramp Up updates!


Even more Ramp Up updates!

The Ramp Up program has just announced another new track: SharePoint for Developers, Part 2! This track, and all of the Ramp Up learning tracks, teach the skills you need in a guided path, making the learning process easier and more efficient. We announced the introduction of SharePoint for Developers, Part 1 a few weeks ago, so we're excited to bring you news of this brand-new learning option.

As a refresher, Ramp Up is a free online program and a featured learning resource for the Software Design, Embedded Development, Game Development and Robotics & Algorithm competition tracks. The Ramp Up website has a great new design, so go check it out!

Published 28 January 09 09:45 by mslic

Microsoft Learning | Imagine Cup : Even more Ramp Up updates!

Wednesday, January 28, 2009

Microsoft Virtualization Overview


Microsoft Virtualization Overview

For the peeps out there looking for a nice briefing video overview of our strategy, you or your colleagues can huddle round your desk and view one of the following two videos:

12 Minute Overview -

35 Minute Overview -

Mike Peers and Matt McSpirit take us on a tour from Datacenter to Desktop. The videos are a great watch if you don't yet have a deep understanding and want a bit of a ramp!

Posted: Thursday, January 29, 2009 12:31 AM by Justin Zarb

The World Simplified is a Virtual World : Microsoft Virtualization Overview

Office Migration and Planning Manager (OMPM)


Office Migration and Planning Manager (OMPM)

OK, if you are an IT guy/gal responsible for deploying Office 2007 to your corporate environment… chances are you are asking yourself “hmmm… how do I know if I will encounter any compatibility issues with the new file format?” or “How can I identify all of the Office files in my environment and tell whether each file has macros?”, or “I’d like to take advantage of the incredible 30% – 60% file size reduction by converting my existing ‘binary’ files to the new format”…

Lucky for you, we have just the tool to help…   Welcome to the wonderful world of OMPM! 
I recommend you start by watching the on-demand ‘labcast’ below!

New 2007 Microsoft Office System Planning and Migration Tools - Office Migration Planning Manager

After completing this labcast on-demand, you will be better able to install and configure Office Migration Planning Manager (OMPM), perform a scan on a user system, review the results of that scan, and convert documents with no issues to the new file formats.

Published Wednesday, January 28, 2009 1:12 PM by bwhichel

Brent Whichel - Everything Office : Office Migration and Planning Manager (OMPM)

Tuesday, January 27, 2009

You've got Groove -- has Groove got you back?


You've got Groove -- has Groove got you back?

Got the gotchas?  Francie Selkirk has a Groove support blog out on TechNet that has lots of good stuff on troubleshooting, best practices, and general gotcha avoidance.

Published Tuesday, January 27, 2009 4:24 PM by Jim McCoy

All the Groovey news that fits : You've got Groove -- has Groove got you back?

New release of Financial Functions .NET uploaded on MSDN Code Gallery


New release of Financial Functions .NET uploaded on MSDN Code Gallery

I fixed the bug described in this thread and cleaned up the root finding algorithm. I’m still unhappy about it, but I have no time to code a better one right now (e.g., Ridders or Brent). I also added changes.txt and todo.txt to keep track of things.


Changes:

1. Fixed call to throw in bisection
2. Changed findBounds algo
3. Added TestXirrBugs function
4. Removed the NewValue functions everywhere

To do:

1. The interaction of the Bisection and Newton algorithms in findRoot needs review. It seems to be working now, but it could use some love. Maybe I should switch to a better root finding algorithm (e.g., Ridders or Brent)

Published Tuesday, January 27, 2009 11:18 AM by lucabol

Luca Bolognese's WebLog : New release of Financial Functions .NET uploaded on MSDN Code Gallery

Visio diagram and site template from Office Online workflow videos


Visio diagram and site template from Office Online workflow videos

Hello all, Stephen here again. A while back I recorded a series of workflow videos that were posted on Office Online:

Watch this: Design a document review workflow solution

Many of you have submitted comments asking if I could make available both the Visio diagram shown in the videos and the solution itself.

I’m happy to oblige (and apologies for the delay), so please find attached to this blog post a .zip file containing the site template (.stp file) and Visio diagram (.vsd file).

A few notes:

  • The solution uses the Document Center site template found in SharePoint Server 2007 (not Windows SharePoint Services), so it must be deployed to a SharePoint Server environment.
  • The site template includes WorkflowDashboard.aspx, but the Data View on that page must be re-created by following the steps in the Part 12 video and the Part 13 video.
  • The URLs used in the e-mail messages in the workflows must be updated to reflect the path of the new site, as mentioned in the Part 14 video.

Hope you find this helpful.

Workflow diagram

Published Tuesday, January 27, 2009 10:44 AM by spdblog

Attachment(s): Diagram and Solution from Document Review

Microsoft SharePoint Designer Team Blog : Visio diagram and site template from Office Online workflow videos

Versioning the Database


Versioning the Database

If you write apps that hit a database that you own, then you probably want version control on the database itself. I always script out the initial database, and version that script. I also script the changes, and version those, and then script the entire DB again and version that as well. That allows me to upgrade a DB schema or build a new one from scratch.

But that doesn't do anything once the DB is deployed - how DO you know what version the database is in once it is in the field? I've used two approaches, and I'd love to hear the ones you're using in your shops.

The first method is to create a "version" table in the database and record the number there. I like this approach because I can see the "trail" of how many times (and when) the database has been versioned, and even who did it:

    USE AdventureWorks;
    GO

    /* First Method - create table */
    CREATE TABLE [dbo].[DBA_Version]
    (   [DBA_VersionPK] [int] IDENTITY(1,1) NOT NULL,
        [Version] [varchar](50) NOT NULL,
        [DateAssigned] [datetime] NULL,
        [AssignedBy] [varchar](50) NULL,
        [Notes] [varchar](255) NULL,
        PRIMARY KEY CLUSTERED ([DBA_VersionPK] ASC)
    )
    ON [PRIMARY]
    GO

    ALTER TABLE [dbo].[DBA_Version] ADD CONSTRAINT [DF_DBA_Version_DateAssigned] DEFAULT (getdate()) FOR [DateAssigned]
    GO
    ALTER TABLE [dbo].[DBA_Version] ADD CONSTRAINT [DF_DBA_Version_AssignedBy] DEFAULT (user_name()) FOR [AssignedBy]
    GO

    /* Insert some data */
    INSERT INTO [AdventureWorks].[dbo].[DBA_Version]
               ([Version]
               ,[Notes])
         VALUES
               (''
               ,'Changed major schema' )
    GO

    /* Read it */
    SELECT *
    FROM [AdventureWorks].[dbo].[DBA_Version]
    ORDER BY DateAssigned DESC;
    GO

There's another way, if you want to just track the latest number. For this you can use an "extended property". You could even use this for other objects in the database, such as stored procs and so forth, but I just use it for the database itself:

    USE AdventureWorks;
    GO

    /* Method 2 - Add an extended property */
    EXEC sys.sp_addextendedproperty
    @name = N'version',
    @value = N'';
    GO

    /* To view an extended property */
    SELECT name, value
    FROM fn_listextendedproperty(default, default, default, default, default, default, default)
    WHERE name = 'version';
    GO

    -- or
    SELECT name, value
    FROM sys.extended_properties
    WHERE class_desc = 'DATABASE'
    AND name = 'version';
    GO

    /* Update an extended property */
    EXEC sp_updateextendedproperty
    @name = N'version',
    @value = N'';
    GO

    /* Delete an extended property */
    EXEC sp_dropextendedproperty
    @name = N'version';
    GO

Of course, both of these have issues - the user can just change things underneath you and you would think the database is at one level when it isn't. Also, the user might tinker with the version itself.

To handle that, you could use DDL triggers, or you could do a checksum on the table. Lately I've been playing around with using the Change Data Capture feature in SQL Server 2008 to track DDL changes. That, combined with these version numbers, gives me a good feel for the database version.
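The checksum idea mentioned above can be sketched roughly as follows: hash every row of the version table and compare the digest against a value stored out-of-band to detect tampering. SQLite stands in for SQL Server here, and the table and column names simply mirror the DBA_Version table from earlier; in practice you might compute the hash server-side instead (for example with HASHBYTES).

```python
# Sketch of the tamper-detection checksum: hash the contents of the
# version table in primary-key order so any edit changes the digest.
# SQLite is used as a stand-in for SQL Server; names are illustrative.
import hashlib
import sqlite3

def version_table_checksum(conn):
    """Return a hex digest over every row of DBA_Version, in key order."""
    digest = hashlib.sha256()
    cur = conn.execute(
        "SELECT Version, DateAssigned, AssignedBy, Notes "
        "FROM DBA_Version ORDER BY DBA_VersionPK")
    for row in cur:
        # repr() gives a stable textual form of each row tuple
        digest.update(repr(row).encode("utf-8"))
    return digest.hexdigest()
```

Store the digest somewhere the database user cannot reach; recompute and compare it whenever you need to trust the recorded version.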

Published 27 January 09 09:13 by Buck Woody

Carpe Datum : Versioning the Database

How to handle a missing, but still running, virtual machine


How to handle a missing, but still running, virtual machine

It is possible under Hyper-V to get into a state where you have a virtual machine that is still running correctly - but does not show up in the virtual machine list under the Hyper-V management console.  The two most likely causes for this are:

  1. The virtual machine management service (VMMS) is not running. 
    If this is the case you will not see any virtual machines (or be able to interact with the server in general).  You can fix this by just starting the VMMS again (run "net start vmms" from an administrative command prompt).
  2. The virtual machine management service cannot find the virtual machine configuration file.
    With Windows Server 2008 - if a virtual machine configuration file is stored on network based storage, and there is a momentary disruption in network connectivity - the virtual machine management service will stop listing the virtual machine in question, even though it is still running.  You can correct this problem by restarting the VMMS (run "net stop vmms" followed by "net start vmms" from an administrative command prompt).
    Note that this should not happen with Windows Server 2008 R2, as we have updated the VMMS so that it will recheck for the configuration in this case.


Published Tuesday, January 27, 2009 5:20 PM by Virtual PC Guy

Virtual PC Guy's WebLog : How to handle a missing, but still running, virtual machine

QoS in Windows 7


QoS in Windows 7

Hello all. My name is Charley. I’m the new QoS program manager for Windows Core Networking.

It has been a while since we posted our last article about QoS. We want to assure you that we’re still committed to improving this technology and building new QoS features in Windows. We received many questions and suggestions from you in the past. We hope you continue doing so because they are important to us.

You probably all know that the Windows 7 beta is live and available for you to download. You may wonder what is new in Windows 7 regarding QoS. But before getting to that, I’d like to refresh our memory as to what was new in Windows Vista. In Vista, we introduced two major QoS features: qWAVE and Policy-based QoS. qWAVE, or Quality Windows Audio Video Experience, is designed to estimate the network bandwidth, intelligently mark the application packets (with proper DSCP values), and interact with the application in the event of network congestion or fluctuations in available bandwidth (so that the application can take appropriate actions). The qWAVE APIs aim to simplify the work of application developers writing QoS-enabled applications for a home network. The APIs are documented in the Windows SDK. In contrast, Policy-based QoS has a different target audience. It is intended to enable IT administrators to apply QoS to applications (which don’t need native support for QoS), computers, and users in their enterprise network. It is especially beneficial to an organization with branch offices, where the WAN link capacity is limited and users tend to experience unpredictable network delays when accessing files or applications hosted on the main campus (or a different branch office). If you’d like to know more about these features, we’d suggest you read an excellent article written by Joseph Davies for TechNet magazine. Click here to check it out.
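The DSCP marking mentioned above can be illustrated with a small, hedged sketch using portable sockets. This is not the qWAVE or Policy-based QoS mechanism itself (on Windows, marking normally goes through those facilities rather than raw socket options); the sketch just shows where the DSCP bits live in the IP header's TOS byte.

```python
# Illustration only: DSCP occupies the upper six bits of the IP TOS byte,
# so an application-level sketch can set a DSCP value via IP_TOS.
import socket

EF_DSCP = 46                 # "Expedited Forwarding", commonly used for voice
TOS_VALUE = EF_DSCP << 2     # DSCP sits in bits 7..2 of the TOS byte

def make_marked_socket():
    """Create a UDP socket whose outgoing packets carry DSCP 46."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
    return s
```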

So what’s new in Windows 7? The enhancement we’ve made is called URL-based QoS. If you’re familiar with Policy-based QoS, you know that in Vista an IT administrator can create a policy based on the application name, source and/or destination IP addresses, source and/or destination ports, and the protocol (TCP, UDP, or both). We’ve learned since then that many enterprise applications have been, or will be, hosted on web servers and accessed from a browser, so IT administrators would love to be able to prioritize or control the network traffic from those web-based applications, provided that a convenient configuration is available. To answer their request, we’ve added a new configuration option: you can create a policy based on the URL of an HTTP server. Say you host all the training videos of your company on an IIS server running Windows 7 (or, more precisely, Windows Server 2008 R2). You can then use a URL-based QoS policy to throttle the video traffic (to prevent it from overwhelming your corporate network). Or say you host your company’s CRM or ERP applications on an HTTP application server. You can similarly use a URL-based QoS policy to prioritize these applications’ traffic so that even users at a remote branch office always get prompt replies and a smooth UI experience.

Does this sound interesting to you? Please let us know what you think. In the next blog, we’ll detail the configuration and the rules.


Windows Core Networking : QoS in Windows 7

SQL Server 2005 Service Pack 3 is now available on Microsoft Update (MU)


SQL Server 2005 Service Pack 3 is now available on Microsoft Update (MU)

SQL Server 2005 SP3 has been available on the Microsoft Download Center since 12/15/2008 and is also now available on Microsoft update.

The main SP3 update for Standard, Workgroup, Developer and Enterprise Editions is available as an optional update on the Microsoft Update website and is visible when you choose a "Custom" scan.  It will be offered for each language and platform we support.  All of these updates will also be available through Windows Server Update Services (WSUS).

Starting in 2 weeks, the SP3 update version for Express Edition, Express Edition with Advanced Services and Express Toolkit will be available as an automatic update. This update will be automatically downloaded and installed if you have opted in to receive updates through Microsoft Update and have enabled automatic updates on your Windows machine.

More information on SQL Server 2005 SP3 can be found at:

Thank you,

Christina Gendrano

Program Manager

SQL Server Sustained Engineering

Published Tuesday, January 27, 2009 11:43 PM by tina_nj23

Microsoft SQL Server Release Services : SQL Server 2005 Service Pack 3 is now available on Microsoft Update (MU)

Monday, January 26, 2009

make and share lookup lists

Published 26 January 09 07:53 AM

 is a useful little site where users are making and sharing lists of data. Everything from US State Names to Most Commonly Used Passwords. If you have some favorite sites for getting data, leave a comment.

by Ryan McMinn

Access Team Blog : make and share lookup lists

Demo and Discovery 2 Win Overview Recordings


Demo and Discovery 2 Win Overview Recordings


I am sure by now most of you have heard about the Demo and Discovery 2 Win trainings we have been offering. Many of you have taken advantage of these great classes, but we still get a lot of questions asking what exactly you will learn, or what Demo and Discovery 2 Win is.

To help answer these questions and more, Cindy Alexander, Dynamics GP PTS in the Southeast, created two 15-minute overview recordings that explain Demo and Discovery 2 Win.

Below are the links to the 2 recordings:

1. What can Microsoft Dynamics Partners Expect from Discovery2Win?

2. What can Microsoft Dynamics Partners Expect from Demo2Win?

Published Monday, January 26, 2009 3:34 PM by jeffk

US Microsoft Dynamics GP Field Team : Demo and Discovery 2 Win Overview Recordings

Sunday, January 25, 2009

New build of my PowerShell library for Hyper-V on codeplex


New build of my PowerShell library for Hyper-V on codeplex

Some naked statistics

1942 – the number of lines of code in HyperV.PS1

499 – the number of lines of formatting XML

14,381 – the number of words in the documentation file

2443 - the number of downloads of the last version

929 – the number of downloads for the versions before that

1.00 – the version number of the Library now on codeplex.

I’m calling this “released”. I daresay I will need to come back and do a service pack later, and the documentation must be regarded as “preliminary” but it’s done.

Published Sunday, January 25, 2009 11:47 PM by jamesone

James O'Neill's blog : New build of my PowerShell library for Hyper-V on codeplex

ADO.NET Data Services Resources


ADO.NET Data Services Resources

Help yourself by exploring the resources below on ADO.NET Data Services (aka "Project Astoria").


Posted: Sunday, January 25, 2009 8:49 PM by wriju

Wriju's BLOG : ADO.NET Data Services Resources

The IO Model Part 2 – Inner Workings


The IO Model Part 2 – Inner Workings

Note : IO = (Infrastructure Optimization)

I recently wrote about the IO worksheet that was just released in its updated form. For today's look at the inner workings of the IO Model, we’ll use the spreadsheet version, although there is the online version and of course we have an internal application :)

So let’s take a look at the worksheet.  It is an Excel workbook that can be downloaded from the website (or directly using this link).

When you start the workbook, it should open to the Intro tab.  If it doesn’t, start there.

IO Intro

As you can see, in addition to the Intro tab which has instructions and quick links, there are separate tabs for the Core, Business Productivity and Application Platform Parts.  Then there is a Results tab and also a nice Projects tab which auto-populates based on your answers.

The new workbook has a much cleaner interface with radio buttons instead of drop down lists as well as a much simplified look and feel.   

The Previous workbook:

IO old workbook

The New workbook:

IO new workbook

So, the idea is that you fill in the answers that are appropriate for your organisation.  It is unlikely that one person will have all the answers (unless you are a small company), so you will probably need either a few people in a room together to do it all at once, or to canvass a few people individually.  I prefer to get all the relevant people in a room together, as this tends to lead to some robust discussions and often reveals that not everyone is on the same page with regard to what the organisation's needs are, or what the current state is.

Most of the questions are vendor agnostic, however there are a few where Microsoft products are named explicitly (Active Directory and Office for example).

So for this example, I have gone through and answered the questions for the organisation:

IO questions

Once all the questions are answered, I can change to the Results tab to see how I rated:

IO results

As you can see, based on the answers I gave, I was rated Standardized for I&A, Overall Basic for DDSM (although in one sub-capability I was Standardized), Basic for Security & Networking and Standardized for Data Protection and Recovery.

In this set of answers, I also completed the questions on my security processes and our ITIL practices.

On the Projects tab, the workbook has auto-populated a set of projects that are suggested to be beneficial based on my current state:

IO Projects

You can rate the importance of these projects to your organisation, along with the type of business benefit the completion of each project will bring.  You can then populate this list with the expected costs (not auto-populated, as these are very much organisation specific).

And there you have it, you’ve completed a maturity assessment for your organisation, and have a pretty useful selection of projects to look into to improve the effectiveness of your technology infrastructure.

As you move up the maturity model, your costs decrease and your agility and responsiveness increase.

In Part 3 of this series, I’ll run you through some ‘real world’ examples that I am aware of where organisations are using the IO Model.

Published 26 January 09 05:44 by adhall

The IO Guy : The IO Model Part 2 – Inner Workings

Hello world to TTS (1)


Hello world to TTS (1)

In this post, I will show you how easily you can make your first TTS application with C#. This example will use the COM interop to call SAPI functionality because SAPI has implemented a number of automation objects that can be directly used in C# or any other .Net languages.

Suppose you have Windows Vista (or Windows 7 Beta) and Visual Studio installed. Here is a step by step guide:

1. Create the solution by clicking the menu: File -> New -> Project, selecting "Visual C#" -> "Windows" -> "Console Application", and choosing the project name "SAPIInterop". A basic C# project will be created automatically.

2. Add the reference to SAPI.dll by clicking "Project" -> "Add Reference" -> "COM". In the list, find and select "Microsoft Speech Object Library".  A "SpeechLib" reference will appear in the project references list.

3. Now we can add the code to make our first "hello world" TTS application.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using SpeechLib;
using System.Diagnostics;

namespace SAPIInterop
{
    class Program
    {
        static void Speak(int voiceIdx)
        {
            SpVoice spVoice = new SpVoice();
            ISpeechObjectTokens tokens = spVoice.GetVoices(null, null);
            if (voiceIdx < 0 || voiceIdx >= tokens.Count)
            {
                Console.WriteLine("voice index out of range from [0-{0}]", tokens.Count);
                return;
            }

            spVoice.Voice = tokens.Item(voiceIdx);
            spVoice.Rate = 0;
            spVoice.Speak("Hello world", SpeechVoiceSpeakFlags.SVSFDefault);
        }

        static void ListVoice()
        {
            SpObjectTokenCategory tokenCat = new SpObjectTokenCategory();
            tokenCat.SetId(SpeechLib.SpeechStringConstants.SpeechCategoryVoices, false);
            ISpeechObjectTokens tokens = tokenCat.EnumerateTokens(null, null);
            foreach (SpObjectToken item in tokens)
            {
                Console.WriteLine(item.GetDescription(0));
            }
        }

        static void Main(string[] args)
        {
            // list the voices installed on the system
            ListVoice();
            // speak with the second voice
            Speak(1);
        }
    }
}

Posted: Sunday, January 25, 2009 9:11 AM by myblog

My Ramblings : Hello world to TTS (1)

Saturday, January 24, 2009

NEW: GP Demo Site


NEW: GP Demo Site

I am happy to announce a New GP Demo Site where you can watch videos and take Dynamics GP for a Test Drive.

Check it out here:

Published Friday, January 23, 2009 3:30 PM by jeffk

US Microsoft Dynamics GP Field Team : NEW: GP Demo Site

A Bit More On BITS


A Bit More On BITS

I was in a meeting yesterday talking about BITS and what an awesome service it was.  I described what it did and how it allowed people to avoid a wide range of low-class, pain in the butt problems.  The question then became, "why isn't everyone using it?"  That's a darn good question.  The answer is that we have not educated people about it.  I aim to fix that.  I wrote an earlier blog on BITS HERE. 

Alex Ng has just written a nice blog about it HERE.  Check it out, try it out, tell a friend (and consider the internet a friend).

Experiment! Enjoy! Engage!

Jeffrey Snover [MSFT]
Windows Management Partner Architect
Visit the Windows PowerShell Team blog at:
Visit the Windows PowerShell ScriptCenter at:

Published Friday, January 23, 2009 1:56 PM by PowerShellTeam

Windows PowerShell Blog : A Bit More On BITS

Microsoft Dynamics by Todd : The End


The End

Thank you all for reading my blog. However, as of today, Jan 23, 2009, I am no longer employed by Microsoft.  In fact, the entire MSAM - Dynamics Team was part of the 1,400 folks laid off.

Please keep me in mind if you know of any opportunities.

Feel free to contact me:

Todd Juchem

Thanks again for your support.

Note: Microsoft Across America, or MSAM for short, consists of event programs in a variety of areas.


Microsoft Dynamics by Todd : The End

Friday, January 23, 2009

There’s No Good Reason to Delay Data Liquidity and Information Sharing


The Truth About Health IT Standards – There’s No Good Reason to Delay Data Liquidity and Information Sharing

David C. Kibbe and Peter Neupert

Now that the Obama administration and Congress have committed to spending billions of taxpayers’ money on health IT as part of the economic stimulus package, it’s important to be clear about what consumers and patients ought to expect in return—better decision-making by doctors and patients. 

The thing is, nobody can make good decisions without good data. Unfortunately, too many in our industry use data “lock-in” as a tactic to keep their customers captive. Policy makers’ myopic focus on standards and certification does little but provide good air cover for this status quo. Our fundamental first step has to be to ensure data liquidity – making it easy for the data to move around and do some good for us all.

We suggest the following three goals ought to be achieved by end of 2009:

  • Patients’ clinical data (diagnoses, medications, allergies, lab results, immunization history, etc.) are available to doctors in 75% of emergency rooms, clinic offices, and hospitals within their region.
  • Patients’ doctors or medical practices have a “face sheet” that lets any staff member see an all-up view of their relevant health data, including visit status, meds, labs, images, all of which is also viewable to patients via the Web.
  • Every time patients see providers, they are given an electronic after-visit report that includes what was done and what the next steps for care will be according to best practices and evidence-based protocols, whenever these are applicable.

Some who view this seemingly humble list of achievements will say that we can’t do it, because the standards aren’t ready, or the data is too complex. They’ll say that delays are necessary, due to worries about privacy or because too much data is still on paper.

We disagree.  We believe that where there’s a will, there is going to be a way.  And we already know most of what we need to know to achieve these goals.  We know that:

  • huge amounts of digital data exist, already formatted electronically, but scattered across many proprietary systems (meds, labs, images).
  • software and the Internet make it possible—in a low cost, lightweight way—to get data out of these databases to the point of decision making (to the ER doctor, the patient/consumer, or the primary care physician).
  • people are hungry for information in whatever form they can get it:
    • Getting it on paper is better than nothing
    • Getting it quickly is better than getting it late
    • Getting it in non-standard digital format is better than paper (software is pretty good at transforming non-standard to standard formats)
    • Getting it in a standard format is better
    • Getting it in a structured, standard format is best
  • An integration “big bang” -- getting everybody all of a sudden onto one single, structured and standard format -- can’t and won’t happen.
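The point above about software transforming non-standard formats into standard ones can be made concrete with a toy sketch: a flat, pipe-delimited medication record (an entirely hypothetical layout, not a real health-IT format) is mechanically lifted into structured XML.

```python
# Toy illustration: transform a non-standard, pipe-delimited medication
# record into structured XML. The 'name|dose|frequency' field layout is
# hypothetical; real transformations target standards like the CCR.
import xml.etree.ElementTree as ET

def meds_to_xml(flat_record):
    """Turn 'name|dose|frequency;...' into a <medications> XML string."""
    root = ET.Element("medications")
    for entry in filter(None, flat_record.split(";")):
        name, dose, freq = entry.split("|")
        med = ET.SubElement(root, "medication")
        ET.SubElement(med, "name").text = name
        ET.SubElement(med, "dose").text = dose
        ET.SubElement(med, "frequency").text = freq
    return ET.tostring(root, encoding="unicode")
```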

We don’t have to wait for new standards to make data accessible—we can do a ton now without standards.  What we need more than anything else is for people to demand that their personal health data are separated from the software applications that are used to collect and store the data.

This idea of separating health data from the applications is very important, and it is a better way to frame the discussion about how to achieve data liquidity than the term “interoperability,” which we find cumbersome and opaque. Smart people, armed with software, can do incredible things with data in any format, so long as they can get to it.

Customers of health information systems want to re-use their health data, often in ways they haven’t anticipated. However, many enterprise system vendors make it difficult or expensive to get access to the data (that is, to separate it from the application), because they believe that proprietary “lock-in” gives them some form of strategic advantage.

We understand that IT vendors are in business, and need to create strategic value for their products.  And we are very much in favor of that—in rules, in workflow, in user experience, price and flexibility, and so on. However, vendors should not be able to “lock” the patient or enterprise data into their applications, and thereby inhibit the ability of customers and partners to build cross-vendor systems that improve care.

It’s possible for vendors to provide value without the need for lock-in. There are lots of examples of this, such as the Health Information Exchange in Wisconsin and CVS MinuteClinic. In the former, value is clearly being added immediately for users in the ED, without requiring all the participating EDs to change their systems or to be standards-compliant (or CCHIT-certified). At MinuteClinic, summary after-visit health data are made available to customers online using the Continuity of Care Record standard. This is where the low-hanging fruit is.

There’s already a proven model for extracting and transforming data in many ways (HL7 feeds, non-HL7 feeds, web services, database replication, XML and XSLT, and more), and along the way we can create value by interpreting the data and adding metadata. Microsoft is doing it today, both in the enterprise with Amalga and across enterprises to the consumer with HealthVault. We hope other vendors follow this lead to drive better outcomes for patients.
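As a concrete sketch of interpreting a feed and adding metadata, here is a minimal parser for a single HL7 v2 OBX (observation) segment. The pipe-delimited field layout (OBX-3 identifier, OBX-5 value, OBX-6 units) follows the HL7 v2 convention, but the sample values, the source name, and the output dictionary shape are all invented for illustration.

```python
from datetime import datetime, timezone

# A single OBX segment from a hypothetical HL7 v2 lab feed.
SEGMENT = "OBX|1|NM|2345-7^Glucose^LN||95|mg/dL|70-99|N|||F"

def parse_obx(segment: str, source: str) -> dict:
    """Extract an observation from an OBX segment and attach metadata
    about where and when it arrived (not part of the wire format)."""
    fields = segment.split("|")
    code, name, system = fields[3].split("^")  # OBX-3: code^text^coding system
    return {
        "code": code,
        "name": name,
        "coding_system": system,
        "value": fields[5],   # OBX-5: observation value
        "units": fields[6],   # OBX-6: units
        # Added metadata, the kind of value an integration layer contributes:
        "source": source,
        "received_at": datetime.now(timezone.utc).isoformat(),
    }

obs = parse_obx(SEGMENT, source="example-lab-feed")
print(obs["name"], obs["value"], obs["units"])
```

Once the observation is in a plain structure like this, it can be re-serialized to any downstream format, which is the whole argument: liquidity first, canonical formats as they earn adoption.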

Unlike the physical world, where there is a need for de jure standards (think railroad tracks), in the software world there is much more flexibility, and the standards that work are the ones that evolve from USAGE and market acceptance. The certification and standards road equals conferences, press releases, “connectathons”, caregivers-turned-bureaucrats. The outcomes road equals immediate benefits to actual caregivers AND learning we can apply to the next round, and the next, and the next.

We have given the industry decades to make this happen, and just in the last 1-2 years have people finally gotten fed up and started moving. Our great risk here is that the people lobbying for dollars and certification today are the people who are invested in the old road. With the amount of money we are talking about, we run the risk of just giving them another decade to delay and plan. Instead, let’s put the dollars into rewarding behavior and outcomes, and let the people who live with the problems every day figure out how to solve them.

When we set out to go to the moon in the 1960s, we didn’t say “let’s build a great rocket.” So, too, in this case we shouldn’t say “let’s buy a great IT system.” Our measurements should be tied to what we want: better care, informed by the data that is just out there waiting for us to use it.

David C Kibbe MD MBA is a Family Physician and Senior Adviser to the American Academy of Family Physicians who consults on health care professional and consumer technologies.  Peter Neupert is Health Solutions Group Corporate Vice President at Microsoft.

Posted: Friday, January 23, 2009 4:26 PM by pnblog

Neupert On Health : The Truth About Health IT Standards – There’s No Good Reason to Delay Data Liquidity and Information Sharing
