PowerShelling your Deployment Emails

QA and Development team members coordinate at many points along the software development life cycle (SDLC). Part of this communication revolves around deployments, which often take the form of an e-mail that includes information from Team Foundation Server (TFS). To achieve success, multiple points need to be communicated.

  1. When – It needs to be clear when a deployment is finished and the environment is ready for testing
  2. What – The stories/Product Backlog Items (PBI’s)/Bugs that are included in the build
  3. Where – The environments that will be changed and/or updated as a result of the release

The below script will look up the provided items in TFS, update them, and send the relevant information in an e-mail. I highly recommend using distribution lists for e-mails like this. It lets people subscribe and unsubscribe without updating the script.

In this example, the person executing the script would need to enter the PBI’s that are going to be deployed. Then the script will query TFS to retrieve the relevant information to include in the e-mail (Id, Title, Type, Iteration, Assigned To). In our case, part of the deployment included assigning the stories/PBI’s/Bugs to a QA user for clarity and historical record. To prevent mismatches between the e-mail and assignment, the script updates the AssignedTo for each provided item.

The output of the query comes in tab-separated values (TSV) format. The TsvToHtmlTable function converts the output from TSV to something (aka HTML) that looks decent within an e-mail. The endpoints listed in the e-mail remove guesswork about where to find the environment and perform testing.
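A minimal sketch of the TsvToHtmlTable idea follows; the function body here is my reconstruction, not the original script's code.

```powershell
# Convert tab-separated query output into an HTML table suitable for an e-mail body.
function TsvToHtmlTable {
    param([string[]]$TsvLines)  # first line is assumed to be the header row

    $header = ($TsvLines[0] -split "`t" | ForEach-Object { "<th>$_</th>" }) -join ''
    $rows = foreach ($line in ($TsvLines | Select-Object -Skip 1)) {
        $cells = ($line -split "`t" | ForEach-Object { "<td>$_</td>" }) -join ''
        "<tr>$cells</tr>"
    }
    "<table border='1'><tr>$header</tr>$($rows -join '')</table>"
}

# Example input shaped like the TFS query output (Id, Title, Type, Iteration, Assigned To)
$tsv = @(
    "Id`tTitle`tType`tIteration`tAssigned To",
    "1234`tLogin page`tPBI`tSprint 42`tQA User"
)
$html = TsvToHtmlTable -TsvLines $tsv
```

The resulting string can then be used as the e-mail body, for example with Send-MailMessage -BodyAsHtml.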

For me, investing the time and energy into these communications helped foster trust and collaboration between Dev and QA. I hope it helps you achieve the same, if not more.

Gated Check-In and Nuget Blues

Gated Check-In

You might have heard that it’s a best practice to use gated check-ins. It seems like a no-brainer to make sure that your code at least builds and passes the unit tests. TFS 2012 has the ability to set up gated check-ins, what luck. You set it up, and the sun is shining and the birds are singing.


Oh happy days. Then one day, you see this.

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1605): The primary reference "\bin\Library.dll" could not be resolved because it has an indirect dependency on the assembly "NLog, Version=, Culture=neutral, PublicKeyToken=5120e14c03d0593c" which was built against the ".NETFramework,Version=v4.5" framework. This is a higher version than the currently targeted framework ".NETFramework,Version=v4.0".

C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (1605): The primary reference "\bin\Efile1Library.dll" could not be resolved because it has an indirect dependency on the framework assembly "System.IO.Compression, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089" which could not be resolved in the currently targeted framework ".NETFramework,Version=v4.0". To resolve this problem, either remove the reference "\bin\Efile1Library.dll" or retarget your application to a framework version which contains "System.IO.Compression, Version=, Culture=neutral, PublicKeyToken=b77a5c561934e089".

Short Term Fix Alert

More than likely, your Product Owner (PO) didn’t select anything for the sprint that included debugging gated check-in issues. However, you committed to finishing the work that you now can’t check in. If your build fails with a similar error, do yourself a favor and try cleaning out the local (on the build agent) NuGet cache.
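For the NuGet 2.x tooling of that era, the agent-local cache lives under %LocalAppData%\NuGet\Cache; here is a sketch of clearing it (the function name and default path are my assumptions, not part of the original post):

```powershell
# Clear the build agent's local NuGet package cache so the next restore
# downloads fresh copies. The default path assumes NuGet 2.x conventions.
function Clear-LocalNugetCache {
    param([string]$CachePath = (Join-Path $env:LOCALAPPDATA 'NuGet\Cache'))

    if (Test-Path $CachePath) {
        # remove the cached .nupkg files and any subfolders
        Get-ChildItem -Path $CachePath -Recurse | Remove-Item -Recurse -Force
    }
}
```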

This will actually solve the issue… for a bit. The birds sing, but they appear kind of nervous.

Swing and a Miss

Then the issue returns. You think, “Hey, maybe there’s something to all that stuff about different versions.” Then you look at your solution and notice different versions of the same NuGet reference. For me, it was Newtonsoft.Json. All the projects were referencing the same version of the .NET Framework (note: Client Profile vs. regular is a significant difference).

Project A had Newtonsoft.Json 6.0.8 and Project B had Newtonsoft.Json 7.0.1. I upgraded both to reference Newtonsoft.Json 8.0.2. While I recommend consolidating your NuGet package versions, it did nothing to actually solve the problem that I was having.
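While consolidating, a small helper like this (my own sketch, not part of the original troubleshooting) can report which packages are pinned at more than one version across a solution's packages.config files:

```powershell
# Scan every packages.config under a solution folder and return the ids of
# packages that appear with more than one version.
function Get-NugetVersionConflicts {
    param([string]$SolutionRoot)

    $versions = @{}
    Get-ChildItem -Path $SolutionRoot -Recurse -Filter 'packages.config' | ForEach-Object {
        [xml]$config = Get-Content -Path $_.FullName -Raw
        foreach ($package in $config.packages.package) {
            if (-not $versions.ContainsKey($package.id)) { $versions[$package.id] = @() }
            $versions[$package.id] += $package.version
        }
    }
    # keep only packages whose recorded versions are not all identical
    $versions.GetEnumerator() |
        Where-Object { @($_.Value | Select-Object -Unique).Count -gt 1 } |
        ForEach-Object { $_.Key }
}
```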

Dealing with the Real Issue

This zombie error just won’t stay down. When you really need to commit a changeset, you will be revisited by this error.

The root cause of the error was the Library.Test project that referenced the Library project. I inspected their references and noticed that Library had a reference to NLog (which I highly recommend) but the test project did not. This gave me an idea. The solution was to add NLog to the Library.Test project.
It would have been greatly appreciated if that were mentioned anywhere in the error, or better yet, if a compiler error were generated before even running the unit tests. Now the birds can sing carefree once again.


SQL Saturday 2015 – South Florida #379


Thank you to Nova Southeastern University (NSU) and all the sponsors, speakers, and volunteers for putting on a great SQL Saturday #379. I met lots of passionate people who showed how much of a community we have in Broward County. I attended the following sessions and talked to lots of people about Code for Fort Lauderdale.

DH2i & DxEnterprise Stand-Alone to Clustered in Minutes

Carl Berglund

This was a vendor session for DH2i’s DxEnterprise, which works across multiple SQL Server versions and OSes. It was a very interesting session, particularly for a developer less familiar with virtual environments (at least standing them up, configuring, updating, feeding, watering, etc.). I know that for me, it introduced the following new and/or less familiar terms.

  • quorum – the minimum number of members of a deliberative assembly. In the context of clustering and high availability, it is the minimum number of nodes that must be in a cluster for it to be viable.
  • InstanceMobility – The ability to move an instance of a virtual machine across physical machine boundaries
  • SAN – Consolidated block level storage (as opposed to file level storage). Only block level operations are supported.
  • Internet Small Computer System Interface(iSCSI) SAN – Allows for the emulation of a SAN over IP networks

DxEnterprise works across multiple SQL Server versions and OSes, whereas Microsoft Clustering is not designed for Quality of Service (QoS). I was impressed by the demo showing how to build out a two-node cluster. You can see the steps below.

  1. Create individual nodes
  2. Add the disks
  3. Create a vhost
  4. Set the two nodes to be active on this vhost

In the demonstration, one node was running Windows Server 2008 R2 and the other was running Windows Server 2012. He explained how this simplifies implementing a non-traditional cluster (different OS/SQL versions). As an added benefit, it provides a way to perform a consistent install on each node.



  1. Wikipedia: Storage area network
  2. Wikipedia: iSCSI


PowerShell and Python – The Clash Part Deux

Jorge Besada (jbesada@carnival.com, jbesada@yahoo.com)

Below is a comparison between Python and PowerShell.

PowerShell

  • PowerShell is installed by default.
  • Politically correct to use PowerShell
  • Can instantiate .NET classes
  • “Steal from the Best, And Create the Rest”. Chad Miller is better than you, see for yourself on CodePlex.
  • PowerGUI Script Editor


Python

  • Easy to learn
  • Very good support for windows
  • has modules for everything
  • has “batteries included” philosophy

JetBrains makes an outstanding Python editor called PyCharm. In Python, indentation is equivalent to brackets. You can use Python to call sqlcmd.
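As a quick illustration of the “can instantiate .NET classes” point above (my own example, not from the session), PowerShell can use framework types directly:

```powershell
# Instantiate and use a .NET class, System.Text.StringBuilder, from PowerShell.
$builder = New-Object System.Text.StringBuilder
[void]$builder.Append('Power')   # [void] discards StringBuilder's fluent return value
[void]$builder.Append('Shell')
$result = $builder.ToString()
```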

There is a great video from Jeffrey Snover, the creator of PowerShell, that you can watch. You can find the presentation here.


  1. http://www.maxtblog.com/2015/06/powershell-in-south-sql-saturday-379-was-a-great-success/
  2. http://bost.ocks.org/mike/bar/

PowerShell with Visual Studio SQL Data Tools

Max Trinidad
Florida PowerShell User Group

Max demonstrated how to debug PowerShell with Visual Studio and how to use many of the available data tools. There were big changes between PowerShell versions 2 through 5.

New/Enhanced Features

Rival Editors

  1. SAPIEN is the editor of choice.
  2. PowerShell Studio 2012
  3. PrimalXML
  4. SharpDevelop


“I can do it. You can do it!” -Max

Page.ClientScript.RegisterStartupScript is not working : A Solution

Code Wala

This is going to be a very short post, but it will be useful to many developers.

You have registered a JavaScript block from the server side, but it’s not working: the script isn’t rendered on the client side when you view the page source. Yet the same code works in different projects/applications/pages. Say you have code like this

It registers one startup script that fires when the page loads.
Say you have used it several times and it was working, but now it is failing. Similarly, Page.ClientScript provides many other methods that will also not work.

One of the main reasons is that we start using Ajax on our pages. Ajax requires a ScriptManager to handle all the Ajax-related (partial postback) tasks. Ajax requires lots of JavaScript code that is managed by the ScriptManager. And it must be noted that on a single page only one instance…

View original post 58 more words

Nuget Gotcha: Update-Package Edition

Let me start off by stating that NuGet, FluentValidation, and Identity Manager are all great tools. Unfortunately, their mutual greatness doesn’t prevent issues from arising.

The Gotcha

You have a beta version of one NuGet package (e.g. Identity Manager) installed and try to update an independent package (e.g. FluentValidation). You run the Update-Package command, only to be greeted with the following error.

You can view a screenshot of this below.


The solution is to use the -pre flag in addition to update-package (e.g. Update-Package FluentValidation -Pre). However, I still don’t understand why updating one package depends on an unrelated one.




To NuGet’s credit, they are tracking this issue and a fix should be included in an upcoming release.

Powershell: Queuing Multiple TFS Builds


I frequently needed to automate a full deployment of all of my services, databases, and the client that consumed them.


I am using an on-premises installation of TFS 2012 and Visual Studio 2015. I needed to leverage multiple builds, since msbuild arguments (the ones that you use to automatically publish/deploy) apply to every project/solution being built. Deploying several different targets can therefore require several different builds.


Fortunately, this can be accomplished using PowerShell, and TFS provides a useful, well-documented interface.

Below are some helpful references that I found on my way to the above solution.


  1. TFS Build API by Example #1: Queue a build.
  2. Queuing TFS Build from Powershell Script Which is Called from TFS Build
  3. Queuing a build in PowerShell
  4. relative path in Import-Module
  5. Powershell import-module doesn’t find modules
  6. How to set a custom tfs build parameter using powershell
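The approach can be sketched as follows. The function and parameter names here are my own; against a real server, the scriptblock would wrap the TFS API's IBuildServer.QueueBuild call from Microsoft.TeamFoundation.Build.Client, as the references above describe.

```powershell
# Queue a list of build definitions in order, delegating the actual queue call
# so the sequencing logic can be exercised without a TFS server.
function Invoke-BuildQueue {
    param(
        [string[]]$BuildDefinitions,   # e.g. 'Services-Deploy', 'Database-Deploy', 'Client-Deploy'
        [scriptblock]$QueueBuild       # e.g. { param($definition) $buildServer.QueueBuild(...) }
    )
    foreach ($definition in $BuildDefinitions) {
        & $QueueBuild $definition
    }
}
```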

ASP.NET MVC Gotcha: Model Error Ordering

I ran into the following peculiar bug with the way model errors were presented in the view for my ASP.NET MVC 2 website. The code for the view is below.

You can see the ordering of the fields is First Name, Last Name, Name and Phone. As you can imagine, the validation messages should reflect this ordering. However, I was getting the following output instead.

This isn’t a show-stopper issue, but it is the kind of thing that customers notice (and good QA people do too). After registering, users add more information at this screen, so it should be as elegant as possible to keep them coming back. I assumed this would be an easy fix: I’d just go to the validation code and fix the ordering of the errors.

You can see from the code that it is yielding the validation errors the same way that I would like them to be presented. As it turns out, this didn’t make a whole lot of difference. Then I realized that the validation code was not the issue at all.

The errors are displayed in the order that the properties are listed in the model.

I’ll give you a moment to allow that to sink in. It’s an issue that is difficult to diagnose but easy to rectify. The original model property ordering below outputs the incorrect ordering.

The model should appear like below to achieve the desired user experience.

Once you make that change, the errors display like below.


Commerce Server 2009: Profile Definitions 404 Error

I opened Commerce Server Manager expecting to be able to browse to the Profile Definitions in Commerce Server 2009. Instead of the expected interface, I was greeted with “Server Error in Application ‘Default Web Site’ HTTP Error 404.0 Not Found” like the screenshot below.


This can be rectified with the below steps.

  1. Open Internet Information Services(IIS) Manager
  2. Right Click on Default Web Site
  3. Click Add Virtual Directory
  4. Enter Widgets under Alias
  5. Enter c:\Program Files (x86)\Common Files\Microsoft Shared\Enterprise Servers\Commerce Server\Widgets under Physical Path
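The same steps can also be scripted; this provisioning sketch assumes IIS with the WebAdministration module available.

```powershell
# Create the Widgets virtual directory under Default Web Site.
Import-Module WebAdministration
New-WebVirtualDirectory -Site 'Default Web Site' -Name 'Widgets' `
    -PhysicalPath 'C:\Program Files (x86)\Common Files\Microsoft Shared\Enterprise Servers\Commerce Server\Widgets'
```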


After following those steps and refreshing Commerce Server Manager you should see the below interface.


That’s it!

Authentication Failed SharePoint 2010 Edition

I stood up a development environment that featured both Commerce Server 2007 and SharePoint 2010. Within the environment, there is a public site and a private site. The SharePoint public site authenticates with a custom provider for Commerce Server 2007, while the private site uses Windows authentication. Once the public site loaded correctly, I was greeted by the below Authentication Failed exception on the private site. You can see from the screenshot below that it’s not pretty.

Server Error in '/' Application.

Authentication failed.

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

Exception Details: System.Runtime.InteropServices.COMException: Authentication failed.

Stack Trace:

[COMException (0x80040e4d): Authentication failed.]
   Microsoft.CommerceServer.Interop.Profiles.ProfileServiceClass.GetProfileByKey(String bstrKeyMemberName, Object sValue, String bstrType, Boolean bReturnError) +0
   Microsoft.CommerceServer.Runtime.Profiles.Profile..ctor(ProfileContext profileService, String keyName, String keyValue, String profileType) +156

[CommerceProfileSystemException: Failed to retrieve profile.]
   Microsoft.CommerceServer.Runtime.Profiles.Profile..ctor(ProfileContext profileService, String keyName, String keyValue, String profileType) +504
   Microsoft.CommerceServer.Runtime.Profiles.ProfileContext.GetProfile(String keyName, String keyValue, String profileType) +647
   Microsoft.CommerceServer.Runtime.CommerceContext.get_UserID() +744
   Microsoft.CommerceServer.Runtime.CommerceDataWarehouseAuthenticationModule.OnPreRequestHandlerExecute(Object sender, EventArgs e) +73
   Microsoft.Commerce.Providers.SharePointCommerceDataWarehouseAuthenticationModule.OnPreRequestHandlerExecute(Object sender, EventArgs e) +288
   System.Web.SyncEventExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute() +80
   System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously) +171

Version Information: Microsoft .NET Framework Version:2.0.50727.5485; ASP.NET Version:2.0.50727.5491

Unfortunately, the stack trace here doesn’t provide too much direction.

I checked the Event Viewer to try to glean some useful information. I found the following event.

This only offered misdirection and confusion, as it referenced Commerce Server in the stack trace, which couldn’t be the case since this site used Windows authentication. The solution came down to the aspnet:AllowAnonymousImpersonation setting, which was configured to true. Once it was modified to false, the site came up as desired. Below is the working configuration.
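A minimal sketch of that appSettings entry (the surrounding web.config structure is omitted; the key name comes from the setting described above):

```xml
<appSettings>
  <add key="aspnet:AllowAnonymousImpersonation" value="false" />
</appSettings>
```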

Fort Lauderdale Budget Data with Dygraphs

For those of you that don’t know,

Even though this data may not have been a perfect fit, I struggled with the “0 to visualized” portion of dygraphs, longing for more basic, easy-to-follow examples. I have chopped up this tutorial into easy-to-follow steps and made it available on GitHub. It’s a great way to learn about city data and data visualization using JavaScript.

Step 0 – Plain Jane

This represents the starting point for the tutorial. Reorienting the data was omitted here, but it can be learned by comparing the reoriented file with the raw file in the data folder. Applying meaning to the data can be difficult at this stage. Basic JavaScript, a dygraphs reference, and scrubbed data are included, but not much else.

Step 1 – Adding a Title


To add the title, we pass the options parameter to the dygraphs constructor using the below JavaScript. Notice the options parameter is a JSON object and its properties are case-sensitive.

Step 2 – Resize


After adding the title, the graph looks cramped. We can alter the height and width parameters to fix this as the below gist illustrates.

Step 3 – Formatting Currency


The scientific notation on the left (y-axis) and on the values makes sense, but isn’t terribly readable. Fortunately, we can beautifully format currency with a great library called numeral.js. You can see it in action below. This turns “$10e+7” into “$10 m”.

Step 4 – Set Axis Label Width


The below JavaScript changes improve the readability of our left (y-axis) values by giving them some room to breathe. Below we set the axisLabelWidth property to 100.

Step 5 – Refactor

Astute readers and developers will notice the identical code that formats the currency for the axisLabelFormatter and valueFormatter. For consistency’s and stability’s sake, we need to rectify this situation. In our case, we create and use a single function called formatLikeCurrency.

Step 6 – Debugging Tip

As you work through dygraphs, the hideOverlayOnMouseOut property can be helpful for debugging. Setting it to false keeps the legend for the last hovered point visible when the mouse leaves the chart, which is handy when moving between other sites, your text editor, e-mail, and the file explorer. It’s important to keep your frustration to a minimum.

Step 7 – Styling

Styling creates a better visual experience for anyone using or viewing your data. Fortunately, we can easily style the visualizations generated with dygraphs using CSS. First we need to configure the class structure that will be used by the CSS selectors. Then we tell dygraphs the correct class for the labels that appear on mouse hover.

Step 8 – Profit

I hope that you learned something about dygraphs, Fort Lauderdale, data visualization or javascript.