Scrum Day–Dallas 2015

Scrum Day Dallas will be on March 27th this year at the Addison Conference Center. This is going to be a full day of face-to-face learning with practitioners and industry experts, followed by the Face to Face meeting for Scrum Trainers. That internal follow-up meeting alone guarantees that at least 20 trainers will be there, plus Ken Schwaber himself.

You can get more information and register at

The program is as follows (in parallel with each session there will also be an Open Space discussion):

8:00 AM-8:45 AM
Registration, Continental Breakfast, Introductions, Announcements

8:45 AM-9:45 AM
Keynote by Ken Schwaber: Scaled Scrum

10:00 AM-11:00 AM
The New New Product Owner

11:15 AM-12:15 PM
Practical Scaling

12:15 PM-1:30 PM
Lunch + Lunch Panel

1:30 PM-2:30 PM
Facilitating the Scrum Events: tips and tricks

2:45 PM-3:45 PM
Team Performance: The Impact of Clean Code & CD

4:00 PM-4:45 PM
Refreshments, Open Space Reports

4:45 PM-5:00 PM
Raffle and Wrapup

This is a great opportunity to ramp up on the latest on scaling Scrum. See you there!

One more component in Microsoft’s ALM vision

Brian Harry highlighted last week that Microsoft had acquired HockeyApp. What is relevant to me in relation to the Microsoft ALM vision is how this complements the Dev side with more of the Ops side. If you are not familiar with HockeyApp, it is a toolset to manage the lifecycle of released applications. From their website:

The platform for your apps.

The world's best developers rely on HockeyApp to develop the world’s best apps. Distribute beta versions on iOS, Android, Mac OS X, and Windows, collect live crash reports, get feedback from users, recruit new testers, and analyze test coverage.

I have been pointing out for a while how Microsoft differs from most of the ALM/ADLM leaders ranked by Forrester and Gartner, in that it is also heavily focused on the operations side of ALM. This acquisition is yet another building block in a complete ALM vision that few vendors have.

“Building Real-World Cloud Apps with Windows Azure” book

Last June, Scott Guthrie and team released a new book to help you understand the essence of developing for the cloud, and more. From the Microsoft Press page:

“This ebook walks you through a patterns-based approach to building real-world cloud solutions. The patterns apply to the development process as well as to architecture and coding practices.

The content is based on a presentation developed by Scott Guthrie and delivered by him at the Norwegian Developers Conference (NDC) in June of 2013 (part 1, part 2), and at Microsoft Tech Ed Australia in September 2013 (part 1, part 2). Many others updated and augmented the content while transitioning it from video to written form.”

I got the reference by email from Microsoft Press. It pointed to their announcement page, where you will notice the references to the TechEd and NDC conferences. I wanted a link to those talks, so I did a quick search and found them inside the book itself, on page 9. If you are reading on the web, the same passage in the ASP.NET online version of the book has live links to the talks.

There are a couple more video references in chapter 14.

This book is a must-have reference on how to transition to a fully managed application lifecycle in all its aspects, including operations; it is not just another book about Azure software development.

Interview with the ALM Rangers core team at the MVP Summit

I had the opportunity to be interviewed by the ALM Rangers core team (Anisha Pindoria) during the 2015 MVP Summit in Redmond. The interview is at [21:11] in MVP Summit Day 1: Clementino De Mendonca. You can also get perspectives from others in the same video.

Watching the other Rangers provided a glimpse of their different backgrounds, activities, and areas of focus. It was very enriching, and it felt good to know I am in the company of so many bright people.

Resolution: Error "Could not load file or assembly 'file://\\server\path\file.dll' or one of its dependencies. (0x80131515)


A call to a .NET Framework assembly on a network folder fails with the error:

"Could not load file or assembly 'file://\\server\path\file.dll' or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)


In this particular case, I wanted to call an assembly containing tests that had been deployed by Team Build to a shared folder, and then run them with MSTest from the command line.

MSTest behaved differently depending on whether it was called from a Windows Server 2008 R2 Standard machine or a Windows 8 client. Using the following command line:

MSTest /testcontainer:\\server\path\file.dll /detail:debugtrace /detail:traceinfo

  • From the Windows 8 client, running as a local administrator: the command works as expected. It loads the tests and executes them.
  • From Windows Server 2008 R2 Standard, running as a local administrator: the command fails with the following message:

Could not load file or assembly 'file://\\server\path\file.dll' or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)

Further troubleshooting using 1) a mapped drive, 2) PowerShell, and 3) the /testcontainer parameter between quotes showed that all of these also fail under Windows Server 2008 R2 Standard.

The error “Could not load file or assembly” (0x80131515) is a catch-all error. For instance, it is also reported when the exe or dll was downloaded from an unsafe zone. That case is fixed by right-clicking the assembly and clicking “Unblock” on the General tab of its properties. Sometimes the “Unblock” option won’t show, and in this case it wasn’t showing.

Blocking is based on a Zone.Identifier alternate data stream that is added to a file when it is copied from the Internet. To validate that a file is actually blocked, display this Zone.Identifier stream using the following command (note the direction of the “<” sign – redirecting in the opposite direction would overwrite the stream instead of displaying it):

more < file.dll:Zone.Identifier

If ZoneId = 3 or 4, your file is blocked. In our case, the files showed no alternate Zone.Identifier stream.
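If the stream does turn out to be present and you want to clear it, PowerShell 3.0 and later offers a convenient alternative to the more command (a side note only; it was not needed in this case since the files were not blocked, and the UNC path below is just a placeholder):

# Show the Zone.Identifier alternate data stream, if any (errors if the file is not blocked)
Get-Item -Path \\server\path\file.dll -Stream Zone.Identifier

# Remove the mark, equivalent to clicking "Unblock" on the file's General properties tab
Unblock-File -Path \\server\path\file.dll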

Since copying the files to a local folder on the Windows Server 2008 R2 Standard machine allowed MSTest to execute the tests, the issue was constrained to some security policy difference between Windows 8 and Windows Server 2008 R2 Standard. This also confirmed that the files were not blocked.

As the next step, I looked into the .NET Framework security policy configuration. Using the instructions from this Microsoft blog post, I modified the CAS policy settings with the following command:

CasPol.exe -m -ag 1.2 -url file://\\server\path\file\* FullTrust

This added the following entry to the CAS policy (visible via CasPol.exe -l):

   1.2.  Zone - Intranet: LocalIntranet
      1.2.1.  All code: Same site Web
      1.2.2.  All code: Same directory FileIO - 'Read, PathDiscovery'
      1.2.3.  Url - file://\\server\path\file\*: FullTrust
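As an aside, a code group added this way can be removed again with CasPol if you decide to back the change out; assuming it kept the 1.2.3 label shown in the listing, a command along these lines should undo it:

CasPol.exe -m -rg 1.2.3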

Contrary to what I expected, this did not resolve the issue, but the following error started to be reported in the Event Log:

Log Name:      Application
Source:        VSTTExecution
(MSTest, PID 2000, Thread 1) AssemblyEnumerator.EnumerateAssembly threw exception: System.IO.FileLoadException: Could not load file or assembly 'file://\\server\path\file.dll' or one of its dependencies. Operation is not supported. (Exception from HRESULT: 0x80131515)

File name: 'file://\\server\path\file.dll' ---> System.NotSupportedException: An attempt was made to load an assembly from a network location which would have caused the assembly to be sandboxed in previous versions of the .NET Framework. This release of the .NET Framework does not enable CAS policy by default, so this load may be dangerous. If this load is not intended to sandbox the assembly, please enable the loadFromRemoteSources switch. See for more information.

By following the link recommended in the log, I got more information on the issue and how to solve it with the configuration switch loadFromRemoteSources.

From MSDN documentation:

“In the .NET Framework version 3.5 and earlier versions, if you loaded an assembly from a remote location, the assembly would run partially trusted with a grant set that depended on the zone in which it was loaded. For example, if you loaded an assembly from a website, it was loaded into the Internet zone and granted the Internet permission set. In other words, it executed in an Internet sandbox. If you try to run that assembly in the .NET Framework version 4 and later versions, an exception is thrown; you must either explicitly create a sandbox for the assembly (see How to: Run Partially Trusted Code in a Sandbox), or run it in full trust.

The <loadFromRemoteSources> element lets you specify that the assemblies that would have run partially trusted in earlier versions of the .NET Framework are to be run fully trusted in the .NET Framework 4 and later versions. By default, remote assemblies do not run in the .NET Framework 4 and later […]. If you set enabled to true, remote applications are granted full trust.

If <loadFromRemoteSources> enabled is not set to true, an exception is thrown under the following conditions:

  • The sandboxing behavior of the current domain is different from its behavior in the .NET Framework 3.5. This requires CAS policy to be disabled, and the current domain not to be sandboxed.
  • The assembly being loaded is not from the MyComputer zone.”

And, in the same article:

In the .NET Framework 4.5, assemblies on local network shares are run as full trust by default; you do not have to enable the <loadFromRemoteSources> element.

This part of the documentation also shows why it worked on Windows 8: .NET Framework 4.5 is the default for Windows 8.

For validation purposes, I originally applied the recommended setting to machine.config, which is too broad a scope. Later I tested with just MSTest.exe.config, which is the narrower scope I recommend, and it also worked.


[Verified] Add the following entry to the MSTest.exe.config file (at C:\Program Files (x86)\Microsoft Visual Studio 11.0\Common7\IDE):

      <loadFromRemoteSources enabled="true"/>
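For context, the entry belongs under the runtime section of the config file; here is a minimal sketch of how it sits inside MSTest.exe.config (the real file already contains other settings, so only the single element needs to be added):

      <configuration>
        <runtime>
          <!-- Treat assemblies loaded from remote locations (such as network shares) as fully trusted -->
          <loadFromRemoteSources enabled="true"/>
        </runtime>
      </configuration>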


[According to documentation] Use .NET Framework 4.5.
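A quick way to tell whether a given machine already has 4.5 or later is to check the Release value of the Framework setup key (a value of 378389 or higher means .NET Framework 4.5 or later; if the value is missing, the machine only has 4.0 and needs the loadFromRemoteSources switch or a Framework upgrade):

reg query "HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full" /v Release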

Gartner: Visual Studio and TFS in the Leaders quadrant again

Brian Harry highlighted in his latest post how Gartner had included Visual Studio and TFS in the Leaders quadrant for the second year in a row! Here is a picture of it:


I had never heard of Soasta, Tricentis, Test Plant, or Original Software; I will have to check them out.

This report, along with the other one on ADLM tooling, makes for an interesting analysis: Microsoft has an edge over the ADLM quadrant leaders in that some of the others (Rally, for instance) do not even show up here. I will come back to that later.

Understanding new trends on source control format

Last month I attended a nice presentation at the Austin TFS User Group on migrating from the TFVC format to TFS Git using Git-TFS. It was a very useful talk, the kind that makes you want to research more afterwards.

However, one of the first slides of the talk showed a comparison using Google Trends, which was used as a justification for moving to Git: according to it, TFVC was going nowhere, whereas Git was exploding in adoption. (Notice that the numbers are not absolute; they are normalized. For instance, if you chart just TFS and Mercurial, the number shown for TFS will be different.)




I had run into skewed statistics in the past, so I waited until the Q&A at the end to ask a few questions about it.

I pointed out that since we were seeing data filtered through Google search, it did not take into account that most users of Microsoft tools go to the MSDN site first and search from there (or just press Help in Team Explorer), so some of the millions of people who have MSDN and use TFVC were not being represented.

Also, Git has many client providers, from Xcode to TortoiseGit. Most of them are OSS with “best-effort” support, meaning “if you have an issue, search the Internet for a solution”. So in a way this trend curve also reflects where the information sources for Git-related topics live: they are fragmented, so you have to go through a search engine to collect them all for research.

The most important data point I observed was that the TFVC curve (numbers aside) was pretty much stable. So if TFVC interest was flat, where was the Git interest growth coming from?

I recalled that around 2001 I had seen something similar about Linux adoption on the desktop: some pointed out how it was growing so fast that it would soon overtake Windows. That growth curve looked similar to this one: steady but slightly rising for Windows, and steeper for Linux, showing Linux overtaking Windows in the forecast.

Time showed that the market shares were not changing that much, so where was the Linux growth coming from? Sun took too long to realize that it was their own Unix, Solaris (and for that matter, all the Unix variants), that was being cannibalized as people moved to Linux. Later Sun made Solaris open source, but it was already too late: most Unix users had converted to Linux.

If we use “searches” as a proxy for “interest”, and interest as a proxy for market share, then with Git growing and TFVC stable, what was the “Solaris” equivalent that was being replaced?

After I explained my point of view, the presenter went online and added Mercurial to the graph… there is surely a downward trend in Mercurial queries, but that does not explain how Git searches were growing:



I mentioned Subversion to him but we didn’t have time to try it out so I continued from home.

I then added Subversion, listed as “Apache Subversion Revision Control System”, to the graph, and voilà, the puzzle was solved: the new trend graph confirmed that the open source community (and, for that matter, Microsoft TFS users as well) is readjusting its preferences. Git is now as popular as Subversion was in the 2007-2009 time frame. The growth of interest in Git is largely explained by the winding down of searches on Subversion and Mercurial, plus probably some TFVC users who have migrated to TFS Git:


I then continued my research by looking at how interest developed for other source control systems, starting with the all-time grandfather of many of them, CVS. Extrapolating its curve backwards, you can see that it was very popular but was itself run over by Subversion around 2005:


I then tried others: Rational ClearCase, Rational Jazz Source Control, and Perforce. Of these, only Perforce showed a small but steady curve. ClearCase is now reduced to a trickle, and Jazz Source Control did not even show up. Finally, I had to see whether any record of Visual SourceSafe still existed, and as expected its curve dwindled after 2005:


As a final experiment I tried the “Forecast” feature, which seems to trace a simple extrapolation based on the data so far (the forecast starts at the marked points in the chart below). The extrapolation confirmed the steadiness of the TFVC curve and the ongoing dwindling of Subversion as it is overtaken by Git, like all the other open source version control systems:


So in conclusion:

- Git will become to open source version control systems what Linux is to open source Unix-like systems;

- TFVC will remain stable for the foreseeable future, with users of TFS-git adding to the number of git adopters;


- You can use statistics to justify any point of view, so be on the lookout for any inadvertently skewed perspectives;

- Do not just accept the data, think about it too – logic will help you in finding the hidden aspects (the “Solaris”) of the question;

If you have read so far and have a different perspective, please let me know what your thoughts are on this.

Scaled Agile Framework: Using TFS to support epics, release trains, and multiple backlogs whitepaper

The SAFe whitepaper and download were just launched today. See the announcement by Greg Boer at the MSDN ALM blog.

This was the result of many hours of internal contribution by the ALM Rangers.

The whitepaper provides both a high-level view of how SAFe is realized using TFS and the detailed configuration and customization steps.

In addition to the whitepaper, this release includes a download of the Visual Studio out-of-box process templates, with SAFe related customizations already made: Team Foundation Server 2013 Process Template Samples - Support for Scaled Agile Framework (SAFe).

This pretty much answers Gartner’s concerns in their latest ADLM state-of-the-industry report, and shows a quick turnaround from Microsoft in addressing them.

Issue: updating Field AllowedValues that differ only by casing


Say you created a field and by mistake, there was a typo in one of the allowed values:

<FieldDefinition name="FieldToTestLowerUpperCase" refname="Custom.FieldToTestLowerUpperCase" type="String">
   <ALLOWEDVALUES expanditems="true">
     <LISTITEM value="Out of scope" />
     <LISTITEM value="Value 1" />
     <LISTITEM value="Value 2" />

What you really wanted was “Out of Scope”, not “Out of scope”:

<FieldDefinition name="FieldToTestLowerUpperCase" refname="Custom.FieldToTestLowerUpperCase" type="String">
   <ALLOWEDVALUES expanditems="true">
     <LISTITEM value="Out of Scope" />
     <LISTITEM value="Value 1" />
     <LISTITEM value="Value 2" />

Using the Process Editor, even if you modify the value and republish, the casing does not change.


I was able to replicate this scenario with the Process Editor (from the TFS Power Tools) within VS 2013 Update 3 and TFS 12.0.30723.0 (Tfs2013.Update3).

I deleted the field using the Process Editor to take it out of a custom Task work item type, and then deleted it from the collection with witadmin on the command line:

witadmin deletefield /collection:http://<tfsserver>:8080/tfs/<yourcollection> /n:Custom.FieldToTestLowerUpperCase

Then I re-added it using the Process Editor with the right casing (“Out of Scope”) and told it to rebuild the client cache (“witadmin rebuildcache”). It still did not work: the field kept the same value.

I applied a simple change (adding an extra space between “of” and “Scope”) and saved it, and the new value had the correct casing plus the extra space (“Out of  Scope”). Then I modified the field again to use just a single space and rebuilt the cache, but it returned to lowercase (“Out of scope”).

To see whether it was a bug in the Process Editor, I repeated all the operations using just witadmin at a command prompt. It still did not work: even after an update, retrieving the work item definition would still show the word “scope” in lowercase.
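For reference, the witadmin-only round trip looked roughly like the following: export the type definition, fix the ALLOWEDVALUES casing in the exported XML, import it back, and rebuild the cache (the collection URL and project name are placeholders, and the work item type is the custom Task mentioned above):

witadmin exportwitd /collection:http://<tfsserver>:8080/tfs/<yourcollection> /p:<project> /n:Task /f:Task.xml
witadmin importwitd /collection:http://<tfsserver>:8080/tfs/<yourcollection> /p:<project> /f:Task.xml
witadmin rebuildcache /collection:http://<tfsserver>:8080/tfs/<yourcollection>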

This value was cached somewhere, and not being able to update it is definitely a bug. By looking into the Fields table I confirmed that nothing is ever really deleted, only marked as deleted, and most likely the old row is reused when the value is reinserted. In addition, when a field’s AllowedValues list is changed, the import (whether through the Process Editor, witadmin, or the API) does not consider casing when checking whether a value needs to be updated.


I found the “Out of scope” value in the TFS Constants table (within the collection database):


SELECT PartitionId, ConstID, DomainPart, fInTrustedDomain, NamePart, DisplayPart, String,
       ChangerID, AddedDate, RemovedDate, SID, Cachestamp, ProjectID, TeamFoundationId, fReferenced
FROM   Constants
WHERE  (DisplayPart = 'Out of scope')
ORDER BY DisplayPart

Next I manually updated it to “Out of Scope”, and refreshed. This fixed the issue.
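For completeness, the manual update amounted to something like the following statement against the collection database (a sketch only: I am assuming the miscased value needs to change just in the DisplayPart and String columns of that single row, so check what the SELECT above returns before changing anything):

-- Direct table edit: unsupported, only on a non-production installation (see the warning below)
UPDATE Constants
SET    DisplayPart = 'Out of Scope',
       String      = 'Out of Scope'
WHERE  DisplayPart = 'Out of scope'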

ATTENTION: Do this at your own risk, as modifying TFS tables directly is neither recommended nor supported, and might put your database in an unsupported state. I tested this on a sample TFS installation, which is not in production.

I only provided this workaround as a last resort, and because it was a simple enough update of a string value. A better, supported path would be to open a case with Microsoft support using your MSDN incidents and have it escalated to the product team as a bug (I might also open a bug on Connect later, and will post the link here).

A few pointers on how to use Delphi applications with Coded UI

To use Delphi-based UIs with Coded UI tests, you need to implement the MSAA interface for each component that you want to be visible to Coded UI. Example implementations:

- TEdit

- TreeView

The Coded UI extensibility framework works mostly with MSAA-compliant applications. However, if you cannot get to the Delphi source code to enable MSAA, you will have to make do with the plain Windows Win32 support.

Is it possible to build a plug-in or add-on in .NET, using the Coded UI extensibility, for identifying the native properties of Delphi (VCL) UI controls (like id and control name)? As mentioned before, it is the UI control itself that has to expose MSAA-compliant properties to be visible; that is, the TEdit or TForm needs to implement them. However, the documentation on how to use Coded UI with Silverlight states the following:

“To test your Silverlight applications, you must add Microsoft.VisualStudio.TestTools.UITest.Extension.SilverlightUIAutomationHelper.dll as a reference to your Silverlight 4 application so that the Silverlight controls can be identified. This helper assembly instruments your Silverlight application to enable the information about a control to be available to the Silverlight plugin API that you use in your coded UI test or is used for an action recording.”

If I understand this correctly, it might be possible to do the same for Delphi .NET (CLR) applications at the assembly level, although I have not seen any reference implementation of this. For applications compiled to native code you would have to go to the source, as explained above.

