CloudShare Explorer for Visual Studio 2012

For those who don’t know about CloudShare – it’s a subscription-based virtual server hosting environment along the lines of Amazon EC2 or Azure, with a key twist: it provides pre-configured virtual machines that specifically focus on SharePoint and related technologies. We use it for development at itgroove when we need to do high-level development, as a complement to local Hyper-V development images.

I finally took their CloudShare Explorer extension for Visual Studio 2012 for a spin – it’s pretty sweet:

Allow developers to spend more time coding and less time thinking about the tools they use. The CloudShare Explorer allows developers to access CloudShare development and testing environments without leaving Visual Studio. With CloudShare Explorer for Visual Studio, you can easily create and access your cloud-based labs.

• Complete list of your CloudShare environments in the CloudShare Explorer tab. The list is continuously updated with the status of the environments.
• Full screen remote access to the CloudShare machines with just a click of a mouse.
• Resume your lab environments.
• Revert the environment, or a specific machine, to the latest snapshot (only works with a CloudShare TeamLabs account).
• Web access to your lab web servers within Visual Studio.
• Add a machine to an environment (only works with a CloudShare TeamLabs account).
• Delete a machine in an environment (only works with a CloudShare TeamLabs account).
• Prevent CloudShare environments from being suspended if the user is still working on them.
• Highly customizable – you can turn on/off most of the functionality of the plugin as needed.

Config Steps:

Download for Visual Studio 2013

Step 1: Download and install the Visual Studio Extension

Step 2: Log in to the CloudShare website and generate your API credentials under My Account > API Credentials


Step 3: Add your API Credentials into the CloudShare Explorer interface in Visual Studio:

Step 4: That’s it! You can now work with your CloudShare environments directly in Visual Studio!

SharePoint 2013 Search Query Tool

Found a nice little GUI for visualizing, assembling and debugging SharePoint 2013 Search API queries:

Use this tool to test out and debug search queries against the SharePoint 2013 Search REST API.

Learn how to build an HTTP POST query, and how the different parameters should be formatted.

After running the query, you can view all types of result sets returned: Primary Results, Refinement Results, Query Rules Results, and Query Suggestions, in addition to the actual raw response received from the Search service.

Can be used to query both SharePoint 2013 on-premises and SharePoint Online.
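If you want to assemble the same kind of query by hand, the POST body the tool helps you build is just JSON sent to the `/_api/search/postquery` endpoint. A minimal sketch in Python (the query text and property names are made-up examples; the payload shape follows the SharePoint 2013 Search REST API):

```python
import json

def build_search_post_body(query_text, row_limit=10, select_properties=None):
    """Build the JSON body for a SharePoint 2013 Search REST POST query
    (sent to /_api/search/postquery with an 'odata=verbose' Accept header)."""
    request = {
        "Querytext": query_text,
        "RowLimit": row_limit,
        "TrimDuplicates": True,
    }
    if select_properties:
        # Managed properties to return, wrapped in the verbose-OData shape
        request["SelectProperties"] = {"results": select_properties}
    return json.dumps({"request": request})

body = build_search_post_body("contenttype:Document author:Tuomi",
                              row_limit=5,
                              select_properties=["Title", "Path", "Author"])
print(body)
```

Running the query tool with logging on and comparing its payload to one like this is a quick way to learn how the different parameters should be formatted.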

More about the Search REST API in SharePoint 2013 on the author’s blog: and on TechNet.


Resolving Nintex Feature Activation issues after migration

Issues have been reported with data migration tools used to transfer content from a SharePoint 2010 environment to a SharePoint 2013 environment. The tools are preventing the Nintex Workflow site collection feature from activating in the 2013 environment.


Site Collection A is in a SharePoint 2010 environment with the Nintex Workflow 2010 feature activated.

Site Collection B is in a SharePoint 2013 environment without the Nintex Workflow 2013 feature activated.

A data migration tool is used to move data from a team site in Site Collection A to a team site in Site Collection B.


Some data migration tools detect that a site column or content type from Site Collection A is not present in Site Collection B, and then recreate these assets at the target team site. In this case, they will recreate a Nintex Workflow-installed content type and site column.

Later, when a user attempts to activate the Nintex Workflow site collection feature, it will fail. The message in the SharePoint logs will be similar to:

The field with Id {c2dd77c1-89a4-4f1f-b037-c17407e9922c} defined in feature {0561d315-d5db-4736-929e-26da142812c5} was found in the current site collection or in a sub site.

In other words, the data migration tool has forcibly created a column somewhere in the site collection that the feature activation cannot overwrite, preventing it from activating.

Avoiding the issue

This issue will not occur if the Nintex Workflow 2013 site collection feature is already active on the target site collection prior to migration.

Fixing the issue

Option 1: Activate the Feature with PowerShell and use the Force

From an administrative SharePoint PowerShell session:

Put the folder name of the feature in question in the “Identity” parameter. You can find the feature folders under:

SharePoint 2010:
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\14\TEMPLATE\FEATURES

SharePoint 2013:
C:\Program Files\Common Files\microsoft shared\Web Server Extensions\15\TEMPLATE\FEATURES


Enable-SPFeature -Identity "NintexWorkflowInfopath" -Url <site collection URL> -Force

Option 2: Feature Site Column Cleanup Tool

In response to this issue, Nintex developed a utility to clean up any site columns and content types that exist in a sub site in a site collection.

This tool recursively searches every site in a site collection, checking for all Nintex Workflow content types and site columns, and optionally attempts to delete these assets.
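Conceptually, that search is a recursive walk of the site tree. An illustrative sketch in Python (this is not the actual Nintex utility; the nested-dict site model and the single example ID prefix are made up for demonstration):

```python
# Illustrative only -- not the real Nintex tool. A site collection is modeled
# as a nested dict; we walk it recursively, reporting (and optionally
# deleting) content types whose ID starts with a Nintex-style prefix.
NINTEX_PREFIX = "0x0101EF0201"  # one example prefix; the real tool checks several

def find_assets(site, delete_found=False):
    """site: {'url': str, 'content_types': [ids], 'subsites': [sites]}"""
    found = []
    for ct in list(site["content_types"]):  # copy so we can delete while iterating
        if ct.startswith(NINTEX_PREFIX):
            found.append((site["url"], ct))
            if delete_found:
                site["content_types"].remove(ct)
    for sub in site["subsites"]:
        found.extend(find_assets(sub, delete_found))
    return found

root = {"url": "http://sharepoint/sites/portal", "content_types": [],
        "subsites": [{"url": "http://sharepoint/sites/portal/teamsite",
                      "content_types": ["0x0101EF0201AB", "0x0101003F00"],
                      "subsites": []}]}

hits = find_assets(root, delete_found=True)
print(hits)  # [('http://sharepoint/sites/portal/teamsite', '0x0101EF0201AB')]
```

The real tool does the equivalent against the SharePoint object model, with the `-deleteFound` switch playing the role of the `delete_found` flag.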

It will print the location of any found assets to the console window.

Once the assets have been removed, the site collection feature will activate correctly.

The tool is available from

It should be downloaded to a SharePoint server and run from a command console window as an account that has access to modify all sites in the site collection.

Depending on the number of subsites in the site collection, the tool may take some time to run. The tool will print “Process complete” to the console window when it has finished processing.


FeatureSiteColumnCleanup.exe "siteCollectionUrl" [-deleteFound] [-includeRootWeb] [-searchUsage] [-skipFields] [-skipContentTypes]

siteCollectionUrl: The URL to the top level site of the site collection that requires processing.

-deleteFound: If this argument is included, any found content types or site columns will be deleted. If this argument is not included, the tool will only report the location of these assets.

-includeRootWeb: If this argument is included, the root team site will be included in the search for assets. Please note that if the Nintex Workflow feature has activated correctly, the assets will exist at the root team site and should not be deleted.

-searchUsage: Specifies whether each list on each site should be checked to see if it uses a Nintex Workflow content type.

-skipFields: Specifies whether the process should skip checking for field assets.

-skipContentTypes: Specifies whether the process should skip checking for content type assets.


FeatureSiteColumnCleanup.exe "http://sharepoint/sites/portal" -deleteFound


Option 3: SQL Method aka “Hammer of Thor”

Warning: Unsupported by Microsoft and most likely Nintex!

Identify the Nintex-related content type GUIDs and update their “IsFromFeature” property to 0 in the relevant content databases.

Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x0101EF0201%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x010801005CC0%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x0108010064E42%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x0108010079DBDE6%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x01010024055%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x010100F815D979%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x010100F8376F531%')
Update [dbo].[ContentTypes] set [IsFromFeature] = 0 where (sys.fn_varbintohexstr(ContentTypeId) LIKE '0x010100240555%')
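Since the statements differ only in the ID prefix, they can be generated rather than hand-edited. A small Python sketch (the prefix list is copied from the statements above, duplicates dropped; treat the whole thing as illustrative, not authoritative):

```python
# Hypothetical helper: generate the UPDATE statements for each Nintex
# content-type ID prefix instead of hand-editing eight near-identical copies.
NINTEX_CTYPE_PREFIXES = [
    "0x0101EF0201", "0x010801005CC0", "0x0108010064E42",
    "0x0108010079DBDE6", "0x01010024055", "0x010100F815D979",
    "0x010100F8376F531", "0x010100240555",
]

def build_updates(prefixes):
    """Return one UPDATE statement per content-type ID prefix."""
    template = ("UPDATE [dbo].[ContentTypes] SET [IsFromFeature] = 0 "
                "WHERE sys.fn_varbintohexstr(ContentTypeId) LIKE '{0}%'")
    return [template.format(p) for p in prefixes]

for stmt in build_updates(NINTEX_CTYPE_PREFIXES):
    print(stmt)
```

Again: directly updating SharePoint content databases is unsupported, so this is strictly a last resort.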


Blog Reader for SharePoint MVP’s Windows Phone 8 App

The Blog Reader for SharePoint MVP’s App grabs the latest blog posts from all known SharePoint MVP blogs and surfaces them in one handy interface. Get the latest insider info from your favorite SharePoint MVPs, anywhere, anytime!

Get the FREE App now


If you are a SharePoint MVP and blogger and would like to add or update your blog, just email me and I will add you.

As fuel for this App, I created a Yahoo Pipes mashup of all the SharePoint MVP blog feeds I could find after a couple hours of research.

About Yahoo Pipes:
“Pipes is a powerful composition tool to aggregate, manipulate, and mashup content from around the web.

Like Unix pipes, simple commands can be combined together to create output that meets your needs:

    • combine many feeds into one, then sort, filter and translate it.
    • geocode your favorite feeds and browse the items on an interactive map.
    • power widgets/badges on your web site.
    • grab the output of any Pipes as RSS, JSON, KML, and other formats.”

I simply add the desired blog RSS feed URLs (10 is the limit per widget), pipe them into a UNION, then sort by date descending, and output the whole thing as a consolidated RSS feed with a Feedburner front end on it:
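Outside of Pipes, the same UNION-then-sort operation is a few lines in any language. A sketch in Python with made-up feed items:

```python
from datetime import datetime

def union_and_sort(*feeds):
    """Merge several feeds (lists of item dicts with a 'pubDate') into one,
    sorted by publish date descending -- what the Pipes UNION + Sort does."""
    merged = [item for feed in feeds for item in feed]
    merged.sort(key=lambda item: item["pubDate"], reverse=True)
    return merged

feed_a = [{"title": "Post A1", "pubDate": datetime(2013, 9, 1)}]
feed_b = [{"title": "Post B1", "pubDate": datetime(2013, 9, 5)},
          {"title": "Post B2", "pubDate": datetime(2013, 8, 20)}]

combined = union_and_sort(feed_a, feed_b)
print([item["title"] for item in combined])  # ['Post B1', 'Post A1', 'Post B2']
```

In the real pipeline, each item would also carry its link and source blog so the consolidated feed stays attributable.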


Output is the combined RSS feeds of all the SP MVP’s involved:


And finally, in this scenario, I’m consuming the feed via the Windows Phone 8 App I’ve created, previewed below in the super awesome Hyper-V / Visual Studio Windows Phone 8 Emulator (thanks to my colleague Colin for schooling me up on that!):

The App itself is free and can be found in the Windows Phone 8 Store by searching for “Blog Reader for SharePoint”, either via the website or, of course, via the Store on your Windows Phone 8 device!

Submissions/edits are most welcome – if you know of any more active SharePoint MVP bloggers, please let me know!

Visual Studio 2013 now available to MSDN Subscribers

Visual Studio 2013 is now available for MSDN subscribers! Public launch is Nov. 13.

What’s new in VS 2013:

For developers and development teams, Visual Studio 2013 easily delivers applications across all Microsoft devices, cloud, desktop, server and game console platforms by providing a consistent development experience, hybrid collaboration options, and state-of-the-art tools, services, and resources.


Below are just a few of the highlights in this release:

• Innovative features for greater developer productivity: Visual Studio 2013 includes many user interface improvements; there are more than 400 modified icons with greater differentiation and increased use of color, a redesigned Start page, and other design changes.
• Support for Windows 8.1 app development: Visual Studio 2013 provides the ideal toolset for building modern applications that leverage the next wave in Windows platform innovation (Windows 8.1), while supporting devices and services across all Microsoft platforms. Support for Windows Store app development in Windows 8.1 includes updates to the tools, controls and templates, new Coded UI test support for XAML apps, UI Responsiveness Analyzer and Energy Consumption profiler for XAML & HTML apps, enhanced memory profiling tools for HTML apps, and improved integration with the Windows Store.
• Web development advances: Creating websites or services on the Microsoft platform provides you with many options, including ASP.NET WebForms, ASP.NET MVC, WCF or Web API services, and more. Previously, working with each of these approaches meant working with separate project types and tooling isolated to that project’s capabilities. The One ASP.NET vision unifies your web project experience in Visual Studio 2013 so that you can create ASP.NET web applications using your preference of ASP.NET component frameworks in a single project. Now you can mix and match the right tools for the job within your web projects, giving you increased flexibility and productivity.

We also want to let you know that Visual Studio 2013 is no longer embedding the product key in the bits found on Subscriber Downloads. Once you launch the product, you can simply sign in with the Microsoft account associated with your MSDN subscription and your IDE will automatically activate. An added benefit of signing in to Visual Studio is that your IDE settings will sync across devices, and you can connect to online developer services. If you’d rather enter a product key, then simply bypass signing in and enter the product key found on Subscriber Downloads by selecting Register Product under the Help menu.

Download Visual Studio 2013 today and use the software, services, and benefits of your MSDN subscription to achieve your software development goals with greater ease and agility.

Lastly, please save the date for the Visual Studio 2013 Launch on November 13th as you’re cordially invited to join us online from virtually anywhere using your PC or mobile device. Watch the keynote live and see how developers like you are achieving their goals today with Visual Studio 2013. Then watch a comprehensive collection of expert product demonstrations from the Visual Studio development team in on-demand sessions.


Deleted documents version history behaviour in SharePoint Document Sets

Someone asked me why documents that are deleted from Document Sets are apparently not recoverable from the Document Set’s version history.

It is a bit odd at first glance – it seems like version history should be version history. The official MS description of how to work with versioning in Document Sets is here: More on using the “Capture Version” feature at and

When I add two documents, do a Capture Version, delete one of the documents, and then do a second “Capture Version”, the deleted document is definitely gone:


When you go to restore that document set, it warns you that the file is gone:



Essentially this behavior is “by design”, as dealing with document deletes is outside its functional intent. Document Sets don’t have any special ability to overcome the absence of a document – deletions follow the same recycle bin > delete lifecycle in Document Sets as well.

More on other limitations of document sets here:

I would be surprised if this out-of-the-box SharePoint behaviour would ever change. The core problem is that if I delete a document and it then runs through the first- and second-stage recycle bins, it is most definitely gone – deleted from the SP DB. So imagine if we made a special exception that documents in a Document Set kept their version histories despite what would normally be a hard delete.

This would make life very confusing for the DB/backup admin as you would essentially have orphaned documents that could never be deleted unless the associated Document Sets were deleted.  I’m not sure how that could all be set up in a sensible way.

As the person asking about this scenario found, you would have to come up with custom code to work around this, or move to a more advanced records management/archive solution. One thing that could be done is to write a feature that fires an event receiver on a document delete event, and then execute the code found under “Download the Document Set as a .ZIP file”. The idea would be to preserve the documents from deletion by downloading the whole document set at that point in time and storing it somewhere else in SharePoint.

Thanks to Stefaan from Belgium for getting me onto the topic.

Extract DLL assemblies from the GAC via Mapped Drive

Sometimes we need to dig up DLL files from the depths of the Global Assembly Cache for development purposes. Visual Studio project references may be missing, and you may need those .DLLs to get that sucker to compile. Viewing c:\windows\assembly in Explorer presents a different experience than we normally get – this is for good reason, so you don’t eviscerate your Windows system unintentionally.

For years, I’ve just used the good ol’ DOS command line to copy out the DLLs needed individually. This is barbaric, as the folder names are usually long strings of numbers and build versions – hard to type. Just today I learned a super simple way to access the DLLs through the standard Explorer view:

• Map a Network Drive (Explorer -> Tools)
• Map to \\servername\folder (\\YourServer\C$\Windows\Assembly)
• No need for sharing if you are the Administrator
• Browse to the drive and the specific subfolder you want, and extract your assembly


Symptoms of Pathological Science & Technical Problem Solving

Irving Langmuir had an interesting career in science. He made countless discoveries and inventions, including the diffusion pump, atomic hydrogen welding, submarine detection devices and the gas-filled incandescent light bulb, and even coined the word “plasma”. What is really interesting for me, however, is that in 1953 he coined the term “pathological science”, describing research conducted in accordance with the scientific method, but tainted by unconscious bias or subjective effects. This is in contrast to pseudoscience, which has no pretense of following the scientific method. In his original speech, he presented ESP and flying saucers as examples of pathological science; since then, the label has been applied to polywater and cold fusion.

As a side-effect of all his right-on inventions and amazing science, he also excelled at keeping an eye open for scientists who had unconsciously broken with the scientific method. Langmuir described it as: “These are cases where even when no dishonesty was involved, people were tricked into false results by a lack of understanding about what human beings can do to themselves in the way of being led astray by subjective effects, wishful thinking, or threshold interactions.”

He gave a lecture in 1954 where he proposed a list of “symptoms of pathological science”:

1. The maximum effect that is observed is produced by a causative agent of barely detectable intensity, and the magnitude of the effect is substantially independent of the intensity of the cause.
2. The effect is of a magnitude that remains close to the limit of detectability; or, many measurements are necessary because of the very low statistical significance of the results.
3. Claims of great accuracy.
4. Fantastic theories contrary to experience.
5. Criticisms are met by ad hoc excuses thought up on the spur of the moment.
6. Ratio of supporters to critics rises up to somewhere near 50% and then falls gradually to oblivion.

Now what does this have to do with us techies? Coming into a typical SharePoint, IIS, ASP .NET or indeed any other technical issue with many actors and moving parts, I find that the notion of pathological science is really something to watch out for. There are a few key risky neighbourhoods around some of the harder IT issues when they involve parachuting in to a lot of unknowns.

A major contributor to pathological science in the IT realm these days is, in my opinion, the “Let me Google/Bing that for you” effect. As far as I can tell, the world did not go off its axis when Google went down globally for a period; global internet traffic dipped a bit, and some people might have been induced to try an alternate approach or go take a break.

Although it’s a bit cynical to paint search engines with a broad brush, as they give out so much information freely, there are schools of thought that propose search engines make people dumber. :) At the least, people such as, let’s say, Irving Langmuir managed to crank out stunning inventions without such aids.

In the Microsoft world there are generally enough technical documents, forums, blogs, snippets and personal experiences that one can rapidly use a search engine to zero in on “a” fix for a particular symptom, but the problem is that these channels need to be vetted to exclude all the following critical factors as complications:

- software versions
- software interdependencies
- personal opinions
- known bugs
- unknown bugs
- known unknown bugs
- hardware
- networking
- OS
- end user or external system interaction patterns
- … and so forth

When trying to isolate a cause for these types of issues, it’s important to stick to the patterns of these old-school science greats. While today’s IT sector may seem almost childlike compared to what these scientists dreamed up in their heads with no acronyms or decades of progress to back them up, the silos of logic that we have created around modern code mask huge underlying complexities. The problems we encounter daily are normally not a matter of fitting a square peg into a round hole – it’s that we have a bunch of pegs of various diameters, many of which will fit the hole despite not being a true fit for the problem.

The cure for having many “so-so” answers to an issue and no definitive “right” answer is to fall back on experience, reason, research, and peer review.

Oh, and what became of Irving Langmuir? Well, he went on in his later career to pitch the concept of controlling weather via cloud seeding, so that humans could spawn rain clouds and such – with huge potential for agriculture and, of course, military uses. Unfortunately, he wanted so much to believe his weather-spawning solution worked that he became an ironic victim of his own pathological science.

“Utilizing his own criteria for pathology, Langmuirʼs claims for cloud seeding qualified on several counts: they rested on observations close to the threshold of detectability, on apparently meaningful patterns generated in field trials; on the inability of critics to reproduce the experiments; on the intervention of the courts, legislature, and the press; and on overreliance on the credentials of a Nobel laureate rather than proof.”

In essence, despite knowing better, he pursued the result instead of getting a result from pursuit. I think this is something that all of us involved in IT can keep in mind in our daily problem solving.

Disk defragmenting SharePoint on Virtual Machines: Performance tweak or myth?

I previously posted about the effectiveness of defragmenting guest machines (SharePoint servers) in virtualized environments here:
SharePoint on Virtual Machines – Is Disk Defragmenting necessary for Performance?

Today the ever-helpful President of PerfectDisk, Bob Nolan, wrote me to share a very interesting tidbit that he found in VMware’s vSphere documentation:

“Last week I was researching something and I found an interesting piece of documentation that might be helpful to you down the road. The VMware 5.1 doc has a section enumerating 12 things to do when you have disk latency or I/O contention issues. ‘Defragment the file system of all the guests’ is the 2nd recommendation on the list.”



“You may want to look at this too. Cormac Hogan is a VMware storage architect and this is a blog he did on how Storage I/O Control (SIOC) balances fairness and performance. The gist of the blog is that VMware kernel settings reduce the number of outstanding I/O requests any VM can have when multiple VMs are sharing a LUN. SIOC basically throttles performance to improve latency. Further, if you do sequential I/O, VMware will grant you more I/O requests.

In response, I wrote this paper, which says that if VMware is going to limit your I/O requests, then get the most from the ones you have by doing larger I/O. When you do larger I/O you tend to do sequential I/O, so you can also get the additional requests VMware is willing to grant.”

This reinforces the original conclusion: this data, combined with my decade-plus experience of seeing direct, noticeable performance improvements on IIS boxes after full defragmentation and implementation of regular defrags, on physical or virtual disks, leads me to stick to my guns: I will continue to recommend disk defragging. Would love to see if someone can change my mind…

SharePoint Hire Interview Question Matrix



Hiring staff in a highly specialized technical field such as SharePoint can rapidly turn into an inconsistent & time-consuming (expensive) affair if you don’t have the right system in place. Selecting the right candidate for a SharePoint job from a pool of applicants with largely similar backgrounds & CVs can be daunting – SharePoint encompasses, even by IT hiring standards, a huge body of knowledge and experience.

Think of all the expense & effort involved with bringing on a new SharePoint hire into your team: onboarding, setting up benefits, training, team familiarization, meetings, etc. Now imagine having to burn up all of that, if you make a less than perfect hiring decision! You stand to risk your company reputation, team trust, customer satisfaction, and so on. Most prominently, you would be right back where you started – looking for a great SharePoint expert for your team.

So What

Failure to properly screen your SharePoint hire applicants can lead to big trouble when it comes time for them to be effective in their jobs. Once you get through all the standard HR practices & processes, the question remains: “Is this guy/gal really a SharePoint pro?”. To help determine that, an extremely effective screening method is a series of technical questions combined with a scoring system.

Now What

OPTION A – Hire itgroove’s SharePoint MVPs to perform your interviews for you.

OPTION B- Purchase and download the SharePoint Hire Interview Question Matrix Excel Workbook for only $40. The itgroove SharePoint Hire Interview Question Matrix Excel Workbook is composed of two Worksheets:

1. Questions Scorecard
This Excel Worksheet contains 56 carefully researched, in-depth SharePoint hiring questions, composed from the real-world experience of itgroove’s 3 SharePoint MVPs. These questions are divided into the following categories:
– General SharePoint Knowledge
– SharePoint 2010 IT Pro
– SharePoint Development
– SharePoint 2013

A Score column is used to rate your Job Applicant’s response to the questions.

All questions include a detailed answer and background explanation for the answer – you don’t necessarily need to be a SharePoint guru yourself in order to effectively grade someone with this worksheet!

2. Personal Scorecard
The Personal Scorecard Worksheet incorporates a detailed scoring formula that rates your potential hire by the following traits:
– Amount and Quality of relevant experience
– Communication Skills in Interview
– Technical Skills Level
– Enthusiasm
– Overall Fit/Suitability for Role
– Educational Qualifications
– Evidence of Research into our Company

The general interview results for these traits are entered into the Worksheet so that an overall grade for the candidate can be calculated. This grade can then be stacked up against all your other interviewees’ results, enabling quick & fair qualification of who is most suitable for the job position.
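The idea behind such a scoring formula can be sketched as a weighted average. The trait weights below are hypothetical (the actual workbook’s weighting is not reproduced here):

```python
# Hypothetical weights per trait -- adjust to taste; they must sum to 1.0.
TRAIT_WEIGHTS = {
    "experience": 0.25,
    "communication": 0.15,
    "technical": 0.25,
    "enthusiasm": 0.10,
    "fit": 0.15,
    "education": 0.05,
    "company_research": 0.05,
}

def candidate_grade(scores):
    """Weighted average of per-trait scores (each 0-10) -> overall grade 0-10."""
    return sum(TRAIT_WEIGHTS[trait] * scores[trait] for trait in TRAIT_WEIGHTS)

grade = candidate_grade({"experience": 8, "communication": 7, "technical": 9,
                         "enthusiasm": 6, "fit": 8, "education": 7,
                         "company_research": 5})
print(round(grade, 2))  # 7.7
```

Because every candidate is graded against the same weights, the resulting numbers are directly comparable across interviewees.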

Buy Now on the itgroove Store

SharePoint Interview Questions

