Automate Sandcastle build to website with TFS

Topics: Developer Forum
Apr 30, 2011 at 1:04 AM

I currently have a Sandcastle file built as part of my TFS project in VS2010. I have also created a virtual directory on that server through IIS. What I want to happen is: when I queue a build of my project, I would like the website to be built and to overwrite the web files in the IIS virtual directory. That way, every time I queue a build after checking in my code, the website would be updated to contain all the new comments for the methods etc. I have created. Can someone please tell me how to set this up? I already have the Sandcastle file set up to build the website locally, but I have no idea where to go from there. Any and all help is appreciated.

Thanks in advance.


Apr 30, 2011 at 2:12 AM

If you have access to your website over FTP, then you could just copy the directory to the server.

There is no built-in method for recursively copying the files in a directory, so I had to write one!

Untested code below, written in five minutes, so please use a test machine. If you want, I can turn this into a small app that shows the output and has a cancel button, in case it loops over the same directory over and over, but take a look; it should be fine.

This initializes and runs the operation; all very simple.


new FTPRemote(
                IPAddress.Parse("" /* Example only */),
                "" /* If required, add your user ID */,
                "" /* If required, add your password */)
                { Directory = new Uri("Virtual/", UriKind.Relative) /* <-- you can specify your server's starting directory here */ }
                .Upload(new DirectoryInfo("C:\\Where is your directory you want to upload?")); // DONE!

Core that does the work.

    using System;
    using System.IO;
    using System.Net;

    internal sealed class FTPRemote
    {
        public FTPRemote(IPAddress ipAddress, string userId, string password)
        {
            this.IPAddress = ipAddress;
            this.UserID = userId;
            this.Password = password;
        }

        public IPAddress IPAddress { get; private set; }

        public string UserID { get; private set; }

        public string Password { get; private set; }

        public Uri ServerUri { get { return new Uri("ftp://" + IPAddress + "/"); } }

        public Uri ServerCurrentUri { get { return new Uri(ServerUri, Directory); } }

        // Relative path on the server under which files are uploaded
        public Uri Directory { get; set; }

        internal void Upload(DirectoryInfo directoryInfo)
        {
            foreach (FileInfo fi in directoryInfo.GetFiles())
                Upload(fi);

            // Recurse into each subdirectory, adjusting the remote path to
            // match, then restore it for the caller
            Uri parent = Directory;
            foreach (DirectoryInfo di in directoryInfo.GetDirectories())
            {
                Directory = new Uri(parent.OriginalString + di.Name + "/", UriKind.Relative);
                Upload(di);
            }
            Directory = parent;
        }

        private void Upload(FileInfo fileInf)
        {
            // Create the FtpWebRequest object from the Uri provided
            FtpWebRequest reqFTP = (FtpWebRequest)WebRequest.Create(new Uri(ServerCurrentUri, fileInf.Name));

            // Provide the credentials
            reqFTP.Credentials = new NetworkCredential(UserID, Password);

            // By default KeepAlive is true, meaning the control connection
            // is not closed after a command is executed.
            reqFTP.KeepAlive = false;

            // Specify the command to be executed.
            reqFTP.Method = WebRequestMethods.Ftp.UploadFile;

            // Specify the data transfer type.
            reqFTP.UseBinary = true;

            // Notify the server about the size of the uploaded file
            reqFTP.ContentLength = fileInf.Length;

            // Read and write in 2 KB chunks
            const int buffLength = 2048;
            byte[] buff = new byte[buffLength];

            try
            {
                // Open a file stream to read the file to be uploaded, and a
                // stream to which the uploaded file is written
                using (FileStream fs = fileInf.OpenRead())
                using (Stream strm = reqFTP.GetRequestStream())
                {
                    // Copy from the file stream to the FTP upload stream
                    // until the file's content ends
                    int contentLen = fs.Read(buff, 0, buffLength);
                    while (contentLen != 0)
                    {
                        strm.Write(buff, 0, contentLen);
                        contentLen = fs.Read(buff, 0, buffLength);
                    }
                }
            }
            catch (Exception ex)
            {
                throw new Exception("FTP upload of " + fileInf.Name + " failed: " + ex.Message, ex);
            }
        }
    }

Apr 30, 2011 at 2:18 AM

The Output Deployment Plug-In may also help in this situation.  It will allow you to copy the output to a UNC path, FTP, or HTTP site for each output type.  You add and configure it through the project's PlugInConfigurations property.  See the plug-in's help topic for more information.



May 1, 2011 at 1:33 AM

@SpiderMaster: I have actually written a robust FTP library in C#; if you would like, I can share the code here. I don't think it applies to what I am asking, though. Let me explain my setup; I apologize for the long-winded response.


My Team Foundation Server is set up on
IIS7 is on the same machine, with a virtual directory mapped to C:\SETL_WIKI. The correct app pool, permissions, etc. are set up on the virtual directory.
The URL for the virtual directory is

There is a solution called ETLProject.sln.
That solution is made up of 8 projects (one of them an InstallShield project, which is irrelevant).
The rest of the projects have all the proper XML comments in them to work with Sandcastle.

The Sandcastle build file is in the "Solution Items" folder; the file looks like this:


<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" xmlns="" ToolsVersion="4.0">
  <PropertyGroup>
    <!-- The configuration and platform will be used to determine which
         assemblies to include from solution and project documentation
         sources -->
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <Platform Condition=" '$(Platform)' == '' ">x86</Platform>
    <!-- AssemblyName, Name, and RootNamespace are not used by SHFB but Visual
         Studio adds them anyway -->
    <!-- SHFB properties -->
    <DocumentationSources>
      <DocumentationSource sourceFile="DataAccessLayer\DataAccessLayer.csproj" configuration="Debug" platform="AnyCPU" />
      <DocumentationSource sourceFile="EDIParser\EDIParser.csproj" configuration="Debug" platform="AnyCPU" />
      <DocumentationSource sourceFile="ETLCommandLine\ETLCommandLine.csproj" configuration="Debug" />
      <DocumentationSource sourceFile="ETLThinClientApplication\ETLThinClientApplication.csproj" configuration="Debug" />
      <DocumentationSource sourceFile="ETLUtilityLibrary\ETLUtilityLibrary.csproj" configuration="Debug" platform="AnyCPU" />
      <DocumentationSource sourceFile="Import\Import.csproj" configuration="Debug" platform="AnyCPU" />
      <DocumentationSource sourceFile="ImportObjects\ImportObjects.csproj" configuration="Debug" platform="AnyCPU" />
    </DocumentationSources>
    <VisibleItems>Attributes, InheritedMembers, InheritedFrameworkMembers, Internals, Privates, Protected, SealedProtected</VisibleItems>
    <NamespaceSummaries>
      <NamespaceSummaryItem name="(global)" isDocumented="False">global namespace</NamespaceSummaryItem>
      <NamespaceSummaryItem name="ETLCommandLine" isDocumented="True">The namespace for the ETL Command Line</NamespaceSummaryItem>
      <NamespaceSummaryItem name="ETLUtilityLibrary" isDocumented="True">The namespace for the ETL Utility Library</NamespaceSummaryItem>
      <NamespaceSummaryItem name="Import" isDocumented="True">The namespace for the Import</NamespaceSummaryItem>
      <NamespaceSummaryItem name="DataAccessLayer" isDocumented="True">The namespace for the Data Access Layer</NamespaceSummaryItem>
      <NamespaceSummaryItem name="EDIParser" isDocumented="True">The namespace for the EDI Parser</NamespaceSummaryItem>
      <NamespaceSummaryItem name="ETLThinClientApplication" isDocumented="True">The namespace for the ETL Thin Client Application</NamespaceSummaryItem>
      <NamespaceSummaryItem name="ImportObject" isDocumented="True">The namespace for the Import Object</NamespaceSummaryItem>
    </NamespaceSummaries>
    <ProjectSummary>This is the project summary at the root namespaces pages, entered via Sandcastle</ProjectSummary>
    <HelpTitle>ETL - A Sandcastle Documented Class Library</HelpTitle>
  </PropertyGroup>
  <!-- There are no properties for these groups.  AnyCPU needs to appear in
       order for Visual Studio to perform the build.  The others are optional
       common platform types that may appear. -->
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|x86' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|x86' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|x64' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|x64' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|Win32' " />
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|Win32' " />
  <!-- Import the SHFB build targets -->
  <Import Project="$(SHFBROOT)\SandcastleHelpFileBuilder.targets" />
</Project>

If I run this file in the Sandcastle Help File Builder, everything runs without trouble: the website builds, goes into the proper location, and I can access it via the URL. However, if I queue a build in TFS that has the Sandcastle file as part of the project, the website does not get built. Can you please tell me what steps I need to take to make sure the Help File Builder runs as part of my TFS build? That way, any updates checked in prior to rollouts (all new classes, members, comments, etc.) will be included when the help file is regenerated. Sorry again for the long explanation, and sorry if I'm making this harder than it is.

Thanks in advance for all the help,


May 2, 2011 at 6:45 PM
@william: What code? The code for the FTP connection?

From: [email removed]
To: [email removed]
Date: Mon, 2 May 2011 07:27:08 -0700
Subject: Re: Automate sandcastle build to website with TFS [SHFB:255766]

From: williamg
The code doesn't work; I've tried endlessly. Can anybody tell me why? Thanks a lot...

May 3, 2011 at 2:49 AM


When using your build engine with libraries and a website, you may want to check that your solution file contains the following configuration:


<ConfigurationToBuild Include="Debug|Mixed Platforms">
    <PlatformToBuild>Mixed Platforms</PlatformToBuild>
</ConfigurationToBuild>


Joel writes more about automating mixed-platform builds here

The only other problem I think you could be having, if the above is not a resolution for you, is getting TFS to run the SHFB project, because TFS does not recognize it!

Eric is currently working on integrating SHFB with Visual Studio, and I'm not sure what it needs to run your project through the TFS build engine. I can only assume there is no link between the two and TFS is ignoring the project.

Eric will probably post back on this soon.


May 3, 2011 at 3:03 PM

@SpiderMaster: Thanks for the tip. I actually got it to work using the post-build operations suggested; the only thing I have found is that apparently TFS does not execute these operations when queuing a build. They only fire when building the project in the IDE. This isn't a deal breaker, as I check that the mode is set to Release and only fire when a build is successful. A good workaround to avoid the long deployment time (39 minutes, because the website output is built on a remote web server) would be to build a batch file using the same MSBuild call and have TFS kick off the batch file when a build occurs. Since the TFS server and the web server where the site is deployed are one and the same, I think that would take the time down to almost nothing. I just wish there were smoother integration with TFS. Sandcastle is amazing and the documentation looks fantastic; I just wish it weren't so clunky to use for programmatic builds with TFS.
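The batch file idea above could be sketched roughly like this. Note this is only a sketch: the `.shfbproj` name, the output folder `Help`, the staging path under `C:\Builds`, and the .NET 4.0 MSBuild location are all assumptions to be adjusted to the actual setup.

```shell
@echo off
rem Build the SHFB project with the same MSBuild used for the solution.
rem The v4.0 framework path and the project/output paths below are
rem assumptions; point them at your actual .shfbproj and drop locations.
"%windir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe" ^
    "C:\Builds\ETLProject\Documentation.shfbproj" /p:Configuration=Release

rem Copy the generated website over the IIS virtual directory
rem (same box as TFS, so this is a plain local copy)
xcopy /E /Y /I "C:\Builds\ETLProject\Help\*" "C:\SETL_WIKI\"
```

Since everything runs locally on the build server, the only remaining cost is the documentation build itself; the copy to the virtual directory is effectively free.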

May 3, 2011 at 3:17 PM

An alpha release (v1.9.3.1) is now available that provides Visual Studio integration.  Prior to this release, SHFB projects couldn't be loaded in Visual Studio, so they couldn't be made part of the solution, but they can now.  I can't really comment too much on Team Build since I don't use it.  In prior releases, the SHFB project had to be built in a separate step.  If it's made part of the solution just like a regular project, I assume it will get built along with everything else in the solution, as I don't think there are any restrictions on which projects in a solution get built.  All it should need is the SHFBROOT environment variable pointing to the location of the build engine assemblies and supporting files.



May 3, 2011 at 3:21 PM

@Eric: Thanks for the reply. I do not have the new version, but for some reason or another VS2010 Ultimate lets me include the SHFBPROJ file in my solution. It stores it in the Solution Items folder, which is fine, since the internal variable link to that folder is persisted through the solution. I just had to edit the file and the MSBuild script to point to the correct locations, and it built fine. My trouble now is the speed issue; I assume a batch file will clear that up. I look forward to working with the new version as well.

May 3, 2011 at 7:33 PM

Regarding speed, BuildAssembler is probably the step that's slowing it down the most.  You can check the build log for the time each step took.  If you haven't done so already, add the three cached build components to the project's ComponentConfigurations property.  They may help a little with the speed after the first build, which creates the cached info.  Depending on the size of the project, it can still take a while, though.



May 5, 2011 at 1:01 PM

@Eric: Thanks again. I used the cached build components; after the first build it is pretty smooth, though I still have some minor speed issues. It took the build down from about 37 minutes to 10. I am implementing the batch file process today, putting it all neatly on the same server and not doing a remote build. I bet that will wrap it up in about 2 minutes, as it does whenever I build the help file with the GUI locally. I do have another question about how the TFS integration is going to work: will the build of the help file as a website, saved on a web server, be able to be kicked off automatically when a TFS build is queued? As I found out, using the post-build operations is fine, but since my TFS build server is the same web server I was deploying the help website to (just a virtual directory with its own app pool on that same box), that integration would have been sweet!! Since my production build is only set to run once a month, it would guarantee all of the new material gets into the help, since we have a code freeze the last week of the cycle. No devs trying to sneak stuff in last minute... (most of the time, lol)