04 June 2012

SharePoint 2007 import / export

I have been working to export site content from one SharePoint 2007 server to another using stsadm import / export.
This is my record of a commonly occurring error that others have seen at least as far back as 2009.
The thorn in the side is that the TEMP and TMP environment variables for the account running the processes (both import and export) must point to a drive with enough free space to hold the entire site being exported or imported.
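One workaround is to point TEMP and TMP at a roomier volume for just the session running the command. A sketch, from a Windows command prompt (the folder and site URL below are placeholders, not values from my environment):

```
rem Redirect the temp folders for this session only.
rem D:\StsadmTemp is a hypothetical folder on a drive with enough
rem free space to stage the whole site.
set TEMP=D:\StsadmTemp
set TMP=D:\StsadmTemp
stsadm -o export -url http://server/site -filename D:\Export\site.cmp
```

Because `set` only affects the current console session, the account's permanent profile settings are left alone.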

stsadm -o export

In the export log, the following error will be recorded:
[5/31/2012 12:23:26 PM]: FatalError: There is not enough space on the disk.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.WriteCore(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.FileStream.Write(Byte[] array, Int32 offset, Int32 count)
   at Microsoft.SharePoint.Deployment.ExportDataFileManager.AddFile(Stream fileStream)
   at Microsoft.SharePoint.Deployment.FileSerializer.SaveFile(SerializationInfo info, ExportObjectManager objectManager, ExportDataFileManager fileManager, SPExportSettings settings, SPWeb parentWeb, Boolean isGhosted, String setupPath, String setupPathUser, Byte setupPathVersion, String webRelativeFileUrl, Int32 size, Byte level)
   at Microsoft.SharePoint.Deployment.FileSerializer.GetDataFromDataSet(Object obj, SerializationInfo info, StreamingContext context)
   at Microsoft.SharePoint.Deployment.DeploymentSerializationSurrogate.GetObjectData(Object obj, SerializationInfo info, StreamingContext context)
   at Microsoft.SharePoint.Deployment.XmlFormatter.SerializeObject(Object obj, ISerializationSurrogate surrogate, String elementName, Boolean bNeedEnvelope)
   at Microsoft.SharePoint.Deployment.XmlFormatter.Serialize(Stream serializationStream, Object topLevelObject)
   at Microsoft.SharePoint.Deployment.ObjectSerializer.Serialize(DeploymentObject deployObject, Stream serializationStream)
   at Microsoft.SharePoint.Deployment.SPExport.SerializeObjects()
   at Microsoft.SharePoint.Deployment.SPExport.Run()
The disk being referred to is not the disk the exported CMP files are written to, but the disk on which the current user’s TEMP and TMP folders are located.

stsadm -o import

In the import log, the following error will be recorded:
[6/4/2012 11:47:07 AM]: Error: Failure writing to target file
[6/4/2012 11:47:07 AM]: Debug:    at Microsoft.SharePoint.Library.SPRequest.ExtractFilesFromCabinet(String bstrTempDirectory, String bstrCabFileLocation)
   at Microsoft.SharePoint.Deployment.ImportDataFileManager.<>c__DisplayClass2.<Uncompress>b__0()
   at Microsoft.SharePoint.SPSecurity.CodeToRunElevatedWrapper(Object state)
   at Microsoft.SharePoint.SPSecurity.<>c__DisplayClass4.<RunWithElevatedPrivileges>b__2()
   at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunElevated secureCode)
   at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(WaitCallback secureCode, Object param)
   at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(CodeToRunElevated secureCode)
   at Microsoft.SharePoint.Deployment.ImportDataFileManager.Uncompress(SPRequest request)
[6/4/2012 11:47:07 AM]: FatalError: Failed to read package file.
   at Microsoft.SharePoint.Deployment.ImportDataFileManager.Uncompress(SPRequest request)
   at Microsoft.SharePoint.Deployment.SPImport.Run()
***
Inner exception:
Failure writing to target file
   at Microsoft.SharePoint.Library.SPRequest.ExtractFilesFromCabinet(String bstrTempDirectory, String bstrCabFileLocation)
   at Microsoft.SharePoint.Deployment.ImportDataFileManager.<>c__DisplayClass2.<Uncompress>b__0()
   at Microsoft.SharePoint.SPSecurity.CodeToRunElevatedWrapper(Object state)
   at Microsoft.SharePoint.SPSecurity.<>c__DisplayClass4.<RunWithElevatedPrivileges>b__2()
   at Microsoft.SharePoint.Utilities.SecurityContext.RunAsProcess(CodeToRunElevated secureCode)
   at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(WaitCallback secureCode, Object param)
   at Microsoft.SharePoint.SPSecurity.RunWithElevatedPrivileges(CodeToRunElevated secureCode)
   at Microsoft.SharePoint.Deployment.ImportDataFileManager.Uncompress(SPRequest request)
[6/4/2012 11:47:57 AM]: Progress: Import Completed.
[6/4/2012 11:47:57 AM]: Finish Time: 6/4/2012 11:47:57 AM.
[6/4/2012 11:47:57 AM]: Completed with 0 warnings.
[6/4/2012 11:47:57 AM]: Completed with 2 errors.
The “Failure writing to target file” again refers to the location of the TEMP and TMP directories of the account running the process: there isn’t enough temporary disk space available.
Because SharePoint lets you specify how large you want the CMP files to be, it’s easy to assume that each file is packed and unpacked individually, but that’s misleading. On export, the entire site is staged in TEMP before being written to the CMP files; on import, all of the CMP files are unpacked to TEMP before anything is written to SharePoint.
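A quick pre-flight check is to compare the free space on whatever drive TEMP resolves to against your own estimate of the site’s size. A rough C# sketch (the 10 GB figure is a placeholder assumption, not something SharePoint reports):

```
using System;
using System.IO;

class TempSpaceCheck
{
    static void Main()
    {
        // Path.GetTempPath() honours the TEMP/TMP variables of the current account
        string tempPath = Path.GetTempPath();
        var drive = new DriveInfo(Path.GetPathRoot(tempPath));

        // Placeholder: substitute your own estimate of the exported site's size
        long requiredBytes = 10L * 1024 * 1024 * 1024; // e.g. 10 GB

        Console.WriteLine("TEMP resolves to {0} on drive {1}", tempPath, drive.Name);
        Console.WriteLine("Free space: {0:N0} bytes", drive.AvailableFreeSpace);
        Console.WriteLine(drive.AvailableFreeSpace >= requiredBytes
            ? "Should be enough room to stage the export."
            : "Not enough room - move TEMP/TMP or free some space first.");
    }
}
```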
I could guess that this behaviour continues into SharePoint 2010 but haven’t confirmed this.

06 March 2012

SharePoint only allows a WSP to be deployed globally

One more reminder to me…

Description of the Problem

I’ve been experiencing a problem where on installing a WSP to SharePoint, it is only possible to deploy the solution globally.

Some of the other posts surrounding this issue talk about the SafeControl entry. However, in my instance, I’m just deploying a bunch of files - to the Master Page Gallery and Style Library. I don’t even want to deploy a DLL as I have no code.

Solution I found

  1. Deploy an assembly to the GAC.
  2. In the Package.Template.xml file, add something like the following:
<?xml version="1.0" encoding="utf-8"?>
<Solution xmlns="http://schemas.microsoft.com/sharepoint/">
  <Assemblies>
    <Assembly Location="My.Branding.dll" DeploymentTarget="GlobalAssemblyCache">
      <SafeControls>
        <SafeControl Assembly="$SharePoint.Project.AssemblyFullName$" Namespace="My.Branding" TypeName="*" />
      </SafeControls>
    </Assembly>
  </Assemblies>
</Solution>

After installing this, I can now deploy to any web application. The downside is that I’ve now got a DLL polluting the GAC that I don’t care about.

27 January 2012

WCF RIA Services Learning Notes

Why bother with RIA Services?

Because Lightswitch is awesome! Out of the box I can connect to SharePoint lists (no attachment support in v.1) and to SQL Server. For anything else, I’ll need a custom WCF RIA Service. By anything else, I mean (for now) Dynamics CRM.

I plan to write some more on Lightswitch later on, to document my learning as much as anything. One use case I see for Lightswitch is for creating those applications for which one or two people need a different set of screens to more efficiently work with existing data. In many of these cases, there may be insufficient return (or available developer time) to warrant extensive custom development.

My own interest then is to create a stand alone RIA Service, probably wrapping an existing WCF Service (such as those that ship with Dynamics CRM 2011 or SharePoint 2010), stored procedure or other code.

The following notes are mainly for myself – I’ll probably need to find them again some day soon. Neither of the other two people who might read this blog is ever likely to care… ;)

Visual Studio will allow me to “Enable WCF RIA Services” when I create a Silverlight application. My problem with doing this is that, like the WCF project templates, it hides so much of what’s going on and also commits me to creating a Silverlight application anyway. It’s not really what I want to do. I guessed the process was worth doing once, so I followed the “Walkthrough: Creating a RIA Services Solution” from MSDN. But any goose can click through a wizard and I’m afraid I can’t say I’m much the wiser for it.

The following is my post-mortem from working through the walkthroughs and how-tos on MSDN for WCF RIA Services.

Walkthrough: Creating a RIA Services Solution

(Source) I found that this walkthrough left me with more questions than answers. Because so much was automatically created and everything was pretty much thrown together, I didn’t know what pieces were what and more importantly which pieces mattered when building a custom RIA Service for use by Lightswitch (or JavaScript).

How-to: Create a Domain Service that uses POCO-defined Entities

(Source) This walkthrough was useful, though it contains an error – which itself turned into a worthwhile learning experience.

Building the solution according to the steps led to code like the following:

[EnableClientAccess()]
public class SovereignDomainService : DomainService
{
}

public class Sovereign
{
    [Key]
    public int UniqueId { get; set; }
    public string Name { get; set; }
    public string House { get; set; }
    public string Dominion { get; set; }
    public int ReignStart { get; set; }
    public int ReignEnd { get; set; }
    public string Sobriquet { get; set; }

    public List<Sovereign> FetchSovereigns()
    {
        List<Sovereign> sovereignList = new List<Sovereign>
        {
            new Sovereign()
            {
                UniqueId = 1,
                Name = "John",
                House = "Plantagenet",
                Dominion = "Angevin Empire",
                ReignStart = 1167,
                ReignEnd = 1216,
                Sobriquet = "Lackland"
            },
            new Sovereign()
            {
                // ... Abbreviated for space...
            }
        };
        return sovereignList;
    }

    public IEnumerable<Sovereign> GetSovereigns()
    {
        Sovereign sovereign = new Sovereign();
        return sovereign.FetchSovereigns();
    }

    public IEnumerable<Sovereign> GetSovereignsByReignEnd(int ReignedBefore)
    {
        Sovereign sovereign = new Sovereign();
        return sovereign.FetchSovereigns().Where<Sovereign>(p => p.ReignEnd <= ReignedBefore);
    }
}

Building this produced an empty code-gen file (RIAServicesExample.Web.g.cs) in the client project.

The correct version puts the three methods inside the SovereignDomainService class. The two GetX methods call FetchSovereigns. When built, this produces a considerable amount of generated code. I certainly wouldn’t want to generate this by hand. The Sovereign class gets transformed and marked with the WCF DataContract attribute, also inheriting from System.ServiceModel.DomainServices.Client.Entity, and the properties get transformed and marked with the WCF DataMember attribute.
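For reference, the corrected shape (my reconstruction, not MSDN’s exact listing – FetchSovereigns here is private to the service) moves all three methods into the domain service, leaving Sovereign as a plain entity:

```
[EnableClientAccess()]
public class SovereignDomainService : DomainService
{
    // Hard-coded sample data, standing in for a real data source
    private List<Sovereign> FetchSovereigns()
    {
        return new List<Sovereign>
        {
            new Sovereign
            {
                UniqueId = 1, Name = "John", House = "Plantagenet",
                Dominion = "Angevin Empire", ReignStart = 1167, ReignEnd = 1216,
                Sobriquet = "Lackland"
            }
            // ... remaining sovereigns abbreviated for space ...
        };
    }

    public IEnumerable<Sovereign> GetSovereigns()
    {
        return FetchSovereigns();
    }

    public IEnumerable<Sovereign> GetSovereignsByReignEnd(int reignedBefore)
    {
        return FetchSovereigns().Where(p => p.ReignEnd <= reignedBefore);
    }
}
```

With the query methods on the service, the code generator has something to project into the client, and RIAServicesExample.Web.g.cs is no longer empty.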

The SovereignDomainService is transformed into a partial sealed class. This provides the extension mechanism – as the code I’m viewing is generated, I can’t change it directly; any changes would be overwritten the next time I build the project. The constructor creates the SVC file.

An interface for the SovereignDomainService is created as an inner interface. Its fully qualified name in this example is RiaServicesExample.Web.SovereignDomainContext.ISovereignDomainContext. I’d never seen one before, though I’d say it follows the same rules as an inner class. The namespaces for the FaultContract and OperationContract types default to tempuri.org. I want to find out how to change this.

Walkthrough: Creating a RIA Services Class Library

(Source) I found this walkthrough to be a good next step from the previous how-to. In this walkthrough, I created a Silverlight project (without WCF RIA Services enabled) and a separate WCF RIA Services Class Library project. This project template is found under the Silverlight section. I’m not convinced this is a good decision as WCF RIA Services can also be built for JSON (JavaScript) clients – it’s just a matter of configuration.

Following this walkthrough, I end up with four projects (from the two project templates). The RIA Services Class Library contains the Entity Data Model, while the Silverlight.Web project contains a reference to the RIA Services Class Library Web project.

There are a few manual steps here that aren’t present when you tick “Enable WCF RIA Services” in the New Silverlight Application wizard, but it’s no big deal. It mainly boils down to adding a few DLL references and copying into your web.config the values that were automatically generated in the WCF RIA Services Class Library project – a bit of copy/paste.
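From memory, the web.config side of that copy/paste boils down to registering the domain service HTTP module – something like the fragment below. Treat the version number and PublicKeyToken as assumptions to verify against your own generated project rather than gospel:

```
<system.webServer>
  <modules runAllManagedModulesForAllRequests="true">
    <!-- Registers the WCF RIA Services domain service module -->
    <add name="DomainServiceModule"
         type="System.ServiceModel.DomainServices.Hosting.DomainServiceHttpModule, System.ServiceModel.DomainServices.Hosting, Version=4.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
         preCondition="managedHandler" />
  </modules>
</system.webServer>
```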

Summary so far…

Being able to build custom WCF RIA Services is essential for assembling Lightswitch applications that connect to data sources other than SharePoint 2010 lists and SQL Server databases. In cases where you’re not meant to access the database directly (e.g. Dynamics CRM), where you’re re-using existing business logic, or where you’re accessing other types of data (e.g. XML or NoSQL data), you’re going to have to build a custom WCF RIA Service.

I’m not sure whether I have a similar business case for JavaScript clients; I may be better off using JSON / XML directly. I imagine, though, that if the RIA Service is already built then I have everything I need, leaving only the configuration necessary to emit JSON data.

NBN mobile towers planning challenges

Here we go again… Mobile towers and health effects, NBN style this time. Interesting that the Libs are really quiet on the topic, considering that their alternative was all wireless. http://www.smh.com.au/it-pro/government-it/nbn-towers-anger-residents-and-andrew-wilkie-20120113-1py5q.html

20 May 2011

NBN: To Wireless or Not To Wireless?

I’ve been following the debate on the NBN for several years now. While the debate over whether to go with the optic fibre service currently being planned and deployed by the Labor Government or to scrap it and implement a cheaper(?) wireless based service proposed by the Liberal Party is the hot issue, there’s never been much debate that I’ve found on the reliability of a wireless network.

I had a 3G wireless broadband service at home at the foot of the Dandenong Ranges, 30km as the crow flies from the Melbourne CBD. I got tired of its unreliability and switched to a fixed line service over a year ago.

It must be noted that a proposed national wireless broadband service wouldn’t run (I hope / expect) on 3G so hopefully reliability would be better.

But for me, here’s the clincher – the 3G service claimed a bandwidth of 3.6Mbps. At home, testing against speedtest.net, on a clear day I could get 330kbps. When the weather was bad, I was recording 180kbps or so – sub-broadband speeds. The government says “broadband” begins at 256kbps.

So the most basic wireless fact that every UHF and HF radio operator knows (and that the politicians and commentators are forgetting) is that radio performance is heavily affected by the amount of atmospheric moisture. “Next gen” wireless can’t defy physics.

A wireless NBN is a joke.

19 May 2011

Mono project IP in trouble?

This article appeared in my RSS reader today.

The global Mono project team has been laid off. Miguel De Icaza and his team have formed a new company (Xamarin) to continue development.

The problem for them is that their non-open source offerings - MonoTouch (Mono for iPhone / iPad) and MonoDroid (Mono for Android) (no mention of MonoMac) are owned by Novell's new owner, Attachmate. Attachmate obviously just wants whatever of Novell's assets it can get.

So there are a few questions.

Firstly, if MonoTouch and MonoDroid were the money making parts of the Mono Project and Xamarin can't own them, how are they going to continue to exist? Can they get sufficient work doing Mono consulting?

Secondly, what intellectual property risks are they running by trying to engineer replacements? These guys built the original products. Is Attachmate going to take legal action claiming ownership over their personal knowledge? This is a possibility, as Attachmate could claim the developers went to work for a competitor, building a competing product using intellectual property owned by Novell. How might this play out? How might a court separate the intellectual property owned by a company from the knowledge held by a group of developers (as individuals or collectively)?

26 April 2011

Blogging on Innovation and Entrepreneurship

This semester I'm back at uni. I was so grateful for the break - having time to smell the roses and rediscover life. The strangest thing happened - I got married!!!

I'm taking a subject called Innovation and Entrepreneurship in IT. As part of this, I'm keeping a learning stream in a blog at http://mikehansford.wordpress.com.

My current post is called "The right knowledge management strategy is essential to open innovation."