I have been working as a consultant for some time on a back-end project that was to run both in “self-hosted mode” (i.e. as a Windows Service) and in Windows Azure.
We had adjusted some minor details in an OData endpoint (a WCF Data Service) and tested it in self-hosted mode. When I was done testing, I redeployed the solution to Azure.
And as soon as I accessed my service, the server crashed with a 500 Internal Server Error. I tested some other services, but they were working just fine.
Based on earlier experiences I guessed that some DLLs were missing, and since there were several people on the project, not all focused on Azure, I guessed some DLLs had lost their “Copy Local = True” setting.
Since the solution is quite large, it took some time to verify all this, but I didn’t find anything special.
I set “Copy Local = True” on some DLLs that I was pretty sure were not in Azure and redeployed, to no avail.
I even used the nice service http://gacviewer.cloudapp.net/ to help me locate bad configuration, also to no avail.
And the error message from WCF was not very helpful. Internal Server Error... yeah right. See server logs... yeah right, there was nothing in the Event Log or the IIS log.
After some research, I found an article describing how to enable debugging output from WCF: http://blogs.msdn.com/b/phaniraj/archive/2008/06/18/debugging-ado-net-data-services.aspx
- Set an attribute in the source code or enable it via config; I preferred the attribute:
- [System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)]
- Set UseVerboseErrors to true in the service configuration
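Put together, a minimal sketch of the debugging setup could look like this (the service class and entity model names are hypothetical):

```csharp
using System.Data.Services;
using System.Data.Services.Common;
using System.ServiceModel;

// Surface exception details instead of a bare 500 Internal Server Error.
// Remember to turn both settings off again before going to production!
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class MyDataService : DataService<MyEntities> // MyEntities: hypothetical EF model
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Return full error details (message, type, stack trace) to the client
        config.UseVerboseErrors = true;

        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
```

This is a debugging configuration only; both switches leak internals to callers, so they should never be left on in a public endpoint.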
Now the WCF service gave me something more than a 500 Internal Server Error: “The server encountered an error processing the request. The exception message is ‘Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.’”, along with a stack trace.
I noticed something odd in the stack trace: a mention of the function “SyncInvokeProcessRequestForMessage”. This was helpful, as I knew someone had modified the Sync logic in the back-end system.
Some more research, and I found this article “How to: Deploy Sync Framework to Windows Azure” – http://msdn.microsoft.com/en-us/library/ff928660(v=sql.110).aspx.
As it turns out, you have to put on your Azure black belt to actually get your deployment to work in Azure when Sync is involved.
By using the recipe in the MSDN article I was able to get the solution to work, but Microsoft – come on – this needs to be much easier!!
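For reference, the Azure side of the recipe follows the usual pattern of shipping extra runtime bits with your role, for example via an elevated startup task (the file and role names below are hypothetical, the exact steps are in the MSDN article):

```xml
<!-- ServiceDefinition.csdef (fragment): run an elevated startup task
     that installs the Sync Framework runtime before the role starts -->
<WebRole name="MyWebRole">
  <Startup>
    <Task commandLine="InstallSyncFramework.cmd"
          executionContext="elevated"
          taskType="simple" />
  </Startup>
</WebRole>
```

Combined with setting “Copy Local = True” on the managed Sync assemblies, this gets the runtime onto the role instance, which a plain Azure deployment does not do for you.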
Azure of course protects your data from loss by duplicating the physical data, but it doesn’t protect you from unintentional data changes or deletes.
You can of course make a snapshot of the database by using CREATE DATABASE xx AS COPY OF yy, but what I really want is some way to get a backup file into Azure Storage (because it’s cheap, and we can keep many backups there, making it possible to restore a specific version if needed).
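The snapshot approach is essentially a one-liner (database and server names below are placeholders):

```sql
-- Server-side, transactionally consistent copy of a SQL Azure database
CREATE DATABASE MyDb_Copy AS COPY OF myserver.MyDb;

-- The copy runs asynchronously; you can monitor its progress like this
SELECT * FROM sys.dm_database_copies;
```

The drawback is that each copy is a full, billed database on the server, which is exactly why a cheap file in Azure Storage is more attractive for keeping many versions.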
As I see it right now, there are two options for managing backups yourself:
- Write an import/export logic to stream data from the database into files and put the files into Azure Storage
- Sync your data with a database outside Azure and do a local backup of it.
If you go for the latter option, Sync Framework is your friend.
Install the Sync Framework 2.1 SDK and start coding; it should be possible to get this into production in just a few hours.
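A minimal sketch of what that code could look like with the Sync Framework 2.1 database providers (connection strings, database names, and the scope name are all hypothetical):

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class BackupSync
{
    static void Main()
    {
        // Hypothetical connection strings
        var azureConn = new SqlConnection(
            "Server=tcp:myserver.database.windows.net;Database=MyDb;User ID=user@myserver;Password=...;");
        var localConn = new SqlConnection(
            @"Server=.\SQLEXPRESS;Database=MyDbBackup;Integrated Security=true;");

        // One-time provisioning: copy the scope description from the Azure
        // database and apply it to the local backup database
        var provisioning = new SqlSyncScopeProvisioning(
            localConn,
            SqlSyncDescriptionBuilder.GetDescriptionForScope("BackupScope", azureConn));
        if (!provisioning.ScopeExists("BackupScope"))
            provisioning.Apply();

        // Download changes from SQL Azure into the local copy
        var orchestrator = new SyncOrchestrator
        {
            LocalProvider = new SqlSyncProvider("BackupScope", localConn),
            RemoteProvider = new SqlSyncProvider("BackupScope", azureConn),
            Direction = SyncDirectionOrder.Download
        };
        orchestrator.Synchronize();
    }
}
```

This assumes the Azure database has already been provisioned with the “BackupScope” scope; schedule the download with Task Scheduler or a Windows Service and you have a continuously refreshed local copy to back up.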
If you do not have a SQL Server license, SQL Server 2008 R2 Express will probably be big enough for most people with its 10 GB database size limit.
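Once the data is synced down, an ordinary local backup does the rest (database name and path are placeholders):

```sql
BACKUP DATABASE MyDbBackup
TO DISK = 'C:\Backups\MyDbBackup.bak'
WITH INIT;
```

From there the .bak file can be pushed into Azure Blob Storage with any storage client, giving you the cheap, versioned backups that SQL Azure itself doesn’t offer.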
Update: New video showing Introduction to SQL Azure Data Sync
Update 23. November:
This is the official Microsoft SQL Azure Backup and Restore Strategy.
I am a bit disappointed, but they are promising to remedy this later:
“The tools that we have today cover the other risk factors, however better tools are coming to make the job much easier.”