I have been working as a consultant for some time on a back-end project that was to run both in “self-hosted mode” (aka Windows Service) and in Windows Azure.
We had adjusted some minor details in an OData endpoint (aka WCF Data Service), and tested it “in self-hosted mode”. When I was done with the testing I redeployed the solution to Azure.
And as soon as I accessed my service, the server crashed with a 500 Internal Server Error. I tested some other services, but they were working just fine.
Based on earlier experiences I guessed that some DLLs were missing, and since there were several people on the project, not all of them focused on Azure, I suspected some DLLs had lost their “Copy Local=True” setting.
Since the solution is quite large, it took some time to verify all this – but I didn’t find anything special.
I set “Copy Local=True” on some DLLs that I was pretty sure were not in Azure, and redeployed – to no avail.
I even used the nice service http://gacviewer.cloudapp.net/ to help me locate bad configuration, still without luck.
And the error message from WCF was not very helpful. Internal server error… yeah right… See server logs… yeah right – nothing in the Event log or the IIS log.
After some research, I found an article describing how to enable debugging output from WCF: http://blogs.msdn.com/b/phaniraj/archive/2008/06/18/debugging-ado-net-data-services.aspx
- Set an attribute in the source code or enable it via config; I preferred the attribute:
- [System.ServiceModel.ServiceBehavior(IncludeExceptionDetailInFaults = true)]
- Set UseVerboseErrors to true in the service configuration
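A minimal sketch of what those two settings look like on an ADO.NET/WCF Data Service (the service class and entity context names are placeholders):

```csharp
using System.Data.Services;
using System.Data.Services.Common;
using System.ServiceModel;

// Surface exception details instead of a bare 500 – remove again before production!
[ServiceBehavior(IncludeExceptionDetailInFaults = true)]
public class MyDataService : DataService<MyEntities> // MyEntities is a placeholder EF context
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        // Put the real exception message in the response body
        // instead of the generic "An error occurred" envelope.
        config.UseVerboseErrors = true;
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
```

With both in place, the response body carries the actual exception text and stack trace, which is what finally made the LoaderExceptions problem visible.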
Now the WCF service gave me something more than 500-Internal Server Error. “The server encountered an error processing the request. The exception message is ‘Unable to load one or more of the requested types. Retrieve the LoaderExceptions property for more information.’” and the stack trace.
I noticed something odd in the stack trace, a mention of the function “SyncInvokeProcessRequestForMessage”. This was helpful, as I knew someone had modified the sync logic in the back-end system.
Some more research, and I found this article “How to: Deploy Sync Framework to Windows Azure” – http://msdn.microsoft.com/en-us/library/ff928660(v=sql.110).aspx.
As it turns out, you have to put on your Azure black belt to actually get your deployment to work in Azure when Sync is involved.
By following the recipe in the MSDN article I was able to get the solution to work – but Microsoft, come on, this needs to be much easier!
Last week I was invited by Microsoft to do a presentation on SQL Azure at TechDays here in Norway.
As always I like to be well prepared, so building the slides and doing complete background research – in case of unexpected questions – was necessary. This was fun and I learned quite a few new things along the way :-)
My presentation was right after the BI guru Rafal Lukawiecki (who has six “Best Speaker” awards from TechEd), so it was a bit like jumping after Wirkola, but Rafal was very kind and helped me get rigged before my presentation and even helped calm my nerves.
Feedback from the presentation was nice, thanks to all of you who voted green and yellow. Luckily, no red labels :-)
Newsflash on Sync Framework V4
During my presentation I spoke about several sync technologies, including Sync Framework V4, which Microsoft had decided to drop support for while promising to make the source available.
Today Rob Tiffany announced that the Sync Framework is now open source. Check out his blog for more details: http://robtiffany.com/sync-framework/sync-framework-v4-is-now-open-source-and-ready-to-connect-any-device-to-sql-server-and-sql-azure
The bits are hosted by MSDN at http://code.msdn.microsoft.com/Sync-Framework-Toolkit-4dc10f0e
“Local processing mode” means that the ReportViewer is not connected to a Reporting Services server for processing. Instead, the developer generates the DataSet(s) up front and hands them to the ReportViewer control together with the report layout (the RDLC file), so that the control itself can render the report.
According to the MSDN documentation “How to: Use ReportViewer in a Web Site Hosted in Windows Azure”, this is not supported. It states: “Note: ReportViewer configured in local processing mode is not supported in Windows Azure.”
I don’t know the reason for this statement – but the positive thing is that it doesn’t say “Note: ReportViewer configured in local processing mode does not work in Windows Azure”, it just says it is “not supported”.
To get the ReportViewer to work, you need the DLL files that the ReportViewer control uses copied to the bin folder during build, so that deploying to Azure actually brings the necessary DLLs along.
(Remember that the Azure servers only have the .NET runtime and the Azure DLLs installed, so you need to bring any other DLLs you are using yourself.)
What you need
In remote processing mode, the ReportViewer control uses the following assemblies:
- Microsoft.ReportViewer.WebForms.dll Contains the ReportViewer code, which you need to use ReportViewer in your page. A reference for this assembly is added to your project when you drop a ReportViewer control onto an ASP.NET page in your project.
- Microsoft.ReportViewer.Common.dll Contains classes used by the ReportViewer control at run time. It is not automatically added to your project.
In addition you will need these two DLLs for local processing:
- Microsoft.ReportViewer.ProcessingObjectModel.dll
- Microsoft.ReportViewer.DataVisualization.dll
I even had to throw in a seemingly unrelated DLL to get things to work – I am not 100% sure it is required in all cases, but I had to add it:
- Microsoft.Web.Infrastructure.dll
How to get DLL files out of the GAC
For the DLLs that live in the GAC, you need to copy them out of the GAC and into a folder inside your project (so that you can add them to source control and avoid breaking the compile or creating trouble for other developers on the team).
The GAC uses this folder on your PC: %windir%\assembly\GAC_MSIL
So the DLL Microsoft.ReportViewer.ProcessingObjectModel is stored in the folder C:\Windows\assembly\GAC_MSIL\Microsoft.ReportViewer.ProcessingObjectModel. There is actually a subfolder for each version of the DLL; for the VS2010 version, make sure you pick the folder named 10.0.0.0_something. On my PC the complete folder name is C:\Windows\assembly\GAC_MSIL\Microsoft.ReportViewer.ProcessingObjectModel\10.0.0.0__b03f5f7f11d50a3a
Make sure the DLL’s are copied to your BIN folder and deployed to Azure
For every referenced DLL that you need transferred to Azure, you must make sure that the “Copy Local” flag is set to True.
1. Find Microsoft.ReportViewer.WebForms.dll, which is already referenced in your project, and set Copy Local=True.
2. Add Microsoft.ReportViewer.Common.dll to the assembly references (Add Reference) and set Copy Local=True.
3. Find Microsoft.ReportViewer.ProcessingObjectModel.dll and Microsoft.ReportViewer.DataVisualization.dll in your GAC (see above), copy them into a folder in your project, then reference the copies and set Copy Local=True.
4. Reference Microsoft.Web.Infrastructure.dll (c:\Program Files (x86)\Microsoft ASP.NET\ASP.NET Web Pages\v1.0\Assemblies\Microsoft.Web.Infrastructure.dll) and set Copy Local=True.
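In the project file, the “Copy Local” flag corresponds to the <Private> element on each reference. A sketch of what the resulting entries might look like (the “Libs” folder name is just an example for where you copied the GAC assemblies):

```xml
<ItemGroup>
  <Reference Include="Microsoft.ReportViewer.WebForms">
    <Private>True</Private> <!-- Copy Local=True -->
  </Reference>
  <Reference Include="Microsoft.ReportViewer.Common">
    <Private>True</Private>
  </Reference>
  <!-- Copies taken out of the GAC and checked into the project -->
  <Reference Include="Microsoft.ReportViewer.ProcessingObjectModel">
    <HintPath>Libs\Microsoft.ReportViewer.ProcessingObjectModel.dll</HintPath>
    <Private>True</Private>
  </Reference>
  <Reference Include="Microsoft.ReportViewer.DataVisualization">
    <HintPath>Libs\Microsoft.ReportViewer.DataVisualization.dll</HintPath>
    <Private>True</Private>
  </Reference>
</ItemGroup>
```

Checking the .csproj is a quick way to verify that nobody on the team has flipped a reference back to <Private>False</Private>.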
Deploying the report files
Make sure your report files (the RDLC files) are copied to the BIN folder by setting “Build Action= Content” and “Copy to output directory=Always”
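In the project file, those two settings on a report file look something like this (the report path and file name are just examples):

```xml
<ItemGroup>
  <Content Include="Reports\MyReport.rdlc">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>
```

Marking the RDLC as Content is what gets it included in the Azure deployment package.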
I believe it should be possible to make the instance execute the Microsoft Report Viewer 2010 Redistributable Package during boot using a Startup Task – read more about this in the MSDN docs. But I haven’t tested that approach, and personally I prefer to bring the DLLs along in the deployment instead of executing a setup package – but that’s just my opinion.
And by the way: you should use the latest VS2010 version of the ReportViewer. If you haven’t upgraded yet – please do – the new version has a lot of bug fixes and general user interface improvements. It also supports export to Word…
- Analyze and fix the database and upload it to SQL Azure
- Convert the application to Azure Web Role
- Deploy the Application
In this example the architecture is quite simple: it is a Silverlight project using WCF RIA Services with Entity Framework against a SQL Server database.
Analyze and fix the database and upload it to SQL Azure
The first step is to analyze the SQL database to see if there are incompatibilities with SQL Azure – and fix them if necessary. To do this I recommend the tool “SQL Azure Migration Wizard”, available on CodePlex at http://sqlazuremw.codeplex.com/
When the database model has been fixed and tested locally you can use the same tool to transfer data to SQL Azure, but you can also script the database schema and its data from SQL Server Management Studio. Right-click your database inside SQL Server Management Studio and select “Tasks” -> “Generate scripts…”. In step 3, “Set Scripting Options”, click the “Advanced” button and change two options:
Set the option “Script for database engine type” to “SQL Azure Database”.
Optionally, if you want the data in addition to the schema – set the option “Types of data to script” to “Schema and data”.
After scripting the data, you connect to your SQL Azure and run the script.
Now you can change your connection string in your web.config to point to the SQL Azure database and actually test the Silverlight application locally before moving on to the next step.
- SQL Server Management Studio 2008 R2 (Express edition works fine)
- Firewall settings securing your SQL Azure database must be configured to allow traffic from your current network address.
Convert the application to Azure Web Role
Open your solution in Visual Studio. Right click your solution and select “Add new project”. Under “Cloud” select the template “Windows Azure Project” and name your new project.
The template will trigger a selection box where you can add Azure roles to your project. Don’t select any roles right now, just click “OK”.
In your new project you now have 3 elements: “Roles”, “ServiceConfiguration.cscfg” and “ServiceDefinition.csdef”.
Right click “Roles” and select “Add” -> “Web Role Project in solution…”.
You will now get a listbox where your existing Silverlight project will show up. Select this and click “OK”.
Go back to your Silverlight project and add Reference to Microsoft.WindowsAzure.ServiceRuntime and Microsoft.WindowsAzure.StorageClient – (normally at ”C:\Program Files\Windows Azure SDK\v1.3\ref”)
Add a new file “WebRole.cs” (copy the content from an existing Azure project or hand-code it; the class must inherit from RoleEntryPoint). Inside this class you configure Azure logging and diagnostics settings.
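A minimal WebRole.cs sketch for the SDK 1.3 era (the five-minute transfer interval is just an example value):

```csharp
using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Start the diagnostic monitor and schedule transfer of the
        // basic logs to the storage account named in the Diagnostics plugin setting.
        DiagnosticMonitorConfiguration config =
            DiagnosticMonitor.GetDefaultInitialConfiguration();
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(5);
        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString", config);
        return base.OnStart();
    }
}
```

With this in place, trace output from the web role is periodically shipped to Azure Storage instead of living only on the instance.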
- Azure SDK (download from http://www.microsoft.com/windowsazure/getstarted/default.aspx)
- To test the application locally in the “Compute emulator” you must start Visual Studio with Administrative rights
Deploy the Application
When deploying your application to Azure you must be aware that the assemblies available in your Azure instance might not include all the DLLs you are using in your project. Debugging and finding the ones that are missing can be a bit complex.
Luckily Wayne Berry and Rob Gillen have made a tool – running in Azure – available to help you and me. Go to http://gacviewer.cloudapp.net/Upload.aspx and upload your projects (one by one); the tool will identify which DLLs are not available in Windows Azure. Then go back to your project and set “Copy Local” to True for all identified assemblies.
In my solution these three were identified:
Now you are ready to deploy and test your solution.
Before deploying my solution to Azure I tested it locally. The UI started as expected, but when I tried to log in I got an error:
at System.ServiceModel.DomainServices.Client.ApplicationServices.WebAuthenticationService.EndLogin(IAsyncResult asyncResult)
at System.ServiceModel.DomainServices.Client.ApplicationServices.LoginOperation.EndCore(IAsyncResult asyncResult)
at System.ServiceModel.DomainServices.Client.ApplicationServices.AuthenticationOperation.End(IAsyncResult result)
Caused by: Load operation failed for query 'Login'. The remote server returned an error: NotFound.
A search on the Internet turned up a lot of similar problems, where the solutions mostly said I was missing some DLL files and needed to set “Copy Local” to True. Since I had already fixed that, I was pretty sure this was not the problem.
Using Fiddler I was able to see that the real reason was a 500 error from the server during “GET /ClientBin/Web-AuthenticationService.svc/binary/GetUser”. The full error description was
“System.InvalidOperationException: IIS specified authentication schemes 'IntegratedWindowsAuthentication, Anonymous', but the binding only supports specification of exactly one authentication scheme.” This svc was my ASP.NET membership provider’s AuthenticationService.
A new search turned up a lot of different solutions, but the one that solved the problem said: “For most, simply changing the <authentication> node to mode=”Forms” will remove this error and allow you to continue”.
As it turns out, your local Azure web role runs its Compute Emulator through your local IIS. So I started IIS Manager and under “Sites” I found my Azure deployment. Inside the “Authentication” node I found that both “Forms Authentication” and “Windows Authentication” were enabled.
I disabled “Windows Authentication” and the problem was solved.
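For reference, the web.config side of this fix is simply the following fragment (a minimal sketch, assuming forms authentication is what your membership setup expects):

```xml
<system.web>
  <authentication mode="Forms" />
</system.web>
```

The IIS-side “Windows Authentication” toggle must agree with this, which is why disabling it in IIS Manager was also needed.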
If you want to see how to create a new Silverlight application from scratch and deploy it to Azure check out this video from Silverlight TV:
Azure of course protects your data from loss by duplicating the physical data, but it doesn’t protect you from unintentional data changes or deletes.
You can of course make a snapshot of the database using CREATE DATABASE xx AS COPY OF yy, but what I really want is some way to get a backup file into Azure Storage (because it’s cheap, and we can keep many backups there, making it possible to restore a specific version if needed).
As I see it right now, there are two options for managing backup yourself:
- Write an import/export logic to stream data from the database into files and put the files into Azure Storage
- Sync your data with a database outside Azure and do a local backup on it.
If you go for the last option, Sync Framework is your friend.
Then install the Sync Framework 2.1 SDK and start coding. It should be possible to get this into production in just a few hours.
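A rough sketch of the Sync Framework 2.1 pattern for pulling changes from SQL Azure down to a local SQL Server for backup (connection strings, the "BackupScope" scope name, and the console program shape are all placeholders; the scope must be provisioned on both databases first with SqlSyncScopeProvisioning):

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

class BackupSync
{
    static void Main()
    {
        using (var azureConn = new SqlConnection("<SQL Azure connection string>"))
        using (var localConn = new SqlConnection("<local SQL Server connection string>"))
        {
            // "BackupScope" is a placeholder scope name, provisioned up front on both ends.
            var orchestrator = new SyncOrchestrator
            {
                RemoteProvider = new SqlSyncProvider("BackupScope", azureConn),
                LocalProvider = new SqlSyncProvider("BackupScope", localConn),
                // Pull changes down only – the local copy exists for backup purposes.
                Direction = SyncDirectionOrder.Download
            };
            SyncOperationStatistics stats = orchestrator.Synchronize();
            Console.WriteLine("Downloaded {0} changes", stats.DownloadChangesTotal);
        }
    }
}
```

Run this on a schedule, then take an ordinary SQL Server backup of the local database.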
If you do not have a SQL Server license, SQL Server 2008 R2 Express will probably be sufficient for most people with its 10 GB database size limit.
Update: New video showing Introduction to SQL Azure Data Sync
Update 23. November:
This is the official Microsoft SQL Azure Backup and Restore Strategy.
I am a bit disappointed, but they are promising to remedy this later..
“The tools that we have today cover the other risk factors, however better tools are coming to make the job much easier.”
The Azure Storage Explorer which can be found at CodePlex has been updated to version 4 with many great features.
All 3 types of cloud storage can be viewed and edited: blobs, queues, and tables.
Highlights of version 4:
• Better Code. Versions 1-3 of Azure Storage Explorer pre-dated the commercial Azure launch and there was no formal Storage Client library. Code from the SDK samples was used, which caused the source code base to be huge and complex. In version 4 we are using the .NET StorageClient library and the code is compact and well-organized. The source code is open and is part of this CodePlex project.
• Newer storage feature support. Support has been added for newer features such as blob root containers.
• Direct data entry and editing of blobs, messages, and entities.
• Improved UI. The new WPF-based UI is cleaner and supports opening multiple storage accounts at the same time in tab views. The Model-View-ViewModel pattern is used.
• Free. Azure Storage Explorer remains a free community donation.
For more information please follow David Pallmann’s Technology Blog
Windows Azure Diagnostics enables you to collect diagnostic data from a service running in Windows Azure. You can use diagnostic data for tasks like debugging and troubleshooting, measuring performance, monitoring resource usage, traffic analysis and capacity planning, and auditing.
The APIs are documented over at MSDN at http://msdn.microsoft.com/en-us/library/ee758705(v=MSDN.10).aspx and http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.diagnostics.aspx
But first I would suggest that you watch the PDC session “Windows Azure Monitoring, Logging, and Management APIs”, available at http://www.microsoftpdc.com/2009/SVC15
Then you can download the code that was demoed in the PDC session from http://code.msdn.microsoft.com/WADiagnostics
There is also a nice walkthrough of a simple logging application at http://blogs.msdn.com/b/sumitm/archive/2009/11/25/windows-azure-walkthrough-simple-logging.aspx
After you have implemented code in your Azure application to actually log something, you can implement code in the Azure app or in a client tool to read from the Azure logs – or even better, you can use available tools to help you with this task.
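For the logging side, a sketch of what “actually log something” can look like: with the diagnostic monitor running (see the WebRole setup described earlier), plain System.Diagnostics tracing is transferred to your storage account on the configured schedule. The class and method here are hypothetical:

```csharp
using System;
using System.Diagnostics;

public class OrderProcessor // placeholder class for illustration
{
    public void Process(int orderId)
    {
        // With Windows Azure Diagnostics configured, these trace calls are
        // buffered on the instance and shipped to table storage on schedule.
        Trace.TraceInformation("Processing order {0}", orderId);
        try
        {
            // ... actual work ...
        }
        catch (Exception ex)
        {
            Trace.TraceError("Order {0} failed: {1}", orderId, ex.Message);
            throw;
        }
    }
}
```

The client tools below then read those transferred log entries back out of storage for you.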
You should look into Cerebrata’s Azure Diagnostics Manager, a Windows (WPF) based client for managing Windows Azure Diagnostics. It lets you view, download, and manage the diagnostics data collected by the applications running in Windows Azure.
To get started using Cerebrata, check out this link: http://www.cerebrata.com/Products/AzureDiagnosticsManager/GettingStarted.aspx