by Andrzej Kowal

MSSQL on Amazon RDS runs out of space!

Let’s take a simple case which I needed to solve today: our MSSQL Server database hosted in Amazon RDS used up all of the storage space we assigned when the instance was created. OK, no problem, let’s increase it. Hmm, but there is no such option. So maybe take an RDS snapshot and restore it to a larger instance? Nope. OK, then let’s create a backup file, create a new RDS instance with larger storage and restore the backup file to that instance? Wrong again. This is simply not supported!

The solution is the SQL Azure Migration Wizard. It has many features, but the one we need is moving an existing database between servers. The migration includes both schema and data, and it works with any MSSQL server, including Azure, RDS and standalone installations. To solve my problem I first created a new RDS instance with the same settings but larger storage, then migrated the database from the old instance to the new one (full schema and data) using this tool.

Let’s see the migration process in details:

  1. Download and unzip the application. If you don’t have SQL Server installed on the computer where you run this tool, you will need to install Shared Management Objects (SMO) and the Command Line Utilities from this link: http://www.microsoft.com/en-us/download/details.aspx?id=29065 (this is for MSSQL 2012). You can find the necessary links after expanding the “Install Instructions” section.
  2. In some cases you need to edit the SQLAzureMW.exe.config file and adjust the path to the folder where the scripts are stored (by default it is c:\SQLAzureMW\BCPData). These scripts can get large, depending on your database size.
  3. Run the tool.
  4. Choose Analyze / Migrate -> Database.
  5. Connect to the source database (it can be RDS too – in my case it was the RDS instance where we reached the storage size limit). If you don’t have a source DB but a backup file, simply restore it to any MSSQL server you have access to (outside RDS, of course), then connect to that database.
  6. Generate scripts. You might want to take a look at the advanced options – there are quite a few of them.
  7. After the script is generated, connect to the target database (if it does not exist, you can create it during the migration process) and execute the import.

 

That’s it. I was working on a 20GB database and it took 15 minutes to pull the data and prepare the scripts, and around 90 minutes to push the data to the target DB. To speed things up I ran the tool on an AWS EC2 instance launched in the same region as the RDS instances. After the import I even managed to rename the new RDS instance to the old name, so I could keep the old connection strings in all the config files.

And what about Amazon’s guidelines for importing a database into RDS? If you take a close look, you will see that one of the suggested scenarios involves bcp.exe (bulk copy, which is part of the SQL Server Command Line Utilities). In fact, SQLAzureMW uses bcp.exe under the hood to perform the migration. My verdict: go for SQLAzureMW. Its simplicity makes it the best choice for an RDS MSSQL DBA.
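
If you prefer to script the export/import yourself with bcp.exe instead of using the wizard, the basic idea looks roughly like this – a sketch only: the server, database, table and credential values below are placeholders, you would repeat the pair of commands for every table, and the schema has to exist on the target instance before running the “in” step.

bcp MyDatabase.dbo.MyTable out MyTable.dat -n -S old-instance.xxxxxxxx.eu-west-1.rds.amazonaws.com -U myuser -P mypassword
bcp MyDatabase.dbo.MyTable in MyTable.dat -n -S new-instance.xxxxxxxx.eu-west-1.rds.amazonaws.com -U myuser -P mypassword

SQLAzureMW essentially generates and runs commands like these for every table, plus the schema scripts, which is exactly why I find it so convenient.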

by Njål

Windows Azure SQL – Entity Framework 6 Retry Logic

 

One of the biggest drawbacks of using Azure SQL is that the connection to the database is a lot more fragile and unstable than when you’re using a database server on your LAN. Also – your Azure database is running on servers shared with several hundred other databases, so the database server might disconnect you at any given time to ensure fairness of service.

These are transient errors – meaning that a retry of the query will most often succeed and fix the error.

Microsoft has released several code blocks and guidelines to help developers use this retry logic – http://code.msdn.microsoft.com/windowsazure/SQL-Azure-Retry-Logic-2d0a8401

And luckily – the EntityFramework team has added support for this in version 6 as well.

Here’s how you can get started with EF6 (beta 1) – using the built-in SqlAzureExecutionStrategy:

 

1. Install EF6 beta by opening the NuGet Package Manager Console in Visual Studio and typing
     PM> Install-Package EntityFramework -Pre

2. Install the EF6 Template generator – http://visualstudiogallery.msdn.microsoft.com/18a7db90-6705-4d19-9dd1-0a6c23d0751f. This generates the C# code for you – the classes that handle the communication with the database.

3. Restart Visual Studio

4. Add New Item -> EDMX file. Connect to the database and/or specify your data model, just like in older versions of Entity Framework.

5. Right click in the EDMX editor – and select Add Code Generation Item. Select EF 6.x DbContext Generator.

6. Compile. You should now have EF6 up and running. Only one crucial part is missing – the Azure Retry Logic.

7. Add a class to your project with the following code. It will get picked up on application start and make sure that all EF database queries are retried if they fail.

 

using System;
using System.Data.Entity.Config;
using System.Data.Entity.Infrastructure;
using System.Data.Entity.SqlServer;

namespace Www.Business.Database
{
    // Code-based EF6 configuration – discovered automatically at application start.
    public class SqlAzureDbConfiguration : DbConfiguration
    {
        public SqlAzureDbConfiguration()
        {
            // Register the SqlAzureExecutionStrategy so that transient SQL Azure errors are retried.
            AddExecutionStrategy(() => new SqlAzureExecutionStrategy(), null);
        }

        protected void AddExecutionStrategy(Func<IExecutionStrategy> getExecutionStrategy, string serverName)
        {
            AddDependencyResolver(new ExecutionStrategyResolver<IExecutionStrategy>("SqlAzureExecutionStrategy", serverName, getExecutionStrategy));
        }
    }
}

(The example provided here: http://entityframework.codeplex.com/wikipage?title=Connection%20Resiliency%20Spec does not work/compile)
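
One thing to be aware of: EF6 only discovers the configuration class automatically if it lives in the same assembly as your DbContext. If you keep it elsewhere, you can point the context at it explicitly with the DbConfigurationType attribute that ships with EF6’s code-based configuration. A minimal sketch – the context name is a placeholder, and in the beta the attribute may live in the System.Data.Entity.Config namespace rather than System.Data.Entity:

using System.Data.Entity;
using Www.Business.Database;

// Attributes on one part of a partial class apply to the whole generated context.
[DbConfigurationType(typeof(SqlAzureDbConfiguration))]
public partial class MyEntities : DbContext
{
}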

by Njål

The directory specified for caching compressed content is invalid. Static compression is being disabled.


 

After adding this in my web.config (Azure Project)…

<staticContent>
    <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
</staticContent>

…IIS refused to serve any static content – such as js and png files. Log entries in Event Viewer said:

The directory specified for caching compressed content
C:\Users\xxx\AppData\Local\dftmp\Resources\b503ce54-3fba-4772-beaa-415d94ffc214\temp\temp\RoleTemp\iisexpress\IIS Temporary Compressed Files\9ef2ab2d-28fc-4554-a09c-5ba867eb1d19 is invalid.
Static compression is being disabled.

The problem was that .svg files were already added to the staticContent list in my applicationHost.config – but still, the error message shown in Event Viewer is pretty useless. There is nothing wrong with the directory specified for storing compressed content.

So if you suddenly get this error – undo any changes in Web.config (remove all config under staticContent) and restart IIS (Express).
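
Alternatively, if you actually need the mapping in your Web.config (for example when deploying somewhere the server-level list doesn’t include .svg), removing the extension before re-adding it is the usual way to avoid a duplicate <mimeMap> entry. A sketch using standard staticContent syntax:

<staticContent>
    <remove fileExtension=".svg" />
    <mimeMap fileExtension=".svg" mimeType="image/svg+xml" />
</staticContent>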

by Njål

Intelligencia UrlRewriter – Ashx Sessions getting lost


When debugging a Windows Azure website I kept getting null Sessions in my ashx handlers. They worked perfectly on IIS Express, so I thought this was an Azure flaw. It turned out that it was IIS 7.5 + URL rewriting (I’m using the Intelligencia module) that caused the problem.

After adding this to my Web.config everything worked as expected again – also on IIS 7.5:

<configuration>
  <system.webServer>
    <modules runAllManagedModulesForAllRequests="true">
      ...
      ...
    </modules>
  </system.webServer>
</configuration>

Also – make sure your ashx code implements IRequiresSessionState

public class ApiHandler : IHttpHandler, IRequiresSessionState
{
}
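
For completeness, here is a minimal sketch of what such a handler might look like with the session actually in use – the class name and the session key are just examples:

using System.Web;
using System.Web.SessionState;

public class ApiHandler : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        // With IRequiresSessionState in place, context.Session is no longer null.
        object userId = context.Session["UserId"];
        context.Response.ContentType = "text/plain";
        context.Response.Write(userId != null ? userId.ToString() : "no session value");
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
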
by Njål

Downloading blobs from Windows Azure


Enabling users to download files (blobs) from Windows Azure can be done in 4 ways
(as Microsoft evangelist Sean Harris explains in this blog post):

  • Direct Download – Set the access of the container to Public Read Access or Full Public Read Access and expose the URL to the end user. The drawback of this method is obviously security – you have no way of controlling access once the URL is exposed. There is also no way of detecting downloads, or of executing code before/after the download.
  • API Download – The user must run a Windows app, Silverlight etc. Also – the app must contain the storage account name and key, which might compromise security, especially if you have several customers using the same account (they can still have their own containers, of course).
  • Proxy Download – Have the user access a URL on YOUR server, which then retrieves the file from Azure and sends it to the user. The advantage of this is that you have full control of downloads: you can execute code before/after downloading and you don’t have to expose any Azure URLs or account info. In fact, the end user will not even see that the file is stored in Azure. You can also override the filename this way. A downside is that all traffic passes through your server, so you might get a bottleneck here.
  • Pass-through Download (Azure Shared Access Signature) – Creates a signature and inserts it into a URL which redirects the user to Azure. The signature enables the user to access the file for a limited period of time. This is most likely your best option: it enables custom code to be executed before downloading, it ensures maximum download speed for your users, and it still provides a good level of security (see the sketch right after this list).
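
For reference, generating such a signed URL with the Windows Azure storage client library looks roughly like this. Treat it as a hedged sketch: the container and blob names are placeholders, connectionString and context come from your own handler, and the types shown (SharedAccessBlobPolicy etc.) are from the 2.x storage client – older versions use slightly different names.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// *SNIP* – inside your handler:
var account = CloudStorageAccount.Parse(connectionString);
var client = account.CreateCloudBlobClient();
var blob = client.GetContainerReference("files").GetBlockBlobReference("your-file-guid");

// Create a read-only signature that is valid for the next 10 minutes.
string sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(10)
});

// Send the user straight to Azure with the signature appended to the blob URL.
context.Response.Redirect(blob.Uri + sas);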

Our client has files stored as Guids in Azure, and we keep track of files (names, sizes etc.) and folders in an MS SQL database. This is the reason we have to use the Proxy Download option when users are downloading files.

Sean Harris has developed such a proxy, which streams the file directly from Azure to the end user. His code, however, requires a third-party framework and seems to overcomplicate things a lot. Here’s a version which does the job just as well (.ashx generic handler):

 

//Retrieve filename from DB (based on fileid (Guid))
// *SNIP*
string filename = "some file name.txt";

//IE needs a URL-encoded filename. Important when there are spaces or other non-ANSI chars in the filename.
if (HttpContext.Current.Request.UserAgent != null && HttpContext.Current.Request.UserAgent.ToUpper().Contains("MSIE"))
    filename = HttpUtility.UrlEncode(filename, System.Text.Encoding.UTF8).Replace("+", " ");

context.Response.Charset = "UTF-8";
//Important to set Buffer to false. IIS will download the entire blob before passing it on to the user if this is not set to false.
context.Response.Buffer = false;
context.Response.AddHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");
context.Response.AddHeader("Content-Length", "100122334"); //Set the length of the file
context.Response.ContentType = "application/octet-stream";
context.Response.Flush();

//Use the Azure API to stream the blob to the user instantly.
// *SNIP*
fileBlob.DownloadToStream(context.Response.OutputStream);
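
Rather than hard-coding the Content-Length you would normally take it from the blob itself, before calling Flush(). A small sketch, assuming fileBlob above is a CloudBlob/CloudBlockBlob from the storage client library:

// Fetch the blob's properties first (one extra round-trip), then report the real size.
fileBlob.FetchAttributes();
context.Response.AddHeader("Content-Length", fileBlob.Properties.Length.ToString());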

 

by Andreas

Windows Azure Storage Emulator – Blob service does not start

One of the features shipped with the Windows Azure SDK for Visual Studio 2010 is the Azure Storage Emulator. This little add-on enables you to run your very own Azure cloud storage service locally while developing, and needless to say this makes everything quite a lot simpler.

When first initialized, the emulator creates a database in Microsoft SQL Server Express (which you can change to your local SQL Server if you want). It also sets up three services to emulate the Blob, Queue and Table storage types on different ports (10000, 10001 and 10002 by default).


If you receive the following error message when attempting to start one of the services (typically the Blob service)

Process cannot access the file because it is being used by another process

it actually means that the port is taken, not a file. Run “netstat -ano” from your command prompt to find the culprit that’s taking up your precious port (which will probably be your torrent client – don’t tell the RIAA!).
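
For example, to check who is sitting on the Blob endpoint (port 10000 by default – adjust if you have changed it), something like this should do; the PID in the second command is whatever the last column of the netstat output shows:

netstat -ano | findstr :10000
tasklist /FI "PID eq 1234"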