1- User clicks on a download link
2- Client displays a "Loading..." message
3- Client sends a request to the server to generate and return the file.
(e.g. window.location = '@Url.Content("~/Download/Pdf")';)
4- Server returns the dynamically generated file to the client
5- Client hides the "Loading..." message
6- Client starts to download the file
Unfortunately, JavaScript has no way of knowing when the download starts, so we cannot hide the message! None of my Google searches turned up anything promising.
As a first solution, I came up with using TempData. Here are the steps in this approach (a sketch of the two actions follows the list):
1- User clicks on a download link
2- Client displays a "Loading..." message
3- Client sends an ajax request to the server to generate the file. Assume that the url is '@Url.Content("~/Download/PreparePdf")'
4- Server generates the report in pdf format and stores the file stream in TempData, then returns a json response confirming that the file is ready to download
5- Client hides the "Loading..." message and sends another request to download the file (e.g. window.location = '@Url.Content("~/Download/Pdf")';), which hits a separate action method.
6- Download starts
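A minimal sketch of those two actions, assuming a DownloadController and a hypothetical GeneratePdf() helper that produces the report bytes; the null check in Pdf() is the defensive fallback discussed shortly:

public class DownloadController : Controller
{
    // Steps 3-4: generate the pdf, park it in TempData and tell the client it is ready
    public ActionResult PreparePdf()
    {
        TempData["PdfFile"] = GeneratePdf();
        return Json(new { ready = true }, JsonRequestBehavior.AllowGet);
    }

    // Steps 5-6: return the previously generated pdf to the browser
    public ActionResult Pdf()
    {
        var buffer = TempData["PdfFile"] as byte[];
        if (buffer == null)
        {
            // Defensive fallback: regenerate the file if TempData is empty
            buffer = GeneratePdf();
        }
        return File(buffer, "application/pdf", "test.pdf");
    }

    private byte[] GeneratePdf()
    {
        // report generation omitted - demo purpose
        return System.IO.File.ReadAllBytes("C:/Temp/DummyFile.pdf");
    }
}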
At first glance everything looks fine and it gives the user a nice experience. However, after deploying the application to the UAT environment and running some load tests, we found that some of the file downloads took longer than expected (around 15 seconds). As you know, the longer the response time, the more requests end up waiting in the thread pool.
Digging into the problem helped me find the root cause. If you look at the workflow above, the client calls the server twice (one call to generate the pdf server side, and another to download the file). If you have only one web server serving the requests, you won't see any issue. However, a web farm behind a load balancer reproduces the issue! The reason is simple: there is no guarantee that the second HttpRequest will be routed to the same server that handled the first one. Hence, the second request finds TempData empty and has to generate the file again (that is the defensive programming fallback kicking in). That's why we experience a response time longer than before.
One solution is to use a centralized state server such as AppFabric or SQL Server. In this approach, instead of storing the pdf stream in TempData, we store it in the state server, which is accessible to all the web servers.
For a few reasons I didn't want to use this approach. I still had a bad feeling about sending two separate requests just to show a progress message on the client side!
I ended up using cookies as my hero! Before explaining it, let's have a look at some sample code.
Client-side code:
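A minimal sketch of the client-side script, assuming the jQuery cookie plugin is loaded; the startDownloadChecker name and the #loadingMessage element are placeholders for this demo:

var fileDownloadCheckTimer;

function startDownloadChecker() {
    // unique token (a timestamp) written to the fileDownloadToken cookie
    var token = new Date().getTime();
    $.cookie('fileDownloadToken', token, { path: '/' });

    $('#loadingMessage').show();
    window.location = '@Url.Content("~/Download/Pdf")';

    // poll the cookie every second; the server flips the token's sign when it is done
    fileDownloadCheckTimer = window.setInterval(function () {
        var cookieValue = $.cookie('fileDownloadToken');
        if (cookieValue == (token * -1)) {
            downloadStarted();
        }
    }, 1000);
}

function downloadStarted() {
    window.clearInterval(fileDownloadCheckTimer);
    $.cookie('fileDownloadToken', null, { path: '/' }); // clear the cookie
    $('#loadingMessage').hide();
}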
The Action code - server side:
public ActionResult Download()
{
    // set a cookie to notify the client that the server side process is done
    var requestToken = Request.Cookies["fileDownloadToken"];
    if (requestToken != null && long.Parse(requestToken.Value) > 0)
    {
        var responseTokenValue = long.Parse(requestToken.Value) * (-1);
        Response.Cookies["fileDownloadToken"].Value = responseTokenValue.ToString();
    }

    // Just a simulation of generating a pdf file - demo purpose
    Thread.Sleep(7000);

    FileStream fs = new FileStream("C:/Temp/DummyFile.pdf", FileMode.Open);
    byte[] buffer = new byte[fs.Length];
    fs.Read(buffer, 0, (int)fs.Length);
    fs.Close();

    return File(buffer, "application/pdf", "test.pdf");
}
Now let's dig into the code. First of all, I am using the jQuery cookie plugin to manipulate cookies on the client side. The client-side function creates a unique token (a timestamp in this demo) and assigns it to a cookie called fileDownloadToken. It then sets up an interval function that checks the cookie every second, waiting for the same token multiplied by (-1). When that happens, the downloadStarted function is called to clear the cookie and stop the checker.
On the server side, inside the Action method, we first check the cookie and, if it exists, multiply its value by (-1) and write it back to the response. After generating the pdf, we use the MVC File result to return the pdf file.
You may think this is the end of the story, but it is not! Taking 7 seconds to process a request in ASP.NET is awful: it ties up worker threads and increases the number of requests waiting to be served by the web server.
A simple change that dramatically improves performance is to leverage the OutputCache feature in ASP.NET MVC. This is a very handy feature that caches the output of an Action so that subsequent requests are served without re-executing it; the details are well documented, so please have a look at them yourself.
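Applied to the action above, it could look something like this; the Duration and VaryByParam values are just example settings, not prescriptions:

[OutputCache(Duration = 3600, VaryByParam = "none")]
public ActionResult Download()
{
    // ... same body as the Download action shown earlier ...
}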
Again, because we are using a web-farm approach, I will use the AppFabric cache to persist the cached download results.
The problem with this scenario is that the Action code won't be executed for cached urls, so the cookie value will never be changed. As a result, the "Loading..." message will stay up forever!
This issue can be solved by implementing the following HttpModule:
public class DownloadModule : IHttpModule
{
    public void Dispose()
    {
    }

    public void Init(HttpApplication context)
    {
        context.PreSendRequestHeaders += new EventHandler(context_PreSendRequestHeaders);
    }

    void context_PreSendRequestHeaders(object sender, EventArgs e)
    {
        HttpApplication app = sender as HttpApplication;

        var requestToken = app.Request.Cookies["fileDownloadToken"];
        long requestTokenValue;
        if (requestToken != null && long.TryParse(requestToken.Value, out requestTokenValue))
        {
            if (requestTokenValue > 0)
            {
                var responseTokenValue = requestTokenValue * (-1);
                app.Response.Cookies["fileDownloadToken"].Value = responseTokenValue.ToString();
            }
        }
    }
}
This module checks the cookie just before the HTTP headers are sent back to the client. It doesn't matter whether the output of the request is served from the cache or not; the PreSendRequestHeaders event is always raised in the request pipeline. (Remember to register the module in web.config so that it is actually loaded.)
OutputCache uses the built-in ASP.NET output cache by default. We can customize it to use other caching technologies like AppFabric; it is as simple as inheriting from OutputCacheProvider and overriding its four main methods. ScottGu has a nice blog post that describes how to extend the output cache in ASP.NET 4.
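The shape of such a provider is roughly the following; this is only a sketch, and the method bodies are placeholders showing where the distributed-cache (e.g. AppFabric) calls would go rather than real AppFabric code:

using System;
using System.Web.Caching;

public class AppFabricOutputCacheProvider : OutputCacheProvider
{
    public override object Add(string key, object entry, DateTime utcExpiry)
    {
        // add the entry to the distributed cache only if the key is not already present,
        // then return the value stored under that key
        return entry;
    }

    public override object Get(string key)
    {
        // look the key up in the distributed cache; return null when it is missing
        return null;
    }

    public override void Set(string key, object entry, DateTime utcExpiry)
    {
        // insert or overwrite the entry in the distributed cache with the given expiry
    }

    public override void Remove(string key)
    {
        // evict the entry from the distributed cache
    }
}

The custom provider is then registered under the outputCache element of the caching section in web.config and selected as the default provider.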
Enjoy coding...