For some time now, developers have used asynchronous programming techniques to get more out of their applications, with increasing amounts of work done in the background without freezing the screen. Web applications and services support a base level of concurrency naturally by virtue of the ability of web servers to handle many requests simultaneously. At the level of individual requests, work done in parallel, especially when multiple external resources need to be accessed, can result in shorter response times for users.

Recognizing this reality, Microsoft has continually improved and simplified its language and platform support for asynchronous programming in .NET. The latest innovation came in 2012, when Microsoft introduced the async and await keywords with C# 5.0 and the .NET 4.5 release. Now, New Relic’s .NET Agent release 6.0 includes general availability of accurate instrumentation for applications that use the latest async techniques in .NET.


Understanding async: a taxi analogy

To understand the value of Microsoft’s latest support for async programming, think about going on a business trip that involves a plane flight. Imagine doing this the old-fashioned (pre-Uber) way: taking a cab to and from the airport on both ends of your trip. Now imagine that the cab company wanted to provide you an extra service: a “dossier” at the end of your trip, listing the times you left your house, left the cab at the airport, got back in the cab upon your return, and finally, returned to your house. How could it accomplish this?

One way would be to have the cabbie wait at the airport after dropping you off until you returned from your trip. The driver could write all the times on a clipboard and simply hand you your dossier at the end. That works, but the cab company is going to have a hard time staying in business with its cabs sitting at the airport so long, waiting for particular passengers to return.

A smarter approach would be for the cabbie who drove you to the airport to write down the times and hand them to the dispatcher when she got back to the cab depot (yeah, they used to have those, remember the old “Taxi” TV show?). When you return from your trip, the dispatcher would send another cab to the airport, handing the driver the dossier to complete after picking you up at the airport and dropping you off at home.

On days with plenty of demand from riders, more of the cabs stay busy, more of the time. Much better, right?

What is async in .NET?

Getting back to software, the improved async capability in .NET 4.5 and later boils down to compiler, runtime, and (significantly) IIS and WCF support for two new C# keywords, async and await:

  • Declaring a method async means that it returns a promise (also known as a future) rather than a direct result. A promise typically wraps a result of a specific type and will yield its result once available. This makes sense for operations that take significant time to complete and may execute on separate threads, or via a blocking system call.
  • Callers of async methods can await the result of the promise returned by the method any time after calling the method (not necessarily immediately). So a caller can call an async method to get things started, do some unrelated work (including, potentially, calling other async methods), then await the result when there’s nothing else to be done, with execution automatically continuing after the await returns.
  • When a method awaits the result of a call to an async method, it doesn’t block—it just suspends execution, which is picked up once the result is available through the magic of the .NET runtime. Meanwhile, control returns to the caller of the method that did the await. A method cannot do an await unless it declares itself as async. So if the caller of that method also decides to await, then the cycle continues. Sort of a “hall of mirrors” effect, but with some powerful benefits.
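The pattern described above can be sketched in a few lines. This is a minimal, self-contained illustration, not taken from any particular application; the method names and the `Task.Delay` stand-in for slow I/O are hypothetical.

```csharp
using System;
using System.Threading.Tasks;

public class AsyncSketch
{
    // Declaring the method async lets it use await; its signature returns a
    // promise (Task<string>) rather than the string itself.
    public static async Task<string> FetchGreetingAsync()
    {
        await Task.Delay(100); // stand-in for slow I/O; suspends without blocking a thread
        return "hello";
    }

    public static async Task CallerAsync()
    {
        Task<string> promise = FetchGreetingAsync(); // kicks off the work
        DoUnrelatedWork();                           // runs while the task is pending
        string greeting = await promise;             // suspends (not blocks) until ready
        Console.WriteLine(greeting);
    }

    private static void DoUnrelatedWork() { /* e.g., update a local cache */ }
}
```

Note that `CallerAsync` must itself be declared async in order to await—this is the “hall of mirrors” effect described above.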

Where does the buck stop?

At this point, you may be wondering “where does it all end?” If things are really async all the way down in, say, a Web API application (i.e., even the controller methods are declared as async), then what’s to stop the request lifecycle from completing before all the awaits have resumed, resulting in an incomplete response to the client?

Here, too, Microsoft has handled the details in .NET and IIS/ASP, which keep track of outstanding awaits in a request and postpone its completion until they return. This leads to some nice performance characteristics for applications that use async and await.

What are the benefits of async/await?

Concurrency: Imagine that you’re implementing a REST API using C# and ASP.NET Web API. You have an endpoint that needs to fetch data from a database and two external services, then combine the results into some JSON to return to a mobile app. You also need to update some statistics you’re keeping in a local cache to reflect particulars of the request.

Since the database and service calls take time, you’d like to get them all started right away, do the cache update while they’re executing, then finish processing when the results are available. Before async/await, you probably would have done this using Thread.Start and Thread.Join, or perhaps the TPL (Task Parallel Library) with .NET threading. But async/await and compatible client libraries (HttpClient.SendAsync for external service calls, SqlCommand.ExecuteReaderAsync, etc. for databases) let you do it without ever having to create threads on your own.

The client libraries expose async methods, and behind the scenes they use non-blocking I/O rather than dedicating a thread to each call, returning promises whose results can be awaited. The awaits return when the underlying I/O completes.
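The database side works the same way as the HTTP side. Here’s a hedged sketch using ADO.NET’s async API; the query and column are placeholders, not from the original article.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;

public static class CustomerQueries
{
    public static async Task<List<string>> GetNamesAsync(string connectionString)
    {
        var names = new List<string>();
        using (var connection = new SqlConnection(connectionString))
        {
            await connection.OpenAsync();  // non-blocking connection open
            using (var command = new SqlCommand("SELECT Name FROM Customers", connection))
            using (var reader = await command.ExecuteReaderAsync()) // awaitable query
            {
                while (await reader.ReadAsync()) // rows arrive asynchronously
                {
                    names.Add(reader.GetString(0));
                }
            }
        }
        return names;
    }
}
```

Each `await` here releases the calling thread while the database does its work, exactly as with `HttpClient.GetStringAsync`.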

Here’s a simple code sample:

public class MyWebApiController : ApiController
{
    private static readonly HttpClient httpClient = new HttpClient();

    public async Task<IEnumerable<string>> Get()
    {
        var taskList = new List<Task<string>>();

        taskList.Add(MakeRequest("someUrl1"));
        taskList.Add(MakeRequest("someUrl2"));
        taskList.Add(MakeRequest("someUrl3"));

        CallMyOtherLocalWork();

        var results = await Task.WhenAll(taskList);
        return results;
    }

    private async Task<string> MakeRequest(string url)
    {
        var result = await httpClient.GetStringAsync(url);
        return result;
    }
}


In this example, each of the three HttpClient.GetStringAsync() calls will execute in parallel, with each of the three calls to MakeRequest suspended at that point, with control returning to the Get() method in the controller. The await on Task.WhenAll() allows the controller method itself to be suspended by IIS/ASP until all the awaits resume and finish, at which point the web API request itself completes, returning an array of strings containing the external call results. But something else good happens here too.

Scalability: So far, async/await may seem like a slightly easier way of getting concurrency in your application, right? That’s only part of the story.

The IIS web server running .NET web applications and services (including WCF) has evolved into a highly scalable software platform capable of handling large volumes of low-latency traffic, as you can see in many of today’s microservice architectures. Like most web servers, one way IIS scales is by maintaining a pool of threads that can be immediately assigned to handle incoming requests. Under extreme load, however, the chance that all the threads in the pool will be busy goes up, and with it the chance that an incoming request will have to wait, or even fail. If this condition persists, IIS reacts by adding threads to its pool, but it does so conservatively. That means using the threads already in the pool as efficiently as possible is the key to good performance under load.

Let’s go back to the code sample above. Once you await in your top-level controller method, IIS is smart enough to return the thread under which your request was running back to the pool, where it can immediately be used to handle other incoming requests. Each time one of your awaits resumes, you get a thread from the pool to resume the awaited method. It may not be the thread you started with, but IIS restores all important contextual information for you, so it really doesn’t matter.
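To make the contrast concrete, here’s a sketch of the same call written both ways. The controller and URL are hypothetical; the point is what happens to the pool thread in each case.

```csharp
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;

public class ScalabilityExampleController : ApiController
{
    private static readonly HttpClient httpClient = new HttpClient();

    // Blocking version: the pool thread sits idle for the whole download,
    // unavailable to serve other requests.
    public string GetBlocking()
    {
        using (var client = new WebClient())
        {
            return client.DownloadString("someUrl"); // thread blocked during I/O
        }
    }

    // Async version: at the await, the pool thread goes back to IIS; a thread
    // (possibly a different one) resumes this method when the response arrives.
    public async Task<string> GetAsync()
    {
        return await httpClient.GetStringAsync("someUrl");
    }
}
```

Under light load the two behave similarly; the difference shows up under heavy load, when blocked threads in the first version starve the pool.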

The result? In IIS as a whole, threads stay busy, not blocked, yielding much better scalability! Congratulations, you’re now running a digital version of a smart cab service!

New Relic support for async in our .NET agent Release 6.0

Once you decide that the concurrency and scalability advantages of async/await are what’s required for your high-throughput web service or application, you’ll want to make sure you’re using the latest version of the New Relic .NET Agent.

Release 6.0 of the agent plugs right into the new generation of async-capable network and database client libraries (see the public documentation for details) to give you accurate timings on your async calls. In particular, when you view your application in the APM UI, you’ll be able to see:

  • The stacked overview chart showing the time taken by the components of your application, representing the Total Time taken in processing requests. Now, however, you get a superimposed line showing the Response Time for your requests—the actual time they take to complete. The area “above the line” shows the benefits of concurrency in your application: the time spent processing awaited tasks in parallel, often by external servers and databases:

[Chart: web transactions time]


  • While a visual representation of the overall concurrency in your application is a great start, you’ll likely want to dig deeper when troubleshooting performance issues. Transaction traces, generated specifically for slow transactions, are your starting point for this investigation, and with our new async support, you’ll see the segments of your transaction that executed asynchronously starting at roughly the same timestamp. From here, you can think about rearranging the way your transaction does its work to gain even more concurrency speedup:

[Screenshot: transaction trace details]

Try it today

At New Relic, we’re big fans of the way .NET has introduced asynchronous capabilities in its standard HTTP and database client libraries, while encouraging database vendors to do the same. The tight integration with IIS/ASP means that once you start using these libraries and the simple coding techniques required, you get instant concurrency and scalability advantages in your applications. By adding async instrumentation support to the .NET agent, we’re casting our vote in favor of our customers leveraging these new technologies as soon as possible.

Download and install the latest New Relic .NET agent today and take advantage of our support for async!

Marc Perrone is a senior software engineer in New Relic’s Portland engineering headquarters.


Interested in writing for New Relic Blog? Send us a pitch!