ASP.NET 5 Configuration API for .NET 4.5

As I’ve stated in previous posts, I think the new ASP.NET 5 Configuration API is great. The thing is, I want to use it now, before I can use 4.6 on all my projects. And even then, we have some projects that won’t be upgraded anytime soon. If it ran on .NET 4.5, though, I’d be OK.

So I decided to fork the repo and see how much I would have to change to get it running on 4.5. As it turns out, it wasn’t too bad: I had to create the proper MSBuild projects (csproj), modify a few areas of the code where C# 6 was being used and install the appropriate NuGet packages. The code compiles and the tests pass.

I don’t mean to redo what Microsoft did in any way, I’m just interested in maintaining a 4.5 version of the API while 4.5 remains relevant. It’s all about making my life easier.

I’m not publishing NuGet packages for it, as I don’t want to create any confusion on NuGet about what is official and what isn’t. If you need this like I do, feel free to clone my forked repo and rebuild it on your machine.

If you like it or find that you have a use for it, please ping me on Twitter. I’d be curious to know if I’m the only madman out there who wanted something like this.

My changes are on the net45 branch of the repository. Have a look at it here:

https://github.com/perezgb/Configuration/tree/net45

Well, that’s it for now, folks. Have fun and see you next time.

ASP.NET 5 Typed Settings with the ConfigurationBinder

One thing that may seem really small but that I really love in the new ASP.NET 5 is the Configuration API. The reason I like it so much is that I’ve written something very similar in the past to make up for the lack of flexibility in System.Configuration. So now I get to retire my own implementation and just use the new API. Yay, less code I have to maintain on my own.

I’m not going to provide a general introduction to the API; that has already been done well by Louis Dejardin and Jsinh.

What I would like to talk about is the lesser known ConfigurationBinder. What this handy class does is allow you to take the key-value structure of the IConfiguration interface and use it to populate a typed settings class. So here’s what it looks like:

Let’s say we define a json file with our settings:

{
    "Server": "PLUTO",
    "Port": 8080
}

Now let’s load that file using the config classes and output the values to the console.

IConfiguration configuration = new ConfigurationBuilder()
               .AddJsonFile("MySettings.json")
               .Build();

Console.WriteLine(configuration.Get("server"));
Console.WriteLine(configuration.Get("port"));

What if we wanted a typed settings class? Well, that’s straightforward too. First we define a class with properties mapping to the settings file.

public class TypedSettings
{
    public string Server { get; set; }
    public int Port { get; set; }
}

Now we get the ConfigurationBinder working. It’s just a call to a static method specifying the type of the class to read the settings into and the IConfiguration instance:

IConfiguration configuration = new ConfigurationBuilder()
               .AddJsonFile("MySettings.json")
               .Build();

TypedSettings settings = ConfigurationBinder.Bind<TypedSettings>(configuration);

Console.WriteLine(settings.Server);
Console.WriteLine(settings.Port);

The best way to get all the details is to clone the repository and have a look at the unit tests. But let’s just go through some of the valid and invalid scenarios.

Properties without a public setter are not valid:

public class ComplexOptions
{
    //Exception: private setters not allowed
    public string PrivateSetter { get; private set; }
    //Exception: protected setters not allowed
    public string ProtectedSetter { get; protected set; }
    //Exception: internal setters not allowed
    public string InternalSetter { get; internal set; }
    //Static properties work
    public static string StaticProperty { get; set; }

    //Exception: readonly properties not allowed
    public string ReadOnly
    {
        get { return null; }
    }
}

Nested options are valid, with the following exceptions:

public class NestedOptions
{
    //Valid
    public NestedValid NestedValid { get; set; }
    //Invalid: the class has no public ctor, so it can't be instantiated
    public NestedInvalid NestedInvalid { get; set; }
    //Invalid: the framework wouldn't know which class to instantiate
    public ISomeInterface SomeInterface { get; set; }
}

public class NestedValid
{
    public int Integer { get; set; }
}

public class NestedInvalid
{
    private NestedInvalid() {}

    public int Integer { get; set; }
}

public interface ISomeInterface
{
    int Integer { get; set; }
}
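
To see nesting in action, here is a minimal sketch. The OuterOptions class and the NestedSettings.json file are hypothetical, made up for this example; they only mirror the valid case above.

// A hypothetical options class containing only the valid nested type from above.
public class OuterOptions
{
    public NestedValid NestedValid { get; set; }
}

// NestedSettings.json (hypothetical file):
// {
//     "NestedValid": { "Integer": 42 }
// }
IConfiguration configuration = new ConfigurationBuilder()
    .AddJsonFile("NestedSettings.json")
    .Build();

// The binder walks the nested keys (e.g. "NestedValid:Integer") and creates
// NestedValid through its public parameterless constructor.
OuterOptions options = ConfigurationBinder.Bind<OuterOptions>(configuration);

Console.WriteLine(options.NestedValid.Integer); // 42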

My main goal with this post is just to shed some light on this very useful class that hasn’t gotten a lot of attention yet.

Happy configuring!

Better logging for Web API and MVC

So I’ll try not to be too long with this one. Logging is a good thing; I guess there’s no argument there. Who am I kidding, we’re developers, we can make an argument out of anything. Spaces vs. tabs… Don’t worry, I’m not going there.

Anyway, when you’re testing a Web application, be it an API or an MVC one, going through the logs is only simple when you’re the only one hitting the app sequentially. All the requests and responses are neatly separated and parsing the log is a breeze. With a production app, that is totally not the case. Multiple requests come in at the same time and you’re left scrambling to figure out which request each message belongs to. Let’s see an example of this. Here’s a sample controller from a Web API project with a single log message:

public class ValuesController : ApiController
{
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    public async Task<HttpResponseMessage> Get()
    {
        Log.Debug("Hi there");
        return Request.CreateResponse(HttpStatusCode.OK, new[] { "one", "two" });
    }
}

If we hit that endpoint once this is what we get:

2015-07-02 22:57:16.6195 DEBUG Hi there

Now, wouldn’t it be nice if we could see the address of the request, the verb used and how long it took to execute? And even better, what if we had a unique identifier included in every log message that would allow us to easily parse our log? So check this out:

2015-07-02 22:58:43.9794 c265a13b-99bd-494e-b83a-4c4b1a891d4b TRACE Requesting:[GET]http://localhost:57577/api/values
2015-07-02 22:58:44.0605 c265a13b-99bd-494e-b83a-4c4b1a891d4b DEBUG Hi there
2015-07-02 22:58:44.0805 c265a13b-99bd-494e-b83a-4c4b1a891d4b TRACE Request completed. Duration:00:00:00.1019623 StatusCode:OK

That’s much better than before. Now every log message in the request has the same GUID, so it’s easy to parse your log and find the messages you want. We also have a line marking the beginning of the request and one marking the end. All this can be achieved without changing any of your existing log messages. Here’s what we need to get this going:

1-Install the following NuGet packages: NLog and NLog.Contrib.

2-Add a DelegatingHandler that will tap in the request pipeline to do the logging:

public class LoggerDelegatingHandler : DelegatingHandler
{
    private static readonly Logger Log = LogManager.GetLogger("RequestTracer");

    protected async override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        var stopwatch = new Stopwatch();

        Guid guid = Guid.NewGuid();
        NLog.Contrib.MappedDiagnosticsLogicalContext.Set("requestid", guid.ToString());
        stopwatch.Start();
        Log.Trace("Requesting:[{0}]{1}", request.Method.Method, request.RequestUri);

        var response = await base.SendAsync(request, cancellationToken);

        stopwatch.Stop();
        Log.Trace("Request completed. Duration:{0} StatusCode:{1}", (object) stopwatch.Elapsed, (object) response.StatusCode);
        return response;
    }
}

This delegating handler intercepts every request and adds a message before and after the controller’s action method runs. The magic here is in the line that sets the requestid for NLog: it attaches that id to the logical context, so it gets appended to every log message written during the request.

3-Change your Application_Start to hookup the handler and initialize NLog:

public class WebApiApplication : System.Web.HttpApplication
{
    protected void Application_Start()
    {
        //initialize the NLog config
        NLog.Config.ConfigurationItemFactory.Default.LayoutRenderers.RegisterDefinition("mdlc", typeof(MdlcLayoutRenderer));

        GlobalConfiguration.Configure(configuration =>
        {
        	//add the custom handler to the message handler pipeline
            configuration.MessageHandlers.Add(new LoggerDelegatingHandler());

            WebApiConfig.Register(configuration);
        });
    }
}

4-Modify the NLog.config file so the layout includes your newly registered renderer:

<target type="File" name="f" fileName="${basedir}/logs/${shortdate}.log"
        layout="${longdate}|${mdlc:item=requestid}|${uppercase:${level}}|${message}" />

I won’t go into much detail about how this works because you can read it directly from the source here. That post will answer all your questions about how this really works in NLog.

You can get the full sample from my GitHub samples repository.

I think I ran a little longer than I expected, but I hope it was worth it. Happy logging.

Moving to GitHub Pages

My blog has been kinda dead for a while for several reasons. One of them is that I’ve been busy at work, but who hasn’t, right? Another is that I’ve always written my own blog engine as a way of testing out ideas on something that is mine and is also in production. That worked OK for several years: I got to test WebForms, ASP.NET MVC, NHibernate, Entity Framework, several IoC containers, Bootstrap and many more. The downside, however, is that I never took the time to write an admin area that made it easy to publish my posts. The shoemaker’s son always goes barefoot.

Lately I’ve been meaning to get the blog active again, so I did some research, and using GitHub Pages and Jekyll seemed like the logical choice. I love Git and GitHub, and Markdown is a lot better than HTML for writing posts. It also doesn’t hurt that all the cool kids are doing it :-)

So there you have it, this blog has officially been resurrected.

NDC 2012 Schedule in PDF

Here is a printer friendly version of the NDC 2012 schedule:

Wednesday

Thursday

Friday

Just right click the links and select Save As to save the files.

See you in Oslo :-)

Auto Generating Inner Joins in PetaPoco

One of the neat features of PetaPoco is Multi-Poco queries. They allow you to define a query that joins two or more tables and populates a set of related objects.

As an experiment in extending PetaPoco I wanted to be able to auto-generate the SQL statement for a Multi-Poco query. And I have to say it was pretty simple. Let's say we define our POCOs as:

[PrimaryKey("PersonId")]
public class Person
{
    public int PersonId { get; set; }
    public int AddressId { get; set; }
}

[PrimaryKey("AddressId")]
public class Address
{
    public int AddressId { get; set; }
    public string Street { get; set; }
}

Let's say that we would like to retrieve all the Person records with the associated Address object. This is how it would work right now:

string sql = @"SELECT [Person].[PersonId], [Person].[AddressId],
        [Address].[AddressId], [Address].[Street]
        FROM [Person] INNER JOIN [Address] ON [Person].[AddressId] = [Address].[AddressId]";

var database = new Database(string.Empty);
IEnumerable<Person> persons = database.Query<Person, Address>(sql);

This is how I wanted it to be:

var database = new Database(string.Empty);
IList<Person> persons = database.AutoQuery<Person, Address>();

To make this possible I created a few methods using PetaPoco's metadata classes. The main method is BuildSql, which takes the type of the main POCO (the root table) and an array of types of the POCOs that should be joined to the main table. Here's the code:

public partial class Database
{
    public IList<T1> AutoQuery<T1, T2>()
    {
        string sql = BuildSql(typeof(T1), new[] { typeof(T2) });
        return Query<T1>(new[] { typeof(T1), typeof(T2) }, null, sql, null).ToList();
    }

    public string BuildSql(Type rootType, Type[] joinTypes)
    {
        PocoData rootData = PocoData.ForType(rootType);
        string rootTable = EscapeSqlIdentifier(rootData.TableInfo.TableName);

        IEnumerable<string> columns = GetColumns(rootData);
        string join = string.Empty;

        foreach (var joinType in joinTypes)
        {
            PocoData joinData = PocoData.ForType(joinType);
            columns = columns.Union(GetColumns(joinData));
            join += BuildJoin(rootTable, joinData);
        }

        string columnList = string.Join(", ", columns);

        return string.Format("SELECT {0} FROM {1}{2}",
                             columnList,
                             rootTable,
                             join);
    }

    private string BuildJoin(string rootTable, PocoData join)
    {
        string joinedTable = EscapeSqlIdentifier(join.TableInfo.TableName);
        string joinPk = EscapeSqlIdentifier(join.TableInfo.PrimaryKey);
        return string.Format(" INNER JOIN {0} ON {1}.{2} = {3}.{4}",
                             joinedTable,
                             rootTable,
                             joinPk,
                             joinedTable,
                             joinPk);
    }

    private IEnumerable<string> GetColumns(PocoData rootData)
    {
        var tableName = EscapeSqlIdentifier(rootData.TableInfo.TableName);

        var cols = from c in rootData.QueryColumns
                   select tableName + "." + EscapeSqlIdentifier(c);
        return cols;
    }
}

Basically what we are doing is generating the SQL statement and using the Query method that is already available in PetaPoco. I think this shows how easy it is to extend PetaPoco with your own conventions. And what is even nicer is that you don't need to convince anyone to change PetaPoco; you can just add your changes locally.

Of course this is just a simple example. It's easy to further evolve it to take additional parameters to specify a WHERE clause for the statement (something like the sketch below).
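
As a rough sketch of that idea (this overload is not part of the extension above, just an assumption of how it could look), the caller-supplied condition is appended to the generated statement and the arguments are forwarded to Query:

// Another method on the same partial Database class as above.
public IList<T1> AutoQuery<T1, T2>(string where, params object[] args)
{
    // Reuse the generated SELECT/INNER JOIN and tack on the condition.
    string sql = BuildSql(typeof(T1), new[] { typeof(T2) });
    if (!string.IsNullOrEmpty(where))
    {
        sql += " WHERE " + where;
    }
    return Query<T1>(new[] { typeof(T1), typeof(T2) }, null, sql, args).ToList();
}

Usage would look something like this, using PetaPoco's positional parameter syntax:

var persons = database.AutoQuery<Person, Address>("[Address].[Street] = @0", "Elm Street");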

I hope this will give other people more ideas on how to extend PetaPoco to solve their own problems.

Serializing Custom Enumeration With Json.NET

When I went to Codemash a few months ago I watched Jimmy Bogard’s Crafting Wicked Domain Models talk. One of the interesting things in his talk was his implementation of a Java-like enum. It was something I had seen before, with varying implementations, but had never felt really compelled to switch to. What changed my mind was Jon Skeet’s talk on C# Greatest Mistakes, where he showed several problems with the C# implementation of enums. After that I decided to start using Jimmy’s implementation full time.

Last week, though, I came across a serialization issue when using Json.NET with it. Look at this sample implementation:

public class ColorEnum : Enumeration
{
    public static readonly ColorEnum Black = new ColorEnum(1,"Black");

    public static readonly ColorEnum White = new ColorEnum(2,"White");
    
    [JsonConstructor]
    private ColorEnum(int value, string displayName) : base(value, displayName)
    {
    }
}

The code for the Enumeration class can be found here: https://github.com/jbogard/presentations/blob/master/WickedDomainModels/After/Model/Enumeration.cs

First, notice that in order to be able to deserialize it properly I had to add the JsonConstructor attribute to the constructor. But that’s not my issue yet.

My first issue is that the deserialized object will be a new instance of my ColorEnum item. That’s not too bad, as the Enumeration class is prepared to handle it by overriding the Equals method like this:

public override bool Equals(object obj)
{
    var otherValue = obj as Enumeration;

    if (otherValue == null)
    {
        return false;
    }

    var typeMatches = GetType().Equals(obj.GetType());
    var valueMatches = _value.Equals(otherValue.Value);

    return typeMatches && valueMatches;
}

It would still fail, however, if we did a comparison using the “==” operator. But that can be fixed by overloading the operators in the Enumeration class like so:

public static bool operator ==(Enumeration left, Enumeration right)
{
    return Equals(left, right);
}

public static bool operator !=(Enumeration left, Enumeration right)
{
    return !Equals(left, right);
}

So if we can get away with multiple instances why would I want to use a single instance? Because I can :-)

The neat thing is that by sticking with a single instance I can also make the serialization look more like a regular enum’s. Here is what it looks like when serialized today:

{"Color":{"Value":1,"DisplayName":"Black"}}

And this is how I’d like it to be:

{"Color":1}

So how can we achieve this? Json.NET custom converters.

Custom converters are not complicated to implement; the one I created to take care of serializing my enumeration is as follows:

public class EnumerationConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        var enumeration = (Enumeration)value;
        serializer.Serialize(writer, enumeration.Value);
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        if (reader.TokenType == JsonToken.Null)
        {
            return null;
        }

        int value = serializer.Deserialize<int>(reader);
        foreach (Enumeration enumeration in Enumeration.GetAll(objectType))
        {
            if (enumeration.Value == value)
            {
                return enumeration;
            }
        }

        throw new Exception(string.Format("Value not found in enumeration. Type:{0} Value:{1}", objectType, value));
    }

    public override bool CanConvert(Type objectType)
    {
        return objectType.IsSubclassOf(typeof(Enumeration));
    }
}

During deserialization I use the GetAll method from the Enumeration class to retrieve all the enumerated items for the specific type and try to match their values to the value being deserialized. With a few tests we can easily prove that we get the expected results:

[TestFixture]
public class EnumerationConverterTests
{
    [Test]
    public void Should_serialize_Enumeration_to_simplified_json()
    {
        var brush = new Brush {Color = ColorEnum.Black};
        string json = JsonConvert.SerializeObject(brush, new EnumerationConverter());
        Assert.AreEqual(@"{""Color"":1}", json);
    }

    [Test]
    public void Should_serialize_null_Enumeration()
    {
        var brush = new Brush();
        string json = JsonConvert.SerializeObject(brush, new EnumerationConverter());
        Assert.AreEqual(@"{""Color"":null}", json);
    }

    [Test]
    public void Should_deserialize_Enumeration()
    {
        string json = @"{""Color"":1}";
        var deserializeObject = JsonConvert.DeserializeObject<Brush>(json, new EnumerationConverter());
        Assert.AreEqual(ColorEnum.Black, deserializeObject.Color);
    }

    [Test]
    public void Should_deserialize_null_Enumeration()
    {
        string json = @"{""Color"":null}";
        var deserializeObject = JsonConvert.DeserializeObject<Brush>(json, new EnumerationConverter());
        Assert.IsNull(deserializeObject.Color);
    }

    public class Brush
    {
        public ColorEnum Color { get; set; } 
    }

    public class ColorEnum : Enumeration
    {
        public static readonly ColorEnum Black = new ColorEnum(1,"Black");

        public static readonly ColorEnum White = new ColorEnum(2,"White");
    
        [JsonConstructor]
        private ColorEnum(int value, string displayName) : base(value, displayName)
        {
        }
    }
}
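
If you don’t want to pass the converter on every call, Json.NET can also pick it up automatically. A minimal sketch, assuming you are on Json.NET 5.0 or later (where the JsonConvert.DefaultSettings hook exists):

// Register the converter once, e.g. at application startup.
JsonConvert.DefaultSettings = () => new JsonSerializerSettings
{
    Converters = { new EnumerationConverter() }
};

// Plain calls now use the converter without passing it explicitly.
string json = JsonConvert.SerializeObject(new Brush { Color = ColorEnum.Black });
// json == {"Color":1}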

I hope this was useful information. See you next time.

Web Api - Routing in Depth

This week I had to use dotPeek on the Web API to understand how the framework selects the controller method (action) that gets executed. The process is not complicated, but I thought it might help other people if I documented my findings.

There's also a post on the asp.net website that talks about Web API routing, check it out here.

The diagram below tries to describe the decision process behind selecting the controller method that will get executed. I've added a little extra documentation through the annotations on the diagram.

1 Action Based

If the route defines the action parameter then the method selection will be based on the value of that parameter. The matching is based on the method name or the method alias (a method in the controller may be aliased using the ActionName attribute).
All the matching methods are then filtered by verb. The filtering is done by attributes applied to the method (such as HttpGet): only the methods that match the verb of the incoming request will be returned. Methods with no verb attributes are also considered valid, since there is nothing to filter them on.
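
As a hypothetical illustration of the action-based flow (this controller and route are made up for the example), assume a route template of "api/{controller}/{action}/{id}":

public class BooksController : ApiController
{
    // A GET to api/books/list resolves here: "list" matches the ActionName alias,
    // and HttpGet keeps the method out of the candidate set for other verbs.
    [HttpGet]
    [ActionName("list")]
    public IEnumerable<string> GetAllBooks()
    {
        return new[] { "one", "two" };
    }
}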

2 Verb Based

When the method selection is verb based we need to get the HTTP verb used in the request. The controller methods are then filtered in two ways (a small example follows the list):

  • Get all the methods annotated with http verb attributes matching the http method in the request (attributes such as HttpGet or HttpPost);
  • Get all the methods that have the prefix matching the http method in the request (PostBooks matches the Post verb);
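
A hypothetical controller (made up for this example) showing both filters side by side:

public class BooksController : ApiController
{
    // Selected for GET requests because of the method name prefix.
    public IEnumerable<string> GetBooks()
    {
        return new[] { "one", "two" };
    }

    // The name doesn't start with a verb, so the HttpPost attribute does the matching.
    [HttpPost]
    public void AddBook(string title)
    {
    }
}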

3 Check Results Found

All the candidate methods returned by either the verb-based or the action-based approach are then analyzed. If no candidates were found, an exception is thrown. If multiple methods were found, it’s necessary to analyze their parameters to find the best possible fit.

4 Find Route by Parameters

The parameters from the route and the parameters from the query string are compared to the methods’ parameters. If no parameters are found in the query string or in the route, all methods without parameters will be selected. If multiple methods match the parameters found, the ones that take the most parameters win.
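
As a quick hypothetical illustration of that rule, assuming the default api/{controller}/{id} route with an optional id:

public class ValuesController : ApiController
{
    // GET api/values: no route or query string parameters, so the
    // parameterless overload is chosen.
    public IEnumerable<string> Get()
    {
        return new[] { "one", "two" };
    }

    // GET api/values/5: the route supplies "id", so this overload wins.
    public string Get(int id)
    {
        return "value " + id;
    }
}

Here is a diagram that details this specific part of the process: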

 

5 Selection Filters

The last filter executed on the candidate methods makes sure that any method marked with the NonAction attribute is excluded.
 

I hope these diagrams were clear enough to help you better understand the process of selecting the controller method that gets executed for a request.

Web Api - Testing with HttpClient

The Web API is a project that has been on my radar since I first saw Glenn Block’s presentation at MIX last year. It was only recently, though, when we decided to really take on REST at work, that I started looking into it in depth.

One aspect that caught my attention was the HttpClient. I’ve been using RestSharp for a while and I have to say that I am really happy with it. Still, I wanted to try the new HttpClient and see how it compared to RestSharp.

So let’s say I want to call a super duper service that returns a Guid. Here is a VERY simple example of how I would implement a client using RestSharp:

public class GuidClient
{
    private readonly IRestClient _client;

    public GuidClient(IRestClient client)
    {
        _client = client;
    }

    public string Execute()
    {
        var request = new RestRequest();

        RestResponse response = _client.Execute(request);

        if (response.StatusCode != HttpStatusCode.OK)
        {
            throw new Exception("Invalid response");
        }
        
        return response.Content;
    }
}

As I said, it's a very simple example. The important part here is that if I get a valid response I want to return the content (which should be the GUID), otherwise I want to throw an Exception. Here are the tests:

[TestFixture]
class GuidClientTest
{
    [Test]
    public void Throws_exception_if_response_not_OK()
    {
        var mock = new Mock<IRestClient>();
        mock.Setup(x => x.Execute(It.IsAny<IRestRequest>()))
            .Returns(new RestResponse {StatusCode = HttpStatusCode.BadRequest});

        var client = new GuidClient(mock.Object);
        Assert.Throws<Exception>(() => client.Execute());
    }

    [Test]
    public void Returns_content_if_response_is_OK()
    {
        string content = Guid.NewGuid().ToString();
        var mock = new Mock<IRestClient>();
        mock.Setup(x => x.Execute(It.IsAny<IRestRequest>()))
            .Returns(new RestResponse
                         {
                             StatusCode = HttpStatusCode.OK,
                             Content = content
                         });

        var client = new GuidClient(mock.Object);
        var result = client.Execute();
        Assert.AreEqual(content, result);
    }
}

It’s easy to see how simple the RestSharp abstractions make testing: mock the IRestClient to return the desired RestResponse and you’re good to go.

When creating my first client using the HttpClient I wanted to pass the HttpClient as a constructor parameter, the same way I did with IRestClient. That’s when I noticed that the HttpClient doesn’t implement any interfaces other than IDisposable. Hmm, no interfaces? So how can I mock this thing? Good thing Glenn Block was ready to help on Twitter:

“@gblock: @perezgb @howard_dierking with web api you can pass a fake message handler to the client to test.”

Glenn also sent me the code from one of his talks where he creates a fake handler to help with testing the HttpClient. Taking his code, I created an HttpClient version of my client and tests:

public class GuidHttpClient
{
    private readonly HttpClient _client;

    public GuidHttpClient(HttpClient client)
    {
        _client = client;
    }

    public string Execute()
    {
        var request = new HttpRequestMessage { RequestUri = new Uri("http://localhost/guidservice") };
        Task<HttpResponseMessage> task = _client.SendAsync(request);
        HttpResponseMessage response = task.Result;
        if (response.StatusCode != HttpStatusCode.OK)
        {
            throw new Exception("Invalid response");
        }
        return response.Content.ReadAsStringAsync().Result;
    }
}

And here are the tests:

[TestFixture]
public class GuidHttpClientTest
{
    [Test]
    public void Throws_exception_if_response_not_OK()
    {
        var response = new HttpResponseMessage(HttpStatusCode.BadRequest);
        var httpClient = new HttpClient(new FakeHandler
                                            {
                                                Response = response,
                                                InnerHandler = new HttpClientHandler()
                                            });

        var client = new GuidHttpClient(httpClient);
        Assert.Throws<Exception>(() => client.Execute());
    }

    [Test]
    public void Returns_content_if_response_is_OK()
    {
        string content = Guid.NewGuid().ToString();
        var response = new HttpResponseMessage(HttpStatusCode.OK);
        response.Content = new StringContent(content);

        var httpClient = new HttpClient(new FakeHandler
        {
            Response = response,
            InnerHandler = new HttpClientHandler()
        });

        var client = new GuidHttpClient(httpClient);
        string result = client.Execute();
        Assert.AreEqual(content, result);
    }
}

And this is the fake message handler:

public class FakeHandler : DelegatingHandler
{
    public HttpResponseMessage Response { get; set; }

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
                                                           CancellationToken cancellationToken)
    {
        if (Response == null)
        {
            return base.SendAsync(request, cancellationToken);
        }

        return Task.Factory.StartNew(() => Response);
    }
}

OK, so mission accomplished! I was able to write my client and tests using the HttpClient. One thing I really like is the Russian doll model, kinda like the one you find in FubuMVC, that the DelegatingHandler makes possible (sketched below). On the other hand, I still like my RestSharp tests better; they seem cleaner, but maybe that’s just me. It’s the way I’ve been writing code for a while and I feel comfortable with it. Sometimes, though, we have to push ourselves out of our comfort zone and try different things, right?
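
To show what I mean by the Russian doll model, here is a minimal sketch. The LoggingHandler is hypothetical (just another DelegatingHandler you might write); the FakeHandler is the one from above. Each handler sees the request on the way in and the response on the way out, then delegates to whatever it wraps through InnerHandler:

public class LoggingHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
                                                            CancellationToken cancellationToken)
    {
        Console.WriteLine("Sending: " + request.RequestUri);
        return base.SendAsync(request, cancellationToken)
                   .ContinueWith(task =>
                   {
                       Console.WriteLine("Received: " + task.Result.StatusCode);
                       return task.Result;
                   });
    }
}

// Handlers nest like Russian dolls: outermost handler first, HttpClientHandler at the core.
var httpClient = new HttpClient(new LoggingHandler
{
    InnerHandler = new FakeHandler
    {
        Response = new HttpResponseMessage(HttpStatusCode.OK),
        InnerHandler = new HttpClientHandler()
    }
});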

Please take all this with a grain of salt, as I don’t consider myself to be any kind of expert in the ASP.NET Web API. Also, this is only my first stab at the HttpClient; I plan to keep going with my tests. I’ll let you know if I come across anything interesting ;-)