I know it's probably not enough to be worried about, but how performant is the DBNull.Value.Equals() check?
public IEnumerable<dynamic> Query(string sql, params object[] args)
{
    using (var conn = OpenConnection())
    {
        var rdr = CreateCommand(sql, conn, args).ExecuteReader(CommandBehavior.CloseConnection);
        while (rdr.Read())
        {
            var e = new ExpandoObject();
            var d = e as IDictionary<string, object>;
            for (var i = 0; i < rdr.FieldCount; i++)
                d.Add(rdr.GetName(i), DBNull.Value.Equals(rdr[i]) ? null : rdr[i]);
            yield return e;
        }
    }
}
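As a side note, that line also calls the `rdr[i]` indexer twice per field. A sketch of an equivalent loop that reads each field once and uses a reference comparison (valid because `DBNull.Value` is a singleton) instead of the virtual `Equals` call, reusing the question's `rdr` and `d` variables, illustrative only:

```csharp
for (var i = 0; i < rdr.FieldCount; i++)
{
    var value = rdr[i];  // single indexer call per field
    // Reference comparison against the DBNull singleton; rdr.IsDBNull(i)
    // would be another option before reading the value at all.
    d.Add(rdr.GetName(i), value == DBNull.Value ? null : value);
}
```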
in particular, this line:
d.Add(rdr.GetName(i), DBNull.Value.Equals(rdr[i]) ? null : rdr[i]);
versus the original code (from Rob Conery's Massive class):
d.Add(rdr.GetName(i), rdr[i]);
There's bound to be at least a small impact, and again it's probably not truly noticeable, but I'm curious. The reason for the conversion is that it's much easier to test for null in ASP.NET MVC views.
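For anyone who wants to measure rather than guess, a rough micro-benchmark sketch (a hypothetical harness, not from Massive) comparing the raw cost of the check against no check over a pre-built array standing in for reader values; in practice any difference will be dwarfed by the database round trip and reader overhead:

```csharp
using System;
using System.Diagnostics;

class DbNullBench
{
    static void Main()
    {
        // Simulated column values: every tenth one is DBNull.
        var values = new object[1_000_000];
        for (var i = 0; i < values.Length; i++)
            values[i] = (i % 10 == 0) ? (object)DBNull.Value : i;

        object sink = null;

        var sw = Stopwatch.StartNew();
        foreach (var v in values)
            sink = DBNull.Value.Equals(v) ? null : v;  // checked path
        sw.Stop();
        Console.WriteLine($"With check:    {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        foreach (var v in values)
            sink = v;                                  // unchecked path
        sw.Stop();
        Console.WriteLine($"Without check: {sw.ElapsedMilliseconds} ms");

        GC.KeepAlive(sink);  // keep the JIT from eliding the loops
    }
}
```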