I have the following model that I'm storing in MongoDB:
    public class Person
    {
        public ObjectId Id { get; set; }

        public int PersonId { get; set; }

        public BsonDocument Resume { get; set; } // arbitrary JSON

        [BsonIgnore]
        public string FirstName { get; set; } // stored elsewhere, populated at runtime

        [BsonIgnore]
        public string LastName { get; set; } // ditto
    }
`Resume` is a `BsonDocument` where I store arbitrary JSON that cannot be standardized into a POCO (each occurrence is vastly different).
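For context, here's roughly how a resume payload ends up in that field (the document contents and the `PersonId` value are made up for illustration):

    // Illustrative only: the JSON shape varies wildly from person to person.
    var person = new Person
    {
        Id = ObjectId.GenerateNewId(),
        PersonId = 42, // made-up value
        Resume = BsonDocument.Parse(@"{ ""skills"": [""C#"", ""MongoDB""], ""years"": 7 }")
    };
    personCollection.Insert(person);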
I don't want to store the person's first and last names in MongoDB, since that information is already stored in a SQL database and I don't want to have to worry about syncing changes. So I've decorated those properties with `[BsonIgnore]`. When my app code retrieves the `Person` from MongoDB, it populates the `FirstName` and `LastName` properties before serializing it to JSON, like so:
    public ActionResult GetPerson(string id)
    {
        var query = new QueryDocument("_id", ObjectId.Parse(id));

        // personCollection is a MongoCollection<Person>
        var person = personCollection.FindOne(query);

        // pull the names from the SQL-backed cache
        var pc = personCache.GetPerson(person.PersonId);
        person.FirstName = pc.FirstName;
        person.LastName = pc.LastName;

        var settings = new JsonWriterSettings { OutputMode = JsonOutputMode.Strict };
        return Json(person.ToJson(settings), JsonRequestBehavior.AllowGet);
    }
The resulting JSON, however, is missing the `FirstName` and `LastName` nodes, apparently because those properties were decorated with `[BsonIgnore]`.
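For illustration, the response body comes back looking roughly like this (the `Resume` contents and IDs are made up):

    { "_id" : { "$oid" : "507f1f77bcf86cd799439011" }, "PersonId" : 42, "Resume" : { "skills" : ["C#", "MongoDB"], "years" : 7 } }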
Is there a way to tell the official MongoDB C# driver to ignore these properties when saving to MongoDB, but not ignore them when serializing to JSON?
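The only workaround I've come up with so far is a sketch like the following, and it feels clunky since I'd have to repeat it in every action: convert the `Person` to a `BsonDocument` myself and re-attach the names by hand.

    // Sketch of the workaround: [BsonIgnore] strips the names during the
    // conversion to BsonDocument, so add them back manually before rendering.
    var doc = person.ToBsonDocument();   // FirstName/LastName are omitted here
    doc["FirstName"] = person.FirstName; // re-attach by hand
    doc["LastName"] = person.LastName;
    var json = doc.ToJson(new JsonWriterSettings { OutputMode = JsonOutputMode.Strict });

I'd rather have the driver (or a second set of serialization settings) handle this for me.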