
I am using C#'s `dynamic` functionality to deserialize JSON. It works perfectly for small files. However, when the data file becomes large (I am testing with a 500 MB file), the deserializer throws an out-of-memory exception like the one below.

>Message: Exception of type 'System.OutOfMemoryException' was thrown.
>Source: mscorlib
>StackTrace:
   at System.String.CtorCharArrayStartLength(Char[] value, Int32 startIndex, Int32 length)
   at Newtonsoft.Json.Utilities.StringReference.ToString()
   at Newtonsoft.Json.JsonTextReader.ParseReadNumber(ReadType readType, Char firstChar, Int32 initialPosition)
   at Newtonsoft.Json.JsonTextReader.ParseNumber(ReadType readType)
   at Newtonsoft.Json.JsonTextReader.ParseValue()
   at Newtonsoft.Json.JsonTextReader.Read()
   at Newtonsoft.Json.JsonWriter.WriteToken(JsonReader reader, Boolean writeChildren, Boolean writeDateConstructorAsDate, Boolean writeComments)
   at Newtonsoft.Json.Linq.JTokenWriter.WriteToken(JsonReader reader, Boolean writeChildren, Boolean writeDateConstructorAsDate, Boolean writeComments)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateJObject(JsonReader reader)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateObject(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.CreateValueInternal(JsonReader reader, Type objectType, JsonContract contract, JsonProperty member, JsonContainerContract containerContract, JsonProperty containerMember, Object existingValue)
   at Newtonsoft.Json.Serialization.JsonSerializerInternalReader.Deserialize(JsonReader reader, Type objectType, Boolean checkAdditionalContent)
   at Newtonsoft.Json.JsonSerializer.DeserializeInternal(JsonReader reader, Type objectType)
   at Newtonsoft.Json.JsonConvert.DeserializeObject(String value, Type type, JsonSerializerSettings settings)
   at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value, JsonSerializerSettings settings)
   at Newtonsoft.Json.JsonConvert.DeserializeObject[T](String value)
   at ShapeFile.Program.Main(String[] args) in C:\Users\HP\source\repos\ShapeFile\Program.cs:line 21

In Task Manager, I can see that memory consumption grows very quickly until the full 16 GB on my local PC is used. Any ideas why this happens?

Sample code:

string jsonFile = @"D:\3D Map\Road\Road_3.geojson";
var myJsonResponse = File.ReadAllText(jsonFile);
dynamic myDeserializedClass = JsonConvert.DeserializeObject<dynamic>(myJsonResponse);

I suspect the issue is caused by holding such a huge string in memory, but whatever the cause, I still need to read this JSON file dynamically instead of creating classes (which do work) to deserialize the JSON into .NET objects. The reason I don't want to create classes is that different JSON files have different specs.

If it is not possible to read the entire large JSON file dynamically, would it be possible to read only the Feature and Properties2 parts of the sample JSON file without creating classes for those two? The sample classes are shown below.

class Class1
{
    // Root myDeserializedClass = JsonConvert.DeserializeObject<Root>(myJsonResponse); 
    public class Properties
    {
        public string name { get; set; }
    }

    public class Crs
    {
        public string type { get; set; }
        public Properties properties { get; set; }
    }

    public class Properties2
    {
        public int ID_0 { get; set; }
        public string ISO { get; set; }
        public string NAME_0 { get; set; }
        public int ID_1 { get; set; }
        public string NAME_1 { get; set; }
        public int ID_2 { get; set; }
        public string NAME_2 { get; set; }
        public string TYPE_2 { get; set; }
        public string ENGTYPE_2 { get; set; }
        public object NL_NAME_2 { get; set; }
        public string VARNAME_2 { get; set; }
    }

    public class Geometry
    {
        public string type { get; set; }
        public List<dynamic> coordinates { get; set; }
        
    }

    public class Feature
    {
        public string type { get; set; }
        public int id { get; set; }
        public Properties2 properties { get; set; }
        public Geometry geometry { get; set; }
    }

    public class Root
    {
        public string type { get; set; }
        public string name { get; set; }
        public Crs crs { get; set; }
        public List<Feature> features { get; set; }
    }
}

Sample JSON file below:

{
    "type": "FeatureCollection",
    "name": "MYS_adm2",
    "crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
    "features": [
    { "type": "Feature", "id": 0, "properties": { "ID_0": 136 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 102.911849975585938, 1.763612031936702 ], [ 102.911430358886832, 1.763888001442069 ] ] ] } },
    { "type": "Feature", "id": 1, "properties": { "ID_0": 136 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 103.556556701660156, 1.455448031425533 ], [ 103.555900573730582, 1.455950021743831 ] ] ] ] } },
  • How much memory is consumed by just reading the text file? Perhaps instead of loading the string into memory, use a reader object? – Lasse V. Karlsen Nov 06 '20 at 09:01
  • You mean manually opening the JSON text file? Just reading it consumed almost 2 GB. Sorry, what do you mean by a reader object? Could you share an example, please? – xChaax Nov 06 '20 at 09:12
  • Open the file for reading, and pass it to the deserializer instead of the whole string, so you can try something like: `using (var reader = File.OpenText(@"path")) { Root root = (Root)new JsonSerializer().Deserialize(reader, typeof(Root)); }` Unfortunately, `JsonConvert` doesn't appear to have static deserialization methods that accept a reader, but use the JsonSerializer object methods instead. – Lasse V. Karlsen Nov 06 '20 at 09:15
  • Thanks, I see. However, my goal is to read the JSON file dynamically, so instead of using `Root` (which does work, as I said), is it possible to use the method you suggested with `dynamic`? – xChaax Nov 06 '20 at 13:54
  • I also tried `using (var reader = File.OpenText(filePath)) { dynamic root = (dynamic)new JsonSerializer().Deserialize(new JsonTextReader(reader)); }` and got the same exception. – xChaax Nov 06 '20 at 14:35

1 Answer


I assume that you want to read all the data from the features field, which contains a large set of nodes. Consider using Cinchoo ETL, an open-source library, to read such a large file.

using (var r = new ChoJSONReader("*** YOUR JSON FILE PATH ***")
    .WithJSONPath("$.features")
    )
{
    foreach (var rec in r)
        Console.WriteLine(rec.Dump());
}
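
Each `rec` yielded here is a dynamic record for one element of the features array, so no classes are needed.

Alternatively, if you want to stay with Newtonsoft.Json alone, the reader-based approach suggested in the comments can be taken one step further: walk the file with a `JsonTextReader` and materialize only one feature at a time as a dynamic `JObject`. The following is a minimal sketch, assuming the top-level layout of the sample GeoJSON above (a root object with a features array); the file path and the ID_0 property are taken from the question and are only illustrative.

using System;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class StreamFeatures
{
    static void Main()
    {
        // Path from the question; adjust as needed.
        string jsonFile = @"D:\3D Map\Road\Road_3.geojson";

        using (var file = File.OpenText(jsonFile))
        using (var reader = new JsonTextReader(file))
        {
            while (reader.Read())
            {
                // Look for the top-level "features" property of the root object.
                if (reader.TokenType == JsonToken.PropertyName
                    && reader.Depth == 1
                    && (string)reader.Value == "features")
                {
                    reader.Read(); // advance onto the StartArray token

                    // Each pass loads exactly one feature object as dynamic JSON;
                    // the rest of the file is streamed, not buffered.
                    while (reader.Read() && reader.TokenType == JsonToken.StartObject)
                    {
                        dynamic feature = JObject.Load(reader);
                        Console.WriteLine((string)feature.geometry.type);
                        Console.WriteLine((int?)feature.properties.ID_0);
                    }
                    break;
                }
            }
        }
    }
}

This keeps peak memory proportional to the largest single feature rather than to the whole file.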
– Cinchoo