
I need to query data in MongoDB, and my JSON files look like the examples below. The problem is that the timestamp key changes every five minutes, so MongoDB treats the files as having different keys: for example, "2018-01-02T00:00:00+09:00" and "2018-01-02T00:05:00+09:00" are distinct keys.

Ultimately, how would I query for documents where T1 is below 280?

I have taken a look at `$elemMatch`, but it works on arrays, not sub-objects. I am new to the MongoDB world, so I am sorry if this is a silly question, but I could not find the answer anywhere.

File 1

{
  "2018-01-02T00:00:00+09:00": {
    "141474": {
      "T1": 276.5029,
      "T2": 279.3629
    },
    "141475": {
      "T1": 280.1534,
      "T2": 279.7219
    }
  }
}

File 2

{
  "2018-01-02T12:00:00+09:00": {
    "141474": {
      "T1": 275.1324,
      "T2": 276.9986
    },
    "141475": {
      "T1": 267.3324,
      "T2": 250.6574
    },
  }
}
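For reference, here is a minimal Python sketch of the restructuring approach suggested in the comments: flatten each raw file into documents with fixed field names (the field names `date` and `sensor` and the collection name `readings` are my own assumptions), so that `T1 < 280` becomes an ordinary query instead of a question about dynamic keys.

```python
import json

def flatten(raw):
    """Turn {timestamp: {sensor_id: {T1, T2}}} into flat documents
    with fixed field names, one per (timestamp, sensor) pair."""
    docs = []
    for ts, sensors in raw.items():
        for sensor_id, temps in sensors.items():
            docs.append({"date": ts, "sensor": sensor_id, **temps})
    return docs

# Sample data in the same shape as File 1 above.
raw = json.loads("""
{
  "2018-01-02T00:00:00+09:00": {
    "141474": {"T1": 276.5029, "T2": 279.3629},
    "141475": {"T1": 280.1534, "T2": 279.7219}
  }
}
""")

docs = flatten(raw)
# Each doc now looks like:
# {"date": "2018-01-02T00:00:00+09:00", "sensor": "141474",
#  "T1": 276.5029, "T2": 279.3629}
```

With pymongo, this pays off at query time: after `db.readings.insert_many(docs)`, the question in the post becomes simply `db.readings.find({"T1": {"$lt": 280}})`, and `T1` can be indexed.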
Cleyson Shingu
  • I take it that it's not possible to restructure your schema so that you have a fixed key name? Like `{ date: "2018-01-02T12:00:00+09:00", data: {...} }` – BenSower Jan 04 '19 at 10:20
  • Hey Ben, I thought about it; I just didn't want to change it. The situation is that I am pulling the data through an API, and ideally I would use the raw data as it is; otherwise I will need to change the structure for every pull I do in the future. – Cleyson Shingu Jan 05 '19 at 03:32
  • I asked a colleague for help, and it seems that changing the schema is the only solution. He said that I could loop over every document instead, but that would kill my performance. – Cleyson Shingu Jan 05 '19 at 03:36
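One alternative worth noting if the raw shape has to stay: since MongoDB 3.4, the `$objectToArray` aggregation operator can turn dynamic keys into `k`/`v` pairs on the fly, so they can be unwound and matched like ordinary fields. Below is a hedged sketch of such a pipeline (the collection name `readings` is an assumption, and the pipeline has not been run against a live server); it trades a full collection scan for keeping the raw documents unchanged.

```python
# Pipeline for documents shaped {timestamp: {sensor_id: {"T1": ..., "T2": ...}}}.
# $objectToArray converts an object's dynamic keys into [{"k": ..., "v": ...}]
# so $unwind and $match can treat them as regular values.
pipeline = [
    {"$project": {"dates": {"$objectToArray": "$$ROOT"}}},
    {"$unwind": "$dates"},
    {"$match": {"dates.k": {"$ne": "_id"}}},          # skip the _id entry
    {"$project": {"date": "$dates.k",
                  "sensors": {"$objectToArray": "$dates.v"}}},
    {"$unwind": "$sensors"},
    {"$match": {"sensors.v.T1": {"$lt": 280}}},       # T1 below 280
    {"$project": {"_id": 0, "date": 1,
                  "sensor": "$sensors.k",
                  "T1": "$sensors.v.T1",
                  "T2": "$sensors.v.T2"}},
]
# With pymongo: results = db.readings.aggregate(pipeline)
```

Each result would then carry an explicit `date`, `sensor`, `T1`, and `T2`. Because no index can be used on the dynamic keys, the restructuring approach from the comments is still the better long-term fix; this is mainly useful as a one-off query or as the transformation step of a migration.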

0 Answers