Scenario:
- I'm trying to build a real-time monitoring webpage for ship operations
- I have 1,000 - 10,000 ships operating
- All ships send real-time data to the DB 24 hours a day, and the data is kept for 30 days
- Each new insert is a single row of ~100 columns
- When the webpage loads, all historic data for the chosen ship is fetched and visualized
- After that, the last row of the ship's real-time table is polled and fetched by the webpage to update the live screen (both queries are sketched in SQL right after this list)
- Each ship also has its own non-real-time data, such as ship dimensions, cargo, attendants, etc.
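To make the two access patterns concrete, these are roughly the queries I have in mind for one ship's real-time table (realtime_table and its time column are placeholder names):

    -- on page load: fetch the ship's full history for visualization
    SELECT * FROM realtime_table ORDER BY time;

    -- afterwards: poll only the newest row to refresh the live screen
    SELECT * FROM realtime_table ORDER BY time DESC LIMIT 1;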
So far I've been thinking about creating a new schema for each ship, something like this (a rough SQL version follows the sketch):
public_schema
ship1_schema
ship2_schema
ship3_schema
    |--- realtime_table
    |--- cargo_table
    |--- dimensions_table
    |--- attendants_table
ship4_schema
ship5_schema
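In SQL, what I'm picturing per ship is roughly this (the sensor columns are just placeholders I made up):

    CREATE SCHEMA ship3_schema;

    -- real-time stream: one row per message, ~100 columns in reality
    CREATE TABLE ship3_schema.realtime_table (
        time     timestamptz NOT NULL,
        speed    double precision,
        heading  double precision
        -- ... remaining sensor columns
    );

    -- non-real-time data, e.g. ship dimensions
    CREATE TABLE ship3_schema.dimensions_table (
        length_m numeric,
        beam_m   numeric
    );

with the same set of tables repeated in ship1_schema, ship2_schema, and so on.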
Is this a good way to store each individual ship's real-time data and fetch it from a web server? What other ways would you recommend?
On the time-series side, I'm already using the PostgreSQL extension TimescaleDB. My question is rather about how to store the time-series data when there are this many ships: is it a good idea to separate each ship's real-time (RT) data by creating a new schema per ship?
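With TimescaleDB, my understanding is that each real-time table would become a hypertable partitioned by time, and a retention policy would handle the 30-day window; roughly:

    -- make the per-ship real-time table a TimescaleDB hypertable
    SELECT create_hypertable('ship3_schema.realtime_table', 'time');

    -- keep only the last 30 days of data
    SELECT add_retention_policy('ship3_schema.realtime_table', INTERVAL '30 days');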
++ I'm pretty new to PostgreSQL, and some of the advice I got from other people was too advanced for me... I would greatly appreciate it if you could suggest a method and briefly explain what it is.