
My script produces data in the following format:

dictionary = {
 (.. 42 values: None, 1 or 2 ..): {
  0: 0.4356, # ints as keys, floats as values
  1: 0.2355,
  2: 0.4352,
  ...
  6: 0.6794
 },
 ...
}

where:

  • (.. 42 values: None, 1 or 2 ..) is a game state
  • the inner dict stores the calculated values of the actions that are possible in that state

The problem is that the state space is very big (millions of states), so the whole data structure cannot be stored in memory. That's why I'm looking for a database engine that fits my needs and that I can use with Python. I need to get the list of actions and their values for a given state (the previously mentioned tuple of 42 values), and to modify the value of a given action in a given state.

Luke
  • It looks like your dictionary is one level deep. Any key-value store or an SQL database would fit your problem. – 9000 Dec 28 '15 at 01:01

4 Answers


You can use a key-value cache solution. A good one is Redis. It's very fast and simple, written in C, and it's more than just a key-value cache. Integrating it with Python takes just a few lines of code. Redis can also be scaled very easily for really big data. I have worked in the game industry and understand what I am talking about.
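For illustration, a minimal sketch with the redis-py client, assuming a Redis server on localhost (the key scheme and helper names are my own, not part of Redis itself):

import redis

# Connect to a local Redis server, assumed to be running on the default port.
r = redis.Redis(host='localhost', port=6379, db=0)

def state_key(state):
    # Serialize the 42-value state tuple (None, 1 or 2) into a string key.
    return 'state:' + ','.join('N' if v is None else str(v) for v in state)

def set_action_value(state, action, value):
    # Store or update one action's value in the hash for this state.
    r.hset(state_key(state), action, value)

def get_actions(state):
    # Return {action: value} for every action stored under this state.
    return {int(k): float(v) for k, v in r.hgetall(state_key(state)).items()}

Each state maps to one Redis hash, so both operations the question asks for (read all actions of a state, update one action's value) are single commands.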

Also, as already mentioned here, you can use a more complex solution: not a cache, but the PostgreSQL database. It now supports a binary JSON field type, JSONB. I think the best Python database ORM is SQLAlchemy, which supports PostgreSQL out of the box; I will use it in the code below. For example, say you have this table:

from sqlalchemy.dialects.postgresql import JSONB

class MobTable(db.Model):
    __tablename__ = 'mobs'

    id = db.Column(db.Integer, primary_key=True)
    stats = db.Column(JSONB, index=True, default={})

If you have a mob with JSON stats like these:

{
    "id": 1,
    "title": "UglyOrk",
    "resists": {"cold": 13}
}

you can search for all mobs whose cold resist is not null:

# Index into the JSONB document: stats #> '{resists,cold}'
expr = MobTable.stats[("resists", "cold")]
q = (session.query(MobTable.id, expr.label("cold_protected"))
     .filter(expr != None)
     .all())
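
Adapting this to the question's data is straightforward: one row per game state, with the action values in a JSONB column. A minimal sketch (the table, column names, and serialization scheme are my own, not from the question):

from sqlalchemy.dialects.postgresql import JSONB

class StateTable(db.Model):
    __tablename__ = 'states'

    id = db.Column(db.Integer, primary_key=True)
    # The 42-value tuple serialized to a string key, e.g. 'N,1,2,...'
    state = db.Column(db.String, unique=True, index=True)
    # Mapping of action -> calculated value; note JSON object keys become strings.
    actions = db.Column(JSONB, default={})

To modify one action's value, reassign the whole dict so SQLAlchemy notices the change (in-place mutation of a JSONB value is not tracked by default):

row = session.query(StateTable).filter_by(state=state_str).one()
row.actions = {**row.actions, '3': 0.5}
session.commit()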
fedorshishi
  • I want to store a Python dict in Redis, but I get new data every 15 minutes, so I also need to update Redis every 15 minutes. Can you help me achieve that? I mean, how can I dynamically assign a key every time I write data to Redis? When I use the same key, it overwrites the old data. If you can, please answer my question here on Stack Overflow: https://stackoverflow.com/questions/52712982/storing-data-into-redis-through-cron-job –  Oct 09 '18 at 13:02

Check out ZODB: http://www.zodb.org/en/latest/

It's a native object DB for Python that supports transactions, caching, pluggable layers, pack operations (for keeping history), and BLOBs.
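For illustration, a minimal sketch (the file name and key scheme are my own; the state tuple is serialized to a string because tuples mixing None and ints cannot be ordered as BTree keys in Python 3):

import ZODB, ZODB.FileStorage
import transaction
from BTrees.OOBTree import OOBTree

storage = ZODB.FileStorage.FileStorage('states.fs')  # on-disk storage file
db = ZODB.DB(storage)
conn = db.open()
root = conn.root()

if 'states' not in root:
    root['states'] = OOBTree()  # scales to millions of keys better than a dict

def state_key(state):
    # Serialize the 42-value tuple into an orderable string key.
    return ','.join('N' if v is None else str(v) for v in state)

state = (None, 1, 2) + (None,) * 39  # example 42-value state
# Reassign the whole inner dict after changes: plain dicts are not
# persistence-aware, so in-place mutation would go unnoticed.
root['states'][state_key(state)] = {0: 0.4356, 1: 0.2355}
transaction.commit()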

siefca

I recommend you use HDF5. It's a binary data format that works very well with Python (for example through the h5py and PyTables libraries), and storing the data in binary form greatly reduces its size. More importantly, it gives you random access, which I believe serves your purposes. Also, if you do not use any compression method, you will retrieve the data at the highest possible speed.
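For illustration, a minimal sketch with the h5py library (the file and key naming are my own; HDF5 stores arrays rather than dicts, so each state gets a small dataset of action values):

import h5py
import numpy as np

def state_key(state):
    # Serialize the 42-value tuple into a dataset name ('/' is reserved in HDF5).
    return ','.join('N' if v is None else str(v) for v in state)

with h5py.File('states.h5', 'a') as f:
    state = (None, 1, 2) + (None,) * 39  # example 42-value state
    key = state_key(state)
    if key not in f:
        f.create_dataset(key, data=np.zeros(7))  # 7 action values per state
    f[key][1] = 0.2355   # modify the value of action 1
    values = f[key][:]   # read all action values as a NumPy array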

Amir
  • Double indexing did not work for me, e.g. group[state][1]. I also can't assign a dict, e.g. group[state] = { 0: 0.43, .., 6: 0.65 }. – Luke Dec 28 '15 at 08:08

You can also store it as JSONB in a PostgreSQL database.

To connect to PostgreSQL you can use psycopg2, which is compliant with the Python Database API Specification v2.0.
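A minimal sketch (the connection parameters and table name are assumptions; the upsert needs PostgreSQL 9.5+):

import psycopg2
from psycopg2.extras import Json

conn = psycopg2.connect("dbname=game user=postgres")  # adjust the DSN to your setup
cur = conn.cursor()

cur.execute("""
    CREATE TABLE IF NOT EXISTS states (
        state   TEXT PRIMARY KEY,  -- serialized 42-value tuple
        actions JSONB NOT NULL     -- mapping of action -> value
    )
""")

state = ','.join('N' if v is None else str(v) for v in (None, 1, 2) + (None,) * 39)
cur.execute(
    "INSERT INTO states (state, actions) VALUES (%s, %s) "
    "ON CONFLICT (state) DO UPDATE SET actions = EXCLUDED.actions",
    (state, Json({0: 0.4356, 1: 0.2355})),
)
conn.commit()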

Marqin