I have 12 classes that look like the one below; the only difference between them is which global OrderedDict they use. I have 12 of those globals, such as DefaultAssetOrderedDict, NewAssetsOrderedDict, OldAssetsOrderedDict, etc. I didn't make them part of each class because they are large and I suspect doing so would result in multiple instances of something that is intended to be static (correct me if I'm wrong; I've had a lot of memory issues, and switched from OrderedDict data rows to class data rows to fix them).
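
For reference, each of those globals is just a module-level OrderedDict mapping field names to (string) default values; something like this made-up example (the field names are only for illustration, and the imports are included so the class below runs):

import re
from collections import OrderedDict

# Hypothetical module-level table of field names and their string defaults.
DefaultAssetOrderedDict = OrderedDict([
    ('name', ''),
    ('serial', ''),
    ('location', 'unknown'),
])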

class DefaultAsset(object):
    __slots__ = list(DefaultAssetOrderedDict.keys())

    def __init__(self, **kwargs):
        # Strip any non-ASCII characters from each value, falling back to the defaults.
        for arg, default in DefaultAssetOrderedDict.items():
            setattr(self, arg, re.sub(r'[^\x00-\x7F]', '', kwargs.get(arg, default)))

    def items(self):
        for slot in self.__slots__:
            yield slot, getattr(self, slot)

    def values(self):
        for slot in self.__slots__:
            yield getattr(self, slot)
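
Constructed with the hypothetical defaults above, a row strips non-ASCII characters from whatever is passed in and falls back to the defaults for any missing fields:

row = DefaultAsset(name='Café asset')
print(dict(row.items()))   # {'name': 'Caf asset', 'serial': '', 'location': 'unknown'}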

So, I want to know how I can rewrite the class above so that it becomes a parent class called Rows and I can do something like this:

class DefaultAssets(Rows, DefaultAssetOrderedDict):
    # does the same thing, but driven by whichever OrderedDict is given as the second argument

or maybe:

DefaultAssets = Rows(DefaultAssetOrderedDict)
NewAssets = Rows(NewAssetOrderedDict)
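
If it helps clarify what I'm after, here's my rough, untested sketch of the kind of factory I'm imagining (the name Rows and the closure approach are just my guesses at how it might look):

def Rows(defaults):
    """Build a row class whose __slots__ and default values come from `defaults`."""
    class Row(object):
        __slots__ = list(defaults.keys())

        def __init__(self, **kwargs):
            for arg, default in defaults.items():
                setattr(self, arg, re.sub(r'[^\x00-\x7F]', '', kwargs.get(arg, default)))

        def items(self):
            for slot in self.__slots__:
                yield slot, getattr(self, slot)

        def values(self):
            for slot in self.__slots__:
                yield getattr(self, slot)

    return Row
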
  • Effectively, I just realized I'm trying to implement an OrderedDict with better memory utilization by using __slots__, which I think prevents me from adding new attributes to instances (which I don't need, because I can define them all in advance). The overhead of the OrderedDict is removed by making it a global variable that mimics its function inside the class, so effectively I'm trading speed for RAM... does this make sense? Am I a lunatic, a genius, or both, lol. – gunslingor Dec 11 '17 at 22:34
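
For what it's worth, a rough way to sanity-check that RAM tradeoff is to compare per-instance sizes directly (a small sketch with made-up field names; exact numbers vary by Python version):

import sys

class DictRow(object):
    def __init__(self):
        self.apple, self.banana, self.orange, self.pear = 4, 3, 2, 1

class SlotRow(object):
    __slots__ = ('apple', 'banana', 'orange', 'pear')
    def __init__(self):
        self.apple, self.banana, self.orange, self.pear = 4, 3, 2, 1

d, s = DictRow(), SlotRow()
print(sys.getsizeof(d) + sys.getsizeof(d.__dict__))  # dict-backed instance
print(sys.getsizeof(s))                              # slotted instance (no __dict__)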

1 Answer

If I've understood your main goal properly, it sounds to me like you could use a metaclass to turn your 12 OrderedDict instances into separate classes, conserving memory by eliminating (or at least minimizing) duplicate code and data. Here's one way to do that:

from collections import OrderedDict

class MetaDefaultAsset(type):

    def __new__(cls, name, bases, namespace, **kwargs):
        clsobj = type.__new__(cls, name, bases, namespace) # create class object

        # Use "defaults" keyword argument to create __slots__ and default
        # attributes and their values.
        if 'defaults' in kwargs:
            setattr(clsobj, '__slots__', list(kwargs['defaults'].keys()))
            for key, default_value in kwargs['defaults'].items():
                setattr(clsobj, key, default_value)

        # Define some methods to be added to class object created.
        def items(self):
            yield from ((slot, getattr(self, slot)) for slot in self.__slots__)

        def values(self):
            yield from (getattr(self, slot) for slot in self.__slots__)

        # Add the above methods to the class object.
        for name, method in {'items': items, 'values': values}.items():
            setattr(clsobj, name, method)

        return clsobj


DEFAULT_ASSET_ORDERED_DICT = OrderedDict(
    [('apple', 4), ('banana', 3), ('orange', 2), ('pear', 1)])

class DefaultAsset(metaclass=MetaDefaultAsset,
                   defaults=DEFAULT_ASSET_ORDERED_DICT): pass

NEW_ASSETS_ORDERED_DICT = OrderedDict(
    [('computer', 1), ('monitor', 2), ('keyboard', 3), ('mouse', 4)])

class DefaultNewAsset(metaclass=MetaDefaultAsset,
                      defaults=NEW_ASSETS_ORDERED_DICT): pass


da = DefaultAsset()
print(list(da.items()))
dna = DefaultNewAsset()
print(list(dna.items()))

Output:

[('apple', 4), ('banana', 3), ('orange', 2), ('pear', 1)]
[('computer', 1), ('monitor', 2), ('keyboard', 3), ('mouse', 4)]
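
Because __slots__ is attached to the class after it has been created here, it mainly acts as the ordered list of field names that items() and values() walk; instances still get a __dict__, so assigning a per-instance value simply shadows the class-level default:

da.apple = 10
print(list(da.items()))   # [('apple', 10), ('banana', 3), ('orange', 2), ('pear', 1)]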