
I am trying to put together the results of some operations that generate seismic attributes and place them in a Dask DataFrame, using the compute command, but it raises the following error:

AttributeError: 'NoneType' object has no attribute 'array_wrap'

This is the code I am using:

import sys
import dask.array as da

sys.path.append('./d2geo/attributes')

from d2geo.attributes.CompleTrace import ComplexAttributes
from d2geo.attributes.SignalProcess import SignalProcess

complex_att = ComplexAttributes()
signal_process = SignalProcess()

def amplitude_arr(input_cube):
    # Wrap the raw amplitude cube as a dask array so it matches
    # the other attribute outputs.
    return da.from_array(input_cube)

# List of tuples with attribute name, the function 
# to run (with cube as input) and additional kwargs dict.
funcs = [
    ('Amplitude', amplitude_arr, {}),
    ('Envelope', complex_att.envelope, {}),
    ('Instantaneous Phase', complex_att.instantaneous_phase, {}),
    ('Instantaneous Frequency', complex_att.instantaneous_frequency, {}),
    ('Instantaneous Bandwidth', complex_att.instantaneous_bandwidth, {}),
    ('Dominant Frequency', complex_att.dominant_frequency, {}),
    ('Cosine Instantaneous Phase', complex_att.cosine_instantaneous_phase, {}),
    ('Second Derivative', signal_process.second_derivative, {}),
    ('Reflection Intensity', signal_process.reflection_intensity, {})
]

# run_attributes and cube are defined elsewhere in the script (not shown here).
dataframe = run_attributes(cube, funcs).compute()
dataframe.tail()
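run_attributes itself is not shown in the question, so the following is only a hypothetical sketch of how such a helper could collect the attribute outputs into a dask DataFrame. It assumes every function in funcs returns an array with the same shape as cube; the name run_attributes_sketch and the None guard are illustrative, not the actual d2geo code.

import dask.array as da
import dask.dataframe as dd

def run_attributes_sketch(cube, funcs):
    # Apply each attribute function and keep the flattened result as one column.
    columns = {}
    for name, func, kwargs in funcs:
        result = func(cube, **kwargs)
        if result is None:
            # A function that forgets to return its result would otherwise
            # surface later as a confusing 'NoneType' attribute error.
            raise ValueError(f"{name} returned None")
        columns[name] = da.asarray(result).reshape(-1)
    # Stack the 1-D columns side by side and expose them as a dask DataFrame.
    stacked = da.stack(list(columns.values()), axis=1)
    return dd.from_dask_array(stacked, columns=list(columns.keys()))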

1 Answer


This doesn't seem to be a Dask problem as such. I believe that somewhere in the process of applying these functions you end up with a variable that is None, and the code then tries to access the attribute array_wrap on it, which is what raises the error. For a more detailed explanation, see Why do I get AttributeError: 'NoneType' object has no attribute 'something'?
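As an illustration only (the function below is made up, not part of d2geo or Dask), this is the usual way such an error appears: a function computes a result but never returns it, so the caller silently receives None and any later attribute access fails.

import numpy as np

def envelope_like(cube):
    np.abs(cube)  # result is computed but never returned,
                  # so the function implicitly returns None

result = envelope_like(np.ones((10, 10, 10)))
result.reshape(-1)  # AttributeError: 'NoneType' object has no attribute 'reshape'

The fix is to find which step produces the None (for example by checking the type of each intermediate result) rather than anything Dask-specific.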

– ncclementi (edited by Dharman)
  • I just ran the program again with some modifications, and now it gives me the following error: "MemoryError: Unable to allocate 467. MiB for an array with shape (213, 213, 1349) and data type float64", which keeps coming from the line of code that assigns dataframe. What could be causing this? – Yosmely Bermùdez Sep 10 '21 at 15:54
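For reference, the 467 MiB in that message is exactly the size of one float64 array of that shape, which suggests a single full-size copy of the cube is being materialized at that point and exceeds the memory available. The arithmetic, and the halved footprint of float32 purely as an illustration, can be checked like this:

import numpy as np

shape = (213, 213, 1349)
n_values = np.prod(shape)    # 61,202,781 elements

MIB = 2**20
print(n_values * 8 / MIB)    # float64: ~466.9 MiB, the size in the error
print(n_values * 4 / MIB)    # float32: ~233.5 MiB, half the memory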