5

When developing and debugging with the Python/IPython REPL, at some point I'd like to dump all the local variables of a function to the workspace to see what's going on. Suppose I have a function

def func():
    a = "blablabla"
    b = 1234
    c = some_calculation_of_a_and_b(a,b)
    dump_to_workspace(a,b,c)   # is this possible? or, even better:
    dump_all_local_variables_to_workspace()   # is this possible?

I hope to be able to run this in python/ipython:

>>> func()
>>> print a
"blablabla"
>>> print b
1234
>>> print c
some_calculated_value

I know two alternatives: (1) return the variables from the function [not good because I don't want to mess with the return value], and (2) save the data to a file on disk [not convenient because it involves disk I/O, possibly with a large amount of data]. But neither is as convenient most of the time. Is there a way to achieve the dumping directly?

Thanks a lot in advance!

Ying Xiong
  • A better approach would be to use `pdb.set_trace()` and step through. – vikramls Nov 09 '14 at 16:21
  • Hi @vikramls, this looks a promising direction. I should definitely look into and learn more about `pdb`. If you have a pointer to some good tutorial/reference, that'll also be great. Thanks! – Ying Xiong Nov 09 '14 at 16:39
  • I have added an answer to elaborate on the use of pdb. – vikramls Nov 09 '14 at 19:02

6 Answers

3

This is an ugly hack, but I use it very often in my Jupyter notebooks for dumping some local state to the global workspace for further inspection. Simply add these two lines at the end of your function and you'll have access to all of its local variables directly from the notebook:

import inspect
inspect.getmodule(next(frm[0] for frm in reversed(inspect.stack())
                       if frm[0].f_locals.get('__name__', None) == '__main__')).__dict__.update(locals())

What it does is traverse the stack in reverse order (using the inspect module) to find the outermost frame whose module is named '__main__'. That's the module representing the notebook (i.e. the current kernel). It then updates that module's global namespace, via its __dict__, with the function's local variables (obtained from locals()).

Here's a demo in a notebook: https://colab.research.google.com/drive/1lQgtmqigCUmzVhkX7H7azDWvT3CfxBtt

#%%
def some_calculation_of_a_and_b(a, b):
    return 'some_calculated_value'

def func():
    a = "blablabla"
    b = 1234
    c = some_calculation_of_a_and_b(a,b)

    #dump_all_local_variables_to_workspace():
    import inspect
    inspect.getmodule(next(frm[0] for frm in reversed(inspect.stack())
                      if frm[0].f_locals.get('__name__', None) == '__main__')).__dict__.update(locals())

#%%
func()
print(a)
print(b)
print(c)

# this will print:
# blablabla
# 1234
# some_calculated_value
stav
2

To achieve what is directly asked for in the question, the function could return locals(), then you can update locals() in the interactive environment:

def func():
    # define a,b,c
    return locals() # or, more restrictively, return {'a':a, 'b':b, 'c':c}

>>> locals().update(func())
keflavich
  • This is neat, but `locals().update(...)` doesn't work within a function: it updates `locals()` correctly, but the code after it does not see those variables. – Cris Luengo Jan 19 '23 at 18:02
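As the comment notes, this only works where locals() and globals() are the same dictionary, i.e. at the interactive top level. A minimal sketch of the difference, with a hypothetical caller() just for illustration (inside another function you would have to push the names into globals() instead):

def func():
    a = "blablabla"
    b = 1234
    return locals()

def caller():
    locals().update(func())   # no effect: a function's real locals can't be rebound this way
    globals().update(func())  # works: a and b land in the module namespace instead
    print(a, b)               # resolved via globals -> blablabla 1234

caller()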
1

To expand on my comment above, here's a quick intro to pdb:

import pdb
def func():
    a = "blablabla"
    b = 1234
    c = some_calculation_of_a_and_b(a,b)
    pdb.set_trace()

To run:

python program.py

The interpreter will then stop at the pdb.set_trace() line, allowing you to observe the values of a, b, and c.

While stopped, you can print the values of the local variables by typing:

p a
p b

etc. A full list of commands can be obtained by typing ? at the pdb prompt; see also the pdb library documentation.
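For illustration, an interaction at the (Pdb) prompt might then look roughly like this, assuming program.py also calls func() somewhere (p prints a value, pp pretty-prints, c continues execution):

(Pdb) p a
'blablabla'
(Pdb) p b
1234
(Pdb) pp locals()
{'a': 'blablabla', 'b': 1234, 'c': 'some_calculated_value'}
(Pdb) c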

vikramls
0

There is a built-in function for this:

print locals()

and globals() for the global variables.
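For example, dropped into the func() from the question (and assuming some_calculation_of_a_and_b returns the placeholder value used there), it would show something like:

def func():
    a = "blablabla"
    b = 1234
    c = some_calculation_of_a_and_b(a,b)
    print(locals())   # {'a': 'blablabla', 'b': 1234, 'c': 'some_calculated_value'}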

m.wasowski
  • Thanks. That's pretty close to what I had in mind. But I'd also like to "dump"/export the locals() to the workspace, so that after the function returns, I can still examine those variables. Any ideas on how to do that? – Ying Xiong Nov 09 '14 at 16:36
0

You could use %whos in IPython, or vars() in a plain Python environment.
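For instance, %whos lists names that are already in the interactive namespace (the exact table layout varies between IPython versions), which is why it cannot see into a function's locals:

In [1]: a = "blablabla"

In [2]: b = 1234

In [3]: %whos
Variable   Type    Data/Info
----------------------------
a          str     blablabla
b          int     1234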

venpa
  • This doesn't seem to work. I need to access the variables inside the function, and `whos` or `vars()` do not seem to allow me to do so. Please correct me if I'm wrong though. Thanks. – Ying Xiong Nov 09 '14 at 16:37
0

I had this same question in a Jupyter context, especially when you want to use Jupyter itself to help debug or plot some of the locals, or when you want to interactively write code meant to run within a deeply nested function and build it up against real data.

Here's a GitHub gist of a function to do this, based roughly on stav's answer above (the top level of a notebook is not the same as IPython's). This was written with Python 3.8, in case that matters.

import inspect

def copy_locals():
    '''
    Copies all local variables from the calling context into the Jupyter top level,
    e.g. for easier debugging of data and for prototyping new code that is eventually
    meant to run within that context.
    '''
    stack = inspect.stack()

    # the frame of the function that called copy_locals()
    caller = stack[1]
    local_dict = {k: v for k, v in caller.frame.f_locals.items() if not k.startswith('_')}

    # walk up the stack to the top-level (notebook) frame
    notebook_caller = None
    for st in stack:
        if st.function == '<module>':
            notebook_caller = st
            break

    if notebook_caller is None:
        print('is this being called from within a jupyter notebook?')
        return

    print('copying variables to <module> globals...', list(local_dict.keys()))
    notebook_caller.frame.f_globals.update(local_dict)

Then, if you have set a breakpoint with breakpoint(), or you have crashed and are sitting in pdb, you can call copy_locals from the debugger prompt:

%debug
  copy_locals()

and resume work on the notebook.
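A direct call outside of pdb works the same way; a quick sketch with a made-up deeply_nested() function in a notebook cell:

#%%
def deeply_nested():
    x = 42
    rows = ['a', 'b']
    copy_locals()       # pushes x and rows into the notebook's globals

#%%
deeply_nested()
print(x, rows)          # 42 ['a', 'b']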

orm