
Possible Duplicate:
Limiting variable scope

One of the easiest errors for me to slip into is having a function access a variable in the global environment rather than the local environment. During development this might happen when I rename a variable and forget to `rm()` the old one, then fail to update a function so that it accesses the new one.

Is there a way to have R return an error or warning whenever it automatically grabs a variable from higher up in the tree? It seems like this would be easier now that R supports compiled code....

Here's a quick example of something I'd like to return an error/warning:

x <- 5
f <- function(y) {
   z <- y + x
   z
}
f(3)  #will return 8
Ari B. Friedman
    Do either of these questions help: [Limiting variable scope](http://stackoverflow.com/questions/3277556/limiting-variable-scope), [R force local scope](http://stackoverflow.com/questions/6216968/r-force-local-scope)? – Joshua Ulrich Nov 16 '11 at 20:09
  • @JoshuaUlrich The second one is the one I was looking for. Voted to close in the interests of keeping all the good answers consolidated in one place. – Ari B. Friedman Nov 17 '11 at 01:33
  • I've tackled this problem by testing my functions in clean R sessions. – Roman Luštrik Nov 17 '11 at 11:44

3 Answers


I don't think there's an easy way (e.g. an option in options()). Are your functions in an R script file? Are you building a package?

If so, I think the best way is:

> f <- function(y) {
+    z <- y + x
+    z
+ }
> library(codetools)
> checkUsage(f)
<anonymous>: no visible binding for global variable ‘x’ (:2)

But note that if x is defined in the session where you run the check, then this will not report an error: there is a visible binding, so `checkUsage` stays silent. You would have to copy your f function into a clean R session and then run `checkUsage`. This seems like a pain, but it's not that bad, and I'm sure you could create some tricks for dealing with it. And if you have your functions all in a separate file, then it's easy.
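One way to script the "clean session" step without leaving your current one is to source the file into a fresh environment whose parent is `baseenv()`, so your global workspace is invisible, and then check every function in it with `checkUsageEnv`. A sketch (the file contents here are just a stand-in for your own script):

```r
library(codetools)

# Write a throwaway function file to stand in for your real script file.
path <- tempfile(fileext = ".R")
writeLines(c(
  "f <- function(y) {",
  "  z <- y + x",
  "  z",
  "}"
), path)

# Source it into an environment that cannot see the global environment,
# then check all functions defined there in one call.
env <- new.env(parent = baseenv())
sys.source(path, envir = env)
checkUsageEnv(env, all = TRUE)
# reports: no visible binding for global variable 'x'
```

Because the functions' enclosing environment skips the global environment, a stray `x <- 5` in your workspace can no longer mask the problem.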

Xu Wang
  • This seems like the proper way, but it both requires me to actively call `checkUsage` each time, and doesn't really work on one-off anonymous functions in e.g. sapply calls.... – Ari B. Friedman Nov 17 '11 at 01:05
  • @gsk3 I'm not sure anything can fix one-offs, but running `checkUsage` is a good idea for package and script development. – Iterator Nov 17 '11 at 14:19

f() fails because there is no default value for y. I think you meant f(1).

There may be a better way to do this, but you could wrap your function definitions in local calls during development. Something like:

x <- 5
f <- local(function(y) {
   z <- y + x
   z
}, baseenv())
f(1)
# Error in f(1) : object 'x' not found
Joshua Ulrich
  • Oops. f(1) was indeed what I meant. I like the idea of the local wrapper. Is there a way to overload the `function` function so that this doesn't have to be done for every function, and then un-done again when development "ceases" (as though it ever stops!). – Ari B. Friedman Nov 17 '11 at 01:04
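Rather than overloading `function` itself, a lighter-weight option along the same lines is a small helper (hypothetical, not part of base R) that rewires an already-defined function's enclosing environment to `baseenv()` via `environment<-`:

```r
# Hypothetical helper: cut a function off from the global environment by
# resetting its enclosing environment to baseenv().
strict <- function(f) {
  environment(f) <- baseenv()
  f
}

x <- 5
f <- strict(function(y) {
  z <- y + x
  z
})
f(1)
# Error in f(1) : object 'x' not found
```

When development "ceases", you can simply stop wrapping with `strict()` (or make it the identity function) without touching the function bodies.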

It is always a good idea to put your functions into a package. That has the added advantage of allowing for R CMD check. One of the checks R performs is for dependencies on undefined variables (this was referenced in, e.g., this question).

Of course, truth be told, while actively developing, constantly rebuilding, checking, and reinstalling your package is somewhat of a hassle, so you may be better off with the other suggestions.

Nick Sabbe