I'm new to R and couldn't find a solution to this. I have a panel data set with country codes, values, and years. The Value column has many NAs. I would like to, for each country, get a list of the years for which the value is NA. Would this be possible using dplyr? This is a snapshot of my data set: country codes, years, and values.
When asking for help, you should include a simple [reproducible example](https://stackoverflow.com/questions/5963269/how-to-make-a-great-r-reproducible-example) with sample input and desired output that can be used to test and verify possible solutions. – MrFlick Jun 15 '18 at 18:59
-
Point noted. Thanks. – Skurup Jun 15 '18 at 19:29
-
Please post data in the post as text, preferably using `dput`, not as an image – camille Jun 15 '18 at 20:18
3 Answers
Do you mean like this?
DAT = read.table(text="Country.Code Year Value
UKR 2006 NA
UKR 2007 NA
UKR 2008 2000
ARE 2006 NA
ARE 2007 NA",
header=TRUE)
DAT[is.na(DAT$Value), 1:2]
Country.Code Year
1 UKR 2006
2 UKR 2007
4 ARE 2006
5 ARE 2007
Addition
To get all years for one country in a single line, you could use
temp = DAT[is.na(DAT$Value), 1:2]
aggregate(temp$Year, list(temp$Country.Code), paste, collapse=",")
Group.1 x
1 ARE 2006,2007
2 UKR 2006,2007
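Since the question specifically asks about dplyr, here is a sketch of the same collapse using `filter`, `group_by`, and `summarise` (this assumes the dplyr package is installed; it is not part of the original answer):

```r
library(dplyr)

DAT <- read.table(text = "Country.Code Year Value
UKR 2006 NA
UKR 2007 NA
UKR 2008 2000
ARE 2006 NA
ARE 2007 NA", header = TRUE)

# Keep only rows where Value is NA, then collapse the years per country
DAT %>%
  filter(is.na(Value)) %>%
  group_by(Country.Code) %>%
  summarise(Years = paste(Year, collapse = ","))
# ARE -> "2006,2007"; UKR -> "2006,2007"
```

The pipe reads left to right, which some find clearer than the `aggregate` call, but both produce the same country-to-years mapping.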

G5W
-
Thank you! This worked perfectly! Is there any way to have the country code only once, with the years where its values are NA in the other column? – Skurup Jun 16 '18 at 05:52
-
Making the test case:
df <- read.table(text="Country Year Value
UKR 2006 NA
UKR 2007 NA
UKR 2008 2000
ARE 2006 NA
ARE 2007 NA", header=TRUE)
> for each country, get a list of years for which the values are NA
lapply(split(df, df["Country"]), function(x) x$Year[is.na(x$Value)])
# or equivalent but more readable
with(subset(df, is.na(Value)), split(Year, Country))
Output:
$ARE
[1] 2006 2007
$UKR
[1] 2006 2007
Is this what you need?
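Note that `split` and `lapply` are part of base R, so no package needs to be installed for the code above. For reference, a sketch of the same grouping with base R's `tapply` (not part of the original answer):

```r
df <- read.table(text = "Country Year Value
UKR 2006 NA
UKR 2007 NA
UKR 2008 2000
ARE 2006 NA
ARE 2007 NA", header = TRUE)

# Keep only the NA rows, then group Year by Country -- base R only
na_rows <- df[is.na(df$Value), ]
tapply(na_rows$Year, na_rows$Country, c)
```

This returns one vector of years per country, equivalent to the `split`-based output shown above.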

lebatsnok
-
That is what I need, but the lapply package isn't getting installed on my R. I will look into this and get back to you. – Skurup Jun 16 '18 at 06:16