
I'm currently preparing some longitudinal data for analysis in SPSS and I've got six data files. How do I combine them into the wide format for analysis?

Wendeekay
  • look up `match files` command – eli-k May 25 '17 at 10:09
  • Thanks for your response. I tried this: `MATCH FILES /FILE=C:\Users\###\My Data\Bro.sav /FILE=C:\Users\###\My Data\Bro_II.sav /BY hidp. SAVE OUTFILE = Merge1.sav.` The result was: Warning # 206 in column 11. Text: \ An invalid character has been found on a command. Warning # 206 in column 17. Text: \ An invalid character has been found on a command. Warning # 206 in column 22. Text: \ An invalid character has been found on a command. Note # 5145: The working file has been restored, and subsequent commands may access the working file. – Wendeekay May 26 '17 at 11:19

2 Answers


Use match files to connect your files. In response to your second question in the comment: use quotes around all your file names, e.g. /FILE='C:\Users\###\My Data\Bro.sav'.
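For instance, a minimal sketch based on the command from the comment above (the paths keep the question's ### placeholder, and both files are assumed to be already sorted by hidp):

MATCH FILES /FILE='C:\Users\###\My Data\Bro.sav'
  /FILE='C:\Users\###\My Data\Bro_II.sav'
  /BY hidp.
SAVE OUTFILE='Merge1.sav'.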

eli-k

Assuming that the six datasets all contain the same key variable hidp to link them together, and that all datasets are sorted by this variable, you could run code which looks something like the following:

* Merge the six wave files side by side, matching records by hidp.
match files file = 'C:\Users\###\My Data\Bro.sav'
  /file = 'C:\Users\###\My Data\BroII.sav'
  /file = 'C:\Users\###\My Data\BroIII.sav'
  /file = 'C:\Users\###\My Data\BroIV.sav'
  /file = 'C:\Users\###\My Data\BroV.sav'
  /file = 'C:\Users\###\My Data\BroVI.sav'
  /BY hidp.

As you can see, you can combine them all in one command. If a dataset is not sorted, you can sort it first with:

get file = 'C:\Users\###\My Data\BroII.sav'.
sort cases by hidp.
save outfile = 'C:\Users\###\My Data\BroII.sav'.

Note that this command assumes that, apart from the key variable hidp, all other variables are unique to the six datasets. That is, a variable name may occur in only one dataset. Otherwise SPSS does not know which value to keep, and the result may be a warning, an error, or output that should be checked carefully.
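If a variable does occur in more than one file, one way to keep each wave's version is to rename it per file within the same command. A rough sketch, where income is just a made-up example of a clashing variable name:

match files file = 'C:\Users\###\My Data\Bro.sav'
  /rename (income = income_w1)
  /file = 'C:\Users\###\My Data\BroII.sav'
  /rename (income = income_w2)
  /BY hidp.

Each rename subcommand applies to the file named directly before it, so income_w1 and income_w2 end up as separate variables in the merged file.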

MA53QXR
  • Thank you very much. I seem to be having another challenge. This has to do with combining data from two different data files: one of the data files contains data for individuals (age, sex, etc.) from 6 waves (or time intervals), which I have matched by pidp. The second data file contains data on the individual's household (size, e-board number, etc.). The unique identifier they share is hidp. I have also combined the household data of the 6 waves using the match files command. Now how do I combine the individual and household files? I have already attempted match files by hidp – Wendeekay Jul 04 '17 at 15:56
  • Okay, you combined the 6 datasets with data on individual persons, using pidp as the BY-variable. Note that with the IN subcommand you can easily track which wave a person came from: match files file = '....Bro.sav' /in = inwave1 /file = '...BroII.sav' /in = inwave2 .... and so on. – MA53QXR Jul 05 '17 at 09:29
  • As for your comment: it is not clear to me which files contain hidp. I hope it is in both the 6 wave files and in the household file; I assume this is the case. If the household file has unique values of hidp for every record, you should use the combined-waves file in your match files command and use the household file as a lookup table, e.g. match files file = 'your-combined-waves-file' /table = 'your-household-file' /by hidp. This gives you a file with one record for every person, with the information on their household as extra variables (see the sketch after these comments). – MA53QXR Jul 05 '17 at 09:34
  • And thanks for marking my answer as the best answer. This made it possible for me to add comments. :) – MA53QXR Jul 05 '17 at 09:35
  • You're welcome :) hidp is contained in both data files, yes. Oh I see what you mean, I'll attempt this suggestion, thank you very much – Wendeekay Jul 05 '17 at 13:34
  • It worked, thanks. I was getting an error message (file doesn't exist) until I referenced the full file path, despite setting the folder at the outset with the CD and datadir commands – Wendeekay Jul 06 '17 at 05:57
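Putting the two suggestions from these comments together, a rough two-step sketch might look like the following. The file names only follow the pattern from the comments and stand in for your own individual-wave files, combined file and household file, and every file is assumed to be sorted by the key it is matched on.

* Step 1: combine the individual files, flagging the wave each person comes from.
* Only two waves are shown; extend the same pattern for the remaining waves.
match files file = 'C:\Users\###\My Data\Bro.sav' /in = inwave1
  /file = 'C:\Users\###\My Data\BroII.sav' /in = inwave2
  /by pidp.
save outfile = 'C:\Users\###\My Data\combined_waves.sav'.

* Step 2: attach the household data, using the household file as a lookup table.
* Both files must contain hidp and be sorted by it before this step.
match files file = 'C:\Users\###\My Data\combined_waves.sav'
  /table = 'C:\Users\###\My Data\household.sav'
  /by hidp.
save outfile = 'C:\Users\###\My Data\individuals_with_household.sav'.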