
I import data from an Excel file using the command xlsread. The data look like the following:

[Screenshot of the imported data]

I would like to format these data so that the output looks like:

A = [NaN 1 2 3; 
    20160101 100 80 90; 
    20170101 150 90 200]

In Excel, I would use a pivot table. Is there an equivalent in MATLAB, or how would I start to code this? Is reshape an option here?

Joe

1 Answer


I'll assume you are reading your data from the file as follows:

data = xlsread('your_file.xls');

This gives you a numeric matrix containing your data. You can then reorganize it by parsing the first and last columns with unique, then using the results as indices into accumarray to collect the data from the center column. Finally, you add the row and column labels:

[rowVals, ~, rowIndex] = unique(data(:, 3));      % Unique dates (row labels)
[colVals, ~, colIndex] = unique(data(:, 1).');    % Unique identifiers (column labels)
A = accumarray([rowIndex colIndex], data(:, 2));  % Collect the values into the pivoted matrix
A = [NaN colVals; rowVals A];                     % Prepend the row and column labels

And the result, for your sample data above:

A =

         NaN           1           2           3
    20160101         100          80         200
    20170101         150          90         200
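
If you don't have the original spreadsheet at hand, here's a minimal sketch of a data matrix you could test with, reconstructed from the desired output in the question (the column order [identifier value date] is an assumption based on the indexing in the code above):

data = [1 100 20160101;    % hypothetical stand-in for the Excel contents
        2  80 20160101;
        3  90 20160101;
        1 150 20170101;
        2  90 20170101;
        3 200 20170101];

Running the four lines above on this matrix reproduces the desired A from the question.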

If you have duplicate entries (i.e. entries that have the same date and identifier), the above will sum them by default. You can provide a function handle to accumarray if you'd like it to do something else. For example:

A = accumarray([rowIndex colIndex], data(:, 2), [], @mean);      % Averages them
A = accumarray([rowIndex colIndex], data(:, 2), [], @(x) x(1));  % Keeps the first entry
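
As a quick sanity check using the hypothetical data matrix sketched above: if you append a duplicate row (same date, identifier, and value) and pivot with the "keep the first entry" variant, the (20160101, 1) cell stays at 100 rather than being summed to 200:

dupData = [data; 1 100 20160101];                      % hypothetical duplicate row
[rowVals, ~, rowIndex] = unique(dupData(:, 3));
[colVals, ~, colIndex] = unique(dupData(:, 1).');
A = accumarray([rowIndex colIndex], dupData(:, 2), [], @(x) x(1));
A = [NaN colVals; rowVals A];                          % (20160101, 1) cell is 100, not 200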
gnovice
  • This answer is superb. I have one additional question on this matter (not sure whether I should open a new question for that). What if there were a duplicate? Let's say there are two entries of 20160101 with a value of 100. Your code would sum these two in the output (so it would be 20160101 200 80 200). Ideally it would leave it at 100. – Joe Nov 09 '17 at 20:05
  • 1
    @Joe: I added an explanation for how you can deal with duplicate entries. – gnovice Nov 09 '17 at 20:14