
I have a table with 3000 rows and 8 columns displayed in a QTableView. To insert items I do:

QStandardItem* vSItem = new QStandardItem();
vSItem->setText("Blabla");
mModel->setItem(row, column, vSItem);

where mModel is a QStandardItemModel. Everything is fine when there are not too many rows, but when I try to visualize large data sets (about 3000 rows), it becomes extremely slow (20 seconds on Win 7 64-bit, an 8-core machine with 8 GB of RAM!). Is there anything I can do to improve performance?

Thanks in advance.

Sam Becker
Tom

9 Answers


Good call on the autoresize on contents for your columns or rows.

I had a function that added a column to the table each time a client connected to my server application. As the number of columns in the table grew, insertions seemed to take longer and longer.

I was doing a ui->messageLog->resizeRowsToContents(); each time. I changed this to auto-resize only the row that was being added, ui->messageLog->resizeRowToContents(0);, and the slowness went away.
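A sketch of the difference (the messageLog name comes from this answer; the append helper and the model are hypothetical):

```cpp
#include <QStandardItemModel>
#include <QTableView>

// Hypothetical helper that appends one row to a log view's model.
// Resizing only the newly added row avoids re-measuring every existing row.
void appendLogRow(QTableView *view, QStandardItemModel *model, const QString &text)
{
    const int row = model->rowCount();
    model->setItem(row, 0, new QStandardItem(text));

    // Slow: re-measures all rows on every insert.
    // view->resizeRowsToContents();

    // Fast: measure just the row that changed.
    view->resizeRowToContents(row);
}
```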

Eric

Do you have auto-resize on contents enabled for your columns or rows? It can sometimes be a performance killer!

Have a look here: QHeaderView::ResizeToContents

Hope it helps!

Christophe Weis
Andy M
  • Not explicitly. Unless this is done by default, I don't have it. – Tom Oct 27 '10 at 08:40
  • What I have is: mUi.mImagesTableView->setAlternatingRowColors(true); mUi.mImagesTableView->setSelectionMode(QAbstractItemView::SingleSelection); mUi.mImagesTableView->setSelectionBehavior(QAbstractItemView::SelectRows); mUi.mImagesTableView->setEditTriggers(QAbstractItemView::NoEditTriggers); mUi.mImagesTableView->setIconSize(QSize(vIconSize, vIconSize)); mUi.mImagesTableView->setColumnWidth(0, vIconPlusBorder); mUi.mImagesTableView->horizontalHeader()->setStretchLastSection(true); – Tom Oct 27 '10 at 08:41
  • Try disabling setStretchLastSection, and disable auto-resize (I don't remember if it's on by default)... In other words, try disabling everything that could be related to resizing rows and columns... In my application, in release, I have a tree view (with columns) with more than ten thousand items and I can resize everything without problem... But if I switch on those auto-resize features, my performance goes down badly! – Andy M Oct 27 '10 at 08:55
  • Note: even in debug, you shouldn't have performance problems with that small amount of data... – Andy M Oct 27 '10 at 08:56
  • Random idea: where do you create the data? In a method that is called many times? – Andy M Oct 27 '10 at 08:57
  • I disabled setStretchLastSection, but it is still slow. Maybe there is something enabled by default that I should disable, but I do not know what. There is nothing about auto-resizing rows or columns in QTableView... And I am calling the method where I fill the data only once. – Tom Oct 27 '10 at 09:08

I found a solution: the problem was that I assigned the model to the table view in the constructor, so every time I inserted an item into the model, the view was informed and probably updated. Now I assign the model to the view only after I have filled the model with data. This is not an elegant solution, but it works. Is there perhaps a way to temporarily detach the model from the view, or to tell the view not to react to changes in the model?
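A sketch of this workaround (the tableView name is illustrative; the fill loop mirrors the code from the question):

```cpp
#include <QStandardItemModel>
#include <QTableView>

// Build and fill the model while it is detached from any view, so the
// view never reacts to the thousands of intermediate insertions.
QStandardItemModel *model = new QStandardItemModel(parent);
for (int row = 0; row < 3000; ++row) {
    for (int column = 0; column < 8; ++column) {
        QStandardItem *item = new QStandardItem();
        item->setText("Blabla");
        model->setItem(row, column, item);
    }
}

// Attach the fully populated model in one step; the view lays out once.
tableView->setModel(model);
```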

Tom
  • You can try messing with blockSignals (http://doc.trolltech.com/main-snapshot/qobject.html#blockSignals) on the model, but that will probably cause you more headaches. Really, you should consider leaving the order the same, or changing to your own model (so you would know when to say you've changed data). – Caleb Huitt - cjhuitt Oct 27 '10 at 19:23
  • For large datasets, I would consider writing a "proper" model instead of using QStandardItemModel. There you can insert data in bigger chunks than single items, each of which triggers the view to update, which should be vastly more performant. Edit: I see I just repeated what James said below... – Frank Osterfeld Oct 28 '10 at 09:04

Watch out for setSectionResizeMode(). This had enormous performance implications for me: it causes row and column size recalculations on every modification (i.e. every setData()/setText() call). This wasn't noticeable until I reached 1000+ rows. Consider using resizeSections() instead, which seems to be a one-time adjustment.
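The trade-off sketched in code (assuming a QTableView named tableView; both calls go through its horizontal header):

```cpp
#include <QHeaderView>
#include <QTableView>

QHeaderView *header = tableView->horizontalHeader();

// Recomputes section sizes on every data change -- can become very
// slow once the model holds thousands of rows:
// header->setSectionResizeMode(QHeaderView::ResizeToContents);

// One-time adjustment, done after the model has been populated:
header->resizeSections(QHeaderView::ResizeToContents);
```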

Mitch

For this quantity of data, you'd be better off with a custom model; then you'd have control over when you inform the view of updates, for example. The 'standard' items scale to hundreds, and probably thousands, of rows because modern hardware is fast, but they're explicitly documented as not being intended for datasets of this size.
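A minimal sketch of such a custom model (class and member names are hypothetical); the key point is that attached views are notified once per bulk load rather than once per cell:

```cpp
#include <QAbstractTableModel>
#include <QVector>
#include <utility>

// Minimal read-only table model over an in-memory vector of rows.
class BigTableModel : public QAbstractTableModel
{
public:
    using Row = QVector<QString>;

    int rowCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : m_rows.size(); }

    int columnCount(const QModelIndex &parent = QModelIndex()) const override
    { return parent.isValid() ? 0 : 8; }

    QVariant data(const QModelIndex &index, int role = Qt::DisplayRole) const override
    {
        if (!index.isValid() || role != Qt::DisplayRole)
            return QVariant();
        return m_rows.at(index.row()).value(index.column());
    }

    // Replace the whole data set in one shot; views re-layout exactly once.
    void setRows(QVector<Row> rows)
    {
        beginResetModel();
        m_rows = std::move(rows);
        endResetModel();
    }

private:
    QVector<Row> m_rows;
};
```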

James Turner

Also, if all your rows have the same height, setting uniformRowHeights (http://doc.qt.io/qt-5/qtreeview.html#uniformRowHeights-prop) to true can boost performance. In my case, a model containing about 50,000 rows was almost unusable with uniformRowHeights set to false (the default). After changing it to true, it worked like a charm.

Christophe Weis
bjoern.bauer

I am using 80,000 rows and had a similar problem adding huge numbers of items to a table.

My solution was to let it allocate the memory in advance by telling it how many rows it will need.

I was using a QTableView and model, so:

self.model.setRowCount(80000)

I'm sure you can match this up with your code.
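For the C++ QStandardItemModel from the question, the equivalent sketch (mModel as in the question) would be:

```cpp
// Tell the model its final size up front, so it does not grow (and
// notify the attached view of an insertion) once per appended row.
mModel->setRowCount(80000);
mModel->setColumnCount(8);

// setItem() now fills pre-existing cells instead of inserting rows.
mModel->setItem(0, 0, new QStandardItem("Blabla"));
```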

Worthy7

Try this:

    // Register the SQLite driver once, e.g. at application startup:
    QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");

    void SELECT_TO_TBLWID(QTableWidget *TBL, QString DbPath, QString SQL)
    {
        QSqlDatabase db2 = QSqlDatabase::database();
        db2.setDatabaseName(DbPath);
        if (!db2.open())
        {
            qDebug() << db2.lastError();
            qFatal("Failed to connect.");
        }

        QSqlQuery qry;
        qry.prepare(SQL);
        if (!qry.exec())
        {
            qDebug() << qry.lastError();
        }
        else
        {
            QSqlRecord rec = qry.record();
            TBL->setColumnCount(rec.count());

            // Count the rows first so the widget allocates them all at
            // once, instead of growing row by row.
            int rowCount = 0;
            while (qry.next())
                ++rowCount;
            TBL->setRowCount(rowCount);

            // Rewind to before the first record.
            while (qry.previous()) {}

            // Set the headers once, outside the fill loop.
            for (int c = 0; c < rec.count(); ++c)
                TBL->setHorizontalHeaderItem(c, new QTableWidgetItem(rec.fieldName(c)));

            for (int r = 0; qry.next(); ++r)
                for (int c = 0; c < rec.count(); ++c)
                    TBL->setItem(r, c, new QTableWidgetItem(qry.value(c).toString()));
        }

        db2.close();
    }
Vojtech Ruzicka
Adrian
    Although this code may help to solve the problem, it doesn't explain _why_ and/or _how_ it answers the question. Providing this additional context would significantly improve its long-term value. Please [edit] your answer to add explanation, including what limitations and assumptions apply. – Toby Speight Aug 10 '16 at 13:02

I had the exact same issue, with even 100 rows the performance was horrible.

Upon inspection, this issue is not really related to options set on the table view itself (like resizing and such), but rather to the fact that the model informs and updates the view on each insertion (beginInsertRows/endInsertRows).

That being said, you have two options for maximum performance:

  1. Set the model to the view after you populated with data
  2. Set the model anytime, but populate a new list of data and then assign that list to the model

Whichever option you go with, the performance gain is dramatic. I went with the second option because I set my model (and proxy model) in the constructor of the widget.

Later on when I want to add data:

// Create a new list of data (temporary)
QList<MyObjects*> NewList;

for (auto it = Result.cbegin(); it != Result.cend(); ++it) 
{
    MyObjects* my = new MyObjects();
    
    // my - set data
    
    NewList.append(my);
}

// Now simply replace current data list
Model->setList(NewList);

This assumes you have created your own setList() function inside your custom model:

void Model::setList(QList<MyObjects*> clist) 
{
    beginResetModel();
    list = clist;
    endResetModel();
}

And voilà... you load thousands of records with high performance. Note that beginResetModel() and endResetModel() are the functions that notify the table view.

Enjoy.

Mecanik