This is a simple question. Suppose I have a massive table (5 million rows), and I have no option but to do a full table scan.
SELECT * FROM Table1
Is there any way at all to make my database return the results faster?
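In case it helps, this is how I've been verifying that it really is a scan (EXPLAIN is the MySQL/PostgreSQL spelling; other engines have their own equivalent):

EXPLAIN SELECT * FROM Table1;
-- the plan reports a full (sequential) scan over all ~5 million rows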
Background:
We have an application, provided to us by a third party, that was designed to be robust/flexible - there are very few, very large tables in its database. For example, one of the tables is the "Object" table, which stores all objects. Another is the "Relationship" table, which captures all relationships between objects. It essentially lets you store anything without ever changing the schema.
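To make that concrete, the two tables look roughly like this (I'm simplifying, and the column names are my approximations, not the vendor's actual schema):

CREATE TABLE Object (
    ObjectID   INT PRIMARY KEY,
    ObjectType VARCHAR(50),   -- what kind of thing this row represents
    Name       VARCHAR(255)
    -- ...plus generic attribute/value columns for everything else
);

CREATE TABLE Relationship (
    RelationshipID   INT PRIMARY KEY,
    ParentObjectID   INT,            -- refers to Object.ObjectID
    ChildObjectID    INT,            -- refers to Object.ObjectID
    RelationshipType VARCHAR(50)     -- what the link means
);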
My task is to design a separate reporting application that queries this database. The reports have to run against live data, and because of the size of the tables, plus the complexity of the queries, there are performance issues.
I'm wondering how I should go about handling these performance issues. I've created index after index, but the queries are still very complex, and at the end of the day I still end up doing multiple full table scans.
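As an illustration of the kind of query I mean (simplified, and the object types here are made up - the real queries join these tables several more times):

-- Pull every object of one type together with the related objects of
-- another type. Both joins hit the two big tables, and the type filters
-- aren't selective enough for the optimizer to prefer an index over a scan.
SELECT o1.Name AS ParentName, o2.Name AS ChildName
FROM Object o1
JOIN Relationship r ON r.ParentObjectID = o1.ObjectID
JOIN Object o2 ON o2.ObjectID = r.ChildObjectID
WHERE o1.ObjectType = 'Customer'
  AND o2.ObjectType = 'Order';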
The good news is that I'm still in the design phase, so I'd like to hear as many suggestions as possible, in case I've missed something.