
I have a special security need with MySQL: I need to forcibly restrict the number of rows a query returns, raising an error if the result would exceed, say, a million rows. Here is the setup -

Need - The data has hundreds of millions of rows, and we don't want a client to bog down the server or do a complete extraction (they would never need all the rows, just aggregations). The idea is that if they do need that many, they run into the barrier, then come to us and explain why they need to pull so many rows with a single query.

System - Clients can use any query tool, so we have no control over what queries get generated. Thus, we cannot append `LIMIT x`, which is the solution suggested everywhere.

I have searched for a solution, and so far it seems the only way to do this is at the application level (which we do not own).

Is there any way to achieve this?

Constraints

1- We need to have SSL enabled.

2- MySQL 5.5

Thanks!

J


1 Answer


It seems like you might be able to get close with MySQL Proxy.

https://launchpad.net/mysql-proxy

See this page on manipulating results. I'm not sure whether it does a buffered or unbuffered read, or whether you can cancel the reading of results partway through:

http://dev.mysql.com/doc/refman/5.1/en/mysql-proxy-scripting-read-query-result.html
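As a rough illustration of that scripting hook, here is a minimal sketch of a mysql-proxy Lua script that counts rows in a buffered result set and replaces it with an error once a cap is exceeded. The cap value and the error message are my own inventions, and this assumes the proxy buffers the whole result set (which itself costs memory on the proxy host) — treat it as a starting point, not a tested solution:

```lua
-- Illustrative cap; pick whatever threshold fits your policy.
local MAX_ROWS = 1000000

-- Intercept each COM_QUERY so that read_query_result() is called for it.
function read_query(packet)
    if packet:byte() == proxy.COM_QUERY then
        proxy.queries:append(1, packet, { resultset_is_needed = true })
        return proxy.PROXY_SEND_QUERY
    end
end

-- Inspect the buffered result set; send an error instead if it is too large.
function read_query_result(inj)
    local res = inj.resultset
    if res and res.rows then
        local count = 0
        for row in res.rows do
            count = count + 1
            if count > MAX_ROWS then
                proxy.response = {
                    type = proxy.MYSQLD_PACKET_ERR,
                    errmsg = "result set exceeds " .. MAX_ROWS ..
                             " rows; contact the DBA team if you need a full extract"
                }
                return proxy.PROXY_SEND_RESULT
            end
        end
    end
end
```

Note the trade-off: because the proxy has already pulled the rows from the server before the script can count them, this protects the client path and enforces the policy, but it does not stop the server from doing the work of producing the rows.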

It's open source, so you might be able to hire someone to tweak it if needed as well.

There may be other ways to keep runaway queries from overloading your database server. Take a look at this question for more ideas:

MySQL - can I limit the maximum time allowed for a query to run?

  • That is one thing I am considering, but it seems it won't allow SSL connections :( Any thoughts on that? – Jai Jul 12 '12 at 22:56