
As part of SQL injection prevention, I have revoked DELETE and UPDATE rights for the user that the connection uses. That way, an attacker cannot harm the integrity of existing data even if bad code allows SQL injection.

That leaves only INSERT. For example, an attacker could flood-insert into a particular table, creating a dirty database or taking it down with the flood of INSERTs, potentially filling the disk of the server where PostgreSQL is running. All DDL and DCL rights are already revoked for that user.

So, my question is: is it possible to prevent flood inserts by rate-limiting a specific connection/session/execution, e.g. rejecting attempts to insert more than 1 row per 5-10 seconds?

By flood insert I mean something like this:

insert into cars (id, name)
select id, name from cars

-- or, if we imagine that id is auto-generated:
insert into cars (name)
select name from cars

With one simple statement like this, a user could duplicate the entire table, doubling it with each execution.

In multiple places I have read that, as protection, you should grant rights on a function/procedure but revoke rights on INSERT/DELETE/UPDATE. However, I am not sure how to do that, because as soon as you REVOKE rights on INSERT, the procedure will not work either, since it runs as the same user.

So I am not quite sure: how do you prevent direct script/query execution while still granting function/procedure execution?

If this is possible, can someone give me a practical example? How would you do something like this?
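For reference, the pattern described above (granting procedure execution while revoking direct table rights) normally relies on PostgreSQL's SECURITY DEFINER: the function runs with the privileges of its owner, not of the caller, so the calling role needs no INSERT privilege at all. A minimal sketch, assuming the cars table above and an app_user role; the function name insert_car is made up:

```sql
-- Run as a privileged role that owns the cars table;
-- insert_car and app_user are illustrative names.
create function insert_car(p_name text)
  returns void
as $$
  insert into cars (name) values (p_name);
$$ language sql
   security definer            -- executes with the owner's privileges
   set search_path = public;   -- guard against search_path hijacking

revoke insert on cars from app_user;                     -- no direct inserts
grant execute on function insert_car(text) to app_user;  -- only this entry point
```

Since the function body inserts exactly one row per call, a connection restricted to EXECUTE on it cannot issue a set-based flood insert.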

An answer along the lines of "limit the available RAM for inserts" is NOT acceptable.

Thank you,

IngoP
    You might be able to do something with [Linux cgroups](https://github.com/cybertec-postgresql/pg_cgroups) –  Jun 06 '21 at 08:14
  • As I said, "limit RAM" is not a solution. I am looking for some kind of solution where ```insert A (id, name) select id, name from A``` would not be possible (if possible, of course). In other words, I am looking to limit the maximum number of rows possible in one insert statement. – IngoP Jun 06 '21 at 08:36
  • Does this answer your question? [Is there a way to Rate Limit or Throttle a user or a connection in PostgreSql?](https://stackoverflow.com/questions/63339053/is-there-a-way-to-rate-limit-or-throttle-a-user-or-a-connection-in-postgresql) – Andreas Jun 06 '21 at 09:00
  • No it does not. – IngoP Jun 06 '21 at 09:01
  • cgroups doesn't "limit RAM"; it limits resources in general, including I/O. If you want to limit the number of rows of an INSERT statement, you can use a statement-level trigger that throws an exception. –  Jun 06 '21 at 09:35
  • I said "along the lines". I do not need a resource limit; I need a limit on the number of rows inserted in a single statement, in whatever way possible. So, how would you do the trigger, on a specific table for a specific user? – IngoP Jun 06 '21 at 09:47
  • For a specific user, I guess I could do it as explained here: https://www.postgresql.org/message-id/CAGsySmggNP6-pcdtmzr90WT_v2rmNe5Y1%3DnRmtYX1p%2Bk2JpK1w@mail.gmail.com And I would need to think about a BEFORE INSERT trigger checking the affected row count: if > 1, then raise an exception ... – IngoP Jun 06 '21 at 09:54
  • You can see this [post](https://stackoverflow.com/questions/56988193/postgres-trigger-check-amount-before-delete) for a way to limit mass deletes (adapt it to mass inserts if you wish). Another option is to completely forbid any direct table access and to only insert/update/delete/fetch via stored procedures... but be ready to spend a lot of time on a feature with unclear benefits. – JGH Jun 06 '21 at 12:13

2 Answers


You have some contradicting requirements between your comment:

I need number of rows inserted in single statement limit

and your question:

rate-limiting specific connection / session / execution attempting insert of more than 1 row per 5-10 seconds

The "rate limit" can't be done without external tools, but the "in single statement limit" part can be achieved with a statement level trigger.

The trigger function checks the number of rows inserted:

create function limit_insert()
  returns trigger
as
$$
declare
  l_count bigint;
begin
  -- "inserted" is the transition table named in the trigger definition;
  -- it contains every row added by the triggering statement
  select count(*)
    into l_count
    from inserted;
  if l_count > 10 then
    raise 'Too many rows inserted!';
  end if;
  -- the return value of an AFTER statement-level trigger is ignored
  return null;
end;
$$
language plpgsql;

And the trigger definition would be:

create trigger limit_insert_rows_trigger
  after insert on the_table
  referencing new table as inserted
  for each statement execute function limit_insert();

Note that the trigger function can be used for any table.
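For illustration, assuming the trigger above is installed on the_table and that the table has an integer column id (a hypothetical schema), bulk inserts would behave like this:

```sql
-- up to 10 rows pass the check
insert into the_table (id)
select g from generate_series(1, 10) as g;

-- a larger set-based insert is rejected by the trigger,
-- raising the error: Too many rows inserted!
insert into the_table (id)
select g from generate_series(1, 100) as g;
```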

  • Thank you, I think this is what I am after for now, although I need to check how this works behind the scenes: are those rows inserted into a temporary table? (I do not have in-depth knowledge of PostgreSQL.) – IngoP Jun 06 '21 at 18:55
  • I believe the rows are kept in memory if there is enough. But yes, you can think of them as read-only temporary tables –  Jun 06 '21 at 19:47
  • I was kind of afraid of a temp/in-memory table; I was hoping for some pre-calculated query execution plan that would give the number of rows. It is not ideal, but it does the job and will give info that could prevent SQL injection, as it should only run in violation cases (hopefully very rare). – IngoP Jun 07 '21 at 06:12

What about keeping a table user_event with the user_id or session id, event_name and event_time?

select max(event_time)
  from user_event
 where user_id = :user_id
   and event_name = 'insert_car';

-- if the latest event_time is more recent than now() minus the allowed
-- interval, raise the error "You are doing that too fast"

Obviously this is pseudo code.
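The idea could be fleshed out as a statement-level trigger, assuming a user_event table with columns user_name, event_name and event_time; all table, column and function names here are illustrative:

```sql
create function rate_limit_insert()
  returns trigger
as $$
begin
  -- reject the statement if the same user fired the same event
  -- within the last 5 seconds
  if exists (select 1
               from user_event
              where user_name  = session_user
                and event_name = 'insert_car'
                and event_time > now() - interval '5 seconds') then
    raise 'You are doing that too fast';
  end if;
  -- record this attempt for the next check
  insert into user_event (user_name, event_name, event_time)
  values (session_user, 'insert_car', now());
  return null;  -- return value of a statement-level trigger is ignored
end;
$$ language plpgsql;

create trigger rate_limit_cars
  before insert on cars
  for each statement execute function rate_limit_insert();
```

Note that a trigger function runs with the invoking user's rights by default, so the restricted user would also need INSERT on user_event (or the function would have to be SECURITY DEFINER).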

user939857
  • It would not work, as the user could, as I noted above, duplicate millions of rows with a simple insert without a where clause. So your proposition is an idea for repeated scripts, but not for a mass insert. By the way, I would need to check how to hook into each script execution event. – IngoP Jun 06 '21 at 19:00