
One of our MediaWiki-based projects seems to be under a DoS attack: an unusual number of anonymous users are trying to edit pages and issuing view-source and edit-history requests. Anonymous editing is disabled on that project, so these anonymous users (bots, I assume) cannot actually change the pages, but the load is serious enough to slow the server significantly.

We have only one server box, with about 8 GB of RAM, dedicated to this project. It runs a recent MediaWiki version on top of an Apache server under Linux.

One idea would be to alter (perhaps temporarily) some MediaWiki code to ban anonymous users from edit, view-source, and edit-history requests, and to do this early in the request flow, so the attempts are rejected with minimal resources spent on them.
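Rejecting such requests before they ever reach PHP would be even cheaper than patching MediaWiki. As a rough sketch only, assuming logged-in clients can be distinguished by the presence of a MediaWiki session cookie (the cookie name depends on your wiki's configured cookie prefix, so the `_session` pattern below is an assumption to adjust), an Apache mod_rewrite rule could return 403 for anonymous edit/history requests:

```apache
RewriteEngine On
# Match edit, history and raw-source requests to index.php...
RewriteCond %{QUERY_STRING} action=(edit|history|raw) [NC]
# ...from clients carrying no MediaWiki session cookie
# (the cookie prefix varies per wiki; adjust the pattern).
RewriteCond %{HTTP_COOKIE} !_session= [NC]
# Reject immediately with 403 Forbidden, before PHP is invoked.
RewriteRule ^ - [F,L]
```

This also rejects logged-out human editors, of course, but during an attack that may be an acceptable trade-off.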

Or maybe that is not a good idea and something else can be done. We are currently blocking the traffic with iptables, and this works for us (the site stays operational). However, such blocking needs too much attention: the attacker seems to control a rather large pool of IP addresses from all corners of the world.
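Maintaining one iptables rule per address quickly becomes unmanageable with a large, shifting pool. A common approach is to keep the addresses in an ipset set and match the whole set with a single iptables rule; the set name `wiki-abusers`, the timeout, and the example address below are illustrative assumptions:

```shell
# Create a set holding up to 65536 addresses; entries expire
# automatically after 24 hours, so the list prunes itself.
ipset create wiki-abusers hash:ip timeout 86400 maxelem 65536

# A single iptables rule drops traffic from every address in the set.
iptables -I INPUT -m set --match-set wiki-abusers src -j DROP

# Adding an offender is one cheap operation, easy to script
# from a log-watching tool such as fail2ban.
ipset add wiki-abusers 203.0.113.42
```

A fail2ban jail watching the Apache logs for the offending request pattern could feed the set automatically, removing the need for manual attention.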

Assuming we have full control (root access) over the Linux server, what could be done to shield it more properly?

We are already applying end-user-level approaches such as disabling anonymous editing through the web interface. This still generates enough load that it is not a full solution. Simply hiding the controls is not enough either: the edit requests keep coming even though the "edit" tab is no longer visible.

h22

1 Answer


If this is about spambots trying to edit, you could try some of the webserver-level anti-spam solutions: https://www.mediawiki.org/wiki/Manual:Combating_spam#Hardcore_measures. Or it could be that your anti-spam measures themselves are bringing the site down; that is not uncommon. Check that your solutions are performant: some cures are worse than the illness.

If you're not actually sure from the logs that those are spambots, it may just be search engines. Make sure you set up a robots.txt that excludes the script path from crawling, so that only the canonical page URLs are indexed and no URLs with parameters to api.php or index.php. Also set a crawl-delay if needed.
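Assuming the common layout where page views live under /wiki/ and the scripts (index.php, api.php) under /w/ (adjust the path to your actual script path), a minimal robots.txt along these lines keeps crawlers on the canonical URLs:

```
User-agent: *
Disallow: /w/
Crawl-delay: 10
```

Note that Crawl-delay is honored by some crawlers (e.g. Bing and Yandex) but ignored by Google, which throttles via Search Console instead.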

Nemo