
I'm currently developing a web application that has one feature which allows input from anonymous users (no authentication required). I realize that this may pose security risks, such as repeated arbitrary input (e.g. spam) or users posting malicious content. To remedy this, I'm trying to create a system that keeps track of what each anonymous user has posted.

So far all I can think of is tracking by IP, but that may not be viable due to dynamic IPs. Are there any other solutions for anonymous user tracking?

Trevor
  • What kind of input from anonymous users? How anonymous do the users want to be, or are you merely trying to lower the barrier for contribution? – this.josh Jun 16 '11 at 07:42
  • To an extent I want to lower the contribution barrier. – Trevor Jun 17 '11 at 04:16

4 Answers


There are two main ways: client-side and server-side. Server-side, tracking by IP is all I can think of; client-side there are more accurate options, but they are all under the user's control, and he can re-anonymise himself (it's his machine, after all): cookies and local storage come to mind.

Amadan

Drop a cookie with an ID on it. Sure, cookies can be deleted, but this at least gives you something.
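A minimal sketch of this idea in Python, using only the standard library: generate a random ID and sign it with an HMAC so that a user who edits the cookie can be detected (the key name and `SECRET_KEY` value here are hypothetical; in practice you would set the result via a `Set-Cookie` header in your framework of choice).

```python
import hashlib
import hmac
import secrets

# Hypothetical server-side secret; keep it out of source control in practice.
SECRET_KEY = b"replace-with-a-real-secret"

def new_anonymous_id():
    """Generate a random ID and sign it so forged cookies can be detected."""
    uid = secrets.token_hex(16)
    sig = hmac.new(SECRET_KEY, uid.encode(), hashlib.sha256).hexdigest()
    return f"{uid}.{sig}"

def verify_anonymous_id(cookie_value):
    """Return the ID if the signature checks out, otherwise None."""
    try:
        uid, sig = cookie_value.rsplit(".", 1)
    except ValueError:
        return None  # cookie doesn't even have the id.signature shape
    expected = hmac.new(SECRET_KEY, uid.encode(), hashlib.sha256).hexdigest()
    return uid if hmac.compare_digest(sig, expected) else None
```

Signing doesn't stop a user from deleting the cookie and getting a fresh ID, but it does stop them from impersonating another anonymous user's ID.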

Esteban Araya
  • That's true, plus I can always make it so that cookies must be enabled and present. Pity there are still so many holes to this though. – Trevor Jun 16 '11 at 04:18

My suggestion is:

  1. Use cookies to track user identity. As you yourself have said, dynamic IP addresses mean you can't reliably use IPs for this.
  2. To detect and curb spam, use the IP + user agent combination.
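Point 2 can be sketched as hashing the IP and User-Agent into a single key to rate-limit on (function name and separator are illustrative, not from the answer):

```python
import hashlib

def spam_fingerprint(ip, user_agent):
    """Combine IP and User-Agent into one opaque key for spam counting.

    This is only a heuristic: many users behind one NAT with the same
    browser will collide, and a spammer can change either value at will.
    """
    return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()
```

You would then count posts per fingerprint rather than per raw IP, which cuts down on false positives from shared addresses.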
jeffreyveon

I would recommend requiring them to answer a CAPTCHA before posting, or only after an unusual number of posts from a single IP address.

"A CAPTCHA is a program that protects websites against bots by generating and grading tests that humans can pass but current computer programs cannot. For example, humans can read distorted text as the one shown below, but current computer programs can't."

That way, any spammers have to be actual humans, which will slow the firehose to a level where you can weed out whatever does get through.

http://www.captcha.net/
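The "only after an unusual number of posts" idea can be sketched as a sliding-window counter per IP; once the count exceeds a threshold, the site starts demanding a CAPTCHA. The limit and window values here are hypothetical placeholders:

```python
import time
from collections import defaultdict, deque

POST_LIMIT = 5       # hypothetical: posts allowed per window before a CAPTCHA
WINDOW_SECONDS = 60  # hypothetical: length of the sliding window

_recent_posts = defaultdict(deque)  # ip -> timestamps of recent posts

def needs_captcha(ip, now=None):
    """Record a post from `ip` and report whether it has exceeded the limit.

    Timestamps older than the window are dropped, so an IP that slows
    down stops being challenged.
    """
    now = time.time() if now is None else now
    q = _recent_posts[ip]
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    q.append(now)
    return len(q) > POST_LIMIT
```

The same structure works for S. Albano's login-failure case: count failures instead of posts and challenge once the window fills up.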

S. Albano
  • If worse comes to worst I may need to use this, but I think it may detract from the user experience. – Trevor Jun 16 '11 at 04:15
  • Yes, to some extent. You could start requiring it if an IP address starts posting an unusual quantity. I use it on my site when an IP address has multiple login failures, to hinder brute force attacks. – S. Albano Jun 16 '11 at 04:20
  • This has the additional benefit of not completely locking users out who use a proxy just because one spammer shares it with them. – S. Albano Jun 16 '11 at 04:21
  • Makes sense, thanks I'll just require a random/flagged check then. – Trevor Jun 16 '11 at 04:25