
We took over a customer and checked their DB. It's almost full and growing by about 50 MB a day. There's no apparent reason for it to do that, and we can't find any unusually large objects in it. The solution is online, so we can't fiddle with the DB directly.

We've checked the usual suspects.

  • Emails, attachments, notes.
  • Custom fields on a number of entities.
  • Custom entities.
  • System jobs and recurrent workflows.
  • Audits system-wide.
  • A bunch of other stuff we could think of.
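One way to narrow this down, since direct DB access is off the table, is to snapshot per-entity record counts on two different days and diff them; whichever entity grew the most is the prime suspect. A minimal sketch of that diff, using hard-coded stand-in counts (in a real CRM Online org you'd pull the numbers via FetchXML aggregate queries or the built-in record-count report; the entity names and figures below are hypothetical):

```python
# Hypothetical sketch: diff two daily snapshots of per-entity record
# counts to spot the entity driving the growth. The snapshot dicts
# are stand-ins for counts pulled from the live system.

def growth_per_entity(yesterday, today):
    """Return (entity, delta) pairs sorted by record-count growth, largest first."""
    deltas = {
        entity: today.get(entity, 0) - yesterday.get(entity, 0)
        for entity in set(yesterday) | set(today)
    }
    return sorted(deltas.items(), key=lambda kv: kv[1], reverse=True)

# Made-up example snapshots
yesterday = {"email": 120_000, "annotation": 45_000, "asyncoperation": 10_000}
today     = {"email": 120_400, "annotation": 45_050, "asyncoperation": 62_000}

for entity, delta in growth_per_entity(yesterday, today):
    print(f"{entity}: +{delta} records")
```

Run daily, even a crude version of this turns "the DB is growing" into "this one entity is growing", which is a much easier question.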

They've got an integration with Visma that produces (or is supposed to produce) PDFs into the system. However, we can't find those either.

We've run a number of reports and come up with squat.

Any suggestions on what could be causing it? Or at least where else to look?

Konrad Viltersten
  • 50MB a day doesn't sound like that much, how many users? – James Wood Mar 12 '13 at 11:07
  • Audit turned on by any chance? Agree that 50MB doesn't sound huge. – glosrob Mar 12 '13 at 11:17
  • @glosrob Sorry, I forgot to mention that. Audit is off system-wide and section-wide (haven't checked every single entity, though that shouldn't be possible if the big one is off). Nevertheless, I appreciate the feedback. Any more suggestions? Even long shots? And why on earth is the question up for *close*?! – Konrad Viltersten Mar 12 '13 at 12:14
  • Could activity feeds be causing the growth? – Todd Richardson Mar 12 '13 at 12:17
  • Any chance you have a plugin/workflow that's in an infinite loop, and it's logging to the database? – Daryl Mar 12 '13 at 12:20
  • @KonradViltersten it is on close cos we need a CRM stackexchange to encompass more esoteric questions that might not be an exact fit here on SO.. IMO of course ;) – glosrob Mar 12 '13 at 12:35
  • I think you're checking all the right things - I'd have assumed attachments or system jobs, but you've checked all of these. At the risk of sounding defeatist however, I wonder if it would be better (cheaper) for you to simply pay the extra few pounds/dollars for additional storage. 5gb a year will cost you roughly the price of a consultancy day... If nothing else you can defer the problem ;) – Greg Owens Mar 12 '13 at 12:40
  • @Daryl I thought of that but rejected the thought because it's an online installation. They shouldn't be able to log to the DB, right? – Konrad Viltersten Mar 12 '13 at 13:02
  • @GregOwens I second that suggestion. We're just worried that if they grow, the increment per day will rise **way** beyond 50MB/day and that we hit the roof of 99MB within two, three years. However, pragmatically speaking, I like the suggestion. It's sweeping under the carpet, but the carpet should be big enough. :) – Konrad Viltersten Mar 12 '13 at 13:06
  • @glosrob Huh? Encompass esoteric question? I wouldn't put it that way. Mostly because I don't know what that means... :) – Konrad Viltersten Mar 12 '13 at 13:09
  • @KonradViltersten - don't know if your customer can handle this... but maybe on off hours, you could completely fill up the database with dummy data, and see what hidden processes are breaking as they attempt to add data to the database. – Daryl Mar 12 '13 at 13:11
  • @Daryl This was brutal. Brilliant too. Mostly brutal. I'll talk to them right away. Put your suggestions as a reply and if it works, I'll check it. (You need to do that before it gets *esotericized* to oblivion and no more answers will be accepted.) – Konrad Viltersten Mar 12 '13 at 13:13
  • @KonradViltersten there is a setting in Workflows that automatically deletes the workflow record after it is completed. Is it possible that your growth is related to the workflows, and is it possible you'd see the growth rate you expect if you toggle this setting? – Mike_Matthews_II Mar 12 '13 at 13:36

1 Answer


Don't know if your customer can handle this... but maybe on off hours, you could completely fill up the database with dummy data, and see what hidden processes are breaking as they attempt to add data to the database.

I used to work at a Big Box Chain, and during Thanksgiving (not a good time of the year to have troubles) our website was going extremely slow. Everyone blamed the web servers, but CPU usage was only 25% on them. Managers were clamoring to add more web servers to the farm. I instead suggested that we remove some and see what happened. As I anticipated, when we removed two servers, the others' load jumped up by the correct mathematical percentage. It ended up being a network setting where the max packet size was set to an extremely small number.

Sometimes it's better to force the problem rather than try to avoid it...

Daryl