
I have created a BCS Model which reads data from XML returned by a REST API. I have an LOB instance in BCS which I crawl to get the data into SharePoint. The problem is that, while debugging, I found the BCS Model returns 1782 records, whereas after running a full crawl, SharePoint shows only 1222 successful records in the crawl log, with no errors or warnings. Any ideas or suggestions as to why this is happening will be of great help.

Thanks in Advance...

SiD
  • Hi, SiD! Have you managed to find the cause of the problem and fix it? I face the same issue. – Trike Apr 04 '16 at 06:07
  • Hi Trike, I know it's a bit late but hope this helps. In my case the issue was in the ReadItem method. There were special characters in my REST response. Since we were using UTF-8 encoding, these characters were getting encoded, because of which I was not able to deserialize those records. To avoid such scenarios we made changes to the code generating the REST response (see the sketch after these comments). – SiD Apr 23 '16 at 13:06
  • Thanks for the comment, SiD! I've had another issue - after crawling 2/3 of data the PropertyStoreDB hit the 4GB limit on SQL Express. – Trike Apr 24 '16 at 18:27
  • Hi Trike, since you have mentioned PropertyStoreDB I assume that you are working on SharePoint 2010. I did some research on the net and found the link below; though it does not contain the exact solution, I hope it helps you move in the right direction: http://www.smattie.com/2012/04/14/how-to-fix-sql-server-express-db-size-limitation-of-4-gb/ – SiD Apr 25 '16 at 07:41
  • Hi Sid, yes, the solution I'm working with is based on SP2010. Thanks for the link! – Trike Apr 30 '16 at 05:31
  • Hi Trike, did you find any solution to your issue? If so, could you please share it? – SiD Jun 27 '16 at 06:49
  • Hi Sid, unfortunately, we were unable to find a workaround for the 4 GB limit on the MS SQL Express DB size. We decided to switch to a different SQL Server edition. – Trike Jun 28 '16 at 06:29
  • Thanks for the update Trike – SiD Jul 01 '16 at 04:38
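For reference, here is a minimal C# sketch of the kind of sanitization SiD describes in the comments: stripping characters that are not legal in XML 1.0 from the REST response before deserializing it. The `RestResponseHelper` class, the `GetEntity<T>` helper, and the `url` parameter are hypothetical illustrations under these assumptions, not the actual connector code.

```csharp
using System;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Xml.Serialization;

public static class RestResponseHelper
{
    // Characters legal in XML 1.0; everything else is dropped before
    // the response is handed to the serializer.
    private static bool IsLegalXmlChar(int c)
    {
        return c == 0x9 || c == 0xA || c == 0xD ||
               (c >= 0x20 && c <= 0xD7FF) ||
               (c >= 0xE000 && c <= 0xFFFD);
    }

    // Remove control characters and other illegal code points that make
    // XmlSerializer throw during deserialization.
    public static string RemoveInvalidXmlChars(string text)
    {
        if (string.IsNullOrEmpty(text))
            return text;

        return new string(text.Where(ch => IsLegalXmlChar(ch)).ToArray());
    }

    // Hypothetical ReadItem-style helper: download the REST response as
    // UTF-8, sanitize it, then deserialize it into the entity class the
    // BDC model maps to.
    public static T GetEntity<T>(string url)
    {
        using (var client = new WebClient())
        {
            client.Encoding = Encoding.UTF8;
            string xml = RemoveInvalidXmlChars(client.DownloadString(url));

            var serializer = new XmlSerializer(typeof(T));
            using (var reader = new StringReader(xml))
            {
                return (T)serializer.Deserialize(reader);
            }
        }
    }
}
```

As SiD notes, the cleaner long-term fix is to correct the service that generates the REST response; sanitizing on the consuming side is a fallback when the service cannot be changed.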

1 Answer


As far as I know, SharePoint will always log crawled element info in the crawl logs. There are two other places where your data might be trimmed: the BDC Model and the API.

  1. Check whether the BDC Model has any filters defined. If so, remove them.
  2. If possible, log all the items returned by the API, then check whether the API actually returned all 1782 items (see the sketch after this list).
  3. Check the ULS Viewer for errors during the crawl. You can filter messages by Category containing "Business".
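For point 2, here is a minimal C# sketch of how the items returned by the Finder (ReadList) method could be written to a flat file, so the full list can be compared against the 1222 items that show up as successful in the crawl log. The `CrawlDiagnostics` class, the `ReadListWithLogging` wrapper, and the `Entity` type are hypothetical; they stand in for whatever entity class and ReadList method the actual BDC model uses.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Hypothetical entity class standing in for the BDC model's entity definition.
public class Entity
{
    public string Id { get; set; }
    public string Title { get; set; }
}

public static class CrawlDiagnostics
{
    // Wrap the model's Finder (ReadList) call and dump every identifier
    // it returns to a flat file, so the full list can be diffed against
    // the crawl log entries.
    public static ICollection<Entity> ReadListWithLogging(
        Func<IEnumerable<Entity>> readList, string logPath)
    {
        var items = new List<Entity>(readList());

        File.WriteAllLines(logPath, items.ConvertAll(i => i.Id).ToArray());
        File.AppendAllText(logPath,
            "Total items returned: " + items.Count + Environment.NewLine);

        return items;
    }
}
```

Calling this wrapper from inside the Finder method (passing in the existing call that fetches items from the REST API and a writable log path) gives you a definitive count to compare against the crawl log.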
Friki