
I have a database table with a list of URLs that I would like SharePoint 2013 Search to index so they show up in search results. The URLs point to a mixture of content types: web pages, Word documents, PDFs, etc.

All the URLs are internal to my network, but they aren't SharePoint pages or files stored in SharePoint.

I am using SharePoint 2013 Enterprise Search on a Windows Server 2008 R2 server.

Does anyone have any ideas on how to achieve this?

I have searched for options but can't seem to find anything relevant. BDC and BCS have come up a lot, but they seem to be more about indexing the content returned by the connector itself. What I want to do is use the data returned from the table as pointers to the items to be indexed.

I'm very new to SharePoint and SharePoint Search and am at a bit of a loss on how to go about this (to make it even more difficult, I would also like to apply ACLs to the results, and the ACLs are in another table, but that's another question!). Given my experience level, I would like the answer to be as basic as possible if you can, but any help would be appreciated.

Orielton
  • To be honest, I'm unsure exactly what you're trying to achieve and why. Have you looked at Federated Search? Or have you considered importing these URLs into a SharePoint list (which can be indexed and searched natively) and managing them there, rather than in an external system? – Panoone Dec 18 '14 at 23:46

1 Answer


BDC and BCS are the proper way to do it, but they are very complicated. If you want something simple, create a small script that writes all the URLs to a single HTML document, then point the web crawler at that document: it will follow the links and crawl the content they point to.
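As a rough illustration, here is a minimal sketch of such a script, assuming the URLs live in SQL Server and are reachable via pyodbc; the connection string, table name (dbo.UrlList), and column name (Url) are all hypothetical and would need to match your environment:

```python
# Minimal sketch, not a drop-in solution: reads URLs from a database
# table and writes them into a single HTML page of links that a
# SharePoint "Web Sites" content source can then crawl.
#
# Assumptions (all hypothetical -- adjust to your environment):
#   - SQL Server reachable via pyodbc with Windows authentication
#   - a table dbo.UrlList with a single column Url holding the links
from html import escape

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
)

def export_links(output_path: str = "crawl-seed.html") -> None:
    """Write every URL in the table as an <a> tag in one HTML page."""
    conn = pyodbc.connect(CONN_STR)
    try:
        cursor = conn.cursor()
        cursor.execute("SELECT Url FROM dbo.UrlList")  # hypothetical table/column
        with open(output_path, "w", encoding="utf-8") as f:
            f.write("<html><body>\n")
            for (url,) in cursor:
                safe = escape(url, quote=True)
                f.write(f'<a href="{safe}">{safe}</a><br/>\n')
            f.write("</body></html>\n")
    finally:
        conn.close()

if __name__ == "__main__":
    export_links()
```

Host the generated page on any internal web server, then create a new content source of type "Web Sites" in the Search service application, add the page's URL as a start address, and run a full crawl. Keeping the crawl settings shallow (e.g. limiting page depth so the crawler only follows links one hop from the start page) helps ensure only the listed URLs get indexed rather than everything they link to in turn. Note that this approach can't carry your per-item ACLs from the other table; security-trimmed results would still require the BCS route.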

Dan Gøran Lunde