
I am using MAPI tools (a Microsoft library for .NET) and then the Apache Tika libraries to process and extract the PST files from the Exchange server, which is not scalable.

How can I process/extract PST files the MapReduce way? Is there any tool or library available in Java that I can use in my MR jobs? Any help would be greatly appreciated.

The JPST library internally uses: PstFile pstFile = new PstFile(java.io.File)

And the problem is that the Hadoop APIs don't give us anything close to java.io.File.

The following option is always there, but it is not efficient:

  // Pull the PST out of HDFS onto the node's local disk so JPST can open it.
  File tempFile = File.createTempFile("myfile", ".tmp");
  fs.moveToLocalFile(new Path(<HDFS pst path>), new Path(tempFile.getAbsolutePath()));
  PstFile pstFile = new PstFile(tempFile);
Yogesh
  • To do it in a Map/Reduce way, you'll need to be able to split the PST file into small chunks, so individual nodes can process their part. I'm not sure the PST file format supports that though? – Gagravarr May 02 '12 at 16:29

2 Answers


Take a look at Behemoth (http://digitalpebble.blogspot.com/2011/05/processing-enron-dataset-using-behemoth.html). It combines Tika and Hadoop.

I've also written my own Hadoop + Tika jobs. The pattern is:

  1. Wrap all the PST files into sequence or Avro files.
  2. Write a map-only job that reads the PST files from the Avro files and writes them to the local disk (a sketch of steps 2-4 follows this list).
  3. Run Tika across the files.
  4. Write the output of Tika back into a sequence file.
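
For what it's worth, here is a minimal sketch of steps 2-4 collapsed into a single map-only job, assuming the PST files were packed into a SequenceFile as (file name, file bytes) pairs. The class name and counter names are illustrative, not from any particular library:

  import java.io.File;
  import java.io.FileOutputStream;
  import java.io.IOException;

  import org.apache.hadoop.io.BytesWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapreduce.Mapper;
  import org.apache.tika.Tika;

  // Map-only job: each input record is one PST file packed as (name, bytes).
  public class PstTikaMapper extends Mapper<Text, BytesWritable, Text, Text> {

    private final Tika tika = new Tika();

    @Override
    protected void map(Text fileName, BytesWritable fileBytes, Context context)
        throws IOException, InterruptedException {
      // Spill the PST bytes to the node's local disk, because the parsers
      // want a real java.io.File rather than an HDFS stream.
      File localCopy = File.createTempFile("pst-", ".pst");
      FileOutputStream out = new FileOutputStream(localCopy);
      try {
        out.write(fileBytes.getBytes(), 0, fileBytes.getLength());
      } finally {
        out.close();
      }
      try {
        // Run Tika over the local copy and emit the extracted text.
        context.write(fileName, new Text(tika.parseToString(localCopy)));
      } catch (Exception e) {
        // A corrupt PST should not kill the whole job.
        context.getCounter("pst", "parse-failures").increment(1);
      } finally {
        localCopy.delete();
      }
    }
  }

In the driver you would pair this with SequenceFileInputFormat, SequenceFileOutputFormat, and job.setNumReduceTasks(0) so no reduce phase runs.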

Hope that helps.

Karthik Ramachandran
  • How will an individual mapper recognize the input format as PST files, and how will it extract them? – Yogesh May 08 '12 at 12:57
  • @Yogesh When you place the files into a sequence file, wrap them in some data structure. I actually use Avro and simply add a header field with the mime-type (which I get from Tika) as part of the wrapping process. That first step is not an MR job, because of the small-files problem in Hadoop. I highly recommend you check out the Behemoth code; that's a good example to start from. – Karthik Ramachandran May 08 '12 at 16:09
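
To make that wrapping step concrete, here is a bare-bones sketch that packs local PST files into a single SequenceFile keyed by file name. The answer above uses Avro with a mime-type header, so treat the class name and the SequenceFile choice here as illustrative only:

  import java.io.File;
  import java.io.IOException;
  import java.nio.file.Files;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  import org.apache.hadoop.io.BytesWritable;
  import org.apache.hadoop.io.SequenceFile;
  import org.apache.hadoop.io.Text;

  // Packs many small PST files into one SequenceFile on HDFS so the MR job
  // is not hit by Hadoop's small-files problem. Run once, outside MapReduce.
  public class PstPacker {
    public static void main(String[] args) throws IOException {
      Configuration conf = new Configuration();
      FileSystem fs = FileSystem.get(conf);
      SequenceFile.Writer writer = SequenceFile.createWriter(
          fs, conf, new Path(args[0]), Text.class, BytesWritable.class);
      try {
        for (int i = 1; i < args.length; i++) {
          File pst = new File(args[i]);
          byte[] bytes = Files.readAllBytes(pst.toPath());
          // Key = original file name, value = raw PST bytes.
          writer.append(new Text(pst.getName()), new BytesWritable(bytes));
        }
      } finally {
        writer.close();
      }
    }
  }

The resulting SequenceFile is what the map-only job sketched above reads.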

It's not possible to process a PST file directly in a mapper. After long analysis and debugging, it turned out that the API is not exposed properly and that those APIs need a local file system to store the extracted PST contents; they can't write directly to HDFS. That's the bottleneck. And all of those APIs (the libraries that extract and process PSTs) are not free.

What we can do is extract outside HDFS and then process the extracted content in MR jobs.

Yogesh