
I am trying to push data to AWS S3. I used the example in (http://druid.io/docs/0.7.0/Tutorial:-The-Druid-Cluster.html) but modified common.runtime.properties as shown below:

druid.storage.type=s3
druid.s3.accessKey=AKIAJWTETHZDEQLHQ7AQ
druid.s3.secretKey=tcTtvGXcqLmmMbo2hRunzlSA1P2X0O0bjVf537Nt
druid.storage.bucket=testfeed
druid.storage.baseKey=sample

Below are the logs from the realtime node:

2015-03-02T15:03:44,809 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig@2edcd9d]
2015-03-02T15:03:44,843 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig@7939de8b]
2015-03-02T15:03:44,861 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig@bea8209]
2015-03-02T15:03:44,874 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-03-02T15:03:44,880 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-03-02T15:03:44,956 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig@276503c4]
2015-03-02T15:03:44,960 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig@360548eb]
2015-03-02T15:03:44,967 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig@75ba7964]
2015-03-02T15:03:44,971 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig@1ff2a544]
2015-03-02T15:03:44,984 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig@58d3f4be]
2015-03-02T15:03:44,990 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig@5fd11499]

1 Answer


I found the issue. I had missed the S3 extension in common.runtime.properties. Once that was added, data started getting pushed to S3.
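
For reference, the relevant lines in common.runtime.properties look roughly like this after the fix (the extension version here is an assumption and should match your Druid release, e.g. 0.7.0 for the tutorial linked above; the keys are placeholders):

# Load the S3 extension so the S3 segment pusher is used instead of the local one
# (version is assumed; change it to match your Druid release)
druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.7.0"]

# S3 deep storage settings (bucket/baseKey from the question; keys are placeholders)
druid.storage.type=s3
druid.s3.accessKey=<your access key>
druid.s3.secretKey=<your secret key>
druid.storage.bucket=testfeed
druid.storage.baseKey=sample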

  • Can you post a snippet illustrating what that looks like? – user766353 Mar 18 '15 at 22:32
  • Actually I didn't implement any code; I was following the example mentioned above. – user2611300 Mar 19 '15 at 12:58
  • @user766353 here's the line to add the s3 extension: `druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.6.173"]` I'm running druid version 0.6.173, make sure you change that to your version. – Anthony Elliott Jun 25 '15 at 15:29