
We are streaming a file from S3 and processing it; once processing completes we upload the file back to S3 as an Error/Archive file. While streaming the file from S3, processing stops partway through with the error "akka.http.scaladsl.model.EntityStreamException: Entity stream truncation". I am not sure whether this depends on the size of the file streamed from S3 or on a corrupt file.

val source = s3Client.download(baseConfig.bucketName.get, content.key)._1
  .via(Gzip.decoderFlow)
  .via(Framing.delimiter(ByteString("\n"), 256, byeFormatterFlag).map(_.utf8String))

val flow = flowDefintion(list)
val resp = source.via(flow).runWith(Sink.seq)
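
For reference, the same decompress-and-split pipeline can be reproduced locally with the minimal, self-contained sketch below. The in-memory gzipped bytes stand in for the S3 download, Akka Streams' built-in Compression.gunzip() is used in place of Gzip.decoderFlow, and the object and value names are just placeholders:

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Compression, Framing, Sink, Source}
import akka.util.ByteString

import scala.concurrent.Future

object GzipLineStreamSketch extends App {
  implicit val system: ActorSystem = ActorSystem("gzip-line-stream-sketch")
  implicit val materializer: ActorMaterializer = ActorMaterializer()
  import system.dispatcher

  // In-memory gzipped bytes stand in for the S3 download stream.
  val gzippedBytes: Source[ByteString, _] =
    Source.single(ByteString("line-1\nline-2\nline-3\n")).via(Compression.gzip)

  // Decompress, then split on newlines; allowTruncation = true keeps the
  // final frame even when the stream ends without a trailing delimiter.
  val lines: Future[Seq[String]] =
    gzippedBytes
      .via(Compression.gunzip())
      .via(Framing.delimiter(ByteString("\n"), maximumFrameLength = 256, allowTruncation = true))
      .map(_.utf8String)
      .runWith(Sink.seq)

  lines.foreach { result =>
    result.foreach(println)
    system.terminate()
  }
}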

akka {
  loglevel = "INFO"
  stdout-loglevel = "INFO"
  logging-filter = "akka.event.slf4j.Slf4jLoggingFilter"
  http {
    routing {
      decode-max-size = 25m
    }
    parsing {
      max-to-strict-bytes = 20m
      max-content-length = 20m
      max-chunk-size = 10m
    }
  }
}
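
To confirm these overrides are actually picked up at runtime (assuming they sit in an application.conf on the classpath), the effective values can be printed with Typesafe Config; the ConfigCheck object below is only an illustrative sketch:

import com.typesafe.config.ConfigFactory

object ConfigCheck extends App {
  private val config = ConfigFactory.load()
  // Print the effective values to verify the application.conf overrides are applied.
  println(config.getString("akka.http.parsing.max-content-length"))
  println(config.getString("akka.http.parsing.max-chunk-size"))
  println(config.getString("akka.http.routing.decode-max-size"))
}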

Learner
  • Are you streaming from & to the same bucket? – Ramón J Romero y Vigil Sep 28 '18 at 11:39
  • Yes, the same bucket, in a different folder. – Learner Sep 28 '18 at 13:04
  • @RamonJRomeroyVigil - Not sure of the root cause yet; sometimes the file gets processed and we don't see this error. Is this related to the amount of data we stream from S3? Does Akka HTTP have a limit on how much chunked data it can process? We are using the Alpakka S3 library for streaming files. – Learner Sep 28 '18 at 16:10
  • maybe you're bumping into the `max-content-length`: https://doc.akka.io/docs/akka-http/current/configuration.html – Ramón J Romero y Vigil Sep 28 '18 at 20:14
  • @RamonJRomeroyVigil - Updated the code in the original question; it seems this issue happens while decompressing the gzip file. – Learner Sep 29 '18 at 01:39
  • The file we process is 1.2 MB in gzip format. The stream worked with non-gzip input up to 100 MB; we have tested with a 100 MB CSV file. – Learner Sep 29 '18 at 07:26
  • @RamonJRomeroyVigil - Looks like the issue was resolved after adding this property: host-connection-pool { max-open-requests = 128 } (sketched below). – Learner Oct 01 '18 at 10:47
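
For context, the setting mentioned in the last comment sits under akka.http.host-connection-pool; alongside the parsing overrides above it would look roughly like this (a sketch, assuming the same application.conf):

akka {
  http {
    host-connection-pool {
      # Raise the cap on concurrent requests through the client connection pool.
      max-open-requests = 128
    }
  }
}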

0 Answers