
I have an app (Java, Spring Boot) that runs in a container on OpenShift. The application needs to read the logs of another application from a third-party server. How can this be done? Can I mount the directory where the logs are stored into the container, or do I need to use some protocol to access the file remotely and read it?

The remote server is a normal Linux server. It runs an old application packaged as a jar, which writes logs to a local folder. An application that runs in a pod (on Linux) needs to read this file and parse it.

Violetta

1 Answer


There are multiple ways to do this. If continuous access is needed:

  • A watcher with polled events (the `WatchService` API)
  • A stream buffer
  • A file observable with RxJava
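
The first option can be sketched as follows. This is a minimal example, not a full implementation: it assumes the log directory is already visible to the pod at some mount point (for instance an NFS-backed volume), and the method name, parameters, and timeout handling are my own. Note one caveat: on Linux, `WatchService` is backed by inotify, which generally does not report writes made by *another host* to an NFS mount, so for a remote share you may need to fall back to polling the file's size or modification time.

```java
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;
import java.util.concurrent.TimeUnit;

// Sketch: block until the named file inside logDir is created or
// modified, or until the timeout expires. The caller passes the
// directory (e.g. a volume mounted into the pod).
public class LogDirWatcher {

    public static boolean awaitChange(Path logDir, String fileName, long timeoutSeconds)
            throws IOException, InterruptedException {
        try (WatchService watcher = FileSystems.getDefault().newWatchService()) {
            logDir.register(watcher,
                    StandardWatchEventKinds.ENTRY_CREATE,
                    StandardWatchEventKinds.ENTRY_MODIFY);
            long deadline = System.nanoTime() + TimeUnit.SECONDS.toNanos(timeoutSeconds);
            while (true) {
                long remaining = deadline - System.nanoTime();
                if (remaining <= 0) return false;                  // timed out
                WatchKey key = watcher.poll(remaining, TimeUnit.NANOSECONDS);
                if (key == null) return false;                     // timed out
                for (WatchEvent<?> event : key.pollEvents()) {
                    Path changed = (Path) event.context();
                    if (fileName.equals(changed.getFileName().toString())) {
                        return true;  // the log file changed: re-read and parse it here
                    }
                }
                if (!key.reset()) return false;  // directory no longer watchable
            }
        }
    }
}
```

In a real service you would call this in a loop from a background thread and re-parse the file each time it returns `true`.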

For this approach, a good option is to create an NFS share that exposes the remote log directory and mount it into the pod as a persistent volume.
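
A minimal sketch of such a volume, assuming the Linux server exports the log folder over NFS; the server address, export path, and resource names below are placeholders you would replace with your own:

```yaml
# Expose the remote server's log directory as a read-only persistent volume.
apiVersion: v1
kind: PersistentVolume
metadata:
  name: remote-logs-pv
spec:
  capacity:
    storage: 1Gi
  accessModes:
    - ReadOnlyMany
  nfs:
    server: 10.0.0.15          # the Linux server exporting the log folder
    path: /var/log/legacy-app  # the folder the old application writes to
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: remote-logs-pvc
spec:
  accessModes:
    - ReadOnlyMany
  storageClassName: ""         # bind to the pre-created PV, not a dynamic class
  volumeName: remote-logs-pv
  resources:
    requests:
      storage: 1Gi
```

The pod then mounts `remote-logs-pvc` like any other volume, and the Spring Boot app reads the log as a local file.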

Otherwise, if the access is based on polling the logs at, for example, a certain time of day, then a solution consists of using FTP with a client such as Apache Commons Net's FTPClient, or an SSH client with an SFTP implementation such as JSch, which is a pure-Java library.
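
The SFTP variant can be sketched with JSch as below. This assumes the `com.jcraft:jsch` dependency is on the classpath; the host, credentials, remote path, and the `filterByLevel` helper are placeholders of my own, and disabling host-key checking is for illustration only (use `known_hosts` or key-based auth in production).

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

// Sketch: fetch a remote log file over SFTP and filter its lines.
public class RemoteLogReader {

    // Pure helper: keep only lines containing the given level, e.g. "ERROR".
    public static List<String> filterByLevel(List<String> lines, String level) {
        List<String> out = new ArrayList<>();
        for (String line : lines) {
            if (line.contains(level)) out.add(line);
        }
        return out;
    }

    public static List<String> fetchLog(String host, String user, String password,
                                        String remotePath) throws Exception {
        JSch jsch = new JSch();
        Session session = jsch.getSession(user, host, 22);
        session.setPassword(password);
        // Illustration only: accept any host key. Do not do this in production.
        session.setConfig("StrictHostKeyChecking", "no");
        session.connect(10_000);
        try {
            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect(10_000);
            try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                    sftp.get(remotePath), StandardCharsets.UTF_8))) {
                List<String> lines = new ArrayList<>();
                String line;
                while ((line = reader.readLine()) != null) lines.add(line);
                return lines;
            } finally {
                sftp.disconnect();
            }
        } finally {
            session.disconnect();
        }
    }
}
```

For scheduled polling, a `@Scheduled` method in the Spring Boot app could call `fetchLog` once a day and hand the result to the parser.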

Reda Salih
  • https://stackoverflow.com/questions/69494378/how-to-connect-from-spring-boot-application-to-nfs-server-directory-to-save-file. This is a related question; could you please help? – Syed Iftekharuddin Oct 08 '21 at 13:33