
I have a Sinatra app deployed to Heroku. There's a page where a user submits a form; the app writes some files to a temporary folder, then executes a shell command that consumes the files that were just written.

The problem is that when I execute the shell command, it doesn't have access to the files written to the temporary folder. The command is supposed to write another file back to that folder, but that file isn't visible to the Ruby code after the command has finished running.
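For context, the flow is roughly this (the file names and the `tr` command here are placeholders standing in for my actual files and tool):

```ruby
require 'tmpdir'

Dir.mktmpdir do |dir|
  input  = File.join(dir, 'input.txt')
  output = File.join(dir, 'output.txt')
  File.write(input, 'hello')

  # `system` blocks until the command exits and runs it as a child of
  # this process, so it sees the same filesystem (and the same dyno).
  ok = system("tr a-z A-Z < #{input} > #{output}")

  puts ok                  # true if the command exited with status 0
  puts File.read(output)   # => "HELLO"
end
```

Locally this works exactly as expected, which is what made me suspect the dynos in the first place.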

I think the web instance is running in one dyno, but the shell command gets executed in another dyno. Is there a way to force the shell command to be executed in the same dyno that the HTTP request/response is running in?


Edit: I did some more debugging and it seems like they are both running in the same dyno (`ENV['DYNO']` matches the output of `echo $DYNO`).

Teddy
  • When you run a Heroku bash session (i.e. `heroku run bash`), it starts a new dyno with its *own* ephemeral filesystem, which is why you don't see the files you were hoping for. Did you figure out a way to access them some other way? (I wanted to access the ephemeral filesystem too, for development/debug purposes. What I do for now is use system commands, in my case via PHP's `shell_exec`, e.g. with `ls` and `tail`.) – Fabien Snauwaert Aug 05 '17 at 09:37
  • I don't remember exactly what I did. I used Google Drive as a data store for PDF files in this project. I think your shell command needs to save files to some cloud-based destination such as AWS, Google Drive, FTP, etc. Then you can access the files from another Heroku instance. – Teddy Aug 06 '17 at 16:16

0 Answers