I have a fully functional Django web app running on my local Windows machine. However, I now need to deploy it on an AWS EC2 Windows Server.
- This is an "upload - process - download" type of application.
- Since the processing is quite heavy, I want to shift it to a Databricks notebook.
- So, the Databricks notebook should access the input file, process it, and save the output, which can then be downloaded through the web app.
My question is: can this be done?
I was thinking of triggering the notebook through a REST API request with the required parameters, but I couldn't find a way to do this directly. If I trigger the Databricks notebook from AWS Lambda, can I then trigger that Lambda through a REST API request?
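To make the idea concrete, here is a rough sketch of what I had in mind for the Lambda side, assuming the notebook is wrapped in a pre-created Databricks job and triggered via the Jobs API `run-now` endpoint. The workspace URL, token, and job ID are placeholders, and the event fields (`job_id`, `input_path`, `output_path`) are names I made up for illustration:

```python
import json
import urllib.request

DATABRICKS_HOST = "https://<workspace>.cloud.databricks.com"  # placeholder
DATABRICKS_TOKEN = "<databricks-token>"  # placeholder, keep in a secret store

def build_run_now_payload(job_id, input_path, output_path):
    """Build the request body for the Databricks Jobs API 2.1 run-now call.

    The notebook reads notebook_params via dbutils.widgets.get(...).
    """
    return {
        "job_id": job_id,
        "notebook_params": {
            "input_path": input_path,
            "output_path": output_path,
        },
    }

def lambda_handler(event, context):
    # event is expected to carry the file locations chosen by the Django app
    payload = build_run_now_payload(
        job_id=int(event["job_id"]),
        input_path=event["input_path"],
        output_path=event["output_path"],
    )
    req = urllib.request.Request(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {DATABRICKS_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # the response contains the run_id of the triggered notebook run
        return json.loads(resp.read())
```

The Django app (or an API in front of it) would invoke this Lambda with the chosen paths, then poll or wait for the run to finish before offering the download.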
Can both the input and output be saved to either DBFS or S3?
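If S3 is used for both files, the Django side could look roughly like this sketch, assuming boto3 is available on the server. The bucket name and key layout are hypothetical:

```python
BUCKET = "my-app-bucket"  # hypothetical bucket name

def build_object_keys(upload_id):
    """Derive the S3 keys for one upload/process/download cycle."""
    return {
        "input": f"uploads/{upload_id}/input.csv",
        "output": f"uploads/{upload_id}/output.csv",
    }

def upload_input(file_obj, upload_id):
    # boto3 imported lazily so the key helper above stays stdlib-only
    import boto3
    s3 = boto3.client("s3")
    keys = build_object_keys(upload_id)
    s3.upload_fileobj(file_obj, BUCKET, keys["input"])
    return keys

def presigned_download_url(upload_id, expires=3600):
    """Once the notebook has written the output, hand the user a temporary link."""
    import boto3
    s3 = boto3.client("s3")
    keys = build_object_keys(upload_id)
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": keys["output"]},
        ExpiresIn=expires,
    )
```

The `upload_id` would tie the uploaded input, the triggered notebook run, and the eventual output file together.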
If someone has worked on a similar activity, can you suggest a way to do it?
I am quite new to Databricks, so I am not aware of most of its functionality.
- Note: both input and output are .csv files. I understand this could be similar to other questions here, but I couldn't find this specific use case.
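To make the question concrete, here is roughly how I imagine the notebook side: the processing itself is pure Python on CSV text, and only the I/O is Databricks-specific. The filter in `process_rows` is just a stand-in for the real heavy processing:

```python
import csv
import io

def process_rows(rows):
    """Stand-in for the heavy processing: keep data rows whose
    first column is non-empty (the real logic would go here)."""
    header, *body = rows
    return [header] + [r for r in body if r and r[0]]

def process_csv_text(text):
    """Parse CSV text, run the processing, and return CSV text again."""
    rows = list(csv.reader(io.StringIO(text)))
    out = io.StringIO()
    csv.writer(out).writerows(process_rows(rows))
    return out.getvalue()

# On Databricks, the notebook would receive the paths passed by the
# job trigger and move the bytes between them, along the lines of:
#
#   input_path = dbutils.widgets.get("input_path")
#   output_path = dbutils.widgets.get("output_path")
#   text = spark.sparkContext.wholeTextFiles(input_path).first()[1]
#   dbutils.fs.put(output_path, process_csv_text(text), overwrite=True)
```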