
I have a Microsoft Distributed File System (DFS) hosted on an EC2 instance, and I want to read all the files from a specific path in that DFS. I want to achieve this through a Lambda function.

I do have the following information:

file_path: \\<DNS>\football\

service_account: username: ABC secret: 123

Permissions: My Lambda function has all the necessary permissions to read the DFS.

So how do I retrieve all the file names at that location using the Lambda function?

1 Answer


After a lot of trial and error, I was able to figure out a way to access any Windows file server, such as FSx or DFS, from a Lambda function (Python) alone using the smbprotocol library.

Note: The Lambda function needs full read/write access to the file server, and the file server must allow its clients to connect over the SMB protocol.

For more detail, see this insightful article: https://aws.amazon.com/blogs/storage/enabling-smb-access-for-serverless-workloads/
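To answer the original question (listing the file names under \\<DNS>\football\), a minimal sketch looks like the following. The server DNS name is a hypothetical placeholder; the username and secret come from the question, and the Lambda must be able to reach the file server over the network (SMB uses TCP port 445), as the article above describes.

```python
import smbclient  # from the smbprotocol library (pip install smbprotocol)

SERVER = "fileserver.example.com"  # hypothetical: replace with your <DNS>
USERNAME = "ABC"                   # service-account username from the question
PASSWORD = "123"                   # service-account secret from the question

def lambda_handler(event, context):
    # Authenticate to the file server once for this session.
    smbclient.register_session(SERVER, username=USERNAME, password=PASSWORD)

    # List every entry under \\<DNS>\football\
    share_path = rf"\\{SERVER}\football"
    file_names = smbclient.listdir(share_path)
    return {"files": file_names}
```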

Disclaimer: There is no direct way to move a file from an S3 bucket to the file server using only a Lambda function. Other AWS services can move data directly from S3 to the file server, but that is out of scope here. It may also be possible to mount the file server inside the Lambda function itself.

Procedure followed to copy a file from S3 to the file server using a Lambda function alone (a full sketch follows the steps below):

  1. Copy the file from the S3 bucket to Lambda's temporary storage, i.e. /tmp/.
  2. Perform the normal read/write file operations:

=> Open the file from the /tmp/ location in read mode (f1).

=> Open the same file on the remote file server using the smbclient module of the smbprotocol library (f2).

=> Write the content read from f1 into the file opened as f2 using the same smbclient.
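Putting the steps together, here is a minimal sketch of the whole procedure. The bucket name, object key, and server DNS name are hypothetical placeholders; the credentials are the ones from the question.

```python
import boto3
import smbclient  # from the smbprotocol library

SERVER = "fileserver.example.com"  # hypothetical: replace with your <DNS>
BUCKET = "my-source-bucket"        # hypothetical S3 bucket
KEY = "football/match.csv"         # hypothetical S3 object key

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # 1. Copy the file from the S3 bucket to Lambda's temporary storage.
    local_path = "/tmp/match.csv"
    s3.download_file(BUCKET, KEY, local_path)

    # 2. Authenticate to the file server, then stream the local copy (f1)
    #    into the remote file (f2) over SMB.
    smbclient.register_session(SERVER, username="ABC", password="123")
    remote_path = rf"\\{SERVER}\football\match.csv"
    with open(local_path, "rb") as f1, \
            smbclient.open_file(remote_path, mode="wb") as f2:
        f2.write(f1.read())
```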