Is there a way to print logs after each file is loaded?
While statement execution is non-interactive when run from a library, the Snowflake Python connector does support logging its execution work.
Here's a shortened snippet that incorporates the example from the link above:
# Assumes a 'con' object pre-exists and is connected to Snowflake already
import logging

for logger_name in ['snowflake.connector', 'botocore', 'boto3']:
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    ch.setFormatter(logging.Formatter('%(asctime)s - %(funcName)s() - %(message)s'))
    logger.addHandler(ch)

con.cursor().execute("put file:///Path/file_name* @stage_name")
# Optional, custom app log:
# logging.info("put command completed execution, exiting")
con.close()
Watching the output (on stderr) while this program runs will yield the following (filtered for just the upload messages):
~> python3 your_logging_script.py 2>&1 | grep -F "upload_one_file()"
[…]
2020-06-24 04:57:06,495 - upload_one_file() - done: status=ResultStatus.UPLOADED, file=/Path/file_name1, (…)
2020-06-24 04:57:07,312 - upload_one_file() - done: status=ResultStatus.UPLOADED, file=/Path/file_name2, (…)
2020-06-24 04:57:09,121 - upload_one_file() - done: status=ResultStatus.UPLOADED, file=/Path/file_name3, (…)
[…]
You can also configure the Python logger to write to a file and tail that file, instead of relying on stderr (via logging.StreamHandler) as done for simplicity above.
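As a minimal sketch of that variant, the same handler setup can be pointed at a file (the snowflake_put.log file name here is just a placeholder, not something the connector requires):

import logging

fh = logging.FileHandler('snowflake_put.log')  # placeholder file name
fh.setLevel(logging.DEBUG)
fh.setFormatter(logging.Formatter('%(asctime)s - %(funcName)s() - %(message)s'))

for logger_name in ['snowflake.connector', 'botocore', 'boto3']:
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(fh)

# Then, in another terminal:
#   tail -f snowflake_put.log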
If you need to restrict the logging to only specific messages, the Python logging module supports attaching your own filters that decide on each record emitted. The following filter keeps just the upload_one_file() function-call messages (use the record.message field instead if you want to filter on the log message text rather than on the function name used in the example below):
class UploadFilter(logging.Filter):
    def filter(self, record):
        # Only tests one condition, but you could chain conditions here
        return "upload_one_file" in record.funcName.lower()

for logger_name in ['snowflake.connector', 'botocore', 'boto3']:
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    ch = logging.StreamHandler()
    ch.setLevel(logging.DEBUG)
    ch.setFormatter(logging.Formatter('%(asctime)s - %(funcName)s() - %(message)s'))
    ch.addFilter(UploadFilter())
    # ch.addFilter(AnyOtherFilterClass())
    logger.addHandler(ch)
Note: If you change handlers (e.g. from stream to file), make sure you add the filter to the new handler too, and the new handler to the logger. You can read the Python logging tutorial to understand its mechanism better.
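As a sketch combining both points (a message-text filter plus a file handler; the UploadedMessageFilter name, the snowflake_put.log path, and the "uploaded" substring are only illustrative, based on the sample output above and possibly differing between connector versions):

import logging

class UploadedMessageFilter(logging.Filter):
    def filter(self, record):
        # record.getMessage() is the safe way to read the message text in a filter,
        # since record.message is only populated once a formatter has run
        return "uploaded" in record.getMessage().lower()

fh = logging.FileHandler('snowflake_put.log')  # placeholder file name
fh.setLevel(logging.DEBUG)
fh.setFormatter(logging.Formatter('%(asctime)s - %(funcName)s() - %(message)s'))
fh.addFilter(UploadedMessageFilter())          # the filter goes on the new handler...

for logger_name in ['snowflake.connector', 'botocore', 'boto3']:
    logger = logging.getLogger(logger_name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(fh)                      # ...and the new handler on each logger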