
I have a MongoDB database, and I want to measure how long it takes to query the database. My code so far:

from pymongo import MongoClient
import time

client = MongoClient('mongodb://localhost:27017') # open connection
db = client.customer

start = time.time()

# query the data in mongo
q = db.atweetdata.find()

stop = time.time()-start

print(stop)

The result shows 0.0. It might be because the time is so short/fast or something.

How can I time the process more precisely, down to the nanosecond? Thanks
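Note that `find()` in pymongo only builds a cursor and does not contact the server until the cursor is iterated, so the span being timed above may not include the actual query at all. As a rough illustration of that laziness, with a plain generator standing in for the cursor (no MongoDB needed; `fake_find` is a made-up stand-in for `db.atweetdata.find()`):

```python
import time

def fake_find():
    # stands in for db.atweetdata.find(): creating the generator is
    # nearly instant, because no items are produced yet
    def gen():
        for i in range(1_000_000):
            yield {"_id": i}
    return gen()

start = time.perf_counter()
cursor = fake_find()           # lazy: nothing has been fetched yet
build_time = time.perf_counter() - start

start = time.perf_counter()
docs = list(cursor)            # forces evaluation, like list(db.atweetdata.find())
fetch_time = time.perf_counter() - start

print(build_time, fetch_time)  # build_time is tiny compared to fetch_time
```

With real pymongo, wrapping `list(db.atweetdata.find())` (or a full `for` loop over the cursor) in the timer should give a non-trivial elapsed time.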

ytomo

1 Answer


Running a similar test, I have no problem with this issue:

>>> import time
>>> start = time.time(); end = time.time();
>>> print(start-end)
-2.09808349609375e-05
lelouchkako
  • Can you tell me what machine you use? Linux? Mac? Windows? – ytomo May 19 '18 at 09:30
  • @ytomo Maybe your question is related to the default precision of your print function. See https://stackoverflow.com/questions/1566936/easy-pretty-printing-of-floats-in-python – lelouchkako May 19 '18 at 09:30
  • @ytomo I am using OS X, Python 3.6.2 (Anaconda). – lelouchkako May 19 '18 at 09:30
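If nanosecond resolution is specifically what you want, Python 3.7+ adds `time.perf_counter_ns()` and `time.time_ns()`, which return integer nanoseconds and avoid float rounding entirely. A minimal sketch, with an arbitrary workload standing in for the database query:

```python
import time

start = time.perf_counter_ns()     # integer nanoseconds, monotonic clock
total = sum(range(100_000))        # any workload you want to time
elapsed_ns = time.perf_counter_ns() - start

print(elapsed_ns, "ns")            # always a positive integer, never 0.0
```

`perf_counter_ns()` is preferable to `time_ns()` for measuring intervals, since it uses a monotonic clock that cannot jump backwards.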