I am currently trying to develop my own "automated" trading journal. I get the data from the Bybit API (https://bybit-exchange.github.io/docs/inverse/#t-introduction) and use the pybit library (https://github.com/verata-veritatis/pybit) to connect to it. I am trying to pull the closed P&L positions (https://bybit-exchange.github.io/docs/inverse/#t-closedprofitandloss).
I was able to connect to the Bybit API with some Python code.
The problem I am running into: the API request is limited to 50 results PER PAGE.
How can I iterate through all the pages and save this in ONE JSON file?
This is the code I am currently using:
```python
import json

import pandas as pd
from pybit import inverse_perpetual

# Unauthenticated testnet session (not used below)
session_unauth = inverse_perpetual.HTTP(
    endpoint="https://api-testnet.bybit.com"
)

# Authenticated mainnet session
session_auth = inverse_perpetual.HTTP(
    endpoint="https://api.bybit.com",
    api_key="",
    api_secret=""
)

# Fetch one page (max. 50 records) of closed P&L for BTCUSD
data = session_auth.closed_profit_and_loss(symbol="BTCUSD", limit=50)

# Dump the raw response to a JSON file
with open('journal.json', 'w', encoding='utf-8') as f:
    json.dump(data, f, ensure_ascii=False, indent=4)

# Round-trip through pandas to produce a CSV
df = pd.read_json(r"C:\Users\Work\PycharmProjects\pythonProject\journal.json")
df.to_csv(r"C:\Users\Work\PycharmProjects\pythonProject\journal.csv", index=None)
```
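One way I imagine the paging could work is to keep requesting pages until a page comes back short or empty, collect all records into one list, and dump that list once. This is only a sketch: the `page` parameter and the `result`/`data` fields in the response are assumptions based on the Bybit closed-P&L docs and would need to be verified against a live response. The loop itself is factored into a function that takes any page-fetching callable, so the pagination logic is independent of pybit:

```python
import json


def fetch_all_pages(fetch_page, page_size=50):
    """Collect records from a paged API into one list.

    fetch_page(page) must return the list of records for that page.
    We stop as soon as a page is empty or shorter than page_size,
    which means there is no further data.
    """
    records = []
    page = 1
    while True:
        batch = fetch_page(page) or []
        records.extend(batch)
        if len(batch) < page_size:  # short or empty page -> last page
            break
        page += 1
    return records


# With pybit this might look like the following (parameter name `page`
# and the response fields "result"/"data" are assumptions from the docs):
#
# def fetch_page(page):
#     r = session_auth.closed_profit_and_loss(symbol="BTCUSD", limit=50, page=page)
#     return r["result"]["data"]
#
# all_trades = fetch_all_pages(fetch_page)
# with open("journal.json", "w", encoding="utf-8") as f:
#     json.dump(all_trades, f, ensure_ascii=False, indent=4)
```

Stopping on a short page avoids an extra empty request when the last page is exactly full is not guaranteed, so the empty-page case is also handled.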
I left the api_key and api_secret empty because this is confidential information.
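As a side note, `pd.read_json` on the raw response may not give a clean table, because the endpoint returns a nested dict rather than a flat list. A sketch of building the CSV from a list of record dicts instead (the field names here are hypothetical examples modelled on the Bybit docs, not verified output):

```python
import pandas as pd

# Hypothetical records shaped like entries from the closed-P&L endpoint
records = [
    {"symbol": "BTCUSD", "side": "Buy", "qty": 100, "closed_pnl": 0.00021},
    {"symbol": "BTCUSD", "side": "Sell", "qty": 50, "closed_pnl": -0.00008},
]

df = pd.DataFrame(records)  # one row per closed position
df.to_csv("journal.csv", index=False)
```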