
This question relates to the trading exchange Poloniex.com, where I am using their public API (https://poloniex.com/support/api/), specifically the returnChartData endpoint, via a Python wrapper.

I have a list of all the altcoins (alternate coins) listed on Poloniex, something like this: Altcoins = ['BTC_ETH','BTC_ZEC','BTC_XMR','BTC_LTC','BTC_ETC','BTC_BTS','BTC_GNT','BTC_XRP','BTC_FCT','BTC_SC','BTC_DCR','BTC_DASH',.....] (it should have around 80-100 altcoins).

The returnChartData function returns trading and pricing data for a given currency pair at candle intervals ranging from 5 minutes to a week, so basically it is a historical-data API. I want to use the 4-hour candle data (period=14400), which I wish to fetch every 4 hours for all the altcoins at once.

This is what I wish to do:

1. Use the Poloniex public API to fetch historical data for all the altcoins (around 100) every 4 hours.
2. Create a variable named after each altcoin, so around 80-90 variables.
3. Store each altcoin's data in its respective variable.
4. Build a pandas DataFrame from each variable and perform trading analysis.
5. Repeat the process every 4 hours (of course I need not create the variables again and again).

So is there any way to run one or two loops every 4 hours to solve this, or should I run the 80-100 calculations individually?

Here is where the API wrapper is taken from: https://github.com/s4w3d0ff/python-poloniex. Here is the sample code for running one calculation at a time:

    from poloniex import Poloniex, Coach
    import pandas as pd

    myCoach = Coach()
    public = Poloniex(coach=myCoach)

    # Below is the code for a single altcoin, but I wish to perform
    # the same process on the whole gamut.
    eth = public.returnChartData('BTC_ETH', period=14400)  # save the data to a variable
    eth = pd.DataFrame(eth)

The above code gives me what I want, but please understand how I could write the same piece for 100 altcoins and run them every 4 hours. What if I want to run it every 5 minutes? It would be cumbersome.

This is what I tried to solve the problem:

    from poloniex import Poloniex, Coach
    import pandas as pd
    myCoach = Coach()
    public = Poloniex(coach=myCoach)
    coinlist = ['BTC_ETH', 'BTC_ZEC', 'BTC_XMR', 'BTC_LTC', 'BTC_ETC', 'BTC_BTS', 'BTC_GNT', 'BTC_XRP', 'BTC_FCT', 'BTC_SC', 'BTC_DCR', 'BTC_DASH']
    for i in coinlist:
        altcoins = public.returnChartData(i, period=14400)

The above attempt gives me only the data of the last altcoin in the list, i.e. BTC_DASH. I think it is overwriting the data on each iteration until it reaches the end.
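The overwriting happens because the loop assigns every response to the same `altcoins` variable. Instead of ~100 named variables, the usual fix is one dict of DataFrames keyed by pair name. A minimal sketch of that pattern, where `fetch_chart_data` is a stand-in for the real `public.returnChartData` call (the sample candle values are made up):

```python
import pandas as pd

coinlist = ['BTC_ETH', 'BTC_ZEC', 'BTC_XMR']

def fetch_chart_data(pair, period=14400):
    # Stand-in for public.returnChartData(pair, period=period);
    # the real call returns a list of candle dicts like this one.
    return [{'date': 1405699200, 'close': 0.03, 'high': 0.031, 'low': 0.029}]

# One dict of DataFrames instead of ~100 named variables:
frames = {pair: pd.DataFrame(fetch_chart_data(pair)) for pair in coinlist}
```

You then access any coin's data as `frames['BTC_ETH']`, and the dict can be rebuilt every cycle without redeclaring anything.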

Can you guys help out, please?

2 Answers

So is there any way that I use and run one or two loops every 4 hours to solve this issue 

Just a quick thought: yes, there is a way to run two loops every 4 hours. Record a timestamp when you start, and if the current time is >= timestamp + 4h, run the loops and reset the timestamp.
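That timestamp scheme can be sketched as follows; `run_forever` and `is_due` are hypothetical helpers, and the actual fetching/analysis work would go in the `cycle` callable:

```python
import time

INTERVAL = 4 * 60 * 60  # 4 hours in seconds

def is_due(now, next_run):
    """True once the current time has reached the next scheduled run."""
    return now >= next_run

def run_forever(cycle, interval=INTERVAL):
    """Call cycle() every `interval` seconds, anchored to a timestamp."""
    next_run = time.time()
    while True:
        if is_due(time.time(), next_run):
            cycle()
            next_run += interval  # reset the timestamp for the next slot
        time.sleep(60)            # poll once a minute; avoids busy-waiting
```

Changing `interval` to 300 gives the 5-minute variant with no other code changes.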

or should I run individual 80-100 calculations individually?

Get more hardware, and think about multiprocessing/multithreading to parallelise the operations.
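Since the work here is I/O-bound (waiting on HTTP responses), a thread pool is usually enough. A sketch with the standard library's `ThreadPoolExecutor`, where `fetch` stands in for the real API call:

```python
from concurrent.futures import ThreadPoolExecutor

coinlist = ['BTC_ETH', 'BTC_ZEC', 'BTC_XMR', 'BTC_LTC']

def fetch(pair):
    # Stand-in for public.returnChartData(pair, period=14400);
    # returns the pair name alongside its (fake) candle list.
    return pair, [{'date': 1405699200, 'close': 0.03}]

# A small pool keeps the total request rate well under the API limit.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(fetch, coinlist))
```

`results` maps each pair to its candle data, ready to be turned into DataFrames.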


Try to store only what you need, and take care to use the start and end parameters when getting chart data from the Poloniex API.


From https://poloniex.com/support/api/, you can see that:

- returnChartData

Returns candlestick chart data. Required GET parameters are "currencyPair", "period" (candlestick period in seconds; valid values are 300, 900, 1800, 7200, 14400, and 86400), "start", and "end". "Start" and "end" are given in UNIX timestamp format and used to specify the date range for the data returned. [...]

Call: https://poloniex.com/public?command=returnChartData&currencyPair=BTC_XMR&start=1405699200&end=9999999999&period=14400
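Building such a call yourself makes the role of start and end explicit. A small sketch that only constructs the request URL (the `chart_data_url` helper is hypothetical; actually sending the request is left out):

```python
import time
from urllib.parse import urlencode

BASE = "https://poloniex.com/public"

def chart_data_url(pair, period, start, end):
    """Build a returnChartData request URL with an explicit date range."""
    query = urlencode({
        'command': 'returnChartData',
        'currencyPair': pair,
        'period': period,  # candle width in seconds (300 ... 86400)
        'start': start,    # UNIX timestamps bounding the range
        'end': end,
    })
    return f"{BASE}?{query}"

# Last 24 hours of 4-hour candles for BTC_XMR:
now = int(time.time())
url = chart_data_url('BTC_XMR', 14400, now - 86400, now)
```

Narrowing start/end like this is what keeps each request small once your DB holds the history.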


The best method is to set up an external DB (MongoDB, TinyDB, etc.) to store the chart data and then keep it updated.

Assuming that your DB stays in sync with the real market data, you can do whatever you want with your local DB without any risk of overloading Poloniex or hitting their requests-per-minute limit.

Assuming that DB is functional:

  • you may first store all data, from the beginning of trading to now, for each supported pair and each candle window; this backfill will take a long time to process ...

    The 5M chart data for a coin with more than 3 years of trade activity can take very long to reinject into your DB (depending on the DB, CPU, ...)

  • at regular intervals - when it is required - and using a cronjob:

    - every 5 minutes for the 5M window

    - every 15 minutes for the 15M window

    - ...

    - every x minutes for the xM window

    you need to update the chart data for each available pair, with the start parameter set to the time of the last recorded candle for that pair and window in your DB, and the end parameter set to the current time (or to the Poloniex server's current time if you are not in the same time zone)

    Considering that each update will give you only 1 or 2 candles per request, the complete update pass will be short!

  • you may finally use multithreading & multiprocessing to speed up the global update procedure, but take care not to overload the Poloniex infrastructure (my advice: no more than 4 concurrent threads).
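The incremental-update step in the list above can be sketched as pure arithmetic; `candles_to_fetch` is a hypothetical helper, and reading the last stored candle time from the DB is omitted:

```python
def candles_to_fetch(last_stored_ts, now_ts, period):
    """Compute the `start` parameter for the next returnChartData request.

    last_stored_ts: UNIX time of the newest candle already in the DB
    now_ts:         current UNIX time
    period:         candle width in seconds (e.g. 14400 for 4h)
    """
    start = last_stored_ts + period  # first candle we do not have yet
    expected = max(0, (now_ts - start) // period + 1)
    return start, expected

# e.g. 4h window, last candle stored at t=14400, current time t=43200:
start, n = candles_to_fetch(14400, 43200, 14400)  # start=28800, n=2
```

Requesting from `start` onward returns only the one or two missing candles, which is exactly why each scheduled update stays cheap.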

A. STEFANI