     asks_price  asks_qty exchange_name_ask     bids_price  bids_qty exchange_name_bid
0      20156.51  0.000745          Coinbase      20153.28  0.000200          Coinbase
1      20157.52  0.050000          Coinbase      20152.27  0.051000          Coinbase
2      20158.52  0.000745          Coinbase      20151.28  0.000200          Kraken
3      20158.52  0.050000          FTX           20151.28  0.051000          Coinbase

I would like to group rows with the same price, add their quantities together, and concatenate the exchange names, like this:

     asks_price  asks_qty exchange_name_ask     bids_price  bids_qty exchange_name_bid
0    20156.51  0.000745          Coinbase      20153.28  0.000200          Coinbase
1    20157.52  0.050000          Coinbase      20152.27  0.051000          Coinbase
2    20158.52  0.050745          CoinbaseFTX   20151.28  0.051200       KrakenCoinbase

I received a good answer, but if the exchange name is the same on both rows I don't want it concatenated with itself (e.g. CoinbaseCoinbase). How can I do that?

Nathan

2 Answers


Group the dataframe by the price columns, then call sum with `numeric_only=False`. The solution assumes the data is in the order given in your sample; otherwise you need to handle the sorting logic yourself:

>>> df.groupby(['asks_price', 'bids_price']).sum(numeric_only=False)

                       asks_qty exchange_name_ask  bids_qty exchange_name_bid
asks_price bids_price                                                        
20156.51   20153.28    0.000745          Coinbase    0.0002          Coinbase
20157.52   20152.27    0.050000          Coinbase    0.0510          Coinbase
20158.52   20151.28    0.050745       CoinbaseFTX    0.0512    KrakenCoinbase

You can call reset_index() at the end if you want the prices as columns rather than as the index.
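A self-contained sketch of this approach (the DataFrame below is reconstructed from the question's sample data; note that `sum(numeric_only=False)` string-concatenates the exchange-name columns while adding the quantities):

```python
import pandas as pd

# Order-book sample from the question
df = pd.DataFrame({
    "asks_price": [20156.51, 20157.52, 20158.52, 20158.52],
    "asks_qty": [0.000745, 0.050000, 0.000745, 0.050000],
    "exchange_name_ask": ["Coinbase", "Coinbase", "Coinbase", "FTX"],
    "bids_price": [20153.28, 20152.27, 20151.28, 20151.28],
    "bids_qty": [0.000200, 0.051000, 0.000200, 0.051000],
    "exchange_name_bid": ["Coinbase", "Coinbase", "Kraken", "Coinbase"],
})

# Group on both price columns; sum() adds the numeric columns and
# concatenates the object (string) columns.
out = df.groupby(["asks_price", "bids_price"], sort=False).sum(numeric_only=False)
print(out)
```

`sort=False` preserves the first-seen order of the price levels, which matters if the asks are already ascending and the bids descending.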

ThePyGuy
  • Will the grouping disrupt the order, assuming the asks are ascending and the bids descending? – Nathan Sep 16 '22 at 11:34
  • @Nathan No, grouping itself won't disrupt it; it may just reorder the keys, for which you can pass `sort=False` to `groupby` if you have any concerns. It's the individual rows I'm talking about. – ThePyGuy Sep 16 '22 at 11:35
  • You didn't take care of the bid side, and you summed the bid_price (40302.56). – Nathan Sep 16 '22 at 11:37
  • @Nathan You should have clearly mentioned the columns; you just mentioned price, so I thought it was the `asks_price` you were talking about. Anyway, updating the solution. – ThePyGuy Sep 16 '22 at 11:38
  • Yes, but I showed both sides in my example. – Nathan Sep 16 '22 at 11:42
  • @Nathan No worries, the main idea is to use the key column. If you have distinct bid-price records for the same ask price, do you want a separate row? If yes, it should go to the key parameter; otherwise it should be aggregated. – ThePyGuy Sep 16 '22 at 11:44
  • 1
    Yes i dont want to aggreagate bid ask together . it's like two table separate but i want aggregate both side to know where i can buy/sell the best price with the best quantity – Nathan Sep 16 '22 at 11:51
  • I received a good answer, but if the exchange name is the same I don't want it concatenated with itself, e.g. CoinbaseCoinbase. How can I do that? – Nathan Sep 29 '22 at 08:52
  • 1
    For that, you need to combine the solution with `unique` , you can just ask it as a separate question I guess. – ThePyGuy Sep 29 '22 at 08:58
  • 1
    I asked a new question , waiting for answer .. tks – Nathan Sep 29 '22 at 09:04
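ThePyGuy's pointer about `unique` can be sketched like this (a hedged example, not code from the thread): instead of `sum`, pass per-column aggregations to `agg`, joining only the distinct exchange names in each group.

```python
import pandas as pd

# Small sample where an exchange repeats within one price level
df = pd.DataFrame({
    "asks_price": [20157.52, 20157.52],
    "asks_qty": [0.02, 0.03],
    "exchange_name_ask": ["Coinbase", "Coinbase"],  # same name twice
    "bids_price": [20152.27, 20152.27],
    "bids_qty": [0.01, 0.04],
    "exchange_name_bid": ["Kraken", "Coinbase"],
})

def dedup_join(names):
    # dict.fromkeys drops duplicates while keeping first-seen order
    return "".join(dict.fromkeys(names))

out = df.groupby(["asks_price", "bids_price"], sort=False).agg({
    "asks_qty": "sum",
    "exchange_name_ask": dedup_join,
    "bids_qty": "sum",
    "exchange_name_bid": dedup_join,
})
print(out)
# exchange_name_ask is "Coinbase" (not "CoinbaseCoinbase");
# exchange_name_bid is "KrakenCoinbase"
```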
df.groupby(['asks_price', 'bids_price'], as_index=False).sum(numeric_only=False)

    asks_price  bids_price  asks_qty    exchange_name_ask   bids_qty    exchange_name_bid
0   20156.51    20153.28    0.000745    Coinbase            0.0002      Coinbase
1   20157.52    20152.27    0.050000    Coinbase            0.0510      Coinbase
2   20158.52    20151.28    0.050745    CoinbaseFTX         0.0512      KrakenCoinbase
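A runnable sketch of this variant (sample data reconstructed from the question; `as_index=False` keeps the grouped prices as ordinary columns instead of a MultiIndex):

```python
import pandas as pd

df = pd.DataFrame({
    "asks_price": [20156.51, 20157.52, 20158.52, 20158.52],
    "asks_qty": [0.000745, 0.050000, 0.000745, 0.050000],
    "exchange_name_ask": ["Coinbase", "Coinbase", "Coinbase", "FTX"],
    "bids_price": [20153.28, 20152.27, 20151.28, 20151.28],
    "bids_qty": [0.000200, 0.051000, 0.000200, 0.051000],
    "exchange_name_bid": ["Coinbase", "Coinbase", "Kraken", "Coinbase"],
})

# as_index=False returns the group keys as regular columns,
# so no reset_index() is needed afterwards.
out = df.groupby(["asks_price", "bids_price"], as_index=False).sum(numeric_only=False)
print(out)
```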
Naveed
  • I received a good answer, but if the exchange name is the same I don't want it concatenated with itself, e.g. CoinbaseCoinbase. How can I do that? – Nathan Sep 29 '22 at 08:53