I'm trying to learn Rust by doing some data parsing and reworking some of my trading tools, but I got stuck pretty quickly.
I want to resample my data from 5 min to 15 min, and Polars seems to be able to do this in an optimized way.
This is my attempt so far. I manage to group the times from 5 min into 15 min, but I cannot wrap my head around how to apply this grouping to the other columns.
use polars::prelude::*;
use std::error::Error;

// Small helper to print the concrete type of a value.
fn type_of<T>(_: T) {
    println!("--- DATA TYPE: {}", std::any::type_name::<T>());
}

fn main() -> Result<(), Box<dyn Error>> {
    let path = "path/to/.csv";
    // Read the first 20 rows, parsing the "time" column as datetimes.
    let df = LazyCsvReader::new(path)
        .has_header(true)
        .with_parse_dates(true)
        .finish()?
        .fetch(20)?;

    // Group the 5 min rows into 15 min windows on the "time" column.
    let tt = df.groupby_dynamic(
        vec![], // leaving "by" empty; see below for what happens when I add a series here
        &DynamicGroupOptions {
            index_column: "time".into(),
            every: Duration::parse("15m"),
            period: Duration::parse("5m"),
            offset: Duration::parse("0s"),
            truncate: true,
            include_boundaries: false,
            closed_window: ClosedWindow::Left,
        },
    )?;
    //type_of(&tt);
    println!("{:?}", tt);
    Ok(())
}
OUTPUT
Series: 'time' [datetime[μs]]
[
2019-09-03 09:00:00
2019-09-03 09:15:00
2019-09-03 09:30:00
2019-09-03 09:45:00
2019-09-03 10:00:00
2019-09-03 10:15:00
2019-09-03 10:30:00
2019-09-03 10:45:00
2019-09-03 11:00:00
], [], Slice { groups: [[0, 1], [1, 1], [1, 4], [4, 4], [7, 4], [10, 4], [13, 4], [16, 4], [19, 1]], rolling: false })
As soon as I try to add a series to the "by" field (the first argument of groupby_dynamic), no resampling takes place; I only get the same series back that I put in.
The function outputs a Slice { groups: ... } value, which is of type polars_core::frame::groupby::proxy::GroupsProxy, but I don't know how I should handle it.
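For what it's worth, judging by the Debug output the return value looks like a three-element tuple (the truncated time keys, the "by" key series, and the group indices), so I can at least pull it apart. An untested sketch; the variable names are my own:

    // My reading of the Debug output above: (window keys, "by" keys, group indices).
    let (window_keys, _by_keys, groups) = tt;
    println!("{:?}", window_keys); // one timestamp per 15 min window
    println!("{:?}", groups);      // [first row index, group length] per window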
My Cargo.toml:
[dependencies]
polars = { version = "0.25.1", features = ["lazy"] }
My .csv file:
time,open,high,low,close,volume
2019-09-03 09:00:00,1183.9999,1183.9999,1183.9999,1183.9999,150
2019-09-03 09:30:00,1178.69,1180.69,1178.47,1178.47,5180
2019-09-03 09:35:00,1177.03,1180.6146,1176.0,1179.47,70575
2019-09-03 09:40:00,1180.6345,1186.89,1180.6345,1185.5141,37267
2019-09-03 09:45:00,1185.9,1186.43,1182.43,1182.47,20569
2019-09-03 09:50:00,1183.54,1184.0,1180.0,1181.96,20754
2019-09-03 09:55:00,1182.5,1186.0,1182.49,1184.83,20848
2019-09-03 10:00:00,1185.5,1185.59,1184.03,1185.145,18581
2019-09-03 10:05:00,1184.65,1184.65,1175.5,1175.86,27714
2019-09-03 10:10:00,1175.49,1176.5,1173.65,1175.47,21779
2019-09-03 10:15:00,1175.295,1177.42,1173.5,1173.68,13588
2019-09-03 10:20:00,1173.01,1176.3717,1173.01,1175.44,9853
2019-09-03 10:25:00,1175.7896,1178.985,1175.7896,1177.468,7866
2019-09-03 10:30:00,1178.05,1179.0,1176.0038,1178.72,11576
2019-09-03 10:35:00,1179.005,1179.005,1176.53,1177.0077,9275
2019-09-03 10:40:00,1177.18,1178.02,1176.0201,1178.02,8852
2019-09-03 10:45:00,1178.3,1182.5,1178.3,1181.7113,14703
2019-09-03 10:50:00,1181.74,1181.9952,1180.01,1181.738,10225
2019-09-03 10:55:00,1182.11,1183.428,1181.33,1183.428,7835
2019-09-03 11:00:00,1183.41,1184.665,1183.41,1184.24,9078
The next thing would be to get the first, last, max and min out of the grouped columns (yes, I'm trying to build OHLC candles), roughly as in the sketch below.
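For reference, this is the direction I imagine the lazy API is meant to be used for that, though I haven't gotten it to work. An untested sketch, assuming LazyFrame has its own groupby_dynamic and that first/max/min/last/sum are the right expression methods; I also suspect period should be "15m" rather than "5m" so each window spans the full 15 minutes:

    use polars::prelude::*;
    use std::error::Error;

    fn main() -> Result<(), Box<dyn Error>> {
        let path = "path/to/.csv";
        // Untested sketch: resample through the lazy API so the per-window
        // aggregations are plain expressions instead of a GroupsProxy.
        let ohlc = LazyCsvReader::new(path)
            .has_header(true)
            .with_parse_dates(true)
            .finish()?
            .groupby_dynamic(
                vec![], // no extra "by" keys, just the time windows
                DynamicGroupOptions {
                    index_column: "time".into(),
                    every: Duration::parse("15m"),
                    period: Duration::parse("15m"), // window length == step
                    offset: Duration::parse("0s"),
                    truncate: true,
                    include_boundaries: false,
                    closed_window: ClosedWindow::Left,
                },
            )
            .agg(vec![
                col("open").first(),
                col("high").max(),
                col("low").min(),
                col("close").last(),
                col("volume").sum(),
            ])
            .collect()?;
        println!("{:?}", ohlc);
        Ok(())
    }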
Happy for any help!