I wrote an R script which imports data from FCS files, performs clustering with FlowSOM, and I'd like to check it and visualize my data with t-SNE. It works well for small amounts of data, but when I try with a lot of data (around 4 million cells) it crashes when it reaches 310 000 cells, without any error message.
library(Rtsne)  # Barnes-Hut t-SNE

out_tsne <- Rtsne(data_Rtsne,
                  perplexity = perp,
                  initial_dims = 50,
                  max_iter = i,
                  pca = TRUE,
                  verbose = TRUE,
                  num_threads = 0)  # 0 = use all available cores
data_Rtsne is a 4 000 000 × 10 matrix.
With a smaller dataset (around 500 000 cells), it works without trouble, so it is probably a memory limit issue.
Can someone explain why I don't get any error message from R (it simply closes, and that's all), and why I can't see RAM saturation in Windows' Task Manager?
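To test the memory-limit hypothesis myself, one thing I tried is checking R's memory ceiling before calling Rtsne (a rough sketch; note that memory.limit() and memory.size() only work on Windows and were removed in R 4.2, so this assumes an older R version):

```r
# Check R's memory situation on Windows before running Rtsne
memory.limit()           # maximum memory R may use, in MB
memory.size(max = TRUE)  # peak memory obtained from the OS so far, in MB

# A rough lower bound for the input alone: a numeric (double) matrix of
# 4e6 rows x 10 columns takes 4e6 * 10 * 8 bytes, and Rtsne makes
# additional copies internally (plus the PCA step when pca = TRUE).
input_gb <- 4e6 * 10 * 8 / 1024^3
print(input_gb)  # roughly 0.3 GB for the raw matrix alone
```

The raw matrix itself is small, so if memory is the culprit it would have to come from Rtsne's internal workspace rather than the input.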
Thanks in advance for your help!