I have a list of embeddings: the list contains N inner lists, each holding M embeddings (tensors).
list_embd = [[M embeddings], [M embeddings], ...]
(Each embedding is a tensor of size (1, 512).)
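For reference, a minimal setup that reproduces this structure (the sizes n, m and the random tensors are just placeholders for the example):

import torch

n, m = 4, 10                                                   # example sizes only
list_embd = [[torch.randn(1, 512) for _ in range(m)] for _ in range(n)]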
What I want is to create a tensor of size (N, M), where each "cell" is one embedding.
I tried this with a numpy array:
import numpy as np

array = np.zeros((n, m))               # the shape has to be passed as a tuple
for i in range(n):
    for j in range(m):
        array[i, j] = list_embd[i][j]  # fails: a (1, 512) tensor can't be assigned to a scalar cell
But I still got errors.
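The only way I can see to literally keep one embedding per cell on the numpy side is an object array, something like the sketch below (my own guess, not something from the docs), but that gives an array of Python object references rather than one stacked tensor:

import numpy as np

# sketch: with dtype=object each cell can hold a whole (1, 512) tensor,
# but the result is an array of objects, not a single numeric tensor
array = np.empty((n, m), dtype=object)
for i in range(n):
    for j in range(m):
        array[i, j] = list_embd[i][j]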
In PyTorch I tried to concatenate all M embeddings into one tensor of size (1, M), and then concatenate all the rows. But when I concatenate two of those M embeddings along dim 1, I get a tensor of shape (1, 1024) instead of (1, 2).
import torch

final = torch.tensor([])
for i in range(n):
    interm = torch.tensor([])
    for j in range(m):
        # concatenating the (1, 512) embeddings along dim 0 grows interm to (m, 512)
        interm = torch.cat((interm, list_embd[i][j]), 0)
    # concatenating along dim 1 then widens the last dimension instead of adding a row
    final = torch.cat((final, interm), 1)
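The closest I have gotten to something that runs is the sketch below, using torch.cat per row and torch.stack over rows (assuming every inner list really has M embeddings of shape (1, 512)); it produces an (N, M, 512) tensor, but I'm not sure this is the right way to get "one embedding per cell":

import torch

# sketch: collapse each inner list of M (1, 512) tensors into an (M, 512) row,
# then stack the N rows into a single (N, M, 512) tensor
rows = [torch.cat(row, dim=0) for row in list_embd]   # each row -> (M, 512)
stacked = torch.stack(rows, dim=0)                    # -> (N, M, 512)
print(stacked.shape)                                  # e.g. torch.Size([4, 10, 512])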
Any ideas or suggestions? I need a matrix with one embedding in each cell.