Questions tagged [mellanox]

Mellanox Technologies Ltd. is an Israeli and American multinational supplier of computer networking products using InfiniBand and Ethernet technology.

40 questions
0
votes
1 answer

Cannot use SDP in Debian Squeeze

I have severe problems running SDP on Debian Squeeze. I'm using two machines, each with a Mellanox adapter. My /etc/modules looks like this: mlx4_ib # Mellanox ConnectX cards #ib_mthca # some mellanox cards #iw_cxgb3 # Chelsio T3 cards #iw_nes #…
Ulf • 387 • 1 • 5 • 18
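
A minimal sketch of the pieces an SDP setup on the OFED stack typically needs; the module names are from the standard OFED stack, and the application name is illustrative:

    # /etc/modules — load the ConnectX HCA driver plus the SDP module
    mlx4_ib   # Mellanox ConnectX cards
    ib_sdp    # Sockets Direct Protocol

    # run an unmodified TCP application over SDP via the libsdp preload shim
    LD_PRELOAD=libsdp.so ./my_tcp_app
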
0
votes
0 answers

Nothing but DHCP works after testing SR-IOV on Mellanox ConnectX-4 Lx

I was following the Proxmox guide for enabling PCIe passthrough and SR-IOV for my NIC, since I am running VyOS in a VM as a router. However, after undoing all the changes, the NIC is no longer working. The only thing that seems to be working is…
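
If undoing the host-side changes didn't help, the SR-IOV settings may have persisted in the adapter firmware. A sketch of checking and resetting them with mlxconfig from Mellanox MFT; the MST device path is illustrative and varies per host:

    mst start   # create the /dev/mst device nodes
    # query firmware-level SR-IOV settings
    mlxconfig -d /dev/mst/mt4117_pciconf0 query | grep -E 'SRIOV_EN|NUM_OF_VFS'
    # restore the firmware configuration to defaults, then cold-boot the host
    mlxconfig -d /dev/mst/mt4117_pciconf0 reset
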
0
votes
0 answers

Number of receive queues is double the number of cores on my server?

I'm using a Mellanox ConnectX-5 100 GbE NIC with Linux kernel 5.15, and my server has 32 cores (SMT disabled). But in /sys/class/net/<iface>/queues, the number of RX queues is 64, which is double the number of cores on my server. How do I fix this?
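
The mlx5 driver sizes its default channel count from the CPU topology, so a queue count that differs from the core count is not necessarily a fault; it can be inspected and pinned with ethtool. Interface name illustrative:

    # show current vs. maximum channel (queue) counts
    ethtool -l enp1s0f0
    # pin the combined channel count to the 32 physical cores
    ethtool -L enp1s0f0 combined 32
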
0
votes
0 answers

How does the mlx5 driver use the IOVA generated by the IOMMU as the DMA address?

I have enabled the IOMMU on the physical machine. I expect the RDMA NIC to use the IOVA allocated by the IOMMU module for DMA after enabling the IOMMU. However, in reality, the RDMA NIC does not use the IOVA for DMA. I found…
kuao • 1
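
One plausible explanation is the IOMMU running in passthrough (identity-map) mode, in which DMA addresses equal physical addresses and no IOVA translation occurs. A quick check, with mlx5_0 taken from the question:

    # iommu=pt on the command line, or a passthrough default in the kernel
    # config, means identity mapping for DMA
    grep -o 'iommu=[a-z]*' /proc/cmdline
    dmesg | grep -i 'default domain type'
    # confirm the NIC actually sits in an IOMMU group
    ls /sys/class/infiniband/mlx5_0/device/iommu_group/devices
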
0
votes
0 answers

Unable to clear virtual MAC address on Mellanox ConnectX-3 HP 546FLR

I recently had some problems setting up bond interfaces during deployment with Canonical MAAS. Two servers are stuck with the same MAC address as their virtual MAC address, and I have no idea where I can set or clear this. Within Ubuntu the virtual MAC is…
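
From within Ubuntu, a sketch of reverting an interface to its burned-in address; note that on HP FlexibleLOM adapters a virtual MAC is often imposed by the iLO/Virtual Connect profile rather than the OS, so an OS-level reset may not survive a reboot. Interface name illustrative:

    # compare the permanent (burned-in) MAC with the active one
    ethtool -P eno1
    ip link show eno1
    # set the active MAC back to the permanent address
    ip link set dev eno1 address "$(ethtool -P eno1 | awk '{print $3}')"
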
0
votes
1 answer

Only getting 25 Gb/s instead of 100 Gb/s from a Mellanox switch running Cumulus Linux

I have a Mellanox 100 Gb/s switch (running Cumulus Linux 4.1) that I use for connecting multiple servers, each with a Mellanox ConnectX-5 100 Gb/s card. These servers connect to the switch via DAC cables. While it is working, I am only able to get…
Jarmund • 535 • 2 • 6 • 17
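
On Cumulus Linux, a 100G port that comes up at 25 Gb/s is frequently a breakout or speed/autoneg setting. A sketch of the usual checks with NCLU; the port name is illustrative:

    # check negotiated speed, and look for 4x25G breakout entries
    net show interface swp1
    grep 4x25G /etc/cumulus/ports.conf
    # force the port to 100G with autoneg off, then apply
    net add interface swp1 link speed 100000
    net add interface swp1 link autoneg off
    net commit
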
0
votes
1 answer

K8s nodeLocalDns pod times out connecting to CoreDNS after upgrading base OS to Ubuntu 20.04 (ConnectX-4 card)

I have a Mellanox ConnectX-4 NIC on a k8s worker node, and it hosts a nodeLocalDns pod. The nodeLocalDns pod is timing out when trying to connect to the CoreDNS service on the k8s cluster. The same setup works on Ubuntu 18. Versions failing with k8s v1.13.5…
AhmFM • 119 • 5
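
On Ubuntu 20.04 kernels with overlay networks, a frequently reported cause of UDP/DNS timeouts is checksum offload of tunneled traffic on the NIC. A hedged thing to try rather than a confirmed fix for this report; interface name illustrative:

    # disable UDP tunnel offloads on the underlay NIC, then retest DNS
    ethtool -K enp59s0f0 tx-udp_tnl-segmentation off tx-udp_tnl-csum-segmentation off
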
0
votes
0 answers

InfiniBand adapter down

Edit: On CentOS 8.5, tried with Mellanox driver 4.9-4.1.7.0 (legacy) and 5.5-1.0.3.2. I am not able to get my InfiniBand adapter working. The output of ibstat states that it is down: CA 'mlx5_0' CA type: MT4123 Number of ports:…
Holger • 51 • 1 • 5
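
ibstat showing the port down usually means either no physical link or no subnet manager on the fabric. A sketch of the usual checks, with the CA name taken from the question; the opensm service assumes the Mellanox OFED or distro RDMA packages:

    # physical and logical state of port 1 on the HCA
    ibstat mlx5_0 1
    # a subnet manager must run somewhere on the fabric; on a back-to-back
    # or unmanaged-switch setup, start opensm on one host
    systemctl enable --now opensm
    # re-check: state should move Down -> Initializing -> Active
    ibstat mlx5_0 1
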
-1
votes
1 answer

SFP+ Connector Won't Latch

We're trying to connect a Cisco Meraki MS225-48 switch to a Mellanox SN2100 switch using this breakout cable. However, the SFP+ end of the cable does not physically latch into the Meraki SFP+ ports (i.e. they don't "click" into place), and even with…
Whimsical Seaplane • 145 • 2 • 2 • 8
-1
votes
1 answer

Very poor performance on Kubernetes with 100GbE network

We are using ConnectX-5 100 GbE Ethernet cards on our servers, which are connected to each other through the Mellanox switch, and we are using the Weave Net CNI plugin on our Kubernetes cluster. When we run some tests using the iperf tool with the following…
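
To separate NIC and switch limits from CNI overhead, it helps to baseline iperf on the host network before testing pod-to-pod; Weave Net's encapsulation and MTU settings commonly cap throughput well below line rate. Addresses illustrative:

    # on server A, host network
    iperf3 -s
    # on server B: several parallel streams, since one stream rarely fills 100GbE
    iperf3 -c 10.0.0.1 -P 8 -t 30
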