In a joint effort with NVIDIA, testing performed in Mellanox's performance labs used Mellanox HDR InfiniBand Quantum switches to connect four host systems, each with eight NVIDIA V100 Tensor Core GPUs linked by NVLink interconnect technology and a single ConnectX-6 HDR adapter per host. The configuration achieved an effective reduction bandwidth of 19.6GB/s by integrating SHARP's native streaming aggregation capability with NVIDIA's latest NCCL 2.4 library, which now takes full advantage of the bi-directional bandwidth available from the Mellanox interconnect.
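From the application's point of view, the SHARP in-network aggregation is transparent: it is engaged at the fabric and plugin level, while the program simply issues ordinary NCCL collective calls. As a rough illustration only, not Mellanox's or NVIDIA's benchmark code, a minimal single-process all-reduce across the GPUs on one host might look like the following sketch, assuming CUDA and NCCL 2.4 or later are installed:

    /* Sketch: single-process all-reduce over all local GPUs with NCCL.
     * Illustrative only; the SHARP streaming aggregation described above
     * is enabled in the fabric, not in application code. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <cuda_runtime.h>
    #include <nccl.h>

    int main(void) {
        int ndev = 0;
        cudaGetDeviceCount(&ndev);                 /* e.g. 8 V100s per host */

        ncclComm_t   *comms   = malloc(ndev * sizeof(ncclComm_t));
        cudaStream_t *streams = malloc(ndev * sizeof(cudaStream_t));
        float       **buf     = malloc(ndev * sizeof(float *));
        const size_t  count   = 32 * 1024 * 1024;  /* 128 MB of floats */

        ncclCommInitAll(comms, ndev, NULL);        /* one rank per local GPU */

        for (int i = 0; i < ndev; i++) {
            cudaSetDevice(i);
            cudaStreamCreate(&streams[i]);
            cudaMalloc((void **)&buf[i], count * sizeof(float));
        }

        /* Sum-reduce in place across all GPUs; grouping the calls lets
         * NCCL launch them from one thread without deadlocking. */
        ncclGroupStart();
        for (int i = 0; i < ndev; i++)
            ncclAllReduce(buf[i], buf[i], count, ncclFloat, ncclSum,
                          comms[i], streams[i]);
        ncclGroupEnd();

        for (int i = 0; i < ndev; i++) {
            cudaSetDevice(i);
            cudaStreamSynchronize(streams[i]);
            cudaFree(buf[i]);
            ncclCommDestroy(comms[i]);
        }
        free(buf); free(streams); free(comms);
        return 0;
    }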
"HDR InfiniBand delivers the best performance and scalability for HPC and AI applications, providing our users with the capabilities to enhance research, discoveries and product development," said Gilad Shainer, vice president of marketing at Mellanox Technologies.
The ConnectX family of Virtual Protocol Interconnect® adapters supports both InfiniBand and Ethernet, offers unmatched RDMA (Remote Direct Memory Access) features and capabilities, and future-proofs data center investments by supporting speeds of 10, 25, 40, 50, and 100Gb/s.
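RDMA on such adapters is commonly programmed through the libibverbs API. As a hedged sketch rather than vendor sample code, the following shows the first steps nearly every RDMA application takes: opening a device, allocating a protection domain, and registering a memory region that a remote peer may write into without involving the local CPU:

    /* Sketch: open an RDMA device and register memory with libibverbs.
     * Illustrative only; a real application would also create queue pairs
     * and exchange connection details with the remote side. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <infiniband/verbs.h>

    int main(void) {
        int num;
        struct ibv_device **devs = ibv_get_device_list(&num);
        if (!devs || num == 0) {
            fprintf(stderr, "no RDMA devices found\n");
            return 1;
        }

        struct ibv_context *ctx = ibv_open_device(devs[0]);
        struct ibv_pd *pd = ibv_alloc_pd(ctx);

        size_t len = 4096;
        void *buf = malloc(len);

        /* Register the buffer so the adapter can DMA into it; remote-write
         * access lets a peer place data here directly. */
        struct ibv_mr *mr = ibv_reg_mr(pd, buf, len,
                                       IBV_ACCESS_LOCAL_WRITE |
                                       IBV_ACCESS_REMOTE_WRITE);

        printf("registered %zu bytes, rkey=0x%x\n", len, mr->rkey);

        ibv_dereg_mr(mr);
        free(buf);
        ibv_dealloc_pd(pd);
        ibv_close_device(ctx);
        ibv_free_device_list(devs);
        return 0;
    }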
Mellanox's end-to-end FDR 56Gb/s InfiniBand solution provides the high-bandwidth, low-latency processing framework needed for the internal data network linking the two private clouds, the company added.
In addition, Mellanox has expanded its line of EDR 100Gb/s InfiniBand switch systems.
QLogic offers a comprehensive, end-to-end portfolio of InfiniBand networking products for HPC, including quad data rate (QDR) host channel adapters, QDR directors, edge switches, pass-thru modules and intuitive tools to install, operate, and maintain high-performance fabrics.
"By qualifying all of its adapter families with SUSE Linux Enterprise 11, QLogic is well-positioned to fully exploit the capabilities of this platform in Fibre Channel, iSCSI,
InfiniBand and FCoE environments."
InfiniGreen is also projected to typically consume only 0.9 watts per termination, about one-third of the power consumed by current InfiniBand cables (implying roughly 2.7 watts per termination today).
The single InfiniBand fabric enables connectivity from every server to every resource and eliminates the need for additional infrastructure.
Signal reliability is enhanced over the InfiniBand QDR data rate of 10 Gb/s per channel.
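For context, QDR signals at 10 Gb/s per lane, a standard 4X link aggregates four lanes, and QDR links use 8b/10b line encoding, so only 8 of every 10 bits on the wire carry payload. A small sketch of that arithmetic:

    /* Sketch: raw vs. effective data rate of an InfiniBand QDR 4X link.
     * 10 Gb/s signaling per lane, 4 lanes, 8b/10b line encoding. */
    #include <stdio.h>

    int main(void) {
        const double lane_gbps = 10.0;        /* QDR signaling rate per lane */
        const int    lanes     = 4;           /* standard 4X link width */
        const double encoding  = 8.0 / 10.0;  /* 8b/10b encoding efficiency */

        double raw  = lane_gbps * lanes;      /* 40 Gb/s on the wire */
        double data = raw * encoding;         /* 32 Gb/s usable payload */

        printf("raw: %.0f Gb/s, effective: %.0f Gb/s\n", raw, data);
        return 0;
    }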