Transcript
Mellanox Cloud and Database Acceleration Solution over Windows Server® 2012 SMB Direct
Increased Performance, Scaling and Resiliency
July 2012
Motti Beck, Director, Enterprise Market Development
[email protected]
Microsoft Windows Server 2012 and SMB Direct
• New class of enterprise file storage
• Low latency, high throughput, low CPU overhead (50% lower versus Ethernet)
• Fibre Channel replacement at lower cost and higher performance
• Leverages the Mellanox Ethernet and InfiniBand RDMA support integrated into Windows Server 2012
• Accelerates Microsoft Hyper-V over SMB and SQL Server over SMB solutions
• 10X performance improvement over 10GbE*
[Diagram: file client and file server stacks (Application, SMB Client/SMB Server across the user/kernel boundary, NTFS/SCSI/disk on the server side) connected by RDMA adapters over a network with RDMA support]
*Preliminary results based on Windows Server 2012 RC
What is RDMA? Remote Direct Memory Access Protocol
• Accelerated IO delivery model that lets application software bypass most layers of software and communicate directly with the hardware
• RDMA benefits: low latency, high throughput, zero-copy capability, OS/stack bypass
• Mellanox RDMA-based interconnects: InfiniBand and RoCE (RDMA over Converged Ethernet)
[Diagram: client and file server stacks (SMB Client/SMB Server, SMB Direct, NDKPI, RDMA NIC) performing a memory-to-memory RDMA transfer over Ethernet or InfiniBand]
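A minimal verification sketch, assuming a Windows Server 2012 host with RDMA-capable Mellanox adapters installed: Python shells out to the built-in PowerShell cmdlets Get-NetAdapterRdma and Get-SmbClientNetworkInterface to confirm that RDMA is enabled on the adapters and that the SMB client sees RDMA-capable interfaces. The cmdlets are the standard Windows Server 2012 ones; the wrapper script itself is illustrative.

    # Sketch: confirm the host sees RDMA-capable NICs and that the SMB client
    # will consider them for SMB Direct (Windows Server 2012 cmdlets; the
    # Python wrapper is illustrative only).
    import subprocess

    def powershell(command: str) -> str:
        """Run a PowerShell command and return its text output."""
        result = subprocess.run(
            ["powershell", "-NoProfile", "-Command", command],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        # Adapters with RDMA enabled in the network stack (NDKPI providers)
        print(powershell("Get-NetAdapterRdma | Format-Table Name, Enabled"))
        # Interfaces the SMB client reports as RDMA-capable
        print(powershell("Get-SmbClientNetworkInterface"))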
SMB Direct over InfiniBand and RoCE
1. The application (Hyper-V, SQL Server) does not need to change; the API is unchanged.
2. The SMB client makes the decision to use SMB Direct at run time (a verification sketch follows this slide).
3. NDKPI provides a much thinner layer than TCP/IP.
4. Remote Direct Memory Access is performed by the network interfaces.
[Diagram: client and file server stacks with steps 1-4 annotated from Application through SMB Client/SMB Server, SMB Direct and NDKPI down to the RDMA NICs, which move data memory-to-memory over RoCE and/or InfiniBand]
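To observe the run-time decision in step 2, a minimal sketch (assuming a hypothetical \\fileserver\share and a Windows Server 2012 client): after generating some SMB traffic, Get-SmbMultichannelConnection reports per connection whether an RDMA-capable path was selected; disabling RDMA on the adapter forces the TCP/IP fallback for an A/B comparison.

    # Sketch: observe the run-time SMB Direct decision. The share name below
    # is hypothetical; the cmdlets are standard Windows Server 2012 ones.
    import shutil
    import subprocess

    SHARE = r"\\fileserver\share"   # hypothetical SMB 3.0 share

    def powershell(command: str) -> str:
        return subprocess.run(
            ["powershell", "-NoProfile", "-Command", command],
            capture_output=True, text=True, check=True).stdout

    if __name__ == "__main__":
        # Generate a little SMB traffic so there is a connection to inspect.
        shutil.copy("C:\\Windows\\notepad.exe", SHARE + "\\probe.bin")
        # Per-connection view: check the RDMA-capability columns.
        print(powershell("Get-SmbMultichannelConnection"))
        # To force the TCP/IP fallback for an A/B comparison, RDMA can be
        # turned off per adapter (and back on) with:
        #   Disable-NetAdapterRdma -Name "<adapter>"
        #   Enable-NetAdapterRdma  -Name "<adapter>"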
Mellanox End-to-End VPI Solution
Mellanox provides end-to-end InfiniBand and Ethernet connectivity solutions (adapters, switches, cables)
• Connecting data center servers and storage
Up to 56Gb/s InfiniBand and 40Gb/s Ethernet per port
• Low latency, low CPU overhead, RDMA
• InfiniBand-to-Ethernet gateways for seamless operation
Windows Server 2012 exposes the value of the Mellanox interconnect solution for storage traffic, virtualization and low latency
• InfiniBand and Ethernet (with RoCE) integration
• Highest efficiency, performance and return on investment
Measuring SMB Direct Performance
[Diagram: test setups: a single server running the IO micro-benchmark locally, and three SMB client/server pairs connected over 10GbE, InfiniBand QDR and InfiniBand FDR, with each SMB server backed by Fusion-io storage; the SMB clients run the same IO micro-benchmark (an illustrative invocation follows below)]
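As an illustration of the "IO micro benchmark" box, here is a hedged sketch of a block-size sweep driven by SQLIO (the tool the later virtualized test uses) against the SMB share. The sqlio.exe path, the target file and the exact values chosen are assumptions; the flags themselves (-kR, -fsequential, -b, -o, -t, -s, -BN, -LS) are standard SQLIO options.

    # Sketch: sweep block sizes with SQLIO against the SMB share.
    # Paths and parameter values are illustrative assumptions.
    import subprocess

    TARGET = r"\\fileserver\share\testfile.dat"   # hypothetical test file
    SQLIO = r"C:\Tools\sqlio.exe"                 # hypothetical install path

    def run_sweep():
        for block_kb in (8, 64, 512):             # small, medium, large IOs
            cmd = [SQLIO, "-kR", "-fsequential",  # sequential reads
                   f"-b{block_kb}",               # block size in KB
                   "-o8", "-t8", "-s30",          # queue depth, threads, seconds
                   "-BN", "-LS",                  # no buffering, latency stats
                   TARGET]
            print(" ".join(cmd))
            subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        run_sweep()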
Microsoft Delivers Low-Cost Replacement to High-End Storage
FDR 56Gb/s InfiniBand delivers 5X higher throughput with 50% less CPU overhead vs. 10GbE
[Chart: native throughput performance over FDR InfiniBand]
Measuring SMB Direct Performance in a Virtualized Environment
[Diagram: three test configurations running SQLIO: a single server with local storage, a file client (SMB 3.0) against a file server (SMB 3.0), and a Hyper-V host (SMB 3.0) running SQLIO inside a VM against the file server; clients and servers connect through multiple RDMA NICs, and each file server uses multiple RAID controllers attached via SAS to JBODs populated with SSDs]
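One way the Hyper-V leg of this setup is typically configured, sketched under the assumption of a hypothetical \\fileserver\share exported by the SMB 3.0 file server: the VM's virtual disk is created directly on the share, so the SQLIO traffic generated inside the guest crosses the RDMA fabric via SMB Direct. New-VM with -NewVHDPath and -NewVHDSizeBytes is the standard Windows Server 2012 Hyper-V cmdlet; the VM name and sizes are illustrative.

    # Sketch: place the test VM's VHDX on the SMB 3.0 share so guest IO
    # travels over SMB Direct. Names, sizes and the share path are assumptions.
    import subprocess

    def powershell(command: str) -> None:
        subprocess.run(["powershell", "-NoProfile", "-Command", command],
                       check=True)

    if __name__ == "__main__":
        powershell(
            "New-VM -Name 'sqlio-vm' "
            "-MemoryStartupBytes 4GB "
            "-NewVHDPath '\\\\fileserver\\share\\sqlio-vm.vhdx' "
            "-NewVHDSizeBytes 100GB")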
SMB Direct Performance in Virtualized Environment

Configuration | BW (MB/sec) | IOPS (512KB IOs/sec) | %CPU (Privileged) | Latency (ms)
Local         | 10,090      | 38,492               | ~2.5%             | ~3
Remote        | 9,852       | 37,584               | ~5.1%             | ~3
Remote VM     | 10,367      | 39,548               | ~4.6%             | ~3
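The takeaway can be checked with a few lines of arithmetic using only the numbers in the table above: remote and remote-VM bandwidth stay within roughly 3% of local, while privileged CPU roughly doubles yet stays near 5%.

    # Sketch: ratios computed directly from the table above.
    local_bw, remote_bw, vm_bw = 10_090, 9_852, 10_367    # MB/sec, 512KB IOs
    local_cpu, remote_cpu, vm_cpu = 2.5, 5.1, 4.6         # % privileged CPU

    print(f"Remote vs local BW:    {remote_bw / local_bw:.1%}")        # ~97.6%
    print(f"Remote VM vs local BW: {vm_bw / local_bw:.1%}")            # ~102.7%
    print(f"CPU cost, remote/local:    {remote_cpu / local_cpu:.1f}x") # ~2.0x
    print(f"CPU cost, remote VM/local: {vm_cpu / local_cpu:.1f}x")     # ~1.8x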
Microsoft’s Cluster in the Box (CiB) Reference Design
Mellanox VPI interconnect solutions: 10GbE, RoCE or InfiniBand
• At least one node and its storage remain available despite failure or replacement of any component
• Dual power domains
• Internal interconnect between nodes and controllers (1/10G Ethernet cluster connect)
• Flexible PCIe slot for LAN options
• External SAS ports for JBOD expansion
• Office-level power and acoustics for entry-level NAS
[Diagram: server enclosure with Server A and Server B; each CPU connects over x8 PCIe to its network adapter and storage controller, each storage controller connects over x4 SAS through the midplane to SAS expanders serving drive ports 0..23, and external A/B SAS ports connect to additional JBODs]
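A hedged sketch of how the CiB's shared JBOD capacity might be pooled and published over SMB 3.0 on Windows Server 2012, using the standard storage and SMB cmdlets (New-StoragePool, New-VirtualDisk, New-SmbShare); the pool, disk and share names, the mirror resiliency setting, the S: mount point and the access group are all illustrative assumptions.

    # Sketch: pool the CiB's shared SAS JBOD with Storage Spaces and publish
    # it as an SMB 3.0 share. Names and the S: path are assumptions.
    import subprocess

    def powershell(command: str) -> None:
        subprocess.run(["powershell", "-NoProfile", "-Command", command],
                       check=True)

    if __name__ == "__main__":
        # Group all unallocated (CanPool) disks into one storage pool.
        powershell(
            "New-StoragePool -FriendlyName 'CiB-Pool' "
            "-StorageSubSystemFriendlyName (Get-StorageSubSystem).FriendlyName "
            "-PhysicalDisks (Get-PhysicalDisk -CanPool $true)")
        # Carve a mirrored space out of the pool.
        powershell(
            "New-VirtualDisk -StoragePoolFriendlyName 'CiB-Pool' "
            "-FriendlyName 'CiB-Space' -ResiliencySettingName Mirror "
            "-UseMaximumSize")
        # Expose it to file clients over SMB 3.0 (assumes the new space has
        # already been initialized, partitioned and mounted as S:).
        powershell(
            "New-SmbShare -Name 'VMStore' -Path 'S:\\Shares\\VMStore' "
            "-FullAccess 'DOMAIN\\HyperVHosts'")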
Products Announced: X-IO Storage
More than 15GB/sec throughput demonstrated
• Remote storage systems
• Windows Server 2012 with SMB 3.0
• PCI Express 3.0 based servers (HP DL380p G8)
• Mellanox 56Gb/s FDR InfiniBand adapters and switches
Products Announced: Supermicro
More than 10GB/sec throughput demonstrated under Hyper-V
• Hyper-V with Windows Server 2012 RC
• Supermicro PCIe 3.0 based servers, file server and JBOD
• LSI MegaRAID SAS 9285 storage controllers with LSI FastPath I/O acceleration software
• OCZ Talos 2R SSDs
Summary
• Together with Microsoft, we deliver 10X performance acceleration for remote file servers in physical or virtual environments, boosting next-generation cloud and database applications.
• For the first time, we demonstrate record performance for a remote file server under Hyper-V: two FDR ports deliver more than 10GB/sec with less than 5% CPU overhead.
• The Mellanox RDMA interconnect, integrated with SMB Direct in Windows Server 2012, delivers the most cost-effective file server solution, replacing Fibre Channel and TCP/IP Ethernet.
• With the integrated Microsoft and Mellanox solution, a remote file server now delivers performance on par with native storage.
• Customers have already announced products built on Mellanox RDMA-enabled interconnects that demonstrate extreme performance.
Boost file server to a block storage performance level
THANK YOU