-
Fabric hardware envelope (reference example)
The Spectrum-4 based SN5600 class is positioned for high-density 800GbE and 400GbE topologies, which makes it relevant for building large two-tier AI fabrics with fewer points of compromise. High radix alone is not the whole answer, but it widens the design space for path diversity and failure-domain control. The exam implication is understanding why topology choices matter when synchronized traffic dominates.
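The scaling argument behind high radix can be made concrete with some back-of-the-envelope arithmetic. The sketch below computes the host capacity of a generic two-tier leaf-spine fabric as a function of switch radix; the port counts and the 1:1 oversubscription default are illustrative assumptions, not SN5600 specifications.

```python
# Sketch: host capacity of a two-tier (leaf-spine) fabric as a function of
# switch radix. Assumes a single spine plane where each spine port feeds
# exactly one leaf; numbers are illustrative, not product specs.

def two_tier_hosts(radix: int, oversubscription: float = 1.0) -> int:
    """Max hosts in a two-tier fabric at a given leaf oversubscription ratio.

    Each leaf splits its radix between uplinks (toward spines) and
    downlinks (toward hosts); the spine radix caps the leaf count.
    """
    uplinks = int(radix / (1 + oversubscription))  # leaf ports toward spines
    downlinks = radix - uplinks                    # leaf ports toward hosts
    max_leaves = radix                             # one spine port per leaf
    return downlinks * max_leaves

# A 64-port switch at 1:1 oversubscription: 32 downlinks * 64 leaves.
print(two_tier_hosts(64))       # 2048 hosts
# Doubling radix roughly quadruples two-tier capacity:
print(two_tier_hosts(128))      # 8192 hosts
```

This is why radix, not just per-port speed, drives how long a fabric can stay at two tiers before a third tier (and its extra hops and failure domains) becomes necessary.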
-
Core AI Ethernet mechanisms
The reference architecture emphasizes RoCE transport behavior, adaptive routing, and congestion-control decisions that can be tied directly to workload outcomes. Endpoint behavior and fabric behavior must be interpreted together rather than in isolation. In practice, the most useful operational view is end-to-end: host stack, switch policy, queue state, and job telemetry.
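One way to internalize the end-to-end view is to join job telemetry with fabric state per training step. The sketch below is a minimal illustration with hypothetical field names and thresholds; real deployments would populate these from NIC/switch counters and job logs.

```python
# Sketch: correlating job-level telemetry with fabric queue state.
# All field names and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class IterationSample:
    step: int
    collective_ms: float   # allreduce completion time from job telemetry
    max_queue_kb: float    # deepest switch egress queue seen this step

def flag_fabric_suspects(samples, latency_ms=50.0, queue_kb=512.0):
    """Return steps where a slow collective coincides with queue buildup,
    i.e., where the fabric (not the host stack) is the likely culprit."""
    return [s.step for s in samples
            if s.collective_ms > latency_ms and s.max_queue_kb > queue_kb]

samples = [
    IterationSample(1, 12.0, 40.0),    # healthy step
    IterationSample(2, 80.0, 900.0),   # slow step with congested queue
    IterationSample(3, 75.0, 30.0),    # slow step, clean fabric: look host-side
]
print(flag_fabric_suspects(samples))   # [2]
```

The point is the join itself: neither the host stack nor the switch counters alone can distinguish step 2 from step 3.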
-
Why entropy matters
Low-entropy traffic means a small number of large flows hash onto overlapping paths and repeatedly collide on the same links or queues. Static hashing may look acceptable in synthetic tests yet fail under the synchronized bursts typical of AI collectives. This is why path-distribution quality and adaptive behavior are central to Spectrum-X discussions: when entropy is poor, long-tail latency rises and collective-completion jitter shows up immediately in job metrics.
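The collision effect is easy to reproduce. The sketch below hashes synthetic 5-tuples onto 8 equal-cost paths and measures the share of traffic landing on the busiest path; Python's tuple hash is an illustrative stand-in for a switch's ECMP hash, and the flow counts are assumptions.

```python
# Sketch: few elephant flows under static hashing vs. many small flows.
# Fair share on 8 paths is 1/8 = 0.125 of the traffic per path.

import random
from collections import Counter

def max_path_load(num_flows: int, num_paths: int = 8, trials: int = 200) -> float:
    """Average fraction of total traffic on the most-loaded path."""
    total = 0.0
    for _ in range(trials):
        loads = Counter()
        for _ in range(num_flows):
            # Random 5-tuple stand-in; each flow carries equal traffic.
            flow = (random.getrandbits(32), random.getrandbits(32),
                    random.getrandbits(16), random.getrandbits(16), 17)
            loads[hash(flow) % num_paths] += 1
        total += max(loads.values()) / num_flows
    return total / trials

random.seed(1)
# 8 elephant flows: the busiest path usually carries 2-3x its fair share.
print(max_path_load(8))
# 800 small flows: load averages out close to the 0.125 fair share.
print(max_path_load(800))
```

With only a handful of flows, the busiest link routinely carries a multiple of its fair share, which is exactly the hotspot that turns into long-tail latency when every flow is part of the same collective.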
-
Vendor-reported benchmark trend
Vendor benchmark references show adaptive routing delivering better effective throughput and faster job completion than static load balancing under AI-like traffic. The exact percentage depends on topology, workload shape, and endpoint configuration, so treat benchmark numbers as directional rather than universal constants. What you should retain for exam and operations use is the mechanism-level reasoning: fewer hotspots and better path utilization tend to lower long-tail delays.
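That mechanism-level reasoning can be sketched directly. Below, a handful of equal-size flows are placed on 8 paths either by a static hash (modeled as a random path pick) or by choosing the currently least-loaded path, a deliberately crude stand-in for adaptive routing. The flow counts are assumptions and the output is not a vendor benchmark.

```python
# Sketch: why adaptive placement reduces the worst-case link load.
# Completion time of a synchronized collective tracks the hottest link.

import random

NUM_PATHS = 8
NUM_FLOWS = 16  # a few synchronized elephant flows, AI-collective style

def static_placement(flows):
    """Static hashing: each flow lands on an effectively random path."""
    loads = [0.0] * NUM_PATHS
    for size in flows:
        loads[random.randrange(NUM_PATHS)] += size
    return loads

def adaptive_placement(flows):
    """Crude adaptive routing: send each flow to the least-loaded path."""
    loads = [0.0] * NUM_PATHS
    for size in flows:
        loads[loads.index(min(loads))] += size
    return loads

random.seed(0)
flows = [1.0] * NUM_FLOWS
print("static  max-link load:", max(static_placement(flows)))
print("adaptive max-link load:", max(adaptive_placement(flows)))  # 2.0, the ideal
```

Adaptive placement pins the hottest link at exactly the fair share (16 flows / 8 paths = 2.0), while static hashing usually overloads some link; the slowest link sets collective completion time, which is why the benchmark trend points the direction it does.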