This video appeared in Electronic Design and has been published here with permission.
Check out our AUVSI Xponential 2023 coverage.
PCIe/104 is a compact, stackable architecture designed to handle high-speed, high-performance single-board computers (SBCs) and peripherals. The boards utilize one to three banks of sockets to connect one board to another in the stack. The interface incorporates PCI Express (PCIe), which is a point-to-point high-speed serial interface; the three-bank version supports a x16 PCIe interface. The peripheral stacks can be built up or down from the host.
Chris Miller, Director of Customer Care & Infrastructure at VersaLogic, discusses the company's latest Ethernet adapter (watch the video above) that works with the Sabertooth SBC built around the PCIe/104 standard (see figure).
VersaLogic’s EPM-E9 10G Ethernet expansion module is a PCIe/104 Type 1 board that provides a pair of 10G interfaces. It's a stack-down adapter that's designed to go under the SBC. A typical configuration from VersaLogic has the EPM-E9 under the six-core, Xeon E-based EPMe-51 Sabertooth SBC. The SBC can accommodate a 128-GB NVMe solid-state drive (SSD) with fast read/write performance. A heat-pipe system is tied to the large heatsink on top to keep the system cool.
The Ethernet adapter uses eight PCIe lanes (an x8 link) and passes the remaining eight lanes to any boards underneath it. This allows a pair of EPM-E9 10G boards to be included in a PCIe/104 stack, which would use up all 16 PCIe lanes. Many peripheral boards require fewer lanes; the count depends on the bandwidth a peripheral needs, which is low for interfaces like RS-232 serial ports or relay I/O.
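The lane budgeting described above can be sketched as a simple bookkeeping exercise. This is an illustrative sketch only, not VersaLogic software; the function and the per-board lane counts are assumptions drawn from the article:

```python
# Hypothetical sketch: track the x16 PCIe lane budget as each board in a
# PCIe/104 stack consumes lanes and passes the remainder down the stack.

def lanes_passed_down(total_lanes, boards):
    """Return (board, lanes_remaining_below) pairs for a stack.

    `boards` is a list of (name, lanes_used) tuples, ordered from the
    host SBC downward. Raises ValueError if the stack over-subscribes
    the lane budget.
    """
    remaining = total_lanes
    budget = []
    for name, used in boards:
        if used > remaining:
            raise ValueError(f"{name} needs x{used}, only x{remaining} left")
        remaining -= used
        budget.append((name, remaining))
    return budget

# Two EPM-E9 boards (x8 each) exhaust the x16 budget from the host SBC:
stack = [("EPM-E9 #1", 8), ("EPM-E9 #2", 8)]
print(lanes_passed_down(16, stack))
# → [('EPM-E9 #1', 8), ('EPM-E9 #2', 0)]
```

A third x8 board in the same stack would raise an error, which mirrors why lower-bandwidth peripherals (serial or relay boards) coexist more easily in deep stacks.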
The 10G interfaces can handle 1G, 2.5G, 5G, and 10GBASE-T connections. They support IEEE 1588/802.1AS Precision Time Protocol, the IEEE 802.3az Energy-Efficient Ethernet standard, and PXE network boot. Drivers are available for Windows and Linux.
Links
- VersaLogic
- Sabertooth single-board computer
The video transcript below has been edited for clarity.
Our Sabertooth CPU module is a PCIe/104 embedded computer with a six-core Xeon processor, up to 32 gigs of ECC RAM, and a 128-gig onboard NVMe SSD. In addition, it includes dual Gigabit Ethernet ports and comes in a passively cooled, heat-pipe variation so that users can integrate it into completely fanless designs.
Also, by using the x16 port, customers can expand the stack with add-ons like GPUs, Ethernet modules, even low-speed RS-232. And we also just announced this guy, which is our new 10-gig Ethernet offering: a dual 10-Gigabit Ethernet module with two RJ45 10-Gigabit ports. So there's no need for SFP sockets or SFP modules.
This plugs into that same x16 connector and then provides you with the two 10-Gigabit Ethernet ports. It comes in both an air-cooled version as well as a version that's integrated into the Sabertooth module, using those same integrated heat pipes up to the passive cooling plate, in a very dense, compact solution for customers. It's rated for -40 to +85°C operation with MIL-STD-202G shock and vibration, and backed by all of VersaLogic's warranty and long-term availability support. PC/104 in 2023 is kind of an interesting topic.
People seem to think it's an older standard, but we find that a lot of our customers are still integrating and designing even original PC/104-Plus designs. Folks that need lower-speed buses and lower power consumption are still picking ISA- and PCI-based PC/104-Plus boards for low-power applications where they don't need really high-performance CPUs, but they need something a little bit better than a microcontroller, for example.
We're still seeing a really robust ecosystem of add-on boards, both from ourselves, of course, and from others in the industry. But we also see that our competitors and others in the ecosystem are releasing more and more boards with the PCIe/104 or PCI/104-Express options. Those are usually used when you need a high-speed x16 or x8 PCIe connection for GPUs, Ethernet modules, or motion capture, that kind of thing.
But we are still seeing a pretty good mix. Most of our new designs on the higher end are opting for those higher-speed interconnects, like the ones that come on boards such as the Sabertooth. For all of our boards, even our stacking PC/104 products, we are starting to add Mini PCIe sockets and M.2 sockets in the near future because of what we call "last-mile I/O," where folks can't quite get everything they need in the stack.
They need that one last Ethernet port or another couple of serial ports, and being able to add those with very little impact on the stack size has been really beneficial to our customers. In addition, we've got things like 5G on the horizon with M.2 sockets coming in the next year, and we're going to need to be able to accommodate those for our customers moving forward.