
A new era of Storage and Memory solutions is transforming high performance computing

 

27th April - Valencia.

Why new storage and memory subsystems are important

The push for IoT and edge devices has driven a new level of complexity into embedded systems and the accompanying memory and storage subsystems, which were once optimised purely for cost and performance. Engineers working on high-performance computers must balance speed, performance, price and application-specific knowledge to drive the computing technology of the future.

With so many varieties of storage technology available, it can be quite bewildering for engineers to choose the right one.

We now see high-performance computing in the embedded space for applications that must handle massive amounts of data (for example, image processing). The need for higher speeds and lower-latency throughput led to the introduction of PCI. Developed by Intel Corporation, the Peripheral Component Interconnect (PCI) standard is an industry-standard high-speed bus found in nearly all desktop computers. PCI slots enabled a wide variety of expansion cards to be installed, including graphics or video cards, sound cards and network cards.

 

How PCIe interfaces provide lower latency, lower power usage and a smaller profile

PCI is a parallel interface on which all connected devices share the same bus, which can sometimes result in delays when the available bandwidth becomes congested. Embedded hardware needed higher speeds and lower latency to meet the challenges of the growing use of GPUs, FPGAs, I/O and connection points within cloud ecosystems as embedded systems embraced IoT.

PCI Express (PCIe) was therefore introduced to support higher data throughput while using less power and taking up less space. Because it is a serial, point-to-point interface that gives each device its own dedicated link, it provides lower latency and faster throughput. PCIe helps make today's laptops and computers smaller, yet powerful, portable and handy.
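As a rough illustration of what that dedicated link looks like in practice: on a Linux machine, the negotiated speed and lane width of each PCIe link are exposed through standard sysfs attributes, so a short script can list them. This is a minimal sketch; which devices and attributes appear depends on the kernel and hardware.

```python
# Minimal sketch: list the negotiated PCIe link speed and lane count per device
# using standard Linux sysfs attributes. Availability of these attributes
# depends on the kernel and hardware, so this is illustrative, not definitive.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    speed = dev / "current_link_speed"   # e.g. "16.0 GT/s PCIe" on a Gen 4 link
    width = dev / "current_link_width"   # number of lanes, e.g. "4"
    if speed.exists() and width.exists():
        print(f"{dev.name}: {speed.read_text().strip()} x{width.read_text().strip()}")
```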

PCIe 4.0 marks the final interface upgrade for the COM Express family of specifications, but the modular tradition will continue in a next-generation COM architecture from PICMG – COM-HPC.

COM-HPC addresses the high-performance end of the market by supporting higher power, higher performance CPUs, GPUs, and FPGAs, including server-class processors.

Highlighted as a major winner at this year's embedded world conference was Innodisk's nanoSSD 4TE3, which uses PCIe Gen 4 x4 technology, offering double the bandwidth of the previous generation and greatly improved performance.

 

The role of Memory

Because advanced application technologies generally rely on fast and efficient data processing, the role of memory is becoming increasingly crucial. Embedded memory is the memory an embedded device's processor uses to perform its functions and keep the device working; it is the component within the computer that allows for short-term data access. The memory might sit on the embedded device's system-on-chip, or it might be a separate component.

Embedded systems need a combination of memory that can quickly load and process data in real time and flash memory that retains saved information when power is unavailable. Memory comes in two forms: non-volatile (persistent) and volatile (live).

An important distinction between memory and storage is that memory clears when the computer is turned off.  

 

Embedded memory

Engineers use non-volatile memory in embedded systems to store code and other data that the device always needs, including after the system restarts. For example, non-volatile memory often holds configuration settings for the system.
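As a simple, hypothetical illustration of that idea, the sketch below assumes an embedded Linux device with a made-up flash-backed mount point and settings, and writes configuration to non-volatile storage so it is still there after a restart:

```python
# Hypothetical sketch: persist device configuration to non-volatile (flash-backed)
# storage on an embedded Linux system. The path and settings are assumptions made
# for illustration; write-to-temp, fsync, then atomic rename keeps the file intact
# even if power is lost mid-write.
import json
import os

CONFIG_PATH = "/mnt/flash/device_config.json"   # assumed flash-backed mount point

def save_config(settings: dict) -> None:
    tmp_path = CONFIG_PATH + ".tmp"
    with open(tmp_path, "w") as f:
        json.dump(settings, f)
        f.flush()
        os.fsync(f.fileno())            # push the data out to the flash device
    os.replace(tmp_path, CONFIG_PATH)   # atomically swap the new file into place

def load_config() -> dict:
    with open(CONFIG_PATH) as f:
        return json.load(f)

save_config({"sample_rate_hz": 1000, "log_level": "info"})
print(load_config())
```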

Primary memory is the main internal memory of a computer system, and the system's central processing unit (CPU) accesses it directly. Secondary memory usually takes the form of storage devices such as SSDs (solid-state drives) and HDDs (hard disk drives). However, those drives can take from tens of microseconds up to around ten milliseconds (millions of nanoseconds) to supply data to the CPU. That means in the time between the CPU finishing the data it is working on and receiving the next batch of information, it sits idle.

This shows that storage drives cannot keep up with the processor's speed. Computers solve the problem by using primary storage such as RAM, which is volatile memory. Although it cannot store data permanently, RAM is far faster than an SSD.

RAM serves the read requests issued by the CPU. Its low access time enables the CPU to receive data faster, allowing it to continuously crunch through information instead of waiting for the SSD to send another batch for processing.

Because of this architecture, programs are copied from the storage drive into RAM and then accessed by the CPU.
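To put those access times in perspective, here is a back-of-the-envelope sketch assuming a 3 GHz core and rough, order-of-magnitude latencies (not figures for any particular device) that shows how many clock cycles the CPU would spend waiting on each tier:

```python
# Back-of-the-envelope sketch: cycles a 3 GHz core spends waiting for one access
# to different tiers of the memory/storage hierarchy. Latencies are illustrative
# orders of magnitude, not measurements of any specific device.
CLOCK_HZ = 3_000_000_000            # assumed 3 GHz core

typical_latency_s = {
    "HDD seek":        10e-3,       # ~10 milliseconds
    "SSD random read": 100e-6,      # ~100 microseconds
    "DRAM access":     100e-9,      # ~100 nanoseconds
}

for name, latency in typical_latency_s.items():
    idle_cycles = int(CLOCK_HZ * latency)
    print(f"{name:15}: ~{idle_cycles:,} cycles spent waiting")
```

Even with generous assumptions, the gap between DRAM and a storage drive spans several orders of magnitude, which is why programs are staged into RAM before the CPU works on them.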

Volatile memory can also be found in chips on the same package substrate as the CPU and I/O ports. This arrangement is known as a system-on-chip (SoC) and is often used where board space is limited, such as in small devices.

Changing how RAM is connected to the computational units is redefining memory architecture and reshaping how memory systems are designed to make their performance more efficient.

A new era of unified memory products is emerging, targeting embedded systems in Industrial IoT, Process Automation and Control, FPGA Configuration, Aero/Avionics, Medical, and Gaming, and set to take over from other forms of non-volatile memory.

For instance, Everspin Technologies introduced a unique new memory solution at this year's Embedded World show, offering higher density with faster read and write speeds.

 

Choice of product is tricky and requires experts

Choosing the perfect combination of software and hardware components can be tricky. In an ideal world, components would be selected in isolation, guided by independent sets of criteria, and then assembled into an optimal solution for the application at hand. Unfortunately, there are several interdependencies between the various storage components, especially between the storage device and the file system. Proper device selection based on application requirements is therefore a complex task and needs highly skilled engineers to achieve a successful solution.

However, these embedded engineers are not easy to find as there is a global shortage of these skills. It requires experts in the field such as CIS who have over 20 years' experience of placing highly skilled engineers and teams in a great number of electronic organisations and projects. Make sure your next project is covered, contact CIS now on 0034 963 943 500 or info@cis-ee.com.