By Clarity Insights
Clarity Insights is a strategic partner to the nation's leading data-driven brands.

For machines loaded with so much memory, computers can be terribly forgetful. A simple shutdown is all it takes to lose every piece of information you loaded into them. Then it’s “Bye-bye. No one’s home.” You have to restart and reload all the data to get any work done. Persistent memory changes all of this. Persistent memory servers keep data in place even when the power is off, and they are likely to be featured in a lot of your upcoming hardware vendor pitches. But they’re about more than just hardware. Persistent memory is going to have an impact on software, data management and your analytics.


What Are Persistent Memory Servers?

What does it mean for memory to be “persistent”? Understanding persistent memory is easy once you grasp how regular memory works. Traditional computer memory (e.g., dynamic random-access memory, or DRAM) is volatile. While the computer is running, the memory stores whatever data is needed. When the computer powers down (deliberately or accidentally), the data drains out of memory.

In contrast, persistent memory holds its data regardless of the state of the computer. Say you’re running an AI-driven application that analyzes historical stock market trends to guide stock trading. Such an application benefits from a large in-memory capability: it might hold multiple terabytes (TB) of data in solid-state memory, enabling fast data input and output (I/O) and, in turn, fast application performance. If this app runs on a persistent memory server, a restart brings the app back to life much faster than is possible with regular memory. The data is still there. It persisted in memory.
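
To make that concrete, here is a minimal sketch of what “the data is still there” looks like to a programmer. It uses Intel’s open-source Persistent Memory Development Kit (PMDK) and its libpmem library; the file path and message are hypothetical, and a DAX-capable filesystem mounted at /mnt/pmem is assumed.

    /* Minimal persistence sketch using PMDK's libpmem.
     * Assumes a DAX filesystem at /mnt/pmem (hypothetical path).
     * Build with: cc persist_demo.c -lpmem */
    #include <libpmem.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        size_t mapped_len;
        int is_pmem;

        /* Map a file backed by persistent memory into our address space. */
        char *buf = pmem_map_file("/mnt/pmem/market-data", 4096,
                                  PMEM_FILE_CREATE, 0666,
                                  &mapped_len, &is_pmem);
        if (buf == NULL) {
            perror("pmem_map_file");
            return 1;
        }

        /* On a restart, whatever the previous run wrote is simply there. */
        if (buf[0] != '\0')
            printf("found data from last run: %s\n", buf);

        /* Write through the mapping like ordinary memory... */
        strcpy(buf, "latest market snapshot");

        /* ...then flush it to the persistence domain so it survives a
         * power cycle. Fall back to msync for non-pmem mappings. */
        if (is_pmem)
            pmem_persist(buf, mapped_len);
        else
            pmem_msync(buf, mapped_len);

        pmem_unmap(buf, mapped_len);
        return 0;
    }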


Why Persistent Memory Now?

Gartner forecasts that persistent memory will represent over 10% of in-memory computing memory (measured by GB consumption) by 2021. They make this prediction in their report, “Top 10 Data and Analytics Technology Trends That Will Change Your Business.” Why is the persistent memory trend happening now? In-memory computing is increasingly common, but until recently, limits on memory size and excessive cost made persistent memory impractical. This is now changing.

The analyst firm cites the example of Intel Optane DC Persistent Memory. A single Optane non-volatile DIMM (NVDIMM) offers byte-addressable memory in capacities up to 512GB. This capacity starts to make true persistent memory servers a reality. That said, it may take some time for database makers (e.g., Oracle and Microsoft) to adapt their products to take advantage of persistent memory.

It’s worth noting, though, that persistent memory is not envisioned as a complete replacement for DRAM. Today’s server workloads demand CPU performance, massive memory and fast storage. Historically, DRAM has been a reliable but costly byte-addressable memory, while the much cheaper, denser and slower nonvolatile NAND flash has served as block-addressable storage. Persistent NVDIMMs sit between the two: rather than replacing DRAM, they give software environments a new tier to optimize around.


Advantages of Persistent Memory

Persistent memory servers are expected to deliver a number of advantages over traditional, volatile memory. In addition to data persistence after a power disruption, projected benefits include:

  • Lower memory prices compared to DRAM
  • Cacheable memory
  • Lower data access latencies compared to flash Solid-State Drives (SSDs)
  • Increased throughput compared to flash storage
  • Real-time data access, including extremely rapid access to large datasets


Persistent Memory Server Use Cases

Persistent memory is expected to play a role in many different use cases. In financial services, for example, persistent memory is envisioned as allowing larger, more efficient virtualization environments. Given that financial firms are often space- and power-constrained, a jump in virtualized capacity is a big plus. Persistent memory also helps with desktop virtualization, a popular way to deliver desktops in finance.

Analytics, another mainstay of financial management, is likely to leverage persistent memory to increase performance. Equipped with persistent memory, data analytics solutions can keep more data in memory, which pushes up performance. Workloads with massive data loads, such as augmented analytics using machine learning algorithms, should see significant acceleration.
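
As a rough illustration of that in-memory effect, the sketch below scans a large array of prices directly out of a persistent memory mapping; nothing is copied into DRAM buffers first, and nothing needs reloading after a reboot. The dataset path and its layout (a flat array of doubles) are hypothetical.

    /* Sketch: analytics scan straight out of persistent memory (libpmem).
     * Assumes /mnt/pmem/prices.bin (hypothetical) already holds a flat
     * array of doubles written by an earlier ingest job. */
    #include <libpmem.h>
    #include <stdio.h>

    int main(void)
    {
        size_t mapped_len;
        int is_pmem;

        /* len = 0 and no flags: map the whole existing file as-is. */
        double *prices = pmem_map_file("/mnt/pmem/prices.bin", 0, 0, 0,
                                       &mapped_len, &is_pmem);
        if (prices == NULL) {
            perror("pmem_map_file");
            return 1;
        }

        size_t n = mapped_len / sizeof(double);
        double sum = 0.0;
        for (size_t i = 0; i < n; i++)  /* read-only scan at memory speed */
            sum += prices[i];

        printf("average of %zu prices: %f\n", n, n ? sum / n : 0.0);
        pmem_unmap(prices, mapped_len);
        return 0;
    }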

High availability (HA), also a critical element in financial systems, gets a boost from persistent memory. The essence of HA is the rapid “failover,” or switching from a system instance that’s down to a separate instance that takes over. Persistent memory speeds up this recovery because a restarted instance comes back with its data already in memory rather than reloading it from scratch.
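
The sketch below shows one way this can play out on a single node, using PMDK’s libpmemobj library: instead of rebuilding its state from scratch, a restarted instance reattaches to a persistent heap and resumes where it left off. The pool path, layout name and state structure are all hypothetical.

    /* Sketch: fast restart/recovery by reattaching to warm state
     * (PMDK libpmemobj). Pool path, layout and app_state are hypothetical.
     * Build with: cc restart_demo.c -lpmemobj */
    #include <libpmemobj.h>
    #include <stdio.h>

    #define POOL_PATH "/mnt/pmem/app-state"
    #define LAYOUT    "app-state-v1"

    struct app_state {
        unsigned long rows_processed;  /* state that stays warm */
    };

    int main(void)
    {
        /* Reattach to an existing pool; create one only on the first run. */
        PMEMobjpool *pop = pmemobj_open(POOL_PATH, LAYOUT);
        if (pop == NULL)
            pop = pmemobj_create(POOL_PATH, LAYOUT, PMEMOBJ_MIN_POOL, 0666);
        if (pop == NULL) {
            perror("pmemobj_open/create");
            return 1;
        }

        /* The root object survives restarts, so recovery is immediate. */
        PMEMoid root = pmemobj_root(pop, sizeof(struct app_state));
        struct app_state *state = pmemobj_direct(root);
        printf("resuming with %lu rows already processed\n",
               state->rows_processed);

        state->rows_processed += 1000;                /* do some work... */
        pmemobj_persist(pop, state, sizeof(*state));  /* ...make it durable */

        pmemobj_close(pop);
        return 0;
    }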


Getting Persistent Memory Working

For some applications, persistent memory is already demonstrating its value. However, it’s important to understand that the technology is not ready to go in all circumstances. In many cases, database vendors and other software makers still have to adapt their products.

There are configuration issues as well. The Optane product, for example, can be set up in two modes: Memory Mode, which turns the modules into one big “all or nothing” volatile memory pool, and App Direct Mode, which exposes persistence to specific applications. As hardware and software vendors talk up the merits of persistent memory, it’s worth bearing in mind that some work may be required to get it all running the way you want.


We have worked with companies that are assessing the potential for persistent memory in their applications. Success involves thinking through hardware, infrastructure management, software and data management requirements together. If you want to learn more about how persistent memory can help your systems run better, let’s talk.
