Specialized servers represent the future of data-center computing, HP Labs’ CTO claimed in his keynote at ARM TechCon in Santa Clara, California, this morning (29 October 2013). But HP is not expecting to design them – instead it is throwing open its doors to other companies to bring in the necessary hardware and low-level software.
Big changes will be needed in server design if large computer users are not to resort to desperate measures to deal with spiralling energy demands.
Martin Fink, CTO and director of HP Labs, recalled in his speech: “We’ve had some interesting customer conversations. One said they had to look at repositioning their server farms close to the Arctic Circle to take advantage of the natural cooling. Another interesting one was at an HP conference when a sales representative stood up and said ‘everything you are talking about is really cool and one of our customers would sign a blank cheque to get it’. Their problem today is that they can’t get more power from the city.”
“The problems feel like they are way out there. But they are happening to us today. Cloud computing is now using more power than entire countries,” Fink added. “We need that next generation of data center where you can handle the amount of data being generated. Based on this we created Moonshot,” Fink said.
Photo: HP Labs CTO Martin Fink shows off two partners' Project Moonshot processor cards
HP is not expecting to design its own processors for this new class of server. Instead, the company is inviting hardware companies to bring their own ideas, in the form of processor complexes or ‘cartridges’, to its Pathfinder Innovation Ecosystem and test them out in the Moonshot environment.
“We provide you with our discovery labs that allow you to test out your innovations and see how this works. We want you to shock us. We want you to create the cartridge of all cartridges. Create cartridges that no-one has thought of. Focus on what you can do best and deliver to the industry. That ecosystem is very critical to us,” Fink said.
Testing in progress
Several companies already have their own designs up and running within the Moonshot environment, including Calxeda, Texas Instruments with a DSP-accelerated card, and Applied Micro with a 64-bit design. HP is also working with companies such as PayPal to develop software for massively parallel servers.
“CGG is doing this for a different type of energy exploration. We have customers in the cloud ecosystem that also see the power of Moonshot and are leveraging its capabilities to radically change the profile of their data centres,” Fink said. “We want to look at what the workloads are and see how to deploy software such as Hadoop with our cartridges. The toolset is expanding and you will see an expanded set of tools and libraries to let you create open-source applications that leverage this ecosystem.
“We have seen a move to energy- and algorithm-optimised ecosystems. Increasingly, we will start to see the data we process specialised to the activities we want to do. Do I want to deploy a general-purpose processor in order to do video processing and video analytics? Maybe I want to deploy a specialised SoC that is good at only that.”
HP is anticipating further, more fundamental changes to server architecture.
“Let me start with memory and storage. Today, if you really think about the computer architecture, you tend to think about memory as a constrained entity. If memory is constrained, what do we have to do? We have to use it temporarily. But what if we were able to combine aspects of memory and storage into a single entity, while leveraging services such as data protection, to get a properly managed data pool? It avoids having to shuffle data back and forth, treating it instead as a single pool. I’ll let you figure out what that means for the software stack,” said Fink.
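One way to picture collapsing memory and storage into a single pool is memory-mapping a persistent file, so a program reads and writes durable bytes in place with no explicit load/save shuffling. The sketch below is purely illustrative (the file name and layout are invented, not HP's design):

```python
import mmap
import os

PATH = "pool.bin"  # hypothetical file backing the "memory pool"
SIZE = 4096

# Create the backing file once if it does not already exist.
if not os.path.exists(PATH):
    with open(PATH, "wb") as f:
        f.write(b"\x00" * SIZE)

with open(PATH, "r+b") as f:
    # Map the file into the address space: the same bytes serve as both
    # working memory and durable storage, with no copy between the two.
    with mmap.mmap(f.fileno(), SIZE) as pool:
        pool[0:5] = b"hello"  # an ordinary in-memory write...
        pool.flush()          # ...made durable without a separate save step

with open(PATH, "rb") as f:
    print(f.read(5))  # b'hello' -- the write persisted
```

This is only an analogy for Fink's "single pool": the data-protection and management services he mentions are exactly what a plain `mmap` lacks.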
“If you take a step back, computers since the dawn of time have been built with CPU, memory and I/O. That really hasn’t changed. If you think about the virtualisation movement we just recreated that architecture in software. We think there is an opportunity to change, have specialised processing connected to large pools of non-volatile memory and really change the paradigm of how we do compute,” Fink claimed.
“Think about how this changes how applications are built. Today, traditional software is built around known schemas. You know what the data looks like: it’s all defined ahead of time. The world will continue to use this, but increasingly you will have to have the ability to dump your data into a place and figure it out later. But you will need to figure it out very efficiently and scale out using very task-specific SoCs.”
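The “dump your data into a place and figure it out later” model Fink describes is essentially what is now called schema-on-read. A toy Python sketch (the record fields and function name are invented for illustration): records are stored raw at ingest time, and each task imposes its own schema only when it reads.

```python
import json

# Ingest: append raw records as-is, with no schema agreed up front.
raw_store = []
raw_store.append(json.dumps({"sensor": "cam-7", "frames": 1200}))
raw_store.append(json.dumps({"user": "alice", "clicks": 42, "region": "EU"}))

# Later, a task-specific reader applies its own schema at query time,
# extracting only the fields it understands and ignoring the rest.
def frames_per_sensor(store):
    return {
        rec["sensor"]: rec["frames"]
        for rec in map(json.loads, store)
        if "sensor" in rec and "frames" in rec
    }

print(frames_per_sensor(raw_store))  # {'cam-7': 1200}
```

The efficiency Fink demands would come from running such task-specific readers on hardware matched to the workload, rather than from the storage format itself.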