Intel's Nervana accelerator chips are approaching the market: machine learning and neural networks alongside Xeon
One of Intel's most intriguing acquisitions in recent years has been that of the American start-up Nervana, for roughly $408 million, with the aim of leaping into the heart of the growing machine-learning and neural-network market. The move also led to the creation of a new dedicated team in the company's Israeli offices - and now we are just a little ahead of the moment when three years of planning and hard work become a real product.
At a press conference held at the company's IDC center in Haifa, we had the opportunity to take a look at Intel's first NNP-I chip (codenamed Spring Hill) - and to hear technical updates ahead of a launch sometime in the second half of this year.
While there are already a number of dedicated hardware solutions on the market for accelerating the training of neural networks, which relies on particularly large arrays of pre-labeled data, Intel and Nervana are also targeting the inference stage that follows - applying the trained model so that even simple end systems (like our smartphones and computers) can enjoy the fruits of artificial intelligence with minimal use of available processing resources.
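To illustrate the distinction in the simplest possible terms (this is a generic sketch with a toy linear model, not Intel or Nervana code): training is the heavy, iterative phase that fits model weights against labeled data, while inference is a comparatively cheap application of the already-fixed weights to new inputs - the workload an inference accelerator like NNP-I is built for.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training: iterative and compute-heavy -- the job of dedicated training hardware.
X = rng.normal(size=(1000, 4))            # pre-labeled examples
y = X @ np.array([2.0, -1.0, 0.5, 3.0])   # their known labels
w = np.zeros(4)
for _ in range(200):                       # gradient-descent iterations
    grad = X.T @ (X @ w - y) / len(X)      # gradient of mean squared error
    w -= 0.1 * grad                        # update the weights

# Inference: a single matrix-vector product with the frozen weights --
# the kind of operation an inference accelerator executes at low power.
new_sample = rng.normal(size=4)
prediction = new_sample @ w
```

The asymmetry is the point: training ran 200 full passes over 1,000 examples, while serving one prediction cost a handful of multiply-adds.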
Surprisingly, the NNP-I accelerators will not only be based on Intel's modern chip manufacturing processes but will also share the PCH and the new Ice Lake cores of the Sunny Cove architecture with the company's other products, alongside DSP components and dedicated neural-network acceleration engines - so the resulting chip package looks very much like other "regular" processors, yet performs tensor processing operations with an efficiency that should set a new bar for the category.
The NNP-I chips are designed to operate alongside Xeon processors of various types in server farms, HPC arrays and even workstations, and will therefore be offered in a variety of configurations with varying power envelopes - from compact M.2 cards drawing less than 15 watts up to large PCI-Express expansion cards with power consumption reminiscent of the company's desktop processors.
We are not expected to see NNP chips offered for the consumer market in the first phase - but the familiar M.2 form factor certainly leaves the door open for that down the road.
Is AI the field that will drive Intel's revenue and profits forward somewhere in the future? It's probably a little too early to jump to conclusions before we've seen even a single unit in action - but it's definitely an intriguing and exciting possibility that we'll continue to follow in the near future.