In recent years, artificial intelligence has been through two major winters, and its recent return as a tech-industry darling owes a great deal to the branch known as deep learning. In training deep neural networks, people came to recognize that GPUs, with their high data throughput and parallel computation, train faster and at lower power than CPUs, which left Intel, the king of CPUs, unusually tense and at a loss.
And so the curtain rose on the contest over AI hardware platforms. Before long, people discovered that FPGAs, which are low-power, high-performance, and programmable compared with GPUs and CPUs, were well suited to inference workloads and could be deployed quickly. In 2015, Intel spent $16.7 billion to acquire Altera, then the second-largest FPGA maker, in the biggest acquisition in Intel's history.
That same year, armed with Neon, billed as the fastest deep-learning framework, and Nervana Cloud, the first platform to combine machine-intelligence hardware and software, the deep-learning startup Nervana was named by VentureBeat as a startup worth watching. In August, Intel quietly spent roughly $400 million to acquire Nervana, a company of just 48 employees.
In March 2017, Nervana and Intel's other internal AI-related businesses and resources were consolidated into a single department, the Artificial Intelligence Products Group (AIPG), led by Nervana co-founder and former CEO Naveen Rao, who reports directly to Intel CEO Brian Krzanich.
According to AI Technology Base Camp, after integrating Nervana's technology, Intel AIPG plans to launch the Crest family product line. First comes a chip called Lake Crest, an ASIC designed specifically for training DNNs, which is expected to enter testing in the second half of this year. According to Naveen Rao, Lake Crest's acceleration performance is 10 times that of today's fastest GPU.
At an end-to-end AI technology seminar held by Intel, Intel AIPG senior principal engineer Andres Rodriguez introduced Intel's technology layout, industry trends, and the progress of the Nervana chip series.
Intel AIPG senior principal engineer Andres Rodriguez
According to Andres Rodriguez, Lake Crest uses the Flexpoint architecture and MCM (multi-chip module) packaging, carries 32 GB of HBM2 memory, and its internal interconnect is 20 times as fast as PCIe.
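Flexpoint is generally described as a shared-exponent numeric format: a tensor's values are stored as integer mantissas with one exponent for the whole tensor. Intel has not published Lake Crest's internals beyond the description above, so the following is only an illustrative sketch of the shared-exponent idea, with hypothetical names and parameters:

```python
# Illustrative sketch only: a Flexpoint-style shared-exponent encoding.
# All function names and parameters here are hypothetical, not Intel's.
import math

def flex_encode(values, mantissa_bits=16):
    """Encode a list of floats as integer mantissas plus one shared exponent."""
    max_abs = max(abs(v) for v in values)
    # Pick the exponent so the largest value fits in the mantissa range.
    exp = math.frexp(max_abs)[1] - (mantissa_bits - 1) if max_abs else 0
    scale = 2.0 ** exp
    mantissas = [round(v / scale) for v in values]
    return mantissas, exp

def flex_decode(mantissas, exp):
    """Recover approximate floats from mantissas and the shared exponent."""
    scale = 2.0 ** exp
    return [m * scale for m in mantissas]

tensor = [0.5, -1.25, 3.0, 0.0078125]
mant, exp = flex_encode(tensor)
restored = flex_decode(mant, exp)
```

The appeal of such a format is that the arithmetic units operate on integers while the exponent is managed once per tensor, trading some dynamic range within a tensor for cheaper multipliers than full floating point.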
After the meeting, an AI Technology Base Camp reporter conducted an exclusive interview with Andres Rodriguez.
Andres revealed that Lake Crest's testing was postponed from the first half of 2017 because, with Nervana now integrated into Intel, its product quality must meet the high standards of the Intel platform, so some additional verification was added. This more or less affects the speed of product development.
As for the current working state at Nervana, Andres said their working model has not particularly changed; they still operate like a startup. But because of the Intel acquisition, customer expectations have risen and the pressure is greater.
Just as this article was being prepared, at the China Artificial Intelligence Summit hosted by the Nanjing government on September 12, an AI Technology Base Camp reporter asked Sun Jian, chief scientist of Megvii (旷视科技), whether they would consider the Lake Crest chip for training in 2018. Sun replied that there are too many hardware platforms to try them all, the implication being that the company already works with NVIDIA, so......
Whether Intel can redefine the deep-learning chip architecture will be put to the test in 2018; and once Lake Crest ships, how to change users' habits is another question Intel is pondering.
Andres Rodriguez in an interview with AI Technology Base Camp
Below is the transcript of the interview with Andres Rodriguez, edited by AI Technology Base Camp without changing the original meaning:
On the Nervana chip series: Lake Crest testing by year end, mass production in 2018
Q: What is the positioning of the Nervana chip series, and how does it differ from a GPU?
Andres Rodriguez: First, let me briefly introduce the Intel Nervana AI platform and our NPU, the Crest family of products.
Compared with a GPU, there are really two differences.
The first difference runs through the entire Nervana AI platform: Lake Crest is tailor-made for deep learning, so it has no built-in graphics pipeline. In other words, it does not handle graphics processing, which is a major difference from a GPU.
The second important difference is that its compute capability is likewise tailored to deep learning.
Of course, there are also similarities. For example, the entire Crest family, like some of the newest chips on the market, uses high-bandwidth memory, and the chips interconnect with one another directly, without going through the CPU.
At the same time, because the entire Nervana AI platform is tailored to deep learning, good integration can be achieved in both the hardware platform and the software platform.
Q: The media often mention that Nervana chips accelerate deep-learning training better than GPUs. Can you talk specifically about performance in these respects?
Andres Rodriguez: Our products are only open to customers who have signed confidentiality agreements, and the details will be made public at the end of this year. I can't disclose specific numbers for performance or power-consumption improvements. What I can say is that, compared with other product lines, our utilization is very high.
Q: On Lake Crest's launch and testing schedule: there were reports that it would be tested in the first half of this year. Why was it pushed to the second half?
Andres Rodriguez: The factors are many, but one of the most important is that, inside Intel, we run more stringent tests on the Nervana chips so that they better meet Intel's current quality requirements for them, as well as the high standards Intel sets for product quality across its entire platform lineup. Because of that requirement, we have a great deal of testing work to do.
Q: How do the Crest family chips support popular AI frameworks such as Caffe, Torch, and TensorFlow?
Andres Rodriguez: Whether it's Lake Crest, Xeon, Movidius, or FPGAs, we provide support for these frameworks, and the overall process is fairly consistent.
First, users write their model in the framework of their choice. Each framework calls dedicated deep-learning libraries, and we optimize those libraries for the target architecture.
Let me take TensorFlow as an example. TensorFlow has tailor-made kernels, while Xeon processors and FPGAs each have their own dedicated kernel libraries. Whichever framework you use, we optimize its kernel libraries for the specific hardware, and the whole approach is fairly consistent.
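The pattern described here, where a framework routes each operation to a hardware-specific kernel library with a generic fallback, can be sketched in miniature. Everything below is hypothetical illustration, not Intel's or TensorFlow's actual code (real frameworks, e.g. TensorFlow with MKL kernels for Xeon, are far more involved):

```python
# Illustrative sketch only: dispatching an op to a per-device kernel library.
KERNELS = {}

def register(op, device):
    """Decorator that records a kernel under an (op, device) key."""
    def wrap(fn):
        KERNELS[(op, device)] = fn
        return fn
    return wrap

@register("matmul", "generic")
def matmul_generic(a, b):
    # Plain triple-loop matrix multiply as the fallback kernel.
    return [[sum(x * y for x, y in zip(row, col))
             for col in zip(*b)] for row in a]

@register("matmul", "xeon")
def matmul_xeon(a, b):
    # Stand-in for a call into an optimized vendor library;
    # here it just reuses the generic kernel so results match.
    return matmul_generic(a, b)

def run(op, device, *args):
    # Prefer the device-specific kernel, fall back to the generic one.
    fn = KERNELS.get((op, device)) or KERNELS[(op, "generic")]
    return fn(*args)

result = run("matmul", "xeon", [[1, 2]], [[3], [4]])  # [[11]]
```

The point of the indirection is the one Rodriguez makes: the user's model code stays the same regardless of hardware, and the vendor swaps in optimized kernels behind the registry.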
Q: What specific application scenarios do the Crest family chips target?
Andres Rodriguez: The entire Crest chip family is built for different kinds of deep-learning workloads, whether object detection, object recognition, video processing, image recognition and processing, or speech and natural-language processing. It supports the common characteristics of deep learning very well, and those are its fields of application.
Beyond that, both training and inference, as well as use in the data center, are supported by the Crest family. The range of specific applications remains very broad: gene sequencing, finance, autonomous driving, and other areas can all use Crest products. The trained model depends on the specific environment, and we can also apply it to inference, including in the data center; if you want to do inference, our Crest family products support that too.
Q: If I need to train neural networks today, how should I choose a chip?
Andres Rodriguez: In fact, many people on the market may be in the same situation I am: they already have a large number of Xeon processors available. In that case, I may not need to buy other hardware built specifically for deep learning, because all the work can be done on Xeon processors.
Over the past half year or more, our performance has improved more than 100-fold, so we don't have to consider other solutions. Moreover, across the whole training process, the time my workload requires can be greatly shortened: during deep-learning training, all the workloads are distributed across different cores, which sharply reduces the time.
Q: You just mentioned that Lake Crest will be released at the end of this year. Will the Chinese and US markets be synchronized, getting the product at the same time?
Andres Rodriguez: For the Lake Crest chip itself, we will announce the specific details at the end of this year, but actual production, including large-scale mass production, will have to wait until the first half of 2018. Lake Crest will be integrated into Nervana Cloud, and customers will test it there. Some of our very close partners, whether in the United States or in China, can test Lake Crest directly.
Of course, our testing is split between two different groups of partners. For customers with large demand, we open testing only to partners we work with very closely; they may consider deploying Crest in their data centers in the future. For general-purpose testing, the scope is wider.
Q: What is AIPG's plan for AI chip technology? Is there a product roadmap?
Andres Rodriguez: The answer is yes; we have an established roadmap. Beyond traditional computing, training, and inference, we will also build out a more complete deep-learning ecosystem. What we care about is not just training and inference themselves; we have many other detailed plans, but the specifics are shared only with those who sign our confidentiality agreement.
On AIPG's current state: the team has expanded significantly compared with Nervana
Q: About your personal working state: now that Nervana has been acquired by Intel, how is working at Intel, in AIPG, different from working at Nervana?
Andres Rodriguez: It's fair to say the leadership of Intel's Artificial Intelligence Products Group hasn't changed much from the Nervana days. Our current GM, who is also an Intel VP, is Nervana's former CEO and co-founder.
Nervana has been acquired by Intel and folded into the AIPG business unit, but we still hold on to the spirit of a startup. Our working model hasn't particularly changed: we still want to complete product development quickly and stay highly focused on the areas we're good at.
One of the biggest differences is that our scale is much larger than it was at Nervana, though many traditions and values have been well preserved. Another important difference is that, with the entire Artificial Intelligence Products Group backed by the big Intel brand, customers have higher expectations of our subsequent products, whether in terms of performance or safety, and that puts huge pressure on us. So after joining Intel, we have to develop our technology and products to a higher standard.
In addition, to build better products, the quality standards and requirements are higher, which is an extra challenge for us. To meet those high standards, we have to add some additional verification work, which more or less affects the speed of our product development. That is a trade-off we have to make.
Q: You just said the team has expanded a lot. Before Nervana was acquired, I remember there were 48 people. How many do you have now?
Andres Rodriguez: Our size is definitely much larger than before, but I can't disclose the specific headcount for now. I think the great advantage of joining Intel is this: although Nervana never lacked talent, it was, after all, a small team; now, on the whole Intel platform, across hardware and software, including data scientists, we have far more experts to draw on. That is the biggest advantage.
At the same time, our global market is larger and our perspective more open, because AIPG is not just about the Lake Crest accelerator; it addresses a much bigger market.
For example, our TensorFlow optimization work with Google is carried out within the AIPG business unit, so AIPG addresses a bigger market.
Views on Chinese AI chip companies
Q: How do you view China's homegrown AI chips?
Andres Rodriguez: I think it's a very good thing that every company now has its own space. Within the whole ecosystem, the parties involved can learn from one another and accelerate the rapid development of deep learning, so we are very happy to welcome more players into this market. We also hope to have the opportunity to cooperate fully with Chinese companies and with more customers to develop software, hardware, and other products for the deep-learning field.
Original link: https://www.eeboard.com/news/lake-crest/