From PC Master to Data Catcher: Intel's Vivid Transformation Lesson


The transition from “PC-centric” to “data-centric” is the largest and most important change Intel made in the 2010s. When the Web 2.0 era arrived, with user-generated data at its core, Intel keenly grasped how important data had become to the development of modern society.

Over the past five years, Intel has laid a “data-centric” foundation for the IT industry’s development over the next ten or even twenty years, and as the 2020s begin, Intel will drive future computing on that foundation.

· Mining the data “oil” to make data valuable

The explosion of Web 2.0, smart mobile devices, and the IoT has made data more and more important. In the past, however, people’s use of data was very limited: vast amounts of data were generated but never produced their due value, and users could not extract useful information from this “waste data” to help themselves grow. What Intel set out to do is make data valuable, because in Intel’s view, data is the “oil” of the future.

In IDC’s global data volume forecast, the amount of data generated by data centers (cloud computing), edge computing, and endpoint devices will grow exponentially by 2025. As more and more devices come online, device count has become the strongest driver of this exponential data growth. Beyond sheer volume, data types are also diversifying: data from digital television, broadcast media, video surveillance, streaming media, and many other fields keeps increasing.


To this end, Intel is working to push more and more of the computing and storage once handled in the data center down to the edge, relieving the pressure that ever-growing edge- and endpoint-generated data places on the network, cloud computing, and cloud storage. Beyond that, how to handle the extra computing demand that growing data volumes bring, and how to meet the computing needs of different data types with diverse solutions, are the keys to how Intel mines the data “oil.”

Data growth fundamentally drives demand for computing, storage, and transmission. Relying on its strengths in hardware, Intel uses hardware as the foundation to turn data into business value through mining, analysis, and acceleration, so that “data generation” moves smoothly toward “data value-add.” At the same time, Intel leverages three major inflection technologies, 5G, AI, and the intelligent edge, to drive future growth and innovation.

As Song Jiqiang, Dean of the Intel China Research Institute, put it: “From the moment data is generated to the moment it finally yields business value, there is a long chain. If data is merely collected and stored but never used, it produces little value beyond consuming power and counting as an asset. Yet it is still treasure, treasure that has not been mined. To mine it, it has to be processed, and processing means data mining and analysis. Because data is being produced ever faster, it must be processed ever faster, so Intel needs to accelerate the hardware, use different hardware to accelerate different types of data, and provide better communication wherever joint processing is needed. More and more data, of more and more types, transmitted and processed ever faster: that is the need we are addressing.”


· A rich “arsenal” to accelerate data mining

Ubiquitous data inevitably calls for computation. From the cloud to the endpoint, and well beyond PCs and servers, new data-intensive workloads such as artificial intelligence, cloud data centers, the Internet of Things, next-generation networks, and autonomous driving keep emerging, driving the rapid evolution and exponential expansion of computing architectures.

Against this backdrop, Intel, whose greatest strength is computing, is arguably the company best positioned for this task, because the most indispensable things in Intel’s “arsenal” are high-performance, diversified computing hardware.

In Song Jiqiang’s view, architectural innovation will be a key driver of computing innovation in the next decade. CPUs suited to scalar computing, GPUs to vector computing, AI accelerators to matrix computing, and FPGAs to spatial computing are all powerful tools in Intel’s “arsenal”.
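
As a rough illustration of these workload categories (a sketch in plain C++, not Intel-specific code; FPGA-style spatial computing, which lays a pipeline out in hardware, is not shown), the same arithmetic looks quite different as scalar, vector, and matrix computation:

```cpp
#include <array>
#include <cstdio>

int main() {
    // Scalar computing (CPU territory): one value at a time.
    float s = 2.0f * 3.0f + 1.0f;

    // Vector computing (GPU territory): the same operation applied
    // across a whole array of elements.
    std::array<float, 4> a{1, 2, 3, 4}, b{5, 6, 7, 8}, v{};
    for (std::size_t i = 0; i < a.size(); ++i) v[i] = a[i] + b[i];

    // Matrix computing (AI-accelerator territory): multiply-accumulate
    // over rows and columns, the core pattern of deep-learning workloads.
    float m[2][2] = {{1, 2}, {3, 4}}, n[2][2] = {{5, 6}, {7, 8}}, p[2][2] = {};
    for (int i = 0; i < 2; ++i)
        for (int j = 0; j < 2; ++j)
            for (int k = 0; k < 2; ++k)
                p[i][j] += m[i][k] * n[k][j];

    std::printf("scalar %.0f, vector v[0]=%.0f, matrix p[0][0]=%.0f\n",
                s, v[0], p[0][0]);
    return 0;
}
```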

At the same time, Intel has recognized how important 5G is to the era of the data torrent. The ultra-high bandwidth of 5G networks can quickly connect a wide variety of endpoint devices and their data, and through the rational use and distribution of endpoint, edge, and cloud computing, data of different types can be processed and mined efficiently at scale, making it more valuable.

· How Intel responds to changing data and computing trends

Facing these changes in data and computing trends, Intel naturally has to put in a great deal of effort if it wants to add value to data through computation.

First, after settling on the general direction of “data-centric,” Intel began building an ecosystem around this core, as the picture below shows:

Intel’s ecosystem layout from 2015 to 2019

In the four years from 2015 to 2019, Intel carried out its ecosystem layout during the transformation along four dimensions: strategic launches, strategic acquisitions, product innovation, and ecosystem cooperation.

Altera, acquired in 2015, is a leading FPGA manufacturer; Nervana, acquired in 2016, specialized in custom AI chips, using ASICs to accelerate deep learning; and Movidius, acquired in 2016, together with Mobileye, acquired in 2017, are advanced players in visual computing and autonomous driving.

In addition, Optane memory, Neural Compute Sticks, neuromorphic chips, superconducting quantum test chips, 10th Gen Core processors, 2nd Gen Xeon Scalable processors, and many other innovative products have diversified Intel’s ecosystem layout in the data field.

Beyond refining its ecosystem, at the end of last year Intel also established six technology pillars: process & packaging, architecture, memory & storage, interconnect, security, and software. These pillars are Intel’s innovation engine for the new computing era, and under their guidance Intel will provide the industry with a full range of high-quality solutions.

Intel’s six technology pillars

At the same time, Intel has not overlooked the importance of hardware-software collaboration. On top of its XPUs (CPU, GPU, AI accelerators, FPGA, and so on), Intel achieves heterogeneous integration through oneAPI, which provides a unified, simplified programming model for developing applications across CPUs, GPUs, FPGAs, and other accelerators.
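
To make this concrete, here is a minimal sketch of what oneAPI’s unified model looks like in code, using SYCL/DPC++ (the device selector, array size, and variable names are illustrative, not taken from the article): the same C++ kernel can be dispatched to a CPU, GPU, or FPGA backend without being rewritten.

```cpp
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    // Pick whatever accelerator is available: GPU, FPGA (emulator), or CPU.
    sycl::queue q{sycl::default_selector_v};
    std::cout << "Running on: "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";

    std::vector<float> a(1024, 1.0f), b(1024, 2.0f), c(1024, 0.0f);
    {
        // Buffers hand the data to the runtime for the chosen device.
        sycl::buffer<float> bufA(a), bufB(b), bufC(c);
        q.submit([&](sycl::handler& h) {
            sycl::accessor A(bufA, h, sycl::read_only);
            sycl::accessor B(bufB, h, sycl::read_only);
            sycl::accessor C(bufC, h, sycl::write_only);
            // One data-parallel kernel, dispatched to whichever XPU was chosen.
            h.parallel_for(sycl::range<1>(1024),
                           [=](sycl::id<1> i) { C[i] = A[i] + B[i]; });
        });
    } // Buffers go out of scope here, copying results back to the host.

    std::cout << "c[0] = " << c[0] << "\n"; // expected: 3
    return 0;
}
```

The point of the model is that only the device selection changes between targets; the kernel itself stays the same.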


In addition, Intel offers both 2.5D EMIB and 3D Foveros packaging technologies for heterogeneous packages, enabling communication among CPUs, graphics, I/O, and other chiplets while raising transistor density and functional integration into the third dimension. At the storage level it provides Optane solutions, and its Aurora supercomputer architecture lays the groundwork for exascale computing (on the order of a quintillion calculations per second).

Thus, in responding to changes in data and computing trends, Intel has built a complete system, from industry ecosystem and technology strategy to diversified computing hardware.

· About the future

Intel caught the start of the data era, grasped the threads of the data torrent, and built a complete data ecosystem chain. What about the future?

In fact, Intel is already planning the future.

At the end of 2017, Intel released the Loihi neuromorphic computing chip. By mimicking the way neurons in the human brain compute, it can improve computing efficiency by a factor of more than 1,000.

The Intel Loihi neuromorphic computing chip

Loihi has 128 cores, 130,000 neurons, and 130 million synapses. Each neuromorphic core simulates multiple “logical neurons.” The on-chip mesh interconnect supports efficient spike-message delivery, highly complex neural network topologies, and scalable on-chip learning across multiple learning modes. Its strong computing power and low power consumption make it well suited to the large-scale data computation and processing needed to cope with the coming data flood.
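
Loihi itself is programmed through Intel’s own neuromorphic toolchain, but the neuron model behind such chips is easy to sketch. The toy below (plain C++, purely illustrative; the constants are made up) shows a single leaky integrate-and-fire neuron, the basic element that neuromorphic cores implement in silicon: the membrane potential leaks each time step, accumulates input, and fires a spike when it crosses a threshold.

```cpp
#include <cstdio>
#include <vector>

// Toy leaky integrate-and-fire (LIF) neuron: the membrane potential v
// decays ("leaks") every step, integrates incoming current, and fires a
// spike (then resets) once it crosses a fixed threshold.
int main() {
    const float leak = 0.9f;       // per-step decay factor (illustrative)
    const float threshold = 1.0f;  // spike threshold (illustrative)
    float v = 0.0f;                // membrane potential

    std::vector<float> input = {0.3f, 0.3f, 0.3f, 0.3f, 0.3f, 0.0f, 0.0f};
    for (std::size_t t = 0; t < input.size(); ++t) {
        v = leak * v + input[t];            // leak, then integrate input
        bool spike = (v >= threshold);
        if (spike) v = 0.0f;                // reset after firing
        std::printf("t=%zu  v=%.3f  spike=%d\n", t, v, spike ? 1 : 0);
    }
    return 0;
}
```

In a chip like Loihi, each such spike becomes a message routed over the on-chip mesh, and computation happens only when spikes arrive, which is where the efficiency gains come from.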

In addition, quantum computing is another key direction of Intel’s cutting-edge research. Compared with today’s computing methods, the computing power of quantum computing is beyond imagination. Take password cracking as an example: under the traditional computing model, brute-forcing a 128-bit or 256-bit key would take decades, centuries, or even millennia, whereas quantum computing could in principle shorten that time by many orders of magnitude, cracking common encryption schemes in minutes rather than millennia; the gap in data computing power is staggering.
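
The usual basis for this kind of estimate (our gloss; the article does not name the algorithm) is Grover’s quantum search, which reduces a brute-force search over an n-bit key space from $2^n$ classical trials to on the order of $\sqrt{2^n}$ quantum operations:

```latex
% Brute-forcing a 128-bit symmetric key:
T_{\text{classical}} = O\!\left(2^{128}\right)
\quad\longrightarrow\quad
T_{\text{Grover}} = O\!\left(\sqrt{2^{128}}\right) = O\!\left(2^{64}\right)
```

For public-key schemes such as RSA, Shor’s algorithm attacks the underlying factoring problem with an exponential speedup, which is where the “minutes instead of millennia” scenarios come from.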

Intel’s layout in the quantum computing field

In this frontier field, Intel has launched “Tangle Lake,” its first 49-qubit superconducting quantum test chip; it also has the smallest spin qubit chip for quantum computing, as well as the world’s first cryogenic wafer prober, the first testing tool of its kind for quantum computing.

Clearly, Intel has prepared for changes in data and computing from the present into the future, from ecosystem to hardware, and from technology to solutions.

· Conclusion

The 2010s are drawing to a close. Heading into the 2020s, Intel remains committed to its overall “data-centric” transformation. The past few years have already shown strong results: Intel now offers a combination of different XPU options plus an important tool in oneAPI, has proposed the six technology pillars, and has laid out neuromorphic computing and quantum computing for the future.

As Song Jiqiang, Dean of the Intel China Research Institute, said: “In the next ten years, Intel will continue to adhere to being ‘data-centric’ and to the six technology pillars, laying a solid foundation for the future world.”

This is an original article; if reprinting, please cite the source: From PC Master to Data Catcher: Intel’s Vivid Transformation Lesson, http://nb.zol.com.cn/732/7322477.html
