
China's mysterious supercomputing: a world-class lead to rival high-speed rail

Source: 21Tech (News-21)

Author: Luo Yiqi

Editors: Li Qingyu, Liu Xueying

In April 2019, an image resembling a glowing honeycomb briquette spread around the world: for the first time in human history, people saw the true face of a "black hole".

In fact, the black hole had already been photographed over more than ten days as early as April 2017. Since then, the data collected by eight radio telescopes around the world, about 2 PB per night of observation (1 PB = 1,000 TB = 1,000,000 GB), has been put through intensive computation.

Further back, there were two memorable "man versus machine" contests, in 1997 and 2016: Deep Blue, developed by IBM, and AlphaGo, developed by Google, defeated top human players at chess and Go, respectively.

The protagonists behind these enormous calculations share one name: the supercomputer.


From the earliest machines, whose computing power was measured in thousands of operations per second and which served scientific research and national defense, to today's systems that "compute the sky, the earth, and the people" and have ranked first in the world, supercomputing has just completed its first 40 years in China.

In the newly released TOP500 list of the world's high-performance computers, China's "Sunway TaihuLight" and "Tianhe-2" ranked third and fourth respectively. Meanwhile, the global race to develop the next generation of exascale computing (1 EFlops, i.e. 10^18 double-precision floating-point operations per second) has already begun.
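For scale, the exascale threshold can be written out as a unit conversion. The comparison below is a back-of-the-envelope illustration rather than a figure from the article; the roughly 93 PFlops Linpack score for Sunway TaihuLight comes from the public TOP500 list:

$$1\ \text{EFlops} = 10^{3}\ \text{PFlops} = 10^{18}\ \text{FLOP/s}, \qquad \frac{10^{3}\ \text{PFlops}}{93\ \text{PFlops}} \approx 10.8$$

In other words, an exascale machine would be roughly ten times faster on that benchmark than the fastest Chinese system on the current list.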


This technology, regarded as a "pillar of the nation", in fact began its march toward domestic substitution before weathering two rounds of international export controls. Now, in the window opened by new technology architectures, how to build a healthy supercomputing ecosystem faster and find where the next generation of computing platforms will land has become an important question, one that means new opportunities and challenges for high-performance computing practitioners around the world.

"China's supercomputing may be another high technology, after high-speed rail, that can deliver services to the world," Zhang Yunquan, director of the National Supercomputing Center in Jinan, told 21Tech.

Supercomputing from scratch

Before human civilization had writing, knotted cords served for a long time as an early way of counting, a reminder of how long the history of computing is. Its rapid development, however, began only about half a century ago.

In the early days of supercomputing, IBM was the representative of the times.

Around the 1960s, the United States began promoting high-performance computing for national defense and security; the IBM 7030 was born in that period.

On the commercial side, IBM got its start in census tabulation. As the industry's technology environment and demand changed, it went on to propose a succession of transformation concepts such as "e-business", "Smarter Planet", and "cognitive business".

"The computers made by IBM in the early days were basically supercomputing," said Feng Shengzhong, director of the National Supercomputing Center in Shenzhen. In fact Supercomputing can be understood as the source of today's computer technology After the success of supercomputing and other mainframe applications, the corresponding functions are gradually moved down to smaller terminals such as PCs and mobile phones.

Kwan Yongqiang, Assistant Director of Information Technology Services at the University of Hong Kong and one of the first generation of supercomputing practitioners, recalls that in the 1970s there was no computer-related major; had his research not required solving engineering problems, he would never have gone to HKU's computer centre to work part-time. He later stayed on as a member of staff and witnessed the stage at which supercomputing was first explored in academia.

These were also the two main focal points of supercomputing's early development: frontier scientific research and national defense. Accordingly, the main operating model of supercomputing centers was government funding, led by defense and research institutions.

The 1970s ushered in the era of large-scale development of global supercomputing.

In the early years, the American companies Control Data and Cray Research led the technology. From the 1980s, Japan launched large-scale policy subsidies and industrial support and for a time stood on a par with the United States; the Japanese supercomputer "Earth Simulator" topped the TOP500 ranking five consecutive times between 2002 and 2004.

[Figure: the "Earth Simulator". Source: China Weather Network]

During this period, China began developing its first supercomputer, "Galaxy-I" (Yinhe-I), in 1978, and the machine was officially completed in the early 1980s. As the technology architecture moved toward parallel computing, the global supercomputing landscape began to shift at the start of the 21st century.

In 2010, "Tianhe-1", China's first petascale supercomputer, developed by the National University of Defense Technology, took first place on the world TOP500 list for the first time; from 2013 to 2017, Chinese machines continued to occupy the top of the rankings.


"After supercomputing wins the first place, it will have an impact on the supercomputing ecology of the whole world. This means that the global status will be improved, cooperation opportunities will become more, and our own level will also be improved." Zhang Yunquan explained to 21Tech that, for example, some phenomena or difficulties will be observed in advance, which will also bring some new scientific discoveries, especially in terms of comprehensive national strength and scientific discovery strength, Will act as a support.

Feng Shengzhong noted that supercomputing plays an important role in the research and development of new materials. "At present, perhaps 30% of supercomputing capacity supports this kind of materials-design work." Some seemingly far-fetched requirements, he added, such as whether there are materials that shrink when heated and expand when cooled, or the design and application of materials like graphene, can only be explored by running them on supercomputing equipment.

Perhaps because of this, China's supercomputing development has twice been the target of technical restrictions.

In February 2015, the United States placed four entities on its export-control "Entity List": the National Supercomputing Center in Changsha, the National Supercomputing Center in Guangzhou, the National Supercomputing Center in Tianjin, and the National University of Defense Technology, prohibiting American companies from exporting related chips and other products and technologies to them.

In June 2019, the US Department of Commerce added five more entities, including Sugon (Zhongke Shuguang), Haiguang (Hygon), and the Jiangnan Institute of Computing Technology, to the export-control "Entity List".

At present, China has established six national supercomputing centers, whose machines mainly belong to the Shenwei (Sunway), Tianhe, and Shuguang (Sugon) families. The 2015 action targeted institutions associated with the Tianhe line.

In recent years, however, China's supercomputing systems have made considerable progress in localization and substitution. For example, the Shenwei chips independently developed for the Sunway line and the Feiteng series chips independently developed for the Tianhe line have successively been deployed in supercomputing equipment. According to 21Tech, the Sunway line's push for localization and independent R&D has gone on for decades, and the industry believes that keeping everything under its own control allows it to be more thorough.

Today, supercomputing applications have long since ceased to be limited to scientific research and national defense. The saying that it "computes the sky, the earth, and the people" comes from the fact that supercomputing now supports fields such as weather observation, oil exploration, film rendering, and precision medicine.

It is fair to say that a century of progress in computing has helped us see today's vast and complex world. More than that, with today's supercomputing we can extend our reach into areas we can imagine but cannot touch, and even into ones that defy common sense.

The path to building an ecosystem

Changes from the external environment have undoubtedly accelerated the process of supercomputing autonomy in China and the construction of its own ecosystem.

Feng Shengzhong acknowledged that chip cost is closely tied to production volume. Even if a domestic chip matched Intel's technology level exactly, the latter's huge volumes keep its costs low, which is why the former holds little advantage at present.

"The ecosystem is our biggest weakness at present, but it is not unbreakable. It is just that a complete ecosystem cannot be built overnight; there will be a painful period," he said.

The larger ecosystem question lies in industrial adoption. Zhang Yunquan told 21Tech that how to make supercomputing flourish and generate a return on investment is a new topic; in the past, as a strategic investment, the state did not require supercomputing to produce large returns in the short term.

For the Jinan Center, a new plan has been drawn up at the institutional level. Zhang Yunquan told 21Tech that by 2020 Shandong will have two major systems: the National Supercomputing Center in Jinan and the supercomputing platform of the National Marine Laboratory. The latter, built at the Qingdao National Marine Laboratory, is mainly for high-end scientific computing, while the former is mainly for commercial computing.

Zhang Yunquan added that the Jinan Center plans to build a supercomputing technology park. Its vision is a "computing factory": in the future, computing should become a public utility like water and electricity, and the park a highland where major national scientific facilities gather. "It is mainly aimed at incubating industries around new-generation information technology, including AI, cloud computing, and the industrial Internet. Of course, it also depends on the healthy development of the artificial intelligence industry; our hope is to achieve 'intelligence plus' ecosystems together with a range of industries."

The National Supercomputing Center in Shenzhen, built earlier, has already been through the process of industrial landing: from expectation, to matching, to further upgrading.

Feng Shengzhong said that when the Shenzhen Center was first planned in 2008, he expected it to provide services at four levels: first, serving Shenzhen's leading enterprises, including CGN, BYD, Huawei, and ZTE; second, serving Shenzhen's small and medium-sized enterprises; third, serving social development in Shenzhen and South China, in fields such as meteorology, medical care, health, and environmental protection; and fourth, supporting basic scientific research.

The past decade of practice, however, has shown that the services actually delivered turned out somewhat differently. "The demand we thought would rank third and fourth now ranks first and second," Feng Shengzhong said. Over the past ten years, basic research institutions in Shenzhen have multiplied, and demand from the social-development fields has been very strong, which was not expected.

By contrast, large enterprises do have high-performance computing needs, but in practice companies like Huawei build their own servers and platforms, develop their own software, and absorb most of that demand internally.

Demand from small and medium-sized enterprises is still not very strong. Such companies want to reduce market risk, so they adopt new technologies with a "follower" mentality; at the same time, their technical problems are relatively simple compared with those of large enterprises, so their actual demand is small.

Only in the past year or so have some seemingly small enterprises, mainly artificial intelligence companies, begun cooperating more and more closely with the National Supercomputing Center in Shenzhen.

"This is also a mutual need. The center has ideas and needs new directions to develop rapidly; AI company also has development needs." Feng Shengzhong smiles to 21Tech, saying that he has "close" relationships with many AI company founders and alumni.

A recent example of such cooperation is the AI operating system jointly released by the National Supercomputing Center in Shenzhen and Yuntian Lifei. The platform is reportedly intended to manage the whole process from data management and data annotation to algorithm training and algorithm deployment, and to make this system available to a range of industries.

Meanwhile, the Shenzhen Center, now ten years old, finds its services stretched by the huge demand for industrial support. Feng Shengzhong noted that supercomputers are, after all, electronic products with a finite life cycle, and the center is currently running at full capacity, with jobs queuing.

A second phase is therefore on the agenda. He told 21Tech that compared with the current system, the second phase will be 1,000 times larger in scale and 1,000 times higher in performance, while its physical volume will grow only a few times and its energy consumption about 10 times.

"This means that the efficiency has been improved 100 times. Phase II is now in full swing. In addition to machine development, the construction of new machine rooms is advancing rapidly; application research and development should not lag behind, and should not be matched until the machines come. It should be carried out synchronously," he said.

Feng Shengzhong also mentioned one observation: "The more attention basic research receives, the greater the demand for supercomputing. As far as I know, roughly 30% of the research work at a typical American university involves supercomputing, while the average at domestic universities is below 10%, although top schools such as the University of Science and Technology of China, Tsinghua, and Peking University exceed 30%. So from the perspective of national development, there is still a great deal to be done in supercomputing applications."

A new era of computing

At the level of underlying technology, the global semiconductor industry faces a common new challenge: Moore's law is approaching its limits. As a result, high-performance computing will no longer advance at its former pace.

[Figure: Moore's law]
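For reference, the common statement of the rule (as usually quoted, not a formulation from the article) is that the number of transistors that can economically be placed on a chip doubles roughly every two years:

$$N(t) \approx N_0 \cdot 2^{\,t/2},$$

where $N_0$ is the transistor count in a baseline year and $t$ is the number of years since then. The "failure" discussed here is that physical and economic limits are stretching out that doubling period.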

At the same time, the coming 5G era means far more data will be produced and processed. Finding a way through is a new challenge for the industry.

Zhang Yunquan told 21Tech frankly that the latest TOP500 list also shows the growth in floating-point performance and the pace of system replacement slowing, which means the industry is no longer so keen to upgrade processors: replacing servers may no longer deliver the gains it once did, while greater parallelism makes programming harder and reliability a problem.

"We are faced with the question of what is the next computing platform, and which technology can improve the current semiconductor technology, whether it is quantum computing, biological computing, optical computing, etc. Where are the revolutionary changes, and how can we continue to make computing speed develop according to Moore's Law?" he said.

Both academia and industry are looking for new breakthroughs. Li Guojie, an academician of the Chinese Academy of Engineering, has argued that the coming decade is a golden age for architecture: it will see a "Cambrian explosion" of new computer architectures, and computer architects in academia and industry alike will enter an exciting era.

"The traditional development is based on Moore's Law, but no one pays attention to the performance and structure, and the future improvement depends on the structure improvement. In recent years, although there are many suggestions, no new revolutionary achievements have been seen." He said so.

The wheel of the new era of computing rolls on. Who will carry the flag next is a question the industry must answer together.


