Top 10 latest technology trends you must follow in 2021
Whether you admit it or not, 2020 was an exceptional year. Rereading the ten technology trends outlook for 2020 that the InfoQ editorial department published a year ago, I found one sentence that now seems charged with meaning. At the time we wrote:
Some people say that 2019 may have been the worst year of the past decade, but it may also turn out to be the best year of the decade to come. We do not agree with this view. We remain optimistic about the future, because pessimists are often right, while optimists are often successful.
In 2020, broad trends at home and abroad inevitably had a huge impact on the IT field, reshaping the industry and its technological direction; at the same time, some technologies continued to develop along their own trajectories, undisturbed by the outside world.
Now another new year has arrived. The InfoQ editorial department once again presents its top ten technology prospects, this time for 2021. We invited our editors-in-chief in architecture, front end, cloud computing, big data, AI, blockchain, and other fields to combine their own insights with the views gathered from practitioners and technical experts over the past year, and to distill them into the following ten technology trends.
1. Architecture, cloud computing
1. Microservices turn complexity into simplicity and take the road of "minimalism"
In 2020, a new term appeared in the field of microservices: the "macroservice". Macroservices are not actually a brand-new architecture, but a concept that strikes a balance between monoliths and microservices.
At present, the development of microservices has increased system complexity. As microservices become ever more fine-grained and their reuse rate peaks, the relationships between services grow more complex and maintenance costs rise. Against this backdrop, technologists proposed the "macroservice": finding a balanced position on the granularity spectrum that makes the system easier to maintain, enables collaborative maintenance by multiple people, and makes codebase refactoring easier.
Beyond macroservices, Service Mesh has also changed a great deal in the name of taming complexity. Take Istio as an example. In 2020, Istio kept its promise to release a version every quarter (1.5, 1.6, 1.7, 1.8). The opening release, 1.5, overturned the previous design and proposed a "return to the monolith" architecture. The 1.6 release notes declared right at the start that minimalism would be carried through to the end. After version 1.7 was released, Christian Posta, former Red Hat chief architect, author of Istio in Action, and Field CTO at solo.io, argued that Istio 1.7 would become the most stable version available for production. In version 1.8, released at the end of the year, Istio officially abandoned the Mixer component.
From Istio's version updates, it is not hard to see that the community has been working hard to tame complexity. However, tall buildings are not built overnight. The architectural leap in Istio 1.5 made production adoption difficult, at least in versions 1.5 and 1.6. Version 1.7 still imposed strict platform requirements (Kubernetes must be upgraded to 1.16 or higher) and carried issues such as the imminent forced migration of deprecated APIs. Whether version 1.8, released at the end of the year, can truly become the most stable version for enterprise production remains to be tested in the field.
In 2021, simplifying the complex will still be a key issue for microservices.
2. Cloud native is no longer resource-oriented, but application-oriented
In 2010, Paul Fremantle first proposed the concept of “cloud native” in his blog. After ten years of development, technologies such as DevOps, containers, and microservices have developed rapidly. Cloud native has been successfully applied to the core business of enterprises and has become an important driving force for business innovation.
From a market perspective, the business scenarios supported by cloud-native technologies are very rich, spanning finance, manufacturing, and the Internet. For these companies, the biggest difficulty in applying cloud-native technology is not building a cloud platform but migrating to the cloud. Traditional applications were not developed for cloud computing, so the migration workload is heavy: using and debugging migration tools, operating and maintaining systems after migration, and so on. Moreover, if a company merely virtualizes and redeploys its applications on the cloud, it cannot take advantage of cloud computing's elasticity and high concurrency.
Therefore, cloud native is no longer resource-oriented but application-oriented. Virtual machines and servers no longer define cloud native; the business does. Many technical experts call this the "cloud native 2.0" era. Putting "applications" at the center, standardizing the lifecycle management of enterprise applications, and building unified, standardized processes for deploying, running, operating, maintaining, and governing cloud-native enterprise applications: these are the key points and challenges for the future development of cloud native.
Enterprises will increasingly realize that "cloud native really helps business departments." Enterprise technical architectures will gradually transform into business-oriented microservice structures, reducing development and operations concerns and allowing teams to truly focus on the business.
3. Edge computing will see large-scale commercial deployment
In July 2020, Lei Bo, director of the IP and Future Network Research Center at the China Telecom Research Institute, said: "As the business and the market mature, commercial deployment of edge computing has become the focus of attention across the industry. It is predicted that large-scale deployment will begin in the near future (2021-2023)."
From the perspective of technical architecture, the Edge Computing Industry Alliance (ECC) and the Industrial Internet Industry Alliance (AII) jointly released the Edge Computing Reference Architecture 3.0. The system is divided into three layers: cloud, edge, and field, with edge computing sitting between the cloud and field layers. The edge layer supports access from various field devices below and connects to the cloud above. It consists mainly of edge nodes and edge managers. Edge nodes are hardware entities that carry the edge computing services; they generally have compute, network, and storage resources. The edge manager is primarily software, and its main function is the unified management of edge nodes.
Edge Computing Reference Architecture 3.0
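The three-layer split described above can be sketched in code. The model below is purely illustrative (none of these class or field names come from the reference architecture): field devices produce raw readings, an edge node with local compute aggregates them, and only a compact summary travels up to the cloud layer.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class FieldDevice:
    """Field layer: a device producing raw sensor samples on site."""
    device_id: str
    readings: list

@dataclass
class EdgeNode:
    """Edge layer: a hardware entity with local compute/storage resources."""
    node_id: str
    devices: list = field(default_factory=list)

    def aggregate(self):
        # Reduce raw field data to per-device summaries before uplink,
        # saving bandwidth and avoiding a cloud round trip per sample.
        return {d.device_id: mean(d.readings) for d in self.devices}

class CloudLayer:
    """Cloud layer: receives only the compact summaries from edge nodes."""
    def __init__(self):
        self.summaries = {}

    def ingest(self, node_id, summary):
        self.summaries[node_id] = summary

# A field device streams 4 samples; the cloud only ever sees 1 number per device.
node = EdgeNode("edge-01", [FieldDevice("sensor-a", [20.0, 21.0, 19.0, 20.0])])
cloud = CloudLayer()
cloud.ingest(node.node_id, node.aggregate())
print(cloud.summaries)  # {'edge-01': {'sensor-a': 20.0}}
```

The point of the sketch is the data-volume asymmetry: the raw samples stay at the edge, which is exactly why latency-sensitive and bandwidth-heavy scenarios drive edge adoption.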
From the perspective of the industry chain, edge computing is still in its early stage. The downstream consists mainly of chips, hardware, software, and connectivity; downstream vendors will gradually make hardware and software facilities more intelligent and open. The midstream is mainly the supporting platforms, involving vendors such as cloud service providers and telecom operators, who usually pick specific domains as the breakthrough point for edge computing applications. The upstream is the applications, which bring edge computing to smart terminals and end uses. From upstream to midstream to downstream, companies across the entire chain are exploring the business model and customer value of edge computing.
From the perspective of deployment scenarios, current edge computing adoption is concentrated in the energy Internet, the industrial Internet, AR/VR/HD video, cloud gaming, autonomous driving, smart stores, and healthcare.
Interviews conducted by InfoQ found that most vendors offer a wide variety of edge computing products, and there are already deployment cases. So why will the large-scale commercial adoption of edge computing only happen after 2021? Technical experts are fairly consistent on this: "Large-scale commercial adoption of edge computing is still waiting for a signal, and that signal is 5G!"
2. Front end
4. Low code will bring new changes to the front-end field
In 2014, the research firm Forrester Research formally put forward the concept of "low code/zero code". As the name suggests, low code means developers can quickly build applications and extend functionality with very little code. Compared with traditional software development tools and techniques, low code has a lower technical threshold and higher development efficiency; compared with other rapid development tools, low code offers better scalability.
With a low-code platform, non-IT staff can also build software; multiple applications can be developed on one common platform, which eases the IT department's task backlog to some extent; multi-platform deployment is supported, so an application developed once can be compiled and run in different environments; and maintenance is easier, simplifying software updates, debugging, fixes, and changes.
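The core mechanism behind these benefits, one declarative spec driving a generic engine, can be sketched in a few lines. Everything here (the spec format, the field names, the validator) is hypothetical and only illustrates the idea; real low-code platforms are far richer, generating UIs, workflows, and deployments from similar descriptions.

```python
import json

# The "application" is a declarative description (here JSON); a generic engine
# turns it into working behavior with no hand-written per-app code.
APP_SPEC = json.loads("""
{
  "app": "leave-request",
  "fields": [
    {"name": "employee", "type": "string", "required": true},
    {"name": "days",     "type": "int",    "required": true}
  ]
}
""")

CASTS = {"string": str, "int": int}

def build_validator(spec):
    """Generate a form-validation function from the declarative spec."""
    def validate(record):
        cleaned = {}
        for f in spec["fields"]:
            if f["required"] and f["name"] not in record:
                raise ValueError(f"missing field: {f['name']}")
            # Coerce each value to the type declared in the spec.
            cleaned[f["name"]] = CASTS[f["type"]](record[f["name"]])
        return cleaned
    return validate

validate = build_validator(APP_SPEC)
print(validate({"employee": "alice", "days": "3"}))  # {'employee': 'alice', 'days': 3}
```

Changing the application here means editing the JSON, not the Python: that inversion, configuration over code, is what lets non-programmers assemble software and what makes one platform serve many applications.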
At present, many companies have adopted low-code platforms to improve development efficiency, especially in the front-end field. Yu Libin, front-end technology director at Suning's Consumer Platform R&D Center, said in an earlier interview with InfoQ: "As the front-end team of an e-commerce company, our usual practice for similar business requirements was componentization plus manual modification. This year we tried a low-code platform, which significantly reduced the workload of front-end programmers. At present, Suning's low-code/no-code platform is running well and the cost savings are obvious; the team has been reduced to two people."
Gartner predicts that market demand for application development in 2021 will be five times the delivery capacity of IT organizations. To fill this gap, low-code/zero-code technology is currently the only feasible solution, and more and more companies will inevitably adopt it. We have already seen many major Internet companies apply low-code platforms in the front-end field; next year we expect low-code platforms to play a role in more enterprises and more domains.
3. Big data, artificial intelligence
5. Big data and cloud integration accelerates, and the lakehouse moves from theory to implementation
With IT infrastructure migrating to the cloud at an accelerating pace, cloud native is becoming the mainstream standard for the new generation of data architectures. Beyond the standard services offered by public cloud vendors, cross-cloud third-party data service providers (such as Snowflake and Databricks) are also sought after by users and the capital market. Snowflake, a cloud-native data warehouse provider, went public in September 2020, and its market capitalization at one point climbed above 100 billion US dollars; by contrast, the market capitalization of leading traditional data warehouse vendors (such as Teradata) is under 2 billion US dollars. More and more enterprise customers are shifting from on-premises data warehouse solutions to cloud-based (public and private) ones. This trend is already widely accepted in the US B2B market and is on the rise in the domestic B2B market. We believe the integration of big data and cloud will continue to deepen in the new year, and the big data field will accelerate its embrace of "convergence" (or "integration") as the new direction of evolution.
In fact, whether it is the "lakehouse" (the integration of data lakes and data warehouses) that drew the most attention this year, or the unification of batch and stream processing that the industry has broadly embraced, both are stage products of this "convergence" line of thinking. The essence is to reduce the technical complexity and cost of big data analysis while meeting higher requirements for performance and ease of use.
In the past few years, as data warehouse and data lake solutions rapidly evolved and made up for their respective shortcomings, the boundary between the two has gradually blurred. The new generation of cloud-native data architectures no longer follows the single classic architecture of either the data lake or the data warehouse, but rebuilds it to some extent by combining the advantages of both. Major cloud vendors have successively proposed their own "lakehouse" solutions, such as AWS Redshift Spectrum, the integration of Microsoft's Azure Synapse Analytics service with Azure Databricks, Alibaba Cloud MaxCompute + DataWorks, and Huawei Cloud FusionInsight. Other companies are building their own lakehouses on open-source table formats (such as Delta Lake, Apache Iceberg, and Apache Hudi). With cloud vendors and open-source solutions pushing together, we will see more real-world lakehouse deployments in 2021.
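The "one copy of data, two access patterns" idea behind the lakehouse and batch-stream unification can be illustrated with a toy sketch. Real table formats such as Delta Lake, Apache Iceberg, and Apache Hudi implement this with versioned metadata, transactions, and file layouts; the class below is purely conceptual.

```python
class UnifiedTable:
    """One storage layer serving both warehouse-style and stream-style reads."""

    def __init__(self):
        self._log = []  # append-only commit log, one record per write

    def append(self, record):
        self._log.append(record)
        return len(self._log)  # commit version after this write

    def batch_read(self):
        """Warehouse-style query: a full snapshot of the table."""
        return list(self._log)

    def incremental_read(self, since_version):
        """Stream-style query: only records committed after a given version."""
        return list(self._log[since_version:])

t = UnifiedTable()
v = t.append({"user": "u1", "amount": 10})
t.append({"user": "u2", "amount": 25})

print(t.batch_read())         # both records: a snapshot for batch analytics
print(t.incremental_read(v))  # only the second record: the "stream" since v
```

Because the batch snapshot and the incremental stream come from the same commit log, there is no separate ETL pipeline keeping two copies in sync, which is precisely the complexity-and-cost reduction the "convergence" argument above is about.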
6. Industrial intelligence will move past its initial stage of development
With the development of technologies such as deep learning and knowledge graphs, the tractability of complex problems has improved significantly, and artificial intelligence has gradually reached the point where it can solve practical problems and, on some tasks, surpass humans. On this basis, industrial intelligence has gradually developed. Typical examples include data-driven optimization and decision-making, deep-learning-based visual quality inspection, industrial knowledge graphs for solving global and industry-wide problems, and intelligent industrial robots for human-machine collaboration, which are now widely used.
Over the past few years, industrial intelligence has gone through three stages: rule-based, statistics-based, and complex-computation-based. The three stages do not replace one another: expert systems, traditional machine learning, knowledge graphs, and cutting-edge machine learning coexist and continue to interweave and merge, with knowledge engineering (represented by knowledge graphs) and data science (represented by deep learning) as the two major directions. However, current applications of industrial intelligence are mostly isolated point solutions; their reach is limited, many problems remain unsolved, and the field is still in its initial stage of development.
In 2021, with breakthroughs in general-purpose technology, industrial intelligence will enter a new stage of development. Specifically, FPGA-based semi-custom chips are expected to become the hardware base of industrial intelligence; highly compatible compilers will meet the need for industrial adaptability; real-time requirements will drive further optimization of on-device inference frameworks; and breakthroughs in general-purpose technology together with research on customized algorithms will be key.
7. Explainable AI is one step closer to large-scale application
Because of the "black box" nature of machine learning models, their internal workings and decision processes are hard to understand. Yet the results of AI systems must be explainable to human users; human engineers must be able to locate and fix problems in AI systems; and the AI process requires human oversight.
In the past few years, we have witnessed the rise of opaque decision-making systems such as deep neural networks (DNNs). The success of deep learning models (such as RNNs and BERT) stems from the combination of efficient learning algorithms and huge parameter spaces; a model may consist of hundreds of layers and millions of parameters, which is why DNNs are considered complex black-box models. As computing power grows, models become ever larger and more complex. Their capability is indeed strong, helping us do more and more and even outperforming humans on many specific tasks, yet we understand them less and less. This is a very hard problem. Interpretability means seeking a direct understanding of how a model works: breaking open the black box of artificial intelligence.
The idea of interpretable machine learning is to consider both accuracy and interpretability when selecting a model, giving not only the model's prediction but also the reason behind it. The commonly used approaches are the interpretability of the model itself and interpretability based on the results.
The interpretability of the model itself is strongly bound to the model: we must iterate one-to-one on the model and the application scenario to make it interpretable, so generality is very limited and modification is difficult. Interpretability based on the results can treat the model as a black box, but the algorithms themselves still have problems; for example, the LIME algorithm depends on sampling to a certain extent, which makes its results unstable. Still, with step-by-step exploration by industry and academia, these algorithms should keep improving in 2021 and move ever closer to large-scale application.
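To make the result-based approach concrete, here is a deliberately simplified, hypothetical sketch of perturbation-based attribution. It is not the LIME algorithm (LIME fits a local surrogate model over weighted samples, which is where the sampling instability mentioned above comes from); it only conveys the basic idea of probing a black box by nudging its inputs.

```python
def black_box(features):
    # Stand-in for an opaque model: income matters a lot, age barely at all.
    # In practice this would be a trained DNN we cannot inspect.
    return 0.9 * features["income"] + 0.1 * features["age"]

def perturbation_importance(model, features, delta=1.0):
    """Score each feature by how much the prediction moves when it is nudged."""
    base = model(features)
    scores = {}
    for name in features:
        perturbed = dict(features)
        perturbed[name] += delta  # nudge one feature, hold the rest fixed
        scores[name] = abs(model(perturbed) - base)
    return scores

scores = perturbation_importance(black_box, {"income": 50.0, "age": 30.0})
print(scores)  # income shifts the output ~0.9 per unit, age only ~0.1
```

Even without opening the model, the probe recovers that "income" drives this prediction far more than "age": an explanation of the result, not of the mechanism, which is exactly the trade-off result-based methods accept.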
8. Breakthroughs in cognitive intelligence are worth looking forward to
Nowadays, with continuous innovation in theory and technology, AI, supported by its "three elements" of data, computing power, and algorithms, is increasingly entering our daily lives. Behind this string of surprises, however, most AI still struggles with language understanding, visual scene understanding, decision analysis, and so on: these technologies remain at the perceptual level, using AI to simulate human hearing and vision, but they cannot solve complex cognitive tasks such as reasoning, planning, association, and creation.
Current AI lacks the processing, understanding, and thinking that should happen after information enters the "brain"; what it does is relatively simple comparison and recognition, staying at the "perception" stage rather than "cognition". AI built on perceptual intelligence is still far from human intelligence, because it faces two bottlenecks: large-scale commonsense knowledge bases and cognition-based logical reasoning. Cognitive graphs built on knowledge graphs, cognitive reasoning, and logical expression are considered by more and more scholars and industry leaders at home and abroad to be "one of the feasible ways to break through this technological bottleneck."
How can intelligence be achieved? There are two paths at the moment. The first is so-called brute-force aesthetics: if the data is not enough, add more data, as with GPT-3; presumably there will be a GPT-4, a GPT-5, and so on. This line of thinking may succeed. But one can also change perspective and look at how biological intelligence is achieved. Biological intelligence arises in many ways, and it does not rely solely on the number of neurons or brute force to solve problems.
Suppose we define artificial general intelligence by three conditions: first, multi-tasking, being able to do many things rather than a single thing; second, robustness; and third, the ability to adapt to multiple environments. Then, going forward, we need to cross-fertilize neuroscience, cognitive science, and computer science, strengthen the two-way interaction between artificial intelligence and brain science, reveal the fine structure and working mechanisms of biological intelligence systems, and build brain-inspired, high-performance "super-brain" intelligent systems, using vision and other functions and typical model animals as references to test intelligence levels, thereby exploring a feasible path for the future development of artificial intelligence.
4. 5G, blockchain
9. Initial completion of 5G network construction will promote the development of smart healthcare, industrial manufacturing, and other industries
5G is developing much faster than previous generations of communication technology. In 2020, operators worldwide accelerated base station construction, and China, the world's largest 5G market, now has more than 700,000 5G base stations. On the terminal side, closest to consumers, mainstream smartphone brands have all entered the market; even though the 5G iPhone arrived late this year, its sales are good and it is expected to help the 5G market develop rapidly.
But 5G still lacks a "killer app." Low latency and high bandwidth are 5G's defining advantages. Entering 2021, we believe the combinations of 5G with video, cloud gaming, the Internet of Things, and edge computing deserve the most attention.
Under the influence of the epidemic, video scenarios such as live streaming, short video, and audio/video calls have become the norm. How do we keep sound and picture in sync, reduce stuttering, and lower latency? The answer is inseparable from 5G's key technologies. In addition, bandwidth is one of the basic preconditions for the evolution of ultra-high-definition video; only with ever-increasing bandwidth can it be realized.
Cloud gaming is essentially an interactive online video stream and is considered one of the 5G applications closest to landing. In April this year, Baidu announced cloud phone products, with cloud gaming as one of the major application scenarios. With the commercialization of 5G and the integration of cloud computing and 5G technology, cloud game latency is expected to fall while picture quality and responsiveness improve greatly. And with more and more companies entering, including the major technology and Internet giants, the industry ecosystem will also mature quickly.
A massively connected IoT world is even more inseparable from 5G. But Forrester predicts that "connectivity chaos" will be the mainstream phenomenon in 2021: there are many IoT connectivity options (satellite, cellular, Wi-Fi, Bluetooth, Zigbee, Z-Wave, and so on), and 5G is not the only one. Enterprises and organizations need to work through this "market chaos" and sort the options out; as a result, deployments of 5G and Wi-Fi technology are expected to decline compared with 2020.
In the current AI era, the computing industry is also inseparable from intelligence. With 5G behind it, the currently hot field of edge computing will see significant gains in bandwidth and latency, enabling smarter computing. KPMG and IDC estimate that, thanks to 5G and edge computing, not only connected healthcare but also industrial manufacturing, intelligent transportation, environmental monitoring, (cloud) gaming, and other industries can expect significant growth in the next two to three years.
10. Blockchain technology is accelerating its implementation under the impetus of “new infrastructure”
Blockchain has been one of the more important technologies of recent years. It currently sits in the "trough of disillusionment" of Gartner's hype cycle. Cryptocurrency is one of blockchain's main applications, and in recent years the government has stepped up its crackdown on cryptocurrency scams: this year public security organs filed and investigated the "PlusToken platform" pyramid-scheme case, involving more than RMB 14.8 billion.
On the other hand, while Diem (formerly Libra), the digital currency Facebook plans to release, remained under huge regulatory pressure in 2020, the People's Bank of China's digital renminbi has been piloted in multiple places with government backing. Shenzhen conducted the first digital currency trial in October. Then, in December, the Suzhou government distributed RMB 20 million in digital red envelopes to local residents through a lottery; the 100,000 winners each received 200 yuan of digital renminbi, usable for online shopping or offline consumption. The government also tested the use of digital renminbi for food delivery and ride-hailing services with Meituan and Didi Chuxing. The legal digital currency DC/EP uses blockchain technology in the confirmation step. In the long term, as data is incorporated into production factors and the confirmation, pricing, and trading of data rights are promoted, putting data on the chain will gradually become a trend.
In addition, privacy and security have always been among the challenges for blockchain applications. Vendors have been trying to integrate encrypted data transmission protocols, encrypted storage protocols, remote attestation, and other technologies into holistic solutions that provide privacy protection strategies for various scenarios while lowering the development threshold. Privacy protection technology will see breakthroughs and validation in multiple scenarios next year, which will further promote the large-scale application of blockchain.
In 2020, blockchain gradually penetrated various vertical industries in China and began to produce a demonstration effect, concentrated in government affairs, people's livelihood, finance, and supply chains. "To G" business (including state-owned enterprises and public institutions) has become the mainstream profit model of China's blockchain industry. Meanwhile, the National Development and Reform Commission explicitly included blockchain in the information infrastructure of the "new infrastructure". Under the influence of COVID-19, enterprises worldwide need to accelerate digital transformation, so blockchain will join other new-infrastructure technologies such as artificial intelligence and 5G in serving enterprises' digital transformation. In 2021 we will also see more "blockchain+" cases land.
5. Concluding remarks
No technological development can be separated from the overall political, economic, and social environment. The increasingly fierce competition in the global technology industry in 2020 and the ever more complicated global political and economic environment have driven home a truth: core technology that is merely purchased cannot be relied on. 2021 is bound to be a crucial year for the accelerated development of independent, controllable technologies.
In the IT industry, independent controllability mainly means relying on one's own R&D and design, fully mastering the core technology of one's products, and going further to achieve full control over the independent development, production, upgrading, and maintenance of information systems from hardware to software, striving for complete domestic substitution across IT infrastructure, basic software, application software, and security products.
With the launch of the second phase of the national IC Fund and the issuance of a series of policy documents encouraging the IT software and hardware industries, state support for independent IT software and hardware technologies has reached an unprecedented level. In 2021, the momentum behind independent and controllable technologies will become even more apparent.