At around 9 a.m. on August 20, the much-anticipated “Tesla AI Day” finally began.
Tesla CEO Elon Musk, AI director Andrej Karpathy, and Autopilot software director Ashok Elluswamy were among the executives who presented Tesla’s work in AI, especially its thinking and progress on self-driving technology.
The event was praised by industry insiders as “packed with substance”, and since the speakers were “all new faces except Andrej and Elon”, it felt refreshing.
In addition to the Dojo supercomputer, which the industry had predicted in advance, and the D1, the self-developed AI chip at its core, there was also a full humanoid robot, the Tesla Bot.
For non-professionals, the launch was not “friendly”: a flood of technical terms and hardware specifications, complex training modules and algorithm mechanisms, and the principles behind the pure-vision driver-assistance approach left many viewers feeling overwhelmed.
But the event itself was not aimed at consumers. Musk had previously said AI Day was primarily a recruiting event, intended to “find and attract talent to help the company improve Autopilot and FSD and take its fully autonomous driving software to the next level.”
After AI Day ended, Tesla China’s official recruitment account on WeChat posted a related job listing that afternoon, saying, “Join Tesla’s R&D team and open up more of AI’s possibilities.”
Behind this is the rapid development of the AI industry in recent years: the field has long faced a large talent gap, and the technology’s application potential is far from fully realized. Even Tesla, which is relatively ahead in industrialization, needs to build a stronger AI team to help products such as its self-driving system mature further and scale up.
From another perspective, although Tesla’s autonomous driving technology is relatively leading in the industry, it is still far from mature and remains at the stage of assisted driving requiring human supervision. Recently, defects in its driver-assistance features have drawn wider attention, and Tesla needs to improve that experience as soon as possible.
Tesla flexes its muscles
The day had many highlights, including the Tesla Bot, the self-developed D1 chip, and the Dojo supercomputer. Chief among them was the progress of Tesla’s autonomous driving technology.
Unlike most of the industry’s autonomous driving solutions, Tesla has stuck to a purely visual approach, using cameras to sense the environment without lidar or high-definition maps. The hardware cost of this approach is relatively low, but the demands on data and algorithms are higher.
Andrej Karpathy explained the progress of Tesla’s pure-vision FSD approach, which uses eight cameras wrapped around the body for 360° coverage to capture road information, and multi-task neural networks to stitch the different images together. To make this mosaic more realistic and informative, Tesla has developed technology that uses the information from the eight cameras to create a 3D bird’s-eye view of the vehicle’s surroundings.
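The multi-camera fusion described above can be illustrated with a toy sketch. This is not Tesla’s actual network: the camera names, grid size, stand-in “detector”, and fuse-by-maximum rule are all invented for the example; a real system would project image features into the bird’s-eye-view grid using learned networks and camera calibration.

```python
# Illustrative sketch only: fuse eight camera views into one shared
# top-down ("bird's-eye-view") grid. All names and rules here are
# assumptions for the example, not Tesla's design.

CAMERAS = ["front", "front_left", "front_right", "left_pillar",
           "right_pillar", "left_repeater", "right_repeater", "rear"]
GRID = 8  # the shared bird's-eye-view grid is GRID x GRID cells

def detect(camera: str) -> list[tuple[int, int, float]]:
    """Stand-in for a per-camera neural network: returns (row, col,
    confidence) detections already projected into BEV coordinates.
    A real system would do this projection with camera calibration."""
    seed = sum(ord(c) for c in camera)  # deterministic fake detections
    return [((seed + i) % GRID, (seed * i) % GRID, (seed % 7) / 10 + 0.3)
            for i in range(3)]

def fuse(per_camera: dict[str, list[tuple[int, int, float]]]) -> list[list[float]]:
    """Fuse all views cell-wise: keep the highest confidence seen for
    each BEV cell, so overlapping cameras reinforce each other."""
    bev = [[0.0] * GRID for _ in range(GRID)]
    for detections in per_camera.values():
        for row, col, conf in detections:
            bev[row][col] = max(bev[row][col], conf)
    return bev

bev = fuse({cam: detect(cam) for cam in CAMERAS})
print(sum(cell > 0 for row in bev for cell in row), "occupied BEV cells")
```

The key idea the sketch preserves is that each camera contributes evidence to one shared top-down representation, which downstream planning can consume instead of eight separate images.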
Hardware perception combined with AI deep learning and software data processing to form correct decisions and plans is the general path to realizing autonomous driving. In this whole process, the most fundamental part is perception: the hardware replaces the human eye to “perceive”, and after a series of complex algorithms process the data, decisions and judgments are made.
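The perceive-then-decide loop above can be sketched minimally. Everything here is hypothetical: the object classes, distances, and the crude two-second braking rule are invented for illustration and are not Tesla’s logic.

```python
# Hypothetical sketch of a perceive -> process -> decide loop.
# Classes, distances, and the braking rule are invented for illustration.

from dataclasses import dataclass

@dataclass
class Detection:
    kind: str        # e.g. "car", "cone", "lane_line"
    distance_m: float

def perceive(frame) -> list[Detection]:
    """Stand-in for the camera + neural-network perception stage."""
    return [Detection(kind, dist) for kind, dist in frame]

def decide(detections: list[Detection], speed_mps: float) -> str:
    """Toy decision stage: brake if anything is inside the stopping
    distance, otherwise keep cruising."""
    stopping_distance = speed_mps * 2.0  # crude two-second rule
    for det in detections:
        if det.distance_m < stopping_distance:
            return "brake"
    return "cruise"

frame = [("car", 80.0), ("cone", 15.0)]
print(decide(perceive(frame), speed_mps=20.0))  # cone within 40 m -> "brake"
```

The point of the sketch is the division of labor: perception turns raw sensor frames into structured objects, and the decision stage reasons only over those objects, never the raw pixels.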
As the amount of data to be processed began to grow exponentially, Tesla also had to improve the computing power available for training its neural networks, hence Dojo. Musk had teased the existence of the Dojo supercomputer several times before, but what was even more notable this AI Day was its key component: Tesla’s self-developed neural-network training chip, the D1.
According to the presentation, the D1 chip uses a distributed architecture and a 7-nanometer process, packing 50 billion transistors and 354 training nodes; its internal wiring alone runs 17.7 kilometers, enabling very high computing power and very high bandwidth.
The Dojo supercomputer’s training module is made up of 1,500 D1 chips. Because the chips are seamlessly connected and the latency between adjacent chips is extremely low, the training module maximizes bandwidth. With Tesla’s own high-bandwidth, low-latency connectors, the training module achieves up to 9 PFLOPS of computing power.
Musk has said he will eventually make the Dojo supercomputer available to other companies that want to use it to train neural networks, suggesting that Tesla could extend its AI applications beyond autonomous driving; the robot shown at AI Day partly confirmed that possibility.
The Tesla Bot is 1.72 meters tall and weighs 56.6 kilograms, with a screen on its face for displaying information, human-level hands, and feedback systems for balance and agility; it will use the Dojo supercomputer’s training mechanisms to improve its capabilities.
“There will be no shortage of work in the future, but manual labor will be just an option,” Musk said, adding that the Tesla Bot can take on dangerous, repetitive, and boring tasks. The project is already under way, with the first Tesla Bot prototype scheduled to launch next year.
AI talent is in desperately short supply
Tesla is flexing its muscles to attract more AI professionals.
Although Tesla is in some sense the “leader” of the self-driving industry, it is still a long way from true driverless operation, and even the driver-assistance features it has already shipped remain imperfect.
Recently, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Tesla. NHTSA is examining 11 crashes involving Autopilot or Tesla’s other driver-assistance features, seven of which resulted in injuries, with 17 people injured and one killed in total.
Specifically, the incidents occurred between January 22, 2018 and July 10, 2021, across nine states. Most happened in the evening at accident scenes marked with objects such as emergency lights, flares, illuminated arrow boards, and road cones, where responders had stopped their vehicles for rescue work. Judging from the scenes, Tesla’s driver-assistance system failed to recognize these objects and the stopped vehicles, leading to collisions.
For Tesla, these feature defects must be fixed as soon as possible, which will require further expanding its AI team. By showing its AI technology path and reserves in such detail this time, Tesla is clearly extending an “olive branch” to like-minded industry professionals.
From an industry-wide perspective, the global AI sector has developed rapidly in recent years: the number of AI companies and the scale of their financing have grown quickly, and competition for AI talent has become increasingly fierce.
According to a report by EOU Think Tank, the number of newly founded AI companies worldwide peaked in 2016 over the past decade. Even after 2017, China and the United States still led other countries by a wide margin in absolute numbers, making them the leading regions for AI companies. On the financing side, total funding for AI companies worldwide has kept growing, rising geometrically after 2016; by 2018 it had reached 78.48 billion US dollars, with the United States in the lead.
AI talent, meanwhile, has long been in short supply. Several research institutes have reported a huge gap in AI jobs worldwide. According to UiPath’s AI Jobs report, there were 7,465 vacancies in the U.S. in 2018; the 2019 Global AI Talent Report released by JF Gagné likewise shows that the global AI talent pool is growing but demand still exceeds supply, while the latest Global AI Talent Report 2020 shows that demand for “new roles” remained stable despite a decline over the past year.
A newer report from JF Gagné points out that talent remains a bottleneck for AI, and that the industry needs more than software algorithms alone. “Whether the full potential of AI has been overhyped is open to debate, but we can say that truly successful AI requires more than high-level experts and the right data algorithms. The AI industry initially focused on very senior experts, because only they could manage the new technology and apply it to new areas. But there is now recognition that this new technology requires more than just engineers and people who can build good models to deploy it effectively.”
The report further explains that AI is a new generation of software, one programmed with data rather than logical rules. Unlike traditional software, which is static, AI needs a new ecosystem of infrastructure that must not only be built but also governed after deployment. For AI to function at scale, fields such as engineering, infrastructure, new business-model development, and ongoing monitoring all require large amounts of new talent.