"What's the second technology?" Boss Huang asked Tong He with great interest after dealing with Da Qian Eyewear.
"It's this, infrared projection keyboard technology!" Tong He quickly introduced the next technology under research.
"Infrared projection keyboard? I remember this technology isn't exactly new, it existed before the 21st century, right?" Boss Wang frowned and said, "And this technology isn't suitable for terminals. Are you trying to fool me with this?"
The infrared projection keyboard was a seemingly very impressive technology: it projected a keyboard onto a tabletop by shining infrared rays through a lens.
Users only needed to place their fingers on this keyboard of light and tap directly on the tabletop to input text and letters, just as if they were typing on a real keyboard. Visually, the effect was stunning, like a keyboard conjured out of thin air.
Moreover, adding such an infrared keyboard function only required fitting a special projection bulb into the mobile terminal, so it could be integrated easily. Unlike the Da Qian Eyewear technology introduced earlier, it needed no other auxiliary equipment, which made it seem quite practical.
However, this seemingly impressive technology had actually been invented a long time ago. In the 1980s, someone had already produced a simplified version of the infrared projection keyboard.
At its core, the technology worked by having the terminal detect which infrared rays the user's fingers blocked, recognize the corresponding letters, and complete the input. Technically, there was nothing particularly high-end about it, and related prototypes were easy to build.
The problem was that while it was simple to make, miniaturizing it and, crucially, improving its recognition accuracy were extremely difficult.
After all, this was a technology that registered input by having fingers block infrared rays, and misreads were all too easy, resulting in incorrect input.
Chinese input was manageable, but with English it was hard to spot spelling errors; a whole article could end up riddled with typos, absurd and laughable.
Beyond the difficulty of improving accuracy, the infrared projection keyboard had another fatal flaw: it required a desktop surface, and tapping directly on a desktop provided no tactile feedback at all.
Even with physical keyboards, many users complained about insufficient feedback and favored mechanical keyboards, which offered a crisper, more satisfying feel; a good mechanical keyboard could sell for thousands of yuan.
However, the infrared projection keyboard essentially regressed the typing experience to its most primitive state, as tapping your fingers directly on a desktop would never feel good.
Most importantly, you needed a desktop on which you could rest your hands steadily, and most of the time when using a terminal, no such suitable surface was at hand.
In short, Boss Huang was quite displeased when Tong He introduced this technology to him: the technology was cool, but its impracticality was just as evident.
Then again, thinking about how many Android phones in the future would fight desperately to improve screen-to-body ratio, expending immense effort just to gain one more percent of screen real estate, it almost seemed better to equip a phone with such a cool-looking infrared projection keyboard instead.
"Boss, infrared projection keyboard technology did emerge decades ago. Of course, we wouldn't use such technology to prove ourselves; we don't have that kind of face!" Tong He said with a smile. "This technology was initiated a year ago, but Teacher Zhao Chunlei, who is in charge of this project, is extremely capable. He immediately recognized the problems with this technology in practical applications. So he proposed an idea and made a small change to the technology."
"What change?" Boss Huang asked curiously.
"It's very simple, it's changing the infrared sensing technology to infrared radar camera technology!" Tong He said.
"Infrared radar camera technology?" Boss Huang was taken aback. "Isn't that the technology our research center is developing for electric vehicles?"
That's right, Jiangnan Group was indeed developing electric vehicle technology at the time, and from the outset it had been aiming at autonomous driving. The core of autonomous driving technology wasn't AI or the design of driving logic, but how to continuously grasp the complete road situation.
Designing the driving logic wasn't difficult; it was merely a process of aggregating logic and machine learning. With the caliber of programmers at Jiangnan Group, a complete program could be developed within a year.
The most difficult part of autonomous driving technology was actually data collection, that is, building a detailed picture of the entire road situation.
The logic of autonomous driving for electric vehicles ran roughly as follows: collect comprehensive road information, then decide, based on that information, how to drive next without hitting anything or violating traffic rules, while accurately recognizing the various traffic signs along the way. Gathering that information was truly the hardest part of autonomous driving technology.
The various autonomous driving accidents that would occur in later years showed that most of them were caused by incomplete collection of road information.
For example, there was an accident in the US in which a Tesla on autonomous driving crashed straight into a guardrail. According to the later investigation, it had rained earlier, leaving water droplets on the guardrail that reflected light. The reflection caused the camera installed on the Tesla to misjudge the scene as mere fog ahead, so the car drove straight in, causing a serious accident.
Then there was another accident in which a Tesla hit a little girl. It was the same kind of problem: the girl was wearing clothes in red and yellow, with the colors separated at the shoulders, which led the Tesla to judge that she was a traffic cone on the side of the road.
And because a car was approaching from the opposite direction, the Tesla's autonomous driving system decided, after evaluation, to steer as far left as possible, even if that meant hitting the "traffic cone." Compared with a head-on collision between two vehicles, hitting a small traffic cone was no great matter.
And then the accident happened.
These accident cases fully demonstrated that for autonomous driving technology to surpass human driving, the key lay in collecting a large amount of road condition information so that the computer could make the most reasonable judgments.
Initially, road information was collected using high-definition cameras. These cameras continuously captured road scenes, and then AI analyzed the objects in the images to determine their meaning and derive corresponding road information.
However, the images captured by high-definition cameras were ultimately flat, and even the most high-definition cameras could never be high-definition enough.
It wasn't that electric vehicle makers like Tesla couldn't afford higher-definition cameras. The higher the resolution, the higher the pixel count of the images and videos and the larger the files became, and such a massive amount of data demanded far more computing power to process; the computing chips installed in electric vehicles were nowhere near capable of handling it.
Therefore, under Boss Huang's instructions, the autonomous driving technology team for electric vehicles at Jiangnan Group directly abandoned the idea of using cameras and instead began designing according to the industry's most advanced infrared radar camera concept.
As the name suggests, infrared radar cameras combine the characteristics of infrared rays, radar, and cameras.
The entire working process is as follows: the camera first emits a large number of infrared rays. These rays not only help the camera form an image but also, like radar pulses, continuously bounce off surrounding objects, producing echoes that return to the camera's receiver.
By receiving these echoes, the receiver can obtain a large amount of positional data of objects around the car.
Since there are unlikely to be any military-grade stealth coatings out on the road, as long as radar technology is used, no solid object can escape detection. Their positions can be transmitted immediately to the computing chip, which then plans the driving route.
Infrared rays, meanwhile, can capture images of quite good quality. The computing chip analyzes the captured photos and videos to determine what the obstacles detected by the radar actually are, and then makes the most reasonable driving judgment.
At the same time, when infrared rays are projected onto objects of different colors, the rays that return also vary. From these variations, the actual colors of the objects can be determined directly, raising the car's ability to recognize traffic lights to its peak.
In summary, as long as the infrared radar camera is successfully developed and installed on electric vehicles, electric vehicles will be equipped with eyes much sharper than the human eye, and barring some extreme cases, there will be virtually no misjudgment of road conditions.
Therefore, the entire electric vehicle project team had been working hard to develop this infrared radar camera technology. Unexpectedly, it had now been taken up by the terminal project team as well.
"Yes, autonomous driving requires very precise data, so they are still continuously refining their infrared radar camera. But our requirements are very simple; we only need to recognize finger movements. So even the most basic version they develop is sufficient for our project!" Zhao Chunlei said directly.