An engineer has shown how the augmented vision mode inside the developer mode of Tesla's operating system looks and works, giving us a glimpse of how the car's Autopilot algorithms operate. Autonomous cars are the future, but not in the short term, at least in Europe, where road safety regulations are much tougher than in countries like the United States.
Tesla has a head start of many years over its competitors: besides being one of the main players in the electric car market, its autonomous driving tests are also well advanced. While today's electric cars offer driving assistance, a fully autonomous car will be able to complete an entire journey with virtually no human interaction.
But the operating system that Tesla's cars run on, including its Autopilot function, has a developer mode that, although out of reach for most users, can be activated through a series of code changes.
Well, the engineer known as greentheonly on Twitter, taking advantage of the developer mode of the Tesla operating system, specifically that of the Autopilot, has found an "augmented vision mode" option.
This mode lets developers work on the operating system much more efficiently, since the system's information is superimposed directly on the camera feed shown on screen.
The result is an interface very similar to augmented reality: on a screen we can observe what the vehicle's cameras show, overlaid with the system's recognition data. Labels and markers are drawn on elements detected on the road, such as the roadway itself, the edges of the lane, or traffic signs.
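To make the idea concrete, here is a minimal sketch of how recognition data could in principle be overlaid on a camera frame. Everything in it (the `Detection` structure, the labels, the ASCII renderer) is invented for illustration; it is not Tesla's actual code, data format, or rendering pipeline.

```python
# Hypothetical illustration: overlay detection labels on a camera frame.
# A real system would draw boxes on video frames; here each character cell
# stands in for a block of pixels so the example stays self-contained.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str  # e.g. "lane_edge", "stop_sign" (labels are assumptions)
    x: int      # top-left corner of the bounding box, in pixels
    y: int
    w: int      # box width, in pixels
    h: int      # box height, in pixels


def render_overlay(frame_w, frame_h, detections, cell=10):
    """Render a coarse ASCII 'overlay' of detections on a blank frame.

    Each character represents a cell x cell pixel block; cells covered by a
    detection's bounding box get the first letter of its label.
    """
    cols, rows = frame_w // cell, frame_h // cell
    grid = [["." for _ in range(cols)] for _ in range(rows)]
    for d in detections:
        for r in range(d.y // cell, min((d.y + d.h) // cell + 1, rows)):
            for c in range(d.x // cell, min((d.x + d.w) // cell + 1, cols)):
                grid[r][c] = d.label[0].upper()
    return ["".join(row) for row in grid]


detections = [
    Detection("lane_edge", x=0, y=50, w=30, h=40),
    Detection("stop_sign", x=70, y=10, w=20, h=20),
]
for line in render_overlay(100, 100, detections):
    print(line)
```

Running the sketch prints a 10x10 character grid in which the lower-left region is marked `L` (the lane edge) and the upper-right region `S` (the stop sign), mimicking, very roughly, how recognized objects are highlighted on top of the camera view.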
This view reveals how the operating system, through the vehicle's cameras and a series of algorithms, understands what is in front of it. That information helps developers make changes to the system, changes an ordinary user should never attempt, since doing so would mean modifying the operating system without the manufacturer's permission.