Amazed driver shows moment his Tesla automatically swerves to avoid a pig at night – as Elon Musk boasts his vehicles have ‘superhuman’ abilities

  • Pranay Pathole, 22, tagged Elon Musk & tweeted the footage from his dashcam
  • IT worker from India was driving in darkness and the car swerved to avoid a pig 
  • Elon Musk tweeted: ‘Autopilot prime directive is: Don’t crash […] 360 degree low light vision & sonar, plus forward radar enable to be superhuman’
  • Tesla claims its Autopilot system processes various inputs, including data from external cameras and sensors, in ‘milliseconds’ to avoid crashes

An amazed driver has revealed the moment his Tesla automatically swerved to avoid a pig at night, which Elon Musk said shows his vehicles’ ‘superhuman’ abilities.  

Pranay Pathole, 22, an IT worker from Maharashtra, India, tweeted a video of his Tesla automatically veering away from the animal on Sunday.

Tesla says its Autopilot system processes data from ultra-sensitive external cameras and sensors in ‘milliseconds’ to avoid crashes.    

Pathole tweeted the dashcam footage from his electric car, tagging Tesla CEO Elon Musk.  

He said the video proved the Autopilot system’s effectiveness, despite the public’s reservations. 

Pictured: The incredible moment the Tesla detects a pig in the middle of the road at night in India – and swerves to avoid a collision 

Tesla CEO Elon Musk responded to Pranay Pathole’s tweet, saying that Autopilot’s ‘prime directive’ is to avoid collisions 

Musk replied to the tweet shortly after, saying: ‘Autopilot prime directive is: Don’t crash. What seems fast to humans is slow to a computer. 360 degree low light vision & sonar, plus forward radar enable to be superhuman. Upcoming software upgrades will increasingly show the potential.’ 

According to Tesla’s website, Autopilot is built on a ‘deep neural network’, using ‘cameras, ultrasonic sensors and radar’ to sense the environment around the vehicle and prevent it colliding with external objects.  
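Tesla does not publish the internals of that network, but the general idea of fusing several sensor streams into a single avoid-or-continue decision can be sketched in a few lines. The sketch below is purely illustrative: the function names, thresholds and units are assumptions made for this example and do not come from Tesla.

```python
# Toy illustration of fusing camera, radar and ultrasonic readings into
# one avoidance decision. Purely hypothetical: Tesla's Autopilot stack is
# proprietary, and none of these names, thresholds or units come from Tesla.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    camera_obstacle: bool    # vision network flagged an object ahead
    radar_distance_m: float  # forward radar range to the nearest object
    sonar_distance_m: float  # ultrasonic range (short-range, low speed)
    speed_mps: float         # current vehicle speed in metres per second


def plan_manoeuvre(frame: SensorFrame) -> str:
    """Return a (hypothetical) action from fused sensor inputs."""
    # Rough stopping distance assuming ~7 m/s^2 of braking.
    stopping_m = frame.speed_mps ** 2 / (2 * 7.0)
    if frame.camera_obstacle and frame.radar_distance_m < stopping_m:
        return "swerve"  # no room to brake: steer around the object
    if frame.radar_distance_m < 2 * stopping_m or frame.sonar_distance_m < 2.0:
        return "brake"   # enough room: slow down instead
    return "continue"


# A frame resembling the pig incident: an object close ahead at night.
print(plan_manoeuvre(SensorFrame(True, 18.0, 30.0, 20.0)))  # -> swerve
```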

Drivers can disengage Autopilot by pushing up a stalk near the steering wheel or by tapping the brakes. They can also take control of the steering wheel to switch away from Autopilot. 

In a survey reported by Bloomberg, 13 per cent of new Autopilot drivers said the system had put them in a dangerous situation, but 28 per cent said it had saved their lives.  

Tesla CEO Elon Musk, pictured, said: ‘360 degree low light vision & sonar, plus forward radar enable [the Tesla] to be superhuman’

Another camera shows the pig as it is detected by Tesla’s Autopilot system on a road in India 

In the same survey, 61 per cent of respondents said they were ‘very satisfied’ with Autopilot’s safety and 42 per cent felt ‘somewhat satisfied’ with the reliability of Navigate on Autopilot.

In March, a North Carolina man shared footage of the moment he avoided a head-on collision while driving his 2020 Tesla Model S.

Video of the incident shows the Tesla driving down a motorway in early morning darkness when a car travelling the wrong way at 60mph comes barrelling towards him.

Suddenly, the Tesla swerves to the right to avoid danger and the other driver quickly swerves onto the grassy median.   

Other reports, however, suggest Autopilot can fail to detect impending collisions.

Earlier this month, footage surfaced of a Tesla on Autopilot mode in Taiwan crashing into a turned-over truck in the middle of a motorway.   

The bonnet of the Tesla went straight through the roof of the lorry after it flipped over 

Reports say the sensors did not recognise the truck lying in the lane, and the car only applied its emergency brakes at the last minute, resulting in the Tesla piercing the roof of the vehicle.

The force of the impact was so great that the truck shook when the Tesla smashed into it, reports SETN, a local Taiwanese news source.

‘No drunk driving was involved, the relevant statements have been completed, and the two parties now have to deal with the subsequent compensation,’ SETN reports (translated).

The local media source also noted that the driver said the driver-assistance system was activated but that the car was not in a fully self-driving mode.

Neither driver was injured in the collision.

HOW DO SELF-DRIVING CARS ‘SEE’?

Self-driving cars often use a combination of normal two-dimensional cameras and depth-sensing ‘LiDAR’ units to recognise the world around them.

However, others make use of visible light cameras that capture imagery of the roads and streets. 

They are trained on vast databases of hundreds of thousands of clips, which are processed using artificial intelligence to accurately identify people, signs and hazards.

In LiDAR (light detection and ranging) scanning – which is used by Waymo – one or more lasers send out short pulses, which bounce back when they hit an obstacle.

These sensors constantly scan the surrounding areas looking for information, acting as the ‘eyes’ of the car.

While the units supply depth information, their low resolution makes it hard to detect small, faraway objects without help from a normal camera linked to them in real time.
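The ranging behind each pulse is simple time-of-flight arithmetic: the distance to the obstacle is the speed of light multiplied by half the round-trip time. A minimal sketch of that calculation (the 200-nanosecond echo time is illustrative, not a reading from any real unit):

```python
# Time-of-flight ranging: distance = speed of light * round-trip time / 2.
# The echo time below is illustrative, not a reading from any real unit.

C = 299_792_458.0  # speed of light in metres per second


def lidar_range_m(round_trip_s: float) -> float:
    """Distance to the obstacle that reflected the pulse, in metres."""
    return C * round_trip_s / 2.0


# A pulse that returns after 200 nanoseconds hit something ~30m away.
print(f"{lidar_range_m(200e-9):.1f} m")  # prints: 30.0 m
```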

In November last year Apple revealed details of its driverless car system that uses lasers to detect pedestrians and cyclists from a distance.

The Apple researchers said they were able to get ‘highly encouraging results’ in spotting pedestrians and cyclists with just LiDAR data.

They also wrote they were able to beat other approaches for detecting three-dimensional objects that use only LiDAR.

Other self-driving cars generally rely on a combination of cameras, sensors and lasers. 

An example is Volvo’s self-driving cars, which rely on around 28 cameras, sensors and lasers.

A network of computers processes this information which, together with GPS, generates a real-time map of moving and stationary objects in the environment.

Twelve ultrasonic sensors around the car are used to identify objects close to the vehicle and support autonomous drive at low speeds.

A wave radar and camera placed on the windscreen read traffic signs and the road’s curvature, and can detect objects on the road such as other road users.

Four radars behind the front and rear bumpers also locate objects.

Two long-range radars on the bumper are used to detect fast-moving vehicles approaching from far behind, which is useful on motorways.

Four cameras – two on the wing mirrors, one on the grille and one on the rear bumper – monitor objects in close proximity to the vehicle and lane markings.