The “race”
On the Saturday of the race weekend, two A2RL vehicles ran a demonstration around the circuit. The vehicles moved quickly down the straight. The corners, though? We were told those were still a bit tricky for the vehicles to navigate.
Down in the pits, the team watched a bank of monitors. Sensor data streamed in from the vehicles: zeros and ones representing the trajectory, translated into a sea of graphs. To make the data quick to read, the system shows a green flag when everything is going as expected and a red flag when the values deviate from what should happen. Beyond how the vehicle drives, information about fuel consumption, brake wear and tire temperature is shared with the team.
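The green/red flag logic described above is, at its core, a range check per telemetry channel. A minimal sketch of that idea, with entirely hypothetical channel names and thresholds (the article does not disclose the team's actual values):

```python
# Hypothetical expected operating windows per telemetry channel
# (illustrative only; not the team's real thresholds).
EXPECTED_RANGES = {
    "tire_temp_c": (70.0, 110.0),
    "brake_wear_pct": (0.0, 80.0),
    "fuel_flow_kg_h": (0.0, 95.0),
}

def flag(sample: dict) -> dict:
    """Return 'green' or 'red' per channel for one telemetry sample."""
    flags = {}
    for channel, (lo, hi) in EXPECTED_RANGES.items():
        value = sample.get(channel)
        # Missing or out-of-range values raise a red flag for that channel.
        flags[channel] = "green" if value is not None and lo <= value <= hi else "red"
    return flags

print(flag({"tire_temp_c": 104.2, "brake_wear_pct": 86.5, "fuel_flow_kg_h": 40.1}))
```

In this sketch, brake wear at 86.5% falls outside its assumed window and would flag red while the other channels stay green.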
All of this data tells the team how hard the vehicle is being pushed. If everything looks good, the team can run the vehicle a little faster, push a little harder for a better lap time. People elsewhere in the pits will soon be telling their human drivers the same thing. Push harder, be faster; the car can handle it. The data coming in predicts what will happen in the next few seconds.
Hopefully.
The individual teams try to find the optimal line, just as a human team would, but the autonomous car's line doesn't always follow what humans have done before on a track. The teams are working on creating an optimal line for the autonomous car rather than simply copying what human drivers do.
This team had been at Suzuka for weeks prior to the race. The HD map they bought from a third party was miles off, so during that time the team had to re-map the track for the vehicles and teach them to drive on a circuit narrower than the Abu Dhabi track.
The car is equipped with Sony 4K cameras, radar, lidar, high-definition GPS and other sensors. The electric steering can handle up to five G's. The hydraulic brakes on each wheel can be activated individually, but according to Pau this feature is not currently used. He noted that enabling it would open up new possibilities, especially when cornering.