A vehicle compute platform processes data inside the car, makes decisions in real time, and keeps safety functions active without depending on cloud connectivity.

Rivian’s third-generation autonomy platform is built around a custom chip called the Rivian Autonomy Processor (RAP1), developed with Arm. The processor is designed to handle the high compute demands and strict safety requirements of autonomous driving and “physical AI,” where vehicles interact with the real world in real time. By keeping heavy compute on board, the system enables fast decision-making without relying on cloud connectivity.
At the core of this platform is RAP1’s architecture, based on Armv9. It uses Arm Cortex-A720AE CPU cores for high-level computing and coordination of advanced AI tasks. Alongside these, specialised processors manage real-time monitoring and fault handling. The system is designed to process large volumes of data while maintaining energy efficiency, so vehicle range is not heavily impacted.
The platform supports real-time decision-making by using an edge compute approach. Instead of sending data to the cloud, sensor analysis, predictive modelling, and vehicle control functions run locally within milliseconds. This reduces latency and ensures consistent performance regardless of network conditions.
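This sense-decide-act cycle with a hard latency budget can be sketched as follows. This is a minimal illustration, not Rivian's software: the `local_inference` model and the 10 ms deadline are hypothetical stand-ins for the on-board perception stack and its real-time budget.

```python
import time

def local_inference(sensor_frame):
    # Hypothetical placeholder for an on-board perception model;
    # in a real vehicle this would run on dedicated AI cores.
    return {"obstacle": max(sensor_frame) > 0.8}

def control_step(sensor_frame, deadline_ms=10.0):
    """Run one sense -> decide -> act cycle entirely on-vehicle.

    There is no network round trip: latency depends only on local
    compute, so behaviour is consistent regardless of connectivity.
    """
    start = time.monotonic()
    decision = local_inference(sensor_frame)
    command = "brake" if decision["obstacle"] else "cruise"
    elapsed_ms = (time.monotonic() - start) * 1000.0
    # Flag any cycle that exceeds the real-time budget.
    return command, elapsed_ms <= deadline_ms

cmd, on_time = control_step([0.1, 0.9, 0.3])
```

The key property is that every input the decision needs is already on the vehicle, so the worst-case loop time is bounded by compute, not by a network.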
Reliability is built into the compute pipeline. Independent processing blocks are physically isolated from the main AI cores. This separation ensures that even if the primary system encounters a fault, critical safety operations continue without interruption. This is important for vehicles operating in unpredictable public environments.
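In software terms, the pattern resembles a primary path with an independent minimal-risk fallback. The sketch below is an assumption-laden analogy (the function names and behaviours are invented): in RAP1 the isolation is physical, between processing blocks, rather than a try/except in one process.

```python
def primary_ai(frame):
    # Hypothetical primary perception/planning stack; may fault.
    if frame is None:
        raise RuntimeError("sensor dropout")
    return "plan: follow lane"

def safety_fallback(frame):
    # Independent minimal-risk behaviour; shares no state or
    # hardware with the primary path, so it survives its faults.
    return "plan: controlled stop"

def dispatch(frame):
    """Always produce a safety-critical output: if the primary
    block faults, the isolated fallback path takes over."""
    try:
        return primary_ai(frame)
    except Exception:
        return safety_fallback(frame)
```

The design goal the analogy captures: no single fault in the high-performance path can take the safety path down with it.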
Efficiency remains a key factor. The Arm-based design improves instructions per watt, which directly affects real-world vehicle performance. Lower power consumption allows continuous AI processing without significantly reducing battery range.
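Why compute power matters for range can be shown with rough arithmetic. All figures below are illustrative assumptions, not Rivian specifications, and the model is a first-order approximation (it charges compute energy over the unreduced trip time).

```python
def range_impact(battery_kwh, consumption_wh_per_km,
                 compute_watts, avg_speed_kmh):
    """Estimate how a continuous compute load reduces driving range.

    Illustrative numbers only; first-order approximation.
    """
    base_range_km = battery_kwh * 1000 / consumption_wh_per_km
    # Energy the compute platform draws over the whole drive.
    drive_hours = base_range_km / avg_speed_kmh
    compute_kwh = compute_watts * drive_hours / 1000
    reduced_range_km = ((battery_kwh - compute_kwh) * 1000
                        / consumption_wh_per_km)
    return base_range_km, reduced_range_km

# Assumed: 100 kWh pack, 200 Wh/km, 100 W compute load, 80 km/h.
base, reduced = range_impact(100, 200, 100, 80)
```

Under these assumed figures a 100 W platform costs well under 1% of range, while a load an order of magnitude higher would become noticeable, which is why instructions per watt is treated as a first-class design metric.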
The platform is also modular. Rivian can reuse the same base architecture across future vehicle models while adapting capabilities through software as AI systems evolve. This provides flexibility without requiring major hardware redesigns.
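One common way to get this kind of modularity is a fixed hardware base with capabilities registered and replaced in software. The class and feature names below are hypothetical, chosen only to illustrate the pattern.

```python
class VehiclePlatform:
    """Hypothetical sketch: one hardware base, capabilities
    delivered and updated as software."""

    def __init__(self, hardware="RAP1"):
        self.hardware = hardware
        self.features = {}

    def install(self, name, handler):
        # New or improved capabilities arrive as software
        # updates; the hardware base stays the same.
        self.features[name] = handler

    def run(self, name, *args):
        return self.features[name](*args)

truck = VehiclePlatform()
# Assumed example feature: a trivial proportional steering correction.
truck.install("lane_keep", lambda offset: -0.5 * offset)
correction = truck.run("lane_keep", 0.2)
```

The same registry could back several vehicle models, with each model's feature set defined by what is installed rather than by distinct hardware.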
While the current platform supports driver-assist features, it is designed to scale toward higher levels of autonomy. The same compute foundation could also support future systems that rely on advanced perception and real-time interaction with the physical world.

