Armored fighting vehicles face other challenges when operating in urban combat. They are exposed to snipers and anti-tank teams operating from elevated rooftop positions or from underground shafts, often too close for the crew to respond in time. In these conditions, technology must augment situational awareness, allowing the crew or defensive systems to focus on the most relevant threat. These capabilities are already operational with the Israel Defense Forces (IDF).
360° armor protection is a must, and several companies are addressing this requirement with new protection systems. Active protection systems (APS), such as Rafael’s Trophy and Elbit Systems’ Iron Fist, combine radar and electro-optical (EO) sensors, high-performance processors, and different effectors, ranging from explosively formed projectiles and blast warheads to lasers, to destroy incoming projectiles. The latest enhancements include dual-sensor capability (radar + EO), and the developers of both systems are considering top-attack engagement. However, an APS adds considerable weight to the vehicle and requires substantial base armor to function optimally.
Other systems are passive, like the Hedgehog top-side and flexFence counter-RPG armor, both offered by Plasan as additions to the base armor. These ‘statistical protection’ measures can reduce the probability of penetration by up to 80 percent compared to the base armor alone.
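To see what an "up to 80 percent" reduction means in practice, the arithmetic below combines a hypothetical base-armor penetration probability with the add-on kit's claimed reduction. The figures are illustrative assumptions, not published performance data for any Plasan product.

```python
# Illustrative arithmetic only: all probabilities below are hypothetical.
p_base = 0.50          # assumed chance a projectile defeats the base armor alone
reduction = 0.80       # the up-to-80% reduction claimed for the add-on kit

# The add-on scales the residual penetration probability down by the reduction.
p_with_addon = p_base * (1 - reduction)
print(f"Penetration probability: {p_base:.0%} -> {p_with_addon:.0%}")
```

The point of 'statistical' protection is exactly this multiplicative effect: it does not guarantee defeat of any single round, but it sharply lowers the fraction of hits that penetrate.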
Another essential is peripheral vision. Unfortunately, transparent armor provides this capability only at a substantial weight penalty. Nevertheless, modern vehicles require transparent armor as part of the base armor to support situational awareness, driving, and crew performance. Such systems are provided by OSG.
Imco, a supplier of electrical and electronic systems for AFVs, offers a Situational Awareness Video System (SAVS AI) that provides 360° surveillance and protection for combat vehicles and transforms situational awareness and decision-making capabilities. The system integrates cameras, sensors, an advanced video matrix, an AI application for real-time sensor data analysis, and multiple user displays, enabling commanders and crew members to maintain complete situational awareness in all combat situations.
Some of these combat vehicles also employ the Vehicle Control Module (VCM) system produced by the company. The system collects data from onboard systems, including the engine, transmission, and tracks, interprets it, and provides an overview of the vehicle’s status, ensuring optimal performance and reducing wear and downtime by recommending proactive maintenance activities.
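The data flow just described can be sketched as a simple health-monitoring loop. This is a minimal illustration of the concept, not Imco's actual VCM; the channel names and nominal ranges are invented for the example.

```python
# Hypothetical nominal operating ranges per monitored channel.
NOMINAL = {
    "engine_oil_temp_c": (70, 110),
    "transmission_pressure_bar": (12, 18),
    "track_tension_kn": (20, 30),
}

def vehicle_status(readings: dict) -> list:
    """Compare readings against nominal ranges and recommend maintenance."""
    recommendations = []
    for channel, value in readings.items():
        low, high = NOMINAL[channel]
        if not low <= value <= high:
            recommendations.append(
                f"{channel}={value} outside nominal [{low}, {high}]: schedule inspection"
            )
    return recommendations

# Example cycle: the engine is running hot, everything else is nominal.
alerts = vehicle_status({
    "engine_oil_temp_c": 118,
    "transmission_pressure_bar": 15,
    "track_tension_kn": 24,
})
for alert in alerts:
    print(alert)
```

A fielded system would additionally track trends over time rather than single readings, so that wear is flagged before a hard threshold is crossed.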
Such powerful processing systems require data communications operating at low latency and high bandwidth to ensure real-time operation. Digital backbone (DBB) services, such as AITECH’s DBB, provide this connectivity and are implemented alongside active protection systems, advanced mission computers, and rapid sensor and data processing. The DBB seamlessly integrates high-speed, reliable, and secure connectivity between electronic systems. Time-sensitive networking (TSN) and Layer 3/Ethernet connectivity let data pass easily between endpoints and networks, making the backbone suitable for critical, time-sensitive traffic with low to medium latency. It is a modular solution aligned with the SOSA technical standard and the Future Airborne Capability Environment (FACE) architecture. Among the products are GPGPU-based systems leveraging the latest NVIDIA processors, facilitating AI and edge-processing applications.
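The core idea behind TSN-style traffic classes can be illustrated with a toy priority queue: time-critical frames (such as an APS sensor cue) are always dequeued ahead of bulk traffic (such as recorded video), bounding their queueing latency. This is a conceptual sketch only, unrelated to AITECH's implementation; the traffic classes and payloads are invented.

```python
import heapq

# Hypothetical traffic classes: lower number = higher priority.
APS_CUE, BMS_UPDATE, VIDEO_BULK = 0, 1, 2

class Backbone:
    """Toy prioritized message bus illustrating TSN-style traffic classes."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # preserves FIFO order within the same class

    def send(self, priority: int, payload: str) -> None:
        heapq.heappush(self._queue, (priority, self._seq, payload))
        self._seq += 1

    def deliver(self) -> str:
        return heapq.heappop(self._queue)[2]

bus = Backbone()
bus.send(VIDEO_BULK, "video frame 1")
bus.send(APS_CUE, "radar track: incoming RPG")
bus.send(BMS_UPDATE, "position report")
order = [bus.deliver() for _ in range(3)]
print(order)  # the APS cue is delivered first despite arriving second
```

Real TSN goes much further (time synchronization, scheduled traffic windows, frame preemption at the Ethernet layer), but the class-based ordering above is the behavior that matters to an APS engagement loop.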
Peripheral vision has taken on dramatic new meaning in recent combat. Edge 360, a peripheral vision system developed by Axon Vision, has already been implemented in several AFVs. The system enables the vehicle commander to operate continuously with closed hatches while providing complete close-range situational awareness in a single view. Leveraging machine vision, its simplified user interface reduces cognitive load and the attention required while maximizing the vehicle’s lethality and survivability. Edge 360 automatically detects, recognizes, and tracks static and dynamic targets, including action recognition. The modular system combines AI clusters (day/night camera, GPU, and AI algorithm) with a central processing unit, distributing real-time video at very low latency over a decentralized embedded-GPU architecture. The system also provides distance estimation using EO and thermal imaging (TI) sensors.
Another company providing peripheral vision systems is Maris Tech. Its system integrates five peripheral cameras, ensuring no blind spots. All cameras are monitored simultaneously and in real time by a compact, low-power, high-performance edge AI processor from Hailo. The processor analyzes the video streams with AI to detect nearby objects and suspicious actions, alerting the crew to potential threats in their surroundings so they can act against them. All images are displayed on a central screen, which can also be integrated into the platform’s network. The system provides powerful threat detection, alerts, and precise responses to dismounted threats; it is suitable for urban combat and can be integrated into a wide range of platforms.
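The multi-camera alert loop described above can be sketched in a few lines. The detector stub below stands in for the Hailo-accelerated neural network, and the camera names and object classes are invented for illustration; this is not Maris Tech's software.

```python
# Hypothetical camera layout covering the vehicle's perimeter.
CAMERAS = ["front", "front-left", "front-right", "rear-left", "rear-right"]

def detect(frame: str) -> list:
    """Stub detector: a real system runs a neural network on each frame."""
    return ["person"] if "person" in frame else []

def scan(frames: dict) -> list:
    """Fuse per-camera detections into crew alerts for the central display."""
    alerts = []
    for cam in CAMERAS:
        for obj in detect(frames.get(cam, "")):
            alerts.append(f"ALERT [{cam}]: {obj} detected")
    return alerts

# One scan cycle: a dismounted threat appears on the rear-left camera.
alerts = scan({
    "front": "empty street",
    "rear-left": "person near wall",
})
print(alerts)
```

The design point is that detection runs at the edge, per camera stream, so only compact alert messages (not raw video) need to cross the platform's network.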
These powerful AI edge-processing capabilities extend system services beyond peripheral situational awareness. For example, Axon Vision’s EdgeSight and AI-NGCV SmartScopes empower commanders and gunners of combat vehicles; the Axon Vision open-architecture software SDK connects and integrates the sight with existing communication solutions such as battle management systems (BMS).
AI functions help commanders perform terrain analysis, enabling smart scanning for targets over a large area while simultaneously keeping track of visible targets. Once targets are acquired, Automatic Target Recognition (ATR) identifies each target, its attributes, and its context (“person on the roof,” “car in the field,” “person near treeline”). Target data is then sent to the main gun, weapon station, or an external effector for engagement. When targets are handed over from another observer system, AI supports rapid reacquisition between scopes; using appearance, location, and an intuitive UI for target acquisition and re-identification reduces the risk of errors in matching targets between sensors. When several effectors are available, AI may assist in selecting the weapon expected to achieve the optimal result, based on the fire doctrine for each weapon and projectile and on each target’s context and location.
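The effector-selection step can be illustrated with a toy pairing heuristic: each available effector is scored against the target's recognized class, and the highest-scoring one is recommended. This is a conceptual sketch, not any fielded doctrine; every effector name, target class, and score below is hypothetical.

```python
# Hypothetical effectiveness table: (effector, target_class) -> assumed
# probability of achieving the desired effect.
EFFECTIVENESS = {
    ("main_gun_HE", "person_on_roof"): 0.70,
    ("main_gun_HE", "vehicle"): 0.90,
    ("remote_weapon_station", "person_on_roof"): 0.85,
    ("remote_weapon_station", "vehicle"): 0.30,
}

def select_effector(target_class: str, available: list) -> str:
    """Recommend the available effector with the best assumed effect."""
    return max(available,
               key=lambda e: EFFECTIVENESS.get((e, target_class), 0.0))

# ATR has classified the target as a person on a roof; two effectors are free.
choice = select_effector("person_on_roof",
                         ["main_gun_HE", "remote_weapon_station"])
print(choice)  # -> remote_weapon_station
```

A real decision aid would weigh far more than a static table (range, collateral constraints, ammunition state), but the ranking step itself reduces to this kind of scored lookup, with the commander keeping final authority over the engagement.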
Further Reading: