Technology Is Not An End But A Means To Make Customer Life Easier: Manu Saale
- By T Murrali
- February 04, 2020
Mercedes-Benz R&D India (MBRDI), founded in 1996 in Bengaluru to support Daimler’s research, IT and product development activities, is now one of the largest global R&D centres outside Germany. It employs close to 5,000 skilled engineers and is a valuable centre for all business units and brands of Daimler worldwide. The centre is also a key entity for Daimler’s future mobility solutions under C.A.S.E (Connected, Autonomous, Shared and Electric), working on autonomous and electric vehicles. Its competencies in engineering and IT have progressed to using AI, AR, Big Data Analytics and other modern technologies to provide seamless connectivity. During an interaction with T Murrali, the Managing Director and CEO of MBRDI, Manu Saale, said, “The centre has been growing phenomenally. We have just started a team on cyber security. . . We have been helping to simulate some stack-related solutions using fuel cells. I’m waiting for a clear strategy from the company for a possible venture into the hydrogen path.” Edited excerpts:
Q: Could you begin by detailing the contribution of MBRDI to the Experimental Safety Vehicle (ESF)?
Saale: The ESF is a concept vehicle. We have taken a GLE platform, tried to predict technologies that are coming up and put demo versions of them inside. Some of them are purely future technologies, but they are strictly based on the data we have collected and on the accident research and digital trends that we have seen.
There is a worldwide safety team, centred in Germany and India, which is studying all this data and these statistics to predict what the future should look like. Mercedes-Benz has a history of building concept cars as mobility changes around us. This time we have decided to put safety in perspective for new-age mobility with the ESF 2019.
For example, in a driverless car there is no steering wheel, so where will you put the airbags, which have traditionally been housed in the steering wheel? This means that the airbag concept will have to change. If you go white-boarding on this topic you will realise that some fundamental things you have been counting on all these years will change. This international team in Bengaluru, supporting Germany, has been working on many concepts of this kind.
We have brought it here for two reasons. One is the contribution from India: a lot of digital simulation was done before implementing the hardware, and Bengaluru contributed to the digital evaluation of the new safety concepts in the ESF. The other is to inspire the engineers to innovate further based on the first level of fantasies that we have created, and to think about how it could be taken to the next level. These are the kind of things we want our engineers to think about; the ESF is a pointer in that direction.
Q: What are the possible changes with the emergence of EVs and autonomous vehicles for safety?
Saale: Imagine not being able to predict the position of passengers when a crash happens. If they are sitting in a conference mode, facing one another, how can they be protected without an airbag in front of them? That’s one; the second is the use of different materials within the car and the dynamics that could play out in an accident. The third is the connection to the energy source, a fuel tank or battery pack that is not specific to one place but probably spread across the floor of the car. The battery and its chemical components are also critical in a crash situation.
There are many new things when we think about safety in autonomous and electric vehicles; whereas connectivity plays into our hands. I don’t think the industry has exhaustively thought about what new dimensions can come from driving autonomous vehicles.
Q: What happens if the accident is so severe that all the electrical connections are cut off? Has any thought gone into this?
Saale: I am sure they have thought about it. An airbag can pop up in milliseconds; an SOS message is placed post-crash. Today, in an instant, we can ping the world somehow, so information such as position and coordinates is sent out immediately when an accident takes place. Of course, it depends a lot on the emergency services and collision response in the country.
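To make the idea concrete, here is a minimal sketch of the kind of automatic post-crash notification Saale describes. The field names and payload structure are assumptions for illustration, not Daimler’s actual eCall or SOS format.

```python
import json
import time

def build_crash_notification(lat, lon, heading_deg, vin, airbag_deployed):
    """Assemble a minimal post-crash SOS payload.

    Hypothetical schema for illustration only; a production eCall/SOS
    message would follow the OEM's and regulator's defined format.
    """
    return json.dumps({
        "type": "collision_sos",
        "timestamp_utc": int(time.time()),
        "position": {"lat": lat, "lon": lon, "heading_deg": heading_deg},
        "vehicle_id": vin,
        "airbag_deployed": airbag_deployed,
    })

# Example: fired by the crash-sensing ECU immediately after airbag deployment.
payload = build_crash_notification(12.9716, 77.5946, 90.0, "EXAMPLEVIN0000000", True)
print(payload)
```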
Q: What is the role played by MBRDI in the development of Artificial Intelligence (AI) and Augmented Reality (AR)?
Saale: This is the new-age digital; we don’t have to go back to the old world of software alone. Digital has shown new potential in the last few years and we have tried to keep pace with current trends. AI is certainly one of the buzzwords that is coming up.
MBUX, which we flagged off in Bengaluru a few weeks ago, showcases how AI could be used as a technology to make customer life easier in the car. We look at all the use cases to find out what the customer does in a car.
For example, take the use of a camera in the car. During night driving, if the driver extends his hand towards the vacant seat next to him looking for something in the dark, the camera will sense that he is seeking something and switch on the light. We need AI for that because we have to understand the hand position and the amount of stretch; it should not be confused with the driver stretching himself after yawning. Such a simple use case requires a lot of technology. These are areas where people look at customer behaviour and say, ‘technology is not for the sake of technology but to make customer life easier.’
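A minimal sketch of the decision logic behind such a use case might look like the following. It assumes an in-cabin camera model that already outputs gesture confidences and an ambient-light estimate; the class, thresholds and field names are hypothetical, not MBUX internals.

```python
from dataclasses import dataclass

@dataclass
class HandObservation:
    """Per-frame output assumed to come from an in-cabin camera model."""
    reach_toward_passenger_seat: float  # 0..1 confidence of a sideways reach
    stretch_after_yawn: float           # 0..1 confidence of a stretching gesture
    cabin_lux: float                    # ambient light estimate

def should_switch_on_seat_light(obs: HandObservation,
                                reach_threshold: float = 0.8,
                                dark_lux: float = 5.0) -> bool:
    """Switch on the light only for a deliberate reach in the dark,
    not for a stretch after yawning (the confusion case Saale mentions)."""
    if obs.cabin_lux > dark_lux:
        return False  # cabin is bright enough already
    if obs.stretch_after_yawn > obs.reach_toward_passenger_seat:
        return False  # more likely a stretch than a search
    return obs.reach_toward_passenger_seat >= reach_threshold

# Example frame: dark cabin, confident sideways reach, low stretch score.
print(should_switch_on_seat_light(HandObservation(0.92, 0.10, 1.5)))  # True
```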

Q: The Tier-1 companies spread across Germany have come up with many futuristic solutions for vehicles and have their own research centres. So what is the role of R&D centres of OEMs like this one, other than integration?
Saale: Every centre has to ride its own destiny. Even if we are a GIC (Global In-house Centre) we cannot expect HQ to hold our hand forever. It’s a typical parent-child relationship and not a customer-supplier one. We have seen all the combinations of GICs working out there in the market. I think we have a good success story here. That is the value-add a GIC has to think about.
A survey was done on the value-add from GICs; it looked at entrepreneurship in GICs. It found that only 6 percent of GICs were entrepreneurial, that is, really able to innovate. We were also named in that top 6 percent. It depends on the company culture, relationships, how discussions with HQ are handled, and the local leadership teams. That’s the challenge in a GIC compared to a profit centre that moves from one customer to another.
Q: You are also in touch with suppliers in India and across the globe for necessary hand-holding?
Saale: Absolutely, imagine a situation where the parents trust the child completely.
Q: You will be the parent and Tier-1s the children?
Saale: No, it is not that way. We behave as Daimler when we talk to Tier-1s. We tell them that ‘you know the car well, so do it by yourself and deliver the product.’ That’s the level of maturity in interaction that one can reach.
Q: When it comes to electronics, OEMs the world over are faced with many regulations. Do you see options for them to comply with all the regulations considering the amount of electronics coming into the car?
Saale: Every new thing is a technical challenge on the table. It can be stricter emission norms, features and functionalities that are difficult to reach, a technical compliance issue that crops up every now and then, or a safety or parking aspect that is covered by many regulations around the world. We thrive on such challenges; they have pushed a company like Mercedes to keep on inventing because, among many other things, hardware is getting cheaper and smaller, software capabilities are growing, connectivity is increasing and computing external to the car is possible. OEMs are dealing with authorities, trying to handle what is possible at lower cost, because at the end of the day we have to sell. I am sure that regulators and societies around the world today are looking for some balance between technology and cost.
Q: How do you manage multiple sensors in the vehicle?
Saale: Digital appears to be very complex now but electronics will go through its life cycle and come to a point where man understands its complexity and is able to put it all together. Today, we are talking about sensor fusion - putting together the net of information and seeing it as a whole through various sensors.
Functionalities could range from a switch to radar or lidar, each with its own spectrum of signals and resolutions; the processing would happen in milliseconds. The better we comprehend the mixed bag of signals we get, the better will be our ability to make the right decisions.
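As a minimal illustration of the fusion idea Saale describes (combining several sensors’ views of the same object into one estimate), the sketch below uses inverse-variance weighting of range readings. It is a textbook technique shown under assumed noise figures, not Daimler’s production stack.

```python
def fuse_range_estimates(measurements):
    """Inverse-variance weighted fusion of range estimates (metres) for one
    tracked object reported by several sensors (radar, lidar, camera).

    Minimal illustration of sensor fusion; variances are assumed values.
    """
    num = sum(r / var for r, var in measurements)
    den = sum(1.0 / var for _, var in measurements)
    return num / den

# Example: radar and lidar are precise on range, a monocular camera less so.
readings = [
    (42.1, 0.04),  # radar: 42.1 m, variance 0.04 m^2
    (42.4, 0.09),  # lidar
    (40.8, 1.00),  # camera (noisier depth estimate)
]
print(round(fuse_range_estimates(readings), 2))  # pulled towards the radar/lidar values
```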
Q: With all the facilities that you provide to the driver, are you not actually deskilling him?
Saale: The trend is that people don’t want to get into the hassles of driving a vehicle. Driving is stressful and cumbersome to many, which is why the autonomous car would gain popularity. The driver just has to punch in where he or she has to go and the vehicle will do it automatically, saving both mental and physical tension. A completely new user base is being introduced into mobility with software features. We have to look at it positively.
Q: Are you also working on cyber security, on things that get into the car?
Saale: We have just started a team now. Our focus on cyber security is at a centre in Tel Aviv, Israel.
Q: Do you see scope to improve the thermal efficiency of Internal Combustion (IC) engines further?
Saale: I think the capability, from an engineering perspective, exists to take the IC engine to the next level. The potential continues to be there and all OEMs talk about it. Possibly it is getting affected by the social and environmental aspects.
Q: It is said that the exhaust from a Euro-6 engine is far better than the atmospheric air in many highly polluted cities and that it does not actually pollute. What is your opinion?
Saale: It is true. But people ask: if electricity is generated from coal, aren’t we contributing to pollution? If we localise electricity production to one area, with everything contained, it would give us better scope to control it rather than spewing it out of every vehicle tail-pipe all over the world.
Imagine millions of polluting vehicles moving around compared to millions of electric vehicles, which have no tail-pipe emissions, with electricity generated by coal in a centralised way; it would be a completely different technical and logistical challenge from the environmental point of view. Regulators, politicians and policy makers are all giving their views on this issue; the improvement in living standards and the emergence of smart cities would affect it. I think we are moving in the right direction, with the greening of the environment covering everything. I picture this sustainable city living much better with electric vehicles moving around me.
Q: Can you tell us about the work done around IoT?
Saale: We are working on digitalisation of our production in many ways. One of the teams for Manufacturing Engineering in Bengaluru focuses on digital methods in manufacturing such as production planning, supply chain, logistics and IoT. The team also works on front-loading of production planning.
Q: What is your contribution to the Sprinter F-CELL, the fuel cell application that replaced the diesel engine?
Saale: We have been helping to simulate some stack-related solutions using fuel cells. I’m waiting for a clear strategy from the company for a possible venture into the hydrogen path. (MT)
Visteon Showcases High-Performance Cockpit Computing, Expands Partnership With Mahindra & Mahindra Too
- By MT Bureau
- January 09, 2026
Visteon Corporation has announced an expanded technology partnership with Mahindra & Mahindra that will see its next-generation SmartCore Pro cockpit domain controller deployed in Mahindra’s XUV7X0 SUV lineup.
Unveiled at CES 2026, the SmartCore Pro builds on the SmartCore system introduced in the Mahindra XUV700 in 2021. The new system integrates cockpit electronics, surround view camera technology and telematics on Mahindra’s Adrenox+ platform. It features a three-display configuration supporting vehicle information, infotainment, ADAS visualisation and connectivity, alongside an integrated 360-degree camera system.
Francis Kim, Vice-President of Global Sales & Commercial Excellence and General Manager for Rest of Asia, Visteon, said, “The automotive industry is shifting from discrete systems to fully integrated digital platforms, and India is among the fastest-moving markets in this transition. This partnership demonstrates how strategic OEM collaboration can accelerate time-to-market for complex technologies while laying the foundation for software-defined vehicles.”
Alongside the Mahindra announcement, Visteon also showcased the production specifications and OEM implementations of its High-Performance Compute solution built on the Snapdragon Cockpit Elite platform. The solution follows Visteon’s collaboration with Qualcomm Technologies announced at Auto Shanghai 2025 and is now being demonstrated with multiple global OEMs.
The High-Performance Compute platform supports centralised vehicle architectures and software-defined vehicle strategies. It enables on-device AI processing, multi-display support, multi-user experiences and personalised cockpit features. The system uses the Qualcomm Oryon CPU, Qualcomm Adreno GPU and enhanced NPU AI performance, while Visteon’s cognitoAI Concierge digital assistant operates using the company’s QWEN 7B model.
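Purely to illustrate what invoking a 7B-parameter assistant model looks like in open tooling, the sketch below prompts a Qwen-family 7B instruct checkpoint via Hugging Face transformers. This is not Visteon’s cognitoAI Concierge integration; the model ID and the cockpit-style prompt are assumptions.

```python
# Illustrative only: generic prompting of a Qwen 7B instruct model with
# Hugging Face transformers. NOT Visteon's cognitoAI Concierge stack.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"  # assumed public checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "system", "content": "You are an in-cabin voice assistant."},
    {"role": "user", "content": "Find a charging stop on my route and set the cabin to 22 degrees."},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```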
Uday Dodla, Vice-President, Product Management, Visteon, said, “This High-Performance Compute solution addresses a critical challenge our OEM partners face as they transition to centralized architectures. By consolidating multiple ECUs into a single, powerful platform, we're enabling automakers to reduce complexity and costs while delivering the sophisticated AI-driven experiences that consumers increasingly expect.”
Mark Granger, VP, Product Management at Qualcomm Technologies, said, “Visteon's demonstration of its High-Performance Compute solution on the Snapdragon Cockpit Elite platform highlights the momentum toward centralized, software-defined architectures that will power the next era of intelligent, connected vehicles.”
Visteon said the platform is designed to support a common architecture across vehicle segments, allowing OEMs to scale features while consolidating electronic control units and supporting long-term cost efficiencies.
Valeo Joins Forces With Hero MotoCorp To Bring ARAS Tech For Two-Wheelers
- By MT Bureau
- January 09, 2026
French tier 1 supplier Valeo and Hero MotoCorp, the world’s largest manufacturer of motorcycles and scooters, have inked a strategic partnership for Advanced Rider Assistance Systems (ARAS).
The partnership will focus on enhancing rider safety by introducing advanced sensing, perception and intelligent technologies tailored specifically for two-wheelers across both entry-level and premium segments, including the OEM’s emerging electric mobility portfolio under VIDA.
As part of the understanding, they will focus on ARAS by leveraging Valeo’s radar and smart camera technology fitted in Hero MotoCorp’s two-wheeler portfolio. This will not only enhance safety for two-wheeler users in India but is also expected to drive awareness amongst customers globally.
The partners state that they have already achieved success with their proof-of-concept systems designed to protect both riders and pedestrians.
Marc Vrecko, CEO, Valeo’s Brain Division, said, “We are truly excited to partner with Hero MotoCorp to deliver solutions that will significantly enhance rider safety and create a more secure riding experience for millions of people. This collaboration is a key step in our strategy to bring advanced technology to the rapidly growing mobility market in India and globally.”
Ram Kuppuswamy, COO, Plant Operations, Hero MotoCorp, said, “At Hero MotoCorp, we are redefining the future of mobility by bringing advanced technology to our products. Our partnership with Valeo marks a significant stride in making mobility smarter, safer and more sustainable with next-gen advanced rider assistance systems. Together, we aim to make two-wheeler safety accessible to everyone and set new standards for innovation and protection globally.”
The ARAS architecture is developed as a digital co-pilot for riders, providing a 360-degree safety envelope around the vehicle with real-time sensing and intelligent alerts. It uses a radar-based system that can provide critical information and warnings such as Forward Collision Warning (FCW), Distance Warning (DW), Lane Change Assist (LCA), Blind Spot Detection (BSD) and Rear Collision Warning (RCW).
On the other hand, the vision system uses high-resolution cameras to provide Pedestrian Detection, Lane Detection, Traffic Sign Recognition and Lane Departure Warning.
Through intelligent image processing, the system identifies road signs and obstacles even in low-light conditions. The combination of the radar and vision systems gives the two-wheeler a comprehensive safety package for its users.
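As a rough sketch of how a radar-based Forward Collision Warning decision can be derived from range and closing speed, the snippet below uses a simple time-to-collision rule. The threshold is an assumption for illustration, not Valeo’s or Hero MotoCorp’s calibration.

```python
def forward_collision_warning(range_m: float, closing_speed_mps: float,
                              ttc_warn_s: float = 2.5) -> bool:
    """Warn when time-to-collision (range divided by closing speed) drops
    below a threshold. Threshold value is illustrative only."""
    if closing_speed_mps <= 0.0:
        return False  # gap is opening or constant; nothing to warn about
    ttc = range_m / closing_speed_mps
    return ttc < ttc_warn_s

# Example: 20 m gap closing at 10 m/s gives a TTC of 2.0 s, so warn.
print(forward_collision_warning(20.0, 10.0))  # True
```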
SiMa.ai And Synopsys Announce Integration To Accelerate Automotive AI Development
- By MT Bureau
- January 08, 2026
SiMa.ai has announced its first integrated capability resulting from a collaboration with Synopsys. The joint solution provides a blueprint to accelerate architecture exploration and virtual software development for automotive Systems-on-Chip (SoCs). These chips support applications including Advanced Driver Assistance Systems (ADAS) and In-Vehicle Infotainment (IVI).
The partnership aims to deliver architectures required for software-defined vehicles. The blueprint allows customers to begin the design and validation of custom AI SoCs and ‘shift left’ software development before silicon is available. This process is intended to reduce development costs and accelerate vehicle time-to-market.
The blueprint provides pre-integrated SoC virtual prototypes and a tool workflow using solutions from both companies.
For Architectural Exploration:
- SiMa.ai MLA Performance and Power Estimator (MPPE): Enables customers to size machine learning accelerator designs for specific workloads.
- Synopsys Platform Architect: Used to model workloads and analyse performance, power, memory, and interconnect trade-offs before RTL design.
For Verification and Validation:
- Synopsys Virtualiser Development Kit (VDK): Facilitates software development using a virtual SoC prototype, which can accelerate vehicle time-to-market by up to 12 months.
- SiMa.ai Palette SDK: Supports machine learning workflows for edge AI applications.
- Synopsys ZeBu Emulation: Delivers pre-silicon hardware and software validation to ensure architectures meet workload requirements.
Krishna Rangasayee, Founder & CEO at SiMa.ai, said, "We are pleased with how well the two teams have worked together to quickly create a joint solution uniquely focused on unlocking physical AI capabilities for today's software defined vehicles. Our best-in-class ML platform, combined with Synopsys' industry-leading automotive-grade IP and design automation software creates a powerful foundation for innovation across OEMs in autonomous driving and in-vehicle experiences."
Ravi Subramanian, Chief Product Management Officer, Synopsys, said, "Automotive OEMs need to deliver software-defined AI-enabled vehicles faster to market to drive differentiation, which requires early power optimisation and validation of the compute platform to reduce total cost of development and time to SOP. Our collaboration with SiMa.ai delivering an ML-enabled architecture exploration and software development blueprint supported by a comprehensive integrated suite of tools significantly jumpstarts these activities and enables our automotive customers to bring next-generation ADAS and IVI features to market faster."
Tianma Showcases Automotive Display Technologies At CES 2026
- By MT Bureau
- January 08, 2026
Chinese display panel manufacturer Tianma recently exhibited its range of automotive technologies at CES 2026. The company’s solutions include LTPS-LCD, AMOLED and MicroLED technologies designed for cockpits.
The centrepiece of the exhibit was the Smart Cockpit 7.0, an automotive interior and dashboard demonstration. It integrates a 49.6-inch curved ACRUS display with 8K resolution and a slidable AM-OLED display using a gear-rack mechanism.
It also presented InvisiVue, a solution that mimics decorative surfaces like wood or metal when inactive and reveals images through a transmissivity layer when powered on.
The 49.6-inch ACRUS curved display uses Corning ColdForm Technology. It features pixel-level dimming with 210,000 zones, achieving a contrast ratio of 100,000:1. The unit’s R3000 curvature is designed to align with the windshield to reduce blind spots and reflections.
Tianma also presented two HUD technologies. The first is a 43.7-inch ultra-wide IRIS HUD that uses a Mini-LED display with a peak brightness of 10,000 nits for visibility in sunlight, an 85 percent NTSC colour gamut and a curved structure designed to match the windshield’s optical path.
The second is an 11.98-inch IRIS HUD that utilises high-luminance PGU technology delivering 12,000 nits of brightness. The module is less than 15 mm thick for integration in compact vehicles and operates at approximately 6 W to reduce thermal load.
The company also introduced a 34-inch dye liquid crystal dimming glass for rear side privacy windows. This technology uses voltage control of liquid crystal molecules to achieve stepless dimming without physical sunshades.
The system provides a response time of less than 300ms for transitions between privacy and transparent modes. It features a wide viewing angle and a grey-black tone to manage glare within the vehicle interior.
