Tuesday, January 26, 2016

Award season

Patryk Fournier
The first two months of the calendar year are often referred to as award season by the entertainment industry. Although we don’t compete with the likes of Leonardo DiCaprio and Jennifer Lawrence, we still feel honored by the recent accolades and awards bestowed upon us.

In the category of Best Backhaul Software or Development Platform for Automakers, the winner is… QNX Software Systems.

Thank you so much to Auto Connected Car News and all the people and companies who voted for us in the Tech CARS Awards. We pride ourselves on offering flexible development platforms that enable automakers to deliver unique, branded experiences. Working with leading-edge automakers and Tier 1 suppliers drives us (pardon the pun) to continue upping our game in advanced platforms for infotainment, digital instrument clusters, advanced driver assistance systems (ADAS), and acoustics — including, of course, the recently announced QNX Platform for ADAS and QNX Acoustics Management Platform.

We would also like to congratulate our fellow award winners, Ford and Harman. Ford won Overall Best Car Infotainment Software by Automaker for its QNX-powered SYNC 3 connectivity system.

Speaking of Ford, the GSMA Global Mobile Awards recently announced their shortlist of finalists, and we just happen to be among them, in the category of Best Mobile Innovation for Automotive, for our work on Ford SYNC 3.

QNX-powered Ford SYNC 3: Shortlisted for a 2016 Glomo Award. Source: Ford
The Global Mobile Awards, newly rebranded as the Glomo Awards, will take place on February 23 at the Mobile World Congress event in Barcelona, Spain.

Tuesday, January 12, 2016

“I don’t know where I’m going from here, but I promise it won’t be boring”

Patryk Fournier
The quote is from the late, great David Bowie, and it is remarkably prophetic when applied to autonomous driving. Autonomous driving is still very much uncharted territory. Investments in roadway infrastructure are being made, consumer acceptance is trending positive, and, judging by the news and excitement from CES 2016, the future, if anything, will not be boring.

CES 2016 stretched into the weekend this year, and ICYMI, there was a lot of compelling media coverage of QNX and BlackBerry. Here’s a roundup of the most interesting coverage from the weekend:

Ars Technica: QNX demos new acoustic and ADAS technologies
The crew from Ars Technica filmed a terrific demonstration of the QNX Acoustics Management Platform and the QNX Platform for ADAS. The demonstration highlights the power and versatility of the acoustics platform, including the QNX In-Car Communication module, which allows the driver to speak effortlessly to passengers in the back of the vehicle, over the roar of an engine revving at high speed. The demonstration also showcases how the QNX OS can support augmented reality and head-up displays:

Huffington Post: CES 2016 Proves The Future Of Driverless Cars Is Promising
Huffington Post highlighted BlackBerry and QNX as key newsmakers for advancements in driverless cars. The article notes QNX’s automotive leadership: “The software is actually installed in 50 per cent of the world’s automotive infotainment systems including Audi, Volkswagen, Ford, GM and Chrysler.”

CrackBerry: Inside the QNX Toyota Highlander at CES 2016
The folks at CrackBerry filmed a demonstration of our latest technology concept vehicle, based on a Toyota Highlander. The demo focuses on the QNX In-Car Communication acoustics module, which forms part of the recently launched QNX Acoustics Management Platform:



HERE 360: QNX and HERE bring to life a multi-screen experience in vehicles
A blog post from our ecosystem partner mentions HERE navigation and its use in the Toyota Highlander and Jeep Wrangler technology concept vehicles.

Thursday, January 7, 2016

Why is software the key to bringing augmented reality to cars?

Guest post by Alex Leonov, marketing director, Luxoft Automotive.

While self-driving vehicles are gradually becoming a reality, more and more of today’s cars roll out of factories featuring advanced driver assistance systems (ADAS). We are quickly getting used to adaptive cruise control, blind spot monitoring, parking assistance, lane departure warning, and many other features that make driving safer and the driver’s job easier. Data from cameras, sensors, and V2X infrastructure feeds into these systems, increasing their accuracy and efficiency. These systems are important steps toward fully autonomous driving, but the ultimate responsibility for decision making still lies with the driver.

The more connected cars become, the more the average driver can be bombarded with information while driving. “In 500 feet make a right turn.” “You have an incoming call from Christine.” “You have a new message on Facebook.” “You are over the speed limit.” This may not be much of a distraction under normal conditions. But when driving in hectic city traffic or in a snowstorm, it is critical to keep your eyes on the road while still receiving essential information. The good news is, the technology to remedy this already exists.

Heads up for HUDs
Keeping the driver’s eyes on the road is a priority, and head-up displays (HUDs) can accomplish just that. They project alerts and navigation prompts right on the windshield. Analysts predict explosive growth for HUDs, with the market reaching close to US$100 billion by 2020. The bulk of today’s HUDs are relatively simple combiner designs, but more advanced wide-field-of-view HUDs are coming soon.

Projecting alerts and navigation prompts directly on the windshield.
HUDs are perfect for presenting information in a convenient, natural way and giving the driver a feeling of being in control. But HUDs are only as good as the information they display. That is why it is critical to have robust data processing and decision-making algorithms, running on a reliable OS, that can prioritize and filter data. The resulting alerts and prompts must be communicated to the driver in a clear, transparent way.
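
To make that concrete, here is a minimal sketch of such a prioritize-and-filter pass. It is purely illustrative (not CVNAR or any production QNX code), and the severity scale and workload policy are invented for the example:

#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical alert record; a real system would carry far richer context.
struct Alert {
    std::string text;
    int severity;  // 0 = informational ... 3 = safety-critical
};

// Keep safety-critical alerts always; drop low-severity chatter when the
// driving situation is demanding (hectic city traffic, a snowstorm, ...).
std::vector<Alert> filterForHud(std::vector<Alert> alerts, int driverLoad) {
    // driverLoad: 0 = relaxed cruising ... 3 = hectic city / bad weather.
    const int minSeverity = driverLoad;  // simple illustrative policy
    alerts.erase(std::remove_if(alerts.begin(), alerts.end(),
                     [=](const Alert& a) { return a.severity < minSeverity; }),
                 alerts.end());
    // Present the most urgent information first.
    std::sort(alerts.begin(), alerts.end(),
              [](const Alert& a, const Alert& b) { return a.severity > b.severity; });
    return alerts;
}

int main() {
    std::vector<Alert> incoming = {
        {"You have a new message on Facebook", 0},
        {"You have an incoming call from Christine", 1},
        {"You are over the speed limit", 2},
        {"In 500 feet make a right turn", 2},
        {"Forward collision warning", 3},
    };
    for (const Alert& a : filterForHud(incoming, /*driverLoad=*/2))
        std::cout << a.severity << ": " << a.text << "\n";
}

Under a heavy workload, only the collision, speed, and navigation prompts survive; the call and the Facebook message wait for calmer conditions.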

Computer vision, also known as machine vision, is key to processing the endless flow of data. With its human-like image recognition ability, computer vision processes road scenes while the system fuses data from multiple sources. Add in a natural representation of the processing outcomes in the form of augmented reality, plus tracking of the driver’s pupils, and you have a completely new level of driver experience: safe and intuitive.

Next-generation driving experience
At Luxoft, we’ve been working on making this experience a reality. The result is CVNAR, a computer vision and augmented reality solution. CVNAR is a powerful software framework containing mathematical algorithms that process a vast amount of road data in real time to generate intuitive prompts and alerts. CVNAR has built-in algorithms for road and pedestrian detection, vehicle recognition and tracking, lane detection, facade recognition and texture extraction, road sign recognition, and parking space search. It performs relative and absolute positioning and easily integrates with navigation, the map database, sensors, and other data sources. A unique feature of CVNAR is its extrapolation engine for latency avoidance.

Detecting and recognizing road signs, pedestrians, traffic lanes, gas stations, and other objects.
CVNAR works perfectly with LCD displays and smartglasses, but it is ultimately built for HUDs. Data from cameras, sensors, CAN, and navigation maps are fused and processed to create an extendable metadata output that describes all augmented objects. It takes a HUD and an eye-tracking camera to implement CVNAR in a vehicle. CVNAR will track the driver’s gaze and adjust the position of the augmented objects in the driver’s line of sight to make sure they don’t obstruct anything important — all in real time.
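
Luxoft doesn’t publish CVNAR’s metadata format here, so the following sketch only suggests what a per-object record and a gaze-aware placement tweak might look like; every type, field, and threshold below is an assumption made for illustration:

#include <iostream>
#include <string>
#include <vector>

// Illustrative only: one possible shape for per-object AR metadata.
struct AugmentedObject {
    std::string type;   // "pedestrian", "road_sign", "parking_space", ...
    float x, y;         // normalized HUD coordinates, 0..1
    float confidence;   // detector confidence, 0..1
};

struct Gaze {
    float x, y;         // where the eye tracker says the driver is looking
};

// Nudge an overlay that sits right on the gaze point slightly off-axis,
// so the graphic annotates the scene without covering what the driver
// is actually inspecting. The thresholds are arbitrary demo values.
void adjustForGaze(AugmentedObject& obj, const Gaze& gaze) {
    const float dx = obj.x - gaze.x;
    const float dy = obj.y - gaze.y;
    if (dx * dx + dy * dy < 0.01f)            // within ~0.1 of the gaze point
        obj.y += (dy >= 0 ? 0.05f : -0.05f);  // move it vertically clear
}

int main() {
    std::vector<AugmentedObject> frame = {
        {"road_sign",     0.52f, 0.48f, 0.97f},
        {"parking_space", 0.20f, 0.70f, 0.88f},
    };
    const Gaze gaze{0.50f, 0.50f};
    for (AugmentedObject& obj : frame) {
        adjustForGaze(obj, gaze);
        std::cout << obj.type << " -> (" << obj.x << ", " << obj.y << ")\n";
    }
}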

Alerting the driver to an empty parking spot.
This is not all that CVNAR can do. New car models come packed with infotainment features that take time to learn and memorize. A CVNAR-based smartphone app can help: it turns your smartphone into an interactive guide. Point your phone camera at your dashboard and use augmented prompts to find out more about a particular car function. It can work under the hood, too.

Era of a software-defined car
A modern car runs on code as much as it runs on gasoline (or a battery-powered electric motor). Today, it takes over 100 million lines of software code to get a premium car going, and the amount of software necessary keeps expanding. At Luxoft, we are excited about the car’s digital future, and we work every day to help bring it about, by developing cutting-edge automotive solutions for leading global vehicle manufacturers.

Offering a wide range of embedded software development and integration services for in-vehicle infotainment and telematics systems, digital instrument clusters, and head-up displays, Luxoft has developed user experience (UX) and human machine interface (HMI) technology for millions of vehicles on the road today. We push the envelope in areas such as situation-aware HMI, computer vision, and augmented reality, while Luxoft’s products, the Populus and Teora UX/HMI design tool chains, power the development of award-winning automotive HMIs and slash time to market.

Software holds the key to the future of cars. It is essential to creating a customized user experience in vehicles. With over-the-air updates, software offers unmatched flexibility and scalability. Finally, it takes safety to the next level with its ability to simulate human-like logic through complex algorithms.

You can view Luxoft’s CVNAR solution running on a QNX-based ADAS demo this week at CES, in the BlackBerry booth: LVCC North Hall, #325.



About Alex
Alex Leonov has been in the automotive and IT industry for over 18 years in various business development and marketing roles. Currently, Alex leads the global marketing efforts of Luxoft Automotive.

In the zone — a visit to the QNX concept garage

Guest post by QNX consultant and software designer Rob Krten.

How often have you heard the expression, “If it were easy to do, everyone would do it”? I’m constantly amazed at the things that QNX does with their concept cars. To me, a car is an inviolable object that must be touched only by the dealer (well, OK, I do top up the windshield wiper fluid, and I once changed a battery). I don’t say that because I necessarily like to give the dealer money; I just don’t want to break anything that’ll cost me more to get fixed properly later.

Pushing the envelope, however, means getting right in there and doing stuff. QNX engineers have done this for their technology concept cars — from replacing the mirrors with LCD screens, to getting right into the dash and rebuilding it, to adding cameras into the antenna fin on the roof. It’s nothing for them to rip out the center console and then look at all the wiring and go, “Huh, ok — so we need to lengthen this wire, add a shim here, move this piece,” and so on. They are fearless.

Redoing the dash of the QNX reference vehicle.
Sometimes the “getting right in there” is physical; other times, it’s software based — such as making a new application that lives in the infotainment stack or that interfaces with a smartphone. Like a “Dude, where’s my car?” feature — when your Bluetooth phone unpairs with your car, the phone records the current GPS position. Later, when you’re looking for your car, your phone can recall this last stored GPS position — this must be where you left your car. Or even simple aids, such as a radio tuner that detects when you are losing an AM/FM signal and automatically switches to the corresponding digital station, so you can continue listening to your favorite station anywhere you drive.
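
As a sketch of how simply that last-known-position trick can be expressed (the types here are hypothetical stand-ins for whatever Bluetooth and GPS services the phone OS actually provides):

#include <iostream>
#include <optional>

struct GpsFix { double lat, lon; };  // stand-in for the platform's GPS type

class CarFinder {
public:
    // Call from the phone's Bluetooth-disconnect callback: losing the
    // pairing is the cue that the driver just walked away from the car.
    void onCarUnpaired(const GpsFix& currentFix) { lastParked_ = currentFix; }

    // Later, when the driver asks "Dude, where's my car?"
    std::optional<GpsFix> whereIsMyCar() const { return lastParked_; }

private:
    std::optional<GpsFix> lastParked_;  // empty until the first unpair event
};

int main() {
    CarFinder finder;
    finder.onCarUnpaired({45.3453, -75.9197});  // unpair event fires here
    if (auto fix = finder.whereIsMyCar())
        std::cout << "Car last seen at " << fix->lat << ", " << fix->lon << "\n";
}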

Curious to see what the future holds, and to actually see some of this work in action, I invited myself down to the “garage” at QNX headquarters. It’s at the far end of the building, next to the cafeteria. The hallway is festooned with posters of previous QNX concept vehicles, highlighting success stories in 3-foot-high glory.

The day I visited, there were half a dozen people in the garage, and two vehicles: a Jeep and a Highlander (otherwise known as the QNX reference vehicle and QNX technology concept vehicle). The garage is a combination of software development lab, hardware development lab, simulation environment, and actual garage (but without the greasy/oily smell). I wanted to get a sense of what drives these people, what they do, and how they do it.

Digital analogs
No, not that kind of digital display. Credit: Peter Halasz
The first thing I learned was that there are no real limits. They have the freedom to innovate, without preconceived notions about how things should look. For example, a lead designer on the team (let’s call him Allan, because that’s his name) explained how they look at the controls in the car’s dash display area. In the era of analog, the speedometer had a certain look: it was usually a needle rotating about a central point, where the needle pointed to the speed you were going. In the very early era of digitization, car manufacturers changed this needle to a seven-segment numerical display.

Of course, this was a failure, because the human brain is basically analog; it likes to see nice, continuous changes for processes that are continuous — such as the speed that you’re going. Seven-segment digits change too “randomly”; they require higher-level cognitive functions to parse what the individual lights mean and convert that into digits, and then convert that into a “speed” (and then convert that into “too slow,” or “just right,” or “too fast,” and then, finally, convert that into “apply brake” or “press down on throttle”).

Allan pointed out that changing to a digital display didn’t necessarily mean they had to slavishly follow the analog “physical” appearance (except do it on an LCD display); instead, they were free to experiment with “fill concepts”: digitally controlled analogs of the actual controls. We likened it to the displays in military avionics, where information becomes bigger as it increases in importance. Consider a fighter jet at 20,000 feet: the altitude isn’t nearly as important as it is at 300 feet. Therefore, at 20,000 feet, the part showing the altitude is small, and in a less prominent position, than it is when the plane is at 300 feet. The same goes for your speedometer: if you’re doing the speed limit, it’s not as important to show your current speed (you’re most likely flowing with traffic) as it is when you’re 20 over (or under).

In this image from the new QNX technology concept vehicle, the digital instrument cluster is warning that a forward collision is imminent and that the driver is exceeding the speed limit by 12 mph.

You could do the same thing with your fuel range — when you have a full tank, the indicator can be off in a corner somewhere. But as you start to run low, the indicator can get bigger or more prominent, to start nagging you to refuel. By having the displays all be “virtual” on a large LCD screen in the dash, the designers have incredible flexibility to create systems that present relevant information when required, and have it move out of the way when something more important comes along. (Come to think of it, this would be an awesome feature to have on turn-signal indicators — after you’ve kept your blinker on for more than 10 seconds, it would start to get bigger and brighter. Maybe then people would stop driving with their turn indicator permanently on.)
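
A toy version of that “grow with importance” rule, with invented tuning numbers, might look like this:

#include <algorithm>
#include <cmath>
#include <iostream>

// Map a gauge's deviation from its nominal value to a render scale:
// 1.0 = tucked quietly in a corner, up to 2.0 = front and center.
double prominence(double value, double nominal, double fullAlarmDelta) {
    const double deviation = std::abs(value - nominal) / fullAlarmDelta;
    return 1.0 + std::min(deviation, 1.0);  // clamp at double size
}

int main() {
    // Speedometer: nominal = posted limit; 20 over (or under) = max prominence.
    std::cout << "at the limit: " << prominence(60, 60, 20) << "\n";    // 1.0
    std::cout << "12 mph over:  " << prominence(72, 60, 20) << "\n";    // 1.6
    // Fuel range: nominal = full tank; prominence grows as the tank drains.
    std::cout << "full tank:    " << prominence(400, 400, 400) << "\n"; // 1.0
    std::cout << "40 mi left:   " << prominence(40, 400, 400) << "\n";  // 1.9
}

The same curve would cover the nagging turn-signal indicator: feed it the seconds the blinker has been on and let the scale climb.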

Collision avoided: The V2X command center
Also in the lab was a huge (3-by-5-foot) flat-panel touchscreen, mounted at an angle that’s aggressively unfriendly to coffee cups (probably for that very reason). It’s reminiscent of Star Trek’s main transporter control station, but it’s used to control and display the simulation environment’s V2V (vehicle-to-vehicle) and V2I (vehicle-to-infrastructure) data. It acts as a command center that controls and reveals the innards of what’s going on in the simulation environment:



When I was there, we ran a vehicle collision avoidance scenario. Two vehicles (the Jeep and the Highlander, of course; they’re tied into the system) were heading on a collision course (one southbound, one eastbound, in a grid-style road system). Because they have V2V capabilities, both cars were aware of their impending doom. This showed up nicely on the V2V command center control panel: two cars converging, little red circles emanating from them to indicate the realtime V2V “pings.” Of course, in plenty of time, the Jeep slowed down to avoid the collision (the actual brake lights even went on!). The speed, GPS coordinates, direction, and even the gear each vehicle was in were all shown on the master console. Toward the end of my visit I almost had Allan convinced to build another master control console for the OBDII connector, so you could interact with all of the information in each car. What can I say? I like front panels. (I’m a reformed PDP-8 collector.)

The V2X command center, which makes its debut this week at CES, provides a bird’s eye view of several V2X traffic scenarios. In this example, V2X allows a vehicle (the Jeep) to detect that a vehicle up ahead (the Highlander) has braked suddenly, giving the Jeep plenty of time to slow down.
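
For flavor, here is a toy version of the geometry behind such a scenario: a standard closest-approach calculation on two constant-velocity tracks, using the kind of position and speed data the vehicles exchange over V2V. This is not QNX’s V2X stack, just the math:

#include <cmath>
#include <iostream>

struct Vec2 { double x, y; };

// Time of closest approach for two constant-velocity vehicles, found by
// minimizing |(p2 - p1) + t * (v2 - v1)|^2 over t.
double timeOfClosestApproach(Vec2 p1, Vec2 v1, Vec2 p2, Vec2 v2) {
    const Vec2 dp{p2.x - p1.x, p2.y - p1.y};
    const Vec2 dv{v2.x - v1.x, v2.y - v1.y};
    const double dv2 = dv.x * dv.x + dv.y * dv.y;
    if (dv2 == 0.0) return 0.0;                    // same velocity: gap is constant
    const double t = -(dp.x * dv.x + dp.y * dv.y) / dv2;
    return t > 0.0 ? t : 0.0;                      // only future approaches matter
}

int main() {
    // Jeep heading east at 15 m/s, 200 m west of the intersection;
    // Highlander heading south at 15 m/s, 200 m north of it.
    const Vec2 jeepPos{-200, 0}, jeepVel{15, 0};
    const Vec2 highPos{0, 200},  highVel{0, -15};
    const double t = timeOfClosestApproach(jeepPos, jeepVel, highPos, highVel);
    const Vec2 a{jeepPos.x + t * jeepVel.x, jeepPos.y + t * jeepVel.y};
    const Vec2 b{highPos.x + t * highVel.x, highPos.y + t * highVel.y};
    const double gap = std::hypot(a.x - b.x, a.y - b.y);
    std::cout << "closest approach in " << t << " s, gap " << gap << " m\n";
}

A zero-meter gap predicted 13 seconds out is exactly the kind of answer that gives the Jeep plenty of time to brake.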

The engineers in the concept garage are “in the zone.” They’re working in an environment that encourages innovation. Watch and see what they produce:




About Rob
Rob is president of Iron Krten Consulting, which provides technical leadership services ranging from software leadership consulting to security and embedded software products, development, training, and contract services. Rob is also engaged by QNX Software Systems to write marketing and technical documentation. Visit Rob's website.

Video: Paving the way to an autonomous future

Lynn Gayowski
CES 2016 is now underway, and our kickoff to the year wouldn’t be complete without a behind-the-scenes look at the making of our new technology concept vehicle and updated reference vehicle.

The video below follows the journey of building our vehicles for CES 2016 and highlights the technologies we’re using to speed progress towards automated driving — and the list of tech that QNX covers is impressive! It includes advanced driver assistance systems (ADAS), V2X, and augmented reality, not to mention digital instrument clusters, in-car communication, and infotainment:



QNX Software Systems continues to innovate in automotive, with a vision for the evolution of automated driving and a trusted foundation for building reliable, adaptable systems. At the risk of giving away the big finale, I think John Wall, head of QNX, sums up perfectly what QNX is on target for in the automotive industry: “We will dominate the cockpit of the car.” It’s a bold statement, but we’re already amassing some imposing stats that back it up:

Wednesday, January 6, 2016

The simpler, the better: a first look at the new QNX technology concept vehicle

Bringing the KISS principle to the dashboard.

Paul Leroux
“From sensors to smartphones, the car is experiencing a massive influx of new technologies, and automakers must blend these in a way that is simple, helpful, and non-distracting.” That statement comes from a press release we issued a year ago, but it’s as true today as it was then — if not more so. The fact is, the car is undergoing a massive transformation as it becomes more connected and more automated. And with that transformation comes higher volumes of data and greater system complexity.

But here’s the thing. From the driver’s perspective, this complexity doesn’t matter, nor should it matter. In fact, it can’t matter. Because the driver needs to stay focused on the most important thing: driving. (At least until fully automated driving becomes reality, at which point a nap might be in order!) Consequently, it’s the job of automakers and their suppliers to harness all these technologies in a simple, intuitive way that makes driving easier, safer, and more enjoyable. Specifically, they need to provide the driver with relevant, contextually sensitive information that is easy to consume, without causing distraction.

That is the challenge that the new QNX technology concept vehicle, based on a Toyota Highlander, sets out to explore.

So what are we waiting for? Let’s take a look! (And remember, you can click on any image to magnify it.)

The oh-so-glossy exterior
As with any QNX technology concept vehicle, it’s what’s inside that counts. But to signal that this is no ordinary Highlander, we gave the exterior a luxurious, brushed-metal finish that just screams to have its picture taken. So we obliged:



The integrated display that keeps you focused
When modifying the Highlander, simplicity was the watchword. So instead of equipping the vehicle with both a digital instrument cluster and a head unit, we created a “glass cockpit” that combines the functions of both systems, along with ADAS safety alerts, into one seamless display. Everything is presented directly in front of the driver, where it is easiest to see.

For instance, in the following scenario, the cockpit allows the driver to see several pieces of important information at a glance: a forward-collision warning, an alert that the car is exceeding the local speed limit by 12 mph, and turn-by-turn navigation:



Mind you, the cockpit can display much more information than you see here, including a tachometer, album art, incoming phone calls, and the current radio station. But to keep distraction to a minimum, it displays only the information that the driver currently requires, and no more. Because simplicity.

To further minimize distraction, the cockpit uses voice as the primary way to control the user interface, including control of media, navigation, and phone connectivity. As a result, drivers can access infotainment content while keeping their hands on the wheel and eyes on the road.

Thoughtful touches abound. For instance, the HERE Auto navigation software running in the cockpit interfaces with a HERE Auto Companion App running on a BlackBerry PRIV smartphone. So when the driver steps into the vehicle, navigation route information from the smartphone is transferred automatically to the vehicle, providing a continuous user experience. How cool is that?

Here’s a slightly different view of the cockpit, showing how it can display a photo of your destination — just the thing when you are driving to a location for the first time and would like visual confirmation of what it looks like:



Before I forget, here are some additional tech specs: the cockpit is built on the QNX CAR Platform for Infotainment, uses an interface based on Qt 5.5, integrates iHeartRadio, and runs on a Renesas R-Car H2 system-on-chip.

The acoustics feature that keeps you from shouting
The glass cockpit does a great job of keeping your eyes focused straight ahead. But what’s the use of that if, as a driver, you have to turn your head every time you want to speak to someone in the back seat? If you’ve ever struggled to hold a conversation in a car at highway speeds, especially in a larger vehicle, you know what I’m talking about.

QNX acoustics to the rescue! Earlier today, QNX Software Systems announced the QNX Acoustics Management Platform, a new solution that replaces the traditional piecemeal approach to in-car acoustics with a holistic model that enables faster time to production and lower system costs. The platform comes with several innovative features, including QNX In-Car Communication (ICC) technology, which enhances the voice of the driver and relays it to infotainment loudspeakers in the rear of the car.

Long story short: instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX ICC dynamically adapts to noise conditions and adds enhancement only when needed. Better yet, it allows automakers to leverage their existing handsfree telephony microphones and infotainment loudspeakers.
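
As a back-of-the-envelope illustration of “adds enhancement only when needed” (a generic noise-gated gain curve with made-up tuning values, not QNX’s actual ICC algorithm):

#include <algorithm>
#include <iostream>

// Choose how much to reinforce the driver's voice toward the rear speakers,
// based on an estimate of cabin noise. Below the threshold, the cabin is
// quiet enough that the system stays out of the way entirely.
double reinforcementGainDb(double cabinNoiseDb) {
    const double quietThresholdDb = 60.0;  // illustrative tuning values
    const double maxBoostDb = 12.0;
    const double boostPerDbOfNoise = 0.5;  // grow the boost with the noise floor
    const double boost = (cabinNoiseDb - quietThresholdDb) * boostPerDbOfNoise;
    return std::clamp(boost, 0.0, maxBoostDb);
}

int main() {
    std::cout << "parked, engine off: " << reinforcementGainDb(45) << " dB\n";  // 0
    std::cout << "city driving:       " << reinforcementGainDb(68) << " dB\n";  // 4
    std::cout << "highway speed:      " << reinforcementGainDb(80) << " dB\n";  // 10
}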



The reference vehicle that keeps evolving
Before you go, I also want to share some updates to the QNX reference vehicle, which is based on a Jeep Wrangler. Like the Highlander, the Jeep got a slick new exterior for CES 2016:



Since 2012, the Jeep has been our go-to vehicle for showcasing the latest capabilities of the QNX CAR Platform for Infotainment. But for over a year now, it has done double duty as a concept vehicle, showing how QNX technology can help developers build next-generation instrument clusters and ADAS solutions.

Take, for example, the Jeep’s new instrument cluster, which makes its debut this week at CES. In addition to providing all the information that you’d expect, such as speed and RPM, it displays crosswalk notifications, forward collision warnings, speed limit warnings, and turn-by-turn navigation:



The QNX reference vehicle also includes a full-featured head unit that demonstrates the latest out-of-the-box capabilities of the QNX CAR Platform for Infotainment. For example, in this image, the head unit is displaying HERE Auto navigation:



Other features of the platform include:
  • A voice interface that uses natural language processing, making it easy to launch applications, play music, select radio stations, control volume, use the navigation system, and perform a variety of other tasks.
  • A new, easy-to-navigate UI based on Qt 5.5 that supports a variety of touch gestures, including tap, swipe, pinch, and zoom.
  • QNX acoustics technology that enables clear, easy-to-understand hands-free calls through advanced echo cancellation and noise reduction.
  • Cellular connectivity provided by the QNX Wireless Framework, which simplifies system design by managing the complexities of modem control on behalf of applications.
  • Flexible support for a variety of smartphone integration protocols.

Additional tech specs: The Jeep’s cluster runs on a Qualcomm Snapdragon 602A processor and its user interface was designed by our partner Rightware, using the Rightware Kanzi tool. The head unit, meanwhile, runs on an Intel Atom E3827 processor.

ADAS, augmented reality, V2X, IoT, and more
I have only scratched the surface of what BlackBerry and QNX Software Systems are demonstrating this week at CES 2016. There’s much more to see and experience, including a very cool V2X demonstration, IoT solutions for the automotive and transportation industries, as well as ADAS and augmented reality systems that integrate with the digital clusters described in this post. To learn more, read the press release that QNX issued today and stay tuned to this channel.


QNX announces new platforms for automated driving systems and in-car acoustics

Paul Leroux
Every year at CES, QNX Software Systems showcases its immense range of solutions for infotainment systems, digital instrument clusters, telematics systems, advanced driver assistance systems (ADAS), and in-car acoustics. This year is no different. Well, actually… let me take that back. Because this year, we are also announcing two new and very important software platforms: one that can speed the development of automated driving systems, and one that can transform how acoustics applications are implemented in the car.

QNX Platform for ADAS
The automotive industry is at an inflection point, with autonomous and semiautonomous vehicles moving from theory to reality. The new QNX Platform for ADAS is designed to help drive this industry transformation. Based on our deep automotive experience and 30-year history in safety-critical systems, the platform can help automotive companies reduce the time and effort of building a full range of ADAS and automated driving applications:
  • from informational ADAS systems that provide a multi-camera, 360° surround view of the vehicle…
  • to sensor fusion systems that combine data from multiple sources such as cameras and radar…
  • to advanced high-performance systems that make control decisions in fully autonomous vehicles



Highlights of the platform include:
  • The QNX OS for Safety, a highly reliable OS pre-certified at all of the automotive safety integrity levels needed for automated driving systems.
  • An OS architecture that can simplify the integration of new sensor technologies and purpose-built ADAS processors.
  • Frameworks and reference implementations to speed the development of multi-camera vision systems and V2X applications (vehicle-to-vehicle and vehicle-to-infrastructure communications).
  • Pre-integrated partner technologies, including systems-on-chip (SoCs), vision algorithms, and V2X modules, to enable faster time-to-market for customers.

This week, at CES 2016, QNX will present several ADAS and V2X demonstrations, including:
  • Demos that show how QNX-based ADAS systems can perform realtime analysis of complex traffic scenarios to enhance driver awareness or enable various levels of automated driving.
  • QNX-based V2X technology that allows cars to “talk” to each other and to traffic infrastructure (e.g. traffic lights) to prevent collisions and improve traffic flow.

To learn more, check out the ADAS platform press release, as well as the press release that provides a full overview of our many CES demos — including, of course, the latest QNX technology concept vehicle!

QNX Acoustics Management Platform
It’s a lesser-known fact, but QNX is a leader in automotive acoustics — its software for handsfree voice communications has shipped in over 40 million automotive systems worldwide. This week, QNX is demonstrating once again why it is a leader in this space, with a new, holistic approach to managing acoustics in the car, the QNX Acoustics Management Platform (AMP):

  • Enables automakers to enhance the audio and acoustic experience for drivers and passengers, while reducing system costs and complexity.
  • Replaces the traditional piecemeal approach to in-car acoustics with a unified model: automakers can now manage all aspects of in-car acoustics efficiently and holistically, for easier integration and tuning, and for faster time-to-production.
  • Reduces hardware costs with a new, low-latency audio architecture that eliminates the need for dedicated digital signal processors or specialized external hardware.
  • Integrates a full suite of acoustics modules, including QNX Acoustics for Voice (for handsfree systems), QNX Acoustics for Engine Sound Enhancement, and the brand new QNX In-Car Communication (ICC).

For anyone who has struggled to hold a conversation in a car at highway speeds, QNX ICC enhances the voice of the driver and relays it to loudspeakers in the back of the vehicle. Instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX will demonstrate ICC this week at CES, in its latest technology concept car, based on a Toyota Highlander.

Read the press release to learn more about QNX AMP.



Monday, January 4, 2016

Ford ports SmartDeviceLink to QNX CAR Platform

QNX joins Ford, Toyota, and other industry leaders to help drive new standard for app integration.

Paul Leroux
For as long as I can remember, QNX Software Systems has been at the forefront of integrating cars and smartphones. Through our flexible OS architecture and large automotive ecosystem, we provide automakers and Tier 1 suppliers with the ultimate choice in connectivity options for smartphones and other smart devices. And now, QNX customers will have even greater choice, with the availability of Ford’s SmartDeviceLink (SDL) technology for the QNX CAR Platform for Infotainment.

If you’ve never heard of SDL, it’s the open source version of Ford AppLink, the software that allows Ford SYNC users to access smartphone apps through voice commands and dashboard controls. Ford donated AppLink to the open source community to create a standard way for consumers to interact with smartphone apps, regardless of which phone they use or vehicle they drive.

SDL is quickly gaining industry advocates, including Toyota, UI Evolution, and, of course, QNX. What’s more, companies like PSA, Honda, Subaru, and Mazda are evaluating it for use in next-generation vehicles.

Why the interest in SDL? Because it’s a flexible, vendor-neutral standard that can benefit drivers, automakers, and developers alike. With SDL:

  • Drivers can interact with apps by using voice commands, steering-wheel buttons, and other in-car controls, so they can keep their eyes on the road and hands on the wheel.
  • Automakers can deliver a consistent app experience across vehicles, while retaining the flexibility to customize that experience for each vehicle brand or model.
  • Developers can create apps that can work across multiple smart devices and multiple automotive brands — which means they have greater incentive to create automotive apps.

SDL for QNX builds on a history of successful collaborations between Ford and QNX, including the QNX-powered Ford SYNC 3 infotainment system. According to Paul Elsila, CEO of Livio, the Ford subsidiary that maintains the SDL open source project, “With its large market share, QNX can play a key role in driving the adoption of auto industry standards, and we are excited to work with them in building vendor-neutral technology that can simplify the integration of smartphone apps in any brand or type of vehicle.”

SDL works with multiple smartphone platforms. Moreover, it is highly flexible: it can work across a full range of vehicles, from entry-level to premium, and across a wide range of displays. It can even be used in systems without displays — for instance, in systems that use a voice interface.
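
To illustrate the write-once, run-in-any-brand idea, here is a sketch of the pattern SDL standardizes, written against invented interfaces. The real SDL libraries define their own APIs, so treat this purely as a conceptual example:

#include <iostream>
#include <string>
#include <vector>

// Invented for illustration; not the actual SmartDeviceLink API.
struct AppDescriptor {
    std::string appName;
    std::vector<std::string> voiceCommands;  // phrases the head unit should recognize
};

// Stands in for any SDL-compliant head unit, whatever the brand. Each brand
// maps the same app description onto its own screen, buttons, and voice engine.
class HeadUnit {
public:
    explicit HeadUnit(std::string brand) : brand_(std::move(brand)) {}
    void registerApp(const AppDescriptor& app) const {
        std::cout << brand_ << ": registered '" << app.appName << "' with "
                  << app.voiceCommands.size() << " voice commands\n";
    }
private:
    std::string brand_;
};

int main() {
    // One app description...
    const AppDescriptor radio{"MyRadio", {"play", "pause", "next station"}};
    // ...usable, unchanged, across vehicles from different automakers.
    for (const HeadUnit& unit : {HeadUnit{"Brand A"}, HeadUnit{"Brand B"}})
        unit.registerApp(radio);
}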

To learn more about SDL, check out the announcements that Ford, Toyota, and QNX issued this morning.