Thursday, 03 December 2015

The demo is in the details

A new video of the 2015 QNX technology concept car reveals some thoughtful touches.

Paul Leroux
QNX technology concept cars serve a variety of purposes. They demonstrate, for example, how the flexibility of QNX technology can help automakers deliver unique user experiences. They also serve as vehicles — pun fully intended — for showcasing our vision of connected driving. And they explore how thoughtful integration of new technologies can make driving easier and more enjoyable.

It is this thoughtfulness that impresses me most about the cars. It is also the hardest aspect to convey in words and pictures — nothing beats sitting inside one of the cars and experiencing the nuances first hand.

The minute you get behind the wheel, you realize that our concept team is exploring answers to a multitude of questions. For instance, how do you bring more content into a car, without distracting the driver? How do you take types of information previously distributed across two or more screens and integrate them on a single display? How do you combine information about local speed limits with speedometer readouts to promote better driving? How do you make familiar activities, such as using the car radio, simpler and more intuitive? And how much should a car’s UX rely on the touch gestures that have become commonplace on smartphones and tablets?

Okay, enough from me. To see how our 2015 technology concept car, based on a Maserati Quattroporte, addresses these and other questions, check out this new video with my esteemed colleague Justin Moon. Justin does a great job of highlighting many of the nuances I just alluded to:



In just over a month, QNX will unveil a brand new technology concept vehicle. What kinds of questions will it explore? What kinds of answers will it propose? We can’t say too much yet, but stay tuned to this channel and to our CES 2016 microsite.

Monday, 09 November 2015

Bringing a bird’s eye view to a car near you

QNX and TI team up to enable surround-view systems in mass-volume vehicles

Paul Leroux
Uh-oh. You are 10 minutes late for your appointment and can’t find a place to park. At long last, a space opens up, but sure enough, it’s the parking spot from hell: cramped, hard to access, with almost no room to maneuver.

Fortunately, you’ve got this covered. You push a button on your steering wheel, and out pops a camera drone from the car’s trunk. The drone rises a few feet and begins to transmit a bird’s eye view of your car to the dashboard display — you can now see at a glance whether you are about to bump into curbs, cars, concrete barriers, or anything else standing between you and parking nirvana. Seconds later, you have backed perfectly into the spot and are off to your meeting.

Okay, that’s the fantasy. In reality, cars with dedicated camera drones will be a long time coming. In the meantime, we have something just as good and a lot more practicable — an ADAS application called surround view.

Getting aligned
Approaching an old problem from a new perspective. Credit: TI
Surround-view systems typically use four to six fisheye cameras installed at the front, back, and sides of the vehicle. Together, these cameras capture a complete view of the area around your car, but there’s a catch: the video frames they generate are highly distorted. So, to start, the surround-view system performs geometric alignment of every frame. Which is to say, it irons all the curves out.
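To make that "ironing out" concrete, here is a minimal C sketch of the mapping involved. It assumes an idealized equidistant fisheye lens (r = f·θ) with made-up parameters; a real system would characterize each lens with calibration data and bake the mapping into GPU lookup tables rather than doing per-pixel math on the CPU:

```c
/* geometric_align.c: map pixels of a rectified (pinhole) view back to
 * their source coordinates in a fisheye frame.
 *
 * Illustrative sketch only. The equidistant model (r = f * theta) and
 * all parameters below are assumptions; production systems use
 * calibrated lens polynomials and precomputed remap tables.
 */
#include <math.h>
#include <stdio.h>

typedef struct {
    double fx;      /* fisheye focal length, pixels */
    double cx, cy;  /* fisheye optical center       */
} fisheye_params_t;

/* For output pixel (u, v) of the undistorted view (virtual pinhole with
 * focal length f_out and principal point (ox, oy)), compute where to
 * sample in the fisheye image. */
static void undistort_map(const fisheye_params_t *cam,
                          double f_out, double ox, double oy,
                          double u, double v,
                          double *src_x, double *src_y)
{
    double x = (u - ox) / f_out;       /* normalized ray direction */
    double y = (v - oy) / f_out;
    double rho = sqrt(x * x + y * y);
    double theta = atan(rho);          /* angle from optical axis  */
    double r = cam->fx * theta;        /* equidistant projection   */
    double phi = atan2(y, x);

    *src_x = cam->cx + r * cos(phi);
    *src_y = cam->cy + r * sin(phi);
}

int main(void)
{
    fisheye_params_t cam = { .fx = 320.0, .cx = 640.0, .cy = 400.0 };
    double sx, sy;

    /* Where does output pixel (1000, 400) of a 1280x800 rectified view
     * come from in the fisheye frame? */
    undistort_map(&cam, 500.0, 640.0, 400.0, 1000.0, 400.0, &sx, &sy);
    printf("sample fisheye pixel (%.1f, %.1f)\n", sx, sy);
    return 0;
}
```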

Next, the system stitches the corrected video frames into a single bird’s eye view. Mind you, this step isn’t simply a matter of aligning pixels from several overlapping frames. Because each camera points in a different direction, each will generate video with unique color balance and brightness levels. Consequently, the system must perform photometric alignment of the image. In other words, it corrects these mismatches to make the resulting output look as if it were taken by a single camera hovering over the vehicle.
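The same goes for photometric alignment. Below is a small, hypothetical sketch of the core idea: measure how bright each camera's contribution to its overlap regions is, then derive per-camera gains that pull everyone toward a common target. A production system would solve a global optimization per color channel and feather the seams to hide residual differences:

```c
/* photometric_align.c: derive per-camera brightness gains from the
 * luma each camera contributes to its overlap regions.
 *
 * Illustrative sketch with toy data; real systems optimize globally
 * per color channel and blend seams.
 */
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

#define NUM_CAMS 4

static double mean_luma(const uint8_t *luma, size_t count)
{
    double sum = 0.0;
    for (size_t i = 0; i < count; i++)
        sum += luma[i];
    return count ? sum / (double)count : 0.0;
}

/* Gain for each camera so all converge toward the average brightness. */
static void photometric_gains(const uint8_t *overlap[NUM_CAMS],
                              const size_t counts[NUM_CAMS],
                              double gains[NUM_CAMS])
{
    double means[NUM_CAMS], target = 0.0;

    for (int i = 0; i < NUM_CAMS; i++) {
        means[i] = mean_luma(overlap[i], counts[i]);
        target += means[i] / NUM_CAMS;
    }
    for (int i = 0; i < NUM_CAMS; i++)
        gains[i] = (means[i] > 0.0) ? target / means[i] : 1.0;
}

int main(void)
{
    uint8_t front[] = { 100, 110, 105 }, right[] = { 140, 150, 145 },
            rear[]  = { 120, 125, 130 }, left[]  = {  90,  95, 100 };
    const uint8_t *overlap[NUM_CAMS] = { front, right, rear, left };
    const size_t counts[NUM_CAMS] = { 3, 3, 3, 3 };
    double gains[NUM_CAMS];

    photometric_gains(overlap, counts, gains);
    for (int i = 0; i < NUM_CAMS; i++)
        printf("camera %d gain: %.3f\n", i, gains[i]);
    return 0;
}
```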

Moving down-market
If you think that all this work takes serious compute power, you’re right. The real trick, though, is to make the system affordable so that luxury car owners aren’t the only ones who can benefit from surround view.

Which brings me to QNX Software Systems’ support for TI’s new TDA2Eco system-on-chip (SoC), which is optimized for 3D surround view and park-assist applications. The TDA2Eco integrates a variety of automotive peripherals, including CAN and Gigabit Ethernet AVB, and supports up to eight cameras through parallel, serial and CSI-2 interfaces. To enable 3D viewing, the TDA2Eco includes an image processing accelerator for decoding multiple camera streams, along with graphics accelerators for rendering virtual views.

Naturally, surround view also needs software, which is where the QNX OS for Safety comes in. The OS can play several roles in surround-view systems, such as handling camera input, hosting device drivers for camera panning and control, and rendering the processed video onto the display screen, using QNX Software Systems’ high-performance Screen windowing system. The QNX OS for Safety complies with the ISO 26262 automotive functional safety standard and has a proven history in safety-critical systems, making it ideally suited for collision warning, surround view, and a variety of other ADAS applications.
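For a feel of the rendering path, here is a bare-bones sketch using the Screen windowing API: create a window, set up double buffering, and post frames to the display. It is a skeleton only; error checking, the actual pixel writes (typically done by the GPU or imaging accelerator), and buffer negotiation with the capture pipeline are all omitted, and details vary by target:

```c
/* post_view.c: skeletal Screen example that posts a composited
 * bird's-eye frame to the display.
 *
 * Sketch only: no error handling, and the stitched frame is assumed
 * to be written into the render buffer by other hardware/software.
 */
#include <screen/screen.h>

int main(void)
{
    screen_context_t ctx;
    screen_window_t  win;
    int usage = SCREEN_USAGE_WRITE;
    int size[2] = { 1280, 720 };           /* display size: assumed  */
    int rect[4] = { 0, 0, 1280, 720 };     /* full-window dirty rect */

    screen_create_context(&ctx, SCREEN_APPLICATION_CONTEXT);
    screen_create_window(&win, ctx);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_USAGE, &usage);
    screen_set_window_property_iv(win, SCREEN_PROPERTY_SIZE, size);
    screen_create_window_buffers(win, 2);  /* double-buffered */

    for (;;) {
        screen_buffer_t bufs[2];
        screen_get_window_property_pv(win, SCREEN_PROPERTY_RENDER_BUFFERS,
                                      (void **)bufs);

        /* ...write the geometrically and photometrically aligned,
         * stitched frame into bufs[0] here... */

        screen_post_window(win, bufs[0], 1, rect, 0);  /* flip */
    }
    /* not reached */
}
```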

Okay, enough from me. Let’s look at a video, hosted by TI’s Gaurav Agarwal, to see how the TDAx product line can support surround-view applications:



For more information on the TDAx product line, visit the TI website; for more on the QNX OS for Safety, visit the QNX website.

Tuesday, 03 November 2015

An ADAS glossary for the acronym challenged

If you’ve got ACD, you’ve come to the right place.

Paul Leroux
Someday, in the not-so-distant future, your mechanic will tell you that your CTA sensor has gone MIA. Or that your EDA needs an OTA update. Or that the camera system for your PLD has OSD. And when that day comes, you’ll be glad you stumbled across this post. Because I am about to point you to a useful little glossary that takes the mystery out of ADAS acronyms. (The irony being, of course, that ADAS is itself an acronym.)

Kidding aside, acronyms can stand in the way of clear communication — but only when used at the wrong time and place. Otherwise, they serve as useful shorthand, especially among industry insiders who have better things to do than say “advanced driver assistance system” 100 times a day when they can simply say ADAS instead.

In any case, you can find the glossary here. And when you look at it, you’ll appreciate my ulterior motive for sharing the link — to demonstrate that the ADAS industry is moving apace. The glossary makes it abundantly clear that the industry is working on, or has already developed, a large variety of ADAS systems. The number will only increase, thanks to government calls for vehicle safety standards, technology advances that make ADAS solutions more cost-effective, and growing consumer interest in cars that can avoid crashes. In fact, Visiongain has estimated that the global ADAS market will experience double-digit growth between 2014 and 2024, from a baseline estimate of $18.2 billion.
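For a rough sense of scale, take the low end of double-digit growth (10% per year, my assumption rather than Visiongain's stated rate) compounded over the decade:

\[ \$18.2\,\mathrm{B} \times 1.10^{10} \approx \$47.2\,\mathrm{B} \text{ by } 2024 \]

That is, even the most conservative reading implies the market more than doubles.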

And in case you’re wondering, ACD stands for acronym challenged disorder. ;-)

Wednesday, 28 October 2015

Five reasons why they should test autonomous cars in Ontario

Did I say five? I meant six…

Paul Leroux
It was late and I needed to get home. So I shut down my laptop, bundled myself in a warm jacket, and headed out to the QNX parking lot. A heavy snow had started to fall, making the roads slippery — but was I worried? Not really. In Ottawa, snow is a fact of life. You learn to live with it, and you learn to drive in it. So I cleared off the car windows, hopped in, and drove off.

Alas, my lack of concern was short-lived. The further I drove, the faster and thicker the snow fell. And then, it really started to come down. Pretty soon, all I could see out my windshield was a scene that looked like this, but with even less detail:



That’s right: a pure, unadulterated whiteout. Was I worried? Nope. But only because I was in a state of absolute terror. Fortunately, I could see the faintest wisp of tire tracks immediately in front of my car, so I followed them, praying that they didn’t lead into a ditch, or worse. (Spoiler alert: I made it home safe and sound.)

Of course, it doesn’t snow every day in Ottawa — or anywhere else in Ontario, for that matter. That said, we can get blanketed with the white stuff any time from October until April. And when we do, the snow can play havoc with highways, railways, airports, and even roofs.

Roofs, you say? One morning, a few years ago, I heard a (very) loud noise coming from the roof of QNX headquarters. When I looked out, this is what I saw — someone cleaning off the roof with a snow blower! So much snow had fallen that the integrity of the roof was being threatened:



When snow like this falls on the road, it can tax the abilities of even the best driver. But what happens when the driver isn’t a person, but the car itself? Good question. Snow and blowing snow can mask lane markers, cover street signs, and block light-detection sensors, making it difficult for an autonomous vehicle to determine where it should go and what it should do. Snow can even trick the vehicle into “seeing” phantom objects.

And it’s not just snow. Off the top of my head, I can think of four other phenomena common to Ontario roads that pose a challenge to human and robot drivers alike: black ice, freezing rain, extreme temperatures, and moose. I am only half joking about the last item: autonomous vehicles must respond appropriately to local fauna, not least when the animal in question weighs half a ton.

To put it simply, Ontario would be a perfect test bed for advancing the state of autonomous technologies. So imagine my delight when I learned that the Ontario government has decided to do something about it.

Starting January 1, Ontario will become the first Canadian province to allow road testing of automated vehicles and related technology. The provincial government is also pledging half a million dollars to the Ontario Centres of Excellence Connected Vehicle/Automated Vehicle Program, in addition to $2.45 million already provided.

The government has also installed some virtual guard rails. For instance, it insists that a trained driver stay behind the wheel at all times. The driver must monitor the operation of the autonomous vehicle and take over control whenever necessary.

Testing autonomous vehicles in Ontario simply makes sense, but not only because of the weather. The province also has a lot of automotive know-how. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here, as do 350 parts suppliers. Moreover, the province has almost 100 companies and institutions involved in connected vehicle and automated vehicle technologies — including, of course, QNX Software Systems and its parent company, BlackBerry.

So next time you’re in Ontario, take a peek at the driver in the car next to you. But don’t be surprised if he or she isn’t holding the steering wheel.


A version of this post originally appeared in the Connected Car Expo blog.

Tuesday, 20 October 2015

ADAS: The ecosystem's next frontier

At DevCon last week, Renesas showcased their ADAS concept vehicle. It was just what you would expect from an advanced demonstration, combining radar, lidar, cameras, V2X, algorithms, multiple displays and a huge amount of software to make it all work. They were talking about sensor fusion and complete surround view and, well, you get the picture.

What isn’t readily obvious as you experience the demo is the investment made and the collaboration required by Renesas and their ADAS ecosystem.

Partnership is a seldom-recognized cornerstone of what will ultimately become true sensor fusion. It seems, to me at least, unlikely that anyone will be able to develop the entire system on their own. As processors become more and more powerful, the discrete ECUs will start to collapse into less distributed architectures with much more functionality on each chip. The amount of data coming into and being transmitted by the vehicle will continue to grow, and the need to secure it will grow alongside. V2X, high-definition map data, algorithms, specialized silicon, vision acceleration and more will become the norm in every vehicle.

How about QNX Software Systems? Are we going to do all of this on our own? I doubt it. Instead, we will continue to build on the same strategy that has helped take us to a leadership position in the infotainment market: collaborating with best-of-breed companies to deliver a solution on a safety-certified foundation that customers can leverage to differentiate their products.

The view from above at Renesas DevCon.

Wednesday, 14 October 2015

What does a decades-old thought experiment have to do with self-driving cars?

Paul Leroux
Last week, I discussed, ever so briefly, some ethical issues raised by autonomous vehicles — including the argument that introducing them too slowly could be considered unethical!

My post included a video link to the trolley problem, a thought experiment that has long served as a tool for exploring how people make ethical decisions. In its original form, the trolley problem is quite simple: You see a trolley racing down a track on which five people are tied up. Next to you is a lever that can divert the trolley to an empty track. But before you can pull the lever, you notice that someone is, in fact, tied up on the second track. Do you do nothing and let all five people die, or do you pull the lever and kill the one person instead?

The trolley problem has been criticized for failing to represent real-world problems, for being too artificial. But if you ask Patrick Lin, a Cal Poly professor who has delivered talks at Google and Tesla on the ethics of self-driving cars, it can serve as a helpful teaching tool for automotive engineers — especially if its underlying concept is framed in automotive terms.

Here is how he presents it:

“You’re driving an autonomous car in manual mode—you’re inattentive and suddenly are heading towards five people at a farmer’s market. Your car senses this incoming collision, and has to decide how to react. If the only option is to jerk to the right, and hit one person instead of remaining on its course towards the five, what should it do?”

Of course, autonomous cars, with their better-than-human driving habits (e.g. people tailgate, robot cars don’t) should help prevent such difficult situations from happening in the first place. In the meantime, thinking carefully through this and other scenarios is just one more step on the road to building fully autonomous, and eventually driverless, cars.

Read more about the trolley problem and its application to autonomous cars in a recent article in The Atlantic.

Speaking of robot cars, if you missed last week's webinar on the role of software when transitioning from ADAS to autonomous driving, don't sweat it. It's now available on demand at Techonline.

Wednesday, 07 October 2015

The ethics of robot cars

“By midcentury, the penetration of autonomous vehicles... could ultimately cause vehicle crashes in the U.S. to fall from second to ninth place in terms of their lethality ranking.” — McKinsey

Paul Leroux
If you saw a discarded two-by-four on the sidewalk, with rusty nails sticking out of it, what would you do? Chances are, you would move it to a safe spot. You might even bring it home, pull the nails out, and dispose of it properly. In any case, you would feel obliged to do something that reduces the probability of someone getting hurt.

Driver error is like a long sharp nail sticking out of that two-by-four. It is, in fact, the largest single contributor to road accidents. Which raises the question: If the auto industry had the technology, skills, and resources to build vehicles that could eliminate accidents caused by human error, would it not have a moral obligation to do so? I am speaking, of course, of self-driving cars.

Now, a philosopher I am not. I am ready to accept that my line of thinking on this matter has more holes than Swiss cheese. But if so, I’m not the only one with Emmenthal for brain matter. I am, in fact, in good company.

Take, for example, Bryant Walker Smith, a professor in the schools of law and engineering at the University of South Carolina. In an article in MIT Technology Review, he argues that, given the number of accidents that involve human error, introducing self-driving technology too slowly could be considered unethical. (Mind you, he also underlines the importance of accepting ethical tradeoffs. We already accept that airbags may kill a few people while saving many; we may have to accept that the same principle will hold true for autonomous vehicles.)

Then there’s Roger Lanctot of Strategy Analytics. He argues that government agencies and the auto industry need to move much more aggressively on active-safety features like automated lane keeping and automated collision avoidance. He reasons that, because the technology is readily available — and can save lives — we should be using it.

Mind you, the devil is in the proverbial details. In the case of autonomous vehicles, the ethics of “doing the right thing” is only the first step. Once you decide to build autonomous capabilities into a vehicle, you often have to make ethics-based decisions as to how the vehicle will behave.

For instance, what if an autonomous car could avoid a child running across the street, but only at the risk of driving itself, and its passengers, into a brick wall? Whom should the car be programmed to save? The child or the passengers? And what about a situation where the vehicle must hit either of two vehicles — should it hit the vehicle with the better crash rating? If so, wouldn’t that penalize people for buying safer cars? This scenario may sound far-fetched, but vehicle-to-vehicle (V2V) technology could eventually make it possible.

The “trolley problem” captures the dilemma nicely:



Being aware of such dilemmas gives me more respect for the kinds of decisions automakers will have to make as they build a self-driving future. But you know what? All this talk of ethics brings something else to mind. I work for a company whose software has, for decades, been used in medical devices that help save lives. Knowing that we do good in the world is a daily inspiration — and has been for the last 25 years of my life. And now, with products like the QNX OS for Safety, we are starting to help automotive companies build ADAS systems that can help mitigate driver error and, ultimately, reduce accidents. So I’m doubly proud.

More to the point, I believe this same sense of pride, of helping to make the road a safer place, will be a powerful motivator for the thousands of engineers and development teams dedicated to paving the road from ADAS to autonomous. It’s just one more reason why autonomous cars aren’t a question of if, but only of when.

Wednesday, 30 September 2015

A low-down look at the QNX concept cars

Paul Leroux
It’s that time of year again. The QNX concept team has set the wheels in motion and started work on a brand new technology concept car, to be unveiled at CES 2016.

The principle behind our technology concept cars is simple in theory, but challenging in practice: Take a stock production vehicle off the dealer’s lot, mod it with new software and hardware, and create user experiences that make driving more connected, more enjoyable, and, in some cases, even safer.

It’s always fun to guess what kind of car the team will modify. But the real story lies in what they do with it. In recent years, they’ve implemented cloud-based diagnostics, engine sound enhancement, traffic sign recognition, collision warnings, speed alerts, natural voice recognition — the list goes on. There’s always a surprise or two, and I intend to keep it that way, so no hints about the new car until CES. ;-)

In the meantime, here is a retrospective of QNX technology concept cars, past and present. It’s #WheelWednesday, so instead of the usual eye candy, I’ve chosen images to suit the occasion. Enjoy.

The Maserati Quattroporte GTS
From the beginning, our technology concept cars have demonstrated how the QNX platform helps auto companies create connected (and compelling) user experiences. The Maserati, however, goes one step further. It shows how QNX can enable a seamless blend of infotainment and ADAS technologies to simplify driving tasks, warn of possible collisions, and enhance driver awareness. The car can even recommend an appropriate speed for upcoming curves. How cool is that?




The Mercedes CLA 45 AMG
By their very nature, technology concept cars have a short shelf life. The Mercedes, however, has defied the odds. It debuted in January 2014, but is still alive and well in Europe, and is about to be whisked off to an event in Dubai. The car features a multi-modal user experience that blends touch, voice, physical buttons, and a multi-function controller, enabling users to interact naturally with infotainment functions. The instrument cluster isn’t too shabby, either. It will even warn you to ease off the gas if you exceed the local speed limit.




The Bentley Continental GT
I dubbed our Bentley the “ultimate show-me car,” partially because that’s exactly what people would ask when you put them behind the wheel. The digital cluster was drop-dead gorgeous, but the head unit was the true pièce de résistance — an elegantly curved 17” high-definition display based on TI’s optical touch technology. And did I mention? The car’s voice rec system spoke with an English accent.




The Porsche 911 Carrera
Have you ever talked to a Porsche? Well, in this case, you could — and it would even talk back. We outfitted our 911 with cloud-based voice recognition (so you could control the nav system using natural language) and text-to-speech (so you could listen to incoming BBMs, emails, and text messages). But my favorite feature was one-touch Bluetooth pairing: you simply touched your phone to an NFC reader in the center console and, hey presto, the phone and car were automatically paired.




The Chevrolet Corvette
I have a confession to make: The Corvette is the only QNX technology concept car that I got to drive around the block. For some unfathomable reason, they never let me drive another one. Which is weird, because I saw the repair bill, and it wasn’t that much. In any case, the Corvette served as the platform for the very first QNX technology concept car, back in 2010. It included a reconfigurable instrument cluster and a smartphone-connected head unit — features that would become slicker and more sophisticated in our subsequent concept vehicles. My favorite feature: the reskinnable UI.




The Jeep Wrangler
Officially, the Wrangler serves as the QNX reference vehicle, demonstrating what the QNX CAR Platform can do out of the box. But it also does double-duty as a concept vehicle, showing how the QNX platform can help developers build leading-edge ADAS solutions. My favorite features: in-dash collision warnings and a fast-booting backup display.



Well, there you have it. In just a few months’ time, we will have the honor of introducing you to a brand new QNX technology concept car. Any guesses as to what the wheels will look like?



If you liked this post, you may also be interested in... The lost concept car photos

Thursday, 24 September 2015

Developing safety-critical systems? This book is for you

In-depth volume covers development of systems under the IEC 61508, ISO 26262, EN 50128, and IEC 62304 standards

Paul Leroux
In June, I told you of an upcoming book by my colleague Chris Hobbs, who works as a software safety specialist here at QNX Software Systems. Well, I’m happy to say that the book is now available. It’s called Embedded Software Development for Safety-Critical Systems and it explores design practices for building medical devices, railway control systems, industrial control systems, and, of course, automotive ADAS devices.

The book:
  • covers the development of safety-critical systems under ISO 26262, IEC 61508, EN 50128, and IEC 62304
  • helps developers learn how to justify their work to external auditors
  • discusses the advantages and disadvantages of architectural and design practices recommended in the standards, including replication and diversification, anomaly detection, and so-called “safety bag” systems
  • examines the use of open-source components in safety-critical systems
Interested? I invite you to visit the CRC Press website, where you can view the full Table of Contents and, of course, order the book.

Looking forward to getting my copy!

Tuesday, 22 September 2015

From ADAS to autonomous

A new webinar on how autonomous driving technologies will affect embedded software — and vice versa

Paul Leroux
When, exactly, will production cars become fully autonomous? And when will they become affordable to the average Jane or Joe? Good questions both, but in the meantime, the auto industry isn’t twiddling its collective thumbs. It’s already starting to build a more autonomous future through active-control systems that can avoid accidents (e.g. automated emergency braking) and handle everyday driving tasks (e.g. adaptive cruise control).

These systems rely on software to do their job, and that reliance will grow as the systems become more sophisticated and cars become more fully autonomous. This trend, in turn, will place enormous pressure on how the software is designed, developed, and maintained. Safety, in particular, must be front and center at every stage of development.

Which brings me to a new webinar from my inestimable colleague, Kerry Johnson. Titled “The Role of a Software Platform When Transitioning from ADAS to Autonomous Driving,” the webinar will examine:
  • the emergence of high-performance systems-on-chip that target ADAS and autonomous vehicle applications
  • the impact of increasing system integration and autonomous technologies on embedded software
  • the need for functional safety standards such as ISO 26262
  • the emergence of pre-certified products as part of the solution to address safety challenges
  • the role of a software platform to support the evolution from ADAS to autonomous driving

If you are tasked with either developing or sourcing software for functional safety systems in passenger vehicles, this webinar is for you. Here are the coordinates:

Wednesday, October 7
1:00pm EDT

Registration Site



Tuesday, 08 September 2015

One OS, multiple safety applications

The latest version of our certified OS for ADAS systems and digital instrument clusters has a shorter product name — but a longer list of talents.

Paul Leroux
Can you ever deliver a safety-critical product to a customer and call it a day? For that matter, can you deliver any product to a customer and call it a day? These, of course, are rhetorical questions. Responsibility for a product rarely ends when you release it, especially when you add safety to the mix. In that case, it’s a long-term commitment that continues until the last instance of the product is retired from service. Which can take decades.

Mind you, people dedicated to building safety-critical products aren’t prone to sitting on their thumbs. From their perspective, product releases are simply milestones in a process of ongoing diligence and product improvement. For instance, at QNX Software Systems, we subject our OS safety products to continual impact analysis, even after they have been independently certified for use in functional safety systems. If that analysis calls for improved product, then improved product is what we deliver. With a refreshed certificate, of course.

Which brings me to the QNX OS for Safety. It’s a new — and newly certified — release of our field-proven OS safety technology, with a twist. Until now, we had one OS certified to the ISO 26262 standard (for automotive systems) and another certified to the IEC 61508 standard (for general embedded systems). The new release is certified to both of these safety standards and replaces the two existing products in one fell swoop.

So if you no longer see the QNX OS for Automotive Safety listed on the QNX website, not to worry. We’ve simply replaced it with an enhanced version that has a shorter product name and broader platform support — all with the same proven technology under the hood. (My colleague Patryk Fournier has put together an infographic that nicely summarizes the new release; see sidebar).

And if you’re at all surprised that a single OS can be certified to both 61508 and 26262, don’t be. As the infographic suggests, IEC 61508 provides the basis for many market-specific standards, including IEC 62304, EN 5012x, and, of course, ISO 26262.

Learn more about the QNX OS for Safety on the QNX website. And for more information on ISO 26262 and how it affects the design of safety-critical automotive systems, check out these whitepapers:


Tuesday, 25 August 2015

TWICE applauds QNX OS-powered OnStar 4G LTE with VIP Award

Megan Alink
Our customers are all VIPs, and we love nothing more than seeing them shine with industry recognition. Recently, TWICE named OnStar 4G LTE, powered by the QNX Neutrino OS, to its list of Very Important Product (VIP) Award winners in the in-dash navigation multimedia receivers category.

The product builds a Wi-Fi hotspot into the vehicle so customers can stay online easily while they’re on the go. Up to seven devices, including computers, smartphones, video game consoles and tablets, can be paired to the hotspot for use any time the car is on. OnStar 4G LTE also gives customers access to the same features that OnStar is known for, including emergency assistance, security, navigation and vehicle diagnostics.

Congratulations to our customer OnStar and the rest of the TWICE VIPs! You can view the full list of categories and winners on the TWICE website.

Thursday, 30 July 2015

What do Taylor Swift’s legs and QNX Acoustics have in common?

By Megan Alink, Marketing Communications Director for Automotive

OK, so at first blush, nothing. Let me explain.

It starts with a number. A really big number. Think “40 million” — because I recently learned that that’s how many units of our QNX Acoustics for Voice product have shipped to date. Wow!

QNX Acoustics for Voice is a complete signal processing software solution for automotive voice communications, based on production-proven acoustics technology. It’s designed specifically to meet the acoustics challenges unique to the in-car environment, sets a new benchmark in hands-free quality, and, very importantly, supports the rigorous requirements of smartphone connectivity. Check out the product page for more information.

Obviously 40 million is a number worth talking about, so what’s the first thing that any marketing writer worth her salt does? She looks for an analogy to help put this impressive number into context. Number of steps it takes to go around the world? Population of California? Google comes in handy in such situations, and eventually it delivered the perfect informational nugget: Taylor Swift’s legs are reportedly insured for $40 million. What does this have to do with automotive acoustics? Well, clearly Ms. Swift’s legs are critical to her success as an entertainer, and, as anyone in the business of sound and noise knows, our acoustics engineers couldn’t work their magic — and achieve 40 million units sold — without their finely honed hearing. The conclusion is obvious. We must insure their ears for $40 million as well. All in favour?

Congratulations to everyone who has played a role in putting this groundbreaking technology into the hands (and ears) of our customers, and many thanks to those customers for helping QNX achieve this amazing milestone.

With thanks to Phil Hetherington and Len Layton for the idea…

Thursday, 23 July 2015

Intel and QNX Software Systems talk connected cars

By Megan Alink, Marketing Communications Director for Automotive

How many collaborations in embedded have lasted more than three decades? Our relationship with Intel comes immediately to mind.

I’m told it all began in 1981, when the first IBM PCs ran the Intel 8088 microprocessor and the OS could be swapped out for QNX. A quick trip to your local Sears, ComputerLand or IBM Product Center, followed by an order to QNX Support, and the most reliable computer the market could offer was all yours.

The first IBM PC: 21 lbs without the diskette drive and complete with a cassette player jack.
Photo credit: https://www-03.ibm.com/ibm/history/exhibits/pc25/pc25_album.html
Fast forward 34 years, and QNX and Intel are still changing the technology landscape. Earlier this summer, Intel announced their plans to work with us on technologies for a variety of connected car applications, including infotainment systems, digital instrument clusters and advanced driver assistance systems (ADAS). Right around that same time, our Andrew Poliak, global director of business development, met up with Ken Obuszewski, director of automotive marketing for Intel, at TU-Automotive Detroit, where they spoke on camera with Bill Hampton from The AutoBeat Group. The topic was, of course, the connected car, and both gentlemen made compelling comments about the future of this exciting aspect of automotive and our mutual plans to deliver:

“With the evolution of the connected car, the capabilities that you’re seeing in the vehicle are really starting to expand dramatically…QNX is a long-time leader in the automotive market, Intel – we’re one of the technology leaders making a large investment in automotive, [so] it’s very natural for us to expand our working relationship and to bring consumers great technologies going forward.” — Ken Obuszewski

“Making systems that can be upgradable and updateable even after you purchase the vehicle so that way it stays fresh and current over the life of the vehicle is really a key foundation of a software platform running in a real[ly] flexible architecture like Intel’s.” — Andrew Poliak

Check out the entire video below. Many thanks to our friends at Intel for this opportunity to talk about our shared vision.

Tuesday, 30 June 2015

It seems like only yesterday...

By Megan Alink, Director of Marketing Communications for Automotive

What were you doing on September 14, 1999? It was likely an inauspicious day for most people, but for QNX, the date represented our official entry into the automotive market:


Don’t get me wrong — QNX was no tentative newcomer on the scene. After all, we were marking almost two decades in the embedded software business. QNX OS technology was already powering mission-critical systems for credit card processing, energy generation, healthcare, mail sorting, precision manufacturing, mining, security, and warehouse automation worldwide. (Whew!) But it was time to take that reliability and flexibility to more markets, ones with needs similar to our existing customer base. Enter automotive. (And we did.)

Today, we are pleased to be able to say that QNX software is found in more than 60 million vehicles on the road. In telematics systems like OnStar. In infotainment services like Volkswagen's RNS 850 GPS navigation system and Ford SYNC 3. In the digital instrument clusters of the state-of-the-art Audi TT and Mercedes S-Class Coupé.

60 million is a very big number. Obviously, we wouldn’t have reached this milestone without the support of our Tier 1 customers who build QNX into their systems every day, the 40+ automakers who choose these QNX-based systems, and our ecosystem of automotive partners who enrich our offering with their market-leading innovations. We want to thank all of these companies for the exciting and challenging opportunities they give us. Here’s to the next 60 million!

Monday, 29 June 2015

The A to Z of QNX in cars

Over 26 fast facts, brought to you by the English alphabet

Paul Leroux
A is for Audi, one of the first automakers to use QNX technology in its vehicles. For more than 15 years, Audi has put its trust in QNX, in state-of-the-art systems like the Audi virtual cockpit and the MIB II modular infotainment system. A is also for QNX acoustics software, which enhances hands-free voice communications, eliminates “boom noise” created by fuel-saving techniques, and even helps automakers create signature sounds for their engines.

B is for Bentley, BMW, and Buick, and for their QNX-powered infotainment systems, which include BMW ConnectedDrive and Buick Intellilink.

C is for concept vehicles, including the latest QNX technology concept car, a modded Maserati Quattroporte GTS. The car integrates an array of technologies — including cameras, LiDAR, ultrasonic sensors, and specialized navigation engines — to show how QNX-based ADAS systems can simplify driving tasks, warn of possible collisions, and enhance driver awareness.

D is for the digital instrument clusters in vehicles from Alfa Romeo, Audi, GM, Jaguar, Mercedes-Benz, and Land Rover. These QNX-powered displays can reconfigure themselves on the fly, providing quick, convenient access to turn-by-turn directions, back-up video, incoming phone calls, and a host of other information.

E is for experience. QNX has served the automotive market since the late 1990s, working with car makers and tier one suppliers to create infotainment systems for tens of millions of vehicles. QNX has been at work in safety-critical industrial applications even longer — since the 1980s. This unique pedigree makes QNX perfectly suited for the next generation of in-vehicle systems, which will consolidate infotainment and safety-related functions on a single, cost-effective platform.

F is for Ford, which has chosen the QNX Neutrino OS for its new SYNC 3 infotainment system. The system will debut this summer in the 2016 Ford Escape and Ford Fiesta and will be one of the first infotainment systems to support both Apple CarPlay and Android Auto.

G is for GM and its QNX-based OnStar system, which is now available in almost all of the company’s vehicles. GM also uses QNX OS and acoustics technology in several infotainment systems, including the award-winning Chevy MyLink.

H is for hypervisor. By using the QNX Hypervisor, automotive developers can consolidate multiple OSs onto a single system-on-chip to reduce the cost, size, weight, and power consumption of their designs. The hypervisor can also simplify safety certification efforts by keeping safety-related and non-safety-related software components isolated from each other.

I is for the ISO 26262 standard for functional safety in road vehicles. The QNX OS for Automotive Safety has been certified to this standard, at Automotive Safety Integrity Level D — the highest level achievable. This certification makes the OS suitable for a wide variety of digital clusters, heads-up displays, and ADAS applications, from adaptive cruise control to pedestrian detection.

J is for Jeep. The QNX reference vehicle, based on a Jeep Wrangler, showcases what the QNX CAR Platform for Infotainment can do out of the box. In its latest iteration, the reference vehicle ups the ante with traffic sign detection, lane departure warnings, curve speed warnings, collision avoidance alerts, backup displays, and other ADAS features for enhancing driver awareness.

K is for Kia, which uses QNX technology in the infotainment and connectivity systems for several of its vehicles.

L is for LG, a long-time QNX customer that is using several QNX technologies to develop a new generation of infotainment systems, digital clusters, and ADAS systems for the global automotive market.

M is for Mercedes-Benz, which offers QNX-based infotainment systems in several of its vehicles, including the head unit and digital instrument cluster in the S Class Coupe. M is also for market share: according to IHS Automotive, QNX commands more than 50% of the infotainment software market.

N is for navigation. Thanks to the navigation framework in the QNX CAR Platform, automakers can integrate a rich variety of navigation solutions into their cars.

O is for the over-the-air update solution of the BlackBerry IoT Platform, which will help automakers cut maintenance costs, reduce expensive recalls, improve customer satisfaction, and keep vehicles up to date with compelling new features long after they have rolled off the assembly line.

P is for partnerships. When automotive companies choose QNX, they also tap into an incredibly rich partner ecosystem that provides infotainment apps, smartphone connectivity solutions, navigation engines, automotive processors, voice recognition engines, user interface tools, and other pre-integrated technologies. P is also for Porsche, which uses the QNX Neutrino OS in its head units, and for Porsche 911, which formed the basis of one of the first QNX concept cars.

Q is for the QNX CAR Platform for Infotainment, a comprehensive solution that pre-integrates partner technologies with road-proven QNX software to jump-start customer projects.

R is for the reliability that QNX OS technology brings to advanced driver assistance systems and other safety-related components in the vehicle — the same technology proven in space shuttles, nuclear plants, and medical devices.

S is for the security expertise and solutions that Certicom and QNX bring to automotive systems. S is also for the advanced smartphone integration of the QNX CAR Platform, which allows infotainment systems to support the latest brought-in solutions, such as Apple CarPlay and Android Auto. S is also for the scalability of QNX technology, which allows customers to use a single software platform across all of their product lines, from high-volume economy vehicles to luxury models. And last, but not least, S is for the more than sixty million vehicles worldwide that use QNX technology. (S sure is a busy letter!)

T is for Toyota, which uses QNX technology in infotainment systems like Entune and Touch ‘n’ Go. T is also for tools: using the QNX Momentics Tool Suite, automotive developers can root out subtle bugs and optimize the performance of their sophisticated, multi-core systems.

U is for unified user interface. With QNX, automotive developers can choose from a rich set of user interface technologies, including Qt, HTML5, OpenGL ES, and third-party toolkits. Better yet, they can blend these various technologies on the same display, at the same time, for the ultimate in design flexibility.

V is for the Volkswagen vehicles, including the Touareg, Passat, Polo, Golf, and Golf GTI, that use the QNX Neutrino OS and QNX middleware technology in their infotainment systems.

W is for the QNX Wireless Framework, which brings smartphone-caliber connectivity to infotainment systems, telematics units, and a variety of other embedded devices. The framework abstracts the complexity of modem control, enabling developers to upgrade cellular and Wi-Fi hardware without having to rewrite their applications.

X, Y, and Z are for the 3D navigation solutions and the 3D APIs and partner toolkits supported by the QNX CAR Platform. I could show you many examples of these solutions in action, but my personal favorite is the QNX technology concept car based on a Bentley Continental GT. Because awesome.

Before you go... This post mentions a number of automotive customers, but please don’t consider it a complete list. I would have gotten them all in, but I ran out of letters!

Wednesday, 24 June 2015

Developing software for safety-critical systems? Have I got a book for you

Paul Leroux
Chris Hobbs is the only person I know who holds a math degree with a specialization in mathematical philosophy. In fact, before I met him, I didn’t know such a thing even existed. But guess what? That’s one of the things I really like about Chris. The more I hang out with him, the more I learn.

Come to think of it, helping people learn has become something of a specialty for Chris. He is, for example, a flying instructor and the author of Flying Beyond: The Canadian Commercial Pilot Textbook. And, as a software safety specialist at QNX Software Systems, he regularly provides advice to customers building systems that must comply with functional safety standards like IEC 61508, EN 5012x, and ISO 26262.

Chris has already written a number of papers on software safety, some of which I have had the great privilege to edit. You can find several of them on the QNX website. But recently, Chris upped the ante and wrote an entire book on the subject, titled Embedded Software Development for Safety-Critical Systems. The book:

  • covers the development of safety-critical systems under ISO 26262, IEC 61508, EN 50128, and IEC 62304
  • helps readers understand and apply remarkably esoteric development practices and be prepared to justify their work to external auditors
  • discusses the advantages and disadvantages of architectural and design practices recommended in the standards, including replication and diversification, anomaly detection, and so-called “safety bag” systems
  • examines the use of open-source components in safety-critical systems

I haven’t yet had a chance to review the book, but at 358 pages, it promises to be a substantial read.

Interested? Well, you can’t get the book just yet. But you can pre-order it today and get one of the first copies off the press. It’s scheduled for release September 1.


Tuesday, 23 June 2015

Concept car with QNX technology makes its European debut

Guest post from Matthias Stumpf, manager of automotive sales EMEA, QNX Software Systems


After making headlines in North America, the Mercedes-Benz CLA45 AMG, a concept car equipped with QNX technology, is now taking the leap across the pond for its European tour. The tour kicks off at the Automobile Elektronik Kongress on June 23 and 24 in Ludwigsburg, where the car will be exhibited in Europe for the first time.

Anyone who wants to see the Mercedes in action should stop by the main foyer of the congress, where we will show how the driver can interact in a completely natural and intuitive way with the car's built-in infotainment system and digital instrument cluster.

An extra-wide head unit
The car features an extra-wide head unit that keeps driver and front passenger informed through detailed graphics on a continuous interface spanning 7 to 21 inches. Thanks to its user-centric design, the infotainment system can be controlled via the touchscreen, physical buttons, the multi-function controller, or voice command. The system is based on the QNX CAR Platform for Infotainment, a comprehensive ecosystem that already integrates QNX Software Systems technologies and numerous partner offerings:

QNX 2014 technology concept car - infotainment system

A reconfigurable instrument cluster
The digital instrument cluster can be reconfigured dynamically, displaying real-time turn-by-turn directions, incoming phone calls, video from the front and rear on-board cameras, tachometer and speedometer, and other virtual instruments. At the push of a button on the steering wheel, it will even read incoming text messages aloud, so the driver can keep his or her eyes on the road:

QNX 2014 technology concept car - cluster

In addition, the cluster's "virtual mechanic" can call up status information such as tire pressure, brake wear, and fuel, oil, and windshield-washer fluid levels:



If you are interested in more information about the concept car's many features, read our previous blog posts here and here.

We look forward to welcoming you in Ludwigsburg! You can find all the other dates of the concept car's European tour here on our blog.

Tuesday, 02 June 2015

Digital instrument clusters and the road to autonomous driving

Guest post by Walter Sullivan, head of Innovation Lab, Silicon Valley, Elektrobit Automotive

Autonomous driving requires new user experience interfaces, always-on connectivity, new system architectures, and reliable security. In addition to these requirements, the real estate in the car is changing as we move towards autonomous driving, and the traditional display is being replaced by head-up displays (HUDs), digital instrument clusters, and other screens. The digital cluster is where automakers can blend traditional automotive status displays (such as odometer, speed, etc.) with safety features, entertainment, and navigation, providing a more personalized, safe, comfortable, and enjoyable driving experience.

For autonomous vehicles, the human-machine interface (HMI) will change with the level of autonomy. Until vehicles are fully autonomous, all the traditional functions of the in-car HMI must be covered and driver distraction needs to be minimized. As we progress through piloted drive towards full autonomy, additional functions are taking center stage in the instrument cluster: driver assistance (distance to vehicle in front, speed limit, optimized time to destination/fuel consumption, object detection, etc.).

The digital instrument cluster brings a number of benefits to the driver experience including:
  • Comfort: The more information that a driver has about the route, right before his or her eyes, the more comfortable the drive. Digital clusters that provide map data, not just routing guidance but information on the nearest gas station, traffic, upcoming toll roads, etc., give the most comfort by empowering the driver with the information needed to get to the destination quickly and safely.
  • Safety: Drivers benefit from cars that know what’s on the road ahead. Through electronic horizon-based features, clusters can display “predictive” driver-assistance information that delivers to the driver important messages regarding safety.
  • Entertainment: Consumers are looking for vehicles that allow them to transfer their digital lifestyle seamlessly into the driving experience. The cluster can enable such integration, allowing the driver to control a smartphone using the in-car system, stream music, make phone calls, and more.

As more software and technology enters the car and we move closer to the fully autonomous vehicle, the cluster will continue to be the main platform for HMI. Automakers are challenged to build the most user-friendly, personalized clusters they can, with today’s cars employing advanced visual controls that integrate 3D graphics and animation and even natural language voice control. Drivers will rely more heavily on the cluster to provide them information that ensures their safety and comfort during the ride.

Digital instrument cluster developed using EB technology, as shown in the QNX reference vehicle.

Curious about what this kind of technology looks like? Digital instrument clusters developed using Elektrobit (EB) Automotive software will be displayed at the QNX Software Systems booth (C92) during TU-Automotive Detroit, June 3-4. QNX will feature a demo cluster developed using EB GUIDE that integrates a simulated navigation route with EB street director, plus infotainment and car system data. You can also see EB technology in action in the QNX reference vehicle based on a Jeep Wrangler, in which EB street director and the award-winning EB Assist Electronic Horizon are both integrated in the digital cluster.


Walter Sullivan is head of Elektrobit (EB) Automotive’s newly established Silicon Valley Innovation Lab, responsible for developing and leading the company’s presence in Silicon Valley, as well as building and fostering strategic partnerships around the globe.

Visit Elektrobit here.

Wednesday, 20 May 2015

Reimagining digital instrument cluster design

Guest post by Jason Clarke, vice president, sales and marketing, Crank Software

Technology in cars has been advancing at an impressive rate. From rich infotainment systems to intelligent digital instrument clusters, today’s automobile has evolved to become a cool reality that many of us only envisioned as a possibility a few years ago. But while the technology has changed, the driver has stayed the same. Drivers still need to get from point A to point B as efficiently and safely as possible, while perhaps listening to some favorite road trip tunes on the journey.

What has changed for drivers is the sheer volume of information that is available while behind the wheel. Today’s vehicle can tell you more than the fact that you are desperately in need of finding the nearest gas station. It’s smart enough to let you know when you are getting close to hitting the neighbor’s garbage can… again. It can alert you to traffic pattern changes, road hazards, inclement weather, your affinity to your lead foot, and to the fact that your spouse is texting you to remind you to pick up the dry cleaning. It can also effortlessly re-route you back to the dry cleaners after you realize you’ve forgotten, providing you with helpful turn-by-turn navigation in your instrument cluster.

That’s a lot of information. And it’s only a small slice of what’s available to today’s driver. The simplicity, reliability, and safety capabilities of platforms by QNX Software Systems make it possible to have a wide range of technologies and features in a single vehicle, offering up an abundance of data for driver consumption.

So, how do we make this data useful for drivers? What do we need to consider when designing the UI for digital instrument clusters?

How much information does the driver REALLY need?
Information should be helpful, not intrusive or distracting from the task at hand — driving. The point of having more data available to drivers isn’t to show it all at the same time. That’s visually noisy and complex. Complex isn’t better; context is better. Turn-by-turn information can be displayed in the instrument cluster, based on communication from the navigation system. Video of the car’s surroundings can be displayed when parking assist services are engaged. Advanced driver assistance systems (ADAS) can present alerts in the cluster for immediate hazards and objects.
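To make "context is better" concrete, here is a small, hypothetical C sketch (not Crank or QNX API) of priority-based selection of what the cluster should foreground:

```c
/* cluster_arbiter.c: hypothetical sketch of context-based selection of
 * cluster content. Immediate ADAS hazards win over parking video,
 * which wins over turn-by-turn guidance, which wins over gauges.
 */
#include <stdbool.h>
#include <stdio.h>

typedef enum {
    VIEW_GAUGES,        /* default: speed, rpm, fuel    */
    VIEW_TURN_BY_TURN,  /* active navigation guidance   */
    VIEW_PARK_ASSIST,   /* rear/surround camera video   */
    VIEW_ADAS_ALERT     /* imminent hazard, always wins */
} cluster_view_t;

typedef struct {
    bool hazard_active;    /* e.g., forward-collision warning */
    bool parking_engaged;  /* reverse gear / park assist on   */
    bool route_active;     /* navigation guidance running     */
} vehicle_context_t;

static cluster_view_t select_view(const vehicle_context_t *vc)
{
    if (vc->hazard_active)   return VIEW_ADAS_ALERT;
    if (vc->parking_engaged) return VIEW_PARK_ASSIST;
    if (vc->route_active)    return VIEW_TURN_BY_TURN;
    return VIEW_GAUGES;
}

int main(void)
{
    /* Backing out of the driveway while a route is active: the parking
     * view wins until a hazard appears. */
    vehicle_context_t vc = { .hazard_active = false,
                             .parking_engaged = true,
                             .route_active = true };
    printf("selected view: %d\n", select_view(&vc)); /* 2 = PARK_ASSIST */
    return 0;
}
```

A real cluster HMI would fold this into its state machine, of course; the point is only that a fixed, testable priority keeps the display simple no matter how much data the vehicle produces.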

Using tools that support rapid prototyping of design scenarios empowers teams to deliver the best user experience possible, serving up only the most relevant information. Using Storyboard Suite from Crank Software, teams can quickly cycle through design prototypes and perform testing on real hardware, focusing on the needs of the driver.

How do we best visualize the data?
It’s critical that drivers see and interpret displayed information as easily and quickly as possible. Clear visual representation of data is required, so it’s important to keep design considerations at the forefront in the development process. This is where the graphic designer comes in.

Crank Software’s Storyboard Suite allows the graphic designer to be integrated into the development process from concept to final HMI delivery, working in parallel with the engineers to ensure that fine details and subtle design nuances aren’t lost. With Storyboard Suite, designers don’t hand over a mockup to a developer to visually represent with code and then walk away. As the graphics change and evolve to satisfy usability requirements, the designer stays engaged throughout the entire process, helping to deliver a polished HMI.

Automotive cluster designed and developed with Crank Software Storyboard Suite, running on QNX Neutrino OS

Can we respond quickly to design change?
Remaining focused on the usability of the end design is critical to ensuring the safest driving experience. Delivering a high-performance, user-centric HMI requires testing, design refinements, retesting, and even further changes. This isn’t a linear process. While an iterative process is important, it’s often cost-prohibitive because it can introduce lengthy redesign cycles. Storyboard Suite provides teams the functionality to prototype and iterate through designs easily, using features such as Photoshop Re-import to quickly evaluate design changes on hardware and shorten development cycles. In addition, support for collaboration enables teams to share design and development work, thereby reducing the load on individuals and further optimizing time and resources.

A faster development process coupled with a user-focused end design is the key to delivering a highly usable and safe digital instrument cluster to market on schedule and within budget.

A digital instrument cluster developed with Storyboard Suite will be on display at TU-Automotive Detroit in the QNX Software Systems booth, #C92, and the Crank Software booth, #C113. Check out a previous Crank Software and QNX Software Systems collaboration with a Storyboard Suite UI in a QNX technology concept car.


Jason Clarke has over 15 years of experience in the embedded industry, in roles that span development, sales, and marketing. Jason heads up Crank Software’s marketing and sales initiatives.

Visit Crank Software here.