
Thursday, April 21, 2016

Autonomous cars that can navigate winter roads? ‘Snow problem!

A look at what happens when you equip a Ford Fusion with sensor fusion.

Paul Leroux
Let’s face it, cars and snow don’t mix. A heavy snowfall can tax the abilities of even the best driver — not to mention the best automated driving algorithm. As I discussed a few months ago, snow can mask lane markers, obscure street signs, and block light-detection sensors, making it difficult for an autonomous car to determine where it should go and what it should do. Snow can even trick the car into “seeing” phantom objects.

Automakers, of course, are working on the problem. Case in point: Ford’s autonomous research vehicles. These experimental Ford Fusion sedans create 3D maps of roads and surrounding infrastructure when the weather is good and visibility clear. They then use the maps to position themselves when the road subsequently disappears under a blanket of the white stuff.

How accurate are the maps? According to Ford, the vehicles can position themselves to within a centimeter of their actual location. Compare that to GPS, which is accurate to about 10 yards (9 meters).

To create the maps, the cars use LiDAR scanners. These devices collect a ginormous volume of data about the road and surrounding landmarks, including signs, buildings, and trees. Did I say ginormous? Sorry, I meant gimongous: 600 gigabytes per hour. The scanners generate so many laser points — 2.8 million per second — that some can bounce off falling snowflakes or raindrops, creating the false impression that an object is in the way. To eliminate these false positives, Ford worked with U of Michigan researchers to create an algorithm that filters out snow and rain.
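
The post doesn’t go into how the filtering works, but the core intuition is easy to sketch: a laser return from a snowflake tends to be isolated in space, while returns from roads, signs, and vehicles cluster tightly. Here is a minimal, hypothetical illustration of that neighbor-density idea (the thresholds are invented, and a production system would use spatial indexing rather than this brute-force search):

```python
# Hypothetical snow/rain filter for a LiDAR point cloud. The real
# Ford/University of Michigan algorithm is more sophisticated; this
# only illustrates the neighbor-density idea described above.
import numpy as np

def filter_sparse_returns(points, radius=0.3, min_neighbors=3):
    """Drop points with too few neighbors within `radius` meters.

    Returns from snowflakes and raindrops tend to be isolated in
    space; returns from roads, signs, and vehicles cluster densely.
    `points` is an (N, 3) array of x, y, z coordinates.
    """
    kept = []
    for i, p in enumerate(points):
        dists = np.linalg.norm(points - p, axis=1)
        neighbors = np.count_nonzero(dists < radius) - 1  # exclude self
        if neighbors >= min_neighbors:
            kept.append(i)
    return points[kept]

# A dense cluster (say, a wall) plus two isolated "snowflakes":
wall = np.random.normal(loc=[10.0, 0.0, 1.0], scale=0.05, size=(50, 3))
snow = np.array([[4.0, 2.0, 1.5], [6.0, -3.0, 2.0]])
print(len(filter_sparse_returns(np.vstack([wall, snow]))))  # ~50
```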

The cars don’t rely solely on LiDAR. They also use cameras and radar, and blend the data from all three sensor types in a process known as sensor fusion. This “fused” approach compensates for the shortcomings of any particular sensor technology, allowing the car to interpret its environment with greater certainty. (To learn more about sensor fusion for autonomous cars, check out this recent EE Times Automotive article from Hannes Estl of TI.)
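
To see why fusing sensors yields greater certainty, consider a toy example: two independent, noisy range estimates of the same object can be combined with inverse-variance weighting, and the fused estimate is always more certain than either input. This sketch only illustrates the principle; real fusion stacks use Kalman filters, tracking, and data association:

```python
# Toy illustration of sensor fusion: combine independent range
# estimates, weighting each by its confidence (inverse variance).

def fuse(estimates):
    """Inverse-variance weighted average of (value, variance) pairs."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    variance = 1.0 / sum(weights)  # always smaller than any single input
    return value, variance

radar = (42.3, 0.25)  # range in meters, variance: radar is precise
camera = (44.1, 4.0)  # camera range is coarser but confirms the object
dist, var = fuse([radar, camera])
print(f"fused range: {dist:.1f} m (variance {var:.2f})")  # 42.4 m (0.24)
```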

Ford claims to be the first automaker to demonstrate robot cars driving in the snow. But it certainly won’t be the last. To gain worldwide acceptance, robot cars will have to prove themselves on winter roads, so we are sure to see more innovation on this (cold) front. ;-)

In the meantime, dim the lights and watch this short video of Ford’s “snowtonomy” technology:



Did you know? In January, QNX announced a new software platform for ADAS and automated driving systems, including sensor fusion solutions that combine data from multiple sources such as cameras and radar processors. Learn more about the platform here and here.

Tuesday, March 15, 2016

Goodbye analog, hello digital

Since 2008, QNX has explored how digital instrument clusters will change the driving experience.

Paul Leroux
Quick: What do the Alfa Romeo 4C, Audi TT, Audi Q7, Corvette Stingray, Jaguar XJ, Land Rover Range Rover, and Mercedes S Class Coupe have in common?

Answer: They would all look awesome in my driveway! But seriously, they all have digital instrument clusters powered by the QNX Neutrino OS.

QNX Software Systems has established a massive beachhead in automotive infotainment and telematics, with deployments in over 60 million cars. But it’s also moving into other growth areas of the car, including advanced driver assistance systems (ADAS), multi-function displays, and, of course, digital instrument clusters.

Retrofitting the QNX reference vehicle with a new digital cluster.
The term “digital cluster” means different things to different people. To boomers like myself, it can conjure up memories of 1980s dashboards equipped with less-than-sexy segment displays — just the thing if you want your dash to look like a calculator. Thankfully, digital clusters have come a long way. Take, for example, the slick, high-resolution cluster in the Audi TT. Designed to display everything directly in front of the driver, this QNX-powered system integrates navigation and infotainment information with traditional cluster readouts, such as speed and RPM. It’s so advanced that the folks at Audi don’t even call it a cluster — they call it the virtual cockpit instead.

Now here’s the thing: digital clusters require higher-end CPUs and more software than their analog predecessors, not to mention large LCD panels. So why are automakers adopting them? Several reasons come to mind:

  • Reusable — With a digital cluster, automakers can deploy the same hardware across multiple vehicle lines simply by reskinning the graphics.
  • Simple — Digital clusters can help reduce driver distraction by displaying only the information that the driver currently requires.
  • Scalable — Automakers can add functionality to a digital cluster by changing the software only; they don’t have to incur the cost of machining or adding new physical components.
  • Attractive — A digital instrument cluster can enhance the appeal of a vehicle with eye-catching graphics and features.
     
In addition to these benefits, the costs of high-resolution LCD panels and the CPUs needed to drive them are dropping, making digital instrument clusters an increasingly affordable alternative.

2008: The first QNX cluster
It’s no coincidence that so many automakers are using the QNX Neutrino OS in their digital clusters. For years now, QNX Software Systems has been exploring how digital clusters can enhance the driving experience and developing technologies to address the requirements of cluster developers.

Let’s start with the very first digital cluster that the QNX team created, a proof-of-concept that debuted in 2008. Despite its vintage, this cluster has several things in common with our more recent clusters — note, for example, the integrated turn-by-turn navigation instructions:



For 2008, this was pretty cool. But as an early proof-of-concept, it lacked some niceties, such as visual cues that could suggest which information is, or isn’t, currently important. For instance, in this screenshot, the gauges for fuel level, engine temperature, and oil pressure all indicate normal operation, so they don’t need to be so prominent. They could, instead, be shrunk or dimmed until they need to alert the driver to a critical change — and indeed, we explored such ideas soon after we created the original design. As you’ll see, the ability to prioritize information for the driver becomes quite sophisticated in subsequent generations of our concept clusters.
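
To make the idea concrete, here is a hypothetical sketch of such a prioritization scheme: each gauge declares its normal operating band, and the renderer dims anything that is behaving itself. The gauge names and thresholds are invented for illustration:

```python
# Hypothetical sketch of the "dim what's normal" idea. Each gauge
# declares a normal band; the renderer decides how prominently to
# draw it. Gauge names and thresholds are invented for illustration.

GAUGES = {
    "fuel_level":   {"value": 0.62, "low": 0.15, "high": 1.00},
    "engine_temp":  {"value": 0.93, "low": 0.20, "high": 0.90},  # running hot
    "oil_pressure": {"value": 0.55, "low": 0.30, "high": 0.85},
}

def prominence(gauge):
    """Full size when abnormal, shrunk/dimmed when all is well."""
    normal = gauge["low"] <= gauge["value"] <= gauge["high"]
    return 0.4 if normal else 1.0

for name, g in GAUGES.items():
    print(f"{name}: draw at {prominence(g):.0%}")
# fuel_level and oil_pressure render at 40%; engine_temp at 100%
```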

Did you know? To create this 2008 cluster, QNX engineers used Adobe Flash Lite 3 and OpenGL ES.

2010: Concept cluster in a Chevrolet Corvette
Next up is the digital cluster in the first QNX technology concept car, based on a Chevrolet Corvette. If the cluster design looks familiar, it should: it’s modeled after the analog cluster that shipped in the 2010-era ‘Vettes. It’s a great example of how a digital instrument cluster can deliver state-of-the-art features, yet still honor the look-and-feel of an established brand. For example, here is the cluster in “standard” mode, showing a tachometer, just as it would in a stock Corvette:



And here it is again, but with something that you definitely wouldn’t find in a 2010 Corvette cluster — an integrated navigation app:



Did you know? The Corvette is the only QNX technology concept car that I ever got to drive.

2013: Concept cluster in a Bentley Continental GT
Next up is the digital cluster for the 2013 QNX technology concept car, based on a Bentley Continental GT. This cluster took the philosophy embodied in the Corvette cluster — honor the brand, but deliver forward-looking features — to the next level.

Are you familiar with the term trompe-l’œil? It’s a French expression that means “deceive the eye,” and it refers to art techniques that make two-dimensional objects appear three-dimensional. It’s a perfect description of the gorgeously realistic virtual gauges we created for the Bentley cluster:



Because it was digital, this cluster could morph itself on the fly. For instance, if you put the Bentley in Drive, the cluster would display a tach, gas gauge, temperature gauge, and turn-by-turn directions — the cluster pulled these directions from the head unit’s navigation system. And if you threw the car into Reverse, the cluster would display a video feed from the car’s backup camera. The cluster also had other tricks up its digital sleeve, such as displaying information from the car’s media player.
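
Under the hood, this kind of morphing amounts to a state machine keyed off the gear selector. Here is a deliberately simplified sketch; the widget names are invented, and a real cluster would composite live graphics layers rather than print strings:

```python
# A deliberately simplified sketch of gear-driven morphing. The widget
# names are invented; a real cluster composites live graphics layers
# (gauges, navigation, camera video) rather than printing strings.

LAYOUTS = {
    "drive":   ["tachometer", "fuel_gauge", "temp_gauge", "turn_by_turn"],
    "reverse": ["backup_camera_feed"],
    "park":    ["media_player", "trip_summary"],
}

def on_gear_change(gear):
    widgets = LAYOUTS.get(gear, LAYOUTS["drive"])
    print(f"gear={gear}: showing {', '.join(widgets)}")

on_gear_change("drive")    # gauges plus turn-by-turn directions
on_gear_change("reverse")  # cluster swaps to the backup camera
```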

Did you know? The Bentley came equipped with a 616 hp W12 engine that could do 0-60 mph in a little over 4 seconds. Which may explain why they never let me drive it.

2014: Concept cluster in a Mercedes CLA45 AMG
Plymouth safety speedometer, c. 1939
Up next is the 2014 QNX technology concept car, based on a Mercedes CLA45 AMG. But before we look at its cluster, let me tell you about the Plymouth safety speedometer. Designed to curb speeding, it alerted the driver whenever he or she leaned too hard on the gas.

But here’s the thing: the speedometer made its debut in 1939. And given the limitations of 1939 technology, the speedometer couldn’t take driving conditions or the local speed limit into account. So it always displayed the same warnings at the same speeds, no matter what the speed limit.

Connectivity to the rescue! Some modern navigation systems include information on local speed limits. By connecting the CLA45’s concept cluster to the navigation system in the car’s head unit, the QNX team was able to pull this information and display it in real time on the cluster, creating a modern equivalent of Plymouth's 1939 invention.
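
The decision logic itself is simple once the map data is available. Here is a hedged sketch; the tolerance and severity scaling are invented for illustration:

```python
# Hedged sketch of the modern "safety speedometer": unlike the 1939
# Plymouth, the threshold comes from the navigation system's map data.
# The tolerance and severity scaling are invented for illustration.

def speed_alert(current_mph, limit_mph, tolerance_mph=2):
    """Return an alert the cluster can render (e.g., a red ring), or None."""
    over = current_mph - limit_mph
    if over <= tolerance_mph:
        return None
    return {"over_by": over, "severity": min(over / 20.0, 1.0)}

limit = 40  # mph, pulled from the head unit for the current road segment
print(speed_alert(52, limit))  # {'over_by': 12, 'severity': 0.6}
print(speed_alert(41, limit))  # None: within tolerance, nothing drawn
```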

Look at the image below. You’ll see the local speed limit surrounded by a red circle, alerting the driver that they are breaking the limit. The cluster could also pull other information from the head unit, including turn-by-turn directions, trip information, album art, and other content normally relegated to the center display:



Did you know? Our Mercedes concept car is still alive and well in Germany, and recently made an appearance at the Embedded World conference in Nuremberg.

2015: Concept cluster in a Maserati Quattroporte
Up next is the 2015 QNX technology concept car, based on a Maserati Quattroporte GTS. Like the cluster in the Mercedes, this concept cluster provided speed alerts. But it could also recommend an appropriate speed for upcoming curves and warn of obstacles on the road ahead. It even provided intelligent parking assist to help you back into tight spaces.

Here is the cluster displaying a speed alert:



And here it is again, using input from a LiDAR system to issue a forward collision warning:



Did you know? Engadget selected the “digital mirrors” we created for the Maserati as a finalist for the Best of CES Awards 2015.

2015 and 2016: Concept clusters in QNX reference vehicle
The QNX reference vehicle, based on a Jeep Wrangler, is our go-to vehicle for showcasing the latest capabilities of the QNX CAR Platform for Infotainment. But it also does double-duty as a technology concept vehicle. In early 2015, for instance, we equipped the Jeep with a concept cluster that provides lane departure warnings, collision detection, and curve speed warnings. In this image, the cluster is recommending that you reduce speed to safely navigate an upcoming curve:



Just in time for CES 2016, the Jeep cluster got another makeover that added crosswalk notifications to the mix:



Did you know? Jeep recently unveiled the Trailcat, a concept Wrangler outfitted with a 707 hp Dodge Hellcat engine.

2016: Glass cockpit in a Toyota Highlander
By now, you can see how advances in sensors, navigation databases, and other technologies enable us to integrate more information into a digital instrument cluster, all to keep the driver aware of important events in and around the vehicle. In our 2016 technology concept vehicle, we took the next step and explored what would happen if we did away with an infotainment system altogether and integrated everything — speed, RPM, ADAS alerts, 3D navigation, media control and playback, incoming phone calls, etc. — into a single cluster display.

On the one hand, this approach presented a challenge, because, well… we would be integrating everything into a single display! Things could get busy, fast. On the other hand, it puts everything of importance directly in front of the driver, where it is easiest to see. No more glancing over at a centrally mounted head unit.

Simplicity was the watchword. We had to keep distraction to a minimum, and to do that, we focused on two principles: 1) display only the information that the driver currently requires; and 2) use natural language processing as the primary way to control the user interface. That way, drivers can access infotainment content while keeping their hands on the wheel and eyes on the road.

For instance, in the following scenario, the cockpit allows the driver to see several pieces of important information at a glance: a forward-collision warning, an alert that the car is exceeding the local speed limit by 12 mph, and map data with turn-by-turn navigation:



This design also aims to minimize the mental translation, or cognitive processing, needed on the part of the driver. For instance, if you exceed the speed limit, the cluster doesn’t simply show your current speed. It also displays a red line (visible immediately below the 52 mph readout) that gives you an immediately recognizable hint that you are going too fast. The more you exceed the limit, the thicker the red line grows.
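
That cue boils down to a simple mapping from overspeed to line thickness. Here is a sketch, with invented pixel values:

```python
# Sketch of the red-line cue: thickness scales with how far the driver
# is over the limit. The pixel values are invented for illustration.

def redline_thickness(speed_mph, limit_mph, px_per_mph=0.5, max_px=12):
    over = max(0, speed_mph - limit_mph)
    return min(over * px_per_mph, max_px)

print(redline_thickness(52, 40))  # 6.0 px for 12 mph over the limit
print(redline_thickness(38, 40))  # 0: at or under the limit, no line
```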

The 26262 connection
Today’s digital instrument clusters require hardware and software solutions that can support rich graphics and high-level application environments while also displaying critical information (e.g. engine warning lights, ABS indicators) in a fast and highly reliable fashion. The need to isolate critical from non-critical software functions in the same environment is driving the requirement for ISO 26262 certification of digital clusters.

QNX OS technology, including the QNX OS for Safety, is ideally suited for environments where a combination of infotainment, advanced driver assistance system (ADAS), and safety-related information are displayed. Building a cluster with the ISO 26262 ASIL-D certified QNX OS for Safety can make it simpler to keep software functions isolated from each other and less expensive to certify the end cluster product.

The partner connection
Partnerships are also important. If you had the opportunity to drop by our booth at CES 2016, you would have seen a “cluster innovation wall” showcasing QNX OS technology integrated with user interface design tools from the industry’s leading cluster software providers, including 3D Incorporated’s REMO HMI Runtime, Crank Software’s Storyboard Suite, DiSTI Corporation’s GL Studio, Elektrobit’s EB GUIDE, HI Corporation’s exbeans UI Conductor, and Rightware’s Kanzi UI software. This pre-integration with a rich choice of partner tools enables our customers to choose the user interface technologies and design approaches that best address their instrument cluster requirements.

For some partner insights on digital cluster design, check out these posts:

Tuesday, February 23, 2016

QNX OS for Safety named best software product at Embedded World

“Winning takes talent, to repeat takes character” — legendary basketball coach John Wooden

Patryk Fournier
Earlier today, at Embedded World 2016, QNX won an embedded AWARD for its QNX OS for Safety, an operating system designed for safety-critical applications in the automotive, rail transportation, healthcare, and industrial automation markets. The OS was named best product in the software category.

This award win is a testament to the commitment and integrity that drives QNX to continuously release world-class products. In fact, this marks the fourth time that QNX Software Systems has won an embedded AWARD. In 2014, it took top honors for QNX Acoustics for Active Noise Control (ANC), a software library that cancels out distracting engine noise in cars while eliminating the dedicated hardware required by conventional ANC solutions. The company also won in 2006 for its multicore-enabled operating system and development tools, and in 2004 for power management technology.

The QNX OS for Safety is built on a highly reliable software architecture proven in nuclear power plants, train control systems, laser eye-surgery devices, and a variety of other safety-critical environments. It was created to meet the rigorous IEC 61508 functional safety standard as well as industry-specific standards based on IEC 61508. These include ISO 26262 for passenger vehicles, EN 50128 for railway applications, IEC 62304 for medical devices, and IEC 61511 for factory automation, process control, and robotics.

Hats off to the many talented QNX staffers responsible for developing, certifying, promoting, and selling the QNX OS for Safety!

The media scrum at today's award ceremony.

Thursday, January 7, 2016

In the zone — a visit to the QNX concept garage

Guest post by QNX consultant and software designer Rob Krten.

How often have you heard the expression, “If it were easy to do, everyone would do it”? I’m constantly amazed at the things that QNX does with their concept cars. To me, a car is an inviolate object that must be touched only by the dealer (well, ok, I do top up the windshield wiper fluid and I once changed a battery). I don’t say that because I necessarily like to give the dealer money, but I just don’t want to break anything that’ll cost me more to get fixed properly later.

Pushing the envelope, however, means getting right in there and doing stuff. QNX engineers have done this for their technology concept cars — from replacing the mirrors with LCD screens, to getting right into the dash and rebuilding it, to adding cameras into the antenna fin on the roof. It’s nothing for them to rip out the center console and then look at all the wiring and go, “Huh, ok — so we need to lengthen this wire, add a shim here, move this piece,” and so on. They are fearless.

Redoing the dash of the QNX reference vehicle.
Sometimes the “getting right in there” is physical; other times, it’s software based — such as making a new application that lives in the infotainment stack or that interfaces with a smartphone. Like a “Dude, where’s my car?” feature — when your Bluetooth phone unpairs with your car, the phone records the current GPS position. Later, when you’re looking for your car, your phone can recall this last stored GPS position — this must be where you left your car. Or even simple aids, such as a radio tuner that detects when you are losing an AM/FM signal and automatically switches to the corresponding digital station, so you can continue listening to your favorite station anywhere you drive.
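
For fun, here is roughly what that “Dude, where’s my car?” logic could look like. The event names and GPS interface are invented stand-ins for whatever a real phone platform provides:

```python
# A hedged sketch of the "Dude, where's my car?" idea: when the phone's
# Bluetooth link to the car drops, remember the phone's current GPS fix.
# The event names and GPS interface are invented stand-ins.

last_parked = None  # (latitude, longitude)

def on_bluetooth_event(event, current_gps_fix):
    global last_parked
    if event == "car_unpaired":
        # The car just went out of range: we almost certainly parked here.
        last_parked = current_gps_fix

def where_is_my_car():
    return last_parked  # hand this off to a map application

on_bluetooth_event("car_unpaired", (45.3468, -75.9172))  # made-up coordinates
print(where_is_my_car())
```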

Curious to see what the future holds, and to actually see some of this work in action, I invited myself down to the “garage” at QNX headquarters. It’s at the far end of the building, next to the cafeteria. The hallway is festooned with posters of previous QNX concept vehicles, highlighting success stories in 3-foot-high glory.

The day I visited, there were half a dozen people in the garage, and two vehicles: a Jeep and a Highlander (otherwise known as the QNX reference vehicle and QNX technology concept vehicle). The garage is a combination of software development lab, hardware development lab, simulation environment, and actual garage (but without the greasy/oily smell). I wanted to get a sense of what drives these people, what they do, and how they do it.

Digital analogs
No, not that kind of digital display. Credit: Peter Halasz
The first thing I learned was that there are no real limits. They have the freedom to innovate, without preconceived notions about how things should look. For example, a lead designer on the team (let’s call him Allan, because that’s his name), explained how they look at the controls in the car’s dash display area. In the era of analog, the speedometer had a certain look — it was usually a needle rotating about a central point, where the needle pointed to the speed you were going. In the very early era of digitization, car manufacturers changed this needle to a seven-segment numerical display.

Of course, this was a failure, because the human brain is basically analog; it likes to see nice, continuous changes for processes that are continuous — such as the speed that you’re going. Seven-segment digits change too “randomly”; they require higher-level cognitive functions to parse what the individual lights mean and convert that into digits, and then convert that into a “speed” (and then convert that into “too slow,” or “just right,” or “too fast,” and then, finally, convert that into “apply brake” or “press down on throttle”).

Allan pointed out that changing to a digital display didn’t necessarily mean that they had to slavishly follow the analog “physical” appearance (except do it on an LCD display), but that they were free to experiment with “fill concepts” — digitally controlled analogs to the actual controls. We likened it to the displays in military avionics, where the most important information becomes bigger as it increases in importance. Consider a fighter jet at 20,000 feet — the altitude isn’t nearly as important as it is at 300 feet. Therefore, at 20,000 feet, the part showing the altitude is small, and in a less prominent position than it is when the plane is at 300 feet. The same thing with your speedometer: if you’re doing the speed limit, it’s not as important to show your current speed (you’re most likely flowing with traffic) as it is when you’re 20 over (or under).

In this image from the new QNX technology concept vehicle, the digital instrument cluster is warning that a forward collision is imminent, and that the driver is exceeding the speed limit by 12 mph.

You could do the same thing with your fuel range — when you have a full tank, the indicator can be off in a corner somewhere. But as you start to run low, the indicator can get bigger or more prominent, to start nagging you to refuel. By having the displays all be “virtual” on a large LCD screen in the dash, the designers have incredible flexibility to create systems that present relevant information when required, and have it move out of the way when something more important comes along. (Come to think of it, this would be an awesome feature to have on turn-signal indicators — after you’ve kept your blinker on for more than 10 seconds, it would start to get bigger and brighter. Maybe then people would stop driving with their turn indicator permanently on.)

Collision avoided: The V2X command center
Also in the lab was a huge (3 by 5 foot) flat-panel touchscreen, mounted at an angle that’s aggressively unfriendly to coffee cups (probably for that very reason). It’s reminiscent of Star Trek’s main transporter control station, but it’s used to control and display the simulation environment’s V2V (vehicle to vehicle) and V2I (vehicle to infrastructure) data. It acts as a command center to control and reveal the innards of what’s going on in the simulation environment:



When I was there, we ran a vehicle collision avoidance scenario. Two vehicles (the Jeep and the Highlander, of course — they’re tied in to the system) were heading on a collision course (one was southbound and one was eastbound in a grid-style road system). Because they have V2V capabilities, both cars were aware of their impending doom. This showed up nicely on the V2V command center control panel — two cars heading towards each other, little red circles emanating from them indicating the realtime V2V “pings.” Of course, in plenty of time, the Jeep slowed down to avoid the collision (the actual brake lights even went on!). The speed, GPS coordinates, direction, and even what gear each vehicle was in were all shown on the master console. Towards the end of my visit I almost had Allan convinced to do another master control console for the OBDII connector so you could interact with all of the information in each car. What can I say? I like front panels. (I’m a reformed PDP-8 collector.)
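
The math behind that impending-doom detection is worth a quick sketch: each car broadcasts its position and velocity, and the receiver projects both tracks forward to find the minimum predicted separation. Real V2V stacks exchange far richer messages; this toy model only shows the principle:

```python
# Toy model of the V2V collision check: each car broadcasts position and
# velocity; the receiver projects both tracks forward and reacts if the
# minimum predicted separation is too small. Real V2V messages carry
# much more state than this.
import numpy as np

def min_separation(p1, v1, p2, v2, horizon_s=10.0, step_s=0.1):
    """Minimum predicted distance between two constant-velocity tracks."""
    times = np.arange(0.0, horizon_s, step_s)
    path1 = p1 + np.outer(times, v1)
    path2 = p2 + np.outer(times, v2)
    gaps = np.linalg.norm(path1 - path2, axis=1)
    i = int(np.argmin(gaps))
    return gaps[i], times[i]

# Jeep heading east, Highlander heading south, converging on a crossing:
gap, t = min_separation(np.array([-80.0, 0.0]), np.array([15.0, 0.0]),
                        np.array([0.0, 75.0]), np.array([0.0, -14.0]))
if gap < 5.0:
    print(f"collision risk in {t:.1f} s (gap {gap:.1f} m): brake")
```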

The V2X command center, which makes its debut this week at CES, provides a bird’s eye view of several V2X traffic scenarios. In this example, V2X allows a vehicle (the Jeep) to detect that a vehicle up ahead (the Highlander) has braked suddenly, giving the Jeep plenty of time to slow down.

The engineers in the concept garage are “in the zone.” They’re working in an environment that encourages innovation. Watch and see what they produce:




About Rob
Rob is president of Iron Krten Consulting, which provides technical leadership services ranging from software leadership consulting to security and embedded software products, development, training, and contract services. Rob is also engaged by QNX Software Systems to write marketing and technical documentation. Visit Rob's website.

Wednesday, January 6, 2016

The simpler, the better: a first look at the new QNX technology concept vehicle

Bringing the KISS principle to the dashboard.

Paul Leroux
“From sensors to smartphones, the car is experiencing a massive influx of new technologies, and automakers must blend these in a way that is simple, helpful, and non-distracting.” That statement comes from a press release we issued a year ago, but it’s as true today as it was then — if not more so. The fact is, the car is undergoing a massive transformation as it becomes more connected and more automated. And with that transformation comes higher volumes of data and greater system complexity.

But here’s the thing. From the driver’s perspective, this complexity doesn’t matter, nor should it matter. In fact, it can’t matter. Because the driver needs to stay focused on the most important thing: driving. (At least until fully automated driving becomes reality, at which point a nap might be in order!) Consequently, it’s the job of automakers and their suppliers to harness all these technologies in a simple, intuitive way that makes driving easier, safer, and more enjoyable. Specifically, they need to provide the driver with relevant, contextually sensitive information that is easy to consume, without causing distraction.

That is the challenge that the new QNX technology concept vehicle, based on a Toyota Highlander, sets out to explore.

So what are we waiting for? Let’s take a look! (And remember, you can click on any image to magnify it.)

The oh-so-glossy exterior
As with any QNX technology concept vehicle, it’s what’s inside that counts. But to signal that this is no ordinary Highlander, we gave the exterior a luxurious, brushed-metal finish that just screams to have its picture taken. So we obliged:



The integrated display that keeps you focused
When modifying the Highlander, simplicity was the watchword. So instead of equipping the vehicle with both a digital instrument cluster and a head unit, we created a “glass cockpit” that combines the functions of both systems, along with ADAS safety alerts, into one seamless display. Everything is presented directly in front of the driver, where it is easiest to see.

For instance, in the following scenario, the cockpit allows the driver to see several pieces of important information at a glance: a forward-collision warning, an alert that the car is exceeding the local speed limit by 12 mph, and turn-by-turn navigation:



Mind you, the cockpit can display much more information than you see here, including a tachometer, album art, incoming phone calls, and the current radio station. But to keep distraction to a minimum, it displays only the information that the driver currently requires, and no more. Because simplicity.

To further minimize distraction, the cockpit uses voice as the primary way to control the user interface, including control of media, navigation, and phone connectivity. As a result, drivers can access infotainment content while keeping their hands on the wheel and eyes on the road.

Thoughtful touches abound. For instance, the HERE Auto navigation software running in the cockpit interfaces with a HERE Auto Companion App running on a BlackBerry PRIV smartphone. So when the driver steps into the vehicle, navigation route information from the smartphone is transferred automatically to the vehicle, providing a continuous user experience. How cool is that?

Here’s a slightly different view of the cockpit, showing how it can display a photo of your destination — just the thing when you are driving to a location for the first time and would like visual confirmation of what it looks like:



Before I forget, here are some additional tech specs: the cockpit is built on the QNX CAR Platform for Infotainment, uses an interface based on Qt 5.5, integrates iHeartRadio, and runs on a Renesas R-Car H2 system-on-chip.

The acoustics feature that keeps you from shouting
The glass cockpit does a great job of keeping your eyes focused straight ahead. But what’s the use of that if, as a driver, you have to turn your head every time you want to speak to someone in the back seat? If you’ve ever struggled to hold a conversation in a car at highway speeds, especially in a larger vehicle, you know what I’m talking about.

QNX acoustics to the rescue! Earlier today, QNX Software Systems announced the QNX Acoustics Management Platform, a new solution that replaces the traditional piecemeal approach to in-car acoustics with a holistic model that enables faster time-to-production and lower system costs. The platform comes with several innovative features, including QNX In-Car Communication (ICC) technology, which enhances the voice of the driver and relays it to infotainment loudspeakers in the rear of the car.

Long story short: instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX ICC dynamically adapts to noise conditions and adds enhancement only when needed. Better yet, it allows automakers to leverage their existing handsfree telephony microphones and infotainment loudspeakers.
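
The announcement doesn’t spell out how ICC adapts to noise, but the general shape of a noise-adaptive gain is easy to imagine: add reinforcement only above some noise floor, and cap it for comfort. Everything in this sketch, curve and numbers alike, is invented for illustration:

```python
# Invented sketch of a noise-adaptive gain, in the spirit of ICC:
# reinforce the driver's voice only above a noise floor, capped for
# comfort. The curve and numbers are illustrative, not the QNX algorithm.

def icc_gain_db(cabin_noise_db, floor_db=55.0, slope=0.5, max_gain_db=12.0):
    """More road noise means more reinforcement, up to a ceiling."""
    excess = max(0.0, cabin_noise_db - floor_db)
    return min(excess * slope, max_gain_db)

print(icc_gain_db(50))  # 0.0 dB: quiet cabin, no enhancement added
print(icc_gain_db(72))  # 8.5 dB: highway noise, voice reinforced
```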



The reference vehicle that keeps evolving
Before you go, I also want to share some updates to the QNX reference vehicle, which is based on a Jeep Wrangler. Like the Highlander, the Jeep got a slick new exterior for CES 2016:



Since 2012, the Jeep has been our go-to vehicle for showcasing the latest capabilities of the QNX CAR Platform for Infotainment. But for over a year now, it has done double-duty as a concept vehicle, showing how QNX technology can help developers build next-generation instrument clusters and ADAS solutions.

Take, for example, the Jeep’s new instrument cluster, which makes its debut this week at CES. In addition to providing all the information that you’d expect, such as speed and RPM, it displays crosswalk notifications, forward collision warnings, speed limit warnings, and turn-by-turn navigation:



The QNX reference vehicle also includes a full-featured head unit that demonstrates the latest out-of-the-box capabilities of the QNX CAR Platform for Infotainment. For example, in this image, the head unit is displaying HERE Auto navigation:



Other features of the platform include:
  • A voice interface that uses natural language processing, making it easy to launch applications, play music, select radio stations, control volume, use the navigation system, and perform a variety of other tasks.
  • A new, easy-to-navigate UI based on Qt 5.5 that supports a variety of touch gestures, including tap, swipe, pinch, and zoom.
  • QNX acoustics technology that enables clear, easy-to-understand hands-free calls through advanced echo cancellation and noise reduction.
  • Cellular connectivity provided by the QNX Wireless Framework, which simplifies system design by managing the complexities of modem control on behalf of applications.
  • Flexible support for a variety of smartphone integration protocols.

Additional tech specs: The Jeep’s cluster runs on a Qualcomm Snapdragon 602A processor and its user interface was designed by our partner Rightware, using the Rightware Kanzi tool. The head unit, meanwhile, runs on an Intel Atom E3827 processor.

ADAS, augmented reality, V2X, IoT, and more
I have only scratched the surface of what BlackBerry and QNX Software Systems are demonstrating this week at CES 2016. There’s much more to see and experience, including a very cool V2X demonstration, IoT solutions for the automotive and transportation industries, as well as ADAS and augmented reality systems that integrate with the digital clusters described in this post. To learn more, read the press release that QNX issued today and stay tuned to this channel.


QNX announces new platforms for automated driving systems and in-car acoustics

Paul Leroux
Every year, at CES, QNX Software Systems showcases its immense range of solutions for infotainment systems, digital instrument clusters, telematics systems, advanced driving assistance systems (ADAS), and in-car acoustics. This year is no different. Well, actually… let me take that back. Because this year, we are also announcing two new and very important software platforms: one that can speed the development of automated driving systems, and one that can transform how acoustics applications are implemented in the car.

QNX Platform for ADAS
The automotive industry is at an inflection point, with autonomous and semiautonomous vehicles moving from theory to reality. The new QNX Platform for ADAS is designed to help drive this industry transformation. Based on our deep automotive experience and 30-year history in safety-critical systems, the platform can help automotive companies reduce the time and effort of building a full range of ADAS and automated driving applications:
  • from informational ADAS systems that provide a multi-camera, 360° surround view of the vehicle…
  • to sensor fusion systems that combine data from multiple sources such as cameras and radar…
  • to advanced high-performance systems that make control decisions in fully autonomous vehicles



Highlights of the platform include:
  • The QNX OS for Safety, a highly reliable OS pre-certified at all of the automotive safety integrity levels needed for automated driving systems.
  • An OS architecture that can simplify the integration of new sensor technologies and purpose-built ADAS processors.
  • Frameworks and reference implementations to speed the development of multi-camera vision systems and V2X applications (vehicle-to-vehicle and vehicle-to-infrastructure communications).
  • Pre-integrated partner technologies, including systems-on-chip (SoCs), vision algorithms, and V2X modules, to enable faster time-to-market for customers.

This week, at CES 2016, QNX will present several ADAS and V2X demonstrations, including:
  • Demos that show how QNX-based ADAS systems can perform realtime analysis of complex traffic scenarios to enhance driver awareness or enable various levels of automated driving.
  • QNX-based V2X technology that allows cars to “talk” to each other and to traffic infrastructure (e.g. traffic lights) to prevent collisions and improve traffic flow.

To learn more, check out the ADAS platform press release, as well as the press release that provides a full overview of our many CES demos — including, of course, the latest QNX technology concept vehicle!

QNX Acoustics Management Platform
It’s a lesser-known fact, but QNX is a leader in automotive acoustics — its software for handsfree voice communications has shipped in over 40 million automotive systems worldwide. This week, QNX is demonstrating once again why it is a leader in this space, with a new, holistic approach to managing acoustics in the car, the QNX Acoustics Management Platform (AMP):

  • Enables automakers to enhance the audio and acoustic experience for drivers and passengers, while reducing system costs and complexity.
  • Replaces the traditional piecemeal approach to in-car acoustics with a unified model: automakers can now manage all aspects of in-car acoustics efficiently and holistically, for easier integration and tuning, and for faster time-to-production.
  • Reduces hardware costs with a new, low-latency audio architecture that eliminates the need for dedicated digital signal processors or specialized external hardware.
  • Integrates a full suite of acoustics modules, including QNX Acoustics for Voice (for handsfree systems), QNX Acoustics for Engine Sound Enhancement, and the brand new QNX In-Car Communication (ICC).

For anyone who has struggled to hold a conversation in a car at highway speeds, QNX ICC enhances the voice of the driver and relays it to loudspeakers in the back of the vehicle. Instead of shouting or having to turn around to be heard, the driver can talk normally while keeping his or her eyes on the road. QNX will demonstrate ICC this week at CES, in its latest technology concept car, based on a Toyota Highlander.

Read the press release to learn more about QNX AMP.



Monday, November 9, 2015

Bringing a bird’s eye view to a car near you

QNX and TI team up to enable surround-view systems in mass-volume vehicles

Paul Leroux
Uh-oh. You are 10 minutes late for your appointment and can’t find a place to park. At long last, a space opens up, but sure enough, it’s the parking spot from hell: cramped, hard to access, with almost no room to maneuver.

Fortunately, you’ve got this covered. You push a button on your steering wheel, and out pops a camera drone from the car’s trunk. The drone rises a few feet and begins to transmit a bird’s eye view of your car to the dashboard display — you can now see at a glance whether you are about to bump into curbs, cars, concrete barriers, or anything else standing between you and parking nirvana. Seconds later, you have backed perfectly into the spot and are off to your meeting.

Okay, that’s the fantasy. In reality, cars with dedicated camera drones will be a long time coming. In the meantime, we have something just as good and a lot more practicable — an ADAS application called surround view.

Getting aligned
Approaching an old problem from a new perspective. Credit: TI
Surround-view systems typically use four to six fisheye cameras installed at the front, back, and sides of the vehicle. Together, these cameras capture a complete view of the area around your car, but there’s a catch: the video frames they generate are highly distorted. So, to start, the surround-view system performs geometric alignment of every frame. Which is to say, it irons all the curves out.
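
Assuming each camera has been calibrated offline (an intrinsic matrix and fisheye distortion coefficients), the ironing-out step looks roughly like this sketch, which leans on OpenCV’s fisheye model. The calibration numbers below are placeholders, not real camera parameters:

```python
# Sketch of the geometric-alignment step, assuming each fisheye camera
# was calibrated offline (intrinsic matrix K, distortion coefficients D).
# The calibration numbers are placeholders, not real camera parameters.
import cv2
import numpy as np

K = np.array([[320.0,   0.0, 640.0],   # focal lengths, principal point
              [  0.0, 320.0, 360.0],
              [  0.0,   0.0,   1.0]])
D = np.array([0.08, -0.02, 0.001, 0.0])  # fisheye distortion coefficients

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # stand-in camera frame
undistorted = cv2.fisheye.undistortImage(frame, K, D, Knew=K)
# Straight lines are now straight again, ready for stitching.
```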

Next, the system stitches the corrected video frames into a single bird’s eye view. Mind you, this step isn’t simply a matter of aligning pixels from several overlapping frames. Because each camera points in a different direction, each will generate video with unique color balance and brightness levels. Consequently, the system must perform photometric alignment of the image. In other words, it corrects these mismatches to make the resulting output look as if it were taken by a single camera hovering over the vehicle.
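
Photometric alignment can be sketched as solving for per-camera gains that make overlapping regions agree. Production systems solve this jointly across all cameras, usually per color channel; this pairwise version only shows the idea:

```python
# Sketch of photometric alignment: estimate a gain so the overlapping
# regions of two adjacent views agree in brightness. Real systems solve
# for all cameras jointly, usually per color channel.
import numpy as np

def match_gain(overlap_a, overlap_b):
    """Gain to apply to camera B so its overlap matches camera A's."""
    return float(np.mean(overlap_a)) / float(np.mean(overlap_b))

# The same patch of road as seen by the front and right cameras:
patch_front = np.full((64, 64), 120.0)  # brighter exposure
patch_right = np.full((64, 64), 96.0)   # darker exposure
gain = match_gain(patch_front, patch_right)
print(f"apply gain {gain:.2f} to the right camera")  # 1.25
corrected = np.clip(patch_right * gain, 0, 255)
```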

Moving down-market
If you think that all this work takes serious compute power, you’re right. The real trick, though, is to make the system affordable so that luxury car owners aren’t the only ones who can benefit from surround view.

Which brings me to QNX Software Systems’ support for TI’s new TDA2Eco system-on-chip (SoC), which is optimized for 3D surround view and park-assist applications. The TDA2Eco integrates a variety of automotive peripherals, including CAN and Gigabit Ethernet AVB, and supports up to eight cameras through parallel, serial and CSI-2 interfaces. To enable 3D viewing, the TDA2Eco includes an image processing accelerator for decoding multiple camera streams, along with graphics accelerators for rendering virtual views.

Naturally, surround view also needs software, which is where the QNX OS for Safety comes in. The OS can play several roles in surround-view systems, such as handling camera input, hosting device drivers for camera panning and control, and rendering the processed video onto the display screen, using QNX Software Systems’ high-performance Screen windowing system. The QNX OS for Safety complies with the ISO 26262 automotive functional safety standard and has a proven history in safety-critical systems, making it ideally suited for collision warning, surround view, and a variety of other ADAS applications.

Okay, enough from me. Let’s look at a video, hosted by TI’s Gaurav Agarwal, to see how the TDAx product line can support surround-view applications:



For more information on the TDAx product line, visit the TI website; for more on the QNX OS for Safety, visit the QNX website.

Tuesday, November 3, 2015

An ADAS glossary for the acronym challenged

If you’ve got ACD, you’ve come to the right place.

Paul Leroux
Someday, in the not-so-distant future, your mechanic will tell you that your CTA sensor has gone MIA. Or that your EDA needs an OTA update. Or that the camera system for your PLD has OSD. And when that day happens, you’ll be glad you stumbled across this post. Because I am about to point you to a useful little glossary that takes the mystery out of ADAS acronyms. (The irony being, of course, that ADAS is itself an acronym.)

Kidding aside, acronyms can stand in the way of clear communication — but only when used at the wrong time and place. Otherwise, they serve as useful shorthand, especially among industry insiders who have better things to do than say “advanced driver assistance system” 100 times a day when they can simply say ADAS instead.

In any case, you can find the glossary here. And when you look at it, you’ll appreciate my ulterior motive for sharing the link — to demonstrate that the ADAS industry is moving apace. The glossary makes it abundantly clear that the industry is working on, or has already developed, a large variety of ADAS systems. The number will only increase, thanks to government calls for vehicle safety standards, technology advances that make ADAS solutions more cost-effective, and growing consumer interest in cars that can avoid crashes. In fact, Visiongain has estimated that the global ADAS market will experience double-digit growth between 2014 and 2024, from a baseline estimate of $18.2 billion.

And in case you’re wondering, ACD stands for acronym challenged disorder. ;-)

Wednesday, October 28, 2015

Five reasons why they should test autonomous cars in Ontario

Did I say five? I meant six…

Paul Leroux
It was late and I needed to get home. So I shut down my laptop, bundled myself in a warm jacket, and headed out to the QNX parking lot. A heavy snow had started to fall, making the roads slippery — but was I worried? Not really. In Ottawa, snow is a fact of life. You learn to live with it, and you learn to drive in it. So I cleared off the car windows, hopped in, and drove off.

Alas, my lack of concern was short-lived. The further I drove, the faster and thicker the snow fell. And then, it really started to come down. Pretty soon, all I could see out my windshield was a scene that looked like this, but with even less detail:



That’s right: a pure, unadulterated whiteout. Was I worried? Nope. But only because I was in a state of absolute terror. Fortunately, I could see the faintest wisp of tire tracks immediately in front of my car, so I followed them, praying that they didn’t lead into a ditch, or worse. (Spoiler alert: I made it home safe and sound.)

Of course, it doesn’t snow every day in Ottawa — or anywhere else in Ontario, for that matter. That said, we can get blanketed with the white stuff any time from October until April. And when we do, the snow can play havoc with highways, railways, airports, and even roofs.

Roofs, you say? One morning, a few years ago, I heard a (very) loud noise coming from the roof of QNX headquarters. When I looked out, this is what I saw — someone cleaning off the roof with a snow blower! So much snow had fallen that the integrity of the roof was being threatened:



When snow like this falls on the road, it can tax the abilities of even the best driver. But what happens when the driver isn’t a person, but the car itself? Good question. Snow and blowing snow can mask lane markers, cover street signs, and block light-detection sensors, making it difficult for an autonomous vehicle to determine where it should go and what it should do. Snow can even trick the vehicle into “seeing” phantom objects.

And it’s not just snow. Off the top of my head, I can think of four other phenomena common to Ontario roads that pose a challenge to human and robot drivers alike: black ice, freezing rain, extreme temperatures, and moose. I am only half joking about the last item: autonomous vehicles must respond appropriately to local fauna, not least when the animal in question weighs half a ton.

To put it simply, Ontario would be a perfect test bed for advancing the state of autonomous technologies. So imagine my delight when I learned that the Ontario government has decided to do something about it.

Starting January 1, Ontario will become the first Canadian province to allow road testing of automated vehicles and related technology. The provincial government is also pledging half a million dollars to the Ontario Centres of Excellence Connected Vehicle/Automated Vehicle Program, in addition to $2.45 million already provided.

The government has also installed some virtual guard rails. For instance, it insists that a trained driver stay behind the wheel at all times. The driver must monitor the operation of the autonomous vehicle and take over control whenever necessary.

Testing autonomous vehicles in Ontario simply makes sense, but not only because of the weather. The province also has a lot of automotive know-how. Chrysler, Ford, General Motors, Honda, and Toyota all have plants here, as do 350 parts suppliers. Moreover, the province has almost 100 companies and institutions involved in connected vehicle and automated vehicle technologies — including, of course, QNX Software Systems and its parent company, BlackBerry.

So next time you’re in Ontario, take a peek at the driver in the car next to you. But don’t be surprised if he or she isn’t holding the steering wheel.


A version of this post originally appeared on the Connected Car Expo blog.

Wednesday, October 14, 2015

What does a decades-old thought experiment have to do with self-driving cars?

Paul Leroux
Last week, I discussed, ever so briefly, some ethical issues raised by autonomous vehicles — including the argument that introducing them too slowly could be considered unethical!

My post included a video link to the trolley problem, a thought experiment that has long served as a tool for exploring how people make ethical decisions. In its original form, the trolley problem is quite simple: You see a trolley racing down a track on which five people are tied up. Next to you is a lever that can divert the trolley to an empty track. But before you can pull the lever, you notice that someone is, in fact, tied up on the second track. Do you do nothing and let all five people die, or do you pull the lever and kill the one person instead?

The trolley problem has undergone criticism for failing to represent real-world problems, for being too artificial. But if you ask Patrick Lin, a Cal Poly professor who has delivered talks to Google and Tesla on the ethics of self-driving cars, it can serve as a helpful teaching tool for automotive engineers — especially if its underlying concept is framed in automotive terms.

Here is how he presents it:

“You’re driving an autonomous car in manual mode—you’re inattentive and suddenly are heading towards five people at a farmer’s market. Your car senses this incoming collision, and has to decide how to react. If the only option is to jerk to the right, and hit one person instead of remaining on its course towards the five, what should it do?”

Of course, autonomous cars, with their better-than-human driving habits (e.g. people tailgate, robot cars don’t) should help prevent such difficult situations from happening in the first place. In the meantime, thinking carefully through this and other scenarios is just one more step on the road to building fully autonomous, and eventually driverless, cars.

Read more about the trolley problem and its application to autonomous cars in a recent article in The Atlantic.

Speaking of robot cars, if you missed last week's webinar on the role of software when transitioning from ADAS to autonomous driving, don't sweat it. It's now available on demand at Techonline.

Wednesday, October 7, 2015

The ethics of robot cars

“By midcentury, the penetration of autonomous vehicles... could ultimately cause vehicle crashes in the U.S. to fall from second to ninth place in terms of their lethality ranking.” — McKinsey

Paul Leroux
If you saw a discarded two-by-four on the sidewalk, with rusty nails sticking out of it, what would you do? Chances are, you would move it to a safe spot. You might even bring it home, pull the nails out, and dispose of it properly. In any case, you would feel obliged to do something that reduces the probability of someone getting hurt.

Driver error is like a long sharp nail sticking out of that two-by-four. It is, in fact, the largest single contributor to road accidents. Which raises the question: If the auto industry had the technology, skills, and resources to build vehicles that could eliminate accidents caused by human error, would it not have a moral obligation to do so? I am speaking, of course, of self-driving cars.

Now, a philosopher I am not. I am ready to accept that my line of thinking on this matter has more holes than Swiss cheese. But if so, I’m not the only one with Emmenthal for brain matter. I am, in fact, in good company.

Take, for example, Bryant Walker Smith, a professor in the schools of law and engineering at the University of South Carolina. In an article in MIT Technology Review, he argues that, given the number of accidents that involve human error, introducing self-driving technology too slowly could be considered unethical. (Mind you, he also underlines the importance of accepting ethical tradeoffs. We already accept that airbags may kill a few people while saving many; we may have to accept that the same principle will hold true for autonomous vehicles.)

Then there’s Roger Lanctot of Strategy Analytics. He argues that government agencies and the auto industry need to move much more aggressively on active-safety features like automated lane keeping and automated collision avoidance. He reasons that, because the technology is readily available — and can save lives — we should be using it.

Mind you, the devil is in the proverbial details. In the case of autonomous vehicles, the ethics of “doing the right thing” is only the first step. Once you decide to build autonomous capabilities into a vehicle, you often have to make ethics-based decisions as to how the vehicle will behave.

For instance, what if an autonomous car could avoid a child running across the street, but only at the risk of driving itself, and its passengers, into a brick wall? Whom should the car be programmed to save? The child or the passengers? And what about a situation where the vehicle must hit either of two vehicles — should it hit the vehicle with the better crash rating? If so, wouldn’t that penalize people for buying safer cars? This scenario may sound far-fetched, but vehicle-to-vehicle (V2V) technology could eventually make it possible.

The “trolley problem” captures the dilemma nicely:



Being aware of such dilemmas gives me more respect for the kinds of decisions automakers will have to make as they build a self-driving future. But you know what? All this talk of ethics brings something else to mind. I work for a company whose software has, for decades, been used in medical devices that help save lives. Knowing that we do good in the world is a daily inspiration — and has been for the last 25 years of my life. And now, with products like the QNX OS for Safety, we are starting to help automotive companies build ADAS systems that can help mitigate driver error and, ultimately, reduce accidents. So I’m doubly proud.

More to the point, I believe this same sense of pride, of helping to make the road a safer place, will be a powerful motivator for the thousands of engineers and development teams dedicated to paving the road from ADAS to autonomous. It’s just one more reason why autonomous cars aren’t a question of if, but only of when.

Thursday, September 24, 2015

Developing safety-critical systems? This book is for you

In-depth volume covers development of systems under the IEC 61508, ISO 26262, EN 50128, and IEC 62304 standards

Paul Leroux
In June, I told you of an upcoming book by my colleague Chris Hobbs, who works as a software safety specialist here at QNX Software Systems. Well, I’m happy to say that the book is now available. It’s called Embedded Software Development for Safety-Critical Systems and it explores design practices for building medical devices, railway control systems, industrial control systems, and, of course, automotive ADAS devices.

The book:
  • covers the development of safety-critical systems under ISO 26262, IEC 61508, EN 50128, and IEC 62304
  • helps developers learn how to justify their work to external auditors
  • discusses the advantages and disadvantages of architectural and design practices recommended in the standards, including replication and diversification, anomaly detection, and so-called “safety bag” systems
  • examines the use of open-source components in safety-critical systems
Interested? I invite you to visit the CRC Press website, where you can view the full Table of Contents and, of course, order the book.

Looking forward to getting my copy!

Tuesday, September 22, 2015

From ADAS to autonomous

A new webinar on how autonomous driving technologies will affect embedded software — and vice versa

Paul Leroux
When, exactly, will production cars become fully autonomous? And when will they become affordable to the average Jane or Joe? Good questions both, but in the meantime, the auto industry isn’t twiddling its collective thumbs. It’s already starting to build a more autonomous future through active-control systems that can avoid accidents (e.g. automated emergency braking) and handle everyday driving tasks (e.g. adaptive cruise control).

These systems rely on software to do their job, and that reliance will grow as the systems become more sophisticated and cars become more fully autonomous. This trend, in turn, will place enormous pressure on how the software is designed, developed, and maintained. Safety, in particular, must be front and center at every stage of development.

Which brings me to a new webinar from my inestimable colleague, Kerry Johnson. Titled “The Role of a Software Platform When Transitioning from ADAS to Autonomous Driving,” the webinar will examine:
  • the emergence of high-performance systems-on-chip that target ADAS and autonomous vehicle applications
  • the impact of increasing system integration and autonomous technologies on embedded software
  • the need for functional safety standards such as ISO 26262
  • the emergence of pre-certified products as part of the solution to address safety challenges
  • the role of a software platform to support the evolution from ADAS to autonomous driving

If you are tasked with either developing or sourcing software for functional safety systems in passenger vehicles, this webinar is for you. Here are the coordinates:

Wednesday, October 7
1:00pm EDT

Registration Site



Tuesday, September 8, 2015

One OS, multiple safety applications

The latest version of our certified OS for ADAS systems and digital instrument clusters has a shorter product name — but a longer list of talents.

Paul Leroux
Can you ever deliver a safety-critical product to a customer and call it a day? For that matter, can you deliver any product to a customer and call it a day? These, of course, are rhetorical questions. Responsibility for a product rarely ends when you release it, especially when you add safety to the mix. In that case, it’s a long-term commitment that continues until the last instance of the product is retired from service. Which can take decades.

Mind you, people dedicated to building safety-critical products aren’t prone to sitting on their thumbs. From their perspective, product releases are simply milestones in a process of ongoing diligence and product improvement. For instance, at QNX Software Systems, we subject our OS safety products to continual impact analysis, even after they have been independently certified for use in functional safety systems. If that analysis calls for improved product, then improved product is what we deliver. With a refreshed certificate, of course.

Which brings me to the QNX OS for Safety. It’s a new — and newly certified — release of our field-proven OS safety technology, with a twist. Until now, we had one OS certified to the ISO 26262 standard (for automotive systems) and another certified to the IEC 61508 standard (for general embedded systems). The new release is certified to both of these safety standards and replaces the two existing products in one fell swoop.

So if you no longer see the QNX OS for Automotive Safety listed on the QNX website, not to worry. We’ve simply replaced it with an enhanced version that has a shorter product name and broader platform support — all with the same proven technology under the hood. (My colleague Patryk Fournier has put together an infographic that nicely summarizes the new release; see sidebar).

And if you’re at all surprised that a single OS can be certified to both 61508 and 26262, don’t be. As the infographic suggests, IEC 61508 provides the basis for many market-specific standards, including IEC 62304, EN 5012x, and, of course, ISO 26262.

Learn more about the QNX OS for Safety on the QNX website. And for more information on ISO 26262 and how it affects the design of safety-critical automotive systems, check out these whitepapers: