US9898005B2 - Driving path determination for autonomous vehicles - Google Patents
- Publication number
- US9898005B2 (Application US15/192,032)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- roadway
- positions
- lateral
- identified
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots, characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/18163—Lane change; Overtaking manoeuvres
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
- G05D1/0255—Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultrasonic signals
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- G05D2201/0213
Definitions
- the embodiments disclosed herein generally relate to autonomous operation systems for vehicles, and more particularly to their generation and execution of driving plans for maneuvering vehicles on roadways.
- Some vehicles include an autonomous operation system under which the vehicle is subject to autonomous operation.
- a human driver may cede control over one or more primary control functions in favor of autonomous operation.
- the autonomous operation system generates a driving plan for maneuvering the vehicle on a roadway based on detected information about the environment surrounding the vehicle.
- the autonomous operation system operates vehicle systems associated with the primary control functions over which the human driver has ceded control.
- a driving plan may describe, among other things, a driving path of the vehicle along a roadway.
- An autonomous operation system's framework for determining the driving path has to accommodate the dynamic changes in the environment surrounding the vehicle involved in real world situations. Developing and improving these frameworks is the subject of ongoing research.
- a method of autonomous driving includes identifying, using a perception module executable by at least one processor, from detected information about an environment surrounding a vehicle on a roadway, a lateral surface profile of the roadway. Based on the lateral surface profile of the roadway, vertical wheel positions at identified candidate future lateral positions of the vehicle are determined using a planning/decision making module executable by the at least one processor. Based on the determined vertical wheel positions, once again using the planning/decision making module executable by the at least one processor, as part of a driving path along the roadway, future lateral positions of the vehicle from among the identified candidates therefor are determined using an energy function that algorithmically favors low vertical wheel positions.
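The selection step above can be pictured as a small energy minimization over candidate lateral positions. The sketch below is an illustration under stated assumptions, not the patent's implementation: the profile representation, track width, weights, and the added lane-center term are all hypothetical.

```python
# Hypothetical sketch: among candidate future lateral positions, favor the
# one whose predicted vertical wheel positions are lowest (e.g., wheels
# settled into existing ruts). All names and weights are illustrative.

def wheel_heights(profile, lateral_pos, track_width=1.6):
    """Vertical positions of the left and right wheels for a candidate
    lateral position, read from a lateral surface profile.

    profile: dict mapping lateral offset (m) -> surface height (m)
    """
    left = lateral_pos - track_width / 2
    right = lateral_pos + track_width / 2
    # Nearest-sample lookup; a real system would interpolate.
    nearest = lambda x: profile[min(profile, key=lambda k: abs(k - x))]
    return nearest(left), nearest(right)

def energy(profile, lateral_pos, lane_center=0.0, w_height=1.0, w_center=0.1):
    """Lower energy = lower wheels; a mild (assumed) penalty keeps the
    vehicle near the lane center when wheel heights tie."""
    hl, hr = wheel_heights(profile, lateral_pos)
    return w_height * (hl + hr) + w_center * abs(lateral_pos - lane_center)

# Toy profile: snow ruts (low spots) centered 0.8 m either side of the
# lane center, on a 0.15 m snow cover.
profile = {x / 10: 0.15 for x in range(-20, 21)}
for rut in (-0.8, 0.8):
    for dx in (-0.1, 0.0, 0.1):
        profile[round(rut + dx, 1)] = 0.02

candidates = [x / 10 for x in range(-15, 16)]
best = min(candidates, key=lambda p: energy(profile, p))
# With a 1.6 m track width, the lane-center candidate drops both wheels
# into the ruts, so it minimizes the energy.
```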
- In another aspect, a vehicle includes sensors configured to detect information about an environment surrounding the vehicle, and vehicle systems operable to maneuver the vehicle.
- the vehicle also includes one or more modules stored on memory and executable by at least one processor for initiating instructions.
- the instructions include identifying, from the detected information about the environment surrounding the vehicle, a lateral surface profile of the roadway. Based on the lateral surface profile of the roadway, vertical wheel positions at identified candidate future lateral positions of the vehicle are determined. Based on the determined vertical wheel positions, as part of a driving path along the roadway, future lateral positions of the vehicle from among the identified candidates therefor are determined using an energy function that algorithmically favors low vertical wheel positions.
- the vehicle systems are then operated to maneuver the vehicle along the roadway according to a driving plan describing the driving path.
- FIG. 1 includes top views of a vehicle, showing, via block diagrams, components of an autonomous operation system
- FIG. 2 is a perspective view of the vehicle and an example environment surrounding the vehicle detectable by the autonomous operation system while the vehicle is on a roadway covered in snow, showing the roadway, snow ruts on the roadway and example obstacles on the roadway;
- FIG. 3 is a flowchart showing the operations of a process by which the autonomous operation system generates and executes a driving plan for maneuvering the vehicle on the roadway based on the detected information about the environment surrounding the vehicle, including determining, as part of a driving path along the roadway, future lateral positions of the vehicle;
- FIGS. 4 and 5 are conceptual renderings of a lateral surface profile of the roadway identifiable from the detected information about the environment surrounding the vehicle, showing their representations of the snow ruts on the roadway and, in FIG. 5 , the future lateral positions of the vehicle determined as part of the driving path.
- This disclosure teaches a vehicle with an autonomous operation system configured to generate and execute a driving plan for maneuvering the vehicle on a roadway.
- a driving path of the vehicle along a roadway described in the driving plan is determined using an energy function that algorithmically favors, among other things, low vertical wheel positions. This determination is thereby suited for roadways on which ruts, such as snow ruts, are formed.
- A representative vehicle 10 is shown in FIGS. 1 and 2 .
- the vehicle 10 has an exterior and a number of inner compartments.
- the inner compartments may include a passenger compartment 12 , an engine compartment and, for the illustrated vehicle 10 , a trunk.
- the vehicle 10 may include, among other things, an engine, motor, transmission and other powertrain components housed in its engine compartment or elsewhere in the vehicle 10 , as well as other powertrain components, such as wheels 14 .
- the wheels 14 support the remainder of the vehicle 10 .
- One, some or all of the wheels 14 may be powered by other powertrain components to drive the vehicle 10 .
- One, some or all of the wheels 14 may be steered wheels subject to having their steering angles adjusted to adjust the orientation of the vehicle 10 .
- the vehicle 10 includes an autonomous operation system 20 under which the vehicle 10 is, generally speaking, subject to autonomous operation. Under the autonomous operation system, the vehicle 10 may be semi-autonomous or highly automated, for instance.
- the autonomous operation system 20 includes various autonomous support systems that support autonomous operation of the vehicle 10 . Although the autonomous support systems could be dedicated to the autonomous operation system 20 , it is contemplated that some or all of the autonomous support systems may also support other functions of the vehicle 10 , including its manual operation.
- the autonomous support systems may be or include various vehicle systems 30 .
- the vehicle systems 30 may include a propulsion system 32 , an energy system 34 , a braking system 36 , a steering system 38 , a signaling system 40 , a stability control system 42 and a navigation system 44 , for example, as well as any other systems generally available in vehicles.
- the propulsion system 32 includes components operable to accelerate the vehicle 10 , as well as maintain its speed.
- the propulsion system 32 may include, for instance, the engine, motor, transmission and other powertrain components, as well as certain vehicle controls, such as a cruise control system.
- the energy system 34 includes components that control or otherwise support the storage and use of energy by the vehicle 10 .
- the energy source employed by the energy system 34 may include, for instance, gasoline, natural gas, diesel oil and the like, as well as batteries, fuel cells and the like.
- the braking system 36 includes components operable to decelerate the vehicle 10 , such as brakes, for instance.
- the steering system 38 includes components operable to adjust the orientation of the vehicle 10 with respect to its longitudinal direction or its lateral direction, or both, by, for example, adjusting the steering angle of one, some or all of the wheels 14 .
- the signaling system 40 includes components operable to communicate driving intentions and other notifications to other vehicles and their users.
- the signaling system 40 may include, for instance, exterior lights such as headlights, a left turn indicator light, a right turn indicator light, a brake indicator light, a backup indicator light, taillights and a running light.
- the stability control system 42 includes components operable to maintain, among other aspects of the stability of the vehicle 10 , its proper yaw and pitch, by, for example, actuating brakes and adjusting the power to one, some or all of the wheels 14 powered by other powertrain components to drive the vehicle 10 .
- the navigation system 44 establishes routes and directions for the vehicle 10 using, for instance, digital maps.
- the navigation system 44 may itself include digital maps, or the navigation system 44 may connect to remote sources for digital maps.
- the autonomous operation system 20 may connect to remote sources for routes and directions for the vehicle 10 .
- the autonomous support systems may be or include a sensor system 60 including one or more sensors.
- the sensor system 60 and its sensors may be positioned anywhere in or on the vehicle 10 , and may include existing sensors of the vehicle 10 , such as backup sensors, lane keeping sensors and front sensors, for instance.
- the sensor system 60 and its sensors may detect information about the vehicle 10 , including without limitation information about the operation of the vehicle 10 and information about the environment surrounding the vehicle 10 .
- the sensor system 60 and its sensors may detect information about the environment in front of and behind the vehicle 10 in its longitudinal direction, as well as to the sides of the vehicle 10 in its lateral direction.
- the sensor system 60 and its sensors may be configured to monitor in real-time, that is, at a level of processing responsiveness at which sensing is sufficiently immediate for a particular process or determination to be made, or that enables a processor to keep up with some external process.
- the sensors of the sensor system 60 may include one or more vehicle sensors 62 , one or more microphones 64 , one or more radar sensors 66 , one or more sonar sensors 68 , one or more lidar sensors 70 , one or more positioning sensors 72 and one or more cameras 74 , for example, as well as any other sensors generally available in vehicles.
- the vehicle sensors 62 are operable to detect information about the operation of the vehicle 10 .
- the vehicle sensors 62 may include, for instance, speedometers, gyroscopes, magnetometers, accelerometers, barometers, thermometers, altimeters, inertial measurement units (IMUs) and controller area network (CAN) sensors.
- the detected information about the operation of the vehicle 10 may include, for example, its speed, acceleration, orientation, rotation, direction, elevation, temperature and the like, as well as the operational statuses of the vehicle systems 30 and their components.
- the microphones 64 are operable to detect sound waves, and transform those sound waves into corresponding signals. Some microphones 64 may be located to detect sound waves in the environment surrounding the vehicle 10 . These microphones 64 may, accordingly, be at least partially exposed to the environment surrounding the vehicle 10 .
- the radar sensors 66 , the sonar sensors 68 and the lidar sensors 70 are each mounted on the vehicle 10 and positioned to have fields of view in the environment surrounding the vehicle 10 , and are each, generally speaking, operable to detect objects in the environment surrounding the vehicle 10 . More specifically, the radar sensors 66 , the sonar sensors 68 and the lidar sensors 70 are each operable to scan the environment surrounding the vehicle 10 , using radio signals in the case of the radar sensors 66 , sound waves in the case of the sonar sensors 68 and laser signals in the case of the lidar sensors 70 , and generate signals representing objects, or the lack thereof, in the environment surrounding the vehicle 10 . Among other things about the objects, the signals may represent their presence, location and motion, including their speed, acceleration, orientation, rotation, direction and the like, either absolutely or relative to the vehicle 10 , or both.
- the signals generated by the lidar sensors 70 include without limitation 3D points.
- the lidar sensors 70 each include a transmitter and a receiver.
- the transmitters are operable to transmit eye safe laser signals from any suitable portion of the electromagnetic spectrum, such as from the ultraviolet, visible, or near infrared portions of the electromagnetic spectrum, into the environment surrounding the vehicle 10 , where they impinge upon objects located in their paths.
- the laser signals may be transmitted in series of 360 degree spins around a vertical axis of the vehicle 10 , for example. When the laser signals impinge upon objects, portions thereof are returned by reflection to the lidar sensors 70 , where they are captured by the receivers.
- the receivers may be, or include, one or more photodetectors, solid state photodetectors, photodiodes or photomultipliers, or any combination of these.
- Responsive to capturing the returned laser signals, the lidar sensors 70 output signals representing objects, or the lack thereof, in the environment surrounding the vehicle 10 .
- the lidar sensors 70 may each include a global positioning system (GPS) transceiver or other positioning sensor for identifying their positions, and an IMU for identifying their pose.
- the signals may include 3D points representing the location in space of the points from which the returned laser signals are received, and therefore, the location in space of points of objects on which the laser signals impinged.
- the lidar sensors 70 may determine the location in space of points of objects based on the distance from the lidar sensors 70 to the points, as well as the position and pose of the lidar sensors 70 associated with the returned laser signals.
- the distance to the points may be determined from the returned laser signals using the time of flight (TOF) method, for instance.
- the signals may also represent the locations in space from which no returned laser signals are received, and therefore, the lack of points of objects in those locations in space on which the laser signals would have otherwise impinged.
- the signals output by the lidar sensors 70 may further represent other aspects of the returned laser signals, which, in turn, may represent other properties of points of objects on which the incident laser signals impinged. These aspects of the returned laser signals can include their intensity or reflectivity, for instance, or any combination of these.
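The two lidar steps described above — time of flight giving range, and the sensor's position and pose placing the return in space — can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function names and the spherical beam parameterization are assumptions.

```python
# Illustrative sketch of lidar 3D-point computation: the round-trip time
# of a returned laser signal gives the range, and the sensor's position
# and beam direction place the reflecting point in space.
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(round_trip_seconds):
    """Range to the reflecting point: half the round-trip distance."""
    return C * round_trip_seconds / 2.0

def point_from_return(sensor_xyz, azimuth_rad, elevation_rad, rng):
    """3D point of the reflection, given the sensor position (from its
    GPS transceiver), the beam direction (from its pose), and the range."""
    sx, sy, sz = sensor_xyz
    return (
        sx + rng * math.cos(elevation_rad) * math.cos(azimuth_rad),
        sy + rng * math.cos(elevation_rad) * math.sin(azimuth_rad),
        sz + rng * math.sin(elevation_rad),
    )

# A return captured 200 ns after transmission lies roughly 30 m away.
rng = tof_range(200e-9)
point = point_from_return((0.0, 0.0, 2.0), 0.0, 0.0, rng)
```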
- the positioning sensors 72 are operable to identify the position of the vehicle 10 .
- the positioning sensors 72 may implement, in whole or in part, a GPS, a geolocation system or a local positioning system, for instance, or any combination of these.
- the positioning sensors 72 may include GPS transceivers configured to determine a position of the vehicle 10 with respect to the Earth via its latitude and longitude and, optionally, its altitude.
- the cameras 74 are operable to detect light or other electromagnetic energy from objects, and transform that electromagnetic energy into corresponding visual data signals representing objects, or the lack thereof.
- the cameras 74 may be, or include, one or more image sensors configured for capturing light or other electromagnetic energy. These image sensors may be, or include, one or more photodetectors, solid state photodetectors, photodiodes or photomultipliers, or any combination of these. In these and other configurations, the cameras 74 may be any suitable type, including without limitation high resolution, high dynamic range (HDR), infrared (IR) or thermal imaging, or any combination of these.
- Some cameras 74 may be located to detect electromagnetic energy within the passenger compartment 12 of the vehicle 10 . These cameras 74 may accordingly be located within the passenger compartment 12 of the vehicle 10 . Other cameras 74 may be located to detect electromagnetic energy in the environment surrounding the vehicle 10 . These cameras 74 may be mounted on the vehicle 10 and positioned to have fields of view individually, or collectively, common to those of the radar sensors 66 , the sonar sensors 68 and the lidar sensors 70 in the environment surrounding the vehicle 10 , for example.
- the autonomous operation system 20 includes one or more processors 80 , a memory 82 and one or more modules 84 .
- the processors 80 , the memory 82 and the modules 84 constitute a computing device to which the vehicle systems 30 , the sensor system 60 and any other autonomous support systems are communicatively connected.
- Although this computing device could be dedicated to the autonomous operation system 20 , it is contemplated that some or all of its processors 80 , its memory 82 and its modules 84 could also be configured as parts of a central control system for the vehicle 10 , for instance, such as a central electronic control unit (ECU).
- the processors 80 may be any components configured to execute any of the processes described herein or any form of instructions to carry out such processes or cause such processes to be performed.
- the processors 80 may be implemented with one or more general-purpose or special-purpose processors. Examples of suitable processors 80 include microprocessors, microcontrollers, digital signal processors or other forms of circuitry that can execute software. Other examples of suitable processors 80 include without limitation central processing units (CPUs), array processors, vector processors, digital signal processors (DSPs), field-programmable gate arrays (FPGAs), programmable logic arrays (PLAs), application specific integrated circuits (ASICs), programmable logic circuitry or controllers.
- the processors 80 can include at least one hardware circuit (e.g., an integrated circuit) configured to carry out instructions contained in program code. In arrangements where there are multiple processors 80 , the processors 80 can work independently from each other or in combination with one another.
- the memory 82 is a non-transitory computer readable medium.
- the memory 82 may include volatile or non-volatile memory, or both. Examples of suitable memory 82 includes RAM (Random Access Memory), flash memory, ROM (Read Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable Programmable Read-Only Memory), EEPROM (Electrically Erasable Programmable Read-Only Memory), registers, magnetic disks, optical disks, hard drives or any other suitable storage medium, or any combination of these.
- the memory 82 includes stored instructions in program code. Such instructions can be executed by the processors 80 or the modules 84 .
- the memory 82 may be part of the processors 80 or the modules 84 , or may be communicatively connected to the processors 80 or the modules 84 .
- the modules 84 are employable to perform various tasks in the vehicle 10 .
- the modules 84 include instructions that may be executed by the processors 80 .
- the modules 84 can be implemented as computer readable program code that, when executed by the processors 80 , execute one or more of the processes described herein. Such computer readable program code can be stored on the memory 82 .
- the modules 84 may be part of the processors 80 , or may be communicatively connected to the processors 80 .
- the modules 84 may include, for example, an autonomous driving module 90 .
- the autonomous driving module 90 generates driving plans for maneuvering the vehicle 10 on roadways based on the information about the vehicle 10 detected by the sensor system 60 and its sensors, and executes the driving plans by operating the appropriate vehicle systems 30 .
- When the vehicle 10 is subject to autonomous operation, its human driver will have ceded control over one or more primary control functions in favor of autonomous operation.
- These primary control functions may include propulsion, or throttle, braking or steering, for instance, or any combination of these.
- the vehicle systems 30 operated by the autonomous driving module 90 include those associated with the primary control functions over which the human driver has ceded control.
- the autonomous driving module 90 may include a perception module 92 , a planning/decision making module 94 and a control module 96 .
- the perception module 92 gathers and evaluates information about the vehicle 10 detected by the sensor system 60 and its sensors. In the case of information about the environment surrounding the vehicle 10 , the perception module 92 may, as part of its evaluation, identify objects in the environment surrounding the vehicle 10 , including their properties. These properties may include, among other things about the objects, their presence, location and motion, including their speed, acceleration, orientation, rotation, direction and the like, either absolutely or relative to the vehicle 10 , or both, as well as their reflectivity, surface profile, color, thermal profile and the like.
- the perception module 92 may discriminate between different objects and individually track different objects over time. Either on initial detection or after tracking them over time, the perception module 92 may classify the objects to account not only for roadways, features of roadways, such as lane markings, and obstacles on roadways, such as other vehicles, but also for surrounding ground, pedestrians, bicycles, construction equipment, road signs, buildings, trees and foliage, for instance. Either alone or in combination with its identification and classification of objects in the environment surrounding the vehicle 10 , the perception module 92 may identify roadway conditions, such as surface profiles of roadways, weather conditions, traffic conditions and the like.
- Based on the evaluation of the information about the vehicle 10 by the perception module 92 , the planning/decision making module 94 generates driving plans for maneuvering the vehicle 10 on roadways.
- the driving plans may account for any objects in the environment surrounding the vehicle 10 , their properties and roadway conditions, for example.
- the driving plans may also account for different lane positions and traffic rules, such as speed limits, priorities at intersections and roundabouts, stop line positions and the like.
- the planning/decision making module 94 may itself include digital maps reflecting these lane positions and traffic rules as part of an overall 3D road network, for instance, or the planning/decision making module 94 may connect to the navigation system 44 or to remote sources for digital maps.
- the control module 96 operates the appropriate vehicle systems 30 to execute the driving plans generated by the planning/decision making module 94 .
- the control module 96 may send control signals to the vehicle systems 30 or may directly send control signals to actuators that operate their components, or both.
- the vehicle 10 is shown, in FIG. 2 , on an exemplary longitudinal roadway 100 .
- the roadway 100 is subject to winter weather conditions and, as a result, its surface 102 is covered in snow.
- the operations of a process 200 by which the autonomous operation system 20 generates and executes a driving plan for maneuvering the vehicle 10 on the roadway 100 are shown in FIG. 3 .
- Although the process 200 is described with reference to the roadway 100 and other roadways whose surfaces are similarly covered in snow, it will be understood that the process 200 is applicable in principle to any roadways on which ruts are formed, including, for instance, roadways on which mud ruts are formed.
- information about the vehicle 10 is detected by the sensor system 60 and its sensors for gathering and evaluation by the perception module 92 .
- the perception module 92 may, as part of its evaluation, identify, among other objects in the environment surrounding the vehicle 10 , the roadway 100 , the surrounding ground 104 , as well as any obstacles on the roadway 100 , such as an oncoming neighboring vehicle 106 .
- the perception module 92 may identify its surface 102 and, among other features of the roadway 100 , lane markings which, for the illustrated roadway 100 , include edge lines 110 and a center line 112 .
- the edge lines 110 mark the outside boundaries of the roadway 100 .
- the center line 112 separates the roadway 100 into two sections for traffic moving in opposite directions.
- the perception module 92 may further identify the distinct lanes of the roadway 100 , as well as the lane centers of these lanes. For the illustrated roadway 100 , these lanes include a lane 114 extending between one edge line 110 and the center line 112 , in which the vehicle 10 is located, and which has a lane center 116 .
- These lanes further include a lane 118 for traffic moving in the opposite direction as the vehicle 10 extending between the center line 112 and the other edge line 110 , in which the neighboring vehicle 106 is located.
- the perception module 92 may moreover identify various roadway conditions, such as the surface profile of the roadway 100 and that its surface 102 is covered in snow.
- the planning/decision making module 94 , based on the evaluation of the information about the vehicle 10 by the perception module 92 , generates a driving plan for maneuvering the vehicle 10 on the roadway 100 .
- the driving plan may be for maneuvering the vehicle 10 along the roadway 100 from an origin, such as the current location of the vehicle 10 on the roadway 100 , to an ultimate future location of the vehicle 10 on the roadway 100 a certain distance down the roadway 100 .
- the driving plan describes the motion of the vehicle 10 along the roadway 100 .
- Part of the driving plan may describe a trajectory, or driving path, of the vehicle 10 along the roadway 100 .
- Other parts of the driving plan may describe other aspects of maneuvering the vehicle 10 along the roadway 100 , such as the speed, acceleration and orientation of the vehicle 10 along the roadway 100 , as well as its signaling, for instance.
- the driving path may be represented by successive future locations of the vehicle 10 along the roadway 100 from the origin to the ultimate future location of the vehicle 10 on the roadway 100 .
- These successive future locations of the vehicle 10 along the roadway 100 may be defined by successive future positions of the vehicle 10 in the direction of the roadway 100 , or future longitudinal positions of the vehicle 10 .
- the successive future locations of the vehicle 10 along the roadway 100 may further be defined by future positions of the vehicle 10 across the direction of the roadway 100 , or future lateral positions of the vehicle 10 , for each of the future longitudinal positions of the vehicle 10 .
- the driving path may accordingly be represented by pairs of future longitudinal positions of the vehicle 10 , and corresponding future lateral positions of the vehicle 10 .
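As a minimal illustration of this pairwise representation (the type and field names below are illustrative, not taken from the patent), a driving path can be held as successive (longitudinal, lateral) position pairs:

```python
# Minimal sketch of the driving-path representation described above:
# successive (longitudinal, lateral) position pairs. The names "PathPoint",
# "longitudinal" and "lateral" are illustrative assumptions.
from typing import List, NamedTuple

class PathPoint(NamedTuple):
    longitudinal: float  # future position in the direction of the roadway
    lateral: float       # future position across the direction of the roadway

def make_path(longitudinals: List[float], laterals: List[float]) -> List[PathPoint]:
    # pair each given future longitudinal position with its determined lateral position
    assert len(longitudinals) == len(laterals)
    return [PathPoint(s, x) for s, x in zip(longitudinals, laterals)]
```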
- the driving path is determined based on the information about the vehicle 10 detected by the sensor system 60 and its sensors and, more specifically, based on the information about the environment surrounding the vehicle 10 .
- the future longitudinal positions of the vehicle 10 may be taken as a given or otherwise determined as a matter of course. There are often, however, many identifiable candidates for the corresponding future lateral positions of the vehicle 10 .
- in Equation 1, the argument at which the energy function is minimized, x, is a set of the future lateral positions of the vehicle 10 (i.e., x_i) for given future longitudinal positions of the vehicle 10 .
- the energy function includes a number of sub-functions that determine different aspects of the candidate future lateral positions of the vehicle 10 .
- the sub-functions favor certain aspects of the candidate future lateral positions of the vehicle 10 .
- the sub-functions penalize the inverses of those aspects of the candidate future lateral positions of the vehicle 10 .
- the sub-functions may include, for instance, a vertical wheel position determination function P(x), a driving path lateral curvature determination function Q(x), a lane center lateral offset determination function R(x), an obstacle proximity determination function S(x) and a predetermined driving path deviation determination function T(x).
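A hedged sketch of how Equation 1 combines these sub-functions, treating each sub-function as an opaque callable (the function signatures and default weights below are assumptions, not the patented implementation):

```python
# Sketch of the energy function of Eq. 1 and its minimization over a finite
# candidate set. P, Q, R, S, T stand in for the sub-functions; rho, lam, mu,
# eta correspond to the weighting factors applied to P(x) through S(x).
def energy(x, P, Q, R, S, T, rho=1.0, lam=1.0, mu=1.0, eta=1.0):
    return rho * P(x) + lam * Q(x) + mu * R(x) + eta * S(x) + T(x)

def argmin_energy(candidate_sets, P, Q, R, S, T, **weights):
    # return the candidate set of future lateral positions at which the
    # energy function is minimized (the argmin of Eq. 1)
    return min(candidate_sets, key=lambda x: energy(x, P, Q, R, S, T, **weights))
```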
- the planning/decision making module 94 identifies the candidate future lateral positions of the vehicle 10 .
- the candidate future lateral positions of the vehicle 10 may be identified as all of those reasonably possible given the current location and motion of the vehicle 10 on the roadway 100 , or as some subset of these, for example.
- the planning/decision making module 94 uses the sub-functions to determine different aspects of the candidate future lateral positions of the vehicle 10 .
- the aspects of the candidate future lateral positions of the vehicle 10 are determined based on the information about the vehicle 10 detected by the sensor system 60 and its sensors.
- the aspects of the candidate future lateral positions of the vehicle 10 are determined, more specifically, based on the information about the environment surrounding the vehicle 10 and, even more specifically, based on the identification, by the perception module 92 , of the objects in the environment surrounding the vehicle 10 , the features of the roadway 100 and roadway conditions.
- the vertical wheel position determination function P(x) determines the vertical positions of the wheels 14 of the vehicle 10 at the candidate future lateral positions of the vehicle 10 .
- the perception module 92 may identify, among other roadway conditions, a surface profile of the roadway 100 across the direction of the roadway 100 , or a lateral surface profile LSP of the roadway 100 .
- the lateral surface profile LSP of the roadway 100 is represented by discretized cross sections of the roadway 100 across the direction of the roadway 100 , or discretized lateral cross sections LCS of the roadway 100 (i.e., LCS i ) at given future longitudinal positions of the vehicle 10 .
- the perception module 92 may identify the lateral surface profile LSP of the roadway 100 , and its discretized lateral cross sections LCS of the roadway 100 , from 3D points included among the signals generated by the lidar sensors 70 . These 3D points represent the objects in the environment surrounding the vehicle 10 via their representation of the locations in space of points of the objects on which the laser signals of the lidar sensors 70 impinged.
- the perception module 92 may, for example, identify and remove those of the 3D points representing any objects exhibiting motion, or dynamic objects, in the environment surrounding the vehicle 10 , such as the neighboring vehicle 106 .
- the remaining 3D points representing the roadway 100 and its features, as well as the surrounding ground 104 may then be used to identify the lateral surface profile LSP of the roadway 100 , and its discretized lateral cross sections LCS of the roadway 100 .
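One plausible way to discretize the remaining 3D points into a lateral cross section is to slice them around a longitudinal position and bin them laterally; the function name, binning scheme, and tolerances below are illustrative assumptions, not the patented method:

```python
import numpy as np

def lateral_cross_section(points, s0, bin_width=0.25, s_tol=0.5):
    """Illustrative sketch: build a discretized lateral cross section at
    longitudinal position s0 from lidar 3D points, given as an (N, 3) array
    of (lateral, longitudinal, height) coordinates with dynamic-object points
    already removed. Returns a mapping from the lateral coordinate of each
    bin to the mean height of the points that fell in it."""
    near = points[np.abs(points[:, 1] - s0) < s_tol]  # slice around s0
    bins = np.floor(near[:, 0] / bin_width).astype(int)
    heights = {}
    for b, z in zip(bins, near[:, 2]):
        heights.setdefault(int(b), []).append(float(z))
    return {b * bin_width: sum(zs) / len(zs) for b, zs in sorted(heights.items())}
```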
- the lateral surface profile LSP of the roadway 100 and its discretized lateral cross sections LCS of the roadway 100 , represent, among other things, snow ruts 120 on the roadway 100 .
- as snow falls on the surface 102 of the roadway 100 and preceding vehicles drive along the roadway 100 , their wheels form the snow ruts 120 by pushing the fallen snow to their sides into snow piles, and leaving tracks between the snow piles.
- the vertical wheel position determination function P(x) favors low vertical wheel positions by returning lower values for the candidate future lateral positions of the vehicle 10 at which the vertical positions of the wheels 14 of the vehicle 10 are lower. Equally, the vertical wheel position determination function P(x) penalizes high vertical wheel positions by returning higher values for the candidate future lateral positions of the vehicle 10 at which the vertical positions of the wheels 14 of the vehicle 10 are higher.
- the energy function is suited for roadways on which ruts are formed.
- the lowest vertical wheel positions occur for the candidate future lateral positions of the vehicle 10 in which its wheels 14 are positioned on the existing tracks in the snow ruts 120 .
- the highest vertical wheel positions occur for the candidate future lateral positions of the vehicle 10 in which its wheels 14 are positioned on the snow piles bordering the tracks.
- the energy function accordingly, by favoring low vertical wheel positions, favors the determination of future lateral positions of the vehicle 10 from among the candidates therefor and, as an extension, a driving path as a whole, by which the vehicle 10 follows the existing tracks in the snow ruts 120 , and avoids the snow piles bordering the tracks.
- an example of the vertical wheel position determination function P(x) is shown in Equation 2:
- Z(x i ) is the average height of the wheels 14 of the vehicle 10 above a reference level, which is determined at the candidate future lateral positions of the vehicle 10 for given future longitudinal positions of the vehicle 10 , for instance, at the discretized lateral cross sections LCS of the roadway 100 .
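Equation 2 itself is not reproduced in this excerpt, but a sketch consistent with the description — summing the average left/right wheel height Z(x_i) read off the lateral surface profile at each candidate position — might look like the following; the half-track width and all names are assumptions:

```python
def P(x, surface_height, half_track=0.8):
    # Sketch of the vertical wheel position sub-function: for each candidate
    # future lateral position x_i, Z(x_i) is taken as the average height of
    # the left and right wheels on the lateral surface profile. The exact
    # form of Eq. 2 is not published in this excerpt; this is an assumption.
    total = 0.0
    for xi in x:
        z_left = surface_height(xi - half_track)
        z_right = surface_height(xi + half_track)
        total += 0.5 * (z_left + z_right)
    return total
```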
- the driving path lateral curvature determination function Q(x) determines the lateral curvature between the candidate future lateral positions of the vehicle 10 .
- the driving path lateral curvature determination function Q(x) favors low lateral curvature by returning lower values for the candidate future lateral positions of the vehicle 10 at which the lateral curvature is lower. Equally, the driving path lateral curvature determination function Q(x) penalizes high lateral curvature by returning higher values for the candidate future lateral positions of the vehicle 10 at which the lateral curvature is higher.
- the energy function favors the determination of future lateral positions of the vehicle 10 from among the candidates therefor and, as an extension, a driving path as a whole, by which the vehicle 10 has a smooth trajectory.
- an example of the driving path lateral curvature determination function Q(x) is shown in Equation 3:
- Curv(x i ) is the lateral curvature between the candidate future lateral positions of the vehicle 10 , which is determined at the candidate future lateral positions of the vehicle 10 for given future longitudinal positions of the vehicle 10 , for instance, at the discretized lateral cross sections LCS of the roadway 100 .
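Since Equation 3 is not reproduced in this excerpt, a common discrete stand-in for Curv(x_i) is the second difference of successive lateral positions at evenly spaced longitudinal stations; the form below is an assumption:

```python
def Q(x, ds=1.0):
    # Sketch of the lateral curvature sub-function: the squared discrete
    # second difference of successive lateral positions at evenly spaced
    # longitudinal stations (spacing ds) stands in for Curv(x_i).
    return sum(((x[i - 1] - 2.0 * x[i] + x[i + 1]) / ds ** 2) ** 2
               for i in range(1, len(x) - 1))
```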
- the lane center lateral offset determination function R(x) determines the lateral offsets of the vehicle 10 from the lane center 116 of the lane 114 of the roadway 100 in which the vehicle 10 is located at the candidate future lateral positions of the vehicle 10 .
- the perception module 92 may identify, among other features of the roadway 100 , the lane 114 of the roadway 100 in which the vehicle 10 is located, as well as its lane center 116 .
- the lane center lateral offset determination function R(x) favors low lateral offsets from the lane center 116 by returning lower values for the candidate future lateral positions of the vehicle 10 at which the lateral offsets from the lane center 116 are lower.
- the lane center lateral offset determination function R(x) penalizes high lateral offsets from the lane center 116 by returning higher values for the candidate future lateral positions of the vehicle 10 at which the lateral offsets from the lane center 116 are higher.
- the energy function favors the determination of future lateral positions of the vehicle 10 from among the candidates therefor and, as an extension, a driving path as a whole, by which the vehicle 10 stays close to the lane center 116 of the lane 114 of the roadway 100 in which the vehicle 10 is located.
- an example of the lane center lateral offset determination function R(x) is shown in Equation 4:
- in Equation 4, y(x_i) is the lateral offset of the vehicle 10 from the lane center 116 of the lane 114 of the roadway 100 in which the vehicle 10 is located, which is determined at the candidate future lateral positions of the vehicle 10 for given future longitudinal positions of the vehicle 10 , for instance, at the discretized lateral cross sections LCS of the roadway 100 .
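A sketch consistent with this description, with Equation 4 not reproduced in this excerpt, is a sum of squared lane-center offsets; the squaring is an assumption, as the description only requires larger offsets to return larger values:

```python
def R(x, lane_center=0.0):
    # Sketch of the lane center lateral offset sub-function: sum the squared
    # offsets y(x_i) of each candidate lateral position from the lane center.
    # The squared form is an assumption, not the published Eq. 4.
    return sum((xi - lane_center) ** 2 for xi in x)
```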
- the obstacle proximity determination function S(x) determines the proximity of the vehicle 10 from obstacles on the roadway 100 at the candidate future lateral positions of the vehicle 10 .
- the perception module 92 may identify, among other objects in the environment surrounding the vehicle 10 , obstacles on the roadway 100 , such as the neighboring vehicle 106 . In addition to the presence of the obstacles on the roadway 100 , the perception module 92 may further identify their location and motion, which the planning/decision making module 94 may use to predict their future maneuvering along the roadway 100 .
- the obstacle proximity determination function S(x) favors far proximity from obstacles by returning lower values for the candidate future lateral positions of the vehicle 10 at which the proximity from obstacles is far. Equally, the obstacle proximity determination function S(x) penalizes close proximity to obstacles by returning higher values for the candidate future lateral positions of the vehicle 10 at which the proximity to obstacles is close.
- the energy function favors the determination of future lateral positions of the vehicle 10 from among the candidates therefor and, as an extension, a driving path as a whole, by which the vehicle 10 stays away from obstacles.
- an example of the obstacle proximity determination function S(x) is shown in Equation 5:
- dis(x i ) is a minimum distance between the vehicle 10 and obstacles on the roadway 100
- δ is the closest distance between the vehicle 10 and obstacles on the roadway 100 without the vehicle 10 crashing into the obstacles on the roadway 100 .
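A sketch of a barrier-style penalty consistent with this description — growing without bound as the clearance dis(x_i) approaches the no-crash distance δ — might look as follows; the reciprocal form is an assumption, since Equation 5 is not reproduced in this excerpt:

```python
def S(x, obstacle_lateral, delta=0.5):
    # Sketch of the obstacle proximity sub-function: the penalty grows as the
    # clearance dis(x_i) to the obstacle approaches the no-crash distance
    # delta, and shrinks as the obstacle gets farther away.
    total = 0.0
    for xi in x:
        dis = abs(xi - obstacle_lateral)       # stand-in for the minimum distance
        total += 1.0 / max(dis - delta, 1e-6)  # clamp to avoid division by zero
    return total
```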
- the predetermined driving path deviation determination function T(x) determines the deviation from a predetermined driving path along the roadway 100 at the candidate future lateral positions of the vehicle 10 across the direction of the roadway 100 .
- the predetermined driving path may, for example, be the driving path determined in a previous iteration of the process 200 .
- the predetermined driving path deviation determination function T(x) favors low deviation from the predetermined driving path by returning lower values for the candidate future lateral positions of the vehicle 10 at which the deviation from the predetermined driving path is lower. Equally, the predetermined driving path deviation determination function T(x) penalizes high deviation from the predetermined driving path by returning higher values for the candidate future lateral positions of the vehicle 10 at which the deviation from the predetermined driving path is higher.
- the energy function favors the determination of future lateral positions of the vehicle 10 from among the candidates therefor and, as an extension, a driving path as a whole, by which the vehicle 10 continues according to the predetermined driving path.
- the sub-functions may be weighted to, for example, establish the extent to which the energy function favors the different aspects of the candidate future lateral positions of the vehicle 10 determined by the sub-functions.
- a weighting factor ρ is applied to the vertical wheel position determination function P(x)
- a weighting factor λ is applied to the driving path lateral curvature determination function Q(x)
- a weighting factor μ is applied to the lane center lateral offset determination function R(x)
- a weighting factor η is applied to the obstacle proximity determination function S(x).
- One, some or all of the weighting factors ρ, λ, μ and η may be dynamically increased or decreased compared to one, some or all of the remaining weighting factors ρ, λ, μ and η. This adjusts the extent to which the energy function favors the different aspects of the candidate future lateral positions of the vehicle 10 determined by the sub-functions to which the weighting factors ρ, λ, μ and η are applied.
- the perception module 92 may identify, among other roadway conditions, the lateral surface profile LSP of the roadway 100 and its discretized lateral cross sections LCS of the roadway 100 , as well as the depths of the snow ruts 120 on the roadway 100 that they represent.
- the weighting factor ρ applied to the vertical wheel position determination function P(x) may be increased compared to one, some or all of the remaining weighting factors λ, μ and η with increasing depths of the snow ruts 120 on the roadway 100 , for example.
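One way to sketch this dynamic adjustment is a ramp on the weight applied to the vertical wheel position term as identified rut depth grows; the linear form and every constant below are illustrative assumptions, not values from the patent:

```python
def rut_depth_weight(rut_depth_m, base=1.0, gain=4.0, full_depth_m=0.15):
    # Sketch of dynamically increasing the weighting factor rho on P(x) as
    # the identified snow-rut depth (meters) grows, saturating at
    # full_depth_m. All constants here are illustrative assumptions.
    capped = min(max(rut_depth_m, 0.0), full_depth_m)
    return base + gain * capped / full_depth_m
```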
- the planning/decision making module 94 generates the driving plan.
- the driving plan describes, among other things about maneuvering the vehicle 10 along the roadway 100 , the driving path.
- the planning/decision making module 94 uses the energy function to determine, as part of the driving path, and based on the different aspects of the candidate future lateral positions of the vehicle 10 determined using the sub-functions in operation 206 , the set, x, of the future lateral positions of the vehicle 10 from among the candidates therefor for given future longitudinal positions of the vehicle 10 .
- the set, x, of the future lateral positions of the vehicle 10 is, more specifically, determined as the argument at which the energy function is minimized.
- an example driving path DP is represented by the future lateral positions of the vehicle 10 , x 1-5 , for given future longitudinal positions of the vehicle 10 at discretized lateral cross sections LCS 1-5 of the roadway 100 representing the lateral surface profile LSP of the roadway 100 .
- a driving path by which the vehicle 10 stays close to the lane center 116 of the lane 114 of the roadway 100 in which the vehicle 10 is located might be desirable.
- this otherwise desirable driving path could destabilize the vehicle 10 by positioning its wheels 14 on the snow piles bordering the tracks in the snow ruts 120 .
- the example driving path DP is a global optimum returned from a single framework that accommodates not only low lateral offsets from the lane center 116 , but also low vertical wheel positions, among other interacting aspects of the candidate future lateral positions of the vehicle 10 .
- the example driving path DP is, accordingly, somewhat to the right of the lane center 116 of the lane 114 , so that the vehicle 10 follows the existing tracks in the snow ruts 120 , and avoids the snow piles bordering the tracks, while still staying as close as possible to the lane center 116 of the lane 114 .
- the control module 96 operates the appropriate vehicle systems 30 to execute the driving plan.
- the vehicle 10 is maneuvered according to the driving plan from the origin to the ultimate future location of the vehicle 10 on the roadway 100 .
- the vehicle 10 is a host vehicle.
- the generated driving plan is for maneuvering the vehicle 10 along the roadway 100 , and describes the driving path of the vehicle 10 along the roadway 100 .
- a similar framework as that used to return the driving path of the vehicle 10 along the roadway 100 may also be used to return predicted driving paths along the roadway 100 for other obstacles on the roadway 100 , such as the neighboring vehicle 106 . These predicted driving paths for other obstacles may then be used to predict the future maneuvering of those obstacles along the roadway 100 .
- in Equation 6, the argument at which the energy function is minimized, y, is a set of the predicted future lateral positions of the neighboring vehicle 106 (i.e., y_i) for given predicted future longitudinal positions of the neighboring vehicle 106 .
- This energy function includes a number of sub-functions that determine different aspects of the candidate predicted future lateral positions of the neighboring vehicle 106 .
- the sub-functions may include, for instance, a vertical wheel position determination function P(y), a driving path lateral curvature determination function Q(y), a lane center lateral offset determination function R(y), an obstacle proximity determination function S(y) and a predetermined driving path deviation determination function T(y).
- the predetermined driving path deviation determination function T(y) may use a predetermined driving path determined as a function of the tracked motion of the neighboring vehicle 106 along the roadway 100 , instead of the driving path determined in a previous iteration of the process 200 .
- the predicted driving path of the neighboring vehicle 106 may, for instance, be used in the obstacle proximity determination function S(x) to predict the future maneuvering of the neighboring vehicle 106 along the roadway 100 .
- the predicted driving path for the neighboring vehicle 106 may, accordingly, be used to determine the driving path of the vehicle 10 along the roadway 100 and thus, ultimately, the driving plan for maneuvering the vehicle 10 along the roadway 100 .
Description
x=argmin[ρP(x)+λQ(x)+μR(x)+ηS(x)+T(x)] (Eq. 1)
In Equation 1, the argument at which the energy function is minimized, x, is a set of the future lateral positions of the vehicle 10 (i.e., x_i) for given future longitudinal positions of the vehicle 10.
y=argmin[ρP(y)+λQ(y)+μR(y)+ηS(y)+T(y)] (Eq. 6)
In Equation 6, the argument at which the energy function is minimized, y, is a set of the predicted future lateral positions of the neighboring vehicle 106 (i.e., y_i) for given predicted future longitudinal positions of the neighboring vehicle 106.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/192,032 US9898005B2 (en) | 2016-06-24 | 2016-06-24 | Driving path determination for autonomous vehicles |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170371336A1 US20170371336A1 (en) | 2017-12-28 |
US9898005B2 true US9898005B2 (en) | 2018-02-20 |
Family
ID=60675571
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/192,032 Active US9898005B2 (en) | 2016-06-24 | 2016-06-24 | Driving path determination for autonomous vehicles |
Country Status (1)
Country | Link |
---|---|
US (1) | US9898005B2 (en) |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102581482B1 (en) * | 2016-09-01 | 2023-09-21 | 삼성전자주식회사 | Method and apparatus for operating of automatic driving controller of vehicle |
US11010615B2 (en) | 2016-11-14 | 2021-05-18 | Lyft, Inc. | Rendering a situational-awareness view in an autonomous-vehicle environment |
US10640121B2 (en) * | 2017-04-28 | 2020-05-05 | International Business Machines Corporation | Vehicle control for reducing road wear |
DE102017211887A1 (en) * | 2017-07-12 | 2019-01-17 | Robert Bosch Gmbh | Method and device for locating and automated operation of a vehicle |
US10496098B2 (en) * | 2017-09-12 | 2019-12-03 | Baidu Usa Llc | Road segment-based routing guidance system for autonomous driving vehicles |
US11874126B1 (en) * | 2017-09-15 | 2024-01-16 | Apple Inc. | Map with location-based observations, actions, and rules |
US10435020B2 (en) * | 2017-12-01 | 2019-10-08 | Robert Bosch Gmbh | Lane keeping support on roads covered by snow |
US11062608B2 (en) | 2018-05-11 | 2021-07-13 | Arnold Chase | Passive infra-red pedestrian and animal detection and avoidance system |
US10467903B1 (en) * | 2018-05-11 | 2019-11-05 | Arnold Chase | Passive infra-red pedestrian detection and avoidance system |
US11294380B2 (en) * | 2018-05-11 | 2022-04-05 | Arnold Chase | Passive infra-red guidance system |
US10750953B1 (en) | 2018-05-11 | 2020-08-25 | Arnold Chase | Automatic fever detection system and method |
DE102018127342B4 (en) * | 2018-11-01 | 2022-09-29 | Mercedes-Benz Group AG | Method and device for operating an assistance system of a vehicle |
US11554775B2 (en) * | 2019-03-18 | 2023-01-17 | Arnold Chase | Passive infra-red guidance system |
JP7012769B2 (en) * | 2020-03-26 | 2022-01-28 | 日立建機株式会社 | Autonomous driving system |
DE102022109423A1 (en) * | 2022-04-19 | 2023-10-19 | Valeo Schalter Und Sensoren Gmbh | Method for the lateral localization of a vehicle on a road, computer program and driver assistance system |
CN115447616B (en) * | 2022-10-26 | 2024-05-17 | 重庆长安汽车股份有限公司 | Method and device for generating objective index of vehicle driving |
Patent Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9373149B2 (en) * | 2006-03-17 | 2016-06-21 | Fatdoor, Inc. | Autonomous neighborhood vehicle commerce network and community |
US20070291130A1 (en) * | 2006-06-19 | 2007-12-20 | Oshkosh Truck Corporation | Vision system for an autonomous vehicle |
US20090295917A1 (en) | 2008-04-24 | 2009-12-03 | Gm Global Technology Operations, Inc. | Pixel-based texture-less clear path detection |
US20100106356A1 (en) * | 2008-10-24 | 2010-04-29 | The Gray Insurance Company | Control and systems for autonomously driven vehicles |
US20130211720A1 (en) | 2012-02-09 | 2013-08-15 | Volker NIEMZ | Driver-assistance method and driver-assistance system for snow-covered roads |
US9120485B1 (en) | 2012-09-14 | 2015-09-01 | Google Inc. | Methods and systems for smooth trajectory generation for a self-driving vehicle |
JP2014142831A (en) | 2013-01-24 | 2014-08-07 | Toyota Industries Corp | Vehicle |
US9008890B1 (en) * | 2013-03-15 | 2015-04-14 | Google Inc. | Augmented trajectories for autonomous vehicles |
JP2014184747A (en) | 2013-03-21 | 2014-10-02 | Toyota Motor Corp | Vehicle control apparatus and vehicle control method |
US20150202770A1 (en) * | 2014-01-17 | 2015-07-23 | Anthony Patron | Sidewalk messaging of an autonomous robot |
US20150345966A1 (en) * | 2014-05-30 | 2015-12-03 | Nissan North America, Inc. | Autonomous vehicle lane routing and navigation |
US20160132705A1 (en) * | 2014-11-12 | 2016-05-12 | Joseph E. Kovarik | Method and System for Autonomous Vehicles |
US20170010106A1 (en) * | 2015-02-10 | 2017-01-12 | Mobileye Vision Technologies Ltd. | Crowd sourcing data for autonomous vehicle navigation |
US9690293B2 (en) * | 2015-02-10 | 2017-06-27 | Mobileye Vision Technologies Ltd. | Autonomous vehicle tail alignment navigation |
US9760090B2 (en) * | 2015-02-10 | 2017-09-12 | Mobileye Vision Technologies Ltd. | Crowd sourcing data for autonomous vehicle navigation |
US20160318531A1 (en) * | 2015-04-28 | 2016-11-03 | General Electric Company | Location and/or direction of travel detection system and method |
US20170123434A1 (en) * | 2015-11-04 | 2017-05-04 | Toyota Jidosha Kabushiki Kaisha | Autonomous driving system |
Non-Patent Citations (2)
Title |
---|
Ordonez et al., "Laser-Based Rut Detection and Following System for Autonomous Ground Vehicles", Florida A&M-Florida State University, 22 pages. |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170010115A1 (en) * | 2015-02-10 | 2017-01-12 | Mobileye Vision Technologies Ltd. | Autonomous navigation based on road signatures |
US10908606B2 (en) * | 2015-02-10 | 2021-02-02 | Mobileye Vision Technologies Ltd. | Autonomous navigation based on road signatures |
US20200156640A1 (en) * | 2018-11-15 | 2020-05-21 | Volvo Car Corporation | Vehicle safe stop |
US11608063B2 (en) * | 2018-11-15 | 2023-03-21 | Volvo Car Corporation | Vehicle safe stop |
US11584371B2 (en) | 2020-07-15 | 2023-02-21 | Toyota Research Institute, Inc. | Systems and methods for using R-functions and semi-analytic geometry for lane keeping in trajectory planning |
US20220315005A1 (en) * | 2021-03-31 | 2022-10-06 | Subaru Corporation | Vehicle traveling control apparatus |
US12024176B2 (en) * | 2021-03-31 | 2024-07-02 | Subaru Corporation | Vehicle traveling control apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AME Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MEI, XUE;HARADA, MASAHIRO;PROKHOROV, DANIL V.;REEL/FRAME:039041/0531 Effective date: 20160617 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOYOTA MOTOR ENGINEERING & MANUFACTURING NORTH AMERICA, INC.;REEL/FRAME:044975/0323 Effective date: 20180220 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |