Method and control unit for vehicle self-diagnosis
Patent abstract:
SUMMARY. Method (400) and control unit (310) in a vehicle (100) for vehicle diagnosis by visual inspection. The method (400) comprises comparing (401) an image (220) of at least a part of the vehicle (100) with a previously stored image (210) depicting a normal state of the vehicle (100); detecting (402) an anomaly (230) in the image (220), in comparison with the previously stored image (210); and executing (403) a measure in order to at least reduce the impact of the detected (402) anomaly (230). (Publ. Fig. 3A)

Publication number: SE1650346A1
Application number: SE1650346
Filing date: 2016-03-15
Publication date: 2017-09-16
Inventors: Claezon Fredrich; Lindberg Mikael
Applicant: Scania CV AB
Patent description:
METHOD AND CONTROL UNIT FOR VEHICLE SELF-DIAGNOSIS

TECHNICAL FIELD

This document relates to a method and a control unit in a vehicle. More particularly, a method and a control unit for vehicle self-diagnosis by visual inspection are described.

BACKGROUND

A regular check of the vehicle's health is important from a traffic safety point of view, but also for keeping the vehicle in an operable state. An example may be to walk around the vehicle with the engine running and the lights on, before setting off, in order to check that the lights are working, that no lamp glass is broken or cracked (which may affect the light distribution), that the rear view mirrors are intact and clean, that there are no oil spots under the vehicle, etc.

Fortunately, many vehicles have electrical tests for detecting e.g. when a lamp is broken, and presenting a visual indication to the driver. However, none of the other above-mentioned defects can be detected by such known electrical tests. Further, in case the display of the vehicle (where the visual indication is presented) is malfunctioning, the driver may not notice the emitted visual indication.

The vehicle as herein discussed may comprise a means for transportation in a broad sense, such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance.

Due to lack of time and/or interest, many vehicle drivers may not check their vehicle on a regular basis, at least not as thoroughly as may be desired.

Further, some vehicles may be unmanned, so-called autonomous vehicles. Thus there is no driver present to make any check at all concerning the condition of the vehicle.
In the case of a negligent driver and/or an unmanned vehicle, when the vehicle is finally taken to a workshop, it would be a great help for the vehicle mechanic and/or the vehicle manufacturer to know, for example, how long the error has existed, how it has emerged, whether there are any consequential errors, etc. The negligent driver may be of little help to the mechanic in such cases, if no regular vehicle inspections are performed.

Another problem concerns digital maps used in a navigator of the vehicle. Such maps are often based on collected data which may be obsolete due to road reconstructions. This is dangerous for a driver who trusts the navigator blindly. For an autonomous vehicle, a trustworthy digital map is crucial for successful driving; however, updating digital maps requires an extensive work effort.

It may be easier to detect various errors of another vehicle while driving. However, there is no convenient way of informing another vehicle's driver of an anomaly on his/her vehicle. Repeated honking is an option, which, however, is likely to be misinterpreted by the other vehicle's driver. Looking up the vehicle registration plate in the vehicle register (if at all available), searching for the vehicle owner's cell phone number (if public) and calling him/her is an option which is not only illegal in many jurisdictions, but also rather hazardous, as it requires the driver to put his/her attention into various internet searching and user equipment manipulation activities.

Document US20050062615 describes a system for driver assistance of a vehicle, including a self-diagnosis of camera sensors of a vehicle having overlapping fields of detection. By comparing the overlapping regions of two or more cameras, it may be determined whether one of them is defective. However, the document does not discuss how any other anomaly or defect of any vehicle part may be detected.
Document US7991583 presents a method for detecting errors in a component of an Advanced Driver Assistance System (ADAS), by measuring electrical currents and detecting anomalies from an expected value. Based on the detected anomaly and a comparison with a model, the error may be localised. However, errors or anomalies besides electrical errors of the vehicle ADAS cannot be detected.

Document US20100066526 also illustrates error detection in a vehicle ADAS system, similar to the previously discussed system, and unfortunately having the same disadvantages concerning detection of other anomalies of the vehicle.

Document US20060170427 presents a method for cycling vehicle lamps on and off to allow direct sight inspection by one person of the operability of the lamp bulbs during a vehicle walkaround. Thereby the problem of moving back and forth between a position outside the vehicle where the respective lamp can be seen and the lamp switches at the driving seat is avoided. However, the method does not provide any error detection at all in case the driver does not make the visual inspection him/herself.

It would thus be desirable to improve error detection in a vehicle.

SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and to increase traffic safety by improving anomaly detection.

According to a first aspect of the invention, this objective is achieved by a method in a vehicle for vehicle diagnosis by visual inspection. The method comprises comparing an image of at least a part of the vehicle with a previously stored image depicting a normal state of the vehicle. The method further comprises detecting an anomaly in the image, in comparison with the previously stored image. In addition, the method also comprises executing a measure in order to at least reduce the impact of the detected anomaly.

According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle.
The control unit aims at providing vehicle self-diagnosis by visual inspection. The control unit is configured to compare an image of at least a part of the vehicle with a previously stored image depicting a normal state of the vehicle. Furthermore, the control unit is configured to detect an anomaly in the image, in comparison with the previously stored image. Also, the control unit is configured to execute a measure in order to at least reduce the impact of the detected anomaly.

Thanks to the described aspects, by capturing images with various on-board sensors and comparing them with a previously stored image covering at least partly the same part of the vehicle, or of the vehicle's surroundings, an anomaly such as a malfunctioning vehicle part can be detected, also when there is no driver present in the vehicle, or at least no observant driver present. Thereby various problems that may occur on a vehicle, which normally are detected by an attentive driver, may be detected, and the vehicle driver/owner may be informed. Also, various errors in digital maps may be detected and continuously updated in some embodiments, leading to more reliable digital maps, which is crucial for automated vehicle navigation, e.g. in autonomous vehicles. Thus increased traffic safety is achieved.

Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1 illustrates a vehicle according to an embodiment of the invention;
Figure 2A illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 2B illustrates an example of an anomaly detection in an image, according to an embodiment of the invention;
Figure 3A illustrates an example of a vehicle interior, according to an embodiment of the invention;
Figure 3B illustrates an example of a vehicle interior, according to an embodiment of the invention;
Figure 3C illustrates an example of a vehicle interior, according to an embodiment of the invention;
Figure 3D illustrates an example of a vehicle interior, according to an embodiment of the invention;
Figure 3E illustrates an example of a vehicle interior, according to an embodiment of the invention;
Figure 4 is a flow chart illustrating an embodiment of the method;
Figure 5 is an illustration depicting a system according to an embodiment.

DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.

Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims.
Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.

Figure 1 illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.

The vehicle 100 may comprise a means for transportation in a broad sense, such as e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, an aircraft, a watercraft, a cable transport, an aerial tramway, an elevator, a drone, a spacecraft, or other similar manned or unmanned means of conveyance.

The vehicle 100 may be driver-controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.

The vehicle 100 comprises at least one sensor, typically a plurality of sensors. Such sensors may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments.

In some embodiments, the sensors may comprise e.g. a motion detector and/or be based on a Passive Infrared (PIR) sensor, sensitive to a person's skin temperature through emitted black-body radiation at mid-infrared wavelengths, in contrast to background objects at room temperature; or emit a continuous wave of microwave radiation and detect motion through the principle of Doppler radar; or emit an ultrasonic wave and detect and analyse the reflections; or comprise a tomographic motion detection system based on detection of radio wave disturbances, to mention some possible implementations.

The sensors may be of the same or different types in various embodiments, and may be situated in the vehicle 100 primarily for various other functionalities, such as e.g. an Advanced Driver Assistance System (ADAS), rather than to provide vehicle self-diagnosis.
Such sensors may be situated in the vehicle 100 and be directed out from the vehicle 100, e.g. for detecting an obstacle in front of the vehicle 100. However, a part of the image, video sequence or other information captured by the sensor will cover the own vehicle 100.

By using the on-board sensors to collect image, video sequence or other sensor information concerning the own vehicle 100, and analysing this information, e.g. by comparing it with a stored ideal image, video sequence or other sensor information of the vehicle 100, or a part thereof, an anomaly which may result from a broken component on the vehicle 100 can be detected. This solution acts like a second pair of eyes (or the only pair of eyes in the case of an autonomous vehicle), constantly looking for anomalies which could indicate a failure on the vehicle 100.

Some arbitrary examples of such anomalies detected on the vehicle 100 may be: changes in light distribution, implying broken or misfitted lamps, dirty or broken lamp glass, etc.; lack of direction indicator light when using the direction indicator, implying dirty or broken indicator lamps; changes in cab roll, yaw or pitch angle, implying broken cab suspension; detection of errors in the high beam area, which may trigger adjustment of the light; detection of errors in the Adaptive Main Beam function (i.e. that the high beam is turned down when meeting another vehicle or traffic user); detection of unsecured cargo; and error detection for the rain sensor.

Further, error detection of other vehicles may be made, e.g. of malfunctioning lamps etc. The other vehicle is then informed via wireless communication. Also, error detection may be made of map data not consistent with the surroundings, such as e.g. a wrong number of lanes on the road.

By using existing sensors, anomalies and problems may be detected automatically and in some cases fixed by online adjustments.
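The core comparison step described above can be sketched as follows. This is a minimal illustration using hypothetical pixel-difference thresholding; the document does not prescribe a particular comparison algorithm, and the function name, the grid representation of the images, and the threshold value are assumptions made for this example only.

```python
# Minimal sketch of the image-comparison step: a captured image is compared
# pixel-wise with a previously stored reference image, and positions whose
# difference exceeds a threshold are flagged as anomalies. The threshold of
# 30 is an illustrative assumption, not a value from the document.

def detect_anomaly(reference, captured, threshold=30):
    """Return a list of (row, col) positions where the images differ noticeably.

    `reference` and `captured` are equally sized 2D lists of grayscale
    pixel values (0-255).
    """
    anomalies = []
    for r, (ref_row, cap_row) in enumerate(zip(reference, captured)):
        for c, (ref_px, cap_px) in enumerate(zip(ref_row, cap_row)):
            if abs(ref_px - cap_px) > threshold:
                anomalies.append((r, c))
    return anomalies

# Stored "normal state" image (time t1) vs. a newly captured image (time t2)
# with one deviating region, e.g. a crack in a head lamp glass.
stored = [[100, 100, 100],
          [100, 100, 100]]
captured = [[100, 100, 100],
            [100, 200, 100]]

print(detect_anomaly(stored, captured))  # [(1, 1)]
```

In practice such a comparison would operate on real camera frames and would need to tolerate lighting and viewpoint changes, which is one motivation for the learned (deep learning) recognition discussed later in the text.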
For example, if the light distribution for the Adaptive Main Beam is incorrect, the sensor-based self-diagnosis will enable the function to self-adjust, according to an embodiment.

Another example may be, in case the vehicle body panels, vehicle glass and/or sensor lenses are dirty, that a recommendation is made to drive to a car wash, in some embodiments.

The sensors, or a subset of the on-board sensors, may be situated in the cab, directed towards the driver, in the cargo space and/or in the passenger compartment (e.g. in the case of a bus or another mass transportation vehicle). Thereby, for example, a fire in the passenger compartment, displaced cargo in the cargo space, or a driver having a heart attack may be detected, and appropriate measures may be taken, starting with slowing down and parking the vehicle 100.

Examples of anomalies and functions will later be discussed in more detail; however, firstly some examples of sensors of the vehicle 100 will be presented.

The vehicle 100 comprises at least one sensor, such as e.g. a forwardly directed sensor 110, in some embodiments. In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed sensor 110 may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.

Mounting the forwardly directed sensor 110 behind the windshield has some advantages compared to externally mounted camera systems. These advantages include the possibility to use the windshield wipers for cleaning and to use the light from the headlights to illuminate objects in the camera's field of view. It is also protected from dirt, snow and rain, and to some extent also from damage, vandalism and/or theft. Such a sensor 110 may also be used for a variety of other tasks, such as detecting an in-front object or vehicle, detecting road signs, detecting road lane markings, estimating the distance to an in-front vehicle, etc.
The sensor 110 may be directed towards the front of the vehicle 100, in the driving direction 105. The sensor 110 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments.

Further, the vehicle 100 may comprise one or two side view sensors 120. The side view sensors 120 may be situated at the left/right sides of the vehicle 100 (as regarded in the driving direction 105), arranged to detect objects at the respective side of the vehicle 100. The side view sensor 120 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or a similar device, in different embodiments.

Instead of using traditional rear view mirrors on the vehicle 100, side view sensors 120 may be utilised in combination with one or more devices intended to display objects outside the driver's direct field of vision. Such a presentational device may comprise e.g. a display, a projector, a Head-Up Display, a transparent display being part of the windshield, intelligent glasses of the driver, etc., which output an image, or a stream of images, captured by a corresponding sensor 110, 120. Typically, the sensor 120 on the left side of the vehicle 100 may be associated with a presentational device on the left side of the cabin, while the sensor on the right side of the vehicle 100 may be associated with a presentational device on the right side of the cabin, even if other combinations are possible.

The sensors 110, 120 may be turned and/or re-directed in different directions, and the devices intended to display objects outside the driver's direct field of vision may present the adjusted view of the associated sensor 110, 120.

Further, in some embodiments, a detected object around the vehicle 100 may be indicated on an overview (bird's-eye view) presentation, e.g.
on a display in the cabin, or in any of the presentational devices.

Figure 2A schematically illustrates a scenario similar to the previously discussed scenario illustrated in Figure 1, but with the vehicle 100 seen from an above perspective, wherein a plurality of sensors 110, 120, 130, 140 are depicted.

The reverse sensor 130 may be utilised for detecting objects behind the vehicle 100, and the left side sensor 140 may function and detect objects etc. in much the same or a similar manner as has been described above, in their respective directions.

The sensors 110, 120, 130, 140 comprise, or are connected to, a control unit configured for image recognition/computer vision and object recognition.

Computer vision is a technical field comprising methods for acquiring, processing, analysing, and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics, and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.

The image data of the sensors 110, 120, 130, 140 may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.

Computer vision may comprise e.g.
scene reconstruction, event detection, video tracking, object recognition, object pose estimation, learning, indexing, motion estimation, and image restoration, just to mention some examples.

Figure 2B illustrates an example of vehicle diagnosis by visual inspection according to an embodiment.

A presentational device 200 may be used for presenting a previously stored image 210, e.g. at a first period of time t1. The previously stored image 210 may depict an original or ideal image of a part of the vehicle 100, or the state when the owner configures the stored image 210 at a first period of time.

The sensors 110, 120, 130, 140 may then, continuously or at predetermined or configurable time intervals, collect information, e.g. by capturing an image 220 at a second period of time t2. In the illustrated scenario, an anomaly 230 is detected in the image 220, in comparison with the previously stored image 210. In this case the anomaly 230 is a crack in a glass, e.g. of the windshield or the head light glass of the vehicle 100. However, the anomaly 230 may comprise e.g. a broken or lost vehicle part; holes, scars or stains in the vehicle paint; malfunctioning vehicle instruments, etc.

In some embodiments, the anomaly 230 may be recognised in the image 220 by deep learning (sometimes also referred to as deep structured learning, hierarchical learning and/or deep machine learning): a branch of machine learning based on a set of algorithms that attempt to model high-level abstractions in data by using multiple processing layers with complex structures, or otherwise composed of multiple non-linear transformations. Deep learning is based on learning representations of data. An observation (e.g., the image 220) may be represented in many ways, such as a vector of intensity values per pixel, or in a more abstract way as a set of edges, regions of particular shape, etc.

Deep learning typically uses a cascade of many layers of nonlinear processing units for feature extraction and transformation.
Each successive layer uses the output from the previous layer as input. The algorithms may be supervised or unsupervised, and applications may comprise pattern analysis (unsupervised) and classification (supervised). Further, deep learning may be based on the (unsupervised) learning of multiple levels of features or representations of the data. Higher-level features may be derived from lower-level features to form a hierarchical representation. By deep learning, multiple levels of representations that correspond to different levels of abstraction are learned; the levels form a hierarchy of concepts. The composition of the layers of nonlinear processing units used in a deep learning algorithm depends on the problem to be solved, i.e. recognising the anomaly 230.

In the illustrated embodiment, the presentational device 200 is a mobile device of the vehicle driver, vehicle owner or other person responsible for the vehicle, who may not necessarily be situated in the vehicle 100.

Figure 3A illustrates an example of a vehicle interior of the vehicle 100 and depicts how the previous scenario in Figure 2B may be perceived by the driver of the vehicle 100, or a passenger, as the case may be when the vehicle 100 is autonomous.

The vehicle 100 comprises a control unit 310 for vehicle diagnosis by visual inspection, by collecting information from the sensors 110, 120, 130, 140 of the vehicle 100 and comparing a later captured image 220 with a previously stored image 210.

The previously stored image 210 may be stored in a data storage device/database 315.

The control unit 310 may communicate with the other vehicle-internal units, such as the sensors 110, 120, 130, 140, via e.g. a communication bus. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the datalink may alternatively be made over a wireless connection comprising, or at least inspired by, any wireless communication technology such as e.g.
Wi-Fi, Bluetooth, etc.

In the illustrated embodiment, a message is displayed on the presentational device 200, for making the driver attentive to the observed and detected anomaly 230. In some embodiments, a recommendation may also be presented to the driver/owner concerning what to do. In the case of a detected crack in the windshield, the driver/owner may be advised to cover the crack from the outside with a piece of transparent tape and then drive to a glass repair workshop. Via a wireless internet connection, a search may be made for such workshops close to the geographical position of the vehicle 100, and/or in the driving direction 105 of the vehicle 100, and a recommendation may be made based on e.g. a price comparison, an instant service availability check and/or the satisfaction of previous customers, if such information is available.

Figure 3B illustrates yet another example of a vehicle interior of the vehicle 100 and depicts a scenario wherein a broken head lamp glass may be perceived from inside an autonomous vehicle 100, and by a vehicle owner (or other person being responsible for the vehicle 100) via a presentational device 200 at a distance from the vehicle 100.

The vehicle 100, besides the already presented control unit 310, data storage device 315 and sensors 110, 120, 130, 140, comprises a wireless transmitter or transceiver 320. The transmitter 320 may communicate wirelessly with the presentational device 200 of the vehicle owner.

Communication may be made over a wireless communication interface, such as e.g. Vehicle-to-Vehicle (V2V) communication, or Vehicle-to-Structure (V2X) communication.

In some embodiments, the communication between the transmitter 320 and the presentational device 200 may be performed via V2V communication, e.g. based on Dedicated Short-Range Communications (DSRC) devices. DSRC works in the 5.9 GHz band with a bandwidth of 75 MHz and an approximate range of 1000 m, in some embodiments.
The wireless communication may be made according to any IEEE standard for wireless vehicular communication, like e.g. a special mode of operation of IEEE 802.11 for vehicular networks called Wireless Access in Vehicular Environments (WAVE). IEEE 802.11p is an extension to the 802.11 Wireless LAN medium access layer (MAC) and physical layer (PHY) specification.

Such a wireless communication interface may comprise, or at least be inspired by, wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), or optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communications in some embodiments.

The communication may alternatively be made over a wireless interface comprising, or at least being inspired by, radio access technologies such as e.g. 3GPP LTE, LTE-Advanced, E-UTRAN, UMTS, GSM, GSM/EDGE, WCDMA, Time Division Multiple Access (TDMA) networks, Frequency Division Multiple Access (FDMA) networks, Orthogonal FDMA (OFDMA) networks, Single-Carrier FDMA (SC-FDMA) networks, Worldwide Interoperability for Microwave Access (WiMax), Ultra Mobile Broadband (UMB), High Speed Packet Access (HSPA), Evolved Universal Terrestrial Radio Access (E-UTRA), Universal Terrestrial Radio Access (UTRA), GSM EDGE Radio Access Network (GERAN), 3GPP2 CDMA technologies, e.g. CDMA2000 1x RTT and High Rate Packet Data (HRPD), or similar, just to mention some few options, via a wireless communication network.

In the illustrated embodiment, the image comparison has resulted in detecting that the vehicle 100 has a broken head lamp glass on the left side. The detected anomaly 230 thus results from a malfunctioning physical vehicle part.
As driving with malfunctioning headlights is illegal (in at least some jurisdictions), the vehicle 100 has been parked at the roadside, and the geographical position of the vehicle 100 is sent to the presentational device 200.

Thereby, the owner/corresponding responsible person is informed about the situation and may take appropriate measures, such as bringing the correct spare parts and tools, informing a transportation receiver (if any) about the delay, and driving to the vehicle 100 and repairing it.

The geographical position of the vehicle 100 may be determined by a positioning unit 330 in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.

The determination of the geographical position of the positioning unit 330 (and thereby also of the vehicle 100) may be made continuously, with a certain predetermined or configurable time interval, according to various embodiments.

Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 340-1, 340-2, 340-3, 340-4. In this example, four satellites 340-1, 340-2, 340-3, 340-4 are depicted, but this is merely an example. More than four satellites 340-1, 340-2, 340-3, 340-4 may be used for enhancing the precision, or for creating redundancy. The satellites 340-1, 340-2, 340-3, 340-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 340-1, 340-2, 340-3, 340-4 is broadcasting), status, and where the satellite 340-1, 340-2, 340-3, 340-4 is situated at any given time. The GPS satellites 340-1, 340-2, 340-3, 340-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA).
This allows information from an individual satellite 340-1, 340-2, 340-3, 340-4 to be distinguished from the others' information, based on a unique code for each respective satellite 340-1, 340-2, 340-3, 340-4. This information can then be transmitted to be received by an appropriately adapted positioning device comprised in the vehicle 100.

Distance measurement can, according to some embodiments, comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 340-1, 340-2, 340-3, 340-4 to reach the positioning unit 330. As the radio signals travel at the speed of light, the distance to the respective satellite 340-1, 340-2, 340-3, 340-4 may be computed by measuring the signal propagation time.

The positions of the satellites 340-1, 340-2, 340-3, 340-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 340-1, 340-2, 340-3, 340-4 through triangulation. For determination of altitude, signals from four satellites 340-1, 340-2, 340-3, 340-4 may be used, according to some embodiments.

The geographical position of the vehicle 100 may alternatively be determined, e.g., by having transponders positioned at known positions around the route of the vehicle 100 and a dedicated sensor in the vehicle 100 for recognising the transponders and thereby determining the position; by detecting and recognising WiFi networks (WiFi networks along the route may be mapped with certain respective geographical positions in a database); by receiving a Bluetooth beaconing signal associated with a geographical position; or by other signal signatures of wireless signals, such as e.g. triangulation of signals emitted by a plurality of fixed base stations with known geographical positions.
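The distance-from-propagation-time and triangulation principles described above can be sketched as follows. This is a simplified two-dimensional illustration under stated assumptions: real GNSS receivers additionally solve for altitude and receiver clock bias, and the function names and example coordinates are invented for the example.

```python
# Sketch of satellite-based positioning: distance is derived from the signal
# propagation time (radio signals travel at the speed of light), and the
# position is then obtained from distances to several known transmitter
# positions. Simplified to 2D with perfectly synchronised clocks.
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_propagation(dt_seconds):
    """Distance to a transmitter from the measured signal propagation time."""
    return C * dt_seconds

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three known positions and measured ranges.

    Subtracting the circle equations pairwise linearises the problem into
    two linear equations in x and y, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # non-zero for non-collinear positions
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Example: a receiver at (3, 4) and three transmitters at known positions.
stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [math.hypot(3 - sx, 4 - sy) for sx, sy in stations]
print(trilaterate_2d(stations[0], ranges[0],
                     stations[1], ranges[1],
                     stations[2], ranges[2]))  # approximately (3.0, 4.0)
```

The same linearisation extends to three dimensions with a fourth range measurement, which is why the text notes that four satellites are used when the altitude is also to be determined.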
The position may alternatively be entered by a passenger in the vehicle 100.

Having determined the geographical position of the positioning unit 330 (or in another way), it may be presented on the presentational device 200, e.g. on a map where the position of the vehicle 100 may be marked, in some embodiments.

Figure 3C illustrates yet another example of a vehicle interior of the vehicle 100 and depicts a scenario wherein the anomaly 230 comprises a deviation from an expected condition of the vehicle 100 under current driving conditions.

In this example, it is presumed that the vehicle 100 comprises a rain sensor which activates the wipers when it is raining. In the image 220, rain is detected but the wipers are not active, as may be detected by the forwardly directed sensor 110. The reason may be that the rain sensor is defective, or alternatively that the wiper engine is not working properly.

Such a discovered anomaly 230 may be presented to the driver on the presentational device 200, possibly together with information concerning which measures to take.

Other examples of anomalies 230 comprising a deviation from an expected condition of the vehicle 100 under current driving conditions may be that a door of the vehicle 100 is open while driving; a crane is in an upright position while driving; a pair of glasses (or another arbitrary object) has been left on the vehicle roof when the drive is about to commence; a piece of the driver's clothes has been jammed in the door; a piece of a passenger's clothes has been jammed in the door when exiting (e.g. in a mass transportation vehicle); the vehicle doors are unlocked when driving into a harsh suburban neighbourhood frequently visited by carjackers; a person under 18 years old (or a non-authorised person) is trying to start the vehicle 100, etc.
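Checks for deviations from an expected vehicle condition, such as the rain-sensor example above, can be sketched as a set of plausibility rules evaluated against the current vehicle state. The rule set, the state fields and the message texts below are illustrative assumptions; the document does not specify how such checks are implemented.

```python
# Sketch of rule-based plausibility checking: each rule names a combination
# of sensed conditions that should not occur together. Rules and state fields
# are invented examples matching scenarios mentioned in the text.

RULES = [
    (lambda s: s["rain_detected"] and not s["wipers_active"],
     "rain detected but wipers inactive (rain sensor or wiper motor defect?)"),
    (lambda s: s["speed_kmh"] > 0 and s["door_open"],
     "door open while driving"),
    (lambda s: s["speed_kmh"] > 0 and s["crane_upright"],
     "crane in upright position while driving"),
]

def check_plausibility(state):
    """Return descriptions of all expected-condition deviations in `state`."""
    return [message for rule, message in RULES if rule(state)]

state = {"rain_detected": True, "wipers_active": False,
         "speed_kmh": 80.0, "door_open": False, "crane_upright": False}
print(check_plausibility(state))
# ['rain detected but wipers inactive (rain sensor or wiper motor defect?)']
```

Each triggered message could then be routed to the presentational device 200, analogously to the recommendations described above.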
Figure 3D illustrates another example of a vehicle interior of the vehicle 100 and depicts a scenario wherein the anomaly 230 comprises a plausibility deviation of information in a digital map, based on at least a part of the image 220 depicting vehicle surroundings.

According to some embodiments, the image 220 may be compared with a map state. Such a comparison may result in detection of a different number of lanes in reality than according to the map data, due to a recent roadwork etc., or simply an error in the stored map data. This is merely an arbitrary example of such a possible deviation between the reality as captured by the vehicle sensors 110, 120, 130, 140 and the stored map data. Other examples may be a new road/entrance/exit; a new speed limit on an existing road; that a road has been changed into unidirectional, etc. The sensors 110, 120, 130, 140 may capture and collect information from the road, traffic signs etc. continuously while driving, and a comparison may be made with map data and information associated with the geographical position of the vehicle 100, such as speed limit and other restrictions.

In case an anomaly 230 between the reality and the stored map data is discovered, an alert may be outputted for informing a person responsible for the digital map concerning the detected anomaly 230. Such an alert may also be provided to the driver, if any, on the presentational device 200.

By receiving information continuously from a plurality of vehicles 100 in various traffic situations, map data may be updated in a convenient manner. For driver assistance systems and (to an even larger extent) autonomous vehicles, it is important that map data can be trusted for navigation, as there may not be any driver present to notice and react on deviations from the map data.

In some embodiments, the on-board sensors 110, 120, 130, 140 may detect indications of an accident, or a hazardous situation on, or in the vicinity of, the road, such as e.g.
a stationary vehicle on a highway; a reversing vehicle on a highway; a vehicle driving against the driving direction on a road; a vehicle driving in a pedestrian zone or bicycle path; people or animals lying on the road, etc. In such a case, besides slowing down and/or stopping the own vehicle 100, information concerning the detected accident indication may be sent to a police department, a traffic surveillance centre, an emergency centre or a similar entity.

Figure 3E illustrates another example of a vehicle interior of the vehicle 100, comprising all, or at least some, of the previously discussed components. However, in the illustrated example, the anomaly 230 comprises a deviation from an expected condition of another vehicle 350 under current driving conditions.

Such an anomaly 230 may comprise any of the previously mentioned possible anomalies 230 for the own vehicle 100, such as e.g. that the left rear position lamp 370 is not working. In such a case, an alert may be outputted for informing a person responsible for the other vehicle 350 concerning the detected anomaly 230, e.g. via a wireless signal transmitted by the transmitter 320 according to any of the previously mentioned wireless communication technologies. The other vehicle 350 may comprise a receiver 360 for receiving such transmitted information.

Thereby, the driver/owner of the other vehicle 350 may be informed about anomalies 230 of his/her vehicle 350 which may otherwise be difficult for the driver (if any) of the vehicle 350 to notice, such as various flaws or defects, i.e. anomalies 230, on the back side of the other vehicle 350, for example.

Thus various potentially dangerous traffic situations due to errors or anomalies 230 of the other vehicle 350 may be detected and rectified, leading to a safer traffic environment.

Figure 4 illustrates an example of a method 400 according to an embodiment. The flow chart in Figure 4 shows the method 400 for use in a vehicle 100.
The method 400 aims at providing vehicle diagnosis by visual inspection.

The vehicle 100 may be e.g. a truck, a bus, a car, or a similar means of conveyance as previously mentioned, autonomous or non-autonomous.

The vehicle 100 may comprise a plurality of sensors 110, 120, 130, 140, pointable in various different directions around the vehicle 100 and having a respective surveillance area which at least partly covers a part of the own vehicle 100.

In order to be able to correctly make a visual inspection via the sensors 110, 120, 130, 140, for providing vehicle diagnosis, the method 400 may comprise a number of steps 401-404. However, some of these steps 401-404 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as e.g. steps 403-404. Further, the described steps 401-404 may be performed in a somewhat different chronological order than the numbering suggests. The method 400 may comprise the subsequent steps:

Step 401 comprises comparing an image 220 of at least a part of the vehicle 100 with a previously stored image 210, depicting a normal state of the vehicle 100.

Step 402 comprises detecting an anomaly 230 in the image 220, in comparison with the previously stored image 210.

The anomaly 230 may result from a malfunctioning physical vehicle part in some embodiments. Further, in some embodiments, the anomaly 230 may comprise a deviation from an expected condition of the vehicle 100 under current driving conditions.

Furthermore, the anomaly 230 may comprise a plausibility deviation of information in a digital map, based on at least a part of the image 220 depicting vehicle surroundings, in some embodiments.

According to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of another vehicle 350 under current driving conditions.

The anomaly 230 may be detected based on image recognition.
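Steps 401-402 above (comparing the current image with a stored reference and detecting an anomaly) can be sketched as a pixel-difference check. This is a deliberately minimal sketch under stated assumptions: flattened grayscale frames as plain lists, and a simple fraction-of-changed-pixels criterion; the original text does not specify the comparison algorithm beyond "image recognition".

```python
# Illustrative sketch of steps 401-402: compare pixel values of the current
# image with the stored reference image, and flag an anomaly when enough
# pixels differ. Thresholds are assumed example values.

def detect_anomaly(current, reference, threshold=30, min_fraction=0.01):
    """Return True when more than min_fraction of pixels deviate from the
    stored reference image by more than threshold intensity levels."""
    total = len(current)
    changed = sum(1 for c, r in zip(current, reference) if abs(c - r) > threshold)
    return changed / total > min_fraction

# Flattened 8x8 grayscale frames; a bright patch simulates a visible defect
# such as a cracked lamp glass.
reference = [0] * 64
current = [255] * 16 + [0] * 48
print(detect_anomaly(current, reference))  # True: 25% of pixels differ
```

A production system would of course align the images and use more robust recognition, but the comparison/detection split mirrors steps 401 and 402.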
Step 403 comprises executing a measurement in order to eliminate, or at least reduce, the impact of the detected 402 anomaly 230.

The executed measurement comprises outputting an alert for informing a person responsible for the vehicle 100 concerning the detected 402 anomaly 230.

The executed measurement may in some embodiments comprise outputting an alert for informing a person responsible for the digital map concerning the detected 402 anomaly 230.

Further, the executed measurement may comprise outputting an alert for informing a person responsible for the other vehicle 350 concerning the detected 402 anomaly 230, according to some embodiments.

Step 404, which may be performed only in some particular embodiments, comprises storing images 220 of the detected anomaly 230 in a vehicle inspection file.

The vehicle inspection file may be situated in the data storage device 315, or at any other convenient location, in different embodiments. Thus a vehicle inspection file is provided, recording various errors and/or defects of the vehicle 100.

Thereby, information may be saved and provided to a mechanic, who thanks to the vehicle inspection file may understand what has occurred with the vehicle 100 and what problems have occurred.

Figure 5 illustrates an embodiment of a system 500 in a vehicle 100 for vehicle self-diagnosis by visual inspection. The system 500 may perform at least some of the previously described steps 401-404 according to the method 400 described above and illustrated in Figure 4.

The system 500 comprises at least one control unit 310 in the vehicle 100, for vehicle self-diagnosis by visual inspection. The control unit 310 is configured to compare an image 220 of at least a part of the vehicle 100 with a previously stored image 210, depicting a normal state of the vehicle 100. Further, the control unit 310 is configured to detect an anomaly 230 in the image 220, in comparison with the previously stored image 210.
Also, the control unit 310 is configured to execute a measurement in order to at least reduce the impact of the detected anomaly 230.

Further, the control unit 310 may be configured to store images of the detected anomaly 230 in a vehicle inspection file in a data storage device 315.

Further, in some embodiments, the control unit 310 may be configured to execute the measurement by outputting an alert for informing a person responsible for the vehicle 100 concerning the detected anomaly 230.

The anomaly 230 may result from a malfunctioning physical vehicle part in some embodiments. Further, according to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of the vehicle 100 under current driving conditions. In some particular embodiments, the anomaly 230 may comprise a plausibility deviation of information in a digital map, based on at least a part of the image 220 depicting vehicle surroundings, in which case the executed measurement comprises outputting an alert for informing a person responsible for the digital map concerning the detected anomaly 230. In addition, according to some embodiments, the anomaly 230 may comprise a deviation from an expected condition of another vehicle 350 under current driving conditions, in which case the executed measurement comprises outputting an alert for informing a person responsible for the other vehicle 350 concerning the detected anomaly 230.

Further, the system 500 also comprises at least one sensor 110, 120, 130, 140 of the vehicle 100, for capturing an image 210, 220 of at least a part of the vehicle 100.

The system 500 in addition comprises a data storage device 315 for storing an image 210 depicting a normal state of the vehicle 100.

The control unit 310 comprises a receiving circuit 510 configured for receiving a signal from the sensors 110, 120, 130, 140 and/or from the data storage device 315.
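The inspection-file storage described above (step 404, images of detected anomalies saved for a mechanic) can be sketched as an append-only log. The record fields and the JSON-lines file format are illustrative assumptions; the text only specifies that images of the anomaly are stored in a vehicle inspection file.

```python
# Illustrative sketch of storing anomaly records in a vehicle inspection
# file. Field names and the JSON-lines format are assumptions.
import json
import tempfile
from datetime import datetime, timezone

def store_inspection_record(path, anomaly, image_ref):
    """Append one detected anomaly to the vehicle inspection file, so a
    mechanic can later see what occurred with the vehicle."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "anomaly": anomaly,
        "image": image_ref,
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example: log one anomaly to a temporary inspection file and read it back.
path = tempfile.mktemp(suffix=".jsonl")
store_inspection_record(path, "left rear position lamp not working", "img_0042.png")
with open(path) as f:
    print(json.loads(f.readline())["anomaly"])
```

An append-only, timestamped format fits the stated purpose: the data storage device 315 accumulates a history of defects that a workshop can replay later.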
Further, the control unit 310 comprises a processor 520 configured for performing at least some steps 401-404 of the above described method 400, according to some embodiments.

Such a processor 520 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression "processor" may thus represent processing circuitry comprising a plurality of processing circuits, such as e.g. any, some or all of the ones enumerated above.

Furthermore, the control unit 310 may comprise a memory 525 in some embodiments. The optional memory 525 may comprise a physical device utilised to store data or programs, i.e. sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 525 may comprise integrated circuits comprising silicon-based transistors. The memory 525 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data, such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc., in different embodiments.

Further, the control unit 310 may comprise a signal transmitter 530 in some embodiments. The signal transmitter 530 may be configured for transmitting a signal to e.g. the presentational device 200 and/or the transmitter 320.

The above described steps 401-404 to be performed in the vehicle 100 may be implemented through the one or more processors 520 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 401-404.
Thus a computer program product, comprising instructions for performing the steps 401-404 in the control unit 310, may perform the method 400 comprising at least some of the steps 401-404 for vehicle diagnosis by visual inspection, when the computer program is loaded into the one or more processors 520 of the control unit 310.

Further, some embodiments of the invention may comprise a vehicle 100 comprising the control unit 310, for vehicle diagnosis by visual inspection, according to at least some of the steps 401-404.

The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 401-404, according to some embodiments, when being loaded into the one or more processors 520 of the control unit 310. The data carrier may be e.g. a hard disk, a CD-ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium, such as a disk or tape, that may hold machine-readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g. over an Internet or an intranet connection.

The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 400, the control unit 310, the computer program, the system 500 and/or the vehicle 100. Various changes, substitutions and/or alterations may be made without departing from invention embodiments as defined by the appended claims.

As used herein, the term "and/or" comprises any and all combinations of one or more of the associated listed items. The term "or", as used herein, is to be interpreted as a mathematical OR, i.e. as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise.
In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components and/or groups thereof. A single unit, such as e.g. a processor, may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless communication systems.
Claims (16)

[1] 1. A method (400) in a vehicle (100) for vehicle diagnosis by visual inspection, comprising: comparing (401) an image (220) of at least a part of the vehicle (100) with a previously stored image (210), depicting a normal state of the vehicle (100); detecting (402) an anomaly (230) in the image (220), in comparison with the previously stored image (210); and executing (403) a measurement in order to at least reduce the impact of the detected (402) anomaly (230).

[2] 2. The method (400) according to claim 1, wherein the executed measurement comprises outputting an alert for informing a person responsible for the vehicle (100) concerning the detected (402) anomaly (230).

[3] 3. The method (400) according to any of claim 1 or claim 2, wherein the anomaly (230) results from a malfunctioning physical vehicle part.

[4] 4. The method (400) according to any of claims 1-3, wherein the anomaly (230) comprises a deviation from an expected condition of the vehicle (100), under current driving conditions.

[5] 5. The method (400) according to any of claims 1-4, wherein the anomaly (230) comprises a plausibility deviation of information in a digital map, based on at least a part of the image (220), depicting vehicle surroundings, and wherein the executed (403) measurement comprises outputting an alert for informing a person responsible for the digital map concerning the detected (402) anomaly (230).

[6] 6. The method (400) according to any of claims 1-5, wherein the anomaly (230) comprises a deviation from an expected condition of another vehicle (350), under current driving conditions, and wherein the executed (403) measurement comprises outputting an alert for informing a person responsible for the other vehicle (350) concerning the detected (402) anomaly (230).

[7] 7. The method (400) according to any of claims 1-6, further comprising: storing (404) images (220) of the detected anomaly (230) in a vehicle inspection file.

[8] 8.
A control unit (310) in a vehicle (100), for vehicle self-diagnosis by visual inspection, configured to: compare an image (220) of at least a part of the vehicle (100) with a previously stored image (210), depicting a normal state of the vehicle (100); detect an anomaly (230) in the image (220), in comparison with the previously stored image (210); and execute a measurement in order to at least reduce the impact of the detected anomaly (230).

[9] 9. The control unit (310) according to claim 8, wherein the executed measurement comprises outputting an alert for informing a person responsible for the vehicle (100) concerning the detected anomaly (230).

[10] 10. The control unit (310) according to any of claim 8 or claim 9, wherein the anomaly (230) results from a malfunctioning physical vehicle part.

[11] 11. The control unit (310) according to any of claims 8-10, wherein the anomaly (230) comprises a deviation from an expected condition of the vehicle (100), under current driving conditions.

[12] 12. The control unit (310) according to any of claims 8-11, wherein the anomaly (230) comprises a plausibility deviation of information in a digital map, based on at least a part of the image (220), depicting vehicle surroundings, and wherein the executed measurement comprises outputting an alert for informing a person responsible for the digital map concerning the detected anomaly (230).

[13] 13. The control unit (310) according to any of claims 8-12, wherein the anomaly (230) comprises a deviation from an expected condition of another vehicle (350), under current driving conditions, and wherein the executed measurement comprises outputting an alert for informing a person responsible for the other vehicle (350) concerning the detected anomaly (230).

[14] 14. The control unit (310) according to any of claims 8-13, further configured to store images of the detected anomaly (230) in a vehicle inspection file in a database (315).

[15] 15. A computer program comprising program code for performing a method (400) according to any of claims 1-7 when the computer program is executed in a processor in a control unit (310) according to any of claims 8-14.

[16] 16. A system (500) for vehicle self-diagnosis by visual inspection, which system (500) comprises: a control unit (310) according to claims 8-14; at least one sensor (110, 120, 130, 140) of the vehicle (100), for capturing an image (210, 220) of at least a part of the vehicle (100); and a data storage device (315) for storing an image (210) depicting a normal state of the vehicle (100).
Similar technologies:
公开号 | 公开日 | 专利标题 US11030881B1|2021-06-08|Systems and methods for reconstruction of a vehicular crash US10282999B2|2019-05-07|Road construction detection systems and methods EP3445626B1|2020-12-30|Method and control unit in a vehicle for estimating a stretch of a road based on a set of tracks of another vehicle US10332401B2|2019-06-25|Running vehicle alerting system and method WO2017160201A1|2017-09-21|Method and control unit for vehicle self-diagnosis US10807593B1|2020-10-20|Systems and methods for reconstruction of a vehicular crash CN113353073A|2021-09-07|Determining and responding to an internal state of a vehicle WO2016123424A1|2016-08-04|Remote accident monitoring and vehcile diagnostic distributed database TW201333896A|2013-08-16|Remote traffic management system using video radar GB2552092A|2018-01-10|Inspection system and method for automatic visual inspection of a motor vehicle SE1650327A1|2017-09-11|Method and system for theft detection in a vehicle US10504302B1|2019-12-10|360 degree vehicle camera accident monitoring system US20190360165A1|2019-11-28|Method and control unit for ground bearing capacity analysis SE1650608A1|2017-02-24|Method and control unit for a vehicle EP3679317A1|2020-07-15|Method and control arrangement for estimating vehicle dimensions CN107110653A|2017-08-29|The renewal of Vehicular navigation system based on fixed target SE542785C2|2020-07-07|Method and control arrangement for controlling an adas SE541984C2|2020-01-14|Methods and control arrangement for adaptive passenger comfort and safety functionality of a bus CN113525397A|2021-10-22|Method and control device for self-diagnosis of vehicle US10977503B2|2021-04-13|Fault isolation for perception systems in autonomous/active safety vehicles US20220024503A1|2022-01-27|Vehicle monitoring system SE540361C2|2018-08-07|Method, control unit and system for reducing false alerts SE1950992A1|2021-03-01|Method and control arrangement for autonomy enabling infrastructure features
Family patents:
Publication number | Publication date
WO2017160201A1 |
SE542087C2 | 2020-02-25
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US8150105B2 | 2008-05-22 | 2012-04-03 | International Electronic Machines Corporation | Inspection using three-dimensional profile information
KR100962408B1 | 2008-12-24 | 2010-06-11 | Hyundai Autonet Co., Ltd. | Around view system for vehicle and control method thereof
WO2013098980A1 | 2011-12-27 | 2013-07-04 | Pioneer Corporation | Information processing system, information processing device, information processing program, information recording medium and method for processing information
US10304165B2 | 2017-05-12 | 2019-05-28 | Ford Global Technologies, LLC | Vehicle stain and trash detection systems and methods
JP2019188979A | 2018-04-24 | 2019-10-31 | Hitachi, Ltd. | Sign diagnostic device and method
SE1851450A1 | 2018-11-23 | 2020-05-24 | Scania Cv Ab | Method, Computer Program, Control Unit for Detecting Faults in a Driver-Assistance System and Vehicle
DE102019220338A1 | 2019-12-20 | 2021-06-24 | Volkswagen Aktiengesellschaft | Method for transmitting an emergency call in an autonomous vehicle
Legal status:
Priority:
Application number | Filing date | Patent title
SE1650346A (SE542087C2) | 2016-03-15 | Method and control unit for vehicle diagnosis
PCT/SE2017/050196 (WO2017160201A1) | 2017-03-02 | Method and control unit for vehicle self-diagnosis