Patent abstract:
USER INTERACTION SYSTEM FOR THE CONTROL OF A LIGHTING SYSTEM FOR THE LIGHTING OF AN ENVIRONMENT AND PORTABLE ELECTRONIC DEVICE
The invention provides a user interaction system (100) and a method of controlling a lighting system. The user interaction system (100) comprises a display (106), a means for receiving a location indication (102), a location identification means (104) and an overlay image creator (108). The display (106) displays a sub-area image and an overlay image. The sub-area image is an image of a specific sub-area of the environment. The location identification means (104) detects which specific location in the environment is indicated by the location indication. The overlay image creator (108) generates the overlay image. The overlay image comprises information related to a lighting effect that is obtainable through the lighting system at the specific location. The obtainable effect is an effect of at least two controllable light sources from the lighting system. The information that is related to the obtainable lighting effect is based on a lighting system model (110) that represents the effects that are obtainable in the environment through controllable light sources of the lighting system.
Publication number: BR112013008713A2
Application number: R112013008713-7
Filing date: 2011-10-13
Publication date: 2021-05-25
Inventors: Dzmitry Viktorovich Aliakseyeu; Jonathan David Mason; Dirk Valentinus René Engelen; Dhannes Weda; Bartel Marinus Van De Sluis
Applicant: Koninklijke Philips Electronics N.V.
IPC main class:
Patent description:

USER INTERACTION SYSTEM FOR THE CONTROL OF A LIGHTING SYSTEM FOR THE LIGHTING OF AN ENVIRONMENT AND PORTABLE ELECTRONIC DEVICE
FIELD OF THE INVENTION The invention relates to user interaction systems for controlling lighting systems.
BACKGROUND OF THE INVENTION Published patent application WO 2010/079400 discloses a control system for controlling one or more light sources. The system comprises an array of image sensors for obtaining a scene image of a scene. The light emitted by the light sources comprises, for each of the light sources, a unique code. The codes are detected by the system. In this way, the system obtains knowledge about the light sources available in the environment and an identification of each of the light sources. The system presents the scene image to a user on a display, and accompanies the scene image with a control image in which control items are shown. The control items are related to the light sources recognized in the scene, and the user can select, through the respective control items, a specific operating mode for the respective light sources. The scene image can be updated to show the result of controlling the light sources. In one embodiment, the user can select a zone in the scene image and provide a desired light effect for the selected zone. The system uses an intelligent mapping algorithm to control the light devices to optimize the requested light effect. When the user provides a desired light effect to the prior-art system mentioned above, it is a priori not clear to the user whether this effect can be obtained. Thus, with the aforementioned prior-art system, it may happen that the user requests an effect for a specific zone, and the effect cannot be obtained because of the
limited capacity of the lighting system. The aforementioned prior-art system only provides feedback to the user by updating the scene image and/or by controlling the light sources in the real scene, and this feedback shows whether the requested lighting effect is obtained or not. This is not easy to use.
SUMMARY OF THE INVENTION An object of the invention is the provision of an easier-to-use user interaction system for controlling a lighting system. A first aspect of the invention provides a user interaction system as claimed in claim 1.
A second aspect of the invention provides a portable electronic device as claimed in claim 12. A third aspect of the invention provides a method for controlling a lighting system as claimed in claim 13. A fourth aspect of the invention provides a computer program product as claimed in claim 14. Advantageous embodiments are defined in the dependent claims. A user interaction system according to the first aspect of the invention comprises a display, a means for receiving a location indication, a location identification means and an overlay image creator. The display shows a sub-area image and an overlay image. The sub-area image is an image of a specific sub-area of the environment. The location identification means detects which specific location in the environment is indicated by the location indication. The overlay image creator generates the overlay image. The overlay image comprises information related to a lighting effect that is obtainable through the lighting system at the specific location. The obtainable effect is an
effect of at least two controllable light sources from the lighting system. The information that is related to the obtainable lighting effect is based on a lighting system model that represents the effects that are obtainable in the environment through controllable light sources of the lighting system.
The user interaction system provides the user with information about the lighting effect that can be obtained at the specific location indicated by the location indication. It is easy to use because the user knows a priori, before providing input related to a desired lighting effect, which effects are obtainable at the specific location. Therefore, the user does not need to discover by trial and error whether the effect he has in mind is obtainable. Thus, the user will not select an unobtainable effect, and will not be disappointed with the lighting system. Lighting systems are becoming much more complex, and a surface or area of the environment is usually illuminated by a plurality of controllable light sources; the lighting effect is thus the combination of the lighting effects of a plurality of individual controllable light sources. It should be noted that at least two controllable light sources that contribute to the obtainable lighting effect must emit light to obtain the obtainable lighting effect. In particular, the more intensive use of individually controllable light sources, such as individually controllable light emitting diodes, makes it much more difficult for users to assess which effects are obtainable in specific areas. The system is capable of analyzing the lighting effects of a plurality of controllable light sources based on the lighting system model, and of translating the lighting effects of the light sources
" 5/31 device to a more comfortable position before providing user input and receiving information about the obtainable effect. The sub-area image is an image of the sub-area. The image can be a schematic drawing of the sub-area of the environment, or the image can be a recording of the subarea of the environment, for example, the image can be a photograph of the
W subarea taken at a particular moment in time.
It should be noted that the display may show the sub-area image without the overlay image, for example, before the location indication is received. If the overlay image is generated, the display shows the sub-area image and the overlay image. Furthermore, the lighting system model can be obtained through known technologies such as, for example, the so-called dark-room calibration, or technologies that are related to the disclosure of the patent application cited in the background section. In the patent application WO 2010/079400, the light sources emit coded light, and the system of the patent application uses an image sensor for detecting the effects of the light sources and for detecting the identification codes of the light sources. In another embodiment, the lighting system model was composed by a person, for example a technician, who installed the lighting system in the environment.
In one embodiment, the means for receiving a location indication is a user input means for receiving user input comprising the location indication. It is easier to use if the user selects the location indication himself. The system then provides the obtainable effect for a location for which the user wants to know the obtainable effect.
In one embodiment, the obtainable effect is at least one of: the light intensity that can be locally increased or decreased; the locally obtainable light colors that are a specific subspace of a color space; the light intensity that can be globally increased or decreased; the light colors obtainable in an area in and around the specific location that are a specific subspace of the color space; the contrast that can be increased or decreased in an area in and around the specific location; a cooler or warmer color that may be obtainable; the color temperature that may be increased or decreased; and an effect of a controllable, redirectable light source that can be moved.
It should be noted that the obtainable effect may be related only to the specific location when, for example, at the specific location the intensity can be increased or decreased, or the obtainable effect may be related to the specific location and an area close to the specific location when, for example, the intensity can be increased or decreased globally. In the context of the obtainable effect, "globally" must be understood as meaning: at the specific location and in an area around the specific location.
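The notion of an obtainable effect at a location, as described above, can be sketched with a small data model. The sketch below is only an illustrative Python assumption: the class names, the contribution coefficients, and the two-source example are not part of the patent text.

```python
from dataclasses import dataclass

@dataclass
class LightSource:
    """A controllable light source in the lighting system model."""
    name: str
    contribution: float   # fraction of its output reaching the location (0..1)
    max_intensity: float  # maximum dimmer level, e.g. 0..255

def obtainable_intensity(sources, current_levels):
    """Report whether the intensity at the location can be increased or decreased."""
    maximum = sum(s.contribution * s.max_intensity for s in sources)
    current = sum(s.contribution * lvl for s, lvl in zip(sources, current_levels))
    return {"current": current,
            "can_increase": current < maximum,
            "can_decrease": current > 0.0}

# Two controllable sources both contribute to the effect at the small table.
table_sources = [LightSource("luminaire_114", 0.8, 255),
                 LightSource("luminaire_120", 0.3, 255)]
info = obtainable_intensity(table_sources, current_levels=[128, 128])
```

Because both sources are below their maximum here, the overlay information would report that the local intensity can be increased and decreased.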
In one embodiment, the input means is configured to receive a desired lighting effect or a desired lighting effect change. The received desired lighting effect and the received desired lighting effect change relate to the specific location.
In other words, after providing, through the user input means, a location indication and receiving feedback from the system about the obtainable effects, the user provides an input in which the user expresses an expectation regarding a lighting effect at the specific location.
The desired effect or effect change may be related to local change in intensity and/or color of light, or change in intensity and/or color of light in
an area in and around the specific location. In another embodiment, the user interaction system further comprises a light effect controller for generating a control signal for a controllable light source of the lighting system for optimally obtaining the desired lighting effect or the desired lighting effect change at the specific location. The generation of the control signals is based on the lighting system model. The lighting system model further comprises information about the relationship between the values of the control signals for the controllable light sources and a response of the respective controllable light sources when receiving the control signals.
The lighting model may further comprise the relationships between the control signals that the controllable light sources of the lighting system can receive and how the control signals alter the operation of the controllable light sources. Thus, the lighting model comprises sufficient information to allow the system to determine how a lighting effect in the environment changes if specific control signals are provided to one or more controllable light sources of the lighting system. The user provides the desired light effect, or the desired light effect change, and, based on the model information, the light effect controller generates at least one control signal for a controllable light source of the lighting system, so that the desired light effect or the desired light effect change is obtained as much as possible by the lighting system if the controllable light source of the lighting system receives the at least one control signal. Thus, the user does not need to have much knowledge about the lighting system and how the controllable light sources of the lighting system should be
- - - """ "8/ Ji controlled. It should be noted that it may be possible that the desired lighting effect or desired lighting effect change is not yet fully obtainable. Light effect controller 5 is configured to generate the control signal so that the desired effect or desired effect change can be achieved as much as possible. At least one control signal is not necessarily immediately provided to the
W lighting. In subsequent embodiments, at least one control signal 10 is used for different purposes. Various parameters of the controllable light source can be controlled via the control signal. Examples are the intensity of the emitted light, the color of the emitted light, the angle of light emission or, in the case of redirectable light sources, for example, the location of the effect of the redirectable light source, the exact location of the light source, and the direction in which the light beam is emitted. Other parameters that may be comprised in the control signal are, for example, time-dependent parameters such as, for example, "the emission of a specific intensity during a specific time interval". n In another embodiment, the overlay image creator is further configured to generate the overlay image, which comprises information showing the altered operation of the controllable light source of the lighting system in accordance with the control signal. generated. So, in other words, the overlay image is a virtual addition to the reality of the sub-area image, which shows the user what the sub-area will look like when the lighting system is controlled with the control signal. This allows the user to assess the lighting situation that may arise from controlling the lighting system with the control signal generated, even before the lighting system.
9/3"1 " " lighting can be controlled with the control signal. It is convenient for the user to evaluate the lighting system control from a virtual environment, as this allows the design of a lighting control pattern without using the 5 lighting system and therefore that the lighting system is not unnecessarily occupied for the design of the lighting control pattern.
In one embodiment, the overlay image creator is configured to generate the overlay image comprising virtual objects that are positioned in the sub-area of the environment and that are virtually illuminated by the controllable light sources of the lighting system in accordance with the control signal.
In another embodiment, the overlay image creator is configured to generate the overlay image comprising feedback on the feasibility of the desired lighting effect or the desired lighting effect change. As mentioned earlier, the light effect controller may be unable to control the lighting system so that the desired light effect or the desired light effect change is achieved, due to the limitations of the lighting system. The overlay image can provide feedback to the user which indicates to what extent the desired light effect is achievable, and to what extent the desired light effect change is achievable; in other words, the feedback presents how much of the desired light effect or of the desired light effect change is obtainable, given the lighting system present in the environment. In general, this is perceived as easy to use because it makes the user aware of the limitations of the lighting system, and the user is not unexpectedly disappointed if the lighting system is controlled with the control signal.
In another embodiment, the user interaction system further comprises an output means for providing the control signal to the lighting system. By providing the control signal to the lighting system, the lighting system can be controlled not only in a virtual environment, but also in the real environment. Thus, the system is an easy-to-use system that can be used to assess the attainable effects of the lighting system in the virtual environment through the overlay image, and that can be used to control the lighting system in the real environment. In one embodiment, the overlay image creator is further configured to generate the overlay image comprising information related to one or more controllable light sources of the lighting system.
Information about a specific controllable light source is at least one of: the position of the specific light source; the direction in which the light beam of the specific light source can be emitted; the footprint of the light beam of the specific light source; the colors that can be emitted by the specific light source; the intensities that can be emitted by the specific light source; the movement options of the specific light source; and the relationship between the obtainable effect and the specific light source. Providing additional information regarding one or more light sources of the lighting system and/or their capabilities results in an easier-to-use system, as the additional information makes the user more aware of the functioning of the lighting system and its capabilities. The additional information relates, for example, to the current position and current operation of the light sources, that is, where the light sources are located and how the light sources operate to obtain the lighting situation that is recorded in the sub-area image. In another embodiment, the additional information relates, for example, to the capabilities of the light sources of the lighting system, which are not necessarily being used at the particular moment in time when the sub-area image was recorded. In an example, the possible movements of a redirectable spot are presented, the possible subspace of a color space of the light source is presented, as well as the possible light intensities, the possible angles of light emission, etc.
In another embodiment, the user interaction system further comprises a camera for obtaining a camera image, and the display is further configured to display the camera image. This embodiment allows navigation through the environment with the control device. The camera takes one or more successive camera images, and the display presents the one or more camera images; the user interaction system is therefore a means for the user to view the environment. It is especially easy to use to show camera images on the display when the camera has a different viewing angle than the user, and the camera may, for example, be able to zoom in on details that are difficult for a user to recognize. As discussed above, the sub-area image can also be displayed on the display, together with the overlay image or initially without the overlay image. The camera image can be displayed independently of the sub-area image and/or independently of the overlay image, or together with the sub-area image and/or the overlay image. If the camera image and the sub-area image are displayed simultaneously, this can be done in an overlaid manner, or the two images can be displayed next to each other. In one embodiment, the camera image of a sub-area of the environment is equivalent to the location indication. This embodiment provides a means for detecting, while obtaining a record of the sub-area through the camera, toward which part of the environment the camera is directed and which part of the environment is recorded by the camera. It is very easy to use because the user interaction system is able to automatically detect which specific location is indicated in the sub-area image without the need for additional user interaction.
The location identification means also has knowledge about the environment. This knowledge can be pre-programmed into the location identification means in the form of a model of the environment, or the location identification means is capable of deducing the environment model from the sub-area image, or the location identification means receives, together with the lighting system model, a model of the environment. In another embodiment, the sub-area image is accompanied by information that expresses which specific sub-area of the environment is imaged in the sub-area image. In one embodiment, the camera comprises at least one of: means for detecting a location of the camera, means for detecting a direction in which the camera is directed, a tilt sensor for detecting the tilt of the camera, and means for detecting the camera's angle of view. The user input means is configured to base the location indication on at least one of: the detected camera location, the detected direction, the detected tilt, and the detected angle of view. In this embodiment, the camera, together with the display, serves as the detection means as well as the user input means, that is, as a means of pointing to a location in the environment and thus providing a location indication. The user can, for example, navigate with the camera, and finally point the camera at a sub-area and zoom in on the center of the sub-area to provide the location indication. Later, the overlay image creator can generate the overlay image so that the user can receive information about one or more obtainable lighting effects at the indicated location.
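Basing the location indication on the detected camera location, direction and tilt can be sketched as a ray-surface intersection with the environment model. The geometry below (a single horizontal surface and the specific pose values) is an illustrative assumption.

```python
def locate_pointed_surface(camera_pos, view_dir, surface_height):
    """Intersect the camera's viewing ray with the horizontal surface
    z = surface_height of the environment model; return the indicated
    location, or None if the camera does not point at the surface."""
    ox, oy, oz = camera_pos
    dx, dy, dz = view_dir
    if abs(dz) < 1e-12:
        return None                      # ray parallel to the surface
    t = (surface_height - oz) / dz
    if t < 0:
        return None                      # surface is behind the camera
    return (ox + t * dx, oy + t * dy, surface_height)

# Camera held at 1.6 m, looking forward and downward at a 0.75 m table top.
location = locate_pointed_surface((0.0, 0.0, 1.6), (0.0, 1.0, -0.85), 0.75)
```

The returned point is the specific location in the environment for which the overlay image creator can then report the obtainable lighting effects.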
According to a second aspect of the invention, there is provided a portable electronic device comprising the user interaction system according to one of the embodiments of the user interaction system according to the first aspect of the invention. The portable electronic device is, for example, a mobile phone or a portable computer.
According to a third aspect of the invention, a method for controlling a lighting system for lighting an environment is provided. The method comprises the steps of i) displaying a sub-area image which is an image of a specific sub-area of the environment, ii) receiving user input which comprises a location indication, iii) detecting which specific location in the environment is indicated by the location indication, iv) generating an overlay image comprising information related to a lighting effect that is obtainable through the lighting system at the specific location, the obtainable lighting effect being an effect of at least two controllable light sources of the lighting system, and the information related to the obtainable lighting effect being based on a lighting system model that represents the effects obtainable in the environment through the controllable light sources of the lighting system, and v) displaying the sub-area image and the overlay image.
In one embodiment, the method starts again with the step of displaying a sub-area image after performing all of the method steps.
According to a fourth aspect of the invention, there is provided a computer program product comprising instructions for causing a processor system to execute the method according to the third aspect of the invention.
The portable electronic device, the lighting system control method and the computer program product provide the same benefits as the user interaction system according to the first aspect of the invention, and have similar embodiments with effects similar to those of the corresponding system embodiments.
These and other aspects of the invention are apparent from and will be elucidated with reference to the embodiments described below. Those skilled in the art will appreciate that two or more of the above-mentioned embodiments, implementations and/or aspects of the invention may be combined in any way deemed useful. Modifications and variations of the method, the portable device, and/or the computer program product, which correspond to the described modifications and variations of the system, can be made by one skilled in the art on the basis of the present description.
BRIEF DESCRIPTION OF THE DRAWINGS In the drawings: Fig. 1a schematically shows a first embodiment of the user interaction system according to the first aspect of the invention, Fig. 1b schematically shows a display showing a sub-area image and an overlay image, Fig. 2 schematically shows a second embodiment of the user interaction system, Fig. 3 schematically shows a typical use case of the user interaction system, Fig. 4 schematically shows another embodiment of the display showing the sub-area image and the overlay image, Fig. 5 schematically shows a model of a lighting system, Fig. 6 schematically shows another embodiment of the display that presents the sub-area image as a schematic drawing of a room, and Fig. 7 shows a flowchart of the method in accordance with the third aspect of the invention.
It should be noted that items indicated by the same reference numerals in different Figures have the same structural characteristics and the same functions, or are the same signals. Once the function and/or structure of such an item has been explained, there is no need for a repeated explanation in the detailed description. The Figures are purely schematic and are not drawn to scale; particularly for clarity, some dimensions are strongly exaggerated.
DETAILED DESCRIPTION OF EMBODIMENTS A first embodiment of a user interaction system 100 in accordance with the first aspect of the invention is shown in Fig. 1a. Through the user interaction system 100, a lighting system for lighting an environment can be controlled. The user interaction system 100 comprises a display 106, an input means 102, a location identification means 104 and an overlay image creator 108. The display is capable of displaying a sub-area image 112 and an overlay image
113, which are shown in Fig. 1b. The sub-area image 112 can be displayed without the overlay image 113, or together with the overlay image 113. The sub-area image 112 is a record of a specific sub-area of the environment, and the sub-area image 112 is recorded at a specific moment in time. The user input means 102 is for receiving user input. The user input comprises a location indication 118. The location indication 118 may relate initially to a point or location in the sub-area image 112, or it may relate initially to a point or location in the environment. The location identification means 104 detects which specific location in the environment is indicated by the location indication 118. The overlay image creator 108 generates the overlay image 113. The overlay image 113 comprises information 116 relating to a lighting effect that is obtainable through the lighting system at the specific location in the environment. The obtainable lighting effect is an effect of at least two light sources. In other words, at least two light sources contribute to the obtainable lighting effect. The information that is related to the obtainable lighting effect is based on a lighting system model 110. The overlay image creator 108 comprises the lighting system model 110 in an internal storage, or the overlay image creator 108 is able to access the lighting system model 110, for example, via a network connection. The lighting system model 110 represents the effects that are obtainable in the environment by the light sources of the lighting system. In other words, the lighting system model comprises information about which lighting effects are obtainable, where in the environment, and through which light sources of the lighting system.
18/31" -- .
The user interaction system 100 presents, on the display 106, the sub-area image 112 which is shown in Fig. 1b. The sub-area image 112 is a record of a specific sub-area of the environment. Two walls of the sub-area are imaged in the sub-area image 112. A first luminaire 114 is present on a small table 122, and a second luminaire 120 is present on one of the walls. The first luminaire 114 is used to locally light the small table
122. The second luminaire 120 is used to globally illuminate the sub-area. Therefore, the total amount of light falling on the small table 122 depends on the light emitted by the first luminaire 114 and by the second luminaire 120. The display 106 can be combined with the input means
102. In the embodiment of Fig. 1b, the input means 102 has a touch-sensitive layer integrated into the display 106, which allows a user to provide user input by touching the display 106. In the example of Fig. 1b, a pointing instrument 124 is used by the user to provide a location indication 118 to the input means 102. The pointing instrument 124 touches the display 106 at one or more pixels in which the upper surface of the small table 122 is imaged. In other embodiments, the input means is not necessarily a touchscreen, but may comprise a keyboard, or a mouse, or another means of receiving input from a user. The location identification means 104 then detects which specific location in the sub-area is indicated by the location indication 118. The location identification means 104 has, for example, a model of the environment, and knows which sub-area of the environment is imaged in the sub-area image 112. By mapping the position of the pixels that are touched with the pointing instrument 124 onto a surface in the environment model, the location identification means 104 is capable of detecting that the location indication 118 corresponds to a specific location on the upper surface of the table 122. In another embodiment, the location indication is an indication of an area in the sub-area image. The user input can be a polygon that is drawn around the indicated area. The overlay image creator 108 uses the lighting system model 110 to generate the overlay image 113. The overlay image 113 is presented on the display 106 along with the sub-area image 112 in an overlaid manner. The overlay image 113 comprises information 116 which relates to the obtainable lighting effects at the specific location. In Fig. 1b, it is seen that the information 116 is presented in text form.
However, the information 116 can also be presented in a graphical form. In the example in Fig. 1b, four different characteristics of the lighting effects obtainable at the specific location are presented. In another embodiment, only one characteristic of a lighting effect for the specific location is presented. The information 116 shows the user that the local brightness, which is the brightness at the specific location, can be increased and decreased. The overall brightness, which also refers to the brightness in an area around the specific location, can only be decreased. The contrast, which refers to the relationship between the local point intensity and the global ambient intensity, can be decreased and increased. And the light colors available at the specific location are a plurality of red, green and blue color combinations.
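The pixel-to-location mapping performed by the location identification means 104 in the Fig. 1b example can be sketched with a simple region lookup. The pixel regions and surface names below are illustrative assumptions; a real system would intersect a viewing ray with a 3D environment model.

```python
# Labeled rectangular pixel regions of the sub-area image, each mapped
# to a surface of the environment model: (x0, y0, x1, y1) -> surface.
SURFACE_REGIONS = {
    (100, 300, 220, 360): "table_122_top",
    (0, 0, 640, 200): "wall_with_luminaire_120",
}

def identify_location(px, py):
    """Return the surface of the environment model imaged at pixel (px, py)."""
    for (x0, y0, x1, y1), surface in SURFACE_REGIONS.items():
        if x0 <= px < x1 and y0 <= py < y1:
            return surface
    return None

touched = identify_location(150, 320)  # pixel touched by the pointing instrument
```

With the touched surface identified, the overlay image creator can look up the lighting effects obtainable there in the lighting system model.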
Fig. 2 shows another embodiment of a user interaction system 200 in accordance with the first aspect of the invention. The user interaction system 200 comprises a display 212, an input means 202, a location identification means 204, an overlay image creator 214, a light effect controller 218, an output means 220, and a camera 208. The user interaction system 200 further comprises a communication bus 206 which is used by the different components of the system to communicate with each other. The display 212 displays a sub-area image, an overlay image and/or a camera image. The input means 202 receives the user input. The user input comprises a location indication. The location identification means 204 detects which specific location in the environment is indicated by the location indication. The overlay image creator 214 generates the overlay image comprising information related to a lighting effect obtainable at the specific location in the environment. For the creation of the overlay image, a lighting system model 216 is used by the overlay image creator. In the embodiment shown, the lighting system model 216 is stored in the light effect controller 218, and is accessed by the overlay image creator via the communication bus 206.
The user input means 202 may also receive user input regarding a desired lighting effect for the specific location that has been indicated with the location indication. Instead of a desired lighting effect, a desired effect change can also be provided, which means that the user input provided is a desired change of the lighting situation that is imaged in the sub-area image. If the lighting effect change provided is, for example, "locally higher brightness", the brightness of the specific location, as it is imaged in the sub-area image, should increase locally. The lighting effect controller 218 uses the desired lighting effect or the desired lighting effect change to generate at least one control signal for at least one light source of the lighting system, so that, if the light sources of the lighting system emit light according to the at least one control signal, the desired lighting effect or the desired lighting effect change is ideally achieved at the specific location. "Ideally" means, in this context, "as much as possible" in view of the limitations of the lighting system. It should be noted that the user is first informed about the possible lighting effects and, therefore, the user is expected to provide a desired lighting effect or a desired lighting effect change that is obtainable through the lighting system. The light effect controller 218 stores the model of the lighting system 216, which not only contains the information about which light effects are obtainable by the light sources of the lighting system, but also comprises the information that associates the obtainable lighting effects with the control signals that control the operation of the light sources to obtain the lighting effects. It should be noted that the control signal can comprise information regarding all controllable characteristics of the light sources of the lighting system.
For example, if the light emission of the light sources is controllable, the control signal can comprise the control information for emission at a specific light intensity. In other examples, the color of the light, the angle of light emission or, for example, the position of the light sources is controllable.
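As a rough illustration of how such control signals could be represented and generated from the lighting system model, consider the following Python sketch. The class names, the footprint encoding and the even-split heuristic are all assumptions for illustration, not part of the disclosed system:

```python
from dataclasses import dataclass


@dataclass
class ControlSignal:
    """One control signal addressed to one light source."""
    source_id: str
    intensity: float  # 0.0 (off) .. 1.0 (maximum intensity)


class LightingModel:
    """Maps each source id to the set of named spots its footprint covers."""
    def __init__(self, footprints):
        self.footprints = footprints  # e.g. {"LS1": {"floor-right"}}

    def covers(self, source_id, location):
        return location in self.footprints[source_id]


def signals_for_brightness(model, location, desired):
    """Generate signals so the sources covering `location` jointly
    approximate the desired brightness (naive even split)."""
    covering = [s for s in model.footprints if model.covers(s, location)]
    if not covering:
        return []
    share = min(1.0, desired / len(covering))
    return [ControlSignal(s, share) for s in covering]


model = LightingModel({"LS1": {"floor-right"}, "LS2": {"floor-right", "wall"}})
sigs = signals_for_brightness(model, "floor-right", 1.0)
```

The even split is only one possible policy; a real controller would weight the sources by their individual intensity ranges from the model.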
The generated control signal can be provided to the lighting system through the output means 220. An output 222 of the output means 220 is, for example, a DMX bus, which is a communication bus system used by some lighting systems to communicate control parameters to the light sources. The output 222 can also employ other communication technology, such as a wireless Ethernet, Bluetooth, ZigBee or Z-Wave connection. The control signal is not necessarily used to control the lighting system. In one embodiment, the overlay image creator 214 is capable of generating an overlay image that also shows the altered operation of the light sources of the lighting system if the lighting system were controlled with the control signal. In order to generate the overlay image, the overlay image creator 214 accesses the lighting system model 216 to determine the lighting effect that results from the control signal. Said overlay image is displayed together with the sub-area image on the display 212, and the combined view virtually displays the effects of the control.
In one embodiment, the overlay image creator 214 is further capable of generating an overlay image that presents information about the feasibility of the desired lighting effect or the desired lighting effect change. In other words, if the desired lighting effect or the desired lighting effect change is not completely achievable by the lighting system because of its limitations, the overlay image can communicate to the user that the lighting effect cannot be completely obtained, and possibly to what extent it can be obtained. The user interaction system 200 further comprises a camera 208. The camera obtains a camera image of a part of the environment comprising the lighting system. The display 212 can present the camera image.
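One minimal way to derive such feasibility feedback, assuming the lighting system model can report the maximum obtainable level at the indicated location, is sketched below. The function name and message wording are illustrative, not taken from the disclosure:

```python
def feasibility_message(desired_level, max_obtainable_level):
    """Return a short feasibility notice for the overlay image.

    Both arguments are in the same arbitrary brightness units;
    the message strings are purely illustrative.
    """
    if max_obtainable_level <= 0:
        return "no controllable light reaches this location"
    if desired_level <= max_obtainable_level:
        return "achievable"
    # Report to what extent the effect can be obtained.
    pct = round(100 * max_obtainable_level / desired_level)
    return f"only {pct}% of the desired level is achievable"
```

For example, a request for twice the obtainable brightness would yield a "50% achievable" notice in the overlay.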
As such, the user interaction system 200 can be used to survey the environment to find a sub-area in which the lighting system is to be controlled.
The user input means 202 can also receive a specific user input which indicates that, at the particular moment of providing the specific user input, the camera image is the sub-area image. Thus, navigation and the provision of the specific user input can be used by the user of the user interaction system 200 to register the sub-area image. In one embodiment, after the sub-area image is obtained, the camera image is no longer displayed on the display 212, and the camera image is replaced by the sub-area image. In other words, the camera image is frozen so that the lighting system can be controlled with respect to one or more locations imaged in the frozen camera image.
The camera 208 may comprise at least one of the following means: the means 210 for detecting a location of the camera, the means 224 for detecting a direction in which the camera 208 is directed, a tilt sensor 226, and the means 228 for detecting the angle of view of the camera 208. The means 210 for detecting the location of the camera 208 is, for example, a GPS sensor, or a sensor capable of determining the location of the camera 208 with respect to a known fixed point in the environment. The means 224 for detecting the direction in which the camera 208 is directed is, for example, an electronic compass. The means 228 for detecting the angle of view is, for example, the magnification controller of the camera 208. The different detection sensors of the camera 208 can be used for different purposes.
In one embodiment, the camera 208 is a fixed camera whose direction and tilt can be controlled. The location of said camera is known, the tilt and direction correspond to the controlled values, and the viewing angle is assumed to be fixed. In one embodiment, the detected location of the camera 208, the detected recording direction, the detected tilt and/or the detected angle of view are used by the location identification means 204 to determine which specific sub-area of the environment is imaged in the sub-area image. The location identification means 204 has, for example, a model of the environment that is combined with the detected parameters to determine which specific sub-area is imaged. Alternatively, trigonometry is used to detect which specific sub-area is imaged in the sub-area image.
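The trigonometric alternative can be sketched as follows for a camera at a known height above the floor, assuming the compass and tilt sensors report degrees and that heading 0 points along the +y axis. The coordinate convention and function name are assumptions for illustration:

```python
import math


def viewed_floor_point(cam_x, cam_y, cam_height, heading_deg, tilt_deg):
    """Project the camera's optical axis onto the floor plane.

    heading_deg: compass direction (0 = +y axis), tilt_deg: angle below
    the horizontal. This is an illustrative sketch, not the patented
    algorithm itself.
    """
    tilt = math.radians(tilt_deg)
    heading = math.radians(heading_deg)
    if tilt <= 0:
        raise ValueError("camera must point below the horizon")
    # Ground distance from the camera to where the axis meets the floor.
    d = cam_height / math.tan(tilt)
    return (cam_x + d * math.sin(heading),
            cam_y + d * math.cos(heading))
```

Combined with the detected angle of view, the same geometry bounds the whole imaged sub-area rather than just its center point.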
In another embodiment, the detected location of the camera 208, the detected recording direction, the detected tilt and/or the detected angle of view are used by the input means to detect which location indication the user is providing as user input. The fact that the user is directing the camera 208 at a specific point in the environment can be interpreted by the user input means 202 as the user providing the location indication as user input. For example, the center point of the camera image can be taken as the location indication if the user holds the camera 208 in a fixed position for a predetermined time interval. Or, if the user zooms in on a specific location in the environment, this can be interpreted by the input means 202 as the user providing the location indication as user input. In one embodiment, a mobile phone that is provided with a video camera comprises the user interaction system of Fig. 2, in which the camera 208 of the user interaction system is the camera of the mobile phone. The user navigates through the environment and, at a particular moment in time, presses a button to indicate that the camera image at that particular moment is the sub-area image. Subsequently, the user zooms in with the camera 208 on a specific location in the sub-area of the environment and, at the moment the maximum magnification is reached, the user input means 202 considers the location imaged in the zoomed image to be the location indication. In another embodiment, the user does not need to zoom in on the location that is the location indication, but does need to align the center of the camera image with the location indication and keep the camera 208 in a fixed position for a predetermined period of time.
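The "hold the camera still" interpretation described above can be sketched as a dwell detector over the image-center point. All thresholds, the sample format and the function name are hypothetical:

```python
def dwell_location(samples, hold_seconds=2.0, max_jitter=5.0):
    """Interpret a steady camera as a location indication.

    `samples` is a chronological list of (timestamp, x, y) image-center
    points. Returns the indicated point once it has stayed within
    `max_jitter` pixels of the latest sample for at least `hold_seconds`,
    otherwise None. Parameter values are illustrative defaults.
    """
    if not samples:
        return None
    t_end, x_end, y_end = samples[-1]
    # Walk backwards: the camera must have been steady for the full window.
    for t, x, y in reversed(samples):
        if abs(x - x_end) > max_jitter or abs(y - y_end) > max_jitter:
            return None  # moved too much before the hold time elapsed
        if t_end - t >= hold_seconds:
            return (x_end, y_end)
    return None
```

A zoom-based variant would instead watch the magnification controller and fire when maximum magnification is held.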
In another embodiment, the location identification means 204 comprises 3D model generation means for generating a model of the sub-area by analyzing the sub-area image. The specific sub-area so determined is subsequently used to detect which specific location is indicated by the location indication. In yet another embodiment, markers are available in the environment. Some of the markers may be visible in the sub-area image, and the visible markers are recognized by the location identification means 204 to identify which specific sub-area of the environment is imaged in the sub-area image. The location identification means 204 requires knowledge of the available markers and their relationship to the environment.
Fig. 3 shows a typical use case of the user interaction system. Fig. 3a shows a mannequin in a store. The store is lit by a lighting system. After dressing the mannequin in clothes from the new collection, the shop assistant takes a portable electronic device with a display on one side and a camera on the opposite side. The shop assistant directs the camera of the handheld electronic device at the mannequin.
In Fig. 3b it is shown that the shop assistant provides the input to the handheld electronic device, and the device registers the mannequin image as the sub-area image. In Fig. 3c, the shop assistant moves the handheld electronic device to a more comfortable position, while the display of the handheld electronic device still presents the sub-area image. Fig. 3d shows that the shop assistant selects, on the touch screen of the handheld electronic device, a location for which he wants to control the lighting. The handheld electronic device presents, in an overlay image, the effects obtainable at the selected location, and subsequently the shop assistant provides a desired lighting effect for the selected location. In Fig. 3d it is shown that the control signals generated by the portable electronic device are immediately provided to the lighting system, so that the shop assistant is able to see in the shop what the effect of the control is. In another embodiment, the store lighting system is not controlled immediately, and the portable electronic device presents, in the overlay image, the effect of the control of the lighting system, so that the shop assistant is able to create the light situation around the mannequin in a virtual environment.
Fig. 4 refers to another embodiment, in which the lighting system comprises a redirectable luminaire 404 and a fixed luminaire 407. In Fig. 4, the display 106 is shown, which presents the sub-area image 112 and the overlay image 113. The overlay image shows the lighting effects obtainable at the location indicated with the pointing instrument 124. In the embodiment of Fig. 4, it is shown that a relatively bright spot 406 of light can be obtained at the indicated location. The light at the indicated location is the sum of the light from the redirectable luminaire 404 and that of the fixed luminaire 407, which emits light all around. The overlay image 113 also shows the relationship between the position of the redirectable luminaire 404 and the light effects that can be achieved by the redirectable luminaire 404 in the overlay image 113. When the redirectable luminaire 404 moves along the line 404, the light beam indicia moves along the imaged path 410. At the leftmost position, the leftmost indicia 408 is obtained and, at the rightmost position, the rightmost indicia 406 can be obtained. If the user provides input that the desired effect is that a bright spot needs to be created at the indicated location, the light effect controller generates a control signal that indicates that the redirectable luminaire 404 needs to move to its rightmost position.
When the user provides input regarding a desired lighting effect or a desired lighting effect change, the user can, in one example, drag the brightest spot on the floor to another position. The brightest spot substantially corresponds to the indicia of the redirectable luminaire 404. In one embodiment, the redirectable luminaire 404 is rotatable about an x-axis and a y-axis. The direction of the x-axis and the y-axis is indicated at the lower right end of Fig. 4. The indicia of said redirectable luminaire can be directed towards the floor or one of the walls. Thus, the obtainable effect is that a local bright spot can be created almost anywhere in the sub-area. Fig. 5 shows an example of a lighting system model 500. Said model can be built using known techniques such as, for example, the so-called dark-room calibration, light planning software, CAD representations, or a technology related to the emission of coded light by the light sources and the recording and analysis of the coded light emission. The document cited in the background section refers to the construction of said model using light sources that emit coded light.
The lighting system model 500 is presented in the form of a table. However, said model can be stored differently, for example in a relational database, and the model can be implemented differently, for example in a light simulation module. The table comprises, in its rows, the characteristics of a specific light source. For example, row 512 comprises the parameters of the first light source. In column 502, an identification code of the respective light source is presented; in column 504, the location (within the room) of the respective light source; in column 506, the indicia of the respective light source; in column 508, the subspace of the CIE XYZ color space in which light can be emitted by the respective light source; and, in column 510, the light intensity range of the respective light source is stored. As seen in Fig. 5, the first light source, which has the identification code LS1, is located in the environment at a position described by the coordinates (x1, y1, z1). The first light source has an oval indicia on the floor of the room, and the color of the emitted light is one somewhere in the middle of the CIE XYZ color space. In addition, the first light source can be switched off so that no light is emitted but, if the light source is switched on, the intensity of the emitted light is between a value Lmin,1 and Lmax,1. If, in one example, the overlay image creator needs to generate an overlay image that presents information regarding a light effect obtainable at a point near the far right side of the floor, the overlay image creator finds, in column 506 of Table 500, that the light sources LS1 and LS2 have an indicia at this specific position, and therefore that the combination of the capabilities of the light sources LS1 and LS2 determines the obtainable effects.
For example, the obtainable light intensity is the sum of the light intensities that can be emitted by the light source LS1 and the light source LS2. The model 500 in Fig. 5 is just an example of such a model.
More information may be available in the model, such as, for example, the relationship between the values of the control signals that can be received by the light sources and the characteristics of the light that is emitted in response to the reception of those control signals.
Column 506 may additionally comprise information regarding the intensity distribution in the light beam indicia of the respective light source when the respective light source emits light at maximum intensity.
Furthermore, in the example of Fig. 5, the indicia of the respective light sources are presented in column 506 as indicia on the floor or the walls of the room.
Light planning software is often also able to work with virtual planes, and therefore the indicia can also be stored as an indicia in a virtual plane of the environment.
Column 508 of model 500, which comprises the color space obtainable by the respective light source, can also be represented by three color points in the XYZ color space that define the corners of the obtainable color space.
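One possible encoding of the lighting system model of Fig. 5 is a list of records, one per light source, with a query that sums the maximum intensities of all sources whose indicia cover a given spot. The field names mirror the table columns, but the names and numeric values are assumptions:

```python
# Illustrative records; intensity ranges are hypothetical lumen values.
LIGHTING_MODEL = [
    {"id": "LS1", "position": (1.0, 4.0, 3.0), "footprint": "floor-right",
     "intensity_range": (0.0, 500.0)},
    {"id": "LS2", "position": (3.0, 4.0, 3.0), "footprint": "floor-right",
     "intensity_range": (0.0, 300.0)},
    {"id": "LS3", "position": (0.0, 0.0, 3.0), "footprint": "wall-left",
     "intensity_range": (0.0, 400.0)},
]


def max_brightness_at(model, footprint):
    """The obtainable intensity at a spot is the sum of the maximum
    intensities of every source whose indicia covers that spot."""
    return sum(src["intensity_range"][1] for src in model
               if src["footprint"] == footprint)
```

A richer model would also store the per-source color gamut (column 508) and the signal-to-response mapping mentioned in the text.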
Fig. 6 shows another embodiment, in which the sub-area image 112 is a schematic drawing of a room.
The sub-area image shows, for example, that a meeting table 606 with six chairs is placed at the far right of the meeting room. Wall 602 has an entrance door, and a reproduction of a famous painting by Mondrian is attached to a side wall 610. Part 604 of the schematic drawing of the room does not correspond to a wall in the room, and is drawn solely for reasons of clarity. The overlay image 113 presents an obtainable effect 608 on the table 606. The obtainable effect 608 is local light on the table 606.
Fig. 7 shows a flowchart of the method 700 in accordance with the third aspect of the invention. In step 702, a sub-area image is displayed. The sub-area image is an image of a specific sub-area of the environment at a specific point in time. In step 704, user input is received. The user input comprises a location indication. In step 706, it is detected which specific location in the environment is indicated by the location indication. In step 708, an overlay image is generated. The overlay image comprises the information related to a lighting effect that is obtainable through the lighting system at the specific location. The obtainable lighting effect is an effect of at least two controllable light sources of the lighting system. The information related to the obtainable lighting effect is based on a lighting system model that represents the effects that are obtainable in the environment through the controllable light sources of the lighting system. In step 710, the sub-area image and the overlay image are displayed. In one embodiment, after all steps of the method have been performed, the method starts again with step 702 of displaying a sub-area image. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claim. Use of the verb "to comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the device claim enumerating several means, several of these means can be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
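The steps of method 700 can be summarized in a single pass as the following sketch, where function arguments stand in for the system components; all names are illustrative, not part of the claimed method:

```python
def method_700_once(sub_area_image, location_indication,
                    identify_location, obtainable_effects):
    """One pass through steps 702-710 of method 700, returning the
    content the display would show."""
    shown = {"image": sub_area_image}                   # step 702: display sub-area image
    location = identify_location(location_indication)  # steps 704-706: receive input, detect location
    shown["overlay"] = obtainable_effects(location)    # step 708: generate overlay from the model
    return shown                                       # step 710: display image plus overlay
```

In the embodiment where the method repeats, this pass would simply run in a loop, starting again at step 702.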
Claims (1)
[1]
1. USER INTERACTION SYSTEM (100, 200)
FOR THE CONTROL OF A LIGHTING SYSTEM FOR THE LIGHTING OF AN ENVIRONMENT, the user interaction system (100, 200) comprising: - a display (106) for displaying a sub-area image (112) and an overlay image (113), the sub-area image (112) being an image of a specific sub-area of the environment, - a means (102) for receiving a location indication (118), - a location identification means (104, 204) for detecting which specific location in the environment is indicated by the location indication (118), - an overlay image creator (108, 214) for generating the overlay image (113) comprising information (116, 410, 608) relating to a lighting effect that is obtainable through the lighting system at the specific location, the obtainable lighting effect being an effect of at least two controllable light sources of the lighting system, the information (116, 410, 608) relating to the obtainable lighting effect being based on a lighting system model (110, 216, 500) representing the effects which are obtainable in the environment through controllable light sources of the lighting system, characterized in that the user interaction system (100, 200) further comprises a camera (208) for obtaining a camera image, and in that the display (106) is further configured to display the camera image; the means for receiving the location indication (102) is configured to receive input from the user who selects the camera image and who indicates the camera image.
controllable redirectable luminaire that can be moved.
4. USER INTERACTION SYSTEM (100, 200) according to claim 2, characterized in that the user input means (102, 202) is further configured to receive a desired lighting effect or a desired lighting effect change, and the desired lighting effect or the desired lighting effect change is received for the specific location.
5. USER INTERACTION SYSTEM (100, 200) according to claim 4, characterized in that the user interaction system (100, 200) additionally comprises: - a light effect controller (218) for generating a control signal for a controllable light source of the lighting system to optimally obtain the desired lighting effect or the desired lighting effect change at the specific location, wherein the generation of the control signal is based on the lighting system model (110, 216, 500), and the lighting system model further comprises information about the relationship between the values of the control signals for the controllable light sources and a response of the respective controllable light sources upon reception of the control signals.
6. USER INTERACTION SYSTEM (100, 200) according to claim 5, characterized in that the overlay image creator (108, 214) is further configured to generate the overlay image (113) comprising: - the information showing the altered operation of the controllable light source of the lighting system according to the generated control signal, or
- the feedback on the feasibility of the desired lighting effect or the desired lighting effect change.
7. USER INTERACTION SYSTEM (100, 200) according to claim 5, characterized in that the user interaction system (100, 200) further comprises: - an output means (220) for providing the control signal to the lighting system.
8. USER INTERACTION SYSTEM (100, 200) according to claim 1 or 2, characterized in that the overlay image creator (108, 214) is further configured to generate the overlay image (113) comprising information relating to one or more controllable light sources of the lighting system, and the information about a specific controllable light source is at least one of: the position of the specific light source, the direction in which the light beam of the specific light source can be emitted, the indicia of the light beam of the specific light source, the colors that can be emitted by the specific light source, the intensities that can be emitted by the specific light source, the movement or rotation options of the specific light source, or the relationship between the obtainable effect and the specific light source.
10. USER INTERACTION SYSTEM (100, 200) according to claim 9 when dependent on claim 2, characterized in that the user input means (102, 202) is further configured to receive input from the user indicating, at a particular point in time, that the camera image is the sub-area image (112).
11. USER INTERACTION SYSTEM (100, 200) according to claim 9, characterized in that
- the camera (208) comprises at least one of: the means (210) for detecting a location of the camera (208), the means (224) for detecting a direction in which the camera (208) is directed, a tilt sensor (226) for detecting the tilt of the camera (208), and the means (228) for detecting the angle of view of the camera (208), and - the location identification means (104, 204) is configured to detect which specific sub-area of the environment is imaged in the sub-area image (112), based on at least one of: the detected location of the camera (208), the detected direction, the detected tilt and the detected angle of view, and wherein the location identification means is configured to base the detection of the specific location on the detected specific sub-area.
12. USER INTERACTION SYSTEM (100, 200) FOR THE CONTROL OF A LIGHTING SYSTEM FOR THE LIGHTING OF AN ENVIRONMENT, which user interaction system (100, 200) comprises: a display (106) for displaying a sub-area image (112) and an overlay image (113), the sub-area image (112) being an image of a specific sub-area of the environment, a means (102) for receiving a location indication (118), a location identification means (104, 204) for detecting which specific location in the environment is indicated by the location indication (118), an overlay image creator (108, 214) for generating the overlay image (113) comprising the information (116, 410, 608) relating to a lighting effect that is obtainable through the lighting system at the specific location, the obtainable lighting effect being an effect of at least two controllable light sources of the lighting system, the information (116, 410, 608) relating to the obtainable lighting effect being based on a lighting system model (110, 216, 500) that represents the effects that are obtainable in the environment through controllable light sources of the lighting system, wherein the means for receiving a location indication is a user input means (102, 202) for receiving user input comprising the location indication, and a camera (208) for obtaining a camera image, wherein the display (106) is further configured to display the camera image, characterized in that - the camera (208) comprises at least one of: the means (210) for detecting a location of the camera (208), the means (224) for detecting a direction in which the camera (208) is directed, a tilt sensor (226) for detecting the tilt of the camera (208), and the means (228) for detecting the angle of view of the camera (208), and - the user input means (102, 202) is configured to base the location indication on at least one of: the detected location of the camera (208), the detected direction, the detected tilt and the detected angle of view.
13. PORTABLE ELECTRONIC DEVICE characterized by comprising the user interaction system (100, 200) as defined in any one of claims 1 to 12.
Similar technologies:
Publication number | Publication date | Patent title
BR112013008713A2|2021-05-25|User interaction system for the control of a lighting system for the lighting of an environment and portable electronic device
ES2616249T3|2017-06-12|Light source remote control
US9317959B2|2016-04-19|System and method for visualizing virtual objects on a mobile device
BRPI1008194B1|2020-10-20|lighting control device for controlling one or more lighting parameters for each of one or more light sources, lighting system, use of the lighting control device and method of a lighting control device
US11118928B2|2021-09-14|Method for providing map information and electronic device for supporting the same
US8494660B2|2013-07-23|Method and computer implemented apparatus for controlling a lighting infrastructure
JP5128489B2|2013-01-23|User interface and method for controlling a lighting system
US8155872B2|2012-04-10|Method and apparatus for indoor navigation
JPH0696603A|1994-04-08|System and method for modeling of illumination production
CN108681453A|2018-10-19|The implementation method and device of engine map
US20180197339A1|2018-07-12|Augmented reality device for visualizing luminaire fixtures
JP2010181447A5|2011-09-22|
ES2770780T3|2020-07-03|Apparatus and method of providing adjustments of a control system to implement a perceptible result spatial distribution
US10735708B2|2020-08-04|Transforming locations in a spherical image viewer
US10943109B2|2021-03-09|Electronic apparatus, method for controlling thereof and the computer readable recording medium
RU2574586C2|2016-02-10|Method and user interaction system for controlling lighting system, portable electronic device and computer programme product
JP2013003596A|2013-01-07|Information processing apparatus, program, and information processing method
JP5705688B2|2015-04-22|Lighting control apparatus, lighting control system, and lighting control method
JP2017220034A|2017-12-14|Information creation method, information creation system and program
JP6695402B2|2020-05-20|Display system and display program
JP6785471B2|2020-11-18|Setting method, setting system, and program
CN113742439A|2021-12-03|Space labeling method and device, electronic equipment and storage medium
JP2009162515A|2009-07-23|Measurement method and measuring device of illumination condition
BR112014019630B1|2021-09-21|REMOTE CONTROL UNIT TO CONTROL A SET OF LIGHT SOURCES, COMMUNICATIONS DEVICE AND METHOD TO CONTROL A SET OF LIGHT SOURCES
WO2021244918A1|2021-12-09|A method of configuring a plurality of parameters of a lighting device
Patent family:
Publication number | Publication date
US9041731B2|2015-05-26|
CN103168505A|2013-06-19|
CA2821303A1|2012-04-19|
WO2012049656A3|2012-09-13|
JP2014500573A|2014-01-09|
CN103168505B|2015-11-25|
US20130214698A1|2013-08-22|
EP2628363A2|2013-08-21|
JP5819431B2|2015-11-24|
RU2013122115A|2014-11-20|
EP2628363B1|2021-05-05|
CA2821303C|2018-01-16|
WO2012049656A2|2012-04-19|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title

KR100538948B1|2003-08-11|2005-12-27|삼성전자주식회사|Display device of portable apparatus capable of accommodately displaying image|
US7646029B2|2004-07-08|2010-01-12|Philips Solid-State Lighting Solutions, Inc.|LED package methods and systems|
CN101571266A|2005-03-23|2009-11-04|皇家飞利浦电子股份有限公司|Light mixing user interface|
CN101341462A|2005-12-22|2009-01-07|皇家飞利浦电子股份有限公司|Button arrangement for colored lighting controller|
WO2008078286A1|2006-12-22|2008-07-03|Koninklijke Philips Electronics N. V.|Method and system for automatically verifying the possibility of rendering a lighting atmosphere from an abstract description|
US8937444B2|2007-05-22|2015-01-20|Koninklijke Philips N.V.|Remote lighting control|
JP2009016941A|2007-06-29|2009-01-22|Funai Electric Co Ltd|Imaging apparatus|
EP2163138A1|2007-06-29|2010-03-17|Philips Intellectual Property & Standards GmbH|Light control system with a user interface for interactively changing settings in a lighting system and method for interactively changing settings in a lighting system with a user interface|
US20100191353A1|2007-07-05|2010-07-29|Koninklijke Philips Electronics N.V.|Apparatus and method for modifying a light scene|
RU2549185C2|2008-07-11|2015-04-20|Конинклейке Филипс Электроникс Н.В.|Method and pc-based device for control of lighting infrastructure|
JP5522747B2|2008-09-26|2014-06-18|コーニンクレッカフィリップスエヌヴェ|System and method for automatic commissioning of multiple light sources|
CN102273322A|2009-01-06|2011-12-07|皇家飞利浦电子股份有限公司|Control system for controlling one or more controllable devices sources and method for enabling such control|
CA2784339A1|2009-12-18|2011-06-23|Koninklijke Philips Electronics N.V.|Lighting tool for creating light scenes|
US10015865B2|2010-01-29|2018-07-03|Philips Lighting Holding B.V.|Interactive lighting control system and method|US10015865B2|2010-01-29|2018-07-03|Philips Lighting Holding B.V.|Interactive lighting control system and method|
EP2659739B1|2010-12-29|2018-05-23|Philips Lighting Holding B.V.|Setting up hybrid coded-light - zigbee lighting system|
US10465882B2|2011-12-14|2019-11-05|Signify Holding B.V.|Methods and apparatus for controlling lighting|
US9445480B2|2012-04-12|2016-09-13|Lg Electronics Inc.|Lighting system, lighting apparatus, and lighting control method|
US9089227B2|2012-05-01|2015-07-28|Hussmann Corporation|Portable device and method for product lighting control, product display lighting method and system, method for controlling product lighting, and -method for setting product display location lighting|
CN104541580B|2012-08-16|2018-03-02|飞利浦灯具控股公司|The system that control includes one or more controllable devices|
US10678407B2|2012-08-16|2020-06-09|Signify Holding B.V.|Controlling a system comprising one or more controllable device|
JP2014044916A|2012-08-28|2014-03-13|Panasonic Corp|Lighting control system|
ES2576498T3|2012-08-30|2016-07-07|Koninklijke Philips N.V.|Control of light source using a portable device|
JP6097963B2|2012-09-13|2017-03-22|パナソニックIpマネジメント株式会社|Lighting system|
PL2939504T3|2012-10-24|2017-12-29|Philips Lighting Holding B.V.|Assisting a user in selecting a lighting device design|
US10467670B2|2012-10-24|2019-11-05|Signify Holding B.V.|Assisting a user in selecting a lighting device design|
ES2796739T3|2012-10-24|2020-11-30|Signify Holding Bv|Generating a lighting fixture design|
WO2014064634A1|2012-10-24|2014-05-01|Koninklijke Philips N.V.|Assisting a user in selecting a lighting device design|
EP3047702A2|2013-09-16|2016-07-27|Philips Lighting Holding B.V.|Methods and apparatus for controlling lighting|
CN105050227B|2014-03-21|2019-07-26|奥斯兰姆施尔凡尼亚公司|For controlling the method and user interface of the solid-state floodlight of adjusting light beam distribution|
US10568179B2|2013-09-20|2020-02-18|Osram Sylvania Inc.|Techniques and photographical user interface for controlling solid-state luminaire with electronically adjustable light beam distribution|
US9872368B2|2014-01-10|2018-01-16|Panasonic Intellectual Property Corporation Of America|Control method for mobile device|
JP6626841B2|2014-05-05|2019-12-25|Signify Holding B.V.|Device with camera and screen|
US10134064B2|2014-06-27|2018-11-20|Ledvance Llc|Lighting audit and LED lamp retrofit|
CN104093251B|2014-07-08|2017-12-05|Sichuan Changhong Electric Co., Ltd.|Landscape lamp application method based on intelligent bulbs|
JP6242535B2|2014-07-17|2017-12-06|Philips Lighting Holding B.V.|Method for obtaining gesture area definition data for a control system based on user input|
CN104219833B|2014-07-31|2017-12-05|Sichuan Changhong Electric Co., Ltd.|Method for intelligently generating an electronic map of lamp position distribution|
JP6659666B2|2014-08-11|2020-03-04|Signify Holding B.V.|Light system interface and method|
WO2016034546A1|2014-09-01|2016-03-10|Philips Lighting Holding B.V.|Lighting system control method, computer program product, wearable computing device and lighting system kit|
DE102014225706A1|2014-12-12|2016-06-16|Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V.|Method for selectively setting a desired brightness and/or color of a specific spatial area and data processing device therefor|
CN107636729B|2015-04-22|2022-03-04|Signify Holding B.V.|Lighting plan generator|
WO2016173897A1|2015-04-28|2016-11-03|Philips Lighting Holding B.V.|Area illumination system and method|
US20160370981A1|2015-06-19|2016-12-22|Uniform Industrial Corp.|Editing device and method for edit and preview of light control pattern|
CN106406504B|2015-07-27|2019-05-07|Changzhou Wujin District Semiconductor Lighting Application Technology Research Institute|Atmosphere rendering system and method for a human-computer interaction interface|
WO2018077640A1|2016-10-27|2018-05-03|Philips Lighting Holding B.V.|A method of storing object identifiers|
CN109892011B|2016-11-01|2022-02-01|Signify Holding B.V.|Lighting system and lighting system control method|
GB2559321A|2017-01-10|2018-08-08|Hoare Lea Llp|Graphical lighting control system|
US10412802B2|2017-03-02|2019-09-10|Osram Sylvania Inc.|Luminaire with programmable light distribution|
US10420186B2|2017-05-31|2019-09-17|Nbcuniversal Media, Llc|Color tunable light with zone control|
EP3658947B1|2017-07-26|2021-04-07|Signify Holding B.V.|A system for communicating a presence of a device via a light source|
US10098204B1|2018-03-13|2018-10-09|Cisco Technology, Inc.|System to determine the placement of smart light emitters|
EP3776506A1|2018-04-09|2021-02-17|Signify Holding B.V.|Superimposing a virtual representation of a sensor and its detection zone over an image|
EP3553373A1|2018-04-13|2019-10-16|NBCUniversal Media, LLC|Digitally adjustable focused beam lighting system|
GB2581248A|2018-12-10|2020-08-12|Electronic Theatre Controls Inc|Augmented reality tools for lighting design|
DE102019102252A1|2019-01-30|2020-07-30|Trilux Gmbh & Co. Kg|Procedure for supporting the installation of a sensor or a luminaire in lighting systems|
US11047560B2|2019-05-29|2021-06-29|Nbcuniversal Media, Llc|Light emitting diode cooling systems and methods|
Legal status:
2021-06-01| B08F| Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette]|Free format text: Referring to the 9th annuity. |
2021-06-22| B25D| Requested change of name of applicant approved|Owner name: KONINKLIJKE PHILIPS N.V. (NL) |
2021-07-13| B25G| Requested change of headquarter approved|Owner name: KONINKLIJKE PHILIPS N.V. (NL) |
2021-08-03| B25A| Requested transfer of rights approved|Owner name: PHILIPS LIGHTING HOLDING B.V. (NL) |
2021-09-21| B08K| Patent lapsed as no evidence of payment of the annual fee has been furnished to INPI [chapter 8.11 patent gazette]|Free format text: In view of the shelving published in RPI 2630 of 01-06-2021, and considering the absence of any response within the legal deadlines, the shelving of the patent application is to be maintained, pursuant to Article 12 of Resolution 113/2013. |
Priority:
Application number | Filing date | Patent title
EP10187697|2010-10-15|
US10187697.7|2010-10-15|
PCT/IB2011/054542|WO2012049656A2|2010-10-15|2011-10-13|A method and a user interaction system for controlling a lighting system, a portable electronic device and a computer program product|