User interface for a video player, and method for controlling a video player that has a touch-activated screen
Patent abstract:
VIDEO PEEK A new display and user interface system for a video display device with a touch screen makes it possible to peek at, i.e., briefly see, selected video content while viewing other content. During a video peek, video from a second video source is seen by partially shifting and temporarily replacing a portion of the video currently being viewed. The selection among the other video sources can be controlled, for example, by tapping with one, two, three, or four fingers and swiping inward from any of the four edges of the video display. Furthermore, the video currently being viewed can be swapped with the video being peeked at.

Publication number: BR112014002039B1
Application number: R112014002039-6
Filing date: 2012-02-21
Publication date: 2022-02-01
Inventor: Matthew Jacob Wardenaar
Applicant: Interdigital Ce Patent Holdings
IPC main class:
Patent description:
Cross-Reference to Related Applications [001] This application claims, under 35 U.S.C. § 119(e), the benefit of the filing date of provisional patent application No. 61/515,578, filed August 5, 2011, and incorporates by reference the subject matter of provisional patent application No. 61/515,578, filed August 5, 2011. With regard to the background of the invention, this application relates to PCT/US2010/049772, which has an international filing date of September 22, 2010. Background of the Invention [002] The background of the invention relates to the general operation of a touch-activated screen on a video tablet or other display device. The description of a touch-activated screen on a video tablet or other display device found here can also be found in PCT/US2010/049772. Patents, published applications, and articles cited in the International Search Report for PCT/US2010/049772, and thus relating to the general operation of a touch-activated display, include: US2009210819A1; US20070013708A1; US20080079972A1; US20090153478A1; US20090019924A1; EP1450277A2; and SHIRAZI J: "Java Performance Tuning - Chapter 4 Object Creation". It should be understood that the elements shown in the figures may be implemented in various forms of hardware, software, or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory, and input/output interfaces. Here, the term "coupled" is defined to mean directly connected to or indirectly connected through one or more intermediate components. Such intermediate components may include both hardware- and software-based components. [003] The present description illustrates the principles of the present disclosure. 
It should thus be understood that those skilled in the art will be able to devise various arrangements which, while not explicitly described or shown here, embody the principles of the disclosure and are included within its spirit and scope. [004] All examples and conditional language recited here are intended for instructional purposes, to help the reader understand the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. [005] Furthermore, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, such equivalents are intended to include both currently known equivalents and equivalents developed in the future, that is, any elements developed that perform the same function, regardless of structure. [006] Thus, for example, those skilled in the art will understand that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Likewise, it is to be understood that any flow charts, flow diagrams, state transition diagrams, pseudo-code, and the like represent various processes which may be substantially represented on computer-readable media and thus executed by a computer or processor, whether or not such computer or processor is explicitly shown. [007] The functions of the various elements shown in the figures may be performed using dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, a single shared processor, or a plurality of individual processors, some of which may be shared. 
Furthermore, the explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read-only memory (ROM) for storing software, random access memory (RAM), and non-volatile storage. [008] Other hardware, conventional and/or custom, may also be included. Likewise, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context. [009] In the present claims, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function, including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode, or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner required by the claims. It is therefore considered that any means that can provide those functionalities are equivalent to those shown here. [010] The present disclosure provides a grid display, which is a graphical user interface view that allows the user to navigate a set of data elements in a two-dimensional space (i.e., in the x and y directions). The grid display can have a two-dimensional (2D) pattern, such as columns and rows, but it can take other forms. Navigation of the grid display can be accomplished using commands, such as gestures, to locate the desired element. 
An entry in the grid display is tapped or otherwise selected to initiate another action, such as playing related content. This interface mechanism is intended for use in media applications in which items in the grid display can be represented graphically, such as by audio album covers or video poster images. Specific embodiments describe an apparatus and method associated with optimizations for a grid display implementation, so that the number of display elements is kept to a minimum and is independent of the number of items in the complete dataset. The embodiments also address problems with navigating through the database, so that navigation can be smooth and effective with respect to the visual perception of the displayed portion. The apparatus and method can be specifically adapted for use in a content delivery network that encompasses controlled access to a large database of media content. [011] Navigation through the user interface of the present disclosure is facilitated by a mechanism for moving quickly, simply, and accurately through a display such as a television, monitor, or touch screen. In one embodiment, an input device, such as a motion-detecting remote control, is provided. In another embodiment, a remote device with a touch screen or panel is employed, with an on-screen cursor that essentially tracks the user's finger or fingers as they move across the screen of the remote control. As the user moves over the grid display, graphics representing content in the database move in response to user input, with certain graphics disappearing and new graphics appearing. In the touch screen or touch panel remote device embodiment, it should be understood that the remote device can function as a display device in its own right, or it can function simply as a navigation tool. 
In another embodiment, a conventional handheld remote control is employed that uses at least one button or input mechanism disposed on the surface of the remote control to navigate the grid display. [012] Initially, systems for delivering different types of content to the user will be described. Next, a method and a user interface for searching the content according to embodiments of the present disclosure will be detailed. [013] Referring now to Figure 1, a block diagram of an embodiment of a system 100 for delivering content to a home or end user is shown. The content originates from a video source 102, such as a movie studio or production house. The content can be supplied in at least one of two forms. One form may be broadcast content. The broadcast content is provided to the broadcast affiliate manager 104, which is typically a national broadcast service, such as the American Broadcasting Company (ABC), the National Broadcasting Company (NBC), the Columbia Broadcasting System (CBS), etc. The broadcast affiliate manager may collect and store the content and may schedule delivery of the content over a delivery network, shown as delivery network 1 (106). Delivery network 1 (106) may include satellite link transmission from a national center to one or more regional or local centers. Delivery network 1 (106) may also include local content delivery using local delivery systems such as over-the-air broadcast, satellite broadcast, or cable broadcast. The locally delivered content is provided to a receiving device 108 in the user's home, where the content will subsequently be searched by the user. It should be understood that the receiving device 108 can take many forms and may be embodied as a set-top box/digital video recorder (DVR), a gateway, a modem, etc. In addition, the receiving device 108 may act as an entry point, or gateway, to a home network system that includes additional devices configured as client devices or peer devices on the home network. 
[014] A second form of content is referred to as special content. Special content may include content delivered as premium viewing, pay-per-view, or other content not provided to the broadcast affiliate manager, such as movies, video games, or other video elements. In many cases, the special content may be content requested by the user. The special content may be delivered to a content manager 110. The content manager 110 may be a service provider, such as an Internet website, affiliated, for instance, with a content provider, broadcast service, or delivery network service. The content manager 110 may also incorporate Internet content into the delivery system. The content manager 110 may deliver the content to the user's receiving device 108 over a separate delivery network, delivery network 2 (112). Delivery network 2 (112) may include high-speed broadband Internet-type communication systems. It is important to note that content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 (112), and content from the content manager 110 may be delivered using all or parts of delivery network 1 (106). In addition, the user may also obtain content directly from the Internet via delivery network 2 (112) without necessarily having the content managed by the content manager 110. [015] Several adaptations for using the separately delivered content are possible. In one possible approach, the special content is provided as an augmentation of the broadcast content, providing alternative displays, purchase and merchandising options, enhancement material, etc. In another embodiment, the special content may completely replace some programming content provided as broadcast content. Finally, the special content may be completely separate from the broadcast content and may simply be an alternative medium that the user may choose to use. For example, the special content may be a library of movies that are not yet available as broadcast content. 
[016] The receiving device 108 can receive different types of content from one or both of delivery networks 1 and 2. The receiving device 108 processes the content and provides a separation of the content based on user preferences and commands. The receiving device 108 may also include a storage device, such as a hard drive or optical disc drive, for recording and playing back audio and video content. Further details of the operation of the receiving device 108 and of features associated with playing back stored content will be described below in relation to Figure 2. The processed content is provided to a display device 114. The display device 114 can be a conventional 2D-type display or may alternatively be an advanced 3D display. [017] A block diagram of an embodiment of a receiving device 200 is shown in Figure 2. The receiving device may operate similarly to the receiving device 108 described in Figure 1 and may be included as part of a gateway device, modem, set-top box, or other similar communication device. The device 200 shown may also be incorporated into other systems that include the display device 114 itself. In either case, several components necessary for the complete operation of the system are not shown for the sake of brevity, as they are well known to those skilled in the art. [018] In the device 200, the content is received at an input signal receiver 202. The input signal receiver 202 may be one of several known receiver circuits used to receive, demodulate, and decode signals provided over one of several possible networks, including over-the-air, cable, satellite, Ethernet, fiber, and telephone line networks. The desired input signal may be selected and retrieved at the input signal receiver 202 based on user input provided through a control interface (not shown). The decoded output signal is provided to an input stream processor 204. 
The input stream processor 204 performs the final signal selection and processing, which includes separating the video content from the audio content in the content stream. The audio content is provided to an audio processor 206 for conversion from the received format, such as a compressed audio signal, into an analog waveform signal. The analog waveform signal is provided to an audio interface 208 and further to the display device 114 or to an audio amplifier (not shown). Alternatively, the audio interface 208 may provide a digital signal to an audio output device or display device using a High-Definition Multimedia Interface (HDMI) cable or an alternative audio interface, such as via a Sony/Philips Digital Interconnect Format (SPDIF). The audio processor 206 also performs any conversion necessary for storing the audio signals. [019] The video output from the input stream processor 204 is provided to a video processor 210. The video signal can be in one of several formats. The video processor 210 provides, as needed, a conversion of the video content based on the input signal format. The video processor 210 also performs any conversion necessary for storing the video signals. [020] A storage device 212 stores audio and video content received at the input. The storage device 212 allows for later retrieval and playback of the content under the control of a controller 214 and also based on commands, such as navigation instructions like fast-forward (FF) and rewind (Rew), received from a user interface 216. The storage device 212 may be a hard disk drive, one or more large-capacity integrated electronic memories, such as static RAM (SRAM) or dynamic RAM (DRAM), or it may be an interchangeable optical disc storage system, such as a compact disc (CD) drive or digital video disc (DVD) drive. [021] The converted video signal from the video processor 210, originating either from the input or from the storage device 212, is provided to the display interface 218. 
The display interface 218 further provides the display signal to a display device of the type described above. The display interface 218 may be an analog signal interface such as red-green-blue (RGB), or it may be a digital interface such as HDMI. It should be understood that the display interface 218 will generate the various screens for presenting the search results in a three-dimensional grid, as will be described in more detail below. [022] The controller 214 is interconnected via a bus to several of the components of the device 200, including the input stream processor 204, the audio processor 206, the video processor 210, the storage device 212, and the user interface 216. The controller 214 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 214 also manages the retrieval and playback of stored content. Furthermore, as will be described below, the controller 214 performs the searching of content and the creation and adjustment of the grid display representing the content, whether stored or to be delivered over the delivery networks described above. The controller 214 is also coupled to a control memory 220 (e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, electronically programmable ROM (EPROM), or electronically erasable programmable ROM (EEPROM)) for storing information and instruction code for the controller 214. Furthermore, the implementation of the memory may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the memory may be included with other circuitry, such as portions of bus communications circuitry, within a larger circuit. 
[023] To function effectively, the user interface 216 of the present disclosure employs an input device that moves a cursor around the display. To further enhance the user experience and to facilitate the display of, and navigation around, a database such as a movie library, a touch panel device 300 may be interfaced with the receiving device 108, as shown in Figure 3(a). The touch panel device 300 allows the receiving device or set-top box to operate based on hand movements, or gestures, which are translated, through the panel, into commands for the set-top box. In one embodiment, the touch panel 300 may simply serve as a navigation tool for navigating the grid display. In other embodiments, the touch panel 300 will additionally serve as the display device, allowing the user to interact more directly with the navigation through the grid display of content. [024] Alternatively, a mouse device, a remote control with navigation features, or a gesture-based remote control may also be used, as described below. [025] The user interface control may be included in the receiving device 200 as part of the user interface 216 or as part of the controller 214. The user interface control incorporates features useful for displaying and navigating through a grid representing the content in a database, as well as for video display of the content. The user interface, and more specifically the grid element of the user interface, is built into a video media player interface that includes scripting or programming capability for graphics manipulation. The video media player and interface may be implemented in the receiving device 200 using any combination of hardware, software, or firmware. Alternatively, some portion of the control and video display operation may be included in the touch panel device 300 and may also be part of the information transmitted over the home network. 
[026] In another embodiment, the input device is a remote control with some form of motion detection, such as a gyroscope or accelerometer, which allows the user to move a cursor freely around a screen or display. Shown in Figure 3(b) is an exemplary angle-sensing handheld remote control 301. The remote control 301 includes a thumb button 302 positioned on the top side of the control 301 so as to be selectively activated by the user's thumb. Activation of the thumb button 302 will also be referred to as a "click", a command often associated with activating or launching a selected function. The control 301 also includes a trigger button 304 positioned on the underside of the control 301 so as to be selectively activated by the user's index (or "trigger") finger. Activation of the trigger button 304 will also be referred to as a "trigger", and angular movement (i.e., pitch, yaw, and/or roll) of the control 301 while the trigger is depressed will be referred to as a "trigger-drag". A trigger-drag command is often associated with the movement of a cursor, virtual cursor, or other indication of the user's interactive position on the display, such as a state change (e.g., a highlighted or outlined cell), and is commonly used to browse and select entries from the interactive display. In addition, a plurality of buttons 306 is provided for entering numbers and/or letters. In one embodiment, the plurality of buttons 306 is configured similarly to a telephone-type keypad. [027] The use of an angle-sensing handheld remote control, such as the control 301 described in Figure 3(b), provides various types of user interaction. When using an angle-sensing control, changes in yaw map to left-and-right movements, changes in pitch map to up-and-down movements, and changes in roll map to rotational movements along the control's axis. These inputs are used to define gestures, and the gestures, in turn, define specific contextual commands. 
Thus, for example, a combination of yaw and pitch can be used to define any two-dimensional motion, such as a diagonal, and a combination of yaw, pitch, and roll can be used to define any three-dimensional motion, such as a swing. A number of gestures are illustrated in Figure 3. Gestures are interpreted in context and are identified by defined movements of the control 301 while the trigger button 304 is held ("trigger-drag" movements). [028] Bump 320 is defined by a two-stroke drawing indicating pointing in one direction, either up, down, left, or right. The bump gesture is associated with specific commands in context. In a time-shifting mode, a left bump gesture 320 indicates rewind, and a right bump gesture indicates fast-forward. In other contexts, a bump gesture 320 is interpreted as incrementing a particular value in the direction designated by the bump. Check 330 is defined as the drawing of a check mark. It is similar to a downward bump gesture 320. Check is identified in context as designating a reminder or user tag, or as selecting an item or element. Circle 340 is defined as drawing a circle in either direction. It is possible that both directions could be distinguished. To avoid confusion, however, a circle is identified as a single command regardless of direction. Drag 350 is defined as an angular movement of the control (a change in pitch and/or yaw) while the trigger button 304 is held (i.e., a "trigger-drag"). The drag gesture 350 is used for navigation, speed, distance, time-shifting, rewind, and forward. Drag 350 can be used to move a cursor, a virtual cursor, or a state change, such as highlighting, outlining, or selecting on the display. Drag 350 can be in any direction and is generally used to navigate in two dimensions. In certain interfaces, however, it is preferable to modify the response to the drag command. 
In some interfaces, for example, operation in one dimension or direction is favored over other dimensions or directions, depending on the position of the virtual cursor or the direction of movement. Nod 360 is defined by two quick vertical up-and-down trigger-drag movements. Nod 360 is used to indicate "Yes" or "Accept". X-ing 370 is defined as drawing the letter "X". X-ing 370 is used for the "Delete" or "Block" commands. Wag 380 is defined by two quick horizontal back-and-forth trigger-drag movements. The wag gesture 380 is used to indicate "No" or "Cancel". [029] In addition to the traditional controls for video playback, the input device will also include a mechanism for invoking or executing at least three separate options on any selected element in the display or screen. These options will be referred to as "Additional Information", "Execute", and "Additional Search". The "Additional Information" function is used to display more information about the currently selected element. The "Execute" function, assuming it is available for the selected element, will select that element for execution, which may require a secondary UI for purchasing content, etc. The "Additional Search" function represents the mechanism that allows the user to use any element as the seed for a further advanced search that will generate a new total content set, updating the entire screen based on criteria defined by the selected element. It should be understood that these three options may be associated with predefined or new gestures on the touch panel 300, for example, or each option may be assigned to a predetermined button, for example, from the plurality of buttons 306 on the remote control 301. [030] It should be understood that at least some of the components described above with respect to Figures 1-3 will form an apparatus and/or system for generating the user interface. 
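As a minimal illustrative sketch of the gesture handling described in paragraphs [027]-[028] above, the following Python fragment maps trigger-drag angular changes to cursor movement and classifies simple stroke sequences into gesture names. The function names, the sensitivity factor, and the stroke-sequence encoding are hypothetical, not part of the patent; the disclosure only specifies that yaw maps to horizontal motion, pitch to vertical motion, and that nod, wag, and bump are recognized from stroke patterns.

```python
# Hypothetical sketch: mapping angle-sensing remote input (control 301)
# to cursor deltas and named gestures. Names and constants are
# illustrative assumptions, not from the patent.

def drag_to_cursor_delta(d_yaw, d_pitch, sensitivity=10.0):
    """Map a yaw change to horizontal motion and a pitch change to
    vertical motion, as described for trigger-drag movements."""
    # Screen y grows downward, so an upward pitch moves the cursor up.
    return (d_yaw * sensitivity, -d_pitch * sensitivity)

def classify_strokes(strokes):
    """Classify a simple stroke sequence into a gesture name."""
    if strokes == ["up", "down", "up", "down"]:
        return "nod"   # two quick vertical movements -> "Yes"/"Accept"
    if strokes == ["left", "right", "left", "right"]:
        return "wag"   # two quick horizontal movements -> "No"/"Cancel"
    if len(strokes) == 2:
        return "bump"  # two-stroke pointing gesture
    return "drag"      # default: free navigation
```

For example, a trigger-drag of one unit of yaw and half a unit of pitch would move the cursor right and up, while four alternating vertical strokes would be recognized as a nod.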
[031] Figure 4 shows a graphical flowchart of the operation of the user interface relating to the display and navigation aspects of the grid display of the present disclosure. Initially, video content 401, either from a broadcast source or from the managed special source, may be displayed, in step 402. In step 404, the main interface menu may be entered by tapping or otherwise selecting on the video screen. The main menu screen may include various user information elements 403 and may also include a portion of the display that still shows the previous video content 401. The video content may continue to play or may be paused. [032] Navigating to the content library may include using the search, browse, or recommendation button by tapping or otherwise selecting the desired button. In step 406, selecting the search or recommendation button accesses a linear display structure 405 of certain objects within the library or database and adds additional criteria to the search or recommendation feature, including actor, genre, or title criteria, to narrow the coverage of the database. The linear grid 405 may be more useful for these access functions because of the restrictions imposed on library access and the reduction in searchable content. [033] In step 408, selecting the browse function pulls up a separate two-dimensional grid display 407 of content selections. The browse function gives access to the entire library or database and places very few restrictions on navigation around the database. The grid display 407 and navigation of the library or database will be described in more detail below. An entry or selection of an element in a content view (by tapping it, for example), after it has been highlighted or enlarged in one of the preceding function operations, opens a detail screen that provides more details about the selected content entry, step 410. 
The detail screen also provides additional options for playing, renting, recording, or purchasing the content, as well as options for returning to the previous content browsing functions described above. [034] Figure 5 shows a detailed view of an embodiment of a grid display 500 that uses aspects of the present disclosure. The grid display 500 functions similarly to the grid display 407 described in Figure 4. The grid display 500 may be displayed on the display device 114 and manipulated or navigated using the touch panel 300 or another navigation device described above. The interface screen can also be displayed on the touch panel device 300 as a remote viewer, allowing the user to interact more directly with the navigation through the grid display of content. [035] The grid display 500 is made up of a number of graphic elements 502 arranged in a two-dimensional grid. The two-dimensional grid may include rows and columns or may include some other two-dimensional pattern arrangement, such as a radial or elliptical pattern around one or more center points. In one embodiment, all the elements move together as one contiguous unit. Each graphic element 502 represents a single data entry location from the library or content database, referred to as a template in the control software, which will be described below in relation to Figure 7. For example, the grid display 500 includes graphic elements representing movie posters. Grid displays showing graphic elements representing book covers, album or CD covers, or the like can also be used. The current item 504 is, in this case, highlighted by adjusting the item's appearance, such as by enlarging and centering the element in the viewing area. When an item is highlighted in response to user input, additional information relating to the specific content associated with the graphic can be provided with the graphic. 
In addition, specific content associated with the graphic, such as a movie to be shown, a game to be loaded, or a website to be launched, can be executed in response to further user input. The grid display takes advantage of the screen real estate or display area to present additional options and context in multiple dimensions. Navigation of the grid display is not restricted to a single, typically horizontal, dimension. Data in the grid, such as movie content, audio content, etc., can be arranged within the two-dimensional space either arbitrarily or explicitly. When explicitly arranged, the data or graphic elements 502 are organized according to at least one variable related to the specific content associated with the graphic element. For example, rows might represent an alphanumeric ordering, while columns might represent genres. [036] Any element 502 can consist of images representing specific content, such as a defined frame of a recorded piece of content, a network- or user-supplied image, or a library of generic elements assigned manually or automatically to the content. Any of these elements can be augmented with text, either superimposed on the element itself or displayed alongside it, and/or with additional smaller elements to indicate the type of content. For example, elements representing content that is locally stored on the receiving device, such as the receiving device 108 depicted in Figure 1, can be displayed with a small disk drive element in the bottom right corner of a larger image representing the content itself. Elements are configured to be detailed enough for the user to clearly see the type of content they represent. Elements can also be partially or completely created dynamically, such as by including content elements that are currently running on a broadcast channel. 
For example, an element can be dynamically generated (either locally or delivered over the network) from a recently broadcast video scene, then combined with a logo or some indication of the channel on which it is currently being broadcast. This would allow the user to see at a glance what is currently being shown on a large number of channels at the same time. [037] Figure 6 shows user activation of, and navigation around, the grid display using aspects of the present disclosure. Interaction with the grid display 600 shown in Figure 6 is described in connection with the touch panel device 300 shown in Figure 3(a). Gestures made by the user's hand on, or above, the touch-sensitive or capacitive panel translate into messages that are communicated to the receiving device 108 over a network, such as a home network, as described in Figure 1. The messages are translated by a controller in the touch panel device into changes processed by the set-top box or receiving device. It is important to note that the messages creating changes may be interpreted by the receiving device 108 in a manner that results in different effects within the physical data structure representing the content library (known as the model) and the display structure portions (known as the view) of the implementation. [038] As an example, upon initiating a drag movement from lower left to upper right on the touch panel, as shown in Figure 3(a), the elements of the grid 600 move such that the item at position B (element 602) moves off the display device screen toward the upper right and is replaced by the item at position A (element 604); further, the item at position C (element 606) moves to position A, as shown in Figure 6. In addition, the movements can be animated on the display for a smooth transition. Momentum effects can also be applied to enhance the physics of the view. 
For example, the speed at which the gesture is made can be translated into the distance the view is shifted across the grid view. [039] It is important to note that the touch panel interface is just one of several possible input devices that can be used for input to the device or system. For example, the use of an angle-sensing hand-held controller, as shown in Figure 3(b), provides various other types of user interaction. [040] Figure 7 shows a state control diagram for implementing a grid display that uses aspects of the present disclosure. The implementation of the grid view and interface follows a Model-View-Controller (MVC) coding structure. The Model part 702, or database library, holds (or gives access to) the entire data set and correlates virtual x/y coordinates 704 with the specific data items arranged in a two-dimensional array. The Model part 702 also tracks the currently selected item 706 within the database based on the virtual coordinates (which ideally center the selected item when rendered in the view). The Controller part 708 converts mouse messages and other messages 710 (from a remote input device, for example) into relative x/y coordinate changes that are submitted to the model, which in turn updates its virtual position 704. A View part 712 subscribes to Model 702 events and renders the grid view based on the updates. The View part 712 handles position updates and detailed item updates. [041] Additionally, the implementation may include a control interpreter for a remote input device, such as a touch panel that uses gesture interaction. Remote input device messages are communicated via a software and/or hardware interface to the grid display implementation and are interpreted by the Controller 708 for input to the Model 702. [042] A method for optimizing the display of the grid view and the interface will now be described with respect to Figures 8-13.
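For illustration only, the Model-View-Controller structure of Figure 7 might be sketched in Python roughly as follows. The class and method names, and the use of Python itself, are assumptions made for this sketch and are not taken from the referenced software code:

```python
class GridModel:
    """Model part 702: holds the data set dimensions, the virtual x/y
    coordinates 704, and the currently selected item 706."""

    def __init__(self, columns, rows, cell_w, cell_h):
        self.columns, self.rows = columns, rows
        self.cell_w, self.cell_h = cell_w, cell_h
        self.virtual_x = self.virtual_y = 0  # virtual coordinates 704
        self.listeners = []                  # views subscribed to model events

    def subscribe(self, listener):
        self.listeners.append(listener)

    def move_by(self, dx, dy):
        """Apply a relative x/y change submitted by the controller, clamped
        to the virtual space so navigation cannot pass the border."""
        self.virtual_x = min(max(self.virtual_x + dx, 0),
                             self.columns * self.cell_w - 1)
        self.virtual_y = min(max(self.virtual_y + dy, 0),
                             self.rows * self.cell_h - 1)
        for listener in self.listeners:
            listener.on_position_changed(self.virtual_x, self.virtual_y)

    @property
    def selected_item(self):
        """Selected item 706: the column/row under the virtual position."""
        return (self.virtual_x // self.cell_w, self.virtual_y // self.cell_h)


class GridController:
    """Controller part 708: converts input-device messages 710 into
    relative coordinate changes submitted to the model."""

    def __init__(self, model):
        self.model = model

    def on_drag(self, pixels_x, pixels_y):
        self.model.move_by(pixels_x, pixels_y)


class GridView:
    """View part 712: subscribes to model events and redraws on updates."""

    def __init__(self, model):
        model.subscribe(self)
        self.last_position = (model.virtual_x, model.virtual_y)

    def on_position_changed(self, x, y):
        self.last_position = (x, y)  # a real view would regenerate the grid here
```

For the 80 x 80 grid of 150 x 200-pixel elements described in connection with Figure 9, `GridModel(80, 80, 150, 200)` would, after a drag of 300 x 400 pixels from the origin, report the element at column 2, row 2 as the selected item.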
[043] Initially, in step 802, a total number of items in a graphics database is determined. The data or graphics are arranged in a two-dimensional array whose size correlates with the height and width of the individual grid elements multiplied by the number of rows and columns of data in the data set. Furthermore, it is not necessary to arrange the data set symmetrically in the horizontal and vertical dimensions. Each data item in the data set contains at least one of an image, a title, a rating, a uniform resource locator (URL), and other metadata related to a specific piece of content (in the example described above, a feature movie). [044] Figure 9 shows a data structure embodiment 900 that uses aspects of the present disclosure. Data structure 900 may include an arrangement of a series of data elements organized based on a display pattern. For example, data structure 900 can include a data set of 6400 items with the data elements arranged in an 80-column by 80-row array (the row and column dimensions do not have to be identical). [045] Data structure 900 exemplifies the two-dimensional indexing of each data element 902 as it relates the display area of a display device to the array. As an exemplary illustration of the embodiment, the virtual dimensions of a grid that contains 80 x 80 elements, each with a visual dimension of 150 x 200 pixels, would result in a virtual space of 12,000 x 16,000 pixels. Instead of loading images for all 6400 items in a plane of 12,000 x 16,000 pixels, the method of the present disclosure generates only a fraction of the total space, i.e., a first subset of graphics, step 804. This is achieved by selecting a “window” into the data set that constitutes the visible area plus an additional boundary area, providing caching sufficient to support smooth navigation to adjacent areas of the grid, step 806. Figure 10 shows a boundary area 1002 around the visible area 1004.
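As an illustration of the window selection just described, the sketch below derives the 12,000 x 16,000-pixel virtual space and the index range of the generated window. The function names, the example view size of 1280 x 720 pixels, and the one-cell border are assumptions made for this sketch; the grid dimensions come from the text:

```python
def virtual_space(columns, rows, cell_w, cell_h):
    """Total virtual dimensions of the grid, in pixels."""
    return columns * cell_w, rows * cell_h


def window_indices(virtual_x, virtual_y, view_w, view_h,
                   cell_w, cell_h, columns, rows, border=1):
    """Column/row index range covering the visible area plus a boundary of
    `border` extra cells on each side, clamped to the data set."""
    first_col = max(virtual_x // cell_w - border, 0)
    first_row = max(virtual_y // cell_h - border, 0)
    last_col = min((virtual_x + view_w - 1) // cell_w + border, columns - 1)
    last_row = min((virtual_y + view_h - 1) // cell_h + border, rows - 1)
    return first_col, first_row, last_col, last_row


# 80 x 80 elements of 150 x 200 pixels give a 12,000 x 16,000-pixel space,
# but a 1280 x 720 view near the middle generates only an 11 x 6 window of
# cells, i.e., 66 of the 6400 items.
assert virtual_space(80, 80, 150, 200) == (12000, 16000)
assert window_indices(6000, 8000, 1280, 720, 150, 200, 80, 80) == (39, 39, 49, 44)
```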
As will be described in more detail below, all elements 1001 within boundary 1002 are generated, but only the elements 1003 within area 1004 are visible. In cases where it is desirable to cache an entire screen's worth of information in every direction, to support rapid gestural movement across the entire width of the screen, the boundary area 1002 may be enlarged. [046] A loading priority can be set for the data (images, for example). Images should be loaded primarily from the center outward to the edge of boundary 1002. When the direction of motion is known, the priority of image loading is weighted in favor of elements entering the view over those leaving the view. Figure 10 also shows the visible area 1004, the boundary area 1002 and the non-generated virtual data space 1006. The generated visual elements are labeled AA (element 1001), BA (element 1005), and so on. It should be understood that graphic elements, such as graphic element 1007, in the virtual data space 1006 are not generated but are assigned to a container or store. As element 1007 enters area 1002 in response to user input, the container will be loaded and the graphic element associated with it will be loaded or generated. [047] In step 808, a part of the first subset of graphics is displayed. As shown in Figure 10, elements in the visible area 1004 are generated and visible on the display, while elements in the pre-generated boundary area 1002 are generated but not visible on the display. In step 810, the position of at least one graphic element of the first subset is adjusted toward a center point on the display in response to user input, and a second subset of graphics is displayed with the at least one graphic element at the center point, step 812. It should be understood that as elements are moved, elements in the boundary area 1002 quickly appear in the visible area 1004, since they are already generated.
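The loading priority described in paragraph [046] might be sketched as follows. This is a minimal illustration; the Manhattan-distance metric and the weighting constant are assumptions made for the sketch, not values taken from the disclosure:

```python
def load_order(cells, center_col, center_row, motion_dx, motion_dy):
    """Order cells for image loading: nearest the center first, with cells
    ahead of the direction of motion (entering the view) favored over
    cells behind it (leaving the view)."""
    def priority(cell):
        col, row = cell
        # Distance from the center: the center loads first.
        dist = abs(col - center_col) + abs(row - center_row)
        # The sign of the dot product with the motion vector tells whether
        # the cell lies ahead of (entering) or behind (leaving) the view.
        ahead = (col - center_col) * motion_dx + (row - center_row) * motion_dy
        bias = 0.5 if ahead > 0 else -0.5 if ahead < 0 else 0.0
        return dist - bias
    return sorted(cells, key=priority)
```

With the view moving to the right, for example, `load_order([(0, 0), (1, 0), (2, 0)], 1, 0, 1, 0)` loads the center cell first and then the entering (right-hand) cell ahead of the leaving (left-hand) one.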
If, for example, a user selected element CE (37, 25) and dragged it to the upper left, at least element GC (41, 23) would move into the boundary area 1002 and out of the visible area. Furthermore, at least element AG (35, 27) would move from the boundary area 1002 into the visible area 1004, the transition appearing continuous since element AG has already been generated and cached in the boundary area. [048] Special exceptions can be processed, such as when approaching a data element at an edge or corner. The grid display must be configured to prevent navigation beyond the border. The terminal position for an edge should ideally be centered in the view (and emphasized as the selected item), as shown in Figure 11. Here, element 1,1 (element 1101) is centered in the visible area 1004. Even if the user tries to select element 1,1 (element 1101) and move it towards the lower right corner of the display, the elements will remain in the same position, i.e., locked. In this way, user input will not result in a blank screen. [049] Figure 12 is a flowchart of the optimization process for displaying a portion of a larger database or library in a grid view that uses aspects of the present disclosure. The optimization involves reusing visual elements rather than allocating and deallocating them as needed. At step 1202, the method begins and proceeds to position an element based on a change, step 1204, i.e., in response to user input. In step 1206, it is determined whether the position of the element exceeds the boundary area 1002. When a display element moves outside the pre-generated boundary area 1002, the element is moved to the opposite edge of the boundary area, step 1208. In step 1210, the view element queries the model for the data relevant to its new position in virtual space. The virtual-space location is determined by the current virtual location offset by the current display coordinates.
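Steps 1206-1210 of Figure 12 might be sketched as follows, per coordinate axis. The function names and the half-open interval convention are illustrative assumptions:

```python
def recycle(position, offset, span):
    """Step 1208: when a display element drifts outside the pre-generated
    boundary area [offset, offset + span), move it to the opposite edge
    instead of destroying it and allocating a new one."""
    if position < offset:
        return position + span
    if position >= offset + span:
        return position - span
    return position


def new_data_index(virtual_origin, display_coord, cell_size):
    """Step 1210: the recycled element queries the model for the data at
    its new virtual-space position, i.e., the current virtual location
    offset by the current display coordinate."""
    return (virtual_origin + display_coord) // cell_size
```

Applying `recycle` to each axis keeps every display element inside the boundary area, so the underlying container is never deallocated; only the data index returned by `new_data_index` changes.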
[050] Figure 13 shows an illustrative embodiment of the movement of display elements in the grid view in response to a user input that shifts the window of displayed elements within the database or library. The display diagram shows how the visual elements on the top and right side move to the bottom and left side to be reinserted as the view is moved in a diagonal direction. For example, the element IB 1302 on the right shifts its physical position to IB' 1304 on the left. As part of this transition, the visual element queries the model for the data relevant to its new virtual position (in the case of IB', that would be data element 34, 22). In the case of the visual element IA 1306, there are actually two moves, one from right to left, which results in IA' 1308, and one from top to bottom, which results in the final location IA'' 1310. In this way, when an element moves beyond the boundary area 1002, it is moved to another position in the boundary area so that the underlying container or store for the data or graphic does not need to be unloaded. The container is reused; only the graphic needs to be reloaded, which is less resource-intensive than having to load a new container. [051] The grid display of the present disclosure can be used to browse several hundred or even thousands of items, such as content items like movies. Generating a visual for each element can be processor-intensive. The techniques of the present disclosure minimize the number of display elements required to generate a grid display view, while maintaining the illusion of navigating a larger virtual space of information. [052] An embodiment of software code that can be used to drive a video player with a touch-activated screen, as described above, can be found in provisional patent application No. 61/515,578 and PCT/US2010/049772.
The software code represents a means for enabling and implementing the various features of the invention taught herein, those features not having been known at the time PCT/US2010/049772 was filed. The software code is exemplary, and those skilled in the art should understand that other software code may be developed to implement the features of the invention taught herein. Therefore, the noted software code is considered to be understood by those skilled in the art and need not be repeated here. [053] An apparatus and a method for displaying and navigating through a database or library of elements representing available content have now been described. The elements can be visual elements such as album covers or movie posters. The framework arranges objects in a grid view to take advantage of both dimensions of a view, that is, the vertical dimension along with the horizontal dimension, for navigation purposes. Navigation aspects associated with the user interface include gesture-based movements translated into display changes for the cover grid. Optimizations of the view of a grid display implementation are described such that the number of display elements is kept to a minimum and independent of the number of items in the complete data set, and such that navigation through the database is smooth and efficient with respect to the visual perception of the displayed part. [054] Having described the presently preferred embodiments of an apparatus, a method and a user interface for grid navigation, which are intended to be illustrative and not limiting, it should be noted that those skilled in the art may make modifications and variations in light of the above teachings. Summary of the Invention [055] A new display and user interface system for a video display device with a touch screen makes it possible to peek at, i.e., briefly view, a second, or favored, video content while viewing a first video content.
During a video peek, video from a second video source is seen by partially shifting and temporarily replacing a portion of the video that is currently being viewed. The selection among the other video sources can be controlled, for example, by swiping with one, two, three or four fingers, or fingertips, inward from any edge of a video display. Furthermore, the video currently being viewed can be interchanged with the video being peeked at. [056] Patents, published applications, and articles of interest relating to aspects of video peeking beyond basic touch-screen operation include: US7864163B2, relating to displaying at least a part of a structured electronic document on a touch-sensitive display, in which the structured electronic document comprises a series of content boxes, and to detecting a first gesture at a location in the displayed part of the structured electronic document so that a first box of the series of boxes is determined at the location of the first gesture; the first box is then enlarged and substantially centered on the touch-screen display; US2010077455A1, relating to a method of displaying program content from a pay-television service on a display and receiving a signal to initiate a multi-panel browsing mode on the display; the method includes displaying a multi-pane view, the multi-pane view including a program content pane and a featured program pane based on program information received from a server, and additional panes included in the multi-pane view may include interactive games or other content obtainable from the pay-TV service; US2003126605A1, relating to an interactive television system designed to fill in an electronic program guide (EPG), which features an on-demand video clip preview to automatically trigger a video clip preview after the viewer browses through the grid guide of the EPG to a highlighted program title cell and remains in such highlighted cell for a predetermined delay.
The viewing process is a “No Touch Viewing” process that does not require selections by the viewer while browsing; and “INDIRECT MULTI-TOUCH INTERACTION FOR BRUSHING IN PARALLEL COORDINATES”, KOSARA, R., Univ. N. Carolina, Visualization and Data Analysis 2011, vol. 7869, SPIE-INT SOC OPTICAL ENGINEERING, relating to the use of multi-touch interaction to provide fast and fluid interaction with parallel coordinates by using a touch-sensitive trackpad instead of the screen directly, so that the user's hands do not obscure the view during interaction, and in which the user employs one, two, three, or four fingers on the trackpad to make complex selections within a data set. [057] The provisions of the invention for video peeking provide, for example, a substantial improvement over picture-in-picture (PIP) functionality, since a PIP image is so much smaller that it has less resolution, and since making the PIP image larger compromises the integrity of the primary video source, or main image. Picture-in-picture functionality also does not provide video from a secondary video source that appears to flow inward from the sides of a video display, nor the disappearance of the video from the secondary video source in a flow that appears to be the opposite. Moreover, PIP displays must be turned on and off by separate actuations of a remote control. The same shortcomings are generally true of picture-out-of-picture (POP) functionality.
[058] A user interface in accordance with the provisions of the invention comprises: a touch-activated screen; a touch-screen processor capable of detecting sliding movements across areas of the screen, including distinguishing between at least one or more of different directions of sliding movements, different lengths of sliding movements, and different widths of sliding movements; a video signal processor for selectively providing a first video image to the screen and for selectively providing at least one of a series of other video images to the screen; at least one of the series of other video images being selectively provided to the screen for a given time interval in response to a sliding movement across the screen that occurs within given ranges of directions, lengths and widths; and the other video image provided to the screen being displayed instead of a portion of the first video image. [059] The touch-screen processor is preferably capable of detecting sliding motions across areas of the screen, including distinguishing between at least two of: different directions of sliding motions, different lengths of sliding motions, and different widths of sliding motions. The touch-screen processor is also preferably capable of detecting sliding motions across areas of the screen, including distinguishing between each of the different directions of sliding motions, the different lengths of sliding motions, and the different widths of sliding motions. [060] The other video image is preferably supplied to the screen by displacing the part of the first video image in a sweeping motion that generally corresponds to the direction of the sliding motion. At the end of the time interval, the other video image that is supplied to the screen preferably recedes from view in a sweeping motion that generally corresponds to the direction opposite the direction of the sliding motion.
[061] The screen of most if not all display devices has discernible sides, and each side of the screen is a starting point for at least one, two, three or more of the sliding movements. [062] In a presently preferred embodiment taught herein, each sliding motion is characterized by at least one or more of a sliding-motion origin point, a sliding width, a sliding direction, and a sliding extent. Different combinations of the characteristics of the sliding motions can preferably result in different images from the series of other video images being selectively provided to the screen. If the screen is generally rectangular, for example, different combinations of sliding widths and sliding directions can provide the other video images selected from any one of eight different video sources. [063] Additional control may be provided in accordance with the provisions of the invention. For example, maintaining pressure on the screen at the end of a sliding motion for a given time interval may result in the first video image being replaced by the other video image. Alternatively, a sliding movement more extensive than that necessary to start displaying one of the other video images on the screen results in the first video image being replaced by that one of the other video images. [064] The provisions of the invention may also be embodied in a method for controlling a video player having a touch-activatable screen, which comprises the steps of: displaying a first video image on the screen; detecting a sliding movement through a first area of the screen; distinguishing between different possible origins of the sliding movement, different possible directions of the sliding movement and different possible widths of the sliding movement; selecting at least one of a number of other video images in response to the distinguishing step; providing the selected video image for a given time interval instead of a portion of the first video image; and, when the time interval expires, ending the supply step.
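A touch-screen processor of the kind recited above might classify a sliding movement roughly as sketched below. The averaging of the finger positions, the nearest-edge test, and the use of the finger count as the slide width are assumptions made for this illustration:

```python
def classify_swipe(start_points, end_points, screen_w, screen_h):
    """Classify a (possibly multi-finger) sliding movement.

    start_points / end_points: one (x, y) pair per finger, in screen
    coordinates (y grows downward).
    Returns (origin_edge, direction, finger_count, extent_in_pixels)."""
    n = len(start_points)
    sx = sum(p[0] for p in start_points) / n  # mean start position
    sy = sum(p[1] for p in start_points) / n
    ex = sum(p[0] for p in end_points) / n    # mean end position
    ey = sum(p[1] for p in end_points) / n
    dx, dy = ex - sx, ey - sy
    # The dominant axis of motion gives the sliding direction.
    if abs(dx) >= abs(dy):
        direction = 'right' if dx > 0 else 'left'
    else:
        direction = 'down' if dy > 0 else 'up'
    # The origin edge is the screen side nearest the start of the slide.
    distances = {'top': sy, 'bottom': screen_h - sy,
                 'left': sx, 'right': screen_w - sx}
    origin = min(distances, key=distances.get)
    # The number of fingers stands in for the width of the slide.
    extent = (dx * dx + dy * dy) ** 0.5
    return origin, direction, n, extent
```

A one-finger slide from near the top edge of a 1000 x 800 screen down to mid-screen would classify as origin `'top'`, direction `'down'`, one finger, with the extent available for distinguishing a peek from a full replacement.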
[065] With respect to distinguishing at least one range of extents of the sliding movement, the provisions of the invention preferably comprise: initiating the supply step in response to detection of a first extent of the sliding movement occurring within a first range of extents; and replacing the first video image with the selected video image in response to detection of a second extent of the sliding motion occurring within a second range of extents distinct from the first range. Alternatively, the method may comprise the step of replacing the first video image with the selected video image in response to detection of a user input other than the sliding motion. [066] In a presently preferred embodiment, the supply step also comprises the step of gradually replacing the part of the first video image by moving the selected video image in a direction that generally corresponds to the direction of the sliding motion, while the terminating step comprises the step of gradually replacing the selected video image with the portion of the first video image by moving the selected video image in a direction that is generally opposite the direction of the sliding motion. In general, the method preferably comprises the step of associating different video images from the series of other video images with different combinations of the possible origins, directions and widths of the sliding movement. Brief Description of the Drawings [067] Figures 1-3 and the accompanying description explain the general operation of a video tablet, in the context of methods and apparatus for grid navigation on the video tablet, which includes a touch-activated user interface. [068] Figures 14-17 and the accompanying description explain methods and an apparatus for implementing the “video peeking” feature described above.
In the drawings, in which like reference numbers denote similar elements in all views: Figure 1 is a block diagram of an exemplary system for delivering video content in accordance with the present disclosure; Figure 2 is a block diagram of an exemplary receiving device in accordance with the present disclosure; Figure 3(a) is a perspective view of a touch panel in accordance with the present disclosure; Figure 3(b) includes a perspective view of a wireless angle-sensing hand-held remote control and shows various exemplary gestures performed with the remote control; Figure 4 is a graphical flowchart of the operation of a user interface in accordance with an embodiment of the present disclosure; Figure 5 shows an exemplary embodiment of the user interface of the present disclosure; Figure 6 shows user triggering of and navigation through a user interface in accordance with an exemplary embodiment of the present disclosure; Figure 7 shows a state control diagram for an exemplary embodiment of a user interface in accordance with the present disclosure; Figure 8 is a flowchart of an exemplary process for optimizing the user interface in accordance with an embodiment of the present disclosure; Figure 9 shows the two-dimensional indexing of each data element of the user interface; Figure 10 shows a visible-area window and a boundary area of generated graphics for a user interface in accordance with an exemplary embodiment of the present disclosure; Figure 11 shows a view of a user interface in accordance with an exemplary embodiment of the present disclosure; Figure 12 is a flowchart of an exemplary process for optimizing the user interface in accordance with another embodiment of the present disclosure; Figure 13 shows the movement of graphics in a grid of a user interface in accordance with an exemplary embodiment of the present disclosure; Figure 14 shows a video display useful for explaining the user interface of an exemplary embodiment in accordance with the present disclosure; Figure 15 shows exemplary single-finger and multiple-finger slides of respective widths corresponding to one, two, three and four fingers in accordance with the provisions of the invention; Figures 16(a)-16(d) show several video display alternatives according to the flowchart shown in Figure 14; and Figures 17(a)-17(e) sequentially show a video peek in accordance with the present disclosure, in which, for purposes of this application, color images are shown in grayscale. [069] It should be understood that the drawings are for purposes of illustrating the concepts of the disclosure and do not necessarily represent the only possible configuration for showing and implementing the disclosure. Detailed Description [070] The provisions of the invention provide a display and user interface system for a video display device that makes it possible to take a peek at a second, and perhaps favored, video content while viewing other content, referred to herein as a video peek. Such video display devices may include, for example and without limitation, video tablets, video devices, smart phones, electronic books, portable computers, computer monitors and various television sets, which are capable of receiving video content from various sources and which can be placed close enough to the user for touch-screen activation and control. Video content can also be thought of more generally as primary and secondary videos or video sources, or as a first video source and a number of other video sources. Swipe commands on a touch-screen interface can, for example, allow the user to peek at video from one of a number of other video sources for a period of time. [071] In accordance with the provisions of the invention, video from one of the other video sources is viewed by partially shifting and temporarily replacing a portion of the video being viewed with a portion of the other, selected video.
The selection among the other video sources can be controlled, for example, by swiping with one, two, three or four fingers. The selection among the other video sources can also be controlled, for example, by swiping inward from any of the four edges of a rectangular video display. In this way, for example, the user can easily peek at video from any of sixteen other video sources, or secondary video sources, while viewing video from a primary or first video source, based on the number of fingertips with which the user swipes at the same time and based on the side of the video display from which the swipes originate. Furthermore, the display can also be programmed to allow the exchange of the peeked video from a secondary source with what was previously the video from the primary video source. This interchange can, for example and without limitation, be implemented with a swipe longer than that necessary for a peek. The video from what then becomes a secondary source can thus itself become available for peeking. [072] Video peeking is explained in more detail in conjunction with Figures 14-17. In a basic mode of video peeking, a user viewing a first video (or first video content) from a first video source can cause video from one of a number of other video sources (or a second video content) to smoothly shift in and partially and temporarily replace a part of the video being viewed with a part of the other, selected video. The content of the first video source can also be thought of as a primary video source. [073] Figure 14 is a flowchart 1400 showing the use of swiping as described above to implement and control a video peek at selected video sources on a video tablet or similarly capable device. In step 1402, the user selects a series of video sources that can be viewed in accordance with the provisions of the invention.
These video content sources can be thought of as “favorite” or “preferred” sources that the user expects to view from time to time through video peeking in accordance with the provisions of the invention, while viewing the video content of yet another source. As noted in the flowchart, the viewer can assign a first video source, denoted Video Source 1; can assign four other video sources to vertical video peeking, as Video Sources 2 to 5; and can assign four additional video sources to horizontal video peeking, as Video Sources 6 to 9. Then, when the user activates the display of Video Source 1 according to step 1404, and Video Source 1 is displayed in accordance with step 1406, the user has available a set of eight different channels or sources of video content that can be briefly viewed through video peeking without completely interrupting the display of Video Source 1. The user can then choose to implement horizontal video peeking by swiping horizontally across the screen from the left or right edge of the video display according to step 1408, or vertical video peeking by swiping vertically across the screen from the top or bottom edge of the video display according to step 1412. [074] As indicated by the blocks associated with reference number 1410, swiping horizontally with one, two, three, or four fingers initiates a video peek at video content or Video Sources 6, 7, 8, and 9, respectively. As indicated by the blocks associated with reference number 1414, swiping vertically with one, two, three, or four fingers initiates a video peek at video content or Video Sources 2, 3, 4, and 5, respectively. It should be understood that many types of video display devices have been and are offered with the ability to rotate the video content to compensate for rotation of the video display device. It is to be understood that the provisions of the invention can operate with such rotation control.
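The mapping of paragraph [074] and Figures 16(a)-16(d) amounts to a small lookup table, sketched here for illustration. The table encoding and function name are assumptions; the source numbers follow flowchart 1400:

```python
# (axis, finger_count) -> video source number, per flowchart 1400:
# vertical swipes with 1-4 fingers select Video Sources 2-5, and
# horizontal swipes with 1-4 fingers select Video Sources 6-9.
PEEK_SOURCES = {
    ('vertical', 1): 2, ('vertical', 2): 3,
    ('vertical', 3): 4, ('vertical', 4): 5,
    ('horizontal', 1): 6, ('horizontal', 2): 7,
    ('horizontal', 3): 8, ('horizontal', 4): 9,
}


def peek_source(direction, finger_count):
    """Return the video source to peek at, or None if the swipe does not
    match any assigned source."""
    axis = 'horizontal' if direction in ('left', 'right') else 'vertical'
    return PEEK_SOURCES.get((axis, finger_count))
```

The examples of Figures 16(a)-16(d) then read `peek_source('down', 1) == 2`, `peek_source('left', 2) == 7`, `peek_source('up', 3) == 4` and `peek_source('right', 4) == 9`.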
[075] In the embodiment shown in Figure 14, the number of other video content sources is four sources that can be displayed in response to left or right directional swipes and four sources that can be displayed in response to up or down directional swipes. The number of sources in Figure 14 is not limiting for purposes of the invention. In fact, for example, if left-only and right-only directional swipes and up-only and down-only directional swipes are each distinguished, any one of sixteen other video content sources can be called for video peeking. In practice, however, the number of other video content sources will be limited by certain practical and personal considerations. One such consideration is how many combinations of directions and numbers of fingers can be remembered by any given user. Another is the manual dexterity of the user. It should be understood, therefore, that with respect to flowchart 1400, the selection of fewer than eight other sources of video content is still within the scope of the provisions of the invention. Indeed, even video peeking at a single other source of video content is within the scope of the invention. [076] Figure 15 shows a video display device 1502. The display device has a top side or edge 1504, a right side or edge 1506, a bottom side or edge 1508, and a left side or edge 1510. Each of the dashed circles 1512, 1514, 1516 and 1518 corresponds to a location on the touch-activatable display from which a sliding motion can be initiated. The circles are not intended to represent an accurate size or shape. The dashed circles can be displayed as a temporary training measure, can be permanently displayed, or can remain undisplayed. The arrows are intended to indicate a sliding direction. The arrow associated with dashed circle 1512 is intended to indicate a one-finger swipe. The arrows associated with dashed circle 1514 are intended to indicate a two-finger swipe.
The arrows associated with dashed circle 1516 are intended to indicate a three-finger swipe. The arrows associated with dashed circle 1518 are intended to indicate a four-finger swipe. [077] The practical application of flowchart 1400 of Figure 14 is shown in Figures 16(a) to 16(d). With reference to flowchart 1400 of Figure 14 and Figure 16(a), a one-finger-width downward swipe will invoke a video peek from Video Source 2 into Video Source 1. With reference to flowchart 1400 of Figure 14 and Figure 16(b), a two-finger-width swipe to the left will invoke a video peek from Video Source 7 into Video Source 1. With reference to flowchart 1400 of Figure 14 and Figure 16(c), a three-finger-width upward swipe will invoke a video peek from Video Source 4 into Video Source 1. With reference to flowchart 1400 of Figure 14 and Figure 16(d), a four-finger-width swipe to the right will invoke a video peek from Video Source 9 into Video Source 1. [078] It should be noted that the association of a swipe width and a swipe direction can be made in a way that optimizes the value of the video peek. For example, Figure 16(a) shows a video peek in the downward direction from the top side or edge. It is known that a banner or identifier is typically displayed at the bottom of news- and weather-related video content. Therefore, it is advantageous for the user to select news- and weather-related video content sources to be invoked by downward swipes, so that the banner or identifier is fully visible. Likewise, in video content sources for sporting events, scores and other information are often displayed at the top of the video content or in the upper left corner of the video content. A user familiar with the display practices of the broadcasts of the user's favorite teams can likewise select swipe options that immediately provide information regarding the status of sporting events.
[079] Figures 17(a) to 17(e) are a sequence 1700 of video frames 1702, 1704, 1706, 1708 and 1710 extracted from a video clip in order to visually illustrate a video peek. In each of the video frames, the content of the first video source (Video Source 1, or primary video source) is denoted by 1712. Only Video Source 1, in this example a penguin, is displayed in frame 1702. A circle 1716 and an arrow 1718, explained in connection with Figures 15 and 16(a)-16(d), are also shown in video frame 1702. In frame 1704, and in response to a swipe motion in the direction of arrow 1718, part of the primary video 1712 can be seen to be partially displaced and temporarily replaced by part of other video content 1714, in this example a baseball game, which moves in from the right side. In frame 1706, the displacement and replacement of a portion of the primary video 1712 is complete. In video frame 1708, the left edge of the other video 1714 can be seen moving to the right and becoming smaller. In frame 1710, only the primary video 1712 is displayed again. [080] According to the presently preferred embodiments, a video peek can last for a time interval, and it is contemplated that the time interval can be adjustable. Also according to the presently preferred embodiments, the user can, for example, retain or maintain the video peek by continuing to press the touchscreen at the end of the swipe, the video peek thus continuing until the user stops pressing the screen. Further according to the presently preferred embodiments, a user peeking at a new video who wants to see the peeked video in full screen can, for example, do so by swiping across the entire screen or by pressing at the end of a swipe motion. In this regard, it is also contemplated that the extent to which a secondary video displaces and replaces the primary video can be controlled by the extent of the swipe.
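The peek lifecycle of paragraphs [079] and [080] — the secondary video sliding in over part of the primary video, its width tracking the extent of the swipe, and a full-screen swap when the swipe crosses the entire screen — can be sketched as a small class. This is an illustrative sketch under stated assumptions; the class name `VideoPeek` and its methods are hypothetical, not terms from the patent, and the timed auto-disappearance and hold-to-retain variants are omitted for brevity.

```python
class VideoPeek:
    """Hypothetical sketch of a video peek: a secondary video overlays
    part of the primary video, the overlay width tracking the swipe
    extent, and a swipe across the full screen swaps the two sources."""

    def __init__(self, primary, secondary, screen_width):
        self.primary = primary            # e.g. "Video Source 1"
        self.secondary = secondary        # source selected by the swipe
        self.screen_width = screen_width  # in the same units as the swipe
        self.committed = False            # True once the peeked video is swapped in

    def overlay_width(self, swipe_extent):
        # Paragraph [080]: the displaced portion can track the swipe extent,
        # clamped to the width of the screen.
        return min(swipe_extent, self.screen_width)

    def on_release(self, swipe_extent):
        """Return the source shown full screen after the finger lifts."""
        if swipe_extent >= self.screen_width:
            # Swipe across the entire screen: peeked video becomes primary.
            self.committed = True
            return self.secondary
        # Otherwise the peek recedes and the primary video is restored.
        return self.primary
```

A partial swipe thus only peeks (the overlay recedes on release), while a full-width swipe interchanges the currently viewed video with the peeked video, matching the behavior summarized in the abstract.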
[081] According to the provisions of the invention, the user can, for example and without limitation, advantageously view sports, news or entertainment channels and can peek at other videos in order to check scores and exciting news, or even to determine whether a commercial in an entertainment video has ended, and thus return to the primary video source.
Claims (10) [0001] 1. User interface for a video player (1502), comprising: a touch-activatable screen; a touch-screen processor that detects swipe motions across areas (1512, 1514, 1516, 1518) of the screen; a video signal processor for selectively providing a first video image sequence (1700) to the screen and for selectively providing at least one of a plurality of second video image sequences (1714) to the screen for display; CHARACTERIZED by the fact that: the touch-screen processor distinguishes between up (1516), down (1512), left (1514) and right (1518) directions of the swipe motions, and distinguishes between different widths (1410, 1414) of the swipe motions determined by the swipe motion being performed with one, two, three or four fingers, respectively; one of the plurality of second video image sequences being selectively provided to the screen in response to a particular combination of the direction and width of the swipe motion, as determined by the touch-screen processor, the second video image sequence (1714) being provided to the screen and being played back simultaneously with the first video image sequence so that the second video image sequence flows in from one side (1504, 1506, 1508, 1510) of the video player, replaces a part of the first video image sequence (1700) while a pressure associated with the swipe motion is applied to the screen, and disappears after the pressure is released. [0002] 2. User interface according to claim 1, CHARACTERIZED by the fact that the second video image sequence being supplied to the screen displaces part of said first video image sequence in a motion that generally corresponds to the direction of the swipe motion. [0003] 3. User interface according to claim 1, CHARACTERIZED by the fact that the second video image sequence being supplied to the screen recedes from view in a motion that generally corresponds to a direction opposite to the direction of the swipe motion. [0004] 4. User interface according to claim 1, CHARACTERIZED by the fact that, instead of disappearing automatically, the second video image sequence being supplied to the screen replaces the first video image sequence if the pressure exerted on the screen at the end of a swipe motion is maintained for a certain period of time. [0005] 5. User interface according to claim 1, CHARACTERIZED by the fact that, instead of disappearing automatically, a swipe motion wider than that necessary to start displaying the second video image sequence on the screen results in the second video image sequence replacing the first video image sequence. [0006] 6. User interface according to claim 1, CHARACTERIZED by the fact that a width of the swipe motions controls the extent to which the second video image sequence being supplied to the screen displaces and replaces the first video image sequence. [0007] 7. A method of controlling a video player having a touch-activatable screen, comprising the steps of: displaying a first video image sequence (1700) on the screen; detecting a swipe motion through a first area (1512, 1514, 1516, 1518) of the screen; wherein the method is CHARACTERIZED by: distinguishing between up (1516), down (1512), left (1514) and right (1518) directions of the swipe motions, and between the different possible widths (1410, 1414) of the swipe motions determined by the swipe motion being performed with one, two, three or four fingers, respectively; selecting one of a plurality of second video image sequences (1714) in response to a particular combination of the direction and width of the swipe motion, as determined by the distinguishing step, the second video image sequence (1714) being provided to the screen and being played back simultaneously with the first video image sequence (1700), so that the second video image sequence flows in from one side (1504, 1506, 1508, 1510) of the video player, replaces a portion of the first video image sequence (1700), and automatically disappears after a given time interval. [0008] 8. Method according to claim 7, CHARACTERIZED by the fact that it further comprises gradually replacing the part of the first video image sequence, with the second video image sequence being supplied to the screen moving in a direction that generally corresponds to the direction of the swipe motion. [0009] 9. Method according to claim 7, CHARACTERIZED by the fact that it further comprises receding the second video image sequence being supplied to the screen from view in a direction that is generally opposite to the direction of the swipe motion. [0010] 10. Method according to claim 7, CHARACTERIZED by the fact that it further comprises controlling, by a width of the swipe motions, the extent to which the second video image sequence displaces and replaces the first video image sequence.
Family patents:
Publication number | Publication date
KR20140044881A | 2014-04-15
BR112014002039A2 | 2017-03-01
WO2013022486A1 | 2013-02-14
JP2014529212A | 2014-10-30
EP2740264A1 | 2014-06-11
US9152235B2 | 2015-10-06
EP2740264B1 | 2016-10-19
CN103797784A | 2014-05-14
JP6050352B2 | 2016-12-21
US20140176479A1 | 2014-06-26
Legal status:
2018-12-18 | B06F | Objections, documents and/or translations needed after an examination request [chapter 6.6 patent gazette]
2019-09-03 | B25G | Requested change of headquarters approved | Owner name: THOMSON LICENSING (FR)
2019-10-01 | B25A | Requested transfer of rights approved | Owner name: INTERDIGITAL CE PATENT HOLDINGS (FR)
2020-04-07 | B06U | Preliminary requirement: requests with searches performed by other patent offices: procedure suspended [chapter 6.21 patent gazette]
2021-12-07 | B09A | Decision: intention to grant [chapter 9.1 patent gazette]
2022-02-01 | B16A | Patent or certificate of addition of invention granted [chapter 16.1 patent gazette] | Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 21/02/2012, SUBJECT TO THE LEGAL CONDITIONS.
Priority:
Application number | Filing date | Patent title
US201161515578P | 2011-08-05 |
US 61/515,578 | 2011-08-05 |
PCT/US2012/025878 (WO2013022486A1) | 2012-02-21 | Video peeking