SYSTEM AND METHODS FOR CONTROLLING DEVICE OPERATION AND IMAGE CAPTURE

- Jamdeo Canada Ltd.

The present disclosure relates to controlling device operation. In one embodiment, a process for controlling device operation includes detecting a command to launch image capture functionality of the device, wherein the command includes contact to a display of the device, and displaying an overlay window on the display of the device in response to the command, wherein the overlay window presents image data detected by the device. The process can also include detecting a capture command for image capture by the device, wherein the capture command relates to detection of a release of the contact to the display, and capturing image data in response to the capture command, wherein capturing includes storing captured image data by the device.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 62/183,613 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 23, 2015, and U.S. Provisional Application No. 62/184,476 titled SYSTEM AND METHODS FOR A USER INTERFACE AND DEVICE OPERATION filed on Jun. 25, 2015, the contents of which are expressly incorporated by reference in their entirety.

FIELD

The present disclosure relates to devices and operation of devices, and in particular, controlling presentation and interoperation with user interface elements and applications, such as image capture capabilities of a device.

BACKGROUND

Mobile devices and personal communication devices are generally used for multiple purposes. There exist many different ways of controlling these devices. Conventional control methods include the use of dedicated buttons, soft buttons in combination with a user interface, user interfaces with graphical elements, and the use of touch screens. As the number of operations and features on a device increases, basic operations become harder to access. In other instances, operations for accessing features are cumbersome and/or disruptive to use of a device. There exists a need for configurations that allow improved control of, and improved access to, device functions and features.

Although conventional methods for accessing applications exist for mobile devices, the conventional techniques require navigation to a feature or can interrupt the use of a device. By way of example, devices may include shortcuts or selectable icons to launch an application. These techniques are generally limited to launching an application and thus switch the device into a particular mode. With existing devices and methods, these processes and functions limit the accessibility of the device. In many cases, these older methods do not provide access to a function in a desired time. Accordingly, there exists a need for improved and different controls.

BRIEF SUMMARY OF THE EMBODIMENTS

Disclosed and claimed herein are systems, methods and devices for controlling device operation. In one embodiment, a method for controlling device operation for image capture includes detecting, by a device, a command to launch an image capture functionality of the device, wherein the command includes contact to a display of the device, and displaying, by the device, an overlay window on the display of the device in response to the command. The method also includes detecting, by the device, a capture command for image capture by the device, wherein the capture command relates to detection of a release of the contact to the display. The method also includes capturing, by the device, image data in response to the capture command.

In one embodiment, the command to launch the image capture functionality includes a touch command within a predetermined area of the display and continued contact with the display.

In one embodiment, the overlay window includes image data detected by at least one image sensor of the device during detection of the contact to the display of the device.

In one embodiment, the overlay window includes functionality for detection of an image data application of the device.

In one embodiment, the capture command includes detection of a plurality of releases relative to the display, and wherein capturing image data includes capturing a plurality of images.

In one embodiment, the method further includes detecting, by the device, a toggle command following the command to launch the image capture functionality, wherein the toggle command includes contact to the display associated with a predetermined movement.

In one embodiment, the toggle command includes switching image detection from a first image detection sensor to a second image detection sensor.

In one embodiment, the toggle command switches functionality of the image detector from image to video detection.

In one embodiment, the toggle command launches a camera application of the device.

In one embodiment, the method further includes updating the display to remove the overlay window on the display of the device in response to the capture command.

According to another embodiment, a device is provided including at least one image sensor, a display configured for presentation of a user interface and a controller configured to control the at least one image sensor and display. The controller is configured to detect a command to launch an image capture functionality of the device, wherein the command includes contact to a display of the device, and control display of an overlay window on the display in response to the command. The controller is also configured to detect a capture command for image capture by the device, wherein the capture command relates to detection of a release of the contact to the display, and control capture of image data in response to the capture command.

Other aspects, features, and techniques will be apparent to one skilled in the relevant art in view of the following detailed description of the embodiments.

BRIEF DESCRIPTION OF THE DRAWINGS

The features, objects, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings, in which like reference characters identify corresponding elements throughout, and wherein:

FIGS. 1A-1C depict graphical representations of device control according to one or more embodiments;

FIG. 2 depicts a process for device control according to one or more embodiments;

FIG. 3 depicts a graphical representation of a device according to one or more embodiments;

FIG. 4 depicts a process for device control according to one or more other embodiments;

FIG. 5 depicts a graphical representation of image capture according to one or more embodiments;

FIG. 6 depicts a graphical representation of commands to invoke an application according to one or more embodiments;

FIG. 7 depicts a graphical representation of image capture according to one or more embodiments;

FIG. 8 depicts a graphical representation of toggle operations according to one or more embodiments; and

FIG. 9 depicts a graphical representation of toggle operations according to one or more other embodiments.

DETAILED DESCRIPTION OF THE EXEMPLARY EMBODIMENTS

Overview and Terminology

One aspect of the disclosure is directed to control configurations for a device and, in particular, to controls and operations for launching functionality and applications of a device. One or more embodiments are directed towards providing control features that allow for quick access to device functions. Devices, such as personal communication devices, portable computers, media players, etc., can be configured with user interfaces including displays and in some cases touch screen capabilities. Although on screen menus, icons and elements can allow for selection of an application and/or navigation to one or more applications, there is a need for configurations to provide access and control to one or more particular features. Certain functionalities can benefit from quick access. According to one embodiment, configurations are provided that allow for quick access to functionality, such as image capture. Configuring a device to allow for access to functionality and applications, and for control of that access, improves the operation of the device by reducing the actions required to access the feature. In addition, handling of control commands as described herein solves the problem of access to applications and functions as devices provide more and more features.

Disclosed herein are methods, devices and systems for operation of a device. One aspect of the disclosure is directed to methods for device operation including presentation of an overlay window associated with functionality of the device. In one embodiment, a method is directed to presentation of a user interface of a device for at least one of control, operation and navigation of the device. Other embodiments are directed to application functions. Embodiments are directed to configuration of an application and/or user interface to provide one or more functions.

Access to control features may be especially beneficial as the number of applications and functions increases for devices. In one embodiment, systems and methods are provided for launching an image capture functionality of a device based on control inputs. Image capture relates to one exemplary embodiment for capture of image data without requiring launch of an image application. In certain embodiments, image capture relates to a fast camera feature to provide a quick method of capturing images from any application with one input command. As such, fast capture can avoid interruption of a current focus on a device.

Another aspect is to provide multiple stage interface commands to allow for inputs to a device to launch a functionality and allow for additional commands while still providing access to the functionality.

As used herein, a functionality relates to a particular function of a device, the functionality often associated with an application of the device. A functionality, such as image capture for example, may be associated with an application, such as the image capture application. However, launching a functionality does not require launching the application.

Applications relate to computer programs operating on a device. The computer programs may be part of the operating platform of the device and may be accessed by a user. Applications of the device may each be associated with a particular purpose. By way of example, the device may include applications for web browsing, communications (e.g., phone, messaging, email, etc.), capturing image data, social media, widgets, etc.

As used herein, the terms “a” or “an” shall mean one or more than one. The term “plurality” shall mean two or more than two. The term “another” is defined as a second or more. The terms “including” and/or “having” are open ended (e.g., comprising). The term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.

Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.

Exemplary Embodiments

Referring now to the figures, FIGS. 1A-1C depict graphical representations of device control according to one or more embodiments. FIGS. 1A-1C depict device 100 including a display 105. According to one embodiment, device 100 may be configured to present a user interface and/or one or more functions based on detected control commands. According to another embodiment, device 100 may be configured to provide access to one or more functions and allow for control of device 100 based on one or more control commands. FIGS. 1A-1C depict an exemplary representation for invoking a functionality based on a detected command according to one or more embodiments.

FIGS. 1A-1C depict one embodiment including invocation of a functionality, such as a swipe and hold to trigger single mode camera functionality. In certain embodiments, a camera user interface is presented as an overlay wherein release of the contact allows for image capture. The processes and configurations of device 100 may be modified based on one or more embodiments discussed herein.

FIG. 1A depicts device 100 including display 105 presenting a user interface 110. According to one embodiment, device 100 may relate to electronic devices, such as personal communication devices, media players, tablets, etc. According to another embodiment, display 105, and/or one or more controllers and elements of device 100, may be configured to detect inputs, such as contact, to display 105. By way of example, inputs to display 105 of device 100 can relate to one or more of touch commands, taps, swipes, gestures, etc.

According to one embodiment, device 100 may be configured to detect a control command based on an input to display 105. According to another embodiment, a control command for device 100 may be input based on particular contact to display 105. Contact 125 is depicted as contact along display 105 in a particular direction. According to one embodiment, contact 125 may be associated with a predetermined portion of display 105, such as area 115 associated with the lower left corner of display 105. Although shown in the lower left area of display 105, predetermined areas for contact 125 may be associated with other portions of display 105, such as corners of display 105, a top portion of display 105, a bottom bar of display 105, side bars of display 105, etc. In certain embodiments, the position of area 115 may be associated with locations where a user accesses user interface 110 and/or which are convenient for one-handed operation of device 100.

FIG. 1A depicts contact 125 as a result of a user (e.g., user finger) 120 contacting display 105. According to one embodiment, contact 125 relates to a swipe or slide command of the user across the display, wherein contact is maintained, at least substantially, with display 105 during the swipe and is then held on display 105. Contact 125 may be detected to trigger or invoke a function of device 100. As will be discussed below, momentary release of contact 125 may trigger operation of device 100. According to one embodiment, device 100 detects a control command based on contact 125 and the continued application of contact (e.g., touch) to display 105 to launch a particular functionality. FIGS. 1A-1C represent an image capture functionality; however, it is appreciated that device 100 may be configured for other functions and capabilities to be similarly invoked by contact 125.

According to one embodiment, contact 125 is a swipe along a predetermined portion of the display, but longer swipes may have a different functionality as will be described in more detail below.

FIG. 1B depicts presentation of an overlay window 130 on display 105. According to one embodiment, overlay window 130 is presented in response to contact 125. In FIG. 1B, user contact 120 is maintained. Based on contact 120, display 105 may optionally present a graphical element, such as display element 135. According to one embodiment, display element 135 may be presented as a virtual button either at a location associated with contact 120 or at a portion of display 105. In certain embodiments, display element 135 may change in appearance when contact 120 is released from display 105. In other embodiments, display element 135 may be displayed for a predetermined period of time following a command to capture image data, such that one or more repeated contacts with display element 135 can be detected and employed for capturing successive images.

According to one embodiment, image 131 in overlay window 130 relates to image data detected by device 100. Image 131 may relate to image data detected by one or more image sensors of device 100, such as forward facing or rear facing image detectors of device 100. As will be described in more detail below, image 131 may include image data from one or more image sensors of device 100 at the same time. Image 131 may be presented as a video image and may be associated with capture of still images, a series of images, and video image data.

According to one embodiment, device 100 may initiate focus of image 131 based on detection of command 125. In that fashion, image data may be quickly captured by device 100.

According to one embodiment, release of display element 135 may be detected to initiate capture of image data presented in overlay window 130. FIG. 1C depicts contact 120 released from display 105, and removed from the location of display element 135. According to one embodiment, device 100 may update the presentation of image data in overlay window 130, shown as 132 in FIG. 1C, to indicate that the image data is captured. According to one embodiment, the overlay window may be displayed for a predetermined period of time (e.g., 0.5-5 seconds) following release to allow a user to input additional capture commands by initiating contact with display element 135 one or more times. Following each release, device 100 may display overlay window 130. Following a predetermined period of time, such as a period of time without contact, device 100 is configured to remove the display of overlay window 130 and return to the previous display configuration prior to detection of command 125. In that fashion, device 100 is configured to quickly transition to an image detection interface for capture of image data and then return to the user interface presentation without being intrusive. In addition, presentation of an overlay window and capture of image data can allow for launching the image detection from any application, home screen, or user interface presentation without requiring multiple input commands. In contrast to launching a full application from a swipe, the process and configurations in FIGS. 1A-1C allow for a different process of launching the application/functionality and eliminate one or more commands required to launch, capture and close the application/functionality. Detection of contact 120 provides specific interactions that can be manipulated to yield a desired result of the device, and this interaction is not merely a routine or conventional use of a device.

FIGS. 1A-1C are discussed and described above with reference to touch input from a user. In certain embodiments, input commands may be based on contact with other touch inputs, such as a stylus, for example. It should also be appreciated that the principles represented and discussed in FIGS. 1A-1C are exemplary, and could apply to one or more other embodiments discussed herein. By way of example, although image capture presentation and control are described as one embodiment, control command 125 may be employed to launch one or more of a speed dial, payment system, chat, note taking application, voice recorder, etc.

FIG. 2 depicts a process for device control according to one or more embodiments. According to one embodiment, process 200 may be employed for controlling device operation for image capture. Process 200 may be initiated by detecting a control command at block 205, such as a command to invoke camera capture functionality. The command detected at block 205 may relate to a contact to a display (e.g., display 105) of a device (e.g., device 100). In one embodiment, the command to launch the image capture functionality includes a touch command within a predetermined area of the display and continued contact with the display. According to one embodiment, the command detected at block 205 relates to a swipe or slide relative to the display. The command may be an off-screen to on-screen swipe. In certain embodiments, the command relates to contact and a swipe along the surface of the display, wherein the contact is held to the display following the swipe. By retaining contact, the input allows for additional commands to be detected following the initial contact and swipe. With respect to image functionality, a capture command may follow the command to invoke the camera. With further respect to image functionality, a toggle command may follow the input to control or modify the functionality available to a user.

At block 210 an overlay window (e.g., overlay window 130) is displayed on the display of the device in response to the command detected at block 205. According to one embodiment, the overlay window includes graphical elements associated with the functionality invoked. By way of example, when the functionality associated with the command relates to image capture, an overlay window may be presented to include image data detected by the device (e.g., display window for camera). As will be discussed herein, the overlay window can include one or more elements based on the command, functionality, and one or more toggle commands.

In certain embodiments, display of an overlay window at block 210 includes presentation of a window on all or part of the display. The overlay window can be presented with some or all functions of an application. By way of example, when the overlay window is associated with an image capture function, and the device includes an image capture application, the overlay window may be presented to allow for some or all of the functions of the image application to be accessed. In certain embodiments, a command may be entered to launch the full application. Presentation of the overlay window may be associated with a temporary period of time. According to one embodiment, the overlay window includes image data detected by at least one image sensor of the device during detection of the contact to the display of the device. The overlay window can include functionality for detection of an image data application of the device.

At block 215, a capture command for image capture by the device may be detected. In certain embodiments, the capture command relates to detection of a release of the contact to the display. In other embodiments, the capture command may relate to a release followed by a subsequent tap and hold to the display, wherein the tap and hold may be associated with a graphical element displayed following detection of the command at block 205 and the overlay window at block 210.

At block 220, process 200 includes capturing, by the device, image data in response to the capture command. Image capture may be associated with the image window display in the overlay window. In certain embodiments, a capture command relates to a release of the contact with the display that is associated with invoking the camera functionality. In other embodiments, the capture command can include release and re-establishment of contact with the display.

Following the capture of image data at block 220, process 200 may optionally include removal of the overlay window at block 225. Removal of the overlay window at block 225 may occur following capture of an image and/or a predetermined period of time that contact is removed from the display. In one embodiment, updating the display at block 225 to remove the overlay window on the display of the device is in response to the capture command detected at block 215.

At block 230, process 200 may optionally include detecting one or more additional capture commands, such as a plurality of releases relative to the display, wherein capturing image data includes capturing a plurality of images.
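The flow of blocks 205-230 can be summarized as a small event-driven controller: a swipe-and-hold in a predetermined area presents the overlay, a release captures and stores an image, and a quiet period dismisses the overlay. The following Kotlin sketch is illustrative only; the type and function names (TouchEvent, FastCaptureController, etc.), the swipe-length threshold, and the dismissal timeout are assumptions, not part of the disclosure.

```kotlin
import kotlin.math.hypot

// Hypothetical touch events delivered by the platform's input layer.
sealed class TouchEvent {
    data class Down(val x: Float, val y: Float, val timeMs: Long) : TouchEvent()
    data class Move(val x: Float, val y: Float, val timeMs: Long) : TouchEvent()
    data class Up(val x: Float, val y: Float, val timeMs: Long) : TouchEvent()
}

// Illustrative controller for process 200: a swipe-and-hold in the launch area presents
// the overlay, a release captures an image, and inactivity dismisses the overlay.
class FastCaptureController(
    private val inLaunchArea: (Float, Float) -> Boolean,  // predetermined area (block 205)
    private val minSwipePx: Float = 80f,                   // assumed swipe-length threshold
    private val dismissAfterMs: Long = 2_000L              // assumed quiet period before block 225
) {
    private var downX = 0f
    private var downY = 0f
    private var overlayVisible = false
    private var lastReleaseMs = 0L

    fun onTouch(e: TouchEvent) {
        when (e) {
            is TouchEvent.Down -> { downX = e.x; downY = e.y }
            is TouchEvent.Move ->
                // Blocks 205/210: contact started in the launch area and has swiped far enough.
                if (!overlayVisible && inLaunchArea(downX, downY) &&
                    hypot(e.x - downX, e.y - downY) >= minSwipePx) showOverlay()
            is TouchEvent.Up ->
                if (overlayVisible) {
                    captureImage()            // blocks 215/220: release triggers capture and storage
                    lastReleaseMs = e.timeMs
                }
        }
    }

    // Blocks 225/230: called periodically; removes the overlay once no contact has been
    // re-established for the dismissal period, otherwise further releases capture again.
    fun onTick(nowMs: Long) {
        if (overlayVisible && lastReleaseMs != 0L && nowMs - lastReleaseMs > dismissAfterMs) {
            overlayVisible = false
            println("overlay removed; previous display restored")
        }
    }

    private fun showOverlay() { overlayVisible = true; println("overlay shown with live preview") }
    private fun captureImage() { println("image captured and stored") }
}
```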

FIG. 3 depicts a graphical representation of a device according to one or more embodiments. According to one embodiment, device 300 is configured to detect a command to launch functionality of the device and control operation in response to the command. FIG. 3 depicts a representation of elements of device 300 according to one or more embodiments.

Device 300 includes controller 305, user interface 310, communications unit 315, memory 320 and image sensor module 325. Controller 305 may communicate with each of user interface 310, communications unit 315, memory 320 and image sensor module 325 by way of one or more communication links within device 300.

Device 300 can include at least one image sensor. Image sensor module 325 may be configured to detect image data including still and video image data from one or more image sensors, such as sensor 326 and sensor 327. Sensor 326 may relate to a forward facing image sensor, and similarly sensor 327 may relate to a rearward facing image sensor.

Device 300 includes controller 305 configured to control the at least one image sensor and display. According to certain embodiments, controller 305 may be configured to detect a command to launch an image capture functionality of the device. The command can include contact to display 312. In response to the command, controller 305 can control display of an overlay window on display 312. Controller 305 is also configured to detect a capture command for image capture by device 300. The capture command can relate to detection of a release of the contact to display 312. Controller 305 controls capture of image data in response to the capture command.

Controller 305 may be configured to execute code stored in memory 320 for operation of device 300 including presentation of a graphical user interface, overlay windows, graphical elements, etc. Controller 305 may include a processor and/or one or more processing elements. In one embodiment, controller 305 may include one or more of hardware, software, firmware and/or processing components in general. According to one embodiment, controller 305 may be configured to perform one or more processes described herein.

User interface 310 is depicted as including an input/output (I/O) interface 311 and display 312. According to one embodiment, commands to device 300 may be detected by display 312, such as swipe, slide, contact, touch, stylus, and touch screen commands. Commands to invoke functionality may be input relative to display 312 of user interface 310. Device 300 includes display 312 configured for presentation of a user interface and overlay windows. User interface 310 may be configured to receive one or more commands via an input/output (I/O) interface 311 which may include one or more inputs or terminals to receive user commands.

Communications unit 315 may be configured to allow for transmission and reception of data relative to device 300. Communications unit 315 may be configured for wired and/or wireless communication with one or more network elements, such as servers. Memory 320 may be configured to store data captured by device 300 and to store instructions for operation of device 300. Memory 320 may include non-transitory RAM and/or ROM memory for storing executable instructions, operating instructions and content for display.
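The relationship between controller 305, display 312, image sensor module 325 and memory 320 can be pictured as a set of narrow interfaces. The Kotlin sketch below is a hypothetical decomposition for illustration; names such as ImageSensor, OverlayDisplay and DeviceController are assumptions and do not reflect the actual implementation of device 300.

```kotlin
// Hypothetical component interfaces mirroring FIG. 3; not the actual firmware structure.
enum class Facing { FORWARD, REARWARD }

interface ImageSensor {                        // sensors 326 and 327
    val facing: Facing
    fun previewFrame(): ByteArray              // live frame shown in the overlay window
    fun capture(): ByteArray                   // still image data
}

interface OverlayDisplay {                     // display 312
    fun showOverlay(frame: ByteArray)
    fun removeOverlay()
}

interface Storage {                            // memory 320
    fun save(image: ByteArray)
}

// Controller 305: routes launch, toggle, and capture commands to the sensor and display.
class DeviceController(
    private val sensors: List<ImageSensor>,
    private val display: OverlayDisplay,
    private val storage: Storage
) {
    private var active = sensors.first()

    fun onLaunchCommand() = display.showOverlay(active.previewFrame())

    fun onToggleCommand() {                    // e.g. switch forward <-> rearward sensor
        active = sensors[(sensors.indexOf(active) + 1) % sensors.size]
        display.showOverlay(active.previewFrame())
    }

    fun onCaptureCommand() = storage.save(active.capture())   // release of contact

    fun onDismiss() = display.removeOverlay()
}
```

A caller could construct DeviceController with a forward-facing and a rearward-facing ImageSensor and drive it from the gesture-handling code, so that launch, toggle and capture commands map onto the component interactions described above.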

FIG. 4 depicts a process for device control according to one or more other embodiments. Process 400 may be employed for presentation of an overlay element (e.g., overlay element 130) associated with control of a device (e.g., device 300). Process 400 may be initiated by detecting a command to invoke an application or functionality of a device at block 405. In certain embodiments, the command may be detected while the device is in an “on” or “awake” state at block 405. In other embodiments, the command at block 405 may be detected in an “off” state. Commands may relate to off-screen to on-screen contacts, on-screen contacts and slides (e.g., swipes, etc.), and contact associated with a particular motion in one or more areas of the display (e.g., a diagonal swipe from a lower left corner to a position on the display below one third of the display screen height, etc.).

According to one embodiment, detection of a command at block 405 may be associated with presentation of a user interface by a device. Process 400 may optionally include presenting a user interface at block 406. Presentation of a user interface at block 406 can include presentation of one or more of a lock screen, home screen, desktop view, and/or presentation of one or more applications. Presentation of the user interface at block 406 may be prior to detection of a command at block 405. In certain embodiments, presentation of the user interface at block 406 may be during detection of a command at block 405.

At block 410, the device is configured to characterize the command. After the command is invoked, the finger may be moved anywhere on the screen; as long as there is contact, the overlay window will be presented. In some embodiments, movement of the input contact (e.g., finger, etc.) is tracked after the command is detected.

In certain embodiments, a control command may be input based on interactions with a display screen of a device. According to one embodiment, the length of an input command, such as swipe length, for example, can indicate the desired input. Accordingly, at block 410 the device can determine whether to launch a function for a period of time or to launch a full application. In addition, the command can be characterized at block 410 based on the location on the display at which the command is entered, the length of the command, the type of command (e.g., swipe, tap, press, hold, etc.) and the particular state of the device. According to certain embodiments, certain commands detected at block 405 may be detected in any presentation state of the device (e.g., lock screen, home screen, application presentation, etc.). According to other embodiments, the command detected at block 405 may be based on the current state, such that the command can provide a first result when the device is in a first state and a second result when the device is in a second state. As will be discussed below, input commands detected by the device may continue to be evaluated after the command is initially detected and characterized at block 410.

According to one embodiment, process 400 can include determining whether to launch an application at decision block 415 based at least in part on the command characterization at block 410. By way of example, in certain embodiments, the command detected at block 405 may relate to a command to launch an application of the device. As such, the command to launch an application may relate to a user interface shortcut similar to selection of a graphical element representative of a particular application in certain embodiments. To that end, when a command is detected to launch a full application (e.g. “YES” path out of decision block 415), process 400 can launch the full application at block 420. When a command is detected to launch a functionality (e.g. “NO” path out of decision block 415), process 400 can present an overlay at block 425. The overlay can include one or more graphical elements, image windows, and elements in general associated with the functionality for the control command. In certain embodiments, presentation of the overlay window at block 425 relates to a pop-up display during and in response to a control command. Process 400 may then monitor the control command at block 430. According to one embodiment, when the control command relates to contact with the display of the device, block 405, block 410, block 415, block 425 and block 430 may be performed while the contact is made and/or held to the display of the device. In that fashion, a device can provide access to a particular functionality associated with the command, or multiple functionalities while presenting a user interface or application without requiring navigation to the particular functionality, such as navigating to a selectable icon.
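Blocks 405-425 can be read as: characterize the detected contact, then either launch the full application or present the overlay and continue monitoring. A minimal Kotlin sketch of that dispatch, with the field names and the length threshold assumed for illustration rather than taken from the disclosure:

```kotlin
// Illustrative characterization of a detected contact (block 410); field names are assumptions.
data class ContactCommand(
    val inLaunchArea: Boolean,     // e.g. started in the lower-left corner of the display
    val swipeLengthPx: Float,      // length of the slide or swipe
    val deviceState: String        // "lock", "home", "app", ...
)

sealed class Dispatch {
    object LaunchFullApplication : Dispatch()  // block 420
    object PresentOverlay : Dispatch()         // block 425, then monitor at block 430
    object Ignore : Dispatch()
}

// Assumed threshold separating a functionality launch from a full-application launch.
fun characterize(cmd: ContactCommand, fullAppThresholdPx: Float = 400f): Dispatch = when {
    !cmd.inLaunchArea                      -> Dispatch.Ignore
    cmd.swipeLengthPx > fullAppThresholdPx -> Dispatch.LaunchFullApplication
    else                                   -> Dispatch.PresentOverlay
}
```

Under these assumed values, a short swipe from the launch area yields PresentOverlay, while an extended swipe yields LaunchFullApplication, mirroring the "YES" and "NO" paths out of decision block 415.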

According to one embodiment, control commands detected at block 405 may relate to an invocation of functionality and can be followed up with other actions to allow for control of a device. Actions following a control command can include rotational contact/movement of the input (e.g., finger, touch, stylus, etc.) relative to the display screen, pressure/additional contact force with the screen, or other predefined movement of the input. Actions may be detected following the command to launch the image capture functionality. In certain embodiments, the action following a command includes contact to the display associated with a predetermined movement. Actions following a command can include switching image detection from a first image detection sensor to a second image detection sensor. Alternatively, actions following a command can switch functionality of the image detector from image to video detection. In one embodiment, an action following a command launches a camera application of the device. Process 400 includes multiple actions associated with control commands according to one or more embodiments.

According to one embodiment, presentation of elements in an overlay window may be modified based on one or more toggle commands. By way of example, when the control command relates to launch of image capture functionality, the overlay window can include one or more displays of image data detected by the device. In certain embodiments, a toggle command can switch the number of image areas in the overlay window, change the camera or cameras from which image data is to be detected and/or change image capture settings (e.g., black and white, still to video, panoramic, etc.).

According to an exemplary embodiment, process 400 includes determining if a toggle command is entered at decision block 435. In one embodiment, a toggle command detected by process 400 relates to contact motion or movement relative to the display, such as a rotational or arc movement of the contact relative to the display. According to another embodiment, the toggle command may relate to one or more movements relative to the display in one or more directions. According to another embodiment, the toggle command may relate to detected motion of the device, such as waving the device in a circular, or at least substantially circular, movement. According to another embodiment, the toggle command may be a toss gesture of the device, such as a flick in a sideways direction of the device. One or more toggle commands may be received to change the presentation of the overlay window. In certain embodiments, the toggle command may indicate different results based on the direction of rotation, such that rotation in a clockwise direction changes the overlay presentation configuration (e.g., one camera, camera switch, two cameras, etc.). Rotation in the opposite direction may be employed to return the user interface presentation to another format, such as returning to the previously displayed configuration.

When a toggle command is detected (e.g. “YES” path out of decision block 435), process 400 can update the presentation of the overlay at block 425. When a toggle command is not detected (e.g. “NO” path out of decision block 435), process 400 continues to monitor the command at block 430.

According to one embodiment, process 400 includes determining whether to capture image data at block 440. According to one embodiment, a capture command is detected based on a release relative to the device display. By way of example, when a command relates to a slide or swipe command on a display at block 405, image capture of image data presented in the overlay window may be detected when the contact (e.g., finger, touch, stylus, etc.) is then removed from the display. When a capture command is detected (e.g. “YES” path out of decision block 440), process 400 can capture image data at block 450. When a capture command is not detected (e.g. “NO” path out of decision block 440), process 400 continues to monitor the command at block 430.

Process 400 includes capturing image data at block 450. Image capture can include capture of still images, multiple still images and video data.

According to one embodiment, process 400 includes determining whether to switch to a full application at decision block 445. According to one embodiment, a control command detected at block 405 may relate to launching a functionality, which may be associated with an application of the device, but not launching the full application. By way of example, the control command at block 405 may allow for an overlay presentation for image capture, but not the launching of the full image detection application. Not launching a full application may allow for quicker presentation and minimal interference with use of a device. In addition, the control command can allow faster access to a particular functionality. Fast access to image capture can better allow for capture of fleeting moments. However, following the launch of an overlay window, there may be a desire to launch the full application without having to navigate to the application in the user interface of the device. Accordingly, it may be determined whether a command to launch a full application is detected at decision block 445. According to one embodiment, a command to launch the full application may include an extension of the input command, such as a longer swipe or continuation of the swipe or contact across the display following the initial input of the command to invoke functionality. When a switch-to-full-application command is detected (e.g. “YES” path out of decision block 445), process 400 can launch the application at block 420. When a switch-to-full-application command is not detected (e.g. “NO” path out of decision block 445), process 400 continues to monitor the command at block 430.

According to one embodiment, presentation of an overlay window may be limited to a predetermined period of time following the command to invoke functionality and/or one or more commands detected by the device following the detected command. By way of example, following monitoring of the initial command at block 430 and/or capture of image data at block 450, process 400 may determine whether to remove the overlay and return to the previous presentation format of the device. In certain embodiments, it may be beneficial to allow presentation of the overlay window to continue for a user to capture additional or subsequent images. Process 400 may include determining whether to end the presentation of the overlay window at decision block 455. When contact is not re-established with the display of the device (e.g. “NO” path out of decision block 455), process 400 ends the display of the overlay window and thus terminates the functionality at block 460. When contact is re-established with the device (e.g. “YES” path out of decision block 455), process 400 continues to monitor the command at block 430.
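The decision blocks 430-460 amount to a loop over events observed while the overlay remains visible: a toggle updates the overlay presentation, a release captures, an extended swipe launches the full application, and prolonged inactivity dismisses the overlay. The Kotlin sketch below is one possible reading, with the event and interface names assumed for illustration:

```kotlin
// Hypothetical events observed while the overlay is visible (block 430); names are illustrative.
sealed class MonitorEvent {
    data class Toggle(val clockwise: Boolean) : MonitorEvent()    // decision block 435
    object Release : MonitorEvent()                               // capture, decision block 440
    object ExtendedSwipe : MonitorEvent()                         // switch to full app, block 445
    data class Idle(val elapsedMs: Long) : MonitorEvent()         // no re-contact, block 455
}

// Assumed overlay abstraction; a real device would back this with the camera UI.
interface OverlaySession {
    fun nextConfiguration(clockwise: Boolean)   // update overlay presentation (block 425)
    fun capture()                               // block 450
    fun launchFullApplication()                 // block 420
    fun dismiss()                               // block 460
}

fun monitor(e: MonitorEvent, overlay: OverlaySession, dismissAfterMs: Long = 2_000L) {
    when (e) {
        is MonitorEvent.Toggle     -> overlay.nextConfiguration(e.clockwise)
        MonitorEvent.Release       -> overlay.capture()
        MonitorEvent.ExtendedSwipe -> overlay.launchFullApplication()
        is MonitorEvent.Idle       -> if (e.elapsedMs > dismissAfterMs) overlay.dismiss()
    }
}
```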

FIG. 4 and process 400 are described as relating to image capture, however it should be appreciated that elements of the process may be employed similarly for other functions of a device.

FIG. 5 depicts a graphical representation of image capture according to one or more embodiments. According to one embodiment, image capture as discussed herein can relate to one or more imaging sensors of a device. As such, an overlay window and/or one or more commands associated with the device may pertain to access to a particular image sensor or combination of sensors. FIG. 5 depicts device 500 including an image sensor facing direction 505 (e.g., forward direction) and an image sensor facing direction 510 (e.g., backward/self facing image sensor). An exemplary overlay window 515 for device 500 is depicted. According to one embodiment, the presentation of elements in overlay window 515 may be in response to a command to invoke functionality. Overlay window 515 includes image data 520 associated with direction 505 and image data 530 associated with direction 510. According to one embodiment, overlay window 515 may be presented following detection of a command (e.g., command 125). According to another embodiment, overlay window 515 may be presented in response to a toggle command.

According to one embodiment, overlay window 515 can include graphical element 535 which is presented in association with capture of image data. Graphical element 535 can be presented in overlay window 515 at, or substantially collocated with, the position of a contact to device 500. The position of graphical element 535 shown in FIG. 5, between image data 520 and image data 530, is exemplary. A contact or user is not shown in FIG. 5. In certain embodiments, graphical element presentation tracks the input contact position, such as that of a touch (e.g., finger, stylus, etc.).

FIG. 6 depicts a graphical representation of commands to invoke an application according to one or more embodiments. According to one embodiment, device 600 can determine whether to launch an application or functionality based on a detected input command. According to another embodiment, device 600 can characterize the type of input to distinguish between commands with similar input characteristics. FIG. 6 depicts device 600 including display 605. Device 600 is configured to detect input command 610. Input contact command 610 can relate to contact with display 605, such as a slide or swipe associated with a particular region 620 of display 605. In certain embodiments, command 610 may be detected to launch a functionality of device 600. According to another embodiment, detection of the extension of input command 610 beyond a predetermined threshold 625, such as with an extended length 630, can result in launching a different functionality and/or launch of an application. By way of example, device 600 may detect input command 610 and launch an overlay window associated with image detection. However, when an input command 610 is detected as a slide or swipe across display 605 past threshold 625, the device may then launch a full image detection application. In that fashion, device 600 can be configured to provide quick access to two separate features of the device, which may be related, based on how the input contact command is entered. In addition, input command 610 and/or 630 can allow for bypassing one or more navigation steps to launch functionality and applications.

Threshold 625 is shown as a dashed horizontal line across display 605. In some embodiments, threshold 625 is not displayed, but rather is employed by device 600 as a frame of reference or cutoff for determining the difference between launching a functionality or launching a full application.

FIG. 7 depicts a graphical representation of image capture according to one or more embodiments. According to one embodiment, an input command may launch a functionality of the device, wherein release of contact with the display can be detected to allow for image capture. According to certain embodiments, the device may remove display of the overlay window following a capture command to return the device to a display state or configuration prior to detection of the control command. According to another embodiment, the device may be configured to allow for the overlay window and a graphical element to be displayed for at least a period of time following the release to allow for capture of multiple images. FIG. 7 depicts a graphical representation of an overlay window 700, graphical element 705, repeated contact 710, and a plurality of images 715-1 to 715-n.

According to one embodiment, a device may be configured to present graphical element 705 in response to an input command, and continue displaying graphical element 705 following a release of the input command. In that fashion, repeated contact 710, which may include taps and a combination of taps and holds, may be detected by the device such that each release is associated with capture of an image, such as the plurality of images 715-1 to 715-n. In that fashion, the device is configured to allow for an input command to launch a functionality, such as image capture, allow for multiple images to be captured from the initial command, and then dismiss the overlay and functionality following a period of time when contact is not reestablished with the display. According to one embodiment, the device may be configured to present the overlay for a period of 0.5-4 seconds before dismissing the functionality.
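One way to picture this multi-capture behavior is a session object that accumulates an image per release and reports when the dismissal window has elapsed. A hedged Kotlin sketch follows, with the class name and the specific timeout chosen for illustration from the 0.5-4 second range above:

```kotlin
// Illustrative burst-capture session begun by the initial launch command; the timeout value
// is an assumption taken from the 0.5-4 second range described above.
class BurstCaptureSession(private val dismissAfterMs: Long = 2_000L) {
    private val images = mutableListOf<ByteArray>()
    private var lastReleaseMs = 0L

    // Each release of graphical element 705 stores another image (images 715-1 ... 715-n).
    fun onRelease(nowMs: Long, frame: ByteArray) {
        images += frame
        lastReleaseMs = nowMs
    }

    // Polled by the device; true once contact has stayed away long enough to dismiss the overlay.
    fun shouldDismiss(nowMs: Long): Boolean =
        lastReleaseMs != 0L && nowMs - lastReleaseMs > dismissAfterMs

    fun capturedImages(): List<ByteArray> = images
}
```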

FIG. 8 depicts a graphical representation of toggle operations according to one or more embodiments. According to one embodiment, one or more toggle commands may be detected following the detection of an input command. FIG. 8 depicts a graphical representation of an exemplary overlay window 800, a toggle command 805 and overlay window 810 that is presented following the toggle command 805.

According to an exemplary embodiment, overlay window 800 includes image data detected relative to a self facing view for a device. Based on a toggle command 805, the device can switch image data presentation in the overlay window from data associated with detection by a self facing image sensor and update the user interface to present overlay window 810 including image data associated with a forward facing sensor. In certain embodiments, the toggle command is detected while contact is maintained with the display of the device.

According to one embodiment, toggle command 805 relates to contact motion or movement relative to the display, such as a rotational or arc movement of the contact relative to the display. According to another embodiment, the toggle command may relate to one or more movements relative to the display in one or more directions. According to another embodiment, toggle command 805 may relate to detected motion of the device, such as waving the device in a circular, or at least substantially circular, movement. According to another embodiment, toggle command 805 may be a toss gesture of the device, such as a flick in a sideways direction of the device. One or more toggle commands may be received to change the presentation of the overlay window. In certain embodiments, toggle command 805 may indicate different results based on the direction of rotation, such that rotation in a clockwise direction changes the overlay presentation configuration (e.g., one camera, camera switch, two cameras, etc.). Rotation in the opposite direction may be employed to return the user interface presentation to another format, such as returning to the previously displayed configuration.
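One reading of the direction-dependent behavior is a cycle over preview configurations, advanced by clockwise rotation and reversed by counter-clockwise rotation. The following Kotlin sketch is an interpretation for illustration; the configuration names are assumptions, not from the disclosure.

```kotlin
// Assumed preview configurations that a rotational toggle could cycle through.
enum class PreviewConfig { SELF_FACING, FORWARD_FACING, DUAL_VIEW }

// Clockwise rotation advances to the next configuration; counter-clockwise returns to the previous one.
fun togglePreview(current: PreviewConfig, clockwise: Boolean): PreviewConfig {
    val values = PreviewConfig.values()
    val step = if (clockwise) 1 else values.size - 1
    return values[(current.ordinal + step) % values.size]
}

// Example: togglePreview(PreviewConfig.SELF_FACING, clockwise = true) == PreviewConfig.FORWARD_FACING
```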

FIG. 9 depicts a graphical representation of toggle operations according to one or more other embodiments. According to one embodiment, one or more toggle commands may be detected following the detection of an input command to change and/or modify the functionality invoked by the input command. FIG. 9 depicts a graphical representation of an exemplary overlay window 900, a toggle command 905 and overlay window 910 that is presented following the toggle command 905.

According to an exemplary embodiment, overlay window 900 includes image data detected relative to a forward facing view for a device, wherein the functionality invoked for overlay window 900 relates to capture of still image data. Based on a toggle command 905, the device can switch functionality for the overlay window from still image capture to video recording in order to present overlay window 910 including image data associated with a forward facing sensor. In certain embodiments, the toggle command is detected while contact is maintained with the display of the device.
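The mode change of FIG. 9 can be expressed the same way, switching the capture behavior rather than the active sensor. A small illustrative sketch, with assumed mode names:

```kotlin
// Assumed capture modes toggled by the gesture of FIG. 9.
enum class CaptureMode { STILL_IMAGE, VIDEO }

fun toggleMode(current: CaptureMode): CaptureMode =
    if (current == CaptureMode.STILL_IMAGE) CaptureMode.VIDEO else CaptureMode.STILL_IMAGE
```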

While this disclosure has been particularly shown and described with references to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the claimed embodiments.

Claims

1. A method for controlling device operation, the method comprising:

detecting, by a device, a command to launch image capture functionality of the device, wherein the command includes contact to a display of the device;
displaying, by the device, an overlay window on the display of the device in response to the command, wherein the overlay window presents image data detected by the device;
detecting, by the device, a capture command for image capture by the device, wherein the capture command relates to detection of a release of the contact to the display; and
capturing, by the device, image data in response to the capture command; wherein capturing includes storing captured image data by the device.

2. The method of claim 1, wherein the command to launch the image capture functionality includes a touch command in a predetermined area of the display and includes continued contact with the display.

3. The method of claim 1, wherein the overlay window includes image data detected by at least one image sensor of the device during detection of the contact to the display of the device.

4. The method of claim 1, wherein the overlay window includes functionality for detection of an image data application of the device.

5. The method of claim 1, wherein the capture command includes detection of a plurality of releases relative to the display, and wherein capturing image data includes capturing a plurality of images.

6. The method of claim 1, further comprising detecting, by the device, a toggle command following the command to launch the image capture functionality, wherein the toggle command includes contact to the display associated with a predetermined movement.

7. The method of claim 6, wherein the toggle command includes switching image detection from a first image detection sensor to a second image detection sensor.

8. The method of claim 6, wherein the toggle command switches functionality of the image detector from image to video detection.

9. The method of claim 6, wherein the toggle command launches a camera application of the device.

10. The method of claim 1, further comprising updating the display to remove the overlay window on the display of the device in response to the capture command.

11. A device comprising:

at least one image sensor;
a display configured for presentation of a user interface; and
a controller configured to control the at least one image sensor and display, wherein the controller is configured to detect a command to launch image capture functionality of the device, wherein the command includes contact to the display; control display of an overlay window on the display in response to the command, wherein the overlay window presents image data detected by the at least one sensor; detect a capture command for image capture by the device, wherein the capture command relates to detection of a release of the contact to the display; and control capture of image data in response to the capture command.

12. The device of claim 11, wherein the command to launch the image capture functionality includes a touch command in a predetermined area of the display and includes continued contact with the display.

13. The device of claim 11, wherein the overlay window includes image data detected by at least one image sensor of the device during detection of the contact to the display of the device.

14. The device of claim 11, wherein the overlay window includes functionality for detection of an image data application of the device.

15. The device of claim 11, wherein the capture command includes detection of a plurality of releases relative to the display, and wherein capturing image data includes capturing a plurality of images.

16. The device of claim 11, further comprising detecting, by the device, a toggle command following the command to launch the image capture functionality, wherein the toggle command includes contact to the display associated with a predetermined movement.

17. The device of claim 16, wherein the toggle command includes switching image detection from a first image detection sensor to a second image detection sensor.

18. The device of claim 16, wherein the toggle command switches functionality of the image detector from image to video detection.

19. The device of claim 16, wherein the toggle command launches a camera application of the device.

20. The device of claim 11, further comprising updating the display to remove the overlay window on the display of the device in response to the capture command.

Patent History
Publication number: 20160381287
Type: Application
Filed: Apr 20, 2016
Publication Date: Dec 29, 2016
Applicants: Jamdeo Canada Ltd. (Oakville), Hisense Electric Co., Ltd. (Qingdao), Hisense USA CORP. (Suwanee, GA), Hisense International Co., Ltd. (Qingdao)
Inventors: Sanjiv SIRPAL (Oakville), Mohammed Selim (Oakville), Salvador SOTO (Toronto), Stephen EDWARDS (Hamilton)
Application Number: 15/133,846
Classifications
International Classification: H04N 5/232 (20060101); H04N 5/77 (20060101); G06F 3/01 (20060101); H04B 1/3827 (20060101);