Abstract: An intelligent television can provide various interfaces for navigating processes associated with providing content. The user interfaces include unique visual representations and organizations that allow the user to utilize the intelligent television more easily and more effectively. Particularly, the user interfaces pertain to the display of media content, electronic programming guide information, television content, and other content. Further, the user interfaces provide a unique process of transitioning between content.
Type:
Application
Filed:
September 29, 2015
Publication date:
June 9, 2016
Inventors:
Mohammed Selim, Saulo Correia Dourado, Sanjiv Sirpal, Alexander de Paz, Salvador Soto, Karina A. Limongi
Abstract: A communication device comprising a microprocessor-readable, computer-readable medium is provided that includes microprocessor-executable instructions to alter displayed information based on screen orientation.
Type:
Grant
Filed:
September 27, 2012
Date of Patent:
May 9, 2017
Assignee:
Z124
Inventors:
Sanjiv Sirpal, Mohammed Selim, Lucilla Madamba, Jennifer L. Fraser, Alexander de Paz
Abstract: An intelligent television can provide various interfaces for navigating processes associated with providing content. The user interfaces include unique visual representations and organizations that allow the user to utilize the intelligent television more easily and more effectively. Particularly, the user interfaces pertain to the display of media content, electronic programming guide information, television content, and other content. Further, the user interfaces provide a unique process of transitioning between content.
Type:
Application
Filed:
August 19, 2013
Publication date:
March 13, 2014
Applicant:
Flextronics AP, LLC
Inventors:
Alexander de Paz, Karina A. Limongi, Sanjiv Sirpal
Abstract: A multi-screen user device and methods for performing a copy-paste operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and second display screen of the multi-screen device.
Type:
Application
Filed:
September 28, 2011
Publication date:
April 5, 2012
Applicant:
IMERJ LLC
Inventors:
Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared Ficklin, Denise Burton, Gregg Wygonik
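The cross-boundary copy-paste flow this abstract describes can be sketched roughly as follows. This is a hypothetical illustration only: the coordinate model, screen widths, and all function names are assumptions, not drawn from the patent.

```python
# Hypothetical sketch: a selection gesture marks a source region, then a
# drag that may cross the non-display boundary between two screens picks
# the paste target. Dimensions and names are illustrative assumptions.

SCREEN_W = 800          # assumed width of each display, in pixels
BEZEL_W = 40            # assumed non-display boundary between the screens

def locate(x):
    """Map a device-wide x coordinate to (screen, local_x), skipping the bezel."""
    if x < SCREEN_W:
        return 0, x
    if x < SCREEN_W + BEZEL_W:
        return None, None            # finger is over the non-display boundary
    return 1, x - SCREEN_W - BEZEL_W

def copy_paste(select_x, drag_end_x):
    """Resolve a select gesture and a drag endpoint to copy/paste locations."""
    return {"copy_from": locate(select_x), "paste_to": locate(drag_end_x)}
```

A drag starting on the first screen and ending on the second simply resolves to two different screen indices; points over the bezel resolve to no screen at all.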
Abstract: A dual-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the dual-screen user device is conditioned upon the type of user gesture or combination of user gestures detected. The display controls described herein can correlate user inputs received in a gesture capture region to one or more display actions, which may include maximization, minimization, or reformatting instructions.
Type:
Grant
Filed:
October 27, 2014
Date of Patent:
August 14, 2018
Assignee:
Z124
Inventors:
Martin Gimpl, Alexander de Paz, Sanjiv Sirpal
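The correlation the previous abstract describes, from gestures detected in a gesture capture region to display actions such as maximization, minimization, or reformatting, can be sketched as a simple lookup. The gesture names and actions below are illustrative assumptions, not the patent's terms.

```python
# Hypothetical mapping from captured gestures (or gesture combinations)
# to display actions, as the abstract describes. All names are assumed.

DISPLAY_ACTIONS = {
    ("flick", "up"): "maximize",      # e.g. span the window across screens
    ("flick", "down"): "minimize",
    ("pinch",): "reformat",
}

def handle_gesture(*gesture):
    """Return the display action correlated with a captured gesture."""
    return DISPLAY_ACTIONS.get(tuple(gesture), "ignore")
```

Unrecognized gestures fall through to a no-op, so only mapped inputs trigger display changes.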
Abstract: A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the multi-screen user device is conditioned upon the relative position of the multiple screens. A gravity drop display feature is also disclosed in which data from a first application on a first screen is automatically displayed on a second screen when the device is rotated. Modal windows can be displayed with the gravity drop display feature in which the modal windows can either be dismissed upon execution of the gravity drop feature, or can remain being displayed.
Abstract: Systems and methods are provided for displaying a desktop for a multi-screen device in response to opening the device. The window stack can change based on the change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A window previously created in the stack can expand over the area of the two or more displays comprising the device when opened. A desktop expands to fill the display area and is displayed on the second of the displays after the device is opened.
Type:
Application
Filed:
September 29, 2011
Publication date:
April 5, 2012
Applicant:
IMERJ LLC
Inventors:
Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
Abstract: An intelligent television can provide various interfaces for navigating processes associated with providing content. The user interfaces include unique visual representations and organizations that allow the user to utilize the intelligent television more easily and more effectively. Particularly, the user interfaces pertain to the display of media content, electronic programming guide information, television content, and other content. Further, the user interfaces provide a unique process of transitioning between content.
Type:
Application
Filed:
August 19, 2013
Publication date:
February 27, 2014
Applicant:
Flextronics AP, LLC
Inventors:
Alexander de Paz, Saulo Correia Dourado, Karina A. Limongi, Sanjiv Sirpal, Mohammed Selim, Salvador Soto
Abstract: An intelligent television and methods for user interaction between the intelligent television and the user are provided. In general, a user is provided with navigation, notification, and setup options which enable one or more functions associated with the intelligent television. The presentation of options is based on input received by the intelligent television. As a user provides input to the intelligent television via a remote control or other input device, the intelligent television is configured to interpret the input and provide interactive functionality in the form of content presented on the display of the intelligent television.
Type:
Application
Filed:
August 16, 2013
Publication date:
February 20, 2014
Applicant:
Flextronics AP, LLC
Inventors:
Sanjiv Sirpal, Saulo Correia Dourado, Alexander de Paz, Mohammed Selim
Abstract: Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, a first image displayed on a first touch sensitive display of a first screen may be currently in focus. In embodiments, the gesture is a tap on a second touch sensitive display of the device. In response to the gesture, an application is launched, which displays a second image on a second display of a second screen. Focus is then changed from the first image on the first touch sensitive display to the second image on the second touch sensitive display.
Type:
Application
Filed:
October 24, 2014
Publication date:
February 12, 2015
Applicant:
Z124
Inventors:
Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
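The focus-handling flow the previous abstract describes (a tap on the second display launches an application there and moves focus from the first screen's image to the second's) can be sketched as below. The class, attribute, and application names are hypothetical, introduced only for illustration.

```python
# Hypothetical sketch of focus handling on a multi-screen device: a tap
# on the second touch-sensitive display launches an application, which
# displays its image there, and focus follows the tap. Names are assumed.

class Device:
    def __init__(self):
        self.screens = {1: "photo_viewer", 2: None}  # image shown per screen
        self.focus = 1                    # screen 1's image starts in focus

    def tap(self, screen, app):
        """Handle a tap gesture: launch app on the tapped screen if empty."""
        if self.screens[screen] is None:
            self.screens[screen] = app    # launch; its image is now displayed
        self.focus = screen               # focus changes to the tapped screen

device = Device()
device.tap(2, "browser")
```

After the tap, the second screen displays the newly launched application's image and holds focus, while the first screen's image remains displayed but unfocused.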
Abstract: Systems and methods are provided for displaying a desktop for a multi-screen device in response to opening the device. The window stack can change based on a change in the orientation of the device. The system can receive an orientation change that transitions the device from a closed state to an open state. A composite display can expand over the area of the two or more displays comprising the device when opened. A desktop expands to fill the display area and is displayed on the second of the displays after the device is opened.
Type:
Application
Filed:
June 11, 2021
Publication date:
December 2, 2021
Applicant:
Z124
Inventors:
Sanjiv Sirpal, Paul E. Reeves, Alexander de Paz, Rodney W. Schrock
Abstract: A multi-screen user device and methods for performing a copy-paste operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and second display screen of the multi-screen device.
Type:
Application
Filed:
November 30, 2018
Publication date:
March 28, 2019
Inventors:
Sanjiv Sirpal, Paul E. Reeves, Alexander de Paz, Jared L. Ficklin, Denise Burton, Gregg Wygonik
Abstract: Methods and devices for selectively presenting a user interface in a dual screen device. More particularly, the method includes providing a gallery for the dual screen device. The gallery can present one or more images in a user interface. The gallery user interface can adapt to changes in the device configuration. Further, the gallery can display images or videos in the various configurations.
Type:
Application
Filed:
February 26, 2018
Publication date:
September 6, 2018
Inventors:
Alexander de Paz, Martin Gimpl, Mohammed Selim
Abstract: A multi-screen user device and methods for performing a drag and drop operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and second display screen of the multi-screen device.
Type:
Grant
Filed:
September 28, 2011
Date of Patent:
September 3, 2013
Assignee:
Z124
Inventors:
Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared Ficklin, Denise Burton, Gregg Wygonik
Abstract: Methods and systems for presenting a user interface that includes a virtual keyboard are provided. More particularly, a virtual keyboard can be presented using one or more touch screens included in a multiple display device. The content of the virtual keyboard can be controlled in response to user input. Configurable portions of the virtual keyboard include selectable rows of virtual keys. In addition, whether selectable rows of virtual keys and/or a suggestion bar is displayed together with the standard character and control keys of the virtual keyboard can be determined in response to context or user input.
Type:
Application
Filed:
April 21, 2016
Publication date:
October 27, 2016
Inventors:
Alexander de Paz, Martin Gimpl, Rodney W. Schrock
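The configurable virtual keyboard the previous abstract describes, with selectable extra key rows and an optional suggestion bar above the standard keys, can be sketched as a layout builder. The row names and function signature below are assumptions for illustration, not the patent's terminology.

```python
# Hypothetical layout builder: a standard character/control key area,
# plus optionally selected extra key rows and a suggestion bar chosen
# in response to context or user input. All names are assumed.

def build_keyboard(extra_rows=(), show_suggestions=False):
    """Return the keyboard layout, top row first."""
    layout = []
    if show_suggestions:
        layout.append("suggestion_bar")   # word suggestions above the keys
    layout.extend(extra_rows)             # e.g. a selectable number row
    layout.append("standard_keys")        # standard character and control keys
    return layout
```

Toggling the inputs yields anything from a bare standard keyboard to one with a suggestion bar and extra rows stacked above it.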
Abstract: Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, a first image displayed on a first touch sensitive display of a first screen may be currently in focus. In embodiments, the gesture is a tap on a second touch sensitive display of the device. In response to the gesture, an application is launched, which displays a second image on a second display of a second screen. Focus is then changed from the first image on the first touch sensitive display to the second image on the second touch sensitive display.
Type:
Application
Filed:
September 29, 2011
Publication date:
April 5, 2012
Applicant:
IMERJ LLC
Inventors:
Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
Abstract: An intuitive technique for inputting user gestures into a handheld computing device is disclosed, allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon. A window stack for application windows and/or desktops can be navigated and sequentially displayed according to the window stack ordering without disturbing or changing this ordering.
Type:
Application
Filed:
September 28, 2011
Publication date:
May 3, 2012
Applicant:
IMERJ LLC
Inventors:
Sanjiv Sirpal, Brett Faulk, Paul Reeves, Alexander de Paz, Rodney Wayne Schrock, John Steven Visosky, Eric Freedman, Jared Ficklin, Denise Burton, Misty Cripps, Gregg Wygonik
Abstract: Systems and methods are provided for changing a window stack for a multi-screen device. The window stack can change based on the movement of a window. The system can receive a gesture indicating a change in the position of a window in the device. Upon receiving the gesture, the system determines a new position in the window stack for the moved window. Then, the system can determine a display associated with the moved window and change the logical data structure associated with the moved window to describe its new position in the window stack.
Type:
Grant
Filed:
September 1, 2011
Date of Patent:
May 26, 2020
Assignee:
Z124
Inventors:
Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
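The window-stack update the previous abstract describes (on a move gesture, determine the window's new stack position and associated display, then rewrite the per-window data structure) can be sketched as below. The dict fields, function name, and example window names are hypothetical.

```python
# Hypothetical sketch of the described stack change: find the moved
# window's record, update the display it is associated with, and insert
# the record at its new position in the stack. Names are assumed.

def move_window(stack, window_id, new_pos, display):
    """stack: list of per-window dicts ordered top-to-bottom; mutated in place."""
    record = next(w for w in stack if w["id"] == window_id)
    stack.remove(record)
    record["display"] = display          # display associated with the move
    stack.insert(new_pos, record)        # new position in the window stack
    return stack

stack = [{"id": "mail", "display": 1}, {"id": "maps", "display": 1}]
move_window(stack, "maps", 0, 2)
```

Here the "maps" window is raised to the top of the stack and re-associated with the second display, while every other record keeps its relative order.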
Abstract: An intuitive technique for inputting user gestures into a handheld computing device is disclosed, allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon. A window stack for application windows and/or desktops can be navigated and sequentially displayed according to the window stack ordering without disturbing or changing this ordering.
Type:
Application
Filed:
December 14, 2015
Publication date:
June 16, 2016
Inventors:
Sanjiv Sirpal, Brett B. Faulk, Paul E. Reeves, Alexander de Paz, Rodney W. Schrock, John S. Visosky, Eric Freedman, Jared L. Ficklin, Denise L. Burton, Misty Cripps, Gregg Wygonik