Patents Assigned to Imerj LLC
-
Publication number: 20120084679
Abstract: Methods and devices for selectively presenting a virtual keyboard are provided. More particularly, upon receipt of instructions to launch an application, a determination can be made as to whether the application is associated with instructions to receive keyboard focus on launch. If the application is to receive keyboard focus on launch, a virtual keyboard is presented together with the newly launched application. If the application is not set to receive keyboard focus on launch, a virtual keyboard that is presented when the launch instructions are received can be dismissed.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Martin Gimpl
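The launch-time keyboard decision described in this abstract can be sketched as a small predicate; the function and field names below are hypothetical, chosen only to illustrate the flow and not drawn from the patent itself.

```python
# Hypothetical sketch of the launch-time virtual keyboard decision
# described in the abstract above; names are illustrative only.
def keyboard_after_launch(app_config, keyboard_currently_shown):
    """Return True if the virtual keyboard should be visible once the
    application has launched, False if it should be dismissed."""
    if app_config.get("keyboard_focus_on_launch", False):
        return True   # present the keyboard with the new application
    # App does not take keyboard focus: dismiss any keyboard that is up.
    return False

print(keyboard_after_launch({"keyboard_focus_on_launch": True}, False))  # True
print(keyboard_after_launch({}, True))                                   # False
```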
-
Publication number: 20120084712
Abstract: Systems and methods are provided for adjusting focus during an orientation change. A full screen window has focus before the orientation change. After the device is oriented in the landscape orientation, focus is maintained on the window. However, a configurable area associated with at least one screen that displays the full screen window is changed so that the configurable area is displayed on the screen at the top of the device.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Martin Gimpl, Rodney Wayne Schrock, Sanjiv Sirpal
-
Publication number: 20120081323
Abstract: Embodiments are described for handling the launching of applications in a multi-screen device. In embodiments, a first touch sensitive display of a first screen receives input to launch an application. In response, the application is launched. A determination is made as to whether the first touch sensitive display already has windows in its stack. If there are no windows in the stack of the first touch sensitive display, a new window of the first application is displayed on the first touch sensitive display. If there are windows in the stack, a determination is made whether a second display has windows in its stack. If not, the new window is displayed on the second display. If the second display also has windows in its stack, the new window is displayed on the first touch sensitive display.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: Imerj LLC
Inventors: Sanjiv Sirpal, Martin Gimpl, Rodney Wayne Schrock
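The placement rules in this abstract amount to a short decision procedure. A minimal sketch follows, assuming each display's window stack is a simple list; the names are hypothetical, not the patent's own.

```python
def choose_display(first_stack, second_stack):
    """Pick the display for a newly launched application's window,
    following the order described in the abstract (sketch only)."""
    if not first_stack:       # first display has no windows: use it
        return "first"
    if not second_stack:      # first is occupied, second is empty
        return "second"
    return "first"            # both occupied: fall back to the first

print(choose_display([], []))          # first
print(choose_display(["w1"], []))      # second
print(choose_display(["w1"], ["w2"]))  # first
```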
-
Publication number: 20120084698
Abstract: A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions of the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Alexander de Paz
-
Publication number: 20120084716
Abstract: Systems and methods are provided for revealing a desktop in a window stack for a multi-screen device. The window stack can change based on the revealing of a desktop. The system can receive a gesture indicating that the desktop, which was previously created in the stack, is to be revealed on the display of the device. Upon receiving the gesture, the system determines that the desktop is to occupy substantially all of a composite display that spans substantially all of the two or more touch sensitive displays of the device. Then, the system can determine that the desktop is to be associated with the composite display and change a logic data structure associated with the desktop to describe the position of the desktop at the top of the window stack.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock, Jared L. Ficklin
-
Publication number: 20120084724
Abstract: Systems and methods are provided for changing a user interface for a multi-screen device. The user interface can change based on the movement of a window. The system can receive a user interface event that modifies the display of windows in the user interface. Upon receiving the user interface event, the system determines whether a window has been covered or uncovered. If a window has been covered, the window is placed in a sleep state. If a window has been uncovered, the window is activated from the sleep state. A sleep state is a window state in which the application associated with the window does not receive user interface inputs and/or does not render the window.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Paul Edward Reeves, Ron Cassar, Nikhil Swaminathan, John Steven Visosky
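The cover/uncover behavior in this abstract can be sketched as a small state toggle; the dictionary shape and names below are assumptions made for illustration, not the patent's data structures.

```python
def handle_ui_event(window, covered):
    """Place a window in the sleep state when covered and wake it when
    uncovered, per the abstract (hypothetical sketch)."""
    if covered:
        # Sleeping windows receive no user input and are not rendered.
        window["state"] = "sleep"
    else:
        window["state"] = "active"
    return window

w = {"id": "browser", "state": "active"}
print(handle_ui_event(w, covered=True)["state"])   # sleep
print(handle_ui_event(w, covered=False)["state"])  # active
```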
-
Publication number: 20120081317
Abstract: A multi-screen user device and methods for performing a copy-paste operation using finger gestures are disclosed. A first finger gesture is used to select a display area from which data is to be copied. Subsequently, a drag finger gesture is used to identify where the data is to be pasted. The drag may extend across a non-display boundary between a first and second display screen of the multi-screen device.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared Ficklin, Denise Burton, Gregg Wygonik
-
Publication number: 20120084681
Abstract: Embodiments are described for handling the launching of applications in a multi-screen device. In embodiments, a first touch sensitive display of a first screen receives input to launch an application. In response, the application is launched and a window of the first application is displayed on the first display. A second touch sensitive display of a second screen receives input to launch a second application. In response, the second application is launched and a second window of the second application is displayed on the second display. In embodiments, when an application is launched, it displays the view of the application (whether on the first touch sensitive display or the second touch sensitive display) that was displayed when the application was last closed.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventor: Ron Cassar
-
Publication number: 20120081280
Abstract: A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, a gesture sequence is disclosed which enables a user to toggle or shift through applications that are displayed by the multi-screen user device. The gesture sequence may correspond to various rotations or partial rotations of the multi-screen user device.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Rodney Wayne Schrock, Martin Gimpl, Sanjiv Sirpal, John Steven Visosky
-
Publication number: 20120084686
Abstract: Systems and methods are provided for adjusting focus during a desktop reveal. A window has focus before the desktop is revealed. After the window is returned and the desktop hidden, focus is again placed on the window. Further, a configurable area associated with the screen that displays the window is maintained during the desktop reveal and the return of the window.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
-
Publication number: 20120084720
Abstract: The present disclosure is directed to methodologies and devices for handling maximizing and minimizing of exposé views.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Martin Gimpl
-
Publication number: 20120084727
Abstract: Embodiments are described for handling display of modal windows in a multi-screen device. In embodiments, a modal window will be launched and displayed in a display which receives the input that resulted in the display of the modal window. The other portions of a first display or second display, not displaying the modal window, are made inactive. In other embodiments, the modal window occupies only a first display and the second display remains active.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Martin Gimpl, John Steven Visosky
-
Publication number: 20120081310
Abstract: The disclosed method and device are directed to a communication device that receives, via a gesture capture region and/or a touch sensitive display, a gesture while a first touch sensitive display is displaying a first image and a second touch sensitive display is displaying a second image. In response, the first image ceases to be displayed on the first touch sensitive display and commences to be displayed on the second touch sensitive display, while the second image ceases to be displayed on the second touch sensitive display and commences to be displayed on the first touch sensitive display.
Type: Application
Filed: September 1, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventor: Rodney W. Schrock
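The gesture response described here reduces to exchanging the two displays' contents. A minimal sketch, assuming a simple state dictionary with hypothetical keys:

```python
def swap_displayed_images(state):
    """Exchange the images shown on the first and second touch
    sensitive displays in response to the gesture (sketch only)."""
    state["first"], state["second"] = state["second"], state["first"]
    return state

s = {"first": "image_a", "second": "image_b"}
print(swap_displayed_images(s))  # {'first': 'image_b', 'second': 'image_a'}
```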
-
Publication number: 20120081289
Abstract: Methods and devices for presenting a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented in a first mode using all of one of the screens. In a second mode, the virtual keyboard can be presented using portions of both of the screens. More particularly, with the screens of the device in a landscape orientation, one screen can be devoted to presenting the virtual keyboard while the other screen remains available to present other information. In a portrait orientation, the virtual keyboard can span the two screens, such that a first portion of the first screen and a first portion of the second screen operate cooperatively to present the virtual keyboard.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Robert Csiki
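The two presentation modes can be sketched as an orientation-driven layout choice; the screen and region names below are hypothetical, used only to illustrate the split described in the abstract.

```python
def keyboard_layout(orientation):
    """Return which screen regions present the virtual keyboard in each
    orientation, per the abstract (illustrative names only)."""
    if orientation == "landscape":
        # One full screen for the keyboard; the other stays free.
        return {"screen1": "keyboard", "screen2": "other content"}
    # Portrait: the keyboard spans a portion of each screen.
    return {"screen1": "content + keyboard portion",
            "screen2": "content + keyboard portion"}

print(keyboard_layout("landscape")["screen2"])  # other content
```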
-
Publication number: 20120081354
Abstract: A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. Real-time display of applications running in the mobile operating system within an environment of the desktop operating system is provided by rendering the application through an extended graphics context of the mobile operating system. Application graphics for multiple applications are rendered into separate graphics frames.
Type: Application
Filed: September 27, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Alisher Yusupov, Paul E. Reeves, Octavian Chincisan, Wuke Liu
-
Publication number: 20120081306
Abstract: The disclosed method and device are directed to navigation, by a dual display communication device, through display objects.
Type: Application
Filed: September 1, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Alexander de Paz, Paul E. Reeves, Maxim Marintchenko
-
Publication number: 20120084718
Abstract: Systems and methods are provided for opening a full screen window in a window stack for a multi-screen device. The window stack can change based on the opening of a window. The system can receive a gesture indicating that an application with a new window is to be executed or a new window is to be opened on the device. Upon receiving the gesture, the system determines that the new window is to occupy substantially all of a composite display that spans substantially all of the two screens of the device. Then, the system can determine that the full screen window is to be associated with the composite display and create a logic data structure associated with the opened window to describe the position of the opened window in the window stack.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: Imerj LLC
Inventors: Martin Gimpl, Ron Cassar, Paul Edward Reeves
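The "logic data structure" for the opened window might look like the following sketch; the field names and stack representation are assumptions for illustration, not the patent's own.

```python
def open_full_screen_window(stack, window_id):
    """Create a logic record for a newly opened full screen window and
    place it at the top of the window stack (hypothetical sketch)."""
    record = {
        "id": window_id,
        "display": "composite",  # spans both screens of the device
        "position": 0,           # top of the window stack
    }
    for entry in stack:
        entry["position"] += 1   # existing windows shift down one slot
    stack.insert(0, record)
    return stack

stack = open_full_screen_window(
    [{"id": "mail", "display": "first", "position": 0}], "video")
print(stack[0]["id"], stack[1]["position"])  # video 1
```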
-
Publication number: 20120081322
Abstract: Embodiments are described for handling focus when a gesture is input in a multi-screen device. In embodiments, a first image displayed on a first touch sensitive display of a first screen may be currently in focus. In embodiments, the gesture is a tap on a second touch sensitive display of the device. In response to the gesture, an application is launched, which displays a second image on a second display of a second screen. Focus is then changed from the first image on the first touch sensitive display to the second image on the second touch sensitive display.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock
-
Publication number: 20120081401
Abstract: Systems and methods are provided for adjusting focus during an orientation change. A window is displayed on a single display before the orientation change. After the device is oriented in the landscape orientation, the window expands into a full screen window displayed on two displays, and focus is changed to the window. In addition, a configurable area associated with at least one screen that displays the full screen window is changed so that the configurable area is displayed on the screen at the top of the device.
Type: Application
Filed: September 29, 2011
Publication date: April 5, 2012
Applicant: IMERJ LLC
Inventors: Sanjiv Sirpal, Martin Gimpl
-
Publication number: 20120084710
Abstract: A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Selected desktops and/or running applications are displayed on dual screen displays. Desktops and applications can be shifted between screens by user gestures, and/or moved off of the screens and thereby hidden. Hidden desktops and screens can be re-displayed by yet other gestures. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, providing a user with an intuitive ability to manage multiple applications/desktops running simultaneously. One user gesture launches an applications management window that provides visual indications of all of the applications and desktops running at the time, including applications/desktops displayed on the screens. Other gestures can rearrange the order of all of the applications and desktops in the window stack.
Type: Application
Filed: September 28, 2011
Publication date: April 5, 2012
Applicant: IMERJ, LLC
Inventors: Sanjiv Sirpal, Martin Gimpl, Eduardo Diego Torres Milano