Patents Assigned to Imerj LLC
  • Publication number: 20120081269
    Abstract: A dual-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the dual-screen user device is conditioned upon the relative position of the multiple screens and whether the data being displayed originated from a single-screen application or a multi-screen application.
    Type: Application
    Filed: August 31, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventor: Alex de Paz
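    A minimal sketch of the rule this abstract describes, assuming a simple model of device posture and application type; the enum values and decision function are illustrative assumptions, not the claimed implementation:

      // Hypothetical model: what each screen shows depends on the relative
      // position of the screens and on whether the app is single- or multi-screen.
      enum class Posture { CLOSED, OPEN_SIDE_BY_SIDE }
      enum class AppKind { SINGLE_SCREEN, MULTI_SCREEN }

      fun layoutFor(posture: Posture, kind: AppKind): Pair<String, String> = when {
          posture == Posture.CLOSED -> "app window" to "off"   // only the primary screen is active
          kind == AppKind.MULTI_SCREEN -> "app (left half)" to "app (right half)"
          else -> "app window" to "desktop"                    // single-screen app leaves the other screen free
      }

      fun main() {
          println(layoutFor(Posture.OPEN_SIDE_BY_SIDE, AppKind.MULTI_SCREEN))
      }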
  • Publication number: 20120081398
    Abstract: A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Alexander de Paz, Martin Gimpl, John Steven Visosky
  • Publication number: 20120083319
    Abstract: Embodiments are described for handling receipt of a call in a multi-screen device. In embodiments, the device may be in a closed mode in which a primary screen is being used. A message regarding the incoming call is displayed on the primary screen so that the user can decide whether to answer the call from the primary screen. If the device is being used in a closed secondary screen mode (with the user interacting with the secondary screen) when the call is received, a notice is displayed prompting the user to turn the phone around to the primary screen, where the user can decide whether to answer the call.
    Type: Application
    Filed: September 29, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, John Steven Visosky
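    A minimal sketch of the call-handling decision described above, assuming a two-screen device that reports which screen the user is currently facing; names are illustrative assumptions:

      enum class ActiveScreen { PRIMARY, SECONDARY }

      // Hypothetical rule: answer prompt on the primary screen, or a notice to
      // turn the phone around when the secondary screen is in use.
      fun onIncomingCall(active: ActiveScreen): String = when (active) {
          ActiveScreen.PRIMARY -> "show answer/decline message on the primary screen"
          ActiveScreen.SECONDARY -> "show notice: turn the phone to the primary screen to answer"
      }

      fun main() {
          println(onIncomingCall(ActiveScreen.SECONDARY))
      }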
  • Publication number: 20120081854
    Abstract: A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
  • Publication number: 20120084716
    Abstract: Systems and methods are provided for revealing a desktop in a window stack for a multi-screen device. The window stack can change based on the revealing of a desktop. The system can receive a gesture indicating that the desktop, which was previously created in the stack, is to be revealed on the display of the device. Upon receiving the gesture, the system determines that the desktop is to occupy substantially all of a composite display that spans substantially all of the two or more touch sensitive displays of the device. Then, the system can determine that the desktop is to be associated with the composite display and change a logic data structure associated with the desktop to describe the position of the desktop at the top of the window stack.
    Type: Application
    Filed: September 29, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Paul Edward Reeves, Alexander de Paz, Rodney Wayne Schrock, Jared L. Ficklin
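    A minimal sketch of a window stack with a reveal-desktop operation, as the abstract describes; the data structure and field names are assumptions made for illustration:

      data class StackEntry(val name: String, var spansCompositeDisplay: Boolean = false)

      class WindowStack {
          private val entries = ArrayDeque<StackEntry>()   // front of the deque = top of the stack

          fun push(entry: StackEntry) = entries.addFirst(entry)

          // Bring the previously created desktop entry to the top and mark it as
          // occupying the composite display that spans both touch sensitive displays.
          fun revealDesktop() {
              val desktop = entries.firstOrNull { it.name == "desktop" } ?: return
              entries.remove(desktop)
              desktop.spansCompositeDisplay = true
              entries.addFirst(desktop)
          }

          fun top(): StackEntry? = entries.firstOrNull()
      }

      fun main() {
          val stack = WindowStack()
          stack.push(StackEntry("desktop"))
          stack.push(StackEntry("browser"))
          stack.revealDesktop()
          println(stack.top())   // StackEntry(name=desktop, spansCompositeDisplay=true)
      }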
  • Publication number: 20120084680
    Abstract: An intuitive technique for inputting user gestures into a handheld computing device is disclosed, allowing a user to better manipulate different types of screen display presentations, such as desktops and application windows, when performing tasks thereon, e.g., minimization, maximization, moving between display screens, and increasing/decreasing a display thereof across multiple display screens. To manipulate an application window on a device display screen for such tasks, user gestures are input to a gesture capture area corresponding to that display screen, where the capture area is separate from the display screen itself.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Martin Gimpl, Ron Cassar, Maxim Marintchenko, Nikhil Swaminathan
  • Publication number: 20120081313
    Abstract: A multi-display device is adapted to be dockable or otherwise associatable with an additional device. In accordance with one exemplary embodiment, the multi-display device is dockable with a smartpad. The exemplary smartpad can include a screen, a touch sensitive display, a configurable area, one or more gesture capture regions, and a camera. The smartpad can also include a port adapted to receive the device. The exemplary smartpad is able to cooperate with the device such that information displayable on the device is also displayable on the smartpad. Furthermore, any one or more of the functions on the device are extendable to the smartpad, with the smartpad capable of acting as an input/output interface or extension of the device. Therefore, for example, information from one or more of the displays on the multi-screen device is displayable on the smartpad.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Martin Gimpl, John Steven Visosky, Alexander de Paz
  • Publication number: 20120081307
    Abstract: The disclosed method and device are directed to navigation, by a dual display communication device, through display objects.
    Type: Application
    Filed: September 1, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Brett B. Faulk, Paul E. Reeves, Alexander de Paz, Rodney W. Schrock, Jared Ficklin, Denise Burton, Maxim Marintchenko
  • Publication number: 20120081302
    Abstract: A dual-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the dual-screen user device is conditioned upon the type of user gesture or combination of user gestures detected. The display controls described herein can correlate user inputs received in a gesture capture region to one or more display actions, which may include maximization, minimization, or reformatting instructions.
    Type: Application
    Filed: August 31, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Martin Gimpl, Alexander de Paz, Sanjiv Sirpal
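    A minimal sketch of correlating gestures received in a gesture capture region with display actions, as described above; the gesture names and mapping are illustrative assumptions:

      enum class Gesture { PINCH, SPREAD, DRAG_UP }
      enum class DisplayAction { MAXIMIZE_ACROSS_SCREENS, MINIMIZE_TO_ONE_SCREEN, REFORMAT }

      // Hypothetical mapping from capture-region gestures to display actions.
      val gestureMap = mapOf(
          Gesture.SPREAD to DisplayAction.MAXIMIZE_ACROSS_SCREENS,
          Gesture.PINCH to DisplayAction.MINIMIZE_TO_ONE_SCREEN,
          Gesture.DRAG_UP to DisplayAction.REFORMAT
      )

      fun onCaptureRegionGesture(g: Gesture): DisplayAction? = gestureMap[g]

      fun main() {
          println(onCaptureRegionGesture(Gesture.PINCH))   // MINIMIZE_TO_ONE_SCREEN
      }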
  • Publication number: 20120084693
    Abstract: The present disclosure is directed to methodologies and devices for handling modals in a set of related windows.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Martin Gimpl
  • Publication number: 20120081268
    Abstract: A dual-screen user device and methods for launching applications from a revealed desktop onto a logically chosen screen are disclosed. Specifically, a user reveals the desktop and then launches a selected application from one of two desktops displayed on a primary and secondary screen of a device. When the application is launched, it is displayed on a specific screen depending on the input received and the logical rules determining the display output. As the application is displayed on the specific screen, the desktop is removed from display and the opposite screen can display other data.
    Type: Application
    Filed: August 31, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Martin Gimpl
  • Publication number: 20120084791
    Abstract: A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a secondary terminal environment. Applications of the desktop operating system communicate with applications and services of the mobile operating system through a cross-environment communication framework. The cross-environment communication framework may include interfaces to remotable objects allowing processes in the mobile operating system and processes in the desktop operating system to share memory in a thread-safe manner. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel.
    Type: Application
    Filed: August 24, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Laszlo Csaba Benedek, Octavian Chincisan, Cristian Hancila, Anthony Russello
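    A minimal sketch of a thread-safe shared object of the kind the remotable-object interfaces above suggest; real cross-environment sharing would go through OS shared memory and IPC, so this only illustrates the locking idea, and all names are assumptions:

      import java.util.concurrent.locks.ReentrantLock
      import kotlin.concurrent.withLock

      class RemotableBuffer(size: Int) {
          private val lock = ReentrantLock()
          private val data = ByteArray(size)

          // Writers and readers from either environment take the same lock,
          // so access to the shared bytes is thread-safe.
          fun write(offset: Int, bytes: ByteArray) {
              lock.withLock { bytes.copyInto(data, destinationOffset = offset) }
          }

          fun read(offset: Int, length: Int): ByteArray = lock.withLock {
              data.copyOfRange(offset, offset + length)
          }
      }

      fun main() {
          val buf = RemotableBuffer(16)
          buf.write(0, byteArrayOf(1, 2, 3))
          println(buf.read(0, 3).toList())   // [1, 2, 3]
      }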
  • Publication number: 20120084722
    Abstract: The present disclosure is directed to methodologies and devices for handling maximizing and minimizing of hierarchically related windows.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Ron Cassar, Paul E. Reeves, Volodimir Felixovich Lemberg, John Steven Visosky
  • Publication number: 20120084676
    Abstract: A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. A determined number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures, and can be moved off the screens by other user gestures and therefore hidden. The hidden desktops and screens, however, can be re-displayed by yet another gesture. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications, giving the user an intuitive way to manage multiple applications and desktops running simultaneously. Visual indicators can be used on the displayed applications and desktops, enabling a user to maximize a display across multiple screens or minimize it to a single selected screen.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ, LLC
    Inventor: Alexander de Paz
  • Publication number: 20120084706
    Abstract: A dual-screen user device and methods for revealing a combination of desktops on single and multiple screens are disclosed. Specifically, a determined number of desktops and/or running applications can be selectively displayed on dual screen displays conditioned upon inputs received and the state of the device. Desktop displays and applications can be selectively shifted between the screens by user gestures, or moved off of the screens by other user gestures and hidden. The hidden desktops and screens can be re-displayed by yet another gesture. The desktops and applications are arranged in a window stack that represents a logical order of the desktops and applications. Desktops and applications can be selectively launched and added to the window stack. The user can also select where the desktops/applications are to be inserted and where they are first to be displayed after being launched.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ, LLC
    Inventors: Sanjiv Sirpal, Paul Reeves, Alexander de Paz, Jared L. Ficklin, Denise Burton
  • Publication number: 20120084723
    Abstract: Systems and methods are provided for changing a user interface for a multi-screen device. The user interface can change based on the movement of a window. The system can receive a user interface event that modifies the display of windows in the user interface. Upon receiving the user interface event, the system determines if a window has been covered or uncovered. If a window has been covered, the window is placed in a sleep state. If a window is uncovered, the window is activated from a sleep state. A sleep state is a window state where an application associated with the window does not receive user interface inputs and/or does not render the window. Moreover, in a sleep state an image representing the window is maintained in memory.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Paul Edward Reeves, Tong Chen
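    A minimal sketch of the covered-window sleep state described above: a covered window stops receiving input and rendering but keeps an image representing its last frame in memory; class and field names are illustrative assumptions:

      class ManagedWindow(val name: String) {
          var asleep = false
              private set
          var cachedImage: String? = null   // stands in for a bitmap of the last rendered frame

          fun onCovered() {
              cachedImage = "snapshot of $name"   // keep an image representing the window in memory
              asleep = true                       // stop delivering input and rendering
          }

          fun onUncovered() {
              asleep = false                      // resume input and rendering
          }
      }

      fun main() {
          val w = ManagedWindow("mail")
          w.onCovered()
          println("${w.asleep} ${w.cachedImage}")   // true snapshot of mail
          w.onUncovered()
          println(w.asleep)                         // false
      }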
  • Publication number: 20120081397
    Abstract: A multi-screen user device and methods for controlling data displayed thereby are disclosed. Specifically, the data displayed by the multiple screens of the multi-screen user device is conditioned upon the relative position of the multiple screens. A gravity-drop display feature is also disclosed in which data from a first application on a first screen is automatically displayed on a second screen when the device is rotated.
    Type: Application
    Filed: August 31, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventor: Alexander de Paz
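    A minimal sketch of the gravity-drop feature described above, assuming a simple two-screen model; the data class and the trigger condition are illustrative assumptions:

      data class Screens(var first: String, var second: String)

      // On rotation, content from the application on the first screen is
      // automatically shown on the second screen as well.
      fun onRotate(screens: Screens) {
          if (screens.second == "empty") {
              screens.second = screens.first
          }
      }

      fun main() {
          val s = Screens(first = "photo viewer", second = "empty")
          onRotate(s)
          println(s)   // Screens(first=photo viewer, second=photo viewer)
      }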
  • Publication number: 20120081380
    Abstract: A mobile computing device with a mobile operating system and desktop operating system running concurrently and independently on a shared kernel without virtualization. The mobile operating system provides a user experience for the mobile computing device that suits the mobile environment. The desktop operating system provides a full desktop user experience when the mobile computing device is docked to a second user environment. Cross-environment rendering and user interaction support provide a seamless computing experience in a multi-operating system computing environment. The seamless computing experience includes mirroring the active user interaction space of the mobile operating system to a display of a user environment associated with the desktop operating system. The mobile computing device may be a smartphone running the Android mobile operating system and a full desktop Linux distribution on a modified Android kernel.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: Imerj LLC
    Inventors: Brian Reeves, Paul E. Reeves, Richard Teltz, David Reeves, Sanjiv Sirpal, Chris Tyghe, Octavian Chincisan
  • Publication number: 20120081315
    Abstract: Methods and devices for presenting a virtual keyboard are provided. More particularly, in connection with a multiple screen device, a virtual keyboard can be presented using portions of both of the screens. Specifically, with the screens of the device in a portrait orientation, the virtual keyboard can span the two screens, such that a first portion of the first screen and a first portion of the second screen operate cooperatively to present the virtual keyboard.
    Type: Application
    Filed: September 28, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventor: Sanjiv Sirpal
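    A minimal sketch of presenting one keyboard cooperatively across two portrait screens; the half-and-half split is an assumption, not the claimed geometry:

      // Splits one keyboard row so the first half is drawn on screen 1 and the
      // second half on screen 2, the two halves acting as a single keyboard.
      fun splitKeyboardRow(row: List<Char>): Pair<List<Char>, List<Char>> {
          val mid = row.size / 2
          return row.take(mid) to row.drop(mid)
      }

      fun main() {
          val (left, right) = splitKeyboardRow("qwertyuiop".toList())
          println("screen 1: $left")
          println("screen 2: $right")
      }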
  • Publication number: 20120081309
    Abstract: A dual display communication device includes a gesture capture region to receive a gesture, a first touch sensitive display to receive a gesture and display displayed images (such as a desktop or window of an application), and a second touch sensitive display to receive a gesture and display displayed images. Middleware receives a gesture indicating that a displayed image is to be moved from the first touch sensitive display to the second touch sensitive display, such as to maximize a window to cover portions of both displays simultaneously. In response, and prior to movement of the displayed image, the middleware moves a transition indicator from the first touch sensitive display to a selected position on the second touch sensitive display that is to be occupied by the displayed image, and thereafter moves the displayed image from the first touch sensitive display to that selected position.
    Type: Application
    Filed: September 1, 2011
    Publication date: April 5, 2012
    Applicant: IMERJ LLC
    Inventors: Sanjiv Sirpal, Alexander de Paz
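    A minimal sketch of the sequencing the abstract describes: a transition indicator is placed at the target position on the second display before the displayed image itself is moved there; the function signature is an assumption, not the middleware's API:

      fun moveAcrossDisplays(window: String, target: String, show: (String) -> Unit) {
          show("transition indicator at $target")   // preview the position the window will occupy
          show("$window moved to $target")          // then move the displayed image itself
      }

      fun main() {
          moveAcrossDisplays("browser", "second display (maximized)", ::println)
      }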