User Interface (e.g., Touch Screen Menu) Patents (Class 348/14.03)
  • Publication number: 20140043428
    Abstract: A system that incorporates teachings of the present disclosure may include, for example, obtaining first images that are captured by a first camera system at a first location associated with a live presentation by a first user, transmitting first video content representative of the first images over a network for presentation by a group of other processors that are each at one of a group of other locations associated with corresponding other users, receiving second video content representative of second images that are associated with each of the other users, and presenting the second video content in a telepresence configuration that simulates each of the other users being present in an audience at the first location. Other embodiments are disclosed.
    Type: Application
    Filed: October 21, 2013
    Publication date: February 13, 2014
    Applicant: AT&T Intellectual Property I, LP
    Inventors: Tara Hines, Andrea Basso, Aleksey Ivanov, Jeffrey Mikan, Nadia Morris
  • Publication number: 20140043427
    Abstract: A mobile communication terminal includes a display module to display image data on an image display area, an input module to generate touch data according to a touch input in a touch area, and a storage module to store the touch data associated with the image data. A data input method includes displaying image data on an image display area, generating touch data according to a touch input in a touch area, associating the touch data with the image data, and storing the touch data associated with the image data.
    Type: Application
    Filed: October 21, 2013
    Publication date: February 13, 2014
    Applicant: Pantech Co., Ltd.
    Inventors: Sung-Sik WANG, Yong-Hoon CHO, Hoo-Doek LEE
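    Illustrative sketch (not from the publication): 20140043427 boils down to generating touch data and storing it keyed to the image it annotates; a minimal Python sketch of that association, with invented names and a point-list structure assumed for the touch data.
      # Hypothetical: touch data stored per image so the annotation can be
      # retrieved together with the image data it is associated with.
      annotations: dict[str, list[tuple[float, float]]] = {}  # image id -> touch points

      def record_touch(image_id: str, x: float, y: float) -> None:
          # Generate touch data from the touch input and associate it with the image data.
          annotations.setdefault(image_id, []).append((x, y))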
  • Patent number: 8648894
    Abstract: Included are systems and methods for a virtual inmate visitation. Some embodiments include providing a scheduling user interface for a visitor to schedule a video visitation, receiving a visitor scheduling request for the video visitation with an inmate, and determining a visitor type for the visitor. Some embodiments include providing, based on the visitor type, a scheduling option, providing a visitor payment mechanism that depends on the visitor type, and providing a video visitation user interface to the visitor for conducting the video visitation between the visitor and the inmate. Still other embodiments include, in response to receiving a visitor input, conducting the video visitation between the visitor and the inmate, determining whether the video visitation includes an unacceptable activity, and performing a preventative measure.
    Type: Grant
    Filed: May 4, 2011
    Date of Patent: February 11, 2014
    Assignee: Eyconix, LLC
    Inventors: Alan Laney, William Watson, Jared Laney, Leslie Laney, Julie Collinsworth
  • Publication number: 20140036026
    Abstract: A GUI (graphical user interface) apparatus connects with the location of a remote party in a two-way communication system, such as a video conference system, in which a captured image is transmitted through a network to the remote location and an image received from that location through the network is displayed. The apparatus includes a display processing device for arranging a plurality of display frames “P1” to “P6” on a screen of a display device to show still pictures representing specific nominated connection destinations, a plurality of operation devices arranged on the display frames “P1” to “P6” in a one-to-one relation, and a connection processing device for connecting through the network with the nominated party represented by the still picture displayed on a display frame each time the operation device located at the position corresponding to that display frame is operated.
    Type: Application
    Filed: October 9, 2013
    Publication date: February 6, 2014
    Applicant: Sony Corporation
    Inventor: Koichi UCHIDE
  • Patent number: 8644467
    Abstract: A video-conferencing device for presenting augmented images that includes at least one interface, a network and a computer processor programmed to receive first information identifying a scene via the at least one interface. The computer processor also detects whether the scene contains at least one marker and identifies a location of each detected marker within the scene. In response to determining that the scene contains a first marker and based on the location of the first marker, the computer processor then augments the portion of the scene containing the first marker with second information. The computer processor then transmits the augmented scene to at least one external device via the network.
    Type: Grant
    Filed: September 7, 2011
    Date of Patent: February 4, 2014
    Assignee: Cisco Technology, Inc.
    Inventor: Jason Catchpole
  • Patent number: 8646012
    Abstract: A system, method, and web-based application platform enabling a television viewer to utilize an Internet device to request a Video-On-Demand (VOD) server to stream a selected video to the viewer's Set Top Box (STB). An Internet Protocol (IP) connection is established between the Internet device and an application executing at a web site, and an Internet device identifier is passed to the application. The application communicates with the VOD server to obtain a listing of available videos, and provides the listing to the Internet device. When the viewer selects a video, the application accesses an equipment-mapping table, which associates the Internet device identifier with an STB identifier. The application then sends the STB identifier and a request for the selected video to the VOD server, which delivers the video to the STB over a television delivery system.
    Type: Grant
    Filed: October 6, 2010
    Date of Patent: February 4, 2014
    Assignee: Ericsson Television Inc
    Inventors: Alan Rouse, Charles Dasher
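    Illustrative sketch (not from the patent): the routing step in 8646012 keys on an equipment-mapping table from an Internet-device identifier to a set-top-box identifier; a minimal Python sketch under that assumption, with all names invented and the VOD server represented by a stub object.
      # Hypothetical equipment-mapping table: Internet device ID -> STB ID.
      EQUIPMENT_MAP = {"device-42": "stb-1001"}

      def request_video(device_id: str, video_id: str, vod_server) -> None:
          stb_id = EQUIPMENT_MAP.get(device_id)
          if stb_id is None:
              raise KeyError(f"no STB registered for device {device_id}")
          # The VOD server then streams the selected title to the mapped STB
          # over the television delivery system (stub call, illustrative only).
          vod_server.deliver(video_id=video_id, stb_id=stb_id)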
  • Publication number: 20140028780
    Abstract: Producing a conversational video experience is disclosed. In various embodiments, a definition data associated with a first conversation node associated with the conversational video experience is received via a user interface. A response concept associated with the first conversation node is determined based at least in part on the received definition data. A relationship between the first conversation node and a second conversation node associated with the conversational video experience is determined based at least in part on the determined response concept. An association data that represents the relationship is generated and stored.
    Type: Application
    Filed: May 31, 2013
    Publication date: January 30, 2014
    Inventors: Ronald A. Croen, Mark T. Anikst, Vidur Apparao, Bernt Habermeier, Todd A. Mendeloff
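    Illustrative sketch (not from the publication): one plausible representation of the conversation nodes, response concepts, and association data described in 20140028780; the classes and fields below are assumptions for illustration only.
      from dataclasses import dataclass, field

      @dataclass
      class ConversationNode:
          node_id: str
          video_clip: str                        # pre-recorded clip played at this node
          # response concept (e.g. "agree", "tell_me_more") -> next node id
          response_concepts: dict = field(default_factory=dict)

      def link(parent: ConversationNode, concept: str, child: ConversationNode) -> None:
          # Store the association data representing the parent -> child relationship.
          parent.response_concepts[concept] = child.node_id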
  • Publication number: 20140022329
    Abstract: A system and method for providing an image are provided. The image providing method includes: transmitting, to an external device, a first video image of a first resolution, which is converted from an original video image of an original resolution; receiving, from the external device, area information about an area of interest of the first video image of the first resolution; determining, based on the area information, an area corresponding to the area of interest, of the original video image of the original resolution, wherein the determined area is smaller than the original video image of the original resolution; converting a part of the original video image of the original resolution to a second video image of the first resolution, wherein the part corresponds to the determined area; and transmitting the second video image to the external device.
    Type: Application
    Filed: July 17, 2013
    Publication date: January 23, 2014
    Inventors: Yong-tae KIM, Hae-young JUN, Youn-gun JUNG
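    Illustrative sketch (not from the publication): the area-of-interest step in 20140022329 amounts to mapping a region reported against the downscaled first video back onto the original-resolution frame, then cropping it; a minimal Python sketch assuming a NumPy-style frame array and invented names.
      def crop_area_of_interest(original_frame, orig_w, orig_h, sent_w, sent_h, roi):
          # roi = (x, y, w, h) in coordinates of the first (downscaled) video image.
          sx, sy = orig_w / sent_w, orig_h / sent_h      # scale back to original resolution
          x, y, w, h = roi
          region = original_frame[int(y * sy):int((y + h) * sy),
                                  int(x * sx):int((x + w) * sx)]
          # The cropped region would then be converted to the first resolution and
          # transmitted as the second video image (resizing left abstract here).
          return region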
  • Patent number: 8629895
    Abstract: A camera can be associated with each conference participant endpoint. The camera, either frame- or video-based, can monitor and detect one or more of gestures, facial recognition, emotions, and movements of the conference participant. Based on the detection of one or more of these triggering events, an action corresponding to the triggering event can be invoked. For example, if a participant raises their hand, e.g., a triggering event, the system can recognize that this is a request to speak. The participant can then be queued in the system relative, for example, to other participants' requests. When the other participants have finished speaking and it is the turn of the user who raised their hand to speak, the system can optionally cue the user by modifying the endpoint with which they are associated.
    Type: Grant
    Filed: July 1, 2010
    Date of Patent: January 14, 2014
    Assignee: Avaya Inc.
    Inventor: Vandy Lee
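    Illustrative sketch (not from the patent): the raise-hand example in 8629895 maps a detected triggering event to an action and queues speak requests in arrival order; a minimal Python sketch with invented event names.
      from collections import deque
      from typing import Optional

      speak_queue = deque()   # endpoint IDs waiting to speak, in request order

      def on_trigger(endpoint_id: str, event: str) -> None:
          # Correlate the detected triggering event with an action.
          if event == "hand_raised":
              speak_queue.append(endpoint_id)          # request to speak, queued FIFO
          elif event == "hand_lowered" and endpoint_id in speak_queue:
              speak_queue.remove(endpoint_id)

      def next_speaker() -> Optional[str]:
          # Called when the current speaker finishes; cue the next endpoint, if any.
          return speak_queue.popleft() if speak_queue else None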
  • Publication number: 20140009560
    Abstract: Authentication of a user initiating a communication may be achieved using a visual indicator of the user. Initiation of a communication may result in the initiator of the communication collecting image data associated with the initiator's identity. Additionally, the initiator may be required to perform a task, wherein a response to the task may be transmitted with the image data to the receiver of the communication. The receipt of the image data may allow a receiver of the communication to reduce spam and verify that the initiator is who it purports to be.
    Type: Application
    Filed: July 3, 2012
    Publication date: January 9, 2014
    Applicant: Avaya Inc.
    Inventors: Parameshwaran Krishnan, Navjot Singh
  • Patent number: 8626247
    Abstract: A mobile terminal having a dual display unit and a method of changing a display screen using the same are disclosed. The mobile terminal includes a main body, a front display unit for providing a multimedia mode, a rear display unit for providing a normal mode, and a screen control module for turning off the rear display unit, turning on the front display unit, and displaying a multimedia execution screen on the front display unit. The method of changing a display screen of the mobile terminal having additional multimedia functions provides an optimum multimedia environment and improves user convenience and the effectiveness of the mobile terminal by flexibly supporting the multimedia function to be executed according to the screen state of the rear display unit.
    Type: Grant
    Filed: February 2, 2012
    Date of Patent: January 7, 2014
    Assignee: Samsung Electronics Co., Ltd
    Inventor: Sang Hyeon Yoon
  • Publication number: 20140002578
    Abstract: A user terminal has an input for receiving a video signal, and a display for displaying to a user a video image derived from the video signal. A selection input is provided for receiving from the user at least one effect for enhancing the video image derived from the video signal. The user terminal has a rendering device for rendering on the display the video image derived from the video signal enhanced by the selected effect.
    Type: Application
    Filed: June 28, 2012
    Publication date: January 2, 2014
    Inventor: Jonathan David Rosenberg
  • Patent number: 8619115
    Abstract: The present invention provides a video communication system including a kiosk for recording video messages created by a user and a database for storing and providing access to the video messages. The kiosk includes a user interface for receiving user information such as name, address, email, and other identifying information. The kiosk further includes a message-recording device for recording a user video message. The video message and user message data are uploaded to a database. The database reconnects with the user through the user information to allow the user to access the video message. In an embodiment, the database sends an email web link to the user. The user may view the video message by opening the email web link and viewing the video message on an internet website.
    Type: Grant
    Filed: January 15, 2010
    Date of Patent: December 31, 2013
    Assignee: nSixty, LLC
    Inventors: James Matthew Stephens, Matthew Berlage
  • Patent number: 8619178
    Abstract: An image rendition and capture method includes rendering a first image on a surface using a first set of wavelength ranges of light. While the first image is being rendered, a second image is captured using a second set of wavelength ranges of light but not the first set. The second image is of the surface on which the first image is being rendered.
    Type: Grant
    Filed: April 9, 2009
    Date of Patent: December 31, 2013
    Assignee: Hewlett-Packard Development Company, L.P.
    Inventors: Kar Han Tan, Daniel George Gelb, Ian N Robinson
  • Patent number: 8619116
    Abstract: According to one aspect, a web optimized user device is provided. The web optimized device reduces complexity and facilitates interaction with web-based services and content. The web optimized device can be configured without a hard drive, facilitating integration of web-based services into a computing experience. The web optimized device presents a user interface that integrates video chat functionality into every aspect of the computer content accessed. In particular, a display manager manages the user interface presented and integrates video chat displays and features into the content displays in a content and/or context aware manner. These displays permit a user to intuitively interact with the video chat content and features while the user changes content, for example, web-based services, web-based applications, and other media content, without interruption of or interference from the video chat content.
    Type: Grant
    Filed: May 16, 2011
    Date of Patent: December 31, 2013
    Assignee: Litl LLC
    Inventors: Robert Sanford Havoc Pennington, Aaron Tang, John Chuang, Chris Bambacus, Eben Eliason, Chris Moody, Johan Bilien
  • Patent number: 8614731
    Abstract: A system and method of determining the quality of audio-visual services of a mobile telephone is provided. In one embodiment, the method includes wirelessly receiving streaming audio-video content at the mobile telephone and wherein the audio-video content comprises video content and audio content. The video content is formed of a plurality of video frames with each encoded with one or more video symbols. The audio content is formed of a plurality of audio segments with each encoded with a sequence of tones. The method further includes outputting the audio content with the mobile telephone and displaying the video content with the mobile telephone. A computer processes the video symbols of each video frame of the displayed video content to determine a video quality of the displayed video content and processes the sequences of tones of the outputted audio content to determine an audio quality of the outputted audio content. The determined audio and video quality is then output by the computer.
    Type: Grant
    Filed: April 21, 2010
    Date of Patent: December 24, 2013
    Assignee: Spirent Communications, Inc.
    Inventors: Dimitrios M. Topaltzas, Rupert C. Lloyd
  • Publication number: 20130328997
    Abstract: A method for video image sharing and control comprises activating video communication between electronic devices. Transmission of multiple video feeds is controlled using multiple cameras from a first electronic device.
    Type: Application
    Filed: May 30, 2013
    Publication date: December 12, 2013
    Inventor: Prashant Desai
  • Patent number: 8605873
    Abstract: Controlling a videoconference based on gestures received to a touch interface. A gesture may be received to a touch interface. In response to the gesture, a videoconference action may be performed. For example, a first gesture may be received to mute the videoconference and in response, the videoconference may be muted. As another example, a second gesture may be received to adjust the volume of the videoconference, and the volume may be correspondingly adjusted. Further, various gestures may be received for controlling one or more cameras in a videoconference, accessing settings in a videoconference, interacting with a presentation, etc.
    Type: Grant
    Filed: June 28, 2011
    Date of Patent: December 10, 2013
    Assignee: LifeSize Communications, Inc.
    Inventor: Wayne E. Mock
  • Patent number: 8605872
    Abstract: Controlling a videoconference based on gestures received to a touch interface. A gesture may be received to a touch interface. In response to the gesture, a videoconference action may be performed. For example, a first gesture may be received to mute the videoconference and in response, the videoconference may be muted. As another example, a second gesture may be received to adjust the volume of the videoconference, and the volume may be correspondingly adjusted. Further, various gestures may be received for controlling one or more cameras in a videoconference, accessing settings in a videoconference, interacting with a presentation, etc.
    Type: Grant
    Filed: June 28, 2011
    Date of Patent: December 10, 2013
    Assignee: LifeSize Communications, Inc.
    Inventor: Wayne E. Mock
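    Illustrative sketch (not from the patents): 8605873 and 8605872 both describe mapping touch gestures to videoconference actions (mute, volume, camera control); a small Python dispatch-table sketch of the idea, with invented gesture names and a stub conference object.
      GESTURE_ACTIONS = {
          "two_finger_tap": "toggle_mute",     # mute/unmute the videoconference
          "vertical_swipe": "adjust_volume",   # swipe distance -> volume delta
          "pinch": "zoom_camera",              # pinch scale -> camera zoom
      }

      def handle_gesture(conference, gesture: str, value: float = 0.0) -> None:
          action = GESTURE_ACTIONS.get(gesture)
          if action == "toggle_mute":
              conference.set_mute(not conference.muted)
          elif action == "adjust_volume":
              conference.set_volume(conference.volume + value)
          elif action == "zoom_camera":
              conference.camera.zoom(value)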
  • Publication number: 20130321560
    Abstract: A system for a multi-point videoconference consists of three or more simultaneously connected endpoints in a one-to-many or many-to-many conference via a dynamic virtual multi-point control unit. The dynamic control unit is created when the audio and video parameters for the conference participants are input by the initiator. After the conference is established, the dynamic control unit may shift from endpoint to endpoint in the course of the conference without terminating the conference. Movement of the composite table of parameters creates a virtual multipoint conference control unit, first by selecting the best endpoint to host the dynamic control unit based on the client devices' resources. The dynamic control unit uses a protocol to establish the connections between multiple endpoints, including a requesting or initiating endpoint and two or more participating endpoints.
    Type: Application
    Filed: May 31, 2012
    Publication date: December 5, 2013
    Inventor: Ronald Angelo, SR.
  • Publication number: 20130321561
    Abstract: A Video Ticket Office provides interactive agent-assisted transportation service requests using real-time video and audio transmission with a separately-located operator center. A method may allow for changing from agent-assisted transportation service requests to automated services.
    Type: Application
    Filed: May 30, 2013
    Publication date: December 5, 2013
    Inventor: Gavin Smith
  • Patent number: 8599236
    Abstract: A method, a communication device, and computer readable storage media facilitate engaging in a communication session between a first communication device and another, i.e., a second communication device. One or more video frames received from the second communication device for display on the first communication device are captured. At least one video frame from the communication session is selected, at the first communication device, for use as a contact identifier associated with a contact person at the second communication device. The selected video frame for use as the contact identifier of the contact person is stored in a memory of the first communication device.
    Type: Grant
    Filed: May 11, 2011
    Date of Patent: December 3, 2013
    Assignee: Cisco Technology, Inc.
    Inventor: Steven Charles Prentice
  • Publication number: 20130314489
    Abstract: An information processing apparatus may include a detecting unit to detect a pointing object in a captured image, and a generation unit to generate pointing information based on detection of the pointing object by the detecting unit. The pointing information may indicate a position of the pointing object determined using a pointable range set based on a user in the captured image. The apparatus may further include a communication unit to transmit the pointing information to an external apparatus.
    Type: Application
    Filed: October 3, 2011
    Publication date: November 28, 2013
    Applicant: SONY CORPORATION
    Inventors: Yusuke Sakai, Masao Kondo
  • Patent number: 8593504
    Abstract: Once an active video conference is set up and a user is viewing the active video conference at a video terminal, the video terminal looks for different events that indicate a change in focus of the user to or from the active video conference. For example, the user brings up another application and starts using the application, or the user minimizes a window that is displaying the active video conference. The video terminal sends a change of focus message based on the event to a video conference bridge or another video terminal that is streaming the active video conference to the user. The video conference bridge/video terminal processes the message and changes the video portion of the stream of the active video conference based on the message. The result is improved use of bandwidth between the video terminal and the video conference bridge/video terminal.
    Type: Grant
    Filed: February 11, 2011
    Date of Patent: November 26, 2013
    Assignee: Avaya Inc.
    Inventors: Lin Lin, Moni Manor, Gregory T. Osterhout, Stephen R. Whynot
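    Illustrative sketch (not from the patent): the bandwidth saving in 8593504 hinges on a change-of-focus message from the terminal and a matching stream adjustment at the bridge; a minimal Python sketch of both sides, with the connection and stream objects stubbed out and all message fields invented.
      def on_window_event(bridge_conn, event: str) -> None:
          # Terminal side: report focus changes to whatever is streaming the conference.
          if event in ("minimized", "other_app_focused"):
              bridge_conn.send({"type": "focus_change", "focused": False})
          elif event == "restored":
              bridge_conn.send({"type": "focus_change", "focused": True})

      def handle_message(stream, msg: dict) -> None:
          # Bridge side: downgrade or restore the video portion of the stream.
          if msg.get("type") == "focus_change":
              stream.set_video_quality("full" if msg["focused"] else "thumbnail")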
  • Patent number: 8593502
    Abstract: A videoconferencing system includes a touch screen display device and a videoconferencing unit. The display device displays video data for the videoconference and generates touch data based on user selections relative to the touch screen. The videoconferencing unit is operatively coupled to the touch screen device by a video connection and a data interface connection, for example. The unit establishes and conducts a videoconference with one or more endpoints via a network. The unit sends video data to the display device and receives touch data from the device. The received touch data is used to control operation of the videoconferencing system. The received touch data can be used to initiate a videoconference call, change an operating parameter, change orientation of a camera, initiate a picture-in-picture display, access a menu, access memory, change a source of video data, initiate a whiteboard display, and access a screen of a connected device.
    Type: Grant
    Filed: January 18, 2011
    Date of Patent: November 26, 2013
    Assignee: Polycom, Inc.
    Inventors: Youssef Saleh, Mark Duckworth, Gopal Paripally, Everett Hiller, Britt Nelson, Wallace Henry
  • Patent number: 8595766
    Abstract: An operating method of an image display apparatus includes displaying a screen on a display and displaying a thumbnail-image list in response to a command to display an input image list. The thumbnail-image list includes a plurality of groups of thumbnail images that respectively correspond to a plurality of input image signals on the display. Each thumbnail-image group includes one or more thumbnail images that represent different points of time in a corresponding input image signal.
    Type: Grant
    Filed: December 3, 2009
    Date of Patent: November 26, 2013
    Assignee: LG Electronics Inc.
    Inventors: Yong Ki Ahn, Jae Kyung Lee, Kun Sik Lee, Gyu Seung Kim
  • Publication number: 20130307920
    Abstract: A computer-implemented method and system of providing a video chat experience in a network are described. The method may include: receiving live video stream signals, including audio signals, from a plurality of participants of a live video chat session; combining the live video stream signals into a shared canvas; providing the shared canvas to the plurality of participants, wherein the shared canvas is substantially synchronized among the plurality of participants; and providing options for the specific chat participant to manipulate the shared canvas.
    Type: Application
    Filed: May 15, 2012
    Publication date: November 21, 2013
    Inventors: Matt Cahill, Sean Parker, Shawn D. Fanning, Joey Liaw
  • Publication number: 20130300818
    Abstract: The present invention provides an interactive video platform system and a method for the same. Users at client terminals link to the interactive video platform via a browser to participate in video interactions with other users. The present invention also provides a virtual camera and a virtual transparency to improve the convenience of the users and the quality and efficiency of the interactive video platform system. At the end of a video interaction, a video server of the interactive video platform system edits the information of the video interaction and stores it in a central database for later reference by users.
    Type: Application
    Filed: August 23, 2012
    Publication date: November 14, 2013
    Applicant: NATIONAL CHIAO TUNG UNIVERSITY
    Inventors: HSIN-CHIA FU, CHENG-LUNG TSENG, YUNG-CHANG TAI, JIN SHENG LIN, YI-JUI LEE, YOU-HAO LIU, CHUN FONG LIOU, GUAN-HONG CHEN
  • Patent number: 8581955
    Abstract: An apparatus and method in a mobile terminal support remote control with other mobile terminals. A video call is connected between the mobile terminal and a corresponding terminal. A remote control request message is transmitted through a control channel to the corresponding terminal. A remote control acceptance message is received through the control channel from the corresponding terminal. A control message including control data that corresponds to a user input is then transmitted through the control channel to the corresponding terminal.
    Type: Grant
    Filed: March 31, 2011
    Date of Patent: November 12, 2013
    Assignee: Samsung Electronics Co., Ltd.
    Inventor: Sang-Wook Woo
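    Illustrative sketch (not from the patent): the control-channel exchange in 8581955 is a request/accept/control sequence; a minimal Python sketch with hypothetical JSON message shapes and a stub control_channel object.
      import json

      def request_remote_control(control_channel) -> bool:
          # Send the remote control request and wait for the acceptance message.
          control_channel.send(json.dumps({"type": "remote_control_request"}))
          reply = json.loads(control_channel.recv())
          return reply.get("type") == "remote_control_accept"

      def send_control(control_channel, user_input: dict) -> None:
          # Forward control data corresponding to the local user input.
          control_channel.send(json.dumps({"type": "control", "data": user_input}))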
  • Publication number: 20130293664
    Abstract: Systems and methods are provided to automatically discover if an email address contact is video chat capable. A server receives a request from an electronic device to determine video chat capability associated with an email address. It compares the email address with a listing of email addresses, wherein the listing includes email addresses associated with other electronic devices having video chat capability. If an entry in the listing matches the email address, it returns a message configured to automatically enable addition of the email address as a video chat contact on the electronic device.
    Type: Application
    Filed: May 2, 2012
    Publication date: November 7, 2013
    Applicant: Research In Motion Limited
    Inventors: Thomas Calum Tsang, Boris Rozinov, Matthew David Douglas Williams
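    Illustrative sketch (not from the publication): the discovery step in 20130293664 reduces to a membership check against a registry of video-chat-capable addresses; a minimal Python sketch in which the registry and the reply format are assumptions.
      # Hypothetical server-side registry of addresses known to be video chat capable.
      CAPABLE_ADDRESSES = {"alice@example.com", "bob@example.com"}

      def check_video_chat_capability(email: str) -> dict:
          if email.lower() in CAPABLE_ADDRESSES:
              # Reply that lets the requesting device add the contact automatically.
              return {"email": email, "video_chat_capable": True, "action": "add_contact"}
          return {"email": email, "video_chat_capable": False}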
  • Patent number: 8576178
    Abstract: A mobile communication terminal is provided. The mobile communication terminal includes a communication module for transmitting and receiving data in order to perform audiovisual communication with at least one other party's terminal; a camera module for photographing an image; a touch screen for displaying a first image received from the at least one other party's terminal and a second image acquired from the camera module; and a controller for changing, when the first image or the second image displayed on the touch screen is touched, a display manner of the first image and the second image.
    Type: Grant
    Filed: August 12, 2008
    Date of Patent: November 5, 2013
    Assignee: LG Electronics Inc.
    Inventors: Moon Ju Kim, Eun Young Lee
  • Patent number: 8570425
    Abstract: An electronic apparatus has a touch sensor provided with a first touching zone that includes at least a second touching zone and a third touching zone, the second and third touching zones being allocated different functions. The apparatus is controlled to perform the specific function assigned to whichever of the second or third touching zone first receives a touch input, and to continue performing that function even if a subsequent touch input arrives through the other of those two zones, as long as the touch input through the first touching zone is continuous, with no intermission, from the first touch input to the second.
    Type: Grant
    Filed: November 5, 2012
    Date of Patent: October 29, 2013
    Assignee: JVC Kenwood Corporation
    Inventor: Takashi Ohuchi
  • Publication number: 20130278710
    Abstract: System and method involving user interfaces and remote control devices. These user interfaces may be particularly useful for providing an intuitive and user friendly interaction between a user and a device or application using a display, e.g., at a “10 foot” interaction level. The user interfaces may be specifically designed for interaction using a simple remote control device having a limited number of inputs. For example, the simple remote control may include directional inputs (e.g., up, down, left, right), a confirmation input (e.g., ok), and possibly a mute input. The user interface may be customized based on current user activity or other contexts (e.g., based on current or previous states), the user logging in (e.g., using a communication device), etc. Additionally, the user interface may allow the user to adjust cameras whose video are not currently displayed, rejoin previously left videoconferences, and/or any of a variety of desirable actions.
    Type: Application
    Filed: April 20, 2012
    Publication date: October 24, 2013
    Inventor: Wayne E. Mock
  • Publication number: 20130278708
    Abstract: System and method involving user interfaces and remote control devices. These user interfaces may be particularly useful for providing an intuitive and user friendly interaction between a user and a device or application using a display, e.g., at a “10 foot” interaction level. The user interfaces may be specifically designed for interaction using a simple remote control device having a limited number of inputs. For example, the simple remote control may include directional inputs (e.g., up, down, left, right), a confirmation input (e.g., ok), and possibly a mute input. The user interface may be customized based on current user activity or other contexts (e.g., based on current or previous states), the user logging in (e.g., using a communication device), etc. Additionally, the user interface may allow the user to adjust cameras whose video are not currently displayed, rejoin previously left videoconferences, and/or any of a variety of desirable actions.
    Type: Application
    Filed: April 20, 2012
    Publication date: October 24, 2013
    Inventor: Wayne E. Mock
  • Publication number: 20130278711
    Abstract: System and method involving user interfaces and remote control devices. These user interfaces may be particularly useful for providing an intuitive and user friendly interaction between a user and a device or application using a display, e.g., at a “10 foot” interaction level. The user interfaces may be specifically designed for interaction using a simple remote control device having a limited number of inputs. For example, the simple remote control may include directional inputs (e.g., up, down, left, right), a confirmation input (e.g., ok), and possibly a mute input. The user interface may be customized based on current user activity or other contexts (e.g., based on current or previous states), the user logging in (e.g., using a communication device), etc. Additionally, the user interface may allow the user to adjust cameras whose video are not currently displayed, rejoin previously left videoconferences, and/or any of a variety of desirable actions.
    Type: Application
    Filed: April 20, 2012
    Publication date: October 24, 2013
    Inventor: Wayne E. Mock
  • Publication number: 20130278709
    Abstract: System and method involving user interfaces and remote control devices. These user interfaces may be particularly useful for providing an intuitive and user friendly interaction between a user and a device or application using a display, e.g., at a “10 foot” interaction level. The user interfaces may be specifically designed for interaction using a simple remote control device having a limited number of inputs. For example, the simple remote control may include directional inputs (e.g., up, down, left, right), a confirmation input (e.g., ok), and possibly a mute input. The user interface may be customized based on current user activity or other contexts (e.g., based on current or previous states), the user logging in (e.g., using a communication device), etc. Additionally, the user interface may allow the user to adjust cameras whose video are not currently displayed, rejoin previously left videoconferences, and/or any of a variety of desirable actions.
    Type: Application
    Filed: April 20, 2012
    Publication date: October 24, 2013
    Inventor: Wayne E. Mock
  • Patent number: 8558865
    Abstract: The present invention discloses a transmission method in communication of a video phone, which uses user input indication messages to carry a text message during a telephone call and displays the text message on the calling interface of the terminal, thereby providing transmission of text messages during communication of the video phone. The present invention improves message interaction between mobile phones and is suitable for sending and receiving text messages during a telephone call of video mobile phones in the 3G network. The user can directly input words on the calling interface and send them to the other party, and the inputted text message can be displayed directly on the user's calling interface after the user chooses to send it.
    Type: Grant
    Filed: September 30, 2009
    Date of Patent: October 15, 2013
    Assignee: ZTE Corporation
    Inventor: Xiaofeng Gu
  • Patent number: 8558868
    Abstract: In one implementation, a conference bridge or a multipoint conference unit (MCU) receives media streams from the endpoints in the conference. The media stream may contain at least one of audio, video, file sharing, or collaboration data. The MCU measures a characteristic in each of a plurality of media streams and calculates statistics based on the individual participation levels of the endpoints. A dynamic participation indicator displayed at the endpoints shows the relative participation levels of the endpoints. For example, the dynamic participation indicator may show the names of the users in a font that changes size and/or location as the participation level changes. In another example, the dynamic participation indicator may show respective videos of the endpoints in a format and/or size that changes as the participation level changes.
    Type: Grant
    Filed: July 1, 2010
    Date of Patent: October 15, 2013
    Assignee: Cisco Technology, Inc.
    Inventor: Steven Charles Prentice
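    Illustrative sketch (not from the patent): one way to read the dynamic participation indicator in 8558868 is to measure a characteristic per endpoint (speaking time, say), normalize it, and scale each participant's name accordingly; a minimal Python sketch with invented names and thresholds.
      def participation_font_sizes(talk_seconds: dict, min_pt: int = 10, max_pt: int = 28) -> dict:
          # Map each endpoint's share of total speaking time to a font size.
          total = sum(talk_seconds.values()) or 1.0
          return {name: round(min_pt + (secs / total) * (max_pt - min_pt))
                  for name, secs in talk_seconds.items()}

      # Example: {"Ann": 120, "Raj": 40, "Mei": 40} -> "Ann" rendered in the largest font.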
  • Patent number: 8555179
    Abstract: A mobile communication terminal having a camera module and a touch screen is provided. The mobile communication terminal transmits and receives text data while performing a video call with at least one other party's terminal. When the text data transmission and reception function is started, the touch screen displays at least one image, an input window for displaying text data input through the touch screen, an output window for displaying the transmitted and received text data, and at least one soft key. A controller controls the touch screen, when the input window or one of the at least one soft key is selected, to change a display method of at least one of the first image, the second image, the input window, the output window, and the at least one soft key, and to display a touch pad comprising a plurality of keys for inputting text data.
    Type: Grant
    Filed: August 14, 2008
    Date of Patent: October 8, 2013
    Assignee: LG Electronics Inc.
    Inventors: Ha Youn Lee, Eun Young Lee, Min Hak Lee, Moon Ju Kim
  • Patent number: 8555406
    Abstract: A method and system for remote viewing of multimedia content using a multimedia content distribution network (MCDN) is configured to duplicate multimedia content displayed on a first MCDN terminal device and route the duplicate multimedia content to a second MCDN terminal device. The MCDN terminal devices may be coupled to a local network at an MCDN client premises. The MCDN terminal devices may also include wireless telephony devices for mobile remote viewing functionality. The method may include transcoding of the multimedia content into a format suitable for the second MCDN terminal device.
    Type: Grant
    Filed: October 6, 2009
    Date of Patent: October 8, 2013
    Assignee: AT&T Intellectual Property I, L.P.
    Inventors: Jerald Robert Howcroft, Michael Raftelis
  • Patent number: 8547414
    Abstract: The present invention is a video production and control system that uses a touch screen display and user interface, managed by a controller, for video switching. The system is used to select Preview and Program video feeds from a set of video input feeds. The video source configuration for some of the feeds may have been selected using touch screen video source control technology. Over time, the user can use the touch screen controls to switch which feeds are then designated as the Preview and Program feeds. The evolving Program signal is sent to a Program output connection for consumers of the Program, possibly after some reformatting. The current Program feed may be promoted from the current Preview feed. User controls determine the type of transition, such as cut or crossfade, that will apply to a transition from Preview to Program.
    Type: Grant
    Filed: February 15, 2011
    Date of Patent: October 1, 2013
    Assignee: New Vad, LLC
    Inventor: Robin Sheeley
  • Publication number: 20130226578
    Abstract: Aspects of an asynchronous video interview system and related techniques include a server that receives a plurality of pre-recorded video prompts, generates an interview script, transmits a video prompt from the interview script to be displayed at a client computing device, and receives a streamed video response from the client computing device. The server can perform algorithmic analysis on content of the video response. In another aspect, a server obtains response preference data indicating a timing parameter for a response. In another aspect, a video prompt and an information supplement (e.g., a news item) that relates to the content of the video prompt are transmitted. In another aspect, a server automatically selects a video prompt (e.g., a follow-up question) to be displayed at the client computing device (e.g., based on a response or information about an interviewee).
    Type: Application
    Filed: February 22, 2013
    Publication date: August 29, 2013
    Applicant: COLLEGENET, INC.
    Inventor: CollegeNET, Inc.
  • Patent number: 8514273
    Abstract: A 3D image display apparatus includes a display device which can display a two-dimensional (2D) image and a three-dimensional (3D) image for stereoscopic viewing, a 2D/3D display control device which controls the display device so as to alternately display the 2D image and the 3D image, and an on-screen display (OSD) control device which controls display of OSD information superimposed on a displayed image so that the OSD information is erased in a shorter time when it is displayed on a 3D image for stereoscopic viewing than when it is displayed on a 2D image on the display device.
    Type: Grant
    Filed: July 12, 2010
    Date of Patent: August 20, 2013
    Assignee: FUJIFILM Corporation
    Inventor: Koji Mori
  • Patent number: 8514263
    Abstract: A new approach is proposed that contemplates systems and methods to support the operation of a Virtual Media Room or Virtual Meeting Room (VMR), wherein each VMR can accept from a plurality of participants at different geographic locations a variety of video conferencing feeds of audio and video streams from video conference endpoints. A globally distributed infrastructure supports operations of the VMR through a plurality of MCUs (Multipoint Control Units) built from off-the-shelf components instead of custom hardware as media processing nodes, each configured to process the plurality of audio and video streams from the plurality of video conference endpoints in real time.
    Type: Grant
    Filed: May 11, 2011
    Date of Patent: August 20, 2013
    Assignee: Blue Jeans Network, Inc.
    Inventors: Alagu Periyannan, Raghavan Anand, Michael Grupenhoff, Ravi Kiran Kalluri, Emmanuel Weber, Anil Villait
  • Patent number: 8508570
    Abstract: A method for efficiently accessing pertinent information retrieved from a videoconferencing system call log. System call logs typically contain a chronological list of raw information pertaining to inbound and outbound videoconferencing calls. The method for efficiently accessing this chronological information is performed using input from the user at an endpoint to correlate and sort for display the information required at the current time. Videoconferencing systems typically are shared use or community type devices and the method of this disclosure allows for more user friendly access to pertinent information. Auto population of a speed dial list or associating a smart tag with the retrieved information is another possible feature to aid the end user. This method will allow a business to more efficiently use a limited number of videoconferencing systems amongst diverse groups of users with diverse calling needs.
    Type: Grant
    Filed: August 12, 2008
    Date of Patent: August 13, 2013
    Assignee: Polycom, Inc.
    Inventors: Tanvir Rahman, Krishna Sai
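    Illustrative sketch (not from the patent): 8508570 describes correlating and sorting a raw chronological call log per user and auto-populating a speed-dial list; a minimal Python sketch of that step, with an assumed log-entry shape.
      from collections import Counter

      def speed_dial_for_user(call_log: list, user: str, top_n: int = 5) -> list:
          # Assumed entry shape: {"user": "ann", "remote": "room-7", "time": ...}.
          counts = Counter(entry["remote"] for entry in call_log if entry["user"] == user)
          # The most frequently called endpoints populate the user's speed-dial list.
          return [remote for remote, _ in counts.most_common(top_n)]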
  • Patent number: 8508571
    Abstract: A teleconference system for connecting a plurality of conference bases via a network, each of the conference bases having: a display device for displaying an image on a screen; a video camera that is capable of measuring a depth; a person identifying section for identifying a participant who has made a specified gesture as a speaker, from a video picture taken by the video camera; an image input/display control section that usually keeps the display device displaying predetermined conference material on the screen and that turns the screen blank triggered by the person identifying section's identifying a speaker; a motion identifying section for identifying and digitalizing a motion of the speaker's hand to make motion data; and an image generating section for making a line drawing in accordance with the motion data made by the motion identifying section; wherein the image input/display control section controls the display device to display the line drawing made by the image generating section on the screen.
    Type: Grant
    Filed: July 7, 2011
    Date of Patent: August 13, 2013
    Assignee: Konica Minolta Business Technologies, Inc.
    Inventors: Kazumi Sawayanagi, Hideyuki Matsuda, Toshihiko Otake, Kazusei Takahashi
  • Publication number: 20130201276
    Abstract: Techniques for implementing an integrative interactive space are described. In implementations, video cameras that are positioned to capture video at different locations are synchronized such that aspects of the different locations can be used to generate an integrated interactive space. The integrated interactive space can enable users at the different locations to interact, such as via video interaction, audio interaction, and so on. In at least some embodiments, techniques can be implemented to adjust an image of a participant during a video session such that the participant appears to maintain eye contact with other video session participants at other locations. Techniques can also be implemented to provide a virtual shared space that can enable users to interact with the space, and can also enable users to interact with one another and/or objects that are displayed in the virtual shared space.
    Type: Application
    Filed: February 6, 2012
    Publication date: August 8, 2013
    Applicant: Microsoft Corporation
    Inventors: Vivek Pradeep, Stephen G. Latta, Steven Nabil Bathiche, Kevin Geisner, Alice Jane Bernheim Brush
  • Patent number: 8502856
    Abstract: Some embodiments provide a method for modifying a composite display of a first mobile device that is engaged in a video conference with a second device. The method presents, on the first device, the composite display having a first video captured by the first device and a second video captured by the second device. The method receives, at the first device, an input for modifying the composite display during the video conference. The method modifies the composite display based on the received input.
    Type: Grant
    Filed: June 6, 2010
    Date of Patent: August 6, 2013
    Assignee: Apple Inc.
    Inventors: Elizabeth C. Cranfill, Stephen O. Lemay, Hsi-Jung Wu, Xiaosong Zhou, Joe S. Abuan, Hyeonkuk Jeong, Roberto Garcia, Jr.
  • Patent number: 8502857
    Abstract: Disclosed herein is a method and apparatus for videoconferencing that allows video images from two or more cameras at the same site to be displayed as a single panoramic image. Accordingly, a conferencing endpoint having a single monitor can display the panoramic image of the two or more video images from an endpoint having multiple cameras, such as a telepresence endpoint. A sliding display area can be used to manually define a zoomed portion of the panoramic image to be displayed. Alternatively, the zoomed portion may be determined automatically. The zoomed portion may be changed during the course of the conference.
    Type: Grant
    Filed: October 19, 2009
    Date of Patent: August 6, 2013
    Assignee: Polycom, Inc.
    Inventor: Avishay Halavy
  • Publication number: 20130182061
    Abstract: A system for providing video conference touch blur. More specifically, in certain embodiments, a video conference touch blur operation is performed by actuating (e.g., via touch or via a mouse operation) a video conference screen image to blur areas that are not to be transmitted in high definition.
    Type: Application
    Filed: January 13, 2012
    Publication date: July 18, 2013
    Inventors: Roy Stedman, Carlton Andrews