Patent Applications Published on September 21, 2017
-
Publication number: 20170269771
Abstract: A vehicle includes a display for displaying a keypad to receive a command through a touch gesture of a user, and a controller for determining an operable area for the user on the display based on a user's shoulder height and arm length, and for controlling the display to display the keypad in the operable area.
Type: Application
Filed: December 9, 2016
Publication date: September 21, 2017
Inventors: Jong Yong NAM, Inseong PARK, Yong LEE, Gi Beom HONG, Seok-Young YOUN, Jia LEE, Taeyub KIM, Hotaek LEE
-
Publication number: 20170269772
Abstract: A display control and touch detection device is capable of controlling the start timing of display and non-display periods depending on the result of touch detection, and includes a nonvolatile memory and control logic that selectively uses data stored in the memory according to a display mode. The control logic changes the start timing of the display and non-display periods across display frame periods, whereby the appearance of an undesired brightness difference at a fixed location in a display frame with no display, and the flicker caused by that brightness difference, can be suppressed. Based on the result of touch detection, the control logic changes how it uses the data that determine the start timings of display and non-display, so the start timings of the display and non-display periods in a display frame period can readily be changed depending on the result of touch detection.
Type: Application
Filed: February 16, 2017
Publication date: September 21, 2017
Inventor: Takayuki NOTO
-
Publication number: 20170269773
Abstract: A display device includes: a display section having a screen and a first electrode section provided at a first position in a thickness direction; a second electrode section provided at a second position in the thickness direction; a gap section which is provided between the first electrode section and the second electrode section and is deformable in the thickness direction when the screen is pressed; and a circuit section which is connected to the first electrode section and the second electrode section, performs display on the screen, and detects a press on the screen, and a capacitance value of a first capacitance between the first electrode section and the second electrode section is changeable due to deformation of the gap section. In a force period, the circuit section applies a sensor driving signal to the second electrode section, and detects a sensor detection signal based on the sensor driving signal through the first capacitance.
Type: Application
Filed: March 10, 2017
Publication date: September 21, 2017
Inventor: Takafumi SUZUKI
-
Publication number: 20170269774
Abstract: A control device for a vehicle includes a transparent support, an opaque decorative layer, a detection layer, and a light source adapted for emitting a light passing through the transparent support. The opaque decorative layer is printed on the transparent support in a control area, and delimits a pattern forming a pictogram where the decorative layer is missing, with the pictogram having an outer edge. The detection layer is electrically conductive and printed on the transparent support around the pictogram, substantially up to the outer edge of the pictogram, in the control area.
Type: Application
Filed: March 15, 2017
Publication date: September 21, 2017
Inventor: Omar BEN ABDELAZIZ
-
Publication number: 20170269775
Abstract: A method is disclosed. The method includes determining a touch control parameter of a touch control operation at an identifier corresponding to an application program. The method also includes determining a touch control type based on the touch control parameter. The method also includes invoking a first element related to the application program in response to a determination that the touch control operation is a first type of touch control. The method also includes invoking a second element related to the application program in response to a determination that the touch control operation is a second type of touch control.
Type: Application
Filed: March 21, 2017
Publication date: September 21, 2017
Inventors: Xiao Fei Li, Jian Li Liu, Xue Gong Zhou
-
Publication number: 20170269776
Abstract: A position detecting device obtains information from a stylus when the stylus moves at high speed, while removing influences of noise. The position detecting device includes a differential amplification circuit that amplifies and outputs a difference in a signal at a first terminal and a signal at a second terminal, and a selection circuit that selects at least a first electrode of a sensor, connects at least the first electrode to the first terminal of the differential amplification circuit, selects at least a second electrode of the sensor, and connects at least the second electrode to the second terminal of the differential amplification circuit. The selection circuit selects electrodes separated by a first interval in a period in which a position indicated by the stylus is detected, and selects electrodes separated by a second interval that is shorter than the first interval in a period in which data is detected.
Type: Application
Filed: June 2, 2017
Publication date: September 21, 2017
Inventor: Yuji Katsurahira
-
Publication number: 20170269777
Abstract: An optical touch-sensitive device has the capability to determine touch locations of multiple simultaneous touch events. The touch events disturb optical beams propagating across the touch sensitive surface. With multi-touch events, a single beam can be disturbed by more than one touch event. In one aspect, a non-linear transform is applied to measurements of the optical beams in order to linearize the effects of multiple touch events on a single optical beam. In another aspect, the effect of known touch events (i.e., reference touches) is modeled in advance, and then unknown touch events are determined with respect to the reference touches.
Type: Application
Filed: June 5, 2017
Publication date: September 21, 2017
Inventors: Julien Piot, Mihailo Kolundzija, Danil Korchagin, Ivan Dokmanic, Martin Vetterli, Owen Drumm
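A non-linear transform of the kind described is commonly a logarithm: when several touches each attenuate a beam multiplicatively, taking the negative log of the measured transmission makes their contributions additive. The sketch below is a minimal illustration of that principle under assumed transmittance values, not the patented implementation:

```python
import math

def linearize(transmission):
    """Convert multiplicative beam attenuation into additive touch effects.
    A beam crossing touches with transmittances t1, t2 measures T = t1 * t2;
    -log(T) = -log(t1) - log(t2), so the touch effects sum linearly."""
    return -math.log(transmission)

# Two hypothetical touches, each passing 70% of the light, on the same beam.
t1, t2 = 0.7, 0.7
combined = t1 * t2              # measured transmission: 0.49
additive = linearize(combined)
# The linearized measurement equals the sum of the individual touch effects.
assert abs(additive - (linearize(t1) + linearize(t2))) < 1e-12
```

With the effects made additive, each beam measurement becomes a linear equation over the per-touch attenuations, which is what makes multi-touch reconstruction tractable.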
-
Publication number: 20170269778
Abstract: Techniques for adjusting a sensing frequency of a sensing signal are provided. The techniques include performing sensing and display updates with frequencies that have an integer ratio. The techniques include detecting a noise signal with a frequency similar to the sensing frequency. The techniques also include varying the integer ratio to achieve a desired sensing signal frequency.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventor: Kasra KHAZENI
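The idea of varying an integer sensing-to-display ratio to dodge a detected noise frequency can be sketched as follows; the frequencies and the 5% "too close to noise" margin are illustrative assumptions, not values from the application:

```python
def pick_sensing_freq(display_hz, noise_hz, ratios=(1, 2, 3, 4)):
    """Choose an integer multiple of the display rate whose resulting
    sensing frequency stays clear of the detected noise frequency."""
    for n in ratios:
        sense_hz = display_hz * n
        # Reject candidates within 5% of the noise frequency (illustrative margin).
        if abs(sense_hz - noise_hz) / noise_hz > 0.05:
            return n, sense_hz
    raise ValueError("no interference-free integer ratio found")

# Noise detected at 60 Hz: the ratio n=1 (60 Hz) collides, so n=2 is chosen.
ratio, freq = pick_sensing_freq(display_hz=60, noise_hz=60)
```

Keeping the ratio an integer preserves the alignment between sensing and display updates while still letting the sensing frequency move away from the noise.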
-
Publication number: 20170269779
Abstract: A capacitive sensing method comprising: driving a first excitation signal onto a sensor electrode, and driving a second excitation signal onto the embedded electrodes. Either the first or the second excitation signal has voltage oscillations of constant first amplitude, while the other has voltage oscillations of varying second amplitude. The second amplitude varies between a high amplitude value that is larger than the first amplitude by an amplitude difference and a low amplitude value that is smaller than the first amplitude by the same amplitude difference. The method further comprises using one or more integrators to integrate an electric current due to combined voltage oscillations of the first and the second excitation signals within an integration cycle, and generate an integrated signal for detecting a capacitive change on the sensor electrode; wherein the integration cycle comprises a plurality of voltage oscillations of the first and the second excitation signals.
Type: Application
Filed: June 7, 2016
Publication date: September 21, 2017
Inventors: Wing Chi Stephen Chan, Jun Chen, Sing Ng
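The integrator at the heart of such a method accumulates charge Q = C x (sum of voltage swings) over the cycle, so a capacitance change shows up as a change in the integrated signal. The sketch below illustrates only that generic integration principle, with made-up component values; it does not model the claimed dual-amplitude excitation scheme:

```python
def integrated_charge(capacitance_f, v_swings):
    """Integrate the current i = C * dV/dt over one integration cycle:
    the accumulated charge is C times the sum of the voltage swings."""
    return capacitance_f * sum(v_swings)

# Illustrative values: 8 oscillations of 3.3 V; a finger adds ~2 pF.
baseline = integrated_charge(10e-12, [3.3] * 8)
touched = integrated_charge(12e-12, [3.3] * 8)
delta = touched - baseline   # the capacitance increase is visible here
```

Integrating over many oscillations, as the claim requires, averages out cycle-to-cycle noise in exchange for a slower detection rate.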
-
Publication number: 20170269780
Abstract: This disclosure provides a touch screen panel, comprising: a plurality of first touch control electrodes extending in a first direction; a plurality of second touch control electrodes extending in a second direction intersecting with the first direction; a plurality of first wirings, one end of each first wiring being connected to a corresponding first touch control electrode, and the other end of each first wiring being connected to an integrated circuit, wherein each first wiring is located within a gap formed by the corresponding first touch control electrode and the adjacent second touch control electrode and extends along the second direction.
Type: Application
Filed: August 5, 2016
Publication date: September 21, 2017
Inventor: Hao ZHANG
-
Publication number: 20170269781
Abstract: A driving method for an in-cell touch display and a mobile device using the same are provided. The driving method for an in-cell touch display comprises the steps of: dividing a frame period into N display/touch detection sub-periods each comprising a display sub-period and a touch detection sub-period; dividing scan lines into M scan-line sets, wherein a position of each of the scan-line sets corresponds to at least one of touch sensors; supplying a display common voltage to the touch sensor corresponding to the Ith scan-line set when the scan line being scanned in the display sub-period of the Kth display/touch detection sub-period comprises the scan line of the Ith scan-line set, wherein N, M, K and I are natural numbers, K is smaller than or equal to N, and I is smaller than or equal to M.
Type: Application
Filed: August 15, 2016
Publication date: September 21, 2017
Inventors: Ho-Nien YANG, Shen-Chia HUANG
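The frame-division step can be sketched as a schedule builder. The round-robin mapping of sub-periods to scan-line sets below is an assumption for illustration; the claim only requires that the sensor over the currently scanned set receive the display common voltage:

```python
def frame_schedule(n_subperiods, scan_line_sets):
    """Split a frame into N display/touch sub-periods and record which
    scan-line set each one scans; the touch sensor over that set gets
    the display common voltage during the display phase."""
    m = len(scan_line_sets)
    schedule = []
    for k in range(n_subperiods):
        i = k % m  # assumed round-robin assignment of sets to sub-periods
        schedule.append({"subperiod": k + 1,
                         "scan_lines": scan_line_sets[i],
                         "common_voltage_sensor": i + 1})
    return schedule

# Hypothetical 540-line panel split into two scan-line sets, N=4 sub-periods.
sched = frame_schedule(4, [range(0, 270), range(270, 540)])
```

Interleaving display and touch sub-periods this way keeps the touch sensor electrically quiet (held at the common voltage) exactly while its own lines are being scanned.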
-
Publication number: 20170269782
Abstract: The present disclosure provides a photoelectric sensor and driving method thereof, as well as an array substrate and a display device. The photoelectric sensor comprises a photoelectric element having an output terminal and a reference level input terminal, an amplifying transistor, a readout transistor, a reset transistor, a capacitor and a plurality of control input terminals. The output terminal of the photoelectric element, the gate of the amplifying transistor and the source of the reset transistor are connected to a first terminal of the capacitor. The reference level input terminal, the sources of the readout transistor and amplifying transistor are connected to a first reference voltage input terminal. The drains of the reset transistor and amplifying transistor are connected to a second reference voltage input terminal. The gates of the readout transistor and reset transistor are respectively connected to a control input terminal.
Type: Application
Filed: February 14, 2016
Publication date: September 21, 2017
Inventor: Chunwei WU
-
Publication number: 20170269783
Abstract: The present disclosure provides in some embodiments a pixel driving circuit, a display panel, a method for driving the display panel, and a display device. The pixel driving circuit includes a preset unit, a driving unit, a compensation unit, an energy storage unit, and a driving signal output unit.
Type: Application
Filed: March 1, 2016
Publication date: September 21, 2017
Applicants: BOE TECHNOLOGY GROUP CO., LTD., BEIJING BOE OPTOELECTRONICS TECHNOLOGY CO., LTD.
Inventors: Shengji YANG, Xue DONG, Hailin XUE, Haisheng WANG, Xiaochuan CHEN, Hongjuan LIU, Tuo SUN, Lifei MA, Yingming LIU, Weijie ZHAO, Changfeng LI
-
Publication number: 20170269784
Abstract: An input device includes a sensor unit, a processing unit, a storage unit, and an interface unit. The processing unit includes: a change amount calculating section for calculating the temporal change amount of the electrostatic capacitance that is detected by the sensor unit and that changes according to the proximity degree of an object with respect to a detection surface; a determination section for determining a touch operation corresponding to an operation on the detection surface using a fingertip, a palm operation corresponding to an operation on the detection surface using a palm, and a grip operation corresponding to an operation on a conductor portion, based on the calculated change amount; and a reset section for performing reset processing in a case where the duration of a state, in which a palm operation is not determined and a grip operation is determined, is a first predetermined time or more.
Type: Application
Filed: November 11, 2016
Publication date: September 21, 2017
Inventors: Satoshi NAKAJIMA, Satoshi HAYASAKA, Kohei KITAGAWA
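A determination section of this kind can be sketched as a threshold classifier over capacitance-change amounts. All thresholds below are illustrative assumptions; the application does not publish its decision values:

```python
def classify(change_amounts, grip_change, touch_max=20, palm_min=40, grip_min=10):
    """Classify an input from capacitance-change amounts (illustrative
    thresholds): a small localized change suggests a fingertip touch, a
    large change suggests a palm, and change on the conductor portion
    indicates a grip. Returns (surface_operation, is_grip)."""
    peak = max(change_amounts)
    is_grip = grip_change >= grip_min
    if peak >= palm_min:
        return "palm", is_grip
    if 0 < peak <= touch_max:
        return "touch", is_grip
    return "none", is_grip

# A light fingertip touch while gripping the device's conductive edge.
result = classify([5, 12, 8], grip_change=15)
```

The reset behavior described in the abstract would then watch for the state ("none"/"touch" without palm, with grip) persisting past the first predetermined time.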
-
Publication number: 20170269785
Abstract: In one aspect, the present disclosure relates to a method including detecting force applied to a force sensing layer in a device and detecting touch contacts applied to a touch screen layer in a device. The method also includes determining if the location and/or amount of force detected by the force sensing layer correlates to a touch contact, and, if it correlates, treating the force as front-side force, while if the location and/or amount of force detected by the force sensing layer does not correlate to a touch contact, treating the force as a back-side force. Based on the type of force detected, appropriate action may be taken, including back-side specific actions such as multi-tasking application switches or content or viewport manipulation.
Type: Application
Filed: March 17, 2017
Publication date: September 21, 2017
Applicant: Apple Inc.
Inventors: Golnaz ABDOLLAHIAN, Wayne C. WESTERMAN
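The correlation test can be sketched as a simple distance check between the force location and the active touch contacts. The 30-unit radius is an illustrative assumption, and real devices would likely also compare force magnitudes:

```python
def classify_force(force_xy, touch_points, max_dist=30.0):
    """Treat a force as front-side if its location correlates with any
    touch contact (within an assumed 30-unit radius), else back-side."""
    fx, fy = force_xy
    for tx, ty in touch_points:
        if ((fx - tx) ** 2 + (fy - ty) ** 2) ** 0.5 <= max_dist:
            return "front-side"
    return "back-side"

# A press right under a finger vs. a press with no nearby contact.
front = classify_force((100, 200), [(105, 195)])
back = classify_force((100, 200), [(400, 50)])
```

A back-side classification could then trigger the back-side-specific actions the abstract mentions, such as an app switch, without disturbing front-side touch handling.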
-
Publication number: 20170269786
Abstract: An electronic device (300) includes a housing (301). A touch sensitive surface (100) can be disposed along the housing. The touch sensitive surface can include a recessed surface feature (106) on a portion of the touch sensitive surface. A control circuit (315), operable with the touch sensitive surface, can detect a predetermined gesture sequence (501, 502, 503) when a touch actuation along the touch sensitive surface interacts with the recessed surface feature.
Type: Application
Filed: June 7, 2017
Publication date: September 21, 2017
Inventors: Chad Austin Phipps, Jeffrey R. DeVries, John C. Johnson, Louis J. Lundell, Thomas Y. Merrell, Mitul R. Patel, Jiri Slaby
-
Publication number: 20170269787
Abstract: A user interface system, including a display surface displaying thereon a plurality of controls representing different applications, a multi-faceted housing situated along a single edge of the display surface, including an eye-tracker mounted in a first facet of the housing, the first facet being distal from the display surface, the eye-tracker identifying a control on the display surface at which a user's eyes are directed, and a proximity sensor mounted in a second facet of the housing, the second facet being between the first facet and the display, the proximity sensor detecting a gesture performed by an object opposite the second facet, and a processor running the different applications, connected to the proximity sensor and to the eye-tracker, causing the application represented by the control identified by the eye-tracker to receive as input the gesture detected by the proximity sensor.
Type: Application
Filed: June 7, 2017
Publication date: September 21, 2017
Inventors: Thomas Eriksson, Robert Pettersson, Stefan Holmgren, Xiatao Wang, Rozita Teymourzadeh, Per Erik Lindström, Emil Anders Braide, Jonas Daniel Justus Hjelm, Erik Rosengren
-
Publication number: 20170269788
Abstract: Disclosed are a projector screen and a touch-screen projection display method and system, which belong to the field of display technology. The touch-screen projection display system comprises a projector screen, a processing module and a projector. The projector screen is provided with a capacitive touch film layer, is connected with the processing module, and is configured to receive a touch operation via the capacitive touch film layer and transmit the touch position information generated by the touch operation to the processing module; the processing module is connected with the projector and is configured to determine the target image information according to the touch position information; and the projector is configured to receive the target image information transmitted by the processing module, process the target image information, and project the processed target image information onto the projector screen.
Type: Application
Filed: February 1, 2016
Publication date: September 21, 2017
Inventors: Yifei ZHAN, Dayu ZHANG
-
Publication number: 20170269789
Abstract: An optical touch device using an imaging module according to the present invention includes a light guide plate, at least one light source, an imaging module, at least one photosensitive unit, and a microprocessor. The imaging module has at least one image-reflective curved surface that images multiple contact positions onto the photosensitive unit, forming image information corresponding to the multiple contact positions so that the microprocessor can generate corresponding touch signals. The images at the positions where an object contacts the light guide plate are used directly to generate the touch signals, instead of detecting the attenuation signal of frustrated total internal reflection that results from the object contacting the light guide plate. Because the light in the light guide plate propagates by total internal reflection to attain the touch control function, the number of components needed in the touch device is reduced and the fabrication cost is largely saved.
Type: Application
Filed: March 13, 2017
Publication date: September 21, 2017
Applicant: inFilm Optoelectronic Inc.
Inventors: CHIH-HSIUNG LIN, SHIH-YUAN CHANG
-
Publication number: 20170269790
Abstract: Technologies for context aware graphical user interfaces for mobile compute devices include a mobile compute device that includes a touchscreen display, a wireless signal sensor to receive a wireless signal transmitted by a wearable device, a context determination module, and a graphical user interface manager module. The context determination module is to measure a strength of the wireless signal and identify which hand of the user is presently used to hold the mobile compute device based on the measured strength of the wireless signal. The graphical user interface manager module is to configure a graphical user interface displayed on the touchscreen display based on the identification of the hand of the user presently used to hold the mobile compute device.
Type: Application
Filed: March 18, 2016
Publication date: September 21, 2017
Inventors: Andrea Grandi, Stephanie Courtney
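The hand-identification step can be sketched as a signal-strength threshold: a wearable worn on one wrist produces a stronger signal when the phone is held in that same hand. The RSSI threshold and the assumption that the watch is on the left wrist are both illustrative, not from the application:

```python
def holding_hand(rssi_dbm, threshold_dbm=-55, watch_on="left"):
    """Guess which hand holds the phone from the wearable's signal strength:
    a strong signal suggests the phone is in the hand nearest the watch.
    Threshold and watch placement are illustrative assumptions."""
    same_hand = rssi_dbm >= threshold_dbm
    return watch_on if same_hand else ("right" if watch_on == "left" else "left")

# Strong signal: phone is likely in the watch hand; weak: the other hand.
strong = holding_hand(-40)
weak = holding_hand(-70)
```

The GUI manager could then mirror one-handed controls (keyboard, reachability targets) toward the identified thumb side.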
-
Publication number: 20170269791
Abstract: Relevant content (e.g., containers and/or container elements) can be surfaced via user interfaces based at least partly on determining the relevant content based on interactions between user(s), container(s), and/or container element(s). Techniques described herein include generating a user interface configured with functionality to present content to a user. The user interface can include interface elements, such as cards, corresponding to containers. The cards can be arranged on the user interface in an order determined based at least partly on respective relevancies of the containers to the user, and a presentation of individual cards can be based at least partly on a type of corresponding individual containers. Individual cards can include a group of one or more interface elements corresponding to container elements that can be arranged based at least partly on respective relevancies of the container elements to the user.
Type: Application
Filed: March 21, 2016
Publication date: September 21, 2017
Inventors: Dmitriy Meyerzon, David M. Cohen, Adam Ford, Andrew C. Haon, Ryan Nakhoul, Jason Glenn Silvis, Vidya Srinivasan, Denise Trabona
-
Publication number: 20170269792
Abstract: Provided is a method of notifying a schedule by using a mobile terminal, the method including obtaining schedule information indicating a schedule of a user; generating at least one schedule tag based on time information comprised in the schedule information of the user; displaying a clock graphical user interface (GUI) on a screen of the mobile terminal; displaying the schedule tag on a periphery of the displayed clock GUI, based on the time information corresponding to the schedule tag; and changing and displaying an attribute of the displayed schedule tag, according to the time information corresponding to the displayed schedule tag, and current time.
Type: Application
Filed: December 14, 2015
Publication date: September 21, 2017
Applicant: SAMSUNG ELECTRONICS CO., LTD.
Inventors: Li XU, Pengyu LI, Zhe ZHAO
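Placing a schedule tag on the periphery of a clock GUI reduces to mapping its time to an angle on a 12-hour face. A minimal sketch of that mapping (the angle convention, clockwise from 12 o'clock, is an assumption):

```python
def tag_angle(hour, minute):
    """Place a schedule tag on the periphery of a 12-hour clock face:
    each hour spans 30 degrees, measured clockwise from 12 o'clock."""
    return ((hour % 12) + minute / 60.0) * 30.0

# A 3:30 pm meeting sits 105 degrees clockwise from the top of the dial.
angle = tag_angle(15, 30)
```

The attribute changes the abstract describes (e.g., dimming a tag once its time has passed) would then compare the tag's time against the current time before each redraw.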
-
Publication number: 20170269793
Abstract: The description relates to a shared digital workspace. One example includes a display device and sensors. The sensors are configured to detect users proximate the display device and to detect that an individual user is performing an individual user command relative to the display device. The system also includes a graphical user interface configured to be presented on the display device that allows multiple detected users to simultaneously interact with the graphical user interface via user commands.
Type: Application
Filed: June 6, 2017
Publication date: September 21, 2017
Applicant: Microsoft Technology Licensing, LLC
Inventors: Desney S. TAN, Kenneth P. HINCKLEY, Steven N. BATHICHE, Ronald O. PESSNER, Bongshin LEE, Anoop GUPTA, Amir NETZ, Brett D. BREWER
-
Publication number: 20170269794
Abstract: At an electronic device with a touch screen display, display a user interface, where the user interface includes at least two windows of an application, a first icon displayed in association with a first window of the at least two windows, and a second icon at a location different from the at least two windows. While displaying the at least two windows, detect a gesture on the touch screen display. In response to detecting the gesture: in accordance with detecting the gesture on the first icon associated with the first window on the touch screen display, remove from the display the first window of the at least two displayed windows corresponding to the gesture; and in accordance with detecting the gesture on the second icon at the location different from the at least two windows of the application, display a new window of the application on the touch screen display.
Type: Application
Filed: June 9, 2017
Publication date: September 21, 2017
Inventors: Scott FORSTALL, Chris BLUMENBERG, Andre M.J. BOULE, Imran CHAUDHRI, Gregory N. CHRISTIE, Stephen O. LEMAY, Marcel VAN OS, Richard J. WILLIAMSON
-
Publication number: 20170269795
Abstract: A multi-window user interface (UI) is presented in various configurations and operational uses to leverage the relatively large display canvas afforded by large screen display devices such as 4K or 8K displays. Along with the various "Multiview" aspects, content delivery techniques, content selection techniques, and level of service techniques also are presented.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventors: STEVEN MARTIN RICHMAN, JASON CLEMENT
-
Publication number: 20170269796
Abstract: Comparing recurring processes. A method includes automatically identifying a plurality of recurring process instances having one or more commonalities. The method further includes displaying the plurality of recurring process instances having one or more commonalities to a user in a user interface. The method further includes receiving user input at the user interface selecting a first recurring process instance from the plurality of recurring process instances. The method further includes receiving user input at the user interface selecting a second recurring process instance from the plurality of recurring process instances. The method further includes automatically identifying differences in the first and second recurring process instances. The method further includes presenting the differences in the graphical user interface to the user.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventors: Jiahui Wang, Yaron Burd, Omid Afnan
-
Publication number: 20170269797
Abstract: Embodiments of the invention provide a system for enhancing user interaction with the Internet of Things in a network. The system includes a processor, and a memory. The memory includes a database including one or more options corresponding to each of the Internet of Things. Further, the memory includes instructions executable by the processor for providing the options to a user for enabling the user to select at least one option therefrom. Further, the instructions create a visual menu based on information corresponding to selection of the at least one option. The visual menu includes one or more objects corresponding to the Internet of Things. Furthermore, the instructions receive a rating for the visual menu from one or more second users of the Internet of Things. Additionally, instructions customize the visual menu based on the received rating.
Type: Application
Filed: March 18, 2016
Publication date: September 21, 2017
Inventors: Tal Lavian, Zvi Or-Bach
-
Publication number: 20170269798
Abstract: Relevant content can be surfaced via user interfaces presented via devices based at least partly on determining the relevant content from interactions between user(s), container(s), and/or container element(s). Techniques described herein include accessing data associated with interactions between a user and content (e.g., containers and container elements) associated with a collaborative computing environment. Based at least partly on the data, relationships between the user, container(s), and/or container element(s), and weights corresponding to individual relationships of the relationships can be determined. Techniques described herein include determining at least a portion of the content that is relevant to the user based at least partly on the weights and generating a content page associated with the collaborative computing environment configured with functionality to surface at least the portion of the content.
Type: Application
Filed: March 21, 2016
Publication date: September 21, 2017
Inventors: Dmitriy Meyerzon, David M. Cohen, Bjornstein Lilleby, Aninda Ray, Yauhen Shnitko, Vidya Srinivasan, Michael Taylor, Vidar Vikjord, Nikita Voronkov
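Weighted relationships of this kind can be sketched as a weighted sum over interaction counts, with containers ranked by the resulting score. The interaction kinds and weights below are illustrative assumptions, not values from the application:

```python
def relevance(interactions, weights={"edit": 3.0, "share": 2.0, "view": 1.0}):
    """Score a container's relevance to a user as the weighted sum of that
    user's interactions with it (weights are illustrative assumptions)."""
    return sum(weights.get(kind, 0.0) * count
               for kind, count in interactions.items())

# Hypothetical per-container interaction counts for one user.
containers = {
    "Team Wiki": {"view": 10, "edit": 4},   # score 10*1 + 4*3 = 22
    "Old Report": {"view": 2},              # score 2
}
ranked = sorted(containers, key=lambda c: relevance(containers[c]), reverse=True)
```

The generated content page would then surface the top-ranked containers first, which matches the card ordering described in the related publication 20170269791 above.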
-
Publication number: 20170269799
Abstract: The disclosed methods and apparatus relate generally to the use of electronic devices to interact with electronic tags associated with objects, services, places, people, or animals ("objects") in order to access programming material related to these objects. The invention discloses the input of keywords, phrases, preferences, or interests into a user's device to search a plurality of electronic tags within a given physical range to access programming material from the tags—or via data links provided in the tags—that matches the keywords, phrases, preferences, or interests. Furthermore, the invention locates, maps, and labels the objects associated with this programming material on the user's display.
Type: Application
Filed: March 7, 2017
Publication date: September 21, 2017
Inventor: SPENCER A. RATHUS
-
Publication number: 20170269800
Abstract: The present specification discloses a mobile terminal and a control method thereof. According to an embodiment of the present specification, the mobile terminal comprises: a display unit configured to output a first region which displays messages and a second region which displays information on a contact corresponding to the messages; and a control unit for outputting, to the display unit, a controller user interface (UI) for processing the messages to be displayed in the first region in association with the information on the contact displayed in the second region when a preset touch input is received in the second region. Here, the controller UI comprises: a menu region which displays scrap information associated with the information on the contact displayed in the second region among the messages displayed in the first region; and an input region which displays a message corresponding to the scrap information selected in the menu region.
Type: Application
Filed: February 27, 2015
Publication date: September 21, 2017
Applicant: LG ELECTRONICS INC.
Inventors: Junho PARK, Bokheun LEE, Jiseok JUNG, Seungjun LEE
-
Publication number: 20170269801
Abstract: Disclosed are a multimedia device and a control method therefor. A multimedia device according to an embodiment of the present invention may comprise: a display unit; and a control unit for controlling the display unit to display a menu bar on a screen, wherein the menu bar includes a plurality of icons arranged in one direction, and the icons adjacent to each other are displayed while partially overlapping each other. A multimedia device according to another embodiment of the present invention may comprise: a display unit; and a control unit for controlling the display unit to display a menu bar on a screen, wherein the menu bar includes a plurality of icons arranged in one direction, and the control unit sets a fixed area by fixing at least one icon included in the menu bar by a fixing frame, and separates the menu bar into the fixed area and a variable area according to whether the fixing frame exists.
Type: Application
Filed: February 6, 2015
Publication date: September 21, 2017
Applicant: LG ELECTRONICS INC.
Inventor: Jaesun YUN
-
Publication number: 20170269802
Abstract: Methods and apparatus related to determining a triggering event of a user, selecting media relevant to the triggering event, and providing the selected media to the user. Some implementations are directed to methods and apparatus for determining a past event of the user that is indicative of past interaction of the user with one or more past entities and the triggering event may be determined to be associated with the past event. The media selected to provide to the user may contain media that includes the one or more past entities associated with the past event and the media may be provided to the user in response to the triggering event.
Type: Application
Filed: May 31, 2017
Publication date: September 21, 2017
Inventors: Matthew Kulick, Aparna Chennapragada, Albert Segars, Hartmut Neven, Arcot J. Preetham
-
Publication number: 20170269803
Abstract: A screen management system may receive a user input to divide a screen area and divide the screen area into a plurality of divided screen areas. A different application may be displayed in each of the divided screen areas. User input comprised of dragging a resizing bar in a direction is received and the plurality of divided screen areas are resized in response to the user input.
Type: Application
Filed: May 30, 2017
Publication date: September 21, 2017
Applicant: INADEV CORPORATION
Inventor: Benjamin MONNIG
-
Publication number: 20170269804
Abstract: A multi-window user interface (UI) is presented in various configurations and operational uses to leverage the relatively large display canvas afforded by large screen display devices such as 4K or 8K displays. Along with the various "Multiview" aspects, content delivery techniques, content selection techniques, and level of service techniques also are presented.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventors: STEVEN MARTIN RICHMAN, JASON CLEMENT
-
Publication number: 20170269805
Abstract: Aspects of the present disclosure relate to systems and methods for creating and progressing files through a workflow using a workflow board. In one aspect, a file repository may be rendered within a file sharing tool. The file repository may include at least a workflow board. A workflow board view associated with a workflow comprising one or more stages may be generated. The generated workflow board view may be displayed as the workflow board. In one example, the workflow board view includes a visual representation of a file status of one or more files progressing through the one or more stages of the workflow.
Type: Application
Filed: March 17, 2016
Publication date: September 21, 2017
Inventors: John L. DeMaris, Michael Scott Pierce, M. Tyler J. Rasmussen
-
Publication number: 20170269806
Abstract: As disclosed herein, a computer-implemented method includes detecting a gesture applied to a touch display that corresponds to an object, determining a direction for the gesture, and determining if a gesture pressure is greater than a threshold pressure required to move the object. The method further includes determining a surface profile of the display and selecting a rate of transfer according to the surface profile of the display, determining if a surface profile of a receiving device is compatible with the surface profile of the display, and transferring the object to the receiving device. A computer program product and a computer system corresponding to the above method are also disclosed herein.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventors: Vijay Ekambaram, Sarbajit K. Rakshit
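The pressure gate and surface-profile-based rate selection described above can be illustrated with a short sketch; the threshold values, the rate table, and the "identical profiles are compatible" rule are assumptions for illustration, not details from the claim.

```python
# Hypothetical sketch of pressure-gated object transfer between surfaces.

SURFACE_RATES = {"flat": 10.0, "curved": 6.0, "edge": 2.0}  # MB/s, illustrative

def should_transfer(gesture_pressure, threshold):
    """Move the object only if the gesture pressure exceeds the threshold."""
    return gesture_pressure > threshold

def select_rate(display_profile, receiver_profile):
    """Pick a transfer rate from the display's surface profile, but only when
    the receiving device's profile is compatible (here modeled as identical)."""
    if display_profile != receiver_profile:
        return None  # incompatible surfaces: no transfer
    return SURFACE_RATES[display_profile]
```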
-
Publication number: 20170269807
Abstract: An application is installed on a device that includes a user interface comprising multiple elements organized in a hierarchy. The application communicates with an inspector tool that accesses the hierarchy. The inspector tool may be integrated into the application or separate from the application. During execution of the application, the inspector identifies an element in the hierarchy and presents information regarding the element. For example, a display region corresponding to the element may be highlighted or text regarding the element may be displayed. The hierarchy may be navigated by selection of items in the user interface itself, by selection of an item in the displayed information, by directional inputs, or any suitable combination thereof. Information displayed in the inspector may be configured by the application based on the identified element.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventor: Tyler Yong Nugent
-
Publication number: 20170269808
Abstract: An interface operating control device applied in an electronic device with an interface includes a sensing module, a recognizing module, a bottom layer executing module, and an operating module. The sensing module outputs a sensing signal when a distance between the object and the interface is less than a preset distance. The recognizing module receives the sensing signal and acquires operating characteristics of the object according to the sensing signal. The bottom layer executing module boots an application mode corresponding to the currently acquired operating characteristics. The operating module determines the different intensities applied to the interface by the object according to the total areas touching the interface, and executes different operation instructions of the currently operating application mode according to the different intensities. Therefore, the system of the electronic device can be accessed rapidly and conveniently, and users are offered a rich set of operations and experiences.
Type: Application
Filed: April 29, 2016
Publication date: September 21, 2017
Inventor: WEN-HSIN CHANG
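The area-to-intensity mapping in the operating module above can be sketched as a simple classifier; the area thresholds, intensity names, and operation table are hypothetical, chosen only to make the idea concrete.

```python
# Illustrative sketch: total touch area -> intensity level -> operation instruction.

def intensity_from_area(total_area_mm2):
    """Classify contact intensity from the total area touching the interface."""
    if total_area_mm2 < 40:
        return "light"
    if total_area_mm2 < 90:
        return "medium"
    return "heavy"

# Hypothetical per-intensity operation instructions for the current application mode.
OPERATIONS = {"light": "preview", "medium": "open", "heavy": "context_menu"}

def execute(total_area_mm2):
    """Execute the operation instruction matching the determined intensity."""
    return OPERATIONS[intensity_from_area(total_area_mm2)]
```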
-
Publication number: 20170269809
Abstract: A method for screen capture is provided. The method includes: upon receipt of a slow play instruction, playing a video at a predetermined slowing rate in a video playing interface of a terminal device, the predetermined slowing rate being less than a normal play rate of the video; receiving a screen capture instruction during the course of playing the video at the predetermined slowing rate; and performing screen capture for the video playing interface according to the screen capture instruction.
Type: Application
Filed: August 25, 2016
Publication date: September 21, 2017
Inventor: Yanying QIAN
-
Publication number: 20170269810
Abstract: An electronic whiteboard displays visual information on a display device depending on an input position on a screen. The electronic whiteboard includes a communication information storage unit configured to store communication information for communicating with another electronic whiteboard; an obtainment unit configured to communicate with the other electronic whiteboard by using the communication information, and to obtain a program providing a function of the electronic whiteboard and a common introduction program, common to models of the electronic whiteboard, for introducing the program; and an introduction unit configured to execute the common introduction program and to introduce the program in the electronic whiteboard.
Type: Application
Filed: March 3, 2017
Publication date: September 21, 2017
Applicant: Ricoh Company, Ltd.
Inventors: Tomoki KANDA, Yoshinaga KATO
-
Publication number: 20170269811
Abstract: Provided is a control panel for machine tools that improves operability and enables even an unskilled person to easily grasp the procedure for navigating from a screen to a target screen. A plurality of screen data are classified into a plurality of groups, and within each of the groups are classified into main screen data and sub-screen data associated with the main screen data. A control section displays the main screen data, and displays the sub-screen data on a display section according to a type of input to a plurality of main-screen corresponding buttons.
Type: Application
Filed: March 15, 2017
Publication date: September 21, 2017
Applicant: JTEKT Corporation
Inventors: Kazuhiro TSUJIMURA, Tomokazu TAKAYAMA, Takahito UMEKI, Masanori Ando, Hiroyuki TSUSAKA
-
Publication number: 20170269812
Abstract: The present invention provides displaying methods and a related electronic apparatus for displaying contents without their being blocked by at least one object on a screen of the electronic apparatus. The electronic apparatus comprises: a blocking object determining circuit, a blocked area identifying circuit, a blocked contents identifying circuit, a blocked contents analyzing circuit, a screen contents display processor, and the screen. The present invention enables the blocked contents on the screen to bypass the blocked area and be displayed, without the users needing to move their hands or other blocking objects.
Type: Application
Filed: March 19, 2017
Publication date: September 21, 2017
Inventors: Dan Luo, Yi-Kai Lee, Xiaomeng Yan, Chun-Chia Chen
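One way the "bypass the blocked area" behavior above could work is to reflow content rows around the occluded band. The row-based model and function name below are assumptions for illustration, not the patent's circuitry.

```python
# Hypothetical sketch: place content lines on the first rows not covered
# by the blocking object, preserving their order.

def reflow(lines, blocked_rows, screen_rows):
    """Return a mapping from each content line to its displayed row,
    skipping rows occluded by the blocking object."""
    blocked = set(blocked_rows)
    placement, row = {}, 0
    for line in lines:
        while row in blocked:
            row += 1            # skip rows covered by the blocking object
        if row >= screen_rows:
            break               # screen full: remaining lines stay hidden
        placement[line] = row
        row += 1
    return placement
```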
-
Publication number: 20170269813
Abstract: In one aspect, a method is provided for controlling a display device, comprising: displaying, on a touchscreen display, a first window executing a first application and a second window executing a second application; receiving, at the touchscreen display, a first command input to the first window and a second command input to the second window; determining whether the first command and the second command are received simultaneously; dispatching, by a processor, the first command; and dispatching, by the processor, the second command.
Type: Application
Filed: June 1, 2017
Publication date: September 21, 2017
Inventor: Sung-Jae CHO
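The simultaneity determination above can be sketched with a timestamp-window grouping; the 30 ms window and the event tuple shape are assumptions, not values from the claim.

```python
# Illustrative sketch: group commands received within a small time window
# as "simultaneous", then dispatch each group in order.

def dispatch(events, simultaneity_ms=30):
    """events are (timestamp_ms, window, command) tuples; returns a list of
    dispatched groups, each group a list of (window, command) pairs."""
    events = sorted(events)
    dispatched, group = [], []
    for ts, window, cmd in events:
        if group and ts - group[-1][0] > simultaneity_ms:
            dispatched.append([(w, c) for _, w, c in group])
            group = []
        group.append((ts, window, cmd))
    if group:
        dispatched.append([(w, c) for _, w, c in group])
    return dispatched
```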
-
Publication number: 20170269814
Abstract: An embodiment of the invention provides a method for displaying a text box on a display screen of an electronic device, including determining a state of a user with an input device having a camera, a keyboard, and/or a mouse. A text box setting on the electronic device is modified with a processor connected to the input device based on the state of the user; the modifying of the text box setting includes modifying an amount of visual information in the text box, modifying an amount of audible information played with the text box, and/or modifying an amount of time required to display the text box. The text box is displayed on the display screen of the electronic device when a pointer is within a threshold degree of proximity to an item on the display screen for the amount of time required to display the text box.
Type: Application
Filed: March 16, 2016
Publication date: September 21, 2017
Applicant: International Business Machines Corporation
Inventors: James R. Kozloski, Clifford A. Pickover, Maja Vukovic
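The display condition above (pointer within a proximity threshold for a state-dependent amount of time) can be sketched directly; the user-state names, delay table, and 16-pixel threshold are hypothetical.

```python
# Illustrative sketch of state-dependent text box (tooltip) display timing.

HOVER_DELAY_MS = {"novice": 300, "expert": 900, "distracted": 150}  # assumed

def should_show_text_box(user_state, pointer_distance_px, hover_ms,
                         proximity_px=16):
    """Show the text box once the pointer has stayed within the proximity
    threshold for the delay associated with the user's state."""
    if pointer_distance_px > proximity_px:
        return False
    return hover_ms >= HOVER_DELAY_MS[user_state]
```

In this model, modifying the text box setting based on the user's state amounts to swapping the delay looked up for that state.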
-
Publication number: 20170269815
Abstract: Presenting thumbnails of visual objects using a data processing system includes determining, using a processor, aspect ratios for a plurality of visual objects and associating, using the processor, each of the plurality of visual objects with a thumbnail size selected from a plurality of predetermined thumbnail sizes based upon the aspect ratio of the visual object. A determination is made whether a visual feature is detected within the plurality of visual objects. A layout for displaying thumbnails of the plurality of visual objects is generated using the processor based on the chronological order of the visual objects, detection of the visual feature, and the thumbnail sizes associated with the visual objects. Using a screen and the processor, the thumbnails are displayed according to the layout.
Type: Application
Filed: March 17, 2016
Publication date: September 21, 2017
Inventors: Heron Da Silva Ramos, Tommy Park, Nasson Julian Schahin Boroumand, Syyean Gastelum, Wu Guan, Jinghai Rao, Hyung Keun Kim, Florian Dusch
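The aspect-ratio-to-thumbnail-size association above can be sketched as a lookup; the three predetermined sizes and the ratio cutoffs are illustrative assumptions, as is the timestamp-tuple representation of objects.

```python
# Illustrative sketch: select a predetermined thumbnail size per aspect ratio,
# then lay thumbnails out in chronological order.

SIZES = {"wide": (320, 180), "square": (200, 200), "tall": (180, 320)}  # assumed

def thumbnail_size(width, height):
    """Map a visual object's aspect ratio to one of the predetermined sizes."""
    ratio = width / height
    if ratio > 1.2:
        return SIZES["wide"]
    if ratio < 0.8:
        return SIZES["tall"]
    return SIZES["square"]

def layout(objects):
    """objects are (timestamp, width, height); return thumbnail sizes in
    chronological order."""
    return [thumbnail_size(w, h) for _, w, h in sorted(objects)]
```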
-
Publication number: 20170269816
Abstract: Methods and systems for manual and programmatic remediation of websites. JavaScript code is accessed by a user device and optionally calls TTS, ASR, and RADAE modules from a remote server to thereby facilitate website navigation by people with diverse abilities.
Type: Application
Filed: March 18, 2016
Publication date: September 21, 2017
Inventors: Sean D. Bradley, Mark D. Baker, Jeffrey O. Jones, Kenny P. Hefner, Adam Finkelstein, Douglas J. Gilormo, Taylor R. Bodnar, David C. Pinckney, Charlie E. Blevins, Trevor C. Jones, Helena Laymon
-
Publication number: 20170269817
Abstract: One embodiment provides a method, comprising: detecting, using a processor, a first user operation upon one of a plurality of displays operatively connected to an information handling device; and controlling, using the processor, the one of the plurality of displays to display thereupon a first window associated with the first user operation; wherein the first user operation may be performed upon any of the plurality of displays. Other aspects are disclosed and claimed.
Type: Application
Filed: March 21, 2017
Publication date: September 21, 2017
Inventor: Qi Hua Xiao
-
Publication number: 20170269818
Abstract: A method, executed by a computer, for remapping interface elements on a graphical user interface includes activating an action capture mode responsive to input from a user, receiving a selection of an interface element responsive to input from the user that uses the action capture mode, and adding an alias user interface element corresponding to the interface element to a shortcut group, wherein activation of the alias user interface element performs an action corresponding to the interface element. A computer program product and computer system corresponding to the above method are also disclosed herein.
Type: Application
Filed: March 15, 2016
Publication date: September 21, 2017
Inventors: Zai Cen, Jie Jiang, Wen Juan Nie, Qi Ruan, Li Zhang, Chao Xing Zhou
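The alias-remapping idea above can be sketched with a small class; the class name, method names, and callback-based action model are assumptions for illustration.

```python
# Illustrative sketch: capture an interface element's action into a shortcut
# group; activating the alias performs the original element's action.

class ShortcutGroup:
    def __init__(self):
        self.aliases = {}   # alias name -> the original element's action

    def capture(self, element_name, action):
        """In action-capture mode, add an alias that triggers the selected
        element's action."""
        self.aliases[element_name] = action

    def activate(self, element_name):
        """Activating the alias performs the corresponding action."""
        return self.aliases[element_name]()
```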
-
Publication number: 20170269819
Abstract: An authorized user is allowed to personalize an existing third-party webpage. A request is received from the authorized user to insert a user-selected image into the third-party webpage. It is then determined if the user-selected image is appropriate for use on the third-party webpage, where this determination is based on one or more image appropriateness criteria. Whenever the user-selected image is determined to be appropriate for use on the third-party webpage, the third-party webpage is personalized by inserting the user-selected image into the third-party webpage according to the authorized user's image insertion request.
Type: Application
Filed: March 21, 2016
Publication date: September 21, 2017
Inventors: Maria Bernadette G. Pinpin, Wallace Earl Greathouse, Jason J. Wall, Kris L. Kendall, Dan Li
-
Publication number: 20170269820
Abstract: Techniques are described herein that are capable of providing selectable interaction elements in a 360-degree video stream. A selectable interaction element is an element (e.g., user interface element) for which selection of the element initiates the providing of information pertaining to an object with which the element is associated. For instance, the selectable interaction element may be positioned proximate the object in the 360-degree video stream (e.g., a portion of the 360-degree video stream or an entirety of the 360-degree video stream). Examples of a user interface element include but are not limited to text, an icon, and a widget.
Type: Application
Filed: May 12, 2017
Publication date: September 21, 2017
Inventors: Raymond Walter Riley, Kae-Ling Gurr, Brett Delainey Christie, Joshua D. Maruska, Joshua Noble