Patents by Inventor B. Michael Victor

B. Michael Victor has filed for patents to protect the following inventions. This listing includes pending patent applications as well as patents already granted by the United States Patent and Trademark Office (USPTO).

  • Patent number: 8201109
    Abstract: Methods and graphical user interfaces for editing on a portable multifunction device with a touch screen display are disclosed. While displaying an application interface of an application, the device detects a multitouch edit initiation gesture on the touch screen display. In response to detection of the multitouch edit initiation gesture, the device displays a plurality of user-selectable edit option icons in an area of the touch screen display that is independent of a location of the multitouch edit initiation gesture. The device also displays a start point object and an end point object to select content displayed by the application in the application interface.
    Type: Grant
    Filed: September 30, 2008
    Date of Patent: June 12, 2012
    Assignee: Apple Inc.
    Inventors: Marcel Van Os, Bas Ording, Stephen O. Lemay, Wayne C. Westerman, B. Michael Victor
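The edit-initiation flow described in this abstract can be sketched as a small state machine: a multitouch gesture reveals the edit option bar (in a fixed screen area, independent of where the gesture landed) and places start/end point objects around a selection. This is an illustrative sketch only; all class and method names are hypothetical, not taken from the patent.

```python
class EditController:
    """Minimal model of the multitouch edit-initiation behavior."""

    def __init__(self):
        self.edit_options_visible = False  # the edit option icon bar
        self.selection = None              # (start, end) content offsets

    def on_touch(self, touch_count, word_range):
        """A multitouch (two or more finger) gesture enters edit mode:
        the option bar appears in its fixed area, and start/end point
        objects bracket the touched range. A single touch does nothing."""
        if touch_count >= 2:
            self.edit_options_visible = True
            self.selection = word_range
        return self.edit_options_visible

# A single-finger touch leaves the interface unchanged; a two-finger
# gesture reveals the edit options and sets the selection range.
ctrl = EditController()
ctrl.on_touch(1, (0, 3))
ctrl.on_touch(2, (10, 14))
```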
  • Publication number: 20120113023
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
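This split-keyboard abstract (shared by the related applications below) describes replacing an unsplit keyboard with an integrated input area of three portions in response to a gesture. A minimal sketch of that transition, with hypothetical names not drawn from the patent:

```python
class KeyboardController:
    """Minimal model of the unsplit-to-integrated keyboard transition."""

    def __init__(self):
        self.layout = "unsplit"  # initial state: one unsplit keyboard

    def on_split_gesture(self):
        """On the gesture, replace the unsplit keyboard with an
        integrated input area: a left portion holding the left half of
        the split keyboard, a right portion holding the right half,
        and a center portion in between."""
        self.layout = "integrated"
        return ["left half", "center portion", "right half"]

kb = KeyboardController()
parts = kb.on_split_gesture()
```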
  • Publication number: 20120117505
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120117506
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120113126
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120113024
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120113025
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120117501
    Abstract: A method includes, at an electronic device with a display and a touch-sensitive surface: concurrently displaying a first text entry area and an unsplit keyboard on the display; detecting a gesture on the touch-sensitive surface; and, in response to detecting the gesture on the touch-sensitive surface, replacing the unsplit keyboard with an integrated input area. The integrated input area includes a left portion with a left side of a split keyboard, a right portion with a right side of the split keyboard, and a center portion in between the left portion and the right portion.
    Type: Application
    Filed: March 30, 2011
    Publication date: May 10, 2012
    Inventors: Jonathan Koch, B. Michael Victor, Avi E. Cieplinski, Julian Missig
  • Publication number: 20120084689
    Abstract: User interface changes related to moving items in a user interface are disclosed. An operation (e.g., a drag operation) can be initiated on selected items by moving a cursor or pointing device in the user interface, and an animation can be presented illustrating representations of the selected items moving from their respective original locations toward a current location of the cursor or pointing device and forming a cluster in proximity to the current location of the cursor or pointing device. As the cluster of items is moved over a container object in the user interface, the representations of the items can adopt the appearance style defined by that container object. The representations of the items can also be shown to depart from the cluster and move toward anticipated locations of the items in the container object as a preview of a drop operation into the container object.
    Type: Application
    Filed: September 30, 2010
    Publication date: April 5, 2012
    Inventors: Raleigh Joseph Ledet, Jeffrey Traer Bernstein, B. Michael Victor, Avi E. Cieplinski, Kristin Forster, Craig Federighi
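The clustering animation described above — selected items moving from their original locations toward the cursor — reduces to interpolating each item's position toward the cursor's. A sketch of that interpolation (the function name and the parameter `t` are illustrative, not from the patent):

```python
def cluster_positions(item_positions, cursor, t):
    """Linearly interpolate each selected item from its original
    location (t=0) toward the cursor location (t=1), so that the
    items visually form a cluster near the cursor as t grows.

    item_positions: list of (x, y) original locations
    cursor: (x, y) current cursor or pointing-device location
    t: animation progress in [0, 1]
    """
    cx, cy = cursor
    return [((1 - t) * x + t * cx, (1 - t) * y + t * cy)
            for (x, y) in item_positions]
```

At t=1 every representation sits at the cursor (a fully formed cluster); intermediate values of t drive the drag animation frames.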
  • Publication number: 20120036460
    Abstract: An electronic device concurrently displays a plurality of user interface objects and a list of folder icons. The list of folder icons includes a first new folder icon. The device detects a first input by a user using a first user input device, selects a folder icon in the list of folder icons in accordance with the first input, and indicates selection of the folder icon in the list of folder icons. The device also detects a second input by the user using a second user input device on one or more of the displayed user interface objects, moves the one or more user interface objects into a folder that corresponds to the selected folder icon, and when the selected folder icon is the first new folder icon, displays a second new folder icon in the list of folder icons.
    Type: Application
    Filed: August 3, 2010
    Publication date: February 9, 2012
    Inventors: Avi E. Cieplinski, Julian Missig, B. Michael Victor
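The two-input folder interaction above can be modeled as: one input selects a folder icon, a second input drops objects into it, and dropping onto the "new folder" icon materializes that folder and appends a fresh new-folder icon. A sketch with hypothetical names and folder labels (none appear in the patent):

```python
class FolderPicker:
    """Minimal model of the two-input-device folder interaction."""

    def __init__(self):
        # The last icon is the "first new folder icon" from the abstract.
        self.icons = ["Work", "Travel", "+ New Folder"]
        self.selected = None
        self.contents = {}

    def select_icon(self, index):
        """First input (e.g. one input device) selects a folder icon."""
        self.selected = index

    def drop_items(self, items):
        """Second input moves objects into the selected folder. If the
        new-folder icon was selected, turn it into a real folder and
        display a second new-folder icon in the list."""
        name = self.icons[self.selected]
        if name == "+ New Folder":
            name = f"Untitled {len(self.contents) + 1}"
            self.icons[self.selected] = name
            self.icons.append("+ New Folder")
        self.contents.setdefault(name, []).extend(items)
        return name
```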
  • Publication number: 20120030566
    Abstract: Computing equipment may display data items in a list on a touch screen display. The computing equipment may use the touch screen display to detect touch gestures. A user may select a data item using a touch gesture such as a tap gesture. In response, the computing equipment may display a selectable option. When the option is displayed, movable markers may be placed in the list. The markers can be dragged to new locations to adjust how many of the data items are selected and highlighted in the list. Ranges of selected items may be merged by moving the markers to unify separate groups of selected items. A region that contains multiple selectable options may be displayed adjacent to a selected item. The selectable options may correspond to different ways to select and deselect items. Multifinger swipe gestures may be used to select and deselect data items.
    Type: Application
    Filed: July 28, 2010
    Publication date: February 2, 2012
    Inventor: B. Michael Victor
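The marker-based range selection described above — draggable markers adjusting how many items are selected, with separate ranges merging when moved together — can be sketched as interval bookkeeping. Names and the merge rule's exact form are illustrative, not from the patent:

```python
class RangeSelection:
    """Minimal model of marker-driven list selection."""

    def __init__(self):
        self.ranges = []  # sorted list of (start, end), inclusive

    def select_item(self, i):
        """A tap gesture selects a single item, creating a range."""
        self.ranges.append((i, i))

    def drag_marker(self, range_index, which, new_pos):
        """Drag a range's start or end marker to a new list position,
        then merge any ranges that now touch or overlap."""
        start, end = self.ranges[range_index]
        if which == "start":
            start = new_pos
        else:
            end = new_pos
        self.ranges[range_index] = (min(start, end), max(start, end))
        self._merge()

    def _merge(self):
        """Unify adjacent or overlapping groups of selected items."""
        self.ranges.sort()
        merged = []
        for s, e in self.ranges:
            if merged and s <= merged[-1][1] + 1:
                ps, pe = merged.pop()
                merged.append((ps, max(pe, e)))
            else:
                merged.append((s, e))
        self.ranges = merged
```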
  • Publication number: 20120030567
    Abstract: A user may select content that has been displayed. The selected content may be provided to multiple applications as input in response to detection of a user command such as a touch gesture. The applications may be widgets that are displayed in respective application regions surrounding a focus region. The selected text may be presented in the focus region. Each widget may produce output in its application region that is based on the selected input. A user can launch a desired widget using a swipe gesture towards the desired widget. A user may transfer the selected content using a swipe from the focus region to an application region. A user can select which widgets are included in the application regions. Displayed data items may be related to selected content. A data item may be dragged onto a widget icon to transfer the data item to an associated widget.
    Type: Application
    Filed: July 28, 2010
    Publication date: February 2, 2012
    Inventor: B. Michael Victor
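The focus-region-plus-widgets design above — selected content fed to several widgets at once, with a swipe toward one widget launching it — can be sketched as a dispatcher. The widget names and callables here are hypothetical stand-ins, not from the patent:

```python
class WidgetBoard:
    """Minimal model of a focus region surrounded by widget regions."""

    def __init__(self, widgets):
        # widgets: mapping of region name -> callable applied to the
        # selected content (each widget produces its own output).
        self.widgets = widgets
        self.focus = None

    def select_content(self, text):
        """Place the selected content in the focus region and have
        every surrounding widget produce output based on it."""
        self.focus = text
        return {name: fn(text) for name, fn in self.widgets.items()}

    def swipe_toward(self, name):
        """A swipe from the focus region toward a widget launches it
        with the currently selected content."""
        return self.widgets[name](self.focus)

board = WidgetBoard({"upper": str.upper, "length": len})
previews = board.select_content("hello")
```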
  • Publication number: 20110109538
    Abstract: This is directed to dynamic tags or screen savers for display on an electronic device. The tags can include several dynamic elements that move across the display. The particular characteristics of the elements can be controlled in part by the output of one or more sensors detecting the environment of the device. For example, the color scheme used for a tag can be selected based on the colors of an image captured by a camera, and the orientation of the movement can be selected from the output of a motion sensing component. The tag can adjust automatically based on the sensor outputs to provide an aesthetically pleasing display that a user can use as a fashion accessory.
    Type: Application
    Filed: November 10, 2009
    Publication date: May 12, 2011
    Applicant: Apple Inc.
    Inventors: Duncan Kerr, Nicholas King, B. Michael Victor
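The sensor-driven tag described above selects a color scheme from camera input and a movement orientation from a motion sensor. A toy sketch of both selections (the most-frequent-pixel rule and dominant-axis rule are illustrative assumptions, not specified in the patent):

```python
from collections import Counter

def dominant_color(pixels):
    """Pick the tag's color scheme from the most frequent color in a
    captured camera image (here, a flat list of color labels)."""
    return Counter(pixels).most_common(1)[0][0]

def drift_direction(accel):
    """Orient the movement of the tag's dynamic elements along the
    dominant axis reported by the motion sensing component."""
    ax, ay = accel
    return "horizontal" if abs(ax) >= abs(ay) else "vertical"
```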
  • Publication number: 20110078622
    Abstract: In some embodiments, a multifunction device with a display and a touch-sensitive surface displays a multi-week view in a calendar application on the display and detects a first input by a user. In response to detecting the first input by the user, the device selects a first calendar entry in the multi-week view in the calendar application. While continuing to detect selection of the first calendar entry by the user, the device detects a first multifinger gesture on the touch-sensitive surface, and in response to detecting the first multifinger gesture on the touch-sensitive surface, the device expands display of a single week in the multi-week view; and maintains display of the first calendar entry on the display. In some embodiments, the device moves the first calendar entry to a date and time in the calendar application in accordance with a second input by the user.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 31, 2011
    Inventors: Julian Missig, Jonathan Koch, Avi E. Cieplinski, B. Michael Victor, Jeffrey Traer Bernstein, Duncan R. Kerr, Myra M. Haggerty
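The calendar interaction above — holding an entry in a multi-week view, expanding to a single week with a multifinger gesture while the held entry stays on screen, then dropping it at a date and time — can be sketched as a small state machine. All names are hypothetical:

```python
class CalendarView:
    """Minimal model of the hold-then-zoom calendar interaction."""

    def __init__(self):
        self.view = "multi-week"
        self.held = None  # the calendar entry being held by the user

    def hold_entry(self, entry):
        """First input: select and hold a calendar entry."""
        self.held = entry

    def multifinger_gesture(self):
        """While an entry is held, a multifinger gesture expands a
        single week; the held entry remains displayed throughout."""
        if self.held is not None:
            self.view = "single-week"
        return self.view

    def drop_at(self, date, time):
        """Second input: move the held entry to a date and time."""
        moved = dict(self.held, date=date, time=time)
        self.held = None
        return moved
```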
  • Publication number: 20110078624
    Abstract: In some embodiments, a multifunction device with a display and a touch-sensitive surface creates a plurality of workspace views. A respective workspace view is configured to contain content assigned by a user to the respective workspace view. The content includes application windows. The device displays a first workspace view in the plurality of workspace views on the display without displaying other workspace views in the plurality of workspace views and detects a first multifinger gesture on the touch-sensitive surface. In response to detecting the first multifinger gesture on the touch-sensitive surface, the device replaces display of the first workspace view with concurrent display of the plurality of workspace views.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 31, 2011
    Inventors: Julian Missig, Jonathan Koch, Avi E. Cieplinski, B. Michael Victor, Jeffrey Traer Bernstein, Duncan R. Kerr, Myra M. Haggerty
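The workspace behavior above — one workspace view shown at a time, with a multifinger gesture replacing it with a concurrent display of all workspace views — can be sketched as follows (names are illustrative, not from the patent):

```python
class WorkspaceManager:
    """Minimal model of single vs. concurrent workspace display."""

    def __init__(self, names):
        # Each workspace view holds the content (application windows)
        # the user has assigned to it.
        self.workspaces = {n: [] for n in names}
        self.displayed = [names[0]]  # show the first workspace only

    def assign(self, workspace, window):
        """Assign an application window to a workspace view."""
        self.workspaces[workspace].append(window)

    def multifinger_gesture(self):
        """Replace the single-workspace display with concurrent
        display of all workspace views."""
        self.displayed = list(self.workspaces)
        return self.displayed
```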
  • Publication number: 20110072375
    Abstract: A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 24, 2011
    Inventor: B. Michael Victor
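The destination-object pattern in this abstract (shared by the related publications below) — hold a destination object with one input, then tap user interface objects with a second input to apply the destination's action to them — can be sketched as a dispatcher. The destination names and actions here are made up for illustration:

```python
class Canvas:
    """Minimal model of the held-destination interaction."""

    def __init__(self, destinations):
        # destinations: name -> action applied to a tapped object
        self.destinations = destinations
        self.held = None

    def press_destination(self, name):
        """First input: press and hold a destination object."""
        self.held = name

    def release_destination(self):
        self.held = None

    def tap_object(self, obj):
        """Second input: while a destination is held, tapping a user
        interface object performs the destination's action on it;
        with no destination held, the object is unaffected."""
        if self.held is None:
            return obj
        return self.destinations[self.held](obj)
```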
  • Publication number: 20110069016
    Abstract: A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 24, 2011
    Inventor: B. Michael Victor
  • Publication number: 20110069017
    Abstract: A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 24, 2011
    Inventor: B. Michael Victor
  • Publication number: 20110072394
    Abstract: A computing device with a touch screen display simultaneously displays on the touch screen display a plurality of user interface objects and at least one destination object. The computing device detects a first input by a user on a destination object displayed on the touch screen display. While continuing to detect the first input by the user on the destination object, the computing device detects a second input by the user on a first user interface object displayed on the touch screen display. In response to detecting the second input by the user on the first user interface object, the computing device performs an action on the first user interface object. The action is associated with the destination object.
    Type: Application
    Filed: September 25, 2009
    Publication date: March 24, 2011
    Inventor: B. Michael Victor
  • Publication number: 20100171712
    Abstract: In some embodiments, an electronic device with a display and a touch-sensitive surface displays a user interface object. The device detects a first contact and a second contact concurrently on the touch-sensitive surface. The device determines which contact of the first contact and the second contact is a topmost contact, a bottommost contact, a leftmost contact, and a rightmost contact on the touch-sensitive surface. While continuing to detect the first contact and the second contact, the device detects movement of the first contact across the touch-sensitive surface, and concurrently moves two edges of the user interface object that correspond to the first contact in accordance with the detected movement of the first contact, including horizontally moving one of the two edges and vertically moving the other of the two edges.
    Type: Application
    Filed: September 25, 2009
    Publication date: July 8, 2010
    Inventors: Avi E. Cieplinski, Timothy David Cherna, Jeffrey Traer Bernstein, B. Michael Victor
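The two-contact resize described above assigns each contact two edges — the leftmost contact owns the left edge (else the right), the topmost owns the top (else the bottom) — and moving a contact moves its two edges concurrently. A sketch under those assumptions (function names and the rectangle representation are illustrative):

```python
def edges_for_contact(c1, c2):
    """Classify the first contact against the second: it controls the
    left edge if it is the leftmost contact (otherwise the right), and
    the top edge if it is the topmost (otherwise the bottom)."""
    h_edge = "left" if c1[0] <= c2[0] else "right"
    v_edge = "top" if c1[1] <= c2[1] else "bottom"
    return h_edge, v_edge

def move_contact(rect, edges, new_pos):
    """Moving the contact concurrently moves its two edges: the
    horizontal edge follows the contact's x, the vertical edge its y."""
    rect = dict(rect)
    h_edge, v_edge = edges
    rect[h_edge] = new_pos[0]
    rect[v_edge] = new_pos[1]
    return rect
```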