Methods and Systems for Interactive User Interface Objects
Methods and systems for interactive user interface objects are provided. The user interface allows users to manipulate different objects represented by icons within a user interface, such as that of an iPhone, iPad, or other touch screen device. Users may bump objects together, flick one object towards another, bump an object against the edge of a screen, flick an object towards a certain area such as a dock, or otherwise manipulate the objects. The user's actions or manipulations of objects may result in an action by the objects or may have no effect. Depending on the compatibility of the objects, the objects may be neutral towards one another, may attract each other, or may repel each other. Objects that attract may share content or communicate with each other via a one-time interaction, or may establish links or connections with each other that enable longer-term communication or broadcasts that occur upon pre-determined triggers.
This application is a continuation of U.S. Patent Application Ser. No. 13/076,370, filed on Mar. 30, 2011, which claims the benefit of U.S. Provisional Application No. 61/319,840, filed on Mar. 31, 2010, each of which is entirely incorporated herein by reference.
BACKGROUND
Application users are often forced to interact with content and objects leveraging a static, mechanical methodology. While the underlying content or objects typically provide utility and value to the application user, the interface provided is not typically engaging, playful, fun or based on any specific scientific properties.
Today's application users deserve more from their consumption of multimedia, including content and objects, and they should be as engaged by the user interface and experience provided for navigating and interacting with an application as by the underlying content and objects themselves.
What is needed is a dynamic, interactive interface based on math, science and physics that allows for interoperability and connections between applications or objects in a multi-touch interface, where user engagement is as core to the underlying experience as the content and objects themselves.
SUMMARY
The invention provides systems and methods for interactive user interface objects. Various aspects of the invention described herein may be applied to any of the particular applications set forth below. The invention may be applied as a standalone system or as a component of an integrated software solution for programmable devices and their frameworks. The invention can optionally be integrated seamlessly into existing businesses and processes. It shall be understood that different aspects of the invention can be appreciated individually, collectively or in combination with each other.
In one aspect, a user interface for displaying interoperability of objects may include a graphical depiction of at least one object on a display screen; input variables for receiving inputs from a user to associate the at least one object with a second object; and a graphical depiction of an action based on an interoperability dynamic of the at least one object with the second object. The interoperability dynamic may be to repel, attract, or remain neutral.
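As a sketch of how such an interoperability dynamic might be resolved in code, consider the following. This is an illustrative model only, not the claimed implementation; the type sets and the `incompatible` flag are assumptions introduced here for clarity.

```python
from enum import Enum

class Dynamic(Enum):
    ATTRACT = "attract"
    REPEL = "repel"
    NEUTRAL = "neutral"

def interoperability(outputs: set, inputs: set, incompatible: bool = False) -> Dynamic:
    """Resolve the dynamic between two objects: attract when the first
    object's output types overlap the second object's input types;
    otherwise repel if the pair is flagged incompatible, else remain
    neutral."""
    if outputs & inputs:
        return Dynamic.ATTRACT
    return Dynamic.REPEL if incompatible else Dynamic.NEUTRAL

# A photo source and a photo/video consumer overlap, so they attract.
print(interoperability({"photo"}, {"photo", "video"}).value)  # attract
```

The three-way return value maps directly onto the repel, attract, or remain-neutral dynamic described above.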
Other goals and advantages of the invention will be further appreciated and understood when considered in conjunction with the following description and accompanying drawings. While the following description may contain specific details describing particular embodiments of the invention, this should not be construed as limitations to the scope of the invention but rather as an exemplification of a preferred embodiment. For each aspect of the invention, many variations are possible as suggested herein that are known to those of ordinary skill in the art. A variety of changes and modifications can be made within the scope of the invention without departing from the spirit thereof.
INCORPORATION BY REFERENCE
All publications, patents, and patent applications mentioned in this specification are herein incorporated by reference to the same extent as if each individual publication, patent, or patent application was specifically and individually indicated to be incorporated by reference.
The novel features of the invention are set forth with particularity in the appended claims. A better understanding of the features and advantages of the present invention will be obtained by reference to the following detailed description that sets forth illustrative embodiments, in which the principles of the invention are utilized, and the accompanying drawings of which:
In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the invention. Various modifications to the described embodiments will be apparent to those with skill in the art, and the general principles defined herein may be applied to other embodiments. The invention is not intended to be limited to the particular embodiments shown and described.
In current user interfaces, objects are often displayed as icons in an environment. For example, on an iPhone interface, each application is represented by an icon. These icons are forms of objects within a user interface. Currently, users may move these objects around, rearrange these objects within an environment, and interact with these objects individually and separately. Utilizing embodiments of the present invention, however, users may be able to interact with these objects (interactive objects or icons) in new and innovative ways. For example, users may “play with” or move the objects around such that the objects interact with each other. Users may bump objects together, or flick one object or icon into another object. Users may rearrange objects and place objects on top of each other or into each other, or even on a dock. The objects may be “thrown” to a dock, bounced off one another, connected with one another, bounced off the edge of the screen, etc. As the users interact with objects, the objects may react such that certain events are triggered by the users' actions.
If the objects, on the other hand, are compatible for interaction, they may either establish a connection or link to one another for a subscription-based interaction (or a longer period of interaction), or they may engage in a one-time or limited interaction (105). If it is a subscription-based interaction, the two objects may establish a connection (106) in which they are able to communicate or broadcast content to one another for some pre-determined period of time or perpetually. If it is a one-time or limited interaction, then the two objects may share or communicate certain content at that instance (107). For example, if a user flicks a content object (such as a picture, video, etc.) onto the Facebook application object (icon), then the Facebook application could post the content to the user's Facebook page or wall. In this case, Facebook would attract content, and the content would have a one-time interaction with the Facebook object.
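The branch just described, from the compatibility check through the subscription link (106) versus the one-time share (107), can be sketched as a toy dispatch function. The dictionary fields and the compatibility test here are illustrative assumptions, not the patented implementation.

```python
def compatible(source, target):
    # Illustrative test: the source's content type must be among the
    # input types the target object accepts.
    return source["type"] in target["accepts"]

def interact(source, target):
    """Dispatch an interaction between two objects (steps 105-107)."""
    if not compatible(source, target):
        return "no effect"            # incompatible objects may repel or do nothing
    if target.get("subscription"):
        source.setdefault("links", []).append(target["name"])   # 106: persistent link
        return "connected"
    target.setdefault("content", []).append(source["payload"])  # 107: one-time share
    return "shared"

# Flicking a photo onto a social app object posts it once to that app.
photo = {"type": "photo", "payload": "beach.jpg"}
wall = {"name": "wall", "accepts": {"photo", "video"}}
print(interact(photo, wall))  # shared
```

A target flagged with `subscription` would instead record a persistent link, matching the longer-term connection described above.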
In one embodiment, in addition to attracting or repelling one another, two objects may also orbit each other. For example, a social networking application object could orbit a Community object. The social networking application object may be a submenu and the Community object may be the top menu. In this example, an object that orbits another object is a submenu of the object that it is orbiting. In another situation, when two objects are connected, sharing content, or otherwise linked, one of those objects may orbit the other object. There may be various other situations in which objects may orbit one another.
In another example, a user may bump a game application object against their Twitter application object (or their Facebook application object or other social networking object). Instead of bumping the two icons together, the user could also flick one to another, place one atop another, or perform any other action that triggers an interaction between the two objects. If the Twitter application object is capable of communicating with the game application object, then the two objects (or content modules) could establish a connection or link that allows them to communicate with each other or share content until the objects are disconnected. The objects may be disconnected by the user or after some pre-defined period of time. When connected, for example, achievements in the game object may be automatically broadcast via the social networking object that the user has connected the game object to. Thus, content (or achievements) from one application could automatically be shared virally and socially through the other application, which may be a social networking application object (or through all of the user's social networking accounts that the user chooses to "connect"). It may be possible for an object to establish connections or engage in one-time interactions with one or more other objects. It may also be possible that various objects cannot or are not set up to interact with each other.
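A minimal sketch of such a persistent link, under which one object's events are automatically broadcast to every connected object until the link is removed, might look as follows. The class and field names are hypothetical and introduced only for illustration.

```python
class AppObject:
    """A toy interface object that can link to others and broadcast events."""

    def __init__(self, name):
        self.name = name
        self.links = []   # objects this one is connected to
        self.feed = []    # content broadcast to this object

    def connect(self, other):
        # The result of a bump/flick gesture between two compatible objects.
        self.links.append(other)

    def disconnect(self, other):
        # Disconnection by the user, or after a pre-defined period of time.
        self.links.remove(other)

    def broadcast(self, event):
        # A pre-determined trigger (e.g. a game achievement) pushes the
        # event to every linked object automatically.
        for obj in self.links:
            obj.feed.append(f"{self.name}: {event}")

game, social = AppObject("game"), AppObject("social")
game.connect(social)
game.broadcast("new high score")
print(social.feed)  # ['game: new high score']
```

After `disconnect`, subsequent broadcasts no longer reach the formerly linked object, matching the "until the objects are disconnected" behavior described above.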
In yet another example, two social networking objects could be connected. For example, the Facebook application object and the Twitter application object may be connected, so that whatever the user broadcasts or posts to Facebook is automatically posted and broadcasted via the Twitter application as well, and vice versa. When connected, the two social networking objects may share content, postings and updates.
On object A1, the triangle-shaped inputs 101 represent the kinds of inputs, content, communication, etc. that an object may be able to receive, and the triangle-shaped outputs 102 represent the types of outputs, content, communication, etc. that object A1 is able to provide to other objects. Thus, object A1 may send output to other objects that are able to receive triangle-type inputs like 101. Similarly, object B can only communicate its outputs to objects like object C that are able to receive circle-type inputs like 103 or 104. And object D may only communicate its outputs to, or receive inputs from, other objects like itself. The shapes of the inputs and outputs represent the types of messages, communication, content, etc. that are used by each of the applications (objects) within an interface.
When two objects, such as A1 and A2, or B and C, have compatible inputs and outputs, they are able to establish either a one-time or longer-term connection so that they may interact with one another. Two objects that have compatible inputs and outputs may attract one another, whereas two objects that do not have compatible inputs and outputs, such as object B and object D, may repel one another. When a user thus places object B on top of object C, or bumps objects B and C together, or flicks object B to object C or object C to object B, those two objects will interact and something will happen, such as the sharing of content, for example. When a user places object B on top of object D, or bumps objects B and D together, or flicks object B to object D, they may either repel or do nothing, depending on how incompatible the two objects are.
As shown by object C, an object may have inputs that can receive multiple types of other object outputs. Similarly, as shown by object D, an object may have outputs that may be compatible with various types of other object inputs. For example, object D's outputs 305 may be compatible with the inputs of object A 301, object B 307, object C 303, and object F 306. Object A's outputs 304 may be compatible with object C's inputs 303 and object F's inputs 306, but depending on the strength of the compatibility, the objects may attract each other more or less. For example, object D's inputs 309 may be strongly attracted to object C's outputs 310 because they are highly compatible, and these objects may gravitate towards one another within the interface. In contrast, object F's outputs 311 may be highly incompatible with object B's inputs 307, and object B's outputs 312 may be highly incompatible with object F's inputs 306, so these two objects may repel each other.
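One way to model these typed inputs and outputs (with the shapes standing in for message types) together with a graded attraction strength is shown below. The specific type sets are illustrative assumptions chosen to mirror the shape example, not values from the disclosure.

```python
# Each object declares the message types it can emit ("out") and accept ("in").
objects = {
    "A": {"in": {"triangle"}, "out": {"triangle"}},
    "B": {"in": {"circle"},   "out": {"circle"}},
    "C": {"in": {"circle", "triangle"}, "out": {"square"}},
    "D": {"in": {"square"},   "out": {"square"}},
}

def attraction(src: str, dst: str) -> float:
    """Fraction of dst's input types that src's outputs can satisfy:
    0.0 means neutral (or repel), 1.0 means fully compatible, and
    values in between model weaker or stronger attraction."""
    matched = objects[src]["out"] & objects[dst]["in"]
    return len(matched) / len(objects[dst]["in"])

print(attraction("A", "C"))  # 0.5 -- partial compatibility, weaker pull
print(attraction("C", "D"))  # 1.0 -- fully compatible, strong pull
```

A universal input like object E's could be modeled as an input set containing every type, so that `attraction` returns a nonzero value for any source.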
Object E may have a universal input 308 such that it may accept outputs from all types of objects. For example, object E may be a Dock object (as described further below) that is able to attract or be attracted to any object.
In another example, if there is featured content 405 that is interesting, a user could flick or move it on top of the community object 401 to feature that content to other members of the user's community. Or, the user could take content within the videos object 402 and flick it into the featured 405 object to feature that content. The user may establish connections or links between the featured object 405 and their photos object 403, for example, to establish a link between the two to share content. The connection or link may be established for a certain period of time, or may persist until the user disconnects the two objects. In some situations, the connection may be established permanently.
Each object (or content) may also have a natural "home," and the object may gravitate towards that "home" at all times. Or, the object may simply return to its natural "home" when the user is finished interacting with that object. For example, the "home" of a certain photo of a user could be the Photos object 403. Or the "home" of a Facebook application object may be the Community object 401. Different objects or content may have different "homes," which may be a Community object 401, a Videos object 402, a Photos object 403, a Games object 404, a Featured object 405, or a Dock 406. Users may drag certain objects to a Dock 406, where they can be easily accessed by the user. The Dock may have universal inputs (in other words, it may accept inputs from all types of objects), so that any type of object may be attracted to the Dock. When a user throws an object, such as Object A, onto the Dock 406, the object will be easily accessible by the user from the Dock 406. When an object is placed in the Dock 406, the Dock may become the object's new "home."
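The "home" gravitation might be animated as a simple per-frame easing toward the home position, as in this sketch. The easing fraction is an arbitrary illustrative choice, not a value from the disclosure.

```python
def step_toward_home(pos, home, ease=0.25):
    """Move an object a fraction of the remaining distance to its natural
    'home' each animation frame, so a released object drifts back."""
    x, y = pos
    hx, hy = home
    return (x + (hx - x) * ease, y + (hy - y) * ease)

# A released object at (100, 40) drifts toward its home at the origin.
pos, home = (100.0, 40.0), (0.0, 0.0)
for _ in range(3):
    pos = step_toward_home(pos, home)
print(pos)  # (42.1875, 16.875) -- converging on home
```

Each frame closes a fixed fraction of the remaining gap, giving the smooth deceleration typical of physics-styled interfaces.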
Different objects may also be "shared" to various social networks or other sites/actions represented within the "Share" object 606. For example, within the Share object 606, Facebook could be "1", Twitter could be "2", and LinkedIn could be "3". A photo within the Photos object 604 may be dragged onto the "Share" object 606 to be shared and posted to these various social networks. For example, a photo within the Photos object 604 may be dragged and dropped into the upper right-hand corner of the screen shown in the corresponding figure.
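The numbered slots of such a Share object could be represented as a simple mapping from slot to network, as sketched below. The mapping and the `share` function are hypothetical illustrations of the numbered-slot idea.

```python
# Hypothetical slot-to-network mapping inside a Share object.
SHARE_SLOTS = {1: "Facebook", 2: "Twitter", 3: "LinkedIn"}

def share(item: str, slots) -> list:
    """Dropping an item on the Share object posts it to each network
    mapped to the chosen numbered slots."""
    return [f"posted {item} to {SHARE_SLOTS[s]}" for s in slots]

print(share("photo.jpg", [1, 3]))
# ['posted photo.jpg to Facebook', 'posted photo.jpg to LinkedIn']
```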
It is understood that when referring to mobile devices or mobile platforms, various other types of programmable or touch screen devices, platforms and application frameworks may be utilized by embodiments of the present invention, including mobile phones (including iPhone OS based devices, Android OS based devices, Windows mobile devices, Symbian OS and RIM OS, etc.), mobile consumer platforms (including iPod touch, Zune, PSP, Nintendo DS, etc.), tablet devices (including iPad, all Windows tablet edition devices, etc.), televisions (including Samsung SDK capable devices), “Smart Appliances” (including refrigerators, washing machines and any other appliances equipped with support for application development), automobiles and other vehicles equipped with support for application development, digital billboards and other advertisement based devices equipped with support for application development, and other programmable devices.
While preferred embodiments of the present invention have been shown and described herein, it will be obvious to those skilled in the art that such embodiments are provided by way of example only. Numerous variations, changes, and substitutions will now occur to those skilled in the art without departing from the invention. It should be understood that various alternatives to the embodiments of the invention described herein may be employed in practicing the invention. It is intended that the following claims define the scope of the invention and that methods and structures within the scope of these claims and their equivalents be covered thereby.
While this invention has been described and illustrated with reference to particular embodiments, it will be readily apparent to those skilled in the art that the scope of the invention is not limited to the disclosed embodiments but, on the contrary, is intended to cover numerous other modifications and equivalent arrangements which are included within the spirit and scope of the following claims.
Aspects of the systems and methods described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the systems and methods include: microcontrollers with memory, embedded microprocessors, firmware, software, etc. Furthermore, aspects of the systems and methods may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural network) logic, quantum devices, and hybrids of any of the above device types. Of course the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.
It should be noted that the various functions or processes disclosed herein may be described as data and/or instructions embodied in various computer-readable media, in terms of their behavioral, register transfer, logic component, transistor, layout geometries, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, non-volatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, email, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., TCP, UDP, HTTP, FTP, SMTP, etc.). When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of components and/or processes under the systems and methods may be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification, discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” or the like, may refer in whole or in part to the action and/or processes of a processor, computer or computing system, or similar electronic computing device, that manipulate and/or transform data represented as physical, such as electronic, quantities within the system's registers and/or memories into other data similarly represented as physical quantities within the system's memories, registers or other such information storage, transmission or display devices. It will also be appreciated by persons skilled in the art that the term “users” referred to herein can be individuals as well as corporations and other legal entities. Furthermore, the processes presented herein are not inherently related to any particular computer, processing device, article or other apparatus. An example of a structure for a variety of these systems will appear from the description below. In addition, embodiments of the invention are not described with reference to any particular processor, programming language, machine code, etc. It will be appreciated that a variety of programming languages, machine codes, etc. may be used to implement the teachings of the invention as described herein.
Unless the context clearly requires otherwise, throughout the description and the claims, the words 'comprise,' 'comprising,' and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in the sense of 'including, but not limited to.' Words using the singular or plural number also include the plural or singular number, respectively. Additionally, the words 'herein,' 'hereunder,' 'above,' 'below,' and words of similar import refer to this application as a whole and not to any particular portions of this application. When the word 'or' is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
The above description of illustrated embodiments of the systems and methods is not intended to be exhaustive or to limit the systems and methods to the precise form disclosed. While specific embodiments of, and examples for, the systems and methods are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the systems and methods, as those skilled in the relevant art will recognize. The teachings of the systems and methods provided herein can be applied to other processing systems and methods, not only for the systems and methods described above.
The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the systems and methods in light of the above detailed description.
In general, in the following claims, the terms used should not be construed to limit the systems and methods to the specific embodiments disclosed in the specification and the claims, but should be construed to include all processing systems that operate under the claims. Accordingly, the systems and methods are not limited by the disclosure, but instead the scope of the systems and methods is to be determined entirely by the claims.
While certain aspects of the systems and methods are presented below in certain claim forms, the inventor contemplates the various aspects of the systems and methods in any number of claim forms. Accordingly, the inventor reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the systems and methods.
1. A mobile device comprising a user interface for picture or video sharing, comprising:
- a display screen comprising a user interface that includes a first visual image and a second visual image, wherein the first visual image is associated with a picture or video file stored in a memory on the mobile device, and wherein the second visual image is associated with a mobile application that is accessible on the mobile device; and
- a computer processor coupled to the memory that is programmed (i) to receive user input on the user interface to position the first visual image on or in proximity to the second visual image, and (ii) to initiate transfer or processing of the picture or video file on the mobile device by the mobile application if the mobile application is compatible with the picture or video file, or not initiate the transfer or processing of the picture or video file on the mobile device by the mobile application if the mobile application is incompatible with the picture or video file.
2. The mobile device of claim 1, wherein the first visual image is associated with another mobile application that is accessible by the user upon input from the user on the first visual image.
3. The mobile device of claim 2, wherein the another mobile application is different from the mobile application.
4. The mobile device of claim 1, wherein on the user interface, the first and second visual images repel or attract each other based on the compatibility of the mobile application with the picture or video file.
5. The mobile device of claim 4, wherein the computer processor is programmed such that, if the mobile application is compatible with the picture or video file, then the first image and second image attract each other on the user interface, and the transfer or processing of the picture or video file by the mobile application is initiated.
6. The mobile device of claim 4, wherein the computer processor is programmed such that, if the mobile application is incompatible with the picture or video file, then the first image and second image repel or remain neutral with respect to each other on the user interface, and the transfer or processing of the picture or video file by the mobile application is not initiated.
7. The mobile device of claim 1, wherein the first image and second image are icons.
8. The mobile device of claim 1, wherein a position of each image within the user interface is user selectable.
9. The mobile device of claim 1, wherein the first image is a graphical depiction of the picture or video.
10. The mobile device of claim 1, wherein the second image is a graphical depiction of the mobile application.
11. The mobile device of claim 1, further comprising a graphical depiction of a link established between the first image and the second image.
12. A device comprising a user interface for picture or video sharing, comprising:
- a touch screen comprising a user interface that includes a first image and a second image, wherein the first image is associated with a picture or video and the second image is associated with an application that is accessible on the device upon user input on the second image; and
- a computer processor coupled to the touch screen and programmed to (i) receive user input on the user interface to move the first image to the second image, and (ii) initiate the transfer of the picture or video from the device to the application if the application is compatible with the picture or video, or not initiate the transfer of the picture or video from the device to the application if the application is incompatible with the picture or video.
13. The device of claim 12, wherein the first image is associated with another application that is accessible by the user upon input from the user on the first image.
14. The device of claim 13, wherein the another application is different from the application.
15. The device of claim 12, wherein the computer processor is programmed such that, if the application is compatible with the picture or video, then the first image and second image attract each other on the user interface, and the transfer of the picture or video from the mobile device to the application is initiated.
16. The device of claim 12, wherein the computer processor is programmed such that, if the application is incompatible with the picture or video, then the first image and second image repel or remain neutral with respect to each other on the user interface, and the transfer of the picture or video from the mobile device to the application is not initiated.
17. The device of claim 12, wherein the first image and second image are icons.
18. The device of claim 12, wherein a position of each image within the user interface is user selectable.
19. A device comprising a user interface for content sharing, comprising:
- a display screen comprising a user interface that includes a plurality of images associated with applications that are accessible on the device upon user input on a respective one of the plurality of images, which applications include a first application associated with a first image among the plurality of images and a second application associated with a second image among the plurality of images; and
- a computer processor coupled to the display and programmed to (i) receive user input on the user interface to move the first image to the second image, and (ii) initiate the transfer of content from the first application to the second application if the first application and second application are compatible, or not initiate the transfer of content from the first application to the second application if the first application and second application are incompatible.
20. The device of claim 19, wherein the first application is different from the second application.
21. The device of claim 19, wherein the first image and second image are icons.
22. The device of claim 19, wherein a position of each image within the user interface is user selectable.
Filed: May 15, 2014
Publication Date: Nov 13, 2014
Applicant: PHUNWARE, INC. (Austin, TX)
Inventors: Alan S. Knitowski (Austin, TX), Luan Dang (Newport Beach, CA), David J. Reese (Austin, TX), James D. Trim (Pflugerville, TX), Anthony C. Hall (Austin, TX), Cyrus Lum (Austin, TX)
Application Number: 14/279,269
International Classification: G06F 3/0486 (20060101); G06F 3/0484 (20060101);