SYSTEMS FOR INFORMATION SHARING AND METHODS OF USE, DISCUSSION AND COLLABORATION SYSTEM AND METHODS OF USE

A collaborative and communication system is presented. The collaborative technology can be applied to enable team members and customers to discuss and collaborate over a document, document content, and the like through voice recording and/or voice and video meetings. In this way, an application and associated functionality and features for document discussions in a dynamically generated format is provided. In this way, a support incident can be handled by generating a document and/or PDF document and/or image based on the support incident and its various properties. Furthermore, the present disclosure provides for recording of audio and video in association with annotation of a document and/or annotation of drawings, and the like. Furthermore, the system provides for document discussion and annotation in real time, for collaboration, and for dynamically generating documents which represent various task properties and support incident properties, all with voice recording, animation, and drawing.

Description
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application claims priority to the U.S. Utility patent application Ser. No. 17/587,030 which was filed on Jan. 28, 2022, which is hereby incorporated by reference herein in its entirety, including any figures, tables, or drawings.

The present application claims priority to the U.S. Provisional Patent Application No. 61/576,389 which was filed on Dec. 16, 2011, which is hereby incorporated by reference herein in its entirety, including any figures, tables, or drawings.

FIELD OF THE DISCLOSURE

This disclosure relates to a system for converting business objects, objects, and content into a live document, or livedoc. Furthermore, this disclosure relates to a system for information sharing and a method of use. Furthermore, and without limitation, this disclosure relates to a discussion and collaboration system and methods of use.

COPYRIGHT NOTICE

At least a portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files and/or records, but otherwise reserves all copyright rights whatsoever. The following notice applies to the software and data as described below and in the drawings that form a part of this document. Copyright. Tieren Zhou. All rights reserved.

BACKGROUND OF THE DISCLOSURE

In recent times, cloud computing service providers deliver applications via the internet. Such cloud-supported applications can be accessed on desktops or mobile devices via web browsers, while the operational software and data are stored on servers at some remote location(s) in the “cloud”. One of the promising application areas in the context of cloud computing is information sharing among different users because the “cloud” enables a user to access systems or applications via a web browser regardless of the user's location or device type. As the backbone supporting infrastructure is off-site (typically provided by a third party) and accessed via the Internet, a user can virtually connect to an application from anywhere. Early examples in this area include screen-sharing applications where one person's screen can be encoded as a video stream and delivered in real time to other persons. In other examples, business applications have been coded entirely using web-based technologies. In still another example, business applications for information sharing are often developed using web-based technologies where information is shared using web browsers.

Traditionally, people share information online through means such as email, instant messenger, message boards, desktop sharing, etc., which may not be effective and efficient, especially when multiple parties are involved and when the shared information includes multimedia information. For example, a traditional online meeting based on screen sharing may introduce significant latency because it requires transferring the desktop information of the presenter in the form of a video stream to each of the participants. Moreover, there is currently no effective cloud-based platform for multiple users to modify shared information in a simple and straightforward manner and consolidate modifications to the shared information from different users in an intuitive form. Therefore, there is a need to provide a solution for sharing multimedia information with an improved user experience.

Thus, there is a long-felt need in the art for collaborative technology, as described further herein, which can be applied to enable team members and customers to discuss and collaborate over a document, document content, and the like through voice recording and/or voice and video meetings. In the present disclosure, as will become more clear as further described herein, the present disclosure provides an application and associated functionality and features for document discussions in a dynamically generated format.

In this way, the present disclosure also provides for handling a support incident by forming a document and/or PDF document and/or image based on the support incident and its various properties. Furthermore, the present disclosure provides for recording of audio and video in association with annotation of a document and/or annotation of drawings, and the like. In this way, the present disclosure provides for document discussion and annotation in real time and in collaboration. Furthermore, the present disclosure provides for dynamically generating documents which represent various task properties, support incident properties, and the like. Furthermore, the present disclosure provides for voice recording, animation, and drawing, and the like.

The disclosure herein provides these advantages and others as will become clear from the specification and claims provided.

SUMMARY OF THE DISCLOSURE

The present disclosure provides a system for converting content into a live document. The live document can then be used to create sync meetings or live meetings, and more. The present disclosure relates to a system for information sharing and a method of use. Furthermore, and without limitation, this disclosure relates to a discussion and collaboration system and methods of use.

Particularly, the present disclosure is directed to a collaborative technology, as described further herein, which can be applied to enable team members and customers to discuss and collaborate over a document, document content, and the like through voice recording and/or voice and video meetings. In the present disclosure, as will become more clear as further described herein, the present disclosure provides an application and associated functionality and features for document discussions in a dynamically generated format.

In this way, the present disclosure also provides for handling a support incident by forming a document and/or PDF document and/or image based on the support incident and its various properties. Furthermore, the present disclosure provides for recording of audio and video in association with annotation of a document and/or annotation of drawings, and the like. In this way, the present disclosure provides for document discussion and annotation in real time and in collaboration. Furthermore, the present disclosure provides for dynamically generating documents which represent various task properties, support incident properties, and the like. Furthermore, the present disclosure provides for voice recording, animation, and drawing, and the like.

In the arrangement shown, as one example, a system for information sharing and a method of use are presented—and a discussion and collaboration system and methods of use are presented. The present disclosure provides the state of the art with a technology that can be applied to enable team members and customers to interactively and dynamically discuss and collaborate over document content.

In this way, teams and customers can interact dynamically via voice recording and live meetings, including video and pre-edited documents, if desired. In this way, the present disclosure also provides for generating a document from a support incident and can also form a PDF or an image based on the support incident.

In this way, information can be shared and later recalled in the exact manner it was discussed. In this way, the present disclosure provides clarity in ongoing matters and enhances communication of subject matter.

Furthermore, and in the arrangements shown, the present disclosure also provides for voice recording and animation of documents and document information. This includes annotation of the drawings.

In the present disclosure, documents can be viewed, dynamically edited, and/or annotated in real time. Furthermore, documents which are reviewed and/or generated can represent task properties and various support incident properties. Furthermore, voice recording can be overlaid on live information, as can animation and drawing of documents.

Furthermore, the present disclosure provides an asynchronous means of communicating by various users. This enhanced communication and the various features provided herein change the state of the art and the ability to collaborate, especially from remote geographic locations. Said another way, the present disclosure provides remote users the ability to view content, whether live or pre-recorded.

Thus, it is a primary object of the disclosure to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that improve upon and enhance the state of the art.

Another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that provide for generation and implementation of a support ticket.

Yet another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that can display a plurality of objects in a single view.

Another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that provide call center functionality in which all meeting requests between teams and customers are represented as object tiles.

Yet another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that enhance communication and the success of call centers.

Another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that enhance the communication and success of supporting agents.

Yet another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that enhance the communication and success of ticketing systems.

Another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that enhance the communication and success of ALM and project management.

Yet another object of the disclosure is to provide a system for information sharing and a method of use, and a discussion and collaboration system and methods of use, that enhance the communication and success of event management and event management platforms.

These and other objects, features, or advantages of the present disclosure will become apparent from the specification and claims.

BRIEF DESCRIPTION OF THE DRAWINGS

The methods, systems, and/or programming described herein are further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:

FIG. 1 is a high level exemplary diagram of a system for information sharing, according to an embodiment of the present teaching;

FIG. 2(a) is a more detailed diagram of the exemplary system for information sharing shown in FIG. 1, according to different embodiments of the present teaching;

FIG. 2(b) is a more detailed diagram of the exemplary system for information sharing shown in FIG. 1, according to different embodiments of the present teaching;

FIG. 3(a) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(b) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(c) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(d) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(e) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(f) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 3(g) is a depiction of an exemplary application of a system for information sharing, according to different embodiments of the present teaching;

FIG. 4 is a diagram of an exemplary information sharing controller of a system for information sharing, according to an embodiment of the present teaching;

FIG. 5 is a depiction of an exemplary non-time-based supporting object and time-based supporting object, according to an embodiment of the present teaching;

FIG. 6 is a depiction of an exemplary process of synchronizing supporting objects with a base object, according to an embodiment of the present teaching;

FIG. 7 is a depiction of another exemplary process of synchronizing supporting objects with a base object, according to an embodiment of the present teaching;

FIG. 8(a) is a flowchart of exemplary processes of information sharing, according to different embodiments of the present teaching;

FIG. 8(b) is a flowchart of exemplary processes of information sharing, according to different embodiments of the present teaching;

FIG. 9 is an exemplary diagram of a system for online meeting, according to an embodiment of the present teaching;

FIG. 10 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching;

FIG. 11(a) is a flowchart of an exemplary process of online meeting, according to different embodiments of the present teaching;

FIG. 11(b) is a flowchart of an exemplary process of online meeting, according to different embodiments of the present teaching;

FIG. 12 depicts a general computer architecture on which the present teaching can be implemented;

FIG. 13 depicts a general timeline and/or computer appearance of an example timeline along which a document might be produced, the document might be enhanced, collaborations may take place, and the present teaching can be implemented;

FIG. 14 depicts a general timeline and/or computer appearance of an example timeline along which a document might be produced, the document might be enhanced, collaborations may take place, and the present teaching can be implemented;

FIG. 15 depicts a general computer architecture and appearance on which the present teaching can be implemented and actions such as meeting actions can take place;

FIG. 16 depicts a general computer architecture and appearance on which the present teaching can be implemented and actions such as meeting actions can take place;

FIG. 17 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching;

FIG. 18 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching;

FIG. 19 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching;

FIG. 20 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching;

FIG. 21 is a depiction of an exemplary process of online meeting, according to an embodiment of the present teaching.

DETAILED DESCRIPTION OF THE DISCLOSURE

In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that mechanical, procedural, and other changes may be made without departing from the spirit and scope of the disclosure(s). The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the disclosure(s) is defined only by the appended claims, along with the full scope of equivalents to which such claims are entitled.

As used herein, the terminology such as vertical, horizontal, top, bottom, front, back, end, sides and the like are referenced according to the views, pieces and figures presented. It should be understood, however, that the terms are used only for purposes of description, and are not intended to be used as limitations. Accordingly, orientation of an object or a combination of objects may change without departing from the scope of the disclosure.

Reference throughout this specification to “one embodiment,” “an embodiment,” “one example,” or “an example” means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present disclosure. Thus, the appearance of the phrases “in one embodiment,” “in an embodiment,” “one example,” or “an example” in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, databases, or characteristics may be combined in any suitable combinations and/or sub-combinations in one or more embodiments or examples. In addition, it should be appreciated that the figures provided herewith are for explanation purposes to persons ordinarily skilled in the art and that the drawings are not necessarily drawn to scale.

Embodiments in accordance with the present disclosure may be embodied as an apparatus, method, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware-comprised embodiment, an entirely software-comprised embodiment (including firmware, resident software, micro-code, etc.), or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, embodiments of the present disclosure may take the form of a computer program product embodied in any tangible medium.

Any combination of one or more computer-usable or computer-readable media may be utilized. For example, a computer-readable medium may include one or more of a portable computer removable drive, a hard disk, a random access memory (RAM) device, a read-only memory (ROM) device, an erasable programmable read-only memory (EPROM or Flash memory) device, a portable compact disc read-only memory (CDROM), an optical storage device, and a magnetic storage device. Computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages. Such code may be compiled from source code to computer-readable assembly language or machine code, or virtual code, or framework code suitable for the disclosure herein, or machine code suitable for the device or computer on which the code will be executed.

Embodiments may also be implemented in cloud computing environments. In this description and the following claims, “cloud computing” may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction and then scaled accordingly. A cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service), service models (e.g., Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”)), and deployment models (e.g., private cloud, community cloud, public cloud, and hybrid cloud).

The flowchart and block diagrams in the attached figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It will also be noted that each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. These computer program instructions may also be stored in a computer-readable medium that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instruction means which implement the function/act specified in the flowchart and/or block diagram block or blocks.

In the arrangement shown, as one example, a system for quickly and efficiently converting content into a live document is presented. The live document can then be used to create sync meetings or live meetings, and more. In the arrangement shown, as a couple of many examples, objects, or business objects may include, but are not limited to, sales quotes in a sales application, tickets in a customer support application, website images, content, or even videos. Said another way, a system for information sharing is presented. Furthermore, a system for information sharing and a method of use are provided. Furthermore, a system for information sharing, method of use, a discussion and collaboration system and method of use are also presented.

In the arrangement shown, as one example, the system for information sharing presented provides the state of the art with an information and collaboration system which enhances and enables efficiencies that did not previously exist in the art. Said another way, the present disclosure provides a collaboration system which provides for real-time communication and recording of communications which easily convey information among collaborators, including but not limited to users, clients, customers, technicians, employees, friends, co-workers, and the like.

In the arrangement shown, as one example, an information sharing system is provided which enhances clarity and understanding in communication which makes fixing technical issues and the like easier. This is only one example, as the present disclosure is not limited to technical issues. The present disclosure may be utilized in a variety of fields and applications, as will become apparent from the disclosure. For example, the present disclosure may be used between a business and a business, a business and potential client, a business and a client, a team of marketers, a team of employees, a plurality of technicians working with clients to fix technical or IT problems, and many other applications.

In this way, the present disclosure provides efficiencies in communications, especially in providing understanding of communication, dramatically improving time spent in diagnosing and/or fixing issues, and more.

In summary, the present disclosure provides the ability for a user to easily create content which aids in communication. This content may be to tell someone what a problem is, to communicate to another about changes requested, or simply to relay and provide understanding of information. For these reasons, it quickly will become clear that this disclosure can be used across a variety of fields from teaching, to IT, to marketing, to others. In this way, the present disclosure provides tremendous value to any user because of enhanced communication and the satisfying result of knowing that a subsequent collaborator has viewed and/or worked with the presented content and/or solved issues, as will become more clear from the present disclosure.

The present disclosure also provides a history of communication, the ability to record over previous syncs, and the like. The present disclosure also provides a log of history related to syncs and the like. Livesync likewise has an ongoing history and log that keeps track of previous syncs and previous uploads of content, and provides the ability for agents and the like to review back. Furthermore, the present disclosure provides improved collaboration and improved issue solving and/or problem solving. Importantly, the present disclosure provides for an information sharing system which is more than a one-way street. In other words, various collaborators can communicate over the same content and easily re-enable previously recorded syncs to work on top of.

These advantages, features, and functionalities will be elaborated on, expanded upon, and more understood in the present disclosure.

System:

With reference to the figures, a system for information sharing and a method of use are presented, and a discussion and collaboration system and methods of use are presented. The information sharing system is formed of any suitable size, shape, and design having various features and functionality.

As one example, the information sharing system may be utilized on a website. Existing websites in the state of the art provide information sharing by presenting static images and static text. In this way, existing websites and the like try to present material in a static way to communicate with potential clients and others learning about the company, and the like. The present disclosure transforms websites and the like to be interactive, and enhances communication and understanding.

In this example, the information sharing system disclosed herein provides presentations, voice recordings, and annotations instead of only static images, static content, or any other content on a website, videos, or the like. With livesync enabled on a website, a visitor can also request live support or live help, a visitor can start a livesync meeting, and other collaborators of the visitor and the website representative can all join. In this way, a visitor can start a new meeting to discuss certain content and/or a topic of interest. While a visitor is waiting for someone to join, they can create their own content, and they can also upload additional content so that it is available for representatives. This provides more resources than a simple image or viewing a static image, and creates collaboration in real time for customers, clients, other collaborators, and more.

As another example of the many examples, livesync and the information sharing system disclosed herein might be used in a customer support system environment. In this example, a customer can log in to a portal as a customer of an entity for a service, etc. Once logged in, the customer can view open issues, tickets, and livesync updates (for previous livesyncs), or can enter a new issue. In one example, a user and/or client may be using a software application, but perhaps a button of the interface and/or software is not working.

In this example, a customer and/or user might provide a description of the problem in creating a new ticket and/or creation of a new livesync. Common problems with typical support issues like this are additional calls and additional screen sharing just to figure out the issue (not all industries provide for screen sharing, etc.). Some customers, at best, might at least attach a screenshot. Most customers just want issues resolved and may not be highly technically skilled or familiar with a particular program, or the like. Even with screenshots, interpretation and lots of additional communication are often required to resolve issues.

The present disclosure provides the state of the art with a system in which a customer can record the issue while voicing the issue, etc. This conveys and explains the problem more clearly to a technician. In turn, a technician can understand what the problem is and can easily diagnose that problem in significantly less time with less communication and less effort, especially with highlighting done to highlight a particular problem.

In this example, as continued, if a customer doesn't want to use a screenshot or perhaps is unable to use a screenshot, screen sharing is an option through livesync. Additionally, the support team and/or collaborator can walk through content and applications with clients in real time. This offers enhanced clarity in understanding and makes fixing technical issues and the like far more efficient than the current state of the art. In this way, diagnostic time can be dramatically reduced with a simple livesync by a customer. Support agents and/or collaborators can play back issues and/or play back livesyncs which were provided, share those livesyncs with others for second or third opinions and/or assistance, and prepare new syncs for customers with resolutions, and the like.

These and other examples will become apparent through the present disclosure.

In the arrangement shown, as one example, information sharing system may comprise remote servers, databases, application servers, application databases, product databases, mobile applications, and/or computers; all of which in continuity or as separate acts fulfill the functions disclosed herein.

In the arrangement shown, as one example, the main structure of system 10 also includes a plurality of users and/or collaborators, a plurality of content, a sync, a syncroom, a livesync, and a computing platform, and communication and control components, among other components, features, and functionality.

Users/Collaborators (or Plurality Thereof):

In the arrangement shown, as one example, an information sharing system includes at least one user or at least one collaborator. User or collaborator may be any user interacting with or utilizing the information sharing system—whether the collaborator is creating syncs, sharing syncs, reviewing syncs, listening in on livesyncs, and the like. Furthermore, this may include viewing, controlling, analyzing, manipulating, and/or interacting with the information sharing system, the content, a plurality of syncs, a syncroom, a livesync, or the like. User and/or collaborator is not limited to a single user but may be a plurality of users and/or a plurality of collaborators.

Content:

In the arrangement shown, as one example, the information sharing system includes a plurality of content. Content is formed of any suitable information sharing content and is incorporated, uploaded, or created within the system. Content includes, but is not limited to, a PDF, a word processing document, an image file, a combination thereof, and the like. Content may include audio, or may have audio and the like added to it. Content may also include a live web screen or interface of a software application and the like which a collaborator can view and overlay with a voice and/or audio recording. Similarly, this content may be annotated and the like and may also include video from a recording such as a phone and/or smart device video or image capture.

Sync (synchronous collaboration, not done in real time; solo or voice recording; syncs against a live doc): In the arrangement shown, at the heart of the communication and/or information sharing system is a “sync”. A sync is a recording in which a collaborator records audio or video and/or a combination of these in order to share information. A sync is generally not done in real time but is created by a user and/or a plurality of users for the purpose of explaining information and subsequently sharing this information or sync with another and/or a plurality of subsequent collaborators.

Viewable Content of Sync: A sync may include viewable content. Viewable content may be a screen recording, an online video, a screenshot, a graphical user interface, a captured image, a captured video, a combination of the like, or similar.

Audio Content of Sync: A sync provides a communication tool for the purpose of sharing information. For this purpose, audio content is often incorporated into a sync. This audio content provides explanations, communications, question asking, guidance, directions, and more. Audio content is not always strictly audio content but may be other forms of communication content such as typed content, chats, annotations through words, a combination thereof, and the like.

Annotation Content of Sync: Sync provides for a variety of annotation types. Some annotation types include, but are not limited to highlighting, sticky notes (with typed communication), embedding objects—such as video links or web links, voice recordings as annotations, a combination thereof, and other annotations for the purpose of clarifying information.

These annotations and other information are recorded as part of the sync and/or become part of the document which can be shared with others. Additionally, these syncs can be subsequently acted upon, and/or information can be overlaid. For example, a previous audio recording may be quieted and/or removed so that a new audio can be overlaid on the previous sync to answer questions about a particular point in a video sync or the like. In this way, information shared can be further elaborated on, answered, and more. In this way, a plurality of syncs related to a particular topic, or the like, can be created. Various audience members can overlay their own sync on a previous sync, creating a “subsequent sync”.
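
For illustration only, the sync described above could be modeled as a small data structure that groups viewable content, audio content, and annotations against one base document, with a later sync overlaid on an earlier one. The following TypeScript sketch is a minimal, hypothetical model; the interface and field names (Sync, Annotation, overlaySync, etc.) are assumptions introduced for illustration and are not drawn from the disclosure itself.

    // Minimal, hypothetical model of a sync: a recording of viewable content,
    // audio, and annotations made against a shared document.
    type AnnotationType = "highlight" | "stickyNote" | "embeddedLink" | "voiceNote";

    interface Annotation {
      type: AnnotationType;
      page: number;            // page of the base document the annotation belongs to
      x: number;               // relative horizontal position (0..1)
      y: number;               // relative vertical position (0..1)
      payload: string;         // note text, URL, or reference to an audio clip
      atMs?: number;           // optional offset into the recording, for timed annotations
    }

    interface Sync {
      id: string;
      documentId: string;      // the base document the sync was recorded against
      createdBy: string;
      audioUrl?: string;       // voice track recorded with the sync, if any
      videoUrl?: string;       // screen or camera recording, if any
      annotations: Annotation[];
      parentSyncId?: string;   // set when this is a "subsequent sync" overlaid on an earlier one
    }

    // A subsequent sync overlays a new audio track and extra annotations on a prior sync.
    function overlaySync(previous: Sync, audioUrl: string, extra: Annotation[]): Sync {
      return {
        ...previous,
        id: `${previous.id}-overlay-${Date.now()}`,
        parentSyncId: previous.id,
        audioUrl,                               // new narration replaces (quiets) the old one
        annotations: [...previous.annotations, ...extra],
      };
    }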

Syncrooms: A syncroom is a plurality of syncs and/or a history log related to the same topic.

In this way, topics and/or channels for certain topics or even multiple topics can be organized into a particular location for access (and even subsequent sharing of information). Collaborations on a particular topic often might need to be accessed more than once. For example, if there is a common error that users need assistance with, then a technician may share a particular syncroom with that user rather than recreating a sync which was previously created or addressed. Syncrooms have this purpose and many others which further aid in information sharing, information organization, and the like.

In this way, previously created content may be searched if it is already within a syncroom, and more. Furthermore, and said another way, syncrooms provide a centralized location for content. Furthermore, and said another way, syncrooms provide for easily finding material and content, and easily finding related content. Furthermore, and said another way, syncrooms also provide a real-time chat feature within a syncroom. Furthermore, meetings and the like can be created directly from a syncroom setting.
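
As a hedged sketch of the syncroom idea, a syncroom could be represented as a topic-centric container that holds an ordered history of syncs and a chat log, so that prior explanations can be searched and re-shared instead of being recreated. The names below (Syncroom, findRooms) are hypothetical and introduced only for illustration.

    // Hypothetical sketch of a syncroom: a topic-centric container of syncs
    // that can be searched and shared rather than recreating prior content.
    interface Syncroom {
      id: string;
      topic: string;
      syncIds: string[];       // ordered history/log of syncs on this topic
      chatLog: { author: string; text: string; at: number }[];
    }

    // Simple topic search over a set of syncrooms.
    function findRooms(rooms: Syncroom[], query: string): Syncroom[] {
      const q = query.toLowerCase();
      return rooms.filter((room) => room.topic.toLowerCase().includes(q));
    }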

LiveSync:

A livesync is a type of sync. A livesync further builds on the advantages of a sync by providing the sync for a plurality of collaborators in real time. In other words, a livesync provides for instant interaction over a sync for enhanced communication and information sharing, and the like. Said another way, livesyncs provide tremendous value because they can work across various industries. Livesync also provides satisfaction for customers and users in being heard: it can be very satisfying to see that technicians are looking at problems, providing a review of the sync, and providing resolution through a sync, so customers can play back and feel like they have been heard, and/or confirm they have been heard and understood properly.

A livesync may be a single instance of a syncroom or a sync. Furthermore, a livesync may be a plurality of livesyncs and appear as a series of meetings which can also be reviewed and further elaborated upon. Said another way, livesyncs also provide an ongoing history and log that keeps track of previous syncs and previous uploads of content, and provides the ability for agents and the like to review back.
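
Because a livesync is distinguished by delivering sync activity to all collaborators in real time, one plausible, hypothetical realization is a small event channel that broadcasts lightweight livesync events to joined participants. The sketch below assumes a browser WebSocket transport and an assumed event shape; it is an illustration, not the disclosure's required implementation.

    // Hypothetical sketch: broadcasting livesync events to joined collaborators
    // in real time over a WebSocket. The event shape is an assumption.
    interface LiveSyncEvent {
      roomId: string;
      kind: "annotation" | "pageTurn" | "pointer" | "audioChunk";
      payload: unknown;
      at: number;              // sender timestamp in milliseconds
    }

    class LiveSyncChannel {
      private socket: WebSocket;

      constructor(url: string, private roomId: string) {
        this.socket = new WebSocket(url);
      }

      // Send a small event describing what just happened in the livesync.
      send(kind: LiveSyncEvent["kind"], payload: unknown): void {
        const event: LiveSyncEvent = { roomId: this.roomId, kind, payload, at: Date.now() };
        this.socket.send(JSON.stringify(event));
      }

      // Invoke a handler for each event received from other collaborators.
      onEvent(handler: (event: LiveSyncEvent) => void): void {
        this.socket.onmessage = (message) => handler(JSON.parse(message.data));
      }
    }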

System for Information Sharing:

FIG. 1 is a high-level exemplary system diagram of a system for information sharing and the disclosures provided herein, according to an embodiment of the present teaching. The information-sharing system 100 may reside on a “cloud” computing environment formed by distributed and shared computing resources connected through a set of networks. The networks can be a single network or a combination of different networks. For example, a network can be a local area network (LAN), a wide area network (WAN), a public network, a private network, a proprietary network, a Public Switched Telephone Network (PSTN), the Internet, a wireless network, a virtual network, or any combination thereof. Users may share information (e.g., documents, ideas, etc.) through the information sharing system 100 residing on the “cloud”.

The information-sharing system 100 in this example includes an information sharing controller 102, an information database 104, and a user database 106. The information sharing controller 102 is a mechanism for controlling operations of the information sharing system 100 and will be described in detail later. The information database 104 may include one or more databases on one or more servers for providing and storing any information to be shared among users. For example, at least three categories of information are stored in the information database 104: base objects, supporting objects and synchronizing actions.
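
To make the three record categories concrete, the following TypeScript sketch shows one hypothetical way the information database 104 could type base objects, supporting objects, and synchronizing actions. The field names and enumerated values are illustrative assumptions, not the disclosure's required schema.

    // Hypothetical sketch of the three categories of records kept in the
    // information database 104. Field names are illustrative assumptions.
    interface BaseObject {
      id: string;
      format: "pdf" | "html5" | "image" | "video";   // converted, browser-accessible type
      uri: string;
      version: number;                               // supports version control of the base object
    }

    interface SupportingObject {
      id: string;
      baseObjectId: string;    // the base object this comment/annotation belongs to
      kind: "textNote" | "highlight" | "audio" | "video" | "animation";
      uri?: string;            // media location, for audio/video/animation
      text?: string;           // note or comment text, for textual kinds
    }

    interface SynchronizingAction {
      id: string;
      baseObjectId: string;
      supportingObjectId?: string;
      action: "show" | "hide" | "move" | "playAudio";
      page?: number;
      x?: number;              // relative coordinates on the base object
      y?: number;
      atMs?: number;           // absolute time offset for time-based synchronization
    }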

The base objects may be converted user files of a certain type that can be accessed by any user via a web browser or an application. For example, the user files may carry information that the users want to share through the information sharing system 100 and include, but are not limited to, text, presentation slides, images, music sheets, spreadsheets, video, portable document format (PDF) files, database files, or any suitable type of file known in the art. The user files may not be directly accessible by different web browsers, operating systems, or applications and thus may need to be converted to the base objects of a certain type. For example, the base object may be an ADOBE FLASH file, a MICROSOFT SILVERLIGHT file, an HTML5 file, an image file, a video file, a PDF file, or any suitable type of file known in the art.

The supporting objects may be generated based on information provided by the users in view of the base objects and associated with the corresponding base objects. For example, the supporting objects may comment on a base object provided by the same user or a different user, in the form of, for example, a text note, a text comment, a highlighting box, an audio comment, etc. Depending on the way in which the supporting objects are associated with the base objects, the supporting objects may include time-based supporting objects, such as an audio, a video, an animation, a mouse movement, a visual effect, and an application, and non-time-based supporting objects, such as a text note, a text comment, a highlighting box, a magnifier, a hyperlink, a diagram, an image, and a drawing.
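
The split between time-based and non-time-based supporting objects can be expressed as a discriminated union, which is one hypothetical way to keep the two association mechanisms (time offset versus relative placement) distinct in code. The type names below are assumptions for illustration.

    // Hypothetical sketch distinguishing the two supporting-object families:
    // non-time-based objects placed by relative position on the base object,
    // and time-based objects activated at a time offset.
    interface NonTimeBasedSupport {
      timing: "none";
      kind: "textNote" | "textComment" | "highlightBox" | "magnifier" | "hyperlink" | "diagram" | "image" | "drawing";
      page: number;
      x: number;               // 0..1 fraction of the base object's width
      y: number;               // 0..1 fraction of the base object's height
    }

    interface TimeBasedSupport {
      timing: "timed";
      kind: "audio" | "video" | "animation" | "mouseMovement" | "visualEffect" | "application";
      startMs: number;         // when to activate, relative to the shared timeline
      durationMs?: number;
    }

    type SupportingObjectKind = NonTimeBasedSupport | TimeBasedSupport;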

The user database 106 may include one or more databases on one or more servers for providing and storing any information related to the users of the information sharing system 100. The user database 106 may include, for example, user profiles and member accounts. The user profiles may include any suitable information related to the user (e.g., demographic information, geographical information, online activity history, etc.). In one example, the users of the information sharing system 100 may be “members” who have subscribed for the service of the information sharing system 100 and have an associated member account stored in the user database 106. The member account may include records such as annual fees paid to the entity that runs the information-sharing system 100 and service fees incremented per information service and paid to the entity that runs the information sharing system 100 and/or other members who provide the piece of information (e.g., base objects and/or supporting objects). It is understood that the users may also be “non-members” who can use the information sharing system 100 as guests without subscribing for the service from the information sharing system 100. The member account may also include records such as a time and date when each base object, supporting object, or synchronizing action is created and modified by the member user.

FIGS. 2(a) and 2(b) are more detailed diagrams of the information sharing system 100 shown in FIG. 1, according to different embodiments of the present teaching. In FIG. 2(a), a user may interact with the information sharing system 100 to provide information in the forms of base objects, supporting objects, and synchronizing actions, which he/she would like to share with other users for free or at a price. In this example, the user may first upload a user file, which contains contents to be shared, in any suitable type. The information-sharing controller 102 then converts the user file into a base object of a certain type as noted above. The base object is stored in the information database 104. The user then may comment on the base object by providing add-on text. The information-sharing controller 102 then generates a supporting object based on the add-on text. In order to associate the supporting object with the base object, the user may further input a synchronizing action to the information sharing system 100. For example, the user may move the mouse cursor on the screen to indicate where the add-on text should be located on the base object. The mouse movement may be recorded as a synchronizing action by the information sharing controller 102 and stored in the information database 104. In this example, the base object itself may also be modified by the user. For example, the user may modify the content of the original user file and upload the modified user file to the information sharing system 100 to replace the previous version of the base object. The information-sharing controller 102 then generates the modified base object and stores it in the information database 104. In one example, different versions of the base object may be stored in the information database 104 for version control purposes. The number of supporting objects for a particular base object may not be limited. For example, the user may further provide additional add-on information to the modified base object as additional supporting objects.
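
The FIG. 2(a) flow (convert the user file, generate a supporting object from add-on text, and record the mouse placement as a synchronizing action) can be summarized in a short sketch. The helper functions declared below (convertToBaseObject, saveBaseObject, saveSupportingObject, saveSynchronizingAction) are hypothetical names introduced only for illustration and are labeled as such in the comments.

    // Hypothetical sketch of the FIG. 2(a) flow. The declared helpers below are
    // assumed stand-ins for the base object generator and database operations.
    declare function convertToBaseObject(file: File): Promise<{ id: string; uri: string }>;
    declare function saveBaseObject(obj: { id: string; uri: string }): Promise<void>;
    declare function saveSupportingObject(obj: object): Promise<void>;
    declare function saveSynchronizingAction(obj: object): Promise<void>;

    async function shareWithComment(
      userFile: File,
      commentText: string,
      placement: { page: number; x: number; y: number },
    ): Promise<void> {
      // 1. Convert the uploaded user file to a browser-accessible base object.
      const baseObject = await convertToBaseObject(userFile);
      await saveBaseObject(baseObject);                 // stored in the information database 104

      // 2. Generate a supporting object from the add-on text.
      const supportingObject = { baseObjectId: baseObject.id, kind: "textNote", text: commentText };
      await saveSupportingObject(supportingObject);

      // 3. Record the placement (mouse movement) as a synchronizing action.
      const action = { baseObjectId: baseObject.id, action: "move", ...placement };
      await saveSynchronizingAction(action);
    }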

FIG. 2(b) shows building a dynamic information-sharing linkage between at least two users through the information sharing system 100. The base objects, supporting objects, and synchronizing actions may be transmitted between the paired users to achieve information sharing. Each user may comment on the other party's shared information by adding supporting objects and synchronizing actions on the base object. In this example, one or more users may have a local information sharing client 202, each including a local database, such that the base object and supporting object stored in the information database 104 may be retrieved and stored in the local database. The information-sharing client 202 may reside on any suitable device, such as but not limited to, a desktop or laptop computer, a netbook, a tablet, a smartphone, a game console, a set-top box, etc. In this example, synchronizing actions may be retrieved and stored in the information sharing client 202 along with the corresponding base and supporting objects. Once a user dynamically manipulates the base object and/or the supporting object through synchronizing actions (e.g., moving a highlighting box to a different location on the WORD document), only the dynamically generated synchronizing actions (e.g., the mouse movement) need to be transmitted to the other user because the manipulated base object and the supporting object have already been retrieved and stored in the local database. As a result, the information sharing between the users is facilitated since the amount of data that needs to be transmitted (dynamic synchronizing actions) is minimized. In this example, depending on the type of shared information, a particular service relationship may be established between the paired users through the information sharing system 100. The service relationship includes, for example, teacher-student, editor-author, attorney-client, doctor-patient, and collaborator relationships, to name a few.
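
The key efficiency claim above is that only the small synchronizing action crosses the network once the base and supporting objects are cached locally. The sketch below is a hedged illustration of that idea; the CachedClient interface and relayAction function are hypothetical names, not part of the disclosed system.

    // Hypothetical sketch of the FIG. 2(b) linkage: because base and supporting
    // objects are already cached in each local client 202, only the small,
    // dynamically generated synchronizing action is transmitted.
    interface CachedClient {
      hasBaseObject(id: string): boolean;
      applyAction(action: { baseObjectId: string; action: string; x?: number; y?: number; page?: number }): void;
    }

    function relayAction(
      action: { baseObjectId: string; action: string; x?: number; y?: number; page?: number },
      recipients: CachedClient[],
    ): void {
      for (const client of recipients) {
        // Nothing heavier than this small action record crosses the network;
        // the base object itself was retrieved and cached beforehand.
        if (client.hasBaseObject(action.baseObjectId)) {
          client.applyAction(action);
        }
      }
    }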

FIGS. 3(a)-3(g) depict exemplary applications of the information sharing system 100, according to the different embodiments of the present teaching. FIG. 3(a) shows interactive and remote learning through the information sharing system 100. In one example, a teacher teaches online or offline with lecture notes (as base objects) and creates online or offline testing exams and homework assignments (as base objects). Teachers can also subscribe and make copies of standard teaching lectures, exams, and homework assignments and further customize the lectures, exams, and homework assignments to better suit the needs of his or her students. Students can learn online or offline by, for example, watching and listening to the lectures and completing the exams and homework assignments (as supporting objects associated with the base object). The teacher may further review and comment on the students' answers to the exams and assignments by adding additional supporting objects to the exams and assignments. For offline learning, the lectures, exams, and assignments may be delivered and saved to each student's local information-sharing client 202. As noted above, the teacher and students may be members of the information sharing system 100, and the interactive and remote learning establishes a teacher-student relationship between the members. The member account in the user database 106 may track the information sharing/exchange between the teacher and students in order to calculate the service charges that the students need to pay to the entity that runs the information sharing system 100 and the teacher. For example, the service charge calculation may be based on the number of the base and/or supporting objects downloaded to the student's local information sharing client 202 (e.g., the total pages of lecture notes), the number of course subjects the students have subscribed to, or the amount of time the teacher and/or the students have spent on the interactive and remote learning. It is understood that the non-member users of the information sharing system 100 may also be able to participate in the learning for free through the information sharing system 100 either as the teacher or the student. In that case, the teacher may make a profit by adding advertisements to the teaching materials that he/she uploads to the information database 104, and the entity that runs the information sharing system 100 may also add additional advertisements to the teaching materials to make a profit instead of charging the students directly. In addition, the students may provide feedback and ratings to the teachers and their teaching materials, and the feedback may be tracked in the member account as a factor to determine the service fees to be allocated to each teacher.

FIG. 3(b) shows music sharing, learning, and publication through the information sharing system 100. In this example, music sheets may be converted and stored as base objects in the forms of, for example, PDF, flash, or image files, which can be synchronized with supporting objects, such as instrumental music or songs in the forms of audio or video files, add-on notes, highlighting, etc. Such a platform empowers musicians and students to cooperate and learn based on the same standard base objects (the converted sheet music as the base objects). For example, the music recordings (as supporting objects) made by musicians may synchronize with the music sheets (as base objects) and be available to other users of the information sharing system 100. For example, musicians can compose and publish their sheet music as standard objects, or record their piano or other instrument playing, and upload the playing as the supporting objects to the information database 104 so that other users can share the recorded playing for various purposes such as entertainment, music learning, or publishing their own recording with the professional music accompaniments. The information database 104 may contain standard music sheets for the musicians to record and upload their music accompaniments as supporting objects to the standard music sheets. Moreover, since the base and supporting objects may be stored locally instead of being transmitted as video streaming, the music sharing and learning in this example can be achieved with minimum latency.

In this example, the music accompaniment made by famous musicians could be featured and made available, at a certain price, for other users to sing along with or to play their musical instruments with. This allows musicians to publish their records through the information sharing system 100 to easily promote and sell their music pieces. It is understood that users of the music sharing, learning, and publication application may be either members or non-members, and their monetization schemes may vary accordingly as noted above. For example, for members, service charges may be incremented for each music accompaniment download and tracked in the members' accounts in the user database 106. In one example, the information sharing system 100 may be used to facilitate such transactions by splitting the collected service fees between the musicians and the entity that runs the information sharing system 100.

In one example, different users (e.g., musicians) can post their performance of a piece of music, and another user can select and choose to compose different pieces together to make, e.g., a symphony. In another example, a composer can post his/her music and solicit other users to play different instruments and then put them together. The put-together music may be distributed or downloaded to make a profit. In this example, the information sharing system 100 keeps track of which piece is actually incorporated into the final performance and the number of downloads. In one example, the entity that runs the information sharing system 100 may make a profit by taking a percentage of the income, and at the same time, the information sharing system 100 may keep track of the contributors to make sure that they will also get paid because their piece has been incorporated into the final product. In a similar vein, the information sharing system 100 may generate a sharing object that comprises all the information generated from the original base object, which, for example, includes all the modifications of the original content of the user file and all the add-on comments, notes, explanations, reviews, etc., in the forms of supporting objects and synchronizing actions. Such a sharing object may have its own special value as a new piece of information and may be distributed and downloaded by any user. In one example, the base object itself may contain a solicitation to a particular group of users (e.g., collaborators in the same entity) or to all users of the information sharing system 100 to contribute to the sharing object. It is also understood that the applications in FIGS. 3(a) and 3(b) may be combined such that interactive and remote music learning may be achieved through the information sharing system 100.

FIG. 3(c) shows research notes and papers discussion through the information sharing system 100. Research labs often require their research members to share notes and publications. Research notes are often written in a document and word processing file format, such as MICROSOFT WORD, and need to be shared with other researchers. In this example, the information sharing system 100 provides an intuitive and effective way to achieve research note and paper sharing among researchers. For example, the research notes and papers may be generated as base objects and shared with other members in the research lab. A researcher may comment on the base object by adding supporting objects in various forms as noted above. Similarly, FIG. 3(d) shows team collaboration through the information sharing system 100. In this example, any team member can save documents, ideas, or any information to the information database 104 through the information sharing system 100 in order to share the information with collaborators. In one example, the base and supporting objects may be used as the work requirements or work specifications such that the team leader or manager can assign the base and supporting objects to team members as work assignments to manage teamwork. FIG. 3(e) shows interactive meetings through the information sharing system 100. In this example, the information sharing system 100 may enable users to conduct an online meeting with minimum latency. For example, during the meeting, the presenter may switch pages of a document, highlight certain areas, play animations, and play back pre-recorded audio files that are synchronized with the page switching or the animation. Because such meetings may only transfer dynamic synchronizing actions without sending massive amounts of video stream data, they provide a better user experience. The interactive meeting will be described in detail later. FIG. 3(f) shows real-time polling and voting through the information sharing system 100. In this example, because information can be shared as base objects, and multiple users can input their notes, comments, and ratings (as supporting objects) for approval or disapproval, the information sharing system 100 may be applied for real-time polling and voting when multiple inputs are required in real time for reaching team consent and agreement. FIG. 3(g) shows information sharing for social networking through the information sharing system 100. In this example, any information such as documents can be easily converted to movie-like presentations as base objects by the information sharing system 100. Thus, even users with little computer knowledge may produce professional-quality presentations. The information-sharing system 100 may be integrated with any other social networking tools for more impressive and effective information sharing.

FIG. 4 depicts an exemplary diagram of the information sharing controller 102, according to an embodiment of the present teaching. In this example, the information sharing controller 102 includes a base object generator 402, a supporting object generator 404, and a synchronizing engine 406, each operatively coupled to the information database 104. The base object generator 402 may be configured to convert a user file to a base object of a certain type such that the information in the user file is accessible to any user via a web browser or an application. The base object generator 402 may reside on a server in the “cloud” or on the local information-sharing client 202. The base object generated by the base object generator 402 may be stored in the information database 104 directly if the base object generator 402 is in the “cloud” or may be stored in the local database and uploaded to the information database 104 later if the base object generator 402 is on the local information-sharing client 202.
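
One small, hypothetical piece of the base object generator 402 is choosing a browser-accessible target format for an uploaded user file. The mapping below is an illustrative assumption only; the disclosure does not prescribe this particular conversion table, and the function name is invented for the sketch.

    // Hypothetical sketch of format selection inside the base object generator 402.
    type BaseObjectFormat = "pdf" | "html5" | "image" | "video";

    function chooseBaseObjectFormat(fileName: string): BaseObjectFormat {
      const extension = fileName.split(".").pop()?.toLowerCase() ?? "";
      if (["doc", "docx", "ppt", "pptx", "xls", "xlsx"].includes(extension)) return "pdf";
      if (["png", "jpg", "jpeg", "gif"].includes(extension)) return "image";
      if (["mp4", "mov", "webm"].includes(extension)) return "video";
      return "html5";          // fall back to an HTML5 rendering of the content
    }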

The information-sharing controller 102 may also include a supporting object generator 404 configured to generate supporting objects to be associated with the base objects in response to user inputs and requests. As noted above, the supporting objects may include time-based supporting objects that are synchronized with the base object on a timescale and non-time-based supporting objects that are coordinated with the base object on a space scale. The supporting objects may then be stored in the information database 104 as separate files from the associated base objects. Similar to the base object generator 402, the supporting object generator 404 may reside on a server in the “cloud” or on the local information sharing client 202.

The information-sharing controller 102 may further include a synchronizing engine 406 configured to generate synchronizing actions for manipulating and associating the base objects and corresponding supporting objects. For example, the base objects and supporting objects may already exist in the local database of the local information sharing client 202 where the base and supporting objects are generated, or may be pre-downloaded to the local database. Thus, only the dynamically changed synchronizing actions need to be transferred from the information database 104 to each user's local information sharing client 202 to coordinate the presentation of the base and supporting objects on the local information sharing client 202.
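
As a rough sketch of why transferring only synchronizing actions is inexpensive, the snippet below models an action as a small serializable record; the field names and the JSON encoding are assumptions made only for illustration.

```python
# Minimal sketch of a synchronizing action as a small, self-contained record.
from dataclasses import dataclass, asdict
import json
import time


@dataclass
class SynchronizingAction:
    action_type: str        # e.g. "switch_page", "highlight", "move_cursor"
    base_object_id: str
    payload: dict           # e.g. {"page": 2} or {"x": 0.4, "y": 0.6}
    timestamp: float


def encode_action(action: SynchronizingAction) -> bytes:
    # Only this small payload crosses the network; the base and supporting
    # objects are already present in each client's local database.
    return json.dumps(asdict(action)).encode("utf-8")


action = SynchronizingAction("switch_page", "doc-42", {"page": 2}, time.time())
print(len(encode_action(action)), "bytes")   # tens of bytes, not a video frame
```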

FIG. 5 is a depiction of an exemplary non-time-based supporting object 502 and time-based supporting object 504, according to an embodiment of the present teaching. For the non-time-based supporting object 502, these visual objects are coordinated with the base object 506 by specifying relative scale coordinates on the base object 506. The base object 506 may be represented as a visual object with three dimensions, including the page number representing the z coordinate and the horizontal and vertical coordinates (x, y) representing the position where the non-time-based supporting object 502 is placed, as shown in FIG. 5. In one example, standard resolution/coordinates may be applied for coordinating the non-time-based supporting object 502 on the base object 506. For example, the information sharing system 100 may use a relative scale mechanism based on the height and width of the base object 506, allowing the non-time-based supporting object 502 to be displayed independent of screen size or resolution.
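
A minimal sketch of this relative-scale mechanism follows, assuming a supporting object stores a page number plus fractional (x, y) coordinates in the range 0 to 1; the class and function names are illustrative assumptions.

```python
# Minimal sketch of relative-scale placement for a non-time-based supporting object.
from dataclasses import dataclass


@dataclass
class NonTimeBasedSupportingObject:
    page: int        # the z coordinate of the base object
    x_rel: float     # horizontal position as a fraction of base-object width
    y_rel: float     # vertical position as a fraction of base-object height
    content: str     # e.g. a note or drawing reference


def place_on_screen(obj: NonTimeBasedSupportingObject,
                    screen_width_px: int, screen_height_px: int) -> tuple[int, int]:
    """Map relative coordinates to pixels for the viewer's current resolution."""
    return round(obj.x_rel * screen_width_px), round(obj.y_rel * screen_height_px)


note = NonTimeBasedSupportingObject(page=3, x_rel=0.25, y_rel=0.6, content="see figure")
print(place_on_screen(note, 1920, 1080))   # (480, 648)
print(place_on_screen(note, 800, 600))     # (200, 360) -- same relative spot
```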

For the time-based supporting object 504, these objects and the base object 506 may be synchronized on a timescale. As shown in FIG. 5, each time-based supporting object 504, or each part of a time-based supporting object 504, may be activated at the correct time as triggered by one or more synchronizing actions. The synchronizing engine 406 may use time as the coordinating scale so that any audio, video, mouse movements, animations, visual effects, applications, or any other time-based supporting objects 504 are recorded with the absolute value of time. For example, as a user plays back the recorded audio or animation, synchronizing actions may be used to synchronize the audio or animation with the base object. The synchronizing actions may include information regarding, for example, on which page or at what time to activate the time-based supporting object 504. Synchronizing multiple time-based supporting objects 504 may also be supported. For example, recorded animations may be saved and played back while playing back the audio. In addition, time-based supporting objects 504 may be synchronized with a video-type base object 506. This can be achieved in a way similar to how the time-based supporting objects 504 are handled with respect to document-type base objects 506.
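
The following sketch shows one way such timescale synchronization could work, assuming each time-based supporting object records an absolute time offset from the start of the recording; all names are illustrative assumptions.

```python
# Minimal sketch of playing back time-based supporting objects on a timescale.
from dataclasses import dataclass
import time


@dataclass
class TimeBasedSupportingObject:
    start_time: float     # absolute offset (seconds) at which the object activates
    kind: str             # e.g. "audio", "animation", "cursor"
    data: str


def play_back(objects: list[TimeBasedSupportingObject]) -> None:
    """Activate each object at its recorded time, in chronological order."""
    start = time.monotonic()
    for obj in sorted(objects, key=lambda o: o.start_time):
        delay = obj.start_time - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)
        print(f"t={obj.start_time:4.1f}s  activate {obj.kind}: {obj.data}")


play_back([
    TimeBasedSupportingObject(0.0, "audio", "narration.ogg"),
    TimeBasedSupportingObject(2.5, "animation", "underline paragraph 2"),
    TimeBasedSupportingObject(4.0, "cursor", "move to (0.5, 0.3)"),
])
```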

FIG. 6 is a depiction of an exemplary process of synchronizing supporting objects with a base object, according to an embodiment of the present teaching. In this example, different users may add different supporting objects, which may be synchronized by time-based coordinates and/or the three-dimensional space coordinates, as noted above. In this example, a version control mechanism may be applied to control the modification of the base object and the association of additional supporting objects with the different versions of the base object. For example, once a user modifies the content of the user file, the base object may be converted again and labeled as a new version to distinguish it from previous versions. In one example, a new supporting object may always be associated with the latest version of the base object. In another example, the supporting objects associated with a previous version of the base object may be re-associated with the latest version of the base object if necessary. It is understood that some or all of the versions of a base object may be temporarily or permanently stored in the information database 104 and/or the local database if necessary.

FIG. 7 is a depiction of another exemplary process of synchronizing supporting objects with a base object, according to an embodiment of the present teaching. In this example, not only the base object is version controlled, but the supporting objects and synchronizing actions may also be version controlled. In this example, a base object with its associated supporting objects and synchronizing actions may be defined as a sharing object. Any change to the base object, supporting object(s), or synchronizing action(s) may trigger the information sharing system 100 to record a new version of the sharing object. For example, adding new add-on notes and sounds to the sharing object may generate a new version of the sharing object; modifying the base document may generate a new version of the sharing object; adding a new synchronizing action to the previous version may also generate a new version of the sharing object. In this example, the information sharing system 100 may regenerate each version of a sharing object with the corresponding base object, supporting object(s), and synchronizing action(s).
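
A minimal sketch of this sharing-object version control follows; the SharingObject class, its method names, and the use of immutable version records are assumptions introduced only to illustrate the rule that any change yields a new version.

```python
# Minimal sketch: any change to a sharing object's parts records a new version.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class SharingObjectVersion:
    version: int
    base_object_id: str
    supporting_object_ids: tuple = ()
    synchronizing_action_ids: tuple = ()


class SharingObject:
    def __init__(self, base_object_id: str):
        self.versions = [SharingObjectVersion(1, base_object_id)]

    def _record(self, **changes) -> SharingObjectVersion:
        latest = self.versions[-1]
        new = replace(latest, version=latest.version + 1, **changes)
        self.versions.append(new)          # every change yields a new version
        return new

    def add_supporting_object(self, supporting_id: str) -> SharingObjectVersion:
        latest = self.versions[-1]
        return self._record(
            supporting_object_ids=latest.supporting_object_ids + (supporting_id,))

    def replace_base_object(self, new_base_id: str) -> SharingObjectVersion:
        return self._record(base_object_id=new_base_id)


doc = SharingObject("doc-v1")
doc.add_supporting_object("voice-note-1")   # -> version 2
doc.replace_base_object("doc-v2")           # -> version 3
print([v.version for v in doc.versions])    # [1, 2, 3]
```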

FIGS. 8(a) and 8(b) are flow charts of exemplary processes in which information sharing is performed, according to different embodiments of the present teaching. In FIG. 8(a), starting from block 802, a first request is received from a user to access a first piece of information in the information database 104. In response to the first request, at block 804, a first representation of the first piece of information is retrieved. At block 806, the first piece of information is made accessible to the user. For example, the information sharing system 100 may convert the first piece of information to a base object of a certain type and present the base object to the user as a response to the first request.

Moving to block 808, a second request to generate a second piece of information based on the first piece of information is received. At block 810, the second piece of information is generated based on an input received from the user. At block 812, a second representation of the second piece of information is created. For example, the information sharing system 100 may receive a second request from the user to generate a second piece of information, such as a supporting object and/or a synchronizing action, based on the base object. The information-sharing system 100 may generate the second piece of information based on the user's inputs and create a representation of the second piece of information so that other users may retrieve the second piece of information. In one example, the second representation may be an indication that the second piece of information is available for retrieval, such as a list of supporting objects that are associated with the first piece of information for selection.

Moving to block 814, the first and second pieces of information are stored, for example, in the information database 104 or local databases. At block 816, the second piece of information is associated with the first piece of information. At block 818, when the first piece of information is accessed, the second representation of the second piece of information is retrieved. For example, the second representation and the first representation may be marked as associated with each other so that whenever the first piece of information is accessed in the future, the second piece of information is made available.
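
A minimal in-memory sketch of the FIG. 8(a) flow is shown below; the InformationStore class and its method names are assumptions used only to illustrate how a second piece of information, once associated, is returned on later accesses.

```python
# Minimal sketch of storing, associating, and later retrieving pieces of information.
class InformationStore:
    def __init__(self):
        self._items = {}          # item id -> content
        self._associations = {}   # base item id -> list of associated item ids

    def put(self, item_id: str, content) -> None:
        self._items[item_id] = content

    def add_associated(self, base_id: str, new_id: str, content) -> None:
        # Blocks 808-816: store the second piece and associate it with the first.
        self._items[new_id] = content
        self._associations.setdefault(base_id, []).append(new_id)

    def access(self, item_id: str):
        # Blocks 802-806 and 818: retrieve the first piece together with
        # anything previously associated with it.
        associated = self._associations.get(item_id, [])
        return self._items[item_id], [self._items[i] for i in associated]


store = InformationStore()
store.put("doc-1", "base object: project plan")
store.add_associated("doc-1", "note-1", "supporting object: voice note")
print(store.access("doc-1"))
```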

In FIG. 8(b), starting from block 820, a first request is received from a first user to access a first piece of information. In response to the first request, at block 822, a first representation of the first piece of information is retrieved. At block 824, a second representation of a second piece of information created by a second user and associated with the first piece of information is retrieved. For example, the information sharing system 100 may receive a first request from the first user to access a first piece of information. In response to the first request, the base object and its associated supporting objects and synchronizing actions created by a second user may be retrieved and presented to the first user as the first and second pieces of information, respectively. In one example, the representation of the second piece of information may include a solicitation for a response to the second piece of information. For example, the first piece of information may be a music sheet, and the second piece of information may be a playing of the music sheet made by the second user with a solicitation to the first user for comment on the playing.

Moving to block 826, a second request is received from the first user to create a third piece of information associated with the second piece of information. At block 828, the third piece of information is created based on input from the first user. Moving to block 830, a third representation is created for the third piece of information. The third representation includes an indication of association with the second piece of information. At block 832, a relationship between the first and the second users is established. For example, the information sharing system 100 may further receive a second request from the first user to create a third piece of information, such as additional supporting objects and/or synchronizing actions. The information-sharing system 100 then may create the third piece of information and establish a relationship between the first and the second users. At block 834, once the relationship is established, a record may be created as evidence of the relationship by the information sharing system 100. In one example, the third piece of information (e.g., comments made by the first user on the second user's playing) may be transmitted back to the second user as a response to the solicitation for comment.
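
The relationship-recording step of FIG. 8(b) can be sketched as below, assuming a simple record keyed by the two users and the solicited item; the function name and record fields are illustrative assumptions.

```python
# Minimal sketch: responding to a solicitation records a relationship between users.
from datetime import datetime, timezone

relationship_records: list[dict] = []   # kept as evidence of established relationships


def respond_to_solicitation(first_user: str, second_user: str,
                            solicited_item_id: str, comment: str) -> dict:
    # Blocks 826-834: create the third piece of information (the comment) and
    # record the relationship between the responding and soliciting users.
    record = {
        "from": first_user,
        "to": second_user,
        "in_response_to": solicited_item_id,
        "comment": comment,
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
    relationship_records.append(record)
    return record


print(respond_to_solicitation("first_user", "second_user", "performance-17",
                              "Nice phrasing in bar 12; watch the tempo."))
```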

FIG. 9 is an exemplary diagram of a system for online meetings, according to an embodiment of the present teaching. In this example, the system may include a meeting server 900 residing in the “cloud” and local meeting clients 902, 904 for the meeting presenter (host) and participants (guests). The meeting server 900 may include the information database 104 that stores the base and supporting objects and the synchronizing engine 406 operative to simultaneously dispatch dynamic synchronizing actions (actionable items) received from the meeting presenter 902 to each meeting participant 904. In this example, the base and supporting objects, such as the presentation slides and comments, may be pre-downloaded to the local database of each meeting participant 904. During the meeting, the presenter 902 may dynamically generate new synchronizing actions via the synchronizing action generator 906, such as moving the mouse cursor on the presentation slides, highlighting an area, or switching pages. These dynamic synchronizing actions may be synchronized with the base object and supporting objects and output to the presenter 902. At the same time, the dynamic synchronizing actions may be simultaneously transmitted to the action synchronizing module 908 of each participant in real time. Similarly, these dynamic synchronizing actions may be synchronized with the base object and supporting objects and presented to each participant 904.

For example, in FIG. 10, a base object may be presentation slides that have been pre-downloaded to the local database of each participant before the meeting starts. At time t11, the presenter may generate a first dynamic synchronizing action (actionable item) of switching the presentation slides to page 22. The first dynamic synchronizing action may be dispatched to each meeting participant at approximately the same time t11, such that the presentation slide on each participant's machine may also be switched to page 22. At time t12, the presenter may create a second dynamic synchronizing action of adding an animation note on page 22. In one example, the animation note may already exist in the local database of participant 1, and thus only the dynamic synchronizing action itself needs to be transferred to participant 1 to instruct which animation needs to be added and where. However, if participant 2 does not have the animation note in its local database, it may retrieve the animation note from participant 1 or from the information database. Also, the second dynamic synchronizing action may not be directly dispatched from the meeting server to participant 2 but may instead be transmitted from participant 1. As shown in FIG. 10, other dynamic synchronizing actions, such as moving the mouse cursor and highlighting, may also be dispatched to the participants in a similar manner as noted above.
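
A minimal sketch of this dispatch pattern follows, assuming each participant keeps a local set of already-downloaded objects and fetches a missing supporting object from a peer or, failing that, from the information database; class and field names are illustrative assumptions.

```python
# Minimal sketch of dispatching dynamic synchronizing actions during a meeting.
class Participant:
    def __init__(self, name: str, local_objects: set[str]):
        self.name = name
        self.local_objects = local_objects

    def apply_action(self, action: dict, peers: list["Participant"]) -> None:
        needed = action.get("requires")
        if needed and needed not in self.local_objects:
            # Fetch the missing supporting object from a peer that has it,
            # falling back to the information database otherwise.
            source = next((p.name for p in peers if needed in p.local_objects),
                          "information database")
            print(f"{self.name}: fetching '{needed}' from {source}")
            self.local_objects.add(needed)
        print(f"{self.name}: applied {action['type']} {action.get('payload', '')}")


p1 = Participant("participant 1", {"slides", "animation-note"})
p2 = Participant("participant 2", {"slides"})
meeting = [p1, p2]

for action in [{"type": "switch_page", "payload": {"page": 22}},
               {"type": "add_animation", "requires": "animation-note"}]:
    for participant in meeting:   # the meeting server dispatches to everyone
        participant.apply_action(action, [p for p in meeting if p is not participant])
```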

FIGS. 11(a) and 11(b) are flow charts of exemplary processes in which online meetings are performed, according to different embodiments of the present teaching. In FIG. 11(a), starting at block 1102, a first request to access a first piece of information is received. The first request is associated with a plurality of users, e.g., meeting participants 904. The first piece of information may be base objects and supporting objects, such as presentation slides. At block 1104, the first piece of information is retrieved as a response to the first request. The retrieved first piece of information is delivered to the plurality of users to become accessible at block 1106. For example, the meeting server 900 may retrieve the presentation slides from the information database 104 so that each meeting participant 904 can pre-download the presentation slides before the meeting starts.

At block 1108, a second request is received from an acting user, e.g., the meeting presenter 902, to generate a second piece of information based on the first piece of information. At block 1110, the second piece of information, e.g., synchronizing actions, is generated based on an input received from the acting user. For example, during the meeting, the presenter 902 may dynamically generate new synchronizing actions via the synchronizing action generator 906, such as moving the mouse cursor on the presentation slides, highlighting an area, or switching pages. Proceeding to block 1112, information indicating an association between the second piece of information and the first piece of information is created. For example, the dynamic synchronizing actions may be synchronized with the base object and supporting objects and output to the presenter 902. At block 1114, the second piece of information, with embedded information indicating the association, is delivered to the plurality of users. For example, the dynamic synchronizing actions may be simultaneously transmitted to the action synchronizing module 908 of each participant 904 in real time.

In FIG. 11(b), before the meeting starts, the meeting server 900 delivers a base object, e.g., presentation slides, to the meeting presenter 902 and each meeting participant 904 at blocks 1116 and 1118, respectively. During the meeting, the meeting server 900, at block 1120, receives input from the meeting presenter 902 to generate an actionable item for the base object, such as moving the mouse cursor on the presentation slides, highlighting an area, or switching pages. At block 1122, the meeting server 900 delivers the actionable item to all the meeting participants 904 simultaneously, and each meeting participant 904 receives the actionable item in real time during the meeting at block 1124. The actionable item is then associated with the base object for the meeting presenter 902 and meeting participants 904 at blocks 1126 and 1128, respectively. The base object with the associated actionable item is then presented to the meeting presenter 902 and meeting participants 904 at blocks 1130 and 1132, respectively.

This online meeting not only allows users to view the same shared document but also transfers the synchronizing actions performed by the presenter to all participants instantly. Since the size of the synchronizing action data is relatively small, the latency is minimized. As a result, the participants see the synchronizing actions as the meeting presenter performs them, in real time. This provides a real-time, in-person feeling for the meetings to the end users. Minimizing latency and providing an in-person feeling make the system a more effective communication tool for its users. In addition, meetings can be recorded and played back by other users, so that people who are not able to attend the meeting live may experience the exact same meeting at a later time. Furthermore, since the recorded meeting consists of the synchronizing actions saved in sequential order based on the time scale, such recorded meetings have a very small file size and provide advantages over traditional methods in which online meetings are recorded as video files of larger size.
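
To illustrate why such recordings stay small, the sketch below stores a meeting as an ordered list of timestamped synchronizing actions; the JSON layout is an assumption chosen only for illustration.

```python
# Minimal sketch: a recorded meeting is a sequence of small timestamped actions.
import json

recorded_meeting = [
    {"t": 0.0,  "type": "switch_page", "page": 1},
    {"t": 12.4, "type": "highlight", "page": 1, "x": 0.3, "y": 0.5},
    {"t": 31.0, "type": "switch_page", "page": 2},
    {"t": 45.8, "type": "play_audio", "clip": "narration-02"},
]

serialized = json.dumps(recorded_meeting).encode("utf-8")
print(f"{len(recorded_meeting)} actions in {len(serialized)} bytes")
# Playback re-dispatches each action at its saved offset, so a user who missed
# the live session can experience the same meeting later at negligible cost.
```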

To implement the present teaching, computer hardware platforms may be used as the hardware platform(s) for one or more of the elements described herein. The hardware elements, operating systems, and programming languages of such computers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith to adapt those technologies to implement the processing essentially as described herein. A computer with user interface elements may be used to implement a personal computer (PC) or another type of workstation or terminal device, although a computer may also act as a server if appropriately programmed. It is believed that those skilled in the art are familiar with the structure, programming, and general operation of such computer equipment and, as a result, the drawings should be self-explanatory.

FIG. 12 depicts a general computer architecture on which the present teaching can be implemented and has a functional block diagram illustration of a computer hardware platform that includes user interface elements. The computer may be a general-purpose computer or a special-purpose computer. This computer 1200 can be used to implement any components of the information sharing architecture as described herein. Different components of the system can all be implemented on one or more computers such as a computer 1200, via its hardware, software program, firmware, or a combination thereof. Although only one such computer is shown, for convenience, the computer functions relating to information sharing may be implemented in a distributed fashion on a number of similar platforms to distribute the processing load.

The computer 1200, for example, includes COM ports 1202 connected to and from a network connected thereto to facilitate data communications. The computer 1200 also includes a central processing unit (CPU) 1204, in the form of one or more processors, for executing program instructions. The exemplary computer platform includes an internal communication bus 1206 and program storage and data storage of different forms, e.g., disc 1208, read-only memory (ROM) 1210, or random access memory (RAM) 1212, for various data files to be processed and/or communicated by the computer, as well as possibly program instructions to be executed by the CPU. The computer 1200 also includes an I/O component 1214, supporting input-output flows between the computer and other components therein, such as user interface elements 1216. The computer 1200 may also receive programming and data via network communications.

Hence, aspects of the method of information-sharing, as outlined above, may be embodied in programming. Program aspects of the technology may be thought of as “products” or “articles of manufacture,” typically in the form of executable code and/or associated data that is carried on or embodied in a type of machine-readable medium. Tangible non-transitory “storage” type media include any or all of the memory or other storage for the computers, processors, or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disc drives, and the like, which may provide storage at any time for the software programming.

All or portions of the software may at times be communicated through a network such as the Internet or various other telecommunication networks. Such communications, for example, may enable loading of the software from one computer or processor to another. Thus, another type of media that may bear the software elements includes optical, electrical, and electromagnetic waves, such as those used across physical interfaces between local devices, through wired and optical landline networks, and over various air links. The physical elements that carry such waves, such as wired or wireless links, optical links, or the like, may also be considered as media bearing the software. As used herein, unless restricted to tangible “storage” media, terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions to a processor for execution.

Hence, a machine-readable medium may take many forms, including but not limited to a tangible storage medium, a carrier wave medium, or a physical transmission medium. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, which may be used to implement the system or any of its components as shown in the drawings. Volatile storage media include dynamic memory, such as the main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire, and fiber optics, including the wires that form a bus within a computer system. Carrier wave transmission media can take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include, for example: a floppy disk, a flexible disk, a hard disk, magnetic tape, or any other magnetic medium; a CD-ROM, DVD, or DVD-ROM, or any other optical medium; punch cards, paper tape, or any other physical storage medium with patterns of holes; a RAM, a PROM, an EPROM, a FLASH-EPROM, or any other memory chip or cartridge; a carrier wave transporting data or instructions; cables or links transporting such a carrier wave; or any other medium from which a computer can read programming code and/or data. Many of these forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.

In addition to the above identified features, options, controls, and components, system 10 may also include other features and functionalities, among other options, controls, and components.

It will be appreciated by those skilled in the art that other various modifications could be made to the system, process, and method of use without departing from the spirit and scope of this disclosure. All such modifications and changes fall within the scope of the claims and are intended to be covered thereby.

Claims

1. An information sharing system, comprising:

a plurality of collaborators;
a content;
a sync; the sync having at least one content; the sync having a voice recording; the voice recording created by the plurality of collaborators; the voice recording overlaid over the content so that the content is being discussed in the voice recording; wherein the voice recording communicates information and discussion related to the content; wherein the voice recording provides additional information related to the content; the sync having a plurality of annotation features; the plurality of annotation features created by the plurality of collaborators; the plurality of annotation features overlaid over the content so that the content is being discussed and emphasized by the plurality of annotation features; wherein the plurality of annotation features communicates information and discussion related to the content; wherein the plurality of annotation features provides additional information related to the content; wherein the sync can be viewed in a similar manner to viewing a video; such that the information shared within the sync can be reviewed a number of times if needed.

2. The system of claim 1, further comprising:

a subsequent sync edit; wherein the subsequent sync edit creates a second sync.

3. The system of claim 1, further comprising:

a history log; the history log having a viewable log of the sync and interactions between the sync and the plurality of collaborators; wherein any one of the plurality of collaborators can review original syncs which were created; wherein any one of the plurality of collaborators can review subsequent syncs created; wherein the plurality of collaborators can quickly review a history of a topic to understand the history of a communication about the content.

4. The system of claim 1, further comprising:

a first collaborator;
a second collaborator;
wherein the first collaborator creates a first sync;
wherein the second collaborator views the first sync at a time subsequent to the creation of the first sync by the first collaborator.

5. The system of claim 1, further comprising:

a first collaborator;
a second collaborator;
wherein the first collaborator creates a first sync;
wherein the second collaborator views the first sync at a time subsequent to the creation of the first sync by the first collaborator;
wherein the second collaborator creates a second sync;
wherein the first collaborator views the second sync at a time subsequent to the creation of the second sync by the second collaborator.

6. The system of claim 1, further comprising:

a syncroom; the syncroom having a plurality of syncs; the syncroom having a plurality of syncs related to the same topic.

7. The system of claim 1, further comprising:

a syncroom; the syncroom having a plurality of syncs; the syncroom having a plurality of syncs related to the same topic; the syncroom having a chat feature; the syncroom having a meeting launch feature; the syncroom having a livesync launch feature;
wherein the syncroom provides access to a plurality of syncs related to the same topic;
wherein the syncroom provides for information sharing on the same topic.

8. The system of claim 1, further comprising:

an information sharing controller;
an information database;
a collaborator database.

9. The system of claim 1, further comprising:

an information sharing controller;
an information database;
a collaborator database;
an information sharing client.

10. The system of claim 1, further comprising:

a base object generator;
a supporting object generator;
a synchronizing engine.

11. The system of claim 1, further comprising:

a base object generator;
a supporting object generator;
a synchronizing engine;
a non-time-based supporting object.

12. The system of claim 1, further comprising:

a base object generator;
a supporting object generator;
a synchronizing engine;
a time-based supporting object.

13. The system of claim 1, further comprising:

a base object generator;
a supporting object generator;
a synchronizing engine;
a plurality of base objects.

14. The system of claim 1, further comprising:

a meeting server;
a plurality of local meeting clients;
a synchronizing generator;
a synchronizing module.

15. The system of claim 1, further comprising:

a computing system;
a plurality of COM ports;
a central processing unit;
an internal communication bus;
a storage;
a random access memory;
an I/O component;
a user interface.

16. A collaboration and communication system, comprising:

a plurality of collaborators;
at least one content;
a livesync;
the livesync having at least one content;
the livesync having a voice feature; the voice feature spoken by at least one of the plurality of collaborators; the voice feature spoken for the purpose of audio communication over the content so that the content is being discussed in the voice feature;
wherein the voice feature communicates information and discussion related to the content;
wherein the voice feature provides additional information related to the content;
the livesync having a plurality of annotation features; the plurality of annotation features created by the plurality of collaborators; the plurality of annotation features providing real-time annotation on the content so that the content is being visually discussed and emphasized by the plurality of collaborators;
wherein the plurality of annotation features communicates information and discussion related to the content in real time;
wherein the plurality of annotation features provides additional information related to the content in real time;
wherein the livesync can be viewed in a similar manner to viewing a video; such that the information shared within the livesync can be reviewed a number of times if needed;
wherein the plurality of collaborators are able to effectively communicate and share information in real-time through annotation, voice communication, and a chat feature.

17. The system of claim 16, further comprising:

a first collaborator;
a second collaborator;
wherein the first collaborator creates a first livesync;
wherein the second collaborator views the first livesync at a time of creation of the first livesync by the first collaborator; such that the second collaborator is viewing the livesync in real-time; such that the second collaborator is interacting in information sharing with the livesync.

18. The system of claim 16, further comprising:

a first collaborator;
a second collaborator;
wherein the first collaborator creates a first livesync;
wherein the second collaborator views the first livesync at a time subsequent to the creation of the first livesync by the first collaborator;
wherein the first collaborator creates a first livesync;
wherein the second collaborator views the first livesync at a time of creation of the first livesync by the first collaborator; such that the second collaborator is viewing the first livesync in real-time;
wherein the second collaborator interacts with the first livesync by providing voice and annotations on the first livesync;
wherein the second collaborator creates a second livesync.

19. The system of claim 16, further comprising:

a syncroom; the syncroom having a plurality of livesyncs; the syncroom having a plurality of livesyncs related to the same topic; the syncroom having a chat feature; the syncroom having a meeting launch feature; the syncroom having a livesync launch feature;
wherein the syncroom provides access to a plurality of livesyncs related to the same topic;
wherein the syncroom provides for information sharing on the same topic.

20. A method of collaborating and communicating, comprising the steps:

opening a content in a sync; the content uploaded into the sync by a first collaborator;
recording voice to create a video for a sync;
annotating the content during the video of the sync;
sharing the sync with a second collaborator;
viewing the sync by the second collaborator.
Patent History
Publication number: 20220294836
Type: Application
Filed: May 28, 2022
Publication Date: Sep 15, 2022
Inventor: Tieren Zhou (Orinda, CA)
Application Number: 17/827,698
Classifications
International Classification: H04L 65/403 (20060101); H04L 65/612 (20060101);