SYNCHRONIZING CONTENT DISPLAY ACROSS MULTIPLE DEVICES
Techniques are disclosed to synchronize content display across multiple devices, e.g., in a same physical location. In various embodiments, a content feed comprising a representation of an event is received. A content stream comprising or derived from the content feed and a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time at the event depicted in the content stream are generated. The content stream and the synchronization signal are provided to a location via a communication interface. Each of a plurality of devices at the location is configured to use the synchronization signal to render the content stream in a manner that is synchronized with one or more other of the devices.
Many consumers enjoy watching live or pre-recorded events in a group setting, such as a professional or amateur sports contest, a music concert or other artistic performance, a Presidential debate or other significant civic or cultural event, or another live or pre-recorded broadcast event. Examples of a group setting include without limitation a sports bar, a gaming establishment, a public or private club, a private residence, or any other public or private place where people may gather.
In some group settings, viewers may consume the same content (e.g., a broadcast of a live or recorded event) in the same space but on different devices, potentially with different latency in receiving the content of the broadcast. For example, some viewers of a live sporting event at a sports bar may watch a broadcast as displayed on a television or other screen mounted in the location, while others may prefer to watch the broadcast on their personal mobile device. The latency of receiving the broadcast signal or stream at the television, via a cable or satellite provider, for example, may be different than the latency for the same broadcast as streamed to a personal mobile device. It may be desired to have all consumers view the content in a synchronized manner, despite these differences in latency.
In some cases, interactive or even competitive elements may be introduced, such as online gaming or fantasy sports or other real time multi-player competitions. In such cases, it may be considered important, e.g., for fairness, to ensure that all viewers receive the broadcast content in a synchronized manner.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
Techniques are disclosed to provide synchronized consumption of a content feed across multiple devices. In various embodiments, a content provider, such as a broadcaster of a live event or a content provider service that streams prerecorded content, provides a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time (sometimes referred to as a “wall” time or a “wall-clock” time) at the event depicted in the content stream. In various embodiments, the synchronization signal is used at a location to synchronize consumption of the content stream across multiple devices at the location. For example, consumption of a broadcast of a live sports event, e.g., at a sports bar or other venue, may be synchronized among one or more televisions or other monitors associated with the venue and one or more mobile devices of individual patrons or other users at the location.
In some embodiments, a synchronization master node at a location receives and decodes the synchronization signal and broadcasts the synchronization signal and/or a signal derived therefrom to other devices at the location, such as users' mobile devices. The mobile (or other) devices are configured, e.g., via an installed app, to receive and use the synchronization signal to render the content stream (e.g., display video via a display device, play audio via a speaker device, etc.) on the device in a manner that is synchronized with other devices at the location, e.g., the synchronization master (e.g., a network-connected smart television) and/or other users' mobile devices.
In some embodiments, the content provider and/or another node may be configured to derive the real-world time or other event time from the content stream, such as by recognizing the numerical characters comprising a real-world or other event time displayed visually in the content, e.g., a game clock of a sporting event or a physical clock visible in the content.
In the example shown, the video (and/or still image and/or audiovisual) content data generated by camera(s) 104, audio data captured by microphone(s) 106, and position (or other) sensor data received by receiver 108 are provided to content feed server 110. In various embodiments, content feed server 110 comprises one or more computers, workstations, and/or other equipment associated with a broadcaster or other content provider positioned to broadcast and/or record for later consumption a stream or set of content data depicting the event.
The content feed server 110 receives a “real world” (also known as “wall” or “wall time”) time signal 112. Time signal 112 comprises a sequence of values each corresponding to a successive real-world time at the venue 102 during the pendency of the event that is being broadcast and/or recorded. In some embodiments, the times are expressed in Coordinated Universal Time (UTC).
In various embodiments, content feed server 110 processes content data received from camera(s) 104, microphone(s) 106, and/or receiver 108 into a form to be broadcast (e.g., streamed) and/or stored, according to one or more communication and/or content data storage protocols. For each discrete set (e.g., chunk) of content data, content feed server 110 uses the real-world time signal 112 to associate with the set of content data a corresponding tuple that includes an event identifier for the event being broadcast and/or recorded and a corresponding real-world time, the latter indicating the real-world time at which the scene (e.g., visual, audio, positional) depicted by that set of content data occurred at the venue during the event.
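As an illustrative sketch only (the function and field names below are assumptions, not from the specification), the association of successive content chunks with (event identifier, real-world time) tuples might be implemented as follows:

```python
import datetime

def tag_chunks(chunks, event_id, start_time, chunk_duration_s):
    """Associate each content chunk with an (event_id, real-world time) tuple.

    `chunks` is an ordered list of opaque content-data chunks; `start_time`
    is the UTC wall-clock time at which the scene in the first chunk occurred;
    `chunk_duration_s` is the assumed fixed duration of each chunk in seconds.
    """
    tagged = []
    for i, chunk in enumerate(chunks):
        # Real-world time of the scene depicted by this chunk, per the spec's
        # description of tuples carrying an event identifier and a wall time.
        wall_time = start_time + datetime.timedelta(seconds=i * chunk_duration_s)
        tagged.append(((event_id, wall_time.isoformat()), chunk))
    return tagged
```

A real content feed server would derive the wall time from the received time signal rather than extrapolating from a fixed chunk duration; this sketch only shows the tuple structure.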
In various embodiments, one or more of a software development kit (SDK), plug in, application, application programming interface (API), and/or other software running on the content feed server 110 may be used to associate portions of content with corresponding real-world times at the venue at which the event is occurring or occurred.
Referring further to
In the example shown, a plurality of mobile devices, such as mobile phones, tablets, smart watches, etc., represented in
In various embodiments, the site server 116 is configured, e.g., by an application, SDK, API, or other software, to receive and decode tuples received from the content feed server 110 in connection with the event content stream, e.g., tuples each of which includes an event identifier associated with the event and a real world time, the tuple being associated with a portion of the content that occurred at the event venue at or very near that real world time. In various embodiments, the site server 116 broadcasts to the mobile devices 118, 120 a local synchronization signal that comprises and/or is derived from the sequence of tuples received from the content feed server 110. In various embodiments, the site server 116 and/or mobile devices 118, 120 use the locally broadcast synchronization signal to display content data depicting and/or otherwise associated with the event in a synchronized manner. In various embodiments, the synchronization ensures that at any given time the site server 116 and the mobile devices 118, 120 are displaying content data associated with the event that occurred at the same real-world time.
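A minimal sketch of the site server's role as a synchronization master follows; the class name, payload fields, and broadcast callback are illustrative assumptions, and the actual transport (Wi-Fi, Bluetooth™, audio, etc.) is abstracted behind a callable:

```python
import time

class SyncMaster:
    """Receives decoded (event_id, wall time) tuples from the content feed
    server and rebroadcasts a local synchronization signal to nearby devices.
    """

    def __init__(self, broadcast_fn):
        # broadcast_fn abstracts the local transport, e.g., Wi-Fi or Bluetooth.
        self.broadcast_fn = broadcast_fn

    def on_tuple(self, event_id, wall_time_s, local_now_s=None):
        """Handle one synchronization tuple: pair the event's real-world time
        with the master's local clock reading, so receiving devices can map
        event time onto their own timelines.
        """
        local_now_s = time.time() if local_now_s is None else local_now_s
        signal = {"event": event_id, "wall": wall_time_s, "local": local_now_s}
        self.broadcast_fn(signal)
        return signal
```

Devices receiving these signals can compute the offset between event wall time and local time and schedule rendering accordingly.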
In various embodiments, techniques disclosed herein are used to ensure fairness and a shared experience by synchronizing the display of event content and related information across multiple devices at a viewing site, e.g., site server 116 and mobile devices 118, 120.
While the above example refers to a sports competition that is broadcast and/or recorded live, in various embodiments synchronized viewing of other events may be provided, including without limitation a live music or dramatic performance, a speech or lecture, a demonstration or meeting, a parade, a class, or any live (or recorded live) event.
In the example shown in
In the example shown, the content stream processor 306 provides content stream data and the synchronization signal processor 308 provides a synchronization signal to the display engine 310, which in various embodiments is configured to use the content stream data and the synchronization signal to render the content stream via one or more display devices 312 (e.g., monitor or other display device, one or more speakers, etc.) in a manner determined at least in part by the synchronization signal. For example, the display engine 310, in some embodiments, controls timing of the rendering of successive portions of the content stream based on the synchronization signal, which in turn enables the content stream to be displayed on other devices, e.g., mobile devices 118, 120, in synchronization with the display of the content stream via the display device(s) 312.
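One way the display engine's timing control could work (a sketch under assumed names; the specification does not prescribe this formula) is to have every device render the scene that occurred a common, agreed lag behind real time:

```python
def render_delay_s(chunk_wall_time_s, local_clock_s, clock_offset_s, target_lag_s):
    """Seconds to wait before rendering a chunk so that all devices display
    the scene that occurred `target_lag_s` seconds ago at the venue.

    `clock_offset_s` maps event wall time onto this device's local clock
    (derived from the synchronization signal). A negative result means the
    device is behind schedule and should render immediately or skip ahead.
    """
    due_at = chunk_wall_time_s + clock_offset_s + target_lag_s
    return due_at - local_clock_s
```

Because every device computes the same due time from the shared synchronization signal, differences in delivery latency are absorbed into each device's waiting period.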
The synchronization signal processor 308 provides the synchronization signal to a synchronization signal local broadcast engine 314. The synchronization signal local broadcast engine 314 broadcasts the synchronization signal locally to one or more other devices in a same physical location as the site server 116, such as mobile devices 118, 120 in the example shown in
In various embodiments, one or more of the decoder 304, content stream processor 306, synchronization signal processor 308, display engine 310, and synchronization signal local broadcast engine 314 each may comprise a software module or component, e.g., such as a functional module or component of a software application running on a processor comprising the site server 116. In some embodiments, more or fewer modules may be used and/or the functions, processing, and/or operation described as being performed by one or more of the modules as shown in
In various embodiments, the event identifier included in each synchronization tuple may be mapped to one or more other identifiers associated with the event depicted in the content stream, including by way of example and without limitation an event identifier assigned by a promoter, owner, broadcaster, distributor, or other entity involved in the production of one or both of the event and the content stream depicting the event.
At 504, the synchronization signal and/or a local synchronization signal derived therefrom is broadcast locally, e.g., via network communications, such as Ethernet, cellular, or Wi-Fi; Bluetooth™ or other near-field communications; audio fingerprinting; or optical data transfer, such as QR codes. In some embodiments, the synchronization signal is “broadcast” visually and/or via audio techniques, such as those described below.
At 506, the content stream is displayed (e.g., video via a monitor or other display device, audio via speakers, etc.) in a manner determined at least in part by the synchronization signal.
In the example shown in
In some alternative embodiments, a content feed server, such as content feed server 110, generates the successive versions of the dynamic optical code 610, e.g., based on a real-world time associated with the event, such as the real-world time signal 112 of
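A sketch of what a dynamic optical code's payload might carry, per the described combination of an installation URL and a real-world time for the currently displayed content (the JSON structure, field names, and URL are illustrative assumptions; an actual deployment would render this payload as a QR or other optical code):

```python
import json

def optical_code_payload(event_id, wall_time_iso, app_url):
    """Build the payload for one version of the dynamic optical code: an app
    URL (for first-time install), the event identifier, and the real-world
    time of the event content being displayed at this instant.
    """
    # sort_keys makes successive payloads byte-stable except for the fields
    # that actually change, which keeps code regeneration cheap to diff.
    return json.dumps(
        {"url": app_url, "event": event_id, "wall": wall_time_iso},
        sort_keys=True,
    )
```

The content feed server (or site server) would regenerate this payload as the displayed real-world time advances, yielding the successive versions of the dynamic optical code.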
At 642, the mobile device is used to scan a displayed optical code, such as the dynamic optical code 610 of
If at 644 it is determined that the application is not yet present on the device, then at 648 a URL or other information encoded in the optical code is used to download and install the application, after which the application is launched/opened at 646 and used as described above.
If at 650 an indication is received that the display of the content stream may need to be resynchronized, then at 652 the user is prompted to rescan the dynamic optical code, and the synchronization information encoded in the dynamic optical code at the time of rescanning is used to resynchronize (or verify continued synchronization of) the display of content comprising the content stream. Examples of an indication that the display of the content stream may need to be resynchronized include, without limitation, a realization by a user of the mobile device that display of content on their device is out of synchronization with other nearby devices, reaching a threshold amount of device time since the most recent prior scan of the optical code, and detection, e.g., by the application based on audio information received via a microphone of the device, or Bluetooth™ or other computer network information, that other devices nearby are displaying event content other than what the device is displaying at the same time.
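The resynchronization triggers described above (user realization, time since last scan, detection of divergence via audio or network information) could be condensed into a simple predicate; the threshold values below are illustrative assumptions, not from the specification:

```python
def needs_resync(seconds_since_scan, drift_estimate_s,
                 max_scan_age_s=300.0, max_drift_s=0.25):
    """Decide whether to prompt the user to rescan the dynamic optical code.

    Triggers on either staleness of the most recent scan or an estimated
    drift from nearby devices (e.g., inferred from ambient audio or local
    network information) beyond a tolerance.
    """
    return (seconds_since_scan > max_scan_age_s
            or abs(drift_estimate_s) > max_drift_s)
```

In practice the drift estimate and thresholds would be tuned to the venue and transport; the point is only that either condition independently prompts a rescan.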
Synchronized and/or resynchronized display of the content stream continues as described above unless/until done (654), e.g., the event content has reached an end or the user has closed the application, left the location, and/or turned off their device.
In the example shown, at 702 a content stream and/or live or recorded content data is received. At 704, frames of content data are sampled, each sample comprising one or more frames, each comprising an image of a moment in time at the event. At 706, each sample is used to determine a real-world or other time associated with the sampled content. In some embodiments, an event clock shown in each sampled frame, such as the clock 608 in the example shown in
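Once character recognition has been run on a sampled frame (the OCR step itself is assumed here and not shown), the recognized clock text must be interpreted as a time. A sketch of that interpretation step, with the function name and clock formats as assumptions:

```python
import re

def clock_to_seconds(ocr_text):
    """Parse an OCR'd clock reading (e.g. '07:42' or '01:12:34') into a
    count of seconds, tolerating surrounding text such as 'Q2 07:42'.

    Returns None if no clock-like pattern is found in the text.
    """
    m = re.search(r"(?:(\d{1,2}):)?(\d{1,2}):(\d{2})", ocr_text)
    if not m:
        return None
    h, mnt, s = (int(g) if g else 0 for g in m.groups())
    return h * 3600 + mnt * 60 + s
```

A game clock counts game time rather than wall time, so a real system would additionally map the parsed value onto a real-world time, e.g., using a known period start time.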
In the example shown in
In various embodiments, techniques disclosed herein may be used to provide synchronized consumption of a content feed and/or associated content among a plurality of devices, e.g., in a same physical location.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
Claims
1. A system, comprising:
- a communication interface; and
- a processor coupled to the communication interface and configured to: receive a content feed via the communication interface, the content feed comprising a representation of an event; generate a content stream comprising or derived from the content feed and a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time at the event depicted in the content stream; and provide the content stream and the synchronization signal to a location via the communication interface;
- wherein each of a plurality of devices at the location is configured to use the synchronization signal to render the content stream in a manner that is synchronized with one or more other of the devices.
2. The system of claim 1, wherein the processor is further configured to receive a real-world time signal via the communication interface.
3. The system of claim 1, wherein the synchronization signal comprises a series of tuples, each including an event identifier associated with the event and a real-world time.
4. The system of claim 1, wherein the event comprises a live event and the content stream and synchronization signal are generated and provided as output in real time.
5. The system of claim 1, wherein the event occurred in the past and the content feed comprises prerecorded content depicting the event.
6. The system of claim 1, wherein the content feed includes one or more of a video feed, an audio feed, a sensor output, and position information.
7. The system of claim 1, wherein the processor comprises a plurality of component processors including a first component processor and a second component processor and wherein the content stream is generated by the first component processor and the synchronization signal is generated by the second component processor.
8. The system of claim 7, wherein the first component processor is included in a first physical system and the second component processor is included in a second physical system that is distinct from the first physical system.
9. The system of claim 1, wherein the processor is configured to generate the content stream at a first time and to generate the synchronization signal at a second time.
10. The system of claim 1, wherein the processor is configured to generate the synchronization signal at least in part by receiving a real-world time signal at a same time as the content feed and associating each of a plurality of discrete portions of the content feed with a corresponding real-world time.
11. The system of claim 1, wherein the processor is configured to generate the synchronization signal at least in part by performing optical character recognition processing on visual content comprising a portion of the content feed to determine a real-world time represented by a clock depicted in the visual content.
12. The system of claim 1, wherein a synchronization master at the location is configured to broadcast the synchronization signal locally to one or more other devices at the location.
13. The system of claim 12, wherein the broadcast includes one or more of a network broadcast sent via a local computer network, a wireless network, a Bluetooth™ or other near field transmission, a human-audible audio signal, and an audio signal at a frequency not associated with human hearing.
14. The system of claim 12, wherein the synchronization master is configured to display the content feed and the content feed as displayed includes a dynamic QRC or other dynamic optical code which at any given time encodes a real-world time associated with event content being displayed at that time.
15. The system of claim 14, wherein each of at least a subset of the plurality of devices at the location is configured to scan and decode the dynamic QRC or other dynamic optical code to determine the real-world time associated with event content being displayed at that time and use the determined real-world time to render the content stream in a manner that is synchronized with one or more other of the devices.
16. A method, comprising:
- receiving, at a processor via a communication interface, a content feed comprising a representation of an event;
- generating a content stream comprising or derived from the content feed and a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time at the event depicted in the content stream; and
- providing the content stream and the synchronization signal to a location via the communication interface;
- wherein each of a plurality of devices at the location is configured to use the synchronization signal to render the content stream in a manner that is synchronized with one or more other of the devices.
17. The method of claim 16, wherein the synchronization signal comprises a series of tuples, each including an event identifier associated with the event and a real-world time.
18. The method of claim 16, wherein the event comprises a live event and the content stream and synchronization signal are generated and provided as output in real time.
19. A computer program product embodied in a non-transitory computer readable medium, comprising computer instructions for:
- receiving, at a processor via a communication interface, a content feed comprising a representation of an event;
- generating a content stream comprising or derived from the content feed and a synchronization signal that associates each of a plurality of portions of the content stream with a corresponding real-world time at the event depicted in the content stream; and
- providing the content stream and the synchronization signal to a location via the communication interface;
- wherein each of a plurality of devices at the location is configured to use the synchronization signal to render the content stream in a manner that is synchronized with one or more other of the devices.
20. The computer program product of claim 19, wherein the synchronization signal comprises a series of tuples, each including an event identifier associated with the event and a real-world time.
Type: Application
Filed: Mar 16, 2022
Publication Date: Sep 21, 2023
Patent Grant number: 12225259
Inventors: Michael Naquin (Alamo, CA), Erik Schwartz (Los Altos Hills, CA), Charles D. Ebersol (Atlanta, GA), Anne Gerhart (Atlanta, GA)
Application Number: 17/696,271