System and method for transitioning from decompressing one compressed media stream to decompressing another media stream
A computer system generates a first decompressed media stream based on a first compressed media stream and provides the first decompressed media stream to a presentation device. While providing the first decompressed media stream to the presentation device, the computer system receives a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream. The computer system stores first buffered data corresponding to the first decompressed media stream and receives a second compressed media stream corresponding to the second decompressed media stream. After receiving the second compressed media stream, the computer system provides the first buffered data corresponding to the first decompressed media stream to the presentation device while preparing to generate the second decompressed media stream, and, after preparing to generate the second decompressed media stream, provides the second decompressed media stream to the presentation device.
This application claims priority to U.S. Provisional Patent Application Ser. No. 61/861,330, filed Aug. 1, 2013, entitled “Transitioning from Decompressing One Compressed Media Stream to Decompressing another Media Stream,” which application is incorporated by reference in its entirety.
This application is related to U.S. Provisional Patent Application Ser. No. 61/836,079, filed Jun. 17, 2013, entitled “System and Method for Switching Between Media Streams while Providing a Seamless User Experience;” U.S. Provisional Patent Application Ser. No. 61/881,353, filed Sep. 23, 2013, entitled “System and Method for Efficiently Providing Media and Associated Metadata;” and U.S. Provisional Patent Application Ser. No. 61/892,343, filed Oct. 17, 2013, entitled “System and Method for Switching between Media Items in a Plurality of Sequences of Media Items,” which applications are incorporated by reference in their entireties.
TECHNICAL FIELD
The disclosed implementations relate to decompressing compressed media content generally, and to transitioning between decompressing media streams in particular.
BACKGROUND
As computer technology has improved and become ubiquitous, users increasingly are able to consume media content on or through their computer-based devices. For example, users can listen to audio content or watch video content on a variety of computer-based electronic devices. This media content is sometimes compressed. In addition, advances in media content compression technology have increased the speed and reliability with which information can be transmitted over computer networks. As such, it is possible to stream media data over computer networks as needed, rather than transmitting a file on physical media, such as a CD or DVD, or downloading the entire file before consuming the media content.
SUMMARY
When media content is encrypted and/or compressed, it must be decrypted and/or decompressed before being presented to a user; however, some solutions for streaming media that is transmitted in an encrypted and/or compressed format are cumbersome and involve excessive loading times that interfere with the user experience of streaming the media. This is especially true when transitioning between media streams, where the device switches from decrypting and/or decompressing one media stream to decrypting and/or decompressing another media stream. Some transitions result from user selection of another media stream for presentation. Other transitions occur because the amount of bandwidth currently available at a device receiving a media stream changes; in this case, another media stream with the same media content is requested to better match the currently available bandwidth. For example, if the available bandwidth decreases, a lower quality stream (e.g., a media stream with less fidelity and therefore less data to be transmitted) is requested. If the available bandwidth increases, a higher quality media stream is requested to improve the presentation quality for the user.
The decompression of a media stream is often accomplished through the use of a decoder (e.g., the decoding component of a codec that uses an algorithm for decompressing media streams). When switching between streams of media content, more than one decoder is sometimes necessary (e.g., when the different streams of media content are encoded in different formats). Running multiple decoders can strain the processing and memory resources of an electronic device, resulting in a degraded user experience. In addition, the time needed to load and run a second decoder can also degrade the user experience by causing a media stream (e.g., a video stream) to pause while the second decoder loads. In such circumstances, if a user attempts to scan through several media streams (e.g., channel surfing through television channels), the user will likely experience frequent breaks in presentation of the content as the system loads the appropriate decoder.
Accordingly, there is a need for a method to transition seamlessly from decompressing one compressed media stream to decompressing another media stream, even when more than one decoder is necessary. Such methods and interfaces may complement or replace conventional methods for switching between media stream channels. Such methods and interfaces enhance the user experience, as the user is able to switch between media stream channels without interruption of streaming content. In particular, when switching between two media streams that include the same content at two different bit rates, the transition can be made without interrupting media presentation. In addition, users watching live media streams (e.g., live sporting events) will be able to browse through different channels without undue delay (e.g., without being shown a “now loading” or “waiting to buffer” message and/or an icon indicating that the content is buffering, such as an hourglass, spinning icon, or progress bar).
In accordance with some implementations, a method for seamlessly transitioning from decompressing one compressed media stream to decompressing another media stream is disclosed. The method is performed at a computer system including one or more processors and memory. The computer system generates a first decompressed media stream based on a first compressed media stream and provides the first decompressed media stream to a presentation device. While providing the first decompressed media stream to the presentation device, the computer system: receives a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream, stores first buffered data corresponding to the first decompressed media stream, and receives a second compressed media stream corresponding to the second decompressed media stream. After receiving the second compressed media stream, the computer system provides the first buffered data corresponding to the first decompressed media stream to the presentation device while preparing to generate the second decompressed media stream. After preparing to generate the second decompressed media stream, the computer system provides the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream.
In accordance with some implementations, a method for seamlessly transitioning from decompressing one compressed media stream to decompressing another media stream is disclosed. The method is performed at a computer system including one or more processors and memory. The computer system generates a first decompressed media stream based on a first compressed media stream. The computer system provides the first decompressed media stream to a presentation device. While providing the first decompressed media stream to the presentation device, the computer system receives a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream. In response to receiving the content-transition indication, the computer system selects a first transition point in first content corresponding to the first compressed media stream and selects a second transition point in second content corresponding to the second compressed media stream, wherein: the second transition point is selected in accordance with a bandwidth available for receiving the second compressed media stream; and the first transition point is selected based on the second transition point. The computer system requests the second compressed media stream starting at the second transition point, and generates a second decompressed media stream based on the second compressed media stream that includes content after the second transition point. After generating the second decompressed media stream, the computer system ceases to provide the first content at the first transition point (e.g., to a presentation device) and starts to provide the second content at the second transition point (e.g., to the presentation device).
In accordance with some implementations, a computer system (e.g., a client computer system or server computer system) includes one or more processors, memory, and one or more programs; the one or more programs are stored in the memory and configured to be executed by the one or more processors, and the one or more programs include instructions for performing the operations of the methods described above. In accordance with some implementations, a non-transitory computer readable storage medium has stored therein instructions which, when executed by one or more processors, cause a computer system (e.g., a client computer system or server computer system) to perform the operations of the methods described above.
The implementations disclosed herein are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Like reference numerals refer to corresponding parts throughout the drawings.
Attention is now directed to the figures, and in particular to FIG. 1, which illustrates a client-server environment 100 in accordance with some implementations.
In some implementations, the client environment 108-1 includes a first electronic device 102-1 (e.g., a controlling electronic device). In some implementations, the first electronic device 102-1 is one of the group of: a personal computer, a mobile electronic device, a laptop, a tablet computer, a mobile phone, a digital media player, or any other electronic device able to prepare media content for presentation, control presentation of media content, and/or present media content.
In some implementations, the client environment 108-1 also includes a second electronic device 106-1 (e.g., a controlled electronic device). In some implementations, the second electronic device 106-1 is one of the group of: a computer, an audio/visual system, a media receiver/player, television or similar display device, a discrete graphics processing unit, or any other electronic device able to present media content (e.g., display visual content and/or play audio content). In some implementations, both the first electronic device 102-1 and the second electronic device 106-1 are associated with a common user account (or associated user accounts) provided by a content provider with which the server system 120 is associated. For example, in some implementations, the server system 120 is operated and/or provided by a subscription-based media streaming service to which a user, optionally, has an account, and the first and second electronic devices 102-1, 106-1 are each associated with account credentials that enable them to communicate with and receive content from the server system 120-1. In some implementations, the first electronic device 102-1 (e.g., a personal computer or a set top box) is associated with account credentials and receives content from server system 120-1, and the second electronic device 106-1 is a media presentation device (e.g., a set of speakers, a television, etc.) that receives the content from the first electronic device 102-1 and presents that content to the user. In some embodiments, the second electronic device 106-1 is integrated with the first electronic device 102-1 (e.g., the second electronic device is a display or monitor connected to, or integrated with, an electronic device such as a laptop computer, a desktop computer, a smart phone, or a tablet computer).
In some implementations, both the first electronic device 102-1 and the second electronic device 106-1 are on the same local network. In some implementations, the first electronic device 102-1 and the second electronic device 106-1 are on different local area networks. In some implementations, the local network is a Local Area Network. In some implementations, the server system 120 is not located on the same Local Area Network as either the first electronic device 102-1 or the second electronic device 106-1.
As noted above, Local Area Networks are often associated with a relatively small geographic area (e.g., one house or building or a corporate or academic campus) and create IP address space behind a traffic routing device. In some implementations, the traffic routing device uses network address translation to connect devices within the LAN to devices outside the LAN. Network address translation is the process of readdressing packets by modifying the IP address contained in each packet so that the packets reach their intended destination device. Network address translation allows a plurality of electronic devices on the Local Area Network to connect to the Internet through a single IP address (e.g., the IP address of the traffic routing device). The traffic routing device (i.e., a router) receives incoming packets, determines which device on the Local Area Network is the intended recipient, and modifies the IP address to identify that device. For example, a Local Area Network has 5 devices with local IP addresses 192.168.0.1-192.168.0.5 that all connect to a router. The router connects to the Internet and has an IP address of 12.162.29.2. Using network address translation, the router translates the source address for all communications sent from any of the 5 devices and intended for destinations in the Internet to be 12.162.29.2 (the router's IP address). On the other hand, the router collects all packets incoming from the Internet, determines the intended recipient device based upon the contents of each packet, and translates the destination IP address to the address of the correct device on the Local Area Network. So when the intended device is the device with Local Area Network IP address 192.168.0.2, the router would change the destination address to this address. Local Area Networks also commonly use firewalls to limit incoming connections. In this way, computer devices outside of the Local Area Network are generally not able to communicate directly with the devices on a Local Area Network. Indeed, in some Local Area Networks the devices in the network are not contactable even by other devices in the Local Area Network.
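By way of illustration only, the following minimal Python sketch models the translation bookkeeping described above. The class, method names, ports, and the destination address are illustrative assumptions and do not appear in the figures.

```python
# Minimal sketch of the network address translation described above.
# All names, ports, and the outside address are illustrative.

ROUTER_PUBLIC_IP = "12.162.29.2"

class NatRouter:
    """Rewrites source addresses on outbound packets and destination
    addresses on inbound packets."""

    def __init__(self, public_ip):
        self.public_ip = public_ip
        self.table = {}        # public_port -> (local_ip, local_port)
        self.next_port = 40000

    def outbound(self, packet):
        # Replace the private source address with the router's public
        # address and remember which LAN device the flow belongs to.
        public_port = self.next_port
        self.next_port += 1
        self.table[public_port] = (packet["src_ip"], packet["src_port"])
        packet["src_ip"], packet["src_port"] = self.public_ip, public_port
        return packet

    def inbound(self, packet):
        # Restore the private destination address for a known flow;
        # unsolicited flows are dropped, which is why outside hosts
        # generally cannot initiate connections to LAN devices.
        key = packet["dst_port"]
        if key not in self.table:
            return None  # firewall-like behavior
        packet["dst_ip"], packet["dst_port"] = self.table[key]
        return packet

router = NatRouter(ROUTER_PUBLIC_IP)
out = router.outbound({"src_ip": "192.168.0.2", "src_port": 5000,
                       "dst_ip": "93.184.216.34", "dst_port": 80})
back = router.inbound({"src_ip": "93.184.216.34", "src_port": 80,
                       "dst_ip": ROUTER_PUBLIC_IP, "dst_port": out["src_port"]})
print(back)  # destination rewritten back to 192.168.0.2:5000
```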
In some implementations, both the first electronic device 102-1 and the second electronic device 106-1 are on the same Personal Area Network. In some implementations, the Personal Area Network uses BLUETOOTH communication technology. In some implementations, the server system 120 is not located on the same Personal Area Network as either the first electronic device 102-1 or the second electronic device 106-1.
In some implementations, the first electronic device 102-1 includes a media content presentation and control application (presentation module 220 in FIG. 2).
In some implementations, the data streamed from the server system 120 is stored/cached by the first electronic device 102-1 in a media content buffer (e.g., compressed buffer 136 in FIG. 1).
In some implementations, the media stream data from the server system 120 (or another content source 121 such as a content delivery system or a peer in a peer-to-peer network) is compressed and/or encrypted prior to being transmitted to the first electronic device 102 to reduce the bandwidth needed to transmit the media stream. Once the media stream data has been received by the first electronic device 102-1, the media stream data needs to be decompressed and/or decrypted before it can be presented at a media presentation device. In some implementations, the first electronic device 102-1 includes one or more decrypters 132, one or more demultiplexers 134 (sometimes called demuxers), a compressed data buffer 136, one or more decoders 138, and a decompressed buffer 140. In some implementations, the one or more decrypters 132, the one or more demultiplexers 134, and the one or more decoders 138 are implemented as software modules that are executed on a multi-purpose computer processor. In some implementations, the one or more decrypters 132, the one or more demultiplexers 134, and the one or more decoders 138 are implemented on specialized computer hardware.
In some implementations, the second electronic device 106-1 includes a renderer 142 and an output device 144. In some implementations, some components, such as the one or more decoders 138 and the decompressed buffer 140, can be located in either the first electronic device 102-1 or the second electronic device 106-1. In some implementations, the first electronic device 102-1 transmits the media stream data to the second electronic device 106-1.
In some implementations, the first electronic device 102-1 receives the media stream data from the server system 120-1 (or another content source 121 such as a content delivery system or a peer in a peer-to-peer network) and passes the media stream data to at least one of the one or more decrypters 132. In some implementations, the decrypter 132 receives encrypted data and decrypts it to produce decrypted data. Any appropriate encryption algorithm can be used including, but not limited to, symmetric key encryption algorithms or public-key encryption algorithms. The decrypted data is then passed from the decrypters 132 to one or more demultiplexers 134.
In some implementations, the demultiplexer 134 (also known as a file splitter) separates a media stream that includes multiple interleaved component streams into the individual component streams. For example, a media stream may include a video stream, an audio stream, subtitles, and other data that have all been interleaved into a single combined media stream. The one or more demultiplexers 134 separate the combined media stream into its constituent parts for separate processing. The one or more demultiplexers 134 output compressed data that is then stored in the compressed data buffer 136.
In some implementations, compressed data stored in the compressed buffer is sent to at least one of the one or more decoders 138. In some implementations, the decoder 138 is the decoding component of a codec (e.g., coder-decoder) that is either implemented by a software module or a specialized piece of computer hardware. In some implementations, a decoder 138 takes encoded information and uses a predefined algorithm to convert the encoded data to decoded, decompressed data. In some implementations, the decoder 138 is any appropriate decoder, including either lossless or lossy decoders. For example, H.264, VP9, DivX, and Xvid are all examples of codecs that can be used to encode (e.g., compress) and decode (e.g., decompress) information.
In some implementations, the decrypted, decompressed media stream data is stored in a decompressed buffer 140. In some implementations, the decompressed buffer 140 is fixed in size. In other implementations, the decompressed buffer 140 is dynamically resized to accommodate different amounts of decompressed data (e.g., the decompressed buffer 140 is, optionally, expanded in size as the amount of decompressed data that needs to be stored increases and is, optionally, reduced in size as the amount of data that needs to be stored decreases). In some implementations, the decompressed data in the decompressed buffer 140 is stored as one or more frames, where each frame includes all the data needed to be displayed on a given display at one time. For example, the decompressed buffer would, in a steady state, store approximately 4 decompressed frames of data (e.g., enough data to display four subsequent frames). In some implementations, when switching between providing a first media stream and a second media stream, the decompressed buffer is increased to store up to 24 frames of decompressed media stream data. In some implementations, the decompressed buffer 140 stores ten to fifteen seconds of video for a first stream while preparing to begin providing a second stream for display.
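The buffer-resizing behavior described above can be sketched as follows. This is a minimal illustration, assuming a simple frame queue; the class and method names are hypothetical rather than taken from the figures, while the four-frame steady state and 24-frame transition capacity come from the examples above.

```python
# Sketch of a decompressed frame buffer that holds ~4 frames in steady
# state and is temporarily expanded to 24 frames during a transition.
from collections import deque

class DecompressedBuffer:
    STEADY_STATE_FRAMES = 4
    TRANSITION_FRAMES = 24

    def __init__(self):
        self.frames = deque()
        self.capacity = self.STEADY_STATE_FRAMES

    def begin_transition(self):
        # Grow the buffer so playback can continue from buffered frames
        # while the next stream's decoder is being prepared.
        self.capacity = self.TRANSITION_FRAMES

    def end_transition(self):
        # Shrink back once the new stream is being decoded normally.
        self.capacity = self.STEADY_STATE_FRAMES
        while len(self.frames) > self.capacity:
            self.frames.popleft()

    def push(self, frame):
        if len(self.frames) >= self.capacity:
            raise BufferError("decompressed buffer full")
        self.frames.append(frame)

    def pop_for_display(self):
        return self.frames.popleft() if self.frames else None

buf = DecompressedBuffer()
buf.begin_transition()
for i in range(10):
    buf.push(f"frame-{i}")       # more than 4 frames fit during a switch
print(buf.pop_for_display())     # frame-0
```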
In some implementations, the renderer 142 prepares the decompressed data stored in the decompressed buffer 140 to be presented on an output device 144. In some implementations, the renderer 142 stores decompressed stream data in a display buffer or frame buffer prior to sending the data to a presentation device. In some implementations, the output device 144 then transfers the rendered data to a device for presenting the media data. In some implementations, the output device 144 itself is a display or audio presentation device and presents the media data directly (e.g., a display or speakers).
For example, the first electronic device 102-1 receives an encrypted/multiplexed/compressed video stream, decrypts the video stream with decrypters 132, demultiplexes the video stream with demultiplexers 134, and stores the decrypted/demultiplexed video stream in compressed buffer 136. The video stream is then decompressed by a decoder 138 at the first electronic device 102-1 or the second electronic device 106-1, and the decrypted/demultiplexed/decompressed video stream is stored in a decompressed buffer 140 at the first electronic device 102-1 or the second electronic device 106-1. The decrypted/demultiplexed/decompressed video stream is provided to a renderer 142 at the second electronic device 106-1 that renders content corresponding to the decrypted/demultiplexed/decompressed video stream, and the rendered content is presented to the user using output device 144 (e.g., the video stream is displayed on a display such as a touch screen, computer monitor, or television).
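The example chain above can be summarized in a short sketch. The stage functions below are illustrative stubs standing in for the decrypter 132, demultiplexer 134, compressed buffer 136, decoder 138, decompressed buffer 140, and renderer 142; a real implementation would use actual cipher and codec libraries rather than these toy bodies.

```python
# High-level sketch of the decrypt -> demultiplex -> buffer -> decode ->
# buffer -> render chain described above. All stage bodies are stubs.

def decrypt(encrypted_chunk, key):
    # Stand-in for decrypter 132 (a real system would use, e.g., AES).
    return bytes(b ^ key for b in encrypted_chunk)

def demultiplex(container_chunk):
    # Stand-in for demultiplexer 134: split the interleaved container
    # into constituent streams (here, trivially tagged halves).
    half = len(container_chunk) // 2
    return {"video": container_chunk[:half], "audio": container_chunk[half:]}

def decode(compressed):
    # Stand-in for decoder 138: decompression modeled as an identity.
    return compressed

compressed_buffer, decompressed_buffer = [], []   # buffers 136 and 140

def process_chunk(encrypted_chunk, key=0x5A):
    streams = demultiplex(decrypt(encrypted_chunk, key))
    compressed_buffer.append(streams)             # compressed buffer 136
    decompressed_buffer.append(decode(streams["video"]))  # buffer 140

def render():
    # Stand-in for renderer 142 feeding output device 144.
    if decompressed_buffer:
        print("presenting frame:", decompressed_buffer.pop(0))

process_chunk(b"\x10\x20\x30\x40")
render()
```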
The client-server environment 100 also includes a server system 120. In some implementations, the server system 120 includes a media streaming module 122, a media content database 124, and a context database 126. The media content database 124 stores media content that can be presented by an electronic device. For example, in some implementations, the media content database 124 stores audio (e.g., music, audio books, podcasts, etc.), video (e.g., movies, television shows, etc.), images, or other content that can be streamed to other electronic devices. In some implementations, the media content database includes data stored in different formats and file types to allow a variety of different devices and/or applications to receive streamed content under a variety of different network conditions (e.g., varying available network bandwidth). In some implementations, the data is stored in a single file format and is converted/transcoded to the appropriate data type or format before or as it is streamed to a remote device.
In some implementations, the server system 120 includes a media streaming module 122. In some implementations, the media streaming module 122 receives media control requests from electronic devices and streams media content in response. In some implementations, the media streaming module 122 receives media control requests from a first electronic device 102-1. For example, a user sends a media control request to the server using a mobile phone (a first electronic device) requesting that a particular media stream be presented. The server system 120 then sends the requested media stream to the first electronic device 102-1. In some implementations, instead of sending the requested media stream to the first electronic device 102-1, the server system 120 enables the first electronic device 102-1 to obtain the media stream from other content sources 121 (e.g., media servers, content delivery networks or peers in a peer-to-peer network) by sending instructions to the other content sources 121 to deliver the media stream to the first electronic device 102-1 or by sending the first electronic device 102-1 information that enables the first electronic device 102-1 to request the media stream from the other content sources 121 (e.g., identifiers such as IP addresses for other content sources that are available to provide at least a portion of the media stream to the first electronic device 102-1).
In some implementations, the received media control request includes information identifying the stream requested by the first electronic device 102-1 (e.g., information identifying the content requested and/or information identifying the file format processing or other capabilities of the first electronic device 102-1). Accordingly, the media control request identifies, for example, that the request is intended for a particular presentation device (e.g., a home stereo), so that the server system 120 can send the requested media and/or the media control request to that device. In some implementations, the electronic device provides the server system with an indication of device capabilities of the device such as screen resolution, processing speed, video buffer size/availability, codec availability, and the like, and the server system provides content to the electronic device in accordance with the device capabilities.
In some implementations, the server system 120 includes a context database 126. The context database 126 stores data associated with the presentation of media content by an electronic device. In some implementations, the context database 126 includes, among other things, the current position in a media stream that is being presented by an electronic device, a playlist associated with the media stream, previously played content, skipped pieces of media content, and previously indicated user preferences. For example, the context database, optionally, includes information indicating that an electronic device is currently presenting a song, 1 minute and 23 seconds into the song, as well as all the songs played in the last hour and the next 20 songs in the playlist. In some implementations, the server system 120 transmits the context associated with a media stream to the device that is presenting the media stream so that one or more items of context information can be used by the device, such as for display to the user. In some implementations, when the device to which the media content is being streamed changes, the server system 120 (or another content source 121 such as a content delivery system or a peer in a peer-to-peer network) transmits the context associated with the active media content to the newly active device.
In some implementations, Memory 212 of the electronic device 102 stores the following programs, modules, and data structures, or a subset or superset thereof:
- an operating system 216 that includes procedures for handling various basic system services and for performing hardware dependent tasks;
- a network communication module 218 that is used for connecting the electronic device 102 to other computers via the one or more communication network interfaces 210 (wired or wireless) and one or more communication networks, such as the Internet, other Wide Area Networks, Local Area Networks, Personal Area Networks, metropolitan area networks, VPNs, local peer-to-peer, ad-hoc connections, and so on;
- a presentation module 220 for enabling presentation of media content at the electronic device 102 through the output devices 206 (e.g., screens, speakers, headphones, etc.) associated with the user interface 204 or second electronic device 106-2;
- one or more client system applications module(s) 222 for enabling the electronic device 102 to perform the functions offered by the client system 102, the application module(s) 222 including but not limited to:
- an input processing module 224 for receiving input from a user through an input device 208 and interpreting the received input;
- a media request generation module 226 for generating a request for media stream data based on input received from the input processing module 224;
- a media stream reception module 228 for receiving one or more media streams from a media streaming server, remote from the electronic device 102;
- a decrypter 132 for using an algorithm to convert encrypted data received from a server system 120 to decrypted data;
- a demultiplexer 134 for separating a media stream with two or more distinct constituent streams interleaved within it into a plurality of constituent streams;
- a decoder 138 for decompressing encoded data;
- a renderer 142 for preparing decrypted, demultiplexed, and decompressed data to be presented on an output device (e.g., output device 206); and
- a bandwidth allocation module 230 for allocating bandwidth between a first media stream and a second media stream based on predetermined priorities;
- an electronic device data module 250 for storing data, including but not limited to:
- compressed media data buffer(s) 136 including media stream data, received from a content source (e.g., server system 120 or other content source 121 as shown in FIG. 1) for one or more media streams, that has been demultiplexed and decrypted and stored in the memory of the electronic device 102;
- decompressed buffer 140 including media stream data received from a content source (e.g., server system 120 or other content source 121, as shown in FIG. 1) that has been decrypted, demultiplexed, and decompressed;
- media content database 245 for storing, on the local device, media content that is part of the user's personal library of media content; and
- user profile database 246 for storing account information for a user of the electronic device 102 including user media history, user preferences, determined user interests, and account credentials.
Each of the above identified elements is, optionally, stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various implementations. In some implementations, Memory 212 optionally stores a subset of the modules and data structures identified above. Furthermore, Memory 212 optionally stores additional modules and data structures not described above.
In some implementations, Memory 306 of the server system 120 stores the following programs, modules, and data structures, or a subset or superset thereof:
- an operating system 310 that includes procedures for handling various basic system services and for performing hardware dependent tasks;
- a network communication module 312 that is used for connecting the server system 120 to other computers via the one or more communication network interfaces 304 (wired or wireless) and one or more communication networks, such as the Internet, other Wide Area Networks, Local Area Networks, Personal Area Networks, metropolitan area networks, VPNs, and so on;
- one or more server application module(s) 314 for enabling the server system 120 to perform the functions offered by the server system 120, including but not limited to:
- a streaming module 122 for streaming a media stream to an electronic device (e.g., first and second electronic devices 102-n, 106-n, FIG. 1) remote from the server system 120;
- a request processing module 316 for receiving requests from electronic devices (e.g., first and/or second electronic devices 102, 106, FIG. 1), where the requests include requests to stream specific media content to the electronic devices and/or requests to change the destination of the media content stream from one electronic device to another electronic device;
- a context tracking module 318 for tracking and storing the context of a media stream, optionally including storing, among other data, one or more of the current playback position in a media stream that is currently being presented by an electronic device (e.g., first and/or second electronic devices 102, 106, FIG. 1), the position in a current playlist, the play history of a user, the preferences of a user, previously skipped media content, whether media content items were “liked” or “disliked” (e.g., via “starred,” “thumbs-up,” and/or “thumbs-down” indications), and the like; and
- a context access module 320 for allowing electronic devices (e.g., first and/or second electronic devices 102-n, 106-n, FIG. 1) associated with a given user account to access the current context for media streams associated with the given user account; and
- one or more server data module(s) 330 for storing data related to the server system 120, including but not limited to:
- media content database 124 including media content and metadata describing the media content and enabling clients to search through the media content to identify media content;
- a context database 126 including information associated with one or more media streams where context information optionally includes one or more of the current playback position in a media stream, metadata relating to the media, a position in a playlist, play history of a user, user preferences, skipped media, and user settings; and
- a user profile database 332 including account information for a plurality of users, where the account information for a respective user optionally includes a user media content request/playback history for the respective user, a list of electronic devices associated with the respective user, user preferences of the respective user, and determined user interests of the respective user.
Each of the above identified elements is, optionally, stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various implementations. In some implementations, Memory 306 optionally stores a subset of the modules and data structures identified above. Furthermore, Memory 306 optionally stores additional modules and data structures not described above.
Although FIG. 3 shows a server system 120, FIG. 3 is intended more as a functional description of the various features that may be present in a set of servers than as a structural schematic of the implementations described herein. In practice, items shown separately could be combined and some items could be separated.
In some implementations, the first media stream data processing pipeline (e.g., pipeline A) includes a data source A 402-A (e.g., server system 120 or other content source 121), a decrypter A 404-A (e.g., decrypter 132), a demultiplexer A 406-A (e.g., demultiplexer 134), a compressed data buffer A 408-A (e.g., compressed buffer 136), a decoder A 410-A (e.g., decoder 138), a decompressed buffer 412 (e.g., decompressed buffer 140), a renderer 414 (e.g., renderer 142), and an output device 416 (e.g., output device 144 in FIG. 1).
In some implementations, the data source 402 streams media data to an electronic device (e.g., device 102 as shown in FIG. 1).
In some implementations, once the data has been decrypted and demultiplexed, the compressed data is stored in a compressed data buffer 408. In some implementations, the decrypter 404, the demultiplexer 406, and the compressed data buffer are all implemented in software. In some implementations, the compressed data stored in the compressed data buffer 408 is decompressed by the decoder 410. In some implementations, decompressing includes restoring the data to its original, uncompressed size. This decompressed data is then stored in the decompressed buffer 412. In some implementations, the decoder 410 and/or the decompressed buffer 412 are implemented as software modules and databases (e.g., a software decoder uses resources from a CPU or GPU to decode the compressed media stream). In other implementations, the decoder 410 and/or the decompressed buffer 412 are implemented in hardware. For example, a graphics processing unit (GPU) optionally contains hardware dedicated to decompressing media streams quickly and efficiently.
In some implementations, the decompressed data is delivered to the renderer 414 (e.g., image/video rendering circuitry in a GPU and/or digital or analog audio output circuitry). The renderer 414 maps the decompressed data to the presentation device, such that the data will be presented properly. For example, the renderer 414 fills a frame buffer with values for each pixel on a display for each frame to be displayed. In some implementations, the output device then sends the data to a presentation device. In some implementations, the output device 416 is a presentation device such as a display or a stereo or other audio presentation device. In situations where the media stream includes visual and audio data, the visual data is routed to a display device and the audio data is routed to an audio device.
In some implementations, when a second media stream is selected, the electronic device (e.g., device 102 as shown in FIG. 1) establishes a second media stream data processing pipeline (e.g., pipeline B) for receiving and decompressing the second media stream while content from the first media stream continues to be provided for presentation.
Attention is now directed towards a method 500 of transitioning from decompressing one compressed media stream to decompressing another media stream, in accordance with some implementations.
As described below, the method 500 provides a fast and efficient way to seamlessly transition between media streams. The method reduces the likelihood of interruption or unnecessary delay of the viewing experience when transitioning from a first media stream to a second media stream, thereby creating a better user experience. Furthermore, the method ensures that the transition occurs efficiently, minimizing the power needed and the strain placed on the computing resources of the computer system (e.g., device 102 as shown in FIG. 1).
In some implementations, the computer system (e.g., device 102 as shown in FIG. 1) generates a first decompressed media stream based on a first compressed media stream.
In some implementations, while generating the first decompressed media stream based on the first compressed media stream, the computer system receives (504) the first compressed media stream (e.g., data moving from the first data source 402-A to the decrypter 404-A as shown in FIG. 4).
In some implementations, the computer system provides (514) the first decompressed media stream to a presentation device (e.g., a device with a display and/or speakers, such as a stand-alone display and/or speaker system, for example output device 144 in FIG. 1).
In some implementations, receiving the content-transition indication includes detecting (518) a decrease in the bandwidth available for receiving media streams at the computer system (e.g., the computer system monitors the bandwidth currently available and, if the bandwidth available for receiving media streams drops below a predetermined level or drops by a predetermined amount, a content-transition indication is received), and the computer system transmits (519), to a media source, a respective request for a lower bandwidth version of content corresponding to the first decompressed media stream (e.g., the same source content but with a lower bit rate). In some implementations, the second compressed media stream is received (520) in response to the respective request. In some implementations, the computer system ceases to receive the first compressed media stream when the computer system begins receiving data from the second compressed media stream.
In some implementations, receiving the content-transition indication includes detecting (522) an increase in the bandwidth available for receiving media streams at the computer system, and the computer system transmits (524), to a media source, a respective request for a higher bandwidth version of content corresponding to the first decompressed media stream (e.g., the same source content but with a higher bit rate). In some implementations, the second compressed media stream is received (525) in response to the respective request.
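A minimal sketch of how such bandwidth-triggered content-transition indications might be generated follows. The watermark thresholds and the request callback are illustrative assumptions, not values specified in this disclosure.

```python
# Sketch of bandwidth monitoring that triggers a request for a lower- or
# higher-bandwidth version of the same content. Thresholds illustrative.

LOW_WATERMARK = 0.8   # switch down below 80% of the current stream's bit rate
HIGH_WATERMARK = 2.0  # switch up above 200% of the current stream's bit rate

def check_bandwidth(available_bps, current_stream_bps, request_stream):
    """Return a content-transition indication ('lower'/'higher'/None) and
    issue the corresponding request for the same content at another bit rate."""
    if available_bps < LOW_WATERMARK * current_stream_bps:
        request_stream(bitrate="lower")
        return "lower"
    if available_bps > HIGH_WATERMARK * current_stream_bps:
        request_stream(bitrate="higher")
        return "higher"
    return None  # available bandwidth still matches the current stream

# Example: measured bandwidth drops below the current stream's bit rate.
indication = check_bandwidth(
    1_000_000, 2_000_000,
    lambda bitrate: print("requesting", bitrate, "bandwidth version"))
print(indication)  # 'lower'
```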
In some implementations, in response (526) to receiving the content-transition indication, the computer system selects a first transition point in first content corresponding to the first compressed media stream and selects a second transition point in second content corresponding to the second compressed media stream. In some implementations, the second transition point is selected (530) in accordance with the bandwidth available for receiving the second compressed media stream, and the first transition point is selected based on the second transition point. In some implementations, the second transition point is also selected in accordance with one or more of: a target amount of buffered compressed media, transmission latency, content bit rate, and/or key frame location.
In some implementations, the second transition point is selected so as to ensure that enough of the second media stream has been buffered to ensure smooth streaming of the second media stream. In some implementations, the computer system determines (531) a respective quantity of data corresponding to the second compressed media stream for downloading as buffered media (e.g., prior to starting to provide the second decompressed media stream to the presentation device) based on: a target play time of the buffered media and a bit rate of the second compressed media stream. After the respective quantity of data has been determined, the computer system determines (532) a predicted buffering time that it will take to buffer the second compressed media stream based on: the respective quantity of data and the bandwidth available for receiving the second compressed media stream. The computer system also identifies (533) a plurality of candidate transition points in the second compressed media stream. After determining the predicted buffering time and identifying the candidate transition points, the computer system selects (534) the second transition point from the plurality of candidate transition points based on the predicted buffering time that it will take to buffer the second compressed media stream. In some implementations, the second transition point is a first key frame in the content corresponding to the second compressed media stream that corresponds to a timestamp that is greater than or equal to a current time plus the predicted buffering time.
An example of logic for selecting the second transition point includes: for a target play time of buffered media (e.g., four seconds of media) and a bit rate of the second compressed media stream (e.g., 2 MB/s), the computer system determines a respective quantity of data to be downloaded for the second stream of media (e.g., if the bit rate is 2 MB/s and the target play time of buffered compressed media is four seconds, then the respective quantity of data to be downloaded is 8 MB). Continuing with this example, after determining the respective quantity of data to be downloaded, the device determines a predicted time to download the respective quantity of data based on the respective quantity and the bandwidth available for receiving the second media stream, plus a safety margin (e.g., if the respective quantity is 8 MB, the available bandwidth is 8 MB/s, and the safety margin is 1 second, then the predicted time is 2 seconds = (8 MB/8 MB/s) + 1 second; similarly, if the respective quantity is 8 MB, the available bandwidth is 0.5 MB/s, and the safety margin is 1 second, then the predicted time is 17 seconds = (8 MB/0.5 MB/s) + 1 second).
In some implementations, the safety margin is a multiple of the latency of communications between the device and the source of the second compressed media stream (e.g., two times the latency, where a common latency is a time between 100 to 900 milliseconds). In some implementations, the predicted time has a minimum value of two seconds (e.g., the second transition point is selected as a transition point that is at least two seconds in the future). In some implementations, the predicted time has a maximum value of twenty seconds (e.g., if the device is on a very slow connection that cannot download the respective quantity of the second media stream within 20 seconds, then the device still selects a transition point within twenty seconds and accepts the possibility that the target play time of buffered media for the second compressed media stream will not be stored before starting to provide content corresponding to the second compressed media stream to the user).
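The arithmetic in the preceding two paragraphs can be expressed compactly as follows. This sketch assumes the two-second floor and twenty-second ceiling described above and a safety margin of twice the measured latency; the function names are illustrative.

```python
# Sketch of transition-point selection: predicted buffering time from the
# target play time, stream bit rate, available bandwidth, and a safety
# margin, clamped to a 2-20 second window, then the first candidate key
# frame at or after the predicted time.

def predicted_buffering_time(target_play_time_s, stream_bitrate_Bps,
                             bandwidth_Bps, latency_s):
    quantity_bytes = target_play_time_s * stream_bitrate_Bps  # e.g., 4 s * 2 MB/s = 8 MB
    safety_margin_s = 2 * latency_s                           # twice the measured latency
    predicted = quantity_bytes / bandwidth_Bps + safety_margin_s
    return min(max(predicted, 2.0), 20.0)                     # 2 s floor, 20 s ceiling

def select_transition_point(now_s, candidate_keyframes_s, predicted_s):
    # First key frame whose timestamp is >= current time + predicted time.
    earliest = now_s + predicted_s
    for ts in sorted(candidate_keyframes_s):
        if ts >= earliest:
            return ts
    return None  # no suitable key frame among the advertised candidates

MB = 1_000_000
predicted = predicted_buffering_time(4, 2 * MB, 8 * MB, 0.5)
print(predicted)  # 2.0 s: (8 MB / 8 MB/s) + 1 s, already at the floor
print(select_transition_point(100.0, [101.0, 103.0, 104.5], predicted))  # 103.0
```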
In some implementations, the computer system selects a key frame as the transition point. A key frame (also known as an intra frame) in at least some video compression formats is a frame that contains complete image data in the stream. In order to reduce the amount of data that must be transferred, only changes that occur from one frame to the next are transmitted between key frames. For example, the computer system receives a key frame, and then only receives data change information for a plurality of subsequent frames. After a plurality of subsequent frames, the computer system then receives a subsequent key frame. In some implementations, data in media data streams is grouped into clusters by key frames. In some implementations, clusters are then grouped into segments that include about 10 MB of data.
In some implementations, while performing on-demand encoding, the key frames are found first in the primary stream. Then corresponding key frames are inserted into all of the supporting bit rates at the same time code, which requires some additional processing time, especially as the number N of supported bit rates grows large. In some implementations, each stream includes a video header, such as an HTTP Live Streaming (HLS) header, that lists all the HLS segments in the media data stream. In some implementations, HLS segments are based on time. In other implementations, the HLS segments are based on size. In some implementations, a segment header lists out all segments, SHA-1 hashes, and file sizes. In some implementations, the first segment includes a segment header in addition to the normal 10 MB of data and thus is over 10 MB in total, and the last segment is smaller than 10 MB. In some implementations, segments are designed to be variable in size based on cluster boundaries. In some implementations, the computer system requests a live stream by accessing the same URL every few seconds and retrieves new HLS segments via playlists. In some implementations, the server system (e.g., server 120 in FIG. 1) updates these playlists as new segments of the live stream become available.
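The live-stream polling described above might be sketched as follows; the fetch callbacks are illustrative stubs rather than a real HLS client, and the polling interval is an assumed example value.

```python
# Sketch of live-stream polling: re-fetch the same playlist URL every few
# seconds and download only the HLS segments not yet seen.
import time

def poll_live_playlist(fetch_playlist, fetch_segment, poll_interval_s=4, rounds=3):
    seen = set()
    for _ in range(rounds):
        for segment_uri in fetch_playlist():   # same URL each time
            if segment_uri not in seen:
                seen.add(segment_uri)
                fetch_segment(segment_uri)     # retrieve only new segments
        time.sleep(poll_interval_s)

# Example with canned playlists standing in for a live server.
playlists = iter([["seg1.ts", "seg2.ts"],
                  ["seg2.ts", "seg3.ts"],
                  ["seg3.ts", "seg4.ts"]])
poll_live_playlist(lambda: next(playlists),
                   lambda uri: print("downloading", uri),
                   poll_interval_s=0)
```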
In some implementations, key frames occur at fixed intervals within a stream (e.g., every three seconds). In other implementations, key frames are dynamically chosen based on stream content (e.g., at scene changes). In some implementations, key frames are spaced between one and four seconds apart based on scene changes (e.g., if a scene change happens within one to four seconds after the last key frame, the next key frame is placed at or near the scene change; if no scene change occurs in this time span, a key frame is forced at or near four seconds after the last key frame). Live television optionally uses fixed key frame intervals. In some implementations, switching between live streams will not be as seamless as switching between on-demand streams: because fixed key frame intervals are used for live streams, the live streams will, in some circumstances, not have corresponding key frames that can be matched up, and so some noticeable visual jumps may occur at switch boundaries. For on-demand streams (e.g., two streams of the same content with different bit rates), by contrast, matching key frames can be identified and the content can be switched over at the matching key frames to provide a seamless transition between the content at a first bit rate and the content at a second, different, bit rate.
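A sketch of the scene-change-based key frame placement described above follows; the scheduling loop and its parameters are illustrative, using the one-to-four-second window from the text.

```python
# Sketch of key frame scheduling: place a key frame at a scene change that
# falls 1-4 seconds after the previous key frame, else force one at 4 s.

MIN_GAP_S, MAX_GAP_S = 1.0, 4.0

def schedule_keyframes(duration_s, scene_changes_s):
    keyframes = [0.0]                        # streams start with a key frame
    scene_changes = sorted(scene_changes_s)
    while keyframes[-1] + MAX_GAP_S < duration_s:
        last = keyframes[-1]
        window = [t for t in scene_changes
                  if last + MIN_GAP_S <= t <= last + MAX_GAP_S]
        # Prefer the first scene change in the window; otherwise force one.
        keyframes.append(window[0] if window else last + MAX_GAP_S)
    return keyframes

print(schedule_keyframes(15.0, [2.5, 7.2, 7.9, 13.0]))
# [0.0, 2.5, 6.5, 7.9, 11.9]
```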
In some implementations, segment size is four seconds for live content, so that only a four-second latency will occur for accessing any segment on a client. Each segment starts with a key frame and contains exactly one cluster. In some implementations, live content uses fixed key frame intervals, as compared with on-demand content, which uses variable key frame intervals based on scene changes. This is done in order to reduce latency for live encoding (e.g., by about one second).
In some implementations, the computer system selects the first key frame that occurs in the second media stream after the predicted download time as the transition point. For example, if the predicted download time is ten seconds and a key frame occurs every four seconds, the computer system would select the third key frame. In some implementations, there are many potential key frames within the switching interval, especially on slow connections, but, in some circumstances, there will be only one or perhaps two candidate key frames. In some implementations, the transition point is selected based on the download speed (e.g., the rate at which the content is downloading based on the available bandwidth) and the position of the nearest scene change, such that switches happen at the point of minimal visual disruption for both fast and slow connections. In some implementations, once the computer system selects the transition point (e.g., key frame) for the transition, the decompressed buffer size is increased to 24 frames and the buffer starts to be filled. In some implementations, at the transition point fewer than the target of 24 frames (e.g., only N frames) are stored in the buffer (a best-effort objective), in which case a switch can still occur. In some implementations, the buffer size is 3 Mbytes for one frame of 1920×1080 with 12 bits per pixel (8 bits for luma and four bits for chroma per pixel), so adding 24 extra frames would increase the buffer size to 72 Mbytes of memory allocation (e.g., the buffer size would be increased, as shown in stage 2 in FIG. 4).
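The memory arithmetic above can be checked with a few lines; the helper name is illustrative.

```python
# Checking the memory arithmetic above: a 1920x1080 frame at 12 bits per
# pixel (8 bits luma + 4 bits chroma per pixel, i.e., 4:2:0 subsampling)
# is 3,110,400 bytes, which the text rounds to 3 Mbytes; 24 such frames
# need about 74.6 MB, or 72 Mbytes using the rounded 3 Mbytes per frame.

def frame_bytes(width, height, bits_per_pixel=12):
    return width * height * bits_per_pixel // 8

per_frame = frame_bytes(1920, 1080)
print(per_frame)        # 3110400 bytes (~3 Mbytes per frame)
print(24 * per_frame)   # 74649600 bytes (~72-75 Mbytes for 24 frames)
```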
In some implementations, prior to receiving the content-transition indication, the computer system receives (535) the first compressed media stream (e.g., as shown in FIG. 4).
In some implementations, the computer system, after receiving the content-transition indication and prior to ceasing to generate the first decompressed media stream, stores (538) sufficient decompressed media content in the first buffered data to continue providing the first decompressed media stream to the presentation device (from a current playback position) until the first transition point is reached (e.g., a size of the decompressed buffer is temporarily increased while the device is switching from using a first codec to using a second codec, as shown in FIG. 4).
In some implementations, the first content includes video content that is divided into a first plurality of groups of frames, the second content includes video content that is divided into a second plurality of groups of frames, the first transition point in the first content is selected so as to occur at a boundary of a first group of frames in the first plurality of groups of frames (e.g., immediately prior to a next key frame), and the second transition point in the second content is selected (539) so as to occur at a boundary of a second group of frames in the second plurality of groups of frames (e.g., at a key frame). In some implementations, the first transition point corresponds to an end of a scene in the first decompressed media stream, and the second transition point corresponds (540) to a beginning of a scene in the second decompressed media stream.
In some implementations, the computer system stores (543) first buffered data corresponding to the first decompressed media stream (e.g., decompressed content is stored in decompressed buffer 412 as shown in FIG. 4).
In some implementations, the computer system receives (546) a second compressed media stream corresponding to the second decompressed media stream (e.g., the computer system retrieves the second compressed media stream from a second data source 402-B, as part of stage 3 shown in FIG. 4).
In some implementations, when a first transition point and a second transition point have been selected, the computer system requests (549) the second compressed media stream starting at the second transition point, and receiving the second compressed media stream corresponding to the second decompressed media stream includes receiving a portion of the second compressed media stream starting at the second transition point (e.g., without receiving a portion of the second compressed media stream prior to the second transition point).
In some implementations, the first compressed media stream is received (550) from a media source (e.g., server 120 or another content source 121). In some implementations, the computer system stores (551) first data corresponding to the first compressed media stream in a first compressed buffer (e.g., compressed stream A data is stored in compressed data buffer 408-A as shown in FIG. 4).
In some implementations, after receiving the second compressed media stream, the computer system provides (556) the first buffered data corresponding to the first decompressed media stream to the presentation device while preparing to generate the second decompressed media stream (e.g., as shown in FIG. 4).
In some implementations, the first decompressed media stream is generated using a first codec, the second decompressed media stream is generated using a second codec, and the computer system provides (559) the first buffered data to the presentation device after the first codec has ceased to generate the first decompressed media stream and before the second codec has started to generate the second decompressed media stream (e.g., as shown in FIG. 4).
In some implementations, the first buffered data is stored (562) in a decompressed buffer (e.g., the decompressed buffer 412 as shown in FIG. 4).
In some implementations, the first decompressed media stream is generated using a first codec, the second decompressed media stream is generated using a second codec, and preparing to generate the second decompressed media stream includes initializing (570) the second codec after the first codec has stopped decompressing compressed media, so that no more than one codec is decompressing compressed media at a time (e.g., as shown in FIG. 4).
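A minimal sketch of this sequential codec hand-off follows. In practice the buffered frames would continue to be presented concurrently while the second codec initializes; the class and method names here are illustrative.

```python
# Sketch of the sequential codec hand-off: the first codec is torn down
# before the second is initialized, so at most one decoder runs at a
# time, and buffered decompressed frames bridge the gap.
import time

class Codec:
    def __init__(self, name):
        self.name = name
    def initialize(self):
        time.sleep(0.01)  # stand-in for codec start-up cost
        print(f"{self.name} initialized")
    def shutdown(self):
        print(f"{self.name} shut down")
    def decode(self, chunk):
        return f"{self.name}:{chunk}"

def switch_codecs(first, second, buffered_frames, present):
    first.shutdown()               # stop decoding the first stream
    for frame in buffered_frames:  # keep presenting from the buffer
        present(frame)
    second.initialize()            # only now start the second codec
    return second

active = switch_codecs(Codec("codec-A"), Codec("codec-B"),
                       ["frame-1", "frame-2", "frame-3"], print)
print(active.decode("first-chunk-of-stream-B"))
```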
In some implementations, after preparing to generate the second decompressed media stream, the computer system provides (574) the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream (e.g., as shown in FIG. 4).
In some implementations, when a first transition point and a second transition point have been selected, providing the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream includes ceasing (582) to provide the first content at the first transition point and starting to provide the second content at the second transition point. In some implementations, after generating at least a portion of the second decompressed media stream, the computer system cross fades (584) first content from the first decompressed media stream with second content from the second decompressed media stream. In some implementations, cross fading the first content with the second content includes combining decompressed video and/or audio from the first content and the second content. In some implementations, the decompressed video and/or audio content is combined in accordance with a weighted average of the first content and the second content, where the weighting values are adjusted over time so that the presented content gradually transitions from the first content to the second content over time.
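The weighted-average cross fade described above can be sketched as follows for a short run of decompressed samples; the linear weighting schedule is one illustrative choice.

```python
# Sketch of cross fading: blend decompressed samples from the outgoing
# and incoming streams with a weighted average whose weights shift over
# the fade so presentation moves gradually from the first to the second.

def crossfade(first_samples, second_samples):
    """Blend two equal-length lists of decompressed samples, moving the
    weight from the first stream (1.0 -> 0.0) to the second (0.0 -> 1.0)."""
    n = len(first_samples)
    blended = []
    for i, (a, b) in enumerate(zip(first_samples, second_samples)):
        w = i / (n - 1) if n > 1 else 1.0   # weight of the second stream
        blended.append((1.0 - w) * a + w * b)
    return blended

print(crossfade([1.0, 1.0, 1.0, 1.0, 1.0], [0.0, 0.0, 0.0, 0.0, 0.0]))
# [1.0, 0.75, 0.5, 0.25, 0.0]
```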
It should be understood that the particular order in which the operations of method 500 have been described is merely an example and is not intended to indicate that the described order is the only order in which the operations could be performed. One of ordinary skill in the art would recognize various ways to reorder the operations described herein.
As described below, the method 600 provides a fast and efficient way to seamlessly transition between media streams. The method reduces the likelihood of interruption or unnecessary delay of the viewing experience when transitioning from a first media stream to a second media stream, thereby creating a better user experience. Furthermore, the method ensures that the transition occurs efficiently, minimizing the power needed and the strain placed on the computing resources of the computer system (e.g., device 102 as shown in FIG. 1).
In some implementations, the computer system (e.g., device 102 as shown in
In some implementations, while providing (606) the first decompressed media stream to the presentation device, the computer system receives (608) a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream.
In some implementations, in response to receiving the content-transition indication, the computer system selects (610) a first transition point in first content corresponding to the first compressed media stream and selects a second transition point in second content corresponding to the second compressed media stream, where the second transition point is selected in accordance with the bandwidth available for receiving the second compressed media stream. In some implementations, the second transition point is also selected in accordance with one or more of: a target amount of buffered compressed media, transmission latency, content bit rate, and/or key frame location, and the first transition point is selected based on the second transition point.
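As one hypothetical way to make this selection concrete (the function, its parameters, and the sizing formula are assumptions, not the claimed method), a second transition point could be the earliest key frame for which an initial chunk of the second stream can be fetched, at the available bandwidth and latency, before the buffered first-stream media runs out:

```python
def pick_second_transition_point(key_frame_times, now, buffered_s,
                                 target_chunk_s, bit_rate_bps,
                                 bandwidth_bps, latency_s):
    # When a target-sized initial chunk of the second stream, requested now,
    # would finish downloading.
    chunk_bits = target_chunk_s * bit_rate_bps
    fetch_done = now + latency_s + chunk_bits / bandwidth_bps
    # Earliest key frame after the fetch completes but still inside the
    # window covered by the buffered first-stream media.
    for t in sorted(key_frame_times):
        if fetch_done <= t <= now + buffered_s:
            return t  # the first transition point is then aligned to this time
    return None  # no viable key frame; the transition would be deferred
```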
In some implementations, the computer system requests (614), in response to receiving the content-transition indication, the second compressed media stream starting at the second transition point (e.g., as shown in
In some implementations, after (617) generating the second decompressed media stream, the computer system ceases (618) to provide the first decompressed media stream at the first transition point in the first content (e.g., as shown in
It should be understood that the particular order in which the operations in
Plural instances are, optionally, provided for components, operations, or structures described herein as a single instance. Boundaries between various components, operations, and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and optionally fall within the scope of the implementation(s). In general, structures and functionality presented as separate components in the example configurations are, optionally, implemented as a combined structure or component. Similarly, structures and functionality presented as a single component are, optionally, implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the implementation(s).
It will also be understood that, although the terms “first,” “second,” are, in some circumstances, used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without changing the meaning of the description, so long as all occurrences of the “first contact” are renamed consistently and all occurrences of the “second contact” are renamed consistently. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used herein is for the purpose of describing particular implementations only and is not intended to be limiting of the claims. As used in the description of the implementations and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” is, optionally, construed to mean “when” or “upon” or “in response to determining” or “in accordance with a determination” or “in response to detecting,” that a stated condition precedent is true, depending on the context. Similarly, the phrase “if it is determined (that a stated condition precedent is true)” or “if (a stated condition precedent is true)” or “when (a stated condition precedent is true)” is, optionally, construed to mean “upon determining” or “in response to determining” or “in accordance with a determination” or “upon detecting” or “in response to detecting” that the stated condition precedent is true, depending on the context.
The foregoing description included example systems, methods, techniques, instruction sequences, and computing machine program products that embody illustrative implementations. For purposes of explanation, numerous specific details were set forth in order to provide an understanding of various implementations of the inventive subject matter. It will be evident, however, to those skilled in the art that implementations of the inventive subject matter are, optionally, practiced without these specific details. In general, well-known instruction instances, protocols, structures, and techniques have not been shown in detail.
The foregoing description, for purposes of explanation, has been presented with reference to specific implementations. However, the illustrative discussions above are not intended to be exhaustive or to limit the implementations to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The implementations were chosen and described in order to best explain the principles involved and their practical applications, to thereby enable others skilled in the art to best utilize the implementations, with various modifications as are suited to the particular use contemplated.
Claims
1. A method comprising:
- at a computer system including one or more processors and memory:
- generating a first decompressed media stream based on a first compressed media stream;
- providing the first decompressed media stream to a presentation device;
- receiving a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream, wherein receiving the content-transition indication includes detecting a change in the bandwidth available for receiving media streams at the computer system;
- transmitting, to a media source, a respective request for a different bandwidth version of content corresponding to the first decompressed media stream;
- after receiving the content-transition indication, continuing to provide the first decompressed media stream to the presentation device;
- while continuing to provide the first decompressed media stream to the presentation device after receiving the content-transition indication: generating first buffered data corresponding to the first decompressed media stream; starting to receive a second compressed media stream corresponding to the second decompressed media stream, wherein the second compressed media stream is received in response to the respective request; and ceasing to generate the first decompressed media stream, wherein at least part of the first buffered data corresponding to the first decompressed media stream is provided to the presentation device after ceasing to generate the first decompressed media stream;
- after starting to receive the second compressed media stream, providing at least part of the first buffered data corresponding to the first decompressed media stream to the presentation device while starting to generate the second decompressed media stream; and
- after starting to generate the second decompressed media stream, providing the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream.
2. The method of claim 1, wherein generating the first decompressed media stream based on the first compressed media stream includes:
- receiving the first compressed media stream;
- decrypting the first compressed media stream;
- demultiplexing the first compressed media stream;
- storing decrypted and demultiplexed first compressed media stream data in a first compressed buffer; and
- decompressing the first compressed media stream data using a first codec to generate the first decompressed media stream.
3. The method of claim 1, further including:
- prior to receiving the content-transition indication, storing a first amount of decompressed media stream data; and
- after receiving the content-transition indication, storing a second amount of decompressed media stream data that is larger than the first amount of decompressed media stream data.
4. The method of claim 1, wherein:
- the first compressed media stream is received from the media source; and
- the method further includes: storing first compressed data corresponding to the first compressed media stream in a first compressed buffer, wherein generating the first decompressed media stream is performed using the first compressed data stored in the first compressed buffer; in response to receiving the content-transition indication, ceasing to receive the first compressed media stream from the media source; and while receiving the second compressed media stream: storing second compressed data corresponding to the second compressed media stream in a second compressed buffer.
5. The method of claim 1, wherein:
- the first decompressed media stream is generated using a first codec;
- the second decompressed media stream is generated using a second codec; and
- the method further includes initializing the second codec after the first codec has stopped decompressing compressed media so that no more than one codec is decompressing compressed media at a time.
6. The method of claim 1, wherein:
- the first decompressed media stream is generated using a first codec;
- the second decompressed media stream is generated using a second codec; and
- at least part of the first buffered data is provided to the presentation device after the first codec has ceased to generate the first decompressed media stream and before the second codec has started to generate the second decompressed media stream.
7. The method of claim 1, wherein:
- the first decompressed media stream is generated using a first codec;
- the second decompressed media stream is generated using a second codec; and
- at least part of the first buffered data is provided to the presentation device after the first codec has ceased to generate the first decompressed media stream and after the second codec has started to generate the second decompressed media stream.
8. The method of claim 1, wherein:
- the first decompressed media stream is generated using a first codec;
- the first buffered data is stored in a decompressed buffer;
- starting to generate the second decompressed media stream comprises starting to generate second buffered data corresponding to the second decompressed media stream; and
- the method further includes, after the first codec has ceased to generate the first decompressed media stream: continuing to provide the first buffered data to the presentation device; and
- in accordance with a determination that an end of the first buffered data has been reached, starting to provide the second buffered data to the presentation device.
9. The method of claim 1, wherein:
- the content-transition indication corresponds to a request to display the second decompressed media stream starting at a respective point in the second decompressed media stream;
- the respective point is within a predefined portion of the second decompressed media stream; and
- providing the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream includes: providing a subset of the predefined portion of the second decompressed media stream, which does not include content prior to the respective point, to the presentation device.
10. The method of claim 1, wherein:
- detecting the change in the bandwidth available for receiving media streams includes detecting a decrease in the bandwidth available for receiving media streams at the computer system; and
- the respective request for a different bandwidth version of the content is a request for a lower bandwidth version of the content corresponding to the first decompressed media stream.
11. The method of claim 1, wherein:
- detecting the change in the bandwidth available for receiving media streams includes detecting an increase in the bandwidth available for receiving media streams at the computer system; and
- the respective request for a different bandwidth version of the content is a request for a higher bandwidth version of the content corresponding to the first decompressed media stream.
12. The method of claim 1, further including, after generating at least a portion of the second decompressed media stream, cross fading first content from the first decompressed media stream with second content from the second decompressed media stream.
13. The method of claim 1, wherein the first compressed media stream and the second compressed media stream correspond to the same source content.
14. A computer system, comprising:
- one or more processors; and
- memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
- generating a first decompressed media stream based on a first compressed media stream;
- providing the first decompressed media stream to a presentation device;
- receiving a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream, wherein receiving the content-transition indication includes detecting a change in the bandwidth available for receiving media streams at the computer system;
- transmitting, to a media source, a respective request for a different bandwidth version of content corresponding to the first decompressed media stream;
- after receiving the content-transition indication, continuing to provide the first decompressed media stream to the presentation device;
- while continuing to provide the first decompressed media stream to the presentation device after receiving the content-transition indication: generating first buffered data corresponding to the first decompressed media stream; starting to receive a second compressed media stream corresponding to the second decompressed media stream, wherein the second compressed media stream is received in response to the respective request; and ceasing to generate the first decompressed media stream, wherein at least part of the first buffered data corresponding to the first decompressed media stream is provided to the presentation device after ceasing to generate the first decompressed media stream;
- after starting to receive the second compressed media stream, providing at least part of the first buffered data corresponding to the first decompressed media stream to the presentation device while starting to generate the second decompressed media stream; and
- after starting to generate the second decompressed media stream, providing the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream.
15. A non-transitory computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computer system with one or more processors, cause the computer system to:
- generate a first decompressed media stream based on a first compressed media stream;
- provide the first decompressed media stream to a presentation device;
- receive a content-transition indication indicating that a second decompressed media stream is to be provided to the presentation device instead of the first decompressed media stream, wherein receiving the content-transition indication includes detecting a change in the bandwidth available for receiving media streams at the computer system;
- transmit, to a media source, a respective request for a different bandwidth version of content corresponding to the first decompressed media stream;
- after receiving the content-transition indication, continue to provide the first decompressed media stream to the presentation device;
- while continuing to provide the first decompressed media stream to the presentation device after receiving the content-transition indication: generate first buffered data corresponding to the first decompressed media stream; start to receive a second compressed media stream corresponding to the second decompressed media stream, wherein the second compressed media stream is received in response to the respective request; and cease to generate the first decompressed media stream, wherein at least part of the first buffered data corresponding to the first decompressed media stream is provided to the presentation device after ceasing to generate the first decompressed media stream;
- after starting to receive the second compressed media stream, provide at least part of the first buffered data corresponding to the first decompressed media stream to the presentation device while starting to generate the second decompressed media stream; and
- after starting to generate the second decompressed media stream, provide the second decompressed media stream to the presentation device instead of providing the first buffered data corresponding to the first decompressed media stream.
16. The method of claim 1, wherein the first compressed media stream and the second compressed media stream both correspond to a same media item.
5682207 | October 28, 1997 | Takeda et al. |
5896128 | April 20, 1999 | Boyer |
6208354 | March 27, 2001 | Porter |
6384869 | May 7, 2002 | Sciammarella et al. |
6590618 | July 8, 2003 | Park et al. |
6671453 | December 30, 2003 | Yagura et al. |
6804824 | October 12, 2004 | Potrebic et al. |
6919929 | July 19, 2005 | Iacobelli et al. |
7165050 | January 16, 2007 | Marking |
7519223 | April 14, 2009 | Dehlin et al. |
7797713 | September 14, 2010 | Dawson et al. |
8146019 | March 27, 2012 | Kim et al. |
8214619 | July 3, 2012 | Connolly |
8340654 | December 25, 2012 | Bratton |
8341662 | December 25, 2012 | Bassett et al. |
8341681 | December 25, 2012 | Walter et al. |
8434006 | April 30, 2013 | Profitt |
8532464 | September 10, 2013 | Randall |
8564728 | October 22, 2013 | Petersson et al. |
8606954 | December 10, 2013 | Abidogun et al. |
8683377 | March 25, 2014 | Zuverink et al. |
8736557 | May 27, 2014 | Chaudhri |
8908110 | December 9, 2014 | Yamashita et al. |
9032412 | May 12, 2015 | Davidson |
9071798 | June 30, 2015 | Hoffert et al. |
20010003846 | June 14, 2001 | Rowe |
20010030660 | October 18, 2001 | Zainoulline |
20020089587 | July 11, 2002 | White |
20020116701 | August 22, 2002 | Asada et al. |
20040003399 | January 1, 2004 | Cooper |
20040056885 | March 25, 2004 | Azami et al. |
20040093618 | May 13, 2004 | Baldwin et al. |
20040221306 | November 4, 2004 | Noh |
20040261127 | December 23, 2004 | Freeman et al. |
20050002453 | January 6, 2005 | Chang et al. |
20050114885 | May 26, 2005 | Shikata et al. |
20050138658 | June 23, 2005 | Bryan |
20050234992 | October 20, 2005 | Haberman |
20060015904 | January 19, 2006 | Marcus |
20060061688 | March 23, 2006 | Choi |
20060075428 | April 6, 2006 | Farmer et al. |
20060155952 | July 13, 2006 | Haas et al. |
20060159184 | July 20, 2006 | Jang |
20060218250 | September 28, 2006 | Nakase |
20060245605 | November 2, 2006 | Matsunaga |
20060259877 | November 16, 2006 | Kaminagayoshi |
20060282864 | December 14, 2006 | Gupte |
20070028270 | February 1, 2007 | Ostojic et al. |
20070067815 | March 22, 2007 | Bowen et al. |
20070083911 | April 12, 2007 | Madden et al. |
20070169156 | July 19, 2007 | Zeng |
20070263066 | November 15, 2007 | Henning et al. |
20080056273 | March 6, 2008 | Pelletier et al. |
20080074550 | March 27, 2008 | Park |
20080126294 | May 29, 2008 | Ray et al. |
20080126919 | May 29, 2008 | Uskali et al. |
20080155459 | June 26, 2008 | Ubillos |
20080242280 | October 2, 2008 | Shapiro et al. |
20080244092 | October 2, 2008 | Kosaka et al. |
20090010324 | January 8, 2009 | Yamamoto |
20090046545 | February 19, 2009 | Blinnikka |
20090055506 | February 26, 2009 | Hudson et al. |
20090100380 | April 16, 2009 | Gardner et al. |
20090119594 | May 7, 2009 | Hannuksela |
20090132599 | May 21, 2009 | Soroushian et al. |
20090195515 | August 6, 2009 | Lee |
20090198827 | August 6, 2009 | Hughes |
20090234967 | September 17, 2009 | Yu et al. |
20090235170 | September 17, 2009 | Golden |
20090297123 | December 3, 2009 | Virdi et al. |
20100049864 | February 25, 2010 | Lu et al. |
20100066918 | March 18, 2010 | Gupta et al. |
20100077441 | March 25, 2010 | Thomas et al. |
20100095240 | April 15, 2010 | Shiplacoff et al. |
20100153999 | June 17, 2010 | Yates |
20100162180 | June 24, 2010 | Dunnam et al. |
20100175026 | July 8, 2010 | Bortner et al. |
20100180297 | July 15, 2010 | Levine et al. |
20100191859 | July 29, 2010 | Raveendran |
20100235733 | September 16, 2010 | Drislane et al. |
20100235746 | September 16, 2010 | Anzures |
20100287586 | November 11, 2010 | Walter et al. |
20100306401 | December 2, 2010 | Gilson |
20100332453 | December 30, 2010 | Prahlad et al. |
20110029874 | February 3, 2011 | Profitt |
20110066703 | March 17, 2011 | Kaplan et al. |
20110090402 | April 21, 2011 | Huntington et al. |
20110119611 | May 19, 2011 | Ahn et al. |
20110119711 | May 19, 2011 | Marshall et al. |
20110119712 | May 19, 2011 | Choi et al. |
20110242002 | October 6, 2011 | Kaplan et al. |
20110252183 | October 13, 2011 | Cho |
20110289139 | November 24, 2011 | Mcintosh et al. |
20110289534 | November 24, 2011 | Jordan et al. |
20110296351 | December 1, 2011 | Ewing, Jr. et al. |
20120030619 | February 2, 2012 | Lee et al. |
20120050616 | March 1, 2012 | Anderson et al. |
20120054679 | March 1, 2012 | Ma et al. |
20120079429 | March 29, 2012 | Stathacopoulos et al. |
20120131459 | May 24, 2012 | Ilama-Vaquero et al. |
20120137216 | May 31, 2012 | Choi |
20120141095 | June 7, 2012 | Schwesinger et al. |
20120158802 | June 21, 2012 | Lakshmanan |
20120170903 | July 5, 2012 | Shirron et al. |
20120180090 | July 12, 2012 | Yoon et al. |
20120182384 | July 19, 2012 | Anderson et al. |
20120204106 | August 9, 2012 | Hill et al. |
20120213295 | August 23, 2012 | Quere et al. |
20120216117 | August 23, 2012 | Arriola et al. |
20120221950 | August 30, 2012 | Chao et al. |
20120254793 | October 4, 2012 | Briand et al. |
20120254926 | October 4, 2012 | Takahashi et al. |
20120257120 | October 11, 2012 | Nakai |
20120290933 | November 15, 2012 | Rajaraman et al. |
20120311444 | December 6, 2012 | Chaudhri |
20120323917 | December 20, 2012 | Mercer et al. |
20130016129 | January 17, 2013 | Gossweiler, III et al. |
20130061045 | March 7, 2013 | Kiefer et al. |
20130080895 | March 28, 2013 | Rossman et al. |
20130132605 | May 23, 2013 | Kocks et al. |
20130145268 | June 6, 2013 | Kukulski |
20130152129 | June 13, 2013 | Alberth et al. |
20130179925 | July 11, 2013 | Woods et al. |
20130222274 | August 29, 2013 | Mori et al. |
20130236158 | September 12, 2013 | Lynch et al. |
20130263047 | October 3, 2013 | Allen et al. |
20130265501 | October 10, 2013 | Murugesan et al. |
20130275924 | October 17, 2013 | Weinberg et al. |
20130279375 | October 24, 2013 | Newberg |
20130282917 | October 24, 2013 | Reznik et al. |
20130283154 | October 24, 2013 | Sasakura |
20130293454 | November 7, 2013 | Jeon et al. |
20130305307 | November 14, 2013 | Nagahama |
20130307792 | November 21, 2013 | Andres et al. |
20130309986 | November 21, 2013 | Cox et al. |
20130332835 | December 12, 2013 | Mace |
20130346867 | December 26, 2013 | Woods et al. |
20140032781 | January 30, 2014 | Casey et al. |
20140059479 | February 27, 2014 | Hamburg et al. |
20140082497 | March 20, 2014 | Chalouhi et al. |
20140098140 | April 10, 2014 | Tran et al. |
20140108929 | April 17, 2014 | Garmark et al. |
20140114985 | April 24, 2014 | Mok et al. |
20140136653 | May 15, 2014 | Luby et al. |
20140143725 | May 22, 2014 | Lee |
20140157124 | June 5, 2014 | Roberts et al. |
20140164984 | June 12, 2014 | Farouki |
20140173027 | June 19, 2014 | Kappes et al. |
20140176479 | June 26, 2014 | Wardenaar |
20140178047 | June 26, 2014 | Apodaca et al. |
20140185466 | July 3, 2014 | Syed et al. |
20140215303 | July 31, 2014 | Grigorovitch et al. |
20140280433 | September 18, 2014 | Messerli et al. |
20140282281 | September 18, 2014 | Ram et al. |
20150006662 | January 1, 2015 | Braness |
20150089075 | March 26, 2015 | Strigeus et al. |
1672923 | June 2006 | EP |
1775953 | April 2007 | EP |
2469841 | June 2012 | EP |
WO 2004/019530 | March 2004 | WO |
WO 2009/088952 | July 2009 | WO |
WO 2011/095693 | August 2011 | WO |
WO 2013/022486 | February 2013 | WO |
- Hoffert, Office Action U.S. Appl. No. 14/165,513, dated Mar. 27, 2015, 16 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/165,517, dated Apr. 28, 2015, 6 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/222,299, dated Apr. 13, 2015, 8 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/165,508, dated Mar. 2, 2015, 7 pgs.
- ISO/IEC 14496-12, Oct. 1, 2005, International Standard, ISO/IEC, XP55178146, 94 pgs.
- Siglin, “Unifying Global Video Strategies, MP4 File Fragmentation for Broadcast, Mobile and Web Delivery,” a Transitions in Technology White Paper, Nov. 16, 2011, 16 pgs.
- Spotify AB, International Search Report and Written Opinion, PCT/IB2014/002831, dated Mar. 19, 2015, 11 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,514, dated Mar. 3, 2015, 19 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/165,512, dated Mar. 2, 2015, 6 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/165,508, dated Mar. 2, 2015, 5 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/165,507, dated Mar. 16, 2015, 17 pgs.
- Spotify AB, International Search Report, PCT/US2014/042571, dated Dec. 12, 2014, 6 pgs.
- Spotify AB, Invitation to Pay Additional Fees and Partial Search Report, PCT/IB2014/002726, dated Mar. 31, 2015, 8 pgs.
- Zambelli, Alex, “IIS Smooth Streaming Technical Overview,” Mar. 1, 2009, Microsoft Corporation, downloaded from http://dfpcorec-p.international.epo.org/wf/storage/14C3247F2EA000308DF/originalPdf, 8 pgs.
- Hoffert, Final Office Action, U.S. Appl. No. 14/165,513, dated Jul. 16, 2015, 16 pgs.
- Spotify AB, International Search Report and Written Opinion, PCT/IB2014/002726, dated Jun. 8, 2015, 20 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,508, dated Apr. 21, 2014, 17 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,514, dated May 9, 2014, 21 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,512, dated May 28, 2014, 19 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,517, dated May 28, 2014, 18 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,507, dated May 14, 2014, 18 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,513, dated Jun. 6, 2014, 13 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/222,299, dated Jul. 3, 2014, 35 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,508, dated Sep. 22, 2014, 24 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,514, dated Oct. 23, 2014, 23 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,513, dated Nov. 7, 2014, 14 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/165,512, dated Oct. 14, 2014, 5 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,517, dated Oct. 7, 2014, 7 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,507, dated Oct. 22, 2014, 20 pgs.
- Spotify AB, Invitation to Pay Additional Fees and Partial ISR, PCT/US2014/042571, dated Sep. 24, 2014, 6 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/222,299, dated Nov. 28, 2014, 45 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,508, dated Jan. 5, 2015, 24 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,514, dated Nov. 20, 2015, 20 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,514, dated Aug. 21, 2015, 20 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/133,340, dated Sep. 15, 2015, 19 pgs.
- Spotify AB, International Preliminary Report on Patentability, PCT/US2014/042571, dated Dec. 22, 2015, 6 pgs.
- Strigeus, Office Action, U.S. Appl. No. 14/135,217, dated Aug. 26, 2015, 19 pgs.
- Strigeus, Office Action, U.S. Appl. No. 14/135,230, dated Oct. 1, 2015, 15 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,518, dated Feb. 11, 2016, 15 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/720,525, dated Aug. 22, 2016, 17 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/743,694, dated Aug. 24, 2016, 6 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/134,950, dated Oct. 3, 2016, 10 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/805,323, dated Oct. 3, 2016, 9 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/743,684, dated Sep. 23, 2016, 8 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/135,225, dated Oct. 7, 2016, 33 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,514, dated Jun. 1, 2016, 21 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/165,518, dated Jun. 7, 2016, 8 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/165,513, dated Mar. 18, 2016, 17 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/165,513, dated Aug. 11, 2016, 16 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/133,340, dated Apr. 6, 2016, 21 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/133,352, dated May 31, 2016, 23 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/133,358, dated Mar. 3, 2016, 21 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/134,950, dated Apr. 5, 2016, 17 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/805,323, dated Apr. 20, 2016, 6 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/743,684, dated Apr. 14, 2016, 7 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/135,225, dated Apr. 22, 2016, 25 pgs.
- Spotify AB, International Preliminary Report on Patentability, PCT/IB2014/002831, dated Apr. 19, 2016, 8 pgs.
- Spotify AB, Communication Pursuant to Rules 161(1) and 162, EP14828273.4, dated May 23, 2016, 2 pgs.
- Spotify AB, Communication Pursuant to Rules 161(1) and 162, EP14831065.9, dated Jun. 3, 2016, 2 pgs.
- Strigeus, Final Office Action, U.S. Appl. No. 14/135,217, dated May 20, 2016, 11 pgs.
- Strigeus, Final Office Action, U.S. Appl. No. 14/135,230, dated Jun. 1, 2016, 16 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/133,352, dated Dec. 1, 2016, 26 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/133,358, dated Oct. 25, 2016, 10 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/743,694, dated Dec. 19, 2016, 5 pgs.
- Strigeus, Office Action, U.S. Appl. No. 14/135,217, dated Jan. 9, 2017, 6 pgs.
- Strigeus, Office Action, U.S. Appl. No. 14/135,230, dated Nov. 3, 2016, 15 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/165,518, dated Jan. 13, 2017, 17 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/133,340, dated Jan. 18, 2017, 21 pgs.
- Hoffert, Notice of Allowance, U.S. Appl. No. 14/133,352, dated Mar. 28, 2017, 24 pgs.
- Hoffert, Final Office Action U.S. Appl. No. 14/135,225, dated May 30, 2017, 12 pgs.
- Hoffert, Notice of Allowance U.S. Appl. No. 14/720,525, dated Jan. 31, 2017, 8 pgs.
- Hoffert, Office Action U.S. Appl. No. 14/743,702, dated May 17, 2017, 12 pgs.
- Strigeus, Notice of Allowance, U.S. Appl. No. 14/135,217, dated May 24, 2017, 9 pgs.
- Strigeus, Notice of Allowance, U.S. Appl. No. 14/135,230, dated Mar. 23, 2017, 8 pgs.
Type: Grant
Filed: Dec 18, 2013
Date of Patent: Oct 23, 2018
Patent Publication Number: 20150039780
Assignee: SPOTIFY AB (Stockholm)
Inventors: Eric Hoffert (South Orange, NJ), Ludvig Strigeus (Gothenburg), Andreas Oman (Stockholm), Oskar Arvidsson (Trångsund)
Primary Examiner: Vivek Srivastava
Assistant Examiner: Sibte Bukhari
Application Number: 14/133,333
International Classification: H04L 29/06 (20060101); H04N 21/434 (20110101); H04N 21/44 (20110101); H04N 21/4405 (20110101); H04N 21/442 (20110101); H04N 21/2343 (20110101); H04N 21/438 (20110101);