METHODS AND APPARATUS FOR CLIENT AGGREGATION OF MEDIA IN A NETWORKED MEDIA SYSTEM
A network client aggregates media items available in a media system. The network consists of a plurality of nodes, including at least two media server nodes. A client node generates an internal request to obtain a list of media items available in the media system. In response, the client node generates a request for a list of media items from each individual media server node on the network. Each media server node sends its list of media items to the client node. The client node aggregates the lists of media items from each of the media server nodes. Thus, a list of the media items available on the media system is aggregated at the requesting client node in the media system.
1. Field of the Invention
The present invention is directed toward the field of converging disparate types of media, and more particularly directed toward a media device that aggregates media from multiple disparate devices over a network.
2. Art Background
The widespread use of computers, digital cameras, and the Internet has resulted in the creation and use of digital media. Digital media has also largely replaced more traditional analog audio and video formats with the introduction and popular acceptance of audio compact discs (CDs) and digital video discs (DVDs). In general, digital media consists of various formats of data that stores audio, video, and images in binary files. These binary files are typically stored on a medium accessible to computer devices, such as CD-ROMs, hard drives, floppy disks and memory sticks.
The storage of digital media on commonly used computer media allows for easy generation and transfer of digital media. For example, it has become popular to generate digital photos using a digital camera and then to transfer the digital photos onto computers. Computer software permits the user to manipulate the digital photos. The user may then transfer the digital photos to friends using e-mail, or post the digital photos on a web site accessible by the World Wide Web. These types of applications, which take advantage of the connectivity among different devices, have also contributed to the widespread popularity of digital media.
Digital media may be stored in a variety of formats. Special hardware or software compatible with the formats of the digital media is required to playback or view the digital media. For example, to listen to music stored in the popular MP3 format, a consumer must have a special MP3 player (i.e., either software running on a general purpose computer or a stand-alone MP3 player). There are numerous formats for video, including high quality DVDs and various compression-based MPEG and proprietary standards. To playback various formats of digital video, the consumer must use a device that reads the proper format of the digital media.
Because of the numerous different formats of digital media, the playback or viewing of numerous types of digital media today requires multiple types of devices. The playback of digital media stored in different formats is less problematic on a computer because the computer may play the digital media using software programs. However, a consumer may desire to play the media on other types of devices. For example, the consumer may desire to play digital audio files on a home stereo and view digital video on a television. Currently, stereos and televisions are not equipped to playback all formats of digital media. Accordingly, it is desirable to provide a media convergence platform that integrates various types of digital media into a single system.
Aggregation of media in a home network is typically performed using a server. Under this technique, a server tracks the existence of all media items available on the home network. For example, a media server may be implemented on a personal computer. A digital audio jukebox may be coupled to the home network. To aggregate a list of all audio available on the home network, the server (personal computer) receives a list of the media items from the digital jukebox. For this implementation, the server acts as a central point to acquire a list of all audio available on the home network. This server aggregation architecture requires constant availability of the server. The server becomes a single point of failure. Furthermore, aggregating all media items through a server limits system throughput. Accordingly, it is desirable to generate a home media system that does not rely on server aggregation to acquire all media items on a home network.
SUMMARY OF THE INVENTION
A network client aggregates media items available in a media system. A plurality of nodes are coupled to the network. A node may comprise a device, which supports services for the media system, or a media server that presents at least one media item to the network. Each network node provides one or more services for the media system. At least two of the nodes comprise media server nodes. A media server node presents media items to the network. For example, a media server node may be a hard disk drive that stores MP3 music, or a media server node may be a gateway to the internet for downloading media items. A client node generates an internal request to obtain a list of media items available in the media system. For example, a client node may comprise a television, and a user may request to view a list of all available media items in the system on the television screen.
In response to the client's internal request for media items, the client node generates a request for a list of media items from each individual media server node on the network. In one embodiment, the media system uses a discovery protocol to learn of media server nodes on the network. In response, each media server node sends its list of media items to the client node. The client node aggregates the lists of media items from each of the media server nodes. During the aggregation process, the client node determines whether each media item is unique from the other media items on the aggregated list. Thus, a list of the media items available on the media system is aggregated at the requesting client node in the media system.
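Purely as an illustration, the sketch below shows one way a client node could merge the per-server lists while checking each item for uniqueness; MediaItem, list_media_items(), and the server objects are hypothetical names and are not part of the disclosed system.
```python
# Minimal aggregation sketch. MediaItem, list_media_items(), and the
# server objects are illustrative assumptions, not names from the patent.
from dataclasses import dataclass

@dataclass(frozen=True)
class MediaItem:
    unique_id: str     # identification used to detect duplicates across servers
    title: str
    media_type: str    # e.g. "audio", "video", "photo"

def aggregate_media(servers):
    """Request each server's list and merge, keeping each item only once."""
    seen = set()
    aggregated = []
    for server in servers:
        for item in server.list_media_items():    # one request per media server
            if item.unique_id not in seen:         # uniqueness check
                seen.add(item.unique_id)
                aggregated.append(item)
    return aggregated
```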
In one embodiment, to obtain a list of media items from a media server, the client node invokes a service on the media server. The media system supports multiple protocols to communicate among nodes in the media system. First, the client node determines a protocol supported by a media server, and then uses that protocol to obtain a list of media items from the media server. The media system also supports multiple remote procedure call (“RPC”) mechanisms to invoke procedures on the media server.
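The following sketch assumes, hypothetically, that each media server advertises its protocols during discovery; the client picks a mutually supported protocol and then invokes the list-retrieval procedure over it. All identifiers here are illustrative.
```python
# Hypothetical protocol negotiation; the protocol names, connect(), and
# invoke() are assumptions for illustration only.
CLIENT_PROTOCOLS = ["proprietary-v1", "upnp"]   # ordered by preference

def fetch_media_list(server):
    for protocol in CLIENT_PROTOCOLS:
        if protocol in server.advertised_protocols:      # learned via discovery
            connection = server.connect(protocol)
            return connection.invoke("list_media_items")  # RPC to the service
    raise RuntimeError("no mutually supported protocol with " + server.name)
```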
The user interface of the present invention provides an efficient and easy way for one or more users to manage and playback media within a “media space.” As used herein, a “media space” connotes one or more media storage devices coupled to one or more media players for use by one or more users. The integration of media storage devices and media players into a single media space permits distributed management and control of content available within the media space.
As shown in
The storage devices 110 and media players 120 are controlled by management component 130. In general, management component 130 permits users to aggregate, organize, control (e.g., add, delete or modify), browse, and playback media available within the media space 100. The management component 130 may be implemented across multiple devices. The media space of
For this embodiment, the media server 210 executes software to perform a variety of functions within the media space. Thus, in this configuration, the media server 210 operates as a “thick client.” A user accesses and controls the functions of the media convergence platform through a system user interface. The user interface utilizes the thick and thin clients, as well as some media players (e.g., televisions 250 & 270). In one embodiment, the user interface includes a plurality of interactive screens displayed on media player output devices to permit a user to access the functionality of the system. A screen of the user interface includes one or more items for selection by a user. The user navigates through the user interface using a remote control device (e.g., remote control 260). The user, through use of a remote control, controls the display of screens in the user interface and selects items displayed on the screens. A user interface displayed on a television permits the user, using a remote control, to perform a variety of functions pertaining to the media available in the media space.
The components of the media convergence platform are integrated through a network. For example, in the embodiment of
For the embodiment of
The media convergence platform system also optionally integrates one or more thin audio clients into the media space. For the embodiment of
The media manager 280 is an optional component for the media convergence platform system. In general, the media manager 280 permits the user to organize, download, and edit media in the personal computer “PC” environment. The media manager may store media for integration into the media space (i.e., store media for use by other components in the media space). In one embodiment, the media manager 280 permits the user to perform system functions on a PC that are less suitable for implementation on a television based user interface.
The media space may be extended to access media stored external to those components located in the same general physical proximity (e.g., a house). In one embodiment, the media convergence platform system integrates content from external sources into the media space. For example, as shown in
As used herein, a “device” connotes a home network client that supports a collection of services to operate a broader functionality. Also, as used herein, a “media server” is an entity on the home network that stores or presents media items to the network. Furthermore, a “node” connotes any entity on a home network, including a device and/or a media server.
The media convergence platform utilizes a “peer-to-peer” architecture. All client devices on the media platform have the ability to communicate with other devices, including multiple client devices and multiple servers. This architecture permits a device to obtain all media available on the network and to aggregate that media for presentation on the device.
A device, including a client device or a server device, may enter and/or exit the home network, at any time, and still maintain full functionality. Thus, when a device is powered off, other devices automatically recognize that the device is no longer available on the home network. When a new device is added or a portable device comes onto the network, the other nodes automatically recognize the new devices. The other nodes may utilize the services on the added device. A new media server may also automatically recognize new devices, as long as at least one other media server is currently on the network.
After completing a discovery process, media device 350 determines relevant media items stored on other devices (e.g., media servers) available on home network 340. Thus, media device 350 aggregates all media, relevant to media device 350, for use at media device 350 (i.e., playback, control, etc.). As shown in
The media convergence platform provides the capability to identify all media items as unique. For example, all media items classified under the genre “pop” are recognized as such, and the system displays them accordingly. An artist may have the same name but not be the same artist. The media convergence platform utilizes a distributed database that allows the system to distinguish among unique media items. Thus, if a media item is stored on two different media servers, then during client device aggregation, the device recognizes only a single media item. For the example of
By themselves, the underlying protocols do not permit a client device to aggregate media items from devices on the home network; the protocols have no requirement to support a distributed system. For this embodiment of the media convergence platform, aggregation logic creates a distributed system using these non-distributed protocols. The aggregation logic uses multiple protocols to integrate devices on the home network.
The aggregation logic for the client device acquires media items from all media servers that contain those media items. For example, if the client requests music items, the client device acquires all music items from all media servers available on the network. This operation is illustrated in
The software components 500 also include user interface (“UI”) rendering logic 510. The UI rendering component 510 translates scene information into display information suitable for display on the client device. The UI rendering component 510 also renders the display data. For example, if the underlying client device includes a television display (e.g., a CRT), then UI rendering engine 510 generates graphics data from the scene information and renders the graphics data on the television display. If the display on the client device is an LCD display, then UI rendering engine 510 generates lists from the scene information and displays the lists on the LCD display.
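As a rough sketch, with a hypothetical display interface, the rendering component might branch on the capabilities of the local display as follows:
```python
# Hypothetical rendering dispatch: the same scene information becomes
# full graphics on a television or a plain list on an LCD panel.
def render_scene(scene, display):
    if display.kind == "television":
        frame = display.compose_graphics(scene["title"], scene["elements"])
        display.show(frame)                       # graphics rendering path
    elif display.kind == "lcd":
        lines = [scene["title"]] + [str(e) for e in scene["elements"]]
        display.show_lines(lines)                 # list-only rendering path
    else:
        raise ValueError("unsupported display: " + display.kind)
```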
As shown in
The client device software 500 supports one or more services. As shown in
In one embodiment, the media convergence platform supports a plurality of underlying protocols. In general, the protocols define commands, RPC mechanisms, and interfaces to services. In one embodiment, the media convergence platform supports the industry-defined UPnP protocol. In general, the UPnP protocol defines discovery over IP networks, an RPC mechanism, and interfaces for activating services. UPnP services include: a content directory service, a connection manager service, an audio/video (“A/V”) transport service, and an A/V control service.
In one embodiment, the media convergence platform also supports a proprietary protocol (i.e., non-industry standard protocol). For this embodiment, the proprietary protocol defines a network discovery process, an RPC mechanism, and an interface to services. The services include a content manager and a media player service. The content manager service allows a client device to interface to a database. Specifically, using the content manager service, the client device may extract information (e.g., URL to identify media, metadata, etc.) from a database on another network device. Thus, the content manager service provides a means for a device of the media convergence platform system to query a database. The media player service defines an interface to permit playback functionality (e.g., initiate and control media streams).
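A minimal sketch of how the two proprietary services could be expressed as interfaces follows; the class names, method names, and signatures are assumptions, since the patent describes the services only functionally.
```python
# Hypothetical service interfaces; method names are illustrative only.
class ContentManagerService:
    """Interface a client uses to query another device's database."""
    def query(self, object_type, filters=None):
        # return objects (media URLs, metadata, etc.) matching the request
        raise NotImplementedError

class MediaPlayerService:
    """Interface used to initiate and control media streams."""
    def play(self, media_url):
        raise NotImplementedError

    def stop(self):
        raise NotImplementedError
```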
In one embodiment, the discovery process of the proprietary protocol implements asynchronous messaging. The discovery protocol operates on any network that supports packet-based messaging or on a serialized network. In one embodiment, the discovery protocol includes an “announce” command, a “discovery” command, and a “bye-bye” command. The announce command is used by a device to announce its presence on the home media network. A discovery command is a request for an announcement (i.e., it queries whether any client devices are on the home network). The “bye-bye” command is used by a client device to announce that it is leaving the network. In one embodiment, there are two types of announcements and two types of “bye-bye” commands: one for devices and one for services.
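Hypothetically, the three discovery commands could be encoded as simple message payloads such as the following; the field names are assumptions, since only the commands themselves are defined above.
```python
# Hypothetical encodings of the discovery commands; field names are assumed.
def make_announce(node_id, kind):
    # kind is "device" or "service", matching the two announcement types
    return {"command": "announce", "node": node_id, "kind": kind}

def make_discovery():
    # a request for announcements from nodes already on the network
    return {"command": "discovery"}

def make_bye_bye(node_id, kind):
    # sent when a device or service leaves the network
    return {"command": "bye-bye", "node": node_id, "kind": kind}
```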
In one embodiment, the RPC mechanism supported by the proprietary protocol uses a packet-based protocol. The services include methods and an identification number to permit a device on the home network to construct RPC-based packets with the appropriate arguments. In general, an RPC mechanism permits a device to control another device on the network. The protocol is effectuated through requests and responses. The RPC packets include a header. In one embodiment, the header contains: version information, a command class (which maps to a particular service), the command (the method the device is requesting or the response coming from the method), an identification (identifying a request or identifying the response corresponding to a request), and a length. After the header, the RPC protocol format specifies data (i.e., arguments for requests and return values for responses).
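A sketch of packing the header fields named above into a request packet might look like this; the field widths, byte order, and numeric codes are assumptions not specified in the text.
```python
# Hypothetical RPC packet builder; field widths and byte order are assumed.
import struct

def build_rpc_packet(version, command_class, command, request_id, payload):
    # header: version, command class (service), command (method or response),
    # identification (matches responses to requests), and payload length
    header = struct.pack("!BHHII", version, command_class, command,
                         request_id, len(payload))
    return header + payload

# e.g. build_rpc_packet(1, 0x0002, 0x0010, 42, b"\x00\x01") yields a
# 13-byte header followed by the two argument bytes.
```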
As shown in
In one embodiment, a media convergence platform implementation provides security. For this embodiment, the announcement command is open ended, such that the protocol only defines a minimum specification for communication. Thus, announcement protocols may support multiple network specifications, including TCP and secure sockets layer (“SSL”). The protocol supports implementation on TCP/IP networks. In addition, the protocol supports SSL operating on TCP/IP networks. SSL permits secure communications, including authentication, between two parties on a network.
The proprietary protocol also permits an implementation using partial security. For this embodiment, a service may include some methods that require secure communications and other methods that do not require secure communications. Thus, some methods utilize SSL technology to realize secure communications between two devices on the home network.
Discovery:
The new device transmits an “announcement” command over the network (block 730,
In response to the new device's announcement command, the new device constructs state information. In general, the state information provides details regarding devices available on the network. The state information includes protocols and services supported by those devices. When compatible devices on the network receive the announcement command, those compatible devices may add information, encapsulated in the announcement command, to a local cache.
If there are no compatible devices on the network or the new device does not desire to utilize a service on the network, then the process terminates. For example, if the new device is an MP3 player, then compatible devices include those media servers storing MP3 audio as well as other MP3 players. If there are other compatible devices on the network, those devices expose one or more services to the new device (block 750,
In response to the request (e.g., new device application logic), the new device connects to a compatible device via a supporting protocol (block 760,
A media server entering a home network is one example of the discovery process. For this example, the media server, after obtaining a network address, transmits an announcement command over the network. The media server announces the services it supports (e.g., content manager, media player service), and exposes interfaces to network clients to permit access to those services. If a device enters the network, the device waits for an announcement from the server. When the client identifies the media server, the client connects to the media server via a protocol the server specified in the announcement command. This process allows the client device to navigate media on the media server. Using the supporting protocol, the client device connects to a playback device, either itself or another playback device, and instructs the playback device to play the item that a user selected from those media items available on the media server.
Convergence Platform Data Model:
The media convergence system operates in conjunction with a data model. The format and arrangement of the underlying database is not defined by the media convergence system. In the data model, objects (e.g., media items) have unique identifications in the database. The objects also have an associated “type” (e.g., photos, audio tracks, video clips, etc.). The data model defines relationships to establish structure and hierarchy among objects and types.
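A minimal sketch of such a data model, with hypothetical field names, is shown below: every object carries a unique identification and a type, and relationships are held as links between objects.
```python
# Hypothetical data-model object; field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class MediaObject:
    object_id: str                                  # unique identification
    object_type: str                                # e.g. "track", "album", "genre"
    attributes: dict = field(default_factory=dict)  # title, creation date, etc.
    related: list = field(default_factory=list)     # relationships to other objects

# e.g. a track related to an album and two genres:
# track = MediaObject("t-1", "track", {"title": "Song"}, [album, rock, blues])
```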
In one embodiment, the database for the media convergence system comprises a relational database (e.g., a key-value pair database or a structured query language (“SQL”) database). For this embodiment, the database maps objects for storage in the relational database. Although one embodiment of the media convergence system utilizes a relational database, other databases may be used without deviating from the spirit or scope of the invention.
Client device 810 may obtain information from Database A and Database B. To query Database B, client device 810 obtains a connection with device 840 in a manner as described above. The client device 810 invokes methods via an interface on content manager serviceB. For example, client device 810 may desire to obtain a list of all genres recognized by the media convergence system. This information may be stored in Database B. Client device 810 generates a request using data model parameters specified in the interface for content manager serviceB. For the example above, client device 810 generates a request to content manager serviceB to identify all objects with the type “genre.” In response to the request, content manager serviceB translates the data model notion of “genre” into a query compatible with Database B. For example, if Database B supports SQL, then content manager serviceB generates a SQL request to Database B to obtain all records in a table with the type “genre.”
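A sketch of that translation, assuming a hypothetical objects table in an SQL-backed Database B, might look like the following; the schema is an assumption made purely for illustration.
```python
# Hypothetical translation from the data-model request "all objects of
# type genre" into SQL; the table and column names are assumed.
import sqlite3

def list_objects_of_type(db_path, object_type):
    connection = sqlite3.connect(db_path)
    cursor = connection.cursor()
    cursor.execute("SELECT id, name FROM objects WHERE type = ?",
                   (object_type,))
    rows = cursor.fetchall()
    connection.close()
    return rows

# e.g. list_objects_of_type("media.db", "genre") returns every genre record.
```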
The implementation of the content manager service performs the translation from the media convergence system data model to an underlying database implementation. For the example in
In one embodiment, the media convergence platform system is implemented using a database. In general, the database stores objects, attributes associated with those objects, and associations between those objects. For example, the database stores an identification of musical tracks available within the media space. The database stores a plurality of attributes, so as to associate one or more attributes for each musical track. In one embodiment, the objects include albums, artists, tracks, genres, and playlists. Thus, a track may be associated with one or more albums, one or more artists, one or more genres, and one or more playlists. Attributes include titles, creation dates, and multiple associated media files. Thus, a track may have associated album art, lyrics, etc.
The media convergence platform database permits classifying audio tracks in an extremely versatile manner. For example, a user may desire to classify a track or album (i.e., collection of tracks) in more than one genre because the user associates the music with two different types of genres (e.g., rock and blues). Also, a musical track may be a result of a collaboration between two artists. To properly classify the track, a user of the media convergence platform may associate the track with two different artists. As illustrated by the above examples, the media convergence platform system provides maximum flexibility in classifying and organizing music.
The media convergence platform system handles each classification or item as a distinct object. For example, for the music jukebox application, playlists, genres, artists, albums, and tracks are all handled as individual objects. This feature, which supports independent objects for organization and classification of items, provides maximum flexibility in organizing and classifying music. For example, the user may create nested playlists, such that a first playlist may be wholly contained within a second playlist. Prior art music systems only deal with playlists by tracks; for these prior art systems, a playlist consists only of tracks. In the media convergence platform system, playlists may comprise any “objects.” Therefore, playlists may be created from one or more artists, genres, albums, or other playlists.
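For illustration only, a hypothetical playlist object might hold entries of any object type, including other playlists, and expand them to tracks on demand; resolve_tracks() is an assumed expansion method, not part of the disclosure.
```python
# Hypothetical playlist-as-object; resolve_tracks() on non-playlist
# entries (artists, albums, genres) is an assumed expansion method.
class Playlist:
    def __init__(self, name):
        self.name = name
        self.entries = []            # any object: track, album, artist, playlist

    def add(self, obj):
        self.entries.append(obj)

    def tracks(self):
        """Yield the tracks the playlist ultimately contains."""
        for entry in self.entries:
            if isinstance(entry, Playlist):
                yield from entry.tracks()          # nested playlist
            else:
                yield from entry.resolve_tracks()  # expand album/artist/genre
```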
The use of objects in organizing and playing music also permits artists with the same name to be treated differently. Prior art digital music systems store metadata to identify artists. If a user executes a search on the metadata using these prior art systems, there is no way for the system to differentiate among artists with the same name. In the media convergence platform system, each artist is treated as an object. Thus, two artists with the same name are two distinct objects, and may be manipulated as two separate artists.
The media convergence system utilizes distributed iterators. A response to a query to a database may generate a huge amount of data. In one embodiment, the media convergence platform protocol supports transmitting a portion of the data and maintaining a pointer to identify the data that has been sent. In one embodiment, the protocol uses iterators. The use of iterators by the media convergence platform allows the system to track a portion of the data (e.g., a list) transferred from one device to another device. The iterator is implemented such that it dynamically changes if items in the database change during transfer of the data. In general, the iterator specifies a position in an array, where the list is a result returned from the database. For example, the response to a query to a database may produce a list of audio tracks. Subsequently, an audio track, extracted as part of the example query, may be deleted. In another scenario, an audio track, specified by the query, may be added to the database.
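A simplified sketch of a server-side iterator follows; re-querying the live result set on each page is one assumed way to let the iterator reflect additions and deletions made during the transfer, and the names and page size are illustrative.
```python
# Hypothetical distributed-iterator sketch; the paging size and the
# re-query-per-page strategy are assumptions for illustration.
class ResultIterator:
    def __init__(self, fetch_results, page_size=50):
        self._fetch = fetch_results    # callable returning the current result list
        self._page_size = page_size
        self._position = 0             # position within the result array

    def next_page(self):
        current = self._fetch()        # re-run the query so changes are visible
        page = current[self._position:self._position + self._page_size]
        self._position += len(page)
        return page                    # an empty list signals the end of the data
```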
If the media convergence system is implemented using the proprietary protocol and a TCP/IP network, the system associates state with the request for database information. This state information is utilized to maintain iterator information.
User Interface:
In one embodiment, the media convergence platform separates the user interface (“UI”) scene manager and application logic from the UI rendering engine. In one implementation, the system defines user interface displays in terms of “scenes.” In general, a scene is an abstract layout for a display, and it consists of logical entities or elements. For example, a scene may define, for a particular display, a title at the top of the display, a message at the bottom of the display, and a list of elements in the middle of the display. The scene itself does not define the particular data for the title, message, and list. In one implementation, the user interface software comprises a scene manager, UI application logic, and a UI rendering engine. In general, the scene manager generates the abstract layout, in terms of logical entities, for a UI display. The application logic receives user input and determines the scene and the data to populate the scene based on the logical flow of the user interface. For example, a user may select a first item displayed on the current UI display. In response, the UI application logic selects, if applicable, a new scene and data to populate the new scene based on the user selection.
The application logic is implemented independent of the scene and the UI rendering. The UI application logic obtains, from a scene manager, the scene in terms of the abstract elements. The application logic then populates the logical elements with data, and transfers the abstract layout with data to the rendering engine. The rendering engine then displays the scene and data with display elements particular to the output display for that device. The display elements include display resolution, font size for textual display, the ability to display graphics, etc. For example, if the output device is a television screen, then the UI rendering engine generates graphics data (i.e., RGB data) suitable for display of the scene on the television screen (e.g., proper resolution, font size, etc.). If the output display is a liquid crystal display (“LCD”), the UI rendering engine translates the scene logical entities to a format suitable for display on the LCD display. For example, if the display for a device is only capable of displaying lists, then the UI rendering engine translates the scene with data to display only lists. This translation may result in deleting some information from the scene to render the display. The UI rendering engine may convert other logical elements to a list for display on the LCD display.
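The sketch below, with hypothetical class and method names, illustrates this separation: the application logic populates an abstract scene and hands it to whatever rendering engine the local device provides (for example, the render_scene sketch shown earlier).
```python
# Hypothetical split of scene manager, application logic, and renderer;
# only the renderer knows the display's resolution, fonts, or list-only limits.
class SceneManager:
    def get_scene(self, name):
        # abstract layout: logical elements only, no display specifics
        return {"title": None, "message": None, "elements": []}

class ApplicationLogic:
    def __init__(self, scene_manager, renderer):
        self.scenes = scene_manager
        self.renderer = renderer

    def on_user_selection(self, selection, data_source):
        scene = self.scenes.get_scene("browse")
        scene["title"] = selection
        scene["elements"] = data_source.items_for(selection)  # populate data
        self.renderer.render(scene)    # renderer adapts the scene to its display
```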
A user interface implementation for a media convergence platform that separates the scene manager and UI application logic from the UI rendering engine has several advantages. First, the scene manager/application logic does not require any information regarding the capabilities of the output display. Instead, the scene manager and application logic only view the UI display in terms of logical entities, and populate data for those logical entities based on user input and the logical flow of the user interface. Second, this separation permits a graphical designer of a user interface system to easily change the scenes of the user interface. For example, if a graphical designer desires to change a scene in the user interface, the graphical designer only changes the abstract layout of the scene. During runtime, the application logic receives the revised abstract layout, populates the revised abstract layout with data, and transmits the abstract layout with data to the UI rendering engine. The UI rendering engine then determines the specific display elements to display the scene based on the output device. Thus, a change to the scene does not require a change to the display elements particular to each output display because the conversion from the scene to the display elements occurs locally.
In one embodiment, the media convergence platform permits implementing user interface software remote from a device. In one implementation, the scene manager and application logic are executed on a device remote from the device displaying a user interface. The device displaying the user interface only contains the UI rendering engine. For this implementation, the data and scenes for a user interface exist on a remote device. Using this implementation, the scene interface (interface between the scene manager and the application logic) is remote from the device rendering the display. The remote device does not transfer large bitmaps across the network because only scene information with data is transferred. This delineation of functions provides a logical boundary between devices on a network that maximizes throughput over the network. In addition, a remote device hosting the scene manager/application logic does not require information regarding display capabilities of each device on the home network. Thus, this implementation pushes the UI rendering software to the device rendering the images, while permitting the application logic to reside on other devices. This architecture permits implementing a thin client in the media convergence platform because the thin client need not run the scene manager and application logic software.
Claims
1-21. (canceled)
22. A method for presenting an aggregated list of media content, the method comprising:
- receiving first information from a first device, wherein the first information indicates digital media files available on the first device;
- receiving second information from a second device, wherein the second information indicates digital media files available on the second device;
- determining a list of digital media files on the first and second device based on the first and second information; and
- generating for display the list, wherein a digital media file available on both the first and the second device only appears once in the list.
23. The method of claim 22, wherein digital media files comprise audio media files, video media files, or image media files.
24. The method of claim 22, further comprising:
- determining that a third device is configured to playback only audio media files; and
- generating for display the list at the third device, wherein the list comprises only audio media files.
25. The method of claim 22, further comprising:
- determining that a third device is configured to playback only video media files; and
- generating for display the list at the third device, wherein the list comprises only video media files.
26. The method of claim 22, further comprising:
- determining that a third device is configured to playback only image media files; and
- generating for display the list at the third device, wherein the list comprises only image media files.
27. The method of claim 22, wherein the first device comprises a media service for downloading and storing digital media files from the internet.
28. The method of claim 22, wherein the first and second information, from the first and second devices respectively, each include state information specifying a network protocol and a content service supported by the respective device.
29. The method of claim 22, further comprising: receiving a selection of a digital media file in the list; sending a request to access the selected digital media file; and receiving the selected digital media file from the first or second device.
30. The method of claim 22, wherein the digital media files in the list are organized in a data model, and the data model stores metadata associated with each digital media file, including one of artist, genre, album, track, title, creation date, album art and lyrics.
31. The method of claim 30, wherein each digital media file in the data model has a unique identification and an associated type.
32. A media system for presenting an aggregated list of media content, the system comprising:
- a first device configured to receive first information from a second device, wherein the first information indicates digital media files available on the second device; receive second information from a third device, wherein the second information indicates digital media files available on the third device; determine a list of digital media files on the second and third devices based on the first and second information; and generate for display the list, wherein a digital media file available on both the second and the third devices only appears once in the list.
33. The system of claim 32, wherein digital media files comprise audio media files, video media files, or image media files.
34. The system of claim 32, the first device further configured to:
- determine that a fourth device is configured to playback only audio media files; and
- generate for display the list at the fourth device, wherein the list comprises only audio media files.
35. The system of claim 32, the first device further configured to:
- determine that a fourth device is configured to playback only video media files; and
- generate for display the list at the fourth device, wherein the list comprises only video media files.
36. The system of claim 32, the first device further configured to:
- determine that a fourth device is configured to playback only image media files; and
- generate for display the list at the fourth device, wherein the list comprises only image media files.
37. The system of claim 32, wherein the second device comprises a media service for downloading and storing digital media files from the internet.
38. The system of claim 32, wherein the first and second information, from the second and third devices respectively, each include state information specifying a network protocol and a content service supported by the respective device.
39. The system of claim 32, the first device further configured to:
- receive a selection of a digital media file in the list;
- send a request to access the selected digital media file; and
- receive the selected digital media file from the second or third device.
40. The system of claim 32, wherein the digital media files in the list are organized in a data model, and the data model stores metadata associated with each digital media file, including one of artist, genre, album, track, title, creation date, album art and lyrics.
41. The system of claim 39, wherein each digital media file in the data model has a unique identification and an associated type.
Type: Application
Filed: Nov 26, 2014
Publication Date: Jul 2, 2015
Inventors: Daniel Putterman (San Francisco, CA), Brad Dietrich (San Francisco, CA), John Doornbos (San Francisco, CA), Jeremy Toeman (San Francisco, CA)
Application Number: 14/555,108