INFORMATION PROCESSING APPARATUS, METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM THAT CONTROLS A REPRESENTATION OF A USER OBJECT IN A VIRTUAL SPACE
Provided is an information processing apparatus including: a first information acquisition unit configured to acquire first information indicating behavior of at least one user; a second information acquisition unit configured to acquire second information on the at least one user, the second information being different from the first information; and a display control unit configured to display, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user and a virtual space which is configured based on the second information and in which the user object is arranged.
This application is a continuation of U.S. application Ser. No. 14/763,603, filed Jul. 27, 2015, which is a National Stage Application based on PCT/JP2014/050108, filed on Jan. 8, 2014, and claims priority to Japanese Patent Application No. 2013-047040, filed on Mar. 8, 2013, the entire contents of each of which are incorporated herein by reference.
TECHNICAL FIELD

The present disclosure relates to an information processing apparatus, a system, an information processing method, and a program.
BACKGROUND ART

With the spread of social media, it has become common for users to share their behavior via a network. For example, Patent Literature 1 discloses a technology that reflects users' actual behavior, such as walking, running, sitting, standing, and making a phone call, in the postures of objects representing those users in a virtual space displayed on a screen, and that expresses a user's intimacy with other users, as well as their actual positions, behavioral characteristics, and tastes, through the display positions of the objects.
CITATION LIST Patent Literature
- Patent Literature 1: JP 2010-134802A
However, the technology disclosed in Patent Literature 1, for example, cannot be said to have been satisfactorily developed, and the usability of its user interface, among other aspects, leaves room for further improvement.
In view of this, the present disclosure proposes an information processing apparatus, a system, an information processing method, and a program, each of which is new and improved and is capable of providing a user interface that can be used more easily in the case where users share their behavior information via user objects displayed in a virtual space.
Solution to Problem

According to the present disclosure, there is provided an information processing apparatus including: a first information acquisition unit configured to acquire first information indicating behavior of at least one user; a second information acquisition unit configured to acquire second information on the at least one user, the second information being different from the first information; and a display control unit configured to display, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user and a virtual space which is configured based on the second information and in which the user object is arranged.
According to the present disclosure, there is provided a system including: a terminal apparatus; and one or more server devices configured to provide a service to the terminal apparatus. Through cooperation between the terminal apparatus and the one or more server devices, the system provides a function for acquiring first information indicating behavior of at least one user, a function for acquiring second information on the at least one user, the second information being different from the first information, and a function for displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
According to the present disclosure, there is provided an information processing method including: acquiring first information indicating behavior of at least one user; acquiring second information on the at least one user, the second information being different from the first information; and displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
According to the present disclosure, there is provided a program causing a computer to realize a function for acquiring first information indicating behavior of at least one user, a function for acquiring second information on the at least one user, the second information being different from the first information, and a function for displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
In the above configurations, the behavior of the users is reflected in the respective user objects, and the virtual space in which the user objects are arranged is configured based on information on the users other than the behavior information. With this, for example, attributes or behavioral characteristics of the users displayed as the user objects can be grasped from the virtual space serving as a background of the user objects. This improves the usability of a user interface for sharing the users' behavior information via the user objects.
Advantageous Effects of Invention

As described above, according to the present disclosure, it is possible to provide a user interface that can be used more easily in the case where users share their behavior information via user objects displayed in a virtual space.
Hereinafter, (a) preferred embodiment(s) of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the drawings, elements that have substantially the same function and structure are denoted with the same reference signs, and repeated explanation is omitted.
Note that description will be provided in the following order.
1. Examples of embodiments
- 1-1. Embodiment 1
- 1-2. Embodiment 2
- 1-3. Embodiment 3
2. Behavior information sharing
3. Behavior status display
4. Setting of virtual space
5. Operation performed on object
6. Grouping of users
7. Hardware configuration
8. Supplement
1. EXAMPLES OF EMBODIMENTS

With reference to
(Terminal Apparatus)
The terminal apparatus 100 includes, as a functional configuration, a sensor 110, a behavior recognition unit 120, a communication unit 130, a control unit 140, and a user interface (UI) 150.
The sensor 110 is any of various sensors such as an acceleration sensor, a gyro sensor, a magnetic field sensor, an optical sensor, a sound sensor, and a pressure sensor and detects acceleration, a posture, a direction, a surrounding environment, and the like of a user of the terminal apparatus 100. The sensor 110 may include positioning means, such as a GPS sensor or a Wi-Fi communication unit, for acquiring position information of the user.
The behavior recognition unit 120 is realized by software in such a way that a processor such as a CPU operates in accordance with a program. Note that a functional configuration, such as the behavior recognition unit 120, which is described as being realized by software in the present specification, may instead be realized by a processing circuit such as an application specific integrated circuit (ASIC). The behavior recognition unit 120 recognizes behavior of the user of the terminal apparatus 100 on the basis of a detection result of the sensor 110 and acquires information (behavior information) indicating the behavior of the user. For example, the behavior recognition unit 120 may recognize whether a user is remaining still, walking, or running on the basis of a detection result of acceleration or the like. Further, the behavior recognition unit 120 may recognize more advanced behavior, i.e., for example, may recognize whether a user is working, staying at home, or shopping by combining a detection result of acceleration or the like with position information of the user. The behavior recognition unit 120 transmits the information indicating the recognized behavior of the user to the server 200. The behavior recognition unit 120 may also provide the information indicating the behavior of the user to the control unit 140.
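As an illustrative sketch (not the disclosed implementation), the coarse recognition step could classify behavior from the variance of the acceleration magnitude over a short window; the thresholds below are hypothetical:

```python
from statistics import pvariance

def recognize_behavior(accel_magnitudes, walk_threshold=0.5, run_threshold=4.0):
    """Return a coarse behavior status from a window of acceleration
    magnitudes (m/s^2). Threshold values are illustrative assumptions."""
    v = pvariance(accel_magnitudes)
    if v < walk_threshold:
        return "still"       # little variation: the user is at rest
    if v < run_threshold:
        return "walking"     # moderate variation
    return "running"         # large variation
```

More advanced recognition, as described above, would additionally fuse this result with position information (for example, a low-motion status at an office location could be labeled "working").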
The communication unit 130 is realized by, for example, various wired or wireless communication devices. The communication unit 130 receives behavior information of at least one user from the server 200. Herein, the users whose behavior information is received may include the user of the terminal apparatus 100 or may include at least one user (other users) other than the user of the terminal apparatus 100. The behavior information of other users can be acquired by, for example, sensors 110 and behavior recognition units 120 of other terminal apparatuses 100 and be transmitted to the server 200. Further, as described below, the communication unit 130 may receive, from the server 200, information different from the behavior information such as information indicating an attribute of each user or information obtained by performing statistical processing on the behavior information of each user. The communication unit 130 provides the received information to the control unit 140.
The control unit 140 is realized by software in such a way that a processor such as a CPU operates in accordance with a program. The control unit 140 displays an image in a display unit included in the UI 150 on the basis of the information acquired from the communication unit 130. Note that, in the present specification, the image includes a still image and a moving image. The control unit 140 may output audio from a speaker or the like included in the UI 150 on the basis of the information acquired from the communication unit 130. The image displayed by the control unit 140 in the UI 150 includes a user object and a virtual space in which the user object is arranged. Further, the control unit 140 may acquire operation performed by a user via an operation unit included in the UI 150 and may change the image to be displayed on the basis of the operation. Furthermore, the control unit 140 may execute operation with respect to a user displayed as a user object, such as grouping or the like of the user, on the basis of the operation performed by the user. Moreover, the control unit 140 may specify a behavior status on the basis of behavior information of each user and reflect the behavior status in each user object.
Herein, the user object is an object representing each of at least one user and is configured based on the behavior information of each user received by the communication unit 130 and/or the behavior information provided from the behavior recognition unit 120. That is, the user object is displayed for a user whose behavior information is provided. Note that it is unnecessary to display all user objects of users whose behavior information is provided, and only a user specified by the user of the terminal apparatus 100, such as a user classified in a specific group, may be displayed.
Meanwhile, the virtual space is configured based on information different from the behavior information. This information may be, for example, information indicating an attribute of each user or information obtained by performing statistical processing on the behavior information of each user. The control unit 140 configures an image to be displayed as the virtual space that is common to the user objects by, for example, combining the above information of each of the users displayed as the user objects. The image to be displayed as the virtual space may be selected from, for example, images that have been set in advance or may be newly generated by converting a numerical value extracted from information on each user in accordance with a predetermined rule.
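A minimal sketch of the selection from preset images described above, under the assumption that the second information is a numerical user attribute (here, age) combined by averaging; the attribute, image names, and cutoffs are all hypothetical:

```python
def select_virtual_space(user_ages):
    """Pick a preset background image common to all displayed user objects,
    based on combined second information (here, the users' average age)."""
    avg = sum(user_ages) / len(user_ages)
    if avg < 20:
        return "schoolyard"
    if avg < 40:
        return "downtown"
    return "residential_town"
```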
The UI 150 is realized by, for example, the display unit (output device) that displays an image such as a display and the operation unit (input device) that acquires operation performed by a user on the displayed image, such as a touch panel or a mouse. The UI 150 displays an image in accordance with control of the control unit 140 and provides, to the control unit 140, information indicating operation performed by a user on the displayed image. The output device included in the UI 150 may further include an audio output unit such as a speaker.
(Server)
The server 200 includes, as a functional configuration, an application server 210 and a database (DB) 220.
The application server 210 is realized by software in such a way that a processor such as a CPU operates in accordance with a program. The application server 210 executes various kinds of calculation for providing an application service to the terminal apparatus 100 while referring to the DB 220.
For example, the application server 210 accumulates, in the DB 220, information indicating behavior of a user received from the terminal apparatus 100 and transmits the accumulated information to the terminal apparatus 100 as necessary. By such processing, for example, information indicating behavior of a plurality of users recognized in the terminal apparatuses 100 used by the respective users is collected in the DB 220. In response to a request from the user through the terminal apparatus 100, the application server 210 reads information indicating behavior of another user from the DB 220 and transmits the information to the terminal apparatus 100. This enables the users to share behavior information.
The application server 210 acquires information (information different from behavior information) used for configuring the virtual space in the control unit 140 of the terminal apparatus 100 to transmit the information to the terminal apparatus 100. For example, the application server 210 may collect profile information (information indicating attribute of user) registered by each user in the DB 220 and transmit the profile information to the terminal apparatus 100 as necessary. Further, the application server 210 may execute statistical processing with respect to the behavior information of the user accumulated in the DB 220 and transmit information obtained by the processing, such as a behavior pattern of the user, to the terminal apparatus 100.
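One hedged sketch of such statistical processing, assuming a simple log format of (hour, status) records: the server could derive a per-hour behavior pattern as the most frequent status observed in each hour.

```python
from collections import Counter, defaultdict

def behavior_pattern(log):
    """log: iterable of (hour, status) pairs accumulated in the DB.
    Returns {hour: most frequent status}, a simple behavior pattern."""
    by_hour = defaultdict(Counter)
    for hour, status in log:
        by_hour[hour][status] += 1
    return {h: c.most_common(1)[0][0] for h, c in by_hour.items()}
```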
The DB 220 is realized by, for example, a storage device and stores various kinds of information to be processed by the application server 210. For example, the DB 220 stores the behavior information of each user received by the application server 210 from the terminal apparatus 100. The DB 220 may store the information indicating the attribute of each user acquired by the application server 210 and the information generated by the application server 210 performing the statistical processing on the behavior information of each user.
In Embodiment 1 of the present disclosure described above, for example, the communication unit 130 functions as a first information acquisition unit that acquires behavior information of a user (in the case where behavior information of a user him/herself of the terminal apparatus 100 is internally acquired, the behavior recognition unit 120 also functions as the first information acquisition unit). The communication unit 130 also functions as a second information acquisition unit that acquires information on the user different from the behavior information (for example, attribute information or information obtained by performing statistical processing on behavior information). The control unit 140 functions as a display control unit that displays a user object, which is configured based on the behavior information and represents each user, and a virtual space, which is configured based on the information different from the behavior information and in which the user object is arranged, in the display unit included in the UI 150. Therefore, it can be also said that the terminal apparatus 100 is an information processing apparatus including those constituent elements.
In the above description, Embodiment 1 of the present disclosure has been described. Note that the above is schematic description, and the terminal apparatus 100 and the server 200 can further include a functional configuration other than the functional configurations shown in
Further, in the above example, although the terminal apparatus that provides behavior information of a user to the server and the terminal apparatus that displays an image including a user object on the basis of information such as the behavior information collected in the server are the same, those terminal apparatuses can be independently configured as a modification example. For example, a function of the above terminal apparatus 100 may be realized by the following separate terminals: a sensor log terminal 100a that includes the sensor 110 and the behavior recognition unit 120, acquires sensing data to recognize behavior of a user, and transmits behavior information to the server 200; and a display terminal 100b that includes the communication unit 130, the control unit 140, and the UI 150 and displays an image including a user object on the basis of information received from the server 200. In this case, the display terminal 100b is not necessarily a terminal apparatus carried by a user, and may be, for example, a stationary PC, television, or game console.
Because behavior recognition is executed and behavior information of a user is generated in the terminal apparatus 100, this embodiment has an advantage in protecting privacy of the behavior information. For example, a user may select whether to transmit behavior information obtained as a result of behavior recognition to the server. This allows the user to keep, for example, behavior information that the user does not want to share in the terminal apparatus 100 without transmitting it to the server 200.
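The opt-in upload just described could be sketched as a filter applied on the terminal before transmission; the record fields are assumptions for illustration:

```python
def records_to_upload(records, private_statuses):
    """Keep only behavior records whose status the user is willing to share;
    records whose status is marked private stay on the terminal."""
    return [r for r in records if r["status"] not in private_statuses]
```

For example, a user who marks "shopping" as private would have those records withheld from the server while all other records are transmitted as usual.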
1-2. Embodiment 2

In this embodiment, as illustrated in
Note that details of each functional configuration are similar to those in Embodiment 1 described above, and therefore repeated description will be omitted. Also in this embodiment, as a modification example, a terminal apparatus that provides sensing data of a user to the server and a terminal apparatus that displays an image including a user object on the basis of information such as behavior information collected in the server can be independently configured. For example, a function of the above terminal apparatus 300 may be realized by the following separate terminals: a sensor log terminal 300a that includes the sensor 110, acquires sensing data, and provides the sensing data to the server 400; and a display terminal 300b that includes the communication unit 130, the control unit 140, and the UI 150 and displays an image including a user object on the basis of information received from the server 400. In this case, the display terminal 300b is not necessarily a terminal apparatus carried by a user, and may be, for example, a stationary PC, television, or game console.
Also in Embodiment 2 of the present disclosure described above, for example, the communication unit 130 functions as a first information acquisition unit that acquires behavior information of a user and also as a second information acquisition unit that acquires information on the user different from the behavior information. The control unit 140 functions as a display control unit that displays a user object, which is configured based on the behavior information and represents each user, and a virtual space, which is configured based on the information different from the behavior information and in which the user object is arranged, in a display unit included in the UI 150. Therefore, it can be also said that the terminal apparatus 300 is an information processing apparatus including those constituent elements.
As compared to Embodiment 1 described above, this embodiment has an advantage in reducing power consumption of the terminal apparatus 300 because the terminal apparatus 300 does not need to execute calculation for behavior recognition. Further, in the case where the sensor log terminal 300a is independently provided as described in the modification example, a processing circuit such as a processor for behavior recognition is not needed, and therefore it is possible to reduce a size and a weight of the sensor log terminal 300a and also reduce power consumption thereof.
1-3. Embodiment 3

In this embodiment, as described in
The communication unit 530 is realized by, for example, various wired or wireless communication devices. The communication unit 530 receives the data of the image including the user object from the server 600. The image including the user object is generated based on behavior information and other information in the control unit 140 included in the server 600. Further, the communication unit 530 may acquire information indicating operation performed by a user on the image including the user object from the UI 150 and transmit the information to the server 600 so that the control unit 140 can change the image including the user object on the basis of the operation.
Note that details of each functional configuration other than the communication unit 530 are similar to those in Embodiment 1 described above, and therefore repeated description will be omitted. Also in this embodiment, as a modification example, a terminal apparatus that provides sensing data of a user to the server and a terminal apparatus that receives data of an image including a user object from the server and displays it can be independently configured. For example, a function of the above terminal apparatus 500 may be realized by the following separate terminals: a sensor log terminal 500a that includes the sensor 110, acquires sensing data, and provides the sensing data to the server 600; and a display terminal 500b that includes the communication unit 530 and the UI 150 and displays an image including a user object by use of data received from the server 600. In this case, the display terminal 500b is not necessarily a terminal apparatus carried by a user, and may be, for example, a stationary PC, television, or game console.
In Embodiment 3 of the present disclosure described above, for example, the behavior recognition unit 120 and the application server 210 function as a first information acquisition unit that acquires behavior information of a user. The application server 210 also functions as a second information acquisition unit that reads, from the DB 220, attribute information that is additionally provided and acquires information on the user different from the behavior information by performing statistical processing on the behavior information. The control unit 140 functions as a display control unit that displays a user object, which is configured based on the behavior information and represents each user, and a virtual space, which is configured based on the information different from the behavior information and in which the user object is arranged, in a display unit included in the UI 150 of the terminal apparatus 500 via the communication unit 530. Therefore, it can be also said that the server 600 is an information processing apparatus including those constituent elements.
As compared to Embodiments 1 and 2 described above, this embodiment is the most advantageous in terms of power consumption of the terminal apparatus 500 because the terminal apparatus 500 does not need to execute even calculation for generating the image including the user object. Meanwhile, in this embodiment, because the image including the user object is generated in the server 600, the server 600 grasps not only the behavior information of the user but also an operation state in the UI 150. Therefore, the above two embodiments can have an advantage over this embodiment in terms of privacy.
As described above, in the embodiments of the present disclosure, it is possible to employ various variations regarding which part in the system each functional configuration is arranged. As a matter of course, examples of such variations are not limited to the above three examples and the modification examples thereof.
2. BEHAVIOR INFORMATION SHARING

With reference to
Note that, in the following description, a "behavior status" is information indicating behavior of a user classified based on a predetermined reference. For example, in the following description, "resting", "running", "moving by vehicle", and the like will be exemplified as the behavior status, but other various behavior statuses can be set. The granularity of the behavior status can be changed depending on, for example, the kind of virtual space 1013. For example, the behavior statuses "moving by vehicle", "moving by bus", and "moving by train" may in some cases be integrated and displayed as a single behavior status of "moving". In the following description, a "behavior log" is information indicating a history of behavior of a user in the past. In some examples, the behavior log can be a history of the behavior status.
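The granularity change described above can be sketched as a coarsening map applied before display; the particular mapping is an assumed example:

```python
# Hypothetical mapping from fine-grained to coarse behavior statuses.
COARSE_STATUS = {
    "moving by vehicle": "moving",
    "moving by bus": "moving",
    "moving by train": "moving",
}

def display_status(status, coarse=False):
    """Return the behavior status at the granularity required by the
    kind of virtual space; unmapped statuses pass through unchanged."""
    return COARSE_STATUS.get(status, status) if coarse else status
```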
Meanwhile, the client application 1001 controls a behavior recognition unit 1007 of the terminal apparatus or the server to cause the behavior recognition unit 1007 to provide behavior information of a user to a behavior information DB 1009. A plurality of users use the client applications 1001 to log in to the social media 1003, and therefore behavior information of the users is collected in the behavior information DB 1009.
In the case where a user and another user share behavior information, in, for example, the embodiments described above, the control unit 140 may display a list 1011 of other users whose behavior information can be shared in the UI 150 in accordance with the sharing list 1005 and then display, in the virtual space 1013, a user object 1015 showing a behavior status of a user selected from the list 1011. Note that a specific display example of the virtual space 1013 and the user object 1015 will be described below.
Herein, in order to display the user object 1015, the control unit 140 acquires the behavior information of the user from the behavior information DB 1009. At this time, whether or not the behavior information of the user can be provided may be determined by referring to the sharing list 1005. A plurality of virtual spaces 1013 may be set for respective groups in which users are classified and the virtual spaces 1013 may be switched and displayed. In the example of
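The permission check against the sharing list 1005 could be sketched as follows; the data shapes are assumptions for illustration:

```python
def visible_users(viewer, all_users, sharing_list):
    """sharing_list: {viewer: set of users whose behavior may be shown}.
    A viewer always sees their own object; other users appear only if
    the sharing list permits it."""
    allowed = sharing_list.get(viewer, set())
    return [u for u in all_users if u == viewer or u in allowed]
```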
A user can refer to behavior log display 1017 of another user by selecting the user object 1015 displayed in the virtual space 1013 in the UI 150. The behavior log display 1017 is, for example, display of a behavior log of a target user expressed by a graph showing a ratio of behavior performed by the target user that day and a behavior pattern of the user in the past. The behavior log display 1017 can be generated based on a behavior log 1019 of the target user acquired from the behavior information DB 1009.
Meanwhile, the virtual space 1013 is a space in which the user object 1015 representing each user is arranged. The user object 1015 shows the behavior status of each user by using a posture or a display position thereof. Therefore, there is a possibility that the user object 1015, as well as the user object 1023 in the list 1011, is not displayed in the case where the behavior status of the user is not specified. Further, as described below, the virtual space 1013 in which the user object 1015 is arranged is changed in accordance with, for example, an attribute, a behavior pattern, and the like of each user.
By selecting the information 1021 of each user displayed in the list 1011 or the user object 1015 of each user displayed in the virtual space 1013, it is possible to display the behavior log display 1017 of the user.
Herein, as described above, in the list 1011 and the virtual space 1013, the user of the terminal apparatus him/herself is displayed as the information 1021 or the user object 1015. By selecting them, it is also possible to display the user's own behavior log display 1017, in which basically all items can be displayed. Meanwhile, in the behavior log display 1017 of other users, only items that each user has permitted to be disclosed are displayed. That is, a behavior log that each user has set to be private on the basis of, for example, the kind of behavior or a time period is not reflected in the behavior log display 1017.
A range of the behavior log displayed as the behavior log display 1017 and a method of displaying the behavior log in the behavior log display 1017 may be changed in accordance with a current behavior status of the target user. For example, in the case where the user is currently moving, only a behavior log regarding moving such as walking, a train, and a bus may be reflected in the behavior log display 1017. Alternatively, in the case where the user is currently working in an office, only a behavior log in the office may be reflected in the behavior log display 1017.
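The context-dependent filtering just described can be sketched as below; the status categories are assumptions:

```python
# Hypothetical set of movement-related behavior statuses.
MOVEMENT = {"walking", "moving by train", "moving by bus"}

def filter_log(entries, current_status):
    """entries: list of status strings from the target user's behavior log.
    When the target user is currently moving, only movement-related
    entries are reflected in the behavior log display."""
    if current_status in MOVEMENT:
        return [e for e in entries if e in MOVEMENT]
    return entries
```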
3. BEHAVIOR STATUS DISPLAY

With reference to
(Expression Using User Object Itself)
As illustrated in the example of
(Expression Using Relationship with Another Object)
The user object 1015 may express a behavior status of a corresponding user by using another object (container object) displayed as a container thereof. For example, the user object 1015b is displayed in a container object 1016a of a vehicle running on a road, which indicates that the behavior status of the user is “moving by vehicle”. The user object 1015d is displayed in a container object 1016b of a bus running on the road, which indicates that the behavior status of the user is “moving by bus”.
Herein, the container object 1016 arranged in the virtual space 1013 is displayed based on, for example, a behavior pattern of a user displayed as the user object 1015 in the virtual space 1013 and can be displayed even in the case where a user who has a behavior status corresponding to the container object 1016 does not exist. For example, in the example of
The number or a size of other objects displayed in association with the user object 1015 may be changed in accordance with the number of associated user objects 1015. For example, in the example of
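A minimal sketch of such a scaling rule, with purely illustrative constants: the display scale of a container object grows with the number of riding user objects, up to a cap.

```python
def container_scale(num_riders, base=1.0, step=0.25, max_scale=2.0):
    """Return a display scale factor for a container object (e.g., a bus)
    that grows with the number of associated user objects, within bounds."""
    return min(base + step * max(num_riders - 1, 0), max_scale)
```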
(Arrangement of User Object Corresponding to Behavior Status)
Arrangement of the user objects 1015 in the virtual space 1013 is determined in accordance with the behavior statuses expressed by the respective user objects 1015. In the example of
Meanwhile, as to the user objects 1015c and 1015e corresponding to the running users, the user object 1015c is arranged in a park on a front side and the user object 1015e is arranged in a crosswalk on a deep side. This arrangement can indicate that the user corresponding to the user object 1015c runs in a place other than a road such as a park and the user corresponding to the user object 1015e runs on a road or a sidewalk. Thus, the arrangement of the user objects 1015 may reflect not only the behavior statuses expressed by the respective user objects 1015 but also the position information and the like of the users. Note that, in the example of
In the above example, the number or size of background objects, such as a road, a cafe, a park, a crosswalk, and a building, arranged in the virtual space 1013 may be changed in accordance with the number of the user objects 1015 arranged therein. For example, in the example of
With reference to
As described in the three examples above, in some embodiments of the present disclosure, the virtual space 1013 serving as a background of the user object 1015 in an image including a user object is configured based on information (second information) different from the behavior information (first information) of a user used for configuring the user object 1015. The virtual space 1013 can be configured based on a result of combining the second information on users (for example, a result of various kinds of clustering, including an average value, a median value, and the like). In this case, the user objects 1015 representing the users can be arranged in the common virtual space 1013. Note that the image displayed as the virtual space 1013 may be selected from, for example, images that have been set in advance or may be newly generated by converting numerical values extracted from information on the users in accordance with a predetermined rule.
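As one hedged sketch of such combination, suppose the average age of the displayed users is the combined second information used to select the common virtual space; the thresholds and the town-view names other than "young & student" (which appears in the examples below) are assumptions:

```python
def select_town_view(ages):
    """Select a shared virtual-space image from a combined attribute of
    all displayed users -- here, simply their average age. The cut-off
    values and the names "office town" / "quiet suburb" are invented."""
    if not ages:
        return "default"
    mean = sum(ages) / len(ages)
    if mean < 30:
        return "young & student"
    if mean < 60:
        return "office town"
    return "quiet suburb"
```

Any other aggregation (a median, a dominant cluster, and so on) could replace the average without changing the overall structure.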
For example, in the above examples of
For example, in the above example of
Further, in the example of
Herein, the bus 1027a and the bus 1027b displayed in the virtual space 1013 may be displayed regardless of whether or not the users P and Q ride the buses. More specifically, in the virtual space 1013, the bus 1027a may be operated at 8:00 every day and the bus 1027b at 9:00. In this case, for example, in the case where the user P does not ride the bus at 8:00 because the user P takes a vacation or the like, the user object 1015p representing the user P is displayed in another posture in another place in accordance with the behavior status of the user P at that time, whereas the bus 1027a is operated at 8:00 as usual. Note that, because the user P does not ride the bus, no user object rides the bus 1027a.
Thus, by controlling display of the virtual space 1013 or an object displayed in the virtual space 1013 on the basis of information such as a behavior pattern obtained by performing statistical processing on behavior information of a user, a characteristic of behavior of the user can be recognized based on the display of the virtual space 1013 without being influenced by temporary and irregular behavior of the user.
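The statistical processing that turns a behavior log into such a stable schedule can be sketched as follows; the function name, the use of a simple mode, and the `min_support` threshold are all assumptions made for illustration:

```python
from collections import Counter

def habitual_hour(ride_hours, min_support=3):
    """Derive a habitual boarding hour from a log of observed boarding
    hours. Once derived, the bus object is displayed on this schedule
    even on days the user skips the ride (thresholds are assumed)."""
    if not ride_hours:
        return None
    hour, count = Counter(ride_hours).most_common(1)[0]
    return hour if count >= min_support else None
```

A log dominated by 8:00 boardings thus yields a bus operated at 8:00, unaffected by an occasional vacation day.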
As an application of such display, an approach from a certain user to another user may be performed. For example, there is a case where a plurality of users have behavior patterns (the behavior may be the same or different) in the same time period, such as a case where a user X has a behavior pattern of "jogging in the evening on Saturdays" and a user Y has a behavior pattern of "riding a bicycle in the evening on Saturdays". In this case, in the case where some of the plurality of users unexpectedly do not perform behavior in accordance with their behavior patterns, another user may encourage those users to perform the behavior. In the above example, in the case where the user X is jogging but the user Y is not riding a bicycle in the evening on Saturday, the user object 1015 representing the user X may, while jogging in the virtual space 1013 displayed in a terminal apparatus of the user Y, encourage the user Y to ride the bicycle.
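Detecting which users in the shared time slot are deviating from their patterns (and so are candidates for encouragement) might be sketched as below; the function and data shapes are assumptions:

```python
def users_to_encourage(patterns, today):
    """patterns: user -> expected activity in the shared time slot;
    today: user -> observed activity (absent if nothing observed).
    Returns users whose expected activity did not occur, so that a
    peer's user object can nudge them in the virtual space."""
    return [u for u, expected in patterns.items() if today.get(u) != expected]
```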
5. OPERATION PERFORMED ON OBJECT
With reference to
For example, in the case of the stopping user object 1015 illustrated in
Meanwhile, in the case of the user object 1015 which indicates that the corresponding user is moving by car because the user object 1015 rides the car 1027c as illustrated in
Further, in some embodiments of the present disclosure, it is also possible to execute different kinds of operation with respect to a user represented by the user object 1015 on the basis of the kind of operation used to select the user object 1015. For example, in the case where the user object 1015 is touched once, the display is changed to the log display 1017. In the case where the user object 1015 is touched twice, the display is changed to the message transmission screen. In the case where the user object 1015 is flicked toward the right side, an audio telephone call is prepared. In the case of shaking the terminal apparatus while tapping the user object 1015, vibration notification is executed with respect to the target user.
Further, in some embodiments of the present disclosure, the kind of operation performed on a user represented by the user object 1015 may be automatically selected when the user object 1015 is selected. For example, in the case where a predetermined operation (for example, a double tap) is executed with respect to the user object 1015 and the user represented by the user object 1015 is the user him/herself who performs the operation, posting a message to the social media is selected. In the case where the user represented by the user object 1015 is a friend, message transmission to that user is selected. In the case where the message transmission is performed and the target user is sitting, transmission of a message with an image may be selected, because it is presumed that the user can take time to read the message. In the case where the target user is moving by vehicle, vibration notification may be selected so as not to obstruct driving.
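This automatic selection amounts to a small dispatch over the relationship with the target user and the target user's behavior status. A hypothetical sketch (operation names and the dispatch order are assumptions drawn only from the examples above):

```python
def select_operation(is_self, relation, target_status):
    """Hypothetical dispatch for a double tap on a user object 1015."""
    if is_self:
        return "post to social media"
    if relation == "friend":
        if target_status == "sitting":
            return "send message with image"   # user has time to read it
        if target_status.startswith("moving by"):
            return "vibration notification"    # do not obstruct driving
        return "send message"
    return "view log"                          # fallback for non-friends
```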
In a screen in which a behavior status of each user is displayed by use of the user object 1015 as illustrated in, for example,
With reference to
At this time, a user object 1031 displayed while the screen is being changed may be the same as the user object 1023, may be the same as the user object 1015, or may be different from the both. By further flicking the screen to the right or left side after the screen is switched to the virtual space 1013, still another virtual space 1013 may be displayed and the selected user may be classified in still another group.
As described above, the virtual space 1013 displayed in the UI 150 can be changed for each group in which the user is classified. Further, the reference used when behavior of the user is classified and a behavior status thereof is set may be changed for each virtual space 1013. For example, in the case where the virtual space 1013 corresponding to a certain group is a virtual space shaped like a bookstore (a "Bookstore" town view), behavior statuses regarding books may be classified in more detail in comparison with another virtual space, and behavior statuses other than those regarding books may be roughly classified (for example, moving by train, bus, and vehicle may be integrated as "moving").
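A minimal sketch of this per-space classification reference, assuming the "Bookstore" town view from the example above (the status names and the decision to key coarsening on transport statuses are assumptions):

```python
def classify(raw_status, town_view):
    """In a 'Bookstore' town view, book-related behavior stays detailed
    while transport statuses collapse into a single 'moving' status;
    other town views keep the raw status (an illustrative rule)."""
    transport = {"moving by train", "moving by bus", "moving by vehicle"}
    if town_view == "Bookstore" and raw_status in transport:
        return "moving"
    return raw_status  # e.g. "browsing books" remains distinct
```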
As described above with reference to
Next, profiles of the registered members are acquired (Step S103). The profiles acquired herein are information such as the profile information 1025 illustrated in the examples of, for example,
Next, a town view of the group is selected based on the extracted common item (Step S107). The town view herein is displayed as the virtual space 1013 in the above description and is, for example, the "Bookstore" town view or the "young & student" town view illustrated in the examples of
After that, for example, in the case where a user who is not registered in the group exists among the users displayed in the list 1011, a recommended group is displayed (Step S109). The recommended group can be displayed by, for example, the method described above with reference to
Next, additional selection and additional registration of a group member are received (Step S111). The additional selection and additional registration received herein can be executed by a user who has referred to the display of the recommended group in Step S109. However, the user does not necessarily execute the additional selection and additional registration in accordance with the display of the recommended group, and may execute them while ignoring that display. Although not illustrated, processing similar to that of Steps S103 to S107 is executed again after Step S111, and a town view suitable for the group after the additional selection and additional registration can be selected. Further, in the case where a user who is not registered in the group still exists, the recommended group display similar to that in Step S109 can be executed again.
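The flow from profile acquisition through town-view selection to group recommendation can be sketched as one function; the helper logic, the use of "occupation" as the common item, and the town-view naming are all assumptions made for illustration:

```python
from collections import Counter

def build_group(candidates, registered, profiles):
    """Hedged sketch of the flow around Steps S103-S109: acquire member
    profiles, extract a common item, select a town view, and recommend
    the group to still-unregistered candidates sharing that item."""
    member_profiles = [profiles[u] for u in registered]
    common = Counter(p["occupation"] for p in member_profiles).most_common(1)[0][0]
    town_view = f"{common} town view"  # selection corresponding to Step S107
    recommended = [u for u in candidates
                   if u not in registered and profiles[u]["occupation"] == common]
    return town_view, recommended      # the recommendation shown in Step S109
```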
Further, highly correlated behavior among the group members is extracted based on the behavior patterns of the respective members acquired in Step S203 (Step S205). The “highly correlated behavior” herein may be extracted based on multiple references such as “the same behavior in the same time period” (for example, as illustrated in the example of
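One way to extract "the same behavior in the same time period" among group members is to intersect their event sets pairwise; the data shapes and the support threshold below are assumptions:

```python
from itertools import combinations

def correlated_pairs(behavior_logs):
    """behavior_logs: member -> set of (time_slot, behavior) events.
    A pair is treated as 'highly correlated' here when the members
    share at least two same-slot, same-behavior events (the threshold
    of two is an illustrative assumption)."""
    pairs = []
    for a, b in combinations(sorted(behavior_logs), 2):
        shared = behavior_logs[a] & behavior_logs[b]
        if len(shared) >= 2:
            pairs.append((a, b))
    return pairs
```

Other references, such as "different behavior in the same time period", would only change how the shared events are matched.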
Next, the town view is updated in accordance with the behavior extracted in Step S205 (Step S207). The update of the town view herein may indicate, for example, replacement of the town view with a completely new town view or indicate addition or change of an object to be arranged in the virtual space 1013 such as the bus 1027 illustrated in the example of
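The object-addition branch of this update (Step S207) might look like the sketch below; the behavior-to-object table is an assumption, with the bus entry following the bus 1027 example:

```python
def update_town_view(space_objects, correlated_behaviors):
    """When a correlated behavior (e.g. members riding a bus in the same
    time period) is extracted, add a matching object to the virtual
    space instead of replacing the whole town view (illustrative)."""
    additions = {"riding bus together": "bus",
                 "jogging together": "jogging course"}
    for behavior in correlated_behaviors:
        obj = additions.get(behavior)
        if obj and obj not in space_objects:
            space_objects.append(obj)
    return space_objects
```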
Next, with reference to
The information processing apparatus 900 includes a central processing unit (CPU) 901, read only memory (ROM) 903, and random access memory (RAM) 905. Further, the information processing apparatus 900 may also include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925. Furthermore, the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary. The information processing apparatus 900 may also include, instead of or along with the CPU 901, a processing circuit such as a digital signal processor (DSP).
The CPU 901 functions as an arithmetic processing unit and a control unit and controls an entire operation or a part of the operation of the information processing apparatus 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or a removable recording medium 927. The ROM 903 stores programs and arithmetic parameters used by the CPU 901. The RAM 905 primarily stores programs used in execution of the CPU 901 and parameters and the like varying as appropriate during the execution. The CPU 901, the ROM 903, and the RAM 905 are connected to each other via the host bus 907 configured from an internal bus such as a CPU bus or the like. In addition, the host bus 907 is connected to the external bus 911 such as a peripheral component interconnect/interface (PCI) bus via the bridge 909.
The input device 915 is a device operated by a user, such as a mouse, a keyboard, a touch panel, buttons, a switch, and a lever. Also, the input device 915 may be a remote control device using, for example, infrared light or other radio waves, or may be an external connection device 929 such as a cell phone compatible with the operation of the information processing apparatus 900. The input device 915 includes an input control circuit that generates an input signal on the basis of information input by the user and outputs the input signal to the CPU 901. The user inputs various kinds of data to the information processing apparatus 900 and instructs the information processing apparatus 900 to perform a processing operation by operating the input device 915.
The output device 917 is configured from a device capable of visually or aurally notifying the user of acquired information. For example, the output device 917 may be: a display device such as a liquid crystal display (LCD), a plasma display panel (PDP), or an organic electro-luminescence (EL) display; an audio output device such as a speaker or headphones; or a printer. The output device 917 outputs results obtained by the processing performed by the information processing apparatus 900 as video in the form of text or an image, or as audio in the form of voice or sound.
The storage device 919 is a device for storing data configured as an example of a storage unit of the information processing apparatus 900. The storage device 919 is configured from, for example, a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, or a magneto-optical storage device. This storage device 919 stores programs to be executed by the CPU 901, various data, and various data obtained from the outside.
The drive 921 is a reader/writer for the removable recording medium 927 such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900. The drive 921 reads out information recorded on the attached removable recording medium 927, and outputs the information to the RAM 905. Further, the drive 921 writes records on the attached removable recording medium 927.
The connection port 923 is a port for allowing devices to directly connect to the information processing apparatus 900. Examples of the connection port 923 include a universal serial bus (USB) port, an IEEE1394 port, and a small computer system interface (SCSI) port. Other examples of the connection port 923 may include an RS-232C port, an optical audio terminal, and a high-definition multimedia interface (HDMI) port. The connection of the external connection device 929 to the connection port 923 enables various kinds of data to be exchanged between the information processing apparatus 900 and the external connection device 929.
The communication device 925 is a communication interface configured from, for example, a communication device for establishing a connection to a communication network 931. The communication device 925 is, for example, a wired or wireless local area network (LAN), Bluetooth (registered trademark), a communication card for wireless USB (WUSB), or the like. Alternatively, the communication device 925 may be a router for optical communication, a router for asymmetric digital subscriber line (ADSL), a modem for various communications, or the like. The communication device 925 can transmit and receive signals and the like to and from the Internet and other communication devices using a certain protocol such as TCP/IP, for example. The communication network 931 connected to the communication device 925 is configured from a network connected via wire or wirelessly and is, for example, the Internet, a home-use LAN, infrared communication, radio wave communication, or satellite communication.
The imaging device 933 is a device which images a real space by use of various members including an image sensor such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) and a lens for controlling image formation of a subject on the image sensor, and generates a pickup image. The imaging device 933 may image a still image or a moving image.
The sensor 935 is any of various sensors such as an acceleration sensor, a gyro sensor, a magnetic field sensor, an optical sensor, a sound sensor, and a pressure sensor. For example, the sensor 935 acquires information related to the state of the information processing apparatus 900 itself, such as the posture of the housing of the information processing apparatus 900, or information related to the peripheral environment of the information processing apparatus 900, such as the brightness or noise around the information processing apparatus 900. Further, the sensor 935 may include a global positioning system (GPS) sensor which measures the latitude, the longitude, and the altitude of the apparatus by receiving a GPS signal.
Heretofore, an example of the hardware configuration of the information processing apparatus 900 has been shown. Each of the structural elements described above may be configured using a general-purpose material, or may be configured from hardware dedicated to the function of each structural element. The configuration may be changed as appropriate according to the technical level at the time of carrying out embodiments.
8. SUPPLEMENT
The embodiments of the present disclosure may include the information processing apparatus (a terminal apparatus or a server), the system, the information processing method executed in the information processing apparatus or the system, the program for causing the information processing apparatus to function, and the non-transitory tangible media having the program recorded thereon, which have been described above, for example.
The preferred embodiment(s) of the present disclosure has/have been described above with reference to the accompanying drawings, whilst the present disclosure is not limited to the above examples, of course. A person skilled in the art may find various alterations and modifications within the scope of the appended claims, and it should be understood that they will naturally come under the technical scope of the present disclosure.
Additionally, the present technology may also be configured as below.
(1)
An information processing apparatus including:
a first information acquisition unit configured to acquire first information indicating behavior of at least one user;
a second information acquisition unit configured to acquire second information on the at least one user, the second information being different from the first information; and
a display control unit configured to display, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user and a virtual space which is configured based on the second information and in which the user object is arranged.
(2)
The information processing apparatus according to (1),
wherein the display control unit combines the second information on the at least one user to set the virtual space that is common to the user object representing the corresponding at least one user.
(3)
The information processing apparatus according to (2),
wherein the second information acquisition unit acquires the second information obtained by performing statistical processing on the information indicating the behavior of the at least one user.
(4)
The information processing apparatus according to (3),
wherein the second information acquisition unit acquires the second information indicating a behavior pattern of the at least one user.
(5)
The information processing apparatus according to (4),
wherein the display control unit changes a part or all of the virtual space with time on the basis of the behavior pattern.
(6)
The information processing apparatus according to any one of (2) to (5),
wherein the second information acquisition unit acquires the second information indicating an attribute of the at least one user.
(7)
The information processing apparatus according to (6),
wherein the second information acquisition unit acquires the second information generated based on age, an occupation, or a product purchase history of the at least one user.
(8)
The information processing apparatus according to any one of (1) to (7),
wherein the display control unit configures the virtual space for each group in which the at least one user is classified and arranges the user object in the virtual space corresponding to a group in which each user is classified.
(9)
The information processing apparatus according to (8),
wherein, regarding a user who is not classified in the group, the display control unit displays, in the display unit, text or an image showing a group to which the user is recommended to be classified.
(10)
The information processing apparatus according to (8) or (9),
wherein, regarding a user who is not classified in the group, by displaying, as a preview, a change in the virtual space occurring in a case where the user is arranged in the virtual space, the display control unit indicates whether or not a group corresponding to the virtual space is a group to which the user who is not classified in the group is recommended to be classified.
(11)
The information processing apparatus according to any one of (1) to (10),
wherein the display control unit shows, by use of display of the user object, a behavior status obtained by classifying the behavior of each user on the basis of a predetermined reference.
(12)
The information processing apparatus according to (11),
wherein the display control unit shows the behavior status of each user by using a shape or a motion of the user object.
(13)
The information processing apparatus according to (11) or (12),
wherein the display control unit arranges a container object corresponding to the behavior status in the virtual space and shows the behavior status of each user by displaying the user object in the container object.
(14)
The information processing apparatus according to (13),
wherein the display control unit displays the container object on the basis of a behavior pattern of the at least one user.
(15)
The information processing apparatus according to (14),
wherein the display control unit displays the container object, regardless of whether or not the at least one user performs behavior in accordance with the behavior pattern.
(16)
The information processing apparatus according to any one of (13) to (15),
wherein the display control unit changes a number or a size of the container objects in accordance with a number of users having behavior statuses corresponding to the container object.
(17)
The information processing apparatus according to any one of (11) to (16),
wherein the display control unit changes the reference for classifying the behavior status depending on the virtual space.
(18)
A system including:
a terminal apparatus; and
one or more server devices configured to provide a service to the terminal apparatus,
wherein the system provides, by cooperating the terminal apparatus with the one or more server devices,
a function for acquiring first information indicating behavior of at least one user,
a function for acquiring second information on the at least one user, the second information being different from the first information, and a function for displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
(19)
An information processing method including:
acquiring first information indicating behavior of at least one user;
acquiring second information on the at least one user, the second information being different from the first information; and
displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
(20)
A program causing a computer to realize
a function for acquiring first information indicating behavior of at least one user,
a function for acquiring second information on the at least one user, the second information being different from the first information, and
a function for displaying, in a display unit, a user object which is configured based on the first information and represents the corresponding at least one user, and a virtual space which is configured based on the second information and in which the user object is arranged.
REFERENCE SIGNS LIST
-
- 10, 30, 50 system
- 100, 300, 500 terminal apparatus
- 200, 400, 600 server
- 110 sensor
- 120 behavior recognition unit
- 130, 530 communication unit
- 140 control unit
- 150 UI
Claims
1. An information processing apparatus comprising circuitry configured to:
- acquire a behavior pattern for each user of a plurality of users based on a behavior log of each user to construct a recommendation group of users whose behavior patterns are highly correlated; and
- receive a registration for a previously unregistered user selected from the recommendation group.
2. The information processing apparatus of claim 1, wherein the circuitry is further configured to:
- display the recommendation group for selection by a particular user of the plurality of users.
3. The information processing apparatus of claim 1, wherein the recommendation group of users is determined to be highly correlated based on each user of the recommendation group having a similar score value.
4. The information processing apparatus of claim 3, wherein the score value of each user of the recommendation group is associated with a behavior of the respective user.
5. The information processing apparatus of claim 1, wherein the circuitry is further configured to:
- acquire the behavior pattern for each user of the plurality of users by performing statistical processing on behavior information of each user of the plurality of users.
6. The information processing apparatus of claim 1, wherein each of the plurality of user objects is displayed to indicate a behavior of the respective user.
7. The information processing apparatus of claim 1, wherein the behavior patterns are highly correlated over a period of time.
8. The information processing apparatus of claim 1, wherein each user of the plurality of users has an associated profile.
9. The information processing apparatus of claim 8, wherein the associated profile of each user comprises an age, a gender and an occupation.
10. A method in an information processing system, the method comprising:
- acquiring a behavior pattern for each user of a plurality of users based on a behavior log of each user to construct a recommendation group of users whose behavior patterns are highly correlated; and
- receiving a registration for a previously unregistered user selected from the recommendation group.
11. The method of claim 10, further comprising:
- displaying the recommendation group for selection by a particular user of the plurality of users.
12. The method of claim 10, wherein the recommendation group of users is determined to be highly correlated based on each user of the recommendation group having a similar score value.
13. The method of claim 12, wherein the score value of each user of the recommendation group is associated with a behavior of the respective user.
14. The method of claim 10, further comprising:
- acquiring the behavior pattern for each user of the plurality of users by performing statistical processing on behavior information of each user of the plurality of users.
15. The method of claim 10, wherein each of the plurality of user objects is displayed to indicate a behavior of the respective user.
16. The method of claim 10, wherein the behavior patterns are highly correlated over a period of time.
17. The method of claim 10, wherein each user of the plurality of users has an associated profile.
18. The method of claim 17, wherein the associated profile of each user comprises an age, a gender and an occupation.
19. A non-transitory computer readable medium comprising instructions for causing a computer to:
- acquire a behavior pattern for each user of a plurality of users based on a behavior log of each user to construct a recommendation group of users whose behavior patterns are highly correlated; and
- receive a registration for a previously unregistered user selected from the recommendation group.
Type: Application
Filed: Mar 11, 2021
Publication Date: Jul 1, 2021
Applicants: Sony Corporation (Tokyo), So-Net Corporation (Tokyo)
Inventors: Masatomo KURATA (Tokyo), Hideyuki ONO (Kanagawa), Sota MATSUZAWA (Tokyo), Akikazu TAKEUCHI (Tokyo), Takayoshi MURAMATSU (Tokyo)
Application Number: 17/198,277