Viewpoint Change on a Display Device Based on Movement of the Device
Embodiments of the disclosed technology comprise a handheld display device with built-in accelerometer and, in some embodiments, compass. The display of a human figure is changed based on a change in viewpoint/orientation of the device. That is, upon detecting a change in viewpoint (e.g., viewing angle, tilt, roll, or pitch of the device), the image of the person changes. This may be used with a still picture of a person, such as for the sale of clothing, or in conjunction with moving images, such as for a sports or exercise instructional video.
The disclosed technology relates generally to viewing on a display device and, more specifically, to changing a viewing angle based on physical orientation of the device.
BACKGROUND OF THE DISCLOSED TECHNOLOGY

Shopping online is typically a glorified version of catalog shopping. On a catalog page, a picture of the item, or perhaps several pictures, is shown, and, in the case of clothing, typically on a model. One can see the available sizes and prices as well. In an online catalog, generally the same information is available, but in some cases a person can choose a certain picture and zoom in. In some instances, videos are available. Someone wishing to learn a karate sequence might watch a video of it, see diagrams, and so forth. Still, the content shown is controlled, by and large, by the provider. The interactivity is limited to the familiar rewind, stop, play, and fast-forward features. While such features are useful, still pictures and even videos are a poor substitute for actually being there.
A user may click and drag a mouse around an object to translate or even rotate it, but such interaction corresponds poorly to the real world and allows only a limited range of viewing changes, depending on how the mouse is currently mapped in an application. Generally, such movements translate the position of an object or of the camera along the XY axes, zoom in, or allow a user to choose a different picture on which the same changes can be repeated.
What is needed is a way to make the user experience more real in such a way as to make a user feel more immersed and in control of what s/he is watching and to have the ability to control greater axes of movement in a natural manner.
SUMMARY OF THE DISCLOSED TECHNOLOGY

It is an object of the disclosed technology to provide a simple and natural interface and control for viewing a human figure.
It is a further object of the disclosed technology to provide a handheld display device whereby a display becomes interactive when the device is moved.
It is a further object of the disclosed technology to take into account orientation and/or acceleration of a device to determine what is displayed on a device.
An embodiment of the disclosed technology is a display device with an orientation sensor. The orientation sensor, which for example could be an accelerometer or a compass, measures orientation of the display device relative to a fixed external directional vector and, in some embodiments, the rate of displacement of the device from the same directional vector. An accelerometer measures orientation or movement changes relative to gravity while a compass measures change in orientation or movement relative to a pole (e.g., relative to the north pole). Thus, depending on orientation of the device and direction of movement, the accelerometer, compass, or combination thereof determines a direction of movement of a display screen. The display device of this embodiment further has a storage device with data representative of a human figure, and a display exhibiting the human figure. The display of the human figure changes based on a direction of movement detected by the orientation sensor, and in some cases, also based on a direction of movement detected by a secondary orientation sensor.
The changing of the exhibited display may be a change in viewpoint of the human figure around a predefined center of gravity of the human figure, or a center of gravity of a plurality of human figures. The center of gravity may be an actual center of gravity, an estimated center of gravity, and/or a chosen point which is at the center, or approximately at a center, as defined in each case in a specific embodiment of the disclosed technology.
The display device may exhibit a display of a moving human figure, the human figure moving irrespective of, and in addition to, a change in viewpoint of the exhibited human figure. That is, a user of the display device may change the orientation of the device relative to his or her position, and the viewpoint of the human figure shown therein may continue to change with changes in device orientation around a vector orthogonal or partially orthogonal to the sensor's directional vector. The change in viewpoint may be non-linearly or linearly mapped to an amount of movement of the display device.
In a specific embodiment of the display device, upon a first rotation of the device at a first relative position, a first display of the human figure is exhibited. Upon a second rotation to a second relative position, a second display of the human figure is exhibited, such as frames in a continuous video. Upon a third rotation back to the first relative position, a third display of the human figure is exhibited. Thus, for example, a person with frontal face showing may be frowning in the first display at a first relative position of the device. When a user changes the position of the device relative to his or her position, the display of the human figure on the device is changed, such as, for example, to a view of the side of the same person's head. When returning to the first relative position showing the frontal view of the person's face again, the same person is smiling. This viewpoint change may be toggled during repeated successions of viewing and changing the view away from and back to the first relative position.
The human figure shown on the display device may be wearing clothes as part of an offer for sale of the clothes. As another usage, the display may be used to teach a user positioning during an exercise with defined positions, such as yoga, martial arts, sports, and pornography positions.
A further embodiment of the disclosed technology is a method of displaying a series of images on a handheld device. The method proceeds by storing a plurality of images representative of a human figure, measuring the orientation of the handheld device with an orientation sensor, exhibiting on the display a first image of the human figure, and changing the image exhibited based on a direction and rate of movement of the handheld device, as determined from the orientation sensor. The direction and rate of movement are determined by measuring changes in orientation over time compared to a fixed directional vector, such as acceleration due to gravity or magnetic north.
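The image-selection step of the method above can be sketched as follows. This is a minimal illustration, not the patented implementation; the frame naming, the 10-degree step, and the `select_frame` helper are assumptions introduced here for clarity.

```python
# Hypothetical image set: one stored view of the figure per 10 degrees
# of viewpoint rotation, as one possible realization of the stored
# "plurality of images representative of a human figure".
FRAMES_PER_REVOLUTION = 36

def select_frame(frames, tilt_degrees):
    """Map the device's measured tilt (relative to a fixed vector such as
    gravity or magnetic north) to an index into a stored series of images."""
    step = 360 / len(frames)
    index = int(round(tilt_degrees / step)) % len(frames)
    return frames[index]

# Usage with placeholder frame labels:
frames = [f"view_{i * 10}deg" for i in range(FRAMES_PER_REVOLUTION)]
```

Negative tilts wrap around the figure, so rotating the device past zero continues the view smoothly rather than stopping at an edge.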
Further elements of the device of the disclosed technology are applicable to embodiments of the method of the disclosed technology.
Embodiments of the disclosed technology comprise a handheld display device with a built-in orientation sensor such as an accelerometer or a compass. The display of a human figure on the display screen changes based on a change in the orientation of the device's hardware with respect to an external orientation vector and the user's viewing angle. That is, upon detecting a change in the viewpoint (e.g., viewing angle, tilt, roll, or pitch of the device relative to the stationary user), the image of the person changes. This may be used with images of a non-moving person, such as for the sale of clothing, or in conjunction with images of a person moving, such as for a sports or exercise instructional video.
For purposes of this disclosure, an accelerometer is defined as a device which measures acceleration of a device relative to freefall. A single or multi-axis accelerometer may be used to carry out embodiments of the disclosed technology to sense orientation. An accelerometer measures the acceleration relative to a frame of reference. An accelerometer at rest relative to the earth's surface will indicate approximately 1 g upwards, because any point on the earth's surface is accelerating upwards relative to a local inertial frame. To obtain the acceleration due to motion with respect to the earth, the offset from gravity is subtracted, or the change in acceleration is measured to determine when an object is rotating about an axis. The combined measurements in relation to multiple axes are used to determine rotation which is unaligned with the earth.
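The paragraph above notes that an accelerometer at rest reads approximately 1 g, and that the direction of that vector reveals the device's orientation relative to gravity. A minimal sketch of recovering tilt angles from a 3-axis reading follows; the axis convention and function name are assumptions for illustration only.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll (in degrees) from a 3-axis accelerometer
    at rest. The sensor reads ~1 g opposing gravity, so the direction of
    that vector gives the device's orientation relative to gravitational
    down (readings are in units of g; axis convention is assumed)."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

A device lying flat (reading 1 g on the z axis) yields zero pitch and roll; tilting it onto an edge moves the gravity vector into another axis and the angles change accordingly.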
A compass, for purposes of this disclosure, is a device which determines the orientation of the display device of embodiments of the disclosed technology with respect to the plane of the earth's surface. It may detect true north, magnetic north, or any other direction. It may instead determine only a change in direction without knowing an actual direction, in embodiments of the disclosed technology. When determining a magnetic field, this may be by measuring the magnetic field directly or measuring another value and approximating the magnetic field/direction. A solid state compass, global positioning system, or other such device may be used for this purpose.
The display device used in embodiments of the disclosed technology is a display capable of changing an exhibited image and is a screen in two dimensions (e.g., a flat screen, CRT screen, LCD screen, plasma screen, etc.).
Embodiments of the disclosed technology will become clearer in view of the description of the following figures.
In embodiments of the disclosed technology, the relationship between device rotation and image rotation is linear or non-linear. In a linear relationship, the rotation may be made to feel natural; that is, a person looks around the object by tilting the screen. As the screen is tilted, so too is the image rotated correspondingly, whether by the same number of degrees or by a multiple of that amount. That is, by way of example, when the screen is rotated 30, 45, and 60 degrees, the person shown in the image is rotated 30, 45, and 60 degrees in a one-to-one linear correspondence, or 60, 90, and 120 degrees in a two-to-one linear correspondence, or −30, −45, and −60 degrees in a negative-one correspondence. In a non-linear example, when the device/screen is rotated 30, 45, and 60 degrees, the person shown may be rotated 30, 90, and 180 degrees, respectively.
Referring first to
The rotations depicted by arrows 250 and 270 described above with reference to
Using a combination of the methods and devices described with regard to
Referring now to the first case described above, that is, a changing human figure over time, irrespective of the orientation of the device, this may be a series of sequential images or a video. In the example shown, the human figure's arm is in a down position 570 at a first time, shown in
In the second case described above, the change in position of the human figure may be as a result of a change in orientation of the display device. In
In a combination of the two cases described above, the video or images could be continuously changing as time progresses, but the set of video images shown at every particular orientation of the device would be relative to each particular orientation. For example, in
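The combined case above amounts to indexing stored frames along two axes at once: elapsed time and device orientation. A sketch under assumed parameters (a 2-D frame table, a frame rate, and a degrees-per-view step, none of which are specified by the disclosure) follows.

```python
def combined_frame(frames_by_time_and_angle, elapsed_seconds, tilt_degrees,
                   fps=24, degrees_per_view=10):
    """Select a frame from a 2-D table indexed by (time step, viewpoint),
    so the figure both animates over time and responds to device tilt."""
    n_times = len(frames_by_time_and_angle)
    n_views = len(frames_by_time_and_angle[0])
    t = int(elapsed_seconds * fps) % n_times      # time-driven motion
    v = int(round(tilt_degrees / degrees_per_view)) % n_views  # viewpoint
    return frames_by_time_and_angle[t][v]
```

Holding the device still advances only the time index, while tilting it mid-playback jumps to the same moment of the sequence seen from a different angle.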
In step 930, which can be carried out before, after, or both before and after step 960, orientation data is received in a continuous manner from an orientation sensor, such as an accelerometer or a compass or another instrument for measuring orientation of, or relative to, a pre-existing magnetic field. From this data, in step 950, it is determined how the display device of embodiments of the disclosed technology is moving or being reoriented, whether tilting up/down, tilting left/right, being rotated left/right, or being reoriented towards a different cardinal direction. To supplement this, in embodiments of the disclosed technology, in step 940, orientation data is further received from a secondary orientation sensor, which could also be an accelerometer or compass. Then, in step 960, an image of a human figure is exhibited which is either a pre-defined first image or an image based on the orientation or position of the display device. Steps 930 and 940 continue to be carried out, and in step 970, upon receiving further movement data from the accelerometer (such as movement past a predefined threshold) or new orientation data from a compass, the image is changed. The image may additionally change over time, such as with a video or sequence, in step 980. A change in time and a change in orientation would produce a combined change, in embodiments of the disclosed technology. Thus, over time, the human figure moves irrespective of a change in viewpoint (orientation) of the exhibited human figure. A change in viewpoint may also occur, as determined by received motion data providing movement data with regard to the display device.
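The control flow of steps 930 through 980 can be sketched as a polling loop. The sensor callbacks, the display callback, and the fixed threshold are hypothetical placeholders standing in for the device's actual hardware interfaces, not the patented implementation.

```python
def run_display_loop(read_orientation, read_secondary, show, frames,
                     threshold_degrees=5.0, steps=100):
    """Sketch of steps 930-980: poll orientation sensors, exhibit an
    initial image, and change the displayed frame when movement past a
    predefined threshold is detected (callbacks are hypothetical)."""
    index = 0
    show(frames[index])                 # step 960: exhibit first image
    last_angle = read_orientation()     # step 930: primary sensor
    for _ in range(steps):
        angle = read_orientation()      # step 930 continues to be carried out
        _ = read_secondary()            # step 940: secondary orientation sensor
        if abs(angle - last_angle) >= threshold_degrees:  # step 970
            index = (index + 1) % len(frames)
            show(frames[index])         # change the exhibited image
            last_angle = angle
    return index
```

A time-driven change (step 980) would slot into the same loop as a second trigger alongside the threshold test, producing the combined change described above.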
The data storage apparatus 1030 may be magnetic media (e.g., hard disk, video cassette), optical media (e.g., Blu-Ray or DVD) or another type of storage mechanism known in the art. The data storage apparatus 1030 or the non-volatile memory 1020 stores data which is sent via bus 1070 to the video output 1060.
A datum received from an accelerometer or compass 1090 is processed by the central processing unit 1040 to determine if a change in viewpoint or orientation has been made. The displayed image, as described above, is outputted via a video output 1060, that is, a transmitter or video relay device which transmits video to a television screen, monitor, or other display device 1080 via cable or data bus 1065. The video output 1060 may also be an output over a packet-switched network 1065 such as the internet, where it is received and interpreted as video data by a recipient display 1080. The recipient display may be a liquid crystal display, cathode ray tube, or series of light-emitting diodes, or any other known display system.
An input/output device 1050, such as buttons on the device itself, an infrared signal receiver for use with a remote control, or a network input/output for control via a local or wide area network, receives and/or sends a signal via data pathway 1055 (e.g., infrared signal, signal over copper or fiber cable, wireless network, etc.). The input/output device, in embodiments of the disclosed technology, receives input from a user, such as which image to display and how to interact with a detected object.
While the disclosed technology has been taught with specific reference to the above embodiments, a person having ordinary skill in the art will recognize that changes can be made in form and detail without departing from the spirit and the scope of the disclosed technology. The described embodiments are to be considered in all respects only as illustrative and not restrictive. All changes that come within the meaning and range of equivalency of the claims are to be embraced within their scope. Combinations of any of the methods, systems, and devices described hereinabove are also contemplated and within the scope of the disclosed technology.
Claims
1-46. (canceled)
47. A method of changing an image of an object displayed on a portable device having a processor, and having a display screen forming a plane, comprising, with the processor of the portable device:
- (a) at a first time, causing the display screen to display a first image depicting an object in an initial orientation; and
- (b) at a second time, later than the first time, based on a change between the first time and the second time in a reference angle formed between the plane of the display screen and a select direction, which reference angle is determined from data produced by a sensor of the portable device, causing the display screen to display a second image depicting at least a portion of the object in a modified orientation different from the initial orientation.
48. The method of claim 47 wherein the first image and second image comprise views of the object at different moments while said at least a portion of said object is in motion, and wherein part (b) comprises, at the second time, causing the display screen to display said at least a portion of the object in the modified orientation as a result of said motion of said at least a portion of the object.
49. The method of claim 47 wherein the second image depicts the entire object displayed in the modified orientation.
50. The method of claim 49 wherein the first image and second image comprise views of the object at different moments while said at least a portion of said object is in motion, and wherein part (b) comprises, at the second time, causing the display screen to display said at least a portion of the object in the modified orientation, further as a result of said motion of said at least a portion of the object.
51. The method of claim 47 wherein the display screen has edges and further comprising causing the display screen to display the first image and the second image while maintaining an alignment of the object with respect to the edges of the display screen between the first time and the second time.
52. The method of claim 47 wherein the display screen has edges and further comprising causing the display screen to display the first image and the second image with an alignment of the object that is rotated with respect to the edges of the display screen between the first time and the second time, based on a rotation of the portable device in the plane of the display screen, which rotation is determined from data produced by a supplemental sensor of the portable device.
53. The method of claim 47 wherein the select direction is gravitational down and wherein the sensor is an accelerometer.
54. The method of claim 47 wherein the select direction is magnetic north and wherein the sensor is a compass.
55. The method of claim 47 wherein part (a) comprises causing the display screen to display a first image of a series of images, which first image depicts the object in the initial orientation, and part (b) comprises causing the display screen to display a second image of the series of images, which second image depicts the object in the modified orientation.
56. The method of claim 47 wherein the first image and the second image represent different views of the object from different perspectives aimed at a center of gravity of the object.
57. The method of claim 47 wherein the object is a human figure.
58. The method of claim 47 wherein the object is a plurality of bodies in a scene.
59. The method of claim 47 wherein the modified orientation and the initial orientation differ by a rotation angle and the rotation angle is not linearly proportional to the change in the reference angle.
60. The method of claim 47 further comprising, at a third time, later than the second time, based on a change between the second time and the third time in the reference angle, causing the display screen to display a third image depicting said at least a portion of the object in a further modified orientation different from both the initial orientation and the modified orientation.
61. The method of claim 60 further comprising, at a fourth time, later than the third time, based on data indicating that a reference angle at the fourth time matches the reference angle at the first time, causing the display screen to display the first image.
62. A portable display device comprising:
- (a) a display screen forming a plane;
- (b) a sensor;
- (c) a storage device containing image data of an object;
- (d) a computer processor responsive to the sensor, coupled to the storage device, and controlling the display screen, which computer processor:
- (i) determines, from data produced by the sensor, a reference angle formed between the plane of the display screen and a select direction;
- (ii) at a first time, causes the display screen to display a first image depicting an object in an initial orientation; and
- (iii) at a second time, later than the first time, based on a change between the first time and the second time in the reference angle, causes the display screen to display a second image depicting at least a portion of the object in a modified orientation different from the initial orientation.
63. The display device of claim 62 wherein the sensor is an accelerometer and the select direction is gravitational down.
64. The display device of claim 62 wherein the sensor is a compass and the select direction is magnetic north.
65. The display device of claim 62 wherein the image data of the object in the storage device comprises a series of images representing different views of the object rotated through different rotation angles.
66. The display device of claim 62 wherein the display screen has edges, further comprising a supplemental sensor, and wherein the computer processor (i) is responsive to the supplemental sensor, (ii) determines, from data produced by the supplemental sensor, a rotation of the portable device in the plane of the display screen, and (iii) causes the display screen to display the first image and the second image with the object aligned differently with respect to the edges of the display screen between the first time and the second time, which alignment difference is based on the rotation.
67. The display device of claim 66 wherein the sensor is an accelerometer and the supplemental sensor is a compass.
Type: Application
Filed: Mar 30, 2015
Publication Date: Jul 23, 2015
Applicant: Celsia, LLC (San Jose, CA)
Inventor: Barry Lee Petersen (Castle Rock, CO)
Application Number: 14/672,856