Modifying a Simulated Reality Display Based on Object Detection
Various implementations modify a simulated display based upon object detection. A simulated reality device displays at least some computer-generated graphics on a display. In some implementations, the computer-generated graphics are visually overlaid on a real world scene to provide augmented information. Alternately or additionally, the computer-generated graphics visually replace a user's view of the real world scene to provide a virtual reality. While displaying the computer-generated graphics, the simulated reality device detects a real world object, such as by detecting the real world object is within a predetermined distance of the simulated reality device and/or by detecting the real world object is moving in one or more directions. Upon detecting the real world object, various implementations of the simulated reality device modify the computer-generated graphics to display a notification of the real world object.
Simulated reality systems provide users with a reality experience that differs from the real world. For example, a virtual reality (VR) system simulates a fictitious world such that the user experiences tactile feedback, noises, and/or visual feedback corresponding to the fictitious world. As another example, an augmented reality (AR) system overlays additional information, data, and/or images over an image capture and/or view of the real world to simulate a reality that visually displays computer-generated information integrated with views of the real world. Oftentimes, users become so engrossed in viewing simulated reality content that they become distracted from observing real world events and/or objects, thus creating a potential hazard to the user and/or those around them.
While the appended claims set forth the features of the present techniques with particularity, these techniques, together with their objects and advantages, may be best understood from the following detailed description taken in conjunction with the accompanying drawings of which:
Turning to the drawings, wherein like reference numerals refer to like elements, techniques of the present disclosure are illustrated as being implemented in a suitable environment. The following description is based on embodiments of the claims and should not be taken as limiting the claims with regard to alternative embodiments that are not explicitly described herein.
Various implementations modify a simulated display based upon object detection. A simulated reality device displays at least some computer-generated graphics on a display, such as by visually overlaying the computer-generated graphics on a real world scene to provide augmented information and/or visually replacing the real world scene with the computer-generated graphics to provide a virtual reality. While displaying the computer-generated graphics, various implementations of the simulated reality device detect a real world object, and modify the computer-generated graphics to provide a visual notification of the real world object.
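The display-detect-modify loop described above can be sketched as a per-frame pipeline. This is a minimal illustration, not the disclosed implementation; the function names (`render_scene`, `poll_sensors`, `add_object_notification`) and the frame representation are hypothetical stand-ins.

```python
# Hypothetical sketch of the display-detect-modify loop: render the
# simulated reality, check the sensors, and overlay a notification
# only when a real world object has been detected.

def render_scene(scene):
    """Produce the computer-generated graphics for one frame."""
    return {"layers": ["virtual:" + scene]}

def add_object_notification(frame, detected):
    """Modify the frame to notify the user of a detected real world object."""
    frame = dict(frame, layers=list(frame["layers"]))
    frame["layers"].append("notification:" + detected)
    return frame

def simulate_frame(scene, poll_sensors):
    """One pass: draw the simulated reality, then overlay any object alert."""
    frame = render_scene(scene)
    detected = poll_sensors()  # e.g., returns "pedestrian", "car", or None
    if detected is not None:
        frame = add_object_notification(frame, detected)
    return frame
```

The key design point is that the notification is composited on top of, rather than replacing, the computer-generated graphics, so the simulated reality experience continues while the user is alerted.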
Consider now an example environment in which various aspects as described herein can be employed.
Example Environment

Generally, virtual reality can be viewed as a computer-generated simulated environment in which a user has an apparent physical presence. Accordingly, when implemented as a virtual reality device, simulated reality device 102 provides the user with an environment that can be viewed with a head-mounted display, such as glasses or other wearable display device that has near-eye display panels as lenses, to display a virtual reality environment that visually replaces a user's view of the actual environment. When implemented as an augmented reality device, simulated reality device 102 provides a visual display that a user can see through to view the surrounding environment (e.g., the real world), and also see images of computer-generated virtual objects, such as holograms, directions, speed information, device status information, etc., that appear as a part of the environment. Augmented reality can include any type of virtual images or graphics that enhance or augment the environment that a user experiences.
To provide the user with object notification, simulated reality device 102 includes display device 104, content generation module 106, one or more sensor(s) 108, and object notification module 110.
Display device 104 represents a display device that displays images and/or information to the user. For example, display device 104 can display video of a scene occurring in the real world, stored video(s) captured at previous moment(s) in time, and/or computer-generated video and/or images. In one or more implementations, the content displayed by display device 104 is generated and/or managed by content generation module 106. Alternately or additionally, display device 104 represents a surface on which images can be projected. For instance, in the case of augmented reality, some implementations of display device 104 represent a see-through display through which the user can view a surrounding environment and a display on which computer-generated images can be projected.
Content generation module 106 represents functionality that generates and/or drives the content displayed by, or projected on, display device 104. For example, some implementations of content generation module 106 drive display device 104 with real-time video generated by a camera. Alternately or additionally, content generation module 106 can augment the real-time video with additional information, such as holograms, icons, location names, travel directions, and so forth. For instance, content generation module 106 can analyze a current scene being captured, generate information about the scene, and generate images that are overlaid on top of the video being displayed by display device 104. In some implementations, the overlaid information can include status and/or state information associated with simulated reality device 102 (e.g., battery level, unread messages, incoming communication notifications, etc.). As another example, content generation module 106 can generate images projected onto a surface as further described herein. In one or more implementations, content generation module 106 drives display device 104 with computer-generated graphics, such as a computer-generated scene corresponding to a virtual reality experience.
While illustrated as a single module, it is to be appreciated that content generation module 106 can include any number of modules that interact with one another to provide content to display device 104. For example, content generation module 106 can include a virtual reality-based video game module that provides a virtual reality experience, and an augmentation module that augments the virtual reality experience with device information. Alternately or additionally, content generation module 106 can include a video module that streams images of a scene being captured in real-time to display device 104, a location module that analyzes and gathers information about the scene, and/or an augmentation module that augments the images of the scene with information generated by the location module.
Sensors 108 represent sensors used by simulated reality device 102 to detect the presence of a real world object. For example, sensors 108 can include a camera, a proximity sensor, light detection sensor(s), microphone(s), motion sensor(s), a Global Positioning System (GPS) sensor, and so forth. Here, the term "presence" is used to signify any suitable type of characteristic that can be determined by a sensor, such as size, distance, velocity, shape, presence, lack of presence, and so forth. For example, sensors 108 can be used to determine whether an object resides within a predetermined perimeter around, and/or distance from, simulated reality device 102, whether no objects reside within the predetermined perimeter, whether an identified object is moving towards or away from the simulated reality device, and so forth. In some implementations, sensors 108 are communicatively coupled to object notification module 110 to provide object notification in a simulated reality as further described herein.
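One simple reading of "resides within a predetermined perimeter" is a circular boundary at a fixed distance from the device; the disclosure does not fix the perimeter's shape, so the following is only an illustrative geometric check with hypothetical names.

```python
import math

def within_perimeter(device_xy, object_xy, perimeter_m):
    """Return True when an object lies at or inside a predetermined
    circular perimeter around the device.  Positions are (x, y) pairs
    in meters; perimeter_m is the boundary radius.  This is one simple
    realization of the perimeter test described in the text."""
    dx = object_xy[0] - device_xy[0]
    dy = object_xy[1] - device_xy[1]
    return math.hypot(dx, dy) <= perimeter_m
```

A real sensor suite would feed estimated object positions (from a camera, proximity sensor, etc.) into a check like this each time new sensing data arrives.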
Object notification module 110 represents functionality that identifies when to modify content being displayed (by way of content generation module 106) via display device 104, and subsequently modifies the content being displayed to notify a user of an identified object. For instance, object notification module 110 can receive information gathered and/or generated by sensors 108, and analyze the information to determine whether an object has moved within a predetermined perimeter. Upon determining that an object has moved within the predetermined perimeter, object notification module 110 can generate display data used to modify the simulated reality content being displayed by, or projected on, display device 104. Alternately or additionally, object notification module 110 can modify existing simulated reality content being displayed by, or projected on, display device 104. In some implementations, object notification module 110 works in concert with content generation module 106 to display an object notification and/or modify existing content on display device 104 to indicate the presence of an object, examples of which are further provided herein.
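The interplay between the sensors, the notification logic, and the content generator can be sketched as two cooperating objects. The class names and method signatures below are hypothetical analogues of object notification module 110 and content generation module 106, not APIs from the disclosure.

```python
class RecordingContentGenerator:
    """Toy stand-in for content generation module 106: tracks which
    object notifications are currently visible on the display."""

    def __init__(self):
        self.visible = set()

    def show_notification(self, object_id):
        self.visible.add(object_id)

    def clear_notification(self, object_id):
        self.visible.discard(object_id)


class ObjectNotificationModule:
    """Toy stand-in for object notification module 110: consumes sensor
    samples and tells the content generator when to modify the display."""

    def __init__(self, content_generator, perimeter_m=2.0):
        self.content_generator = content_generator
        self.perimeter_m = perimeter_m

    def on_sensor_sample(self, object_id, distance_m):
        """Called with each distance reading from the sensors."""
        if distance_m <= self.perimeter_m:
            self.content_generator.show_notification(object_id)
        else:
            self.content_generator.clear_notification(object_id)
```

Splitting the decision ("should the display change?") from the rendering ("how does it change?") mirrors the division of labor the passage describes between the two modules.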
Computer-readable media 204 includes content generation module 106 and object notification module 110.
Simulated reality device 102 also includes haptic feedback component(s) 210 and audio output module 212. Haptic feedback component(s) 210 deliver tactile interactions to a user. For example, when an object of interest has been detected, some implementations use haptic feedback components 210 to deliver a physical notification and/or physical feedback to the user, such as a vibration or motion. Tactile feedback notifications can be in addition to, or alternately in place of, visual notifications associated with object detection as further described herein.
Audio output module 212 represents any suitable component that can be used to deliver audio to a user, such as a speaker, an earphone port, wireless audio transmission, etc. Upon detecting an object of interest, various implementations generate an audible notification (e.g., a beep, audible words, music, etc.). As in the case of haptic feedback components 210, audible notifications can be provided to the user in addition to, or alternately in place of, visual notifications and/or haptic notifications, to announce the presence of a detected object. Accordingly, any combination of audible, visual, and/or tactile notifications can be utilized.
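Since any combination of visual, audible, and tactile notifications can be used, a device might plan its output channels from a simple configuration. The channel names and payloads below are illustrative assumptions, not part of the disclosure.

```python
def plan_notifications(config, detected_object):
    """Given a device configuration, choose which notification types to
    emit for a detected object.  Visual notification defaults to on,
    matching the emphasis of the text; audio and haptic channels are
    optional additions or replacements."""
    plan = []
    if config.get("visual", True):      # e.g., an overlay on the display
        plan.append(("display", "overlay:" + detected_object))
    if config.get("audio", False):      # e.g., a beep or spoken words
        plan.append(("speaker", "beep"))
    if config.get("haptic", False):     # e.g., a vibration or motion
        plan.append(("vibrator", "pulse"))
    return plan
```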
Having described an example operating environment in which various aspects of object notification in a simulated reality device can be utilized, consider now visually modifying simulated reality content based on object detection in accordance with one or more implementations.
Visually Modifying Simulated Reality Content Based on Object Detection

Various implementations of simulated reality devices display computer-generated graphics, such as by generating content overlaid on an existing scene of the real world (e.g., overlaid on a video capture, projected onto a surface) or generating content that visually replaces a user's view of the real world. In turn, the user viewing these graphics sometimes becomes so engrossed in the experience that they become less aware of the real world. For instance, a VR system can provide audio output and/or tactile output that is synchronized with the computer-generated graphics such that the virtual world becomes a realistic experience to the user. Similarly, an AR system can provide graphics that engross the user, such as an animated cartoon character that interacts with various aspects of the underlying real world scene, information bubbles that include contextual data about various points of interest associated with the real world scene, device information, and so forth. While the computer-generated graphics can be entertaining to the user, these graphics can also sometimes put the user at risk.
While the various forms of information displayed in augmented display 304 can be helpful to the user, the additional information can distract the user from potential hazards.
Environment 400 also includes object 404 that represents any suitable type of real world object, such as a car, a person, a bicycle, an animal, a fixed structure, and so forth.
Sensors 108 can send out and/or receive sensing information 406, indicated here as outgoing and incoming information signals. For example, some implementations of sensors 108 receive incoming light to a camera lens to capture images of environment 400. In turn, the captured images can be analyzed to identify when an object of interest becomes a potential hazard. In at least some implementations, sensors 108 send out probing signals, such as a proximity sensor emitting a beam of electromagnetic radiation, and analyze return fields and/or signals for changes that indicate an object of interest has become a potential hazard. As another example, a passive motion sensor can receive and/or detect emitted infrared energy in the form of heat to identify when motion occurs. Thus, sensors 108 can passively receive sensing information 406 and/or actively send out probing signals to generate sensing information 406.
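For the active-probing case, a common way to turn a returned signal into a distance is time-of-flight: the probe travels out and back, so the one-way distance is half the round trip times the propagation speed. This is a general sketch of that calculation, not a claim about which ranging method the disclosure uses.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # for electromagnetic probes

def distance_from_round_trip(round_trip_s, propagation_speed=SPEED_OF_LIGHT_M_S):
    """Estimate one-way distance to an object from a probe signal's
    round-trip time.  Works for any probe with a known propagation
    speed: light for a laser rangefinder, roughly 343 m/s in air for
    an ultrasonic sensor."""
    return propagation_speed * round_trip_s / 2.0
```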
By modifying a simulated reality display based on object detection, a simulated reality device can provide the user with a safer viewing environment relative to an unmodified simulated reality display. The simulated reality device scans a surrounding environment for real world objects that pose a potential hazard to a user. In turn, when a real world object poses a potential hazard, the simulated reality device can visually alert the user of the hazard, such as by displaying the real world object in the foreground of the simulated reality display. Since users engage visually to experience a simulated reality, the visual notification is more likely to be observed by the user relative to other notification mechanisms. This allows the user to become immersed in a simulated reality with the added security of knowing the simulated reality device will alert the user of impending hazards.
At 802, a simulated reality device displays computer-generated graphics associated with a simulated reality. In some implementations, the simulated reality device generates virtual reality graphics, and visually replaces a user's view of the real world with the virtual reality graphics, such as by displaying the virtual reality graphics on a display device. In other implementations, the simulated reality device generates augmented information, and visually displays the augmented information as an overlay on a scene of the real world. This can include overlaying the augmented information over a video and/or image generated by a camera, projecting the augmented information onto a window and/or view of the real world, and so forth.
At 804, the simulated reality device detects a real world object of interest. This can be achieved in any suitable manner. The simulated reality device can use one sensor to detect the real world object, or multiple sensors in combination to detect the real world object. Sensors can be integrated into the simulated reality device and/or the sensors can be external and electronically coupled to the simulated reality device. Some implementations of the simulated reality device receive triggers, events, and/or notifications from the sensors that indicate an object has been detected. Alternately or additionally, some implementations of the simulated reality device receive information gathered by the sensors, and analyze the information to detect when the real world object may pose a hazard to a user. Detecting a real world object can include detecting the presence of an object, a size of the object, a shape of the object, a direction of movement of the object, a velocity of the object, and so forth.
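Detecting a direction of movement and a velocity, as described at 804, can be realized by differencing successive timestamped sensor readings. The sketch below assumes the sensors report (timestamp, distance) samples; those assumptions, and the function name, are illustrative only.

```python
def closing_speed(samples):
    """Estimate how fast an object approaches the device from a list of
    (timestamp_s, distance_m) sensor samples, using the two most recent
    readings.  A positive result means the object is getting closer;
    negative means it is moving away.  Fewer than two samples (or two
    samples at the same instant) yield 0.0."""
    if len(samples) < 2:
        return 0.0
    (t0, d0), (t1, d1) = samples[-2], samples[-1]
    if t1 == t0:
        return 0.0
    return (d0 - d1) / (t1 - t0)  # meters per second toward the device
```

A device might then treat any object whose closing speed exceeds a threshold as a potential hazard, even if it is still outside the predetermined perimeter.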
At 806, the simulated reality device visually modifies the computer-generated graphics based on detecting the real world object. Some implementations capture an image and/or video of the detected object, and overlay the image and/or video on a portion of the computer-generated graphics. This can include visually locating the image of the detected object at a particular location to indicate a real world location of the detected object relative to the simulated reality device. Other implementations remove some or all of the computer-generated graphics, such as computer-generated graphics that are visually located in a same region as the detected object. Alternately or additionally, the simulated reality device can generate new graphics to display, such as a highlight notation, shading notation, animations, and so forth, that can be used to highlight the detected object. Alternately or additionally, some implementations provide audible and/or tactile notifications to convey an object has been detected.
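Placing the overlay so that it "indicates a real world location of the detected object" can be done by mapping the object's horizontal bearing into screen coordinates. The field-of-view model and parameter values below are assumptions for illustration, not values from the disclosure.

```python
def overlay_position(bearing_deg, fov_deg=90.0, screen_width_px=1920):
    """Map a detected object's horizontal bearing (0 = straight ahead,
    negative = to the user's left) to an x coordinate on the display,
    assuming a symmetric horizontal field of view.  Bearings outside
    the field of view are clamped to the nearest screen edge."""
    half_fov = fov_deg / 2.0
    clamped = max(-half_fov, min(half_fov, bearing_deg))
    fraction = (clamped + half_fov) / fov_deg  # 0.0 = left edge, 1.0 = right
    return int(round(fraction * (screen_width_px - 1)))
```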
Having described examples of visually modifying simulated reality graphics based on object detection, consider now a discussion of an example device that can be used for various implementations.
Example Device

Electronic device 900 includes communication transceivers 902 that enable wired or wireless communication of device data 904, such as received data and transmitted data. While referred to as a transceiver, it is to be appreciated that communication transceivers 902 can additionally include separate transmit antennas and receive antennas without departing from the scope of the claimed subject matter. Example communication transceivers include Wireless Personal Area Network (WPAN) radios compliant with various Institute of Electrical and Electronics Engineers (IEEE) 802.15 (Bluetooth™) standards, Wireless Local Area Network (WLAN) radios compliant with any of the various IEEE 802.11 (WiFi™) standards, Wireless Wide Area Network (WWAN) radios for cellular telephony (3GPP-compliant), wireless metropolitan area network radios compliant with various IEEE 802.16 (WiMAX™) standards, and wired Local Area Network (LAN) Ethernet transceivers.
Electronic device 900 may also include one or more data-input ports 906 via which any type of data, media content, and inputs can be received, such as user-selectable inputs, messages, music, television content, recorded video content, and any other type of audio, video, or image data received from any content or data source. Data-input ports 906 may include Universal Serial Bus (USB) ports, coaxial-cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, Digital Versatile Discs (DVDs), Compact Disks (CDs), and the like. These data-input ports may be used to couple the electronic device to components, peripherals, or accessories such as keyboards, microphones, or cameras.
Electronic device 900 of this example includes processor system 908 (e.g., any of application processors, microprocessors, digital-signal processors, controllers, and the like) or a processor and memory system (e.g., implemented in a system-on-chip), which processes computer-executable instructions to control operation of the device. A processing system may be implemented at least partially in hardware, which can include components of an integrated circuit or on-chip system, digital-signal processor, application-specific integrated circuit, field-programmable gate array, a complex programmable logic device, and other implementations in silicon and other hardware. Alternatively, or in addition, the electronic device can be implemented with any one or combination of software, hardware, firmware, or fixed-logic circuitry that is implemented in connection with processing and control circuits, which are generally identified as processing and control 910. Although not shown, electronic device 900 can include a system bus, crossbar, interlink, or data-transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, data protocol/format converter, a peripheral bus, a universal serial bus, a processor bus, or local bus that utilizes any of a variety of bus architectures.
Electronic device 900 also includes one or more memory devices 912 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. Memory devices 912 are implemented at least in part as a physical device that stores information (e.g., digital or analog values) in storage media, which does not include propagating signals or waveforms. The storage media may be implemented as any suitable types of media such as electronic, magnetic, optic, mechanical, quantum, atomic, and so on. Memory devices 912 provide data storage mechanisms to store the device data 904, other types of information or data, and various device applications 914 (e.g., software applications). For example, operating system 916 can be maintained as software instructions within memory devices 912 and executed by processor system 908.
In some aspects, memory devices 912 includes content generation module 918 and object notification module 920. While these modules are illustrated and described as residing within memory devices 912, other implementations of these modules can alternately or additionally include software, firmware, hardware, or any combination thereof.
Content generation module(s) 918 generate display content that can be used to provide a simulated reality display. This can include any combination of modules used to generate simulated reality content, such as a virtual reality gaming application, an augmented navigation module, an augmented hologram module, and so forth.
Object notification module 920 determines when to visually modify the simulated reality display based on object detection, and generates images and/or graphics used to modify the simulated reality display. This can include generating captured images of the detected objects and/or generating highlighting graphics as further described herein. In some implementations, object notification module 920 interfaces with sensor(s) 922 to identify objects and/or to determine when to modify the simulated reality display. Alternately or additionally, object notification module 920 interfaces with content generation module(s) 918 to drive the display of the modified content.
Electronic device 900 includes sensor(s) 922 that can be used to detect a real world object. Alternately or additionally, electronic device 900 can electronically couple to external sensors as further described herein. In some implementations, sensor(s) 922 provide information to object notification module 920 that is subsequently analyzed to determine the presence of a real world object. Alternately or additionally, sensor(s) 922 can identify the presence of the real world object, and send object notification module 920 a communication that indicates the presence of the real world object.
Electronic device 900 also includes haptic feedback component(s) 924 to deliver tactile experiences to the user, such as a vibration or motion. As further described herein, various embodiments provide the user with these tactile experiences to announce the presence of a detected object. For example, object notification module 920 can interface with haptic feedback component(s) 924 when an object has been detected to initiate a vibration, motion, etc.
Electronic device 900 also includes audio and video processing system 926 that processes audio and video data and passes it through to audio system 928 and display system 930. Audio system 928 and display system 930 may include any modules that process, display, or otherwise render audio, video, display, or image data. Display data and audio signals can be communicated to an audio component and to a display component via a radio-frequency link, S-video link, HDMI, composite-video link, component-video link, digital video interface, analog-audio connection, or other similar communication link, such as media-data port 932. In some implementations, audio system 928 and display system 930 are external components to electronic device 900. Alternatively, or additionally, audio system 928 and/or display system 930 can be an integrated component of the example electronic device 900, such as part of an integrated speaker and/or an integrated display and touch interface. In some implementations, object notification module 920 interfaces with audio system 928 and/or display system 930 to deliver an audio alert to the user when an object has been detected as further described herein.
In view of the many possible aspects to which the principles of the present discussion may be applied, it should be recognized that the implementations described herein with respect to the drawing figures are meant to be illustrative only and should not be taken as limiting the scope of the claims. Therefore, the techniques as described herein contemplate all such implementations as may come within the scope of the following claims and equivalents thereof.
Claims
1. A device comprising:
- one or more sensors;
- one or more processors; and
- one or more computer-readable storage memories comprising processor-executable instructions which, responsive to execution by the one or more processors, enable the device to perform operations comprising: displaying, on a display device, computer-generated graphics associated with a simulated reality; detecting, using the one or more sensors, a real world object of interest; and visually modifying the computer-generated graphics, on the display device, to display a notification that the real world object of interest has been detected.
2. The device as recited in claim 1, wherein said displaying the computer-generated graphics associated with the simulated reality further comprises displaying augmented information associated with an augmented reality.
3. The device as recited in claim 2, wherein said displaying the computer-generated graphics further comprises projecting the augmented information over a view of the real world.
4. The device as recited in claim 2, wherein said visually modifying the computer-generated graphics further comprises:
- identifying a region around the object of interest;
- identifying augmented information that is in contact with the region; and
- visually removing the augmented information that is in contact with the region.
5. The device as recited in claim 2, wherein said visually modifying the computer-generated graphics further comprises:
- identifying a region around the object of interest; and
- adding a highlight notation around the region.
6. The device as recited in claim 1, wherein said visually modifying the computer-generated graphics to display the notification further comprises:
- capturing one or more images of the real world object; and
- overlaying the one or more images of the real world object over a virtual reality display.
7. The device as recited in claim 6, wherein said visually modifying the computer-generated graphics further comprises:
- visually displaying emphasis shading in a region around the one or more images.
8. The device as recited in claim 6, wherein said overlaying the one or more images of the real world object over the virtual reality display further comprises overlaying the one or more images at a location over the virtual reality display that indicates a position of the real world object relative to the device.
9. A computer-implemented method comprising:
- displaying, using a simulated reality device, computer-generated graphics associated with a simulated reality;
- detecting, using one or more hardware sensors, a real world object of interest that is located at or within a predetermined distance from the simulated reality device; and
- visually modifying the computer-generated graphics to display a notification associated with detecting the real world object of interest.
10. The computer-implemented method as recited in claim 9, wherein said detecting the real world object of interest further comprises:
- predefining a region around the simulated reality device using the predetermined distance to define a boundary of the region;
- using the one or more hardware sensors to identify a location of the real world object of interest; and
- determining the location of the real world object resides at or within the region.
11. The computer-implemented method as recited in claim 9, wherein said using the one or more hardware sensors further comprises using an external sensor that is communicatively coupled to the simulated reality device.
12. The computer-implemented method as recited in claim 11, wherein the external sensor comprises a camera.
13. The computer-implemented method as recited in claim 9, wherein said displaying the computer-generated graphics associated with the simulated reality further comprises displaying a virtual reality display associated with a virtual reality gaming application.
14. The computer-implemented method as recited in claim 13, wherein said visually modifying the computer-generated graphics to display the notification further comprises visually overlaying one or more images of the real world object on top of the virtual reality display associated with the virtual reality gaming application.
15. The computer-implemented method as recited in claim 14, wherein said visually modifying the computer-generated graphics to display the notification further comprises displaying a highlight notation around the one or more images of the real world object.
16. The computer-implemented method as recited in claim 9, wherein said visually modifying the computer-generated graphics further comprises:
- identifying augmented information in an augmented reality display to modify; and
- visually modifying the augmented information to a semi-translucent state.
17. A simulated reality device comprising:
- a camera;
- one or more processors; and
- one or more computer-readable storage memories comprising processor-executable instructions which, responsive to execution by the one or more processors, enable the simulated reality device to perform operations comprising: displaying computer-generated graphics associated with a simulated reality display; detecting a real world object of interest using one or more sensors; capturing one or more images of the real world object of interest using the camera; and visually modifying the computer-generated graphics to display the one or more images of the real world object of interest in a foreground of the simulated reality display.
18. The simulated reality device as recited in claim 17 further comprising:
- one or more haptic feedback components, and
- wherein said operations further comprise delivering physical feedback using the one or more haptic feedback components based on said detecting the real world object of interest.
19. The simulated reality device as recited in claim 17, wherein said displaying the computer-generated graphics associated with the simulated reality display further comprises overlaying augmented information on a scene of the real world captured by the camera to generate an augmented reality display.
20. The simulated reality device as recited in claim 17, wherein said displaying the computer-generated graphics associated with the simulated reality further comprises displaying computer-generated graphics associated with a virtual reality.
Type: Application
Filed: Nov 9, 2017
Publication Date: May 9, 2019
Applicant: Motorola Mobility LLC (Chicago, IL)
Inventor: Jun-Ki Min (Chicago, IL)
Application Number: 15/808,755