MOBILE DEVICE ACCESSORY FOR THREE-DIMENSIONAL SCANNING

A variety of techniques are disclosed for enabling three-dimensional scanning with mobile devices. In general, an accessory provides additional optics, image sensors, lighting and/or processing to complement preexisting device hardware in support of a variety of three-dimensional imaging techniques.

RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/736,210 filed on Jan. 8, 2013, which claims the benefit of U.S. Prov. App. No. 61/680,989 filed on Aug. 8, 2012 and U.S. Prov. App. No. 61/719,874 filed on Oct. 29, 2012. The entire content of these applications is incorporated herein by reference.

This application is related to U.S. application Ser. No. 13/314,337 filed on Dec. 8, 2011, the entire content of which is hereby incorporated by reference.

BACKGROUND

There remains a need for accessories to support three-dimensional scanning with mobile devices such as cellular phones, tablets, and laptop computers.

SUMMARY

A variety of techniques are disclosed for enabling three-dimensional scanning with mobile devices. In general, an accessory provides additional optics, image sensors, lighting and/or processing to complement preexisting device hardware in support of a variety of three-dimensional imaging techniques.

BRIEF DESCRIPTION OF THE FIGURES

The invention and the following detailed description of certain embodiments thereof may be understood by reference to the following figures:

FIG. 1 is a block diagram of a three-dimensional printer.

FIG. 2 shows a networked three-dimensional printing environment.

FIG. 3 shows a mobile device with an accessory for three-dimensional imaging.

FIG. 4 is a functional block diagram of an accessory coupled to a mobile device.

FIG. 5 shows a three-dimensional imaging system with dual optical paths.

FIG. 6 shows a three-dimensional imaging system with dual optical paths.

FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device.

FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device.

FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device.

DETAILED DESCRIPTION

All documents mentioned herein are hereby incorporated in their entirety by reference. References to items in the singular should be understood to include items in the plural, and vice versa, unless explicitly stated otherwise or clear from the text. Grammatical conjunctions are intended to express any and all disjunctive and conjunctive combinations of conjoined clauses, sentences, words, and the like, unless otherwise stated or clear from the context. Thus the term “or” should generally be understood to mean “and/or” and so forth.

The following description emphasizes three-dimensional printers using fused deposition modeling or similar techniques where a bead of material is extruded in a layered series of two-dimensional patterns as “roads,” “paths” or the like to form a three-dimensional object from a digital model. It will be understood, however, that numerous additive fabrication techniques are known in the art including without limitation multijet printing, stereolithography, Digital Light Processor (“DLP”) three-dimensional printing, selective laser sintering, and so forth. Such techniques may benefit from the systems and methods described below, and all such printing technologies are intended to fall within the scope of this disclosure, and within the scope of terms such as “printer”, “three-dimensional printer”, “fabrication system”, and so forth, unless a more specific meaning is explicitly provided or otherwise clear from the context.

FIG. 1 is a block diagram of a three-dimensional printer. In general, the printer 100 may include a build platform 102, an extruder 106, an x-y-z positioning assembly 108, and a controller 110 that cooperate to fabricate an object 112 within a working volume 114 of the printer 100.

The build platform 102 may include a surface 116 that is rigid and substantially planar. The surface 116 may provide a fixed, dimensionally and positionally stable platform on which to build the object 112. The build platform 102 may include a thermal element 130 that controls the temperature of the build platform 102 through one or more active devices 132, such as resistive elements that convert electrical current into heat, Peltier effect devices that can create a heating or cooling effect, or any other thermoelectric heating and/or cooling devices. The thermal element 130 may be coupled in a communicating relationship with the controller 110 in order for the controller 110 to controllably impart heat to or remove heat from the surface 116 of the build platform 102.

The extruder 106 may include a chamber 122 in an interior thereof to receive a build material. The build material may, for example, include acrylonitrile butadiene styrene (“ABS”), high-density polyethylene (“HDPE”), polylactic acid (“PLA”), or any other suitable plastic, thermoplastic, or other material that can usefully be extruded to form a three-dimensional object. The extruder 106 may include an extrusion tip 124 or other opening that includes an exit port with a circular, oval, slotted or other cross-sectional profile that extrudes build material in a desired cross-sectional shape.

The extruder 106 may include a heater 126 (also referred to as a heating element) to melt thermoplastic or other meltable build materials within the chamber 122 for extrusion through an extrusion tip 124 in liquid form. While illustrated in block form, it will be understood that the heater 126 may include, e.g., coils of resistive wire wrapped about the extruder 106, one or more heating blocks with resistive elements to heat the extruder 106 with applied current, an inductive heater, or any other arrangement of heating elements suitable for creating heat within the chamber 122 sufficient to melt the build material for extrusion. The extruder 106 may also or instead include a motor 128 or the like to push the build material into the chamber 122 and/or through the extrusion tip 124.

In general operation (and by way of example rather than limitation), a build material such as ABS plastic in filament form may be fed into the chamber 122 from a spool or the like by the motor 128, melted by the heater 126, and extruded from the extrusion tip 124. By controlling a rate of the motor 128, the temperature of the heater 126, and/or other process parameters, the build material may be extruded at a controlled volumetric rate. It will be understood that a variety of techniques may also or instead be employed to deliver build material at a controlled volumetric rate, which may depend upon the type of build material, the volumetric rate desired, and any other factors. All such techniques that might be suitably adapted to delivery of build material for fabrication of a three-dimensional object are intended to fall within the scope of this disclosure.
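
By way of illustration, the relationship between filament feed and volumetric delivery described above can be sketched as follows. This is a minimal model assuming incompressible filament; the function names and example values are illustrative assumptions, not part of any particular printer firmware.

```python
import math

def volumetric_rate(filament_diameter_mm: float, feed_rate_mm_s: float) -> float:
    """Volume of build material delivered per second (mm^3/s).

    Assumes incompressible filament, so the volume pushed into the
    chamber equals the volume extruded from the tip.
    """
    cross_section_mm2 = math.pi * (filament_diameter_mm / 2) ** 2
    return cross_section_mm2 * feed_rate_mm_s

def required_feed_rate(filament_diameter_mm: float, target_mm3_s: float) -> float:
    """Filament feed speed (mm/s) needed to hit a target volumetric rate."""
    cross_section_mm2 = math.pi * (filament_diameter_mm / 2) ** 2
    return target_mm3_s / cross_section_mm2
```

For common 1.75 mm filament fed at 2 mm/s, this yields roughly 4.8 mm³/s of extruded material; a controller would adjust the motor rate to hold such a target.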

The x-y-z positioning assembly 108 may generally be adapted to three-dimensionally position the extruder 106 and the extrusion tip 124 within the working volume 114. Thus by controlling the volumetric rate of delivery for the build material and the x, y, z position of the extrusion tip 124, the object 112 may be fabricated in three dimensions by depositing successive layers of material in two-dimensional patterns derived, for example, from cross-sections of a computer model or other computerized representation of the object 112. A variety of arrangements and techniques are known in the art to achieve controlled linear movement along one or more axes. The x-y-z positioning assembly 108 may, for example, include a number of stepper motors 109 to independently control a position of the extruder 106 within the working volume along each of an x-axis, a y-axis, and a z-axis. More generally, the x-y-z positioning assembly 108 may include without limitation various combinations of stepper motors, encoded DC motors, gears, belts, pulleys, worm gears, threads, and so forth. For example, in one aspect the build platform 102 may be coupled to one or more threaded rods by a threaded nut so that the threaded rods can be rotated to provide z-axis positioning of the build platform 102 relative to the extruder 106. This arrangement may advantageously simplify design and improve accuracy by permitting an x-y positioning mechanism for the extruder 106 to be fixed relative to a build volume. Any such arrangement suitable for controllably positioning the extruder 106 within the working volume 114 may be adapted to use with the printer 100 described herein.
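
The threaded-rod z-axis arrangement described above reduces to a simple conversion from a desired platform displacement to stepper motor steps. The thread pitch and motor parameters below are illustrative defaults, not values from this disclosure.

```python
def z_steps(delta_z_mm: float, thread_pitch_mm: float = 2.0,
            steps_per_rev: int = 200, microsteps: int = 16) -> int:
    """Stepper steps needed to move the build platform by delta_z_mm.

    One full rotation of the threaded rod advances the nut (and hence
    the platform) by one thread pitch, so the step count is simply
    rotations * full steps per revolution * microstepping factor.
    """
    rotations = delta_z_mm / thread_pitch_mm
    return round(rotations * steps_per_rev * microsteps)
```

With these defaults, a 0.2 mm layer change corresponds to 320 microsteps.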

In general, this may include moving the extruder 106, or moving the build platform 102, or some combination of these. Thus it will be appreciated that any reference to moving an extruder relative to a build platform, working volume, or object, is intended to include movement of the extruder or movement of the build platform, or both, unless a more specific meaning is explicitly provided or otherwise clear from the context. Still more generally, while an x, y, z coordinate system serves as a convenient basis for positioning within three dimensions, any other coordinate system or combination of coordinate systems may also or instead be employed, such as a positional controller and assembly that operates according to cylindrical or spherical coordinates.

The controller 110 may be electrically or otherwise coupled in a communicating relationship with the build platform 102, the x-y-z positioning assembly 108, and the other various components of the printer 100. In general, the controller 110 is operable to control the components of the printer 100, such as the build platform 102, the x-y-z positioning assembly 108, and any other components of the printer 100 described herein to fabricate the object 112 from the build material. The controller 110 may include any combination of software and/or processing circuitry suitable for controlling the various components of the printer 100 described herein including without limitation microprocessors, microcontrollers, application-specific integrated circuits, programmable gate arrays, and any other digital and/or analog components, as well as combinations of the foregoing, along with inputs and outputs for transceiving control signals, drive signals, power signals, sensor signals, and so forth. In one aspect, this may include circuitry directly and physically associated with the printer 100 such as an on-board processor. In another aspect, this may be a processor associated with a personal computer or other computing device coupled to the printer 100, e.g., through a wired or wireless connection. Similarly, various functions described herein may be allocated between an on-board processor for the printer 100 and a separate computer. All such computing devices and environments are intended to fall within the meaning of the term “controller” or “processor” as used herein, unless a different meaning is explicitly provided or otherwise clear from the context.

A variety of additional sensors and other components may be usefully incorporated into the printer 100 described above. These other components are generically depicted as other hardware 134 in FIG. 1, for which the positioning and mechanical/electrical interconnections with other elements of the printer 100 will be readily understood and appreciated by one of ordinary skill in the art. The other hardware 134 may include a temperature sensor positioned to sense a temperature of the surface of the build platform 102, the extruder 106, or any other system components. This may, for example, include a thermistor or the like embedded within or attached below the surface of the build platform 102. This may also or instead include an infrared detector or the like directed at the surface 116 of the build platform 102.

In another aspect, the other hardware 134 may include a sensor to detect a presence of the object 112 at a predetermined location. This may include an optical detector arranged in a beam-breaking configuration to sense the presence of the object 112 at a predetermined location. This may also or instead include an imaging device and image processing circuitry to capture an image of the working volume and to analyze the image to evaluate a position of the object 112. This sensor may be used for example to ensure that the object 112 is removed from the build platform 102 prior to beginning a new build on the working surface 116. Thus the sensor may be used to determine whether an object is present that should not be, or to detect when an object is absent. The feedback from this sensor may be used by the controller 110 to issue processing interrupts or otherwise control operation of the printer 100.

The other hardware 134 may also or instead include a heating element (instead of or in addition to the thermal element 130) to heat the working volume such as a radiant heater or forced hot air heater to maintain the object 112 at a fixed, elevated temperature throughout a build, or the other hardware 134 may include a cooling element to cool the working volume.

FIG. 2 depicts a networked three-dimensional printing environment. In general, the environment 200 may include a data network 202 interconnecting a plurality of participating devices in a communicating relationship. The participating devices may, for example, include any number of three-dimensional printers 204 (also referred to interchangeably herein as “printers”), client devices 206, print servers 208, content sources 210, mobile devices 212, and other resources 216.

The data network 202 may be any network(s) or internetwork(s) suitable for communicating data and control information among participants in the environment 200. This may include public networks such as the Internet, private networks, telecommunications networks such as the Public Switched Telephone Network or cellular networks using third generation (e.g., 3G or IMT-2000), fourth generation (e.g., LTE (E-UTRA) or WiMax-Advanced (IEEE 802.16m)) and/or other technologies, as well as any of a variety of corporate area or local area networks and other switches, routers, hubs, gateways, and the like that might be used to carry data among participants in the environment 200.

The three-dimensional printers 204 may be any computer-controlled devices for three-dimensional fabrication, including without limitation any of the three-dimensional printers or other fabrication or prototyping devices described above. In general, each such device may include a network interface comprising, e.g., a network interface card, which term is used broadly herein to include any hardware (along with software, firmware, or the like to control operation of same) suitable for establishing and maintaining wired and/or wireless communications. The network interface card may include without limitation wired Ethernet network interface cards (“NICs”), wireless 802.11 networking cards, wireless 802.11 USB devices, or other hardware for wireless local area networking. The network interface may also or instead include cellular network hardware, wide area wireless network hardware or any other hardware for centralized, ad hoc, peer-to-peer, or other radio communications that might be used to carry data. In another aspect, the network interface may include a serial or USB port to directly connect to a computing device such as a desktop computer that, in turn, provides more general network connectivity to the data network 202.

The printers 204 might be made to fabricate any object, practical or otherwise, that is amenable to fabrication according to each printer's capabilities. This may be a model of a house or a tea cup, as depicted, or any other object such as a bunny, gears or other machine hardware, replications of scanned three-dimensional objects, or fanciful works of art.

Client devices 206 may be any devices within the environment 200 operated by users to initiate, manage, monitor, or otherwise interact with print jobs at the three-dimensional printers 204. This may include desktop computers, laptop computers, network computers, tablets, or any other computing device that can participate in the environment 200 as contemplated herein. Each client device 206 generally provides a user interface, which may include a graphical user interface, a text or command line interface, a voice-controlled interface, and/or a gesture-based interface to control operation of remote three-dimensional printers 204. The user interface may be maintained by a locally executing application on one of the client devices 206 that receives data and status information from, e.g., the printers 204 and print servers 208 concerning pending or executing print jobs. The user interface may create a suitable display on the client device 206 for user interaction. In other embodiments, the user interface may be remotely served and presented on one of the client devices 206, such as where a print server 208 or one of the three-dimensional printers 204 includes a web server that provides information through one or more web pages or the like that can be displayed within a web browser or similar client executing on one of the client devices 206. In one aspect, the user interface may include a voice controlled interface that receives spoken commands from a user and/or provides spoken feedback to the user.

The print servers 208 may include data storage, a network interface, and a processor and/or other processing circuitry. In the following description, where the functions or configuration of a print server 208 are described, this is intended to include corresponding functions or configuration (e.g., by programming) of a processor of the print server 208. In general, the print servers 208 (or processors thereof) may perform a variety of processing tasks related to management of networked printing. For example, the print servers 208 may manage print jobs received from one or more of the client devices 206, and provide related supporting functions such as content search and management. A print server 208 may also include a web server that provides web-based access by the client devices 206 to the capabilities of the print server 208. A print server 208 may also communicate periodically with three-dimensional printers 204 in order to obtain status information concerning, e.g., availability of printers and/or the status of particular print jobs, any of which may be subsequently presented to a user through the web server or any other suitable interface. A print server 208 may also maintain a list of available three-dimensional printers 204, and may automatically select one of the three-dimensional printers 204 for a user-submitted print job, or may permit a user to specify a single printer, or a group of preferred printers, for fabricating an object. Where the print server 208 selects the printer automatically, any number of criteria may be used such as geographical proximity, printing capabilities, current print queue, fees (if any) for use of a particular three-dimensional printer 204, and so forth. 
Where the user specifies criteria, this may similarly include any relevant aspects of three-dimensional printers 204, and may permit use of absolute criteria (e.g., filters) or preferences, which may be weighted preferences or unweighted preferences, any of which may be used by a print server 208 to allocate a print job to a suitable resource.
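
The allocation logic described above (absolute filters combined with weighted preferences) can be sketched as follows. The attribute names, scoring functions, and weights are hypothetical illustrations.

```python
def select_printer(printers, filters=None, preferences=None):
    """Choose a printer using absolute criteria and weighted preferences.

    printers: list of dicts of printer attributes (names illustrative).
    filters: attribute -> required value; non-matching printers are excluded.
    preferences: name -> (score_fn, weight); the highest total score wins.
    """
    filters = filters or {}
    preferences = preferences or {}
    # Absolute criteria act as hard filters on the candidate pool.
    candidates = [p for p in printers
                  if all(p.get(k) == v for k, v in filters.items())]
    if not candidates:
        return None
    # Weighted preferences rank whatever survives the filters.
    return max(candidates,
               key=lambda p: sum(w * fn(p) for fn, w in preferences.values()))
```

Filters exclude printers outright, while preference weights trade off soft criteria such as queue length against geographic distance.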

In one aspect, the print server 208 may be configured to support interactive voice control of one of the printers 204. For example, the print server 208 may be configured to receive a voice signal (e.g., in digitized audio form) from a microphone or other audio input of the printer 204, and to process the voice signal to extract relevant content such as a command for the printer. Where the command is recognized as a print command, the voice signal may be further processed to extract additional context or relevant details. For example, the voice signal may be processed to extract an object identifier that specifies an object for printing, e.g., by filename, file metadata, or semantic content. The voice signal may also be processed to extract a dimensional specification, such as a scale or absolute dimension for an object. The print server 208 may then generate suitable control signals for return to the printer 204 to cause the printer 204 to fabricate the object. Where an error or omission is detected, the print server 208 may return a request for clarification to the printer 204, which may render the request in spoken form through a speaker, or within a user interface of the printer 204 or an associated device.
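
Extraction of an object identifier and a dimensional specification from a recognized voice command might be sketched with a simple pattern match. The grammar below is a hypothetical illustration; a production system would use a fuller speech-understanding pipeline.

```python
import re

def parse_print_command(transcript: str):
    """Extract an object identifier and optional scale from a command.

    Returns a dict with the object name and a scale factor, or None
    when the utterance is not recognized, in which case the server
    would return a request for clarification to the printer.
    """
    m = re.match(r"print (?:a |an |the )?(?P<obj>[\w\s]+?)"
                 r"(?: at (?P<scale>\d+(?:\.\d+)?) percent)?$",
                 transcript.strip().lower())
    if not m:
        return None
    scale = float(m.group("scale")) / 100 if m.group("scale") else 1.0
    return {"object": m.group("obj").strip(), "scale": scale}
```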

Other user preferences may be usefully stored at the print server 208 to facilitate autonomous, unsupervised fabrication of content from content sources 210. For example, a print server 208 may store a user's preference on handling objects greater than a build volume of a printer. These preferences may control whether to resize the object, whether to break the object into multiple sub-objects for fabrication, and whether to transmit multiple sub-objects to a single printer or multiple printers. In addition, user preferences or requirements may be stored, such as multi-color printing capability, build material options and capabilities, and so forth. More generally, a print queue (which may be a printer-specific or user-specific queue, and which may be hosted at a printer 204, a server 208, or some combination of these) may be managed by a print server 208 according to one or more criteria from a remote user requesting a print job. The print server 208 may also store user preferences or criteria for filtering content, e.g., for automatic printing or other handling. While this is described below as a feature for autonomous operation of a printer (such as a printer that locally subscribes to a syndicated model source), any criteria that can be used to identify models of potential interest by explicit type (e.g., labeled in model metadata), implicit type (e.g., determined based on analysis of the model), source, and so forth, may be provided to the print server 208 and used to automatically direct new content to one or more user-specified ones of the three-dimensional printers 204.

In the context of voice-controlled printing, the print server 208 may usefully store user-specific data such as training for a voice recognition model. The print server 208 may also or instead store voice rendering data to use in generating spoken output by the printer 204. This may, for example, include voice type data, voice model data, voice sample data, and so forth. Thus for example, a user may purchase or otherwise obtain a voice style (e.g., a celebrity voice or other personality) to render spoken output and maintain the voice style on the print server 208. The print server 208 may also or instead store data characterizing capabilities of the printer 204 so that voice commands received at the print server 208 can be analyzed for suitability, accuracy, and so forth according to the capabilities of the printer 204 from which the voice command was received. More generally, any data or processing for voice interaction that can be usefully stored or executed remotely from the printer 204 may be located at the print server 208. It will be understood that any such data may also or instead be stored on a client device, a printer 204, or some combination of these.

In one aspect, the processor of the print server may be configured to store a plurality of print jobs submitted to the web server in a log and to provide an analysis of print activity based on the log. This may include any type of analysis that might be useful to participants in the environment 200. For example, the analysis may include tracking of the popularity of particular objects, or of particular content sources. The analysis may include tracking of which three-dimensional printers 204 are most popular or least popular, or related statistics such as the average backlog of pending print jobs at a number of the three-dimensional printers 204. The analysis may include success of a particular printer in fabricating a particular model or of a particular printer in completing print jobs generally. More generally, any statistics or data may be obtained, and any analysis may be performed, that might be useful to users (e.g., when requesting prints), content sources (e.g., when choosing new printable objects for publication), providers of fabrication resources (e.g., when setting fees), or network facilitators such as the print servers 208.

A print server 208 may also maintain a database 209 of content, along with an interface for users at client devices 206 to search the database 209 and request fabrication of objects in the database 209 using any of the three-dimensional printers 204. Thus in one aspect, a print server 208 (or any system including the print server 208) may include a database 209 of three-dimensional models, and the print server 208 may act as a server that provides a search engine for locating a particular three-dimensional model in the database 209. The search engine may be a text-based search engine using keyword text queries, plain language queries, and so forth. The search engine may also or instead include an image-based search engine configured to identify three-dimensional models similar to a two-dimensional or three-dimensional image provided by a user.

In another aspect, the print server 208 may periodically search for suitable content at remote locations on the data network, which content may be retrieved to the database 209, or have its remote location (e.g., a URL or other network location identifier) stored in the database 209. In another aspect, the print server 208 may provide an interface for submission of objects from remote users, along with any suitable metadata such as a title, tags, creator information, descriptive narrative, pictures, recommended printer settings, and so forth. In one aspect, the database 209 may be manually curated according to any desired standards. In another aspect, printable objects in the database 209 may be manually or automatically annotated according to content type, popularity, editorial commentary, and so forth.

The print server 208 may more generally provide a variety of management functions. For example, the print server 208 may store a location of a predetermined alternative three-dimensional printer to execute a print job from a remote user in the event of a failure by one of the plurality of three-dimensional printers 204. In another aspect, the print server 208 may maintain exclusive control over at least one of the plurality of three-dimensional printers 204, such that other users and/or print servers cannot control the printer. In another aspect, the print server 208 may submit a print job to a first available one of the plurality of three-dimensional printers 204.

In another aspect, a print server 208 may provide an interface for managing subscriptions to sources of content. This may include tools for searching existing subscriptions, locating or specifying new sources, subscribing to sources of content, and so forth. In one aspect, a print server 208 may manage subscriptions and automatically direct new content from these subscriptions to a three-dimensional printer 204 according to any user-specified criteria. Thus while it is contemplated that a three-dimensional printer 204 may autonomously subscribe to sources of content through a network interface and receive new content directly from such sources, it is also contemplated that this feature may be maintained through a remote resource such as a print server 208.

A print server 208 may maintain print queues for participating three-dimensional printers 204. This approach may advantageously alleviate backlogs at individual printers 204, which may have limited memory capacity for pending print jobs. More generally, a print server 208 may, by communicating with multiple three-dimensional printers 204, obtain a view of utilization of multiple networked resources that permits a more efficient allocation of print jobs than would be possible through simple point-to-point communications among users and printers. Print queues may also be published by a print server 208 so that users can view pending queues for a variety of different three-dimensional printers 204 prior to selecting a resource for a print job. In one aspect, the print queue may be published as a number of print jobs and size of print jobs so that a requester can evaluate likely delays. In another aspect, the print queue may be published as an estimated time until a newly submitted print job can be initiated.
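
A published queue view of the kind described above, reporting job count, total size, and an estimated wait, might look like the following sketch. The fixed deposition rate is an illustrative assumption; a real server would use per-printer calibration data.

```python
def published_queue_view(jobs, printer_mm3_per_s=5.0):
    """Summarize a print queue for publication to requesters.

    jobs: list of dicts, each with a "volume_mm3" field describing the
    amount of build material required for that job.
    Returns the pending job count, the total queued volume, and a rough
    estimate of the wait before a newly submitted job could start.
    """
    total_volume = sum(j["volume_mm3"] for j in jobs)
    return {
        "pending_jobs": len(jobs),
        "total_volume_mm3": total_volume,
        "est_wait_s": total_volume / printer_mm3_per_s,
    }
```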

In one aspect, the print queue of one of the print servers 208 may include one or more print jobs for one of the plurality of three-dimensional printers 204. The print queue may be stored locally at the one of the plurality of three-dimensional printers. In another aspect, the print queue may be allocated between the database 209 and a local memory of the three-dimensional printer 204. In another aspect, the print queue may be stored, for example, in the database 209 of the print server 208. As used here, the term ‘print queue’ is intended to include print data (e.g., the three-dimensional model or tool instructions to fabricate an object) for a number of print job (which may be arranged for presentation in order of expected execution), as well as any metadata concerning print jobs. Thus, a portion of the print queue such as the metadata (e.g., size, status, time to completion) may be usefully communicated to a print server 208 for sharing among users while another portion of the print queue such as the model data may be stored at a printer in preparation for execution of a print job.

Print queues may implement various user preferences on prioritization. For example, for a commercial enterprise, longer print jobs may be deferred until after normal hours of operation (e.g., after 5:00 p.m.), while shorter print jobs may be executed first if they can be completed before the end of a business day. In this manner, objects can be identified and fabricated from within the print queue in a manner that permits as many objects as possible to be fabricated before a predetermined closing time. Similarly, commercial providers of fabrication services may charge explicitly for prioritized fabrication, and implement this prioritization by prioritizing print queues in a corresponding fashion.
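
The end-of-day prioritization described above can be sketched as a greedy schedule: running the shortest jobs first maximizes the number of jobs completed before a common deadline (a classic greedy scheduling result), and anything that cannot finish in time is deferred. The job identifiers and durations are illustrative.

```python
def schedule_before_closing(jobs, now_s, closing_s):
    """Order jobs to complete as many as possible before closing time.

    jobs: list of (job_id, duration_s) pairs.
    Returns (run_today, after_hours): jobs that fit before the deadline,
    in execution order, and jobs deferred until after hours.
    """
    run_today, after_hours = [], []
    t = now_s
    for job_id, duration in sorted(jobs, key=lambda j: j[1]):
        if t + duration <= closing_s:
            run_today.append(job_id)
            t += duration
        else:
            after_hours.append(job_id)
    return run_today, after_hours
```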

In another aspect, a print server 208 may provide a virtual workspace for a user. In this virtual workspace, a user may search local or remote databases of printable objects, save objects of interest (or links thereto), manage pending prints, specify preferences for receiving status updates (e.g., by electronic mail or SMS text), manage subscriptions to content, search for new subscription sources, and so forth. In one aspect, the virtual workspace may be, or may include, web-based design tools or a web-based design interface that permits a user to create and modify models. In one aspect, the virtual workspace may be deployed on the web, while permitting direct fabrication of a model developed within that environment on a user-specified one of the three-dimensional printers 204, thus enabling a web-based design environment that is directly coupled to one or more fabrication resources.

The content sources 210 may include any sources of content for fabrication with a three-dimensional printer 204. This may, for example, include databases of objects accessible through a web interface or application programming interface. This may also or instead include individual desktop computers or the like configured as a server for hosted access, or configured to operate as a peer in a peer-to-peer network. This may also or instead include content subscription services, which may be made available in an unrestricted fashion, or may be made available on a paid subscription basis, or on an authenticated basis based upon some other relationship (e.g., purchase of a related product or a ticket to an event). It will be readily appreciated that any number of content providers may serve as content sources 210 as contemplated herein. By way of non-limiting example, the content sources 210 may include destinations such as amusement parks, museums, theaters, performance venues, or the like, any of which may provide content related to users who purchase tickets. The content sources 210 may include manufacturers such as automobile, computer, consumer electronics, or home appliance manufacturers, any of which may provide content related to upgrades, maintenance, repair, or other support of existing products that have been purchased. The content sources 210 may include artists or other creative enterprises that sell various works of interest. The content sources 210 may include engineering or architectural firms that provide marketing or advertising pieces to existing or prospective customers. The content sources 210 may include marketing or advertising firms that provide promotional items for clients. More generally, the content sources 210 may be any individual or enterprise that provides single or serial objects for fabrication by the three-dimensional printers 204 described herein.

One or more web servers 211 may provide web-based access to and from any of the other participants in the environment 200. While depicted as a separate network entity, it will be readily appreciated that a web server 211 may be logically or physically associated with one of the other devices described herein, and may, for example, provide a user interface for web access to one of the three-dimensional printers 204, one of the print servers 208 (or databases 209 coupled thereto), one of the content sources 210, or any of the other resources 216 described below in a manner that permits user interaction through the data network 202, e.g., from a client device 206 or mobile device 212.

The mobile devices 212 may be any form of mobile device, such as any wireless, battery-powered device, that might be used to interact with the networked printing environment 200. The mobile devices 212 may, for example, include laptop computers, tablets, thin client network computers, portable digital assistants, messaging devices, cellular phones, smart phones, portable media or entertainment devices, and so forth. In general, mobile devices 212 may be operated by users for a variety of user-oriented functions such as to locate printable objects, to submit objects for printing, to monitor a personally owned printer, and/or to monitor a pending print job. A mobile device 212 may include location awareness technology such as Global Positioning System (“GPS”), which may obtain information that can be usefully integrated into a printing operation in a variety of ways. For example, a user may select an object for printing and submit a model of the object to a print server, such as any of the print servers described above. The print server may determine a location of the mobile device 212 initiating the print job and locate a closest printer for fabrication of the object.
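The nearest-printer selection described above can be sketched as a great-circle distance comparison over known printer locations. This is an illustrative sketch only; the printer identifiers and coordinates are hypothetical, and a production print server would likely weigh availability and queue depth alongside raw distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometers.
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def closest_printer(device_pos, printers):
    # printers: mapping of printer id -> (lat, lon); returns the nearest id.
    lat, lon = device_pos
    return min(printers, key=lambda pid: haversine_km(lat, lon, *printers[pid]))
```

For example, a device reporting GPS coordinates near a "lobby" printer would be routed there rather than to a distant "annex" printer.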

In another aspect, a printing function may be location-based, using the GPS input (or cellular network triangulation, proximity detection, or any other suitable location detection techniques). For example, a user may be authorized to print a model only when the user is near a location (e.g., within a geo-fenced area or otherwise proximal to a location), or only after a user has visited a location. Thus a user may be provided with printable content based upon locations that the user has visited, or while within a certain venue such as an amusement park, museum, theater, sports arena, hotel, or the like. Similarly, a matrix barcode such as a QR code may be employed for localization.
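The location-based authorization above reduces to two checks: is the device inside the geo-fence now, or has the user previously visited the venue. The following sketch assumes a simple circular geo-fence and a hypothetical visit-history set; real deployments might use polygonal fences or QR-code check-ins as the text notes.

```python
import math

def _distance_km(a, b):
    # Haversine great-circle distance between two (lat, lon) points.
    lat1, lon1 = a
    lat2, lon2 = b
    r = 6371.0
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    h = (math.sin(dp / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

def may_print(device_pos, venue_pos, radius_km, visited_venues=(), venue_id=None):
    # Authorize printing when the device is inside the geo-fence now,
    # or when the venue appears in the user's visit history.
    if _distance_km(device_pos, venue_pos) <= radius_km:
        return True
    return venue_id is not None and venue_id in visited_venues
```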

The other resources 216 may include any other software or hardware resources that may be usefully employed in networked printing applications as contemplated herein. For example, the other resources 216 may include payment processing servers or platforms used to authorize payment for content subscriptions, content purchases, or printing resources. As another example, the other resources 216 may include social networking platforms that may be used, e.g., to share three-dimensional models and/or fabrication results according to a user's social graph. In another aspect, the other resources 216 may include certificate servers or other security resources for third party verification of identity, encryption or decryption of three-dimensional models, and so forth. In another aspect, the other resources 216 may include online tools for three-dimensional design or modeling, as well as databases of objects, surface textures, build supplies, and so forth. In another aspect, the other resources 216 may include a desktop computer or the like co-located (e.g., on the same local area network with, or directly coupled to through a serial or USB cable) with one of the three-dimensional printers 204. In this case, the other resource 216 may provide supplemental functions for the three-dimensional printer 204 in a networked printing context such as maintaining a print queue or operating a web server for remote interaction with the three-dimensional printer 204. Other resources 216 also include supplemental resources such as three-dimensional scanners, cameras, and post-processing/finishing machines or resources. More generally, any resource that might be usefully integrated into a networked printing environment may be one of the resources 216 as contemplated herein.

It will be readily appreciated that the various components of the networked printing environment 200 described above may be arranged and configured to support networked printing in a variety of ways. For example, in one aspect there is disclosed herein a networked computer with a print server and a web interface to support networked three-dimensional printing. This device may include a print server, a database, and a web server as discussed above. The print server may be coupled through a data network to a plurality of three-dimensional printers and configured to receive status information from one or more sensors for each one of the plurality of three-dimensional printers. The print server may be further configured to manage a print queue for each one of the plurality of three-dimensional printers. The database may be coupled in a communicating relationship with the print server and configured to store print queue data and status information for each one of the plurality of three-dimensional printers. The web server may be configured to provide a user interface over the data network to a remote user, the user interface adapted to present the status information and the print queue data for one or more of the plurality of three-dimensional printers to the user and the user interface adapted to receive a print job from the remote user for one of the plurality of three-dimensional printers.

The three-dimensional printer 204 described above may be configured to autonomously subscribe to syndicated content sources and periodically receive and print objects from those sources. Thus in one aspect there is disclosed herein a device including any of the three-dimensional printers described above; a network interface; and a processor (which may without limitation include the controller for the printer). The processor may be configured to subscribe to a plurality of sources of content (such as the content sources 210 described above) selected by a user for fabrication by the three-dimensional printer through the network interface. The processor may be further configured to receive one or more three-dimensional models from the plurality of content sources 210, and to select one of the one or more three-dimensional models for fabrication by the three-dimensional printer 204 according to a user preference for prioritization. The user preference may, for example, preferentially prioritize particular content sources 210, or particular types of content (e.g., tools, games, artwork, upgrade parts, or content related to a particular interest of the user).

The memory of a three-dimensional printer 204 may be configured to store a queue of one or more additional three-dimensional models not selected for immediate fabrication. The processor may be programmed to periodically re-order or otherwise alter the queue according to pre-determined criteria or manual user input. For example, the processor may be configured to evaluate a new three-dimensional model based upon a user preference for prioritization, and to place the new three-dimensional model at a corresponding position in the queue. The processor may also or instead be configured to retrieve content from one of the content sources 210 by providing authorization credentials for the user, which may be stored at the three-dimensional printer or otherwise accessible for presentation to the content source 210. The processor may be configured to retrieve content from at least one of the plurality of content sources 210 by authorizing a payment from the user to a content provider. The processor may be configured to search a second group of sources of content (such as any of the content sources 210 described above) according to one or more search criteria provided by a user. The search criteria may also or instead include demographic information for the user, contextual information for the user, or any other implicit or explicit user information.
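The preference-driven queue placement described above can be illustrated with a small priority queue. This is a hedged sketch, not the claimed implementation: the source names and ranking numbers are hypothetical, and ties are broken in arrival order.

```python
import heapq
import itertools

class PrintQueue:
    """Queue of pending models; a lower score fabricates first."""

    def __init__(self, source_rank):
        # source_rank: preferred content sources mapped to small numbers;
        # unknown sources fall back to a large (low-priority) score.
        self._rank = source_rank
        self._heap = []
        self._tie = itertools.count()  # preserves FIFO order among equal scores

    def add(self, model_id, source):
        # Place the new model at a position corresponding to its priority.
        score = self._rank.get(source, 99)
        heapq.heappush(self._heap, (score, next(self._tie), model_id))

    def next_job(self):
        # Pop the highest-priority pending model, or None when the queue is empty.
        return heapq.heappop(self._heap)[2] if self._heap else None
```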

In another aspect, there is disclosed herein a system for managing subscriptions to three-dimensional content sources such as any of the content sources 210 described above. The system may include a web server configured to provide a user interface over a data network, which user interface is adapted to receive user preferences from a user including a subscription to a plurality of sources of a plurality of three-dimensional models, a prioritization of content from the plurality of sources, and an identification of one or more fabrication resources coupled to the data network and suitable for fabricating objects from the plurality of three-dimensional models. The system may also include a database to store the user preferences, and to receive and store the plurality of three-dimensional models as they are issued by the plurality of sources. The system may include a processor (e.g., of a print server 208, or alternatively of a client device 206 interacting with the print server 208) configured to select one of the plurality of three-dimensional models for fabrication based upon the prioritization. The system may include a print server configured to communicate with the one or more fabrication resources through the data network, to determine an availability of the one or more fabrication resources, and to transmit the selected one of the plurality of three-dimensional models to one of the one or more fabrication resources.

In another aspect, there is disclosed herein a network of three-dimensional printing resources comprising a plurality of three-dimensional printers, each one of the plurality of three-dimensional printers including a network interface; a server configured to manage execution of a plurality of print jobs by the plurality of three-dimensional printers; and a data network that couples the server and the plurality of three-dimensional printers in a communicating relationship.

In general as described above, the server may include a web-based user interface configured for a user to submit a new print job to the server and to monitor progress of the new print job. The web-based user interface may permit video monitoring of each one of the plurality of three-dimensional printers, or otherwise provide information useful to a remote user including image-based, simulation-based, text-based, or other information concerning the status of a current print. The web-based user interface may include voice input and/or output for network-based voice control of a printer.

The fabrication resources may, for example, include any of the three-dimensional printers 204 described above. One or more of the fabrication resources may be a private fabrication resource secured with a credential-based access system. The user may provide, as a user preference and prior to use of the private fabrication resource, credentials for accessing the private fabrication resource. In another aspect, the one or more fabrication resources may include a commercial fabrication resource. In this case the user may provide an authorization to pay for use of the commercial fabrication resource in the form of a user preference prior to use of the commercial fabrication resource.

Many current three-dimensional printers require significant manufacturing time to fabricate an object. At the same time, certain printers may include a tool or system to enable multiple, sequential object prints without human supervision or intervention, such as a conveyor belt. In this context, prioritizing content may be particularly important to prevent crowding out of limited fabrication resources with low priority content that arrives periodically for autonomous fabrication. As a significant advantage, the systems and methods described herein permit prioritization using a variety of user-specified criteria, and permit use of multiple fabrication resources in appropriate circumstances. Thus prioritizing content as contemplated herein may include any useful form of prioritization. For example, this may include prioritizing the content according to source. The content sources 210 may have an explicit type that specifies the nature of the source (e.g., commercial or paid content, promotional content, product support content, non-commercial) or the type of content provided (e.g., automotive, consumer electronics, radio control hobbyist, contest prizes, and so forth). Prioritizing content may include prioritizing the content according to this type. The three-dimensional models themselves may also or instead include a type (e.g., tool, game, home, art, jewelry, replacement part, upgrade part, etc.) or any other metadata, and prioritizing the content may include prioritizing the content according to this type and/or metadata.
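One way to read the type-based prioritization above is as a score computed from source-type and content-type metadata. The weight tables below are hypothetical, offered only to make the combination of the two criteria concrete.

```python
def content_priority(model, source_weights, type_weights):
    # model: dict with "source_type" and "content_type" metadata fields.
    # A higher combined weight means a higher fabrication priority;
    # unknown metadata values simply contribute zero.
    return (source_weights.get(model.get("source_type"), 0)
            + type_weights.get(model.get("content_type"), 0))
```

For instance, with source weights favoring product-support content and type weights favoring replacement parts, a replacement-part model from a product-support source would outrank a promotional art piece.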

In one aspect, the processor may be configured to select two or more of the plurality of three-dimensional models for concurrent fabrication by two or more of the plurality of fabrication resources based upon the prioritization when a priority of the two or more of the plurality of three-dimensional models exceeds a predetermined threshold. That is, where particular models individually have a priority above the predetermined threshold, multiple fabrication resources may be located and employed to fabricate these models concurrently. The predetermined threshold may be evaluated for each model individually, or for all of the models collectively such as on an aggregate or average basis.

In one aspect, the processor may be configured to adjust prioritization based upon a history of fabrication when a number of objects fabricated from one of the plurality of sources exceeds a predetermined threshold. Thus, for example, a user may limit the number of objects fabricated from a particular source, giving subsequent priority to content from other sources regardless of an objectively determined priority for a new object from the particular source. This prevents a single source from overwhelming a single fabrication resource, such as a personal three-dimensional printer operated by the user, in a manner that crowds out other content from other sources of possible interest. At the same time, this may enable content sources 210 to publish on any convenient schedule, without regard to whether and how subscribers will be able to fabricate objects.
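The history-based adjustment above can be sketched as a per-source cap that demotes, rather than rejects, further content from a prolific source. The demotion offset and cap values are illustrative assumptions.

```python
from collections import Counter

def throttled_priority(base_priority, source, fabrication_history, per_source_cap):
    # fabrication_history: sequence of source ids for previously fabricated objects.
    # Once a source reaches the cap, its new content is pushed behind
    # normally-ranked content so it cannot crowd out other subscriptions.
    counts = Counter(fabrication_history)
    if counts[source] >= per_source_cap:
        return base_priority + 100  # demote, but still eventually printable
    return base_priority
```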

In another aspect, the processor may be configured to identify one or more additional sources of content based upon a similarity to one of the plurality of sources of content. For example, where a content source 210 is an automotive manufacturer, the processor may perform a search for other automotive manufacturers, related parts suppliers, mechanics, and so forth. The processor may also or instead be configured to identify one or more additional sources of content based upon a social graph of the user. This may, for example, include analyzing a social graph of relationships from the user to identify groups with common interests, shared professions, a shared history of schools or places of employment, or a common current or previous residence location, any of which may be used to locate other sources of content that may be of interest to the user.

FIG. 3 shows a mobile device with an accessory for three-dimensional imaging.

The mobile device 302 may include any suitable mobile device such as a cellular phone, media player, tablet, laptop computer or the like. In general, the mobile device may include a processor 304 such as a microprocessor, microcontroller, or other processing circuitry that controls operation of the mobile device, provides a user interface, and so forth. The mobile device 302 may include a first camera 306 and a second camera 308 operable by the mobile device 302 to capture still images or video, or to support live video-based communications. In one embodiment, the first camera 306 may be a forward facing camera and the second camera 308 may be a rear facing camera (or vice versa). In this conventional configuration, a user can take pictures of objects in front of the user with a first camera 306 facing away from the user, or the user can take a picture of himself or herself, or other items facing toward the user from the mobile device 302 using the second camera 308.

The processor 304 may include processing circuitry configured to obtain three-dimensional data from one or more images obtained by the first camera 306 and/or the second camera 308. A wide array of image-based techniques for three-dimensional reconstruction are known in the art, and may be suitably adapted for use with the systems and methods contemplated herein. For example, the processor 304 may apply shape-from-motion techniques to a sequence of images captured from either or both of the first camera 306 and the second camera 308. In another aspect, the first camera 306 and the second camera 308 may be controlled to capture images concurrently, or substantially concurrently, and the two images from offset poses may be processed as a stereoscopic image pair to extract three-dimensional features. Similarly, the housing may include a structured light source or other light source that illuminates an object with features that can be recognized when projected onto an object and processed to recover three-dimensional data. These or any other suitable techniques may be usefully employed with the mobile device 302 and accessory as contemplated herein. Certain variations are described below that employ different arrangements of hardware and processing, as well as various types of communication between the mobile device 302 and components within the housing 310 of the accessory. For example, the housing 310 may include an additional camera to complement a camera of the mobile device 302. In another aspect, the housing 310 may include a structured light source or other supplemental illumination source for improved imaging. In another aspect, the housing 310 may include independent processing circuitry for three-dimensional imaging, which may receive images from a camera of the mobile device 302 and process such images either alone or in combination with images from a camera of the housing 310 to obtain three-dimensional data. All such variations are intended to fall within the scope of this disclosure.
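The stereoscopic processing mentioned above ultimately rests on the classical pinhole triangulation relation, which can be stated compactly. The numeric values in the usage note are illustrative assumptions, not parameters of any particular device.

```python
def stereo_depth(disparity_px, focal_px, baseline_mm):
    # Classical pinhole stereo relation: Z = f * B / d, where
    # disparity_px is the horizontal pixel offset of a feature between
    # the two views, focal_px is the focal length expressed in pixels,
    # and baseline_mm is the separation between the two camera centers.
    if disparity_px <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_mm / disparity_px
```

With a hypothetical 60 mm baseline, a 1000-pixel focal length, and a 50-pixel disparity, a feature would lie roughly 1.2 meters from the cameras; real pipelines would first rectify the image pair and match features across views.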

An accessory may include a housing 310 with a mechanical interface 312 configured to removably and replaceably attach to a predetermined mobile computing device such as the mobile device 302 in a predetermined orientation. While this mechanical interface 312 is illustrated in the cross-section of FIG. 3 as a flanged edge that encloses sides of the mobile device 302 along its perimeter, it will be appreciated that mobile devices 302 may have a variety of shapes and sizes, and a variety of mechanical interfaces may readily be devised to removably and replaceably secure the housing 310 to the mobile device 302 such that the mobile device 302 and the housing 310 are in a predetermined orientation relative to one another. For example, the housing 310 may be fashioned of a flexible material that permits the housing 310 to be elastically bent around the edges of the mobile device 302, or the mechanical interface 312 may include hinged, spring-loaded, or sliding latches that are manually secured about the edges of the mobile device 302.

However attached, a lens 314 on the housing 310 may as a result be fixed in a predetermined location and orientation relative to a lens of the first camera 306 in order to provide a field of view from a predetermined pose relative to the first camera 306 of the mobile device 302 when the mobile device 302 is positioned within the housing 310. An optical train 316 may be further provided that optically couples the second camera 308 of the mobile device 302 to the lens 314. In this configuration, the first camera 306 and the second camera 308 both capture forward-facing images from offset poses, thus providing a stereoscopic perspective on a field of view for the combined device 320. The optical train may include any of a variety of optical components such as mirrors, fiber optics, intermediate lenses, and so forth to suitably couple the lens 314 to the second camera 308 for image acquisition.

FIG. 4 is a functional block diagram of an accessory coupled to a mobile device. In general, the accessory 402 and the mobile device 404 may share various components for a three-dimensional imaging system, and may further share processing resources and/or be coordinated through a communications interface to cooperate in a three-dimensional imaging process.

The accessory 402, or the housing of the accessory 402 (housing and accessory being used interchangeably herein, unless a different meaning is explicitly provided or otherwise clear from the context), may include a communication interface 406 configured for data communication between the accessory 402 and the mobile device 404, with a complementary communication interface 408 on the mobile device 404. In one aspect, the communication interface 406 may include hardware and/or software for any suitable wireless communication interface using, e.g., any 802.11 wireless protocol, Bluetooth, or any standardized or proprietary short-range wireless communications protocol based upon, e.g., radio frequency, optical, acoustic, or other suitable communication medium. In another aspect, the communication interface may include a wired communication interface that couples to a data port of the predetermined mobile computing device when the housing is attached to the predetermined mobile computing device. Thus in one aspect, the communication interface 408 of the mobile device may include a data port configured for wired data communications. Contemporary mobile devices include numerous suitable physical ports including without limitation standardized ports such as USB connectors, micro-USB connectors, two or three ring plugs, and so forth, any of which may be adapted for use as a data port as contemplated herein. Similarly, many devices include proprietary arrangements of plugs, contacts, and the like for docking stations and recharging that may be adapted to use as a physical data port.

In one aspect, the communication interface 406 of the accessory 402 may include a sensor to detect an action of the mobile computing device. This may include a data input such as a trigger, dedicated pin, or the like in the communication interface 406. In another aspect, this may include a sensor independent of the communications circuitry that couples the accessory 402 to the mobile device 404. For example, the sensor may include a sensor that detects a sound, a vibration, an illumination, or the like. In this manner, the mobile device 404 may signal the accessory 402 independent of a data communication link using any action that is (a) within the capabilities of the mobile device 404, and (b) detectable by the accessory 402. Thus, for example, when a picture is taken with the mobile device camera, a processor of the mobile device may transmit a signal through the communication interfaces 408, 406, or with a vibration or a beep, to concurrently capture a picture with a camera of the accessory 402, or to otherwise operate a shutter, illumination source or the like concurrently with the image capture by the mobile device. Similarly, the action itself may include an autofocus, zoom, light meter reading, or other action related to image capture that can include a corresponding data signal to the accessory.

The accessory 402 may also or instead include a camera 412 separate from the mobile device 404. As noted above, the accessory 402 may support three-dimensional processing in one respect by providing a supplemental camera to capture an image concurrently with or otherwise in addition to one or more images from the mobile device 404 for use in three-dimensional processing. Thus in one aspect a system 400 described herein includes a camera 412 within a housing of an accessory 402. An image from the camera 412 may be transmitted to the mobile device 404 through the communication interfaces 406, 408 for the mobile device 404 to perform three-dimensional processing tasks with a processor 410 or other processing circuitry of the mobile device 404.

The accessory may also or instead include a processor 414 or other processing circuitry to support three-dimensional imaging/processing. For example, the processor 414 of the accessory 402 may receive image data from the mobile device 404, e.g., through the communication interface 406, and process the image data along with data obtained from the camera 412 of the accessory to obtain three-dimensional data. The processor 414 may also control operation of the camera 412, and/or may control operation of a camera of the mobile device 404, e.g., by communication with the mobile device 404 through the communication interfaces 406, 408. Similarly, the processor 410 of the mobile device 404 may be configured to receive a first image from the camera and a second image through the communication interface from the second camera, and further configured to process the first image and the second image to obtain three-dimensional data from an overlapping field of view of the camera and the second camera.

In one aspect it will be understood that processing circuitry for three-dimensional processing may be contained within the mobile device 404, such as the processing circuitry 410 depicted in FIG. 4, which may be programmed for appropriate three-dimensional processing tasks, or within the accessory 402 (e.g., processor 414), or some combination of these. In one aspect, a system 400 contemplated herein includes processing circuitry on a mobile device configured to obtain substantially concurrent images from a camera and a second camera of the mobile device. The system 400 may also or instead include processing circuitry on the mobile device configured to process such substantially concurrent images to obtain three-dimensional data from an overlapping region of the field of view of the first camera (e.g., through the optical train) and the second field of view of the second camera.

The processor 414 of the accessory 402 may include, or be associated with, a memory that, among other things, stores a unique identifier for the accessory 402. This may be used to identify a user of the accessory 402 so that, for example, when a three-dimensional image is acquired using the accessory 402 and a mobile device 404, the three-dimensional image may be transmitted to a print server or other networked three-dimensional printing resource such as any of those described above, which may in turn automatically associate the three-dimensional image with a particular user. This may be particularly useful where, for example, the mobile device 404 includes a data network connection for Internet access to such a remote resource. In this manner, a user of the accessory may have three-dimensional images automatically uploaded to the remote resource where they can be available for printing or other manipulation by the user. In one aspect, a user interface may be provided on the mobile device 404 or on the accessory 402 for a user to authorize transmission of a captured three-dimensional image with the unique identifier to a remote resource for storage and subsequent retrieval/use. In another aspect, the user interface (again, on either the accessory 402 or the mobile device 404) may provide a “print this now” button or the like so that a physical reproduction of the three-dimensional image, or a further-processed version of the image, can be immediately queued for fabrication. The other networked printing resources, systems and methods described above may be used in various combinations to further process and/or reproduce such images on a three-dimensional printer, either automatically or under user control.
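The association of a captured scan with a user via the accessory's unique identifier can be illustrated by assembling an upload request that carries that identifier. This sketch only builds the request; it deliberately does not send it. The server URL, header name, and payload fields are all hypothetical.

```python
import json

def build_upload_request(accessory_id, scan_payload,
                         server="https://example.invalid/scans"):
    # Assemble (but do not transmit) an upload request that tags the
    # captured scan with the accessory's unique identifier, so a print
    # server can automatically associate it with the registered user.
    body = json.dumps({"accessory_id": accessory_id, "scan": scan_payload})
    headers = {"Content-Type": "application/json",
               "X-Accessory-ID": accessory_id}  # hypothetical header name
    return {"method": "POST", "url": server, "headers": headers, "body": body}
```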

The components of the accessory 402 may be powered by a power source for the mobile device 404, which may be transferred to the housing 402 through the same electromechanical interface, e.g., a USB or plug connector, that supports data communications. The housing 402 may also or instead include a power source 416 independent of the mobile device for autonomous operation. This power source 416, which may be a battery or the like, may support operation of the camera 412 and the processing circuitry 414 of the accessory 402, and may also provide supplemental power to the mobile device 404, which may be particularly useful, for example, where data acquisition and three-dimensional processing would otherwise tend to tax a power supply of a mobile device to premature depletion.

The accessory 402 may also include any other hardware 418 complementary to the intended use(s) of the accessory 402. For example, this may include memory such as a removable storage device (e.g., memory card, USB drive, or the like) or internal memory for storing image data and/or processed three-dimensional data. Where three-dimensional data is captured for a specific use, the accessory 402 may also include processing circuitry adapted to convert acquired three-dimensional data into a suitable form. For example, the processing circuitry may convert raw point cloud or polygonal data into an STL format for use by a three-dimensional printer, or into a CAD file of any suitable format for further processing. In another aspect, the other hardware 418 may include local or cellular wireless communications capabilities for connecting the accessory 402 to remote resources such as a three-dimensional printer, print server, desktop computer, or other device or combination of devices useful for processing and management of printable content as contemplated herein.
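The conversion of polygonal data to STL mentioned above can be sketched as a minimal ASCII STL serializer. This is an illustrative sketch: it writes the conventional zero facet normal (which most slicers recompute from the vertex winding) and performs no mesh validation.

```python
def triangles_to_ascii_stl(name, triangles):
    # triangles: list of facets, each a tuple of three (x, y, z) vertices.
    # Emits a minimal ASCII STL body per the usual "solid ... endsolid" layout.
    lines = [f"solid {name}"]
    for v0, v1, v2 in triangles:
        lines.append("  facet normal 0 0 0")  # zero normal; slicer may recompute
        lines.append("    outer loop")
        for x, y, z in (v0, v1, v2):
            lines.append(f"      vertex {x:g} {y:g} {z:g}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)
```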

FIG. 5 shows a three-dimensional imaging system with dual optical paths. An accessory 502 coupled to a mobile device 504 may in general include an optical train to direct optical paths to cameras of the accessory 502 and/or mobile device 504. For example, a first optical path 506 within the accessory 502 may direct an image from a lens of the accessory 502 to a first camera 508 of the mobile device 504. A second optical path 510 within the accessory 502 may direct an image from another lens or opening of the accessory 502 to a second camera 512 of the mobile device 504. While the first optical path 506 provides an optical coupling to a first field of view (indicated generally by an arrow 520), the second optical path 510 may provide an optical coupling to a second field of view (indicated generally by a second arrow 522) different from the first field of view. In this manner, stereoscopic imaging or other three-dimensional imaging techniques based upon image differentiation may be employed with multiple cameras of the mobile device 504. Each optical path 506, 510 may include independent optical components, which may be fixed optics such as transfer lenses or fiber optics, and/or controllable optics such as shutters, apertures, or the like to supplement imaging functions (e.g., sampling, shutter speed, etc.) of the mobile device 504.

It will be appreciated that the first optical path 506 may be readily omitted where the camera 508 has a field of view that can be overlapped with the second field of view. Alternatively, even in this configuration, the first optical path 506 may be included to provide supplemental optics such as focusing lenses, scaling lenses, a controllable shutter or aperture, and so forth.

The accessory 502 may optionally include a supplemental light source 528 positioned to illuminate the field of view and/or the second field of view. The supplemental light source 528 may be a strobe, flash, high-intensity light, or other light source useful for photographic illumination. The supplemental light source 528 may also or instead include a structured light source that can provide illumination using predetermined patterns of light that can be imaged and processed to derive three-dimensional data.

In one aspect, the supplemental light source 528 may serve as an illumination source to illuminate a field of view of one of the cameras 508, 512 from a predetermined pose for any of a variety of optically-based imaging techniques. For example, the illumination source may be a structured light source that projects a predetermined pattern of light from the predetermined pose. The predetermined pattern of light may include one or more lines or shapes, which may be created with a lens, filter, or other suitable optics, or with a controllable or steerable light source such as a laser and corresponding hardware. In one aspect, a fixture 530 such as a lens, a mirror, or other mechanical and/or optical beam-steering elements may be provided to move the illumination source in a predetermined pattern. The illumination source may, for example, include a laser light source, a light emitting diode, an incandescent light source, or combinations of the foregoing. In another aspect, the supplemental light source 528 may include a plurality of illumination sources coupled to the housing, each one of the plurality of illumination sources having a different pose relative to the predetermined mobile computing device. In this manner, any reconstruction technique based upon directional lighting and/or different patterns of light may be usefully implemented using a number of separately controllable illumination sources coupled to the housing of the accessory 502.
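Because the pose of the illumination source relative to the camera is predetermined, a projected line or spot constrains depth by simple triangulation. As a hedged illustration (assuming a pinhole camera, a laser offset by a known baseline, and a known tilt toward the optical axis; none of these specifics are mandated by the disclosure), the geometry can be sketched as:

```python
import math

def laser_triangulate(pixel_x, focal_px, baseline_m, laser_angle_rad):
    """Depth of a surface point lit by a structured (laser line) source.

    The source sits a known baseline from the camera center and is
    tilted toward the optical axis by laser_angle_rad; pixel_x is the
    observed image coordinate in pixels from the principal point.
    From similar triangles: Z = f * b / (x + f * tan(angle)).
    """
    return (focal_px * baseline_m) / (pixel_x + focal_px * math.tan(laser_angle_rad))

# A point imaged 40 px from center with an 800 px focal length, a
# 50 mm baseline, and the laser tilted 2 degrees toward the axis:
z = laser_triangulate(40.0, focal_px=800.0, baseline_m=0.05,
                      laser_angle_rad=math.radians(2.0))
```

Sweeping the line (e.g., with the fixture 530) while repeating this calculation per illuminated pixel yields a depth profile across the object.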

FIG. 6 shows a three-dimensional imaging system with dual optical paths. In the embodiment of FIG. 6, a first optical path 602 of an accessory 600 optically couples a camera 604 of a mobile device 605 to a first lens 606, and a second optical path 608 optically couples the camera 604 to a second lens 610 that provides a pose that is offset from the first lens 606. An optical switch 612, such as a moveable mirror, a surface with controllable reflectivity, controllable mirrors and apertures, or any other hardware that can controllably select between the optical paths 602, 608, may be provided so that the accessory 600 can controllably direct the camera 604 toward the first lens 606 or the second lens 610. A processor on the mobile device may, for example, be configured (e.g., by programming) to control the camera 604 and the optical switch 612 to capture temporally adjacent images from a field of view of the first lens 606 and a second field of view of the second lens 610 with the camera 604. In another aspect, the processor may be configured to process the temporally adjacent images to obtain three-dimensional data from an overlapping region of the field of view and the second field of view.
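The control flow just described (select one path, capture, select the other path, capture again) can be sketched as follows. The switch and camera objects here are hypothetical stand-ins for the accessory and mobile-device hardware, not an interface defined by the disclosure:

```python
class OpticalSwitchController:
    """Sketch of the FIG. 6 control flow: capture temporally adjacent
    images from two offset lenses with a single camera."""

    def __init__(self, optical_switch, camera):
        self.switch = optical_switch
        self.camera = camera

    def capture_pair(self):
        self.switch.select(0)             # route light from the first lens
        first = self.camera.capture()
        self.switch.select(1)             # route light from the offset lens
        second = self.camera.capture()
        return first, second              # hand off for 3-D processing

class FakeSwitch:
    """Stand-in for the accessory's controllable mirror or aperture."""
    def __init__(self):
        self.path = None
    def select(self, path):
        self.path = path

class FakeCamera:
    """Stand-in for the device camera; records which path was active."""
    def __init__(self, switch):
        self.switch = switch
    def capture(self):
        return f"frame-path-{self.switch.path}"

switch = FakeSwitch()
pair = OpticalSwitchController(switch, FakeCamera(switch)).capture_pair()
```

In a real system the two captures should follow one another as quickly as the switch and camera allow, so that the scene does not move appreciably between frames.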

While dual optical path systems are described, it will be understood that any number of additional paths and supporting hardware/software may be used to capture additional views of an object, such as to resolve spatial ambiguities, address occlusions, and otherwise improve three-dimensional processing as contemplated herein.

FIG. 7 shows a method for using an accessory to capture three-dimensional images with a mobile device.

As shown in step 702, the method 700 may begin with attaching an accessory such as any of the accessories described above to a mobile device. The accessory may be removably and replaceably attachable to the mobile device, and may include a camera and a communication interface for communications with the mobile device.

As shown in step 704, the method 700 may include capturing a first image with the camera.

As shown in step 706, the method 700 may include capturing a second image substantially concurrently with the first image using a second camera of the mobile device, wherein the second camera has an overlapping field of view with the camera.

As shown in step 708, the method 700 may include transmitting the first image to the mobile device through the communication interface. In another aspect, this step may include transmitting the second image to the accessory, where subsequent three-dimensional processing is performed on a processor of the accessory.

As shown in step 710, the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view.

FIG. 8 shows a method for using an accessory to capture three-dimensional images with a mobile device. As described below, an accessory may provide multiple optical paths, along with an optical switch that can be controlled to selectively expose a camera of a mobile device to different poses relative to an object in a field of view. The two (or more) resulting images may be processed to extract three-dimensional data.

As shown in step 802, the method 800 may include attaching an accessory to a mobile device having a camera. As described above, the accessory may be removably and replaceably attachable to the mobile device, and the accessory may include an optical train with a first optical path for the camera to a first field of view and a second optical path for the camera to a second field of view having an overlapping field of view with the first field of view. The accessory may include an optical switch configured to selectively switch between the respective optical paths.

As shown in step 804, the method 800 may include selecting the first optical path, such as by controlling the optical switch accordingly with a control signal from processing circuitry of the accessory.

As shown in step 806, the method 800 may include capturing a first image with the camera, such as through the selected first optical path.

As shown in step 808, the method 800 may include selecting the second optical path, such as by controlling the optical switch accordingly with a control signal from the processing circuitry of the accessory.

As shown in step 810, the method 800 may include capturing a second image with the camera, e.g., through the selected second optical path.

As shown in step 812, the method 800 may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view provided by the two optical paths.

FIG. 9 shows a method for using an accessory to capture three-dimensional images with a mobile device. As described below, an accessory may provide two optical paths for two different cameras of the mobile device, which optical paths may serve to direct the two cameras toward overlapping fields of view of an object. In this manner, images may be captured using the two cameras and provided to a processor for extraction of three-dimensional data. The two cameras may advantageously be operated concurrently or substantially concurrently in order to avoid temporally-based changes in a shape or position of the object that might otherwise require additional processing for accurate extraction of three-dimensional data.

As shown in step 902, the method 900 may include attaching an accessory to a mobile device. The accessory may be removably and replaceably attachable to the mobile device as described above, and the accessory may include an optical train with a first optical path from a first camera of the mobile device to a first field of view and a second optical path from a second camera of the mobile device to a second field of view having an overlapping field of view with the first field of view.

As shown in step 904, the method may include capturing a first image with the first camera.

As shown in step 906, the method may include capturing a second image with the second camera. The second image may be captured substantially concurrently with the first image. That is, the first image and the second image may be captured sufficiently close in time to prevent substantial movement of an object within the overlapping field of view relative to the cameras. In one aspect, this may include operating the first camera and the second camera concurrently, or as close to concurrently as possible based upon the hardware and processing capabilities of the mobile device.
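One way to approximate concurrent operation in software, where the platform exposes the two cameras as independently triggerable objects (a hypothetical interface, used here only for illustration), is to release both capture calls together from separate threads:

```python
import threading

def capture_concurrently(first_camera, second_camera):
    """Trigger two captures as close to simultaneously as the host
    allows, using a barrier so that neither thread fires early."""
    barrier = threading.Barrier(2)
    frames = [None, None]

    def worker(index, camera):
        barrier.wait()                    # both threads release together
        frames[index] = camera.capture()

    threads = [threading.Thread(target=worker, args=(0, first_camera)),
               threading.Thread(target=worker, args=(1, second_camera))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return frames

class FakeCamera:
    """Stand-in camera returning a fixed frame label."""
    def __init__(self, label):
        self.label = label
    def capture(self):
        return self.label

frames = capture_concurrently(FakeCamera("first"), FakeCamera("second"))
```

True simultaneity would require hardware-level triggering; a thread barrier merely minimizes the software-induced skew between the two exposures.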

As shown in step 908, the method may include processing the first image and the second image on the mobile device to obtain three-dimensional data from the overlapping field of view.

The methods or processes described above, and steps thereof, may be realized in hardware, software, or any combination of these suitable for a particular application. The hardware may include a general-purpose computer and/or dedicated computing device. The processes may be realized in one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, or other programmable device, along with internal and/or external memory. The processes may also, or instead, be embodied in an application specific integrated circuit, a programmable gate array, programmable array logic, or any other device or combination of devices that may be configured to process electronic signals. It will further be appreciated that one or more of the processes may be realized as computer executable code created using a structured programming language such as C, an object oriented programming language such as C++, or any other high-level or low-level programming language (including assembly languages, hardware description languages, and database programming languages and technologies) that may be stored, compiled or interpreted to run on one of the above devices, as well as heterogeneous combinations of processors, processor architectures, or combinations of different hardware and software.

Thus, in one aspect, each method described above and combinations thereof may be embodied in computer executable code that, when executing on one or more computing devices, performs the steps thereof. In another aspect, the methods may be embodied in systems that perform the steps thereof, and may be distributed across devices in a number of ways, or all of the functionality may be integrated into a dedicated, standalone device or other hardware. In another aspect, means for performing the steps associated with the processes described above may include any of the hardware and/or software described above. All such permutations and combinations are intended to fall within the scope of the present disclosure.

It should further be appreciated that the methods above are provided by way of example. Absent an explicit indication to the contrary, the disclosed steps may be modified, supplemented, omitted, and/or re-ordered without departing from the scope of this disclosure.

The method steps of the invention(s) described herein are intended to include any suitable method of causing such method steps to be performed, consistent with the patentability of the following claims, unless a different meaning is expressly provided or otherwise clear from the context. So for example performing the step of X includes any suitable method for causing another party such as a remote user, a remote processing resource (e.g., a server or cloud computer) or a machine to perform the step of X. Similarly, performing steps X, Y and Z may include any method of directing or controlling any combination of such other individuals or resources to perform steps X, Y and Z to obtain the benefit of such steps.

While particular embodiments of the present invention have been shown and described, it will be apparent to those skilled in the art that various changes and modifications in form and details may be made therein without departing from the spirit and scope of this disclosure and are intended to form a part of the invention as defined by the following claims, which are to be interpreted in the broadest sense allowable by law.

Claims

1. A system for three-dimensional imaging with a mobile device, the system comprising:

a housing with a mechanical interface configured to removably and replaceably attach to a predetermined mobile computing device in a predetermined orientation; and

an illumination source coupled to the housing that illuminates a field of view of a camera of the predetermined mobile computing device from a predetermined pose.

2. The system of claim 1 wherein the illumination source is a structured light source that projects a predetermined pattern of light from the predetermined pose.

3. The system of claim 2 wherein the predetermined pattern of light includes one or more lines.

4. The system of claim 2 wherein the predetermined pattern of light includes one or more shapes.

5. The system of claim 1 further comprising a fixture on the housing to move the illumination source in a predetermined pattern.

6. The system of claim 1 wherein the illumination source includes a laser light source.

7. The system of claim 1 wherein the illumination source includes a light emitting diode.

8. The system of claim 1 further comprising a plurality of illumination sources coupled to the housing, each one of the plurality of illumination sources having a different pose relative to the predetermined mobile computing device.

9. The system of claim 1 further comprising a communication interface in the housing configured for data communications between the housing and the predetermined mobile computing device.

10. The system of claim 9 wherein the communication interface includes a wired communication interface that couples to a data port of the predetermined mobile computing device when the housing is attached to the predetermined mobile computing device.

11. The system of claim 9 wherein the communication interface includes a wireless communication interface.

12. The system of claim 9 wherein the communication interface includes a sensor to detect an action of the predetermined mobile computing device.

13. The system of claim 9 further comprising processing circuitry to obtain three-dimensional data from one or more images acquired by the camera of the predetermined mobile computing device.

14. The system of claim 13 wherein the processing circuitry is within the housing.

15. The system of claim 13 wherein the processing circuitry is within the predetermined mobile computing device.

16. The system of claim 1 wherein the housing includes a power source independent from the predetermined mobile computing device.

17. The system of claim 1 further comprising an optical train in the housing, wherein the optical train includes a first optical path that optically couples the camera to the field of view and a second optical path that optically couples a second camera of the predetermined mobile computing device to a second field of view from a different pose than the field of view.

18. The system of claim 1 further comprising an optical train in the housing, wherein the optical train includes a first optical path that optically couples the camera to the field of view and a second optical path that optically couples the camera to a second field of view from a different pose than the field of view, the optical train further including an optical switch that selectively couples the camera to the first optical path and the second optical path.

19. The system of claim 1 further comprising a second camera in the housing wherein the housing includes processing circuitry to control the second camera in response to a control signal received through a communication interface to the predetermined mobile computing device.

20. The system of claim 19 wherein the predetermined mobile computing device includes processing circuitry configured to receive a first image from the camera and a second image through the communication interface from the second camera, and further configured to process the first image and the second image to obtain three-dimensional data from an overlapping field of view of the camera and the second camera.

Patent History
Publication number: 20140043442
Type: Application
Filed: Jan 9, 2013
Publication Date: Feb 13, 2014
Inventors: Gregory Alan Borenstein (Brooklyn, NY), Anthony James Buser (Reading, PA), Ariel Douglas (Brooklyn, NY), Charles E. Pax (Brooklyn, NY)
Application Number: 13/737,579
Classifications
Current U.S. Class: Multiple Cameras (348/47); Picture Signal Generator (348/46)
International Classification: H04N 13/02 (20060101);