Autonomous Item Fabrication Utilizing a Trained Machine Learning Model

- eBay

An autonomous item generation system implements a trained machine learning model configured to output fabrication instructions for generating an item and metadata describing the item, automatically and independent of user input. Fabrication instructions output by the machine learning model are transmitted to a fabrication device for generating the item. The autonomous item generation system generates a listing for the item based on the metadata output by the machine learning model and publishes the listing to a virtual marketplace. Analytics data describing feedback for the item listing is used to generate training data for the machine learning model. The training data is input to the machine learning model, which causes the machine learning model to refine at least one control parameter according to a loss function that penalizes negative differences between predicted and observed feedback data for the item. The machine learning model with the refined parameter(s) is then used by the autonomous item generation system to generate fabrication instructions and metadata for an additional item.

Description
PRIORITY

This application is a continuation of and claims priority to U.S. patent application Ser. No. 17/015,822, filed on Sep. 9, 2020, which claims priority to U.S. patent application Ser. No. 62/899,060, filed on Sep. 11, 2019, the disclosures of which are hereby incorporated by reference in their entireties.

BACKGROUND

Virtual marketplaces such as network-based commerce systems are increasingly becoming a preferred mechanism by which vendors offer goods and services for sale, in contrast to conventional brick-and-mortar retail stores. Although the proliferation of virtual marketplaces enables vendors to reach a wider audience and offer shopping experiences that are curated for individual users, virtual marketplaces still present significant disadvantages. For instance, while virtual marketplaces provide item listings with certain tools to facilitate purchase of items, such as search interfaces to browse item listings and controls for purchasing a subject item of an item listing, virtual marketplaces require vendors to manually designate and design specific aspects of each item listing for publication at the virtual marketplace.

Additionally, while virtual marketplaces connect interested buyers with vendors, it remains the vendor's responsibility to manually perform various operations required to complete the sale of an item, such as processing financial transactions, contracting for shipment of the item to the buyer, and so forth. Furthermore, due to the computational and network resources required to aggregate and analyze data that describes potential buyer feedback pertaining to an item listing, virtual marketplaces only sporadically provide vendors with information describing item listing feedback. Due to this limited information, vendors are unable to instantly realize how an item listing, or the subject item, should be modified to account for changing trends and behaviors. Consequently, vendors are faced with undue delay at each stage in an item's lifecycle, such as item conception, item manufacture, item listing, item delivery, ascertaining item feedback, item design modification, and so forth.

SUMMARY

To overcome these problems, a system and techniques for autonomous item generation are described. An autonomous item generation system receives at least one machine learning model trained to generate both fabrication instructions for generating an item as well as metadata describing the item, automatically and independent of user input. The autonomous item generation system causes the at least one machine learning model to generate the fabrication instructions and metadata for an item. The autonomous item generation system then transmits the fabrication instructions to a fabrication device, which causes the fabrication device to generate the item. A listing for the item is generated from the item metadata output by the at least one machine learning model, and the autonomous item generation system publishes the listing to a virtual marketplace. The autonomous item generation system is configured to obtain analytics data describing one or more interactions with the item listing as published to the virtual marketplace, such as views of the item listing, favorites of the item listing, purchases of the item via the item listing, navigation from the item listing to a different item listing, shares of the item listing, comments on the item listing, user feedback to the item listing, combinations thereof, and so forth.

In some implementations, the autonomous item generation system is configured to serve as the virtual marketplace by publishing the item listing to a network (e.g., the Internet) and monitoring network traffic pertaining to the item listing. Alternatively or additionally, the autonomous item generation system is configured to leverage an existing virtual marketplace platform and implement one or more application programming interfaces (APIs) of the virtual marketplace to cause publication of the item listing on the virtual marketplace and obtain analytics data pertaining to the item listing from the virtual marketplace. Based on the analytics data, and optionally additional feedback data pertaining to the item, the autonomous item generation system forms training data for refining the at least one machine learning model. The training data is provided as input to the at least one machine learning model, which causes modification of one or more control parameters of the at least one machine learning model. The autonomous item generation system then generates fabrication instructions and metadata for an additional item using the at least one machine learning model with its modified parameter(s) and repeats its operations to continuously refine the machine learning model(s), without requiring user input or intervention to do so.
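
The generate-fabricate-publish-refine loop described above can be sketched as follows. Every component here (the `ToyModel` class and the `fabricate`, `publish`, `collect_analytics`, and `refine` callables) is a hypothetical stand-in for illustration only; a real system would back them with a trained model, a fabrication device, and a marketplace API.

```python
# Minimal sketch of one autonomous refinement cycle. All components
# are hypothetical stand-ins, not part of the described system.

class ToyModel:
    """Stand-in for the trained machine learning model."""
    def __init__(self):
        self.control_param = 0.0

    def generate(self):
        # Produce fabrication instructions and item metadata.
        instructions = {"control": self.control_param}
        metadata = {"title": "generated item"}
        return instructions, metadata

def autonomous_cycle(model, fabricate, publish, collect_analytics, refine):
    """One iteration: generate, fabricate, publish, observe, refine."""
    instructions, metadata = model.generate()
    item = fabricate(instructions)          # fabrication device
    listing = publish(metadata)             # virtual marketplace
    analytics = collect_analytics(listing)  # interaction feedback
    refine(model, analytics)                # adjust control parameters
    return item, listing, analytics

model = ToyModel()
item, listing, analytics = autonomous_cycle(
    model,
    fabricate=lambda ins: {"item_from": ins},
    publish=lambda meta: {"listing_for": meta["title"]},
    collect_analytics=lambda lst: {"score": 0.5},
    refine=lambda m, a: setattr(m, "control_param",
                                m.control_param + a["score"]),
)
```

Each pass through `autonomous_cycle` nudges the model's control parameters from observed feedback, so successive items reflect accumulated analytics without user intervention.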

This Summary introduces a selection of concepts in a simplified form that are further described below in the Detailed Description. As such, this Summary is not intended to identify essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.

BRIEF DESCRIPTION OF THE DRAWINGS

The detailed description is described with reference to the accompanying figures.

FIG. 1 is an illustration of an environment in an example implementation that is operable to employ the autonomous item generation techniques described herein.

FIG. 2 depicts an example implementation in which an autonomous item generation system of FIG. 1 generates an item and an item listing for the item using at least one machine learning model and refines the at least one machine learning model based on feedback data for the item.

FIG. 3 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1.

FIG. 4 depicts an example implementation of a user interface displaying an item listing generated by the autonomous item generation system of FIG. 1.

FIG. 5 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1.

FIG. 6 depicts an example implementation of a user interface for the autonomous item generation system of FIG. 1.

FIG. 7 is a flow diagram depicting a procedure in an example implementation for the autonomous item generation system of FIG. 1 generating an item using at least one machine learning model and refining the machine learning model for generating additional items.

FIG. 8 is a flow diagram depicting a procedure in an example implementation for outputting and modifying a user interface for the autonomous item generation system of FIG. 1.

FIG. 9 illustrates an example system including various components of an example device that can be implemented as any type of computing device as described and/or utilized with reference to FIGS. 1-8 to implement embodiments of the techniques described herein.

DETAILED DESCRIPTION

Overview

With advances in computing device technology, virtual marketplaces are increasingly used as a mechanism to publish item listings (e.g., offers of goods for sale). While such virtual marketplaces enable vendors to reach a wider audience than otherwise enabled by conventional brick-and-mortar storefronts, vendors continue to face undue burdens involved with bringing an initial concept design to a tangible item, determining a strategy for listing the item on a virtual marketplace, publishing the listing to the virtual marketplace, entering into transactions with potential buyers, securing shipping for transporting the item to a buyer, and verifying that the item actually reaches the buyer. For instance, designers are tasked with creating item designs, manufacturers with ensuring the resulting item matches the item's design, marketers with creating listings for the item, and sales representatives with ensuring the item reaches the buyer; each of these actions required to develop an item from conception to a tangible good requires substantial human input and is thus prone to human error and delay.

These inefficiencies are further compounded when considering the lifecycle of an item's development, which requires further time and resources to ascertain feedback describing the public's reaction to the item before determining modifications to be incorporated in subsequent item design iterations. For instance, the prohibitive computational and network resources required to collect, store, and analyze virtual marketplace user behavior information dissuade virtual marketplaces from continuously analyzing and publishing that information. This delay in aggregating and providing user behavior information forces vendors to wait for publication of the information, and further to spend significant time analyzing the behavior information in the hope of identifying certain characteristics of the item design, the item listing, or other aspects in the pipeline of item conception to delivery that can benefit from improvement. Furthermore, the accuracy of such vendor analysis depends on the expertise of the individuals tasked with gleaning trends from user behavior information, and even the most skilled team of individuals is unable to simultaneously handle analysis for multiple different items, much less an entire item catalog.

Accordingly, systems and techniques are described herein that enable autonomous item generation, in which generation of instructions for fabricating, manufacturing, or otherwise creating a tangible item and generation of a listing for the tangible item at a virtual marketplace are performed automatically, and independent of user input. To do so, an autonomous item generation system employs at least one machine learning model trained to generate, for a tangible item, item data that includes fabrication instructions for the item, a description for the item, tags describing various attributes of the item, and a recommended price for listing the item at a virtual marketplace. In addition to automatically generating the fabrication instructions and listing for the item, the autonomous item generation system is further configured to interface with the virtual marketplace to automatically perform operations involved with transporting the item from a fabrication device used to generate the item to a purchasing entity, including financial transactions, shipping operations, and so forth.

The autonomous item generation system is further configured to obtain feedback data describing one or more interactions with the item listing as published to the virtual marketplace, and continuously modify control parameters of the machine learning model used to generate the item and item listing, all without human user intervention or guidance. In this manner, the autonomous item generation system is configured to identify virtual marketplace trends and behaviors in real-time, well before even the most skilled human analyst could detect the same trends and behaviors when provided with the same behavior data. Thus, the autonomous item generation system and techniques described herein are configured to generate items and item listings without human guidance, identify observed interactions with the product listings, and continuously adapt to such observed interactions in generating subsequent items and item listings at both a rate and a scale that is not possible via conventional systems.

Example Environment

FIG. 1 illustrates a digital medium environment 100 in an example implementation that is operable to employ the autonomous item generation techniques described herein. The illustrated environment 100 includes computing device 102, which may be implemented according to a variety of different configurations. The computing device 102, for instance, may be configured as a desktop computer, a laptop computer, a mobile device (e.g., assuming a handheld configuration such as a tablet or mobile phone), and so forth. Thus, the computing device 102 may range from full resource devices with substantial memory and/or processing resources to devices with limited memory and/or processing resources (e.g., mobile devices). Additionally, although a single computing device 102 is shown, the computing device 102 may be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations “over the cloud” as described in further detail below with respect to FIG. 9.

The computing device 102 is illustrated as including an autonomous item generation system 104. The autonomous item generation system 104 is implemented at least partially in hardware of the computing device 102 and represents functionality of the computing device 102 to generate an item 106 and an item listing 108 for publication at a virtual marketplace, automatically and independent of user input, guidance, instruction, or other form of intervention that facilitates generation of the item 106 or the item listing 108. In this manner, the item 106 is representative of a tangible good, product, and the like. In some implementations, the item 106 is representative of digital content, such as a digital graphic, an animation, a video, and so forth. The item listing 108 is representative of publication information describing the item 106, as described in further detail below.

To enable generation of the item 106 and the item listing 108, the autonomous item generation system 104 employs an item generation module 110, a transaction module 112, a feedback module 114, and a training module 116. The item generation module 110, the transaction module 112, the feedback module 114, and the training module 116 are implemented at least partially in hardware of the computing device 102 (e.g., through use of a processing system and computer-readable storage media), as described in further detail below with respect to FIG. 9.

The item generation module 110 is representative of functionality of the computing device 102 to generate the item 106 and item listing 108, without user input otherwise required by conventional systems (e.g., input specifying design criteria for the item 106, input specifying information to be included or emphasized in the item listing 108, input specifying demographic targeting information for the item listing 108, and so forth). To do so, the item generation module 110 employs machine learning model 118. Machine learning model 118 is representative of one or more machine learning models, where each model represented by machine learning model 118 can be configured according to a range of different machine learning model architectures. For instance, machine learning model 118 is representative of a model having an architecture based on neural networks (e.g., fully-connected neural networks, convolutional neural networks, or recurrent neural networks), deep learning networks, generative adversarial networks (GANs), decision trees, support vector machines, linear regression, logistic regression, Bayesian networks, random forest learning, dimensionality reduction algorithms, boosting algorithms, combinations thereof, and so forth.

Regardless of an underlying machine learning model architecture, machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106, where the metadata is useable by the autonomous item generation system 104 to generate the item listing 108. For instance, the machine learning model 118 may be representative of a GAN that is trained to generate fabrication instructions for a particular type of item (e.g., an article of clothing, a work of art, a three-dimensional model, and so forth). Such a trained GAN implementation of the machine learning model 118 may be generated by providing the machine learning model 118 with a plurality of training sets, where each training set includes information that is useable to guide the machine learning model towards producing a desired output.

For instance, in the context of a GAN trained to generate fabrication instructions and metadata for an article of clothing, each training dataset may include fabrication instructions for an example article of clothing (e.g., instructions describing one or more fabrics or materials to be used in generating the article of clothing; cut patterns for the one or more fabrics or materials; adhesion instructions for combining the cuts of materials or fabrics such as sewing patterns, sewing thread types, fabric adhesive types, and the like; folding instructions for the article of clothing; and so forth). In addition to fabrication instructions for the example article of clothing, the training dataset may further include metadata describing the example article of clothing (e.g., a name for the article of clothing, a product category for the article of clothing, descriptive attributes of the article of clothing, a demographic audience for the article of clothing, and so forth). Each training dataset for the example machine learning model 118 trained to generate fabrication instructions and metadata for an article of clothing may further include information describing feedback pertaining to the subject article of clothing for the training dataset. For instance, such feedback information may specify a number of times a listing for the article of clothing was viewed via a virtual marketplace, a number of purchases made of the article of clothing, a percentage of views that resulted in a share or favorite of the article of clothing, reviews for the article of clothing, information specifying comparative feedback data for different articles of clothing listed on the virtual marketplace, information describing an appearance of a listing for the article of clothing on the virtual marketplace, combinations thereof, and so forth.
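
One training dataset record of the kind described above might be structured as follows; all field names and values are illustrative assumptions, not the actual dataset schema.

```python
# Sketch of a single training record for the clothing example:
# fabrication instructions, descriptive metadata, and feedback
# signals. Field names and values are illustrative assumptions.

training_record = {
    "fabrication_instructions": {
        "materials": ["cotton", "polyester"],
        "cut_patterns": ["sleeve", "torso_front", "torso_back"],
        "adhesion": {"method": "sewing", "thread": "cotton_40wt"},
    },
    "metadata": {
        "name": "long-sleeve shirt",
        "category": "clothing/tops",
        "attributes": ["long sleeves", "crew neck"],
        "demographic": "adults",
    },
    "feedback": {
        "views": 1200,
        "purchases": 85,
        "share_rate": 0.04,
    },
}
```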

In this manner, the machine learning model 118 is representative of a model trained on a generic dataset for one or more characteristics to be learned by the model during training. For instance, in an example of a machine learning model 118 trained to learn artwork characteristics, the machine learning model 118 is representative of a model trained on a training dataset that includes a collection of different works of art that share one or more common characteristics (e.g., being portraits of a human subject, depicting landscapes, being abstract art, being vector artwork, comprising a certain color palette, and so forth). As such, the characteristics learned by the machine learning model 118 during training are dictated by the contents of the dataset used during training, such that the training dataset defines a style or theme of the machine learning model's 118 output following training.

Accordingly, the machine learning model 118 may be representative of a model trained on a training dataset that consists of human portraits, such that the machine learning model 118 is configured to output artwork that depicts human portraits after training is complete. In some implementations, machine learning model 118 is representative of one or more models trained on a plurality of different training datasets. For instance, continuing the previous example of a training dataset consisting of human portraits, after training the machine learning model 118 to output artwork depicting human portraits, training may continue using an additional training dataset that includes artwork depicting landscapes, such that the machine learning model 118 is further trained to output artwork depicting human portraits against landscape backgrounds.

In a similar manner, the machine learning model 118 may be representative of a plurality of different machine learning models arranged in a stacked configuration, where the output of one model is provided as input to a different model of the stacked configuration. In such an example implementation, each model of the stacked configuration of machine learning models may be trained on a different training dataset, such as one trained to output landscape artwork, another trained to output abstract artwork, and another trained to output watercolor artwork. Continuing this example stacked configuration, landscape artwork output by the initial model would be provided as input to the model trained to output abstract artwork, which would output an abstract artwork representation of the input landscape, which in turn would be provided to the watercolor artwork-trained model, such that the resulting output of the stacked configuration of machine learning models is an abstract watercolor landscape work of art. Thus, the specific characteristics learned by the machine learning model 118 are dependent on the training dataset(s) used to generate the machine learning model 118 and are not restricted to the examples provided herein.
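
The stacked configuration above amounts to function composition, where each stage consumes the previous stage's output. The stage functions below are hypothetical stand-ins for trained models (landscape, abstract, watercolor), shown only to illustrate the data flow.

```python
# Sketch of a stacked model pipeline: the output of each stage is
# fed as input to the next. Each stage is a stand-in for a model
# trained on a different dataset.

def landscape_model(seed):
    # Stand-in for a model trained to output landscape artwork.
    return {"content": "landscape", "seed": seed}

def abstract_model(artwork):
    # Stand-in for a model trained to output abstract artwork.
    artwork["style"] = ["abstract"]
    return artwork

def watercolor_model(artwork):
    # Stand-in for a model trained to output watercolor artwork.
    artwork["style"].append("watercolor")
    return artwork

def stacked_generate(seed, stages):
    """Run the seed through each stage in order."""
    output = seed
    for stage in stages:
        output = stage(output)
    return output

result = stacked_generate(
    42, [landscape_model, abstract_model, watercolor_model])
# result describes an abstract watercolor landscape
```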

As an additional example, in an implementation where the machine learning model 118 represents a GAN trained to generate fabrication instructions and metadata for a piece of art, each training dataset may include fabrication instructions for a specific piece of art (e.g., instructions designating one or more materials to use in generating the piece of art such as paint type and color, ink, paper, canvas, combinations thereof, and the like; printing instructions for generating the piece of art on a particular medium; dimensional constraints for the piece of art; and so forth). Individual training datasets supplement the fabrication instructions for the piece of art by including metadata describing the particular piece of art (e.g., a title for the piece of art, a description of the art, tags for listing the piece of art in a virtual marketplace, a recommended price for the piece of art, combinations thereof, and so forth). Each training dataset may further include feedback information for the piece of art, such as feedback information similar to that described above with respect to the example training dataset for training the machine learning model 118 to generate fabrication instructions for an article of clothing.

Given such example training datasets, in an example implementation where the machine learning model 118 is configured as a GAN, the GAN may be trained by causing different portions of the GAN (e.g., a generator portion and a discriminator portion) to compete in an adversarial objective (e.g., a min-max game) that seeks to maximize positive feedback associated with a corresponding item 106 generated according to fabrication instructions 120 output by the GAN. For instance, the feedback data of the example training datasets may be normalized on a scale that indicates whether feedback data for an item is generally positive or negative (e.g., feedback data indicating numerous views, purchases, shares, positive reviews of the item may be characterized and quantified as indicating positive feedback for the subject item of the training dataset).

Such positive feedback data can be contrasted with feedback data indicating few purchases, shares, or favorites of the item, feedback data indicating negative user reviews, and/or feedback data indicating a view of the item and subsequent purchase of a different similar item, which may be characterized and quantified as indicating negative feedback for the subject item of the training dataset. Under a training objective function, the machine learning model 118 may be configured to generate fabrication instructions 120 and metadata for an item 106 in a manner that seeks to maximize positive feedback data for the item 106.
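
The normalization described above, in which raw feedback signals are quantified on a scale indicating generally positive or negative feedback, might be sketched as follows. The signal names, weights, and the [-1, 1] scale are illustrative assumptions.

```python
# Sketch of collapsing raw feedback signals into one score on a
# [-1, 1] scale, where positive values indicate generally positive
# feedback. Weights are illustrative assumptions.

def normalize_feedback(views, purchases, shares, negative_reviews):
    if views == 0:
        return 0.0  # no interactions yet, neutral score
    purchase_rate = purchases / views
    share_rate = shares / views
    negative_rate = negative_reviews / views
    score = 2.0 * purchase_rate + 1.0 * share_rate - 3.0 * negative_rate
    # Clamp to the [-1, 1] scale.
    return max(-1.0, min(1.0, score))
```

A training objective of the kind described above would then seek model parameters whose generated items maximize this score.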

In some implementations, training the machine learning model 118 includes supplementing training data from the training datasets with noise (e.g., Gaussian input noise), which causes the generator portion of the GAN to generate samples that could potentially fool the discriminator portion in the min-max game objective example. In this manner, the machine learning model 118 is representative of one or more machine learning models that are trained to identify different aspects of item fabrication instructions and/or descriptive metadata for the item that influence positive feedback associated with the item.
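
Supplementing the generator's input with Gaussian noise, as described above, can be sketched as follows; the latent dimensionality and noise scale are illustrative assumptions.

```python
# Sketch of adding zero-mean Gaussian noise to a latent input before
# it reaches the generator portion of a GAN. Dimensions and the
# noise scale (sigma) are illustrative assumptions.
import random

def noisy_input(latent, sigma=0.1, rng=random):
    """Add zero-mean Gaussian noise to each latent dimension."""
    return [z + rng.gauss(0.0, sigma) for z in latent]

latent = [0.0] * 8
sample = noisy_input(latent)  # a perturbed copy of the latent vector
```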

During training on an initial dataset, the machine learning model 118 is instructed (e.g., via an objective function using convolutional neural networks) to generate an output defined by characteristics of a training dataset (e.g., articles of clothing with long sleeves). Outputs of the machine learning model 118 are then compared to the training dataset, and the model is guided to generate realistic articles of clothing with long sleeves based on a loss function determined from the comparison (e.g., F1 loss, visual perceptual quality loss, combinations thereof, and so forth in an implementation where the machine learning model 118 is configured as a GAN). Training continues until the machine learning model 118 converges and consistently generates outputs that are within a comparative threshold to the training dataset, at which point the machine learning model 118 is output, or adapted to different characteristics using additional training datasets.
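
The convergence criterion described above, continuing training until outputs consistently fall within a comparative threshold, can be sketched as follows. The loss values are simulated; a real loop would compute a loss per training batch (e.g., from the discriminator).

```python
# Sketch of a convergence check: training stops once the loss stays
# below a comparative threshold for several consecutive steps. The
# threshold and patience values are illustrative assumptions.

def train_until_converged(loss_per_step, threshold=0.05, patience=3):
    """Return the step at which loss stayed below threshold for
    `patience` consecutive steps, or None if it never converged."""
    consecutive = 0
    for step, loss in enumerate(loss_per_step):
        consecutive = consecutive + 1 if loss < threshold else 0
        if consecutive >= patience:
            return step
    return None

losses = [0.9, 0.4, 0.2, 0.04, 0.03, 0.02, 0.01]
converged_at = train_until_converged(losses)
```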

In some implementations, the machine learning model 118 is further trained to identify differences in feedback data associated with an item among different demographic segments. For instance, training data may indicate that for a same article of clothing (e.g., a down jacket), the article of clothing is generally associated with positive feedback for a particular geographic location demographic segment during a three month window (e.g., during a winter season for the particular geographic location) and is generally associated with negative feedback for the particular geographic location demographic segment at other times (e.g., during spring, summer, and fall seasons for the particular geographic location). In this manner, the machine learning model 118 is further trained with the objective of maximizing positive feedback associated with an item at an audience-specific level, where the audience can be constrained according to any range of control parameters (e.g., geographic location, time of day, day(s) of week, audience age, audience gender, combinations thereof, and so forth).
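
Keeping feedback scores per demographic segment rather than pooled, as in the down-jacket example above, might look like the following sketch; the segment keys and scores are illustrative assumptions.

```python
# Sketch of audience-constrained feedback aggregation: feedback is
# averaged per demographic segment (here, a location/season pair)
# rather than pooled across all users.
from collections import defaultdict

def aggregate_by_segment(events):
    """events: iterable of (segment, score) pairs.
    Returns the mean feedback score per segment."""
    totals = defaultdict(lambda: [0.0, 0])
    for segment, score in events:
        totals[segment][0] += score
        totals[segment][1] += 1
    return {seg: s / n for seg, (s, n) in totals.items()}

events = [
    (("alaska", "winter"), 0.8),   # down jacket, winter: positive
    (("alaska", "winter"), 0.6),
    (("alaska", "summer"), -0.4),  # same jacket, summer: negative
]
scores = aggregate_by_segment(events)
```

An objective constrained this way lets the model favor items that score well for a specific audience even when pooled feedback is mixed.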

In some implementations, the machine learning model 118 may be received by the autonomous item generation system 104 together with an indication of control parameters in the machine learning model's 118 latent space(s). Alternatively or additionally, the training module 116 is configured to identify one or more control parameters in the machine learning model's 118 latent space(s). Such control parameters correlate to any aspect of the machine learning model's 118 output. For instance, in an example where the machine learning model 118 is configured to output works of art depicting landscapes, one control parameter may affect a size of the sky in the resulting landscapes, another control parameter may define a color palette (e.g., one or more colors) used in depicting mountains in the landscape, another control parameter may describe a medium on which the landscape is depicted, another control parameter may describe characteristics of a demographic audience for which the landscape is generated, and so forth. In implementations where the machine learning model 118 is received by the autonomous item generation system 104 without an indication as to which control parameters correlate with specific output characteristics, the training module 116 is configured to identify control parameters by adjusting individual control parameters of the machine learning model 118 and determining how the adjustment affects the resulting model output.
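
The probing strategy described above, adjusting individual control parameters and observing the effect on model output, can be sketched as follows. The model and the measured characteristic ("sky size") are hypothetical stand-ins.

```python
# Sketch of identifying control parameters in a latent space by
# perturbing one dimension at a time and measuring the change in an
# output characteristic. The toy model below is a stand-in whose
# output depends only on dimension 0.

def probe_control_parameters(model, latent, measure, delta=0.5):
    """Perturb each latent dimension by delta and record the change
    in a measured output characteristic (e.g., sky size)."""
    baseline = measure(model(latent))
    effects = []
    for i in range(len(latent)):
        perturbed = list(latent)
        perturbed[i] += delta
        effects.append(measure(model(perturbed)) - baseline)
    return effects

toy_model = lambda z: {"sky_size": 2.0 * z[0]}
effects = probe_control_parameters(
    toy_model, [0.0, 0.0, 0.0],
    measure=lambda out: out["sky_size"])
# only dimension 0 moves the measured characteristic
```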

Thus, regardless of architectural configuration of the machine learning model 118, the machine learning model 118 is representative of one or more trained machine learning models that are configured to generate fabrication instructions 120 for the item 106 and metadata describing the item 106 that is useable by the autonomous item generation system 104 to generate the item listing 108, in a manner that seeks to maximize positive feedback associated with the item 106.

Upon generating fabrication instructions 120 for the item 106 using the machine learning model 118, the item generation module 110 is configured to transmit the fabrication instructions 120 to a fabrication device 122, which is representative of one or more machines that are configured to generate the item 106, responsive to receipt of the fabrication instructions. For instance, in an example implementation where the item 106 is an article of clothing, the fabrication device 122 is representative of one or more textile machines, such as a textile sourcing machine, a textile spinning machine, a textile finishing machine, cloth finishing machine, a knitting machine, a fabric seaming machine, a crochet machine, a quilting machine, a tufting machine, a weaving machine, a component (e.g., zipper, button, etc.) manufacturing machine, a measuring machine, a cutting machine, an embroidery machine, a sewing machine, a washing machine, a drying machine, a folding machine, a monogramming machine, an applique attachment machine, combinations thereof, and the like.

As another example, in an implementation where the item 106 is a piece of art, the fabrication device 122 may be configured as one or more of a two-dimensional printer, a three-dimensional printer, a computer numerical control (CNC) machine, combinations thereof, and so forth. As yet another example, in an implementation where the item 106 is representative of digital content, the fabrication device may be representative of computer-aided design (CAD) software implemented at least partially in hardware of a computing device, such as in hardware of the computing device 102. Thus, the fabrication device 122 is representative of any one or combination of multiple devices that are capable of generating the item 106 based on the fabrication instructions 120 output by the machine learning model 118.

The transaction module 112 is representative of functionality of the computing device 102 to generate the item listing 108 for the item 106, based on the metadata describing the item 106 as output by the machine learning model 118. The item listing 108 generated by the transaction module 112 is representative of information that describes an appearance of the item listing 108 as published at a virtual marketplace 124, both as visually appearing to a viewing user of the marketplace 124 as well as appearing in data observed by a search engine (e.g., when indexing virtual marketplace 124 or otherwise becoming aware of the item listing 108). For instance, the item listing 108 is representative of data specifying at least one of a title for the item 106, a detailed description for the item 106, a representative image for the item 106, a suggested price for the item 106, one or more different items that are similar to the item 106, combinations thereof, and so forth. Example implementations of an item listing 108 generated by the transaction module 112 are illustrated in FIGS. 3 and 4 and described in further detail below.

The transaction module 112 is further representative of functionality of the computing device 102 to interface with the virtual marketplace 124 or directly implement the virtual marketplace 124 as part of the autonomous item generation system 104. For instance, the virtual marketplace 124 is representative of a service configured to publish item listings 108 where items (e.g., tangible goods) are offered for sale. In some implementations, the virtual marketplace 124 is representative of a social networking system or other type of informational system that is configured to output the item listing 108 for display to one or more users. The virtual marketplace 124 may be hosted on dedicated or shared server machines (not shown) that are communicatively coupled to enable communications between the server machines. In implementations where the virtual marketplace 124 is implemented at the computing device 102, the virtual marketplace 124 may be implemented over distributed servers as described in further detail below with respect to FIG. 9. In this manner, the virtual marketplace 124 is further representative of connected components that allow the components to share and access common data, such as data hosted on one or more databases.

The virtual marketplace 124 is further representative of a platform that provides at least one of a publishing mechanism, a listing mechanism, or a price-setting mechanism that enables a seller to list, or publish, information pertaining to tangible goods for sale. In a similar manner, the virtual marketplace 124 is representative of a platform that enables a buyer to express interest in, or indicate a desire to purchase, the tangible goods offered for sale. In this manner, the virtual marketplace 124 may comprise at least one publication engine and at least one shopping engine. The publication engine of the virtual marketplace is associated with one or more Application Programming Interfaces (APIs) that enable the transaction module 112 to communicate the item listing 108 to the virtual marketplace 124 and cause the virtual marketplace 124 to publish the item listing 108 in a manner that can be observed and interacted with by a user of the virtual marketplace (e.g., a potential buyer of the item 106). The shopping engine of the virtual marketplace 124 is associated with one or more APIs that enable a user of the virtual marketplace 124 to accept an offer for sale of the item 106 by agreeing to pay a price associated with the item 106. In some implementations, the APIs supported by the shopping engine of the virtual marketplace 124 support different price listing formats for publication of the item listing 108. As an example, price listing formats include fixed-price listing formats (e.g., the traditional classified advertisement-type listing or a catalog listing), auction-type price listing formats, buyout-type listing formats (e.g., the Buy-It-Now (BIN) technology developed by eBay Inc., of San Jose, Calif.), combinations thereof, and so forth.

The virtual marketplace 124 is further representative of a navigation engine that enables a user of the virtual marketplace to browse and inspect various item listings 108 published by the virtual marketplace 124. For instance, the navigation engine of the virtual marketplace 124 enables a user to identify and discover various item listings by providing a search module that enables keyword and/or image searches of item listings 108 or other information published by the virtual marketplace 124. In some implementations, the virtual marketplace 124 may organize item listings 108 according to various data structures (e.g., category, catalog, or other form of classification for differentiating and grouping item listings 108, relative to one another). In this manner, the virtual marketplace 124 may provide tools that enable users to browse published item listings 108 according to metadata tags that categorize the item listing 108, rather than having to index through an entirety of item listings 108 published by the virtual marketplace. Various other navigation techniques and item listing classification and categorization approaches may be enabled by the virtual marketplace 124 without departing from the spirit and scope of the examples described herein.
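The category-based browsing described above amounts to matching metadata tags against a user's query. A minimal sketch in Python follows; the listing schema and function name are illustrative assumptions, not part of this disclosure:

```python
def filter_listings_by_tags(listings, required_tags):
    """Return only listings whose metadata tags contain every required tag.

    Sketches the tag-driven browsing described above: listings are grouped
    and discovered via categorizing tags rather than by indexing through
    every published listing. The listing schema here is hypothetical.
    """
    required = set(required_tags)
    return [listing for listing in listings if required <= set(listing["tags"])]

listings = [
    {"title": "Casual Shirt", "tags": ["menswear", "shirt", "casual"]},
    {"title": "Abstract Print", "tags": ["art", "print"]},
]
matches = filter_listings_by_tags(listings, ["menswear"])  # only the shirt
```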

The virtual marketplace 124 is further representative of a messaging system that enables generation and delivery of messages among various entities involved with facilitating a transaction via the virtual marketplace 124. For instance, the messaging system implemented by the virtual marketplace 124 may facilitate communications among a selling entity that published the item listing 108 to the virtual marketplace, a purchasing entity 126 that purchases the item 106 via interaction with the item listing 108, a shipping entity (not shown) contracted to physically deliver the item 106 from the fabrication device 122 to the purchasing entity 126, and one or more financial institutions tasked with transferring funds among the various entities (e.g., the autonomous item generation system 104, the virtual marketplace 124, the fabrication device 122, the shipping entity, the purchasing entity 126, and so forth).

Communication of data among these various entities involved in facilitating the fabrication of the item 106, the publication of the item listing 108 to the virtual marketplace, the contracting for sale of the item 106 to the purchasing entity 126, the delivery of the item 106 to the purchasing entity 126, and the transfer(s) of funds among the various entities is enabled via the network 128. The network 128 is representative of a real time communication protocol that connects the autonomous item generation system 104 to one or more of the fabrication device 122, the virtual marketplace 124, the purchasing entity 126, one or more shipping entities, and one or more financial institutions involved in these example activities. For instance, the network 128 may represent functionality of a real-time communication protocol, such as a remote procedure call that enables a streaming, always-connected link among different entities. Alternatively or additionally, the network 128 may be representative of the Internet, a subscriber network such as a cellular or Wi-Fi network, combinations thereof, and so forth.

The feedback module 114 is representative of functionality of the computing device 102 to obtain analytics data from the virtual marketplace 124, such as information describing user feedback pertaining to the item listing 108 as published at the virtual marketplace 124. Data obtained by the feedback module 114 is representative of any form of information that indicates a manner in which the item listing 108 was observed and/or interacted with by users of the virtual marketplace 124. For instance, the feedback module 114 may include one or more APIs configured to obtain data describing a number of views (e.g., a number of impressions) of the item listing 108, a number of purchases of the item via the item listing 108, a number of favorites of the item listing 108, a number of shares of the item listing 108, user reviews submitted for the item listing 108, combinations thereof, and so forth.

The feedback module 114 is further representative of functionality of the autonomous item generation system 104 to obtain user profile information (e.g., age, gender, location, and so forth) pertaining to individual users that interacted with the item listing 108, and metadata describing the interaction (e.g., a duration of the interaction, specific aspects of the item listing 108 with which a user interacted, an amount of time spent viewing certain portions of the item listing, a date and time of the interaction, combinations thereof, and so forth). Thus, the feedback module 114 is representative of functionality of the autonomous item generation system 104 to understand how the particular item listing 108 for the item 106 was received, relative to other item listings published on the virtual marketplace 124.
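The per-listing interaction counts the feedback module 114 is described as collecting can be illustrated with a short aggregation sketch; the event schema below is an assumption made for illustration only:

```python
from collections import Counter

def summarize_feedback(events):
    """Aggregate raw interaction events into per-listing counts.

    Sketches the analytics described above: views (impressions), purchases,
    favorites, and shares of an item listing. Event dictionaries with a
    "type" field are a hypothetical schema, not the disclosure's format.
    """
    counts = Counter(event["type"] for event in events)
    return {
        "views": counts["view"],
        "purchases": counts["purchase"],
        "favorites": counts["favorite"],
        "shares": counts["share"],
    }

events = [
    {"type": "view", "user": "u1"},
    {"type": "view", "user": "u2"},
    {"type": "favorite", "user": "u2"},
    {"type": "purchase", "user": "u2"},
]
summary = summarize_feedback(events)  # 2 views, 1 purchase, 1 favorite
```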

The training module 116 is representative of functionality of the autonomous item generation system 104 to modify the machine learning model 118 by modifying at least one control parameter of the machine learning model 118 based on data obtained by the feedback module 114 pertaining to the item listing 108. In this manner, the training module 116 is configured to generate additional training datasets for refining the machine learning model 118, where each additional training dataset includes data describing the item listing 108 and the fabrication instructions 120 for the item 106 along with at least one instance of analytics data obtained by the feedback module 114 describing a user interaction with the item listing 108 at the virtual marketplace 124.

For instance, the training module 116 may generate a training dataset that includes the control parameters for the machine learning model 118 used in generating the fabrication instructions 120 and metadata for the item 106 (e.g., metadata used to generate the item listing 108). The training module 116 supplements the control parameters in the training dataset with predicted feedback data for the item 106, and provides the training dataset to the machine learning model 118 as input along with a loss function that penalizes differences between the predicted feedback data of the training dataset and the observed feedback data obtained from the feedback module 114 that indicate the control parameters resulted in negative feedback for the item 106. By penalizing differences indicating negative feedback between predicted and observed feedback data (e.g., differences indicating that a predicted average user review score for the item 106 is higher than an observed average user review score for the item 106), the loss function implemented by the training module 116 rewards differences indicating that feedback pertaining to the item 106 was more positive than predicted.
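The asymmetric penalty described above, which penalizes only shortfalls of observed feedback relative to predicted feedback, can be sketched as follows. The metric names and per-metric weighting are illustrative assumptions; the disclosure does not give an exact formula:

```python
def feedback_loss(predicted, observed, weights=None):
    """Asymmetric loss over feedback metrics.

    Penalizes only cases where observed feedback falls short of the
    prediction (e.g., a lower-than-predicted average review score); outcomes
    that are better than predicted contribute no penalty, so the loss
    implicitly rewards more-positive-than-predicted feedback.
    """
    weights = weights or {metric: 1.0 for metric in predicted}
    loss = 0.0
    for metric, pred in predicted.items():
        obs = observed.get(metric, 0.0)
        shortfall = max(0.0, pred - obs)  # only the negative difference
        loss += weights[metric] * shortfall
    return loss

# A predicted review average of 4.5 against an observed 4.0 is penalized;
# observed purchases exceeding the prediction add no penalty.
loss = feedback_loss(
    predicted={"avg_review": 4.5, "purchases": 100},
    observed={"avg_review": 4.0, "purchases": 130},
)
```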

Through training based on such example training datasets and the observed feedback data, the training module 116 is configured to cause the machine learning model 118 to output ideal control parameters for use in generating fabrication instructions 120 and metadata useable to create an item listing 108 for a subsequent iteration of the item 106. In some implementations, control parameters output through this training process may be configured as a ranking of different combinations of control parameters for the machine learning model 118, where the ranking is ordered based on which control parameter combinations are likely to garner the most positive feedback via publication to the virtual marketplace 124.

The training module 116 is configured to refine control parameters of the machine learning model 118 using any number of different loss functions. For instance, loss functions may be layered, such that the loss function penalizing negative differences between predicted and observed user review average scores is layered with a loss function that penalizes negative feedback differences between predicted and observed user interactions with the item listing 108 (e.g., click-through rates, purchase rates, etc.). In some implementations, in addition to training the machine learning model 118 using loss functions that consider differences between predicted and observed feedback data, the training module 116 is configured to employ one or more multi-armed bandit approaches to explore novel control parameter combinations that differ from previously attempted control parameter combinations for the machine learning model 118.
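One common multi-armed bandit strategy that matches the exploration behavior described above is epsilon-greedy selection, sketched below. The disclosure does not name a specific bandit algorithm, so this choice, along with the history schema, is an illustrative assumption:

```python
import random

def choose_parameter_combination(history, candidates, epsilon=0.1, rng=random):
    """Epsilon-greedy selection over control-parameter combinations.

    `history` maps a candidate key to a list of observed feedback scores.
    With probability `epsilon`, a previously untried combination is explored;
    otherwise the combination with the best mean observed feedback is
    exploited. Keys and scores here are hypothetical.
    """
    untried = [c for c in candidates if not history.get(c)]
    if untried and rng.random() < epsilon:
        return rng.choice(untried)  # explore a novel combination
    scored = [c for c in candidates if history.get(c)]
    if not scored:
        return rng.choice(candidates)
    return max(scored, key=lambda c: sum(history[c]) / len(history[c]))

history = {"A": [0.2, 0.4], "B": [0.9], "C": []}
best = choose_parameter_combination(history, ["A", "B", "C"], epsilon=0.0)
# With epsilon=0.0 exploration is disabled and "B" (mean 0.9) is exploited.
```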

In this manner, the training module 116 is configured to continuously refine the machine learning model 118 to adapt its subsequent generation of fabrication instructions 120 and item metadata for use in generating the item listing 108 for additional items 106, while accounting for user behaviors and trends at the virtual marketplace 124. By enabling performance of the item generation module 110, the transaction module 112, the feedback module 114, and the training module 116 to be accomplished independent of user input or other intervention that guides the generation of the fabrication instructions 120 or design of the item listing 108, the autonomous item generation system 104 advantageously enables real-time adaptation to changes in user behavior and trends at the virtual marketplace 124 at a rate that is impossible to achieve by conventional systems that require human input or intervention. The advantages enabled by the autonomous item generation system 104 relative to conventional approaches are exponentially increased when generating a catalog of items 106 and corresponding item listings 108, as the human hours required by conventional approaches prohibit generating fabrication instructions 120 and item listings 108 for an item 106 in real-time.

Having considered an example digital medium environment, consider now a discussion of example implementations of autonomously generating an item and an item listing for the item using the techniques described herein.

FIG. 2 depicts a system 200 in an example implementation showing operation of the autonomous item generation system 104 of FIG. 1 in greater detail as generating an item 106 and an item listing 108, automatically and independent of user input via the machine learning model 118, and as refining the machine learning model 118 based on data describing one or more interactions with the item listing 108 as published to the virtual marketplace 124. To do so, system 200 illustrates components of the autonomous item generation system 104, including the item generation module 110, the transaction module 112, the feedback module 114, and the training module 116. The item generation module 110 is configured to cause the machine learning model 118 to output item data 202 for the item 106, where a type of the item 106 depends on an objective and training dataset used to originally train the machine learning model 118.

The item data 202 includes the fabrication instructions 120 for the item 106 along with metadata for the item 106 including an item description 204, at least one item tag 206, and item pricing data 208. The item description 204 is representative of a title for the item 106 to be included in the item listing 108, a detailed textual description for the item listing 108, and a digital rendering (e.g., an image, a video, an animation, combinations thereof, and so forth) of the item 106 to be represented in the item listing 108. The item tags 206 included in the item data 202 are representative of metadata to be embedded in the item listing 108 that enables the virtual marketplace 124 and/or a search engine (not shown) to identify and categorize the item listing 108 (e.g., relative to other item listings published at the virtual marketplace 124).

The item pricing data 208 specifies at least one suggested price to be associated with the item 106 (e.g., to be displayed as part of the item listing 108). In some implementations, the item tags 206 further specify audience information for the item listing 108 to define a particular manner in which the item listing 108 is published at the virtual marketplace 124. For instance, the item tags 206 may include information specifying a particular demographic for the item listing 108 that restricts its publication to the particular demographic (e.g., specifying different item pricing data 208 for European and Asian markets, specifying different visual appearances for conveying the item description 204 in the item listing 108 at different times of the day, and so forth). Thus, the item data 202 generated by the machine learning model 118 is representative of information that is useable by the fabrication device 122 to fabricate the item 106 as well as information that is useable by the transaction module 112 to generate the item listing 108.
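The item data 202 described in the preceding paragraphs can be modeled, purely for illustration, as a simple container. The field names and per-market pricing structure are hypothetical, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ItemData:
    """Illustrative container mirroring the item data described above:
    fabrication instructions plus item description, tags, and pricing
    metadata useable to fabricate the item and build its listing."""
    fabrication_instructions: str                    # machine-readable build steps
    item_description: str                            # title / detailed description
    item_tags: list[str] = field(default_factory=list)        # categorization tags
    item_pricing: dict[str, float] = field(default_factory=dict)  # market -> price

shirt = ItemData(
    fabrication_instructions="knit; cut; seam; attach buttons",
    item_description="Men's Casual Button Down Shirt",
    item_tags=["menswear", "shirt", "casual"],
    item_pricing={"EU": 34.99, "ASIA": 29.99},  # different prices per market
)
```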

Upon receiving the item data 202 from the item generation module 110, the transaction module 112 is configured to generate the item listing 108, where a visual appearance of the item listing 108 as published to the virtual marketplace 124 is defined by one or more of the item description 204, the item tags 206, or the item pricing data 208. For an example representation of an item listing 108 generated by the autonomous item generation system 104, consider FIG. 3.

FIG. 3 depicts an example interface 300 of the virtual marketplace 124 as displaying an item listing 302. In the illustrated example of FIG. 3, the item listing 302 represents an instance of the item listing 108 generated by the transaction module 112, where the item listing 302 is created for an article of clothing item 106 generated by the machine learning model 118. Specifically, the item listing 302 includes an item title 304 for a “Men's Casual Button Down Shirt,” and a digital rendering 306 of the item 106. The digital rendering 306 indicates how the item 106 would visually appear following fabrication by the fabrication device 122, according to the fabrication instructions 120. The item listing 302 further includes a price 308 for purchasing the item 106 depicted by the digital rendering 306 and a detailed description 310 that provides a viewing user with additional information describing the item 106.

The example item listing 302 further includes an item details portion 312 configured to display additional information not provided by the item title 304, the digital rendering 306 of the item, the price 308, or the detailed description 310. For instance, in the example context of the item 106 being an article of clothing, the item details portion 312 may specify dimensions of the article of clothing, textiles used to construct the article of clothing, and any other information included in the item data 202 output by the machine learning model 118.

The item listing 302 is further illustrated as including a shipping options portion 314, a user reviews portion 316, and a similar items portion 318. The shipping options portion 314 is representative of information displayed to a viewing user of the item listing 302 that informs the viewing user as to available choices for logistically transferring the subject item 106 of the item listing 302 from the fabrication device 122 that manufactures the item 106 to a location of the viewing user (e.g., the purchasing entity 126). The user reviews portion 316 is representative of explicit feedback information pertaining to the item 106 as received from one or more users of the virtual marketplace 124 that have previously purchased the item 106. The similar items portion 318 is configured to display representations 320, 322, and 324 of different item listings published to the virtual marketplace 124 that are identified as being similar to the item 106 for which the item listing 302 is generated (e.g., based on comparison of the item tags 206 for the item 106 to tags associated with the representations 320, 322, and 324).

In some implementations, the shipping options portion 314, the user reviews portion 316, and the similar items portion 318 of the item listing 302 are defined by the virtual marketplace 124 to which the item listing is published, rather than being defined by the transaction module 112. Alternatively, the transaction module 112 may be configured to control a visual appearance of one or more of the shipping options portion 314, the user reviews portion 316, or the similar items portion 318 by virtue of a communicative connection between the virtual marketplace 124 and the transaction module 112 (e.g., network 128), as represented by the double-headed arrow connecting the transaction module 112 and the virtual marketplace 124 in FIG. 2.

The item listing 302 further includes controls 326, 328, and 330 for interacting with the item listing 302 via the virtual marketplace 124, where interaction with the controls 326, 328, and 330 is indicative of positive feedback to the item listing 302. For instance, control 326 enables a viewing user to immediately purchase the subject item 106 of the item listing 302, control 328 enables the viewing user to add the item 106 to a shopping cart during browsing of the virtual marketplace, and control 330 enables the viewing user to favorite the item 106. In this manner, controls 326, 328, and 330 are representative of aspects of the item listing 302 from which interaction data may be gleaned to ascertain a positive or negative reaction to the item listing and used as feedback data for further refining control parameters of the machine learning model 118 used to generate the item listing 302 and its subject item 106.

Returning to FIG. 2, the transaction module 112 is illustrated as including a listing component 210, a finance component 212, and a logistics component 214. In implementations where the virtual marketplace 124 is implemented as part of the autonomous item generation system 104, the listing component 210, the finance component 212, and the logistics component 214 are representative of the ability of the transaction module 112 to enable functionality of the standalone virtual marketplace 124, as described above with reference to FIG. 1. Alternatively, in implementations where the virtual marketplace 124 is implemented independently from the autonomous item generation system 104, the listing component 210, the finance component 212, and the logistics component 214 are representative of functionality of the autonomous item generation system 104 to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention.

For instance, the listing component 210 is representative of functionality of the transaction module 112 to communicate and cause publication of the item listing 108 at the virtual marketplace 124. In accordance with one or more implementations, the listing component 210 is representative of one or more APIs configured to interface with the virtual marketplace 124 and list the item 106 according to one or more shopping engines or price-listing platforms supported by the virtual marketplace 124. The finance component 212 is representative of functionality of the transaction module 112 to interface with one or more financial institutions to transfer funds among the various entities involved in facilitating the fabrication of the item 106, publishing the item listing 108, purchasing the item 106, and facilitating shipment of the item 106 to a purchasing entity 126. In some implementations, the finance component 212 is configured to handle returns and process refunds in the event a purchasing entity 126 is dissatisfied with the item 106 and attempts to return the item 106 via the virtual marketplace 124.

In a similar manner, the logistics component 214 is representative of functionality of the transaction module 112 to identify one or more shipping options for logistically transporting the item 106 to a purchasing entity 126. For instance, the logistics component 214 is configured to identify geographic locations associated with a fabrication device 122 that manufactured the item 106 and the purchasing entity 126 to which the item 106 is to be transported. Given the geographic locations, the logistics component 214 is configured to interface with one or more shipping entities to obtain quotes for costs associated with transporting the item 106 to the purchasing entity 126. In some implementations, the logistics component 214 is configured to update the item listing 108 to convey such shipping cost quotes for a particular purchasing entity 126 viewing the item listing (e.g., by updating information included in the shipping options portion 314 of the example item listing 302 illustrated in FIG. 3).
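The quote comparison described above reduces to selecting a minimum-cost option under an optional delivery-time constraint, as in this sketch; the quote format is an assumption made for illustration:

```python
def select_shipping_option(quotes, max_days=None):
    """Pick the lowest-cost shipping quote, optionally constrained by speed.

    `quotes` is a hypothetical list of (carrier, cost, transit_days) tuples,
    standing in for quotes obtained from shipping entities as described
    above. Returns None if no quote satisfies the constraint.
    """
    eligible = [q for q in quotes if max_days is None or q[2] <= max_days]
    if not eligible:
        return None
    return min(eligible, key=lambda q: q[1])

quotes = [("CarrierA", 12.50, 5), ("CarrierB", 7.25, 8), ("CarrierC", 19.00, 2)]
choice = select_shipping_option(quotes, max_days=6)
# Only CarrierA (5 days) and CarrierC (2 days) qualify; CarrierA is cheaper.
```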

Upon receiving an indication from the virtual marketplace 124 of the purchasing entity 126 purchasing the item 106, the finance component 212 is configured to interface with a financial institution associated with the purchasing entity 126 to verify that the purchasing entity 126 has sufficient funds to purchase the item 106 and, if so, to contract with a shipping entity for transporting the item 106 to the purchasing entity 126. In some implementations, the logistics component 214 is configured to select a particular shipping entity and shipping method for transporting the item 106 to the purchasing entity 126 based on various considerations, such as a price willing to be paid for shipping by the purchasing entity 126, a shipping speed desired by the purchasing entity 126, a cost for the autonomous item generation system 104 to transport the item 106 to the purchasing entity 126, combinations thereof, and so forth. Thus, through inclusion of the listing component 210, the finance component 212, and the logistics component 214, the transaction module 112 is configured to automatically handle interactions with the virtual marketplace 124 that otherwise cannot be performed by conventional systems absent human user intervention in facilitating the publication of the item listing 108 as well as sale activities involved with facilitating a sale of the subject item 106 for the item listing 108.

The feedback module 114 is configured to receive listing feedback data 216 from the virtual marketplace 124, which is representative of analytics data provided by the virtual marketplace 124 describing one or more interactions with the item listing 108. For instance, using the example item listing 302 of FIG. 3, the listing feedback data 216 may specify different interactions with the item listing 302 such as a number of page views, or impressions, of the item listing 302, a number of different computing devices that viewed the item listing 302, a number of favorites of the item listing 302, a number of purchases of the subject item of the item listing 302, a number of shares of the item listing 302, and so forth. For each of these example interactions, the listing feedback data 216 may further provide information describing a user profile associated with the interaction, such as a location of the user during the interaction, a date and time associated with the interaction, demographic information for the user profile (e.g., age, gender, etc.), historical user behavior data for the user profile relative to the virtual marketplace 124, combinations thereof, and so forth.

The listing feedback data 216 may provide additional levels of detail regarding interactions with the item listing 108. For instance, the listing feedback data 216 may specify an amount of time spent viewing discrete portions of the item listing 302, such as a duration spent reading the detailed description 310, a number of user reviews displayed in navigating the user reviews portion 316, a purchase of an item listed in the similar items portion 318 instead of the subject item of the item listing 302, and so forth. In this manner, the listing feedback data 216 is representative of any type and format of data that indicates a manner in which the item listing 108 was experienced or interacted with by users of the virtual marketplace 124.

Given the listing feedback data 216, the feedback module 114 is configured to generate at least one training dataset 218 for use in refining the machine learning model 118. To do so, the feedback module combines the listing feedback data 216 with the item data 202 in a format that corresponds to the training dataset format used to originally train the machine learning model 118 (e.g., the format of the predicted feedback data included in the training dataset generated by the training module 116). By virtue of this initial training, the feedback module 114 does not need to annotate or otherwise label the training dataset 218 (e.g., as quantifying or otherwise classifying the listing feedback data 216 as indicating that the item listing 108 is associated with positive or negative feedback).

Instead, by being trained to identify aspects of information included in initial training dataset counterparts to the listing feedback data 216 represented in the training dataset 218, the machine learning model 118 is configured to infer relationships between different aspects of the item data 202 and the resulting interactions with the item listing 108 via the virtual marketplace. To do so, the training module 116 feeds the training dataset 218 as an input to the machine learning model 118, which causes the machine learning model 118 to modify one or more control parameters (e.g., internal model node weights) according to a loss function for the model that penalizes negative differences between predicted and observed feedback data. The machine learning model 118 with its one or more modified parameters is output by the training module 116 as the refined machine learning model 220, which is communicated to the item generation module 110 for use in place of the machine learning model 118 in subsequently generating item data 202 for a different item 106. As an example of an item 106 and item listing 108 subsequently output by the refined machine learning model 220, consider FIG. 4.
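The parameter refinement described above can be illustrated with a toy single-parameter update driven by the same asymmetric penalty. The numeric-gradient step and toy predictor below are assumptions for illustration, not the disclosure's method:

```python
def refine_parameter(theta, predict, observed, lr=0.1, eps=1e-4):
    """One numeric-gradient refinement step on a single control parameter.

    The loss penalizes only observed feedback falling short of the
    prediction, matching the asymmetric loss described above; the parameter
    is nudged in the direction that reduces that shortfall penalty.
    """
    def loss(t):
        return max(0.0, predict(t) - observed)
    grad = (loss(theta + eps) - loss(theta - eps)) / (2 * eps)
    return theta - lr * grad

# Toy predictor: predicted review score rises with theta; observed is 4.0.
predict = lambda t: 3.0 + t
theta = refine_parameter(1.5, predict, observed=4.0)
# Prediction 4.5 exceeds the observed 4.0, so theta is nudged downward.
```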

FIG. 4 depicts an example interface 400 of the virtual marketplace 124 as displaying an item listing 402, representative of an instance of an item listing 108 generated from item data 202 output by the refined machine learning model 220. Specifically, item listing 402 represents example changes between item data 202 output by the machine learning model 118 and item data 202 output by the refined machine learning model 220. For instance, item listing 402 includes an item title 404 for a "Men's Double Pocket Tailored Shirt," a digital rendering 406 of the subject item of the item listing 402, a price 408 for the subject item, and a detailed description 410 for the subject item, which each differ from their counterpart aspects of the item listing 302. Such changes may be indicative of the machine learning model 118 interpreting the listing feedback data 216 as indicating certain trends gleaned from interactions with the virtual marketplace 124, such as that double pocketed men's collared shirts are currently more popular than collared shirts without pockets, that articles of clothing including tags noting that the clothing is "tailored" are associated with positive feedback, that item listings featuring multiple digital renderings of the subject item are associated with increased impression and purchase rates, that the revised layout of item listing 402 is preferred over the layout of listing 302, and so forth.

To configure the machine learning model 118 to recognize and adapt to such changing trends, the training module 116 is configured to identify control parameters in the latent space(s) of the machine learning model 118 that correlate with different design aspects (e.g., sleeve length, pocket styles, listing tags, item fabric(s), and so forth). In this manner, by informing the machine learning model 118 of information included in the listing feedback data 216 via the training datasets 218, the autonomous item generation system 104 is configured to adapt to changing trends and alter fabrication instructions 120 and item listing 108 characteristics for items subsequently generated by the refined machine learning model 220 automatically and without relying on guiding user intervention.

Thus, the autonomous item generation system 104 is configured to continuously monitor activities associated with item listings 108 published to the virtual marketplace 124 and refine control parameters of the machine learning model 118 used to generate the item listing 108 to adapt to inferred trends and behaviors. Because the autonomous item generation system 104 is configured to perform its continuous cycle of operations independent of user input and identify trends and behaviors to consider in refining machine learning model parameters before such trends or behaviors can be identified by a user of the autonomous item generation system 104, the autonomous item generation system 104 is configured to output a user interface that enables a user to glean insight into the system's operations.

FIG. 5 depicts an example interface 500 of the autonomous item generation system 104 configured for output on a display device of the computing device implementing the autonomous item generation system 104, such as a display device of computing device 102. The interface 500 includes a model selection control 502 and an audience specification control 504. The audience specification control 504 enables a user of the autonomous item generation system 104 to change input parameters considered by the model selected via model selection control 502, such that the user can observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118. For instance, responsive to receiving selection of one or more models via the model selection control 502 and a selection of one or more options for defining an audience via the audience specification control 504, the autonomous item generation system updates interface 500 to output item preview 506, which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504. Although the illustrated example depicts a preview digital rendering 508 as being output in the item preview 506 portion of the interface 500, the item preview 506 is configured to include a display of any information included in item data 202, such as visual representations of the item data 202, textual descriptions of the item data 202, and combinations thereof.

In the illustrated example, model selection control 502 includes options 510, 512, and 514, where option 510 enables selection of an instance of machine learning model 118 trained to generate item data 202 for men's clothing items, option 512 enables selection of an instance of machine learning model 118 trained to generate item data 202 for works of art, and option 514 enables a user of the autonomous item generation system 104 to upload their own model (e.g., an instance of the machine learning model 118 trained to generate item data 202 for an item 106 not categorized as men's clothing or works of art).

The audience specification control 504 in the illustrated example of FIG. 5 includes options 516, 518, and 520, where option 516 enables specification of a “general public” audience segment (e.g., no constraints on the audience to be considered by the machine learning model 118), option 518 enables designation of custom demographic parameters to be considered by the machine learning model 118 (e.g., a specified geographic region for an audience of an item listing 108, a specified audience age and gender combination, a specified time of day for publishing the item listing 108, and so forth), and option 520 enables designation of a particular individual user to be considered as the audience for the machine learning model's 118 generation of the item data 202.

By interacting with the controls 502 and 504 of interface 500, a user of the autonomous item generation system 104 is informed of considerations made by the autonomous item generation system 104 in performing its automatic operations. For instance, interface 500 indicates to a user of the autonomous item generation system 104 that an instance of the machine learning model 118 trained to generate art items, when considering the general public as an audience, will generate an item 106 that depicts a waterfront dock scene at sunset with certain nature aspects to achieve a realistic, photo-quality appearance, based on current parameters of the machine learning model 118.

In some implementations, the item preview 506 portion of the interface 500 may further include information that describes control parameters of the machine learning model 118 selected for the specified audience. For instance, the item preview portion 506 may specify that the same machine learning model 118 configured to generate landscape works of art, when targeting a Swiss audience, selects control parameters for the machine learning model 118 that promote inclusion of mountains in the landscape artwork. Conversely, by interacting with the audience specification control 504 to change the geographic demographic audience from Switzerland to Hawaii, the item preview portion 506 may specify that control parameters emphasizing inclusion of beaches and oceans in the landscape artwork are to be utilized. In this manner, the interface 500 provides a user of the autonomous item generation system 104 with insight as to what considerations are made when selecting control parameters for different machine learning models 118, audience considerations, and combinations thereof.
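By way of example, and not limitation, the audience-conditioned selection of control parameters described above may be sketched as a simple lookup, where the mapping, keys, and parameter names are hypothetical:

```python
# Hypothetical sketch of audience-conditioned control parameter selection:
# the same landscape-art model emphasizes different scenery depending on
# the audience specified via the audience specification control.
AUDIENCE_CONTROLS = {
    "Switzerland": {"scenery_emphasis": "mountains"},
    "Hawaii": {"scenery_emphasis": "beaches_and_oceans"},
}

def select_controls(audience):
    """Return control parameters for the specified audience, falling back
    to an unconstrained default when no audience-specific entry exists."""
    return AUDIENCE_CONTROLS.get(audience, {"scenery_emphasis": "none"})
```

Changing the audience via the audience specification control 504 thus corresponds to selecting a different entry, which the item preview portion 506 can surface to the user.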

FIG. 6 depicts an example interface 600 of the autonomous item generation system 104, where the selected option of the audience specification control 504 has been altered from option 516 to option 518, indicating that specific audience demographic characteristics (not shown) are to be considered instead of the general public considered in the example interface 500. In the illustrated example, content of the item preview 506 is altered to indicate how a resulting item 106 generated by the same machine learning model 118 would differ based on the specified audience demographic characteristics. Specifically, interface 600 indicates to the user of the autonomous item generation system 104 that the same instance of the machine learning model 118 trained to generate art items as selected in FIG. 5, when considering the updated audience demographic characteristics, would instead generate an item 106 that depicts a surreal mountain landscape scene. In this manner, an interface of the autonomous item generation system 104 provides a user with tools to obtain insight regarding the ongoing revision of a particular machine learning model 118 implemented by the autonomous item generation system 104 in a manner that would not be possible by inspecting raw input and output data from the machine learning model 118.

Having considered example details of automatically generating data useable to fabricate an item 106 and generate a listing for the item to be published at a virtual marketplace, consider now example procedures to illustrate aspects of the techniques described herein.

Example Procedures

The following discussion describes techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference may be made to FIGS. 1-6.

FIG. 7 depicts a procedure 700 in an example implementation of autonomous item and item listing generation in accordance with aspects of the techniques described herein. Notably, each and every operation of procedure 700 is performed automatically and independent of user input or intervention. Using at least one machine learning model, fabrication instructions for an item and metadata describing the item are generated (block 702). The item generation module 110 of the autonomous item generation system 104, for instance, causes machine learning model 118 to generate item data 202, which includes fabrication instructions 120 that are useable by the fabrication device 122 to fabricate a tangible item 106. In addition to the fabrication instructions, the item data 202 includes metadata describing the item 106, such as item description 204, item tags 206, and item pricing data 208.
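By way of illustration only, the item data 202 generated at block 702 may be thought of as fabrication instructions bundled with listing metadata. The following sketch is purely hypothetical as to structure, field names, and instruction format:

```python
# Hypothetical sketch of the shape of item data generated at block 702:
# machine-readable fabrication instructions bundled with metadata usable
# to build an item listing (description, tags, pricing).
item_data = {
    "fabrication_instructions": {
        "format": "gcode",  # assumed instruction format for illustration
        "steps": ["step placeholder"],
    },
    "metadata": {
        "description": "Men's double pocket tailored shirt",
        "tags": ["men", "shirt", "tailored"],
        "price": 39.99,
    },
}
```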

Fabrication of the item is caused by transmitting the fabrication instructions to the fabrication device (block 704). The item generation module 110, for instance, transmits the fabrication instructions 120 to the fabrication device 122 in a manner that causes the fabrication device 122 to fabricate, manufacture, or otherwise output the item 106. A listing for the item is then created using the metadata describing the item (block 706). The transaction module 112 of the autonomous item generation system 104 obtains the item data 202 from the item generation module 110 and generates item listing 108, such as the example item listings depicted in FIGS. 3 and 4.

The item listing is published to a virtual marketplace and analytics data describing one or more interactions with the item listing is obtained (block 708). The transaction module 112, for instance, employs listing component 210 to interface with the virtual marketplace 124 and publish the item listing 108 in a manner that makes the item listing discoverable on the virtual marketplace 124 (e.g., to a browsing user of the virtual marketplace 124, to a search engine indexing the virtual marketplace 124, and so forth). The feedback module 114 of the autonomous item generation system 104 obtains listing feedback data 216, which is representative of information describing one or more interactions with the item listing 108 as published to the virtual marketplace 124. Example interactions include a number of views (e.g., a number of impressions) of the item listing 108, a number of purchases of the item via the item listing 108, a number of favorites of the item listing 108, a number of shares of the item listing 108, user reviews submitted for the item listing 108, combinations thereof, and so forth.

The listing feedback data 216 may provide additional levels of detail regarding interactions with the item listing 108. For instance, the listing feedback data 216 may specify an amount of time spent viewing discrete portions of the item listing 302, such as a duration spent reading the detailed description 310, a number of user reviews displayed in navigating the user reviews portion 316, a purchase of an item listed in the similar items portion 318 instead of the subject item of the item listing 302, and so forth. In this manner, the listing feedback data 216 is representative of any type and format of data that indicates a manner in which the item listing 108 was experienced or interacted with by users of the virtual marketplace 124.
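By way of example only, the aggregation of raw interaction events into the kinds of metrics described above may be sketched as follows, where the event format and metric names are hypothetical:

```python
from collections import Counter

# Hypothetical sketch of aggregating raw interaction events for a
# published listing into feedback metrics of the kind described above
# (impressions, purchases, favorites, and a derived conversion rate).
events = [
    {"listing": 108, "type": "view"},
    {"listing": 108, "type": "view"},
    {"listing": 108, "type": "favorite"},
    {"listing": 108, "type": "purchase"},
]

def summarize_feedback(events):
    counts = Counter(e["type"] for e in events)
    views = counts.get("view", 0)
    return {
        "impressions": views,
        "purchases": counts.get("purchase", 0),
        "favorites": counts.get("favorite", 0),
        # Purchases per impression, guarding against listings with no views.
        "conversion": counts.get("purchase", 0) / views if views else 0.0,
    }

summary = summarize_feedback(events)
```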

Training data is then formed based on the analytics data and one or more parameters of the at least one machine learning model are modified using the training data (block 710). The feedback module 114, for instance, combines the listing feedback data 216 together with the item data 202 generated by the machine learning model 118 as training dataset 218. The format of training dataset 218 output by the feedback module 114 varies according to the machine learning model implemented by the item generation module 110 and depends on a format of training datasets used to originally train the machine learning model 118. The training dataset is then passed to the training module 116, which provides the training dataset 218 as input to the machine learning model 118. Upon input of the training dataset 218, the machine learning model 118 is configured to process the training dataset 218 according to one or more objective functions upon which the machine learning model 118 was initialized, together with one or more loss functions that penalize negative differences between predicted and observed feedback data, thereby causing the machine learning model 118 to refine one or more internal parameters via processing of the training dataset 218. The machine learning model 118 with its one or more modified parameters is then output as refined machine learning model 220.
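By way of illustration only, a loss function that penalizes negative differences between predicted and observed feedback (i.e., observed feedback falling short of the prediction) may be sketched as follows, where the function name and normalized metric values are hypothetical:

```python
# Hypothetical sketch of a loss that penalizes only negative differences
# between predicted and observed feedback: a listing that meets or exceeds
# the predicted feedback incurs no loss, while underperformance is
# penalized quadratically.
def feedback_loss(predicted, observed):
    return sum(max(p - o, 0.0) ** 2 for p, o in zip(predicted, observed)) / len(predicted)

# Predicted vs. observed engagement metrics (e.g., impressions, purchases),
# normalized to a common scale for illustration.
predicted = [0.8, 0.6, 0.9]
observed = [0.5, 0.7, 0.9]
loss = feedback_loss(predicted, observed)
```

Minimizing such a loss drives the refined machine learning model 220 toward generating items whose observed feedback meets or exceeds the model's predictions.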

Using the at least one machine learning model with one or more modified parameters, fabrication instructions for an additional item and metadata describing the additional item are generated (block 712). The autonomous item generation system 104, for instance, performs the operations as described above with respect to block 702, using the refined machine learning model 220 instead of the machine learning model 118. Operation of procedure 700 then optionally returns to block 704, continuing to refine model parameters based on analytics data describing interactions with item listings 108 generated by the autonomous item generation system 104.
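The continuous cycle of procedure 700 may be sketched, by way of example only, with each block reduced to a hypothetical stub so the control flow is visible:

```python
# Hypothetical sketch of the continuous cycle of procedure 700, with each
# block reduced to a stub. All functions and the "quality" parameter are
# illustrative only.
def generate(params):
    # Block 702/712: produce an item from current model parameters.
    return {"quality": params["quality"]}

def observe_feedback(item):
    # Blocks 704-708: fabricate, publish, and collect analytics (stubbed
    # here as feedback proportional to item quality).
    return item["quality"] * 0.9

def refine(params, predicted, observed):
    # Block 710: nudge parameters when observed feedback falls short of
    # the prediction.
    if observed < predicted:
        params = {"quality": params["quality"] + 0.1}
    return params

params = {"quality": 0.5}
for _ in range(3):
    item = generate(params)
    observed = observe_feedback(item)
    params = refine(params, item["quality"], observed)
```

Each pass through the loop corresponds to one generate-fabricate-list-observe-refine cycle performed automatically by the autonomous item generation system 104.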

FIG. 8 depicts a procedure 800 in an example implementation of outputting a user interface for an autonomous item generation system in accordance with aspects of the techniques described herein. A display of a user interface for an autonomous item generation system that includes controls for specifying a machine learning model to be used in generating an item and an audience for the machine learning model to consider in generating the item is output (block 802). The autonomous item generation system 104, for instance, outputs interface 500 at a display of computing device 102. The interface 500 includes model selection control 502 and audience specification control 504. The model selection control 502 enables selection of a particular machine learning model 118 to be implemented by the autonomous item generation system 104 and the audience specification control 504 enables a user to change input parameters considered by the model selected via model selection control 502 and observe how the changed input parameters alter a resulting item 106 generated by the machine learning model 118.

Input is received at the user interface specifying at least one of the machine learning model to be used, or the audience to be considered, in generating the item (block 804). A selection of one or more of options 510, 512, or 514 offered by the model selection control 502 and/or one or more options 516, 518, or 520 of the audience specification control 504 is received. The user interface is then updated to display a preview of the item as generated by the selected machine learning model for the specified audience (block 806). For instance, responsive to receiving selection of one or more models via the model selection control 502 and a selection of one or more options for defining an audience via the audience specification control 504, the autonomous item generation system updates interface 500 to output item preview 506, which includes a preview digital rendering 508 of an item 106 that would be generated by machine learning model 118 according to input parameters specified by the selection(s) made with respect to controls 502 and 504. In some implementations, machine learning model 118 control parameters are alternatively or additionally output in the item preview 506 portion of the interface 500.

Operation of procedure 800 then optionally returns to block 804, enabling selection of a different combination of the one or more of options 510, 512, or 514 offered by the model selection control 502 and/or one or more options 516, 518, or 520 of the audience specification control 504. For example, interface 600 depicts an update to the item preview 506 from that depicted in the illustrated example of FIG. 5, responsive to a different option selected from the audience specification control 504. In this manner, the user interface output by procedure 800 enables a user of the autonomous item generation system 104 to glean insight into operations of the autonomous item generation system 104 that the user would otherwise be unable to ascertain from inspection of raw data inputs to, and outputs from, the machine learning model 118.

Having described example procedures in accordance with one or more implementations, consider now an example system and device that can be utilized to implement the various techniques described herein.

System and Device

FIG. 9 illustrates an example system generally at 900 that includes an example computing device 902 that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. This is illustrated through inclusion of the autonomous item generation system 104. The computing device 902 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.

The example computing device 902 includes a processing system 904, one or more computer-readable media 906, and one or more I/O interfaces 908 that are communicatively coupled, one to another. Although not shown, the computing device 902 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines.

The processing system 904 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 904 is illustrated as including hardware elements 910 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 910 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.

The computer-readable storage media 906 is illustrated as including memory/storage 912. The memory/storage 912 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 912 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 912 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 906 may be configured in a variety of other ways as further described below.

Input/output interface(s) 908 are representative of functionality to allow a user to enter commands and information to the example computing device 902, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth. Thus, the computing device 902 may be configured in a variety of ways as further described below to support user interaction.

Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.

An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the example computing device 902. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”

“Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information, in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.

“Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 902, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.

As previously described, hardware elements 910 and computer-readable media 906 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.

Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 910. The example computing device 902 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the example computing device 902 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 910 of the processing system 904. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more example computing devices 902 and/or processing systems 904) to implement techniques, modules, and examples described herein.

The techniques described herein may be supported by various configurations of the computing device 902 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 914 via a platform 916 as described below.

The cloud 914 includes and/or is representative of a platform 916 for resources 918. The platform 916 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 914. The resources 918 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the example computing device 902. Resources 918 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.

The platform 916 may abstract resources and functions to connect the example computing device 902 with other computing devices. The platform 916 may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources 918 that are implemented via the platform 916. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 900. For example, the functionality may be implemented in part on the example computing device 902 as well as via the platform 916 that abstracts the functionality of the cloud 914.

Conclusion

Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims

1. A method for autonomous item generation implemented by at least one computing device, the method comprising:

displaying a user interface that includes a control for specifying a machine learning model to be used in autonomously generating an item and a control for specifying an audience to be considered in generating the item;
receiving input at the user interface that specifies at least one of the machine learning model to be used or the audience to be considered in generating the item;
updating the user interface to display a preview representation of the item responsive to receiving the input;
receiving additional input that modifies at least one of the machine learning model to be used or the audience to be considered in generating the item; and
modifying the preview representation of the item responsive to receiving the additional input.

2. The method of claim 1, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a three-dimensional object, the item comprises the three-dimensional object, and the preview representation of the item comprises a digital rendering of the three-dimensional object.

3. The method of claim 1, wherein the machine learning model comprises a generative adversarial network trained to generate fabrication instructions for fabricating an article of clothing, the item comprises the article of clothing, and the preview representation of the item comprises a digital rendering of the article of clothing.

4. The method of claim 1, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a two-dimensional piece of art, the item comprises the two-dimensional piece of art, and the preview representation of the item comprises a digital rendering of the piece of art.

5. The method of claim 1, wherein modifying the preview representation of the item comprises changing descriptive information for the item without changing a visual appearance of the preview representation of the item.

6. The method of claim 1, wherein modifying the preview representation of the item comprises changing a visual appearance of the preview representation of the item without changing descriptive information for the item.

7. The method of claim 1, wherein receiving input at the user interface that specifies the audience to be considered in generating the item comprises receiving information describing one or more demographic characteristics of the audience to be considered in generating the item.

8. A system comprising:

one or more processors; and
a computer-readable storage medium storing instructions that are executable by the one or more processors to perform operations comprising: displaying a user interface that includes a control for specifying a machine learning model to be used in autonomously generating an item and a control for specifying an audience to be considered in generating the item; receiving input at the user interface that specifies at least one of the machine learning model to be used or the audience to be considered in generating the item; updating the user interface to display a preview representation of the item responsive to receiving the input; receiving additional input that modifies at least one of the machine learning model to be used or the audience to be considered in generating the item; and modifying the preview representation of the item responsive to receiving the additional input.

9. The system of claim 8, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a three-dimensional object, the item comprises the three-dimensional object, and the preview representation of the item comprises a digital rendering of the three-dimensional object.

10. The system of claim 8, wherein the machine learning model comprises a generative adversarial network trained to generate fabrication instructions for fabricating an article of clothing, the item comprises the article of clothing, and the preview representation of the item comprises a digital rendering of the article of clothing.

11. The system of claim 8, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a two-dimensional piece of art, the item comprises the two-dimensional piece of art, and the preview representation of the item comprises a digital rendering of the piece of art.

12. The system of claim 8, wherein modifying the preview representation of the item comprises changing descriptive information for the item without changing a visual appearance of the preview representation of the item.

13. The system of claim 8, wherein modifying the preview representation of the item comprises changing a visual appearance of the preview representation of the item without changing descriptive information for the item.

14. The system of claim 8, wherein receiving input at the user interface that specifies the audience to be considered in generating the item comprises receiving information describing one or more demographic characteristics of the audience to be considered in generating the item.

15. A computer-readable storage medium storing instructions that are executable by a processing device to perform operations comprising:

displaying a user interface that includes a control for specifying a machine learning model to be used in autonomously generating an item and a control for specifying an audience to be considered in generating the item;
receiving input at the user interface that specifies at least one of the machine learning model to be used or the audience to be considered in generating the item;
updating the user interface to display a preview representation of the item responsive to receiving the input;
receiving additional input that modifies at least one of the machine learning model to be used or the audience to be considered in generating the item; and
modifying the preview representation of the item responsive to receiving the additional input.

16. The computer-readable storage medium of claim 15, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a three-dimensional object, the item comprises the three-dimensional object, and the preview representation of the item comprises a digital rendering of the three-dimensional object.

17. The computer-readable storage medium of claim 15, wherein the machine learning model comprises a generative adversarial network trained to generate fabrication instructions for fabricating an article of clothing, the item comprises the article of clothing, and the preview representation of the item comprises a digital rendering of the article of clothing.

18. The computer-readable storage medium of claim 15, wherein the machine learning model comprises a generative adversarial network trained to generate printing instructions for a two-dimensional piece of art, the item comprises the two-dimensional piece of art, and the preview representation of the item comprises a digital rendering of the piece of art.

19. The computer-readable storage medium of claim 15, wherein modifying the preview representation of the item comprises changing descriptive information for the item without changing a visual appearance of the preview representation of the item.

20. The computer-readable storage medium of claim 15, wherein receiving input at the user interface that specifies the audience to be considered in generating the item comprises receiving information describing one or more demographic characteristics of the audience to be considered in generating the item.
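The interaction recited in claim 15 (display controls, receive input, render a preview, then modify the preview on further input) can be illustrated with a minimal sketch. All names here are hypothetical and chosen for illustration only; the claims do not prescribe any particular implementation:

```python
# Hypothetical sketch of the preview-update loop recited in claim 15.
from dataclasses import dataclass


@dataclass
class PreviewState:
    """The two user-specifiable inputs: model choice and target audience."""
    model_name: str
    audience: dict


def render_preview(state: PreviewState) -> str:
    # Stand-in for producing a digital rendering of the item
    # (e.g., a 3-D object, garment, or 2-D artwork per claims 16-18).
    return f"preview[{state.model_name} | audience={sorted(state.audience.items())}]"


def handle_input(state: PreviewState, *, model_name=None, audience=None):
    # Update whichever control the input specifies (model, audience, or both),
    # then regenerate the preview representation responsive to the input.
    new_state = PreviewState(
        model_name=model_name or state.model_name,
        audience=audience or state.audience,
    )
    return new_state, render_preview(new_state)


# Initial input specifies a model and an audience; the UI shows a preview.
state = PreviewState("gan_3d", {"age": "18-34"})
state, preview = handle_input(state)

# Additional input modifies the audience; the preview is regenerated.
state, preview = handle_input(state, audience={"age": "35-54"})
```

This separates state from rendering so that either control (the model selection of claim 15 or the audience demographics of claim 20) can change independently and trigger a preview update.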

Patent History
Publication number: 20230098794
Type: Application
Filed: Dec 5, 2022
Publication Date: Mar 30, 2023
Applicant: eBay Inc. (San Jose, CA)
Inventors: Maxim Manco (West New York, NJ), Fang Fang (Austin, TX), Natraj Srinivasan (New York, NY), Alexander Akerman (Weehawken, NJ), Michael Ebin (HaSharon), Eran Ben Tovim (New York, NY)
Application Number: 18/061,740
Classifications
International Classification: G06Q 30/06 (20120101); G06F 9/451 (20180101); B33Y 50/02 (20150101); G05B 13/02 (20060101); G06Q 30/02 (20120101); G06F 3/12 (20060101); G06N 3/08 (20060101); G06F 21/62 (20130101); G06Q 50/04 (20120101); G06N 20/00 (20190101); G06Q 10/08 (20120101); G06N 3/04 (20060101); G06F 3/0482 (20130101); G06F 3/0484 (20220101);