Electronic device control based on user gestures applied to a media headset

- Apple

An electronic device including an interface for receiving a control signal from a peripheral control device via a wired communications channel where the control signal is derived from a user control gesture. The electronic device also includes a data store for storing a list of known control signals where each known control signal has an associated control instruction. The electronic device further includes a processor that identifies the received control signal by comparing the received control signal with the list of known control signals and controls an operation of the electronic device based on the control instruction associated with the identified control signal.

Description
REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of and priority to U.S. Provisional Application No. 61/020,988, filed on Jan. 14, 2008, entitled “Simultaneous Communication of Audio and Control Data Through the Microphone Node of a Multiple-Region Jack for a Mobile Device.” This application is related to U.S. patent application Ser. No. ______, filed on Sep. 3, 2008, having Attorney Docket No. P6133US1, and entitled “Electronic Device Accessory.” The entire contents of the above-referenced applications are incorporated herein by reference.

BACKGROUND OF THE INVENTION

This invention relates to controlling an electronic device based on user gestures applied to a media headset connected to the electronic device via an audio jack.

Traditional mobile telephones and media devices include a communications jack for accessories. One such traditional accessory is a media headset that includes two speakers and a single microphone. A user can conduct a telephone call using such an accessory. More particularly, a user can receive audio data associated with a telephone call through the two speakers and send audio data associated with the telephone call through the microphone.

One problem with existing electronic devices or media devices is that a user can become distracted from their surroundings or must interrupt an activity in order to interact with the device. Even though electronic devices and media devices have become more compact and portable, it can be inconvenient for a user to hold, retrieve, or manipulate an electronic device while performing other activities such as walking or running, which also makes it difficult to control the device during those activities. Accordingly, there is a need for providing an electronic device user with a convenient and unobtrusive mechanism to control an operation of an electronic device.

Another problem with existing electronic devices is that peripheral control of the electronic device typically requires a control-specific interface or connector to facilitate control from a peripheral device or accessory. Thus, there is a need to enable peripheral control of an electronic device while eliminating the need for a dedicated control interface or components.

SUMMARY OF THE INVENTION

The invention, in various embodiments, addresses deficiencies in the prior art by providing systems, methods, and devices that enable a user to control an electronic device via a peripheral control unit of a media headset using selected control gestures. These gestures can be input conveniently by the user via control interfaces of the peripheral control unit and delivered to the electronic device via an audio jack, thereby eliminating the need for an additional control interface or connector.

In one aspect, an electronic device includes an interface for receiving a control signal from a peripheral control device via a wired communications channel. The wired communications channel may be an audio communications channel that connects with the electronic device via an audio jack. The control signal may be derived from a user control gesture. The electronic device may also include a data store for storing a list of known control signals where each known control signal has an associated control instruction. The electronic device may use a processor to identify the received control signal by comparing the received control signal with the list of known control signals. The processor may control an operation of the electronic device based on the control instruction associated with the identified control signal.
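
By way of a non-limiting illustration, the following Python sketch (not part of the original disclosure; the signal identifiers are hypothetical, while the instruction names echo the control instructions listed below) shows the lookup just described: a received control signal is compared against a stored table of known control signals, and the associated control instruction is used to control an operation of the device.

```python
# Minimal sketch of the control-signal lookup described above.
# The signal identifiers and instruction names are hypothetical.

KNOWN_CONTROL_SIGNALS = {
    "single_click_center": "media_play_pause",
    "double_click_center": "skip_to_next_song",
    "hold_volume_up": "volume_ramp_increase",
    "hold_volume_down": "volume_ramp_decrease",
}

def execute(instruction):
    """Stand-in for controlling an operation of the electronic device."""
    print(f"executing control instruction: {instruction}")

def handle_control_signal(received_signal):
    """Identify a received control signal and apply its control instruction."""
    instruction = KNOWN_CONTROL_SIGNALS.get(received_signal)
    if instruction is not None:
        execute(instruction)
    return instruction

handle_control_signal("double_click_center")   # -> skip_to_next_song
```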

In one configuration, the data store includes a database and/or electronic list. In one feature, the interface sends media information to the peripheral control device. Media information may include music, a song, video, multimedia, and the like. The interface may send media information concurrently with receiving the control signal. The interface may receive the control signal in a first frequency range and send media information in a second frequency range.

The processor may be configured to operate the electronic device using a plurality of applications. The data store may include a plurality of lists of known control signals and associated control instructions. Each list of known control signals and associated control instructions may be associated with one of the plurality of applications. An application may perform, without limitation, media playback, radio playback, voice memo recording, voice memo playback, voice feedback, and user exercise support. A control instruction may include, without limitation, media play, media pause, volume increase, volume decrease, volume ramp increase, volume ramp decrease, tag media, memo play, memo pause, skip to next song, radio playback, radio mute, radio skip to next preset, radio wrap around, go to next chapter, play select song, activate voice feedback, activate a feature, mute, un-mute, and/or go to next tag.
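
As a hypothetical sketch of the per-application lists described above (the application names and signal identifiers are illustrative assumptions), each application may carry its own mapping from control signal to control instruction, so that the same signal drives a different operation depending on the active application:

```python
# Hypothetical per-application lists of known control signals.
CONTROL_MAPS = {
    "media_playback": {
        "single_click_center": "media_play_pause",
        "double_click_center": "skip_to_next_song",
    },
    "radio_playback": {
        "single_click_center": "radio_mute",
        "double_click_center": "radio_skip_to_next_preset",
    },
}

def instruction_for(active_application, control_signal):
    """Look up the control instruction for a signal under the active application."""
    return CONTROL_MAPS.get(active_application, {}).get(control_signal)

print(instruction_for("media_playback", "double_click_center"))   # skip_to_next_song
print(instruction_for("radio_playback", "double_click_center"))   # radio_skip_to_next_preset
```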

In one configuration, a user control gesture is derived from a sequence of user interactions with one or more control interfaces. A control interface may include, without limitation, a button, click wheel, touch screen, a section of a touch screen, and/or a switch. The electronic device may include, without limitation, a cellular telephone, media player, audio player, music player, video player, multimedia player, and/or personal computer.

In another aspect, an electronic device includes a data store for storing a first list of known control signals related to a first function of the electronic device where each known control signal of the first list has an associated control instruction based on the first function of the electronic device. The electronic device also includes an interface that receives a first control signal via an audio jack. The first control signal may be derived from a user control gesture applied to a peripheral control unit in communication with the audio jack. The electronic device also includes a processor that identifies the received first control signal by comparing the received first control signal with the first list of known control signals related to the first function. The processor then controls an operation of the electronic device based on the control instruction associated with the identified first control signal.

The function may include an application running on the electronic device or a subroutine of the application running on the electronic device. For example, an editor feature of a word processor application may be considered a subroutine or sub-feature of the word processor application. Thus, the same control gesture and/or control signal may perform a different function while the editor subroutine is running as opposed to when a print preview subroutine is running.

The data store may store a second list of known control signals related to a second function of the electronic device where each known control signal of the second list has an associated control instruction based on the second function of the electronic device. In one configuration, when the interface receives a second control signal via the audio jack, the processor identifies the received second control signal by comparing the received second control signal with the second list of known control signals related to the second function. The processor may then control an operation of the electronic device based on the control instruction associated with the identified second control signal.

BRIEF DESCRIPTION OF THE DRAWINGS

The above and other objects and advantages of the present invention will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:

FIG. 1 is a communications topology according to an illustrative embodiment of the invention;

FIG. 2 is a communication topology between an electronic device and an audio communications device including a MEMS microphone module according to an illustrative embodiment of the invention;

FIG. 3A is a communication topology between an electronic device and an audio communications device including a condenser-based microphone according to an illustrative embodiment of the invention;

FIG. 3B is a communication topology between an electronic device and an audio communications device including an alternate switch configuration according to an illustrative embodiment of the invention;

FIG. 4 includes an exemplary flow chart of the process between an electronic device and an audio communications device according to an illustrative embodiment of the invention;

FIG. 5 is a perspective view of a media device according to an illustrative embodiment of the invention;

FIG. 6 shows a simplified functional block diagram of an electronic device according to an illustrative embodiment of the invention;

FIG. 7 includes a database and/or list associating user headset control unit gestures with electronic device control instructions according to an illustrative embodiment of the invention; and

FIG. 8 is a flow diagram of a process for controlling an electronic device via a peripheral control unit according to an illustrative embodiment of the invention.

DESCRIPTION OF THE INVENTION

FIG. 1 shows communications topology 100 that may include computer 101, media device 103, and audio communications device 190. Electronic device 103 may communicate with computer 101 via communications channel 102. Electronic device 103 may communicate with audio communications device 190 via communications channel 105. In one embodiment, communications channel 105 is a wired communication channel. Alternatively, a communications channel may be wireless.

Electronic device 103 may take any form. For example, electronic device 103 may be a media device or a portable media player such as a portable music player. Electronic device 103 may include any type of consumer electronic device such as, without limitation, a computer, stereo, receiver, mobile telephone, personal digital assistant (PDA), electronic game, camera, video equipment, audio equipment, mp3 player, video player, set top box, and the like. Electronic device 103 may also include, for example, a mobile telephone that may play downloaded media. Media may be downloaded directly to the electronic device 103 or may be downloaded to computer 101 and transferred to the media device 103 via communications channel 102.

The electronic device 103 may include a wireless communications device such as a cellular telephone, satellite telephone, cordless telephone, personal digital assistant (PDA), pager, portable computer, or any other device capable of wireless communications. In fact, FIG. 5 shows an exemplary cellular telephone version of a broad category of electronic device 103. The electronic device 103 may be compact, portable, mobile, personal, and/or transportable.

The electronic device 103 may also be integrated within the packaging of other devices or structures such as a vehicle, video game system, appliance, clothing, helmet, glasses, wearable apparel, stereo system, computer system, entertainment system, or other portable devices. In certain embodiments, the electronic device 103 may be docked or connected to a wireless (e.g., a Wi-Fi docking system) and/or radio enabling accessory system (e.g., AM/FM or satellite radio receiver) that provides the electronic device 103 with short-range communicating functionality and/or radio reception capability. Alternative types of electronic devices 103 may include, for example, a media player such as an iPod®, iPod® Nano, iPod® Shuffle, or Apple® iPhone available from Apple Inc. of Cupertino, Calif., pocket-sized personal computers such as an iPAQ® Pocket PC available from Hewlett-Packard Inc. of Palo Alto, Calif., and any other device capable of communicating wirelessly (with or without the aid of a wireless enabling accessory system).

In certain embodiments, the electronic device 103 may synchronize with, for example, a remote computing system or server, e.g., computer 101, to receive media (using either wireless or wireline communications paths). Wireless syncing enables the electronic device 103 to transmit and receive media and data without requiring a wired connection. Media may include, without limitation, sound or audio files, music, video, multi-media, and digital data, in streaming and/or discrete (e.g., files and packets) formats.

During synchronization, a host system, e.g., device 101, may provide media to a client system or software application embedded within the electronic device 103. In certain embodiments, media and/or data is “downloaded” to the electronic device 103. In other embodiments, the electronic device 103 is capable of uploading media to a remote host or other client system.

Audio communications device 190 may be utilized to provide an audio functionality associated with electronic device 103. Audio communications device 190 may include speakers 191 and 192 as well as microphone 195. Hence, the audio communications device 190 may be referred to as a headset device. Control interfaces 196, 197, and 198 and microphone 195 may be included in a peripheral control unit (PCU) 194. Accordingly, the PCU 194 may include multiple control interfaces such that the PCU 194 may receive audio input as well as tactile input.

Control interfaces 196, 197, and 198 of the PCU 194 may provide the audio communications device 190 with the capability to communicate control information to electronic device 103. Accordingly, audio communications device 190 may control the operation of any function of electronic device 103. Accordingly, for example, a telephone call may be received by electronic device 103 and a user may interact with the telephone call via audio communications device 190. Particularly, a user may transmit audio communications with another participant of the telephone call through electronic device 103 via microphone 195. The user may receive audio communications with another participant of the telephone call through electronic device 103 via speakers 191 and 192.

By including multiple control interfaces 196, 197, and 198 in the audio communications device 190, a user may be able to perform any function that the user could perform using the control interfaces of electronic device 103. For example, control interfaces 196 and 198 may allow a user to change the volume of a call. Particularly, for example, control interface 196 may increase (or decrease) the volume of a call while control interface 198 may decrease (or increase) the volume of a call. Control interface 197 may, for example, be utilized to initiate and/or terminate a call. Control interfaces may perform a number of functions. Such functions may be dependent on the operating environment of either electronic device 103 or audio communications device 190. For example, button 197 may be utilized to pause a song when electronic device 103 is in the operating environment of playing a song. As per another example, button 197 may be utilized to mute a call when electronic device 103 is in the operating environment of handling the communication of a telephone call.

Control interfaces 196, 197, and 198 may take any form. A control interface may, for example, be a wheel, a button, a single-touch screen, a multiple-touch screen, and/or a switch. Control interfaces may be activated to provide control information in a number of ways. For example, control interface 196 may be a button that may activate the transmission of one type of control information when pressed, another type of control information when released, and yet another type of control information when being held down for a particular period of time (e.g., approximately two seconds or more). Each type of control information may cause electronic device 103 or audio communications device 190 to perform a particular function. For example, the pressing of a button (e.g., control interface 197) may switch one song that is being played by electronic device 103 to the next song that will be played by electronic device 103. Furthering this example, the holding of the button in the pressed position may cause electronic device 103 to fast forward a song that is being played.
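
The following sketch is not taken from the disclosure; the two-second threshold echoes the example above, and everything else is an assumption. It illustrates how press and release timing could be classified into the interaction types just described:

```python
import time

HOLD_THRESHOLD_S = 2.0   # "approximately two seconds or more"

class ButtonTracker:
    """Classifies a button interaction as a press, a release, or a long hold."""

    def __init__(self):
        self._pressed_at = None

    def press(self):
        self._pressed_at = time.monotonic()
        return "press"                 # one type of control information

    def release(self):
        held_for = time.monotonic() - self._pressed_at
        self._pressed_at = None
        if held_for >= HOLD_THRESHOLD_S:
            return "hold_release"      # e.g., stop fast-forwarding the song
        return "release"               # another type of control information

# Usage: call press() when the button goes down and release() when it comes up;
# the return value of release() distinguishes a short press from a long hold.
```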

Communications channel 105 may be, for example, a multiple-wire (e.g., four wire) cable permanently connected to audio communications device 190. Communications channel 105 may include a plug that is operable to mate with a jack located on, for example, electronic device 103. Such a plug may take the form of a multiple-region (e.g., a four-region) male connector. Similarly, such a jack may take the form of a multiple region (e.g., a four-region) female connector. The mating of communication channel 105 to electronic device 103 may take the form of connection 110.

Connection 110 may include a multiple-region male plug that includes regions 124, 123, 122, and 121. Region 124 may be, for example, a right (or left) audio channel. Region 123 may be, for example, a left (or right) audio channel. Region 122 may be, for example, a ground channel. Region 121 may be, for example, a microphone channel. Such a multiple-region male plug may mate with a multiple-region female connector that includes regions 131, 132, 133, and 134. Region 134 may be, for example, a right (or left) audio channel. Region 133 may be, for example, a left (or right) audio channel. Region 132 may be, for example, a ground channel. Region 131 may be, for example, a microphone channel.

Persons skilled in the art will appreciate that power may be supplied to an audio communications device via a communications channel having a jack that is mateable with an electronic device. In this manner, the electronic device 103 may provide power to the audio communications device. For example, a four-region jack may include a microphone channel. Such a microphone channel may supply electrical energy to an accessory (e.g., an audio communications device) while receiving microphone audio information from the accessory. Additional information, such as control information, may be communicated through the microphone channel. Accordingly, the electronic device 103 may include circuitry that can discern control information from microphone audio information. Such an electronic device may separate the control information from the microphone information. In doing so, the electronic device 103 may send the control information to one feature and send the microphone audio information to another feature.

Control information may be embedded with microphone information in many ways. For example, the control information may be transmitted through a channel as tones. Such tones may take the form of, for example, ultrasonic current pulses (e.g., 75-300 kHz current pulses), while microphone audio information is transmitted at audible frequencies. In doing so, for example, a receiving user or particular electronic device circuitry may not be able to detect the ultrasonic frequency data amongst the audible frequency data. For example, a microphone codec located at an electronic device may be provided with both the ultrasonic and audible frequency data, yet may be configured to only see and/or detect the audible frequency data. In doing so, for example, the ultrasonic frequency information may not need to be stripped apart from the audible frequency information. The ultrasonic and audible microphone frequency information may be transmitted as current pulses across a channel (e.g., a microphone channel).

Ultrasonic frequency information and audible microphone frequency information may be transmitted as an analog signal. In one embodiment, the high frequency range includes a range of tones above the threshold for human hearing and less than about 1 MHz. Alternatively, for example, an audio communications device and an electronic device may be configured to communicate digitally. Persons skilled in the art will appreciate that microphone information and control information may be transmitted digitally across a microphone node.
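
A rough numerical sketch of this frequency separation is shown below. It is not taken from the disclosure: the sample rate, filter orders, and tone amplitudes are arbitrary assumptions, and SciPy filters stand in for whatever analog or digital circuitry a real device would use. The 75-300 kHz control band and the audible microphone band follow the ranges discussed above.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 1_000_000                      # 1 MHz sample rate, above the 300 kHz control tones
t = np.arange(0, 0.01, 1 / FS)

voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)        # audible microphone audio (1 kHz)
control = 0.1 * np.sin(2 * np.pi * 150_000 * t)    # ultrasonic control tone (150 kHz)
line = voice + control                             # combined signal on the microphone node

# Audible path: roughly what a microphone codec would "see" (below ~20 kHz).
audible_sos = butter(4, 20_000, btype="low", fs=FS, output="sos")
mic_audio = sosfilt(audible_sos, line)

# Control path: band-pass over the 75-300 kHz control-tone range.
control_sos = butter(4, [75_000, 300_000], btype="band", fs=FS, output="sos")
control_tones = sosfilt(control_sos, line)
```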

The audio communications device may include a microphone. Accordingly, the audio communications device may be configured, for example, to constantly supply current to the microphone, in order to keep the microphone constantly active, when power is supplied to the audio communications device through a microphone channel. In doing so, for example, the occurrence of audible artifacts may be eliminated while control interfaces are used that introduce additional signal information into the microphone channel. In other words, for example, the occurrence of audible blips and moments of silence may be eliminated as control interface switching occurs. The audio communications device may also be configured to include the ability to turn the current supplied to the microphone OFF. Such an ability may be user-controlled via a control interface (e.g., a button) and/or electronic device controlled and/or audio communications device controlled.

An audio communications device, such as a pair of earphones, may include any type of microphone. For example, an audio communications device may include a voltage-based microphone (e.g., a MEMS microphone) or a current-based condenser-type microphone (e.g., an electret microphone).

Device-to-device handshaking may occur through a communications channel, such as a microphone channel. A handshake between an electronic device and an audio communications device may include, for example, device identification, communication initialization, security protocol establishment, and/or timing synchronization. In doing so, for example, a device (e.g., an electronic device) may be able to identify the accessory that is coupled to the device (e.g., an audio communications device). The device may then determine, based on the identification process, whether to further communicate with the accessory and how to further communicate with the accessory. For example, the device may recognize the identification of an accessory as a pair of earphones that includes a microphone and a multiple-interface control scheme. Accordingly, the device may turn ON associated features such as a microphone codec and a control information reception/management circuit.

Security protocols may also be in place during handshaking such that an accessory may be required by a device to transmit particular security information before the accessory can interact with the device. In doing so, for example, the device may be protected against control signals sent through a microphone channel by an unknown device. Furthermore, for example, synchronization may occur during device handshaking. Such synchronization may be utilized to synchronize the timing of circuitry on a device and an accessory. Additional processes may be added during device handshaking. For example, device testing may occur. Also, for example, power may be supplied to a microphone for a period of time until the microphone is ready to be used, and an electronic device may perform a check to make sure that the microphone is ready to use. The electronic device may, for example, check to make sure the features of an accessory are in working condition. Handshaking may be controlled by any device and any device may be the master of any subsequent communications. For example, a device may be a slave to an accessory.
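
The sketch below is a simplified, hypothetical rendering of the handshake sequence described above: the device identifies the accessory, issues a security challenge, and only then enables the matching features. The HMAC challenge-response and all names are illustrative stand-ins, not the protocol of the disclosure.

```python
import hashlib
import hmac
import os

SHARED_SECRET = b"accessory-provisioned-secret"     # assumption for illustration

def accessory_identify():
    """Identification information transmitted by the accessory during handshaking."""
    return {"type": "headset", "microphone": True, "buttons": 3}

def accessory_answer(challenge):
    """Accessory's response to the device's security challenge."""
    return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

def enable_microphone_codec():
    print("microphone codec ON")

def enable_control_reception(button_count):
    print(f"control reception ON ({button_count} buttons)")

def device_handshake():
    identity = accessory_identify()
    challenge = os.urandom(16)
    expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
    if not hmac.compare_digest(accessory_answer(challenge), expected):
        return False                                 # terminate communications
    if identity["microphone"]:
        enable_microphone_codec()
    enable_control_reception(identity["buttons"])
    return True

print("handshake ok:", device_handshake())
```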

Persons skilled in the art will appreciate that the features provided herein may extend beyond electronic device-to-accessory communications. More particularly, the features provided herein may be provided in any device-to-device communications as well as in-device circuit-to-circuit communications. For example, a microphone-enabled device may be mated with any device such as a car, plane, boat, train, home computer, server, laptop computer, cellular phone, tablet computer, Personal Digital Assistant (PDA), or any other device.

FIG. 2 shows communication topology 200 between a device and an audio communications device. Such an electronic device, e.g., electronic device 103, may include, for example, circuit 290. Such an audio communications device, e.g., audio communications device 190, may include circuit 210. Circuit 210 may be housed, for example, within the PCU 194 of audio communications device 190.

Circuit 290 of an electronic device may include low noise power supply 291, frequency detector and controller 292, resistor 294, comparators 293, and codec 296. Circuit 290 may also include, for example, a source of electrical energy as well as any other hardware and/or software needed for any particular function. For example, circuit 290 may communicate with an electronic device operating system that runs applications/hardware for providing telephonic and media-playing functionalities.

Circuit 210 of an audio communications device may include, for example, control interfaces 260, switch 231, switch 232, switch 233, voltage detector and latch 221, shunt regulator 224, microphone 240, resistor 273, resistor 272, and capacitor 271. Circuit 290 may couple to circuit 210 through node 211, which may be a microphone node that also provides power from circuit 290 to circuit 210. Circuit 210 may also couple to ground by coupling to a ground terminal of circuit 290. Persons skilled in the art will appreciate that a ground terminal may be, for example, a virtual ground. Such a virtual ground may take the form of, for example, a stable voltage that is lower than a power voltage. Accordingly, the virtual ground and power voltage may have a particular differential voltage that is utilized to power circuit 210 (e.g., approximately 2-3.2 volts in differential).

Generally, circuits 210 and 290 may operate, for example, as follows. Circuit 290 may provide one of a number of voltages. For example, circuit 290 may provide 0 volts, 2.0 volts, or 2.7 volts. 0 volts may correspond to, for example, the situation when the electronic device is OFF or the electronic device has been instructed to stop communications with the audio communications device. 2.0 volts may be provided by one group of electronic devices while 2.7 volts may be provided by another group of electronic devices. Accordingly, the voltage initially supplied through a microphone channel may be utilized to identify an electronic device as being part of a particular group. Circuit 210 may operate differently depending on the voltage that is provided. Accordingly, circuit 210 may be able to operate with different groups of devices. One group of devices (e.g., a group that supplies 2.0 volts) may not include a microphone functionality while another group (e.g., a group that supplies 2.7 volts) may include a microphone functionality. Persons skilled in the art will appreciate that a single device may change the supply voltage that is provided to an accessory based, in part, on the type of use desired by the electronic device for the accessory. For example, a media telephone having a multimedia feature may provide 2.0 volts when the multimedia feature is being used (e.g., and thus not utilize a microphone), yet such a device may provide 2.7 volts when a telephonic feature is being used (e.g., and thus utilize a microphone).

Circuit 210 may be able to operate differently in any number of power supply conditions. For example, circuit 210 may be able to operate differently under three power supply conditions such as HIGH (e.g., 2.7 volts), MEDIUM (e.g., 2.0 volts), and LOW (e.g., 0 volts). Circuit 210 may also operate differently under two (e.g., HIGH and LOW) or more than three power supply conditions. Higher powered electronic devices may be configured to provide HIGH and LOW power supply voltages while lower powered electronic devices may be configured to provide MEDIUM and LOW power supply voltages. For example, a portable telephone with multimedia features may be a higher powered electronic device while a media player without a display (and/or telephonic feature) may be a lower powered electronic device.
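
As an illustrative sketch (the 0, 2.0, and 2.7 volt levels come from the examples above; the thresholds and the returned configuration are assumptions), an accessory could classify the supply voltage on the microphone node into LOW, MEDIUM, or HIGH and select an operating mode accordingly:

```python
def classify_supply(voltage):
    """Map a measured supply voltage to the LOW / MEDIUM / HIGH conditions above."""
    if voltage < 0.5:
        return "LOW"       # ~0 volts: device OFF or communications stopped
    if voltage < 2.4:
        return "MEDIUM"    # ~2.0 volts: device group without microphone support
    return "HIGH"          # ~2.7 volts: device group with microphone support

def configure_accessory(voltage):
    """Pick an operating mode for the accessory based on the detected level."""
    return {
        "LOW":    {"microphone": False, "control_interfaces": False},
        "MEDIUM": {"microphone": False, "control_interfaces": True},
        "HIGH":   {"microphone": True,  "control_interfaces": True},
    }[classify_supply(voltage)]

print(configure_accessory(2.7))   # {'microphone': True, 'control_interfaces': True}
```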

Suppose, for example, circuit 210 receives a power supply voltage of 2.7 volts from circuit 290. At such a power supply voltage, for example, switch 233 and switch 231 may turn ON. Particularly, voltage detector and latch 221 may detect the power supply voltage and may provide switching voltages to switch 231 and switch 232. In doing so, for example, power may be provided to microphone 240 by providing power to amplifier 243 and voltage multiplier 241. Accordingly, the appropriate power supply voltage may turn microphone 240 ON. Similarly, the turning ON of switch 233 may provide power to switch state transmitter 222 and impedance detector 223. Switch state transmitter 222 may, for example, be a multi-tone oscillator. Persons skilled in the art will appreciate that voltage detector and latch 221 may communicate information to, for example, switch state transmitter 222 and impedance detector 223. Accordingly, voltage detector and latch 221 may be able to discern differences in power supply voltages if, for example, additional information was communicated to circuit 210 via a power supply voltage. Thus, voltage detector and latch 221 may communicate this additional information to other structures of circuit 210 such as switch state transmitter 222 and impedance detector 223.

Persons skilled in the art will appreciate that voltage detector and latch 221 may include a latch, for example, in order to hold switching voltages for switches 231-233 at a particular voltage to hold a particular state of switches 231-233. Persons skilled in the art will also appreciate that switches 231-233 may initially be ON and that voltage detector and latch 221 may selectively turn switches 231-233 OFF. Similarly, switches 231-233 may initially be OFF and voltage detector and latch 221 may selectively turn switches 231-233 ON. Accordingly, voltage detector and latch 221 may turn switch 232 OFF and leave switches 233 and 231 ON when an appropriate voltage is detected (e.g., approximately 2.7 volts).

Switch state transmitter 222 may be a multi-tone oscillator and may, for example, transmit a handshake to circuit 290 in response to receiving a particular voltage (e.g., approximately 2.7 volts) from circuit 290. Such a handshake may, for example, communicate identification information to circuit 290. Such identification information may, for example, instruct circuit 290 as to the type of accessory that circuit 210 resides in. Accordingly, circuit 210 may communicate to circuit 290, via a handshake performed by transmitter 222, that circuit 210 resides in an audio communications device that includes a microphone, two speakers embodied as headphones, and a three-button controller.

Transmitter 222 may communicate any type of identification information or other information used in a handshaking process. For example, transmitter 222 may communicate a password in response to receiving information from circuit 290 indicative of a security challenge. If circuit 290 does not receive the appropriate password, authentication data, and/or cryptographic response for the security challenge, either circuit 210 and/or circuit 290 may turn OFF (e.g., the electronic device and/or accessory may terminate communications). Persons skilled in the art will appreciate that transmitter 222 may communicate the state of the switches of circuit 210. In doing so, for example, circuit 290 may be able to determine the type of accessory that circuit 210 is included in by determining what type of accessory particular switches would be turned ON in. For example, an accessory without a microphone may not include switch 231 and/or transmitter 222 may transmit that switch 231 is OFF in such an instance.

Impedance detector 223 may, for example, detect the use of control interface circuit 260. More particularly, for example, impedance detector 223 may detect different impedance levels of the output of control interface circuit 260 and may determine how a user interacted with control interface circuit 260 based on the detected impedance level. Accordingly, for example, impedance detector 223 may provide this information to other circuitry of circuit 210 (e.g., transmitter 222). In this manner, transmitter 222 may transmit information about the state of control interface circuit 260 to another device (e.g., a device that includes circuit 290). Transmitter 222 may transmit information as current pulses. For example, transmitter 222 may transmit information as ultrasonic current pulses (e.g., approximately 75-300 kHz).

Control interface circuit 260 may include any number of control interfaces such as, for example, one or more touch screens, wheels, buttons, and/or any other type of interface. For example, control interface circuit 260 may include multiple buttons (e.g., 3, 4, 5, or more). Each button may close a connection between a particular resistor, or resistors, and ground such that impedance detector 223 may be provided with a different level of impedance depending on which button is pressed. For example, control interface circuit 260 may include buttons 261-265. When button 262 is pressed, impedance detector 223 may detect the impedance of resistor 266 and, accordingly, may utilize the detection of this impedance as a control signal. When button 263 is pressed, impedance detector 223 may detect the impedance of resistor 266 in series with resistor 267 and, accordingly, may utilize the detection of this impedance as another control signal. Persons skilled in the art will appreciate that when button 261 is pressed, node 211 may be brought to ground. This, for example, may turn circuit 210 OFF. Button 261 may, or may not, for example, be included in control interface circuit 260. In other words, button 261 may be controlled by circuit 210 and may not be user-controlled. Similarly, voltage detector and latch 221 may control when control interface circuit 260 is turned ON and is operable to interact with impedance detector 223.
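
The resistor-ladder decoding described above can be sketched as follows (the resistor values, button names, and tolerance are hypothetical; a real circuit measures impedance in hardware rather than in software):

```python
BUTTON_RESISTANCE = {        # nominal ohms per button; values are assumptions
    "volume_up": 220.0,
    "center": 390.0,
    "volume_down": 620.0,
}
TOLERANCE = 0.1              # accept +/- 10% around the nominal resistance

def decode_button(measured_ohms):
    """Return the button whose nominal resistance matches the measured impedance."""
    for button, nominal in BUTTON_RESISTANCE.items():
        if nominal * (1 - TOLERANCE) <= measured_ohms <= nominal * (1 + TOLERANCE):
            return button
    return None              # open circuit or unrecognized impedance level

print(decode_button(230.0))  # volume_up
```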

Control interfaces may be configured, for example, in a variety of ways. Control interface circuit 260 may be configured, for example, such that only the interaction of a single button can be detected at any given time. Thus, impedance detector 223 may detect the pressing of button 262, the holding down of button 262 for a period of time, and the release of button 262. However, an interface may be provided that is operable to detect the simultaneous operation of multiple buttons (e.g., or multiple touches to a touch screen).

Control interface circuit 280 may be utilized, for example, to detect the simultaneous activation of multiple control interfaces. To obtain this functionality, for example, series configurations of switches and resistors may be placed in a parallel configuration. The resistors may, for example, have different resistances. When a single button is pressed, for example, the impedance of a single resistor may be detected by impedance detector 223 and discerned as a single button activity. However, when more than one button is pressed, for example, a different impedance profile may be detected by impedance detector 223 than when any single button is pressed.

Similarly, when more than one button is pressed, for example, a different impedance profile may be detected by impedance detector 223 than for any other combination of simultaneous button presses. For example, when buttons 288 and 287 are pressed, impedance detector 223 may detect the impedance profile of resistors 284 and 283 in a parallel configuration. If resistors 281-285 are provided, for example, with particular different resistances, impedance detector 223 may be able to detect the simultaneous press of any number (e.g., all) of switches 285-288.
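
A sketch of this simultaneous-press scheme follows. Each pressed button switches its own resistor to ground, so the detector sees the parallel combination; with sufficiently distinct resistor values every subset of pressed buttons yields a distinguishable impedance. The resistor values, button names, and tolerance are illustrative assumptions.

```python
from itertools import combinations

RESISTORS = {"button_a": 1000.0, "button_b": 2200.0, "button_c": 4700.0}   # ohms

def parallel(values):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in values)

# Expected impedance for every non-empty combination of pressed buttons.
EXPECTED = {
    frozenset(combo): parallel(RESISTORS[b] for b in combo)
    for n in range(1, len(RESISTORS) + 1)
    for combo in combinations(RESISTORS, n)
}

def decode_presses(measured_ohms, tolerance=0.05):
    """Return the set of buttons whose parallel resistance matches the measurement."""
    for combo, expected in EXPECTED.items():
        if abs(measured_ohms - expected) <= expected * tolerance:
            return set(combo)
    return set()

print(decode_presses(parallel([1000.0, 2200.0])))   # {'button_a', 'button_b'}
```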

Persons skilled in the art will appreciate that any detector may be used to detect user interaction with a control interface circuit. For example, a capacitive touch-screen may be utilized and a detector may be provided that is able to discern different capacitance profiles.

Shunt regulator 224 may be utilized to assist in maintaining a constant current draw from microphone 240 even when, for example, control interfaces of control interface circuit 260 are being utilized. Particularly, shunt regulator 224 may be coupled to, and operate with, microphone 240 when, for example, switches 233 and 231 are CLOSED. Accordingly, circuit 210 may allow for simultaneous operation of control interface circuit 260 and microphone 240. Shunt regulator 224 may, for example, be used with resistor 272 to keep the signal from microphone 240 on node 211 independent from variations introduced on the power node of transmitter 222 by transmitter 222. In doing so, the introduction of audible pops and moments of silence may be eliminated when control interfaces of control interface circuit 260 are utilized while microphone 240 is being utilized.

Frequency detector and controller 292 may receive tones from transmitter 222 and may utilize such tones to control the operation of a device. Persons skilled in the art will appreciate that if a handshake indicates that an accessory has no microphone, frequency detector and controller 292 may set low noise supply circuit 291 to a MEDIUM voltage (e.g., 2.0 volts). In doing so, for example, frequency detector and controller 292 may OPEN switch 233, OPEN switch 231, and CLOSE switch 232. In doing so, microphone 240 may be turned OFF and control interface circuit 260 may be the only device transmitting information across node 211. In this manner, for example, frequency detector and controller 292 may expect to receive only control interface information across node 295 and may accordingly change how node 295 is utilized. For example, frequency detector and controller 292 may turn OFF microphone codec 296.

Persons skilled in the art will appreciate that some devices that include circuit 290 may be configured to initially provide 2.0 volts. For example, some devices may not have the capability to utilize a microphone input and, accordingly, may not include microphone codec 296. Similarly, an accessory that utilizes circuit 210 may be able to be backwards compatible with a variety of devices that provide at least one of a variety of initial power supply voltages (e.g., LOW, MEDIUM, and HIGH).

Persons skilled in the art will appreciate that a device may be configured to initially provide a particular power supply voltage (e.g., a MEDIUM voltage) for a particular amount of time (e.g., 100 ms) and then change the power supply voltage to a different voltage (e.g., a HIGH voltage). In doing so, for example, circuit 290 may be able to perform particular features and then may be able to alternate the state of operation in order to provide additional or alternate features (e.g., the inclusion of a microphone functionality).

Persons skilled in the art will also appreciate that some devices may not support multiple-control interface (e.g., multiple-button) functionality. Accordingly, for example, a device may change from one voltage to a second voltage (e.g., a MEDIUM to a HIGH) after a period of time (e.g., 100 ms) in order to indicate that the device includes multiple-control interface functionality. Accordingly, circuit 210 may operate to provide only a single control interface (or no control interface). Alternatively, for example, the device may provide a particular voltage (e.g., a HIGH voltage) and circuit 290 may operate with no, or a single, control interface support. Alternatively still, for example, multiple control interface information may be transmitted to a device that does not include multiple control information support if, for example, the device will just ignore, or not recognize, such multiple control information. When circuit 210 detects a device that can only operate, for example, with a single control interface, circuit 210 may ignore the button presses from all but one control interface.

FIG. 3A is a communication topology 300 between an electronic device (via circuit 290) and an audio communications device (via circuit 210) including a condenser-type microphone 302 according to an illustrative embodiment of the invention. In one embodiment, the microphone 302 includes a current driven microphone such as, for example, an electret microphone. An electret microphone may include, without limitation, a foil-type electret, diaphragm-type electret, front electret, and back electret. In certain embodiments, the condenser-type microphone 302 enables the power supply 291 to function as a current source for the microphone 302. By employing a current source, the circuit 210 may improve power supply noise rejection and, thereby, improve circuit 210 communications performance.

FIG. 3B is a communication topology 350 between an electronic device (via circuit 290) and an audio communications device (via circuit 210) including an alternate switch 231 configuration according to an illustrative embodiment of the invention. In contrast with the communication topology 300, the communication topology 350 includes a circuit configuration in which the switch 231 is positioned on the common (ground) side of the condenser-type microphone 352. In addition to the advantageous effects of using a current source to improve noise rejection, the positioning of the switch 231 may further enhance noise rejection and improve circuit 210 communications performance.

FIG. 4 is an illustration of a process flow chart 400. Process flow chart 400 may, for example, be utilized to communicate information between a device, e.g., electronic device 103, and an accessory, e.g., audio communications device 190.

Flow chart 400 may include step 411, in which a device is powered ON. A device may be powered ON via, for example, a manual hardware switch located on either the device or an accessory. Step 412 may be included in which the device provides a power supply voltage to an accessory mated with the device. Such a power supply voltage may be provided, for example, across the microphone node of a jack. In step 413, the accessory may detect the level of the power supply voltage that was provided to the accessory. In doing so, for example, the accessory may determine a group for the device. For example, the accessory may determine that a device is one that is only able to receive control information associated with a single control interface or is able to receive control information associated with multiple control interfaces.

In step 414, for example, the accessory may transmit a handshake to the device. Such a handshake, for example, may be utilized to identify the accessory (e.g., as a device that includes a microphone or that does not include a microphone). Such a handshake may alternatively, for example, be utilized to confirm that the accessory is operating and is ready to continue with a communication. Persons skilled in the art will appreciate that an accessory may need time to power up once a power supply voltage is received. Accordingly, a handshake may be utilized to signal a device that such a power up process has been completed and that communications can begin between the device and the accessory. In step 415, the accessory may transmit control and microphone information through a microphone channel. One or more microphone channels may be provided on a jack and/or plug.

Flow chart 420 may be utilized in a communications topology between a device and an accessory. Step 421 may be included, in which one or more control interfaces may be activated. A detector, such as an impedance detector, may detect such an activation and determine the origination of the activation (e.g., the press of a third button) in step 422. In step 423, for example, a multi-tone oscillator may provide control information corresponding to the determined activation through the microphone node of a plug. A device may then receive the control information in step 424 and the control information may control the operation of the device in step 425.

Flow chart 430 may be utilized in a communications topology between a device and an accessory. In step 431, an accessory may be plugged into a device. In step 432, the accessory may detect a first voltage level of a power supply voltage and the accessory may place itself in a first manner of operation as a result of the detected voltage level. In step 433, the accessory may be unplugged. In step 434, the accessory may be plugged into a second device. In step 435, for example, the accessory may detect a second voltage level of a power supply voltage and the accessory may place itself in a second manner of operation as a result of the detected voltage level.

Flow chart 440 may be utilized in a communications topology between a device and an accessory. In step 441, an accessory may determine the enablement of the operation of multiple interfaces located on the accessory. In step 442, for example, the accessory may provide constant current to a microphone of the accessory. In step 443, the accessory may receive control interface activation signals from control interfaces that are isolated from the output of a microphone. The accessory may then embed the control information into the microphone node of an output plug as ultrasonic current pulses.

FIG. 5 is a perspective view of an electronic device and/or media device 500 according to an illustrative embodiment of the invention. The device 500 includes a housing 502, a first housing portion 504, a second housing portion 506, a display 508, a keypad 510, a speaker housing aperture 512, a microphone housing aperture 514, a headphone jack 516, and frame sidewall 522. In certain embodiments, the frame sidewall 522 is the exposed portion of a frame residing within or adjacent to the housing 502 that provides structural support for the media device 500 and various internal components.

In one embodiment, the housing 502 includes a first housing portion 504 and a second housing portion 506 that are fastened together and/or to the frame sidewall 522 to encase various components of the media device 500. The housing 502 and its housing portions 504 and 506 may include polymer-based materials that are formed by, for example, injection molding to define the form factor of the media device 500. In one embodiment, the housing 502 surrounds and/or supports internal components such as, for example, a display 508, one or more circuit boards having integrated circuit components, internal radio frequency (RF) circuitry, an internal antenna, a speaker, a microphone, a hard drive, a processor, and other components. Further details regarding certain internal components are discussed herein with respect to FIG. 6. The housing 502 provides for mounting of a display 508, keypad 510, external jack 516, data connectors, or other external interface elements. The housing 502 may include one or more housing apertures 512 to facilitate delivery of sound, including voice and music, to a user from a speaker within the housing 502. The housing 502 may include one or more housing apertures 514 to facilitate the reception of sounds, such as voice, for an internal microphone from a device user.

Personal computing devices and/or media devices of this type may include a touchscreen remote control, such as a Pronto made available by Royal Philips Electronics of the Netherlands, or a handheld GPS receiver made available by Garmin International, Inc. of Olathe, Kans. In certain embodiments, the display 508 includes a graphical user interface (GUI) to enable a user to interact with the device 500. The personal computing device 500 may also include an image sensor such as a camera capable of capturing photographic images and/or video images.

FIG. 6 shows a simplified functional block diagram of a media device 600 according to an illustrative embodiment of the invention. The block diagram provides a generalized block diagram of a computer system such as may be employed, without limitation, by the media devices 103 and 500. The media device 600 may include a processor 602, storage device 604, user interface 608, display 610, CODEC 612, bus 618, memory 620, communications circuitry 622, a speaker or transducer 624, a microphone 626, and a PCU interface 630 to facilitate communications with an audio communications device. Processor 602 may control the operation of many functions and other circuitry included in media device 600. Processor 602 may drive display 610 and may receive user inputs from the user interface 608.

Storage device 604 may store media (e.g., music and video files), software (e.g., for implementing functions on device 600), preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), personal information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), word processing information, personal productivity information, wireless connection information (e.g., information that may enable the media device to establish wireless communication with another device), subscription information (e.g., information that keeps track of podcasts or television shows or other media that a user subscribes to), and any other suitable data. Storage device 604 may include one or more storage mediums, including, for example, a hard-drive, permanent memory such as ROM, semi-permanent memory such as RAM, or cache.

Memory 620 may include one or more different types of memory which may be used for performing device functions. For example, memory 620 may include cache, ROM, and/or RAM. Bus 618 may provide a data transfer path for transferring data to, from, or between at least storage device 604, memory 620, and processor 602. Coder/decoder (CODEC) 612 may be included to convert digital audio signals into analog signals for driving the speaker 624 to produce sound including voice, music, and other like audio. The CODEC 612 may also convert audio inputs from the microphone 626 into digital audio signals. The CODEC 612 may include a video CODEC for processing digital and/or analog video signals.

User interface 608 may allow a user to interact with the media device 600. For example, the user interface 608 can take a variety of forms, such as a button, keypad, dial, click wheel, or touch screen. Communications circuitry 622 may include circuitry for wireless communication (e.g., short-range and/or long-range communication). For example, the wireless communication circuitry may be Wi-Fi enabling circuitry that permits wireless communication according to one of the 802.11 standards. Other wireless network protocol standards could also be used, either as an alternative to or in addition to the identified protocols. Other network standards may include Bluetooth, the Global System for Mobile Communications (GSM), code division multiple access (CDMA), and long-term evolution (LTE) based wireless protocols. Communications circuitry 622 may also include circuitry that enables the media device 600 to be electrically coupled to another device (e.g., a computer or an accessory device) and communicate with that other device.

In one embodiment, the media device 600 may be a portable computing device dedicated to processing media such as audio and video. For example, the media device 600 may be a media device such as media player (e.g., MP3 player), a game player, a remote controller, a portable communication device, a remote ordering interface, an audio tour player, or other suitable media device. The media device 600 may be battery-operated and highly portable so as to allow a user to listen to music, play games or video, record video or take pictures, communicate with others, and/or control other devices. In addition, the media device 600 may be sized such that it fits relatively easily into a pocket or hand of the user. By being handheld, the media device 600 (or media devices 103 and 500) is relatively small and easily handled and utilized by its user and thus may be taken practically anywhere the user travels.

The media device 600 may employ a PCU interface 630 to facilitate communications between the media device 600 and a peripheral device such as audio communications device 190. In certain embodiments, the PCU interface 630 includes one or more components of circuit 290 of FIGS. 2, 3A, and 3B. In one embodiment, a portion of the PCU interface 630 is included in the communications circuitry 622.

In certain embodiments, a media device, such as media device 103 or 500, is configured to receive control information from a peripheral control device such as audio communications device 190 and/or its PCU 194. As discussed with respect to FIG. 1, the PCU 194 may include one or more control interfaces. In one embodiment, the control interfaces include three control interfaces 196, 197, and 198. However, the number of control interfaces may include 1, 2, 3, 4, 5, or more than five control interfaces. Also, the type and arrangement of the control interfaces may vary.

In one embodiment, the number of control interfaces is configured to optimize user control via gestures created by one hand of a user. For example, the PCU 194 may have a form factor that enables a user to hold the PCU 194 between the fingers and thumb of one hand. Thus, the user may actuate one or more control interfaces using one or more fingers while simultaneously holding the PCU 194 by applying pressure to the PCU 194 via the user's thumb and at least one finger. One or more control interfaces may be actuated by the user's thumb as well. The size of the PCU 194 may be less than or equal to about 2 in3, 1.5 in3, 1 in3, 0.75 in3, 0.5 in3, 0.25 in3, 0.125 in3, or 0.1 in3. One length of the PCU 194 may be less than about 4 in, 3 in, 2 in, 1 in, or 0.5 in. The shape of the PCU 194 may include, without limitation, a rectangular form, square form, oval form, spherical form, circular form, or multi-sided (e.g., hexagonal) form. In one embodiment, three control interfaces 196, 197, and 198 are positioned on a top surface of the PCU 194 to enable actuation by one or more fingers of a user. In other embodiments, four or five or more control interfaces may be employed.

In certain embodiments, a user manipulates the control interfaces, such as control interfaces 196, 197, and 198, to form one or more control gestures. As discussed with respect to FIG. 2, each control gesture, being associated with a particular sequence of operations of the control interfaces 196, 197, and 198, may be associated with a particular control signal (e.g., control interface information) that is generated by the circuit 210. The control signal may then be transmitted via communications channel 105 to a media device 500 that may include, for example, the circuit 290. The circuit 290 may then detect the control signal using detector 292. The media device 500 may include a processor 602 that processes the detected control signal to determine whether a particular operation of the media device 500 is to be performed. The media device 500 may include a memory 620 and/or data storage component 604 that are capable of storing a list or database 700 of known control gestures. Thus, in one embodiment, the processor 602 compares a control signal associated with a particular user-generated control gesture with a list of known control signals associated with known control gestures. The processor 602 may identify the desired control function by determining the closest match in the database 700 with the received control signal. Once the desired control function is determined, the processor 602 performs the operations necessary to carry out the desired control of the media device 500.
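
The closest-match step can be sketched as follows, with the detected control signal reduced, purely for illustration, to a single tone frequency; the frequencies, tolerance, and function names are hypothetical and not taken from the disclosure:

```python
KNOWN_SIGNALS = {              # tone frequency in Hz -> control function
    100_000: "play_pause",
    150_000: "volume_increase",
    200_000: "volume_decrease",
    250_000: "skip_to_next_song",
}

def closest_match(detected_hz, max_error_hz=10_000):
    """Return the control function of the nearest known signal, within a tolerance."""
    nearest = min(KNOWN_SIGNALS, key=lambda f: abs(f - detected_hz))
    if abs(nearest - detected_hz) <= max_error_hz:
        return KNOWN_SIGNALS[nearest]
    return None                # no sufficiently close known control signal

print(closest_match(148_300.0))   # volume_increase
```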

In one embodiment, the media device 500 includes a plurality of software applications and/or subroutines of a software application. The media device 500 may associate a set (or arrangement of a set) of control gestures with a particular application or subroutine, while associating a different set (or arrangement of a set) of control gestures for a different application and/or subroutine. The media device 500 may define control gestures independently for a plurality of applications. Thus, for example, a particular control gesture (e.g., click, press and hold) may define a different operation for one application than another application.

Table 1 shows an exemplary association of user gestures with media device functions or controls that are dependent on the type of application running on the media device. Table 1 includes various exemplary applications such as a media player application, radio application, voice memo record application, voice memo playback application, and exercise application. In no way should this listing be considered limiting. User gesture controls via a PCU 194 may be applied to any type of application, especially those applications where a user employs an audio communications device 190 with a media device 103. As discussed previously, certain gestures may be re-used for different applications to initiate different functions of the media device 500.

TABLE 1
Exemplary Table Associating User Gestures with Media Device Control Depending on the Media Device Application

Application | Control Button | Click | Press & Hold | Double Click | Double Click-Press & Hold
Media Player Application | + | Volume up | Volume up ramp | x | x
Media Player Application | Center | Play/Pause | * | Next track, Next chapter, Next photo | x
Media Player Application | − | Volume down | Volume down ramp | x | x
Radio Application | + | Volume up | Volume up ramp | x | x
Radio Application | Center | Mute/Un-mute | Tag | Next Preset | x
Radio Application | − | Volume down | Volume down ramp | x | x
Voice Memo Record Application | + | x | x | x | x
Voice Memo Record Application | Center | Pause/Resume | x | x | Launch app & Start/End Record
Voice Memo Record Application | − | x | x | x | x
Voice Memo Playback Application | + | Volume up | Volume up ramp | x | x
Voice Memo Playback Application | Center | Play/Pause | x | Next Chapter | x
Voice Memo Playback Application | − | Volume down | Volume down ramp | x | x
Exercise Application | + | Volume up | Volume up ramp | x | x
Exercise Application | Center | Voice Feedback | PowerSong | x | x
Exercise Application | − | Volume down | Volume down ramp | x | x

For example, Table 1 shows that the same gesture (e.g., double click) can perform different functions depending on the application that is running on the media device 103 or 500. When the media player application is running, the “double click” gesture may initiate the “Next Track, Next Chapter, Next Photo” function which, for example, changes the currently playing song to the next song. When the radio application is running, the “double click” gesture may initiate the “Next Preset” function which, for example, advances the selected radio station to the next pre-selected radio station. While the voice memo playback application is running, the “double click” gesture may initiate the “Next Chapter” function which moves the playback audio to the next chapter and/or segment of recorded audio.
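
A minimal sketch of this context-dependent interpretation is shown below; the application names and action labels are hypothetical and simply mirror the double-click column of Table 1:

```python
# Hypothetical mapping: the same "double click" gesture resolves to a
# different function depending on the running application (per Table 1).
DOUBLE_CLICK_ACTIONS = {
    "media_player": "next_track",
    "radio": "next_preset",
    "voice_memo_playback": "next_chapter",
}

def resolve_double_click(running_application):
    """Return the action a double click maps to, or None where Table 1 shows 'x'."""
    return DOUBLE_CLICK_ACTIONS.get(running_application)

# Example: resolve_double_click("radio") -> "next_preset"
```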

While Table 1 refers to click and double clicks based gestures, it should be understood that gestures may include triple clicks, quadruple clicks, or any number of clicks. In certain embodiments, the media device 103 and/or PCU 194 may recognize and ignore multiple simultaneous button presses or control interface actuations. In other embodiments, the media device 103 and/or PCU 194 may be configured to recognize simultaneous button presses and/or control interface actuations by any two or more control interfaces.

Table 1 illustrates various functions associated with various media device applications and lists associated control gestures used to initiate the various functions. With respect to the media player or playback application, control interface 196 (e.g., V+ button) may be used to increase the audio volume of the speakers of the audio communications device 190. For example, clicking once on the V+ button can increment the volume by one step or increment. Pressing and holding the V+ button may cause the volume to ramp up until the button is released. Control interface 198 (e.g., V− button) may be used to decrease the audio volume of the speakers of the audio communications device 190. For example, clicking once on the V− button can decrement the volume by one step or increment. Pressing and holding the V− button may cause the volume to ramp down until the button is released. Control interface 197 (e.g., center button) may be used to play or pause the playing of media such as a song, audio file, video file, and the like. By clicking once on the center button, media playback can be paused if media is currently playing or resumed if media is currently paused. By double clicking on the center button, the user can initiate a “next” command to advance media playback to the next song or chapter (or next video and so on).
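
The click-versus-ramp behavior described above can be sketched as follows; the step size, ramp interval, and volume range are assumptions rather than values given in the text, and `is_released` stands in for whatever mechanism reports that the button has been let go:

```python
import time

VOLUME_STEP = 1        # one increment per click (assumed)
RAMP_INTERVAL = 0.15   # seconds between steps while the button is held (assumed)

class VolumeControl:
    def __init__(self, level=10, minimum=0, maximum=16):
        self.level, self.minimum, self.maximum = level, minimum, maximum

    def click(self, direction):
        """Single click on V+ (direction=+1) or V- (direction=-1): one volume step."""
        self.level = max(self.minimum,
                         min(self.maximum, self.level + direction * VOLUME_STEP))

    def press_and_hold(self, direction, is_released):
        """Ramp the volume one step at a time until is_released() returns True."""
        while not is_released():
            self.click(direction)
            time.sleep(RAMP_INTERVAL)
```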

While a radio application is running, control interface 197 (e.g., center button) may be used to mute or un-mute radio playback. For example, by clicking once on the center button, a user can cause the radio playback to mute if the radio audio is currently playing or return to a previous audio volume level if the radio audio is currently muted. In certain embodiments, the radio application may be configured with preset radio station settings and, thereby, allow a user to conveniently tune to their favorite radio stations. Where there are multiple preset radio stations, the center button may be used to skip to the next preset radio station. For example, by double clicking the center button, the radio tuner will advance to the next preset radio station in increasing and/or decreasing frequency order. By double clicking the center button while at the highest frequency preset radio station, the radio tuner may advance to the lowest frequency preset radio station, effectively wrapping around the tuner dial. The radio application may support adjusting audio volume in the same manner as the media playback application. The radio application may also support tagging of particular media such as a song. For example, by pressing and holding the center button while a song is playing, the song can be tagged for later identification and use. In certain embodiments, media such as a song may be designated as taggable or not taggable. For example, a song may be designated as not taggable for certain digital rights reasons, which may restrict a user from performing certain operations on the song using the tag feature.
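
A minimal sketch of the wrap-around preset stepping is shown below; the preset frequencies in the example are hypothetical:

```python
def next_preset(presets, current):
    """Return the preset station after `current`, wrapping from the
    highest-frequency preset back to the lowest-frequency preset."""
    ordered = sorted(presets)
    index = ordered.index(current)
    return ordered[(index + 1) % len(ordered)]

# Example with hypothetical preset frequencies (MHz):
# next_preset([88.5, 94.7, 101.3, 106.1], 106.1) -> 88.5
```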

While a voice memo application is running, control interface 197 (e.g., center button) may be used to initiate and end a voice memo. For example, by double clicking and then pressing and holding the center button, a user can launch the voice memo application and begin recording audio via, for example, microphone 195. In certain embodiments, a media device, e.g., media device 500, may use an internal microphone 626, microphone 195 of an audio communications device 190, or another accessory microphone. Clicking once on the center button may pause a recording if recording is currently in progress or resume a recording if a recording is currently paused. The voice memo application may support adjusting audio and/or recording volume in the same manner as the media playback application.

While a voice memo playback application is running, control interfaces 196, 197, and 198 (e.g., V+, center button, and V−) may be used in a similar manner as when the media playback application is running.

While an exercise application is running, control interface 197 (e.g., center button) may be used to initiate a voice feedback feature. In certain embodiments, the voice feedback feature includes voice commands and/or comments, beeps, tones, audio clips, video clips, alerts, ring tones, and/or any like audio-based indication that provides feedback, an indication, a communication, and/or notice for a user. In one embodiment, voice feedback includes a statement of distance traveled (e.g., “one mile”) and/or a statement of distance remaining (e.g., “one mile to go”). In another embodiment, voice feedback includes statistical information or other status information (e.g., “heart rate is 128”). In a further embodiment, the voice feedback feature may include a call, email, or other like communication notification (e.g., “call from Bill”). By pressing and holding the center button, a user may initiate playback of a selected media file such as a song (e.g., a PowerSong) that is desirable during an exercise routine or other activity. It should be understood that the control gestures and functions illustrated in Table 1 are exemplary of a broader range of possible control gestures, control operations, and applications that may utilize such control gestures.

For example, depending on the application that is running on the media device and/or electronic device, the same control gesture may be used to initiate a different function or operation. A control gesture, depending on the context of the application running, may perform an operation such as stepping through a list of elements associated with an application. An operation may include, without limitation, starting or launching one or more applications, stopping or ending one or more applications, selecting or de-selecting one or more elements, increasing or decreasing one or more settings, moving through a list of elements, initiating or ending a communications session, playing music or videos, pausing music or videos, and initiating or ending an audio or video recording session. An element may include at least one of a song, a video, a music file, an audio file, a video file, a photograph, a media file, an application icon, an activation icon, a control button, a data file, and contact data.

A gesture may include one or more “touch events.” In certain embodiments, a “touch event” is broader than just a touching of the input device and/or control interface. A touch event can be one of several touch events, including: a “touch begin” event (e.g., initial touch is detected), a “touch move” event (e.g., after initial touch is detected, the coordinates of the touch change), or a “touch end” event (e.g., after initial touch is detected, the touch is no longer detected). There may be other touch events as well (e.g., touch cancel). A gesture can be based on a series of touch events (e.g., touch down+multiple touch moved+touch up events), or a gesture may be interpreted as its own “gesture event” that includes scale and/or rotation information. In one embodiment, the foregoing touch events are applied to a touch surface. In other embodiments, the touch events correspond to button press events, which may include, without limitation, a press down event, a press down duration event, and a press up event.
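
As an illustration of how a gesture might be built from a series of button press events, the sketch below classifies a completed sequence of press-down/press-up timestamps into a click, double click, press-and-hold, or double-click-press-and-hold; the timing thresholds are assumptions and are not taken from the text:

```python
HOLD_THRESHOLD = 0.5     # seconds a press must last to count as press-and-hold (assumed)
DOUBLE_CLICK_GAP = 0.3   # maximum seconds between presses of a double click (assumed)

def classify_gesture(events):
    """Classify a completed sequence of ('down', t) / ('up', t) button events,
    with t in seconds, into a simple gesture name."""
    presses = []          # list of (down_time, up_time) pairs
    pending_down = None
    for kind, t in events:
        if kind == "down":
            pending_down = t
        elif kind == "up" and pending_down is not None:
            presses.append((pending_down, t))
            pending_down = None

    if not presses:
        return None
    durations = [up - down for down, up in presses]

    if len(presses) == 1:
        return "press_and_hold" if durations[0] >= HOLD_THRESHOLD else "click"
    if len(presses) == 2 and presses[1][0] - presses[0][1] <= DOUBLE_CLICK_GAP:
        return ("double_click_press_and_hold"
                if durations[1] >= HOLD_THRESHOLD else "double_click")
    return "unknown"

# Example: classify_gesture([("down", 0.0), ("up", 0.1),
#                            ("down", 0.25), ("up", 0.35)]) -> "double_click"
```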

By including control interfaces 196, 197, and 198 in the audio communications device 190, a media device user can advantageously control certain functions of their media device via a relatively small form-factor interface without the need to handle or interface directly with the media device itself. Thus, in certain circumstances, a user can secure their media device in a pocket, a purse, or to a belt, while retaining control of the media device via the PCU 194.

In certain embodiments, the PCU 194 includes a single-handed user interface for enabling a user to hold the PCU 194 and input control gestures. The PCU 194 may also include a communications interface for sending control information to the media device in response to the inputted control gestures where the control information is used to control an operation of the media device. The term “hold” should be understood to include grasp, support, and/or position. The term “hold” is not limited to independent support of the PCU 194. For example, a PCU 194 may be supported by some other mechanism such as a wire when the PCU 194 is tethered to a headset. Thus, a user may hold the PCU 194 while some other mechanism also holds the PCU 194. Alternatively, a user may independently hold the PCU 194. A single-handed interface includes an interface where a user can interact with a device using one hand.

The single-handed user interface may include a plurality of control interfaces. A control interface may include a button, click wheel, touch pad, switch, and/or presence sensor. A presence sensor may include a magnetic, light, capacitive, touch, or like sensor. A user may input the control gestures by actuating the plurality of control interfaces, or a portion thereof, either concurrently or sequentially, using at least one finger of the user's hand. The PCU 194 may be tethered to a media device and include a wired communications interface. The PCU 194 may be un-tethered from a media device and include a wireless communications interface to facilitate communications with the media device. The PCU 194 may include a form factor that enables a user to hold the peripheral control unit between at least one finger and thumb of the user's hand.

The PCU 194 may be a stand-alone unit or may be integrated with another device or structure. For example, the PCU 194 may be included with a media headset such as the audio communications device 190. The PCU 194 may be integrated with eyeglasses, a stereo, a radio receiver, clothing, a vehicle, a helmet, a watch, a wearable electronic device, and the like. The PCU 194 may be removably attachable to another device.

FIG. 7 includes a database 700 and/or list associating control signals derived from user control gestures, generated via the control interfaces 196, 197, and 198 of the PCU 194, with media device control instructions according to an illustrative embodiment of the invention. The database 700 may include multiple lists 702, 704, and 706 of control gestures and associated control instructions 708 where each list is associated with a particular application of a media device such as media device 103 or 500. For example, list 702 may be associated with a media playback application while lists 704 and 706 are associated with a radio application and a voice memo application, respectively.
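
A rough sketch of such a two-level structure is shown below; the signal codes, application names, and instruction names are hypothetical and only illustrate how per-application lists like 702, 704, and 706 could be organized and queried:

```python
# Hypothetical structure mirroring FIG. 7: each application has its own list of
# known control signals (signal codes here) and associated control instructions.
DATABASE_700 = {
    "media_playback": {0x01: "play_pause", 0x02: "volume_up", 0x03: "volume_down"},   # list 702
    "radio":          {0x01: "mute_unmute", 0x02: "volume_up", 0x03: "volume_down"},  # list 704
    "voice_memo":     {0x01: "pause_resume_recording"},                               # list 706
}

def lookup_instruction(application, signal_code):
    """Return the control instruction for a signal code under the active application."""
    return DATABASE_700.get(application, {}).get(signal_code)

# Example: lookup_instruction("radio", 0x01) -> "mute_unmute"
```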

FIG. 8 is a flow diagram of a process 800 for controlling a media device via a peripheral control unit according to an illustrative embodiment of the invention. First, a communications channel 105 is established between a media device 103 and a peripheral control unit 194 of an audio communications device 190 (Step 802). Then, the PCU 194 receives media information from the media device 103 via the communications channel 105 (Step 804). Media information may include, without limitation, signal information for playing audio via speakers 191 and 192. Media information may include, without limitation, signal information associated with audio files, songs, video, multimedia, and the like. The PCU 194 provides the user with a control interface including control interfaces 196, 197, and/or 198 so that the control interface can detect one or more control gestures made by the user (Step 806). The PCU 194 may send at least one control signal based on a control gesture by a user to the media device 103 via the communications channel 105 (Step 808). Once the control signal is received, the media device 103 may control an operation and/or function based on the control signal (Step 810).
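
In simplified form, the flow of process 800 can be sketched as below, with the communications channel reduced to an in-process call and the media-information delivery of Step 804 omitted; the class names, gesture names, and signal codes are hypothetical:

```python
class MediaDevice:
    """Stand-in for media device 103: receives a control signal over the channel
    and carries out the associated operation (Steps 808-810)."""
    KNOWN_SIGNALS = {0x01: "play_pause", 0x02: "volume_up", 0x03: "volume_down"}

    def on_control_signal(self, signal_code):
        instruction = self.KNOWN_SIGNALS.get(signal_code)
        if instruction is not None:
            print(f"media device: executing {instruction}")

class PeripheralControlUnit:
    """Stand-in for PCU 194: detects a user gesture and sends the corresponding
    control signal over the (here, in-process) channel (Steps 806-808)."""
    GESTURE_TO_SIGNAL = {"click_center": 0x01, "click_plus": 0x02, "click_minus": 0x03}

    def __init__(self, channel):
        self.channel = channel     # Step 802: channel established up front

    def on_gesture(self, gesture):
        signal_code = self.GESTURE_TO_SIGNAL.get(gesture)
        if signal_code is not None:
            self.channel.on_control_signal(signal_code)

pcu = PeripheralControlUnit(channel=MediaDevice())
pcu.on_gesture("click_center")   # prints: media device: executing play_pause
```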

In one embodiment, a headset for a media device includes at least one speaker that provides an audio output, a microphone that receives an audio input, and a peripheral control unit that controls certain operations of the media device. The peripheral control unit includes a data interface that sends a control signal to the media device via a wired communications channel and receives media information from the media device. The control signal may be used to control an operation of the media device. The peripheral control unit may include a plurality of control interfaces that generate the control signal in response to a user control gesture. A user control gesture may be based on a sequence of user interactions with the control interfaces.

In one configuration, the data interface sends audio information generated by the microphone to the media device. The peripheral control unit may include a second data interface that sends a portion of the media information to at least one speaker. A control interface may include a button, click wheel, touch screen, a section of a touch screen, and/or a switch.

In one feature, the peripheral control unit uses three control interfaces in the form of buttons. A user may perform a sequence of user interactions that include a click, double click, triple click, press and hold, click and press and hold, double click press and hold, and/or triple click press and hold. An operation of the media device may include: media play, media pause, volume increase, volume decrease, volume ramp increase, volume ramp decrease, media tag, memo play, memo pause, skip to next song, radio playback, radio mute, radio skip to next preset, radio wrap around, go to next chapter, play select song, and/or activate voice feedback.

In certain configurations, the headset includes a pair of speakers tethered to the peripheral control unit. The peripheral control unit may have a form factor that supports single-handed interactions with the control interfaces. The control interfaces may be located on a first surface adjacent to a user's fingers. A second surface may be adjacent to the user's thumb to enable the user to hold the peripheral control unit. The first surface of the peripheral control unit may be substantially rectangular in shape. The first surface may have a length of less than about 3 inches, 2 inches, or 1 inch.

The peripheral control unit may include a high frequency tone generator that transmits a control signal to the media device in the form of one or more high frequency tones along the communications channel. A high frequency tone may include a tone above the threshold of human hearing, but less than about 1 MHz.
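
A rough sketch of such a tone generator, using NumPy, is shown below; the specific frequencies, burst length, and sample rate are assumptions chosen to sit above the audible band and well below the stated 1 MHz bound, since the text does not specify them:

```python
import numpy as np

SAMPLE_RATE = 96_000    # Hz; assumed, high enough to represent near-ultrasonic tones
BURST_DURATION = 0.05   # seconds per tone burst (assumed)

# Hypothetical tone frequencies per button: above the nominal threshold of
# human hearing (~20 kHz) and far below 1 MHz.
BUTTON_TONES = {"volume_up": 21_000, "center": 22_000, "volume_down": 23_000}

def tone_burst(button):
    """Return one burst of samples encoding a press of the given button."""
    frequency = BUTTON_TONES[button]
    t = np.arange(int(SAMPLE_RATE * BURST_DURATION)) / SAMPLE_RATE
    return np.sin(2 * np.pi * frequency * t)

# Example: samples = tone_burst("center")  # 4800 samples of a 22 kHz sine burst
```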

In another embodiment, a media device includes an interface for receiving a control signal from a peripheral control device via a wired communications channel. The control signal may be derived from a user control gesture. The media device also includes a data store for storing a list of known control signals where each known control signal has an associated control instruction. The media device uses a processor to identify the received control signal by comparing the received control signal with the list of known control signals. The processor controls an operation of the media device based on the control instruction associated with the identified control signal.

In one configuration, the data store includes a database and/or electronic list. In one feature, the interface sends media information to the peripheral control device. Media information may include music, a song, video, multimedia, and the like. The interface may send media information concurrently with receiving the control signal. The interface may receive the control signal in a first frequency range and send media information in a second frequency range.

The processor may be configured to operate the media device using a plurality of applications. The data store may include a plurality of lists of known control signals and associated control instructions. Each list of known control signals and associated control instructions may be associated with one of the plurality of applications. An application may perform, without limitation, media playback, radio playback, voice memo recording, voice memo playback, voice feedback, and user exercise support. A control instruction may include, without limitation, media play, media pause, volume increase, volume decrease, volume ramp increase, volume ramp decrease, tag media, memo play, memo pause, skip to next song, radio playback, radio mute, radio skip to next preset, radio wrap around, go to next chapter, play select song, activate voice feedback, activate feature, mute, un-mute, and/or go to next tag.

In one configuration, a user control gesture is derived from a sequence of user interactions with one or more control interfaces. A control interface may include, without limitation, a button, click wheel, touch screen, a section of a touch screen, and/or a switch. The media device may include, without limitation, a cellular telephone, media player, audio player, music player, video player, multimedia player, and/or personal computer.

In another embodiment, a peripheral control unit for a media device includes a single-handed user interface that enables a user to concurrently hold the peripheral control unit and input control gestures. The peripheral control unit also includes a communications interface that sends control information to the media device in response to the inputted control gestures. The control information may be used to control an operation of the media device.

In one configuration, the single-handed user interface includes a plurality of control interfaces. A control interface may include, without limitation, a button, click wheel, touch pad, portion of touch pad, switch, and presence sensor. In one feature, inputting the control gestures includes actuating the plurality of control interfaces using at least one finger of a hand of the user of the peripheral control unit. The communications interface may include either a wired or wireless communications interface. The peripheral control unit may have a form factor that enables the user to hold the peripheral control unit between at least one finger and thumb of the user's hand.

From the foregoing description, persons skilled in the art will recognize that this invention provides the simultaneous recognition of controls associated with a communications device. In addition, persons skilled in the art will appreciate that the various configurations described herein may be combined without departing from the present invention. It will also be recognized that the invention may take many forms other than those disclosed in this specification. Accordingly, it is emphasized that the invention is not limited to the disclosed methods, systems and apparatuses, but is intended to include variations to and modifications thereof which are within the spirit of the following claims.

Claims

1. An electronic device comprising:

an interface for receiving a control signal from a peripheral control device via a wired communications channel, the control signal being derived from a user control gesture;
a data store for storing a list of known control signals, each known control signal having an associated control instruction; and
a processor for i) identifying the received control signal by comparing the received control signal with the list of known control signals and ii) controlling an operation of the electronic device based on the control instruction associated with the identified control signal.

2. The device of claim 1, wherein the data store includes a database.

3. The device of claim 1, wherein the interface sends media information to the peripheral control device.

4. The device of claim 3, wherein the interface sends media information concurrently with receiving the control signal.

5. The device of claim 4, wherein the interface receives the control signal in a first frequency range and sends media information in a second frequency range.

6. The device of claim 1, wherein the processor is configured to operate the electronic device using a plurality of applications.

7. The device of claim 6, wherein the data store includes a plurality of lists of known control signals and associated control instructions, each list of known control signals and associated control instructions being associated with one of the plurality of applications.

8. The device of claim 6, wherein an application performs at least one of media playback, radio playback, voice memo recording, voice memo playback, voice feedback, and user exercise support.

9. The device of claim 1, wherein a user control gesture is derived from a sequence of user interactions with one or more control interfaces.

10. The device of claim 9, wherein a control interface includes at least one of a button, click wheel, touch screen, a section of a touch screen, and a switch.

11. The device of claim 1, wherein the electronic device includes at least one of a cellular telephone, media player, audio player, music player, video player, multimedia player, and personal computer.

12. The device of claim 1, wherein the control instruction includes at least one of media play, media pause, volume increase, volume decrease, volume ramp increase, volume ramp decrease, tag media, memo play, memo pause, skip to next song, radio playback, radio mute, radio skip to next preset, radio wrap around, go to next chapter, play select song, activate voice feedback, activate feature, mute, un-mute, and go to next tag.

13. The device of claim 1, wherein the interface for receiving the control signal includes a high frequency tone receiver for receiving the control signal from the peripheral control unit in the form of one or more high frequency tones.

14. A media system comprising:

an electronic device including: an interface for receiving a control signal from a peripheral control device via a wired communications channel and sending media information to the peripheral control device; a data store for storing a list of known control signals, each known control signal having an associated control instruction; a processor for i) identifying the received control signal by comparing the received control signal with the list of known control signals and ii) controlling an operation of the electronic device based on the control instruction associated with the identified control signal;
wherein the peripheral control device includes: a data interface for sending the control signal to the electronic device via the communications channel and receiving the media information from the electronic device; and a plurality of control interfaces for generating the control signal in response to a user control gesture, the user control gesture being based on a sequence of user interactions with the plurality of control interfaces.

15. The device of claim 14, wherein a control interface includes at least one of a button, click wheel, touch screen, a section of a touch screen, and a switch.

16. The device of claim 15, wherein a sequence of user interactions includes at least one of a click, double click, triple click, press and hold, click and press and hold, double click press and hold, and triple click press and hold.

17. The device of claim 14, wherein the control instruction includes at least one of media play, media pause, volume increase, volume decrease, volume ramp increase, volume ramp decrease, tag media, memo play, memo pause, skip to next song, radio playback, radio mute, radio skip to next preset, radio wrap around, go to next chapter, play select song, activate voice feedback, activate feature, mute, un-mute, and go to next tag.

18. A method for controlling an electronic device comprising:

establishing a wired communications channel between the electronic device and a peripheral control unit,
providing a single-handed user interface for i) holding the peripheral control unit and ii) inputting control gestures,
inputting control gestures via the single-handed user interface,
sending control information via the communications channel from the peripheral control unit to the electronic device in response to the user gestures,
receiving media information via the communications channel from the electronic device, and
sending media information to the peripheral control unit via the communications channel, and
controlling an operation of the electronic device in response to the received control information.

19. The method of claim 18, wherein the single-handed user interface includes a plurality of control interfaces.

20. The method of claim 19, wherein a control interface includes at least one of a button, click wheel, touch pad, switch, and presence sensor.

21. The method of claim 20, wherein inputting the control gestures includes actuating the plurality of control interfaces using at least one finger of a hand of the user.

22. The method of claim 18, wherein the wired communications channel includes a data cable.

23. The method of claim 18, wherein the sending of control information includes transmitting a control signal to the electronic device in the form of one or more high frequency tones along the wired communications channel.

24. The method of claim 18 comprising providing a form factor for enabling the user to hold the peripheral control unit between at least one finger and thumb of the user's hand.

25. An electronic device comprising:

a data store for storing a first list of known control signals related to a first function of the electronic device, each known control signal of the first list having an associated control instruction based on the first function of the electronic device; and
an interface for receiving a first control signal via an audio jack, the first control signal being derived from a user control gesture applied to a peripheral control unit in communication with the audio jack;
a processor for i) identifying the received first control signal by comparing the received first control signal with the first list of known control signals related to the first function, and ii) controlling an operation of the electronic device based on the control instruction associated with the identified first control signal.

26. The device of claim 25, wherein the function includes an application running on the electronic device.

27. The device of claim 25, wherein the function includes a subroutine of an application running on the electronic device.

28. The device of claim 25, wherein the data store stores a second list of known control signals related to a second function of the electronic device, each known control signal of the second list having an associated control instruction based on the second function of the electronic device.

29. The device of claim 28, wherein the interface receives a second control signal via the audio jack and wherein the processor i) identifies the received second control signal by comparing the received second control signal with the second list of known control signals related to the second function, and ii) controls an operation of the electronic device based on the control instruction associated with the identified second control signal.

30. The device of claim 25, wherein the processor is configured to operate the electronic device using a plurality of functions.

31. The device of claim 30, wherein the data store includes a plurality of lists of known control signals and associated control instructions, each list of known control signals and associated control instructions being associated with one of the plurality of functions.

32. The device of claim 1, wherein the electronic device includes at least one of a cellular telephone, media player, audio player, music player, video player, multimedia player, and personal computer.

Patent History
Publication number: 20090179789
Type: Application
Filed: Sep 3, 2008
Publication Date: Jul 16, 2009
Applicant: Apple Inc. (Cupertino, CA)
Inventors: Allen P. Haughay, JR. (San Jose, CA), Benjamin Andrew Rottler (Burlingame, CA)
Application Number: 12/231,582
Classifications
Current U.S. Class: Transmitter For Remote Control Signal (341/176)
International Classification: H04L 17/02 (20060101);