US20150066516A1 - Appliance control method, speech-based appliance control system, and cooking appliance - Google Patents
- Publication number: US20150066516A1 (application US14/473,263)
- Authority: United States
- Prior art keywords: cooking, appliance, information, unit, program
- Legal status: Granted
Classifications
- F24C7/08 — Domestic stoves or ranges heated by electric energy; arrangement or mounting of control or safety devices
- G10L17/22 — Speaker identification or verification techniques; interactive procedures; man-machine interfaces
- G10L15/22 — Speech recognition; procedures used during a speech recognition process, e.g., man-machine dialogue
Definitions
- the present disclosure relates to an appliance control method, a speech-based appliance control system, and a cooking appliance.
- Patent Literature 1 discloses a speech-based control system for a plurality of appliances that locates the direction from which a user's utterance originates in order to improve the rate at which the target appliance to be controlled is recognized.
- the techniques disclosed here feature a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system.
- the speech-based appliance control system includes the cooking appliance, which includes a first cooking unit and a second cooking unit, and an audio input device configured to receive input of the user's speech.
- in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance while the first and second cooking units are executing the first and second cooking programs, respectively, the operation instructions are recognized from the first audio information.
- in a case where it is determined that the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information, a control command is transmitted to the cooking appliance to cause the cooking appliance to execute a process corresponding to the operation instructions, instead of executing a process according to the one of the first cooking program and the second cooking program that corresponds to the cooking menu information to which the second audio information is related.
- FIG. 1A is a diagram illustrating an overview of a service provided by a speech-based appliance control system according to an embodiment
- FIG. 1B is a diagram illustrating an example of a data center management company that is an appliance manufacturer
- FIG. 1C is a diagram illustrating an example of a data center management company that is one or both of an appliance manufacturer and a management company;
- FIG. 2 is a configuration diagram of the speech-based appliance control system according to a first embodiment
- FIG. 3 is a diagram illustrating the hardware configuration of an audio input and output device according to the first embodiment and the second embodiment
- FIG. 4 is a diagram illustrating the hardware configuration of a cooking appliance according to the first embodiment and the second embodiment
- FIG. 5 is a diagram illustrating the hardware configuration of a display terminal according to the first embodiment and the second embodiment
- FIG. 6 is a diagram illustrating the hardware configuration of a gateway according to the first embodiment
- FIG. 7 is a diagram illustrating the hardware configuration of a cloud server according to the first embodiment
- FIG. 8 is a diagram illustrating the system configuration of the audio input and output device according to the first embodiment and the second embodiment
- FIG. 9 is a diagram illustrating the system configuration of the cooking appliance according to the first embodiment and the second embodiment.
- FIG. 10 is a diagram illustrating the system configuration of the gateway according to the first embodiment
- FIG. 11 is a diagram illustrating the system configuration of the cloud server according to the first embodiment
- FIG. 12 is a flowchart illustrating an example of the operation of an utterance interpretation unit
- FIG. 13A is a diagram illustrating an example of an utterance interpretation dictionary DB
- FIG. 13B is a diagram illustrating the example of the utterance interpretation dictionary DB
- FIG. 14 is a diagram illustrating an example of context data extracted by the utterance interpretation unit
- FIG. 15A is a flowchart illustrating an example of the operation of a state management unit
- FIG. 15B is a flowchart illustrating the example of the operation of the state management unit
- FIG. 16 is a flowchart illustrating an example of the operation of a response generation unit
- FIG. 17 is a diagram illustrating a specific example of an appliance state management DB according to the first embodiment
- FIG. 18 is a diagram illustrating a specific example of an appliance function DB according to the first embodiment
- FIG. 19A is a diagram illustrating an example of a menu list included in a cooking program DB
- FIG. 19B is a diagram illustrating an example of a cooking step list included in the cooking program DB.
- FIG. 20A is a diagram illustrating an example of an error message list included in the cooking program DB
- FIG. 20B is a diagram illustrating an example of a display screen of the display terminal.
- FIG. 21 is a sequence diagram illustrating the operation of a speech-based appliance control system according to the first embodiment
- FIG. 22 is a sequence diagram illustrating the operation of the speech-based appliance control system according to the first embodiment
- FIG. 23 is a sequence diagram illustrating the operation of the speech-based appliance control system according to the first embodiment
- FIG. 24A is a diagram illustrating an example of a menu selection screen displayed on the display terminal
- FIG. 24B is a diagram illustrating an example of the menu selection screen displayed on the display terminal.
- FIG. 24C is a diagram illustrating an example of the menu selection screen displayed on the display terminal.
- FIG. 25 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the first embodiment
- FIG. 26 is a flowchart illustrating a cooking program management process executed in S 2201 of FIG. 22 ;
- FIG. 27 is a diagram illustrating the configuration of a speech-based appliance control system according to a second embodiment
- FIG. 28A is a block diagram illustrating the hardware configuration of an integrated management device
- FIG. 28B is a block diagram illustrating the system configuration of the integrated management device
- FIG. 29 is a diagram illustrating a specific example of an appliance state management DB according to the second embodiment.
- FIG. 30A is a sequence diagram illustrating the operation of the speech-based appliance control system according to the second embodiment
- FIG. 30B is a sequence diagram illustrating the operation of the speech-based appliance control system according to the second embodiment
- FIG. 31 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the second embodiment
- FIG. 32 is a block diagram illustrating the hardware configuration of a cooking appliance according to another embodiment
- FIG. 33 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 1 (local-data-center-based cloud service);
- FIG. 34 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 2 (IaaS-based cloud service);
- FIG. 35 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 3 (PaaS-based cloud service).
- FIG. 36 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 4 (SaaS-based cloud service).
- the inventors have found that the technique described in Patent Literature 1 has the following problem.
- Patent Literature 1 discloses a speech-based control system including appliances to be controlled, and microphones respectively placed near the appliances for detecting user speech, and provides the following technique: Audio data detected by the microphones is collected by an audio collecting means. The content of the audio data input to the audio collecting means is analyzed by a speech recognition means. The direction from which an utterance made by the user originates is located by a distribution analysis means using the amplitude of the audio data input to the audio collecting means. An appliance to be controlled and the content of the operation to be performed on the appliance are determined by an inference means on the basis of the content of the audio data analyzed by the speech recognition means and the direction located by the distribution analysis means. A control signal is issued to the appliance to be controlled on the basis of the appliance and the content of the operation which are determined by the inference means.
- a user naturally gives instructions to a target appliance by speech with their face directed toward that appliance. Exploiting this natural behavior, the configuration described above can determine which of a plurality of appliances the user has given instructions to.
- a first aspect of the present disclosure provides a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system including the cooking appliance and an audio input device configured to receive input of the user's speech, the cooking appliance including a first cooking unit and a second cooking unit.
- the method includes transmitting, to the cooking appliance via a first network, first cooking program information indicating a first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program, the second cooking program corresponding to a second cooking recipe; in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognizing the operation instructions from the first audio information; determining, using a database configured to manage first cooking menu information indicating the name of a cooking menu item corresponding to the first cooking recipe and second cooking menu information indicating the name of a cooking menu item corresponding to the second cooking recipe, whether or not the received instruction information includes the first audio information and second audio information related to the first cooking menu information or the second cooking menu information; and in a case where it is determined that the received instruction information includes the second audio information, transmitting to the cooking appliance via the first network a control command for causing the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process in accordance with the cooking program corresponding to the cooking menu information to which the second audio information is related.
- first cooking program information indicating a first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program corresponding to a second cooking recipe are transmitted to a cooking appliance via a first network.
- in a case of receiving instruction information including first audio information indicating operation instructions for the cooking appliance from an audio input device while a first cooking unit in the cooking appliance is executing a process that is based on a first cooking program and a second cooking unit in the cooking appliance is executing a process that is based on a second cooking program, the operation instructions are recognized from the first audio information.
- a control command is transmitted to the cooking appliance via the first network to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cooking menu information to which the second audio information is related.
- the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program.
- the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program.
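As an illustration of this routing step, the following is a minimal Python sketch; all names (CookingState, route_command, the unit and program identifiers) are hypothetical and not taken from the patent. It maps a recognized menu-name mention (the second audio information) to the cooking unit executing the corresponding cooking program, and falls back to all running units when no menu name is recognized, as described further below.

```python
from dataclasses import dataclass

@dataclass
class CookingState:
    unit_id: str      # e.g., "left_burner" -- hypothetical identifier
    program_id: str   # cooking program currently being executed
    menu_name: str    # cooking menu name managed in the database

def route_command(states, recognized_text, operation):
    """Decide which cooking units should execute `operation`.

    If the utterance mentions a cooking menu name (the second audio
    information), only the unit whose cooking program corresponds to
    that menu is targeted; otherwise all running units are targeted.
    """
    targets = [s.unit_id for s in states if s.menu_name in recognized_text]
    if not targets:  # no menu name recognized in the utterance
        targets = [s.unit_id for s in states]
    return {"operation": operation, "target_units": targets}

states = [
    CookingState("left_burner", "prog-001", "beef stew"),
    CookingState("right_burner", "prog-002", "boiled vegetables"),
]
# "stop the beef stew" -> only the left burner is interrupted
print(route_command(states, "stop the beef stew", "interrupt"))
# "stop" -> both burners are interrupted (target is ambiguous)
print(route_command(states, "stop", "interrupt"))
```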
- an error message indicating that the process corresponding to the operation instructions is not executable at the cooking appliance may be provided to the user of the cooking appliance.
- an error message indicating that the process corresponding to the operation instructions is not executable is provided to a user. This allows the user to once again make instructions by using speech.
- the speech-based appliance control system may be configured to be further connected to a display device, and the error message may be displayed on the display device.
- an error message is displayed on a display of a display device. This allows the user to visually check that the process corresponding to the operation instructions is not executable.
- the speech-based appliance control system may be configured to be further connected to an audio output device configured to output audio, and the error message may be provided to a user of the cooking appliance using the audio output device.
- an error message is provided to a user using an audio output device. This allows the user to auditorily check that the process corresponding to the operation instructions is not executable.
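For illustration only, an error response of this kind might be represented as follows; the field names are assumptions of this sketch, not a format defined by the patent.

```python
# Hypothetical error response: the text can be rendered on the display
# device and/or synthesized as speech by the audio output device.
error_response = {
    "type": "error",
    "message": "The requested operation cannot be executed right now.",
    "outputs": ["display_device", "audio_output_device"],  # assumed names
}
```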
- the process corresponding to the operation instructions may be executed instead of processes that are based on all the programs including the first cooking program and the second cooking program that are being executed in the cooking appliance.
- the process corresponding to the operation instructions is executed instead of the processes that are based on all the programs, including the first cooking program and the second cooking program, that are being executed in the cooking appliance. Since it is determined that the instruction information does not include the second audio information, it cannot be determined which of the first cooking unit and the second cooking unit the operation instructions are directed to.
- both the first cooking unit and the second cooking unit execute the process corresponding to the operation instructions instead of executing the respective processes according to the first cooking program and the second cooking program. As a result, the operation instructions for the cooking appliance are executed in any case.
- the operation instructions may be used for interrupting the process that is based on the first cooking program or the second cooking program in the cooking appliance.
- the operation instructions are used for interrupting a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- the execution of a process in the first cooking unit or the second cooking unit in accordance with the first cooking program or the second cooking program is interrupted.
- the operation instructions may be used for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- the operation instructions are instructions for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- a process having a cooking parameter different from a process that is being executed in the first cooking unit or the second cooking unit in accordance with the first cooking program or the second cooking program is executed.
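A parameter-changing instruction of this kind (for example, "simmer the stew on low heat") might translate into a control command like the following sketch, where every field name is an assumption of this illustration rather than the patent's format.

```python
# Hypothetical control command that overrides one cooking parameter of
# the running cooking program on the targeted unit.
command = {
    "target_units": ["left_burner"],      # unit resolved from the utterance
    "operation": "set_parameter",
    "parameter": {"heat_level": "low"},   # differs from the program's value
}
```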
- the speech-based appliance control system may be configured to further include a display device, display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe may be transmitted to the display device via a second network, and first cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device may be received from the display device via the second network.
- display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe is transmitted to a display device via a second network.
- First cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device are received from the display device via the second network. This allows the user to recognize that the first cooking recipe and the second cooking recipe have been selected on the display device.
- the first cooking program information may be transmitted to the cooking appliance via the display device in response to receipt of the first cooking recipe selection information from the display device
- the second cooking program information may be transmitted to the cooking appliance via the display device in response to receipt of the second cooking recipe selection information from the display device.
- the first cooking program information and the second cooking program information are transmitted to the cooking appliance via the display device in response to receipt of the first cooking recipe selection information and the second cooking recipe selection information from the display device, respectively.
- the display device may be used for both the selection of the first cooking recipe and the second cooking recipe and the transmission of the first cooking program information and the second cooking program information to the cooking appliance.
- the database may include correspondence relationship information indicating a first correspondence relationship between the first cooking unit and the first cooking program and a second correspondence relationship between the second cooking unit and the second cooking program
- a cooking unit that is to execute the process corresponding to the operation instructions may be specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information
- the control command may include specific-cooking-unit information indicating the specified one of the first cooking unit and the second cooking unit.
- the database includes correspondence relationship information indicating a correspondence relationship between the first cooking unit and the first cooking program and a correspondence relationship between the second cooking unit and the second cooking program.
- a cooking unit that is to execute the process corresponding to the operation instructions is specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information.
- the control command includes specific-cooking-unit information indicating the specified cooking unit.
- the cooking appliance may be configured to manage the correspondence relationship information, a cooking unit that is to execute the process corresponding to the operation instructions may be specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information and the specific-cooking-unit information, and the specified cooking unit may be caused to execute the process corresponding to the operation instructions.
- the cooking appliance is configured to manage correspondence relationship information indicating a correspondence relationship between the first cooking unit and the first cooking program and a correspondence relationship between the second cooking unit and the second cooking program.
- a cooking unit that is to execute the process corresponding to the operation instructions is specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information and the specific-cooking-unit information.
- the specified cooking unit executes the process corresponding to the operation instructions.
- the operation instructions transmitted to the cooking appliance may be accurately executed in the first cooking unit or the second cooking unit.
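On the appliance side, the dispatch described above could look like the following sketch; the class, its methods, and the identifiers are hypothetical.

```python
class CookingApplianceController:
    """Hypothetical appliance-side handler for received control commands."""

    def __init__(self):
        # correspondence relationship information: cooking unit -> program
        self.unit_to_program = {
            "left_burner": "prog-001",
            "right_burner": "prog-002",
        }

    def handle_control_command(self, command):
        for unit_id in command["target_units"]:
            if unit_id not in self.unit_to_program:
                raise ValueError(f"unknown cooking unit: {unit_id}")
            # Execute the instructed process instead of continuing the
            # process of the cooking program assigned to this unit.
            self.execute(unit_id, command["operation"])

    def execute(self, unit_id, operation):
        print(f"{unit_id}: executing '{operation}' instead of "
              f"program {self.unit_to_program[unit_id]}")

controller = CookingApplianceController()
controller.handle_control_command(
    {"operation": "interrupt", "target_units": ["left_burner"]})
```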
- the display device may be included in the cooking appliance.
- the display device may be included in an appliance that is different from the cooking appliance.
- a second aspect of the present disclosure provides a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system, the speech-based appliance control system including the cooking appliance and an audio input device configured to receive input of the user's speech, the cooking appliance including a first cooking unit and a second cooking unit.
- the appliance control method includes transmitting, to the cooking appliance via a first network, first cooking program information indicating the first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating the second cooking program different from the first cooking program, the second cooking program corresponding to a second cooking recipe; in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognizing the operation instructions from the first audio information; determining, using a database configured to manage first cookware information indicating the name of a cookware item used in the first cooking recipe and second cookware information indicating the name of a cookware item used in the second cooking recipe, whether or not the instruction information includes second audio information related to the first cookware information or the second cookware information; and in a case where it is determined that the instruction information includes the first audio information and the second audio information, transmitting to the cooking appliance via the first network a control command for causing the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process in accordance with the cooking program corresponding to the cookware information to which the second audio information is related.
- first cooking program information indicating a first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program corresponding to a second cooking recipe are transmitted to a cooking appliance via a first network.
- in a case of receiving instruction information including first audio information indicating operation instructions for the cooking appliance from an audio input device while a first cooking unit in the cooking appliance is executing a process that is based on a first cooking program and a second cooking unit in the cooking appliance is executing a process that is based on a second cooking program, the operation instructions are recognized from the first audio information.
- a control command is transmitted to the cooking appliance via the first network to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cookware information to which the second audio information is related.
- the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program.
- the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program.
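The second aspect differs from the first only in the disambiguation key: cookware names used in each recipe instead of menu names. A minimal sketch under the same hypothetical naming assumptions:

```python
# Hypothetical mapping from cookware names (managed in the database)
# to the cooking units using them.
cookware_to_unit = {
    "pot": "left_burner",          # first cooking recipe uses a pot
    "frying pan": "right_burner",  # second cooking recipe uses a frying pan
}

def resolve_units(recognized_text):
    """Return units whose cookware is mentioned; all units otherwise."""
    hits = [unit for name, unit in cookware_to_unit.items()
            if name in recognized_text]
    return hits or list(cookware_to_unit.values())

print(resolve_units("turn down the pot"))  # ['left_burner']
print(resolve_units("turn it down"))       # both units (ambiguous)
```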
- an error message indicating that the process corresponding to the operation instructions is not executable at the cooking appliance may be provided to the user of the cooking appliance.
- the speech-based appliance control system may be configured to be further connected to a display device, and the error message may be displayed on the display device.
- the speech-based appliance control system may be configured to further include an audio output device configured to output audio, and the error message may be provided to the user of the cooking appliance using the audio output device.
- the process corresponding to the operation instructions may be executed instead of processes that are based on all the programs including the first cooking program and the second cooking program that are being executed in the cooking appliance.
- the operation instructions may be used for interrupting a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- the operation instructions may be used for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- the speech-based appliance control system may be configured to further include a display device, display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe may be transmitted to the display device via a second network, and first cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device may be received from the display device via the second network.
- the first cooking program information may be transmitted to the cooking appliance via the display device in a case of receiving the first cooking recipe selection information from the display device
- the second cooking program information may be transmitted to the cooking appliance via the display device in a case of receiving the second cooking recipe selection information from the display device.
- the database may include correspondence relationship information indicating a first correspondence relationship between the first cooking unit and the first cooking program and a second correspondence relationship between the second cooking unit and the second cooking program, the one of the first cooking unit and the second cooking unit that is to execute the process corresponding to the operation instructions may be specified on the basis of the correspondence relationship information, and the control command may include specific-cooking-unit information indicating the specified one of the first cooking unit and the second cooking unit.
- the process corresponding to the operation instructions is executed at the one of the first cooking unit or the second cooking unit specified on the basis of the specific-cooking-unit information.
- the display device may be included in the cooking appliance.
- the display device may be included in an appliance that is different from the cooking appliance.
- a third aspect of the present disclosure provides a speech-based appliance control system including a cooking appliance having a first cooking unit and a second cooking unit, an audio input device configured to receive input of a user's speech, and a server connectable to the cooking appliance and the audio input device.
- the cooking appliance is controlled using the user's speech.
- the cooking appliance includes a first communication unit configured to receive from the server first cooking program information indicating the first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating the second cooking program, the second cooking program corresponding to a second cooking recipe, and a second control unit configured to cause the first cooking unit to operate based on a first cooking program, and configured to cause the second cooking unit to operate based on the second cooking program.
- the audio input device includes an audio acquisition unit configured to acquire instruction information including first audio information indicating operation instructions for the cooking appliance, and a second communication unit configured to transmit the acquired instruction information to the server.
- the server includes a third communication unit configured to transmit the first cooking program information and the second cooking program information to the cooking appliance, a database configured to manage first cooking menu information indicating the name of a cooking menu item corresponding to the first cooking recipe and second cooking menu information indicating the name of a cooking menu item corresponding to the second cooking recipe, a determination unit configured to, in a case of receiving the instruction information from the audio input device when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognize the operation instructions from the first audio information included in the received instruction information and determine whether or not the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information, and a fourth communication unit configured to, in a case where it is determined that the instruction information includes the second audio information, transmit to the cooking appliance a control command for causing the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process in accordance with the cooking program corresponding to the cooking menu information to which the second audio information is related.
- the operation instructions are recognized from the first audio information included in the received instruction information. It is determined whether or not the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information.
- a control command is transmitted to the cooking appliance to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cooking menu information to which the second audio information is related.
- the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program.
- the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program.
- a fourth aspect of the present disclosure provides a cooking appliance used in the speech-based appliance control system according to the third aspect.
- FIG. 1A is a diagram illustrating an overview of a service provided by a speech-based appliance control system according to embodiments disclosed herein.
- the speech-based appliance control system includes a group 4100 , a data center management company 4110 , and a service provider 4120 .
- the group 4100 may indicate, for example, a company, an organization, a household, or the like, regardless of its size.
- the group 4100 includes a plurality of home electric appliances 101 including a first home electric appliance and a second home electric appliance, and a home gateway 4102 .
- the plurality of home electric appliances 101 includes an appliance that is capable of accessing the Internet (such as a smartphone, a personal computer (PC), or a television receiver).
- the plurality of home electric appliances 101 further includes an appliance that is incapable of accessing the Internet by itself (such as lighting, a washing machine, or a refrigerator).
- the plurality of home electric appliances 101 may include an appliance that is incapable of accessing the Internet by itself but is capable of accessing the Internet via the home gateway 4102 .
- Users 4200 use the plurality of home electric appliances 101 in the group 4100 .
- the data center management company 4110 includes a cloud server 4111 .
- the cloud server 4111 is a virtual server that builds a cooperative relationship with various appliances over the Internet.
- the cloud server 4111 mainly manages vast volumes of data (or “big data”) that are difficult to handle with traditional database management tools or the like.
- the data center management company 4110 engages in business activities such as operating a data center that manages data and that manages the cloud server 4111 . The details of the activities that the data center management company 4110 undertakes are described below.
- the data center management company 4110 is not limited to a company engaging in business activities such as operating a data center that manages data and that manages the cloud server 4111 .
- FIGS. 1B and 1C are diagrams illustrating examples of the data center management company 4110 .
- as illustrated in FIG. 1B, in a case where an appliance manufacturer manages the data or manages the cloud server 4111, the appliance manufacturer corresponds to the data center management company 4110.
- the data center management company 4110 is not limited to a single company.
- as illustrated in FIG. 1C, in a case where an appliance manufacturer and a management company collaborate or share duties to manage the data or manage the cloud server 4111, one or both of them correspond to the data center management company 4110.
- the service provider 4120 includes a server 121 .
- the server 121 includes, for example, a memory in a personal-use PC, and may be of any scale.
- the service provider 4120 may not include the server 121 .
- the service provider 4120 may include a different device configured to perform the functions of the server 121 .
- the home gateway 4102 may not necessarily be used.
- the home gateway 4102 is a device that allows the home electric appliances 101 to access the Internet. Accordingly, in a case where the home electric appliances 101 do not include any appliance that is incapable of accessing the Internet by itself, for example, in a case where all the home electric appliances 101 in the group 4100 are connected to the Internet, the home gateway 4102 is not used.
- the first home electric appliance or the second home electric appliance in the group 4100 transmits log information to the cloud server 4111 in the data center management company 4110 .
- the cloud server 4111 collects the log information on the first home electric appliance or the second home electric appliance (arrow 131 in FIG. 1A ).
- the log information may be information indicating, for example, the operating state or operation date and time of the plurality of home electric appliances 101 .
- the log information includes, for example, the viewing history of a TV viewer, scheduled recording information on a recorder, the date and time when a washing machine runs, the amount of laundry, the date and time when a refrigerator door opens and closes, and the number of times the refrigerator door opens and closes.
- the log information is not limited to the information described above, and may include a variety of pieces of information available from the home electric appliances 101 .
- the log information may be provided from the plurality of home electric appliances 101 directly to the cloud server 4111 via the Internet.
- the log information may also temporarily be collected in the home gateway 4102 from the plurality of home electric appliances 101 , and may be provided from the home gateway 4102 to the cloud server 4111 .
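As a purely illustrative example, a single piece of log information might look like the following record; every field name here is an assumption of this sketch, not a format defined in the patent.

```python
# Hypothetical log-information record collected by the cloud server 4111,
# either directly from the appliance or via the home gateway 4102.
log_record = {
    "appliance_id": "refrigerator-245",
    "event": "door_open_close",
    "timestamp": "2014-08-29T12:34:56",
    "daily_count": 17,  # e.g., number of door open/close events that day
}
```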
- the cloud server 4111 in the data center management company 4110 provides the collected log information to the service provider 4120 at a constant rate.
- the “constant rate” may be a unit in which the data center management company 4110 can organize the collected information and provide it to the service provider 4120, or may be a unit requested by the service provider 4120.
- the amount of information may not necessarily be constant, and, for example, the amount of information that is provided may vary depending on the situation.
- the log information is saved in the server 121 included in the service provider 4120 , if necessary (arrow 132 in FIG. 1A ).
- the service provider 4120 organizes the log information into information adapted to a service that is provided to users, and provides the information to the users.
- the users to whom the information is provided may be the users 4200 who use the plurality of home electric appliances 101 , or may be external users 4210 .
- the information may be provided from the service provider 4120 directly to the users 4200 or 4210 (arrow 133 or 134 in FIG. 1A ).
- the information may be provided to the users 4200 , passing back through the cloud server 4111 in the data center management company 4110 (arrows 135 and 136 in FIG. 1A ).
- the cloud server 4111 in the data center management company 4110 may organize the log information into information adapted to a service that is provided to users, and may provide the information to the service provider 4120 .
- the users 4200 may be identical to or different from the users 4210 .
- FIG. 2 is a diagram illustrating the configuration of a speech-based appliance control system according to a first embodiment. The configuration of the speech-based appliance control system according to the first embodiment will be described with reference to FIG. 2 .
- the speech-based appliance control system illustrated in FIG. 2 includes an audio input and output device 240 , a plurality of home electric appliances 101 , a display terminal 260 , a gateway 102 , an information communication network 220 , and a cloud server 111 .
- the home electric appliances 101 include an oven range 243 , an induction-heating (IH) cooker 244 , and a refrigerator 245 .
- the plurality of home electric appliances 101 may include any other desired appliance instead of or in addition to the oven range 243 , the IH cooker 244 , and the refrigerator 245 .
- the audio input and output device 240 (an example of an audio input device) includes an audio acquisition unit configured to acquire speech from a user 250 , and an audio output unit configured to output audio to the user 250 .
- a group 100 is a space within which the audio input and output device 240 can provide information (or a space over which audio interaction is feasible).
- the group 100 may be, for example, a house of the user 250 .
- the audio input and output device 240 recognizes speech of the user 250 .
- the audio input and output device 240 presents audio information and controls the plurality of home electric appliances 101 in accordance with instructions entered by the user 250 through speech. More specifically, the audio input and output device 240 reads content aloud, responds to questions asked by the user 250 , and controls the home electric appliances 101 in accordance with instructions entered by the user 250 through speech.
- the display terminal 260 (an example of a display device) has an input function that allows the user 250 to give appliance control instructions, and an information output function that provides information to the user 250 .
- the input function of the display terminal 260 may be implemented by a touch panel or a push button.
- the display terminal 260 may be a mobile phone, a smartphone, or a tablet device.
- the display terminal 260 , the audio input and output device 240 , and the plurality of home electric appliances 101 may be connected to the gateway 102 using wired or wireless connection. Additionally, the audio input and output device 240 and at least one of the plurality of home electric appliances 101 may be integrated into a single unit.
- FIG. 3 is a block diagram illustrating the hardware configuration of the audio input and output device 240 .
- the hardware configuration of the audio input and output device 240 will be described with reference to FIG. 3 .
- the audio input and output device 240 includes a processing circuit 300 , an audio collection circuit 301 , an audio output circuit 302 , and a communication circuit 303 . These circuits are connected to one another via a bus 330 , and are capable of exchanging data or instructions.
- the processing circuit 300 includes a central processing unit (CPU) 310 and a memory 320 .
- the processing circuit 300 may include dedicated hardware configured to implement the operations described below, instead of the CPU 310 and the memory 320 .
- the memory 320 stores an appliance ID 341 and a computer program 342 .
- the appliance ID 341 is an identifier uniquely assigned to the audio input and output device 240 .
- the appliance ID 341 may be independently assigned by a manufacturer, or may be a physical address (a so-called Media Access Control (MAC) address), which is basically unique on a network.
- the audio collection circuit 301 collects user speech and generates an analog audio signal.
- the audio collection circuit 301 converts the generated analog audio signal into digital data and then transmits the digital data to the bus 330 .
- the audio output circuit 302 converts the digital data received via the bus 330 into an analog audio signal.
- the audio output circuit 302 outputs the resulting analog audio signal.
- the communication circuit 303 is a circuit that communicates with other devices (e.g., the gateway 102 ) via a network.
- the communication circuit 303 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the communication circuit 303 transmits log information or ID information generated by the processing circuit 300 to the gateway 102 .
- the communication circuit 303 transmits a signal received from the gateway 102 to the processing circuit 300 via the bus 330 .
- the audio input and output device 240 may also include, in addition to the illustrated constituent elements, one or more constituent elements for implementing its required functions.
- FIG. 4 is a block diagram illustrating the hardware configuration of a cooking appliance 400 that is an example of the home electric appliances 101 .
- the hardware configuration of the cooking appliance 400 will be described with reference to FIG. 4 .
- the oven range 243 , the IH cooker 244 , and the refrigerator 245 are examples of the cooking appliance 400 .
- the cooking appliance 400 includes an input and output circuit 410 , a communication circuit 450 , and a processing circuit 470 . These circuits are connected to one another via a bus 460 , and are capable of exchanging data or instructions.
- the processing circuit 470 includes a CPU 430 and a memory 440 .
- the processing circuit 470 may include dedicated hardware configured to implement the operations described below, instead of the CPU 430 and the memory 440 .
- the memory 440 stores an appliance ID 441 , a computer program 442 , and a cooking program ID 443 .
- the appliance ID 441 is an identifier uniquely assigned to the cooking appliance 400 .
- the cooking program ID 443 is an identifier uniquely assigned to a cooking program.
- the appliance ID 441 and the cooking program ID 443 may be independently assigned by a manufacturer, or may be a physical address (a so-called Media Access Control (MAC) address), which is basically unique on a network.
- the input and output circuit 410 outputs a result of processing performed by the processing circuit 470 .
- the input and output circuit 410 converts an input analog signal into digital data, and transmits the digital data to the bus 460 .
- the input and output circuit 410 displays a result of processing performed by the processing circuit 470 .
- in a case where the input and output circuit 410 (an example of a display device) has a display function, the cooking appliance 400 may also serve the function of the display terminal 260 .
- the communication circuit 450 is a circuit that communicates with other devices (e.g., the gateway 102 ) via a network.
- the communication circuit 450 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the communication circuit 450 transmits log information or ID information generated by the processing circuit 470 to the gateway 102 .
- the communication circuit 450 transmits a signal received from the gateway 102 to the processing circuit 470 via the bus 460 .
- the cooking appliance 400 may also include, in addition to the illustrated constituent elements, one or more constituent elements for implementing its required functions.
- FIG. 5 is a block diagram illustrating the hardware configuration of the display terminal 260 .
- the display terminal 260 includes a display control circuit 500 , a display circuit 502 , a communication circuit 505 , and a processing circuit 510 . These circuits are connected to one another via a bus 525 , and are capable of exchanging data or instructions.
- the display circuit 502 includes a liquid crystal display and so on.
- the display circuit 502 displays an image such as an object image including icons or operation buttons, and a text image.
- the display control circuit 500 controls the operation of the display circuit 502 to display an image on the display circuit 502 .
- the communication circuit 505 is a circuit that communicates with other devices (e.g., the audio input and output device 240 , the cooking appliance 400 , etc.) via a network.
- the communication circuit 505 performs communication complying with, for example, the Ethernet (registered trademark) standards or near field communication standards.
- the communication circuit 505 transmits log information or ID information generated by the processing circuit 510 to the audio input and output device 240 or the cooking appliance 400 .
- the communication circuit 505 transmits a signal received from the audio input and output device 240 or the cooking appliance 400 to the processing circuit 510 via the bus 525 .
- the processing circuit 510 includes a CPU 515 and a memory 520 .
- the processing circuit 510 may include dedicated hardware configured to implement the operations described below, instead of the CPU 515 and the memory 520 .
- the memory 520 stores a display terminal ID 521 , a computer program 522 , and a cooking program ID 523 .
- the display terminal ID 521 is an identifier uniquely assigned to the display terminal 260 .
- the cooking program ID 523 is an identifier uniquely assigned to a cooking program.
- the display terminal 260 may also include, in addition to the illustrated constituent elements, one or more constituent elements for implementing its required functions.
- in the illustrated example, the display terminal ID 521 and the cooking program ID 523 are stored in the memory 520 in which the computer program 522 is stored; however, they need not share the same memory. For example, the computer program 522 may be stored in a random access memory (RAM) or a read-only memory (ROM), and the display terminal ID 521 and the cooking program ID 523 may be stored in a flash memory.
- FIG. 6 is a block diagram illustrating the hardware configuration of the gateway 102 .
- the gateway 102 includes a communication circuit 550 and a processing circuit 570 . These circuits are connected to each other via a bus 560 , and are capable of exchanging data or instructions.
- the communication circuit 550 is a circuit that communicates with other devices (e.g., the audio input and output device 240 , the cooking appliance 400 , etc.) via a network.
- the communication circuit 550 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the communication circuit 550 transmits log information or ID information generated by the processing circuit 570 to the audio input and output device 240 or the cooking appliance 400 .
- the communication circuit 550 transmits a signal received from the audio input and output device 240 or the cooking appliance 400 to the processing circuit 570 via the bus 560 .
- the processing circuit 570 includes a CPU 530 and a memory 540 .
- the processing circuit 570 may include dedicated hardware configured to implement the operations described below, instead of the CPU 530 and the memory 540 .
- the memory 540 stores a gateway ID 541 and a computer program 542 .
- the gateway ID 541 is an identifier uniquely assigned to the gateway 102 .
- the gateway 102 may also include, in addition to the illustrated constituent elements, one or more constituent elements for implementing its required functions.
- in the illustrated example, the gateway ID 541 is stored in the memory 540 in which the computer program 542 is stored; however, they need not share the same memory. For example, the computer program 542 may be stored in a RAM or a ROM, and the gateway ID 541 may be stored in a flash memory.
- FIG. 7 is a block diagram illustrating the hardware configuration of the cloud server 111 .
- the cloud server 111 includes a communication circuit 650 , a processing circuit 670 , a speech recognition/synthesis database (DB) 600 , an appliance state management DB 620 (an example of a database), an utterance interpretation dictionary DB 625 , an appliance function DB 630 , and a cooking program DB 640 .
- the processing circuit 670 includes a CPU 671 , and a memory 672 in which a computer program 673 is stored. These constituent elements are connected to one another via a bus 680 , and are capable of mutually exchanging data.
- the processing circuit 670 is connected to the speech recognition/synthesis DB 600 , the appliance state management DB 620 , the utterance interpretation dictionary DB 625 , the appliance function DB 630 , and the cooking program DB 640 via the bus 680 .
- the processing circuit 670 acquires or edits management information stored in the databases 600 , 620 , 625 , 630 , and 640 .
- the speech recognition/synthesis DB 600 , the appliance state management DB 620 , the utterance interpretation dictionary DB 625 , the appliance function DB 630 , and the cooking program DB 640 are elements included in the cloud server 111 .
- the speech recognition/synthesis DB 600 , the appliance state management DB 620 , the utterance interpretation dictionary DB 625 , the appliance function DB 630 , and the cooking program DB 640 may be provided outside the cloud server 111 , in which case the cloud server 111 may further include an Internet line in addition to the bus 680 .
- the communication circuit 650 is a circuit that communicates with other communication devices (e.g., the gateway 102 ) via a network.
- the communication circuit 650 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the CPU 671 controls the operation of the cloud server 111 .
- the CPU 671 executes a group of instructions written in the computer program 673 loaded into the memory 672 . Accordingly, the CPU 671 is capable of implementing a variety of functions.
- a group of instructions for allowing the cloud server 111 to implement the operations described below is written in the computer program 673 .
- the computer program 673 described above may be recorded on a recording medium such as a CD-ROM and distributed as a marketed product. Alternatively, the computer program 673 may be transmitted via an electric communication line such as the Internet.
- An appliance (e.g., a PC) including the hardware illustrated in FIG. 7 is capable of functioning as the cloud server 111 according to this embodiment by reading the computer program 673 .
- the CPU 671 and the memory 672 in which the computer program 673 is stored may be implemented as hardware such as a digital signal processor (DSP) in which a computer program is integrated in a single semiconductor circuit.
- the DSP is capable of implementing all the processing operations implementable by the CPU 671 that executes the computer program 673 described above on a single integrated circuit.
- This DSP may be used as the processing circuit 670 in place of the CPU 671 and the memory 672 illustrated in FIG. 7 .
- the speech recognition/synthesis DB 600 stores acoustic models and language models for speech recognition.
- the appliance state management DB 620 , the utterance interpretation dictionary DB 625 , the appliance function DB 630 , and the cooking program DB 640 will be described in detail below.
- FIG. 8 is a block diagram illustrating the system configuration of the audio input and output device 240 .
- the audio input and output device 240 includes an audio collection unit 1000 , an audio detection unit 1010 , an audio section segmentation unit 1020 , a communication unit 1030 , and an audio output unit 1040 .
- the audio collection unit 1000 corresponds to the audio collection circuit 301 .
- the audio collection unit 1000 collects user speech and generates an analog audio signal.
- the audio collection unit 1000 converts the generated analog audio signal into digital data, and generates an audio signal.
- the audio detection unit 1010 and the audio section segmentation unit 1020 are implemented by the processing circuit 300 .
- the CPU 310 that executes the computer program 342 functions as, for example, the audio detection unit 1010 at a certain point in time, and functions as the audio section segmentation unit 1020 at a different point in time.
- At least one of the audio detection unit 1010 and the audio section segmentation unit 1020 may be implemented by hardware configured to perform dedicated processing, such as a DSP.
- the audio detection unit 1010 determines whether or not audio has been detected. For example, if the level of the audio signal (e.g., the amplitude of the audio signal) generated by the audio collection unit 1000 is less than or equal to a predetermined value, the audio detection unit 1010 determines that no audio has been detected.
- the audio section segmentation unit 1020 extracts a section in which audio is present from the acquired audio signal.
- the audio collection unit 1000 , the audio detection unit 1010 , and the audio section segmentation unit 1020 constitute an example of an audio acquisition unit.
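- The following is a minimal sketch of the level-threshold detection and section segmentation described above; the frame length and the threshold are illustrative assumptions, since the text only specifies comparison against “a predetermined value”.

```python
# Minimal sketch of level-based audio detection (audio detection unit 1010)
# and section segmentation (audio section segmentation unit 1020).
# THRESHOLD and FRAME_SIZE are assumed values, not taken from the patent.

THRESHOLD = 500    # "predetermined value" for the signal level (assumed)
FRAME_SIZE = 160   # 10 ms frames at 16 kHz sampling (assumed)

def audio_detected(frame):
    """Audio is detected when the frame level exceeds the threshold."""
    return max(abs(s) for s in frame) > THRESHOLD

def extract_audio_section(samples):
    """Return the span from the first to the last frame containing audio."""
    frames = [samples[i:i + FRAME_SIZE] for i in range(0, len(samples), FRAME_SIZE)]
    active = [i for i, f in enumerate(frames) if f and audio_detected(f)]
    if not active:
        return []                            # no audio was detected
    first, last = active[0], active[-1] + 1
    return [s for f in frames[first:last] for s in f]
```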
- the communication unit 1030 (an example of a second communication unit) corresponds to the communication circuit 303 .
- the communication unit 1030 communicates with other communication devices (e.g., the gateway 102 ) via a network.
- the communication unit 1030 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the communication unit 1030 transmits an audio signal for the section extracted by the audio section segmentation unit 1020 . Further, the communication unit 1030 passes the received audio signal to the audio output unit 1040 .
- the audio output unit 1040 corresponds to the audio output circuit 302 .
- the audio output unit 1040 converts the audio signal received by the communication unit 1030 into an analog audio signal.
- the audio output unit 1040 outputs the resulting analog audio signal.
- FIG. 9 is a block diagram illustrating the system configuration of the cooking appliance 400 .
- the cooking appliance 400 includes a communication unit 900 , an appliance control unit 910 , a first cooking unit 911 , and a second cooking unit 912 .
- the communication unit 900 (an example of a first communication unit) corresponds to the communication circuit 450 .
- the communication unit 900 communicates with other communication devices (e.g., the gateway 102 ) via a network.
- the communication unit 900 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the appliance control unit 910 (an example of a second control unit) corresponds to the input and output circuit 410 and the processing circuit 470 .
- the processing circuit 470 corresponding to the appliance control unit 910 reads the control data received by the communication unit 900 .
- the processing circuit 470 corresponding to the appliance control unit 910 controls the input and output circuit 410 using the read control data.
- the appliance control unit 910 controls the operations of the first cooking unit 911 and the second cooking unit 912 in accordance with control commands received by the communication unit 900 .
- the first cooking unit 911 and the second cooking unit 912 are configured to be capable of simultaneously executing different cooking programs.
- the first cooking unit 911 corresponds to, for example, a “heater 1”
- the second cooking unit 912 corresponds to, for example, a “heater 2”.
- the first cooking unit 911 corresponds to, for example, a “top rack”
- the second cooking unit 912 corresponds to, for example, a “bottom rack”.
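- As an illustrative sketch (not the patent's implementation), the appliance control unit 910 can be modeled as a dispatcher that routes each received control command to the addressed cooking unit, which is what lets the two units execute different cooking programs simultaneously; the class and method names below are assumptions, and the appliance IDs follow the examples of FIG. 17.

```python
# Hypothetical model of the appliance control unit 910 routing control
# commands to independent cooking units.

class CookingUnit:
    def __init__(self, name):
        self.name = name

    def execute(self, command):
        print(f"{self.name}: executing '{command}'")

class ApplianceControlUnit:
    def __init__(self):
        self.units = {
            "M01-01": CookingUnit("heater 1"),   # first cooking unit
            "M01-02": CookingUnit("heater 2"),   # second cooking unit
        }

    def on_command(self, appliance_id, command):
        # Only the addressed unit is affected; the other keeps cooking.
        self.units[appliance_id].execute(command)

control = ApplianceControlUnit()
control.on_command("M01-01", "power_off")   # turns off heater 1 only
```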
- FIG. 10 is a block diagram illustrating the system configuration of the gateway 102 .
- the gateway 102 includes a communication unit 800 , a received data analysis unit 810 , and a transmission data generation unit 820 .
- the communication unit 800 corresponds to the communication circuit 550 .
- the communication unit 800 is a circuit that communicates with other devices (e.g., the audio input and output device 240 , the cooking appliance 400 , etc.) via a network.
- the communication unit 800 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the communication unit 800 passes received data to the received data analysis unit 810 .
- the communication unit 800 transmits data generated by the transmission data generation unit 820 .
- the received data analysis unit 810 corresponds to the processing circuit 570 .
- the received data analysis unit 810 analyzes the data received by the communication unit 800 to determine the type of the received data. As a result of the analysis of the received data in terms of type, the received data analysis unit 810 determines the next target appliance (e.g., the audio input and output device 240 or the cooking appliance 400 ), and the data to be transmitted to the target appliance.
- the transmission data generation unit 820 corresponds to the processing circuit 570 .
- the transmission data generation unit 820 generates transmission data based on the next target appliance and the data to be transmitted to the target appliance, which are determined by the received data analysis unit 810 .
- FIG. 11 is a block diagram illustrating the system configuration of the cloud server 111 .
- the cloud server 111 includes a communication unit 700 , a speech recognition unit 710 , an utterance interpretation unit 730 , a state management unit 740 , a response generation unit 750 , and a speech synthesis unit 760 .
- the communication unit 700 (an example of a third communication unit and a fourth communication unit) corresponds to the communication circuit 650 .
- the communication unit 700 is a circuit that communicates with other devices (e.g., the gateway 102 ) via a network.
- the communication unit 700 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- the speech recognition unit 710 is implemented by the processing circuit 670 and the speech recognition/synthesis DB 600 .
- the speech recognition unit 710 converts an audio signal into character string data. Specifically, the speech recognition unit 710 acquires information on pre-registered acoustic models from the speech recognition/synthesis DB 600 , and converts the audio signal into phonemic data using the acoustic models and the frequency characteristics of the audio signal.
- the speech recognition unit 710 also acquires information on pre-registered language models from the speech recognition/synthesis DB 600 , and generates specific character string data using the language models in accordance with the arrangement of the phonemes in the phonemic data.
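- A highly simplified sketch of this two-stage conversion is shown below; the model classes are hypothetical stand-ins for the acoustic and language models held in the speech recognition/synthesis DB 600 , not the patent's implementation.

```python
# Hypothetical two-stage recognition: frequency characteristics -> phonemic
# data (acoustic model), then phoneme arrangement -> character string data
# (language model). Both model classes are placeholders.

class AcousticModel:
    def to_phonemes(self, features):
        # A real model matches frequency characteristics to phonemes.
        return ["n", "i", "m", "o", "n", "o"]

class LanguageModel:
    def to_text(self, phonemes):
        # A real model picks the character string whose phoneme
        # arrangement best matches the input.
        return "nimono"  # i.e., "stew"

def recognize(frequency_features):
    phonemes = AcousticModel().to_phonemes(frequency_features)
    return LanguageModel().to_text(phonemes)

print(recognize([0.1, 0.4, 0.2]))  # -> "nimono"
```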
- the utterance interpretation unit 730 is implemented by the processing circuit 670 , the appliance function DB 630 , and the cooking program DB 640 .
- the utterance interpretation unit 730 extracts context data from the character string data.
- the context data may include, specifically, the target appliance name, the menu name (or food name), the cookware name, or the task content.
- the utterance interpretation unit 730 checks the character string data against the appliance function DB 630 and the cooking program DB 640 to extract context data.
- the state management unit 740 (an example of a determination unit) is implemented by the processing circuit 670 , the appliance state management DB 620 , and the cooking program DB 640 .
- the state management unit 740 receives the context data as input, and acquires data stored in the appliance state management DB 620 and the cooking program DB 640 .
- the state management unit 740 changes the acquired data to update the appliance state management DB 620 and the cooking program DB 640 .
- the response generation unit 750 is implemented by the processing circuit 670 , the appliance state management DB 620 , the appliance function DB 630 , and the cooking program DB 640 .
- the response generation unit 750 searches the appliance state management DB 620 , the appliance function DB 630 , and the cooking program DB 640 , and generates a control signal for controlling the cooking appliance 400 to be controlled.
- the response generation unit 750 searches the appliance function DB 630 and the cooking program DB 640 , and generates character string data of information to be provided to the user 250 .
- the speech synthesis unit 760 is implemented by the processing circuit 670 and the speech recognition/synthesis DB 600 .
- the speech synthesis unit 760 converts the character string data into an audio signal. Specifically, the speech synthesis unit 760 acquires information on pre-registered acoustic models and language models from the speech recognition/synthesis DB 600 , and converts the character string data into a specific audio signal using the acoustic models and the language models.
- FIG. 12 is a flowchart illustrating an example of the operation of the utterance interpretation unit 730 .
- FIGS. 13A and 13B are diagrams illustrating an example of the utterance interpretation dictionary DB 625 .
- FIG. 14 is a diagram illustrating an example of context data 1400 extracted by the utterance interpretation unit 730 .
- the context data 1400 illustrated in FIG. 14 is an example of context data in a case where a user speaks an utterance “Turn off the heat to the stew”. In the following, the stew is described as a Japanese stewed or simmered vegetable dish (“nimono”).
- the utterance interpretation dictionary DB 625 holds word IDs, word names, related word IDs, types, and concepts in association with one another.
- the word IDs are identifiers uniquely assigned to the words registered in the word names. For example, the word name “pot” is registered with the word ID “W001”.
- the related word IDs are word IDs of words related to the words registered in the word names. For example, the word ID “W030” associated with the word name “ground meat” is registered as a related word ID of the word name “hamburger steak”. Conversely, the word ID “W010” associated with the word name “hamburger steak” is registered as a related word ID associated with the word name “ground meat”.
- the types represent the types of the words registered in the word names.
- the types include <equipment>, <menu>, <category>, <ingredient>, <appliance>, and <task>.
- the type <equipment> represents a piece of cooking equipment or a cookware product.
- the type <menu> represents a menu item or food item.
- the type <category> represents the general concept of a menu item.
- the type <ingredient> represents the names of ingredients used in cooking.
- the type <appliance> represents a cooking appliance.
- the type <task> represents instructions for an action or the like.
- the word name “pot” is registered with the type <equipment>.
- the word name “hamburger steak” is registered with the type <menu>.
- the word name “stew” is registered with the type <category>.
- the word name “ground meat” is registered with the type <ingredient>.
- the word name “IH cooker” is registered with the type <appliance>.
- the word name “turn off the heat” is registered with the type <task>.
- the concepts represent logical symbols for the words registered in the word names.
- the concepts correspond one-to-one to the words registered in the word names.
- the word name “stew” is registered with the concept <stewed>.
- the word name “oven range” is registered with the concept <stove>.
- the word name “turn off the heat” is registered with the concept <stop_heat>.
- a process illustrated in FIG. 12 is initiated by the utterance interpretation unit 730 immediately after the speech recognition unit 710 converts an audio signal for an utterance made by a user into character string data.
- the utterance interpretation unit 730 in the cloud server 111 checks a character string for the utterance made by the user (i.e., the character string data output from the speech recognition unit 710 ) against a list of word names in the utterance interpretation dictionary DB 625 .
- the utterance interpretation unit 730 outputs, as context data, the “types” and “concepts” associated with all the word names that match part or all of the character string.
- the context data is output in a table form.
- the utterance interpretation unit 730 determines that the word name “stew” and the word name “turn off the heat” in the utterance “Turn off the heat to the stew” match the corresponding word names in the utterance interpretation dictionary DB 625 .
- the utterance interpretation unit 730 outputs, as the context data 1400 , the information associated with the word name “stew”, namely, the word ID “W020”, the type <category>, and the concept <stewed>, and the information associated with the word name “turn off the heat”, namely, the word ID “W100”, the type <task>, and the concept <stop_heat>.
- the utterance interpretation unit 730 determines whether or not each word name has a related word ID. If none of the word names has a related word ID (NO in S 1203 ), the process illustrated in FIG. 12 ends. If a word name has a related word ID (YES in S 1203 ), then in S 1204 , the utterance interpretation unit 730 outputs the “word name”, “type”, and “concept” associated with the related word ID as context data.
- the utterance interpretation unit 730 determines that there is no related word ID associated with the word name “turn off the heat”, and determines that there are related word IDs associated with the word name “stew”.
- the utterance interpretation unit 730 outputs, as context data, the word name “beef stew”, the type <menu>, and the concept <beef_stew>, which are associated with the related word ID “W011” associated with the word name “stew”, and the word name “chikuzen-ni stew”, the type <menu>, and the concept <chikuzen_ni>, which are associated with the related word ID “W012” associated with the word name “stew”.
- the context data 1400 illustrated in FIG. 14 is output from the utterance interpretation unit 730 .
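- A minimal sketch of this FIG. 12 flow follows; the dictionary entries mirror the FIGS. 13A/13B examples quoted above, but only a few words are populated, and the simple substring match stands in for whatever matching the patent's implementation actually performs.

```python
# Sketch of utterance interpretation: match word names in the utterance
# against the dictionary (S1201/S1202), then expand related word IDs
# (S1203/S1204). Only the entries quoted in the text are included.

DICTIONARY = {
    "stew": {"id": "W020", "type": "category", "concept": "stewed",
             "related": ["W011", "W012"]},
    "turn off the heat": {"id": "W100", "type": "task", "concept": "stop_heat",
                          "related": []},
}
BY_ID = {
    "W011": {"word": "beef stew",        "type": "menu", "concept": "beef_stew"},
    "W012": {"word": "chikuzen-ni stew", "type": "menu", "concept": "chikuzen_ni"},
}

def interpret(utterance):
    context = []
    for word, entry in DICTIONARY.items():
        if word in utterance:                                  # word name matches
            context.append((word, entry["type"], entry["concept"]))
            for rid in entry["related"]:                       # related words
                rel = BY_ID[rid]
                context.append((rel["word"], rel["type"], rel["concept"]))
    return context

print(interpret("Turn off the heat to the stew".lower()))
# -> [('stew', 'category', 'stewed'), ('beef stew', 'menu', 'beef_stew'),
#     ('chikuzen-ni stew', 'menu', 'chikuzen_ni'),
#     ('turn off the heat', 'task', 'stop_heat')]
```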
- FIG. 17 is a diagram illustrating a specific example of the appliance state management DB 620 .
- the appliance state management DB 620 holds, for example, gateway IDs (GW-IDs), appliance IDs, appliance names <appliance>, ongoing cooking program IDs, ongoing cooking step IDs, cookware names <equipment>, menu names <menu>, and appliance operating states in association with one another.
- the gateway IDs are identifiers uniquely assigned to gateways 102 .
- gateway IDs “G001” and “G002” are registered.
- the appliance IDs are identifiers uniquely assigned to separate cooking units included in cooking appliances 400 .
- the appliance ID “M01-01” is registered with the “heater 1” of the IH cooker 244
- the appliance ID “M01-02” is registered with the “heater 2” of the IH cooker 244
- the appliance ID “M01-03” is registered with a “heater 3” of the IH cooker 244
- the appliance ID “M02-01” is registered with the “top rack” in the oven range 243
- the appliance ID “M02-02” is registered with the “bottom rack” in the oven range 243 .
- the appliance names <appliance> represent logical symbols for cooking appliances 400 .
- the appliance names <appliance> correspond one-to-one to the cooking appliances 400 .
- the appliance name <ih_heater> is registered with each of the heaters 1, 2, and 3 of the IH cooker 244 .
- the appliance name <stove> is registered with each of the top rack and bottom rack in the oven range 243 .
- the ongoing cooking program IDs are identifiers of the cooking programs currently in progress.
- the cooking programs with the cooking program IDs “T001”, “T002”, and “T003” are currently in progress.
- the ongoing cooking step IDs are identifiers of individual cooking steps in currently ongoing cooking programs.
- the cooking step with the cooking step ID “S001” is being performed in the cooking associated with the cooking program ID “T001”
- the cooking step with the cooking step ID “S002” is being performed in the cooking associated with the cooking program ID “T002”
- the cooking step with the cooking step ID “S002” is being performed in the cooking associated with the cooking program ID “T003”.
- the cookware names <equipment> represent cookware items used in accordance with the associated cooking programs.
- <pot> is used in the cooking associated with the cooking program ID “T001”
- <pan> is used in the cooking associated with the cooking program ID “T002”
- <gratin plate> is used in the cooking associated with the cooking program ID “T003”.
- the menu names <menu> represent menu items (or food items) currently being cooked.
- the menu item associated with the cooking program ID “T001” is represented by <chikuzen_ni>
- the menu item associated with the cooking program ID “T002” is represented by <hamburger>
- the menu item associated with the cooking program ID “T003” is represented by <gratin>.
- the appliance operating states indicate, for example, whether the corresponding appliances are currently in operation or in standby state.
- the “heater 1” of the IH cooker 244 is in the appliance operating state “in standby mode”
- the “heater 2” of the IH cooker 244 is in the appliance operating state “in operation (over a low heat)”
- the “top rack” in the oven range 243 is in the appliance operating state “in operation (2 minutes left)”.
- the IH cooker 244 has three cooking units, namely, the “heater 1”, the “heater 2”, and the “heater 3”.
- the three cooking units are configured to be capable of simultaneously operating in accordance with different cooking programs. Accordingly, separate appliance IDs are registered with the “heater 1”, the “heater 2”, and the “heater 3” in order to individually identify the three cooking units.
- the “heater 1” of the IH cooker 244 is an example of a first cooking unit
- the “heater 2” of the IH cooker 244 is an example of a second cooking unit.
- the oven range 243 has two cooking units, namely, the “top rack” and the “bottom rack”.
- the two cooking units are configured to be capable of simultaneously operating in accordance with different cooking programs. Accordingly, separate appliance IDs are registered with the “top rack” and the “bottom rack” in order to individually identify the two cooking units.
- the “top rack” in the oven range 243 is an example of the first cooking unit
- the “bottom rack” in the oven range 243 is an example of the second cooking unit.
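- For illustration, the FIG. 17 records can be represented as follows; the values follow the examples in the text, and the pairing of cooking programs to cooking units is an assumption consistent with the running example (chikuzen-ni stew on the “heater 1”, hamburger steak on the “heater 2”, gratin on the “top rack”).

```python
# The appliance state management DB 620 (FIG. 17) rendered as records;
# program-to-unit pairings are assumptions consistent with the examples.

APPLIANCE_STATE_DB = [
    {"gw_id": "G001", "appliance_id": "M01-01", "appliance": "ih_heater",
     "program_id": "T001", "step_id": "S001", "equipment": "pot",
     "menu": "chikuzen_ni", "state": "in standby mode"},
    {"gw_id": "G001", "appliance_id": "M01-02", "appliance": "ih_heater",
     "program_id": "T002", "step_id": "S002", "equipment": "pan",
     "menu": "hamburger", "state": "in operation (over a low heat)"},
    {"gw_id": "G001", "appliance_id": "M02-01", "appliance": "stove",
     "program_id": "T003", "step_id": "S002", "equipment": "gratin plate",
     "menu": "gratin", "state": "in operation (2 minutes left)"},
]
```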
- FIG. 18 is a diagram illustrating a specific example of the appliance function DB 630 .
- the appliance function DB 630 holds, for example, function IDs, appliance IDs, task contents <task>, control commands, and response messages in association with one another.
- the function IDs are identifiers uniquely assigned to functions of cooking units registered in association with the appliance IDs.
- the task contents <task> represent logical symbols indicating tasks for the functions with the function IDs.
- the control commands represent control commands used to perform the functions with the function IDs.
- the response messages represent messages issued when the functions with the function IDs are performed.
- the function ID “O01-01-01” represents the function of the cooking unit with the appliance ID “M01-01”, that is, referring to FIG. 17 , the function of the “heater 1” of the IH cooker 244 .
- FIG. 19A is a diagram illustrating an example of a menu list 1900 included in the cooking program DB 640 .
- FIG. 19B is a diagram illustrating an example of a cooking step list 1910 included in the cooking program DB 640 .
- FIG. 20A is a diagram illustrating an example of an error message list 1920 included in the cooking program DB 640 .
- FIG. 20B is a diagram illustrating an example of a display screen 1930 of the display terminal 260 .
- the menu list 1900 in the cooking program DB 640 holds, for example, cooking program IDs, menu names <menu>, cookware names <equipment>, ingredient names <ingredient>, and category names <category> in association with one another.
- the cooking program ID “T001” is associated with the menu name <chikuzen_ni>, the cookware name <pot>, the ingredient names <chicken>, <carrot>, etc., and the category name <stewed>.
- the cooking step list 1910 in the cooking program DB 640 holds, for example, cooking program IDs, cooking step IDs, and response messages in association with one another.
- the cooking procedure associated with the cooking program ID “T001” includes cooking steps with the cooking step IDs “S001”, “S002”, and so on.
- the response message “Heat the pot over a high heat” is registered in association with the cooking step with the cooking step ID “S002”.
- the error message list 1920 in the cooking program DB 640 holds, for example, error message IDs, error types, and response error messages in association with one another.
- “no categories, menu items, or ingredients” is registered as the error type with the error message ID “E002”, and “XXX is not being made right now” is registered as the response error message.
- the display terminal 260 displays a display screen 1930 including the response error message “A cream stew is not being made right now”.
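- The three lists in the cooking program DB 640 can be sketched as follows, populated only with the entries quoted above; the error type wording for “E004” is an assumption, since FIG. 20A is not reproduced here in full.

```python
# Sketch of the cooking program DB 640 (FIGS. 19A, 19B, and 20A).

MENU_LIST = {
    "T001": {"menu": "chikuzen_ni", "equipment": "pot",
             "ingredients": ["chicken", "carrot"], "category": "stewed"},
}

COOKING_STEP_LIST = {
    ("T001", "S001"): "Pour 400 cc of purified water into a pot on the stove",
    ("T001", "S002"): "Heat the pot over a high heat",
}

ERROR_MESSAGE_LIST = {
    "E002": {"type": "no categories, menu items, or ingredients",
             "message": "XXX is not being made right now"},
    "E004": {"type": "appliance not uniquely identifiable",   # assumed wording
             "message": "Please enter the name of a menu item"},
}
```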
- FIGS. 15A and 15B are flowcharts illustrating an example of the operation of the state management unit 740 in the cloud server 111 .
- the state management unit 740 acquires context data output from the utterance interpretation unit 730 (S 1501 ). Then, the state management unit 740 determines whether or not the acquired context data includes a category name or an ingredient name (S 1502 ). If it is determined that the context data does not include a category name or an ingredient name (NO in S 1502 ), the state management unit 740 advances the process to S 1506 .
- the state management unit 740 checks the category name or the ingredient name against the cooking program DB 640 (S 1503 ). The state management unit 740 determines whether or not the corresponding category name or ingredient name has been registered in the cooking program DB 640 (S 1504 ). If it is determined that the corresponding category name or ingredient name has not been registered in the cooking program DB 640 (NO in S 1504 ), the state management unit 740 advances the process to S 1513 .
- If it is determined that the corresponding category name or ingredient name has been registered in the cooking program DB 640 (YES in S 1504 ), the state management unit 740 outputs the associated menu name and cookware name (S 1505 ), and then advances the process to S 1506 .
- the state management unit 740 checks the appliance name, menu name, or cookware name in the context data against the appliance state management DB 620 .
- the state management unit 740 checks the menu name and cookware name output in S 1505 against the appliance state management DB 620 .
- the state management unit 740 determines whether or not the corresponding appliance name, menu name, or cookware name has been registered in the appliance state management DB 620 . If it is determined that the corresponding appliance name, menu name, or cookware name has not been registered in the appliance state management DB 620 (NO in S 1507 ), the state management unit 740 advances the process to S 1513 .
- the state management unit 740 acquires the appliance ID from the appliance state management DB 620 (S 1508 ). Then, the state management unit 740 determines whether or not the acquired appliance ID is uniquely identifiable (S 1509 ). If it is determined that the acquired appliance ID is not uniquely identifiable (NO in S 1509 ), the state management unit 740 advances the process to S 1513 .
- the state management unit 740 checks the task content in the context data against the appliance function DB 630 (S 1510 ). Then, the state management unit 740 determines whether or not the corresponding task content has been registered in the appliance function DB 630 (S 1511 ). If it is determined that the corresponding task content has not been registered in the appliance function DB 630 (NO in S 1511 ), the state management unit 740 advances the process to S 1513 .
- the state management unit 740 acquires the function ID from the appliance function DB 630 , and outputs the acquired function ID (S 1512 ). Then, the process illustrated in FIGS. 15A and 15B ends.
- the state management unit 740 searches the error message list 1920 in the cooking program DB 640 for the corresponding error message ID, and outputs the error message ID. Then, the process illustrated in FIGS. 15A and 15B ends.
- the state management unit 740 acquires the error message ID “E002”.
- the state management unit 740 acquires any of the error message IDs “E001”, “E002”, and “E003”.
- the state management unit 740 acquires the error message ID “E004”.
- the state management unit 740 acquires the error message ID “E006”.
- the state management unit 740 may set the error message ID to be acquired to, for example, “E002” by default.
- an error message ID is output, and then the current process ends. After that, when the user 250 speaks an utterance in response to the error message, the operation of the utterance interpretation unit 730 ( FIG. 12 ) is initiated.
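- A condensed sketch of this FIGS. 15A/15B flow is given below, reusing the illustrative APPLIANCE_STATE_DB records sketched after FIG. 17; the function-ID table entry is an assumed pairing, and the error-ID branches follow the cases listed above.

```python
# Resolve context data to a unique appliance ID and then to a function ID;
# otherwise fall back to an error message ID.

FUNCTION_IDS = {("M01-01", "stop_heat"): "O01-01-01"}  # assumed mapping

def manage_state(context, state_db, function_ids=FUNCTION_IDS):
    concepts = {concept for _, _, concept in context}
    # S1506/S1507: check appliance, menu, and cookware names against the DB.
    matches = [r for r in state_db
               if {r["appliance"], r["menu"], r["equipment"]} & concepts]
    if not matches:
        return ("error", "E002")      # e.g., "XXX is not being made right now"
    if len(matches) > 1:
        return ("error", "E004")      # appliance ID not uniquely identifiable
    appliance_id = matches[0]["appliance_id"]              # S1508
    task = next((c for _, t, c in context if t == "task"), None)
    function_id = function_ids.get((appliance_id, task))   # S1510/S1511
    if function_id is None:
        return ("error", "E006")      # task not registered for this appliance
    return ("function", function_id)  # S1512

context = [("stew", "category", "stewed"),
           ("chikuzen-ni stew", "menu", "chikuzen_ni"),
           ("turn off the heat", "task", "stop_heat")]
print(manage_state(context, APPLIANCE_STATE_DB))  # -> ('function', 'O01-01-01')
```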
- FIG. 16 is a flowchart illustrating an example of the operation of the response generation unit 750 in the cloud server 111 .
- the response generation unit 750 acquires the content output from the state management unit 740 in S 1512 or S 1513 of FIGS. 15A and 15B (S 1601 ). Then, the response generation unit 750 determines whether the acquired output content is a function ID or an error message ID (S 1602 ).
- the response generation unit 750 checks the function ID against the appliance function DB 630 , and generates a control command and a response message (S 1603 ).
- the response generation unit 750 checks the error message ID against the error message list 1920 included in the cooking program DB 640 , and generates a response error message (S 1604 ).
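- The FIG. 16 branching reduces to a short lookup, sketched below with the ERROR_MESSAGE_LIST from the earlier cooking program DB sketch; the control command and response message texts for the function entry are assumptions.

```python
# Sketch of FIG. 16: a function ID yields a control command and a response
# message from the appliance function DB 630; an error message ID yields a
# response error message from the error message list 1920.

FUNCTION_DB = {
    "O01-01-01": {"command": "power_off",                     # assumed command
                  "message": "The heater 1 was turned off"},  # assumed message
}

def generate_response(kind, value, error_list):
    if kind == "function":                     # S1602 -> S1603
        entry = FUNCTION_DB[value]
        return entry["command"], entry["message"]
    return None, error_list[value]["message"]  # S1602 -> S1604

print(generate_response("error", "E004", ERROR_MESSAGE_LIST))
# -> (None, 'Please enter the name of a menu item')
```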
- FIGS. 21 to 23 are sequence diagrams illustrating the operation of the speech-based appliance control system according to the first embodiment.
- FIGS. 24A, 24B, and 24C are diagrams illustrating an example of a menu selection screen 2400 displayed on the display terminal 260 .
- FIGS. 21 to 23 illustrate a continuous sequence. The process in the sequence diagram illustrated in FIGS. 21 to 23 is initiated when the user 250 gives instructions to the display terminal 260 to start the speech-based appliance control system by, for example, tapping an icon displayed on a display screen of the display terminal 260 .
- the display terminal 260 acquires a menu list request from the user 250 .
- the communication circuit 505 in the display terminal 260 transmits the acquired menu list request and the display terminal ID 521 to the gateway 102 .
- the gateway 102 receives the menu list request and the display terminal ID 521 .
- the gateway 102 transmits the menu list request and the display terminal ID 521 received from the display terminal 260 , and the gateway ID 541 held in the memory 540 of the gateway 102 to the cloud server 111 .
- the cloud server 111 receives the menu list request, the display terminal ID 521 , and the gateway ID 541 .
- the state management unit 740 in the cloud server 111 performs a menu list acquisition process to extract a menu list.
- the response generation unit 750 in the cloud server 111 transmits the extracted menu list, the display terminal ID 521 that specifies the display terminal 260 to be used for display, and the gateway ID 541 to the gateway 102 .
- the gateway 102 receives the menu list, the display terminal ID 521 , and the gateway ID 541 .
- the received data analysis unit 810 in the gateway 102 performs a received data analysis process.
- the received data analysis unit 810 separates the data received from the cloud server 111 into the menu list, the display terminal ID 521 , and the gateway ID 541 .
- the transmission data generation unit 820 in the gateway 102 transmits the separated menu list to the display terminal 260 corresponding to the display terminal ID 521 .
- the display control circuit 500 in the display terminal 260 displays a menu selection screen 2400 on the display circuit 502 in a manner illustrated in FIG. 24A in accordance with the received menu list (an example of display screen information).
- the display terminal 260 acquires instructions for a specific cooking program request from the user 250 .
- the menu selection screen 2400 includes a cooking appliance display portion 2401 and a cooking program display portion 2402 .
- In the cooking appliance display portion 2401 , a cooking appliance 400 including a plurality of cooking units is schematically displayed.
- three cooking units, namely, the “heater 1”, the “heater 2”, and the “heater 3” of the IH cooker 244 are displayed in the cooking appliance display portion 2401 .
- In the cooking program display portion 2402 , a list of cooking programs is displayed.
- four menu items, e.g., “hamburger steak”, “beef stew”, “chikuzen-ni stew”, and “gratin”, are displayed in the cooking program display portion 2402 .
- the cooking program display portion 2402 may be configured such that swiping up or down in the area corresponding to the cooking program display portion 2402 scrolls the screen to allow the remaining cooking programs to appear.
- the user 250 taps, for example, an area labeled “chikuzen-ni stew” in the cooking program display portion 2402 and then taps an area labeled “heater 1” in the cooking appliance display portion 2401 with a contact object 2403 (e.g., the user's finger). Then, the display terminal 260 acquires instructions to request that a meal for the cooking program “chikuzen-ni stew” (an example of a first cooking recipe) be cooked using the “heater 1” (an example of a first cooking unit) of the IH cooker 244 . In accordance with the instructions, the display terminal 260 acquires the cooking program ID and the appliance ID. In addition, as illustrated in FIG. 24B , the display terminal 260 changes the display color of the tapped areas to allow the user 250 to easily identify the selected items.
- appliance IDs corresponding to the “heater 1”, the “heater 2”, and the “heater 3” of the IH cooker 244 are registered in advance.
- the communication circuit 505 in the display terminal 260 transmits the cooking program ID 523 to the cooking appliance 400 corresponding to the appliance ID.
- the cooking appliance 400 receives the transmitted cooking program ID 523 , and stores the received cooking program ID 523 in the memory 440 .
- the cooking program ID indicating “chikuzen-ni stew”, which is transmitted in S 2110 from the display terminal 260 to the cooking appliance 400 , is an example of first cooking program information.
- the communication circuit 505 in the display terminal 260 transmits the cooking program ID, the display terminal ID, and the appliance ID to the gateway 102 .
- the gateway 102 receives the cooking program ID, the display terminal ID, and the appliance ID.
- the cooking program ID indicating “chikuzen-ni stew”, which is transmitted in S 2111 from the display terminal 260 to the gateway 102 , is an example of first cooking recipe selection information.
- the gateway 102 transmits the cooking program ID, the display terminal ID, and the appliance ID, which are received from the display terminal 260 , and the gateway ID 541 held in the memory 540 of the gateway 102 to the cloud server 111 .
- the communication circuit 650 in the cloud server 111 receives the cooking program ID, the display terminal ID, the appliance ID, and the gateway ID 541 .
- the state management unit 740 in the cloud server 111 performs a cooking program management process.
- the state management unit 740 performs a process to update the content of the appliance state management DB 620 using the values of the received cooking program ID, display terminal ID, appliance ID, and gateway ID.
- FIG. 26 is a flowchart illustrating the cooking program management process executed in S 2201 of FIG. 22 .
- the state management unit 740 acquires the display terminal ID, the gateway ID, the appliance ID, and the cooking program ID, which are received by the communication circuit 650 in S 2112 of FIG. 21 .
- the state management unit 740 checks the gateway ID and the appliance ID against the appliance state management DB 620 .
- the state management unit 740 determines whether or not the cooking program ID has been registered in the column of the ongoing cooking program ID in the appliance state management DB 620 associated with the gateway ID and the appliance ID. If the cooking program ID has been registered in the column of the ongoing cooking program ID (YES in S 2603 ), the state management unit 740 ends the process illustrated in FIG. 26 .
- the state management unit 740 registers the cooking program ID acquired in S 2601 in the column of the ongoing cooking program ID in the appliance state management DB 620 in association with the gateway ID and the appliance ID (S 2604 ).
- the state management unit 740 checks the cooking program ID against the menu list 1900 in the cooking program DB 640 , and acquires the cookware name and the menu name.
- the state management unit 740 registers the acquired cookware name and menu name in the columns of the cookware name and menu name in the appliance state management DB 620 , respectively, in association with the gateway ID and the appliance ID.
- the state management unit 740 resets the current value in the column of the ongoing cooking step ID in the appliance state management DB 620 to the initial value (in the example illustrated in FIG. 19B , “S001”), and ends the process illustrated in FIG. 26 .
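- The FIG. 26 process can be sketched as a single update over the illustrative record structures introduced earlier; the step-reset value follows FIG. 19B, and the early-return condition reflects one reading of S 2603.

```python
# Sketch of the FIG. 26 cooking program management process.

INITIAL_STEP_ID = "S001"

def register_cooking_program(state_db, menu_list, gw_id, appliance_id, program_id):
    record = next(r for r in state_db
                  if r["gw_id"] == gw_id and r["appliance_id"] == appliance_id)
    if record["program_id"] == program_id:   # S2603: already registered -> done
        return
    record["program_id"] = program_id        # S2604: register the ongoing program
    menu = menu_list[program_id]             # S2605: check against menu list 1900
    record["equipment"] = menu["equipment"]  # register cookware name
    record["menu"] = menu["menu"]            # register menu name
    record["step_id"] = INITIAL_STEP_ID      # reset the ongoing cooking step ID
```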
- the response generation unit 750 in the cloud server 111 performs a response text generation process to generate a response message for the user 250 .
- the cloud server 111 holds information on the response messages registered in the cooking step list 1910 ( FIG. 19B ) in the cooking program DB 640 , and information on the response messages registered in the appliance function DB 630 ( FIG. 18 ).
- the response generation unit 750 in the cloud server 111 reads the response messages stored in the cooking program DB 640 or the appliance function DB 630 to generate text data of response text.
- the speech synthesis unit 760 in the cloud server 111 performs a speech synthesis process to convert the response message into audio data.
- the cloud server 111 holds information on the acoustic models and language models registered in the speech recognition/synthesis DB 600 .
- the speech synthesis unit 760 in the cloud server 111 reads the information on the acoustic models and language models registered in the speech recognition/synthesis DB 600 , and converts the text data of the response text into specific audio data using the information on the acoustic models and language models.
- the cloud server 111 transmits the generated audio data, the generated text data, the display terminal ID 521 , and the gateway ID 541 to the gateway 102 .
- the gateway 102 receives the audio data, the text data, the display terminal ID 521 , and the gateway ID 541 .
- the received data analysis unit 810 in the gateway 102 performs a received data analysis process.
- the received data analysis unit 810 in the gateway 102 separates the received data into the audio data, the text data, the display terminal ID 521 , and the gateway ID 541 .
- the transmission data generation unit 820 in the gateway 102 transmits the separated audio data to the audio input and output device 240 .
- the audio input and output device 240 outputs audio using the received audio data.
- the transmission data generation unit 820 in the gateway 102 transmits the separated text data to the display terminal 260 corresponding to the display terminal ID 521 .
- the display terminal 260 displays a text image corresponding to the received text data.
- the cooking appliance 400 detects the content of the operation (hereinafter referred to as the “operation content”) to be performed on the cooking appliance 400 by the user 250 .
- the communication circuit 450 in the cooking appliance 400 transmits the detected operation content, the appliance ID 441 , and the cooking program ID 443 to the gateway 102 .
- the gateway 102 receives the operation content, the appliance ID 441 , and the cooking program ID 443 .
- the gateway 102 transmits the cooking program ID 443 , the operation content, and the appliance ID 441 , which are received from the cooking appliance 400 , and the gateway ID 541 held in the memory 540 of the gateway 102 to the cloud server 111 .
- the cloud server 111 receives the cooking program ID 443 , the operation content, the appliance ID 441 , and the gateway ID 541 .
- the state management unit 740 in the cloud server 111 performs a cooking program update process.
- the state management unit 740 performs a process to update the content of the appliance state management DB 620 using the values of the received cooking program ID, the received operation content, the received appliance ID 441 , and the received gateway ID 541 .
- the state management unit 740 can thereby determine that the immediately preceding cooking step has been executed.
- the state management unit 740 updates the content of the appliance state management DB 620 in accordance with the result of the immediately preceding cooking step.
- Referring to FIG. 24B , a description is given of the case where, in S 2109 of FIG. 21 , the user 250 made a request to cook “chikuzen-ni stew” using the “heater 1” of the IH cooker 244 .
- the cooking program ID for “chikuzen-ni stew” is “T001”.
- the response generation unit 750 in the cloud server 111 acquires the cooking program ID “T001” for “chikuzen-ni stew” from the state management unit 740 .
- the response generation unit 750 refers to the cooking step list 1910 in the cooking program DB 640 ( FIG. 19B ), and acquires the response message “Pour 400 cc of purified water into a pot on the stove” associated with the first cooking step ID “S001” associated with the cooking program ID “T001”. In S 2202 of FIG. 22 , the response generation unit 750 generates the above-described response message.
- the above-described response message is output as audio in S 2207 of FIG. 22 , and is displayed on a screen in S 2209 . Accordingly, the user 250 pours water into a pot and places the pot on the “heater 1” of the IH cooker 244 . Then, in S 2301 of FIG. 23 , the cooking appliance 400 (in the illustrated example, the IH cooker 244 ) detects an increase in the weight on the “heater 1”. In S 2302 , the cooking appliance 400 transmits the increase in the weight on the “heater 1” to the gateway 102 as operation content.
- the gateway 102 transmits the operation content indicating the increase in the weight on the “heater 1” to the cloud server 111 .
- the communication circuit 650 in the cloud server 111 receives the operation content.
- the state management unit 740 acquires the operation content indicating the increase in the weight on the “heater 1”, which is received by the communication circuit 650 . On the basis of the operation content indicating the increase in the weight on the “heater 1”, the state management unit 740 determines that the cooking step ID “S001” in the cooking (of chikuzen-ni stew) associated with the cooking program ID “T001” has been executed.
- the state management unit 740 updates the ongoing cooking step ID corresponding to the “IH cooker: heater 1” with the appliance ID “M01-01” in the appliance state management DB 620 from “S001” to “S002”. In this way, the state management unit 740 executes the cooking program update process in S 2304 .
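- The update just described amounts to advancing one field of the state record, as in this sketch reusing the APPLIANCE_STATE_DB records from earlier; the consecutive “Sxxx” numbering scheme is assumed from FIG. 19B.

```python
# Sketch of the cooking program update process: an operation content report
# (here, the detected weight increase on "heater 1") marks the current step
# done and advances the ongoing cooking step ID.

def next_step_id(step_id):
    return "S{:03d}".format(int(step_id[1:]) + 1)   # e.g., "S001" -> "S002"

def update_cooking_program(state_db, appliance_id, operation_content):
    record = next(r for r in state_db if r["appliance_id"] == appliance_id)
    if operation_content == "weight_increase":
        record["step_id"] = next_step_id(record["step_id"])

update_cooking_program(APPLIANCE_STATE_DB, "M01-01", "weight_increase")
```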
- chikuzen-ni stew is cooked using the “heater 1” of the IH cooker 244 .
- the user 250 may wish to simultaneously cook another meal using another cooking unit of the IH cooker 244 .
- the speech-based appliance control system again starts the process from S 2101 of FIG. 21 .
- the user 250 taps, for example, an area labeled “hamburger steak” in the cooking program display portion 2402 and then taps an area labeled “heater 2” in the cooking appliance display portion 2401 with the contact object 2403 (e.g., the user's finger). Then, in S 2109 of FIG. 21 , the display terminal 260 acquires instructions to request that a meal for the cooking program “hamburger steak” (an example of a second cooking recipe) be cooked using the “heater 2” (an example of a second cooking unit) of the IH cooker 244 .
- the display terminal 260 acquires the cooking program ID and the appliance ID. In addition, as illustrated in FIG. 24C , the display terminal 260 changes the display color of the tapped areas to allow the user 250 to easily identify the further selected items.
- the process illustrated in FIGS. 21 to 23 is performed again using the cooking program ID indicating “hamburger steak” and the appliance ID indicating the “heater 2” of the IH cooker 244 .
- the cooking program ID indicating “hamburger steak”, which is transmitted in S 2110 of FIG. 21 from the display terminal 260 to the cooking appliance 400 , is an example of second cooking program information.
- the cooking program ID indicating “hamburger steak”, which is transmitted in S 2111 of FIG. 21 from the display terminal 260 to the gateway 102 , is an example of second cooking recipe selection information.
- the heating of the cooking appliance 400 may be started in response to an operation of the user 250 .
- the response message “Heat the pot over a high heat”, which has been registered in the cooking step list 1910 illustrated in FIG. 19B may be output.
- the heating of the cooking appliance 400 may be automatically started in response to an increase in the weight of the pot.
- the response message “The heater 1 was turned on”, which has been registered in the appliance function DB 630 illustrated in FIG. 18 may be output.
- FIG. 25 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the first embodiment. The process illustrated in FIG. 25 is initiated when a user gives some instructions to an appliance by using speech.
- the audio input and output device 240 acquires audio data of the user 250 .
- the communication circuit 303 in the audio input and output device 240 transmits the acquired audio data to the gateway 102 .
- the gateway 102 receives the audio data.
- the gateway 102 transmits the audio data received from the audio input and output device 240 and the gateway ID 541 held in the memory 540 of the gateway 102 to the cloud server 111 .
- the communication circuit 650 in the cloud server 111 receives the audio data and the gateway ID 541 .
- the speech recognition unit 710 and the utterance interpretation unit 730 in the cloud server 111 execute an audio content interpretation process.
- the speech recognition unit 710 acquires the user audio data received from the communication circuit 650 .
- the speech recognition unit 710 extracts frequency characteristics from the user audio data.
- the speech recognition unit 710 extracts phonemic data using the acoustic models held in the speech recognition/synthesis DB 600 and the extracted frequency characteristics.
- the speech recognition unit 710 converts the extracted phonemic data into specific character string data by checking the extracted phonemic data against the speech recognition/synthesis DB 600 and determining which character string data of the language models held in the speech recognition/synthesis DB 600 is the most similar to the extracted phonemic data in terms of arrangement.
- the utterance interpretation unit 730 executes the process described above with reference to FIG. 12 using the character string data obtained by the speech recognition unit 710 . In this way, the audio content interpretation process in S 2504 is executed.
- the state management unit 740 in the cloud server 111 executes a state management process.
- the state management unit 740 executes the process described above with reference to FIGS. 15A and 15B .
- the response generation unit 750 in the cloud server 111 executes an output generation process.
- the response generation unit 750 executes the process described above with reference to FIG. 16 .
- the speech synthesis unit 760 in the cloud server 111 performs a speech synthesis process.
- the speech synthesis unit 760 in the cloud server 111 performs a process to convert response text into audio data.
- the cloud server 111 holds information on the acoustic models and language models registered in the speech recognition/synthesis DB 600 .
- the CPU 671 in the cloud server 111 reads the information on the acoustic models and language models registered in the speech recognition/synthesis DB 600 , and converts the character string data into specific audio data using the information on the acoustic models and language models.
- the cloud server 111 transmits the generated control command, the generated audio data, the generated text data, the appliance ID 441 of the appliance to be controlled, and the gateway ID 541 to the gateway 102 .
- the gateway 102 receives the control command, the audio data, the text data, the appliance ID 441 , and the gateway ID 541 .
- the received data analysis unit 810 in the gateway 102 performs a received data analysis process.
- the received data analysis unit 810 in the gateway 102 separates the received data into the control command, the audio data, the text data, the appliance ID 441 , and the gateway ID 541 .
- the transmission data generation unit 820 in the gateway 102 transmits the separated text data to the display terminal 260 .
- the display terminal 260 displays the received text data on the display screen.
- the transmission data generation unit 820 in the gateway 102 transmits the separated control command to the cooking appliance 400 corresponding to the appliance ID 441 .
- the appliance control unit 910 in the cooking appliance 400 controls the operation in accordance with the control command received by the communication unit 900 .
- the transmission data generation unit 820 in the gateway 102 transmits the separated audio data to the audio input and output device 240 .
- the audio output unit 1040 in the audio input and output device 240 outputs audio in accordance with the audio data received by the communication unit 1030 .
- A specific example of the process illustrated in FIG. 25 will now be described. First, a description will be given of the operation in a first specific example in which, while chikuzen-ni stew and hamburger steak are being cooked as illustrated above in FIG. 24C , the user 250 speaks the utterance “Turn off the heat to the stew” as in FIG. 14 described above.
- the context data 1400 illustrated in FIG. 14 is output from the utterance interpretation unit 730 .
- the process illustrated in FIGS. 15A and 15B is executed by the state management unit 740 .
- the beef stew is also a kind of stew.
- the menu name <beef_stew> and the cookware name <pot> are also output.
- the output generation process in S 2506 of FIG. 25 , that is, the process illustrated in FIG. 16 performed by the response generation unit 750 , is executed.
- a function ID is output from the state management unit 740 .
- the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned off.
- “Turn off the heat to the stew” is an example of instruction information
- “turn off the heat” is an example of first audio information (operation instructions)
- “stew” is an example of first menu information or second menu information and is also an example of second audio information.
- the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <stop_heat> associated with the column of the word name “turn off the heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B .
- the context data includes no category names or ingredient names. Thus, NO is determined in S 1502 of FIG. 15A . Then, the process proceeds to S 1506 .
- the cookware name <pot> in the context data has been registered in one column. Thus, the appliance ID “M01-01” is acquired through S 1506 to S 1508 . Accordingly, an appliance ID is identified by the cookware name <pot> in the appliance state management DB 620 .
- the task content “turn off the heat” is the same as that in the first specific example described above.
- the process is performed in a manner similar to that in the first specific example described above, and a similar control command and a similar response message are generated.
- the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned off.
- “Turn off the heat to the pot” is an example of instruction information
- “turn off the heat” is an example of first audio information (operation instructions)
- “pot” is an example of first cookware information or second cookware information and is also an example of second audio information.
- the context data 1400 illustrated in FIG. 14 which is the same as that in the first specific example described above, is output from the utterance interpretation unit 730 .
- the menu name <chikuzen_ni> and the cookware name <pot> are output.
- the menu name <beef_stew> and the cookware name <pot> are also output in S 1505 .
- the cookware name <pot> is stored with redundancy.
- the error message ID “E004” is acquired from the error message list 1920 in the cooking program DB 640 illustrated in FIG. 20A , and is output.
- an error message ID is output from the state management unit 740 .
- the error message ID “E004” is checked against the cooking program DB 640 illustrated in FIG. 20A , and the response error message “Please enter the name of a menu item” is generated.
- the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <stop_heat> associated with the column of the word name “turn off the heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B .
- the context data includes no category names or ingredient names. Thus, NO is determined in S 1502 of FIG. 15A . Then, the process proceeds to S 1506 .
- the cookware name <pot> is stored with redundancy.
- the response error message “Please enter the name of a menu item” is generated.
- the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <low heat> associated with the column of the word name “low heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B .
- the context data includes no category names or ingredient names. Thus, NO is determined in S 1502 of FIG. 15A . Then, the process proceeds to S 1506 .
- the cookware name <pot> in the context data has been registered in one column. Thus, the appliance ID “M01-01” is acquired through S 1506 to S 1508 . Accordingly, an appliance ID is identified by the cookware name <pot> in the appliance state management DB 620 .
- the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned down low.
- “Heat the pot over a low heat” is an example of instruction information
- “low heat” is an example of first audio information (operation instructions)
- “pot” is an example of first cookware information or second cookware information and is also an example of second audio information.
- the task content <low heat> in the appliance function DB 630 illustrated in FIG. 18 , which corresponds to the word name “low heat” in the utterance interpretation dictionary DB 625 illustrated in FIG. 13B , is an example of operation instructions for executing a process having a cooking parameter different from a process that is being executed in accordance with a cooking program.
- the cooking parameter is the temperature of the heat, that is, a set temperature.
- a cooking parameter in the present disclosure is not limited to a set temperature.
- the cooking parameter may be, for example, the duration for which a set temperature is maintained, the rate of change in temperature, or a heating on/off duty ratio.
- instructions given by the user 250 through utterance may include instructions to change the duration for which a set temperature is maintained.
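- For illustration only, these cooking parameters can be grouped into a single structure as sketched below; the field names and example values are assumptions, not terms from the patent.

```python
# Illustrative container for the cooking parameters mentioned above.

from dataclasses import dataclass

@dataclass
class CookingParameters:
    set_temperature_c: float     # the set temperature
    hold_duration_s: int         # duration for which the set temperature is held
    ramp_rate_c_per_s: float     # rate of change in temperature
    heating_duty_ratio: float    # heating on/off duty ratio (0.0 to 1.0)

# A hypothetical "low heat" setting:
low_heat = CookingParameters(set_temperature_c=90.0, hold_duration_s=600,
                             ramp_rate_c_per_s=0.5, heating_duty_ratio=0.3)
```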
- a response error message is output if it is difficult to determine which cooking unit the instructions are directed to.
- the response error message can prompt the user 250 to make an appropriate utterance.
- the display terminal 260 may include the audio input and output device 240 .
- each of the home electric appliances 101 may include the audio input and output device 240 and/or the display terminal 260 .
- FIG. 27 is a diagram illustrating the configuration of a speech-based appliance control system according to a second embodiment.
- substantially the same elements as those in the first embodiment are assigned the same numerals, and are not described in detail herein.
- a description will be given of the second embodiment, focusing on differences from the first embodiment.
- the speech-based appliance control system includes the audio input and output device 240 , the plurality of home electric appliances 101 , the display terminal 260 , and an integrated management device 2800 . That is, the speech-based appliance control system according to the second embodiment includes the integrated management device 2800 in place of the gateway 102 , the information communication network 220 , and the cloud server 111 in the speech-based appliance control system according to the first embodiment.
- the integrated management device 2800 is located in the group 100 .
- the integrated management device 2800 may be connected to the display terminal 260 , the audio input and output device 240 , and the plurality of home electric appliances 101 using wired or wireless connection.
- the integrated management device 2800 is separate from the home electric appliances 101 .
- the present disclosure is not limited to this embodiment.
- the oven range 243 , the IH cooker 244 , or the refrigerator 245 may include the integrated management device 2800 .
- FIG. 28A is a block diagram illustrating the hardware configuration of the integrated management device 2800 .
- the integrated management device 2800 includes the communication circuit 650 , the processing circuit 670 , the speech recognition database (DB) 600 , the appliance state management DB 620 , the utterance interpretation dictionary DB 625 , the appliance function DB 630 , and the cooking program DB 640 .
- the processing circuit 670 includes the CPU 671 and the memory 672 in which the computer program 673 is stored. In this manner, the integrated management device 2800 has substantially the same hardware configuration as that of the cloud server 111 illustrated in FIG. 7 .
- FIG. 28B is a block diagram illustrating the system configuration of the integrated management device 2800 .
- the integrated management device 2800 includes the communication unit 700 , the speech recognition unit 710 , the utterance interpretation unit 730 , the state management unit 740 , the response generation unit 750 , and the speech synthesis unit 760 .
- the integrated management device 2800 has substantially the same system configuration as that of the cloud server 111 illustrated in FIG. 11 .
- FIG. 29 is a diagram illustrating a specific example of the appliance state management DB 620 according to the second embodiment.
- the appliance state management DB 620 according to the second embodiment holds appliance IDs, appliance names <appliance>, ongoing cooking program IDs, ongoing cooking step IDs, cookware names <equipment>, menu names <menu>, and appliance operating states in association with one another.
- the appliance state management DB 620 according to the second embodiment is different from the appliance state management DB 620 according to the first embodiment illustrated in FIG. 17 in that no gateway ID is held.
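- For illustration only, one row of this DB could be modeled as follows (a sketch with assumed field names, not the disclosed schema):

```python
from dataclasses import dataclass

@dataclass
class ApplianceStateRecord:
    # One row of the appliance state management DB 620 in the second
    # embodiment; note the absence of a gateway ID field.
    appliance_id: str
    appliance_name: str       # <appliance>
    cooking_program_id: str   # ongoing cooking program ID
    cooking_step_id: str      # ongoing cooking step ID
    cookware_name: str        # <equipment>
    menu_name: str            # <menu>
    operating_state: str
```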
- FIGS. 30A and 30B are sequence diagrams illustrating the operation of the speech-based appliance control system according to the second embodiment.
- substantially the same steps as those in the operation of the speech-based appliance control system according to the first embodiment illustrated in FIGS. 21 to 23 are assigned the same numerals.
- the processes executed by the cloud server 111 in the first embodiment are executed by the integrated management device 2800 in the second embodiment ( FIGS. 30A and 30B ).
- in the first embodiment, the display terminal 260 and the cooking appliance 400 transmit data to the gateway 102; in the second embodiment, the display terminal 260 and the cooking appliance 400 transmit data to the integrated management device 2800.
- The operation of the speech-based appliance control system according to the first embodiment (FIGS. 21 to 23) and the operation of the speech-based appliance control system according to the second embodiment (FIGS. 30A and 30B) are the same except for the points described above.
- FIG. 31 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the second embodiment.
- substantially the same steps as those in the process of the speech-based appliance control system according to the first embodiment illustrated in FIG. 25 are assigned the same numerals.
- the processes executed by the cloud server 111 in the first embodiment ( FIG. 25 ) are executed by the integrated management device 2800 in the second embodiment ( FIG. 31 ).
- transmission and reception processes between the cloud server 111 and the gateway 102, which are performed in the first embodiment (FIG. 25), are not performed, and the received data analysis process performed by the gateway 102 is not performed.
- in the first embodiment, the audio input and output device 240 transmits data to the gateway 102; in the second embodiment, the audio input and output device 240 transmits data to the integrated management device 2800.
- the processes executed by the cloud server 111 in the first embodiment are executed by the integrated management device 2800 .
- the other points are similar to those in the first embodiment, and thus the advantages achievable by the second embodiment are similar to the advantages achievable by the first embodiment.
- the present disclosure is not limited to these embodiments.
- FIG. 32 is a block diagram illustrating another example of the hardware configuration of the cooking appliance 400 .
- the memory 440 in the cooking appliance 400 additionally includes the appliance state management DB 620 .
- the cooking appliance 400 holds information stored in the appliance state management DB 620 . Accordingly, upon receipt of a control command including a cooking program ID, the CPU 430 of the cooking appliance 400 can identify an appliance ID from the information stored in the appliance state management DB 620 . Thus, the appliance control unit 910 in the cooking appliance 400 can activate a cooking unit (e.g., the heater 2 of the IH cooker 244 ) corresponding to the identified appliance ID in accordance with the control command.
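- The appliance-side flow described above can be sketched as follows (an illustrative sketch only; the program IDs, mapping, and callback are assumptions, not the disclosed interface):

```python
# Hypothetical mapping held in the local appliance state management DB 620:
# cooking program ID -> cooking unit of the cooking appliance 400.
PROGRAM_TO_UNIT = {"P001": "heater 1", "P002": "heater 2"}

def handle_control_command(command, activate_unit):
    """Resolve the target cooking unit from the cooking program ID in the
    received control command, then activate that unit (e.g. the heater 2
    of the IH cooker 244) to carry out the requested operation."""
    unit = PROGRAM_TO_UNIT.get(command["cooking_program_id"])
    if unit is not None:
        activate_unit(unit, command["operation"])

handle_control_command({"cooking_program_id": "P002", "operation": "low heat"},
                       lambda unit, op: print(unit, op))  # -> heater 2 low heat
```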
- the display terminal 260 may include the audio input and output device 240 .
- each of the home electric appliances 101 may include the audio input and output device 240 and/or the display terminal 260 .
- each of the home electric appliances 101 may include the integrated management device 2800 .
- the speech-based interactive software agent may be mounted in each cooking appliance, or may be installed in a house so that home cooking appliances or home electric appliances are uniformly controllable.
- the speech-based interactive software agent is configured to be able to access each of the home electric appliances and vice versa.
- the speech-based interactive software agent is sometimes referred to simply as “the agent”.
- the agent notifies the user that the user has to thaw frozen beef, which is an ingredient, and instructs the user to put the frozen beef in the microwave oven and press a thaw button.
- the agent accesses the microwave oven to check that the thaw button has been pressed.
- the agent may also check that the thaw button has been pressed, by receiving a notification from the user that the user has completed carrying out the thawing instructions.
- the agent instructs the user to boil water using the IH cooker to make boiled vegetables which will accompany the roast beef.
- the user pours water into a pot and activates the IH cooker.
- the agent accesses the IH cooker to check that the IH cooker has been activated.
- for example, the press of a boiling-water button or the like on the IH cooker is detected, or, more simply, the activation of the IH cooker may be detected. Alternatively, the user may inform the agent that the water has started to boil.
- a gas stove, an electronic kettle, or any other suitable cooking equipment may be used.
- the agent instructs the user to prepare vegetables for boiling.
- the agent may give detailed instructions, such as how to cut or slice the vegetables, to the user.
- the user informs the agent that the preparation has been completed.
- the microwave oven and the IH cooker notify the agent of completion of the action. If access between the microwave oven or the IH cooker and the agent is not available, the user may inform the agent of completion of the action.
- After checking the microwave oven and the IH cooker and detecting the completion of the thawing of the frozen beef or the boiling of the water, the agent gives further instructions to the user to perform the subsequent operation.
- Examples of the instructions include taking the beef out of the microwave oven and preparing the beef.
- Other examples of the instructions include putting the vegetables into the pot filled with boiling water on the IH cooker and boiling the vegetables for, for example, 10 minutes. Placing the vegetables into the pot temporarily reduces the temperature of the water in the pot.
- the agent may detect a reduction in temperature, and start a timer to run for 10 minutes after the detection to notify the user when to lift the vegetables out of the pot.
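- A minimal sketch of this timer behavior, assuming a hypothetical notify_user callback:

```python
import threading

def on_temperature_drop(notify_user, minutes=10):
    # Hypothetical handler: called when the pot temperature drops (the
    # vegetables went in); after the given time, tell the user to lift
    # the vegetables out of the pot.
    timer = threading.Timer(minutes * 60, notify_user,
                            args=("Lift the vegetables out of the pot.",))
    timer.start()
    return timer
```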
- the agent may also prompt the user to preheat the oven to, for example, 250 degrees at the same time instructions are given to prepare the beef.
- After detecting the completion of the preparation of the beef, the agent instructs the user to put the beef into the oven. Here, the user may inform the agent of the completion of the preparation of the beef.
- the agent activates the oven. For example, the agent causes the oven to perform operations including roasting the beef at 250 degrees for 15 minutes, and then turning the temperature of the oven down to 160 degrees and roasting the beef for a further 40 minutes.
- the agent may perform automatic setting of the operations on the oven, or may instruct the user to operate the oven.
- the agent may use the timer to notify the user, at the appropriate time, when to turn down the temperature of the oven. Alternatively, the user may be prompted to set in advance a temperature control program that turns the temperature of the oven down to 160 degrees after 15 minutes.
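- Purely as an illustration, the two-step roasting operation described above could be expressed as a preset temperature control program (all names here are assumptions):

```python
import time

# Hypothetical preset: (oven temperature in degrees, duration in minutes).
ROAST_BEEF_PROGRAM = [(250, 15), (160, 40)]

def run_oven_program(set_temperature, program=ROAST_BEEF_PROGRAM):
    # set_temperature is an assumed appliance-control callback.
    for degrees, minutes in program:
        set_temperature(degrees)
        time.sleep(minutes * 60)  # hold this step until the next one is due
```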
- the agent instructs the user to make a sauce from the drippings left over on a roasting tray in the oven. For example, the agent instructs the user to pour the meat juices into a saucepan or a frying pan and to heat the meat juices using the IH cooker. In this case, the agent may automatically set the heating program for the IH cooker to make a sauce, or may instruct the user at an appropriate time to change the heating level. Additionally, instructions for spices and similar items to be added to the meat juices are also given as necessary.
- After detecting the completion of the sauce, the agent instructs the user to arrange the boiled vegetables and the roast beef on a plate. After that, the cooking of the recipe is completed.
- a speech-based interactive software agent is capable of checking the progress of the cooking by receiving a notification of an operation check from each cooking appliance, and is capable of giving instructions to the user in accordance with the progress.
- the processes described above are not necessarily limited to cooking, and may also be applied to other activities that use a plurality of home electric appliances.
- processing units included in the speech-based appliance control systems according to the embodiments described above are typically implemented as large-scale integrated circuits (LSIs), which are integrated circuits. These processing units may be implemented as individual chips, or some or all of them may be integrated into a single chip.
- the approach of fabricating an integrated circuit is not limited to an LSI technology, and may be implemented by a dedicated circuit or a general-purpose processor.
- a field programmable gate array (FPGA) that is programmable after an LSI is fabricated or a reconfigurable processor capable of reconfiguring the connection or setting of circuit cells in the LSI may be used.
- each constituent element may be implemented in dedicated hardware or may be implemented by the execution of a software program suitable for the constituent element.
- Each constituent element may be implemented by reading and executing, with a program executing unit such as a CPU or a processor, a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- One embodiment of the present invention may be implemented as the program described above, or may be implemented as a non-transitory computer-readable recording medium storing the program described above. It is to be understood that the program described above may be distributed via a transmission medium such as the Internet.
- connection relationships between constituent elements are illustrative in order to clarify the present invention, and the connection relationships that achieve the functions in the present invention are not limited to the illustrated connection relationships.
- a plurality of function blocks may be implemented as a single function block, a single function block may be divided into a plurality of pieces, or some functions may be transferred to other function blocks. Additionally, the functions of a plurality of function blocks having similar functions may be processed by a single hardware or software component in parallel or time-sharing fashion.
- FIG. 33 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 1 (local-data-center-based cloud service).
- the service provider 4120 acquires information from the group 4100 , and provides a service to a user.
- the service provider 4120 has the function of a data center management company. That is, the service provider 4120 owns a cloud server 4203 that manages big data. Accordingly, no separate data center management company exists in this service model.
- the service provider 4120 operates and manages the data center (cloud server) 4203 .
- the service provider 4120 manages an operating system (OS) 4202 and an application 4201 .
- the service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the service provider 4120 (arrow 204 ).
- FIG. 34 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 2 (IaaS-based cloud service).
- IaaS is an acronym for Infrastructure as a Service, and is a cloud service providing model that provides, as a service via the Internet, an infrastructure itself to build and run a computer system.
- the data center management company 4110 operates and manages the data center (cloud server) 4203 .
- the service provider 4120 manages the OS 4202 and the application 4201 .
- the service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the service provider 4120 (arrow 204 ).
- FIG. 35 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 3 (PaaS-based cloud service).
- PaaS is an acronym for Platform as a Service, and is a cloud service providing model that provides, as a service via the Internet, a platform which provides a foundation to build and run software.
- the data center management company 4110 manages the OS 4202 , and operates and manages the data center (cloud server) 4203 .
- the service provider 4120 manages the application 4201 .
- the service provider 4120 provides a service using the OS 4202 managed by the data center management company 4110 and the application 4201 managed by the service provider 4120 (arrow 204 ).
- FIG. 36 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 4 (SaaS-based cloud service).
- SaaS is an acronym for Software as a Service.
- the SaaS-based cloud service is a cloud service providing model in which, for example, a user such as a company or an individual who does not own a data center (cloud server) can use, via a network such as the Internet, an application provided by a platform provider that owns a data center (cloud server).
- the data center management company 4110 manages the application 4201 , manages the OS 4202 , and operates and manages the data center (cloud server) 4203 .
- the service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the data center management company 4110 (arrow 204 ).
- the service provider 4120 provides a service in any of the cloud service models described above.
- the service provider 4120 or the data center management company 4110 may develop the OS 4202 , the application 4201 , a database for big data, or the like by itself, or may outsource the development to a third party.
- the present disclosure is applicable to a speech-based appliance control system that controls a cooking appliance by using speech, and to the cooking appliance used in the speech-based appliance control system.
Description
- 1. Field of the Invention
- The present disclosure relates to an appliance control method, a speech-based appliance control system, and a cooking appliance.
- 2. Description of the Related Art
- An example of background art is disclosed in Japanese Unexamined Patent Application Publication No. 2002-91491 (hereinafter referred to as Patent Literature 1).
Patent Literature 1 discloses a speech-based control system for a plurality of appliances that locates the direction from which an utterance made by a user originates to achieve an improvement in the recognition rate of the target appliance to be controlled.
- However, further improvements are needed in the speech-based control system disclosed in Patent Literature 1.
- In one general aspect, the techniques disclosed here feature a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system. The speech-based appliance control system includes the cooking appliance, which includes a first cooking unit and a second cooking unit, and an audio input device configured to receive input of the user's speech. In the method, in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance when the first cooking unit and the second cooking unit are executing a first cooking program and a second cooking program, respectively, the operation instructions are recognized from the first audio information. In a case where it is determined that the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information, a control command is transmitted to the cooking appliance to cause the cooking appliance to execute a process corresponding to the operation instructions, without executing a process according to the first cooking program or the second cooking program corresponding to the one of the first cooking menu information or the second cooking menu information to which the second audio information is related.
- According to the aspect described above, further improvements may be achieved.
- FIG. 1A is a diagram illustrating an overview of a service provided by a speech-based appliance control system according to an embodiment;
- FIG. 1B is a diagram illustrating an example of a data center management company that is an appliance manufacturer;
- FIG. 1C is a diagram illustrating an example of a data center management company that is one or both of an appliance manufacturer and a management company;
- FIG. 2 is a configuration diagram of the speech-based appliance control system according to a first embodiment;
- FIG. 3 is a diagram illustrating the hardware configuration of an audio input and output device according to the first embodiment and the second embodiment;
- FIG. 4 is a diagram illustrating the hardware configuration of a cooking appliance according to the first embodiment and the second embodiment;
- FIG. 5 is a diagram illustrating the hardware configuration of a display terminal according to the first embodiment and the second embodiment;
- FIG. 6 is a diagram illustrating the hardware configuration of a gateway according to the first embodiment;
- FIG. 7 is a diagram illustrating the hardware configuration of a cloud server according to the first embodiment;
- FIG. 8 is a diagram illustrating the system configuration of the audio input and output device according to the first embodiment and the second embodiment;
- FIG. 9 is a diagram illustrating the system configuration of the cooking appliance according to the first embodiment and the second embodiment;
- FIG. 10 is a diagram illustrating the system configuration of the gateway according to the first embodiment;
- FIG. 11 is a diagram illustrating the system configuration of the cloud server according to the first embodiment;
- FIG. 12 is a flowchart illustrating an example of the operation of an utterance interpretation unit;
- FIG. 13A is a diagram illustrating an example of an utterance interpretation dictionary DB;
- FIG. 13B is a diagram illustrating the example of the utterance interpretation dictionary DB;
- FIG. 14 is a diagram illustrating an example of context data extracted by the utterance interpretation unit;
- FIG. 15A is a flowchart illustrating an example of the operation of a state management unit;
- FIG. 15B is a flowchart illustrating the example of the operation of the state management unit;
- FIG. 16 is a flowchart illustrating an example of the operation of a response generation unit;
- FIG. 17 is a diagram illustrating a specific example of an appliance state management DB according to the first embodiment;
- FIG. 18 is a diagram illustrating a specific example of an appliance function DB according to the first embodiment;
- FIG. 19A is a diagram illustrating an example of a menu list included in a cooking program DB;
- FIG. 19B is a diagram illustrating an example of a cooking step list included in the cooking program DB;
- FIG. 20A is a diagram illustrating an example of an error message list included in the cooking program DB;
- FIG. 20B is a diagram illustrating an example of a display screen of the display terminal;
- FIG. 21 is a sequence diagram illustrating the operation of a speech-based appliance control system according to the first embodiment;
- FIG. 22 is a sequence diagram illustrating the operation of the speech-based appliance control system according to the first embodiment;
- FIG. 23 is a sequence diagram illustrating the operation of the speech-based appliance control system according to the first embodiment;
- FIG. 24A is a diagram illustrating an example of a menu selection screen displayed on the display terminal;
- FIG. 24B is a diagram illustrating an example of the menu selection screen displayed on the display terminal;
- FIG. 24C is a diagram illustrating an example of the menu selection screen displayed on the display terminal;
- FIG. 25 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the first embodiment;
- FIG. 26 is a flowchart illustrating a cooking program management process executed in S2201 of FIG. 22;
- FIG. 27 is a diagram illustrating the configuration of a speech-based appliance control system according to a second embodiment;
- FIG. 28A is a block diagram illustrating the hardware configuration of an integrated management device;
- FIG. 28B is a block diagram illustrating the system configuration of the integrated management device;
- FIG. 29 is a diagram illustrating a specific example of an appliance state management DB according to the second embodiment;
- FIG. 30A is a sequence diagram illustrating the operation of the speech-based appliance control system according to the second embodiment;
- FIG. 30B is a sequence diagram illustrating the operation of the speech-based appliance control system according to the second embodiment;
- FIG. 31 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the second embodiment;
- FIG. 32 is a block diagram illustrating the hardware configuration of a cooking appliance according to another embodiment;
- FIG. 33 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 1 (local-data-center-based cloud service);
- FIG. 34 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 2 (IaaS-based cloud service);
- FIG. 35 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 3 (PaaS-based cloud service); and
- FIG. 36 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 4 (SaaS-based cloud service).
- The inventors have found that the technique described in Patent Literature 1 given above has the following problem.
- Patent Literature 1 discloses a speech-based control system including appliances to be controlled, and microphones respectively placed near the appliances for detecting user speech, and provides the following technique: Audio data detected by the microphones is collected by an audio collecting means. The content of the audio data input to the audio collecting means is analyzed by a speech recognition means. The direction from which an utterance made by the user originates is located by a distribution analysis means using the amplitude of the audio data input to the audio collecting means. An appliance to be controlled and the content of the operation to be performed on the appliance are determined by an inference means on the basis of the content of the audio data analyzed by the speech recognition means and the direction located by the distribution analysis means. A control signal is issued to the appliance to be controlled on the basis of the appliance and the content of the operation which are determined by the inference means.
- Naturally, a user gives instructions to a target appliance by using speech with their face directed toward the target appliance. Because of this natural behavior, the configuration described above can determine which of a plurality of appliances the user has given instructions to.
- In recent years, cooking appliances in which a plurality of tasks are feasible, for example, an induction-heating (IH) cooker including a plurality of heating units or a microwave oven having a top-rack heating unit and a bottom-rack heating unit that are capable of executing different heating programs, have become widely used. In these cooking appliances, the distances between the heating units are short. Thus, it is difficult to distinguish differences in the direction from which the audio instructions that a user gives to each of the heating units originate. Accordingly, there is a problem with the technique disclosed in Patent Literature 1 in that it is difficult to use speech to give such cooking appliances instructions that specify one of a plurality of tasks.
- To address the problem described above, the inventors have developed the following solution.
- A first aspect of the present disclosure provides a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system including the cooking appliance and an audio input device configured to receive input of the user's speech, the cooking appliance including a first cooking unit and a second cooking unit. The method includes transmitting, to the cooking appliance via a first network, first cooking program information indicating a first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program, the second cooking program corresponding to a second cooking recipe; in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognizing the operation instructions from the first audio information; determining, using a database configured to manage first cooking menu information indicating the name of a cooking menu item corresponding to the first cooking recipe and second cooking menu information indicating the name of a cooking menu item corresponding to the second cooking recipe, whether or not the received instruction information includes the first audio information and second audio information related to the first cooking menu information or the second cooking menu information; and in a case where it is determined that the received instruction information includes the second audio information, transmitting to the cooking appliance via the first network a control command for causing the cooking appliance to execute a process corresponding to the operation instructions, without executing a process that is based on the first cooking program or the second cooking program corresponding to the one of the first cooking menu information or the second cooking menu information to which the second audio information is related.
- According to this aspect, first cooking program information indicating a first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program corresponding to a second cooking recipe are transmitted to a cooking appliance via a first network. In response to receipt of instruction information including first audio information indicating operation instructions for the cooking appliance from an audio input device while a first cooking unit in the cooking appliance is executing a process that is based on a first cooking program and a second cooking unit in the cooking appliance is executing a process that is based on a second cooking program, the operation instructions are recognized from the first audio information.
- Using a database configured to manage first cooking menu information indicating the name of a cooking menu item corresponding to the first cooking recipe and second cooking menu information indicating the name of a cooking menu item corresponding to the second cooking recipe, it is determined whether or not the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information. In a case where it is determined that the instruction information includes the second audio information, a control command is transmitted to the cooking appliance via the first network to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cooking menu information to which the second audio information is related.
- Accordingly, for example, in a case where it is determined that the instruction information includes second audio information related to the first cooking menu information, the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program. For example, in a case where it is determined that the instruction information includes second audio information related to the second cooking menu information, the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program. As a result, operation instructions for the first cooking unit or the second cooking unit in the cooking appliance may be accurately executed.
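- The determination described in this aspect can be summarized, purely as an illustrative sketch (the data structures and names are assumptions, not the claimed implementation), as follows:

```python
# Hypothetical mapping from cooking menu names (first and second cooking
# menu information) to the cooking units executing the corresponding programs.
MENU_TO_UNIT = {"chikuzen-ni stew": "first cooking unit",
                "hamburger steak": "second cooking unit"}

def build_control_command(instruction_text, operation):
    """If the instruction information contains second audio information
    related to a cooking menu name, target that menu's cooking unit."""
    for menu_name, unit in MENU_TO_UNIT.items():
        if menu_name in instruction_text:
            return {"target_unit": unit, "operation": operation}
    return None  # no second audio information: see the error-message case below

print(build_control_command("turn the chikuzen-ni stew down", "low heat"))
# -> {'target_unit': 'first cooking unit', 'operation': 'low heat'}
```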
- In the first aspect, for example, in a case where it is determined that the instruction information does not include the second audio information, an error message indicating that the process corresponding to the operation instructions is not executable at the cooking appliance may be provided to the user of the cooking appliance.
- According to this aspect, in a case where it is determined that the instruction information does not include the second audio information, an error message indicating that the process corresponding to the operation instructions is not executable is provided to a user. This allows the user to once again make instructions by using speech.
- In the first aspect, for example, the speech-based appliance control system may be configured to be further connected to a display device, and the error message may be displayed on the display device.
- According to this aspect, an error message is displayed on a display of a display device. This allows the user to visually check that the process corresponding to the operation instructions is not executable.
- In the first aspect, for example, the speech-based appliance control system may be configured to be further connected to an audio output device configured to output audio, and the error message may be provided to a user of the cooking appliance using the audio output device.
- According to this aspect, an error message is provided to a user using an audio output device. This allows the user to auditorily check that the process corresponding to the operation instructions is not executable.
- In the first aspect, for example, in a case where it is determined that the instruction information does not include the second audio information, the process corresponding to the operation instructions may be executed instead of processes that are based on all the programs including the first cooking program and the second cooking program that are being executed in the cooking appliance.
- According to this aspect, in a case where it is determined that the instruction information does not include the second audio information, the process corresponding to the operation instructions is executed instead of processes that are based on all the programs including the first cooking program and the second cooking program that are being executed in the cooking appliance. Since it is determined that the instruction information does not include the second audio information, it is difficult to determine which of the first cooking unit and the second cooking unit the operation instructions are directed to. In this case, according to this aspect, both the first cooking unit and the second cooking unit execute the process corresponding to the operation instructions instead of executing the respective processes that are executed in accordance with the first cooking program and the second cooking program. As a result, at least operation instructions for the cooking appliance may be executed.
- In the first aspect, for example, the operation instructions may be used for interrupting the process that is based on the first cooking program or the second cooking program in the cooking appliance.
- According to this aspect, the operation instructions are used for interrupting a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program. Thus, the execution of a process in the first cooking unit or the second cooking unit in accordance with the first cooking program or the second cooking program is interrupted.
- In the first aspect, for example, the operation instructions may be used for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- According to this aspect, the operation instructions are instructions for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program. Thus, a process having a cooking parameter different from a process that is being executed in the first cooking unit or the second cooking unit in accordance with the first cooking program or the second cooking program is executed.
- In the first aspect, for example, the speech-based appliance control system may be configured to further include a display device, display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe may be transmitted to the display device via a second network, and first cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device may be received from the display device via the second network.
- According to this aspect, display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe is transmitted to a display device via a second network. First cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device are received from the display device via the second network. This allows the user to recognize that the first cooking recipe and the second cooking recipe have been selected on the display device.
- In the first aspect, for example, the first cooking program information may be transmitted to the cooking appliance via the display device in response to receipt of the first cooking recipe selection information from the display device, and the second cooking program information may be transmitted to the cooking appliance via the display device in response to receipt of the second cooking recipe selection information from the display device.
- According to this aspect, the first cooking program information and the second cooking program information are transmitted to the cooking appliance via the display device in response to receipt of the first cooking recipe selection information and the second cooking recipe selection information from the display device, respectively. Thus, the display device may be used for both the selection of the first cooking recipe and the second cooking recipe and the transmission of the first cooking program information and the second cooking program information to the cooking appliance.
- In the first aspect, for example, the database may include correspondence relationship information indicating a first correspondence relationship between the first cooking unit and the first cooking program and a second correspondence relationship between the second cooking unit and the second cooking program, a cooking unit that is to execute the process corresponding to the operation instructions may be specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information, and the control command may include specific-cooking-unit information indicating the specified one of the first cooking unit and the second cooking unit.
- According to this aspect, the database includes correspondence relationship information indicating a correspondence relationship between the first cooking unit and the first cooking program and a correspondence relationship between the second cooking unit and the second cooking program. A cooking unit that is to execute the process corresponding to the operation instructions is specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information. The control command includes specific-cooking-unit information indicating the specified cooking unit. Thus, the process corresponding to the operation instructions is executed in a cooking unit specified from among the first cooking unit and the second cooking unit on the basis of the specific-cooking-unit information, instead of a process that is executed in accordance with the corresponding cooking program. As a result, operation instructions for the first cooking unit or the second cooking unit in the cooking appliance may be accurately executed.
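- For illustration only (the IDs and structures below are assumed), the correspondence relationship information and the resulting specific-cooking-unit information carried by the control command might look like this:

```python
# Hypothetical correspondence relationship information: each cooking
# program is associated with the cooking unit that executes it.
CORRESPONDENCE = {"first cooking program": "first cooking unit",
                  "second cooking program": "second cooking unit"}

def command_with_unit(cooking_program, operation):
    # The control command carries specific-cooking-unit information so the
    # cooking appliance knows which unit executes the operation instructions.
    return {"specific_cooking_unit": CORRESPONDENCE[cooking_program],
            "operation": operation}
```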
- In the first aspect, for example, the cooking appliance may be configured to manage the correspondence relationship information, a cooking unit that is to execute the process corresponding to the operation instructions may be specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information and the specific-cooking-unit information, and the specified cooking unit may be caused to execute the process corresponding to the operation instructions.
- According to this aspect, the cooking appliance is configured to manage correspondence relationship information indicating a correspondence relationship between the first cooking unit and the first cooking program and a correspondence relationship between the second cooking unit and the second cooking program. A cooking unit that is to execute the process corresponding to the operation instructions is specified from among the first cooking unit and the second cooking unit on the basis of the correspondence relationship information and the specific-cooking-unit information. The specified cooking unit executes the process corresponding to the operation instructions. Thus, the operation instructions transmitted to the cooking appliance may be accurately executed in the first cooking unit or the second cooking unit.
- In the first aspect, for example, the display device may be included in the cooking appliance.
- In the first aspect, for example, the display device may be included in an appliance that is different from the cooking appliance.
- A second aspect of the present disclosure provides a method for controlling a cooking appliance using a user's speech in a speech-based appliance control system, the speech-based appliance control system including the cooking appliance and an audio input device configured to receive input of the user's speech, the cooking appliance including a first cooking unit and a second cooking unit. The appliance control method includes transmitting, to the cooking appliance via a first network, first cooking program information indicating a first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program different from the first cooking program, the second cooking program corresponding to a second cooking recipe; in a case of receiving, from the audio input device, instruction information including first audio information indicating operation instructions for the cooking appliance when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognizing the operation instructions from the first audio information; determining, using a database configured to manage first cookware information indicating the name of a cookware item used in the first cooking recipe and second cookware information indicating the name of a cookware item used in the second cooking recipe, whether or not the instruction information includes second audio information related to the first cookware information or the second cookware information; and in a case where it is determined that the instruction information includes the first audio information and the second audio information, transmitting to the cooking appliance via the first network a control command for causing the cooking appliance to execute a process corresponding to the operation instructions without executing a process that is based on the first cooking program or the second cooking program corresponding to one of the first cookware information or the second cookware information to which the second audio information is related.
- According to this aspect, first cooking program information indicating a first cooking program corresponding to a first cooking recipe, and second cooking program information indicating a second cooking program corresponding to a second cooking recipe are transmitted to a cooking appliance via a first network. In response to receipt of instruction information including first audio information indicating operation instructions for the cooking appliance from an audio input device while a first cooking unit in the cooking appliance is executing a process that is based on a first cooking program and a second cooking unit in the cooking appliance is executing a process that is based on a second cooking program, the operation instructions are recognized from the first audio information.
- Using a database configured to manage first cookware information indicating the name of a cookware item used in the first cooking recipe and second cookware information indicating the name of a cookware item used in the second cooking recipe, it is determined whether or not the instruction information includes second audio information related to the first cookware information or the second cookware information. In a case where it is determined that the instruction information includes the second audio information, a control command is transmitted to the cooking appliance via the first network to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cookware information to which the second audio information is related.
- Accordingly, for example, in a case where it is determined that the instruction information includes second audio information related to the first cookware information, the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program. For example, in a case where it is determined that the instruction information includes second audio information related to the second cookware information, the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program. As a result, operation instructions for the first cooking unit or the second cooking unit in the cooking appliance may be accurately executed.
- In the second aspect, for example, in a case where it is determined that the instruction information does not include the second audio information, an error message indicating that the process corresponding to the operation instructions is not executable at the cooking appliance may be provided to the user of the cooking appliance.
- In the second aspect, for example, the speech-based appliance control system may be configured to be further connected to a display device, and the error message may be displayed on the display device.
- In the second aspect, for example, the speech-based appliance control system may be configured to further include an audio output device configured to output audio, and the error message may be provided to the user of the cooking appliance using the audio output device.
- In the second aspect, for example, in a case where it is determined that the instruction information does not include the second audio information, the process corresponding to the operation instructions may be executed instead of processes that are based on all the programs including the first cooking program and the second cooking program that are being executed in the cooking appliance.
- In the second aspect, for example, the operation instructions may be used for interrupting a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- In the second aspect, for example, the operation instructions may be used for executing a process having a cooking parameter different from a process that is being executed in the cooking appliance in accordance with the first cooking program or the second cooking program.
- In the second aspect, for example, the speech-based appliance control system may be configured to further include a display device, display screen information indicating a display screen that provides two or more cooking recipes including the first cooking recipe and the second cooking recipe may be transmitted to the display device via a second network, and first cooking recipe selection information indicating that the first cooking recipe has been selected on the display device, and second cooking recipe selection information indicating that the second cooking recipe has been selected on the display device may be received from the display device via the second network.
- In the second aspect, for example, the first cooking program information may be transmitted to the cooking appliance via the display device in a case of receiving the first cooking recipe selection information from the display device, and the second cooking program information may be transmitted to the cooking appliance via the display device in a case of receiving the second cooking recipe selection information from the display device.
- In the second aspect, for example, the database may include correspondence relationship information indicating a first correspondence relationship between the first cooking unit and the first cooking program and a second correspondence relationship between the second cooking unit and the second cooking program, the one of the first cooking unit or the second cooking unit that is to execute the process corresponding to the operation instructions may be specified on the basis of the correspondence relationship information, and the control command may include specific-cooking-unit information indicating the specified one of the first cooking unit and the second cooking unit.
- In the second aspect, for example, the process corresponding to the operation instructions may be executed at the one of the first cooking unit or the second cooking unit specified on the basis of the specific-cooking-unit information.
- In the second aspect, for example, the display device may be included in the cooking appliance.
- In the second aspect, for example, the display device may be included in an appliance that is different from the cooking appliance.
- A third aspect of the present disclosure provides a speech-based appliance control system including a cooking appliance having a first cooking unit and a second cooking unit, an audio input device configured to receive input of a user's speech, and a server connectable to the cooking appliance and the audio input device. The cooking appliance is controlled using the user's speech. The cooking appliance includes a first communication unit configured to receive from the server first cooking program information indicating the first cooking program, the first cooking program corresponding to a first cooking recipe, and second cooking program information indicating the second cooking program, the second cooking program corresponding to a second cooking recipe, and a second control unit configured to cause the first cooking unit to operate based on a first cooking program, and configured to cause the second cooking unit to operate based on the second cooking program. The audio input device includes an audio acquisition unit configured to acquire instruction information including first audio information indicating operation instructions for the cooking appliance, and a second communication unit configured to transmit the acquired instruction information to the server. The server includes a third communication unit configured to transmit the first cooking program information and the second cooking program information to the cooking appliance, a database configured to manage first cooking menu information indicating the name of a cooking menu item corresponding to the first cooking recipe and second cooking menu information indicating the name of a cooking menu item corresponding to the second cooking recipe, a determination unit configured to, in a case of receiving the instruction information from the audio input device when the first cooking unit is operated based on the first cooking program and the second cooking unit is operated based on the second cooking program, recognize the operation instructions from the first audio information included in the received instruction information and determine whether or not the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information, and a fourth communication unit configured to, in a case where it is determined that the instruction information includes the second audio information, transmit to the cooking appliance a control command for causing the cooking appliance to execute a process corresponding to the operation instructions, without executing a process that is executed in accordance with the first cooking program or the second cooking program corresponding to one of the first cooking menu information or the second cooking menu information to which the second audio information is related.
- According to this aspect, in response to receipt of instruction information including first audio information indicating operation instructions for a cooking appliance from an audio input device while a first cooking unit in the cooking appliance is executing a process that is based on a first cooking program and a second cooking unit in the cooking appliance is executing a process that is based on a second cooking program, the operation instructions are recognized from the first audio information included in the received instruction information. It is determined whether or not the instruction information includes second audio information related to the first cooking menu information or the second cooking menu information.
- In a case where it is determined that the instruction information includes the second audio information, a control command is transmitted to the cooking appliance to cause the cooking appliance to execute a process corresponding to the operation instructions instead of executing a process that is executed in accordance with a cooking program corresponding to the cooking menu information to which the second audio information is related.
- Accordingly, for example, in a case where it is determined that the instruction information includes second audio information related to the first cooking menu information, the first cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the first cooking program. For example, in a case where it is determined that the instruction information includes second audio information related to the second cooking menu information, the second cooking unit executes the process corresponding to the operation instructions instead of executing a process that is executed in accordance with the second cooking program. As a result, operation instructions for the first cooking unit or the second cooking unit in the cooking appliance may be accurately executed.
- A fourth aspect of the present disclosure provides a cooking appliance used in the speech-based appliance control system according to the third aspect.
- The above-described generic or specific aspects may be implemented by a system, a method, an integrated circuit, a computer program, or a computer-readable recording medium such as a compact disc read-only memory (CD-ROM), or may be implemented by any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.
- Embodiments will be described in detail hereinafter with reference to the drawings.
- The embodiments described hereinafter provide specific examples of the present disclosure. The values, shapes, constituent elements, steps, the orders of the steps, etc., given in the following embodiments are illustrative, and are not intended to limit the present disclosure. In addition, among the constituent elements in the following embodiments, a constituent element not recited in any of the independent claims indicating the most generic concept of the present disclosure is described as optional. In addition, every embodiment disclosed herein may be combined with every other embodiment of the present disclosure.
- First, an overview of a service provided by a speech-based appliance control system according to embodiments disclosed herein will be described.
- FIG. 1A is a diagram illustrating an overview of a service provided by a speech-based appliance control system according to embodiments disclosed herein. The speech-based appliance control system includes a group 4100, a data center management company 4110, and a service provider 4120.
- The group 4100 may indicate, for example, a company, an organization, a household, or the like, whatever its size. The group 4100 includes a plurality of home electric appliances 101 including a first home electric appliance and a second home electric appliance, and a home gateway 4102. The plurality of home electric appliances 101 includes an appliance that is capable of accessing the Internet (such as a smartphone, a personal computer (PC), or a television receiver). The plurality of home electric appliances 101 further includes an appliance that is incapable of accessing the Internet by itself (such as lighting, a washing machine, or a refrigerator). The plurality of home electric appliances 101 may include an appliance that is incapable of accessing the Internet by itself but is capable of accessing the Internet via the home gateway 4102. Users 4200 use the plurality of home electric appliances 101 in the group 4100.
- The data center management company 4110 includes a cloud server 4111. The cloud server 4111 is a virtual server that builds a cooperative relationship with various appliances over the Internet. The cloud server 4111 mainly manages vast volumes of data (or "big data") that are difficult to handle with traditional database management tools or the like. The data center management company 4110 engages in business activities such as operating a data center that manages data and that manages the cloud server 4111. The details of the activities that the data center management company 4110 undertakes are described below.
- It is noted here that the data center management company 4110 is not limited to a company engaging in business activities such as operating a data center that manages data and that manages the cloud server 4111.
- FIGS. 1B and 1C are diagrams illustrating examples of the data center management company 4110. For example, as illustrated in FIG. 1B, in a case where an appliance manufacturer that develops or manufactures one of the plurality of home electric appliances 101 performs activities such as managing data or managing the cloud server 4111, the appliance manufacturer corresponds to the data center management company 4110. Note that the data center management company 4110 is not limited to a single company. For example, as illustrated in FIG. 1C, in a case where an appliance manufacturer and a management company collaborate or share with each other to manage data or manage the cloud server 4111, one or both of them correspond to the data center management company 4110.
- The service provider 4120 includes a server 121. As used herein, the server 121 may be of any scale and may include, for example, the memory of a personal-use PC. In some cases, the service provider 4120 may not include the server 121. In these cases, the service provider 4120 may include a different device configured to perform the functions of the server 121.
- In the speech-based appliance control system described above, the home gateway 4102 may not necessarily be used. The home gateway 4102 is a device that allows the home electric appliances 101 to access the Internet. Accordingly, in a case where the home electric appliances 101 do not include an appliance that is incapable of accessing the Internet by itself, for example, in a case where all the home electric appliances 101 in the group 4100 are connected to the Internet, the home gateway 4102 is not used.
- The flow of information in the speech-based appliance control system described above will now be described with reference to FIG. 1A.
- First, the first home electric appliance or the second home electric appliance in the group 4100 transmits log information to the cloud server 4111 in the data center management company 4110. The cloud server 4111 collects the log information on the first home electric appliance or the second home electric appliance (arrow 131 in FIG. 1A). The log information may be information indicating, for example, the operating state or operation date and time of the plurality of home electric appliances 101. The log information includes, for example, the viewing history of a TV viewer, scheduled recording information on a recorder, the date and time when a washing machine runs, the amount of laundry, the date and time when a refrigerator door opens and closes, and the number of times the refrigerator door opens and closes. The log information is not limited to the information described above, and may include a variety of pieces of information available from the home electric appliances 101. The log information may be provided from the plurality of home electric appliances 101 directly to the cloud server 4111 via the Internet. The log information may also temporarily be collected in the home gateway 4102 from the plurality of home electric appliances 101, and may be provided from the home gateway 4102 to the cloud server 4111.
- Then, the cloud server 4111 in the data center management company 4110 provides the collected log information to the service provider 4120 at a constant rate. The "constant rate" may be the unit in which the data center management company 4110 can organize the collected information and provide the information to the service provider 4120, or may be the unit requested by the service provider 4120. Instead of the information being provided at a "constant rate", the amount of information may not necessarily be constant, and, for example, the amount of information that is provided may vary depending on the situation. The log information is saved in the server 121 included in the service provider 4120, if necessary (arrow 132 in FIG. 1A).
- Then, the service provider 4120 organizes the log information into information adapted to a service that is provided to users, and provides the information to the users. The users to whom the information is provided may be the users 4200 who use the plurality of home electric appliances 101, or may be external users 4210. In one method for providing the information to the users, for example, the information may be provided from the service provider 4120 directly to the users 4200 or 4210 (arrows in FIG. 1A). In another method for providing the information to the users 4200, for example, the information may be provided to the users 4200, passing back through the cloud server 4111 in the data center management company 4110 (arrows in FIG. 1A).
- Alternatively, the cloud server 4111 in the data center management company 4110 may organize the log information into information adapted to a service that is provided to users, and may provide the information to the service provider 4120.
- The users 4200 may be identical to or different from the users 4210.
- FIG. 2 is a diagram illustrating the configuration of a speech-based appliance control system according to a first embodiment. The configuration of the speech-based appliance control system according to the first embodiment will be described with reference to FIG. 2.
- The speech-based appliance control system illustrated in FIG. 2 includes an audio input and output device 240, a plurality of home electric appliances 101, a display terminal 260, a gateway 102, an information communication network 220, and a cloud server 111. The home electric appliances 101 include an oven range 243, an induction-heating (IH) cooker 244, and a refrigerator 245. The plurality of home electric appliances 101 may include any other desired appliance instead of or in addition to the oven range 243, the IH cooker 244, and the refrigerator 245.
- The audio input and output device 240 (an example of an audio input device) includes an audio acquisition unit configured to acquire speech from a user 250, and an audio output unit configured to output audio to the user 250. A group 100 is a space within which the audio input and output device 240 can provide information (or a space over which audio interaction is feasible). The group 100 may be, for example, a house of the user 250.
- The audio input and output device 240 recognizes speech of the user 250. The audio input and output device 240 presents audio information and controls the plurality of home electric appliances 101 in accordance with instructions entered by the user 250 through speech. More specifically, the audio input and output device 240 reads content aloud, responds to a question asked by the user 250, and controls the home electric appliances 101 in accordance with instructions entered by the user 250 through speech.
- The display terminal 260 (an example of a display device) has an input function that allows the user 250 to give appliance control instructions, and an information output function that provides information to the user 250. The input function of the display terminal 260 may be implemented by a touch panel or a push button. The display terminal 260 may be a mobile phone, a smartphone, or a tablet device.
- The display terminal 260, the audio input and output device 240, and the plurality of home electric appliances 101 may be connected to the gateway 102 using a wired or wireless connection. Additionally, the audio input and output device 240 and at least one of the plurality of home electric appliances 101 may be integrated into a single unit.
- FIG. 3 is a block diagram illustrating the hardware configuration of the audio input and output device 240. The hardware configuration of the audio input and output device 240 will be described with reference to FIG. 3.
- As illustrated in FIG. 3, the audio input and output device 240 includes a processing circuit 300, an audio collection circuit 301, an audio output circuit 302, and a communication circuit 303. These circuits are connected to one another via a bus 330, and are capable of exchanging data or instructions.
- The processing circuit 300 includes a central processing unit (CPU) 310 and a memory 320. Alternatively, the processing circuit 300 may include dedicated hardware configured to implement the operations described below, instead of the CPU 310 and the memory 320. The memory 320 stores an appliance ID 341 and a computer program 342.
- The appliance ID 341 is an identifier uniquely assigned to the audio input and output device 240. The appliance ID 341 may be independently assigned by a manufacturer, or may be a physical address (a so-called Media Access Control (MAC) address), which is in principle unique on a network.
- The audio collection circuit 301 collects user speech and generates an analog audio signal. The audio collection circuit 301 converts the generated analog audio signal into digital data and then transmits the digital data to the bus 330.
- The audio output circuit 302 converts the digital data received via the bus 330 into an analog audio signal. The audio output circuit 302 outputs the resulting analog audio signal.
- The communication circuit 303 is a circuit that communicates with other devices (e.g., the gateway 102) via a network. The communication circuit 303 performs communication complying with, for example, the Ethernet (registered trademark) standards. The communication circuit 303 transmits log information or ID information generated by the processing circuit 300 to the gateway 102. The communication circuit 303 transmits a signal received from the gateway 102 to the processing circuit 300 via the bus 330.
- The audio input and output device 240 may also include, in addition to the illustrated constituent elements, a constituent element(s) to implement a required function.
- FIG. 4 is a block diagram illustrating the hardware configuration of a cooking appliance 400 that is an example of the home electric appliances 101. The hardware configuration of the cooking appliance 400 will be described with reference to FIG. 4. The oven range 243, the IH cooker 244, and the refrigerator 245 are examples of the cooking appliance 400.
- The cooking appliance 400 includes an input and output circuit 410, a communication circuit 450, and a processing circuit 470. These circuits are connected to one another via a bus 460, and are capable of exchanging data or instructions.
- The processing circuit 470 includes a CPU 430 and a memory 440. Alternatively, the processing circuit 470 may include dedicated hardware configured to implement the operations described below, instead of the CPU 430 and the memory 440. The memory 440 stores an appliance ID 441, a computer program 442, and a cooking program ID 443.
- The appliance ID 441 is an identifier uniquely assigned to the cooking appliance 400. The cooking program ID 443 is an identifier uniquely assigned to a cooking program. The appliance ID 441 and the cooking program ID 443 may be independently assigned by a manufacturer, or may be a physical address (a so-called Media Access Control (MAC) address), which is in principle unique on a network.
- The input and output circuit 410 outputs a result of processing performed by the processing circuit 470. The input and output circuit 410 converts an input analog signal into digital data, and transmits the digital data to the bus 460. For example, in a case where the input and output circuit 410 has a display function, the input and output circuit 410 displays a result of processing performed by the processing circuit 470. In this case, the cooking appliance 400 that includes the input and output circuit 410 (an example of a display device) having a display function may have the function of the display terminal 260.
- The communication circuit 450 is a circuit that communicates with other devices (e.g., the gateway 102) via a network. The communication circuit 450 performs communication complying with, for example, the Ethernet (registered trademark) standards. The communication circuit 450 transmits log information or ID information generated by the processing circuit 470 to the gateway 102. The communication circuit 450 transmits a signal received from the gateway 102 to the processing circuit 470 via the bus 460.
- The cooking appliance 400 may also include, in addition to the illustrated constituent elements, a constituent element(s) to implement a required function.
- FIG. 5 is a block diagram illustrating the hardware configuration of the display terminal 260. As illustrated in FIG. 5, the display terminal 260 includes a display control circuit 500, a display circuit 502, a communication circuit 505, and a processing circuit 510. These circuits are connected to one another via a bus 525, and are capable of exchanging data or instructions.
- The display circuit 502 includes a liquid crystal display and so on. The display circuit 502 displays an image such as an object image including icons or operation buttons, and a text image. The display control circuit 500 controls the operation of the display circuit 502 to display an image on the display circuit 502.
- The communication circuit 505 is a circuit that communicates with other devices (e.g., the audio input and output device 240, the cooking appliance 400, etc.) via a network. The communication circuit 505 performs communication complying with, for example, the Ethernet (registered trademark) standards or near field communication standards. The communication circuit 505 transmits log information or ID information generated by the processing circuit 510 to the audio input and output device 240 or the cooking appliance 400. The communication circuit 505 transmits a signal received from the audio input and output device 240 or the cooking appliance 400 to the processing circuit 510 via the bus 525.
- The processing circuit 510 includes a CPU 515 and a memory 520. Alternatively, the processing circuit 510 may include dedicated hardware configured to implement the operations described below, instead of the CPU 515 and the memory 520. The memory 520 stores a display terminal ID 521, a computer program 522, and a cooking program ID 523. The display terminal ID 521 is an identifier uniquely assigned to the display terminal 260. Similarly to the cooking program ID 443, the cooking program ID 523 is an identifier uniquely assigned to a cooking program.
- The display terminal 260 may also include, in addition to the illustrated constituent elements, a constituent element(s) to implement a required function.
- In FIG. 5, the display terminal ID 521 and the cooking program ID 523 are stored in the memory 520 in which the computer program 522 is stored. However, this is an example. The computer program 522 may be stored in a random access memory (RAM) or a read-only memory (ROM), and the display terminal ID 521 and the cooking program ID 523 may be stored in a flash memory.
- FIG. 6 is a block diagram illustrating the hardware configuration of the gateway 102. The gateway 102 includes a communication circuit 550 and a processing circuit 570. These circuits are connected to each other via a bus 560, and are capable of exchanging data or instructions.
- The communication circuit 550 is a circuit that communicates with other devices (e.g., the audio input and output device 240, the cooking appliance 400, etc.) via a network. The communication circuit 550 performs communication complying with, for example, the Ethernet (registered trademark) standards. The communication circuit 550 transmits log information or ID information generated by the processing circuit 570 to the audio input and output device 240 or the cooking appliance 400. In addition, the communication circuit 550 transmits a signal received from the audio input and output device 240 or the cooking appliance 400 to the processing circuit 570 via the bus 560.
- The processing circuit 570 includes a CPU 530 and a memory 540. Alternatively, the processing circuit 570 may include dedicated hardware configured to implement the operations described below, instead of the CPU 530 and the memory 540. The memory 540 stores a gateway ID 541 and a computer program 542. The gateway ID 541 is an identifier uniquely assigned to the gateway 102. The gateway 102 may also include, in addition to the illustrated constituent elements, a constituent element(s) to implement a required function.
- In FIG. 6, the gateway ID 541 is stored in the memory 540 in which the computer program 542 is stored. However, this is an example. The computer program 542 may be stored in a RAM or a ROM, and the gateway ID 541 may be stored in a flash memory.
- FIG. 7 is a block diagram illustrating the hardware configuration of the cloud server 111. The cloud server 111 includes a communication circuit 650, a processing circuit 670, a speech recognition/synthesis database (DB) 600, an appliance state management DB 620 (an example of a database), an utterance interpretation dictionary DB 625, an appliance function DB 630, and a cooking program DB 640. The processing circuit 670 includes a CPU 671, and a memory 672 in which a computer program 673 is stored. These constituent elements are connected to one another via a bus 680, and are capable of mutually exchanging data.
- The processing circuit 670 is connected to the speech recognition/synthesis DB 600, the appliance state management DB 620, the utterance interpretation dictionary DB 625, the appliance function DB 630, and the cooking program DB 640 via the bus 680. The processing circuit 670 acquires or edits the management information stored in these databases.
- In this embodiment, the speech recognition/synthesis DB 600, the appliance state management DB 620, the utterance interpretation dictionary DB 625, the appliance function DB 630, and the cooking program DB 640 are elements included in the cloud server 111. The speech recognition/synthesis DB 600, the appliance state management DB 620, the utterance interpretation dictionary DB 625, the appliance function DB 630, and the cooking program DB 640 may instead be provided outside the cloud server 111, in which case the cloud server 111 may further include an Internet line in addition to the bus 680.
- The communication circuit 650 is a circuit that communicates with other communication devices (e.g., the gateway 102) via a network. The communication circuit 650 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- The CPU 671 controls the operation of the cloud server 111. The CPU 671 executes a group of instructions written in the computer program 673 loaded in the memory 672. Accordingly, the CPU 671 is capable of implementing a variety of functions. A group of instructions for allowing the cloud server 111 to implement the operations described below is written in the computer program 673.
- The computer program 673 described above may be recorded on a recording medium such as a CD-ROM and distributed as a marketed product. Alternatively, the computer program 673 may be transmitted via an electric communication line such as the Internet. An appliance (e.g., a PC) including the hardware illustrated in FIG. 7 is capable of functioning as the cloud server 111 according to this embodiment by reading the computer program 673.
- The CPU 671 and the memory 672 in which the computer program 673 is stored may be implemented as hardware, such as a digital signal processor (DSP), in which a computer program is integrated into a single semiconductor circuit. Such a DSP is capable of implementing, on a single integrated circuit, all the processing operations implementable by the CPU 671 that executes the computer program 673 described above. This DSP may be used as the processing circuit 670 in place of the CPU 671 and the memory 672 illustrated in FIG. 7.
- The speech recognition/synthesis DB 600 stores acoustic models and language models for speech recognition. The appliance state management DB 620, the utterance interpretation dictionary DB 625, the appliance function DB 630, and the cooking program DB 640 will be described in detail below.
- FIG. 8 is a block diagram illustrating the system configuration of the audio input and output device 240. The audio input and output device 240 includes an audio collection unit 1000, an audio detection unit 1010, an audio section segmentation unit 1020, a communication unit 1030, and an audio output unit 1040.
- The audio collection unit 1000 corresponds to the audio collection circuit 301. The audio collection unit 1000 collects user speech and generates an analog audio signal. The audio collection unit 1000 converts the generated analog audio signal into digital data, and generates an audio signal.
- The audio detection unit 1010 and the audio section segmentation unit 1020 are implemented by the processing circuit 300. The CPU 310 that executes the computer program 342 functions as, for example, the audio detection unit 1010 at a certain point in time, and functions as the audio section segmentation unit 1020 at a different point in time. At least one of the audio detection unit 1010 and the audio section segmentation unit 1020 may be implemented by hardware configured to perform dedicated processing, such as a DSP.
- The audio detection unit 1010 determines whether or not audio has been detected. For example, if the level of the audio signal (e.g., the amplitude of the audio signal) generated by the audio collection unit 1000 is less than or equal to a predetermined value, the audio detection unit 1010 determines that no audio has been detected.
- The audio section segmentation unit 1020 extracts a section in which audio is present from the acquired audio signal. The audio collection unit 1000, the audio detection unit 1010, and the audio section segmentation unit 1020 constitute an example of an audio acquisition unit.
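- The behavior of the audio detection unit 1010 and the audio section segmentation unit 1020 described above can be pictured with a short sketch. The following Python fragment is a minimal illustration only; the frame size, the threshold value, and the function names are assumptions made for the sake of the example and are not taken from the present disclosure.

```python
# Minimal sketch (not the disclosed implementation): amplitude-threshold
# audio detection followed by section segmentation.

FRAME_SIZE = 160   # samples per frame (assumed value)
THRESHOLD = 500    # amplitude threshold (assumed value)

def audio_detected(frame):
    """Counterpart of the audio detection unit 1010: audio is considered
    detected when the peak amplitude exceeds the predetermined value."""
    return max(abs(s) for s in frame) > THRESHOLD

def segment_sections(samples):
    """Counterpart of the audio section segmentation unit 1020: extract
    contiguous runs of frames in which audio is present."""
    sections, current = [], []
    for i in range(0, len(samples) - FRAME_SIZE + 1, FRAME_SIZE):
        frame = samples[i:i + FRAME_SIZE]
        if audio_detected(frame):
            current.extend(frame)
        elif current:
            sections.append(current)
            current = []
    if current:
        sections.append(current)
    return sections
```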
- The communication unit 1030 (an example of a second communication unit) corresponds to the communication circuit 303. The communication unit 1030 communicates with other communication devices (e.g., the gateway 102) via a network. The communication unit 1030 performs communication complying with, for example, the Ethernet (registered trademark) standards. The communication unit 1030 transmits an audio signal for the section extracted by the audio section segmentation unit 1020. Further, the communication unit 1030 passes the received audio signal to the audio output unit 1040.
- The audio output unit 1040 corresponds to the audio output circuit 302. The audio output unit 1040 converts the audio signal received by the communication unit 1030 into an analog audio signal. The audio output unit 1040 outputs the resulting analog audio signal.
- FIG. 9 is a block diagram illustrating the system configuration of the cooking appliance 400. The cooking appliance 400 includes a communication unit 900, an appliance control unit 910, a first cooking unit 911, and a second cooking unit 912.
- The communication unit 900 (an example of a first communication unit) corresponds to the communication circuit 450. The communication unit 900 communicates with other communication devices (e.g., the gateway 102) via a network. The communication unit 900 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- The appliance control unit 910 (an example of a second control unit) corresponds to the input and output circuit 410 and the processing circuit 470. The processing circuit 470 corresponding to the appliance control unit 910 reads the control data received by the communication unit 900. The processing circuit 470 corresponding to the appliance control unit 910 controls the input and output circuit 410 using the read control data.
- The appliance control unit 910 controls the operations of the first cooking unit 911 and the second cooking unit 912 in accordance with control commands received by the communication unit 900. The first cooking unit 911 and the second cooking unit 912 are configured to be capable of simultaneously executing different cooking programs. In a case where the cooking appliance 400 is the IH cooker 244, the first cooking unit 911 corresponds to, for example, a "heater 1", and the second cooking unit 912 corresponds to, for example, a "heater 2". In a case where the cooking appliance 400 is the oven range 243, the first cooking unit 911 corresponds to, for example, a "top rack", and the second cooking unit 912 corresponds to, for example, a "bottom rack".
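- As a rough sketch of how the appliance control unit 910 might route a received control command to one of several simultaneously operating cooking units, consider the following fragment. The class and method names are hypothetical; only the appliance IDs, heater names, and the command string echo the examples given elsewhere in this description.

```python
# Illustrative sketch only: routing control commands to cooking units that
# can execute different cooking programs at the same time.

class CookingUnit:
    def __init__(self, name):
        self.name = name

    def execute(self, command):
        print(f"{self.name}: executing {command}")

class ApplianceControlUnit:
    def __init__(self):
        # Separate appliance IDs identify the individual cooking units,
        # e.g., the "heater 1" and "heater 2" of the IH cooker.
        self.units = {
            "M01-01": CookingUnit("heater 1"),
            "M01-02": CookingUnit("heater 2"),
        }

    def on_control_command(self, appliance_id, command):
        # Dispatch the command to the cooking unit named by the appliance ID.
        self.units[appliance_id].execute(command)

ApplianceControlUnit().on_control_command("M01-01", "CMD=0xFFA05050")
```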
- FIG. 10 is a block diagram illustrating the system configuration of the gateway 102. The gateway 102 includes a communication unit 800, a received data analysis unit 810, and a transmission data generation unit 820.
- The communication unit 800 corresponds to the communication circuit 550. The communication unit 800 is a circuit that communicates with other devices (e.g., the audio input and output device 240, the cooking appliance 400, etc.) via a network. The communication unit 800 performs communication complying with, for example, the Ethernet (registered trademark) standards. The communication unit 800 passes received data to the received data analysis unit 810.
- Further, the communication unit 800 transmits data generated by the transmission data generation unit 820.
- The received data analysis unit 810 corresponds to the processing circuit 570. The received data analysis unit 810 analyzes the data received by the communication unit 800 to determine the type of the received data. As a result of the analysis of the received data in terms of type, the received data analysis unit 810 determines the next target appliance (e.g., the audio input and output device 240 or the cooking appliance 400), and the data to be transmitted to the target appliance.
- The transmission data generation unit 820 corresponds to the processing circuit 570. The transmission data generation unit 820 generates transmission data based on the next target appliance and the data to be transmitted to the target appliance, which are determined by the received data analysis unit 810.
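- The division of labor between the received data analysis unit 810 and the transmission data generation unit 820 can be pictured as a small router keyed on the IDs carried in each message. The sketch below is illustrative only; the message fields and the Device stub are assumptions.

```python
# Illustrative sketch of the gateway's routing role; the message fields
# and the Device stub are assumptions.

class Device:
    def __init__(self, name):
        self.name = name

    def send(self, payload):
        print(f"forwarding to {self.name}: {payload}")

def route(message, devices):
    # Received data analysis unit 810: determine the next target appliance
    # from the IDs carried in the received data.
    target_id = message.get("display_terminal_id") or message["appliance_id"]
    # Transmission data generation unit 820: rebuild the payload for the
    # chosen target (here, by simply dropping the routing fields).
    payload = {k: v for k, v in message.items()
               if k not in ("display_terminal_id", "appliance_id")}
    devices[target_id].send(payload)

route({"appliance_id": "M01-01", "command": "CMD=0xFFA05050"},
      {"M01-01": Device("IH cooker, heater 1")})
```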
- FIG. 11 is a block diagram illustrating the system configuration of the cloud server 111. The cloud server 111 includes a communication unit 700, a speech recognition unit 710, an utterance interpretation unit 730, a state management unit 740, a response generation unit 750, and a speech synthesis unit 760.
- The communication unit 700 (an example of a third communication unit and a fourth communication unit) corresponds to the communication circuit 650. The communication unit 700 is a circuit that communicates with other devices (e.g., the gateway 102) via a network. The communication unit 700 performs communication complying with, for example, the Ethernet (registered trademark) standards.
- The speech recognition unit 710 is implemented by the processing circuit 670 and the speech recognition/synthesis DB 600. The speech recognition unit 710 converts an audio signal into character string data. Specifically, the speech recognition unit 710 acquires information on pre-registered acoustic models from the speech recognition/synthesis DB 600, and converts the audio signal into phonemic data using the acoustic models and the frequency characteristics of the audio signal. The speech recognition unit 710 also acquires information on pre-registered language models from the speech recognition/synthesis DB 600, and generates specific character string data using the language models in accordance with the arrangement of the phonemes in the phonemic data.
- The utterance interpretation unit 730 is implemented by the processing circuit 670, the appliance function DB 630, and the cooking program DB 640. The utterance interpretation unit 730 extracts context data from the character string data. The context data may include, specifically, the target appliance name, the menu name (or food name), the cookware name, or the task content. The utterance interpretation unit 730 checks the character string data against the appliance function DB 630 and the cooking program DB 640 to extract the context data.
- The state management unit 740 (an example of a determination unit) is implemented by the processing circuit 670, the appliance state management DB 620, and the cooking program DB 640. The state management unit 740 receives the context data as input, and acquires data stored in the appliance state management DB 620 and the cooking program DB 640. The state management unit 740 changes the acquired data to update the appliance state management DB 620 and the cooking program DB 640.
- The response generation unit 750 is implemented by the processing circuit 670, the appliance state management DB 620, the appliance function DB 630, and the cooking program DB 640. The response generation unit 750 searches the appliance state management DB 620, the appliance function DB 630, and the cooking program DB 640, and generates a control signal for controlling the cooking appliance 400 to be controlled. The response generation unit 750 searches the appliance function DB 630 and the cooking program DB 640, and generates character string data of information to be provided to the user 250.
- The speech synthesis unit 760 is implemented by the processing circuit 670 and the speech recognition/synthesis DB 600. The speech synthesis unit 760 converts character string data into an audio signal. Specifically, the speech synthesis unit 760 acquires information on pre-registered acoustic models and language models from the speech recognition/synthesis DB 600, and converts the character string data into a specific audio signal using the acoustic models and the language models.
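- Taken together, the units of FIG. 11 form a pipeline from received audio to a control command plus a synthesized reply. The sketch below chains hypothetical stand-ins for the five units; every function body is a placeholder, and the function ID, command string, and response text returned here are invented for illustration rather than taken from the databases described above.

```python
# Hypothetical stand-ins for the cloud server units of FIG. 11; the bodies
# are placeholders, and the returned IDs and strings are invented examples.

def speech_recognition(audio):        # unit 710: audio signal -> character string
    return "turn off the heat of the stew"

def utterance_interpretation(text):   # unit 730: string -> context data
    return [("stew", "category", "stewed"),
            ("turn off the heat", "task", "stop_heat")]

def state_management(context):        # unit 740: context -> function or error ID
    return ("function_id", "O01-01-02")          # hypothetical function ID

def response_generation(result):      # unit 750: ID -> control command + text
    return "CMD=0xFFA05051", "The heater 1 was turned off"  # hypothetical values

def speech_synthesis(text):           # unit 760: character string -> audio data
    return text.encode("utf-8")

def handle_utterance(audio):
    text = speech_recognition(audio)
    context = utterance_interpretation(text)
    result = state_management(context)
    command, reply = response_generation(result)
    return command, speech_synthesis(reply)

print(handle_utterance(b"..."))
```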
- FIG. 12 is a flowchart illustrating an example of the operation of the utterance interpretation unit 730. FIGS. 13A and 13B are diagrams illustrating an example of the utterance interpretation dictionary DB 625. FIG. 14 is a diagram illustrating an example of context data 1400 extracted by the utterance interpretation unit 730. The context data 1400 illustrated in FIG. 14 is an example of context data in a case where a user speaks the utterance "Turn off the heat of the stew". In the following, the stew is described as a Japanese stewed or simmered vegetable dish ("nimono").
- As illustrated in FIGS. 13A and 13B, the utterance interpretation dictionary DB 625 holds word IDs, word names, related word IDs, types, and concepts in association with one another.
- The word IDs are identifiers uniquely assigned to the words registered in the word names. For example, the word name "pot" is registered with the word ID "W001". The related word IDs are word IDs of words related to the words registered in the word names. For example, the word ID "W030" associated with the word name "ground meat" is registered as a related word ID of the word name "hamburger steak". Conversely, the word ID "W010" associated with the word name "hamburger steak" is registered as a related word ID associated with the word name "ground meat".
- The types represent the types of the words registered in the word names. The types include <equipment>, <menu>, <category>, <ingredient>, <appliance>, and <task>. The type <equipment> represents an item of cooking equipment or cookware. The type <menu> represents a menu item or food item. The type <category> represents the general concept of a menu item. The type <ingredient> represents the names of ingredients used in cooking. The type <appliance> represents a cooking appliance. The type <task> represents instructions for an action or the like.
- For example, the word name "pot" is registered with the type <equipment>. For example, the word name "hamburger steak" is registered with the type <menu>. For example, the word name "stew" is registered with the type <category>. For example, the word name "ground meat" is registered with the type <ingredient>. For example, the word name "IH cooker" is registered with the type <appliance>. For example, the word name "turn off the heat" is registered with the type <task>.
- The concepts represent logical symbols for the words registered in the word names. The concepts correspond to the words registered in the word names in a one-to-one way. For example, the word name "stew" is registered with the concept <stewed>. For example, the word name "oven range" is registered with the concept <stove>. For example, the word name "turn off the heat" is registered with the concept <stop_heat>.
- A process illustrated in FIG. 12 is initiated by the utterance interpretation unit 730 immediately after the speech recognition unit 710 converts an audio signal for an utterance made by a user into character string data.
- In S1201, the utterance interpretation unit 730 in the cloud server 111 checks a character string for the utterance made by the user (i.e., the character string data output from the speech recognition unit 710) against the list of word names in the utterance interpretation dictionary DB 625. In S1202, the utterance interpretation unit 730 outputs, as context data, the "types" and "concepts" associated with all the word names that match part or all of the character string. As illustrated in FIG. 14, the context data is output in a table form.
- In the example in FIG. 14, the utterance interpretation unit 730 determines that the word name "stew" and the word name "turn off the heat" in the utterance "Turn off the heat of the stew" match the corresponding word names in the utterance interpretation dictionary DB 625. As may be seen from FIGS. 13A and 13B, the utterance interpretation unit 730 outputs, as the context data 1400, the information associated with the word name "stew", namely, the word ID "W020", the type <category>, and the concept <stewed>, and the information associated with the word name "turn off the heat", namely, the word ID "W100", the type <task>, and the concept <stop_heat>.
- In S1203, the utterance interpretation unit 730 determines whether or not each matched word name has a related word ID. If no matched word name has a related word ID (NO in S1203), the process illustrated in FIG. 12 ends. If a matched word name has a related word ID (YES in S1203), then in S1204, the utterance interpretation unit 730 outputs the "word name", "type", and "concept" associated with the related word ID as context data.
- In the example in FIG. 14, as may be seen from FIGS. 13A and 13B, the utterance interpretation unit 730 determines that there is no related word ID associated with the word name "turn off the heat", and determines that there are related word IDs associated with the word name "stew". Then, the utterance interpretation unit 730 outputs the word name "beef stew", the type <menu>, and the concept <beef_stew>, which are associated with the related word ID "W011" associated with the word name "stew", and the word name "chikuzen-ni stew", the type <menu>, and the concept <chikuzen_ni>, which are associated with the related word ID "W012" associated with the word name "stew", as context data. As a result, the context data 1400 illustrated in FIG. 14 is output from the utterance interpretation unit 730.
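- In code, the S1201 to S1204 flow reduces to a substring match over the word names followed by a related-word expansion. A condensed sketch, reusing the DICTIONARY sketch above (the BY_ID index is an added convenience, not part of the description):

```python
# Condensed sketch of the FIG. 12 flow (S1201-S1204); BY_ID is an added
# convenience index over the DICTIONARY sketch above.

BY_ID = {entry["word_id"]: dict(entry, name=name)
         for name, entry in DICTIONARY.items()}

def interpret(utterance):
    context = []
    for name, entry in DICTIONARY.items():
        if name in utterance:                                         # S1201
            context.append((name, entry["type"], entry["concept"]))   # S1202
            for rid in entry["related"]:                              # S1203
                rel = BY_ID[rid]                                      # S1204
                context.append((rel["name"], rel["type"], rel["concept"]))
    return context

# Reproduces the context data 1400: the stew and turn-off-the-heat rows
# plus the related menu items beef stew and chikuzen-ni stew.
print(interpret("turn off the heat of the stew"))
```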
- FIG. 17 is a diagram illustrating a specific example of the appliance state management DB 620. The appliance state management DB 620 holds, for example, gateway IDs (GW-IDs), appliance IDs, appliance names <appliance>, ongoing cooking program IDs, ongoing cooking step IDs, cookware names <equipment>, menu names <menu>, and appliance operating states in association with one another.
- The gateway IDs are identifiers uniquely assigned to gateways 102. In the example in FIG. 17, gateway IDs "G001" and "G002" are registered.
- The appliance IDs are identifiers uniquely assigned to the separate cooking units included in cooking appliances 400. In the example in FIG. 17, the appliance ID "M01-01" is registered with the "heater 1" of the IH cooker 244, the appliance ID "M01-02" is registered with the "heater 2" of the IH cooker 244, and the appliance ID "M01-03" is registered with a "heater 3" of the IH cooker 244. In addition, the appliance ID "M02-01" is registered with the "top rack" in the oven range 243, and the appliance ID "M02-02" is registered with the "bottom rack" in the oven range 243.
- The appliance names <appliance> represent logical symbols for cooking appliances 400. The appliance names <appliance> correspond to the cooking appliances 400 in a one-to-one way. In the example in FIG. 17, the appliance name <ih_heater> is registered with each of the "heater 1", the "heater 2", and the "heater 3" of the IH cooker 244. In addition, the appliance name <stove> is registered with each of the top rack and bottom rack in the oven range 243.
- The ongoing cooking program IDs are identifiers of the cooking programs currently being undertaken. In the example in FIG. 17, the cooking programs with the cooking program IDs "T001", "T002", and "T003" are currently undertaken. The ongoing cooking step IDs are identifiers of the individual cooking steps in currently ongoing cooking programs. In the example in FIG. 17, the cooking step with the cooking step ID "S001" is being performed in the cooking associated with the cooking program ID "T001", the cooking step with the cooking step ID "S002" is being performed in the cooking associated with the cooking program ID "T002", and the cooking step with the cooking step ID "S002" is being performed in the cooking associated with the cooking program ID "T003".
- The cookware names <equipment> represent cookware items used in accordance with the associated cooking programs. In the example in FIG. 17, <pot> is used in the cooking associated with the cooking program ID "T001", <pan> is used in the cooking associated with the cooking program ID "T002", and <gratin plate> is used in the cooking associated with the cooking program ID "T003".
- The menu names <menu> represent menu items (or food items) currently being cooked. In the example in FIG. 17, the menu item associated with the cooking program ID "T001" is represented by <chikuzen_ni>, the menu item associated with the cooking program ID "T002" is represented by <hamburger>, and the menu item associated with the cooking program ID "T003" is represented by <gratin>.
- The appliance operating states indicate, for example, whether the corresponding appliances are currently in operation or in a standby state. In the example in FIG. 17, the "heater 1" of the IH cooker 244 is in the appliance operating state "in standby mode", the "heater 2" of the IH cooker 244 is in the appliance operating state "in operation (over a low heat)", and the "top rack" in the oven range 243 is in the appliance operating state "in operation (2 minutes left)".
- As described above, the IH cooker 244 has three cooking units, namely, the "heater 1", the "heater 2", and the "heater 3". The three cooking units are configured to be capable of simultaneously operating in accordance with different cooking programs. Accordingly, separate appliance IDs are registered with the "heater 1", the "heater 2", and the "heater 3" in order to individually identify the three cooking units. The "heater 1" of the IH cooker 244 is an example of a first cooking unit, and the "heater 2" of the IH cooker 244 is an example of a second cooking unit.
- Similarly, the oven range 243 has two cooking units, namely, the "top rack" and the "bottom rack". The two cooking units are configured to be capable of simultaneously operating in accordance with different cooking programs. Accordingly, separate appliance IDs are registered with the "top rack" and the "bottom rack" in order to individually identify the two cooking units. The "top rack" in the oven range 243 is an example of the first cooking unit, and the "bottom rack" in the oven range 243 is an example of the second cooking unit.
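- The FIG. 17 table can be modeled as a list of rows. The sketch below reproduces the example values quoted above; the column names, and the exact pairing of cooking programs with particular cooking units, are assumptions where the description leaves them open.

```python
# Sketch of appliance state management DB 620 rows after FIG. 17; column
# names and the program-to-unit pairing are assumptions.

APPLIANCE_STATE = [
    {"gw_id": "G001", "appliance_id": "M01-01", "appliance": "ih_heater",
     "program_id": "T001", "step_id": "S001", "equipment": "pot",
     "menu": "chikuzen_ni", "state": "in standby mode"},
    {"gw_id": "G001", "appliance_id": "M01-02", "appliance": "ih_heater",
     "program_id": "T002", "step_id": "S002", "equipment": "pan",
     "menu": "hamburger", "state": "in operation (over a low heat)"},
    {"gw_id": "G001", "appliance_id": "M02-01", "appliance": "stove",
     "program_id": "T003", "step_id": "S002", "equipment": "gratin plate",
     "menu": "gratin", "state": "in operation (2 minutes left)"},
]
```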
- FIG. 18 is a diagram illustrating a specific example of the appliance function DB 630. The appliance function DB 630 holds, for example, function IDs, appliance IDs, task contents <task>, control commands, and response messages in association with one another. The function IDs are identifiers uniquely assigned to the functions of the cooking units registered in association with the appliance IDs. The task contents <task> represent logical symbols indicating tasks for the functions with the function IDs. The control commands represent the control commands used to perform the functions with the function IDs. The response messages represent the messages issued when the functions with the function IDs are performed.
- In the example in FIG. 18, the function ID "O01-01-01" represents the function of the cooking unit with the appliance ID "M01-01", that is, referring to FIG. 17, the function of the "heater 1" of the IH cooker 244. This function is associated with the task content <begin heat>, the control command "CMD=0xFFA05050", and the response message "The heater 1 was turned on".
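- Modeled the same way, an appliance function DB 630 entry ties a function ID to an appliance ID, a task content, a control command, and a response message; the lookup mirrors the check performed later in S1510 to S1512. The field names in this sketch are assumptions.

```python
# Sketch of the FIG. 18 entry and a lookup by appliance ID and task
# content; field names are assumptions.

APPLIANCE_FUNCTIONS = {
    "O01-01-01": {"appliance_id": "M01-01", "task": "begin heat",
                  "command": "CMD=0xFFA05050",
                  "response": "The heater 1 was turned on"},
}

def lookup_function(appliance_id, task):
    for fid, entry in APPLIANCE_FUNCTIONS.items():
        if entry["appliance_id"] == appliance_id and entry["task"] == task:
            return fid
    return None

print(lookup_function("M01-01", "begin heat"))  # -> "O01-01-01"
```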
- FIG. 19A is a diagram illustrating an example of a menu list 1900 included in the cooking program DB 640. FIG. 19B is a diagram illustrating an example of a cooking step list 1910 included in the cooking program DB 640. FIG. 20A is a diagram illustrating an example of an error message list 1920 included in the cooking program DB 640. FIG. 20B is a diagram illustrating an example of a display screen 1930 of the display terminal 260.
- As illustrated in FIG. 19A, the menu list 1900 in the cooking program DB 640 holds, for example, cooking program IDs, menu names <menu>, cookware names <equipment>, ingredient names <ingredient>, and category names <category> in association with one another. In the example illustrated in FIG. 19A, the cooking program ID "T001" is associated with the menu name <chikuzen_ni>, the cookware name <pot>, the ingredient names <chicken>, <carrot>, etc., and the category name <stewed>.
- As illustrated in FIG. 19B, the cooking step list 1910 in the cooking program DB 640 holds, for example, cooking program IDs, cooking step IDs, and response messages in association with one another. In the example in FIG. 19B, the cooking procedure associated with the cooking program ID "T001" includes cooking steps with the cooking step IDs "S001", "S002", and so on. The response message "Heat the pot over a high heat" is registered in association with the cooking step with the cooking step ID "S002".
- As illustrated in FIG. 20A, the error message list 1920 in the cooking program DB 640 holds, for example, error message IDs, error types, and response error messages in association with one another. In the example illustrated in FIG. 20A, "no categories, menu items, or ingredients" is registered as the error type with the error message ID "E002", and "XXX is not being made right now" is registered as the response error message. In the case of the error message ID "E002" in FIG. 20A, for example, as illustrated in FIG. 20B, the display terminal 260 displays a display screen 1930 including the response error message "A cream stew is not being made right now".
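- The three lists in the cooking program DB 640 can likewise be sketched as plain mappings using only the quoted examples; the structures themselves are assumptions.

```python
# Sketch of the cooking program DB 640 lists; structures are assumptions.

MENU_LIST = {                            # menu list 1900
    "T001": {"menu": "chikuzen_ni", "equipment": "pot",
             "ingredients": ["chicken", "carrot"], "category": "stewed"},
}

COOKING_STEPS = {                        # cooking step list 1910
    ("T001", "S002"): "Heat the pot over a high heat",
}

ERROR_MESSAGES = {                       # error message list 1920
    "E002": ("no categories, menu items, or ingredients",
             "XXX is not being made right now"),
}
```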
- FIGS. 15A and 15B are flowcharts illustrating an example of the operation of the state management unit 740 in the cloud server 111. First, the state management unit 740 acquires the context data output from the utterance interpretation unit 730 (S1501). Then, the state management unit 740 determines whether or not the acquired context data includes a category name or an ingredient name (S1502). If it is determined that the context data does not include a category name or an ingredient name (NO in S1502), the state management unit 740 advances the process to S1506.
- If it is determined that the context data includes a category name or an ingredient name (YES in S1502), the state management unit 740 checks the category name or the ingredient name against the cooking program DB 640 (S1503). The state management unit 740 determines whether or not the corresponding category name or ingredient name has been registered in the cooking program DB 640 (S1504). If it is determined that the corresponding category name or ingredient name has not been registered in the cooking program DB 640 (NO in S1504), the state management unit 740 advances the process to S1513.
- If it is determined that the corresponding category name or ingredient name has been registered in the cooking program DB 640 (YES in S1504), the state management unit 740 outputs the associated menu name and cookware name (S1505), and then advances the process to S1506.
- In S1506, in a case where NO is determined in S1502, the state management unit 740 checks the appliance name, menu name, or cookware name in the context data against the appliance state management DB 620. Alternatively, in a case where the process proceeds to S1506 from S1505, the state management unit 740 checks the menu name and cookware name output in S1505 against the appliance state management DB 620.
- In S1507, the state management unit 740 determines whether or not the corresponding appliance name, menu name, or cookware name has been registered in the appliance state management DB 620. If it is determined that the corresponding appliance name, menu name, or cookware name has not been registered in the appliance state management DB 620 (NO in S1507), the state management unit 740 advances the process to S1513.
- If it is determined that the corresponding appliance name, menu name, or cookware name has been registered in the appliance state management DB 620 (YES in S1507), the state management unit 740 acquires the appliance ID from the appliance state management DB 620 (S1508). Then, the state management unit 740 determines whether or not the acquired appliance ID is uniquely identifiable (S1509). If it is determined that the acquired appliance ID is not uniquely identifiable (NO in S1509), the state management unit 740 advances the process to S1513.
- If it is determined that the appliance ID is uniquely identifiable (YES in S1509), the state management unit 740 checks the task content in the context data against the appliance function DB 630 (S1510). Then, the state management unit 740 determines whether or not the corresponding task content has been registered in the appliance function DB 630 (S1511). If it is determined that the corresponding task content has not been registered in the appliance function DB 630 (NO in S1511), the state management unit 740 advances the process to S1513.
- If it is determined that the corresponding task content has been registered in the appliance function DB 630 (YES in S1511), the state management unit 740 acquires the function ID from the appliance function DB 630, and outputs the acquired function ID (S1512). Then, the process illustrated in FIGS. 15A and 15B ends.
- In S1513, the state management unit 740 searches the error message list 1920 in the cooking program DB 640 for the corresponding error message ID, and outputs the error message ID. Then, the process illustrated in FIGS. 15A and 15B ends.
- For example, in a case where the process proceeds to S1513 because NO is obtained in S1504, the corresponding category name or ingredient name has not been registered. Thus, the state management unit 740 acquires the error message ID "E002".
- For example, in a case where the process proceeds to S1513 because NO is obtained in S1507, the corresponding appliance name, menu name, or cookware name has not been registered. Thus, the state management unit 740 acquires any of the error message IDs "E001", "E002", and "E003".
- For example, in a case where the process proceeds to S1513 because NO is obtained in S1509, the appliance ID is not uniquely identifiable. Thus, the state management unit 740 acquires the error message ID "E004".
- For example, in a case where the process proceeds to S1513 because NO is obtained in S1511, the corresponding task content has not been registered. Thus, the state management unit 740 acquires the error message ID "E006".
- If a plurality of error message IDs are obtained, the state management unit 740 may set the error message ID to be acquired to, for example, "E002" by default.
- In S1513, an error message ID is output, and then the current process ends. After that, when the user 250 speaks an utterance in response to the error message, the operation of the utterance interpretation unit 730 (FIG. 12) is initiated.
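- In outline, the S1501 to S1513 branching is a chain of lookups that either narrows the context data down to a single function ID or falls through to an error message ID. A condensed sketch follows, assuming the DB shapes from the earlier sketches; matching on concept symbols throughout is a simplification of the checks described above.

```python
# Condensed sketch of the FIGS. 15A and 15B flow (S1501-S1513); DB shapes
# follow the earlier sketches, and matching on concept symbols throughout
# is a simplification.

def manage_state(context, menu_list, appliance_state, functions):
    concepts = {concept for _, _, concept in context}
    # S1502-S1505: resolve a category or ingredient name to menu names.
    menus = {m["menu"] for m in menu_list.values()
             if m["category"] in concepts or set(m["ingredients"]) & concepts}
    # S1506-S1508: find the cooking units cooking a matching menu.
    targets = [row["appliance_id"] for row in appliance_state
               if row["menu"] in menus or row["menu"] in concepts]
    if not targets:
        return ("error_id", "E002")   # S1507 NO: nothing matching is cooking
    if len(targets) > 1:
        return ("error_id", "E004")   # S1509 NO: appliance ID not unique
    # S1510-S1512: resolve the task content to a function ID.
    for fid, f in functions.items():
        if f["appliance_id"] == targets[0] and f["task"] in concepts:
            return ("function_id", fid)
    return ("error_id", "E006")       # S1511 NO: task not registered
```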
- FIG. 16 is a flowchart illustrating an example of the operation of the response generation unit 750 in the cloud server 111. First, the response generation unit 750 acquires the content output from the state management unit 740 in S1512 or S1513 of FIGS. 15A and 15B (S1601). Then, the response generation unit 750 determines whether the acquired output content is a function ID or an error message ID (S1602).
- If the acquired output content is a function ID ("function ID" in S1602), the response generation unit 750 checks the function ID against the appliance function DB 630, and generates a control command and a response message (S1603).
- If the acquired output content is an error message ID ("error message ID" in S1602), the response generation unit 750 checks the error message ID against the error message list 1920 included in the cooking program DB 640, and generates a response error message (S1604).
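- The FIG. 16 branch is correspondingly small. A sketch reusing the earlier DB shapes:

```python
# Sketch of the FIG. 16 flow (S1601-S1604), reusing the earlier DB shapes.

def generate_response(result, functions, error_messages):
    kind, value = result                       # S1601, S1602
    if kind == "function_id":                  # S1603
        entry = functions[value]
        return entry["command"], entry["response"]
    _, message = error_messages[value]         # S1604
    return None, message
```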
- FIGS. 21 to 23 are sequence diagrams illustrating the operation of the speech-based appliance control system according to the first embodiment. FIGS. 24A, 24B, and 24C are diagrams illustrating an example of a menu selection screen 2400 displayed on the display terminal 260. Note that FIGS. 21 to 23 illustrate a continuous sequence. The process in the sequence diagram illustrated in FIGS. 21 to 23 is initiated when the user 250 gives instructions to the display terminal 260 to start the speech-based appliance control system by, for example, tapping an icon displayed on a display screen of the display terminal 260.
- In S2101, the display terminal 260 acquires a menu list request from the user 250. In S2102, the communication circuit 505 in the display terminal 260 transmits the acquired menu list request and the display terminal ID 521 to the gateway 102. The gateway 102 receives the menu list request and the display terminal ID 521.
- In S2103, the gateway 102 transmits the menu list request and the display terminal ID 521 received from the display terminal 260, and the gateway ID 541 held in the memory 540 of the gateway 102, to the cloud server 111. The cloud server 111 receives the menu list request, the display terminal ID 521, and the gateway ID 541.
- In S2104, the state management unit 740 in the cloud server 111 performs a menu list acquisition process to extract a menu list.
- In S2105, the response generation unit 750 in the cloud server 111 transmits the extracted menu list, the display terminal ID 521 that specifies the display terminal 260 to be used for display, and the gateway ID 541 to the gateway 102. The gateway 102 receives the menu list, the display terminal ID 521, and the gateway ID 541.
- In S2106, the received data analysis unit 810 in the gateway 102 performs a received data analysis process. In the received data analysis process, the received data analysis unit 810 separates the data received from the cloud server 111 into the menu list, the display terminal ID 521, and the gateway ID 541. Then, in S2107, the transmission data generation unit 820 in the gateway 102 transmits the separated menu list to the display terminal 260 corresponding to the display terminal ID 521.
- In S2108, the display control circuit 500 in the display terminal 260 displays a menu selection screen 2400 on the display circuit 502 in the manner illustrated in FIG. 24A in accordance with the received menu list (an example of display screen information). In S2109, the display terminal 260 acquires instructions for a specific cooking program request from the user 250.
- As illustrated in FIG. 24A, the menu selection screen 2400 includes a cooking appliance display portion 2401 and a cooking program display portion 2402. In the cooking appliance display portion 2401, a cooking appliance 400 including a plurality of cooking units is schematically displayed. In the example illustrated in FIG. 24A, three cooking units, namely, the "heater 1", the "heater 2", and the "heater 3" of the IH cooker 244, are displayed in the cooking appliance display portion 2401.
- In the cooking program display portion 2402, a list of cooking programs is displayed. In the example illustrated in FIG. 24A, four menu items, e.g., "hamburger steak", "beef stew", "chikuzen-ni stew", and "gratin", are displayed in the cooking program display portion 2402. The cooking program display portion 2402 may be configured such that swiping up or down in the area corresponding to the cooking program display portion 2402 scrolls the screen to allow the remaining cooking programs to appear.
- As illustrated in FIG. 24B, the user 250 taps, for example, an area labeled "chikuzen-ni stew" in the cooking program display portion 2402 and then taps an area labeled "heater 1" in the cooking appliance display portion 2401 with a contact object 2403 (e.g., the user's finger). Then, the display terminal 260 acquires instructions to request that a meal for the cooking program "chikuzen-ni stew" (an example of a first cooking recipe) be cooked using the "heater 1" (an example of a first cooking unit) of the IH cooker 244. In accordance with the instructions, the display terminal 260 acquires the cooking program ID and the appliance ID. In addition, as illustrated in FIG. 24B, the display terminal 260 changes the display color of the tapped areas to allow the user 250 to easily identify the selected items.
- It is noted that the appliance IDs corresponding to the "heater 1", the "heater 2", and the "heater 3" of the IH cooker 244 are registered in advance.
- Referring back to FIG. 21, in S2110, the communication circuit 505 in the display terminal 260 transmits the cooking program ID 523 to the cooking appliance 400 corresponding to the appliance ID. The cooking appliance 400 receives the transmitted cooking program ID 523, and stores the received cooking program ID 523 in the memory 440. The cooking program ID indicating "chikuzen-ni stew", which is transmitted in S2110 from the display terminal 260 to the cooking appliance 400, is an example of first cooking program information.
- In S2111, the communication circuit 505 in the display terminal 260 transmits the cooking program ID, the display terminal ID, and the appliance ID to the gateway 102. The gateway 102 receives the cooking program ID, the display terminal ID, and the appliance ID. The cooking program ID indicating "chikuzen-ni stew", which is transmitted in S2111 from the display terminal 260 to the gateway 102, is an example of first cooking recipe selection information.
- In S2112, the gateway 102 transmits the cooking program ID, the display terminal ID, and the appliance ID, which are received from the display terminal 260, and the gateway ID 541 held in the memory 540 of the gateway 102, to the cloud server 111. The communication circuit 650 in the cloud server 111 receives the cooking program ID, the display terminal ID, the appliance ID, and the gateway ID 541.
- In S2201, the state management unit 740 in the cloud server 111 performs a cooking program management process. In the cooking program management process, the state management unit 740 performs a process to update the content of the appliance state management DB 620 using the values of the received cooking program ID, display terminal ID, appliance ID, and gateway ID.
FIG. 26 is a flowchart illustrating the cooking program management process executed in S2201 ofFIG. 22 . - In S2601, the
state management unit 740 acquires the display terminal ID, the gateway ID, the appliance ID, and the cooking program ID, which are received by thecommunication circuit 650 in S2112 ofFIG. 21 . - In S2602, the
state management unit 740 checks the gateway ID and the appliance ID against the appliancestate management DB 620. In S2603, thestate management unit 740 determines whether or not the cooking program ID has been registered in the column of the ongoing cooking program ID in the appliancestate management DB 620 associated with the gateway ID and the appliance ID. If the cooking program ID has been registered in the column of the ongoing cooking program ID (YES in S2603), thestate management unit 740 ends the process illustrated inFIG. 26 . - If the cooking program ID has not been registered in the column of the ongoing cooking program ID (NO in S2603), the
state management unit 740 registers the cooking program ID acquired in S2601 in the column of the ongoing cooking program ID in the appliancestate management DB 620 in association with the gateway ID and the appliance ID (S2604). - In S2605, the
state management unit 740 checks the cooking program ID against themenu list 1900 in thecooking program DB 640, and acquires the cookware name and the menu name. In S2606, thestate management unit 740 registers the acquired cookware name and menu name in the columns of the cookware name and menu name in the appliancestate management DB 620, respectively, in association with the gateway ID and the appliance ID. - In S2607, the
state management unit 740 resets the current value in the column of the ongoing cooking step ID in the appliancestate management DB 620 to the initial value (in the example illustrated inFIG. 19B , “S001”), and ends the process illustrated inFIG. 26 . - Referring back to
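- Condensed into code, S2601 to S2607 amount to a conditional registration followed by a reset of the step counter. The sketch below assumes the appliance state management DB 620 can be modeled as a dictionary keyed by (gateway ID, appliance ID); the field names are illustrative.

```python
INITIAL_STEP_ID = "S001"  # initial cooking step ID, per the FIG. 19B example

def manage_cooking_program(state_db: dict, menu_list: dict, gateway_id: str,
                           appliance_id: str, cooking_program_id: str) -> None:
    row = state_db[(gateway_id, appliance_id)]               # S2602: check IDs
    if row.get("ongoing_cooking_program_id"):                # S2603: already set?
        return                                               # YES: end the process
    row["ongoing_cooking_program_id"] = cooking_program_id   # S2604: register
    entry = menu_list[cooking_program_id]                    # S2605: menu list 1900
    row["cookware_name"] = entry["cookware"]                 # S2606: cookware name
    row["menu_name"] = entry["menu"]                         # S2606: menu name
    row["ongoing_cooking_step_id"] = INITIAL_STEP_ID         # S2607: reset step
```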
- Referring back to FIG. 22, in S2202, the response generation unit 750 in the cloud server 111 performs a response text generation process to generate a response message for the user 250. Specifically, the cloud server 111 holds information on the response messages registered in the cooking step list 1910 (FIG. 19B) in the cooking program DB 640, and information on the response messages registered in the appliance function DB 630 (FIG. 18). The response generation unit 750 in the cloud server 111 reads the response messages stored in the cooking program DB 640 or the appliance function DB 630 to generate text data of response text.
- In S2203, the speech synthesis unit 760 in the cloud server 111 performs a speech synthesis process to convert the response message into audio data. Specifically, the cloud server 111 holds information on the acoustic models and language models registered in the speech recognition/synthesis DB 600. The speech synthesis unit 760 in the cloud server 111 reads the information on the acoustic models and language models registered in the speech recognition/synthesis DB 600, and converts the text data of the response text into specific audio data using the information on the acoustic models and language models.
- In S2204, the cloud server 111 transmits the generated audio data, the generated text data, the display terminal ID 521, and the gateway ID 541 to the gateway 102. The gateway 102 receives the audio data, the text data, the display terminal ID 521, and the gateway ID 541.
- In S2205, the received data analysis unit 810 in the gateway 102 performs a received data analysis process. In the received data analysis process, the received data analysis unit 810 in the gateway 102 separates the received data into the audio data, the text data, the display terminal ID 521, and the gateway ID 541.
- Then, in S2206, the transmission data generation unit 820 in the gateway 102 transmits the separated audio data to the audio input and output device 240. In S2207, the audio input and output device 240 outputs audio using the received audio data. Then, in S2208, the transmission data generation unit 820 in the gateway 102 transmits the separated text data to the display terminal 260 corresponding to the display terminal ID 521. In S2209, the display terminal 260 displays a text image corresponding to the received text data.
- In S2301, the cooking appliance 400 detects the content of the operation (hereinafter referred to as the “operation content”) performed on the cooking appliance 400 by the user 250. In S2302, the communication circuit 450 in the cooking appliance 400 transmits the detected operation content, the appliance ID 441, and the cooking program ID 443 to the gateway 102. The gateway 102 receives the operation content, the appliance ID 441, and the cooking program ID 443.
- In S2303, the gateway 102 transmits the cooking program ID 443, the operation content, and the appliance ID 441, which are received from the cooking appliance 400, together with the gateway ID 541 held in the memory 540 of the gateway 102, to the cloud server 111. The cloud server 111 receives the cooking program ID 443, the operation content, the appliance ID 441, and the gateway ID 541.
- In S2304, the state management unit 740 in the cloud server 111 performs a cooking program update process. In the cooking program update process, the state management unit 740 performs a process to update the content of the appliance state management DB 620 using the values of the received cooking program ID, the received operation content, the received appliance ID 441, and the received gateway ID 541. On the basis of the received operation content, the state management unit 740 can know that the immediately preceding cooking step has been executed. The state management unit 740 updates the content of the appliance state management DB 620 in accordance with the result of the immediately preceding cooking step.
- Specifically, for example, as illustrated in FIG. 24B, a description is given of the case where, in S2109 of FIG. 21, the user 250 made a request to cook “chikuzen-ni stew” using the “heater 1” of the IH cooker 244. In this case, as illustrated in FIG. 19A, the cooking program ID for “chikuzen-ni stew” is “T001”. Thus, the response generation unit 750 in the cloud server 111 acquires the cooking program ID “T001” for “chikuzen-ni stew” from the state management unit 740.
- The response generation unit 750 refers to the cooking step list 1910 in the cooking program DB 640 (FIG. 19B), and acquires the response message “Pour 400 cc of purified water into a pot on the stove” associated with the first cooking step ID “S001” associated with the cooking program ID “T001”. In S2202 of FIG. 22, the response generation unit 750 generates the above-described response message.
- The above-described response message is output as audio in S2207 of FIG. 22, and is displayed on a screen in S2209. Accordingly, the user 250 pours water into a pot and places the pot on the “heater 1” of the IH cooker 244. Then, in S2301 of FIG. 23, the cooking appliance 400 (in the illustrated example, the IH cooker 244) detects an increase in the weight on the “heater 1”. In S2302, the cooking appliance 400 transmits the increase in the weight on the “heater 1” to the gateway 102 as operation content.
- In S2303, the gateway 102 transmits the operation content indicating the increase in the weight on the “heater 1” to the cloud server 111. The communication circuit 650 in the cloud server 111 receives the operation content.
- In S2304, the state management unit 740 acquires the operation content indicating the increase in the weight on the “heater 1”, which is received by the communication circuit 650. On the basis of this operation content, the state management unit 740 determines that the cooking step ID “S001” in the cooking (of chikuzen-ni stew) associated with the cooking program ID “T001” has been executed.
- Thus, the state management unit 740 updates the ongoing cooking step ID corresponding to the “IH cooker:heater 1” with the appliance ID “M01-01” in the appliance state management DB 620 from “S001” to “S002”. In this way, the state management unit 740 executes the cooking program update process in S2304.
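- In code, the update of S2304 for this example reduces to advancing the step counter once the expected operation content arrives; in the sketch below, the operation-content encoding and the assumption that step IDs simply increment are illustrative.

```python
def update_cooking_step(state_db: dict, gateway_id: str, appliance_id: str,
                        operation_content: str) -> None:
    """Advance the ongoing cooking step once its completion is confirmed."""
    row = state_db[(gateway_id, appliance_id)]
    if operation_content == "weight_increase":  # pot placed on the heater
        current = row["ongoing_cooking_step_id"]             # e.g. "S001"
        row["ongoing_cooking_step_id"] = f"S{int(current[1:]) + 1:03d}"  # "S002"
```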
- Through the process illustrated in FIGS. 21 to 23, for example, chikuzen-ni stew is cooked using the “heater 1” of the IH cooker 244. In some cases, while cooking chikuzen-ni stew using the “heater 1”, the user 250 may wish to simultaneously cook another meal using another cooking unit of the IH cooker 244. In these cases, for example, in accordance with an operation such as tapping an icon displayed on the display terminal 260, the speech-based appliance control system again starts the process from S2101 of FIG. 21.
- In this case, in S2108 of FIG. 21, as illustrated in FIG. 24B, the menu selection screen 2400 indicating that chikuzen-ni stew is being cooked and “heater 1” is in use is displayed on the display terminal 260.
- While the menu selection screen 2400 is being displayed on the display terminal 260, as illustrated in FIG. 24C, the user 250 taps, for example, an area labeled “hamburger steak” in the cooking program display portion 2402 and then taps an area labeled “heater 2” in the cooking appliance display portion 2401 with the contact object 2403 (e.g., the user's finger). Then, in S2109 of FIG. 21, the display terminal 260 acquires instructions to request that a meal for the cooking program “hamburger steak” (an example of a second cooking recipe) be cooked using the “heater 2” (an example of a second cooking unit) of the IH cooker 244.
- In accordance with the instructions, the display terminal 260 acquires the cooking program ID and the appliance ID. In addition, as illustrated in FIG. 24C, the display terminal 260 changes the display color of the tapped areas to allow the user 250 to easily identify the further selected items.
- Then, the process illustrated in FIGS. 21 to 23 is performed again using the cooking program ID indicating “hamburger steak” and the appliance ID indicating the “heater 2” of the IH cooker 244. In this case, the cooking program ID indicating “hamburger steak”, which is transmitted in S2110 of FIG. 21 from the display terminal 260 to the cooking appliance 400, is an example of second cooking program information. Furthermore, the cooking program ID indicating “hamburger steak”, which is transmitted in S2111 of FIG. 21 from the display terminal 260 to the gateway 102, is an example of second cooking recipe selection information.
- The heating of the cooking appliance 400 may be started in response to an operation of the user 250. In this case, the response message “Heat the pot over a high heat”, which has been registered in the cooking step list 1910 illustrated in FIG. 19B, may be output.
- The heating of the cooking appliance 400 may be automatically started in response to an increase in the weight of the pot. In this case, the response message “The heater 1 was turned on”, which has been registered in the appliance function DB 630 illustrated in FIG. 18, may be output.
- In response to an operation performed by the user 250 during the execution of a cooking program, the process illustrated in FIG. 23 is executed for each operation.
- FIG. 25 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the first embodiment. The process illustrated in FIG. 25 is initiated when a user gives some instructions to an appliance by using speech.
- In S2501, the audio input and output device 240 acquires audio data of the user 250. In S2502, the communication circuit 303 in the audio input and output device 240 transmits the acquired audio data to the gateway 102. The gateway 102 receives the audio data.
- In S2503, the gateway 102 transmits the audio data received from the audio input and output device 240 and the gateway ID 541 held in the memory 540 of the gateway 102 to the cloud server 111. The communication circuit 650 in the cloud server 111 receives the audio data and the gateway ID 541.
- In S2504, the speech recognition unit 710 and the utterance interpretation unit 730 in the cloud server 111 execute an audio content interpretation process. In the audio content interpretation process, first, the speech recognition unit 710 acquires the user audio data received from the communication circuit 650. The speech recognition unit 710 extracts frequency characteristics from the user audio data, and then extracts phonemic data using the acoustic models held in the speech recognition/synthesis DB 600 and the extracted frequency characteristics.
- The speech recognition unit 710 converts the extracted phonemic data into specific character string data by checking the extracted phonemic data against the speech recognition/synthesis DB 600 and determining which character string data of the language models held in the speech recognition/synthesis DB 600 is the most similar to the extracted phonemic data in terms of arrangement. The utterance interpretation unit 730 then executes the process described above with reference to FIG. 12 using the character string data obtained by the speech recognition unit 710. In this way, the audio content interpretation process in S2504 is executed.
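- The matching algorithm itself is not disclosed; as a toy stand-in, the “most similar in terms of arrangement” test can be pictured as a string-similarity ranking over the candidate strings of the language models, as in this sketch.

```python
import difflib

def recognize(phonemic_data: str, candidates: list[str]) -> str:
    """Pick the candidate whose character arrangement best matches the phonemes.

    difflib is only a stand-in here for the acoustic and language models
    held in the speech recognition/synthesis DB 600.
    """
    normalize = lambda s: s.lower().replace(" ", "")
    return max(candidates, key=lambda c: difflib.SequenceMatcher(
        None, phonemic_data, normalize(c)).ratio())

print(recognize("turnofftheheattothestew",
                ["Turn off the heat to the stew",
                 "Heat the pot over a low heat"]))
# -> Turn off the heat to the stew
```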
- In S2505, the state management unit 740 in the cloud server 111 executes a state management process. In the state management process, the state management unit 740 executes the process described above with reference to FIGS. 15A and 15B.
- In S2506, the response generation unit 750 in the cloud server 111 executes an output generation process. In the output generation process, the response generation unit 750 executes the process described above with reference to FIG. 16.
- In S2507, the speech synthesis unit 760 in the cloud server 111 performs a speech synthesis process. In the speech synthesis process in S2507, the speech synthesis unit 760 in the cloud server 111 performs a process to convert response text into audio data. Specifically, the cloud server 111 holds information on the acoustic models and language models registered in the speech recognition/synthesis DB 600. The CPU 671 in the cloud server 111 reads the information on the acoustic models and language models registered in the speech recognition/synthesis DB 600, and converts the character string data into specific audio data using the information on the acoustic models and language models.
- In S2508, the cloud server 111 transmits the generated control command, the generated audio data, the generated text data, the appliance ID 441 of the appliance to be controlled, and the gateway ID 541 to the gateway 102. The gateway 102 receives the control command, the audio data, the text data, the appliance ID 441, and the gateway ID 541.
- In S2509, the received data analysis unit 810 in the gateway 102 performs a received data analysis process. In the received data analysis process in S2509, the received data analysis unit 810 in the gateway 102 separates the received data into the control command, the audio data, the text data, the appliance ID 441, and the gateway ID 541.
- In S2510, the transmission data generation unit 820 in the gateway 102 transmits the separated text data to the display terminal 260. In S2511, the display terminal 260 displays the received text data on the display screen.
- In S2512, the transmission data generation unit 820 in the gateway 102 transmits the separated control command to the cooking appliance 400 corresponding to the appliance ID 441. In S2513, the appliance control unit 910 in the cooking appliance 400 controls the operation in accordance with the control command received by the communication unit 900.
- In S2514, the transmission data generation unit 820 in the gateway 102 transmits the separated audio data to the audio input and output device 240. In S2515, the audio output unit 1040 in the audio input and output device 240 outputs audio in accordance with the audio data received by the communication unit 1030.
- A specific example of the process illustrated in FIG. 25 will now be described. First, a description will be given of the operation in a first specific example, in which, while chikuzen-ni stew and hamburger steak are being cooked as illustrated above in FIG. 24C, the user 250 speaks the utterance “Turn off the heat to the stew”, as in FIG. 14 described above.
- In this case, as a result of the audio content interpretation process in S2504 of FIG. 25, the context data 1400 illustrated in FIG. 14 is output from the utterance interpretation unit 730. Then, in S2505, as described above, the process illustrated in FIGS. 15A and 15B is executed by the state management unit 740.
- In the context data 1400 illustrated in FIG. 14, “stew” is associated with the type <category> and the concept <stewed>. Since a category name is included, YES is determined in S1502 of FIG. 15A and the process proceeds to S1503 of FIG. 15B.
- In S1503 of FIG. 15B, the category name is checked against the cooking program DB 640 (FIG. 19A). Since <stewed> has been registered in the category name <category>, YES is determined in S1504. Then, in S1505, the menu name <chikuzen_ni> and the cookware name <pot> are output.
- Although not listed in the cooking program DB 640 illustrated in FIG. 19A, beef stew is also a kind of stew, as may be anticipated from the related word IDs of the word name “stew” in FIG. 13A. Thus, in S1505, the menu name <beef_stew> and the cookware name <pot> are also output.
- In S1506 of FIG. 15A, as a result of checking the output menu names and cookware names against the appliance state management DB 620 (FIG. 17), the menu name <chikuzen_ni> is found to have been registered. Thus, YES is determined in S1507. Then, in S1508, the appliance ID “M01-01” (an example of specific-cooking-unit information) is acquired. Since the appliance ID is uniquely identifiable, YES is determined in S1509, and the process proceeds to S1510 and then to S1511. In this way, an appliance ID is identified by the menu name <chikuzen_ni> in the appliance state management DB 620.
- In S1510, the concept <stop_heat> of the word name “turn off the heat” associated with the type <task> in the context data 1400 (FIG. 14) is found to have been registered in association with the appliance ID “M01-01” in the appliance function DB 630 (FIG. 18). Thus, YES is determined in S1511. Then, in S1512, the function ID “O01-01-02” is output. Accordingly, the state management process in S2505 of FIG. 25 ends.
- Subsequently, the output generation process in S2506 of FIG. 25, that is, the process illustrated in FIG. 16, which is performed by the response generation unit 750, is executed. In the first specific example, a function ID is output from the state management unit 740. Thus, in S1603 of FIG. 16, the function ID “O01-01-02” is checked against the appliance function DB 630 (FIG. 18), and the control command “CMD=0xFFA05051” and the response message “The heater 1 was turned off” are generated.
- Accordingly, in response to the utterance “Turn off the heat to the stew” made by the user 250, the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned off. In the first specific example, “Turn off the heat to the stew” is an example of instruction information, “turn off the heat” is an example of first audio information (operation instructions), and “stew” is an example of first menu information or second menu information and is also an example of second audio information.
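- The whole first specific example can be condensed into the following sketch. The table contents mirror the values quoted above (“T001” aside, these are “M01-01”, “O01-01-02”, and “CMD=0xFFA05051”); the dictionary layout and the resolve() helper are hypothetical, not the disclosed implementation.

```python
CATEGORY_TO_MENUS = {            # from the cooking program DB 640 (FIG. 19A)
    "stewed": ["chikuzen_ni", "beef_stew"],
}
STATE_DB = {                     # appliance ID -> menu currently being cooked
    "M01-01": "chikuzen_ni",     # heater 1
    "M01-02": "hamburger",       # heater 2
}
FUNCTION_DB = {                  # (appliance ID, task) -> (function ID, command, reply)
    ("M01-01", "stop_heat"): ("O01-01-02", "CMD=0xFFA05051",
                              "The heater 1 was turned off"),
}

def resolve(category: str, task: str):
    menus = CATEGORY_TO_MENUS[category]                              # S1503-S1505
    hits = [aid for aid, menu in STATE_DB.items() if menu in menus]  # S1506-S1508
    if len(hits) != 1:                                               # S1509: ambiguous
        return ("E004", None, "Please enter the name of a menu item")
    return FUNCTION_DB[(hits[0], task)]                              # S1510-S1512, S1603

print(resolve("stewed", "stop_heat"))
# -> ('O01-01-02', 'CMD=0xFFA05051', 'The heater 1 was turned off')
```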
- A description will now be given of the operation in a second specific example, in which, while chikuzen-ni stew and hamburger steak are being cooked as illustrated above in FIG. 24C, the user 250 speaks the utterance “Turn off the heat to the pot”, unlike FIG. 14 described above, focusing on differences from the operation in the first specific example.
- In this case, the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <stop_heat> associated with the column of the word name “turn off the heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B.
- The context data includes no category names or ingredient names. Thus, NO is determined in S1502 of FIG. 15A, and the process proceeds to S1506. In the appliance state management DB 620 (FIG. 17), the cookware name <pot> in the context data has been registered in one column. Thus, the appliance ID “M01-01” is acquired through S1506 to S1508. Accordingly, an appliance ID is identified by the cookware name <pot> in the appliance state management DB 620.
- In the following, the task content “turn off the heat” is the same as that in the first specific example described above. Thus, the process is performed in a manner similar to that in the first specific example, and a similar control command and a similar response message are generated.
- Accordingly, in response to the utterance “Turn off the heat to the pot” made by the user 250, the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned off. In the second specific example, “Turn off the heat to the pot” is an example of instruction information, “turn off the heat” is an example of first audio information (operation instructions), and “pot” is an example of first cookware information or second cookware information and is also an example of second audio information.
- A description will now be given of the operation in a third specific example, in which, while chikuzen-ni stew and beef stew are being cooked, unlike FIG. 24C described above, the user 250 speaks the utterance “Turn off the heat to the stew” as in FIG. 14 described above, focusing on differences from the operations in the first and second specific examples.
- In this case, the context data 1400 illustrated in FIG. 14, which is the same as that in the first specific example described above, is output from the utterance interpretation unit 730. Thus, similarly to the first specific example, in S1505 of FIG. 15B, the menu name <chikuzen_ni> and the cookware name <pot> are output. In addition, similarly to the first specific example, the menu name <beef_stew> and the cookware name <pot> are also output in S1505.
- In the third specific example, <beef_stew> has been registered in place of <hamburger> in the menu name column of the appliance state management DB 620 illustrated in FIG. 17. Thus, both the menu name <chikuzen_ni> and the menu name <beef_stew> have been registered in the appliance state management DB 620. Accordingly, in S1508 of FIG. 15A, both the appliance ID “M01-01” and the appliance ID “M01-02” are acquired. As a result, a uniquely identifiable appliance ID is not obtained, and thus NO is determined in S1509.
- In the third specific example, the cookware name <pot> is stored with redundancy. Thus, in S1513 of FIG. 15B, the error message ID “E004” is acquired from the error message list 1920 in the cooking program DB 640 illustrated in FIG. 20A, and is output.
- Subsequently, in the third specific example, an error message ID is output from the state management unit 740. Thus, in S1604 of FIG. 16, the error message ID “E004” is checked against the cooking program DB 640 illustrated in FIG. 20A, and the response error message “Please enter the name of a menu item” is generated.
- Accordingly, in response to the utterance “Turn off the heat to the stew” made by the user 250, it is difficult to determine whether to select the “heater 1” with which chikuzen-ni stew is being cooked or the “heater 2” with which beef stew is being cooked, since both chikuzen-ni stew and beef stew are stews. Thus, the user 250 is prompted to utter “the name of a menu item”. In the third specific example, “Turn off the heat to the stew” is an example of instruction information, and “turn off the heat” is an example of first audio information (operation instructions).
- A description will now be given of the operation in a fourth specific example, in which, while chikuzen-ni stew and beef stew are being cooked, unlike FIG. 24C described above, the user 250 speaks the utterance “Turn off the heat to the pot”, unlike FIG. 14 described above, focusing on differences from the operations in the first to third specific examples.
- In this case, similarly to the second specific example described above, the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <stop_heat> associated with the column of the word name “turn off the heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B.
- The context data includes no category names or ingredient names. Thus, NO is determined in S1502 of FIG. 15A, and the process proceeds to S1506.
- In the fourth specific example, similarly to the third specific example described above, <pot> has been registered in the appliance state management DB 620 as the cookware name corresponding to the menu name <chikuzen_ni>, and <pot> has also been registered as the cookware name corresponding to the menu name <beef_stew>. Thus, a uniquely identifiable appliance ID is not obtained.
- In the fourth specific example, similarly to the third specific example described above, the cookware name <pot> is stored with redundancy. Thus, similarly to the third specific example, the response error message “Please enter the name of a menu item” is generated.
- Accordingly, in response to the utterance “Turn off the heat to the pot” made by the user 250, it is difficult to determine whether to select the “heater 1” with which chikuzen-ni stew is being cooked or the “heater 2” with which beef stew is being cooked, since two pots are in use. Thus, the user 250 is prompted to utter “the name of a menu item”. In the fourth specific example, “Turn off the heat to the pot” is an example of instruction information, and “turn off the heat” is an example of first audio information (operation instructions).
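- Continuing the hypothetical resolve() sketch given for the first specific example, the third and fourth specific examples correspond to its ambiguous branch:

```python
# With beef stew on heater 2 in place of hamburger steak, two registered
# menus fall under <stewed> and two pots are in use, so no unique appliance
# ID is obtained and the error message is returned instead.
STATE_DB["M01-02"] = "beef_stew"
print(resolve("stewed", "stop_heat"))
# -> ('E004', None, 'Please enter the name of a menu item')
```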
- A description will now be given of the operation in a fifth specific example, in which, while chikuzen-ni stew and hamburger steak are being cooked as in FIG. 24C described above, the user 250 speaks the utterance “Heat the pot over a low heat”, unlike FIG. 14 described above, focusing on differences from the operations in the first to fourth specific examples.
- In this case, the context data output from the utterance interpretation unit 730 includes the type <equipment> and the concept <pot> associated with the column of the word name “pot”, and the type <task> and the concept <low heat> associated with the column of the word name “low heat” in the utterance interpretation dictionary DB 625 illustrated in FIGS. 13A and 13B.
- The context data includes no category names or ingredient names. Thus, NO is determined in S1502 of FIG. 15A, and the process proceeds to S1506. In the appliance state management DB 620 (FIG. 17), the cookware name <pot> in the context data has been registered in one column. Thus, the appliance ID “M01-01” is acquired through S1506 to S1508. Accordingly, an appliance ID is identified by the cookware name <pot> in the appliance state management DB 620.
- Since an appliance ID is uniquely identified in S1509 of FIG. 15A, the processing of S1510 to S1512 is executed. The concept <low heat> associated with “low heat” corresponding to the type <task> in the context data has been registered in association with the appliance ID “M01-01” in the appliance function DB 630 illustrated in FIG. 18. Thus, the function ID “O01-01-04” is output.
- Subsequently, in S1603 of FIG. 16, the control command “CMD=0xFFA05053” and the response message “The heater 1 was turned down low” corresponding to the function ID “O01-01-04” are generated from the appliance function DB 630 (FIG. 18).
- Accordingly, in response to the utterance “Heat the pot over a low heat” made by the user 250, the “heater 1” with which chikuzen-ni stew is being cooked, rather than the “heater 2” with which hamburger steak is being cooked, can be accurately turned down low. In the fifth specific example, “Heat the pot over a low heat” is an example of instruction information, “low heat” is an example of first audio information (operation instructions), and “pot” is an example of first cookware information or second cookware information and is also an example of second audio information.
- Note that the task content <low heat> in the appliance function DB 630 illustrated in FIG. 18, which corresponds to the word name “low heat” in the utterance interpretation dictionary DB 625 illustrated in FIG. 13B, is an example of operation instructions for executing a process having a cooking parameter different from a process that is being executed in accordance with a cooking program.
- In the fifth specific example, the cooking parameter is the temperature of the heat, that is, a set temperature. However, a cooking parameter in the present disclosure is not limited to a set temperature. The cooking parameter may be, for example, the duration for which a set temperature is maintained, an inclination toward change in temperature, or a heating on/off duty ratio. In other words, instructions given by the user 250 through utterance may include instructions to change the duration for which a set temperature is maintained.
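- One possible way to model such cooking parameters is sketched below; the class and field names are assumptions and are not part of the disclosed design.

```python
from dataclasses import dataclass

@dataclass
class CookingParameters:
    set_temperature_c: float          # the set temperature itself
    hold_duration_s: int = 0          # duration for which the set temperature is maintained
    ramp_c_per_s: float = 0.0         # inclination toward change in temperature
    heating_duty_ratio: float = 1.0   # heating on/off duty ratio (0.0-1.0)

# "Low heat" might then map to a lower duty ratio, for example:
low_heat = CookingParameters(set_temperature_c=90.0, heating_duty_ratio=0.3)
```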
- As described above, according to the first embodiment, in a case where different meals are being cooked simultaneously using two cooking units, e.g., the “heater 1” and the “heater 2” of the IH cooker 244, the system can accurately determine which of the cooking units the user 250 has given instructions to through utterance.
- In addition, in the first embodiment, a response error message is output if it is difficult to determine which cooking unit the instructions are directed to. The response error message can prompt the user 250 to make an appropriate utterance.
- While the first embodiment described above provides an example in which the audio input and output device 240, the display terminal 260, and the home electric appliances 101 are separate from one another, the present disclosure is not limited to this embodiment. The display terminal 260 may include the audio input and output device 240. Alternatively, each of the home electric appliances 101 may include the audio input and output device 240 and/or the display terminal 260.
- FIG. 27 is a diagram illustrating the configuration of a speech-based appliance control system according to a second embodiment. In the second embodiment, substantially the same elements as those in the first embodiment are assigned the same numerals, and are not described in detail herein. In the following, a description will be given of the second embodiment, focusing on differences from the first embodiment.
- The speech-based appliance control system according to the second embodiment includes the audio input and output device 240, the plurality of home electric appliances 101, the display terminal 260, and an integrated management device 2800. That is, the speech-based appliance control system according to the second embodiment includes the integrated management device 2800 in place of the gateway 102, the information communication network 220, and the cloud server 111 in the speech-based appliance control system according to the first embodiment.
- The integrated management device 2800 is located in the group 100. The integrated management device 2800 may be connected to the display terminal 260, the audio input and output device 240, and the plurality of home electric appliances 101 using wired or wireless connections.
- In the second embodiment, the integrated management device 2800 is separate from the home electric appliances 101. However, the present disclosure is not limited to this embodiment. For example, the oven range 243, the IH cooker 244, or the refrigerator 245 may include the integrated management device 2800.
- FIG. 28A is a block diagram illustrating the hardware configuration of the integrated management device 2800. The integrated management device 2800 includes the communication circuit 650, the processing circuit 670, the speech recognition database (DB) 600, the appliance state management DB 620, the utterance interpretation dictionary DB 625, the appliance function DB 630, and the cooking program DB 640. The processing circuit 670 includes the CPU 671 and the memory 672 in which the computer program 673 is stored. In this manner, the integrated management device 2800 has substantially the same hardware configuration as that of the cloud server 111 illustrated in FIG. 7.
- FIG. 28B is a block diagram illustrating the system configuration of the integrated management device 2800. The integrated management device 2800 includes the communication unit 700, the speech recognition unit 710, the utterance interpretation unit 730, the state management unit 740, the response generation unit 750, and the speech synthesis unit 760. In this manner, the integrated management device 2800 has substantially the same system configuration as that of the cloud server 111 illustrated in FIG. 11.
- FIG. 29 is a diagram illustrating a specific example of the appliance state management DB 620 according to the second embodiment. The appliance state management DB 620 according to the second embodiment holds appliance IDs, appliance names <appliance>, ongoing cooking program IDs, ongoing cooking step IDs, cookware names <equipment>, menu names <menu>, and appliance operating states in association with one another. The appliance state management DB 620 according to the second embodiment is different from the appliance state management DB 620 according to the first embodiment illustrated in FIG. 17 in that no gateway ID is held.
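- A single row of this second-embodiment table might be modeled as follows; the dictionary form and the sample values are illustrative, and the point to note is simply the absence of a gateway ID field.

```python
# One hypothetical row of the appliance state management DB 620 in the
# second embodiment (compare FIG. 29). No gateway ID column is needed
# because the integrated management device 2800 communicates with the
# appliances directly.
row = {
    "appliance_id": "M01-01",
    "appliance_name": "IH cooker:heater 1",
    "ongoing_cooking_program_id": "T001",  # chikuzen-ni stew
    "ongoing_cooking_step_id": "S002",
    "cookware_name": "pot",
    "menu_name": "chikuzen_ni",
    "operating_state": "heating",          # sample value; states are not enumerated here
}
```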
- FIGS. 30A and 30B are sequence diagrams illustrating the operation of the speech-based appliance control system according to the second embodiment. In FIGS. 30A and 30B, substantially the same steps as those in the operation of the speech-based appliance control system according to the first embodiment illustrated in FIGS. 21 to 23 are assigned the same numerals.
- The processes executed by the cloud server 111 in the first embodiment (FIGS. 21 to 23) are executed by the integrated management device 2800 in the second embodiment (FIGS. 30A and 30B).
- In the second embodiment (FIGS. 30A and 30B), the transmission and reception processes between the cloud server 111 and the gateway 102, which are performed in the first embodiment (FIGS. 21 to 23), are not performed, and the received data analysis processes performed by the gateway 102 are not performed.
- In the first embodiment (FIGS. 21 to 23), the display terminal 260 and the cooking appliance 400 transmit data to the gateway 102, whereas in the second embodiment (FIGS. 30A and 30B), the display terminal 260 and the cooking appliance 400 transmit data to the integrated management device 2800.
- The operation of the speech-based appliance control system according to the first embodiment (FIGS. 21 to 23) and the operation of the speech-based appliance control system according to the second embodiment (FIGS. 30A and 30B) are the same except for the points described above.
- FIG. 31 is a sequence diagram illustrating a process for determining the target of audio instructions in the speech-based appliance control system according to the second embodiment. In FIG. 31, substantially the same steps as those in the process of the speech-based appliance control system according to the first embodiment illustrated in FIG. 25 are assigned the same numerals.
- The processes executed by the cloud server 111 in the first embodiment (FIG. 25) are executed by the integrated management device 2800 in the second embodiment (FIG. 31).
- In the second embodiment (FIG. 31), the transmission and reception processes between the cloud server 111 and the gateway 102, which are performed in the first embodiment (FIG. 25), are not performed, and the received data analysis process performed by the gateway 102 is not performed.
- In the first embodiment (FIG. 25), the audio input and output device 240 transmits data to the gateway 102, whereas in the second embodiment (FIG. 31), the audio input and output device 240 transmits data to the integrated management device 2800.
- The process of the speech-based appliance control system according to the first embodiment (FIG. 25) and the process of the speech-based appliance control system according to the second embodiment (FIG. 31) are the same except for the points described above.
- As described above, in the second embodiment, the processes executed by the cloud server 111 in the first embodiment are executed by the integrated management device 2800. The other points are similar to those in the first embodiment, and thus the advantages achievable by the second embodiment are similar to those achievable by the first embodiment.
- In the first embodiment described above, only the cloud server 111 has the appliance state management DB 620. In the second embodiment described above, in contrast, only the integrated management device 2800 has the appliance state management DB 620. However, the present disclosure is not limited to these embodiments.
- FIG. 32 is a block diagram illustrating another example of the hardware configuration of the cooking appliance 400. In FIG. 32, the memory 440 in the cooking appliance 400 additionally includes the appliance state management DB 620.
- In the configuration illustrated in FIG. 32, the cooking appliance 400 holds the information stored in the appliance state management DB 620. Accordingly, upon receipt of a control command including a cooking program ID, the CPU 430 of the cooking appliance 400 can identify an appliance ID from the information stored in the appliance state management DB 620. Thus, the appliance control unit 910 in the cooking appliance 400 can activate the cooking unit (e.g., the heater 2 of the IH cooker 244) corresponding to the identified appliance ID in accordance with the control command.
- While the second embodiment described above provides an example in which the audio input and output device 240, the display terminal 260, the home electric appliances 101, and the integrated management device 2800 are separate from one another, the present disclosure is not limited to this embodiment. The display terminal 260 may include the audio input and output device 240. Alternatively, each of the home electric appliances 101 may include the audio input and output device 240 and/or the display terminal 260. Additionally, each of the home electric appliances 101 may include the integrated management device 2800.
- A description will now be given of a further embodiment in which a user cooks by applying a cooking recipe that uses a plurality of cooking appliances to a speech-based interactive software agent. The speech-based interactive software agent may be mounted in each cooking appliance, or may be installed in a house so that home cooking appliances or home electric appliances are uniformly controllable. By way of example, the speech-based interactive software agent is configured to be able to access each of the home electric appliances and vice versa. In the following, the speech-based interactive software agent is sometimes referred to simply as “the agent”.
- Here, assume, for example, a cooking recipe that uses an IH cooker or a gas stove, a microwave oven, and an oven to make roast beef and its accompanying boiled vegetables.
- The agent notifies the user that the user has to thaw frozen beef, which is an ingredient, and instructs the user to put the frozen beef in the microwave oven and press a thaw button. In this case, the agent accesses the microwave oven to check that the thaw button has been pressed. The agent may also check that the thaw button has been pressed, by receiving a notification from the user that the user has completed carrying out the thawing instructions.
- When it is confirmed that the thaw button has been pressed, the agent instructs the user to boil water using the IH cooker to make the boiled vegetables that will accompany the roast beef. In response to the instructions, the user pours water into a pot and activates the IH cooker. Also in this case, the agent accesses the IH cooker to check that the IH cooker has been activated. Preferably, the press of a boiling-water button or the like on the IH cooker is detected; more simply, the mere activation of the IH cooker may be detected. Alternatively, the user may inform the agent that the water has started to boil. Instead of the IH cooker, a gas stove, an electric kettle, or any other suitable cooking equipment may be used.
- After checking that the water has started to boil, the agent instructs the user to prepare vegetables for boiling. Here, the agent may give detailed instructions, such as how to cut or slice the vegetables, to the user. The user informs the agent that the preparation has been completed.
- After completion of the thawing of the frozen beef or the boiling of the water, the microwave oven and the IH cooker notify the agent of completion of the action. If access between the microwave oven or the IH cooker and the agent is not available, the user may inform the agent of completion of the action.
- After checking the microwave oven and IH cooker and detecting the completion of the thawing of the frozen beef or the boiling of the water, the agent gives further instructions to the user to perform the subsequent operation. Examples of the instructions include taking the beef out of the microwave oven and preparing the beef. Other examples of the instructions include putting the vegetables into the pot filled with boiling water on the IH cooker and boiling the vegetables for, for example, 10 minutes. Placing the vegetables into the pot temporarily reduces the temperature of the water in the pot. Thus, the agent may detect a reduction in temperature, and start a timer to run for 10 minutes after the detection to notify the user when to lift the vegetables out of the pot. The agent may also prompt the user to preheat the oven to, for example, 250 degrees at the same time instructions are given to prepare the beef.
- After detecting the completion of the preparation of the beef, the agent instructs the user to put the beef into the oven. Here, the user may inform the agent of the completion of the preparation of the beef.
- Upon a sensor or the like in the oven detecting that the beef has been put into the oven, the agent activates the oven. For example, the agent causes the oven to perform operations including roasting the beef at 250 degrees for 15 minutes, and then turning the temperature of the oven down to 160 degrees and roasting the beef for a further 40 minutes. The agent may set these operations on the oven automatically, or may instruct the user to operate the oven. The agent may also use the timer to notify the user, at the appropriate time, when to turn down the temperature of the oven. Alternatively, the user may be prompted to set in advance a temperature control program that turns the temperature of the oven down to 160 degrees after 15 minutes.
- After checking that cooking in the oven has been completed, the agent instructs the user to make a sauce from the drippings left over on a roasting tray in the oven. For example, the agent instructs the user to pour the meat juices into a saucepan or a frying pan and to heat the meat juices using the IH cooker. In this case, the agent may automatically set the heating program for the IH cooker to make a sauce, or may instruct the user at an appropriate time to change the heating level. Additionally, instructions for spices and similar items to be added to the meat juices are also given as necessary.
- After detecting the completion of the sauce, the agent instructs the user to arrange the boiled vegetables and the roast beef on a plate. After that, the cooking of the recipe is completed.
- In this manner, when applying a cooking recipe that uses a plurality of cooking appliances, a speech-based interactive software agent can track the progress of the cooking by receiving operation-check notifications from each cooking appliance, and can give instructions to the user in accordance with that progress.
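- Schematically, such an agent can be pictured as the following loop over recipe steps; every name in this sketch is hypothetical, since the description specifies the behavior rather than an implementation.

```python
def run_recipe(agent, steps):
    """Walk a multi-appliance recipe, gating each step on a completion check."""
    for step in steps:
        agent.instruct(step["instruction"])  # e.g. "Put the beef in the oven"
        done = False
        while not done:
            # Preferred path: poll the appliance for the expected event
            # (thaw button pressed, IH cooker activated, oven finished, ...).
            done = agent.appliance_reports(step["appliance"], step["event"])
            # Fallback when the appliance is not accessible: the user
            # notifies the agent of completion directly.
            done = done or agent.user_confirmed(step["id"])
        if "timer_s" in step:                # e.g. boil the vegetables for 10 minutes
            agent.start_timer(step["timer_s"], step.get("timer_message"))
```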
- The processes described above may not necessarily be applied to cooking, and may also be applied to other activities using a plurality of home electric appliances.
- While speech-based appliance control systems according to some embodiments have been described above, the present invention is not limited to the embodiments herein described.
- The processing units included in the speech-based appliance control systems according to the embodiments described above are typically implemented as large scale integrated circuits (LSIs) that are integrated circuits. These processing units may be separated into single chips, or some or all of them may be integrated into a single chip.
- The approach of fabricating an integrated circuit is not limited to an LSI technology, and may be implemented by a dedicated circuit or a general-purpose processor. A field programmable gate array (FPGA) that is programmable after an LSI is fabricated or a reconfigurable processor capable of reconfiguring the connection or setting of circuit cells in the LSI may be used.
- In the embodiments described above, each constituent element may be implemented in dedicated hardware or may be implemented by the execution of a software program suitable for the constituent element. Each constituent element may be implemented by reading and executing, with a program executing unit such as a CPU or a processor, a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
- One embodiment of the present invention may be implemented as the program described above, or may be implemented as a non-transitory computer-readable recording medium storing the program described above. It is to be understood that the program described above may be distributed via a transmission medium such as the Internet.
- The numbers or values used herein are illustrative in order to clarify the present invention, and the present invention is not limited to the illustrated numbers or values.
- In addition, connection relationships between constituent elements are illustrative in order to clarify the present invention, and the connection relationships that achieve the functions in the present invention are not limited to the illustrated connection relationships.
- In addition, the divisions of function blocks in the block diagrams are examples. A plurality of function blocks may be implemented as a single function block, a single function block may be divided into a plurality of pieces, or some functions may be transferred to other function blocks. Additionally, the functions of a plurality of function blocks having similar functions may be processed by a single hardware or software component in parallel or time-sharing fashion.
- While a speech-based appliance control system according to an aspect has been described with reference to some embodiments, the present invention is not limited to the embodiments described herein. Various modifications that might be made by those skilled in the art to the embodiments described herein, and embodiments that might be made by combining constituent elements in different embodiments may also be contained within an aspect without departing from the scope of the present invention.
- The techniques described in the embodiments described above are feasible in, for example, the following cloud service models. However, the models in which the techniques described in the embodiments described above are feasible are not limited to the following models.
- FIG. 33 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 1 (local-data-center-based cloud service). In this model, the service provider 4120 acquires information from the group 4100, and provides a service to a user. In this model, the service provider 4120 has the function of a data center management company. That is, the service provider 4120 owns a cloud server 4203 that manages big data. Accordingly, no data center management company exists.
- In this model, the service provider 4120 operates and manages the data center (cloud server) 4203. In addition, the service provider 4120 manages an operating system (OS) 4202 and an application 4201. The service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the service provider 4120 (arrow 204).
- FIG. 34 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 2 (IaaS-based cloud service). IaaS is an acronym for Infrastructure as a Service, and is a cloud service providing model that provides, as a service via the Internet, the infrastructure itself for building and running a computer system.
- In this model, the data center management company 4110 operates and manages the data center (cloud server) 4203. In addition, the service provider 4120 manages the OS 4202 and the application 4201. The service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the service provider 4120 (arrow 204).
- FIG. 35 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 3 (PaaS-based cloud service). PaaS is an acronym for Platform as a Service, and is a cloud service providing model that provides, as a service via the Internet, a platform serving as a foundation for building and running software.
- In this model, the data center management company 4110 manages the OS 4202, and operates and manages the data center (cloud server) 4203. In addition, the service provider 4120 manages the application 4201. The service provider 4120 provides a service using the OS 4202 managed by the data center management company 4110 and the application 4201 managed by the service provider 4120 (arrow 204).
- FIG. 36 is a diagram illustrating an overview of a service provided by a speech-based appliance control system in service model 4 (SaaS-based cloud service). SaaS is an acronym for Software as a Service. The SaaS-based cloud service is a cloud service providing model in which, for example, a user such as a company or an individual who does not own a data center (cloud server) can use, via a network such as the Internet, an application provided by a platform provider that owns a data center (cloud server).
- In this model, the data center management company 4110 manages the application 4201, manages the OS 4202, and operates and manages the data center (cloud server) 4203. In addition, the service provider 4120 provides a service using the OS 4202 and the application 4201 that are managed by the data center management company 4110 (arrow 204).
- Accordingly, the service provider 4120 provides a service in any of the cloud service models described above. In addition, for example, the service provider 4120 or the data center management company 4110 may develop the OS 4202, the application 4201, a database for big data, or the like by itself, or may outsource the development to a third party.
- The present disclosure is applicable to a speech-based appliance control system that controls a cooking appliance by using speech, and to the cooking appliance used in the speech-based appliance control system.
US10434412B2 (en) | 2014-10-24 | 2019-10-08 | Sony Interactive Entertainment Inc. | Control apparatus, control method, program, and information storage medium |
US10445429B2 (en) | 2017-09-21 | 2019-10-15 | Apple Inc. | Natural language understanding using vocabularies with compressed serialized tries |
US10446143B2 (en) | 2016-03-14 | 2019-10-15 | Apple Inc. | Identification of voice inputs providing credentials |
US10453443B2 (en) | 2014-09-30 | 2019-10-22 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US10490187B2 (en) | 2016-06-10 | 2019-11-26 | Apple Inc. | Digital assistant providing automated status report |
US10496705B1 (en) | 2018-06-03 | 2019-12-03 | Apple Inc. | Accelerated task performance |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US10509862B2 (en) | 2016-06-10 | 2019-12-17 | Apple Inc. | Dynamic phrase expansion of language input |
US10521466B2 (en) | 2016-06-11 | 2019-12-31 | Apple Inc. | Data driven natural language event detection and classification |
US10529332B2 (en) | 2015-03-08 | 2020-01-07 | Apple Inc. | Virtual assistant activation |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10593346B2 (en) | 2016-12-22 | 2020-03-17 | Apple Inc. | Rank-reduced token representation for automatic speech recognition |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US10636424B2 (en) | 2017-11-30 | 2020-04-28 | Apple Inc. | Multi-turn canned dialog |
US10643611B2 (en) | 2008-10-02 | 2020-05-05 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10657328B2 (en) | 2017-06-02 | 2020-05-19 | Apple Inc. | Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling |
US10655951B1 (en) | 2015-06-25 | 2020-05-19 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10684703B2 (en) | 2018-06-01 | 2020-06-16 | Apple Inc. | Attention aware virtual assistant dismissal |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10714117B2 (en) | 2013-02-07 | 2020-07-14 | Apple Inc. | Voice trigger for a digital assistant |
US10720077B2 (en) | 2016-02-18 | 2020-07-21 | Meyer Intellectual Properties Ltd. | Auxiliary button for a cooking system |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10733982B2 (en) | 2018-01-08 | 2020-08-04 | Apple Inc. | Multi-directional dialog |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10741185B2 (en) | 2010-01-18 | 2020-08-11 | Apple Inc. | Intelligent automated assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10795541B2 (en) | 2009-06-05 | 2020-10-06 | Apple Inc. | Intelligent organization of tasks items |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
WO2021008120A1 (en) * | 2019-07-12 | 2021-01-21 | 深圳技术大学 | Method and apparatus for controlling cooking robot through cloud speech |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
CN112664996A (en) * | 2021-02-05 | 2021-04-16 | 广东沃尔姆斯电器有限公司 | Novel intelligent range hood |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US11070949B2 (en) | 2015-05-27 | 2021-07-20 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display |
US11080012B2 (en) | 2009-06-05 | 2021-08-03 | Apple Inc. | Interface for a virtual digital assistant |
CN113359569A (en) * | 2021-06-29 | 2021-09-07 | 海信家电集团股份有限公司 | Menu processing method and device |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
CN113647797A (en) * | 2021-09-09 | 2021-11-16 | 广东美的厨房电器制造有限公司 | Cooking equipment, control method and device thereof and storage medium |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
CN113848745A (en) * | 2021-10-20 | 2021-12-28 | 海信家电集团股份有限公司 | Menu generation method and device |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US11501771B2 (en) * | 2017-08-31 | 2022-11-15 | Samsung Electronics Co., Ltd. | Cooking apparatus and cooking system |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
CN116047922A (en) * | 2021-10-28 | 2023-05-02 | 青岛海尔科技有限公司 | A kitchen equipment control method, device and cooking system |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
WO2023109247A1 (en) * | 2021-12-15 | 2023-06-22 | 聚好看科技股份有限公司 | Food storage device, server, and interface display method |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11766151B2 (en) | 2016-02-18 | 2023-09-26 | Meyer Intellectual Properties Ltd. | Cooking system with error detection |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11886805B2 (en) | 2015-11-09 | 2024-01-30 | Apple Inc. | Unconventional virtual assistant interactions |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US20240099507A1 (en) * | 2019-10-30 | 2024-03-28 | BSH Hausgeräte GmbH | Food preparation system |
US12010262B2 (en) | 2013-08-06 | 2024-06-11 | Apple Inc. | Auto-activating smart responses based on activities from remote devices |
US12014118B2 (en) | 2017-05-15 | 2024-06-18 | Apple Inc. | Multi-modal interfaces having selection disambiguation and text modification capability |
US12197817B2 (en) | 2016-06-11 | 2025-01-14 | Apple Inc. | Intelligent device arbitration and control |
US12223282B2 (en) | 2016-06-09 | 2025-02-11 | Apple Inc. | Intelligent automated assistant in a home environment |
US12301635B2 (en) | 2020-05-11 | 2025-05-13 | Apple Inc. | Digital assistant hardware abstraction |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105737218A (en) * | 2016-02-23 | 2016-07-06 | Midea Group Co., Ltd. | Microwave oven speech control method and microwave oven |
US11410638B1 (en) * | 2017-08-30 | 2022-08-09 | Amazon Technologies, Inc. | Voice user interface for nested content |
US11671311B2 (en) * | 2020-10-23 | 2023-06-06 | Netapp, Inc. | Infrastructure appliance malfunction detection |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030040915A1 (en) * | 2000-03-08 | 2003-02-27 | Roland Aubauer | Method for the voice-controlled initiation of actions by means of a limited circle of users, whereby said actions can be carried out in an appliance |
US6778964B2 (en) * | 2000-02-11 | 2004-08-17 | Bsh Bosch Und Siemens Hausgerate Gmbh | Electrical appliance voice input unit and method with interference correction based on operational status of noise source |
US20130092032A1 (en) * | 2011-10-18 | 2013-04-18 | Bsh Home Appliances Corporation | Intelligent home cooking appliance, associated systems, and/or methods |
US20130201316A1 (en) * | 2012-01-09 | 2013-08-08 | May Patents Ltd. | System and method for server based control |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5688506A (en) | 1979-12-21 | 1981-07-18 | Matsushita Electric Ind Co Ltd | Heater |
JP2002091491A (en) | 2000-09-20 | 2002-03-27 | Sanyo Electric Co Ltd | Voice control system for plural pieces of equipment |
DE102005018276A1 (en) | 2005-04-14 | 2006-10-19 | E.G.O. Elektro-Gerätebau GmbH | Household electrical appliance system, e.g. for a refrigerator in a dwelling, with a voice recognition system that recognizes voice input at the appliances and a control device that outputs data and/or instructions to the appliances |
US8342080B2 (en) | 2009-12-04 | 2013-01-01 | Richardson Steven M | Programmable cooking system and method |
2014
- 2014-08-29 US US14/473,263 patent/US9316400B2/en not_active Expired - Fee Related
- 2014-09-01 EP EP14183039.8A patent/EP2851621B1/en not_active Not-in-force
Cited By (307)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9646614B2 (en) | 2000-03-16 | 2017-05-09 | Apple Inc. | Fast, language-independent method for user authentication by voice |
US10318871B2 (en) | 2005-09-08 | 2019-06-11 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US11928604B2 (en) | 2005-09-08 | 2024-03-12 | Apple Inc. | Method and apparatus for building an intelligent automated assistant |
US11671920B2 (en) | 2007-04-03 | 2023-06-06 | Apple Inc. | Method and system for operating a multifunction portable electronic device using voice-activation |
US11979836B2 (en) | 2007-04-03 | 2024-05-07 | Apple Inc. | Method and system for operating a multi-function portable electronic device using voice-activation |
US11023513B2 (en) | 2007-12-20 | 2021-06-01 | Apple Inc. | Method and apparatus for searching using an active ontology |
US10381016B2 (en) | 2008-01-03 | 2019-08-13 | Apple Inc. | Methods and apparatus for altering audio output signals |
US9626955B2 (en) | 2008-04-05 | 2017-04-18 | Apple Inc. | Intelligent text-to-speech conversion |
US9865248B2 (en) | 2008-04-05 | 2018-01-09 | Apple Inc. | Intelligent text-to-speech conversion |
US10108612B2 (en) | 2008-07-31 | 2018-10-23 | Apple Inc. | Mobile device having human language translation capability with positional feedback |
US11900936B2 (en) | 2008-10-02 | 2024-02-13 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10643611B2 (en) | 2008-10-02 | 2020-05-05 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US11348582B2 (en) | 2008-10-02 | 2022-05-31 | Apple Inc. | Electronic devices with voice command and contextual data processing capabilities |
US10795541B2 (en) | 2009-06-05 | 2020-10-06 | Apple Inc. | Intelligent organization of tasks items |
US11080012B2 (en) | 2009-06-05 | 2021-08-03 | Apple Inc. | Interface for a virtual digital assistant |
US10283110B2 (en) | 2009-07-02 | 2019-05-07 | Apple Inc. | Methods and apparatuses for automatic speech recognition |
US9548050B2 (en) | 2010-01-18 | 2017-01-17 | Apple Inc. | Intelligent automated assistant |
US10706841B2 (en) | 2010-01-18 | 2020-07-07 | Apple Inc. | Task flow identification based on user intent |
US12087308B2 (en) | 2010-01-18 | 2024-09-10 | Apple Inc. | Intelligent automated assistant |
US10741185B2 (en) | 2010-01-18 | 2020-08-11 | Apple Inc. | Intelligent automated assistant |
US11423886B2 (en) | 2010-01-18 | 2022-08-23 | Apple Inc. | Task flow identification based on user intent |
US12165635B2 (en) | 2010-01-18 | 2024-12-10 | Apple Inc. | Intelligent automated assistant |
US10049675B2 (en) | 2010-02-25 | 2018-08-14 | Apple Inc. | User profiling for voice input processing |
US10692504B2 (en) | 2010-02-25 | 2020-06-23 | Apple Inc. | User profiling for voice input processing |
US9633660B2 (en) | 2010-02-25 | 2017-04-25 | Apple Inc. | User profiling for voice input processing |
US10102359B2 (en) | 2011-03-21 | 2018-10-16 | Apple Inc. | Device access using voice authentication |
US10417405B2 (en) | 2011-03-21 | 2019-09-17 | Apple Inc. | Device access using voice authentication |
US11350253B2 (en) | 2011-06-03 | 2022-05-31 | Apple Inc. | Active transport based notifications |
US11120372B2 (en) | 2011-06-03 | 2021-09-14 | Apple Inc. | Performing actions associated with task items that represent tasks to perform |
US11069336B2 (en) | 2012-03-02 | 2021-07-20 | Apple Inc. | Systems and methods for name pronunciation |
US9953088B2 (en) | 2012-05-14 | 2018-04-24 | Apple Inc. | Crowd sourcing information to fulfill user requests |
US11321116B2 (en) | 2012-05-15 | 2022-05-03 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US11269678B2 (en) | 2012-05-15 | 2022-03-08 | Apple Inc. | Systems and methods for integrating third party services with a digital assistant |
US10079014B2 (en) | 2012-06-08 | 2018-09-18 | Apple Inc. | Name recognition system |
US9971774B2 (en) | 2012-09-19 | 2018-05-15 | Apple Inc. | Voice-based media searching |
US11557310B2 (en) | 2013-02-07 | 2023-01-17 | Apple Inc. | Voice trigger for a digital assistant |
US11636869B2 (en) | 2013-02-07 | 2023-04-25 | Apple Inc. | Voice trigger for a digital assistant |
US12009007B2 (en) | 2013-02-07 | 2024-06-11 | Apple Inc. | Voice trigger for a digital assistant |
US12277954B2 (en) | 2013-02-07 | 2025-04-15 | Apple Inc. | Voice trigger for a digital assistant |
US10714117B2 (en) | 2013-02-07 | 2020-07-14 | Apple Inc. | Voice trigger for a digital assistant |
US11862186B2 (en) | 2013-02-07 | 2024-01-02 | Apple Inc. | Voice trigger for a digital assistant |
US10978090B2 (en) | 2013-02-07 | 2021-04-13 | Apple Inc. | Voice trigger for a digital assistant |
US11388291B2 (en) | 2013-03-14 | 2022-07-12 | Apple Inc. | System and method for processing voicemail |
US11798547B2 (en) | 2013-03-15 | 2023-10-24 | Apple Inc. | Voice activated device for use with a voice-based digital assistant |
US9582608B2 (en) | 2013-06-07 | 2017-02-28 | Apple Inc. | Unified ranking with entropy-weighted information for phrase-based semantic auto-completion |
US9620104B2 (en) | 2013-06-07 | 2017-04-11 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US9966060B2 (en) | 2013-06-07 | 2018-05-08 | Apple Inc. | System and method for user-specified pronunciation of words for speech synthesis and recognition |
US9966068B2 (en) | 2013-06-08 | 2018-05-08 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US10657961B2 (en) | 2013-06-08 | 2020-05-19 | Apple Inc. | Interpreting and acting upon commands that involve sharing information with remote devices |
US11727219B2 (en) | 2013-06-09 | 2023-08-15 | Apple Inc. | System and method for inferring user intent from speech inputs |
US12073147B2 (en) | 2013-06-09 | 2024-08-27 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US10185542B2 (en) | 2013-06-09 | 2019-01-22 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US10176167B2 (en) | 2013-06-09 | 2019-01-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US10769385B2 (en) | 2013-06-09 | 2020-09-08 | Apple Inc. | System and method for inferring user intent from speech inputs |
US11048473B2 (en) | 2013-06-09 | 2021-06-29 | Apple Inc. | Device, method, and graphical user interface for enabling conversation persistence across two or more instances of a digital assistant |
US12010262B2 (en) | 2013-08-06 | 2024-06-11 | Apple Inc. | Auto-activating smart responses based on activities from remote devices |
US11314370B2 (en) | 2013-12-06 | 2022-04-26 | Apple Inc. | Method for extracting salient dialog usage from live data |
US10102861B2 (en) | 2014-04-08 | 2018-10-16 | Panasonic Intellectual Property Corporation Of America | Device control method, device management system, and voice input apparatus |
US9431017B2 (en) * | 2014-04-08 | 2016-08-30 | Panasonic Intellectual Property Corporation Of America | Device control method, device management system, and voice input apparatus |
US10515641B2 (en) | 2014-04-08 | 2019-12-24 | Sovereign Peak Ventures, Llc | Device control method, device management system, and voice input apparatus |
US9747903B2 (en) * | 2014-04-08 | 2017-08-29 | Panasonic Intellectual Property Corporation Of America | Device control method, device management system, and voice input apparatus |
US20150287411A1 (en) * | 2014-04-08 | 2015-10-08 | Panasonic Intellectual Property Corporation Of America | Device control method, device management system, and voice input apparatus |
US11699448B2 (en) | 2014-05-30 | 2023-07-11 | Apple Inc. | Intelligent assistant for home automation |
US20150348554A1 (en) * | 2014-05-30 | 2015-12-03 | Apple Inc. | Intelligent assistant for home automation |
US11133008B2 (en) | 2014-05-30 | 2021-09-28 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11810562B2 (en) | 2014-05-30 | 2023-11-07 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US11257504B2 (en) | 2014-05-30 | 2022-02-22 | Apple Inc. | Intelligent assistant for home automation |
US10170123B2 (en) * | 2014-05-30 | 2019-01-01 | Apple Inc. | Intelligent assistant for home automation |
US10417344B2 (en) | 2014-05-30 | 2019-09-17 | Apple Inc. | Exemplar-based natural language processing |
US10657966B2 (en) | 2014-05-30 | 2020-05-19 | Apple Inc. | Better resolution when referencing to concepts |
US10169329B2 (en) | 2014-05-30 | 2019-01-01 | Apple Inc. | Exemplar-based natural language processing |
US10699717B2 (en) | 2014-05-30 | 2020-06-30 | Apple Inc. | Intelligent assistant for home automation |
US10083690B2 (en) | 2014-05-30 | 2018-09-25 | Apple Inc. | Better resolution when referencing to concepts |
US10878809B2 (en) | 2014-05-30 | 2020-12-29 | Apple Inc. | Multi-command single utterance input method |
US10714095B2 (en) | 2014-05-30 | 2020-07-14 | Apple Inc. | Intelligent assistant for home automation |
US12118999B2 (en) | 2014-05-30 | 2024-10-15 | Apple Inc. | Reducing the need for manual start/end-pointing and trigger phrases |
US10497365B2 (en) | 2014-05-30 | 2019-12-03 | Apple Inc. | Multi-command single utterance input method |
US12067990B2 (en) | 2014-05-30 | 2024-08-20 | Apple Inc. | Intelligent assistant for home automation |
US11670289B2 (en) | 2014-05-30 | 2023-06-06 | Apple Inc. | Multi-command single utterance input method |
US11838579B2 (en) | 2014-06-30 | 2023-12-05 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US11516537B2 (en) | 2014-06-30 | 2022-11-29 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US9668024B2 (en) | 2014-06-30 | 2017-05-30 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US12200297B2 (en) | 2014-06-30 | 2025-01-14 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US10904611B2 (en) | 2014-06-30 | 2021-01-26 | Apple Inc. | Intelligent automated assistant for TV user interactions |
US20170205077A1 (en) * | 2014-07-26 | 2017-07-20 | Won Young SEO | Intelligent electric stove |
US10431204B2 (en) | 2014-09-11 | 2019-10-01 | Apple Inc. | Method and apparatus for discovering trending terms in speech requests |
US10390213B2 (en) | 2014-09-30 | 2019-08-20 | Apple Inc. | Social reminders |
US10438595B2 (en) | 2014-09-30 | 2019-10-08 | Apple Inc. | Speaker identification and unsupervised speaker adaptation techniques |
US11356290B2 (en) * | 2014-09-30 | 2022-06-07 | Robert Bosch Gmbh | Method and device for commissioning a smart home appliance |
US20160094360A1 (en) * | 2014-09-30 | 2016-03-31 | Robert Bosch Gmbh | Method and Device for Commissioning a Smart Home Appliance |
US10453443B2 (en) | 2014-09-30 | 2019-10-22 | Apple Inc. | Providing an indication of the suitability of speech recognition |
US9986419B2 (en) | 2014-09-30 | 2018-05-29 | Apple Inc. | Social reminders |
US20170287479A1 (en) * | 2014-10-24 | 2017-10-05 | Sony Interactive Entertainment Inc. | Control device, control method, program and information storage medium |
US10339928B2 (en) * | 2014-10-24 | 2019-07-02 | Sony Interactive Entertainment Inc. | Control device, control method, program and information storage medium |
US10434412B2 (en) | 2014-10-24 | 2019-10-08 | Sony Interactive Entertainment Inc. | Control apparatus, control method, program, and information storage medium |
US20160343376A1 (en) * | 2015-01-12 | 2016-11-24 | YUTOU Technology (Hangzhou) Co., Ltd. | Voice Recognition System of a Robot System and Method Thereof |
US9693207B2 (en) * | 2015-02-26 | 2017-06-27 | Sony Corporation | Unified notification and response system |
US20160255480A1 (en) * | 2015-02-26 | 2016-09-01 | Sony Corporation | Unified notification and response system |
US11231904B2 (en) | 2015-03-06 | 2022-01-25 | Apple Inc. | Reducing response latency of intelligent automated assistants |
US10567477B2 (en) | 2015-03-08 | 2020-02-18 | Apple Inc. | Virtual assistant continuity |
US10930282B2 (en) | 2015-03-08 | 2021-02-23 | Apple Inc. | Competing devices responding to voice triggers |
US11842734B2 (en) | 2015-03-08 | 2023-12-12 | Apple Inc. | Virtual assistant activation |
US11087759B2 (en) | 2015-03-08 | 2021-08-10 | Apple Inc. | Virtual assistant activation |
US10311871B2 (en) | 2015-03-08 | 2019-06-04 | Apple Inc. | Competing devices responding to voice triggers |
US10529332B2 (en) | 2015-03-08 | 2020-01-07 | Apple Inc. | Virtual assistant activation |
US12236952B2 (en) | 2015-03-08 | 2025-02-25 | Apple Inc. | Virtual assistant activation |
US12014117B2 (en) | 2015-03-17 | 2024-06-18 | Amazon Technologies, Inc. | Grouping devices for voice control |
US9984686B1 (en) | 2015-03-17 | 2018-05-29 | Amazon Technologies, Inc. | Mapping device capabilities to a predefined set |
US11429345B2 (en) * | 2015-03-17 | 2022-08-30 | Amazon Technologies, Inc. | Remote execution of secondary-device drivers |
US10976996B1 (en) | 2015-03-17 | 2021-04-13 | Amazon Technologies, Inc. | Grouping devices for voice control |
US10453461B1 (en) * | 2015-03-17 | 2019-10-22 | Amazon Technologies, Inc. | Remote execution of secondary-device drivers |
US11422772B1 (en) | 2015-03-17 | 2022-08-23 | Amazon Technologies, Inc. | Creating scenes from voice-controllable devices |
US10031722B1 (en) | 2015-03-17 | 2018-07-24 | Amazon Technologies, Inc. | Grouping devices for voice control |
US12154016B2 (en) | 2015-05-15 | 2024-11-26 | Apple Inc. | Virtual assistant in a communication session |
US11468282B2 (en) | 2015-05-15 | 2022-10-11 | Apple Inc. | Virtual assistant in a communication session |
US12001933B2 (en) | 2015-05-15 | 2024-06-04 | Apple Inc. | Virtual assistant in a communication session |
US11127397B2 (en) | 2015-05-27 | 2021-09-21 | Apple Inc. | Device voice control |
US11070949B2 (en) | 2015-05-27 | 2021-07-20 | Apple Inc. | Systems and methods for proactively identifying and surfacing relevant content on an electronic device with a touch-sensitive display |
US10356243B2 (en) | 2015-06-05 | 2019-07-16 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US10681212B2 (en) | 2015-06-05 | 2020-06-09 | Apple Inc. | Virtual assistant aided communication with 3rd party service in a communication session |
US11025565B2 (en) | 2015-06-07 | 2021-06-01 | Apple Inc. | Personalized prediction of responses for instant messaging |
US10655951B1 (en) | 2015-06-25 | 2020-05-19 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US11703320B2 (en) | 2015-06-25 | 2023-07-18 | Amazon Technologies, Inc. | Determining relative positions of user devices |
US11010127B2 (en) | 2015-06-29 | 2021-05-18 | Apple Inc. | Virtual assistant for media playback |
US11947873B2 (en) | 2015-06-29 | 2024-04-02 | Apple Inc. | Virtual assistant for media playback |
US10365620B1 (en) | 2015-06-30 | 2019-07-30 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US11340566B1 (en) | 2015-06-30 | 2022-05-24 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US11809150B1 (en) | 2015-06-30 | 2023-11-07 | Amazon Technologies, Inc. | Interoperability of secondary-device hubs |
US11853536B2 (en) | 2015-09-08 | 2023-12-26 | Apple Inc. | Intelligent automated assistant in a media environment |
US11500672B2 (en) | 2015-09-08 | 2022-11-15 | Apple Inc. | Distributed personal assistant |
US11126400B2 (en) | 2015-09-08 | 2021-09-21 | Apple Inc. | Zero latency digital assistant |
US12204932B2 (en) | 2015-09-08 | 2025-01-21 | Apple Inc. | Distributed personal assistant |
US11809483B2 (en) | 2015-09-08 | 2023-11-07 | Apple Inc. | Intelligent automated assistant for media search and playback |
US11954405B2 (en) | 2015-09-08 | 2024-04-09 | Apple Inc. | Zero latency digital assistant |
US11550542B2 (en) | 2015-09-08 | 2023-01-10 | Apple Inc. | Zero latency digital assistant |
US10747498B2 (en) | 2015-09-08 | 2020-08-18 | Apple Inc. | Zero latency digital assistant |
US10671428B2 (en) | 2015-09-08 | 2020-06-02 | Apple Inc. | Distributed personal assistant |
US10366158B2 (en) | 2015-09-29 | 2019-07-30 | Apple Inc. | Efficient word encoding for recurrent neural network language models |
US11010550B2 (en) | 2015-09-29 | 2021-05-18 | Apple Inc. | Unified language modeling framework for word prediction, auto-completion and auto-correction |
US11587559B2 (en) | 2015-09-30 | 2023-02-21 | Apple Inc. | Intelligent device identification |
US12051413B2 (en) | 2015-09-30 | 2024-07-30 | Apple Inc. | Intelligent device identification |
US11809886B2 (en) | 2015-11-06 | 2023-11-07 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11526368B2 (en) | 2015-11-06 | 2022-12-13 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US10691473B2 (en) | 2015-11-06 | 2020-06-23 | Apple Inc. | Intelligent automated assistant in a messaging environment |
US11886805B2 (en) | 2015-11-09 | 2024-01-30 | Apple Inc. | Unconventional virtual assistant interactions |
US10049668B2 (en) | 2015-12-02 | 2018-08-14 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US10354652B2 (en) | 2015-12-02 | 2019-07-16 | Apple Inc. | Applying neural network language models to weighted finite state transducers for automatic speech recognition |
US11853647B2 (en) | 2015-12-23 | 2023-12-26 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10942703B2 (en) | 2015-12-23 | 2021-03-09 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US10223066B2 (en) | 2015-12-23 | 2019-03-05 | Apple Inc. | Proactive assistance based on dialog communication between devices |
US11766151B2 (en) | 2016-02-18 | 2023-09-26 | Meyer Intellectual Properties Ltd. | Cooking system with error detection |
US10720077B2 (en) | 2016-02-18 | 2020-07-21 | Meyer Intellectual Properties Ltd. | Auxiliary button for a cooking system |
US10446143B2 (en) | 2016-03-14 | 2019-10-15 | Apple Inc. | Identification of voice inputs providing credentials |
US9934775B2 (en) | 2016-05-26 | 2018-04-03 | Apple Inc. | Unit-selection text-to-speech synthesis based on predicted concatenation parameters |
US9972304B2 (en) | 2016-06-03 | 2018-05-15 | Apple Inc. | Privacy preserving distributed evaluation framework for embedded personalized systems |
US10249300B2 (en) | 2016-06-06 | 2019-04-02 | Apple Inc. | Intelligent list reading |
US11227589B2 (en) | 2016-06-06 | 2022-01-18 | Apple Inc. | Intelligent list reading |
US11069347B2 (en) | 2016-06-08 | 2021-07-20 | Apple Inc. | Intelligent automated assistant for media exploration |
US10049663B2 (en) | 2016-06-08 | 2018-08-14 | Apple, Inc. | Intelligent automated assistant for media exploration |
US12223282B2 (en) | 2016-06-09 | 2025-02-11 | Apple Inc. | Intelligent automated assistant in a home environment |
US10354011B2 (en) | 2016-06-09 | 2019-07-16 | Apple Inc. | Intelligent automated assistant in a home environment |
US11657820B2 (en) | 2016-06-10 | 2023-05-23 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US11037565B2 (en) | 2016-06-10 | 2021-06-15 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10192552B2 (en) | 2016-06-10 | 2019-01-29 | Apple Inc. | Digital assistant providing whispered speech |
US10733993B2 (en) | 2016-06-10 | 2020-08-04 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US12175977B2 (en) | 2016-06-10 | 2024-12-24 | Apple Inc. | Intelligent digital assistant in a multi-tasking environment |
US10509862B2 (en) | 2016-06-10 | 2019-12-17 | Apple Inc. | Dynamic phrase expansion of language input |
US10067938B2 (en) | 2016-06-10 | 2018-09-04 | Apple Inc. | Multilingual word prediction |
US10490187B2 (en) | 2016-06-10 | 2019-11-26 | Apple Inc. | Digital assistant providing automated status report |
US10089072B2 (en) | 2016-06-11 | 2018-10-02 | Apple Inc. | Intelligent device arbitration and control |
US10942702B2 (en) | 2016-06-11 | 2021-03-09 | Apple Inc. | Intelligent device arbitration and control |
US11152002B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Application integration with a digital assistant |
US10297253B2 (en) | 2016-06-11 | 2019-05-21 | Apple Inc. | Application integration with a digital assistant |
US11749275B2 (en) | 2016-06-11 | 2023-09-05 | Apple Inc. | Application integration with a digital assistant |
US11809783B2 (en) | 2016-06-11 | 2023-11-07 | Apple Inc. | Intelligent device arbitration and control |
US12293763B2 (en) | 2016-06-11 | 2025-05-06 | Apple Inc. | Application integration with a digital assistant |
US10269345B2 (en) | 2016-06-11 | 2019-04-23 | Apple Inc. | Intelligent task discovery |
US12197817B2 (en) | 2016-06-11 | 2025-01-14 | Apple Inc. | Intelligent device arbitration and control |
US10580409B2 (en) | 2016-06-11 | 2020-03-03 | Apple Inc. | Application integration with a digital assistant |
US10521466B2 (en) | 2016-06-11 | 2019-12-31 | Apple Inc. | Data driven natural language event detection and classification |
US10474753B2 (en) | 2016-09-07 | 2019-11-12 | Apple Inc. | Language identification using recurrent neural networks |
US10043516B2 (en) | 2016-09-23 | 2018-08-07 | Apple Inc. | Intelligent automated assistant |
US10553215B2 (en) | 2016-09-23 | 2020-02-04 | Apple Inc. | Intelligent automated assistant |
US11404060B2 (en) | 2016-10-12 | 2022-08-02 | Hisense Visual Technology Co., Ltd. | Electronic device and control method thereof |
US20180102127A1 (en) * | 2016-10-12 | 2018-04-12 | Kabushiki Kaisha Toshiba | Electronic device and control method thereof |
US10522139B2 (en) * | 2016-10-12 | 2019-12-31 | Qingdao Hisense Electronics Co., Ltd. | Electronic device and control method thereof |
US11281993B2 (en) | 2016-12-05 | 2022-03-22 | Apple Inc. | Model and ensemble compression for metric learning |
US10593346B2 (en) | 2016-12-22 | 2020-03-17 | Apple Inc. | Rank-reduced token representation for automatic speech recognition |
US12260234B2 (en) | 2017-01-09 | 2025-03-25 | Apple Inc. | Application integration with a digital assistant |
US11656884B2 (en) | 2017-01-09 | 2023-05-23 | Apple Inc. | Application integration with a digital assistant |
US11204787B2 (en) | 2017-01-09 | 2021-12-21 | Apple Inc. | Application integration with a digital assistant |
US10741181B2 (en) | 2017-05-09 | 2020-08-11 | Apple Inc. | User interface for correcting recognition errors |
US10417266B2 (en) | 2017-05-09 | 2019-09-17 | Apple Inc. | Context-aware ranking of intelligent response suggestions |
US10332518B2 (en) | 2017-05-09 | 2019-06-25 | Apple Inc. | User interface for correcting recognition errors |
US10847142B2 (en) | 2017-05-11 | 2020-11-24 | Apple Inc. | Maintaining privacy of personal information |
US10395654B2 (en) | 2017-05-11 | 2019-08-27 | Apple Inc. | Text normalization based on a data-driven learning network |
US11599331B2 (en) | 2017-05-11 | 2023-03-07 | Apple Inc. | Maintaining privacy of personal information |
US10755703B2 (en) | 2017-05-11 | 2020-08-25 | Apple Inc. | Offline personal assistant |
US10726832B2 (en) | 2017-05-11 | 2020-07-28 | Apple Inc. | Maintaining privacy of personal information |
US11467802B2 (en) | 2017-05-11 | 2022-10-11 | Apple Inc. | Maintaining privacy of personal information |
US11405466B2 (en) | 2017-05-12 | 2022-08-02 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10791176B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Synchronization and task delegation of a digital assistant |
US10789945B2 (en) | 2017-05-12 | 2020-09-29 | Apple Inc. | Low-latency intelligent automated assistant |
US11301477B2 (en) | 2017-05-12 | 2022-04-12 | Apple Inc. | Feedback analysis of a digital assistant |
US11862151B2 (en) | 2017-05-12 | 2024-01-02 | Apple Inc. | Low-latency intelligent automated assistant |
US10410637B2 (en) | 2017-05-12 | 2019-09-10 | Apple Inc. | User-specific acoustic models |
US11380310B2 (en) | 2017-05-12 | 2022-07-05 | Apple Inc. | Low-latency intelligent automated assistant |
US11580990B2 (en) | 2017-05-12 | 2023-02-14 | Apple Inc. | User-specific acoustic models |
US11837237B2 (en) | 2017-05-12 | 2023-12-05 | Apple Inc. | User-specific acoustic models |
US11538469B2 (en) | 2017-05-12 | 2022-12-27 | Apple Inc. | Low-latency intelligent automated assistant |
US10810274B2 (en) | 2017-05-15 | 2020-10-20 | Apple Inc. | Optimizing dialogue policy decisions for digital assistants using implicit feedback |
US10482874B2 (en) | 2017-05-15 | 2019-11-19 | Apple Inc. | Hierarchical belief states for digital assistants |
US12014118B2 (en) | 2017-05-15 | 2024-06-18 | Apple Inc. | Multi-modal interfaces having selection disambiguation and text modification capability |
US12026197B2 (en) | 2017-05-16 | 2024-07-02 | Apple Inc. | Intelligent automated assistant for media exploration |
US12254887B2 (en) | 2017-05-16 | 2025-03-18 | Apple Inc. | Far-field extension of digital assistant services for providing a notification of an event to a user |
US11532306B2 (en) | 2017-05-16 | 2022-12-20 | Apple Inc. | Detecting a trigger of a digital assistant |
US10311144B2 (en) | 2017-05-16 | 2019-06-04 | Apple Inc. | Emoji word sense disambiguation |
US10748546B2 (en) | 2017-05-16 | 2020-08-18 | Apple Inc. | Digital assistant services based on device capabilities |
US10303715B2 (en) | 2017-05-16 | 2019-05-28 | Apple Inc. | Intelligent automated assistant for media exploration |
US10403278B2 (en) | 2017-05-16 | 2019-09-03 | Apple Inc. | Methods and systems for phonetic matching in digital assistant services |
US11675829B2 (en) | 2017-05-16 | 2023-06-13 | Apple Inc. | Intelligent automated assistant for media exploration |
US10909171B2 (en) | 2017-05-16 | 2021-02-02 | Apple Inc. | Intelligent automated assistant for media exploration |
US11217255B2 (en) | 2017-05-16 | 2022-01-04 | Apple Inc. | Far-field extension for digital assistant services |
US10657328B2 (en) | 2017-06-02 | 2020-05-19 | Apple Inc. | Multi-task recurrent neural network architecture for efficient morphology handling in neural language modeling |
CN107327876A (en) * | 2017-08-23 | 2017-11-07 | Vatti Co., Ltd. | Voice-controlled gas stove and method of using the same |
US11501771B2 (en) * | 2017-08-31 | 2022-11-15 | Samsung Electronics Co., Ltd. | Cooking apparatus and cooking system |
US10445429B2 (en) | 2017-09-21 | 2019-10-15 | Apple Inc. | Natural language understanding using vocabularies with compressed serialized tries |
US10755051B2 (en) | 2017-09-29 | 2020-08-25 | Apple Inc. | Rule-based natural language processing |
US10636424B2 (en) | 2017-11-30 | 2020-04-28 | Apple Inc. | Multi-turn canned dialog |
US10733982B2 (en) | 2018-01-08 | 2020-08-04 | Apple Inc. | Multi-directional dialog |
US10733375B2 (en) | 2018-01-31 | 2020-08-04 | Apple Inc. | Knowledge-based framework for improving natural language understanding |
US10789959B2 (en) | 2018-03-02 | 2020-09-29 | Apple Inc. | Training speaker recognition models for digital assistants |
US10592604B2 (en) | 2018-03-12 | 2020-03-17 | Apple Inc. | Inverse text normalization for automatic speech recognition |
US10818288B2 (en) | 2018-03-26 | 2020-10-27 | Apple Inc. | Natural assistant interaction |
US12211502B2 (en) | 2018-03-26 | 2025-01-28 | Apple Inc. | Natural assistant interaction |
US11710482B2 (en) | 2018-03-26 | 2023-07-25 | Apple Inc. | Natural assistant interaction |
US10909331B2 (en) | 2018-03-30 | 2021-02-02 | Apple Inc. | Implicit identification of translation payload with neural machine translation |
US11854539B2 (en) | 2018-05-07 | 2023-12-26 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10928918B2 (en) | 2018-05-07 | 2021-02-23 | Apple Inc. | Raise to speak |
US11145294B2 (en) | 2018-05-07 | 2021-10-12 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US11169616B2 (en) | 2018-05-07 | 2021-11-09 | Apple Inc. | Raise to speak |
US11907436B2 (en) | 2018-05-07 | 2024-02-20 | Apple Inc. | Raise to speak |
US11487364B2 (en) | 2018-05-07 | 2022-11-01 | Apple Inc. | Raise to speak |
US11900923B2 (en) | 2018-05-07 | 2024-02-13 | Apple Inc. | Intelligent automated assistant for delivering content from user experiences |
US10984780B2 (en) | 2018-05-21 | 2021-04-20 | Apple Inc. | Global semantic word embeddings using bi-directional recurrent neural networks |
US11009970B2 (en) | 2018-06-01 | 2021-05-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US10403283B1 (en) | 2018-06-01 | 2019-09-03 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10892996B2 (en) | 2018-06-01 | 2021-01-12 | Apple Inc. | Variable latency device coordination |
US11431642B2 (en) | 2018-06-01 | 2022-08-30 | Apple Inc. | Variable latency device coordination |
US12080287B2 (en) | 2018-06-01 | 2024-09-03 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US11495218B2 (en) | 2018-06-01 | 2022-11-08 | Apple Inc. | Virtual assistant operation in multi-device environments |
US12067985B2 (en) | 2018-06-01 | 2024-08-20 | Apple Inc. | Virtual assistant operations in multi-device environments |
US10684703B2 (en) | 2018-06-01 | 2020-06-16 | Apple Inc. | Attention aware virtual assistant dismissal |
US11360577B2 (en) | 2018-06-01 | 2022-06-14 | Apple Inc. | Attention aware virtual assistant dismissal |
US12061752B2 (en) | 2018-06-01 | 2024-08-13 | Apple Inc. | Attention aware virtual assistant dismissal |
US11386266B2 (en) | 2018-06-01 | 2022-07-12 | Apple Inc. | Text correction |
US10984798B2 (en) | 2018-06-01 | 2021-04-20 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US11630525B2 (en) | 2018-06-01 | 2023-04-18 | Apple Inc. | Attention aware virtual assistant dismissal |
US10720160B2 (en) | 2018-06-01 | 2020-07-21 | Apple Inc. | Voice interaction at a primary device to access call functionality of a companion device |
US10496705B1 (en) | 2018-06-03 | 2019-12-03 | Apple Inc. | Accelerated task performance |
US10504518B1 (en) | 2018-06-03 | 2019-12-10 | Apple Inc. | Accelerated task performance |
US10944859B2 (en) | 2018-06-03 | 2021-03-09 | Apple Inc. | Accelerated task performance |
US11010561B2 (en) | 2018-09-27 | 2021-05-18 | Apple Inc. | Sentiment prediction from textual data |
US10839159B2 (en) | 2018-09-28 | 2020-11-17 | Apple Inc. | Named entity normalization in a spoken dialog system |
US11893992B2 (en) | 2018-09-28 | 2024-02-06 | Apple Inc. | Multi-modal inputs for voice commands |
US11170166B2 (en) | 2018-09-28 | 2021-11-09 | Apple Inc. | Neural typographical error modeling via generative adversarial networks |
US11462215B2 (en) | 2018-09-28 | 2022-10-04 | Apple Inc. | Multi-modal inputs for voice commands |
US11475898B2 (en) | 2018-10-26 | 2022-10-18 | Apple Inc. | Low-latency multi-speaker speech recognition |
US11638059B2 (en) | 2019-01-04 | 2023-04-25 | Apple Inc. | Content playback on multiple devices |
US12136419B2 (en) | 2019-03-18 | 2024-11-05 | Apple Inc. | Multimodality in digital assistant systems |
US11348573B2 (en) | 2019-03-18 | 2022-05-31 | Apple Inc. | Multimodality in digital assistant systems |
US11783815B2 (en) | 2019-03-18 | 2023-10-10 | Apple Inc. | Multimodality in digital assistant systems |
US11675491B2 (en) | 2019-05-06 | 2023-06-13 | Apple Inc. | User configurable task triggers |
US11475884B2 (en) | 2019-05-06 | 2022-10-18 | Apple Inc. | Reducing digital assistant latency when a language is incorrectly determined |
US11217251B2 (en) | 2019-05-06 | 2022-01-04 | Apple Inc. | Spoken notifications |
US12216894B2 (en) | 2019-05-06 | 2025-02-04 | Apple Inc. | User configurable task triggers |
US11307752B2 (en) | 2019-05-06 | 2022-04-19 | Apple Inc. | User configurable task triggers |
US11423908B2 (en) | 2019-05-06 | 2022-08-23 | Apple Inc. | Interpreting spoken requests |
US12154571B2 (en) | 2019-05-06 | 2024-11-26 | Apple Inc. | Spoken notifications |
US11705130B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | Spoken notifications |
US11140099B2 (en) | 2019-05-21 | 2021-10-05 | Apple Inc. | Providing message response suggestions |
US11888791B2 (en) | 2019-05-21 | 2024-01-30 | Apple Inc. | Providing message response suggestions |
US11360739B2 (en) | 2019-05-31 | 2022-06-14 | Apple Inc. | User activity shortcut suggestions |
US11237797B2 (en) | 2019-05-31 | 2022-02-01 | Apple Inc. | User activity shortcut suggestions |
US11289073B2 (en) | 2019-05-31 | 2022-03-29 | Apple Inc. | Device text to speech |
US11657813B2 (en) | 2019-05-31 | 2023-05-23 | Apple Inc. | Voice identification in digital assistant systems |
US11496600B2 (en) | 2019-05-31 | 2022-11-08 | Apple Inc. | Remote execution of machine-learned models |
US11790914B2 (en) | 2019-06-01 | 2023-10-17 | Apple Inc. | Methods and user interfaces for voice-based control of electronic devices |
US11360641B2 (en) | 2019-06-01 | 2022-06-14 | Apple Inc. | Increasing the relevance of new available information |
WO2021008120A1 (en) * | 2019-07-12 | 2021-01-21 | Shenzhen Technology University | Method and apparatus for controlling cooking robot through cloud speech |
US11488406B2 (en) | 2019-09-25 | 2022-11-01 | Apple Inc. | Text detection using global geometry estimators |
US20240099507A1 (en) * | 2019-10-30 | 2024-03-28 | BSH Hausgeräte GmbH | Food preparation system |
US11914848B2 (en) | 2020-05-11 | 2024-02-27 | Apple Inc. | Providing relevant data items based on context |
US12197712B2 (en) | 2020-05-11 | 2025-01-14 | Apple Inc. | Providing relevant data items based on context |
US12301635B2 (en) | 2020-05-11 | 2025-05-13 | Apple Inc. | Digital assistant hardware abstraction |
US11765209B2 (en) | 2020-05-11 | 2023-09-19 | Apple Inc. | Digital assistant hardware abstraction |
US11924254B2 (en) | 2020-05-11 | 2024-03-05 | Apple Inc. | Digital assistant hardware abstraction |
US11838734B2 (en) | 2020-07-20 | 2023-12-05 | Apple Inc. | Multi-device audio adjustment coordination |
US12219314B2 (en) | 2020-07-21 | 2025-02-04 | Apple Inc. | User identification using headphones |
US11750962B2 (en) | 2020-07-21 | 2023-09-05 | Apple Inc. | User identification using headphones |
US11696060B2 (en) | 2020-07-21 | 2023-07-04 | Apple Inc. | User identification using headphones |
CN112664996A (en) * | 2021-02-05 | 2021-04-16 | Guangdong Wo'ermusi Electric Appliance Co., Ltd. | Novel intelligent range hood |
CN113359569A (en) * | 2021-06-29 | 2021-09-07 | Hisense Home Appliances Group Co., Ltd. | Menu processing method and device |
CN113647797A (en) * | 2021-09-09 | 2021-11-16 | Guangdong Midea Kitchen Appliances Manufacturing Co., Ltd. | Cooking appliance, control method and device therefor, and storage medium |
CN113848745A (en) * | 2021-10-20 | 2021-12-28 | Hisense Home Appliances Group Co., Ltd. | Menu generation method and device |
CN116047922A (en) * | 2021-10-28 | 2023-05-02 | Qingdao Haier Technology Co., Ltd. | Kitchen appliance control method and device, and cooking system |
WO2023109247A1 (en) * | 2021-12-15 | 2023-06-22 | Juhaokan Technology Co., Ltd. | Food storage device, server, and interface display method |
Also Published As
Publication number | Publication date |
---|---|
US9316400B2 (en) | 2016-04-19 |
EP2851621B1 (en) | 2016-05-25 |
EP2851621A1 (en) | 2015-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9316400B2 (en) | Appliance control method, speech-based appliance control system, and cooking appliance |
JP6371606B2 (en) | Device control method and audio device control system | |
CN110953609B (en) | Cooking control method, storage medium, cooking control device and cooking system | |
CN204698314U (en) | A kind of intelligent kitchen cooking system | |
JP6586274B2 (en) | Cooking apparatus, cooking method, cooking control program, and cooking information providing method | |
WO2018024913A1 (en) | Cooking system having inductive heating and wireless powering of kitchen appliances | |
CN105785775A (en) | Internet-of-Things-based cooking method | |
US20220013038A1 (en) | Interactive Cooking Application | |
CN111035261B (en) | Cooking control method, device and equipment | |
CN102968579A (en) | Copyright protection method and device for menu and cooking system | |
CN103092681A (en) | Cellphone menu | |
CN112383455B (en) | Data generation method and execution method and equipment | |
US12262459B2 (en) | Method for operating a cooking appliance | |
CN109507962A (en) | Kitchen appliance control method, device, terminal and computer storage medium | |
CN109446228A (en) | Information cuing method, device, terminal and computer storage medium | |
JP2010123101A (en) | Support apparatus of cooking using cooking apparatus capable of controlling temperature by electricity | |
JP6989683B2 (en) | Voice controlled cookware platform | |
CN111616577B (en) | Intelligent auxiliary cooking system and cooking process interactive control method | |
CN113359569A (en) | Menu processing method and device | |
CN113450894B (en) | Structured data, electronic menu generation method and device | |
CN117670596A (en) | Cooking method based on AI large model and intelligent cooking device thereof | |
CN112369122B (en) | Method for operating a cooking appliance | |
CN114680635A (en) | Method, system, main control device and storage medium for generating cooking instruction information | |
Xiaoguang et al. | Design and implementation of smart cooking based on Amazon Echo |
JP2021026121A (en) | Voice controlled cookware platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NISHIKAWA, YURI;YONEDA, AKI;YAMAGAMI, KATSUYOSHI;SIGNING DATES FROM 20140820 TO 20140821;REEL/FRAME:033811/0843 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: SOVEREIGN PEAK VENTURES, LLC, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PANASONIC INTELLECTUAL PROPERTY CORPORATION OF AMERICA;REEL/FRAME:048830/0085 Effective date: 20190308 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Expired due to failure to pay maintenance fee |
Effective date: 20200419 |