CN119002746A - Method, apparatus, device and storage medium for session interaction - Google Patents
- Publication number
- CN119002746A (application number CN202311570257.6A)
- Authority
- CN
- China
- Prior art keywords
- message
- topic
- digital assistant
- user
- conversation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Information Transfer Between Computers (AREA)
Abstract
Embodiments of the present disclosure provide methods, apparatuses, devices, and storage media for session interaction. The method includes: receiving a first session message for a digital assistant from a first user in a session comprising at least one user and the digital assistant; determining context information associated with the first session message according to a trigger mode of the first session message; generating, by the digital assistant, a first reply message to the first session message based on the context information; and presenting the first reply message according to a presentation mode associated with the trigger mode. In this way, replies can be generated based on the question context of multiple users, and multiple users can carry out a discussion with the digital assistant around a given topic. This in turn advantageously improves the efficiency with which the digital assistant assists users.
Description
RELATED APPLICATIONS
The present application claims priority from the Chinese patent application entitled "Method, apparatus, device and storage medium for session interaction", application number 202311482320.0, filed in 2023, the entire contents of which are incorporated herein by reference.
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, relate to a method, apparatus, device, and computer-readable storage medium for session interaction.
Background
With the development of information technology, terminal devices can provide people with a variety of services for work, daily life, and more. An application that provides a service may be deployed in a terminal device. The terminal device presents corresponding content through the user interface of the application, interacts with the user, and meets various user needs. Rich application interaction interfaces are therefore an important means of enhancing the user experience. A terminal device or application may provide digital assistant functionality to help the user use the terminal device or application. Users may need to use the digital assistant in different sessions.
Disclosure of Invention
In a first aspect of the present disclosure, a method for session interaction is provided. The method comprises the following steps: receiving a first session message for the digital assistant from a first user in a session comprising at least one user and the digital assistant; determining context information associated with the first session message according to a trigger mode of the first session message; generating a first reply message by the digital assistant to the first session message based on the context information; and presenting the first reply message according to a presentation mode associated with the trigger mode.
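Purely as an illustrative sketch (not the claimed implementation; all class and function names below are hypothetical), the four steps of the method could be arranged as follows:

```python
from dataclasses import dataclass


@dataclass
class Message:
    sender: str
    text: str
    trigger_mode: str  # "dialog" (not aimed at a topic) or "topic"


class DigitalAssistant:
    def generate_reply(self, message: Message, context: list) -> str:
        # Stand-in for the model-backed generation described elsewhere
        # in the disclosure; a real assistant would call a language model.
        return f"[reply to {message.sender} using {len(context)} context messages]"


def determine_context(history: list, msg: Message) -> list:
    # Topic trigger: take the messages under the topic as context;
    # dialog trigger: use at least the message itself.
    if msg.trigger_mode == "topic":
        return [m.text for m in history]
    return [msg.text]


def presentation_mode_for(trigger_mode: str) -> str:
    # Dialog trigger -> create a new topic; topic trigger -> reply under it.
    return "new_topic" if trigger_mode == "dialog" else "under_topic"


def handle_session_message(history: list, assistant: DigitalAssistant, msg: Message):
    # Steps 2-4 of the claimed method: determine context by trigger mode,
    # generate the reply, and pick the associated presentation mode.
    context = determine_context(history, msg)
    reply = assistant.generate_reply(msg, context)
    return reply, presentation_mode_for(msg.trigger_mode)
```

The mapping from trigger mode to context and presentation mode here is a simplification of the behaviors described in the detailed embodiments below.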
In a second aspect of the present disclosure, an apparatus for session interaction is provided. The device comprises: a receiving module configured to receive a first session message for the digital assistant from a first user in a session including at least one user and the digital assistant; a context information determination module configured to: determining context information associated with the first session message according to a trigger mode of the first session message; a first reply message generation module configured to generate a first reply message for the first session message by the digital assistant based on the context information; and a presentation module configured to present the first reply message according to a presentation mode associated with the trigger mode.
In a third aspect of the present disclosure, an electronic device is provided. The apparatus comprises at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by at least one processing unit, cause the electronic device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The medium has a computer program stored thereon which, when executed by a processor, implements the method of the first aspect.
It should be understood that what is described in this section is not intended to limit the key features or essential features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numerals denote like or similar elements, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a flowchart of an example process for session interaction, according to some embodiments of the present disclosure;
FIGS. 3A-3F illustrate schematic diagrams of example interfaces for session interaction, according to some embodiments of the present disclosure;
FIG. 4 illustrates a flow chart of a process for session interaction according to some embodiments of the present disclosure;
FIG. 5 illustrates a schematic block diagram of an apparatus for session interaction, according to some embodiments of the present disclosure; and
Fig. 6 illustrates a block diagram of an electronic device in which one or more embodiments of the disclosure may be implemented.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be understood as open-ended, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". Other explicit and implicit definitions may also be included below.
In this context, unless explicitly stated otherwise, performing a step "in response to a" does not mean that the step is performed immediately after "a", but may include one or more intermediate steps.
It will be appreciated that the data (including but not limited to the data itself, the acquisition, use, storage or deletion of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
It will be appreciated that prior to using the technical solutions disclosed in the embodiments of the present disclosure, the relevant users, which may include any type of rights subjects, such as individuals, enterprises, groups, etc., should be informed and authorized by appropriate means of the types of information, usage ranges, usage scenarios, etc. involved in the present disclosure according to relevant legal regulations.
For example, in response to receiving an active request from a user, prompt information is sent to the relevant user to explicitly prompt the relevant user that the operation requested to be performed will need to obtain and use information to the relevant user, so that the relevant user may autonomously select whether to provide information to software or hardware such as an electronic device, an application program, a server, or a storage medium that performs the operation of the technical solution of the present disclosure according to the prompt information.
As an alternative but non-limiting implementation, in response to receiving an active request from a relevant user, the prompt information may be sent to the relevant user, for example, in a popup window, where the prompt information may be presented as text. In addition, the popup window may carry a selection control with which the user can choose "agree" or "disagree" to provide information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
As used herein, a "model" can learn associations between respective inputs and outputs from training data, so that after training is completed it can generate a corresponding output for a given input. The generation of the model may be based on machine learning techniques. Deep learning is a machine learning algorithm that processes inputs and provides corresponding outputs through the use of multiple layers of processing units. A neural network model is one example of a deep learning-based model. A "model" may also be referred to herein as a "machine learning model", "machine learning network", or "learning network"; these terms are used interchangeably herein.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. The environment 100 relates to an application creation platform 110 and an application execution platform 140.
As shown in FIG. 1, the application creation platform 110 may provide a creation and release environment for applications to users 105. The user 105 may be referred to as an application creation user, or creator. In some embodiments, the application creation platform 110 may be a low-code platform that provides a collection of tools for application creation. The application creation platform 110 may support visual development of various applications, so that developers can skip manual coding, reducing development cycle time and cost. The application creation platform 110 may be any suitable platform that supports users in developing one or more types of applications, and may include, for example, an application Platform as a Service (aPaaS) platform. Such a platform enables users to develop applications efficiently and supports operations such as application creation and application function adjustment.
The application creation platform 110 may be deployed locally to the terminal device of the user 105 and/or may be supported by a server device. For example, a terminal device of the user 105 may be running a client of the application creation platform 110, which may support user interaction with the application creation platform 110 provided by the server. In the case where the application creation platform 110 is running locally at the user's terminal device, the user 105 may interact directly with the local application creation platform 110 using the terminal device. In the case where the application creation platform 110 operates on a server device, the server device may implement service provision to a client operating in a terminal device based on a communication connection with the terminal device. The application creation platform 110 may present the corresponding pages 130 to the user 105 based on the user's 105 operations to output to the user 105 and/or receive information related to application creation from the user 105.
In some embodiments, the application creation platform 110 may be associated with a corresponding database in which data or information required for the application creation process supported by the application creation platform 110 is stored. For example, the database may store code, description information, and the like corresponding to the respective functional modules constituting an application. The application creation platform 110 may also perform operations such as calling, adding, deleting, and updating functional modules in the database. The database may also store the operations that can be performed on different functional modules. Illustratively, in a scenario in which an application is to be created, the application creation platform 110 may call the corresponding functional modules from the database to build the application.
In embodiments of the present disclosure, the user 105 may create the target application 120 on the application creation platform 110 as needed and issue the target application 120. The target application 120 may be published to any suitable application execution platform 140 as long as the application execution platform 140 is capable of supporting the execution of the target application 120. After release, the target application 120 may be available for operation by one or more users 145. User 145 may be referred to as an end user of target application 120. In some embodiments, the target application 120 may include or be implemented as a digital assistant 122.
The digital assistant 122 may be configured to conduct intelligent dialogs. In the example shown in FIG. 1, the digital assistant 122 may be integrated within the target application 120 as part of the target application 120 to assist in performing task processing within the target application 120. In other examples, the digital assistant 122 may be configured as a stand-alone application, such as a web application or another type of application. In such examples, the digital assistant 122 may be considered the same application as the target application 120. The digital assistant 122 is provided to assist users with various task processing needs across different applications and scenarios. During interaction with the digital assistant 122, the user enters an interaction message, and the digital assistant 122 provides a reply message in response to the user input. In general, the digital assistant 122 is capable of supporting a user in asking questions in natural language, and of performing tasks and providing replies based on its understanding of, and logical reasoning over, the natural language input.
In some embodiments, digital assistant 122 may interact with user 145 as a contact. For example, digital assistant 122 may be implemented in an Instant Messaging (IM) application. Digital assistant 122 may interact with user 145 in a single chat session with user 145. In some embodiments, digital assistant 122 may interact with multiple users in a group chat session that includes multiple users.
For each user 145, the client of the application execution platform 140 may present an interactive window 142 of the target application 120 or the digital assistant 122 in a client interface, such as a session window with the digital assistant 122. The user 145 may enter a session message in the session window, and the target application 120 may determine a reply message from the digital assistant 122 based on the configuration information set at creation time and present it to the user in the interactive window 142. In some embodiments, depending on the configuration of the target application 120, interaction messages with the target application 120 may include multimodal forms of messages, such as text messages (e.g., natural language text), voice messages, image messages, video messages, and so forth.
Similar to the application creation platform 110, the application execution platform 140 may be deployed locally at the terminal device of each user 145 and/or may be supported by a server device. For example, a terminal device of the user 145 may be running a client of the application running platform 140, which may support user interaction with the application running platform 140 provided by the server. In the case where the application execution platform 140 is running locally at the user's terminal device, the user 145 may interact directly with the local application execution platform 140 using the terminal device. In the case where the application running platform 140 runs on a server device, the server device may implement service provision for a client running in the terminal device based on a communication connection with the terminal device. The application execution platform 140 may present the corresponding application pages to the user 145 based on the user's 145 operations to output to the user 145 and/or receive information related to application usage from the user 145.
In some embodiments, the implementation of at least a portion of the functionality of the target application 120, and/or of at least a portion of the functionality of the digital assistant 122 in the target application 120, may be model-based. One or more models 155 may be invoked during creation or execution of the target application 120. In the target application 120, the digital assistant 122 may utilize the model 155 to understand user input and provide a reply to the user based on the output of the model 155.
During creation, the application creation platform 110 may need to use the model 155 when testing the target application 120, to verify that the execution results of the target application 120 are as expected. During execution, in response to different operation requests from a user of the target application 120, the application execution platform 140 may need to utilize the model 155 to determine a response result for the user.
Although shown as independent of the application creation platform 110 and the application execution platform 140, the one or more models 155 may run on the application creation platform 110 and/or the application execution platform 140, or on other remote servers. In some embodiments, the model 155 may be a machine learning model, a deep learning model, a neural network, or the like. In some embodiments, the model may be based on a language model (LM). A language model can acquire question-answering capability by learning from large corpora. The model 155 may also be based on other suitable models.
The application creation platform 110 and/or the application execution platform 140 may execute on an appropriate electronic device. The electronic device here may be any type of device having computing capability, including a terminal device or a server device. The terminal device may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, or game device, including accessories and peripherals of these devices, or any combination thereof. The server device may include, for example, a computing system/server, such as a mainframe, an edge computing node, or a computing device in a cloud environment. In some embodiments, the application creation platform 110 and/or the application execution platform 140 may be implemented based on cloud services.
It should be understood that the structure and function of environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure. For example, while FIG. 1 shows a single user interacting with application creation platform 110 and a single user interacting with application execution platform 140, in practice multiple users may access application creation platform 110 to each create a digital assistant, and each digital assistant may be used to interact with multiple users.
Currently, with the rapid development of digital assistants, users' demands on digital assistants are increasing. For example, in a group chat scenario, multiple users may need to get answers from a digital assistant. In such a scenario, the traditional approach is that the digital assistant can reply only to a specified question, or only with the full question context of a specified user; it cannot reply based on the question contexts of multiple users, nor can multiple users carry out a discussion with the digital assistant around a given topic.
In an embodiment of the present disclosure, an improved scheme for session interaction is provided. In this scheme, a first session message for a digital assistant is received from a first user in a first topic of a group chat session including at least one user and the digital assistant. Based on context information related to the first topic, a first reply message of the digital assistant to the first session message is generated. The first reply message of the digital assistant is then presented in the first topic. In this way, the digital assistant can reply based on the question context of multiple users, and users can carry out a discussion with the digital assistant around a given topic.
Some example embodiments of the present disclosure will be described in detail below with reference to examples of the accompanying drawings. It should be understood that the pages shown in the drawings are merely examples, and that various page designs may actually exist. Individual graphical elements in a page may have different arrangements and different visual representations, one or more of which may be omitted or replaced, and one or more other elements may also be present. Embodiments of the disclosure are not limited in this respect.
FIG. 2 illustrates a flowchart of a process 200 for session interaction, according to some embodiments of the present disclosure. Through the process 200, session interaction between at least one user 145 and the digital assistant 122 may be achieved. In the examples below, for purposes of discussion, the session interaction between the user 145 and the digital assistant 122 is described from the perspective of the application execution platform, e.g., based on pages presented by the application execution platform 140.
In some embodiments, the user 145 may represent any of a number of users, e.g., a first user, a second user, a third user, and so on. The present disclosure is not limited in this regard.
In some embodiments, the application execution platform 140 receives a first session message for the digital assistant 122 from a first user 145 in a session that includes at least one user 145 and the digital assistant 122. For example, the application execution platform 140 receives a first session message sent by the user 145 reading "help me generate a customer deal-closing trend report".
In some embodiments, the first session message may include a mention operation of the digital assistant. In some embodiments, the first session message may be sent directed to the digital assistant in any other manner.
The session described herein may be any suitable type of session, for example, a group chat session. In some embodiments, the session may be a normal group chat session, e.g., as depicted by blocks 210 and 214 shown in FIG. 2. In some embodiments, the session may be a topic group chat session, e.g., as depicted by blocks 212 and 216 shown in FIG. 2. In a topic group chat session, each session message is attributed to a topic. In contrast, in a normal group chat session, session messages are not attributed to topics unless specifically specified.
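The distinction between the two session types could be modeled roughly as follows (a minimal sketch; the class names and the `kind` field are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class SessionMessage:
    sender: str
    text: str
    topic_id: Optional[str] = None  # None unless attributed to a topic


@dataclass
class GroupChatSession:
    kind: str                        # "normal" or "topic"
    messages: list = field(default_factory=list)

    def post(self, msg: SessionMessage):
        # In a topic group chat session, every message belongs to a topic;
        # in a normal group chat session, topic attribution is optional.
        if self.kind == "topic" and msg.topic_id is None:
            raise ValueError("topic group chat: message must be attributed to a topic")
        self.messages.append(msg)
```

Under this model, a normal group chat accepts untargeted messages, while a topic group chat enforces topic attribution at posting time.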
The application execution platform 140 may then determine context information associated with the first session message based on the trigger mode of the first session message.
In some embodiments, the context information may be determined based on time-dimension information associated with the first session message. For example, the context information may be determined based on those session messages within the session whose transmission times differ from the transmission time of the first session message by less than a threshold.
Alternatively or additionally, in some embodiments, the context information may be determined based on user-dimension information associated with the first session message. For example, if one or more users are mentioned in the first session message, or the first session message is sent directed to one or more users, the context information may be determined based on information about those users (e.g., messages they have sent).
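The time-dimension and user-dimension selection just described might be combined in a filter like the following (an illustrative sketch; field names and the 600-second threshold are assumptions):

```python
from dataclasses import dataclass


@dataclass
class Msg:
    sender: str
    text: str
    sent_at: float        # transmission time, in seconds
    mentions: tuple = ()  # users this message mentions or is directed to


def context_for(history: list, query: Msg, time_threshold: float = 600.0) -> list:
    """Collect context along the time and user dimensions (illustrative only)."""
    ctx = []
    for m in history:
        # Time dimension: messages whose transmission time differs from the
        # query's by less than the threshold.
        near_in_time = abs(query.sent_at - m.sent_at) < time_threshold
        # User dimension: messages sent by users the query mentions.
        from_mentioned = m.sender in query.mentions
        if near_in_time or from_mentioned:
            ctx.append(m.text)
    return ctx
```

Whether the two dimensions are combined with OR (as here), AND, or some weighting is a design choice the disclosure leaves open.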
In some embodiments, the trigger mode may include dialog-form triggering, for example, a dialog in a normal group chat session. In other words, in this dialog-form trigger mode, the first user may send the first session message without directing it at any topic. In such an embodiment, the application execution platform 140 may determine the context information based at least on the first session message. For example, the user 145 may send a first session message with the content "help me generate a customer deal-closing trend report", and the application execution platform 140 or the digital assistant may determine the context information based on the content of this session message.
In some embodiments, the trigger mode may include topic-form triggering, i.e., a session message is sent for a topic. For example, such a topic may be a topic in a normal group chat session. As another example, such a topic may be a topic in a topic group chat session. In such embodiments, the context information may be determined based on one or more session messages under the topic.
Further, a first reply message to the first session message by the digital assistant may be generated based on the determined context information. For example, at block 218, the digital assistant 122 obtains as context all of the conversation content related to the first topic. At block 220, the digital assistant 122 generates a first reply message based on the first session message of the user 145 and the contextual information related to the first topic.
In some embodiments, the first reply message may be generated using a machine learning model. For example, a prompt may be generated based on the context information and sent to the machine learning model. At least a portion of the first reply message may then be obtained from the machine learning model.
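The prompt-then-model step could be sketched as follows (the prompt format is invented for illustration, and `model` stands in for the language model 155 of FIG. 1 as any callable that maps a prompt string to text):

```python
def build_prompt(context_messages: list, question: str) -> str:
    """Assemble a prompt from the context information (format is illustrative)."""
    lines = ["Conversation context:"]
    lines += [f"- {m}" for m in context_messages]
    lines += ["", f"Question: {question}", "Answer:"]
    return "\n".join(lines)


def generate_reply(model, context_messages: list, question: str) -> str:
    # Send the assembled prompt to the machine learning model and take its
    # output as (at least part of) the reply message.
    prompt = build_prompt(context_messages, question)
    return model(prompt)
```

In practice the model call would go over a serving API; representing it as a plain callable keeps the sketch self-contained.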
After the first reply message is obtained, it may be presented according to a presentation mode associated with the trigger mode. The presentation mode may indicate where the reply message is presented, its presentation style, and so on.
In some embodiments, as described above, the first session message may be triggered in dialog form, i.e., the trigger mode is dialog-form triggering. In such an embodiment, since the first session message is not directed at any topic, a first topic may be created in the session and the first reply message presented under the first topic.
In some embodiments, as described above, the first session message may be triggered for a certain topic (also referred to as a second topic). In that case, the first reply message may be presented directly under the second topic.
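The two presentation paths can be sketched together (a hypothetical helper; the session stub and names are illustrative, not the disclosed implementation):

```python
class TopicSession:
    """Minimal stand-in for a group chat session with topics (illustrative)."""

    def __init__(self):
        self.topics = {}   # topic_id -> list of presented messages
        self._next = 0

    def create_topic(self) -> str:
        self._next += 1
        topic_id = f"topic-{self._next}"
        self.topics[topic_id] = []
        return topic_id

    def post(self, topic_id: str, text: str):
        self.topics[topic_id].append(text)


def present_reply(session: TopicSession, trigger_mode: str, reply: str,
                  topic_id: str = None) -> str:
    # Dialog-form trigger: the message targeted no topic, so create a new
    # topic in the session and present the reply under it.
    if trigger_mode == "dialog":
        topic_id = session.create_topic()
    # Topic-form trigger: present the reply directly under the targeted topic.
    session.post(topic_id, reply)
    return topic_id
```

Either way the reply ends up attributed to a topic, which is what allows later messages under that topic to feed back into the context.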
In embodiments of the present disclosure, replies can be generated based on the question context of multiple users, and multiple users can carry out a discussion with the digital assistant around a given topic. This in turn advantageously improves the efficiency with which the digital assistant assists users, particularly in multi-user scenarios.
Some example embodiments of the present disclosure are described below with continued reference to the example interfaces shown in FIGS. 3A-3F. FIGS. 3A-3F illustrate schematic diagrams of example interfaces according to some embodiments of the present disclosure. It should be understood that the application interfaces shown in the figures are merely examples, and that various interface designs may actually exist. The individual graphical elements in the interface may have different arrangements and different visual representations, one or more of which may be omitted or replaced, and one or more other elements may also be present. Embodiments of the disclosure are not limited in this respect.
In some embodiments, the first topic may be created in response to a session message from a user. In particular, the application execution platform 140 may receive a session message for the digital assistant 122 during a group chat session. Such a group chat session may be a normal group chat session. For example, at block 210, the user asks the digital assistant 122 a question within a normal group chat session.
Reference is now made to the interface schematic 301 shown in fig. 3A. In the example of fig. 3A, the application execution platform 140 provides a session interaction page 330 for the user 145. The session interaction page 330 includes a message list 320 and a group chat session page 310. A function card 340 (also referred to as a message card) automatically generated by the digital assistant 122 is provided in the group chat session page 310. For example, the function cards automatically generated by the digital assistant 122 may include, but are not limited to, portals to functions that the digital assistant 122 provides, such as the knowledge question-and-answer function, the data update function, the data query function, and the Business Intelligence (BI) report function shown in fig. 3A. A dashboard may also be opened via such a card, and so on.
The group chat session page 310 also presents a session message 343 from the user 145 to the digital assistant 122. For example, the application execution platform 140 receives the session message for the digital assistant from the user in the group chat session and presents it in the group chat session page 310.
In some embodiments, the session message may include a mention operation on the digital assistant 122. Illustratively, the mention operation may include @-mentioning the digital assistant 122. For example, a user in the group chat session sends the message "@XX assistant, help me generate a customer order trend report", as shown by the session message 343 in fig. 3A.
In some embodiments, the session message may not be associated with any existing topic in the group chat session. For example, in a normal group chat session, the user does not send the message under any topic, such as the session message 343 shown in fig. 3A.
In turn, the application execution platform 140 creates a first topic. The first topic may be created based on the user asking the digital assistant a question (i.e., sending a session message), or in response to a reply message being generated for the session message. For example, at block 212, the user asks the digital assistant a question in a normal group chat session. At block 214, the digital assistant 122 creates a topic based on the user's message and replies to the user within the normal group chat session.
The created first topic includes at least the session message sent by the user for the digital assistant and the reply message generated by the digital assistant 122 for that session message, e.g., as shown in the schematics 302 and 303 of figs. 3B and 3C. The digital assistant 122 determines a reply message for the session message. For example, the topic 353 shown in fig. 3B includes the session message 343 shown in fig. 3A and the reply message 351 to the session message 343.
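The structure of such a topic, containing at least the user's session message and the assistant's reply, might be represented as in the following sketch. The dictionary layout, role names, and `create_topic_with_reply` function name are hypothetical, not from the patent.

```python
import itertools

# Monotonically increasing topic identifiers (illustrative only).
_topic_ids = itertools.count(1)

def create_topic_with_reply(session_message: str, reply_message: str) -> dict:
    """Create a topic that contains at least the user's session message
    and the digital assistant's reply to it."""
    return {
        "topic_id": f"topic-{next(_topic_ids)}",
        "messages": [
            {"role": "user", "text": session_message},
            {"role": "assistant", "text": reply_message},
        ],
    }
```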
In some embodiments, if a view request for the first topic is detected in the group chat session, the application execution platform 140 may present a topic area corresponding to the first topic, in which messages related to the first topic are presented. The schematic 303 shown in fig. 3C includes the session interaction page 330, which includes the message list 320 and the group chat session page 310; the group chat session page 310 includes a topic area 355. The topic area 355 includes the session message 354 and a reply message 363 to the session message 354.
By way of example, if the user clicks on the card corresponding to the topic 353 shown in fig. 3B, the topic area 355 may be presented, as shown in fig. 3C. A user 145 has sent "@XX assistant, help me generate a customer order trend report", and the XX assistant replies "A customer order trend report has been generated, please check", as shown in the topic area 355 in fig. 3C.
In some embodiments, topics may not be created in a normal group chat session. The digital assistant may reply to the user directly in the group chat session. If the user asks the digital assistant again, at least a portion of the group messages previously sent by the user may be acquired as context for the digital assistant to generate a further reply. In some embodiments, whether topics are created in a group session is configurable. For example, it may be configured by the creator when creating the application or digital assistant; in this case, the same configuration applies to different group chat sessions. As another example, it may be configured by a group member having preset rights (such as group chat management rights); in this case, different configurations may apply to different group chat sessions.
In some embodiments, a topic has already been created before the first session message is received. For example, a topic previously created by a user may exist in a normal group chat session. If the user mentions the digital assistant in such a topic, the digital assistant may reply to the user based on the context of the topic.
As another example, in a topic group chat session, each message is sent under a topic. The user may mention the digital assistant in one of the topics, or invoke the digital assistant in any other suitable manner.
The schematics 304 and 305 shown in figs. 3D and 3E, respectively, include the session interaction page 330, which includes the message list 320 and a topic group chat session 360. The topic group chat session 360 includes a topic 313 that has already been created. The topic 313 includes a session message from the user, an input control 314, and a reply message 315 to the session message. In some embodiments, the input control 314 may be used to reply to the session message from the user.
In this example, in the topic group chat session, user A mentions the digital assistant 122 in the topic 313 and asks "what is the weather today?". User A or other users may click on the input control 314 in fig. 3D to input a message under the topic 313.
The digital assistant 122 replies to a session message such as "@XX assistant, what is the weather today?" to form a reply message, as shown by the message 315 "@XX assistant [card message]" in fig. 3E.
In some embodiments, if the user clicks anywhere in the interface region corresponding to the topic 313, the messages in the topic 313 may be presented. An example is described with reference to fig. 3F. The schematic 306 shown in fig. 3F includes the session interaction page 330, which includes the message list 320 and a topic window 333. The topic window 333 may be a topic window corresponding to the single topic 313 in the topic group chat session 360.
For example, in the topic group chat session, if user A asks the digital assistant "@XX assistant, what is the weather today?", the XX assistant replies "Today's weather is as follows: XXX", as shown in fig. 3F.
In some embodiments, the group chat session may include multiple users 145 and the digital assistant 122. In some embodiments, the group chat session may be a topic group chat session made up of at least one user 145 and the digital assistant 122. A topic group chat session supports discussion of multiple topics by two or more parties to the session.
In some embodiments, a topic has been created before the first session message is received. For example, as described above, the group chat session is a topic group chat session. As another example, in a normal group chat session, a topic has been created and the user mentions the digital assistant under that topic. In some embodiments, a topic is created in response to a triggering operation by a user in the group chat session. For example, topics may be created manually by a user.
In some embodiments, in response to detecting a view request for the first topic in the group chat session, a topic area corresponding to the first topic is presented, and messages related to the first topic are presented in the topic area. For example, in a normal group chat session, the topic area is the topic area 355 shown in fig. 3C. In a topic group chat session, the topic area can be understood as the topic window 333 in fig. 3F.
In this way, a reply can be generated based on the context of questions from multiple users, and a conversational discussion on a given topic can be carried out between multiple people and the digital assistant.
Fig. 4 illustrates a flow chart of a process 400 for session interaction according to some embodiments of the present disclosure. The process 400 may be implemented at the application execution platform 140. The process 400 is described below with reference to fig. 1.
At block 410, the application execution platform 140 receives a first session message for the digital assistant from a first user in a session that includes at least one user and the digital assistant.
At block 420, the application execution platform 140 determines context information associated with the first session message based on the trigger mode of the first session message.
At block 430, the application execution platform 140 generates a first reply message to the first session message by the digital assistant based on the context information.
At block 440, the application execution platform 140 presents the first reply message according to the presentation mode associated with the trigger mode.
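Blocks 410 to 440 can be put together in one short handler. The following is a hedged sketch of that flow with the model call stubbed out; the `handle_session_message` name, the message layout, and the presentation-mode strings are assumptions for illustration, not the patented implementation.

```python
def handle_session_message(message: dict, history: list[dict]) -> dict:
    # Block 410: the first session message is received (passed in as `message`).
    # Block 420: determine context information based on the trigger mode.
    if message.get("topic_id") is None:
        context = [message]                        # conversational trigger
        presentation = "new_topic"                 # reply under a newly created first topic
    else:
        context = [m for m in history
                   if m.get("topic_id") == message["topic_id"]] + [message]
        presentation = "under_topic"               # reply under the existing (second) topic
    # Block 430: generate a reply based on the context (model call stubbed out).
    reply_text = f"Reply based on {len(context)} context message(s)"
    # Block 440: present the reply in the presentation mode tied to the trigger mode.
    return {"text": reply_text, "presentation": presentation,
            "topic_id": message.get("topic_id")}
```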
In some embodiments, the trigger mode includes triggering in a conversational manner, and the application execution platform 140 determines the context information based at least on the first session message.
In some embodiments, the application execution platform 140 creates a first topic in the conversation, the first topic including at least the first conversation message, and the application execution platform 140 presents the first reply message under the first topic.
In some embodiments, the trigger mode includes triggering for a second topic in the conversation, and the application execution platform 140 determines the context information based on one or more conversation messages in the second topic.
In some embodiments, the application execution platform 140 presents the first reply message under the second topic.
In some embodiments, the session comprises a topic group chat session.
In some embodiments, the first session message includes a mention operation to the digital assistant.
In some embodiments, the session includes multiple users and digital assistants.
In some embodiments, in response to detecting a view request for a first topic in a conversation, the terminal device 110 presents a topic area corresponding to the first topic and presents messages related to the first topic in the topic area.
In some embodiments, the application execution platform 140 provides a prompt generated based on the context information to a machine learning model, and obtains at least a portion of the first reply message from the machine learning model.
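Providing a prompt generated from the context information to a machine learning model might look like this sketch. The prompt wording and the `model` callable interface are assumptions; any text-generation model could stand in behind that callable.

```python
def build_prompt(context_messages: list[dict]) -> str:
    """Render the context messages into a prompt for the machine learning model."""
    rendered = "\n".join(f"{m['sender']}: {m['text']}" for m in context_messages)
    return ("The following is a group conversation. "
            "Generate the digital assistant's reply to the last message.\n"
            + rendered)

def generate_reply(context_messages: list[dict], model) -> str:
    """Obtain at least a portion of the reply message from the model."""
    return model(build_prompt(context_messages))
```

In practice `model` would wrap a call to an actual language-model service; here it is any callable taking a prompt string and returning text.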
In some embodiments, the application execution platform 140 determines the context information based on at least one of: time dimension information associated with the first session message, or user dimension information associated with the first session message.
Fig. 5 illustrates a schematic block diagram of an apparatus 500 for session interaction, according to some embodiments of the present disclosure. The apparatus 500 may be implemented in or included in the application execution platform 140, for example. The various modules/components in the apparatus 500 may be implemented in hardware, software, firmware, or any combination thereof.
As shown, the apparatus 500 includes: a receiving module 510 configured to receive a first session message for a digital assistant from a first user in a session including at least one user and the digital assistant; a context information determination module 520 configured to determine context information associated with the first session message according to a trigger mode of the first session message; a first reply message generation module 530 configured to generate, by the digital assistant, a first reply message to the first session message based on the context information; and a presentation module 540 configured to present the first reply message according to a presentation mode associated with the trigger mode.
In some implementations, the trigger mode includes triggering in a conversational manner, and the context information determination module 520 is further configured to determine the context information based at least on the first conversation message.
In some implementations, the presentation module 540 is further configured to create a first topic in the conversation, the first topic including at least the first conversation message; and presenting the first reply message under the first topic.
In some implementations, the trigger mode includes triggering for a second topic in the conversation, and the context information determination module 520 is further configured to determine the context information based on one or more conversation messages in the second topic.
In some implementations, the presentation module 540 is further configured to present the first reply message under the second topic.
In some implementations, the session includes a topic group chat session.
In some implementations, the first session message includes a mention operation to the digital assistant.
In some implementations, the session includes multiple users and digital assistants.
In some implementations, the apparatus 500 further includes a view request processing module configured to, in response to detecting a view request for a first topic in a conversation, present a topic area corresponding to the first topic and present a message related to the first topic in the topic area.
In some implementations, the first reply message generation module 530 is further configured to: provide a prompt generated based on the context information to a machine learning model; and obtain at least a portion of the first reply message from the machine learning model.
In some implementations, the context information determination module 520 is further configured to determine the context information based on at least one of: time dimension information associated with the first session message, or user dimension information associated with the first session message.
Fig. 6 illustrates a block diagram of an electronic device 600 in which one or more embodiments of the disclosure may be implemented. It should be understood that the electronic device 600 illustrated in fig. 6 is merely exemplary and should not be construed as limiting the functionality and scope of the embodiments described herein. The electronic device 600 illustrated in fig. 6 may include or be implemented as the application execution platform 140 of fig. 1, or the apparatus 500 of fig. 5.
As shown in fig. 6, the electronic device 600 is in the form of a general-purpose electronic device. The components of electronic device 600 may include, but are not limited to, one or more processors or processing units 610, memory 620, storage 630, one or more communication units 640, one or more input devices 650, and one or more output devices 670. The processing unit 610 may be an actual or virtual processor and is capable of performing various processes according to programs stored in the memory 620. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capabilities of electronic device 600.
The electronic device 600 typically includes a number of computer storage media. Such media may be any available media accessible by the electronic device 600, including, but not limited to, volatile and non-volatile media, and removable and non-removable media. The memory 620 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. The storage device 630 may be a removable or non-removable medium, and may include a machine-readable medium such as a flash drive, a magnetic disk, or any other medium that can store information and/or data and that can be accessed within the electronic device 600.
The electronic device 600 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 6, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 620 may include a computer program product 625 having one or more program modules configured to perform the various methods or acts of the various embodiments of the disclosure.
The communication unit 640 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of the electronic device 600 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, the electronic device 600 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 650 may be one or more input devices such as a mouse, keyboard, trackball, etc. The output device 670 may be one or more output devices such as a display, speakers, printer, etc. The electronic device 600 may also communicate with one or more external devices (not shown), such as storage devices, display devices, etc., with one or more devices that enable a user to interact with the electronic device 600, or with any device (e.g., network card, modem, etc.) that enables the electronic device 600 to communicate with one or more other electronic devices, as desired, via the communication unit 640. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium having stored thereon computer-executable instructions, wherein the computer-executable instructions are executed by a processor to implement the method described above is provided. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and comprising computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.
Claims (14)
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2024/131087 (WO2025098500A1) | 2023-11-08 | 2024-11-08 | Method and apparatus for conversation interaction, device, and storage medium |

Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311482320 | 2023-11-08 | | |
| CN2023114823200 | 2023-11-08 | | |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| CN119002746A | 2024-11-22 |

Family ID: 93487732
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311570257.6A (CN119002746A, pending) | Method, apparatus, device and storage medium for session interaction | 2023-11-08 | 2023-11-22 |

Country Status (2)

| Country | Link |
|---|---|
| CN | CN119002746A |
| WO | WO2025098500A1 |
Citations (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102792320A | 2010-01-18 | 2012-11-21 | Apple Inc. | Intelligent automated assistant |
| CN112231463A | 2020-11-10 | 2021-01-15 | Tencent Technology (Shenzhen) Co., Ltd. | Session display method and device, computer equipment and storage medium |
| US20210382925A1 | 2020-06-05 | 2021-12-09 | International Business Machines Corporation | Contextual Help Recommendations for Conversational Interfaces Based on Interaction Patterns |
| CN117009482A | 2023-07-11 | 2023-11-07 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Dialogue processing method and device, electronic equipment and storage medium |
Application events:

- 2023-11-22: CN application CN202311570257.6A filed; published as CN119002746A, status pending
- 2024-11-08: PCT application PCT/CN2024/131087 filed; published as WO2025098500A1
Also Published As

| Publication number | Publication date |
|---|---|
| WO2025098500A1 | 2025-05-15 |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |