US10757196B2 - Method and apparatus for displaying application data in wireless communication system - Google Patents

Method and apparatus for displaying application data in wireless communication system

Info

Publication number
US10757196B2
US10757196B2 US14/635,443 US201514635443A
Authority
US
United States
Prior art keywords
data
timing information
session
multimedia content
gui
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US14/635,443
Other versions
US20150249714A1 (en)
Inventor
Kiran Bharadwaj VEDULA
In-Young Shin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd
Priority to US14/635,443
Assigned to SAMSUNG ELECTRONICS CO., LTD. (assignment of assignors interest). Assignors: SHIN, IN-YOUNG; VEDULA, KIRAN BHARADWAJ
Publication of US20150249714A1
Application granted
Publication of US10757196B2
Expired - Fee Related
Adjusted expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • H04L67/1095 Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40 Support for services or applications
    • H04L65/401 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference
    • H04L65/4015 Support for services or applications wherein the services involve a main real-time session and one or more additional parallel real-time or time sensitive sessions, e.g. white board sharing or spawning of a subconference where at least one of the additional parallel sessions is real time or time sensitive, e.g. white board sharing, collaboration or spawning of a subconference
    • H04L65/608
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00 Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60 Network streaming of media packets
    • H04L65/65 Network streaming protocols, e.g. real-time transport protocol [RTP] or real-time control protocol [RTCP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/04 Protocols specially adapted for terminals or networks with limited capabilities; specially adapted for terminal portability
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/06 Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/10 Protocols in which an application is distributed across nodes in the network
    • H04L67/1097 Protocols in which an application is distributed across nodes in the network for distributed storage of data in networks, e.g. transport arrangements for network file system [NFS], storage area networks [SAN] or network attached storage [NAS]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/14 Session management
    • H04L67/141 Setup of application sessions
    • H04L67/2823
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/56 Provisioning of proxy services
    • H04L67/565 Conversion or adaptation of application format or content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/75 Indicating network or usage conditions on the user display
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Definitions

  • FIG. 5 shows a case where a smartphone as a source device executes a navigation application and transmits the navigation execution screen as contents to a screen in a vehicle as a sink device.
  • The navigation program displays a text for detailed road guidance and a GUI for setting road guidance conditions, together with a map image, on the screen.
  • The text and the GUI data displayed on the screen in the vehicle may be displayed without any loss.
  • FIG. 6 shows a case where a smartphone as a source device executes a video play program and transmits contents to a screen of a TV as a sink device.
  • Contents such as a movie include subtitle data, together with video and audio data.
  • The subtitles may be displayed on the screen of the TV without distortion, or may be synchronized with the video and audio data.
  • A computer readable recording medium is any data storage device that can store data which can be thereafter read by a computer system.
  • Examples of the computer readable recording medium include read-only memory (ROM), random-access memory (RAM), Compact Disc (CD)-Read Only Memories (ROMs), magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet).
  • The computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. Also, functional programs, code, and code segments for accomplishing the present disclosure can be easily construed by programmers skilled in the art to which the present disclosure pertains.
  • An apparatus and method according to an embodiment of the present disclosure may be implemented by hardware, software, or a combination of hardware and software.
  • Such software may be stored, whether or not erasable or re-recordable, in a volatile or non-volatile storage such as a ROM, a memory such as a RAM, a memory chip, a device, or an integrated circuit; and an optically or magnetically recordable and machine (e.g., computer)-readable storage medium such as a CD, a Digital Versatile Disk (DVD), a magnetic disk, or a magnetic tape.
  • The method according to the present disclosure may be implemented by a computer or a portable terminal which includes a controller and a memory.
  • The memory is an example of a machine-readable storage medium suitable for storing a program or programs including instructions for implementing the embodiments of the present disclosure.
  • Accordingly, the present disclosure includes a program including codes for implementing an apparatus or method claimed in any claim, and a machine (computer)-readable storage medium for storing such a program.
  • The program may be electronically transferred through any medium, such as a communication signal delivered over a wired or wireless connection, and the present disclosure properly includes equivalents thereof.
  • The apparatus may receive and store the program from a program providing device connected in a wired or wireless manner.
  • The program providing device may include a memory for storing a program including instructions for instructing a program processor to execute a preset contents protection method and information necessary for contents protection, a communication unit for performing wired or wireless communication with a graphic processor, and a controller for transmitting a corresponding program to a transceiver at the request of the graphic processor or automatically.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Telephone Function (AREA)
  • Telephonic Communication Services (AREA)
  • Digital Computer Display Output (AREA)
  • Information Transfer Between Computers (AREA)

Abstract

Provided is a method for displaying application data in a wireless communication system, the method including transmitting, by a first device, audio and video data out of application data executed by the first device to a second device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, if the WFD session is established through discovery between the first device and the second device, transmitting, by the first device, text or Graphic User Interface (GUI) data related to the transmitted audio and video data, to the second device in a session that is different from the WFD session, and receiving, by the first device, a close command for the application data and sending a message comprising a set parameter for stopping display of the text or GUI data to the second device.

Description

PRIORITY
This patent application claims priority under 35 U.S.C. §119(e) to a U.S. Provisional Patent Application filed in the United States Patent and Trademark Office on Feb. 28, 2014 and assigned Ser. No. 61/946,151, the content of which is incorporated herein by reference.
TECHNICAL FIELD
The present disclosure relates to a method and apparatus for playing contents shared between devices in a wireless communication system.
BACKGROUND
The Wireless Fidelity (Wi-Fi) Display (WFD) standard has been specified to meet the need for transmitting Audio/Video (AV) data with high quality and low latency. A WFD network to which the WFD standard is applied is a network system proposed by the Wi-Fi Alliance that enables Wi-Fi devices to be connected to each other in a peer-to-peer manner without joining a home network, an office network, or a hotspot network. The WFD devices in the WFD network search for information about each other, for example, capability information, establish a WFD session, and render contents received during the WFD session.
The WFD network includes two types of devices, namely a source device and a sink device. The source device mirrors data displayed on its own screen onto a screen of the sink device. The source device and the sink device exchange a first sequence of messages with each other and perform device discovery and service discovery. After completion of device discovery and service discovery, an Internet Protocol (IP) address is allocated to the source device and the sink device. A Transmission Control Protocol (TCP) connection is then established between the source device and the sink device, and the Real Time Streaming Protocol (RTSP) and Real-time Transport Protocol (RTP) stacks of the source device and the sink device are activated.
Capability negotiation between the source device and the sink device is performed through the RTSP, and during capability negotiation, the source device and the sink device exchange M1 through M4 messages. Thereafter, the source device and the sink device exchange WFD session control messages. A data session is established between the source device and the sink device through the RTP.
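For illustration only, the following minimal sketch (not taken from the patent) shows how such an M1 through M4 capability negotiation can be carried over RTSP from the source side. The message layout follows common WFD practice (M1/M2 as OPTIONS, M3 as GET_PARAMETER, M4 as SET_PARAMETER); the port number, URLs, addresses, and parameter values are assumptions used to make the exchange concrete.
```python
import socket

WFD_RTSP_PORT = 7236  # commonly used WFD control port; an assumption in this sketch


def rtsp_request(method: str, url: str, cseq: int, body: str = "") -> bytes:
    """Serialize a bare-bones RTSP/1.0 request, optionally with a parameter body."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    if body:
        lines += ["Content-Type: text/parameters", f"Content-Length: {len(body)}"]
    return ("\r\n".join(lines) + "\r\n\r\n" + body).encode()


def negotiate(sock: socket.socket) -> None:
    """Run a simplified M1/M3/M4 exchange from the source side."""
    # M1: the source asks which RTSP methods the sink supports (the sink's own
    # M2 OPTIONS request travels in the opposite direction and is not shown).
    sock.sendall(rtsp_request("OPTIONS", "*", 1))
    print(sock.recv(4096).decode())
    # M3: the source queries the sink's capabilities.
    sock.sendall(rtsp_request("GET_PARAMETER", "rtsp://localhost/wfd1.0", 2,
                              "wfd_video_formats\r\nwfd_audio_codecs\r\n"))
    print(sock.recv(4096).decode())
    # M4: the source fixes the parameters chosen for the session.
    sock.sendall(rtsp_request("SET_PARAMETER", "rtsp://localhost/wfd1.0", 3,
                              "wfd_presentation_URL: rtsp://192.168.49.1/wfd1.0/streamid=0 none\r\n"))
    print(sock.recv(4096).decode())


if __name__ == "__main__":
    # Placeholder sink address; in practice this comes from Wi-Fi Direct discovery.
    with socket.create_connection(("192.168.49.2", WFD_RTSP_PORT)) as s:
        negotiate(s)
```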
In the WFD network, the User Datagram Protocol (UDP) is used for data transport. UDP provides no reliability, so some packets may be lost during wireless transmission. Since the lost packets are not retransmitted, the loss of data may be noticed by a user. In particular, a loss of text data, such as a subtitle, or of Graphic User Interface (GUI) data is more noticeable to the user than a loss of AV data. Thus, a scheme is needed for improving the quality of text and GUI data in the sink device.
To reduce noticeable distortion in the quality of the text and GUI-related data, the sink device itself merges the AV data with the text and GUI-related data, instead of the source device merging them and transmitting the merged data to the sink device. To merge text such as a subtitle with the AV data in the sink device, techniques such as Synchronized Multimedia Integration Language (SMIL) or Timed Text Markup Language (TTML) may be used. Likewise, to merge the GUI data with the AV data in the sink device, techniques such as Remote View (RVU), Hyper Text Markup Language (HTML) 5, or Consumer Electronics (CE)-HTML may be used.
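As a concrete illustration of the TTML option mentioned above, the sketch below parses a small TTML document to recover subtitle text and timing, which is roughly what a sink device needs before overlaying text on the decoded AV frames. The sample document, cue values, and helper names are assumptions, not part of the patent.
```python
import xml.etree.ElementTree as ET

TTML_NS = "{http://www.w3.org/ns/ttml}"

SAMPLE_TTML = """<?xml version="1.0" encoding="utf-8"?>
<tt xmlns="http://www.w3.org/ns/ttml">
  <body>
    <div>
      <p begin="00:00:01.000" end="00:00:03.500">Turn right in 200 m</p>
      <p begin="00:00:04.000" end="00:00:06.000">Then keep left</p>
    </div>
  </body>
</tt>"""


def to_seconds(clock: str) -> float:
    """Convert an hh:mm:ss.fff TTML clock value to seconds."""
    h, m, s = clock.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)


def parse_ttml(doc: str):
    """Return a list of (begin_s, end_s, text) cues from a TTML document."""
    root = ET.fromstring(doc)
    cues = []
    for p in root.iter(f"{TTML_NS}p"):
        cues.append((to_seconds(p.get("begin")), to_seconds(p.get("end")),
                     "".join(p.itertext()).strip()))
    return cues


if __name__ == "__main__":
    for cue in parse_ttml(SAMPLE_TTML):
        print(cue)   # e.g. (1.0, 3.5, 'Turn right in 200 m')
```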
The above information is presented as background information only to assist with an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the present disclosure.
SUMMARY
An aspect of the present disclosure is to address at least the above-mentioned problems and/or disadvantages and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide a method and apparatus for merging Audio/Video (AV) data with a text or Graphic User Interface (GUI) data in a sink device.
Another aspect of the present disclosure is to provide a method and apparatus for synchronization in a sink device when merging AV data with a text or GUI data.
In accordance with an aspect of the present disclosure, there is provided a method for displaying application data in a wireless communication system, the method including transmitting, by a first device, audio and video data out of application data executed by the first device to a second device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, if the WFD session is established through discovery between the first device and the second device, transmitting, by the first device, text or Graphic User Interface (GUI) data related to the transmitted audio and video data, to the second device in a session that is different from the WFD session, and receiving, by the first device, a close command for the application data and sending a message including a set parameter for stopping display of the text or GUI data to the second device.
In accordance with another aspect of the present disclosure, there is provided a method for displaying application data in a wireless communication system, the method including receiving, by a first device, audio and video data out of application data executed by a second device from a second device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, if the WFD session is established through discovery between the first device and the second device, receiving, by the first device, text or Graphic User Interface (GUI) data related to the received audio and video data, from the second device in a session that is different from the WFD session, merging and displaying, by the first device, the received audio and video data with the text or GUI data, and receiving, by the first device, a message including a set parameter for stopping display of the text or GUI data from the second device and stopping display of the text or GUI data, upon receiving, by the second device, a close command for the application data.
In accordance with another aspect of the present disclosure, there is provided a method for displaying application data in a wireless communication system, the method including transmitting, by a first device, a file including text data out of application data executed by the first device to a second device and parsing, by the first device, the file to acquire timing information regarding the text data and transmitting audio and video data out of the application data and packet data including the acquired timing information to the second device in a Wireless Fidelity (Wi-Fi) Display (WFD) session.
In accordance with another aspect of the present disclosure, there is provided a method for displaying application data in a wireless communication system, the method including receiving, by a first device, a file including text data out of application data executed by a second device from the second device, receiving, by the first device, audio and video data out of the application data and packet data including the acquired timing information from the second device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, and decoding, by the first device, the packet data to detect the timing information, comparing the detected timing information with timing information included in the file received from the second device, and displaying the audio and video data and text data corresponding thereto based on a result of the comparing.
In accordance with another aspect of the present disclosure, there is provided a device for displaying application data in a wireless communication system, the device including a transmitter/receiver configured to transmit audio and video data out of application data executed by the device to another device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, if the WFD session is established through discovery with the another device, to transmit text or Graphic User Interface (GUI) data related to the transmitted audio and video data, to the another device in a session that is different from the WFD session, and to receive a close command for the application data and send a message including a set parameter for stopping display of the text or GUI data to the another device, and a controller configured to control the transmitter/receiver and to set up connection with the another device.
In accordance with another aspect of the present disclosure, there is provided a device for displaying application data in a wireless communication system, the device including a transmitter/receiver configured to receive audio and video data out of application data executed by another device from the another device in a Wireless Fidelity (Wi-Fi) Display (WFD) session, if the WFD session is established through discovery with the another device, to receive text or Graphic User Interface (GUI) data related to the received audio and video data, from the another device in a session that is different from the WFD session, and to receive a message including a set parameter for stopping display of the text or GUI data from the another device upon receiving, by the another device, a close command for the application data and a controller configured to merge and display the received audio and video data with the text or GUI data, to stop display of the text or GUI data upon receiving the message including the set parameter from the another device, and to set up connection with the another device.
In accordance with another aspect of the present disclosure, there is provided a device for displaying application data in a wireless communication system, the device including a transmitter/receiver configured to transmit a file including text data out of currently executed application data to another device, to parse the file to acquire timing information regarding the text data and transmit audio and video data out of the application data and packet data including the acquired timing information to the another device in a Wireless Fidelity (Wi-Fi) Display (WFD) session and a controller configured to control the transmitter/receiver.
In accordance with another aspect of the present disclosure, there is provided a device for displaying application data in a wireless communication system, the device including a transmitter/receiver configured to receive a file including text data out of application data executed by another device from the another device, to receive audio and video data out of the application data and packet data including the acquired timing information from the another device in a Wireless Fidelity (Wi-Fi) Display (WFD) session and a controller configured to decode the packet data for detecting the timing information, to compare the detected timing information with timing information included in the file received from the another device, to display the audio and video data and text data corresponding thereto based on a result of the comparing, and to control the transmitter/receiver.
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other aspects, features and advantages of certain exemplary embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 schematically illustrates a system for sharing contents between WFD devices according to an embodiment of the present disclosure;
FIG. 2 illustrates a series of processes for synchronization between a source device and a sink device according to an embodiment of the present disclosure;
FIG. 3 illustrates a series of processes for synchronization between a source device and a sink device according to another embodiment of the present disclosure;
FIG. 4 is a block diagram of a source device and a sink device for synchronization between the source device and the sink device according to an embodiment of the present disclosure; and
FIGS. 5 and 6 illustrate an example of synchronization between a source device and a sink device according to an embodiment of the present disclosure.
Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
DETAILED DESCRIPTION
Hereinafter, certain exemplary embodiments of the inventive concept will be described in detail with reference to the accompanying drawings. The matters defined herein, such as a detailed construction and elements thereof, are provided to assist in a comprehensive understanding of this description; it will be apparent, however, that the exemplary embodiments may be carried out without those specifically defined matters. Also, the terms described later in the present disclosure are defined in consideration of their functions in the present inventive concept, and may vary depending on an operator's or user's intention or practice. Therefore, the definitions of these terms should be made based on the information throughout this disclosure.
The present disclosure may be modified and realized in various forms, and thus specific embodiments will be exemplified in the drawings and described in detail hereinbelow. However, the present disclosure is not limited to the specific disclosed forms, and should be construed to include all modifications, equivalents, or replacements included in the spirit and technical scope of the present disclosure.
While terms including ordinal numbers, such as "first" and "second", may be used to describe various components, such components are not limited by the terms. The terms are used only for the purpose of distinguishing one component from other components. For example, a second component may be named a first component without departing from the scope of the present disclosure, and in a similar way, the first component may be renamed as the second component. The term "and/or" includes a combination of a plurality of related items or any one of the plurality of related items.
The terms used herein are intended to explain only specific embodiments, and are not intended to limit the present disclosure. A singular expression includes a plural expression unless the context clearly indicates otherwise. It should be understood that the terms "comprising", "including", and "having" used herein are intended to denote the presence of a feature, a number, a step, an operation, an element, a part, or a combination thereof described herein, but not to exclude one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.
Unless otherwise defined, the terms used herein including technical or scientific terms have the same meanings as those understood by those skilled in the art to which the present disclosure pertains. The terms generally defined in dictionaries should be construed to have meanings in agreement with those in the contexts of the related technology, and not construed as ideal or excessively formal meanings unless definitely defined herein.
FIG. 1 schematically illustrates a system (or a WFD system) for sharing contents between Wireless Fidelity (Wi-Fi) Display (WFD) devices according to an embodiment of the present disclosure.
Referring to FIG. 1, the WFD system includes a source device 110 and a sink device 120.
The source device 110 may be a portable device having a relatively small screen, such as a mobile communication device, a smartphone, a tablet Personal Computer (PC), a Portable Multimedia Player (PMP), a Personal Digital Assistant (PDA), or the like.
The source device 110 forms a communication channel with the sink device 120 for wirelessly transmitting multimedia contents. For example, the communication channel may include a Miracast session. More specifically, the source device 110 transmits multimedia contents and a control command to the sink device 120 through the communication channel. The communication channel may be based on the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP). In particular, if WFD is provided between the source device 110 and the sink device 120 in the communication system according to an embodiment of the present disclosure, a streaming protocol between the source device 110 and the sink device 120 is defined in a forward channel.
That is, the source device 110 may transmit multimedia contents and a control command to the sink device 120 on a real-time basis, upon being connected with the sink device 120 through the communication channel. More specifically, the source device 110 converts the multimedia contents into a stream form and transmits the multimedia contents converted into the stream form to the sink device 120 through the communication channel. The source device 110 transmits the control command to the sink device 120 through the communication channel. The control command is a command for controlling playback of the multimedia contents. For example, the control command may include Volume Control, Pause, Resume, Stop, Rewind, and the like. The source device 110 may transmit text or Graphic User Interface (GUI) data, separately from transmission of Audio/Video (AV) data, to the sink device 120.
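A small sketch of how the sink might dispatch the playback control commands listed above onto its local player is shown below. The enumeration and the player interface are illustrative assumptions, since the disclosure does not specify a particular command format or API.
```python
from enum import Enum, auto


class ControlCommand(Enum):
    """Playback control commands named in the description."""
    VOLUME_CONTROL = auto()
    PAUSE = auto()
    RESUME = auto()
    STOP = auto()
    REWIND = auto()


class SinkPlayer:
    """Stand-in for the sink device's media player (hypothetical API)."""

    def handle(self, cmd: ControlCommand, value=None) -> None:
        if cmd is ControlCommand.VOLUME_CONTROL:
            print(f"set volume to {value}")
        elif cmd is ControlCommand.PAUSE:
            print("pause playback")
        elif cmd is ControlCommand.RESUME:
            print("resume playback")
        elif cmd is ControlCommand.STOP:
            print("stop playback and tear down the stream")
        elif cmd is ControlCommand.REWIND:
            print("seek backwards")


SinkPlayer().handle(ControlCommand.VOLUME_CONTROL, 40)
```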
The sink device 120 may be an electronic device having a relatively large screen, such as a desktop PC, a laptop, a smart Television (TV), or the like. The sink device 120 receives the multimedia contents converted into the stream form from the source device 110 through the communication channel, and inversely converts and plays the received multimedia contents. The sink device 120 merges and displays the AV data and the text or GUI data transmitted from the source device 110. The sink device 120 receives the control command from the source device 110 and performs a function corresponding to the control command.
Although one sink device is illustrated in FIG. 1, the present disclosure is not limited to this example, and the present disclosure may also be applied when two or more sink devices exist. For convenience, the following description will be made using an example where there is one sink device.
A text or a GUI is generally related to the program that is currently being played. When AV data and text or GUI data are merged in a sink device, there are an AV session (or a Miracast session) for the AV data and a separate session for the text or GUI data. An operation in the source device needs to be reflected in the sink device, and the sink device needs to synchronize the AV data with the text/GUI data. That is, the operation in the source device has to be displayed in temporal synchronization with the AV session and has to be displayed while the AV session for the corresponding program is ongoing. Multiple programs may be launched and closed in the source device during the lifetime of the Miracast session, and a text/GUI related to a program has to be closed when the program is closed. The program may be closed in the source device by the user's input of a close button or a back button. Thus, a mechanism is needed for the source device to inform the sink device that it needs to close a program each time an operation related to closing of the program is performed in the source device.
FIG. 2 illustrates a series of processes for synchronization between a source device and a sink device according to an embodiment of the present disclosure.
Referring to FIG. 2, a source device and a sink device discover each other and start a Miracast session in operation 201. In the source device, a user executes a program such as a video player and inputs a command for playing a video in operation 202. The source device delivers the corresponding AV contents to the sink device in the Miracast session in operation 203. The source device delivers subtitle/GUI contents to the sink device in a session separate from the Miracast session (for example, a File Transfer Protocol (FTP) session) in operation 204. Alternatively, the subtitle/GUI contents may be delivered by adding a new media component to the Moving Picture Experts Group (MPEG) packets of the Miracast session. The sink device merges and displays the subtitle/GUI contents with the already received AV contents in operation 205. During the Miracast session, the source device continues to transmit the AV contents to the sink device.
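One way to realize operation 204, assuming the separate session is an ordinary FTP transfer as in the example above, is sketched here. The host address, login, and file name are placeholders; the use of Python's standard ftplib follows the FTP example in the text rather than any requirement of the disclosure.
```python
import io
from ftplib import FTP


def push_subtitle_file(sink_host: str, ttml_bytes: bytes,
                       remote_name: str = "subtitles.ttml") -> None:
    """Upload the subtitle/GUI document to the sink over a separate FTP session."""
    with FTP(sink_host) as ftp:        # control connection to the sink
        ftp.login()                    # anonymous login; an assumption for this sketch
        ftp.storbinary(f"STOR {remote_name}", io.BytesIO(ttml_bytes))


# Example usage with the sample TTML document from the earlier sketch:
# push_subtitle_file("192.168.49.2", SAMPLE_TTML.encode())
```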
Upon input of a command from the user to the source device for terminating the currently playing video in operation 207, the source device sends an RTSP message including a new set parameter "stop_media_components" for stopping media components to the sink device in operation 208. The sink device then stops the media components, such as the currently displayed GUI/text, according to the RTSP message received from the source device in operation 209. Thereafter, the Miracast session may remain ongoing and the user may start another AV application in the source device in operation 210.
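The sketch below illustrates operation 208 under stated assumptions: it builds an RTSP SET_PARAMETER request carrying the proposed "stop_media_components" parameter and sends it over an already connected control socket. The parameter name comes from the disclosure; its value syntax and the surrounding message framing are assumptions.
```python
import socket


def send_stop_media_components(sock: socket.socket, cseq: int,
                               components=("text", "gui")) -> None:
    """Ask the sink to stop rendering the listed media components (operation 208)."""
    body = "stop_media_components: " + ", ".join(components) + "\r\n"
    msg = (
        "SET_PARAMETER rtsp://localhost/wfd1.0 RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        "Content-Type: text/parameters\r\n"
        f"Content-Length: {len(body)}\r\n"
        "\r\n"
        f"{body}"
    )
    sock.sendall(msg.encode())
    # The sink is expected to answer "RTSP/1.0 200 OK" and then remove the
    # currently displayed text/GUI overlays (operation 209).
    print(sock.recv(4096).decode())
```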
The text/GUI contents may be delivered in the form of SMIL/TTML; however, SMIL/TTML files are played based only on the time synchronization defined within SMIL/TTML, without considering the timing of the video files. As a result, when the sink device displays a text, the text may be displayed without being synchronized with the AV contents. This problem occurs because the same contents are streamed from one device to another device. Thus, the WFD network needs to synchronize the play timing of an SMIL file with the timing of the video contents.
Therefore, an embodiment of the present disclosure proposes a method in which the source device inserts text timing information (SMIL/TTML timing information) into the MPEG packets of the AV contents transmitted to the sink device. The timing information indicates which SMIL/TTML text the sink device should detect, and when, while rendering the contents.
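A sketch of this insertion step is given below under stated assumptions: the timing information (a cue's begin and end times, in milliseconds) is packed into a small private payload framed as a 188-byte transport-stream packet. Neither the PID, the payload tag, nor the byte layout is specified by the patent; they are chosen here only to make the idea concrete.
```python
import struct

TS_PACKET_SIZE = 188
PRIVATE_PID = 0x1FFE          # arbitrary PID chosen for this sketch


def build_timing_packet(cue_begin_ms: int, cue_end_ms: int, continuity: int) -> bytes:
    """Frame one cue's timing information as a TS-like packet on a private PID."""
    header = bytes([
        0x47,                                   # TS sync byte
        0x40 | ((PRIVATE_PID >> 8) & 0x1F),     # payload_unit_start flag + PID high bits
        PRIVATE_PID & 0xFF,                     # PID low bits
        0x10 | (continuity & 0x0F),             # payload only + continuity counter
    ])
    payload = struct.pack(">4sII", b"TTML", cue_begin_ms, cue_end_ms)
    return (header + payload).ljust(TS_PACKET_SIZE, b"\xff")  # pad to 188 bytes


# One packet per subtitle cue, multiplexed with the captured-screen AV packets.
pkt = build_timing_packet(1000, 3500, continuity=0)
assert len(pkt) == TS_PACKET_SIZE
```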
FIG. 3 illustrates a series of processes for synchronization between a source device and a sink device according to another embodiment of the present disclosure.
Referring to FIG. 3, the source device delivers an SMIL/TTML file including a text to the sink device in operation 301. The SMIL/TTML file includes timing information indicating when to display which text. The source device parses the SMIL/TTML file delivered to the sink device and acquires the timing information for the text. Thereafter, as part of the Miracast session operation, the source device captures a screen and packetizes the captured screen information. During packetization, the source device incorporates the timing information acquired from the SMIL/TTML file into a packet in operation 302. The source device delivers the packet (for example, an MPEG2-TS packet) including the timing information to the sink device in operation 303. The sink device decodes the packet to detect the SMIL/TTML timing information, parses the already received SMIL/TTML file, compares the timing information included in the SMIL/TTML file with the detected timing information to find the corresponding text, and displays the text together with the AV data in operation 304. If the user inputs a pause command to the source device, the source device transmits a trigger for pause to the sink device in operation 305. The sink device sends an RTSP message to the source device in response to the pause in operation 306 and pauses the currently displayed text in operation 307.
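A minimal end-to-end sketch of operations 301 to 304 follows, assuming a toy SMIL fragment with begin/end attributes and assuming the SMIL timing value rides in a per-packet metadata field next to the captured-screen payload rather than in a specific MPEG2-TS descriptor.

import xml.etree.ElementTree as ET
from typing import List, Optional, Tuple

SMIL_SAMPLE = """<smil><body><par>
  <text src="sub1.txt" begin="0s" end="5s" region="subtitle"/>
  <text src="sub2.txt" begin="5s" end="12s" region="subtitle"/>
</par></body></smil>"""

def parse_cues(smil_xml: str) -> List[Tuple[float, float, str]]:
    # Both devices do this: extract (begin, end, src) cues from the SMIL file (operation 301).
    cues = []
    for text in ET.fromstring(smil_xml).iter("text"):
        begin = float(text.get("begin").rstrip("s"))
        end = float(text.get("end").rstrip("s"))
        cues.append((begin, end, text.get("src")))
    return cues

def packetize(frame: bytes, media_time_s: float) -> dict:
    # Source side (operation 302): attach the SMIL timing information to the packetized screen capture.
    return {"payload": frame, "smil_time": media_time_s}

def select_cue(packet: dict, cues: List[Tuple[float, float, str]]) -> Optional[str]:
    # Sink side (operation 304): compare the packet's timing with the SMIL cues and pick the text to overlay.
    t = packet["smil_time"]
    for begin, end, src in cues:
        if begin <= t < end:
            return src
    return None

cues = parse_cues(SMIL_SAMPLE)
pkt = packetize(b"<captured screen bytes>", media_time_s=6.0)
print(select_cue(pkt, cues))  # -> "sub2.txt": displayed together with the AV data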
FIG. 4 is a block diagram of a source device and a sink device for synchronization between the source device and the sink device according to an embodiment of the present disclosure.
Referring to FIG. 4, the source device and the sink device may each include a controller 410, a transmitter/receiver 420, and a storage unit 430. Although a single device is illustrated in FIG. 4 for convenience, the source device and the sink device may be separate devices, as would be obvious to those of ordinary skill in the art.
The controller 410 controls the transmitter/receiver 420 and the storage unit 430 to perform the series of operations for synchronization between devices according to an embodiment of the present disclosure. The overall control operation is the same as the operation already described with reference to FIGS. 2 and 3, and thus will not be described in detail again here.
The transmitter/receiver 420 transmits and receives data, messages, signals, and the like between the source device and the sink device for synchronization between devices according to an embodiment of the present disclosure, under the control of the controller 410. The storage unit 430 stores data to be transmitted to the counterpart device or data received from the counterpart device. The overall storage operation is the same as the operation already described with reference to FIGS. 2 and 3, and thus will not be described in detail again here.
Although not shown in FIG. 4, the source device and the sink device may further include an input unit for receiving a user's command and a display unit for displaying contents. The input unit and the display unit may be configured as a single unit depending on the device type. In particular, the input unit included in the source device 110 and the input unit included in the sink device may include multiple input keys and function keys for receiving number or character information and for setting and controlling functions, and may be formed by any one of input means such as a touch key, a touch pad, and a touch screen, or a combination thereof.
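A minimal object model of the block diagram of FIG. 4 is sketched below; only the controller/transmitter-receiver/storage-unit structure comes from the description above, and the class and method names are hypothetical.

class TransmitterReceiver:
    # 420: exchanges data, messages, and signals with the counterpart device.
    def send(self, data):
        raise NotImplementedError

    def receive(self):
        raise NotImplementedError

class StorageUnit:
    # 430: holds data to be transmitted to, or received from, the counterpart device.
    def __init__(self):
        self.items = {}

    def store(self, key, data):
        self.items[key] = data

class Controller:
    # 410: drives the transmitter/receiver and the storage unit through the
    # synchronization operations described with reference to FIGS. 2 and 3.
    def __init__(self, txrx, storage):
        self.txrx = txrx
        self.storage = storage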
FIGS. 5 and 6 illustrate an example of synchronization between a source device and a sink device according to an embodiment of the present disclosure.
FIG. 5 shows a case where a smartphone serving as a source device executes a navigation application and transmits the navigation screen as contents to an in-vehicle screen serving as a sink device. The navigation program displays, together with a map image, a text for detailed road guidance and a GUI for setting road-guidance conditions. In this case, by applying the synchronization method according to an embodiment of the present disclosure described with reference to FIGS. 2 and 3, the text and the GUI data may be displayed on the in-vehicle screen without any loss.
FIG. 6 shows a case where a smartphone serving as a source device executes a video play program and transmits contents to the screen of a TV serving as a sink device. Contents such as a movie include subtitle data together with video and audio data. In this case, by applying the synchronization method according to an embodiment of the present disclosure described with reference to FIGS. 2 and 3, subtitles may be displayed on the TV screen without distortion, or may be synchronized with the video and audio data.
Certain aspects of the present disclosure can also be embodied as computer readable code on a computer readable recording medium. A computer readable recording medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable recording medium include read-only memory (ROM), random-access memory (RAM), compact disc read-only memories (CD-ROMs), magnetic tapes, floppy disks, optical data storage devices, and carrier waves (such as data transmission through the Internet). The computer readable recording medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. Also, functional programs, code, and code segments for accomplishing the present disclosure can easily be construed by programmers skilled in the art to which the present disclosure pertains.
An apparatus and method according to an embodiment of the present disclosure may be implemented by hardware, software, or a combination of hardware and software. Such software may be stored, whether or not it is erasable or re-recordable, in volatile or non-volatile storage such as a ROM, in memory such as a RAM, a memory chip, a device, or an integrated circuit, or on an optically or magnetically recordable, machine (e.g., computer)-readable storage medium such as a CD, a Digital Versatile Disc (DVD), a magnetic disk, or a magnetic tape. The method according to the present disclosure may be implemented by a computer or a portable terminal that includes a controller and a memory, the memory being an example of a machine-readable storage medium suitable for storing a program or programs including instructions for implementing the embodiments of the present disclosure.
Therefore, the present disclosure includes a program including code for implementing the apparatus or method set forth in any of the appended claims, and a machine (computer)-readable storage medium storing such a program. The program may be transferred electronically through any medium, such as a communication signal delivered over a wired or wireless connection, and the present disclosure properly includes equivalents thereof.
The apparatus according to an embodiment of the present disclosure may receive and store the program from a program providing device connected thereto in a wired or wireless manner. The program providing device may include a memory for storing a program, which includes instructions for instructing a program processor to execute a preset contents protection method, and information necessary for contents protection; a communication unit for performing wired or wireless communication with a graphic processor; and a controller for transmitting the corresponding program to a transceiver, either at the request of the graphic processor or automatically.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents.

Claims (4)

What is claimed is:
1. A method for displaying data in a wireless communication system, the method comprising:
establishing a first session corresponding to a wifi display session between a first device and a second device;
establishing a second session other than the first session between the first device and the second device;
displaying, by the first device, application data including graphic user interface (GUI) data and playing multimedia content, with the multimedia content comprising audio data and video data, and the GUI data being associated with the video data;
transmitting, by the first device, a synchronized multimedia integration language (SMIL) file including the GUI data and timing information associated with the GUI data and the multimedia content, to the second device through the second session;
after transmission of the SMIL file to the second device, generating, by the first device, one or more packets by packetizing the multimedia content currently being displayed by the first device and inserting timing information associated with the SMIL file into the one or more packets;
transmitting, by the first device, the one or more packets to the second device through the first session;
decoding, by the second device, the one or more packets to detect the timing information;
parsing, by the second device, the SMIL file and extracting the timing information associated with the GUI data and the multimedia content;
comparing, by the second device, the extracted timing information with timing information of the one or more packets;
merging and displaying, by the second device, the multimedia content and the GUI data based on comparing the timing information included in the SMIL with the timing information inserted into the one or more packets; and
transmitting, by the first device, a message comprising a set parameter for stopping display of the GUI data to the second device in response to detecting a command for stopping play of the multimedia content at the first device.
2. A method for displaying data in a wireless communication system, the method comprising:
establishing a first session corresponding to a wifi display session between a first device and a second device;
establishing a second session other than the first session between the first device and the second device;
receiving, by the first device, a synchronized multimedia integration language (SMIL) file including graphic user interface (GUI) data and timing information associated with the GUI data and multimedia content from the second device through the second session, with the multimedia content comprising audio data and video data, and the GUI data being associated with the video data;
receiving, by the first device, one or more packets including the multimedia content currently being displayed by the second device and timing information associated with the SMIL file, from the second device, through the first session;
decoding, by the first device, the one or more packets to detect the timing information;
parsing, by the first device, the SMIL file and extracting the timing information associated with the GUI data and the multimedia content;
comparing, by the first device, the timing information included in the SMIL with the timing information included in the one or more packets;
merging and displaying, by the first device, the multimedia content and the GUI data based on the comparing the timing information included in the SMIL with the timing information inserted into the one or more packets; and
receiving, by the first device, a message comprising a set parameter for stopping display of the GUI data from the second device and stopping display of the GUI data in response to detecting a command for stopping play of the multimedia content at the first device.
3. A first device for displaying data in a wireless communication system, the first device comprising:
a transmitter/receiver configured to transmit or receive data; and
at least one processor configured to:
establish a first session corresponding to a wifi display session between the first device and a second device,
establish a second session other than the first session between the first device and the second device,
display application data including graphic user interface (GUI) data and play multimedia content, with the multimedia content comprising audio data and video data, and the GUI data being associated with the video data,
control transmission of a synchronized multimedia integration language (SMIL) file including the GUI data and timing information associated with the GUI data and the multimedia content, to the second device through the second session,
after transmission of the SMIL file to the second device, generate one or more packets by packetizing the multimedia content currently being displayed by the first device and insert timing information associated with the SMIL file into the one or more packets,
control transmission of the one or more packets to the second device through the first session,
control decoding the one or more packets to detect the timing information;
control parsing the SMIL file and extracting the timing information associated with the GUI data and the multimedia content;
compare the extracted timing information with timing information of the one or more packets, and
control transmission of a message comprising a set parameter for stopping display of the GUI data to the second device in response to detecting a command for stopping play of the multimedia content at the first device, with the multimedia content and the GUI data being merged based on comparing the timing information included in the SMIL with the timing information inserted into the one or more packets.
4. A first device for displaying data in a wireless communication system, the first device comprising:
a transmitter/receiver configured to receive or transmit data; and
at least one processor configured to:
establish a first session corresponding to a wifi display session between the first device and a second device,
establish a second session other than the first session between the first device and the second device,
control reception of a synchronized multimedia integration language (SMIL) file including graphic user interface (GUI) data and timing information associated with the GUI data and multimedia content from the second device through the second session, with the multimedia content comprising audio data and video data, and the GUI data being associated with the video data,
control reception of one or more packets including the multimedia content currently being displayed by the second device and timing information associated with the SMIL file, from the second device, through the first session,
control decoding the one or more packets to detect the timing information,
control parsing the SMIL file and extracting the timing information associated with the GUI data and the multimedia content;
compare the timing information included in the SMIL with the timing information included in the one or more packets, and
control reception of a message comprising a set parameter for stopping display of the GUI data from the second device and stop display of the GUI data in response to detecting a command for stopping play of the multimedia content at the first device, with the multimedia content and the GUI data being merged based on comparing the timing information included in the SMIL with the timing information inserted into the one or more packets.
US14/635,443 2014-02-28 2015-03-02 Method and apparatus for displaying application data in wireless communication system Expired - Fee Related US10757196B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/635,443 US10757196B2 (en) 2014-02-28 2015-03-02 Method and apparatus for displaying application data in wireless communication system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461946151P 2014-02-28 2014-02-28
US14/635,443 US10757196B2 (en) 2014-02-28 2015-03-02 Method and apparatus for displaying application data in wireless communication system

Publications (2)

Publication Number Publication Date
US20150249714A1 US20150249714A1 (en) 2015-09-03
US10757196B2 true US10757196B2 (en) 2020-08-25

Family

ID=54007315

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/635,443 Expired - Fee Related US10757196B2 (en) 2014-02-28 2015-03-02 Method and apparatus for displaying application data in wireless communication system

Country Status (6)

Country Link
US (1) US10757196B2 (en)
EP (1) EP3113500B1 (en)
JP (1) JP6662784B2 (en)
KR (1) KR102284721B1 (en)
CN (1) CN106464965B (en)
WO (1) WO2015130149A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014174657A1 (en) * 2013-04-26 2014-10-30 日立マクセル株式会社 Projection-type video display device
TWI616808B (en) * 2014-06-30 2018-03-01 緯創資通股份有限公司 Method and apparatus for sharing display frame
US10681122B2 (en) * 2014-12-09 2020-06-09 Samsung Electronics Co., Ltd. Method and apparatus for converting content using cloud
US10034047B2 (en) * 2014-12-11 2018-07-24 Lg Electronics Inc. Method and apparatus for outputting supplementary content from WFD
WO2017111183A1 (en) * 2015-12-21 2017-06-29 모다정보통신 주식회사 Portable radio device and method for providing real-time mirroring using same
CN106941628B (en) * 2016-01-04 2019-12-13 中国移动通信集团公司 Transmission method, sending terminal and receiving terminal of auxiliary stream in the same screen process
US10530856B2 (en) * 2016-02-09 2020-01-07 Qualcomm Incorporated Sharing data between a plurality of source devices that are each connected to a sink device
CN107222769B (en) * 2016-03-21 2020-06-05 中国移动通信集团公司 A kind of auxiliary data stream transmission method, device and system
KR20180039341A (en) * 2016-10-10 2018-04-18 삼성전자주식회사 Method for Communicating with the External Device and the Electronic Device supporting the same
US11550461B2 (en) 2017-06-16 2023-01-10 Huawei Technologies Co., Ltd. Screen locking method and apparatus
US20190028522A1 (en) * 2017-07-19 2019-01-24 Qualcomm Incorporated Transmission of subtitle data for wireless display
CN107682541B (en) * 2017-09-27 2021-02-05 努比亚技术有限公司 Audio control method for screen projection, mobile terminal and storage medium
CN113473238B (en) * 2020-04-29 2022-10-18 海信集团有限公司 Intelligent device and simultaneous interpretation method during video call
WO2022066459A1 (en) * 2020-09-24 2022-03-31 Sterling Labs Llc Synchronization in a multiuser experience
KR20250019860A (en) * 2023-08-02 2025-02-11 알비클라우드 주식회사 Method for synchronizing alpha and video in cloud streaming and apparatus therefor

Patent Citations (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002518947A (en) 1998-06-18 2002-06-25 ソニー エレクトロニクス インク Method and apparatus for handling wideband screen display graphics data in a distributed IEEE 1394 network using an isochronous data transmission format
US20030210252A1 (en) 1998-06-18 2003-11-13 Ludtke Harold Aaron Method of and apparatus for handling high bandwidth on-screen-display graphics data over a distributed IEEE 1394 network utilizing an isochronous data transmission format
JP2001184188A (en) 1999-12-27 2001-07-06 Toshiba Corp Network system
JP2001298676A (en) 2000-04-11 2001-10-26 Sony Corp Video reproducing device, video display device, operation control system and its method
US20040055011A1 (en) 2002-09-17 2004-03-18 Seung-Gyun Bae Apparatus and method for displaying a television video signal and data in a mobile terminal according to a mode thereof
CN1747512A (en) 2002-09-17 2006-03-15 三星电子株式会社 The apparatus and method that in mobile communication terminal, show television video frequency signal and data
US7934010B2 (en) * 2004-03-03 2011-04-26 Alcatel-Lucent Usa Inc. System and method for retrieving digital multimedia content from a network node
US20060080456A1 (en) * 2004-10-13 2006-04-13 Sung Jin Hur Device and method of integrating and executing multimedia streaming service and application streaming service
JP2006129103A (en) 2004-10-29 2006-05-18 Mitsubishi Electric Corp Video apparatus and video system corresponding to network
US20060272000A1 (en) 2005-05-26 2006-11-30 Samsung Electronics Co., Ltd. Apparatus and method for providing additional information using extension subtitles file
JP2006333460A (en) 2005-05-26 2006-12-07 Samsung Electronics Co Ltd Apparatus and method for providing additional information using extended subtitle file
US20070136777A1 (en) * 2005-12-09 2007-06-14 Charles Hasek Caption data delivery apparatus and methods
KR20070065656A 2005-12-20 2007-06-25 엘지전자 주식회사 Scene composition method and terminal therefor
KR20080037474A (en) 2006-10-26 2008-04-30 삼성전자주식회사 Storage method, reproduction method, apparatus and recording medium of bidirectional digital broadcasting signal
US20080104656A1 (en) 2006-10-26 2008-05-01 Samsung Electronics Co., Ltd. Method of storing and reproducing interactive digital broadcast signals and apparatus therefor
JP2008131569A (en) 2006-11-24 2008-06-05 Sony Corp Image information transmission system and method, image information transmission apparatus and method, and image information receiving apparatus and method,
US20080198930A1 (en) 2006-11-24 2008-08-21 Sony Corporation Image information transmission system, image information transmitting apparatus, image information receiving apparatus, image information transmission method, image information transmitting method, and image information receiving method
US20100295992A1 2007-01-08 2010-11-25 Sk Telecom. Co., Ltd System and method for synchronizing broadcast content with supplementary information
JP2010516078A (en) 2007-01-08 2010-05-13 エスケーテレコム株式会社 System and method for synchronizing broadcast content and additional information
WO2009130840A1 (en) 2008-04-23 2009-10-29 三菱電機株式会社 Vehicle-mounted information system
US8386632B2 (en) * 2008-11-24 2013-02-26 Huawei Technologies Co., Ltd. Method, device, and system for controlling streaming media play
EP2209280A1 (en) 2009-01-19 2010-07-21 Koninklijke KPN N.V. Managing associated sessions in a network
JP2010252215A (en) 2009-04-20 2010-11-04 Sony Ericsson Mobile Communications Ab Mobile terminal, video display device, and video processing system
US20110107388A1 (en) * 2009-11-02 2011-05-05 Samsung Electronics Co., Ltd. Method and apparatus for providing user input back channel in audio/video system
US20120243524A1 (en) 2009-11-17 2012-09-27 Samsung Electronics Co., Ltd. Method and device for investigating wifi display service in a wifi direct network
US20110261889A1 (en) 2010-04-27 2011-10-27 Comcast Cable Communications, Llc Remote User Interface
US20130003624A1 (en) 2011-01-21 2013-01-03 Qualcomm Incorporated User input back channel for wireless displays
US20120192064A1 (en) 2011-01-21 2012-07-26 Oudi Antebi Distributed document processing and management
WO2012106644A1 (en) 2011-02-04 2012-08-09 Qualcomm Incorporated Low latency wireless display for graphics
US20140320925A1 (en) 2011-06-14 2014-10-30 Brother Kogyo Kabushiki Kaisha Wireless communication device
CN102833876A (en) 2011-06-14 2012-12-19 兄弟工业株式会社 Wireless communication device
US20130067331A1 (en) * 2011-09-09 2013-03-14 Screenovate Technologies Ltd. Method and System of Simultaneous Display of Multiple Screens on a Target Display
US20130185447A1 (en) 2012-01-12 2013-07-18 Marvell World Trade Ltd. Systems and methods for establishing a wi-fi display (wfd) session
US20130185391A1 (en) * 2012-01-17 2013-07-18 Canon Kabushiki Kaisha Transmission apparatus and transmission method
KR20130091104A (en) 2012-02-07 2013-08-16 현대모비스 주식회사 Video decoding apparatus and method based on android platform using dual memory
US20130219072A1 (en) * 2012-02-20 2013-08-22 Samsung Electronics Co., Ltd. Screen mirroring method and apparatus thereof
US20140003516A1 (en) * 2012-06-28 2014-01-02 Divx, Llc Systems and methods for fast video startup using trick play streams
US20140334381A1 (en) 2013-05-08 2014-11-13 Qualcomm Incorporated Video streaming in a wireless communication system
JP2016521518A (en) 2013-05-08 2016-07-21 クゥアルコム・インコーポレイテッドQualcomm Incorporated Video streaming in wireless communication systems
US20160142865A1 (en) * 2013-06-20 2016-05-19 Lg Electronics Inc. Method and apparatus for reproducing multimedia contents using bluetooth in wireless communication system
US20140376892A1 (en) * 2013-06-21 2014-12-25 Kabushiki Kaisha Toshiba Display data processor and display data processing method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Chinese Office Action dated Oct. 12, 2018 issued in counterpart application No. 201580023227.0, 11 pages.
European Search Report dated Sep. 5, 2017 issued in counterpart application No. 15755501.2-1853, 7 pages.
International Search Report dated May 26, 2015 issued in counterpart application No. PCT/KR2015/001983.
Japanese Office Action dated Oct. 22, 2018 issued in counterpart application No. 2016-554409, 10 pages.
Wi-Fi Alliance, Wi-Fi Display Technical Specification, Wi-Fi Alliance Technical Committee, Wi-Fi Display Technical Task Group, Version 1.0.0, Aug. 24, 2012, 16 pages.

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220182424A1 (en) * 2012-11-12 2022-06-09 Samsung Electronics Co., Ltd. Method and system for sharing an output device between multimedia devices to transmit and receive data
US11757950B2 (en) * 2012-11-12 2023-09-12 Samsung Electronics Co., Ltd. Method and system for sharing an output device between multimedia devices to transmit and receive data

Also Published As

Publication number Publication date
WO2015130149A1 (en) 2015-09-03
EP3113500A1 (en) 2017-01-04
CN106464965B (en) 2020-02-14
US20150249714A1 (en) 2015-09-03
EP3113500A4 (en) 2017-10-04
KR20150102747A (en) 2015-09-07
JP6662784B2 (en) 2020-03-11
CN106464965A (en) 2017-02-22
KR102284721B1 (en) 2021-08-03
JP2017517905A (en) 2017-06-29
EP3113500B1 (en) 2020-04-29

Similar Documents

Publication Publication Date Title
US10757196B2 (en) Method and apparatus for displaying application data in wireless communication system
US9648073B2 (en) Streaming control for real-time transport protocol
US11412021B2 (en) Method and device for media streaming between server and client using RTP/RTSP standard protocol
US10250949B2 (en) Broadcast content to HTTP client conversion
US9665336B2 (en) Direct streaming for wireless display
KR102499967B1 (en) TVs and electronic devices with external tuners and memory for personal video recording
US20150341413A1 (en) Apparatus and method for providing media programming
CN104717549A (en) Multi-screen information interaction method and device
WO2017096851A1 (en) Method, system, and server for pushing video file
TWI577186B (en) Rendering time control
CN105142008B (en) A method of playing second terminal data using first terminal
KR102263223B1 (en) Electronic apparatus and the control method thereof
US12075111B2 (en) Methods and apparatus for responding to inoperative commands
WO2015180446A1 (en) System and method for maintaining connection channel in multi-device interworking service
US20170048291A1 (en) Synchronising playing of streaming content on plural streaming clients
US10104422B2 (en) Multimedia playing control method, apparatus for the same and system
US20190028522A1 (en) Transmission of subtitle data for wireless display
KR100652679B1 (en) Video channel switching method of mobile communication terminal
KR102654716B1 (en) Method and Apparatus for playing back video in accordance with requested video playback time
CN107222769A (en) A kind of transmission method of auxiliary data flow, equipment and system
CN105227607A (en) A kind of method, apparatus and system of data sharing
BR112017000744B1 (en) CONTINUOUS TRANSMISSION DIRECT TO WIRELESS DISPLAY

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VEDULA, KIRAN BHARADWAJ;SHIN, IN-YOUNG;REEL/FRAME:036413/0476

Effective date: 20150226

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240825