US20170018289A1 - Emoji as facetracking video masks - Google Patents

Emoji as facetracking video masks

Info

Publication number
US20170018289A1
US20170018289A1 (application US15/211,928)
Authority
US
United States
Prior art keywords
emoji
video
expression
mask
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/211,928
Inventor
Jared S. Morgenstern
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
String Theory Inc
Original Assignee
String Theory Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by String Theory Inc
Priority to US15/211,928
Assigned to String Theory, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MORGENSTERN, JARED S.
Publication of US20170018289A1
Status: Abandoned

Classifications

    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/02 Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
    • G11B 27/031 Electronic editing of digitised analogue information signals, e.g. audio or video signals
    • G11B 27/036 Insert-editing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance, using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 Interaction with lists of selectable items, e.g. menus
    • G06K 9/00315
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/174 Facial expression recognition
    • G06V 40/176 Dynamic expression
    • G PHYSICS
    • G11 INFORMATION STORAGE
    • G11B INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B 27/00 Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B 27/10 Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B 27/34 Indicating arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N 5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N 5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The system disclosed herein allows a user to select and/or create a mask using emoji or other expressions and to add the selected mask so that it tracks a face or other element of a video. Because the system draws on the existing emoji character set, users are already familiar with the expressiveness of the masks they can create and can find them quickly. By combining emoji with face-tracking software, the system provides a more intuitive and fun interface for making playful and expressive videos.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application claims benefit of priority to U.S. Provisional Patent Application No. 62/192,710, entitled “Emoji as Facetracking Video Masks” and filed on Jul. 15, 2015, which is specifically incorporated by reference for all that it discloses and teaches.
  • FIELD
  • Implementations disclosed herein relate, in general, to information management technology and specifically to video recording.
  • SUMMARY
  • The video stickering system disclosed herein, referred to as the Emoji Masks System, provides a method of enabling a user to add an animated or still image overlay to a video. For example, when a user is watching or creating a video, an emoji mask can be overlaid on the video simply by selecting an emoji or other character from a keyboard. In one implementation, upon selection by the user, the emoji or other character is enlarged, or is interpreted and enlarged as a related symbol, and then added on top of the video. Alternatively, if the emoji mask system recognizes a face or a designated feature in the video, the emoji is added on top of the recognized face and tracks it. In one alternative implementation, the system allows a user to manually adjust the tracking position of the emoji mask.
  • Many people are familiar with expressing themselves through the various emoji that have become new symbols of an international language. The emoji mask system disclosed herein allows users to choose an emoji and enlarge it into a mask. As a result, the system extends the expressiveness of emoji and makes it more convenient for users to express themselves through a related emoji.
  • In one implementation, upon selection of an emoji or other such expression, the emoji is enlarged to cover faces as they move in the video. In another, an emoji, for example a heart emoji, could be associated with an animation, such as animated hearts that appear above the head of the user moving in the video. Thus, the system allows an emoji to be used directly, and/or associated with a paired image or animation and a face offset that tells the system where to display the mask.
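  • A minimal sketch of how this direct-versus-interpreted pairing might be represented, assuming a simple lookup table; the MaskSpec type, its field names, and the example entry are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MaskSpec:
    """Describes how a selected emoji is rendered as a video mask."""
    artwork: str                                   # asset name, or the emoji glyph itself
    animated: bool = False                         # True for an interpreted, animated pairing
    face_offset: Tuple[float, float] = (0.0, 0.0)  # offset from the face, in face-sized units

# Illustrative table: the heart emoji is interpreted as animated hearts
# floating one face-height above the head.
MASK_TABLE = {
    "\u2764\ufe0f": MaskSpec("animated_hearts", animated=True, face_offset=(0.0, -1.0)),
}

def mask_for(emoji: str) -> MaskSpec:
    """Use the paired artwork if one exists; otherwise enlarge the emoji directly."""
    return MASK_TABLE.get(emoji, MaskSpec(artwork=emoji))
```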
  • In one implementation, the emoji masks can be selected before recording; in another, during recording (and even swapped mid-recording); and in another, in a review or playback step. One implementation allows all three methods of mask selection.
  • In one implementation, masks are chosen from a sliding tray that appears when the mask interface is toggled on; swiping to the right brings up a keyboard that lets the user preview different emoji.
  • In another implementation, the system keeps track of the user's last-used emoji and uses them to populate the sliding tray.
  • In another implementation, multiple faces—if found in the video—can be mapped to various slots in the tray. In this implementation, hot swapping the masks during recording could cycle them from person to person in a group video.
  • In another implementation, a user can create his or her own emoji by selecting a drawing icon in the tray that lets the user draw his or her own mask.
  • In another implementation, the system can use signals such as a user's location and the current time and change an emoji symbol based on that location and time. For example, if the user is located in San Francisco and the system determines that the San Francisco Giants are playing in the World Series at the time a hat emoji is selected, the emoji mask system disclosed herein automatically changes or interprets the hat emoji with a Giants image to make it a Giants hat emoji. Alternatively, it also allows users to add their own text, image, etc., on top of such a hat emoji before the hat emoji is attached to and tracks a face in the video.
  • In another implementation, the video content itself may be used to help determine how to display the mask. For example, a winking person may make the mask wink. A smiling person may make it frown. Someone shaking their head rapidly may trigger a head-shake animation. Someone jumping may make lift-off smoke appear.
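  • As a sketch of this content-driven behavior, a table mapping detected expression events to mask effects could look like the following; the event and effect names are assumed for illustration:

```python
from typing import Optional

# Maps an expression or motion detected in the source video to the effect
# applied to the mask; all names here are illustrative placeholders.
EXPRESSION_EFFECTS = {
    "wink": "wink_mask",
    "smile": "frown_mask",
    "head_shake": "head_shake_animation",
    "jump": "liftoff_smoke",
}

def effect_for(event: str) -> Optional[str]:
    """Return the mask effect for a detected event, if any."""
    return EXPRESSION_EFFECTS.get(event)
```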
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A further understanding of the nature and advantages of the present technology may be realized by reference to the figures, which are described in the remaining portion of the specification. In the figures, like reference numerals are used throughout several figures to refer to similar components. In some instances, a reference numeral may have an associated sub-label consisting of a lower-case letter to denote one of multiple similar components. When reference is made to a reference numeral without specification of a sub-label, the reference is intended to refer to all such multiple similar components.
  • FIG. 1 illustrates an example flow chart for providing emoji masks based on an implementation of an emoji mask system disclosed herein.
  • FIG. 2 illustrates various examples of emoji masks tracking a user face in a video.
  • FIG. 3 illustrates an example interface for selecting an emoji mask for tracking user faces in a video.
  • FIG. 4 illustrates an example interface for selecting an emoji mask from an emoji keyboard for tracking user faces in a video.
  • FIG. 5 illustrates an example interface for applying an animated emoji for tracking user faces in a video.
  • FIG. 6 illustrates an example flow chart for displaying an emoji mask on a user face in a video.
  • FIG. 7 illustrates an example system that may be useful in implementing the described technology.
  • FIG. 8 illustrates an example system including various components of the described technology.
  • DETAILED DESCRIPTION
  • The recording system disclosed herein, referred to as the emoji masks system, provides a method of enabling a user recording a video to add a mask that tracks his or her face using an emoji or other similar expression graphic, such that the emoji, or other expression graphic, tracks the movement of the user's face in the video.
  • FIG. 1 illustrates a flow chart 100 depicting an implementation of the emoji mask system that details the process of selection, replacement, and interfacing with video tracking. An operation 102 presents a toggle mask interface. The toggle mask interface may be presented before a recording of a video, during the recording of the video, or after the recording is complete. For example, during an editing phase of the video, the user may invoke the toggle mask interface in a manner disclosed herein. Alternatively, the user may also invoke the toggle mask interface during a playback phase of the video.
  • When the user has selected the toggle mask interface, a mask tray appears at the bottom of the video screen. At operation 104, mask selection is shown: a user may cycle through a selection of masks and select a mask from the mask tray within the toggle mask interface. An operation 106 determines whether an emoji mask icon is selected. If so, an operation 108 opens an emoji keyboard. Subsequently, an operation 110 looks up the emoji mapping; if it finds a custom mapping, the mapped mask is added to the video. Otherwise, the emoji is enlarged to generate an enlarged mask that is used as a mask on a face. The system recognizes a face or a designated feature in the video, and the emoji is overlaid on the face or designated feature and tracks it.
  • The emoji mapping may map emojis from the emoji keyboard or the emoji tray to animations to be added on top of the video. For example, an emoji for a light bulb may be mapped to a blinking light bulb, a static light bulb, etc. Similarly, an emoji for a heart may be mapped to an animated heart, and an emoji for the sun may be mapped to a weather display, a shining sun, etc. In one implementation, when a user selects an emoji, a new interface listing various possible mappings for that emoji is displayed, and the user can select a mapping therefrom. Thus, in effect, this listing of possible mappings provides a second keyboard or tray of emojis or their animations.
  • In one implementation, the listing of various possible mappings may be selected based on one or more other parameters, such as time of day, location as determined by the GPS coordinates of the device, etc. Thus, for example, if an emoji for the sun is selected in the evening, a different mapping of the sun is provided than in the afternoon. Similarly, if an emoji for a baseball is selected by a device that is in the general vicinity of Denver, a list of mappings including a Colorado Rockies hat may be displayed.
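  • One way such a context-dependent mapping list might be built, assuming the hour of day and a coarse city name as the only signals; the rules and asset names are hypothetical:

```python
from typing import List

def candidate_mappings(emoji: str, hour: int, city: str) -> List[str]:
    """Build the secondary tray of mappings for a selected emoji.

    Hypothetical rules: the sun emoji maps differently in the evening than
    in the afternoon, and a baseball near Denver offers a Rockies hat.
    """
    if emoji == "\u2600\ufe0f":                   # sun
        return ["sunset_sun"] if hour >= 18 else ["shining_sun", "weather_panel"]
    if emoji == "\u26be" and city == "Denver":    # baseball
        return ["rockies_hat", "plain_baseball"]
    return [emoji]                                # no special mapping: use the emoji itself
```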
  • An operation 112 determines whether the keyboard is dismissed and, if so, keeps track of the chosen mask and the time at which it was selected. Tapping anywhere on the video releases the emoji keyboard, returning to the recording interface. Another determining operation 116 determines whether the video interface is exited and, if so, an operation 118 sends the masks and their times of placement either to be burned onto the video or to a server. The video is sent to the server with an identifier of the mask (for example, the Unicode code point may be used for the emoji, or a mapped id, or a special id if the emoji mask is a special mask or a user-drawn mask) and the location, size, and rotation of the mask for each key frame (e.g., with a bounding box for each 1/32 of a second and its coordinates of rotation). Note that multiple faces can be identified and saved to the server, each with a different mask. For special masks, such as drawn masks, location-specific masks (I love NY), or customized masks (tweaking the eyebrows on one, for example), additional parameters may need to be passed to the server so it can recreate what the user saw. An alternative implementation burns what the user saw into the video on the client device by recording the screen without the UI elements and then sending the new video. A combination of both techniques may also be used so that the original video is preserved.
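  • The per-keyframe metadata described above could be serialized roughly as follows; the field names and the JSON encoding are assumptions, with only the mask identifier, the 1/32-second bounding boxes, and the rotation taken from the text:

```python
import json
from dataclasses import dataclass, asdict
from typing import List

KEYFRAME_INTERVAL = 1 / 32   # one bounding box per 1/32 of a second, per the text

@dataclass
class MaskKeyframe:
    t: float          # time offset into the video, in seconds
    x: float          # bounding-box origin and size, in pixels
    y: float
    width: float
    height: float
    rotation: float   # rotation of the mask, in degrees

@dataclass
class MaskTrack:
    mask_id: str                   # Unicode emoji, mapped id, or special/user-drawn id
    keyframes: List[MaskKeyframe]

def upload_payload(tracks: List[MaskTrack]) -> str:
    """Serialize one track per recognized face for upload alongside the video."""
    return json.dumps([asdict(track) for track in tracks])
```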
  • FIG. 2 illustrates various example still images 200 of emoji masks tracking a user's face in a video. Specifically, each of the still images 202-208 shows a user 210 with masks 212-218, respectively, where the masks track the movement of the user 210. Some of the expression masks 212-218, such as the expression mask 218, may be a single emoji or expression selected from an emoji list, expanded or adjusted to the size of the face of the user 210 in the video. Alternatively, a mask such as the mask 216 may be generated by combining more than one emoji or expression and expanding or adjusting the combined mask to the size of the face being tracked. Yet alternatively, a mask such as the mask 214 may be developed using an expression or may be a custom emoji designed by a user.
  • FIG. 3 illustrates an example interface for selecting an emoji to generate a mask. A user can start selecting an emoji mask for a video by using a toggle mask interface. When the user 310 has selected the toggle mask interface, a mask tray 314 appears at the bottom of the video screen. The user can cycle through a selection of emojis in the mask tray 314 by scrolling from side to side. Once an emoji 312 is selected, the emoji begins to track the face of the user 310, maintaining an overlaid position while the user 310 moves.
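  • The overlaid position could be recomputed each frame from the face tracker's bounding box along the following lines; the normalized offset and scale conventions are assumptions:

```python
from typing import Tuple

Rect = Tuple[float, float, float, float]   # x, y, width, height in pixels

def mask_rect(face: Rect, offset: Tuple[float, float] = (0.0, 0.0),
              scale: float = 1.0) -> Rect:
    """Place the mask relative to the tracked face for a single frame.

    The offset is in face-sized units, so (0.0, -1.0) floats the mask one
    face-height above the head, while (0.0, 0.0) centers it on the face.
    """
    x, y, w, h = face
    mask_w, mask_h = w * scale, h * scale
    center_x = x + w / 2 + offset[0] * w
    center_y = y + h / 2 + offset[1] * h
    return (center_x - mask_w / 2, center_y - mask_h / 2, mask_w, mask_h)
```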
  • In one implementation, the mask interface may be removed by tapping the masks icon in the top right, which toggles it on and off. Alternatively, the mask interface may be removed by pressing and holding anywhere in the center of the screen. In another implementation, a user can slide the emoji interface tray to the right (e.g., "throw the tray off the screen") to remove it. While the mask tray is active, a user can select other masks; however, the user may not be able to take one off and keep the tray there. The user may also switch masks before and/or during recording.
  • FIG. 4 illustrates example still images 400 demonstrating the use of an emoji keyboard for selecting an emoji to generate a mask. Once a user 410 selects the toggle mask interface, a tray 404 appears. As the user selects from or cycles through the tray, the selected item displays on the video and starts tracking the user's face. This can happen before, during, or after recording. At the far end of the mask tray 404 is an icon 406 indicating the emoji keyboard option. This can be selected by tapping the icon 406, or, in one implementation, the keyboard displays automatically when the user scrolls the tray to the right. Once selected, the emoji keyboard 408 rises from the bottom of the screen, as seen in image 420, and the user 410 can select an emoji from those displayed on the keyboard, which, once selected, begins to track the user's face. The selected emoji 412 also adapts its size to match the size of the user's face in the video. In image 422, the emoji 412 is transferred to the video at an initial size. In image 424, the emoji 412 has adapted its size to properly match the dimensions of the user's face and effectively mask it. Tapping on the video releases the keyboard, and the last selected emoji 414 takes a slot in the tray 404.
  • FIG. 5 illustrates the use of an "interpreted emoji," where the emoji 510 is not simply blown up; instead, separate artwork, even animated artwork, can be displayed as a result of that emoji 510 being keyed in. When an "interpreted emoji" is associated with an animation, the system allows the emoji to be used with a face offset that determines where to display the mask on the video. When the heart emoji 510 is selected, the system tracks the location of the user's face and displays the animated hearts 508 above the head of the user 512 moving in the video.
  • FIG. 6 illustrates a flow chart 600 detailing the process of a user recording a video with a face-tracking emoji. An operation 602 presents a toggle mask interface, which in turn causes a mask tray to appear at the bottom of the device screen. At operation 604, the user can select and open an emoji keyboard from the mask tray: the user selects an icon indicating the emoji keyboard, which opens a selection interface presenting an array of emoji icons. At operation 606, the user selects an emoji icon from the array presented in the emoji keyboard. When the user selects an emoji, the emoji is displayed on top of the video, tracking the face of the user. Thus, for example, if a moustache emoji is placed on a face in the video, the moustache emoji may move in the video based on the movement of the face. Such tracking of the emoji may be based on analysis of the movement of a feature of the face. For example, the moustache emoji may be locked to the lips on the face in the video so that movement of the lips also results in movement of the emoji.
  • Furthermore, in an implementation, the user is given the capability to unlock the emoji from one feature and move it to a different feature of an element in the video. For example, if a sunglasses emoji were, by mistake, locked to the lips feature of a face, the user may be able to move it from the lips to the eyes, forehead, etc.
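  • Re-locking a mislocked mask, as described above, might amount to switching the landmark the overlay is anchored to; the landmark names and the tracker's output format here are assumptions:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Point = Tuple[float, float]

@dataclass
class Overlay:
    emoji: str
    anchor: str   # name of the facial feature the mask is locked to

def relock(overlay: Overlay, new_anchor: str, landmarks: Dict[str, Point]) -> Point:
    """Move a mask from one facial feature to another (e.g., lips to eyes).

    The landmarks dict stands in for whatever the face tracker reports,
    keyed by feature name such as "lips", "eyes", or "forehead".
    """
    if new_anchor not in landmarks:
        raise ValueError(f"tracker reports no feature named {new_anchor!r}")
    overlay.anchor = new_anchor
    return landmarks[new_anchor]   # new render position for the mask
```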
  • At operation 608, the selected emoji adapts its size to match the dimensions of the user's face. At operation 610, the mask can be burned into the video and saved, or sent to a server with an identifier of the mask (for example, the Unicode code point for the emoji, a mapped id, or a special id if the emoji mask is a special or user-drawn mask) and the location, size, and rotation of the mask for each key frame.
  • FIG. 7 illustrates an example system, labeled as computing device 700, that may be useful in implementing the described technology. The example hardware and operating environment of FIG. 7 for implementing the described technology includes a computing device, such as a general purpose computing device in the form of a computer, a mobile telephone, a personal digital assistant (PDA), a tablet, a smart watch, a gaming remote, or another type of computing device. It should be appreciated by those skilled in the art that any type of tangible computer-readable media may be used in the example operating environment. The computer may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer. These logical connections are achieved by a communication device coupled to or a part of the computer; the implementations are not limited to a particular type of communications device. The remote computer may be another computer, a server, a router, a network PC, a client, a peer device, or another common network node, and typically includes many or all of the elements described above relative to the computer.
  • The computing device 700 includes a processor 702, a memory 704, a display 706 (e.g., a touchscreen display), and other interfaces 708 (e.g., a keyboard). The memory 704 generally includes both volatile memory (e.g., RAM) and non-volatile memory (e.g., flash memory). An operating system 710 resides in the memory 704 and is executed by the processor 702, although it should be understood that other operating systems may be employed.
  • One or more application programs 712, such as a high resolution display imager 714, are loaded in the memory 704 and executed on the operating system 710 by the processor 702. The computing device 700 includes a power supply 716, which is powered by one or more batteries or other power sources and which provides power to other components of the computing device 700. The power supply 716 may also be connected to an external power source that overrides or recharges the built-in batteries or other power sources.
  • The computing device 700 includes one or more communication transceivers 730 to provide network connectivity (e.g., mobile phone network, Wi-Fi®, BlueTooth®, etc.). The computing device 700 also includes various other components, such as a positioning system 720 (e.g., a global positioning satellite transceiver), one or more accelerometers 722, one or more cameras 724, an audio interface 726 (e.g., a microphone, an audio amplifier and speaker and/or audio jack), a magnetometer (not shown), and additional storage 728. Other configurations may also be employed. The one or more communications transceivers 730 may be communicatively coupled to one or more antennas, including magnetic dipole antennas capacitively coupled to a parasitic resonating element. The one or more transceivers 730 may further be in communication with the operating system 710, such that data transmitted to or received from the operating system 710 may be sent or received by the communications transceivers 730 over the one or more antennas.
  • In an example implementation, a mobile operating system, wireless device drivers, various applications, and other modules and services may be embodied by instructions stored in memory 704 and/or storage devices 728 and processed by the processing unit 702. Device settings, service options, and other data may be stored in memory 704 and/or storage devices 728 as persistent datastores. In another example implementation, software or firmware instructions for generating carrier wave signals may be stored on the memory 704 and processed by processor 702. For example, the memory 704 may store instructions for tuning multiple inductively-coupled loops to impedance match a desired impedance at a desired frequency.
  • Mobile device 700 may include a variety of tangible computer-readable storage media and intangible computer-readable communication signals. Tangible computer-readable storage can be embodied by any available media that can be accessed by the computing device 700 and includes both volatile and nonvolatile storage media, removable and non-removable storage media. Tangible computer-readable storage media excludes intangible communications signals and includes volatile and nonvolatile, removable and non-removable storage media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Tangible computer-readable storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CDROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other tangible medium which can be used to store the desired information and which can be accessed by computing device 700. In contrast to tangible computer-readable storage media, intangible computer-readable communication signals may embody computer readable instructions, data structures, program modules or other data resident in a modulated data signal, such as a carrier wave or other signal transport mechanism. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, intangible communication signals include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
  • FIG. 8 illustrates an example expression management system 800 including various components of the described technology. Specifically, the expression management system 800 is implemented on a memory 802 with one or more modules and databases. The modules may include instructions that may be executed on a processor 820. An emoji management module 804 stores various instructions for performing functionalities disclosed herein. A GUI module 806 presents various user interfaces, such as the emoji keyboard, the emoji tray, etc., to a user on a user device based on the instructions from the emoji management module 804. The GUI module 806 may also be used to receive input from the user and communicate the input to the emoji management module 804 for further processing.
  • A video database 812 may be used to store videos. A video recorder 814 may be used to store instructions for recording videos using a video camera of a user device. A video editing module 816 may include instructions for editing the videos, and a video playback module 818 allows a user to play back videos. The emoji management module 804 may interact with one or more of the modules 812 to 818 to add emojis from an emoji database 822.
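  • A rough sketch of how the FIG. 8 modules might be wired together; the interfaces are assumptions made for illustration, with only the module names and reference numerals taken from the description:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EmojiDatabase:        # 822: emoji-to-artwork assets
    assets: Dict[str, str] = field(default_factory=dict)

@dataclass
class VideoDatabase:        # 812: stored videos
    videos: List[str] = field(default_factory=list)

class EmojiManagementModule:   # 804: coordinates the other modules
    def __init__(self, emoji_db: EmojiDatabase, video_db: VideoDatabase):
        self.emoji_db = emoji_db
        self.video_db = video_db

    def add_emoji_to_video(self, video_id: str, emoji: str) -> str:
        """Look up the artwork for an emoji and composite it onto a video."""
        artwork = self.emoji_db.assets.get(emoji, emoji)
        return f"{video_id}+overlay:{artwork}"   # stand-in for real compositing
```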
  • Some embodiments may comprise an article of manufacture. An article of manufacture may comprise a tangible storage medium to store logic. Examples of a storage medium may include one or more types of computer-readable storage media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of the logic may include various software elements, such as software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. In one embodiment, for example, an article of manufacture may store executable computer program instructions that, when executed by a computer, cause the computer to perform methods and/or operations in accordance with the described embodiments. The executable computer program instructions may include any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. The executable computer program instructions may be implemented according to a predefined computer language, manner or syntax, for instructing a computer to perform a certain function. The instructions may be implemented using any suitable high-level, low-level, object-oriented, visual, compiled and/or interpreted programming language.

Claims (12)

What is claimed is:
1. A method comprising:
receiving an input from a user during recording of a video;
in response to the input, presenting a plurality of expression graphics;
receiving a selection input from the user indicating selection of one of the plurality of expression graphics;
receiving a placement input indicating placement of the selected one of the plurality of expression graphics on the video; and
adding the selected one of the plurality of expression graphics in the video at a time indicated by the placement.
2. The method of claim 1, wherein the placement also provides the location of the one of the expression graphics on the video.
3. The method of claim 1, wherein the expression graphic is an emoji.
4. The method of claim 3, further comprising adjusting the size of the selected expression graphic to a size of an object identified in the video.
5. The method of claim 3, further comprising tracking the selected expression object to the object identified in the video.
6. The method of claim 5, further comprising tracking multiple expression objects to multiple objects identified in the video.
7. The method of claim 6, further comprising switching expression objects from one object to another object during recording in a group video.
8. The method of claim 1, wherein the expression object is animated.
9. The method of claim 1, wherein a user can create their own emoji by selecting a drawing icon.
10. The method of claim 1, wherein the emoji mask can be selected and added to the video prior to recording.
11. The method of claim 1, wherein the emoji mask can be selected and added to the video after recording.
12. A system for adding expression objects to a video, the system comprising:
a memory;
one or more processors; and
an expression management module including one or more computer instructions stored in the memory and executable by the one or more processors, the computer instructions comprising:
an instruction for presenting a plurality of expression graphics during recording of the video;
an instruction for receiving a selection input from the user indicating selection of one of the plurality of expression graphics;
an instruction for receiving a placement input indicating placement of the selected one of the plurality of expression graphics on the video; and
an instruction for adding the selected one of the plurality of expression graphics in the video at a time indicated by the placement.
US15/211,928 2015-07-15 2016-07-15 Emoji as facetracking video masks Abandoned US20170018289A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/211,928 US20170018289A1 (en) 2015-07-15 2016-07-15 Emoji as facetracking video masks

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562192710P 2015-07-15 2015-07-15
US15/211,928 US20170018289A1 (en) 2015-07-15 2016-07-15 Emoji as facetracking video masks

Publications (1)

Publication Number Publication Date
US20170018289A1 true US20170018289A1 (en) 2017-01-19

Family

ID=57775189

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/211,928 Abandoned US20170018289A1 (en) 2015-07-15 2016-07-15 Emoji as facetracking video masks

Country Status (1)

Country Link
US (1) US20170018289A1 (en)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180147728A1 (en) * 2016-11-30 2018-05-31 Universal City Studios Llc Animated character head systems and methods
US20180182149A1 (en) * 2016-12-22 2018-06-28 Seerslab, Inc. Method and apparatus for creating user-created sticker and system for sharing user-created sticker
US20180186316A1 (en) * 2016-12-30 2018-07-05 Textron Innovations Inc. Controlling electrical access to a lithium battery on a utility vehicle
US10325417B1 (en) 2018-05-07 2019-06-18 Apple Inc. Avatar creation user interface
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
US10379719B2 (en) * 2017-05-16 2019-08-13 Apple Inc. Emoji recording and sending
US10444963B2 (en) 2016-09-23 2019-10-15 Apple Inc. Image data for enhanced user interactions
US10504260B2 (en) * 2016-08-09 2019-12-10 Pegge Vissicaro Keyboard with in-line user created emojis
US10521948B2 (en) 2017-05-16 2019-12-31 Apple Inc. Emoji recording and sending
US10528243B2 (en) 2017-06-04 2020-01-07 Apple Inc. User interface camera effects
US10602053B2 (en) 2016-06-12 2020-03-24 Apple Inc. User interface for camera effects
US10645294B1 (en) 2019-05-06 2020-05-05 Apple Inc. User interfaces for capturing and managing visual media
US10659405B1 (en) 2019-05-06 2020-05-19 Apple Inc. Avatar integration with multiple applications
CN111464430A (en) * 2020-04-09 2020-07-28 腾讯科技(深圳)有限公司 Dynamic expression display method, dynamic expression creation method and device
KR20200101208A (en) * 2019-02-19 2020-08-27 삼성전자주식회사 Electronic device and method for providing user interface for editing of emoji in conjunction with camera function thereof
US10812430B2 (en) 2018-02-22 2020-10-20 Mercury Universe, LLC Method and system for creating a mercemoji
US10991397B2 (en) * 2016-10-14 2021-04-27 Genetec Inc. Masking in video stream
JP2021511728A (en) * 2018-01-18 2021-05-06 Tencent Technology (Shenzhen) Company Limited Additional object display method and apparatus, computer device, and storage medium
US11054973B1 (en) 2020-06-01 2021-07-06 Apple Inc. User interfaces for managing media
US11061372B1 (en) 2020-05-11 2021-07-13 Apple Inc. User interfaces related to time
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
US11103161B2 (en) 2018-05-07 2021-08-31 Apple Inc. Displaying user interfaces associated with physical activities
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11145103B2 (en) * 2017-10-23 2021-10-12 Paypal, Inc. System and method for generating animated emoji mashups
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
CN113867876A (en) * 2021-10-08 2021-12-31 北京字跳网络技术有限公司 Expression display method, device, equipment and storage medium
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
US11350026B1 (en) 2021-04-30 2022-05-31 Apple Inc. User interfaces for altering visual media
US11423596B2 (en) * 2017-10-23 2022-08-23 Paypal, Inc. System and method for generating emoji mashups with machine learning
US11468625B2 (en) 2018-09-11 2022-10-11 Apple Inc. User interfaces for simulated depth effects
US11481988B2 (en) 2010-04-07 2022-10-25 Apple Inc. Avatar editing environment
US11521368B2 (en) * 2019-07-18 2022-12-06 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for presenting material, and storage medium
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11733769B2 (en) 2020-06-08 2023-08-22 Apple Inc. Presenting avatars in three-dimensional environments
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
US20230410394A1 (en) * 2021-01-22 2023-12-21 Beijing Zitiao Network Technology Co., Ltd. Image display method and apparatus, device, and medium
US11875439B2 (en) * 2018-04-18 2024-01-16 Snap Inc. Augmented expression system
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US20240211678A1 (en) * 2022-12-22 2024-06-27 Microsoft Technology Licensing, Llc Encoding documents for peripheral privacy
US12033296B2 (en) 2018-05-07 2024-07-09 Apple Inc. Avatar creation user interface
US12112024B2 (en) 2021-06-01 2024-10-08 Apple Inc. User interfaces for managing media styles
EP3700179B1 (en) * 2019-02-19 2024-11-20 Samsung Electronics Co., Ltd. Electronic device supporting avatar recommendation and download
US12184969B2 (en) 2016-09-23 2024-12-31 Apple Inc. Avatar creation and editing
US12287913B2 (en) 2022-09-06 2025-04-29 Apple Inc. Devices, methods, and graphical user interfaces for controlling avatars within three-dimensional environments
US12314553B2 (en) 2023-03-20 2025-05-27 Apple Inc. User interface camera effects

Families Citing this family (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12223612B2 (en) 2010-04-07 2025-02-11 Apple Inc. Avatar editing environment
US11481988B2 (en) 2010-04-07 2022-10-25 Apple Inc. Avatar editing environment
US11869165B2 (en) 2010-04-07 2024-01-09 Apple Inc. Avatar editing environment
US10602053B2 (en) 2016-06-12 2020-03-24 Apple Inc. User interface for camera effects
US11165949B2 (en) 2016-06-12 2021-11-02 Apple Inc. User interface for capturing photos with different camera magnifications
US11245837B2 (en) 2016-06-12 2022-02-08 Apple Inc. User interface for camera effects
US11641517B2 (en) 2016-06-12 2023-05-02 Apple Inc. User interface for camera effects
US11962889B2 (en) 2016-06-12 2024-04-16 Apple Inc. User interface for camera effects
US12132981B2 (en) 2016-06-12 2024-10-29 Apple Inc. User interface for camera effects
US10504260B2 (en) * 2016-08-09 2019-12-10 Pegge Vissicaro Keyboard with in-line user created emojis
US12079458B2 (en) 2016-09-23 2024-09-03 Apple Inc. Image data for enhanced user interactions
US12184969B2 (en) 2016-09-23 2024-12-31 Apple Inc. Avatar creation and editing
US10444963B2 (en) 2016-09-23 2019-10-15 Apple Inc. Image data for enhanced user interactions
US11232817B2 (en) 2016-10-14 2022-01-25 Genetec Inc. Masking in video stream
US10991397B2 (en) * 2016-10-14 2021-04-27 Genetec Inc. Masking in video stream
US12087330B2 (en) 2016-10-14 2024-09-10 Genetec Inc. Masking in video stream
US11756587B2 (en) 2016-10-14 2023-09-12 Genetec Inc. Masking in video stream
KR20190091300A (en) * 2016-11-30 2019-08-05 유니버셜 시티 스튜디오스 엘엘씨 Animated Character Head Systems and Methods
KR102400398B1 (en) 2016-11-30 2022-05-19 유니버셜 시티 스튜디오스 엘엘씨 Animated Character Head Systems and Methods
US20180147728A1 (en) * 2016-11-30 2018-05-31 Universal City Studios Llc Animated character head systems and methods
US10775880B2 (en) * 2016-11-30 2020-09-15 Universal City Studios Llc Animated character head systems and methods
US20180182149A1 (en) * 2016-12-22 2018-06-28 Seerslab, Inc. Method and apparatus for creating user-created sticker and system for sharing user-created sticker
US20180186316A1 (en) * 2016-12-30 2018-07-05 Textron Innovations Inc. Controlling electrical access to a lithium battery on a utility vehicle
US10379719B2 (en) * 2017-05-16 2019-08-13 Apple Inc. Emoji recording and sending
US10845968B2 (en) 2017-05-16 2020-11-24 Apple Inc. Emoji recording and sending
US10521948B2 (en) 2017-05-16 2019-12-31 Apple Inc. Emoji recording and sending
US10521091B2 (en) * 2017-05-16 2019-12-31 Apple Inc. Emoji recording and sending
US11532112B2 (en) 2017-05-16 2022-12-20 Apple Inc. Emoji recording and sending
US10997768B2 (en) 2017-05-16 2021-05-04 Apple Inc. Emoji recording and sending
US10846905B2 (en) 2017-05-16 2020-11-24 Apple Inc. Emoji recording and sending
US12045923B2 (en) 2017-05-16 2024-07-23 Apple Inc. Emoji recording and sending
US11204692B2 (en) 2017-06-04 2021-12-21 Apple Inc. User interface camera effects
US11687224B2 (en) 2017-06-04 2023-06-27 Apple Inc. User interface camera effects
US10528243B2 (en) 2017-06-04 2020-01-07 Apple Inc. User interface camera effects
US12135932B2 (en) 2017-10-23 2024-11-05 Paypal, Inc. System and method for generating emoji mashups with machine learning
US11145103B2 (en) * 2017-10-23 2021-10-12 Paypal, Inc. System and method for generating animated emoji mashups
US11783113B2 (en) 2017-10-23 2023-10-10 Paypal, Inc. System and method for generating emoji mashups with machine learning
US11423596B2 (en) * 2017-10-23 2022-08-23 Paypal, Inc. System and method for generating emoji mashups with machine learning
JP7109553B2 (en) 2018-01-18 2022-07-29 Tencent Technology (Shenzhen) Company Limited Additional object display method and apparatus, computer device, and storage medium
EP3742743A4 (en) * 2018-01-18 2021-07-28 Tencent Technology (Shenzhen) Company Limited Method and apparatus for displaying additional object, computer device, and storage medium
JP2021511728A (en) 2018-01-18 2021-05-06 Tencent Technology (Shenzhen) Company Limited Additional object display method and apparatus, computer device, and storage medium
US11640235B2 (en) 2018-01-18 2023-05-02 Tencent Technology (Shenzhen) Company Limited Additional object display method and apparatus, computer device, and storage medium
US10812430B2 (en) 2018-02-22 2020-10-20 Mercury Universe, LLC Method and system for creating a mercemoji
US11875439B2 (en) * 2018-04-18 2024-01-16 Snap Inc. Augmented expression system
US11178335B2 (en) 2018-05-07 2021-11-16 Apple Inc. Creative camera
US10325417B1 (en) 2018-05-07 2019-06-18 Apple Inc. Avatar creation user interface
US10325416B1 (en) 2018-05-07 2019-06-18 Apple Inc. Avatar creation user interface
US12170834B2 (en) 2018-05-07 2024-12-17 Apple Inc. Creative camera
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
US11103161B2 (en) 2018-05-07 2021-08-31 Apple Inc. Displaying user interfaces associated with physical activities
US10410434B1 (en) 2018-05-07 2019-09-10 Apple Inc. Avatar creation user interface
US10523879B2 (en) 2018-05-07 2019-12-31 Apple Inc. Creative camera
US10580221B2 (en) 2018-05-07 2020-03-03 Apple Inc. Avatar creation user interface
US12033296B2 (en) 2018-05-07 2024-07-09 Apple Inc. Avatar creation user interface
US11722764B2 (en) 2018-05-07 2023-08-08 Apple Inc. Creative camera
US11380077B2 (en) 2018-05-07 2022-07-05 Apple Inc. Avatar creation user interface
US11682182B2 (en) 2018-05-07 2023-06-20 Apple Inc. Avatar creation user interface
US10861248B2 (en) 2018-05-07 2020-12-08 Apple Inc. Avatar creation user interface
US11468625B2 (en) 2018-09-11 2022-10-11 Apple Inc. User interfaces for simulated depth effects
US12154218B2 (en) 2018-09-11 2024-11-26 Apple Inc. User interfaces for simulated depth effects
US11669985B2 (en) 2018-09-28 2023-06-06 Apple Inc. Displaying and editing images with depth information
US11128792B2 (en) 2018-09-28 2021-09-21 Apple Inc. Capturing and displaying images with multiple focal planes
US11321857B2 (en) 2018-09-28 2022-05-03 Apple Inc. Displaying and editing images with depth information
US11895391B2 (en) 2018-09-28 2024-02-06 Apple Inc. Capturing and displaying images with multiple focal planes
US11107261B2 (en) 2019-01-18 2021-08-31 Apple Inc. Virtual avatar animation based on facial feature movement
EP3700179B1 (en) * 2019-02-19 2024-11-20 Samsung Electronics Co., Ltd. Electronic device supporting avatar recommendation and download
US11995750B2 (en) 2019-02-19 2024-05-28 Samsung Electronics Co., Ltd. Electronic device and method of providing user interface for emoji editing while interworking with camera function by using said electronic device
KR102667064B1 (en) * 2019-02-19 2024-05-20 삼성전자 주식회사 Electronic device and method for providing user interface for editing of emoji in conjunction with camera function thereof
KR20200101208A (en) * 2019-02-19 2020-08-27 삼성전자주식회사 Electronic device and method for providing user interface for editing of emoji in conjunction with camera function thereof
EP3913902A4 (en) * 2019-02-19 2022-06-08 Samsung Electronics Co., Ltd. ELECTRONIC DEVICE AND METHOD FOR PROVIDING A USER INTERFACE FOR EDITING EMOJI WHILE COLLABORATING WITH A CAMERA FUNCTION USING THE ELECTRONIC DEVICE
US10659405B1 (en) 2019-05-06 2020-05-19 Apple Inc. Avatar integration with multiple applications
US11770601B2 (en) 2019-05-06 2023-09-26 Apple Inc. User interfaces for capturing and managing visual media
US10735642B1 (en) 2019-05-06 2020-08-04 Apple Inc. User interfaces for capturing and managing visual media
US11706521B2 (en) 2019-05-06 2023-07-18 Apple Inc. User interfaces for capturing and managing visual media
US10735643B1 (en) 2019-05-06 2020-08-04 Apple Inc. User interfaces for capturing and managing visual media
US12218894B2 (en) 2019-05-06 2025-02-04 Apple Inc. Avatar integration with a contacts user interface
US10645294B1 (en) 2019-05-06 2020-05-05 Apple Inc. User interfaces for capturing and managing visual media
US10652470B1 (en) 2019-05-06 2020-05-12 Apple Inc. User interfaces for capturing and managing visual media
US12192617B2 (en) 2019-05-06 2025-01-07 Apple Inc. User interfaces for capturing and managing visual media
US11223771B2 (en) 2019-05-06 2022-01-11 Apple Inc. User interfaces for capturing and managing visual media
US10681282B1 (en) 2019-05-06 2020-06-09 Apple Inc. User interfaces for capturing and managing visual media
US10674072B1 (en) 2019-05-06 2020-06-02 Apple Inc. User interfaces for capturing and managing visual media
US10791273B1 (en) 2019-05-06 2020-09-29 Apple Inc. User interfaces for capturing and managing visual media
US11521368B2 (en) * 2019-07-18 2022-12-06 Beijing Dajia Internet Information Technology Co., Ltd. Method and apparatus for presenting material, and storage medium
CN111464430A (en) * 2020-04-09 2020-07-28 腾讯科技(深圳)有限公司 Dynamic expression display method, dynamic expression creation method and device
US11822778B2 (en) 2020-05-11 2023-11-21 Apple Inc. User interfaces related to time
US12099713B2 (en) 2020-05-11 2024-09-24 Apple Inc. User interfaces related to time
US11442414B2 (en) 2020-05-11 2022-09-13 Apple Inc. User interfaces related to time
US11061372B1 (en) 2020-05-11 2021-07-13 Apple Inc. User interfaces related to time
US11921998B2 (en) 2020-05-11 2024-03-05 Apple Inc. Editing features of an avatar
US12008230B2 (en) 2020-05-11 2024-06-11 Apple Inc. User interfaces related to time with an editable background
US11330184B2 (en) 2020-06-01 2022-05-10 Apple Inc. User interfaces for managing media
US11617022B2 (en) 2020-06-01 2023-03-28 Apple Inc. User interfaces for managing media
US11054973B1 (en) 2020-06-01 2021-07-06 Apple Inc. User interfaces for managing media
US12081862B2 (en) 2020-06-01 2024-09-03 Apple Inc. User interfaces for managing media
US12282594B2 (en) 2020-06-08 2025-04-22 Apple Inc. Presenting avatars in three-dimensional environments
US11733769B2 (en) 2020-06-08 2023-08-22 Apple Inc. Presenting avatars in three-dimensional environments
US11212449B1 (en) 2020-09-25 2021-12-28 Apple Inc. User interfaces for media capture and management
US12155925B2 (en) 2020-09-25 2024-11-26 Apple Inc. User interfaces for media capture and management
US20230410394A1 (en) * 2021-01-22 2023-12-21 Beijing Zitiao Network Technology Co., Ltd. Image display method and apparatus, device, and medium
US12106410B2 (en) * 2021-01-22 2024-10-01 Beijing Zitiao Network Technology Co., Ltd. Customizing emojis for users in chat applications
US11778339B2 (en) 2021-04-30 2023-10-03 Apple Inc. User interfaces for altering visual media
US12101567B2 (en) 2021-04-30 2024-09-24 Apple Inc. User interfaces for altering visual media
US11539876B2 (en) 2021-04-30 2022-12-27 Apple Inc. User interfaces for altering visual media
US11350026B1 (en) 2021-04-30 2022-05-31 Apple Inc. User interfaces for altering visual media
US11416134B1 (en) 2021-04-30 2022-08-16 Apple Inc. User interfaces for altering visual media
US11418699B1 (en) 2021-04-30 2022-08-16 Apple Inc. User interfaces for altering visual media
US12112024B2 (en) 2021-06-01 2024-10-08 Apple Inc. User interfaces for managing media styles
US11776190B2 (en) 2021-06-04 2023-10-03 Apple Inc. Techniques for managing an avatar on a lock screen
CN113867876A (en) * 2021-10-08 2021-12-31 北京字跳网络技术有限公司 Expression display method, device, equipment and storage medium
WO2023056847A1 (en) * 2021-10-08 2023-04-13 Beijing Zitiao Network Technology Co., Ltd. Emoticon display method and apparatus, device, and storage medium
US12287913B2 (en) 2022-09-06 2025-04-29 Apple Inc. Devices, methods, and graphical user interfaces for controlling avatars within three-dimensional environments
US20240211678A1 (en) * 2022-12-22 2024-06-27 Microsoft Technology Licensing, Llc Encoding documents for peripheral privacy
US12314553B2 (en) 2023-03-20 2025-05-27 Apple Inc. User interface camera effects

Similar Documents

Publication Publication Date Title
US20170018289A1 (en) Emoji as facetracking video masks
US20230316683A1 (en) Video clip object tracking
CN109819313B (en) Video processing method, device and storage medium
US12192667B2 (en) DIY effects image modification
US11775165B2 (en) 3D cutout image modification
CN110582018B (en) Video file processing method, related device and equipment
US11810220B2 (en) 3D captions with face tracking
US9942484B2 (en) Electronic device and method for displaying image therein
US10713835B2 (en) Displaying method, animation image generating method, and electronic device configured to execute the same
US8589815B2 (en) Control of timing for animations in dynamic icons
KR102576908B1 (en) Method and Apparatus for Providing Dynamic Panorama
US11711414B2 (en) Triggering changes to real-time special effects included in a live streaming video
US20230345113A1 (en) Display control method and apparatus, electronic device, and medium
CN112565911B (en) Bullet screen display method, bullet screen generation device, bullet screen equipment and storage medium
US9959487B2 (en) Method and device for adding font
US20190278797A1 (en) Image processing in a virtual reality (vr) system
US10915778B2 (en) User interface framework for multi-selection and operation of non-consecutive segmented information
CN112416486A (en) Information guiding method, device, terminal and storage medium
CN106919943A (en) Data processing method and device
KR20160012909A (en) Electronic device for displyaing image and method for controlling thereof
CN114630085A (en) Image projection method, image projection device, storage medium and electronic equipment
CN109725813B (en) Display interface switching control method and system based on movement track of marked image
CN117786176A (en) Resource searching method, device, terminal equipment and storage medium
CN113377478A (en) Data marking method, device, storage medium and equipment for entertainment industry

Legal Events

Date Code Title Description
AS Assignment

Owner name: STRING THEORY, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MORGENSTERN, JARED S.;REEL/FRAME:039170/0182

Effective date: 20160715

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION