US10397649B2 - Method of zooming video images and mobile display terminal - Google Patents
- Publication number
- US10397649B2 (application US15/681,192)
- Authority
- US
- United States
- Prior art keywords
- video frame
- coordinate
- threshold
- zoom
- current video
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47202—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4728—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present disclosure relates to video play techniques, and particularly to a method of zooming video images and a mobile terminal.
- a solution may be as follows. While playing a video, a mobile terminal receives from the user a switch request for switching the display mode to full screen, and then displays the video at full screen after the switch request is received.
- various embodiments of the present disclosure provide a method of zooming video images and a mobile terminal.
- the technical schemes are as follows.
- various embodiments provide a method of zooming video images which may include:
- embodiments also provide a mobile terminal which may include:
- a storage device which stores at least one program executable by the at least one processor, the at least one program including instructions for:
- the technical scheme solves the problem that the user may be unable to see details in a video clearly even when the video is played at full screen, i.e., the problem that the related art cannot satisfy user demands.
- a user is enabled to selectively zoom video frames as needed, and thus can clearly see details in the video.
- FIGS. 1A and 1B are schematic diagrams illustrating a playing window in accordance with embodiments of the present disclosure
- FIG. 1C is a schematic diagram illustrating a coordinate system of vertex coordinates in accordance with embodiments of the present disclosure
- FIG. 1D is a schematic diagram illustrating a coordinate system of texture coordinates in accordance with embodiments of the present disclosure
- FIG. 1E is a schematic diagram illustrating texture coordinates in a coordinate system in accordance with embodiments of the present disclosure
- FIG. 1F is a flowchart illustrating a method of zooming video images in accordance with embodiments of the present disclosure
- FIG. 2A is a flowchart illustrating a method of zooming video images in accordance with embodiments of the present disclosure
- FIG. 2B is a schematic diagram illustrating a video player application when a user is zooming a current video frame in accordance with embodiments of the present disclosure
- FIG. 2C is a schematic diagram illustrating zoom guidance information presented by a video player application in accordance with embodiments of the present disclosure
- FIG. 2D is a schematic diagram illustrating a user drags image content in accordance with embodiments of the present disclosure
- FIG. 2E is a schematic diagram illustrating a target image area after adjustment in accordance with embodiments of the present disclosure
- FIG. 2F is a schematic diagram illustrating a recover button presented by a video player application in accordance with embodiments of the present disclosure
- FIG. 3 is a block diagram illustrating an apparatus of zooming video images in accordance with embodiments of the present disclosure
- FIG. 4 is a block diagram illustrating an apparatus of zooming video images in accordance with embodiments of the present disclosure
- FIG. 5 is a schematic diagram illustrating modules of a mobile terminal in accordance with embodiments of the present disclosure.
- the playing interface refers to an interface provided by a player terminal for presenting videos.
- the playing window refers to an area actually occupied by video frames in the playing interface.
- the size of a playing window may be the same as or different from that of the playing interface.
- referring to FIG. 1A, when a video frame occupies the whole playing interface, the size of the playing window 11 is the same as the size of the playing interface.
- referring to FIG. 1B, when a video frame occupies only the area excluding the upper and bottom areas (the upper and bottom areas are generally black areas in the playing interface while a user is watching a video), the size of the playing window 11 is smaller than the size of the playing interface.
- Vertex coordinates refer to coordinates of each vertex of a playing window.
- the coordinate system of vertex coordinates, denoted as P, is built on a horizontal central axis x and a vertical central axis y of a player terminal.
- the coordinate system P may be the coordinate system as shown in FIG. 1C .
- Vertex coordinates are coordinates of each vertex of a playing window in the coordinate system.
- the maximum coordinate value of vertex coordinates is generally 1.
- FIG. 1C shows vertex coordinates of each vertex of a playing window when the playing window occupies the whole playing interface.
- those skilled in the art may also set another value as the maximum coordinate value according to the needs.
- Texture coordinates may include coordinates of at least two vertexes of a zoomed playing window in an un-zoomed version of the current video frame.
- a coordinate system of the texture coordinates, denoted as Q, is set up along the edges of a current video frame with the bottom-left corner of the video frame as the origin.
- the coordinate system Q may be the coordinate system as shown in FIG. 1D (the maximum coordinate value of the coordinate system is generally set to be 1, and may be set to be another value by those skilled in the art according to the needs).
- Texture coordinates are coordinates of a to-be-presented target image area in the coordinate system. For example, referring to FIG. 1E, the target image area to be presented in the playing window in the current video frame is the area defined by A, B, C and D in the upper figure (the area defined by the dotted lines in the upper figure is the area of an un-zoomed version of the current video frame).
- the texture coordinates are the coordinates in the coordinate system Q of at least two points of the four points A, B, C and D in the current video frame, e.g., the coordinates of at least two points of the four points A′, B′, C′ and D′ in the lower figure of FIG. 1E .
- the at least two points may include two diagonal vertexes.
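As a minimal illustration of the two coordinate systems (an assumption-level sketch, not code from the patent), the mapping between the vertex coordinate system P (origin at the screen center, axes spanning [-1, 1]) and the texture coordinate system Q (origin at the bottom-left corner, axes spanning [0, 1]) is a simple affine transform:

```python
def p_to_q(x, y):
    """Map a point from the vertex coordinate system P (origin at the
    screen center, axes in [-1, 1]) to the texture coordinate system Q
    (origin at the bottom-left corner, axes in [0, 1])."""
    return ((x + 1.0) / 2.0, (y + 1.0) / 2.0)

def q_to_p(u, v):
    """Inverse mapping, from Q back to P."""
    return (u * 2.0 - 1.0, v * 2.0 - 1.0)
```

For instance, the screen center (0, 0) in P corresponds to (0.5, 0.5) in Q, and the bottom-left vertex (-1, -1) in P corresponds to the origin (0, 0) of Q.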
- FIG. 1F is a flowchart illustrating a method of zooming video images in accordance with embodiments of the present disclosure.
- the method of zooming video images may include the following procedures.
- a zoom request for zooming a current video frame may be received while a video is being played.
- a zoom center point and a zoom ratio may be determined according to the zoom request.
- a target image area to be displayed in a playing window may be determined from a zoomed version of the current video frame according to the zoom center point and the zoom ratio.
- image content within the target image area of subsequent video frames of the current video frame may be rendered in the playing window when the subsequent video frames are played.
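The procedures above can be sketched as follows. This is an illustrative reading only: the zoom center is taken as a normalized point in the frame, the zoom ratio as a scalar, and the target image area as a normalized rectangle; the detailed embodiment refines these with aspect-ratio corrections and thresholds.

```python
def compute_target_area(center, ratio):
    """Target rectangle (normalized to [0, 1]) around the zoom center.
    With a ratio of 2, half of the frame is kept in each dimension.
    Illustrative sketch; not the patent's exact formulas."""
    cx, cy = center
    half_w = 0.5 / ratio
    half_h = 0.5 / ratio
    # Clamp the rectangle so it stays inside the un-zoomed frame.
    x1, y1 = max(0.0, cx - half_w), max(0.0, cy - half_h)
    x2, y2 = min(1.0, cx + half_w), min(1.0, cy + half_h)
    return x1, y1, x2, y2
```

Once the target area is fixed, each subsequent frame is cropped to this rectangle and rendered in the playing window.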
- the method of various embodiments includes receiving a zoom request and determining, according to the zoom request, a target image area to be displayed in a playing window from a zoomed version of the current frame, so that the image content within the target image area of each frame subsequent to the current frame is rendered in the playing window when that frame is played.
- the technical scheme solves the problem that the user may be unable to see details in a video clearly even when the video is played at full screen, i.e., the problem that the related art cannot satisfy user demands.
- a user is enabled to selectively zoom video frames as needed, and thus can clearly see details in the video.
- FIG. 2A is a flowchart illustrating a method of zooming video images in accordance with embodiments of the present disclosure.
- the method of zooming video images may include the following procedures.
- a zoom request for zooming a current video frame may be received while a video is being played.
- the method may be applied to a video player application which may be an application installed in a mobile terminal device.
- the mobile terminal device may be a touch-control terminal, e.g., a touch-control mobile phone, a tablet computer, a personal reader, or the like.
- the user may trigger a zoom request for the current video frame.
- the video player application may receive the zoom request.
- this procedure may be implemented in the following two manners.
- a stretch gesture applied to the current video frame may be received, and determined to be a magnify request.
- for example, when a user is watching a tutorial video and wants to magnify and view tutorial material in it, the user may put two fingers on a target position in the playing window and make a stretch gesture on the playing window.
- the video player application may determine the stretch gesture received to be a magnify request.
- the target position refers to the position of the center of an area the user wants to magnify to view.
- a minify gesture applied to the current video frame may be received, and determined to be a minify request.
- the user may make a minify gesture in the playing window.
- the video player application may also present zoom guidance information for guiding the user to minify the video frames.
- the video player application may present the zoom guidance information after receiving a click signal indicating the user has clicked on the video frame.
- the video player application may present the zoom guidance information the first time when the video player application is run to play a video.
- the video player application may always present the zoom guidance information.
- taking a video player application that presents zoom guidance the first time it is run to play a video as an example, as shown in the upper figure of FIG. 2C, the video player application may present the zoom guidance information in a popover. In another example, referring to the lower figure of FIG. 2C, the video player application may always present the zoom guidance information 21 as shown in the figure.
- a zoom center point and a zoom ratio may be determined according to the zoom request.
- the video player application may determine a zoom center point and a zoom ratio according to the zoom request.
- the video player application may respectively determine a zoom ratio for the horizontal axis and a zoom ratio for the vertical axis for zooming the current video frame. For example, for a zoom request to magnify the current video frame to twice its original size, with the same ratio applied in the horizontal and vertical directions, the video player application may determine the zoom ratio of the horizontal axis to be
- a reference horizontal coordinate and a reference vertical coordinate may be calculated according to the zoom ratio.
- this procedure may include:
- xCord = n_x × (viewHeight / viewWidth) × (ImageWidth / ImageHeight);
- yCord = n_y × (viewWidth / viewHeight) × (ImageHeight / ImageWidth).
- this procedure may include:
- after obtaining the first value, the video player application may check whether the first value reaches the first threshold, and determine the reference horizontal coordinate xCord to be the first threshold in response to a determination that the first value reaches the first threshold, because the maximum values of the horizontal coordinate and the vertical coordinate are both the first threshold; the first threshold is generally 1;
- the video player application may check whether the second value reaches the first threshold.
- the video player application may determine the reference vertical coordinate yCord to be the first threshold in response to a determination that the second value reaches the first threshold, because the maximum value of both the horizontal coordinate and the vertical coordinate can only be the first threshold.
- a determination may be made that the reference vertical coordinate yCord is the second value in response to a determination that the second value does not reach the first threshold.
- n_y is the zoom ratio for the vertical axis of the current video frame
- viewWidth is the width of the displaying area of the playing window
- viewHeight is the height of the displaying area of the playing window
- ImageWidth is the width of the current video frame
- ImageHeight is the height of the current video frame.
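Under the formulas and threshold checks above, the reference coordinates can be sketched as follows. This is an assumption-level sketch: it treats the first value as feeding xCord and the second value as feeding yCord, and implements the threshold check as a simple clamp against the first threshold.

```python
def reference_coords(n_x, n_y, view_width, view_height,
                     image_width, image_height, first_threshold=1.0):
    """Aspect-ratio-corrected reference coordinates, clamped to the
    first threshold (generally 1), per the checks described above."""
    first_value = n_x * (view_height / view_width) * (image_width / image_height)
    second_value = n_y * (view_width / view_height) * (image_height / image_width)
    # If a value reaches the threshold, the reference coordinate is the
    # threshold itself; otherwise it is the computed value.
    x_cord = min(first_value, first_threshold)
    y_cord = min(second_value, first_threshold)
    return x_cord, y_cord
```

For a 1080x1920 playing window showing a 1080x1920 frame at 2x zoom, both values exceed the threshold and both reference coordinates clamp to 1.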
- the video player application may determine the reference horizontal coordinate and the reference vertical coordinate according to the following method which may include the following procedures.
- the reference vertical coordinate yCord may be determined to be the first threshold in response to a determination that the first value does not reach the first threshold.
- a third value may be obtained, which may be n_x × (viewHeight / viewWidth) × (ImageWidth / ImageHeight).
- n_x is the zoom ratio for the horizontal axis of the current video frame.
- the reference horizontal coordinate xCord may be determined to be the first threshold in response to a determination that the third value reaches the first threshold.
- the video player application may determine the reference horizontal coordinate to be the first threshold in response to a determination that the obtained third value reaches the first threshold.
- the reference horizontal coordinate xCord may be determined to be the third value in response to a determination that the third value does not reach the first threshold.
- this example describes a video player application calculating the reference horizontal coordinate and the reference vertical coordinate using the above method.
- a video player application may perform the calculation using other methods, and this is not limited in the present disclosure.
- texture coordinates may be calculated according to the zoom center point, the reference horizontal coordinate and the reference vertical coordinate.
- the video player application may calculate coordinate values of texture coordinates according to the zoom center point, the reference horizontal coordinate and the reference vertical coordinate.
- the texture coordinates may include coordinates of at least two vertexes of a playing window presenting a zoomed version of the current video frame in an un-zoomed version of the current video frame.
- the at least two vertexes may include diagonal vertexes of the playing window. This example takes texture coordinates including four vertexes as an example.
- this procedure may include the following steps.
- a target horizontal coordinate and a target vertical coordinate of the zoom center point may be obtained.
- the target horizontal coordinate may be: X0 = x0 / (viewWidth × n_x);
- the target vertical coordinate may be: Y0 = y0 / (viewHeight × n_y).
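A sketch of the normalization of the zoom center point; the horizontal formula is inferred by symmetry with the vertical one and should be treated as an assumption:

```python
def normalized_zoom_center(x0, y0, view_width, view_height, n_x, n_y):
    """Map the touched pixel position (x0, y0) in the playing window to
    normalized coordinates of the zoomed frame. The horizontal formula
    mirrors the vertical one by symmetry (assumption)."""
    X0 = x0 / (view_width * n_x)
    Y0 = y0 / (view_height * n_y)
    return X0, Y0
```

For example, touching the center of a 1080x1920 window at 2x zoom yields the normalized center (0.25, 0.25).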
- a first horizontal coordinate X1 of the texture coordinates is the first threshold i
- a second horizontal coordinate X2 of the texture coordinates is j
- a first vertical coordinate Y1 of the texture coordinates is
- a second vertical coordinate Y2 of the texture coordinates is Y0.
- a first horizontal coordinate X1 of the texture coordinates is
- a second horizontal coordinate X2 of the texture coordinates is X0
- a first vertical coordinate Y1 of the texture coordinates is the first threshold i
- a second vertical coordinate Y2 of the texture coordinates is the second threshold j.
- the second horizontal coordinate X2 is the second threshold j.
- the first horizontal coordinate X1 of the texture coordinates is
- the second horizontal coordinate X2 of the texture coordinates is X0
- the first vertical coordinate Y1 of the texture coordinates is
- the second vertical coordinate Y2 of the texture coordinates is Y0.
- a first horizontal coordinate X1 of the texture coordinates is the first threshold i
- a second horizontal coordinate X2 of the texture coordinates is the second threshold j
- a first vertical coordinate Y1 of the texture coordinates is the first threshold i
- a second vertical coordinate Y2 of the texture coordinates is the second threshold j.
- the second threshold j is the minimal coordinate value in the coordinate system of the video frame. For example, when the coordinate system is the coordinate system as shown in FIG. 1D , the second threshold is 0.
- an area defined by the texture coordinates is determined to be the target image area.
- the video player application may determine the area defined by the texture coordinates in the current video frame as the target image area.
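The case analysis above reduces, under one plausible reading, to clamping a zoom window of width 1/n_x and height 1/n_y around the zoom center between the second threshold j and the first threshold i. The sketch below is an assumption, not the patent's literal case table:

```python
def texture_rect(X0, Y0, n_x, n_y, i=1.0, j=0.0):
    """Texture rectangle around the normalized zoom center (X0, Y0).
    At zoom ratio n, the window spans 1/n of the frame per axis; each
    coordinate is clamped between the second threshold j and the first
    threshold i (assumption-level sketch)."""
    half_w = 0.5 / n_x
    half_h = 0.5 / n_y
    X1 = min(X0 + half_w, i)   # first horizontal coordinate
    X2 = max(X0 - half_w, j)   # second horizontal coordinate
    Y1 = min(Y0 + half_h, i)   # first vertical coordinate
    Y2 = max(Y0 - half_h, j)   # second vertical coordinate
    return (X1, Y1), (X2, Y2)
```

A center touch at 2x zoom yields the middle quarter of the frame; a touch at the top-right corner degenerates to the clamped corner window.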
- vertex coordinates of the playing window may be calculated according to the reference horizontal coordinate and the reference vertical coordinate.
- the reference horizontal coordinate may be assumed to be xCord and the reference vertical coordinate to be yCord.
- the vertex coordinates may be: (xCord, −yCord), (xCord, yCord), (−xCord, yCord), and (−xCord, −yCord).
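A minimal helper producing the four window vertices from the reference coordinates, symmetric about the origin of the vertex coordinate system P (names are illustrative):

```python
def window_vertices(x_cord, y_cord):
    """Four corners of the playing window in the vertex coordinate
    system P (origin at the screen center)."""
    return [( x_cord, -y_cord),
            ( x_cord,  y_cord),
            (-x_cord,  y_cord),
            (-x_cord, -y_cord)]
```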
- image content in the target image area of the subsequent video frames are rendered in an area defined by the vertex coordinates according to the texture coordinates.
- the act of a user zooming an area of a video frame indicates that the user is interested in the image content within that area of the video frame. Therefore, the video player application may directly render the image content within the area defined by the texture coordinates of subsequent video frames in the area defined by the vertex coordinates. The video player application may render the image content using the Open Graphics Library (OpenGL).
- the video player application may present image content according to the above method when presenting the k'th frame subsequent to the current video frame, because it may take some time for the video player application to calculate the texture coordinates and the vertex coordinates. If the calculation does not take much time, e.g., the video player application has obtained the calculated texture coordinates and vertex coordinates before playing the next frame after the current video frame, the video player application may present image content according to the above method when playing that next frame.
- image content may be presented according to the above method as long as the texture coordinates and the vertex coordinates have been calculated, and the implementation of the method is not limited.
- the user may selectively drag image content rendered in the playing window when the user wants to adjust the display position of the video frames. That is, the video player application may also perform the following procedures.
- a drag request may be received.
- the drag request is for dragging image content rendered in the playing window, and the image content is image content within the target image area in the k'th video frame of the subsequent video frames.
- the k is a positive integer.
- for example, when the user wants to drag tutorial content in the video frame as shown in FIG. 1E into the center of the playing window, the user may trigger a drag request by performing a leftward drag on the video frame presented by the video player application. Accordingly, the video player application may receive the drag request.
- the target image area is adjusted according to the drag request, and the adjusted target image area includes an area in the k'th video frame to be presented in the playing window after the dragging.
- the video player application may adjust the target image area according to the received drag request.
- the adjusted target image area may include an area in the k'th video frame to be presented in the playing window after dragging.
- this procedure may include: calculating adjusted texture coordinates according to the drag request; taking an area defined by the adjusted texture coordinates as the adjusted target image area.
- the video player application may calculate the adjusted texture coordinates by obtaining a dragging displacement corresponding to the drag request, and calculating the adjusted texture coordinates according to the texture coordinates before the adjustment and the dragging displacement. That is, the texture coordinates may include coordinates of at least two vertexes of a playing window presenting the dragged current video frame in an un-zoomed version of the current video frame. The at least two vertexes may include diagonal vertexes of the playing window.
- the image area to be presented in the playing window is the area defined by E, F, G and H.
- the video player application may calculate coordinates of E′, F′, G′ and H′, and take the area defined by E′, F′, G′ and H′ as the adjusted target image area.
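One way to sketch the drag adjustment, assuming a linear mapping from pixel displacement to texture units and clamping so the rectangle stays inside the un-zoomed frame (the helper and its formula are assumptions, not given explicitly by the patent):

```python
def drag_texture_rect(rect, dx, dy, view_width, view_height, n_x, n_y):
    """Shift the target texture rectangle by a drag displacement given
    in window pixels, keeping the rectangle inside the unit square of
    the un-zoomed frame. rect is ((x1, y1), (x2, y2)) with (x1, y1) the
    bottom-left corner and (x2, y2) the top-right corner."""
    (x1, y1), (x2, y2) = rect
    tx = dx / (view_width * n_x)    # pixel displacement -> texture units
    ty = dy / (view_height * n_y)
    # Clamp the shift so the dragged rectangle stays within [0, 1].
    tx = max(-x1, min(tx, 1.0 - x2))
    ty = max(-y1, min(ty, 1.0 - y2))
    return (x1 + tx, y1 + ty), (x2 + tx, y2 + ty)
```

Dragging the centered quarter-window rightward by 540 pixels at 2x zoom on a 1080-pixel-wide window shifts the rectangle to the right edge of the frame.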
- the video player application may render image content of the video frame that falls within the adjusted target image area in the playing window. This procedure is similar to the procedure in block 207 .
- the user may perform an action on a recover button presented in the playing interface of the video player application; correspondingly, the video player application may perform the following procedures.
- a recover request may be received via the recover button.
- for example, when the user, after watching a video for a period of time in the mode shown in the upper figure of FIG. 2E, wants to return to the play mode used before the zooming, the user may click on the recover button 22 as shown in FIG. 2F; correspondingly, the video player application may receive the recover request via the recover button 22.
- video frames to be played are played, in response to the recover request, according to the play mode used before the zooming triggered by the zoom request.
- the video player application may use the play mode as shown in FIG. 2B, e.g., the same play center point and zoom ratio as those of the video frame shown in FIG. 2B, to play the pending video frames.
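Recovering the pre-zoom play mode can be sketched as saving the (center, ratio) pair before zooming and restoring it on a recover request; all names here are illustrative, not the patent's API:

```python
class PlayMode:
    """Minimal sketch of saving and restoring the pre-zoom play mode."""
    def __init__(self):
        self.saved = None
        self.center, self.ratio = (0.5, 0.5), 1.0   # default un-zoomed mode

    def apply_zoom(self, center, ratio):
        self.saved = (self.center, self.ratio)      # remember mode before zooming
        self.center, self.ratio = center, ratio

    def recover(self):
        if self.saved is not None:
            self.center, self.ratio = self.saved    # restore the pre-zoom mode
            self.saved = None
```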
- the method of various embodiments includes receiving a zoom request and determining, according to the zoom request, a target image area to be displayed in a playing window from a zoomed version of the current frame, so that the image content within the target image area of each frame subsequent to the current frame is rendered in the playing window when that frame is played.
- the technical scheme solves the problem that the user may be unable to see details in a video clearly even when the video is played at full screen, i.e., the problem that the related art cannot satisfy user demands.
- a user is enabled to selectively zoom video frames as needed, and thus can clearly see details in the video.
- adjusted texture coordinates may be calculated and an area defined by the adjusted texture coordinates may be regarded as the adjusted target image area.
- image content within the adjusted target image area is rendered in the playing window, such that the user can adjust, as needed, which content of the video frames is presented in the playing window during play of the video.
- users' demand for watching a video can be better satisfied.
- FIG. 3 is a schematic diagram illustrating an apparatus of zooming video images in accordance with embodiments of the present disclosure.
- the apparatus may include: a first receiving module 310 , a parameter determining module 320 , an area determining module 330 and a first presenting module 340 .
- the first receiving module 310 may receive a zoom request for zooming a current video frame while a video is being played.
- the parameter determining module 320 may determine a zoom center point and a zoom ratio according to the zoom request.
- the area determining module 330 may determine a target image area to be presented in a playing window from a zoomed version of the current video frame according to the zoom center point and the zoom ratio.
- the first presenting module 340 may render in the playing window image content within the target image area of subsequent video frames of the current video frame when playing the subsequent video frames.
- the apparatus of various embodiments provides receiving a zoom request, determining a target image area to be displayed in a playing window from a zoomed version of the current frame according to the zoom request, and presenting in the playing window the image within the target image area of each frame subsequent to the current frame when that frame is played.
- the technical scheme solves the problem that the user may be unable to see details in a video clearly even after the video is played at full screen, e.g., the problem that related art cannot satisfy demands of users.
- a user is thus enabled to selectively zoom video frames as needed, and can clearly see details in the video.
- FIG. 4 is a schematic diagram illustrating an apparatus of zooming video images in accordance with embodiments of the present disclosure.
- the apparatus may include: a first receiving module 410 , a parameter determining module 420 , an area determining module 430 and a first presenting module 440 .
- the first receiving module 410 may receive a zoom request for zooming a current video frame while a video is being played.
- the parameter determining module 420 may determine a zoom center point and a zoom ratio according to the zoom request.
- the area determining module 430 may determine a target image area to be presented in a playing window from a zoomed version of the current video frame according to the zoom center point and the zoom ratio.
- the first presenting module 440 may render in the playing window image content within the target image area in subsequent video frames of the current video frame when playing the subsequent video frames.
- the first receiving module 410 may also:
- the apparatus may also include:
- an information presenting module 450 to present zoom guidance information for guiding the user to zoom the video frame.
- the apparatus may also include:
- a second receiving module 460 to receive a drag request for dragging image content presented in the playing window which is image content within the target image area in the k'th video frame of the subsequent video frames;
- the k is a positive integer;
- an adjusting module 470 to adjust the target image area according to the drag request so that the adjusted target image area includes the area in the k'th video frame to be presented in the playing window after the dragging;
- a second presenting module 480 to render image content of a video frame that falls in the adjusted target image area in the playing window when playing video frames subsequent to the k'th video frame.
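The adjusting module's behavior can be sketched per the description above: shift the target image area by the drag offset while keeping the window inside the frame. The clamping bounds i = 0 and j = 1 (normalized coordinates) are an assumption for illustration, not quoted from the patent.

```python
def drag_target_area(x1, y1, x2, y2, dx, dy, i=0.0, j=1.0):
    """Shift the target image area (x1, y1)-(x2, y2) by the drag offset
    (dx, dy) and keep it inside [i, j] on both axes. The bounds i = 0 and
    j = 1 are assumed, not quoted from the patent."""
    w, h = x2 - x1, y2 - y1          # a drag preserves the window size
    nx1 = min(max(x1 + dx, i), j - w)
    ny1 = min(max(y1 + dy, i), j - h)
    return nx1, ny1, nx1 + w, ny1 + h
```

For example, dragging a centered half-size window past the right edge simply pins it against that edge, so subsequent frames render the rightmost portion of the zoomed frame.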
- the apparatus may also include:
- a third receiving module 490 to receive a recover request via a recover button;
- a third presenting module 510 to play video frames to be played after the recover request is received according to the play mode used before the zooming according to the zoom request.
- the area determining module 430 may include:
- a first calculating unit 431 to calculate a reference horizontal coordinate and a reference vertical coordinate according to the zoom ratio;
- a second calculating unit 432 to calculate coordinate values of texture coordinates according to the zoom center point, the reference horizontal coordinate and the reference vertical coordinate;
- the texture coordinates may include coordinates of at least two vertexes of the playing window in a zoomed version of the current video frame, the at least two vertexes may include diagonal vertexes of the playing window;
- an area determining unit 433 to determine an area defined by the texture coordinates as the target image area.
- the first presenting module 440 may include:
- a third calculating unit 441 to calculate vertex coordinates of the playing window according to the reference horizontal coordinate and the reference vertical coordinate;
- a content presenting unit 442 to render image content within the target image area of subsequent video frames in an area corresponding to the vertex coordinates according to the texture coordinates.
- the third calculating unit 441 may also:
- the first calculating unit 431 may also:
- xCord = nx × (viewHeight / viewWidth) × (ImageWidth / ImageHeight);
- yCord = ny × (viewWidth / viewHeight) × (ImageHeight / ImageWidth);
- n x is the zoom ratio for the horizontal axis of the current video frame
- n y is the zoom ratio for the vertical axis of the current video frame
- viewWidth is the width of the displaying area in the playing window
- viewHeight is the height of the displaying area in the playing window
- ImageWidth is the width of the current video frame
- ImageHeight is the height of the current video frame.
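The two reference-coordinate formulas above can be written out as a sketch. The function name and argument order are illustrative; only the arithmetic follows the formulas as given.

```python
def reference_coords(nx, ny, view_width, view_height, image_width, image_height):
    """Reference horizontal and vertical coordinates per the two formulas
    above: xCord = nx * (viewHeight/viewWidth) * (ImageWidth/ImageHeight),
    yCord = ny * (viewWidth/viewHeight) * (ImageHeight/ImageWidth)."""
    x_cord = nx * (view_height / view_width) * (image_width / image_height)
    y_cord = ny * (view_width / view_height) * (image_height / image_width)
    return x_cord, y_cord
```

Note that when the playing window and the video frame share the same aspect ratio, the two correction factors cancel and the reference coordinates reduce to the zoom ratios themselves.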
- the first calculating unit 431 may also:
- yCord = ny × (viewWidth / viewHeight) × (ImageHeight / ImageWidth);
- n y is the zoom ratio for the vertical axis of the current video frame; viewWidth is the width of the displaying area of the playing window; viewHeight is the height of the displaying area of the playing window; ImageWidth is the width of the current video frame; ImageHeight is the height of the current video frame.
- the first calculating unit 431 may also:
- n x is the zoom ratio for the horizontal axis of the current video frame
- the second calculating unit 432 may also:
- obtain a target horizontal coordinate and a target vertical coordinate of the zoom center point; determine the target horizontal coordinate to be:
- the target vertical coordinate may be:
- Y0 = y0 / (viewHeight × ny).
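The normalization of the zoom center point can be sketched as below. The Y0 formula follows the text above; the X0 formula is reconstructed by symmetry (x0 over viewWidth × nx) and should be treated as an assumption.

```python
def target_center(x0, y0, nx, ny, view_width, view_height):
    """Normalize the zoom center point into target coordinates.
    Y0 = y0 / (viewHeight * ny) per the text; X0 is inferred by symmetry."""
    X0 = x0 / (view_width * nx)
    Y0 = y0 / (view_height * ny)
    return X0, Y0
```

For a center tap in a 1920x1080 window at 2x zoom, both target coordinates come out to 0.25 under this reconstruction.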
- when the reference horizontal coordinate is smaller than the first threshold i and the reference vertical coordinate is the first threshold i, determine a first horizontal coordinate X1 of the texture coordinates to be the first threshold i, a second horizontal coordinate X2 of the texture coordinates to be a second threshold j, and a first vertical coordinate Y1 of the texture coordinates to be
- a first horizontal coordinate X 1 of the texture coordinates is
- a second horizontal coordinate X 2 of the texture coordinates is X 0
- a first vertical coordinate Y 1 of the texture coordinates is the first threshold i
- a second vertical coordinate Y2 of the texture coordinates is the second threshold j; when the first horizontal coordinate X1 is larger than the first threshold i, determine the first horizontal coordinate X1 to be the first threshold i, and the second horizontal coordinate X2 to be
- the second horizontal coordinate X 2 is the second threshold j;
- when the reference horizontal coordinate and the reference vertical coordinate are both the first threshold, obtain the value of a first parameter a, the value of a second parameter b, the value of a third parameter c and the value of a fourth parameter d; determine the first horizontal coordinate X1 of the texture coordinates to be
- the second horizontal coordinate X 2 of the texture coordinates is X 0
- the first vertical coordinate Y 1 of the texture coordinates is
- the second vertical coordinate Y2 of the texture coordinates is Y0; when the first horizontal coordinate X1 is larger than the first threshold i, determine the second horizontal coordinate to be
- a first horizontal coordinate X 1 of the texture coordinates is the first threshold i
- a second horizontal coordinate X 2 of the texture coordinates is the second threshold j
- a first vertical coordinate Y 1 of the texture coordinates is the first threshold i
- a second vertical coordinate Y 2 of the texture coordinates is the second threshold j
- x 0 is the horizontal coordinate of the zoom center point in the current video frame
- y 0 is the vertical coordinate of the zoom center point in the current video frame
- a is the first threshold i
- n x is the zoom ratio for the horizontal axis of the current video frame
- n y is the zoom ratio for the vertical axis of the current video frame.
- the apparatus of various embodiments provides receiving a zoom request, determining a target image area to be displayed in a playing window from a zoomed version of the current frame according to the zoom request, and presenting in the playing window the image within the target image area of each frame subsequent to the current frame when that frame is played.
- the technical scheme solves the problem that the user may be unable to see details in a video clearly even after the video is played at full screen, e.g., the problem that related art cannot satisfy demands of users.
- a user is thus enabled to selectively zoom video frames as needed, and can clearly see details in the video.
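The threshold-clamping behavior of the texture coordinates described above can be illustrated per axis. The assumptions here are mine: the thresholds are taken as i = 0 and j = 1 in normalized texture space, and the visible window is taken to have width (j − i)/n around the zoom center; the patent's exact case formulas (elided above) are not reproduced.

```python
def texture_window(center, n, i=0.0, j=1.0):
    """Clamp a per-axis zoom window of width (j - i) / n centered on
    `center` to the interval [i, j]. Treating the thresholds as i = 0
    and j = 1 in normalized texture space is an assumption."""
    half = (j - i) / (2.0 * n)
    lo, hi = center - half, center + half
    if lo < i:                      # window fell below the first threshold: pin to i
        lo, hi = i, i + 2.0 * half
    if hi > j:                      # window passed the second threshold: pin to j
        lo, hi = j - 2.0 * half, j
    return lo, hi
```

Running this once per axis yields pairs analogous to (X1, X2) and (Y1, Y2): a zoom centered near an edge pins the window against that edge instead of sampling outside the frame.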
- FIG. 5 is a schematic diagram illustrating modules of a mobile terminal in accordance with embodiments of the present disclosure.
- a video player application may run in the mobile terminal 600 .
- the specific components are as follows.
- the mobile terminal 600 may include a radio frequency (RF) circuit 610 , at least one computer-readable storage medium 620 serving as memory, an input unit 630 , a display unit 640 , at least one sensor 650 , an audio circuit 660 , a wireless fidelity (WiFi) unit 670 , at least one processor 680 , a power supply 690 and the like.
- the structure shown in FIG. 5 does not restrict the terminal device.
- the terminal device of various examples may include extra components or may include fewer components, or may have some of the components integrated into one component, or may have a different deployment of the components.
- the RF circuit 610 is capable of sending and receiving signals during information transmission/reception or voice communication.
- the RF circuit 610 may send downlink information received from a base station to the at least one processor 680 for further processing, and may send uplink data to the base station.
- the RF circuit 610 may generally include, but is not limited to, an antenna, at least one amplifier, a tuner, at least one oscillator, a subscriber identity module (SIM) card, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
- the RF circuit 610 may perform wireless communications with a network and other devices.
- the wireless communication may adopt any communication standard or protocol, including but not limited to: global system for mobile communication (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), long term evolution (LTE), email, short messaging service (SMS), or the like.
- the storage medium 620 may be used for storing software programs and modules.
- the storage medium 620 may store a pre-determined time list, a software program for collecting voice signals, a software program for identifying key words, a software program for continuous speech recognition, a software program for setting events and alerts, and relationships which associate wireless access points with user accounts, or the like.
- the processor 680 may be capable of executing the software programs and modules stored in the storage device 620 to implement various functions and data processing.
- the storage medium 620 may mainly include program storage sections and data storage sections.
- the program storage sections may store an operating system, applications required for implementing at least one function (such as video playing function, image display function, touch screen recognition function, or the like).
- the data storage sections may store data generated during usage of the mobile terminal 600 , or the like.
- the storage medium 620 may include a high-speed random access memory, and may also include a non-transitory memory, e.g., at least one disk storage, flash memory or other non-transitory solid state storage device and the like.
- the storage device 620 may also include a storage controller to provide the processor 680 and the input unit 630 with access to the storage device 620 .
- the input unit 630 may receive digits or characters inputted, and generate a keyboard input signal, a mouse input signal, a control lever input signal, an optical input signal, or a track ball input signal which is related with user settings and function controlling.
- the input unit 630 may include a touch sensitive surface 631 and other inputting devices 632 .
- the touch sensitive surface 631 , also referred to as a touch screen or a touchpad, is capable of collecting touch operations performed by a user on or near the surface (e.g., an operation performed on or near the touch sensitive surface 631 using any proper object or attachment such as a finger or a touch pen), and driving a connecting apparatus corresponding to the operation according to a pre-defined procedure.
- the touch sensitive surface 631 may include a touch detecting apparatus and a touch controller.
- the touch detecting apparatus detects the position touched by the user, detects a signal generated by the touch, and sends the signal to the touch controller.
- the touch controller receives touch information from the touch detecting apparatus, converts the touch information into coordinates of the touch position, sends the coordinates to the processor 680 , receives a command sent by the processor 680 and executes the command.
- the touch sensitive surface 631 may be implemented via various types of touch techniques such as resistive touch screen, capacitive touch screen, infrared touch screen and surface acoustic wave touch screen and so on.
- the input unit 630 may include another input device 632 besides the touch sensitive surface 631 .
- the another input device 632 may include, but is not limited to, at least one of a physical keyboard, a function key (e.g., a volume control key, a power on/off key, etc.), a track ball, a mouse, a control lever and the like.
- the display unit 640 is capable of displaying information inputted by the user, information provided for the user and various graphical user interfaces of the mobile terminal 600 .
- the graphical user interfaces may include any combination of graphics, texts, icons, and videos.
- the display unit 640 may include a display panel 641 .
- the display panel 641 may be implemented by Liquid Crystal Display (LCD), Organic Light-Emitting Diode (OLED) and the like.
- the touch sensitive surface 631 may overlay the display panel 641 . When detecting a touch operation on or near it, the touch sensitive surface 631 may send the touch operation to the processor 680 to determine the type of the touch event.
- the processor 680 may provide visual output on the display panel 641 according to the type of the touch event.
- although the touch sensitive surface 631 and the display panel 641 are depicted in FIG. 5 as two independent components for input and output respectively, they may be integrated to provide input and output in various examples.
- the mobile terminal 600 may also include at least one sensor 650 , e.g., an optical sensor, a motion sensor, or other types of sensors.
- the optical sensor may include an ambient light sensor and a proximity sensor.
- the ambient light sensor may adjust the brightness of the display panel 641 according to the strength of ambient light.
- the proximity sensor may turn off the display panel 641 and/or the backlight when the mobile terminal 600 is held close to an ear.
- a gravity sensor, a type of motion sensor, may detect the amount of acceleration in multiple directions (typically along the X, Y and Z axes) and the amount and direction of gravity when stationary, and can be used in applications which need to identify phone postures (such as auto screen rotation, games using the sensing result, and magnetometer attitude calibration), in features related to vibration identification (such as a pedometer or percussion detection), and the like.
- the mobile terminal 600 may include other sensors, e.g., a gyroscope, a barometer, a hygrometer, a thermometer, infrared sensors and the like, which are not listed further herein.
- the audio circuit 660 , the speaker 661 and the microphone 662 may provide an audio interface between the user and the mobile terminal device 600 .
- the audio circuit 660 may convert received audio data into electrical signals, and send the electrical signals to the speaker 661 .
- the speaker 661 may convert the electrical signals into sound and output the sound.
- the microphone 662 may convert collected sound signals into electrical signals which are received by the audio circuit 660 .
- the audio circuit 660 may convert the electrical signals into audio data, and send the audio data to the processor 680 for processing.
- the processed audio data may be sent to another terminal device via the RF circuit 610 , or be output to the storage device 620 for further processing.
- the audio circuit 660 may also include an ear jack providing communications between a peripheral earphone and the mobile terminal 600 .
- the short-distance wireless communication module 670 may be a wireless fidelity (WiFi) module or a Bluetooth module, or the like.
- the mobile terminal 600 may adopt the WiFi module 670 to provide wireless broadband Internet access, enabling a user to send and receive emails, browse webpages, access streaming media, and so on.
- the terminal device 600 may not include the WiFi module 670 although it is shown in FIG. 5 .
- the structure in FIG. 5 is merely an example, modifications can be made as long as they do not change the mechanism of the examples.
- the processor 680 is the control center of the mobile terminal 600 ; it interconnects all of the components of the device using various interfaces and circuits, and monitors the device by running or executing software programs and/or modules stored in the storage device 620 , calling various functions of the mobile terminal 600 , and processing data.
- the processing unit 680 may include one or multiple processing cores.
- the processing unit 680 may integrate an application processor and a modem processor.
- the application processor mainly handles the operating system, user interfaces and application programs, and etc.
- the modem processor mainly handles wireless communications.
- the modem may not be integrated into the processor 680 .
- the mobile terminal 600 may also include a power supply 690 (e.g., a battery) providing power for various parts.
- the power supply may be logically connected with the processor 680 via a power supply management system to implement functions such as charging, discharging, power management and the like.
- the power supply 690 may also include components such as one or more AC or DC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator and the like.
- the mobile terminal 600 may also include a camera, a Bluetooth module and the like, which are not described further herein.
- the mobile terminal 600 may also include a storage device and at least one program which may be executed by at least one processor to implement the method of zooming video images of various examples.
- a non-transitory computer-readable storage medium including instructions, e.g., a storage device including instructions, may also be provided.
- the instructions may be executable by a processor at the mobile terminal to implement the method of zooming video images.
- the non-transitory computer-readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disc, optical storage device, or the like.
- the apparatus of zooming video images takes the above modules as an example.
- the functions may be re-divided to be implemented by different modules, e.g., the apparatus may have a different inner structure composed of different modules to implement all or some of the above functions.
- the above methods of zooming video images provided by the examples belong to the same inventive concept. Details have been described above, and will not be repeated herein.
- index numbers of the examples are merely for facilitating description, and should not be interpreted as representing a preference order among the examples.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Databases & Information Systems (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
- Studio Circuits (AREA)
Abstract
Description
and the zoom ratio of the vertical axis is also
nx is the zoom ratio for the horizontal axis of the current video frame.
the target vertical coordinate may be:
and a second vertical coordinate Y2 of the texture coordinates is Y0. When the first vertical coordinate Y1 reaches the first threshold i, it is determined that the first vertical coordinate Y1 is the first threshold i, and the second vertical coordinate Y2 is
a second horizontal coordinate X2 of the texture coordinates is X0, a first vertical coordinate Y1 of the texture coordinates is the first threshold i, and a second vertical coordinate Y2 of the texture coordinates is the second threshold j. When the first horizontal coordinate X1 is larger than the first threshold i, it is determined that the first horizontal coordinate X1 is the first threshold i, and the second horizontal coordinate X2 is
When the second horizontal coordinate X2 does not reach the second threshold j, it is determined that the first horizontal coordinate X1 is
the second horizontal coordinate X2 is the second threshold j.
the second horizontal coordinate X2 of the texture coordinates is X0, and the first vertical coordinate Y1 of the texture coordinates is
the second vertical coordinate Y2 of the texture coordinates is Y0. When the first horizontal coordinate X1 is larger than the first threshold i, it is determined that the second horizontal coordinate is
When the first vertical coordinate Y1 is larger than the first threshold i, it is determined that the second vertical coordinate Y2 is
When the second horizontal coordinate is smaller than the second threshold j, it is determined that the second horizontal coordinate is the second threshold j, and the first horizontal coordinate is
d is the first threshold i; nx is the zoom ratio for the horizontal axis of the current video frame; ny is the zoom ratio for the vertical axis of the current video frame.
Claims (17)
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201510181200 | 2015-04-16 | ||
| CN201510181200.6A CN104822088B (en) | 2015-04-16 | 2015-04-16 | Video image zooming method and apparatus |
| CN201510181200.6 | 2015-04-16 | ||
| PCT/CN2016/078352 WO2016165568A1 (en) | 2015-04-16 | 2016-04-01 | Method for scaling video image, and mobile terminal |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2016/078352 Continuation WO2016165568A1 (en) | 2015-04-16 | 2016-04-01 | Method for scaling video image, and mobile terminal |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20170347153A1 US20170347153A1 (en) | 2017-11-30 |
| US10397649B2 true US10397649B2 (en) | 2019-08-27 |
Family
ID=53732233
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/681,192 Active US10397649B2 (en) | 2015-04-16 | 2017-08-18 | Method of zooming video images and mobile display terminal |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10397649B2 (en) |
| KR (1) | KR101951135B1 (en) |
| CN (1) | CN104822088B (en) |
| WO (1) | WO2016165568A1 (en) |
Families Citing this family (26)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104822088B (en) | 2015-04-16 | 2019-03-19 | 腾讯科技(北京)有限公司 | Video image zooming method and apparatus |
| CN106817533A (en) * | 2015-11-27 | 2017-06-09 | 小米科技有限责任公司 | Image processing method and device |
| CN107547913B (en) * | 2016-06-27 | 2021-06-18 | 阿里巴巴集团控股有限公司 | Video data playing and processing method, client and equipment |
| CN106373169A (en) * | 2016-08-30 | 2017-02-01 | 广东成德电子科技股份有限公司 | Parameter-driven printed circuit board bitmap duplication method and system |
| WO2018097632A1 (en) | 2016-11-25 | 2018-05-31 | Samsung Electronics Co., Ltd. | Method and device for providing an image |
| CN107577398B (en) * | 2017-08-08 | 2021-03-12 | 深圳Tcl新技术有限公司 | Interface animation control method, device and storage medium |
| CN109598672B (en) * | 2017-09-30 | 2021-12-14 | 腾讯科技(深圳)有限公司 | Map road rendering method and device |
| CN108170350A (en) * | 2017-12-28 | 2018-06-15 | 努比亚技术有限公司 | Realize method, terminal and the computer readable storage medium of Digital Zoom |
| CN109121000A (en) * | 2018-08-27 | 2019-01-01 | 北京优酷科技有限公司 | A kind of method for processing video frequency and client |
| CN110876079B (en) * | 2018-08-31 | 2022-05-06 | 阿里巴巴集团控股有限公司 | Video processing method, device and equipment |
| CN111277886B (en) * | 2018-11-16 | 2022-10-28 | 北京字节跳动网络技术有限公司 | Panoramic video view field control method and device, electronic equipment and storage medium |
| CN110933493A (en) * | 2018-12-03 | 2020-03-27 | 北京仁光科技有限公司 | Video rendering system, method and computer-readable storage medium |
| CN109729408B (en) * | 2018-12-19 | 2022-03-11 | 四川坤和科技有限公司 | A mobile terminal high-definition online video scaling method |
| CN110275749B * | 2019-06-19 | 2022-03-11 | 深圳顺盈康医疗设备有限公司 | Surface amplifying display method |
| CN112446904B (en) * | 2019-08-30 | 2024-04-09 | 西安诺瓦星云科技股份有限公司 | Image alignment method, device and system |
| CN110764764B (en) * | 2019-09-16 | 2024-03-01 | 平安科技(深圳)有限公司 | Webpage end image fixed stretching method and device, computer equipment and storage medium |
| CN110996150A (en) * | 2019-11-18 | 2020-04-10 | 咪咕动漫有限公司 | Video fusion method, electronic device and storage medium |
| CN111221455B (en) | 2020-01-06 | 2022-03-04 | 北京字节跳动网络技术有限公司 | Material display method and device, terminal and storage medium |
| CN111722781A (en) * | 2020-06-22 | 2020-09-29 | 京东方科技集团股份有限公司 | Intelligent interaction method and device, storage medium |
| CN112218157A (en) * | 2020-10-10 | 2021-01-12 | 杭州赛鲁班网络科技有限公司 | System and method for intelligently focusing video |
| CN112367559B (en) * | 2020-10-30 | 2022-10-04 | 北京达佳互联信息技术有限公司 | Video display method and device, electronic equipment, server and storage medium |
| CN112667345A (en) * | 2021-01-25 | 2021-04-16 | 深圳市景阳信息技术有限公司 | Image display method and device, electronic equipment and readable storage medium |
| CN113259767B (en) * | 2021-06-15 | 2021-09-17 | 北京新片场传媒股份有限公司 | Method and device for zooming audio and video data and electronic equipment |
| US11847807B2 (en) | 2021-07-02 | 2023-12-19 | Genesys Logic, Inc. | Image processing system and processing method of video stream |
| TWI824321B (en) * | 2021-07-02 | 2023-12-01 | 創惟科技股份有限公司 | Image controller, image processing system and image modifying method |
| CN116233539A (en) * | 2023-03-07 | 2023-06-06 | 北京字跳网络技术有限公司 | A method and device for displaying information |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TW201352055A (en) * | 2012-06-01 | 2013-12-16 | Jinone Inc | Apparatus for controlling LED sub-series |
- 2015-04-16 CN CN201510181200.6A patent/CN104822088B/en active Active
- 2016-04-01 WO PCT/CN2016/078352 patent/WO2016165568A1/en not_active Ceased
- 2016-04-01 KR KR1020177018479 patent/KR101951135B1/en active Active
- 2017-08-18 US US15/681,192 patent/US10397649B2/en active Active
Patent Citations (16)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6400852B1 (en) * | 1998-12-23 | 2002-06-04 | Luxsonor Semiconductors, Inc. | Arbitrary zoom “on-the-fly” |
| CN101325040A (en) | 2008-07-16 | 2008-12-17 | 宇龙计算机通信科技(深圳)有限公司 | Mobile terminal capable of adjusting resolution and method for adjusting resolution of the mobile terminal |
| US20100026721A1 (en) * | 2008-07-30 | 2010-02-04 | Samsung Electronics Co., Ltd | Apparatus and method for displaying an enlarged target region of a reproduced image |
| US8817052B2 (en) * | 2009-11-02 | 2014-08-26 | Sony Corporation | Information processing apparatus, image enlargement processing method, and computer program product with visible data area enlargement features |
| CN102377960A (en) | 2010-08-24 | 2012-03-14 | 腾讯科技(深圳)有限公司 | Video picture displaying method and device |
| US20130083078A1 (en) | 2010-08-24 | 2013-04-04 | Tencent Technology (Shenzhen) Company Limited | Method and Apparatus for Presenting a Video Screen |
| KR20120024058A (en) | 2010-09-03 | 2012-03-14 | 에스케이플래닛 주식회사 | Digital contents service system, methods for creating and providing digital contents |
| US20120092381A1 (en) * | 2010-10-19 | 2012-04-19 | Microsoft Corporation | Snapping User Interface Elements Based On Touch Input |
| US20130009997A1 (en) * | 2011-07-05 | 2013-01-10 | Research In Motion Limited | Pinch-to-zoom video apparatus and associated method |
| US20140282061A1 (en) * | 2013-03-14 | 2014-09-18 | United Video Properties, Inc. | Methods and systems for customizing user input interfaces |
| KR20140133081A (en) | 2013-05-09 | 2014-11-19 | 엘지전자 주식회사 | Mobile terminal and sharing contents displaying method thereof |
| US20150268822A1 (en) * | 2014-03-21 | 2015-09-24 | Amazon Technologies, Inc. | Object tracking in zoomed video |
| CN103888840A (en) | 2014-03-27 | 2014-06-25 | 电子科技大学 | Method and device for dragging and zooming video mobile terminal in real time |
| CN104238863A (en) * | 2014-08-29 | 2014-12-24 | 广州视睿电子科技有限公司 | Android-based circle selection scaling method and system |
| CN104469398A (en) | 2014-12-09 | 2015-03-25 | 北京清源新创科技有限公司 | Network video image processing method and device |
| CN104822088A (en) | 2015-04-16 | 2015-08-05 | 腾讯科技(北京)有限公司 | Video image zooming method and device |
Non-Patent Citations (5)
| Title |
|---|
| International Preliminary Report on Patentability issued in International Application No. PCT/CN2016/078352, dated Oct. 17, 2017. |
| International Search Report (with translation) for International Application No. PCT/CN2016/078352, dated Jun. 3, 2016. |
| Office Action issued in Chinese Application No. 201510181200.6, dated Jun. 2, 2017. |
| Office Action (with explanation of relevance) issued in Chinese Application No. 201510181200.6, dated Oct. 9, 2018. |
| Office Action (with translation) for Korean Patent Application No. 10-2017-7018479, dated May 25, 2018. |
Also Published As
| Publication number | Publication date |
|---|---|
| CN104822088A (en) | 2015-08-05 |
| KR101951135B1 (en) | 2019-02-21 |
| KR20170089929A (en) | 2017-08-04 |
| CN104822088B (en) | 2019-03-19 |
| WO2016165568A1 (en) | 2016-10-20 |
| US20170347153A1 (en) | 2017-11-30 |
Similar Documents
| Publication | Title |
|---|---|
| US10397649B2 (en) | Method of zooming video images and mobile display terminal |
| CN111061574B (en) | Object sharing method and electronic device |
| US11054988B2 (en) | Graphical user interface display method and electronic device |
| CN110096326B (en) | Screen capture method, terminal device and computer-readable storage medium |
| US10133480B2 (en) | Method for adjusting input-method keyboard and mobile terminal thereof |
| CN111142991A (en) | Application function page display method and electronic device |
| CN108446058B (en) | Operation method of a mobile terminal and mobile terminal |
| CN105975190B (en) | Graphical interface processing method, device and system |
| JP2015007949A (en) | Display device, display controlling method, and computer program |
| CN107193451B (en) | Information display method, apparatus, computer device, and computer-readable storage medium |
| CN108664190A (en) | Page display method, device, mobile terminal and storage medium |
| CN108920069B (en) | Touch operation method and device, mobile terminal and storage medium |
| US20150089431A1 (en) | Method and terminal for displaying virtual keyboard and storage medium |
| CN108287650A (en) | One-handed operation method based on mobile terminal and mobile terminal |
| CN103399657B (en) | Mouse pointer control method, device and terminal device |
| CN110647277A (en) | Control method and terminal device |
| CN113050863A (en) | Page switching method and device, storage medium and electronic device |
| CN105513098B (en) | Image processing method and device |
| CN103885692A (en) | Page changing method, device and terminal |
| CN110941378B (en) | Video content display method and electronic device |
| CN117435109A (en) | Content display method and device and computer-readable storage medium |
| CN109032487A (en) | Electronic device control method, electronic device control device, storage medium and electronic device |
| CN108287745A (en) | Display method and terminal device for a WebApp interface |
| CN109104573B (en) | Method for determining focusing point and terminal device |
| CN108628534B (en) | Character display method and mobile terminal |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: TENCENT TECHNOLOGY (SHENZHEN) COMPANY LIMITED, CHINA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: ZUO, HONGTAO; REEL/FRAME: 043347/0306. Effective date: 2017-08-08 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY. Year of fee payment: 4 |