frame buffer

frame buffer

[′frām ‚bəf·ər]
(computer science)
A device that stores a television picture or frame for processing.
McGraw-Hill Dictionary of Scientific & Technical Terms, 6E, Copyright © 2003 by The McGraw-Hill Companies, Inc.

frame buffer

(hardware)
Part of a video system in which an image is stored, pixel by pixel, and which is used to refresh a raster image. The term "video memory" suggests a fairly static display, whereas a frame buffer holds one frame from a sequence of frames forming a moving image.

Frame buffers are found in frame grabbers and time base correction systems, for example.
This article is provided by FOLDOC - Free Online Dictionary of Computing (foldoc.org)
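
As a concrete illustration of the description above, the sketch below writes pixels into a memory-mapped frame buffer. It is a minimal sketch assuming a Linux system that exposes the frame buffer as /dev/fb0 in a 32-bit-per-pixel mode; the device path and pixel format are assumptions rather than part of the definition, and error handling is abbreviated.

/* Minimal sketch, assuming a Linux system that exposes the frame buffer as
 * /dev/fb0 in a 32-bit-per-pixel mode; error handling is abbreviated. */
#include <fcntl.h>
#include <linux/fb.h>
#include <stdint.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/fb0", O_RDWR);
    if (fd < 0)
        return 1;

    struct fb_var_screeninfo var;   /* resolution and bits per pixel */
    struct fb_fix_screeninfo fix;   /* line length and buffer size   */
    ioctl(fd, FBIOGET_VSCREENINFO, &var);
    ioctl(fd, FBIOGET_FSCREENINFO, &fix);

    /* Map the frame buffer into this process's address space. */
    uint8_t *fb = mmap(NULL, fix.smem_len, PROT_READ | PROT_WRITE,
                       MAP_SHARED, fd, 0);
    if (fb == MAP_FAILED)
        return 1;

    /* Fill every pixel: the display hardware continuously refreshes the
     * raster from this memory, so the change appears on screen at once. */
    for (uint32_t y = 0; y < var.yres; y++) {
        uint32_t *row = (uint32_t *)(fb + y * fix.line_length);
        for (uint32_t x = 0; x < var.xres; x++)
            row[x] = 0x00808080;    /* mid-grey, assuming 32 bpp XRGB */
    }

    munmap(fb, fix.smem_len);
    close(fd);
    return 0;
}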

frame buffer

An area of memory used to hold the frame of data that is continuously being sent to the screen. The buffer is the size of the maximum image that can be displayed and may be a separate memory bank on the graphics card (display adapter) or a reserved part of regular memory. Sophisticated graphics systems are built with several memory planes, each holding one or more bits of the pixel. See video RAM. See also frame grabber.
Copyright © 1981-2019 by The Computer Language Company Inc. All rights reserved.
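
Since the definition notes that the buffer is the size of the maximum image that can be displayed, a quick back-of-envelope calculation makes the storage requirement concrete. This is only a sketch; the resolution and depth are illustrative values chosen here, not taken from the text.

/* Back-of-envelope sketch: how large the frame buffer must be for one
 * display mode. The resolution and depth are illustrative, not fixed. */
#include <stdio.h>

int main(void)
{
    unsigned width  = 1920;   /* pixels per scan line             */
    unsigned height = 1080;   /* scan lines                       */
    unsigned bpp    = 32;     /* bits per pixel across all planes */

    unsigned long bytes = (unsigned long)width * height * (bpp / 8);
    printf("frame buffer: %lu bytes (about %.1f MB)\n",
           bytes, bytes / (1024.0 * 1024.0));
    return 0;
}
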
References in periodicals archive
The control interface uses a flexible protocol that can easily be augmented to provide additional functionality for special devices or protocols (e.g., resetting the framebuffer, etc.).
As part of that application, a "digital VCR" was implemented to display the four images on a framebuffer. The requirements called for real-time display along with the additional computations in the stereovision application.
Four interface nodes are used to forward the four images to the framebuffer through the network interface.
The specific features of the streams software that were used include support for generating framebuffer headers using a "framebuffer protocol" and stream synchronization to keep the four images synchronized.
Each of the buffers is large enough to hold both data and a header, allowing protocols to generate the packet header "in place." Several protocols are supported, including raw HIPPI and IP, along with support for the NSC and IOSC framebuffers.
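
The excerpt above describes buffers large enough to hold both data and a header so that a protocol can generate the packet header "in place." The sketch below illustrates that idea under stated assumptions: the structure and names (net_buf, push_header, MAX_HDR) are hypothetical stand-ins, not the cited system's interface; the point is simply reserving headroom in front of the payload so the header is written without copying the data.

/* Sketch of "in place" header generation, under the assumption that each
 * buffer reserves fixed headroom in front of the payload. The names
 * (net_buf, push_header, MAX_HDR) are hypothetical. */
#include <stdint.h>
#include <string.h>

#define MAX_HDR  64u       /* headroom reserved for the largest header */
#define MAX_DATA 65536u

struct net_buf {
    uint8_t storage[MAX_HDR + MAX_DATA];
    size_t  data_off;      /* where the payload starts (== MAX_HDR)    */
    size_t  data_len;      /* payload length                           */
};

/* Prepend a protocol header directly in front of the payload, so the
 * complete packet is contiguous and the data is never copied again. */
static uint8_t *push_header(struct net_buf *b, const void *hdr, size_t len)
{
    if (len > b->data_off)
        return NULL;                       /* not enough headroom */
    uint8_t *start = b->storage + b->data_off - len;
    memcpy(start, hdr, len);
    b->data_off -= len;
    b->data_len += len;
    return start;                          /* header + payload */
}
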
The software scaling generally creates an off-screen texture through the framebuffer object and performs viewport scaling, whereas the hardware method performs scaling in the viewport through the hardware scaler.
The framebuffer is the set of buffers in which the final output of the OpenGL rendering is stored.
The framebuffer object (FBO) is an OpenGL extension for flexible off-screen rendering, including texture rendering.
When setFixedSize() is called during the initialization of the GLSurfaceView, VRS is carried out and the output is stored in the framebuffer; in this case, VRS is performed using the hardware scaler.
The method is applied by modifying the framebuffer object in the source code of the Rolling ball benchmark of the GFx game engine, which is editable.
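
The excerpts above describe software scaling that renders into an off-screen texture through a framebuffer object (FBO) and then scales through the viewport. The sketch below shows that general OpenGL ES 2.0 pattern, not the cited implementation: it assumes a GL context is already current, and the function names (create_scaled_target, render_scaled) and the commented-out draw_scene() call are hypothetical placeholders.

/* Minimal sketch of off-screen rendering through a framebuffer object,
 * assuming an OpenGL ES 2.0 (or later) context is already current. The
 * function names and draw_scene() are hypothetical placeholders. */
#include <GLES2/gl2.h>

GLuint create_scaled_target(GLsizei w, GLsizei h, GLuint *out_tex)
{
    GLuint fbo, tex;

    /* Colour texture that will receive the off-screen rendering. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* Framebuffer object with the texture as its colour attachment. */
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE) {
        glDeleteFramebuffers(1, &fbo);
        glDeleteTextures(1, &tex);
        return 0;                          /* attachment not supported */
    }

    *out_tex = tex;
    return fbo;
}

void render_scaled(GLuint fbo, GLsizei low_w, GLsizei low_h)
{
    /* Render the scene at reduced resolution into the FBO... */
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, low_w, low_h);
    /* draw_scene();  -- hypothetical application draw calls */

    /* ...then switch back to the default (on-screen) framebuffer, where the
     * off-screen texture would be drawn as a full-screen quad, stretched by
     * the viewport to the display resolution. */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);
}

Drawing the off-screen texture back to the default framebuffer as a full-screen quad is what produces the final, viewport-scaled image; the hardware-scaler path mentioned above instead renders to a smaller fixed-size surface and lets the display hardware stretch it.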