data flow analysis



data flow analysis

[′dad·ə ¦flō ə‚nal·ə·səs]
(computer science)
The development of models for the movement of information within an organization, indicating the sources and destinations of information and where and how information is transmitted, processed, and stored.
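
A minimal way to picture such a model (the representation and the example flows below are assumptions of this sketch, not part of the entry) is a set of source-to-destination records describing which information moves where and how it is handled, here written in Python:

# Hypothetical organizational data flows: (source, destination, data item, handling).
# All names below are invented for illustration.
flows = [
    ("Sales",   "Billing",    "customer order", "transmitted"),
    ("Billing", "Accounting", "invoice",        "processed"),
    ("Billing", "Archive",    "invoice",        "stored"),
]

# Trace one data item through the organization.
item = "invoice"
for src, dst, data, handling in flows:
    if data == item:
        print(f"{item}: {src} -> {dst} ({handling})")

Querying the records this way shows, for a given data item, its sources, its destinations, and whether it is transmitted, processed, or stored along each hop.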

data flow analysis

(programming)
A process to discover the dependencies between different data items manipulated by a program. The order of execution in a data-driven language is determined solely by the data dependencies. For example, given the equations

1. X = A + B
2. B = 2 + 2
3. A = 3 + 4

a data-flow analysis would find that 2 and 3 must be evaluated before 1. Since there are no data dependencies between 2 and 3, they may be evaluated in any order, including in parallel.
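
As an illustration only (the code and its encoding of the equations are not part of the original entry), the same ordering falls out of a topological sort over the dependency graph, sketched here with Python's standard graphlib:

# Sketch: derive an evaluation order for the three equations from their
# data dependencies.  The encoding below is an assumption of this example.
from graphlib import TopologicalSorter

# equation label -> (variable it defines, variables it reads)
equations = {
    1: ("X", {"A", "B"}),
    2: ("B", set()),
    3: ("A", set()),
}

defined_by = {var: label for label, (var, _) in equations.items()}

# An equation depends on whichever equations define the variables it reads.
deps = {
    label: {defined_by[v] for v in reads if v in defined_by}
    for label, (_, reads) in equations.items()
}

order = list(TopologicalSorter(deps).static_order())
print(order)   # 2 and 3 (in either order) come before 1

Used incrementally (prepare() followed by get_ready()), the same class reports equations 2 and 3 as ready at the same time, mirroring the point that they may be evaluated in parallel.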

This technique is implemented in hardware in some pipelined processors with multiple functional units. It allows instructions to be executed as soon as their inputs are available, independent of the original program order.
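
A rough software analogue of that behaviour (a toy model, not a description of any particular processor) issues each instruction as soon as the registers it reads have been produced, regardless of program order:

# Toy issue-when-ready scheduler: instructions fire once their source
# registers hold values, independent of program order.  The instruction
# encoding is invented for this sketch.
program = [
    (1, "X", {"A", "B"}),   # 1. X = A + B
    (2, "B", set()),        # 2. B = 2 + 2
    (3, "A", set()),        # 3. A = 3 + 4
]

available = set()           # registers whose values have been produced
pending = list(program)
cycle = 0
while pending:
    cycle += 1
    issued = [(i, dst) for i, dst, srcs in pending if srcs <= available]
    if not issued:
        raise RuntimeError("circular data dependency")
    print(f"cycle {cycle}: issue {sorted(i for i, _ in issued)}")
    available.update(dst for _, dst in issued)
    done = {i for i, _ in issued}
    pending = [ins for ins in pending if ins[0] not in done]

Instructions 2 and 3 issue together in the first step, and 1 issues only once both of its inputs exist.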