Database Integrity
When data within a database retains internal consistency and remains free of
corruption.
Database Integrity Checking Routines
Processes (programs or scripts) that check the overall integrity of a (test)
database.
The routines scan the database tables and links and report any
logical data integrity problems, such as missing relationships, undefined
codes, etc.
They may also check the physical integrity of the database.
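As an illustration, a minimal integrity-checking routine could use SQL queries to find missing relationships (orphaned foreign keys) and undefined codes. The schema, tables, and code set below are hypothetical, not from this kit:

```python
import sqlite3

# Hypothetical test database: orders reference customers and carry a status code.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         status_code TEXT);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO orders VALUES (10, 1, 'OPEN'), (11, 2, 'XX');
""")

VALID_STATUS_CODES = {"OPEN", "CLOSED"}  # the agreed code set (illustrative)

# Missing relationships: orders whose customer_id has no matching customer.
orphans = conn.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.id
    WHERE c.id IS NULL
""").fetchall()

# Undefined codes: status values outside the agreed code set.
placeholders = ",".join("?" for _ in VALID_STATUS_CODES)
bad_codes = conn.execute(
    f"SELECT id FROM orders WHERE status_code NOT IN ({placeholders})",
    tuple(VALID_STATUS_CODES)).fetchall()

print("orphaned orders:", [r[0] for r in orphans])
print("orders with undefined codes:", [r[0] for r in bad_codes])
```

A real routine would walk every table and relationship in the schema; this sketch shows the pattern for a single pair of tables.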

The Acceptance Testing Kit – Glossary
Database Script
A (database) program written to undertake a specific database
administration task, such as changing the structure of the database, checking
the database, or taking a backup copy of the database.
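As a sketch of one such task, the script below takes a backup copy using SQLite's online backup API; the database contents and destination are illustrative:

```python
import sqlite3

# Hypothetical source database with a little data in it.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE t (x INTEGER)")
source.execute("INSERT INTO t VALUES (1), (2)")
source.commit()

# In practice the destination would be a file such as backup.db.
backup = sqlite3.connect(":memory:")
source.backup(backup)  # copies the whole database to the destination

rows = backup.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print("rows copied:", rows)
```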
End-user
Any individual who will be at the “receiving end” of the system from an
operational perspective.
They may be inputting or extracting data, or simply
generating reports.
Expected Results
The expected outcome of a test.
Function Points
A measure of the size of a system, based on the Function Point Analysis
methodology and discipline.
Function points count, categorise, and quantify
the business functions of the system. They are a technology-independent
method of estimating the amount of work associated with developing a
system.
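As an illustration, an unadjusted function point count multiplies the tally of each function type by a weight. The counts below are invented, and the weights are the commonly cited IFPUG average-complexity values, not figures from this kit:

```python
# Commonly cited IFPUG average-complexity weights (assumption, for illustration).
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

# Hypothetical counts for a small system.
counts = {
    "external_inputs": 12,
    "external_outputs": 8,
    "external_inquiries": 5,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}

ufp = sum(counts[k] * AVERAGE_WEIGHTS[k] for k in counts)
print("unadjusted function points:", ufp)
```

A full count would also classify each function as low, average, or high complexity and may apply an adjustment factor; the sketch uses average weights throughout.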
GITC
Acronym for Government Information Technology Contract.
Operating Environment
The hardware, software and network environment in which the system will
run.
System Owner
This may be the Business Unit Manager, the Project Sponsor, or another key
stakeholder.
They will usually have a vested interest in the outputs and
outcomes from the system.
Parallel Running
Where the existing system(s) continue to operate as normal, while the new
system duplicates all processes and activities.
The results from the old and
new systems can be compared as a means of testing the new system.
Parallel
running is normally conducted after implementation, prior to a full
changeover to the new system.
Parallel running may also occur as part of
Acceptance Testing.
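The comparison step can be sketched as feeding the same transactions to both systems and reconciling the results. Both "systems" below are hypothetical stand-ins for the real old and new implementations:

```python
# Hypothetical legacy and replacement calculations for the same transaction.
def old_system(txn):
    return round(txn["amount"] * 1.10, 2)  # existing system's result

def new_system(txn):
    return round(txn["amount"] * 1.10, 2)  # new system's result

transactions = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.5}]

# Any transaction where the two systems disagree needs investigation.
mismatches = [t["id"] for t in transactions
              if old_system(t) != new_system(t)]
print("mismatched transactions:", mismatches)
```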

Performance Test
Performance testing is where specific system functions (usually critical ones)
are timed under various system loads to ensure that the times meet
contractual performance criteria.
System loads may include network traffic and
database activity, as well as normal system functions.
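The timing itself can be sketched as follows; the function under test and the contractual threshold are illustrative:

```python
import time

# Hypothetical critical function to be timed.
def critical_function(n):
    return sum(i * i for i in range(n))

THRESHOLD_SECONDS = 2.0  # e.g. "must complete within 2 seconds" (assumption)

start = time.perf_counter()
critical_function(100_000)
elapsed = time.perf_counter() - start

print(f"elapsed: {elapsed:.4f}s, "
      f"within threshold: {elapsed < THRESHOLD_SECONDS}")
```

A real performance test would repeat the measurement under representative background load (network traffic, database activity) rather than timing a single quiet run.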
Problem Reports
A formal record of an Acceptance Test problem that defines the problem, the
conditions under which it occurred, and the details of the software version in
use.
It may also be used to record details of the resolution of the problem.
Problem Report Register
An index of all problem reports, summarising their status and relevant dates.
Regression Testing
Where a series of tests (or all tests) is repeated and the expected results
are known.
The expected results are taken from the same tests that were
completed successfully at an earlier time.
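The idea can be sketched as comparing current outputs against results recorded from an earlier, successful run; the function under test and the recorded cases are hypothetical:

```python
# Hypothetical function under test.
def discount(amount):
    return amount * 0.9 if amount >= 100 else amount

# Expected results captured when these test cases last passed.
expected_results = {50: 50, 100: 90.0, 200: 180.0}

# Any case whose current output differs from the recorded result is a regression.
failures = {case: discount(case) for case in expected_results
            if discount(case) != expected_results[case]}
print("regressions:", failures)
```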

