
How to Use the Validation Tool in Amped FIVE


In this article, we will walk you through generating and comparing datasets with the new Validation Tool in Amped FIVE. The tool is designed to detect changes in pixel values, giving you a quick way to create an initial dataset and then compare it against datasets produced by different software versions, or by other workstations running FIVE.

Amped FIVE includes over 140 filters, each of which can affect the pixel values of an image or video. For law enforcement agencies and forensic video units that must achieve accreditation through software validation, the sheer number of filters to test makes this a daunting task.

The validation process has two stages: first, an initial dataset is created; then, datasets are compared to verify that the software still performs as intended. Re-validation after a software update, or when running on different hardware, is now far less time-consuming: the automated dataset comparison in the new tool takes only a few minutes to complete.

Before generating the initial dataset, you need to create a number of projects containing the filters to be tested. Many large video units will already have done this, and your Standard Operating Procedures (SOPs) may call for testing only selected filters.

Project Creation in the Validation Tool

You could put everything into a single Amped FIVE Project (AFP), but it would be difficult to maintain. Breaking the filters down into separate projects makes this easier, and the tool works with a directory of projects, including sub-directories. Each project contains the required sample media, loaded alongside the filter being tested.

Note that many filters can share the same sample file in their Loaders, so a large collection of different files is unnecessary. For example, a single image or video is enough to test every filter in the Adjust category.

Every part of the system can be tested for the values it produces and decodes; once those values are recorded, they can then be compared.

We have created a project for the Adjust filter category containing five chains. Note that each chain is named after the filter it tests, which makes the results much easier to review.

Working by filter category is only a suggestion. You may find it more convenient to organize the projects differently to suit your working environment or your own testing needs.

Using the filter-category approach produces a series of AFP files.

The MEDIA_SET directory, shown here, stores the media samples used in the projects.

Now for the fun part: you will find the Validation Tool in the Utilities menu.

The tool offers three processes, selectable from the dropdown menu at the top of its interface.

  1. Generate: Create a new dataset.
  2. Generate and Compare: Create a new dataset and then compare it against a previous one.
  3. Compare: Compare two different previously created datasets.

Only the folder selection boxes relevant to the selected process are shown.

Generate

Clicking OK generates the test results.

Within the chosen output folder, a new directory is created to hold the results of each generation. Each set of results is stored in its own sub-directory, named with the FIVE revision number and a timestamp recording the date and time of the generation.
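As a rough illustration of this naming scheme (the exact format Amped uses is not documented here, so the pattern below is a hypothetical sketch), a per-generation results directory could be built like this:

```python
from datetime import datetime
from pathlib import Path

def make_results_dir(output_folder: str, revision: str) -> Path:
    # Hypothetical naming scheme: revision number plus a date-time stamp,
    # so every generation run lands in its own unique sub-directory.
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    results_dir = Path(output_folder) / f"Rev{revision}_{stamp}"
    results_dir.mkdir(parents=True, exist_ok=True)
    return results_dir
```

Keying the directory name on both revision and timestamp means results from different software versions, or repeated runs of the same version, never overwrite each other.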

Inside the folder you will find a series of images produced at the moment of frame hash computation. The results themselves are then saved in your preferred format: the tool's interface lets you choose between HTML and TSV.

For video files, a set of three images is generated for each test (chain): the first, middle, and last frames obtained from the final filter in the chain. The images are labeled with their frame numbers. They are generated so that, if a later comparison test flags a difference, it is easy to see what has changed.
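The frame selection itself is simple to sketch: for a clip of N frames, take the first, the middle, and the last (a 750-frame clip gives 0, 374, and 749, matching the Line Doubling example later in this article). The helper below is an illustrative assumption, not Amped's code:

```python
def sample_frame_indices(frame_count: int) -> tuple[int, int, int]:
    # First, middle, and last frame of the clip (0-based indices).
    first = 0
    last = frame_count - 1
    middle = last // 2
    return first, middle, last
```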

The filter list file gives a full breakdown of all the categories and filters used during testing, along with how many times each was used. Some filters, such as Video Loader, are tested extensively because of their role in loading videos before other filters are applied. Others, such as Audio Loader or Change Frame Rate, cannot be evaluated for pixel changes.

The Validation Tool Results file starts with the system details.

It is followed by a table listing all the assessed projects and chains, including any sub-directories.

Here you can see the relevant information on the frames used for the Line Doubling test. The AFP name is followed by the Chain name and internal Chain ID, and then the frame numbers. In this case, three frames were used: 0, 374, and 749.

The next column shows the MD5 hash of the file loaded in each chain. Note that a different file was used for the Line Doubling test than for Interleaving, as shown by the non-identical hashes.

The following column shows the frame hash (MD5) of the tested frame. This is distinct from the file hash: the frame hash is computed from the individual pixel values, and it matches what is displayed in Amped FIVE's Inspector window (under Tools).
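The distinction between the two hashes can be sketched in a few lines: the file hash covers the encoded bytes on disk, while the frame hash covers the decoded pixel values. How FIVE serializes the pixels before hashing is not documented here, so treat the pixel-hash helper below as an assumption:

```python
import hashlib

def file_md5(path: str) -> str:
    # Hash of the encoded file on disk: changes if the container or
    # encoding changes, even when the decoded pixels are identical.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def frame_md5(pixels: bytes) -> str:
    # Hash of the decoded pixel data for a single frame: changes only
    # when the pixel values themselves change.
    return hashlib.md5(pixels).hexdigest()
```

This is why two chains loading different source files show different file hashes even if a filter happens to produce pixel-identical output.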

Finally, the time taken to run the test on the chain is shown. With a complete dataset acquired, we can move on to Generate and Compare.
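Putting the columns above together, a row in the TSV output might look like the sketch below. The column names are my own paraphrase of the fields described in this article, not Amped's exact header:

```python
import csv
import io

# Illustrative column set, paraphrasing the fields described above.
COLUMNS = ["AFP", "Chain", "ChainID", "Frame", "FileMD5", "FrameMD5", "Time"]

def write_results_tsv(rows: list[dict[str, str]]) -> str:
    # Serialize result rows as tab-separated values,
    # one line per tested frame.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS, delimiter="\t")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

TSV output like this is convenient precisely because later comparison runs can parse it row by row.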

Generate and Compare

When this process is selected, the Dataset Results (1) field becomes available. Here you must select the results file of a previously generated dataset.

After clicking OK, several new directories are created within a short time, along with the comparison findings. A new dataset appears in the Generation Results Folder, and an additional directory appears in your Output Folder.

The new "Compare Results" directory is created within the Dataset Comparison Results folder and holds all the comparison data.

At the top of the Compare Results table is a header identifying the two datasets being compared.

The table compares the hash values of the frames and gives a verdict of PASSED or FAIL for each comparison. It also compares the time taken to process each image, with faster processing shown in green and slower processing in red. This can help identify hardware configuration problems, for instance if one computer takes significantly longer than others to process its chains.
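The core of the hash comparison can be sketched very simply (the key and field names here are invented for illustration; the real result files may be structured differently):

```python
def compare_results(set1: dict[str, str], set2: dict[str, str]) -> dict[str, str]:
    # Map each chain/frame key to a PASSED/FAIL verdict by comparing
    # the MD5 frame hashes recorded in the two datasets.
    verdicts = {}
    for key, hash1 in set1.items():
        hash2 = set2.get(key)
        verdicts[key] = "PASSED" if hash1 == hash2 else "FAIL"
    return verdicts
```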

After the PASSED column comes the Sum of Absolute Differences (SAD). In the example above, all values are zero because the resulting images are identical. In a FAIL scenario, this value indicates how different the images are.

A very low value may indicate a change in how the filter operates, while a high value suggests the filter has failed outright; for example, if Correct Perspective were tested and the quadrilateral points were computed incorrectly or not entered at all.
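The Sum of Absolute Differences itself is a standard measure. A minimal version, treating two equal-sized frames as flat byte sequences, looks like this:

```python
def sad(frame_a: bytes, frame_b: bytes) -> int:
    # Sum of Absolute Differences: 0 means identical pixels; a small
    # value suggests a subtle change in a filter's behaviour, and a
    # large value suggests a broken or misconfigured filter.
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have the same size")
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b))
```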

Compare

If the Compare process is chosen, you only need to provide two previously created dataset results, and a straightforward comparison is performed.

When a new version of Amped FIVE is released, generating and comparing results between the previous and latest versions is now significantly faster.

If personnel use different workstations or versions of Windows, it is now easy to identify discrepancies. Even on a single workstation with two graphics processing options (CPU and GPU), you can quickly confirm that both produce identical results.

The Validation Tool has been designed with flexibility in mind, to adapt to every unit's working practices, and so that Standard Operating Procedures (SOPs) for its use can be written easily. Just as the Assistant Tool guides users through processing SOPs, the Validation Tool is there to assist you with your software validation processes.

Don’t Delay – Update Today

If you have an active support plan, you can update immediately by selecting "Check for Updates" in the About menu within Amped FIVE. If your SMS plan needs renewing, get in touch with Amped Software. Remember that you can always manage your license and requests via the Amped Support Portal.
