Although the Daubert Standard is not directly connected to forensic examination, it provides a sound guideline for the admissibility of validation reports in courts of law (cited in Brunty 1). To determine the validity of the findings in question, it is vital to establish from the police investigator the exact procedure followed as well as all the forensic tools used in examining the computer data. In the digital evidence discipline, it is fundamental that the examination of procedures include a complete understanding of the methodology used, so as to determine its specificity, limitations, and margins of error.
The validation of forensic tools is conducted to verify that they did not alter, add, or delete any of the original data (Barbara 4). Computer forensics involves at least four basic processes: identification, preservation, analysis, and presentation. Since identification and presentation depend on the skills of the investigator, they cannot be subjected to the validation process of the computer laboratory. Preservation and analysis, on the other hand, are tool-based and can therefore be verified and validated.
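As a hedged illustration of this integrity check, the verification step is commonly implemented by hashing the evidence image before and after examination. The file name and the choice of SHA-256 below are assumptions for the sketch, not a prescribed procedure:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 65536) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hash the media before and after the examination; matching digests
# indicate the tool did not alter, add, or delete any data.
before = sha256_of("evidence_image.dd")
after = sha256_of("evidence_image.dd")  # re-hashed after the examination
assert before == after, "image changed during examination"
```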
Preservation entails forensic copying, verification, write protection, and media sanitation, while analysis involves the processes of searching, file rendering, data recovery, decryption, file identification, processing, temporal data, and process automation (Guo et al. 2009, pp. S15-S16). Validation and verification of electronic evidence require a mathematical technique, primarily to determine the error ratio. One way of doing this is to split sample data into subsets, called model subsets, and compare them to the remaining subsets.
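A minimal sketch of that subset comparison follows. The record list and the equality-based comparison are illustrative assumptions only, since the source does not prescribe a specific implementation:

```python
from typing import List, Sequence

def split_subsets(records: Sequence, k: int) -> List[list]:
    """Partition the sample into k roughly equal subsets."""
    return [list(records[i::k]) for i in range(k)]

def subset_error_ratio(model: list, remainder: list) -> float:
    """Fraction of remaining records not reproduced by the model subset."""
    mismatches = sum(1 for rec in remainder if rec not in model)
    return mismatches / len(remainder) if remainder else 0.0

# Hypothetical sample records; each subset takes a turn as the model.
sample = ["rec-a", "rec-b", "rec-a", "rec-c", "rec-b", "rec-a"]
subsets = split_subsets(sample, 3)
for i, model in enumerate(subsets):
    remainder = [r for j, s in enumerate(subsets) if j != i for r in s]
    print(f"model subset {i}: error ratio {subset_error_ratio(model, remainder):.2f}")
```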
Next, the file system should be examined, bearing in mind that file systems have two layers: abstract and low-level. If an examination focused only on the abstract level, low-level evidence has most likely been overlooked. To determine whether this is the case, the file system image should be examined and layer 1 obtained. From layer 1, layer 2 should be inferred; the inferred layer should then be compared to the actual layer 2 and any discrepancies noted. All other file systems are then examined similarly and the average error ratio obtained.
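The comparison just described might be sketched as below. Here infer_low_level is a placeholder for the file system's own mapping rules, and the layers are represented as simple lists so the sketch runs; none of this reflects a specific forensic tool's API:

```python
def infer_low_level(abstract_layer: list) -> list:
    """Predict the low-level layer implied by the abstract layer.
    A real implementation applies the file system's mapping rules;
    this placeholder assumes a one-to-one correspondence."""
    return list(abstract_layer)

def layer_error_ratio(abstract_layer: list, actual_low_level: list) -> float:
    """Fraction of low-level entries that disagree with the inferred layer."""
    predicted = infer_low_level(abstract_layer)
    mismatches = sum(1 for p, a in zip(predicted, actual_low_level) if p != a)
    mismatches += abs(len(predicted) - len(actual_low_level))  # unmatched tails
    return mismatches / max(len(predicted), len(actual_low_level), 1)

# Example: one discrepancy between the inferred and the actual layer.
print(layer_error_ratio(["entry1", "entry2", "entry3"],
                        ["entry1", "entryX", "entry3"]))  # ~0.33
```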
To continue the validation, a mathematical formula must be adopted for each file system, consisting of a weighted summation over every layer. The following is an example for FAT12:

Er = 0.05(L1) + 0.1(L2) + 0.05(L3) + 0.1(L4) + 0.2(L5) + 0.2(L6) + 0.3(L7)

A FAT12 file system, commonly found on floppy disks and other small media, has seven layers: boot sector values; FAT and data areas; FAT entries; clusters; formatted cluster content; linked list of clusters; and all directory entries. For each of these layers, the abstract and lower-level representations must be examined, a forecast lower-level layer developed with the help of the abstract one, the predicted lower level compared with the original lower-level layer, and the resulting error ratio stored.
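Transcribed directly, the weighted summation for FAT12 can be computed as below; the sample per-layer ratios in the usage line are illustrative only:

```python
FAT12_WEIGHTS = [0.05, 0.10, 0.05, 0.10, 0.20, 0.20, 0.30]  # layers L1..L7

def weighted_error_ratio(layer_ratios: list, weights=FAT12_WEIGHTS) -> float:
    """Er = sum of (weight_i * error ratio of layer i)."""
    if len(layer_ratios) != len(weights):
        raise ValueError("one error ratio per layer is required")
    return sum(w * r for w, r in zip(weights, layer_ratios))

# Example: agreement everywhere except a 2% discrepancy in the clusters layer (L4).
print(weighted_error_ratio([0.0, 0.0, 0.0, 0.02, 0.0, 0.0, 0.0]))  # 0.002
```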
Finally, all stored layer ratios must be placed into a weighted formula similar to the one above, and an average error ratio for the file system obtained. This examination establishes the extent of the discrepancies in the previous examination (Sremack 2003).

Case Project 14-3

Several characteristics distinguish a technical/formal paper from other types.