1- When data or information is retrieved from other resources (secondary data), it must be analyzed in depth and thoroughly. The origins of the information must be traced and validated to maintain its data integrity. Data integrity refers to the reliability and trustworthiness of data throughout its life cycle. It is imperative to protect important information by measuring its level of accuracy and completeness.
To ensure data integrity, researchers and investigators must perform risk-based validation of the data by selecting appropriate steps to ensure accuracy. Poor or incomplete data collection may defeat the purpose of the intended goal. Incomplete data collection may lead to loss of funds and wasted media dollars, as well as faulty conclusions that hinder critical decision making on a project (El Ghazouani et al., 2020).
El Ghazouani, M., El Kiram, M. A., Er-Rajy, L., & El Khanboubi, Y. (2020). Efficient Method Based on Blockchain Ensuring Data Integrity Auditing with Deduplication in Cloud. International Journal of Interactive Multimedia & Artificial Intelligence, 6(3), 32–38. https://doi-org.lopes.idm.oclc.org/10.9781/ijimai….
2- Data integrity can be defined as the accuracy and consistency of data over time. It helps keep important information measured by its reliability. When data is retrieved from sources, or secondary data, it needs to be analyzed to make sure the information is accurate and complete. This is where data integrity comes into play. In order to assure the truthfulness of the data, it needs to go through this process. If the data turns out to contain flawed information, it becomes invalid data. One way to help ensure that research data remains valid is deep saturation of the research. For example, if responses become more consistent across larger numbers of samples, the data becomes more reliable (Li, 2016). Presenting invalid data can cause problems with the funding that goes into the research, and false claims may be reported, resulting in consequences.
Li, Y. (2016, November 28). How to Determine the Validity and Reliability of an Instrument. Retrieved October 16, 2020, from https://blogs.miamioh.edu/discovery-center/2016/11…
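The claim above, that responses become more reliable as the sample grows, can be illustrated with a small simulation. This is only an illustrative sketch (the 1-5 response scale and sample sizes are made up, not drawn from Li's article): repeated sample means cluster more tightly as the sample size increases, which is one way consistency across larger samples shows up.

```python
import random
import statistics

random.seed(42)

def sample_mean_spread(sample_size, trials=200):
    """Draw repeated samples of survey-style responses (1-5 scale)
    and measure how much the sample means vary across trials."""
    means = []
    for _ in range(trials):
        responses = [random.randint(1, 5) for _ in range(sample_size)]
        means.append(statistics.mean(responses))
    return statistics.stdev(means)

# Larger samples -> sample means cluster more tightly, i.e.,
# the measurement is more consistent (reliable) across samples.
small = sample_mean_spread(10)
large = sample_mean_spread(1000)
print(small > large)  # the spread shrinks as sample size grows
```

The spread of the sample mean falls roughly in proportion to the square root of the sample size, which is why saturation across many responses stabilizes the result.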
3- There are many federal agencies that publish statistics on criminal elements and where crime has occurred, including the UCR, NIBRS, and USDOT. These agencies hold crucial information regarding the type of crime, where it happened, and to whom. Knowing such information can help law enforcement and other specialized agencies tackle criminal activity within designated geographical areas. Many of these agencies contribute their information to prevention programs aimed at reducing criminal elements and activities. However, the data gathered must be validated by cross-referencing all information collected. Validity must be considered for many reasons, one being to ensure that measurement techniques are applied consistently, so that quality, consistent data is brought to the table. All of these steps are paramount because of factors such as crime control policies and policy change (Hu et al., 2020).
Hu, C., Barnes, B. B., Feng, L., Wang, M., & Jiang, L. (2020). On the Interplay Between Ocean Color Data Quality and Data Quantity: Impacts of Quality Control Flags. IEEE Geoscience and Remote Sensing Letters, 17(5), 745–749. https://doi-org.lopes.idm.oclc.org/10.1109/LGRS.20…
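The cross-referencing step described above can be sketched in code. This is a hypothetical example, not a real UCR or NIBRS schema: the record IDs, field names, and values are invented purely to show how two sources can be compared to flag missing and conflicting records before the data is trusted.

```python
# Hypothetical incident records from two sources (IDs and fields
# are illustrative only, not real agency data formats).
source_a = {
    "2020-001": {"offense": "burglary", "city": "Phoenix"},
    "2020-002": {"offense": "robbery", "city": "Mesa"},
    "2020-003": {"offense": "assault", "city": "Tempe"},
}
source_b = {
    "2020-001": {"offense": "burglary", "city": "Phoenix"},
    "2020-002": {"offense": "robbery", "city": "Tempe"},  # conflicting city
    "2020-004": {"offense": "arson", "city": "Mesa"},
}

def cross_reference(a, b):
    """Flag records missing from either source, and records that
    exist in both sources but whose fields disagree."""
    missing_in_b = sorted(set(a) - set(b))
    missing_in_a = sorted(set(b) - set(a))
    mismatched = sorted(rid for rid in set(a) & set(b) if a[rid] != b[rid])
    return missing_in_b, missing_in_a, mismatched

missing_b, missing_a, conflicts = cross_reference(source_a, source_b)
print(missing_b)  # ['2020-003']
print(missing_a)  # ['2020-004']
print(conflicts)  # ['2020-002']
```

Only records that match across both sources would be treated as validated; flagged records would go back for manual review, which is the spirit of the cross-referencing step described in the post.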