Digital Investigation
  [SJR: 0.674]   [H-I: 32]
   Full-text available via subscription
   ISSN (Print): 1742-2876
   Published by Elsevier
  • Contents List
    • Abstract: Publication date: September 2017
      Source:Digital Investigation, Volume 22


      PubDate: 2017-09-18T21:32:01Z
       
  • The value of forensic preparedness and digital-identification expertise in
           smart society
    • Abstract: Publication date: September 2017
      Source:Digital Investigation, Volume 22
      Author(s): Eoghan Casey


      PubDate: 2017-09-18T21:32:01Z
       
  • Study on the tracking revision history of MS Word files for forensic
           investigation
    • Abstract: Publication date: Available online 8 September 2017
      Source:Digital Investigation
      Author(s): Doowon Jeong, Sangjin Lee
      Document forensics remains an important field of digital forensics. To date, existing methods have focused on the last saved version of the document file stored on the PC; the drawback of this approach is that it provides no indication as to how the contents have been modified. This paper provides a novel method for document forensics based on tracking the revision history of a Microsoft Word file. The proposed method concentrates on the TMP file created when the author saves the file and the ASD file created periodically by Microsoft Word during editing. A process whereby revision history lists are generated from the metadata of the Word, TMP, and ASD files is presented. Furthermore, we describe a technique developed to link the revision history lists based on similarity. These outcomes can provide considerable assistance to a forensic investigator trying to establish the extent to which document file contents have been changed, and when the file was created, modified, deleted, and copied.

      PubDate: 2017-09-12T21:13:51Z
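The linking step described above lends itself to a short sketch: revision-history lists recovered from the Word, TMP, and ASD files are greedily grouped by the Jaccard overlap of their revision identifiers. This is an illustrative stand-in for the paper's similarity measure; the identifiers, threshold, and greedy linking rule are all assumptions.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of revision identifiers."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def link_revision_lists(lists, threshold=0.5):
    """Greedily link revision-history lists whose element overlap with
    the last list in a group meets the threshold (illustrative stand-in
    for the paper's similarity-based linking)."""
    groups = []
    for lst in lists:
        for g in groups:
            if jaccard(lst, g[-1]) >= threshold:
                g.append(lst)
                break
        else:
            groups.append([lst])
    return groups

# Example: revision lists recovered from a .doc, a TMP file, and an ASD file
doc = ["rev1", "rev2", "rev3"]
tmp = ["rev2", "rev3", "rev4"]
asd = ["rev9"]
groups = link_revision_lists([doc, tmp, asd])
```

Here the .doc and TMP lists share enough revisions to be linked into one history, while the unrelated ASD list forms its own group.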
       
  • A novel file carving algorithm for National Marine Electronics Association
           (NMEA) logs in GPS forensics
    • Abstract: Publication date: Available online 8 September 2017
      Source:Digital Investigation
      Author(s): Kai Shi, Ming Xu, Haoxia Jin, Tong Qiao, Xue Yang, Ning Zheng, Jian Xu, Kim-Kwang Raymond Choo
      Global Positioning System (GPS) devices are an increasingly important source of evidence, as more of our devices have built-in GPS capabilities. In this paper, we propose a novel framework to efficiently recover National Marine Electronics Association (NMEA) logs and reconstruct GPS trajectories. Unlike existing approaches that require file system metadata, our proposed algorithm is based on the file carving technique and does not rely on system metadata. By understanding the characteristics and intrinsic structure of trajectory data in NMEA logs, we demonstrate how to pinpoint all data blocks belonging to the NMEA logs in the acquired forensic image of a GPS device. A discriminator is then presented to determine whether two data blocks can be merged, and based on this discriminator, we design a reassembly algorithm to re-order and merge the obtained data blocks into new logs. In this context, deleted trajectories can be reconstructed by analyzing the recovered logs. Empirical experiments demonstrate that our proposed algorithm performs well whether or not system metadata is available, when log files are heavily fragmented, when one or more parts of the log files are overwritten, and for different file systems with variable cluster sizes.

      PubDate: 2017-09-12T21:13:51Z
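The carving idea relies on NMEA 0183 sentences being self-validating: each sentence runs from `$` to `*HH`, where `HH` is the XOR of all bytes in between. A minimal sketch of pinpointing valid sentences inside a raw data block follows; the `GP` talker prefix and the regex are simplifications, not the authors' implementation.

```python
import re

# Matches sentences like $GPRMC,...*47 (GPS talker, 3-letter sentence type)
NMEA_RE = re.compile(rb'\$GP[A-Z]{3},[^*\r\n]*\*[0-9A-F]{2}')

def nmea_checksum_ok(sentence: bytes) -> bool:
    """Verify the XOR checksum of an NMEA 0183 sentence."""
    body, _, given = sentence[1:].partition(b'*')  # strip '$', split at '*'
    calc = 0
    for byte in body:
        calc ^= byte
    return calc == int(given[:2], 16)

def carve_nmea(block: bytes):
    """Return checksum-valid NMEA sentences found in a raw data block."""
    return [m.group(0) for m in NMEA_RE.finditer(block)
            if nmea_checksum_ok(m.group(0))]
```

Because the checksum rejects corrupted hits, this test can identify NMEA data blocks even without any file system metadata.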
       
  • Decoding the APFS file system
    • Abstract: Publication date: Available online 6 September 2017
      Source:Digital Investigation
      Author(s): Kurt H. Hansen, Fergus Toolan
      File systems have always played a vital role in digital forensics, and during the past 30–40 years many have been developed to suit different needs. Some file systems are tightly connected to a specific Operating System (OS); for instance, HFS and HFS+ have been the file systems of choice in Apple devices for over 30 years. Much has happened in the evolution of storage technologies: the capacity and speed of devices have increased and Solid State Drives (SSD) are replacing traditional drives. All of these present challenges for file systems. APFS is a file system developed from first principles and will, in 2017, become the new file system for Apple devices. To date, there is no publicly available technical information about APFS, and this is the motivation for this article.

      PubDate: 2017-09-06T20:59:09Z
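Since the abstract notes that no official documentation existed, any on-disk details come from reverse engineering. The sketch below assumes the commonly reported container-superblock layout (a 32-byte object header of checksum, oid, xid, type and subtype, followed by the 4-byte magic `NXSB`) and merely checks for that magic; treat the offsets as assumptions.

```python
import struct

def looks_like_apfs_container(block0: bytes) -> bool:
    """Heuristic check for an APFS container superblock.
    Layout assumed from the commonly reported on-disk format:
    a 32-byte object header (fletcher checksum, oid, xid, type,
    subtype) followed by the 4-byte magic 'NXSB'."""
    if len(block0) < 36:
        return False
    magic, = struct.unpack_from('<4s', block0, 32)
    return magic == b'NXSB'
```

A carver or triage tool could run this test over candidate block-zero offsets to flag APFS containers before deeper parsing.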
       
  • Clustering image noise patterns by embedding and visualization for common
           source camera detection
    • Abstract: Publication date: Available online 5 September 2017
      Source:Digital Investigation
      Author(s): Sonja Georgievska, Rena Bakhshi, Anand Gavai, Alessio Sclocco, Ben van Werkhoven
      We consider the problem of clustering a large set of images based on similarities of their noise patterns. Such clustering is necessary in forensic cases in which detection of a common source of images is required but the cameras are not physically available. We propose a novel method for clustering that combines low-dimensional embedding, visualization, and classical clustering of the dataset based on the similarity scores. We evaluate our method on the Dresden Image Database, showing that the methodology is highly effective.

      PubDate: 2017-09-06T20:59:09Z
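The final grouping stage can be sketched as single-linkage clustering over a pairwise noise-pattern similarity matrix; the embedding and visualization stages of the paper are omitted here, and the threshold is an arbitrary illustration.

```python
def cluster_by_similarity(S, threshold=0.5):
    """Single-linkage grouping: items i and j end up in the same
    cluster if a chain of pairwise similarities >= threshold connects
    them (union-find; a simple stand-in for the paper's pipeline)."""
    n = len(S)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if S[i][j] >= threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(n)]
```

Images from the same camera, which correlate strongly in their noise residuals, receive the same label; weakly correlated images stay separate.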
       
  • Advancing coordinated cyber-investigations and tool interoperability using
           a community developed specification language
    • Abstract: Publication date: Available online 26 August 2017
      Source:Digital Investigation
      Author(s): Eoghan Casey, Sean Barnum, Ryan Griffith, Jonathan Snyder, Harm van Beek, Alex Nelson
      Any investigation can have a digital dimension, often involving information from multiple data sources, organizations and jurisdictions. Existing approaches to representing and exchanging cyber-investigation information are inadequate, particularly when combining data sources from numerous organizations or dealing with large amounts of data from various tools. To conduct investigations effectively, there is a pressing need to harmonize how this information is represented and exchanged. This paper addresses this need for information exchange and tool interoperability with an open community-developed specification language called Cyber-investigation Analysis Standard Expression (CASE). To further promote a common structure, CASE aligns with and extends the Unified Cyber Ontology (UCO) construct, which provides a format for representing information in all cyber domains. This ontology abstracts objects and concepts that are not CASE-specific, so that they can be used across other cyber disciplines that may extend UCO. This work is a rational evolution of the Digital Forensic Analysis eXpression (DFAX) for representing digital forensic information and provenance. CASE is more flexible than DFAX and can be utilized in any context, including criminal, corporate and intelligence. CASE also builds on the Hansken data model developed and implemented by the Netherlands Forensic Institute (NFI). CASE enables the fusion of information from different organizations, data sources, and forensic tools to foster more comprehensive and cohesive analysis. This paper includes illustrative examples of how CASE can be implemented and used to capture information in a structured form to advance sharing, interoperability and analysis in cyber-investigations. In addition to capturing technical details and relationships between objects, CASE provides structure for representing and sharing details about how cyber-information was handled, transferred, processed, analyzed, and interpreted. 
CASE also supports data marking for sharing information at different levels of trust and classification, and for protecting sensitive and private information. Furthermore, CASE supports the sharing of knowledge related to cyber-investigations, including distinctive patterns of activity/behavior that are common across cases. This paper features a proof-of-concept Application Program Interface (API) to facilitate implementation of CASE in tools. Community members are encouraged to participate in the development and implementation of CASE and UCO.

      PubDate: 2017-08-31T19:43:34Z
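CASE information is exchanged as structured, linked data (JSON-LD). A toy bundle is sketched below; the property names and context IRI are simplified illustrations, not the authoritative CASE vocabulary, and the identifiers are invented for the example.

```python
import json

# Illustrative CASE-style JSON-LD bundle. Property names are
# simplified for illustration; consult the CASE specification for
# the authoritative vocabulary.
trace = {
    "@context": {"case": "https://example.org/case#"},  # IRI illustrative
    "@id": "bundle-001",
    "@type": "case:Bundle",
    "object": [
        {
            "@id": "email-account-1",
            "@type": "case:Trace",
            "propertyBundle": [{
                "@type": "case:EmailAccount",
                "emailAddress": "alice@example.com"
            }]
        }
    ]
}

serialized = json.dumps(trace, indent=2)
```

The point of the structure is that any CASE-aware tool can ingest the serialized bundle, resolve the `@id` references, and fuse it with traces produced by other tools.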
       
  • Enhancing security incident response follow-up efforts with lightweight
           agile retrospectives
    • Abstract: Publication date: Available online 26 August 2017
      Source:Digital Investigation
      Author(s): George Grispos, William Bradley Glisson, Tim Storer
      Security incidents detected by organizations are escalating in both scale and complexity. As a result, security incident response has become a critical mechanism for organizations seeking to minimize the damage from security incidents. The final phase within many security incident response approaches is the feedback/follow-up phase. It is within this phase that an organization is expected to use the information collected during an investigation to learn from an incident, improve its security incident response process and positively impact the wider security environment. However, recent research and security incident reports argue that organizations find it difficult to learn from incidents. A contributing factor to this learning deficiency is that industry-focused security incident response approaches typically provide very little practical information about tools or techniques that can be used to extract lessons learned from an investigation. As a result, organizations focus on improving technical security controls rather than examining or reassessing the effectiveness or efficiency of internal policies and procedures. An additional hindrance to encouraging improvement assessments is the absence of tools and/or techniques that organizations can implement to evaluate the impact of implemented enhancements in the wider organization. Hence, this research investigates the integration of lightweight agile retrospectives and meta-retrospectives into a security incident response process to enhance feedback and/or follow-up efforts. The research contribution of this paper is twofold. First, it presents an approach based on lightweight retrospectives as a means of enhancing security incident response follow-up efforts. Second, it presents an empirical evaluation of this lightweight approach in a Fortune 500 financial organization's security incident response team.

      PubDate: 2017-08-31T19:43:34Z
       
  • Decision-theoretic file carving
    • Abstract: Publication date: Available online 25 August 2017
      Source:Digital Investigation
      Author(s): Pavel Gladyshev, Joshua I. James
      This article explores a novel approach to file carving by viewing it as a decision problem. This allows us to design algorithms that produce best-effort results under given resource constraints. Resource-constrained carving is important for digital forensic triage, as well as for e-discovery, where a reduction in carving time may be preferred to completeness. In this work we give a formal definition of decision-theoretic file carving. As an illustration, we developed a JPEG file carving tool using the described decision-theoretic algorithm. We then examine the results of decision-theoretic file carving compared with linear carving methods to demonstrate when decision-theoretic carving is most useful.

      PubDate: 2017-08-31T19:43:34Z
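The decision-theoretic idea can be sketched as a per-cluster expected-utility test: keep carving while the probability that the next cluster continues the file, weighted by the value of the data, exceeds the processing cost. The utilities and the classifier below are toy assumptions, not the paper's formal model.

```python
def should_continue(p_match, value_of_data=1.0, cost_per_cluster=0.2):
    """Decision rule: carve the next cluster only if its expected
    value exceeds the processing cost (toy utilities; the paper
    develops the formal decision-theoretic treatment)."""
    return p_match * value_of_data > cost_per_cluster

def carve(clusters, classifier):
    """Best-effort carve: follow a candidate file across clusters
    until the decision rule says stopping is cheaper than continuing."""
    recovered = []
    for c in clusters:
        p = classifier(c)          # P(cluster continues the file)
        if not should_continue(p):
            break
        recovered.append(c)
    return recovered
```

Raising `cost_per_cluster` models a triage setting where time is scarce: the carver gives up earlier, trading completeness for speed, which is exactly the resource-constrained behavior the paper targets.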
       
  • Registration Data Access Protocol (RDAP) for digital forensic
           investigators
    • Abstract: Publication date: Available online 24 August 2017
      Source:Digital Investigation
      Author(s): Bruce Nikkel
      This paper describes the Registration Data Access Protocol (RDAP) with a focus on relevance to digital forensic investigators. RDAP was developed as the successor to the aging WHOIS system and is intended to eventually replace WHOIS as the authoritative source for registration information on IP addresses, Domain Names, Autonomous Systems, and more. RDAP uses a RESTful interface over HTTP and introduces a number of new features related to security, internationalization, and standardized query/response definitions. It is important for digital forensic investigators to become familiar with RDAP as it will play an increasingly important role in Internet investigations requiring the search and collection of registration data as evidence.

      PubDate: 2017-08-31T19:43:34Z
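For investigators, the practical interface is simple: RDAP queries are HTTP GETs with object-type path segments (`/ip/`, `/domain/`, `/autnum/`), and responses are JSON. The sketch below builds query URLs against the public rdap.org redirector and parses a truncated, illustrative response; the handle and addresses use documentation-reserved values, not real registration data.

```python
import json

RDAP_BASE = "https://rdap.org"  # public bootstrap redirector

def rdap_url(object_type, identifier):
    """Build an RDAP query URL from an object type and identifier."""
    if object_type not in ("ip", "domain", "autnum"):
        raise ValueError("unsupported RDAP object type")
    return f"{RDAP_BASE}/{object_type}/{identifier}"

# A truncated, illustrative RDAP JSON response for an IP network
# (values are documentation-reserved, not real registration data).
sample = '''{
  "objectClassName": "ip network",
  "handle": "NET-192-0-2-0-EXAMPLE",
  "startAddress": "192.0.2.0",
  "endAddress": "192.0.2.255"
}'''
record = json.loads(sample)
```

In a live investigation the URL would be fetched over HTTPS and the JSON preserved as evidence alongside the query time and responding server.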
       
  • A local variance based approach to alleviate the scene content
           interference for source camera identification
    • Abstract: Publication date: Available online 23 August 2017
      Source:Digital Investigation
      Author(s): Chao Shi, Ngai-Fong Law, Frank H.F. Leung, Wan-Chi Siu
      Identifying the source camera of images is becoming increasingly important nowadays. A popular approach is to use a type of pattern noise called photo-response non-uniformity (PRNU): the noise of an image contains patterns that can be used as a fingerprint. However, the PRNU-based approach is sensitive to scene content and image intensity; identification is poor in areas having low or saturated intensity, or in areas with complicated texture. The reliability of different regions is difficult to model in that it depends on the interaction of the scene content and the characteristics of the denoising filter used to extract the noise. In this paper, we show that the local variance of the noise residual can measure the reliability of a pixel for PRNU-based source camera identification. Hence, we propose to use local variance to characterize the severity of scene content artifacts. The local variance is then incorporated into the general matched filter and the peak-to-correlation-energy (PCE) detector to provide an optimal framework for signal detection. The proposed method is tested against several state-of-the-art methods, and the experimental results show that the local variance based approach outperforms them in terms of identification accuracy.

      PubDate: 2017-08-31T19:43:34Z
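The reliability measure itself is easy to illustrate: compute the variance of the noise residual in a small window around each pixel, so that textured or saturated regions (high local variance) can be down-weighted in the matched filter. A pure-Python sketch, with the window size as an assumption:

```python
def local_variance(img, w=1):
    """Per-pixel local variance of a 2-D noise residual over a
    (2w+1)x(2w+1) window, clipped at the borders. High local
    variance marks pixels where scene content interferes with
    the PRNU signal."""
    h, wdt = len(img), len(img[0])
    out = [[0.0] * wdt for _ in range(h)]
    for i in range(h):
        for j in range(wdt):
            vals = [img[x][y]
                    for x in range(max(0, i - w), min(h, i + w + 1))
                    for y in range(max(0, j - w), min(wdt, j + w + 1))]
            m = sum(vals) / len(vals)
            out[i][j] = sum((v - m) ** 2 for v in vals) / len(vals)
    return out
```

Flat regions of the residual yield zero variance (reliable pixels), while windows straddling an edge yield large values (pixels to down-weight).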
       
  • Forensic analysis of Telegram Messenger for Windows Phone
    • Abstract: Publication date: Available online 18 August 2017
      Source:Digital Investigation
      Author(s): J. Gregorio, A. Gardel, B. Alarcos
      This article presents a forensic analysis methodology for obtaining the digital evidence generated by one of today's many instant messaging applications, namely “Telegram Messenger” for “Windows Phone”, paying particular attention to the digital forensic artifacts produced. The paper provides an overview of this forensic analysis, focusing particularly on how the information is structured and how the user, chat and conversation data generated by the application are organised, with the goal of extracting related data. The application has several features (e.g. games, bots, stickers) beyond those of a typical instant messaging application (e.g. messages, images, videos, files). It is therefore necessary to decode and interpret the information, which may relate to criminal offences, and to establish the relationships between the different types of user, chat and conversation data.

      PubDate: 2017-08-31T19:43:34Z
       
  • Future challenges for smart cities: Cyber-security and digital forensics
    • Abstract: Publication date: Available online 16 August 2017
      Source:Digital Investigation
      Author(s): Zubair A. Baig, Patryk Szewczyk, Craig Valli, Priya Rabadia, Peter Hannay, Maxim Chernyshev, Mike Johnstone, Paresh Kerai, Ahmed Ibrahim, Krishnun Sansurooah, Naeem Syed, Matthew Peacock
      Smart cities are comprised of diverse and interconnected components constantly exchanging data and facilitating improved living for a nation's population. Our view of a typical smart city consists of four key components, namely Smart Grids, Building Automation Systems (BAS), Unmanned Aerial Vehicles (UAVs) and Smart Vehicles, together with enabling Internet of Things (IoT) sensors and the Cloud platform. The adversarial threats and criminal misuses in a smart city are increasingly heterogeneous and significant, making the provisioning of resilient and end-to-end security a daunting task. When a cyber incident involving critical components of the smart city infrastructure occurs, appropriate measures can be taken to identify and enumerate concrete evidence to facilitate the forensic investigation process. Forensic preparedness and lessons learned from past forensic analysis can help protect the smart city against future incidents. This paper presents a holistic view of the security landscape of a smart city, identifying security threats and providing deep insight into digital investigation in the context of the smart city.

      PubDate: 2017-08-31T19:43:34Z
       
  • Contents List
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement


      PubDate: 2017-08-31T19:43:34Z
       
  • Seventeenth Annual DFRWS Conference
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement


      PubDate: 2017-08-31T19:43:34Z
       
  • Digital forensic approaches for Amazon Alexa ecosystem
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Hyunji Chung, Jungheum Park, Sangjin Lee
      Internet of Things (IoT) devices such as the Amazon Echo – a smart speaker developed by Amazon – are undoubtedly great sources of potential digital evidence due to their ubiquitous use and their always-on mode of operation, effectively constituting the black box of a human's life. The Amazon Echo in particular plays a central role for the cloud-based intelligent virtual assistant (IVA) Alexa, developed by Amazon Lab126. The Alexa-enabled wireless smart speaker is the gateway for all voice commands submitted to Alexa. Moreover, the IVA interacts with a plethora of compatible IoT devices and third-party applications that leverage cloud resources. Understanding the complex cloud ecosystem that allows ubiquitous use of Alexa is paramount to supporting digital investigations when the need arises. This paper discusses methods for digital forensics pertaining to the IVA Alexa's ecosystem. The primary contribution of this paper is a new, efficient approach that combines cloud-native forensics with client-side forensics (forensics for companion devices) to support practical digital investigations. Based on a deep understanding of the targeted ecosystem, we propose a proof-of-concept tool, CIFT, that supports identification, acquisition and analysis of both native artifacts from the cloud and client-centric artifacts from local devices (mobile applications and web browsers).

      PubDate: 2017-08-31T19:43:34Z
       
  • Leveraging the SRTP protocol for over-the-network memory acquisition of a
           GE Fanuc Series 90-30
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): George Denton, Filip Karpisek, Frank Breitinger, Ibrahim Baggili
      Programmable Logic Controllers (PLCs) are common components implemented across many industries such as manufacturing, water management, travel, aerospace and hospitals, to name a few. Given their broad deployment in critical systems, they became, and still are, a common target for cyber attacks, the most prominent being Stuxnet. Often PLCs (especially older ones) are only protected by an outer line of defense (e.g., a firewall), but once an attacker gains access to the system or the network, there might not be any other defense layers. In this scenario, a forensic investigator should not rely on the existing software as it might have been compromised. Therefore, we reverse engineered the GE-SRTP network protocol using a GE Fanuc Series 90-30 PLC and provide two major contributions. First, we describe the Service Request Transport protocol (GE-SRTP), which was invented by General Electric (GE) and is used by many of their Ethernet-connected controllers; to the best of our knowledge, prior to this work no documentation on the protocol was publicly available, affording users only security by obscurity. Second, based on our understanding of the protocol, we implemented a software application that allows direct network-based communication with the PLC (no intermediate server is needed). While the tool's forensic mode is harmless and only allows reading registers, we discovered that in its default configuration one can manipulate/write to the registers, e.g., turn off the PLC, or manipulate the items/processes it controls.

      PubDate: 2017-08-31T19:43:34Z
       
  • SCARF: A container-based approach to cloud-scale digital forensic
           processing
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Christopher Stelly, Vassil Roussev
      The rapid growth of raw data volume requiring forensic processing has become one of the top concerns of forensic analysts. At present, there are no readily available solutions that provide: a) open and flexible integration of existing forensic tools into a processing pipeline; and b) scale-out architecture that is compatible with common cloud technologies. Containers, lightweight OS-level virtualized environments, are quickly becoming the preferred architectural unit for building large-scale data processing systems. We present a container-based software framework, SCARF, which applies this approach to forensic computations. Our prototype demonstrates its practicality by providing low-cost integration of both custom code and a variety of third-party tools via simple data interfaces. The resulting system fits well with the data parallel nature of most forensic tasks, which tend to have few dependencies that limit parallel execution. Our experimental evaluation shows that for several types of processing tasks–such as hashing, indexing and bulk processing–performance scales almost linearly with the addition of hardware resources. We show that the software engineering effort to integrate new tools is quite modest, and all the critical task scheduling and resource allocation are automatically managed by the container orchestration runtime–Docker Swarm, or similar.

      PubDate: 2017-08-31T19:43:34Z
       
  • Insights gained from constructing a large scale dynamic analysis platform
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Cody Miller, Dae Glendowne, Henry Cook, DeMarcus Thomas, Chris Lanclos, Patrick Pape
      As the number of malware samples found increases exponentially each year, there is a need for systems that can dynamically analyze thousands of malware samples per day. These systems should be reliable, scalable, and simple to use by other systems and malware analysts. When handling thousands of samples, reprocessing even a small percentage due to errors can be devastating; a reliable system avoids wasting resources by reducing the number of errors. In this paper, we describe our scalable dynamic analysis platform, perform experiments on the platform, and provide lessons we have learned through the process. The platform uses the Cuckoo sandbox for dynamic analysis and has been improved to process malware as quickly as possible without losing valuable information. Experiments were performed to improve the configuration of the system's components and help improve the accuracy of the dynamic analysis. The lessons learned presented in this paper may aid others in the development of similar dynamic analysis systems.

      PubDate: 2017-08-31T19:43:34Z
       
  • SCADA network forensics of the PCCC protocol
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Saranyan Senthivel, Irfan Ahmed, Vassil Roussev
      Most SCADA devices have few built-in self-defence mechanisms and tend to implicitly trust communications received over the network. Therefore, monitoring and forensic analysis of network traffic is a critical prerequisite for building an effective defense around SCADA units. In this work, we provide a comprehensive forensic analysis of network traffic generated by the PCCC (Programmable Controller Communication Commands) protocol and present a prototype tool capable of extracting both updates to programmable logic and crucial configuration information. The results of our analysis show that more than 30 files, including configuration and data files, are transferred to/from the PLC when downloading/uploading a ladder logic program using the RSLogix programming software. Interestingly, when RSLogix compiles a ladder-logic program, it does not create any low-level representation of a ladder-logic file; however, the low-level ladder logic is present in the network traffic log and can be extracted using our prototype tool. The tool also extracts the SMTP configuration from the network log and parses it to obtain email addresses, usernames and passwords; the network log contains passwords in plain text.

      PubDate: 2017-08-31T19:43:34Z
       
  • Linux memory forensics: Dissecting the user space process heap
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Frank Block, Andreas Dewald
      The analysis of memory during a forensic investigation is often an important step to reconstruct events. While prior work in this field has mostly concentrated on information residing in the kernel space (process lists, network connections, and so on) and in particular on the Microsoft Windows operating system, this work focuses on Linux user space processes as they might also contain valuable information for an investigation. Because a lot of process data is located in the heap, this work in the first place concentrates on the analysis of Glibc's heap implementation and on how and where heap related information is stored in the virtual memory of Linux processes that use this implementation. Up to now, the heap was mostly considered a large cohesive memory region from a memory forensics perspective, making it rather hard manual work to identify relevant information inside. We introduce a Python class for the memory analysis framework Rekall that is based on our analysis results and allows access to all chunks contained in the heap and their meta information. Further, based on this class, six plugins have been developed that support an investigator in analyzing user space processes: Four of these plugins provide generic analysis capabilities such as finding information/references within chunks and dumping chunks into separate files for further investigation. These plugins have been used to reverse engineer data structures within the heap for user space processes, while illustrating how such plugins ease the whole analysis process. The remaining two plugins are a result of these user space process analyses and are extracting the command history for the zsh shell and password entry information for the password manager KeePassX.

      PubDate: 2017-08-31T19:43:34Z
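The chunk-level access the plugins provide can be illustrated with a minimal parser for the 64-bit glibc chunk header: an 8-byte prev_size field followed by an 8-byte size word whose low three bits carry the PREV_INUSE, IS_MMAPPED and NON_MAIN_ARENA flags. This is a simplified sketch of walking a main-arena heap region, not the Rekall plugin code.

```python
import struct

PREV_INUSE, IS_MMAPPED, NON_MAIN_ARENA = 0x1, 0x2, 0x4

def parse_chunk(heap, offset):
    """Parse one glibc malloc chunk header (64-bit layout: 8-byte
    prev_size field, then an 8-byte size word whose low three bits
    are flags)."""
    prev_size, size_field = struct.unpack_from('<QQ', heap, offset)
    return {
        'prev_size': prev_size,
        'size': size_field & ~0x7,
        'prev_inuse': bool(size_field & PREV_INUSE),
        'mmapped': bool(size_field & IS_MMAPPED),
        'non_main_arena': bool(size_field & NON_MAIN_ARENA),
    }

def walk_chunks(heap):
    """Walk consecutive chunks in a contiguous heap region, a
    simplified version of what the Rekall heap plugins do."""
    offset, chunks = 0, []
    while offset + 16 <= len(heap):
        c = parse_chunk(heap, offset)
        if c['size'] == 0:
            break
        chunks.append(c)
        offset += c['size']
    return chunks
```

Once the chunk boundaries are known, their payloads can be carved into separate files or searched for application data such as shell history or password-manager entries, as the plugins described above do.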
       
  • Extending The Sleuth Kit and its underlying model for pooled storage file
           system forensic analysis
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Jan-Niclas Hilgert, Martin Lambertz, Daniel Plohmann
      Carrier's book File System Forensic Analysis is one of the most comprehensive sources when it comes to the forensic analysis of file systems. Published in 2005, it provides details about the most commonly used file systems of that time as well as a process model to analyze file systems in general. The Sleuth Kit is the implementation of Carrier's model and it is still widely used during forensic analyses today—standalone or as a basis for forensic suites such as Autopsy. While The Sleuth Kit is still actively maintained, the model has not seen any updates since then. Moreover, there is no support for modern file systems implementing new paradigms such as pooled storage. In this paper, we present an update to Carrier's model which enables the analysis of pooled storage file systems. To demonstrate that our model is suitable, we implemented it for ZFS—a file system for large scale storage, cloud, and virtualization environments—and show how to perform an analysis of this file system using our model and extended toolkit.

      PubDate: 2017-08-31T19:43:34Z
       
  • Gaslight: A comprehensive fuzzing architecture for memory forensics
           frameworks
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Andrew Case, Arghya Kusum Das, Seung-Jong Park, J. (Ram) Ramanujam, Golden G. Richard
      Memory forensics is now a standard component of digital forensic investigations and incident response handling, since memory forensic techniques are quite effective in uncovering artifacts that might be missed by traditional storage forensics or live analysis techniques. Because of the crucial role that memory forensics plays in investigations and because of the increasing use of automation of memory forensics techniques, it is imperative that these tools be resilient to memory smear and deliberate tampering. Without robust algorithms, malware may go undetected, frameworks may crash when attempting to process memory samples, and automation of memory forensics techniques is difficult. In this paper we present Gaslight, a powerful and flexible fuzz-testing architecture for stress-testing both open and closed-source memory forensics frameworks. Gaslight automatically targets critical code paths that process memory samples and mutates samples in an efficient way to reveal implementation errors. In experiments we conducted against several popular memory forensics frameworks, Gaslight revealed a number of critical previously undiscovered bugs.

      PubDate: 2017-08-31T19:43:34Z
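The core fuzzing loop can be sketched as: mutate a memory sample, feed it to the framework's parsing path, and record crashes instead of letting them abort the harness. Gaslight targets mutations at critical offsets and mutates far more cleverly; the uniform random byte flips and toy parser below are simplifications.

```python
import random

def mutate(sample: bytes, n_flips=8, seed=None):
    """Flip a few random bytes in a memory sample. Gaslight targets
    mutations at critical parsing offsets; uniform random flips are
    shown here for simplicity."""
    rng = random.Random(seed)
    buf = bytearray(sample)
    for _ in range(n_flips):
        buf[rng.randrange(len(buf))] ^= 0xFF
    return bytes(buf)

def fuzz(parse, sample, rounds=100):
    """Run a parser against mutated samples, recording crashes
    instead of letting them abort the harness."""
    crashes = []
    for i in range(rounds):
        mutated = mutate(sample, seed=i)  # seeded: reproducible cases
        try:
            parse(mutated)
        except Exception as exc:
            crashes.append((i, repr(exc)))
    return crashes
```

Seeding each round makes every crash reproducible, which is what turns a fuzzing hit into a debuggable, reportable framework bug.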
       
  • Analyzing user-event data using score-based likelihood ratios with marked
           point processes
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Christopher Galbraith, Padhraic Smyth
      In this paper we investigate the application of score-based likelihood ratio techniques to the problem of detecting whether two time-stamped event streams were generated by the same source or by two different sources. We develop score functions for event data streams by building on ideas from the statistical modeling of marked point processes, focusing in particular on the coefficient of segregation and mingling index. The methodology is applied to a data set consisting of logs of computer activity over a 7-day period from 28 different individuals. Experimental results on known same-source and known different-source data sets indicate that the proposed scores have significant discriminative power in this context. The paper concludes with a discussion of the potential benefits and challenges that may arise from the application of statistical analysis to user-event data in digital forensics.

      PubDate: 2017-08-31T19:43:34Z
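A score-based likelihood ratio compares how probable an observed similarity score is under known same-source pairs versus known different-source pairs. The sketch below fits Gaussians to the two score populations purely for simplicity; the paper's scores are built from marked point process statistics (the coefficient of segregation and the mingling index), not assumed normal.

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit(scores):
    """Sample mean and (unbiased) standard deviation."""
    mu = sum(scores) / len(scores)
    var = sum((s - mu) ** 2 for s in scores) / (len(scores) - 1)
    return mu, math.sqrt(var)

def score_lr(score, same_scores, diff_scores):
    """Score-based likelihood ratio: density of the observed score
    under known same-source scores over its density under known
    different-source scores (Gaussian fits for simplicity)."""
    mu_s, sd_s = fit(same_scores)
    mu_d, sd_d = fit(diff_scores)
    return normal_pdf(score, mu_s, sd_s) / normal_pdf(score, mu_d, sd_d)
```

An LR well above 1 supports the same-source proposition for the two event streams; well below 1 supports different sources.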
       
  • Time-of-recording estimation for audio recordings
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): Lilei Zheng, Ying Zhang, Chien Eao Lee, Vrizlynn L.L. Thing
      This work addresses the problem of ENF pattern matching in the task of time-of-recording estimation. Inspired by the principle of visual comparison, we propose a novel similarity criterion, the bitwise similarity, for measuring the similarity between two ENF signals. A search system is then developed to find the best matches for a given test ENF signal within a large search scope on the reference ENF data. By empirical comparison with other popular similarity criteria, we demonstrate that the proposed method is more effective and efficient than the state of the art. For example, compared with the recent DMA algorithm, our method achieves a relative error rate decrease of 86.86% (from 20.32% to 2.67%) and a 45× faster search response (0.8973 s versus 41.0444 s). Finally, we present a strategy of uniqueness examination to help human examiners ensure high-precision decisions, which makes our method practical for potential forensic use.
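      A rough sketch of bitwise ENF matching, assuming a sign-of-first-difference quantization of the signal (the paper's exact criterion may differ): each signal becomes a bit string, similarity is the fraction of agreeing bits, and the test signal is slid across the reference to find the best-matching offset, i.e. the estimated time of recording.

```python
def to_bits(enf):
    """Quantize an ENF signal to one bit per step: 1 if frequency rose."""
    return [1 if b > a else 0 for a, b in zip(enf, enf[1:])]

def bitwise_similarity(x, y):
    """Fraction of positions where two bit sequences agree."""
    return sum(a == b for a, b in zip(x, y)) / len(x)

def best_match(test_enf, ref_enf):
    """Slide the test signal over the reference; return (offset, score)."""
    t = to_bits(test_enf)
    r = to_bits(ref_enf)
    scores = [(bitwise_similarity(t, r[i:i + len(t)]), i)
              for i in range(len(r) - len(t) + 1)]
    score, offset = max(scores)
    return offset, score
```

      The uniqueness examination the abstract mentions would then check how far the best score stands out from the runners-up before a decision is reported.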

      PubDate: 2017-08-31T19:43:34Z
       
  • Carving database storage to detect and trace security breaches
    • Abstract: Publication date: August 2017
      Source:Digital Investigation, Volume 22, Supplement
      Author(s): James Wagner, Alexander Rasin, Boris Glavic, Karen Heart, Jacob Furst, Lucas Bressan, Jonathan Grier
      Database Management Systems (DBMS) are routinely used to store and process sensitive enterprise data. However, it is not possible to secure data by relying on the access control and security mechanisms (e.g., audit logs) of such systems alone – users may abuse their privileges (no matter whether granted or gained illegally) or circumvent security mechanisms to maliciously alter and access data. Thus, in addition to taking preventive measures, the major goals of database security are to (1) detect breaches and (2) gather evidence about attacks for devising countermeasures. We present an approach that evaluates the integrity of a live database, identifying and reporting evidence of log tampering. Our approach is based on forensic analysis of database storage and detection of inconsistencies between database logs and physical storage state (disk and RAM). We apply our approach to multiple DBMSs to demonstrate its effectiveness in discovering malicious operations and providing detailed information about the data that was illegally accessed or modified.
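      The log-versus-storage cross-check can be illustrated with a toy model: replay the audit log to the state it implies, then flag rows whose carved state disagrees. This is a deliberately simplified sketch — the actual approach carves physical page images from disk and RAM rather than comparing row sets.

```python
def find_inconsistencies(log_ops, carved_rows):
    """Replay an audit log to the database state it implies, then flag
    rows whose carved storage state disagrees with that state."""
    expected = set()
    for op, row in log_ops:
        if op == "INSERT":
            expected.add(row)
        elif op == "DELETE":
            expected.discard(row)
    carved = set(carved_rows)
    return {
        "unlogged": carved - expected,   # present in storage, no log trail
        "vanished": expected - carved,   # logged as live, gone from storage
    }
```

      A non-empty "unlogged" set is evidence that data was written while logging was disabled or that log records were erased.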

      PubDate: 2017-08-31T19:43:34Z
       
  • Prelim i - Editorial Board
    • Abstract: Publication date: June 2017
      Source:Digital Investigation, Volume 21


      PubDate: 2017-06-05T14:57:04Z
       
  • Prelim iii - Contents List
    • Abstract: Publication date: June 2017
      Source:Digital Investigation, Volume 21


      PubDate: 2017-06-05T14:57:04Z
       
  • The broadening horizons of digital investigation
    • Abstract: Publication date: Available online 16 May 2017
      Source:Digital Investigation
      Author(s): Eoghan Casey


      PubDate: 2017-05-21T02:44:35Z
       
  • Comments on “A method and a case study for the selection of the best
           available tool for mobile device forensics using decision analysis”
           [Digit Investig 16S, S55–S64]
    • Abstract: Publication date: Available online 11 May 2017
      Source:Digital Investigation
      Author(s): Shahzad Saleem, Oliver Popov, Ibrahim Baggili


      PubDate: 2017-05-16T02:37:28Z
       
  • A survey of current social network and online communication provision
           policies to support law enforcement identify offenders
    • Abstract: Publication date: Available online 8 May 2017
      Source:Digital Investigation
      Author(s): Graeme Horsman
      Online forms of harassment, stalking and bullying on social network and communication platforms are now arguably widespread and subject to regular media coverage. As these provisions continue to attract millions of users and generate significant volumes of traffic, regulating abuse and effectively reprimanding those involved in it is a difficult and sometimes impossible task. This article collates information acquired from 22 popular social network and communication platforms in order to identify current regulatory gaps. Terms of service and privacy policies are reviewed to assess existing data retention practices and to evaluate the feasibility of law enforcement officials tracking those whose actions breach the law. For each provision, account sign-up processes are evaluated, and policies for retaining Internet Protocol logs and user account information are assessed along with the availability of account preservation orders. Finally, recommendations are offered for improving current approaches to regulating social network crime and online offender tracking.

      PubDate: 2017-05-10T20:37:39Z
       
  • Graph clustering and anomaly detection of access control log for forensic
           purposes
    • Abstract: Publication date: Available online 3 May 2017
      Source:Digital Investigation
      Author(s): Hudan Studiawan, Christian Payne, Ferdous Sohel
      Attacks on operating system access control have become a significant and increasingly common problem. This type of security threat is recorded in forensic artifacts such as authentication logs, which forensic investigators generally examine to analyze such incidents. An anomaly is highly correlated with an attacker's attempts to compromise the system. In this paper, we propose a novel method to automatically detect anomalies in the access control log of an operating system. The logs are first preprocessed and then clustered using an improved MajorClust algorithm to obtain better clusters. This technique provides parameter-free clustering, so it can automatically produce an analysis report for forensic investigators. The clustering results are then checked for anomalies based on a score that considers factors such as the total number of members in a cluster, the frequency of the events in the log file, and the inter-arrival time of a specific activity. We also provide a graph-based visualization of logs to assist investigators with easy analysis. Experimental results compiled on an open dataset of Linux authentication logs show that the proposed method achieved an accuracy of 83.14%.
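      The three scoring factors the abstract lists can be combined as sketched below. The terms, equal weights, and normalizations here are illustrative assumptions, not the paper's actual formula: small clusters of rare, rapidly repeated events score high.

```python
from statistics import mean

def anomaly_score(timestamps, event_frequency, total_events):
    """Toy anomaly score in [0, 1] combining the three factors:
    cluster size, rarity of the event type, and inter-arrival time.
    Weights and normalisation are illustrative only."""
    size_term = 1 - len(timestamps) / total_events      # small cluster -> high
    rarity_term = 1 - event_frequency                   # rare event type -> high
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    burst_term = 1 / (1 + mean(gaps)) if gaps else 1.0  # short gaps -> high
    return (size_term + rarity_term + burst_term) / 3
```

      A small burst of rare failed-login events would thus outscore a large, slow cluster of routine session messages.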

      PubDate: 2017-05-06T15:56:22Z
       
  • Contents List
    • Abstract: Publication date: March 2017
      Source:Digital Investigation, Volume 20


      PubDate: 2017-04-02T10:14:14Z
       
  • Contents List
    • Abstract: Publication date: March 2017
      Source:Digital Investigation, Volume 20, Supplement


      PubDate: 2017-03-25T14:30:16Z
       
  • Improving the reliability of chip-off forensic analysis of NAND flash
           memory devices
    • Abstract: Publication date: March 2017
      Source:Digital Investigation, Volume 20, Supplement
      Author(s): Aya Fukami, Saugata Ghose, Yixin Luo, Yu Cai, Onur Mutlu
      Digital forensic investigators often need to extract data from a seized device that contains NAND flash memory. Many such devices are physically damaged, preventing investigators from using automated techniques to extract the data stored within the device. Instead, investigators turn to chip-off analysis, where they use a thermal-based procedure to physically remove the NAND flash memory chip from the device, and access the chip directly to extract the raw data stored on the chip. We perform an analysis of the errors introduced into multi-level cell (MLC) NAND flash memory chips after the device has been seized. We make two major observations. First, between the time that a device is seized and the time digital forensic investigators perform data extraction, a large number of errors can be introduced as a result of charge leakage from the cells of the NAND flash memory (known as data retention errors). Second, when thermal-based chip removal is performed, the number of errors in the data stored within NAND flash memory can increase by two or more orders of magnitude, as the high temperature applied to the chip greatly accelerates charge leakage. We demonstrate that the chip-off-analysis-based forensic data recovery procedure is quite destructive, and can often render most of the data within NAND flash memory uncorrectable, and, thus, unrecoverable. To mitigate the errors introduced during the forensic recovery process, we explore a new hardware-based approach. We exploit a fine-grained read reference voltage control mechanism implemented in modern NAND flash memory chips, called read-retry, which can compensate for the charge leakage that occurs due to (1) retention loss and (2) thermal-based chip removal. The read-retry mechanism successfully reduces the number of errors, such that the original data can be fully recovered in our tested chips as long as the chips were not heavily used prior to seizure. We conclude that the read-retry mechanism should be adopted as part of the forensic data recovery process.
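      The read-retry mechanism amounts to re-reading a page at successively shifted read-reference voltages until the ECC check passes. The sketch below models that loop; `make_leaky_page` is a toy chip model (charge leakage shifts cell voltages down, so lower reference steps recover more bits), not real NAND behaviour.

```python
def read_with_retry(read_page, ecc_ok, steps):
    """Try each read-retry step (a shifted read-reference voltage) in
    order; return the first page image that passes ECC, else None."""
    for step in steps:
        page = read_page(step)
        if ecc_ok(page):
            return page
    return None

def make_leaky_page(true_page, errors_at_step):
    """Toy model: retention loss and chip-off heating lowered cell
    voltages, so the default reference (step 0) misreads some bits and
    more negative steps recover progressively more of them."""
    def read_page(step):
        n_errors = errors_at_step.get(step, len(true_page))
        return [bit ^ 1 if i < n_errors else bit
                for i, bit in enumerate(true_page)]
    return read_page
```

      In a real chip the ECC check is the controller's error-correction decode; the equality test used in the demonstration below is only a stand-in.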

      PubDate: 2017-03-25T14:30:16Z
       
  • Bit-errors as a source of forensic information in NAND-flash memory
    • Abstract: Publication date: March 2017
      Source:Digital Investigation, Volume 20, Supplement
      Author(s): Jan Peter van Zandwijk
      The value of bit-errors as a source of forensic information is investigated by experiments on isolated NAND-flash chips and USB thumb-drives. Experiments on isolated NAND-flash chips, programmed directly using specialized equipment, show detectable differences in retention bit-errors over forensically relevant time periods with the device used within manufacturer specifications. In experiments with USB thumb-drives, the controller is used to load files at different times onto the drives, some of which have been subjected to stress-cycling. Retention bit-error statistics of memory pages obtained by offline analysis of NAND-flash chips from the thumb-drives are to some extent linked to the times at which files were loaded onto the drives. Considerable variation between USB thumb-drives makes interpretation of bit-error statistics in an absolute sense difficult, although in a relative sense bit-error statistics seem to have some potential as an independent side-channel of forensic information.

      PubDate: 2017-03-25T14:30:16Z
       
  • Advances in volatile memory forensics
    • Abstract: Publication date: Available online 10 March 2017
      Source:Digital Investigation
      Author(s): Bradley Schatz, Michael Cohen


      PubDate: 2017-03-11T00:58:17Z
       
  • Corrigendum to ‘OBA2: An Onion approach to Binary code Authorship
           Attribution’ [Digit Investig 11 (2014) S94–S103]
    • Abstract: Publication date: Available online 23 February 2017
      Source:Digital Investigation
      Author(s): Saed Alrabaee, Noman Saleem, Stere Preda, Lingyu Wang, Mourad Debbabi


      PubDate: 2017-02-25T14:52:53Z
       
  • Detection of upscale-crop and splicing for digital video authentication
    • Abstract: Publication date: Available online 16 January 2017
      Source:Digital Investigation
      Author(s): Raahat Devender Singh, Naveen Aggarwal
      Our enduring preoccupation with multimedia technology has made us a civilization replete with an astonishing miscellany of digital audio-visual information. Not long ago, this digital information (images and videos especially) enjoyed the unique status of 'definitive proof of occurrence of events'. However, given its susceptibility to malicious modification, this status is rapidly depreciating. In sensitive areas like intelligence and surveillance, reliance on manipulated visual data could be detrimental. The disparity between the ever-growing importance of digital content and suspicions regarding its vulnerability to alteration has made it necessary to determine whether or not the contents of a given digital image or video can be considered trustworthy. Digital videos are prone to several kinds of tamper attacks, but on a broad scale these can be categorized as either inter-frame forgeries, where the arrangement of frames in a video is manipulated, or intra-frame forgeries, where the content of the individual frames is manipulated. Intra-frame forgeries are simply digital image forgeries performed on the individual frames of the video. Upscale-crop and splicing are two intra-frame forgeries, both of which are performed via an image processing operation known as resampling. While the challenge of resampling detection in digital images has been the subject of much innovation over the past two decades, detection of resampling in digital videos has received little attention. With the intent of ameliorating this situation, in this paper we propose a forensic system capable of validating the authenticity of digital videos by establishing whether any of their frames or regions of frames have undergone post-production resampling. The system integrates the outcomes of pixel-correlation inspection and noise-inconsistency analysis; the operation of the system as a whole overcomes the limitations usually faced by these individual analyses. The proposed system has been extensively tested on a large dataset consisting of digital videos and images compressed using different codecs at different bit-rates and scaling factors, with varying noise and tampered region sizes. Empirical evidence gathered over this dataset suggests good efficacy of the system in different conditions.
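      The pixel-correlation signature left by resampling can be illustrated with a one-dimensional toy: 2× linear upscaling makes every interpolated sample exactly the average of its neighbours, a periodic correlation a detector can test for. Real detectors work on 2-D pixel neighbourhoods and must tolerate noise and compression; this sketch assumes a clean signal.

```python
def interpolation_residuals(row):
    """Residual of each interior sample against the average of its
    neighbours; near-zero residuals at regular positions hint at
    linear-interpolation resampling."""
    return [abs(row[i] - (row[i - 1] + row[i + 1]) / 2)
            for i in range(1, len(row) - 1)]

def looks_upscaled(row, tol=1e-9):
    """Flag a row whose every odd-position sample equals its neighbour
    average, as produced by exact 2x linear upscaling."""
    res = interpolation_residuals(row)
    return all(r <= tol for r in res[0::2])  # residuals at positions 1,3,5,...
```

      In practice the correlation is probabilistic rather than exact, which is why the proposed system pairs this inspection with noise-inconsistency analysis.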

      PubDate: 2017-01-17T21:00:56Z
       
  • Whitelisting system state in windows forensic memory visualizations
    • Abstract: Publication date: Available online 7 January 2017
      Source:Digital Investigation
      Author(s): Joshua A. Lapso, Gilbert L. Peterson, James S. Okolica
      Examiners in the field of digital forensics regularly encounter enormous amounts of data and must identify the few artifacts of evidentiary value. One challenge these examiners face is the manual reconstruction of complex datasets with both hierarchical and associative relationships. The complexity of this data requires significant knowledge, training, and experience to examine correctly and efficiently. Current methods provide text-based representations or low-level visualizations, but leave the task of maintaining global context of system state to the examiner. This research presents a visualization tool that improves analysis methods through simultaneous representation of the hierarchical and associative relationships and locally detailed data within a single-page application. A novel whitelisting feature further improves analysis by eliminating items of less interest from view. Results from a pilot study demonstrate that the visualization tool can assist examiners to identify artifacts of interest more accurately and quickly.
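      At its core, the whitelisting feature reduces to filtering extracted artifacts against a known-good baseline of expected system state. The (name, path) key below is an illustrative assumption, not the tool's actual schema.

```python
def apply_whitelist(artifacts, baseline):
    """Hide artifacts whose (name, path) pair matches a known-good
    baseline, leaving only unexpected system state in view."""
    return [a for a in artifacts if (a["name"], a["path"]) not in baseline]
```

      A process with a familiar name running from an unfamiliar path survives the filter, which is precisely the kind of item an examiner wants left on screen.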

      PubDate: 2017-01-10T20:42:43Z
       
 
 