Converted to ditjson, allowing only JSON export
Signed-off-by: Zafer Balkan <[email protected]>
zbalkan committed Feb 1, 2024
1 parent 72c8b70 commit 3b700fa
Showing 12 changed files with 256 additions and 603 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/ci.yaml
@@ -31,10 +31,10 @@ jobs:
- name: Build
shell: bash
run: |
release_name="release-dumpntds-${{ matrix.target }}"
release_name="release-ditjson-${{ matrix.target }}"
# Build everything
dotnet publish ./source/dumpntds.sln --framework net8.0 --runtime "${{ matrix.target }}" -c Release -o "$release_name"
dotnet publish ./source/ditjson.sln --framework net8.0 --runtime "${{ matrix.target }}" -c Release -o "$release_name"
# Remove unnecessary files
rm -f "./${release_name}/*.config" 2> /dev/null
147 changes: 0 additions & 147 deletions dshashes.py

This file was deleted.

44 changes: 7 additions & 37 deletions readme.md
@@ -1,58 +1,28 @@
dumpntds
ditjson
========

# Background

The normal workflow for dumping password hashes from **ntds.dit** files is to use the **esedbexport** application from the **[libesedb](https://github.com/libyal/libesedb)** project. The files generated by esedbexport are then fed into the [ntdsxtract](https://github.com/csababarta/ntdsxtract) project, which parses them to extract various items of information; in this case we want password hashes that can be fed into John the Ripper.

On large domains, the ntds.dit file can be extremely large (10 GB+), and extracting all of its columns to a CSV file can take a long time, given that the **datatable** table contains over 1,000 columns.

The aim of dumpntds is to extract only the minimal amount of data required (45 columns) to perform the task at hand, thus speeding up the process.

dumpntds uses the [ManagedEsent](https://github.com/microsoft/ManagedEsent) library to access the data stored in the ntds.dit file. ManagedEsent wraps the underlying Windows API calls and therefore needs to run on .NET rather than Mono.
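
For illustration only, the sketch below shows roughly how a copy of ntds.dit can be opened read-only with ManagedEsent and its **datatable** rows walked. It is a minimal example, not ditjson's actual implementation; the instance name, class name, and parameter choices are assumptions.

```
// Minimal sketch, not ditjson's actual code: open a copy of ntds.dit
// read-only with ManagedEsent and count the rows in the "datatable" table.
using System;
using Microsoft.Isam.Esent.Interop;

internal static class EsentSketch
{
    public static void CountRows(string ditPath)
    {
        Api.JetCreateInstance(out JET_INSTANCE instance, "esent-sketch");

        // ntds.dit uses 8 KiB pages, so the page size must match before JetInit.
        Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.DatabasePageSize, 8192, null);
        Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.Recovery, 0, "Off");
        Api.JetInit(ref instance);

        Api.JetBeginSession(instance, out JET_SESID sesid, null, null);
        Api.JetAttachDatabase(sesid, ditPath, AttachDatabaseGrbit.ReadOnly);
        Api.JetOpenDatabase(sesid, ditPath, null, out JET_DBID dbid, OpenDatabaseGrbit.ReadOnly);
        Api.JetOpenTable(sesid, dbid, "datatable", null, 0, OpenTableGrbit.ReadOnly, out JET_TABLEID tableid);

        var rows = 0;
        if (Api.TryMoveFirst(sesid, tableid))
        {
            do { rows++; } while (Api.TryMoveNext(sesid, tableid));
        }
        Console.WriteLine($"datatable rows: {rows}");

        Api.JetCloseTable(sesid, tableid);
        Api.JetEndSession(sesid, EndSessionGrbit.None);
        Api.JetTerm(instance);
    }
}
```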
`ditjson` is a fork of [dumpntds](https://github.com/bsi-group/dumpntds). Unlike the original tool, its purpose is to generate JSON files in order to ease integration with other tools.

This fork updates the underlying framework to .NET 8.0 and uses NuGet packages rather than shipping dependencies as part of the repository. It is possible to publish as a single file; the trimmed version is around 20 MB.

There is also a single-file JSON export mode which can be used to integrate with other tools, at the cost of breaking `ntdsxtract` importability. The JSON export is opinionated and ignores null values to minimize the exported JSON file size.
The output is a single-file JSON export. The JSON export is opinionated and ignores null values to minimize the exported JSON file size.
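
As an illustration of the null-skipping behaviour, the snippet below shows one way to achieve it with System.Text.Json in .NET 8. The serializer choice and the `UserRow` type are assumptions for this example; the actual serialization code is not part of this diff.

```
// Illustration only, not ditjson's actual serializer configuration.
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

var options = new JsonSerializerOptions
{
    // Drop properties whose value is null to keep the exported file small.
    DefaultIgnoreCondition = JsonIgnoreCondition.WhenWritingNull,
    WriteIndented = true
};

// Description is null, so it is omitted from the serialized object entirely.
Console.WriteLine(JsonSerializer.Serialize(new UserRow("alice", null), options));

// Hypothetical row type used only for this example.
internal sealed record UserRow(string SamAccountName, string? Description);
```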

# Usage

## Export JSON

Extract the ntds.dit file from the host and run using the following:
```
dumpntds -n path\to\ntds.dit\file -t Json
ditjson -n path\to\ntds.dit\file
```

Once the process has completed, it will have generated the following output file in the application directory:

- ntds.json

## Export CSV

Extract the ntds.dit file from the host and run using the following:

```
dumpntds -n path\to\ntds.dit\file
```

Once the process has completed, it will have generated two output files in the application directory:

- datatable.csv
- linktable.csv

### dsusers

The extracted files can then be used with the **dsusers.py** script from the ntdsxtract project:

```
python ./dsusers.py datatable.csv linktable.csv . --passwordhashes --syshive SYSTEM --pwdformat john --lmoutfile lm.txt --ntoutfile nt.txt
```

### dshashes.py

I have also included an updated version of the [dshashes](http://ptscripts.googlecode.com/svn/trunk/dshashes.py) Python script, which was broken due to changes in the underlying ntds library. The dshashes script can be used as follows:
# Dependencies

```
python ./dshashes.py datatable.csv linktable.csv . --passwordhashes SYSTEM
```
- [Commandline](https://github.com/commandlineparser/commandline)
- [ManagedEsent](https://github.com/microsoft/ManagedEsent)
2 changes: 1 addition & 1 deletion source/dumpntds.sln → source/ditjson.sln
@@ -3,7 +3,7 @@ Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 17
VisualStudioVersion = 17.8.34511.84
MinimumVisualStudioVersion = 10.0.40219.1
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "dumpntds", "dumpntds\dumpntds.csproj", "{8E9DA0F1-DB6D-4AB6-B979-DE86BD554602}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "ditjson", "ditjson\ditjson.csproj", "{8E9DA0F1-DB6D-4AB6-B979-DE86BD554602}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
File renamed without changes.
@@ -1,7 +1,7 @@
using System;
using System.Diagnostics;

namespace dumpntds
namespace ditjson
{
/// <inheritdoc/>
[DebuggerDisplay($"{{{nameof(GetDebuggerDisplay)}(),nq}}")]
5 changes: 1 addition & 4 deletions source/dumpntds/Options.cs → source/ditjson/Options.cs
@@ -1,6 +1,6 @@
using CommandLine;

namespace dumpntds
namespace ditjson
{
/// <summary>
/// Internal class used for the command line parsing
@@ -9,8 +9,5 @@ internal class Options
{
[Option('n', "ntds", Required = true, Default = "", HelpText = "Path to ntds.dit file")]
public string Ntds { get; set; }

[Option('t', "type", Required = false, Default = ExportType.Csv, HelpText = "Export type")]
public ExportType ExportType { get; set; }
}
}
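
For context, here is a minimal sketch of how an `Options` class like this is typically consumed with the [Commandline](https://github.com/commandlineparser/commandline) parser. ditjson's real entry point is not shown in this commit, so the `ProgramSketch` class and the body of the parsed branch are assumptions.

```
// Sketch only: typical CommandLineParser usage for the Options class above.
using System;
using CommandLine;

internal static class ProgramSketch
{
    private static int Main(string[] args)
    {
        // Parse the arguments into Options; the library prints help and errors itself.
        return Parser.Default.ParseArguments<Options>(args)
            .MapResult(
                opts =>
                {
                    Console.WriteLine($"Exporting {opts.Ntds} to JSON...");
                    return 0;
                },
                _ => 1);
    }
}
```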
