Add Catch around Recursive Extractor calls
Was able to reproduce overflow exceptions thrown here that end execution. This is potentially the same root cause as issue #598's report of null streams causing a crash.
gfs committed Jan 8, 2025
1 parent 92a437d commit 25aefa0
Showing 1 changed file with 30 additions and 12 deletions.
AppInspector/Commands/AnalyzeCommand.cs
@@ -4,6 +4,7 @@
 using System;
 using System.Collections.Concurrent;
 using System.Collections.Generic;
+using System.Collections.Immutable;
 using System.Diagnostics;
 using System.Globalization;
 using System.IO;
@@ -771,21 +772,38 @@ private IEnumerable<FileEntry> EnumerateFileEntries()
 
             if (contents != null)
             {
-                if (_options.DisableCrawlArchives)
+                IList<FileEntry> entriesToYield = new List<FileEntry>();
+                try
                 {
-                    yield return new FileEntry(srcFile, contents);
+                    if (_options.DisableCrawlArchives)
+                    {
+                        entriesToYield.Add(new FileEntry(srcFile, contents));
+                    }
+                    else
+                    {
+                        // Use MemoryStreamCutoff = 1 to force using FileStream with DeleteOnClose for backing, and avoid memory exhaustion.
+                        ExtractorOptions opts = new()
+                        {
+                            Parallel = false, DenyFilters = _options.FilePathExclusions, MemoryStreamCutoff = 1
+                        };
+                        // This works if the contents contain any kind of file.
+                        // If the file is an archive this gets all the entries it contains.
+                        // If the file is not an archive, the stream is wrapped in a FileEntry container and yielded
+                        entriesToYield = extractor.Extract(srcFile, contents, opts).ToImmutableList();
+                    }
                 }
-                else
+                catch (Exception e)
                 {
-                    // Use MemoryStreamCutoff = 1 to force using FileStream with DeleteOnClose for backing, and avoid memory exhaustion.
-                    ExtractorOptions opts = new()
-                    {
-                        Parallel = false, DenyFilters = _options.FilePathExclusions, MemoryStreamCutoff = 1
-                    };
-                    // This works if the contents contain any kind of file.
-                    // If the file is an archive this gets all the entries it contains.
-                    // If the file is not an archive, the stream is wrapped in a FileEntry container and yielded
-                    foreach (var entry in extractor.Extract(srcFile, contents, opts)) yield return entry;
+                    _logger.LogDebug(
+                        "Failed to analyze file {Path}. {Type}:{Message}. ({StackTrace})",
+                        srcFile, e.GetType(), e.Message, e.StackTrace);
+                    _metaDataHelper?.Metadata.Files.Add(new FileRecord
+                        { FileName = srcFile, Status = ScanState.Error });
                 }
+
+                foreach (var entry in entriesToYield)
+                {
+                    yield return entry;
+                }
             }
 
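A note on why the commit buffers results into `entriesToYield` instead of yielding directly inside the new try/catch: C# forbids `yield return` inside a try block that has a catch clause (compiler error CS1626), so catching exceptions from the extractor forces the entries to be materialized first and yielded afterwards. A minimal sketch of the same pattern, using hypothetical `Extract`/`SafeEnumerate` names in place of the real extractor calls:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class IteratorCatchDemo
{
    // Hypothetical stand-in for the recursive extractor; throws on a bad input,
    // the way the real extractor can throw overflow/null-stream exceptions.
    public static IEnumerable<string> Extract(string path)
    {
        if (path.Length == 0) throw new InvalidOperationException("bad entry");
        yield return path;
    }

    // Mirrors the commit's pattern: `yield return` is illegal inside a try
    // block with a catch clause, so results are collected into a list under
    // the try/catch and yielded only after the catch scope has closed.
    public static IEnumerable<string> SafeEnumerate(IEnumerable<string> paths)
    {
        foreach (var p in paths)
        {
            IList<string> toYield = new List<string>();
            try
            {
                // Enumeration happens here, inside the try, so a throwing
                // extractor skips this file instead of ending the whole scan.
                toYield = Extract(p).ToList();
            }
            catch (Exception e)
            {
                Console.Error.WriteLine($"Failed to extract {p}: {e.Message}");
            }
            foreach (var entry in toYield)
            {
                yield return entry;
            }
        }
    }

    public static void Main()
    {
        // The empty path throws inside Extract, but enumeration continues.
        var results = SafeEnumerate(new[] { "a", "", "b" }).ToList();
        Console.WriteLine(string.Join(",", results)); // prints "a,b"
    }
}
```

The list-then-yield shape trades a little memory (entries for one file are held at once) for the ability to log the failure, record the error in metadata, and keep the enumerator alive for the remaining files.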
