To test this feature out, I created another sample application. This time it works with files, and I tried to simulate a memory-consuming process.
The setup
It is a simple C# Console Application with only one method, Main. All the code is executed inside that method, and no calls to external libraries are made. Here is what it looks like:
using System;
using System.IO;

namespace ConsoleApplication
{
    class Program
    {
        static void Main(string[] args)
        {
            string[] fileList = Directory.GetFiles(@"D:\Temporary");

            foreach (string file in fileList)
            {
                Console.WriteLine("Getting bytes for " + file + "...");
                Console.WriteLine("Bytes for " + file + ": " + File.ReadAllBytes(file).Length);
            }

            Console.Read();
        }
    }
}
This code gets the file paths from a given source folder and then reads each file's contents into a byte array. For large files, this allocates quite a bit of memory, which makes it a good way to demonstrate the capabilities of the built-in profiling tools when it comes to memory allocation benchmarking.
As you can see from the code, I am referencing a path that points to a folder called Temporary. To test it out, I copied a set of small and not-so-small files there (a bunch of large texture files and a movie). That is pretty much everything that is needed to simulate intensive memory consumption.
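If you do not have suitably large files at hand, you can generate one programmatically instead of copying files around. The sketch below is a hypothetical helper (not part of the original setup); it writes a 100 MB file of zero bytes to the system temp directory, so adjust the path to match whatever folder your profiling target reads from:

```csharp
using System;
using System.IO;

class MakeTestFile
{
    static void Main()
    {
        // Hypothetical helper: writes a 100 MB file of zeros so the
        // profiled application has something sizable to read.
        // Change this path to your own test folder (e.g. D:\Temporary).
        string path = Path.Combine(Path.GetTempPath(), "bigfile.bin");

        byte[] chunk = new byte[1024 * 1024]; // 1 MB buffer of zeros

        using (FileStream fs = File.Create(path))
        {
            for (int i = 0; i < 100; i++)
            {
                fs.Write(chunk, 0, chunk.Length);
            }
        }

        // Print the resulting file size in bytes.
        Console.WriteLine(new FileInfo(path).Length);
    }
}
```

Running this once gives the sample application a predictably large input, which makes allocation numbers easier to compare between profiling runs.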
Trying it out – getting and analyzing the results
To start the process, go to Analyze > Launch Performance Wizard and select .NET Memory Allocation (Sampling).
Read more: DZone