I've got a web service which essentially just executes some stored procedures, transforms the data and sends it to the browser. No fancy ORM mapper or anything like that is involved. To be able to write tests without access to the database, I've done the following:

  • I've moved every call to the DB into one class. The methods return only DataSet and DataTable objects.
  • Performed a sample request for each method and serialized the resulting DataSet/DataTable to disk.
  • Extracted an interface exposing all available methods.
  • Implemented a fake database class which just loads the serialized data and returns it.

Now I have serialized sample results that I can check in with my project, and I can use the fake database in my tests.
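For illustration, a minimal sketch of the idea (the interface, method and file names are made up, and it assumes the tables were serialized with their schema included):

using System.Data;

public interface IDatabase
{
    DataTable GetPersons();   // stands in for one of the real DB calls
}

public class FakeDatabase : IDatabase
{
    public DataTable GetPersons()
    {
        // Load the previously serialized sample result instead of hitting the DB.
        // The file must have been written with XmlWriteMode.WriteSchema.
        var table = new DataTable();
        table.ReadXml(@"TestData\Persons.xml");
        return table;
    }
}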

This works very well for me, but I wonder whether there is some framework that makes creating and loading the sample data easier!? My current project is quite small, but I would use the same scheme in bigger projects.

Update:

Obviously the answers aren't wrong, but they miss the point. I'm aware of the basics of unit testing. But my code is dealing with DataTables, so I would somehow have to fake my DataTables. Building a DataTable from scratch isn't that easy, it would bloat my tests and make them unreadable. In my case it would also be pretty much impossible to create useful sample data by hand.

For this reason I performed some sample calls against a sample database to get some DataTables. I've serialized these tables to disk and use the serialized versions to create my fake DataTables when testing. That way the tests don't depend on the database.
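To illustrate, capturing one of those tables is essentially just this (connection string, query and file name are placeholders for whatever the real data-access class does; writing the schema along with the data makes loading it back trivial):

using System.Data;
using System.Data.SqlClient;

// One-off capture: run the real query against the sample database
// and write the result, schema included, next to the test project.
public static void CaptureTable(string connectionString, string query, string fileName)
{
    var table = new DataTable(fileName);
    using (var adapter = new SqlDataAdapter(query, connectionString))
    {
        adapter.Fill(table);
    }
    table.WriteXml(fileName + ".xml", XmlWriteMode.WriteSchema);
}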

There are different options for how to structure the code to make deserializing the tables easy. But those are implementation details that don't need a discussion at this point. My problem is the following:

Managing the sample calls and (de)serializing the tables is tedious work. I was looking for some tools to make this management easier.

From reading the other answers and the various comments you have made, it seems you want an easier way to generate large populated DataSets for integration testing that doesn't hit the database.

NBuilder is a great open-source library that I have used successfully to create large amounts of test data. Simply combine NBuilder, a few basic POCO classes and some reflection, and you'll have plenty of huge DataTables that you can easily combine into DataSets in no time:

// Requires a reference to NBuilder (FizzWare.NBuilder).
using System;
using System.Data;
using System.Linq;
using FizzWare.NBuilder;

public class Person
{
    public string First { get; set; }
    public string Last { get; set; }
    public DateTime Birthday { get; set; }
}

// Builds a DataTable with one column per public property of T
// and fills it with 'rows' rows of NBuilder-generated data.
private DataTable GenerateDataTable<T>(int rows)
{
    var datatable = new DataTable(typeof(T).Name);
    typeof(T).GetProperties().ToList().ForEach(
        x => datatable.Columns.Add(x.Name));
    Builder<T>.CreateListOfSize(rows).Build()
        .ToList().ForEach(
            x => datatable.LoadDataRow(x.GetType().GetProperties().Select(
                y => y.GetValue(x, null)).ToArray(), true));
    return datatable;
}

// Dog would be another simple POCO class like Person.
var dataset = new DataSet();
dataset.Tables.AddRange(new[]{
        GenerateDataTable<Person>(50),
        GenerateDataTable<Dog>(100)});

To unit test the transformations you really shouldn't need to mock the database at all. I suspect you have tightly coupled the transformations with your database calls. What you want to do here is extract all of your transformation logic into a class of its own, like the following:

public static class Transformations
{

    public static DataSet TransformationA(DataSet dataSet)
    {
        //transformation logic here
    }

    public static DataSet TransformationB(DataSet dataSet)
    {
        //transformation logic here
    }
}

With this you can unit test just the transformation logic by passing in a DataSet and then asserting that the returned DataSet has the correct transformations applied to it. This will keep you from having to implement a separate data store (your 'fake' database) for testing purposes only.
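For example, a test for one of those transformations could look roughly like this (NUnit, the table layout and the assertion are just placeholders, since the real transformation logic isn't shown):

using System.Data;
using NUnit.Framework;

[TestFixture]
public class TransformationTests
{
    [Test]
    public void TransformationA_AppliesExpectedChanges()
    {
        // Arrange: build a small input DataSet by hand or with NBuilder.
        var input = new DataSet();
        var table = new DataTable("Person");
        table.Columns.Add("First");
        table.Columns.Add("Last");
        table.Rows.Add("John", "Doe");
        input.Tables.Add(table);

        // Act
        var result = Transformations.TransformationA(input);

        // Assert whatever TransformationA is supposed to produce;
        // the row count here is only a placeholder check.
        Assert.AreEqual(1, result.Tables["Person"].Rows.Count);
    }
}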

Hope this helps.