We have been using the MVP pattern in my current project to increase testability and decouple the UI from our business logic. To improve our testing environment we have begun using Rhino Mocks, and it has been a satisfying experience overall, even though I am still having a hard time "getting it right" when writing tests.
I want to share one of the "What The F***" experiences I had just a few minutes ago: one of our mocking tests failed. Well - insert a few breakpoints and fire up the debugger. And then it happened: every time I ran my test I got errors on different lines in my source code... Most of the time it was a NullReferenceException, but at other times the Rhino framework threw mysterious error messages at me...
I was a little scared to see a unit test behave like this for twenty minutes and really had no clue what was happening, but my colleague was able to solve the puzzle once I asked around for a little help. What had happened was that in my MVP presenter I had set up a mock of my view, with expectations that certain properties would be called during OnLoad. They were expected to be called just once, so when you start using the debugger's QuickWatch on those properties you "use up" the quota of expected calls. Quickwatching a property in a test with Rhino-style expectations actually fulfills the expectation you just set up... And when the code afterwards executes the call you intended the expectation to cover, your test fails, because the property was set up to be accessed just once.
Lesson learned today: when a test which uses mocking fails on you, clear all breakpoints and try again. It solved the mystery, and I was able to fix the original cause of the error within a few minutes.
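To make the gotcha concrete, here is a minimal sketch of the situation. The IView interface, presenter scenario and test name are invented for illustration; only the Rhino Mocks and NUnit calls themselves are real:

```csharp
using NUnit.Framework;
using Rhino.Mocks;

public interface IView
{
    string Title { get; }
}

[TestFixture]
public class PresenterTests
{
    [Test]
    public void OnLoad_ReadsTitle_Once()
    {
        MockRepository mocks = new MockRepository();
        IView view = (IView)mocks.CreateMock(typeof(IView));

        // The default expectation allows exactly one call.
        Expect.Call(view.Title).Return("Customers");
        mocks.ReplayAll();

        // If you QuickWatch view.Title at a breakpoint here, the
        // debugger itself performs the one allowed call. When the
        // code under test then reads view.Title, Rhino Mocks sees
        // an unexpected second call and fails in a "random" place.
        string title = view.Title;

        mocks.VerifyAll();
        Assert.AreEqual("Customers", title);
    }
}
```

Stepping through this test is harmless as long as you never evaluate the mocked property in a watch window - which is exactly why clearing breakpoints made the failure disappear.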
This blog contains reflections and thoughts on my work as a software engineer
Wednesday, July 9, 2008
Rhino + VS debugger == WTF?
Posted by Kristian Erbou 0 comments
Labels: rhino mock nullpointerexception, tdd, testdriven development
Wednesday, June 18, 2008
The infamous Save() method
We (a colleague and myself) had a little fight a few days ago during a pair-programming session - I advocated for all Save() methods being void and parameterless, whereas he saw no problem in having them return objects and/or take input parameters at will.
I couldn't at that time argue my case, except that I saw this as a violation of the Separation of Concerns (SoC) principle - and why force the consumer to handle a return value if the consumer doesn't need one? The discussion grew a little ugly at some point, but yesterday, just before leaving the office, we talked it over again.
It turned out that my colleague's experience with TDD has, for various reasons, not been overly positive - having to figure TDD out for yourself, without some sort of experienced mentor in-house, will lead to frustration in 95% of all cases once you try to use TDD in a real-world scenario. We argued about the signature of Save() methods in general, and I understood my own reservations when we discussed what would happen once the signature of the Save() method changed because we wanted to introduce a new parameter in a call to Save() - example:
public void Save(string a, string b, int c)
{ ... }
was to be extended with:
public void Save(string a, string b, int c, List<string> d)
{ ... }
In his mind, the compiler would simply burp on all the places where Save(a, b, c) was used, so you would change the code of your tests in all those places.
In my mind, you would have to change all working tests using the infamous Save() method because you changed a single method signature. This is bad because:
1: The time spent maintaining your tests should be kept to a minimum
2: You could be misled into working around changes in your model to avoid having to alter your tests to reflect that change - and THAT should ring the alarm bell.
Conclusion:
1: Having simple method signatures should be a goal in itself, to avoid breaking your tests when you want to introduce new functionality and parameters which need to be carried around in your application.
2: Do use objects to carry parameters around for you. Less code is needed, and the readability of your code is much higher. Having, for instance, an object carrying 10 primitives around your model is much more extensible than having to extend every class with a property for parameter #11.
3: You should not be afraid of having "dead" objects in your model and view which only contain data. They keep the number of properties and overloaded methods in your application to a minimum :o)
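The conclusions above can be sketched with a small parameter object. The class and property names here are invented for illustration, not taken from the discussion:

```csharp
using System.Collections.Generic;

// A "dead" data carrier: it exists only to hold the parameters that
// Save() needs, so the Save() signature itself never has to change.
public class PersonData
{
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public int Age { get; set; }

    // A later addition - existing Save(data) call sites and tests
    // keep compiling, because the method signature is untouched:
    public List<string> Nicknames { get; set; }
}

public class PersonRepository
{
    // The signature stays stable no matter how the payload grows.
    public void Save(PersonData data)
    {
        // persistence logic goes here
    }
}
```

Compare this with Save(string, string, int): adding a fourth parameter there would have broken every caller and every test, which is precisely the maintenance cost the conclusion warns about.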
Related articles:
TDD antipatterns (I don't agree with every one of these, but it is a good checklist)
This guy suggests best practices for API design:
How to design good APIs
Posted by Kristian Erbou 0 comments
Labels: parameterless method signature, separation of concerns, software quality, testdriven development, testdriven development antipattern
Tuesday, April 22, 2008
Hello Rhino World
I have discovered something tonight - the art of testing through mocking. Yes, I admit it: I've never actually given much thought to the idea that mocking could be remotely interesting, because a mock is simply a stand-in for something your code under test depends on, e.g. a database access layer or external devices connected to your computer. I do not like to think of myself as religious, but nevertheless I had never given mocking a try because it just didn't feel right (!). So much for trying to see things from the other side - until now, because:
We have had quite a few exhausting weeks at work. Our source control is coming up to speed and a CI build environment is slowly emerging. It builds everything, we are no longer working disconnected from each other, and nobody can break the build for days without getting instant notice. I like it sooo much... We have hired one of my former colleagues, who is now an independent consultant, to tell us about Agile development and TDD. Last Wednesday we talked a bit about mocking, and I peeped a little about the paradox of testing a simulated version of the real world. However, I decided that I would give a mocking framework a try one of these days, and I tried it out tonight, so I created a simple scenario:
I want to create a Person object and store it in a database. The person should expose whether or not he has been saved. Separation of Concerns, I know, I know... I'm just mocking a scenario here... (applause sign on). And - tada - I want to test both a real object and a mocked one, and be able to retrieve a positive response whenever I ask Mr. Per Son: have you been saved?
I downloaded Rhino Mocks because it was recommended to me, and it seems to be some sort of de facto standard these days - mainly because it is strongly typed, which doesn't seem to be the case with other .NET mocking frameworks. I created the following unit tests to mock my real-life Person object:
[Test]
public void CreateDomainObject()
{
Person p = new Person();
Assert.IsNotNull(p, "Person is null");
}
[Test]
public void RealPerson_NotPersisted()
{
Person p = new Person();
Assert.IsFalse(p.IsPersisted);
}
[Test]
public void RealPerson_Persisted()
{
Person p = new Person();
p.Save();
Assert.IsTrue(p.IsPersisted);
}
I then created a Person object to simulate this behaviour:
public class Person
{
private bool persisted;
public Person()
{
persisted = false;
}
public void Save()
{
persisted = true;
}
public bool IsPersisted
{
get { return persisted; }
}
}
...and now things get interesting. A mock of the Person is introduced in the following test:
[Test]
public void Mock_Persisted_Property()
{
IPerson persistedPerson = GetMockedPerson();
Assert.IsTrue(persistedPerson.Persisted);
}
private IPerson GetMockedPerson()
{
MockRepository mocks = new MockRepository();
IPerson person = (IPerson)mocks.CreateMock(typeof(IPerson));
Expect.Call(person.Persisted).Return(true);
mocks.ReplayAll();
return person;
}
The MockRepository makes it possible to create a mocked instance of an IPerson - that is, an interface describing the methods and properties of the object you intend to mock. The domain object does not actually have to implement the interface, so if you do not want to have your domain model "interfaced" for various reasons, you do not have to. The IPerson I created is simply:
public interface IPerson
{
bool Persisted { get; }
}
I do not want to do anything with Save(), so I leave it out of the interface. Maybe it is not best practice - I'm a newbie at this, so feel free to guide me in the right direction if this has proven to be an anti-pattern.
With Expect.Call I decide that a call to the Persisted property on my mocked Person should return true. mocks.ReplayAll() stops recording expectations on the mock, and now I am ready to validate my expectations in my tests.
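One step worth mentioning as a suggestion, not part of the original code: Rhino Mocks can also verify that every recorded expectation was actually met, by calling VerifyAll() after exercising the mock. A minimal sketch, using the same IPerson interface as above:

```csharp
[Test]
public void Mock_Persisted_Property_Verified()
{
    MockRepository mocks = new MockRepository();
    IPerson person = (IPerson)mocks.CreateMock(typeof(IPerson));
    Expect.Call(person.Persisted).Return(true);
    mocks.ReplayAll();

    Assert.IsTrue(person.Persisted);

    // Fails the test if Persisted was never read,
    // or was read more often than expected.
    mocks.VerifyAll();
}
```

Without VerifyAll(), a test still passes even if the expected property is never touched at all; with it, the expectation becomes an assertion in its own right.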
This makes it possible for me to run the Mock_Persisted_Property() test successfully - and retrieve a positive value when asking if the Person has ever been saved, even though the Save method has never been called. This is truly great, because I have had numerous encounters with third-party vendors and external hardware and processes which have not been testable for me until now - either because testing them would result in extremely slow integration tests, or because a successful test suite run would be extremely tightly coupled to a certain state of the hardware being tested (e.g. the hardware device always being on and responding to requests).
I like this a lot. I have only scratched the surface of the documentation, but I will definitely go deeper into this during the next weeks and months. I have seen the light - or at least seen the possibilities, instead of only focusing on the limitations of mocking ;o)
Regards Kristian
P.S.: The entire source (Visual Studio 2008 solution) can be downloaded from this location
Posted by Kristian Erbou 0 comments
Labels: mocking framework, rhino mock, tdd, testdriven development
Monday, January 28, 2008
Test-Driven Development and the quality issue
There has been quite a lot of fuss lately regarding Test-Driven Development (TDD) - I have seen a number of posts where people ask if the maturity of the tools on the market makes old-school TDD obsolete. I have also seen posts where Ben Hughes asks if TDD really ensures quality.
That made me think about the term "quality" - what is software quality seen from a customer's point of view, and how can we convince customers that the software we produce is better than that of our competitors?
I don't really think you can measure very much out of TDD in the sense of software quality seen from a customer's point of view. I mean - do you really care how Microsoft develops Windows XP and Windows Vista? I couldn't care less - and neither do our customers, as long as our sh** works every time. Even when our customers have a negative experience because they are unable to fulfill whatever task they set out to do, because we as software engineers have failed to deliver - do they start asking questions? Not really - they start looking at the products of our strategic competitors, searching for a replacement for the crap we have convinced them to install on their hard drive.
The principles and paradigms a software product has been built upon can and should never be a quality metric for our customers. They will never care anyway. TDD is just one (very useful) tool out of many when we develop software. In terms of quality, it should only be an in-house metric whether or not our software has a certain degree of code coverage, for instance... We must not be misled as developers into believing that anyone but us really cares what tools are in our toolbox!
Posted by Kristian Erbou 3 comments
Labels: how to measure software quality, software quality, tdd, testdriven development