How to test protected and private methods

Testing public methods with any testing framework is easy because we have access to the public interface of the test subject. Unfortunately, we don’t have access to its protected and private methods. They are hidden from us (the client) for a reason: they are implementation details the class does not want to leak to potential clients.

Generally speaking, it is a bad idea to test private and protected methods. They are not part of the public interface and are executed via the public methods of the class. So when we test the public methods, we implicitly test the private and protected methods as well.

In TDD we are testing the behavior of a class, not the specific methods. So when we make sure we test all the things a class can do, we can rest assured that the private and protected methods are tested.

If you find yourself with large private methods that need their own tests, there is probably something wrong with your design. Always aim for small classes that do one thing.

If you still think this is nonsense and you really want to test those methods, take a look below at how to do that with PHPUnit.
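PHPUnit itself has no dedicated API for this, but PHP’s Reflection API lets a test call a private method directly. Below is a minimal sketch; SomeClass and its private calculate() method are hypothetical names used only for illustration:

class SomeClassTest extends TestCase
{
    public function testPrivateCalculateDoublesItsInput(): void
    {
        $subject = new SomeClass();

        // Make the private method accessible via reflection.
        $method = new \ReflectionMethod(SomeClass::class, 'calculate');
        $method->setAccessible(true);

        // Invoke the private method on the instance with an example argument.
        self::assertSame(84, $method->invoke($subject, 42));
    }
}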

 

How to handle time in your tests?

Handling time in Unit Tests

Often we face scenarios where a process we are modelling depends on time. Take, for example, an expiring coupon: a coupon is no longer valid once it has expired. In PHP we could implement this as follows:

class Coupon
{
    /**
     * @var \DateTimeImmutable
     */
    private $expiryDate;

    public function __construct(\DateTimeImmutable $expiryDate)
    {
        $this->expiryDate = $expiryDate;
    }

    public function isValid(): bool
    {
        $diffInSeconds = $this->expiryDate->getTimestamp() - time();
        return $diffInSeconds > 0;
    }
}

class CouponTest extends TestCase
{
    public function testANotExpiredCouponIsValid(): void
    {
        $coupon = new Coupon(new \DateTimeImmutable('2018-10-28T07:14:49+02:00'));
        self::assertTrue($coupon->isValid());
    }

    public function testAnExpiredCouponIsNotValid(): void
    {
        $coupon = new Coupon(new \DateTimeImmutable('2017-10-28T07:14:49+02:00'));
        self::assertFalse($coupon->isValid());
    }
}

The problem

When we run the test for a valid coupon some day after 2018-10-28 (the expiry date), it will fail because the coupon has actually expired. That’s not what we want: we need to be able to test coupons with an expiry date in the future. So we need a way to fix that.

One way to resolve this issue would be to just change our test and pass in a date that’s 500 years in the future. It will fail again in 500 years but it will not be our problem anymore.

A better way to resolve this is to actually pass the current time as an argument to the isValid() method. Look at the new implementation:

class Coupon
{
    /**
     * @var \DateTimeImmutable
     */
    private $expiryDate;

    public function __construct(\DateTimeImmutable $expiryDate)
    {
        $this->expiryDate = $expiryDate;
    }

    public function isValid(\DateTimeImmutable $now): bool
    {
        $diffInSeconds = $this->expiryDate->getTimestamp() - $now->getTimestamp();
        return $diffInSeconds > 0;
    }
}

class CouponTest extends TestCase
{
    public function testANotExpiredCouponIsValid(): void
    {
        $coupon = new Coupon(new \DateTimeImmutable('2018-10-28T07:14:49+02:00'));
        self::assertTrue($coupon->isValid(new \DateTimeImmutable('2018-10-01T00:00:00+02:00')));
    }

    public function testAnExpiredCouponIsNotValid(): void
    {
        $coupon = new Coupon(new \DateTimeImmutable('2017-10-28T07:14:49+02:00'));
        self::assertFalse($coupon->isValid(new \DateTimeImmutable('2018-10-01T00:00:00+02:00')));
    }
}

That’s better. Now we have frozen the current time and no longer depend on the real system time. Our tests will always pass.
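In production code we simply pass the real current time when checking a coupon:

$coupon = new Coupon(new \DateTimeImmutable('2018-10-28T07:14:49+02:00'));

if ($coupon->isValid(new \DateTimeImmutable('now'))) {
    // apply the coupon
}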

How to test code that needs sleep?

The problem

Let’s say we have an API with a rate limit of 1 request per second. When we have to make multiple requests, we need to build in some delay mechanism. This mechanism checks whether the last request was made less than 1 second ago and, if so, delays the next request by 1 second.

One way to achieve this in PHP is:

class Api
{
    private $lastTime = 0;
    /**
     * @var \GuzzleHttp\ClientInterface
     */
    private $client;

    public function __construct(\GuzzleHttp\ClientInterface $client)
    {
        $this->client = $client;
    }

    public function doRequest(): void
    {
        $now = time();
        if ($now === $this->lastTime) {
            sleep(1);
        }
        $this->lastTime = $now;
        $this->client->get('https://some.api');
    }
}

How to test it?

To verify the behaviour of this class, we can mock the HTTP client and do assertions on it. We make two requests and assert that the API is called twice. It would look something like this:

class ApiTest extends TestCase
{
    public function testCallsTheApi(): void
    {
        $httpClient = $this->createMock(\GuzzleHttp\ClientInterface::class);
        $httpClient->expects($this->exactly(2))
            ->method('get');

        $api = new Api($httpClient);
        $api->doRequest();
        $api->doRequest();
    }
}

Now we have a few problems. First, we just slowed down our tests: unit tests are supposed to be fast, but now we waste 1 second. Second, we are not actually asserting that the second request came at least 1 second after the first. We could assert this by measuring the execution time, but that would still keep our tests slow.

A solution

To make our tests run fast and actually assert that our sleep function is called, we could do the following:

class Api
{
    private $lastTime = 0;
    /**
     * @var \GuzzleHttp\ClientInterface
     */
    private $client;
    /**
     * @var callable
     */
    private $sleep;

    public function __construct(\GuzzleHttp\ClientInterface $client, callable $sleep)
    {
        $this->client = $client;
        $this->sleep = $sleep;
    }

    public function doRequest(): void
    {
        $now = time();
        if ($now === $this->lastTime) {
            \call_user_func($this->sleep, 1);
        }

        $this->lastTime = $now;
        $this->client->get('https://some.api');
    }
}

class ApiTest extends TestCase
{
    public function testCallsTheApi(): void
    {
        $httpClient = $this->createMock(\GuzzleHttp\ClientInterface::class);
        $httpClient->expects($this->exactly(2))
            ->method('get');

        $sleepCalled = false;
        $api = new Api($httpClient, function (int $sec) use (&$sleepCalled) {
            $sleepCalled = true;
            self::assertEquals(1, $sec);
        });
        $api->doRequest();
        $api->doRequest();

        self::assertTrue($sleepCalled);
    }
}

So what changed? We made the sleep function configurable via the constructor of the Api class. In our test we pass a spy function that records that it was called and asserts it received the correct number of seconds. This test now runs fast and actually tests the needed timeout logic.

In our production code we would instantiate the Api class like this:

new Api(new Client(), function (int $sec) {
    sleep($sec);
});

Conclusion

There are certainly other ways to handle this case, but I find this an elegant solution. If you have ideas or thoughts about another approach, please let me know in the comments.

What is a Software Craftsman?

Last week I started to read part 1 of The Software Craftsman: Professionalism, Pragmatism, Pride by Sandro Mancuso. This post is my attempt to summarise what it means to be a software craftsman.

A software craftsman is more than just a coder. A coder writes code; a software craftsman crafts a software product. Software craftsmanship is about raising the bar of our industry. We all know the stories of projects that exceeded all deadlines and budgets, went into production full of bugs and did not match the client’s expectations very well.

The Software Craftsmanship Manifesto

“As aspiring Software Craftsmen we are raising the bar of professional software development by practicing it and helping others learn the craft. Through this work we have come to value:

Not only working software, but also well-crafted software.
Not only responding to change, but also steadily adding value.
Not only individuals and interactions, but also a community of professionals.
Not only customer collaboration, but also productive partnerships.

That is, in pursuit of the items on the left we have found the items on the right to be indispensable.”

Well-crafted software

We have all worked on ‘legacy’ applications. Applications that have no tests. No one knows exactly how the application works. The code is full of technical and infrastructure terms instead of expressing the business domain. Classes and methods have hundreds of lines. Developers working on these kinds of applications are crippled by fear because they don’t know where their changes will break something.

So the software is working and is earning a lot of money. But is it good software? Well-crafted software means that everyone working on it can easily understand the application. It does not matter whether it’s a brand new greenfield project or a ten-year-old legacy application. It has high test coverage, a simple design, and code written in the language of the domain. Adding features should be as painless as it was at the beginning of the project.

Steadily adding value

It’s not only about adding new features and fixing bugs, but also about improving the structure of the code. This means refactoring parts of the application to keep the code clean, testable and easy to maintain. It is the job of a software craftsman to make sure that the older the software gets, the more the client benefits from it. We do this by making sure we can easily add new features, so that our client can respond quickly to the changing market. Our software should not slow them down or become a source of pain. A great rule is Uncle Bob’s Boy Scout rule: always leave the code cleaner than you found it.

The focus on well-crafted software is important if we want long-lived applications. Totally rewriting an application a few years after it has been developed is usually a bad idea and has a very poor return on investment.

Community of professionals

It’s about sharing the knowledge of building quality software with other developers. Learning from each other is the best way to become better. We can contribute to the greater good of our industry by sharing our code, contributing to open source software, writing blogs, starting local communities and pairing with other developers. Most major cities already have regular developer meetups. These meetings are great because they attract people with all kinds of backgrounds.

Besides communities, this item is also about the work environment. Good developers want to work with developers who are better than they are. They want to work for great companies with great projects and a passionate atmosphere of improvement.

Productive partnerships

Software craftsmen don’t believe in the usual employer/employee type of relationship. Just coming to work, putting in the hours, keeping your head down and doing what you are told is not what we expect from a professional. What the contract says is just a formality. If you are a permanent employee, you should treat your employer the same way a contractor or freelancer does. Employers should expect from their employees the same great service they get from consultants and contractors.

Software craftsmen want to actively contribute to the success of the project. They want to question requirements, understand the business and propose improvements. Good craftsmen want a productive partnership with their clients. The advantages for the employer are enormous: a highly motivated team has a much bigger chance of making any project succeed. This is a big difference from the traditional approach, where business people talk with clients and tell developers what to build.

Conclusion

Software craftsmen are expected to do much more than just write code. Craftsmen help their clients by proposing alternative solutions and providing ideas to improve their processes. They understand the client’s domain and share good information and knowledge. Professionals question their clients’ requirements in relation to the value they provide. They provide value to their customers at all levels.

Thoughts on integrating external services

This week I was working on the backend of a new application. One of the requirements was to convert addresses to coordinates. This process is also known as forward geocoding. Reverse geocoding is exactly the opposite, where you convert coordinates to addresses. Since we don’t have that geographical data ourselves, we decided to look at an external service. We chose a competitor of Google Maps, but that does not really matter. What I want to share with you here is the solution we came up with for safely integrating that external service.

About integrating external services.

In my first years as a developer I would not think long about implementing the connection with this service. “How hard can it be? Just call the web service and use the response. What could go wrong?” Nowadays I tend to take a step back and consider a few things:

What happens when the external service is not available? Will this kill the process? What will the customer see? Should we just keep retrying until their web service is back? Should we cache the responses on our end?

The solution that we came up with.

When we receive a new address in our system, it needs to be forward geocoded immediately. In an event-driven architecture this is easy to accomplish. We listen to the AddressWasAdded event and dispatch a GeoCodeAddress command on the async command bus. This command gets picked up by one of the workers and is handled by the corresponding handler. The handler connects to the external service and saves the coordinates in the database. Because the external web service bills us per request, the GeoCoderService caches the returned coordinates for about one day, so if the same address is requested again it hits the cache instead of the actual service. The real implementation is more nuanced, but globally it looks like this:

Diagram of the solution
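In PHP, a simplified sketch of the listener and handler could look like the code below. Only the names AddressWasAdded, GeoCodeAddress and GeoCoderService come from the description above; the CommandBus, the AddressRepository and the method names are assumptions made purely for illustration:

// Listens for the AddressWasAdded event and dispatches a GeoCodeAddress
// command on the async command bus.
class GeoCodeAddressOnAddressWasAdded
{
    /**
     * @var CommandBus
     */
    private $asyncCommandBus;

    public function __construct(CommandBus $asyncCommandBus)
    {
        $this->asyncCommandBus = $asyncCommandBus;
    }

    public function __invoke(AddressWasAdded $event): void
    {
        $this->asyncCommandBus->dispatch(new GeoCodeAddress($event->addressId()));
    }
}

// Picked up by a worker. It asks the GeoCoderService (which caches responses
// for about a day because the external service bills per request) for the
// coordinates and stores them in the database.
class GeoCodeAddressHandler
{
    /**
     * @var GeoCoderService
     */
    private $geoCoder;

    /**
     * @var AddressRepository
     */
    private $addresses;

    public function __construct(GeoCoderService $geoCoder, AddressRepository $addresses)
    {
        $this->geoCoder = $geoCoder;
        $this->addresses = $addresses;
    }

    public function handle(GeoCodeAddress $command): void
    {
        $address = $this->addresses->get($command->addressId());
        $coordinates = $this->geoCoder->forwardGeocode($address);

        $this->addresses->saveCoordinates($command->addressId(), $coordinates);
    }
}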

Takeaways.

The main takeaway of this solution is the introduction of the async queue. This solves most of our problems: if an error happens in the GeoCoderService, the command is requeued and retried at a later time until it is handled successfully.

Because we work with an event-driven architecture and CQRS, processes are triggered via events and commands, but this solution can be implemented in any architecture. Just make sure that your application publishes a message on an async bus and let a consumer handle the rest.

I am very interested in your thoughts on this solution and if I am missing important questions about integrating external services. Please leave a comment to share your thoughts.

First post

As a software developer I want to learn something new every day. My plan for this blog is to make it my learning track record. New posts can be expected very soon.