How to pass a coding test and get the engineering role you really want!

Published on November 7, 2023

Our coding test is deliberately a simple problem to solve, but with room to show your skills if you choose to do so.

The idea is like a famous test for chefs: you ask them to cook an omelette. You can go all out and add chorizo, herbs, truffles and black pudding, or just keep it simple and cook a plain omelette, maybe with a bit of cheese. But you have to get the basics right. The egg shouldn't be overcooked and rubbery, and it shouldn't be undercooked and runny. There's no reason, as a chef, why it shouldn't be just right.

Some people like to spend longer on the coding test and add features that are not explicitly asked for. This is great, as long as they do them right. But we mark it the same as a submission from somebody who doesn't have a lot of time and just opts for the simple solution with minimal features, as long as they do it right.

So, the things we're looking for...

Well-engineered code! But what does this look like?

For us, well-engineered code is maintainable code. Code has a life. It will do a job for a while, and at some point in the future it may be changed to do a slightly different job, or some extra things that it didn't originally do. Or it may not be needed anymore and just be deleted. The engineer making those changes may not be the same engineer who wrote the original code. So to write maintainable code you need to be considerate of the person who may have to change it in the future (even if that's you!).

Writing considerate, clean, extensible, composable, testable code makes for maintainable code.

Clean code is code that is easy to understand. It is structured and laid out in an intuitive and natural way. A lot of algorithms can be written in a single line of Python, or with recursive solutions, but these often require a lot of cognitive load for another engineer to understand what is going on. If they need to make a change to this code, they want it to be obvious how the changes will affect the functionality and output, so they have the confidence to make any changes in a reasonable time. They don't want to be spending the time trying to decrypt the codez!

Most languages have some form of linter or analysis tool that can highlight areas that can be simplified or improved, and this can really help to make a codebase more consistent and easier for new people to pick up (we use pylint).

Another thing that can help here is documenting the code. Most languages have a standard way of adding documentation to methods, functions, classes, modules and so on; in Python, these are docstrings. Docstrings and inline comments are a great way to help another engineer understand what a function does, or what a particular line of code does and why it was written this way.
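For example, in Python this might look something like the following (the function itself is our own illustration, not part of the test):

import string

def strip_punctuation(line):
    """Return `line` with leading and trailing punctuation removed.

    We strip punctuation before matching so that "cat," and "cat" are
    treated as the same word.
    """
    # str.strip removes any of the listed characters from both ends only
    return line.strip(string.punctuation)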

Good code also looks good. It may seem silly, but a consistent layout of code is really important. Again, it helps with the cognitive load of reading and parsing somebody else's code. Most languages have standard guidelines for how to format code: naming conventions for variables, constants, classes and so on; where to put brackets; how to break up long lines. Python has a tool called black, a very opinionated formatter which can be used to automatically format all source files to its standards. Other languages have configurable prettifiers which can be applied to source code. It's really worth using one, just to be consistent and make the code look cleaner, more pleasing and easier to read.
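To give a rough idea (our own illustrative snippet), a formatter takes something like this:

def total(items):return sum( [ item.price for item in items ] )

and turns it into roughly this:

def total(items):
    return sum([item.price for item in items])

Same behaviour, but consistent spacing and layout, with no arguments between engineers about where the brackets go.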

Extensible code is code that was written with the intention of being changed. The trick here is to not just solve the problem you are presented with, but to also consider the problems that may need to be solved in the future. There is a compromise here between effort and extensibility. You don't want to predict everything that might ever be done with this code, but a reasonable amount of extensibility can really help the next engineer who has to deal with these lines.

For example, imagine you're given a coding test to work out the smallest set of notes to give a customer from an ATM, based on the requested amount. A function that solves the problem based on a passed-in set of available notes is much more extensible than a solution that only solves for a specific set of denominations. This would be considered a reasonable level of extensibility that could really help another engineer if they are asked to extend the solution to include another denomination, such as a £5 note. If all they need to do is change a constant somewhere from [50, 20, 10] to [50, 20, 10, 5], we've made their life a lot easier.
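A sketch of what that might look like (the names and the greedy approach are our own illustration; greedy selection happens to give the smallest set for these denominations):

def notes_for_amount(amount, denominations=(50, 20, 10)):
    """Return the smallest set of notes that adds up to `amount`.

    Assumes `denominations` is sorted from largest to smallest.
    """
    notes = []
    for denomination in denominations:
        # how many of this note fit, and what is left over afterwards
        count, amount = divmod(amount, denomination)
        notes.extend([denomination] * count)
    if amount:
        raise ValueError(f"Cannot make the remaining {amount} with these notes")
    return notes

Supporting £5 notes is now a one-line change for the next engineer, e.g. notes_for_amount(85, (50, 20, 10, 5)), rather than a rewrite.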

Composable code is all about reuse and separation of concerns. Most problems can be broken down into smaller problems, each of which can be solved separately. This is normally represented by separate functions or methods that do a single, clearly defined job. These pieces are much simpler to understand and can be used together in different combinations to solve different, bigger problems. In our coding test this could be: reading a file, obtaining the search term from a list of lines, cleaning a line, matching a search term to a line, and formatting the output. It would be very easy to write a solution to our coding test that was just a script that ran from top to bottom, but breaking it up into smaller, reusable pieces makes it much easier to understand, much easier to change and maintain, and much easier to test.
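As a sketch (the function names and details here are our own, since they depend on the actual test):

def read_lines(path):
    """Return the lines of the file at `path` as a list of strings."""
    with open(path, encoding="utf-8") as handle:
        return handle.read().splitlines()

def clean_line(line):
    """Strip surrounding whitespace and trailing punctuation from a line."""
    return line.strip().rstrip(".,;:!?")

def matches(search_term, line):
    """Return True if `search_term` appears in `line`, ignoring case."""
    return search_term.lower() in line.lower()

def matching_lines(path, search_term):
    """Compose the small pieces to do the bigger job."""
    return [line for line in map(clean_line, read_lines(path))
            if matches(search_term, line)]

Each small function can be understood, reused and tested on its own.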

Which brings us on to testing. Testing is really important. It comes back to something we mentioned earlier: confidence about making a change. Unit tests are used to test all the little, composable pieces of a larger system. If you have a function that takes in a string and returns another string formatted a certain way, for example, tests can be written to check that this function does what it is supposed to do, and also that it does sensible things in extreme cases: what happens when an empty string is passed in? What happens when a number is passed in?
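For instance (a minimal pytest-style sketch around a made-up formatting function):

import pytest

def shout(text):
    """Example function under test: upper-case a string and add an exclamation mark."""
    return text.upper() + "!"

def test_shout_formats_a_simple_string():
    assert shout("hello") == "HELLO!"

def test_shout_handles_an_empty_string():
    # extreme case: empty input should still produce valid output, not crash
    assert shout("") == "!"

def test_shout_rejects_a_number():
    # extreme case: a non-string input should fail loudly rather than silently
    with pytest.raises(AttributeError):
        shout(42)   # an int has no .upper()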

When making a change to a piece of code, it is really helpful to have a suite of tests that run against the code, to give you confidence that you haven't broken the expected functionality, or broken another piece of code that calls this code and relied on an output you've now changed.

Writing tests for code can cause you to change the way that code is written. I once worked on a system where we'd made the decision early on to not write automated tests in order to speed up development (it is accepted, wrongly or rightly, throughout the industry that writing tests doubles the engineering time to deliver something). Years down the line someone then decided that we'd got a bit of time and we should start writing those tests. We couldn't. The code had been written in a way that made it impossible to write tests for it without rewriting large amounts of the system.

Testing shouldn't be an afterthought. It should be done at the same time as the code. There is test-driven development, where you write the tests first and then develop the code to pass the tests. A lot of people claim to do this, but in practice I've never met anyone who actually does! Still, it's a good idea to write the tests as you go along, to make sure that the code is structured in such a way that it can be tested. The composability of code and its testability are quite closely linked.

A key thing that we look for in the testing is signs that the candidate has thought through the problem beyond the provided examples. There are normally one or two examples of input and expected output in a coding test, and it's great to include these in your tests to prove that your solution solves the provided examples. But it's even better to also include some examples and edge cases of your own. It really shows that you've thought about the problem.
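In pytest this is easy to express with parametrised cases; here the function under test and the example data are hypothetical stand-ins for whatever the test brief provides:

import pytest

from solution import count_words   # hypothetical module containing your solution

@pytest.mark.parametrize("text, expected", [
    # the example provided in the (hypothetical) test brief
    ("the cat sat on the mat", {"the": 2, "cat": 1, "sat": 1, "on": 1, "mat": 1}),
    # our own additional cases and edge cases
    ("", {}),
    ("word", {"word": 1}),
    ("The the THE", {"the": 3}),   # assumes counting is case-insensitive
])
def test_count_words(text, expected):
    assert count_words(text) == expected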

Speaking of thinking about the problem, one other thing that we look for is a knowledge of the limitations of the solution that a candidate submits. These are the assumptions that were made before the solution was developed. It's really useful to see what a candidate has considered, and what they haven't.

In our coding test we require a file to be read, where there is an important piece of information in the last line. A lot of submissions will simply open the file, read all the lines into a list, and then it is simple to take the last line from the list. This is all fine. But... what if the file doesn't fit into memory? What if it is a massive, 30 GB file of text and we need the last line? There are solutions to this problem, and some people try to solve it, which is great. But you don't need to solve that problem to "pass" our coding test. You need to have recognised the problem might exist and state somewhere that the solution doesn't deal with it. This could be in the docstrings for the relevant part of the code or in the accompanying README file (or both).
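One common approach (a sketch, not the required answer) is to read backwards from the end of the file in blocks until a newline is found:

import os

def last_line(path, block_size=4096):
    """Return the last line of a file without reading it all into memory."""
    with open(path, "rb") as handle:
        handle.seek(0, os.SEEK_END)
        position = handle.tell()
        data = b""
        # step backwards a block at a time until the bytes we hold span a
        # newline (ignoring any trailing newlines at the end of the file)
        while position > 0 and b"\n" not in data.strip(b"\n"):
            step = min(block_size, position)
            position -= step
            handle.seek(position)
            data = handle.read(step) + data
        return data.strip(b"\n").rsplit(b"\n", 1)[-1].decode("utf-8")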

Another example from our coding test is the support for Unicode characters. A lot of the submissions we get find the words using regular expressions, but only look for the characters [a-zA-Z], and have not considered accented characters or letters from other alphabets. There is nothing in the requirements to say that the solution should support other languages, and none of the provided examples use accented characters. But there's nothing in there to say that it shouldn't support them. It's another assumption that is great if the engineer is aware of it and just states it somewhere.
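The difference is easy to demonstrate. In Python 3, regular expressions are Unicode-aware by default, so a character class like [^\W\d_] ("word characters, minus digits and underscores") matches accented letters, where [a-zA-Z] splits words apart:

import re

text = "crème brûlée and jalapeño"

# ASCII-only: accented characters break words apart
print(re.findall(r"[a-zA-Z]+", text))
# ['cr', 'me', 'br', 'l', 'e', 'and', 'jalape', 'o']

# Unicode-aware: letters from any alphabet are matched
print(re.findall(r"[^\W\d_]+", text))
# ['crème', 'brûlée', 'and', 'jalapeño']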

Obviously you can now list those two without ever having thought about them, but there are several other assumptions that every submission to our coding test makes, one way or another. See if you can think of one that you're making, and state it in the README file.

And finally! Error handling and user experience. Most coding tests involve writing some code that needs to interact with a user. So one thing that we look for is considerate error handling and suitable information being presented back to the user. If there is a problem with a file, for example, it's useful to tell the user as much as possible about the file and the problem:

"Error occurred" - not very helpful

"Error with file" - at least we know a file is at fault

"Error occured with file c:/input.txt" - ah, we know which file

"c:/input.txt was empty" - we know which file and what's wrong with it, but not how to fix it

"c:/input.txt was empty, expected file of csv input data" - we know which file, what's wrong with it and what it should be

When handling errors, it is important that they are handled properly. We've seen a lot of code submissions where people have tried to handle the errors but have actually made the built-in error handling worse.

A bad example:

try:
    data = read_the_file()
except IOError:
    # a worse error message than we would have gotten if we hadn't bothered
    # catching the IOError
    print("An error occurred")

# program flow has been allowed to continue; 'data' is not going to be
# correct and further errors will occur
carry_on_even_though_an_error_has_occurred(data)
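A better version (a sketch; read_the_file and process_the_data are the same kind of hypothetical helpers) reports which file failed and why, and stops rather than carrying on with missing data:

import sys

try:
    data = read_the_file("c:/input.txt")
except IOError as error:
    # say which file failed, what went wrong, and what was expected
    print(f"Could not read c:/input.txt ({error}), expected a file of csv input data")
    sys.exit(1)   # stop here instead of continuing without data

process_the_data(data)   # only reached if the file was read successfully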

Proper error handling can lead to a much more pleasant end user experience and is another sign of considerate coding.

And, if you're ever in any doubt, just ask! In the real job you wouldn't be expected to continue blindly working if you have a question about the requirements, so if you have a question we'd love for you to ask us. This demonstrates your communication skills and shows that you're not afraid to admit when you don't know something, which are both really valuable skills to have when working on a team. So whether it's a clarification on the expectations or a question about your implementation, if you would normally ask your team, then just ask. We promise not to bite!

Deeper Insights is hiring!

Check out all of our job vacancies here, including engineering roles.
