Testing Project: Test-Driven Development
Objectives:
- to learn test-driven development techniques
- to think abstractly about how a piece of software might be implemented
- to gain experience with class invariants and method pre- and post-conditions
- to consider how to design good APIs
- to practice writing good tests
- to get experience using the JUnit framework, including the strengths and limitations of unit testing
- to collaborate with other people using Git
- to help you become a good, systematic tester!
Due: Wednesday, November 9 by 11:59 p.m. Staggered Extension: Your individual analysis is due by Thursday, November 10 at midnight.
FAQ - updated as I get new questions
Team Assignment
This is a team assignment. You will practice collaborating using Git in a team of 3. Each member of the team must commit/push code to the shared remote repository.
For Tuesday at 10 a.m.: One person from your team should email me with the subject line CSCI209: Testing Team, CC all the team members, and include the name of your team.
Set up
Accept the assignment. You can either create a new team--the agreed-upon name you told Professor Sprenkle--or join an existing team. After the team's repository is created, copy the link to clone your repository.
In Eclipse, go into the Git repository view and clone your repository. Expand the repository, expand Working Tree, right-click on "Working Tree", and then select "Import Projects ..." Step through and import the project. We won't need to add JUnit to the classpath now; when you create your first test class, the wizard will step you through adding it.
Add each of your team members' names to the README.md. (Note that only one person should do this and commit/push the file. Otherwise, you will get conflicts.)
Updating your Classpath
If you need to update your classpath with JUnit (e.g., if a team member has created a test class that you pulled and you want to run the tests), follow these instructions:
- Right-click the project, select
Build Path -> Configure Build Path
- In the Libraries tab, click on Classpath, and then click on "Add Library...". Select "JUnit", click Next, and then choose JUnit 5 from the dropdown, and click "Finish".
Class Under Test
The Car class defines a model of a car that can go forward or reverse, using appropriate amounts of fuel.
Notice that the Car's methods are stubs--they only contain enough code to make Car compile. I have my own version of this code with real method implementations. However, my code contains faults.
Also notice the class invariants and pre- and post-conditions in the code. They clarify the natural language comments in the code.
Here is the (public) API for the Car class.
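If it helps to picture the class while you plan your tests, below is a purely hypothetical sketch of what a stubbed Car might look like. The constructor, forward, reverse, and getFuel shown here, and the invariants in the comments, are illustrative assumptions only; the actual API is the one linked above.

```java
// Hypothetical sketch only -- the real (public) API is the one linked above.
// Method names, parameters, and invariants here are illustrative assumptions.
public class Car {

    // Example class invariant: 0 <= fuel <= fuelCapacity, and mileage >= 0

    /**
     * Pre:  fuelCapacity > 0
     * Post: fuel == fuelCapacity and mileage == 0
     */
    public Car(double fuelCapacity) {
        // stub -- just enough code to compile
    }

    /**
     * Pre:  distance > 0 and there is enough fuel to travel distance
     * Post: mileage increases by distance; fuel decreases accordingly
     */
    public void forward(double distance) {
        // stub
    }

    /**
     * Pre:  distance > 0 and there is enough fuel to travel distance
     * Post: fuel decreases accordingly
     */
    public void reverse(double distance) {
        // stub
    }

    /** Post: returns the current amount of fuel (never negative) */
    public double getFuel() {
        return 0.0; // stub
    }
}
```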
Part 1: Testing (Team)
Using JUnit, write test cases for the Car class. Put your tests in a separate package called tests. Since you do not have code for the Car class, you will have to imagine what that code might look like to write good tests. You might find it useful to write code for the Car class if you are having trouble imagining what the code might look like.
Goal: Create a set of good test cases that are likely to find errors in the code. Specifically, your test cases should provide the specification for what the code should do. You want to be able to reveal the faults in my code. We've been talking about what it means for a test case and/or a set of test cases to be good: for example, test cases should be automated, should cover all of the code/functionality, should exercise boundary cases, etc.
I'm not giving you the actual code because I don't want you to simply debug my code and write test cases that find those bugs. I want you to write test cases that will reveal ANY and ALL faults in the code.
Enjoy the challenge! How often does a professor ask the students to find and expose their faults?? :)
Creating JUnit test cases in Eclipse
Right-click on the class you want to test, and choose "New -> JUnit Test Case". Put the class in the tests package. Going through that wizard, note the different options you have, e.g., adding the setup methods, adding comments, and which methods you want to test. Select whichever options work for the class you're trying to create. (You will likely create multiple test classes, and the options you pick will likely vary every time.)
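For reference, here is a minimal sketch of what one test class in the tests package might look like, written against the hypothetical Car methods sketched above. Adapt the names, fixture, and assertions to the real API and to your own documented assumptions.

```java
package tests;

import static org.junit.jupiter.api.Assertions.*;

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

// Sketch only: uses the hypothetical Car(double fuelCapacity), forward(double),
// and getFuel() from the earlier sketch, not the actual API.
// (Import the Car class from whatever package it lives in.)
public class CarForwardTest {

    private Car car;

    @BeforeEach
    public void setUp() {
        // Fixture: a freshly constructed car with a full tank
        car = new Car(10.0);
    }

    @Test
    public void constructor_validCapacity_startsWithFullTank() {
        assertEquals(10.0, car.getFuel(), 1e-9);
    }

    @Test
    public void forward_fullTank_usesFuel() {
        car.forward(2.0);
        assertTrue(car.getFuel() < 10.0, "driving forward should consume fuel");
    }
}
```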
Resources
Part 2: Analysis (30% - Individual)
In a text file (e.g., Markdown format) or PDF (NOT a doc, docx, odt, or anything else), answer the following questions thoughtfully, critically, clearly, and concisely:
- List your team name and the names of your team members.
- Describe your testing process. (This is broader than just actually writing the test cases, which is in the next question.) What made the process easy? What made the process difficult?
- How difficult was it to write test cases for this class? What made the process difficult? easy?
- Would it be more or less difficult for you to write test cases for a class that you had already written? Why? Try to think of at least one advantage and one disadvantage to testing your own classes.
- What could you do when writing your own classes to make testing easier? How could you make testing your classes easier for someone else?
- What do you think of the JUnit framework? How easy/difficult was it to write test cases using JUnit? How would the testing process go without JUnit? Did JUnit limit you in any way?
- This is your first collaborative project in CSCI209. How did the collaboration go? What would you do to improve collaboration in future projects?
- How would more or fewer people change your process?
- Approximately how much time did you spend on the project--in your team and on your own? (Amount of time does not affect your grade in any way.)
Submit your analysis in the assignment on Canvas.
Hints and Suggestions
| You ask: | My advice: |
| --- | --- |
| How do I start? | First, start early! Read through the code, start thinking about what needs to be done, and then use a systematic approach to test everything. Think about the different testing approaches and implement those. You should create a checklist on paper or in a README (or similar file) of all the possibilities before you try to code anything. (If you need more help, please come talk with me.) |
| What do I do when the specification is ambiguous? | Make reasonable assumptions and document them in the tests--not the Car class, because I use my own Car class. |
| How many tests should I have? How do I know I'm finished? | Ah, that is the tester's dilemma, isn't it? Use a systematic approach to make sure that you have covered all possibilities. Start early so you have time to come back to the project with a fresh mind to see if you may have missed something. |
| How should I organize my test methods/classes? | There are several different ways you could organize your test classes. (Note you CAN have multiple test classes for one "class under test".) You can categorize your tests into classes by functionality; by fixtures, e.g., preconditions or object state; by the set up required for the objects; or by all pass or all errors. |
| How should I name my test methods? | Name tests clearly and consistently. One suggestion is to name them using the format functionality_state_expectedresult (see the sketch after this table). |
| No really... Where do I start? | The constructor. If you have specified the requirements for the constructor, then it gives you a good foundation to work from. |
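To make the organization and naming advice concrete, here is one hedged sketch: a second test class that groups all reverse-related tests together, with method names in the functionality_state_expectedresult format. It again uses the hypothetical Car methods from the earlier sketch, not the actual API.

```java
package tests;

import static org.junit.jupiter.api.Assertions.*;

import org.junit.jupiter.api.Test;

// Sketch: one test class per piece of functionality (here, reversing).
// Names follow functionality_state_expectedresult. The reverse(), forward(),
// and getFuel() methods are the hypothetical ones sketched earlier.
public class CarReverseTest {

    @Test
    public void reverse_fullTank_usesFuel() {
        Car car = new Car(10.0);
        car.reverse(2.0);
        assertTrue(car.getFuel() < 10.0, "reversing should consume fuel");
    }

    @Test
    public void reverse_afterDrivingForward_usesMoreFuel() {
        Car car = new Car(10.0);
        car.forward(3.0);
        double fuelBeforeReversing = car.getFuel();
        car.reverse(1.0);
        assertTrue(car.getFuel() < fuelBeforeReversing);
    }
}
```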
FAQ - updated as I get new questions
Possibly Helpful Exception Classes
- IllegalArgumentException: Thrown to indicate that a method has been passed an illegal or inappropriate argument.
- IllegalStateException: Signals that a method has been invoked at an illegal or inappropriate time. In other words, the Java environment or Java application is not in an appropriate state for the requested operation. (See the sketch after this list for one way to test for these.)
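As one hedged example of how these might show up in your tests, JUnit 5's assertThrows can check that a method rejects a bad argument or refuses to run in a bad state. The distances and exception choices below are assumptions; replace them with whatever the specification (and your documented assumptions) actually call for.

```java
package tests;

import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

// Sketch only: assumes the hypothetical Car(double fuelCapacity) and
// forward(double) from the earlier sketch, and assumes which exception the
// specification calls for in each case.
public class CarExceptionTest {

    @Test
    public void forward_negativeDistance_throwsIllegalArgument() {
        Car car = new Car(10.0);
        // Assuming a negative distance counts as an illegal argument
        assertThrows(IllegalArgumentException.class, () -> car.forward(-1.0));
    }

    @Test
    public void forward_notEnoughFuel_throwsIllegalState() {
        Car car = new Car(1.0);
        // Assuming asking the car to travel farther than its fuel allows means
        // it is in an inappropriate state for the requested operation
        assertThrows(IllegalStateException.class, () -> car.forward(1_000_000.0));
    }
}
```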
Submission
I have access to your team's repository. Everything in the main branch will be graded. Make sure all of your code is in the repository.
Grading (8% of Final Grade)
You will be evaluated based on the following criteria (300 pts):
- Complete set of good test cases
- Ability of test cases to reveal faults
- You must commit some code to our shared repository on GitHub
- Analysis of testing (30%)