I’m up early again. I get ideas in the middle of the night and need to wake up and record them.
Anyway, this is a follow-up to the last post. I think I have a new idea for managing predictions in a software project. It would fit nicely with a program like TargetProcess, but could just as easily work with Trac, Bugzilla, Mantis, FogBugz, Jira, or any other bug tracking system. The key is the source control integration.
In TargetProcess (and others) when you check in your code changes, you usually put the number of the bug or task in the comment field. This is automatically linked to that bug/task. In TargetProcess, you can click a link to see the checkins for that task and then view the diff of the files that were checked in. In other words, you can see the actual work that was produced for that task.
The easy part of time tracking is estimates: developers will put a number in. The hard (impossible) part is getting them to track their time for each bug. They just don't do it, and I think it's lame anyway. As one poster said, "It's a fool's errand." I agree.
In agile methodologies, however, you don't track REAL hours. You track difficulty points, or sometimes "ideal hours". My proposal is to extend this a little and estimate Code Deltas. A Code Delta is defined as the new lines of code that need to be produced, plus the lines erased through refactoring. Maybe a new line of code is worth 1 point and an erased line is worth 0.5 points. (In theory, erasing lines takes less time than producing them, but that might be a bad assumption.)
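To make the scoring rule concrete, here's a minimal sketch of how a tracker could compute Code Delta from a unified diff. The weights (1 point per added line, 0.5 per removed line) come straight from the proposal above; everything else, including the function name, is just illustrative.

```python
def code_delta(diff_text, added_weight=1.0, removed_weight=0.5):
    """Score a unified diff: each added line counts as 1 point,
    each removed line as 0.5 (both weights are tunable)."""
    added = removed = 0
    for line in diff_text.splitlines():
        # Skip the file headers ("--- a/..." and "+++ b/...")
        if line.startswith("+++") or line.startswith("---"):
            continue
        if line.startswith("+"):
            added += 1
        elif line.startswith("-"):
            removed += 1
    return added * added_weight + removed * removed_weight

diff = """--- a/foo.py
+++ b/foo.py
@@ -1,2 +1,3 @@
-old_line()
+new_line()
+another_new_line()
"""
# 2 added lines and 1 removed line -> 2*1.0 + 1*0.5 = 2.5 points
```

Any source control integration that can show a diff already has the raw material for this number.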
Ok, so now the developer says, "I can do that task in 100 points of Code Delta." That estimate goes in the "Estimated Code Delta" field. They go about the chore, check in their code, and close the bug. The bug tracking system picks up the checkin, automatically calculates how many REAL points of Code Delta there are, and puts that number in the "Actual Code Delta" field. Now fast forward 10 more tasks. The developer starts to see the difference between their own estimates and the actual Deltas. According to agile methodology gurus, they would get better at estimating. I believe the developer would be off by about the same percentage every time.
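The glue between the checkin and the task is the bug number in the commit comment, just like TargetProcess does it. A sketch of that step, assuming a "#1234"-style reference and a hypothetical in-memory stand-in for the tracker's task records:

```python
import re

# Hypothetical stand-in for the bug tracker's task store:
# task id -> estimated and actual Code Delta fields.
tasks = {1234: {"estimated_delta": 100.0, "actual_delta": 0.0}}

def record_checkin(commit_message, delta):
    """Find a task reference like '#1234' in the commit message and
    add this checkin's Code Delta to that task's actual total."""
    match = re.search(r"#(\d+)", commit_message)
    if not match:
        return None  # no task referenced; nothing to record
    task_id = int(match.group(1))
    tasks[task_id]["actual_delta"] += delta
    return task_id

# A checkin worth 85.5 delta points, tagged with the task number.
record_checkin("Fix null check in parser, refs #1234", 85.5)
```

The developer never fills in a timesheet; the comment they were already writing does all the work.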
Here is the beauty part. I believe there is a correlation between Code Delta and time spent. In other words, this is a pure measure of time spent without ever asking the developer to track their time. Code Delta is a personal metric as well: one developer might take 100 lines of code and 1 hour to complete a task, while another takes 100 lines and 4 hours. In agile methods, you give each developer a budget of how many Code Deltas they can do in a sprint, and these actual numbers would provide that metric easily, per person.
FogBugz has a great feature that automatically calculates Time Spent versus Time Estimated for each developer. (They do have to track time for this to work.) It then helps the project manager with charts on how accurate each developer's estimates have been over time. If it switched to Code Delta, it could provide the exact same value, BUT the developer would never have to track their time. They just need to check in their code with the proper bug numbers noted.
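The accuracy calculation itself is trivial once estimated and actual Deltas are in the tracker. Here's a sketch, under the post's own assumption that a developer is off by roughly the same percentage every time, so the ratio converges and can be used to correct future estimates (the numbers in the example are made up):

```python
def accuracy_ratio(history):
    """history: list of (estimated_delta, actual_delta) pairs for one
    developer. Returns actual/estimated across all tasks; values above
    1.0 mean the developer habitually underestimates."""
    estimated = sum(e for e, _ in history)
    actual = sum(a for _, a in history)
    return actual / estimated if estimated else None

# Three finished tasks: estimated 230 points total, actually 300.
history = [(100, 130), (50, 70), (80, 100)]
ratio = accuracy_ratio(history)   # 300 / 230, about 1.30

# Scale the next raw estimate by the personal ratio.
corrected = 120 * ratio
```

That per-developer ratio is exactly the kind of chart FogBugz draws from timesheets, derived here from checkins alone.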
Most bug tracking systems already have source code integration. This could make developers' lives much easier, accurately show the time spent on tasks, and help developers estimate better.
This is the part where I close my eyes and wish a talented developer would make this dream come true. Thoughts?