Refactoring Tools
The early pioneers of refactoring tools (people like William Opdyke, Ralph Johnson, John Brant, and Don Roberts) envisioned a world in which we could look at code that needed a refactoring and simply ask a tool to perform the refactoring for us. In the mid-1990s, John and Don built such a tool for Smalltalk. Software development hasn't been the same since.
After the 1999 publication of Refactoring [F], Martin Fowler challenged tool vendors to produce automated refactoring tools for mainstream languages such as Java. These tool vendors responded, and before long, many programmers throughout the world could execute automated refactorings from their integrated development environments (IDEs). Over time, even die-hard users of programming editors began transitioning to IDEs, largely due to automated refactoring support.
As refactoring tools continue to implement low-level refactorings, like Extract Method [F], Extract Class [F], and Pull Up Method [F], it becomes easier to transform designs by executing sequences of automated refactorings. This has important implications for pattern-directed refactorings because the mechanics for these refactorings are composed of low-level refactorings. When tool vendors automate the majority of low-level refactorings, they will automatically create support for the refactorings in this book.
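To make those low-level building blocks concrete, here is a small before-and-after sketch of Extract Method [F] in Java. The class and method names are invented for illustration; they don't come from any particular tool or codebase.

```java
// Hypothetical example: a report-rendering method before and after
// applying Extract Method [F]. All names here are invented for illustration.
public class InvoiceReport {

    // Before: the detail-formatting logic is tangled into the main method.
    public String renderBefore(String customer, double amount) {
        StringBuilder out = new StringBuilder();
        out.append("Invoice Report\n");
        out.append("Customer: ").append(customer).append("\n");
        out.append("Amount:   ").append(amount).append("\n");
        return out.toString();
    }

    // After: the detail-formatting chunk lives in its own well-named method,
    // which a refactoring tool can extract automatically.
    public String renderAfter(String customer, double amount) {
        StringBuilder out = new StringBuilder();
        out.append("Invoice Report\n");
        appendDetails(out, customer, amount);
        return out.toString();
    }

    private void appendDetails(StringBuilder out, String customer, double amount) {
        out.append("Customer: ").append(customer).append("\n");
        out.append("Amount:   ").append(amount).append("\n");
    }
}
```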
Using automated refactorings to move towards, to, or away from a pattern is completely different from using a tool to generate pattern code. In general, I've found that pattern code generators provide an excellent way to over-engineer your code. In addition, they generate code that doesn't contain tests, which further limits your ability to refactor as and when needed. By contrast, refactoring lets you discover small design improvements you can safely make to go towards, to, or away from a pattern implementation.
Because refactoring is the act of performing behavior-preserving transformations, you might think that you would not need to run your tests after performing an automated refactoring. Well, you do, much of the time. You may have complete confidence in your refactoring tool for some refactorings, while you may not completely trust it for others. Many automated refactorings prompt you to make choices; if you make the wrong choices, you can easily cause your tests to fail (which is another way of saying that the automated refactoring added or removed behavior). In general, it's useful to run all of your tests after refactoring to confirm that the code is behaving as you expect.
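As a minimal sketch of what that looks like in practice, here is a hypothetical JUnit 5 test against the InvoiceReport example above; rerunning tests like this after each automated refactoring gives you evidence that behavior was preserved.

```java
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

// Hypothetical JUnit 5 test for the InvoiceReport sketch shown earlier.
// Rerunning tests like this after each automated refactoring confirms that
// the transformation did not add or remove behavior.
class InvoiceReportTest {

    @Test
    void extractedMethodPreservesReportOutput() {
        InvoiceReport report = new InvoiceReport();
        assertEquals(report.renderBefore("Acme Corp", 150.0),
                     report.renderAfter("Acme Corp", 150.0));
    }
}
```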
If you lack tests, can you trust automated refactoring tools to preserve behavior in your code and not introduce unwanted behavior? You may be able to trust many of the refactorings, while others, perhaps newly released, may be less stable or trustworthy. In general, if you lack test coverage for your code, you really won't have much success with refactoring, unless the tools become substantially more intelligent.
Advances in automated refactorings can impact what steps you follow in the mechanics of a refactoring. For example, a recent automation of Extract Method [F] is so smart that if you extract a chunk of code from one method and that same chunk of code exists in another method, both chunks of code will be replaced by a call to the newly extracted method. That capability may change how you approach the mechanics of a refactoring, given that some of the work from multiple steps may be automated for you.
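Here is a sketch of the kind of result such a duplicate-aware Extract Method might produce; the class and method names are invented, and specific tools may behave differently.

```java
// Hypothetical result (names invented) of a duplicate-aware Extract Method:
// the chunk was extracted from header(), and the identical chunk that
// previously appeared in summary() was also replaced with a call.
public class OrderFormatter {

    public String header(int id, String date) {
        return "ORDER\n" + describeOrder(id, date);
    }

    public String summary(int id, String date, double total) {
        return describeOrder(id, date) + "\nTotal: " + total;
    }

    // The newly extracted method; both callers above now delegate to it.
    private String describeOrder(int id, String date) {
        return "Order #" + id + " (" + date + ")";
    }
}
```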
What is the future of refactoring tools? I hope that we see more automated support for low-level refactorings, tools that suggest which refactorings could help improve certain pieces of code, and tools that allow you to explore how your design would look if several refactorings were applied together.