- 1: Start Development with Software Requirements
- 2: Honor Your Users and Communicate with Them Often
- 3: Don't Allow Unwarranted Requirements Changes
- 4: Invest Up Front in Software Architecture
- 5: Don't Confuse Products with Standards
- 6: Recognize and Retain Your Top Talent
- 7: Understand Object-Oriented Technology
- 8: Design Web-Centric Applications and Reusable Components
- 9: Plan for Change
- 10: Implement and Always Adhere to a Production Acceptance Process
7: Understand Object-Oriented Technology
Every key software developer, architect, and manager should clearly understand object-oriented technology. We say object-oriented technology rather than object-oriented programming because one doesn't necessarily imply the other. Many C++ and Java programmers write in an object-oriented language without any in-depth knowledge of object-oriented technology; apart from the syntax, their code probably looks much like the applications they previously wrote in FORTRAN, C, or COBOL.
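The difference is easiest to see in code. Below is a minimal Java sketch (the `Shape`, `Circle`, and `Square` names are ours, purely for illustration): the first version compiles as Java but is organized like a C program, with type codes and a switch statement; the second models the same problem with encapsulation and polymorphism, so new shapes can be added without editing existing code.

```java
// Procedural style: Java syntax, but the design is essentially C.
// Data and behavior are separate; adding a new shape type means
// editing every switch statement that inspects the type code.
class ProceduralStyle {
    static final int CIRCLE = 0;
    static final int SQUARE = 1;

    static double area(int shapeType, double dimension) {
        switch (shapeType) {
            case CIRCLE: return Math.PI * dimension * dimension;
            case SQUARE: return dimension * dimension;
            default:     throw new IllegalArgumentException("unknown type");
        }
    }
}

// Object-oriented style: each shape encapsulates its own data and
// behavior. A new shape is a new class; existing code is untouched.
interface Shape {
    double area();
}

class Circle implements Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    public double area()  { return Math.PI * radius * radius; }
}

class Square implements Shape {
    private final double side;
    Square(double side)  { this.side = side; }
    public double area() { return side * side; }
}

public class AreaDemo {
    public static void main(String[] args) {
        // Both approaches compute the same result, but only the
        // second exploits what the language actually offers.
        System.out.println(ProceduralStyle.area(ProceduralStyle.CIRCLE, 2.0));

        Shape[] shapes = { new Circle(2.0), new Square(2.0) };
        for (Shape s : shapes) {
            System.out.println(s.area()); // polymorphic dispatch
        }
    }
}
```

A programmer who writes the first version is merely using an object-oriented language; only the second reflects an understanding of object-oriented technology.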
While object-oriented technology is not a panacea for software developers, it's important enough that the key engineers in every development organization should understand it. Even if your organization has no ongoing object-oriented development projects, you should have people who understand the technology. First, without understanding it you'll never know whether it's appropriate for your next project. Second, because of the long learning curve associated with object-oriented technology, organizations need to invest in it well before they undertake their first major project. While object-oriented programming syntax can be learned in a few weeks, becoming skilled at architecting object-oriented solutions usually takes 6 to 18 months or more, depending on the initial skill set of the software engineer.