Yes - an obvious UI should reduce the need for documentation. Are you documenting every single screen - and is it really useful?
We split everything into a few buckets:
* Proper and Intended Use of the product
* End User Training
* Suggested workflow and use (a how-to for accomplishing important tasks)
If users are unable to accomplish their work without reading the documentation - then there is a problem. Our documentation went down from "feet thick" to a small "1 cm thick" manual. Removing duplication and splitting the docs into role-based manuals helped keep changes to a minimum.
Of course - if the UI changes that drastically every year, are the customers happy? It sounds like there's a huge investment required from the customer base to re-learn the product every year. At some point I'd get tired of that and slow down how often I upgraded...or go looking for a less complicated product.
To answer your general question: Yes - it is possible and you will be successful in doing it.
A wider question - not asked, but one we all inferred: it sounds like some change control needs to happen.
Good luck.
Yes - thank you for this - I'll add my follow on here.
I've heard people speak of this difference as Computer Scientist (strategy/concepts) vs a Computer Engineer (code monkey/skills)
Your job will be to solve problems - what language you use is secondary. Solving the problem efficiently is more important than the language you use. My current company outsources the code monkey/maintenance jobs to the lowest bidder and retains the educated people to solve the hard problems.
Differences in teaching methodology, via an example: I recently took a relational database class at the local University (my alma mater), and a younger friend did the same at a local "skills" college. I learned Relational Algebra & Calculus - how one mathematically reduces a statement to find the shortest/fastest "plan" - brushed up on set theory, and saw how modern "search" is really done. He learned SQL syntax and how to write/type SQL. I also looked at his C#/Java class and he was learning syntax - whereas I remember learning Linked Lists, the differences between Asm/Lisp/C/Prolog (yea - a while ago), Functional vs Imperative vs Logic vs... etc., and syntax came only as part of learning the concepts and visiting each language.
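That plan-reduction idea can be sketched in plain code. This is my own toy illustration (table names and data are made up, not from any course): filtering before a join is algebraically equivalent to filtering after it, but touches far fewer rows - exactly the kind of rewrite the relational algebra formalizes and an optimizer applies.

```python
# Hypothetical tables with made-up data: 10,000 orders, 100 customers.
orders = [{"cust_id": i % 100, "total": i} for i in range(10_000)]
customers = [{"cust_id": i, "region": "EU" if i < 10 else "US"}
             for i in range(100)]

def join_then_filter():
    # Naive plan: join everything, then select region = 'EU'.
    joined = [(o, c) for o in orders for c in customers
              if o["cust_id"] == c["cust_id"]]
    return [(o, c) for o, c in joined if c["region"] == "EU"]

def filter_then_join():
    # Equivalent plan: push the selection below the join,
    # so the join only sees 10 customers instead of 100.
    eu = [c for c in customers if c["region"] == "EU"]
    return [(o, c) for o in orders for c in eu
            if o["cust_id"] == c["cust_id"]]

# Both plans produce the same answer; only the work differs.
assert join_then_filter() == filter_then_join()
```

Same result, very different cost - and that difference is provable on paper before you ever run it, which is the point of learning the algebra rather than just the syntax.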
Coding-wise, when I went to University, Java didn't exist (actually, ANSI C had just become - well - ANSI Standard C; my K&R book was stamped "NEW! Updated for ANSI-C"). But I learned what garbage collection was - in the class on memory management and CPU architecture: what a Heap/Stack is, and why approaches such as garbage collection are useful (including algorithms for multi- vs single-pass culling). There were little 1-credit classes to learn specific languages (e.g., later on, C++). Heck - I even learned how an ALU physically works in my EE class (that was way cool! A light bulb went on and I switched from EE to CS).
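To make the garbage-collection idea concrete, here's a toy mark-and-sweep sketch over a made-up object graph (the object names and the graph are hypothetical, purely for illustration): mark everything reachable from the roots, then sweep the heap in a single pass to reclaim the rest.

```python
# Toy mark-and-sweep collector. Each "object" is a name; the heap
# maps each object to the list of objects it references.
heap = {
    "a": ["b"],        # a references b
    "b": ["c"],        # b references c
    "c": [],
    "orphan": ["c"],   # referenced by nothing -> garbage
}
roots = ["a"]          # e.g. stack/global references

def mark(roots, heap):
    """Mark phase: find everything reachable from the roots."""
    marked = set()
    stack = list(roots)
    while stack:
        obj = stack.pop()
        if obj not in marked:
            marked.add(obj)
            stack.extend(heap[obj])
    return marked

def sweep(heap, marked):
    """Sweep phase: one pass over the heap, dropping unmarked objects."""
    return {obj: refs for obj, refs in heap.items() if obj in marked}

heap = sweep(heap, mark(roots, heap))
print(sorted(heap))  # ['a', 'b', 'c'] - 'orphan' was collected
```

Real collectors are vastly more sophisticated (generations, compaction, concurrency), but the reachability core is this.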
The best classes were the Analysis of Algorithms & Data Structures. While I hated it at the time, learning what O(n) means has turned out to be very helpful - especially when applied to other concepts like bandwidth and latency. A lot of "new" programmers don't understand latency and believe trips across the wire are just fine - yo, make as many as you'd like. In my day a trip across the wire was from the CPU to main memory; nowadays it is from browser to web server. However, with proper training one learns to make only "as few as necessary".
If your method of solving a problem can't possibly go faster, fiddling with the code will only improve it by single-digit percentages. Knowing why this is, and finding the better algorithm or mathematical simplification/reduction, can improve execution time by factors of two or three - or sometimes exponentially - and thus make you a better asset.
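A classic illustration of that point (my own example, not from any particular class): summing 1..n with a loop is O(n), and no micro-tuning of the loop changes that. Gauss's closed form does the same job in O(1) - the improvement comes from the mathematical reduction, not from fiddling with the code.

```python
def sum_loop(n):
    # O(n): tune this all you like, the growth rate stays linear.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_formula(n):
    # O(1): the mathematical reduction n*(n+1)/2.
    return n * (n + 1) // 2

# Same answer, wildly different cost as n grows.
assert sum_loop(10_000) == sum_formula(10_000) == 50_005_000
```

For n in the billions the loop version is simply not an option, while the formula doesn't even notice.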
After Goliath's defeat, giants ceased to command respect. - Freeman Dyson