Recently Jason Bock posed a question to our company about where validation should happen in an application: on the client, on the server, or both? In the past, the pendulum in the world of application development has swung back and forth between thin and thick clients, and as it has, the location of validation has shifted. Green screen terminals put heavy logic on the back end; one- and two-tier applications put the majority of validation in the client. Then came the rise of COM, with the ability to create n-tier applications where the same business logic could be written on the client and the server (I'll ignore the fact that so many people did this poorly).
How did all this happen? Companies were able to innovate unilaterally, and proprietary systems reigned supreme. With the rise of the PC, companies made DBase, FoxPro, and VB; each was independently successful on the PC platform. Several years later, MS could unilaterally put COM in place, and later .Net. These were more than just languages, frameworks, and syntactic sugar; they were designed to enable new paradigms.