(Mis)understanding the Pace of Change in Enterprise Software
In a recent article in AllThingsD, Okta CEO Todd McKinnon explains why the pace of change is so often misunderstood in enterprise software, and what that means for CEOs setting company strategy and for VCs investing in new technologies.
Massive changes are taking place in the enterprise, but legacy software remains a reality for most organizations, and will be for years to come, for three main reasons: product cycles, adoption time and entrenchment. Head over to AllThingsD to read the full article.
In Silicon Valley, and high technology in general, there’s a common narrative about how the new disrupts the old, and the old subsequently dies. It’s a compelling narrative, especially in an industry such as technology where fortunes are made in the name of innovation — but it’s important to separate the signal from the noise. That narrative is applied too often and too broadly, leading to faulty company strategies and poor investments.
According to “new replaces old” assumptions, the mainframe computer would be long deceased. We all know that’s not the case. I recently met with the CIO of a large, well-known insurance company, who said the company has spent 20 years trying to reduce its reliance on mainframes. The problem is that those systems still run several complex processes and algorithms built by people who have since retired. Everybody knows that they work, but nobody really knows how they work, which is why they’re still around two decades later.
Shocking as it may seem, enterprise software giant BMC’s mainframe software revenue is on the rise, thanks largely to scenarios such as the one that insurance CIO is dealing with. And remember Lotus Notes? According to the Wall Street Journal, it’s still a $1 billion business, too deeply rooted for large companies to walk away from, and too large for IBM to ignore as it expands into cloud and social software.
To read the rest of the article, visit AllThingsD.