An article on the Kansas City Star's website this week tells of how two of the main Internet providers in the KC area are improving their access speeds in direct response to the arrival of Google Fiber in that market. Comcast is doubling the speeds of its lower-tier customers, upgrading the 25 Mbps service to 50 Mbps and the 50 Mbps service to 105 Mbps, while 105 Mbps customers will move up to 150 Mbps, all at no additional charge. It isn't clear how fast the upgraded Time Warner Cable service will be, since it won't be ready until next year, but I would expect them to at least match the Comcast offer if they want to stay in the game. It's already bad enough that Comcast will have the upgraded service online within a few days, or a week at the outside…
It is worth noting that even with these upgrades the existing providers will be at a considerable disadvantage relative to Google Fiber, which is offering 1,000 Mbps service for roughly the same rates that Comcast will be charging for 150 Mbps. But doubling the access speeds should help, especially if Comcast can delight users with how much faster its service has become while maintaining price parity with both Time Warner and Google. Even more importantly, this could buy the company the time it will need to upgrade its own systems and create even faster services, much cheaper service tiers, or (hopefully) both. The more immediate question, from where I'm sitting, is what has taken them so long?
Even assuming that Google Fiber didn't exist, or that it wasn't coming to the Kansas City area yet, Comcast has been competing directly with Time Warner for some years now, and they have clearly had both the technology and the funds to upgrade their systems in the Kansas City area. It's hard to imagine why they wouldn't have wanted the massive competitive edge that offering service twice as fast as the competition, at a comparable price, would have given them. It's possible that the company's attention was elsewhere, or that they didn't want to get into a price war with their competition, but the most likely explanation is that they were satisfied with the revenue their facilities in this market were already generating, and did not consider the increase in sales that an upgrade would bring to be worth the cost of providing it…
That is, until a new competitor turned up in the market offering a better service than either existing provider and threatening to take away their market share. I can definitely see the arguments in favor of their strategy, if this was in fact a deliberate strategy: maintaining market share by maintaining parity would have made sense, whereas entering into a war over either price or features (access speed) probably didn't. But I'd still have expected them to keep an eye on potential competitors crashing into their market, and to plan accordingly. Unless, of course, they have had this capability all along, and have only started offering the new service for sale because a more powerful competitor appeared…
I’d like to tell you that would surprise me – but I’d be lying…