Apple and WebRTC
It was in the news a few days ago: Apple is starting to integrate WebRTC into WebKit, the engine behind Safari. That is a step that should not be underestimated. Why?
Apple is the last major platform vendor that has not integrated WebRTC yet. All the others, including Google/Chrome, Microsoft/Edge and Mozilla/Firefox, have already rolled it out. I never understood what Apple was waiting for. Was it too challenging from a technology point of view? Hard to believe, looking at what Apple delivers. Patents or codecs? Also hard to believe, considering that the others can offer it, even without charging for it. Was it because they thought they could make it with FaceTime? With roughly 20 % of people on the Apple platform, you would end up unable to use FaceTime 80 % of the time; instead you would use Skype on your iPhone. I don't see how that helps Apple.
The impact of Apple finally joining the group of WebRTC platform vendors will be pretty big. Now software vendors can seriously start building platforms that include iPhones, Android, PCs, tablets and whatever else is out there.
The biggest benefit of WebRTC, in my plain old words, is that there will be no more interoperability problems. The SIP standard has evolved into a large number of RFCs that have become really hard to keep track of. Buying any SIP phone and plugging it into any other PBX is an adventure (well, we spend a lot of time making it as smooth as possible with our PBX). For WebRTC, the standard is essentially JavaScript, with a rock-solid foundation when it comes to running the same code on different platforms (if we forget about IE7 for a moment). The code is simply loaded into the client from the site's web server. Even complex features that will never be standardized in any RFC will work without any problems. It will be a beauty.
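Just to give a flavor of how little platform-specific code that involves, here is a minimal sketch of the calling side of a WebRTC audio session. The sendToSignalingServer function is a placeholder for whatever signaling channel the site already uses (a WebSocket, for example), and the STUN server URL is made up:

// Grab the microphone, create a peer connection and send an offer to the other side.
// sendToSignalingServer() is a placeholder for the site's own signaling mechanism.
async function startCall() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.example.com' }] });

  // Send the local audio track to the remote side.
  stream.getTracks().forEach(track => pc.addTrack(track, stream));

  // Hand ICE candidates to the signaling channel as they are discovered.
  pc.onicecandidate = e => { if (e.candidate) sendToSignalingServer({ candidate: e.candidate }); };

  // Play whatever audio the remote side sends back.
  pc.ontrack = e => {
    const audio = new Audio();
    audio.srcObject = e.streams[0];
    audio.play();
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToSignalingServer({ sdp: pc.localDescription });
}

The same script runs unchanged on every platform that ships WebRTC, which is exactly the point.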
Consider Facebook. If they offer a talk function on their platform using WebRTC, they will turn into a telecom company with more subscribers than China Telecom. Why would you call someone over a telephone number if you can do the same with WebRTC? If you are a telecom provider, you need to think really seriously about providing quality data connections when there are 50 RTP packets per second (one every 20 ms, the usual packetization interval for voice codecs) going back and forth. Users will demand this, and they will get it sooner or later.
It is bad news for those who have written "native" applications that run softphones on Windows, macOS, Android or other platforms. There will be no need to install any software on your device any more. And if software does get installed, it will essentially be a sugar-coated HTML5 application, like we have done with the Vodia App.
But we are not there yet. First of all, Apple needs to release WebRTC in its mainstream products. Even more importantly, all of the vendors need to work on usability. It is still a challenge to have a device ring for an incoming call. End users don't understand what the problem is! There needs to be a way to switch between audio devices like speakers, headsets and microphones in JavaScript (see the sketch below). And then we have the problem of keeping the connection to the server alive even when the device needs to save power aggressively. On both the Apple and the Android platforms, some promising solutions are coming up. Users want to be able to receive a call after not using their device for a couple of hours or even days, without goofy keep-alive traffic killing the battery. That might actually have been the reason why Apple was so reluctant about WebRTC: users don't like it when their battery gets drained by idling applications. That is something I can totally understand.
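For the device question, the building blocks do exist in the browser APIs, even if support is still uneven. A rough sketch, assuming an audio element that plays the remote stream (setSinkId for picking the output device is not available in every browser yet):

// List the available audio devices, grouped by direction.
async function listAudioDevices() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  return {
    microphones: devices.filter(d => d.kind === 'audioinput'),
    speakers: devices.filter(d => d.kind === 'audiooutput')
  };
}

// Switch the microphone by requesting a new stream bound to a specific device.
function useMicrophone(deviceId) {
  return navigator.mediaDevices.getUserMedia({ audio: { deviceId: { exact: deviceId } } });
}

// Switch the speaker on a playing audio element, where the browser supports it.
async function useSpeaker(audioElement, deviceId) {
  if (typeof audioElement.setSinkId === 'function') {
    await audioElement.setSinkId(deviceId);
  }
}

What is still missing is the glue that makes this feel like a phone: a consistent way to ring, to pick the right device automatically and to do all of that without waking the radio every few seconds.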
My feeling is that this was the tipping point. From now on, WebRTC will drive the real-time communication business. I don't see any alternative. It will happen, and we want to be part of it!