Cox Communications, Oklahoma’s largest Internet service provider, has caught the attention of net-neutrality activists with a new system that prioritizes its customers’ Internet use.
In late January, Cox announced it would be testing a new method of traffic management on its high-speed networks in Kansas and Arkansas.
Starting in February, Cox began separating traffic into “time-sensitive” and “non time-sensitive” categories to help ease congestion when its network experiences heavy usage. Time-sensitive traffic, which includes streaming video, Web pages, voice calls and games, will not be slowed down. However, non time-sensitive traffic, including peer-to-peer file sharing, file uploads and software updates, will be slowed down at Cox’s discretion.
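In broad strokes, the approach described amounts to sorting traffic into two queues and serving the time-sensitive queue first when the link is busy. A minimal sketch of that idea, with hypothetical category names (Cox has not published its actual classifier or queue logic):

```python
from collections import deque

# Hypothetical categories based on the examples Cox gave; not the
# company's actual classification rules.
TIME_SENSITIVE = {"video_stream", "web_page", "voice_call", "game"}
NON_TIME_SENSITIVE = {"p2p", "file_upload", "software_update"}

def classify(traffic_type):
    """Label a traffic type; unknown types default to time-sensitive
    so they are never delayed (an assumption, not Cox's stated policy)."""
    if traffic_type in NON_TIME_SENSITIVE:
        return "non time-sensitive"
    return "time-sensitive"

def drain(packets):
    """Return packets in the order a congested link might send them:
    all time-sensitive traffic first, everything else afterward."""
    fast, slow = deque(), deque()
    for p in packets:
        (fast if classify(p) == "time-sensitive" else slow).append(p)
    return list(fast) + list(slow)

# Under congestion, the voice call and Web page jump ahead of the
# peer-to-peer transfer and the software update.
print(drain(["p2p", "voice_call", "web_page", "software_update"]))
```

This is strict priority between two classes; a production system would more likely use weighted queueing so low-priority traffic is slowed rather than starved, which matches Cox's claim that delays are "milliseconds and seconds."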
Cox’s 550,000 high-speed customers in Oklahoma could see the same system implemented later this year. David Grabert, Cox Communications spokesman, said the decision to try a new system is due to changes in Internet habits. Users are streaming online video and uploading rich-media content onto social networking sites more than ever before.
“The way our customers use their Internet service is an evolutionary process,” he said. “We’re always looking at new opportunities to manage our network in a way that will deliver the best possible experience for our customers.”
Grabert said the delays in speed would be practically unnoticeable for the average user.
“The things we have said are ‘non time-sensitive’ will still go through,” he said. “We’re talking about milliseconds and seconds, so it’s not the kind of thing that most customers will notice.”
However small the delay, the new policy is being viewed by many in the technology industry as another blow in the fight for net neutrality, the idea that all Internet traffic should be treated equally.
In July 2008, the Federal Communications Commission ordered the Internet service provider Comcast to stop slowing down peer-to-peer file sharing on its network.
JR Raphael, contributing editor for PC World, has been following the latest situation with Cox closely and sees many similarities between Cox’s new policy and the FCC’s ruling on Comcast.
“The two situations strike me as being eerily similar,” he said. “Comcast was slowing down specific kinds of traffic when its network became too busy. Cox is going to slow specific kinds of traffic when its network becomes congested. Comcast limited peer-to-peer file transfers, large downloads and software updates. Cox is planning to limit peer-to-peer file transfers, large downloads and software updates.”
Raphael said he thinks Cox’s decision to be up-front about its practices is good, but that doesn’t mean the company won’t run into problems.
“The key difference is the level of transparency,” he said. “But it doesn’t change the fact that the underlying principle of what it’s doing is essentially the same. The company is still deciding the importance of your Internet activities and adjusting your speed accordingly, even though you’re paying the same price for the same access as everyone else.”
One fear is that the larger ISPs become, the more control they will have over what information users can access.
“Who’s to say that uploading files to my Web site is any less important than chatting with your friends on Facebook?” Raphael said. “As long as we’re both paying the same price, we should both have the same access. It’s simply not an Internet provider’s place to say one activity is more valid than another and then limit someone’s access as a result.”
Grabert admits that Cox is prioritizing some traffic, but said it is part of a broader plan to intelligently manage an always-growing network.
“We are prioritizing traffic based on what we believe is our customers’ expectation in terms of the experience they would like to have,” he said. “Congestion will happen on any network and it’s kind of dynamic. What we’re trying to do is the most customer friendly approach as we can.”
Ian Rohrback, a freelance computer technician based in Norman, said he sees the new management policy as a good move for Cox as a company, but not for its customers.
“It’s a legitimate business move,” he said, “but one that’s at the expense of their customers. It’s sort of hindering the ability of the people to use their service.”
Rohrback also said he doesn’t think peer-to-peer downloads, one of Cox’s “non time-sensitive” categories, cause much more congestion than other downloads or streaming services.
“From an IT standpoint, peer-to-peer networking has no real effect on a server being bogged down from a more ‘legitimate’ site,” he said. “Streaming from ‘Hulu’ takes about the same bit rate as peer-to-peer.”
AT&T is another major player in the high-speed Internet industry both in Oklahoma and across the country. The telecommunications giant serves more than 16.3 million broadband customers nationwide.
Andy Morgan, AT&T Oklahoma spokesman, said the company does not currently slow down Internet traffic but is experimenting with usage-based models to adapt to increasing demand.
“We don’t downgrade or squeeze or slow down Internet traffic,” Morgan said. “We spell that out in our terms of service with our customers. We’re spending billions of dollars every year to try and stay ahead of customer demand for bandwidth.”
Last November, AT&T began testing a new usage-based system in Reno, Nev., and Beaumont, Texas. Customers receive a monthly usage allowance ranging from 20 to 150 gigabytes, depending on their speed tier, and are charged $1 for each gigabyte over that allowance.
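The billing math behind the trial is simple overage pricing. A sketch, assuming whole-gigabyte metering (the article doesn't say how AT&T handles partial gigabytes):

```python
def overage_charge(used_gb, allowance_gb, rate_per_gb=1.00):
    """Dollars owed beyond the monthly allowance: $1 for each
    gigabyte over the cap, nothing if usage stays under it.
    Whole-gigabyte metering is an assumption for illustration."""
    over_gb = max(0, used_gb - allowance_gb)
    return over_gb * rate_per_gb

# A customer on a 20 GB tier who uses 25 GB would owe $5 extra;
# one who uses 18 GB would owe nothing.
print(overage_charge(25, 20))  # 5.0
print(overage_charge(18, 20))  # 0
```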
“The trial will help us evaluate ways of dealing with surging usage trends while continuing to meet customer needs for a high-quality broadband experience at an affordable price,” Morgan said.
As companies acquire more customers, they are forced to invest huge sums of money into figuring out the best ways to manage and improve their expanding networks. The trend of trying new models, like usage-based and congestion management techniques, may increase in the coming years as the demand for more information at faster speeds continues to grow.
Grabert said Cox’s new system won’t be the ultimate answer to network congestion, but it’s through trial and error that you improve a system.
“This is still just a trial,” Grabert said. “We’re carefully listening to our customers, as well as the folks in the public interest groups, and all of these things will be factored into how we do things going forward.”

James Lovett