Published on Save Access (http://saveaccess.org)

When Capacity Is Never Enough

By saveaccess
Created 03/25/2008 - 7:24am

from: MultiChannel News

COVER STORY: When Capacity Is Never Enough
The Cautionary Tale Of Video Downloads
By Tom Steinert-Threlkeld -- Multichannel News, 3/22/2008 5:34:00 AM

WASHINGTON, D.C. -- A bit is a bit is a bit. An electronic packet of data is a packet is a packet. Unless you look inside to see what it contains.

Yet Internet access providers such as Comcast and Time Warner Cable, the nation’s two largest cable operators, are loath to open the envelopes.

Even if those packets, generated by a relatively small number of users, carry illegally obtained goods. And, in the process, hog bandwidth.

“How do you distinguish, without invading the privacy of the people communicating?” asked FCC commissioner Jonathan Adelstein at last week’s Internet Video Policy Symposium in Washington, D.C. “It’s very hard to distinguish, just looking at packets, whether they’re legal or illegal.”

What Adelstein was talking about was Internet video – and more specifically, the sharing of movies, TV shows and other content by a technique called peer-to-peer sharing.

In effect, subscribers to a cable or telephone company’s Internet service use their own machines – and those of anyone who wants to share shows – to manage the distribution of whatever they want to watch. Unchecked, the Net users who spread the task across known and unknown collaborators have become the bane of network operators, who see no way to limit the ability of these Net video hogs to usurp the bandwidth capacity they have spent billions to install.

Sharing a single high-definition movie can occupy 25 million bits of capacity a second until it is completed. A standard-definition show can take up 6 million bits a second. This isn’t a job for your garden-variety 256,000-bit-per-second “broadband” connection anymore.
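
For a rough sense of scale, here is a back-of-the-envelope calculation of how long programs at those bit rates would take to move over such a connection. The 90-minute runtime is an assumption for the sake of the example; only the bit rates come from the figures above.

```python
# Rough, illustrative arithmetic only: how long the bit rates above imply a
# program would take to transfer over a 256 kbps connection. The 90-minute
# runtime is an assumed value, not a figure from the article.

HD_BITRATE = 25_000_000      # bits per second, per the article
SD_BITRATE = 6_000_000       # bits per second, per the article
LINK_SPEED = 256_000         # bits per second ("garden-variety" broadband)
RUNTIME_S = 90 * 60          # assumed 90-minute program

for label, bitrate in (("HD", HD_BITRATE), ("SD", SD_BITRATE)):
    total_bits = bitrate * RUNTIME_S
    transfer_s = total_bits / LINK_SPEED
    print(f"{label}: {total_bits / 8 / 1e9:.1f} GB, "
          f"{transfer_s / 3600:.1f} hours over a 256 kbps link")
```

At those rates, the high-definition file works out to roughly 17 gigabytes – well over a hundred hours of transfer time on the older connection.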

The solution?

If you’re a policymaker such as Adelstein, the answer looks like … more bandwidth.

But the answer is not that simple, according to cable network operators. With peer-to-peer file-sharing of video over the Internet, no capacity is enough. And network owners would spend huge sums – more billions – to serve the voracious appetites of as little as 5% of their subscriber base.

THE VIDEO HOG
“Video can be a real bandwidth hog sometimes,” Adelstein said Tuesday night. “Of course, one great solution for that problem is adding more capacity.”

If that were accomplished, there would be no debate about the P2P file-sharing problem. You don’t see this kind of problem in Japan, he contends, where Internet users already have connections that transfer 100 million bits of data every second.

But, wait. It’s not that cut and dried, says Comcast. In the face of this form of hogging capacity, no capacity is enough. Invest in more bandwidth – and it’ll almost by definition be overtaken by file-sharing, immediately.

“Because these P2P protocols are designed to devour any and all available bandwidth on the network, it is not possible to build one’s way out of the need for reasonable network management,” Comcast public policy counsel Joseph W. Waz Jr. and a phalanx of other Comcast lawyers argued to the Federal Communications Commission in a Feb. 12 filing of comments on broadband industry practices.

How can that be? It’s relatively straightforward, according to Sena Fitzmaurice, senior director of corporate communications and government affairs for Comcast in the nation’s capital.

The primary program used for efficient distribution of files, BitTorrent, splits the work among available personal computers. A few central servers “seed” participating personal computers with content, and those computers in turn seed others. In the end, each computer becomes part of a one-for-all, all-for-one union using bandwidth in the last mile of a cable network to distribute heavy-duty content. To move the video or other big files, the seeded computers ask for as much capacity as possible to deliver the final result as quickly as possible. And since much of the work is taking place in neighborhoods and not on the backbones of networks, available capacity gets maxed out more rapidly.
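
What follows is a deliberately simplified Python sketch of that swarm behavior – a couple of seeds and a handful of downloaders trading pieces of a file, with every transfer riding on someone’s last-mile link. The peer counts, piece counts and per-peer upload limits are hypothetical, and the piece selection here is random rather than BitTorrent’s actual rarest-first algorithm.

```python
# A deliberately simplified sketch of the swarm behavior described above:
# a few "seed" peers hold all pieces of a file, every other peer grabs any
# missing piece from any peer that has it, and each transfer consumes
# upload capacity on the sending peer's last-mile link. All numbers are
# hypothetical; this is not BitTorrent's actual piece-selection algorithm.
import random

NUM_PIECES = 100
SEEDS = 2
LEECHERS = 10
UPLOADS_PER_PEER_PER_ROUND = 4   # assumed per-peer upstream slot limit

# Seeds start with every piece; leechers start with none.
have = ([set(range(NUM_PIECES)) for _ in range(SEEDS)] +
        [set() for _ in range(LEECHERS)])

round_no = 0
while any(len(h) < NUM_PIECES for h in have):
    round_no += 1
    upload_slots = [UPLOADS_PER_PEER_PER_ROUND] * len(have)
    transfers = 0
    for peer, pieces in enumerate(have):
        missing = list(set(range(NUM_PIECES)) - pieces)
        random.shuffle(missing)
        for piece in missing:
            # Ask any peer (seed or fellow leecher) that has the piece
            # and still has a free upload slot this round.
            sources = [p for p, other in enumerate(have)
                       if piece in other and upload_slots[p] > 0 and p != peer]
            if sources:
                src = random.choice(sources)
                upload_slots[src] -= 1
                pieces.add(piece)
                transfers += 1
    print(f"round {round_no}: {transfers} piece transfers "
          f"(each one crossing a last-mile link)")
```

Even in this toy version, the number of transfers per round is limited only by the peers’ own upload slots – which is the sense in which the protocol will take whatever capacity a network offers.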

In effect, when it comes to downloading video for later playback, as opposed to streaming it for on-demand playback (“I Want My Web TV,” Part 1, Net Video Hogs, March 17), it’s BitTorrent – and its adherents – that get defined as a bandwidth hog by network operators and economic analysts.

“It’s a question that answers itself, doesn’t it?” said former FCC chief economist Gerry Faulhaber, now a professor at the Wharton School of Business at the University of Pennsylvania, at the symposium. “It’s not called BitTrickle, after all.”

By one count in the middle of last year, streaming audio and video on the Web overtook peer-to-peer applications as the top consumer of bandwidth on Internet networks, accounting for 46% of all traffic. That came after four years of P2P overwhelmingly consuming the most bandwidth, according to Ellacoya Networks.

But even with that report indicating that P2P was “losing its status as the biggest bandwidth hog on the Net, that’s not what we’re seeing,” said David Vorhaus, an analyst with Yankee Group, a telecommunications consulting firm. According to Vorhaus, cable operators continue to report that 60%-75% of the traffic on their Internet networks is being generated by P2P file-sharing.

And, as with streaming video, the usage boils down to a very small number of outsized users. Vorhaus estimates that 5%-10% of Internet users are generating 80%-90% of this P2P traffic. In 2006, research conducted by CableLabs’ Terry Shaw and Jim Martin, a computer science professor at Clemson University, found that it takes only about 10 BitTorrent users trading files on a network node serving around 500 houses to double the delays experienced by all other users – especially if the BitTorrent users are taking in video and everybody else is using “normal priority” services, like e-mail or Web surfing, which are characterized as “best-effort” traffic.
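
As a rough illustration of why a handful of sustained uploaders can double everyone else’s delays, the sketch below uses a textbook M/M/1 queueing approximation, in which the average delay on a shared link is inversely proportional to the capacity left over after the offered load. The capacity and load figures are hypothetical and are not taken from the Shaw-Martin study.

```python
# Illustrative queueing arithmetic only -- not the Shaw/Martin model. Under an
# M/M/1 approximation, average delay on a shared link is 1 / (capacity - load),
# so a fairly small slice of sustained P2P traffic can double the delay
# everyone else sees. All numbers below are hypothetical.

CAPACITY_MBPS = 38.0          # assumed shared capacity on a node
BASELINE_LOAD_MBPS = 19.0     # assumed aggregate "best-effort" load
P2P_LOAD_MBPS = 9.5           # assumed sustained load from a few P2P users

def mm1_delay(capacity, load):
    """Mean time in system for an M/M/1 queue, in seconds per megabit."""
    assert load < capacity, "load must stay below capacity"
    return 1.0 / (capacity - load)

before = mm1_delay(CAPACITY_MBPS, BASELINE_LOAD_MBPS)
after = mm1_delay(CAPACITY_MBPS, BASELINE_LOAD_MBPS + P2P_LOAD_MBPS)
print(f"delay without P2P load: {before * 1000:.1f} ms per megabit")
print(f"delay with P2P load:    {after * 1000:.1f} ms per megabit")
print(f"ratio: {after / before:.1f}x")
```

With these assumed numbers, adding steady P2P traffic equal to a quarter of the link’s capacity is enough to double the delay every best-effort user sees.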

Online, BitTorrent has tried to refashion itself as a legitimate purveyor of video content. Its BitTorrent Entertainment Network, akin to an iTunes store, allows legitimate downloads of more than 10,000 movies, TV shows, games and music videos. And BitTorrent is marketing its technology to businesses that want to distribute video. Its first client was Brightcove, an online distributor that serves publishers like Reuters, National Geographic and TMZ.

But most experts, such as Vorhaus and Faulhaber, believe that the vast majority of traffic on the Net using BitTorrent technology still is composed of Hollywood movies, TV shows and other video content that are being illegally distributed in violation of copyright law. Indeed, 90% of P2P downloads are still of illegally copied content, according to David Hahn, vice president of product management at SafeNet Inc., which tracks the networks.

Which means doubling the capacity of a cable system or telephone network to accommodate more traffic would -- if this continues to be the case -- only support further theft, in effect.

PAMELA’S PARTS
Not managing the flow of this traffic could be life-threatening, Faulhaber maintains. Under a regime of “neutral” management of all traffic on the Net, the latest Baywatch video would have the same priority as the transfer of your personal medical history to a hospital the minute you collapse at a restaurant. “Pamela Anderson’s parts are not as important as your heart,” said Faulhaber.

But increasing capacity to support P2P traffic may be inevitable – because legitimate uses of the technology could skyrocket. Using local computers, not just central servers, to distribute video content can be extremely efficient. And if large for-profit enterprises such as Amazon, Netflix, Blockbuster or The Walt Disney Co. – or all of the above – become heavy users of the technology for video download services, bandwidth would be under heavy pressure. “That’s the real concern,” said Vorhaus.

The pressure will grow. Video is just starting to become widely downloaded from – or uploaded onto – the Internet. “Tomorrow’s problem is the same problem as today, but on a larger scale,” Vorhaus said.

To that point, Comcast has tried to curb the current herd of download hogs by focusing on the uploads that fuel the sharing of files over swarms of personal computers.

To try to limit the load on its network, Comcast, according to its FCC comments, manages only the uploading of files that takes place when the customer is not downloading at the same time. Such “unidirectional sessions” indicate an automated file-sharing process under way; they are held up until “usage drops below an established threshold” of simultaneous sessions.
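
A minimal sketch of that threshold idea, assuming a simple per-segment session counter: upload-only sessions beyond an assumed limit are deferred until an active session finishes. The threshold value and session model are hypothetical; this is not Comcast’s actual implementation.

```python
# A minimal sketch, under stated assumptions, of the threshold approach the
# filing describes: upload-only ("unidirectional") P2P sessions are deferred
# while the number of simultaneous sessions on a segment exceeds a limit.
# The threshold and session model are hypothetical, not Comcast's own code.
from collections import deque

SESSION_THRESHOLD = 8   # assumed max simultaneous unidirectional sessions

class SegmentManager:
    def __init__(self, threshold=SESSION_THRESHOLD):
        self.threshold = threshold
        self.active = set()      # session IDs currently uploading
        self.deferred = deque()  # sessions held until usage drops

    def request_upload(self, session_id, is_unidirectional):
        # Bidirectional traffic (a customer also downloading) passes untouched.
        if not is_unidirectional or len(self.active) < self.threshold:
            self.active.add(session_id)
            return "allowed"
        self.deferred.append(session_id)
        return "deferred"

    def session_finished(self, session_id):
        self.active.discard(session_id)
        # As usage drops below the threshold, release held sessions.
        while self.deferred and len(self.active) < self.threshold:
            self.active.add(self.deferred.popleft())

# Example: the ninth upload-only session is held until one finishes.
mgr = SegmentManager()
for i in range(9):
    print(i, mgr.request_upload(i, is_unidirectional=True))
mgr.session_finished(0)
print("after one finishes, active sessions:", sorted(mgr.active))
```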

This is akin to smoothing out the flow of cars onto a highway, through the use of temporary stop lights at on ramps, Comcast contends.

In effect, Comcast has tried to curb the effects of the use of a peer-to-peer file-sharing application, without condemning the application itself.

But why?

“Why not target the thing that is causing the problem?” said Scott Wallsten, senior fellow at the Georgetown Center for Business and Public Policy, about BitTorrent. “What we have is a problem in waiting.”

Normal network management technique or not, the Comcast practice of delaying BitTorrent traffic at peak times has drawn scrutiny from the Federal Communications Commission. After the Associated Press termed the practice “blocking” of data, a coalition of consumer groups and law professors filed a complaint at the FCC, claiming Comcast had violated the agency’s two-year-old net neutrality policy statement and should be ordered to stop and fined up to $195,000 per affected customer. Comcast has about 13 million high-speed data customers.

“Comcast does not block any Web site, application, or Web protocol, including peer-to-peer services, period,” Comcast executive vice president David Cohen responded. “What we are doing is a limited form of network management objectively based upon an excessive bandwidth-consumptive protocol during limited periods of network congestion.”

The commission subsequently opened a formal investigation of the practice; and held a hearing a few weeks ago at the Harvard Law School into the fairness of the practice.

The commission’s chairman, Kevin Martin, says two aspects of the case trouble him: in his view, Comcast at first denied the allegations; and it was alleged that Comcast altered certain user information in packets to effect a delay in peer-to-peer transmissions. Martin says a ruling will come by July 1.

Comcast, for its part, told the commission in its February comments that many of the complaints about the effects of the “blocking” had nothing to do with blocks or delays. Among the complaints: that users couldn’t check email when sending files from home to work; and that chat services weren’t working properly, allegedly because of bad network management.

THE CAPACITY PANACEA
“These commenters’ calls for Commission intervention are misplaced,” Waz and the Comcast attorneys said. “Surely, the Commission has neither the resources nor the ability to turn itself into the help desk for 60 million broadband households.”

The solution again, at first blush, looks like this: add capacity. Then you could handle all bits equally, without delay.

That “would also deal with the network neutrality issues. The more capacity, the less of an issue it becomes,” Adelstein said.

But capacity can be a chimera, says Faulhaber. Just look at Japan, a country that Adelstein cites as an exemplar of serving net video hogs and average users at the same time.

Japanese consumers are already used to getting access to data on the Internet at the rate of 100 million bits a second – the rate touted by Comcast CEO Brian Roberts at the 2007 Cable Show as the imminent signal speed of “wideband Internet access” from cable operators.

Faulhaber notes that even in Japan, 100 Mbps is not enough speed to avoid the kind of network management that has suddenly become Comcast’s cardinal sin.

Even in Japan, delaying the arrival of some content – so-called “traffic shaping” – is common. This is a normal practice that allows more efficient processing of traffic for all users, not just hogs.

“To say we have a neutral network – you never do,” said Faulhaber. “You have to be proactive, to give more users more services on a consistent basis.”

Telcos like Verizon also face bandwidth scarcity problems from P2P applications. The problem, however, is more acute for cable, since the upstream bandwidth of a DOCSIS cable modem – from the subscriber’s computer to the Internet – is just a fraction of its downstream speed.

Meanwhile, Verizon’s public relations stance on the issues involving broadband network management stands in contrast to Comcast’s.

Earlier this month, the telco announced it tested a system called P4P, developed by researchers at Yale and the University of Washington, designed to keep peer-to-peer traffic off the backbone networks of the Internet. This technology more intelligently directs P2P traffic to local peers instead of letting software like BitTorrent try to suck data willy-nilly from all over the world. Verizon claimed that using P4P, 58% of peer-to-peer traffic came from nearby P2P users on its network, compared with 6% before.
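
The sketch below illustrates the general idea of locality-aware peer selection – when choosing peers, prefer those in the same network region so swarm traffic stays off backbone links. It is a simplification, not the actual P4P protocol or its tracker interface, and the region labels and bias weighting are assumptions.

```python
# A rough sketch of the idea behind locality-aware peer selection described
# above: when handed candidate peers, prefer ones in the same network region
# so swarm traffic stays off backbone links. This is a simplification, not
# the actual P4P protocol; the region labels and weighting are assumptions.
import random

def pick_peers(candidates, my_region, count, local_bias=0.8):
    """Return `count` peers, favoring those whose region matches ours."""
    local = [p for p in candidates if p["region"] == my_region]
    remote = [p for p in candidates if p["region"] != my_region]
    chosen = []
    for _ in range(min(count, len(candidates))):
        use_local = local and (not remote or random.random() < local_bias)
        pool = local if use_local else remote
        chosen.append(pool.pop(random.randrange(len(pool))))
    return chosen

candidates = (
    [{"id": f"near-{i}", "region": "metro-east"} for i in range(20)] +
    [{"id": f"far-{i}", "region": "other"} for i in range(80)]
)
selection = pick_peers(candidates, my_region="metro-east", count=10)
local_share = sum(p["region"] == "metro-east" for p in selection) / len(selection)
print(f"{local_share:.0%} of selected peers are local to this network")
```

With a strong local bias and enough nearby peers, most of the selected connections stay inside the region – the kind of shift Verizon reported in its trial.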

Verizon attempted to position the move as an embrace of P2P, pointing out there are legitimate uses of peer-to-peer. Its press release, for example, noted that NBC is using P2P as part of its NBC Direct episode-download service. NBC is an especially good example to use since it has been a vocal critic of P2P networks used to swap copyrighted material, urging the FCC last year to require broadband providers to prevent video piracy.

“No longer the dark-alley distribution system for unauthorized file sharing, advanced P2P delivery networks link content-seekers with licensed files they want... P2P is being mainstreamed by distributors like NBC Universal,” Verizon's announcement said.

The cable industry also says it’s working on such engineering solutions to handle P2P traffic with technology suppliers. And that’s how the issues ought to be resolved, instead of regulators imposing arbitrary rules on broadband providers, National Cable & Telecommunications Association president and CEO Kyle McSlarrow said during a conference call with reporters last Thursday.

“Let the marketplace and the Internet community examine the results of what is, and is not, working,” McSlarrow said.

The management of traffic – and the curbing of bandwidth hogs – is going to get more critical as more types of video uniquely found on the Internet emerge.

Michael Nelson, a visiting professor in the Communications, Culture and Technology curriculum at Georgetown University, for instance, sees a widely used type of “Second Earth” imagery becoming hugely consumptive of bandwidth. This is the kind of faux 3D imagery found in the mating of Google Earth images and Google Maps, which allows an Internet user to see exactly what will face him or her at a particular intersection, building or real-life site – on all sides – before actually getting there in person.

Add to that the Orwellian placement of uncountable video cameras in public places by police, public-safety organizations and private property owners, plus cameras trained on mountainsides and beaches for skiers and surfers, and Nelson and others tracking the effects of all this online video argue there is no alternative to managing networks to husband available capacity. The common presumption is that Internet video will amount to either narcissistic, homemade, personally focused video or high-end, high-quality professional video, when the real result is likely to be a host of new types of video applications that could only exist with a low-cost mechanism of worldwide distribution like the Internet.

The trick, according to Faulhaber, is for movie producers and other video content producers to collaborate with distributors, such as telephone-company TV and cable system operators, to provide a protected channel of high-quality video to legitimate viewers. Otherwise, charging for content will be impossible.

In Hong Kong, the movie industry disappeared, he noted, when movie producers could not be protected from on-the-street or on-screen delivery of illicitly obtained copies of actors’ and directors’ work.

The problem, says Georgetown’s Wallsten, is timing. In essence: Knowing what to do about Net video hogs. And when.

“The Internet changes really fast. Policies can’t,” he said.

Whatever is thought about managing networks for all users’ benefit now is subject to change. Even if “delays” are okay and “blocking” is not, “we might not think this way six months from now,” Wallsten says.


Source URL:
http://saveaccess.org/node/2275