What Filesharing Studies Really Say – Part 16 – Focus on Adaptation, Not on Shuttering P2P

This is part 16 of the re-publication of my meta-analysis on what filesharing studies really say.

[Originally published on ZeroPaid in May of 2012.]

We are starting to near the end of our series. While we’ve found quite a lot, there’s still something left for this series to give. The study of the day seems to follow a theme from previous studies: focus on adaptation, not on enforcing copyright laws against file-sharing networks.

The study we are looking at today is called “Peer-to-Peer File Sharing and the Market for Digital Information Goods”. It was published in 2010 in the Journal of Economics & Management Strategy.

The study begins with a rather familiar tune:

Technological innovations have not only presented new market opportunities but also new threats. The radio, the cassette player, the video recorder, and the compact disc have allowed the industry to deliver additional value and meet new demand. But these same technologies have also been employed to replicate and distribute content without the consent of copyright holders.

Again, this is a theme that has come up numerous times in the previous studies covered in this series: technology provides both opportunities and challenges for businesses.

This study actually investigated a number of things, so the excerpt describing what the paper was trying to find is a little lengthier:

First, Asvanund et al. (2004) show that congestion worsens as the size of a p2p network grows. Our model generates this result endogenously. In fact, the effect of network size on congestion helps explain the coexistence of multiple p2p networks. The model can also accommodate positive network effects when users value content variety.

Second, many studies have shown that heavy users of p2p file sharing networks are more prone to purchase content online. Our framework not only suggests that there is no contradiction in this observed behavior, but also sheds light on the factors that explain the demand for online content in the presence of a p2p network.

Third, we show that the firm has an additional incentive to charge high prices, compared to what a standard model of vertical differentiation would predict. When prices are high, more consumers prefer to download from the p2p network. As a consequence, congestion increases and the value of the p2p network decreases, increasing the attractiveness of the firm’s product.

Finally, researchers and industry analysts have long questioned the existence of applications that drive broadband demand (“killer apps”). Our model shows that file sharing networks strictly benefit from improvements in broadband capacity, creating value for all participants. A study performed by Internet research firm CacheLogic in 2005 revealed that over 60% of total Internet traffic belonged to p2p file sharing applications. This suggests that file sharing is indeed a driver for broadband demand. We believe that our results should be of interest to all participants in markets for digital information goods.

I have to admit, this is a little different of an argument than what I’m used to seeing. By the logic presented here, file-sharing increases consumer demand, but as congestion grows and per-peer bandwidth in the network decreases, the value of file-sharing decreases and, thus, the incentive for people to use authorized sources increases. This is a curious position because, to my knowledge, as, say, a BitTorrent swarm increases in number of peers, the efficiency of the swarm increases, making the swarm less congested, not more congested. So, I’ll be interested in seeing how the study figures that a busy file-sharing network loses efficiency, because that, by itself, doesn’t make any sense to me (we aren’t referring to the alleged ISP congestion here).
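To illustrate why I find that claim counter-intuitive, here is a rough back-of-the-envelope sketch. This is my own illustration with made-up numbers, not anything from the paper: if a fixed pool of sharers serves everyone, per-peer speed falls as downloaders pile on, but in a swarm-style protocol where every joining peer also uploads, aggregate capacity grows with the swarm and per-peer speed stays roughly flat.

```python
# Illustrative only: hypothetical capacity numbers, not figures from the study.

def per_peer_rate_fixed_sharers(num_peers, num_sharers=100, sharer_upload=1.0):
    """Fixed pool of sharers: total upload capacity is constant, so every
    additional downloader shrinks everyone's share (congestion worsens)."""
    return (num_sharers * sharer_upload) / num_peers

def per_peer_rate_swarm(num_peers, peer_upload=1.0, seed_upload=5.0):
    """Swarm-style (BitTorrent-like): every peer also uploads, so aggregate
    capacity grows with the swarm and the per-peer rate stays roughly flat."""
    return (seed_upload + num_peers * peer_upload) / num_peers

if __name__ == "__main__":
    for n in (100, 1_000, 10_000):
        print(f"{n:>6} peers | fixed sharers: {per_peer_rate_fixed_sharers(n):.4f} "
              f"| swarm: {per_peer_rate_swarm(n):.4f}")
```

Which of the two pictures applies depends on how much uploading ordinary peers actually do, which is presumably why the paper spends time on freeriding later on.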

Later on, the paper says that the study is based on the open Gnutella network. While I can see using a single network in, say, 2003, I don’t see how using a single network is indicative of a trend even in 2010. The paper went on to compare file-sharing to iTunes, which could be summarized by the graphic that was provided:

[Figure: p2p vs. iTunes comparison table]

I think it’s pretty easy to disagree with these comparisons on both sides of the table. For instance, iTunes dropped DRM. Second, I think there will be those who dispute that file-sharing is hard to use (especially something like Frostwire, since we are talking about the Gnutella network). I’ll leave others to touch on the metadata argument as well. Still, while these comparisons aren’t perfect, they aren’t entirely wrong either. For instance, this comment seems accurate to me:

Digital rights management (DRM) restrictions render licensed content an inferior good compared to unlicensed content; only the latter can be played back on any multimedia device and is future-proof compatible.

The study then talks about litigation tactics and mentions that the RIAA argues that file-sharing is destroying the music industry, though section 2 of the study does not say that the RIAA is correct in this argument.

The study then calculates the costs borne by peers in the network, noting that those costs include bandwidth and the risk of getting caught, among other things. After some calculations, a model is introduced, which is this:

To summarize, our model of p2p assumes:
– All peers have 1/θ units of upload bandwidth capacity.
– All peers have at least 1/θ units of download bandwidth capacity.
– Every peer connects to one sharer only.
– A sharer may not connect to herself.
– Upload bandwidth is allocated equably amongst all peers connected to a sharer.
– Second-stage network allocations are stable and equiprobable.
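To make that allocation rule a little more concrete, here is a minimal sketch based on the assumptions quoted above (my own illustration, not code from the paper): every downloader attaches to a single, randomly chosen sharer, and each sharer’s 1/θ units of upload capacity are split equably among the peers attached to it. Averaging over many random allocations shows how the downloader-to-sharer ratio drives congestion in this kind of model.

```python
import random

def average_download_rate(num_sharers, num_downloaders, theta=1.0, trials=2000):
    """Average per-peer download rate under a single-link, equable-split rule:
    each downloader attaches to one randomly chosen sharer, and that sharer's
    1/theta units of upload capacity are divided equally among its peers.
    (An illustrative sketch of the quoted assumptions, not the authors' code.)"""
    upload_capacity = 1.0 / theta
    total = 0.0
    for _ in range(trials):
        # Randomly assign each downloader to exactly one sharer.
        assignment = [random.randrange(num_sharers) for _ in range(num_downloaders)]
        load = [0] * num_sharers
        for s in assignment:
            load[s] += 1
        # Each downloader receives an equal slice of its sharer's upload capacity.
        total += sum(upload_capacity / load[s] for s in assignment) / num_downloaders
    return total / trials

if __name__ == "__main__":
    # More downloaders per sharer -> each peer's slice of capacity shrinks.
    for downloaders in (10, 50, 250):
        rate = average_download_rate(num_sharers=10, num_downloaders=downloaders)
        print(f"10 sharers, {downloaders} downloaders: average rate ~ {rate:.3f}")
```

The sketch treats sharers and downloaders as disjoint groups, which trivially satisfies the “a sharer may not connect to herself” condition; the takeaway is simply that, with a single link per peer and an equable split, the average rate falls roughly in proportion to the downloader-to-sharer ratio.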

I’m struggling to think of a file-sharing network I’ve ever used where every peer connects to only one sharer. You can connect to more than one peer in the ED2K network (eMule). You can connect to more than one peer in Frostwire (client). You can connect to more than one peer in Shareaza (client). You can connect to more than one peer in BitTorrent (protocol). You can connect to more than one peer in the Kademlia network (eMule as well). So, compared to a number of available networks and the clients used to connect to them, this seems like a rather inefficient network on the third point alone.

In fact, the study does admit that there are differences between the model used and actual p2p networks:

Our model is a simplified abstraction of a p2p network. There are two important aspects in which our model differs from real p2p networks. First, we assume that all nodes are homogeneous in capacity. It is well known that in addition to residential users with largely homogeneous connections, there are nodes with high-speed connections (e.g., computers in university dorm rooms) that contribute a large fraction of upload bandwidth and content.

[…]

Second, our model assumes that every peer connects to one sharer only. While this was the case in earlier generations of p2p file sharing networks (such as Napster), new generations (such as eMule and BitTorrent) allow for multiple links. Unfortunately, relaxing the single link assumption renders the problem intractable.

The study does throw in a few more calculations to compensate. The paper then goes on to discuss “freeriding” in the network (downloading, but not uploading). The study then looked at sharing partial files (plenty more math involved).

The study concludes on this point the following:

The model suggests that as network size grows and congestion worsens, peers are better off forming new networks with fewer peers (initially) and faster download speeds. The number of coexisting networks must then be a function of population size and the scalability of p2p technology. Although we do not pursue this research question here, it is worth pointing out that to study the equilibrium number of p2p networks and their sizes one would want to work with a model that captures the positive network effects resulting from having higher content variety in larger networks.

I think there are two real-world responses one could form here. The first response might be from an avid private BitTorrent user, who would say that the networks that were formed were BitTorrent networks on private trackers and would thus agree with the findings. The other argument might be from an old-school file-sharer who still uses these open networks and would point out that if these networks were so inefficient in the first place, they wouldn’t really be functioning any more; yet they do still exist and they are still populated by users. I would personally venture to guess that opinion might be split on this portion of the study.

One problem with calibrating this model was also noted later in the study:

To the best of our knowledge, no empirical estimates exist for the model’s main parameters [in the model]. This makes it difficult to calibrate the model and explain available data on p2p file sharing networks.

In determining the size of BitTorrent, the study had this comment:

For a rough estimate of what the relevant Ns might be, consider the fact that the largest p2p network is BitTorrent with approximately 60% of worldwide p2p traffic, and that a conservative estimate of the share of peers exchanging music in any given p2p network is 33% (the real share is probably lower).[28] Thus, the relevant number of simultaneous peers exchanging music in a given file sharing network is at most 2,000,000.

We left the citation in here because this comment cites “the Ipoque 2008/09 Internet Study”. So, the numbers may be a little dated by today’s standards. Finally, we get to this part of the study which reveals why congestion is a problem in file-sharing:

The explanatory power of endogenous congestion, incentive schemes, social preferences, and unawareness is likely to be different at different stages in the life cycle of p2p networks. Our model suggests that endogenous congestion is likely to play an important role early in the life of a p2p network, before it becomes large, because the impact of sharing on congestion is always large

That makes a lot more sense. What is actually being referred to is a file-sharing network at the very beginning of its life. I can see bandwidth being a problem in small and new file-sharing networks. In that case, unless you are talking about BitTorrent swarms just starting up, it’s a little difficult to see how bandwidth congestion would be a problem for a larger portion of the file-sharing population (and even with new swarms, either the file is popular and the bandwidth issues eventually pass, or the file was never that popular to begin with).

The study then introduces iTunes as an alternative. In discussing this, the study comments:

Because content is free on p2p networks, for the firm to persuade users of digital content to pay positive prices, it must offer added benefits that file sharing networks cannot match. Online distributors such as Apple’s iTunes store offer content on a traditional client server architecture and operate in agreement with intellectual property right holders.

I have a feeling that one-click hosters never really entered the authors’ minds, because cyberlockers also offer a somewhat traditional server-to-client system. Given that this study was published in 2010, we should note that MegaUpload (as merely one example of a cyberlocker) had been around since 2005. Usenet may be another exception, as content is distributed on servers there as well. So, cyberlockers and Usenet may have been exceptions overlooked by the authors.

The study goes on to discuss the relationship between p2p and authorized sources, which has pretty much been covered by the graphic we offered earlier in the article.

The conclusion of this study includes this:

We have presented a simple formal model to analyze some aspects of the interaction between p2p and online stores, two alternative models for the distribution of digital information goods. Although the two models have emerged only recently, it is more likely than not that they will endure. iTunes is legal and has grown spectacularly since its inception. Because p2p file sharing activity is mostly illegal, one is tempted to believe that legal attacks against p2p with the goal to shut them down will continue and eventually succeed. However, due to their decentralized nature, p2p networks have proven difficult to block.

I note that this study does not say that shutting down networks will succeed. It says one is tempted to believe that, but it stops short of actually saying so. The conclusion also says:

The content industry has so far faced the new online paradigm as a threat more than as an opportunity. But the need to embrace digital distribution seems obvious by now; there is no way back to a world of physical distribution only. Because of this transition and the increasing relevance of the online channel to reach consumers, we expect ISPs to play a stronger role in shaping market structure. We also expect the content industry to reassess their revenue models. Changes toward monetizing products not subject to replication, such as the increased attention paid by major record companies to live concerts and merchandising, may be signals of a new trend.

This agrees with a large body of research we’ve covered earlier in this series. Also:

Our formal model is necessarily partial in that it is focused around characterizing the firm’s profit-maximizing pricing strategy. More generally (but less formally), to compete effectively against p2p, online digital distribution must strive to become accessible and attractive to consumers. Online content providers are in a unique position to optimize and deliver new experiences to consumers which cannot be matched by decentralized, self-sustained p2p networks.

[…]

The potential industry-wide revenue implications of p2p are still uncertain. However, our analysis suggests that there is scope for profit-maximizing online distributors and content producer to compete effectively against unauthorized file sharing.

While I may not entirely agree with some of the methodology in this paper, I do generally agree with the conclusions presented (specifically, I disagreed with some points that cause the estimates to both undervalue and overvalue authorized sources). I think there is a body of literature discussed previously in this series that also agrees that finding newer business models is certainly the way to go.

Drew Wilson on Twitter: @icecube85 and Google+.
