Notions of Authenticity: Technology Systems as Representations of Authentic Organizational Form

by Jerome M. Hendricks

My current research explores the actions of intermediary firms in periods of rapid technological change. By asking how new developments in listening to and owning music have changed the music retail industry, I offer the independent record store as a case of such an intermediary. Through a longitudinal multimethod content analysis of media and industry documents, I’ve looked at the evolution of music retail from a variety of angles. Early consideration of the curious position of independent record stores led me to store representatives, their understandings of change, and how they see their role in an industry trending toward digital technologies. Recently, I’ve been thinking more about the extent to which structural adaptation has occurred due to these same market changes. In the absence of detailed procedural histories of music retailers across the country, I identified all types of music retailers in my data set and coded the various technologies they utilize. My conceptualization of technologies draws on the social construction of technology (SCOT), actor-network theory, and performativity literatures, where technologies are practical strategies, objects, practices, and knowledge that legitimate markets and maintain structures over time (Latour 1988; Callon and Muniesa 2005; Callon 2006; MacKenzie 2008; Bijker 2010). As an initial step, I look to the extent to which technology systems might represent organizational forms.

However, articles from my data set like this one made me think more about the nature of contemporary organizational form. As one store closes and another opens, their respective assessments of what constitutes a good record store are noticeably different. And while adaptation seems inevitable given the evolving composition of the market, these small, specialist firms must resonate with target audiences in order to maintain value and legitimacy. Indeed, recent work on organizational form emphasizes the role of these target audiences in a firm’s successful affiliation with a given form (Hsu and Hannan 2005). Projecting an authentic identity serves as a central component in the positive exchange between firms and audiences. Carroll and Wheaton (2009) identify three components of an organization’s identity that facilitate the projection of an authentic form: “(1) an identity claim must be visibly projected; (2) the purported identity must be credible; and (3) the identity must be perceived as reflecting the meaning of authenticity in question” (273). This suggests that (1) identifying dominant technologies (2) utilized by similar store types (3) in specific contexts can offer insight not only into organizational forms but also into the corresponding notions of authenticity legitimized in the field.

To take on this issue of authentic form, I used my coded data consisting of firms, products, and practices to build edgelists for network analysis. The visual representation that network data offers can provide a clear identification of similar firms and their associated technologies: a representation of form. Additionally, the industry and media sources that constitute the network data correspond strongly to salient identities in the minds of members and external audiences. While not every music retailer is represented throughout the sample, I argue that, through the discourse of technologies over time, the output uncovers legitimate membership standards and expectations by which all stores are evaluated. Finally, network analysis allows for the identification of statistically relevant communities, or systems of densely connected nodes, through modularity measures. The algorithm utilized in Gephi, the open source software I used, is the Louvain method, which partitions connected nodes so as to maximize the achievable modularity measure. The resulting example below (color coded for your convenience) shows six technology systems, each containing at least 5% of the total ties in the network. I use my knowledge of the music retail market to interpret these systems of retailers and associated technologies, identifying dominant forms as well as expressions of authenticity.
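For readers unfamiliar with the procedure, the community detection step can be sketched in Python. The store names and ties below are invented for illustration, and NetworkX’s Louvain implementation stands in for Gephi’s; the 5% tie threshold mirrors the cutoff used for the figure:

```python
import networkx as nx
from networkx.algorithms.community import louvain_communities

# Hypothetical edgelist: (retailer, technology) ties coded from articles.
edges = [
    ("StoreA", "T18"), ("StoreA", "T26"), ("StoreB", "T18"),
    ("StoreB", "T35"), ("StoreC", "T21"), ("StoreC", "T22"),
    ("StoreD", "T21"), ("StoreD", "T29"), ("StoreC", "T29"),
]
G = nx.Graph(edges)

# Louvain community detection maximizes modularity, as in Gephi.
communities = louvain_communities(G, seed=42)

def major_systems(G, communities, threshold=0.05):
    # Keep only communities holding at least `threshold` of the
    # network's total ties, mirroring the cutoff used for Figure 1.
    total = G.number_of_edges()
    major = []
    for c in communities:
        ties = G.subgraph(c).number_of_edges()
        if ties / total >= threshold:
            major.append((sorted(c), ties))
    return major

for members, ties in major_systems(G, communities):
    print(members, ties)
```

On real data the edgelist would of course be built from the coded documents, and the interpretation of each community as a "technology system" remains a qualitative step.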

Figure 1. Time Period 1: 1994-1996

Major technologies by system:

System A: T18 Deep-catalog/genre specialization; T26 Customer service; T35 Location; T08 Electronics/accessories
System B: T01 CD sales; T04 Used product
System C: T02 Vinyl sales; T11 Exclusive product; T10 Memorabilia; T36 Fanzines/publications
System D: T29 Large catalog/inventory; T16 Listening stations; T31 Ambiance; T17 Expansion/merger/acquisition
System E: T14 Live music; T21 Sales/pricing; T22 Popular titles
System F: T05 Other media; T06 Lifestyle items

I generally categorize the top-right quadrant, consisting of systems D and E, as the mass retail sector due to its focus on size, growth, and price. The major technologies of system E are sale pricing (T21) with a focus on popular artist releases (T22). Mass retailers like Best Buy, Circuit City, Target, or WalMart used CDs as a loss-leader product: low prices would lure customers in with the hope that they might also buy a television, a blender, and so on. The music superstores, like Tower Records, Virgin Megastore, or HMV, found primarily in system D, could not consistently compete with the low prices mass retailers could offer on popular releases. In response, these large, music-only retailers structured their operations around quantity. The major technologies of system D include maintaining a large inventory (T29), providing customers with listening stations (T16) in order to sample a variety of titles, and expanding operations across the country (T17). The emphasis on ambiance (T31) found in this system was also connected to the sheer mass of many of these superstores: highly organized, technologically advanced warehouses of music product.

System A is the largest system identified in Figure 1, containing almost 24% of the total connections, and overlaps with almost every other system. For these reasons, I define system A as the mainstream independents or small chains that grew in number with the market during this period. As indicated in the major technologies of system A, many of these stores specialized in specific genres (T18) and prided themselves on customer service through expert knowledge (T26). Like the mass retail quadrant, a geographical location (T35) that might ensure regular foot traffic was of great importance to these firms. And, like many types of music retailer, these mainstream stores were diversifying into electronics and accessories (T08). Systems B and F also highlight the mainstream store strategy of diversification at this time. Some stores began to carry a wider variety of media (T05) as well as lifestyle items (T06) like music-centered clothing, as we see in system F. Many small chains began selling used product (T04) on a large scale, as is evident in system B. While the modularity measure separates these technologies into three distinct systems, they are best understood as three mainstream responses to the price wars initiated by mass retailers.

The remaining system, C, also overlaps heavily with the mainstream stores but offers a distinct form that separates it from the others. I categorize system C as the tastemakers because of its focus on rare and collectible product as well as the presence of do-it-yourself technologies. The major technologies in system C are vinyl records (T02), exclusive product (T11), and memorabilia (T10), which suggest a strong connection to collector markets outside of the CD dominance of the 1990s. Moreover, many of these stores carried local publications or fanzines (T36) and some ran their own independent record labels (T28), showing a distinct connection to their regional music scenes. The juxtaposition of technologies in system C with those in the mainstream systems closely mirrors the discussion of differences between coalitions found at this time. Drawing on the typology of organizational authenticity explicated by Carroll and Wheaton (2009), the mainstream systems in this period largely exemplify a brand of “type” authenticity by conforming to audience expectations of a specialized music store. The expertise and selection they provide in highly visible retail spaces legitimate these retailers as an original type of music retailer and an alternative to mass retail forms. The rare and collectible product and do-it-yourself practices of the tastemakers tend more toward a brand of “moral” authenticity. Here, the consumer is asked not only to evaluate the product sold by the firm but also the personal objectives of the store owner, managers, or employees. The connection these retailers have with collector markets and the vinyl record medium exemplifies an emphasis on cultural values contrary to the price wars of the time. Appealing to an audience’s love of a music scene or a collectible history draws lines between those that sell music as a retail item and those that have given their lives to music.

As I’m writing this, I have drafted an extended version of this analysis that tracks both form and notions of authenticity over four distinct time periods in music retail history. I argue that tracking changes in these technology systems (mindful of the context in which the change happens) can tell us a great deal about how these retailers have adapted and survived despite changes in the field that disadvantage them. Moreover, I argue that structural changes in what has commonly been identified as an independent record store suggest that our understanding of authenticity is not fixed and, instead, adapts to the context in which it is interpreted. I am curious what others might think of my use of network analysis and the application of “technology systems” to the concept of organizational form. I’m hoping to use this post to improve upon my explanation of the networks and the connections I aim to make to organizations literatures. Questions, comments, or concerns are welcome!

Works Cited

Bijker, Wiebe E. 2010. “How is Technology Made? – That is the Question!” Cambridge Journal of Economics 34:63-76.

Callon, Michel, and Fabian Muniesa. 2005. “Peripheral Vision: Economic Markets as Calculative Collective Devices.” Organization Studies 26(8):1229-1250.

Callon, Michel. 2006. “What Does It Mean to Say That Economics Is Performative?” Working paper, Ecole des Mines, Paris.

Carroll, Glenn R., and Dennis Ray Wheaton. 2009. “The Organizational Construction of Authenticity: An Examination of Contemporary Food and Dining in the U.S.” Research in Organizational Behavior 29:255-282.

Hsu, Greta, and Michael T. Hannan. 2005. “Identities, Genres, and Organizational Forms.” Organization Science 16(5):474-490.

Latour, Bruno. 1988. Science in Action: How to Follow Scientists and Engineers Through Society. Harvard University Press.

MacKenzie, Donald. 2008. An Engine, Not a Camera: How Financial Models Shape Markets. The MIT Press.

Posted in econ soc, method, org soc, STS, tech

The Online Platform – a New Iron Cage?

or: This bazaar is brought to you by Peer Economy Design, Inc. – The Iron Platform.

by Carla Ilten

Microsoft FUSE Labs’ recent call for project proposals on the “peer/sharing economy” that is emerging online prompted me to dig a little deeper into both the available literature and the platforms themselves. I found some parallels between platforms that organize peer exchange and the platforms I have recently studied in more detail, which organize microvolunteering and microlending (Ilten and Postigo, work in progress). Sure, the obvious similarity between Uber, TaskRabbit, Kiva, and Sparked is that they coordinate the participation of growing numbers of users. But who are they – peers? Not so much. The most interesting parallel to me is that these platforms are similar organizations. Here is an attempt to articulate those thoughts more clearly – and maybe end up with a research question and theoretical agenda.

While there is a good amount of research on contributors and labor in peer production (Benkler and Nissenbaum 2006; Dijck 2009), as well as a literature on trust and reputation systems (see peerproduction.net), we still lack an understanding of managed peer economy platforms as new forms of socio-technical organization. This perspective becomes crucial as newer peer economy platforms move further into the service sphere, where online coordination and offline services are mediated by platforms acting as brokers. Much of the discourse about peer-to-peer platforms still focuses on platforms that seem to meet the ideal of cybercommunism and practical anarchism – a perceived absence of management (Benkler 2013; Vadén and Suoranta 2009) – and ignores the organized actors that increasingly provide the architectures for mass peer-to-peer systems. These brokers, often companies, govern the inclusion and exclusion of participants, for example through increasingly elaborate identity-provision systems. Airbnb, Uber, and TaskRabbit do not meet utopian visions of communal sharing; rather, CEOs and designers deliver a matching service to entrepreneurial individuals and derive a share through monetizing those peer-to-peer services. A hybrid market niche for mediated – or rather: managed – peer-to-peer services seems to have emerged – a phenomenon that calls for organizational analysis.

A few authors have begun to mobilize social theory and organization theory to make sense of the processes of government that occur in all structures of coordination. O’Neil uses Weber’s theories of authority to explain the organization of hacking communities through hybrid forms (O’Neil 2014). For example, index-charismatic authority is a new, reputation-system-based form of authority that emerges from networked architectures. Rather than “hacking Weber,” Kreiss et al. (2011) bring his organizational theory of bureaucracy – the ideal type of rational formal organization – back in. Criticizing the “utopian orthodoxy,” the consensus in new media studies that views peer production as inevitably non-proprietary and socially leveling, the authors suggest that “the rationalist spirit and bureaucratic power may yet infuse peer production” – in both welcome and alarming ways (2011:243). While bureaucratic structures can be highly constraining, they have also introduced mechanisms of accountability and explicit rule-making whose fate in managed peer-to-peer systems is uncertain. Kreiss et al. ask whether “peer networks serve less as alternatives to Weber’s iron cage of rationalization, than as implements of its diffusion” (Kreiss et al. 2011:256).

How can we make sense of this ambivalent argument? We need to untangle (but possibly re-tangle) the two concepts that make up the iron cage: 1) bureaucracies/bureaucratic structures/really existing bureaucratic organizations/“iron cages;” and 2) the rationalist spirit/rationalization/“the iron cage.” With this distinction, we can start asking about the location of power in these structures. So if peer economy platforms and microaction platforms fail the ideal type of formal bureaucracy in terms of accountability – and impersonality, I’d like to add – why should we think of them as bureaucratic structures?

Think of the other most widely used metaphor for bureaucracy: that of a rational machine, an architecture designed to coordinate large numbers of processes smoothly. This technical dimension of bureaucracies goes to the heart of what a sharing economy does: divide up labor. At the same time, much fuzzier mechanisms of sharing are at work (John 2013): in order to participate, peers must provide identity information – share themselves. The design mechanisms of identity production are elements of a socio-technical architecture that is not developed by peers but delivered by entrepreneurs, programmers, and designers.

Science and technology studies have a long history of showing how technological infrastructures are not neutral but exert social and material power – artifacts have politics (Winner 1995) – and this is particularly critical for digital platforms, where materiality can evade our view (Gillespie 2010). While most research on the peer economy either ignores the material basis of peer exchange systems or heralds web structures as inherently “peer” (decentralized), I think that good old organizational theory on bureaucracy can help us really pay attention to the “plumbing” (Musiani 2012) that structures peer economies. Again, we must focus on individual platforms (organizations, architectures) and look out for the broader rationality embodied in these cases. They certainly have a new look and feel that is quite different from Weber’s state bureaucracies. But the structures governing participation and exchange, cast in algorithms, are no less rule-based and hierarchical on the technical dimension of the architectures. Importantly, the newer architectures of participation that I have called managed or mediated above come with some fairly centralized design/power structures. This is a departure from what we could almost call “traditional” (or, to stick with Weber: value-rational) online peer production in, for example, Free Software projects. This bazaar is brought to you by Peer Economy Design, Inc., the banner could read.

So what kinds of cages are these new architectures? Weber’s original term stahlhartes Gehäuse translates less as cage than as casing, or housing. Or, in the era of online structures, as platform. We are not so much stuck in that iron casing as we are voluntarily stepping onto new iron platforms that efficiently and appealingly organize processes we feel compelled to participate in. The rationalist spirit has a new vehicle, it seems – a great opportunity to bring organization theory, social theory, and STS together (once more) to see the bigger picture that connects rationalities and social structures.

References

Benkler, Yochai. 2013. “Practical Anarchism Peer Mutualism, Market Power, and the Fallible State.” Politics & Society 41 (2): 213–51. doi:10.1177/0032329213483108.

Benkler, Yochai, and Helen Nissenbaum. 2006. “Commons-Based Peer Production and Virtue*.” Journal of Political Philosophy 14 (4): 394–419. doi:10.1111/j.1467-9760.2006.00235.x.

Dijck, José van. 2009. “Users like You? Theorizing Agency in User-Generated Content.” Media, Culture & Society 31 (1): 41–58. doi:10.1177/0163443708098245.

Gillespie, Tarleton. 2010. “The Politics of ‘platforms.’” New Media & Society 12 (3): 347–64. doi:10.1177/1461444809342738.

John, Nicholas A. 2013. “Sharing and Web 2.0: The Emergence of a Keyword.” New Media & Society 15 (2): 167–82. doi:10.1177/1461444812450684.

Kreiss, Daniel, Megan Finn, and Fred Turner. 2011. “The Limits of Peer Production: Some Reminders from Max Weber for the Network Society.” New Media & Society 13 (2): 243–59. doi:10.1177/1461444810370951.

Musiani, Francesca. 2012. “Caring About the Plumbing: On the Importance of Architectures in Social Studies of (Peer-to-Peer) Technology.” Journal of Peer Production 1 (online). http://hal-ensmp.archives-ouvertes.fr/hal-00771863.

O’Neil, Mathieu. 2014. “Hacking Weber: Legitimacy, Critique, and Trust in Peer Production.” Information, Communication & Society 17 (7): 872–88. doi:10.1080/1369118X.2013.850525.

Vadén, Tere, and Juha Suoranta. 2009. “A Definition and Criticism of Cybercommunism.” Capital & Class 33 (1): 159–77. doi:10.1177/030981680909700109.

Winner, Langdon. 1995. “Political Ergonomics.” In Discovering Design: Explorations in Design Studies. Chicago: University of Chicago Press.

Posted in econ soc, org soc, soc mov, tech

The Dynamics of the Microbrewing Industry in the U.S.

by Paul-Brian McInerney & Ann Tünde Cserpes

We analyzed nationally representative data provided by the Brewers Association to find three classes of microbrewers that best describe the current state of the industry. We identify them as the established, up-and-coming, and innovator classes. Both the established and innovator classes exhibit a social movement logic (high on the spatial-clustering indicator, low on the density indicator). The up-and-coming class is more likely to follow a market logic (low on the spatial-clustering indicator, high on the density indicator). Up-and-coming breweries tend to seek out new markets away from established microbreweries, setting up business in states like Georgia or Tennessee. These brewers are likely to be members of the Brewers Association but are less likely to join state guilds (in part because guilds might not exist in the states where they set up shop). Breweries in the innovator class tend to locate where there is an established brewery presence, in states such as California, Illinois, Minnesota, and Pennsylvania. Innovators are likely to start up in places with liberal self-distribution laws and are likely to join state guilds, but not the Brewers Association. The established class of microbrewers is likely to join both the Brewers Association and the state guilds. The table below summarizes our findings about the three classes of microbrewers.

[Table: summary of the three classes of microbrewers]
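The post does not name the clustering method behind the three classes, but the general idea of grouping breweries by a small set of indicators can be sketched with a plain k-means pass over two hypothetical indicators, spatial clustering and density. All data below are invented for illustration, and the seeded groups simply echo the classes described above:

```python
import numpy as np

# Invented indicator data for 90 breweries: (spatial clustering, density).
# Three well-separated blobs stand in for the classes described in the post.
rng = np.random.default_rng(0)
established = rng.normal([0.9, 0.1], 0.02, size=(30, 2))
innovator = rng.normal([0.6, 0.4], 0.02, size=(30, 2))
up_and_coming = rng.normal([0.1, 0.9], 0.02, size=(30, 2))
X = np.vstack([established, innovator, up_and_coming])

def kmeans(X, k, iters=25):
    # Seed one center per hypothetical class (rows 0, 30, 60) so the
    # toy demo is deterministic; real runs would use random restarts.
    centers = X[[0, 30, 60]].copy()
    for _ in range(iters):
        # Assign each brewery to its nearest center, then re-average.
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(X, k=3)
```

The real analysis would of course work from the Brewers Association measures rather than synthetic blobs, and a latent class model would also report fit statistics that this sketch omits.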

The map below compares the proportion of state guild memberships and the percentage change in beer production. The overall quantity of beer produced has stalled in states where guild membership rates are highest. However, this might also signal the maturity of the industry in those locations. At the same time, in mature but not leading states – such as Illinois or California – microbreweries are in a much more flexible position: a good balance exists between having neighbors who are like them (and have access to the same resources), and who are not.

[Map: state guild membership and percentage change in beer production by state]

State guilds facilitate knowledge-sharing and access to resources, and they act on legislative issues. As the graph below shows, breweries seem better off when located in states where most, but not all, breweries belong to the guild. If every brewery belongs to the state guild, everyone has access to the same resources. Studies have found that these conditions tend to stall innovation and industry growth.

[Graph: the relationship between average state guild membership and percentage change in production]

We summarize successful strategies in the microbrewery industry as:

  • Be where the winners are, but have your own identity

or

  • Create new markets far from the center

Our results show that there is no single or dominant organizational identity in the craft brewing industry. Craft brewers succeed because they combine the art and business of brewing in compelling and economically sustainable ways.

Posted in beer, econ soc, org soc, soc mov, Uncategorized

Vinyl Revival: A Brief History

by Jerome M. Hendricks

My dissertation explores the actions of intermediary firms in periods of rapid technological change. By asking how new developments in listening to and owning music have changed the music retail industry, I offer the independent record store as a case of such an intermediary. Through a longitudinal multimethod content analysis of media and industry documents, I consider the massive decline of music retail from 1992 to 2012. By investigating patterns of composition and understandings in the field, I argue that independent record stores exemplify the uneven, reciprocal processes of legitimation required to alter the meanings of goods and services, enable survival, and change markets. In order to properly situate my argument, it is necessary to consider the backdrop of circumstances in which these processes develop. The posts that follow are explanations of various important industry issues that provide the requisite backdrop. I invite any suggestions or comments on this exercise as it unfolds…

Part 1, Format Wars: Analog vs Digital

Because of the intermediary position that the independent record store occupies in the music industry, it made most sense to me to start with a discussion of the product. Specifically, I look at product format debates that threaten the viability of brick-and-mortar record stores. In purely economic terms, the purpose of independent record stores is to purchase product for resale, profiting from price manipulation, providing immediacy, coordinating buyers and sellers, and guaranteeing quality (Spulber 1999). Yet retail spaces are often valued as sites of self-development (Miller 2006) and reflect an arena of social and economic interaction where continuous processes of qualifying and requalifying products determine what is evaluated as worthy or unworthy. This evaluation of product emerges from practical activities, including discourses within the store, between stores, and with audiences. The vinyl record serves as a physical manifestation of these qualification processes, where a history of sounds, visuals, and rituals coalesces.

Technological developments in how we listen to and own music have rendered these processes, places, and products obsolete for many. And while the digital shift in the music industry has changed the choices of products and places to consume them, this is yet another stage in the evolution of music value. As early as 1922, the recording industry was attacking new technologies (at that time, we’re talking radio) for offering free music to would-be consumers. The recent revival of vinyl records is wedded to criticisms of the quality and experience of digital music technology; some legitimate, some less so. In the discussion that follows, I consider those qualifications that make the vinyl format valuable and the issues many fans have with digital replication. I find that the benefit independent stores might gain from this format war may have less to do with what product is “best” and more to do with challenging consumers to determine what they value in music.

The Quality of Vinyl Records

At the heart of any discussion of music formats will always be some debate over sound quality. As John Kunz, owner of Waterloo Records in Austin, TX, asserts in Record Store Days, “We have these wonderful analog listening devices on the sides of our heads that don’t want to hear zeroes and ones. They want sound waves, a human-size arc.” Indeed, the reproduction of sound waves etched on a vinyl record, as opposed to the replicated sample of musical sound in digital recording, is a recurring issue in discussions of how recorded music should be heard. Vinyl reproduction is often credited with providing a level of dimension and realism missing in digital replication, owing to the rate at which the sampling takes place and the types of compression applied to the sounds. This intimacy and warmth regularly extends to the imperfections caused by dust or damage over time, which produce the hisses and pops especially evident in silent or quiet moments on a vinyl record.

An aural analysis of vinyl recordings and their digital counterparts is beyond the scope of this work, yet it is important to note that the issue of sound appears much more subjective than one might assume. In terms of the creation of sounds, many recording engineers prefer CDs because of the accuracy with which the format can replicate the original. In fact, in some cases, the warmth attributed to vinyl records is a slight distortion of the bass due to signal processing in the recording phase and the quality of the turntable being used. Moreover, because digital recording hardware and software are largely standard today, many new vinyl records are little more than a copy of a high-quality CD. The advancements in digital recording have been significant since CDs surfaced in 1983, yet audiophiles remain convinced that, with the proper sound system, only vinyl records can create the illusion of live instrumentation. This suggests that a component of the sound quality argument may have less to do with the production of the music than with its consumption.

Accordingly, I find that in most debates over music formats, the road eventually leads to the practice of vinyl record consumption. Many argue that in the shift to digital products, music lost some of its romance. People simply did not interact with CDs the way they did with vinyl records. Much of the intimacy discussed above can be attributed to the ritual of putting on a new record, vetting the musical statement of the artist(s), and exploring the object itself. As artist Sammy James Jr. of the rock band Mooney Suzuki once stated, “rock ‘n’ roll needs to be seen, touched, smelled, and tasted just as much as heard. I want to unroll the poster, open the gatefold, explore the cover art and the liner notes.” This practice exemplifies what Beckert (2010) describes as an imaginative bridge to the transcendental. The intersection of sound, imagery, and feel in the vinyl record ritual produces a social and contextual link to the artist(s). Such an imaginative link allows the consumer to “participate” in the artist’s style, statement, protest, humor, creativity, and so on.

The Digital Shift: Why do CDs get such a hard time?

As Mark Richardson, Editor-in-Chief of Pitchfork, wrote in a 2013 op-ed, the disdain for poor mp3 sound quality has, over time, been translated into a generalized opinion about digital audio that isn’t entirely accurate. In fact, when CDs hit the market in 1983, their main selling points were not much different from those of vinyl LPs when they replaced shellac records: longer playing time and better sound quality. CDs hold seventy-four minutes of audio, surpassing original LPs by around thirty minutes. This eliminated the need to “flip sides” and offered new, novel control through skipping or randomizing the tracks on the disc. Perhaps more importantly, CDs offered a dynamic range of 96 decibels while LPs managed only about 70, providing more volume from recordings. In addition, digital replication made silence completely silent: an impressive dynamic for music consumers used to the hiss and pops of vinyl records. According to Travis Elborough in his book The Vinyl Countdown, by the end of the 1980s the widespread consensus was that the striking, rigid precision of digital replication did sound better – and perhaps much of that was because of the newness of the technology and the dynamics of the sound.
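The 96-decibel figure follows directly from the CD’s 16-bit sample depth: each quantization bit roughly doubles the representable amplitude ratio, adding about 6 dB of dynamic range, so a quick check gives:

```python
import math

def dynamic_range_db(bits):
    # Each quantization bit doubles the representable amplitude ratio,
    # adding 20*log10(2) ≈ 6.02 dB of dynamic range.
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 16-bit CD audio: about 96.3 dB
```

The same arithmetic explains why later 24-bit studio formats advertise ranges around 144 dB.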

While Elborough argues that this extreme clarity came with an overall lack of warmth, where “everything appeared a little clinical,” it is important to note that many early CDs were recordings originally made for vinyl. Rather than remastering these works for a digital medium, record companies chose to release products that fell short of the full potential of the format. Regardless, for the vinyl record fan, the physical presentation itself could never match the appeal of the LP sleeve. While the CD artwork was the same, it was reduced to a booklet the size of a postcard, and the plastic cases cracked and scratched easily. What is more, the digital format made the artist’s work one continuous whole, rendering the consumer’s engagement with the original sequencing of a piece less meaningful. As record companies began to include bonus tracks or “b-sides” to add value to products consumers may have already owned on vinyl, the integrity of the artists’ expression was further devalued. In many respects, CDs get a hard time because they ushered in a new form of music consumption. They carried on a tradition of convenience rooted in the cassette revolution of the late 1970s and early 1980s while establishing a new level of luxury and modernity appealing to conspicuous consumption.

Despite the criticism CDs received, by the end of the 1980s sales were skyrocketing, and most companies stopped production of vinyl records altogether by the early 1990s. Around this same time, there was increasing commercial interest in compressing media files for various purposes, like sending digital sound files over phone lines or fitting video files on CD-ROM discs. In 1988, the Moving Picture Experts Group (MPEG) was commissioned to evaluate various digitization procedures and standardize industry coding practices. Through a series of tests performed with professional engineers, audio companies, and radio stations, the MPEG team standardized a digital audio format that conformed quite well to the music industry standards of the time. The result of this work was MPEG Audio Layer III: the MP3. By 1993, the MPEG team had introduced the “.mp3” file extension in commercial applications that coded and decoded audio for internet transfer.

As Jonathan Sterne details in his book, MP3: The Meaning of a Format, the work of MPEG set in motion a series of critiques that have contributed significantly to the debate over formats we find today. First, a major innovation in their work was layering the digital sound based on a psychoacoustic approach that exploits the limitations of human hearing. Often referred to as masking, this approach gave the MPEG team the flexibility to discard sounds people don’t hear and embellish those they do. By further compressing what were now smaller data files, the MP3 standard enabled efficient file sharing via the internet despite a significant decrease in overall sonic quality. Second, these new coding practices were tested on popular music by industry experts. Or, as Sterne argues, a small group of audio engineers ultimately determined what a good recording should sound like. Thus, in addition to issues of sound quality, these audio data files were diluted further by catering to specific popular styles of music recording and mastering.

Despite their limitations and clear lack of sound quality, the brief history of the MP3 has proven that in many respects, people simply do not care. In some sense, the rise of the MP3 is yet another pivotal moment in a history peppered with conflict over the commodification of an art form. Since the early 1920s, the music industry has been in an ongoing battle with radio, illegal distributors, bootleggers, pirates, and so on over property rights and profit. Technological advancements such as cassette tape decks, CD burning drives, and file sharing applications have made it easier and easier to share music. When the free (or inexpensive) MP3 became readily available through services like Napster and later iTunes, it is no surprise that consumers chose this format over an $18 CD. In addition, the MP3 and its associated tools have provided unprecedented convenience and closeness to music. Since the advent of the Walkman in 1979, the social demand for having our music with us has exploded. While this closeness may not resemble the intimacy some find in a vinyl record listening ritual, it can certainly change a dreadful commute. And while a renewed passion for vinyl records continues to rise, perhaps the debate over format is ultimately an issue of consumption practices.

An Issue of Value and Music

Briefly speaking from personal experience, here are a few things my appreciation for vinyl records never stopped me from doing: 1) taping music off the radio when I was a kid, 2) making mixed tapes (and later mixed CDs) for friends, free of charge, 3) blasting rock music in my car everywhere I went as a youth (difficult to do with vinyl), 4) buying a Walkman, a Discman, and an MP3 player, and now using a smartphone to carry music with me… everywhere, and 5) using Napster. As the legendary DJ Bill Brewster said in the book Old Rare New, “Bottom line is, I collect music – not formats – so while it’s sad, there has never been a better and more accessible time for music.” But is it sad that music has become more accessible? Hasn’t accessibility been a major goal since the beginning of recorded music? Clearly, the issue for the independent record store is maintaining a space for vinyl records as a special “method of access” to the artists we cherish. This niche seems sustainable for collectors, much as it is for antiques and other collectible items. Yet with new and reissued albums reaching $25 or more, independent record stores must ask how the value of the vinyl record ritual will be perceived by the broader music-consuming fan base. Is there a point where it simply isn’t worth the price? Are we already there?

The poor quality of an MP3 doesn’t sound any worse than those overdubbed tapes I used to make to pull songs off the radio or share something with friends. But sound quality wasn’t the point; we were looking for a convenient way to access the artists we loved despite limited resources. The digital shift in music is a series of innovations that champion convenience and access while rarely including a sound quality discourse in the equation (I reserve the right to revise this statement as streaming services become more sophisticated). While these innovations have significantly disrupted the unprecedented financial growth of the recording industry, to suggest the value of music has never been lower is simply untrue. Its market value may be at its lowest levels in decades, but the value of music has never been higher. Look no further than vinyl records sold with free download links. These products give the consumer the sexiness of the vinyl record as an object, the intimacy and warmth of the vinyl record ritual, and the portability of a digital format. In this format, music need not be worthy only in certain circumstances. We don’t need to adhere to an either/or debate when both/and is available.

In the award-winning documentary I Need That Record!, Chris Ashworth, CEO of United Record Pressing, states, “Vinyl requires that you sit down and listen to the music. MP3 players basically give you the optimal portability… Now what [consumers are] doing is saying, ‘Ok, I’ve got my portability. I’ve got my quantity. Now I want my quality… give me the listening experience.’” Perhaps the digital shift has renewed a desire for a music quality lost in the compressed precision of digital replication: an intimacy and warmth to one’s music of choice. I am hopeful that independent record stores can continue their impressive survival story through products like those that offer both analog and digital formats. However, as Ashworth implies, the value of music lies in the interconnection of the object and the practice, and this is evident throughout the debate over analog and digital formats. Far from arguing that the sound quality issue is irrelevant (because MP3s can sound quite awful), what seems more important in the format wars is that we acknowledge the practices we find valuable and appreciate their role in our relationship with music.

Sources

Beckert, Jens. 2011. “The Transcending Power of Goods: Imaginative Value in the Economy.” Pp. 106-125 in The Worth of Goods, edited by J. Beckert and P. Aspers. New York: Oxford University Press.

Calamar, Gary, and Phil Gallo. 2010. Record Store Days: From Vinyl to Digital and Back Again. Sterling.

Elborough, Travis. 2009. The Vinyl Countdown: The Album from LP to iPod and Back Again. Soft Skull Press.

Miller, Laura J. 2006. Reluctant Capitalists: Bookselling and the Culture of Consumption. University of Chicago Press.

“The MP3: A History of Innovation and Betrayal.” 2014. NPR.org. Accessed June 26. http://www.npr.org/blogs/therecord/2011/03/23/134622940/the-mp3-a-history-of-innovation-and-betrayal.

Pettit, Emma. 2008. Old Rare New: The Independent Record Shop. Black Dog Publishing.

Richardson, Mark. 2013. “Does Vinyl Really Sound Better?” Pitchfork. Accessed June 26, 2014. http://pitchfork.com/thepitch/29-vinyl-records-and-digital-audio/.

Rose, Joel. 2014. “For Better Or Worse, MP3s Are The Format Of Choice.” NPR.org. Accessed June 26. http://www.npr.org/blogs/therecord/2011/03/18/134598010/for-better-or-worse-mp3s-are-the-format-of-choice.

Spulber, Daniel F. 1999. Market Microstructure: Intermediaries and the Theory of the Firm. Cambridge University Press.

Sterne, Jonathan. 2012. MP3: The Meaning of a Format. Duke University Press.

Talk of the Nation. 2014. “Why Vinyl Sounds Better Than CD, Or Not.” NPR.org. Accessed June 26. http://www.npr.org/2012/02/10/146697658/why-vinyl-sounds-better-than-cd-or-not.

Toller, Brendan. 2008. I Need That Record! The Death (or Possible Survival) of the Independent Record Store. Documentary.

Posted in econ soc, method, org soc, soc mov, tech

Civic Tech or Corporate Tech? A Question of Resources and Knowledge

by Carla Ilten

On a recent Saturday, COD colleague Tünde Cserpes made me meet her at 8am at kCura’s airy office spaces on Clark and Jackson for the second “National Day of Civic Hacking in the Chicago Loop.” It turned out to be so interesting that I wouldn’t even have needed all that coffee. The National Day of Civic Hacking (hackforchange.com) is a platform that brings “developers, government employees, data scientists, non-profit leaders, designers, community activists and do-gooders together to leverage the power of technology to help solve civic problems.”

On hackforchange.com, challenges, projects, and data sources come together in an online petri dish. The offline event – the “hackathon” – does the same thing with present people representing and arguing the case for challenges, presenting data, and offering software development ideas. What is particularly fascinating – especially for those of us studying organizational dynamics and technology – is how people who run nonprofits such as the Chicago Red Cross (Jim McGowan) or the Lakeview Pantry (Erin Stephens) talk about their everyday operational problems and technology use.

Again, the online-offline nexus proves to be crucial: the core tasks of disaster response and food distribution happen fundamentally offline. Volunteers are physically present, and material objects (comfort kits, banana boxes) must be moved in time and space. It is the coordination of these bodies and objects that is enhanced by information technologies. Where is the fire burning, and which volunteers are closest to it? Which supermarket is trying to get rid of spare food, and who can transport it to the pantry? This is where data comes in. As the host and kCura founder observed, most of the projects presented seemed to aggregate “big data” to make it “digestible.”

The current scholarly discourse around the epistemology of “big data” is perfectly reflected in the narratives of those for whom it can become vitally useful – but only if we have questions for it and the interpretive tools to digest it. Jeanne Olsen, who founded Schoolcuts.org, showed how data can be aggregated and interpreted in various ways, a process that is inherently political when the stakes are high.

The speakers made it clear that it matters where data comes from and how it arrives on platforms. Many nonprofits struggle enough with daily operations that launching their own technologies is out of reach. The ability to integrate all kinds of open data with Google Maps has been the straightforward way for McGowan to coordinate disaster response so far. Google, of course, is not a civic association. Similarly, Erin Stephens has started using Salesforce to coordinate volunteers for the Lakeview Pantry – not a tool tailored to nonprofits’ needs either. Yet the fact that large IT corporations provide software to nonprofits for free makes a difference, given that nonprofits often struggle for survival.

Finally, the challenge of providing civic tech to civic organizations seems to be captured in this exchange between Erin Stephens and a technologist: “So, what do you need?” – “You know, … we don’t always know!” Translating civic problems into civic tech takes many more steps between offline and online than we’d like – but here’s why civic hackers can do more for civic tech than the corporate giants: they ask you what you need in the first place.

Posted in econ soc, org soc, soc mov, tech

Utilizing Named Entity Recognition to Identify Unique Actors

by Jerome M. Hendricks

My dissertation work explores the actions of intermediary firms in periods of rapid technological change. As our economy has become increasingly geared toward knowledge sectors (Powell and Snellman 2004), intermediary firms take on an increasingly important role in establishing markets and developing consumer relationships with products and services. Using the independent record store as a case of an intermediary operating in a rapidly changing market, I argue that certain actors can collectively alter the symbolic meaning of goods and services to enable their survival. To make this argument, I have collected over 2,400 music industry and media documents from 1992 to 2012 in order to track changes in strategy and field understanding over time. In a recent paper, I was able to establish changes in organizational understandings empirically by comparing the discourse of independent record stores before and after drastic technological innovation. Since then, I have been investigating ways I might test these findings by tracking types of music retail firms and their strategies and understandings over the twenty-year period. In the discussion that follows, I will share my experience looking to computer science technologies for new ways to identify and track actors and ideas in large data sets.

Early coding of my data utilized an ethnographic content analysis (ECA) approach through ATLAS.ti, a computer-assisted qualitative data analysis software (CAQDAS) package. ECA allows for the quantitative emphasis of structured data collection in association with descriptive information that informs the context in which meanings emerge (Altheide and Schneider 2013). I found this approach very useful, and I could conceivably use it to answer other questions I have. However, from a practical standpoint, large data sets like mine take a long time to code and analyze this way. On the advice of a committee member of mine, Dr. John Mohr, I looked to new ways that social scientists have incorporated data mining software to extract meanings from large data sets. I was immediately intrigued by the growing body of literature that utilizes topic modeling as a procedure for coding text into meaningful categories of word clusters associated within and across documents. For examples and analysis of this approach, refer to the special issue on topic models and the cultural sciences in Poetics found here. While this approach to extracting “topics” would appear to offer a compelling way to uncover the strategies and understandings I am interested in, without isolating the types of actors associated with each topic, my unit of analysis shifts from the types of organizations to the data sources themselves.

In the same issue of Poetics mentioned above, Mohr and Bogdanov (2013) point to other compatible data mining strategies that can be combined with topic modeling to obtain an even closer view of meanings in texts. Specifically, Mohr and colleagues (2013) analyze the discursive style of the state utilizing a series of natural language processing (NLP), semantic parsing, and topic modeling procedures. This approach allows the authors to identify significant actors, determine their actions in texts, and consider the context in which these actions take place. A subfield of NLP called named entity recognition (NER) offers the most promise for identifying different people, places, organizations, and other miscellaneous human artifacts in texts. Because these tools require expertise in computer programming, I contacted Dr. Dan Roth at the University of Illinois at Urbana-Champaign to inquire further about NER and its applications. From his demo page, you can try a variety of different NLP procedures on sample texts, including the NER tagger that will be the central focus of the remainder of this discussion.

Before discussing the specific approach that Dr. Roth, his assistant Chase Duncan, and I have taken, it is important to consider a few challenges the NER tagger faces due to the unique nature of my data set. First, while my data set is large, it isn’t particularly massive. Currently, these tools are better suited for open exploration of hundreds of thousands of documents. So while my practical concerns over time management and the size of my data set are real, my data set is rather small relative to the concerns of computer scientists. With a data set in the thousands, accuracy becomes central, as there is much less margin for error. In other words, we’ll simply have fewer opportunities to capture target organizations. This leads to a second concern: independent record stores are a somewhat unique entity and can be easily overlooked or misclassified by the NER tagger. In the Mohr et al. (2013) paper discussed above, the entities of interest are relatively well known (multi-national organizations, nations, geo-political entities, and so on) and can be easily verified. Rather than finding references to the “United States” or “Afghanistan,” we are looking for “Dave’s Records” or “Bucket O’Blood Books & Records.” While both applications require a certain amount of training to improve the tagger’s performance, I am unaware of a complete historical record of independent record stores that can be utilized for training purposes. More information on training an NER tagger for social science purposes can be found here.
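To make the list-based intuition concrete, here is a naive sketch of my own (not the Illinois NER tagger): a matcher that only tags exact hits from a hand-compiled gazetteer of store names, all of them stand-ins for illustration. Real NER models generalize to names they have never seen, but gazetteer features like this are one common way to point a tagger at domain-specific entities it would otherwise miss.

```python
# Toy gazetteer matcher: tag any span of text that exactly matches a
# store name from an authority list. This is a hypothetical sketch,
# not part of any existing NER toolkit.
import re

# Hypothetical authority list mixing unique and chain store names.
GAZETTEER = [
    "Dave's Records",
    "Bucket O'Blood Books & Records",
    "Permanent Records",
    "Musicland",
    "Best Buy",
]

def tag_stores(text, gazetteer=GAZETTEER):
    """Return (start, end, name) spans for each gazetteer hit in text."""
    spans = []
    for name in gazetteer:
        # re.escape handles metacharacters in names like "O'Blood" or "&".
        for m in re.finditer(re.escape(name), text):
            spans.append((m.start(), m.end(), name))
    return sorted(spans)

sample = ("After Musicland closed, collectors drifted to Dave's Records "
          "and Bucket O'Blood Books & Records on the North Side.")
print(tag_stores(sample))
```

The obvious limitation is the one noted above: with no complete historical record of independent record stores, any such list is incomplete, which is why a trained statistical tagger is needed on top of it.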

Despite these challenges, our team is confident that we can train the NER tagger to perform at a high level on the somewhat unique entities we aim to identify. The first step in training the tagger requires a separate data set that includes a variety of unique and standard music store names (from “Permanent Records” to “Musicland” to “Best Buy”) and mirrors the “messiness” of media data, like links to other news stories, advertisements, and so on. To date, I have compiled 100 articles not used in the original data set for testing the NER tagger. To assist the tagger in identifying stores, we will incorporate Dr. Roth’s Wikifier tool, which utilizes Wikipedia as an authoritative source for resolving identities. While many small stores will not be listed on Wikipedia, this will help us increase the accuracy of identifying large chain retailers and popular independent record stores throughout our data. As Dr. Roth and his colleagues have noted previously (Godby et al. 2010), other authority sources have the potential to increase the effectiveness of resolving identity issues. To this end, utilizing various online databases of independent record stores (e.g. recordstoreday.com, vinylhunt.com, or goingthruvinyl.com) may also be useful in training the NER tagger. Once the modified version of the NER tagger is complete, we will be able to test our trained tagger on this separate data set and compare our results with human classification to assess accuracy and prepare our tool for the original “large” data set.
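The accuracy check described above, comparing tagger output with human classification, is commonly scored as exact-span matching. A minimal sketch, with invented spans standing in for real annotations: precision is the share of predicted entities that the human coders also marked, and recall is the share of human-coded entities the tagger found.

```python
# Exact-match precision/recall/F1 over sets of entity spans.
# The spans below are hypothetical, not results from our data.

def span_scores(predicted, gold):
    """Exact-match precision, recall, and F1 over sets of entity spans."""
    predicted, gold = set(predicted), set(gold)
    tp = len(predicted & gold)  # spans both the tagger and humans marked
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Each span: (article_id, start_offset, end_offset).
human = {(1, 12, 26), (1, 40, 71), (2, 0, 17)}
tagger = {(1, 12, 26), (2, 0, 17), (2, 88, 96)}  # one miss, one false hit

p, r, f1 = span_scores(tagger, human)
print(round(p, 2), round(r, 2), round(f1, 2))  # → 0.67 0.67 0.67
```

Exact matching is deliberately strict: a tagger that finds “Dave’s Records” but tags only “Dave’s” gets no credit, which is appropriate when, as noted, there is little margin for error in a data set of this size.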

While it is entirely likely that my research will utilize some of the data mining software tools already familiar to social science research, our attempt to adapt the NER tagger to unique actors has significant implications for content analysis in social science research. In terms of data set size, our ability to train the NER tool more specifically will give smaller projects a level of accuracy otherwise attainable only through manual coding procedures. In conjunction with other data mining procedures, such accuracy can allow for hypothesis testing as well as exploratory work. By standardizing these tagging procedures and training processes, the transferability among similar situations may suggest some generalizability of results. And, in light of the cooperative efforts that have brought us this far, prospects for software packages that are more accessible to social scientists, not unlike many topic modeling packages, also seem possible. Though these implications may be little more than conjecture on my part at this point, the prospects for developing procedures that contribute to new approaches to content analysis are exciting. I look forward to reporting the testing results as they become available and assessing the possibilities of NER tagging when actor identities are relatively unique.

References

Altheide, David L., and Christopher J. Schneider. 2013. Qualitative Media Analysis. 2nd ed. Los Angeles: SAGE Publications.

Godby, Carol Jean, Patricia Hswe, Larry Jackson, Judith Klavans, Lev Ratinov, and Dan Roth. 2010. “Who’s Who in Your Digital Collection: Developing a Tool for Name Disambiguation and Identity Resolution.” Journal of the Chicago Colloquium on Digital Humanities and Computer Science 1 (2).

Mohr, John W., and Petko Bogdanov. 2013. “Introduction—Topic Models: What They Are and Why They Matter.” Poetics 41 (6). Topic Models and the Cultural Sciences: 545–69.

Mohr, John W., Robin Wagner-Pacifici, Ronald L. Breiger, and Petko Bogdanov. 2013. “Graphing the Grammar of Motives in National Security Strategies: Cultural Interpretation, Automated Text Analysis and the Drama of Global Politics.” Poetics 41 (6). Topic Models and the Cultural Sciences: 670–700.

Powell, Walter W., and Kaisa Snellman. 2004. “The Knowledge Economy.” Annual Review of Sociology 30: 199–220.

Posted in method, org soc, tech

Tactics, causes, and theory: Who gets to define what counts as “activism”?

by Carla Ilten

Recently, I gave a presentation at Theorizing the Web 14 in New York entitled “Activism 2.0: The Politics and Business of Platforms Built for Social Change.” When the presentation got sorted into a panel called “(Ref)user: Movements of Resistance,” I knew I would have to do some good justification work. The audience would expect stories of grassroots uprising, of feminist resistance, and of e-bandits (as my co-panelists offered); hopefully heroic, but certainly nothing for the dark side of the force! The cases I presented, Sparked and Kiva, two platforms that organize micro-volunteering and micro-lending, fall short of heroic, high-stakes activism. No, worse: what they organize in an extremely efficient, online-only way rubs “social change” the wrong way for some of us. Yet, as I argued, the two platforms exhibit what newer social movement theory on online activism (Earl and Kimport 2011) would consider the most cutting-edge “internet-enabled” organization of participation – low-cost, high-outcome mobilization! Not least, the people who engage in these micro-actions feel that they are making a difference of some sort. I tried to highlight this finding – I will call it a contradiction for want of a more nuanced analysis at this point – as the starting point of a research journey. Not completely successfully, as the following exchange occurred on Twitter after my talk:

[Image: Twitter conversation following the Theorizing the Web talk]

As a political and interested human being, of course I have (strong!) preferences with regard to causes and visions – I make normative judgments about what activism I find legitimate and desirable. As a sociologist, though, I can and must analytically distinguish between activism and causes, or between movement tactics and movement goals. What is more, when the people whose activities I observe use those terms to describe what they’re doing, then I have to take them seriously, all the while making my own, possibly counter-analysis. That is indeed what I presented in my talk: a project that started out looking for activism online, found something that looked similar but different, and concluded that it was not the wrong case, but… a case of what? This is where things get interesting.

Posted in conferences, soc mov, tech, Uncategorized