
Fostering Digital Communities of Care: Safety, Security, and Trust in the Canadian Humanities and Social Sciences Commons

Published on Jun 27, 2022

Today, academia’s relationship with notions of care remains fraught: individual scholars, scholarly communities, and the larger institutions that support them have all profited from the affordances of digital technologies and platforms while also having to contend with the concomitant social challenges of digital scholarship. George Veletsianos, for instance, declares that “academia’s uncomfortable relationship with care is evident in many of its foundational processes” (Social Media 80). To be a scholar in the twenty-first century is—as in preceding centuries—to be a networked scholar.1 But digital scholarship has introduced entirely new possibilities and problems, requiring academic communities to consider what fostering care looks like, in theory and practice, as the technologies mediating networks of researchers and research data continue to evolve.

This paper invites further consideration of care in the networked world vis-à-vis the Canadian Humanities and Social Sciences (HSS) Commons, an in-development research platform by the Implementing New Knowledge Environments (INKE) Partnership.2 Building on the work of Caroline Winter et al. (“Foundations”), we examine how open digital research commons can encourage responsible community-building and collaboration as two interrelated forms of care. In doing so, we draw on Bethany Nowviskie’s interpretation of ethics or networks of care in accord with feminist thought—dating back to the eighteenth and nineteenth centuries—that values “deep connection to others” and stands in stark contrast to “economic systems that valorized a private profit motive and circumscribed the participation of women and the servile under-classes.”3 “A competitive capitalist marketplace,” writes Nowviskie, “depends upon but does not assign much value to things we create through networks of reciprocity, compassion, generosity, mending, and care.” In a sense, then, “care” in the larger historical and philosophical context described by Nowviskie, and adapted provisionally in this paper, might be understood as a diverse set of practices that are both community-minded and intensely opposed to systems or forms of interaction, including economic ones, that threaten the common good of those communities or the individuals that comprise them. Defined in this way, the concept of care—as a form of “deep connection” that is simultaneously at odds with “private profit motive[s]”—is highly relevant to discussions of digital spaces such as social networks and not-for-profit digital research commons intended to bring people together.
Such platforms can help researchers freely produce, publish, and share research within and beyond their existing academic networks using sharing features that are at once familiar to users of popular commercial “academic social networking sites” (ASNS), yet frequently missing from “relatively siloed” institutional repositories (Fitzpatrick, “Academia”).

Even so, while open research-sharing platforms such as the Canadian HSS Commons and the Humanities Commons—an academic platform for research-sharing and networking—provide exciting new possibilities for individual scholars and scholarly communities alike, their implementation also raises important questions about how digital knowledge environments can safeguard users and their work as yet another form of care in the sense(s) outlined above. At their core, these questions focus on how best to realize the high ideals excited by such spaces (e.g., openness and equitable access to information), especially in building communities of care around areas of inquiry, thoughts, and ideas. However, consideration of such questions also involves shifting the register from accentuating and supporting the positive to protecting against the potentially negative. As recent studies of Twitter and other networking sites attest, the mechanisms that enable open sharing and communication among academics, or between academics and the general public, can also make researchers and their work vulnerable to being misunderstood or—worse yet—attacked.4 Indeed, the inevitability of negative interactions within social networks, academic or otherwise, is implied in Charlotte Hess and Elinor Ostrom’s important collection, Understanding Knowledge As a Commons (2006). The first sentence of their introduction defines knowledge as “a commons—a resource shared by a group of people that is subject to social dilemmas” (3; emphasis added). That is, sooner or later, digital research commons will need to grapple with the “tensions, dilemmas, and conundrums” that Veletsianos associates with networked scholarship in general and shared virtual spaces in particular (Social Media 53).

Accordingly, this paper briefly introduces our own case study, the Canadian HSS Commons, before gesturing to three of the salient social dilemmas frequently cited in conversations about digital spaces of this kind: 1) enclosure and exclusion as they relate to commercial as well as institutional repositories; 2) the difficulties socially minded research-sharing platforms face in attempting to strike a balance between openness and security; and 3) the disproportionate social risks affecting scholars who have been, and continue to be, racialized and marginalized within academia (as without). In the process, we draw on intersecting conversations from the fields of open access, digital humanities, social knowledge creation, and the emergent field of digital sociology. Throughout, we also propose several ways these challenges might be addressed by and for academic communities. Ultimately, we argue that, despite the many challenges involved in working towards a safe, secure, and open research commons—that is, a community of care—this work is not only increasingly necessary for academia writ large but also an essential aspect of the Canadian HSS Commons’ mandate to empower humanities and social sciences researchers to share knowledge and resources, build community, and find collaborators within and beyond academic institutions. 

The Canadian HSS Commons

The Canadian HSS Commons is a national-scale, online research commons in both official languages, English and French. It will foster an environment for Canadian HSS researchers to share, access, repurpose, and develop scholarly projects, publications, educational resources, data, and tools; of particular note here, it will also facilitate member interactions. For example, from the profile Dashboard, members can build their profile (and decide what is shared publicly); access their inbox, where they can communicate directly with other members; write private or public blog posts; start a collaborative special-interest group, collection, or project; and more (Figure 1).


[[Insert Figure 1 here: JensenEtAl_Figure1.gif]]

Figure 1: A brief tour of the Canadian HSS Commons’ profile Dashboard page (hsscommons.ca/members/myaccount).

The Canadian HSS Commons’ in-development prototype is built on HUBzero, an open-source platform originally developed at Purdue University, and all its code is open source and available on GitHub. An initiative of the INKE Partnership, it is coordinated by the Canadian Social Knowledge Institute (C-SKI), which is based in the Electronic Textual Cultures Lab (ETCL) at the University of Victoria (UVic). The prototype is being developed in collaboration with CANARIE, UVic Systems, the Compute Canada Federation, the Federation for the Humanities and Social Sciences, and Humanities Commons. Through the INKE Partnership, the Canadian HSS Commons is also connected to—and has received invaluable feedback from—the Canadian Research Knowledge Network (CRKN–RCDR), the Digital Humanities Summer Institute (DHSI), the University of Newcastle, Edith Cowan University, Érudit, Iter: Gateway to the Renaissance, the Public Knowledge Project (PKP), and UVic Libraries.

1. Free for All? Exclusion from—and Exploitation in—Digital Research Commons 

Like other research-sharing platforms designed to build community and facilitate social interactions among academics, the Canadian HSS Commons must contend with a number of theoretical and practical challenges hearkening back to the pre-digital origins of the commons.5 One such challenge is that of subtractability. Historically, subtractability—“the principle that one person’s use of a commons resource reduces the amount available for other users” (Winter et al.)—has posed difficulties both in terms of resource management and in terms of attendant social issues. For example, over-extraction can cause resource scarcity; in turn, scarcity can produce competition and conflict within a community, resulting in what Garrett Hardin refers to as “the Tragedy of the Commons” (Hardin; see also Winter et al.; Hess and Ostrom; Bollier; Borgman). However, unlike their historical namesakes, digital knowledge commons are not typically subject to the same issues of over-extraction, since knowledge is, as David Bollier points out, a nonsubtractive or “nonrival” resource: “many information commons,” he remarks, actually “exemplify what some commentators have called ‘the cornucopia of the commons,’ in which more value is created as more people use the resource and join the social community” (34; see also Boyle and Winter et al.). Because the Canadian HSS Commons’ mandate to build community and mobilize knowledge hinges on nonsubtractive resources—in the form of open access publications, collections, group communications, and other such digital materials—Bollier’s “cornucopia” is therefore a far more appropriate term for this kind of shared space than Hardin’s “tragedy.”

While digital knowledge commons may not be vulnerable to the “tragedy of the commons” in the traditional sense, modern struggles for control over digital resources can nevertheless re-create other historical problems, such as enclosure and exclusion. In scholarly communities, enclosure occurs when previously shared spaces or resources are moved behind paywalls, for example, or when universities withhold funding for open-access platforms in favour of siloed institutional solutions. Attempting to explain the possible reasons for the latter kind of enclosure, Peter Suber argues that academic institutions have too often been motivated by fears that open-access publication will result in a loss of prestige, or that investing in truly open repositories—which play a crucial role in research ecosystems—might pave the way for “freeloading” universities to reap the benefits of others’ labour (186). Enclosure can also lead to social tensions, and reasonably so. Sometimes enclosure results in the exclusion of researchers from research communities or platforms that control vital data and resources, for instance, or the digital reconstitution of problematic academic and geopolitical hierarchies, including those that translate into unequal access to resources across institutional, national, or linguistic lines.

Similarly, scholars have accused commercial platforms of enclosure, exclusion, and other practices with potentially negative social effects. Many of these practices run counter to the ideals of the commons model. Indeed, Yochai Benkler maintains that cooperation and sharing, not exclusion, must be evident in commons-based initiatives worthy of the name: whether digital or not, true commons “are not built around the asymmetric exclusion typical of property. Rather, the inputs and outputs of the process are shared, freely or conditionally, in an institutional form that leaves them equally available for all to use as they choose at their individual discretion” (62; see also Bollier 29). In this sense, then, the problem with Academia.edu is that it is seemingly operating under the aegis of a commons that has the stimulation and dissemination of knowledge as its guiding mission; in reality, its actions are dictated by an economic model that, according to its many critics, precludes the kind of transparency, sharing, and social governance Bollier, Benkler, and others regard as essential to commons-based systems. More troubling still, Academia.edu has been accused of exploiting users and their data (Bond; Fitzpatrick, “Academia”; Winter et al.); exacerbating professional and personal anxieties within the academic community (Duffy and Pooley); and tacitly encouraging resource sharing in violation of copyright laws while passing the liability for infringements on to users (Adema and Hall; Tennant; Winter et al.). By contrast, as Winter et al. note, “Community-led sites such as institutional and subject repositories, Humanities Commons, and the Canadian HSS Commons provide alternative platforms that encourage researchers to share their work while attending to legal and ethical issues.” In this way, such platforms may be said to enact Michelle Caswell and Marika Cifor’s notion of “structural care,” which “simultaneously dismantles oppressive structures and builds liberatory structures” (2), insofar as they are concerned with addressing—and proactively working to prevent—the issues of enclosure, exclusion, and exploitation that critics have identified with their for-profit counterparts.

2. Balancing Openness and Security: The Threat of Bad Actors in Digital Commons

The question of exclusion—of who can and cannot access or participate in a digital space—introduces a related set of questions about how to strike an appropriate balance between openness and security. For digital knowledge commons, striking such a balance can require adjudicating between competing technical, legal, ethical, cultural, and sociological considerations. For example, to what extent are the basic tenets of open scholarship predicated on expectations regarding access not only to digital resources, but to the people who produce them? As Kathleen Fitzpatrick put it following a recent attack by “bad actors” on the Humanities Commons platform, “How do we balance our commitment to ensuring that the Commons is open to anyone—regardless of credentials, memberships, employment status, language, geographical location, and so forth—with our commitment to ensuring that the members of our community are safe and free from harassment?” (“Community”; see also Veletsianos, Social Media 75). On the one hand, recent scholarship has underscored the potential benefits of open social scholarship and social knowledge creation practices (Arbuckle et al.; El Khatib et al.); on the other, this growing body of scholarship also clearly foregrounds the fact that “open” does not—and should not—mean that anything or anyone goes. As Ryan Scrivens and many others before him have observed, “the open format of the internet has also allowed for bad actors” such as hackers, trolls, and extremists.6 Similarly, in some cases, not all data should be shared.7 In order to create safe and caring spaces for researchers, networked academic communities must be willing to weigh the benefits of openness as an ideal against the potential negative effects of openness as it has played out in a variety of digital contexts.

Consideration must also be given to how the design and technical features of digital spaces can help neutralize the threat of bad actors—a question to which the Canadian HSS Commons and its partners have already begun to respond, and which its infrastructure already addresses (including in the ways documented below). Such threats come not only from without but also from within, once communities begin to take shape, and bad behaviour can take many forms: online bullying, the spreading of misinformation, phishing attacks, and so on. More positively, one might consider how the design of digital spaces can actively foster care. Advocating for what he calls “closure” (not to be confused with “enclosure”), Harry Dyer asserts that design can shape online interactions in positive ways: closure “allows a consideration of how site design guides and shapes our ability to act and interact online, and how the design of online social spaces can favor certain types of actions, interactions, discourses, themes, users, and audience members” (79). In this way, Dyer’s term indicates how design can be used not only to set necessary boundaries on users’ behaviour but also, in doing so, to paradoxically facilitate open, healthy forms of engagement. Unlike the concept of enclosure discussed above, in largely pejorative terms, closure in this sense refers to the specific mechanisms regulating the mindful addition of new people to a restricted community, or to how the actions of the people already included in a community are guided in certain directions as part of a larger strategy of maintaining security and fostering positive social interactions in a digital space.

Two of the most notable forms of closure in the Canadian HSS Commons include its account verification process and its content-flagging features. Very few digital research commons provide open access without requiring registration. While anonymity can sometimes shield users from unwanted attention and therefore offer a partial solution to the problem of bad actors in the larger commons of the internet, it can also introduce new concerns—as when members of a community need to be vetted for perfectly valid reasons, such as the enforcement of security protocols.8 In the Canadian HSS Commons, anyone can browse the site and access content that has not been designated private or made accessible only to other members, but some content and features are limited to members. New membership requests are processed through the Canadian Access Federation (CAF); through ORCID-CA, which links Canadian researchers to a persistent identifier (PID); or through the Canadian HSS Commons (Figure 2).

Figure 2: The Canadian HSS Commons provides sign-in authentication through the Canadian Access Federation (CAF), ORCID-CA, and the Canadian HSS Commons itself (hsscommons.ca/login).

These verification mechanisms help build trust by bolstering security and accountability in the Commons community: members’ accounts are linked to specific organizations or IDs, which, in turn, are bound up with researchers’ professional and personal reputations. Accordingly, this seemingly straightforward feature can serve to mitigate bad behaviour to an extent not possible on sites such as Facebook or Reddit, where alarming increases in digital harassment, fake news, and racist, misogynistic, homophobic, or other “uncivil discourse” have eroded public trust regarding online spaces (Rainie et al. 4–6; see also Veletsianos, Social Media). Still, even in areas of the Canadian HSS Commons accessible only to members, other design features such as content flagging (Figure 3) have been implemented to ensure that HSS societies and other groups within the larger Commons community are equipped both to safeguard their members and to encourage positive forms of interactivity. Even so, we recognize that while content flags can serve as “a genuinely important and necessary part of maintaining user-friendly spaces,” as Kate Crawford and Tarleton Gillespie observe, “they are by no means a direct or uncomplicated representation of community sentiment”—nor are they immune to change, or to being weaponized by bad actors (411, 420). As such, content flags, like tools for account verification, must be considered holistically, as just one complex and constantly evolving facet of care-oriented community governance among a host of others.

Figure 3: A screenshot of the Canadian HSS Commons’ content-flagging feature for publication reviews, which allows users to “Report abuse” (hsscommons.ca/publications).

Consequently, the work of fostering care within digital research commons does not and should not stop at intentional design and the implementation of technical safeguards: policies and governance structures must also be designed, enforced, and re-evaluated on an ongoing basis. Applying decades of scholarship on commons and governance, Ostrom and Hess discuss how this can play out with reference to the Institutional Analysis and Development (IAD) framework, which they define both as “a framework that has been used for over three decades as the main theoretical structure by many commons scholars from multiple disciplines” and as “a diagnostic tool that can be used to investigate any broad subject where humans repeatedly interact within rules and norms that guide their choice of strategies and behaviors” (41). Central to their chapter—and to our invocation of it here—is the claim that the IAD framework is not only amenable to, but also “particularly appropriate for analyses of various types of commons and common-pool resources” (43). Ostrom and Hess remark, for instance, that “Effective design requires successful collective action and self-governing behaviors; trust and reciprocity; and the continual design and/or evolution of appropriate rules” (43). Moreover, in order to be effective, any rules governing such spaces must be “backed by at least a minimal sanctioning ability for noncompliance” (50). While the Canadian HSS Commons prototype is still in development, it already exceeds this minimal threshold: the site’s “Terms of Use” and “Abuse and Harassment Policy” clearly articulate the consequences for behaviour that is not in keeping with University of Victoria policies or federal and provincial legislation. 
In line with the CARE (Collective Benefit, Authority to Control, Responsibility, and Ethics) Principles for Indigenous Data Governance, the First Nations Principles of OCAP (Ownership, Control, Access, and Possession), and the TRUST (Transparency, Responsibility, User focus, Sustainability, and Technology) principles for digital repositories currently being adopted by research data specialists worldwide, these documents also address security issues pertaining not only to the stewardship of sensitive data or the contents of repositories, but also to the researchers and content creators themselves.9 Furthermore, just as Ostrom and Hess urge knowledge commons to promote community self-governance, the Canadian HSS Commons’ policies emphasize, for instance, that “All members are encouraged to help moderate content and click on ‘report abuse’ links to report any material deemed offensive or inappropriate” (“Abuse”); that is, they stress the importance of self-governance while also indicating specific ways members can take ownership of the site, its tools, and shared content in order to help maintain a positive space. Accordingly, the site facilitates what Ostrom and Hess refer to as the transformation of “rules-in-form” into “rules-in-use”—rules that “are generally known and enforced and generate opportunities and constraints for those interacting” (50). In doing so, the Canadian HSS Commons aims to balance the importance of a rules-based approach with a commitment to community-driven platform development.

3. For the Common(s) Good: Addressing Inequity and Injustice in Digital Communities

A third and equally pressing challenge facing digital research commons is how these platforms and their constituents should best acknowledge—and actively address—the disproportionate risks facing scholars who have been, and continue to be, marginalized in academic spaces. For example, how can digital research communities protect and support racialized and other marginalized scholars for whom participation in such spaces, as in academia writ large, remains especially fraught? As Aimée Morrison observes,

There are real risks to taking scholarship outside the ivory tower, risks that accrue unevenly based on identity and visibility. Internet shitstorms rain down disproportionately, and with disproportionate damage, upon the more precarious: women, people with disabilities, people of color, junior scholars, and the contingently employed. (61)

Morrison’s vivid meteorological pronouncement echoes the findings of scholars in the emerging field of digital sociology. Tressie McMillan Cottom, for instance, has shown that it is not just risks that accrue unevenly based on identity and visibility; so do rewards. “Access is easier to produce than equal access to high status rewards,” she writes; “Similarly, increased internet access (or ‘penetration’) may be easier to produce than egalitarian access to skills, know-how, social networks, and capital” (“Black Cyberfeminism” 214). In other words, the kinds of epistemic injustices that have historically and disproportionately affected marginalized scholars can also be unwittingly reproduced on social networking or digital research platforms. Yet one need not embrace technological determinism, an approach that has justifiably and repeatedly come under fire in the last decade, in order to make this point (see, e.g., Duffy and Pooley; Dyer; Goodwin et al.; McMillan Cottom “Black Cyberfeminism”; Veletsianos, Social Media); while technology can shape individuals and entire social networks, developers, users, and critics alike can nevertheless play a significant role in shaping digital environments to foster care and address the inequities and injustices that continue to plague these spaces. At the same time, the responsibility for this work should not be placed squarely on users’ shoulders—particularly those of the people who are already disproportionately affected. To acknowledge the many instances of inequity and injustice that continue to plague such spaces is only to begin working towards their resolution; but even so, in digital humanities and many other academic communities, this first step is a necessary one—particularly given the field’s documented lack, until the last decade or so, of serious social, cultural, and intersectional critique.10

In its ongoing theorization of the digital commons, the Electronic Textual Cultures Lab hopes to contribute to these much-needed discussions about inequity and injustice within academia. Engaging with the fields of digital sociology and critical race studies, for instance, we hope to arrive at a better understanding of the intersectional implications of each of the challenges discussed in this paper, as well as the ways not-for-profit platforms such as the Canadian HSS Commons can foster care among all members of their respective communities, including in ways that comparable for-profit platforms cannot. This sort of engagement can take practical forms, too. For one thing, open research commons can bring the means of production back to researchers (Willinsky) instead of acquiescing to the commercial control of academic publishing—including for marginalized researchers and researchers from economically disadvantaged countries who are too often excluded from high-visibility academic production. In agreement with Veletsianos’ suggestion that open scholarship practices “may also reflect a form of caring” (Social Media 82), the Canadian HSS Commons embraces open-access publishing as a means of fostering care within and beyond the diverse academic institutions and communities it serves.

A second way that the Canadian HSS Commons hopes to facilitate equitable and caring scholarly practices is by leveraging the research-sharing capabilities of social networking sites while remaining responsive to the privacy and security concerns of marginalized scholars. Like Morrison, Hagit Meishar-Tal and Efrat Pieterse explain how social media and ASNS remain popular among scholars in general, and marginalized scholars in particular, despite the (disproportionate) risk of personal attacks, privacy violations, and data exploitation. However, as McMillan Cottom notes, sites such as Twitter can also give marginalized scholars an unprecedented degree of exposure or “hypervisibility” (“Black Cyberfeminism” 222). For some scholars, hypervisibility can be of great value if it is made possible via platforms that also “preserv[e] the aspects of privacy that work for their purposes” (222).11 But not all social networking sites, academic or otherwise, empower users with this level of control. In fact, as Raj Kumar Bhardwaj notes, many ASNS lack basic privacy settings (308–309). Unlike Academia.edu, the Canadian HSS Commons is working to offer members free, unfettered access to citation metrics—all while enabling the kinds of publication and sharing tools that make hypervisibility possible—as well as the granular privacy controls and content-licensing features that vulnerable researchers or research data may require at other times (such as when members want to share a dataset containing protected information with one group of collaborators but not another).

Conclusion

Hess and Ostrom suggest, quite reasonably, that “there is no one solution to all commons dilemmas” (12). The challenges and possible solutions outlined in this paper may not apply to all digital research commons, either, yet they gesture nevertheless to larger and still-pressing concerns—concerns that bridge distinct generations of researchers as well as seemingly discrete fields of study. For all of these reasons, and others that fall outside the scope of this paper, the work of building and maintaining better digital commons is necessary, but also necessarily ongoing and collaborative; it takes a (digital) village, so to speak.

For its part, the Canadian HSS Commons will continue to reflect on the experiences of those engaging with commons, explore the benefits of fostering care, and strengthen open scholarly communication through innovative interaction and engagement. As the above discussion suggests—and as the INKE Partnership intends to explore further through future studies, community consultations, and opportunities to learn alongside scholars and partner organizations—there are at least three ways the Canadian HSS Commons and other digital spaces can build trust and make meaningful inroads in these directions. The first is by approaching design and governance deliberately, with an eye to the messages or values that a website’s design and attendant governance framework can communicate to visitors. The second is by remaining attentive to the intersection of digital humanities with digital sociology as well as adjacent studies into how misogyny, racism, and epistemic injustice manifest online. The third way is by integrating, enforcing, and re-evaluating site policies or governance frameworks—such as the IAD framework developed by Ostrom and Hess—in ongoing consultation with members and stakeholders of the digital commons’ larger community. Indeed, components of IAD or other governance frameworks can be implemented to address each of these areas. Digital commons with such governance structures clearly and firmly in place are perhaps ideally positioned to provide fluid responses informed by—and for the collective good of—commons members.

To this end, the Canadian HSS Commons has recently started taking other concrete steps to work towards this ideal, such as establishing an Advisory Group. In line with the Canadian HSS Commons’ policies, as well as the project’s mandate to empower humanities and social sciences researchers to share knowledge and foster caring digital communities, this group will provide crucial oversight, feedback, and accountability. Working in directions complementary to those facilitated by existing and planned site features—such as the account-verification and content-flagging features discussed above—the Advisory Group will be able to contribute to the project’s governance in ways that directly address the recommendations of Ostrom, Hess, and other experts.

Moving beyond the idea of care as “mere rhetoric” (Veletsianos, Social Media 80), then, the Canadian HSS Commons will continue to provide the digital infrastructure that makes these activities possible while incorporating evolving best practices regarding the creation and governance of robust and respectful digital environments.

Works Cited

Adema, Janneke, and Gary Hall, editors. Really, We’re Helping to Build This . . . Business: The Academia.Edu Files. Liquid Books, 2015, liquidbooks.pbworks.com/w/page/106236504/The%20Academia_edu%20Files.

Arbuckle, Alyssa, Nina Belojevic, Tracey El Hajj, Randa El Khatib, Lindsey Seatter, and Raymond G. Siemens, with Alex Christie, Matthew Hiebert, Jon Saklofske, Jentery Sayers, Derek Siemens, Shaun Wong, and the INKE and ETCL Research Groups. “An Annotated Bibliography on Social Knowledge Creation.” New Technologies in Medieval and Renaissance Studies, 2018, https://ntmrs-skc.itercommunity.org.

Benkler, Yochai. The Wealth of Networks: How Social Production Transforms Markets and Freedom. Yale UP, 2006.

Bhardwaj, Raj Kumar. “Academic Social Networking Sites: Comparative Analysis of ResearchGate, Academia.Edu, Mendeley and Zotero.” Information and Learning Sciences, vol. 118, no. 5/6, 2017, pp. 298–316. Emerald Insight, doi.org/10.1108/ILS-03-2017-0012.

Bollier, David. “The Growth of the Commons Paradigm.” Understanding Knowledge as a Commons: From Theory to Practice, edited by Charlotte Hess and Elinor Ostrom, MIT Press, 2006, pp. 27–40. doi.org/10.7551/mitpress/6980.003.0004.

Bond, Sarah. “Dear Scholars, Delete Your Account at Academia.Edu.” Forbes, 23 Jan. 2017, https://www.forbes.com/sites/drsarahbond/2017/01/23/dear-scholars-delete-your-account-at-academia-edu/#3ead127d2d62.

Bordalejo, Barbara, and Roopika Risam, editors. Intersectionality in Digital Humanities. Amsterdam UP, 2019. 

Borgman, Christine L. Scholarship in the Digital Age: Information, Infrastructure, and the Internet. MIT Press, 2007. ProQuest, https://ebookcentral-proquest-com.ezproxy.library.uvic.ca/lib/uvic/detail.action?docID=3338751.

Boyle, James. The Public Domain: Enclosing the Commons of the Mind. Yale UP, 2008. https://thepublicdomain.org/thepublicdomain1.pdf.

Burke, Peter. A Social History of Knowledge: From Gutenberg to Diderot, Based on the First Series of Vonhoff Lectures given at the University of Groningen (Netherlands). Polity Press / Blackwell Publishers, 2000.

Burke, Peter. A Social History of Knowledge. II: From the Encyclopédie to Wikipedia. Polity Press, 2012.

Canadian Humanities and Social Sciences Commons. “Abuse and Harassment Policy.” 2022, https://hsscommons.ca/legal/abuse. Accessed 3 Feb. 2021.

Canadian Humanities and Social Sciences Commons. “Terms of Use / Conditions.” 2022, https://hsscommons.ca/about/terms. Accessed 3 Feb. 2021.

Caswell, Michelle, and Marika Cifor. “Revisiting a Feminist Ethics of Care in Archives: An Introductory Note.” Journal of Critical Library and Information Studies, vol. 3, no. 2, 2021, pp. 1–6, doi.org/10.24242/jclis.v3i2.162.

Crawford, Kate, and Tarleton Gillespie. “What Is a Flag for? Social Media Reporting Tools and the Vocabulary of Complaint.” New Media & Society, vol. 18, no. 3, 2014, pp. 410–428. SAGE Journals, doi.org/10.1177/1461444814543163.

Digital Citizens Alliance. Trouble in Our Digital Midst: How Digital Platforms Are Being Overrun by Bad Actors and How the Internet Community Can Beat Them at Their Own Game. June 2017, www.digitalcitizensalliance.org/clientuploads/directory/Reports/Trouble-in-Our-Digital-Midst-Report-June-2017.pdf.

D’Ignazio, Catherine, and Lauren F. Klein. Data Feminism. MIT Press, 2020. data-feminism.mitpress.mit.edu.

Duffy, Brooke Erin, and Jefferson D. Pooley. “‘Facebook for Academics’: The Convergence of Self-Branding and Social Media Logic on Academia.Edu.” Social Media + Society, vol. 3, no. 1, Jan. 2017, pp. 1–11. SAGE Journals, doi.org/10.1177/2056305117696523.

Dyer, Harry T. “Interactivity, Social Media, and Superman: How Comic Books Can Help Us Understand and Conceptualize Interactivity Online.” Digital Sociologies, edited by Jessie Daniels, Karen Gregory, and Tressie McMillan Cottom, Policy Press, 2017, pp. 77–101.

El Khatib, Randa, Lindsey Seatter, Tracey El Hajj, Conrad Leibel, Alyssa Arbuckle, Ray Siemens, Caroline Winter, and the ETCL and INKE Research Groups. “Open Social Scholarship Annotated Bibliography.” KULA: Knowledge Creation, Dissemination, and Preservation Studies, vol. 3, no. 1, 2019, pp. 1–141. doi.org/10.5334/kula.58.

First Nations Information Governance Centre. “The First Nations Principles of OCAP®.” fnigc.ca/ocap-training/.

Fitzpatrick, Kathleen. “Academia, Not Edu.” Really, We’re Helping To Build This . . . Business: The Academia.Edu Files, edited by Janneke Adema and Gary Hall, Liquid Books, 2015, liquidbooks.pbworks.com/w/page/106424037/Academia,%20Not%20Edu.

Fitzpatrick, Kathleen. “Community, Safety, and Trust.” Platypus: The Blog of the Humanities Commons Team, 21 Jan. 2021, team.hcommons.org/2021/01/21/community-safety-and-trust/?shareadraft=baba507_6009a89548ec1.

Hardin, Garrett. “The Tragedy of the Commons.” Science, vol. 162, no. 3859, 13 Dec. 1968, pp. 1243–1248. JSTOR, https://www.jstor.org/stable/1724745.

Hess, Charlotte, and Elinor Ostrom. “Introduction: An Overview of the Knowledge Commons.” Understanding Knowledge as a Commons: From Theory to Practice, edited by Hess and Ostrom, MIT Press, 2006, pp. 3–26, doi.org/10.7551/mitpress/6980.003.0003.

Hugo, Wim. “TRUST Principles Mini Symposium.” Research Data Canada, 7 July 2020, www.rdc-drc.ca/?wpdmdl=2754.

The Information Maintainers et al. Information Maintenance as a Practice of Care: An Invitation to Reflect and Share. 2019. Zenodo, doi.org/10.5281/zenodo.3251131.

Lin, Dawei, et al. “The TRUST Principles for Digital Repositories.” Scientific Data, vol. 7, 2020, p. 144. doi.org/10.1038/s41597-020-0486-7.

Liu, Alan. “Where Is Cultural Criticism in the Digital Humanities?” Debates in the Digital Humanities, edited by Matthew K. Gold, U of Minnesota P, 2011, dhdebates.gc.cuny.edu/read/untitled-88c11800-9446-469b-a3be-3fdb36bfbd1e/section/896742e7-5218-42c5-89b0-0c3c75682a2f. Accessed 28 Apr. 2021.

McMillan Cottom, Tressie. “Black Cyberfeminism: Ways Forward for Intersectionality and Digital Sociology.” Digital Sociologies, edited by Jessie Daniels, Karen Gregory, and Tressie McMillan Cottom, Policy Press, 2017, pp. 211–231.

McMillan Cottom, Tressie. “‘Who Do You Think You Are?’: When Marginality Meets Academic Microcelebrity.” Ada: A Journal of Gender, New Media, and Technology, no. 7, 2015, doi.org/10.7264/N3319T5T.

McPherson, Tara. “Why Are the Digital Humanities So White? Or Thinking the Histories of Race and Computation.” Debates in the Digital Humanities, edited by Matthew K. Gold, U of Minnesota P, 2011, dhdebates.gc.cuny.edu/read/untitled-88c11800-9446-469b-a3be-3fdb36bfbd1e/section/20df8acd-9ab9-4f35-8a5d-e91aa5f4a0ea#ch09. Accessed 28 Apr. 2021.

Meishar-Tal, Hagit, and Efrat Pieterse. “Why Do Academics Use Academic Social Networking Sites?” International Review of Research in Open and Distributed Learning, vol. 18, no. 1, Feb. 2017, doi.org/10.19173/irrodl.v18i1.2643.

Morrison, Aimée. “Of, By, and For the Internet: New Media Studies and Public Scholarship.” The Routledge Companion to Media Studies and Digital Humanities, edited by Jentery Sayers, Routledge, 2018, www.routledge.com/The-Routledge-Companion-to-Media-Studies-and-Digital-Humanities/Sayers/p/book/9780367580681.

Noble, Safiya Umoja. “Toward a Critical Black Digital Humanities.” Debates in the Digital Humanities 2019, edited by Matthew K. Gold and Lauren F. Klein, U of Minnesota P, 2019, pp. 27–35. JSTOR, doi.org/10.5749/j.ctvg251hk.5.

Nowviskie, Bethany. “On Capacity and Care.” Bethany Nowviskie, 4 Oct. 2015, https://nowviskie.org/2015/on-capacity-and-care/.

Ostrom, Elinor, and Charlotte Hess. “A Framework for Analyzing the Knowledge Commons.” Understanding Knowledge as a Commons: From Theory to Practice, edited by Hess and Ostrom, MIT Press, 2006, pp. 41–81, doi.org/10.7551/mitpress/6980.003.0005.

Rainie, Lee, Janna Anderson, and Jonathan Albright. “The Future of Free Speech, Trolls, Anonymity and Fake News Online.” Pew Research Center, 29 Mar. 2017, www.pewresearch.org/internet/2017/03/29/the-future-of-free-speech-trolls-anonymity-and-fake-news-online/.

Research Data Alliance International Indigenous Data Sovereignty Interest Group. “CARE Principles for Indigenous Data Governance.” Global Indigenous Data Alliance, Sept. 2019, www.gida-global.org/care.

Scrivens, Ryan. “Identifying Extremism in Online Forums.” MSUToday, Michigan State U, 6 July 2020, msutoday.msu.edu/news/2020/ryan-scrivens-identifying-extremism-in-online-forums/.

Suber, Peter. “Creating an Intellectual Commons through Open Access.” Understanding Knowledge as a Commons: From Theory to Practice, edited by Charlotte Hess and Elinor Ostrom, MIT Press, 2006, pp. 171–208, doi.org/10.7551/mitpress/6980.003.0011.

Tennant, Jon. “Who Isn’t Profiting Off the Backs of Researchers?” Discover, 1 Feb. 2017, www.discovermagazine.com/technology/who-isnt-profiting-off-the-backs-of-researchers.

Veletsianos, George. “Open Practices and Identity: Evidence from Researchers and Educators’ Social Media Participation.” British Journal of Educational Technology, vol. 44, no. 4, July 2013, pp. 639–651. British Educational Research Association, doi.org/10.1111/bjet.12052.

Veletsianos, George. Social Media in Academia: Networked Scholars. Routledge, 2016. Taylor & Francis Group, doi.org/10.4324/9781315742298.

Willinsky, John. The Access Principle: The Case for Open Access to Research and Scholarship. MIT Press, 2006.

Winter, Caroline. “The TRUST Principles for Digital Repositories.” Open Scholarship Policy Observatory, 19 Dec. 2020, ospolicyobservatory.uvic.ca/the-trust-principles-for-digital-repositories/.

Winter, Caroline, Tyler Fontenot, Luis Meneses, Alyssa Arbuckle, Ray Siemens, and the ETCL and INKE Research Groups. “Foundations for the Canadian Humanities and Social Sciences Commons: Exploring the Possibilities of Digital Research Communities.” Pop! Public. Open. Participatory, no. 2, Oct. 2020, doi.org/10.54590/pop.2020.005.
