PROJECTS

How Does Algorithmic Assistance Impact Human Performance? Evidence from Algorithm-Assisted Evaluation in Software Development.

ABSTRACT

Human-algorithm augmentation is becoming increasingly common in organizations, where algorithmic assistance with mundane tasks frees humans to perform other high-level tasks. While the augmentation literature generally predicts a synergistic integration between humans and algorithms, this perspective often overlooks potential negative spillover effects of algorithmic assistance on human performance. Focusing on the autonomy of algorithms that operate without human involvement, this study argues that such autonomy may stifle interaction between humans, a key mechanism through which knowledge is exchanged, collective insights are formed, and learning occurs, thereby lowering human performance. Specifically, I examine the introduction of autonomous algorithms that assist humans in evaluating each other's contributions in collaborative projects, a process traditionally conducted through human interaction and discussion. Using a stacked-cohort generalized Difference-in-Differences design and data on software development projects on GitHub, the study finds that although the adoption of Continuous Integration (CI) bots, algorithms that assist evaluation, reduces developers' evaluation burden within a project, it also decreases interaction among developers during evaluation. This leads to a decline in developers' performance in two essential tasks: monitoring fatal problems on an ongoing basis and searching for new ideas within the project. Bot adoption also increases the total number of unresolved fatal problems within the project, possibly because of this reduced developer performance. Lastly, I find that both the benefits and the pitfalls of bot adoption are magnified when the project receives contributions from diverse knowledge domains. These findings challenge prevailing assumptions about human-algorithm augmentation, highlighting the need for a balanced approach to integrating algorithmic assistance.

Algorithm Learning: Benefits of Third-Party Human-Intermediaries’ Co-Existence with Algorithm on Platforms (with PK Toh)

ABSTRACT

A digital transaction platform intermediates using an algorithm while often allowing third-party human intermediaries to co-exist on the platform, prompting the question of what strategic benefits these intermediaries bring. Conventional research typically focuses on participants' value-add to the platform, implicitly assuming that the algorithm remains static in participants' presence. We instead recognize that the algorithm learns and changes dynamically. We draw on the organizational learning literature to depict intermediation as a learning process and illustrate that human intermediaries add learning benefits to the algorithm. Using Instagram data and sentiment analysis, we first show that while the algorithm on average generates wider user reach for advertising brands (complementors), an influencer (human intermediary) elicits more positive user sentiments. Next, we demonstrate that the algorithm learns vicariously from a given influencer to elicit more positive user sentiments over time, especially when the influencer provides richer or less ambiguous information on her past intermediations. Further, we find that the algorithm's vicarious learning from the influencer alone appears insufficient; rather, improvements emerge interactively with the algorithm's own experiential learning over time, similar to the vicarious-experiential interaction effect in human learning. These findings stress that without recognizing such algorithm learning, we may have previously underestimated the value-add of human participants on platforms, or in organizations that employ algorithms for similar tasks. They also join recent research in stressing that a platform's inclusion of participants serves not only to promote generative joint value creation but also to enhance the platform's own value capture.

Competitive Crowding and Complementor Participation in Platform-Based Ecosystems: A Study of the Instagram Platform. (with PK Toh)

ABSTRACT

Complementor participation is critical to platform growth. Prior research asserts, however, that increasing it can sometimes result in competitive crowding, which discourages further participation. Departing from this assertion, we highlight that the platform orchestrator often offers algorithmic intermediation to help a complementor reach new users. We then draw on the competitive search literature to propose that competitive crowding may not reduce participation but rather change its nature: it induces the complementor to seek the adoption of algorithmic intermediation. Using Instagram data, LDA topic modeling, and a natural experiment, we demonstrate that this effect varies with the user-reach abilities of the algorithm, the competitors, and the complementor. Findings suggest that the tradeoff between competitive crowding and network effects is more nuanced than previously assumed and highlight how competition can serve as a value-appropriation tool for the platform.

Product Abandonment in Platforms. (with Shiva Agarwal & Cameron Miller)

ABSTRACT

Products with network effects face uncertainty not only about their standalone value but also about whether they can generate network effects, and these dual sources of uncertainty may shape managers' decision-making throughout the product life cycle, especially during the early stages of product development. During these early stages, there is considerable uncertainty about a product's market acceptance, and managers often seek signals to resolve that uncertainty by relying on user feedback. Tension arises, however, when user feedback conveys differing signals about the product's underlying quality, which helps resolve uncertainty about its standalone value, and about its potential installed base, which helps resolve uncertainty about its potential to create network effects. This paper abductively explores such decision-making processes of managers, especially how network effects shape managers' decisions when uncertainty about the product's standalone value and its potential to generate network effects is not fully resolved.

Strategies for Social Insertion into Collaboration Networks. (with Francisco Polidoro)

ABSTRACT

Given the strategic importance and inertial forces of collaboration networks, extant literature highlights strategies that enable existing firms to overcome these inertial forces and become more connected players. However, research thus far has overlooked how firms that are entirely new to a domain can become more connected players. This study addresses this gap by examining two entry strategies: collaborative entry and standalone entry. We argue that standalone entrants take longer to succeed in their initial activities in a domain, as they lack essential resources. Once they succeed in such activities, however, they become more valued as prospective partners, since their capabilities become evident. We demonstrate these dynamics in the context of corporate incumbents' investments in ventures. Overall, this study offers implications for research on network evolution.