Souti Rini Chattopadhyay

Assistant Professor of Computer Science

Viterbi School of Engineering

University of Southern California

Hi! I'm Souti ['show-tēē], and I go by Rini. I am an Assistant Professor of CS at the Viterbi School of Engineering, USC. I work in HCI and Software Engineering, redesigning digital experiences to align software and human cognition based on psychological principles. I am the founder of the Adaptive Computing Experiences (ACE) lab at USC. We investigate cognitive and social aspects of how humans interact with software across activities ranging from programming to building social identities.

Outside of work, I am passionate about 🪴 plants of all kinds, 🎹 making music 🎤, and the arts 🎨!
On a sunny day, you might catch me near the beautiful ☀️ 🌊 SoCal beaches or atop the mountains 🏔

Highlighted Research Projects


.01

Cognitive Biases in Developers: Tools and Practices to Improve Productivity

Role: Lead UX Researcher, Project Lead

Analyzing the effect of biases on developer productivity through a mixed-methods study to build best practices and design tool interventions that improve developer experience. The research resulted in two publications at ICSE, was awarded the ACM SIGSOFT Distinguished Paper Award, was featured as a research highlight in Communications of the ACM, and drove 1.2 million USD of NSF IIS funding (in partnership with UTK).

.02

What's Wrong with Data Analysis Tools?

Role: Lead UX Researcher, Project Lead

Current computational notebooks are inadequate to support the growing needs of data scientists. Through a three-part mixed-methods study with data scientists, we identified 10 major pain points data scientists face and suggested feature designs to improve user experience. The study resulted in a publication that received the Honourable Mention Award from ACM SIGCHI (top 5% of papers per year), inspired features in industry tools, and was featured in a Nature article.

.03

YouTube Vlogs to Dismantle Stereotypes about Developers

Role: Lead UX Researcher, Project Lead

Misconceptions about who developers are and what they do create barriers for people pursuing a software development career. Through a combined video and artifact analysis, paired with contextual inquiry, we identified 10 stereotypes that can be broken through vlogs and the kinds of information that developers find most useful, resulting in two well-received D&I academic publications in the Software Engineering and HCI communities.


Cognitive Biases in Developers: Tools & Practices Improving Productivity

Problem Statement
Cognitive biases are ways in which humans make sub-optimal decisions, affecting our productivity. The questions are: how much do they cost us, and how do we design tools that help?
Role
Project Lead, Sole UX Researcher, UX Designer
Users: Developers
Interface: Programming Environment (IDE)

UX Research Method

Exploratory Study

**Method Design Philosophy**
To understand the real-world implication of biases, we need to observe users at work and understand their problem-solving thought processes.
Study Design
Observed 10 users through a field study, non-invasively (fly-on-the-wall observation), while asking them to follow a think-aloud protocol. The 60-minute observation sessions were screen recorded, and users' think-alouds were audio recorded.
To confirm that our understanding of their thoughts was correct, users then went through a 15-minute contextual inquiry, explaining their actions and clarifying events that observers noted.

Data Analysis

Qualitative Analysis

Using card sorting, all actions in the videos were categorized into one of the following: read, edit, navigate, execute, ideate.
Using an open coding approach, each action was associated with one of the 28 categories of biases under investigation, based on previous scientific findings [Mohanani et al.].
Through inductive (axial) coding, the 28 categories of biases were then grouped into 10 bias categories (CB1-CB10) based on the observed effect of each bias on the programmer's behavior. Because we observe biases only after they take effect, we can identify them only by their effects, so biases with similar effects were grouped together.
The categories were validated by an expert to ensure each category is well-defined, independent, and unique.

Bias Category | Description
CB1 Preconception | Tendency to select actions based on preconceived mental models
CB2 Ownership | Developers give undue weight to artifacts that they themselves create or already possess
CB3 Fixation | Anchoring problem-solving efforts on initial assumptions and not modifying the anchor
CB4 Default | Developers choose readily available options based solely on their status as the default
CB5 Optimism | Biases that lead to false assumptions and premature conclusions
CB6 Convenience | The assumption that simple causes exist for every problem
CB7 Subconscious Action | Offloading evaluation and sense-making to external tools
CB8 Blissful Ignorance | The assumption that everything is nominal and working, even in the face of contradictory information
CB9 Superficial Selection | Information being unduly valued based on superficial criteria
CB10 Memory | How developers remember information from a series of alternates

Quantitative Analysis

We first defined reversal actions: actions that were undone or discarded at a later stage. The hypothesis was that biased actions are more likely to be undone or discarded, leading to a loss in productivity. Visualization of biases and actions showed that 45.7% of all actions were associated with a bias, and that biases cost a total of 25% of the entire work time. To find whether biases and reversal actions were independent, we used a Chi-Square Test of Independence with Bonferroni corrections for multiple comparisons. The test revealed a p-value < 2.2e-16, rejecting the null hypothesis and suggesting biases were highly associated with reversals (Cramér's V showed a large effect size and association).
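For illustration, a minimal sketch of this kind of independence test in Python; the contingency counts below are fabricated placeholders, not the study's data:

```python
# Chi-square test of independence between bias and reversal, with
# Cramer's V as the effect size. Counts are fabricated for illustration.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[320, 108],    # biased actions:   reversed, kept
                  [ 95, 412]])   # unbiased actions: reversed, kept

chi2, p, dof, expected = chi2_contingency(table)

# With multiple comparisons (e.g., one test per bias category),
# Bonferroni correction divides the alpha level by the number of tests.
alpha = 0.05 / 10  # hypothetical: 10 bias categories

# Cramer's V for an r x c table: sqrt(chi2 / (n * (min(r, c) - 1))).
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2={chi2:.1f}, p={p:.3g}, significant={p < alpha}, V={cramers_v:.2f}")
```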

Effect Analysis

To see the effect of different biases, we use a treemap visualization where each tuple (e.g., CB3, 428) refers to a category and the number of actions associated with it. Fixation (CB3) was the most frequently occurring bias category, and 89.7% of these actions were reversed; for example, participants 'fixated' on specific solutions, rejecting warnings and errors that contradicted their beliefs as they continued to pursue their solution.

Validation Study

To validate that the biases are actually present, we conducted a confirmatory survey with 16 users across 3 companies to triangulate our findings. I described the different bias categories to each user and asked them to rate on a Likert scale how frequently they have seen others act under biases, with the hypothesis that users may be unaware of their own biases. Then, to understand users' needs, I further asked them what type of support they desire (or believe already exists) for each type of bias.

Quantitative and Qualitative Analysis of Needs

For most categories of biases, users agreed that they sometimes, often, or always act under the influence of bias.

We collected and transcribed users' experiences with biases. For example, P12 says about CB6:
[32:12] “It happens all the time!... It’s the story behind why technical debt happens! Three months and then you go and ask why on earth is this failing? And when you look back and somebody overwrote something because it was easier. And it screwed up everything!”
We also collected users' needs from the tools and interfaces to help counter the effect of biases. Using card sorting, we categorized these into 14 helpful practices at the organizational, team, and individual levels to combat the effect of biases. We also conducted a thematic analysis of the user needs from the interviews to identify 7 different tool interventions that can help with biases.

UX Design - Tool to Present History in Context to Reduce Bias

Design Hypothesis and Philosophy

Fixation can be reduced by tracking the user's previous similar changes that lead to the same error.
The tool tracks if the user makes similar changes 10 times that lead to the same error, and presents the option to check the output from history.
The typography, iconography, and color theme needed to match the existing platform norms (VSCode).

UI/UX Design Components

Interface Format
The pop-up appears in the bottom-right corner of the platform. This maintains internal and external design consistency.

Use Case
When users make similar changes to the same line of code and get the same error 10 times in a row, a bias warning shows up on the platform, where users have 3 options (a minimal sketch of this detection heuristic follows the list):

  • Check the previous instance of similar changes and output

  • Rerun the previous code change on the current state

  • Ignore and continue what they were doing
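As a rough illustration of the detection logic described above; the names (EditEvent, error signature, FIXATION_THRESHOLD) are hypothetical, not the tool's actual API:

```python
# Sketch of the fixation-detection heuristic: warn when similar edits to
# the same line keep producing the same error. All names are illustrative.
from collections import defaultdict
from dataclasses import dataclass

FIXATION_THRESHOLD = 10  # repeated similar edits before the warning fires

@dataclass(frozen=True)
class EditEvent:
    line_no: int          # location of the edited line
    error_signature: str  # normalized error message seen after the edit

_repeat_counts = defaultdict(int)

def record_edit(event: EditEvent) -> bool:
    """Count (line, error) repetitions; True means show the bias warning
    with its three options: check history, rerun previous change, ignore."""
    key = (event.line_no, event.error_signature)
    _repeat_counts[key] += 1
    return _repeat_counts[key] >= FIXATION_THRESHOLD
```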


Accolades

This research was recognized with the ACM SIGSOFT Distinguished Paper Award (given to the top 10 papers every year) and featured as a Research Highlight in Communications of the ACM (only ~25 papers per year receive this recognition). The research also contributed to bringing in 1.2 million USD of NSF funding in a cross-university collaboration.

Publications for This Project



What's Wrong with Data Science Tools?
Reshaping Interfaces to Improve Usability

Problem Statement
Data scientists are frustrated with current tools! Where are the issues? How do we design interfaces that help scientists understand and use analyses?
Role
Project Lead, Sole UX Researcher, UX/UI Designer, Front-End Development
Users: Data Scientists
Interface: Jupyter Notebooks

UX Research Method

Exploratory Study

**Method Design Philosophy**
The main goal here is to catalog user difficulties with the tools. For this, I incorporated data triangulation by combining an observational study with interviews across multiple companies at this stage.
Study Design
Observed 5 users through a contextual field study at Microsoft where they walked through their work. Interviewed 15 users from 5 different companies to reduce data bias. During the interviews, users explained the challenges they face at all stages of their work.

Qualitative Data Analysis

Using an inductive open coding approach in ATLAS.ti:
For round 1 coding, two coders applied descriptive codes. Through inter-rater negotiation, the codes were split and merged based on themes.
Then, through axial coding, similar groups were further merged, and 9 high-level categories of challenges were captured. To ensure the validity of the categories, we recruited two experts to examine them; the raters reached substantial agreement (Cohen's kappa ≈ 0.8; a computation sketch follows the table below). The 9 categories of challenges are listed below.

Challenge | Description
Setup | Loading and cleaning data from multiple sources and platforms is tortuous
Security | Maintaining data confidentiality and access control is an ad hoc process
Explore and Analyze | An unending cycle of copy-paste and tweaking bits of code
Share and Collaborate | Sharing data or parts of analyses at different levels is unsupported
Manage Code | Managing code without software engineering support
Reproduce and Reuse | Replicating results or reusing parts of code is infeasible
Reliability | Scaling to large datasets is unsupported, causing kernel crashes and inconsistent data
Notebooks as Products | Deploying to production requires significant cleanup and packaging of libraries
Archival | Preserving the history of changes and states within and between notebooks is unsupported
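A minimal sketch of the expert-agreement computation mentioned above, assuming the two raters' category labels are available side by side; the labels shown are fabricated examples, not the study's data:

```python
# Cohen's kappa: chance-corrected agreement between two expert raters
# assigning each coded excerpt to one of the challenge categories.
from sklearn.metrics import cohen_kappa_score

rater_a = ["Setup", "Security", "Manage Code", "Setup", "Archival", "Setup"]
rater_b = ["Setup", "Security", "Manage Code", "Setup", "Reliability", "Setup"]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")  # about 0.76 for these toy labels
```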

Validation Study & Quantitative Analysis

To validate the challenges and identify high-impact ones, we conducted a survey with 156 users, asking them two quantitative Likert-scale questions:

  • How difficult is it to perform 20 listed activities in the tool? on a scale of very easy (1) to very difficult (5).

  • How important is it to perform 20 listed activities in the tool? on a scale of not important (1) to very important (5).

The hypothesis was that if an activity is both highly important and highly difficult, it needs immediate support through changes to the interface or tool. I used visualization and descriptive statistics (median rating values) to identify the intersecting high-impact challenge areas:
Deploy in Production, Explore History, Long-running Tasks, and Refactoring
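A minimal sketch of this impact analysis, assuming per-activity median ratings are already computed; the activities and numbers below are fabricated for illustration:

```python
# Find high-impact activities: those rated both difficult and important.
import pandas as pd

ratings = pd.DataFrame({
    "activity":   ["deploy", "explore history", "long-running tasks",
                   "refactoring", "plotting"],
    "difficulty": [4.0, 4.5, 4.0, 4.0, 2.0],  # median, 1 (very easy) .. 5 (very difficult)
    "importance": [4.5, 4.0, 4.0, 4.0, 3.0],  # median, 1 (not important) .. 5 (very important)
})

# An activity is high-impact when both medians sit above the scale midpoint.
high_impact = ratings[(ratings.difficulty >= 4) & (ratings.importance >= 4)]
print(high_impact.activity.tolist())
# -> ['deploy', 'explore history', 'long-running tasks', 'refactoring']
```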

We concluded that notebooks have end-to-end avenues for improvement, from early-stage activities like data setup to late-stage processes like sharing and deployment. We identified two major underlying causes creating this friction between users and the tool:

  • Loss of decisions tied to the contents in existing notebooks

  • Inability to comprehend existing notebook content

To address these two problems, I built and evaluated an alternative user interface that captures decisions and creates affordances for better usability.

UX Design - Human comprehension Based Interface

Design Hypothesis and Philosophy

Redesigning computational notebooks with affordances and interfaces inspired by models of human comprehension (like the PQRST model) can improve usability, comprehension, and overall experience. NotebookStories, the new interface, tackles both underlying problems: comprehension (with an improved interface) and loss of decisions (by capturing code decisions through pattern analysis).

Design Motivations

UI/UX Features

  1. Side navigation bar of Chapters - quick navigation without a long scroll through analysis file.

  2. Expandable Section headers capturing intention - only see parts they want AND better understand the analysis.

  3. Highlight/Comment code or output - the PQRST model suggests that annotation helps users use and remember content better.

  4. Export selected parts - users can export useful code and output by leaving them expanded and using the feature.

  5. Icons to locate graphs, data, models, etc. - allows users to quickly locate analysis for input, output, model, & libraries.

Backend Intention mining algorithm

To capture intention within code, we wrote a PatternMiner script in Python that identifies code patterns. We trained this model through content analysis of 50 existing notebooks on GitHub: we used a parser to identify functions and keywords in code and qualitatively coded 24 different intention patterns. We then feed any existing notebook to this script to identify patterns, which become the section headers for the UI.
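A simplified sketch of the intention-mining idea, assuming a lookup from known function names to intention labels; the pattern table here is a tiny illustrative subset, not the 24 mined patterns:

```python
# Parse a notebook cell's code and map recognized calls to intention labels,
# which become section headers in the NotebookStories UI.
import ast

INTENTION_PATTERNS = {       # illustrative subset only
    "read_csv": "Load data",
    "fillna":   "Clean data",
    "fit":      "Train model",
    "plot":     "Visualize results",
}

def mine_intentions(cell_source: str) -> list[str]:
    """Return intention labels for recognized function calls in a cell."""
    labels = []
    for node in ast.walk(ast.parse(cell_source)):
        if isinstance(node, ast.Call):
            func = node.func
            name = func.attr if isinstance(func, ast.Attribute) else getattr(func, "id", "")
            if name in INTENTION_PATTERNS:
                labels.append(INTENTION_PATTERNS[name])
    return labels

print(mine_intentions("df = pd.read_csv('data.csv')\ndf = df.fillna(0)"))
# -> ['Load data', 'Clean data']
```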

Lo-fi Wireframes

UI/UX Design Components

Consistency and familiarity are key for this UI. As programmers are used to the familiar Jupyter interface, too many abrupt changes to UI components would prevent an evaluation of the design philosophy itself. So the theme and typography stay similar to the JupyterLab environment, while the icons I added use a complementary color scheme to attract attention.

Iconography


Theme


Typography


Prototype

Final Showcase

NotebookStories - An Alternative Data Analysis Interface

Competitive Analysis (Ongoing)

The initial-phase quantitative and qualitative analysis with 25 data analysts (experts and novices) had two parts:

  • Control Evaluation - the users walked through a notebook explaining the analysis using the Jupyter interface.

  • UI Evaluation - the users used the NotebookStories interface to explain the same analysis.

Log Analysis

Measure | Jupyter | NotebookStories
Comprehension score | 3.2/5 (mean) | 4.3/5 (mean)
Navigation time | 11.5 secs (mean) | 5 secs (mean)
Backtracking | 23 clicks | 11 clicks
Scroll time | 250 secs | 82 secs
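For context, a toy sketch of how measures like these could be derived from interaction logs; the event names and log format are assumptions, not the study's actual instrumentation:

```python
# Derive backtracking counts and mean navigation time from a simple
# (timestamp_seconds, event) log. All events below are fabricated.
events = [
    (0.0, "open"), (3.1, "scroll"), (11.5, "nav_click"),
    (14.0, "scroll"), (20.2, "back_click"), (25.0, "nav_click"),
]

backtracks = sum(1 for _, name in events if name == "back_click")

# Navigation time: gap between each nav click and the event before it.
nav_gaps = [t2 - t1 for (t1, _), (t2, name2) in zip(events, events[1:])
            if name2 == "nav_click"]
mean_nav_time = sum(nav_gaps) / len(nav_gaps)

print(f"backtracks={backtracks}, mean navigation time={mean_nav_time:.1f}s")
```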

User Feedback

  • "Analysis was easier to understand" as users could remember "where the model was" etc. using clickable navigations

  • "Cognitive load was lessened" as users could look at the icons and quickly find what they were looking for, and only keep certain parts of code expanded "which in turn made it easier to piece the analysis together!"

Coming Soon - System Usability Survey


Accolades

The research was awarded the Honourable Mention Award by ACM SIGCHI (given to the top 5% of papers every year). The design opportunities we discussed inspired features in mainstream notebooks like Deepnote, and the breadth of challenges discussed was featured in a Nature article. The research further inspired a 1.2 million USD NSF funding proposal in a cross-university, cross-continent collaboration (under review).

Publications for This Project


YouTube Vlogs to Dismantle Stereotypes about Developers

Problem Statement
Developers are taking to vlogging on YouTube to uncover the mystery behind who developers are and what they do. There are thousands of 'A day in the life' vlogs, which garner millions of views and comments. How do these videos contribute to breaking barriers?
Role
Project Lead, Sole UX Researcher
Users: Vloggers and Viewers
Interface: YouTube

UX Research Method

**Method Design Philosophy**
The first goal is to understand the content that is included in the vlogs and the viewers' perspectives of the vlogs.
Study Design
I conducted a video analysis of 130 vlogs by developers on YouTube and a contextual inquiry with 16 vloggers who are also developers to understand the motivations behind the vlogs. Then, I analyzed 1176 comments on these vlogs to identify the impact of the vlogs on the community.

Exploratory Qualitative study

Using random stratified sampling based on location, number of subscribers, and views, I identified 130 vlogs portraying a developer's day. Then, using ATLAS.ti, I qualitatively analyzed the videos on two dimensions: 1. activities shown, 2. topics discussed in the videos. Then I conducted a contextual inquiry with 16 of these developers to understand their motivations behind vlogging; using card sorting, I identified the main motivations. Finally, I mined the top 10 comments on each of the vlogs, leading to a total of 1176 comments, and analyzed these based on the topic, the type of audience commenting, and the tone of the comment (a sketch of the comment-mining step follows).
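A hedged sketch of how top comments could be collected, assuming the YouTube Data API v3 via google-api-python-client; the API key and video IDs are placeholders:

```python
# Fetch the top comments for each sampled vlog using the YouTube Data API.
# Requires an API key; the video IDs below are placeholders, not the sample.
from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")
video_ids = ["VIDEO_ID_1", "VIDEO_ID_2"]  # placeholder IDs

comments = []
for vid in video_ids:
    response = youtube.commentThreads().list(
        part="snippet", videoId=vid, maxResults=10, order="relevance"
    ).execute()
    for item in response.get("items", []):
        comments.append(item["snippet"]["topLevelComment"]["snippet"]["textDisplay"])

print(len(comments), "comments collected")
```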

01 Motivations behind vlogging

Through semi-structured interviews lasting 30-45 minutes with the developers who vlog, I identified 5 motivations behind vlogging. We asked the users (who are the vloggers here) about their motivation for starting a YouTube channel and why they continued to post content regularly.
Using card sorting followed by open descriptive coding, we identified the 5 motivations, coding until we reached saturation.

02 Analysis of vlog contents

First, we transcribed the videos and assigned descriptive codes (labels/short phrases) to the various topics covered by vloggers as well as the activities they show as part of their everyday life. Then, we reorganized these codes and performed selective coding by grouping related topics into standalone thematic concepts to identify 6 topics covered in the vlogs.

03 Analysis of viewer comments

First, using an inductive open coding approach, I assigned descriptive codes to each comment based on how the viewers found value in the video and the type of comment, e.g., sharing their gratitude or asking technical questions. There were 15 such codes, which I rearranged into 5 high-level themes using an axial coding approach based on how viewers perceived the vlogs:

  • A source of information and consultation for lifestyle as well as career

  • Seeking advice from vloggers and other viewers on specific cases

  • Discovering a community to engage with and share perspectives with

  • Expressing empathy and camaraderie with other viewers and vloggers

  • Friendly banter and arguments, viewers treated others as equals

Breaking Stereotypes

Convergent analysis of the codes from the 3 stages led to identifying the effect that vlogs have on the community: breaking misconceptions about developers and, in turn, reducing barriers for a diverse population to be part of the community. We identified that these vlogs dismantle 10 different types of stereotypes, listed below:

Stereotypes
S1 Developers are mostly male and mostly white
S2 Developers are a young crowd, with no familial responsibility
S3 Developers are math wizards and they are born with coding skills
S4 Getting a traditional CS degree is essential to be a developer
S5 Developers code all day and know nothing beyond it
S6 Developers seldom talk to others
S7 Stereotypes about job titles, startups, freelancing
S8 Developers have no time for fun
S9 Developers are asocial or anti-social and prefer to be left alone
S10 Developers lead an unhealthy lifestyle

Papers/Projects



Contact me!

Thanks for visiting my page.

If you're interested in the research and would like to collaborate, please drop an email at schattop@usc.edu. Students are encouraged to visit the ACE Lab page.


Adaptive Computing Experiences (ACE) lab

ACE LAB website has migrated

At the ACE Lab, we are working to transform computing experiences by adapting users' experiences to their cognitive processes.

Computing [/kəmˈpyo͞odiNG/] [noun. the use or operation of computers] has evolved beyond programmatic and mathematical operations in today's world. People's interactions with computers range from developers building software to people monitoring their heartbeat during exercises on wearable devices. However, these experiences are often unintuitive, stale, and require humans to adapt to the machine. We aim to build experiences that adapt to the users, for activities ranging from programming to lifestyle.